The purpose of this document is to provide a cookbook approach to building a Squid proxy that can:
- Intercept HTTPS/SSL packets;
- Inject the HTTP headers:
  - X-GoogApps-Allowed-Domains;
  - X-YouTube-Edu-Filter;
- Rewrite YouTube URLs to add the edufilter option.
The base operating system for this deployment is Ubuntu Server 12.04 LTS. You can accept all of the defaults during installation, but when it asks which software to install, select only the SSH server.
If you’re installing this server into a virtual machine, you may need to install the VMware Tools; there is a separate article covering that installation. My server runs on 2 processor cores, 1 GB of RAM and 16 GB of storage. In my setup this server is only being used for header injection, so it doesn’t require storage for caching.
When you have the server installed, you can begin the instructions below.
Compiling Squid for SSL Header Addition
Log into the server with the account you made during the installation of the Ubuntu Server OS. The first step is to make sure that your new installation is completely up to date:
$ sudo apt-get update
$ sudo apt-get dist-upgrade
$ sudo shutdown -r now
Your server will now restart and you will need to log in again to continue. The next section downloads Squid, unpacks it and then installs all of the dependencies required to compile the code. Some of the package specifications here are a bit broad (openssl* and libcap-*), which results in more being installed than is strictly needed; as a consequence, I had to uninstall aolserver4-daemon before installing apache2 because the two conflict. If anyone wants to take the time to narrow these down to the exact packages required, please update me in the comments and I’ll fix up the document. As it stands it does work, but it could be improved.
You can check for the latest version of Squid at (http://www.squid-cache.org/Versions/v3/3.3/). Newer versions than the one I’m using here should work but I haven’t tested them.
$ sudo -i
# wget http://www.squid-cache.org/Versions/v3/3.3/squid-3.3.9-20131011-r12634.tar.gz
# tar xvzf squid-3.3.9-20131011-r12634.tar.gz
# apt-get install g++ gawk m4 gcc-multilib make smbclient openssl* libcap-*
# apt-get remove aolserver4-daemon
# apt-get install apache2
# shutdown -r now
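If you’d rather avoid the broad globs above, a narrower set along these lines should cover the build dependencies (libssl-dev supplies the OpenSSL headers needed by --enable-ssl, and libcap-dev the capability headers used by the netfilter support). I haven’t verified this exact list myself, so treat the package names as an assumption:
# apt-get install g++ gawk m4 gcc-multilib make smbclient openssl libssl-dev libcap2 libcap-dev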
Your server will restart again and after logging back in you’re ready to start compiling the source code of Squid.
$ sudo -i
# cd squid-3.3.9-20131011-r12634
# ./configure --enable-delay-pools --enable-ssl --enable-ssl-crtd --enable-linux-netfilter --enable-arp-acl --enable-snmp --enable-gnuregex --enable-wccpv2 --enable-http-violations && echo $?
# make all && echo $?
Before this next step I took a snapshot of my virtual server so I could always roll back any changes quickly. This may not really be necessary, though, because the configure options used above install Squid into /usr/local/squid, and you can uninstall Squid simply by deleting that folder.
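If you do need to roll back a test installation, stopping Squid and deleting its folder is enough (assuming it was installed and started from this location):
# /usr/local/squid/sbin/squid -k shutdown
# rm -rf /usr/local/squid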
The next command installs Squid to /usr/local/squid, and after that we create the SSL certificate that is used to re-sign the information being passed to the clients. The certificate below is set to expire 10 years after creation (-days 3650).
# make install && echo $?
# cd /usr/local/squid
# mkdir ssl_cert
# cd ssl_cert
# openssl req -new -newkey rsa:1024 -days 3650 -nodes -x509 -keyout myCA.pem -out myCA.pem
# openssl x509 -in myCA.pem -outform DER -out myCA.der
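You can optionally confirm the subject and validity dates of the certificate you just generated before going further:
# openssl x509 -in myCA.pem -noout -subject -dates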
Next we need to edit the Squid configuration file:
# nano /usr/local/squid/etc/squid.conf
This is a copy of the Squid configuration I’m using. Change the Google Apps domain (domain.local) and the YouTube EDU filter ID (zhsdjh_JMFNDKjbFSxmmsA) to the values for your own domain.
#
# Recommended minimum Access Permission configuration:
#
# Deny requests to certain unsafe ports
http_access deny !Safe_ports
# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports

# Only allow cachemgr access from localhost
http_access allow localhost manager
http_access deny manager

# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
#http_access deny to_localhost

#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#

# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
http_access allow localnet
http_access allow localhost
http_access deny all

# Squid normally listens to port 3128
http_port 3128 ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=20MB cert=/usr/local/squid/ssl_cert/myCA.pem

always_direct allow all
sslproxy_cert_error allow all
sslproxy_flags DONT_VERIFY_PEER
sslcrtd_program /usr/local/squid/libexec/ssl_crtd -s /usr/local/squid/var/ssl_db -M 20MB
sslcrtd_children 100
request_header_add X-GoogApps-Allowed-Domains domain.local localnet
request_header_add X-YouTube-Edu-Filter zhsdjh_JMFNDKjbFSxmmsA localnet
acl broken_sites dstdomain "/usr/local/squid/etc/bypass_ssl.txt"

ssl_bump none broken_sites
ssl_bump client-first all
redirect_program /usr/local/squid/sbin/rewriter.pl
redirect_children 90
coredump_dir /usr/local/squid/var/cache/squid
#
# Add any of your own refresh_pattern entries above these.
#
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
refresh_pattern .               0       20%     4320
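One step that’s easy to miss: the ssl_crtd helper referenced by sslcrtd_program needs its certificate database created before Squid will start. If Squid exits on startup complaining about /usr/local/squid/var/ssl_db, initialise the database with the helper’s -c option:
# /usr/local/squid/libexec/ssl_crtd -c -s /usr/local/squid/var/ssl_db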
Create the file /usr/local/squid/etc/bypass_ssl.txt and add the following content:
accounts.google.com
accounts.youtube.com
clients1.google.com
clients2.google.com
clients3.google.com
clients4.google.com
cros-omahaproxy.appspot.com
dl.google.com
dl-ssl.google.com
www.googleapis.com
m.google.com
omahaproxy.appspot.com
safebrowsing-cache.google.com
m.safebrowsing-cache.google.com
safebrowsing.google.com
ssl.gstatic.com
tools.google.com
pack.google.com
www.gstatic.com
gweb-gettingstartedguide.appspot.com
storage.googleapis.com
commondatastorage.googleapis.com
Create the file /usr/local/squid/sbin/rewriter.pl with the following content:
#!/usr/bin/perl
# Squid redirector: appends the edufilter parameter to YouTube URLs.

$| = 1;    # unbuffered output so Squid receives each answer immediately

$strSchoolID = 'edufilter=zhsdjh_JMFNDKjbFSxmmsA';

while (<>) {
    @X = split;
    $strURL = $X[0];    # the first field of each request line is the URL
    if (index($strURL, 'youtube.com') + 1) {
        # Pass already-filtered URLs and static assets through unchanged.
        if    (index($strURL, 'edufilter') + 1) { print "$strURL\n"; }
        elsif (index($strURL, '.css') + 1)      { print "$strURL\n"; }
        elsif (index($strURL, '.gif') + 1)      { print "$strURL\n"; }
        elsif (index($strURL, '.png') + 1)      { print "$strURL\n"; }
        elsif (index($strURL, 'gif') + 1)       { print "$strURL\n"; }
        elsif (index($strURL, '.js') + 1)       { print "$strURL\n"; }
        elsif (index($strURL, '.xml') + 1)      { print "$strURL\n"; }
        elsif (index($strURL, '?') + 1) {
            # URL already has a query string, so append with '&'.
            $strURL = $strURL . '&' . $strSchoolID;
            print "$strURL\n";
        } else {
            # No query string yet, so start one with '?'.
            $strURL = $strURL . '?' . $strSchoolID;
            print "$strURL\n";
        }
    } else {
        print "$strURL\n";
    }
}
Then run the command:
# chmod 755 /usr/local/squid/sbin/rewriter.pl
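You can sanity-check the rewriter from the shell before Squid uses it: it reads request lines on stdin (only the first field, the URL, matters to this script) and prints one URL per line. For a watch URL that already has a query string, the school ID should come back appended with an ampersand:
# echo "http://www.youtube.com/watch?v=abc123 127.0.0.1/- - GET" | /usr/local/squid/sbin/rewriter.pl
http://www.youtube.com/watch?v=abc123&edufilter=zhsdjh_JMFNDKjbFSxmmsA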
Edit the rc.local file to start Squid on boot.
# nano /etc/rc.local
Add the following line BEFORE the “exit 0” line.
/usr/local/squid/sbin/squid
Copy the SSL certificate to the web server’s root folder so that clients can download it easily. It will be accessible at http://<server IP address>/certificate.der.
# cp /usr/local/squid/ssl_cert/myCA.der /var/www/certificate.der
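A quick check from the server itself confirms the certificate is being served (assuming apache2 is listening on its default port 80):
# wget -O /tmp/certificate.der http://localhost/certificate.der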
Restart the server to finish up.
# shutdown -r now
If you find any errors in this document please comment and I’ll update it.
First, thanks so much for the post. I am a retired 68-year-old IT geek who used to work in a school system, and I still volunteer to do projects. I took a look at YouTube for Schools about 3 years ago, about 9 months before I was to retire, and decided I did not have enough time to learn what had to be done to Squid with everything else going on. Also, at the time, I would have had to use an ancillary package to add headers.
I’m wondering about the apt-get statements after the Squid wget. I have no experience with Ubuntu and will be doing the install on CentOS. Should I try to create equivalent yum statements? Does the apt-get install apache2 command install Apache? And, if so, is it required for YouTube for Schools?
The apt-get install apache2 command does indeed install Apache, and it is not required to make this installation work. I added it so that I would have an easy way to serve the SSL certificates that need to be installed on the client machines so they’ll accept the SSL interception that’s being done by Squid.
Not able to download attachments from emails after configuring ssl_bump.
Vikash,
It’s most likely caused by Google updating the way some of their services work and the server that provides the attachment download not being exempted from the SSL interception. The link below shows the sites that you’ll most likely need to exempt, and I’ve updated this article to reflect the changes too.
https://support.google.com/chrome/a/answer/3504942?hl=en