
Connecting to a squid proxy securely

Squid allows the use of https_port to specify a port on which it will accept proxy requests over SSL.

An example is as follows in the squid.conf file:

https_port 8123 cert=/etc/pki/tls/private/squid.pem

I am using port 8123 as it is already allowed by the default SELinux policy on Red Hat Enterprise Linux. The cert option is the path to the SSL certificate in PEM format. If you do not specify a key, Squid assumes it is bundled in the same PEM file as the certificate.
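If you need to create such a bundled PEM file, one way is to generate a self-signed certificate and key and concatenate them (a sketch only; the hostname and paths are the ones used in the examples here, so adjust them to your setup):

openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout /tmp/squid.key -out /tmp/squid.crt -subj "/CN=proxy.example.com"
cat /tmp/squid.key /tmp/squid.crt > /etc/pki/tls/private/squid.pem
chown squid:squid /etc/pki/tls/private/squid.pem
chmod 600 /etc/pki/tls/private/squid.pem
rm /tmp/squid.key /tmp/squid.crt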

Don’t forget to open the firewall port.
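On Red Hat systems that can be done along these lines; which tool applies depends on your release, so treat this as a sketch:

# RHEL 6 and earlier (iptables)
iptables -I INPUT -p tcp --dport 8123 -j ACCEPT
service iptables save

# RHEL 7 and later (firewalld)
firewall-cmd --permanent --add-port=8123/tcp
firewall-cmd --reload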

At present Firefox and Chrome support connecting to the Squid proxy over SSL, though this must be configured with a PAC file rather than through the applications' built-in proxy settings.

An example pac file looks like:

cat /var/www/html/wpad.pac
function FindProxyForURL(url, host) {
    return "HTTPS proxy.example.com:8123";
}

You can then put the PAC file on a convenient web server or on your local file system and point the web browser at it in its configuration settings.

You can also launch Chrome from the command line on macOS with the following settings:

open ./Google\ Chrome.app --args --proxy-server=https://proxy.example.com:8123

Windows accepts similar settings; details are here:

http://www.chromium.org/developers/design-documents/secure-web-proxy
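For example, on Windows you can pass the same flag when starting Chrome (the install path below is an assumption; yours may differ):

"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" --proxy-server=https://proxy.example.com:8123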

Check your proxy logs to ensure that the proxy is functioning.

PLEASE NOTE: Chrome and Firefox will not accept the secure proxy in their application configuration settings. They require a proxy.pac / wpad.dat file to function. If you configure the secure proxy in the application settings, the browser will fail to connect and throw an error.


Squid General

Proxy Overview

A proxy intercepts requests from a web browser to a web server. The advantage of this is that in an office or school, where people may request the same page multiple times (google.co.uk or yahoo.co.uk, for example), a proxy saves bandwidth and speeds up browsing. The proxy will cache (keep a copy of) the page and images and serve them directly to the browser rather than downloading them from the server again.

This sort of thing can also help greatly when applying updates and bug fixes to computers. If multiple computers run the same operating system, the update files only have to be downloaded to the LAN once, making updates much faster.

The proxy focused on here is Squid, a piece of GPL software available from http://www.squid-cache.org
Squid configuration
There is extensive documentation available for Squid, and a mailing list where many of the queries you may have have already been covered in detail many times before :-)

What people often get stuck on is the ACLs (Access Control Lists). ACLs are important, for if the proxy is left open to all, it will be abused by mail spammers and others.

It is possible to have password authentication against a Samba server, a Windows computer (NT, W2K or XP), htpasswd files, Novell servers and LDAP servers. I have probably missed a few; check the documentation.

It is possible to provide seamless logins against a Samba or NT server using NTLM authentication; again, check the docs for how to do this.
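As a rough sketch of what the NTLM side looks like, this assumes Samba's ntlm_auth helper is installed at /usr/bin/ntlm_auth and the proxy machine has already been joined to the domain with winbind:

auth_param ntlm program /usr/bin/ntlm_auth --helper-protocol=squid-2.5-ntlmssp
auth_param ntlm children 5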

I misunderstood the ACLs for a long time: I had assumed that multiple ACLs on one rule were combined with OR rather than AND.

So, for example, if you create an ACL for the local network 192.168.10.0/24 and a colleague's network 80.233.135.0/24, then to allow both of these you would have to do the following:

acl local src 192.168.10.0/24
acl colleague src 80.233.135.0/24

http_access allow local
http_access allow colleague

The following would NOT work

http_access allow local colleague

The second would not work because the ACLs on a single http_access line are ANDed together, and a computer / user can never be in both networks at once.

The rules are stackable, and Squid works through them in order until it gets a match; hence you can order them to make finer adjustments to the access policy.

You might then want to allow remote users password-protected access if they are not on either of the two networks. If the password ACL were put first, everyone would be asked for a password, which would be undesirable.

auth_param basic program /usr/lib/squid/pam_auth
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours

The realm text is required, and here the credentials are set to last for two hours before being requested again.

So here we have PAM authentication. PAM is modular and allows authentication against a whole range of services. Note that you also have to create a Squid entry for PAM in /etc/pam.d/squid consisting of:

auth required /lib/security/pam_unix.so
account required /lib/security/pam_unix.so

You will also need to set the file /usr/lib/squid/pam_auth to be SUID, so that when the squid user executes it, it runs with root permissions and authentication can succeed. I have been caught out by this more than once, as everything works fine when testing as root :-) To make the file SUID, execute the following at a root prompt:

chmod 4755 /usr/lib/squid/pam_auth

Every user on the system with a password will then be able to use the proxy.

I have found PAM a bit flaky for authenticating against a Samba or NT server; there it is better to use NTLM authentication. With the auth_param setup above in place, the following ACL then requires a valid login:

acl password proxy_auth REQUIRED
http_access allow password
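Putting it together, the ordering discussed earlier might look like this in squid.conf (reusing the ACL names from above and the stock all ACL from the default configuration):

http_access allow local
http_access allow colleague
http_access allow password
http_access deny all

Clients on the two trusted networks are never prompted; everyone else must authenticate, and anything left over is denied.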

Site filtering

This is just a quick overview. There is much you can do here; you can also use http://www.squidguard.org to filter sites.

acl filter url_regex "/etc/squid/banned"
deny_info XANDER_ERR filter
http_access deny filter

Put partial site matches, such as doubleclick, into the file /etc/squid/banned to block URLs containing these strings. An example banned file is shown below.
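Each line is a regular expression matched against the whole URL; these entries are illustrative only, not a recommended list:

doubleclick
adserver
banner
popunder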

The file XANDER_ERR lives at /etc/squid/errors/English/XANDER_ERR and is a standard HTML page giving a custom error message.

Have a look at the multitude of variables you can embed in the file.
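A minimal sketch of such a page might be as follows; the %U code, which Squid expands to the requested URL, is one of the variables you can embed:

<html>
<head><title>Site blocked</title></head>
<body>
<h1>Access denied</h1>
<p>The page %U has been blocked by the local filtering policy.</p>
<p>Contact the proxy administrator if you believe this is a mistake.</p>
</body>
</html>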
Web Proxy Auto-discovery
WPAD may be provided by a number of methods: DHCP, DNS (by A record) and DNS (by SRV record).

I have found very limited documentation of WPAD by DHCP and have not managed to implement it.
WPAD using DNS by A record is relatively straightforward once you know the pitfalls :-)

WPAD by SRV record is not implemented in many clients.

A client set to discover its own proxy looks at its own FQDN (Fully Qualified Domain Name), such as workstation1.domain.co.uk, and will look for the hosts wpad.domain.co.uk and then wpad.co.uk, querying for the A record of each host.
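In a BIND zone file for domain.co.uk that can be as simple as a single A record pointing at the web server that will hold the wpad.dat file (the address here is illustrative):

wpad    IN    A    192.168.10.217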

It will then try to download the following file

http://ipaddress/wpad.dat
This is important if you are running virtual hosts.

The client does not send a hostname when it asks for the wpad.dat file; it only uses the IP address (or at least IE6 does). Therefore, if you are running multiple websites on that IP address, the wpad.dat file must be placed in the default website. In short, you must be able to access it by IP.
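If the default website is served by Apache, it also helps to make sure the file is delivered with the proxy auto-config MIME type. A sketch of the relevant httpd.conf line, assuming wpad.dat sits in the default DocumentRoot such as /var/www/html/:

AddType application/x-ns-proxy-autoconfig .dat .pac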

The dat file would be similar to this, although it may be much more complex:

function FindProxyForURL(url, host)
{
    if (isPlainHostName(host) ||
        dnsDomainIs(host, "domain.co.uk") ||
        isInNet(host, "192.168.10.0", "255.255.255.0"))
        return "DIRECT";
    return "PROXY proxy.domain.co.uk:3128; PROXY 192.168.10.217:3128";
}

At present there is a patch available for Mozilla which will allow it to do WPAD; however, it may be easier to copy the wpad.dat file to a proxy.pac file on the same server and set Mozilla to use the URL http://ipaddress/proxy.pac as an automatic configuration URL (see Mozilla's settings).


Web Filtering

The HTTP caching reviewed so far uses Squid Cache. For filtering, we will similarly look at configuration details and at modules that can be used with Squid:

* url_regex
* Squid Guard

url_regex

This is integral to Squid. It reads URL patterns from a specified file and, if a match occurs, Squid will either allow or deny the request depending on the configuration. Here is a segment from the example squid.conf and an example of a banned file.

acl filter url_regex "/etc/squid/banned"
http_access deny filter

Here we have an ACL called filter; its type is url_regex and it uses the file /etc/squid/banned.
The http_access rule is set to deny on a match; as you can see from the example file, this is used to block advert sites and other rubbish. It is easy to block new sites just by adding another entry to the list.

After a new entry has been added, Squid needs to be told to reload its configuration with the following command:
squid -k reconfigure

Alternatively, to restart the HTTP cache completely, you can run the following command:
service squid restart

Squid Guard

Squid Guard has to be downloaded and compiled. This is easier than it sounds; it just depends on having the gcc package installed.

The build runs as follows:

tar zxvf squidguard-xxxx.tar.gz

cd squidGuard-xxx

./configure

make

The install has to be done as root.
make install

Have a read of the documentation and any other information on the site. You will also have to download and install the block lists. There are a large number of different blacklists available, covering everything from porn to violence. These are regularly updated, contain tens of thousands of sites and IPs, and are normally located in /var/spool/squidguard/.

The Access Control Lists work very similarly to those in the squid configuration file.

Read what documentation you can. Once you have it up and working, it is launched from Squid using the redirector configuration option; have a look at the sample file for details.
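As a sketch, the hook-up in squid.conf looks something like this; the binary and configuration paths depend on where make install put things, so check your own installation:

redirect_program /usr/local/bin/squidGuard -c /usr/local/squidGuard/squidGuard.conf
redirect_children 5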

Once you have downloaded or changed any of the files, you can rebuild the database files using the command:

squidGuard -C all

You will note that there are blocklist directories such as:
drwxr-xr-x 2 squid squid 4096 Mar 3 01:23 ads
drwxr-xr-x 2 squid squid 4096 Feb 11 19:12 aggressive
drwxr-xr-x 2 squid squid 4096 Feb 11 19:12 audio-video
drwxr-xr-x 2 squid squid 4096 Feb 11 19:12 drugs
drwxr-xr-x 2 squid squid 4096 Feb 11 19:12 gambling
drwxr-xr-x 2 squid squid 4096 Feb 11 19:12 hacking
drwxr-xr-x 2 squid squid 4096 Feb 12 18:26 mail

Within these directories you will find files such as:
ls -l /var/spool/squidguard/blacklists/ads
total 184
-rw-r----- 1 squid squid 44500 Mar 3 01:23 domains
-rw-r--r-- 1 squid squid 122880 Mar 3 01:24 domains.db
-rw-r--r-- 1 squid squid 27 Feb 25 13:18 expressions
-rw-r----- 1 squid squid 3147 Feb 7 23:55 urls
-rw-r--r-- 1 squid squid 8192 Mar 3 01:24 urls.db

Note that you have domains and urls alongside domains.db and urls.db; the .db files are the database files built by the command above.

The blocklists also provide a good list. If you build your ACLs as good then !bad, a URL found in the good list will be accepted even if it also appears in one of the blacklists.
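As a sketch of what that looks like in squidGuard.conf (the list names and paths are illustrative; use whichever blacklist directories you actually installed):

dbhome /var/spool/squidguard/blacklists
logdir /var/log/squidguard

dest good {
    domainlist good/domains
}

dest ads {
    domainlist ads/domains
    urllist    ads/urls
}

acl {
    default {
        pass good !ads all
        redirect http://proxy.domain.co.uk/blocked.html
    }
}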

