Connecting to a squid proxy securely

Squid's https_port directive specifies a port on which it will accept proxy requests over SSL/TLS.

An example entry in the squid.conf file:

https_port 8123 cert=/etc/pki/tls/private/squid.pem

I am using port 8123 as it is already a port allowed under the default SELinux installation on Red Hat Enterprise Linux. The cert= option is the path to the SSL certificate in PEM format. If you do not specify a key, Squid assumes it is bundled in the certificate PEM file.
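If you keep the private key in its own file instead, the key= option on https_port names it. A sketch, using the same port as above and hypothetical file paths:

```
https_port 8123 cert=/etc/pki/tls/certs/squid.pem key=/etc/pki/tls/private/squid.key
```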

Don’t forget to open the firewall port.

At present Firefox and Chrome support connecting to the squid proxy over SSL, though at the moment you must use a pac file rather than the browsers' internal proxy settings.

An example pac file looks like:

cat /var/www/html/wpad.pac
function FindProxyForURL(url, host) {
    return "HTTPS proxy.example.com:8123";
}

You can then host the pac file on a friendly web server (or on your local file system) and point the web browser at it in its configuration settings.

You can also launch Chrome from the command line on a Mac with the following flags:

open ./Google\ Chrome.app --args --proxy-server=https://proxy.example.com:8123

Windows takes similar command-line flags; details are here:

http://www.chromium.org/developers/design-documents/secure-web-proxy

Check your proxy logs to ensure that the proxy is functioning.

PLEASE NOTE: Chrome and Firefox will not accept a secure (HTTPS) proxy in their application settings; they require a proxy.pac / wpad.dat file to function. If you configure the secure proxy directly in the application settings, the web browser will fail to connect and throw an error.


Squid General

Proxy Overview

A proxy intercepts requests from a web browser to a web server. If you have a proxy in an office or school where people may request the same page multiple times, such as google.co.uk or yahoo.co.uk, it can save bandwidth and speed up browsing. The proxy will cache (keep a copy of) the page and images and serve them directly to the browser rather than downloading them from the server again.

This sort of thing can also assist greatly when applying bug fixes and updates to computers. If multiple computers are running the same operating system, the update files only have to be downloaded to the LAN once, making updates much faster.

The proxy focused on here is Squid, a piece of GPL software available from http://www.squid-cache.org
Squid configuration
There is extensive documentation available for Squid, and a mailing list where most queries you may have will already have been covered in detail many times :-)

What people often get stuck on is the ACLs (Access Control Lists). ACLs are important, for if the proxy is left open to all, it will be abused by mail spammers and others.

It is possible to have password authentication against a Samba server, a Windows computer (NT, W2K or XP), htpasswd files, Novell servers and LDAP servers. I have probably missed a few – check the documentation.

It is possible to provide transparent logins against a Samba or NT server using NTLM; again, check the docs for how to do this.

I misunderstood the ACLs for a long time: I had assumed that multiple ACLs on a single http_access line were ORed together rather than ANDed.

So, for example, if you create an ACL for the local network 192.168.10.0/24 and a colleague's network 80.233.135.0/24, to allow both of these you would have to do the following:

acl local src 192.168.10.0/24
acl colleague src 80.233.135.0/24

http_access allow local
http_access allow colleague

The following would NOT work

http_access allow local colleague

The second would not work because all ACLs on a single http_access line must match, and a computer / user would never be in both networks at once.

The ACLs are stackable: Squid works through the http_access lines in order until it gets a match, so you can order them to make finer adjustments to the access rules.

You might then want to allow remote users password-protected access if they are not on either of the two networks. If the password ACL were put first, everyone would be asked for a password, which would be undesirable.
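Putting that together, a sketch of the ordering, using the ACL names from above plus a proxy_auth ACL named password (defined further down):

```
# LAN clients get through without a password
http_access allow local
http_access allow colleague
# anyone else must authenticate
http_access allow password
# everything that falls through is refused
http_access deny all
```

Because Squid stops at the first matching line, LAN clients never see an authentication prompt.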

auth_param basic program /usr/lib/squid/pam_auth
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours

The realm text is required, and here the credentials are set to last for two hours before a re-request.

So here we have PAM authentication. PAM is modular and allows authentication against a whole range of services. Note that you also have to make a squid entry for PAM in /etc/pam.d/squid consisting of:

auth required /lib/security/pam_unix.so
account required /lib/security/pam_unix.so

You will also need to set the file /usr/lib/squid/pam_auth to be SUID, so that when the squid user executes it, it runs with root permissions and authentication can succeed. I have been caught out by this more than once, as it works fine when testing as root :-) To make the file SUID, execute the following at a root prompt:

chmod 4755 /usr/lib/squid/pam_auth

Each user on the system with a password will then be able to use the proxy.
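To confirm the SUID bit actually took, check the mode afterwards. This sketch uses a hypothetical scratch file in /tmp so it is safe to try anywhere; on the proxy itself the target would be /usr/lib/squid/pam_auth:

```shell
# Scratch file standing in for /usr/lib/squid/pam_auth
touch /tmp/pam_auth_demo

# 4755 = SUID bit plus rwxr-xr-x
chmod 4755 /tmp/pam_auth_demo

# GNU stat prints the octal mode; the leading 4 is the SUID bit
stat -c '%a' /tmp/pam_auth_demo
```

In ls -l output the SUID bit shows up as an "s" in the owner execute position: -rwsr-xr-x.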

I have found PAM a bit flaky for authenticating against a Samba or NT server; there it is better to use NTLM authentication.

The ACL that actually triggers authentication is:

acl password proxy_auth REQUIRED
http_access allow password

Site filtering

This is just a quick view; there is much you can do here. You can also use http://www.squidguard.org to filter sites.

acl filter url_regex "/etc/squid/banned"
deny_info XANDER_ERR filter
http_access deny filter

Stick partial site matches, such as doubleclick, into the file /etc/squid/banned to block URLs containing those words.
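The banned file is one pattern per line, and url_regex treats each line as a regular expression matched against the whole URL. An illustrative /etc/squid/banned (the entries are examples only, not a recommended blocklist):

```
doubleclick
ads\.
tracker
```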


The file XANDER_ERR is found at /etc/squid/errors/English/XANDER_ERR and is a standard HTML page giving a custom error message.

Have a look at the multitude of variables you can embed in the file.
Web Proxy Auto-discovery
WPAD may be provided by a number of methods: DHCP, DNS (by A record), and DNS (by SRV record).

I have found very limited documentation of WPAD by DHCP and have not managed to implement it.
WPAD using DNS by A record is relatively straightforward once you know the pitfalls :-)

WPAD by SRV record is not implemented in many clients.

A client set to discover its own proxies looks at its own FQDN (Fully Qualified Domain Name), such as workstation1.domain.co.uk, and will look for the hosts wpad.domain.co.uk and then wpad.co.uk, checking for an A record for each.
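In BIND terms, that means publishing an A record for the wpad host in your zone. A hypothetical fragment of the domain.co.uk zone file, pointing at the web server that hosts the wpad.dat file (the 192.168.10.217 address matches the proxy used in the pac example below):

```
; wpad host used for proxy auto-discovery
wpad    IN  A   192.168.10.217
```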

It will then try to download the following file

http://ipaddress/wpad.dat
This is important if you are running virtual hosts.

The client does not send a hostname when it asks for the wpad.dat file; it requests the file by IP address alone (or at least IE6 does). Therefore, if you are running multiple websites on that IP address, the wpad.dat file must be placed in the default website. In short, you must be able to access it by IP.

The dat file would be similar to this, however it may be much more complex:

function FindProxyForURL(url, host)
{
    if (isPlainHostName(host) ||
        dnsDomainIs(host, "domain.co.uk") ||
        isInNet(host, "192.168.10.0", "255.255.255.0"))
        return "DIRECT";
    return "PROXY proxy.domain.co.uk:3128; PROXY 192.168.10.217:3128";
}
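Since a pac file is plain JavaScript, you can sanity-check its logic outside the browser by supplying stand-ins for the helper functions the browser normally provides. The stubs below are simplified assumptions for testing only (the real isInNet does a DNS lookup; this one assumes the host is already a dotted-quad IP):

```javascript
// Simplified stand-ins for the PAC helpers a browser provides.
function isPlainHostName(host) {
    return host.indexOf(".") === -1;
}
function dnsDomainIs(host, domain) {
    return host.length >= domain.length &&
           host.substring(host.length - domain.length) === domain;
}
function isInNet(host, pattern, mask) {
    // Assumes host is a dotted-quad IP; non-IP hosts simply fail to match.
    var ip = host.split(".").map(Number);
    var net = pattern.split(".").map(Number);
    var m = mask.split(".").map(Number);
    return ip.every(function (octet, i) {
        return (octet & m[i]) === (net[i] & m[i]);
    });
}

// The same PAC function as above.
function FindProxyForURL(url, host) {
    if (isPlainHostName(host) ||
        dnsDomainIs(host, "domain.co.uk") ||
        isInNet(host, "192.168.10.0", "255.255.255.0"))
        return "DIRECT";
    return "PROXY proxy.domain.co.uk:3128; PROXY 192.168.10.217:3128";
}
```

Running FindProxyForURL("http://intranet/", "intranet") or any host under domain.co.uk should give "DIRECT", while an outside host such as example.com should return the proxy list.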

At present there is a patch available for Mozilla which will allow it to do WPAD; however, for Mozilla it may be easier to copy the wpad.dat file to a proxy.pac file on the same server and set Mozilla to use the URL http://ipaddress/proxy.pac as an automatic configuration URL (see Mozilla's settings).


Copyright © 1996-2013 Xander Harkness. All rights reserved.