Basic Authentication of Squid 4.5 - /usr/lib64/squid/basic_ncsa_auth file not found

I have CentOS 7.6 and installed Squid 4.5 on it:
sudo yum -y install squid
I followed a guide for basic authentication. Without authentication, Squid works fine.
Here is the squid.conf file after adding the # Basic Authentication part:
#
# Recommended minimum configuration:
#
# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
acl localnet src 0.0.0.1-0.255.255.255 # RFC 1122 "this" network (LAN)
acl localnet src 10.0.0.0/8 # RFC 1918 local private network (LAN)
acl localnet src 100.64.0.0/10 # RFC 6598 shared address space (CGN)
acl localnet src 169.254.0.0/16 # RFC 3927 link-local (directly plugged) machines
acl localnet src 172.16.0.0/12 # RFC 1918 local private network (LAN)
acl localnet src 192.168.0.0/16 # RFC 1918 local private network (LAN)
acl localnet src fc00::/7 # RFC 4193 local private network range
acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
#
# Recommended minimum Access Permission configuration:
#
# Deny requests to certain unsafe ports
http_access deny !Safe_ports
# Deny CONNECT to other than secure SSL ports
# http_access deny CONNECT !SSL_ports
# Only allow cachemgr access from localhost
http_access allow localhost manager
http_access deny manager
# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
#http_access deny to_localhost
#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#
# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
http_access allow localnet
http_access allow localhost
# Basic Authentication
auth_param basic program /usr/lib64/squid/basic_ncsa_auth /etc/squid/passwd
auth_param basic children 5
auth_param basic realm Squid Basic Authentication
auth_param basic credentialsttl 2 hours
acl auth_users proxy_auth REQUIRED
http_access allow auth_users
# allow all requests
acl all src 0.0.0.0/0
http_access allow all
# And finally deny all other access to this proxy
http_access deny all
# Squid normally listens to port 3128
http_port 3128
# Uncomment and adjust the following to add a disk cache directory.
#cache_dir ufs /var/spool/squid 100 16 256
# Leave coredumps in the first cache dir
coredump_dir /var/spool/squid
#
# Add any of your own refresh_pattern entries above these.
#
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
Please see the # Basic Authentication part.
The problem is that /usr/lib64/squid/basic_ncsa_auth does not exist.
Where is that file?
How can I fix this problem, and what is the correct configuration for Squid 4.5?

auth_param basic program /usr/lib/squid/basic_ncsa_auth /etc/squid/passwd

For Squid v4, install the helpers package:
yum install squid-helpers

Then change the path of the basic_ncsa_auth helper from:
# Basic Authentication
auth_param basic program /usr/lib64/squid/basic_ncsa_auth /etc/squid/passwd
to:
# Basic Authentication
auth_param basic program /usr/lib/squid/basic_ncsa_auth /etc/squid/passwd
This will work :)
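As a side note, the helper also needs a password file to read. Below is a sketch of the remaining setup, assuming the httpd-tools package provides htpasswd and that "proxyuser" is just an example name (paths vary by distro and architecture, so checking the package contents first avoids guessing):

```shell
# Confirm where the squid package actually installed the helper
rpm -ql squid | grep basic_ncsa_auth

# Create the password file the helper reads
sudo yum -y install httpd-tools
sudo htpasswd -c /etc/squid/passwd proxyuser

# Make the file readable by the squid user only
sudo chown squid:squid /etc/squid/passwd
sudo chmod 640 /etc/squid/passwd
```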

Related

Squid Proxy deny rules doesn't work when authentication is included in squid configuration file

I am using the Squid proxy for my outbound connection. My whitelist rule and deny rule stopped working the moment I added the authentication layer. When I hit a URL that is not defined in my configuration file, the proxy returns 200. Is this something to do with the rule priority list? It filters only on authentication: if the credentials are correct, the request bypasses all other filters.
# Proxy Authentication
auth_param basic program /usr/lib64/squid/basic_ncsa_auth /etc/squid/passwd
acl authenticated proxy_auth REQUIRED
http_access allow authenticated
# Local network access to proxy
# Safe ports that can be used
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl Safe_ports port 3128
acl CONNECT method CONNECT
# Deny requests to certain unsafe ports
http_access deny !Safe_ports
# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports
# Destination domains that can be accessed
acl whitelist dstdomain .bing.com
acl whitelist dstdomain .google.com
http_access allow whitelist
# Destination domains that cannot be accessed
http_access deny all
The issue is described in the "Common Mistakes" section of the wiki:
https://wiki.squid-cache.org/SquidFaq/SquidAcl#Common_Mistakes
Specifically:
All elements of an acl entry are OR'ed together.
All elements of an access entry are AND'ed together (e.g. http_access and icp_access)
So when you write two different http_access lines, e.g.:
http_access allow authenticated
http_access allow whitelist
they are interpreted as "OR", and therefore either one will "hit". If you want to force the proxy to allow only authenticated users to use the whitelist ACL, the two conditions have to be on the same line. So in your case:
http_access allow authenticated whitelist
This means: (only) allow requests that are authenticated AND whitelisted. Followed by an http_access deny all, this blocks all other traffic as well.
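Putting that together, a sketch of the reordered access section (assuming the same ACL names as in the question's config):

```
# Security denies come first
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
# Conditions on one line are AND'ed: must be authenticated AND whitelisted
http_access allow authenticated whitelist
# Everything else is refused
http_access deny all
```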

url_regex sees only the domain part in the URL in Squid Proxy

I configured Squid Proxy v4.13 with SSL bump on Ubuntu. I read about the url_regex directive, and my objective is to allow access to https sites only and block everything else. However, it is not working for me, as Squid sees only the domain part of the URL when I use the url_regex directive.
For example:
1.) acl whitelist url_regex cric(info|buzz) allows cricbuzz.com and cricinfo.com and blocks other URLs
2.) acl whitelist url_regex https:// blocks all URLs
My understanding is that with line 2 in the conf file, the regex should match all URLs starting with https://, right?
I also tried using the ssl::server_name_regex directive, with no luck. Do I have to modify the squid.conf file to make this work? Could someone explain what the issue is?
Thanks in advance!
Here is my conf file:
http_port 3128 ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=4MB cert=/etc/squid/squidCA.pem
acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump bump all
acl localnet src 0.0.0.1-0.255.255.255 # RFC 1122 "this" network (LAN)
acl localnet src 10.0.0.0/8 # RFC 1918 local private network (LAN)
acl localnet src 100.64.0.0/10 # RFC 6598 shared address space (CGN)
acl localnet src 169.254.0.0/16 # RFC 3927 link-local (directly plugged) machines
acl localnet src 172.16.0.0/12 # RFC 1918 local private network (LAN)
acl localnet src 192.168.0.0/16 # RFC 1918 local private network (LAN)
acl localnet src fc00::/7 # RFC 4193 local private network range
acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost manager
http_access deny manager
acl whitelist url_regex https://
http_access allow whitelist
http_access deny all
include /etc/squid/conf.d/*
http_access allow localnet
http_access allow localhost
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
sslcrtd_program /usr/lib/squid/security_file_certgen -s /var/lib/squid/ssl_db -M 4MB
sslcrtd_children 5
ssl_bump server-first all
sslproxy_cert_error deny all
Use https_port instead:
https_port 3128 ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=4MB cert=/etc/squid/squidCA.pem
http_access only serves to handle traffic without SSL; to block all unsafe traffic, just use http_access deny all.
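The observed behaviour also follows from how HTTPS traverses a proxy: for a CONNECT tunnel, the request line Squid evaluates is just host:port (e.g. www.google.com:443), so no https:// scheme ever appears and that pattern can never match. A sketch of a port-based alternative, reusing the deny-all tail from the config above:

```
# For CONNECT requests the evaluated "URL" is only host:port,
# so match the :443 suffix instead of a scheme
acl https_tunnel url_regex :443$
http_access allow CONNECT https_tunnel
http_access deny all
```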

Squid proxy configuration for allowing a few YouTube URLs

I want to fully block the youtube.com website from being accessed via Squid. After blocking YouTube, I want to allow a few hundred YouTube URLs to be accessed via Squid, so that only educational videos are allowed. My Squid installation is working fine, but I got stuck at the url_regex part: I can't allow just a few YouTube URLs. Either YouTube is blocked or YouTube is fully open. Here is my squid configuration file.
#
# Recommended minimum configuration:
#
# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
#acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
#acl localnet src 172.16.0.0/12 # RFC1918 possible internal network
acl localnet src 192.168.1.0/24 # RFC1918 possible internal network
acl localnet src fc00::/7 # RFC 4193 local private network range
acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
acl allowed_videos dstdomain "/etc/squid/allowed_videos"
acl blocked_sites dstdomain "/etc/squid/blocked_sites"
http_access allow allowed_videos
http_access deny blocked_sites !allowed_videos
#
# Recommended minimum Access Permission configuration:
#
# Deny requests to certain unsafe ports
http_access deny !Safe_ports
# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports
# Only allow cachemgr access from localhost
http_access allow localhost manager
http_access deny manager
# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
#http_access deny to_localhost
#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#
# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
http_access allow localnet
http_access allow localhost
# And finally deny all other access to this proxy
http_access deny all
# Squid normally listens to port 3128
http_port 3128
# Uncomment and adjust the following to add a disk cache directory.
cache_dir ufs /var/spool/squid 512 16 256
# Leave coredumps in the first cache dir
coredump_dir /var/spool/squid
#
# Add any of your own refresh_pattern entries above these.
#
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
The content of the allowed videos file is given below:
.youtube.com/watch\?v=6tO_isJQfCY
The content of the blocked websites file is given below:
.youtube.com
Please help me partially allow YouTube videos through Squid.
You cannot use the Squid proxy by itself as a firewall for content filtering. SquidGuard is also a good option for anyone who wants to filter lots of URLs and sites using DB files. url_regex is not a good fit here, because YouTube URLs contain unpredictable characters, so you would need some app, or code you develop in Python, that checks the entire content of sites to allow or deny them for your clients.
In this case, I suggest you manually allow the URLs (as a whitelist).
Here is a sample of what you want. First, save the URLs in the file "/etc/squid/mywhitelist.txt". Then allow just that file for your clients, like this (note: url_regex, not src, since the file contains URLs):
acl youtube_access url_regex "/etc/squid/mywhitelist.txt"
http_access allow youtube_access
http_access deny all
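One further detail, sketched below using the question's own file paths: dstdomain ACLs match only the host name, so a file entry such as the .youtube.com/watch entry in /etc/squid/allowed_videos can never match. A per-URL whitelist needs url_regex, and for HTTPS it only takes effect when the traffic is SSL-bumped, since otherwise Squid sees just the CONNECT host:

```
# url_regex matches the full URL, so path-level entries can work here
acl allowed_videos url_regex "/etc/squid/allowed_videos"
acl blocked_sites dstdomain .youtube.com
http_access allow allowed_videos
http_access deny blocked_sites
```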

Squid TCP_DENIED/403 with internal ERROR Page

I have a plain installation of a new Squid and Apache2, both with default configuration. The server (Ubuntu 18.04) is registered on an internal DNS server as srv1.foo.bar.
If someone is not allowed to access the internet, Squid displays the internal error message page, but without the Squid logo. I get the following error message in the log file:
TCP_DENIED/403 4187 GET http://srv1.foo.bar:3128/squid-internal-static/icons/SN.png - HIER_NONE/- text/html
The only way I found to display the logo was to comment out http_access deny all. My configuration (Squid Cache: Version 3.5.27):
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl Safe_ports port 3128
acl CONNECT method CONNECT
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
acl localhost src 127.0.0.1/32
acl localnet src 192.168.168.0/24
http_access allow localhost
http_access allow localnet
http_access allow localhost manager
http_access deny manager
http_access deny all
http_port 3128
coredump_dir /var/spool/squid
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern (Release|Packages(.gz)*)$ 0 20% 2880
refresh_pattern . 0 20% 4320
Hope someone can help me. Best regards, AxelF

Squid is giving me 403 from time to time

I have a squid server that is used by a lot of colleagues.
This is the squid config:
dns_v4_first on
# ACL Squid
external_acl_type is_user ipv4 ttl=600 negative_ttl=10 children-max=2000 %SRC /opt/acl_squid.py
# ACL PROXY Access
acl is_real_user external is_user
#acl SSL method CONNECT
acl SSL_ports port 443
acl Safe_ports port 80 # http
#acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
#acl Safe_ports port 70 # gopher
#acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
#acl Safe_ports port 280 # http-mgmt
#acl Safe_ports port 488 # gss-http
#acl Safe_ports port 591 # filemaker
#acl Safe_ports port 777 # multiling http
acl SSL method CONNECT
acl CONNECT method CONNECT
acl to_ipv6 dst ipv6 # Enable IPv6
http_access deny !Safe_ports
# ACL Allow Host/Domain
http_access allow is_real_user
http_access deny !Safe_ports
http_access allow localhost
http_access deny all
# Enable IPv6
#tcp_outgoing_address ipv6_address to_ipv6
# Port
http_port 0.0.0.0:3128
coredump_dir /var/spool/squid
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern (Release|Packages(.gz)*)$ 0 20% 2880
refresh_pattern . 0 20% 4320
# Chache Off
cache deny all
# Performance tuning
maximum_object_size 1 MB
maximum_object_size_in_memory 128 KB
cache_mem 64 MB
quick_abort_min 1024 KB
quick_abort_max 2048 KB
quick_abort_pct 90
pipeline_prefetch on
shutdown_lifetime 1 second
# Log
access_log syslog:local3.info squid
The external ACL just checks the request IP to make sure it has access to the proxy (it checks, via an API call to a DB, whether the IP exists there).
The issue is that from time to time (not only for me, but for lots of colleagues too) I receive a 403 (access denied) for no reason.
Could you please let me know what I can do in order to have access all the time, with no interruptions?
What I do when I receive a 403 is restart the Squid server, and then everything is back to normal.
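For reference, the helper interface the external_acl_type line relies on is simple: Squid writes one %SRC value per line on the helper's stdin and expects "OK" (allow) or "ERR" (deny) back for each. A minimal stand-in sketch; the real helper queries a DB over an API, and the fixed IP list here is purely illustrative:

```shell
#!/bin/sh
# check prints the verdict Squid expects for one %SRC value:
# "OK" allows the client IP, "ERR" denies it.
check() {
    case "$1" in
        192.168.0.10|192.168.0.11) echo "OK" ;;   # illustrative allow-list
        *) echo "ERR" ;;
    esac
}

# Main loop: one request line in, one answer line out. Answers must
# be emitted immediately, or Squid stalls waiting for the verdict.
while read -r ip; do
    check "$ip"
done
```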
Regards,
Ciprian
