Updated version of Squid that caches YouTube and much other content. Read the following:
https://aacable.wordpress.com/2012/01/19/youtube-caching-with-squid-2-7-using-storeurl-pl/
Advantages of Youtube Caching !!!
In most parts of the world bandwidth is very expensive, so in some scenarios it is very useful to cache YouTube videos or other flash videos. Once one user has downloaded a video or flash file, why should the same user, or another user, pull the same content through the internet pipe again and again instead of fetching it from the CACHE?
People on the same LAN often watch similar videos. If I post a YouTube link on FACEBOOK, TWITTER or a similar site, all my friends will watch that video, and that particular video gets viewed many times within a few hours. Since videos are usually shared over Facebook and other social networking sites, the chances are high that a popular video gets multiple hits from my LAN users / friends. [syed.jahanzaib]
This is the reason why I wrote this article.
Disadvantages of Youtube Caching !!!
The chance that another user will watch the same video is really slim. If I search for something specific on YouTube, I get hundreds of results for the same video. What is the chance that another user will search for the same thing and click on the same link / result? YouTube hosts more than 10 million videos, which is too much to cache anyway: you need a lot of space to cache videos, and accordingly you will need very fast modern hardware with tons of SPACE to handle a cache giant of this kind. Anyhow, try it.
AFAIK you are not supposed to cache YouTube videos; YouTube doesn't like it. I don't understand why. Probably because their ranking mechanism relies on views, and possibly completed views, which wouldn't be measurable if the content were served from a local cache.
After struggling unsuccessfully with the storeurl.pl method, I searched for an alternate way to cache YouTube videos. Finally I found a Ruby-based method that uses Nginx to cache YT. Using this method I was able to cache almost all YouTube videos (not 100%, but it works fine in most cases with some modification; I am sure there will be improvements in the near future).
Updated: 24th August, 2012
Thanks to Mr. Eliezer Croitoru, Mr. Christian Loth and others for their kind guidance.
Following components were used in this guide.
Proxy Server Configuration:
Ubuntu Desktop 10.04
Nginx version: nginx/0.7.65
Squid Cache: Version 2.7.STABLE7
Client Configuration for testing videos:
Windows XP with Internet Explorer 6
Windows 7 with Internet Explorer 8
Let's start with the Proxy Server Configuration:
1) Update Ubuntu
First install Ubuntu. After installation, configure its networking components, then update the package lists with the following command
apt-get update
2) Install SSH Server [Optional]
Now install SSH server so that you can manage your server remotely using PUTTY or any other ssh tool.
apt-get install openssh-server
3) Install Squid Server
Now install the Squid server with the following command
apt-get install squid
[This will install squid 2.7 by default]
Now edit the squid configuration file with the following command
nano /etc/squid/squid.conf
Remove all lines and paste the following data
# SQUID 2.7 / Nginx TEST CONFIG FILE
# Email: aacable@hotmail.com
# Web: https://aacable.wordpress.com

# PORT and Transparent Option
http_port 8080 transparent
server_http11 on
icp_port 0

# Cache is set to 5GB in this example (zaib)
store_dir_select_algorithm round-robin
cache_dir aufs /cache1 5000 16 256
cache_replacement_policy heap LFUDA
memory_replacement_policy heap LFUDA

# If you want date and time in the SQUID logs, use the following
emulate_httpd_log on
logformat squid %tl %6tr %>a %Ss/%03Hs %<st %rm %ru %un %Sh/%<A %mt
log_fqdn off

# How many days to keep users' web access logs.
# You need to rotate your log files with a cron job. For example:
# 0 0 * * * /usr/local/squid/bin/squid -k rotate
logfile_rotate 14
debug_options ALL,1
cache_access_log /var/log/squid/access.log
cache_log /var/log/squid/cache.log
cache_store_log /var/log/squid/store.log

# [zaib] I used the DNSMASQ service for fast DNS resolving,
# so install it first with "apt-get install dnsmasq"
dns_nameservers 127.0.0.1 221.132.112.8

# ACL Section
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563      # https, snews
acl SSL_ports port 873          # rsync
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443 563     # https, snews
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl Safe_ports port 631         # cups
acl Safe_ports port 873         # rsync
acl Safe_ports port 901         # SWAT
acl purge method PURGE
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access allow purge localhost
http_access deny purge
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access allow all
http_reply_access allow all
icp_access allow all

# [zaib] I used UBUNTU, so the user is proxy; on FEDORA you may use squid
cache_effective_user proxy
cache_effective_group proxy
cache_mgr aacable@hotmail.com
visible_hostname proxy.aacable.net
unique_hostname aacable@hotmail.com
cache_mem 8 MB
minimum_object_size 0 bytes
maximum_object_size 100 MB
maximum_object_size_in_memory 128 KB
refresh_pattern ^ftp:            1440  20%  10080
refresh_pattern ^gopher:         1440   0%   1440
refresh_pattern -i (/cgi-bin/|\?)   0   0%      0
refresh_pattern (Release|Packages(.gz)*)$  0  20%  2880
refresh_pattern .                   0  50%   4320
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache

# Youtube Cache Section [zaib]
url_rewrite_program /etc/nginx/nginx.rb
url_rewrite_host_header off
acl youtube_videos url_regex -i ^http://[^/]+\.youtube\.com/videoplayback\?
acl range_request req_header Range .
acl begin_param url_regex -i [?&]begin=
acl id_param url_regex -i [?&]id=
acl itag_param url_regex -i [?&]itag=
acl sver3_param url_regex -i [?&]sver=3
cache_peer 127.0.0.1 parent 8081 0 proxy-only no-query connect-timeout=10
cache_peer_access 127.0.0.1 allow youtube_videos id_param itag_param sver3_param !begin_param !range_request
cache_peer_access 127.0.0.1 deny all
Save & Exit.
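A quick way to check which requests the youtube_videos ACL will divert to the nginx peer is to try the same regular expression from the shell. The URL below is an invented example, not a real YouTube link:

```shell
# Hypothetical videoplayback URL, only for testing the regex shape
url='http://r3.lscache2.c.youtube.com/videoplayback?id=abc123&itag=34&sver=3'

# Same pattern as: acl youtube_videos url_regex -i ^http://[^/]+\.youtube\.com/videoplayback\?
if echo "$url" | grep -Eiq '^http://[^/]+\.youtube\.com/videoplayback\?'; then
    echo "sent to nginx peer"
else
    echo "served normally"
fi
```

Requests that match (and that also carry id= and itag= but no begin= parameter or Range header, per the cache_peer_access line) are the ones Squid forwards to 127.0.0.1:8081.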
4) Install Nginx
Now install Nginx with
apt-get install nginx
Now edit its config file with the following command
nano /etc/nginx/nginx.conf
Remove all lines and paste the following data
# This config file is not written by me, [syed.jahanzaib]
# My email address is inserted just for tracking purposes
# For more info, visit http://code.google.com/p/youtube-cache/
# Syed Jahanzaib / aacable [at] hotmail.com

user www-data;
worker_processes 4;
pid /var/run/nginx.pid;

events {
    worker_connections 768;
}

http {
    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    keepalive_timeout 65;
    types_hash_max_size 2048;
    include /etc/nginx/mime.types;
    default_type application/octet-stream;
    access_log /var/log/nginx/access.log;
    error_log /var/log/nginx/error.log;
    gzip on;
    gzip_static on;
    gzip_comp_level 6;
    gzip_disable "msie6";
    gzip_vary on;
    gzip_types text/plain text/css text/xml text/javascript application/json application/x-javascript application/xml application/xml+rss;
    gzip_proxied expired no-cache no-store private auth;
    gzip_buffers 16 8k;
    gzip_http_version 1.1;
    include /etc/nginx/conf.d/*.conf;
    include /etc/nginx/sites-enabled/*;

    # starting youtube section
    server {
        listen 127.0.0.1:8081;
        location / {
            root /usr/local/www/nginx_cache/files;
            #try_files "/id=$arg_id.itag=$arg_itag" @proxy_youtube; # Old one
            #try_files "$uri" "/id=$arg_id.itag=$arg_itag.flv" "/id=$arg_id-range=$arg_range.itag=$arg_itag.flv" @proxy_youtube; # old2
            try_files "/id=$arg_id.itag=$arg_itag.range=$arg_range.algo=$arg_algorithm" @proxy_youtube;
        }
        location @proxy_youtube {
            resolver 221.132.112.8;
            proxy_pass http://$host$request_uri;
            proxy_temp_path "/usr/local/www/nginx_cache/tmp";
            #proxy_store "/usr/local/www/nginx_cache/files/id=$arg_id.itag=$arg_itag"; # Old 1
            proxy_store "/usr/local/www/nginx_cache/files/id=$arg_id.itag=$arg_itag.range=$arg_range.algo=$arg_algorithm";
            proxy_ignore_client_abort off;
            proxy_method GET;
            proxy_set_header X-YouTube-Cache "aacable@hotmail.com";
            proxy_set_header Accept "video/*";
            proxy_set_header User-Agent "YouTube Cacher (nginx)";
            proxy_set_header Accept-Encoding "";
            proxy_set_header Accept-Language "";
            proxy_set_header Accept-Charset "";
            proxy_set_header Cache-Control "";
        }
    }
}
Save & Exit.
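The try_files and proxy_store lines above mean nginx derives each cached file's name from four query parameters. A small sketch of that key construction, with invented parameter values:

```shell
# Hypothetical query string; mirrors nginx's $arg_id, $arg_itag, $arg_range, $arg_algorithm
query='id=e06969d12723e863&itag=34&range=0-1700000&algorithm=throttle-factor'

# Pull one parameter out of the query string
arg() { printf '%s\n' "$query" | tr '&' '\n' | sed -n "s/^$1=//p"; }

# Same layout as: proxy_store ".../id=$arg_id.itag=$arg_itag.range=$arg_range.algo=$arg_algorithm"
key="id=$(arg id).itag=$(arg itag).range=$(arg range).algo=$(arg algorithm)"
echo "$key"
```

If try_files finds a file with this name under /usr/local/www/nginx_cache/files, it is served locally; otherwise the @proxy_youtube location fetches it and proxy_store saves it under the same name.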
Now Create directories to hold cache files
mkdir /usr/local/www
mkdir /usr/local/www/nginx_cache
mkdir /usr/local/www/nginx_cache/tmp
mkdir /usr/local/www/nginx_cache/files
chown www-data /usr/local/www/nginx_cache/files/ -Rf
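As a side note, the four mkdir calls can be collapsed with mkdir -p, which creates missing parent directories and is harmless when they already exist. The $root variable here is only so the sketch can be tried outside /usr/local/www:

```shell
# Build the cache tree in one step; set root=/usr/local/www for the article's real layout
root="${root:-/tmp/nginx_cache_demo}"
mkdir -p "$root/nginx_cache/tmp" "$root/nginx_cache/files"
ls "$root/nginx_cache"
```

On the real server, still run the chown from the article so the www-data user can write into the files directory.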
Now create the nginx.rb file
touch /etc/nginx/nginx.rb
chmod 755 /etc/nginx/nginx.rb
nano /etc/nginx/nginx.rb
Paste the following data into this newly created file
#!/usr/bin/env ruby1.8
# This script is not written by me,
# My email address is inserted just for tracking purposes
# For more info, visit http://code.google.com/p/youtube-cache/
# Syed Jahanzaib / aacable [at] hotmail.com
#
# url_rewrite_program <path>/nginx.rb
# url_rewrite_host_header off

require "syslog"
require "base64"

class SquidRequest
  attr_accessor :url, :user
  attr_reader :client_ip, :method

  def method=(s)
    @method = s.downcase
  end

  def client_ip=(s)
    @client_ip = s.split('/').first
  end
end

def read_requests
  # URL <SP> client_ip "/" fqdn <SP> user <SP> method [<SP> kvpairs]<NL>
  STDIN.each_line do |ln|
    r = SquidRequest.new
    r.url, r.client_ip, r.user, r.method, *dummy = ln.rstrip.split(' ')
    (STDOUT << "#{yield r}\n").flush
  end
end

def log(msg)
  Syslog.log(Syslog::LOG_ERR, "%s", msg)
end

def main
  Syslog.open('nginx.rb', Syslog::LOG_PID)
  log("Started")

  read_requests do |r|
    if r.method == 'get' && r.url !~ /[?&]begin=/ &&
       r.url =~ %r{\Ahttp://[^/]+\.youtube\.com/(videoplayback\?.*)\z}
      log("YouTube Video [#{r.url}].")
      "http://127.0.0.1:8081/#{$1}"
    else
      r.url
    end
  end
end

main
Save & Exit.
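Squid hands the rewriter one request per line on stdin (URL, client_ip/fqdn, user, method) and reads the possibly-rewritten URL back from stdout. The decision nginx.rb makes can be sketched as an awk one-liner, handy for testing the rewrite rule without running Squid; the sample input line below is invented:

```shell
# One Squid redirector input line: URL client_ip/fqdn user method
line='http://v5.lscache8.c.youtube.com/videoplayback?id=abc&itag=34 10.0.0.5/- - GET'

# Same test as nginx.rb: a GET, no begin= parameter, host *.youtube.com/videoplayback
echo "$line" | awk '{
    if ($4 == "GET" && $1 !~ /[?&]begin=/ &&
        $1 ~ /^http:\/\/[^\/]+\.youtube\.com\/videoplayback\?/)
        sub(/^http:\/\/[^\/]+\.youtube\.com\//, "http://127.0.0.1:8081/", $1)
    print $1
}'
```

A matching request is rewritten to the local nginx listener on 127.0.0.1:8081; anything else passes through unchanged.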
5) Install RUBY
What is RUBY?
Ruby is a dynamic, open source programming language with a focus on simplicity and productivity. It has an elegant syntax that is natural to read and easy to write. [syed.jahanzaib]
Now install RUBY with the following command
apt-get install ruby
6) Configure Squid Cache DIR and Permissions
Now create the cache dir and assign proper permissions to the proxy user
mkdir /cache1
chown proxy:proxy /cache1
chmod -R 777 /cache1
Now initialize the squid cache directories with
squid -z
You should see the following message
Creating Swap Directories
7) Finally Start/restart SQUID & Nginx
service squid start
service nginx restart
Now, from a test PC, open YouTube and play any video. After it downloads completely, delete the browser cache and play the same video again; this time it will be served from the cache. You can verify this by monitoring your WAN link utilization while playing the cached file.
Look at the WAN utilization graph below; it was taken while watching a clip that is not in the cache.
Now look at the WAN utilization graph below; it was taken while watching a clip that is now in the CACHE.
It will load the first chunk from the cache; if the user keeps watching the clip, it will load the next chunk at the end of the first one, and so on.
Video cache files can be found in the following location.
/usr/local/www/nginx_cache/files
e.g:
ls -lh /usr/local/www/nginx_cache/files
The above listing shows the clip is in 360p quality, and the length of the clip is 5:54.
itag=34 shows the video quality is 360p.
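For reference, a few other legacy itag values you may see in these file names. These mappings come from community documentation of 2012-era YouTube formats, so treat them as approximate:

```shell
# Map a legacy YouTube itag to its video quality (community-documented values)
itag_quality() {
    case "$1" in
        5)  echo '240p (FLV)' ;;
        18) echo '360p (MP4)' ;;
        34) echo '360p (FLV)' ;;
        35) echo '480p (FLV)' ;;
        22) echo '720p (MP4)' ;;
        *)  echo 'unknown' ;;
    esac
}

itag_quality 34   # the itag shown in the listing above
```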
Credits: Thanks to Mr. Eliezer Croitoru, Mr. Christian Loth and others for their kind guidance.
Find files that have not been accessed for x days. This is useful for deleting old cache files.
find /usr/local/www/nginx_cache/files -atime +30 -type f
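GNU find's -delete can be appended to that command to actually remove the old files; it is safer to run the plain listing first. A self-contained dry run against a throwaway directory (the paths and file names here are fabricated):

```shell
# Build a throwaway cache dir with one "old" file to show the age filter
cachedir=$(mktemp -d)
touch "$cachedir/id=fresh.itag=34"
touch -a -d '40 days ago' "$cachedir/id=stale.itag=34"   # GNU touch date syntax

# Lists only files not accessed for 30+ days; append -delete once the list looks right
find "$cachedir" -atime +30 -type f
```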
Regards,
Syed Jahanzaib
What about “an error occurred”? Did you get any solution for it or not? I face the same problem.
Adeel Ahmed
http://www.facebook.com/wifitech
Comment by adeelkml — August 13, 2012 @ 1:28 PM
Have u tried 240p quality ?
Comment by Syed Jahanzaib / Pinochio~:) — August 13, 2012 @ 6:37 PM
yeah it works fine at 240p….do we need to make changes to cache other video sites?
can we use storeurl.pl and Nginx together?
Comment by adeelkml — August 13, 2012 @ 10:45 PM
i used 240p, 480p and 720p; these all work fine.
i use the SmartVideo Firefox plugin, which forces the video to play in the desired quality every time. It automatically selects your desired quality on each and every video.
http://youtu.be/5BejfxzXqMw
Comment by adeelkml — August 17, 2012 @ 5:31 AM
Salam, sir. The YouTube cache does work, but YouTube HQ videos do not open; an error appears.
Comment by M.Tahir Shafiq — August 25, 2012 @ 7:07 PM
There are some limitations and restrictions using this method. I will post more findings about it next week,
Comment by Syed Jahanzaib / Pinochio~:) — August 26, 2012 @ 10:02 AM
thanks for the new YT cache,
but in squid.conf there are no ZPH directives to mark cached content, so that it can later be picked up by Mikrotik.
Did you find the solution ???
Does the Nginx method cache Windows Update, AntiVirus updates and (Google, MSN, CNN, Metacafe, etc.) videos ???
Comment by Ahmed Morgan — August 13, 2012 @ 4:43 PM
This method is just an example of caching YouTube videos. You can integrate it to cache other dynamic content as well, along with squid. I will do some more testing and post updates whenever I get time. The problem is that I have a job in a Microsoft environment and I only get one free day every 10-15 days for my personal testing, so it's hard to find time for R&D.
🙂
Comment by Syed Jahanzaib / Pinochio~:) — August 13, 2012 @ 6:36 PM
Hello Mr.Syed
I tested Squid + Nginx for caching, but I can't use it because every video I open gives me “an error occurred”, or the error appears while the video is playing.
Did you solve it ????
Comment by Ahmed Morgan — August 13, 2012 @ 5:22 PM
Have you tried watching videos in 240p ?
Comment by Syed Jahanzaib / Pinochio~:) — August 13, 2012 @ 6:34 PM
thanks for the new solution, but does this method cache all web sites or not?
will videos from sites like xnxx, redtube, tube8 and facebook be cached?
and will downloaded files be cached?
thanks .
Comment by aa — August 13, 2012 @ 6:01 PM
This method is just an example of caching YouTube videos. You can integrate it to cache other dynamic content as well, along with squid.
Comment by Syed Jahanzaib / Pinochio~:) — August 13, 2012 @ 6:34 PM
Dear sir,
I followed all your above-mentioned steps. Only 240p videos work fine. 360p gives a lot of pain: random errors like getting stuck at 39 seconds, “error occurred”, or the cached video playing as a black screen.
The 240p format works flawlessly. Kindly do something about 360p.
Thank you
Comment by Saqib — August 14, 2012 @ 7:41 AM
i will install and check thanks
Comment by Hussien — August 14, 2012 @ 7:41 AM
what about
cache_dir aufs /cache1 5000 16 256
i have 6 HDDs, each one 2 TB
Comment by Hussien — August 14, 2012 @ 9:07 AM
Is this True ????
cache_dir
This is where you set the directories you will be using. You should have already mkreiserfs'd your cache directory partitions, so you'll have an easy time deciding the values here. First, you will want to use about 60% or less of each cache directory for the web cache; if you use any more than that you will begin to see a slight degradation in performance. Remember that cache size is not as important as cache speed, since for maximum effectiveness your cache only needs to store about a week's worth of traffic. You'll also need to define the number of directories and subdirectories. The formula for deciding that is this:
x = size of cache dir in KB (i.e. 6 GB ≈ 6,000,000 KB)
y = average object size (just use 13 KB)
z = number of directories per first-level directory

(((x / y) / 256) / 256) * 2 = # of first-level directories

As an example, I use 6 GB of each of my 13 GB drives, so:

6,000,000 / 13 = 461,538.5
461,538.5 / 256 = 1,802.9
1,802.9 / 256 = 7.04, and 7 * 2 = 14
So my cache_dir line would look like this:
cache_dir 6000 14 256
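The arithmetic in that formula is easy to check with shell integer math, using the same numbers as the example above:

```shell
# (((x / y) / 256) / 256) * 2, with x = cache size in KB, y = average object size in KB
size_kb=6000000   # 6 GB
avg_obj_kb=13
l1_dirs=$(( size_kb / avg_obj_kb / 256 / 256 * 2 ))
echo "$l1_dirs"   # number of first-level (L1) directories
```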
Comment by Hussien — August 14, 2012 @ 9:19 AM
so mine would be :
#cache_dir aufs /1 1200000 2817 256
#cache_dir aufs /2 1200000 2817 256
#cache_dir aufs /3 1200000 2817 256
#cache_dir aufs /4 1200000 2817 256
#cache_dir aufs /5 1200000 2817 256
#cache_dir aufs /6 1200000 2817 256
but i got an error and squid does not start:
FATAL: xcalloc: Unable to allocate 4112557923 blocks of 1 bytes!
Comment by Hussien — August 14, 2012 @ 9:20 AM
What squid version you have ?
Comment by Syed Jahanzaib / Pinochio~:) — August 17, 2012 @ 9:31 AM
I have the exact same problem. It was working 100% with all 4 drives, but I made a boo-boo and needed to reinstall the OS.
I have 4 x 2 TB drives, each with 1.7 TB available, and 8 GB RAM. When I use only 3 drives, it works.
Squid conf:
cache_mem 6 GB
minimum_object_size 0 KB
maximum_object_size 128 MB
maximum_object_size_in_memory 128 KB
cache_swap_low 97
cache_swap_high 99
cache_dir aufs /cache1 1782211 38 256
cache_dir aufs /cache2 1782211 38 256
cache_dir aufs /cache3 1782211 38 256
cache_dir aufs /cache4 1782211 38 256
That will give you 1.6 TB allocated.
root@proxy:/# /usr/local/squid/sbin/squid -d1N -f /usr/local/squid/etc/squid.conf
Not currently OK to rewrite swap log.
storeDirWriteCleanLogs: Operation aborted.
Starting Squid Cache version 2.7.STABLE9 for x86_64-unknown-linux-gnu…
Process ID 4700
With 1048576 file descriptors available
Using epoll for the IO loop
Performing DNS Tests…
Successful DNS name lookup tests…
DNS Socket created at 0.0.0.0, port 32888, FD 6
Adding nameserver 127.0.0.1 from squid.conf
Adding nameserver **1.***.0.36 from squid.conf
Adding nameserver **1.***.0.37 from squid.conf
helperOpenServers: Starting 15 ‘storeurl.pl’ processes
logfileOpen: opening log /var/log/squid/access.log
Swap maxSize 7299936256 + 6291456 KB, estimated 562017516 objects
Target number of buckets: 28100875
Using 33554432 Store buckets
Max Mem size: 6291456 KB
Max Swap size: 7299936256 KB
FATAL: xcalloc: Unable to allocate 4109054858 blocks of 1 bytes!
Not currently OK to rewrite swap log.
storeDirWriteCleanLogs: Operation aborted.
Why can’t I use all 4 drives and what if I wanted to add drives?
Comment by A.J. Hart — November 17, 2013 @ 2:02 PM
sorry for the many replies … I solved them.
but about your way to cache youtube:
root@CACHE:~# ls -lh /usr/local/www/nginx_cache/files
total 6.8M
-rw——- 1 www-data www-data 1.7M 2012-04-12 23:19 id=4a7d9b42ce959d72.itag=34
-rw——- 1 www-data www-data 1.7M 2012-04-14 23:08 id=bd651baa1f14e2fe.itag=34
-rw——- 1 www-data www-data 1.7M 2012-03-02 22:09 id=d32be70e6b010b70.itag=34
-rw——- 1 www-data www-data 1.7M 2012-04-17 17:41 id=e06969d12723e863.itag=34
I can't cache more than 1.7 MB of any video, and then I get:
an error occurred. please try again later.
Comment by Hussien — August 14, 2012 @ 10:43 AM
please tell me how you solved your cache_dir of 2TB?
Comment by operatorglobalnet — September 29, 2013 @ 2:31 PM
The ACL which is used to send YT requests to nginx will only allow the first segment of a video to be cached. 240p works because if we select 240p the &range tag is removed and the video is requested as a full file, not a range request.
Comment by Saqib — August 14, 2012 @ 1:47 PM
so how do you cache 20M and 6M?
Comment by Hussien — August 14, 2012 @ 4:13 PM
very nice, but it needs some improvements, even though I haven't tried the method as Pak Syed posted it.
I hope someone, or he himself, has some spare time to make it better.
Comment by Ma'el — August 14, 2012 @ 5:11 PM
Sir, kindly help: my Mikrotik cache-hit queue is not giving more than 16 Mbps (2 MB/sec). If I configure the browser to use the proxy directly, it gets 95-98 Mbps on hit content.
name=”Unlimited speed for cache hits” parent=global-out
packet-mark=cache-hits limit-at=0 queue=default priority=8
max-limit=0 burst-limit=0 burst-threshold=0 burst-time=0s
Comment by saqib — August 14, 2012 @ 9:12 PM
hello sir, nice work, but only 1.7 MB of each YT file is cached. Why not combine your previous storeurl.pl script with this new one? Thanks for your Lusca dynamic youtube cache for squid, thanks a lot. Please finish your project; you are great.
Comment by rajeevsamal — August 14, 2012 @ 11:10 PM
Mangling based on DSCP (TOS) slows cache-hit content down to 17 Mbps (2.2 MB/sec). I have checked it on both MT v3.3 & v5.18.
Is it a bug or a Mikrotik limitation? Sir Jahanzaib and all blog members, kindly take a look into this matter. Thanks
Comment by saqib — August 15, 2012 @ 12:49 AM
I haven’t checked it yet; I will look into it.
Comment by Syed Jahanzaib / Pinochio~:) — August 17, 2012 @ 9:29 AM
Hi Syed, please check out this project; we are using it and getting good results
http://sourceforge.net/projects/incomum/
Anything mail-me
Comment by int21 — August 15, 2012 @ 1:09 AM
Thanks for sharing. I will surely check it soon, and if it's good I will write something about it too.
Comment by Syed Jahanzaib / Pinochio~:) — August 15, 2012 @ 8:08 AM
hello syed.
please, can you tell me why, when I change cache_dir to point to another HDD, squid stops working?
I mounted it and squid made the swap, but after that squid does not work, and it says
root@CACHE:~# /etc/init.d/squid3 restart
* Restarting Squid HTTP Proxy 3.x squid3 [ OK ]
Comment by Hussien — August 15, 2012 @ 10:11 AM
I implemented your Incomum solution on my test box, but same issue: Incomum caches only the first 51 seconds of 360p videos. 240p works fine. Thanks for sharing.
Comment by Saqib — August 15, 2012 @ 8:54 PM
It’s not a perfect solution, but it works in some respects; for example, if your clients have Windows XP/2003, the cache works perfectly. On Windows 7, only 240p works great.
Comment by Syed Jahanzaib / Pinochio~:) — August 17, 2012 @ 9:24 AM
Your email address is bouncing mails.
Comment by Syed Jahanzaib / Pinochio~:) — August 17, 2012 @ 9:28 AM
squid caches only youtube, not other sites ??????????????
Youtube: only 240p working
Comment by M.Tahir Shafiq — August 28, 2012 @ 7:16 PM
2012/08/15 08:43:47| Rebuilding storage in /c1 (DIRTY)
2012/08/15 08:44:36| Rebuilding storage in /c2 (DIRTY)
2012/08/15 08:45:24| Rebuilding storage in /c3 (DIRTY)
2012/08/15 08:46:12| Rebuilding storage in /c4 (DIRTY)
2012/08/15 08:47:00| Rebuilding storage in /c5 (DIRTY)
Comment by Hussien — August 15, 2012 @ 10:49 AM
help please
Comment by Hussien — August 15, 2012 @ 1:04 PM
Squid is rebuilding the storage after an unclean shutdown of squid (crash).
It means that the previous time you ran Squid you did not let it terminate in a clean manner, and Squid needs to verify the consistency of the cache a little harder while rebuilding the internal index of what is cached.
Comment by Syed Jahanzaib / Pinochio~:) — August 17, 2012 @ 9:27 AM
i try to uninstall then install it again and the same problem it didnt give dirty
Comment by Hussien — August 17, 2012 @ 8:14 PM
when i changed the hard disks it said CLEAN.
why are seagate HDDs always DIRTY and western digital CLEAN ???
i have one more question .. can i cache iphone and android applications ??
thanks syed 🙂
Comment by Hussien — August 18, 2012 @ 9:49 PM
Try lowering the cache DIR size.
Try starting with a lower number, let's say 100 GB, and then check; if it's OK, keep increasing until it gives the error.
Try changing aufs to ufs.
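A side note on the xcalloc failures reported in this thread: Squid 2.x keeps an in-memory index of every on-disk object, often estimated at roughly 10 to 14 MB of RAM per GB of cache_dir. A rough feasibility check, treating 10 MB/GB as a rule-of-thumb assumption rather than an exact figure:

```shell
# Estimate index RAM for a given total cache_dir size
cache_gb=6800    # e.g. 4 drives x 1.7 TB, as reported above
mb_per_gb=10     # rule-of-thumb index cost for squid 2.x (assumption)
echo "$(( cache_gb * mb_per_gb )) MB of RAM needed just for the index"
```

By that estimate, ~6.8 TB of cache_dir wants on the order of 68 GB of RAM for the index alone, far beyond the 8 GB reported above, which is consistent with the allocation failures.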
Comment by Syed Jahanzaib / Pinochio~:) — August 18, 2012 @ 10:04 PM
Nice sharing !
Comment by faizan — August 15, 2012 @ 3:02 PM
Thank you for sharing it,
I tried it, but like other people reported, it caches for a very few seconds, then stops, and the video becomes unplayable.
It would be great if there were a fix for it.
Comment by Badr — August 17, 2012 @ 11:27 AM
me too, I ran into the problem.
and we need to configure its networking components
Comment by abdallah — August 18, 2012 @ 4:06 PM
Sir, please tell me how I can remove Ubuntu's old settings; the directory is giving me problems.
Comment by arsalan malick — August 18, 2012 @ 3:01 AM
sir,
if YouTube is being cached, then why does the downloaded video show PTCL's (the ISP's) server address:
http://o-o—preferred—par03g08—v4—nonxt5.c.youtube.com/
Shouldn't it show the cache proxy server's address instead?
Comment by Kashif Lari — August 18, 2012 @ 2:45 PM
salam,
this is rehmat ali. I have a request: I followed these steps, but squid can't start. The error is:
2012/08/16 01:32:09| WARNING: url_rewriter #5 (FD 10) exited
2012/08/16 01:32:09| WARNING: url_rewriter #4 (FD 9) exited
2012/08/16 01:32:09| WARNING: url_rewriter #3 (FD 8) exited
2012/08/16 01:32:09| Too few url_rewriter processes are running)
when I remove these lines from squid.conf
#url_rewrite_program /etc/nginx/nginx.rb
#url_rewrite_host_header off
squid starts and everything works fine, but youtube is not cached because of these lines.
when I add these lines back
#url_rewrite_program /etc/nginx/nginx.rb
#url_rewrite_host_header off
squid giving error
2012/08/16 01:32:09| WARNING: url_rewriter #5 (FD 10) exited
2012/08/16 01:32:09| WARNING: url_rewriter #4 (FD 9) exited
2012/08/16 01:32:09| WARNING: url_rewriter #3 (FD 8) exited
2012/08/16 01:32:09| Too few url_rewriter processes are running)
please give me any guide to resolve this
thanking you,
Comment by rehmat ali — August 19, 2012 @ 1:42 AM
one more thing I have to inform you: I am using lusca/squid with nginx on CentOS. What is "user www-data;" in nginx.conf? Because I have no user created except root.
I changed "user www-data;" into "user squid;".
Still the same issue; please resolve this as soon as possible.
thanking you,
Comment by rehmat ali — August 19, 2012 @ 7:21 AM
nginx error.log file error
2012/08/16 07:46:04 [error] 8286#0: *34 open() “/usr/share/nginx/html/squid-internal-periodic/store_digest” failed (2: No such file or directory), client: 127.0.0.1, server: localhost, request: “GET /squid-internal-periodic/store_digest HTTP/1.0”, host: “127.0.0.1”
Comment by rehmat ali — August 19, 2012 @ 7:51 AM
2012/08/16 08:03:30 [crit] 8538#0: *12 rename() “/home/www/nginx_cache/tmp/0000000005” to “/home/www/nginx_cache/files/id=09b3a2c99e648c4f.itag=34” failed (13: Permission denied) while reading upstream, client: 127.0.0.1, server: , request: “GET http://o-o—preferred—ptcl-khi1—v9—lscache8.c.youtube.com/videoplayback?algorithm=throttle
Comment by rehmat ali — August 19, 2012 @ 8:05 AM
actually, in my squid.conf my user and group are squid. Should I change everywhere it says (proxy) to (squid) in the nginx.conf file?
thanks
Comment by rehmat ali — August 19, 2012 @ 8:08 AM
If you are using Ubuntu, then the user will be proxy.
If other than Ubuntu, like Fedora, then stick with the squid user.
Comment by Syed Jahanzaib / Pinochio~:) — August 20, 2012 @ 3:59 PM
I tried it, but like other people reported, it caches for a very few seconds, then stops, and the video becomes unplayable.
It would be great if there were a fix for it.
And we need to configure its networking components.
Comment by abdallah — August 20, 2012 @ 4:07 PM
wait . . .
Comment by Syed Jahanzaib / Pinochio~:) — August 22, 2012 @ 9:33 AM
I tried it, but like other people reported, it caches for a very few seconds, then stops, and the video becomes unplayable.
It would be great if there were a fix for it.
And we need to configure its networking components.
Comment by abdallah — August 20, 2012 @ 4:23 PM
salam
finally, I found some errors when starting squid on CentOS with the nginx method. It seems that ruby can't be accessed by squid. I tried changing the permissions of /usr/bin/env, but nothing happened. Please check the log below and correct my conf. thx
2012/08/19 09:34:35| Starting Squid Cache version LUSCA_HEAD-r14809 for i686-pc-linux-gnu…
2012/08/19 09:34:35| Process ID 9430
2012/08/19 09:34:35| NOTICE: Could not increase the number of filedescriptors
2012/08/19 09:34:35| With 1024 file descriptors available
2012/08/19 09:34:35| Using poll for the IO loop
2012/08/19 09:34:35| Performing DNS Tests…
2012/08/19 09:34:35| Successful DNS name lookup tests…
2012/08/19 09:34:35| Adding nameserver 10.0.5.1 from squid.conf
2012/08/19 09:34:35| DNS Socket created at 0.0.0.0, port 40182, FD 5
2012/08/19 09:34:35| Adding nameserver 203.99.163.240 from squid.conf
2012/08/19 09:34:35| helperOpenServers: Starting 5 ‘nginx.rb’ processes
/usr/bin/env: ruby1.8: Permission denied
/usr/bin/env: ruby1.8: Permission denied
/usr/bin/env: ruby1.8: Permission denied
/usr/bin/env: ruby1.8: Permission denied
/usr/bin/env: ruby1.8: Permission denied
2012/08/19 09:34:35| logfileOpen: opening log /var/log/squid/access.log
2012/08/19 09:34:35| logfileOpen: opening log /var/log/squid/access1.log
2012/08/19 09:34:35| logfileOpen: opening log /var/log/squid/access2.log
2012/08/19 09:34:35| logfileOpen: opening log /var/log/squid/access3.log
2012/08/19 09:34:35| logfileOpen: opening log /var/log/squid/access4.log
2012/08/19 09:34:35| Unlinkd pipe opened on FD 19
2012/08/19 09:34:35| Swap maxSize 1024000000 + 2097152 KB, estimated 78930550 objects
2012/08/19 09:34:35| Target number of buckets: 3946527
2012/08/19 09:34:35| Using 4194304 Store buckets
2012/08/19 09:34:35| Max Mem size: 2097152 KB
2012/08/19 09:34:35| Max Swap size: 1024000000 KB
2012/08/19 09:34:35| Local cache digest enabled; rebuild/rewrite every 3600/3600 sec
2012/08/19 09:34:35| logfileOpen: opening log /var/log/squid/store4.log
2012/08/19 09:34:35| AUFS: /home/cache1: log ‘/home/cache1/swap.state’ opened on FD 21
2012/08/19 09:34:35| AUFS: /home/cache1: tmp log /home/cache1/swap.state.new opened on FD 21
2012/08/19 09:34:35| Rebuilding storage in /home/cache1 (DIRTY)
2012/08/19 09:34:35| Using Least Load store dir selection
2012/08/19 09:34:35| Current Directory is /root
2012/08/19 09:34:36| ufs_rebuild: /home/cache1: rebuild type: REBUILD_DISK
2012/08/19 09:34:36| ufs_rebuild: /home/cache1: beginning rebuild from directory
2012/08/19 09:34:35| Loaded Icons.
2012/08/19 09:34:36| Accepting transparently proxied HTTP connections at 0.0.0.0, port 8080, FD 23.
2012/08/19 09:34:36| Configuring 127.0.0.1 Parent 127.0.0.1/8081/0
2012/08/19 09:34:36| Ready to serve requests.
2012/08/19 09:34:36| WARNING: url_rewriter #5 (FD 10) exited
2012/08/19 09:34:36| WARNING: url_rewriter #4 (FD 9) exited
2012/08/19 09:34:36| WARNING: url_rewriter #3 (FD 8) exited
2012/08/19 09:34:36| Too few url_rewriter processes are running
FATAL: The url_rewriter helpers are crashing too rapidly, need help!
Squid Cache (Version LUSCA_HEAD-r14809): Terminated abnormally.
CPU Usage: 0.041 seconds = 0.020 user + 0.021 sys
Maximum Resident Size: 21360 KB
Page faults with physical i/o: 0
Memory usage for squid via mallinfo():
total space in arena: 2412 KB
Ordinary blocks: 2352 KB 6 blks
Small blocks: 0 KB 0 blks
Holding blocks: 65208 KB 3 blks
Free Small blocks: 0 KB
Free Ordinary blocks: 59 KB
Total in use: 67560 KB 100%
Total free: 59 KB 0%
Comment by rehmat ali — August 22, 2012 @ 9:38 AM
Or paste configs for CentOS: squid.conf, nginx.conf, nginx.rb……. thanks for your reply
Comment by Rehmat ali — August 22, 2012 @ 11:59 AM
thanks mr.
I tried it, but like other people reported, it caches for a very few seconds, then stops, and the video becomes unplayable.
It would be great if there were a fix for it.
And we need to configure its networking components.
Comment by abdallah — August 23, 2012 @ 4:20 PM
Updated: 24th August, 2012: Caching Working OK. Check with the new config.
Comment by Syed Jahanzaib / Pinochio~:) — August 24, 2012 @ 1:08 PM
[…] https://aacable.wordpress.com/2012/08/13/youtube-caching-with-squid-nginx/ […]
Pingback by Howto Cache Youtube with SQUID / LUSCA and bypass Cached Videos from Mikrotik Queue « Syed Jahanzaib Personnel Blog to Share Knowledge ! — August 24, 2012 @ 1:31 PM
[…] Youtube caching with Squid + Nginx […]
Pingback by Youtube Caching Problem : An error occured. Please try again later. [SOLVED] updated storeurl.pl « Syed Jahanzaib Personnel Blog to Share Knowledge ! — August 24, 2012 @ 1:32 PM
Awesome,
I tried it on Windows 7 with Chrome, then loaded the video with Internet Explorer 8, and it read it from the cache without any problem.
Thanks,
Comment by Badr — August 24, 2012 @ 7:09 PM
Good to know.
Comment by Syed Jahanzaib / Pinochio~:) — August 24, 2012 @ 9:25 PM
Thank you very much. My proxy works fine.
Comment by Maz Bhenks — August 24, 2012 @ 7:59 PM
Thanks, Mr Syed Jahanzaib.
I followed all the steps, but in the end Squid doesn't work and gives this error:
WARNING: url_rewriter #5 (FD 10) exited
WARNING: url_rewriter #4 (FD 9) exited
WARNING: url_rewriter #3 (FD 8) exited
Too few url_rewriter processes are running
FATAL: The url_rewriter helpers are crashing too rapidly, need help!
But when I comment out these lines in squid.conf:
#url_rewrite_program /etc/nginx/nginx.rb
#url_rewrite_host_header off
it works, but doesn't cache YouTube. Please help me solve this. Thanks.
Comment by Squidy — August 25, 2012 @ 1:15 PM
Check your permissions.
chmod 777 -R /squidcache
touch /etc/nginx/nginx.rb
chmod 755 /etc/nginx/nginx.rb
nano /etc/nginx/nginx.rb
chown www-data /usr/local/www/nginx_cache/files/ -Rf
chown proxy:proxy /squidcache
Comment by Badr — August 25, 2012 @ 1:43 PM
Also, if you have SELinux, try disabling it.
Thanks
Comment by Badr — August 25, 2012 @ 1:47 PM
The most common reason for “FATAL: The url_rewriter helpers are crashing too rapidly, need help!” is a copy-paste error in nginx.rb. It sometimes happens when pasting code from a WordPress blog. I will test it on Monday and let you know. I will also post it on pastebin; send me your email and I will send you the raw code.
Comment by Syed Jahanzaib / Pinochio~:) — August 25, 2012 @ 4:30 PM
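For anyone debugging the paste-damage case described here, a quick check one can run on the copied script; this is a sketch demonstrated on a deliberately damaged throwaway file (the real path would be /etc/nginx/nginx.rb):

```shell
# Copy-pasting the rewriter script from a web page often smuggles in Windows
# line endings (\r) or curly "smart quotes" that crash the helper on startup.
# Demonstrated on a deliberately damaged throwaway file:
printf '#!/usr/bin/env ruby\nputs "ok"\r\n' > /tmp/paste_check.rb

CR=$(printf '\r')
if grep -q "$CR" /tmp/paste_check.rb; then
    # one portable fix: tr -d '\r' < damaged-file > clean-file
    echo "paste damage: Windows line endings found"
fi
```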
Thanks to Mr Badr and Mr Syed Jahanzaib for the help. I removed Ubuntu and reinstalled it, and now it works fine with no errors 🙂
Comment by SquidY — August 25, 2012 @ 10:07 PM
Great. Can you share your experience with some screenshots?
Comment by Syed Jahanzaib / Pinochio~:) — August 26, 2012 @ 10:01 AM
If I don't use dnsmasq, how do I configure the dns_nameservers and the resolver in nginx.conf? In your conf, you use 221.132.112.8.
My proxy IP is 192.168.0.1.
May Allah reward you with good.
Comment by Mahdiy — August 27, 2012 @ 4:42 AM
One more question:
I have Lusca installed on my proxy. Do I need to uninstall it?
Comment by Mahdiy — August 27, 2012 @ 7:07 AM
Lusca can work with Nginx too; its configuration is almost the same as Squid's.
Comment by Syed Jahanzaib / Pinochio~:) — August 27, 2012 @ 9:11 AM
You can use any DNS server IP; it's not really an issue.
Comment by Syed Jahanzaib / Pinochio~:) — August 27, 2012 @ 9:12 AM
chown www-data /usr/local/www/nginx_cache/files/ -Rf
What is www-data? Is it a user? I have CentOS with Lusca and I never created any user named www-data, so should I change www-data to squid or nginx? Please clarify this. My e-mail is big.bang.now@gmail.com; please send me the raw nginx.rb code, maybe that will resolve it.
Comment by Rehmat ali — August 28, 2012 @ 7:38 AM
On Ubuntu, www-data is a user/group created specifically for web servers. It is listed in /etc/passwd as a user, and Apache can be configured to run as a different user in /etc/apache2/apache2.conf.
Basically, it's just a user with stripped permissions, so if someone managed to find a security hole in one of your web applications, they wouldn't be able to do much. Without a lower-privileged user like www-data, apache2 would run as root, which would be a bad thing, since it would be able to do anything and everything to your system.
So on CentOS, you may use apache.
Comment by Syed Jahanzaib / Pinochio~:) — August 28, 2012 @ 9:19 AM
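A quick way to confirm which of these accounts a given box actually has before running chown; this is a sketch, and the user list is just the common candidates mentioned in this thread:

```shell
# List which of the usual unprivileged web/proxy accounts exist on this box
# before deciding who should own the cache directories:
# www-data (Ubuntu/Debian), apache (CentOS httpd), nginx, squid, proxy.
for u in www-data apache nginx squid proxy; do
    getent passwd "$u" > /dev/null && echo "found: $u" || true
done
```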
Do you know why it doesn't work for 360p or 480p?
Comment by nav — August 28, 2012 @ 1:28 PM
I have tested it and it's caching 360p perfectly (which is the default video quality for the YouTube player).
Comment by Syed Jahanzaib / Pinochio~:) — August 29, 2012 @ 12:42 PM
Do you know if it will work with an iPad? I don't think it uses range queries.
Comment by nav — August 30, 2012 @ 9:27 AM
When starting Squid I get an error; it looks like Ruby has some permission problem. I tried changing the ownership of /usr/bin/env to squid, nginx, and apache, but I still have the same issue. Please help me resolve this, or guide me; I am using CentOS with Lusca. Thanks for your reply.
2012/08/25 16:00:03| helperOpenServers: Starting 5 ‘nginx.rb’ processes
/usr/bin/env: ruby1.8: Permission denied
/usr/bin/env: ruby1.8: Permission denied
/usr/bin/env: ruby1.8: Permission denied
/usr/bin/env: ruby1.8: Permission denied
Comment by rehmat ali — August 28, 2012 @ 4:15 PM
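That error usually means env could not find an executable named ruby1.8 on the PATH (or found a non-executable one), not that /usr/bin/env itself has the wrong owner. A hedged diagnostic sketch:

```shell
# "/usr/bin/env: ruby1.8: Permission denied" -- check what interpreters
# actually exist rather than chown'ing /usr/bin/env:
command -v ruby1.8 || echo "ruby1.8 not on PATH"
command -v ruby    || echo "ruby not on PATH"

# If only "ruby" exists (typical on CentOS), edit the first line of
# /etc/nginx/nginx.rb from "#!/usr/bin/env ruby1.8" to "#!/usr/bin/env ruby".
```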
Dear Jahanzaib bhai, I got this error when I tried to enable the URL rewriter program.
I'm using CentOS 5.5; kindly advise me how I can resolve this issue.
2012/08/29 10:44:10| Ready to serve requests.
2012/08/29 10:44:10| WARNING: url_rewriter #1 (FD 6) exited
2012/08/29 10:44:10| WARNING: url_rewriter #2 (FD 8) exited
2012/08/29 10:44:10| WARNING: url_rewriter #3 (FD 9) exited
2012/08/29 10:44:10| Too few url_rewriter processes are running
FATAL: The url_rewriter helpers are crashing too rapidly, need help!
Comment by owais — August 29, 2012 @ 10:58 AM
Dear Jahanzaib bhai, I got this error when I tried to enable the URL rewriter program.
I'm using CentOS 6 and Fedora 14 with Squid 3.1 and Lusca; kindly advise me how I can resolve this issue.
2012/08/29 10:44:10| Ready to serve requests.
2012/08/29 10:44:10| WARNING: url_rewriter #1 (FD 6) exited
2012/08/29 10:44:10| WARNING: url_rewriter #2 (FD 8) exited
2012/08/29 10:44:10| WARNING: url_rewriter #3 (FD 9) exited
2012/08/29 10:44:10| Too few url_rewriter processes are running
FATAL: The url_rewriter helpers are crashing too rapidly, need help!
Comment by owais — August 29, 2012 @ 10:59 AM
It's working well for me, but I'm not getting full LAN speed for cached content through Mikrotik.
Comment by rajeevsamal — August 29, 2012 @ 6:48 PM
VideoCacheView… check out this tool on Google… you don't need all of that. 🙂 Happy downloading
Comment by Anonymous — August 30, 2012 @ 3:54 AM
VideoCacheView is a client-side tool only.
Comment by Syed Jahanzaib / Pinochio~:) — August 30, 2012 @ 9:07 AM
Can someone please tell me how to install Squid 2.7? It's installing squid3.
I tried apt-get install squid=2.7
Comment by nav — August 31, 2012 @ 7:25 AM
apt-get install squid
That installs Squid 2.7.
How do I mangle the hits from Nginx in Mikrotik for ZPH?
Does anyone know?
Comment by xbyt — August 31, 2012 @ 8:31 AM
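On the ZPH question: with Squid 2.7 / Lusca, cache hits can be tagged with a TOS value that Mikrotik then matches by DSCP. The following is a sketch only, using Squid 2.7's zph_* directives and standard RouterOS mangle syntax; whether responses served by the Nginx parent are actually tagged as hits depends on the setup, so treat the zph_parent line as an assumption:

```
# squid.conf (Squid 2.7 / Lusca)
zph_mode tos
zph_local 0x30     # tag local cache hits with TOS 0x30
zph_parent 0x30    # tag parent (nginx) hits the same way -- assumption

# RouterOS mangle: TOS 0x30 corresponds to DSCP 12 (0x30 >> 2)
/ip firewall mangle add chain=forward dscp=12 \
    action=mark-packet new-packet-mark=cache-hits passthrough=no
```

A queue that excludes (or unthrottles) the cache-hits packet mark then lets cached content flow at LAN speed.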
apt-get install squid is installing squid3
Comment by nav — August 31, 2012 @ 10:50 AM
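On the packaging point: `apt-get install squid=2.7` fails because apt's `=` pin requires the full version string. A sketch of the usual way to find and pin it; the version string shown is illustrative, not one to copy:

```shell
# On newer Ubuntu releases the "squid" package pulls in squid3, so pinning
# 2.7 needs the exact version. Inspect what the archive offers first:
apt-cache policy squid squid3

# Then install with the exact version string shown in the policy output,
# e.g. (hypothetical version, copy the real one from above):
# apt-get install squid=2.7.STABLE9-2.1ubuntu6
```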
I have installed Squid 2.7 from source, as apt-get installs squid3.
However, I get the following too:
2012/08/31 16:28:17| WARNING: url_rewriter #1 (FD 7) exited
2012/08/31 16:28:17| WARNING: url_rewriter #3 (FD 9) exited
2012/08/31 16:28:17| WARNING: url_rewriter #2 (FD 8) exited
2012/08/31 16:28:17| Too few url_rewriter processes are running
2012/08/31 16:28:17| ALERT: setgid: (1) Operation not permitted
FATAL: The url_rewriter helpers are crashing too rapidly, need help!
Comment by nav — August 31, 2012 @ 11:31 AM
I am on Ubuntu
Comment by nav — August 31, 2012 @ 11:32 AM
execute:
sed -i “/^#/d;/^ *$/d” /etc/squid/squid.conf
and post your squid.conf here.
Comment by xbyt — August 31, 2012 @ 8:38 PM
Sorry, I had installed Ubuntu 12. After installing Ubuntu 10, it's all working fine.
Comment by nav — September 2, 2012 @ 5:36 AM
It can work with Ubuntu 12 too, but if it worked on 10 for you, I am glad it worked 🙂
Comment by Syed Jahanzaib / Pinochio~:) — September 3, 2012 @ 9:01 AM
We're having problems with some computers on the network that are not going through the proxy server. What could it be?
Comment by lucasffernandes — August 31, 2012 @ 10:53 PM
Sorry, I was installing Ubuntu 12. With Ubuntu 10.04 everything installed fine; however, I cannot access any website.
cache.log says:
/etc/nginx/nginx.rb: 10: require: not found
/etc/nginx/nginx.rb: 11: require: not found
/etc/nginx/nginx.rb: 13: class: not found
/etc/nginx/nginx.rb: 14: attr_accessor: not found
/etc/nginx/nginx.rb: 15: attr_reader: not found
/etc/nginx/nginx.rb: 17: Syntax error: “(” unexpected
/etc/nginx/nginx.rb: 10: require: not found
/etc/nginx/nginx.rb: 11: require: not found
/etc/nginx/nginx.rb: 13: class: not found
/etc/nginx/nginx.rb: 14: attr_accessor: not found
/etc/nginx/nginx.rb: 15: attr_reader: not found
.
.
.
2012/09/01 17:40:13| Configuring 127.0.0.1 Parent 127.0.0.1/8081/0
2012/09/01 17:40:13| Ready to serve requests.
2012/09/01 17:40:13| WARNING: url_rewriter #1 (FD 7) exited
2012/09/01 17:40:13| WARNING: url_rewriter #2 (FD 8) exited
2012/09/01 17:40:13| WARNING: url_rewriter #3 (FD 9) exited
2012/09/01 17:40:13| Too few url_rewriter processes are running
FATAL: The url_rewriter helpers are crashing too rapidly, need help!
Squid Cache (Version 2.7.STABLE7): Terminated abnormally.
Comment by nav — September 1, 2012 @ 12:49 PM
Sorry, it's working now. There was a blank line at the top of the nginx.rb file… oops.
Comment by nav — September 1, 2012 @ 12:59 PM
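The blank-line problem nav found explains the "require: not found" errors quoted earlier: if anything precedes the `#!` line, the kernel falls back to running the script with /bin/sh, which then chokes on Ruby keywords. A minimal sketch on a throwaway file (the real script would be /etc/nginx/nginx.rb):

```shell
# Anything before the "#!" line (even one blank line) makes the kernel run
# the script with /bin/sh, which reports Ruby keywords as "not found".
printf '\n#!/usr/bin/env ruby\nputs "hi"\n' > /tmp/nginx_rb_demo

head -c 2 /tmp/nginx_rb_demo | grep -q '^#!' \
    && echo "shebang first" || echo "shebang NOT first"   # → shebang NOT first

# Fix: delete the leading blank lines so "#!" starts at byte 1
sed -i '/./,$!d' /tmp/nginx_rb_demo
head -c 2 /tmp/nginx_rb_demo | grep -q '^#!' \
    && echo "shebang first" || echo "shebang NOT first"   # → shebang first
```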
Assalam-o-Alaikum, sir. How do I connect Ubuntu to Mikrotik? Please tell me soon, sir.