Batch download files from a website

I needed a quick bash script to batch download files from a website, and I came up with the following.

The files had names like 1.gif, 2.gif, etc., and were accessible via a CDN subdomain like http://images.mydomain.com.

So here it is:

for i in {1..18000}; do wget http://images.mydomain.com/$i.gif; sleep 5; done

Replace images.mydomain.com with your own domain and adjust the range and file names, and that’s it.
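If you want the loop to be a bit more forgiving, here is a variant (just a sketch, using the same hypothetical URL pattern) that skips files you already downloaded and retries flaky connections:

for i in {1..18000}; do
    # -nc (no-clobber) skips files that already exist locally,
    # --tries=3 retries each download up to three times
    wget -nc --tries=3 "http://images.mydomain.com/$i.gif"
    sleep 5
done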
Alternatively, you can drop the “sleep 5”, put the whole loop in a file, and execute it with:

[root@nix]# nohup ./script.sh &

This keeps the script running even if you disconnect from the shell session.
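For reference, script.sh would contain something like the following (again assuming the same URL pattern); make it executable first with chmod +x script.sh:

#!/bin/bash
# Batch download 1.gif through 18000.gif from the CDN subdomain
for i in {1..18000}; do
    wget "http://images.mydomain.com/$i.gif"
done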

Posted in BASH, scripts.
