For Linux users, wget is a good old friend. Quoting its man page, it is a "non-interactive network downloader". When we need to download from multiple URLs, wget can take its input from a file containing those URLs: the '-i' option tells wget which input file to read. Apart from that, you can write a small script and run it in the background, as explained below.
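For instance, if the URLs are kept one per line in a plain text file (say urls.txt, a name picked just for this example), a single command handles them all:
wget -i urls.txt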
Suppose I need to download files from the following URLs (note: the URLs here are given purely as examples; they may or may not be real links, and their content is out of my control):
http://www.abcd.com/page1/file1.pdf
http://www.efgh.com/page2/file2.ogg
http://www.ijkl.com/page3/file3.jpg
Thus we put these in a variable named URLS, separated by spaces (not commas, since the for loop splits the list on whitespace): URLS="http://www.abcd.com/page1/file1.pdf http://www.efgh.com/page2/file2.ogg http://www.ijkl.com/page3/file3.jpg". Now run a simple for loop over it.
for u in $URLS
do
wget -b -P /home/user/Downloads --no-check-certificate --user=username --password=password "$u"
done
-b : run the download in the background
-P : specify the download directory
The other arguments, such as the username and password or proceeding without certificate checks, are optional; include or drop them based on your requirements. Consolidating all of the above, the overall script will look similar to the one below.
#!/bin/bash
URLS="http://www.abcd.com/page1/file1.pdf \
http://www.efgh.com/page2/file2.ogg \
http://www.ijkl.com/page3/file3.jpg"
for u in $URLS
do
wget -b -P /home/user/Downloads --no-check-certificate --user=username --password=password "$u"
done
exit 0
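Since -b detaches each wget into the background, nothing is printed to the terminal; by default each run logs to a wget-log file (wget-log, wget-log.1 and so on) in the current directory, so you can follow the progress with something like:
tail -f wget-log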
In case you want to pause a download midway, just pull out the process ID with the pidof command, then execute kill -STOP <pid> to pause it and kill -CONT <pid> to resume it.
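As a quick sketch, assuming the downloads started by this script are the only wget processes running, the two commands below pause and resume all of them at once:
kill -STOP $(pidof wget)    # pause every running wget process
kill -CONT $(pidof wget)    # resume them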
Try it out and have fun with wget…