broken downloads in CLI

General discussion about Linux, Linux distributions, using Linux, etc.
newbie
Company Havaldaar Major
Posts: 156
Joined: Thu Aug 08, 2002 4:18 am
Location: lahore
Contact:

broken downloads in CLI

Postby newbie » Wed Dec 17, 2003 5:13 am

Salam!
How are you all?

Normally, when we start a download through wget or ncftp and the machine is rebooted, how can we resume that download?

If that's not possible, how can we stop a download manually and then resume it manually after some time? (The process can be run in the background.)

Thanks for reading.

LinuxFreaK
Site Admin
Posts: 5132
Joined: Fri May 02, 2003 10:24 am
Location: Karachi
Contact:

Re:

Postby LinuxFreaK » Wed Dec 17, 2003 6:59 am

Dear Newbie,
Salam,

Use wget -c whatever_url if the server supports it. For ncftp there's an option on the get command; I'm not sure what it is, so try 'help get'.
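
For example, something along these lines (the URL is just a placeholder):

wget -c http://example.com/path/to/big-file.iso

If the download gets interrupted, running the same command again from the same directory continues from the existing partial file instead of starting over.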

Best Regards.
Farrukh Ahmed

fawad
Site Admin
Posts: 918
Joined: Wed Aug 07, 2002 8:00 pm
Location: Addison, IL
Contact:

Postby fawad » Wed Dec 17, 2003 9:45 am

In general, the wget -c option mentioned by LinuxFreaK should work with all HTTP/1.1 servers (Apache httpd and IIS are both compliant). I find the resuming capabilities of Konqueror very helpful (of course that's not CLI, so it's a bit off-topic).
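
If you want to check whether a particular server actually supports resuming, something like this (placeholder URL again) prints the response headers; look for "Accept-Ranges: bytes", or a "206 Partial Content" status when continuing a partial file:

wget -c -S http://example.com/big-file.iso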

newbie
Company Havaldaar Major
Posts: 156
Joined: Thu Aug 08, 2002 4:18 am
Location: lahore
Contact:

Postby newbie » Thu Dec 18, 2003 12:16 am

Thanks a lot, LinuxFreaK, it worked.

Thanks a lot, fawad bhai, for the Konqueror information; I was also looking for GUI download-resuming software for Linux.

zaeemarshad
Lieutenant Colonel
Posts: 660
Joined: Sat Jul 06, 2002 12:35 pm
Location: Islamabad
Contact:

Postby zaeemarshad » Thu Dec 18, 2003 2:50 am

One thing I would like to add that should help: often, when downloading from a website, the web server terminates the session while the connection remains established but no data is transferred. Download software on Windows like DAP fails to download from such sites; examples are books.dimka.ee and a few SourceForge links. What I did was write a shell script like this:

#!/bin/bash
# Kill any wget that is still running (or hung), wait a few seconds,
# then restart the download with -c so it resumes from the partial file.
kill -9 `pgrep wget`
sleep 5
wget -c url_to_file

Then add an entry in crontab to run it every 4 minutes. Cron executes this script, killing wget first and then launching it again after 5 seconds, which I found was necessary for a few servers. I hope that helps. If somebody thinks the error is due to some other reason, kindly let me know. Anyway, the script combined with crond works absolutely fine.
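
For reference, a crontab entry along these lines runs the script every 4 minutes (the script path is just an example):

*/4 * * * * /home/zaeem/resume-download.sh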

Regards
Zaeem Arshad

lambda
Major General
Posts: 3452
Joined: Tue May 27, 2003 7:04 pm
Location: Lahore
Contact:

Postby lambda » Fri Dec 19, 2003 2:05 pm

Don't use kill -9 unless you really need it.
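
In the script above, a plain kill, which sends SIGTERM by default and gives wget a chance to exit cleanly, should do the job just as well:

kill `pgrep wget`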

Faraz.Fazil
Major General
Posts: 1024
Joined: Thu Jul 04, 2002 5:31 pm
Location: Karachi/Pakistan/Earth/Universe
Contact:

Postby Faraz.Fazil » Sat Dec 20, 2003 9:23 am

I do not recommend wget other than for ripping websites.
Agreed, it is flexible, but it's only console-based, and it's better to use one of the specially designed GUI download managers for Linux.

For downloading files (not websites), use a good download manager for Linux.

Use either one of:

1. Aria
2. Downloader 4 X

Both are really cool GUI-based download managers for Linux, and have many great features like auto-resuming and multi-threaded downloads. I love D4X.

D4X Download:
==========

http://www.krasu.ru/soft/chuchelo/

Happy downloading!
Linux for Life!

