broken downloads in CLI

General discussion about Linux, Linux distributions, using Linux, etc.

Postby newbie » Wed Dec 17, 2003 5:13 am

Salam! How are you all?

Normally, when we start a download through wget or ncftp and the machine is rebooted, how can we resume that download?

If that's not possible, how can we stop a download manually and then resume it manually after some time? (The process can be run in the background.)

Thanks for reading.
newbie
Company Havaldaar Major
 
Posts: 156
Joined: Thu Aug 08, 2002 4:18 am
WLM: usman_fool@hotmail.com
Location: lahore

Re:

Postby LinuxFreaK » Wed Dec 17, 2003 6:59 am

Dear Newbie,
Salam,

Use wget -c whatever_url if the server supports resuming. For ncftp, there's an option on the get command; I'm not sure what it is. Try 'help get'.
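For example (the URL here is just a placeholder):

```shell
# Start the download; if it is interrupted (reboot, Ctrl-C, dropped
# connection), re-run the exact same command. The -c flag tells wget
# to continue from the existing partial file instead of starting over.
wget -c http://example.com/big-file.iso
```

This only works when the server supports byte-range requests; otherwise wget starts the file from the beginning.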

Best Regards.
Farrukh Ahmed
LinuxFreaK
Site Admin
 
Posts: 5132
Joined: Fri May 02, 2003 10:24 am
ICQ: 82075802
Website: http://www.linuxpakistan.net/wiki/index.php?pagename=LinuxFreak
WLM: f4fahmed@hotmail.com
Yahoo Messenger: f4fahmed@yahoo.com
AOL: linuxpakistan@aol.com
Location: Karachi

Postby fawad » Wed Dec 17, 2003 9:45 am

In general, the wget -c option mentioned by LinuxFreaK should work with all HTTP/1.1 servers (Apache httpd and IIS are both compliant). I find the resuming capabilities of Konqueror very helpful (of course, that's not CLI, so it's kinda off-topic).
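Under the hood, resuming is just an HTTP/1.1 byte-range request: the client asks for the bytes starting at the size of the partial file it already has. As a sketch, curl (an alternative to wget; the URL is a placeholder) does the same thing with -C:

```shell
# -C -  work out the resume offset from the existing local file
# -O    save under the remote file name
curl -C - -O http://example.com/big-file.iso
# this sends a request header along the lines of:  Range: bytes=1048576-
```

A server that honors the header replies with "206 Partial Content" and only the missing bytes.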
fawad
Site Admin
 
Posts: 918
Joined: Wed Aug 07, 2002 8:00 pm
ICQ: 17672437
Website: http://www.fawad.net
WLM: fawadhalim@hotmail.com
Yahoo Messenger: fawad2048
AOL: fawadhalim
Location: Addison, IL

Postby newbie » Thu Dec 18, 2003 12:16 am

Thanks a lot, LinuxFreaK, it worked.

Thanks a lot, fawad bhai, for the Konqueror information, because I was also looking for GUI download-resuming software for Linux.
newbie
Company Havaldaar Major
 
Posts: 156
Joined: Thu Aug 08, 2002 4:18 am
WLM: usman_fool@hotmail.com
Location: lahore

Postby zaeemarshad » Thu Dec 18, 2003 2:50 am

One thing I would like to add that should help: often, when downloading from a website, the web server terminates the session while the connection remains established but no data is transferred. Download software on Windows such as DAP fails to download from such sites; examples are books.dimka.ee and a few SourceForge links. What I did was write a shell script like this:

#!/bin/bash
# kill any running wget, wait a moment, then resume the download
pids=$(pgrep -x wget)
[ -n "$pids" ] && kill -9 $pids
sleep 5
wget -c url_to_file

Then I added a crontab entry to run it every 4 minutes. Cron executes this script, killing wget first and then launching it again after 5 seconds, which I found was necessary for a few servers. I hope that helps. If somebody thinks the error is due to some other reason, kindly let me know. Anyway, the script combined with crond works absolutely fine.
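The crontab entry looks something like this (the script path here is just an assumption; use wherever you saved it):

```shell
# crontab -e, then add:
# min   hour  dom  mon  dow  command
*/4     *     *    *    *    /home/zaeem/resume-download.sh
```

The */4 step syntax means "every 4th minute" in Vixie cron.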

Regards
Zaeem Arshad
zaeemarshad
Lieutenant Colonel
 
Posts: 660
Joined: Sat Jul 06, 2002 12:35 pm
Website: http://zaeem.no-ip.org
WLM: zarshadvirk@hotmail.com
Yahoo Messenger: negativecreep61@yahoo.com
AOL: zarshadvirk
Location: Islamabad

Postby lambda » Fri Dec 19, 2003 2:05 pm

Don't use kill -9 unless you really need it.
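The difference, sketched with a throwaway sleep process instead of wget: SIGTERM (the default signal) asks the process to exit and can be caught for cleanup, while SIGKILL (-9) cannot be caught, so the process gets no chance to tidy up.

```shell
#!/bin/sh
# Start a dummy long-running process in the background.
sleep 60 &
pid=$!

kill "$pid"          # default signal is SIGTERM (15): a polite request
wait "$pid"
status=$?
echo "exit status after SIGTERM: $status"   # 128 + 15 = 143
```

With kill -9 the exit status would be 137 (128 + 9), and a program like wget would have no opportunity to flush or close its output file.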
lambda
Major General
 
Posts: 3452
Joined: Tue May 27, 2003 7:04 pm
Website: http://www.hungry.com/~fn/
Location: Lahore

Postby Faraz.Fazil » Sat Dec 20, 2003 9:23 am

I don't recommend wget for anything other than ripping websites.
Agreed, it is flexible, but it's console-based only, and it's better to use one of the specially designed GUI download managers for Linux.

For downloading files (not websites), use a good download manager for Linux.

Use either one of:

1. Aria
2. Downloader 4 X (D4X)

Both are really cool GUI-based download managers for Linux, with many great features like auto-resuming and multi-threaded downloads. I love D4X.

D4X Download:
==========

http://www.krasu.ru/soft/chuchelo/

Happy downloading!
Linux for Life!
Faraz.Fazil
Major General
 
Posts: 1024
Joined: Thu Jul 04, 2002 5:31 pm
WLM: faraz7476@hotmail.com
Location: Karachi/Pakistan/Earth/Universe

