*BSD News Article 71791



Path: euryale.cc.adfa.oz.au!newshost.anu.edu.au!harbinger.cc.monash.edu.au!news.rmit.EDU.AU!news.unimelb.EDU.AU!munnari.OZ.AU!spool.mu.edu!howland.reston.ans.net!nntp.coast.net!dispatch.news.demon.net!demon!xara.net!agate.xara.net!zynet.net!alex
From: alex@starfleet.zynet.co.uk (Alex McLean)
Newsgroups: demon.ip.support,demon.tech.unix,comp.unix.bsd.freebsd.misc
Subject: Re: Batch FTP and Web Pages
Date: 22 Jun 1996 13:50:59 GMT
Organization: Starfleet Command
Lines: 19
Message-ID: <slrn4snue7.5qr.alex@starfleet.zynet.co.uk>
References: <31c2e7bd.14691630@news.demon.co.uk> <834878464snz@pair.com> <834921960snz@michaels.demon.co.uk> <835206024.5881.2@diltd.demon.co.uk>
Reply-To: alex@area51.upsu.plym.ac.uk
NNTP-Posting-Host: starfleet.zynet.co.uk
X-Newsreader: slrn (0.8.5)

Hi Ric,

To 'batch HTTP' - that is, download a file in the background on a Un*x
machine - do something like this:

  nohup lynx -source http://area51.upsu.plym.ac.uk/~alex/word-gen.tar.gz &

You can then log off, and it will continue downloading the file.
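One wrinkle: with nohup alone, the output of lynx -source lands in
nohup.out, so you'll usually want to redirect it to a file of your own
choosing (and send stderr somewhere quiet).  A sketch of the same
command with redirection - the URL is just the example from above,
substitute your own:

```shell
# Fetch in the background, writing the file where we want it rather
# than into nohup.out; stderr goes to a log so nothing ties up the tty.
nohup lynx -source http://area51.upsu.plym.ac.uk/~alex/word-gen.tar.gz \
  > word-gen.tar.gz 2> fetch.log &
```

Once that's running you can log off as before; the shell's & plus nohup
keeps the transfer alive after the hangup signal.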

It'll work for HTML files as well, but as Dom pointed out, it won't
fetch any inline images.  Obviously you'll need lynx installed to do
this though.

As for getting whole sets of pages, there is web mirroring software
available which could probably be adapted for this fairly easily.
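If you only need a page plus its inline images, you can fake a crude
one-level mirror with lynx and the usual text tools.  A rough sketch -
the URL is hypothetical, it assumes absolute src="http://..." URLs, and
a real mirroring tool would handle relative links, recursion, and so on:

```shell
# Hypothetical page to fetch; substitute your own.
url=http://example.com/page.html

# Grab the page itself.
lynx -source "$url" > page.html

# Pull out absolute inline image URLs and fetch each one,
# saving under its basename in the current directory.
grep -i 'src="http' page.html |
  sed 's/.*src="\([^"]*\)".*/\1/' |
  while read img; do
    lynx -source "$img" > "$(basename "$img")"
  done
```

It's fragile (one src= per line, absolute URLs only), but it shows the
idea the mirroring packages implement properly.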

Cheers,

Alex