Copying parts of websites
Tom Galenson
I am very interested in being able to copy parts of websites. The websites
I want to copy feature CD-ROMs online (on the hard disk) from old BBS
collections as well as GNU etc.

I have discovered that most website file grabbers choke if I try to grab a
whole site with 90 CD-ROMs online. But many refuse to limit themselves to a
particular sub-directory path and lower. So I am looking for
recommendations. My target is: http://cd.textfiles.com and I run
www.chatnfiles.com

Thanks,
Tom Miller, longtime computer hobbyist

P.S. I am taking another whack at a better-looking website... but the above
is my most important technical issue. T.

John Bokma
"Tom Galenson" <[Email Removed]> wrote:

QUOTE
I am very interested in being able to copy parts of websites.  The
websites I want to copy feature CD-ROMs online (on the hard disk) from
old BBS collections as well as GNU etc.

I have discovered that most website file grabbers choke if I try to
grab a whole site with 90 CD-ROMs online.  But many refuse to limit
themselves to a particular sub-directory path and lower.  So I am
looking for recommendations.  My target is: http://cd.textfiles.com
and I run www.chatnfiles.com

wget?

--
John Perl SEO tools: http://johnbokma.com/perl/
Experienced (web) developer: http://castleamber.com/
Get a SEO report of your site for just 100 USD:
http://johnbokma.com/websitedesign/seo-expert-help.html
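
A minimal sketch of the kind of GNU wget invocation being suggested here, for
pulling down one sub-directory path and lower; the CD directory name
somecdname/ and the local path /data/mirror are made-up placeholders, not
paths from the thread.

  # Mirror one CD's directory and everything below it, without letting
  # the crawl climb back up into the rest of cd.textfiles.com.
  #   --no-parent      never ascend above the starting directory
  #   --level=inf      follow links to any depth within that tree
  #   --timestamping   skip files already fetched and unchanged
  #   --wait=1         pause a second between requests, to be polite
  wget --recursive --no-parent --level=inf --timestamping --wait=1 \
       --directory-prefix=/data/mirror \
       http://cd.textfiles.com/somecdname/

The flag that answers "limit themselves to a particular sub-directory path
and lower" is --no-parent; without it a recursive wget will happily wander up
and try to take the whole site.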

Blinky the Shark
Tom Galenson wrote:

QUOTE
I am very interested in being able to copy parts of websites.  The
websites I want to copy feature CD-ROMs online (on the hard disk) from
old BBS collections as well as GNU etc.

I have discovered that most website file grabbers choke if I try to
grab a whole site with 90 CD-ROMs online.  But many refuse to limit
themselves to a particular sub-directory path and lower.  So I am
looking for recommendations.  My target is: http://cd.textfiles.com
and I run www.chatnfiles.com

That's quite an archive, already. Just a couple of minutes of poking
around rendered a nifty old 486 logo and a photo of Christina Applegate
from 1994.

--
Blinky Linux Registered User 297263
Killing all Usenet posts from Google Groups
Info: http://blinkynet.net/comp/uip5.html
*ALSO contains links for access to the NON-BETA GG archive interface*

Tom Galenson
"Blinky the Shark" <[Email Removed]> wrote in message
news:[Email Removed]...
--snip-
QUOTE
themselves to a particular sub-directory path and lower.  So I am
looking for recommendations.  My target is: http://cd.textfiles.com
and I run www.chatnfiles.com

That's quite an archive, already.  Just a couple of minutes of poking
around rendered a nifty old 486 logo and a photo of Christina Applegate
from 1994.
-snip-

True. If I weren't able to hand CD-ROMs directly to the Host Master, I
couldn't do this; there is no way I could upload that many without paying an
arm/leg/other flesh....
However, I do have an ambition to fully fill up his Western Digital
250-gigabyte drive with my downloads.... Most of the rest of his websites
live on a different drive... <g>

Tom, longtime computer hobbyist...

--
---------------------------------------------
Would you believe 1 of the largest file collections
in the world with 1 of the most limited web-designs?
www.chatnfiles.com

Tom Galenson
"John Bokma" <[Email Removed]> wrote in message
news:[email protected]...
QUOTE
"Tom Galenson" <[Email Removed]> wrote:

I am very interested in being able to copy parts of websites.  The
--snip-
So I am
looking for recommendations.  My target is: http://cd.textfiles.com
and I run www.chatnfiles.com

wget?

--
John
John, I have a copy of wget.  Unfortunately, I am consulting with someone
else to figure out the regular expressions I need on the command line. I
just am not up to figuring out the "regular expressions" I need. I may have
to go bother some perl-mongers...

Thanks,
Tom, longtime computer hobbyist

--
---------------------------------------------
Would you believe 1 of the largest file collections
in the world with 1 of the most limited web-designs?
www.chatnfiles.com
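
A side note on those "regular expressions": GNU wget does not actually take
regexes for this job. Staying inside one sub-directory is handled by
--no-parent, and filtering by file type uses --accept/--reject, which take
comma-separated suffixes or shell-style wildcards. A rough sketch, with the
CD directory name again being a placeholder:

  # Grab only archive and text files from one CD directory and below.
  # --accept takes suffixes or wildcards, not regular expressions.
  wget --recursive --no-parent --accept=zip,ZIP,txt,TXT \
       --limit-rate=200k \
       http://cd.textfiles.com/somecdname/

The --limit-rate flag just keeps the mirror from hammering the server; no
Perl is needed for this part.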

John Bokma
"Tom Galenson" <[Email Removed]> wrote:

QUOTE
"John Bokma" <[Email Removed]> wrote in message
news:[email protected]...
"Tom Galenson" <[Email Removed]> wrote:

I am very interested in being able to copy parts of websites.  The
--snip-
So I am
looking for recommendations.  My target is: http://cd.textfiles.com
and I run www.chatnfiles.com

wget?


QUOTE
John, I have a copy of wget.  Unfortunately, I am consulting with
someone else to figure out the regular expressions I need on the
command line.  I just am not up to figuring out the "regular
expressions" I need.  I may have to go bother some perl-mongers...

wget is quite powerful. I guess you mean the GET that comes with Perl? Not
sure what that can do.

wget has recently been updated, and can do even more, IIRC.

--
John Perl SEO tools: http://johnbokma.com/perl/
Experienced (web) developer: http://castleamber.com/
Get a SEO report of your site for just 100 USD:
http://johnbokma.com/websitedesign/seo-expert-help.html
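
For what it's worth, the GET mentioned above is the small command-line client
that ships with Perl's libwww (lwp-request). It fetches a single URL and
prints it, but it does not follow links, so for mirroring whole directory
trees wget is the better fit. Roughly, with somecdname/ again a placeholder:

  # GET (from libwww-perl) fetches exactly one page and writes it to stdout:
  GET http://cd.textfiles.com/ > index.html

  # wget can walk the links recursively, staying below the start directory:
  wget --recursive --no-parent http://cd.textfiles.com/somecdname/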

