
Baitball Blogger

(46,757 posts)
Fri Aug 17, 2012, 02:13 PM Aug 2012

So, if someone had a website and they wanted people to download all the files for

insurance reasons, would it be easy to do if there were hundreds of records all over the website? Is there a program that can deconstruct the downloadable docs, like PDF and DOC files?

12 replies
So, if someone had a website and they wanted people to download all the files for (Original Post) Baitball Blogger Aug 2012 OP
Unless I'm missing something... discntnt_irny_srcsm Aug 2012 #1
Let's say I put a website together that has hundreds of documents scattered Baitball Blogger Aug 2012 #2
Having a special program to do it would indicate that the user is computer savvy. ManiacJoe Aug 2012 #3
Sadly, you are correct. Baitball Blogger Aug 2012 #4
There are programs (and even FireFox extensions, I think) that will do what the OP is asking Occulus Aug 2012 #10
This is a little complicated. discntnt_irny_srcsm Aug 2012 #5
Are you looking for a spider crawled archive of a website? ChromeFoundry Aug 2012 #6
I'm just going to have to hope that some of the DUers that will answer my plea to Baitball Blogger Aug 2012 #7
I think this is what you want... ChromeFoundry Aug 2012 #8
Wow! Thanks! Baitball Blogger Aug 2012 #9
Is this a porn site? HopeHoops Aug 2012 #11
If only I had the time for such prurient interests. Baitball Blogger Aug 2012 #12

discntnt_irny_srcsm

(18,482 posts)
1. Unless I'm missing something...
Fri Aug 17, 2012, 03:40 PM
Aug 2012

...it would be easier for the person with access (FTP permission implied) to just copy the individual files.

What do you mean by deconstruct?

Baitball Blogger

(46,757 posts)
2. Let's say I put a website together that has hundreds of documents scattered
Fri Aug 17, 2012, 03:45 PM
Aug 2012

throughout the pages that make up the website. Can someone come in with a special program to identify all the documents for easy download? I want people without a computer background to be able to do it. If it's too tedious a task, they probably won't bother.

ManiacJoe

(10,136 posts)
3. Having a special program to do it would indicate that the user is computer savvy.
Fri Aug 17, 2012, 03:54 PM
Aug 2012

If you want it to be easy for the non-savvy, you need to package all the files in one place so that no hunting is required.
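For example, here is a minimal sketch of that packaging step in Python, assuming you have the site's files locally under a hypothetical public_html folder: it copies every PDF/DOC into a single downloads folder so no hunting is required.

from pathlib import Path
import shutil

site_root = Path("public_html")             # assumed local copy of the site's files
downloads = Path("public_html/downloads")   # single place visitors can grab everything
downloads.mkdir(exist_ok=True)

doc_types = {".pdf", ".doc", ".docx"}

for path in site_root.rglob("*"):
    if path.suffix.lower() in doc_types and path.parent != downloads:
        shutil.copy2(path, downloads / path.name)   # flat copy, one entry per document
        print("collected", path.name)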

Occulus

(20,599 posts)
10. There are programs (and even FireFox extensions, I think) that will do what the OP is asking
Fri Aug 17, 2012, 11:11 PM
Aug 2012

I don't use them because it's far, far too easy to accidentally tell them to download a whole website, links, images, documents, and all.

Depending on the site, that's a lot to download in a batch. But yes, there are programs that will do it all in a one-click operation.

discntnt_irny_srcsm

(18,482 posts)
5. This is a little complicated.
Fri Aug 17, 2012, 05:27 PM
Aug 2012

The files making up a website can be placed in a single folder (directory) on the server or spread across multiple folders.

You can install any good FTP client (just google "ftp client") to make this easier and faster. Your host will usually grant FTP access (you get a user name, password, and sometimes an account number) to those who have content control. Some hosts also have friendly tools for building and editing your pages.

If you're lucky, all the files will be in one folder and the FTP app can just copy them all, similar to how you would copy files from your hard drive to a floppy or USB key using Windows Explorer.
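As a rough illustration, here is a minimal Python sketch of that copy step. The host name, login, and folder below are hypothetical placeholders you would swap for the details your host gives you.

from ftplib import FTP, error_perm
from pathlib import Path

ftp = FTP("ftp.example.com")              # hypothetical host
ftp.login(user="username", passwd="password")
ftp.cwd("public_html")                    # folder where the site files live

local_dir = Path("site_backup")
local_dir.mkdir(exist_ok=True)

for name in ftp.nlst():                   # list everything in the folder
    try:
        with open(local_dir / name, "wb") as fh:
            ftp.retrbinary(f"RETR {name}", fh.write)   # download each plain file
    except error_perm:
        (local_dir / name).unlink(missing_ok=True)     # skip subdirectories
ftp.quit()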

Help should be available from your host if you have issues.

Best of luck.

ChromeFoundry

(3,270 posts)
6. Are you looking for a spider crawled archive of a website?
Fri Aug 17, 2012, 09:48 PM
Aug 2012

It will only get the files that are actually linked from the site pages, but I have used utilities like this in the past and they work fairly easily. If you look around, there are probably freeware versions out on GitHub, SourceForge, or CodePlex.

http://www.inspyder.com/products/Web2Disk/Default.aspx
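For a sense of what these utilities do under the hood, here is a minimal Python sketch of the spider idea (not the Inspyder product): it fetches one page, finds its links, and saves any linked PDF/DOC files. A real crawler would also follow links to other pages on the same site; START_URL below is a hypothetical placeholder.

from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen
from pathlib import Path

START_URL = "http://www.example.com/"     # assumed site front page
DOC_TYPES = (".pdf", ".doc", ".docx")

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(START_URL, value))

parser = LinkCollector()
parser.feed(urlopen(START_URL).read().decode("utf-8", errors="replace"))

out_dir = Path("downloaded_docs")
out_dir.mkdir(exist_ok=True)

for link in parser.links:
    if urlparse(link).path.lower().endswith(DOC_TYPES):
        filename = out_dir / Path(urlparse(link).path).name
        filename.write_bytes(urlopen(link).read())    # save the linked document locally
        print("saved", filename)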

Baitball Blogger

(46,757 posts)
7. I'm just going to have to hope that some of the DUers that will answer my plea to
Fri Aug 17, 2012, 09:53 PM
Aug 2012

"steal my data" will know how to do it.

ChromeFoundry

(3,270 posts)
8. I think this is what you want...
Fri Aug 17, 2012, 10:07 PM
Aug 2012
http://www.httrack.com/

You can copy the entire website to your hard drive. Then just ZIP the root folder and send it to whoever wants it. It will have all the pages, images, and files on the site in a single file.
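As a rough sketch of that last ZIP step in Python, assuming the mirroring tool has already copied the site into a hypothetical website_copy folder:

import shutil

# Produces website_copy.zip next to the folder; send that one file around.
archive = shutil.make_archive("website_copy", "zip", root_dir="website_copy")
print("archive written to", archive)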

Baitball Blogger

(46,757 posts)
12. If only I had the time for such prurient interests.
Sun Aug 19, 2012, 08:32 PM
Aug 2012

No. Not even then. I think my first boyfriend was right. I'm asexual.
