pipe - Fetching via wget to memory & bypassing disk writes


Is it possible to download the contents of a website (a set of HTML pages) straight into memory, without ever writing to disk?

I have a cluster of machines with 24 GB of RAM installed on each, but I'm limited to a disk quota of several hundred MB. I'm thinking of redirecting the output of wget into some kind of in-memory structure without storing the contents on disk. The other option is to create my own version of wget, but there may be a simpler way to do it with pipes.

Also, what is the best way to run this download in parallel? (The cluster has >20 nodes.) I can't use the file system in this case.

See the wget download options:

‘-O file’

‘--output-document=file’

The documents will not be written to the appropriate files, but all will be concatenated together and written to file. If ‘-’ is used as file, documents will be printed to standard output, disabling link conversion. (Use ‘./-’ to print to a file literally named ‘-’.)
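Piped this way, the page contents travel from wget straight to the next command and never touch the disk. A minimal sketch of the pattern — here `printf` stands in for `wget -q -O - <url>` so the example runs without network access; with a real URL the pipeline is the same:

```shell
# In practice this would be:
#   wget -q -O - http://www.example.com/ | wc -c
# printf stands in for the wget fetch so the sketch runs offline; the
# page body passes through the pipe buffer, never through a file.
printf '<html><body>hello</body></html>' | wc -c
```

With `-q` the progress log is suppressed, so only the document body reaches stdout; a recursive fetch (`wget -q -r -O -`) concatenates every page onto the same stream, as the manual excerpt above describes.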

If you want to read the files from a Perl program, you can invoke wget using backticks.

Depending on what you really need to do, you might be able to get by using LWP::Simple's get.

use LWP::Simple;
my $content = get("http://www.example.com/");
die "Couldn't get it!" unless defined $content;

Update: I had no idea you could implement your own file system in Perl using FUSE and Fuse.pm. See also Fuse::InMemory.
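For the parallel part of the question, xargs can fan the fetches out across processes (the `-P` flag is assumed to be available, as in GNU and BSD xargs). A sketch, with `echo` standing in for `wget -q -O -` so it runs without network access; with real URLs you would pipe a URL list into `xargs -n 1 -P 20 wget -q -O -`:

```shell
# Each input line becomes one child process, at most 4 running at a
# time (-P 4). echo stands in for `wget -q -O -`; sort makes the
# interleaved output deterministic for inspection.
printf '%s\n' url1 url2 url3 |
  xargs -n 1 -P 4 echo fetched |
  sort
```

To spread work across the >20 cluster nodes rather than one machine's cores, the fetch command would typically be wrapped in an ssh invocation per node, but that scheduling is beyond this sketch.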

