As the doctor gone rogue

July 14, 2010

Using wget to download files from secured website

Filed under: bash — hypotheses @ 4:56 pm

Our collaborator just uploaded a bunch of files to their website today. One easy way to get the data is to download it all with wget, which mirrors the directory structure of the remote site.

Normally you can run:


wget -r http://fly.srk.fer.hr --user=bhoom --password=bhoom_Password

-r : recurse into directories

You can also limit how many levels deep to recurse, cap the total download size, and so on.
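For example, a minimal sketch reusing the placeholder host and credentials from above (-l sets the recursion depth, -Q sets a total download quota):

wget -r -l2 -Q100m http://fly.srk.fer.hr --user=bhoom --password=bhoom_Password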


1 Comment

  1. Two more useful options:
    -l2 : don't follow links more than two levels deep
    --no-parent : a must for HTTP downloads when you don't want to fetch files outside the current parent directory

    EXTRA:
    -e robots=off : turn off robots.txt exclusion, if you know what that implies.
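    Putting these together with the command from the post (host, path, and credentials are placeholders):

    wget -r -l2 --no-parent -e robots=off --user=bhoom --password=bhoom_Password http://fly.srk.fer.hr/data/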

    Comment by Bhoom — July 15, 2010 @ 2:37 pm

