Friday, August 17, 2012

Migration Website Using Wget


There are occasions where you need to move a website from one hosting provider to another and the more standard approach of using FTP to collect all the files is not available.

This can happen because the website owner and the web host have parted ways, access credentials have been lost, the web host cannot be contacted, the migration is urgent, and so on.

Wget is a common Unix tool, which is also available on Windows. Wget works from the command line, and has many different configuration options to control exactly what is downloaded from the starting point you give it, and what it then does with what it finds.

Wget works by starting from the address you give it and trawling through the site, grabbing a copy of every HTML file or image it can find a link to that is part of the site it started on.

We often use wget to fully mirror remote sites: when a new client comes to us from another web hosting provider, we frequently copy their site across using wget. To use it on our server, log in using ssh. From the command prompt, run wget with the URL of the file you want to download, and the file is fetched directly onto our server. As a hosting provider we run on very fast Internet connections, so using wget directly from our servers is much faster than downloading to your local computer and then re-uploading the files to our server.
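As a sketch of the single-file case described above, the session on the new server might look like this. The URL is a placeholder, not a real file, so the command is written to report failure rather than abort:

```shell
# Sketch: after logging in to the new server over ssh, fetch one file
# straight from the old host. The URL below is a placeholder.
URL="http://www.oldhost.example.com/files/backup.tar.gz"

# -nv keeps wget's output brief; the file lands in the current directory
wget -nv "$URL" || echo "download failed: $URL"
```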

Another common use is, as I said, to mirror an entire site. Suppose you are moving the Anchor website from hosting company A to hosting company B. You have your new account set up, and you have logged in via ssh to server B. Now, to mirror your site, run wget -r http://www.anchor.com.au — the -r flag tells wget to download recursively, pulling your website into the new account.

Now you should have a complete copy of your website. Beware, though: wget does not read JavaScript, so any fancy rollover effects will not work unless you copy the relevant files across manually.

By default wget will create a directory named after the site being downloaded. You probably want the files placed in the directory you are currently in, so just add -nd to the command. This tells wget not to create directories, except where necessary for your site.

The final command should look something like

wget -nd -r -np http://www.anchor.com.au
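Beyond the flags the article uses, wget also offers --page-requisites (grab the images and stylesheets each page needs) and --convert-links (rewrite links so the copy works from its new location); a fuller variant might look like the sketch below. The guard variable is there so the real example URL is only fetched when you explicitly opt in:

```shell
# Sketch: an extended mirror command. --page-requisites and
# --convert-links are standard wget options; the URL is the
# example site from the article. Set RUN_MIRROR=1 to actually fetch.
if [ "${RUN_MIRROR:-0}" = "1" ]; then
    wget -nd -r -np --page-requisites --convert-links http://www.anchor.com.au
fi
```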

Another word of warning relates to websites generated by programming languages. Wget is really only useful for mirroring sites under a specific set of circumstances. If the site was built with ASP, PHP, Perl, Java, etc., wget will only download the HTML output those programs produce rather than the original source files. This is important to note because those programming languages can perform tasks such as changing page content per user, interacting with a database, collecting statistics, or accepting orders.
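One quick way to see this effect: a healthy mirror of a PHP site should contain only rendered HTML, never raw server-side source. A sketch of that check, assuming the default directory name wget creates for the example site:

```shell
# Sketch: look for raw "<?php" tags in the mirrored files. Finding any
# would mean the old server handed out unprocessed source. The directory
# name is a placeholder for whatever wget created.
MIRROR_DIR="${MIRROR_DIR:-www.anchor.com.au}"
if [ -d "$MIRROR_DIR" ]; then
    grep -rl '<?php' "$MIRROR_DIR" || echo "no raw PHP source found"
else
    echo "mirror directory not found: $MIRROR_DIR"
fi
```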

Once you've used wget to make a copy of your website, it is important to check the files in their new location to ensure the site behaves the same way the original did.
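A simple starting point for that check is to list everything wget captured so it can be reviewed file by file. A minimal sketch, assuming the default directory name wget creates when -nd is not used:

```shell
# Sketch: enumerate the mirrored files for review on the new host.
# "www.anchor.com.au" is the default directory wget creates for the
# example site; adjust it to match your own.
MIRROR_DIR="${MIRROR_DIR:-www.anchor.com.au}"
if [ -d "$MIRROR_DIR" ]; then
    find "$MIRROR_DIR" -type f | sort > mirrored-files.txt
    echo "$(wc -l < mirrored-files.txt) files captured"
else
    echo "mirror directory not found: $MIRROR_DIR"
fi
```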
