I had an old Rails 2 app (a blog) that still got visits, but no updates. It’s effectively been read-only for years.
Since I’m consolidating servers, I wanted to get rid of the machine it was hosted on, and moving the Rails app elsewhere wouldn’t be trivial.
So I replaced it with a static copy of the site. Just flat files.
(I also made a database dump just in case I want to make it dynamic again in the future.)
This is how I did it.
Archive the site

I used wget. On a Mac with Homebrew, install it with

brew install wget
If you don’t have it already on e.g. Ubuntu, try
sudo apt-get install wget
Then I told wget to archive the site:
wget --convert-links --mirror mysite.com
The site will end up in a local mysite.com directory, named after the host.
Upload the site
rsync -azv mysite.com myserver:apps
Substitute whatever server and path you prefer. I keep sites in subdirectories of apps in my home directory on the server.
You could also call wget on the server, of course, and skip the upload step. I wanted a local copy and to verify the download with my local tools.
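Verification can be as simple as a few spot checks on the mirrored directory. Something like this (hypothetical commands; adjust to taste):

```shell
# Spot-check the mirrored copy before uploading (hypothetical checks):
du -sh mysite.com                               # total size looks plausible?
find mysite.com -name 'index.html' | head       # the front page got saved
grep -rl 'http://mysite.com' mysite.com | head  # any links wget failed to convert?
```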
Configure Nginx

In the Nginx configuration for the site, I had to do some special things:
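Roughly like this (a simplified sketch, not the exact file; root, server_name and the asset types are placeholders for your own setup):

```nginx
server {
  listen 80;
  server_name mysite.com;

  root /home/me/apps/mysite.com;

  # Pages that wget saved without a file extension are HTML.
  default_type text/html;

  location / {
    # Look the request up verbatim, query string and all, since wget
    # saves assets under names like "all.css?1393152599".
    try_files $request_uri $request_uri/ =404;
  }

  # Assets whose saved names end in a query string don't match
  # Nginx's standard MIME type map, so declare types explicitly.
  location ~ \.css {
    default_type text/css;
  }

  location ~ \.jpg {
    default_type image/jpeg;
  }
}
```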
I needed try_files $request_uri so that requesting e.g. stylesheets/all.css?1393152599 would look for a file by that exact name, query string and all.
And I needed the default_type declarations to handle HTML files archived without an extension, as well as e.g. stylesheets ending with a query string.
I only had JPG uploads, but you could use a regexp for more complex needs.
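For instance, one regex location per type, tolerant of a trailing query string in the saved file name, might look like this (hypothetical extensions beyond my JPG-only case):

```nginx
# Hypothetical: a site with more upload types than just JPG.
location ~ \.jpe?g(\?|$) { default_type image/jpeg; }
location ~ \.png(\?|$)   { default_type image/png; }
location ~ \.gif(\?|$)   { default_type image/gif; }
```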
Hope this helps!