I'd Rather Not Have To Write This
Hey! Where is the blog post about moving hosting providers? Good question. Lamentably, the answer is "pilot error", A.K.A. it is all my fault. During a code push to my new hosting provider, I accidentally clobbered my new database access config file with my old config file. Fortunately, the end result of my mistake was the loss of just one blog post.
So What Happened?
When the config was clobbered, all database requests pointed to my old hosting provider, and when my service time with them expired, they removed my former database from their systems.
As a result, any information that was added to the database after the clobber was lost (with the exception of the RSS PHP post, which I still had on my computer in markdown format).
So What Was Lost?
The decision making criteria that I used when selecting http://www.webfaction.com as my new hosting provider and http://gandi.net as my new domain registrar are gone. I'll do a quick recap:
webfaction
- multiple websites on one hosting account
- shell access
- runs web apps and not just scripts
How about I make this nice and short and just send you to their features page?
Gandi.net
- They are not GoDaddy
- No Bullshit is their motto
- They support projects that I like
And who could forget the loss of information about http://googlebutter.com? Not me, that's for sure. Hi Buddy!
What Am I Doing to Limit Loss in the Future?
The real culprit here is me, for not making enough database backups. So I had better write a script to make database backups for me. For security reasons, my new provider does not allow MySQL connections from anywhere other than localhost or 127.0.0.1, so I need to:
- Create an SSH tunnel to my new host
- Run a mysqldump and pipe the output to a text file
- tar gzip the text file
Enter the Ruby
require 'date'
require 'open3'
# create variables for database access
db_server = "MYSERVER_IP_ADDRESS"
db_user = "MY_DATABASE_USER"
db_pass = "MY_DATABASE_USER_PASSWORD"
# what day is it?
ymd = Date.today.strftime("%Y%m%d")
sql_file = "#{ymd}_dump.sql"
# command to create an ssh tunnel, forwarding local port 3307
# to the server's mysql port 3306
sshcmd = "ssh -L 3307:127.0.0.1:3306 #{db_server} -N"
# command to dump all databases through the tunnel into a text file
dumpcmd = "mysqldump -A -u#{db_user} -p#{db_pass} -h127.0.0.1 -P3307 > #{sql_file}"
# start the sshcmd
puts "starting ssh tunnel..."
Open3.pipeline_start(sshcmd) do |threads|
  # sleep for a bit and give the tunnel time to connect
  sleep 10
  # get the wait thread for the ssh process
  t = threads[0]
  puts "ssh tunnel has pid: #{t.pid}"
  # run the dump command
  puts "dumping database info"
  system(dumpcmd)
  puts "data dump is complete, killing ssh tunnel pid #{t.pid}"
  # kill the ssh tunnel
  Process.kill("TERM", t.pid)
end
# tar gzip the sql file, then remove the uncompressed original
tarcmd = "tar -czvf #{sql_file}.tgz #{sql_file}"
system(tarcmd)
system("rm #{sql_file}")
For easy copy paste, the code is in the hoof http://hoof.jezra.net/snip/nT
What I should really do, instead of tunneling the mysqldump and then tar gzipping the data on my machine, is perform the dump and archiving on the server, and then transfer the compressed archive to my machine.
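That server-side approach might look something like the sketch below. It reuses the same placeholder server and credentials as the tunnel script, and assumes mysqldump and gzip are available on the server; the backup_commands helper is just something I made up to keep the three steps in one place.

```ruby
require 'date'

# placeholder connection details, same assumptions as the tunnel script
DB_SERVER = "MYSERVER_IP_ADDRESS"
DB_USER = "MY_DATABASE_USER"
DB_PASS = "MY_DATABASE_USER_PASSWORD"

# build the three commands: dump and compress on the server,
# copy the archive down, then clean up on the server
def backup_commands(server, user, pass, sql_file)
  remote_dump = "mysqldump -A -u#{user} -p#{pass} | gzip > #{sql_file}.gz"
  [
    "ssh #{server} '#{remote_dump}'",   # dump and gzip on the server
    "scp #{server}:#{sql_file}.gz .",   # pull the archive to my machine
    "ssh #{server} 'rm #{sql_file}.gz'" # remove the archive from the server
  ]
end

if __FILE__ == $0
  sql_file = "#{Date.today.strftime('%Y%m%d')}_dump.sql"
  backup_commands(DB_SERVER, DB_USER, DB_PASS, sql_file).each do |cmd|
    puts "running: #{cmd}"
    system(cmd)
  end
end
```

This avoids shoving an uncompressed dump through the tunnel, since only the gzipped archive ever crosses the wire.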
Now quit reading, and go back up your data! (Then go see my buddy at http://googlebutter.com) Hi Buddy! hahaha