Tuesday, February 14, 2012

Porting Postgres to Google Cloud SQL

I know it's quite a peculiar problem: you have a Postgres database, you're suddenly migrating to Google App Engine, and you want to use the new Google Cloud SQL (look it up, it's public and it's pretty cool).

Anyway, you need to port the old data. Google Cloud SQL is cool except for one big problem: SQL dump imports have zero error reporting. If an import fails, the console just turns red and tells you that an unknown error occurred. So here's how I made it work (a scripted sketch of the whole pipeline follows the steps):

  1. Dump the Postgres data, selecting the tables you want, with a couple of extra options on (you need Postgres access privileges for this):
     pg_dump --column-inserts --data-only POSTGRES_DATABASE_NAME -t TABLE_NAME -t ANOTHER_TABLE_NAME -f NEW_FILE_NAME.sql
  2. Delete the top lines of the dump file from step 1, up to the first INSERT line.
  3. Load it into MySQL locally, where you can actually catch any errors:
     mysql -u USERNAME -p MYSQL_DATABASE_NAME < NEW_FILE_NAME.sql
  4. Dump it back out of the local MySQL:
     mysqldump -u USERNAME -p --add-drop-table MYSQL_DATABASE_NAME TABLE_NAME ANOTHER_TABLE_NAME > FIXED_SQL_DUMP_FILE.sql
  5. Add "use DATABASE_NAME;" (without the quotes) as the first line of the new dump file, where DATABASE_NAME is the database on Google that you want the data loaded into.
  6. Load the new file into a Google Cloud Storage bucket using the web GUI, and from there import it into Cloud SQL.
  7. Pray, as you wait for the thing with no error reporting to turn green.
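For reference, here's the whole pipeline as a single shell sketch. The database, table, and user names (mydb, appdb, users, orders, me) are hypothetical placeholders, and the sed commands assume GNU sed; adjust to your setup:

    # 1) Dump just the data for the chosen tables, one INSERT per row
    pg_dump --column-inserts --data-only mydb -t users -t orders -f dump.sql

    # 2) Keep everything from the first INSERT line onward (drops pg_dump's header)
    sed -n '/^INSERT/,$p' dump.sql > inserts.sql

    # 3) Load into a local MySQL database, which actually reports errors
    mysql -u me -p appdb < inserts.sql

    # 4) Re-dump from MySQL so the file is in MySQL's own dialect
    mysqldump -u me -p --add-drop-table appdb users orders > fixed.sql

    # 5) Prepend the target database name for the Cloud SQL import (GNU sed syntax)
    sed -i '1i use appdb;' fixed.sql

After that, fixed.sql is the file you upload to the Cloud Storage bucket and import into Cloud SQL.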
facepalms: 7



