Tuesday, December 18, 2012

Re: Automating Offsite Backups?

Hi Alexander,

There seem to be three issues here:

The first is the security of the infrastructure underlying the Google Cloud Platform. We take this very seriously, and you can read more about the safeguards we put in place here: https://cloud.google.com/files/Google-CommonSecurity-WhitePaper-v1.4.pdf
Second, as you say, replication in multiple locations means your data is safe and available even in the event of very large failures.
Third is the question of operator error or malicious actions. For this we provide regular scheduled backups, which are preserved for 7 days and cannot be deleted by operator actions.

With these three levels in place, we believe we have provided sufficient robustness and security to support even the most demanding applications. Nonetheless, we want to make it straightforward to get your data in and out of Google's cloud. The import/export functionality allows you to do this manually, and we plan to make it easier to use early next year. Stay tuned!
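In the meantime, a script along the lines of the rough sketch below could automate the manual dump-and-archive routine you describe. This is not an official tool, just one way to wire it up: it assumes mysqldump, gzip, and gsutil are installed and authenticated on the machine running it, and the host, credentials, database, and bucket names are placeholders you would replace with your own.

#!/usr/bin/env python
"""Rough sketch of an automated offsite backup for a Cloud SQL / MySQL database.

Assumptions: mysqldump, gzip, and gsutil are available and already
authenticated; HOST, USER, PASSWORD, DATABASE, and BUCKET are placeholders.
"""

import datetime
import subprocess

HOST = "your-instance-address"     # placeholder database host
USER = "backup_user"               # placeholder backup account
PASSWORD = "change-me"             # better read from a secrets store
DATABASE = "production"
BUCKET = "gs://your-offsite-backup-bucket"


def main():
    stamp = datetime.datetime.utcnow().strftime("%Y%m%d-%H%M%S")
    dump_file = "/tmp/%s-%s.sql.gz" % (DATABASE, stamp)

    # Dump the database and gzip the output in one pipeline.
    with open(dump_file, "wb") as out:
        dump = subprocess.Popen(
            ["mysqldump", "-h", HOST, "-u", USER, "-p" + PASSWORD,
             "--single-transaction", DATABASE],
            stdout=subprocess.PIPE)
        gzip = subprocess.Popen(["gzip", "-c"], stdin=dump.stdout, stdout=out)
        dump.stdout.close()
        gzip.communicate()
        if dump.wait() != 0 or gzip.returncode != 0:
            raise RuntimeError("mysqldump or gzip failed")

    # Copy the compressed dump offsite to a Cloud Storage bucket.
    subprocess.check_call(["gsutil", "cp", dump_file, BUCKET + "/"])


if __name__ == "__main__":
    main()

Run it from a daily cron entry and you have the same protection as your manual dumps, without the manual step.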

J

On Tue, Dec 18, 2012 at 2:25 AM, Alexander Bertram <alex@bedatadriven.com> wrote:
Hi there,
I've been using CloudSQL in production for a few months and am quite happy!

However, I'm still trying to find the best way to automate offsite backups. I'm satisfied that CloudSQL provides good safeguards against hardware failure, through replication and nightly backups, but I'm still concerned about the possibility of data loss through operator error or the actions of a malicious intruder -- total destruction of our production database is only two clicks away!

To guard against this, we've been doing manual daily or weekly dumps and archiving them offsite. Is there any way to automate this? Looking further down the road, have you considered adding "termination" protection so that a key database cannot be terminated without additional user authentication or safeguards?

Thanks,
Alex



--
Joe Faith | Product Manager | Google Cloud
