Thursday, June 16, 2016

[google-cloud-sql-discuss] Re: Feasibility of archival storage

Happy to answer your questions, though I would suggest posting such Cloud Storage questions to the App Engine Google Group in the future.

The operations table lists the cost of each operation type.  The Cloud Storage APIs article states the following:
By default, gsutil versions starting with 4.0 interact with the JSON API.
As such, you can consider the costs relative to the JSON API when using gsutil for your sync/backups.  The cost of your backups will depend entirely on the volume of reads and writes.  Without knowing more specific detail about your needs, I can only suggest crunching some numbers for yourself with the documentation provided and testing it cautiously.
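As a rough sketch of what a weekly sync could look like, gsutil's rsync command can mirror a local directory to a bucket (the bucket name and local path below are placeholders, not anything from your setup):

    gsutil -m rsync -r -d /path/to/archive gs://your-archive-bucket/archive

Here -m parallelizes the transfer, -r recurses into subdirectories, and -d deletes objects in the bucket that no longer exist locally, so handle that last flag carefully and test against a scratch bucket first.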

On Tuesday, June 14, 2016 at 4:11:35 PM UTC-4, Upendra Gandhi wrote:
Hi,

Our organisation is looking at storing 5-10 TB of data for archival/DR purposes on cloud storage. The initial data will be around 5-10 TB, and we will do a weekly sync (uploads/deletions) against the archived data.

The data is scattered across Solaris, Linux, and Windows systems in an in-house data center.

1. Is there a way to get cost estimation?
2. What is the fastest way to export the data to Google Cloud (initial export)?
3. What options are there for weekly syncing of data scattered across different platforms to the storage bucket?

Any help is appreciated.

Thanks,
Upendra

