Monday, June 27, 2016

[google-cloud-sql-discuss] Re: Query Google BigQuery from Cloud SQL

Hajar,

ParaSQL's Hyper Connect Engine is a MySQL-compatible database engine that runs on Google Compute Engine. Unlike Cloud SQL, it allows you to add linked servers, somewhat like Microsoft SQL Server linked servers (or federated tables in MySQL), but using any data source that has an ODBC driver available (JDBC coming soon). There are several commercial ODBC drivers available for BigQuery (for example, from Simba Technologies). Once the servers are linked, you can simply issue a command like:

insert into MySQLTable
select a,b from BigQueryTable where ...

You can also dynamically join across MySQL and BigQuery without copying the data first. So something like this is valid:

select t1.col1, t2.col3
from MySQLTable as t1
left join BigQueryTable as t2 on (t2.col1 = t1.col7)
where t2.col12 between 45 and 56
order by t2.col2, t1.col5

GROUP BY and aggregates (sum, min, avg, etc.) also work, as does nested sub-select syntax.
The WHERE clause is pushed down to BigQuery, so this works well as long as the amount of data coming back from the BigQuery side of the query isn't too large.
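For illustration, a minimal Python sketch of issuing such an aggregate through the linked server from a client; the connection parameters and the region/revenue columns are hypothetical, not taken from the original post:

```python
# Hypothetical sketch: an aggregate across linked MySQL and BigQuery tables,
# issued through a MySQLdb connection to the Hyper Connect engine.
import MySQLdb

conn = MySQLdb.connect(host='hyper-connect-host', user='dev',
                       passwd='secret', db='mydb')
cursor = conn.cursor()
cursor.execute("""
    select t1.region, sum(t2.revenue) as total_revenue
    from MySQLTable as t1
    left join BigQueryTable as t2 on (t2.col1 = t1.col7)
    where t2.col12 between 45 and 56  -- pushed down to BigQuery
    group by t1.region
""")
for region, total in cursor.fetchall():
    print(region, total)
```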


On Friday, June 17, 2016 at 1:11:56 PM UTC-4, Hajar Homayouni wrote:
Hi all,

I have a large dataset in GBQ, and I want to query (select) a small part of it in Cloud SQL. Is there any way to do this?



[google-cloud-sql-discuss] Re: Fetching data from BigQuery and inserting into MySQL

You can also use something like the ParaSQL Hyper Connect Engine to link the servers directly... so you can write a statement like:

insert into MySQLTable
select a,b,c from BigQueryTable where ...

This is far more efficient, as the data doesn't have to go to the client app and then back to the database (this would be more like a database-to-database copy).
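For contrast, a minimal sketch of the client-side round trip that such a linked-server INSERT ... SELECT avoids; the connection details, the fetch_bigquery_rows() helper, and the filter are all hypothetical:

```python
import MySQLdb

conn = MySQLdb.connect(host='mysql-host', user='dev', passwd='secret', db='mydb')
cursor = conn.cursor()

# Without a linked server: every row travels BigQuery -> client -> MySQL.
for a, b, c in fetch_bigquery_rows():  # hypothetical helper querying BigQuery
    cursor.execute("insert into MySQLTable values (%s, %s, %s)", (a, b, c))
conn.commit()

# With the servers linked: one server-side statement, no client round trip.
cursor.execute("insert into MySQLTable select a,b,c from BigQueryTable where x > 10")
conn.commit()
```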


On Tuesday, June 14, 2016 at 3:31:02 PM UTC-4, truptanan...@homedepot.com wrote:

My Python program connects to BigQuery and fetches data which I want to insert into a MySQL table.

It's successfully fetching the results from BigQuery. It's also successfully connecting to the MySQL DB, but it's not inserting the data; I see it complaining about row[1].

What's the right way to insert the values from the BigQuery response into the MySQL table columns?

I was following the sample code at https://cloud.google.com/bigquery/create-simple-app-api#completecode but my requirement is not to print but to insert the data into a MySQL table.

query_data = {mybigquery}

query_response = query_request.query(projectId='myprojectid',body=query_data).execute()

for row in query_response['rows']: cursor.execute ("INSERT INTO database.table VALUES ('row[0]','row[1]','row[2]','row[3]','row[4]');")

Traceback (most recent call last):
  File "./myfile.py", line 32, in <module>
    cursor.execute ("INSERT INTO database.datable VALUES ('row[0]','row[1]','row[2]','row[3]','row[4]');")
  File "/usr/lib64/python2.7/site-packages/MySQLdb/cursors.py", line 174, in execute
    self.errorhandler(self, exc, value)
  File "/usr/lib64/python2.7/site-packages/MySQLdb/connections.py", line 36, in defaulterrorhandler
    raise errorclass, errorvalue
_mysql_exceptions.OperationalError: (1366, "Incorrect integer value: 'row[0]' for column 'CountAll' at row 1")


Also, I tried to use 

cursor.execute ("INSERT INTO database.table VALUES (%s,%s,%s,%s,%s);") 

Traceback (most recent call last):
  File "./myfile.py", line 32, in <module>
    cursor.execute ("INSERT INTO database.table VALUES (%s,%s,%s,%s,%s);")
  File "/usr/lib64/python2.7/site-packages/MySQLdb/cursors.py", line 174, in execute
    self.errorhandler(self, exc, value)
  File "/usr/lib64/python2.7/site-packages/MySQLdb/connections.py", line 36, in defaulterrorhandler
    raise errorclass, errorvalue
_mysql_exceptions.ProgrammingError: (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '%s,%s,%s,%s,%s)' at line 1")

or 

cursor.execute ("INSERT INTO database.table VALUES (row[0],row[1],row[2],row[3],row[4]);") 

Traceback (most recent call last):
  File "./myfile.py", line 32, in <module>
    cursor.execute ("INSERT INTO database.table VALUES (row[0],row[1],row[2],row[3],row[4]);")
  File "/usr/lib64/python2.7/site-packages/MySQLdb/cursors.py", line 174, in execute
    self.errorhandler(self, exc, value)
  File "/usr/lib64/python2.7/site-packages/MySQLdb/connections.py", line 36, in defaulterrorhandler
    raise errorclass, errorvalue
_mysql_exceptions.ProgrammingError: (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '[0],row[1],row[2],row[3],row[4])' at line 1")

But in all cases it fails while inserting the values into the MySQL table.


Sunday, June 26, 2016

[google-cloud-sql-discuss] Re: "Lost connection to MySQL server" in Cloud Proxy

I figured out the problem I was having :tada:
After trying to work with just the proxy, without Docker, I finally got the full error message:
```
patrickdougall@hello-rails-1:~$ mysql -u root -p -S /cloudsql/gce-testing-1354:us-central1:proxy-test-db
Enter password:
2016/06/26 23:50:02 New connection for "gce-testing-1354:us-central1:proxy-test-db"
2016/06/26 23:50:02 couldn't connect to "gce-testing-1354:us-central1:proxy-test-db": ensure that the account has access to "gce-testing-1354:us-central1:proxy-test-db" (and make sure there's no typo in that name). Error during createEphemeral for gce-testing-1354:us-central1:proxy-test-db: googleapi: Error 403: Access Not Configured. Cloud SQL Administration API has not been used in project 795050566156 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/sqladmin/overview?project=795050566156 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry., accessNotConfigured
ERROR 2013 (HY000): Lost connection to MySQL server at 'reading initial communication packet', system error: 0

```

So I went to that page and clicked Enable, and both the bare app and the one inside Docker work now.  It's too bad that the full error gets swallowed up in Docker, though; it makes it pretty hard to work with.

Also, is there a way to programmatically (preferably through gcloud, or through the API) handle this problem?
"Cloud SQL Administration API has not been used in project 795050566156 before or it is disabled"





On Monday, March 14, 2016 at 2:51:31 AM UTC-4, 小川純平 wrote:
Hi, I'm trying to connect to a Cloud SQL instance using the Cloud SQL Proxy, but I ran into a "Lost connection to MySQL server" error.
I have no idea what the cause is. Any ideas?

I tried the following from a CentOS 6 instance on GCE:

$ sudo /usr/local/bin/cloud_sql_proxy -dir=/tmp/cloudsql -fuse -credential_file=./gcp.json &
[1] 3139
2016/03/14 06:23:35 Mounting "/tmp/cloudsql"...
2016/03/14 06:23:35 Mounted "/tmp/cloudsql"
2016/03/14 06:23:35 Socket prefix: /tmp/cloudsql

$ mysql -u DB_USER_NAME -S /tmp/cloudsql/PROJECT_NAME:asia-east1:INSTANCE_NAME
2016/03/14 06:14:25 couldn't connect to "PROJECT_NAME:asia-east1:INSTANCE_NAME": dial tcp CLOUD_SQL_INSTANCE_IP:3307: getsockopt: connection timed out
ERROR 2013 (HY000): Lost connection to MySQL server at 'reading initial communication packet', system error: 0


I also tried without -credential_file, and I got another error:

$ sudo /usr/local/bin/cloud_sql_proxy -dir=/tmp/cloudsql -fuse &
[1] 3481
2016/03/14 06:32:10 Mounting "/tmp/cloudsql"...
2016/03/14 06:32:10 Mounted "/tmp/cloudsql"
2016/03/14 06:32:10 Socket prefix: /tmp/cloudsql

$ mysql -u DB_USER_NAME -S /tmp/cloudsql/PROJECT_NAME:asia-east1:INSTANCE_NAME
2016/03/14 06:32:19 couldn't connect to "PROJECT_NAME:asia-east1:INSTANCE_NAME": POST "https://www.googleapis.com/sql/v1beta4/projects/PROJECT_NAME/instances/INSTANCE_NAME/createEphemeral": 403 Forbidden; Body="{\n \"error\": {\n  \"errors\": [\n   {\n    \"domain\": \"global\",\n    \"reason\": \"insufficientPermissions\",\n    \"message\": \"Insufficient Permission\"\n   }\n  ],\n  \"code\": 403,\n  \"message\": \"Insufficient Permission\"\n }\n}\n"; read error: <nil>
ERROR 2013 (HY000): Lost connection to MySQL server at 'reading initial communication packet', system error: 0


Permissions of the socket directories are:

$ ll -R /tmp/
/tmp/:
dr-xr-xr-x. 0 root  root     0 Aug 30  1754 cloudsql
drwxr-xr-x. 2 root  root  4096 Mar 14 06:32 cloudsql-proxy-tmp

/tmp/cloudsql:
-r--r--r--. 0 root root 404 Aug 30  1754 README

/tmp/cloudsql-proxy-tmp:
srwxrwxrwx. 1 root root 0 Mar 14 06:13 PROJECT_NAME:asia-east1:INSTANCE_NAME
srwxrwxrwx. 1 root root 0 Mar 14 05:48 PROJECT_NAME:asia-east1:ANOTHER_INSTANCE_NAME


If this may be a bug and you need my project and instance name for investigation,
please refer to the email I sent to clou...@google.com on Mar 1st (UTC), titled "Cannot change root password".
If you can't find it, I will gladly resend it.

Thanks,
Jumpei


[google-cloud-sql-discuss] Re: "Lost connection to MySQL server" in Cloud Proxy

Hi Vadim Berezniker, I am actually having the same problem trying to get the sql-proxy to work.  I've got as basic a setup as I can figure out: a stock Compute Engine instance and a stock SQL instance.  I'm following https://cloud.google.com/sql/docs/mysql-connect-docker as closely as possible (perfectly, as far as I can tell).  I can even enter mysql from the mysql command line by passing the IP of the SQL instance

`mysql --host=<ip address> -u root -p`

And I can start the proxy just fine.  But when I try to enter mysql through the proxy socket I get the dreaded error:
ERROR 2013 (HY000): Lost connection to MySQL server at 'reading initial communication packet', system error: 0

I've tried this with two different engine instances now, following the advice you've already given, but I'm still not able to get past that error.

`gcloud compute instances create gce-proxy-5 --scopes sql,sql-admin,storage-rw` to create the instance,

`gcloud compute instances describe gce-proxy-5` validates that the scopes are present:
serviceAccounts:
  scopes:
  - https://www.googleapis.com/auth/devstorage.read_write
  - https://www.googleapis.com/auth/sqlservice
  - https://www.googleapis.com/auth/sqlservice.admin

`gcloud compute ssh gce-proxy-5` to ssh in

follow that ^ link to set up the mysql client, Docker, and the proxy image

`docker run -d -v /cloudsql:/cloudsql -v /etc/ssl/certs:/etc/ssl/certs b.gcr.io/cloudsql-docker/gce-proxy /cloud_sql_proxy -dir=/cloudsql -instances=gce-testing-1354:us-central1:proxy-test-db`

to run the image

`mysql -u root -p -S /cloudsql/gce-testing-1354:us-central1:proxy-test-db` to start mysql through the proxy

That prompts for a password, and I give the one I provided here: https://console.cloud.google.com/sql/instances/proxy-test-db/access-control/users
and boom, same error, every time.

ERROR 2013 (HY000): Lost connection to MySQL server at 'reading initial communication packet', system error: 0

I would love to hear if you've got any ideas!

Thanks
Patrick






Friday, June 24, 2016

[google-cloud-sql-discuss] Re: Query Google BigQuery from Cloud SQL

As the SQL query is executed on the database, this would require the use of federated tables. Unfortunately, BigQuery only supports Google Cloud Storage and Google Drive as federated data sources, and MySQL only supports other MySQL databases.

Adding Cloud SQL/MySQL as a federated data source for BigQuery sounds like a good feature to request on the issue tracker.
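To illustrate the MySQL-to-MySQL case that is supported today, a minimal sketch of a FEDERATED table created from Python; the hosts, credentials, and table names are hypothetical, and the FEDERATED engine must be enabled on the local server (managed offerings such as Cloud SQL generally do not enable it):

```python
# Hypothetical sketch: a local MySQL table that proxies a table on a remote
# MySQL server via the FEDERATED engine (no data is stored locally).
import MySQLdb

conn = MySQLdb.connect(host='localhost', user='dev', passwd='secret', db='mydb')
cursor = conn.cursor()
cursor.execute("""
    CREATE TABLE remote_orders (
        id INT NOT NULL,
        total DECIMAL(10,2),
        PRIMARY KEY (id)
    ) ENGINE=FEDERATED
      CONNECTION='mysql://fed_user:fed_pass@remote-host:3306/shop/orders'
""")

# Reads and writes against remote_orders are forwarded to remote-host.
cursor.execute("SELECT id, total FROM remote_orders WHERE total > 100")
print(cursor.fetchall())
```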

On Monday, June 20, 2016 at 3:45:36 PM UTC-4, Hajar Homayouni wrote:
Actually, I have an R application in which I want to read data from GBQ tables and write into Cloud SQL tables through a single query:

insert into [CloudSql table]
 select x from [GBQ]


I need something to connect Cloud SQL to GBQ (integrate them). Any ideas would be really appreciated.

On Monday, June 20, 2016 at 1:32:00 PM UTC-6, Adam (Cloud Platform Support) wrote:
It's a bit unclear what you need to do. Are you asking how to import a subset of your BigQuery data into a Cloud SQL database?

On Monday, June 20, 2016 at 12:18:23 PM UTC-4, Hajar Homayouni wrote:
Thank you Rob, could you please explain in more detail how to do this?

On Monday, June 20, 2016 at 10:13:30 AM UTC-6, Rob wrote:
ParaSQL's Hyper Connect will allow you to do this. 

On Friday, June 17, 2016 at 1:11:56 PM UTC-4, Hajar Homayouni wrote:
Hi all,

I have a large dataset in GBQ, and I want to query (select) a small part of it in Cloud SQL. Is there any way to do this?



Thursday, June 23, 2016

[google-cloud-sql-discuss] Re: set enforce-gtid-consistency back to false


Hello Chen,

My colleague posted an answer to your question on this Stack Overflow thread. Hope it helps.

Sincerely 

On Thursday, June 23, 2016 at 11:11:22 AM UTC-4, chen@triggermail.io wrote:
Has anyone had the experience of Cloud SQL replication changing this parameter to true, which doesn't allow multiple statements in a transaction or creating temp tables? How do I change it back to false in Cloud SQL?
Thx


[google-cloud-sql-discuss] set enforce-gtid-consistency back to false

Has anyone had the experience of Cloud SQL replication changing this parameter to true, which doesn't allow multiple statements in a transaction or creating temp tables? How do I change it back to false in Cloud SQL?
Thx


Monday, June 20, 2016

[google-cloud-sql-discuss] Re: Query Google BigQuery from Cloud SQL

Actually, I have an R application in which I want to read data from GBQ tables and write into Cloud SQL tables through a single query:

insert into [CloudSql table]
 select x from [GBQ]


I need something to connect Cloud SQL to GBQ (integrate them). Any ideas would be really appreciated.

On Monday, June 20, 2016 at 1:32:00 PM UTC-6, Adam (Cloud Platform Support) wrote:
It's a bit unclear what you need to do. Are you asking how to import a subset of your BigQuery data into a Cloud SQL database?

On Monday, June 20, 2016 at 12:18:23 PM UTC-4, Hajar Homayouni wrote:
Thank you Rob, could you please explain in more detail how to do this?

On Monday, June 20, 2016 at 10:13:30 AM UTC-6, Rob wrote:
ParaSQL's Hyper Connect will allow you to do this. 

On Friday, June 17, 2016 at 1:11:56 PM UTC-4, Hajar Homayouni wrote:
Hi all,

I have a large dataset in GBQ, and I want to query (select) a small part of it in Cloud SQL. Is there any way to do this?



[google-cloud-sql-discuss] Re: Query Google BigQuery from Cloud SQL

It's a bit unclear what you need to do. Are you asking how to import a subset of your BigQuery data into a Cloud SQL database?

On Monday, June 20, 2016 at 12:18:23 PM UTC-4, Hajar Homayouni wrote:
Thank you Rob, could you please explain in more detail how to do this?

On Monday, June 20, 2016 at 10:13:30 AM UTC-6, Rob wrote:
ParaSQL's Hyper Connect will allow you to do this. 

On Friday, June 17, 2016 at 1:11:56 PM UTC-4, Hajar Homayouni wrote:
Hi all,

I have a large dataset in GBQ, and I want to query (select) a small part of it in Cloud SQL. Is there any way to do this?



[google-cloud-sql-discuss] Re: Query Google BigQuery from Cloud SQL

Thank you Rob, could you please explain in more detail how to do this?

On Monday, June 20, 2016 at 10:13:30 AM UTC-6, Rob wrote:
ParaSQL's Hyper Connect will allow you to do this. 

On Friday, June 17, 2016 at 1:11:56 PM UTC-4, Hajar Homayouni wrote:
Hi all,

I have a large dataset in GBQ, and I want to query (select) a small part of it in Cloud SQL. Is there any way to do this?



[google-cloud-sql-discuss] Re: Query Google BigQuery from Cloud SQL

ParaSQL's Hyper Connect will allow you to do this. 

On Friday, June 17, 2016 at 1:11:56 PM UTC-4, Hajar Homayouni wrote:
Hi all,

I have a large dataset in GBQ, and I want to query (select) a small part of it in Cloud SQL. Is there any way to do this?



Friday, June 17, 2016

[google-cloud-sql-discuss] Query Google BigQuery from Cloud SQL

Hi all,

I have a large dataset in GBQ, and I want to query (select) a small part of it in Cloud SQL. Is there any way to do this?



Thursday, June 16, 2016

Re: [google-cloud-sql-discuss] Cannot connect to Cloud SQL 2nd generation - from web interface and via Cloud SQL Proxy

Hi, Vadim,
Sorry for the late reply.

As you told me, I increased the storage size and it worked.

Thanks.


[google-cloud-sql-discuss] Re: Fetching data from BigQuery and inserting into MySQL

Hi Nicholas,

Here is the Stack Overflow link for this 

http://stackoverflow.com/questions/37817042/how-to-insert-values-into-mysql-table-from-another-bigquery-response

Thanks

On Thursday, June 16, 2016 at 10:19:55 AM UTC-5, Nicholas (Google Cloud Support) wrote:
At this point, I would strongly suggest posting this as a question on Stack Overflow, as that's a far better forum for this type of code debugging.  Be sure to provide a complete code sample and stack trace so that the Stack Exchange community is best equipped to help.  Then post a link to your Stack Overflow question here so that others encountering this post can follow through.

This forum is more appropriate for general discussions, announcements and sharing of best practices.



[google-cloud-sql-discuss] Re: Fetching data from BigQuery and inserting into MySQL

At this point, I would strongly suggest posting this as a question on Stack Overflow, as that's a far better forum for this type of code debugging.  Be sure to provide a complete code sample and stack trace so that the Stack Exchange community is best equipped to help.  Then post a link to your Stack Overflow question here so that others encountering this post can follow through.

This forum is more appropriate for general discussions, announcements and sharing of best practices.



[google-cloud-sql-discuss] Re: Feasibility of archival storage

Happy to answer your questions, though I would suggest posting such Cloud Storage questions to the App Engine Google Group in the future.

The operations table shows the different costs of each operation type.  The Cloud Storage APIs article states the following:
By default, gsutil versions starting with 4.0 interact with the JSON API.
As such, you can consider the costs relative to the JSON API when using gsutil for your sync/backups.  The cost of your backups will depend entirely on the volume of reads and writes.  Without knowing too many specific details about your needs, I can only suggest crunching some numbers for yourself with the documentation provided and testing cautiously.
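As a worked example of that number crunching, a rough sketch; every figure below is a placeholder to be swapped for the current rates on the pricing page:

```python
# Back-of-the-envelope monthly estimate; all prices are placeholders,
# not actual Google Cloud Storage rates.
STORAGE_GB = 10 * 1024            # ~10 TB archived
PRICE_PER_GB_MONTH = 0.01         # placeholder $/GB/month for an archival class
CLASS_A_OPS = 50000               # writes/lists issued by the weekly syncs
PRICE_PER_10K_CLASS_A = 0.10      # placeholder $ per 10,000 Class A operations

storage_cost = STORAGE_GB * PRICE_PER_GB_MONTH
ops_cost = (CLASS_A_OPS / 10000.0) * PRICE_PER_10K_CLASS_A
print("storage: $%.2f  operations: $%.2f  total: $%.2f"
      % (storage_cost, ops_cost, storage_cost + ops_cost))
```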

On Tuesday, June 14, 2016 at 4:11:35 PM UTC-4, Upendra Gandhi wrote:
Hi,

Our organisation is looking at storing 5-10 TB of data for archival/DR purposes on Cloud Storage. The initial data will be around 5-10 TB, but we will do a weekly sync (upload/deletion) of the archived data.

The data is scattered across Solaris, Linux, and Windows systems in an in-house data center.

1. Is there a way to get a cost estimate?
2. What is the fastest way to export the data to Google Cloud (the initial export)?
3. What options are there for weekly syncing the scattered data from the different platforms to the storage bucket?

Any help is appreciated.

Thanks,
Upendra


Wednesday, June 15, 2016

[google-cloud-sql-discuss] Re: Fetching data from BigQuery and inserting into MySQL

Hi Nicholas,

Thank you very much for the detailed note. I tried to use your suggested method, but that's again throwing the error below:
[root@myserver ~]# python myfile.py 
Traceback (most recent call last):
  File "myfile.py", line 27, in <module>
    row[0],
KeyError: 0
[root@myserver ~]# 

When I used  print('\t'.join(field['v'] for field in row['f']))

it does print the output. (These are the values fetched from my BigQuery response; the BigQuery code is taken from the Google site https://cloud.google.com/bigquery/create-simple-app-api#completecode )
15658 53.35023630093262 221.0 237.0 436.0

My requirement is to insert these values into various columns in the MySQL database table. I have tested the connection and am able to describe the table within my Python program.
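The KeyError is consistent with the response format shown above: each row in the BigQuery API response is a dict of the form {'f': [{'v': value}, ...]}, so row[0] does not exist. A minimal sketch under the assumption that the table takes an integer followed by four floats (adjust the casts to the real schema):

```python
# Cells live at row['f'][i]['v'], not row[i], and arrive as strings.
for row in query_response['rows']:
    cells = [field['v'] for field in row['f']]
    values = (int(cells[0]), float(cells[1]), float(cells[2]),
              float(cells[3]), float(cells[4]))
    cursor.execute(
        "INSERT INTO database.table VALUES (%s, %s, %s, %s, %s)",
        values)
conn.commit()  # assuming `conn` is the connection the cursor came from
```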



[google-cloud-sql-discuss] Re: Fetching data from BigQuery and inserting into MySQL

Good day and thanks for posting your questions here!

Assuming your cursor is a MySQLCursor, I might suggest reviewing the documentation for the execute method.  Based on the errors you've encountered, the issue lies with the cursor.execute() method.  If you don't mind, I'll go through each of your attempts to point out what is likely causing the errors you're seeing.

1: cursor.execute("INSERT INTO database.table VALUES ('row[0]','row[1]','row[2]','row[3]','row[4]');")
The resulting SQL statement here will insert a new record with 5 string literals as values: 'row[0]', 'row[1]', 'row[2]', 'row[3]', 'row[4]'
These literals will not be replaced with the values found at row[0], row[1], etc.

2: cursor.execute("INSERT INTO database.table VALUES (%s,%s,%s,%s,%s);")
The syntax here is correct.  The main issue is that the execute method attempts to replace occurrences of %s with values provided in its second argument.  In this case, cursor.execute is invoked with only one argument; you've not provided any values.

3: cursor.execute("INSERT INTO database.table VALUES (row[0],row[1],row[2],row[3],row[4]);")
This statement throws a syntax error because row[0], row[1], etc. are not variables known to the SQL instance.  The row list only exists in Python at this point.

I believe what you might be looking for is something more like this:
# prepare the statement
insert_statement = 'INSERT INTO database.tables VALUES (%s, %s, %s, %s, %s);'

# loop through each BQ row
for row in query_response['rows']:
    # prepare the set of values
    # strongly advise sanitizing the values before inserting
    # type checks, value checks, SQL injection checks, etc.
    values_to_insert = (
        row[0],
        row[1],
        row[2],
        row[3],
        row[4])

    # insert data
    cursor.execute(insert_statement, values_to_insert)
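As a follow-on, the same insert can be batched; a minimal sketch using the driver's executemany, with the caveat that the row indexing must match the actual response format:

```python
# Batch variant: build the value tuples, send them in one executemany
# call, and commit once at the end.
rows_to_insert = [
    (row[0], row[1], row[2], row[3], row[4])
    for row in query_response['rows']
]
cursor.executemany(insert_statement, rows_to_insert)
conn.commit()  # assuming `conn` is the connection the cursor came from
```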

Hope this helps!



Re: [google-cloud-sql-discuss] Re: Feasibility of archival storage

Thanks Nicholas and all for addressing my concerns, even though this group is not meant for answering this type of question. I appreciate everyone's patience.

Thank you for the information. One last thing I wanted to know while estimating costs is Class A calls. How do I know how many Class A operations will be made in the weekly sync (get/post/delete)? If you can point me to the URL which has that information, that would be really helpful as well. If not, that's fine. Thanks again!!

~UG





--
Upendra Gandhi
Systems Administrator, OTS
IIT Tower
10 W. 35th Street, 8th Floor, 8E3-2
Chicago, IL 60616
O: (312)567-3283
Alternate Email:
upendra.gandhi@iit.edu, upendra.gandhi@gmail.com
Keyserver: http://pgp.mit.edu/
 
 


[google-cloud-sql-discuss] Re: Feasibility of archival storage

Thanks for sharing your questions here!  Cloud Storage may indeed be the appropriate tool for the task you describe.

Cost estimations
The Google Cloud Storage Pricing article describes the cost of storage for the various storage classes as well as the costs of network egress.  With this information, you should be able to estimate the costs your organization would incur for its use.  You can also find in this same article some pricing examples to get a better idea of the end result.

Fastest export to the cloud
Assuming you mean the fastest way to upload data to a Cloud Storage bucket, gsutil is a command line tool specifically for interacting with Cloud Storage buckets.  It features many Linux-like commands such as gsutil cp, which can be used to copy files local-to-bucket, bucket-to-local or bucket-to-bucket.  One can also use the -r option to perform the copy operation recursively through subdirectories.

Weekly syncing
Sticking with the gsutil tool described above, I would point out the rsync command.  As per the documentation:
The gsutil rsync command makes the contents under dst_url the same as the contents under src_url, by copying any missing files/objects (or those whose data has changed), and (if the -d option is specified) deleting any extra files/objects.
If you cannot install gsutil on each of those systems but CAN read their drive contents remotely (e.g. via a mapped network drive), you could have gsutil on a single machine sync content from the network drive to a storage bucket.  This would increase your internal network traffic, though, as all data would have to go through that single machine first.  Otherwise, you could simply install gsutil on each of the machines to upload to the bucket directly.

Please note the system requirements for gsutil.  I don't think you'll have any success installing it on a Solaris machine, though I've not tested this myself.
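For the weekly job itself, a minimal sketch of driving gsutil rsync from Python (e.g. under cron); the local path and bucket name are placeholders:

```python
# Weekly sync sketch: mirror a local archive directory into a bucket.
# -m parallelizes transfers, -r recurses, -d deletes remote objects
# that were removed locally.
import subprocess

SRC = "/data/archive"                    # placeholder local path
DST = "gs://my-archive-bucket/archive"   # placeholder bucket

subprocess.check_call(["gsutil", "-m", "rsync", "-r", "-d", SRC, DST])
```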

Hope this helps!



Re: [google-cloud-sql-discuss] Trying to Connect to database in PHP using PDO and Cloud SQL Proxy

Hi guys. I'm trying to start the Cloud SQL Proxy with an instance startup script, or a startup script in /etc/init.d, or even with cron, but I can't.
If I execute those scripts manually, the proxy starts fine. In the scripts I invoke the command with the full path to the cloud_sql_proxy binary.
Have you tested startup scripts on CentOS? What could be the problem with those scripts?
Thanks in advance.

On Tuesday, March 22, 2016 at 15:55:19 (UTC-3), Kevin Malachowski wrote:
Sorry for the delay.

I'm a little confused by your question but I'll try to answer what I think you're asking. Feel free to ask again if I missed the mark.

If you create a user dev@'cloudsqlproxy~%' on an instance named 'sql-inst' in 'us-central1' in the project 'my-project', then you can connect to that instance using the following mysql command (while the proxy is running on that machine):

mysql -u dev -S /cloudsql/my-project:us-central1:sql-inst

The user that has the 'cloudsqlproxy~%' hostname is the one you want to provide as the '-u' flag (short for --user).

For your PHP/PDO script, it appears that the second parameter of the constructor is the username [0]. In your original post you put 'db' for this value; given a dev@'cloudsqlproxy~%' user, you should use 'dev' instead.
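For comparison, the same socket connection sketched in Python with MySQLdb rather than PHP; the password is left empty to mirror the mysql command above, and the instance path is the example one:

```python
import MySQLdb

# Connect through the proxy's Unix socket as the 'dev' user created above.
conn = MySQLdb.connect(
    unix_socket='/cloudsql/my-project:us-central1:sql-inst',
    user='dev',
    passwd='')
```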


On Wed, Mar 16, 2016 at 9:48 PM, <wob...@yblew.com> wrote:
It seems the only thing that's stable is below; however, it seems I have to grant all databases in MySQL to the dev@'cloudsqlproxy~%' user before I can interact with the database through the proxy over non-socket connections, which is weird; can you confirm? I checked developmentDB.json and it's an editor on all my instances. If I remember correctly, Kevin said I don't have to grant dev@'cloudsqlproxy~%' anything, just create it and I'm all set.

sudo ./cloud_sql_proxy -dir=/cloudsql -instances=my-project:us-central1:sql-inst=tcp:3308 -credential_file=/root/developmentDB.json &

I also tried the below for php/mysqli and got permission denied in the browser, but it worked for php/pdo in the browser

./cloud_sql_proxy -dir=/cloudsql -instances=my-project:us-central1:sql-inst &
mysql -u <user_name> -S /cloudsql/my-project:us-central1:sql-inst

On Wednesday, March 16, 2016 at 5:07:10 PM UTC-4, Vadim Berezniker wrote:
No, I see the same problem. I filed an issue to investigate: https://github.com/GoogleCloudPlatform/cloudsql-proxy/issues/7
Can you use the non-fuse mode as a workaround until the issue is resolved?

On Wed, Mar 16, 2016 at 1:46 PM <wob...@yblew.com> wrote:
So after you disabled Selinux it worked for you?


