Thursday, January 31, 2013

Re: Cloud SQL Instance does not shut down

Hi Yissachar,
One way to shut down your instance is to click the "restart" button in the Google APIs Console (under "Actions" -> "Restart instance"). Despite its name, it actually brings the instance down. Your instance will then only start again upon receiving traffic. If you still have issues with shutdown, please send an email to sql-team@google.com with the name of your instance.


On Thu, Jan 31, 2013 at 4:32 PM, Yissachar Radcliffe <yissachar.radcliffe@caseware.com> wrote:
I've been using a D0 (dev testing) instance for a while to test out Google Cloud SQL. A couple of days ago I upgraded the testing instance to a pay-per-use D2 instance. I am still using this as a test/dev instance, but the D0 instance could not support the amount of data I need on there to test, which is why I needed to upgrade to D2.

Looking at the billing and dashboard for the instance shows that this instance has been active since I started it 2.5 days ago. This is confusing to me as it has not been used at all today (Jan 31) and only minimally yesterday. I specifically set this as a pay-per-use instance instead of a package plan as it is being used very lightly as a test instance (as mentioned above, D0 was completely fine for my dev usage aside from the storage size issue), and a pay-per-use instance is supposed to shut down after 15 minutes of inactivity. However, it seems that this is not happening.

This is quite frustrating as I am only minimally using this instance yet being charged for a full day of instance hours. How can I fix this?


Cloud SQL Instance does not shut down

I've been using a D0 (dev testing) instance for a while to test out Google Cloud SQL. A couple of days ago I upgraded the testing instance to a pay-per-use D2 instance. I am still using this as a test/dev instance, but the D0 instance could not support the amount of data I need on there to test, which is why I needed to upgrade to D2.

Looking at the billing and dashboard for the instance shows that this instance has been active since I started it 2.5 days ago. This is confusing to me as it has not been used at all today (Jan 31) and only minimally yesterday. I specifically set this as a pay-per-use instance instead of a package plan as it is being used very lightly as a test instance (as mentioned above, D0 was completely fine for my dev usage aside from the storage size issue), and a pay-per-use instance is supposed to shut down after 15 minutes of inactivity. However, it seems that this is not happening.

This is quite frustrating as I am only minimally using this instance yet being charged for a full day of instance hours. How can I fix this?


Re: Astonishing slowness when making a connection

If you are using a Per Use Plan (https://developers.google.com/cloud-sql/docs/billing#per_use), the system will shut down your instance if it has been idle for more than 15 minutes.  The next connection will cause the instance to spin up.  That typically takes ~8 sec, but 15 sec is not surprising.  If this spin up latency is unacceptable, you can switch to a Package Plan and your instance will only be shut down after 12 hours of inactivity.

Ken
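
If you stay on the Per Use Plan, one way to tolerate the spin-up delay without failing user requests is to retry the first connection attempt with a short pause. A minimal sketch in Python, assuming a MySQLdb-style client and purely hypothetical connection parameters; the same idea applies to a JDBC DataSource by validating and retrying the first checkout:

import time
import MySQLdb

def connect_with_retry(attempts=3, delay=5):
    # The first connection after an idle period may fail or stall while the
    # instance spins up (roughly 8-15 seconds per this thread), so retry a few times.
    last_error = None
    for _ in range(attempts):
        try:
            return MySQLdb.connect(host='127.0.0.1', user='app',
                                   passwd='secret', db='mydb')
        except MySQLdb.OperationalError as error:
            last_error = error
            time.sleep(delay)
    raise last_error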

On Thu, Jan 31, 2013 at 12:45 AM, <alan.murphy@paddypower.com> wrote:
Hi All, 

I'm using Spring, JPA and Hibernate to connect to a Google Cloud SQL instance, though my application sits in-house within my company (for reasons I won't bore anybody with) rather than being deployed to the Google cloud platform. Early on, several warning signs led me to believe that this outside-in access is 'barely' supported by Google (e.g. scant mention in the documentation, the fact that the driver for connecting to Google Cloud SQL from outside is not in any JAR available through Maven, etc etc). 

My specific problem is that my application seems to take an inordinate amount of time to connect to the database. If you contrast the timestamps of the two log statements below, you'll note a 15 second latency during connection. Has anybody else encountered similar? Might this be a firewall problem within my own company, or could it be that the databases are simply not technologically built to accept connections from applications not deployed within the same cloud?

All thoughts/suggestions much appreciated, 

Cheers, 
Al


2013-01-25 09:05:44,742 [com.paddypower.metrics.kanban.report.monthly.OpsReviewEmailer.main()] INFO  org.hibernate.connection.DriverManagerConnectionProvider - connection properties: {autocommit=true, release_mode=auto}
2013-01-25 09:05:59,962 [com.paddypower.metrics.kanban.report.monthly.OpsReviewEmailer.main()] INFO  org.hibernate.cfg.SettingsFactory - Database -> name: MySQL/Google Cloud SQL, version: 5.5.28, major: 5, minor: 5
2013-01-25 09:05:59,962 [com.paddypower.metrics.kanban.report.monthly.OpsReviewEmailer.main()] INFO  org.hibernate.cfg.SettingsFactory - Driver -> name: Google Cloud SQL JDBC Driver, version: 1.0, major: 1, minor: 0


Astonishing slowness when making a connection

Hi All, 

I'm using Spring, JPA and Hibernate to connect to a Google Cloud SQL instance, though my application sits in-house within my company (for reasons I won't bore anybody with) rather than being deployed to the Google cloud platform. Early on, several warning signs led me to believe that this outside-in access is 'barely' supported by Google (e.g. scant mention in the documentation, the fact that the driver for connecting to Google Cloud SQL from outside is not in any JAR available through Maven, etc etc). 

My specific problem is that my application seems to take an inordinate amount of time to connect to the database. If you contrast the timestamps of the two log statements below, you'll note a 15 second latency during connection. Has anybody else encountered similar? Might this be a firewall problem within my own company, or could it be that the databases are simply not technologically built to accept connections from applications not deployed within the same cloud?

All thoughts/suggestions much appreciated, 

Cheers, 
Al


2013-01-25 09:05:44,742 [com.paddypower.metrics.kanban.report.monthly.OpsReviewEmailer.main()] INFO  org.hibernate.connection.DriverManagerConnectionProvider - connection properties: {autocommit=true, release_mode=auto}
2013-01-25 09:05:59,962 [com.paddypower.metrics.kanban.report.monthly.OpsReviewEmailer.main()] INFO  org.hibernate.cfg.SettingsFactory - Database -> name: MySQL/Google Cloud SQL, version: 5.5.28, major: 5, minor: 5
2013-01-25 09:05:59,962 [com.paddypower.metrics.kanban.report.monthly.OpsReviewEmailer.main()] INFO  org.hibernate.cfg.SettingsFactory - Driver -> name: Google Cloud SQL JDBC Driver, version: 1.0, major: 1, minor: 0


Re: Passing proxy settings to connect Cloud SQL using command line tool

Yep, I got this fixed eventually. In google_sql.sh, I changed...

${JAVA} -jar "${JAR}" "$@"

...to...

${JAVA} -Dhttps.proxyHost=<my_company_proxy_ip> -Dhttps.proxyPort=<my_company_proxy_port> -jar "${JAR}" "$@"


On Friday, January 25, 2013 11:53:51 AM UTC, InterfileDM DotModus wrote:
Did someone manage to find a fix for this problem?

Exiting; Unable to open connection.
Unable to fetch OAuth2 tokens.


On Thursday, December 27, 2012 3:50:25 PM UTC+2, alan....@paddypower.com wrote:
Hey Fabio, 

Did this work for you in the end? No matter what I try, I only ever seem to get the following...

Exiting; Unable to open connection.
Unable to fetch OAuth2 tokens.

Very frustrating. Any advice appreciated, 

Cheers, 
Al

On Thursday, November 8, 2012 6:46:56 PM UTC, Fábio Peruchi wrote:
Thanks, Rob!

On Thursday, November 8, 2012 at 16:40:00 UTC-2, Rob wrote:
The arguments you will want are from here.


I'd change the following line:

${JAVA} -jar "${JAR}" "$@"

to something like:

${JAVA} -Dhttp.proxyPort=8080 -Dhttp.proxyHost=webcache.mydomain.com -jar "${JAR}" "$@"


On Thu, Nov 8, 2012 at 10:38 AM, Rob Clevenger <rcle...@google.com> wrote:
You'll need to edit google_sql.sh (or .cmd) to set the system properties used by Java for HTTP proxy support.


On Thu, Nov 8, 2012 at 10:35 AM, Fábio Peruchi <fhpe...@ciandt.com> wrote:
Hi!

Can I pass proxy server settings to connect to Cloud SQL via the command line tool (google_sql)? I can't find this information in the command line tool documentation (https://developers.google.com/cloud-sql/docs/commandline).

Thanks in advance.

Fábio Peruchi



Sunday, January 27, 2013

How can I specify an oauth2 token in JDBC call?

I know I can use the command-line tool to create a java prefs.xml file. 
My problem is that I have a multi-tenant SaaS application (Explore Analytics) and I need to manage separate tokens for each user and use the right token when I create the JDBC connection for the user's instance. Is there a way of passing that token in the JDBC connect string?



Re: Python Cloud SQL Search Example?

You should be able to use cursor.execute() inside the if, because all the data has already been fetched, so the connection to the database is idle.

-- Razvan ME
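
A minimal sketch of that check-then-insert flow with a single cursor, assuming conn and cursor are already open and that station_id and mdata are defined earlier, as in the snippet below; the string concatenation is also replaced with a parameterized query, which avoids the manual escaping:

# Parameters are passed separately, so no quoting or escaping is needed.
cursor.execute("SELECT * FROM weather WHERE weather_code = %s", (station_id,))
data = cursor.fetchall()

if len(data) == 0:
    # The SELECT result is fully fetched, so the same cursor can run the INSERT.
    cursor.execute("INSERT INTO metar (metar_data) VALUES (%s)", (mdata,))
    conn.commit()
    weather_id = cursor.lastrowid  # id of the inserted row, if the table has an AUTO_INCREMENT key
else:
    weather_id = data[0][0]

A second cursor from the same connection (cursor2 = conn.cursor()) would also work, but it is not required here.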


On Sat, Jan 26, 2013 at 12:46 PM, Andre Bruton <andrebruton@gmail.com> wrote:
Hi

How do I manage two different "cursors" in Cloud SQL?

I have one cursor moving through a database checking for certain data.
When there is data missing I want to add new data and then continue
moving through the data. How do I add data using a new cursor?

            # Check if Station exists
            sqlString = "SELECT * FROM weather WHERE weather_code = '" + station_id + "' "
            outputString += "<tr><td colspan=8>" + sqlString + "</td></tr>"

            cursor.execute(sqlString)

            data = cursor.fetchall()
            if len(data) == 0:
              outputString += "<tr><td colspan=8>Need to Add Data</td></tr>"

              cursor.execute('INSERT INTO metar (metar_data) VALUES (%s)', (MySQLdb.escape_string(mdata),))
              conn.commit()

Do I use cursor.execute here? or cursor2.execute?

              # weather_id = <no idea>.lastrowid
              # outputString += "<tr><td colspan=8>Data Exists ID: " + str(weather_id) + "</td></tr>"
            else:
              weather_id = data[0][0]
              # outputString += "<tr><td colspan=8>Data Exists ID: " + str(weather_id) + "</td></tr>"

Best regards

Andre F Bruton






On Sat, Jan 26, 2013 at 8:07 PM, Andre Bruton <andrebruton@gmail.com> wrote:
> Thank you. Yes this did help.
>
> I did a search on fetchall() and found a great example at:
>
> http://stackoverflow.com/questions/2440147/how-to-check-the-existence-of-a-row-in-sqlite-with-python
>
> Best regards
>
> Andre
>
>
> On Saturday, January 26, 2013 5:33:47 PM UTC+2, Razvan Musaloiu-E. wrote:
>>
>> Some nice examples are in the "MySQLdb module" section from:
>>
>>   http://zetcode.com/db/mysqlpython/
>>
>> The .fetchall()/.fetchone() will return an empty tuple if the query
>> doesn't return any rows:
>>
>> Sample code:
>>
>> con = rdbms.connect(instance=...)
>> cur = con.cursor()
>> cur.execute("CREATE TABLE IF NOT EXISTS test.t(i int)")
>> cur.execute("SELECT * FROM test.t")
>> print cur.fetchall()
>>
>> cur.execute("SELECT * FROM test.t")
>> print cur.fetchall()
>>
>>
>>
>> Output:
>>
>> ()
>> ()
>>
>>
>> -- Razvan ME
>>
>>
>> On Sat, Jan 26, 2013 at 6:59 AM, Andre Bruton <andre...@gmail.com> wrote:
>>>
>>> Hi
>>>
>>> Google gives an example of how to add and list data from a Cloud SQL
>>> database, and that is running great.
>>>
>>> How would I search for a record using Python? I want to search for a
>>> record and do something if there is a record and do something else if there
>>> is no record.
>>>
>>> Does anyone have a working example that I can look at?
>>>
>>> Best regards
>>>
>>> Andre F Bruton
>>>

Re: You cannot create new instances because billing is disabled

Did you just enable billing?

-- Razvan ME


On Sun, Jan 27, 2013 at 1:40 AM, Hồng anh Dương <honganh.cn8@gmail.com> wrote:
My app ID is 833786181518.
I created a project and see the message:
You cannot create new instances because billing is disabled

Can you help me?

Thanks so much.

On Friday, November 9, 2012 at 06:29:08 UTC+7, Tony Piazza wrote:
I just set up my billing and verified the email account. Any ideas as to why I am getting this message:

You cannot create new instances because billing is disabled

Thanks in advance for any help you can provide.

-Tony


Re: You cannot create new instances because billing is disabled

My app ID is 833786181518.
I created a project and see the message:
You cannot create new instances because billing is disabled

Can you help me?

Thanks so much.

On Friday, November 9, 2012 at 06:29:08 UTC+7, Tony Piazza wrote:
I just set up my billing and verified the email account. Any ideas as to why I am getting this message:

You cannot create new instances because billing is disabled

Thanks in advance for any help you can provide.

-Tony


Saturday, January 26, 2013

Re: Python Cloud SQL Search Example?

Hi

How do I manage two different "cursors" in Cloud SQL?

I have one cursor moving through a database checking for certain data.
When there is data missing I want to add new data and then continue
moving through the data. How do I add data using a new cursor?

# Check if Station exists
sqlString = "SELECT * FROM weather WHERE weather_code = '" + station_id + "' "
outputString += "<tr><td colspan=8>" + sqlString + "</td></tr>"

cursor.execute(sqlString)

data = cursor.fetchall()
if len(data) == 0:
  outputString += "<tr><td colspan=8>Need to Add Data</td></tr>"

  cursor.execute('INSERT INTO metar (metar_data) VALUES (%s)', (MySQLdb.escape_string(mdata),))
  conn.commit()

Do I use cursor.execute here? or cursor2.execute?

  # weather_id = <no idea>.lastrowid
  # outputString += "<tr><td colspan=8>Data Exists ID: " + str(weather_id) + "</td></tr>"
else:
  weather_id = data[0][0]
  # outputString += "<tr><td colspan=8>Data Exists ID: " + str(weather_id) + "</td></tr>"

Best regards

Andre F Bruton






On Sat, Jan 26, 2013 at 8:07 PM, Andre Bruton <andrebruton@gmail.com> wrote:
> Thank you. Yes this did help.
>
> I did a search on fetchall() and found a great example at:
>
> http://stackoverflow.com/questions/2440147/how-to-check-the-existence-of-a-row-in-sqlite-with-python
>
> Best regards
>
> Andre
>
>
> On Saturday, January 26, 2013 5:33:47 PM UTC+2, Razvan Musaloiu-E. wrote:
>>
>> Some nice examples are in the "MySQLdb module" section from:
>>
>> http://zetcode.com/db/mysqlpython/
>>
>> The .fetchall()/.fetchone() will return an empty tuple if the query
>> doesn't return any rows:
>>
>> Sample code:
>>
>> con = rdbms.connect(instance=...)
>> cur = con.cursor()
>> cur.execute("CREATE TABLE IF NOT EXISTS test.t(i int)")
>> cur.execute("SELECT * FROM test.t")
>> print cur.fetchall()
>>
>> cur.execute("SELECT * FROM test.t")
>> print cur.fetchall()
>>
>>
>>
>> Output:
>>
>> ()
>> ()
>>
>>
>> -- Razvan ME
>>
>>
>> On Sat, Jan 26, 2013 at 6:59 AM, Andre Bruton <andre...@gmail.com> wrote:
>>>
>>> Hi
>>>
>>> Google gives an example of how to add and list data from a Cloud SQL
>>> database, and that is running great.
>>>
>>> How would I search for a record using Python? I want to search for a
>>> record and do something if there is a record and do something else if there
>>> is no record.
>>>
>>> Does anyone have a working example that I can look at?
>>>
>>> Best regards
>>>
>>> Andre F Bruton

Re: Python Cloud SQL Search Example?

Thank you. Yes this did help.

I did a search on fetchall() and found a great example at:

http://stackoverflow.com/questions/2440147/how-to-check-the-existence-of-a-row-in-sqlite-with-python

Best regards

Andre


On Saturday, January 26, 2013 5:33:47 PM UTC+2, Razvan Musaloiu-E. wrote:
Some nice examples are in the "MySQLdb module" section from:

  http://zetcode.com/db/mysqlpython/

The .fetchall()/.fetchone() will return an empty tuple if the query doesn't return any rows:

Sample code:

con = rdbms.connect(instance=...)
cur = con.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS test.t(i int)")
cur.execute("SELECT * FROM test.t")
print cur.fetchall()

cur.execute("SELECT * FROM test.t")
print cur.fetchall()


Output:

()
()

-- Razvan ME


On Sat, Jan 26, 2013 at 6:59 AM, Andre Bruton <andre...@gmail.com> wrote:
Hi

Google gives an example of how to add and list data from a Cloud SQL database, and that is running great.

How would I search for a record using Python? I want to search for a record and do something if there is a record and do something else if there is no record.

Does anyone have a working example that I can look at?

Best regards

Andre F Bruton



Re: Python Cloud SQL Search Example?

Some nice examples are in the "MySQLdb module" section from:

  http://zetcode.com/db/mysqlpython/

The .fetchall()/.fetchone() will return an empty tuple if the query doesn't return any rows:

Sample code:

con = rdbms.connect(instance=...)
cur = con.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS test.t(i int)")
cur.execute("SELECT * FROM test.t")
print cur.fetchall()

cur.execute("SELECT * FROM test.t")
print cur.fetchall()


Output:

()
()

-- Razvan ME


On Sat, Jan 26, 2013 at 6:59 AM, Andre Bruton <andrebruton@gmail.com> wrote:
Hi

Google gives an example of how to add and list data from a Cloud SQL database, and that is running great.

How would I search for a record using Python? I want to search for a record and do something if there is a record and do something else if there is no record.

Does anyone have a working example that I can look at?

Best regards

Andre F Bruton



Python Cloud SQL Search Example?

Hi

Google gives an example of how to add and list data from a Cloud SQL database, and that is running great.

How would I search for a record using Python? I want to search for a record and do something if there is a record and do something else if there is no record.

Does anyone have a working example that I can look at?

Best regards

Andre F Bruton



Friday, January 25, 2013

Re: Passing proxy settings to connect Cloud SQL using command line tool

Did someone manage to find a fix for this problem?

Exiting; Unable to open connection.
Unable to fetch OAuth2 tokens.


On Thursday, December 27, 2012 3:50:25 PM UTC+2, alan....@paddypower.com wrote:
Hey Fabio, 

Did this work for you in the end? No matter what I try, I only ever seem to get the following...

Exiting; Unable to open connection.
Unable to fetch OAuth2 tokens.

Very frustrating. Any advice appreciated, 

Cheers, 
Al

On Thursday, November 8, 2012 6:46:56 PM UTC, Fábio Peruchi wrote:
Thanks, Rob!

On Thursday, November 8, 2012 at 16:40:00 UTC-2, Rob wrote:
The arguments you will want are from here.


I'd change the following line:

${JAVA} -jar "${JAR}" "$@"

to something like:

${JAVA} -Dhttp.proxyPort=8080 -Dhttp.proxyHost=webcache.mydomain.com -jar "${JAR}" "$@"


On Thu, Nov 8, 2012 at 10:38 AM, Rob Clevenger <rcle...@google.com> wrote:
You'll need to edit google_sql.sh (or .cmd) to set the system properties used by Java for HTTP proxy support.


On Thu, Nov 8, 2012 at 10:35 AM, Fábio Peruchi <fhpe...@ciandt.com> wrote:
Hi!

Can I pass proxy server settings to connect to Cloud SQL via the command line tool (google_sql)? I can't find this information in the command line tool documentation (https://developers.google.com/cloud-sql/docs/commandline).

Thanks in advance.

Fábio Peruchi


Thursday, January 24, 2013

Re: Using Google Cloud SQL with C# .Net application

Hi Sarah

We currently support Java, Python, and anything that can use a JDBC driver -- see https://developers.google.com/cloud-sql/docs/external
We'd like to support a wider range of external applications in the future, but we don't have anything we can announce yet.

J

On Thu, Jan 24, 2013 at 11:16 AM, Sarah Fernandezlopez <sfernandezlopez@gmail.com> wrote:
I saw a previous post that indicated this is not yet available.  Are there any plans or timelines to add this capability?
Thanks!



--
Joe Faith | Product Manager | Google Cloud

Using Google Cloud SQL with C# .Net application

I saw a previous post that indicated this is not yet available.  Are there any plans or timelines to add this capability?
Thanks!

Re: Sync data from Google spreadsheet to Cloud SQL using Google Apps Script

Hi Sonal,
I'd refer you to https://developers.google.com/apps-script/support for those questions. For Q1, I'd imagine that depends on whether autocommit is enabled and whether the request has been sent from Apps Script already.


On Thu, Jan 24, 2013 at 1:43 AM, SonalB <sonal.bante@accenture.com> wrote:
I am creating an application where I will be inserting spreadsheet data into Cloud SQL using the JDBC service. I would like to know the answers to the following:

Q1. What will happen if the insert statement is executing and someone closes the spreadsheet/script?

Q2. Is there any possibility of inserting the data into Cloud SQL using Google Apps Script in offline mode?

Sync data from Google spreadsheet to Cloud SQL using Google Apps Script

I am creating an application where I will be inserting spreadsheet data into Cloud SQL using the JDBC service. I would like to know the answers to the following:

Q1. What will happen if the insert statement is executing and someone closes the spreadsheet/script?

Q2. Is there any possibility of inserting the data into Cloud SQL using Google Apps Script in offline mode?

Thursday, January 17, 2013

Re: JDBC Connection Pool

Does this help?:


On Thu, Jan 17, 2013 at 5:20 AM, ec <eddie.email@gmail.com> wrote:
Just wondering whether I can use this JDBC connection and set it up as a connection pool?



--
Joe Faith | Product Manager | Google Cloud

JDBC Connection Pool

Just wondering whether I can use this JDBC connection and set it up as a connection pool?

Tuesday, January 15, 2013

Re: Value '0000-00-00 00:00:00' can not be represented as java.sql.Timestamp

I wrestled with this problem and implemented the URL concatenation solution contributed by @Kushan in the accepted answer above. It worked in my local MySQL instance. But when I deployed my Play/Scala app to Heroku, it no longer worked. Heroku also concatenates several args to the DB URL that they provide to users, and because Heroku already uses "?" before its own set of args, this solution will not work there. However, I found a different solution which seems to work equally well.

    SET sql_mode = 'NO_ZERO_DATE';

I put this in my table descriptions and it solved the problem of
    '0000-00-00 00:00:00' can not be represented as java.sql.Timestamp
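
If changing the JDBC URL or the table definitions is not an option, the same mode can also be set per session right after connecting. A hedged sketch in Python with a MySQLdb-style client and placeholder credentials; the equivalent statement can be issued from JDBC with a plain Statement:

import MySQLdb

conn = MySQLdb.connect(host='127.0.0.1', user='app', passwd='secret', db='mydb')
cur = conn.cursor()
# Reject zero dates for this session so '0000-00-00 00:00:00' is not written in the first place.
cur.execute("SET SESSION sql_mode = 'NO_ZERO_DATE'")

Existing zero timestamps still need to be updated to NULL (or a real date) before they can be read cleanly through JDBC.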


On Sunday, March 11, 2012 2:19:33 PM UTC-7, Markus Unger wrote:
Hey!

I have a Timestamp Field with a Null Value. But the JDBC say the following error message:

Value '0000-00-00 00:00:00' can not be represented as java.sql.Timestamp

Seems that MySQL converts a null timestamp into '0000-00-00 00:00:00'. Many people prefer to set the flag zeroDateTimeBehavior=convertToNull.

But I found no way to set it in Cloud SQL. Is it possible to set this flag?

Regards, 

Markus

Re: An unexpected error has occurred. We're looking into it. - minutes after uploading with gsutil

Your Google contact can look in the log and see the real error message.

Thomas L. Hill
University of Texas at Dallas
tom.hill.fellow@gmail.com
USA (469) 767 5011

On Jan 15, 2013 5:08 PM, "Edis S" <edis.sehalic@gmail.com> wrote:
I installed gsutil and followed the steps on how to import data via this documentation link: https://developers.google.com/cloud-sql/docs/import_export#importing

After successfully creating bucket and uploading first file, I tried to import the .sql file but it failed with an error "ERROR_RDBMS" and I realised that it needs to have USE 'database_name' at the start in order for import to work.

Naturally, I edited the .sql file and re-uploaded it, went to the console, and performed the import again. After a few reloads of the console page, the next thing I see is this error:


An unexpected error has occurred. We're looking into it.

I tried doing some debugging - I removed all the files in the bucket along with the bucket itself, hoping that it might fix the problem - but that was in vain, and I keep getting this error.

My project ID is: 1038802266832

Thanks.

Re: An unexpected error has occurred. We're looking into it. - minutes after uploading with gsutil

Can you make sure your dump includes a 'USE ...' statement that selects which database to import into?

-- Razvan ME


On Tue, Jan 15, 2013 at 3:08 PM, Edis S <edis.sehalic@gmail.com> wrote:
I installed gsutil and followed the steps on how to import data via this documentation link: https://developers.google.com/cloud-sql/docs/import_export#importing

After successfully creating bucket and uploading first file, I tried to import the .sql file but it failed with an error "ERROR_RDBMS" and I realised that it needs to have USE 'database_name' at the start in order for import to work.

Naturally, I edited the .sql file and re-uploaded it, went to the console, and performed the import again. After a few reloads of the console page, the next thing I see is this error:


An unexpected error has occurred. We're looking into it.

I tried doing some debugging - I removed all the files in the bucket along with the bucket itself, hoping that it might fix the problem - but that was in vain, and I keep getting this error.

My project ID is: 1038802266832

Thanks.

An unexpected error has occurred. We're looking into it. - minutes after uploading with gsutil

I installed gsutil and followed the steps on how to import data via this documentation link: https://developers.google.com/cloud-sql/docs/import_export#importing

After successfully creating bucket and uploading first file, I tried to import the .sql file but it failed with an error "ERROR_RDBMS" and I realised that it needs to have USE 'database_name' at the start in order for import to work.

Naturally, I edited the .sql file and re-uploaded it, went to the console, and performed the import again. After a few reloads of the console page, the next thing I see is this error:


An unexpected error has occurred. We're looking into it.

I tried doing some debugging - I removed all the files in the bucket along with the bucket itself, hoping that it might fix the problem - but that was in vain, and I keep getting this error.

My project ID is: 1038802266832

Thanks.

Re: Database access problem `updating settings`

Thanks, the instance should be happy now.

-- Razvan ME


On Tue, Jan 15, 2013 at 9:04 AM, Ference van Munster <ference@munstermedia.nl> wrote:
I've send an email with the instance name. 
It's still locked.. and when i try to connect with the console i still get the message "Instance has too many concurrent requests: 101"

On Tuesday, January 15, 2013 at 16:56:56 UTC+1, Razvan Musaloiu-E. wrote:
Can you please tell us the name of your instance (send it to sql-team@google.com if you don't want it to be public)? Switching to ASYNC is much faster than that. The pile of active connections might slow down the switch, though. One hour sounds excessive.

-- Razvan ME


On Tue, Jan 15, 2013 at 7:32 AM, Ference van Munster <ference@munstermedia.nl> wrote:
Hi,

I was optimizing my app because I've got a lot of 'Instance has too many concurrent requests: 101' errors.

So after looking at the replication type, I changed the replication mode to Asynchronous.
I understand that some time is needed to process the replication mode change, but I'm now waiting for about an hour, and the admin console keeps giving me the message 'Updating settings of instance'.
All my options like 'actions' and 'instance settings' are disabled, so I can't do anything in the admin console.

Is this normal?


Re: Database access problem `updating settings`

I've sent an email with the instance name.
It's still locked, and when I try to connect with the console I still get the message "Instance has too many concurrent requests: 101".

On Tuesday, January 15, 2013 at 16:56:56 UTC+1, Razvan Musaloiu-E. wrote:
Can you please tell us the name of your instance (send it to sql-team@google.com if you don't want it to be public)? Switching to ASYNC is much faster than that. The pile of active connections might slow down the switch, though. One hour sounds excessive.

-- Razvan ME


On Tue, Jan 15, 2013 at 7:32 AM, Ference van Munster <ference@munstermedia.nl> wrote:
Hi,

I was optimizing my app because I've got a lot of 'Instance has too many concurrent requests: 101' errors.

So after looking at the replication type, I changed the replication mode to Asynchronous.
I understand that some time is needed to process the replication mode change, but I'm now waiting for about an hour, and the admin console keeps giving me the message 'Updating settings of instance'.
All my options like 'actions' and 'instance settings' are disabled, so I can't do anything in the admin console.

Is this normal?

Re: 100 Connection Limit

Hi Manish

This explanation of the relationship between connection and concurrent request limits may help

j

On Mon, Jan 14, 2013 at 11:52 PM, Munish Dhiman <manisdhmn@gmail.com> wrote:
Currently we are facing the 100-connection limit in our application. I accessed the link given below:


Following is the FAQ section of  Cloud SQL related to connection management.
How should I manage connections?
Database connections in a cloud hosted environment should be managed differently to those on a conventional server. In particular, be aware that your database instance may be taken offline while not in use, and any pooled connections would be closed. We recommend that a new connection is created to service each HTTP request, and re-used for the duration of that request (since the time to create a new connection is similar to that required to test the liveness of an existing connection).

According to this we can handle only a hundred requests simultaneously. We are getting errors in our application continually. Please provide any suggestions.



--
Joe Faith | Product Manager | Google Cloud
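
To illustrate the recommendation quoted above, here is a minimal sketch of the one-connection-per-request pattern for a Python App Engine handler; the instance name, database, and table are placeholders:

from google.appengine.api import rdbms
import webapp2

class MainPage(webapp2.RequestHandler):
    def get(self):
        # Open one connection per request and close it when the request ends,
        # rather than keeping a pool of idle connections open.
        conn = rdbms.connect(instance='myproject:myinstance', database='guestbook')
        try:
            cur = conn.cursor()
            cur.execute('SELECT COUNT(*) FROM entries')
            (count,) = cur.fetchone()
            self.response.out.write('entries: %d' % count)
        finally:
            conn.close()

app = webapp2.WSGIApplication([('/', MainPage)])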

Re: 100 Connection Limit

Could you please contact us at sql-team@google.com with the name of your instance?

Thanks!
Razvan ME


On Mon, Jan 14, 2013 at 11:52 PM, Munish Dhiman <manisdhmn@gmail.com> wrote:
Currently we are facing the 100-connection limit in our application. I accessed the link given below:


Following is the FAQ section of  Cloud SQL related to connection management.
How should I manage connections?
Database connections in a cloud hosted environment should be managed differently to those on a conventional server. In particular, be aware that your database instance may be taken offline while not in use, and any pooled connections would be closed. We recommend that a new connection is created to service each HTTP request, and re-used for the duration of that request (since the time to create a new connection is similar to that required to test the liveness of an existing connection).

According to this we can handle only a hundred requests simultaneously. We are getting errors in our application continually. Please provide any suggestions.

Re: Database access problem `updating settings`

Can you please tell us the name of your instance (send it to sql-team@google.com if you don't want it to be public)? Switching to ASYNC is much faster than that. The pile of active connections might slow down the switch, though. One hour sounds excessive.

-- Razvan ME


On Tue, Jan 15, 2013 at 7:32 AM, Ference van Munster <ference@munstermedia.nl> wrote:
Hi,

I was optimizing my app because I've got a lot of 'Instance has too many concurrent requests: 101' errors.

So after looking at the replication type, I changed the replication mode to Asynchronous.
I understand that some time is needed to process the replication mode change, but I'm now waiting for about an hour, and the admin console keeps giving me the message 'Updating settings of instance'.
All my options like 'actions' and 'instance settings' are disabled, so I can't do anything in the admin console.

Is this normal?

Database access problem `updating settings`

Hi,

I was optimizing my app because I've got a lot of 'Instance has too many concurrent requests: 101' errors.

So after looking at the replication type, I changed the replication mode to Asynchronous.
I understand that some time is needed to process the replication mode change, but I'm now waiting for about an hour, and the admin console keeps giving me the message 'Updating settings of instance'.
All my options like 'actions' and 'instance settings' are disabled, so I can't do anything in the admin console.

Is this normal?

Monday, January 14, 2013

100 Connection Limit

Currently we are facing the 100-connection limit in our application. I accessed the link given below:


Following is the FAQ section of  Cloud SQL related to connection management.
How should I manage connections?
Database connections in a cloud hosted environment should be managed differently to those on a conventional server. In particular, be aware that your database instance may be taken offline while not in use, and any pooled connections would be closed. We recommend that a new connection is created to service each HTTP request, and re-used for the duration of that request (since the time to create a new connection is similar to that required to test the liveness of an existing connection).

According to this we can handle only a hundred requests simultaneously. We are getting errors in our application continually. Please provide any suggestions.

Re: CloudSQL application in Clustering

Have you tried to copy the access tokens? The instructions on that are here:


-- Razvan ME


On Mon, Jan 14, 2013 at 6:23 AM, Robson Brock <rbrock@ciandt.com> wrote:

My application connects to the Cloud SQL DB and runs on clustered machines. For that, I need to authenticate with Cloud SQL on each machine to make queries.

Is there any way to avoid having to authenticate each machine?


Thank you.


CloudSQL application in Clustering

My application connects to the Cloud SQL DB and runs on clustered machines. For that, I need to authenticate with Cloud SQL on each machine to make queries.

Is there any way to avoid having to authenticate each machine?


Thank you.

Sunday, January 13, 2013

Re: Cannot create new instance (billing not enabled)

Quick question: are you trying to create another D0? Note that there is a limit of one D0 per project.

-- Razvan ME


On Sun, Jan 13, 2013 at 8:26 PM, Ilya Albrekht <ilya.albrekht@evaluzio.net> wrote:
Same issue.

Project Number 1019757322646

Re: Cannot create new instance (billing not enabled)

Same issue.

Project Number: 1019757322646

Re: Reclaiming disk usage after deleting data?

Quick note: I filed #57 to track this issue. Feel free to subscribe to it to get updates on the progress.

-- Razvan ME


On Sun, Jan 13, 2013 at 6:23 PM, Razvan Musaloiu-E. <razvanm@google.com> wrote:
We currently don't run with innodb_file_per_table enabled [1] so all the tables live in the global tablespace. The InnoDB tablespaces don't shrink but the space already allocated that contains deleted data will be reused.

One important note: innodb_file_per_table is a dynamic variable so you could switch it on and create tables with separate InnoDB tablespaces. Please don't do it because currently that will make your instance unavailable on the next spin up. :-) We are going to fix this.


-- Razvan ME


On Sun, Jan 13, 2013 at 4:34 PM, KC Budd <phreakmonkey@gmail.com> wrote:
Interestingly, doing a 'SELECT TABLE_NAME, TABLE_TYPE, TABLE_ROWS, DATA_LENGTH, DATA_FREE FROM TABLES;' in the console on the information_schema shows what I am expecting.  (The database in question is using almost no space and has almost 500MB free.)

So maybe the counter in the Cloud SQL console is off?  I'll try using it and report back if I hit the space limit unexpectedly.


On Sunday, January 13, 2013 4:22:23 PM UTC-8, KC Budd wrote:
I have a D0 (free trial) instance.   Yesterday, I populated it with a dump from a local MySQL database using the google_cloud_sql command.   I noticed that, while the entire sqldump output file was ~280MB, the claimed used space in the Google Cloud SQL console was ~400MB.    (Even the local files on my mysql instance aren't close to 400MB.)

After doing some queries and examining the output, I decided I didn't like one of the transformations done by the sqldump command (which naturally got propagated into the Cloud SQL instance.)  So, I deleted all the data in the table with a 'DELETE FROM' command. 

This took the disk usage reported in the Cloud SQL console from 400MB to 482MB.  (?)

So, I did a 'DROP TABLE' and removed the (now empty) table and recreated it.   No change, my instance is still reporting 482MB disk usage.  This was all several hours ago.  My questions:

1. Why is the disk space usage reported so much higher in Cloud SQL than on my local database?

2. Why did I not get the space back when I deleted the data?  (Or the table?)

3. Is there any way for me to reclaim the space?


Re: Reclaiming disk usage after deleting data?

We currently don't run with innodb_file_per_table enabled [1] so all the tables live in the global tablespace. The InnoDB tablespaces don't shrink but the space already allocated that contains deleted data will be reused.

One important note: innodb_file_per_table is a dynamic variable so you could switch it on and create tables with separate InnoDB tablespaces. Please don't do it because currently that will make your instance unavailable on the next spin up. :-) We are going to fix this.


-- Razvan ME


On Sun, Jan 13, 2013 at 4:34 PM, KC Budd <phreakmonkey@gmail.com> wrote:
Interestingly, doing a 'SELECT TABLE_NAME, TABLE_TYPE, TABLE_ROWS, DATA_LENGTH, DATA_FREE FROM TABLES;' in the console on the information_schema shows what I am expecting.  (The database in question is using almost no space and has almost 500MB free.)

So maybe the counter in the Cloud SQL console is off?  I'll try using it and report back if I hit the space limit unexpectedly.


On Sunday, January 13, 2013 4:22:23 PM UTC-8, KC Budd wrote:
I have a D0 (free trial) instance.   Yesterday, I populated it with a dump from a local MySQL database using the google_cloud_sql command.   I noticed that, while the entire sqldump output file was ~280MB, the claimed used space in the Google Cloud SQL console was ~400MB.    (Even the local files on my mysql instance aren't close to 400MB.)

After doing some queries and examining the output, I decided I didn't like one of the transformations done by the sqldump command (which naturally got propagated into the Cloud SQL instance.)  So, I deleted all the data in the table with a 'DELETE FROM' command. 

This took the disk usage reported in the Cloud SQL console from 400MB to 482MB.  (?)

So, I did a 'DROP TABLE' and removed the (now empty) table and recreated it.   No change, my instance is still reporting 482MB disk usage.  This was all several hours ago.  My questions:

1. Why is the disk space usage reported so much higher in Cloud SQL than on my local database?

2. Why did I not get the space back when I deleted the data?  (Or the table?)

3. Is there any way for me to reclaim the space?

Re: Reclaiming disk usage after deleting data?

Interestingly, doing a 'SELECT TABLE_NAME, TABLE_TYPE, TABLE_ROWS, DATA_LENGTH, DATA_FREE FROM TABLES;' in the console on the information_schema shows what I am expecting.  (The database in question is using almost no space and has almost 500MB free.)

So maybe the counter in the Cloud SQL console is off?  I'll try using it and report back if I hit the space limit unexpectedly.


On Sunday, January 13, 2013 4:22:23 PM UTC-8, KC Budd wrote:
I have a D0 (free trial) instance.   Yesterday, I populated it with a dump from a local MySQL database using the google_cloud_sql command.   I noticed that, while the entire sqldump output file was ~280MB, the claimed used space in the Google Cloud SQL console was ~400MB.    (Even the local files on my mysql instance aren't close to 400MB.)

After doing some queries and examining the output, I decided I didn't like one of the transformations done by the sqldump command (which naturally got propagated into the Cloud SQL instance.)  So, I deleted all the data in the table with a 'DELETE FROM' command. 

This took the disk usage reported in the Cloud SQL console from 400MB to 482MB.  (?)

So, I did a 'DROP TABLE' and removed the (now empty) table and recreated it.   No change, my instance is still reporting 482MB disk usage.  This was all several hours ago.  My questions:

1. Why is the disk space usage reported so much higher in Cloud SQL than on my local database?

2. Why did I not get the space back when I deleted the data?  (Or the table?)

3. Is there any way for me to reclaim the space?

Reclaiming disk usage after deleting data?

I have a D0 (free trial) instance.   Yesterday, I populated it with a dump from a local MySQL database using the google_cloud_sql command.   I noticed that, while the entire sqldump output file was ~280MB, the claimed used space in the Google Cloud SQL console was ~400MB.    (Even the local files on my mysql instance aren't close to 400MB.)

After doing some queries and examining the output, I decided I didn't like one of the transformations done by the sqldump command (which naturally got propagated into the Cloud SQL instance.)  So, I deleted all the data in the table with a 'DELETE FROM' command. 

This took the disk usage reported in the Cloud SQL console from 400MB to 482MB.  (?)

So, I did a 'DROP TABLE' and removed the (now empty) table and recreated it.   No change, my instance is still reporting 482MB disk usage.  This was all several hours ago.  My questions:

1. Why is the disk space usage reported so much higher in Cloud SQL than on my local database?

2. Why did I not get the space back when I deleted the data?  (Or the table?)

3. Is there any way for me to reclaim the space?

Re: Cannot create new instance (billing not enabled)

Can you please send a screenshot to sql-team@google.com that shows the error? The project looks OK to me.

-- Razvan ME


On Sun, Jan 13, 2013 at 7:37 AM, Mohamed EL-Qady <scientist.363@gmail.com> wrote:
I have the same problem.
Project number: 68009198775

please help me ASAP 

my Boss will fire me :'(

On Monday, November 5, 2012 5:58:50 PM UTC+2, Razvan Musaloiu-E. wrote:
Can you please try to create an instance now?

-- Razvan ME


On Sun, Nov 4, 2012 at 1:33 PM, Mike Drummond <mike.d...@gmail.com> wrote:
I'm receiving the following error when attempting to create a new instance: "you cannot create new instances because billing is disabled".
I can confirm billing is enabled.
Project number: 411892037600


Re: Cannot create new instance (billing not enabled)

I have the same problem.
Project number: 68009198775

please help me ASAP 

my Boss will fire me :'(
On Monday, November 5, 2012 5:58:50 PM UTC+2, Razvan Musaloiu-E. wrote:
Can you please try to create an instance now?

-- Razvan ME


On Sun, Nov 4, 2012 at 1:33 PM, Mike Drummond <mike.d...@gmail.com> wrote:
I'm receiving the following error when attempting to create a new instance: "you cannot create new instances because billing is disabled".
I can confirm billing is enabled.
Project number: 411892037600

Friday, January 11, 2013

Re: Dropping or replacing existing views

I figured out what the problem was: there was a separate bit of code that had selected from the view I was trying to drop and which wasn't closing its cursor properly. While it didn't seem to matter in my local development environment, that open cursor was blocking the DROP VIEW command. I modified my code to ensure that all cursors and connections were closed, and the problem went away.
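
A minimal sketch of that kind of cleanup in Python, with a placeholder view name; the same idea applies to the JDBC path, where every ResultSet, Statement, and Connection should be closed in a finally block:

def read_from_view(conn):
    cur = conn.cursor()
    try:
        cur.execute("SELECT * FROM my_view")
        return cur.fetchall()
    finally:
        # Closing the cursor releases the result set, so a later
        # DROP VIEW or CREATE OR REPLACE VIEW is not left hanging.
        cur.close()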


On Friday, January 11, 2013 4:56:44 PM UTC, Daniel Thompson wrote:
I'm attempting to execute a "DROP VIEW ..." or "CREATE OR REPLACE ..." statement from within my app and via the SQL Prompt on the Google APIs Console. The view does exist. In both the app and the SQL Prompt the system hangs for a while (I'm guessing 60 seconds) and then reports "Unable to execute statement". My SQL syntax is valid and it works perfectly on my local MySQL. Am I doing something wrong?

Thanks,
Dan

Dropping or replacing existing views

I'm attempting to execute a "DROP VIEW ..." or "CREATE OR REPLACE ..." statement from within my app and via the SQL Prompt on the Google APIs Console. The view does exist. In both the app and the SQL Prompt the system hangs for a while (I'm guessing 60 seconds) and then reports "Unable to execute statement". My SQL syntax is valid and it works perfectly on my local MySQL. Am I doing something wrong?

Thanks,
Dan

Thursday, January 10, 2013

using Google cloud/sql and ftp

I'm looking at trying to set up an EDI server in the cloud, but I need to make sure FTP can be used with the Google platform. I would like to know before purchasing the services.

Wednesday, January 9, 2013

How can connect cloud db in tomcat on ubuntu

I use the built-in Tomcat on Ubuntu and use 'sudo su -m tomcat7' to log in as the tomcat7 user before running the google_sql command tool, but I cannot find Prefs.xml in the tomcat7 user's home folder /usr/share/tomcat7, so I cannot connect to the cloud DB.

Can I put Prefs.xml into the Tomcat webapps folder?

Tuesday, January 8, 2013

iPhone App - 20/20 Vision

iPhone App - 20/20 Vision - http://2020visioniphoneapp.weebly.com

Security of OAuth tokens in windows registry?

I must be reading the docs wrong - it seems that the OAuth access tokens on Windows are stored, in plain text with no encryption, in the Windows registry?

Is that right? For deployment, does every machine running the Google Cloud SQL driver need to have a copy of the tokens?

This is not secure in any way. Why can't the tokens at least be stored encrypted on disk? The driver should let you provide the tokens, rather than using hardcoded schemes like reading them from an insecure location such as the Windows registry. What a strange design.

Thoughts?

"

Windows: The tokens are stored in the registry under the key:

HKCU\Software\JavaSoft\Prefs\com.google.cloud.sqlservice
These entries will have to be copied to the same key for the user who will be running the application on the deployment machine."

Thursday, January 3, 2013

Re: Connecting Webforms to an instance of google SQL in the cloud

Hi Kelly

It's not clear what you mean by a 'webform' in this context, but you could use either Google App Engine or Apps Script:

Google App Engine is a rich platform for building web applications and includes very good support for Cloud SQL.

Alternatively, you could build your form using the Google Apps Script GUI Builder and then save the data to Cloud SQL using JDBC.

Does this help?

J

On Thu, Jan 3, 2013 at 11:42 AM, Kelly Turner <kelyturner@gmail.com> wrote:
Does anyone know if you can connect a webform and have it populate an instance of Google's SQL in the cloud? I need to find out if this is possible. Any help would be great.
 
regards,
 
Kelly Turner
 



--
Joe Faith | Product Manager | Google Cloud

Connecting Webforms to an instance of google SQL in the cloud

Does anyone know if you can connect a webform and have it populate an instance of Google's SQL in the cloud? I need to find out if this is possible. Any help would be great.
 
regards,
 
Kelly Turner
 

Wednesday, January 2, 2013

Re: Ability to Schedule Database Export to Cloud Storage

Sorry, but Cloud SQL does not have this functionality currently. We'd like to make this easier in the future. In the meantime, you could use Selenium (or some other UI scripting framework) in combination with a cron job.

On Wed, Jan 2, 2013 at 11:07 AM, Tom Hanks <thanks@qualitydistribution.com> wrote:
Is it possible (or planned) to schedule an export of a Google Cloud SQL database to Google Cloud Storage?

What we'd like to do is download the export and restore it to a local instance of MySQL for reporting purposes. 

Ability to Schedule Database Export to Cloud Storage

Is it possible (or planned) to schedule an export of a Google Cloud SQL database to Google Cloud Storage?

What we'd like to do is download the export and restore it to a local instance of MySQL for reporting purposes. 

Tuesday, January 1, 2013

Re: OperationalError: could not connect: ApplicationError: 1002

Are you still getting this error? If you are, can you tell me the name of your instance? Feel free to reply off list if you don't want to mention your instance name in a public mailing list.

On Dec 31, 2012 5:21 PM, "Richard Druce" <richard@yreceipts.com> wrote:
I've been getting 

OperationalError: could not connect: ApplicationError: 1002
from my Python App Engine app while trying to execute requests. Does anyone know what this means?

  Thanks,
Richard