Tuesday, July 31, 2012

Re: Can't create new instance

You should be able to create new instances now.

As for the storage limit question: once you've reached the maximum size, mysql won't be able to write to disk anymore, and in that state it might crash or hang.

If you really need to go beyond 10 GB, you can consider sharding your data into separate cloud sql instances. See http://stackoverflow.com/questions/292039/resources-for-database-sharding-and-partitioning.
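For illustration, application-level sharding can be as simple as routing each request to the instance that owns the key; a minimal sketch in Java, where the instance names and the hashing scheme are hypothetical:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class ShardRouter {
    // Hypothetical Cloud SQL instances, one per shard (each limited to 10 GB).
    private static final String[] SHARD_URLS = {
        "jdbc:google:rdbms://myproject:shard0/mydb",
        "jdbc:google:rdbms://myproject:shard1/mydb",
    };

    static {
        try {
            Class.forName("com.google.cloud.sql.Driver");
        } catch (ClassNotFoundException e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    // Picks the instance that holds the data for the given shard key.
    public static Connection connectionFor(String shardKey) throws SQLException {
        int shard = Math.abs(shardKey.hashCode() % SHARD_URLS.length);
        return DriverManager.getConnection(SHARD_URLS[shard]);
    }
}

Your application then has to make sure that related rows always live on the same shard, and that cross-shard queries are assembled in application code.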

On Tue, Jul 31, 2012 at 4:08 PM, jlemes <jslsolucoes@gmail.com> wrote:

Sorry, the number is 38855119945.

About the second question:

I have an instance named jslsolucoes:app with two databases created: tagria and faturamento. When the maximum storage size (10GB) is reached, will the applications crash because of the database size? Or does nothing happen, and will I just pay for the included storage (10GB) plus the extra GB that are used?

About the documentation:

"Use of storage or I/O over the included quota is charged at the Per Use rate ($0.24 per GB per month for storage, $0.10 per Million of I/Os). The maximum storage for any instance is currently 10GB."

In short: how can I use more than 10GB for the same databases?
Create a new instance? But when I create a new instance, will the same databases be available?



Re: Can't create new instance


Sorry, the number is 38855119945.

About the second question:

I have an instance named jslsolucoes:app with two databases created: tagria and faturamento. When the maximum storage size (10GB) is reached, will the applications crash because of the database size? Or does nothing happen, and will I just pay for the included storage (10GB) plus the extra GB that are used?

About the documentation:

"Use of storage or I/O over the included quota is charged at the Per Use rate ($0.24 per GB per month for storage, $0.10 per Million of I/Os). The maximum storage for any instance is currently 10GB."

In short: how can I use more than 10GB for the same databases?
Create a new instance? But when I create a new instance, will the same databases be available?


Re: Can't create new instance

Hi,
What is your numerical project id?  (the number portion in the API console URL)

For your second question, could you clarify it a bit? What is the use case you have in mind?


On Tue, Jul 31, 2012 at 3:19 PM, jlemes <jslsolucoes@gmail.com> wrote:
Hi, 

I'm trying to create a new instance; however, even after enabling billing I get an error saying "You cannot create new instances because billing is disabled", while the console shows that billing is enabled and active.

My project id is "jslsolucoes".

Another question: if my instance with 2 databases installed reaches the maximum storage (10GB), will I just pay for additional storage, or do I have to create a new instance, and would that new instance access the same databases?



Can't create new instance

Hi, 

I'm trying to create a new instance; however, even after enabling billing I get an error saying "You cannot create new instances because billing is disabled", while the console shows that billing is enabled and active.

My project id is "jslsolucoes".

Another question: if my instance with 2 databases installed reaches the maximum storage (10GB), will I just pay for additional storage, or do I have to create a new instance, and would that new instance access the same databases?


Re: Excel - VBA - Google Cloud SQL

You can connect to Cloud SQL from external applications using JDBC: https://developers.google.com/cloud-sql/docs/external

From Excel you should be able to create an ODBC data source, which then talks to Cloud SQL through an ODBC-JDBC bridge -- though you would have to check the VBA / Excel documentation for how to do this.

j

On Tue, Jul 31, 2012 at 9:56 AM, Alex <alexandre.despontin@meritoinvestimentos.com> wrote:
Hi,

Is it possible to connect to Google Cloud SQL using VBA in Excel?
How do I do it?

Tks.




--
Joe Faith | Product Manager | Google Cloud

Re: Multiple Queries in One JDBC Statement

Hi,

Can I add a vote for this feature too please :-)

Just to be clear... allowing something such as the following to be accepted:-

DELETE FROM a WHERE b; DELETE FROM c WHERE d;

Kindest

Steve



On Thursday, 26 July 2012 17:12:13 UTC+1, Rob wrote:
Hi Chris,

We don't support that option currently.

Rob


On Thu, Jul 26, 2012 at 7:42 AM, Chris Hatton <codecrocodile@gmail.com> wrote:
I'm trying to execute multiple queries in one statement, but can't seem to get it to work. The steps I have taken so far are to delimit the queries with a semi-colon, e.g.:

"select 1; select 2; select 3;"

and I have also appended:

?allowMultiQuery=true

to my connection string.

Is this even allowed by the App Engine JDBC driver, and have any of you managed to do this yet? I can do my task by other means, but I would like to know why this is not working.

Cheers


Chris
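Until that option is supported, one possible workaround is to send the statements one at a time on the same connection, optionally inside a transaction so they still succeed or fail together; a minimal sketch (the connection URL is hypothetical, and the two DELETEs are the placeholders from Steve's example above):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class TwoDeletes {
    public static void main(String[] args) throws Exception {
        Class.forName("com.google.cloud.sql.Driver");
        Connection conn =
            DriverManager.getConnection("jdbc:google:rdbms://myproject:myinstance/mydb");
        try {
            conn.setAutoCommit(false);  // group both deletes into one transaction
            Statement stmt = conn.createStatement();
            stmt.executeUpdate("DELETE FROM a WHERE b");  // table/condition placeholders
            stmt.executeUpdate("DELETE FROM c WHERE d");
            stmt.close();
            conn.commit();
        } finally {
            conn.close();
        }
    }
}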

Excel - VBA - Google Cloud SQL

Hi,

Is it possible to connect to Google Cloud SQL using VBA in Excel?
How do I do it?

Tks.

Sunday, July 29, 2012

Re: Problem importing data into Cloud SQL database

I've replied to Aubrey off-list.

On Sun, Jul 29, 2012 at 6:38 AM, Aubrey Malabie <malengatiger@gmail.com> wrote:
Hi,

My instance: caramel-office:caramel-instance

Failed to import gs://mydev/caramelDB.sql: An unknown problem occurred (ERROR_RDBMS)

Thanks for the help!

Regards,
Aubrey

Problem importing data into Cloud SQL database

Hi,

My instance: caramel-office:caramel-instance

Failed to import gs://mydev/caramelDB.sql: An unknown problem occurred (ERROR_RDBMS)

Thanks for the help!

Regards,
Aubrey

Wednesday, July 25, 2012

Re: Using DictCursors

I wanted to check in and see if this was still being worked on. I've been working in development with the code below. I just deployed the app and now get the 'module' object has no attribute 'cursors' error. I then tried the cursor = conn.cursor(use_dict_cursor=True) code and that did not work either.

conn = rdbms.connect(
    instance='instance',
    database='db',
    cursorclass=rdbms.cursors.DictCursor)

Tuesday, July 24, 2012

Re: Editing the Full-Text Stopwords List

Hi Tony,

Thanks for the quick reply, I was afraid you were going to say that.  Oh well, I'll just have to be creative then in how I approach certain searches.  I'll get that feature request in, as it is definitely something I'd like to see for the future.

Re: Tomcat to Google SQL Cloud - NOT_AUTHORIZED

Oops, you are right ... a beginner's mistake.
 
It works now.
 
Thanks a lot,
 
B.

On Tuesday, July 24, 2012 10:35:01 PM UTC+2, Tony Tseng wrote:
Hi Benoit,
Looks like you made a typo in your tomcat config.
The instance name should be xclindemodb:stph instead of xclindemo:stph.

On Tue, Jul 24, 2012 at 1:27 PM, Benoit Marchal <b.marchal88@gmail.com> wrote:
Dear Group,

Has anybody managed to access Google Cloud SQL successfully from a Tomcat context?

I have successfully authorized Google Cloud SQL on the server: sh google_sql.sh xclindemodb:stph

From the command tool I can query my database.

I tried to define the following datasource in Tomcat:

<Context>
  <Resource name="jdbc/1" auth="Container"
            type="javax.sql.DataSource"
            username="root" password=""
            driverClassName="com.google.cloud.sql.Driver"
            url="jdbc:google:rdbms://xclindemo:stph/mysql"
            maxActive="80" />
</Context>

The only difference from a working local database setup is the driver, url and username/password. (Removing username/password completely does not help.)

I am nevertheless getting this exception:

(boss::NOT_AUTHORIZED: Not authorized to access instance: xclindemo:stph)

Please help?

Thanks,

Benoit


Re: Tomcat to Google SQL Cloud - NOT_AUTHORIZED

Hi Benoit,
Looks like you made a typo in your tomcat config.
The instance name should be xclindemodb:stph instead of xclindemo:stph.

On Tue, Jul 24, 2012 at 1:27 PM, Benoit Marchal <b.marchal88@gmail.com> wrote:
Dear Group,

Has anybody managed to access Google Cloud SQL successfully from a Tomcat context?

I have successfully authorized Google Cloud SQL on the server: sh google_sql.sh xclindemodb:stph

From the command tool I can query my database.

I tried to define the following datasource in Tomcat:

<Context>
  <Resource name="jdbc/1" auth="Container"
            type="javax.sql.DataSource"
            username="root" password=""
            driverClassName="com.google.cloud.sql.Driver"
            url="jdbc:google:rdbms://xclindemo:stph/mysql"
            maxActive="80" />
</Context>

The only difference from a working local database setup is the driver, url and username/password. (Removing username/password completely does not help.)

I am nevertheless getting this exception:

(boss::NOT_AUTHORIZED: Not authorized to access instance: xclindemo:stph)

Please help?

Thanks,

Benoit
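For anyone hitting the same error: once the instance name in the Resource url is corrected to xclindemodb:stph, the datasource can be used from a servlet through a standard JNDI lookup; a minimal sketch, assuming the jdbc/1 resource name from the config above:

import java.sql.Connection;
import javax.naming.InitialContext;
import javax.sql.DataSource;

public class CloudSqlFromTomcat {
    // Looks up the pool that Tomcat builds from the <Resource name="jdbc/1" ...> element.
    public static Connection open() throws Exception {
        InitialContext ctx = new InitialContext();
        DataSource ds = (DataSource) ctx.lookup("java:comp/env/jdbc/1");
        return ds.getConnection();
    }
}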


Re: Editing the Full-Text Stopwords List

Hi Brian,
Cloud SQL uses the same stopword list as regular MySQL.
Typically you can use your own stopwords file by setting the ft_stopword_file system variable (http://dev.mysql.com/doc//refman/5.5/en/fulltext-fine-tuning.html).
Unfortunately that feature is not supported by Cloud SQL.

It'd be great if you could file a feature request on http://code.google.com/p/googlecloudsql/issues/list for this.

On Tue, Jul 24, 2012 at 1:15 PM, Brian Hayward <brian.hayward@tidyware.com> wrote:
Hi,

I'm running into a problem where the full-text stopwords in Google Cloud SQL are preventing me from running certain searches that are relevant to my app. I did some searching and couldn't find a list of the stopwords that Google Cloud SQL uses, but I did find the list that MySQL uses (http://dev.mysql.com/doc//refman/5.5/en/fulltext-stopwords.html), so I'm using that as a baseline. It's quite an extensive list, and I would like to be able to search by some of the words listed there. Is there a way to edit the stopwords list for my own instance of Google Cloud SQL?

If there is not a way to do this, would it be possible to shorten the list for future releases of Google Cloud SQL, or to implement a feature that allows you to edit the list for your own instances?

Tomcat to Google SQL Cloud - NOT_AUTHORIZED

Dear Group,

Has anybody managed to access Google Cloud SQL successfully from a Tomcat context?

I have successfully authorized Google Cloud SQL on the server: sh google_sql.sh xclindemodb:stph

From the command tool I can query my database.

I tried to define the following datasource in Tomcat:

<Context>
  <Resource name="jdbc/1" auth="Container"
            type="javax.sql.DataSource"
            username="root" password=""
            driverClassName="com.google.cloud.sql.Driver"
            url="jdbc:google:rdbms://xclindemo:stph/mysql"
            maxActive="80" />
</Context>

The only difference from a working local database setup is the driver, url and username/password. (Removing username/password completely does not help.)

I am nevertheless getting this exception:

(boss::NOT_AUTHORIZED: Not authorized to access instance: xclindemo:stph)

Please help?

Thanks,

Benoit

Editing the Full-Text Stopwords List

Hi,

I'm running into a problem where the full-text stopwords in Google Cloud SQL are preventing me from running certain searches that are relevant to my app. I did some searching and couldn't find a list of the stopwords that Google Cloud SQL uses, but I did find the list that MySQL uses (http://dev.mysql.com/doc//refman/5.5/en/fulltext-stopwords.html), so I'm using that as a baseline. It's quite an extensive list, and I would like to be able to search by some of the words listed there. Is there a way to edit the stopwords list for my own instance of Google Cloud SQL?

If there is not a way to do this, would it be possible to shorten the list for future releases of Google Cloud SQL, or to implement a feature that allows you to edit the list for your own instances?

Monday, July 23, 2012

Re: Google Cloud SQL instance is suspended

Hi,
Your instance was suspended due to a billing issue, but I see it's now been resolved.
Please let us know if it's still not working for you.

On Mon, Jul 23, 2012 at 1:35 AM, Interbooker App <interbookerapp@gmail.com> wrote:
Hi, my Google Cloud SQL instance is suspended for some reason and I couldn't figure out why. The log doesn't say anything about billing, so I figure it's not a billing problem. Could you take a look at what's going wrong with my instance? Instance = interbookerapp;interbooker

Thanks in advance!

Re: Authentication error when opening connection to cloud SQL in the development server

Try revoking access from the commandline tool (then run it again to repopulate the cached credentials).

Rob

On Mon, Jul 23, 2012 at 5:21 AM, Partho Sarkar <partho.sarkar@gmail.com> wrote:
Hi Jordi,

I am facing the same issue with the Google Cloud SQL command line tool:

Exiting; Unable to open connection.
unauthorized_client: 


Have you been able to resolve this error?


Thanks in advance.

Regards,
Partho.

On Monday, April 9, 2012 4:06:02 AM UTC+5:30, Jordi Poch wrote:
Hi,

I am testing Google App Engine with Cloud SQL. I use:
- App Engine SDK 1.6.2.1
- maven-gae-plugin v0.9.2

Some days ago I managed to get my application running correctly in the development server and connecting to my Cloud SQL instance. However, now I face the following problem (I don't know why). When I start the application, I see the following warnings in the console:

ADVERTENCIA: Authentication error: Unable to respond to any of these challenges: {authsub=WWW-Authenticate: AuthSub realm="https://www.google.com/accounts/AuthSubRequest" allowed-scopes="https://www.googleapis.com/auth/sqlservice"}
08-abr-2012 22:13:14 com.google.appengine.repackaged.org.apache.http.impl.client.DefaultRequestDirector handleResponse
ADVERTENCIA: Authentication error: Unable to respond to any of these challenges: {authsub=WWW-Authenticate: AuthSub realm="https://www.google.com/accounts/AuthSubRequest" allowed-scopes="https://www.googleapis.com/auth/sqlservice"}
08-abr-2012 22:13:14 com.google.appengine.api.rdbms.dev.LocalRdbmsServiceLocalDriver openConnection


Followed by this error:

GRAVE: Could not allocate a connection
java.sql.SQLException: unauthorized_client:
    at com.google.cloud.sql.jdbc.internal.googleapi.RpcGoogleApi.newOpenConnectionIOException(RpcGoogleApi.java:187)
    at com.google.cloud.sql.jdbc.internal.googleapi.RpcGoogleApi.openConnection(RpcGoogleApi.java:105)
    at com.google.cloud.sql.jdbc.internal.SqlProtoClient.openConnection(SqlProtoClient.java:58)
    at com.google.cloud.sql.jdbc.Driver.connect(Driver.java:59)
    at com.google.cloud.sql.Driver.connectImpl(Driver.java:105)
    at com.google.cloud.sql.Driver.connect(Driver.java:94)
    at com.google.cloud.sql.Driver.connect(Driver.java:31)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:154)
    at com.google.appengine.api.rdbms.dev.LocalRdbmsServiceLocalDriver.openConnection(LocalRdbmsServiceLocalDriver.java:142)
    at com.google.appengine.api.rdbms.dev.LocalRdbmsService.openConnection(LocalRdbmsService.java:121)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at com.google.appengine.tools.development.ApiProxyLocalImpl$AsyncApiCall.callInternal(ApiProxyLocalImpl.java:498)
    at com.google.appengine.tools.development.ApiProxyLocalImpl$AsyncApiCall.call(ApiProxyLocalImpl.java:452)
    at com.google.appengine.tools.development.ApiProxyLocalImpl$AsyncApiCall.call(ApiProxyLocalImpl.java:430)
    at java.util.concurrent.Executors$PrivilegedCallable$1.run(Executors.java:463)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.util.concurrent.Executors$PrivilegedCallable.call(Executors.java:460)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
Caused by: com.google.cloud.sql.jdbc.internal.googleapi.RefreshTokenAccessTokenRefresher$OAuth2AuthorizationException: unauthorized_client:
    at com.google.cloud.sql.jdbc.internal.googleapi.RefreshTokenAccessTokenRefresher.refreshAccessToken(RefreshTokenAccessTokenRefresher.java:55)
    at com.google.cloud.sql.jdbc.internal.googleapi.RpcGoogleApi$DefaultGoogleApi.refreshAccessToken(RpcGoogleApi.java:330)
    at com.google.cloud.sql.jdbc.internal.googleapi.RpcGoogleApi$DefaultGoogleApi.exec(RpcGoogleApi.java:321)
    at com.google.cloud.sql.jdbc.internal.googleapi.RpcGoogleApi.openConnection(RpcGoogleApi.java:100)
    ... 24 more
Caused by: com.google.api.client.http.HttpResponseException: 400 OK
    at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:789)
    at com.google.api.client.auth.oauth2.draft10.AccessTokenRequest.executeUnparsed(AccessTokenRequest.java:459)
    at com.google.cloud.sql.jdbc.internal.googleapi.RefreshTokenAccessTokenRefresher.refreshAccessToken(RefreshTokenAccessTokenRefresher.java:48)
    ... 27 more


However, if I run the application in the development server using the Google plugin for Eclipse, it works OK. It also runs fine on the App Engine server (deploying with both the Google plugin for Eclipse and maven-gae-plugin).

Curiously, I also tried the Cloud SQL Command Line Tool and ran into the same problem when executing the command "google_sql.cmd <instance>":

Exiting; Unable to open connection.
unauthorized_client:


Any idea?

Thanks in advance.

Re: Authentication error when opening connection to cloud SQL in the development server

Hi Jordi,

I am facing the same issue with the Google Cloud SQL command line tool:

Exiting; Unable to open connection.
unauthorized_client: 


Have you been able to resolve this error?


Thanks in advance.

Regards,
Partho.

On Monday, April 9, 2012 4:06:02 AM UTC+5:30, Jordi Poch wrote:
Hi,

I am testing Google App Engine with Cloud SQL. I use:
- App Engine SDK 1.6.2.1
- maven-gae-plugin v0.9.2

Some days ago I managed to get my application running correctly in the development server and connecting to my Cloud SQL instance. However, now I face the following problem (I don't know why). When I start the application, I see the following warnings in the console:

ADVERTENCIA: Authentication error: Unable to respond to any of these challenges: {authsub=WWW-Authenticate: AuthSub realm="https://www.google.com/accounts/AuthSubRequest" allowed-scopes="https://www.googleapis.com/auth/sqlservice"}
08-abr-2012 22:13:14 com.google.appengine.repackaged.org.apache.http.impl.client.DefaultRequestDirector handleResponse
ADVERTENCIA: Authentication error: Unable to respond to any of these challenges: {authsub=WWW-Authenticate: AuthSub realm="https://www.google.com/accounts/AuthSubRequest" allowed-scopes="https://www.googleapis.com/auth/sqlservice"}
08-abr-2012 22:13:14 com.google.appengine.api.rdbms.dev.LocalRdbmsServiceLocalDriver openConnection


Followed by this error:

GRAVE: Could not allocate a connection
java.sql.SQLException: unauthorized_client:
    at com.google.cloud.sql.jdbc.internal.googleapi.RpcGoogleApi.newOpenConnectionIOException(RpcGoogleApi.java:187)
    at com.google.cloud.sql.jdbc.internal.googleapi.RpcGoogleApi.openConnection(RpcGoogleApi.java:105)
    at com.google.cloud.sql.jdbc.internal.SqlProtoClient.openConnection(SqlProtoClient.java:58)
    at com.google.cloud.sql.jdbc.Driver.connect(Driver.java:59)
    at com.google.cloud.sql.Driver.connectImpl(Driver.java:105)
    at com.google.cloud.sql.Driver.connect(Driver.java:94)
    at com.google.cloud.sql.Driver.connect(Driver.java:31)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:154)
    at com.google.appengine.api.rdbms.dev.LocalRdbmsServiceLocalDriver.openConnection(LocalRdbmsServiceLocalDriver.java:142)
    at com.google.appengine.api.rdbms.dev.LocalRdbmsService.openConnection(LocalRdbmsService.java:121)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at com.google.appengine.tools.development.ApiProxyLocalImpl$AsyncApiCall.callInternal(ApiProxyLocalImpl.java:498)
    at com.google.appengine.tools.development.ApiProxyLocalImpl$AsyncApiCall.call(ApiProxyLocalImpl.java:452)
    at com.google.appengine.tools.development.ApiProxyLocalImpl$AsyncApiCall.call(ApiProxyLocalImpl.java:430)
    at java.util.concurrent.Executors$PrivilegedCallable$1.run(Executors.java:463)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.util.concurrent.Executors$PrivilegedCallable.call(Executors.java:460)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
Caused by: com.google.cloud.sql.jdbc.internal.googleapi.RefreshTokenAccessTokenRefresher$OAuth2AuthorizationException: unauthorized_client:
    at com.google.cloud.sql.jdbc.internal.googleapi.RefreshTokenAccessTokenRefresher.refreshAccessToken(RefreshTokenAccessTokenRefresher.java:55)
    at com.google.cloud.sql.jdbc.internal.googleapi.RpcGoogleApi$DefaultGoogleApi.refreshAccessToken(RpcGoogleApi.java:330)
    at com.google.cloud.sql.jdbc.internal.googleapi.RpcGoogleApi$DefaultGoogleApi.exec(RpcGoogleApi.java:321)
    at com.google.cloud.sql.jdbc.internal.googleapi.RpcGoogleApi.openConnection(RpcGoogleApi.java:100)
    ... 24 more
Caused by: com.google.api.client.http.HttpResponseException: 400 OK
    at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:789)
    at com.google.api.client.auth.oauth2.draft10.AccessTokenRequest.executeUnparsed(AccessTokenRequest.java:459)
    at com.google.cloud.sql.jdbc.internal.googleapi.RefreshTokenAccessTokenRefresher.refreshAccessToken(RefreshTokenAccessTokenRefresher.java:48)
    ... 27 more


However, if I run the application in the development server using the Google plugin for Eclipse, it works OK. It also runs fine on the App Engine server (deploying with both the Google plugin for Eclipse and maven-gae-plugin).

Curiously, I also tried the Cloud SQL Command Line Tool and ran into the same problem when executing the command "google_sql.cmd <instance>":

Exiting; Unable to open connection.
unauthorized_client:


Any idea?

Thanks in advance.

Google Cloud SQL instance is suspended

Hi, my Google Cloud SQL instance is suspended for some reason and I couldn't figure out why. The log doesn't say anything about billing, so I figure it's not a billing problem. Could you take a look at what's going wrong with my instance? Instance = interbookerapp;interbooker

Thanks in advance!

Sunday, July 22, 2012

Re: MySQL import fails

I checked the error log, which tells me the error was indeed transient. Could you please retry and let me know if it keeps happening? Feel free to reply off-list.

-Amit


On Sun, Jul 22, 2012 at 7:01 AM, Aubrey Malabie <malengatiger@gmail.com> wrote:
Hi,

My instance is sgela-backend-api:sgela-instance and I have the following error when I attempt to import a mysqldump file. 
Failed to import gs://sgela-dev/sgeladump.sql: An unknown problem occurred (TRANSIENT_ERROR)

How do I fix this?
Thanks for any assistance!

MySQL import fails

Hi,

My instance is sgela-backend-api:sgela-instance and I have the following error when I attempt to import a mysqldump file. 
Failed to import gs://sgela-dev/sgeladump.sql: An unknown problem occurred (TRANSIENT_ERROR)

How do I fix this?
Thanks for any assistance!

Friday, July 20, 2012

Re: Can I pay annually?

On Fri, Jul 20, 2012 at 3:47 PM, Pedro A. Rodríguez López <pedro.arodriguezl@gmail.com> wrote:
Can I pay annually for the Cloud SQL service?

That sort of thing is handled on a case-by-case basis by the sales team:

Can I pay annually?

Can I pay annually for the Cloud SQL service?

Re: Connection error java.sql.SQLInvalidAuthorizationSpecException: boss::NOT_AUTHORIZED: Not authorized to access instance: proy-laz:tmp-sio:mina

Thanks Ken, Eureka!!!
According to the configuration for "Admin and Reporting Tools" (https://developers.google.com/cloud-sql/docs/admin_tools), I was using this string as my connection url:
"jdbc:google:rdbms:proy-laz:tmp-sio:mina"

Now I changed the connection string to the format used in the article "Using Google Cloud SQL with App Engine Java SDK" and it works fine!
"jdbc:google:rdbms://proy-laz:tmp-sio/mina"

Thanks a lot for your help!
It is working fine now; greetings from Peru!

Luis Acosta Zúñiga
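For reference, the same working URL format can be used from any plain JDBC client; a minimal sketch with the driver class and instance path taken from the post above (user and password are whatever you configured):

import java.sql.Connection;
import java.sql.DriverManager;

public class TalendStyleConnection {
    public static Connection open(String user, String password) throws Exception {
        Class.forName("com.google.cloud.sql.Driver");
        // Note the "//" after rdbms: and the "/" before the database name.
        return DriverManager.getConnection(
            "jdbc:google:rdbms://proy-laz:tmp-sio/mina", user, password);
    }
}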

Re: Connection error java.sql.SQLInvalidAuthorizationSpecException: boss::NOT_AUTHORIZED: Not authorized to access instance: proy-laz:tmp-sio:mina

On Fri, Jul 20, 2012 at 8:00 AM, Luis Esteban Acosta Zuñiga <luis.acostaz@gmail.com> wrote:
Good morning.
I'm trying to connect to my Google Cloud SQL instance/database from the Talend software.
I added a JDBC connection with these parameters:

JDBC URL: jdbc:google:rdbms:proy-laz:tmp-sio:mina

Double check that instance name.  I don't think it exists.

Ken
 
DRIVER JAR: google_sql.jar (This is the jar used by the Google SQL Command Line Tool)
DRIVER CLASS: com.google.cloud.sql.Driver
USER: (the same used and working from Eclipse App Engine Plugin)
PASSWORD: (the same used and working from Eclipse App Engine Plugin)

When I try to run the connection I get the following exception:

Exception in component tJDBCConnection_1
java.sql.SQLInvalidAuthorizationSpecException: boss::NOT_AUTHORIZED: Not authorized to access instance: proy-laz:tmp-sio:mina
at com.google.cloud.sql.jdbc.internal.Exceptions.newSqlExceptionForApplicationError(Exceptions.java:192)
at com.google.cloud.sql.jdbc.internal.Exceptions.newSqlException(Exceptions.java:211)
at com.google.cloud.sql.jdbc.internal.SqlProtoClient.check(SqlProtoClient.java:158)
at com.google.cloud.sql.jdbc.internal.SqlProtoClient.openConnection(SqlProtoClient.java:60)
at com.google.cloud.sql.jdbc.Driver.connect(Driver.java:66)
at com.google.cloud.sql.Driver.connectImpl(Driver.java:109)
at com.google.cloud.sql.Driver.connect(Driver.java:98)
at com.google.cloud.sql.Driver.connect(Driver.java:31)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at talenddemosjava.connect_cloud_0_1.connect_cloud.tJDBCConnection_1Process(connect_cloud.java:279)
at talenddemosjava.connect_cloud_0_1.connect_cloud.runJobInTOS(connect_cloud.java:620)
at talenddemosjava.connect_cloud_0_1.connect_cloud.main(connect_cloud.java:488)

I have already installed and configured the Command Line Tool and it works fine with the user and password used in this connection.
Does anybody know if I need to configure anything else to get the connection working?

Thanks in advance for your help.
Regards.

Lucho Acosta

Connection error java.sql.SQLInvalidAuthorizationSpecException: boss::NOT_AUTHORIZED: Not authorized to access instance: proy-laz:tmp-sio:mina

Good morning.
I'm trying to connect to my Google Cloud SQL instance/database from the Talend software.
I added a JDBC connection with these parameters:

JDBC URL: jdbc:google:rdbms:proy-laz:tmp-sio:mina
DRIVER JAR: google_sql.jar (This is the jar used by the Google SQL Command Line Tool)
DRIVER CLASS: com.google.cloud.sql.Driver
USER: (the same used and working from Eclipse App Engine Plugin)
PASSWORD: (the same used and working from Eclipse App Engine Plugin)

When I try to run the connection I get the following exception:

Exception in component tJDBCConnection_1
java.sql.SQLInvalidAuthorizationSpecException: boss::NOT_AUTHORIZED: Not authorized to access instance: proy-laz:tmp-sio:mina
at com.google.cloud.sql.jdbc.internal.Exceptions.newSqlExceptionForApplicationError(Exceptions.java:192)
at com.google.cloud.sql.jdbc.internal.Exceptions.newSqlException(Exceptions.java:211)
at com.google.cloud.sql.jdbc.internal.SqlProtoClient.check(SqlProtoClient.java:158)
at com.google.cloud.sql.jdbc.internal.SqlProtoClient.openConnection(SqlProtoClient.java:60)
at com.google.cloud.sql.jdbc.Driver.connect(Driver.java:66)
at com.google.cloud.sql.Driver.connectImpl(Driver.java:109)
at com.google.cloud.sql.Driver.connect(Driver.java:98)
at com.google.cloud.sql.Driver.connect(Driver.java:31)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at talenddemosjava.connect_cloud_0_1.connect_cloud.tJDBCConnection_1Process(connect_cloud.java:279)
at talenddemosjava.connect_cloud_0_1.connect_cloud.runJobInTOS(connect_cloud.java:620)
at talenddemosjava.connect_cloud_0_1.connect_cloud.main(connect_cloud.java:488)

I have already installed and configured the Command Line Tool and it works fine with the user and password used in this connection.
Does anybody know if I need to configure anything else to get the connection working?

Thanks in advance for your help.
Regards.

Lucho Acosta

Re: full text search

Hi Joe,

At first I used the InnoDB engine only, but performance for accessing the data was very low, so later I switched to the MyISAM engine and implemented FULLTEXT. This improved performance a little, but we still need to access the data from Cloud SQL through the Google App Engine app within 2 to 3 seconds.

E.g. for 1300 rows:
- for InnoDB: it takes more than one minute, so the request times out and the page is not displayed in the App Engine app
- for MyISAM: it takes close to 52 seconds

But the actual size of our data will be much more than 1300 rows, around 100k rows.

On Thursday, July 19, 2012 8:40:37 PM UTC+5:30, Joe Faith wrote:
Hi Amber

We would strongly suggest you use InnoDB tables wherever possible.


J

On Wed, Jul 18, 2012 at 9:33 PM, amber <tekipconsulting@gmail.com> wrote:
Hi Razvan,

I also used MATCH() ... AGAINST, but performance still did not improve; it takes close to 52 seconds for 1300 records, and the actual size of our database will be more than 100k records, so at that point it won't work. Please suggest any solution to increase the speed of accessing the data through the Google App Engine application.


On Wednesday, July 18, 2012 10:19:02 PM UTC+5:30, Razvan Musaloiu-E. wrote:
LIKE works with InnoDB tables. Do these tables change often? Could you maintain a copy of them in InnoDB format?

Another idea: why don't you use MATCH() ... AGAINST instead of LIKE?

-- Razvan ME


On Wed, Jul 18, 2012 at 1:09 AM, amber <tekipconsulting@gmail.com> wrote:
Hi,

The table structures are as follows:

CREATE TABLE table1
(UniqueID VARCHAR(255) NOT NULL PRIMARY KEY, Company VARCHAR(255) NULL, Category1 VARCHAR(255) NULL, Category2 VARCHAR(255) NULL, Category3 VARCHAR(255) NULL, Category4 VARCHAR(255) NULL, Category5 VARCHAR(255) NULL,
 SubCategory1 VARCHAR(255) NULL, SubCategory2 VARCHAR(255) NULL, SubCategory3 VARCHAR(255) NULL, SubCategory4 VARCHAR(255) NULL,
 FULLTEXT (Company,Category1,Category2,Category3,Category4)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

CREATE TABLE table2
(SubID VARCHAR(255) NOT NULL PRIMARY KEY, UniqueID VARCHAR(255) NOT NULL REFERENCES table1(UniqueID), COUNTRYCODE VARCHAR(127) NULL, STATUS VARCHAR(255) NULL, TITLE MEDIUMTEXT NULL, ABSTRACT LONGTEXT NULL, CLAIMS LONGTEXT NULL, PUBLICATIONDATE DATE NULL, EARLIERPRIOTITYDATE DATE NULL, DESCRIPTION LONGTEXT NULL,
 FULLTEXT (TITLE,ABSTRACT,CLAIMS,DESCRIPTION)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

Sample query:

select title,SubID from table2 where SubID in(select max(SubID) from table2 where Abstract like '%systemt%' group by UniqueID)

Result:
displays the max SubID within each group of UniqueID records whose Abstract column matches the 'system' keyword

On Wednesday, July 18, 2012 11:22:07 AM UTC+5:30, Razvan Musaloiu-E. wrote:
Can you run EXPLAIN on one of the queries and examine the query plan? If the query uses temporary tables and the result is too big then mysql will spill the temporary tables to disk and that can cause a significant slowdown.

Some good references on EXPLAIN:


On Tue, Jul 17, 2012 at 10:39 PM, amber <tekipconsulting@gmail.com> wrote:
Hi all,

thanks for your quick reply,

I am using nested queries to join the two tables, and in Google Cloud SQL we have stored 100k records; now, when searching through the Google App Engine app, it takes too much time and the page does not display anything.

And in the log it shows the following error:

This request caused a new process to be started for your application, and thus caused your application code to be loaded for the first time. This request may thus take longer and use more CPU than a typical request for your application. 

On Tuesday, July 17, 2012 12:35:03 PM UTC+5:30, Razvan Musaloiu-E. wrote:
Do you mind providing a dataset and some examples of slow queries? Feel free to contact me off-list for this.

-- Razvan ME


On Mon, Jul 16, 2012 at 11:55 PM, amber <tekipconsulting@gmail.com> wrote:
Hi , 

I imported a DB (~100k rows) on which I am doing full text searches. In Cloud SQL they are taking more than 30 seconds, causing App Engine to exceed the request time limit.

Any suggestions? Or are full text searches just too slow to be done in the GCS?







--
Joe Faith | Product Manager | Google Cloud

Thursday, July 19, 2012

Re: full text search

Hi Amber

We would strongly suggest you use InnoDB tables wherever possible.


J

On Wed, Jul 18, 2012 at 9:33 PM, amber <tekipconsulting@gmail.com> wrote:
Hi Razvan,

I also used MATCH() ... AGAINST, but performance still did not improve; it takes close to 52 seconds for 1300 records, and the actual size of our database will be more than 100k records, so at that point it won't work. Please suggest any solution to increase the speed of accessing the data through the Google App Engine application.


On Wednesday, July 18, 2012 10:19:02 PM UTC+5:30, Razvan Musaloiu-E. wrote:
LIKE works with InnoDB tables. Do these tables change often? Could you maintain a copy of them in InnoDB format?

Another idea: why don't you use MATCH() ... AGAINST instead of LIKE?

-- Razvan ME


On Wed, Jul 18, 2012 at 1:09 AM, amber <tekipconsulting@gmail.com> wrote:
Hi,

The table structures are as follows:

CREATE TABLE table1
(UniqueID VARCHAR(255) NOT NULL PRIMARY KEY, Company VARCHAR(255) NULL, Category1 VARCHAR(255) NULL, Category2 VARCHAR(255) NULL, Category3 VARCHAR(255) NULL, Category4 VARCHAR(255) NULL, Category5 VARCHAR(255) NULL,
 SubCategory1 VARCHAR(255) NULL, SubCategory2 VARCHAR(255) NULL, SubCategory3 VARCHAR(255) NULL, SubCategory4 VARCHAR(255) NULL,
 FULLTEXT (Company,Category1,Category2,Category3,Category4)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

CREATE TABLE table2
(SubID VARCHAR(255) NOT NULL PRIMARY KEY, UniqueID VARCHAR(255) NOT NULL REFERENCES table1(UniqueID), COUNTRYCODE VARCHAR(127) NULL, STATUS VARCHAR(255) NULL, TITLE MEDIUMTEXT NULL, ABSTRACT LONGTEXT NULL, CLAIMS LONGTEXT NULL, PUBLICATIONDATE DATE NULL, EARLIERPRIOTITYDATE DATE NULL, DESCRIPTION LONGTEXT NULL,
 FULLTEXT (TITLE,ABSTRACT,CLAIMS,DESCRIPTION)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

Sample query:

select title,SubID from table2 where SubID in(select max(SubID) from table2 where Abstract like '%systemt%' group by UniqueID)

Result:
displays the max SubID within each group of UniqueID records whose Abstract column matches the 'system' keyword

On Wednesday, July 18, 2012 11:22:07 AM UTC+5:30, Razvan Musaloiu-E. wrote:
Can you run EXPLAIN on one of the queries and examine the query plan? If the query uses temporary tables and the result is too big then mysql will spill the temporary tables to disk and that can cause a significant slowdown.

Some good references on EXPLAIN:


On Tue, Jul 17, 2012 at 10:39 PM, amber <tekipconsulting@gmail.com> wrote:
Hi all,

thanks for your quick reply,

I am using nested queries to join the two tables, and in Google Cloud SQL we have stored 100k records; now, when searching through the Google App Engine app, it takes too much time and the page does not display anything.

And in the log it shows the following error:

This request caused a new process to be started for your application, and thus caused your application code to be loaded for the first time. This request may thus take longer and use more CPU than a typical request for your application. 

On Tuesday, July 17, 2012 12:35:03 PM UTC+5:30, Razvan Musaloiu-E. wrote:
Do you mind providing a dataset and some examples of slow queries? Feel free to contact me off-list for this.

-- Razvan ME


On Mon, Jul 16, 2012 at 11:55 PM, amber <tekipconsulting@gmail.com> wrote:
Hi , 

I imported a DB (~100k rows) on which I am doing full text searches. In Cloud SQL they are taking more than 30 seconds, causing App Engine to exceed the request time limit.

Any suggestions? Or are full text searches just too slow to be done in the GCS?







--
Joe Faith | Product Manager | Google Cloud

Wednesday, July 18, 2012

Re: full text search

Hi Razvan,

I also used MATCH() ... AGAINST, but performance still did not improve; it takes close to 52 seconds for 1300 records, and the actual size of our database will be more than 100k records, so at that point it won't work. Please suggest any solution to increase the speed of accessing the data through the Google App Engine application.

On Wednesday, July 18, 2012 10:19:02 PM UTC+5:30, Razvan Musaloiu-E. wrote:
LIKE works with InnoDB tables. Do these tables change often? Could you maintain a copy of them in InnoDB format?

Another idea: why don't you use MATCH() ... AGAINST instead of LIKE?

-- Razvan ME


On Wed, Jul 18, 2012 at 1:09 AM, amber <tekipconsulting@gmail.com> wrote:
Hi,

The table structures are as follows:

CREATE TABLE table1
(UniqueID VARCHAR(255) NOT NULL PRIMARY KEY, Company VARCHAR(255) NULL, Category1 VARCHAR(255) NULL, Category2 VARCHAR(255) NULL, Category3 VARCHAR(255) NULL, Category4 VARCHAR(255) NULL, Category5 VARCHAR(255) NULL,
 SubCategory1 VARCHAR(255) NULL, SubCategory2 VARCHAR(255) NULL, SubCategory3 VARCHAR(255) NULL, SubCategory4 VARCHAR(255) NULL,
 FULLTEXT (Company,Category1,Category2,Category3,Category4)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

CREATE TABLE table2
(SubID VARCHAR(255) NOT NULL PRIMARY KEY, UniqueID VARCHAR(255) NOT NULL REFERENCES table1(UniqueID), COUNTRYCODE VARCHAR(127) NULL, STATUS VARCHAR(255) NULL, TITLE MEDIUMTEXT NULL, ABSTRACT LONGTEXT NULL, CLAIMS LONGTEXT NULL, PUBLICATIONDATE DATE NULL, EARLIERPRIOTITYDATE DATE NULL, DESCRIPTION LONGTEXT NULL,
 FULLTEXT (TITLE,ABSTRACT,CLAIMS,DESCRIPTION)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

Sample query:

select title,SubID from table2 where SubID in(select max(SubID) from table2 where Abstract like '%systemt%' group by UniqueID)

Result:
displays the max SubID within each group of UniqueID records whose Abstract column matches the 'system' keyword

On Wednesday, July 18, 2012 11:22:07 AM UTC+5:30, Razvan Musaloiu-E. wrote:
Can you run EXPLAIN on one of the queries and examine the query plan? If the query uses temporary tables and the result is too big then mysql will spill the temporary tables to disk and that can cause a significant slowdown.

Some good references on EXPLAIN:


On Tue, Jul 17, 2012 at 10:39 PM, amber <tekipconsulting@gmail.com> wrote:
Hi all,

thanks for your quick reply,

I am using nested queries to join the two tables, and in Google Cloud SQL we have stored 100k records; now, when searching through the Google App Engine app, it takes too much time and the page does not display anything.

And in the log it shows the following error:

This request caused a new process to be started for your application, and thus caused your application code to be loaded for the first time. This request may thus take longer and use more CPU than a typical request for your application. 

On Tuesday, July 17, 2012 12:35:03 PM UTC+5:30, Razvan Musaloiu-E. wrote:
Do you mind providing a dataset and some examples of slow queries? Feel free to contact me off-list for this.

-- Razvan ME


On Mon, Jul 16, 2012 at 11:55 PM, amber <tekipconsulting@gmail.com> wrote:
Hi , 

I imported a DB (~100k rows) on which I am doing full text searches. In Cloud SQL they are taking more than 30 seconds, causing App Engine to exceed the request time limit.

Any suggestions? Or are full text searches just too slow to be done in the GCS?




Re: full text search

LIKE works with InnoDB tables. Do these tables change often? Could you maintain a copy of them in InnoDB format?

Another idea: why don't you use MATCH() ... AGAINST instead of LIKE?

-- Razvan ME


On Wed, Jul 18, 2012 at 1:09 AM, amber <tekipconsulting@gmail.com> wrote:
Hi,

The table structures are as follows:

CREATE TABLE table1
(UniqueID VARCHAR(255) NOT NULL PRIMARY KEY, Company VARCHAR(255) NULL, Category1 VARCHAR(255) NULL, Category2 VARCHAR(255) NULL, Category3 VARCHAR(255) NULL, Category4 VARCHAR(255) NULL, Category5 VARCHAR(255) NULL,
 SubCategory1 VARCHAR(255) NULL, SubCategory2 VARCHAR(255) NULL, SubCategory3 VARCHAR(255) NULL, SubCategory4 VARCHAR(255) NULL,
 FULLTEXT (Company,Category1,Category2,Category3,Category4)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

CREATE TABLE table2
(SubID VARCHAR(255) NOT NULL PRIMARY KEY, UniqueID VARCHAR(255) NOT NULL REFERENCES table1(UniqueID), COUNTRYCODE VARCHAR(127) NULL, STATUS VARCHAR(255) NULL, TITLE MEDIUMTEXT NULL, ABSTRACT LONGTEXT NULL, CLAIMS LONGTEXT NULL, PUBLICATIONDATE DATE NULL, EARLIERPRIOTITYDATE DATE NULL, DESCRIPTION LONGTEXT NULL,
 FULLTEXT (TITLE,ABSTRACT,CLAIMS,DESCRIPTION)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

Sample query:

select title,SubID from table2 where SubID in(select max(SubID) from table2 where Abstract like '%systemt%' group by UniqueID)

Result:
displays the max SubID within each group of UniqueID records whose Abstract column matches the 'system' keyword

On Wednesday, July 18, 2012 11:22:07 AM UTC+5:30, Razvan Musaloiu-E. wrote:
Can you run EXPLAIN on one of the queries and examine the query plan? If the query uses temporary tables and the result is too big then mysql will spill the temporary tables to disk and that can cause a significant slowdown.

Some good references on EXPLAIN:


On Tue, Jul 17, 2012 at 10:39 PM, amber <tekipconsulting@gmail.com> wrote:
Hi all,

thanks for your quick reply,

I am using nested queries to join the two tables, and in Google Cloud SQL we have stored 100k records; now, when searching through the Google App Engine app, it takes too much time and the page does not display anything.

And in the log it shows the following error:

This request caused a new process to be started for your application, and thus caused your application code to be loaded for the first time. This request may thus take longer and use more CPU than a typical request for your application. 

On Tuesday, July 17, 2012 12:35:03 PM UTC+5:30, Razvan Musaloiu-E. wrote:
Do you mind providing a dataset and some examples of slow queries? Feel free to contact me off-list for this.

-- Razvan ME


On Mon, Jul 16, 2012 at 11:55 PM, amber <tekipconsulting@gmail.com> wrote:
Hi , 

I imported a DB (~100k rows) on which I am doing full text searches. In Cloud SQL they are taking more than 30 seconds, causing App Engine to exceed the request time limit.

Any suggestions? Or are full text searches just too slow to be done in the GCS?
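To illustrate the MATCH() ... AGAINST suggestion above, a minimal JDBC sketch that searches the FULLTEXT index declared on table2 in the schema quoted above (the connection URL is hypothetical):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class FulltextSearch {
    public static void main(String[] args) throws Exception {
        Class.forName("com.google.cloud.sql.Driver");
        Connection conn =
            DriverManager.getConnection("jdbc:google:rdbms://myproject:myinstance/mydb");
        try {
            // The column list must match the FULLTEXT index on table2 exactly.
            PreparedStatement ps = conn.prepareStatement(
                "SELECT SubID, TITLE FROM table2"
                + " WHERE MATCH(TITLE, ABSTRACT, CLAIMS, DESCRIPTION) AGAINST (?)");
            ps.setString(1, "system");
            ResultSet rs = ps.executeQuery();
            while (rs.next()) {
                System.out.println(rs.getString("SubID") + " - " + rs.getString("TITLE"));
            }
            rs.close();
            ps.close();
        } finally {
            conn.close();
        }
    }
}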




Re: Import file not found

Typo -- try gs://probike-test/probike-test.sql


On Wed, Jul 18, 2012 at 7:55 AM, Michael Wright <m.d.wright4@gmail.com> wrote:
I'm getting an import error, but the file is there. Any ideas from the cloud? Here is an error log entry and a Google Cloud Storage file list:

Jul 18, 2012 7:16 AM  m.d.wright4@gmail.com
E  Failed to import gs://probike.test/probike-test.sql: Does not exist

Name: probike-test.sql
Size: 23.6 MB
Last Updated: 7:55 am
Path: probike-test/probike-test.sql



--
Joe Faith | Product Manager | Google Cloud

Import file not found

I'm getting an import error, but the file is there. Any ideas from the cloud? Here is an error log entry and a Google Cloud Storage file list:

Jul 18, 2012 7:16 AM  m.d.wright4@gmail.com
E  Failed to import gs://probike.test/probike-test.sql: Does not exist

Name: probike-test.sql
Size: 23.6 MB
Last Updated: 7:55 am
Path: probike-test/probike-test.sql

Re: full text search

Hi,

The table structures are as follows:

CREATE TABLE table1
(UniqueID VARCHAR(255) NOT NULL PRIMARY KEY, Company VARCHAR(255) NULL, Category1 VARCHAR(255) NULL, Category2 VARCHAR(255) NULL, Category3 VARCHAR(255) NULL, Category4 VARCHAR(255) NULL, Category5 VARCHAR(255) NULL,
 SubCategory1 VARCHAR(255) NULL, SubCategory2 VARCHAR(255) NULL, SubCategory3 VARCHAR(255) NULL, SubCategory4 VARCHAR(255) NULL,
 FULLTEXT (Company,Category1,Category2,Category3,Category4)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

CREATE TABLE table2
(SubID VARCHAR(255) NOT NULL PRIMARY KEY, UniqueID VARCHAR(255) NOT NULL REFERENCES table1(UniqueID), COUNTRYCODE VARCHAR(127) NULL, STATUS VARCHAR(255) NULL, TITLE MEDIUMTEXT NULL, ABSTRACT LONGTEXT NULL, CLAIMS LONGTEXT NULL, PUBLICATIONDATE DATE NULL, EARLIERPRIOTITYDATE DATE NULL, DESCRIPTION LONGTEXT NULL,
 FULLTEXT (TITLE,ABSTRACT,CLAIMS,DESCRIPTION)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

Sample query:

select title,SubID from table2 where SubID in(select max(SubID) from table2 where Abstract like '%systemt%' group by UniqueID)

Result:
displays the max SubID within each group of UniqueID records whose Abstract column matches the 'system' keyword

On Wednesday, July 18, 2012 11:22:07 AM UTC+5:30, Razvan Musaloiu-E. wrote:
Can you run EXPLAIN on one of the queries and examine the query plan? If the query uses temporary tables and the result is too big then mysql will spill the temporary tables to disk and that can cause a significant slowdown.

Some good references on EXPLAIN:


On Tue, Jul 17, 2012 at 10:39 PM, amber <tekipconsulting@gmail.com> wrote:
Hi all,

thanks for your quick reply,

I am using nested queries to join the two tables, and in Google Cloud SQL we have stored 100k records; now, when searching through the Google App Engine app, it takes too much time and the page does not display anything.

And in the log it shows the following error:

This request caused a new process to be started for your application, and thus caused your application code to be loaded for the first time. This request may thus take longer and use more CPU than a typical request for your application. 

On Tuesday, July 17, 2012 12:35:03 PM UTC+5:30, Razvan Musaloiu-E. wrote:
Do you mind providing a dataset and some examples of slow queries? Feel free to contact me off-list for this.

-- Razvan ME


On Mon, Jul 16, 2012 at 11:55 PM, amber <tekipconsulting@gmail.com> wrote:
Hi , 

I imported a DB (~100k rows) on which I am doing full text searches. In Cloud SQL they are taking more than 30 seconds, causing App Engine to exceed the request time limit.

Any suggestions? Or are full text searches just too slow to be done in the GCS?



Tuesday, July 17, 2012

Re: full text search

Can you run EXPLAIN on one of the queries and examine the query plan? If the query uses temporary tables and the result is too big then mysql will spill the temporary tables to disk and that can cause a significant slowdown.

Some good references on EXPLAIN:


On Tue, Jul 17, 2012 at 10:39 PM, amber <tekipconsulting@gmail.com> wrote:
Hi all,

thanks for your quick reply,

I am using nested queries to join the two tables, and in Google Cloud SQL we have stored 100k records; now, when searching through the Google App Engine app, it takes too much time and the page does not display anything.

And in the log it shows the following error:

This request caused a new process to be started for your application, and thus caused your application code to be loaded for the first time. This request may thus take longer and use more CPU than a typical request for your application. 

On Tuesday, July 17, 2012 12:35:03 PM UTC+5:30, Razvan Musaloiu-E. wrote:
Do you mind providing a dataset and some examples of slow queries? Feel free to contact me off-list for this.

-- Razvan ME


On Mon, Jul 16, 2012 at 11:55 PM, amber <tekipconsulting@gmail.com> wrote:
Hi , 

I imported a DB (~100k rows) on which I am doing full text searches. In Cloud SQL they are taking more than 30 seconds, causing App Engine to exceed the request time limit.

Any suggestions? Or are full text searches just too slow to be done in the GCS?
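As a rough illustration of the EXPLAIN suggestion above, this sketch runs EXPLAIN on the sample query quoted elsewhere in this thread and prints the plan rows (the connection URL is hypothetical):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

public class ExplainSlowQuery {
    public static void main(String[] args) throws Exception {
        Class.forName("com.google.cloud.sql.Driver");
        // The query being explained is the sample query from amber's post.
        String sql = "EXPLAIN SELECT title, SubID FROM table2 WHERE SubID IN"
            + " (SELECT MAX(SubID) FROM table2 WHERE Abstract LIKE '%system%' GROUP BY UniqueID)";
        Connection conn =
            DriverManager.getConnection("jdbc:google:rdbms://myproject:myinstance/mydb");
        try {
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery(sql);
            ResultSetMetaData md = rs.getMetaData();
            while (rs.next()) {
                // Watch the type, key and Extra columns for full scans,
                // "Using temporary" and "Using filesort".
                StringBuilder row = new StringBuilder();
                for (int i = 1; i <= md.getColumnCount(); i++) {
                    row.append(md.getColumnLabel(i)).append('=')
                       .append(rs.getString(i)).append(' ');
                }
                System.out.println(row);
            }
            rs.close();
            stmt.close();
        } finally {
            conn.close();
        }
    }
}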



Re: full text search

Hi Joe,

I am using Google Cloud SQL with the MyISAM engine, and I am searching through a Google App Engine application.

On Tuesday, July 17, 2012 9:11:53 PM UTC+5:30, Joe Faith wrote:

Hi Amber

What storage engine were you using? Full text search is currently (MySQL 5.5) only available in myISAM, but the performance of InnoDB is much better in general.

One solution is to keep just the tables that require search in myISAM and move the rest.

J

On Jul 16, 2012 11:55 PM, "amber" <tekipconsulting@gmail.com> wrote:
Hi , 

I imported a DB (~100k rows) on which I am doing full text searches. In Cloud SQL they are taking more than 30 seconds, causing App Engine to exceed the request time limit.

Any suggestions? Or are full text searches just too slow to be done in the GCS?

Re: full text search

Hi all,

thanks for your quick reply,

I am using nested queries to join the two tables, and in Google Cloud SQL we have stored 100k records; now, when searching through the Google App Engine app, it takes too much time and the page does not display anything.

And in the log it shows the following error:

This request caused a new process to be started for your application, and thus caused your application code to be loaded for the first time. This request may thus take longer and use more CPU than a typical request for your application. 

On Tuesday, July 17, 2012 12:35:03 PM UTC+5:30, Razvan Musaloiu-E. wrote:
Do you mind providing a dataset and some examples of slow queries? Feel free to contact me off-list for this.

-- Razvan ME


On Mon, Jul 16, 2012 at 11:55 PM, amber <tekipconsulting@gmail.com> wrote:
Hi , 

I imported a DB (~100k rows) on which I am doing full text searches. In Cloud SQL they are taking more than 30 seconds, causing App Engine to exceed the request time limit.

Any suggestions? Or are full text searches just too slow to be done in the GCS?


Re: full text search

Hi Amber

What storage engine were you using? Full text search is currently (MySQL 5.5) only available in myISAM, but the performance of InnoDB is much better in general.

One solution is to keep just the tables that require search in myISAM and move the rest.

J

On Jul 16, 2012 11:55 PM, "amber" <tekipconsulting@gmail.com> wrote:
Hi , 

I imported a DB (~100k rows) on which I am doing full text searches. In Cloud SQL they are taking more than 30 seconds, causing App Engine to exceed the request time limit.

Any suggestions? Or are full text searches just too slow to be done in the GCS?
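As a starting point for the suggestion above, it can help to list which engine each table currently uses before deciding what to move to InnoDB (remember that on MySQL 5.5 any table that keeps a FULLTEXT index has to stay on MyISAM); a minimal sketch, with a hypothetical connection URL:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ListTableEngines {
    public static void main(String[] args) throws Exception {
        Class.forName("com.google.cloud.sql.Driver");
        Connection conn =
            DriverManager.getConnection("jdbc:google:rdbms://myproject:myinstance/mydb");
        try {
            Statement stmt = conn.createStatement();
            // Tables without a FULLTEXT index are candidates for ALTER TABLE ... ENGINE=InnoDB.
            ResultSet rs = stmt.executeQuery(
                "SELECT TABLE_NAME, ENGINE FROM information_schema.TABLES"
                + " WHERE TABLE_SCHEMA = DATABASE()");
            while (rs.next()) {
                System.out.println(rs.getString("TABLE_NAME") + " -> " + rs.getString("ENGINE"));
            }
            rs.close();
            stmt.close();
        } finally {
            conn.close();
        }
    }
}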

Re: disable billing in google cloud sql

Once you've deleted your instances, you won't be billed for them anymore.

Now, are you having trouble disabling the cloud sql service or billing? Can you tell me your project id? (Feel free to reply off-list.)


On Tue, Jul 17, 2012 at 12:02 AM, Chinnaa Rathinam <chinnasamyrp@gmail.com> wrote:

hi,

I enabled Google Cloud SQL instance billing on 13-7-2012. Now I have deleted all the instances and disabled the Cloud SQL option, but when I log in the next time it is enabled by default. How can I stop this? How long will it take to know that billing for my cloud instances is disabled?

Thanks in advance

Re: full text search

Too slow, I tried this and had to do a custom solution.

What I did: 

- Get an Amazon EC2 Linux instance, install Sphinx, and create a snapshot image of the machine.
- Create a web service that exposes your indexed data in XML format with a kill-list (the XML is saved to the blobstore but served through a password-protected App Engine handler).
- Create a cron job on the AMI that fetches the XML every n hours and runs the indexer.
- Create a web service on the AMI (I used Python's SimpleHTTPServer) to expose the search server. Re-save the AMI.
- Query the search service using App Engine's URL Fetch service and get the actual objects from MySQL by ID.

Actually, once I was at this point I switched to Datastore completely because it is faster and more reliable. I was only using MySQL for the full-text searches anyway...



On Tue, Jul 17, 2012 at 2:55 AM, amber <tekipconsulting@gmail.com> wrote:
Hi , 

I imported a DB (~100k rows) on which I am doing full text searches. In Cloud SQL they are taking more than 30 seconds, causing App Engine to exceed the request time limit.

Any suggestions? Or are full text searches just too slow to be done in the GCS?
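To illustrate the last step of the setup described above, here is a minimal sketch of calling such an external search service from App Engine (java.net.URL is backed by URL Fetch there); the endpoint address and the one-ID-per-line response format are assumptions:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLEncoder;
import java.util.ArrayList;
import java.util.List;

public class ExternalSearchClient {
    // Returns the row IDs reported by the external search service for a query.
    public static List<String> search(String query) throws Exception {
        URL url = new URL("http://search.example.com:8000/search?q="
            + URLEncoder.encode(query, "UTF-8"));
        List<String> ids = new ArrayList<String>();
        BufferedReader in =
            new BufferedReader(new InputStreamReader(url.openStream(), "UTF-8"));
        try {
            String line;
            while ((line = in.readLine()) != null) {
                ids.add(line.trim());  // assumed: one matching ID per line
            }
        } finally {
            in.close();
        }
        return ids;  // look the actual rows up in MySQL / Datastore by these IDs
    }
}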


Re: full text search

Do you mind providing a dataset and some examples of slow queries? Feel free to contact me off-list for this.

-- Razvan ME


On Mon, Jul 16, 2012 at 11:55 PM, amber <tekipconsulting@gmail.com> wrote:
Hi , 

I imported a DB (~100k rows) on which I am doing full text searches. In Cloud SQL they are taking more than 30 seconds, causing App Engine to exceed the request time limit.

Any suggestions? Or are full text searches just too slow to be done in the GCS?


Re: disable billing in google cloud sql


hi,

I enabled Google Cloud SQL instance billing on 13-7-2012. Now I have deleted all the instances and disabled the Cloud SQL option, but when I log in the next time it is enabled by default. How can I stop this? How long will it take to know that billing for my cloud instances is disabled?

Thanks in advance

Monday, July 16, 2012

full text search

Hi , 

I imported a DB (~100k rows) on which I am doing full text searches. In Cloud SQL they are taking more than 30 seconds, causing App Engine to exceed the request time limit.

Any suggestions? Or are full text searches just too slow to be done in the GCS?

Re: disable billing in google cloud sql

Hi,
If you have deleted all instances, billing is effectively disabled - you won't be charged for cloud sql instances anymore. If you want to disable the cloud sql service, you can do it by clicking on the "service" link in the upper left corner of api console and switching the cloud sql service status to "off".

On Mon, Jul 16, 2012 at 9:43 PM, Chinnaa Rathinam <chinnasamyrp@gmail.com> wrote:
Hi,

I have enabled billing for Google Cloud SQL and I need to disable it. As per the instructions given, I have deleted all instances, but I am not able to find the disable billing link in the billing section. How can I get that link?

Thanks in advance

--
Regards,
R.CHINNASAMY
ME-PERVASIVE COMPUTING TECHNOLOGIES
ANNA UNIVERSITY OF TECHNOLOGY
TIRUCHIRAPPALLI



Re: disable billing in google cloud sql

The link should show up at the bottom left of the Billing page. Here is a screenshot.



-- Razvan ME


On Mon, Jul 16, 2012 at 9:43 PM, Chinnaa Rathinam <chinnasamyrp@gmail.com> wrote:
Hi,

I have enabled billing for Google Cloud SQL and I need to disable it. As per the instructions given, I have deleted all instances, but I am not able to find the disable billing link in the billing section. How can I get that link?

Thanks in advance

--
Regards,
R.CHINNASAMY
ME-PERVASIVE COMPUTING TECHNOLOGIES
ANNA UNIVERSITY OF TECHNOLOGY
TIRUCHIRAPPALLI



disable billing in google cloud sql

Hi,

I have enabled billing for Google Cloud SQL and I need to disable it. As per the instructions given, I have deleted all instances, but I am not able to find the disable billing link in the billing section. How can I get that link?

Thanks in advance

--
Regards,
R.CHINNASAMY
ME-PERVASIVE COMPUTING TECHNOLOGIES
ANNA UNIVERSITY OF TECHNOLOGY
TIRUCHIRAPPALLI