Thursday, June 30, 2022

[google-cloud-sql-discuss] Re: Why Cloud Postgres SQL instance updates randomly by itself?

These Update operations are logged when the Cloud SQL instance's configuration is changed, for example when database flags are modified. 
On Wednesday, June 29, 2022 at 4:41:20 PM UTC-5 mappe...@gmail.com wrote:
Hi, 

I checked the operation log for our PostgreSQL instance and found that it shows "Update finished" many times. Each update takes about a minute or two, which causes connections to drop if someone is using the instance from the application.

I have no idea what it is; it is not part of the maintenance schedule we set up.

Does anyone know what causes this, and how can we avoid it?

Please see the attached screenshot to see how it looks in the SQL log entry.

Thanks in advance.


--
You received this message because you are subscribed to the Google Groups "Google Cloud SQL discuss" group.
To unsubscribe from this group and stop receiving emails from it, send an email to google-cloud-sql-discuss+unsubscribe@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/2f035f78-3ed5-4e0a-a8b7-008b25ef9dd4n%40googlegroups.com.

Wednesday, June 29, 2022

[google-cloud-sql-discuss] Re: How to change root@localhost password or delete user

Unfortunately, it is not possible to delete or modify 'root@localhost', according to [1].

[1] https://cloud.google.com/sql/docs/mysql/users#system_users

On Tuesday, June 28, 2022 at 3:57:53 PM UTC-5 snee...@gmail.com wrote:
We have a MySQL v5.7 instance that was compromised by ransomware. The attacker deleted our databases and left one behind saying where to send money.

The logins we found in use were admin and root, which still had default settings.

We deleted root and admin, along with all older users, and created new users with different names. When I list users from the GCP Console I see only the newly created users, but when listing users through the mysql shell I see an additional (system) user: root@localhost.

All GCP-created users are regular users, so deleting or changing root@localhost fails.

Is it secure? Is there a way to change or remove it? Does the GCP Console provide a way to create a "system user"?

I have another instance which does not have a root user!
  

To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/6def2b4b-2525-4168-b491-e4c5db630398n%40googlegroups.com.

[google-cloud-sql-discuss] Why Cloud Postgres SQL instance updates randomly by itself?

Hi, 

I checked the operation log for our PostgreSQL instance and found that it shows "Update finished" many times. Each update takes about a minute or two, which causes connections to drop if someone is using the instance from the application.

I have no idea what it is; it is not part of the maintenance schedule we set up.

Does anyone know what causes this, and how can we avoid it?

Please see the attached screenshot to see how it looks in the SQL log entry.

Thanks in advance.


To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/30596767-60a7-4fda-bc0f-88d2e02d9a95n%40googlegroups.com.

Tuesday, June 28, 2022

[google-cloud-sql-discuss] How to change root@localhost password or delete user

We have a MySQL v5.7 instance that was compromised by ransomware. The attacker deleted our databases and left one behind saying where to send money.

The logins we found in use were admin and root, which still had default settings.

We deleted root and admin, along with all older users, and created new users with different names. When I list users from the GCP Console I see only the newly created users, but when listing users through the mysql shell I see an additional (system) user: root@localhost.

All GCP-created users are regular users, so deleting or changing root@localhost fails.

Is it secure? Is there a way to change or remove it? Does the GCP Console provide a way to create a "system user"?

I have another instance which does not have a root user!
  

To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/2b136d21-8083-484b-ad16-666fffed406cn%40googlegroups.com.

Friday, June 24, 2022

Re: [google-cloud-sql-discuss] Re: Can't connect to instance because "Policy checks are unavailable"

The Cloud SQL team is working on fixing this right now.

I'll share updates here as they're available: https://github.com/GoogleCloudPlatform/cloudsql-proxy/issues/1248.

On Fri, Jun 24, 2022 at 12:10 PM 'perezsanchez' via Google Cloud SQL discuss <google-cloud-sql-discuss@googlegroups.com> wrote:

Reviewing other cases with the same issue, the suggestion so far is to use Cloudflare's WARP to work around it. I suspect a DNS blocking issue of some sort.

Basically, it's Cloudflare's VPN tunnel. According to the following page, it's a WireGuard implementation:

https://developers.cloudflare.com/warp-client/warp-modes/#1111-with-warp


You can also follow the GitHub repository in case there are further updates on the related issue.

On Thursday, June 23, 2022 at 1:30:49 PM UTC-5 haide...@tajir-app.com wrote:
Here's what `gcloud sql connect` says:

ERROR: (gcloud.sql.connect) UNAVAILABLE: Policy checks are unavailable.


Here's what cloud_sql_proxy says:

2022/06/23 17:22:11 errors parsing config:

    googleapi: Error 503: Policy checks are unavailable., backendError

What's wrong here?

To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/38c76976-5a52-4d3d-b18b-1aa617ce7caen%40googlegroups.com.

To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/CA%2BND%3DxcY0UygMUWRc1SQ3_mm8bNhB_w5OY5Z%3D6kaxA6aohOCYQ%40mail.gmail.com.

[google-cloud-sql-discuss] Re: Can't connect to instance because "Policy checks are unavailable"

Reviewing other cases with the same issue, the suggestion so far is to use Cloudflare's WARP to work around it. I suspect a DNS blocking issue of some sort.

Basically, it's Cloudflare's VPN tunnel. According to the following page, it's a WireGuard implementation:

https://developers.cloudflare.com/warp-client/warp-modes/#1111-with-warp


You can also follow the GitHub repository in case there are further updates on the related issue.

On Thursday, June 23, 2022 at 1:30:49 PM UTC-5 haide...@tajir-app.com wrote:
Here's what `gcloud sql connect` says:

ERROR: (gcloud.sql.connect) UNAVAILABLE: Policy checks are unavailable.


Here's what cloud_sql_proxy says:

2022/06/23 17:22:11 errors parsing config:

    googleapi: Error 503: Policy checks are unavailable., backendError

What's wrong here?

To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/38c76976-5a52-4d3d-b18b-1aa617ce7caen%40googlegroups.com.

Thursday, June 23, 2022

Re: [google-cloud-sql-discuss] 403 Unauthorized issue when connecting to Cloud SQL using IAM

You also need the cloudsql.instances.login IAM permission to use IAM authentication.


Try that and see if it works?

Gabe Weiss (he/him) | Developer Advocate | gweiss@google.com



On Thu, Jun 23, 2022 at 11:30 AM Tony Ouyang <tony@terratrue.com> wrote:
Hi, I'm trying to use Cloud IAM authentication to connect to Cloud SQL. I set up a service account, granted it the `Cloud SQL Client` and `Cloud SQL Instance User` roles, and configured it correctly in my Java code. It works fine in my local sample program.

However, when I run the same thing as a Dataflow pipeline job using the Apache Beam library, with all configuration and settings left unchanged, I get a 403 notAuthorized error. The pipeline runs perfectly fine locally and only fails in the cloud:

Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
GET https://sqladmin.googleapis.com/v1/projects/sim-main-backend/instances/us-west1~sim-main-backend-mysql/connectSettings
{
  "code": 403,
  "errors": [
    {
      "domain": "global",
      "message": "The client is not authorized to make this request.",
      "reason": "notAuthorized"
    }
  ],
  "message": "The client is not authorized to make this request."
}
at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:439)
at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:525)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:466)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:576)
at com.google.cloud.sql.core.CloudSqlInstance.fetchMetadata(CloudSqlInstance.java:462)
... 9 more

Jun 22, 2022 3:31:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-06-22T22:31:34.777Z: com.zaxxer.hikari.pool.HikariPool$PoolInitializationException: Failed to initialize pool: Could not create connection to database server.
at com.zaxxer.hikari.pool.HikariPool.throwPoolInitializationException(HikariPool.java:596)
at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:582)
at com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:100)
at com.zaxxer.hikari.HikariDataSource.<init>(HikariDataSource.java:81)
at com.terratrue.dataflow.gcp.pipelines.GcpStorageSchemaClassifier$CloudSqlIamAuth.processSQL(GcpStorageSchemaClassifier.java:127)
at com.terratrue.dataflow.gcp.pipelines.GcpStorageSchemaClassifier$CloudSqlIamAuth.processElement(GcpStorageSchemaClassifier.java:94)
Caused by: java.sql.SQLNonTransientConnectionException: Could not create connection to database server.
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:110)
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:89)
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:63)
at com.mysql.cj.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:1001)
at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:818)
at com.mysql.cj.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:448)
at com.mysql.cj.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:241)
at com.mysql.cj.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:198)
at com.zaxxer.hikari.util.DriverDataSource.getConnection(DriverDataSource.java:121)
at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:359)
at com.zaxxer.hikari.pool.PoolBase.newPoolEntry(PoolBase.java:201)
at com.zaxxer.hikari.pool.HikariPool.createPoolEntry(HikariPool.java:470)
at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:561)
at com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:100)
at com.zaxxer.hikari.HikariDataSource.<init>(HikariDataSource.java:81)
at com.terratrue.dataflow.gcp.pipelines.GcpStorageSchemaClassifier$CloudSqlIamAuth.processSQL(GcpStorageSchemaClassifier.java:127)
at com.terratrue.dataflow.gcp.pipelines.GcpStorageSchemaClassifier$CloudSqlIamAuth.processElement(GcpStorageSchemaClassifier.java:94)
at com.terratrue.dataflow.gcp.pipelines.GcpStorageSchemaClassifier$CloudSqlIamAuth$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:188)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:340)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:218)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:169)
at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:83)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:420)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:389)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:314)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.RuntimeException: [sim-main-backend:us-west1:sim-main-backend-mysql] The Cloud SQL Instance does not exist or your account is not authorized to access it. Please verify the instance connection name and check the IAM permissions for project "sim-main-backend"
at com.google.cloud.sql.core.CloudSqlInstance.addExceptionContext(CloudSqlInstance.java:618)
at com.google.cloud.sql.core.CloudSqlInstance.fetchEphemeralCertificate(CloudSqlInstance.java:541)
at com.google.cloud.sql.core.CloudSqlInstance.lambda$performRefresh$0(CloudSqlInstance.java:332)
at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:131)
at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:74)
at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:82)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
... 3 more

To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/9fd06411-ef6f-428f-8d58-b19b7edcaacbn%40googlegroups.com.

To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/CAM9LKEpQNupFDcfmQG7hoXONu46BmKCWmqQ_mKwrJ_5LHYz8gw%40mail.gmail.com.

[google-cloud-sql-discuss] Can't connect to instance because "Policy checks are unavailable"

Here's what `gcloud sql connect` says:

ERROR: (gcloud.sql.connect) UNAVAILABLE: Policy checks are unavailable.


Here's what cloud_sql_proxy says:

2022/06/23 17:22:11 errors parsing config:

    googleapi: Error 503: Policy checks are unavailable., backendError

What's wrong here?

To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/a2fd53ac-a916-40c2-a5a4-a86e1c4a1e2bn%40googlegroups.com.

Wednesday, June 22, 2022

[google-cloud-sql-discuss] 403 Unauthorized issue when connecting to Cloud SQL using IAM

Hi, I'm trying to use Cloud IAM authentication to connect to Cloud SQL. I set up a service account, granted it the `Cloud SQL Client` and `Cloud SQL Instance User` roles, and configured it correctly in my Java code. It works fine in my local sample program.

However, when I run the same thing as a Dataflow pipeline job using the Apache Beam library, with all configuration and settings left unchanged, I get a 403 notAuthorized error. The pipeline runs perfectly fine locally and only fails in the cloud:

Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
GET https://sqladmin.googleapis.com/v1/projects/sim-main-backend/instances/us-west1~sim-main-backend-mysql/connectSettings
{
  "code": 403,
  "errors": [
    {
      "domain": "global",
      "message": "The client is not authorized to make this request.",
      "reason": "notAuthorized"
    }
  ],
  "message": "The client is not authorized to make this request."
}
at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:439)
at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:525)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:466)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:576)
at com.google.cloud.sql.core.CloudSqlInstance.fetchMetadata(CloudSqlInstance.java:462)
... 9 more

Jun 22, 2022 3:31:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-06-22T22:31:34.777Z: com.zaxxer.hikari.pool.HikariPool$PoolInitializationException: Failed to initialize pool: Could not create connection to database server.
at com.zaxxer.hikari.pool.HikariPool.throwPoolInitializationException(HikariPool.java:596)
at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:582)
at com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:100)
at com.zaxxer.hikari.HikariDataSource.<init>(HikariDataSource.java:81)
at com.terratrue.dataflow.gcp.pipelines.GcpStorageSchemaClassifier$CloudSqlIamAuth.processSQL(GcpStorageSchemaClassifier.java:127)
at com.terratrue.dataflow.gcp.pipelines.GcpStorageSchemaClassifier$CloudSqlIamAuth.processElement(GcpStorageSchemaClassifier.java:94)
Caused by: java.sql.SQLNonTransientConnectionException: Could not create connection to database server.
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:110)
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:89)
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:63)
at com.mysql.cj.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:1001)
at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:818)
at com.mysql.cj.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:448)
at com.mysql.cj.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:241)
at com.mysql.cj.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:198)
at com.zaxxer.hikari.util.DriverDataSource.getConnection(DriverDataSource.java:121)
at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:359)
at com.zaxxer.hikari.pool.PoolBase.newPoolEntry(PoolBase.java:201)
at com.zaxxer.hikari.pool.HikariPool.createPoolEntry(HikariPool.java:470)
at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:561)
at com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:100)
at com.zaxxer.hikari.HikariDataSource.<init>(HikariDataSource.java:81)
at com.terratrue.dataflow.gcp.pipelines.GcpStorageSchemaClassifier$CloudSqlIamAuth.processSQL(GcpStorageSchemaClassifier.java:127)
at com.terratrue.dataflow.gcp.pipelines.GcpStorageSchemaClassifier$CloudSqlIamAuth.processElement(GcpStorageSchemaClassifier.java:94)
at com.terratrue.dataflow.gcp.pipelines.GcpStorageSchemaClassifier$CloudSqlIamAuth$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:188)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:340)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:218)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:169)
at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:83)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:420)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:389)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:314)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.RuntimeException: [sim-main-backend:us-west1:sim-main-backend-mysql] The Cloud SQL Instance does not exist or your account is not authorized to access it. Please verify the instance connection name and check the IAM permissions for project "sim-main-backend"
at com.google.cloud.sql.core.CloudSqlInstance.addExceptionContext(CloudSqlInstance.java:618)
at com.google.cloud.sql.core.CloudSqlInstance.fetchEphemeralCertificate(CloudSqlInstance.java:541)
at com.google.cloud.sql.core.CloudSqlInstance.lambda$performRefresh$0(CloudSqlInstance.java:332)
at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:131)
at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:74)
at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:82)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
... 3 more

To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/9fd06411-ef6f-428f-8d58-b19b7edcaacbn%40googlegroups.com.

Tuesday, June 21, 2022

[google-cloud-sql-discuss] Re: Differentiate between 2 parameters Total memory usage and Memory usage?

The Memory usage metric is measured at the VM level and does not include the OS cache. It's equivalent to "MemTotal - MemAvailable" from the VM's /proc/meminfo.

The Total memory usage metric is measured at the database container level. It includes memory used by the database container itself, and the OS cache.

The metric names might suggest that Total memory usage is always greater than or equal to Memory usage. This is not true, because the two metrics are measured at different levels (VM versus Docker container). In other words, they are not two measurements of the same thing: Memory usage measures one thing, and Total memory usage measures another.

For example: 

  • Memory usage can be higher than Total memory usage because it's measured at the level of the entire VM, so it includes memory used by components other than the database container. If container and cache usage are low (e.g. because the database is small, idle, and doesn't use much OS cache), VM usage can exceed container usage. Differences in how memory allocation is measured in Docker versus the OS can also contribute to the discrepancy.

  • Total memory usage can be, and often is, higher than Memory usage because it includes OS cache. Busy, warm databases can use large amounts of OS cache, which inflates Total memory usage.
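The VM-level formula above ("MemTotal - MemAvailable") can be made concrete with a small sketch. The sample text below is invented to match the instance size in the question; on a real VM the input would come from /proc/meminfo:

```python
SAMPLE_MEMINFO = """\
MemTotal:         614400 kB
MemFree:           51200 kB
MemAvailable:     204800 kB
Buffers:           10240 kB
Cached:           143360 kB
"""

def parse_meminfo(text):
    """Parse /proc/meminfo-style text into a dict of values in kB."""
    fields = {}
    for line in text.splitlines():
        key, rest = line.split(":", 1)
        fields[key] = int(rest.strip().split()[0])
    return fields

def memory_usage_kb(fields):
    # "Memory usage" (VM level): MemTotal - MemAvailable, i.e. memory
    # that is neither free nor reclaimable cache.
    return fields["MemTotal"] - fields["MemAvailable"]
```

With the sample above this yields 614400 - 204800 = 409600 kB, which would be reported independently of whatever the container-level Total memory usage metric shows.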


On Tuesday, June 21, 2022 at 9:52:33 AM UTC-5 eucli...@gmail.com wrote:
I am using Cloud SQL with MySQL 5.7.
This is my configuration:
  • vCPUs: 1
  • Memory: 614.4 MB
  • SSD storage: 10 GB
Lately my application has been running pretty slowly, and I'm guessing it's due to a bottleneck in MySQL.

But when I look at the following two metrics, I don't understand why Total memory usage is smaller than Memory usage. (See the attached screenshots.)

Total memory usage.png
 
Memory usage.png
For reference, the documentation says:
  • Total memory usage: Total RAM usage in bytes including buffer cache. Sampled every 60 seconds. After sampling, data is not visible for up to 210 seconds.
  • Memory usage: RAM usage in bytes. Sampled every 60 seconds. After sampling, data is not visible for up to 210 seconds.
Can anyone help me explain these two metrics in more detail?

Thanks very much.

To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/8a6a0867-94e0-41f8-9163-58103292b384n%40googlegroups.com.

Monday, June 20, 2022

[google-cloud-sql-discuss] Differentiate between 2 parameters Total memory usage and Memory usage?

I am using Cloud SQL with MySQL 5.7.
This is my configuration:
  • vCPUs: 1
  • Memory: 614.4 MB
  • SSD storage: 10 GB
Lately my application has been running pretty slowly, and I'm guessing it's due to a bottleneck in MySQL.

But when I look at the following two metrics, I don't understand why Total memory usage is smaller than Memory usage. (See the attached screenshots.)

Total memory usage.png
 
Memory usage.png
For reference, the documentation says:
  • Total memory usage: Total RAM usage in bytes including buffer cache. Sampled every 60 seconds. After sampling, data is not visible for up to 210 seconds.
  • Memory usage: RAM usage in bytes. Sampled every 60 seconds. After sampling, data is not visible for up to 210 seconds.
Can anyone help me explain these two metrics in more detail?

Thanks very much.

To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/f315d5ff-6016-42e2-906b-15adffd0f4bbn%40googlegroups.com.

Friday, June 17, 2022

[google-cloud-sql-discuss] Re: Performance issues moving data between Cloud SQL instances (PostgreSQL)

The Cloud SQL for PostgreSQL documentation has several articles related to importing and exporting data.

For best practices, see the Best practices for importing and exporting data page.

Known issues are also covered in that documentation.

As shown in this Stack Overflow answer, you can use pg_dump to extract the table. This page describes exporting and importing data into Cloud SQL instances using pg_dump and pg_restore.

postgres_fdw is an extension that allows tables from other ("foreign") PostgreSQL databases to be exposed as foreign tables in the current database. Currently, this extension works between two Cloud SQL private-IP instances within the same VPC network, or across databases within the same instance.

To connect to databases within the same instance, you cannot set the host to localhost or 127.0.0.1. Instead, you must use the IP address shown for your instance in the Google Cloud console.

Additionally, you cannot enable the Allow only SSL connections setting in the Google Cloud console for an instance where foreign data is stored. Only cloudsqlsuperuser can be the owner of a postgres_fdw foreign data wrapper.
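To illustrate the postgres_fdw setup described above, the typical statement sequence looks roughly like the following (shown as a Python string purely so it can be quoted and inspected here; every host, database, user, table name, and password is a placeholder, and the statements would be run on the destination instance with a real client such as psql):

```python
# Hypothetical postgres_fdw setup for pulling selected tables from a
# source Cloud SQL instance. Per the restrictions above, host must be
# the source instance's private IP address, never localhost/127.0.0.1.
FDW_SETUP = """
CREATE EXTENSION IF NOT EXISTS postgres_fdw;

CREATE SERVER source_srv
    FOREIGN DATA WRAPPER postgres_fdw
    OPTIONS (host '10.0.0.5', port '5432', dbname 'sourcedb');

CREATE USER MAPPING FOR CURRENT_USER
    SERVER source_srv
    OPTIONS (user 'migrator', password 'change-me');

-- Expose only the tables to migrate, then copy one locally.
IMPORT FOREIGN SCHEMA public LIMIT TO (orders, customers)
    FROM SERVER source_srv INTO public;

CREATE TABLE orders_copy AS SELECT * FROM orders;
"""
```

Limiting the IMPORT FOREIGN SCHEMA statement to the specific tables being moved keeps the foreign-table surface small, which matches the "just some tables" scenario in the question.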

On Thursday, June 16, 2022 at 4:26:43 PM UTC-5 aba...@gmail.com wrote:
Hi. I'm having performance issues trying to move a large amount of data between two Cloud SQL instances running PostgreSQL.

Are there any guidelines or best practices for moving data between two Cloud SQL instances? Not the entire database, just some tables, directly between instances.

I'm using the Foreign Data Wrapper extension in PostgreSQL to perform the migration.

Thanks in advance.

--
You received this message because you are subscribed to the Google Groups "Google Cloud SQL discuss" group.
To unsubscribe from this group and stop receiving emails from it, send an email to google-cloud-sql-discuss+unsubscribe@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/3e0da8a2-4e5b-49fb-9741-dfe0f9b33e80n%40googlegroups.com.

Thursday, June 16, 2022

[google-cloud-sql-discuss] Performance issues moving data between Cloud SQL instances (PostgreSQL)

Hi. I'm having performance issues trying to move a large amount of data between two Cloud SQL instances using PostgreSQL.

Are there any guidelines or best practices to move data between two Cloud SQL instances? Not the entire database, just some tables, directly between instances.

I'm using the Foreign Data Wrapper extension in PostgreSQL to achieve the migration.

Thanks in advance.

--
You received this message because you are subscribed to the Google Groups "Google Cloud SQL discuss" group.
To unsubscribe from this group and stop receiving emails from it, send an email to google-cloud-sql-discuss+unsubscribe@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/9b337e10-d547-4272-920c-92d8f03bc6d1n%40googlegroups.com.

Wednesday, June 1, 2022

[google-cloud-sql-discuss] Re: CloudIAP with CloudSQL/Redis and Private Connect

As said before, 

    "there is no ETA's on Feature requests, however, keep track of this issue for updates" 

To keep track of this feature request, you can follow updates at the following link: https://issuetracker.google.com/159364421

Regarding your question to configure Cloud IAP with Cloud SQL, the following tutorial could help you.



On Monday, May 30, 2022 at 11:39:19 AM UTC-5 hs...@ischoolconnect.com wrote:
Is there any update from the IAP team? Can we configure IAP with Cloud SQL using a private IP?

On Thursday, April 30, 2020 at 7:23:32 PM UTC+5:30 Olu wrote:
As indicated on this IAP Overview documentation[1], IAP may be used with applications running on App Engine standard environment, App Engine flexible environment, Compute Engine, and GKE. The Cloud IAP cannot be configured with CloudSQL or Redis at the moment. A feature request was submitted with the IAP team for this implementation but there is no ETA for such implementations at this time. 

--
You received this message because you are subscribed to the Google Groups "Google Cloud SQL discuss" group.
To unsubscribe from this group and stop receiving emails from it, send an email to google-cloud-sql-discuss+unsubscribe@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/1e71b86c-6ecf-45ae-8f12-73e72d253727n%40googlegroups.com.

[google-cloud-sql-discuss] Re: Import of data to GCP Database is failing with error

You will need to provide more information. Details such as the process you are following and the documentation or references you used would help the community give a better answer.

I will leave this other Groups post, which is a good example of how to write a more informative question and may also contain some hints for your issue.

I would also recommend following the official documentation for Export and import using BAK files, including the Best practices for importing and exporting data.
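The error below indicates the BAK file contains more than one database, and Cloud SQL imports only one database per file; exporting each database to its own BAK on the source SQL Server and importing them separately avoids it. A sketch of a single-database import (instance, bucket, and database names are placeholders):

```shell
# Import a BAK that contains exactly one database; --database names the
# target database on the instance. All names here are placeholders.
gcloud sql import bak my-sqlserver-instance \
  gs://my-bucket/sales.bak \
  --database=sales
```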

On Monday, May 30, 2022 at 11:39:19 AM UTC-5 pri...@signicat.com wrote:
Getting this error when trying to import data into a GCP database:

Multiple databases detected in BAK file. Only importing a single database is supported

--
You received this message because you are subscribed to the Google Groups "Google Cloud SQL discuss" group.
To unsubscribe from this group and stop receiving emails from it, send an email to google-cloud-sql-discuss+unsubscribe@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/8113c1d4-36dd-4c3f-955f-dfb72bf47541n%40googlegroups.com.