Saturday, August 25, 2018

[google-cloud-sql-discuss] Re: Output DataPrep job to PostGres

Errrr, so if I kicked off a Dataflow job and it's been running for over 13 hours... should I just kill it? I ran the same job with a smaller data set and it completed in 20 minutes; with the larger data set it's now been about 14 hours and it's still not done.

On Wednesday, August 22, 2018 at 3:55:34 PM UTC-4, Jordan (Cloud Platform Support) wrote:
You can find examples on Stack Exchange of writing to a Postgres database via the Java JdbcIO transform. If you are not using Java, then it is recommended to post an official feature request with the Apache Beam team in their Issue Tracker.

As a workaround, you can always use the TextIO transform to write the data to Google Cloud Storage in something like a .csv file. Then set up a trigger that runs a simple function in Google Cloud Functions that reads the file and writes it to your Postgres database.
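The core of that Cloud Function is parsing the uploaded .csv and turning each row into an insert against Postgres. A minimal sketch of the parsing step, using only the Python standard library (the function name, table name, and sample columns here are hypothetical; a real function would also download the file from Cloud Storage and execute the statements through a driver such as psycopg2):

```python
import csv
import io

def rows_to_inserts(csv_text, table):
    """Turn CSV text (header row first) into one parameterized INSERT
    statement plus the per-row value tuples, ready to hand to a
    Postgres driver. Column names are taken from the header row."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    cols = ", ".join(header)
    placeholders = ", ".join(["%s"] * len(header))
    sql = f"INSERT INTO {table} ({cols}) VALUES ({placeholders})"
    return sql, [tuple(row) for row in reader]

# What the function would do after fetching the object from Cloud
# Storage (the download and the actual execute() calls are omitted):
sql, values = rows_to_inserts("id,name\n1,alice\n2,bob\n", "users")
# sql    -> INSERT INTO users (id, name) VALUES (%s, %s)
# values -> [('1', 'alice'), ('2', 'bob')]
```

Using a parameterized statement and passing the tuples to something like psycopg2's `executemany` keeps the values properly escaped instead of splicing them into the SQL string.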

--
You received this message because you are subscribed to the Google Groups "Google Cloud SQL discuss" group.
To view this discussion on the web visit https://groups.google.com/d/msgid/google-cloud-sql-discuss/564ab5c4-08ea-4c18-9163-d16696c83800%40googlegroups.com.
