Friday, April 13, 2012

Re: Import data … → same implementation as google_sql.sh?

Hi Johannes,
Yes, the round-trip time makes importing 1 GB of data via google_sql.sh impractical.
If you can tell me your instance name offline, I can find out what went wrong during the import from Google Storage.

We're aware of the lack of error messages during import. It's definitely not a good user experience, and we do plan to fix it.

On Fri, Apr 13, 2012 at 7:55 AM, Johannes Braunias <johannes.braunias@gmail.com> wrote:
Hello,

I need to import a bulk of data from a SQL file.
The file is fairly large, around 1 GB of SQL statements.

The least error-prone method, and the one that handles big files best,
is the google_sql.sh client on Linux, since it seems to fully support UTF-8.

However, it seems to me that running the SQL script from my computer
slows the whole process down because of the round-trip time incurred for each line imported.
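The per-line round trips described above can be amortized by sending many rows per statement. A minimal sketch of that batching idea, using Python's sqlite3 module purely as a stand-in for a Cloud SQL (MySQL) connection, with an illustrative table `t` and batch size (all assumptions, not part of the original thread):

```python
import sqlite3

def batch_insert(conn, rows, batch_size=500):
    """Insert rows in batches so each round trip carries many rows
    instead of one statement per line of the dump (illustrative only)."""
    cur = conn.cursor()
    for i in range(0, len(rows), batch_size):
        batch = rows[i:i + batch_size]
        # One batched call per group of rows rather than one call per row.
        cur.executemany("INSERT INTO t (id, name) VALUES (?, ?)", batch)
    conn.commit()

# Demo with an in-memory database standing in for the remote instance.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")
rows = [(i, "row-%d" % i) for i in range(1200)]
batch_insert(conn, rows)
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 1200
```

With a remote instance, the savings come from issuing far fewer network round trips; the same pattern applies to multi-row `INSERT ... VALUES (...), (...)` statements in a MySQL dump.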

I would prefer to use the "Import data …" feature in the APIs Console.
But its implementation behaves differently :-( and the error messages are not very helpful.
Will there be improvements and consolidation of the two import paths?

