I need to bulk-import data from a SQL file.
The dump is fairly large, around 1 GB of SQL code.
The least error-prone method, and the one that handles big files best,
is the google_sql.sh client on Linux, since it seems to fully support UTF-8.
However, running the SQL script from my own computer seems to
slow the whole process down, because each line to be imported incurs a network round trip.
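One way to cut down the number of round trips is to preprocess the dump before feeding it to the client: merge runs of single-row INSERT statements for the same table into multi-row INSERTs, so each round trip carries many rows. The sketch below is a hypothetical helper under my own assumptions about the dump format (simple `INSERT INTO tbl VALUES (...);` lines); it is not part of google_sql.sh or any Google tool:

```python
import re

# Pattern for a simple single-row INSERT; the exact dump format is an assumption.
_SINGLE_ROW = re.compile(r"^(INSERT INTO \S+ VALUES )\((.*)\);$")

def batch_inserts(statements, batch_size=100):
    """Merge runs of single-row INSERTs into multi-row INSERTs.

    Hypothetical helper: collapses consecutive INSERTs that target the
    same table into one statement of up to batch_size rows, leaving all
    other statements untouched and in their original order.
    """
    out, rows, prefix = [], [], None

    def flush():
        nonlocal rows, prefix
        if rows:
            out.append(prefix + ", ".join(rows) + ";")
        rows, prefix = [], None

    for stmt in statements:
        m = _SINGLE_ROW.match(stmt)
        if m:
            if prefix is not None and m.group(1) != prefix:
                flush()  # different table: emit the batch collected so far
            prefix = m.group(1)
            rows.append("(" + m.group(2) + ")")
            if len(rows) >= batch_size:
                flush()
        else:
            flush()  # non-INSERT statement: pass it through unchanged
            out.append(stmt)
    flush()
    return out
```

For example, three single-row INSERTs into the same table with `batch_size=2` come back as one two-row INSERT followed by a one-row INSERT; whether the server side accepts arbitrarily long multi-row statements would still need checking against its packet-size limits.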
I would prefer to use the "Import data …" feature in the API console.
But its implementation behaves differently :-( and the error messages are not very helpful.
Will there be improvement and consolidation?