Friday, May 17, 2019

[google-cloud-sql-discuss] Re: About mysql.slow_log output

Hello


Thank you for using Google Groups!


For #1, enabling the general_log flag and setting the log_output flag to FILE sends the log output to Stackdriver Logging, where you can view it with the Logs Viewer in the Google Cloud Platform Console, as indicated here; Cloud SQL does not expose the underlying log files on the instance's file system. Additionally, if you wish to download these logs locally, you may refer to the documentation found here, which gives the official instructions for downloading logs in JSON or CSV format. You can also export the log entries directly to BigQuery from the GCP Console by creating a sink with BigQuery as its destination, as indicated here. A minimal sketch of these steps with the command-line tools follows below.
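
For illustration only, this is a rough sketch using the gcloud and bq command-line tools. The instance name (my-instance), project (my-project), and dataset (cloudsql_logs) are placeholders, and you should confirm the exact log name for your instance in the Logs Viewer before creating the sink.

# Enable the general and slow query logs and send them to Stackdriver (log_output=FILE).
# long_query_time=2 is just an example threshold in seconds.
gcloud sql instances patch my-instance \
  --database-flags=general_log=on,slow_query_log=on,log_output=FILE,long_query_time=2

# Create a BigQuery dataset to receive the exported entries (placeholder names).
bq mk --dataset my-project:cloudsql_logs

# Create a sink that routes the instance's slow query log entries to that dataset.
# Verify the logName shown in the Logs Viewer; this filter assumes the usual
# cloudsql.googleapis.com/mysql-slow.log name.
gcloud logging sinks create cloudsql-slow-log-sink \
  bigquery.googleapis.com/projects/my-project/datasets/cloudsql_logs \
  --log-filter='resource.type="cloudsql_database" AND logName:"cloudsql.googleapis.com%2Fmysql-slow.log"'

Note that a sink only exports entries written after it is created, and the sink's writer service account (printed when the sink is created) needs write access (for example, the BigQuery Data Editor role) on the destination dataset.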

For #2, I believe defining a schema in BigQuery would allow you to structure your logs uniformly between both products. Feel free to explore setting up a schema for the logs and determine whether it fits your use case of structuring the entries like mysql.slow_log; a rough sketch of one approach follows below.
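
As a sketch rather than an official recommendation: once the sink has written entries into BigQuery using the default Cloud Logging export schema, you could reconstruct slow_log-like columns by parsing the textPayload of the entries that carry the timing summary line ("# Query_time: ... Lock_time: ... Rows_sent: ... Rows_examined: ..."). The table name below is a placeholder for whatever date-sharded tables the sink creates in your dataset.

# Extract query_time, lock_time, rows_sent and rows_examined from the exported
# slow log entries. The wildcard table name is a placeholder; adjust it to the
# tables the sink actually created.
bq query --use_legacy_sql=false '
SELECT
  timestamp,
  CAST(REGEXP_EXTRACT(textPayload, r"Query_time: ([0-9.]+)") AS FLOAT64)   AS query_time_s,
  CAST(REGEXP_EXTRACT(textPayload, r"Lock_time: ([0-9.]+)") AS FLOAT64)    AS lock_time_s,
  CAST(REGEXP_EXTRACT(textPayload, r"Rows_sent: ([0-9]+)") AS INT64)       AS rows_sent,
  CAST(REGEXP_EXTRACT(textPayload, r"Rows_examined: ([0-9]+)") AS INT64)   AS rows_examined
FROM `my-project.cloudsql_logs.cloudsql_googleapis_com_mysql_slow_log_*`
WHERE textPayload LIKE "%Query_time:%"
ORDER BY query_time_s DESC
LIMIT 20'

Because the exported entries are split line by line, this only recovers the numeric columns; the SQL text itself lands in separate entries, so reconstructing it fully would need additional grouping logic.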


If you have any specific technical issues or inquiries related to this, you may find it best to post on Stack Overflow with the appropriate Google Cloud Platform tags, so that you get the best possible support from the community for any issues you encounter.


I hope this helps!



On Monday, May 13, 2019 at 2:07:18 PM UTC-4, Tuan Vu wrote:
I have 2 questions about Cloud SQL's (2nd generation, MySQL 5.7) slow_log:

1. When I set general_log=TRUE and log_output=FILE, I can view the logs in Stackdriver Logging, but where is the log file itself saved?
According to the MySQL documentation, the log files should be written to:
general_log_file=/var/lib/mysql/localhost.log
/var/log/mysql-slow.log

But how can I find the files that are written there?

2. When I set log_output=FILE and export the slow_log to BigQuery (by creating a sink), the columns in the BigQuery table are different from the mysql.slow_log table, so it is impossible to query BigQuery to calculate query time, etc.

Below is the column information for the mysql.slow_log table.

CREATE TABLE `slow_log` (
  `start_time` timestamp(6) NOT NULL DEFAULT CURRENT_TIMESTAMP(6) ON UPDATE CURRENT_TIMESTAMP(6),
  `user_host` mediumtext NOT NULL,
  `query_time` time(6) NOT NULL,
  `lock_time` time(6) NOT NULL,
  `rows_sent` int(11) NOT NULL,
  `rows_examined` int(11) NOT NULL,
  `db` varchar(512) NOT NULL,
  `last_insert_id` int(11) NOT NULL,
  `insert_id` int(11) NOT NULL,
  `server_id` int(10) unsigned NOT NULL,
  `sql_text` mediumblob NOT NULL,
  `thread_id` bigint(21) unsigned NOT NULL
) ENGINE=CSV DEFAULT CHARSET=utf8 COMMENT='Slow log'

All of the above information ends up in the textPayload column, but it is segmented, as shown in the attached capture.

So my question is: is there any way to export the log with a structure like mysql.slow_log?

I have a 2nd Generation MySQL instance in Google Cloud SQL. I've turned on slow query logging with log_output set to TABLE, and I'd like to read from that table.

