Hi, for some reason we generate a huge volume of error logs, around 500K lines per day. The problem is that the database has kept logs since day 1, and there seems to be no way to clean them or to configure the maximum number and size of the log files.
None of the options described in https://learn.microsoft.com/en-us/sql/database-engine/configure-windows/scm-services-configure-sql-s... are available, probably because of limited permissions on the underlying instance.
I suspect these logs are consuming space on the SQL Server instance, because the disk usage reported in gcloud monitoring is roughly 5 times larger than the accumulated size of our tables.
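One way to narrow down where the space is going is to compare the allocated size of each database file against your table sizes. This is a sketch, not verified on Cloud SQL; it assumes your user is allowed to read sys.master_files:

```sql
-- Sketch: list the allocated size of every database file on the instance.
-- sys.master_files reports size in 8 KB pages, hence the * 8 / 1024.
-- Note: the SQL Server error log is an OS-level file and will NOT appear
-- here; if the total below is far smaller than gcloud's disk metric, the
-- gap comes from files outside the databases (such as the error logs).
SELECT
    DB_NAME(database_id) AS database_name,
    name                 AS logical_file_name,
    type_desc,                        -- ROWS = data file, LOG = transaction log
    size * 8 / 1024      AS size_mb
FROM sys.master_files
ORDER BY size_mb DESC;
```

If the LOG rows dominate, the growth is in transaction logs rather than error logs, which would point to a different fix.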
Any solution or workaround?
Thanks.
Hi @mcanaleta,
Welcome to Google Cloud Community!

You can try limiting the error log size and the number of retained log files with sp_configure:
EXEC sp_configure 'max error log size', '{size_in_mb}'
RECONFIGURE
EXEC sp_configure 'max number of error log files', '{number_of_files}'
RECONFIGURE
Hi @christianpaula! Thanks for your reply. We already tried the solutions you suggested, without success, because of permissions:
I understand that managing the error log requires the sysadmin role in SQL Server, and according to the Cloud SQL documentation (https://cloud.google.com/sql/docs/sqlserver/users) the sysadmin role is not supported.
Thanks.