This is part 3 of a series looking at customizing Sitecore to use logging to the database instead of the default file-based logging. Refer to the first and second posts to get a full grasp of what I’m talking about here.
I alluded to this earlier, but the log4net ADONetAppender is not asynchronous, despite the buffering behavior discussed briefly in the second post in this series. It stands to reason that logging to a database could hurt performance on a high-traffic site that logs heavily. I recommend load testing at the same logging level you run in production (WARN, ERROR, etc.) to confirm that database logging does not introduce a performance bottleneck. It is also worthwhile to run load tests at more verbose logging levels (such as INFO) to gauge the potential impact on the implementation; if a situation ever requires a short-term change to the logging level, it's important to understand the repercussions in advance.
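For reference, the logging level is controlled in the log4net section of the Sitecore web.config. A hypothetical fragment is below; the appender name is an assumption and should match whatever your configuration actually defines. Changing the root priority is the kind of short-term adjustment worth load testing in advance:

```xml
<!-- Hypothetical excerpt of the <log4net> section in Sitecore's web.config.
     Raising the priority from INFO to WARN cuts log volume considerably;
     re-run the same load tests after any such change to measure the effect. -->
<root>
  <priority value="WARN" />
  <appender-ref ref="LogFileAppender" />
</root>
```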
To make database logging asynchronous, one can build on a custom asynchronous ADONetAppender for log4net such as https://github.com/cjbhaines/Log4Net.Async. This open-source library is not officially part of log4net, nor is it Sitecore-supported code, but it is one viable approach. In my preliminary tests, the project targets a different version of log4net; updating it to be compatible with the log4net build referenced in Sitecore.Logging.dll should be possible, but that exercise is beyond the scope of this write-up (I may revisit it in a follow-up post). This issue thread shows another promising example of adding asynchronous support to log4net: https://issues.apache.org/jira/browse/LOG4NET-407?jql=project%20%3D%20LOG4NET%20AND%20text%20~%20asynchronous.
Given the synchronous nature of the ADONetAppender in log4net, the performance impact of logging to the database deserves close attention.
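The general pattern these asynchronous libraries use is a forwarding appender that queues events on a background thread and relays them to the real appender. A hypothetical configuration sketch follows; the type and assembly names are taken from the Log4Net.Async project and may differ by version, and the "DatabaseAppender" name is an assumption standing in for the ADONetAppender configured earlier in this series. Verify everything against the log4net build shipped in Sitecore.Logging.dll before relying on it:

```xml
<!-- Hypothetical sketch: an asynchronous forwarding appender wrapping the
     existing (synchronous) database appender. Loggers reference the async
     wrapper; the wrapper queues events and writes to the database appender
     from a background thread. -->
<appender name="AsyncDatabaseAppender"
          type="Log4Net.Async.AsyncForwardingAppender, Log4Net.Async">
  <appender-ref ref="DatabaseAppender" />
</appender>
```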
I did a modest analysis of page load times using the configurations described in this blog series. For this analysis, tests were run on a single computer (IIS combined with SQL Server) with JMeter exercising a Sitecore 7.2 site with a basic page writing a randomized INFO, WARN, and ERROR log message through the Sitecore.Diagnostics object. The code ran in the code-behind of the page_load event of a SubLayout with no component caching in effect. The BufferSize setting for the database appender was set to “1” to avoid any buffering effect (except in the specific test as noted below).
A pattern emerges showing a consistently faster page response time, albeit only slightly, for the Stored Procedure approach over the plain SQL text alternative.
Basic JMeter Page Load Analysis
| Scenario | All Logging Disabled | Plain SQL Logging | Stored Proc SQL Logging | File System Logging | File & Stored Procedure | Stored Proc with default BufferSize |
|---|---|---|---|---|---|---|
| 20 Users (4,000 samples) | 149 ms | 180 ms | 169 ms | 158 ms | 150 ms | 147 ms |
| 100 Users (40,000 samples) | 840 ms | 940 ms | 884 ms | 788 ms | 936 ms | 832 ms |
The results show page load time in milliseconds. The data may say more about the workstation the tests ran on than about the performance of one logging implementation over another in a true high-traffic situation, but the relative comparison is informative. Each implementation should, of course, run its own tests to evaluate the particulars of its situation and environment.
The basic analysis done here supports the concept of using a Stored Procedure when logging to a database instead of plain SQL, and taking advantage of the BufferSize setting. It’s interesting that having logging entirely disabled is nearly the same as logging to the database using a Stored Procedure and the default BufferSize value (512 messages buffered at a time). Again, these are conclusions drawn from a limited set of test data.
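For reference, the buffer size is set on the appender itself. The fragment below is a hypothetical sketch (appender name assumed; full connection and command configuration as described in the second post of this series); log4net accumulates this many events before writing them to the database in one batch:

```xml
<!-- Hypothetical fragment of the database appender configuration.
     512 is the log4net default; setting the value to 1 flushes every
     message immediately, as was done for most of the tests above. -->
<appender name="DatabaseAppender" type="log4net.Appender.ADONetAppender">
  <bufferSize value="512" />
  <!-- connectionString, commandText, parameters, etc. as configured earlier -->
</appender>
```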
A general rule drawn from this analysis: compared with the Sitecore installation defaults (logging to the local file system), logging to a database as covered in this blog series may add roughly 0.05 seconds (50 milliseconds) to overall page processing time. This is a rough figure, but it can serve as a starting point for planning a Sitecore logging strategy; each implementation is different, so running your own performance tests is critical.
Typically, Sitecore automatically cleans up the text-based log files on a regular basis. One should have a clean-up routine in place when logs are stored in the database, too. Consider a scheduled SQL Server Agent job that runs clean-up logic such as:
-- Delete rows more than 30 days old; the DATEADD/DATEDIFF idiom truncates GETDATE() - 30 to midnight
DELETE FROM CustomLog WHERE [Date] < DATEADD(day, DATEDIFF(day, 0, GETDATE() - 30), 0)
This example removes data more than 30 days old from the log table. The frequency of the clean-up and the exact SQL can be tailored to the parameters of the implementation (and the SQL Server capacity and environment). Backing up the logged data before cleaning up may be in order, for example if records must be kept for long-term audit purposes, or simply to be able to review changes over time.
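One way to keep long-term records is to move old rows into an archive table before deleting them. The sketch below is hypothetical: it assumes a CustomLogArchive table exists with the same column layout as the CustomLog table defined earlier in this series, and it wraps the copy and delete in one transaction so rows are never lost between the two statements:

```sql
-- Hypothetical sketch: archive, then delete, log rows more than 30 days old.
-- Assumes CustomLogArchive has the same schema as CustomLog.
BEGIN TRANSACTION;

INSERT INTO CustomLogArchive
SELECT * FROM CustomLog
WHERE [Date] < DATEADD(day, DATEDIFF(day, 0, GETDATE() - 30), 0);

DELETE FROM CustomLog
WHERE [Date] < DATEADD(day, DATEDIFF(day, 0, GETDATE() - 30), 0);

COMMIT TRANSACTION;
```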
This exploration of logging to the database in Sitecore is representative of what my group does a lot of at Sitecore. A customer might have a single line on a requirements document stating: “log to the database” and then we get involved to understand why the customer wants to do it and how a customer might best make it happen. Like nearly every facet of software development, things can get complicated beneath the surface. Be suspicious of quick and easy answers, and this logging conversation is packed with nuance. At some point (especially if a customer needs it!), I may elaborate on the asynchronous logging approach I sketched out above — but with the performance numbers I generated, I feel confident that database logging isn’t a non-starter for a high-traffic site. The potential is there, of course, for database logging to be a bottleneck so one must be very deliberate in how or when to pursue it.