Java Application Instrumentation with Log4J : Page 3

Learn how to instrument your Java applications using log4j and regular expressions. It's as simple as mining your logs for application and performance metrics—in real time.


Implementing a Log Interceptor Using Log4j
In order to step through an actual log interceptor implementation, let's first establish some key abstractions (see Figure 1):
  • Reading
    A measurement of time, memory, and scraped log value at a point in time
  • Metric
    A data-transfer object for one or two readings (A metric triggers a change to a statistic value.)
  • Filter
    A filter to be applied to every log message (It defines a match expression for detecting significant text and an optional scrape pattern for scraping a numeric value from a message.)
  • Statistic
    A user-defined statistic of interest (It specifies one or two filters and a calculator, for example, number of users logged onto the system or method duration.)
  • Calculator
    A component for deriving statistic results from metrics (For example, Count or Average)
  • MetricCollector
    The central component of the log interception framework, responsible for detecting matches, turning matches into readings and metrics, and deriving statistics

 
Figure 1. Class Diagram of Key Abstractions
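To make these roles concrete, here is a minimal sketch of how the abstractions above might be declared in Java. The names mirror the list and Figure 1, but the exact fields and signatures are assumptions rather than the stat4j source:

// Minimal sketch of the key abstractions (assumed shapes, not the stat4j source)
public class Reading {
    long timestamp;          // when the match occurred
    long freeMemory;         // JVM free memory at that point
    Double scrapedValue;     // optional value scraped from the log message
}

public class Metric {
    Reading first;
    Reading second;          // null for single statistics
    public Metric(Reading only) { this.first = only; }
    public Metric(Reading first, Reading second) {
        this.first = first;
        this.second = second;
    }
}

public class Filter {
    java.util.regex.Pattern match;    // detects significant text
    java.util.regex.Pattern scrape;   // optional, extracts a numeric value

    public boolean isMatch(String log) {
        return match.matcher(log).matches();
    }
}

public interface Calculator {
    void apply(Metric metric);        // fold a metric into the statistic result
    double getResult();
}

public class Statistic {
    String name;
    Filter first;
    Filter second;            // only set for dual statistics
    Calculator calculator;
    boolean local;            // correlate readings within one thread?

    public boolean threadLocal() { return local; }
}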

Statistics may be classified as single or dual. Single statistics can be derived from a single match in a log (e.g., counts and rates) and therefore specify a single filter. Dual statistics can be derived from two related matches in a log (e.g., deltas, durations, memory consumption, and running counts) and specify two filters. Calculations such as minimum, maximum, and average can be applied to either type. The unit of measurement can be over whatever data the JVM or the log provides, including:

  • Time (e.g., Duration)
  • Memory (e.g., Memory consumption)
  • Occurrences: number of times a match occurs (e.g., count and running count)
  • A user defined value scraped from the log message itself
  • Combination of the above (e.g., rate: occurrences over time)
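For example, all three measurements can be captured in a single reading at the moment a filter matches. The snapshot below is a sketch using the same JVM calls described later in the Log Scraping section; the field names are assumptions:

// Sketch: take a snapshot for a reading when a filter matches.
// 'log' and 'scrapePattern' are the matched message and the statistic's
// optional scrape expression.
Reading reading = new Reading();
reading.timestamp    = System.currentTimeMillis();                  // time
reading.freeMemory   = Runtime.getRuntime().freeMemory();           // memory
reading.scrapedValue = scrapUserDefinedValue(log, scrapePattern);   // user-defined value (may be null)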



To understand how a statistic may be derived, take a simple example: number of users logged onto a system. Imagine that every time a user logs in or out a log message is recorded in the log of the form "login {user}" or "logout {user}". To derive the statistic, each log message is filtered for occurrences of either pattern. If a login match is found, then the number of users is incremented. If a logout pattern is found, then the number of users is decremented.

Statistic Configuration
The simplest way to set up pattern matches is to adopt a configuration approach where statistics and patterns are configured in a property file. Regular expressions can be loaded and compiled when the log interceptor is created. (Note that if your logs are precisely structured, then you can use String functions in lieu of regular expressions.)

Define a statistic 'UserCount' to represent the number of users currently logged on to the system by setting up two filters, one for each pattern:

statistic.UserCount.description=Number of users logged onto the system
statistic.UserCount.calculator=runningcount
statistic.UserCount.first.match=login .*
statistic.UserCount.second.match=logout .*

A match on either pattern will be sent to the runningcount calculator for processing.
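Below is a sketch of how such a property file might be loaded and its expressions pre-compiled when the interceptor is created. The property names match the example above; the file name and the createCalculator() factory are assumptions:

// Sketch: load a statistic definition and pre-compile its match expressions.
protected Statistic loadUserCount() throws Exception {
    Properties props = new Properties();
    props.load(new FileInputStream("stat4j.properties"));   // file name assumed

    Statistic stat = new Statistic();
    stat.name = "UserCount";
    stat.calculator = createCalculator(
        props.getProperty("statistic.UserCount.calculator"));   // hypothetical factory

    stat.first = new Filter();
    stat.first.match = Pattern.compile(
        props.getProperty("statistic.UserCount.first.match"));

    stat.second = new Filter();
    stat.second.match = Pattern.compile(
        props.getProperty("statistic.UserCount.second.match"));

    return stat;
}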

Extending Log4j, Creating an Appender
To plug in your own log-processing component, create an Appender class that extends the log4j adapter class AppenderSkeleton. Log messages configured to go to the appender will be passed in via the append() method. To intercept log messages, implement the append() method and redirect messages into the metric collection mechanism. Be careful not to consume your own logs—ignore all messages that come from your own components:

public class Stat4jAppender extends AppenderSkeleton {

    protected static String c;

    public Stat4jAppender() {
        c = Util.getCategory();
    }

    protected void append(LoggingEvent logEvent) {
        // don't scrape our own logs
        if (logEvent.categoryName.equals(c)) return;

        // direct log to metric capture mechanism
        MetricCollector.getInstance().applyLog(
            logEvent.getRenderedMessage());
    }
    ..
}

To hook into log4j, configure an appender (stat4j) in your log4j configuration file (the complete source for stat4j is available with this article):

log4j.appender.stat4j=au.com.stat4j.log4j.Stat4jAppender

Configure the stat4j appender for all log categories that you want intercepted. To avoid the I/O associated with other appenders, set the additivity flag to false. This will guarantee that target log messages are sent to the log interceptor and not passed to any other appenders:

log4j.logger.test=DEBUG,stat4j
log4j.additivity.test=false

At this point you have set up a basic log interceptor using the log4j Appender mechanism. All messages logged to the test category will be sent to the Stat4jAppender, which will forward them into the metric collector class.
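For example, any component that logs against the test category now feeds the interceptor rather than a file or console; the messages below are only illustrative:

// Any logger under the "test" category now routes through Stat4jAppender.
Logger logger = Logger.getLogger("test");
logger.info("login fred");    // matches the UserCount first filter
logger.info("logout fred");   // matches the UserCount second filter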

Detecting Matches and Generating Readings
Create a main processing class (MetricCollector) for managing log filtering, reading generation, and statistic derivation. Use a singleton pattern to ensure one instance per JVM; that way, data can be centralized and correlated across threads:

public class MetricCollector {

    public void applyLog(String log) {
        // Iterate over filters and see if
        // a match has occurred
        for (int i = 0; i < filters.length; ++i) {
            if (filters[i].isMatch(log)) {
                generateMetrics(filters[i], log);
            }
        }
    }
    ..
}

For efficiency, ensure that statistic filter patterns are compiled and cached, and that duplicate patterns are consolidated, so that each message is processed only once per distinct pattern. If a match is detected, a reading can be generated for each statistic to which the pattern applies.
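One simple way to achieve this is a pattern cache keyed by the expression text, along the lines of the getPattern() helper used by the scraping code later in this article. The cache implementation shown here is an assumption:

// Sketch: compile each distinct regular expression once and reuse it.
private static final Map patternCache = new HashMap();

protected static synchronized Pattern getPattern(String expression) {
    Pattern p = (Pattern) patternCache.get(expression);
    if (p == null) {
        p = Pattern.compile(expression);
        patternCache.put(expression, p);
    }
    return p;
}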

The statistic calculator, in conjunction with the statistic type (single or dual), will determine whether a single or a dual match is sufficient to trigger a change to a statistic value. A dual statistic always defines two filters (regular expressions). If a single match is sufficient, a metric can be constructed immediately and forwarded on to the appropriate calculator to process. If a dual reading is required, the first reading will be cached, and only when the second filter matches will the second reading be created and a metric constructed and forwarded. This approach requires that readings for dual statistics be cached until the second reading is generated.

The advantage of creating metrics across multiple readings is that changes in time, memory, or scraped values are encapsulated inside the metric. Imagine the MetricCollector as part of a larger agent-based architecture. Metrics can be collated at the point of capture and then forwarded on to a centralized server. By deferring metric creation until the second reading, the amount of traffic between the agent (MetricCollector) and the server is reduced.
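The sketch below shows one way generateMetrics() might implement this flow. The statisticsFor() and takeReading() helpers are hypothetical, and calculators that accept a metric on every match (such as runningcount, discussed at the end of this article) are omitted for brevity:

// Sketch: turn a filter match into a metric, caching the first reading
// of a dual statistic until its partner match arrives.
protected void generateMetrics(Filter filter, String log) {
    // statisticsFor() is a hypothetical lookup from filter to interested statistics
    for (Iterator it = statisticsFor(filter).iterator(); it.hasNext();) {
        Statistic stat = (Statistic) it.next();
        Reading reading = takeReading(stat, log);   // time, memory, optional scraped value

        if (stat.second == null) {
            // single statistic: one match is enough to build a metric
            stat.calculator.apply(new Metric(reading));
        } else if (filter == stat.first) {
            // dual statistic: hold the first reading until the second filter matches
            getReadingCache(stat).put(stat.name, reading);
        } else {
            Reading first = (Reading) getReadingCache(stat).remove(stat.name);
            if (first != null) {
                stat.calculator.apply(new Metric(first, reading));
            }
        }
    }
}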

Thread-global vs. Thread-Local
For a dual statistic (duration, for example), you need to specify whether the measurement applies within a given thread (e.g., method call duration) or across all threads (e.g., user session duration). A typical enterprise application has multiple concurrent threads that may be calling the same method and logging the same messages. To generate an accurate metric for a method call's duration, the start and end readings (matching the start and end logs from the method itself) must be correlated within a particular thread. For low-level statistics such as these, you cannot mix and match readings across threads.

Because the log interceptor runs in the same process space as the application, the thread that generates a log message is also the thread that passes it into the interceptor. This means the interceptor can leverage Java's ThreadLocal to cache all readings made for a particular thread. In a thread-pooled environment this works quite well, as the number of threads—and therefore the number of caches—is bounded.

In the MetricCollector, back up your reading cache with ThreadLocal. If the statistic is collected locally, use the ThreadLocal cache. Otherwise, default to the global cache:

public class MetricCollector {
    ...
    // Reading caches - one for within threads
    // and the other global
    protected ThreadLocal localReadingCache = new ThreadLocal() {
        protected synchronized Object initialValue() {
            return new HashMap();
        }
    };

    protected Map readingCache;
    ..

    protected Map getReadingCache(Statistic statistic) {
        if (statistic.threadLocal()) {
            // look for match locally
            return (Map) localReadingCache.get();
        } else {
            return readingCache;
        }
    }
    ..

Sometimes, pairing up readings can be difficult because the correlation is context sensitive. For example, session durations can be measured only if readings can be correlated across logins and logouts for a particular user. To support these semantics, you must be able to tag a reading dynamically with a value from the log message that triggered it (e.g., the user name).
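One way to support this is to key cached readings by a correlation value scraped from the message rather than by the statistic name alone. The sketch below illustrates the idea; the scrapText() helper and key format are assumptions:

// Sketch: correlate dual readings by a value scraped from the log message,
// e.g. the user name in "login fred" / "logout fred".
String user = scrapText(log, "(?:login|logout) (.*)");   // hypothetical text scraper
String key = stat.name + ":" + user;                     // e.g. "SessionDuration:fred"

if (filter == stat.first) {
    getReadingCache(stat).put(key, reading);
} else {
    Reading first = (Reading) getReadingCache(stat).remove(key);
    if (first != null) {
        stat.calculator.apply(new Metric(first, reading));
    }
}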

Unfortunately, log interception can't tell you when a log hasn't been generated. For a dual statistic, it is quite difficult to detect that the first filter has matched but the second never will (for example, because the thread was consumed by a rogue process). Only by monitoring how long a reading has been cached, and setting an upper limit on a reading's lifetime, can such situations be detected and alerted on.
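A periodic sweep over the reading cache is one way to enforce such a limit. The lifetime value and the alert hook below are assumptions:

// Sketch: evict and alert on readings that have waited too long for a partner match.
private static final long MAX_READING_LIFETIME_MS = 5 * 60 * 1000;   // assumed limit

protected void sweepStaleReadings(Map cache) {
    long now = System.currentTimeMillis();
    for (Iterator it = cache.entrySet().iterator(); it.hasNext();) {
        Map.Entry entry = (Map.Entry) it.next();
        Reading reading = (Reading) entry.getValue();
        if (now - reading.timestamp > MAX_READING_LIFETIME_MS) {
            it.remove();
            // the second filter never matched for this reading
            alertStaleReading((String) entry.getKey(), reading);   // hypothetical alert hook
        }
    }
}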

Log Scraping
Reading measurements can be generated using System.currentTimeMillis() for time and Runtime.getRuntime().freeMemory() for memory. However, sometimes the value that you want measured is in the log message itself. You can use regular expressions to scrape the value from the text. If a statistic's filter specifies a scrape expression, then it will be used to extract a numeric reading value. For example, to scrape the number from the message "Time to process post=5.0", use the scrape expression ".*post=([0-9].*)+". The reading will contain a value of 5.0:

public Double scrapUserDefinedValue(String text, String pattern) {
    // if the text is empty or null just return no reading
    if ((text == null) || (text.length() == 0) ||
        (pattern == null) || (pattern.length() == 0)) {
        return null;
    }

    // get pre-compiled pattern
    Pattern p = getPattern(pattern);

    // Create a matcher
    Matcher m = p.matcher(text);
    try {
        if (!m.matches())
            return null;

        // Get first match group
        String str = m.group(1);

        // Use the formatter to parse out the value. The formatter will
        // take account of the country-specific formatting of the number,
        // whereas Double.parse will not.
        // (format is presumably a ThreadLocal holding a per-thread DecimalFormat.)
        DecimalFormat df = (DecimalFormat) format.get();
        Number value = df.parse(str);
        return new Double(value.doubleValue());
    } catch (Exception ex) {
        return null;
    }
}
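Called with the example message and scrape expression from above, the method returns 5.0:

// Example call using the message and scrape expression from the text above.
Double value = scrapUserDefinedValue("Time to process post=5.0", ".*post=([0-9].*)+");
// value is 5.0 (or null if the pattern had not matched)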

The UserCount statistic is slightly unusual. It is a dual statistic, but a filter match on either pattern (login or logout) will elicit a change to the number of users. So a metric is constructed and passed on to the runningcount calculator for every match.
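A running-count calculator along these lines would increment on a login match and decrement on a logout match. The sketch below assumes the MetricCollector tags each metric with the filter that produced it (the fromFirstFilter flag is an assumption, and this is not the stat4j source):

// Sketch: a running-count calculator. Metrics from the first filter (login)
// increment the count; metrics from the second filter (logout) decrement it.
public class RunningCount implements Calculator {

    private long count;

    public synchronized void apply(Metric metric) {
        if (metric.fromFirstFilter) {
            count++;      // e.g. "login fred"
        } else {
            count--;      // e.g. "logout fred"
        }
    }

    public synchronized double getResult() {
        return count;
    }
}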


