Wednesday, January 30, 2013

Performance diagnostics with ADFLogger

This post will show how to use the ADFLogger for performance analysis and how to instrument your own code so it shows up in this analysis. Every ADF developer should know, or learn, about the ADFLogger. If you are new to the logger, read up on it in the official documentation or the great Adventures in Logging series by Duncan Mills.

Configuring Oracle Diagnostic Logging

The first thing we need to do is configure the proper ADFLogger levels, as the default levels only show warnings and errors. Run your application so the integrated WebLogic server starts. The log window in JDeveloper has an Actions pull-down menu where you can opt to configure Oracle logging:
[Screenshot: configuring Oracle Diagnostic Logging]

This will open an editor on the logging configuration file at DefaultDomain/config/fmwconfig/servers/DefaultServer/logging.xml, which typically lives in JDEV_USER_HOME/system11.1.1.6.38.61.92.

All the logging for performance diagnostics is done at the CONFIG level, so be sure to set the Root logger and the oracle logger to CONFIG. While developing you might even want to crank up the levels for oracle.adf, oracle.adfinternal and oracle.jbo to FINEST, but be aware that this produces a lot more logging, which has a performance impact of its own.
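After saving, the relevant fragment of logging.xml should look roughly like this (an abridged sketch; handler definitions are omitted and exact attributes vary by JDeveloper version):

<loggers>
  <!-- the root logger has an empty name -->
  <logger name="" level="CONFIG"/>
  <logger name="oracle" level="CONFIG"/>
  <!-- optional while developing, very verbose: -->
  <!-- <logger name="oracle.adf" level="FINEST"/> -->
</loggers>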


Run the Badly Performing Application

Once this is done, we return to our simple example application and apply a filter to an af:table that is based on a bean data control:
[Screenshot: filtering an af:table that re-fetches data from the slow data control]

From the logging we can see that the employees are fetched again. This might be a surprise, since the page had already retrieved the list of employees from the bean data control. Why would filtering need to request the data from the data control again? When we experience bad performance in an application, it would be nice if we could quickly analyze the logs and discover where the time is spent.
[Screenshot: logging showing up in the console]

Analyze the Log

Luckily, we instrumented our code with the ADFLogger. This not only writes a line to the log window, but also makes the call part of the log analysis. To see this in action, invoke the log analyzer from the same pull-down menu in the log window:
[Screenshot: starting the log analyzer]

We can now inspect individual requests and drill down into the separate phases. We can quickly see that our fetchEmployees method was executed in this last request and that it took 2973 milliseconds to complete.
[Screenshot: log analyzer showing our own instrumented code]

Instrument Your Code

The key to making this work is to invoke ADFLogger::begin and ADFLogger::end in your code. Do this before and after any code that can be time consuming: requesting a URL, invoking a web service, running very expensive calculations, interacting with the file system, and so on.

Once you invoke ADFLogger::begin you have to make absolutely sure you also invoke ADFLogger::end, even when exceptions occur. For this reason, you would typically use a try-finally construct:
logger.begin("[[DESCRIPTION]]");
try {
  // TODO: do your stuff here
} finally {
  // make sure you always call ADFLogger::end with same description
  logger.end("[[DESCRIPTION]]");
}
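The logger itself is typically created once, as a static field on the class being instrumented. A minimal sketch (MyService is a placeholder for your own class):

import oracle.adf.share.logging.ADFLogger;

public class MyService {
  // one logger per class; its name shows up in the log window and the analyzer
  private static final ADFLogger logger =
      ADFLogger.createADFLogger(MyService.class);
}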

Another nice feature of Oracle Diagnostic Logging is that you can supply additional context with a log message; it will not clutter the default console logging, but it is available in the more advanced viewers such as the log analyzer. In this example you can see the java.specification.version, but in a real application you would typically log information about the resource-intensive task, such as the URL being retrieved, the query being performed or the file being read.
[Screenshot: log analyzer showing the supplied context]
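Passing this context directly to the API looks something like this (a sketch, reusing the fetchEmployees action and the class-level logger from above):

final HashMap<String, String> context = new HashMap<String, String>();
context.put("java.specification.version",
            System.getProperty("java.specification.version"));
logger.begin("fetchEmployees", context);
// ... with a matching logger.end("fetchEmployees") in a finally block, as before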

Unfortunately, ADFLogger::begin requires a full-blown java.util.HashMap for this context. First of all, it is bad practice to require a specific class when the java.util.Map interface would suffice. More annoying is that constructing a HashMap can take quite a few lines of code, which would clutter your Java code. For this, I created a utility class that accepts a set of Strings as varargs. The strings you supply as arguments are alternating keys and values, hence the requirement for an even number of parameters.
import java.util.HashMap;
import java.util.LinkedHashMap;

import oracle.adf.share.logging.ADFLogger;

public class LoggingUtil {
  private LoggingUtil() {}

  public static void begin(final ADFLogger logger, final String action,
                           final String... context) {
    if (context.length % 2 != 0) {
      throw new IllegalArgumentException("even number of context parameters required (alternating keys and values)");
    }
    // declared as HashMap because ADFLogger::begin demands one; a LinkedHashMap
    // (which extends HashMap) preserves the key order in the log viewer
    final HashMap<String, String> map = new LinkedHashMap<String, String>();
    for (int i = 0, cnt = context.length; i < cnt; i += 2) {
      map.put(context[i], context[i + 1]);
    }
    logger.begin(action, map);
  }

  public static void end(final ADFLogger logger, final String action) {
    logger.end(action);
  }
}

This keeps the actual Java code very clean, as constructing the HashMap is tucked away in the utility class:
public List<Employee> getEmployees() {
  LoggingUtil.begin(logger, "fetchEmployees",
                    "java.specification.version",
                    System.getProperty("java.specification.version"));
  try {
    // the expensive part: populates the employees field from a slow source
    fetchFromSlowSource();
    return employees;
  } finally {
    LoggingUtil.end(logger, "fetchEmployees");
  }
}

You can download the full sample application or browse the Subversion repository to look at the source code. When running this application yourself, be sure to set up the logging levels of your WebLogic server appropriately.
