Monitoring Your ColdFusion Environment With the Free Log Parser Toolkit

Solve common challenges in ways no other monitoring feature in CF or even the dedicated monitoring tools can do

First note the use of the "group by application" clause, which changes the SQL to group all the records of the same application together. Then note the use in the SELECT clause of "COUNT(*) as NumErrors," which will count how many records there are in each such group (by application), reporting it as a NumErrors column. Finally, note the addition of application to the SELECT list to show the name of each application. The result of this might be:

Application     NumErrors
--------------- ---------
<NULL>          473
cfadmin         9
App1            1
test_admin      60
demoapps        3
ContactManager  1

Of course, if we wanted the list sorted either by the application names or by the number of errors, we could add an "ORDER BY" clause to the SQL.
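If you'd like to experiment with the same grouping-and-sorting logic outside Log Parser, here is a minimal Python sketch of the idea. It's only an illustration, not how Log Parser works internally: the count_errors_by_app function is hypothetical, and it assumes the log is a CSV file with Severity and Application columns, as CF's application.log is.

```python
import csv
from collections import Counter

def count_errors_by_app(log_path):
    """Mimic: SELECT application, COUNT(*) AS NumErrors FROM ...
    WHERE severity = 'Error' GROUP BY application ORDER BY NumErrors DESC."""
    counts = Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            if row.get("Severity") == "Error":
                # CF logs an empty application name for some entries
                counts[row.get("Application") or "<NULL>"] += 1
    # most_common() returns (application, count) pairs, highest count first
    return counts.most_common()
```

The ORDER BY in the SQL corresponds to the sort that most_common() performs here.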

Creating an Output File
The examples to this point have simply output the results to the screen (command line), but you may prefer to write the results to a file instead. You do this using an INTO clause, which is placed at the end of the SELECT (and before the FROM), to name a file, such as:

logparser "select * into test.txt from
C:\CFusionMX\logs\application.log where severity='Error'" -i:csv

Notice here that I've switched back to the earlier example of listing all the errors from the application.log file. Note as well that the INTO clause can specify a full or relative path (or a UNC path) for the file name; if none is provided, the file is simply stored in the current directory where the command is executed.

The file output format will default to what has been shown on screen to this point, which Log Parser calls a "NAT" format (native or tabular data format). But as mentioned at the outset, you can also cause the output (on screen or to a file) to be any of several other formats such as CSV, TSV, and XML. For instance, you could change the output file to an XML file:

logparser "select * into test.xml from
C:\CFusionMX\logs\application.log where severity='Error'" -i:csv -o:xml

Creating an Output File Per Application: Finally Error Files Per App!
A unique feature of the Log Parser tool helps resolve a problem that has long plagued CF administrators. If you have multiple applications and development teams on one server, you (or they) have probably wished you could create a separate error log file for each application. Well, Log Parser makes this a snap, using a feature called "multiplexing."

It's really as simple as using wildcard characters in the INTO clause, which then correspond to the first column listed in the SELECT clause. Continuing the example above, recall that it's listing all the columns for all the messages that reflect an error. We can alter that to name the application column first, and while we're at it we may as well limit it to just that column, the filename, message, and date and time, as in:

logparser "select application,filename,message,date,time into errors_*.txt
from C:\CFusionMX\logs\application.log where severity='Error' " -i:csv -o:csv

Using the data in the log file shown so far, where there were errors for applications such as "App1" and "demoapps," this would create a file such as errors_App1.txt, with the errors for App1, and errors_demoapps.txt, with the errors for demoapps. You'll notice that I've changed the output format to CSV: for some reason, the default NAT format isn't supported for multiplexing, though TSV, XML, and others are. See the product docs for more information. This is a very powerful mechanism. Imagine that after creating each app-specific file, you could go further and send each file to the appropriate application owner.

In fact, you can go further still to create a directory for each application instead. The secret is that you may use the * anywhere in the path. Just as the command above will create new files even if they didn't exist, you could have the tool create new directories as well. Consider the following:

logparser "select application,filename,message,date,time into *\errors.txt
from C:\CFusionMX\logs\application.log where severity='Error' " -i:csv -o:csv

Note the use of "*\errors.txt" in the INTO clause. This will create a new directory for each application and store the file as errors.txt. (You could certainly repeat the * if you wanted the application name in the file name too.) Imagine how you could now give the owners of each application access to "their" directory. And again, the path you name for the output directory could use a UNC path to put the output on a shared server.

(As an aside regarding multiplexing, note that the first column listed is used only for determining the file name and doesn't appear in the result. If you really want it in the result, simply list the column twice.)
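To make the multiplexing behavior concrete, here is a rough Python sketch of what it does conceptually; again, this is an illustration with a hypothetical multiplex_errors helper, not Log Parser's implementation. As described in the aside above, the first selected column picks the output file and is dropped from the rows written.

```python
import csv

def multiplex_errors(log_path, pattern):
    """Write one output file per application: the application value is
    substituted for the * in the pattern (e.g., errors_*.txt becomes
    errors_App1.txt), and, as in Log Parser, the application column
    itself is not repeated in the rows written."""
    writers = {}   # application name -> csv.writer for its file
    opened = []
    try:
        with open(log_path, newline="") as f:
            for row in csv.DictReader(f):
                if row["Severity"] != "Error":
                    continue
                app = row["Application"]
                if app not in writers:
                    out = open(pattern.replace("*", app), "w", newline="")
                    opened.append(out)
                    writers[app] = csv.writer(out)
                writers[app].writerow([row["Message"], row["Date"], row["Time"]])
    finally:
        for out in opened:
            out.close()
```

Passing a pattern like "somedir\*\errors.txt" would similarly mimic the directory-per-application variation (though this sketch doesn't create missing directories, as Log Parser does).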

But How Do I Get Only the Latest Log Entries?
We're about out of space, but I'd like to demonstrate just one more really compelling feature. Regarding the last example, you may be thinking, "this all sounds kind of interesting, but it will stink as the logs grow and grow in size. I wish I could just get the latest log entries." That's a very logical concern. And like the old steak knife commercials on TV, the Log Parser tool makes me want to say, "but wait, there's more!"

The tool has a feature called checkpointing that's really as simple as naming a checkpoint file, in which the tool tracks the row number in the log file where processing stopped. Consider the following:

logparser "select application,application,filename,message,date,time into *\errors_*.txt from
C:\CFusionMX\logs\application.log where severity='Error'" -i:csv -o:csv -iCheckPoint:test.ipc

Note the specification of the "-iCheckPoint" argument, naming a file, in this case "test.ipc." Again, the checkpoint file specification can include a path so it can be put anywhere; otherwise it defaults to the directory where the logparser command is executed. The feature tracks where log processing has stopped and where subsequent log analysis should start. It can be used whether you're outputting the results to screen or file.
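Conceptually, a checkpoint is just a persisted "rows processed so far" marker. Here is a toy Python version of the idea (the read_new_lines function is hypothetical, and a plain line count stands in for Log Parser's own checkpoint file format, which is internal to the tool):

```python
import os

def read_new_lines(log_path, checkpoint_path):
    """Return only the lines added since the last call, tracking how far
    we got in a checkpoint file -- a toy stand-in for -iCheckPoint."""
    already_seen = 0
    if os.path.exists(checkpoint_path):
        with open(checkpoint_path) as cp:
            already_seen = int(cp.read().strip() or 0)
    with open(log_path) as f:
        lines = f.readlines()
    # Remember the new high-water mark for the next run
    with open(checkpoint_path, "w") as cp:
        cp.write(str(len(lines)))
    return lines[already_seen:]
```

Run it twice against a growing log and the second call returns only the entries appended since the first.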

There's one thing to think about when using this feature with the multiplexing (or indeed any log file output) operation: it will overwrite any previously existing file. The feature probably makes most sense when you run the operation periodically and immediately send the log file somewhere else. That way, the recipient will get a new file whenever there's new content of interest to them. Nifty!

Other Log Parser Features: Too Many to Name
I could go on. Clearly, this is a powerful tool, and we've only scratched the surface. I've provided several resources to help you learn more and see more examples. Consider briefly the following additional features:
- Beyond plain log files, the tool can analyze Web server logs, the IIS metabase, the registry, the file system, Windows event logs, Active Directory, and so on
- Input files can be named not only via local and UNC paths but also as URLs. So, for example, you can use this tool to read remote CSV and XML files (including RSS feeds)
- You can pipe the output of command-line tools to be processed as input to the tool
- You can store the report output in a database
- You can generate charts from the tool
- Besides simple tabular output to the screen, output can also be shown in a Windows datagrid window
- You can choose to store the SQL in a file and point to it from the command line, and even pass in parameters to such a file
- You can name more than one file in the FROM (filename,filename), and the tool will automatically connect the files together (a "union" in SQL terms)
- Since you can read in XML files, consider that this lets you perform SQL against the XML, which CFML can't currently do
- The SQL grammar available is quite substantial. I've only used the simplest examples
- It also offers many useful SQL functions, including ones unique to the tool, like PropCount and PropSum
- The -h command-line option also offers help on SQL grammar and shows examples as well
- And this is just some of what it does
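As one small illustration of the "union" behavior in the list above, where naming two files in the FROM clause combines them into one result set, here is how you might do the same by hand in Python (the union_logs helper and file names are hypothetical; the files are assumed to be CSV logs sharing the same columns):

```python
import csv

def union_logs(*paths):
    """Read several CSV logs that share the same columns and yield their
    rows as one stream -- roughly what FROM file1,file2 does in Log Parser."""
    for path in paths:
        with open(path, newline="") as f:
            yield from csv.DictReader(f)
```

Any of the earlier queries (filtering, grouping, multiplexing) could then run over the combined stream just as Log Parser runs its SQL over the unioned files.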

Also, I can think of additional CF-specific kinds of analysis, of various input sources, that I'd like to elaborate on. For instance, I've only shown working with the application.log file, and only the logs in the \cfusionmx\logs directory. Note that there are many other log files in ColdFusion that could be useful to analyze. (You can even use Log Parser's ability to query the file system to report what log files exist.) Of course, it's often useful to analyze Web server logs (whether from IIS, Apache, or the built-in Web server in CFMX) to help identify performance problems.

Beyond that, there's data in the event logs (related to running CF services), as well as possibly data in the registry (such as client variables, perhaps existing unexpectedly). And since the tool can read XML files generated and available via a URL, it could be used to monitor and report on the output of both SeeFusion and FusionReactor, which can output XML files. I'll offer a URL at the end to name a place where I plan to elaborate on such opportunities as an extension to this article.

Resources for Learning More
Clearly, there's much more to learn. Perhaps the most important resource to start with is how to download the tool. There are a couple of links to note. The Microsoft link is www.microsoft.com/technet/scriptcenter/tools/logparser/default.mspx. The tool includes a help file with considerable extra detail. Beyond that, there's also an "unofficial" Web site at www.logparser.com/. It includes a knowledge base, forums, and links to articles.

But the best resource of all, to really learn and appreciate the tool, is the book Microsoft Log Parser Toolkit, available from Syngress at www.syngress.com/catalog/?pid=3110. Since it's a couple of years old, the book is also available used at Amazon. Even so, it's an awesome resource. You'll learn things you never dreamed were possible. You may enjoy the following articles:


  • "How Log Parser 2.2 Works"
    - www.microsoft.com/technet/community/columns/profwin/pw0505.mspx
  • "Analyze Web Stats with Log Parser"
    - www.microsoft.com/technet/technetmag/issues/2006/08/InsideMSFT/
  • "What's New in Log Parser 2.2"
    - www.microsoft.com/technet/scriptcenter/tools/logparser/lpfeatures.mspx

Finally, as I alluded to above, I plan to create an area on my Web site to expand on this discussion and elaborate on more CF-specific aspects of using Log Parser. Look for that at http://www.carehart.org/logparser/. I welcome your feedback.

Editorial Note: Since writing this article, the author has significantly updated that resource, with links to more about Log Parser, at http://www.carehart.org/logparser/.

More Stories By Charlie Arehart

A veteran ColdFusion developer since 1997, Charlie Arehart is a long-time contributor to the community and a recognized Adobe Community Expert. He's a certified Advanced CF Developer and Instructor for CF 4/5/6/7 and served as tech editor of CFDJ until 2003. Now an independent contractor (carehart.org) living in Alpharetta, GA, Charlie provides high-level troubleshooting/tuning assistance and training/mentoring for CF teams. He helps run the Online ColdFusion Meetup (coldfusionmeetup.com, an online CF user group), is a contributor to the CF8 WACK books by Ben Forta, and is frequently invited to speak at developer conferences and user groups worldwide.
