Monitoring Your ColdFusion Environment With the Free Log Parser Toolkit

Solve common challenges in ways that no other monitoring feature in CF, or even the dedicated monitoring tools, can match

There are many resources we should analyze to ensure optimal ColdFusion operation or to help diagnose problems. Fortunately, there's an awesome free tool that comes to our aid to turn voluminous data into useful information.

In this article, I'd like to introduce you to the free Log Parser tool from Microsoft. Yes, it's free. And while you may not run ColdFusion on Windows, that's okay. You can use it on a Windows machine to monitor resources on a Linux machine. The tool applies just as well to those running BlueDragon or any CFML, PHP, .NET, or other environment.

I'll show you the many ways you can use the tool to solve common challenges in ways that no other monitoring feature in CF, or even dedicated monitoring tools like SeeFusion or FusionReactor, can match. In one example, I'll show you how it can provide application-specific error log information that many struggle to obtain.

Basics of the Tool
Despite its name, the tool is about more than just log files. What kind of resources can Log Parser monitor? It started out focused on Web server log files, and indeed it can do that (in IIS, W3C, and NCSA formats), but it can also analyze all manner of tabular text files (CSV and TSV), as well as XML files, the Windows Registry, the file system, the IIS metabase, the Windows event logs, the Windows Active Directory, and NetMon files.
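
For instance, here's a quick sketch of a query against the file system input format, which would list the ten largest files under the CFMX logs directory (the Path and Size field names and the -i:FS argument are as documented in my copy of the tool; logparser -h -i:FS will list what your version offers):

logparser "select top 10 Path, Size from C:\CFusionMX\logs\*.* order by Size desc" -i:FS

That can be a handy way to spot a log that's growing out of control.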

And while it can produce simple textual results, it can also generate output to a file in plain text, CSV, TSV, or XML formats, as well as produce charts and store results in a database.

But the best, most compelling, and truly unique aspect of the tool is how you go about analyzing the input files. There's no graphical interface for the tool. So how do you describe what data to retrieve? Would you believe you use SQL? That makes it an especially attractive tool for CFML developers, since we're used to using SQL already.

I'd like to focus here on particular forms of information that would be useful for CF developers and administrators to analyze using the tool, so I won't focus on its installation or basic use. I'll direct you to other resources at the end of this article that will get you started.

I'd just like to clarify that Log Parser is meant to be used at the command line (logparser.exe). I'll assume that you have it installed and configured so you can issue commands such as its help option:

logparser -h

Of course, you can also use such a tool from the CFEXECUTE tag, but again that's beyond the focus of this article. I'll point you to a resource later that will cover such additional details.

Analyzing CF Log Files
It may seem odd at first to contemplate using SQL to analyze log files and the other non-database resources mentioned above, but it really is effective. While most of the existing resources that introduce the tool focus on analyzing Web server logs, I'd like to start by showing an analysis of ColdFusion logs.

Now, how could a tool that analyzes log files regard the columns in that file as columns to be used in SQL? Well, many log files do have a first line called a "header" line that provides a list of names that identify each column in the log file.

Even if the log file doesn't offer a header line, the Log Parser tool has a clever mechanism to analyze the first 10 lines of the file to try to determine what kind of data is in each column and create generic column names.
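
You can also control that behavior yourself. Here's a sketch, using a hypothetical headerless file (the -headerRow and -dtLines parameters are documented for the CSV input format; logparser -h -i:csv will confirm them on your copy):

logparser "select * from C:\temp\noheader.csv" -i:csv -headerRow:OFF -dtLines:25

Here -headerRow:OFF tells the tool there's no header line (so it names the columns generically, Field1, Field2, and so on), and -dtLines:25 asks it to examine 25 lines, rather than the default 10, when guessing each column's datatype.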

For this article, let's consider the application.log file that tracks errors in your CFML code and is stored in the "logs" directory where CFMX is installed. On my machine, that's c:\cfusionmx\logs. To query all the columns in all the records of that file, I could issue:

logparser "select * from C:\CFusionMX\logs\application.log" -i:csv

Note the familiar "select *" SQL statement that says "select all columns" and the "from" clause that names the actual log file to be processed, all of which is embedded in double quotes. The additional "-i:csv" argument tells the logparser engine that the file is a CSV (comma-separated value) format file. A subset of the result as it might appear is:

Filename RowNumber Severity ThreadID Date Time Application Message
--------------------------------- --------- ----------- -------- -------- -------- ----------- -------------
C:\CFusionMX\logs\application.log 2 Information jrpp-0 06/21/06 19:53:15 <NULL>
C:\CFusionMX\logs\application.log initialized
C:\CFusionMX\logs\application.log 3 Error jrpp-0 06/21/06 19:53:15 <NULL>
Error Executing Database Query.Data source not found.
The specific sequence of files included or processed is: C:\Inetpub\wwwroot\test.cfm

Yes, it's a jumble for now (all dumping onto your screen at the command line), but later we'll see how to limit what columns to show as well as how to write the output to a file for more effective observation.

Understanding Column Headers
The first line above shows all the column headers that were found in that header line. You can also have the tool list the columns that it's found or detected using the option -queryinfo, such as this:

logparser "select * from C:\CFusionMX\logs\application.log" -i:csv -queryinfo

Among the data shown will be the list of columns found. In this example, it would be:

Query fields:

Filename (S) RowNumber (I) Severity (S) ThreadID (S)
Date (S) Time (S) Application (S) Message (S)

Note that the "S" and "I" indicators mean the columns hold strings or integers, respectively. Other types are "T" (timestamp) and "R" (real). We can use any of these column names in the SELECT statement to limit what columns we display. More important, we can use them to limit what "records" we display.
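
For instance, picking just a few of those columns (using the names listed above) makes the earlier dump far more readable:

logparser "select Date, Time, Application, Message from C:\CFusionMX\logs\application.log" -i:csv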

Limiting the Records Found
Going back to the earlier example that just did a "select *", the result was basically the same as if we'd just dumped the file to the screen. Where the tool gets powerful is in using additional SQL clauses to refine the search. For instance, since we see that one of the columns is "severity," which has values such as "Error" or "Information," we could limit the list to just those with errors using:

logparser "select * from C:\CFusionMX\logs\application.log where severity='Error'" -i:csv

Note the addition of where severity='Error'. As with most SQL engines, the string value must be surrounded with single (not double) quotes. On the other hand, notice that the first character of the value is capitalized because, unlike most SQL engines, the comparison here is case-sensitive.
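
If you'd rather not worry about the case of the stored value, you could fold it to lowercase before comparing. Here's a sketch (TO_LOWERCASE is among the string functions my copy of Log Parser documents; logparser -h FUNCTIONS lists them all):

logparser "select * from C:\CFusionMX\logs\application.log where TO_LOWERCASE(severity)='error'" -i:csv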

It may be of still more interest to limit the errors to a given application. Since that's another field, you can use this:

logparser "select * from C:\CFusionMX\logs\application.log
where severity='Error' and application='test'" -i:csv

This would list only those errors for the application "test."

Grouping Records
Now, perhaps a different interest would be to see a breakdown of errors by application. Again, this is easy in SQL and therefore easy in Log Parser:

logparser "select application, count(*) as NumErrors
from C:\CFusionMX\logs\application.log
where severity='Error' group by application" -i:csv
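
To round things out, you could sort that breakdown and, rather than dumping it to the screen, write it to a file as mentioned earlier. Here's a sketch (the INTO clause and the -o:csv output format are standard Log Parser features; AppErrors.csv is just a name I made up):

logparser "select application, count(*) as NumErrors
into AppErrors.csv
from C:\CFusionMX\logs\application.log
where severity='Error' group by application
order by NumErrors desc" -i:csv -o:csv

The result is a CSV file you could open in Excel, import into a database, or hand to anyone who asks which applications are throwing the most errors.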

More Stories By Charlie Arehart

A veteran ColdFusion developer since 1997, Charlie Arehart is a long-time contributor to the community and a recognized Adobe Community Expert. He's a certified Advanced CF Developer and Instructor for CF 4/5/6/7 and served as tech editor of CFDJ until 2003. Now an independent contractor (carehart.org) living in Alpharetta, GA, Charlie provides high-level troubleshooting/tuning assistance and training/mentoring for CF teams. He helps run the Online ColdFusion Meetup (coldfusionmeetup.com, an online CF user group), is a contributor to the CF8 WACK books by Ben Forta, and is frequently invited to speak at developer conferences and user groups worldwide.

Most Recent Comments
Charlie Arehart 11/14/09 12:33:00 PM EST

I will also note that since I wrote the article, I went on to significantly update the resource I referred to, with links to more about Log Parser, here.

Charlie Arehart 03/02/08 09:47:14 PM EST

You will now find the Log Parser tool available at the MS site at:

http://www.iis.net/downloads/default.aspx?tabid=34&g=6&i=1287

John 02/28/08 09:19:21 AM EST

A similar tool for Unix and Apache logs is asql[1]. While nowhere near as adaptable, it is quick and simple. Also good on Unix is logwatch[2].

[1] http://www.steve.org.uk/Software/asql/
[2] http://www2.logwatch.org:81/

charlie arehart 01/06/07 11:38:26 AM EST

Thanks, Stefan. Glad to hear it was valuable for you. I always welcome feedback, good or bad. :-)

Stefan le Roux 11/13/06 06:07:24 AM EST

Lovely, I once wrote an application to read log files line by line, but the Log Parser really makes it easy to run most types of queries against these files without first having to write it to a database.
