
Monitoring Your ColdFusion Environment With the Free Log Parser Toolkit

Solve common challenges in ways that no other monitoring feature in CF, or even the dedicated monitoring tools, can match

First, note the use of the "group by application" clause, which changes the SQL to group all the records of the same application together. Then note the use in the SELECT clause of "COUNT(*) as NumErrors," which will count how many records there are in each such group (by application), reporting it as a NumErrors column. Finally, note the addition of application to the SELECT list to show the name of each application. The result of this might be:

Application     NumErrors
--------------- ---------
<NULL>          473
cfadmin         9
App1            1
test_admin      60
demoapps        3
ContactManager  1

Of course, if we wanted the list to be sorted either by the application names or the number of errors, we could add an "Order by" to the SQL.
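
To make that concrete, here's a sketch of how the grouped and sorted command might look, reconstructed from the description above (treat it as illustrative rather than verbatim):

logparser "select application, COUNT(*) as NumErrors from
C:\CFusionMX\logs\application.log where severity='Error'
group by application order by application" -i:csv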

Creating an Output File
The examples to this point have simply output the results to the screen (command line), but you may prefer to write the results to a file instead. You do this using an INTO clause, placed at the end of the SELECT list (and before the FROM), to name a file, such as:

logparser "select * into test.txt from
C:\CFusionMX\logs\application.log where severity='Error'" -i:csv

Notice here that I've switched back to the earlier example of listing all the errors from the application.log file. Note as well that the INTO clause can specify a full or relative path (or a UNC path) for the file name; if no path is provided, the file will simply be stored in the current directory where the command is run.

The file output format will default to what has been shown on screen to this point, which Log Parser calls a "NAT" format (native or tabular data format). But as mentioned at the outset, you can also cause the output (on screen or to a file) to be any of several other formats such as CSV, TSV, and XML. For instance, you could change the output file to an XML file:

logparser "select * into test.xml from
C:\CFusionMX\logs\application.log where severity='Error'" -i:csv -o:xml

Creating an Output File Per Application: Finally Error Files Per App!
A unique feature of the Log Parser tool helps resolve a problem that has long plagued CF administrators. If you have multiple applications and development teams on one server, you (or they) have probably wished you could create a separate error log file for each application. Well, Log Parser makes this a snap, using a feature called "multiplexing."

It's really as simple as using a wildcard character (*) in the INTO clause, which then corresponds to the first column listed in the SELECT clause. Continuing the example above, recall that it lists all the columns for all the messages that reflect an error. We can alter that to name the application column first, and while we're at it we may as well limit the output to just that column plus the filename, message, date, and time, as in:

logparser "select application,filename,message,date,time into errors_*.txt
from C:\CFusionMX\logs\application.log where severity='Error' " -i:csv -o:csv

Using the data in the log file reflected so far, where there were errors for applications such as "App1" and "demoapps," this would create files such as errors_App1.txt, with the errors for App1, and errors_demoapps.txt, with the errors for demoapps. You'll notice that I've changed the output format to CSV: for some reason, the default NAT format isn't supported for multiplexing, though TSV, XML, and others are. See the product docs for more information. This is a very powerful mechanism. Imagine that after creating each app-specific file, you could go further and send each file to the appropriate application owner.

In fact, you can go further still and create a directory for each application instead. The secret is that you can use the * anywhere in the path. Just as the command above will create new files if they don't already exist, the tool will create new directories as well. Consider the following:

logparser "select application,filename,message,date,time into *\errors.txt
from C:\CFusionMX\logs\application.log where severity='Error' " -i:csv -o:csv

Note the use of "*\errors.txt" in the INTO clause. This will create a new directory for each application and store the file there as errors.txt. (You could certainly repeat the * if you wanted the application name in the file name too.) Imagine how you could now give the owners of each application access to "their" directory. And again, the path you name for the output directory could use a UNC path to put the output on a shared server.

(As an aside regarding multiplexing, note that the first column listed is used only for determining the file name and doesn't appear in the result. If you really want it in the result, simply list the column twice.)
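
For instance, here's a sketch of what that might look like, reusing the columns from the example above (illustrative only):

logparser "select application,application,filename,message,date,time into errors_*.txt
from C:\CFusionMX\logs\application.log where severity='Error'" -i:csv -o:csv

Here the first application column is consumed to build the file name, while the second remains as a column in each output file.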

But How Do I Get Only the Latest Log Entries?
We're about out of space, but I'd like to demonstrate just one more really compelling feature. Regarding the last example, you may be thinking, "this all sounds kind of interesting, but it will stink as the logs grow and grow in size. I wish I could just get the latest log entries." That's a very logical concern. And like the old steak knife commercials on TV, the Log Parser tool makes me want to say, "but wait, there's more!"

The tool has a feature called checkpointing, which is really as simple as naming a checkpoint file that will track the row number in the log file where processing stopped. Consider the following:

logparser "select application,application,filename,message,date,time into *\errors_*.txt from
C:\CFusionMX\logs\application.log where severity='Error' " -i:csv -o:csv
-icheckpoint:test.ipc

Note the specification of the "-icheckpoint" argument, naming a file, in this case "test.ipc." Again, the checkpoint file specification can include a path so it can be put anywhere, or it will default to the directory where the logparser command is executed. The feature tracks where log processing has stopped and where subsequent log analysis should start. It can be used whether you're outputting the results to screen or file.

There's one thing to think about regarding use of this feature with the multiplexing (or indeed any file output) operation: it will overwrite any previously existing file. The feature may make the most sense when you run the operation periodically and immediately send the log file somewhere else. That way, the recipient gets a new file whenever there's new content of interest to them. Nifty!

Other Log Parser Features: Too Many to Name
I could go on. Clearly, this is a powerful tool, and we've only scratched the surface. I've provided several resources to help you learn more and see more examples. Consider briefly the following additional features:
- Again, beyond CF log files, the tool can analyze Web server logs, the IIS metabase, the registry, the file system, Windows event logs, Active Directory, and so on
- Input files can not only be pointed to on local and UNC paths but can also be specified as URLs. So, for example, you can use the tool to read remote CSV and XML files (including RSS feeds)
- You can pipe the output of command-line tools to be processed as input to the tool
- You can store the report output in a database
- You can generate charts from the tool
- Besides simple tabular output to the screen, there is also an available Windows datagrid
- You can choose to store the SQL in a file and point to it from the command line, and even pass parameters into such a file (see the sketch after this list)
- You can name more than one file in the FROM (filename,filename), and the tool will automatically connect the files together (a "union" in SQL terms)
- Since you can read in XML files, consider that this lets you perform SQL against the XML, which CFML can't currently do
- The SQL grammar available is quite substantial. I've only used the simplest examples
- It also offers many useful SQL functions, including ones unique to the tool, like PropCount and PropSum
- The -h command-line option also offers help on SQL grammar and shows examples as well
- And this is just some of what it does
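
As a brief illustration of the query-file point above (a sketch only: the file name errorsByApp.sql and the parameter name appname are hypothetical), the SQL can live in a file using a %parameter% placeholder:

select application, COUNT(*) as NumErrors
from C:\CFusionMX\logs\application.log
where severity='Error' and application='%appname%'
group by application

You would then point to the file and supply the parameter value on the command line:

logparser file:errorsByApp.sql?appname=App1 -i:csv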

Also, I can think of additional CF-specific kinds of analysis, of various input sources, that I'd like to elaborate on. For instance, I've only shown working with the application.log file, and only the logs in the \cfusionmx\logs directory. Note that there are many other log files in ColdFusion that could be useful to analyze. (You can even use Log Parser's ability to query the file system to report what log files exist.) Of course, it's often useful to analyze Web server logs (whether IIS, Apache, or the built-in Web server in CFMX) to help identify performance problems.
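
For example, here's a sketch using Log Parser's FS (file system) input format to list the log files themselves (the field names are those documented for that format, but verify against your version):

logparser "select Name, Size, LastWriteTime from
C:\CFusionMX\logs\*.* where Name like '%.log'" -i:FS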

Beyond that, there's data in the event logs (related to running CF services), as well as possibly data in the registry (such as client variables, perhaps existing unexpectedly). And since the tool can read XML files generated and available via a URL, it could be used to monitor and report on the output of both SeeFusion and FusionReactor, which can output XML files. I'll offer a URL at the end to name a place where I plan to elaborate on such opportunities as an extension to this article.
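
As a rough sketch of that last idea (the URL is hypothetical, and the actual elements those tools emit will differ), reading such a remote XML file might look something like:

logparser "select * from http://yourserver/seefusion/status.xml" -i:XML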

Resources for Learning More
Clearly, there's much more to learn. Perhaps the best place to start is downloading the tool itself. There are a couple of links to note. The Microsoft download page is www.microsoft.com/technet/scriptcenter/tools/logparser/default.mspx. The tool includes a help file with considerable extra detail. Beyond that, there's also an "unofficial" Web site at www.logparser.com/. It includes a knowledge base, forums, and links to articles.

But the best resource of all, to really learn and appreciate the tool, is the book Microsoft Log Parser Toolkit, available from Syngress at www.syngress.com/catalog/?pid=3110. Since it's a couple of years old, the book is also available used at Amazon. Even so, it's an awesome resource. You'll learn things you never dreamed were possible. You may enjoy the following articles:

  • "How Log Parser 2.2 Works"
    - www.microsoft.com/technet/community/columns/profwin/pw0505.mspx
  • "Analyze Web Stats with Log Parser"
    - www.microsoft.com/technet/technetmag/issues/2006/08/InsideMSFT/
  • "What's New in Log Parser 2.2"
    - www.microsoft.com/technet/scriptcenter/tools/logparser/lpfeatures.mspx

    Finally, as I alluded to above, I plan to create an area on my Web site to expand on this discussion and elaborate more CF-specific aspects of using Log Parser. Look for that at http://www.carehart.org/logparser/. I welcome your feedback.

    Editorial Note: Since writing this article, the author has significantly updated the resource referred to above, with links to more about Log Parser, at http://www.carehart.org/logparser/.

  • More Stories By Charlie Arehart

    A veteran ColdFusion developer since 1997, Charlie Arehart is a long-time contributor to the community and a recognized Adobe Community Expert. He's a certified Advanced CF Developer and Instructor for CF 4/5/6/7 and served as tech editor of CFDJ until 2003. Now an independent contractor (carehart.org) living in Alpharetta, GA, Charlie provides high-level troubleshooting/tuning assistance and training/mentoring for CF teams. He helps run the Online ColdFusion Meetup (coldfusionmeetup.com, an online CF user group), is a contributor to the CF8 WACK books by Ben Forta, and is frequently invited to speak at developer conferences and user groups worldwide.
