By Joe Zanter
January 18, 2005 12:00 AM EST
This is not a deeply technical piece. It's a story about a boy and his dog... OK, not that. It's a story about how a Web-dabbling engineer created some highly useful apps for his co-workers and himself.
We work in a metallurgical lab, which means, among other things, we study metals by looking at finely polished samples (let's call them "mounts," as they are often mounted/embedded in plastic) under microscopes. We like to number these samples and keep track of them and their descriptions for reference, and to comply with our customers' record retention requirements. However, a few years ago we became acutely aware of the limitations of our hard copy log (the mount book) for these samples. Even at only about 1,000 samples per year, the volume of data was too much to search through by turning pages. In other words, "It would be nice to have an application that we could all access to store data, assign new numbers, and do other neat things automatically." A Web-based (intranet) application seemed the obvious choice, as there were enough computers and an already established intranet presence. Other solutions were available, of course, such as a Lab Information Management System (LIMS), but we weren't likely to get approval for a large software purchase. Our IT department had adopted a trouble ticket system that was fairly simple but very expensive: more than $20,000. We knew that wasn't going to happen for us; this data was important, but making it convenient and more useful wasn't mission-critical.
We had MS Access, so we started there. It worked, though it was really only meant for a single user. Knowing that much (or that little) about Access told me that this was not the solution we were looking for. The company has production databases it uses to track the manufacturing of everything we make, and for many other functions besides. Using this system was an option, but it wouldn't work for us, for several reasons. First, it would mean turning over development and any changes to someone else and, consequently, to someone else's timeframe. Our hard copy logs weren't rotting away, so the priority of this work probably wouldn't be very high for them, though it would be for us; we're a rather impatient group when it comes to having the right tools, especially when the right tools seem to be within our reach. Second, changes were typically slow, given the workload of that coding group, so we'd have to live with limitations for longer than we cared to. Third, we weren't talking about just one application; we were talking about one application to start with, and several others to follow in quick succession. Fourth, if we needed another reason, the work and the output data aren't just text. Images are integral. Though we could probably have adapted the production system to deal with images (storing the paths to image files), it seemed more natural to pick some kind of Web-based solution. Finally, the choice between Access and other established company resources was the choice between doing it ourselves and giving control over to someone else. I knew we could do this, so we decided to keep control. Thus, we tried Access-generated Web pages. Retaining control was the right choice; using Access Web pages was just a step, however.
At first, the data entry Web pages worked, but they were limited. I thought that maybe I just didn't understand all of the features, but after asking our internal guru it was clear that Access Web pages would not do what we wanted. In essence, we needed something that supported at least a handful of simultaneous users and maintained data integrity, with no one overwriting anyone else's data. Access's pages had already proven unfaithful with these modest requirements, much to my dismay, and some data had to be re-entered from memory. The loss was caused by simultaneous users: one person started a record, and before it was finished another person started what turned out to be the same record, so the race was on. One set of data was written, and the next overwrote the first. Data erasure is unforgivable to our customers, and we don't enjoy it much either. So, once bitten, twice shy, I went back to talk to someone in IT. Given the traceability and data-integrity requirements in so much of what we do, the solution had to perform at least these functions well. Flexibility would hopefully be part of the solution too.
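The article's actual CFML isn't shown, but the fix for this kind of lost-update race is to claim a new record inside a single database transaction, so two users can never be handed the same record. A minimal sketch in Python with SQLite (the table name, columns, and user IDs are illustrative assumptions, not the article's schema):

```python
import sqlite3

def reserve_mount(conn: sqlite3.Connection, user_id: str) -> int:
    """Reserve the next mount record atomically. The 'with conn' block
    wraps the SELECT and INSERT in one transaction (commit on success,
    rollback on error), so concurrent reservations are serialized and
    no two users can claim the same id and overwrite each other."""
    with conn:
        cur = conn.execute("SELECT COALESCE(MAX(id), 0) + 1 FROM mounts")
        new_id = cur.fetchone()[0]
        conn.execute("INSERT INTO mounts (id, entered_by) VALUES (?, ?)",
                     (new_id, user_id))
    return new_id

# demo against a throwaway in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mounts (id INTEGER PRIMARY KEY, entered_by TEXT)")
print(reserve_mount(conn, "jzanter"))  # 1
print(reserve_mount(conn, "dmorris"))  # 2
```

With the id and the entering user's ID committed together up front, a second user who starts a record a moment later is given a different id, which is exactly the "reserved" behavior described below for the mount book.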
Our IT department had recently started using ColdFusion on a limited basis and recommended it to me. They were already running a server, so it was just a matter of learning this new software (Studio 4.5). Looking back, in its simplest form CF is just another set of HTML-like tags that let the server build pages from data in databases. At the time, it seemed a little more daunting, and I didn't know SQL either. I found enough resources on the Internet to get started; between several now-forgotten tutorials and some resources at Webmonkey, I started coding. Not having to run the server took much of the steepness out of the learning curve. After a few days of learning some of the ropes and testing code, I started building the interface to enter data for our samples. SQL syntax gave me more fits than CF did. A little more reading might have helped, but it wasn't too bad, and there were people here to talk to if I got stuck.
It wasn't long before I had the beginnings of a usable Web interface for our mount book. It's come a long way since those first days. Figure 1 is a screen shot of the latest version of the entry page.
Walking through the major steps of the code: a security template is applied, the last mount number is queried, and a new mount number is generated. The new number, along with the user's ID, is inserted into the mount book so that it is reserved for as long as it takes to fill out the form. The number itself took a little bit of coding effort, as it's alphanumeric: a letter and three digits. For each letter, mounts 000-999 are used; then the next letter comes up. That was our original numbering scheme; I think I'd do it differently if I had it to do over. The two little forms at the top are aids: one for users who accidentally hit a link to start a mount, and the other for copying mount data from an existing record to save keystrokes on similar samples. Next comes the main body of the form.
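The letter-plus-three-digits rollover is the fiddly part the author mentions. A hypothetical helper (not the article's CFML) showing the increment logic for that scheme:

```python
def next_mount_number(last: str) -> str:
    """Return the mount number after `last` in the scheme described
    above: a letter followed by three digits, 000-999 per letter,
    then the next letter (e.g. a999 -> b000)."""
    letter, digits = last[0], int(last[1:])
    if digits < 999:
        return f"{letter}{digits + 1:03d}"   # same letter, next number
    if letter == "z":
        raise ValueError("numbering scheme exhausted")
    return f"{chr(ord(letter) + 1)}000"      # roll over to the next letter

print(next_mount_number("m328"))  # m329
print(next_mount_number("a999"))  # b000
```

Sorting is the hidden cost of a scheme like this: the strings happen to sort correctly here because the letter leads and the digits are zero-padded, but any future change (four digits, two letters) breaks existing ordering, which may be why the author says he'd do it differently today.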
Submitting the form inserts the data and then displays it (from a query) for the user to check. This application has a query associated with it, of course; the query form page itself is plain. Figure 2 shows a sample mount book query result.
The part link, 2177-204, displays an engineering drawing of that part. The update button lets a user go back in and edit the record data. If that user is not the one who entered the mount, he or she can only add text to the notes field, and those comments are labeled in the field with his or her ID. Another notable feature is the Add image link, which takes users to yet another form where any image can be uploaded. Figure 3 shows this image upload form.
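That edit rule is simple but worth pinning down. A sketch of the described behavior (the record fields and user IDs are assumptions for illustration, not the article's schema):

```python
def apply_update(record: dict, user_id: str, field: str, text: str) -> None:
    """Apply an edit under the rule described above: the person who
    entered the mount may edit any field; anyone else may only append
    to the notes field, with the addition labeled by their user ID."""
    if user_id == record["entered_by"]:
        record[field] = text                     # owner: free edit
    elif field == "notes":
        record["notes"] = f'{record["notes"]} [{user_id}] {text}'.strip()
    else:
        raise PermissionError("only the original entrant may edit this field")

rec = {"entered_by": "jzanter", "notes": "", "alloy": "304"}
apply_update(rec, "jzanter", "alloy", "316")          # owner edits a field
apply_update(rec, "dmorris", "notes", "check hardness")  # labeled note append
```

Labeling each foreign comment with the commenter's ID keeps the record's audit trail intact without locking other users out entirely, which fits the traceability requirements mentioned earlier.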
For certain mount descriptions, a copy of the image is placed in another network location for processing. An image analysis application is run, using a macro to control annotation, measurements, and the saving of those results. Later in the day, a scheduled task takes the processed image, copies it back in with the images tied to that mount, and inserts an appropriate record. For any mount with associated images, the mount number, m328 (as shown in Figure 2), becomes a link that displays all images for that mount number. Figure 4 shows this all-images page: the data from the mount book is shown, the image is displayed, and all entered image information appears to the right of the image, including the option to inactivate the image.
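The scheduled copy-back step can be sketched as a small batch job. Everything here is a hedged assumption about the setup: the folder layout, the mount-number-prefix file naming, and the list standing in for the database insert are illustrative, not the article's actual configuration:

```python
import shutil
from pathlib import Path
from tempfile import mkdtemp

def copy_back_processed(processed_dir: Path, images_dir: Path,
                        db_log: list) -> None:
    """Move each processed image from the analysis drop folder back in
    with the other images for its mount, and note it for the record
    insert. Assumes files are named <mount>_<nn>.jpg."""
    for img in sorted(processed_dir.glob("*.jpg")):
        mount = img.stem.split("_")[0]            # "m328_01.jpg" -> "m328"
        dest = images_dir / mount
        dest.mkdir(parents=True, exist_ok=True)   # one folder per mount
        shutil.copy2(img, dest / img.name)        # bring the image back
        db_log.append((mount, img.name))          # stands in for the DB insert

# demo run against throwaway folders
processed, images = Path(mkdtemp()), Path(mkdtemp())
(processed / "m328_01.jpg").write_bytes(b"fake image data")
records: list = []
copy_back_processed(processed, images, records)
```

Run once a day from the scheduler, a job like this keeps the heavyweight image analysis off the Web server while the mount book stays the single index to every image.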
I coded apps to eliminate all of our logbooks, and I was asked to code a few utilities for other departments as well. One of those departments takes input from a barcode reader and calculates the parameters needed to hard-anodize aluminum. I've used Verity in simple ways, such as giving us full-text search over thousands of new and legacy documents. Scheduled tasks monitor some data, remind us of work that's due and of equipment maintenance, and support some QA functions.
The most attractive thing about this setup is its customization and adaptability. Every one of our modest whims was satisfied and we began to have some not-so-modest whims. Some applications have undergone dozens of minor revisions and are now customized to the nth degree by end users. Adaptations emerged from suggestions like uploading images of the metal samples and tying the file to the sample record. That led to another application for uploading and indexing documents for a different department. At the end of the day, we get what we want and need and it becomes not just usable, but also useful. Our customers have access to much of our live data. Presenting analyses to remote locations on our intranet is simplified by adding a few links to an e-mail.
It didn't take long for these applications to become critical to our daily jobs. Now we rely on them, not just for the ability to query our growing body of data, but to satisfy audit requirements and customer requirements as well.
I did have the benefit of not being responsible for the care and feeding of a Web server. That eased the learning curve and let me take hours rather than days to code something useful. That, and the fact that many CF tags are very straightforward, made the self-teaching route practical for a part-time coder. A Ben Forta book and a copy of Studio were enough.
Looking back, I know this was a good path to follow. I knew some HTML and had built a few sets of static pages prior to learning CF; I didn't realize until I started learning ColdFusion tags that the step from static to dynamic was not large. Granted, many of my applications might be considered bicycles compared to others' SUVs. My page designs may not be pretty, but developing for yourself gets you exactly what you want. It's been great for us!
Darrell Morris 01/20/05 11:46:54 AM EST
Excellent article. Well written and informative. Thanks Joe.