E-Testing: Debugging Your Projects

You're nearly finished with the code you've been slaving over for hours (or days or weeks, or maybe it was just a few seconds' effort). You're about to turn it loose for your customers to enjoy. The questions are: Is it ready? Do you know it will work? Have you tested it? If so, how? Indeed, should you be turning it over to the users as the next step?

Running the code to make sure it works is the most common "test" performed by beginning and even many experienced developers, but it's rarely adequate. There are just too many things that can go wrong within a given CF template, its queries, and its resultant output to trust to eyeballing the code and crossing your fingers. Further, what may work for you may fail for your users for any number of reasons.

Factor in the consequences of a larger load, or fragile server hosting, and the potential for problems with your program, application, and site multiply. If you add multiple developers, then the need for testing, indeed for "configuration management," becomes more evident. Finally, consider that a bug is much less costly to repair the earlier it's caught in the life cycle of an application.

Whether you're a lone code-slinger or part of a large team, you need to know about testing, but many CF developers aren't aware of the options that may be available to them already, without cost or much effort.

This article delves into the issue of application testing, or e-testing as some have come to call it, which spans a variety of tools and programming practices. Along the way we'll cover HTML- and CFML-oriented tools, as well as some that span Web application testing in general.

A CF Programmer's Testing/Debugging Tool Bag
If you're not performing any of the following testing (or using any of these sorts of tools) in your day-to-day programming practice, you're exposing yourself, your company, and your clients to a preventable risk that may be costing additional time and effort in debugging. The testing/debugging opportunities are broken into three categories:

  1. HTML-oriented testing
  2. CFML code testing
  3. Web application testing
We'll take a brief look at each of these categories of tools and tell you where you can learn more about them.

HTML-Oriented Testing
The first group, HTML-oriented testing, includes roughly a dozen different kinds of tests. CF programmers often dismiss them, but these freely available tools can be valuable if used effectively. The challenge is that generally they can't be used against your source CFML code but instead must be run against the generated result of your templates (and tested across a variety of browsers). I'll show you how to solve that. Let's look at each of these tests.

The simplest testing you can do is HTML validation: making sure that the HTML you're creating is valid. This may seem incredibly mundane, perhaps even beneath you. But incorrect HTML may render your display flawed at best, or cause your page, or part of it, to fail to appear to the end user at all. Have you ever left off a closing table tag? Such a page may render fine in Internet Explorer, yet the table (and perhaps the entire content of your table-formatted page) may fail to display at all in Netscape.

The problem for CF developers is that while we do create HTML in our templates, more often we create CFML code that in turn creates HTML. Many HTML validation tools built into editors like HomeSite and Studio are useless: they can't tell that an apparently errant tag is being sent to the browser only at runtime based on a CFIF test for a given variable. More important, the errant HTML may only be generated at runtime.

But we shouldn't conclude that no HTML testing is possible. Instead, we need to test the output of our pages. There are two ways to do that. One, you could manually copy and paste the output of a page into a new blank page in HomeSite/Studio, then run Tools>Validate Document. (If you don't care for the kinds of errors reported, or want to change the browsers you intend to support, see Studio's Options> Settings> Validation to modify the configuration.)

The other approach, which may be less labor-intensive and can even be automated, is to use a Web-based HTML validation service, such as that at http://validator.w3.org. You can point it at a CFML-generated page (meaning a URL to a .cfm file) just as you can point it at an HTML file (remember, to the browser - and this tool - it's HTML either way).
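If you'd rather not copy and paste output by hand, CFHTTP can capture the generated HTML for you. Here's a minimal sketch (the URL and output path are placeholders, not from any particular application) that fetches a template's runtime output and writes it to a file you can then validate in Studio or submit to a validation service:

```cfml
<!--- Hypothetical example: capture a template's generated HTML.
      The URL and file path below are placeholders --->
<CFHTTP URL="http://localhost/myapp/mypage.cfm"
        METHOD="GET"
        RESOLVEURL="yes">

<!--- CFHTTP.FileContent holds the page output returned by the server --->
<CFFILE ACTION="WRITE"
        FILE="c:\temp\mypage_output.htm"
        OUTPUT="#CFHTTP.FileContent#">

<CFOUTPUT>Saved generated HTML for validation.</CFOUTPUT>
```

Because the capture is itself a CFML template, it can be scheduled or looped over a list of URLs to snapshot the output of a whole site.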

Other HTML-oriented tests available at this site (and others like it) include:

  • Link checking
  • Spell checking
  • Document weight (download time) testing
  • Cascading Style Sheet validation
  • Color depth (browser-safe palette) testing
  • Image compression testing
The first two can also be performed within Studio via cut-and-paste from your CFML output. The others can be done at the w3.org site as well as at NetMechanic.com and Netscape's "Web Site Garage" at http://websitegarage.netscape.com/O=wsg/tuneup_plus/index.html. Do a search on "html validation" in your favorite search engine to find more. Some problems are still more challenging to test than these simple HTML validations.

We developers often work with the latest and greatest in terms of bandwidth and monitor size, and it's all too easy to forget that not everyone experiences the same luxuries. And for some people, disabilities create an extreme burden in visiting your site.

The first issue is browser-size testing: Will your page fit in various screen resolutions? What looks great on your 19-inch monitor at 1024x768 resolution may not fit at all on a more typical 15- or 17-inch monitor running at 800x600 (or worse, 640x480). This is generally just a matter of making good HTML design choices, such as not hard-coding excessive widths in <TABLE> and <TEXTAREA> tags. Fortunately, there are tools built into HomeSite/Studio that will help: while viewing a page within the internal browser (F12, or View>Toggle Edit/Browse), there's an option that looks like two crossed arrows offered within a set of tools shown just above the browsed page (see Figure 1). Hovering over this icon shows the name to be "Browser Size" and clicking it will offer the choice to display the page within a frame that simulates 800x600 and 640x480 resolution. This is much easier than actually changing the resolution on your monitor. Be sure to set it back to "Fit to Window" when done.

There are also Web-based services that will provide you with a rendering of your page at various sizes, including BrowserPhoto at the aforementioned NetMechanic.com and the popular shareware tool BrowserSizer (search for it at your favorite shareware repository such as download.cnet.com).

Continuing the theme of considering your visitors' needs, there's no more important issue for some developers and visitors than the issue of accessibility testing, that is, ensuring that disabled Web visitors (blind, color-blind, deaf, etc.) can use your site. For some CF developers, it's not a matter to be dismissed: Section 508 of the Federal Rehabilitation Act mandates that federal agencies make their electronic and information technology accessible to people with disabilities. Even if you're not a federal agency, consider that you may increase your audience significantly at a small cost. Most existing Web sites require only minor modifications to make them accessible to the estimated 750 million people worldwide who have disabilities. The modifications can be beneficial to all users. There are tools to help test (and rectify) accessibility issues, the most notable being "Bobby" at www.cast.org/bobby. For more resources see www.microsoft.com/enable/.

I may have saved the best form of testing for last, in terms of sharing something that a lot of developers don't seem to have: JavaScript error awareness. No, this isn't quite about testing (while there may be tools that actually test JavaScript code, I'm not aware of them). I'm referring to the simple matter of your being aware when your JavaScript code is generating errors. Daily, I encounter sites with errors that the developers clearly have missed. The really pernicious thing is that they don't even know they're there. Did you know that, by default, modern browsers (both IE and Netscape) hide JavaScript errors? As a developer, you need to turn them back on! In IE 5.5, use Tools>Internet Options>Advanced>Browsing>Display a Notification About Every Script Error. To enable them in Netscape, see the Netscape client-side JavaScript Guide at http://developer.netscape.com. Of course, this will now expose you to all the errors on pages you visit, but that's a small price to pay to be sure you see any errors in your own code.

CFML Code Testing
Some readers may feel cheated by the "testing" mentioned so far, with its focus on HTML tools. Others may have wondered from the beginning whether any aspect of ColdFusion code itself can be tested. It can, and many developers don't know about these options at all.

The first is CFML syntax checking. Why would we need this at all? Well, going back to the original premise of the article, if you're about to release your code for users to experience, have you really tested it thoroughly? Or did you just make a "simple change" that you "know" will work? Maybe you're reluctant to take the time to fully test it (it involves too many steps). At a minimum, before turning the code over to production (or uploading it to the live server), you should at least ensure that you haven't introduced any CFML syntax errors.

How can you do that? The built-in CFML Syntax Checking tool is the solution. Available as of Release 4 and originally intended to serve as a tool to ensure that old code compiled in the new release, it's useful for just the kind of testing we're discussing. Simply point your browser at http://localhost/CFDOCS/cfmlsyntaxcheck/cfmlsyntaxcheck.cfm. It's a form that asks for the name of a directory (and/or file name) whose code you want to test. It can scan through subdirectories (the option is "recurse") and will report if any template has a CFML syntax error. It stops on the first error it finds, so after fixing any error, rerun it until no errors are reported. It's designed to be run only from your local workstation, but the application.cfm in CFDOCS/cfmlsyntaxcheck can be modified to remove that restriction.

As useful as that can be, it's still just a pretty simple test: Will the code compile? It doesn't tell you much about the quality of your code, or whether it will perform well, or whether there are any of the many common CFML coding mistakes people often make. For this we need CFML code validation. Sound like a pipe dream? Believe it or not, there is a tool to do that. It's not free. It's a Web service from CF maven Steve Nelson, available at www.secretagents.com/products/index.cfm?fuseaction=product&product_id=1. Some will find it worth every penny.

The service, named Stomp (as in stomping out bugs), is available for trial and performs over a hundred very useful CF-specific tests. You zip up and submit a set of code to the parsing tool and after several minutes (typically) it will e-mail you a link to a report on the site. It will show each template and its errors, as well as explain why the problem is worth fixing and, most important, how to do so. Some of these are classic; some you may not have considered.

For instance, in a sample test of just a few templates of my own code, I was warned about:

  • Not having maxlength defined for a form input field (which could lead to truncation problems on a database update in the action page)
  • Using Select * in some SQL statements (which may retrieve more columns than needed)
This was just a small set of code I tested; Stomp looks for many other errors. It also gave me several useful suggestions in that small sample, including:
  • Recommending that I use <CFSETTING ENABLECFOUTPUTONLY="yes"> for some templates that had no text for display (or all the text was within CFOUTPUTs)
  • Detecting that I'd failed to test for a file's existence before using it as a source file in CFFILE
  • Detecting that I had a .cfm file with no CF code and recommending that I change it to an .htm extension instead
  • Reminding me to consider the case-insensitive versions of the functions Find, Replace, and Compare, and related list and regular expression equivalents
  • Suggesting that I test for a recordset having zero records before doing a CFOUTPUT loop over the query result set
  • Reminding me to consider using <CFIF NOT len(x)> rather than <CFIF len(x) IS 0>
  • Suggesting I write the CFID and CFTOKEN variables on any links or form actions rather than rely on cookies, and even catching each instance of code where I had an HREF or Form ACTION that failed to pass the tokens
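To make a couple of these suggestions concrete, here's a hedged sketch (the file path and the getUsers query are invented for illustration, not from my tested code) showing a guarded CFFILE read and a zero-record check before a query loop:

```cfml
<!--- Hypothetical illustration of two Stomp-style fixes.
      The file path and the getUsers query are invented for this sketch --->

<!--- 1. Test for a file's existence before using it in CFFILE --->
<CFSET sourceFile = "c:\data\import.txt">
<CFIF FileExists(sourceFile)>
  <CFFILE ACTION="READ" FILE="#sourceFile#" VARIABLE="fileData">
<CFELSE>
  <CFOUTPUT>Source file not found; skipping import.</CFOUTPUT>
</CFIF>

<!--- 2. Test for zero records before looping over a query result set --->
<CFIF getUsers.RecordCount IS 0>
  <CFOUTPUT>No records matched your search.</CFOUTPUT>
<CFELSE>
  <CFOUTPUT QUERY="getUsers">#FirstName# #LastName#<BR></CFOUTPUT>
</CFIF>
```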
If I were to apply these suggestions (many of which I'd failed to use in multiple templates), I might improve the quality of my site dramatically. Of course, some will argue, "I already know to do (or not do) that stuff." Hey, so do I. But due to laziness (or sloppiness), the problems crept in. It was a great lesson in humility! Then again, Stomp can catch hundreds of errors, so perhaps I should look at the bright side!

It's not a perfect tool. Some mistakes flagged as "hard failures" really ought to be flagged as "potential failures" (the use of "Select *" was flagged as a "Hard Failure-ODBC"), but that's only a minor annoyance considering the value the tool offers. Another cool feature of the site is that since it's a Web-based service, each reported error carries a link, allowing you to send feedback about that error easily. For a complete list of the tests (which itself may be useful) see www.secretagents.com/content/index.cfm?fuseaction=testlist.

Another tool that does some CFML code parsing is ParseDog (http://aloha-webdesign.com/dloads/cf/parsedog/releasenotes.htm), which focuses on identifying whether custom tag and CFINCLUDE files exist and has some extra features that may interest you.

Another aspect of CFML code testing is code timing: How long does it take to execute a given template or code block? There's no tool available (yet) to provide automated profiling of CFML code, but you can "roll your own." Look at the GetTickCount() function; the CFML documentation gives a simple but usable example of its use. Related timing questions - how long code takes to compile, and how long queries take to execute - are answered by the debugging information that can be enabled in the CF Administrator. It may not be appropriate to enable this in production, but that's why you ought to have a test environment that models your production environment.
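A minimal roll-your-own timing sketch along those lines (the loop is just a stand-in for whatever code you want to profile):

```cfml
<!--- Capture GetTickCount() before and after the block under test.
      The loop below is a placeholder for your real code --->
<CFSET startTime = GetTickCount()>

<CFLOOP INDEX="i" FROM="1" TO="10000">
  <CFSET x = i * 2>
</CFLOOP>

<CFSET endTime = GetTickCount()>
<CFOUTPUT>Block took #endTime - startTime# milliseconds.</CFOUTPUT>
```

Run it a few times and average the results; a single timing can be skewed by compilation or caching on the first request.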

The final aspect of CFML code testing may not really seem about testing at all, but in a way it is. Studio has a set of features that can help you see if the code you're creating is indeed valid. Consider it a "poor man's" CFML validation. While the cursor is on a CFML tag or its attributes, press Ctrl-F4 (or use Tags>Edit Current Tag). The tag's available attributes and their values will be displayed. Similar values can be obtained with the Tag Insight feature: while entering a tag and its attributes, wait one second after pressing the space bar and Studio will show you the tag's available attributes and values. Finally, Tag Help allows you to press F1 on a tag to see the CFML documentation for it. As of Release 4.01, the Insight and Help features are also available for functions.

Web Application Testing
I've identified the final group of testing somewhat more generically as Web application testing. It may not be about your CFML in particular, but about how the application as a whole (including HTML, JavaScript, and CFML) works, in both reasonable and unexpected situations. This sort of testing includes:

  1. Data validation and "bounds" checking: Does your code handle unexpected input well?
  2. Functionality testing: Does the code work as expected?
  3. Security testing: Are only authenticated, authorized activities able to take place?
  4. Regression/functional/smoke testing: Does the code produce the expected result, when run after performing code changes?
  5. Performance/load/stress testing: Can the code sustain a large load?
  6. Concurrency testing: Are there problems when multiple users access the code, or when a single user opens multiple browsers?
  7. Site monitoring: Is the site up? Is anyone aware when it goes down?
The first three are tests that really can't be automated. You simply need to create a set of tests to ensure that things work as expected and that unexpected things are handled properly. Most important: make sure that the code does what the user expects (more on that, and "acceptance testing," later).
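As a small example of the first category, here's a hedged sketch of bounds checking on an incoming URL parameter (the parameter name and limits are invented for illustration):

```cfml
<!--- Hypothetical data validation/bounds check: make sure a URL
      parameter exists, is numeric, and is in a sane range before use --->
<CFPARAM NAME="URL.ProductID" DEFAULT="0">

<CFIF NOT IsNumeric(URL.ProductID) OR URL.ProductID LT 1>
  <CFOUTPUT>Sorry, that's not a valid product ID.</CFOUTPUT>
  <CFABORT>
</CFIF>

<!--- Safe to use URL.ProductID from here on --->
```

Your test plan should exercise a page like this with both valid and deliberately invalid input (nonnumeric values, zero, negative numbers, missing parameters).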

Still, while the creation of such tests can't be automated easily, it's possible to record the execution of your tests and then play back the tests in a repetitive manner to detect whether code that worked at one point in your development is no longer able to pass the recorded tests.

This is often called regression testing, though it's known by other names as well. The point is that you're trying to determine if solved problems remain solved, or if your code has "regressed" to a nonfunctioning point. Going back to the original premise, if you've made a change in some code that's deep in a set of steps that the browser must do (perhaps order multiple items on a shopping cart and then delete one), such automated testing tools can make it easy to retrace those steps as often as you like.

Predominant companies with tools in this space include Mercury Interactive, Rational, Empirix, and Keynote (see the resource URLs at the end of this article).

These companies' tools are mature, and relatively expensive, but of substantial value for the benefit provided. A smaller, relatively new entrant with a purportedly different approach and a much lower price is eValid (www.soft.com/eValid/).

Most of the tools in this space work as easily as a VCR: hit a record button and the tool watches you run through your application, tracking your button clicks, mouse clicks, links taken, even JavaScript or Flash events fired (in the better tools). The recording can then be played back repeatedly with relatively little effort and interaction. In fact, most tools include features that let you parameterize the tests when rerun, so that some or all aspects of processing are driven by variable data extracted from a database or flat file.

Most of these tools allow the recorded, parameterized regression tests to be used for the next type of testing, performance/load/stress testing, which answers the question: Can the code (and environment) scale and sustain a large load? This may be where these tools really shine, because while the regression testing aspect can be a bit cumbersome when dealing with constantly changing code bases, the load test is more straightforward: you record the tests, parameterize them, and then use the tools' ability to create "virtual users" that mimic many users (perhaps thousands) hitting your site. These better tools do more than simply bang on a single page; they run through the same sort of multipage execution that works the way a user would. The tests are limited only by our creativity and the time to create the tests. While some tools use scripting languages to create tests, which are powerful but can be cumbersome to learn, others use the simpler record/playback approach.

Each of the aforementioned companies' tools can perform load/stress testing, but free tools are also available, including Microsoft's own Web Application Stress test tool (http://msdn.microsoft.com/library/en-us/dnduwon/html/d5wast_2.asp) and the Paessler Webserver Stress Tool (www.paessler.com/tools/WebStress/webstress.htm), which is free for five users, with a modest cost to add more users.

Even if you don't think your site will need to support thousands (or even hundreds) of concurrent users, consider using some form of load-testing tool to see what happens when more than one user visits your site. Many an application has been brought to its knees by problems with, for example, sessions being unexpectedly shared among multiple users. This sort of problem can be caught with concurrency testing. And it's not only about multiple users accessing the code; even a single user opening multiple browsers can trigger it. Even the simplest freeware tool simulating multiple users visiting the site may reveal unexpected problems.

These automated testing tools are often dismissed as too expensive or too much work for a given project, but the cost and effort can be justified in the testing time saved (or facilitated). While free tools achieve some of this functionality, you often do get what you pay for. The good news is that some vendors are making available (or planning) low-cost developer editions or even limited-function (but nonexpiring) trial versions. Most offer trials, of course, but Rational, for instance, now offers a free, full-function 50-user license of its SiteLoad tool, and Mercury Interactive is offering free access to their hosted service (http://captainmercury.com). These are exciting developments for those who've dismissed the pricier tools in the past.

That last point about hosted services is worth expanding: most of the testing tool providers are now offering hosted services that can be quite useful and cost effective.

The final form of Web application testing can also leverage these other recorded scripts and can tell you about the ongoing state of the application once it's in production. Site monitoring asks the questions: Is the site up? Is anyone aware when it goes down? Besides the aforementioned toolmakers, other players in the space include freshwater.com, tracert.com, and others, some of which merely provide a useful mechanism to ping your site (or a given page) to ensure the site is up, and e-mail you when it goes down. Indeed, CF5's own probe mechanism within the new Administrator Tools might be used to provide a similar capability.
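A simple home-grown monitor can also be built in CFML itself and run as a scheduled task. This is only a sketch under assumed names (the URL and e-mail addresses are placeholders):

```cfml
<!--- Hypothetical site monitor, meant to run as a scheduled task.
      The URL and e-mail addresses are placeholders --->
<CFHTTP URL="http://www.yoursite.com/index.cfm"
        METHOD="GET"
        THROWONERROR="no">

<!--- CFHTTP.StatusCode holds the HTTP status line, e.g. "200 OK" --->
<CFIF CFHTTP.StatusCode DOES NOT CONTAIN "200">
  <CFMAIL TO="webmaster@yoursite.com"
          FROM="monitor@yoursite.com"
          SUBJECT="Site probe failed: #CFHTTP.StatusCode#">
The probe at #Now()# returned status: #CFHTTP.StatusCode#
  </CFMAIL>
</CFIF>
```

Of course, a monitor running on the same server it watches won't catch a total outage of that server; run it from a second machine if you can.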

Other Testing
Still other tools can test different components or features you may be using:

  • COM/EJB/CORBA/Java component testing
  • Web service testing
  • WAP/WML testing
  • Search engine result optimization testing
We don't have space to get into all facets of Web application testing: I'll leave you to look into these on your own. Additional forms of testing often mentioned in the world of e-testing - including installation, configuration, usability, unit, integration, reliability/recovery, end-to-end, white/gray/black-box, and documentation testing - are also worthy of your further research.

Another aspect of Web application testing asks whether the code you're building is really what the client wants. This isn't really testing so much as prototyping, but it's an important matter nonetheless. Certainly, before rolling your application out from its initial design stage (indeed, before and during that design stage), you should use various means to prototype the application to obtain user approval. This is often called acceptance testing. The FuseBox community is strongly embracing "wireframing" to create simple but adequate representations of the look and feel (and flow) of a site. Even non-FuseBox applications can benefit from this approach, and previous CFDJ articles have addressed this. See "Exploring the Development Process," by Hal Helms (Vol. 3, issue 3).

Finally, a CF project usually exists within the context of several other servers providing related services. Other opportunities for testing include testing the availability and performance of the Web server (and cluster, if any), the entire application server, the particular database (and database server, if any), the mail server, the ftp server, and so on. Consider these as you develop a testing plan.

I hope this trip around the testing world proves helpful for you. There's a lot more that we just couldn't cover, and many resources worth reviewing. I've decided to develop a one-day seminar as a result of my research for this article. If you're interested, drop me a line.


Articles

  1. "Stranger in a Strange Land: Bringing Quality Assurance to a Web Startup," by Lisa Crispin: www.stickyminds.com/docs_index/XUS247559file1.doc
  2. "Testing Your Web Site," by Eric Kaufman: www.veritest.com/testers'network/Web_testing1-1.asp
  3. "Understanding Performance Testing": http://msdn.microsoft.com/library/en-us/dnduwon/html/d5dplyover.asp
  4. "Web-Site Monitoring Derails Problems": www.informationweek.com/805/webdev.htm
  5. "Testing Considerations for Web-Enabled Applications," by Gerry Ocampo: www.veritest.com/testers'network/testing_considerations1.asp
  6. "Why Test the Web? How Much Should You Test?" by Mike Powers: www.veritest.com/testers'network/Web_testing21.asp
  7. White paper on Web testing: http://members.spree.com/oceansurfer/webtesting.htm

Books

  1. Nguyen, H. (2000). Testing Applications on the Web. Wiley.
  2. Splaine, S., et al. (2001). The Web Testing Handbook. Software Quality Engineering.
  3. Dustin, E., et al. (1999). Automated Software Testing: Introduction, Management, and Performance. Addison-Wesley.
  4. Graham, D., et al. (1999). Software Test Automation: Effective Use of Test Execution Tools. Addison-Wesley.
  5. Perry, W. E., et al. (1997). Surviving the Top Ten Challenges of Software Testing. Dorset House.
Perhaps on a par with a book is this informative and contemporary thesis on Web application testing by Jesper Ryden and Par Svensson: www.stickyminds.com/docs_index/XUS512798file1.zip

Portal/Information Resources URLs

  1. www.qaforums.com/
  2. www.sqe.com/
  3. www.stickyminds.com/
  4. www.softwareqatest.com/
  5. www.qacity.com
  6. www.veritest.com/testers'network/
  7. www.empirix.com/empirix/web+test+monitoring/resources/default.html
  8. www.keynote.com/services/html/product_lib.html
  9. www-svca.mercuryinteractive.com/resources/library/

More Stories By Charlie Arehart

A veteran ColdFusion developer since 1997, Charlie Arehart is a long-time contributor to the community and a recognized Adobe Community Expert. He's a certified Advanced CF Developer and Instructor for CF 4/5/6/7 and served as tech editor of CFDJ until 2003. Now an independent contractor (carehart.org) living in Alpharetta, GA, Charlie provides high-level troubleshooting/tuning assistance and training/mentoring for CF teams. He helps run the Online ColdFusion Meetup (coldfusionmeetup.com, an online CF user group), is a contributor to the CF8 WACK books by Ben Forta, and is frequently invited to speak at developer conferences and user groups worldwide.


Most Recent Comments
Charlie Arehart 10/25/01 07:09:00 PM EDT

I thought I'd mention a couple other tests that I failed to include (we were running short on available space), but are quite important.

You should test your app to see how it performs when either JavaScript and/or Cookies are turned off. Many apps can grind to a halt in such cases.

For instance, if you're using session variables without passing the URLTOKEN appropriately on every page and a browser doesn't support cookies, that browser user may find it impossible to access some pages on your site, appearing as a strange "bug" that's hard to detect and resolve.
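For instance, passing the token on every link and form action looks like this (the page names are placeholders; URLToken is the built-in variable holding the visitor's CFID/CFTOKEN pair):

```cfml
<!--- Hypothetical sketch: append CFID/CFTOKEN to links and form
      actions so sessions survive when a browser refuses cookies.
      Page names are placeholders --->
<CFOUTPUT>
  <A HREF="nextpage.cfm?#URLToken#">Continue shopping</A>

  <FORM ACTION="checkout.cfm?#URLToken#" METHOD="POST">
    <INPUT TYPE="submit" VALUE="Check out">
  </FORM>
</CFOUTPUT>
```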

Testing on a browser by turning off its support for cookies (and JavaScript) can be very enlightening.

I welcome comments from readers if I've missed any other important tests appropriate for CF developers.

@ThingsExpo has been named the Most Influential ‘Smart Cities - IIoT' Account and @BigDataExpo has been named fourteenth by Right Relevance (RR), which provides curated information and intelligence on approximately 50,000 topics. In addition, Right Relevance provides an Insights offering that combines the above Topics and Influencers information with real time conversations to provide actionable intelligence with visualizations to enable decision making. The Insights service is applicable to eve...
Multiple data types are pouring into IoT deployments. Data is coming in small packages as well as enormous files and data streams of many sizes. Widespread use of mobile devices adds to the total. In this power panel at @ThingsExpo, moderated by Conference Chair Roger Strukhoff, panelists will look at the tools and environments that are being put to use in IoT deployments, as well as the team skills a modern enterprise IT shop needs to keep things running, get a handle on all this data, and deli...
SYS-CON Events announced today that Grape Up will exhibit at SYS-CON's 21st International Cloud Expo®, which will take place on Oct. 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA. Grape Up is a software company specializing in cloud native application development and professional services related to Cloud Foundry PaaS. With five expert teams that operate in various sectors of the market across the U.S. and Europe, Grape Up works with a variety of customers from emergi...
SYS-CON Events announced today that T-Mobile will exhibit at SYS-CON's 20th International Cloud Expo®, which will take place on June 6-8, 2017, at the Javits Center in New York City, NY. As America's Un-carrier, T-Mobile US, Inc., is redefining the way consumers and businesses buy wireless services through leading product and service innovation. The Company's advanced nationwide 4G LTE network delivers outstanding wireless experiences to 67.4 million customers who are unwilling to compromise on ...
SYS-CON Events announced today that CollabNet, a global leader in enterprise software development, release automation and DevOps solutions, will be a Bronze Sponsor of SYS-CON's 20th International Cloud Expo®, taking place from June 6-8, 2017, at the Javits Center in New York City, NY. CollabNet offers a broad range of solutions with the mission of helping modern organizations deliver quality software at speed. The company’s latest innovation, the DevOps Lifecycle Manager (DLM), supports Value S...
With billions of sensors deployed worldwide, the amount of machine-generated data will soon exceed what our networks can handle. But consumers and businesses will expect seamless experiences and real-time responsiveness. What does this mean for IoT devices and the infrastructure that supports them? More of the data will need to be handled at - or closer to - the devices themselves.