The Future of Software Development

How software developers can stay current while staying sane

"Technology makes it possible for people to gain control over almost everything, except technology." - John Tudor

As software developers, our mission is to deliver positive, technology-based solutions - software that provides both the means and the method for working faster, performing better, achieving more. There is little doubt that the technologies we create provide users with the control and functionality needed to be more efficient and productive. However, what happens when the tools we use to produce these solutions get out of control?

Evolution in the technology ecosystem has accelerated to the speed of light - blink and you may miss something important. The software development landscape has mushroomed with near-exponential growth; new products and innovations flood the market daily. This raises the question: does this swift evolutionary pace represent a positive stage in the maturation of software development, or are we moving too quickly for our own good? What does the future of software hold for us? It is an open question that can only be answered with time.

All of this warp-speed evolution and growth means that today's world of software development is bigger and far more complex than ever before. With the sheer volume of diverse technologies and products hitting the market, compounded by truncated beta-to-release delivery cycles, it is all too easy for developers to become overwhelmed and feel as if they're falling behind. In the Microsoft space alone, hundreds of emerging technologies are regularly released to market. However, this rapid evolution and explosive technology proliferation isn't limited to any one market segment. Rather, it's endemic to software development as a whole - and represents a mounting challenge that must be addressed if developers want to maintain a base of knowledge that is current and timely.

How can software developers keep pace with relevant new developments without becoming buried beneath an avalanche of information about superfluous tools and technologies that are only marginally related or beneficial to their own projects? How do you separate the wheat from the chaff and zero in on the technologies that are truly important and valuable? While there's no simple answer to this riddle, one method that may prove useful is staying on PAR: Proactive, Abstraction, Refinement.

Going Proactive: The Waiting Game Is a Losing Game
In the early days of .NET development - the "good ol' days," so to speak - new tools and technologies were released at a relatively disciplined pace. During those first six to eight years, developers benefited from new releases and updates being brought to market at reasonable intervals. It was not only possible but also a logical, intelligent choice for software developers to take an extended amount of time to research, review, evaluate, and perform a hands-on trial of these technologies in order to accurately determine whether they were relevant to project or individual goals. On rare occasions this process would begin with technology betas, but more often than not, teams could afford to wait for release candidates or near-final software before conducting their evaluations. In other words, the waiting game was one that paid off in the long run.

Fast-forward to today: the waiting game has become a losing game. Technology is now moving at a much faster pace than it was a mere decade ago. The shifting software landscape touches a far broader spectrum of technologies - enterprise, Web, cloud, and mobile, for example - than ever before, resulting in an increasingly wide cross-platform, cross-technology software footprint. Yet though this swift acceleration of evolution and expansion might be unsettling to some, the pace is appropriate for the scenarios and world that we, as developers, must serve.

The last two to three years have seen a rapid acceleration in the introduction of new technologies and updates - beta-to-release-candidate cycles are now compressed so that it often feels as if new technologies are being introduced every month. For the average .NET developer, this means the luxury of time has evaporated; sitting on the sidelines, just waiting for updates and new technologies to come along, is a sure path to obsolescence.

Software developers who want to avoid the obsolescence trap must become more proactive. In practical terms, this means breaking free of the conventional "wait-and-see" mentality, and instead reaching out to embrace coming change: identifying which emerging technologies, irrespective of the buzz factor, best serve current and future project needs, actively seeking out the latest demos and information from experts and project teams, and engaging in the discussion before the final release hits the market. Success is based on one's ability to be nimble and adapt quickly to evolving conditions. Software developers willing to make the effort necessary to become more proactive will find themselves well equipped to not just survive but to thrive in the rapidly shifting technology ecosystem.

The Art of Abstraction
Being proactive rather than reactive is a critical element for success in the modern-day world of software development, yet it also poses a formidable challenge: out of the multitude of new technologies hitting the market, how do you decide which to pursue? How do you determine which ones will be viable and evergreen, and which will fall by the wayside? As it is virtually impossible to predict which technologies will establish themselves as indispensable, developers often find themselves under increasing pressure to adopt a "learn everything" strategy. However, becoming a "jack of all trades, master of none" can be just as detrimental to success as sitting on the sidelines and waiting for the dust to settle.

As developers are generally not blessed with a crystal ball and its predictive abilities, an alternate method for preparing for a successful future is needed. The art of abstraction - reducing and factoring out specific details and minutiae in order to focus on critical concepts - is just such a method. By adding a layer of abstraction between themselves and the shifting technology environment, developers can effectively select, gain a broad understanding of, and manage a wider variety of relevant technologies without feeling as if they're being forced to fully learn every new niche technology that comes along.

Software developers can achieve the level of abstraction that best suits their individual situation in any number of ways. For example:

  • Identify someone who can act as a technology expert - such as a project or team leader, or an industry analyst - and provide a learned, level-headed, and objective view of a given technology. After identifying an expert, take the time needed to really get to know the person. Much like vetting a vital service provider such as an accountant, attorney, or even an auto mechanic, it's critical to validate the credibility of any expert opinion.
  • If finding and identifying an expert is impractical or impossible, find a trustworthy vendor - one that has demonstrated its worth in the past. It's less risky to bet on a company with proven, dependable technologies that deliver added value than on a new, not-yet-proven technology.
  • Leverage the power of abstraction tools, like Object Relational Mapping (ORM) suites. With their unique ability to act as an equalizer of sorts - allowing software developers to work effectively within multiple diverse environments without requiring specialized knowledge of each database system - ORMs are an efficient and effective means of capitalizing on emerging technologies without having to learn each one from the bottom up.
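The principle behind ORM-style tools can be sketched in a few lines: application code depends on an abstract interface rather than on any one storage technology, so backends can be swapped without touching the consuming code. The sketch below is illustrative only - all names (`UserRepository`, `greet`, and so on) are hypothetical, and a real ORM does far more - but it shows the abstraction layer the bullet describes, using only the Python standard library:

```python
# Sketch of an abstraction layer over storage: the application talks to
# UserRepository and never learns which database system sits behind it.
# All names here are hypothetical, for illustration only.
import sqlite3
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Optional


@dataclass
class User:
    id: int
    name: str


class UserRepository(ABC):
    """The abstraction: application code sees only this interface."""

    @abstractmethod
    def add(self, user: User) -> None: ...

    @abstractmethod
    def get(self, user_id: int) -> Optional[User]: ...


class InMemoryUserRepository(UserRepository):
    """Backend #1: a plain dict - handy for tests."""

    def __init__(self) -> None:
        self._users: dict[int, User] = {}

    def add(self, user: User) -> None:
        self._users[user.id] = user

    def get(self, user_id: int) -> Optional[User]:
        return self._users.get(user_id)


class SqliteUserRepository(UserRepository):
    """Backend #2: SQLite. Swapping backends requires no change to
    any code written against UserRepository."""

    def __init__(self, conn: sqlite3.Connection) -> None:
        self._conn = conn
        self._conn.execute(
            "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)"
        )

    def add(self, user: User) -> None:
        self._conn.execute("INSERT INTO users VALUES (?, ?)", (user.id, user.name))

    def get(self, user_id: int) -> Optional[User]:
        row = self._conn.execute(
            "SELECT id, name FROM users WHERE id = ?", (user_id,)
        ).fetchone()
        return User(*row) if row else None


def greet(repo: UserRepository, user_id: int) -> str:
    """Application logic written once, against the abstraction."""
    user = repo.get(user_id)
    return f"Hello, {user.name}!" if user else "Who?"


if __name__ == "__main__":
    for repo in (InMemoryUserRepository(), SqliteUserRepository(sqlite3.connect(":memory:"))):
        repo.add(User(1, "Ada"))
        print(greet(repo, 1))  # same call, different storage backend
```

The payoff is exactly the "equalizer" effect described above: when a new storage technology arrives, only one new repository class needs deep knowledge of it, while the rest of the codebase - and the rest of the team - keeps working against the stable interface.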

No matter how it is accomplished, applying the art of abstraction within the development environment enables developers to insulate themselves from the turbulence and volatility of the shifting software landscape.

Refined Vision: Seeing the Forest for the Trees
As the old adage goes, "you can't see the forest for the trees." When faced with a veritable torrent of new technologies, it can be extraordinarily difficult for software developers to clearly see the whole picture as it relates to their situation - which new innovations may have a material impact on existing or future projects, and which technologies are on the verge of obsolescence and the annals of computer history. Having this clearer view is essential to both individual developers and development teams as a whole if they want to succeed. Achieving a more focused view requires a bit of filtering, but it also makes it possible to see the forest despite the trees.

Refined vision doesn't necessarily mean putting on blinders or selecting just one or two technologies to follow. Despite the grandstanding of various technology pundits, there is no single absolute set of technologies that every software developer must learn in order to be successful. It is true, however, that there are definitely right and wrong tools and technologies that should be learned for specific scenarios; determining which technologies are the best fit is the first step in the refinement process.

Performing a thorough evaluation of the unique needs of a particular development environment enables individual software developers and development teams to identify current and near-future project needs. That better understanding of project needs in turn facilitates the filtering process: locating and pursuing the relevant technologies that deliver the greatest added value, while ignoring the "new toy" buzz surrounding technologies that will have little or no bearing. Getting your feet wet and learning more about a new technology than just its name means spending a few evenings becoming familiar with it through in-depth research, hands-on interaction with demos, and engagement with experts in the space. By refining their vision, software developers can identify which new technologies should be added to their short list, thereby keeping their skill set up to date and fresh.

The Future of Software Development
There is no end in sight to the deluge of new technologies being delivered to market, making it fundamentally impossible to accurately predict the future of software development. What can be predicted, however, is the continued mounting pressure on developers to readily accept and quickly adapt to accelerated technology evolution cycles. If there is any one concept that software developers must absolutely grasp, it is this: it really is okay not to know it all. In truth, software developers should be less concerned with knowing the ins and outs of every new niche technology and more concerned with figuring out what is best for their own development practices and environment, in order to remain nimble enough to adapt to quickly evolving conditions. With a genuine understanding of this concept, developers will find that it is truly possible to keep their skills current while keeping their sanity intact.

More Stories By Todd Anglin

Todd Anglin is Chief Evangelist for Telerik, a leading vendor of development, automated testing, and team productivity tools, as well as UI components for Microsoft .NET. Before joining Telerik, he worked in the IT shop of a large Fortune 200 financial services company, where he learned the way of the "Enterprise" - big budgets, big projects, legacy systems, and incessant measurement. He now leverages this Enterprise experience to help Telerik make tools that make the lives of all developers as easy as possible. Todd is an active author and speaker in the .NET community, focusing on web development technologies; a Microsoft MVP; founder and President of the North Houston .NET Users Group; and an O'Reilly author.
