The Future of Software Development

How software developers can stay current while staying sane

"Technology makes it possible for people to gain control over almost everything, except technology." - John Tudor

As software developers, our mission is to deliver positive, technology-based solutions - software that provides both the means and the method for working faster, performing better, achieving more. There is little doubt that the technologies we create provide users with the control and functionality needed to be more efficient and productive. However, what happens when the tools we use to produce these solutions get out of control?

Evolution in the technology ecosystem has accelerated to the speed of light - blink and you may miss something important. The software development landscape has mushroomed with near-exponential growth; new products and innovations flood the market daily. It raises the question: does this swift evolutionary pace represent a positive stage in the maturation of software development, or are we moving too quickly for our own good? What does the future of software hold for us? It is an open question that can only be answered with time.

All of this warp-speed evolution and growth means that today's world of software development is bigger and far more complex than at any time before. With the sheer volume of diverse technologies and products hitting the market, compounded by truncated beta-to-release delivery cycles, it is all too easy for developers to become overwhelmed and feel as if they're falling behind. In the Microsoft space alone, hundreds of emerging technologies are regularly being released to market. However, this rapid evolution and explosive technology proliferation isn't limited to any one market segment. Rather, it's endemic to software development as a whole - and represents a mounting challenge that must be addressed if developers want to maintain a base of knowledge that is current and timely.

How can software developers keep apace of relevant new developments without becoming buried beneath an avalanche of information about superfluous tools and technologies that are only marginally related or beneficial to their own projects? How do you separate the wheat from the chaff and really zero in on those technologies that are truly important and valuable? While there's no simple answer to this riddle, one method that may prove useful is staying on PAR: Proactive, Abstraction, Refinement.

Going Proactive: The Waiting Game Is a Losing Game
In the early days of .NET development - the "good ol' days," so to speak - new tools and technologies were released at a relatively disciplined pace. During those first six to eight years, developers benefited from new releases and updates being brought to market at reasonable intervals. It was not only possible, but a logical, intelligent choice for software developers to take an extended amount of time to research, review, evaluate, and perform a hands-on trial of these technologies in order to accurately determine whether they were relevant to either project or individual goals. On rare occasions, this process would begin with technology betas but more often than not, teams could afford to wait for release candidates or near-final software before conducting their evaluations. In other words, the waiting game was one that paid off in the long run.

Fast-forward to today: the waiting game has become a losing game. Technology is now moving along at a much faster pace than it was a mere decade ago. The shifting software landscape touches a far broader spectrum of technologies - enterprise, Web, cloud, and mobile, for example - than ever before, resulting in an increasingly wider cross-platform, cross-technology software footprint. Yet though this swift acceleration of evolution and expansion might be unsettling to some, the pace is appropriate for the scenarios and world that we, as developers, must serve.

The last two to three years have seen a rapid acceleration in the introduction of new technologies and updates - beta-to-release-candidate cycles are now compressed so that it often feels as if new technologies are being introduced every month. For the average .NET developer, this means the luxury of time has evaporated; sitting on the sidelines, just waiting for updates and new technologies to come along, is a sure path to obsolescence.

Software developers who want to avoid the obsolescence trap must become more proactive. In practical terms, this means breaking free of the conventional "wait-and-see" mentality, and instead reaching out to embrace coming change: identifying which emerging technologies, irrespective of the buzz factor, best serve current and future project needs, actively seeking out the latest demos and information from experts and project teams, and engaging in the discussion before the final release hits the market. Success is based on one's ability to be nimble and adapt quickly to evolving conditions. Software developers willing to make the effort necessary to become more proactive will find themselves well equipped to not just survive but to thrive in the rapidly shifting technology ecosystem.

The Art of Abstraction
Being proactive rather than reactive is a critical element for success in the modern-day world of software development, yet it also poses a formidable challenge: out of the multitudes of new technologies hitting the market, how do you decide which to pursue? How do you determine which ones will be viable and evergreen, and which will fall by the wayside? As it is virtually impossible to predict which technologies will establish themselves as indispensable, developers often find themselves under increasing pressure to adopt a "learn everything" strategy. However, becoming a "jack of all trades, master of none" can be just as detrimental to success as sitting on the sidelines and waiting for the dust to settle.

As developers are generally not blessed with a crystal ball and its predictive abilities, an alternate method for preparing for a successful future is needed. The art of abstraction - reducing and factoring out specific details and minutiae in order to focus on critical concepts - is just such a method. By adding a layer of abstraction between themselves and the shifting technology environment, developers can effectively select, gain a broad understanding of, and manage a wider variety of relevant technologies without feeling as if they're being forced to fully learn every new niche technology that comes along.

Software developers can achieve the level of abstraction that best suits their individual situation in any number of ways. For example:

  • Identify someone who can act as a technology expert - such as a project or team leader, or an industry analyst - and provide a learned, level-headed, and objective view of a given technology. After identifying an expert, take the time needed to really get to know the person. Much like vetting a vital service provider such as an accountant, attorney, or even an auto mechanic, it's critical to validate the veracity of any expert opinion.
  • If finding and identifying an expert is impractical or impossible, find a vendor that is trustworthy, one that has demonstrated its worth in the past. It's less risky to bet on a company with proven, dependable technologies delivering added value than on a new, not-yet-proven technology.
  • Leverage the power of abstraction tools, like Object Relational Mapping (ORM) suites. With their unique ability to act as an equalizer of sorts - allowing software developers to work effectively within multiple diverse environments without requiring specialized knowledge of each database system - ORMs are an efficient and effective means of capitalizing on emerging technologies without requiring developers to learn each one from the bottom up.
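
The abstraction-layer idea behind the last bullet can be sketched in a few lines. This is a minimal, hypothetical example (the `UserRepository` interface and both backends are invented for illustration, and Python stands in for whatever stack a team actually uses): application logic is written once against a small interface, so the underlying storage technology can change without touching that logic - the same insulation an ORM provides at larger scale.

```python
import sqlite3
from abc import ABC, abstractmethod

class UserRepository(ABC):
    """Hypothetical abstraction: calling code depends on this interface,
    not on any particular storage technology."""
    @abstractmethod
    def add(self, name: str) -> None: ...
    @abstractmethod
    def all_names(self) -> list[str]: ...

class InMemoryUserRepository(UserRepository):
    """Trivial backend, e.g., for tests or prototyping."""
    def __init__(self) -> None:
        self._names: list[str] = []
    def add(self, name: str) -> None:
        self._names.append(name)
    def all_names(self) -> list[str]:
        return list(self._names)

class SqliteUserRepository(UserRepository):
    """Real database backend; an in-memory SQLite DB keeps the sketch self-contained."""
    def __init__(self) -> None:
        self._conn = sqlite3.connect(":memory:")
        self._conn.execute("CREATE TABLE users (name TEXT)")
    def add(self, name: str) -> None:
        self._conn.execute("INSERT INTO users (name) VALUES (?)", (name,))
    def all_names(self) -> list[str]:
        # ORDER BY rowid preserves insertion order.
        rows = self._conn.execute("SELECT name FROM users ORDER BY rowid")
        return [row[0] for row in rows]

def register_team(repo: UserRepository) -> list[str]:
    # Application logic written once, against the abstraction only.
    for name in ("Ada", "Grace"):
        repo.add(name)
    return repo.all_names()

# The same logic runs unchanged against either backend.
print(register_team(InMemoryUserRepository()))  # ['Ada', 'Grace']
print(register_team(SqliteUserRepository()))    # ['Ada', 'Grace']
```

Swapping the backend requires no change to `register_team` - which is precisely what lets a team adopt a new storage technology without relearning its application code from scratch.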

No matter how it is accomplished, applying the art of abstraction within the development environment enables developers to insulate themselves from the turbulence and volatility of the shifting software landscape.

Refined Vision: Seeing the Forest for the Trees
As the old adage "you can't see the forest for the trees" suggests, when faced with a veritable torrent of new technologies it can be extraordinarily difficult for software developers to see the whole picture as it relates to their situation - which new innovations may have a material impact on existing or future projects, and which technologies are on the verge of obsolescence and the annals of computer history. This clearer view is essential to both individual developers and development teams as a whole if they want to succeed. Achieving a more focused view requires a bit of filtering; however, it also makes it possible to see the forest despite the trees.

Refined vision doesn't necessarily mean putting on blinders or selecting just one or two technologies to follow. Despite the grandstanding of various technology pundits, there is no single absolute set of technologies that every software developer must learn in order to be successful. It is true, however, that there are definitely right and wrong tools and technologies that should be learned for specific scenarios; determining which technologies are the best fit is the first step in the refinement process.

Performing a thorough evaluation of the unique needs of a particular development environment enables individual software developers and development teams to identify current and near-future project needs. This better understanding of project needs in turn facilitates the filtering process - locating and pursuing those relevant technologies that deliver the greatest added value, while ignoring the "new toy" buzz surrounding technologies that will have little or no bearing. Getting your feet wet and learning more about a new technology than just its name means spending a few evenings becoming familiar with it through in-depth research, hands-on interaction with demos, and engagement with experts in the space. By refining their vision, software developers can identify which new technologies should be added to their short list, thereby keeping their skill set up to date and fresh.

The Future of Software Development
There is no end in sight to the deluge of new technologies being delivered to market, making it fundamentally impossible to accurately predict the future of software development. What can be predicted, however, is the continued mounting pressure on developers to readily accept and quickly adapt to accelerated technology evolution cycles. If there is any one idea that software developers must absolutely grasp, it is this: it really is okay not to know it all. In truth, software developers should be less concerned with knowing the ins and outs of every new niche technology and more concerned with figuring out what is best for their own development practices and environment, so they remain nimble enough to adapt to quickly evolving conditions. With a genuine understanding of this concept, developers will find that it is truly possible to keep their skills current while keeping their sanity intact.

More Stories By Todd Anglin

Todd Anglin is Chief Evangelist for Telerik, a leading vendor of development, automated testing, and team productivity tools, as well as UI components for Microsoft .NET. Before joining Telerik, he worked for a large Fortune 200 financial services company IT shop where he learned the way of the “Enterprise” – big budgets, big projects, legacy systems, and incessant measurement. He now leverages this Enterprise experience to help Telerik make tools that make the lives of all developers as easy as possible. Todd is an active author and speaker in the .NET community, focusing on web development technologies, a Microsoft MVP, founder and President of the North Houston .NET Users Group, and an O'Reilly author.
