Debunking myths

Three myths of application development

Don't be consumed by the hype or the history: three things have to be sorted out

This article appeared on TechTarget today.

Every truth is subject to a parallel rise in urban legend or myth, and software application development and delivery is no exception. True software quality, a product of both process and progress, depends on bypassing the persistent industry myths that surround development in order to reach the nirvana of “orchestrated” application delivery, where every process and group is coordinated and conducted with precision.

Indeed, a recent survey of application development and IT professionals* revealed that application development priorities are shifting. The top application lifecycle management (ALM) initiative, identified by 75 percent of respondents, was managing application development as a business process. And the #1 priority in application delivery was delivering applications faster – a notable change from recent years, when cutting costs topped the list.

Like politics, application development is subject to differing views and approaches, but regardless of the journey, the destination is the same: just as politics operates under the ideal of fair governance, application development is ultimately about the end result. It is within these varying approaches that myths about application development have taken root and quietly stunted the process itself. The top three myths in application development today are introduced below.

Application development myth #1: Best-in-class is best.

I risk angering the whole of Silicon Valley by taking aim at the best-in-class ideal, which is at the very heart of IT development, where any success inevitably gives rise to competitors, and a Darwinian survival of the fittest ensues – better, faster, cheaper. But try to get any of these “best-in-class” tools to work together in the context of application development and you quickly see why this first application development belief does not stack up.

Each part of the Software Development Lifecycle (SDLC) is a silo. Regardless of your role – architect, designer, developer, tester, release engineer and so on – there are incredibly good point solution tools on the market today to support your ability to conceive and deliver your part of the application.

These offerings usually come from specialist vendors who develop tools to support a very specific part of the lifecycle. These tools speak your language, are optimized for how you work and encapsulate the best practices and modern methodologies of your part of the process – and your part only.

However, these “best-in-class” tools are not best suited for collaboration across the lifecycle. They rarely provide any automation for the handoff from one phase of the SDLC to the next. There is frequent cut-and-paste from one database to another, often requiring some kind of data migration and transformation process. The tools don’t “talk” to each other, so when data changes in one it is not necessarily reflected in the other.

Across the industry, “best-of-breed” idealism has been allowed to flourish until the inevitable day of reckoning, when a business attempts to get all of the various tools to work together. The task is more complex than imagined: many an application has buckled under the weight of trying to connect disjointed tools across the lifecycle. They ultimately do not make beautiful music together.

Businesses are forced, either by vendor lock-in or by the sheer complexity of the various tools to manage, to throw away technology investments and effectively start all over from scratch.

Application development myth #2: One-size-fits-all “fits” me.

Enter the one-size-fits-all, one-stop-shop, everything-you-wanted-all-in-one-place tools. These tools try to provide an end-to-end solution that can facilitate the support of the entire lifecycle, an answer to the best-in-class problem described above.

The problem is that these one-size-fits-all tools come from vendors who serve a mass audience and so develop generic solutions that are entirely agnostic to the different roles in the SDLC, an organization’s domain or methodology, best practices, policies and procedures. The big selling point is often the “single repository” database – all artifacts in one, centrally managed database.

While the concept might make sense at first, dig a little deeper. Most of these tools are “bloatware” solutions with hundreds of menus and options and a dizzying battery of configuration and customization needs. But perhaps the most dangerous feature is that these tools are optimized for a single platform, forcing a commitment to one OS, platform, hardware or methodology.

In reality, the environment is dynamic. Today’s systems and applications are highly heterogeneous because we constantly innovate, embrace new ideas and incorporate them into existing systems. Migrating all of the existing data into a one-size-fits-all product can be expensive, time-consuming and error-prone.

And a one-size-fits-all vendor cannot possibly be a domain expert in every part of the lifecycle. These kinds of tools provide adequate functionality, but not the deeper capabilities needed to deal with the inevitable corner cases we all have in our systems.

Application development myth #3: Point-to-point integrations are the answer.

So we now understand two of the biggest myths, which brings us to the third. This myth holds that all of these problems can be remedied with point-to-point integrations: integrate one point tool with another and, voila, problem solved.

But the truth is that these point-to-point integrations are typically limited in functionality and amount to little more than automated cut-and-paste. As dependencies are added on an ad hoc basis, the result is a tangle of one-off integrations that are nearly impossible to maintain.

And such integrations are brittle by nature – even upgrading the software at either end of the integration can cause the whole thing to fall apart.
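To make the "automated cut-and-paste" point concrete, here is a minimal sketch of what a typical point-to-point integration script looks like. The tool names, field names and record format are all hypothetical, invented for illustration; they do not come from any real product.

```python
# A hedged sketch of a point-to-point "integration": a script that copies
# defect records from one (hypothetical) tracker's export format into another
# (hypothetical) tool's import format. All names here are illustrative.

# Hard-coded field mapping: tracker A's export keys -> tool B's import keys.
FIELD_MAP = {
    "defect_id": "issue_key",
    "summary": "title",
    "state": "status",
}

def translate_record(record_a: dict) -> dict:
    """Automated cut-and-paste: rename known fields, silently drop the rest."""
    return {FIELD_MAP[k]: v for k, v in record_a.items() if k in FIELD_MAP}

exported = {"defect_id": "D-101", "summary": "Login fails", "state": "Open"}
print(translate_record(exported))
# If an upgrade to tracker A renamed "state" to "workflow_state", that field
# would silently vanish from the output -- the integration breaks without any
# error, which is exactly the brittleness described above.
```

Note the design flaw this exposes: the mapping knows nothing about either tool's schema version, so any change at either end degrades it silently rather than failing loudly.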

We need to connect tools to create the kind of reporting that actually helps people to manage the business. Point-to-point integrations do not support this, and lack the controls needed to ensure the right project approval throughout the lifecycle.

Today’s software business also demands the ability to trend data over time, to identify where development efforts are improving and where they are breaking down, and to eradicate potential issues or bottlenecks early on.

Breaking the myth cycle to empower the lifecycle

Myths persist, but recognizing them will help to break the cycle and drive faster, better software delivery. So what is the answer? Consider a hybrid solution – you may want some best-in-class solutions, but you still need all of your tools to “talk” to each other and to support the end-to-end process.

Process automation ensures that processes are followed and that designated individuals can insert themselves into the process to confirm approval (or rejection) at each step of the lifecycle. This is essential for accountability and traceability.
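The approval-gate idea described above can be sketched in a few lines. This is not any vendor's implementation; the phase names and the sign-off record are assumptions made up for illustration.

```python
# A minimal sketch of a lifecycle approval gate: work may only advance out of
# a phase once that phase's designated approver has recorded a sign-off.
# Phase names and the sign-off store are illustrative assumptions.

PHASES = ["design", "development", "test", "release"]

def can_advance(current_phase: str, signoffs: dict) -> bool:
    """Return True only if the current phase has a recorded sign-off."""
    return signoffs.get(current_phase) is not None

# "design" has been approved by an architect; "development" has not yet.
signoffs = {"design": "architect@example.com", "development": None}
print(can_advance("design", signoffs))       # True: sign-off recorded
print(can_advance("development", signoffs))  # False: no sign-off yet
```

The value of encoding the gate as a check, rather than a convention, is that the sign-off record doubles as an audit trail: accountability and traceability fall out of the same mechanism.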

Connect every tool to the high-level process and you’ll achieve orchestrated application delivery – preserving existing investments, and designed to allow the most flexible implementation without requiring massive data migration. And in the end, creating “orchestration” increases efficiencies around global application development and delivery through an end-to-end, integrated approach to software development.

*Survey conducted at the Gartner Application Architecture, Development and Integration (AADI) conference, Fall 2010
