Orchestrating application delivery (part 2)


Each score is specific to each instrument

So we have seen that the CIO is starved of data. But let’s take a closer look at what is going on in the development community by looking at two common myths:

Best-in-class is best

Each part of the Software Development Lifecycle (SDLC) is a silo. No matter your role, whether architect, designer, developer, tester, release engineer or something else, you are blessed with incredibly good technology tools that support your ability to conceive and deliver your part of the application. These are point-solution tools. They usually come from specialist vendors with decades of experience developing tools specifically for that part of the lifecycle. These tools speak your language, they are optimized for how you work, and they encapsulate the best practices and modern methodologies of your part of the application delivery experience.

However, these tools are not well suited to collaboration across the lifecycle; they rarely provide any automation for the handoff from one phase of the SDLC to the next. There is frequent cut-and-paste from one database to another, often requiring some kind of data migration and transformation process to run. The tools don’t “talk” to each other, so when data changes in one it is not reflected in the others. And when the tools are integrated, that integration is often very brittle and fails as soon as one or other of the tools is upgraded.

One-size-fits-all fits me

So enter the one-size-fits-all (OSFA), one-stop-shop, everything-you-wanted-all-in-one-place tools. These tools try to provide an end-to-end solution that supports the whole lifecycle. The problem is that they come from vendors who have to reach a mass audience, so they develop very generic solutions that are entirely agnostic about your role in the SDLC and about your organization’s domain, methodology, best practices, policies and procedures. Their big selling point is often the “single repository” database: all my artifacts are in one database and I can control access to everything from that one place.

While the concept seems to make sense, you have to dig a little deeper and ask some questions. Most of these tools are incredible “bloatware” solutions with hundreds of menus and options and a dizzying battery of configuration and customization needs. But perhaps the most insidious feature of these tools is that they are optimized for a single platform. Whether it is the OS, the hardware, the language or the methodology, these tools try to force you to focus on one platform. But today’s applications are highly heterogeneous, because we constantly innovate in our solutions, embrace new ideas and incorporate them into existing systems. Of course, migrating all your existing data into one of these OSFA products is expensive, time consuming and very error prone. And let’s not forget that none of these vendors is the domain expert in every part of the lifecycle. So these bloatware tools provide adequate-to-capable functionality throughout, but never the deeper capabilities for dealing with the inevitable corner cases we all have in all of our systems.

Best-of-all

So what is the answer? There is a hybrid approach here that needs to be considered: you want best-in-class tools, but you want them to work together and support your end-to-end delivery process.

Orchestrated Application Delivery addresses this problem directly and simply, and it preserves your existing investment in the solutions you have chosen because they meet your technology needs. It supports your methodology, your topology and your geographical challenges. It is designed to allow the most flexible implementation without massive data migration, and it supports an evolutionary roll-out strategy (not a revolutionary “overturn everything we hold dear because we have a new tool to implement” strategy!).

We’ll look at how this works in part 3.

About Kevin

In the past year Kevin has spoken at 20 conferences and seminars on a range of leading IT topics, including methodologies, business analysis, quality assurance techniques, governance, open source issues, tool interoperability, release management, DevOps, Agile and ITIL, from the mainframe to distributed platforms to the web, mobile, wearable and embedded systems. He is a much-sought-after speaker, recognized around the world for his provocative and entertaining style. Kevin is a 40-year industry veteran, holder of three technology patents, and today is VP of Worldwide Marketing and Chief Evangelist at leading Application Development and Deployment vendor Serena Software. In the past decade he has been crossing the globe and has met with over 4,000 people. At Serena he works closely with industry analysts, the press, customers, partners and employees to exchange ideas about industry direction and business issues. He was born and educated in the UK and lives and works in the Bay Area, California.