Could you be creating a legacy system for your customers?

Averting the Legacy Label: Why do my users and customers consider our software a Legacy System?

Having your product labeled a Legacy System by your team or clients can be a nightmare for any product manager. As we mentioned in the previous post, your product doesn’t need to be 10 years old to be considered a Legacy System by your customers or tech teams. Incredibly, this label could actually stick from day 1 of your product going live.

In this second of four posts in our Legacy System series, we’ll set the context for the preventative and corrective measures your product team can take to keep your product from becoming a Legacy System. We’ll look at two high-level categories of reasons your system may earn the Legacy label:

  • Evolving user expectations, and
  • Technology architecture decision missteps.

Legacy Systems Born from Evolving Customer Expectations

Product managers and their software architects can help avert the Legacy System label by first reviewing in more detail which customer expectations have changed over time, including expectations about software:

  • user interfaces,
  • security,
  • performance,
  • analytics vs. reporting,
  • open APIs, and
  • cost of ownership from licensing and developer costs.

Evolving User Interface Expectations

The most obvious change in customer expectations has been around the user interface. Here is a brief history of the trends that have driven software to obtain the Legacy label:

- 90’s customers wanted a mouse. Thirty years ago, as Windows came to dominate homes and workplaces, the obvious Legacy System problem for most product managers was that users no longer wanted to learn and interact with the traditional “console” interfaces of MS-DOS, Unix, or mainframe applications. Most business software did not include a graphical user interface (GUI). Though keyboard/console-driven interfaces may arguably have been more efficient and productive for many “power users”, the usability and training burden on staff was reduced by building mouse-driven GUIs.

- 00’s customers wanted browser access to their software. Twenty years ago, Legacy System pains were driven by users’ expectations to access their business data through a web browser, both inside and outside the organization.

- 10’s customers expected mobile apps. In the last ten years, the problem has been delivering services to mobile devices. Companies have upgraded their web interfaces to “mobile-first” [link] or responsive web UIs, but more and more are being pushed to “go native”, building mobile apps for the iOS and Android platforms that can exploit the device’s hardware (sensors, camera, etc.).

- The most obvious emerging expectation for more and more business applications is “hands-free” interaction (i.e., not typing), using the sensors and camera of mobile devices and the cloud AI “interfaces” offered by Apple’s Siri and Amazon’s Alexa.

Evolving Security Expectations

For its first few years, Gmail wasn’t running on SSL. Most PCs didn’t have a password prompt. Most Wi-Fi hotspots were not protected by a password. IT professionals knew for years that these problems existed before products, policies, and laws caught up to close some of these obvious gaping holes.

More innovation and ever more powerful computers will inevitably create more critical exploits. Examples of emerging threats include sophisticated phishing driven by artificial intelligence and massive increases in computing power (quantum) that will provide ways to crack the most secure systems and fool the most technically sophisticated users.

Evolving Performance Expectations

For most users, waiting overnight for reports or hours for queries to run has become unthinkable, even on the largest datasets. They have become accustomed to cloud-based services for search, email, and analytics that return results instantly over very large datasets. Analysts, traders, and executive decision makers all know their time is incredibly valuable, and waiting for answers to “simple” queries has become unacceptable.

Evolving Analytics Expectations

Data exploration tools such as Google Analytics, along with a wave of more affordable Online Analytical Processing (OLAP) BI tools (e.g., Tableau, QlikView, Microsoft PowerPivot), have become available alongside the heavyweight enterprise BI suites. Business users now expect to analyze their data in real time to inform decisions.

Giant systems optimized for transactional processing could be accessed directly to create monthly, weekly, or maybe daily “canned” reports, but they could not take the load of ad-hoc queries from dozens, hundreds, or thousands of users and return responses within a couple of seconds.
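
As a rough sketch of the difference (with hypothetical data and function names), the “canned report” approach re-scans the transactional rows on every request, while the modern expectation is closer to hitting a small, pre-aggregated summary that can serve many concurrent users in milliseconds:

```python
# Hypothetical illustration: answering "revenue per region" by scanning raw
# transactions vs. reading a pre-aggregated summary. The data and names are
# made up; real systems do the same with OLAP cubes or summary tables.
from collections import defaultdict

transactions = [  # stand-in for millions of OLTP rows
    {"region": "EMEA", "amount": 120.0},
    {"region": "APAC", "amount": 80.0},
    {"region": "EMEA", "amount": 45.5},
]

def revenue_by_region_scan(rows):
    """Full scan of the transactional data -- fine for a nightly canned report."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

# Build the summary once (nightly or incrementally), then let hundreds of
# concurrent users query the tiny summary instead of the raw table.
summary = revenue_by_region_scan(transactions)

def revenue_for_region(region):
    return summary.get(region, 0.0)

print(revenue_for_region("EMEA"))  # 165.5
```

The point is not the code itself: interactive analytics usually means maintaining a separate, query-shaped copy of the data rather than pointing a reporting tool at the live transactional system.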

Evolving Expectations About How to Access Data

Having applications available to users from the browser was incredibly useful, but once the business data was obviously available outside the corporate walls, partners and customers wanted to “consume the data” outside the confines of the provided user interface. They wanted to “mesh” the data provided by one partner with their own data, do their own analysis, and resell the data for other purposes. Selling business software now means providing many ways for your customers to access the data and logic of your system: access to the underlying database, various APIs, and SDKs.
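
As a minimal, hypothetical sketch of what “more than a UI” can look like, here is a tiny read-only HTTP API (using Flask, with made-up resource and field names) that exposes the same records the user interface renders, so partners can pull the data into their own tools:

```python
# Hypothetical read-only API sketch (Flask). The point: the data behind the
# UI is also reachable as JSON, so customers and partners can "consume the
# data" outside the provided user interface.
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stand-in for the system's real data store.
ORDERS = {
    1: {"id": 1, "customer": "Acme Corp", "total": 199.00, "status": "shipped"},
    2: {"id": 2, "customer": "Globex", "total": 540.50, "status": "open"},
}

@app.route("/api/orders")
def list_orders():
    """Let partners pull the full list instead of scraping the UI."""
    return jsonify(list(ORDERS.values()))

@app.route("/api/orders/<int:order_id>")
def get_order(order_id):
    order = ORDERS.get(order_id)
    if order is None:
        abort(404)
    return jsonify(order)

if __name__ == "__main__":
    app.run(port=8080)
```

A real product would add authentication, versioning, and rate limiting, and often an SDK wrapping these endpoints, but even this shape turns a product from “a screen” into “a data source”.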

Evolving Cost of Ownership Expectations

Finally, IT departments have evolved their expectations about how much it is acceptable to spend on hardware infrastructure, software licenses, and internal or outsourced developers:

  • Hardware infrastructure has evolved from huge mainframes, to less expensive Unix/PC/Linux servers, and eventually to cloud services. The largest companies today are offering their newest services relying on underlying technology deployed to all three of these environments.
  • Even with the offshoring trend in software development, developing and maintaining custom software in-house has become increasingly expensive. Platform-as-a-service offerings such as Salesforce have thrived largely on the promise of lowering the cost of creating in-house software.
  • Software licensing models have had to evolve as computing infrastructure and the number of users have changed. Your software product could be considered Legacy simply because of its licensing model.

Legacy Systems Born from Poor Software Architecture Decisions

Beyond external market expectations, there are classic pressures and biases that software architects face when creating a new technology product. The two most common mistakes are:

  • Selecting cost-reducing silver bullets, and
  • Resume-driven technology biases

We’ll review these two categories of mistakes, and then give some specific cautionary examples of software we’ve worked with (been handed or created ourselves!) that was labeled a Legacy System because of early architecture decisions.

Cost-Reducing Silver Bullet Technologies

Some technology choices promise to massively reduce the effort required to build your software. One silver bullet promises to eliminate the need to hire talented developers at all; the other promises to turn a single talented developer into an entire development team.

Promise to turn anyone into a developer

Some technologies promise to reduce the cost of development by avoiding the need to hire expensive developers: the experienced, formally trained computer scientists. Microsoft Access and Salesforce, for example, promise that “software engineers” can be born after a few days at a training seminar or after flipping through a “For Dummies” book.

Promise to turn good developers into Superheroes

Other technologies promise strong developers (i.e., the expensive ones) that they can eliminate the “drudgery” of some coding tasks, giving them the power of dozens of average developers. Code-generation frameworks spring up every few months, and many ambitious young software developers create their own. Many of these tools are great when they are proven (e.g., Rails and its many clones) and used for the business problems they were designed for. However, when you draw outside the lines with new ideas or requirements, it takes a very sophisticated developer to debug what these tools produce, and you are often left with “write-only code”: an unmaintainable piece of software.

(Relatedly, you need to make sure your outsourcing partner isn’t choosing the path of most effort. Creative, ambitious developers have been stuck with “heavy” architectures that conveniently support projects with tens of thousands of billable dev hours: frameworks and technologies successfully pushed by the unholy alliance between large consulting firms and large product companies.)

Resume-Driven Architecture Biases

Another prime category of architecture mistakes is those driven by the resumes (past and dreamed-of) of the team making the software architecture decisions. Your team can lack ambition or have too much of it:

  • Team lacks ambition. They base the architecture decision solely on what they know best, regardless of the business problem or the technology’s maturity. This unambitious team uses the technology they know, the way they have always used it, without much reflection or learning from the market. Such developers may stick with a technology even when it is obviously flawed in some major way or no longer popular (which will increase hosting and long-term maintenance problems).
  • Team is too ambitious. They base the architecture decision on what will look best on their resumes. Your team may choose technology that gives them “street cred” among their developer peers, and, even more importantly, helps their next gig or resume. Plus, for smart folks (who aren’t working under a fixed-price, fixed-time contract), it’s simply fun to learn new things. Popularity of a technology matters for recruiting developers for a growing team and for the comfort level of customers who might want to host your product themselves; however, the technology chosen should also clearly be mature.

Tier by Tier, Tear by Tear :( Examples of Architecture Decisions Gone Bad

We’ll conclude this post by looking at some specific examples of Legacy System architecture decisions at each tier of a modern application.

The Legacy User Interface Tier

Although browser-based web applications have been around for nearly 20 years, the technology trends have been strong and sweeping. This was driven first by the “browser” wars, with Microsoft keeping JavaScript/HTML/CSS unstandardized between IE and alternative browsers. Unhelpfully, the problem was also being “solved” by competing ideas from Sun Microsystems/Oracle (Java Applets) and Adobe (Flash).

Then, once the browser stabilized as a software platform, we still saw huge turnover in popular-today, dead-tomorrow web UI frameworks. We assume this was mostly caused by two forces:

a) many web developers are not trained formally as software developers,

b) formally trained computer scientists were coming up with ways to make web UI development feel like “real development”. As a result, we’ve seen web-tier framework trends for “web applications” come and go extremely quickly:

- ExtJS (later called Sencha): made web applications work like PC desktop applications, and it was pushed by Yahoo, so it must be better.

- GWT: Google’s in-house tech, used for its advertising platform, promising Java developers the ability to create HTML/CSS/JavaScript UIs.

- Backbone.js: a framework that Ruby on Rails developers could love.

- Angular: a framework developed by some folks at Google, and it’s complex, so it must be better :)

- React: a framework developed by Facebook, so it must be even better.

Adding to the churn was the need to build mobile experiences, combined with a severe shortage of developers with the skills to learn and support native development (in Objective-C or Java).

The Legacy Middle Tiers

Mobile backend as a service is the most recent example. User interface developers wanted to build and deploy their applications without having to learn or build the middle tiers. Some of the darlings in this space were quickly acquired (Parse by Facebook, Firebase by Google), and Parse was promptly left to die: two years ago Facebook announced that Parse, with hundreds of thousands of live mobile apps using it, would shut down, leaving thousands of developers to migrate their back ends just to keep their apps running.

The Legacy Data Tiers

Relational databases have shown their age, but they have proven themselves as reliable backbones for scalable transactional systems. Even so, teams reach for NoSQL databases on problems that would work fine on a relational database.

They also reach for neural-network databases for analytics where a star schema hosted on a traditional relational database would be sufficient.
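
For context on the “star schema would be sufficient” point, here is a deliberately tiny, hypothetical star schema expressed with pandas: one fact table joined to two dimension tables and aggregated, the kind of workload an ordinary relational database handles comfortably (table and column names are made up):

```python
# Hypothetical star schema in miniature: fact_sales joined to small
# dimension tables, then aggregated -- the same shape translates directly
# to SQL joins plus GROUP BY on a relational database.
import pandas as pd

dim_product = pd.DataFrame({
    "product_id": [1, 2],
    "category": ["Hardware", "Software"],
})
dim_region = pd.DataFrame({
    "region_id": [10, 20],
    "region": ["EMEA", "APAC"],
})
fact_sales = pd.DataFrame({
    "product_id": [1, 1, 2],
    "region_id": [10, 20, 10],
    "revenue": [100.0, 250.0, 80.0],
})

report = (
    fact_sales
    .merge(dim_product, on="product_id")
    .merge(dim_region, on="region_id")
    .groupby(["region", "category"], as_index=False)["revenue"]
    .sum()
)
print(report)
```

Exotic data stores only start to pay off once this kind of join-and-aggregate query stops being fast enough on a relational engine.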

Conclusion

We have summarized historical and practical examples of the forces creating legacy software. In our next Legacy Software post we’ll go over a practical checklist to help reduce the chance that your new product will quickly be considered Legacy. In the final post in the series we’ll review your options for recovering from your Legacy Software problem (options for upgrading or replacing).
