10 outdated beliefs about software
The world of software remains a fascinating place and I keep being amazed at how rapidly it continues to evolve and transform. We've certainly come a long way since the early 1980s, when I was a teenager programming BASIC on my ZX81. Especially for those of us who have been in the field for decades, it's critical to continuously reinvent ourselves and our beliefs about the domain to make sure we don't get stuck in the old ways.
Unfortunately, I still run into a lot of very experienced people who in some way “didn’t get the memo” and are operating on an outdated belief system about software. This isn’t just the case for practitioners but also for customers, managers and senior leaders. To address this, I assembled 10 beliefs about software that I run into regularly and that in my experience are outdated and no longer true for most contexts.
1. Clear, precise requirements are instrumental for success
No individual or group can build a system without an understanding of what it should look like. Traditionally, we used requirements, defined at the start of the project, to describe the desired outcome and assumed that if we built the system according to the requirements, we'd be fine. However, requirements change all the time, capture less than 10 percent of what was really intended, easily slide into specifying solutions, are often used as an excuse and easily move us away from the outcomes we're really looking to accomplish. In many domains, it's better to accept that the full scope of the system will only become clear during the project, that things change constantly and that there are better ways than upfront-defined requirements to manage the growth of the system's functionality.
2. A carefully designed architecture is critical
In the early 2000s, I was one of those people preaching the importance of careful design and analysis of a system’s architecture before starting development. The belief was that especially non-functional requirements, such as performance and robustness, are hard to ‘bolt on’ to the system after development has started. Today, it’s clear that the architecture of a system evolves continuously in response to new requirements, evolving technologies, changing user behavior, and so on. And it’s entirely feasible to change the non-functional characteristics of existing systems by evolving their architecture.
3. Slow, manual testing is superior to automated testing
One of the most persistent outdated beliefs I run into is that manual testing at the end of a waterfall-style project is superior to continuous, automated testing in terms of accomplishing quality outcomes. The more safety-critical and complex the system at hand, the more prevalent this belief tends to be. Although our research shows that automated regression testing benefits from being complemented with exploratory testing by humans, it's obvious that testing a system extensively and frequently through automated means has to be superior to less frequent and less elaborate testing.
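To make the contrast concrete, here's a minimal sketch of an automated regression test, assuming a hypothetical `brake_distance()` function as the system under test; in practice such tests run on every commit in CI, rather than once at the end of the project.

```python
def brake_distance(speed_kmh: float, friction: float = 0.7) -> float:
    """Toy stand-in for the system under test (illustrative only)."""
    speed_ms = speed_kmh / 3.6
    return speed_ms ** 2 / (2 * 9.81 * friction)

def test_brake_distance_regression():
    # Pin down known-good behavior so any change that breaks it fails
    # immediately, instead of surfacing months later in manual end testing.
    assert abs(brake_distance(50) - 14.05) < 0.1
    assert brake_distance(100) > brake_distance(50)

test_brake_distance_regression()
```

The point isn't the toy physics; it's that a check like this costs nothing to rerun thousands of times, which is exactly what a manual end-of-project test campaign can't do.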
4. Technical debt means you did a poor job designing your architecture
Especially customers of complex software systems and more senior managers tend to believe that if you need to address technical debt, it’s because you or someone else did a poor job at some point in the past. Our research shows that all systems, independent of age, contain technical debt and the only way to handle it properly is to continuously invest a certain percentage of the R&D resources into managing it. As expectations are constantly shifting, the architecture and design of the system are never optimal for what we want from the system right now.
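The "continuously invest a certain percentage" idea can be sketched as a simple capacity split per iteration; the 20 percent figure below is an illustrative assumption, not a recommendation from this article.

```python
def split_capacity(total_points: int, debt_share: float = 0.20) -> dict:
    """Reserve a fixed slice of each iteration's capacity for debt work."""
    debt = round(total_points * debt_share)
    return {"features": total_points - debt, "debt": debt}

# An 80-point iteration with a 20% debt budget:
print(split_capacity(80))  # {'features': 64, 'debt': 16}
```

The mechanics are trivial; what matters is that the slice is reserved every iteration, because the debt never stops accumulating.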
5. Bill of materials trumps everything
In the embedded-systems domain, one of the broadly held beliefs is that reducing the bill-of-materials (BOM) cost for a product justifies any R&D expense as, especially for high-volume products, this one-time expense is outweighed by the savings generated for each manufactured product. In most contexts, this is no longer the case. Typically, the savings from squeezing electronics are minimal, but the more important reason is that most systems are now expected to evolve during their lifetime. If you have no headroom in the product, it’s impossible to deploy improved software to it and you’ve painted yourself into a corner.
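The trade-off described above is, at its core, a break-even calculation: a one-time R&D expense against a per-unit saving. A quick sketch, with purely illustrative numbers:

```python
def breakeven_units(rd_cost: float, saving_per_unit: float) -> float:
    """Units you must ship before the BOM saving repays the R&D spend."""
    return rd_cost / saving_per_unit

# Shaving $0.50 of electronics at a $200k engineering cost only pays
# off beyond 400,000 units -- and the calculation says nothing about
# the headroom you gave up for future software updates.
print(breakeven_units(200_000, 0.50))  # 400000.0
```

Even when the break-even math works out, the hidden cost of losing headroom for future software is what makes this belief outdated.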
6. The data is owned by the customer and we can’t monetize it
Especially in companies that started their life in mechanical engineering, there's a broadly held belief that the data generated by the product sold to a customer is owned by the customer and we can't use or monetize it. Data is an asset, just like software, patents and designs, and who owns the asset is a matter of contracts. If your agreement with the customer states that you own the data generated by the product, even though the customer is the legal owner of the product, then you own the asset and can monetize it.
7. Post-deployment is relevant only for (serious) quality issues
For those of us who grew up in the age of expensive product recalls due to early software issues, patching software in products deployed at customers is something considered only as a last resort. Times have changed, however: nearly all products are connected and deploying new software is virtually free. In addition, customers increasingly expect a product that continuously gets better. Consequently, continuous deployment of new software to products out in the field is increasingly becoming the norm.
8. A/B testing is only for online systems
Especially in the embedded-systems domain, where I spend a lot of my time, many still believe that A/B testing is only relevant for the large SaaS providers. In the age of connected systems and DevOps, it’s of course entirely feasible to run A/B tests in embedded systems as well. And it often is superior to some engineer guessing what the best way to realize some functionality is. As the saying goes: if you don’t know, experiment.
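One way to run such an experiment on connected devices is to assign each unit to a variant deterministically from its ID. This is a minimal sketch under that assumption (stable device IDs, a hypothetical experiment name); hashing keeps the assignment stable across reboots without storing any extra state on the device.

```python
import hashlib

def assign_variant(device_id: str, experiment: str = "fan-control-v2") -> str:
    """Deterministically place a device in group A or B for an experiment."""
    digest = hashlib.sha256(f"{experiment}:{device_id}".encode()).digest()
    return "B" if digest[0] % 2 else "A"

# Each device lands in A or B; the backend then compares a metric
# (e.g. energy use) between the two populations.
print(assign_variant("device-0001"))
```

Because assignment is a pure function of the device ID and the experiment name, roughly half the fleet gets each variant and the split is reproducible when analyzing the collected metrics.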
9. We can’t do DevOps because we need safety certification
Change is difficult and painful, and many look for excuses as to why they're special and can stick with their current ways of working. Especially where safety certification or other forms of regulatory approval are required, many hide behind that requirement to avoid adopting more modern engineering practices. Yet there are several examples of companies that deploy software frequently even to systems with high-reliability requirements, and even regulators are starting to push for fast updates in response to, for instance, security risks.
10. It’s only about the customer and us
There's still a widely held belief that if we focus all our energy on the customer and build what the customer tells us to build, we're going to continue to be successful. In a digital world where change is faster than ever, this view is no longer sufficient. We need to take a more holistic perspective: understand that we operate in an ecosystem, that disruption comes out of left field and that the solutions we provide our customers don't operate in a vacuum but in an often surprisingly complex context.
The software industry is fascinating because it evolves so quickly. This requires us, as professionals, to reinvent ourselves continuously, question our beliefs and accept that what was true at some point in the past perhaps is no longer true or no longer the optimal way. Letting go of old beliefs that we've held onto for a long time is hard, but belief systems are like houses you live in for a while: you need to move on to grow and develop. The only ones that don't grow and develop are the dead ones. And who wants that?