Meaningful software: wishful thinking and rationality bias

Vsevolod Vlaskine
Nov 16, 2020 · 5 min read


Biases and software

It’s a stereotype that software development is a punishingly rational process: even a minor typo produces a syntax error, and one slightly wrong conditional statement can crash a spaceship.

And yet software development, like nothing else, is infested with biases of all sorts; or perhaps, because of its prerequisite of exactitude and its ubiquity, human fallibility is simply exposed more starkly in software than in other industries.

The Internet is full of papers and blog posts on biases in software engineering: confirmation bias, availability bias, hindsight bias, and what not (literally dozens if not hundreds of them), as well as recipes for how to control or overcome them.

My point is that there are limits to what can be controlled: some biases, unfortunately, are not simply errare humanum est; they are human fallacies that impede software development and, at the same time, drive it.

Promised land

For example, how do you feel about the crazy promises that marketing keeps making to clients? This bias of scale has been one of the primary sources of incredible frustration for clients and of great stress and burnout for engineers.

The promises in software end up so outrageously overinflated, off by an order of magnitude if not worse, because software development is essentially a linguistic endeavour, which gives it semantic leverage unseen in other production industries, especially material ones.

The problem has been articulated for at least half a century (Fred Brooks published his seminal The Mythical Man-Month in 1975). And yet very little has changed: businesses remain soaked in unrealistic promises by their own choice.

They would rationalise crazy deadlines or fanciful feature lists as “the only way to get a foot in the door”, “being confident in our brilliant engineering team”, and so on, but at the core of the promised land is the desire to get there, which is emotional rather than rational on both the marketing and the client’s side: to be viscerally compelling, the emotional side of the sales pitch has to be biased to tip the client your way.

Rationality bias

Calling something like the bias of scale a cognitive bias labels it as a cognitive error, a fallacy that needs to be rectified, just as the compiler is unforgiving of syntax errors or the test suite brutal on logical errors.

This is what I would call rationality bias:

When the metaphor of correctness is applied to promises, it leaves out their emotional component. The problem is that, as brain research has shown, the initiative in decision-making is necessarily driven by emotions. People with impaired emotions, e.g. due to brain lesions, lose the ability to make even the most basic decisions.

At a higher level, emotions are connected to values. It is value judgements that inform our decisions, at least in hindsight, rather than just a utility function, the latter being yet another illusion.

To be sure: those marketing promises are still insane and hurt the professional integrity of the engineers. However, pretending that promise-making could be rectified into something completely logical puts the force of desire behind the promises into a blind spot, and that force is indispensable for decision-making.

Wishful thinking; red tape and gaffer tape

We need to deal with the cognitive aspect of the biases in the software industry; however, the word ‘cognitive’ blocks the view of the emotional engine behind the bias in the first place. The synonym ‘wishful thinking’ lets us focus on the emotional force behind the bias, the word ‘bias’ itself meaning a force that makes us lean towards a certain course.

We want more, and hence the bias of scale. We like thinking, and hence the speculation bias. Software engineers love coding, and hence the implementation bias. You don’t want to stop any of those forces; they need to be balanced, not simply ignored, when the forces are resolved.

An extreme example of addressing just the cognitive side of the implementation bias is red-tape, documentation-driven companies, which are still relatively common, e.g. in highly regulated industries such as medical equipment or aerospace. They end up organisationally lobotomised, with the engineers’ initiative locked up, and bear large overheads of missed opportunities and depressed morale.

At the other end of the spectrum lie the gaffer-tape businesses, where projects survive on their raw vitality. Just like red-tape teams, they carry massive overheads, in their case of delayed and failed projects, expensive waves of mobilisation, throw-away effort, popularity fads instead of commitment to building capability, and so on; however, as long as enough cash and enthusiasm (or pressure) is flushed through, they may act as thriving, dynamic systems.

None of it is wrong; it just has expensive price tags, which some businesses are alright with paying, at least for a period of time. Being in denial about it and keeping those costs hidden does not feel right, though.

Implementation bias: a case study

The solution to the wishful-thinking biases differs from company to company and from project to project. The goal is to avoid their cognitive pitfalls while preserving their healthy emotional core, which gives vitality to the whole enterprise.

For example, there is a cognitive aspect to the implementation bias: too much is done too early, with too many assumptions, too much coupling, and too little reuse kept in mind. No one knows exactly what the existing software does, it is hard to change or add features, and there is no cross-project synergy; the cost of it is monumental and, most importantly, hidden:

If debugging and reverse engineering, instead of testing and knowledge packaging, takes an hour instead of ten minutes every time, it means that the project will take 5–6 times longer or require a 5–6 times larger workforce.
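
A back-of-the-envelope sketch of that multiplier, in Python and with purely illustrative numbers (the round-trip count and the minute figures are assumptions, not measurements):

    # assumed: an engineer goes through many "understand the existing code" round trips per feature
    round_trips_per_feature = 20      # hypothetical count
    healthy_minutes_per_trip = 10     # with testing and knowledge packaging in place
    degraded_minutes_per_trip = 60    # with debugging and reverse engineering instead

    healthy_cost = round_trips_per_feature * healthy_minutes_per_trip      # 200 minutes
    degraded_cost = round_trips_per_feature * degraded_minutes_per_trip    # 1200 minutes

    print(degraded_cost / healthy_cost)   # 6.0, roughly the 5-6x slowdown quoted above

The multiplier scales linearly with the length of each round trip, which is exactly why the cost stays hidden: no single hour of reverse engineering looks alarming on its own.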

The implementation bias is endemic and ubiquitous in software. There is no single solution or organisational pattern to balance it. Rather, there are scores of best (or better) practices (business analysis and design practices, methods of forming and scaling teams and projects, etc.), which need to be applied to the right extent.

The choice of the right mix and the right extent is driven by the forces. There are economically rationalised forces like time to market, client satisfaction, cash/value flow, etc. (themselves prone to all kinds of biases: modelling bias, lean bias, tangibility bias, etc.).

There is also a non-rational force, the wishful thinking at the core of the bias itself: programmers love coding. That is what causes the trouble in the first place, and yet a good solution should nurture this drive: how do we let the engineers start coding as soon as possible, so as to preserve the team’s motivation and healthy intellectual dynamics?

If you are constantly asking whether the team is doing a productive, satisfying job without overreach, you are likely to prioritise early, non-throwaway prototyping over endless business analysis meetings, capability development over overfitting to the client’s requirements, and representing the development team to the client over using the team as an expendable resource.
