April 17, 2021

Software as a cost center

I think it’s fair to say that most devs will cringe at their department being called “IT”. It evokes images of old-fashioned, big, faceless corporations, as parodied in the show “The IT Crowd”. Generally, when someone uses the term “IT”, it betrays the paradigm through which they view developers: software as a cost center.

Developers in this model are a resource whose costs must be carefully managed; they are there solely to keep the machines running, not decision makers or a core part of the business.

Those companies will generally not have a developer-friendly culture and will consistently produce bad software interfaces. I’m sure you’re familiar with some of them: have you ever had a terrible experience paying your water or electricity bills? They will try to outsource the costs of producing software to consultancies or cheap vendors. The results are inevitably buggy, unmaintainable Frankenstein interfaces, obviously stitched together in a rush to meet a deadline.

Unless those companies change their core operating values, they will only ever produce unusable software (sometimes dangerously so). I cannot think of a single company which has successfully transitioned away from this model. What usually happens instead is that startups build interfaces which are somehow infinitely better than the competition’s, despite having a fraction of the funds, and then gradually overtake the incumbents. Think Monzo vs HSBC, or Bulb vs EDF…

You really cannot create a nice, mass-market product in the modern world without valued, in-house development talent. The problem is that very few people are actually good at building software. I think that the right team, given enough ownership of the product and the freedom to rework their solutions regularly, can eventually grow to be extremely competent. I’ve also been part of teams which (through sheer luck) ended up with the right tech lead and managed to improve dramatically.

The largest problem facing any software project is the notion of sunk costs. Very often a codebase is allowed to rot beyond the point at which it can be saved, yet it will never be refactored because management thinks it would take too much time. Even when a rewrite does happen, the aim is often to replicate exactly the features of the previous (unusable) codebase, and no real thought is given to why those features did not work the first time around. As new developers come and add their part to the house of cards, the result is very often a spectacular collapse.

If a codebase is in the terminal stage of usability (the developers working on it will know), all temporary fixes and “hacks” to keep it running are a complete waste of time. A full rework is the only true solution, paired with a review of the requirements and practices which presumably led to the first broken iteration.

I’ve been part of projects where that was the case, and the codebase was successfully rewritten from scratch with dramatically positive results. I found that the only way this can work is to have a team with competent senior devs, where everybody follows each other’s high coding standards and patterns. Trust is the necessary foundation: everyone on the team must trust each other to deliver and to design the system correctly, even if it takes a few iterations. A bad apple in a development team is an extreme liability; they really will spoil the whole bunch by mentally exhausting everyone else.

Software development is hard, but it does not have to be painful if the right environment exists for doing it well. The companies which understand that will always be the most successful.