Around the web
Ward Cunningham defines technical debt. You keep using that word. The people who work with him do not think it means what you think it means.
Technical debt is a particularly insidious form of “designed as practised” schism, because it is used as a threat to get work that engineers want to do for their own benefit to pre-empt the early and continuous delivery of valuable software. This is already a highly weird situation: “we chose to cut a corner before, so let us spend extra time now or there’ll be trouble”. Dressing it up in highfalutin-sounding terminology to make it sound like professionalism just adds to the weirdness.
We don’t have the original discussions Ward had with his colleagues in which he used the debt metaphor, but we do have this summary in the experience report. As an aside: my great thanks to the executives and librarians at the Association for Computing Machinery for opening access to their 20th-century library of publications. What we see is that through object-oriented polymorphism, the WyCash implementors were able to incrementally update parts of their design without the change impacting the whole running system—thereby making use of new knowledge about the product and problem domain immediately. How immediately? This immediately:
Shipping first time code is like going into debt. A little debt speeds development so long as it is paid back promptly with a rewrite. Objects make the cost of this transaction tolerable. The danger occurs when the debt is not repaid. Every minute spent on not-quite-right code counts as interest on that debt.
So you do not cut a corner today and then threaten your product owner with ruin unless they give you time off for hobbyist programming later. You do the best you can now, and you do the best you can minutes later. If you learn something new about the software you’re writing, you use that information immediately, on the task you’re doing now—and you look for the earliest opportunity to share that information and consolidate your implementation on the updated design.
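To make the WyCash observation concrete, here is a minimal sketch (in Python, with invented names; this is not WyCash’s actual Smalltalk code) of how polymorphism makes that kind of prompt rewrite cheap: client code depends only on a shared interface, so a first-pass implementation can be replaced minutes later without touching its callers.

```python
class Price:
    """First-time code: floats are quick to ship, but wrong for money."""
    def __init__(self, amount):
        self.amount = float(amount)

    def plus(self, other):
        return Price(self.amount + other.amount)


class CurrencyPrice:
    """The prompt rewrite: same interface, integer cents underneath."""
    def __init__(self, amount):
        self.cents = round(float(amount) * 100)

    @property
    def amount(self):
        return self.cents / 100

    def plus(self, other):
        return CurrencyPrice(self.amount + other.amount)


def portfolio_total(prices):
    """Client code: works with any object that offers plus() and amount,
    so it never needs to change when the price class is rewritten."""
    total = prices[0]
    for p in prices[1:]:
        total = total.plus(p)
    return total.amount
```

`portfolio_total` runs unmodified against either class, which is the point: the debt in `Price` is repaid by swapping in `CurrencyPrice`, not by halting the rest of the system.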
Now there’s another risk: that my interpretation of this experience report is yet another alternative reading and another confusing meaning for an already muddled phrase. Perhaps “technical debt” has become too confused to have any value as a jargon term, and should be retired.
This shows two things: first, that it isn’t just software engineering that has this gap between design and practice; second, that there can be benefits to moving away from the way an idea was designed. Research publication today is a massive for-profit industry (the “publication” part; the “research” part, including authorship, review, conference presentation and so on, is done for free at best, and at the researcher’s expense at worst), and one that in many ways holds back the creation and dissemination of research, and the careers of researchers. And not evenly: in fields like computer science, where conference presentation is the most prestigious way to disseminate research, you get on better if you are able to travel on an arbitrary schedule (i.e. you have no caring responsibilities).
Various attempts to chip away at the edges are being tried: principles for crediting research software developers, or citation files for archived digital artefacts, make it easier to acknowledge diverse (digital) contributions to research, but do not fundamentally change the paradigm. It’s clear that something new is needed; it’s clear what many small pieces of the new big picture will look like; it’s clear that the whole will be radically different. It’s also evident that it will still be called “papers” and “peer review”, whatever we end up with.
A modern, low-stakes example of design being different from practice: REST is transforming in meaning from “hypermediated, discoverable interface” to “fuck it, overload HTTP”. Here, we look at how and why.
Back in 2017ish I interviewed for a lead software engineer role at a local company that was one of those “enterprise agile” places. It was clear that the principal architect had a bunch of check marks he wanted candidates to hit when he asked about various software engineering terms.
Once I’d cottoned on to that, I started giving two answers to each of his questions. “This is the answer you want that includes the keywords you need to tick off, and this is the answer that matches what the person who created this idea told me when I asked”.
REST was one example: to get this job I need to say abuse of HTTP verbs, collection APIs with sub-paths for identifiers, JSON, and Swagger. But to answer your question I need to tell you that REST is a design principle for a browser-mediated hypertext system, one of many alternatives with different constraints, and has nothing to do with your RPC server.
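That distinction can be sketched with two hypothetical response payloads (the URLs and field names are invented for illustration, not taken from any real API): the first bakes knowledge of every endpoint into the client; the second is hypermedia, where the client discovers what it can do next from links in the representation itself.

```python
# "Enterprise REST": the client must hard-code every path and verb,
# e.g. knowing out-of-band that POST /orders/42/cancel exists.
rpc_style = {"id": 42, "status": "shipped"}

# Hypermedia: the representation advertises the available transitions,
# so the client follows links instead of constructing URLs.
hypermedia_style = {
    "status": "shipped",
    "_links": {
        "self":   {"href": "/orders/42"},
        "cancel": {"href": "/orders/42/cancel"},
    },
}


def available_actions(doc):
    """What a hypermedia-aware client can do next, discovered at runtime."""
    return set(doc.get("_links", {})) - {"self"}
```

With the RPC-style payload, `available_actions` returns an empty set and the client can only act on knowledge compiled into it; with the hypermedia payload, it learns about `cancel` from the message itself.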
BDD was another. TDD was another. Devops was another, as was Scrum. There’s probably a book in “the real meaning of words software engineers use”; there’s certainly an interesting social science research project into why we do this.
DevOps is another of those terms that means a different thing in use than it did when it was introduced. It used to be the integration of development and operations as a common culture; or the restatement of the Agile principle of self-organising teams for people who weren’t listening the first time around. Now it means hiring a third role to…do whatever it is people think DevOps means. It might be to build self-provisioning UIs for cloud infrastructure, or it might be to have root on the Jenkins server.
I picked this article to illustrate the “designed as practised” nature of TDD because it links to so many of the others. I think the JAOO “TDD will deteriorate your design” flapdoodle was the beginning of the slowly-rumbling TDD backlash that culminated in the 2014 “Is TDD Dead?” video series linked back in Issue 1 of this newsletter.
Others have attacked TDD for other reasons: the entirely reasonable observation, worth serious consideration, that there’s no proof of material benefit from TDD; and the early (and now largely abandoned) type-supremacist canard that if your code compiles under a strict type system then it works, and you don’t need to write any tests. The problem with that latter argument is that your compiler doesn’t know what your customer wants.
Amazingly on the same day I posted Phrases in Computing that Might Need Retiring, Ron Jeffries posted Retiring Some Terms. The one he mentioned that I haven’t yet is Refactoring, so let’s look at that.
I’ve had a career-long difficulty with the word Refactoring. Martin Fowler’s book came out in 1999 and I read it a few years after that. Very early in my career I was leading a team (that’s a whole different discussion: suffice it to say that “Senior” is a word that should’ve been covered in this issue). A colleague used “refactor” to mean “edit”: I refactored the code to add this feature. I refactored to fix this bug.
To me, this seemed like a dilution of the word as Fowler uses it, both as a noun:

a change made to the internal structure of software to make it easier to understand and cheaper to modify without changing its observable behavior.

and as a verb:

to restructure software by applying a series of refactorings without changing its observable behavior.
It is only this narrower meaning of “refactor” that makes sense within the “Red-Green-Refactor” of test-driven development: if we take the outcome of the tests as the observable behaviour of the software, then we require the tests to have the same outcome before and after the change.
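As a toy illustration (an invented example, not one from Fowler’s catalogue), here are before and after versions of the same function, with a test that pins down the observable behaviour. Because the test passes unchanged against both versions, the change qualifies as a refactoring in the narrow sense.

```python
# Before: one dense expression.
def total_with_tax(prices):
    return sum(prices) * 1.2

# After applying "extract variable" and replacing the magic number:
# the structure is clearer, the observable behaviour identical.
def total_with_tax_refactored(prices):
    TAX_RATE = 0.2
    subtotal = sum(prices)
    return subtotal * (1 + TAX_RATE)


def test_behaviour_unchanged():
    """The outcome of this test is the observable behaviour we preserve."""
    for prices in ([], [10.0], [1.0, 2.0, 3.0]):
        assert abs(total_with_tax(prices)
                   - total_with_tax_refactored(prices)) < 1e-9
```

Run the test before the change and after it: red would mean the edit was something other than a refactoring.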
I actually think that refactoring often has an even narrower meaning in modern times: whatever changes are supported under the “Refactoring” menu in my IDE. I don’t agree that it needs retiring: the phrase is still salvageable as long as you check its meaning whenever it’s used.