Rather more of the internet's stuff than of my stuff this issue, which makes me feel like the issue is rather light. (Of course it isn't: it's more diverse! And in fact, previewing the issue, I see I've done far more editorial this issue than is usual.) But various things have led me to spend less time on making the internet over the last couple of weeks.
One of those things I know I'll be able to share in a few weeks, and the other I hope to be able to share a few months from now. Both continue the mission to help programmers become software engineers, and that's all good, but neither leads to immediate production of Content™. That's got to be OK. It's got to be a good thing to step away from immediate gratification and Wordpress stats, and make some longer-term things that will ultimately have a better payoff. I've talked before about preparing for computing's centenary: none of the things I'm talking about here are on that grand a scale, but hopefully the anniversary is marked with more investment than a change to the banner on r/SoftwareEngineering created the day before.
It doesn't feel OK though. It feels like I'm missing out on being incredibly online, and on sharing things with you. I think that's the unnatural shift in dopamine delivery associated with the notification buzz of being online. I remember when I started working full-time in 2004, I would manually refresh my mail client (which was then, as it is now, Apple Mail) twice a day and my RSS reader (which was then, as it is now, NetNewsWire) each lunchtime.
These days I've turned off most of the beeps and red circles, but I'm still aware that they're there in the background, and that I'm not the cause of any of them. Every application wants to be able to send me notifications. I've fixed the demand-side FOMO by disabling the notifications on my phone: I'll get beeped if one of about six people calls or sends me a message, or when my alarm clock goes off. But that hasn't addressed the supply-side FOMO: there are conversations whizzing around the interwebs and I'm not contributing.
What I missed the most over the last fortnight was not recording a podcast episode (actually it's been over a month, but that means it probably got mentioned here in the previous issue). Some people would not class my podcast as a successful podcast: it does not have many listeners, I don't get a whole lot of feedback per episode, and I don't have a lot of evidence that people who listen to the show cite the podcast when forming their own arguments about software engineering.
But I enjoy making it, because it's the closest thing my current skills at internet use allow to my absolute favourite way to talk about software engineering: conference speaking. When I write a blog post I typically stream-of-consciousness an outline or a rough draft, then paste it into Wordpress, preview it, and edit it into shape. With a podcast I basically make an outline and the show notes—the list of links I want to mention to support the episode—rearrange the outline until it's logical, then hit record. It feels more conversational because, while I'm not thinking through the argument in real time, I do allow myself to present it spontaneously.
I'm very interested in what you think, though. Send me an email and tell me what's good, bad, could be better, or should be stopped about what's going on here. All of this (waves hands) exists to help you become better software engineers, so if I'm not doing that the best way possible, I'd love to improve.
On or Between
Lightly inspired by the discussion around Grand Central Dispatch and threading, I talk about whether concurrency details go on a solution object ("this Employee class is now an actor") or between them ("talk to this Employee class through this actor").
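As a sketch of the distinction in Swift (the Employee name comes from the examples above; everything else here is my illustration, not anything from the episode): the "on" style makes the domain object itself an actor, while the "between" style keeps the domain object plainly synchronous and puts a separate actor between callers and it.

```swift
// "On": the concurrency detail lives on the solution object itself.
actor Employee {
    private var salary: Int
    init(salary: Int) { self.salary = salary }
    func raiseSalary(by amount: Int) { salary += amount }
    func currentSalary() -> Int { salary }
}

// "Between": the domain object stays plain and synchronous;
// a separate actor serialises all access to it.
final class PlainEmployee {
    var salary: Int
    init(salary: Int) { self.salary = salary }
}

actor EmployeeGate {
    private let employee: PlainEmployee
    init(_ employee: PlainEmployee) { self.employee = employee }
    func raiseSalary(by amount: Int) { employee.salary += amount }
    func currentSalary() -> Int { employee.salary }
}
```

In the first style the concurrency policy is welded to the domain type; in the second the domain type is reusable in single-threaded contexts, and only callers who go through the gate pay the actor-hop cost.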
Episode 39 : [objc retain];
Progress using XCake to generate Xcode projects continues apace, and we hit the first point where we need to extend a couple of tools: on the one hand GNUstep's Xcode project library (and build tool) doesn't support unit test targets, and on the other hand XCake itself doesn't support command-line tool targets. As our current goal is to build the command-line runner for XCTest, this is rather limiting.
DosAmigans - Twitch
Yesterday's episode of Dos Amigans was great fun. We chatted, mostly about various game controllers after conversation got onto the A500 Mini and its CD32-like, SNES-like controller. Then we got into a new project, an org-mode highlighter plugin for the AmigaOS 3.2 TextEdit app.
Actually, most of the development phase was typing in example code and making sure we understood it and the documentation around it before adapting it to our needs. We ended the stream at the point where it compiles. That's often how projects start: identifying how the solution you need and the tools you have fit together.
Around the web
What went wrong with the libdispatch. A tale of caution for the future of concurrency.
I don't think it's fair to say that libdispatch went wrong. On the one hand, the promised embarrassingly multicore world that the hardware vendors said was coming still hasn't materialised: they were talking about hundreds of cores, and we still mostly have fewer than ten. And Grand Central Dispatch uses an uncomfortably large number of threads.
On the other hand, it primed that whole world of application developers to stop blocking the UI when their software was working, and that greatly improved the experience of using the platform. NeXT had had Mach threads, but the AppKit was resolutely single-threaded and many developers did not use the low-level threading behaviour. Whenever a NeXT machine did anything, particularly anything I/O-related, you got the spinning propeller (the forerunner of the beachball of death; on NeXT it was basically a black-and-white BMW logo that told you to wait).
That remained the case through OpenStep and Mac OS X. There were Foundation-level interfaces for threads and locks, but they didn't provide any abstraction and were no easier to use than the primitives, so they rarely got used. GCD, which worked alongside Objective-C's blocks and garbage collection, made it much easier to say "this work gets done without blocking this other thing". That was a huge win.
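A minimal sketch of that idiom, using Swift's Dispatch API to stand in for the Objective-C blocks of the time (the function name and queue choice are my illustration, not anything from the article):

```swift
import Dispatch

// The pattern GCD made cheap to express: hand slow work to a background
// queue so the caller — typically the main/UI thread — never blocks.
func fetchGreeting(on queue: DispatchQueue = .global(qos: .userInitiated),
                   completion: @escaping (String) -> Void) {
    queue.async {
        // Stand-in for slow I/O or computation.
        let result = "hello from a background queue"
        completion(result)
    }
}
```

Expressing the same intent with the earlier thread APIs meant creating and managing a thread object yourself and marshalling the result back by hand; dispatch collapsed it to a closure on a queue.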
I imagine that as SwiftUI matures and other UI technology builds up around it, the legacy idea of a "main thread" that you aren't supposed to use except where you must will go away (by which I mean it will be hidden, not that it will be removed). All of the software platforms I've worked with in the last few years now have some form of task-centric concurrency, even Python. As platform developers and application developers alike get familiar with it, the needs change: that is what has happened to GCD.
Story Points: Why is this so hard?
Tim Ottinger, the Agile Otter, explains why story points are so hard to use. He does it by describing the history of the story point (in fairly materialist terms): why they were introduced, how they were used, how other people thought they could be used, and how they got here. This is an important viewpoint: all of software engineering is historically situated.
The values and the principles of agile software development are, for the most part, pretty slow to change and agreed on by quite a few people. Even they have a material place in the history of software: they were uncovered through the 1980s and 1990s, coalesced at the beginning of the 2000s, and then were shared and promoted in the subsequent two decades and beyond. The specific practices, of which the story point, planning game, and yesterday's weather are examples, arose during that time as particular ways to try to realise the desired values and principles within the context of their invention. Sometimes they work well. Sometimes they don't. Sometimes they initially work well but don't generalise to other teams, other times, other contexts.
Don't accept a practice, or a bundle of practices (often and erroneously called a "methodology") as an "agile" way of working. Decide whether you intend to value the values of agile software development in your work (you don't have to), and then select practices that support those values.
Samsung Electronics Cultural Issues Are Causing Disasters In Samsung Foundry, LSI, And Even DRAM Memory!
I keep half an eye on semiconductor industry news, a legacy of my time at ARM, but I’m no expert. It sounds like not only are Apple/TSMC currently winning at mobile silicon, but Samsung are losing quite comprehensively.