Introduction
The more technically focused items in this post are mostly about how we can learn from the past. Two related fallacies in historical analysis are urdummheit (the stupidity of the ancients) and urwissenheit (the wisdom of the ancients). Either society reached some prelapsarian peak and we've all forgotten the knowledge of our forebears and now live as savages, or society marches ever forward toward perfection and whatever is new must, ipso facto, be better than whatever is old.
We see this a lot in object-oriented programming. You can identify two camps: one holds that the Smalltalk folks got it, that everyone since has not, and that software is doomed to the dark ages; the other, that you need no more evidence that OOP is bad than that today's blog posts are about functional programming. In fact, like any approach to thinking, OOP contains both baby and bathwater.
I have (at least on [objc retain];, possibly elsewhere) described NeXT Computer and NeXTStep as one of the best compromises in our industry, from a technical perspective at least. You want to make a 3M (megabytes of RAM, MIPS of processing, megapixel display) workstation like the Xerox Alto, using Ethernet networking, laser printing, and Smalltalk objects. You see a future where everyone's data is detached from any one physical computer: wherever they are, their applications and data are there too.
But you have a startup budget. What do you do? You compromise: you build that vision out of off-the-shelf parts. A UNIX-like OS gives customers a familiar workstation but can't support Smalltalk objects, so put Objective-C on top. Laser printers need hefty CPUs and expensive Adobe licenses, so borrow the CPU and license from the central computer, using the same PostScript technology to print as to drive that megapixel display. A combination of a networked identity service (NetInfo), a networked file system (NFS), and removable storage (magneto-optical disks) gives you the semblance of ubiquitous computing.
Various factors, including attempts to stay out of Apple's turf to avoid a lawsuit over poaching their staff, meant that the NeXT computer didn't really become a low-cost version of the Alto until March 2001. But we learned a lot along the way. Thus in this issue we explore nil messaging and how SwiftUI is a modern WebObjects; in a similar vein, we look at REXX and the duopoly of mobile platforms.
And then the rest of the article is about personal knowledge management. Being a software engineer requires a lot of knowledge, both retained and rapidly discovered. How to cope with that is still an open question, and one I struggle with a lot.
Writing
This nil-swallowing behaviour works really well…
A benefit of having this mailing list is that along with the feed from my blog/podcast I can also point at other writing I've done around the web! This is a comment on the Hacker News thread about the brilliance of "nil" in Objective-C.
Collin's point is "message-swallowing by nil in ObjC is a nice pattern", which gets a bit lost in the short original post within "but there are other things you have to know about how emptiness is represented in ObjC". My comment shows that you can collapse these down to one thing you need to know using message-forwarding: NSNull can behave like nil when you want.
The rest of the discussion thread is mostly people saying that this is horrible, and that they prefer optionals/maybes/results/whatever thing their language has for representing emptiness. The behaviour of nil in ObjC is definitely somewhat rare, and one that as far as I can tell was confected at NeXT. Stepstone's nil didn't do it and Smalltalk's definitely doesn't. I don't think that makes it horrible, just not what many are used to.
The fact that you can't always use nil in parameters to methods, when the message (pardon the pun) otherwise is "nil is an object like any other which has this default null-object behaviour", is indeed a source of confusion. Hidden inside the ObjC runtime is a bit of SPI that could have cleared that up: objc_setNilReceiver. If +[NSNull null] had had the nil-like behaviour, and the nil receiver had been set to +[NSNull null], then "the empty object" would always have been a single valid object with null-object behaviour.
I'm guessing it's too late to change that.
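The message-swallowing idea is easier to see in code than in prose. Here is a rough Python analogy (not the Objective-C runtime, and the method names are made up): one shared "empty object" that swallows any message and returns itself, which is roughly what the comment proposes NSNull could do via message forwarding.

```python
class Null:
    """A null object: any unknown method call is swallowed and returns
    the shared instance, the way messaging nil works in Objective-C."""
    _instance = None

    def __new__(cls):
        # Keep a single shared instance, like +[NSNull null].
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

    def __getattr__(self, name):
        # Every unrecognised "message" resolves to a method that does
        # nothing and returns the singleton, so chained calls never raise.
        def swallow(*args, **kwargs):
            return self
        return swallow

    def __bool__(self):
        # The empty object is still falsy, like nil in a condition.
        return False

null = Null()

# Chained "messages" to the empty object are safe no-ops:
result = null.uppercase_string().substring_from(3)
print(result is null)   # True
print(bool(null))       # False
```

The point of the singleton is the unification the comment argues for: there is exactly one "empty object", it is a valid receiver, and it always has the null-object behaviour.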
Issue #44: Mobile
On the mobile duopoly and cross-platform frameworks. A lot of my app work of late has been cross-platform, and I agree with Adrian’s view: you don’t need a particularly complex app before you’re doing as much work with native APIs as if you’d made two projects.
Also, Adrian reflects on his experience learning from Erica Sadun.
The Requirements Trifecta
Three things must ye have to make good software. They are described within.
Around the web
Tools for a Culture of Writing
As you may have noticed, I write a lot. But writing about my work and writing in my work are different things. A blog like SICPers, or this newsletter, contains the things that are on my mind, presented my way, often devoid of context. Within a project you need more contextual information: not "Graham thinks this is a good way to write software" but "Graham, Jennifer, Abhijit, and Tham met (Richard was on leave) and decided that this was the best way to proceed, for these reasons, given these constraints, and that we'll review that choice at this time".
Matt describes a few of the documentation tools (think templates, not text editors) that support that way of working. A common theme across his recommendations is the (semi-)permanence of things like RFCs and retrospective notes. Anything that gets written in a chat application is ephemeral, so doesn't become part of the collective knowledge of the organisation. And in knowledge work, an organisation is its collective knowledge.
In my current project we follow more or less the model here, although our RFCs start as GitHub issues or even draft issues in the project view: "we know we need to do this, but we don't know how: what do you think of this approach?" People comment, maybe meet, and refine the understanding until there's enough concrete information in the issue to start working.
And that gets me to a missing point on Matt's list: effective meeting minutes. Things that were said in Meeting Room 367 or on the chat log in Zoom are even more fleeting than things typed into Slack. If folks agree that something needs doing, say as a result of a retrospective, then they should probably agree what that is, who will do it, when it will be done by, and how we will know that it is done. And that needs to be written down before it mutates into "oh, I thought I was splining the reticulates, and Doris was reticulating the splines".
I will also mention that presentations can be an effective and compelling tool for sharing some of this information. They can also be dull and supercilious, but when they're good they're much better than written documentation.
Common REXX Pitfalls
Steven and I use ARexx a lot on the Dos Amigans stream to build automated tests of our AmigaOS applications. It's just like a command shell in that respect: you say address MYAPP and then every verb gets sent to the app's port and interpreted by the host application, effectively as its own scripting DSL. AmigaOS has its own ReadArgs() library function that's used both by command-line programs and by Rexx handlers, so there's a strong consistency across the OS that's missing from, say, Bash and Guile on GNU/Linux or PowerShell and VBA on Windows.
But I hate having to remember all the weird edge cases that programming languages have. I tried it once, with Objective-C, and look how useful that information is now. So articles like this are great as a sort of "anti-patterns library" for the language. If I get an error I don't understand, I'll take a look at this post and ask "is my case described by any of these sections?"
Model View Controller for SwiftUI
Helge Heß demonstrates the valuable contribution of the MVC pattern and the consistency between building applications in SwiftUI, in WebObjects, and in Cocoa.
MVC started life in Smalltalk land as Trygve Reenskaug’s Thing-Model-View-Editor, and was never really written up (it was due to be the subject of a book in the Addison-Wesley Smalltalk-80 series, which didn’t get published). It’s impressive that four decades and countless implementation changes later, it’s still a great template for UI design.
I've Used All The Notebooks
I’m in this picture and I don’t like it.
I go back and forth on paper or digital note-taking setups (or both/intermediate: Livescribe pens, Penultimate and related apps, the reMarkable tablet). Pen writing is way better for helping me remember things. But digital note-taking is way better for helping me find things I’ve forgotten later.
I have notebooks going back at least to my undergraduate degree in 2000, and have “imported” some of them to Evernote on and off. But it can’t read my cursive; neither can reMarkable or Apple Notes. Honestly, sometimes neither can I. So I end up knowing that I have my notes but being unable to use the information in them.
This needs fixing: suggestions welcome.
Personal Knowledge Management is Bullshit
A counterargument to my plea for a better knowledge management system. I don’t buy it, for two reasons. One is the unsubstantiated claim that the ability to manage knowledge is genetic (I could believe a genetic component, I could more readily believe an educational component, but either way show me some evidence!).
More important is that the argument refutes organising knowledge as a web of hyperlinks (or a zettelkasten, or any other indexed system) while ignoring the affordance of a search facility. If Google can “organise the world’s information and make it universally useful and accessible”, then a search tool to organise my information and make it personally useful and accessible should be readily tractable.
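To make the "search is tractable" claim concrete, here is a minimal sketch of the core of such a tool: an inverted index over a handful of plain-text notes, with AND-style lookup. The note names and contents are hypothetical stand-ins for a real notes folder.

```python
# Minimal sketch of personal search: build an inverted index over
# plain-text notes, then find the notes containing every query word.
import re
from collections import defaultdict

def build_index(notes):
    """Map each lowercased word to the set of note names containing it."""
    index = defaultdict(set)
    for name, text in notes.items():
        for word in re.findall(r"[a-z0-9']+", text.lower()):
            index[word].add(name)
    return index

def search(index, query):
    """Return the notes that contain every word of the query."""
    words = query.lower().split()
    if not words:
        return set()
    results = set(index.get(words[0], set()))
    for word in words[1:]:
        results &= index.get(word, set())
    return results

# Hypothetical notes standing in for a scanned-and-transcribed notebook:
notes = {
    "2021-03-retro.txt": "Agreed: Doris reticulates the splines next sprint",
    "mvc-history.txt": "Reenskaug called it Thing-Model-View-Editor",
}
index = build_index(notes)
print(search(index, "splines"))   # {'2021-03-retro.txt'}
```

This is obviously not Google, but it shows why the problem scales down: a personal corpus is tiny, so even this naive index answers queries instantly; the hard part (as the notebook lament above suggests) is getting handwriting into text at all.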
How macOS manages M1 CPU cores
This article is a fascinating look at how one operating system manages a part of its asymmetric multiprocessor system. An ARM-based Mac has both "performance" and "efficiency" cores (an arrangement Arm calls big.LITTLE) and the operating system needs to balance the demands of the running applications, daemons, and services with users' expectations of responsiveness, performance, and energy efficiency.
The documentation for Apple's XNU kernel includes a design rationale for the thread scheduler explaining why the traditional Mach approach was inappropriate and how the modern system improves on it. The Eclectic Light Co. article explains what that looks like in use.