A recent piece reminded me of the positivity that we had in the early 2000s.
I still retain an optimism, but it’s a little scary to see how much the country has changed in the last 15 years. In the 2000s, after Labour’s landslide in 1997, the atmosphere was far more positive than it is now. And when we look at things like the quality of schooling and, particularly, the ratings of the NHS, we can see that the positivity felt justified.
Nesrine Malik recalls arriving in the UK in the mid-2000s and finding a welcoming country that allowed her to build a life from very little.
Alan Jacobs takes the time to look at one writer who thinks about how good it would be to write without writing, and a second who looks forward to a time when they can write about something without researching it. How do they imagine doing it? Using AI that mimics their writing or research style.
Alan captures why this is likely to result in utterly flawed pieces:
As I was walking this morning I suddenly understood the most fundamental thing that’s wrong with the way Smith and Thompson think about these matters: Smith assumes that at the outset of a writing project he already knows what he wants to say and just has to get it said; Thompson assumes at the outset of a writing project that he understands what he needs to know and just has to find a way to know it. But for me writing isn’t anything like that.
I find this too: in the process of writing, I often end up somewhere different from where I expected to arrive when I started out.
A GPT language model can only use about 4,000 words of context when generating its next words. The resources required for generation in a GPT increase significantly as this “context window” expands. Unlike a normal Google search which takes milliseconds to run, generating responses in ChatGPT, for example, takes on the order of seconds. That’s expensive.
I’d not thought through the implications of this limited number of tokens a GPT large language model can use when generating its output. As Allen Pike explains in A 175-Billion-Parameter Goldfish, this has deep effects beyond the dollar cost.
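To make the goldfish effect concrete, here’s a minimal Python sketch of my own (not code from Allen’s post) showing how a fixed context budget pushes the oldest turns of a conversation out of the prompt. Counting whitespace-separated words and the 4,000 figure are simplifications; real models count tokens, and the limit varies by model.

```python
# Illustrative only: why a fixed context window makes a chat model "forget".
# Word counts stand in for token counts; 4,000 mirrors the figure quoted above.

CONTEXT_LIMIT = 4_000  # rough budget in "words"; real models budget in tokens


def build_prompt(history: list[str], new_message: str, limit: int = CONTEXT_LIMIT) -> str:
    """Keep the new message plus only the most recent history that fits the budget."""
    kept: list[str] = [new_message]
    used = len(new_message.split())

    # Walk backwards through the conversation, newest first.
    for turn in reversed(history):
        cost = len(turn.split())
        if used + cost > limit:
            break  # everything older than this point is simply dropped
        kept.append(turn)
        used += cost

    # Restore chronological order for the final prompt.
    return "\n".join(reversed(kept))


# Once the conversation grows past the budget, the earliest turns no longer
# appear in the prompt at all, so the model has nothing to "remember" them by.
```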
I couldn’t help falling for the higgledy-piggledy, nature-encrusted look of these moss-covered wooden tiles on a building in Leigh Woods, on the outskirts of Bristol.
It’s no secret that I spend time customising my workspace. More recently, that has involved creating my own VS Code theme. Further in the past, it was a bit more hardcore, involving writing code that replaced core parts of the Windows experience. I’ll write some more on that one day soon, I hope: Windows shell replacements saw great creativity in UX for the few years they were viable (Windows 95 through XP), but sailed under most people’s radar.
For today, we’ll look at a smaller part of that fascination: fonts. Specifically, fonts for coding. I was set to this by Tim Bray’s Monospace and More Mono, and I can’t resist an excursion into ten or so monospaced fonts.