If you want to make a point or are trying to convince someone of something, avoid the fallacy of arguing from authority like the plague. And, although a corollary, never ever appeal to the authority vested in you as a person through your position in an organization. Whether or not your position actually carries any form of coercive power (rather few do), once you resort to that, you will immediately lose the argument, just like you lose the trust and esteem of your peers that might have taken you years to build.
It's been close to a decade since I last sat an exam; that is, until two days ago, when I took the AWS Cloud Practitioner certification.
On a very personal level, it seems my test anxiety from university days has only been dormant due to the lack of formal test situations in my life. Even though the stakes were orders of magnitude lower, I found that just the thought of being in a test situation again made me remarkably nervous. Probably deeply learned behavior...
The official exam guide lists 115 services as explicitly in scope, so for a knowledge-based exam there is quite a bit of ground to cover (and trivia to memorize). The questions I got in the end turned out to be easier than those in the sample tests I used for preparation. Anyway, although I'm glad that I did it, I'm even more glad to re-enroll in the school of life now ;).
I've recently switched to a new company laptop. This also means I now have to use Windows 11. And to my horror some imbecile in Redmond deliberately decided to take away the option to dock the taskbar to a side of the screen other than the bottom. For over a decade my taskbar was docked to the left side. I usually don't wish ill on people, but I hope the idiot who is responsible for this user-experience crime got the axe in one of the recent layoff rounds at MSFT.
Some time ago Aaron Parecki's post brought my attention to a little project called the Internet Phone Book. Today I got a mail that it went to print, and I just ordered a copy. I would be a liar if I denied that the fact that you can dial me up there (464) played a little part in the purchase, but even otherwise, how could I resist such a love letter to the web in printed form, bringing together most beautifully the two forms of media closest to my heart? Looking forward to "dialing" many new sites and exploring the vastness of the poetic web.
A little bit of friction in the publishing process does not only have downsides. I've just scrapped a few sentences that I wrote down in anger in a local file. If I only needed to type into a box and hit enter, it would be too easy to rage-post. Whether I have actually learned not to post in anger, well, I'll see the next time an executive has strong opinions on the role of generative AI in the software development process...
In some circles of programmers, the field of software architecture has a bad reputation. To some extent maybe deservedly so, likely caused by overgeneralizing from encounters with a certain type of software architect at work. In my experience and opinion, the architectural qualities of a system cannot be influenced effectively by command and control, by idealized yet ultimately meaningless diagrams, or by sending out lengthy documents that pontificate about how things ought to look from 10,000 meters above the ground. I believe in the superiority of practice. There is no substitute for spending a significant part of the work week in the weeds, on systems which are actually in use by other people. Fewer, but better-informed opinions trump all the write-only documents in the world.
A third of the year has gone by already; that pace just feels unreal. Good thing I didn't make any resolutions, but how about the themes for this year that I mused about at the beginning of the year? I have not been following them in any systematic way. Fundamentals had to make way for the ephemeral topic into which I had to put time out of necessity (not too fond of that...). I try to counterbalance it by what I choose to study in the remaining time, which never quite feels like enough, certainly not enough to revisit what I do from first principles. I also didn't rebuild anything from scratch, but in some way that might actually be a good sign, indicating that nothing was broken beyond repair.
Lastly, while I didn't have much dedicated time for contemplation, when I think about the decade to come, I conclude that going forward I need to adjust my themes: take much more active care of health and fitness, systematically get rid of unnecessary stuff, find one big topic to focus on deeply, and try not to spread myself too thin.
Tim Severien is mapping out the participants of the blog question challenge that was (or maybe still is?) making the rounds earlier this year. The resulting visualization, which Tim posted on Mastodon, already looks like an interesting opportunity to discover new blogs; in some way it is an implicitly existing web directory made explicit. It also made me curious to trace back how I came to participate, and so I did: starting from Lars-Christian who tagged me, via Meadow, via Brandon, via Adam Newbold, via Jason Burk, via Gabz, via Kyle Reddoch, via Jason, via Chris O'Donnell, and finally via Brian I reconnect to Tim's graph. What I found a bit surprising is that the microcosm of personal websites apparently still manages to defy the six degrees of separation proposed by the small-world experiment.
In systems theory there is the principle of suboptimization, which states that optimizing each subsystem independently will not in general lead to a system optimum, or more strongly, improvement of a particular subsystem may actually worsen the overall system.
In software development a certain way of dividing labor in an organization leads to a similar problem. When there are specialists for cross-cutting concerns (sometimes also known as quality attributes) who have no end-to-end responsibility for a particular product, yet a mandate to coerce those who do into working on their particular topic, it can bring a whole project down. The reason is that the folks who have both the skills and the interest to steer a system end-to-end will become trapped in a permanent defensive position, dragged down by endless bikeshedding over details of varying significance. And over time it is nearly certain that they will run out of fucks to give. When that point is reached, their choice is to either resign inwardly or leave. Software-producing organizations ought to be aware of that.
As much as I sing the praises of choosing boring technology, occasionally even the most boring technologies have their interesting moments. And by that I mean interesting as in the old curse: may you live in interesting times...
For something as mundane as the Java/Jakarta EE equivalent of JSON.parse (or simply writing an object literal in JavaScript), there is a design decision that results in a completely preventable performance nightmare: when you use Json.createObjectBuilder() or Json.createArrayBuilder(), each call will scan the classpath to look up the implementation of the JSONP interface. This is required for compliance with the Jakarta EE spec. In theory it enables swapping the implementation of JSONP on the application server at runtime, but in practice you never do that, and more importantly: the naive usage of the API comes at a huge performance cost. In the still unresolved issue that was brought up six years ago, somebody reported a factor of 7200 compared to the relatively simple solution: assign the JSONP provider to a static final field (maybe in a utility class) and always use that instance to call createObjectBuilder().
public static final JsonProvider JSONP = JsonProvider.provider();
I'm in the process of refactoring two command-line tools, and I'm wondering if it is somehow possible to quantify when you enter the territory of diminishing returns. Both tools essentially work well; the refactoring is for the sake of the maintainer.
In one case (at work) that will be someone who is not me. This codebase follows a rather old-school ES5 style and contains a few idioms that a whole generation of programmers has not been exposed to, so the first order of business is to bring it into a more palatable form. After that, the question is: improve the typing, improve the test suite, write some more documentation, or call it a day and schedule the handover? Time is money, and I cannot spend too much on something that basically already does the job, so what has the most utility per hour invested? No conclusion yet.
In the other case the maintainer is, and will remain, me, because the other tool is my SSG, which I only touch every once in a while. So now the choice to write plain JS, made to avoid dealing too much with toolchain setup (albeit using every nicety that was baseline in 2023), is catching up with me when I occasionally want to add or modify a plugin. As Node 23 can execute TypeScript (or at least a huge subset of it) directly, I still get away without tooling and can add types to help take load off my memory.
Being easy on the maintainer's memory and mental load seems to be the overarching theme here, and the trait that I might want to optimize for in future projects much earlier than I used to.
I expect some of the software I work on to be in operation for decades, so one question I ponder regularly is how to make its design viable for this assumed lifespan. Experience teaches that some of the third-party dependencies will certainly see breaking changes and even reach their end of life. Full rewrites are rarely feasible, let alone economically viable, so many applications languish for a long time as long as they still generate revenue (a branch of my employer just recently, without any trace of irony, advertised a position for a VB 6 developer). So how not to get trapped by that?
In the implementation context I state the problem as how to factor code in a way that does not depend on any framework or external library. Beyond my day-to-day issues, phrased in a more principled and generalized way, I think the UI layer needs, but currently lacks, a conceptual equivalent to what an object-relational mapper is for the application layer: a convenient level of abstraction that would make it possible to swap out the underlying implementation with relative ease, just that instead of the RDBMS it would be the UI toolkit/component library that is changed in a (mostly) transparent way. I think a great step in that direction would be to demonstrate feasibility for one major framework and the top ten of its component libraries, but the endgame of such an abstraction layer could also include the framework level.
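To make the idea a bit more concrete, here is a minimal sketch (all names are hypothetical, not from any real library): the application talks only to a narrow factory, and each toolkit lives behind its own adapter, so swapping the toolkit means swapping the adapter.

```javascript
// Hypothetical sketch of a UI abstraction seam: application code depends
// only on this narrow factory, never on a concrete toolkit.
function makeUi(adapter) {
  return {
    button: (label, onClick) => adapter.button(label, onClick),
    textField: (value) => adapter.textField(value),
  };
}

// One adapter per toolkit; here a trivial string-rendering stand-in,
// where a real adapter would delegate to React, Vue, raw DOM, etc.
const htmlAdapter = {
  button: (label, _onClick) => `<button type="button">${label}</button>`,
  textField: (value) => `<input type="text" value="${value}">`,
};

const ui = makeUi(htmlAdapter);
console.log(ui.button("Save", () => {}));
```

Of course, the hard part that this sketch glosses over is exactly the one an ORM struggles with too: the abstraction leaks as soon as you need toolkit-specific behavior.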
An alternative and simpler solution would be a self-constrained approach that deliberately limits itself to platform primitives. Although the proposition looks more plausible and viable from year to year as the web platform improves (for example, customizable selects just recently landed in Chrome), it would be orders of magnitude harder to advocate for in a corporate environment. It would likely take a bit more time and skill to satisfy the whims of your run-of-the-mill Figma jockeys and the exorbitant expectations they tend to raise in other non-technical stakeholders.
Lars-Christian writes in Cool apps are interoperable apps that there are options available for those who are willing to search for them and, when necessary, also self-host. I cannot help noticing that this type of interoperability is based on the most primitive possible integration style (as defined in the seminal Enterprise Integration Patterns by Hohpe/Woolf): File Transfer.
Tangentially related (see also an earlier note): end-user runtime composition and scripting approaches are largely untackled, doubly so on the locked-down consumer devices of today. Via Samir Talwar I found Jeanine Adkisson's work on Pipe-Based Programming, which is proposed as one approach to address what they call the desktop-scripting problem: how should unrelated programs written in different languages be integrated, especially in an ad-hoc manner in a desktop environment? But I've yet to find a convincing implementation for a desktop environment I'd care to use. For web applications I wonder how far one could go in abusing an end-to-end testing framework like Playwright for some macro-level programming spanning multiple unrelated sites. Certainly still pretty far away from an end user, even a power user.
Carl Svensson has a point when he states that computers were more fun when they weren't for everyone. Sadly, the old Unix adage is true: what is designed to stop its users from doing stupid things also stops them from doing clever things.
Rules of thumb for communication in a distributed/remote work setup: if a ticket system is in place, that is the first and foremost destination to pull status updates from. And absolutely no hello. That cannot be so hard.
I am starting to have visceral reactions when I see generated images, and it is quite hard to avoid having that kind of slop shoved into every document. I hope that before long it will become socially unacceptable. If not for ethical reasons, then for the sheer ugliness of it.
Nassim Taleb makes a case against high-frequency news/media consumption in general.
People tend to infer that because some inventions have revolutionized our lives that inventions are good to endorse and we should favor the new over the old. I hold the opposite view. The opportunity cost of missing a "new new thing" like the airplane or the automobile is miniscule compared to the toxicity of all the garbage one has to go through to get these jewels. [..] [R]espect for the time-honored [..] implies a minimal exposure to the media as a guiding principle for someone involved in decision making under uncertainty.
I ought to remember it the next time I'm tempted to visit the orange website.
Barry O'Reilly has a fantastic piece of professional advice for budding software architects: It's meant to be chaos - enjoy it! The whole interview in the "Dear Architects" newsletter is worth a read.
In git, command-line options already match if the given prefix is unambiguous. So instead of passing --amend you can write:
git commit --amen
If you've amended a commit, it of course gets a new hash. So if it was already pushed to a remote, you'd better pray that nobody else has committed and pushed in the meantime, because you now need to push the amended commit with -f. Or, to add some extra silliness, you can create an alias for the push command, like
git config --global alias.punish push
So you can finally:
git punish --force
Arguably, when you can, and need to, force-push to the trunk/main/master branch, someone being punished forcefully might be quite adequate...
Ward Cunningham's C2 WikiWikiWeb turns 30 today. Sadly, it has been in read-only mode for more than ten years now. But I am happy that it is still out there. A glimpse of a web that was, a memory of the web I got excited about, a testimony to what the web could have been, and a hint at the many possibilities of this medium that are still ours to create and explore. Further on up the road!
I am a sceptic when it comes to professional programming supported by LLMs. I think the cognitive limits of the human in the loop will turn out to be the upper bound on output in a professional context. And I think it will turn out to be wasteful to burn the mental cycles of experienced developers on reviewing LLM slop.
But there might be an economic tangent to it that could turn out dangerous for unapologetic fossils (like yours truly): a computing analogue to Gresham's law, if you will. The latter states that "bad money drives out good", and so it is conceivable that sloppy-pasta apps will be produced and deployed at a volume and frequency where the potential errors and exploit scenarios are just priced into the total cost of system ownership. It certainly doesn't help that the general public already has every reason to expect the software that governs its everyday life to be dysfunctional anyway...
Bots are nuisances you have to live with on the open web. I've just banned an IP address that fetched my feed twice every minute. Now, my site is small and unimportant enough that this does not bother me too much, but it is a different story for independent people who are trying to build an honest little business (like a code forge that is not owned by big tech). Such behavior has already depleted the commons and now just recklessly sucks up private resources; read Drew DeVault's well-justified rant Please stop externalizing your costs directly into my face; I'd underline every word. It feels like living through another gilded age, the big tech billionaires being the new robber barons.
For future reference:
sudo iptables -A INPUT -s 4.231.104.62 -j DROP
I occasionally take part in hiring interviews, mostly as a technical screener, but sometimes also for other roles adjacent to my line of work. What I find a bit puzzling is that many candidates do not appear to be curious at all about what they would work on in the role they are interviewing for. The observation notably holds for both engineering and product positions, although in the latter case it confuses me even more.
Just a thought: would it be worthwhile to publish all notes of a given month as a single article to the main feed?
A reader may every once in a while encounter a book which shifts their perspective radically. For me, Barry O'Reilly's Residues: Time, Change, and Uncertainty in Software Architecture is such a book. A very slim volume, dense with ideas and a pretty well-presented case that some of the pillars on which my profession rests are irrevocably flawed. Not sure what to make of it yet, except that I now need to read at least the papers of his that preceded the book as well.
Even though the web might still be one of the best shots we've had yet at creating a dynamic medium, it is stupendously complicated to showcase the dynamic behavior of a process or algorithm. You can show a source listing, but stepping through something, suspending execution, inspecting state, let alone modifying code at runtime, is hard and needs tons of bespoke manual code and tooling around it (which itself might then not even have the property of being modifiable, inspectable, etc.). A Smalltalk- or Lisp-like environment could help with some of the issues, but likely at the cost of ease of distribution and probably broader understandability. More than sixty years have passed since Douglas Engelbart wanted to augment the human intellect with computers, and metaphorically speaking, we are still writing with pencils that have a brick attached to them.
The speed with which international public affairs are unraveling is breathtaking. I imagine I am not the only person who perceives it as draining and exhausting.
This is the intended effect of a deliberate tactic called firehose of falsehood or, alternatively, "flood the zone with shit". Considering that the latter term was coined by an American nazi, it is a surprisingly honest description. No conclusion here yet, just chronicling.
I've just received a mail informing me that my mail-in ballot for the upcoming election of the German parliament is on its way. Coincidentally, also today, the conservative and liberal factions passed a motion in parliament together with the fascist neo-nazi party AfD. Now the damned bastard Merz, who will likely become the next chancellor, is feigning surprise, hypocritically expressing regret (mind you, not about the xenophobic anti-immigration motion his party put forward), and declaring that he only seeks majorities in the middle of the political spectrum. A mere two days after the 80th anniversary of the liberation of the concentration camp Auschwitz. As a German citizen I am deeply ashamed of, and angry beyond words about, the 348 members of the Bundestag who voted "Yes" today.
Just to document it: what the lap doge of the felon-in-chief displayed yesterday was the Hitler salute. Plain and simple. Nobody with at least one eye and about half a brain left could say anything else. There is nothing to be mistaken here. This was no hapless, clumsy motion born of the heat of the moment. It was a fascist unabashedly showing his true colours. A calculated move, certain of his impunity. Sadly, in that last matter, correctly so. Let us not fool ourselves. It is class war, and currently their class is winning.
In the 1968 essay Is progress real? Will and Ariel Durant argue against sentiments that romanticize earlier times in history. They ask a set of rhetorical questions, juxtaposing the vices of the past with the virtues of the present. One of these questions, from a 2025 perspective, doesn't support the point they wanted to make quite as well as it probably did 57 years prior. They asked: has any American President imitated Pericles, who lived with a learned courtesan?
Admittedly, I would not use the word learned to characterize any of the courtesans of the felon-elect. And certainly I don't think it is warranted to insult Aspasia of Miletus by comparing her to Space Karen and the like.
Here's a gem of a quote, which I found in Bret Victor's compilation of Alan Kay's messages to the Squeakland mailing list:
By the way, one of the ways that I characterized the Dynabook years ago, was: "An instrument whose music is ideas"
Just found a mail in my spam folder notifying me that a site on which I once created a profile is shutting down in a few days. Happens to be another failed startup. I won't shed tears over it, because after some evaluation I decided that it wasn't for me. When I wanted to delete my profile again: impossible via the site itself, support de facto nonexistent, mails ignored. Well, I don't want to find joy in a failed business, but in this particular case: good riddance to these bloody GDPR violators...
A few themes for 2025: relearning fundamentals, revisiting first principles, rebuilding what is necessary, and contemplating the next decade.
Just noticed a defect in my static site generator, or a case of a hasty abstraction if you will. The plugin that formats dates used the year of the current week according to the ISO-8601 standard. This was the behaviour I wanted for formatting the index of the weeknotes, but not for a specific calendar date.
So today, Monday, the 30th of December 2024, marks the beginning of the first week of 2025, because the first week of a year is defined as the week that contains that year's first Thursday. So, happy new ISO-8601 year ;).
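The pitfall is easy to reproduce. A minimal helper for the ISO-8601 week-based year (the names are mine for illustration, not the actual code from my SSG) shifts a date to the Thursday of its ISO week and takes that year:

```javascript
// ISO-8601 week-based year: the year of the week's Thursday, which can
// differ from the calendar year at the edges of a year.
function isoWeekYear(date) {
  // Work on a UTC copy so local time zones cannot shift the date.
  const d = new Date(Date.UTC(date.getUTCFullYear(), date.getUTCMonth(), date.getUTCDate()));
  // Index the weekday the ISO way (Monday = 0 ... Sunday = 6) ...
  const isoDayIndex = (d.getUTCDay() + 6) % 7;
  // ... then move to the Thursday of this ISO week.
  d.setUTCDate(d.getUTCDate() + 3 - isoDayIndex);
  return d.getUTCFullYear();
}

console.log(isoWeekYear(new Date(Date.UTC(2024, 11, 30)))); // 2025, not 2024!
console.log(isoWeekYear(new Date(Date.UTC(2024, 11, 15)))); // 2024
```

Using the week-based year where the calendar year is meant is exactly the bug: harmless for eleven months, wrong for a few days around New Year.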
In 1980 about 3,500,000 people suffered from Dracunculiasis, an infectious disease caused by a parasitic worm, against which no vaccine is available. It leaves half of those who get it permanently disabled. Since 2015 the number of infections per year has been in the low two digits; last year it was 14. 2015 is also the year when the man whose non-profit organization spearheaded the eradication campaign said that his last wish was that the guinea worm dies before he does
He was then 91 years old and lived to be 100.
Many more things can be said about him, but if that were the only thing, it would already be enough to call his life well lived. His name was Jimmy Carter. May he rest easy.
Inspired by Wouter's How Bad Is Link Rot At Brain Baking?, I just did a little experiment to estimate the number of broken links on my site. Turns out: luckily not too bad. Of 409 links I consider seven to be broken. Two of them were caused by the death of the bird site, one YouTube video was taken down because of copyright claims, one article was unpublished on an otherwise still maintained site, two sites went completely offline, and one broken link is a fixable copy-and-paste error in the URL (it ends in .html and not .htm). So, overall somewhere between 1 and 2 percent. Important take-away: not all de facto broken links come in the form of 404s.
Looking through some old bookmarks I found this little excerpt from Surely you're joking, Mr. Feynman:
So I got this new attitude. Now that I am burned out and I'll never accomplish anything, I've got this nice position at the university teaching classes which I rather enjoy, and just like I read the Arabian Nights for pleasure, I'm going to play with physics, whenever I want to, without worrying about any importance whatsoever.

Within a week I was in the cafeteria and some guy, fooling around, throws a plate in the air. As the plate went up in the air I saw it wobble, and I noticed the red medallion of Cornell on the plate going around. It was pretty obvious to me that the medallion went around faster than the wobbling.
I had nothing to do, so I start to figure out the motion of the rotating plate. [...] I went on to work out equations of wobbles. [...] It was easy to play with these things. It was like uncorking a bottle: Everything flowed out effortlessly. I almost tried to resist it! There was no importance to what I was doing, but ultimately there was. The diagrams and the whole business that I got the Nobel Prize for came from that piddling around with the wobbling plate.
Quite a consolation. Do what is interesting to you; worry about importance later, if at all.
Nearly two years ago Large Language Models were all the rage. To some extent they still are. Anyway, I wanted to see what the hype was about and did a little experiment with ChatGPT. I tried to use it for a coding exercise. The outcome was a mixed bag, I wrote an article about it, and I have basically ignored LLMs in my professional life ever since. What shall I say: my job has not been automated away just yet.
On the other hand, though, I keep hearing about productivity gains from colleagues, and I also see that the capabilities of the various models enable people who are not professional programmers to build useful things that they would have perceived as out of their reach before.
I remain sceptical, foremost because the unavoidable presence of hallucinations implies that the output cannot, at least in good conscience, be used in any (business-)critical system without thorough critical review. And human review places an upper bound on productivity gains. An empirical case study from 2006 on the effectiveness of lightweight code reviews reports that review rates faster than 500 SLOC per hour, review sessions longer than 90 minutes, and more than 400 SLOC under review all make the process significantly less effective, causing it to miss critical defects.
I think Fred Brooks' famous assertion from his 1986 paper No Silver Bullet still holds true: There is no single development, in either technology or management technique, which by itself promises even one order of magnitude improvement in productivity, in reliability, in simplicity.
But even if it is not an order-of-magnitude improvement, if a consistent percentage improvement can be had, language models might become a tool like IDEs, autoformatters, linters, and debuggers in the long run. As GitHub just today advertised a free tier for their Copilot product, it might be worth repeating the experiment and gathering a few more anecdata...
The dillo browser turned 25 recently and, unlike Netscape Navigator, five years its senior, is still alive and kicking.
The project had been dormant for years and was revived by a few enthusiasts only some months ago. Today it was on the front page of the beloved orange site, which was an impulse to give it a try.
I'm afraid to report that non-programmers cannot expect to find much help in just grabbing a binary and starting to browse; the last Debian package you can install is ages old and does not reflect the latest state. I had to build it from source. OK: cloning the repo, finding the install instructions (because if you download the zip file instead of cloning the repo, the necessary steps are missing), installing another dependency (FLTK 1.3), and seven simple shell commands later, I could successfully browse my site. This is how it is currently rendered:

Interesting to me, as I had not anticipated it quite like that. But the JS-free two-level drop-down-on-hover menu relies on a few newish CSS features, so it is not too surprising that it doesn't look quite right. So let's disable CSS (something that dillo can do pretty easily), and indeed that looks, if not better, at least less broken:

That now leaves me wondering whether there's a graceful-degradation technique for CSS that I could use to keep my styles around while still being able to cater to such niche browsers, if only as a little gesture against the stifling monoculture brought to us by the Chrome/Safari duopoly.
I generally don't mind writing into the void, but I know many others do. Regarding the purpose of the act of contributing on and to the open web, Jay Hoffmann offers a perspective in his article The Free Web. He writes that it exists foremost through contributions of individuals that are free both as in speech and as in beer. He goes on to advise:
Put something on the web. And do it for free. This will require, first and foremost, your time. That is no small ask, time is the most valuable thing we have. But I can tell you one thing that's become readily apparent to me in my decade of research of the web. It is only through people's time that we've gotten to where we are. The web wasn't built by solo tech geniuses, or finance firms, or luminaries with grand promises. It was time, and energy, compounded by millions of people. Tiny little bits of information collectively covering the vastness of human experience. Collective action will be what brings us to the next era of the web. [..] These little actions, these little contributions, are the best way we have to claw back to a truly free web.
What a pity it would be if one of the greatest socio-technical systems that humans have built were wrested out of our collective hands by corporate greed.
Christmas is approaching fast, so I'm writing down my wishlist for self-hosted software (and knowing myself, self-hosting in some of the cases implies writing from scratch, as if the work in progress were not killing me already...). Anyway:
- A feed aggregator. I'm happy with Liferea on the desktop, but it would be really nice to have it available everywhere. Might go open-source shopping there.
- An image uploader, think imgur, just for myself. This one is on the edge of "I could do that in a few evenings". Not sure if it's worth the time though.
- A very lightweight personal (Kanban) board - no fluff, just a bunch of virtual cards moving around. A good candidate to write from scratch.
- A (Git-)forge as the primary mirror of my repositories. The only case of definitely something off the shelf: Gitea, Forgejo, maybe a reason to experiment with Fossil SCM - which ticks many boxes, except that it is not git-based. Although, I'm not convinced that the admin tax will be worth it in the end.
- I'm an old wiki nerd, but by and large still happy with TiddlyWiki. I have written some tooling to be able to use it as a frontend to my SSG, but for the latter it turned out less productive than I had hoped. The digital vineyard isn't growing many grapes. Too much friction. An ergonomic web-based editor for the pages would still be nice.
- I'd like to expand the bookmark app and connect it more with the website overall, maybe as the driver for a linkblog. Also on the list, to fight link rot: auto-archiving to the Wayback Machine plus self-hosted archiving. A clipping/annotation tool (like hypothes.is) would also be useful.
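The auto-archiving part of that wishlist item is simple enough to sketch. The Save Page Now entry point (https://web.archive.org/save/<url>) is the Wayback Machine's public one, but everything else here - the Bookmark shape, the 30-day re-archive policy - is made up for illustration; the actual HTTP submission is left out so the sketch stays side-effect free.

```typescript
// Build the Wayback Machine "Save Page Now" request URL for a bookmark.
function waybackSaveUrl(target: string): string {
  // Validate early so we never submit garbage to the archive.
  const parsed = new URL(target);
  if (parsed.protocol !== "http:" && parsed.protocol !== "https:") {
    throw new Error(`refusing to archive non-HTTP(S) URL: ${target}`);
  }
  return `https://web.archive.org/save/${parsed.href}`;
}

// A hypothetical bookmark record, as the app might store it.
interface Bookmark {
  url: string;
  archivedAt?: string; // ISO timestamp of the last successful archive run
}

// Decide which bookmarks are due for (re-)archiving, e.g. older than 30 days.
function dueForArchiving(bookmarks: Bookmark[], maxAgeDays = 30): Bookmark[] {
  const cutoff = Date.now() - maxAgeDays * 24 * 60 * 60 * 1000;
  return bookmarks.filter(
    (b) => !b.archivedAt || Date.parse(b.archivedAt) < cutoff
  );
}
```

A periodic job could then map `dueForArchiving` over the bookmark store and fire one request per URL, recording `archivedAt` on success.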
I'm not completely done here, but looking at this already long wishlist, I realize what I need more than anything else is focus time.
I wrote about the benefits of classless CSS a while ago, and I still mostly stand by it. I don't use an intermediate format (like Markdown or AsciiDoc), so it keeps my markup relatively free of clutter. For text-heavy sites with a relatively regular structure it can go quite a long way. But over time I came across a few edge cases. And to be honest, I happily compromised on philosophical purity and added a few classes.
The prime example is syntax highlighting. I use prism.js for it, which tokenizes the source code to be highlighted, puts each token into a span and adds the appropriate classes to it. I bundle the CSS from the prism package into the pages where it is needed. Another example is one-off decisions outside any very regular structure; for instance, I recently animated the dice icon in the top bar. Classless CSS relies on the premise that the markup itself clearly characterizes the essence/purpose of each element, which in these cases just doesn't hold true. Also, when selectors start to get overloaded, classless CSS can get messy too.
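The span-wrapping mechanism itself is easy to sketch. This is not Prism's actual API, just a toy illustration of the token-into-classed-span idea, with two made-up token kinds and whitespace-based splitting:

```typescript
// Toy highlighter: split code into chunks, wrap recognized tokens in
// <span> elements carrying classes, and leave the coloring to CSS.
function highlight(code: string): string {
  const keywords = new Set(["const", "function", "return"]);
  return code
    .split(/(\s+)/) // capturing group keeps whitespace runs as chunks
    .map((chunk) => {
      if (keywords.has(chunk)) {
        return `<span class="token keyword">${chunk}</span>`;
      }
      if (/^\d+$/.test(chunk)) {
        return `<span class="token number">${chunk}</span>`;
      }
      return chunk; // plain text passes through untouched
    })
    .join("");
}
```

Real highlighters work from a proper grammar per language rather than whitespace splitting, but the output shape - classed spans styled by a bundled stylesheet - is the same.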
So, maybe a combination of mostly selector-based styles with a pragmatic sprinkling of classes describes my approach better now. It has a drawback though: it doesn't have the same ring to it as classless does. I'm a terrible marketer, so if anybody comes up with a catchy term, I'd be all ears.
A dump of a few links related to tagging and categorization:
- Hillel Wayne on Tag Systems
- James Sinclair's Categorisation Research
- Karl Voit's research on personal information management using associative navigation, as well as his blog posts on How to use Tags, and on why Logical Disjunct Categories Don't Work
The Unix philosophy of composing programs by piping streams of bytes from the standard output of one process to the standard input of the next is well and widely understood. When it comes to having something even vaguely similar for UIs, programs are considerably less "cooperative".
For desktop apps (which turned from being "the norm" into a niche) the points of connection are, in the best case, based on a shared/standardized/application-agnostic file format. A good example, which might even serve as a bridge to the world of the command line, is CSV, which you could think of as the lowest common denominator for spreadsheets; on a higher level of complexity there is ODF for office applications. Composition in this case takes the form of working on the same file with a sequence of different applications. The much more common, much more brittle and much less sophisticated mechanism is the venerable clipboard. Copying from one process to a temporarily globally accessible, shared in-memory location and pasting into an appropriate input is done ad hoc and manually. With a bit of abstraction it also resembles a pipe.
For web applications the picture seems even bleaker: the default browser security policies are all protections against client-side composition. And at this point I also have to note that most of them were retrofitted into the web platform because of the abuse scenarios that their absence enables.
But even if you ran a bunch of applications behind a reverse proxy on machines under your control, so that same-origin policies, mixed-origin restrictions and HTTP headers were but a small matter of programming, integrating/composing web apps would still be non-trivial and require a ton of bespoke development work.
For a few years Yahoo Pipes enabled tech-savvy users to build data mashups from resources and APIs on the open web with relative ease and comparatively little programming. The service was shut down nearly a decade ago and nothing comparable has emerged since. The open web as a source of data has also dried up considerably over time (at least unless you are a company with an excess of capital for funding an army of scraping bots overseen by a staff of engineers).
But let's cast all that aside for a moment; we are also lacking some fundamental ideas on what macro-level composition mechanisms for UIs ought to look like. Applications are generally packaged as black boxes which do not expose a model of their capabilities, their inputs and their outputs in a way that would make scripting (or macro-recording?) possible, let alone convenient.
The more I think about it, the more it strikes me as odd: we already have a model of what the application is capable of at all times: its source code. The explicitness of the interface is, through the build process, literally lost in translation. So, since we are incapable of "talking" about applications after putting them into a form that suits the operating system (and in that sense a browser is an operating system as well), I think Dan Ingalls was onto something when he wrote: An operating system is a collection of things that don't fit into a language. There shouldn't be one.
Software has long since abandoned the philosophy of "big bang" releases, so why do I expect that the same does not apply to the written word? Writing is thinking, and good thinking doesn't just happen overnight, it needs time. Increments can be incomplete and still be of some value. Just because one increment of your thinking is out does not imply that a given topic is concluded. The imperfections of a momentary snapshot are no reason to deprive yourself of the potential benefits of making them public: delivering partial value, inviting more feedback, making progress visible.
For years the thought of getting rid of a book that I had acquired was completely anathema to me. The utmost I could fathom was to give it into the hands of another reader. But unless it was physically damaged beyond hope, I would never have put one into the wastepaper container. I went through one of my shelves today, and what can I say: I assessed half a dozen books as something that I don't want to keep and cannot imagine anybody else wanting to have. I'm asking myself whether that is a sign of aging, or a result of simply being overwhelmed by having too much stuff.
Is there an authoring environment for longform texts that provides both a convenient editing experience and freedom from distractions? Well, distractions come in many forms, but consider my current setup: I write these notes in VS Code as an HTML file and don't bother too much to separate text from markup. The default autocomplete is sensible for actual HTML, but it tries to make a tag out of every other word, so while writing I usually switch the file to plaintext mode. I'd like to use an editor which could filter out all presentational concerns while still writing in a proper markup language under the hood (be that TeX, HTML, AsciiDoc or what have you). The main point: I'd like to be able to pretend that I'm writing a plaintext file, so as not to bother with technical minutiae while writing. Also, it would be great if this were web-based, so that it could be accessed from anywhere. Oh, and some kind of management of material/references would be great. A wiki comes to mind, but the problem with those is that you'd need one per writing project, for otherwise the amassed unrelated material quickly starts to become a distraction. Ah well, all that, or I just initialize a repository, write plaintext, and typeset everything in an additional step once it's ready.
Conducting job interviews properly is not easy, and I stand by what I said earlier about leetcode interviews. Anyway, what one usually wants to achieve is to sample enough information to see whether proceeding to another round (or, in the final stage, extending an offer) is a risk worth taking. For this I consider a proper conversation more helpful than checkbox-style knowledge questions. One pair of questions that I recently read: "Tell me about your favorite technology", followed by: "Can you tell me in which ways it sucks?" I've never used it myself (as of yet), but I think it could be a good way to create a fruitful dialog in a technical interview.
In the foreword of Richard Gabriel's book Patterns of Software, Christopher Alexander expresses a remarkable take on the level of quality he aspired to in his work.
In my life as an architect, I find that the single thing which inhibits young professionals, new students most severely, is their acceptance of standards that are too low. If I ask a student whether her design is as good as Chartres, she often smiles tolerantly at me as if to say, “Of course not, that isn’t what I am trying to do. I could never do that.”
Then, I express my disagreement, and tell her: “That standard must be our standard. If you are going to be a builder, no other standard is worthwhile. [..]”
Alexander then goes on to ask whether architecture is actually the correct metaphor and whether a parallel can really be drawn between the fields, for he asserts that in his later buildings he arrived at the qualities of aliveness he was seeking (pointing to his late opus "The Nature of Order"). He doesn't ask rhetorically, and doesn't give an answer, but he notes on the field of software:
I get the impression that road seems harder to software people than maybe it did to me, that the quality software engineers might want to strive for is more elusive because the artifacts—the programs, the code—are more abstract, more intellectual, more soulless than the places we live in every day.
I'm thinking it might be time to go language-shopping a bit. I'm productive with my current technologies, but it can't hurt to take a look around from time to time, to avoid blind spots and going stale. So, here is a list of contenders with a few pros/cons for each.
There's the fun category of mind-bending but probably impractical. I've never seriously attempted an array-based programming language like APL or K. It might be an intellectual challenge, but I don't see a practical problem I'd want to tackle with them. A bit more practical, still mind-bending: a stack-based language, like a Forth dialect. Maybe for toying around with microcontrollers again?
I'd also like to wrap my mind around Erlang, as it has a reputation for highly available and scalable systems.
Then it would be good to have a proper systems language in the tool belt again. My C has sadly become rusty over years and years of TypeScript and Java. I am sceptical of Rust's cultish following ("XYZ rewritten in Rust" has become kind of a dev cliché). I also never got warm with Golang. I think I'd enjoy something like Zig, Nim or Crystal.
When it comes to "real" OO: I'd like to give some form of Smalltalk another try. Maybe Citrine?
Scripting languages: maybe Raku, mostly because Hillel Wayne wrote about a few of its unusual features.
Or something functional again (old love doesn't rust - no pun intended)? Scala, F#? But then there's the JVM/.NET runtime. I could give Common Lisp another try.
With so many choices, maybe I should do advent of code this year and try a different one each day...
I read a statement on LinkedIn the other day, but as the algorithmic feed is utter garbage, I'm unable to find it in its original context again. Anyway, I very much agree with the notion and it bears repeating: leetcode-style interviews are nothing short of hazing.
A lot of folks in my bubble of interests are starting to flock to a revenant of a once beloved site. You know, the one that was murdered by 'Murica's upcoming first lady. Well, this time I won't join the party, not only for the many good reasons outlined by Cory Doctorow, but also because I need to be protective of my time.
I have fond memories of a time when durable public conversations on the web happened in significant part in a wealth of forums, many hosted by an enthusiastic admin (who usually also doubled as moderator). I've spent quite some time on several of them.
Yet all good things come to an end. Most, if not all, of them ceased operations over a decade ago. It was death by a thousand paper cuts. The instances usually ran on something like phpBB and carried quite an attack surface and admin tax; then the search engines stopped sending traffic, and the para-social media platforms sucked most of the remaining oxygen out of the room; a few of their other great deeds managed to wake even the sleepiest regulators, who in turn passed laws that, reasonable as they are, had the sad side effect of turning the running of a simple forum into a legal minefield for a hobbyist. So forums went the way of the dinosaurs. Those that are still around resemble their ancestors like a sparrow resembles an archaeopteryx. The Big List of Small Forums (note: big != exhaustive) currently lists 89 of them.
As an active participant, it was always a sad thing to see years of discussion threads gone from one day to the next. So I've probably started to resemble what Kevin McGillivray has dubbed a homesteader on the web. In his fantastic map of the web he writes:
To the east is the Archipelago of Personal Websites and the less densely populated frontier of the Open Web, dotted with isolated islands and woodland cabins with one inhabitant each, lonely as a single player Minecraft game, sending messages to each other in archaic RSS and email bottles. These homesteaders are traditionalists. They see the ever-expanding borders of Big Social as the end of a way of life. They want to see the return of the vibrant early web, or of what they think the early web was. And they're out there, building and hoping others will dismantle the city and join them on the frontier. [..]
The homesteaders are reviving buried patterns that had promise but are now extinct or endangered. For the homesteaders, the danger is in distrusting all new patterns by default, becoming blind to the shortcomings of the old patterns, and finding themselves all alone in a web ring of one. What we can learn from the open web homesteaders—what works, what's missing, why nostalgia won't help.
Indeed, nostalgia doesn't help. And genuine communities on the web are one thing clearly missing. If you have thoughts, by any means, send a message in an archaic email bottle to my isolated island.
Everywhere, there are broken things. There are beautiful things in the mix, but so much is broken or built on broken foundations.
Iâve never really been one to accept a broken foundation of how the world works. Now is the time to reinvent the wheel, and it always was, no matter the lies youâve been told. [...]
Have more ideas on wheels needing reinventing?
Benjamin Hollon asks a really great question there. I assume everyone can think of broken things and systems that would benefit from being reinvented. It might actually be quite hard not to fall into the trap of defeatism when such big topics are brought to the table. But, except maybe for the square-shaped wheel, there are enough problems to be solved, solved again, solved better.
This also reminded me of a short story in Peter Bichsel's book Kindergeschichten, titled "Der Erfinder" (The Inventor), about a man who lives in seclusion and independently invents things that already exist. The story closes with a sentence that I still find very beautiful (to the extent that it has lost beauty in the following translation, I am to blame):
Yet he remained a true inventor his whole life long, for even inventing things that exist already is hard work, and only inventors are able to do it.
I watched a documentary on J.R.R. Tolkien yesterday. It included an interview snippet in which Tolkien tells how, as a college professor, he had to grade term papers, a task he describes as laborious and boring. One student had left a page blank somewhere in his paper. Tolkien was delighted that he didn't have to read that page. Instead, he started scribbling down a sentence on that blank sheet of paper: In a hole in the ground there lived a hobbit.
It seems, that sometimes a little boredom can go a pretty long way...
In 1986 the Space Shuttle Challenger broke apart a bit over a minute into its flight. All seven astronauts died. This led to an inquiry by a commission of which Richard Feynman was a member. The final report contains an appendix on the reliability of the shuttle and the estimation of risk. Feynman writes:
If a reasonable launch schedule is to be maintained, engineering often cannot be done fast enough to keep up with the expectations of originally conservative certification criteria designed to guarantee a very safe vehicle. In these situations, subtly, and often with apparently logical arguments, the criteria are altered so that flights may still be certified in time. They therefore fly in a relatively unsafe condition, with a chance of failure of the order of a percent (it is difficult to be more accurate).
Official management, on the other hand, claims to believe the probability of failure is a thousand times less. One reason for this may be an attempt to assure the government of NASA perfection and success in order to ensure the supply of funds. The other may be that they sincerely believed it to be true, demonstrating an almost incredible lack of communication between themselves and their working engineers.
He then gives his thoughts on how things should be handled:
Let us make recommendations to ensure that NASA officials deal in a world of reality in understanding technological weaknesses and imperfections well enough to be actively trying to eliminate them. They must live in reality in comparing the costs and utility of the Shuttle to other methods of entering space. And they must be realistic in making contracts, in estimating costs, and the difficulty of the projects. Only realistic flight schedules should be proposed, schedules that have a reasonable chance of being met. If in this way the government would not support them, then so be it.
He concludes his observations with a remark that is both very tragic, and very powerful:
For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.
Bartosz Korczyński has reimplemented huge parts of the classic Visual Basic 6 interface in C# and made it runnable in the browser. It is not a complete reimplementation, and I don't know how much of the actual language is supported. I managed to trigger a message box from the click handler of a command button, but got an error when I tried to assign the content of an input to the caption of a label. Still, this recreation from scratch is an impressive technical feat.
It would miss the point to try to rewind the clocks to 1999, but I am sure there are many lessons to be rediscovered. For example, let's take the UI toolkit:

Measured against today's standards, that is an incredibly small component library. The number of controls amounts to a baker's dozen, and four of them are hardly UI components in a stricter sense. Timers are an abstract concept; that they are made into a component is merely an overstretch of the metaphor, a convenient way to declare an object, maybe a bit of "golden hammer" syndrome. The frame is visible, but really just a tool to introduce a hierarchy, so that you can have multiple groups of radio buttons, for example. Vertical and horizontal scrollbars would in today's toolkits be an implementation detail of some container that happens to overflow. What does that leave us with? Picture, Label, Text input, Command, Checkbox, Option button, Listbox, Combobox and Shape. Most of them map directly to HTML primitives that have existed since the early days of the web (granted, it took a while for SVG to become widely available, but the rest are just standard controls). So why on earth does every company need to have its very own component library? Why do we consider it necessary to have a three-digit number of components at our disposal to create user interfaces to, let's face it, CRUD APIs persisted in relational databases? (Constrained) input, selection (single and multiple, maybe with a few constraints like: at least one, or at most n), textual and graphical output, and a way to invoke commands, and we've got the 10% that covers 90% of the use cases.
Certainly, the world did not stop spinning, and a bit of refinement does not hurt. But I am under the impression that something fundamental in the culture of software design and engineering was lost over the course of the last decades: a focus on the essentials, paid for with excessive complexity. Likely an instance of an effect that Gregor Hohpe has described and eponymously dubbed Gregor's Law: excessive complexity is nature's punishment for organizations that are unable to make decisions.
Anyway, if somebody were to work towards a new software substrate I think this is among the most fundamental lessons to be relearned.
In the country where the Red Queen reigns supreme, you might encounter a dependency that you cannot get rid of wholesale, that you need to update, and that breaks its interface. This happens more often than anybody cares to admit.
I used to treat internal dependencies as, well, dependable. But experience taught me that a dependency which breaks once will usually break again in the not-too-distant future. So, to manage undependable dependencies, you can create an anti-corruption layer and encapsulate all changes to the interface there. As that comes at a cost, and usually only conforms to the initial upstream interface anyway, I tend to skip it the first time around, giving the dependency the benefit of the doubt. But when the interface turns out to be unstable, it is "fool me once, shame on you; fool me twice, shame on me": make sure that the fix also ensures that adapting the next time will be a matter of editing a single module under your own control.
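A minimal sketch of that "single module under your own control" idea. Every name and shape here is illustrative, not a real library: imagine an upstream whose v2 renamed `getById` to `fetchUser` and changed the result shape.

```typescript
// What the rest of the codebase programs against: our own stable interface.
interface UserLookup {
  findUser(id: string): { id: string; displayName: string } | undefined;
}

// The upstream interface as it looks after its latest breaking change
// (hypothetical shape).
type UpstreamV2 = {
  fetchUser(id: string): { userId: string; name: string } | null;
};

// The anti-corruption layer: the only module that knows upstream's shape.
// When upstream breaks again, this adapter is the single place to edit.
function adaptUpstream(upstream: UpstreamV2): UserLookup {
  return {
    findUser(id) {
      const raw = upstream.fetchUser(id);
      return raw ? { id: raw.userId, displayName: raw.name } : undefined;
    },
  };
}
```

The point of the pattern is the asymmetry: `UserLookup` stays fixed while `UpstreamV2` churns, so the next breaking change touches only `adaptUpstream`.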
Software is not a product, but rather a medium for the storage of knowledge. [..]
[T]he hard part of building systems is not building them, it's knowing what to build — it's in acquiring the necessary knowledge. This leads us to another observation: if software is not a product but a medium for storing knowledge, then software development is not a product-producing activity, it is a knowledge-acquiring activity.
— Phillip Armour, The Five Orders of Ignorance
Although I don't entirely agree with its absoluteness, it's an interesting framing of the matter nonetheless.
Jonathan Edwards delivered a keynote address to the Workshop on Live Programming, in which he talked about software substrates. He defined them as a conceptually unified persistent store for both code and data, with a UI for direct manipulation of and live programming within that state. In such a substrate there would be no division between development time and runtime. Using the system and programming it would just be different ends of a spectrum; everything happens in the same "world".
He contrasts the idea of a software substrate with what he calls "the stack", as pars pro toto for conventional software engineering, and concedes that the idea of substrates has been thoroughly beaten by the stack since the 1990s. He asserts that substrates are better suited for a class of problems that lie in between with regard to complexity: problems for which spreadsheets are not powerful enough, but which do not yet warrant building a conventional software system around them, as that would be too expensive. He sees low-code/no-code competing on that ground as well, but says that these are attempts to "template the stack", and therefore cannot establish themselves successfully and durably.
Edwards proposes a research agenda on data-first software, with the goal of generalizing spreadsheets and making them more powerful. The ideas and problems to be solved include: trying out UI metaphors beyond the grid, end-user-friendly version control for the data, built-in support for changes in types and schemas, treating code as meta-data which should be kept small, inspectable and malleable, interoperability with the stack via HTTP APIs, and providing a subset of the features of conventional database management systems.
I think that is a very interesting problem space. What could software look like that empowers the user without coercing them into becoming a software engineer? I think if you want to get there, you need a great number of these "in-between" problems and folks from both ends of the spectrum with a desire and willingness to collaborate on the making of that substrate, with their egos so much in check that it does not become a patronizing experience for either side. I think the RAD tools of the 1990s, like Visual Basic and Delphi, were onto something there. They certainly erred on the side of "the stack", but one could learn a great deal from them. The WYSIWYG approach to form building, with all its warts and limits, enabled designers without any programming knowledge to contribute an actual part of the actual software. That is in stark contrast to today's common approach, in which designers use software to paint (very credible) images of what a software might look like, images that are not executable in any meaningful sense and are then thrown over to engineering to be recreated from scratch.
If software substrates enabled an intellectual cohabitation, if you will, of end users, domain experts, designers and engineers in the same medium, that would be a huge achievement in my eyes.
An interesting idea: Rebecca, Allen and Jordan Wirfs-Brock have authored a paper, Discovering Your Software Umwelt, in which they reflect on how the influences that surrounded them shaped their approaches to software development. They share the questions they used for their self-reflective narratives. I could imagine that writing such a narrative can indeed uncover interesting insights, both on an individual level and for the field as a whole.
On the C2 Wiki I stumbled over a definition of what patterns in software development are about: "We're looking for that common knowledge that's so uncommon."
— A really great idea, beautifully condensed into a pithy sentence.
This day should have been a pretty ordinary Wednesday. The most exciting part should have been the slightly premature, yet happy little St. Martin's fest organized by the kindergarten of our youngest child. Alas, it was no ordinary day at all.
The citizens of the USA have elected a fascist as their president. I am not surprised by the outcome, but disgusted and nervous nonetheless, as I have no doubt that this will have consequences of the severest nature.
What a timing for my own country's government to collapse over the smallest faction of the coalition, the neoliberals, losing their political senses over a prognosis that they won't be part of the next parliament.
I took two slim volumes from my shelves today. One is a very timely book, originally published at the beginning of the first term of office of said fascist, by historian Timothy Snyder: On Tyranny - Twenty Lessons from the Twentieth Century. The first lesson is "Do not obey in advance", which I really wish the chancellor of Germany had heeded, instead of sending servile congratulations before even the electoral college majority was certain. The final lesson, "Be as courageous as you can", led me to take out a yellowed anthology, a book I inherited from my father, with texts by Erich Kästner. It contains an essay that impressed me deeply when I first read it, titled "Schwierigkeiten, ein Held zu sein" (On the difficulty of being a hero), in which Kästner recounts the book burnings organized by the Nazis in the early months of their murderous regime. The authors whose books were burned number in the hundreds, and of them Kästner was the only one who witnessed the obscene spectacle with his own eyes. He concludes the essay with the following (translation by me):
What I had done was by no means heroic. I was simply disgusted. I stayed passive. [..] While watching the burning pyre I didn't yell back. I did not threaten them with my fists. I merely clenched them in my pockets. Why do I tell you that? Why do I confess? The reason is that when you speak of the past, you always also speak about the future. Nobody among us, nobody at all, can answer the question of personal courage before facing a situation that actually requires showing it. Nobody knows whether they are made of the material from which the decisive moment in history forms heroes. No people, and no elite, may rest and do nothing, merely hoping for the best, that in a case of emergency, in the most serious of scenarios, there will be enough heroes at hand.
Did I ever tell you that I was once named Person of the Year by Time Magazine? Probably not. It's not that much of an achievement, really, and I'm not humblebragging, because they gave that title to quite a few people in 2006. Anyway, it's a cool way to introduce yourself at parties or on about pages...