Title graphic of the Moonspeaker website.

Where some ideas are stranger than others...

FOUND SUBJECTS at the Moonspeaker

Should We Settle? (2025-12-08)

Officially this is a museum-piece settee in spain, which looks suspiciously like it was never meant to be sat on, really. Original image by Votpuske via wikimedia commons under Creative Commons Attribution-Share Alike 4.0 International license.

Of course, as soon as we pick up such a loaded question, we have to deal with the issue of definitions and lack of specificity. It makes total sense to ask what "settle" means here, and furthermore, settle for what? Revealingly, "settle" is a verb with so many negative connotations I can't help but think the various colonizing nations and those committed to their exploiting ways have taken against it. It is no longer positive to "settle" a new place, in part because there are no new places and it turns out there were never any new places. "Settling" a lawsuit doesn't seem to have the happiest connotations among the people who insist that every lawsuit must have a winner and a loser. In that sense it seems to share the unwanted seat with the latinate verb "compromise." But as it happens, that was not the context in which this rather colourless question came up. It is not meant to be answered except with the default answer of "no," and that in itself is very revealing. Still, before continuing on I should explain the context, which admittedly was not a trip into the dictionary or something. Consider an example of that rara avis, an adult who managed to acquire a mortgage for a home when it was still comparatively affordable, exactly where they most wanted to live, followed by a combination of further good fortune, savvy, and hard work, so that the mortgage is paid. Now they are notionally semi-retired, though having had good fortune and good sense both, the adult in question is leery of getting any closer to retired than that. A combination of wisely and humbly hedging their bets makes excellent sense. Also notionally, that fortunate and sensible adult could "settle," just keep doing what they're doing. Considering how many times I have had less than praise for the greedy self-declared elites and their hangers on, the usually manipulative question of settling may seem like it should be answered "yes." After all, the adult in question is in a good situation, and they have enough.
But wait, that suggests the only definition of "not settling" at work behind the question is "not settling for less," usually less money, property, or even power over others.

So faced with an example where the pseudo-question "should that person, or us should we be so fortunate, settle?" came up, all manner of fascinating other ponderables arose. A person with such good fortune is able to choose what to do with much of their time and any funds beyond the necessities. If they decide to do nothing more with it besides play tiddlywinks, that strikes me as fair enough. This is rarely what people outside of the very capitalist set do though. They've usually got some serious practice to work on, whether it be art, some kind of spirituality, or ongoing volunteer commitments of one sort or another. They may or may not be politically active in a real sense, that is, not just going through the sadly denuded process of voting in many "western" contexts, where the ballot votes are counted but the real voting is through money. (I don't think there is any real evidence against the return of property qualifications having a meaningful impact on political decision making in the "west.") It actually brings to mind the famous Bill Watterson commencement speech illustrated by Gavin Aung Than in his now wrapped-up zen pencils, still featured on the front page of the series in its online form. Watterson is an excellent writer, so we should not be surprised that he selects some zingers for the speech, making observations about people perceived to have "abandoned a career" and how "a person happy doing his own work is usually considered an eccentric if not a subversive." I have encountered claims that such people are lazily zoning out, supposedly refusing to contribute or work hard enough, maybe even fleeing adult responsibilities and the real world. My activism-inclined acquaintances demand to know why such people aren't organizing. Well, maybe such people are, maybe such people aren't. We actually don't know. People who are so fortunate as to be able to make such choices about their time and work life are not exhibitionists about what they are doing.
They may end up creating an award-winning comic strip as Watterson did, but he is so notoriously publicity shy that I suspect even those who do achieve a conventional success are inclined to avoid the spotlight. This actually makes sense to me.

If a major element of a person's motivation is to be able to choose how to spend their time, which typically also means living by a particular set of values incommensurate with a constant drive to get more money and prestige, then fame itself is not a goal. We see all manner of examples every day of people who have some version of fame or its common correlates, lots of money and an apparent ability to control others. But these people don't seem to have much control over their own time or what they do. Far from it, these are people whose fame includes a reputation for being in demand. When people with those goals really hit the "big leagues," they end up paying assistants to manage their schedules for them, or at least to manage the fallout when they pick and choose who to pay attention to and who to ignore. Still, they are extreme examples and tangled up in an openly corrupt system of perverse incentives. I think it is more common for people to value having a great deal of structure in their lives, and they may even be amenable to that structure being imposed by other people.

But the sad fact is far too many people never get anything that even remotely resembles a choice. A structure is imposed upon them, and they find themselves faced with the grim question of whether to resist and try to get out of that structure, or to settle. (Top)

Demanding Perfection is Political (2025-12-01)

J.J. Harrison's november 2009 photograph of a great angle at the painted cliffs in australia, via wikimedia commons under Creative Commons Attribution-Share Alike 3.0 Unported license.

By now even the people most inclined to give Aristotle a solid pass would probably all begrudgingly admit he wasn't wrong or exaggerating when he commented about people being political animals, even if he was too much of a sexist jerk to admit women are political in their own right. I strongly suspect this caused him considerable difficulties when he went off to spend time in the macedonian court cozying up to future super mass murderer Alexander. In what today is recognizable as a gangster-land system where whole families are customarily part of the politics and associated intergroup violence, women didn't have a choice but to be political, even if they wanted to somehow keep out of the fray but paradoxically also stay with their family otherwise. Since in effect the only way families continue is through the women anyway, this just became one more way women caught up in sexist contexts get scapegoated, and sure enough the ways to accuse them of being less than perfect proliferated. Whatever the new philosophy, law, or technique, women could be faulted for using it, not using it well enough, or not using it at all. If they are successful at surviving and contributing to their family's success, rest assured they will be impugned as supposedly sexually impure, as using some type of physical or mental poison, or as having generally tainted agency either because they did what they wanted or because they didn't. In other words, no matter what, the women will be faulted. This is a political demand for perfection, and the way it is demanded of women is highly illustrative, because the same constant dishonest double bind creation is turned against any other person or group an exploiting class and their followers are desperate to keep down. The overt examples of this technique for maintaining the status quo are proliferating even more than usual because the established ways of keeping different people down and exploiting them are coming apart.
Those ways are coming apart due to a combination of persistent and successful resistance and the fact that those ways are the favoured approaches of people who are endlessly greedy, at best unwilling and at worst incapable of accepting that any others besides themselves have a right to live independently.

The most obvious examples of the political demands for perfection recently involve anyone involved in land protection or resistance against what even united statesians have begun openly referring to as "american imperialism." (I now dislike using the term "american" in this way because the americas include many more countries and people than the united states. Colleagues from elsewhere in the americas helped me fully appreciate that.) In both canada and the united states, efforts are ongoing to declare any effort to protect the land from industrial exploitation "terrorism," that abused term which nobody can claim isn't a political swear word anymore. When the pretence to justification "it's okay when we do it or when we pay for it," for a very restricted definition of a certain sort of "we," is so tissue thin as to be nonexistent, many people who might have otherwise gone along with it are no longer willing to do so. I don't intend any sort of idealistic point by noting this. I think quite practically and understandably, people hate being called stupid, and that is true regardless of their political positions. The new marketing category of "environmental terrorism" has turned out to have some interesting enforcement nuances. For the children of the professional managerial class, it provides a status-raising minor arrest record. For the rest, especially the racialized rest, there is an entire infrastructure of legal abuse intended to drive them and their entire families into bankruptcy. Meanwhile, train derailments that blow up the centres of small towns or poison entire towns are not considered "terrorism," and there are no consequences for the executives who remove safety practices and refuse to spend money on repairs and maintenance to the train cars and tracks. Those executives are deemed a little careless maybe, and the corporation is given a slap-on-the-wrist fine. They don't need to be perfect. Those are the examples it is okay to talk about in public.
Then there are the others.

We aren't supposed to talk about the demands for perfection made of people resisting oppression. Those people are always supposed to resist nonviolently to be perfect, and why oh why, their critics moan, don't they die nobly instead of fighting back? Why aren't they following Gene Sharp's playbook? Why aren't they giving themselves up like Jesus? How dare they fight back and cause injury to soldiers who are only "doing their duty"? This litany comes roaring out especially against Palestinians fighting steadily to end the ongoing western-backed genocide against them and win back their homeland in a non-apartheid form. It also gets an airing every time a "non-western," for which read non-united states affiliated, country fights back against western-backed gangs trying to overturn its government. It's interesting how all of a sudden the "I was only following orders" defence counts, as long as it is western-backed people claiming to be following orders. They are never expected to be perfect. Not only that, whatever accusation a western-backed person makes against a targeted country or group or person, we can count on it to be repeated and bruited across the mainstream media with no effort at fact-checking or anything else. The correspondence to the way accusing a woman of being sexually promiscuous is taken as confirmation of guilt is exact.

Furthermore, we all know this is a political tactic. We all know that "perfection" is only demanded of people who are targeted for exploitation and even outright destruction. It has got to the point that the people who think they are elites and are in charge can't even be bothered to pretend otherwise. I guess that is progress, of a sort. (Top)

Actual Thoughts on Steampunk (2025-11-24)

Quote of a lovely image of an orchestrion from Jake Von Slatt's Steampunk Workshop website. The original-original source is a Collector's Weekly 2012 article Von Slatt cites, which is also well worth a look. Both pages visited and working 24 december 2024.

Nearly seven years ago I wrote a piece called The Nineteenth Century Blahs, but never did get around to writing much about one of the stranger responses to the nineteenth century from the late twentieth and early twenty-first: steampunk. The great steampunk vogue came and went in the 2010s, running alongside the temporary mainstreaming of so-called "geek culture." For a while there, every comic con or similar event had at least a solid steampunk-costumed contingent, if not a full track of sessions to complement a substantial number of vendors and products. Among speculative fiction writers, a very few tried their hand at writing counterfactual histories and alternate timelines that were really interesting. At the moment such writing is tragically blighted by the market hijacking and perversion of the concept of "woke" combined with protestant extremist demands for faith displays commonly designated "virtue signalling." But if you can find at least the Steampunk III: Steampunk Revolution anthology edited by Ann Vandermeer, you can have the marvellous experience of reading an excellent selection of stories including other than "white" main characters from before the blight took off. Steampunk II: Steampunk Reloaded, edited by Ann and Jeff Vandermeer, is also good, if you can track it down. But the first Steampunk anthology they edited, which features the earliest short stories, reveals profoundly violent and disturbing evidence of how most of the first steampunk writers were specifically nostalgic for an era when supposedly patriarchy was absolute and the "lesser races" were treated like subhumans without apology by "whites." It's a deeply disturbing read, but for anyone who is concerned to see a full spectrum of written treatments including the worst, that first anthology represents an important historical record.

For good or ill though, in the end steampunk became almost wholly about an aesthetic, and so by nature this meant it could only become a fashion and then quickly miss or even actively close potential avenues for imagination and artwork. So if a person was drawn to steampunk by what could be construed as a critique of modern technology, with its constant drive for centralized control and rendering everything possible into pictures under glass, the contradictions began to poke out in a hurry. While there is an admittedly fun thumb-nosing in steampunk redecoration of computers and computer peripherals, rendering them other than butt-ugly for most "PCs" and rejecting the weird anorexia-adjacent and colour palette-minimized aesthetics of recent apple products, the fact remains this is only about looks, not fundamentals. It's less glamorous to get into the discussions and considerations about free/libre software and hardware, although the joys of joining the rogue repair community are evident in such businesses as ifixit and craft fora like instructables.com. Yet it seems to me there was an important element of an attempt to revalourize practical skills and ways of practising art still often sneered at as mere "craft" because they don't depend on expensive art supplies. Still, I get why the aesthetic element took off long before the commercialization kicked in. Thanks to commercialization and mass production of products designed to land in the trash as quickly as possible, which turns out to mean they are poorly designed and/or ugly and difficult or impossible to refurbish, the result is both boring and disgusting to the eye. Maybe various cynical parties saw this as a means to drive people back to the pictures under glass, where the colour and movement are. Then, faced with the evidence from more than just the steampunk aesthetic that people value pleasant textures and hefts, techbros decided handheld computers needed "haptic feedback."
(Including the jackanapes who decided they would literally hide all options to switch from "haptic feedback" to quiet, distinct click sounds instead.)

Perhaps what steampunk as an aesthetic and performance mode reflected more than anything else was a particular cohort of young but mature adults, who for a while found a wonderful niche to meet up together and have fun in. It started out from rather humble beginnings, associated primarily with thrifting and reuse. In fact, it could even overlap with the larping scene, while not depending so much on waving around swords and pretending the world is full of elves and orcs, or on rolling dice. The few people who tried to "dress victorian" on an ongoing basis seem to have been after something else demanding rather more money, and in at least one case I am aware of, this ended up with them being formally ordered out of a local botanical garden and tourist attraction. (I don't have the details about the botanical garden incident, it all sounded very odd, and seems to have come down to some sort of ban on "costumes" in the gardens unless it is hallowe'en.) This also suggests that this particular aspect of the steampunk aesthetic was made ephemeral by default, because the people involved would tend to be pulled away from their original cosplay groupings by such life necessities as moving for work, dealing with lack of work, perhaps building a life with a partner and even having children. But hopefully the incidental reminders endure: that people can still create art in their daily lives, resist pressure to generate constantly growing mountains of plastic waste, and have fun with similarly aged and/or like-minded friends.

I am not convinced by the people who sneer at handheld computers masquerading as phones as skinner boxes, nor when they sneer at people they think are run wholly by their "smartphones." I am not too convinced by the people who have decided to reframe questions about "social media" and handheld computers in terms of demands for "digital veganism." Yes, I too have seen toddlers handed a phone or tablet to keep them busy instead of more familiar toys, whose role in building muscle and gross motor skills is now being rediscovered the hard way. While I agree wholeheartedly that we have a serious and growing problem with technologies designed and redesigned to encourage physical and mental helplessness with a strong side of surveillance and attempts to impose centralized control, the associated technologies are symptoms, not the disease. The disease is the all-pervasive clash of social classes, whether we like it or not, and the best and most thought-provoking uses of steampunk as a loose category within speculative fiction and a cosplay aesthetic practice actually dealt with this element directly. After all, who had well-fitted, elaborate clothes in the real life nineteenth century and often in the fictional steampunk reimaginings of it? Who could act as if there was no such thing as money and expect to be accompanied by servants who mysteriously never needed paying? Fashion is not so unimportant or unrevealing as we are encouraged to believe. (Top)

Revolt of the Tools? (2025-11-17)

Illustration by Gustave Brion from an 1867 edition of Victor Hugo's Les misérables, courtesy of oldbookillustrations.com.

There is a farcical quality to how badly search engines perform today, even including the ones not designed or distorted into a poor cover for datamining and advertising. Even more hilarious is how suddenly it is all but impossible to roust up a search result to provide an online citation of any anthropologist or similar person who referenced, at least in passing, a Mayan or other story once easy to look up under a heading like "the revolt of the tools." I am well aware of the possible, and if so dubious, reframing of it into the revolt of the toys in the original pixar hit "Toy Story." There is a lot going on in the much earlier Mayan story, more than a person reading a transcription would necessarily pick up on, because they are often reading the story out of context. The question of who was telling it when must always be asked, and preferably answered, with stories featuring important social commentary. Even if advertising and datamining per se were not major issues of concern today, if there were still hucksters at large trying to sell magic "AI" to fill their pockets before the bubble pops, it would end up being difficult to find much about pre-computer era reflections on what could happen if the tools stop behaving as their users intend, let alone wish. There are many not so honest brokers who would like us not to take stronger notice of how so-called "tools" like what is advertised as "AI" are centrally controlled and completely unauditable. They are not "our" tools, nor indeed are they "tools" as such; they are not designed to help us complete a job. Unfortunately, these incredibly wasteful programs, which met their moment by providing a means to keep selling GPUs and wasting electricity when the bitcoin hype began to fail, are not intended to be useful in the more common sense of the term anyway.

All that said, the obnoxious "AI" hype is not what brought the Mayan account of the revolt of the tools to mind, though the problems with large language models and other programs meant to put together statistically probable sequences of "symbols" may seem more directly relevant. My point is not to use the story as a means to emphasize that such models don't have any agency, even though the Mayan story hinges on the tools having agency, because they attack humans for abusing them, thereby forcing humans to correct their behaviour. No, a big part of what brought this story to mind recently was turning to my trusty OED – I have an older electronic one and an offline hardcopy compact edition – to examine the current meaning and connotations of the word "organ." Quite unexpectedly, I learned this is a word with a trend opposite to that of the many words for living creatures applied to weapons and other poison-spewing technologies to make them sound less threatening. Back in the day, Jane Caputi and Mary Daly were among the feminist scholars taking note of this uncanny and dishonest linguistic trend. Probably the most infamous and symbolic example in a north american mainstream context is the nuclear power "plant." This is not the same tactic as synecdoche, where a part is made to stand for the whole, but an effort to naturalize and render unthreatening a genuinely dangerous process or item. However, it turned out that "organ" originally referred to inanimate tools, as in hammers, saws, and the like, and only later was reapplied to parts of the body. The most common illustrations of this reapplication to part of a living being are basically men referring to their penises in pornographic or porn-adjacent contexts.
I guess the logic of this, since these examples tended to be from various authors writing in england, is that supposedly the penis seems to have a mind of its own and the hope is to remake this sometimes embarrassing and inconvenient phenomenon into something more seemingly threatening or useful. I suppose it can be face saving to try to imagine an unruly seeming body part to be more like a tool that can be repurposed into a weapon, but I am not convinced.

More helpfully, this reminded me of an ancient greek habit of referring to slaves as if they were tools with feet, and then in turn of the fact that slavery was practised in north america before europeans showed up, but not at remotely the scale of european practice, nor reframed in a way to slaughter people en masse, as became entrenched colonial practice even before then. With that detail in mind, a story detailing the revolt of the tools maybe doesn't sound quite so whimsical. Maybe, if we are sensitive and thoughtful persons, we begin to wonder how the people wielding those tools were rendered so invisible that rather than automatically imagine people with those tools in their hands, we imagine tools mysteriously moving about in the air. At which point, if your "we" is similar to mine, years of cartoons by the now ever more infamous disney corporation and of warner brothers shorts, with their depictions of seemingly self-operating tools, bubble up in memory. Thinking back, I remember my favourite book of fairy tales including the story of elves who help a poor shoemaker by making up shoes for him at night, leaving no other sign of themselves in the morning. Those shoes couldn't possibly make themselves! Now I wonder at how the denouement of the story always reframes the shoemaker's wife as making some sort of mistake by making clothes for those elves, after which they leave the household. This ending always struck me as appropriate, because the family was no longer impoverished, and the wife was demonstrating respect and gratitude. Perhaps those elves had someone else to help. Or, if we consider the resonances with the original practice of live-in apprentices, the apprenticeships were done. Still, the point is, this old european story is a bit more honest. If we can't see the tools at work, it's because the people who use them are required to work out of sight.
This is a common requirement of heavily exploited workers, since if their exploitation is too visible, well, others might object and the exploiters might find it impossible to keep their sense of right and wrong from troubling them.

What really brought the subject of this rangy thoughtpiece to mind, though, was a pair of coinciding readings. One was Karen Messing's wonderful and infuriating Pain and Prejudice: What Science Can Learn About Work From the People Who Do It, the other a brief news article recounting walmart management's latest attack on the dignity of the people so unfortunate as to need to work there. Said employees, who work on the shop floor stocking shelves, running the cash registers and so on, are to be fitted with body cameras to help stop shoplifting by customers. Of course, this is a bullshit claim. The point of the cameras is to degrade them, and it demonstrates walmart management is of the view that their floor workers are stealing basic necessities sold in the walmart stores. This is the same corporation keeping wages so low that in some united states jurisdictions they "help" employees apply for foodstamps. It would please these greedy corporate types more to spend money on various forms of automation to get rid of employees altogether than to pay decent wages and treat them with dignity. The cruelty, viciousness, power tripping, and waste are the point. Nevertheless, the same greedy people will whine, and do whine, when "the tools revolt." (Top)

What's Good for Extraction is Not Necessarily Best (2025-11-10)

Image of indigo plant extract applied to paper by Palladian, 2004. Image courtesy of wikimedia commons, where Palladian has released it to the public domain.

Until not so long ago, mainstream views about writing and literacy seemed pretty much set in stone (no pun intended). According to those views, there was only one true form of writing, a perfect and best means of representing words for people to read later. That form of writing was and is based on using a small number of symbols, about thirty usually, representing both vowels and consonants. Punctuation seems to be taken as basically optional, and accents or other diacritics are completely sneered at as unnecessary. This writing must represent words, that is speech, unambiguously. No interpretation! From this we land in a first contradiction: the letters are supposed to do all the representing, and neither punctuation nor accents count as the writing, even though they are necessary for the disambiguation part. Well, and they don't always work. And then there is the whole issue of spelling, which is most acute for french and english. But, anyway, still, common sense not long ago, and I suspect to this day, is that alphabetic writing is the very peak of writing system development. Once an alphabet meant for representing words unambiguously is set up, there is nothing else to develop. After all, didn't the original phoenician abjad get superseded by the greek and then latin alphabets? And didn't even the slavic peoples who standardized at first on the cyrillic alphabet still start from an alphabet, and aren't some of them adopting the latin alphabet after all? I'm actually not trying to be facetious or dishonest by putting these questions here, even though I have never seen such questions presented because they record accurate ideas or perceptions. And yet, not so long ago, these would not have been questions but seemingly obvious reasons the latin alphabet is supposed to be the best and the end of the developmental line for writing systems.

Silly as it may be, while I have long been inclined to reject this idea of the latin alphabet being the top of the heap, I was hard pressed to say why. Considering such a status for it statistically unlikely is quite weak tea, even though it is true. The greeks still write with their alphabet to this day, though they may use a variant of "greeklish" (frequent visitors to the Perseus Project will be most familiar with the variant called beta code) when typing on a keyboard without ready access to a greek keyboard layout. Whatever the opinion of others, people across the part of asia with the orthodox christian church as a part of their religious history have mostly stuck with the cyrillic alphabet. Arabic cursive script, that other famous and widespread descendant of the phoenician abjad, is not going anywhere either. It is well worth reading up on the korean hangul script, which generates its letter shapes on rather different principles compared to the alphabets I have mentioned. There is an underlying pictorial element, but unlike the abjad letters, which according to the widely accepted argument reflected a series of pictures and related letter names back-projected from hebrew, hangul letters reflect the shape of the speech organs used to make the sounds they represent. If the qualification for best writing system is representation of sound as accurately as possible, it seems to me hangul would be the proper top of the class.

Part of what can make the naive claim about alphabet superiority seem reasonable is the way writing systems loosely called "hieroglyphic" are presented. Many of us encounter ancient egyptian hieroglyphics, and they can seem quite formidable. There are many signs of several types. There are the main pictorial signs, then the signs added to disambiguate meanings, a specific separate set for numbers, and then a subset used to represent foreign names. Those last ones are the progenitors of that famous phoenician abjad. The best known hieroglyphic writing uses the most elaborated version of the script, but there was also a simplified, cursive form for more everyday use. Practically speaking, the ancient egyptians used this script for thousands of years, and it took major and violent change over a long period to finally run it out of day to day use. The coptic alphabet is its closest descendant now, and I am not sure that alphabet is used for much outside of coptic bibles these days. The most famous logographic script made on similar principles and still in use today is, of course, the chinese one. No matter how complicated it may seem to a person who grew up just with the latin alphabet, clearly it is effective and useful. On seeking to improve literacy rates, china did not opt to abandon this old system, even though to an outsider abandoning it might look easier. To do that would effectively cut off the people from their entire history and literature, so paradoxically they would still be illiterate. Rather than do such a dangerously disruptive thing, it makes more sense to learn the latin alphabet as an extra script if desired. Such a policy seems to be what the chinese have chosen, and this is true of many other countries where the population has an older and deeper written tradition in a different script. (Many, not all. One of the most famous counter examples is turkiye.)
As to why a people opts for a logographic script rather than an alphabet, I have not run into any author who has ventured to make an argument about that, or at least, not an explicit one. Syllabaries can look a lot like overgrown alphabets, but they are favoured by many peoples whose languages eschew bare consonants altogether. The best known is probably japanese hiragana, but there are many others.

Looking back at now hilariously dated popular ancient history books, the authors return repeatedly to the difference between the phoenician abjad and the greek alphabets. They are really impressed by and exercised to explain the development of letters for vowels. This is an undeniable difference, but one that seems a bit strange to make a fuss over. All human languages have vowels, but whether humans seeking to write down their language want or need to represent them varies. If their writing system is logographic, then directly representing vowels can be utterly beside the point. Representing vowels doesn't seem that special, because in the end, the phoenicians and other peoples using an abjad didn't need to do so because they did not have words differentiated only by a vowel sound at that time. The greeks did, so if they were going to use a script that directly represented sounds, they needed to work out a way to represent vowels. They were solving a practical problem, that's all.

But, if you are perhaps a person of a colonizing, extractively minded sort, then your perspective would be very different. If on encountering any person or thing different from what you already knew, you had in mind first of all to categorize them, identify something useful and exploitable in them, and then extract the exploitable bit, then the alphabet would seem the best possible way of representing language. After all, notionally an alphabet allows language to be represented accurately and permanently apart from the original speakers, especially if set down on an appropriate surface. Once the words are extracted and recorded, the original speaker and composer is no longer necessary. This very attitude is at the core of so-called "salvage archaeology," which was invented by scholars certain that Indigenous peoples were doomed, so they had best record all the good, interesting bits before they were wiped out. So in the end perhaps what happened was that finally somebody trying to formally write up why alphabets were "best" got cold feet about maybe having to admit what made them seem best was how certain colonizing people used them. (Top)

Funny Thing About Those "Last Human" Stories (2025-11-03)

January 2011 photograph by Alan of part of the giant chess set in wollongong, australia. Quoted from flickr on 15 december 2024.

To be fair, there are undoubtedly serious speculative fiction readers who could make serious arguments with many citations that in fact stories featuring last humans on Earth are more diverse than the small selection that contribute to my impressions about them. By nature, trying to really dig into the concept and fan out from it into a story would be no mean challenge, since after all we humans are gregarious and are born into communities. Our survival depends on it. Hence, such "last human" stories must always feature adults. Even the hamfistedly secularized messiah backstory for the comic book character Superman reflects this. "Kal-El" is the last survivor of planet Krypton, sent off in a space capsule as an infant, but he is in some sort of suspended animation until he gets to Earth. The "last human" stories in my thoroughly nonrandom and not extensive reading seem at first very logical in their premises. They agree that in order for there to be one human left on Earth all alone, some terrible disaster must have happened. The broad consensus on how this can happen, following in the wake of Mary Wollstonecraft Shelley, who looks to be the first to write such a story in english, is via pandemic. Today we can expect authors will be less inclined to make this a "natural" pandemic, and more likely to frame it as the outcome of biological warfare or even a dread accident during peaceful medical research. So far I haven't bumped into many examples of such ideas as effects of the tail of a comet impacting the Earth's atmosphere, or a comet or meteor hitting the Earth. There are certainly "near miss" narratives, like the tragically awful Quatermass Experiment movie's macguffin, the alien life form that if allowed loose on Earth will annihilate all life. So I bounced among older science fiction stories, hilariously bad comic books, and early black and white movies with soundtracks, now free to watch online.
Fun as this was, I began to have more and more questions, because it became clear there is a seriously bimodal distribution in the "last human on Earth" stories. Yes, it's the obvious one, but it's also weird.

The majority of the time, the conceit of the story is, "here is the story of the last man on Earth and what happens to him." Sometimes it turns out he is mistaken about his aloneness, which can allow any number of clever plot twists – but they are difficult to do well, and tend to stick in the memory, and therefore thwart enjoyment of re-reading or re-watching unless there is a lot more going on. In any case, the last human on Earth is a man, and he is of course doomed, one way or the other, although in Wollstonecraft Shelley's novel, she leaves open the possibility that the last man could find some surviving humans somewhere, eventually. As we might expect, there is typically a strong element of social critique in these stories, and a conclusion that european societies are too foolish in aggregate to reasonably cope with a pandemic. Ah, which leads me to realize that all of the examples of this sort of story I have encountered are by people of "european" background. I don't think this means nobody from other backgrounds or histories has written or filmed a story on this theme. Even the biblical tradition (such as it is) doesn't contemplate a population of fewer than two people. A far smaller number of stories purport to be about the last woman on Earth, but this always turns out to mean just that. As in, there is one woman trying to fend off the male population, since somehow the rest of the female population is gone. I found it both sadly revealing and unsurprising that the few examples I ran into showed up in a catalogue of science-fiction B-movies, and it was quite clear these were pornographic in nature.

The obvious sadly revealing point is the pornography. The not so obvious one, though, is how apparently nobody who has gotten the story into some sort of published form can imagine a last woman on Earth. They assume in such a disaster, all the women would certainly die, and can't seem to imagine women would do anything or could do anything without at least one man around. This doesn't really make sense, however. Like it or not, the actual evidence out there is that women are incredibly tough and able to survive quite astonishing and appalling levels of physical and mental insult. This does make grim sense, because among mammals generally it is the females who have to somehow survive pregnancy, childbirth, and child raising. Therefore they have to be pretty resilient – and if the female of a mammalian species is not surviving well, then that means conditions for that mammalian species are quite terrifyingly bad. Then again, perhaps it is these very practical facts that have militated against actual "last woman on Earth" stories. I can think of several different stories, including the wonderfully biting classic by James Tiptree, Jr., "Houston, Houston, Do You Read?" in which the probability of a serious number of women surviving a pandemic is acknowledged and followed through in the story. In light of the rather unhinged and extreme reactions typically provoked by any suggestion or indication of female separatism in fiction or real life, I suppose the rarity of such "women survivor" stories should not be surprising. (Tiptree also wrote "The Screwfly Solution," which makes a similarly biting point about how a possible last woman on Earth situation faced with a population of men would really work out.)

For my part, somehow the idea of trying to imagine what a human woman left alone on Earth might do does not lead to the sort of mental grooves "last man on Earth" stories are typically full of. Many authors seem certain a last man would inevitably go through a phase of deliberately breaking every rule he thinks is a mere product of social pressure. I don't know about that, since I don't think the only thing that keeps us from behaving badly or senselessly is other people. In any case, since the character of the last human who happens to be a woman is so unwritten, I find myself thinking not of doing obnoxious things, but in one of those interesting lateral associations, of the Voyager spacecraft, which are now both just about beyond any means to transmit data back to Earth, between sheer distance and their decreasing ability to generate power to transmit in the first place. (Top)

Phylogenetic Trees are Cool (2025-10-27)

One of the excellent image downloads provided by the website of David M. Hillis' laboratory at the university of texas, accessed 19 november 2024.

My first encounter with the method of mapping relationships between different species was in a class about dinosaurs. This was in the days when there was still considerable resistance to the idea of warm-blooded dinosaurs, or survival of dinosaurs in the form of birds. While I couldn't make head nor tail of the arguments for and against warm-bloodedness at first because of needing to learn more about the sorts of fossil evidence involved in it, the idea of smaller dinosaurs surviving to the present in much altered and evolved forms made sense. After all, if mammals made it at first by being small and gregarious, then by evolving fur and other means to survive cold and wet conditions, then it didn't seem so ridiculous for similarly small, gregarious dinosaurs to do the same. It's toughest for larger creatures to make it through large changes in climate, even temporary ones. They don't have much wiggle room to go hungry, have less ease in moving to more promising places, and may not be able to live in groups large enough to get the benefits of a herd or troop. Well, I say all this now. Back then I found dinosaurs simply too cool to agree something as ridiculous as an asteroid strike wiped them all out completely in one shot. While obviously an asteroid strike is hardly trivial, if it was possible for one disaster to somehow neatly take out just one major type of animal or other organism from the world, well that just seemed too surgical. Plus, even then we had already learned about the extinction of the pleistocene megafauna, many of whose smaller sized cousin species made it through that alarming period. By no means all of course, a mass extinction is no party. Still, it probably really came down to me managing to hang onto the common perception of many kids that dinosaurs are awesome.
Arguably the coolness of phylogenetic trees has a somewhat more adult basis, even if some of the most famous examples are apparently so in significant part because they are simple enough for tattoos and posters.

Phylogenetic trees aren't new exactly, although the most famous first example is drawn from Charles Darwin's notebooks, specifically one dated to 1837. The Digital Atlas of Ancient Life provides an excellent reproduction among the illustrations of its explanation of phylogenetics, which I am going to lean on heavily here. Practically speaking, a phylogenetic tree is quite similar to what we refer to as a family tree, in that it is meant to depict relationships between ancestors and descendants. As always, things are more complicated when we dig into the details. A phylogenetic tree represents hypotheses about evolutionary relationships between different types of organisms, whether they are alive or extinct. They are more obviously hypothetical and therefore contested than family trees, and the most cautious biologists are reluctant to refer to the organisms under study as species once they are identified wholly from the fossil record. The issue is not genetic relationship as such, but that the colloquial meaning of species can cause a great deal of confusion and unsupported expectations. The OED definition of a species doesn't reveal where the confusion can come in, stating "a group of living organisms consisting of similar individuals capable of exchanging genes or interbreeding." The trouble comes from an expectation that very different looking organisms will inevitably be separate species and unable to exchange genes or interbreed. This expectation doesn't stand up to much scrutiny, as we have only to think about the extraordinary diversity of the domestic dog breeds, many of them maintained only by the type of forced breeding which among humans is firmly denounced as incest and liable to produce at best sickly offspring. More recently, recognition of the importance of symbiosis and its effects has added a further layer of complexity, although so far primarily on the genetic and microscopic rather than macroscopic level.
In any case, unlike a human family tree, phylogenetic trees are meant to represent degrees of relationship, not literal lineal descent lines.

So, supposing you don't want to just look at these phylogenetic trees for their aesthetic qualities, how are they read? Well, these are analogous to trees, so they have branches extending away from a trunk or base. Therefore, the shorter and further from the base a branch is, the more distantly related it is to the group of organisms defining the base. The place where a new branch splits off is a node, and the node represents a common ancestor. According to the Digital Atlas of Ancient Life, "A phylogenetic tree is an illustration depicting the hypothesized degrees of evolutionary relationship amongst a selected set of taxa (singular = taxon). The taxa are typically species, but can also be higher-level Linnaean groupings like genera or families. Alternatively, some phylogenetic trees depict relationships among individuals within a species (e.g., from geographically isolated populations)." So don't be surprised to run into examples of phylogenetic trees tracing relationships between quite different groupings, but a single tree can't mix levels. If the tree is meant to show relationships between species, all the names will be of species, if between genera, then genera names. The example reproduced above is a tree of phyla relationships, which roughly reflects shared body plans between all the organisms included in it. To begin with, phyla were defined by body shape before anyone could study genetics. Nevertheless, the body shape based definition of a phylum does not have the same weakness as it would if used for defining species. Maybe a better way to refer to this is body type or body plan, which implies an examination of more than superficial similarities. (Taxonomy is all about arrangement and ranking. As serious biology students will be able to explain in their sleep, the taxonomic ranks from most inclusive to least are: Life > Domain > Kingdom > Phylum > Class > Order > Family > Genus > Species)
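For readers who like to tinker, the idea that every node stands for a hypothesized common ancestor can be made concrete with a small program. To be clear, this sketch is mine, not anything from the Digital Atlas of Ancient Life or the Hillis laboratory; the tree shape, node names, and taxa in it are made up purely for illustration.

```python
# A minimal phylogenetic tree: each internal node stands for a
# hypothesized common ancestor, and the named tips are the taxa.

class Node:
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

    def path_to(self, taxon, trail=()):
        """Return the list of nodes from this node down to a named tip,
        or an empty list if the tip is not in this subtree."""
        trail = trail + (self,)
        if self.name == taxon:
            return list(trail)
        for child in self.children:
            found = child.path_to(taxon, trail)
            if found:
                return found
        return []

def common_ancestor(root, taxon_a, taxon_b):
    """The deepest node shared by both root-to-tip paths, which is the
    most recent hypothesized common ancestor of the two taxa."""
    shared = None
    for a, b in zip(root.path_to(taxon_a), root.path_to(taxon_b)):
        if a is b:
            shared = a
    return shared

# An illustrative (not real) tree: birds and mammals share a more
# recent hypothesized ancestor with each other than with insects.
tree = Node("ancestor-1", [
    Node("insects"),
    Node("ancestor-2", [Node("birds"), Node("mammals")]),
])

print(common_ancestor(tree, "birds", "mammals").name)  # ancestor-2
print(common_ancestor(tree, "birds", "insects").name)  # ancestor-1
```

Tracing both root-to-tip paths and keeping the last node they share is just what the eye does when reading a printed tree: follow two tips back toward the base until their branches join.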

While there are certainly people who find the idea of all life on Earth being literally related in a quite direct sort of way, especially via genetics, disturbing, I find it hard to share this perspective. On one hand, it is quite wonderful, wonderful in a truly ineffable sense. It is no wonder to me there are people so overwhelmed with delight at this, and at representations of it, as to want to literally have it marked indelibly on their skin. On the other hand, yes, it is not wholly a pleasant thought when we are faced with species extinction, anthropogenic climate change, and less than pleasant or helpful other organisms in the world. Contrary to the views of a great many scientists and engineers, our ability to meddle with other organisms in the wildly complex network of life on Earth does not strike me as a powerful thing so much as a perversion of knowledge and failure to properly honour our kin. Indeed, such notions as humans supposedly being the epitome and ruler of life on Earth strike me as an especially acute example of hubris. (Top)

Alien Disappointments (2025-10-20)

United kingdom alien internee card for Adelheid Heiman (1903 - 1939). Public domain image courtesy of wikimedia commons.

Honestly, I don't get it. I don't get the apparent intense disappointment some people have in the absence of any aliens from outer space. After all, we have more than enough trouble to keep us busy here on Earth among ourselves without inviting more. Then again, perhaps I just have the wrong mindset to appreciate much of the excitement. After all, extraterrestrials seem to be where just about every version of supernatural explanation or event gets reattributed, in hopes of making it real after all, somehow. The big ones people are a bit embarrassed to talk about are, understandably, the religious ones. The people who are sure they are wholly secular, but have decided the strange protestant extremist notion of the rapture will in fact involve special humans being spirited away from the dying Earth by aliens, come across awkwardly at best. (If you want to see the original source of the rapture nonsense, I understand the locus classicus is the annotations in the infamous scofield reference bible.) Then there are the people hoping aliens bring fancy technology to fix all our problems, which strikes me as an attempt to reframe a bowdlerized pantheon of deities. Of course, there are also the people utterly terrified that extraterrestrials will come to Earth, colonize it, and totally wipe out or enslave humans. Well, that comes across as a reflex of bad conscience and a fascinating resurgence of officially repressed knowledge of colonialism on Earth. So far I haven't read or heard of too many racialized people subscribing to any of these ideas, perhaps in part because so many of them are literally descendants of people who suffered real life alien invasions by humans. I am not a fan of psychological explanations, but can't deny certain types of stories may not be intended literally so much as they serve as a means to critique the status quo in conditions where people think it isn't safe to do so openly.
This is a major source of the angst about any sort of fiction among the very rich and their direct servants in the professional managerial class.

To begin with, "alien" just means unfamiliar. Maybe there is an element of "is that all?" behind some of the disappointment about the apparent absence of extraterrestrials we could talk to or have various dubious experiences with. Could it be possible we now know everything there is to know, have identified all the possible variants in human experience and the way things work on Earth and maybe even the inner Solar System? Put that way, it sounds quite ridiculous and arrogant. We're humans, not any sort of infinitely powerful or knowledgeable being we may try to imagine. It is no more likely we know now or could know all there is to know than for "history" to somehow be over. We can count on change and being surprised, hopefully not too unpleasantly. Yet, I realize there are many among my friends and acquaintances dealing with difficult circumstances and horrified at the thought that what they are seeing and experiencing of the world is all there is, that there could be no better or other way to live. Good grief, who wouldn't feel appalled at the prospect of that being all, even if only in a particular period of personal struggle? Who wouldn't want some kind of overriding total contradiction of even a transient feeling of that kind, which a belief in or expectation of aliens would certainly provide? It's that or watch a production of "Death of a Salesman" by Arthur Miller, I suppose. Admittedly, I'd sooner watch Star Trek or Babylon 5. Or a test pattern, if there were no other option.

In the end the most careful, and I think fair characterization of the latest resurgence of aliens, worries about "unidentified flying objects," and the claims of mysterious drone swarms over a selection of northeastern united states cities may be Lambert Strether's at naked capitalism last year in his blog post on the supposed drone swarms: "it's clear we have a population under stress and in distress." He is speaking specifically of people in the united states, but the description is certainly applicable far beyond there, and in much of the world can only understate the case. (Top)

Misjudging Oratory (2025-10-13)

February 2016 photograph of an altus 110 loudspeaker by Jacek Halicki. Image courtesy of wikimedia commons under Creative Commons Attribution-Share Alike 4.0 International license.

Cognitive dissonance is more often unpleasant than a relief, but we can't help but run into it. Real life is too full of contradictions even if nobody is trying anything nefarious for it to be otherwise. Many of those contradictions then are of the sort that reveal we have misunderstood something about how some aspect of the physical world works, or how our social world works, to take just two very broad categories. European philosophers fell over themselves with excitement once they finally noticed contradictions indicated something useful and meaningful was going on, just have a look at the voluminous works of Georg Hegel and the many people besides philosophers who have taken up his ideas since. But you don't have to read philosophy or anything like that to end up puzzling over contradictions of course. Whenever we are provoked to wonder why, what, or how, chances are we have a contradiction provoking our attention. Among those that have long puzzled me because they can't be resolved once and for all, is the curious reputation of "oratory." To this day, at least in mainstream anglophone contexts, the definition of oratory remains Aristotle's, "the skill of persuasive speaking." This skill is both lauded and despised for the same quality, its ability to persuade. Besides that, it has always struck me as odd to define oratory primarily as this. Surely it is at least as much about the skill of memorable speaking? Except, the people whose professions are most visibly centred on oratory, politicians and lawyers, would much prefer we don't remember so well. After all, how can they change their positions and arguments as convenience demands if we remember what they already said with enough clarity to challenge them and resist their claims?

UPDATE 2025-01-11 - I have happened on a helpful definition of a sophistic argument from Stephen Jay Gould's book The Structure of Evolutionary Theory. On page 269 he comments when setting out what he means by a sophistic argument in a parenthetical statement, "...in the sense that any potential refutation could be so 'accommodated,' thus making the theory irrefutable, untestable, and therefore useless."

If we dig into Aristotle on this point and the ancient greeks more generally, it doesn't take long to end up tangled up in questions like what the term "sophist" actually meant, because many of the best known works applying the term are from athenian sources and use it primarily as a derogatory label. It is common now for politicians to attack the character of those who challenge them, especially those not involved in the same political and professional circles, effectively accusing them of being sophists, of being "persons who reason with clever but fallacious arguments" as the OED notes. So if you are a good speaker making arguments they don't like and you are not part of their club, then those politicians will attack your character and reputation instead of demonstrating if or how your argument is supposed to be wrong. But this does not suggest oratory is the issue as such, nor anything that may or may not be sophistry, whatever that means, so much as a corrosive form of competition. In a much earlier thoughtpiece I ended up concluding a major issue with politics as we currently know it is the way participants are perversely incentivized to seek higher office in order to maximize the means available to them and their patrons to steal. Under such conditions, politicians are hard pressed not to become active dangers to broader society, and skills like oratory are degraded to serve the ends of facilitating theft and exploitation. In that case, understandably what we get left with is oratory as little more than a shell game or other type of clever con. Such oratory attacks the person or otherwise appeals precisely to emotions and prejudices, not reason. So there really isn't persuasion happening at all in that case, just distraction. Well, as long as we agree persuasion is to do with our thinking and active choosing.

Okay, then annoyingly we need a better way to at least be warned to watch for something amiss when a person is speaking to us with a view to convincing us of something. Even if the person speaking is outwardly just trying to impart information, there is still a persuading aspect, since they hope to convince us that they are trustworthy. This turns my mind back to the idea of trying to redefine, or add to, the definition of oratory such that it is the skill of memorable speaking. A speaker who doesn't want you to remember what they said so much as how you felt while they were saying it is probably up to no good. Such a speaker then just has to invoke the right feelings, and later listeners are prone to attributing to that clever speaker what they wanted to hear, because this hooks into our ability to confabulate. Yes, "confabulate" sounds rather silly, but it is more useful than "lie" with all its connotations of deliberate falsehood. When we confabulate, we typically do so entirely unconsciously, filling in what seems most likely in the absence of real information. In fact, neurologists have found we absolutely must have this ability in order to function, but we have to watch out because it is so powerful. It is incredibly important for us to develop all the resistance we can to having others trigger our confabulation abilities. It seems to me this is a major driver behind the many forms of mnemonic devices developed in so many cultures over time, many of them taking advantage of our ability to combine our spatial memory with our ability to remember concepts. Such practices can help us catch ourselves when we are trying to come up with an explanation, or retell what somebody else said or did, without enough information to be sure we are doing so accurately.
To be clear, I don't think this was about dealing with dishonest other people so much as it was probably to help us survive in dangerous conditions such as natural disasters or coping with a carnivore trying to catch and eat us. Under such conditions we need to keep our heads and not act on absurdities that are only convincing in the moment because we are scared.

This clarifies for me at least one of the most puzzling things about the versions of Socrates we have received from antiquity. He was accused of being a sophist; he insisted he was not one, and not a teacher either, he just asked people questions about commonly held ideas. Yet, even in Plato's dialogues there comes a point where Socrates comes across not as an honest questioner but as a clever and manipulative one, which points again to the "sophist as insult" label. By all accounts, he sought clear definitions, especially of such important abstract nouns as justice and the good, yet seemed to have little to no interest in helping anyone learn how to resist manipulative speakers, including those who sought to manipulate the emotions rather than honestly engage with the listener's reason. (Nor in actually developing an explicit definition of any of those terms.) At least, this is the paradoxical image we have via the written word, to which he did not contribute. (Top)

Employees are Not Children (2025-10-06)

Image of an engineering department employee circa 1962 from the seattle municipal archives flickr. Who is this? Did she wish to be anonymous in this photograph?

The place where management fads go to die is the excessively priced training circuit, where a few people make a great deal of money flogging outdated and vastly inappropriate "advice" for people who imagine themselves leaders and are barely competent as managers. The same trainers and deluded aspirants to managerhood eagerly snap up popular "business" books, which alas, are basically slightly padded internet listicles. It is not without reason that people like Kevin Kelly have a recommended reading program of real material, demanding study and thought to understand and apply, that combined with trying to run a small (low overhead) business is far cheaper than the now infamous "MBA." One of the best (or worst, depending on your point of view) examples, lucrative for the trainer but pernicious for trainees who take it seriously, is the secularized christian "seven habits of highly effective people." What makes so much of this material so pernicious is the bits of genuinely good stuff, a bit like the swig of water to help a nastier pill go down, except you are not supposed to notice there is a pill at all. There are all manner of difficulties I could pick on here, including the bait and switch whereby supposedly "public government" is evil authoritarianism, but as long as the workplace is an authoritarian dictatorship in fact and literal experience, that is acceptable. In fact, I highly recommend reading Elizabeth Anderson's Private Government: How Employers Rule Our Lives (and Why We Don't Talk About It), published in 2017 by princeton university press. It is of course, extremely united statesian, but this makes it all the more relevant because the "best" businesspeople in the anglosphere at least seem to think the way things are done in the united states is the only way to do anything. But for this thoughtpiece, the focus is on the polite version of how such managerial training defines and frames "employees," which is literally as children.
To be sure, the less polite version bluntly treats employees as self-moving tools of a badly programmed sort, demanding constant micromanagement to keep them efficient and on task, hence the constant claims that "artificial intelligence" will cause mass and permanent unemployment.

In general no, employees are not children, either literally or figuratively speaking. The temptations for the people hoping to exploit workers to infantilize them are not difficult to identify. If only the workers could be persuaded to serve employers the way children supposedly follow the dictates of parents, and furthermore could be cowed into doing so for fear of punishment, "labour relations" would certainly be easier. I can't help but wonder how many people with this mindset have ever actually taught or taken serious part in raising children, because it doesn't take too much effort to learn that parents are by no means the absolute authority figures this mirage of control suggests. We have all too much tragic evidence that abusing children does not and cannot make them totally obedient or supposedly perfect future adults. We also already know from historical evidence that slavery is a prelude to social collapse after a period of apparent hyperprofits. It doesn't take long before people see through illusions of being free and want the real thing, or, in the context of capitalists seeking to exploit workers to the edge of death to maximize profits, before the employees see that exploitation for what it is as well. However, everybody is in agreement that overt violence is not a good strategy. Better to stick to infantilizing strategies intended to reap the same hyperprofits without the bad press and pricks to the conscience that overt violence exacts.

While as an individual I certainly can't claim to have conducted a personal, scientific study of how employees respond to managers who strive to infantilize them, there is a great deal of evidence already available consistent with what I have observed. There is always a range of people in any workplace, and they take a variety of approaches to their work. Depending on the nature of the work and how they experience the workplace, they may do strictly what is asked of them and no more, they may do more than that for a whole range of possible reasons (e.g. boredom, ambition, getting bonuses, competing with somebody else), or they may cynically cultivate appearances and connections while deliberately doing as little as possible. Among the variables to consider are level of experience, and whether the worker is seeking technical excellence, to manage projects, to manage people, or to lead people to complete projects. I should note what "managing" and "leading" mean here. If a person is managing, they are focussed on telling people what to do and making them do it. They don't and won't trust the workers, because a manager is sure the workers are always trying to get away with something, usually some form of theft. These are the people prone to "managing" a project or team into complete deadlock or failure, because they can't let the actual workers work. If a person is leading, they are not striving to apply a one size fits all, everyone is trying to rip me off rule. Instead, they are seeking to help the employees work as a team, trusting them to get the work done to the proper standard and intervening to remove barriers to getting the work done well, including arranging additional training for those who need it, or a clear consequence for those hindering their colleagues, to help them get back on track. "Trust" doesn't mean blind trust.
It does mean treating employees like adults, and not resorting to such self-defeating tactics as encouraging people to compete within and between work groups. It is all too easy for competition to supercharge perverse incentives. As with most effective drugs, the therapeutic dose is all too close to the deadly one.

I do understand why leadership is so much more honoured in the breach than anything else. It demands respect, empathy, and trust in others, all of which demand a commitment of time and a willingness to experiment and test. It demands accepting and acting on the reality that people are individuals who typically have stronger commitments to each other as colleagues than they will ever have to a manager who neither understands nor respects them. But this is all at odds with any form of exploitation, let alone capitalism, where the perverse incentive is to treat all workers as the same so they may be easily replaced. The idea, as the late Ken Robinson stated so bluntly, is to convince us that based on our year of birth, we are a product of that year, to be shaped through job training into the sort of employee demanded at the moment the training is done. This is lunatic and we already know it doesn't work; nevertheless, the most prominent pundits among the "business class" insist on extending the factory model of education further and further into adult life, including insisting that anybody who isn't a "boss" is effectively a child. (Top)

Main Character Syndrome is a Philosophical Problem? (2025-09-29)

First frameset from the article The Significance of Plot Without Conflict, by the still eating oranges art collective, 15 june 2021.

I wonder if there isn't a metric for when a particular pop culture notion has arrived, along the lines of "when a philosopher trying to write for the general public writes about this topic," because that would actually be an interesting, even if trailing, measure. Sort of like how dictionaries with definitions developed from word use in printed works tend to be at least a century or so behind day to day speech. (Perhaps rather less than a century now that computers are ubiquitous, but I doubt that.) In any case, a while back I happened upon philosopher Anna Gotlib's article published on aeon.co, Main Character Syndrome. The editors at aeon.co did this article absolutely no favours at all by imposing the title they did, then adding an image suggesting that the primary people with "main character syndrome" are young, white women able to keep up with the latest styles. But then again, the name of this supposed syndrome is misleading, and Gotlib seems reluctant to acknowledge the issue, which I don't think is merely "harmful solipsistic narratives" but a broader issue of anglophone cultures, which are hyperindividualistic and firmly committed to glorifying greed and the violence that feeds that greed. The people who suffer most from the mindset of finding it difficult to acknowledge or respect the separate existence and needs of others are not teenaged white girls – who like most teenagers will grow out of their inevitable teenage self-centredness, which is far from incapable of empathy or acknowledging the existence of others – but men who have managed to either inherit riches or steal riches by some form of violence, whether that violence be direct or indirect. Then again, Gotlib also assumes and writes as if "we" are all victims of "main character syndrome," whether actively or unconsciously, courtesy of too much time online or imbibing mainstream media.

On one hand, I cannot and do not deny Gotlib's concerns and critiques of mainstream and so-called "social media." There are serious issues with the narratives both these forms of "media" propagate, from the way they encourage regular viewers and participants to have ridiculous expectations of reality to the ongoing efforts of their purveyors to make them psychologically addictive. Those are definitely real concerns. Yes, the people who can be described as suffering from "main character syndrome" are "seeking some kind of love, or approval, or reassurance that they matter. They are looking for a feeling, a vibe." But I am not convinced this is such a widespread issue. I think that so-called "elites" and the professional managerial class are certainly suffering from their alienation and the sense of insecurity deriving from that alienation. Their alienation and unease is no mere product of too much time online or watching bad marvel movies, however. In this I am more inclined to agree with the existentialists that they are suffering from "bad faith": they have adopted false values because supposedly they will suffer if they behave otherwise. They are committed to self-deception, to rationalisation of bad, unethical behaviour. How else could we do wrong in the world, and keep doing wrong, if we did not have some story to use to pretend our behaviour was justified?

It seems to me this is what so-called "main character syndrome" actually is. It is a different label for rationalisation of bad behaviour, a form of rationalisation now so crude and ridiculous that it is undeniable, and even people outside the professional managerial class, let alone the elite, may be considered to indulge in it. Ah, but there's the rub. What if the – watch out, I am going to use a verboten word – proletariat are behaving in a manner that treats their supposed "betters" as no more and no less than themselves? Would that not make them seem like "sufferers of main character syndrome," simply for not treating their "betters" with the deference those "betters" feel entitled to? Yes, we can't all be behaving as if we are the only people in all the world who count for anything. But the sad and blunt truth of it is, there is no acceptable number of people who can behave that way outside of the brief moment of adolescence we must all suffer through, and hopefully we suffer through it in a way that helps us complete our growth into full adulthood, including empathy and respect for others.

Perhaps in the end I would not have had any quibbles with Gotlib in her essay, except for the telltale and unfortunate invocation of popular culture as a source of the growing issue of "main character syndrome" or having "main character energy." She is certainly not claiming to make any sort of clinical diagnosis, and clearly she hopes to encourage resistance to narcissism and a move to empathy and constructive social interconnection. Then again, a person could argue that by invoking popular culture and "social media" as she has, Gotlib has in fact revealed a non-inclusive "we" and "us," which is an intellectually honest thing to do. After all, aeon.co is not claiming to be a mass market publication or anything; it says on its about page, "Aeon's mission is to explore and communicate knowledge that helps us make sense of ourselves and the world. We ask the big, existentially significant questions and find the freshest, most original answers, provided by leading thinkers on philosophy, science, psychology, society and culture." Their contributors tend to be academics and professional writers oriented to the more middle market and higher sort of magazine, even if aeon media group ltd is a registered 501(c)(3) charity in the united states. That acknowledged, aeon.co does provide free access to its articles and eschews advertising. (Top)

"It's Never Censorship Until It's You" (2025-09-22)

Ham operator card used to confirm broadcast signal reception, 1940. Public domain united kingdom government image, via wikimedia commons.

Believe me, I don't like it. I don't like it at all. But I have been forced to come to the conclusion that "censorship" is a meaningless political slur. I have been forced to conclude this because somehow, somehow, it is never censorship until it's "you," for so many people, including people whom I respect as writers, scholars, or commentators. The word is at least as badly mauled as the notion of "freedom of speech." This is obviously terrible, because we badly need to talk about what is often labelled as questions of censorship and free speech. More and more, I find myself using Betty McLellan's excellent explanation and framing of the issue in terms of fair speech, conceptually and partly analogous to fair trade, and intended to allow us to actually think through what speech and speaking mean in social and political context. (Please note that I am not getting into the specifics of "fair speech" and its meaning here, so yes, when you get to the later parts of this thoughtpiece there will be parts not fully described.) Right now what we are reading and hearing in the mainstream media, and much of what isn't, is the invocation of buzzwords and the use of rhetorical cudgels. This is hardly new, and to be sure we have little chance of escaping such dishonest use of language in the real world. The difficulty is that at this stage, practically nobody seems to be taking what they say seriously, let alone what anybody else says, and this is adding to the problems we are facing. When people don't take what they or others say seriously, they begin to treat the whole of life as a game in which the only point is to win. This guides them into extremely short term, solipsistic thought and action. Whether we can resist or counter such guidance is a bit chancy right now. Scholar and author Glenn Diesen wrote eloquently about the danger and dishonesty of today's "cancel culture" that has metastasized from the anglosphere to many connected countries and cultures.
Concerning censorship, he notes in his 5 october 2024 essay, American Censorship Intensifies in the Information War:

The first stage towards normalising censorship and cancellation is to set a precedent with a seemingly minor and justified case, which can be framed as protecting the public rather than being an act of oppression. The initial censorship must be supported by reasonable moral or security concerns to achieve consent from the public, and the target should be a despised fringe actor. In the beginning, the government will not involve itself directly and limits itself to cautiously expressing understanding for the censorship and cancellation. Even the word censorship is avoided and replaced with "content moderation" and "de-platforming". Vague concepts such as "hate speech" and "propaganda" are used to justify censorship as they cannot be clearly defined. The vagueness of these concepts allows for incrementally expanding the range of speech that is criminalised and to apply censorship selectively. Support from the intelligence agencies and media is imperative to convince the public that free speech is a privilege and not a right. Gradually, censorship is normalised among the public and the need to justify it goes away.

Diesen's description matches the most recent iteration of the process as now applied to mainstream scholars who expected to be immune to such attacks on their ability to say and publish what they wanted in mainstream outlets. Funny how it never counted as such until it was them.

UPDATE 2026-02-08 - I have happened on the now mostly dormant blog The Rancid Honeytrap, a blog with an aggressively unpleasant title to be sure, but the writer was generally a thoughtful contributor who made a point of showing his work. That is, he sought to set out not only his analyses of what others say and do, but his own reasoning, including teasing apart temptations to rationalize from real examination, and documenting how his thoughts changed and when he developed and set out a clear description of his positions. As usual, skip the comments; alas, they are inevitably mostly trash, as the site is wordpress-hosted and the blogger, who went by the moniker Tarzie, understandably did not have time to keep all the trolls at bay. The particular post I found excellent to think with and relevant to this piece is White Supremacy and Magic Paper (alternate source) from early 2015. For non-united statesians, it provides an excellent historical overview and analysis of so-called "free speech absolutism."

This process of first normalizing censorship and cancellation by invoking a supposed social or security risk, then turning to near or total cancellation started a long time ago in the so-called "liberal west." I think there is a strong case to be made that it began with the constant effort to prevent women from taking part in public life in any way, be it as artists, politicians, clergy, or simply workers outside the home. Women having an opinion let alone expressing it in any manner has been repeatedly denounced as a danger to whatever society at least since violent colonizing cultures began extending their terrible reach beyond their original wartorn and denuded homelands. (I am not a big fan of this description, actually. I think the scary root issue in colonialism is greed sickness, better known by the ancient greek term pleonexia, and how it became the central driver of colonialism, which is not a culture but a horrible expression of greed sickness. Worst of all, greed sickness is in many ways a chosen illness, one people are persuaded to absorb into themselves and express.) I can well imagine others might prefer to argue this goes all the way back to trying to prevent slaves more generally from acting in ways demonstrating they are in fact reasoning humans and able to contribute to society by more than brute labour, because they have an itchy "what about the menz?" type response. I don't.

But this larger case is not the one that led me to reluctantly conclude that accusations of "censorship" are no more than political, and worse yet, dishonest political accusations. No. Far from it. I did not want to accept such an idea. However, I was forcibly disabused of my belief in any other perspective on it by my recent forays into academia-adjacent work, and then by observing the subsequent expansion of the phenomena I observed there into the work I do now. In the 1980s and 1990s, I saw and read of Feminist authors and scholars being driven out of academia and publishing. The well-known examples include Andrea Dworkin and Somer Brodribb. There is always a group who manages to survive the purge, usually by finding a good niche in an other than ivy league or pretence to ivy league institution for academia, and/or a small or fringe press with a steady output and hard won distribution links. But I had already learned as an undergraduate of slender means that to really read and learn about topics that interested me, I was pretty much on my own. The best writers and scholars were already driven out of readily accessible outlets and spaces, and if they couldn't be for being, say, Feminists, or pro-Palestinian, or communists, then the replacement accusations were ready to hand: racism, classism, antisemitism. These became the broadsides turned against active Feminist scholars who were too visible, often deployed in sad displays of attacking the person to get ahead. Those attacks came in mighty handy because they gave cover to the elite men who really wanted to shut those uppity women, and any other woman they perceived as uppity, up. But somehow, that didn't and doesn't count as censorship or cancellation.

Over a decade ago, I learned thoroughly that anybody who wanted to get by in academia, including getting the all-important faculty support to get published and then into real tenure-track jobs instead of the sessional grind, had to say and write only the right things. Those things already included the original "conversion therapy" anti-homosexual narratives, alongside the anti-Feminist, anti-woman ones that always look shiny and new because they keep getting refurbished for reuse by the hangers on in what Barbara and John Ehrenreich so importantly and accurately defined and described as the professional managerial class. But somehow, that didn't and doesn't count as censorship or cancellation. We know what incitement to violence against a group or class of persons is; it is incredibly non-ambiguous. We know, and at times it has been actively prosecuted. We know repeated depiction of specific groups or classes of persons as subhuman acts as a normalization and incitement of violence against them. We know the parallels between pornography and the sorts of grotesque pictures and editorials in racist literature are of this sort. We know that actually, we can responsibly and reasonably block publication and propagation of such trash. Somehow, only that ever really counts as censorship and cancellation.

Oh no. It is never censorship it seems, until finally the censorship and cancellation touches a white man. Would that more people, more consistently, understood and acted on the point of Martin Niemöller's famous poem, even if whether he wrote those sentiments as a poem is contested by some, and even if people go back and forth on whether he was trying to rehabilitate himself from initial flirtation with nazism and being a typically antisemitic protestant. Yes, I know, I know, protestantism is not necessarily antisemitic but there are a tragic number of protestant sects claiming to be the real "chosen people" who therefore develop special animus against adherents of judaism and islam. For completeness, here is a version of the famous poem, from the My Poetic Side website:

First they came for the Jews
and I did not speak out
because I was not a Jew.

Then they came for the Communists
and I did not speak out
because I was not a Communist.

Then they came for the trade unionists
and I did not speak out
because I was not a trade unionist.

Then they came for me
and there was no one left
to speak for me.

Obviously censorship, as in dishonestly preventing people from fairly presenting their ideas and opinions and engaging in debate about those opinions, is not the same as being arrested and sent to a concentration camp. Of course it isn't. My point is, when you stand by and let others you don't like, who are engaged in fair speech, be silenced without a murmur, then it is less than impressive or honest to complain when you are silenced yourself. (Top)

Speculative Fiction, Science Fiction, and Women (2025-09-15)

Sanjay Ancharya's 2017 photo of a magellanic penguin in san francisco zoo. Image courtesy of wikimedia commons, under Creative Commons Attribution-Share Alike 4.0 International license. Another aptly incongruous result from a search string not appearing on the image's page.

I must confess to not having followed the whole "speculative fiction" versus "science fiction" and "fantasy fiction" label brouhaha, which it seems to me both Margaret Atwood and Ursula K. Le Guin commented on. Perhaps it is more Atwood I am remembering, because it seemed to surprise media types when she wrote novels that couldn't be neatly shoehorned into the book marketing bucket labelled "literary" nor the one labelled "popular." To this day there are certainly people who continue to have issues with the term speculative fiction, but I do rather like it, as it is less beholden to marketing and seems to capture something about the stories it describes. It allows people to acknowledge openly that a great deal of both "science fiction" and "fantasy" is no more than obsessively retold colonization fantasies with their own special costumes and scene dressing. This does not mean they are necessarily "bad," unpleasant to read, or even (gasp) "simplistic." It does mean that when authors try to do something different while seeming to produce a story that fits the usual mainstream anglophone versions of "science fiction" and "fantasy," the reception is often at best bewildered, and more often quite hostile. I can see authors looking to speculative fiction as a label that helps win their work new space and perhaps forewarns the reader, thereby helping give the story a chance. But that's not the point of the label in the end, and not the way it works. Authors may choose to explore all manner of questions if they are not inclined to write according to an accepted and rote marketed formula, and at its best speculative fiction is chock full of such thought experiments. It also seems to me that the label "speculative fiction" has made it easier for established authors to experiment, and for authors in other languages to see their work translated for the alas stubbornly parochial anglophone market.
That said, I also understand the authors and readers alike who are annoyed by the apparent use of the term "speculative fiction" as a means to make once rather disreputable genres acceptable. There are some resonances between publishers branding books as "speculative fiction" and their hasty arrangement of alternate editions of the Harry Potter books with covers more like those of "adult fantasy novels," for adults who would otherwise feel sheepish reading them in public. Still, I don't think this is where the speculative fiction label really started.

How I ended up tumbling down this book-related rabbit hole was running into a note reminding me to look up Naomi Mitchison, who was a contemporary of J.R.R. Tolkien. Tolkien is so often discussed as if he was disconnected from anybody but anglo-catholic men and without acknowledging that he wrote some wildly regressive stuff in the very same era as such (in)famous authors as members of the bloomsbury set. Part of the trouble with decontextualizing him and his work in this way is how it also dissociates him from his wider academic and writing community, which included Mitchison. On trying to find out more about her than the rather pallid reference to her as one of the supposedly very few women Tolkien had more than passing acquaintance with in his life – this is obviously silly, he didn't live in an isolated vicarage or something – it turned out I had found a bit of a thread into a far bigger and more intriguing picture. I happened first of all upon Lizbeth Miles' 2016 Uncanny essay, Quest for an SF/F Grandmother, which discusses, in part, Mitchison's science fiction and fantasy novels, all published after the age of sixty-five. It turns out these novels are part of an incredible outpouring of such speculative fiction written by women in english published between roughly 1950 and 1990. Mitchison wrote much more than these novels, but it seems not many people have been locating and acquainting themselves with her wider body of work, although there is a bibliography produced and kept online by beccon publications (no longer publishing) and the science fiction foundation.

While chasing down more details about Mitchison's novel Solution Three, I happened on a far more recent essay by Dennis Wilson Wise about sci-fi/fantasy publisher extraordinaire Judy-Lynn Del Rey. It is a great essay, although afflicted with stupid "for the web" pseudo-paragraph formatting. How many people realize the key role played by Del Rey, among many other women, in establishing publishing outlets for speculative fiction and fundamentally shaping and maintaining the market? It strikes me as far from coincidental that Del Rey Books suffered so much insult for supposedly publishing primarily poor quality formula fiction, a sneer practically never turned on other major publishers in the genre. She was finding the difficult path between getting books into print and making enough money to pay the bills without being forced into merger with a corporation seeking to build a monopoly in the business. As Wise observes, Del Rey brought an amazing number of now revered authors and "classic" books into print. It is not a given that anyone, let alone any woman, who is able to find a place in publishing or writing for publication of speculative fiction will end up helping build a remarkable efflorescence of stories. Yet when it comes to speculative fiction, women seem to have more factors working in their favour in terms of recognizing and pursuing other than established lines.

By more factors, of course I do not mean anything along the lines of biological determinism. After all, many of those factors are shared with racialized authors, both women and men, whose work makes up an extraordinary proportion of the now widely recognized "classic canon" of speculative fiction. Being not quite in the mainstream of anglophone culture or outright rejected from it, such authors and publishers have more opportunity, wanted or not, to think their way out of the usual ways of doing things. This doesn't seem to result in producing ridiculous épater les bourgeois schlock, except perhaps as juvenilia we never see. As is too typical in such conditions, they are also forced to write far better quality stuff, work far harder, and have far more by way of qualifications to get published or into publishing in the first place. It is no coincidence that the pioneer developers and founders of fan fiction, and subsequently of fan fiction zines, then archives and websites, where many of the stories are riffs on speculative fiction franchises, are predominantly women. Had they not done this work, there would be next to no speculative fiction featuring three-dimensional women characters, let alone racialized people in general, in those same franchises at all. (Top)

Not a Winning Strategy (2025-09-08)

Delabelled version of Jenakarthik's july 2009 diagram of academic dress for a master's degree holder in singapore. Image with original labels accessed via wikimedia commons and used under Creative Commons Attribution-Share Alike 3.0 Unported license.

In preparation for this thoughtpiece, I went back to some of the bibliographies of books and papers accrued during my recent foray back into university to complete graduate studies in history. My vague impression that controversies over post-secondary education were something new, at least in the european-invaded americas, was promptly debunked. Since such institutions are typically positioned as producers of a labour aristocracy, producers of a professional managerial class, or simply trainers of the offspring of the most successful gangsters, they are invariably fiercely fought over. Who will be allowed to attend, who will be allowed to teach, how they will be allowed to teach or learn, and what instructors will be allowed to teach all have clashes associated with them. Sometimes those clashes are literal and physical, with fighting in the streets or ruthless slaughter of faculty for political purposes. Other times the fighting is primarily via legislation and manipulation of those holding degree-granting powers, who may be the president of the institution and so on. Post-secondary institutions differ in origin as well as focus, so we can find examples that began as religious seminaries, free publicly founded entities, and overdeveloped private and secular schools. In other words, they tend to have serious problems with conflicts of interest, even those originally founded as publicly owned and financed schools and colleges. After all, everybody wants to have a piece of presumably malleable young people, who often have no or at best weak commitment to the status quo. This makes them both promising and dangerous. Hence the constant attempts to create and impose the myth of the ivory tower, which doesn't convince anybody but is good for generating insults, and the way the treatment of their students is a genuine bellwether of social change.

I am in canada, and by chance have attended two universities founded in the early 1960s, plus spent some time at two others founded at the turn of the twentieth century. The experience at the older universities made me wonder why they were in the centres of the cities they are sited in, while the later two are as far from downtown in their cities as possible. The newer two are far enough from downtown, and served by the curiously constricted street systems so typical of suburbs, that bad weather can force an effective campus closure because most students live off campus and literally can't get there. On trying to figure this out, I learned this siting was meant to make it as difficult as possible for university students to effectively protest. This is partly an immediate physical access issue: the students would have to round up transport to wherever they intend to protest unless the protest is right on campus. As the present insane real estate bubble reveals, the unaffordable campus residences, insufficient for the whole campus population anyway, make university and college students milch cows for local real estate speculators. So today, between ever-increasing tuition and ever-increasing rents, post-secondary students are often working multiple jobs while going to school, because unless they know people, landing a decent-paying job on a high school diploma alone is not feasible. Meanwhile, the current trend is for the MBA-poisoned set to demand post-secondary schools be reduced wholly to job training entities, for the jobs and industries the MBAs deem relevant. I doubt this strategy has ever worked out in the long term, but it is in the nature of MBA training to fixate on short term gain and let the devil take the hindmost.

The provincial and federal governments constantly whittle away the public funds set aside for all education, let alone post-secondary institutions, but what particularly caught my eye in the articles about the most recent funding cuts is how the lobbyists who are supposed to be working for those institutions are doubling down on a failed strategy. To make up for the loss of operating funding, and now at the point that raising tuition for domestic students is reducing their enrolment, universities especially have long focussed on increasing international student enrolment instead. It seems the university administrations are doing this on the assumption that such students will always come from wealthy families, and/or they will have major funding from their home countries. Their tuition is breathtakingly high, and even though officially this is not policy, if international students can stick it out, maintain a solid grade point average, and land a job in canada, this is a route to immigration as well. It's a hell of a gamble for them and difficult to stick the landing, so there is no question about the commitment and determination of international students. But the cruel fact is, they are being ruthlessly exploited, whether they are in "STEM" fields, the humanities, or the fine arts. Rather than pushing back against this perverse system of incentives, what does universities canada, the lobby group for many of the medium and all of the large universities in canada, have to say through its president Gabriel Miller? "It's really costing canada the people we are gonna need to be doctors, to be engineers, to be entrepreneurs..." It is difficult to respond civilly to this statement. I'm going to do my best.

"Canada" should absolutely not be brain draining other countries. There are always people seeking to immigrate for positive reasons, and sadly too many people who must immigrate or become refugees outright. Those are facts, and it would be best of all for canada to stop supporting or pursuing policies that push people into leaving their homelands for the negative reasons. It is far from a loss to canada if international students come to canada for training they are unable to take closer to home, and then go back to take up jobs in those fields at home where they are needed. Indeed, I think that would be an important positive role for canada to take in the world instead. But what Miller is referring to is a practice of depressing wages of university-credentialed people in targeted fields to the point that international students are preferred because it is presumed they will be able to subsidize themselves somehow in those jobs. It is disingenuous to pretend canadian students don't want to enter STEM fields or medicine, when they do and even complete the programs, only to find they are unable to find longterm work. If they are fortunate and willing, they may end up landing a job overseas or at least south of the border in the united states.

Oh, I understand. I have heard and read the nonsense stories about canada's "falling birthrate" and how the population will be too small and old to supply all those anticipated doctors, engineers, and entrepreneurs. This makes sense only in a country where ageism is so rampant in professional fields that anyone older than fifty has problems staying in appropriate work where they can help train understudies and pass on corporate memory. Ooops, this is exactly the rampant issue in canada, where I can tell you from direct experience managers are all about cutting experienced headcount because "we can all be replaced" and they think anyone besides themselves with higher wages is deadweight who should be gotten rid of. When the organization subsequently goes into a tailspin, everyone is surprised and, go figure, no one is to blame, except of course for the younger staff, ill-experienced and ill-trained through no fault of their own, who may work eighty hour weeks and still be accused of being lazy. Oh, you know what else is contributing to the perception of a reduced pool of possible hires? Racism! Yes indeedy, good old racism, especially anti-Indigenous racism, when Indigenous peoples living in what is currently called canada have both the highest birth rates and the fastest growing rates of post-secondary education, including university degrees. All those young people are training up to go into trades, into medicine, STEM, and entrepreneurship. They are ready to work in far more than what is developing into an attempt to make Indigenous people into their own jailers via not-so-subtle pressures to go into social work, paramilitary, and military jobs instead. Oh, and you know who else is having a hell of a time trying to get into other than shitpaid jobs that shouldn't be shitpaid in the first place? Women! 
Yes folks, sexism is playing its own special role by trashing women for having babies and for not having them, for entering the waged workforce and for not entering it, cutting off the nose of the economy and society at large to please an imagined racist, masculinist face. It would be funny if it wasn't so stupid and inevitably short-sighted. But of course it is short-sighted.

And I am not even getting into the insulting and ridiculously low pay typical of the "service jobs" which universities and other post-secondary institutions play up and encourage in order to increase enrolment. "Service jobs" are emphatically not low-skilled or unskilled. I have close friends and relatives who have worked in a range of them, including the ever-present restaurant sector, though there are not as many restaurants as there used to be, even in places where the customer base is expected to be tourists or workers brought in for temporary large projects who have per diems. Front of house or back of house, the hours are long, the work is hard on the body, and it demands an incredible amount of management of the people who come to the restaurants to eat. Those can be very difficult people to handle, all too often impatient and overtired, if not entitled and rude. By the way, it is a tell that teaching jobs are more and more often generally if not literally as poorly paid as service jobs requiring no formal training. Teaching students at every level, now including post-secondary, is quite explicitly referred to as a type of service: in elementary schools, to parents as babysitting; in secondary schools, to parents as babysitting and to exploitative employers as cheap job training. Now post-secondary institutions are being positioned as service providers to corporations, which, practically speaking, is a recipe for the ultimate destruction of post-secondary institutions. We are already seeing this, as the older and larger examples with control over if not outright ownership of real estate, or with a proliferating collection of corporate funded professional schools and research centres, make themselves more and more inhospitable to students and instructors alike. (Top)

Giving Up on the Web is Both Childish and Cowardly (2025-09-01)

August 2005 photograph by Coolcaesar of the NeXT workstation used as the first web server by Tim Berners-Lee, via wikimedia commons, under Creative Commons Attribution-Share Alike 3.0 Unported license.

I wrote a series of thoughtpieces on what is wrong with the web and what is right with it between late 2015 and early 2016. My general conclusion was that the web is not irredeemable, and the elements that made it useful and accessible are still present. This is so despite the mighty efforts of such ever-worsening players as google and microsoft interfering with web standards, manipulating the web browser landscape, and misinforming people about how to make a website and how to post it. To this day, many of the people crying about the supposed death of the web and how there is nothing but crap on it are a narrow subset of people who use the internet or the web. From my definitely unscientific sample, they may be old hands or newcomers to the web, more or less familiar with computers, and definitely caught up in the hellscape of so-called "social media." Some have rediscovered static websites, and are a bit less inclined to try to proselytize for alternatives like the gemini protocol. The gemini protocol has a lot going for it, including development of its own niche. However, it gives up a lot of what makes html hypertext such a wonderful medium, even if the html page author deliberately sticks to text and images, with no javascript or extended css coding. It looks like gemini (and gopher) may be the best candidates for people who would like to run their own small server to host their website on, but want to stay within the usage limits of their internet service provider and hedge their bets about possible security issues. This cannot replace the html-document based web, especially the uncentralized version. If you are working on webpages or a website, using a "wysiwyg"-type editor, and getting "free" hosting from a company, chances are your pages or site are generated on the fly by a content management system. In other words, your pages are stored as fields in a database, then spat out into templated pages when somebody clicks on a link. 
I hope you have a backup and are able to snapshot the composed pages in case the website shuts down or you need to move, including changing content management systems.
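For anyone wondering what such a snapshot might look like in practice, here is a minimal sketch in Python using only the standard library. The URLs and output directory are hypothetical placeholders, not anything from a real host; the point is simply that saving the rendered html pages is a small, doable job.

```python
# Sketch: save static copies of CMS-rendered pages, so the composed html
# survives even if the host or content management system goes away.
import os
import re
from urllib.request import urlopen

def url_to_filename(url: str) -> str:
    """Turn a page URL into a safe local filename."""
    name = re.sub(r"^https?://", "", url)
    name = re.sub(r"[^A-Za-z0-9._-]+", "_", name).strip("_")
    return (name or "index") + ".html"

def snapshot(urls, outdir="site-snapshot"):
    """Fetch each rendered page and write it to a local directory."""
    os.makedirs(outdir, exist_ok=True)
    for url in urls:
        with urlopen(url) as response:
            html = response.read()
        with open(os.path.join(outdir, url_to_filename(url)), "wb") as f:
            f.write(html)

# Usage (hypothetical addresses):
# snapshot(["https://example.com/", "https://example.com/about"])
```

Saving the rendered output this way also doubles as a migration aid: the static copies can be re-posted anywhere that serves plain html, with no database required.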

Now, in a sensible world, internet service providers having pulled the rug out from under anyone who had gotten used to hosting their website on space that came with their internet service, everybody would be able to use a local host server, and those would be minimal or no cost. As in, I think email and web hosting up to a maximum capacity should really be part of what we get from a national postal service. Business websites and anything involving actual sales should be professionally hosted instead, the former for improved visibility and because businesses should pay their own way, the latter because I don't think the post office should be responsible for sales transactions outside of its own. (Postal banking is a separate question which would nuance this aspect.) Since we have to live in the world we've got, not the world we wish we had, we need to marshal our resources to handle webhosting differently. I think there is definitely space for community-provided webhosting, should community members be willing and able to organize and make it happen, and in truth we really need this. If hosting and other internet services are basically all centralized, we are then inevitably at the mercy of those hosts and providers spying on us, stealing from us, or pulling the plug. They may complain about and try to block decentralization, but they will be less able to pretend to justify bad behaviour if they are not the only sheriff in town. Those of us able to contemplate building a website must also resist and refuse to use "free hosting," in part to help force down prices for paid hosting. We already know the enshittification and abuse cycle of "free hosting/email/messaging" on the internet.

But probably this is just repeating old points more than anything else. I appreciate many people may feel like they need to abandon the web due to the current "AI" hype, and the way lazy crooks are filling up whole websites and pages with computer-generated nonsense to an even higher level than before. Google has enshittified itself into a death spiral, and it takes more effort than it should to track down a way to order products online without somehow getting routed through the monstrosity Jeff Bezos uses to wring tribute out of every business so unwise as to engage with it. However, there are indeed search engines beyond google or bing, and not only are there good and interesting ones like mojeek or wiby, there are metasearch engines too. After a bit of practice, it gets much easier to skip over machine-generated garbage, and I can say truly that once you move away from the google-bing near duopoly, much less of it makes it into the search results anyway. We can all get back into the habit of saving bookmarks, and saving links to important archive and source sites, from the internet archive to our local public library, or indeed, city hall's website. We need to curate these ourselves, on our own computers, in a document not dependent upon a "service" or even a web browser to read. This is quite doable even for those most disinclined to do anything like make a text document and copy and paste in the links by hand. Even the most hostile versions of google chrome have an "export bookmarks" or "export favourites" option, which generates an html page of links you can save wherever you like. (Do try this – the code that generates the output is often a barely updated version of the composer mode in the old Netscape browser suite!)
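Since the exported bookmarks file really is just an html page of links (the venerable Netscape bookmark format), getting the links back out into a plain text list takes only the standard library. The sketch below is one possible way to do it, assuming a typical export; it is illustrative, not tied to any particular browser's exact output.

```python
# Sketch: extract the links from a browser's exported bookmarks file,
# which is an html page full of <A HREF="..."> entries.
from html.parser import HTMLParser

class BookmarkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []       # collected (url, title) pairs
        self._href = None     # href of the <a> tag currently open, if any
        self._title = []      # text fragments seen inside that tag

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._title = []

    def handle_data(self, data):
        if self._href is not None:
            self._title.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._title).strip()))
            self._href = None

def extract_links(bookmarks_html: str):
    """Return a list of (url, title) pairs from exported bookmarks html."""
    parser = BookmarkExtractor()
    parser.feed(bookmarks_html)
    return parser.links

# Usage with a fragment of the exported format:
sample = '<DT><A HREF="https://example.com/">Example</A>'
print(extract_links(sample))  # → [('https://example.com/', 'Example')]
```

From the returned pairs it is a one-liner to write out a plain text file of curated links, readable with no browser or "service" in sight.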

I suspect that what many of us experience as "the web," when it is not some sort of corporate or non-profit website or paid offsite hosting, will wind up being much more local. It will be something based on a mesh net, and so some websites will have hours of availability. It may be that some websites with limited availability become important enough for people to ask how they can help support the site to make it more available in terms of time and perhaps reach. Things can take off from there, or not as the case may be. The state of the internet the corporations are striving for is often compared to cable television. This is all too true, yet my point is that the concept of networking computers together to more easily share information, and to do so specifically via https and the trusty html webpage, is not something only the corporations or governments can do. If we basically flounce off in a snit because we can't recreate the pre-commercial web for the whole world straightaway, then we are allowing ourselves to be corralled, when this is only possible if we allow it. In the present hyper-individualistic zeitgeist, it can be easy to feel like there is nothing for it but to give up. Well, it may be convenient for centralization and profiteering for us to believe that, but community development built on meshnets and other at least relatively decentralized options is still available. We need to get out and use them. (Top)

So Close, So Far (2025-08-25)

A sample stone paper notebook, image quoted from the company's website 22 september 2024. Credit where credit is due regarding their subtle marketing push.

Like most writers, no matter how much time I spend typing on some computer keyboard or another, the majority of my notetaking and early drafting goes on in handwritten notebooks. Therefore I am also prone to picking up unusual examples of these wonderful tools for thinking with, which need neither batteries nor a power cord. From there it is only a short step to becoming a connoisseur of the specific pens and pencils best suited for drafting. I did use to think the fuss about writing utensils was a bit of silly fun, but have since learned to take it a bit more seriously due to a combination of repetitive stress injury avoidance and preserving records and resources. Like many of us, I am all too aware of the endless and dishonest push by many advertising corporations claiming to be technology companies, and their publisher followers whose hatred of hard copy books seems to know no bounds. They claim this is about how wasteful and environmentally destructive it is to make "dead tree books" when what they are really panting after is forcing us to constantly pay them rent to write and to read. They especially want to do this by a method of "book production" that fundamentally has no limit, which hard copy books do. Only so many paper books will fit in even the largest libraries. Oh, and they don't like libraries either, precisely because libraries mean we can read books without buying them. Still, leaving such greedy fantasies of the advertising and publisher rentier class aside, it is true that if, if, a primary reason to cut down trees is to make paper and cardboard for producing books, it is better to make books in some other way. I am not convinced this is true, by the way. Forests are under siege worldwide for corporate greed, not books specifically. Just look up how much of the trees cut down are deemed pulp unsuitable for anything but particle board and similar conglomerates.

UPDATE 2024-12-21 - All of this is not to deny there are some interesting efforts to adjust plastic formulations to make them biodegradable, which should be a key element of how we get away from using plastics altogether. For instance, I have learned about D2W biodegradable materials, first via reference at FriendlyMailer and then via additional information provided by the museum of design in plastics. The D2W plastics have metal salts added to them to promote degradation of the plastic polymers into their non-plastic parts in the presence of oxygen and water.

UPDATE 2024-12-31 - And yes, I do know that it is possible to make plastics from wood chips and the like (indeed the earliest plastics were of this sort), and there is an industry of researchers thrashing away to find means to make a mass-producible "bioplastic" that will rot, for example see the cool down article by Justin Housman Researchers Make Game Changing Discovery While Experimenting with Plastic Replacements. As usual, the headline is overblown and everybody is missing the point. The point is to vastly cut down and all but eliminate plastic use except where nothing else could possibly work, and even then use only the biodegradable kind.

There are some intriguing efforts to make cellulose-based paper using other fibre sources. Some of the most interesting use waste fibre from other crops, such as sugar cane. Others are trying to regain the ground lost by the non-cannabis hemp growers due to the mendacious "reefer madness" propaganda campaign, and face some competition from the people who are trying to use bamboo to make just about everything wood has been used for. The challenging question for all of these though is how they treat the fibre to whiten it and then smooth and size it. These processes are the major generators of the mercury and chlorine contaminated wastes traditional paper plant operators have dumped into waterways for decades. The resulting paper doesn't have to be hyper-white, but it does need to be light and smooth enough for legibility, and versatile enough for use as both printing and writing paper. (After all, we all reuse at least the backs of our used up printed pages, right?) Paper and papermaking has a long and storied history, which I can't cover here. For a fine précis, it is well worth reading Roland Allen's The Notebook: A History of Thinking on Paper. Due to a very eurocentric evidence base, Allen overextends his argument about how notebooks have influenced and expanded thinking, but there are probably few non-academic books bringing together so much important history with detailed references and endnotes.

And so one day, happily exploiting the rationalization of checking the more "environmentally-friendly" notebook options in a local stationery store, I happened upon a curious example claiming to contain no cellulose fibre at all. No indeed, this brand of notebooks has pages of "stone paper," emphasizing an 80-85% calcium carbonate content and no trees or water used in production. This sounded both intriguing and too good to be true, but it made some sense nonetheless. Calcium carbonate is just chalk, and I don't think chalk use has entirely ended even in the most electronically hampered schools yet. The famous "white cliffs of dover" are among the easy sorts of evidence for how chalk needn't be bleached. So if somehow chalk can be made into paper, definitely bleach is going to be optional. But, this still sounds too good to be true. To get the chalk to act like paper, it needs to be bonded to or mixed together with a substance to make up sheets. So, I headed off to the website of stone paper solutions to learn more. What was the clever trick that supposedly minimized their carbon footprint and eliminated any need for water in production? The page about how the "stone paper" is made is the same as their "sustainability" page, which adds some more information about the product's practical qualities. It is "Sustainable, waterproof, tear-resistant paper with endless applications." Oh, and it is definitely recyclable. Okay, fair enough. But, what else is it made of besides chalk? The company about page says very, very little, although it does seem to be based in vancouver, bc – and they refer in their recycling options to making products such as composite decking.

After a bit more digging, inspired by search terms starting from trying to figure out what composite decking is, I found the original company interested in producing and expanding the market for it is based in europe, perhaps the netherlands. The rather surreal definition page on the website for the printing company printsimple (the day I visited the page, the fourth best selling item was special envelopes for condoms) explained that stone paper is:

Stone paper is a type of paper made of 80% calcium carbonate (limestone) and 20% bio-polyethylene resin (HDPE). The limestone is used as raw material from existing limestone quarries and processed into a fine powder. The HDPE is used as a binding agent. HDPE is a recycled plastic and can be recycled again after use. Moreover, it is also an eco-friendly & sustainable material.

Now, the claim about "bio-polyethylene resin," which does not match the HDPE acronym, sounds, alas, suspicious. It is not difficult to find out more about HDPE resins, as companies like GAP Polymers are happy to oblige. The acronym properly expands to "High-Density Polyethylene" resins, and their plastic recycling number is 2. HDPE is used in many things already, including most familiar plastic bottles for non-pressurized liquids, bags, and, go figure, picnic tables and plastic lumber, according to eartheasy. Its recycling number means it is hardwearing, and stands up to sunlight and extreme temperatures. In other words, HDPE is not prone to disintegrating into microplastics to the same degree as other plastics with higher recycling numbers, which means it can be easier to collect, clean, and reprocess. Despite the "bio-" in the printsimple definition, the primary feedstock for HDPE is hydrocarbons. That's right, it is predominantly a byproduct of oil production, a way of making use of the hydrocarbon fractions unsuited for processing into something to put in a fuel tank. Alas, this means any claims about extremely reduced energy use to make the HDPE resin and its lower carbon footprint are, at best, misleading.

My sample of stone paper is indeed not inclined to tear or stretch, and as the wikipedia article observes, feels somewhat like eggshell to the touch. This of course is no surprise, since eggshells are also mostly made of calcium carbonate. I tried out some tests with different inks and pencils, and found pretty much as expected that they all wash off for the most part. Oil-based inks or permanent felt marker ink may not, although my three or so examples came off with soap and water. Fountain pen ink rinsed away or wiped away easily with a sponge. This already indicates to me that stone paper is not archival quality, and it seems it is deliberately designed to degrade in sunlight. On consulting an article from the AICCM bulletin cited by the wikipedia article, I was also able to confirm there must be a proprietary finishing coat on the paper as well. I am very sad to have to conclude stone paper is a failure as a means of reducing the environmental impact of paper usage, precisely because not only does it depend upon hydrocarbon production, it adds to the proliferating plastic waste issue. Plastic recycling is not energy efficient under typical industrial conditions (i.e. minimal hand sorting and cleaning), and the proprietary coatings may mean there is a meaningful component of toxic waste involved after all. It is really too bad, as I can't deny the notion of "stone paper" is really neat. But maybe it is possible to pursue the calcium carbonate option for whitening cellulose papers as an alternative to bleaching them. (Top)

A Dunk On "Canlit" (2025-08-18)

Perhaps it is a bit too easy to dunk on the books flogged under the marketing term "canlit," but that is hardly a reason not to do it anyway. I say this because in an odd sort of way, it is both apparently easy to dunk on "canlit," and yet also difficult to say something useful about it. A major part of the difficulty is the nature of the term, that short form of "canadian literature" apparently meant to encourage sales to people who are or wish to be seen as patriotic, and for the convenience of academics engaged in creating a specialty niche for themselves in some literature studies department. Truth be told, I don't begrudge anybody seeking to identify, read, and/or purchase canlit their doing so. After all, dunking is not equivalent to "don't let there be any of this sort of book/story/essay/anthology," or "nobody should like this stuff," nor is it intended to be. From what I can tell, having grown up far from the main publishing centres in canada, and therefore from the people who like to think they are cultural arbiters here, there is a genuine and persistent audience for the writing marketed as canlit. Unfortunately these works are often desperately parochial, which, because "canada" was and is a colony and therefore bereft of a self-defined culture or sensible awareness of the rest of the world, seems unlikely to change in the near future. But I would find more of canlit readable if the authors and publishers were more willing to acknowledge that the fundamental definition of canlit is:

  • Non-fiction, or at minimum moderately realist fiction;
  • set in southern ontario;
  • preferably in "cottage country" if the author is bored with toronto; and
  • always in english.

I am barely exaggerating here. There are novels by canlit-marketed authors who seek to represent places other than the corner of southern ontario where they grew up or lived for an extended period of their adult and professional life. A fascinating number of them, perhaps due to editorial nodding, give away the reality of a southern ontario location meant to be a fictionalized new york or smaller northeastern city in the united states instead. I confess to finding this charmingly funny most of the time, and in books intended to be humorous, the backhanded reference to filming in toronto at night as a cheaper stand-in for united states productions fits well. The most frustrating part of the examples where the southern ontario serial numbers are still accidentally legible is the hint of a genuine southern ontario-specific settler culture and history poking out. So on one hand there is the teasing hint of a positive creation instead of the habit of trying to reproduce a fossil image of a fictional british culture, on the other an indication of contempt for the book, the author, and the readers. The book was clearly not expected to be of interest anywhere else or to anyone other than a southern ontario market. (But if you would like to see real definitions and some canlit reading recommendations and study guides, canlit guides has you covered.)

These are less exciting things to write about than the issues that led Alicia Elliott to write an article published september 2017 titled CanLit is a Raging Dumpster Fire on open-book.ca, a specifically ontario-oriented site focussed on canadian-owned publishers there. Elliott delves into the appalling politics, racism, and sexism wreaking untold havoc across the works in this category. The effective definition of canlit in the article is both broader and narrower than my tongue in cheek version above: books published by canadian publishers, by authors who are – considering Elliott is Mohawk, maybe the best way to describe the authors is – people who hold canadian citizenship or have otherwise become part of the canadian literary community. Evidently, I am not a member of the community as such, as it is full of people who are either academics or published by a medium- to large-sized canadian publisher. Elliott makes an observation I am not sure many people have seriously followed up on since:

I believe that this sudden anger at CanLit is the inevitable result of Canada's own national identity crumbling. When you think about it, it's really not that farfetched. One could very easily go through this entire essay, swap the word "CanLit" for "Canada" and it would still, for the most part, make perfect sense. What words have traditionally been employed to describe CanLit? Polite. Liberal. Progressive. Welcoming. Aren't these the exact words consistently used to describe Canada? And if CanLit's really none of these things, can we honestly believe that Canada is?

Some time ago, a friend of mine who was also born in and lived mainly in western canada and I were talking about our experiences living in different parts of the country. We were having great fun trying to figure out what makes it all "similar" and "canada" to us rather than someplace else, even in regions of provinces where we were most at sea. As a matter of raw practicality, it came down to this: we didn't need a passport to travel around the country, we could count on everybody knowing at least who the current ruling federal party was and what their latest harebrained plans were, and we shared a broad habit of testing communication waters by starting with the weather. This led us into the vagaries of accents in canada, and the surreal phenomenon of the way a saskatchewan river to fraser river system region accent is deemed to "sound like it comes from nowhere in particular" to many united statesian filmmakers. (The filmmaker I read who commented on this meant "nowhere in the united states" of course, he wasn't being a jerk or anything.) Then we went back and forth on the fascinating and often very funny phenomenon of united statesians who absolutely cannot stand the way we – being from the region of the nowhere particular sounding accent in question – pronounce the term "american." My experience is the people who get annoyed rather than puzzled can't believe our pronunciation has a norman french-inflected lilt in it; they are sure we are just trying to get their goat. And then in one of those lateral associations that stick in the mind for later, I commented that in many ways when people grumble about "americans" they are often talking about somebody in southern ontario, especially toronto.

The tie back to Elliott's point is twofold. For one thing, outwardly referring to another group while critiquing behaviour in one's own is a technique widely recognized in other communities. So when we grump about "americans" and really mean out of touch and arrogant denizens of southern ontario who hold important positions of economic, political, and/or social power, perhaps it is because the covering reference takes the sting out. Or the covering term makes it possible to pretend we aren't having a more uncomfortable discussion, and then to try to have it anyway. Elliott's discussion helps make clear that canlit itself is viewed by many of those who subscribe to the label as written works that reflect what "canadian identity" is supposed to entail, how "canada" is supposed to be. I can't help but wonder: was the anger in 2017 about "canada's" failures on these points (hardly news, surely?), or about the embarrassing proof of how little power canlit as such has to push "canada" to be "polite, liberal, progressive, welcoming," and, worse yet, how much trouble it has keeping its own house in order? I sympathize here, because the 2017 discussion arose as much out of the shock of a community of colleagues who had believed in a greater level of professionalism and fairness than the facts revealed as out of anything else. There is no comfort in seeing demands for an end to appropriation answered with the same old colossal whining about supposed censorship.

The woes of canlit unfortunately are tied up in the very term, and in the fact that it is about marketing. This presents constant perverse incentives to avoid risk-taking, so the genuine diversity of works written by people living in canada, who may or may not hold canadian citizenship and may or may not want to be critical towards the notion of "canada," is effectively censored. Such works barely get a chance to land in the slush pile, which is in fact the trash or recycling bin, electronic or firmspace, where so many manuscripts end up. What already sells well and has been selling well is what leads publishers to demand more of the same. And so canlit is grimly, tragically, the same. Even the book cover designs come from a narrow range, which gives the results an uncanny, provincial appearance in bookstores where hardworking staff patiently assemble canlit displays. Truth be told, I think writers in canada have thoroughly outgrown the hamfisted attempt to recreate a fictional british canada that looks oddly like the late 1960s and early 1970s, except everyone has shorter hair and thicker waistlines. I think readers in canada are fed up with that nonsense too. (Top)

If A Site is Blank With Javascript Off (2025-08-11)

If a site looks like this with javascript turned off, or worse yet literally has no page with it turned off, it might as well not be online.

Okay, I'll be blunt. I don't care what the hell your website is supposed to be about, or whether your website was actually written and posted by people instead of being a pile of large language model generated slop. Even before considering those issues, if I land on your site somehow and it comes up blank because I don't allow every unvetted javascript and plug-in, known or unknown even to webmasters, you are running a worse than mickey mouse outfit. You are running an outfit that can be silenced and blown offline at any time. This includes far too many non-mainstream news websites with solid contributors and reasonable documentation to support checking their receipts. Over the years I have developed some workarounds to this sort of javascript mess, including turning off the site's default stylesheet and even viewing the page source instead. (Yes, really. I'm curmudgeonly about this, but willing to work around annoyances for solid actual writing.) What finally drove me to write this thoughtpiece, though, is the growing occurrence of websites with their entire content wrapped in javascript, to the point that each page is generated by the javascript and only then parsed by the web browser's rendering engine. The only reason to do this is to spy on the reader as much as possible, by using javascript to track at least where the mouse goes and how fast or slowly they scroll the page. And yes, there is a real possibility that what I stumbled on are a few sites that have deployed their web application pages to the regular web, or have pages designed to be both web and web application deployable. The end result is that if the site is not visible without javascript on, it may as well not be online. At least the sites with a rude "you can't see this site without javascript, oh and here is an ID number for your visit" are up front about preferring I not see their pages.
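For the curious, the blank-page effect described above is easy to demonstrate for yourself. Here is a small Python sketch, using only the standard library, that extracts the text a javascript-free visitor would actually see from a page's static markup. The class and function names (`TextProbe`, `visible_text`) and the two sample pages are my own invention for illustration, not drawn from any site mentioned here; this is a rough heuristic, not a full browser.

```python
from html.parser import HTMLParser

class TextProbe(HTMLParser):
    """Collect the text a javascript-free visitor would actually see,
    ignoring everything inside <script> and <style> elements."""

    SKIPPED = ("script", "style")

    def __init__(self):
        super().__init__()
        self.skip_depth = 0   # nesting depth inside skipped elements
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIPPED:
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIPPED and self.skip_depth > 0:
            self.skip_depth -= 1

    def handle_data(self, data):
        # Only keep text that sits outside any script or style element.
        if self.skip_depth == 0 and data.strip():
            self.chunks.append(data.strip())

def visible_text(html_source: str) -> str:
    """Return the static, script-free text content of an HTML page."""
    probe = TextProbe()
    probe.feed(html_source)
    probe.close()
    return " ".join(probe.chunks)

# A page whose whole content is built by a script renders blank without javascript.
js_only_page = '<html><body><div id="app"></div><script>app.render()</script></body></html>'
# A page with static markup still has something to read.
static_page = '<html><body><p>Hello, reader.</p></body></html>'

print(visible_text(js_only_page))  # prints an empty string
print(visible_text(static_page))   # prints "Hello, reader."
```

Running the same check against a saved copy of a real page source makes the difference between the two styles of site obvious: the javascript-wrapped kind yields nothing but whatever stray loader text its authors bothered to include.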

Back in the not so long ago early days of the web, creating a website with no text-only fallback was unheard of for serious and experienced web production teams. To be sure, this was not a wholly altruistic matter, although it happened to make the web of that time far more accessible for the visually impaired, or for anyone dealing with neurological issues aggravated by the sorts of flashing or abrupt movements characteristic of early web-served video and graphics. Web browsers were not standardized, web access could still be very slow, and while programmers developed audio plug-ins quickly, video plug-ins proved much harder. Worse yet, the original online short video and game solution, macromedia flash, developed into an unstable, energy inefficient, security dumpster fire. It took some time for flash players to become widely available in the first place, and then all the stability and security issues made people more cautious about using them. Many people criticized Steve Jobs for dumping flash support altogether from the various i-devices, but like it or not, he was right. Under the circumstances, when website visits were not taken for granted or viewed mainly as a means of data mining and flogging irrelevant advertisements, not having a text-only fallback would have been ridiculous. Even the wretched bbc had a well thought out and organized text-only version, including of audio-visual heavy sections like those for classic Doctor Who. Especially for any organization claiming to be a serious outfit, no matter what its size or whether it was meant to sell something, having the site come up blank or unusable just because a visitor was using a text-only web browser, or was not running every possible A/V plug-in or javascript, was absolutely unacceptable. It made them look foolish at best, careless at worst, or, worst of all, contemptuous. Better to have no site at all than risk that.

Now too many people running websites are either unaware of, or could hardly care less about, the possible impact of these sites coming up blank. Perhaps in their view, anybody who doesn't run all the javascript or who won't run their web app is just some sort of "dinosaur" or even a crank. In that case, I concede, their contempt for anybody who would try to visit their site would probably be expressed some other way. For those who are unaware, this strikes me as truly tragic, and probably a reflection of the ongoing effort by the various corporations and their authoritarian-inclined techbro stooges to mystify all technology and persuade people it is too hard for them, that they don't have time to learn it, and that they should use corporate "services" instead. As I noted at the start of this thoughtpiece, it is incredibly risky to go along with the pressure to use such pseudo-services, which have terms of service as well as a technical stranglehold making them able to take a site offline and delete it at any time. How many of these blank-without-javascript websites, and even those that aren't, have no backups independent of a pseudo-service? How many are in effect depending on the internet archive to save their asses (typically without supporting it)? It's a scary thought. (Top)

The original logo of apple computer, in many ways the total antithesis to the later rainbow-style macintosh logo and its eventual plain white successor in common use today. Quoted from the Apple-1 Operation Manual in the Bitsavers.org (apple) collection at the internet archive. Accessed 24 august 2024.

Much to my annoyance, I cannot yet relocate the source of a critique of the apple computer logo as referring to the judeo-christian bible, and the chapter according to which Eve ate a fruit from the tree of knowledge of good and evil, thereby supposedly sentencing humanity to mortality and the need to work for a living, and women to agonizing childbirth. This story never made much sense to me and sounds suspiciously garbled, but as many of us have learned, either passively by overhearing it or directly by somebody teaching it, supposedly the fruit in question was an apple. The reasons an apple was unlikely to be the fruit referenced by the original storyteller, and the ongoing argument and efforts to trace how and why apples got dragged into it later on, are outside the remit of this thoughtpiece, though quite instructive about ancient fruit varieties, languages, and cultural warfare. Since apple computer is a united states corporation founded in a context where "the bible" and a sort of "greatest hits" subset of its most illustratable stories for puerile audiences are part of the cultural water everyone is swimming or drowning in, I can't see any plausible argument to counter the critique. Even though the critique also strikes me as a bit overwrought, considering what apple computer has developed into and the wildly different original logo, I must concede the point is nevertheless a sticky one.

Why does the critique strike me as overwrought, whether fairly or not? Well, I admit to being rather charmed by the original notion of "apple" deriving from the shorthand for "application," "APPL." Truth be told, I don't think it would have occurred to me to use it as an actual designation for any sort of machine. Twenty years or so ago, such things seemed clever rather than an active effort to slip something by us. That, of course, is a "seemingness" from a less cynical, or at least more naïve, time for me. Now it irritates me constantly to hear large language models referred to as "thinking" or "hallucinating" in an effort to persuade investors these things are intelligent and so should have billions of dollars spent on them. The trick is simply too obvious now, and so even the idea of "apple" being something of a clever joke or pun meant to help with branding and advertising does not have the same entertainment value it once did. Master of marketing on the cheap as apple computer has long been, the more homespun tenor of some of the earlier iterations certainly hasn't held up.

Today at least, the original apple computer logo would never make it to the drafting table. No, today logos are not allowed to be so elaborate. Unlike earlier days when part of the point was to make them more difficult to copy and reproduce, now they must be reduced to the simplest forms possible to ease clear reproduction at multiple sizes. Hence finding logos that still feature more than two letters or a slogan of any kind, at least in the anglophone mainstream, is quite difficult. Even the slowest to change university logos, with their origins in stylized heraldry and therefore meant to have multiple elements each with their own meaningful details, have been reduced to versions that look suspiciously like they were designed for redrawing with a large felt marker. In the apparent management mania for ever thinner and flatter devices and visuals, apple has even abandoned the slight extra line once used to imply three-dimensionality in its modern monochrome logo. But I suppose it would also scandalize a marketing department now to include the banner line added to the original apple logo, which reads, "Newton... 'A mind forever voyaging through strange seas of thought... alone.'" I doubt any marketing professional would go along with using such a tag line, with its overtones of melancholy. (Drawn from William Wordsworth's poem The Prelude, according to cultofmac.) No, no, no. Can't have possible customers associating the product with loneliness or exploration of any kind. Far better to guide them toward seeing the product as a way to be part of a crowd in which everyone has to have their own apple computer to fit in.

The black and white Newton logo was designed by original apple computer cofounder Ron Wayne, on deliberately old-fashioned lines. He did a brief interview with motherboard in june 2017, and gamely put up with a question about whether he has any "regrets" about selling his original stake in the company, among other questions of that ilk. His explanation of how he came to sketch the logo is far more interesting, and makes quite clear he knew his design ran very much counter to the trends of the time. I suppose in his own way he was following through on the positioning of apple computers as part of the counter-culture and a challenge to the infamous ibm and the burgeoning but still brand new microsoft.

I have bumped into a rather grim (and somehow very united statesian sounding) theory that the subsequent apple logo was a reference to the fate of Alan Turing, widely believed to have committed suicide by eating an apple laced with prussic acid. The 1977 rainbow logo was actually created by designer Rob Janoff when the company was shifting gears in preparation for the release of a new model, which included a colour screen that Steve Jobs understandably wanted to show off. According to cultofmac, "Janoff added the bite in the apple to give it a sense of scale when reproduced at different sizes. (It was also a play on the word 'byte.') The colorful stripes showed off the Apple II's big feature, while embracing the countercultural tenor of the times." As always, the real story turns out to be quite mundane – and it is worth noting as well that information about Alan Turing, his contributions to the development of computers, and his death was little-known outside military-oriented circles until the early 1980s. (Top)

The Warfare Nobody Talks About (2025-07-28)

Reproduction of Figure 2 in "Long COVID: A Clinical Update," by Trisha Greenhalgh, Manoj Sivan, Alice Perlowski, and Janko Ž. Nikolich, published in the Lancet 31 july 2024.

If you have not spent any time reading the finance blog naked capitalism, I can't help but think you are missing out, even if only by not checking to see whether it is a blog you would like to keep following. In this day and age it is hard to overstate the importance of checking receipts for ourselves and making up our own minds. Yes, it is hard work sometimes, and it makes us undeniably responsible for what we know and what we do based on what we know. I don't see any other way to survive for those of us who have no means, ability, or inclination to try to construct and maintain a permanent bubble to fend off the rest of the real world – a bubble that can't and won't protect when it is most needed anyway. Among the many topics the team at naked capitalism covers besides the narrowly financial has been the ongoing COVID-19 pandemic. They have provided some of the best and most consistently sober and evidence-based coverage available, including thoughtfully considered links to other sources. The person leading this coverage is Lambert Strether, whose astringent wit and appreciation for how a major pandemic interacts with general socio-economic conditions, in the united states in particular but also in other countries for which he can find accurate information, are a true wonder to read. This work appears especially in naked capitalism's daily links feature, and in his own afternoon water cooler (the water cooler is funded separately from the main blog and more than deserves all support people are able to chip in). I have quoted here, including Strether's annotations, a snippet from Greenhalgh et al.'s paper, "Long COVID: A Clinical Update." Besides quoting it as a wonderful example of Strether's understated, but wonderfully pointed and acerbic commentary, I have quoted it because it does not feature in the COVID-19 coverage part of the 1 august 2024 water cooler it is drawn from. No, Strether placed this under a quite different header: Class Warfare.

UPDATE 2025-11-11 - I forgot to add the note that Lambert Strether moved on to other writing projects in early 2025, and readers will need to go to his naked capitalism author archive to read his body of work, which of course includes much more than COVID-19 related material.

The placement is introduced with a seemingly mild comment, "Yes, of course Long Covid hits the working class disproportionately." Yes, seemingly mild. In one of his earlier video-recorded editions of "Reading Marx's Capital (Volume 1)," David Harvey observes that the industrial proletariat, in much of the world where there is still industry, is majority female. When it comes to the specific issue of women and girls' health and vulnerability to serious outcomes of COVID-19 infection, it is worth noting that an important factor in immune response to any pathogen is the person's state of nutrition. Malnutrition is rampant across much of the so-called "west," especially in the united states (widescale obesity is a telltale sign of malnutrition in an "advanced" economy). Under the best of conditions, women and girls living in patriarchal societies are pressured to eat less, less often, and less to none of the best quality and most valued food. They are also often discouraged from physical activity and going outside to get real sunlight, so necessary for proper generation of vitamin D, a key to bone and immune system health. Preconditions such as metabolic disorders triggered or exacerbated by malnutrition run rampant among women and girls, especially those who must live in conditions of near or outright poverty. Work relegated to women and girls is typically and conveniently redefined as not "real work," as in it supposedly entails no physical or mental labour.

In "the west," the de-industrialized proletariat is majority female, locked into the vastly underpaid and hyper-exploited "service industries." From waiting tables to working in nursing homes, teaching and assisting in schools to running grocery stores, these women and girls perforce must interact with people every day. Many of those people will be sick. Many of those people, like the women and girls at work, are living in societies where every form of pressure is put on them not to protect themselves from COVID-19 or any other illness by effective non-pharmaceutical interventions. The owners and managers of the businesses constantly cry poor, claiming they can't possibly afford to improve air quality even minimally, not even by adding Corsi-Rosenthal boxes, which are not expensive to build and would also help tamp down the fall cold and influenza plagues. The same owners and managers also cry about how so many staff are too sick to work, or do shoddy work when the managers and owners succeed in forcing them to come to work anyway.

Now go ahead and tell me there is no such thing as class warfare, and that women are not a class. (Top)

Copyright © C. Osborne 2026
Last Modified: Friday, January 02, 2026 21:04:05