FOUND SUBJECTS at the Moonspeaker
Technical Information Signal Boost (2026-04-27)
This is an image of an 1867 patent, u.s. patent 64 205. According to the label pasted onto the original document, what was supposed to make this design better was being able to slip the foot in from the back. Scan via wikimedia commons.
All too often, while it is easy to find articles and comments decrying all the lies and bullshit posted everywhere online, including active poisoning of search engine results to make finding correct information harder, those articles and comments provide no alternatives. For example, they do not suggest alternative search engines, or admit frankly that depending on topic, it may be necessary to use one search engine and not another. Nor do such articles or comments typically provide links to solid online sources such as curated databases, or original documents or articles. Any jerk who proposes using so-called "artificial intelligence" is a shill or a fool and should be ignored. None of that is useful. Hence I try to include direct links to useful articles, especially when they derive from solid websites and online databases, because they deserve direct traffic, and what makes the "world wide web" just that is links between webpages, sites, and databases, not advertising and propaganda mills masquerading as search engines. Among the technical topics it can be tricky to get good information on is the "UEFI," the "Unified Extensible Firmware Interface." Yes, there is a prominent forum website out there (uefi.org/) with links to support documents, which can help to a degree, but the challenge is to get an accurate overview. These days wikipedia is simply too politicized to depend on in general, let alone in the case of new interfaces introduced at the behest of large corporations. Developers have their own more detailed needs, but what about those of us who are not going quite so deep into the details, yet need to check compatibility and verify for ourselves whether a reported vulnerability is a genuine issue for a machine we are using or otherwise responsible for? To that end, developer Sami Tikkanen wrote an excellent, highly readable UEFI & BIOS fact sheet. 
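As a small practical aside of my own, not from the fact sheet: on a linux machine it is easy to verify for yourself whether the system booted via UEFI or legacy BIOS, which is the first compatibility detail to pin down before worrying about whether any reported UEFI vulnerability even applies. A minimal sketch, assuming a linux system with the usual /sys layout; the mokutil tool is an assumption too, since not every distribution ships it:

```python
import os
import shutil
import subprocess

def boot_mode():
    # UEFI-booted linux kernels expose firmware variables under
    # /sys/firmware/efi; the directory is absent after a legacy BIOS boot.
    return "UEFI" if os.path.isdir("/sys/firmware/efi") else "legacy BIOS"

print(boot_mode())

# On a UEFI system, mokutil (if installed) reports the Secure Boot state.
if boot_mode() == "UEFI" and shutil.which("mokutil"):
    result = subprocess.run(["mokutil", "--sb-state"],
                            capture_output=True, text=True)
    print(result.stdout.strip())
```

None of this needs root, because it only reads what the kernel already publishes; actually dumping or editing individual EFI variables is another matter entirely.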
Roy Schestowitz at TechRights signal-boosted the fact sheet and published a follow-up article by Tikkanen as well, The UEFI Hype and Microsoft's Lies. My intention here is to give the signal boost an additional bump, as well as provide links to a pdf of each in order to add to their accessibility. (UEFI & BIOS fact sheet, Tikkanen UEFI article) While the internet archive and other projects based outside the united states are doing important work to keep materials up and findable online, we can't just leave it to them. To the extent possible, each person who develops and/or works on a website must contribute as well. Redundancy and crosslinking are more important than ever as the pressure of attempts to swamp the web with nonsense continues. On the matter of microsoft and UEFI in particular, it is a bit surreal to follow up on the things microsoft representatives and documents say. There seems to be a microsoft view that the only computers in all the world are those running microsoft software, similar to the view of certain united states representatives that the united states is the only country in the world buying anything or doing anything meaningful. While of course both are influential and show up in plenty of places, this is still a very strange, dissociated belief and/or act they are putting on. (Top)
Plagues Do Not Improve Genetic Fitness (2026-04-20)
Watch out for the revenge of the deeply disrespected and ill-treated industrialized chicken. 2018 photograph by Ferme Faunistique, used under Creative Commons Attribution-Share Alike 4.0 International license via wikimedia commons.
Not so long ago, the title of this thoughtpiece was simple common sense. It wasn't hard to come up with recent examples to illustrate it, some with impact still in living memory, either. But, following the development of Edward Bernays' vile work by the comparably vile Josef Goebbels, not long after COVID-19 was willfully fomented into a raging and still ongoing pandemic by corporate malfeasance, the claim that we should "get covid" and thereby "get immunity," and therefore a strong immune system, began to blare from every possible news outlet. I can vouch for the ubiquity of the nonsense because the push got well beyond "mainstream media," which I do not follow, and my own relatives and friends began spouting shocking nonsense, including stuff counter to their own usual practical perspectives. Before I continue, just to be clear, by "corporate malfeasance" I am not referring to any claims about COVID-19 being somehow created or "gain of functioned." No, I am referring to the corporate, especially big pharma driven, crushing of the production of and access to both pharmaceutical and non-pharmaceutical interventions that could stop the spread of COVID-19, prevent it, and safely and sensibly treat it. They were damned if they were going to forgo any means to maximize, in their view, profits from modified RNA-based injections intended to serve as covid vaccines. Plus, they were uninterested in allowing people to ask quite reasonable questions raised by the non-pharmaceutical interventions and the evidence that yes, like it or not, COVID-19 is one of a number of infectious illnesses that spread in aerosols, not just droplets or fomites. In 2017, historian Laura Spinney's book Pale Rider: The Spanish Flu of 1918 and How It Changed the World hit the bookstores, as we would expect because publishers gotta market, just in time for the pending hundredth anniversary of the improperly nicknamed "spanish flu." 
(By rights it ought to have been called the united statesian flu, because it was spread from the united states by infected soldiers sent overseas to the european war.) It is an excellent book by the way, including a brilliant explanation of how flu viruses are named for science purposes and what the "H" and "N" parts mean. As is standard procedure for such books, Spinney sets the stage for the main story by tracing earlier pandemics, especially those in the nineteenth century. The most important of these to us now would probably not have stuck in my mind, except that Spinney noted that while it was originally referred to as a flu pandemic, an illness so nasty and fast-moving its major colloquial nickname was "grip," in retrospect scientists suspect it was not an influenza but a coronavirus. This infamous pandemic lasted over ten years, and the illness was first reported causing mass illness in russia in 1889. It is possible to read about the impact of "grip" in an 1892 revised edition of Julius Althaus' Influenza: Its Pathology, Symptoms, Complications and Sequels; Its Origin and Mode of Spreading; and Its Diagnosis, Prognosis, and Treatment, at the internet archive. Even a quick read over of the table of contents alone is enough to bring on a horrible sinking feeling, because the descriptions of the course of severe illness and the range of post illness issues are all too familiar today. The people who survived that 1890s pandemic were not stronger and better off thirty years later, including the children who had since grown to adulthood. As the saying goes, hindsight is twenty-twenty, but the grim signs were everywhere across the big colonial power of the day as britain's rich classes began raising armies to fight the new big war against germany in 1913-1914. By then, the majority of the working classes had lived in squalid to at best okay conditions in the new industrial towns for over a hundred years. 
For that whole time, their ability to eat wholesome food, drink clean water, live in properly ventilated and uncrowded homes, bathe regularly, and acquire basic schooling were, to put it mildly, restricted. The military recruiters discovered the hard way that in spite of patriarchy running rampant, such that women and children in a family went hungry to feed "the men" even when they were all working ten hours or more a day to maintain the bare minimum of necessities, "the men" were in a pathetic state. Potential recruit after potential recruit failed their physical due to the impact of childhood or current malnutrition, as reflected in stunted growth, chronic illness, and infectious disease. The recruiters blatantly looked the other way when comparatively healthy but underage recruits came forward, then pretended that something magical happened to "british soldiers" so that they grew taller and broader chested once in the armed forces. I am all too aware of how limited language can be when saying simply: this is the population brutalized in a rich man's war in which the rich man took care not to fight, then got smashed again by the spanish flu. The rich classes were in a sweat as the spanish flu finally wound down, helped along in part by the development and imposition of non-pharmaceutical interventions, including masks, all-important ventilation, and improved food and sanitation. The working classes were furious, and among the diversity of their demands for change there were some consistent ones. The rich wanted to retrieve social control and get ready for the next war they already knew was coming, but what the hell were they going to do for armies? Well, that, and how the hell were they going to prevent sickly servants from spreading disease to them? Hence, some grudging improvements beyond the emergency efforts to deal with the spanish flu followed, even as the economy cratered. 
In any case, post world war two, the working classes demanded, and got, sensible reforms to improve food, sanitation, medical care, and education for the majority, at least in the "west." Contrary to what capitalist fundamentalists like to pretend, this had excellent economic results, albeit not the sort of result that funnels the vast majority of the benefits to a few rich people while driving everybody else into precarity and penury. Nevertheless, the rich classes never forget their greed and how much they resent those they deem lesser having anything at all, let alone life itself. So they made sure to include a poison pill in every set of government policies in the form of a demand to maintain military spending, an insistence that the only science research the public purse would fund was military research, and a further demand for any beneficial funding and activity to go to the private sector. So it is that once too many people had been persuaded it wasn't necessary to resist these nefarious policies, because they were convinced the policies either weren't nefarious or didn't exist, the many positive gains began to slip away, and old issues began to reappear, albeit in a paradoxically "clean" form. Once again, more and more people found it impossible to feed themselves, living in conditions where they could eat themselves "full" yet still be malnourished thanks to the adulteration and ersatz quality of what they could afford to eat. To add insult to injury, these new fake foods induce metabolic illness such that many impoverished people are overweight and then accused of being undisciplined gluttons who can't manage their money, because "look at them." Housing is so expensive people are caught between homelessness and overcrowding. Public elementary and secondary schools are so starved of resources, and teachers so poorly paid, that governments are continuing the process of actively restructuring them into day prisons. 
Driven by their commitment to fundamentalist capitalism, governments are also privatizing public spaces, either by selling them outright or adding ever increasing "user fees." Then they criminalize children who have nowhere to play but the street. Ah, and I almost forgot to note the poor ventilation endemic to public buildings of all sorts, from schools to office buildings, apartment complexes to condominium complexes. Those were the conditions under which COVID-19 hit the scene in "the west." This is the sort of thing I mean by corporate malfeasance that fomented its pandemic status. All these conditions were at play before the epically stupid decision not to shut down international travel properly, instead allowing the rich to run around as they pleased. The "lockdowns" we were encouraged to experience as some kind of mass imprisonment experiment were never lockdowns, and furthermore were set up to fail. Short term school and office closures, for example, could have worked to curb COVID-19 spread if they had been used as an opportunity to correct ventilation issues and properly equip everyone with N95 respirators for use in conditions where crowding made air flow inadequate. But that would only have made sense to the self-proclaimed "leadership class" if they respected the life and health of everybody else. Like it or not, as I write the current COVID-19 pandemic is not over, and it is now in its sixth year, and worse yet, we have a rumbling new influenza in the works that has been growing ever nastier over the past several years as well. In early 2025, Brandon Keim's article at nautilus, The Unnatural History of Bird Flu: H5N1 is a Human Creation, didn't get as much attention as it should have. The article deserves better: it is excellent, and includes important clarifying information about what industrial farming and meat production reveal about virulence and immunity. Two brief quotes stand out in particular. 
In the facilities – the artificial ecosystems – that now house much of Earth's terrestrial vertebrate biomass, constraints on virulence that prevail in natural ecosystems are not merely removed. Virulence is actually favored. In the words of Mark Woolhouse, an epidemiologist at the University of Edinburgh, the viruses are "a response to the selection pressures that exist in a human creation: the modern poultry farm."
This makes intuitive sense. Viruses that leave birds healthy, moving about and flying halfway around the world, reproduce more than viruses that endanger their hosts and thus limit their own opportunities. If a strain turns deadly or incapacitating, it ultimately harms itself. Survival of the fittest in this case means doing no harm. "This is the starting point. This is where we come from. This is the default host-virus interactivity scenario," Slingenbergh says.
Leaving aside whether the claim about where much of Earth's terrestrial vertebrate biomass lives is accurate, note the point about how constraints on virulence are removed in those industrial meat production operations. They are set up in a way that facilitates rapid spread of any nasty virus. The way to force a virus to evolve into a harmless or mostly harmless form, insofar as such a thing can be forced, is to impede its spread, preferably by every means possible without imposing harm. Among the tactics available and effective for this purpose, and necessary whether there are vaccines or other pharmaceutical interventions or not, is ensuring conditions of life for the potentially vulnerable population are decent and healthy. It should not surprise anyone that in conditions of mass malnutrition, crowding, and filth in an industrial farming operation, the virulence and destructiveness of infectious disease would go up. After all, that is exactly what we are seeing in similar conditions for people. (Top)
"Remembrance Day" is a Farce (2026-04-13)
1897 cartoon by Samuel D. Ehrhart in the magazine puck, courtesy of the public domain image archive.
The original "remembrance day" comes from the attempted commemoration of the end of what was originally called "the great war" and then "world war one" because of "world war two." Neither the term "world" nor "great" would have been applied to these two wars if they had not drawn in nearly every country of people who think they are white at the extreme west end of asia. "Europe" in recent history has been through repeated bouts of warfare, usually episodic and sometimes relatively limited. If intraeuropean warfare wasn't curbed by the presence of an empire of some sort, the fallback for a time was colonization, a more or less convenient way to deport the poor and no longer integrable veteran soldiers. But the various european elites and the growing capitalist interests had a special taste for war, so good for boosting economies via war production and massive pillaging of neighbours, so long as the "right side" won. Failing open warfare, encouraging others to fight and taking advantage of the chaos to steal resources and sell weapons and other gear to both sides could be gratifying. However, such duplicitous games can't go on forever, and in the case of "world war one" the clever schemers were too clever by half, and unable to prevent the war from dragging them in too. Hence instead of all having another go at the russian empire à la Napoleon, germany and its allies wound up fighting britain and its allies in what was partly an externalized intrafamily competition among the late queen Victoria's various grandchildren. There is no denying this war brought numerous issues unavoidably to the surface, including how the depredations of capitalism were so severe that even in britain the number of conscripts who were too ill to be sent off as cannon fodder caused panic in the richer classes. If they didn't put that right and stop exporting the comparatively healthy poor to the colonies, how would they fight the next war? And there was going to be a next war. 
Everybody who had any sense of what was going on in the world knew it, and people did, including the working classes. In elementary school, teachers insisted we students should make poppies and memorize the execrable poppy poem because the veterans "fought for our freedoms." This reasoning never made sense to me as a child, and of course it wouldn't. It wasn't meant to. We were supposed to accept it and go along, just like we were supposed to imagine the national anthem was always played before professional or amateur league sports games, even though that was a thing specifically introduced during the "great war" in an effort to encourage recruitment. It was even more puzzling when during high school the principal took to playing the national anthem in the morning, which had never been a thing before. If nothing else, it got students moving to class anyway. And to be fair, in high school the social studies teachers did explain how "remembrance day" came specifically from the "first world war" and how great a struggle it was to ensure veterans received adequate healthcare and support in their return to civilian life from that war and so many others. We were encouraged to interpret "remembrance day" as about support for veterans, which by then we were old enough to understand made sense, especially for conscript veterans who often would have had no idea whatsoever what they might be facing. The "second world war" was not discussed. There were a few dated books about it in the school library, but it was not invoked in "remembrance day" ceremonies. I was living in a canadian forces base town, so heedless of whatever might be said at school, the ceremonies took on the vague "we've always done that" quality of the national anthem before hockey games. Over the years I have come to understand why the teachers were not talking about the "second world war." 
Besides the issue of being in a "base town" and the always politicized and heavily policed elementary, secondary, and high school curricula, the "second world war" was not very convenient. Despite the efforts of later twentieth century propaganda like the Indiana Jones movies, with the "archaeologist" who was actually an antiquities thief scrapping with the comic book evil nazis, odd, contradictory stuff kept coming out of otherwise respectable people's mouths. Things like holocaust denial, and that the united states magically joined the war late and won it. It didn't take anything away from anybody fighting the definitely not a positive force in europe or overseas nazis to call bullshit on both claims. Even in those days before readily accessible digital archives it was not difficult to find more accurate information, including from those dated books, ironically enough. And alongside the odd, contradictory stuff began to come more demands for ostentatious display of the "remembrance day poppy," not just on november 11th but now up to a week before. And woe to the person who wore the wrong poppy. Don't be wearing the now little known white poppy for peace, and who do you think you are, still wearing the poppy with the green bit in the middle when it is supposed to be black? But what the hell was the point of all this "remembrance" when the number of wars seemed to be going up, even if there were at least no "world wars" demanding conscription or other mobilization of people who think they are white. The rank and file veterans were still struggling with poverty, lingering injuries, and problems integrating back into civilian society. It was less visible than in the early twentieth century, and the eldest survivors of the "world wars" visible in the news seemed to be well taken care of. The so-called "iron curtain" was gone, yet the anachronistic "nato" was now openly destroying countries, apparently starting with yugoslavia. 
By the early 2000s, there was no way around it. The main thing "remembrance day" had become was what the guy who wrote the shitty poppy poem wanted it to be: a giant military recruitment festival. The guy who didn't write that poem, Wilfred Owen, wrote a very different work, "Dulce et Decorum Est," which makes for a genuinely respectful remembrance of "world war one" veterans. But even that didn't convince me "remembrance day" was a farce. No, it was the canadian members of parliament ignorantly giving a standing ovation to a former nazi who should have been in prison. For some other examples of what has made "remembrance day" into a farce, a place to start is Robin Philpot's review of Peter McFarlane's book Family Ties: How a Ukrainian Nazi and a living witness link Canada to Ukraine today. It doesn't have to be this way. But until canada, the united states, and the countries in europe face up to the real meaning of war, "remembrance day" can be nothing else. (Top)
Lossy Communication (2026-04-06)
Photograph of a wooden ladder beside bookshelves by Tim Van Cleef, published 10 april 2019, via the common reader and unsplash.
The title of the essay popped up among the many links in my collection of old-fashioned rss feeds. It caught my eye, and I hesitated to open the link. We Are Losing Our Words, by Jeannette Cooperman at washington university's the common reader. The title is on the very edge of clickbait, but unlike clickbait, there is no subheader with some shocking claim about population level IQs or inveighing against "kids these days." Plus, the recommendation with the link came from a sensible source. Alright then, there must be something good there even if the headline writer did flirt with capsizing the essay's figurative boat. And so there is. Cooperman is too polite to be so blunt, but in many ways she is firmly decrying the pressure from far too many editors, and from what purport to be writing tools, to dumb down written work of all kinds for general audiences. In fact, Cooperman is so polite she makes sure to provide some other reasons this matters. Besides noting how "the words we keep ready in the back of our throat reshape our neural pathways every time we use them, creating new connections. They influence what we are capable of perceiving," she refers to evidence of how building and maintaining a broad vocabulary improves empathy and helps counter the risk of dementia and general cognitive decline. But truth be told, I love the essay best when she is furiously tearing down the people who continue to push for the sort of writing convenient for inserting advertisements into. It is surreal how a genuine and reasonable critique of technical documents chock-a-block with long sentences of tacked together subordinate clauses and obscure jargon has been turned against just about any writing meant to be published in a magazine, journal, or online equivalent. Or, for that matter, in mass market paperback novels. There is certainly a grim logic in the pervasive pressure to dumb down writing of all kinds today, beyond colonizing the reading space with advertisements. 
Advertising companies of the "social media" subspecies are constantly working to prevent people from leaving their websites, and now their "apps." If a person wants or needs to stop looking at the "social media feed" to look up a word, there is a risk they might leave the feed altogether. On the flip side, this also encourages a certain resentment in people who have accepted and gotten used to this type of eviscerated writing when they happen on the real stuff. To their shock, it takes longer to read, takes more effort, and may even force them to use a dictionary and read a sentence or paragraph more than once. Unfortunately, it seems like more and more people interpret this sort of experience as either evidence they can't read this more substantial writing, or as an inappropriate expectation that they will take the time and effort to parse it. "Don't make me think" is an appropriate expectation of a program interface, not something we read. The first few times I encountered the acronym "tl;dr" it was funny. Now it is mostly frustrating. How can a person really know if something is too long to read for any reason without actually trying to read that something in the first place? And by "something" I do mean a non-technical essay or article, not longer or more technical items that may demand (gasp) rereading. It is fair enough to skim a bit and conclude, "too long to read right now" or "not on a topic I care to read about," but some respectful engagement is required. Then again, skim reading is also a skill, and not necessarily taught in school these days. Skim reading is not the same as speed reading, and contrary to the claims of the lingering speed reading applications and courses out there, the most effective way to read faster, for those who do not have dyslexia to manage, is to read more. Read more and read widely. 
In the case of reading for a specific area of interest or study, learn the vocabulary – which does mean for a while having to grind along with a dictionary of one sort or another. If you really want a leg up and are coping with a western european language, including the mongrel english, among the most effective things you can do is complete a course on latin and greek roots used in science and technology. Books used for these courses are still common denizens of both school and university libraries, and may even turn up in medium-sized public libraries. Or for those inclined to stick to online sources, this query at the internet archive will bring up books and even recordings, some requiring login to use, others always available. The majority of ostensibly "highfalutin'" vocabulary is built on redressed latin and greek roots, and yes, making and using words like this was sometimes about turning away certain readers. But far more often, as Cooperman observes in her essay, it is about striving to express a new thought or nuance, and just plain having fun. There are adult versions of the by turns hilarious and wryly annoying word games kids come up with as they gain skill in their mother tongue. It seems a bitter shame to let advertisers and other jerks bully us away from the fun and creativity language allows, and not just written language. I find myself thinking also of the visual language of plays and films, and of music of all kinds. All these are under particularly relentless pressure to reduce them to formulae suitable for mindless reproduction, first by bored and miserable humans, and then by large language models to maximize immediate profit. It took a sadly long time for more practitioners of other arts to realize that the real danger came from late stage capitalism and the greedy corporate obsession with enclosure of commons and getting rid of wage labour, not from unauthorized copying. 
If it were solely up to such capitalists, we would certainly not be permitted to communicate, though we would certainly be permitted to subscribe for a fee. (Top)
Belatedly Found Explanation for a Puzzling Problem (2026-03-30)
A teaser image quote from A.J. Wykes' excellent article on the 'loudness war' that blighted so much music through the late 1990s and up to the 2010s. Do check out the original article on the Sound Guys blog, which includes sound samples.
There are many reasons I gave up listening to over-the-air radio, not least the way any and all stations are now like cable television: constant talk and most time taken up with repetitive and generally useless advertising. (No doubt there are sociologists or anthropologists out there who could complete impressive studies of what the advertisements suggest about the presumed listeners.) I had noticed what little pop music there was seemed to be really, not so much loud, though it was that too, as noisy. This was the mid to late 1990s as grunge took off in popularity, and now I suppose some of this came of attempts to maintain the sound ambiance of live performances in less than adequate music performance spaces. Despite this, there were still some bands from this time I quite enjoyed, such as garbage and oasis, even though the latter seemed to despise their fans almost as much as metallica does theirs. On their breakout album, "(What's the Story) Morning Glory?" the actual title track is basically a tremendous wall of noise, the sort of thing it can be fun to play while getting started on a messy job like washing a car or something. It always struck me as somewhat odd, considering the other tracks were recorded and mixed in the standard way we have gotten used to through the twentieth century: minimize noise, clear vocal and instrument tracks. There is nothing to suggest this specific track was recorded live in performance or similar. Still, as we know, musicians and artists in general opt to do peculiar things from time to time. Some experiments land, others don't. This one was okay. Years later, on a foray to replace broken headphones, I made my reluctant way out to a newish big box electronics store, very much out of my way. But, the store was among those that would honour the headphone warranty, so it was worth a shot and a nice day besides. 
Once done with the headphone issue it would be possible to enjoy a fun walk exploring a less familiar neighbourhood on the way home. This was my plan, and like most such plans, it did not go quite as intended. While I did manage to successfully navigate the warranty redemption process and acquire working headphones, it was no easy feat, and not due to the sales clerk. Being a big box store, it was, and probably still is, basically a warehouse, with the attendant high ceilings and not great speakers for its sound system. There was music playing, but the overall effect was so cacophonous that I had never been so desperate to get out of a store, nor so relieved on getting out the door. I can see this being a technique applied in certain cases like dubious fast food restaurants, but it seemed completely counterproductive for a store selling gadgets and accessories to the gadgets. I could not understand originally how the clerk could stand the racket. They didn't appear to have ear plugs in, and the music, such as it was, wasn't so loud we couldn't speak to each other at normal vocal volume. It was all very odd. A number of my friends commented they had had similar experiences in this warehouse type of store, and we all agreed it was odd. Odd, but not the sort of odd that demands investigation. So it was with a combination of surprise and, yes, delight that I happened on an article helping to explain both those oddly noisy music tracks and the subsequent unfortunate noise experience at the big box store. A.J. Wykes wrote a longform article back in the summer of 2021 on the Sound Guys blog, What's the loudness war? How has it affected music? Who are the winners and losers? Wykes explains how music in the fifteen years or so starting from around 1995 indeed became both louder and, in effect, noisier. 
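The "louder but harsher" effect is easy to demonstrate for yourself. The sketch below is my own illustration, not anything from Wykes' article: it hard-clips a pure sine tone the way an over-driven brickwall limiter flattens peaks, then measures the strength of the third harmonic, which is essentially zero for the clean tone and substantial for the clipped one. Those extra odd harmonics are exactly the harshness being described. All the specific numbers (8000 samples per second, a 100 Hz tone, 50% overdrive) are arbitrary choices for the demonstration.

```python
import math

def brickwall_clip(samples, ceiling):
    """Hard-limit every sample to +/- ceiling, as an overdriven brickwall limiter does."""
    return [max(-ceiling, min(ceiling, s)) for s in samples]

def tone_level(samples, freq, sr):
    """Amplitude of a single frequency component, via a one-bin discrete Fourier transform."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * freq * i / sr) for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * freq * i / sr) for i, s in enumerate(samples))
    return 2 * math.sqrt(re * re + im * im) / n

sr = 8000  # samples per second; one full second of signal
clean = [1.5 * math.sin(2 * math.pi * 100 * i / sr) for i in range(sr)]
clipped = brickwall_clip(clean, 1.0)  # push the tone 50% past the ceiling

# A pure sine has energy only at its own frequency; hard clipping adds
# odd harmonics (300 Hz, 500 Hz, ...), which the ear hears as harshness.
print(tone_level(clean, 300, sr))    # effectively zero
print(tone_level(clipped, 300, sr))  # a substantial third harmonic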
Go figure, in discussing the digital tools sound engineers could use to make music tracks louder, Wykes says, "These tools meant that, starting in the mid-'90s, engineers could make their tracks significantly louder than before. As those brickwall limiters were pushed harder, the sound became increasingly harsh. (What's The Story) Morning Glory by Oasis is considered a landmark example of detrimental amounts of compression, combined with extreme limiting, to create something absurdly loud." Well, how about that. The noise comes of the compression and clipping effects, the audio equivalent of the nasty-looking artefacts on an over-compressed jpeg. Combine the harsh effect with bouncing and echoing around a warehouse outfitted as a big box store, and the result is bound to be less than pretty. But it is also something a person could get used to if they followed music steadily through those fifteen years and didn't generally find it unpleasant. By accident, I hadn't done this, so part of my response at the big box store was about genuine shock at how awful the noise was. How oddly appropriate that the breaking point for this loudness, clipping, and distortion was achieved by metallica, who made such a hash of the sound quality on the original release of their death magnetic album that their fans wound up reconstructing it using the video game guitar hero iii. I am by no means a heavy metal connoisseur, though I appreciate how the serious bands are amazing at finding ways to be yes loud while still allowing their instruments and vocals to stand out. It is something of a high wire act though, and it was most likely going to be one of them who'd fall and be popular enough for the fall to have some serious impact. (No pun intended.) (Top) Oppressors Love Abusing Language (2026-03-23)
A surreally appropriate result of a wikimedia commons search on 'oppression.' This is a july 2015 photograph by ImagePerson, used under Creative Commons Attribution-Share Alike 4.0 International license via, of course, wikimedia commons.
In her substack article Sandie Peggie (14 february 2025), part of her Free Female Speech Friday series, Kara Dansky quoted a pointedly apt section of Andrew Doyle's substack post from the day before. Quoting Dansky quoting Doyle: "Trans-identified people are among the most powerful in society. They are endlessly celebrated by the media and celebrity classes. They are able to enforce speech codes against the will of the majority of the population in the knowledge that those in authority will support their demands. They are able to call on the police to harass those who dare to refuse to comply. Above all, they enjoy the status of victimhood in spite of being among the safest and least victimised of all demographics. Who knew that being oppressed was so appealing?" Of course, the only people who can seriously and dangerously do such things are the oppressors. While Doyle refers to trans-identified people engaging in this type of abuse, the fact is the primary people doing so are the very ones who have the more entitled positions in a patriarchal society, the men. Overall, it appears the trans-identified men are especially interested in such language manipulation precisely because, on one hand, they don't actually take language seriously when it comes to any suggestion they have obligations to other people who are not men. On the other hand, they consider abusing language to get authorities to torment the women and girls they target as their payback for their commitment to sex stereotypes and subsequent failure to meet their own standards for so-called "masculinity." It is little wonder as time goes on that the biggest practitioners of linguistically-based abuse sound and act more and more unhinged, because they have to keep pushing the envelope. They have to keep finding a new, apparently bigger authority to try out their pretence to oppression on, with a side trip to demand pronouns or prance around in a public women's washroom or something for a top up. 
Please note, the description is of these trans-identified men sounding and acting unhinged. The ones most active in this form of abuse are not unhinged, they are in control of their actions. If it were otherwise, they would not be so successful, as is true of any oppressor. Sandie Peggie's case added the finishing touches to a powerful clarification of the mindset of abusers pursuing this specific form of linguistically-based abuse, a clarification I had stumbled upon while reading a book on a notionally different topic. Of late I have been working my way through a number of classic studies of the relationship between scientists, technicians, and society, many of which were published between 1969 and 1985. On this topic, the unexpectedly relevant text is Joseph Weizenbaum's Computer Power and Human Reason: From Judgement to Calculation, first published in 1976. Yes, Weizenbaum is unpacking his observations and concerns about male student computer programmers of the time. But his observations are not only applicable to them, and he spends an important portion of the book considering the implications of how computer programmers experience a feeling of power through their coding. It is his reflections on the nature of this feeling of power that are most relevant here: "The conviction that one is all-powerful, however, cannot rest; it must constantly be verified by tests. The test of power is control. The test of absolute power is certain and absolute control. When dealing with the compulsive programmer, we are also dealing with his need to control and his need for certainty." (121) A few paragraphs earlier, Weizenbaum notes that the closest parallel to this mindset of the compulsive programmer is the compulsive gambler. But the point is the broader perception such individuals have of their personal power over some other, especially some specific object or person. 
To maintain a sense of absolute power over a computer or the other machines and cards in a casino is hard enough to achieve and intoxicating enough. To create and maintain a sense of absolute power over other human beings is something else again. But the very demand for constant tests of the apparent power lets slip that the conviction of being all-powerful is not a conviction at all. After all, if the person claiming the conviction were so certain of themselves, there would be no reason to test their pretence to power over the objects or persons their claims centre on. Of course, this does not mean oppressors determined to reassure themselves they are powerful can't cause distress and outright destruction. Clearly they can, so long as enough other people sympathize with them and go along with their demands. This must be the stubborn grain of sand irritating in spite of the oppressor's sense of superiority: their power does in fact depend upon others; it is not wholly situated within themselves. Hence part of what they need in approaching each test of their sense of power is an attendant power of persuasion, preferably by means of words. Oppressors generally dislike "persuasion" in the form of physical violence unless they are part of a group all attacking together. Otherwise they might risk personal injury or getting caught out should the victim turn out to have some genuine capacity for fighting back. Oppressors depend upon us being too afraid to lose the sorts of things they value, let alone the necessities of life and self-respect. What oppressors can't seem to absorb, hence they remain oppressors and abusive, is the reality that they are not all-powerful and even those they consider most powerless will fight back with all they have if they deem themselves unavoidably cornered. (Top) Lateral Thoughts From an Ancient Image (2026-03-16)
Quote of the Alamy photograph of the Dolní Věstonice portrait head, courtesy of LiveScience. The original carving is 4.8 centimetres by 2.4 centimetres, a bit larger than a nine volt battery. Click to see a larger version.
Early last year, one of the latest publications on a tiny carving from a site at dolní věstonice in the czech republic raised an understandable stir. As Kristina Killgrove wrote at livescience, among the artefacts recovered from the site is what is now understood to be the oldest known human portrait in the world. Not only the oldest human portrait, but a portrait of a woman buried at the site whose skull was recovered decades ago, and has recently had a new facial reconstruction completed from it. Alas that the more detailed paper on the facial reconstruction is not widely available, because it is actually breathtaking to look at it alongside the carving. The carving is dated to approximately 26 000 years ago, which means by some extraordinary fortune, a definite portrait rather than an abstract work has made its way to the present. Absence of evidence is not evidence of absence, so we can't claim nobody was creating portraits or other more literal representations of individual people at that time in that part of the world. But if they were, it seems likely they were not making very many, because this carving appears to be utterly unique, unless there are others not yet recognized in boxes of incompletely catalogued and cleaned artefacts. That aside, at this time of hyperindividualism in the so-called "west," this present context might seem in and of itself to be what would give this carving an additional significance. It could be, but it also seems to me that the connection between the carving and a verifiable person gives an additional sense of immediacy that is not typical of such ancient items. Or maybe it is simply that we humans have a cross-cultural tendency to find great significance in objects endowed with eyes or representations of eyes. Referring to a cross-cultural tendency can be dangerous because there are not very many verifiable examples. In this case though, the evidence is reasonably solid. 
To start, consider the west asian amulets to ward off the evil eye, which are not only still in wide production and use, but are also found in the archaeological record of the region. Or consider the doll-making practices in the Blackfoot and Haudenosaunee Confederacies, where dolls are not given eyes. There are both practical and spiritual reasons for this. To give eyes to a doll or other object with a face gives it life, or a more conservative person might insist, a semblance of life. Either way, a doll with staring eyes can be quite frightening to a small child at night. Curiously enough, these ideas seem to be part of the deeper belief systems of europeans and european-descended peoples in the americas as well, considering the infamous series of "Chucky the evil doll" horror movies. Another archaeological example I am aware of is a type of sumerian statue depicting a person with hands clasped and their eyes painted or sculpted wide open and unblinking. These statues are associated with temples, strongly suggesting a link to spiritual practice. Of course, none of this is coincidental. As social animals with three-dimensional vision, we also have highly expressive faces, especially via our eyes and mouths. In the course of any personal interaction, our eyes do a lot of work in expressing our emotions. A mismatch between expression in the eyes and the rest of the face can be deeply disturbing or merely awkward. As many of us have rediscovered over the years of the COVID-19 pandemic, much of our smile is in fact expressed with our eyes, so much so that even if a person's mouth is covered, it is possible to see when they are smiling. The people who try to claim otherwise are at best being disingenuous, but all too often are simply being bullies. It is also worth noting that the importance of eyes for reading mood and social cues is part of why many first attempts to draw a human face are out of proportion, with over-sized eyes and mouths. 
There is a certain irony that when we learn to draw we first have to figure out how to draw what we see, not how we interpret what we see. It is quite counterintuitive for anyone who did not find that drawing came "naturally" to them in school. (Betty Edwards' Drawing on the Right Side of the Brain is rightly revered for her accessible description of how we see versus how we draw.) This puts me in mind of science writer Lynne Kelly's latest book, which I have not had a chance to read yet, The Knowledge Gene. Kelly has already written several recent books on mnemonic systems, which this book builds on. Its relevance to this thoughtpiece though is given by a quote from her own description of the book, which "offers scientific evidence for what so many people already know instinctively: that art, music, performance, story and our connection to our surroundings are fundamental to be[ing] human. But it goes even further to show that these ancient, innate, universal and uniquely human skills have been fundamental to human culture for at least 70,000 years and are critical in storing information in a way no other species can manage." While I have shared the focus of the articles on the dolní věstonice carving in terms of its uniqueness as an ancient surviving portrait, perhaps this is in a way too specific. After all, the majority of ancient artefacts that most hold our attention are decorated if not wholly an artwork. When we encounter evidence of human creativity in this way, we are in fact unmistakably encountering traces of what makes us distinctively ourselves. (Top) A Visceral Response (2026-03-09)
Library of Celsus in Ephesus, photograph by Herbert Weber, april 2014. Image courtesy of wikimedia commons, used under Creative Commons Attribution-Share Alike 4.0 International license.
No doubt there are exceptions – there are always exceptions – but overall, I think it is fair to say that in most places today people are horrified by the destruction of libraries. By this I mean libraries in the common english language sense, a place where many books and other sorts of reading matter are collected together (at minimum) so that they may serve as a shared resource for a group of people. I also mean libraries in the more extended sense of the collected and carefully curated knowledge preserved and shared by trained community members in many cultures, who may also maintain ordered collections of encoded objects. The non-trivial number of people who shrug their shoulders at the prospect of the shutdown of the local public library on various rationalizations has always been with us to some extent, and it seems to me, based on an assuredly very small-scale survey, that such people have a pair of especially prominent beliefs about libraries. First, they seem certain that libraries, especially the public kind, are in fact a waste of money and not really appreciated by anybody who uses them. Second, they frequently decry any book or document that is not in direct use for a work-related purpose. These two beliefs suggest a deep sense of pessimism about other people and arguably the future of their societies. They are also very instructive, because in spite of the apparent skepticism about the utility of libraries in any form, these beliefs reveal a recognition of the role they play in the transfer of cultural knowledge and therefore connection between generations. Libraries express an optimism about and commitment to the future. So people aren't being sentimental or romantic when they express alarm or horror at the destruction of a library, however modest, and however obnoxiously restricted in access. 
We also should not lose sight of the fact that literacy itself is such a powerful tool that people to this day are actively blocked from learning to read and write, by direct personal violence and by deliberate restrictions on their ability to find teachers or any books to read at all. Since libraries are in fact so integrated into how cultures and therefore community values and history are passed down, people are far more passionate about them than might be expected. Concerns about the potential suborning of libraries are not new either. It is reasonable to be troubled in "western" countries where ongoing attacks on the very social fabric, let alone social services, have led to more and more of these services being provided by public libraries. The librarians are seeking to carry out their responsibilities as members of their communities through maintaining the utility and safety of the libraries, which does mean at this point, like it or not, supporting job searches and providing places where homeless people can spend time safe from the weather, able to read or use the computers to look for work and do other tasks. It is not bad that libraries provide such assistance, but it is bad that in effect they are being used as a last resort due to willful and ideologically driven defunding of public services. In other words, what libraries are expected to do reflects broader policy decisions and problems. The contents of library collections follow the current media technologies as well, so that over time the earliest days of lendable recordings, which in my experience were cassettes, have given way to CDs, DVDs, and now different types of "ebook" lending. These days I have observed even graphic novels and comic books included in library collections, which not everyone agrees with, regardless of which specific items are available. Libraries are, like people in general, unavoidably political. 
It is a fact that people can and do read, watch, and listen to materials at libraries that various authorities may disagree with. When questions of authority and accessibility get into the mix, it can be easy to conflate things that shouldn't be. For example, far too many people are conflating access restrictions based on age with censorship. No, there is good reason the children's section does not include adult-themed or sexually explicit materials, and good reason for kids entering their adolescence to need a parent or guardian to sign off on them borrowing, listening to, or viewing certain items. It is reasonable for people to be concerned that performances and talks in libraries that may include sexually explicit material or costuming should not be provided to a general audience including children. Protecting children, and respecting the different developmental needs and interests of children and adults, is not the political part. The struggle over whether librarians should effectively be allowed to overrule parents or shift the general atmosphere of a given library branch to something closer to a venue encouraging sexualized displays is the political part. Politics is after all, very much about power. Who a library belongs to, who runs it, what information it takes part in spreading, these are all important and contested elements. Meanwhile, librarians typically come to their profession with a sense of social responsibility and commitment to contributing to the righting of social wrongs. From this perspective, it is little wonder so many people have a visceral response to libraries and what happens to them and in them, if they are literal places. It is also sadly understandable why there are people who throw up their hands and conclude maybe there shouldn't be libraries, or libraries of a certain sort. Political conflicts are tough, and challenges to authority are always threatening to some degree. 
But like it or not, we must face up to the political and authority challenges. And we need libraries. (Top) Staying In Your Lane Can Be Hard (2026-03-02)
Photograph of numbered lanes in são paulo by Teo Romera, december 2008. Image quoted from Romera's flickr photostream on 14 february 2025.
The current state of public discussion and debate is terrible. The difficulties are about more than just the continuing crackdown against actual "free speech" let alone "fair speech," although at the moment the most widely recognized sources of trouble are so-called "social media" and the rest of the media, which is generally a corrupt mess in the putative west. These are two real factors, and yet, they may not be quite as important as we are encouraged to think. The reason I suggest this is an interesting reading experience, in which a scientist known for demonstrated brilliance in his field produced some of the most ridiculous, harebrained, and evidence-free commentary on a different field I had ever heard. I am not an expert in the other field this scientist was speaking on, but what he said was strange-sounding enough to me that some basic research and original source checking was in order. Of course, this is always a good idea when seeking to make sense of an area outside our own experience and expertise. But what the hell happened to this scientist that he was pontificating apparently without having done any due diligence research into the topic? Otherwise he would not have been stating ridiculous, harebrained, evidence-free things. Even though any of us can pop off about something without being especially informed about it, we are encouraged to expect a scientist to do better and therefore speak with authority based in checking the receipts, so to speak. Failing that, a naive person would expect a scientist or other authority figure to demur from commenting absent consideration of real evidence. But then, such an authority figure might deny lacking any real basis for their opinions, and do so in the teeth of any suggestion or even proof to the contrary. This indicates the crux of this specific issue. 
Let's step back a moment and consider the thoughtpiece as a form, which is of course just that of a brief personal essay consistent with the genre still most famously written in by Michel de Montaigne. The topics of these articles don't range too far afield, yet they are explorations of issues and ideas, and a subset of them are certainly removed from my immediate expertise. They are definitely positioned as and intended to be no more than explorations, not authoritative sources, although I do try to include intriguing links where that makes sense to help readers delve deeper into the subject themselves. Furthermore, I don't try to confabulate or misrepresent what I have learned or found. While a random writer on the web should not be deemed a particular authority, even so I would be leery of presenting a more forceful discussion or explanation outside of my own genuine areas of knowledge and experience, and then my inclination is to make sure to provide as many links and references as possible to support further exploration and checking receipts alike. My say-so alone can't make me an authority to others who don't know me, although I could build some authority in the case of specific topics via a record of well-written and referenced pieces. Nevertheless, it is not likely people would take too seriously much of what I might say about other things with less apparent evidence of more than surface knowledge. They might greatly appreciate any links I may provide and words or phrases helpful for plugging into search engines, which would be pretty neat. It can be terribly tempting to go running out of my lane based on strong feelings or preconceptions, just as it can be for anyone else, but I do strive not to. This tends to curb such awful mishaps as episodes of foot in mouth and the like, which past a certain age are not easy to recover from. 
Okay, so now back to people like that voluble scientist foraying into other areas seemingly without too much background to help him out. The issue is not just a scientist thing, but an apparent "authority" thing. In the "west" such as it is, it appears the only thing a person really needs to be treated as an authority in pretty much any topic suggested to them is to be either rich or famous, plus male and considered "white." Frankly, this is really weird. To become famous typically involves either dumb luck or inheritance; these are not states indicative of anything else about a person. Unfortunately, to become rich usually entails criminality and exploitation, so mere luck is not as strong a factor there, and of course a great many rich people didn't "get" rich at all, legally or otherwise; they inherited money and a place in a social network full of other rich people. Yet, scientists can seem like they should be an exception, especially the ones who manage to win such sadly diminished prizes as the famous Nobel. After all, such winners are supposed to be geniuses. Well, certainly they are in their own fields. But as geniuses in chemistry or physics or whatever, they cannot be expert in every possible topic on Earth. Still, basic day-to-day competency is well within reach via research or basic consideration of more than such eye-catching but nearly information-free things as clickbait headlines or ill-written news articles and listicles. But I do get it. Being treated as an "Authority" with a capital "A" can go to a person's head. For famous scientists or anybody else to stay in their lane doesn't mean never having an opinion they might share with the press and general public on an area they aren't experts in. But it does mean refraining from comment when unable to base it in real information and checked receipts. 
Due to how quickly and far media can spread information, we all have a duty (rich, famous, neither or both) not to act effectively in the role of the boy who cried wolf. Once the mediasphere is polluted, it is incredibly difficult to counter nonsense, confusion, and outright falsehoods, which, as we all know too well, are dangerous to society. This is true even if staying in our lane seems like taking the ridiculously long way home. (Top) Thoughts on "Technological Change" (2026-02-23)
Wellcome Trust image of a mid to late nineteenth century toothpaste pot via wikimedia commons, under Creative Commons Attribution 4.0 International license.
In the course of his lectures on "Reading Marx's Capital," specifically in his series on Volume 1, David Harvey refers to technological change. Technological change is one of the factors affecting the productivity of labour and therefore how much profit capitalists may extract from the workers completing the labour. It seems to me that the usual understanding of what "technological change" refers to is changes in the tools used to make commodities for sale. To this day the canonical examples are machinery that automates away labour, jobs, and quality of the product. The famous early examples are all the various spinning and weaving machines, brought in quite overtly to increase production while throwing people out of work, even if the product was of such poor quality the factory owners left it to sit and rot. In their view, breaking any ability workers might have to resist being exploited was worth any short term price. Now, whether or not we all agree about why and how machinery was brought in during industrialization, still, the points about automation and increased speed and mass of commodity production are not contested. But, somehow in his discussion of technological change, Harvey briefly mentions, as an example, different types and brands of toothpaste. Now, while I think this simply must be a mishearing on my part, it ends up being a rather interesting one. Different types of toothpaste could be an example of a technological change intended to improve the commodity, although we must always ask "improve for whom and for what?" It would not surprise me to learn that some vaunted change in toothpaste formulation was not intended to improve its efficacy in cleaning teeth and maintaining oral health but to make the stuff run faster through the pipes and nozzles in the machines filling the standard toothpaste tube. And that raises another question. Is advertising itself an expression of technological change, in terms of the packaging and labelling of, in this case, toothpaste? 
For the purpose of this thoughtpiece the idea is to stay close to the immediate commodity, rather than wandering off into the broader advertising industry and such. As the illustrating photograph suggests, there is real technological change when it comes to toothpaste packaging. Originally, if a person bought it ready-made, it came in the sort of shallow small jar typical of creams in the same period. But toothpaste is after all a paste, which combines dry ingredients with a liquid, rather than a cream, which usually combines a distilled oil or similar with a base animal fat or similar. While polishing paste for dishes and cutlery didn't present a major problem when packaged in shallow or deep jars, as long as their tops were not too narrow, because it was applied and used with a cloth, toothpaste didn't work so well in jars, especially once toothbrushes became the standard for cleaning our teeth. Getting enough toothpaste on the brush and not making a thorough mess was far more difficult until the squeezable tube came along. So that definitely counts as a technological change, with applications in many other areas. Until very recently, these tubes were made of metal, so they needed spiffing up to prevent them from rusting and to help counter any tendency to brittleness as the tube was squeezed, bent, and eventually folded or crushed as it was emptied. Furthermore, toothpaste sellers wanted to ensure their brands were recognizable and difficult to detach from the tube. Okay, figuring out how to paint and later attach water-resistant paper labels, then plastic labels, and then the shift to plastic tubes, those are all technological changes. Those changes would then influence changes in machinery to cut out as much direct human handling as possible in applying the newfangled labels and filling the tubes. Okay, but what about the toothpaste labels, how they changed with time and therefore advertised the toothpaste: does that really count as "technological change"? 
It really seems to me that it doesn't. The printing and application of the labels certainly can be reframed as examples of technological change. A person designing new labels periodically for use is an example of the application of industrial art design, which has lost the majority of its detail and elaboration since the late nineteenth century, in part because it is a mass product now, militating against encouraging people to keep the emptied packaging. The growing move towards enforced obsolescence by adopting cheap packaging resistant to reuse and too ugly once emptied indicated most emphatically that the packaging was not to be a status symbol acting as a more or less subtle advertisement in the home or office. On top of that, today there is minimal difference between the different brands and kinds of toothpaste available. Even such novelties as gel toothpastes and alternate flavours like cinnamon have all but vanished. Most of the time, with exceptions from "healthy" and "organic" companies, you can pick any flavour you like as long as it's mint. There are various gimmicks and claims about what is supposed to differentiate the toothpastes in other ways ("whitening," "for sensitive teeth," "prevents cavities," etc.). In my experience even the dentist is not too specific when it comes to toothpaste recommendations, except for brands claiming to be good for sensitive teeth and warning people off of habitual use of whitening toothpastes. I still don't see how merely having different brands or types of toothpaste represents any sort of real technological change. Combining different flavourings, altering levels of fluoride or adding peroxide are not new ideas at this point. Merely changing the appearance of packaging is not a real technological change either. 
Since the packaging is increasingly less forthcoming in terms of useful information, even if we stretch and consider the design of the content of the labels as technological change, there has been little if anything meaningful on that score either, except for the packaging to get worse. (Top) Attention Span Debates (2026-02-16)
Photograph of Le Pont Tivoli while closed, by Christian Ferrer, 2013, via wikimedia commons under Creative Commons Attribution-Share Alike 3.0 Unported license.
Among today's steady stream of space-filler articles and alas often dubious listicles is an entire genre of "attention span" interest pieces. Based on my observations, they have apparently distinct threads for small children versus teenagers versus adults, and they often trace all ills back to screens, although some more recent ones keep trying to invoke something-something-covid. I have also heard and seen colleagues and public speakers referring to their personally reduced attention spans, usually along the lines of something like "I just can't seem to read books anymore" or similar. Overall the media versions strike me as attempts to gin up a moral panic, which is incredibly frustrating because there are real questions and concerns about the ways handheld computers are used, including as a questionable early child entertainment box. But this didn't start with computers, and many of the same themes can be found with somewhat different labels but applied to television and before that movies and comic books. When the attention-depleted are presumed to be adults, the themes are quite different. I have been struck by how often the articles and discussions turn out to be not so subtle demands for adults to be "more productive," effectively accusing them of slacking off on the job. While I have no quarrel with making available useful techniques for dealing with an onslaught of too many tasks in too short a time via the trendy pomodoro method or the bullet notebook, all too many of these things are corporate products clearly intended to reshape our behaviour to improve our exploitability by corporations. But this raises the question: are people experiencing attention span problems these days or not? The seeming veritable adult epidemic of late-diagnosed attention deficit hyperactivity disorder (ADHD) might suggest the answer is yes. 
However, as the late creativity and education expert Ken Robinson observed, our current conditions in industrialized nations are among the most hyperstimulated in recent history. He was specifically talking about children, so many of whom are diagnosed with ADHD and prescribed powerful drugs intended to make them focus. This strikes me as a massive social experiment in which children and now also adults are subjected to these drugs with no real idea of their potential longterm neurological and physical impacts. It does not seem like these drugs are prescribed on a temporary basis to assist the child or adult while they learn how to improve their ability to concentrate, or to identify any specific environmental or social conditions that may be triggering the issues and could be improved. It is understandable that many people of all ages have a great deal of trouble sitting still and "concentrating" in conditions where they may need to cope with constant noise and visual distractions, tedious and unchallenging tasks, and a manager or instructor who has no time or no respect for them or both. Nobody can concentrate in the midst of all that. Managing a classroom full of kids is hard to begin with, and all the worse when in many schools their outdoor recesses and physical education classes have been cut, so they have almost no chance to burn off energy running or walking around, climbing, kicking a ball around and other outdoor activities. They are often heavily restricted in what they are permitted to read and study for the sake of standardized testing. Now younger adults may come to post-secondary education expecting to sit in class surfing on their phone or laptop instead of participating. All too often adult students, instead of answering a question for themselves, opt to "google it," which takes them out of work with their peers in class to hunt online instead. 
Having acknowledged at least some of the many external distractions, it hardly seems fair to imply a broad inherent loss of attention span. It is telling that advertising corporations openly discuss how to demand and hijack our attention, and more and more often they resort to the addiction-encouraging techniques developed to improve business at casinos. My observations of actual people indicate the concern they have is not a moral panic or search for a diagnosis or a productivity boost, but a desire to defend themselves from this onslaught. Certainly I have heard more people questioning whether they need a so-called smartphone, and why they should have an always internet-connected device of any kind. Admittedly this is most emphatically a northern north america sourced perspective, and I am not living in or familiar with the larger cities typical of the northeastern united states. Some of the pictures and videos I have seen of cities plastered top to bottom with advertisements, including many elaborate LED billboards playing videos on repeat, appall me. Yet I have not heard people concerned about distraction and lost attention span mention these, so perhaps these recede rapidly into the background after all for most people. Thinking over what people have said to me and my observations of the behaviour of my own students, together with guidelines on how much reading instructors are recommended to assign, there is an important aspect of the attention question not often considered. Endemic to education and work is the use of extracts, snippets, and what amounts to glorified lists. At my other day job, writing instructors insist we should not be using long or complex sentences because they are "too hard" for the general reader. We should not be providing so much detail or expect the "general reader" to cope with documents much longer than fifteen pages at best. 
Something has gone seriously wrong here, because neither real academic work nor the professional analytical work I do can be adequately presented in an extended listicle or without some complexity. My work does not lend itself to "tweeting" or other "social media" posting, and frankly, it shouldn't. The thing about attention, especially reading and watching attention, is that to be able to have a longer attention span, we do need to practice. I remember how excited my classmates were to begin reading chapter books for the first time; that was a hallmark of growing up, and for many it also marked when they were allowed to choose their own books to read for the first time. There are still comparable adult experiences via choosing to study a subject or read novels or non-fiction of more challenging types than those written according to formula. That's a change in mode, and it is natural to be unable to read such books for very long at a time at first. Experience and practice speed things up for adults just as they do for children. There are parallel cases for analogous movies and plays, or learning and applying skills and trades. Sometimes it seems like the adults worrying about their attention spans have forgotten this. (Top) Lyric Tropes or Rhetorical Figures? (2026-02-09)
Counsellor Comma from the 1824 children's book Punctuation Personified, via the public domain image archive.
Among my various readings in the course of completing my most recent major academic project, I managed to borrow a copy of the brilliantly titled and genuinely interesting, though occasionally hard going book, Do Metaphors Dream of Literal Sleep? A Science-Fictional Theory of Representation by Seo-Young Chu. I have mentioned it from time to time and wrote an essay about Chu's theory and how she applied it. It doesn't seem like Chu has had an opportunity to write a more approachable follow up, which is unfortunate because she draws out material that science fiction readers, and speculative fiction readers more generally, may be prone to missing. For english language science fiction specifically, Chu's use of what she refers to as a lyrical approach indirectly demonstrates how particular features of these works hark back to what may be unexpected origins. She was focussing on the ways "lyric figures" are taken literally and become important elements of science fiction, and includes a brief discussion of science fiction poetry, which is a unique challenge in part because it needs to find new lyric figures. She provides four main examples at the beginning of her discussion of specific lyric figures and their science fictional reflexes: apostrophe with telepathy, synesthesia with paranormal sensation, personification with animation of a humanoid artifact, and catachresis with dissimilar combined images taken literally. It is interesting she did not provide more generalized science fictional reflexes for all four, as personification could be reflected as animation of any artifact and indeed taken as the concept behind contact with other than human consciousnesses. I admit catachresis is trickier, but the by turns humorous and sinister uses of names in science fiction might be just right for the job. Take for example the eponymous space ship in the sci-fi situation comedy Red Dwarf, or just about any title from the oeuvre of Philip K. Dick. 
Still, something seemed oddly familiar about these "lyric figures." On one hand yes, they can definitely be found in poetry, including many examples from other languages than english. On the other hand, they don't seem necessarily about poetry in the usual sense we take the term today as performance. At that point the other shoe finally dropped. What Chu was calling lyric tropes, I was familiar with as rhetorical figures, and anachronistically we could refer to poetic performance as a subtype of general public speaking. (Anachronistically because the evidence from ancient texts and ethnography suggests poetry came first as a memory aid, and this turned out to be an excellent communication multi-tool.) Explicit instruction in spoken and written rhetoric is not very common now, especially in elementary and secondary school where for a short while it was a commonplace. A minimal amount of instruction in written rhetoric is still part of north american curricula, but anything more is now considered a "frill" and therefore accessible only as an extra or for additional fees. That is, unless the student in question is attending a private school or university, in which case many programs will include such instruction as a basic requirement. Setting aside the socio-political implications of that for the time being, one of the benefits of the web is that it is in fact possible to learn about spoken and written rhetoric via well-written and presented websites. Among the award-winning sites on this topic out there is silva rhetoricae, which is full of modern language examples and is a rare example of excellent use of a frames-based layout. The "Flowers" menu on the right hand side provides a full list of the names of rhetorical figures, among which Chu's examples duly appear, except for synesthesia. But this is not actually a gap, because practically speaking synesthesia is a specialized type of metaphor, and metaphor is among the "Flowers." 
I am nowhere near up to date when it comes to current science fiction or speculative fiction more broadly, so perhaps authors have already found and applied as many other rhetorical tropes to reframe literally and build stories around as are available. It is tempting to assume the various sorts of word misuse would be more than enough; however, this is a powerful tool best applied carefully. An author can get away with a clever title like "To Serve Man" once or maybe twice; after that it becomes cloying. Inexperienced authors can end up in trouble should they apply this as a conceit such that the story is more akin to something enjoyed once, then both thrown away and forgotten. Then again, going through the rhetorical figures doesn't reveal too many other obvious possibilities without delving more seriously into the details, which suggests it would be tough to make use of others without making for a very contrived story. It would be artsy in the pejorative sense, and not much fun to read. (I shudder to think what it would be like to write, actually.) Such flights of fancy aside, the more important point is that even the most apparently "modern" and "brand new" ways of writing in fact do have connections back to much older ways of remembering and storytelling. Much of the more recent science fiction I have read by settler north americans has an odd artificiality to it even as the writer works incredibly hard to build an absorbing secondary world. I begin to think this is because so many of them either do not have a clear grounding in their own histories, or they have been persuaded they should avoid such grounding for marketing purposes. This is deeply unfortunate, as I think it renders what could have been amazing work into a shell the writer ends up putting together for the readers to bounce off of, with hopefully enough hints of something more to keep them coming back. (Top) The Hollow Earth is a Surprisingly Old Idea (2026-02-02)
The figures from the page just after the end of Edmond Halley's paper "An Account of the Cause of the Change of the Variation of the Magnetical Needle with an Hypothesis of the Structure of the Internal Parts of the Earth: as it was Proposed to the Royal Society in One of Their Late Meetings," pages 563-578 of Philosophical Transactions of the Royal Society of London, 1692. See the full pdf at the internet archive.
One of the great joys of working with older documents and sources is that they turn up unusual research rabbit holes. The most recent examples of these I have encountered started from a reference to "the Indo-European notion of the stone sky" with no source cited. It is clear why there is no citation: the author was a philologist working on indo-european languages writing for her colleagues, who would already know the details. I was aware of sky imagery from west asia based on the notion that it consists of a vast metal bowl punched with holes like a sieve through which the stars shine at night or rain may fall. People evidently didn't mean this literally; this was and is a trusty metaphor. A metal sky is not so different from a stone one, and really either could be slyly argued for based on meteorites, which, as is widely known, may be metallic, stony, or even both. Indo-european philologists argue the connection between sky and stone ties first to the notion of the sky as a covering, and then by reference to deities who make the sound of thunder and provide fulgurites and meteors. The intuition that the sky is a sort of protective cover has turned out to be a correct one, now that we have learned about the ozone layer, among other atmospheric components. To this day the notion of a stone sky has persisted all the same, via the notion of the hollow Earth, though so far as I can tell, attempts to argue for a hollow Earth often assume we live on the outside of it rather than the inside. Eric Grundhauser at atlas obscura identifies Edmond Halley's turn of the eighteenth century attempt to explain why magnetic declination changes over time as the earliest scientific speculation about a hollow Earth, describing the idea as "strange." However, I can't agree with this characterization because it does not take into account the general intellectual framework of Halley's time, which is not too difficult to learn about these days. 
Before delving briefly into that though, it is important to note that Halley's suggestion that there must be internal spheres with magnetic properties contributing to the magnetic anomalies is fundamentally correct. Today we refer to those spheres as the inner layers of the Earth, including the upper and lower mantle and inner and outer core, of which the current scientific consensus holds the metallic core to be the source of the Earth's changing magnetic field. It wasn't until we understood seismic data well enough to plot it and broadly reconstruct the Earth's interior from it that we could confirm from indirect observation, as well as developed understanding of planetary structures overall, that there are no gaseous layers or spheres within the Earth. At the time Halley gave his talk to the royal society of london, not only was none of this information available yet, but scholars also still held religiously-based beliefs about the Earth and existence more generally. Halley refers directly to the assumption that a planet and its parts must exist for the purpose of supporting life. It's a weak assumption by our lights, and based on the published version of Halley's talk, he wasn't impressed with it either, even though he did fall back on suggesting life on other planets or within the Earth need not be similar to what we know. He also freely cited poetic references to underworld visits as possible interpretations of what could be inside the hollow Earth. Considering his words immediately following those citations, "And though this be not to be esteemed as an Argument, yet I may take the liberty I see others do, to quote the Poets when it makes for my purpose," I suspect he was taking a jab at other poet-citing society members more than anything else. He repeatedly calls upon ships' captains to take measurements of magnetic bearings more frequently and consistently, indicating he was fully aware of the lack of real data to test his proposal with. 
(Readers interested in a more detailed discussion of Halley's paper may be interested in geographer Duane Griffin's article "What Curiosity in the Structure: The Hollow Earth in Science," which is accessible via the internet archive.) Overall though, it seems the hollow Earth is a preferred place for poets and later novelists and screen writers to spin out examinations and critiques of their societies. Peter Fitting at public domain review summarizes several early novels featuring utopian societies and otherwise lost worlds within a hollow Earth, for example. Among the very earliest, poetic versions of such reflections by means of a more or less ordinary person visiting the underworld are enshrined in the greek Odyssey and its roman knock-off, the Aeneid. The attraction of the underworld and its partial rationalization as the hollow Earth for such social reflection and critique seems clear enough. If such a place were real, it would be difficult to access, and therefore its ability to affect our usual world would be restricted. The writer could skirt around any potential risks in critiquing their society directly, while providing a clever in-story rationalization for why nobody seems to have come from that other world to interact with or enforce changes on this one. Still, this doesn't seem to really explain the persistent appeal of the idea of a literal hollow Earth to some people in the united states from the nineteenth century down to the present, starting with a person named John Cleves Symmes, Jr., who not only believed the Earth to be hollow, but also published a circular declaring it so. Perhaps the appeal is the suggestion of another world these people could go to in order to get away from this one, preferably without dying first. Understandably the difficult socio-political issues of then and now make decamping to somewhere else where all such problems are already solved seem tempting, or at least fun to imagine. 
Yet, to take such an idea literally strikes me as an act of radical irresponsibility. As several novelists whose protagonists visit the interior of a fictional hollow Earth argue, the trouble with our socio-political problems is that in striving to leave them behind by going someplace else, we inevitably bring them with us. (Top) Value Versus Effort (2026-01-26)
Hablot Knight Browne illustration for an 1872 edition of the novel Barrington. Image courtesy of oldbookillustrations.com.
With the ongoing controversies over so-called social media, the question of "accessibility" keeps coming up, alongside the more important issue of "transferability." Now, to be clear, the "accessibility" I have observed being discussed most is not about whether people who live with various sorts of physical and mental disabilities are able to participate in "social media," but whether the barriers to participation should be high or low. Transferability or whatever we ultimately all end up calling it refers to the ability to pick up your profile, messages, posts and such, and go somewhere else. It is roughly analogous to the ability to port a cell phone number between cellular service providers, which originally wasn't possible, then was but was expensive, and now is provided as an option as a matter of course. So far nobody is suggesting that sort of sequence for "social media" because it is presented as "free" in order to facilitate massive data mining and forcefeeding people with various types of propaganda, starting with the advertising that is meant to lower resistance to other forms. The same people are inclined to argue that nobody will pay for "social media" or other services or products provided via the internet, even though we have ready evidence that this claim is bullshit every day. Substack is the current poster child, but before that was kickstarter, and before that the earliest online small-scale storefronts. People are willing to pay for what they deem is not garbage or offensive to them in other ways if they have means to do so. But the people who piously declaim about how "free" services are eschewing an inappropriate barrier don't talk too much about how they make deals with computer and cellphone manufacturers to preload their pisspoor applications, and prevent the people who wind up with those computers and cellphones from deleting those applications. 
It seems the main "social media" moguls especially love "free" so long as we have no choice about whether to use their dubious products or not. Of course, this is completely consistent with the "you can choose anything you want as long as you choose what we tell you to" attitude so typical of the various people who deem themselves "elite" and smarter than the rest of us based on how much money they have been able to steal or pretend to have on paper. But let's go back to the particular version of "accessibility" at work here. I have written before about how the rapidly expanded access to the web, especially its widely hyped version as "web 2.0," does correlate with a serious drop in the quality of websites and general participation, whether that participation is via some form of "social media," blogging, or website creation and maintenance. I suspect it is more fair to call it a correlate than a cause, or at the strongest a contributing cause. After all, the eternal september phenomenon is one that any longterm online community needs to deal with. A burst of unsocialized newcomers whose lack of knowledge can be taken advantage of by maliciously-inclined resident trolls is an unfortunate fact of life when the online community is not predominantly composed of people who know each other in real life. To be clear, in my view this is not because people can be anonymous on the internet in a way they cannot necessarily be in person. No, the issue is that it is much harder to provide the same type of body language information that enables us to understand when someone is joking, being sarcastic, genuinely so bewildered they are asking what seems like an astonishingly simple question and so on. This is the original, practical impetus behind the early emoticon phenomenon, the impromptu development of additions to text messages like *cough, cough* and widespread agreement that TYPING IN ALL CAPS IS EQUIVALENT TO SHOUTING. 
Before the wretched effort to place a whole morass of emojis into unicode succeeded, this was far more effective than might be expected. Now, I do not think that people only value what they literally pay for online. There is plenty of evidence against that, from people who undertake volunteer moderation duties to those who give back by helping build and maintain projects like the internet archive or wikipedia. However, I do think that when people are discouraged or even blocked from actively contributing to the web or other internet-hosted activities they may take part in, this is generally destructive. It is true that to begin with, much of the time people had to take active steps to access the web and join the progenitors of today's "social media" such as usenet and other bulletin board and messaging services. The steps could be a bit fiddly or confusing, but completing them reflected a moment of proactive engagement, which we humans are generally quite fond of, and it indicated there were more possibilities for participating and even creating something to share. It wasn't necessarily competitive, although it could be. The internet in general allowed for artistic, hobbyist, and professional uses, and yes, when nobody corporate quite knew what to do on the web or took it seriously, it was better. By which I mean, the genuinely useful concept of the search engine had not been suborned and then subsumed into an advertising and spying vector. Nobody had yet been totally fooled into the belief that building websites with just a text editor was hard, and later the early blogging software properly supported rss feeds as a matter of course and was far more interlinked. I have the impression that earlier on blogging software was not as rigidified by templates either, at least in terms of appearance and layout. 
Unfortunately, blogging software became part of the problem, as various "platforms" became centralizing behemoths via their "free" tiers, many of which are now either dead and gone, moribund, or suffering various levels of political censorship and manipulation. The role of centralization in the transformation of the web specifically and the internet more generally into an all too often actively hostile virtual space is key here. Many writers, podcasters, and computer technicians of all sorts have discussed the role of advertising and surveillance corporations in this. However, the critical role of the sometimes humble, sometimes emphatically not so humble internet service provider (ISP) is not examined or acknowledged nearly enough. Like many people of my age, my first regular access to the internet came via my post-secondary institution, and second via the public library. My first home internet access was via dial-up modem, without real limits since even if it had been feasible for me to stay online for hours at a time, I was not involved in moving major amounts of data or early online gaming. This used to be pretty common outside of the computer science circles that included serious programmers, engineers and the like. But there was one more thing ISPs used to do that they don't anymore, and that is provide a modest amount of webhosting space with every subscription. For many people, this was their means of joining the web as a more active participant via a small personal website. This is something that ISPs should be required to do, but they are not, which helped channel people into the tar pits of such dubious early free web hosting services as the original "Xoom" and "angelfire." The challenge now is that even smaller ISPs with fewer perverse incentives against providing modest webhosting to their subscribers are likely to do so only via so-called "cloudhosting" with the likes of aws or worse. 
So in the end, I mistrust the usual claims about how people shouldn't be expected to learn something to contribute to the web or participate in online forums and the like. Those original "barriers" had become as simple to overcome as teaming up with friends who had already worked out how to get on the web or build websites or yes, join "social media." The real source of barriers to entry and participation is in fact the very corporate and "security" shills who pretend that there is no safe way for ordinary people to do so except through centralized pseudo-services like what we used to rightly call "aohell." (Top) The Swatting Problem (2026-01-19)
An excellent view of the swat river in pakistan taken in september 2017 by saeedartgallery. Image courtesy of wikimedia commons, used under Creative Commons Attribution-Share Alike 4.0 International license.
Admittedly, going by the state of the mainstream news, such as it is, the phenomenon of "swatting" is apparently no longer newsworthy. The term refers to the phenomenon of calling or otherwise contacting a law enforcement agency to report a false claim of a hostage taking, mass shooting, or similar incident to which a swat team would be sent. But that leaves a big part of the definition out should a person not know what a "swat team" is, so to cover that point, "swat" here is properly "SWAT," an acronym from "special weapons and tactics." I have seen only fictional versions on television: the groups of armoured and armed men who head off to deal with hostage situations alongside negotiators and paramedics. There is considerable overlap in effect between swatting and calling in false reports of bombs or bomb threats, in that they generate serious disruptions and often make it at least into the local media. Of course, that immediately suggests why we generally don't hear about either type of false report. To give them that sort of attention on top of the disruption to innocent people caught up in the swat or bomb handler responses would fulfill the desires of the false reporter and encourage them to make more false reports as long as they felt sure of getting away with it. The clampdown on swatting as such was and is very harsh, because calling a swat team on a household is very likely to get someone killed, and this is even more true today with most police forces hyper-militarized by surplus military equipment and a significant number of ex-armed forces members. So far so understandable. Except, the more I thought about the phenomenon of swatting in particular, the more I began to wonder how it could genuinely be possible on a more than one-off basis. 
Now, it turns out that the most unpleasantly prolific of the individuals who engage in triggering swatting incidents do so over the internet, and apparently this is a dubious "service" that may be purchased from shady characters online. Ah, so now we can explain part of what must be at work here. Such clever and unethical people are able and willing to spoof the location and source of the false report so that it seems to be legitimate. For example, the report may seem to come from the place where the swat team is expected to go, or in immediate proximity, so that a witness could plausibly notice and call for help. I expect the original spoofing techniques are now neutralized at least in the context of the various means by which law enforcement bodies may receive reports of dangerous situations or direct calls for help. But that still raises the question of human response. Suppose a call comes in that appears to justify scrambling and sending out the swat team. It seems likely the organization and call out of such teams is similar to that of firefighters, and so there will be a group on call at any given time of day who have drilled in getting into their gear and on the road quickly. Logically they must have a team leader, who would be expected to assess the situation and position the team. Evidently situation assessment is going to be the hardest part, as it must be done quickly under conditions where the swat team will be full of adrenaline and hyper-sensitive to any stimulus. After all, they are responding to an emergency, which suggests there must be not a moment to lose. If they have been part of the growing number of training programs in urban warfare as opposed to what could notionally be considered urban policing (I realize not everyone agrees there is a real difference between the two concepts), then the predisposition to extreme measures is even stronger. 
Yet acting without sufficient information will certainly endanger the civilians caught up in the situation, whether it genuinely calls for a swat team response or not. This is typical of dangerous situations of all kinds where somebody else is called upon to help, whether or not they are some sort of police officer. This routes back to training again. Still, it seems training is not the insurance it should be against a false swatting call leading to dire injury. I suspect what weakens the effectiveness of the protections in this case is ironically what makes law enforcement officers feel safer. By this I mean the combination of powerful weapons, effective body armour, and practical immunity from serious consequences should they do innocent people harm. Certainly that powerful combination produces considerable physical and legal safety for the swat team members. But it goes so far as to make the physical and legal safety of the innocent and the guilty alike deeply secondary. The nearest parallel I can think of from a non-law enforcement context is the dangerous behaviour of people who drive SUVs and other huge vehicles. Ensconced in what amounts to a tank because of the sheer mass and size of the vehicle, which may also be tall enough to render people on sidewalks and even bicycles difficult to see, such drivers can become astonishingly aggressive menaces on the road. (Top)

"Oh, Right, You're Political" (2026-01-12)
Photograph of one of the ugliest sculptures in calgary, alberta, 'The Conversation,' by Danielle Scott, taken september 2009. The sculpture is on a busy street, so this is a really impressive daytime shot, with only one person in it, who is standing still and turned away from the camera. Image courtesy of wikimedia commons under the Creative Commons Attribution-ShareAlike 2.0 Generic license.
The title of this thoughtpiece is the rather surreal response of an acquaintance to my answer when they asked me what I thought of the new Star Trek series Picard. All I said was that it didn't interest me much, as it seemed to me we have lots of stories and information about the erstwhile successor to James T. Kirk. Let's move on to somebody else. Now I would add, clever as it was to wrangle Michael Dorn onto Deep Space Nine, I don't mean more characters from the same storylines. There is a lot by way of not yet optioned and "canonical" material to make scripts from thanks to the various series of tie-in novels, and I suspect a surprising number of comic books. This doesn't strike me as an especially political response. But it seems my acquaintance assumed it must be, because my disinterest was in a character played by a middle-aged white man. I must confess, until that moment I had been much more skeptical of the famous feminist declaration that "the personal is political," but now it seems I missed the potential depth of the point behind it. According to my acquaintance, mere disinterest in a main character who happens to be played by a middle-aged white man is political! On one hand, of course that is absurd. On the other, with "the personal is political" in mind, it seems part of the feminist point is that what you do can be deemed political, whether you mean it to be or not. Where it really matters to you, best to be explicit about it. In terms of that surreal-to-me experience, I suppose it also reflects that entertainment is political; it can't help but be, since politics are part of life – and Star Trek was always political in its own way, from the original series on. There are quite a few possibilities even for such a staid old property as star trek while staying "in universe" instead of handing the visuals over for plastering onto a poorly written Michael Bay script or similar. 
For those who would prefer to stick to the spaceship-based storylines, there are the people who work on other than the alpha shift, for instance. Considering the Next Generation ships on which crew members live with their families, what about more on how to manage the contradictory demands on the very structure and facilities of such vehicles? I am aware of a comedy series that is not "in universe" yet suggests another set of storylines to chase, the ones featuring other than flag ships. I admit to a fondness for the handwavy treatments of life on Earth, but would prefer something more detailed and thought out, even though I know quite well the politics of that are not acceptable to present and discuss overtly. The proof lies in such dreadful efforts as the invention of the "section 31" portion of starfleet or the tragic devolution of the originally intriguing Borg into caricatures. (With the disclaimer that I have not kept up with all the tie-in books, where the authors have more wiggle room.) While I appreciate that speculative fiction, especially speculative fiction with a certain amount of deliberate and open political messaging like the various elements of the Star Trek universe, must tread a delicate line to avoid boring most of the audience, it's a shame how this has become part of an excuse to keep remaking the same stories. I've never heard anybody complain about seeing other than the people analogous to aristocrats in entertainment, provided their stories are not disrespectfully rendered into the equivalent of virtue signalling. Choosing to clobber the audience over the head with a morality play declares total contempt for the audience, after all. Then again, I suppose that begs the question of what we expect "being entertained" to mean. For example, by entertained, do we mean we come away feeling pleased and superior to the villains, or delighted by the clever way the heroes solved their problems? (Top)

Origins of the News (2026-01-05)
2021 photograph of an 1885 catalan painting of a newspaper seller, photographer not identified. Image via wikimedia commons.
Sometimes it is important to check on the origins of the various publications vaunted as "the media," especially the examples that seem especially broken. Hence I undertook some basic research to develop a better sense of the origins and development of the things we typically refer to as "newspapers," although they may not literally be printed on paper anymore. I had the definite sense that in the case of english-language papers, broadsheets were in the lineage somewhere, and had already learned that the origin of "yellow journalism" corresponds with the creation of what we now call tabloids. That left me with a very impressionistic sort of sketch, plus some information on seventeenth century manuscript sharing, which was very much an upper class phenomenon in england, where there was considerable pressure to avoid the appearance of doing anything others might perceive as "work." The thread tying together "newspapers" is of course the notion of "news" and reporting it to an interested group. Interest in "news" and a need or desire to share it is not "new" of course, and it was never dependent on having paper and ink. For a quick and easy proof from europe, a quick web search will soon reveal various online and paper published collections of roman graffiti, much of it devoted to politics, advertising, orders not to pee on that wall, and the inevitable silly and scurrilous stuff added by drunks and teenage boys. Among the most famous examples found in situ are those in the former city of pompeii. Graffiti is not generally credited as an ancestral form of or contributor to newspapers, especially since the romans had officially published notices called acta diurna. These began as announcements of public business, and apparently developed into something more like a modern day gossip column or tabloid later. Since I did not opt to dig into non-english language sources too much on this topic, I will continue following the european line here. 
The loose consensus of reasonably solid online sources is that the earliest newspaper-like things were the roman acta diurna dating from roughly the first century, followed by gazettes of political and military happenings for the upper classes in sixteenth century venice. With the Gutenberg printing press having spread around europe, weekly papers rapidly followed through the 1600s, alongside a burgeoning struggle over control of the printing presses and what their owners printed. It seems at first european governments were especially averse to the reproduction of what we would consider "daily news," and of course anything to do with the growing controversies among christians about their religion. But even in these early days, these were not exactly the ubiquitous items on or offline of today. They were all subscription-based, including the early english "newsbook," which the british library traces from the early to mid 1600s. The creative commons licensed textbook chapter I read about this, Culture and Media: 4.1 History of Newspapers, digs into the gradual shift in formats and introduction of illustrations to newspapers, as well as a shift in price and the role of subscriptions. It marks the advent of the penny press, which was the first substantially advertising-supported, non-subscription, daily sort of newspaper. The penny press is the real ancestor of today's newspapers, and it established the degradation path we have now seen go even further in the online versions. The Flaneur's Alley blog includes a 2019 post Neal Gabler on The Troublesome Origins and History of Newspapers that provides some additional information on the specific role of entertainment as a goal of presenting "the news."
Gabler quotes sociologist Robert E. Park from a 1927 publication, in which Park declared "[T]he reason we have newspapers at all in the modern sense of the term, is because about one hundred years ago, in 1835 to be exact, a few newspaper publishers in New York City and in London discovered (1) that most human beings, if they could read at all, found it easier to read news than editorial opinion and (2) that the common man would rather be entertained than edified." The blogger bolded this quote, and I agree it is important, but not for the reasons Park, Gabler, or the blogger give. Now, before the creation of the penny press, newspapers were subscription-based. A regular reader had to have enough income to subscribe directly to the paper, a subscription to a reading room or library with newspapers in their collection, work for one, or be part of a circle of people (yes, in those early days this would usually mean men) who paid for a subscription together. Such circles could include illiterate members, to whom the literate members could read articles aloud. Yes, poorer people did these things if they could manage it and found that following news of some sort in this manner made sense. The penny press began in a key era of industrial capitalism, when capitalists began to seek means other than the company store or various fines to claw back the wages they did pay to the working class. But how to tap into this massive population, and their meagre pocket money? Whatever the means, it couldn't cost much, and it couldn't prioritize a subscription model, because people in such precarious economic conditions would avoid any subscription like the plague. This was a social group presumed at best to be too exhausted for "editorials" and intolerant of articles written with the condescending idea of "bettering" them. More commonly, the publishers were like sociologist Park, inclined to presume most people are simply too stupid to cope with any writing of substance. 
However, capitalists generally prefer the lowest classes not have too much ability to read or hear meatier stuff, which might help them do radical things like organize and win better wages and conditions. Better they be no more than entertained, and preferably repeatedly told how awful the world generally is, so they are encouraged to believe their lives could be worse. I can see why people would choose some cheap entertainment to help them through tough times, working poor or not. And it is true the competitive pressure of the penny press accidentally spurred the invention of investigative reporting, which is a genuine good all too necessary in the world. Today it is quite easy to find multiple articles bemoaning the state of newspapers, complaining that people refuse to pay for them, won't subscribe, and so on. I have written about this complaint before, so I won't say too much about it here except to reiterate the point that most newspapers today emit outright trash that is so contemptuous of the reader it can't even entertain. Newspapers reflect assumptions about class, sex role stereotypes, and the social conditions their desired audience must cope with. Apparently a perennial perception among newspaper management and owners is that their product must be somehow addictive for the potential buyers, who nowadays are not the readers so much as the advertisers. (Top)

Keep Those Goalposts Moving (2025-12-29)
Image of a wheel-equipped football net, courtesy of keeper goals athletic equipment, accessed 30 december 2024.
I admit to having had no idea "moving the goalposts" was such a popular meme and framing for various blogposts about business life, let alone sports. It is a well-established metaphor, widely used and understood in english, and likely in other languages where at minimum actual football (the kind without helmets and body armour) is widely played. More recently some goalkeepers have engaged in literal goalpost moving, as did a mob of unhinged united statesian college football fans (this time the kind with helmets and body armour), so it was a bit of a challenge to find an illustration for this thoughtpiece. I love the various photographs of football (the unarmoured kind) teams carrying their practice nets on and off the field for how they illustrate quiet teamwork, and some of the satirical images of a token diverse group of businesspeople carrying a similar net around are genuinely funny. But they didn't quite work for this particular thoughtpiece, and even the political cartoonists have not quite got to the topic, which is interesting – then again, there are not too many active political cartoonists these days. Instead most discussions of "moving the goalposts" outside of sports coverage seem to emanate from the lifecoach set, who have a strong preference for the expression as a means to tell readers how changing their goals makes them unhappy and may undermine their chances of success. Funnily enough, the potential for goalpost moving to backfire is a big part of the political issue I have in mind here. I have read and watched lectures by economists Michael Hudson and Richard Wolff in which they explain the "deal" offered by the united states to the european countries that agreed to join the "north atlantic treaty organization." There are various parts of the "deal," but the element of note just now is the special trade-off on the balance of economic activity. 
Economic activity may be turned to military or civilian purposes, and practically speaking these two purposes are in a genuine pie division situation. The number of people, the time available in a working day, and the materials are all finite. (This description is not from Hudson or Wolff, who I suspect would not simplify as I am here for brevity.) This suited the united states and its military and political leadership just fine after world war ii. Having maximized their profits through both world wars, and not subject to the mass death and destruction of europe or asia more generally, they were in the catbird seat when it came to any country afflicted with an anti-socialist, worker-hating elite. The united states offered, among other things, to subsidize the militaries of its european affiliates across the north atlantic, so they could spend their tax dollars and all the other fees and such they collected on their civilian economy. I could see this making sense to military wonks who wanted to keep the remaining nazis on their leash, and to the capitalists eager to impose and shore up fundamentalist capitalism. It's an old imperial policy the athenians played with to the detriment not only of themselves but of all the greek city states of their time in the peloponnesian wars. As an imperial policy it seems like such a good idea, but such ideas only work as long as the empire is able to expand and maintain what it has already seized via whatever form of warfare. However, nobody just goes along with being coerced, and the peoples forced into the empire always resist and keep resisting. Not even genocide stops that. So inevitably, the empire wants the vassal states to pick up more of the tab and the social and economic destruction entailed in keeping the people who want out in, and trying to destroy all the other peoples who are still outside, especially anybody deemed a military threat. The problems with this demand are not just about finite resources. 
After all, the vassals held up their end of the deal. They are chock-a-block with united statesian military bases, and they have no control over the small stocks they are permitted to have of the most effective weapons. Their militaries are comparatively small, and shrinking for the many reasons induced by fundamentalist capitalism plus the impact of citizens refusing to serve for a whole range of ethical and/or practical reasons. The general population understands very well that if things go wrong, the vassals are, in fact, on their own. This was always true, but nobody on the united states side is even trying to hide it. Oh, and I should add, a variant of the same deal along the lines of "we'll manage and control the military and you go play with your toys in the corner" was given to canada. Some of the same united statesian grumbling about canada not spending enough on its military is going on as well. This seems less than sporting considering canada's elites have dutifully signed up to every sort of trade deal that, yes, makes the united states its major trade partner, very much to the benefit of the united states, or more accurately, the capitalists there. So it could be asked, what's the problem here? Canada has never seriously competed with the united states as compared to say, germany or france. It's coughing up lots of primary resources on the cheap, especially the all-important oil and gas necessary to keep the united states military moving. I'm no capitalist with direct skin in this game, but that strikes me as quite a big source of support for the united states military, and in today's dollars, the united states kind, the value is counted well into the billions. But that is no longer good enough, according to at least a portion of the united states elite. From what I have observed, the deeper source of frustration for the elites annoyed about this is their latest goalposts. What really annoys them is not the expense, it isn't as if they ever pay it. 
It isn't that they feel the united states is defending other countries "for free"; they aren't. The new goalposts are about people: it is not possible for the european vassals, or canada next door, or even the united states itself, to provide enough people to serve in the imperial military or its economic auxiliaries. There are not enough trained soldiers of any kind, and not enough people fit and able to serve even if they were willing to act as the on-the-spot brutes to impose and maintain united states control, which like that of any empire, depends wholly on being able to slaughter the people and reduce their homes to ruins. Having moved the goalposts this far, it seems many not-so-elite people feel they have been backed into a corner, with a choice to die later according to somebody else's plan, or sooner with a serious chance of winning a better future for any survivors. Definitely illustrative of how dishonest goalpost moving can backfire. (Top)

What Credibility? (2025-12-22)
Snippet of the cover of the 1988 edition of Bert E. Bradley's textbook Fundamentals of Speech Communication: The Credibility of Ideas, via the internet archive, 29 december 2024.
After a point, it is difficult to know what to say or do on encountering european and european-identified people who repeat, in an aggrieved tone, always with an added side of "how dare they not do as we tell them," how terrible it is that those awful foreigners won't enter agreements with them. In the mainstream media, where the anglophone practitioners are completely fixated on the united states whether they are citizens of that country or not, they worry incessantly about the credibility of that country. However, in that context I feel sure they don't mean by "credibility" what I mean by it. The sort of credibility I have in mind is the kind where an individual or community or other larger group of people is able to make and uphold agreements with others by following the terms of those agreements. By "following" I mean ideally in a context where disputes are considered in a fair setting and the subsequent ruling, if it comes to that, is followed by all parties, even those who feel they have lost something by it. In the best case, this comes out as close as possible to a consensus, where nobody is completely satisfied per se, but everyone can agree the resolution is as fair as possible, minimizing the need for coercion to enforce it. If a person or group of persons acts in a manner designated under the old-fashioned term "sharp dealing" and/or resorts to force to ignore the agreement whenever they think they can get away with it, pretty soon they have no credibility. Inevitably, such people or groups end up isolated, because they are simply impossible to live with: they behave as if no laws, agreements, or anything else should ever apply to them if they can find a way, especially a way based on force, to get away with it. Pro-tip: if the only way somebody else will have anything to do with you is because you forced them to, you are a sociopath. Of course a sociopath wouldn't be reading this anyway. 
But maybe more people are anxious these days in part because they have realized, especially if they are european or european-identified (this is a larger group than anybody who thinks they are white), that they are living in societies that encourage especially people like them to behave sociopathically in order to "succeed." The encouragement starts with intensive instruction, not necessarily subtle, in dehumanizing "others" to whom no promise has any validity. Nobody likes to be fooled or psychologically coerced into this sort of evil behaviour, as discussed in Cyberethics several years ago. It seems the ways europeans and the european-identified are offered to get around the cognitive dissonance and their moral disgust at the situation are few. They all depend on somehow tranquilizing feelings and preventing thought and questioning. The obvious big two are organized religion and the military. Not quite so obvious are the secularized versions of organized religion, meaning some church is not the organizer and funder, in the form of "NGOs" and various so-called "philanthropic organizations," the latter of which have been a money-laundering operation for the hyper-rich since at least the nineteenth century. Even less obvious today, yet it is the founding method, is the appeal to greed. But once the credibility (that word again!) of organized religion, NGOs, "philanthropy," and the military is gone, there is nothing left but the appeal to greed. However, once there are no easy targets left to redirect the anger of those trained into a sense of entitlement to riches and endless ego-stroking, the final way on offer is also older than it seems: drugs. Drugs of all kinds. Look into the core products "modern" european empires fixated on; among them are always drugs, grim necessities to keep the malnourished on their feet long enough for exploitation and doped out enough to be unable to resist. 
The special cruel twist in evidence today is the way the poisoned food and drug products churned out by united states-run corporations create an appearance of a well-nourished population, even an over-nourished population, that is in fact desperately malnourished. It's all about having a convincing appearance long enough for the trick to be pulled and the pocket picked. So, if the mainstream media is crying about credibility while really expressing terror that the united states and its hangers-on may no longer be able to pretend to run the world via total violence, and that these countries may therefore be brutally isolated until such time as they can demonstrate some capacity to make and uphold agreements, that may be a helpful turning point. Or they could face up to a history in which the sad evidence of the corrosive combination of dehumanizing plus making "pretend agreements" is all too clear, and often held up as something to be proud of. There is an oft repeated but poorly understood story of the little boy who cried wolf, drawn from Aesop's "fables." It may well be the epitaph of so-called "western civilization" and its cynical notions of credibility. (Top)

Funny Shaped Abridgements (2025-12-15)
Photograph of a vintage set of Carl Von Clausewitz's major works, from Sophia Rare Books courtesy of a referral from ClausewitzStudies.org, accessed 24 december 2024.
Thanks to the combination of my day job and my research work, I often have to look up older sources, including many books commonly referred to as "classics" and translated from the original language, typically french or german, unless the "classic" is much older, in which case some variant of latin is more likely. However, this also means that sometimes I end up having to spend time sorting out which of the readily available translations of such sources I can't read in the original are reasonably good and up to date. This can be frustrating, because "classic" is little more than a marketing term, which means it is basically impossible to avoid a visit to the nearest university library or applying some interlibrary loan-fu with the local public library. (That public libraries, at least in canada, even in large cities, have bluntly abandoned maintaining decent reference book collections is a whole different issue.) This is very much the case even for books regarded as "classics" and focussed upon military matters. In fact, maybe the jumped up editions of older military books are among the best examples of what a mess the treatment of such older works, especially the translated ones, is in. At this point even Karl Marx' Das Kapital is getting a new translation with no abridgement, not least because there is a resurgence of interest in his critique of political economy. There is a similar resurgence of interest in Carl Von Clausewitz's work, typically referred to as On War in english, and also in his book on the final campaign against Napoleon, albeit for quite different reasons. Finding myself needing to examine On War myself, and already jaded by my experience of abridgements of longer works in english, I wanted to see a full edition, preferably of a recent, accurate translation. "Recent" could mean even as early as the mid to late nineteenth century, depending on the original language. 
This sounds like it should be simple to arrange, which is of course a giant wailing siren of a warning that it is no such thing. Since Clausewitz's work, like Marx', is about highly politicized activity, finding an edition not excessively burdened by somebody else's politics than the author's is already difficult. Finding an abridged edition of this type is all but impossible, because the abridgement itself is inevitably political. From what I have learned on my brief foray into this area of bibliography, On War was published at first in a three volume edition, or even in five volumes, depending on what shorter works if any were appended to it. This is not usually done today, and there are not many single volume editions either. According to ClausewitzStudies.org there are practically speaking only two of them. That is certainly what I have found. What they do not mention, perhaps because the anticipated audience for the site is a bit more academic than average, is anything about the non-academic press or mass market oriented editions. Of these the usual standby presses to go to are (like it or not) the conglomerate of penguin-randomhouse and oxford university press. The former typically has a lightly revised version of just about any older non-fiction work of the broad "classic" type via the old pelican imprint. The old pelicans usually have a new cover to line the book up with the new penguin jackets and that's about it, although there are exceptions of course. The oxford classics imprint is notable for the press' investment in updated ancillary essays, which it provides more often than penguin-randomhouse does. Either way, their editions may not be the best, but they are typically passable and affordable, therefore often available in bookstores and included in medium to large-sized public library systems. 
However, some bright spark decided not to clearly mark that their editions of On War are abridgements, not on the front cover, and not in the marketing blurbs provided on their sites or bookstore catalogues. That bright spark probably should be sent to do something less visible until they are a more substantial ember. Due to stubbornness and curiosity, I found myself wondering about those abridgements. Not to read, in all honesty. In fact, if anyone who happens to read this far in the thoughtpiece is wondering whether I would recommend reading any abridgement of a non-fiction work, my answer is ABSOLUTELY NOT. Even the best intended and most evenhanded abridgement can't help but be misleading. For the fiction case, I have read a cleverly abridged version of The Arabian Nights for adolescent readers, fondly remembered for its somewhat incongruous persian-style illustrations and now impossible to find. Alas for the charm of such abridgements, because as any adult who has looked at a more faithful translation knows, the level of bowdlerization needed to achieve this effect means the result seriously misrepresents the original text. It is simple to see the full text of On War online between project gutenberg and of course the ClausewitzStudies site, in both english and german. But abridgements are not all the same. They are made to meet the presumed interests of the market of the time. I knew the two popular abridgements would likely overlap a great deal, and where they differed would reveal much about the time they were published. In the case of the penguin and oxford abridgements, this was overdetermined. Penguin-randomhouse is literally still reprinting a version from 1968, and does indeed use the J.J. Graham translation. Oxford produced its abridgement from the more recent, 1976 translation by Michael Howard and Peter Paret with princeton university press, and published it in 2007. 
Anatol Rapoport's introduction to the penguin abridgement explains he was "guided by an intention to offer the contemporary general reader and to the student of international relations those portions... which relate most directly to our own time." In the oxford abridgement Beatrice Heuser "selects the central books in which Clausewitz's views on the nature of theory and war are developed." Rapoport was a psychologist, mathematician, and anti-war activist. He passed away in 2007, after a storied career. Heuser is a historian, political scientist, and professor of strategic studies at king's college in london. As is evidently overdetermined, Rapoport and Heuser did select different portions of On War to include, but what they both selected is at least as interesting. There is no need to read too deeply into On War to notice either, in part due to the assistance of ClausewitzStudies.org, which provides a full table of contents with the subsection titles. There are eight books, six complete and two extant as sketches Clausewitz was unable to complete before his death. It is quite revealing just to consider the titles of the eight books, which (in the Graham translation) are: I. On the Nature of War; II. On the Theory of War; III. Of Strategy in General; IV. The Combat; V. Military Forces; VI. Defence; VII. The Attack; VIII. Plan of War.
Rapoport's abridgement includes all of Books 1 to 3, Book 4 except for its chapter on Night Fighting, and parts 1 to 6 of Book 8. Evidently he deemed the potential readers more interested in theory than in such practical considerations as who actually does the fighting, defence, or attack. Heuser also includes all of Books 1 and 2, but then switches to selections: parts 1 to 5, 11, 13, 14, and 16 of Book 3; parts 1, 3, 4, 7, 8, and 26 of Book 6; parts 1 to 7, 15, 16, 21, and 22 of Book 7; and parts 1 to 8 from Book 8. Heuser expects the reader to be interested at least a bit in defence and attack, but still not very much in who actually does the fighting and how the different types of forces may be expected to interact. Now, obviously I am no military historian, nor does my own research run adjacent to the field. Yet it still seems quite strange to me that both abridgements take as given that much of the practical information would be of no interest to a general reader. Then again, maybe this is because Clausewitz wrote before aerial bombardment was a reality, at least for british anglophone audiences, and anglophone audiences in general have typically not lived in places where armed forces are expected to move through en masse, potentially needing temporary shelter and to get food and water. In that case, all right. It would not be so relevant to know things like how far the military forces may be expected to move in a day, the number of their camp followers, or how far ahead their scouts may be. Nor would they necessarily need to read about how such forces will set up their camps, or what they will or won't do should they set up a defensive line, artillery forces, or ammunition dumps. Let us all be so lucky, anglophones or not. (Top)
Thought Pieces
Copyright © C. Osborne 2026
Last Modified:
Friday, January 02, 2026 21:04:05