FOUND SUBJECTS at the Moonspeaker
More Thinking With Ursula K. Le Guin (2025-03-03)
Fairy sky shot on the caribbean atlantic by XEON, july 2013. Photograph used under Creative Commons Attribution 3.0 Unported license via wikimedia commons.
As I come reluctantly to the final entries in Ursula K. Le Guin's blog, bittersweet in the wonderful and inevitable sense, I return to a special gem from february 2014. Not being a follower of asocial amplifiers, I may simply have missed the great splash her remarkable eighty-second post made. Somehow I doubt that though, in light of what has been going on since. This wonderful post is Belief in Belief, in which Le Guin firmly opposes the inappropriate conflation of the verbs "know" and "believe." She does this in the best of ways, the prickly sort of way that makes you wonder uneasily what the heck she is going to write next, while following her with the confidence gained from her previous work. Le Guin may make her readers uncomfortable from time to time, never gratuitously, but because she trusts the sense and honesty of her readers. She declares up front that she does not think that "the credibility of a scientific theory and the credibility of a religious scripture are comparable," adding, "And I want to write about it because I agree with [Charles Blow] that issues of factual plausibility and spiritual belief or faith are being – cynically or innocently – confused, and need to be disentangled." It is extremely difficult not to quote too much; I went back several times snipping and clipping to avoid doing so. Please read the original, it is just that wonderful. What impresses me about this essay is not merely that I agree with Le Guin either. It is an amazing tour de force of method, packing detail and careful analysis into fewer than 1300 words. I would recommend studying it to any senior high school student or first year undergraduate looking for excellent examples from which to learn how to improve their writing. For instance, the starting point of this essay is a new york times editorial by Charles Blow, which discussed in part the results of a Pew survey.
Le Guin was perturbed by what Blow stated about the results, so in choosing to write about it, she started by trying to find the original survey questions, or at least enough information to reasonably assess their content. Her findings led her to observe:

This language reassures me somewhat. For if a poll-taker asked me, "Do you believe in evolution?" my answer would have to be "No." I ought to refuse to answer at all, of course, because a meaningless question has only meaningless answers. Asking me if I believe in evolution, in change, makes about as much sense as asking if I believe in Tuesdays, or artichokes. The word evolution means change, something turning into something else. It happens all the time. The problem here is our use of the word evolution to signify the theory of evolution. This shorthand causes a mental short circuit: it sets up a false parallel between a hypothesis (concerning observed fact) and a revelation (from God, as recorded in the Hebrew Bible) – which is then reinforced by our loose use of the word believe. I don't believe in Darwin's theory of evolution. I accept it. It isn't a matter of faith, but of evidence.

Nowadays we are dealing with multiple thickets of hideously meaningless questions on important surveys, including such things as national censuses, where we deeply need meaningful questions we can answer meaningfully. I think Le Guin is using the word "meaningful" in the way I am here: to describe words intelligible to another person without wondering desperately what intoxicants we had consumed before answering. Furthermore, here is an excellent example of not name-calling on the hot-button topic of evolution in the united states. She has no time for that; instead she sets out with care what she means and what the widely accepted meanings and elements of belief versus knowledge are. Le Guin won't let those who would like belief and scientific knowledge to be mutually exclusive in the same head get away with such sloppy thinking.
The whole undertaking of science is to deal, as well as it can, with reality. The reality of actual things and events in time is subject to doubt, to hypothesis, to proof and disproof, to acceptance and rejection – not to belief or disbelief. Belief has its proper and powerful existence in the domains of magic, religion, fear, and hope. I see no opposition between accepting the theory of evolution and believing in God. The intellectual acceptance of a scientific theory and the belief in a transcendent deity have little or no overlap: neither can support or contradict the other. They rise from profoundly different ways of looking at the same world – different ways of coming at reality: the material and the spiritual. They can and often do coexist in perfect harmony. Maybe the problem is that believers can't believe that science doesn't involve belief. And so, confusing knowledge with hypothesis, they fatally misunderstand what scientific knowledge is and isn't.

The last paragraph in this quote is what I understand to be an example of applying the principle of charitable interpretation. No denouncing believers who simply can't believe others can willingly and genuinely adopt knowledge based on evidence rather than blind faith in an authority. Perhaps people taking that position would claim that "science" stands in as a sort of authority here instead. So of course it makes no sense to someone who takes that position to read or hear as Le Guin explains:

A scientific hypothesis is a tentative assertion of knowledge based on the observation of reality and the collection of factual evidence supporting it. Assertions without factual content (beliefs) are simply irrelevant to it. But it's always subject to refutation. The only way to refute it is to come up with observed facts that disprove it.

Maybe the real challenge to believers is the refutation aspect, because an absolute, all-knowing authority would be shown a fool or a liar if said authority could be refuted.
But the knowledge at issue on the science side is derived from a method developed by fallible humans, intended to accept and work with human failings, including the limits of human knowledge and awareness at any given time. Confidence in the method derives from how it allows people who use it to identify errors or incomplete aspects and put things right. A hypothesis that stands up to challenge becomes a theory, which nevertheless can and should still be tested. As Le Guin emphasizes, scientists accept the theory of evolution "and use it and defend it against irrelevant attack because it has so far withstood massive attempts at disproof, and because it works. It does a necessary job. It explains things that needed explaining. It leads the mind on into new realms of factual discovery and theoretical imagination." Being a much younger writer, I would be tempted to over-egg the batter here by adding a statement to the effect that religious belief has a different job. A great rhetorical point is a lot like a great joke: the key to making it work is timing and knowing when to stop. Success in a relatively informal medium like a blog post is a serious demonstration of skill. I was going to add one more quote, but this is another instance where it is vital to take note of when to stop!

Reading Footnotes Can Be Rewarding (2025-02-24)

I have something of a love-hate relationship with footnotes. They are a clever invention, at their best when skillfully combined with endnotes. For instance, some of my favourite books have used endnotes for reference citations, while using a judicious smattering of footnotes to provide apparatus such as definitions or extra information which is not strictly necessary, but good to have easily at hand. In especially mad and wonderful cases, the author has fallen in with a wonderful book designer with whom they have used citation endnotes, informational footnotes, and outside margin comments and/or section summaries.
Not many books receive such treatment, because even in the computer-based book design world, that is a lot of work in terms of layout, and authors and editors have to do quite a bit of work to find the balance that achieves genuine usefulness rather than self-indulgence. There aren't many writer-designers even who can manage to flirt with the edge of self-indulgence and get away with it. Edward Tufte is one. Unfortunately, especially in academic books where informational notes and citation notes are most usefully separated, they often aren't, all being relegated to a big blob of endnotes at the very end of the book, or in the best scenario, to endnotes at the close of each chapter. Alas, even though hypertext is brilliant for footnote and endnote use, almost nobody uses it that way. But if you, like me, happen to be among those who will resort to the footnotes of a good book that is ending all too soon, then you know a big reason to do so is that they may hold unexpected gems. Case in point, I have been reading Ursula K. Le Guin's blog, which her literary estate has graciously kept available online with her website (including all the wonderful Annals of Pard and related posts). Alas, I am going to come to the end of her posts soon. Among them are a very few with two, maybe three footnotes. Not many, and so all the more precious. One post is Le Guin's examination of that dreadful idea, "the great american novel." It is a rangy, intriguing article, and glancing at its two brief footnotes, one declared:

In the 1920s, on a great Peruvian hacienda with a private bullring, my parents watched matadors-in-training fight cows. The full ritual was performed, except that injury to the animal was avoided, and it did not end in a kill. It was the best training, my parents were told: after las vacas bravas, bulls were easy. An angry bull goes for the red flag; an angry cow goes for the matador.
Go check out the context this footnote comes up in, because in fewer than a hundred words Le Guin provides additional depth to her critique that too many readers who skip the footnotes will miss. The information isn't strictly necessary in the main text, so it is understandable she opted for the footnote. It is a real gift she decided to capture the information all the same, rather than leave it out. There is a whole world in those few words. This is just the sort of treasure that leads me to poke at footnotes and endnotes even in far less prepossessing books and articles. Footnotes are, at least in english, found primarily in non-fiction books. They have never been just about citations, and indeed one of the earliest copious footnoters, Edward Gibbon, ended up footnoting as he did in part because he received complaints that he had too many plain citation footnotes and not enough more interesting ones in his famous History of the Decline and Fall of the Roman Empire. Of course, then he received complaints about his scurrilous commentary in the footnotes, but that is another story other writers on the web and off have thoroughly covered. I am aware of at least one published novel with footnotes, in this case actual citations. It wouldn't surprise me if it turned out Douglas Adams used them as part of the comic presentation of elements of The Hitchhiker's Guide to the Galaxy series, but I have never read all the novels. Mind you, I have seen such footnotes as the occasional translations and definitions added to translations of such novels as The Journey to the West and The Romance of the Three Kingdoms. But those are not inclusions by the author, they are extra help courtesy of the translator. The difficulty with trying to use footnotes in fiction as an author is of course the problem of how to do so without falling into self-indulgence.
Or how to do so without sticking them in due to not wholly honest complaints from readers who either have not read the whole book, and so are missing parts of the story and confused, or are anglophone but dislike it when the dialect they read differs from their own. It probably makes better sense to provide a bit more context in the story to help them where it matters, and ignore them for the rest. I suppose a determined author with careful planning and a knack for creating a fictional scholarly edition might be able to include footnotes on a broader basis if they wished...

Ode to the Command Line (2025-02-17)
A random, resized snap of a terminal emulator window courtesy of Vicarlo via wikimedia commons under Creative Commons Attribution-Share Alike 3.0 Unported license.
You could call this thoughtpiece a counterpart or even a counterpoint to a much older one in which I captured for posterity my delight in the apple macintosh before it vanished forever, Ode to the Old Mac. Readers who have been following along for a good while, or who have delved into the thoughtpiece archives, will know I have reflected at intervals since on the extremely tragic trajectory of apple software and hardware alike, and my days of buying apple hardware ended well over ten years ago now. I am keeping a hopeful and interested eye on the helloOS system, which is a more active and promising reimplementation of apple's original human interface guidelines than even puredarwin (the latter may perk up, but hasn't in several years now). Both are nonproprietary. But much as I appreciated the old mac, and the first promise of the early macOSX days, I was remiss in not speaking up about the importance of the command line. To be sure, at first it was far from a tool I liked much, due to my original introduction to it in a distorted and heinous form via MS-DOS. The terminal emulator – terminal for short – bears only the most cosmetic relationship to the MS-DOS version. Practically speaking I do spend more of my time working in programs with some kind of graphical user interface, although less than before, precisely because some things are best taken care of in the terminal. For instance, since I like to readjust the computer's behaviour to suit me rather than the other way around, in due time I figured out that the best way to achieve that is via the terminal. It is typically faster, more specific, and allows ready access to options operating system distributors may dislike "users" having the temerity to change. There are also a few annoying window manager bugs best cleared by simply restarting the window manager's parent process, and contrary to what we are often encouraged to believe, this really can be okay instead of a full logout or restart.
(Yes, not always, especially if it happens there is a patch available. In that case of course, patch the system!) Today's version of GNOME is to me simply unbearable without the ability to correct its behaviour by researching and adjusting settings in the terminal. These are examples easy to dismiss as simply cosmetic by those who consider the pre-selected defaults always right or good enough. I will have to agree to disagree with them, and happily agree not to mess with their computers or user accounts on shared boxes. The terminal is useful for far more things, most prominently system administration and maintenance. For anyone who would like to do nothing but work in terminal windows, it is possible to do so via command line interface applications, including such venerable programs as emacs, vim, and nano. There are web browsers, music players, all the usual suspects. However, as soon as it is necessary to edit audio or video, well, in my experience you can trigger that from the command line, but you can't edit from there, and this is a good thing for everyone concerned. It can also be quite educational to dig around in the shell to see what commands it has built in, as this varies somewhat between shells. Ah, before I give some examples of those, I should say briefly what the heck a shell is in this context for clarity. By "shell" I, like most people, am referring to one of a family of programs that run as the fundamental command interpreter in the terminal emulator. The first shell "sh" was written by Ken Thompson of UNIX fame, and later replaced by a new version by Stephen Bourne at bell labs in the golden age of industrial research and development. Many subsequent shells are variants of Bourne's shell, including bash, the "Bourne again shell." (There is excellent documentation of the origins and development of the UNIX shells maintained by Sven Mascheck. See especially The Traditional Bourne Shell Family.)
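Digging around for built-ins is easy to try for yourself. Here is a minimal sketch, assuming a Bourne-family shell such as plain sh or bash: the type built-in reports how the shell resolves a name, and a small loop with a conditional shows the basic programming support every shell in this family provides.

```shell
#!/bin/sh
# Ask the shell how it resolves a few names: built into the shell
# itself, or an external program found on the PATH.
type cd
type ls

# A loop and a conditional, the kind of basic programming support
# built into every Bourne-family shell.
for f in /etc/passwd /no/such/file; do
    if [ -e "$f" ]; then
        echo "$f: found"
    else
        echo "$f: missing"
    fi
done
```

In bash, the help command lists all the built-ins on a given system; the contents of that list genuinely do vary between shells.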
The shell commands are in fact small programs, which can be combined into larger programs better known as shell scripts. The default commands include such well-debugged and sensibly featured programs as cal and calendar, while rsync and jobs have their own regular uses. There are a few more whimsical items not installed by default today, such as the old "quote of the day" program or the various autotext generators now only found preserved in emacs. I am aware of a couple of shell commands such as write and wall that nowadays are probably not installed anymore, or are restricted to administrator use due to their potential for security problems. And besides these, the shell includes support for basic programming needs such as looping and conditionals. The terminal may not be fancy, but combined with a shell program it is a remarkable and unassuming swiss army knife, fast and able to do just about anything you want it to if you can create a shell script for it or find a free/libre script someone else has written to use instead. For instance, suppose you want to blog in a pared down way from the command line in terminal. I have recently learned of Carles Fenollosa's bashblog script, designed for just this purpose. It's really neat, and he has just added tag support. (No comments though.) He has managed all this with a 500-line bash script, which is quite impressive. While no one is required to use the terminal in this way, the availability of such a remarkable and flexible program is a boon, a bigger one than it may seem at first glance.

What Is Speech? (2025-02-10)
Nineteenth century illustration by Randolph Caldecott, from The Complete Collection of Pictures and Songs By Randolph Caldecott, via oldbookillustrations.com.
I know, this seems at first read like one of those ridiculous, obvious questions. Paraphrasing my trusty OED, "speech" is when a person expresses their emotions and thoughts, usually by means of sound made with the mouth and vocal cords. In lieu of the usual physical means to speak, a person may sign, or write out what they mean instead. Okay, so far so good. This seems to cover the major basic cases without de facto denying the expressive abilities of persons who are deaf or mute, or who need to use other than their mouths and vocal cords to speak for temporary reasons. More often than not, "speech" comes into question when people in the united states are arguing about what they refer to as "freedom of speech," which I understand actually refers to "freedom from curtailment of speech by a government, especially a federal government." This leaves a genuinely uncomfortable field for private entities to strive to interfere with freedom of speech, most infamously the very rich striving to impose arbitrary limitations via threats to withdraw financial support from companies and corporatized universities. I have been grimly fascinated by how often straight-faced claims are made that pornography is "speech," tout court, and therefore it cannot be curtailed by any law. Well, it seems to me that if this were true, the pornographers should be at the front of the protests and actions to bring in updated versions of Catharine MacKinnon and Andrea Dworkin's ordinance meant to allow people used as performers in pornography to seek legal recourse for violations of their civil rights. If pornography were simply "speech," then pornographers would have nothing to fear. After all, "speech" does not entail violations of civil rights. Right? Oops. It is not quite so simple as that, more's the pity.
The example from history that always sticks in my mind when trying to properly untangle claims about speech – how free it can or should be, who should try to control it if anyone, and on what basis – is the case of Julius Streicher. Yes, that guy, the nazi propagandist who ran an infamous newspaper through which he set forth and normalized vile ideas and encouraged violence against any group marked as "non-aryan," for which read "scapegoat for our screw ups" in nazi ideology. He was hanged for his trouble. I think we can reasonably agree that he abused his specific access to a powerful amplifier of his speech, his newspaper, as well as his freedom of speech as such. Although, his speech was only so free as to say just what the authoritarian leadership of nazi germany was willing to hear. It happens that he was a nazi himself, so he certainly did not feel trammelled by the ambient conditions he did so much to support. This routes the question back to how to solve the puzzle of what to do when somebody insists their speech is impeded for no good reason. The Streicher example indicates that inciting and subsequently normalizing hatred and lawless action against targeted groups is a good reason to firmly curtail a person's freedom of speech. Hopefully the vast majority of the time that curtailment means taking away their ability to easily amplify their speech and/or harass those who do not want to listen to or participate in that person's speech. Hmmm. Now this reminds me of something about speech not often stated explicitly in the articles I have read about it. Speech is in fact an expression of a social relationship, or better expressed, speech is a social practice. We may practice a formal speech alone in our bedrooms, but to make the speech we go out in public to share it with other people. How we attain our audience and who we involve in making our speech to that audience matters. What we intend to encourage by our speech matters.
If speech were merely noise, and had no motivation behind it, we could treat it the way we do nuisance noise. But speech is intended to be meaningful, and so to encourage us to think a certain way, and/or to act a certain way. We may accompany our speech with certain acts for emphasis, to illustrate the correctness or possibility of what we are saying, and so on. Censorship is a powerful and therefore dangerous tool, to be applied with care, not with a broad brush. However, "censorship" is poorly defined, so poorly that it is mostly an insult and a word that destroys a conversation before it can begin. When I check the ever-present OED, I note the definition refers to its use by a government for what could easily become authoritarian purposes: suppression of material considered politically unacceptable or dangerous to security. I think the adjective "obscene" is in practice a reference to politically unacceptable material that questions the acceptability of sexual violence and male domination. (The questioning may be as to why male dominance is not more extreme, or why women should not resist it.) From what I have observed over the past ten to fifteen years, the trouble with censorship is that the people who scream most about it, and about how their speech is supposedly affected by it, actually love it dearly. It's just that they are absolutely determined to have it directed only against the political speech they don't like, joined with free extra amplification of the speech they do like, meaning of course, their own. If nobody's political speech were censored, except for that which seeks to normalize violence and hatred against any scapegoated group, or any other type of human rights violation against any scapegoated group, I think this would be reasonable.
Indeed, such a limitation on speech would demand an implementation of that much honoured in the breach thing, due and fair process via a public hearing, and, if the complaint were found reasonable, prompt, meaningful, and effective action to follow. I suspect a due and fair process would be very useful, although of course difficult to do right now because presently everything is so gravely politicized, and truly providing such a fair process would unavoidably and effectively open up a floor from which to challenge injustice. There are plenty of people not so interested in that.

Private Notions (2025-02-03)
1881 illustration from A Popular History of the United States: From the First Discovery of the Western Hemisphere by the Northmen to the End of the First Century of the Union of the States: Preceded by a Sketch of the Pre-historic Period and the Age of the Mound Builders, Volume 2, via wikimedia commons.
The challenge is one of fending off the various hucksters claiming that "privacy is dead," that "you don't care about privacy unless you have something to hide," or that "you don't really care about privacy or you wouldn't be using [X]." As is too typical of hucksters, they are full of babble intended to encourage our insecurities and spur us to a minor form of panic, which renders us easy marks. In spite of myself, I occasionally catch myself feeling frustrated by the waste of the real talents of these fast talkers on such miserable and socially destructive activity. Certainly I was prone to this in my younger days, thanks to less experience and not having much information for or against what they were saying, apart from recognizing the mannerisms and speech structures so typical of hucksters of all kinds. I suppose though, that in a backhanded sort of way a huckster can indicate something socially useful: what the latest authoritarian angle of attack is. Those attacks always start small and apparently unimportant, to see if we're asleep on a convenient topic. They don't intend to wake us up, but often the hucksters do. The latest huckster round of flailing is part of the struggle to maintain the veil of marketing and other lies meant to distract us from the massive government surveillance which is now outsourced to marketing corporations with diverse camouflage. This flailing is especially desperate because actually, people are emphatically not going along with having effectively no privacy. It is striking how powerful the push back is growing, including in the homeland of one of the most obsessed surveillance organizations of the present time, the united states. It is striking, yet on checking some diverse books and readings from both my research and coursework, it turns out that the struggle to create and maintain general privacy in the united states specifically is not new, and as we would reasonably expect, overlaps with similar struggles in other places.
However, since the rich and their enablers in the united states like to make their problems everyone else's, it is worth parsing some of what I learned here. To begin with, I was reminded again of the importance of religious extremism as an impetus for eventual emigration to the americas – one of several, of which I suspect getting rich quick was often more important, nevertheless. It seems to me the prominence of the puritan "pilgrims" is more revealing and actually more honest than many united states historians might have expected. Puritanism is one of many protestant forms of christianity accepting a belief in predestination, convenient for interpreting class divisions and financial success as evidence of divine grace, and famous for its hatred of images, bright colours, and music. Less talked about today are the puritans' determined early efforts to control all thought via such means as limiting all reading to a specific edition of the english translation of the christian bible and demands for conformity in public and at home. This tipped over very easily into what today seem appalling levels of, bluntly, spying, eavesdropping, and the dangerous deployment of rumourmongering and accusations of lapses from acceptable behaviour, especially should a person or persons try to get away from prying eyes and ears. What many of us today would understand as privacy is a problem for any person who wants to hold coercive authority over us. After all, such a person does not and cannot trust us, because apparently they don't trust themselves. But if they could somehow know our minds and all the things we want to do or could want to do, well, then they could manage us for our own good. Well, that's the religious version of this dishonest rationalization. It isn't hard to find examples of the political and parental equivalents. Then my reading of feminist philosopher Marilyn Frye's work came back to mind.
She wrote a brilliant essay on separatism, noting among many other important points how the master is allowed to walk into the slave's hut at will and do as he pleases. Within a slave society, a slave is forbidden to bar the master entry, and is certainly not allowed to make free with the master's "property" or entry into the master's house. Note, this means among many other things that a slave has no privacy. As far as possible in the case of plantation slavery, the united states slaveholders and their counterparts in the caribbean were determined to keep levels of slave education as low as possible, assuming that basic illiteracy would prevent slaves from organizing to resist or from otherwise undermining the process of making profits. They were quite sure that without access to writing and with fierce punishments for illicit literacy, it would be impossible for enslaved people to keep the security necessary to resist, survive, and escape. They more than half believed, as too many people still do today, that any person who cannot read or write also cannot plan or reason. They were wrong, and even their ability to search homes and destroy the meagre belongings of their captives at will did not work to maintain their power for long. There is a sadly hypocritical quality to the ways in which united states media likes to trot out caricatures of the east german stasi or the soviet-era kgb as examples of obsessive, ultimately ineffective, but of course dangerous enactors of surveillance states on a day-to-day and household-to-household basis. They trot these propaganda images out to distract us from the ways in which they are advocating the very same surveillance they claim to despise, apparently because they can pretend it's "nothing personal" and just part of "good capitalism, not bad communism." Authoritarianism is bad no matter how it is dressed up, and no matter how much effort its proponents put into hiding their actions to support and enact it.
Only people desperate to impose total control because they can't trust anyone try to force everyone to have such things as constantly internet-connected monitoring devices in their homes, or try to brush aside alarm and anger when people find out that what was supposed to be an innocent application on their phone monitors and uploads recordings of all their doings on a near-realtime basis.

Tone Deaf Messaging (2025-01-27)
Screen shot of an outstandingly tone deaf message, so readers can be reassured the text discussed was not invented, 2023.
These are, in so many ways, excruciating times. When old certainties are shown up as nonsense and the last of zombie colonialism becomes impossible to ignore because the smell has overcome the undertaker's make-up and preservatives, the result is anything but pleasant. I say this all too aware of how very fortunate my circumstances are compared to so many others in the world. An awareness long far from academic. Matters have developed far enough now that even the grim process by which committed capitalists carve off more and more things and ideas to put a price and a marketing plan on them, especially if they could otherwise interfere with the colonial world they love best, now happens quickly and before our eyes. I did not appreciate at first what a warning the sneering term "social justice warrior" was of the beginnings of this very process as applied to genuinely useful and yes, just action. Such terms are the bellwether of a cooptation exercise, in which providing an appearance of doing right is more important than actually producing meaningful change. In fact, the appearance is deliberately encouraged in order to counter and prevent meaningful change. More than one keen-eyed and sharp-eared entrepreneur picked up on old attempts to shut down meaningful change by those claiming racists and their like are merely ignorant or crazy. If they are ignorant or crazy, then technically they are redeemable by treatment or training. All the better if either of these can be sold on a contract basis with a suitable mark up, structured around a model of deliberately practised self-humiliation, to be followed by the reward of a certificate or a job guarantee for confessing sins. I have often heard this referred to as a "maoist struggle session" – well, Mao didn't invent it, and I suspect christian missionaries didn't either. Nevertheless, christian missionaries are among the most infamous practitioners of this sort of psychological terrorism.
But there is plenty of far more benign stuff than that, stuff that seems like it may even come from a well-meaning place, not meant at all to go along with leaving untouched the structural issues that keep sexism, racism, and colonialism firmly in place. At least I hope so. It seems to me that if it were otherwise, they would not be so relentlessly tone-deaf and ridiculous. Reproduced here is a screenshot (to reassure the rightfully skeptical that I didn't make this shit up) of an astonishing message shared with me by an associate who was caught between astonished and appalled when they received the original email. Below is the text reproduced separately, except for the closing salutation. Dear XXXXXX, Self-advocacy and self-care are acts of resistance! Surviving in a world of white supremacy, inequity, and microaggressions comes at a great cost to our mental and emotional stability. We often feel "unwell." We struggle with our emotional and mental health because we struggle to understand what diminishes it. We struggle to understand all the ways we are impacted by survival. The XXXXXX Black Caucus invites you to join us online for a Black health and wellness workshop... In this workshop, you’ll learn to assess your current mental and emotional wellness, explore the aspects of life that contribute to emotional "breakdown," and gain the ability to re-prioritize your current mental and emotional needs. You will walk away with tools and a plan of action to start your wellness journey. Please note that this workshop is intended for XXXXXX members who identify as Black. ... So many wonderful buzzwords, it's an embarrassment of riches. But note, first thing, that the fundamental position expressed in this message is that racialized people, in this case anyone so foolish as to "identify as Black," are the problem here. Furthermore, they make up the problem solely via their individual behaviour. As individuals, such people are not doing a good enough job on the tasks of self-advocacy and self-care. 
Worst of all, these people who "identify as Black" don't know how they feel really, just that they are vaguely "unwell" and therefore are unable to properly care for their mental and emotional health. They need to take workshops to learn how to recognize how they actually feel, and how to re-prioritize their emotional and mental needs. Really, individuals who "identify as Black" are sick, but with a brand new plan of action and the right tools, all will be right with the world in future. Honestly, who in the hell actually wrote this shit? I am a racialized person. When I encounter racist structures and actions in my life, I don't feel "unwell," I feel angry. Furthermore, I am right to be angry, because that stupidity impacts not merely me, busy trying to do my job well and make a decent living, but all the others like me who are prevented from doing their jobs well. It impacts our entire damned society because it means racialized people, including specifically Black people, are stuck wasting their energy fending that shit off instead of being able to put their full energies into their society, families, and yes, themselves as individuals. Tools and tactics to deal with personal needs are, of course, important. But if I do no more than take care of myself, I won't stop being angry, because I won't have dealt, or even tried to deal, with the root of the problem. "I got mine, jack," is not a strategy for ending racism, colonialism, or sexism. It is true anyone needs to start by getting their own house in order before they can help fix larger structural issues, but this email suggests a path that barely does even that. Now, perhaps I am being unfair. After all, there is apparently a Black Caucus, and it is at least highly probable they are enacting an action plan of some sort. 
Such a plan could reasonably consist of providing training meant to help individual Black colleagues, organizing with Black colleagues to take specific actions to change acute expressions of structural racism, and strategizing with a view to breaking those structures apart altogether and replacing them with something healthy and just instead. But where is the reference to this plan? Say something like "We are offering this workshop as part of our planned action items for supporting personal development. To learn about other action items and opportunities to contribute to organizing and political action, see XXXXXX." Or contact so and so to request a copy of the plan, or whatever. As it stands, be advised, "Black identifying" colleagues, that the issues you are facing are actually poor prioritizing and lack of tools to properly assess emotional and mental health. There is a contractor who will deliver a lucrative workshop to put you, and no doubt any other "racially identifying" group who pays the fee, right. (Top) The Meaning of What We Can't Or Won't Imagine (2025-01-20)
Photograph of a coin from a very different time in palestine, suggesting very different possibilities, by Pepe Escobar via Karlof's Geopolitical Gymnasium.
There is a poet who has been writing a poem almost every day, day on day, since the beginning of the COVID-19 pandemic, which is still not over, so he is still writing. He goes by "Z.M.L." and posts his now hundreds of poems on social media and the excellent website and blog Librarian Shipwreck. I did not encounter these wonderful, bittersweet encapsulations of these terrible times until somewhere around the eightieth week. They are short poems, some of which will inevitably stand up better in time than others, and I sincerely hope his circumstances allow a publication of these poems on paper as well as online, in their entirety. In so many ways, he has captured the feelings so many of us share, I think, in a few powerful words. The shock, the disbelief, the horror of realizing the falsity of what many of us have believed in our heart of hearts: that there were enough people in the world in the right places and willing to act in meaningful ways to counter the people whose motivations are so vicious, so self-centred, they would ruin the world just to declaim proudly that they alone did such terrible things, because they were the only ones bold enough to make the world perfect. Please note that the usual meaning of "perfect" is complete and faultless. Please note that in grammar the customary meaning of "perfect" is completed and in the past, therefore unchangeable, and having impact in the present. The only way to maintain a "perfect" state is to prevent any further change. To try to make a "perfect world" in this sense, to pursue such a goal, is to descend into a state of total and homicidal madness. An especially pernicious state in this case, because it is always heralded by the insistence that this is all for "our own good" and to "make the world a better place." How often have we been told that the people who talk about making some kind of perfect world are unrealistic seekers after a utopia, something foolish and unattainable. 
Such people, we are reassured by the mainstream press and those who purport to be "leading thinkers," are harmless cranks. No more. I don't contend I have always resisted such claims. If I did, it would be dishonest at worst, terribly naive at best. But I do know when those claims and others of the same sort began to trouble me. It was the day in a religious studies classroom in a parochial high school when the instructor duly informed us even the best and most noble person, one shown to actually be such not merely reputed to be such, would burn in hell for eternity for not being a roman catholic. Do not pass into purgatory, do not pass go, straight to hell with them, to be tormented forever, and shown up in their torment to all of heaven so the blessed could enjoy seeing them suffer. Considering my age at the time, perhaps this might have gone over my head all the same; after all, due to the vagaries of hormones and brain development, we don't have a lot of emotional depth at high school age. So the thing about the blessed supposedly enjoying the agonies of the damned mostly bounced off of me at that moment, due to the characteristic temporary narcissism of my age at the time. But the logical incongruity of it, the stated insistence on the meaninglessness of actual deeds – that stuck. It wasn't until many years later I came to understand how this type of illogic is made logical, by setting up specific requirements for perfection, which in turn is defined as how best to do what some outstanding authority figure wants for our own good, and therefore anything, anything is permitted to meet the goal of that perfection. Anything. Religion as such need have nothing to do with it. The "authorities" busy striving to prevent all application of non-pharmaceutical interventions to curb and prevent transmission of COVID-19 are not limited to ostensibly secular ones. 
Whether they claim a carefully dressed up version of social darwinism justifies this because supposedly the unfit will be killed off saving society the cost of their care and upkeep, or they claim in the most pious of tones that everything is god's will and if that means the vulnerable will be carried off sparing us the cost of their care and upkeep, they are saying the same thing. Their contempt for anyone who resists them is absolute. So absolute, they write and speak publicly of their hopes that those resisting people die. They sneer at entirely sensible precautions like improving air quality by such means as improving circulation of fresh air and bringing in HEPA filters, which would have additional benefits beyond preventing the transmission of COVID-19. I take note of this type of intervention especially, because it can be effectively invisible to the people who work themselves into a rage when they encounter others wearing respirators or even the most useless of surgical masks. The controversy around masks includes the demented insistence on removing all requirements to wear any type of mask, let alone a basic N95 respirator, in medical facilities. Yes, medical facilities, where logically we would hope great care is taken to curb the spread of any infectious agent for the sake of both patients and staff. This is simply mindboggling. Well, it was, until the "authorities" in this matter, the public health officers and similar, let slip that they "wanted everyone infected." The most recent united nations climate change conference was held in the united arab emirates. The attendees this year were especially notable, a glittering array of oil and gas barons and fawning politicians, all eager to make new and updated deals for the exploitation and sale of hydrocarbons. Rest assured, there were no corduroy-wearing, granola eating, environmentalist types making any statements for the big cameras or allowed to influence the proceedings. 
We are supposed to be reassured because the supposed adults are in the room now, making agreements and conferencing about the impact of ongoing destruction of the ecosystems the majority of people on Earth depend on to survive. The majority who are so poor or at least so distant, they will never take a high-technology plane to an equally high-technology five star hotel to hobnob with the rich, very rich, and obscenely rich. Those people are in the way of "progress" to the "perfect world" in which all good things go only to the deserving, and the rest go to hell. Straight to hell, do not pass go, do not collect 200 dollars. A couple of weeks ago, I encountered this headline, It's Time to Face the Facts: Zionism is Inherently Anti-Semitic, which introduces a searing piece by Leon Lorraine. Many, many such articles are published every day now, about what antisemitism is or isn't, what zionism is or isn't. I make myself read a selection of them, including more than a few that make me deeply uncomfortable but that are absolutely important to engage with. This is useful discomfort, the kind that drives me to ask questions, take a moment to sort out what I think and why. It is more important than ever in these times when so much effort is afoot to pour a pretence of information on us, to make us panic, to guide us into going along with the destructive drive of those committed to a homicidal madness expressed as a determination to do what is for our own good. Never mind that what they want us to do is die promptly when they consider us unprofitable or too resistant, and to refuse to look or question when they demand and enact the mass death of others they have found ways to target. Never mind that. We aren't supposed to notice. We aren't supposed to notice. We aren't supposed to imagine. 
We aren't supposed to notice that secular nations where all inhabitants are respected and are citizens, irrespective of religion, skin colour, or anything else, are more than merely possible, they have and do exist now. We aren't supposed to notice the cruel record of any theocratic government, whatever religion it is supposed to serve, and how that cruelty is enacted against both the people who are officially declared "right" and those condemned as "wrong." We aren't supposed to imagine anything else is possible. After all, the people obsessed with enforcing their personal version of perfection certainly can't. (Top) Anti-Intellectualism is Hardly a "Working Class" Position (2025-01-13)
Cover of the 2 august 1834 penny magazine of the london-based society for the diffusion of useful knowledge, 1826-1848.
One of the most frustrating, and frankly cruel parts of the way knowledge of their own history is denied to most people, is how this very denial can be manipulated to persuade them they "must believe" and "must be" certain things, no matter what. Take for instance the working class of any given place or time, who are so often declared invisible to history because they can't easily be reduced to a few sociopathic "heroes," as if it is a shame not to have such people running roughshod over the community. Let's consider a specific and yes easily accessible example for those of us living in anglophone countries, the nineteenth century working class in england and its extensions into such colonies as canada, australia, and new zealand. (The united states is a related but pointedly different case, due at least as much to the role of government-gangster attacks on unionizing and grassroots organization more broadly as to its origins in secessionist british colonies.) An all too common canard is the claim and insistence that anti-intellectualism and being working class somehow go together, usually rationalized as "because the working class has no time or energy to read, let alone study." It is undeniable that time constraints and intense poverty are violent impediments to much reading or study. But that in no way means working class people therefore can't be interested in or value such activities. Indeed, we have repeated evidence this is not the case via the many, many working class families who strive to ensure at least the youngest members get a more extensive education. While more education is not an absolute guarantee of prosperity or even basic getting by in any capitalist society, it is something that can significantly help the odds. The trick is not to get fooled into undertaking the empty sorts of training and studies the "elites" engage in specifically because they don't need to have real knowledge and skills to succeed in their nepotistic circles. 
They are more circumspect about this than they used to be, no longer so focussed on such upper class credentialling degrees as "classics" – and it does pain me to have to type that, because unless tied to an archaeology or even political science degree a classics major is primarily a form of conspicuous consumption. No, nowadays there are the "ivy league schools" and similar, with a growing infamy due to the role of "legacy admissions" and their equivalent, in which the students entering on that basis needn't work too hard to pass and come away with a C-average degree. If any experience could incline a person to come away a confirmed anti-intellectual, it is this one, and the people of this capitalist and legacy aristocratic class are hard-pressed not to act on that inclination. If you don't believe me, just check a little more into the actual backgrounds of most of the current crop of "industry leaders" and "tech entrepreneurial successes." Time and again you will learn about a mostly male cohort who come from families with inherited wealth of some type, who went on to a private university where they often never bothered to finish their degrees before going off to run the companies they became famous for. To be sure, a proportion of them go back to finish their degrees, I suspect especially the ones who were doing real work, rather than those cheerfully making their way through on B minus to C grades. Meanwhile, in my experience, despite denying they take learning seriously, the majority of the most serious and carefully intellectual people I know are indeed from the working class. They are deeply concerned about the way manual labour and trades are derogated, and even the men are cluing in to how ruthlessly underpaid service jobs are, and how much skill they require, now that they can't avoid having to accept them. They ask tough questions about the nature of career advice, which is often framed as "get into this hot area now!" 
when both trade- and undergraduate-qualified jobs take a certain number of years for people to make their way from student to practitioner. If they are guided or pressured into the current "hit" area, how can they really be sure they will be able to get a job in that area or make a decent wage? We are in capitalism right now, and too many people striving for the same positions craters wages and leaves most jobless. All the indications are that in this very narrow set of anglophone countries, there is a desperate shortage of teachers, doctors, and related professionals alongside a similar dire shortage in many trades. Yet the options to train in any of these fields are constantly narrowing, and the potential employers refuse to pay decent wages to locally trained and educated candidates, preferring to braindrain other countries and manoeuvre those immigrant candidates into poorly waged positions with visa limitations hanging over their heads. The end result is a terrible race to the bottom for everyone. Of course, they don't use the preferred jargon of the various social scientists out there, and this is an easy pretext for many people who are not working class to ignore them. These very people with this sophisticated and pointed analysis express a serious commitment to learning via so-called "informal" channels. In other words, constraints of time, money, inclination, or any two of those three mean they are not paying fees to a college, trade school, or university to learn a complex new skill. They are making savvy use of the public library, the internet, trading work to get direct experience, refurbishing older items, and so on. Not everyone has a house full of books. Those who have and maintain a small collection of well-used volumes reread and think with them all the time, and may even have a liking for providing relevant quotes from them, although that seems a generally frowned upon thing right now in a generally anti-intellectual period. 
All of this is part of an honourable tradition, albeit one tragically nipped in the bud by the two earlier world wars and the interregnum between them, in which the war on labour reached such heights the entire capitalist system nearly collapsed for good. This earlier tradition includes those self-same lines of education outside of formal institutions with high tuition fees, participated in when energy and resources allow. Among the earlier sources of working class intellectual expression were the working men's societies, a portion of the earliest subscription libraries, and a once thriving small press producing magazines and newspapers. Even the short-lived society for the diffusion of useful knowledge, founded by bourgeois do-gooders with colonialist ambitions, sought to hijack these grassroots initiatives. Appreciating there was a pent up demand for modestly priced, accessible books for adults studying on their own in what today is often referred to as "upgrading," the society produced both books and periodicals during its years of operation, 1826-1848. It was a controversial effort of course, since striving to keep "the masses" ignorant is a key desideratum of capitalist classes up to no good. But how to counter the potentially destabilizing effect of what eventually became mandatory basic elementary education for all children, regardless of class and sex? Well, what better counter than to propagate a fog of propaganda claiming the working class is anti-intellectual, even stupid, and furthermore that to be truly working class means denying any interest in learning or any sophisticated understanding of the world! (Top) Chasing Pointers on Symbiosis (2025-01-06)
A wonderfully clever logo from one of the many tribute pages to renowned biologist Lynn Margulis, 1938-2011. The source page for this image is hosted on the earth, geographic, and climate sciences department of the university of massachusetts at amherst.
One of the pleasant tasks of a transition from the old to the new year, similar to completing a major writing project, is going through notes for posts and articles to see what tantalizing bits and pieces need at least some follow up. Caught up in my most recent collection was a note to the effect of "Lynn Margulis - Dorion Sagan - symbiosis - mitochondria used to be separate critters." This particular note is very old; it comes from a note slip tucked into a notebook pointing all the way back to a course I took as an undergraduate. By happy coincidence, Maria Popova at the marginalian undertook one of her signature annotated readings of Lynn Margulis' book Symbiotic Planet: A New Look at Evolution last year, so this note slip didn't just find its way to the recycling bin. Lynn Margulis was a major developer and proponent of the hypothesis, now theory, proposed and named by James Lovelock, in which the Earth's biosphere is understood as Gaia, a great, interdependent, interacting population of life. But the ideas and lines of research Margulis truly made her own were those of symbiogenesis and symbiosis, which focus on how the vast majority of life on Earth is made up of composite beings, including ourselves. Margulis' arguments for and popularization of symbiosis as a factor in evolution, if not its major driver, were quite controversial. If you would like to see a relatively low key expression of the controversy, a great piece to read is the rather grudging preface by Ernst Mayr to her 2002 book with Dorion Sagan, Acquiring Genomes: A Theory of the Origin of Species. The book is a fascinating argument for a careful definition of species as made up of individuals with the same genomes, which has such follow up implications as acknowledging that bacteria, the primordial prokaryotes, have no species. Species begin with eukaryotes and their nuclei. 
Furthermore, Margulis argued that this also indicates a context in which Jean-Baptiste Lamarck's much maligned argument for how species evolve is valid. My high school biology teacher taught us Lamarck was laughable, because he made an absurd argument for children inheriting characteristics newly acquired by their parents in a mechanical sort of way. The recurrent, obviously absurd illustration was that Lamarck supposedly claimed giraffes developed long necks by earlier pre-giraffe parents stretching their necks to reach higher leaves, and then endowing pre-stretched necks to their offspring. It's an unforgettable, and alas, oversimplified image. Margulis argued that in cases such as those by which eukaryotes acquired nuclei and eventually other specialized organelles like mitochondria, the ancestor prokaryotes had indeed opportunistically acquired the first major change, the nucleus, by trying to absorb another prokaryote. Over time and many attempts, eventually an original prokaryote managed this successfully: neither the original prokaryote nor its absorbed compatriot died immediately or before the combined creature could reproduce. And the daughter creature was a eukaryote, and the new eukaryote kept consistently reproducing other eukaryotes often enough for the change to stick. Even on a much larger scale, conceivably other plants and animals could acquire transmissible characteristics by managing to incorporate other creatures in their bodies, such that both are able to survive. In other words, this is not a situation where one creature exploits another, parasitism. This is a situation characterized by symbiosis, a living together benefiting both organisms. Not so long ago, having symbiosis as a major factor in evolution might have seemed like gilding the lily. The consensus had become that there was a sort of genetic descent by modification leading to speciation. 
Except, as creationists were annoyingly fond of pointing out, there was no smooth sequence of fossils to back this idea up, and not for lack of searching on the part of palaeontologists. Random mutations, the supposedly exclusive drivers of genetic speciation, almost exclusively produce deadly or at least reproduction-curtailing changes in the organisms affected by them. Ongoing monitoring of the famous galapagos island finches does not suggest their cycles of physical alterations over time in their genetic isolation are leading to further speciation. The evidence actually suggests what Stephen Jay Gould famously argued for and wrote about in popular science books, punctuated equilibrium. There is indeed evidence evolution may occur in leaps. Not crazy leaps, but leaps all the same. How can they be explained without denying the salience of genetics and the pressure of the environment a community of creatures lives in, which affects whether they are able to reproduce and how many of their offspring survive to reproduce? As Margulis and her colleagues have argued and continue to demonstrate through research since Margulis' passing in 2011, symbiosis is at least a major part of the explanation. Part of what made Margulis' work controversial was not so much the argument for symbiosis in itself, but her considered critique of "western science as usual," specifically the infamously common unilingualism of "western" scholars. Because anglophone scholars were unable to read other languages, and/or unwilling to arrange for translations of relevant literature in other languages, especially those of eastern and southeastern europe, they remained almost completely unaware of significant research into symbiosis, especially in russia. After all, they were already convinced that in the USSR all research into evolution could only be a reflection of the infamous Trofim Denisovich Lysenko, and he was presumed absolutely wrong. 
I did not major in biology and even I am aware of "lysenkoism" at least as a pejorative. Yet as it turned out, whatever Lysenko himself argued, other soviet scholars were making their own arguments based on actual evidence, albeit the inevitably incomplete sort of arguments early interpretations of new evidence enforce. (Top) Outsourcing is a Cop Out (2024-12-30)
September 2007 photograph of a building proximity access card by kymacpherson via wikimedia commons under Creative Commons Attribution-Share Alike 3.0 Unported license.
Admittedly I touched on this barely two thoughtpieces ago, but did not take up the point directly. Part of what inspired this thoughtpiece, besides the latest evidence of rampant censorship achieved via centralization, is the combination of an off-the-cuff blog post by Nico Carton, Keep Your Blog Simple! and a post he cross-referenced by Fabien Sanglard, All You May Need is HTML. Between the two of them, they encapsulate the core reasons a person who wants to have their own website is specifically not served by the software we are lured into using to make one and keep it running. Neither of them strictly rules out that some of the other more complicated options and add-ons may have utility, but they question assuming they must have utility because they are available and widely advertised as "easier" or somehow "necessary." However, at this point, it seems to me a person or business who seeks to delegate the very infrastructure of their website or anything else to other people via a company pretending to do the job for free or for a moderate fee is simply copping out. It's a great excuse when some external service doesn't work properly and we can't fix the issue ourselves. If we literally can't access and apply the tools needed to solve our own problems or create our own stuff, when things don't work as expected we are encouraged to shrug our shoulders and wait to be saved by "the experts." We are encouraged to learn helplessness. Please note, I am not conflating this with a situation where a person considers trying to delegate aspects of their website or other work that can be hooked up to a remote service provider because they are extremely time constrained or dealing with a temporary situation where they need more support than what they have set up themselves can provide. Of course it is entirely reasonable to bring in external help in such emergencies. But most of the time, we are not in such an emergency. 
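The point of a title like All You May Need is HTML can be made concrete. A complete, valid personal web page needs nothing more than a single hand-written file along the lines of the sketch below; this is my own minimal illustration, not code taken from either post, and the title, heading, and text are placeholders:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>My Site</title>
  <!-- One optional inline rule keeps line lengths readable; no framework needed. -->
  <style>body { max-width: 40em; margin: auto; padding: 1em; }</style>
</head>
<body>
  <h1>My Site</h1>
  <p>A hand-written page like this needs no build step, database, or hosted platform.</p>
</body>
</html>
```

A file like this can be opened straight from disk in any browser, or uploaded to the most basic of hosts, with nothing to break and nothing for an external service to hold hostage.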
I find myself thinking of a silly example, and yet a good one. Today the video cassette player has all but vanished from the scene, but not so long ago they were at least as ubiquitous as a basic desktop computer. And they were infamous for their digital clocks, because the procedure for setting them differed between brands and sometimes even between generations of the same brand. Worse yet, these machines were not equipped with the now standard battery backup included in many digital devices otherwise connected to a main power supply. So whenever the power went out, these machines forgot what time it was, morosely blinking 12:00 until someone reset them, an absolutely necessary step just to use the clock, let alone to automatically record anything. When I moved out on my own, for the first year or so I would periodically get a phone call from my parents in which they would request I talk them through resetting the clock on the vcr. We all laughed about it, and of course in time they didn't need me to help them with it. The year they replaced the old vcr with a new one and triumphantly set up the new one on their own in spite of the hilariously badly translated user manual still stands out, not least because of the sidesplittingly funny phone call in which they provided an exegesis of that user manual. Then again, I didn't have the motivations of the various hucksters who would like to persuade us to learn to be helpless and incurious. I had no wish to encourage or extend an expensive contract for so-called "cloud computing," nor an angle on data mining or surveillance via offering a supposedly "free" service. Nor am I part of a company with a pretence to being a "start up" sniffing after vulture capital to continue keeping the boat afloat even as it constantly loses money. 
Without such perverse incentives in the mix, it is not so easy to see any reason to immediately recommend "cloud computing," access to one or another large language model, or using some centralized "free service" like wordpress hosting by automattic or similar. As too many of us have learned directly in many countries where people claiming to be "elites" and elected to office have indulged in neoliberal stupidity, "outsourcing" means massive cost overruns, worse or no service, and a horrible mess to clean up, made all the worse nowadays by the regular inclusion of database creation with basically no security. "Outsourcing" as any sort of supposedly longterm solution is a cop out, if not also a scam, every time. If on a small, personal scale you are looking to try out something new like building a small website and putting it online, we no longer have the plausible deniability of not knowing "free" hosting isn't free, nor of there being "not enough documentation" to help us learn the basics to do what we need to do. Nor is there such plausible deniability on a large scale, where the evidence of the failure of outsourcing to cut costs or provide dependable results is all around us. (Top) Presentism Strikes Again (2024-12-23)
Yanpeng Cao's photograph of neolithic Pingliangtai's ceramic drainage pipes in place, via the 14 august 2023 article on the site at phys.org. The full article is published in the open access journal nature water as Earliest Ceramic Drainage System and the Formation of Hydro-Sociality in Monsoonal East Asia.
I wrote a startlingly long time ago about "presentism" and its frustratingly vague common definition, one which continues to make it a slur more than a useful concept when invoked. In the title of this thoughtpiece it is intended to be a bit tongue in cheek, but also to refer to an example closer to what it seems to me "presentism" can be defined as. Perhaps there is in fact a better word already applied to what I am suggesting here. What I understand by "presentism" here is the insistence that the only way to do things at any time is the very way we are doing things now. The OED indicates this is quite consistent with what people mean at least some of the time, and it declares the word means "uncritical adherence to present-day attitudes, especially the tendency to interpret past events in terms of modern values and concepts." The tricky word the OED writer has in there is "uncritical," which is how we can account for such obvious points as slavery and colonialism being evils in any age, not least because the victims of these vicious practices of any time period leave clear evidence of how profoundly horrible they found and find these evils to be. Okay, with those points ferreted out, I think the term "presentism" can fairly be applied to instances where, even in defiance of the actual evidence before them, a person insists that the only way a past thing could have been achieved is by the ways such things are achieved today. Now, here we have something more specific and therefore able to help avoid name calling or seeming to be name calling. It also draws out how slipping into presentist interpretations or expectations may be closely related to deeply embedded notions of what is "common sense" and even "moral." So when presentism strikes, the person afflicted may end up quite annoyed at being called on it, even prone to understanding the challenge as being to their character as opposed to how they have interpreted the evidence at hand. 
Of course, in our current era of hothouse politics and rampant censorship, it is possible they are right. Only checking whether the question or critique focusses on the evidence under discussion can tell us that for sure. What brought all this to mind is stumbling upon a brief news item discussing an impressive neolithic site in pingliangtai, china. On consulting the map provided in the article on the site in nature, I can add that pingliangtai is west of shanghai and south of beijing, hence in eastern china between the two great rivers, the yellow and the yangtze. So this is a location where water management is of special salience. Contrary to the assumptions of the "hydraulic civilization," that already thoroughly debunked but stubbornly persistent invention of colonial anthropologists, the physical evidence indicates neolithic pingliangtai did not create a hierarchical society to manage flooding and drainage. In fact, the remains at the site indicate communal construction and maintenance of elaborate networks of ceramic water pipes. The people there managed the planning and work with what archaeologists seem to always make a point of calling "stone age" tools. Now yes, literally that is true, "neolithic" refers to "new stone," and apparently the people at pingliangtai were not gathering and smelting metal for tools. No doubt this made some of their work more difficult than it would have been with iron spades and such, but there is wonderful work from other places warning us off of assuming too much. I think specifically of the example of the pyramids in egypt, where the quarry workers successfully used copper chisels to break the stone into usable slabs, and later smoothed them with the same sort of chisel. This was not trivial work, but it is important to know they worked with a limestone that on first release from its matrix is still moist and soft as strong wood, becoming much harder as it dries out. 
Furthermore, the ancient egyptians worked with the grain of the stone, not against it, which also makes things much easier. Without iron tools, people had to know their materials deeply and well because it was much harder to simply bash through obstacles. Returning to pingliangtai, the overview article at phys.org comments, "...despite the apparent lack of a centralized authority, the town's population came together and undertook the careful coordination needed to produce the ceramic pipes, plan their layout, install and maintain them, a project which likely took a great deal of effort from much of the community." The underlying idea seems to be that it isn't that people can't work together without a coercive authority over them to plan and carry out a complex project, but that they won't, because of an overriding selfishness and insistence on short term gain. Furthermore, there is another underlying idea, that labouring on such work must be overall miserable. But we also have a great deal of evidence that people must learn such total selfishness and short term thinking, and societies do not necessarily encourage or reward it. And I find myself thinking of smaller-scale examples, like the most tedious part of quilting, sewing the cover to the back, the central event of the quilting bee. Or to go up a step in scale, the all-in work of harvesting fields. The labour required may be hard or tedious, but spread out among many hands it can draw people together and become lighter for everyone to bear. The people of neolithic pingliangtai may have found satisfaction not only in successfully planning, building, and maintaining their ceramic pipe system, but also in the social relationships and cooperation entailed in completing all these things. I don't intend this in a romanticizing way. 
The article and paper about the site provide a population estimate for neolithic pingliangtai of 500 people, and they did have both protective walls and a moat around their village. The moat may have served as much if not more for managing livestock and water than for defence, and it is not clear from the full paper whether the walls served a defensive function against other people. No doubt the neolithic village had the same basic sorts of human challenges any community does, of interpersonal rivalries and dealing with setbacks and so on. Fair enough that this was a small village. Yet it was a small village building and managing an elaborate drainage system without waiting for an outside leader to organize them or otherwise force them to do it, all contrary to what we are encouraged to expect in our grim present. (Top) A Self Cleaning House (2024-12-16)
Scale model of Frances GABe's self-cleaning house, courtesy of the hagley museum in delaware. Site accessed 1 december 2023.
One of my current research projects is a study looking into how women apply their capacity to invent new technologies and adapt those invented by others. Often forced to do the "shit work" and anything else so long as it doesn't seem glamorous or otherwise potentially ego and status-boosting, women are not as likely to have either need or opportunity to apply their creativity to the challenges of weapon design or propaganda dissemination. It is not without reason that a whole range of stories centring on men and their inventions feature them seeking to usurp divine powers in some way, usually epitomized by a search for absolute power over life and death, if not of the world, then at least of one or a few people. Perhaps inevitably, after wading through far more writing and dilation on Mary Wollstonecraft Shelley's novel Frankenstein than was of any use at all, I began to locate better and more practical sources, including Autumn Stanley's wonderful opus, Mothers and Daughters of Invention: Notes For A Revised History of Technology (New Jersey: The Scarecrow Press, Inc., 1993). It was through Stanley's work I first learned about Frances GABe (yes, this is how she spelt her surname) and her self-cleaning house, with its fascinating echoes of some of the preliminary analysis of housework in the work of Charlotte Perkins Gilman. But admittedly, I didn't follow up on it much until a different research rabbit hole led me to an obituary of Frances GABe, and finally a few more detailed descriptions and photographs of her self-cleaning house as partially realized and in models. The models are now held at the hagley museum in delaware. GABe had a flair for both invention and publicity. Unfortunately, her approach to solving the problem of boring but necessary housework did not catch on, even though it takes a clever line. She decided there was no point in arguing for men and boys to take up their share of the work. 
Instead, GABe chose to do all she could to make real the dream of a sort of "push button washable" home. If housework could be rendered as simple and quick as possible for one person to handle, even if it was always the woman, well, that would still be a victory. Among her most notable and I think clever ideas was her effort to design cupboards and closets to be both storage and washing machines, with all the attendant space-saving implications. However, the trouble with the activities generally labelled "housework" is how diverse and constantly changing they are, as well as how "modern" houses are effectively designed to be miserable to keep clean or otherwise in order so long as anyone is living in them. As a result GABe took up questions of how to redesign the structure of the house to allow floors to drain evenly and to install her systems intended to provide spray cleaning and subsequent drying. But overall she never managed to achieve much take up for her ideas, probably not helped by the ways she was forced to incorporate waterproofing. For my part, I wondered how GABe would have solved the issue of the constant plumbing work and maintenance her set up would demand, which alas she never had the chance to work out fully. Overall, GABe's approach seems very compatible with the present capitalist fundamentalist obsessions of the united states. Her designs were scaled to individual "family" homes, and would have entailed lots of accessory sales and maintenance. It posed no challenge to the imposed vision of the so-called "nuclear family," and even hinted at possibilities for meeting the absurd cleanliness standards of the sorts of "women's magazines" recommending schedules for washing walls and how to keep the ironed pleats in dress pants crisp. It even seems like with some other adjustments her ideas could have been adapted by ship and recreational vehicle builders. 
Even the full and successful implementation of her concepts in a broad sense, supposing in the end the cleaning and drying of walls and floors via sprinklers and fans had to be dropped, would still have left untouched the assumption that women are responsible for doing all the chores to keep a house tidy and the inhabitants in clean clothes and decent meals. While GABe unambiguously identified that the problems with the efficiency and efficacy of housework lay in male dominance of house design, she did not seem to spend much time on that critique. She had her hands full enough trying to get a hearing that didn't present her as a crank, a challenge not uncommon among inventors who don't work for someone else in a corporate laboratory. In the end I think the main barrier to a more serious assessment and development of GABe's self-cleaning house was not technology, or as many of the articles about her imply, GABe herself. Far from it. Despite her apparently unchallenging plans for revolutionizing housework in terms of who would be responsible for it, GABe still presented a powerful challenge to sexist preconceptions. The central one of these presumptions is that women and girls have neither intelligence nor imagination, so dull and unremitting work no man or boy values should satisfy them. The next layer of presumption on this centre is that women and girls' time has no value and there is nothing better for them to do than be unpaid servants to men and boys. No matter how lauded and desirable automation is supposed to be when the labour is done by men, as soon as it is done by women, no actual labour-saving devices or designs are allowed. To automate the labour deemed too demeaning for men or boys to do is in effect to let slip that it is labour in the first place. Indeed, there are multiple studies demonstrating how what men consider "labour-saving devices" for women are in fact about saving men the labour of wasting women's time and directly confining them to the house. 
(Top) About Those App Stores (2024-12-09)
Graph of applications and downloads on the apple appstore created in 2013 by Sjoerddebruin via wikimedia commons under creative commons attribution-share alike 3.0 unported license.
"Appstores" sound like a sensible solution to the challenge of helping software developers sell and distribute their software, provided we are not aware of how software was distributed and paid for before today's internet. That is, before the current pressure to reduce the web part of the internet to little more than a catalogue with a shopping cart and thousands of spy scripts attached to it, carved up among a few monopolistic corporations. If nothing else, this results in a considerable amount of centralization. What seems good about such centralization is the creation of a sort of consistent marketplace where smaller sellers don't have to set up and maintain their own virtual storefronts; they just pay rent to hang out their shingle and sell their stuff. No matter how often reality shows this model facilitates extortion by the party who can manage to control a centralized marketplace and then engross as many sellers into using it as possible, people keep trying it. Yet it clearly is feasible to refuse to participate in the appstore model, and many firms do so, older ones as well as small to medium size software and software-service tied firms, among them both free/libre committed organizations and those sticking to proprietary software. I suspect a great many of us, and in the "us" yes I am explicitly including myself, missed the possible dangers because the advent of online music stores was the obvious analogy, albeit as it turned out an incomplete one. By rights, online music stores should make it possible to make massive back catalogues available and ease the pressure on artists with smaller fan bases when they are seeking to record and distribute their music. It even looked feasible to find a better way to handle the singles versus albums problem, because all too many artists have a knack for singles but are hopeless at albums. 
In one of those backhanded seeming "good deeds" corporations are known for, when apple made a success of what has become one of the biggest online music stores, it apparently muscled DRM out of music sales after initially having it. Now DRM has crept back in via the back door, and too many music artists are caught between two stools. They either use an online distribution and sales service of some kind, usually surrendering a cut of each sale, or they can commit to setting up and maintaining their own small-scale website and store. But of course, the store will likely be a plug-in from yet another centralized company that offers support of sales and distribution "as a service." But that is just one set of headaches. The only proprietary software program I praise as the exception that proves the rule is BBEdit. It's amazing, and as best I can tell no other coding IDE with a firm orientation towards website development and maintenance matches it. There are valiant free/libre projects that come remarkably close despite having one-person or tiny team developers and maintainers and much more recent foundings, such as bluefish and pluma, but no real match. Having spent a long time primarily in the macosx world, I purchased BBEdit from barebones software directly before there was an apple appstore for the desktop. It took some time before barebones software could sort out an apple appstore purchase option, and I never opted to switch to it. There are several reasons for this, first and foremost being the excellent notification and update service barebones software had already implemented. Probably what seems the weakest reason is simply that I am ornery and saw no reason to change things. Plus, by then free/libre software was the primary software I used, pushed along on that path by apple's decision to cripple applescript support and leave the applescript documentation in a hot mess. 
And then one day, I went to the macosx dock, and saw one of the rarely used apple application icons overlaid with a white circle crossed by a diagonal line, strange to me at the time. Then error messages began to pop up. By then Mozilla had already suffered an issue with its servers that caused firefox extensions to fail verification and refuse to load, so these messages were more intelligible than they might otherwise have been. Now I had undeniable proof that applications "purchased" through apple's appstore dialled home constantly to be recertified as duly purchased and therefore allowed to run. If any of those programs had been mission critical for me, well, let's just say it would have been bad. The only exception was BBEdit, which to this day, along with the other barebones software offerings, is still a proper application, not a subscription. The appstore issue didn't just affect apple's own offerings of course; it affected any program purchased through the store with its associated licensing arrangement. As can be imagined, this experience gave me a lot to think about, and contributed directly to my decision not to purchase apple products again when upgrading my hardware stack. The touchy part when it comes to appstores and similar options online and off is the sense that they represent a way to delegate a task perceived as time consuming or difficult to somebody better at it or with more time to manage it. It seems like a time saver at least, even if not necessarily much of a money saver. And these perceptions can be true, and it is even possible in many cases to find free/libre software based options and other means to avoid spying and what amounts to getting stuck in a position of "Nice business/website/podcast/or whatever you got there. It'd be too bad if our verification server went down, wouldn't it?" 
This acknowledged, those perceptions are often not true, especially for the majority of us who are not trying to do larger scale, more security-challenging tasks like running email servers or providing secure online sales. If it is still difficult to do what we need to do, I think this means not necessarily that we need these "helpful" centralizing "services," but more likely the task is needlessly difficult. This often means there is a dearth of documentation, or the documentation is too telegraphic because it is still not much more than the crib notes an expert uses. Appstores came in at a time when it was not quite so clear and easy even to carry out such sensible steps as checksumming, and when far too many people were convinced the command line is scary. In other words, they were not so much a help as a means to encourage people to learn to be helpless around computers and anything else "complicated," so they treat what should be tools as a sort of master. Obviously this is a viciously unethical and disrespectful line for the various large computer and software companies to take, and it must be opposed. One of the more troubling bad habits of all too many free/libre software activists is their insistence on referring to graphical user interfaces as "slobberware" and indulging in a level of insult and hate against apple products that remarkably outstrips their expressions of contempt for microsoft and ibm. Part of what makes it too bad is how their genuine concern and critique gets lost in an apparent contempt for people who appreciate being able to use a well-designed graphical user interface in combination with the command line, and even more so anyone who would like the computer interface they use all day to not be an aesthetic shitshow. 
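The checksumming step just mentioned is a good example of a task that only looks arcane. Here is a minimal sketch in Python using only the standard library; the function names are my own for illustration, not any particular project's tooling:

```python
import hashlib

def sha256_of(path, chunk_size=65536):
    """Compute the SHA-256 digest of a file, reading it in chunks
    so that even large downloads never need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_published(path, published_hex):
    """Compare a downloaded file's digest against the checksum a
    project publishes alongside the download link."""
    return sha256_of(path) == published_hex.strip().lower()
```

This is the same comparison carried out by `sha256sum --check` on most gnu/linux systems, or `shasum -a 256` on the bsds and macosx, so there is no need for any centralized "service" to stand between the downloader and the verification.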
The concern of these activists, that the design of and changes to graphical user interfaces are often hostile to any desire or commitment to learn how to operate, administer, customize, and program the computer, reflecting an ongoing attempt to destroy general purpose computing outside of military and spying institutions, is more than borne out by our present circumstances. But there is a difficulty here, because a great many of those activists seem caught on what they perceive as a dilemma. They actually aren't committed to ensuring general purpose computers are accessible and able to be managed and secured by the public at large. They would much prefer general purpose computers without the surveillance and other interference, but best accessible only to a sort of pseudo-priesthood of computer science majors or similar. They value the ways in which they have readjusted themselves to the computer, and are contemptuous of approaches to programming and hardware intended to make it easier for anyone to work with a computer without having to learn arcane commands or read circuit diagrams. Thankfully this particular sort of free/libre software activist and their free/libre hardware counterparts are not very common nowadays. The rest get that this is not a real dilemma. To achieve the preservation and improvement of free/libre hardware and software, it turns out that making it possible for anyone to learn how to customize computer hardware and software without being forced to grovel for permission to a corporation is the better approach. After all, whether a person opts for some version of gnu/linux or free/openbsd, they can choose not to install a graphical user interface, remove one already present on the system, or simply never run it. None of these things are options on the typical mainstream closed operating systems. The issue was the closure of options, a closure of a dishonest and disrespectful kind. 
Why, even "appstores" have variants these days that are not nightmare roach hotels with ever increasing fee cuts for the corporate proprietors, including what aren't "stores" at all but graphical user interfaces on top of apt (gnu/linux) or ports (bsd). How ironic that those interfaces are mostly flaky and slow, so for my part I tend to manage software from the command line these days. The exception in my experience so far is MATE's software centre. (Top) That's Really Rich (2024-12-02)
1554 print by Heinrich Aldegrever held by the metropolitan museum of art in new york, via wikimedia commons.
By all accounts, the actual popular response to the ever rising levels of carbon dioxide is thoroughly unsatisfactory. Various people in high level government and non-government positions inveigh against poor take up of electric vehicles and so-called "smart meters" on electricity, against single use plastic wrappers and containers, and always, always, against most people eating any meat whatsoever. The level of carbon dioxide in the atmosphere is still going up, the average temperature on Earth is going up, and more weather-related disasters are happening, disproportionately impacting the poorest and often southern hemisphere populations. On top of that, more and more people outside these scolding circles are declaring global warming nonsense because the climate naturally is always changing, so piss off. I don't agree with this denial that human driven carbon dioxide release and ecosystem destruction are pushing global warming and knock on climate alterations destructive to conditions the current human population is most familiar with. But I do understand where the skepticism is coming from, because a great many of the loudest blowhards demanding "we" cut back on supposed luxuries like meat eating and using vehicles run by internal combustion engines are among the most cash-endowed on the planet, and they give up precisely nothing, or can finance highly expensive, industrially generated replacements for their own comfort. When they repeat real scientific observations, it inevitably seems like they are telling self-serving lies. They know what they're doing, and clearly believe that no matter what happens on Earth they'll be fine. They are sure that no matter what happens, they will continue in the lives of luxury to which they have become accustomed. There is an unjust and undeniable disproportion in the level of use of the necessities of life and wastage of the rest by so-called "western" countries in the northern hemisphere. 
Yet one of the biggest and most effective first steps in ending that is taking down imperialism, via ending mass warfare and arms sales by state elites with pretences to absolute power, which they think they have by being able to kill the majority of people on Earth. The biggest guzzler of all needful resources and emitter of carbon is the military, and the military parasitizes the civilian. The second most effective step would be to ground and disassemble every single private jet, train, and ship on Earth. The jet set rich like to pretend their purchase of "carbon credits" makes up for their constant flying and luxury voyages. Of course they do, because "carbon credits" are more conspicuous consumption without enforcing any changes on them. The third step would be shutting down industrial agriculture, including decentralizing food production. Yes, that means vegetables out of season and exotic fruits and vegetables will all but end for the slice of the population that is not rich. Of course, in effect this means shutting down capitalism, and demands such steps as nationalizing major public goods from communications to transportation systems and banks. It also means important legal changes, like banning inherited wealth. Hoarded wealth must be dispersed again. In turn that demands an end to patriarchy, because it is a key factor allowing rationalization of rampant theft by violence in order to hoard riches and hand them down to male relatives. No matter how much the relatively rich demand everyone else stop using energy or eating very much, preferably dying quietly after labouring for their profit, these demands are not solutions. The problem is not that there are too many humans, the issue is that there are too many rich people who refuse to accept that anyone else is human and has the right to live decently. This particular social and economic stratum benefits by a patriarchal capitalist system predicated on exploiting the majority. 
They insist that wages should be as low as possible, and that all the social care and support be extracted from women for neither wages nor any sort of support in return, with those women also pressured to keep bearing and somehow raising children until they die of the childbearing or the incessant work. But these very demands pressure the not so lucky majority to have many children in hopes a few survive for the parents to live off of in their old age, and furthermore push that unlucky majority to enter wage labour in industrial production. The industrial production is destructive of ecosystems and pours carbon dioxide into the atmosphere. But the people most committed to maintaining the system that keeps them in luxuries to such levels that they are bored, randomly destroying the many things that have become cloying to them, can't accept that the problem is them. No, they are sure the real problem is the surplus labouring population, which is always increasing even if the literal world population goes steady state, because these "employers" are replacing labour with machinery as fast as they can manage it. It's pretty rich how a bunch of greedy assholes are fixated on how the rest of us should be stripped down to eating bugs and wearing sandals and a cassock while the lucky few walk an hour or more a day to their wage job. Whyever aren't the rest of us doing penance for their greed and evil by swearing vows of eternal poverty and begging them for forgiveness, right? (Top) Monkeys Typing (2024-11-25)
Image from Robby Kraft's run of a face detection algorithm on over 7000 noise images, from his november 2015 blogpost (web archive link | alternate) during his participation in the school for poetic computation in fall 2015.
There is a sort of parable, intended originally I think to make a point about probability, in which an infinite number of monkeys whacking randomly on an infinite number of typewriters might eventually produce Shakespeare's play Hamlet, Prince of Denmark. I am deliberately providing my half-remembered rendition of the attempted parable, because this is usually how it is referenced, sometimes quite telegraphically. ("Monkeys, typewriters, Hamlet, you know.") A person with more mathematical expertise will recognize this as a rather colloquialized version of what is sometimes called the infinite monkey theorem, and there is ongoing argument about its origins. Jorge Luis Borges had a special interest in it, perhaps most elegantly expressed in his strange but compelling short story The Library of Babel, which has an online realization at libraryofbabel.info. What has persistently brought this to mind of late is the trailing edge of the hype and hucksterism centred on so-called "artificial intelligence." The computer programs are artificial enough, but they are hardly intelligent, that is, capable of independently generating reasonable new ideas, extrapolations, or interpolations from what is already known to the ostensible thinker. Which is not to say that these programs have not demonstrated something useful, and had long before the excruciating and stupid waste of energy and time they represent via their "training periods" and such. Human intelligence is not simply the product of pattern matching or highly constrained interpolations or extrapolations. Those sorts of things definitely seem to be a part of human intelligence, and of many other forms of humanly intelligible intelligence (say that ten times fast!), just not everything. Truth be told, I am not convinced these programs amped up via brute force addition of more and more CPUs and RAM were necessary at all to figure this out. 
But I do get that there are a whole lot of obsessive people paid to work in the oxymoronically named field of military intelligence who want these techniques because they think they could render any, all, and every form of encryption useless for keeping secrets from them. The image reproduced here from Robby Kraft's study of a standard, open source face detection algorithm may seem a bit unrelated, but it is not so far off the usual "AI" beaten path. The algorithm, turned loose on 7000 noise images, gradually begins to build up a ghostly pattern reminiscent of a face. There were no faces in the noise. The effect is an inexorable result of the development and training of the equations and matrices used in the algorithm to carry out the identification. That's okay, there's nothing wrong with that. It's an understandable effect, and so far as I know, nobody has confused this effect with the algorithm suddenly becoming self-running and conscious, thereby deliberately running off to try to find some pixels somewhere, somehow, to get a face of some kind in there. No, it just ran long enough, and it is actually a bit surprising that it took just 7000 images to get the results Kraft did. Without trying even a bit of back of the envelope calculation, it seems most likely that tipping in more noise images would simply wash out the image visible here, but that is exactly the sort of common sense estimation which should not be trusted but tested. Mathematical probabilities can be very surprising when the details of the algorithm at hand are not set out and explained, which I have not done here, and Kraft did not for the purposes of his post. By now it is probably already quite clear where, in part, this thoughtpiece is going. What we put into such big, number crunching programs will tend to reproduce the stuff we put into them, however inchoately. On top of that, we humans have a propensity for slipping into speaking of something that seems to respond meaningfully to us as alive. 
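Kraft's actual run used a standard face detector over real noise images, and I have not reproduced his setup. But the accumulation effect he observed can be imitated in miniature with nothing but the Python standard library: in the purely illustrative sketch below, the "detector" is just correlation against a hidden template of my own invention, and averaging only the noise grids it accepts slowly reproduces that template out of pure noise.

```python
import random

SIZE = 8
rng = random.Random(42)

# The toy detector's hidden "face": two eyes and a mouth, mean-centred.
template = [[0.0] * SIZE for _ in range(SIZE)]
template[2][2] = template[2][5] = 1.0
for col in range(2, 6):
    template[5][col] = 1.0
mean = sum(map(sum, template)) / (SIZE * SIZE)
template = [[value - mean for value in row] for row in template]

def score(patch):
    """Correlate a grid against the template; the detector 'sees a
    face' whenever this sum is high enough."""
    return sum(p * t
               for prow, trow in zip(patch, template)
               for p, t in zip(prow, trow))

# Run 7000 pure-noise grids past the detector, keeping the accepted ones.
accepted = []
for _ in range(7000):
    patch = [[rng.gauss(0, 1) for _ in range(SIZE)] for _ in range(SIZE)]
    if score(patch) > 2.0:  # arbitrary threshold chosen for this sketch
        accepted.append(patch)

# The "ghost face": the average of everything the detector accepted.
ghost = [[sum(patch[r][c] for patch in accepted) / len(accepted)
          for c in range(SIZE)]
         for r in range(SIZE)]
```

No grid contains a face, but conditioning the average on the detector's acceptances biases it toward the very pattern the detector is looking for, which is the ghostly effect in a nutshell.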
We've all done it, referred to a poorly running car as not cooperating with us, or followed the practice of speaking of all ships as "she" and "her." This is all fine, so long as we don't fool ourselves into thinking we are dealing with another intelligence. It puts me in mind of an episode of the old Columbo mystery series, from a time when such programs had regular cameos by popular actors and other performers of the day. A famous ventriloquist of the time featured in this episode with his dummy, which hinged on him playing a character who came to believe his dummy was a separate being. Part of what makes the episode work is our general knowledge of how yes, we can fool ourselves into such ideas, though thankfully usually we are able to realize we are mistaken. The other thing worth taking a second look at with respect to the recent claims about "artificial intelligence" is the parlour game I learned and played under the name of "mad libs." This is apparently a trade name for the game developed, trademarked, and sold by a pair of writers with a specialty in comedy. It takes advantage of the structured and patterned nature of human language, with the humour and unpredictability of it generated by mechanically separating selection of nouns, verbs, adjectives, and adverbs from composing the final sentence. This is achieved by defining a set of templates with blanks to pop the selected words into. Large language models bear a more than passing resemblance to this, but are intended to produce a probable, sensible sentence without human intervention after the algorithm is tweaked to select words based on the probability they will fall in that spot in the sentence. I have read it described as: given the word before, the program places the most probable next word. This must be a vast oversimplification of what is happening, even in the context of now solidly developed machine translation. There the corpus of data makes all the difference. 
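To make the comparison concrete, here is a hedged sketch in Python of both halves: a mad libs style template, where word selection is mechanically separated from sentence composition, and the "most probable next word" description as a toy bigram model. The word lists and corpus are invented for illustration, and real large language models are of course vastly more elaborate than this.

```python
import random
from collections import Counter, defaultdict

# Mad libs style: pick the words first, compose the sentence after.
rng = random.Random(42)
nouns = ["cat", "typewriter", "teapot"]
verbs = ["juggles", "paints", "negotiates"]
adjectives = ["purple", "suspicious", "gleaming"]

template = "The {adj} {noun} {verb} daily."
print(template.format(adj=rng.choice(adjectives),
                      noun=rng.choice(nouns),
                      verb=rng.choice(verbs)))

# Bigram style: counts over a toy corpus become the "probability a word
# will fall in that spot," and the most probable follower fills it.
corpus = (
    "the cat sat on the mat and the cat saw a bird "
    "and the cat sat on the mat"
).split()

following = defaultdict(Counter)  # word -> counts of the words after it
for before, after in zip(corpus, corpus[1:]):
    following[before][after] += 1

def most_probable_next(word):
    """The most frequent follower of `word` in the toy corpus."""
    return following[word].most_common(1)[0][0]

def generate(start, length):
    words = [start]
    for _ in range(length - 1):
        words.append(most_probable_next(words[-1]))
    return " ".join(words)

print(generate("the", 5))  # "the cat sat on the"
```

The generated text is grammatical-looking precisely because the corpus was; feed the counts gobbledygook and the same machinery will dutifully emit gobbledygook.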
The initial texts used give the subsequent machine translations a particular aptitude for translating documents similar to them, as compared to others. I suspect a great many machine translation programs are best at "translating" one of the various editions of the christian bible because of how often it has been translated already by missionaries. This makes sense. Translation is an especially nice example here, because we can see immediately how feeding such a program gobbledygook will lead it to produce gobbledygook itself if it is then set to translate some real text. Now here is the difficult part. The computer programmers who have fed the various "GPTs" gigabytes and more of data to "train" them depend on those datasets being overwhelmingly sensible, not garbage. But they don't seem to be carrying out that training on as curated a collection of materials as machine translation programs. This too makes sense, because one goal of the whole project, at least as outwardly presented, is to produce a program better and better able to mimic real conversation and produce written material without forcing the human participant to do the writing. The result though is that when these programs are run more and more often, they begin to reveal more and more of what they were "trained" on, and are prone to putting together piles of at first plausible looking nonsense. There is a wonderful moment illustrating this point in Alice's Adventures in Wonderland during the mad tea party, when Carroll writes, "Alice felt dreadfully puzzled. The Hatter's remark seemed to her to have no sort of meaning in it, and yet it was certainly English." (Top) Historical Cats (2024-11-18)
Door at the bottom of the exeter cathedral astronomical clock taken by Julian P. Guffogg in 2012, used under Creative Commons Attribution – Share Alike 2.0 License via geograph.org.uk.
The abiding human fascination with cats is incredibly old, and a ready source of jokes and teasing between friends. It is by no means merely the bane of our present existence, a marketing phenomenon, or that phenomenon's derivative, the "social media meme." Besides the often noted cat deities in ancient egypt, and the wonderful japanese Maneki Neko, the "beckoning cats" popular across much of asia and believed to bring good luck, there are distinctive cat breeds with evidence deep in the archaeological record wherever "domesticated" cats are found. There are plenty of practical reasons for the ancestors of those cats to throw their lot in with humans, not least the human propensity to create homes all too inviting for small rodents and other creatures small cats like to number among their prey. Contrary to what might be expected, house cats can be quite happy to eat beetles, so long as the beetles in question are not bad tasting and are a reasonable challenge to catch. For a gently famous example, readers will be well served by having a look at the late Ursula K. Le Guin's blog, where among the many posts are a charming series she came to label "The Annals of Pard." "Working cats" have a storied history all their own, including encouraging the development of the cat door. I had no idea anyone was tracking where and when the earliest of these could be found, as in the fourteenth century example from exeter cathedral illustrated here. Indeed, there is documentation of a regular stipend for the cathedral's cat, ensuring it would not go without food even if it successfully eliminated the mice for that season. All that said, many people have cats not as working animals, but simply as companion animals. This is not so surprising: many cats are personable and are happy to use humans as ready heat sources, and their purring is a pleasing mystery. To this day it is an abiding puzzle how cats purr, which they are able to do even while eating, drinking, or sleeping. 
Still, there are less personable cats, those disinclined to allow humans to pet them or to be "lap cats." The evident intelligence and acrobatic talents of cats make them real characters in their own right, as well as independent enough not to need the sort of constant attention needed by pack animals like dogs. Arguably they are the closest to real life aliens most humans get to meet, and they are willing and able to exchange gazes with humans, often referred to as a "staring contest." Of course, this also means there are humans who deeply dislike cats precisely because cats have minds of their own and do not depend upon the humans they live with for approval and reinforcement. Cats may indeed choose their "owners." They are also unfairly accused of being the reason for recently dropping numbers of small songbirds, heedless of the fact that if house cats were the actual cause, then in many countries the songbird population should have been gone centuries ago. In north america the accusation seems somewhat more plausible, because house cats were introduced by invading europeans. So what could possibly have led to this lighthearted and happily silly thoughtpiece? No less than stumbling upon the growing phenomenon of "cat cafés," which strike me as among the most charming and sensible approaches to finding homes for cats who find their way into animal shelters. Since cats are animals of such strong character, with particular needs in terms of proper feeding and litter box management, having a feline member of the household is not a trivial commitment. Cat cafés allow people to spend time playing with and getting to know adoptable cats, finding a compatible cat or learning in a sensible way that a specific person or family is not well suited to include a cat in their household. 
For those unable to adopt a cat even though they are willing and have compatible personalities with one, these cafés are one way to spend time with cats, alongside volunteering at traditional animal shelters. I have just learned these unique places were first opened in taiwan, where they were a hit with japanese tourists, and the idea soon began to make its way around the world. (Top) Older Tone Deaf PR Campaigns (2024-11-11)
The original keep calm and carry on poster, quoted from the keepcalmandcarryon company website, accessed 3 august 2023. It is grim and surreal to realise just how commercialized this old propaganda campaign has become.
Several thoughtpieces back, I wrote about projection and how I learnt about the notion of "elite panic" (see Military Projection). I commented very briefly on the wildly tone deaf and luckily withdrawn just in time "Keep Calm and Carry On" posters, one of a series of frankly stupid concepts for making the "poors" behave under stress. In the end I suspect the real reason the series was generally dropped came down to it making the total contempt of the rich for everyone else far too obvious. That aside though, it is interesting to take a second look at the rebirth of these old propaganda posters as memes and a capitalist profit centre. Obviously the statement on the poster as shown here is beside the point, nobody is paying much attention to that. So what is so pleasing about this poster, and its many variants, paid or not, applied to mugs and t-shirts or not? To which many people would respond, "Quit beating a dead horse, the poster is obviously aesthetically pleasing to many people. It's approachable and can be used and applied to lots of different things as a more or less ha-ha only serious gift." And I quite agree with those people, and consider these really interesting points. There are probably quite a few more. Here we have one of the few examples of accidental folk art. The original designer was focussed on different criteria, primarily clarity so the poster could be easily read at a distance and in a hurry, and simplicity of colour separation and inks. In one of those fascinating accidents when fonts and sizing come together just right with a good design, there's something of a hit. It is not lost on me that many of the riffs and unofficial homages to the licensed original drop the stylized english crown, which is both silly and not transferable. 
Whether or not this poster and its mutations are popular outside the anglosphere, they definitely depend for an important part of their interest on the accidental preservation of original copies, the ongoing fandom for english culture, and execrable "rule britannia" nonsense. But I wonder if this description so far may miss the point by a bit. I don't hold with the widely repeated media point that "the crowd," the so-called "masses," or whoever is in the vast majority is stupid or lacks a sophisticated critique of propaganda and of the means by which the self-proclaimed elites encode how much they despise everyone else. To the contrary, there is indeed plenty of such critique using its own jargon and having its own registers, including a highly important and regularly underestimated one of humour and what I have heard people from england refer to as "taking the piss." There is a real undercurrent of making fun of elite panic and sending up the original posters and their daft messaging in many of the variants created since, many of them no doubt not truly legal in trademark and licensing terms but too numerous and not profit impinging enough to bother the company providing the example image here. And also of course, a reminder: "This is what the sort of people who fund and pick this sort of tone deaf nonsense think of us. We can't depend on them." Of course, by now the initial fad for folk art derived from these posters is mostly over with, and the whole thing is not too deep. I could see it featuring in one of those half-annoying sidebars in high school and undergraduate history textbooks to "make it more interesting" while hopefully distracting students from asking any awkward questions. To date, I have not found these posters discussed alongside other propaganda posters from the world war two period. 
Propaganda posters are discussed, but the focus is on the most lurid and extreme examples of racist trash and confused discussions of Soviet posters meant to glorify rank and file workers. (There is no way to put those posters in sufficient context in an undergraduate or high school general or world war two history textbook.) Yet it is important to talk about the ones that strike us today as relatively innocuous, not least because a great many of those now shocking posters were actually not all that shocking in their time. The result would be some challenging but very interesting discussions. But heaven forbid history should be taught in a way that honours how interesting it is as opposed to boring everybody solid by insisting on how today is supposedly inevitably better than yesterday and studying history is necessary to "honour" select people in the past. High school students are already willing and able to dig into specific questions like, who paid for developing, printing and distributing these posters during the second world war? Who designed them? Whose ideas and beliefs do they reflect? Are the dynamics of propaganda development and distribution still much the same today or different? (Top) Whither the Web Browser? (2024-11-04)
Snapshot of an early version of the download page for the 1990s-era mosaic web browser, quoted from history-computer.com, accessed 3 august 2023. Also see the university of illinois' page on mosaic.
There is a core selection of programs that the majority of us caught up in the double-edged experience of working with computers deal with on a regular basis. They include the standard office suite of word processor, spreadsheet, and presentation program, a music and/or video player, email client, and a web browser. It's an astonishing thought, how many of us at any one time use fewer than ten programs to do almost all of our work from day to day. However, it seems there is an ambition among corporations selling software to reduce consumer computers to dumb terminals which run only a web browser, through which they will use "webapp" versions of those core applications. Quite apart from the blatant greed and obsession with control this ambition reveals, it is also an uncanny fun house mirror version of the emacs ethos. Emacs is of course the famous free/libre code editing program developed by Richard Stallman. He designed it with the ability to extend it using a dialect of lisp, and a default install has so many standard built in modules, including games, an email client, a calendar generator and so on, that a person could effectively treat it as the user interface for their computer. It could even be a superior interface, depending on the computer and the sort of customizations the person working on the computer prefers, especially if the person takes up emacs lisp programming to make their own plug-ins and adjustments. The thing rendering the web browser into a potential ersatz user interface on top of a dumb terminal is the advent of javascript, which loosely fills the role of lisp in emacs. I have already written about how javascript seemed like a good idea but has turned out to be a gateway drug to serious issues, from contributing to online insecurity, ballooning data usage, and surveillance to more minor ones such as interface monstrosities and broken drop down menus. Powerful as javascript can be, it has also led to a serious crisis in web browser development. 
The majority of mainstream web browsers are skins on google's chrome browser. The ones with officially free/libre code, including the minority that are variants of firefox, are afflicted by a development cycle designed in ways that make it extremely difficult to keep them secure. This exacerbates the problem of how the web browser is constantly being pressed to be the "everything program," running every sort of media and meeting corporate demands to support DRM. This may seem reasonable at first glance, especially since it was a great development when images, videos, and sound were integrated into web pages and subsequently into the browser itself, so it was no longer necessary to run a separate video or sound player when surfing pages on the web. But there is a point of diminishing returns for people browsing and researching on the web, and we have gone well past it. This has helped give impetus to claims that we should instead use "webapps" analogous to the applications run on mobile phones. Taking a step back, the web browser was originally by definition about exploring and using the part of the internet referred to as "the web." That part of the internet began as thousands and then millions of hyperlinked html pages, and those began as a means to share scientific information. It made sense to integrate images, sounds, and even videos, as these are all part of academic research. Unfortunately, this also rendered the web into an all too tempting distribution medium for pornographers and advertisers. These two groups are a pair of serious headaches (to put it exceedingly mildly), but they did not drive the confused development of web browsers as such. No, that sprang from the "web 2.0" approach to trying to get people in general to give up their personal data to companies running websites and trying to sell products on the internet. They wanted to make "the web" writable from the many thousands of computers people were surfing the web from. 
This meant they needed to capture attention, hence they focussed on development of online games, the grim things constituting the dystopia of so-called "social media," and rapid production and sharing of cheaply made videos. All centralized, all struggling to find a way to make a profit at the least cost. It didn't, and doesn't, have to be this way. I actually agree with the broader notion that people able to surf the web should also be able to contribute to the web if they wish. However, I do not agree this should be at the cost of their privacy, security, or freedom to have and make the fullest use of a general purpose computer. A better approach is for email and optional website hosting to be provided as a public utility, including supporting decentralized web hosting by individuals running small home servers. This would still allow for privately owned hosting and email service for the cases where that makes sense, and even something like the advertising-funded sites present now. Then the challenge of indexing and searching the resulting web would be a different thing as well. In any case, there would likely be far less pressure or seeming reason to push the web browser to become a poorly secured, memory hungry, poorly designed extra operating system. Indeed, I think the current mainstream "web browsers" are only just hanging onto that identity. The future of the "web browser" per se looks set to fully bifurcate, at least for the immediate future. There will be the corporate funded and controlled mainstream web browsers, which are becoming large blobs of spyware encouraging us to perceive and use general purpose computers as crippled dumb terminals. And there will be programs focussed on browsing the web, with basic multimedia support without DRM and severe limits on what javascript behaviour they will support. 
They will be developed to be fast, secure, and configurable with built in element blocking so that it is possible to block advertisements, badly behaved scripts, and such website horrors as perpetually moving gifs. Such browsers could still be extensible, or not, although I suspect more developed ones would tend to be extensible. There are some genuinely excellent use cases for extensions, which is what saves javascript from simply being chucked out. (Top) Unexpected Energy Sinks (2024-10-28)
A rather vintage laptop docking station. January 2019 photograph by Raimond Spekking, via wikimedia commons under Creative Commons Attribution-Share Alike 4.0 International license.
Energy usage is a fraught topic at any time, though more than ever today because of the particular ways late stage capitalism demands energy waste while penalizing those who must strive to avoid wastage as much as possible because they can't afford it. Depending where we live and the options available, our big energy headaches may centre on gasoline for a vehicle, electricity in the household, natural gas in the furnace, heating oil or diesel. As more and more of us have rediscovered over the past decade or more, this is not just about the fuel source, it is also about the devices and machines we are seeking to fuel. Many of these have been redesigned to improve their efficiency at turning their fuel into whatever we hope and expect to get from them. More recently still, we can run again and again into devices that seem designed to waste energy instead. And I do mean a general "us," not a coy reference to people so wealthy they may choose to drive giant gas-guzzling vehicles or heat houses with incredibly high ceilings as expressions of conspicuous consumption. Several years ago, I moved to a different apartment, and as always, each building has its own profile in terms of electricity usage. The differences may be minor, or they may be more significant if the building is in an area where the climate allows for use of electric or natural gas heating. Accordingly, on moving to the new apartment, equipped with the usual average electric bill amount for the place, I settled in and began seeing how the reality would turn out in my case. It was going to be a bit different from what I had encountered before, due to my having the good fortune to be able to work from home the majority of the time. Therefore I had a work laptop with a dock, an addition compared to what I would usually have in regular use. 
The rather old fashioned docking station in the picture is far older and much bigger than the one that came with the work station I use now, although the sheer size makes it easier to see the extra ports such stations make available. So on this much more modern work station, I started out with a modern dock – I use the truncated name to suggest how much smaller this counterpart to the old one is, and alas, it has far fewer ports too. Its cooling fans whistled terribly, which was driving me gently crazy, to the point that I often resorted if not to headphones and music then to ear plugs to curb the racket. Meanwhile, I was suffering some worry over the state of my electric bill, which was different enough from the established averages for the apartment to be a bit disturbing. One day I finally could not stand the small but wretchedly noisy dock any longer. Work replaced it on warranty because it was not supposed to be making so much noise, and the unit had a known manufacturing flaw associated with overheating. Nevertheless, the new unit spent only a few weeks running more quietly, becoming at least as loud and annoying as its predecessor. Not seeing any point in trying for another warranty replacement, I checked the available ports on the laptop, and finding that with a few minor adjustments I could do without the dock altogether, packed it in its box and put it away. Now on occasion I noticed the fan in the power supply, but unlike the dock, which whirred and whined, the power supply had a quiet whisper or hiss, and didn't run constantly. And to my astonishment, my monthly electric bill dropped significantly and stayed consistently lower, my power usage going down by 30 to 40% each month. I checked all the other variables and the answer kept coming back the same: the change matched up in timing and consistency with the removal of the laptop dock. 
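The arithmetic behind such a surprise is simple enough to sketch. The wattages and electricity rate below are hypothetical round numbers, not measurements of the actual dock in question; the point is only how quickly a small device running around the clock adds up over a month.

```python
# Hours in a nominal 30 day month of round-the-clock operation.
HOURS_PER_MONTH = 24 * 30

def monthly_kwh(watts: float, hours: float = HOURS_PER_MONTH) -> float:
    """Energy in kilowatt-hours for a device drawing `watts` over `hours`."""
    return watts * hours / 1000

# Hypothetical figures: a misbehaving dock drawing a steady 60 W around
# the clock, versus roughly 5 W for the laptop's bare power supply.
dock_kwh = monthly_kwh(60)   # 43.2 kWh
psu_kwh = monthly_kwh(5)     # 3.6 kWh

RATE = 0.15  # hypothetical price per kWh
print(f"dock: {dock_kwh:.1f} kWh, {dock_kwh * RATE:.2f} per month")
print(f"power supply alone: {psu_kwh:.1f} kWh, {psu_kwh * RATE:.2f} per month")
```

Under those assumed numbers, a single always-on gadget accounts for tens of kilowatt-hours a month, easily enough to move a modest household bill by the kind of margin described here.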
It had never occurred to me that these devices could be such power hogs, bearing in mind they are for laptops, and my work laptop is set to turn off the screen after a minimal period of idleness (including the telltale pauses entailed in reading a document), has an SSD, and automatically suspends after 15 minutes without input from the mouse or keyboard. In other words, my employer has quite conservative power usage settings in its laptop system images. I am not claiming all laptop docks are electricity hogs like this one; it may be that mine is from a series affected by a perfect storm of pandemic related manufacturing flaws plus bugs in the power management software. My point is that quite unexpected items can be adding to household or office electricity usage. We reasonably expect larger appliances to do so, but medium to small sized devices that run for at least the period of a work day may be punching above their weight, figuratively speaking. (Top) An Essay in Presentation or Persuasion? (2024-10-21)
Plate from August Carl Joseph Corda's 1835 book, Essay on the Mineral Waters of Carlsbad, via wikimedia commons.
Perhaps this is written up in an obvious way somewhere, the various things the standard essay format we are taught in elementary school is for. Or rather, by standard essay format I mean the five paragraph essay, and by "we" I mean those of us who were taught how to write prose essays rather than bullet point presentations in the classes known under such labels as "english" and "language arts." My most recent teaching experiences suggest many young students no longer learn the five paragraph essay, and hence the basic structure they can extend to any length they need, until their first year of college or university. It's not a difficult format, and certainly a very useful one. But it is not always obvious that while it is eminently suited to presenting an argument to a conclusion, it is not intended at all as a means to describe a research plan. After all, the point of research is to study and synthesize data, not start with a predetermined answer and try to force the data to fit. But there is the rub, because the five paragraph essay is a means to provide an extremely high level summary of hypothesis, evidence, and conclusion after research is done. I have met students who are quite puzzled about when in this process the essay happens, not least because now they are so often pressed for time. As Barbara Alice Mann has noted with her trademark wry humour in the early pages of her 2000 book Iroquoian Women, the five paragraph essay is the regular format of the puritan sermon, and it is boring. The form is even older, going all the way back to the ancient romans and their orators. Understandably, students hate the format, feel trammelled by it, and are rarely convinced it could persuade anybody. There is no way the poor five paragraph essay is going to get much love, in part for the same reasons the multiplication tables don't: it is a foundational form many students of english are expected to learn and then apply to their future writing efforts. 
It is predictable, so that an audience with unpractised listening or reading skills can be clear on how things will go and how long it will take to hear or read it. The way the form was taught to me, the five paragraph essay works like this: an introductory paragraph opens with a hook, states the thesis, and previews the three supporting arguments; three body paragraphs each take up one of those arguments in turn, together with its supporting evidence; and a concluding paragraph restates the thesis and sums up how the arguments support it.
With such a mechanical algorithm, it is no surprise to learn that banal five paragraph essays are comparatively simple for large language models to turn out in a few minutes. Alas this approach guides students to produce something so dreadfully boring they rebel at the idea of doing the work themselves. It really is only a good way to teach essay writing with quite young students, who will pick it up quickly and want to move on. It takes far more writing experience and desire to pursue writing in a more serious way to discover how tricky it is to write a pithy and interesting essay that fits within five paragraphs. Indeed, they soon discover it usually means writing a longer essay, and then editing it down to size. But that is for readers, not listeners. For a speech, the predictable structure is helpful to the speaker and the audience. Of course, there is still a tyranny of form here, and anthropologist Alice Beck Kehoe would note how its basis in an Indo-European linguistic and cultural tradition is revealed by the recurrence of the number three: three arguments, three subsections of the essay. But that itself begs a question: are the three arguments expected to be the bare minimum of evidence required to make a claim convincing, or simply just enough to produce a speech or essay deemed complete? (Top) |
Thought Pieces
Copyright © C. Osborne 2025
Last Modified:
Sunday, January 19, 2025 02:30:02