Where some ideas are stranger than others...

FOUND SUBJECTS at the Moonspeaker

Scifi Cryptoracism (2020-04-06)

Vulcan, Tuvok from Star Trek: Voyager, image courtesy of memory alpha, april 2006.
Sympathetic Romulan, unnamed Romulan Commander from the original Star Trek, image courtesy of memory alpha, april 2006.
New Style Sympathetic Romulan, Donatra, Romulan officer from Star Trek: Nemesis, image courtesy of memory alpha, april 2006.
Bad Romulan, Nevala, Romulan officer from Star Trek: The Next Generation, image courtesy of memory alpha, april 2006.

I admit to being sorely tempted to use images of Klingons to illustrate this thoughtpiece, but because not a few Klingons have been played by racialized actors, it seemed to me that this aspect might mislead readers to think that the issue here is based in casting, not make up. If casting were the main evidence at hand, then for one thing that would require a good deal more research than is typical of a thoughtpiece, and a great deal of argument about what can be inferred about the beliefs and intentions of the people working on Star Trek. Obviously the cast of the original show tells us that while not immune to systemic racism in the industry, the casting work on the show pushed remarkably far for its time. I'm not interested here in the overt sorts of racism that can be reflected in casting decisions anyway, but in cryptoracism in science fiction as exemplified in one example franchise. (Books are for another day.) The "crypto" aspect is not about literal visibility in terms of actors made up to look like particular aliens, but about the unconscious racism that ends up poking out even in a development team that is by all accounts well-intentioned. The series of photographs of actors made up as a Vulcan and three different Romulans encapsulates this well. Rather than use a picture of Spock played by the late Leonard Nimoy as the canon Vulcan for this discussion, I have opted for Tuvok played by Tim Russ, again to reiterate that this is not about the actor being racialized.

The key feature that I would like to draw attention to is the state of the characters' foreheads. Tuvok, and all other Vulcans that I can find images of from the various movies and series, have smooth foreheads. In other words, no ridges. This is also true of Romulans as depicted in the original series and in the case of the Romulan ambassador in Star Trek V: The Final Frontier. What those Romulans have in common is that they are supposed to be sympathetic characters: ambassadors, warship commanders behaving honourably, potential love interests for Spock and similar. What looks at first like brow ridges on the early version Sympathetic Romulan is in fact a shadow thrown from his hairpiece (I checked). They could be rendered into friends somehow, except that in the Star Trek universe they go into isolation, and something remarkable happens.

They grow horns! Er, forehead ridges, as if they need V-shaped marks of Cain to set them apart from Vulcans, heedless of their quite distinct aesthetic and demeanours. Now the most sympathetic Romulans have shallower, and often more rounded forehead ridges than the definitely villainous ones. The "Bad Romulan" is a military officer who ordered the massacre of an entire ship's crew. The reboot movies tried something else, reverting to smooth foreheads plus more literal marks of Cain in the form of facial tattoos still following the overall V-pattern.

By the way, this pattern does indeed hold for Klingons too. The earliest Klingons are hardly different from Romulans in appearance, except for their darker skin tone make up (oops) and their facial hair, since at first all Klingons are male. After that, they start pouring on the forehead ridges, with the implication, I guess, that a key way they used to fight in their ancient early days was by head butting. Never mind how daft that would be in a bipedal species that keeps its brains in its head. This holds true in Star Trek: Discovery as well, although it looks to me like somebody tried to make a homage to H.R. Giger. Worf may seem like an exception, but his character is made sympathetic by having him orphaned by massacre and then brought up by human foster parents. Franchise writers have a penchant for "mixed-blood" characters who are supposedly tormented by their own hybrid nature rather than by the social prejudice they are assumed to have to deal with, and Klingons became hybrid-possible with the popularity of Worf. It's interesting that no writers have tried out the opposite assumption, that such people would be quite comfortable and highly revered precisely because they are so uncommon, as hybrids between separately evolved species would be, if they could happen at all.

I do have sympathy for the people who have to design aliens who must in general be played by humans and therefore are going to be human-templated. Nowadays designers and writers are more conscious of skin colour, and they don't want to stumble into the minefields of on-screen racism if they can help it, unless it is deliberately part of a story. So they have worked very hard to create different "species" via other means, some actually clever and subtle though poorly explained, as in the case of the large black irises of Betazoids. But more often than not, they have resorted to sticking prosthetics on actors' faces, sometimes extending them down and along their necks and shoulders to distort their profiles. The trouble is, this still maps out in racializing ways, because how we are encouraged to read goodness or badness against appearance is very much embedded. We are encouraged every day to associate smooth, pale skin, plus thinness and height, with being a good person, as well as treating them as parts of the definition of "default beauty." This is excruciatingly hard to catch, and hard imagination work to overcome – though of course well worth it.

If you are still feeling doubtful about this, let's consider some other Star Trek franchise villains, for example the Cardassians and Jem'Hadar, both of whom are also given thoroughly reptilian elements, pinging another "western" mainstream prejudice. The Founders are again an exception that proves the rule, because the first representative of theirs we meet is the thoroughly likeable and sympathetic Odo. They may be bad, but the implication is that they could be made into friends. This is not so clear in the case of the Cardassians, although it does seem like the writers for Deep Space Nine came to the very edge of breaking through the unconscious bias here in the character of Elim Garak, except for him being a psychopath ex-torturer. On the other hand, we have the execrable Ferengi, whose first appearance was such a spectacular writing fail that anyone could have been forgiven for thinking they would never appear again. In time they are remade from outright villains into rogues, and a sort of externalization of a capitalist, patriarchal, misogynist id.

Then again, maybe it is unfair to focus on villains. What about some other types of aliens? The obvious ones to think of next are the Bajorans, whose planet is conquered and occupied by the Cardassians. They are set up not just as losers to a more masculine and militarized society, but as notoriously superstitious because they worship mysterious beings living in the wormhole near their planet, referred to as the Prophets. The Federation is presented as majority secularized and too sophisticated to believe in any sort of deity, and such belief is coded as "primitive." After all, the Vulcans don't believe in such an illogical concept as a deity either. The Trills have no head hardware, just some fetching purple markings on the sides of their necks, and they generally feature as scientists and ambassadors. I've already mentioned the Betazoids, although their existence as roughly inverse Vulcans works against them in other ways.

Again, I don't intend this as a critique. The big theme here is that this is a non-trivially difficult problem, especially if the people developing the science fiction work in question are working in a visual medium, grew up in "western mainstream culture" and are not racialized. Some of what the writers have been doing in Star Trek: Discovery is actually rather encouraging on this point. It makes logical sense (no pun intended) that Michael Burnham, like Worf, would experience considerable angst related to growing up in a literally alien culture, unable to conform completely to its dictates even if totally willing and wanting to do so. Alas, they seem to have found that innovation too hard to go beyond while also hurrying to fix the cowardly decision to kill off one half of gay couple Stamets and Culber.

The Good, the Bad, and the Not Sure (2020-03-30)

Cover of the (in)famous penny dreadful, Varney the Vampire or the Feast of Blood published during the 1840s. Image courtesy of Women Write About Comics, october 2016.

A friend directed me to an excellent study, The Intellectual Life of the British Working Classes by Jonathan Rose. Rose is engaged in a whole range of tasks in the book, from writing a history of education by and for the british working classes to demonstrating that they did not just settle for or accept without criticism the books they had access to. Using autobiographies written by working class people themselves, Rose draws together an intriguing discussion of how they actually viewed such cheap books as "penny dreadfuls" and chapbooks that are perhaps closest to modern comic books. Indirectly, Rose is trouncing the stereotype of working class people as uninterested in reading and education, especially those in the most physically labour-intensive jobs. He gives examples of subscription libraries and informal book exchange systems among miners, shepherds, and factory operators who pooled funds to buy books and could literally read their copies to pieces. A great deal of what particularly appealed to the generation of readers Rose was studying, people who came of age in the late nineteenth and early twentieth century, is now referred to as "classic literature," and there are persistent strands of both popular periodicals and more abstruse publications. Among the many things this book led me to think about again is what makes a book "good" according to popular consensus.

UPDATE 2020-08-18 - A more recent brief treatment of this theme, including references to several complementary reads to bring out more of the picture, was published by Helene Guldberg at spiked on 21 july 2020: The Self-Making of the British Working Class.

This is a topic that is well-suited to spoiling a party because different people bring very different anxieties about class and culture when they talk about art in general, let alone books in particular. Erica Lagalisse notes in her recent book Occult Features of Anarchism that "taste" is really a way of demonstrating membership in a particular class. Openly liking or praising the wrong things in the nineteenth century could undermine a person's "respectability" and that is still true today. Yet if the topic could be safely raised, it would be wonderful to consider it through the prism of Athena Andreadis' Unibrow Theory of Art, which she summarizes as follows.

...my definitions have to do with the artist's attitude towards her/his medium and audience and with the complexity and layering of the artwork's content, rather than its accessibility. In my book, lazy shallow art is low, whether it's in barns or galleries. What makes Avatar low art is not its popularity, but its conceptual crudity and its contempt for its sources and its viewers' intelligence.

It seems to me that this is a reasonable approach, in that it forgoes snobbery and refuses any simple conflation of popularity with quality, good or bad. Since it kicks snobbery to the curb, it does not demand that a person who likes an example of lazy, shallow art be declared lazy and shallow for liking it. I suspect we all have a particular artwork or several that would fall into the lazy and shallow category, and that liking needn't have a thing to do with any analogy to junk food. We humans have a fascinating capacity to encode objects with meanings that may be all out of proportion to the object itself. The object may be banal in itself, yet if it was a gift from a special person for example, it has a different valence because of the memories it calls up and the relationship it represents. This is highly individual, but certainly not indicative of a flawed character.

Still, this does not mean that people who protest the valorization of popular art in any form are necessarily being snobs. There is a genuine critique to be made of the cynical manipulation of popular works in attempts to influence general belief and behaviour. Lagalisse touches on this as well in the book I have already cited. On one hand, we are not simply dragged around by the nose by propaganda more generally, or by its form when embedded in popular artworks, especially visual culture such as movies and posters. On the other, that doesn't mean that those materials can't at least confuse us or pollute the cultural environment so that we are forced to dig through much that we never asked for just to put together a decent understanding of current events or enjoy a show. Note that Andreadis refers to low art as having conceptual crudity, contempt for its sources, and contempt for viewers' intelligence. This point helped me finally make sense of the special rage inspired by certain late twentieth century artworks, including an infamous painting sold to a major canadian museum, which looked to most people like a black square on a white background. This painting, such as it was, apparently made a non-trivial number of people viscerally angry. The news stories at the time fixated on how much the painting cost, but it seems to me now that this must have added insult to injury as far as most people were concerned. The painting expressed contempt, deliberately or not, for anybody who would question whether it was good art, or art at all. Of course, we could argue that this painting isn't an example of what Andreadis meant here. She was talking about a James Cameron movie after all. But the painting just happened to have nothing else about it to make the contempt easier to ignore, whereas Avatar had major special effects and genuinely gorgeous visuals. Think back to those "penny dreadfuls" and the like that the turn of the century british working classes read and enjoyed, but by no means uncritically. In fact, the penny dreadful featured in the illustration for this thoughtpiece is a great example. It is one of the earliest vampire stories written for a broad audience, as it happens by an author with contempt for that audience, as Doris V. Sutherland at Women Write About Comics shows with snippets from the author's own notes. The main reason it has not been lost altogether is its loose relationship to the more famous vampire stories of the nineteenth century in english.

In any event, I still wouldn't recommend bringing up the question of what "good" or "high" art is at a party if you would like to be invited back, but I would suggest giving the Andreadis Unibrow Theory of Art some real consideration.

Writing and Editing (2020-03-23)

Yet more steampunk clip art, this time showing a clerk clerking, october 2020. Steampunk art from the Dover Steampunk Sourcebook, 2010.

Today many of us get at least basic training in how to read and write, the components of basic literacy. The two things may seem to go inexorably together to us now, but this is a very recent impression, historically speaking. Originally people in europe were taught reading and writing completely separately, and if they were in the wrong social class, might be taught only how to write their names. This persisted deep into the nineteenth century in much of europe and the various european colonies. On top of that, Indigenous persons might well know how to read and write in their own languages, or have a clan sign that they customarily used as a signature, only to be prevented from using either if they participated in a treaty signing or were having their marriages or the births of their children recorded in a church register. This makes "signing with an x" a less reliable confirmation of a person's illiteracy than it might otherwise be, especially in the nineteenth century. On the other hand, people from less than affluent backgrounds began to find ways to make a living by writing from at least the seventeenth century, building on their hard-won mastery of the word. But of course, their work didn't get published simply as is. There were already editors, some working for larger publishers, others working as publisher-editors. In england particularly, editors were in a striking position.

Many of those editors were part of businesses taking advantage of the new ability to print broadsheets and early news sheets cheaply. Soon this would build out into newspapers and magazines, and an important portion of these publications was aimed at an audience with little pocket money but great numbers. But in order to fill up these newfangled broadsheets and such, the printers needed material. Poetry and songs were great starting points; after that, they needed the forms that would grow into articles, shorter essays, short stories, and eventually serialized novels and investigative news. All of this was new and people had to make it up as they went along. English spelling was inconsistent at best. Editors had to develop house styles to manage spelling and punctuation consistency, as that would help speed typesetting. They also helped invent and shape the new forms of writing, including constructing an audience for them. This was inevitably hands-on work, with different levels of interaction with the various authors, though it probably dropped considerably once the form was defined and the author had gotten the hang of it. Of course, authors of more respectable reputation or at least greater popularity could probably exert more influence in their own right.

Alberto Manguel commented in "The Secret Sharer" in his book A Reader On Reading on the intensive, not to say intrusive (he is very diplomatic) editing process he had experienced when working with north american and british publishers. At first I was unsure what this more intensive process could be, since it seemed to me that there are two layers of editing. An initial layer in which an experienced editor reads for sense and notes awkward or hard to understand sentences, rough transitions that need smoothing, and the places where the author has not provided information that a general reader needs because they don't live in the author's head. This shouldn't entail extreme rewriting per se, though it would reasonably take some back and forth. The next layer of editing would be the copyediting to clear out the spelling issues that spellcheck can't handle, impose the house dictionary and punctuation styles, and probably insert page or section breaks for longer manuscripts. That inevitably involves several iterations as well, as the text is transferred closer to final copy. That's all quite a lot of work, but not necessarily the sorts of immense rewrites I have read about in the writing memoirs of popular writers who have gone on to put together instruction books on their craft. My own forays into different publishing venues finally clarified for me what may be the difference Manguel observed and what makes preparation for publication particularly laborious in the context of publishing traditions rooted in england.

UPDATE 2021-04-09 - A more recent treatment of this issue, in the context of translating women authors who write in arabic, was published by M. Lynx Qualey in new lines magazine. A specific section that stands out in the context of this discussion:

"In both the United States and the United Kingdom, English-language publishing has what people call a "strong" editing culture. By contrast, Arabic editing culture is often called "weak." Indeed, English has a more elaborate system of editorial control. And yes, you can often open an Arabic novel and find typos. However, the "strong" editing culture of the U.S. and the U.K. also means readers are more likely to find their translations altered.

In Arabic, when translations are changed – as the first Harry Potter translations were – translators and editors usually argue that the contents aren't suitable for their audiences. In English, it is more common to find editorial changes aimed at "improving the book" or "making it easier" for the English-language reader to understand."

For my part, I had submitted a sample from a very early and not at all mature enough manuscript for consideration at a small publisher that was just starting out. I entirely expected to be thanked politely and directed to try again after they had released their more detailed submission guide, which would better inform me about their selected market niche. Instead, they seemed quite enthusiastic, which was great. Alas, they were new in the field, and it seems that perhaps my sample had not been examined too closely. They were also extremely busy, and of course the publication process runs slowly with different manuscripts lined up to go through the process. Eventually I received a new version of my manuscript, in which the editor had worked very hard to completely reshape the text into an entirely different book. This, together with the series of books they had now released, certainly made the parameters of what the publisher wanted explicit. It was probably lack of experience on both my and the editor's part that led to that tremendous effort on their side, rather than their stopping short and letting me know that the manuscript was not working for them after all. With due consideration of the new information, we agreed that it would make more sense to withdraw the manuscript and, if I was going to submit something else, to start from the publisher's now established expectations. I suppose in part this is why many publishers won't deal with authors directly, and many authors have agents who are able to direct promising manuscripts to suitable publishers.

In the context of the north american and british markets, the drive for many publishers is still to produce a specific, defined sort of book that has a high likelihood of selling well. Based on past sales, they do of course have some evidence for what sells in particular genres, which books are most likely to come out with a splash then fade out, and which need more time to get their feet under them. A known author might have more room to go outside of standard genres or realizations of a given genre, because their name is a selling point in itself. The publishers, in other words, are trying to mitigate their risks by exerting control over the eventual product. So it could happen that even if an author has carefully studied the portion of the publisher's catalogue that their manuscript would most likely fall into, and then written and revised with a view to harmonizing with those other books, they could still face a significant series of additional transformations to their manuscript before it is green lit to go to press. I have wondered more than once whether this is the reason for the noticeable stylistic differences between english books from large presses versus translations of books from other languages that have already sold well overseas in their original languages. At minimum, it seems that it must contribute.

A part of what led me to ponder on this, separately from my own minor adventures in formalized publishing, is some of the commentary I have read on (of course) The Lord of the Rings and the Harry Potter series. J.R.R. Tolkien apparently had sufficient authority to basically refuse to be edited, and leaving aside judgement of whether that was a good or bad thing, it is an interesting thought. He did not envision The Lord of the Rings as a trilogy, but it was published as one due to a combination of paper availability and how much of the manuscript was completed at a given time. I could see an experienced editor feeling a definite urge to press for changes to make it better resemble the long established format of the triple decker novel. Meanwhile, in the case of Harry Potter, I have read more than one argument that after book three, J.K. Rowling got copyedited and not much else. That could be so I suppose, because by then the books were a big enough phenomenon with an apparently set formula that she could just run with it. Of course, I have no more idea than the other commenters on this point. Yet I wonder if some of what others perceive as bloat in the later books may in fact have been a response to many fans of the books asking for more details of Harry Potter's world. Or just as likely, that Rowling's ability to reflect the specifics of teenage psychology annoyed some more mature readers. In books directed solely at adults, perhaps those things would have been pruned out by Rowling herself. But adults were never the primary audience for these books.

Intemperate Views on Temperance (2020-03-16)

Some more steampunk clip art – I am still trying to find the product this image was originally an advertisement for, october 2020. Steampunk art from the Dover Steampunk Sourcebook, 2010.

Reading general depictions of the temperance movement in north america, I have always been struck by how intemperate most of these depictions are. The pop culture image of temperance campaigners is that of a bunch of raving people who couldn't stand to mind their own business but had to get into the private business of others. Contemptible people who demanded liquor bans that encouraged dramatic raids and the smashing and dumping of barrels of cheap whiskey and gin while the working people who just wanted to let their hair down were criminalized. Meanwhile organized crime including the infamous Al Capone and his friends moved in to found or take over speakeasies and fill the streets with gang violence. Those awful temperance activists, opening the way for gangsters and shootings in our streets.

Well, obviously I am exaggerating a little, but not by a lot. Much of what I have summarized draws on movie depictions and brief articles, and the usual sneers at nineteenth century Feminists. Today most of us understand that it is possible to consume alcohol safely and responsibly. Yet those so unfortunate as to find themselves addicted to alcohol, suffering the horrors that come with an ever-growing addiction, understand all too well that while they may not agree with everything temperance activists said, the activists did have a point. After all, the starting point of the very notion of temperance is balance, and that is still the major element in the adjective "temperate" for weather, and a person's temper, which is preferably even if not good. "Tempered" steel has been treated so that it is flexible and able to withstand giving or receiving blows.

Temperance activists in the nineteenth century especially had seen much of the worst that alcohol could do to people, and they could refer to accounts written by people who had survived the incidents. Alcohol is a well-known depressant today, able to lower inhibitions and induce or exacerbate depressed feelings. To this day Indigenous peoples are still struggling against the prejudices ingrained from this period, when alcohol was a tool used to disorient communities and cheat them in trade, let alone the role alcohol may play in coping with persistent poverty and the aftermath of traumatic experiences. Non-Indigenous writers were shocked to discover just what the liquor customarily sold to "Indians" was actually like in its effects. For example, artist Charles M. Russell, who took part in canadian cattle round ups, wrote in his memoirs:

"I never knowed what made an Indian crazy when he drunk till I tried this booze... With a few drinks... the Missouri looked like a creek and we (ride) off in to it with no fear... if a man had enough of this booze you couldn't drown him. You could even shoot a man through the brain or heart and he wouldn't die until he sobered up.

When Indians got their hides full of this they were bad and dangerous. I used to think this was because an Indian was a wild man, but at this place... where we crossed the herds there (was about ten families of Indians) and we all got drunk together. The (Indian women)... got mighty busy (hiding) guns and knives. In an hour we're all... so (mean) that a dog... couldn't have got along with us.

Some wise cowpunchers had (talked) all the cowpunchers (into leaving) their guns in camp. Without guns... cowpunchers and Indians are harmless... they can't do nothing but pull hair... we were so disagreeable that the Indians had to move camp."

This was emphatically not nice alcohol. It was cheap and not at all like the distilled liquor, brewed drinks, or wines most of us are familiar with today. Even moonshine was an utterly different experience. This trade liquor started out as a distilled and concentrated liquid in a small cask or barrel. Then it was diluted with water to make it go as far as possible, flavoured with pepper and whatever else might provide the burning sensation that so shocks most people when they taste hard liquor for the first time, with one or more handfuls of tobacco thrown in, at least for colour. It is true that alcohol is a toxin itself, but the nicotine from the tobacco is even nastier.

UPDATE 2020-06-09 - Mirabile dictu, the 8 june 2020 news round up at feminist current includes this note, that "Canada's highest court in Ontario has struck down a law forbidding those charged with a crime from using the defense that they were too intoxicated to know what they were doing, which victim's advocates have said will encourage offenders to think they can evade responsibility for their actions." It would be more mirabile dictu if this disingenuous and vile defence had never been successfully used. Intoxication is not properly a defence, especially alcohol intoxication, precisely because it provides such a convenient pseudo-defence. If the defence were genuinely meant and intended for the purpose of not offending again, then we would have overwhelming evidence of men in particular swearing off alcohol and not reoffending. We do not.

Non-Indigenous women and some men argued that the disinhibiting and what we would now call addictive effects of regularly consuming cheap alcohol lent themselves to a host of ills. Men under the influence would insist that they could not be held responsible for any violence they might commit at that time, whether it be beating another man in the street or going home to beat their wives and children. They could and sometimes did argue they had the right to drink their earnings away and leave their families to starve, or else that they couldn't help themselves, because they were under the influence. Entrepreneurs then and now made extraordinary profits from alcohol sales, with no sense that they had any responsibility for the impact of their development of advertising and general fuelling of the alcohol trade. To this day, four of the wealthiest families in north america owe their fortunes to selling hard liquor or beer: the Molsons, Labatts, Sleemans, and Busches. Their companies developed into early large corporations, and were far from trivial to oppose. They could afford to push their messages about themselves and their products in the press, including demands that small time liquor producers and smugglers be shut down because supposedly they were the real cause of alcohol's havoc producing properties. It was unlicensed, uncontrolled liquor that was at issue as far as they were concerned.

There was overlap between early north american Feminists and temperance advocates, leading to the common labelling of temperance as part of "maternal Feminism." Maternal Feminism gets a bad rap, and it was certainly not perfect, prone to slipping into a female form of paternalism and losing sight of the actual needs and concerns of most women, who were and are working class. It wasn't always the best framework from which to even notice, let alone challenge, racism, and this was deeply frustrating to many women, who argued that maternal Feminism was an attempt to play to the politics of respectability, which would inevitably leave most women behind. It was certainly not a long term strategy, but it is an understandable one. The women best equipped with time and resources to take part in social activism were also under the heaviest pressure to perform respectability, and that meant portraying themselves as perfect mothers, whether literal or social, whose concerns were driven by the motivations stereotyped feminine and maternal behaviour insisted they must have. And of course, maternal Feminist or not, it is perfectly reasonable to want to help women, children, and men alike avoid and escape violence and poverty.

Meanwhile, capitalists who weren't part of the big liquor interests had their own reasons to take quite an interest in temperance. They wanted stable, obedient workers who would spend their minimized wage packets on the "right" things, and rest assured, the "right" things never included alcohol, let alone entertainment outside of the home or anything but the most basic food. If the workers wanted to spend more on things that their employers did not consider necessities, then they would demand more money, which the capitalists did not want to share, regardless of how big or small their profits were. The combination of two sets of self-serving capitalists effectively fighting in the media and the legislature over whether alcohol should be legal probably did the most damage to the temperance movement of all, because their behaviour was so obviously cynical.

Troubles With "Cis" (2020-03-09)

A rare map that clearly shows all of the portion of southern europe typically referred to as 'gaul' by the romans, march 2020.

I suspect many, many people are quite baffled by some of the events they may have heard about, in which women meet to discuss laws proposed and passed that enshrine the notion of "gender identity" as a protected characteristic alongside, and sometimes even instead of, biological sex, and said event is promptly mobbed by protesters insisting that all such discussions are "literal" violence and therefore fascist. "Literal" is a word that is going through a surreal transformation in meaning at the moment, in which it no longer means the blunt facts in front of us (a literal chair is the thing you sit on behind your desk), but something like "symbolic" or "representative." Unless the people using the word are part of these mobs, who seem to be trying to go back to the original meaning in some ways. Many people who are strongly committed to the notion of "gender identity" are also quite insistent that many people are "cis-gendered" and furthermore, that they are the authorities on who is or isn't "cis-gendered." As I understand it, to be "cis-gender" is to wholeheartedly subscribe to and perform the gender stereotypes expected of you based on your biological sex. The people who are certain that "gender identity" is real, which evidently I am not, would not agree with this description. I understand that they would maintain that if you are "cis-gendered," you do not feel any mismatch between the gender you feel and the gender you were "born with." Alas, "cis-" is not the best prefix to use for this.

UPDATE 2020-08-24 - There have been many thoughtful takes on the attempt to push terms like "cis-gender" into everyday and academic language. A great one that I have stumbled upon at long last is at the blog culturallyboundgender, "Cis Gender"? Cui Bono? dated 20 june 2014.

UPDATE 2022-09-13 - I have just stumbled on a reference that verifies the pronunciation, spelling, and connotations of "cis-" that seemed to me to be at work here, in the book Listening to Britain: Home Intelligence Reports on Britain's Finest Hour May to September 1940, edited by Paul Addison and Jeremy A. Crang (London: Bodley Head, 2010). A critical element of this snippet from page 156, with lots of the surrounding text, is that it reflects a widely understood connotation, not an academic or in-group, obscured meaning. "Tunbridge Wells (South Eastern) Much public feeling that COs are in safe jobs getting good salaries while other peoples' sons and husbands are facing danger for a few shillings a week. Government should mobilize everybody and pay them a standard rate. Growing irritation at super-politeness of BBC referring to our enemies. 'Is this cissy attitude due to the existence of an appeasement policy?'"

Until re-entering post-secondary education to work on a graduate degree, I had never heard the term "cis-gender" before, nor did I have any familiarity with the political and social questions it is tangled up with. For good or ill, it immediately pinged my bullshit meter though. Due to my background in classics I was well aware of the label "Cisalpine Gaul" and its origins in latin and early roman history. I appreciated the attempted analogy at play, and in and of itself the pronunciation of "cis-" in the french way as opposed to the original latin way did not bother me. What rang the bullshit bell was the all too obvious parallel to another word also tangled up in arguments about gender stereotyped behaviour and expectations: "sissy," with its less common but easily recognizable alternate spelling, "cissy." We all know that this word is a shortened form of "sister," and unlike its counterpart "buddy," it is not a word anyone wants to be called. Considering the overlaps between the semantic fields of "sissy/cissy" and "cisgender," I have a hard time believing that the originators of the latter term were somehow oblivious to the insulting connotations their new term must inevitably drag with it. On top of that, some especially loud and abrasive users of the term insist that if they label a person "cis-gender," that person is wrong by default if they point out that in fact they do feel a mismatch with respect to their gender. If they have a gender. It appears this is the case even if the person refusing the label "cis-gender" agrees with the conceptualization of "gender identity" the people wielding that label often subscribe to. Quite apart from that baggage, there is also the problem of the negative connotations of the lengthened "ssss" sound, because it is associated with snakes, which are ruthlessly demonized by judaeo-christian mythology.

Somehow then, "cis-gender" has become a label that can be imposed upon others even if they point out that it is wrong and does not reflect their feelings or identity, let alone their biology and life experience. As applied in the examples I have seen and heard, including examples selected and presented by people who are strongly committed to "gender identity" as a real thing, "cis-gender" is not neutral. It is a verbal bludgeon applied to either drown people out or shut them up. This is all quite pointless if what you would like to have is a constructive conversation, especially when one of the big things I think we can all agree on is that assuming based on appearance what another person's behaviour and beliefs are going to be is at best unhelpful and at worst actively oppressive.

Sloppy Origins (2020-03-02)

Rare preserved set of seventeenth century sailor's slops held at the museum of london, march 2020.

Quite some time ago, despite my being a grown adult, an older adult saw fit to give me a long lecture about my "sloppy" clothing. The focus was particularly on some military surplus fatigues I had been wearing while out on a long and vigorous walk – that is, exercising. It was quite surreal to be told off for this, especially because at the time I was in a town hosting a military base, so military fatigues were quite a common sight all told, though of course mine were anything but current. The lecture wasn't particularly helpful, but it did end up leaving me pondering that odd word "sloppy." It may even be related to a now obscure second meaning of the verb "to lop," which we now associate primarily with cutting something off, especially a branch, but which also refers to hanging loosely or limply. The two meanings do seem associated. Adding "s" as a prefix to a verb in english often seems to go with taking the quality captured by the verb and expressing it in ways that imply a passive aspect. In the case of "to lop" for instance we can find "to slop," where say "slopping" water out of a pail is usually unintentional, and the verb itself has pejorative connotations of carelessness. So understandably a person may interpret "sloppy" as a term that must mean the person so described is careless, even lazy, and a thing described as "sloppy" carelessly or lazily made. This seems obvious. Then I stumbled on some additional information that provided new food for thought.

UPDATE 2019-10-02 - There is another word that seems related to these conceptually, "clobber." Most of us are probably most familiar with it as a verb meaning to thoroughly beat someone up or defeat a person or team in a competition. In reading Dorothy Thompson's book of essays Outsiders: Class, Gender, and Nation, I noticed that she refers at one point to all her family's "clobber." Context suggested that a canadian would say "stuff" for "clobber," and indeed a quick foray into the OED reveals that the word is also an informal british english mass noun referring to "clothing, personal belongings, or equipment" that can be dated to the late nineteenth century but has no known origin. We can safely surmise that british people with titles don't generally use the word. It's quite an unusually shaped word though, and sorely tempts a conjecture that it is an extension from the verb "to clobber" because those subjected to a military defeat would typically be pillaged of their clothing, belongings, and equipment afterward. If this conjecture got any wider play, it would soon fall into the linguistic category of "folk etymology."

The new information is part of a scholarly article describing servants' working conditions in england and its colonies through much of the nineteenth century. Household servants could generally expect room and board, plus basic clothing to do their work in. In fancier establishments, that clothing might be livery, elaborate clothing intended to differentiate servants working for the rich from those working for what today we anachronistically refer to as the middle class, as well as differentiating the servants of different houses from each other. People who today we might call lower middle class, again anachronistically, and even a bit lower down the scale than that, might provide slops, or rough, even hand-me-down garments to do work in. The OED notes that "slops" was originally a term for any worker's loose garments, without stating overtly that the workers in question would typically be general labourers doing physically intensive work, as well as tradespeople. The term could also refer especially to the clothes and bedding provided to sailors in the english navy. These garments were and are loose for the same reason most workout clothes still are: for ease of movement, allowing for good circulation, and helping wick away and dry sweat. This is a bit different from the role of a smock, which is generally a sort of protective shirt used to keep paint, plaster, or dirt off of the shirt a person is wearing underneath. The underlying verb in "smock" doesn't exist in english anymore, and the OED suggests it was old english "smûgan," meaning to creep into. Maybe the notion old english speakers had in mind is how a person has to wriggle a bit to get into a long shirt or night gown.

So with this material in mind, it seems like "slops" as a noun shouldn't have had any especially bad connotations to it. Loose workers' clothes were simply practical, especially if we think of such examples as sailors and stevedores hauling on ropes and moving heavy gear, or labourers digging ditches. The trouble of course, is that this type of clothing is particularly associated with manual labour often carried out by severely exploited people. If you are getting handed clothes to wear by an employer, the employer's concerns are to clothe you as cheaply as possible while making sure you aren't mistaken for anyone else's employee. That way as an employer you can be parsimonious and then charge employees with theft if they run away with their slops on. So there was not a lot of drive to make sure that slops were loose to facilitate ease of movement, but not so ill-fitting as to make a person look ridiculous or potentially become a hazard because they were too big. Today it is still considered acceptable to look down on people whose primary work is general labour because supposedly they lack the character to "better themselves." This is a totally unjust generalization at any time, and ties into stigmatizing poverty. After all, two details that give away that a child is dressed in hand-me-downs are that the clothes show visible signs of repair and especially that they are too big. Effectively then, to accuse someone of looking sloppy is to accuse them of looking poor.

I think it is no coincidence then that over the past twenty years or so, workout gear has become more and more form-fitting. This isn't just about people possibly wanting to express their vanity by using their workout clothes to expose without exposing their trim figures. The various new synthetic fabrics used in the now omnipresent spandex shorts and running tights alongside the almost complete rout of jogging suits seem to also respond to a worry about "looking poor." Think back to Sylvester Stallone's "Rocky Balboa" character, who is a working class man born into an italian immigrant family, and one of whose most famous training scenes from the Rocky movies features him in a jogging suit on his training run up and down city streets and stairs.

By now of course it is probably obvious why my counterpart saw fit to lecture me about my chosen workout clothes. It's bad enough that they were loose-fitting, even worse that they were second-hand, and utterly beyond the pale that I was out in public in them, even for the sensible purpose of exercise. What if someone saw me and assumed that I was poor, and by implication deemed this evidence of poverty among my family and friends? Set out like this it is easy to dismiss this as silly, but of course for my counterpart it didn't feel silly at all. To them it felt threatening, because we are all encouraged to treat poverty as a contagious state that we can bring down on ourselves by merely looking it, or that we should hide at all costs (no pun intended) if we are in it. Alas, would that avoiding and in turn preventing and ending poverty were so simple!

Markdown Reflections (2020-02-24)

2013 astronomical image of the Sun, intended to illustrate solar flares and sunspots, taken by nasa goddard space flight centre under Creative Commons Attribution 2.0 Generic license, via wikimedia commons. (It has nothing to do with the thoughtpiece really, it's just cool.)

Over the years that I have spent building websites and seeking to create the perfect notebook layout as briefly discussed in Quixotic Columns, I have learned several different mark up languages. The heavy hitters include HTML, the useful but intensely frustrating xml (because for best results you need to learn xslt), and my honest favourite for typesetting, LaTeX. Much more recently I have finally properly made the acquaintance of markdown, in part because it is embedded in many text editing applications, though not always transparently. As John Gruber has explained with admirable conciseness, markdown is a way of marking up text plus a perl-based tool to convert it into HTML. Gruber emphasizes the readability of the associated mark up, especially the principle that a markdown document should be legible even if it has not been processed into HTML or some other format. This is an excellent principle, and is congruent with what I like about webpages and LaTeX documents in terms of file format: if all else fails and all you can open them with is something like the "cat" command in *nix or, heaven forbid, notepad in windows, you can still read them. Gruber has placed his version of the associated perl markdown conversion tool under a BSD-style free software license, and this has certainly helped fuel its wide adoption.
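
To make the legibility principle concrete, here is a minimal sketch of the round trip from markdown source to HTML. It uses the third-party Python-Markdown package as a convenient stand-in converter rather than Gruber's original perl script, and the sample text is invented for illustration only.

```python
# A minimal sketch of the markdown idea: the source reads perfectly well as
# plain text, and a converter turns it into HTML when needed. The converter
# here is the third-party Python-Markdown package ("pip install markdown"),
# used only as a stand-in for Gruber's perl tool.
import markdown

source = """# A Readable Note

Some *emphasized* text, and a [link](https://example.com).

- first point
- second point
"""

# Even without any processing, the source above is legible on its own,
# which is the principle Gruber emphasizes.
html = markdown.markdown(source)
print(html)
```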

All of this led me to ponder the question of file format brittleness. Most "word processing" programs, the ones with lots of options for "what you see is (sort of) what you get" formatting and insertion of drawings and spreadsheets among other shrieking horrors, often don't save into a simple file. The majority of them save a zip archive that is packaged to look like a single file, but actually includes one or more previews for the print screen and/or previews from the file system, plus an xml file that contains toolbars and style settings, and finally another file again that is usually in something like xml, with lines and lines of apparent gibberish surrounding the actual content. Text paragraphs are usually all bundled together, but footnotes and other such apparatus might be in any number of places, and depending on the program the content may be repeated for non-obvious reasons. I have learned about this via the terrible experience of having to retrieve data from irreparably corrupted files. A notable honourable exception to my knowledge is Nisus Writer Pro, which saves everything in rtf with some simple extra codes to manage styles and support headers and footers when viewed with the program. The explicit point of that decision was to ensure that the user would be minimally exposed to file corruption risks. (I have not heard of an equivalent that runs under windows, but would love to learn about one so I can recommend it to my windows-devoted writer friends.)
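
For anyone curious to see this for themselves, a few lines of Python are enough to peek inside one of these "single" files. This is only a small sketch, and the file name is a hypothetical example.

```python
# A small sketch showing that a typical word processor file (.odt or .docx)
# is really a zip archive holding several parts: previews, settings, and
# the actual content in xml. The file name below is a hypothetical example.
import zipfile

path = "example_document.odt"  # an .odt or .docx file to inspect

with zipfile.ZipFile(path) as archive:
    for name in archive.namelist():
        size = archive.getinfo(name).file_size
        print(f"{name}  ({size} bytes)")

# In an .odt the prose lives mostly in content.xml; in a .docx it sits in
# word/document.xml, with styles, footnotes, and previews in other parts.
```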

It is a real puzzle to me why, among the fistful and more of files in that zip archive, there isn't also a markdown version of the document as part of the open document format or even the microsoft formats. Having one there wouldn't per se break either format as such, and programs that support them often already have automatic markdown conversion via autoformat settings. So the equivalents are already available, albeit for the most basic formatting. In an emergency however, that is more than enough. If the main parts of the file get messed up, then at least there could be an option to dump the markdown version so that you could start over from that. If file recovery fails altogether or leaves things in a partially garbled state, that markdown file could still save a significant amount of work for very little if any additional overhead. This could also work around such alarming paradoxes as the fact that rtf format is not completely standardized even between programs at times, especially in windows, where one program will make a hash of rtf formatted documents from another program. I have had this happen at least once between parts of the windows office suite, though to my knowledge this problem has since been resolved either by microsoft's programmers, or by my employer at the time completing an all-round update of the suite.
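
In the absence of such a built-in markdown fallback, the sort of emergency recovery imagined here can be roughly approximated by hand. The sketch below pulls the main content part out of the archive and strips the xml tags to get back something close to plain text; it assumes the zip container itself still opens, the file name is hypothetical, and real recovery tools are far more careful than this.

```python
# A rough emergency dump: read the main content part of a .docx and strip
# the xml tags to recover something close to plain text. This assumes the
# zip container still opens; the file name is a hypothetical example.
import re
import zipfile

path = "damaged_document.docx"
content_part = "word/document.xml"  # use "content.xml" for an .odt file

with zipfile.ZipFile(path) as archive:
    raw = archive.read(content_part).decode("utf-8", errors="replace")

# Word marks paragraph ends with </w:p>; turn those into blank lines,
# then drop every remaining tag.
text = re.sub(r"</w:p>", "\n\n", raw)
text = re.sub(r"<[^>]+>", "", text)

with open("recovered_text.txt", "w", encoding="utf-8") as out:
    out.write(text)
```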

Of course practically speaking, those of us who do a lot of writing have developed other back up plans in case of lost or corrupted files, including frequent back ups and using the most brittle formats only when necessary for sharing with others or submitting electronic drafts. Back ups shall always be with us, but perhaps with markdown becoming so popular, those brittle file formats will begin to truly fade away.

A Very Small Slice (2020-02-17)

Microscope section of feldspar in anorthosite courtesy of the oxford earth sciences image store, february 2020.

Ruha Benjamin, author of Race After Technology: Abolitionist Tools for the New Jim Code, commented in an interview that "There's a very small slice of humanity whose imagination about the good life and a good world are being materialized." This is, I think, an incredibly powerful reflection on the current state of the world. Time and again we read about and directly experience day to day structures that certainly reflect particular ideas about how people should live in the world. They are presented to us as authoritative, as notions we should accept as the best thing for us, or else as our grim fate since we are too flawed to pull ourselves up by our own bootstraps. The sticking point of course is that these ways of living, of seeing and remaking the world, are being imposed by small groups of people with very specific ideas. They are certain they have the right ideas for the rest of us, though they have no intention whatsoever of living in the same way. There are few better illustrations of the absurd self-perceptions of that small slice of people, and it raises the question of why and how the rest of us can be persuaded to put up with them. References to monopolies of force simply aren't sufficient, because the fact remains that these people are vastly outnumbered, and social scientists have studied how many people actually die in successful uprisings. It turns out that far fewer people die in such circumstances than die in conditions of various levels of social coercion, not least because determined uprisings by a broad enough range of people disrupt the creation of prison and torture complexes. Obviously a short thoughtpiece is not going to present "the" answer to such a critical question, but still let's take a run at some potential factors at play.

As usual, I am going to insist on starting from the premises that people genuinely make rational decisions in light of what they know and expect about the conditions they live in, and that they are inclined to be good. The "in light of what they know" part is not optional, but it is not intended as a claim that people are generally ignorant either. My point is not that people would do whatever I or somebody else thinks the right thing is if only they knew the same things that person or I do. In fact, people could still make the same decisions they would have anyway, or different ones again from what I or that other person might have suggested. Any of us may opt to work against our inclinations or what turns out to be our best interests of course, and it may be that we simply can't make sense from an outside perspective of the rationales that other people are applying. Sometimes it is just damned hard to figure out what the best course really is, and right now there is pathological social pressure against changing course when we realize that what we thought was going to be a positive result is turning out to be a negative one. I think we also simply can't deny that the trouble with logic is that we can make perfectly rational arguments with it that are practically insane, yet totally reasoned. The problem is a nasty one-two punch: the starting point of the reasoning chain, plus the difficulty of changing course even after it becomes clear that we have managed to double down on a mistake, much to the frustration of any philosopher who has sought to find a nonsense-proof way to deal with the world that is simple to define and apply. Our great hope is that more and more of us are able and willing to get up again after the punches and refuse to agree that we are lesser human beings if we decide to take a different path than before, one that leads to better results. After all, even the famously cantankerous Alexander Pope acknowledged that "to err is human, to forgive divine."

One of the odder aspects of the small slice of human beings whose vision of the good life is expressed in most of the mass media and in the ongoing malformation of systems intended for communication and information sharing, which unlike mass media are not one-way, is how they treat "other ways of being." On one hand, other cultures and types of relationship with the world are held up as pathological in the extreme, especially when they are the ways of racialized people inconveniently nearby within "western" countries. On the other, when those others are on other continents or in "non-western" countries, they are lauded as vanishing entertainments best seen and enjoyed by tourists before the chance is gone. Then again, having written this out, perhaps it is not so strange. It's the "resistance is futile" mantra, papered over with pious claims that really those ways do have value, but only insofar as they yield to "the best" way to live. After all, the "noble savage" is only noble insofar as he – always he – is dying.

All this is not to deny that the people seeking absolute control and power via total homogeneity have been experiencing some pangs of worry, just in case they haven't managed to seize the last bit of exploitable knowledge from what they insist is a minority resisting them. And it is an insistence. No matter how you turn the numbers around, even if the population of the industrialized world, which includes more than "western countries," is 10% of the world population overall, and we pretend for the sake of argument that the majority of those populations agree with the small slice notions of the good life, that group is still dwarfed by the other 90%. And we already know that in fact the small slice notions are held by probably no more than 2 or even 5% of that 10%, so 0.2 to 0.5% of humanity. That is nothing that remotely resembles a majority, unless of course the people with these ideas resolve that mathematical problem by the tacit assumption that they are the only genuine humans. (Top)

Touchscreen Troubles (2020-02-10)

Illustration from Bret Victor's essay 'A Brief Rant on the Future of Interaction Design.' Original essay from november 2011.

And by touchscreen troubles, I don't mean the grotty way they look because our fingers leave smudges, smears, and prints all over them. Touch screens have been very much presented as our future by the various cellular phone makers and vendors, and indeed the notion is embedded in the Star Trek franchise. The original series was full of recognizable switches, bulky keyboards, miniscreens, and retouched tape recorders. Of course the people making the scenery and props had to use what they had, and special effects could only go so far on their budget. No one is too worried about how the prequel series are out of synch with the implication that touch screens are the future, in that more and more screens, and fewer and fewer switches that click or dials that turn, are noticeable anywhere in them. The "next generation" and other series in that line stuck to having actual switches and the like mostly in the holodeck. Of course, when making a sci-fi program, flat screens are great for animating in special effects, and those were much cheaper to add and change. I don't imagine anyone in charge of the Star Trek legendarium is trying to make a strong statement about technology in the future, but of course apple and samsung among others certainly are. I do imagine they did not enjoy the august 2019 news that a horrific crash had led the u.s. navy to replace touchscreens with dials, and I suspect other physical switches. Bret Victor's rant about interaction design referenced with the illustration to this thoughtpiece provides a great overview of the touchscreen paradigm versus the other options for touch-based interaction.

Victor coined the excellent designation "pictures under glass" for touchscreens. I recently learned about a technique of user interface testing that creates a rough version for people to work with using a cardboard mockup of a cell phone. It is basically two layers of cardboard, with enough give between them so that a slip of paper can be inserted between them at the bottom and pulled through to fill "the screen." Make as many sketches of different screens as you like until the slip of paper is full, then begin pulling it through the mockup and pretend you are using a real cell phone. It's a clever idea, and of course can be scaled up to tablets and larger devices. A sort of update to the flip books once used to teach children how animation works. This is a way to test out different sorts of pictures under glass. It is not intended at all to deal with how difficult it is to use fine grained gestures or selections on such screens, even if their dots per inch level is "retina" scale. In fact, there is a terrible irony in that for many of the applications we use on these devices, more dots per inch makes for a worse experience in using them. Take adjusting the volume on screen in an application for example, or the difficulty created by the growing repertoire of gestures that are supposed to enable us to do more with the device but are nearly impossible to remember.

There are purposes for which pictures under glass just don't work, especially when we need to apply a setting and not have it accidentally changed by a glancing touch or at times even a damp cloth. Writing software that successfully ignores unintended input while never missing intended input is insanely hard, hard to the point that in many cases it makes no sense to do it, such as in the course of flying any sort of plane or handling any other dangerous equipment or vehicles. As humans living in a three-dimensional tactile world, we need the feedback that textures, pressure, and so on give us. We have already designed many tools that take advantage of what we understand based on our experience using our hands and all our senses, so that we build into our tools affordances that account for our inevitable mistakes and a surprising number of curve balls that the world can throw at us.

Of course, if we use those sorts of tools instead of touch screens, that annoys the owners of companies that sell touch screens, who are looking for infinitely increasing sales. A touch screen means a low power computer, and that computer is likely going to be internet-connectible and set up with applications ready to mine as much data as possible from any person who uses the touch screen. All that surveillance data and the surveillance itself are the new money spinners, at least by speculation, so there is a lot of pressure from tech companies to put touch screens everywhere they can. The ones who sell cell phones would like us to channel more and more of our lives through those screens to get the data and try to sell us things. In other words, there is a perverse incentive to get touch screens in use where they aren't the best option, one which may well survive the impending horrors of the so-called "internet of things."

Originally I didn't have too much against this really, apart from my frustration with volume settings in music applications. Then apple introduced a reminders application as part of what came "for free" with iOS. It was held up as an excellent and needed addition to the default applications, though I was puzzled why it needed to be there when the calendar application does pretty much all the same things. I did try it out, and found the big thing seemed to be about tracking the items completed more explicitly. All right, fair enough. Until my first cell phone bill after I had started using the application, from which I learned that my reminders were somehow taking up data, and there they were, all totalled up by date on my bill screen. This horrified me so much that I stopped using the application and checked to make sure no others were engaged in such behaviour, before in the end the excessive cost of cell data led me to drop that too. Still, I pondered whether to use my calendar for the purpose instead, as it didn't seem to report to either my cell phone service provider or apple. At which point the whole experiment collapsed, because the speed and convenience of my trusty notebook couldn't be beat, and I could always add an alert to my calendar later. There was no sense using the notebook only to transcribe it all to my phone, when unlocking the phone, pulling up the application I wanted, and so forth took so much time.

Bret Victor makes many excellent points, but I am not sure that I can follow him when it comes to the field of "haptics" which, keeping with my Star Trek starting point, could be conceived as incremental holodeck development. It probably doesn't help that cool as the holodeck is as a concept, I'm not sure it would be that much fun to literally be inside a computer. Victor argues that the idea is to make the building itself the computer, so that any sheet of paper or whatever can share in the computer's abilities – I believe this is an accurate paraphrase of his overarching idea, but please see his Dynamicland project for yourself rather than taking my word for it. His reconceptualization of how computers and regular objects that don't have computers stuck inside them can work together looks far more promising than haptics defined as somehow making a touchscreen that allows you to "virtually" feel something via a bunch of vibrating and blowing add-ons. (Top)

Deskilling is Relative (2020-02-03)

Cover of Harry Braverman's 1974 book, Labor and Monopoly Capital.

Harry Braverman's book is still well worth reading and thinking about to this day. He carried out an examination of the motivations of employers seeking to automate aspects of jobs and their beliefs about employees, especially their beliefs about the intelligence and value of employees as active as opposed to passive participants in the employers' "business." I suspect that many critical scholars of Frederick Taylor and his ideas, which were bolstered by fudged numbers and ferocious self-marketing, have a copy of Braverman's book on their shelves. Today, if ever we were labouring under the impression that employers focussed on extracting maximum profit have somehow made this into a virtuous and socially constructive pursuit, we have been firmly disabused of the notion. If the 2008 economic meltdown didn't do it, then the so-called gig economy and its tight relationship with "faux-tamation" must have provided the reality check by now. In the midst of such calamities, it is easy to forget that even de-skilling is relative, and that it is possible to be stealth de-skilled, and not necessarily by employers. Today the major source of stealth de-skilling is the desperate drive to data mine all of us, probably with the idea of reaching a sort of eerie nirvana in which each person is a never-ending source of profit for those who data mine us, even as we ourselves have no means to support ourselves.

Part of what makes de-skilling relative is that the notion of "skill" is itself not fixed, and it has been cheerfully abused to allow the creation of two-tier pay structures in far too many workplaces, unionized or not. The famous examples include instances where men are considered skilled and paid more for the same tasks that later, once the men are able to move into work considered more prestigious, are relegated to women, who are then conveniently declared unskilled and paid less. An important subset of automation in england during its industrial revolution aimed at minimizing the amount of brawn required to carry out tasks, so that men could be replaced with presumably weaker women and children, who could then be paid less. Levels of strength and endurance vary considerably between individuals depending on their phenotype, overall health, social position and subsequent social support for or against them developing their bodies, let alone their sex. Those complications aside, we do have clear documented evidence of systematic lower pay rates for women based solely on the fact that they are women, which nowadays can't be covered over by claims about men supposedly being the breadwinners, so the fallback positions are careful redefinitions of skill on a case-by-case basis, or preferential channeling of women into part time work.

There is no question that the skills required of people in a more or less automated factory are different from those needed where people carry out all the labour by hand. Another motivation for bringing in machines is that the division of labour to allow mass production of cheap goods renders the individual tasks so boring that it can be difficult to keep the system running as people struggle to stay alert and avoid mistakes. The machinery itself can be notoriously balky, so operators need to learn how to diagnose and fix trouble on their own to keep rates up. Not that employers are necessarily fond of this, because a key drive behind deskilling is the desire to centralize all control in the employer (let's face it, it's mainly men in these positions) and his chosen deputies. A skilled worker is a worker who could use their skill to gum up the works, as mill owners learned early on in belgium, where the original saboteurs threw their shoes, sabots, into the machinery to protest their working conditions. Still, the key relationship here is that an operator could become amazingly skilled at keeping the machinery running and therefore meeting the day's production targets. But their skill at actually making whatever item is being punched out that day may well be nil. If a series of machines does all the work to make the parts that go into the final product, all the operator needs to know is how to run the machine, not how to make the parts. This by nature renders the operators less like people with individual skillsets and more like interchangeable parts, which is also a desideratum of employers, especially in mass production, where the analogy will intrude itself on their thoughts constantly.

Stealth de-skilling is quite an old phenomenon, manifesting at first as people simply not passing on skills that they see as no longer useful. There was a time when by necessity a whole family, not just the women and girls, would know the basics of how to knit, weave, and sew. Clothes were expensive necessities, so being able to make their own clothes, repair them, and ultimately rework the worn out parts into new clothes or other useful items was critical. This work is time consuming when done completely by hand, and that led people to develop labour-saving devices to help produce those needed clothes faster. Early capitalists seized on this because they saw a huge pool of cheap labour they could get their hands on, if only they could drive people to switch from home made to cheap, mass-produced clothes. In northern north america, this meant trading as much cloth and clothing as possible and persuading Indigenous peoples, who wore mostly hard-wearing and warm leather and furs when they wore clothes at all, to wear the cheap cloth and shoes instead. Indigenous ways of preparing and tailoring leather and fur soon ceased to be skills everyone had at least the basics of, in ways parallel to what happened among europeans. Today things have reached the point that it is exceptional to meet someone who was taught in their youth how to adjust poorly fitting mass produced clothing, let alone how to do so using a judiciously applied seam ripper and needle and thread.

A much newer form of stealth de-skilling is related to modern hand held devices, and the experiment in not teaching both print and cursive scripts in much of north america. It seems that many of us have forgotten that the way we develop dexterity and the ability to work with the wild range of objects available to us with our bodies, and especially our hands, is by using them. This means that when we're toddlers, we should be playing with toys like blocks, balls, dolls, and versions of items fortified against our early need to put everything in our mouths, like board books. Without the challenges to our strength and dexterity, we develop neither. I have been chasing down an article that reports on a study arguing that new medical students lack the dexterity to stitch their patients. In other words, their hand-eye coordination and strength are probably in trouble, because just typing and swiping are not enough for the fine motor control sewing needs. Then again, I have run into more and more people who are unable to touch type, in part due to autocorrect and predictive typing that mean they never have to learn the keyboard.

All of this is not to say those med students are unskilled of course. They are skilled in many things, including many things specific to their hand held devices. In the end it comes down to the question of what we believe people should be skilled in, and what skills should be recognized. That, and why we think people should be skilled. There is a non-trivial difference between insisting that people should be equipped to handle an emergency that cuts the electricity without needing an immediate airlift, and insisting that they should focus on being the best sort of service worker whose every task is mediated by a computer. (Top)

Background Graphics (2020-01-27)

Snippet from an old apple desktop wallpaper, january 2020.

One of the big changes that came in with cascading stylesheets, besides the loose attempt to force a complete separation between content and formatting on web pages, was the ability to use background graphics and do all sorts of fancy things with them via stylesheets. Separation of formatting and content is a good idea generally speaking, but it is not so easy to achieve in practice because the boundary between them is not as perfect and absolute as those who get particularly angry about page layout using tables insist. (I share their frustration about this, especially as it impacts website accessibility.) Often this whole question is quite arcane nowadays, with so many people using blogging software that presents them with templates and autogenerates code. Every now and again, even with templates, things go awry, especially when it comes to background images, which we are all used to since desktop images are ubiquitous, and most modern operating systems can take a folder of desktop pictures and automatically change them throughout the day. Easy, familiar, generally a nice change rather than a distraction. Like everybody else, I never spent much time thinking about this, until I ran into a website that made me pay more attention.

The site in question is an absolutely wonderful one, Juliette Wade's Dive Into Worldbuilding. It is home to an in-depth series of discussions, interviews, and articles on writing speculative fiction and how to create the alternate universe that the events and characters happen in. The guest authors wrestle with such questions as how to avoid the equivalent of the amateur conlang mistake in which the new language is english with a few letters lopped out – unless that is actually what you want as part of the story or conlang. A mistake in one scenario may be exactly the right thing in another. I understand that the discussions are zoom meetings streamed on youtube or similar, with a summary and partial transcription posted a few days later. Juliette Wade is a veteran author herself, and gets help via patreon keeping her now eleven year old venture going, helping out writers and readers of speculative fiction alike. I report all this because if you are even remotely interested in these topics, you will be well served by checking Dive Into Worldbuilding out.

What this has to do with background graphics is an image that is, on one hand, absolutely apropos and features as the standard background of the main text block of each page: a water droplet and backsplash in shades of blue to bluish-grey. Unfortunately, the opacity of this image is 100%, with a result that for me is intolerably distracting, so I use uBlock Origin to prevent it from loading. I suspect that the stylesheet template is actually set up to reduce the opacity of this image, but the way to code this is not quite standardized, so it does not necessarily behave appropriately in all browsers. Now of course, on a virtual desktop, when we open an application window, it blocks out the background image by default in most cases, though it is often possible to change that behaviour if you want to. Arguments are still ongoing among MacOS users as to whether the menu bar should ever be translucent to any degree, because they apparently find that intolerably distracting. For myself, I think it would be infuriating if it was all or nothing: totally transparent or totally opaque. It would be even more infuriating if I couldn't select the basic opacity level that suited me. A translucent menu bar doesn't bother me, probably because there I find the menu text to be bold enough to still be legible, and I am not scrolling extensive text over it in order to read.
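For anyone wrestling with the same issue on their own site, one workaround is to bake the fade into the image file itself rather than leaving the opacity to the stylesheet and the browser. Here is a minimal sketch, assuming the third-party Pillow imaging library and a hypothetical droplet.png; because the blending happens before upload, every browser shows the same subdued image and the visitor's reading is left in peace.

from PIL import Image

def fade_background(src_path, dst_path, opacity=0.25):
    """Blend the source image toward white so text stays legible over it."""
    img = Image.open(src_path).convert("RGB")
    white = Image.new("RGB", img.size, (255, 255, 255))
    # an opacity of 1.0 keeps the original image, 0.0 leaves a plain white field
    faded = Image.blend(white, img, opacity)
    faded.save(dst_path)

fade_background("droplet.png", "droplet-faded.png", opacity=0.25)

The obvious trade-off is that the opacity is then fixed at whatever level the site owner chose, which is exactly why being able to set it to suit the individual reader would be the better arrangement.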

Selecting background images is a difficult balance between finding something that suits the theme and content of a website, avoiding examples that are too reminiscent of advertisements and therefore likely to annoy visitors, and sorting out when or if accidental visual effects pop out due to interactions between text and image. All this doesn't get into the question of on-screen size and how the image's byte size may affect initial load time and download size for slower connections and people checking out the site using a phone or tablet. There are many new elements here, related both to the issue of internet connections and to the fact that screens emit light, as opposed to familiar paper with its reflective properties. If you like, we could reframe background images not merely as illustrations we don't see, as discussed in a previous thoughtpiece, but as illustrations we are paradoxically not supposed to see, or at least supposed to see only for a short time before we tune them out like a steady sound. (Top)

Asocial Amplifiers (2020-01-20)

Stock image of a blue and white bullhorn, january 2020.

My skepticism of "social media" is far from a new thing, no doubt rooted in my experience of the earliest "free services" that soon proved not so free. Back in the day terms of service were not nearly so long and full of legalese, and so it turned out to be easy and shocking to learn that "free" web hosting companies demanded to own whatever you posted on them in case it got popular and they could make money from it, by representing it and later especially from advertising placements. At one point, before google had become the pre-eminent search engine, it was a general thing for search engine companies to provide facilities to create a user account through which you could set up personalized filters and often use a free email account. I experimented with one of these on a long-dead search engine site, and was struck by the steadily increasing demands for personal information and claims that if the information provided was not accurate I would be legally liable for providing "false information." This makes no sense except in circumstances where accurate payment and shipping information is genuinely necessary, as when making purchases online. It all felt very off, so I backed off and decided to watch the new "social media" and see what it turned out to actually be apart from its advertising.

"Social media" is described by such sources at wikipedia as a means of "creating" and sharing information using virtual communities and networks. The material is supposed to be primarily "user-generated." On average wikipedia writers are uncritical boosters of the latest tech trend, so this is unlikely to downplay social media features or suggest that they are worse than they are. If the users are making the content, then it is easy to conflate social media with such means of reproducing and sharing information as the printing press, radio, and arguably television and movies. Easy to conflate, and easy to elide important issues with the way those previous media were implemented, such as being based on expensive equipment that enables those who own the equipment to serve as gatekeepers and censors on what is created and shared, especially when there is broad agreement at least in word that everyone should be able to share their ideas and discuss matters of concern to them. These issues are multiplied in social media, because of the expensive nature of computers and access to general internet infrastructure. It is too easy to forget that what first opened up print media to the broader population was smaller printing presses and cheaper ink and paper. The openness depended in great part not only on size, speed, and cheapness, but especially on the inability to keep them under centralized control and regular surveillance. It wasn't easy, it isn't easy, we have to work to maintain the elements of media that are useful and matter to us.

Overall, it seems to me that "social" doesn't just mean people thrown together without necessarily ever interacting. A quick dictionary visit refers to "social" as relating to the organization of and interrelationships between people living together because of their need for companionship in work and play. To me there is a strong undercurrent of choice in the notion of "society": people choose to be together, to work together to make something happen. Alas, societies can become coercive, usually by refusing to respect the choice of people to leave. I don't mean to imply societies and social behaviour are inherently positive and perfect, since they involve us humans with our difficulties and imperfections. Nor is "media" simply positive, in its definition as one-way mass communication. In truth, "social" and "media" are in contradiction here, because social behaviour is not one way at all. Social behaviour is at least two-way: more than one person is participating, able to contribute to the conversation and make meaningful changes to the messages shared, encoded in whatever the participants encode them in, be it a social structure, a ritual, or yes, a book or a programme. If "social" and "media" are in contradiction, then their implementation in a constructive way that actually facilitates and supports people interacting and creating together is effectively crippled at the start.

With all this in mind, I simply can't accept the "social media" label. More accurately, systems like facebook or twitter are better labelled "asocial amplifiers." They are excellent at giving great volume to a selected message, especially advertisements, insults, or threats. We are trained to quickly shift our attention to what strikes us as threatening, including more subtle implied threats like, "without our product you must be a loser." That's what the outrage factories make use of, including their earlier incarnations in "yellow journalism" and propaganda made obvious and clumsy by its inclusion of racist and sexist stereotypes. It is true that people can manage to have conversations using asocial amplifiers; it is possible, absolutely. But once the number of participants goes beyond a certain size, I suspect as little as twelve people, it gets harder and harder to have an actual conversation. That is especially so when most people don't know each other in real life, and therefore have no shared standards of behaviour. The usual attempted solution has been having one or more moderators, plus making it necessary to go through some sort of filtering process to join, in an attempt to recreate a real life social control process online. This works so long as those coming to join do so in good faith, and we must always remember that abusers will pretend good faith just long enough to get well established and difficult to remove. That is how abusers roll. And yes, groups getting poisoned by bad faith actors long preceded "social media." I observed the tragic demise of two wonderful mailing lists I was once part of under precisely those circumstances. Asocial amplifiers are designed to make us feel like we need them to keep in touch, even as we actually isolate ourselves to keep up with their various alerts, alerts designed with the same addictive qualities as the stimuli of gambling machines. If they were as positive as they are made out to be, asocial amplifiers wouldn't need addictive qualities for people to stick with them.

The question left standing then is whether "social media" could be the sort of online equivalent of a real life community that at least some boosters have claimed. Honestly, I don't think so. Firmspace communities work precisely because we get to know each other in them, develop shared standards of behaviour, and learn how to appropriately head off bad and unconstructive behaviour. We learn one another well enough to be able to take the right approach when our friends are responding from incredibly diverse places. A person handling the loss of their job, a recent major promotion, an exciting development in their family, or a day they felt deathly bored and out of sorts, none of them can be easily shoehorned into a single approach, even to start. There are simply too many variables, and I do agree that not all challenges can or should be met with mechanical technology. (Top)

Funny, That (2020-01-13)

M.C. Escher print Relativity, 1953.

Few phenomena are as strange to witness as reversal, especially patriarchal reversal, which was so carefully and revealingly unpacked and explained by Mary Daly in her breakthrough book Gyn/Ecology. It is a rampant rhetorical technique in societies characterized by persistent oppressive structures maintained by many people refusing to oppose the oppression so long as it doesn't trouble them and theirs. It is also a rampant practical technique, most often in the form of blaming, trying, and punishing victims of oppression rather than the people who actually harmed them. Patriarchal reversal in particular can be blunt and frankly stupid in implementation, usually following a longer term, more cleverly set up sequence of reversals. The bullshittery of men claiming they are lesbians could not have gotten any mainstream play at all without the lengthy development and spread of queer theory, which itself is a blatantly anti-Feminist, anti-freedom movement. The second part of that characterization may be surprising, and I will come back to it.

Reversal has been on my mind a great deal of late, in part due to an odd encounter with the notion of "safe spaces." These sound like they should be a good idea, yet my experience of how they are invoked and what is expected in them leaves me feeling pretty damned unsafe. This is no doubt because the definition of "safe space" I have is not congruent with the reversed definition that is most widely flogged by people claiming to be in favour of social justice and other things that also sound quite reasonable, so long as you don't look too hard at the practical actions implemented under those terms. For my part, I understood and understand safe spaces to be places where people may discuss difficult topics and express diverse points of view without fear of being physically harmed or shouted down. I have been in such a space, where a young person repeated a racist myth about the spread of AIDS. This person was not shouted down, nor were they trying to convince the rest of us. In that space, they felt able to raise this idea, which they clearly weren't very certain about. We were able to explain that it was a myth, a highly racist and vicious one, and alas therefore also an extremely memorable one. Respectful and firm correction was provided, not an attack on that person, who indeed may or may not have come away agreeing with the corrections. That was for them to decide. This was a tough situation, and many of us were very uncomfortable. We met the challenge in a good and humane way. That doesn't always happen, I know. The key is to keep building on the times we've done it right. Claims today that in a "safe space" no one is allowed to be uncomfortable, and the not-so-implied follow up that violence against those considered to have unacceptable opinions is permitted, are extraordinary, eviscerating reversals of an excellent goal for places and times where we wrestle together in groups with difficult topics.

A growing problem right now online is censorship of materials that are Feminist, gender critical, or simply question transactivism. The reach of this censorship has been extending further offline due to the growing overlap in the media in particular, and the abusive reversal of the notion of "safe space." I do my best to read the original of a person's controversial opinion, because when emotions are high it is important to check what they actually said, not what someone else claims they said. Claims made when emotions are very high can be wrong, because it is hard for us to give a fair hearing to others when we're upset. On the other hand, if the person in question has truly expressed an objectionable opinion, it will still be objectionable with a cool head and in the original. The ongoing mess in social media censorship is not news to anyone (see Graham Linehan's site for ongoing coverage specific to twitter; I have not yet found round ups for facebook's similar messes). This move to censorship is growing in some blogging outlets, including automattic's wordpress.com (for follow up on this start with 4thwavenow's coverage of the Gallus Mag case), and medium. In the case of medium, perhaps I am late to the party in realizing what was going on there. My knowledge has been firmly updated due to my efforts to have a look at Dr. Em's discussion of queer theory and the fact that many of its major founders were and are in favour of legalizing paedophilia (it has four parts, is not a pile of raving accusations, and includes full references for all quotes, see for yourself: 1 | 2 | 3 | 4). That connection is not new, and has been documented in the academic context before, led by Sheila Jeffreys in her book Unpacking Queer Politics. I was intrigued to learn that this work was being redone, especially in a more popularly oriented outlet. Medium is presented as a place to share different ideas and opinions. However, Dr. Em's account is now replaced with an error page reading "This account is under investigation or was found in violation of the Medium Rules" (quoting directly from medium). It's not too difficult to surmise that Dr. Em has probably been another victim of targeted abuse of the system for complaining about someone's posts. Such abuse only works because the systems are set up so that to be accused is to be found guilty, and then censored, with avenues of appeal difficult to access and often set up to prevent successful appeals in any case. That these systems are supposedly automated is no excuse, because victims of censorship have been able to show how somehow the system bent to target them. Meanwhile, screenshot after screenshot shows that threats to kill, rape, and beat women in general, let alone women with specifically controversial opinions, get a pass.

I do think that online it is possible to have something more like real safe space, in which people are physically and socially safe to express controversial opinions. It is true that even when an online platform is run by people willing to invest in human moderators and train them well, the moderators can still be overwhelmed by volume, so some form of automation is needed, and people need to be able to report complaints and concerns. Yet it is also necessary to curb abuse of reporting systems and resist the temptation to censor for fear of somebody, usually an advertiser, being "offended." Many of the larger companies crying fear of advertisers are in fact big enough to refuse to give in to the fear, if their claims to be afraid are not disingenuous. Leaving aside issues of greed and power-hungry junk, practically speaking it is quite possible to get closer to an online safe space with the assistance of digital tools.

For instance, right now many moderation algorithms are performing reversal by banning and silencing people whose writing or speech gets mobbed by trolls. I talked about this some time ago in Garbage In Means Garbage Out. It's a technical challenge to retrain these algorithms to point in the right direction, but since when did a hard programming problem become a reason to refuse to solve it? For a person to be able to report abuse or otherwise make a complaint, they should have to be registered on the platform in question, and there should be a probation period to see if they develop a pattern of abusing the reporting system. Serious abusers will wait this period out anyway, but this is likely to counter the mob tactics that are typically applied as part of abuse of reporting systems. For accusations of particularly severe rule violations, immediate bans or deletions are unjust and crush the credibility of the site as much as ignoring genuine threats does. To my knowledge, claiming that a person is not merely expressing a controversial opinion but being abusive doesn't always require providing specific quotes, and even when it does, the rules are applied so broadly as to evidently be an expression of specific politics, not application of a balanced policy. So before the ban hammer or delete button can be applied, I think it is reasonable for there to be a flag added to the top of the reported page saying something like "Readers should be aware that concerns have been reported about the content of this page. An investigation of the report(s) is underway. For more information on our review process, see our policy page." Wikipedia is in bad shape these days, but it does have something that approaches the right idea when it comes to marking a page or section as controversial. In any case, that leaves it to the reader to decide whether to continue reading or not, and they can also tell others whether they think the page should be read or not. From there of course, the challenge is to make sure that there is a genuine review process. But again, since when does a challenge chase off so many techies and people who care about freedom of speech and social justice?
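To make the shape of that proposal concrete, here is a minimal sketch in Python of the reporting flow just described; every name and threshold in it is hypothetical, chosen only to illustrate the ideas of a probation period, weight given to a reporter's track record, and a visible notice rather than an instant takedown.

from dataclasses import dataclass, field
from datetime import datetime, timedelta

PROBATION = timedelta(days=30)  # hypothetical probation period for new accounts

@dataclass
class Reporter:
    name: str
    registered: datetime
    upheld_reports: int = 0
    dismissed_reports: int = 0

    def in_probation(self, now):
        return now - self.registered < PROBATION

    def abusing_reports(self):
        # crude heuristic: a record of mostly dismissed reports suggests abuse
        total = self.upheld_reports + self.dismissed_reports
        return total >= 5 and self.dismissed_reports / total > 0.8

NOTICE = ("Readers should be aware that concerns have been reported about the "
          "content of this page. An investigation of the report(s) is underway. "
          "For more information on our review process, see our policy page.")

@dataclass
class Page:
    url: str
    notice: str = ""
    reports: list = field(default_factory=list)

def handle_report(page, reporter, now):
    # Reports from probationary or report-abusing accounts are logged elsewhere
    # but carry no weight here.
    if reporter.in_probation(now) or reporter.abusing_reports():
        return
    page.reports.append((reporter.name, now))
    # Flag the page for readers; the content itself stays up pending a genuine review.
    page.notice = NOTICE

page = Page("https://example.com/essay")
alice = Reporter("alice", registered=datetime(2019, 1, 1))
handle_report(page, alice, datetime(2020, 1, 20))

The point of the sketch is simply that none of this requires exotic machinery: the hard part is the genuine human review behind the notice, not the code in front of it.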

The crowning recent reversal in some ways has been the slippage in the usage of the verb "validate." It comes from one of the better latin verbs, valere, "to be strong," the same word at the root of such words as "valour," "valid," and of course further derivatives like "valorous" and "validity." "Valid" specifically means "having a sound basis in fact" according to my OED. The accompanying verb "validate" is the act of checking or proving that something has a sound basis in fact. This did not originally refer in day to day life to feelings, because we have no way to validate someone else's feelings any more than they have a way to validate ours. Sometimes we have feelings that not only can't be validated by others, but are engaged by outright nonsense, which is frustrating. Still, our feelings are our feelings, and none but ourselves can say they are true and real for us. To try to force someone to validate our feelings for us is none other than a power play that can never truly satisfy, because even if we successfully bully them into acting in a way that we think should validate us, it's still an act. The buzz will wear off, and the demand for repetition and for more people to repeat the supposedly validating behaviour will grow. This is the opposite of validation, which when applied to a more usual real life case, such as validating our ticket at the theatre, is specific and limited in extent. You hold out your ticket, the agent checks it, and if your ticket is good, off you go to the event. Hence the growing insistence by those interested in "gender self-identification" on having official documents that state their gender is whatever they say it is, because maybe that will finally validate them once and for all. There's a weird logical sense to this even though it won't work, even though the reversal embedded in this use of "validate" is the worst sort of mug's game. (Top)

The Illustrations We Don't See (2020-01-06)

Illustration of 'inconstant fortune,' sixteenth century line drawing by Jean Cousin. Image courtesy of old book illustrations, january 2020.

I have had a small book repair and binding practice for the better part of ten years, at first simply to repair rather tattered, already second hand books that see regular use in my day job. Then I went on to doing small jobs for others and binding up books made up of calligraphy work and some small forays into illumination. With plenty of other things to keep me busy, when it comes to illumination I am no more than a dabbler, but it is a wonderful rabbit hole to cheerfully jump down, especially with the resurgence of illustrations in fiction directed at adults. In any case, adding embellishments including small pictures and larger scenes to blocks of text was no mere rote exercise during the practice's first major heyday. As Kathryn M. Rudy wrote in Piety in Pieces: How Medieval Readers Customized Their Manuscripts, what sort of illumination got applied reflected complex interactions between what a person could afford, what they believed, how they used their books, and how long their books could last. While reading this book in particular, which is part of the collection of open access texts made available via open book publishers, I was reminded of how often we don't really see illustrations, and how that can leave space for considerable divergence between what they say and what the text says.

For instance, let's go back to rabbits for a moment. In early 2019 the rabbits busily wreaking havoc in the illuminated spaces of certain medieval manuscripts hit the news again courtesy of open culture's march blog post, Killer Rabbits in Medieval Manuscripts: Why So Many Drawings in the Margins Depict Bunnies Going Bad. For those who wondered just how the Monty Python troupe decided on having a terrible killer rabbit in Monty Python and the Holy Grail, learning about these allegorical rabbits provides part of the answer. The other part has to do of course with Terry Jones, a trained medieval historian whose first book Chaucer's Knight is quite good, and well worth requesting by interlibrary loan. The violent bunnies seem to be especially associated with manuscripts illuminated by monks, who apparently were desperate for a bit of humour in the course of their long working days adding decorations to what could sometimes be quite dour prayers, stories, and sermons. This sort of thing is not unique to manuscripts either. The famous Bayeux tapestry includes framing rows of small pictures above and below the main scenes, including some startlingly risqué elements from a current perspective.

Based on the fact that most books for adults lack illustrations, it seems fair to conclude that, at least in the sorts of book markets I am familiar with, many readers have probably lost the habit of reading illustrations. As children we read them, and in many children's books they are explicitly designed to reinforce and help explain the text. The use of illustrations to counter or ignore the text outright for the sake of a chuckle or other break from reading the words seems quite lost. Yet elaborate illustrations like those created by William Blake, or those that helped set the scene in Conan Doyle's Sherlock Holmes stories or Dickens' lengthy novels, added considerably more than a bit of relief from rows of letters. That said, I should acknowledge that the latter two examples are from works originally appearing in magazines as serials, so they came from a reading practice developing in its own way before those works were collected again into books. Media historians have been giving these pictures more serious attention, especially nineteenth century examples, which are so packed with reflections of then current ideas about gender, class, and the arrangement of space, let alone religion and politics. Yet these are the very pictures that many of us don't really see, precisely because the whole apparatus of symbols and topical references is absent. (If you'd like to see some examples that aren't political cartoons from punch, which are often easiest to find because John Tenniel worked for that periodical for many years, I can recommend Rudyard Kipling's Just So Stories at project gutenberg or the internet archive.)

Remarkably, even some of the earliest superhero comic books are edging into a similar state of partial illegibility, including those featuring such now household names as Batman and Robin or Superman. Who knows what was going on when the panels in the Batman and Robin comics show Bruce Wayne and Dick Grayson sleeping in the same room at Wayne's mansion, though in chastely separate single beds. Or have a read of Jill Lepore's book The Secret History of Wonder Woman, in which she unpacks a whole range of wild details from the life and times of William Marston, who created the original comic book character. He did better by Wonder Woman than with his terrible and unscientific invention that is ruining lives to this day, the so-called "lie detector." (Top)

The Curious Cloud (2019-12-30)

Cloudy photo. C. Osborne, december 2012.

Not too long ago, instead of referring to "the cloud" we just called it off-site, remotely accessible servers. The renaming is a transparent marketing ploy of course, and joins the ranks of such travesties as "nuclear power plants" and "the lifecycle of a pipeline." The key difference between what used to just be off-site, remotely accessible servers then and now is that various companies own servers that they offer to rent out to others to host data and run software from. This is supposed to be a service so that those purchasing space "in the cloud" don't have to invest in servers of their own, or at least not as many. On-demand access, no dealing with the hassle of security and computer repairs. I suspect the more common use of "the cloud" is not so much for storage as it is to access massive computing arrays for parallel processing. Admittedly, this sounds rather practical at first. In fact, it sounds a lot like good old fashioned web hosting, just with extra bells and whistles in terms of processing power. Many of us purchase space on a company's servers to host our websites from, rather than setting up our own servers and managing the various tasks that go with owning and maintaining a server. So far so good, same principle. Nothing new to see here. Well, a little bit new, but it is telltale that nobody ever tried to brand "the cloud" as "web hosting 2.0."

The purpose of the intensive branding exercise is no doubt in part that many of these "cloud computing" providers never host any websites at all, except perhaps their own. They certainly want to differentiate themselves from companies specializing in web hosting as we have known it. That produces a different risk profile in terms of ne'er-do-wells trying to break into their systems and cause trouble or syphon out sensitive data. On the other hand, "cloud-based" companies can certainly do a good deal of troublemaking and fraud on their own account, just like any other company, as shown by the recent case of a payroll processor that vanished from "the cloud" along with millions of dollars that should have been people's pay cheques. The basics of the saga have been investigated and written up by Brian Krebs. Part of what went awry was allowing an unusual request to redirect money from its usual holding account to an entirely different account at another bank. Perhaps "the cloud" helped make the resulting transaction run faster and all the harder to reverse. Old forms of fraud and security trouble in a new corner of the industry.

Yet somehow the whole notion of "cloud computing" niggles at me. It sounds like a trend that could very easily add to the ongoing effort to enclose the internet and amp up the various "social media" platforms that have recreated many of the worst aspects of old-style aol, with an extra side of surveillance and attempts at manipulation and intrusive advertising. After all, people seeking space in "the cloud" to run their businesses from are going to be understandably leery of using just any fly-by-night operation. They will lean towards established players who are expanding into new areas, from infamous amazon.com to that old-timer ibm. The successful smaller players are vulnerable to being bought out and absorbed in more or less hostile takeovers by other companies with more money. The web hosting sector isn't all sweetness and light by any means, but so far there are still a range of different companies of different sizes competing for the business of people looking to have websites hosted, including refugees from the ongoing censorship horror show run by automattic and people discovering that their isps are killing the few hundred megabytes of webspace that used to come by default with their subscriptions.

Obviously this is not a critique or anything, so much as a musing on what is supposed to make "the cloud" different and something to get excited about, apart from the marketing hype. Having already seen more than one internet-tied service that took advantage of economies of scale go in unfortunate directions, from search to "free" web hosting and email, neither of which is actually free, I think it is reasonable to keep a skeptical eye on how "the cloud" develops. (Top)

Reflections on Historical Materialism (2019-12-23)

A widely reproduced photograph of Karl Marx taken by english photographer John Mayal circa 1875. This copy courtesy of wikimedia commons – see the wild collection of warnings attached to the picture, they give good food for thought.

Having mentioned Karl Marx in the previous thoughtpiece, it is worth taking a little time to consider him and his ideas on their own. Not many people do this, understandably when there are all sorts of things to do and cope with. Yet it seems to me that it wouldn't hurt for more of us to check out what he actually said, did, and what his methodology was, so that we don't get fooled into believing nonsense. Marx was actually a prolific writer, and the document he is best known for, The Communist Manifesto, was literally a one-off that expressed his ideas at one point in his development as an activist and a theorist – his ideas, and those of Friedrich Engels. Regardless of whether or not you or I agree with him, it is important to acknowledge that he did change his mind, and he did develop his ideas, which shows over the span of his works. Of course the book he is really famous for is the three published volumes of Das Kapital, only one of which he completed himself, the others having been edited by Friedrich Engels. It may be a surprise to some readers here, but Das Kapital has pretty much nothing to say about communism, socialism, or revolution as such. It is, as the subtitle notes, an extensive critique of the field then known as political economy, what Marx argued was a scientific critique. Here I'd like to explore a bit of why he felt that his critique was scientific, not dogmatic or irrational.

To start with, Marx argued that his method of historical materialist analysis was scientific in its nature. Famously, he referred to it as Hegel's dialectical method turned upside down. This sounds like quite an arrogant claim, since Hegel was and remains a major player in philosophy. If you have taken an undergraduate degree with a history component of some kind, or simply have read a bit about how social development works, you may have run into the description of Hegel's method as cycles of "thesis-antithesis-synthesis." The thesis is an idea at work in the world, and it meets its contradiction, the antithesis. To deal with the contradiction, a synthesis, some sort of melding of the working parts of both ideas, is carried out and then goes on as the new thesis. Hegel himself did not describe his method in this way, but it is a common means of teaching it to new students, and from what I have heard and read it is not inaccurate, though it is, of course, simplified. It also seems convincing, and sounds rather similar to the scientific method as we get taught about it in elementary school: start with a hypothesis, test the hypothesis with an experiment, then revise the hypothesis. Repeat. There is an ongoing argument over whether Hegel meant this method to apply only to ideas and not to involve experiments as a check.

Marx found Hegel's method quite promising and useful, but he felt it was upside down because it started from abstract ideas that need have no connection to the physical world. He insisted this rendered the method less than useful for identifying real life problems and potential solutions, potential actions. He was not at all impressed by utopians, people busy coming up with idealized societies with no road map to get there, and he seemed to think that part of what made it hard to figure out a plan to get to better societies was having flawed analytical tools. So he argued for what we now know under the heading of "historical materialism," a term that to my knowledge he didn't coin. Marx argued that Hegel's method was great, but that it started from the wrong place. First, he argued, start from the real world, look at what is actually happening on the ground. Then abstract from that, and think through what the implications of your proposed abstractions would be. Now check: do those conceptual ideas and abstractions accurately describe what is happening? If they don't, look into what in the processes on the ground is showing your ideas are awry, and work on correcting for that. This is not an easy way to work at all, and it is part of why Das Kapital is such a huge, at times insanely difficult to follow text, even in volume one, which, remember, Marx completed and thoroughly edited to produce its second edition.

UPDATE 2022-10-09 - David Harvey has released an episode of his Anti-Capitalist Chronicles discussing Marx's Historical Materialism. (Alternate Source) It is well worth a listen and includes specific references.

That, then, is how Marx "turns Hegel on his head." Marx insists that the ideas should conform to the evidence in the world, not the world to the ideas. This is at least a more scientific method because it insists on taking the evidence as it is, not picking and choosing evidence to make it fit the idea. That may be the hardest part, because if we are not self-aware, we may not realize that we are only noticing what fits our convincing idea and not what doesn't. Marx was particularly interested in this aspect of human psychology as he worked through his critique of political economy, because he kept finding that even though for the purposes of Das Kapital he accepted the parameters political economists set for how the economy should be working, the real world results did not match what they said. The real world results didn't match even in circumstances where the economy was set up in as close to an ideal fashion as possible. His argument for how this could be possible is far more nuanced than just pointing to greedy capitalists. Instead, he insisted on the importance of social roles as expressed in the social classes partially discussed in the previous thoughtpiece.

The hardest part of historical materialism as a method, for both Marx and I suspect a great many of us, is sticking to its emphasis on motion, the fluidity of everything. David Harvey delves into this in excellent detail in his lectures on reading Das Kapital, which you can listen to or view for free online at his website or on youtube. This recognition of fluidity is in part an inheritance from Hegel, and Marx, according to Harvey, avoided emphasizing causality. I have worked my way through Das Kapital, and I think this is true. Where things went somewhat awry was Marx's invocation of unilinear social evolution, including an expectation that everyone everywhere had to go through the sort of industrial revolution england was undergoing in his time, and that the hideous suffering and destruction this entailed was necessary to get to the good stuff at the other end. This is not an impressive claim to a current ear, ringing as it does with echoes of religious fatalism and racism. Marx's low regard for the so-called "lumpenproletariat" is none too helpful background to this as well. Worst of all, it contradicts the fluidity aspect at the root of the premises of dialectical methods of analysis. It is quite reasonable to demand an accounting for this. It seems ridiculous to trip over such a simple hurdle after daring to correct Hegel.

Well, hard as it may be to believe, in late nineteenth century europe and the communities tied to it through colonies around the world, unilinear evolution was an optimistic viewpoint. It denied claims that most people in the world were at best the living appearance of humans, not merely unwilling to live better lives but constitutionally unable to live better lives. "The poor" were still more often assumed poor by nature, so even if they complained about how hard their lives were, that was supposedly just too bad because they couldn't possibly live otherwise. Other opportunities, better food, better clothing, better housing, all would be wasted on them according to many thinkers and a significant number of people who saw themselves as other than working class. The parallels drawn between "the poor" and various sorts of animals were intended to illustrate and shore up such assumptions. Unilinear evolution has many faults. But the one fault it did not have was denying the humanity of most people, let alone their ability and right to seek and build better lives and societies. It took some time before the faults in the notion of unilinear evolution were recognized and the notion itself replaced with better ideas based on observations of the real world, including the somewhat belated understanding that it imported causality into dialectic methods of analysis – whether Hegel in the original or Marx's reformulation – through the back door. (Top)

Class is a Social Relationship (2019-12-16)

View of a local ferry preparing to leave for vancouver, bc. C. Osborne, june 2011.

The title to this thoughtpiece paraphrases historian E.P. Thompson, who is of course particularly famous for his book The Making of the English Working Class and his part in the debates that followed the revelation of how Stalin had co-opted socialism within russia and, by influence, within a range of international communist parties. Thompson was certainly not in favour of authoritarianism or totalitarianism in any form, regardless of whatever popular movement the oligarchs-in-waiting were trying to co-opt and use to boost themselves into power. In the case of "class" specifically, Thompson was making a key point, because even into the early 1960s, many people in england still accepted without question the false claim that whatever "class" a person might be in was dictated by birth. By such false logic the various families claiming to be "royal" and/or "nobility" insisted, and still insist, that they are better than the rest of us by virtue of their genetics. Of course, before biologists managed to work out anything about DNA or genes, the claim was that they were better because they were divinely chosen somehow. It is unlikely that people generally found this terribly convincing even in officially more religious times, or "insulting the king" couldn't have been defined as a capital crime, which it was from time to time.

So if a person is not merely born permanently into a "social class," how do they get into one, and what is a social class supposed to be anyway? A social class is a group of people who are supposed to share a specific set of characteristics, characteristics defined by social relationships. Perhaps most famously, Karl Marx referred to the working class specifically as people who do not own the means of production, and therefore have to work for a wage and then buy their means of subsistence. This is oversimplified, but I actually don't agree with those who claim Marx thought he had the one and only answer forever and that was that, though I do agree his writing suggests that at times he did, usually in the flush of discovery. As Thompson's formulation emphasizes, this access to the means of production is not somehow dissociated from human relationships. For that state of affairs to exist, people have to work together to create and enforce it. The creation of at least three social classes within capitalist societies, capitalists, workers, and enforcers, is predicated on specific relationships. If you are part of the capitalist social class, you don't have to take the humanity and intelligence of anybody outside of that class seriously. You can deny their individuality, their needs, and insist that everything you do is for their own faceless, unwashed good. The enforcer class, usually encouraged to feel flattered by the monicker "middle class," has been persuaded that their individuality is being respected, because they are doing the dirty work to enforce the limitations on the working class. These are the folks who most religiously insist on liberal individualism, because that is supposed to be why they have the social relationships to the capitalists and the workers that they do: they are superior liberal individuals, who live by the same values as the capitalists. Furthermore, they get to treat the workers as faceless, brainless, dirty masses, even if they do have to spend the rest of their time studying the peculiarities of the capitalists to stay ahead.

Meanwhile, the people in the working social class who haven't made the dubious trade to get into the middle social class don't get to treat anybody as faceless, brainless, and dirty. Unless, of course, they can be persuaded that certain types of people are ripe for such treatment as a lesser sort of working class: racialized, female, and so on. So there are a bunch of negative social relationships to release pressure from further up the hierarchy. Yet in an interesting reversal, people in the working social class are encouraged to see the capitalists as tragic individuals who suffer to make the economy run, when in fact the capitalists are just one little group of faceless fools whose monopoly on force is all that keeps them in control of the means of production. The paradoxical occurrence of cleverness and intelligence in one or more areas of life and complete idiocy in others is usually illustrated with caricatures of scholars and anyone else who visibly likes books in this anti-intellectual era. Yet something closer to the truth of the matter featured in one of Scott Adams' early Dilbert cartoons, in which one of the engineers told the pointy-haired boss about a serious issue that needed to be dealt with, and had an obvious practical solution. One of the other engineers said to the pointy-haired boss, "...and your bizarre, otherworldly response is?" Even the comic pointy-haired boss isn't actually stupid. But he is so sure that his employees are less than human that he can't manage to take what he hears from them seriously.

"Social class" may be an inevitable construct in human societies, but the sort of relationships enshrined in them are up to us. Right now we are facing hard questions about social class, including those of us living in countries where we have been encouraged to believe that somehow we don't have social classes. I suspect this is because for awhile the idea that "class" actually meant a certain way of behaving and dressing and how evidently superficial those things are made it seem like if behaviour and dress wasn't so sharply differentiated maybe there were no classes anymore. Plus, it even looked like there was real social mobility to be had, like there were ways to "move up" based on the assumption that social classes must always be hierarchical, and if you could move up then you couldn't move down and therefore "social classes" were no longer constructed but real and therefore not classes. This doesn't make actual sense, rationalizations rarely do when put under scrutiny. We don't have to have hierarchical social classes, though we probably can't avoid the fact that social classes must inevitably enshrine certain types of social relationships and therefore certain types of societies. The question now, as it becomes an unavoidable fact that the sort of society we have now is not tenable as an ongoing way of life, what are we going to do instead. (Top)

"Surplus" People (2019-12-09)

How creepily appropriate that the first results in searches on the word 'surplus' are all to do with military supplies, august 2019.

Picture the scenario. The climate is changing, and the number and magnitude of severe storms and shifts in habitat for plants, animals, and far smaller, more dangerous insects and microbes are affecting people all over the planet, but especially people who have the least access to the necessities of life. A great many people who live in "western" or "industrialized" countries have learned that their access to the necessities of life, plus the luxuries they understood they deserved, is being cut off. The luxuries part stings and is none too impressive, but these are tough folks on average. They can handle going without luxuries for a while. Except, they have learned that the deal they understood they had made for their guaranteed necessities, involving working hard, not causing trouble, and toeing the line, that deal is dead as far as the best-off dealers are concerned. The "elites" are doing just fine, and are sure that between their money, mercenaries, and gated communities they'll be fine. Understandably, the people who were on the other end of this deal, often called the middle and better-off working classes, are asking some very hard questions. They are getting angry. Quite a few of them who want the old deal back are thinking violence might be their ticket to shoving out the old boss so they can be the new boss. Even more powerfully, lots of them are saying, who the hell needs a boss? This is a stupid way to live, we can do better. The people who don't want things to change, who can literally pick out how much money they want to burn that day, have choices in how they could respond. So how have they responded? Primarily with violence.

This is not hard to check out. Just look up the latest police shootings and taserings in north america, whose victims are so often racialized, immigrants, or women – or all three – and how often those victims die. A better measure that may be less geographically specific is to check the proportion of the population in jail, and what types of crime they were charged with and imprisoned for. Check out how those numbers have soared in the largest industrialized countries, and how many of those crimes are in effect poverty. Check out the newspapers to see how much more play is going to organizations and gangs with a media handle who encourage hatred of racialized people and women. Have a look around at who is silenced, preferentially banned. If you think that "social media" is over-represented in the controversy, then feel free to look up the number and types of crimes that preferentially go uninvestigated, or are underreported because the victim gets put on trial instead of the perpetrator. Notice whose death gets counted and under what circumstances. All of these ways of treating people and reporting about how they are treated help elide the physical violence imposed on people who are poor, racialized, female, or all three. The violence is rendered invisible by the insidious implication repeatedly made that they deserve their suffering. There is one card that gets saved for last in this nasty run of propaganda, and then it gets waved as hard as possible. That card is the one that reads, "surplus people."

Oh yes, "surplus people." No verbs, because the older versions, "eliminate the unfit," "exterminate the brutes," and "cleanse the race" are too obviously horrific and too associated with the eugenics as practised in nazi germany and much of north america on Indigenous peoples, African North Americans, and anyone perceived as less intelligent. We are a bit too relaxed about the "surplus people" monicker, perhaps because the term "surplus" turns up in military terms so often, and in that case surplus military equipment does indeed sound like a good thing. Surplus equipment is unneeded, and at first glance it seems like if the military equipment is surplus, that must mean there is less for the military to do. Which should be good all around. Unfortunately, that is not the way military surplussing works. But the key point about surplus is that whatever – and whomever – is supposed to be surplus, is unneeded.

Hence under conditions that look likely to upturn the system that helps them keep exploiting, so-called "elites" and those who identify with them begin invoking the spectre of surplus people. They begin pointing again and again at the "unindustrialized" nations where birth rates are just too high, because according to them those populations are mostly poor because they have too many children and obviously have no ability to plan responsibly for the future, or they'd stop having children. These arguments aren't new; it is easy to look them up in english. Start with the execrable Thomas Malthus, who was in no way an outlier in his day and whose arguments were not controversial. If you'd prefer recognized fiction, there is a famous little book by Charles Dickens called A Christmas Carol that summarizes the arguments in its opening pages via the character of Ebenezer Scrooge. To the people who raise these pseudo-arguments, it must seem like they win no matter what. If social policy and people generally start seriously implementing their ideas, pretty soon the poor everywhere are scapegoated for whatever is going wrong, and just about any treatment to force them into line can be justified. Except anything that might enable those poor people to actually resist being forced into line.

I find it grimly fascinating how many people refuse to overtly unpack the question of, if some people are supposedly surplus, what are they supposed to be surplus to? We have practical evidence at hand that shows "surplus" people do not explain the presence of unemployment or starvation. Almost every instance of famine that we can find solid information about in the "modern" world has been shown to be a product of policy and maldistribution, not literal lack of food. Further back than that it gets more complicated because the capacity for specific groups to monopolize enough force and control enough distribution to have similar impacts was not as common. "Surplus" always seems to come down to, too many people of a certain kind for our way of life to continue in the way to which we have become accustomed, for various definitions of "our."

All of which is not to say that the number of people on Earth should expand without limit. For one thing, that is obviously impossible just based on practical, day-to-day facts and simple physics. I don't agree with anyone who claims that "the poor," especially in "undeveloped" or "under-developed" countries, are multiplying irresponsibly or without any thought for the future. They are doing the best that they can under terrible conditions, especially the women who actually give birth to the children and get saddled with primary responsibility for the children's health and well-being with little or no help, no matter what the circumstances under which those children were conceived and born. There is a whole lot of hypocrisy out there, among the men who insist women should not be allowed to prevent conception or abort, but who then cry and moan about how terrible it is that there are "surplus people" and children living in terrible conditions. It doesn't matter to those men at all, so long as there are more children, because it is about their perceived control over life itself, which they expect will never impact upon their own creature comforts and social position. So seriously, watch out for the "surplus people" card. It reveals more than it conceals, and we need to look straight at it. (Top)

Supposed Civilization (2019-12-02)

Snapshot of a three dimensional computer model of the roman forum by Lasha Tskhondia under Creative Commons Attribution-Share Alike 3.0 Unported license, february 2012. Image courtesy of wikimedia commons.

I think it is fair to say that the question of what civilization is, who has it, who doesn't, and whether that matters has troubled people ever since the unpleasant notion took serious hold that there must be only one answer to those questions, an answer that could then be imposed on everyone else by any means necessary. This notion of "right makes might," that is, that having the presumed correct answer licenses all manner of terrible stuff so long as the goal is "right," described by philosophers Minnie Bruce Pratt and Marilyn Frye, has a consistent tie to imperialism and colonialism. If for the moment we take "imperialism" as making sure the right stuff is always available to make the right way of living happen, and "colonialism" as how people are made to live right, plus the ones who make them live right, it all hangs sadly together. At the moment, everyone on Earth is facing dangerous climate change, the collapse of the american empire, and the weird spectacle of the not so united kingdom springing apart and england collapsing in on itself. So the question of "civilization" has been creeping ever so steadily to the top of mind again for many of us, because even if we don't dare say it out loud, we know our lives are going to change, probably a lot. We are bracing to face having to give up things that we have gotten used to in late stage capitalism. Thoughtful bloggers, some professional historians, others not, have been pondering again what the raw definition of "civilization" must be, the feature without which "civilization" does not exist.

Take for instance a june blogpost at Book and Sword, The Key Question in the Fall of the Roman Empire – discussing the roman empire is a great way to end up talking about the present, especially if the current wobbly empire, whatever it is, has partly claimed to have solved whatever took its roman counterpart down. Of course, the trick is to agree on the key factor that took down the roman empire. In the context of this thoughtful blogpost, the argument comes down to the availability of clean water for the general population, which is in itself an excellent proxy for population health. This also stands as the de facto definition of civilization in the blogpost, with evidence added from a selection of articles showing that falling height and life expectancy preceded the collapse of the roman empire in particular, despite people having more stuff in their houses and a more complex economy. And, the water infrastructure was in trouble as people lost the willingness or ability to cooperate to keep the various aqueducts and pipes in repair. This all sounds much more plausible as an indirect definition of "civilization," if it is not merely "people living in a settlement so large that the immediate land base cannot support them and they need to bring in basic necessities of life from elsewhere."

I have observed that many people in the press and in my immediate acquaintance take it as a fact that complex machinery, elaborate music, philosophy, science, and so on, are impossible if people aren't living in cities and within the sorts of societies arranged on hierarchical lines and run by an oligarchy with a monopoly on violence. It is very easy to feel sure about this if the only information easily available to you on the subject is drawn from societies defined by those very features, including their negative assessment of every other way of living. A negative assessment the boosters of those societies must make, to shore up their rationalizations for the trade-offs they have made in their own societies, including using violence to seize the goods they then use to feed their economies and produce luxury goods. Fundamentally, any art or practice involving serious thought is defined as a luxury in those societies, unless the art or practice is part of building up the military. It isn't difficult to find books and articles with many citations in original sources that explain how oligarchies have sought to block general access to art, basic education, and any sort of higher learning on the grounds that the people affected "had no use for" or "time for" such luxuries because they were impoverished and, anyway, they were really too stupid. At the moment I have a book on my desk that covers more recent expressions of such nonsense in britain, Jonathan Rose's The Intellectual Life of the British Working Classes. It says a great deal that the idea that working class people could have intellectual lives is still one that draws amused contempt in a lot of circles today.

I think that the word "civilization" is a thoughtkiller here, or at least something that triggers endless circles rather than more useful perambulations. It pulls in a thousand unspoken presuppositions, almost all invidious, that we get stuck fending off instead of wrestling with the scary question. The scary question of, when we humans decide to live together and work together to build a society, what are we trying to do? Who are we trying to serve? What goal or goals are we trying to meet? I'm quite serious. The goals we prioritize, however we end up prioritizing them, become the guides for action for our societies. We begin to express them via infrastructure, via literal stuff on and in the ground, as well as via systems of ideas we systematically invoke and demand to stuff into children's heads. I think many principled people across all manner of walks of life would consider the goal of ensuring as many people as possible have access to clean water, decent food, and stable shelter quite sensible. It isn't even especially utopian, because we know as a practical matter of fact that when people lack those things and any realistic hope of getting them by peaceful means, it isn't long before violence and ill health begin to take ever increasing tolls. On the other hand, we have to remember that we are working on goals, and goals don't get met overnight. Impatience gets us in trouble, helps feed the cruel notion of right makes might, that the ends will justify the means.

There is more than one way to live after all, many ways to build complex societies with wildly elaborated art and science. It's just that there is not a single one-size-fits-all way, and only a few ways that involve ruthlessly exploiting almost everybody so a few people can have more than anybody else and try to maintain a monopoly on force to keep things that way. Those few ways are not only socially cruel and destructive, they are also environmentally destructive, as multiple examples of past societies we can reconstruct from documents and from archaeology have shown. Which suggests another feature those few societies share, besides a subscription to the notion that theirs is the only way to live: they all hold a belief that there is no future, no one who will come after them whose eventual needs and happiness need be considered. (Top)

The Unfortunate Career of "Nice" (2019-11-26)

UPDATE 2020-04-20 - The sad devolution of "nice" continues in the worst way, as various jerks tell women and men who insist on facts and supporting women's rights that they should be "nice." It doesn't matter how polite and constructive they are either, as Helen Saxby explains with receipts in her most recent blog post, Why Can't Women Be More Nice? As she explains in no uncertain terms, "be nice" has in effect become a supposedly polite way to say "shut up." There is also an inherent condescension in the order that in real life would lead to the person saying it to another adult getting a solid punch in the eye. It puts me in mind of a terrible vanity flick by Russell Crowe – I think it was another Robin Hood remake – in which his character liked to order his supposed love interest to "ask him nicely." If my memory serves, said supposed love interest was played by Cate Blanchett, and to be honest I was waiting for her to go full monster Galadriel on him. Oh well. Clearly I was not part of the intended audience for that movie, but if you are, or are at least willing to give it a watch-through just to hear the key line and its tone, you'll soon see and hear the problem with "nice" and its adverbial form "nicely."

UPDATE 2023-04-26 - Michael W. Lucas provides an excellent example illustrating the more contemporary meanings of "nice" in his brief description of the process priority-changing utility available in most *nix systems, called, you guessed it, "nice." On page 543 of the third edition of Absolute FreeBSD he explains, "When reprioritizing, you tell FreeBSD to change the importance of a given process. For example, you can have a program run during busy hours, but only when nothing else wants to run. You've just told the program to be nice and step aside for other programs. ¶ The nicer a process is, the less CPU time it demands. The default niceness is 0, but niceness runs from 20 (very nice) to -20 (not nice at all)."
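
For anyone curious to poke at this behaviour directly, here is a minimal sketch of my own, not an example from Lucas's book, assuming a unix-like system with python 3 available: the standard os module exposes the same niceness mechanism the utility uses, and an unprivileged process may only make itself nicer, never less nice.

    import os

    # Report the niceness of the current process; the default is 0 on most systems.
    print("starting niceness:", os.getpriority(os.PRIO_PROCESS, 0))

    # Politely step aside: raise this process's niceness by 10, so it only gets
    # CPU time when less "nice" processes are not asking for it. Unprivileged
    # users can only increase niceness; lowering it again normally requires root.
    new_niceness = os.nice(10)
    print("niceness after stepping aside:", new_niceness)

Run as a regular user, this should print 0 and then 10, squarely inside the range Lucas describes, and a follow-up call like os.nice(-5) should fail with a permission error, which is the whole point: anybody may volunteer to be nicer, but demanding more than your share takes special privileges.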

Model T ford converted to pull a plough, the sort of conversion Henry Ford hated. Circa 1925, original source library and archives canada, 3391814.

Long ago in a place very far away, I was in elementary school, just starting grade five. That year I was in a class co-taught by two teachers, which may or may not have been a common thing at the time, although it sounds like it would be a great way to approach a whole range of teaching challenges. Anyway, they were a real dynamic duo, these two teachers, one fabulously flamboyant, the other very much a science jock: Mrs. Pelletier and Mrs. Wallace. "Mrs" in the original sense, by the way, in which the term simply indicates that the woman in question is a grown adult, which for a grade five aged girl student was pretty mindblowing. On the second or third day of class, Mrs. Pelletier, who taught mostly the humanities subjects, calmed us all down and began getting our language arts assignments set up for the year. Then she paused and declared, "You are welcome to use any adjective you like, but not nice. Nice doesn't say anything. It is so empty that last year in this class we ceremonially buried it. Now let me introduce you to the thesaurus." Truth be told I'm not sure about the thesaurus part this far along, but it fits. Now you may be thinking, gosh, big deal, why tell a bunch of fifth graders not to use the word "nice"? Well, being a teacher myself, I can tell you from both perspectives that it is an awesome way to get kids thinking about words and to get them more interested in reading and writing. Give kids an approachable challenge and watch them take off with it in the best way, which they will nine times out of ten. That aside, obviously I never forgot this little episode, and continue to eschew "nice" as a sort of "filler adjective to describe something pleasant."

Thing is, "nice" really is an awful word, astonishing as that may sound. It genuinely astonished me to properly sort this out via my trusty OED. And by awful, I do mean awful. This isn't just a matter of the exercise in which the apprentice actor has to say "goldfish" in twelve different ways. It's not just a tone or pitch issue.

For starters, "nice" in english originally meant "stupid," and it specifically described a person and the state of their intelligence, or rather lack thereof. It is a worn down pronunciation of latin "nescius" with the same meaning, itself derived from the verb "nescire," "to not know." After awhile, "nice" got a positivity upgrade, in which it was used to describe someone who kept away from bad or dishonest behaviour and therefore lacked knowledge of it as a matter of practicality. Then it even began to mean a person was shy, the idea being they didn't know how to behave at parties. This soon slipped right back to the negative side of the ledger as it began to refer not to being honestly unaware or careful to avoid bad behaviour, but making a pretence of not knowing as an expression of snobbery or just plain dishonesty. As linguists Julia Penelope and Susan J. Wolfe would warn us to expect, this particular version of "nice" was most often aimed at women and men considered effeminate. More recently, these meanings have been lost, and the word is used to refer to "pleasant or agreeable" things. A thesaurus replacement for it in this sense might be "anodyne" meaning "unlikely to provoke dissent or offence; inoffensive, often deliberately so." I stumbled on this synonym in a review of a performance of Beethoven's fifth symphony, and had to look it up. It's a pretty fancy way to say that the performance was nothing to write home about, one way or the other. Home run for the thesaurus!

Mrs. Pelletier didn't go into all of this with my fifth grade class; she merely pointed out that "nice" was overused and we could certainly be more creative than that. What makes "nice" awful is not per se that it originally meant other things that are less than pleasant, or that when Shakespeare writes it into dialogue he is probably writing a whopper of an insult. No, the trouble is that "nice" says nothing; it has become the default filler when, sure enough, we would like to avoid provoking dissent or offence. While there is plenty of pressure on all of us right now to act as if we can't handle dissent or offence and so should avoid it, that pressure is neither well meaning nor constructive. We're all made of sterner, more creative stuff. (Top)

Copyright © C. Osborne 2024
Last Modified: Monday, January 01, 2024 01:26:19