Where some ideas are stranger than others...

FOUND SUBJECTS at the Moonspeaker

Some Clarity About "Political Correctness" (2018-05-07)

A rampant pile of dictionaries. C. Osborne, may 2018.

One day an acquaintance of mine, an avowed leftist, began loudly beaking off about the horrors of "political correctness." I found this very interesting, because the term comes out of right wing politics, and, probably naively, I expected this to be a feature that would make a left wing person go, "Hmm, I think I need to learn more about the meaning of this term." It isn't impossible for left and right wing political positions to be similar based on entirely different criteria and chains of reasoning. It happens, there is nothing wrong with that, and we can probably set it down to the perversity of the universe that spoils our hopes for sealed political boxes. In fact, I think it is a great prod to pause and go, "Hmm, I think I need to learn a bit more and check my reasoning." Notice that is not the same thing as "I had better change my mind, somebody right wing thinks that!" Life is not that simple.

UPDATE 2018-05-29 - On this topic, another great source of long and thoughtful reads is Andrew Nurse's blog canadian studies. A great starting point is his 11 April 2018 post Thought Police and Post-Secondary Education.

UPDATE 2019-09-07 - There is another closely related rhetorical tactic to this one, in which challenging a person's political beliefs or merely asking them questions to better understand their position leads to that person insisting that they are being "triggered," "made uncomfortable," or "having their existence challenged." This is absurd hyperbole but very much in vogue right now because a key atomizing tactic that propagandists of the status quo apply with vigour is the claim that who we are is what our opinions are, or as Siva Vaidhyanathan noted in an interview with the team at Logic magazine, "[a] tendency to see political opinions as extensions of our identities."

UPDATE 2022-08-12 - For those interested in following up on how manners are used to manipulate people and maintain social hierarchies, there is a real treat freely available for you to read. The late anarchist anthropologist David Graeber wrote an essay, Manners, Deference, and Private Property: Or, Elements for a General Theory of Hierarchy, and published it on his blog circa 2007. If the blog's stylesheet keeps the page blank for some reason, switch to reader view or turn the stylesheet off and the essay will pop right up.

As time has gone on, it has been quite the sight and sound, the sudden right wing discovery of "free speech," even among those with authoritarian leanings. Under authoritarian conditions, "free speech" is not possible for anyone, so this is a genuine incongruence. In other words, anybody sympathetic with authoritarianism, be they right or left wing, cares not a bit about "free speech." This is a rhetorical abuse, a use of trigger words to work people into a lather and prevent them from thinking about what was actually just said. It blows my mind that people can both fall for this, and insist that they are skeptical, especially of politicians and satirists, the very people who like to flail "political correctness" accusations for all they are worth. A significant part of why this annoys me so much is that the loudest accusers are also quite clear that the last person who should ever have freedom of any speech is anybody remotely like me, if they can help it. Somehow "free speech" stops being a core value as soon as it is their oxen getting gored. If you'd like to see and read about a brilliant case in point, look up Michelle Wolf's monologue at the "white house correspondents dinner" and the hilarious protests of the various right wing as much as left wing types whose oxen got gored, many of whom inveigh against "political correctness" and its supposed protection of the mere tender feelings of various "interest groups."

Then go and read this article by Nathan J. Robinson, What Being "Politically Incorrect" Actually Looks Like. I am going to quote an important point that Robinson makes in it: "People who insist that 'political correctness' means a stifling of the right to 'be offensive' are being imprecise. In fact, they are upset that they can't make jokes at the expense of the powerless and marginalized, that there's less tolerance for humor that rests on cruel stereotypes about people of color, women, disabled people, queer people, and fat people." Well, honestly, yes of course those people are being imprecise. They have to be, otherwise they would have to admit that what they really want is to be able to say the sorts of things they used to be able to get away with in public. Robinson is too diplomatic to say so in the way I just did. Instead, quite as reasonably, he pointed out: "But there is an even more important conclusion to draw from Michelle Wolf's comedy. It helps us see through the lie about political correctness, this idea that feminists are trying to kill humor. They're not trying to kill it. They're trying to turn it against the people who deserve to be its targets: the Trumps and Weinsteins and Cosbys and Moores of the world. And those people aren't upset because they value edgy humor, but because they want humor that bullies the powerless rather than exposing the grotesque immorality of the powerful. The cult of 'civility' is the real 'political correctness,' the stifling consensus that prevents us from telling the truth about the people in charge."

By the way, it was entirely possible to learn this before Michelle Wolf's brilliant use of her opportunity to wield her razor wit where it would do the most good at an otherwise questionable event. For the old fashioned and the new fashioned but also adventurous, you could have spent some time with any of several brilliant Feminist linguists, including Julia Penelope and Susan Wolfe. Or Feminist philosophers – philosophers are quite as concerned about language as any linguist – Jane Caputi and Mary Daly. Not only do they skewer such rhetorical abuses of language as the supposed phenomenon of "political correctness," they are damned funny at it. Humour was an important part of how they made their points, and hard to avoid. As Michelle Wolf showed again and again, the vicious discrepancy between self-righteous claims and real-life injustice can be so absurd as to make us laugh, even as it opens our eyes to wrongs that need to be righted. It has to be. Or all we could do is weep. (Top)

Fourth Wallisms (2018-04-30)

Screen grab of Dr. McCoy in a rare triumphant moment from the original ST episode "Journey to Babel" courtesy of memory alpha, april 2018.

"The fourth wall" being the now familiar conceit that actors engaged in a drama do so within a more or less virtual box with one invisible or absent wall through which the audience watches them. This idea is quite a commonplace now, to the point that it is easy to find web pages full of examples in comic books, television shows, movies, and the usually understood original source, the theatre. No one seems inclined to acknowledge that ordinary novels and stories, let alone non-fiction of various types rampantly breaks the fourth wall. For some reason it is egregiously present in children's fiction, when an author is unable to respect their young audience enough not to slap them in the face with something to the effect of, "This isn't real, you know." Children get that, they are some of the best and sharpest critics of fourth wall breakage you will ever meet. When done well of course, breaking the fourth wall can be brilliant and funny. I'm not sure if it could be serious and sad, although I don't doubt that it has been attempted. The trouble with it is that breaking the fourth wall, whether subtle or obvious, succeeds or fails purely on the actor's delivery. Their success may be a matter of audience taste, so arguing about it is probably not worth the time.

Part of what brought all this to mind was reading the early pages of Susanne Langer's Philosophy in a New Key. Langer is best known for her studies of philosophy and its intersection with art, which led her to recount her childhood experience of a performance of Peter Pan. She noted that she enjoyed the performance, until the actor playing Peter turned to the audience and implored them to shout that they believed in fairies in order to save Tinker Bell's life. Langer found this profoundly disturbing; we might say now that it threw her out of the story. Until that moment, she was working under the assumption that this was "ordinary theatre," and she was far too young to know about Bertolt Brecht. This isn't about suspension of disbelief: nobody believes a play is literally real. So what went awry here?

I think it has to do specifically with modern expectations of the audience, and the particular ways in which the camera is deployed not just to manipulate what the audience sees, but how they see it. Langer's childhood falls firmly into the early cinema era, so she is an early example of our present common experience of learning and experiencing multiple dialects of visual depiction. In the "modern" era, the audience is positioned in a manner very similar to that imagined in many cultures for a deity. Outside, able to see and hear everything that is important, uninvolved, with nothing to gain or lose from the experience, except maybe entertainment. Performances directed at children are most likely to strain this because they so often include explicit "teaching moments." Not a bad thing per se, but like fourth wall breaks, highly dependent on their delivery for effectiveness without being insulting or annoying. So the trouble with the performance Langer saw was that it violated that convention, or did so clumsily enough that rather than enjoying it, she felt that it spoiled the play. She did note that plenty of children merrily shouted they believed in fairies and were fine with it, so again, this is not an all-or-nothing matter. Fourth wall breaking is risky.

As I thought this over, it occurred to me to wonder whether "the fourth wall" was a thing for the ancient greeks, who according to the usual "origins of theatre" story in mainstream western culture invented theatre pretty much on their own, with everybody else eventually adapting it via them, but mostly via the romans. The contributors to the 1992 anthology Nothing to Do with Dionysos? draw out a multitude of features of ancient greek theatre and its social context that can be reconstructed from written and archaeological records. For example, in ancient athens, women weren't supposed to be in the audience any more than they were on stage, and a significant portion of that audience was the young men deemed to have recently come of age. The chorus, which famously comments on what is happening in the main play, interacted with the main characters and spoke directly to the audience in both comic and tragic plays. Even in translation, there seems to be an expectation of some form of audience participation, and not just the "oohs and ahs" today more associated with the circus.

Much later of course, we get to read about the "groundlings" of elizabethan era theatre, who since they were expected to stand and couldn't expect to be able to see and hear everything as well as the more well-heeled members of the audience, saw no reason not to be pointed and interventionist when they felt the play was going badly. Over time this sort of behaviour became more and more frowned upon as theatre was gentrified and separated into new categories like "theatre proper" versus vaudeville, or for that matter, circus and wild west shows. All of which suggests that "the fourth wall" is perhaps surprisingly new, reflecting a greater desire to control audience experience and response. And in the end, that may be the real problem with breaking the fourth wall. If done badly, it makes the playwright and/or producer into a sort of intrusive puppet-master, rather than a co-creator of a hopefully moving and pleasurable experience. (Top)

Poetry Is Not a Luxury (2018-04-23)

Public domain photograph of an opening page from one of the christian gospels included in the Book of Kells, courtesy of wikimedia commons, January 2014.

The title of this piece is an invocation of the luminous Audre Lorde, whose famous short essay of the same title debuted in a 1977 issue of the journal Chrysalis: A Magazine of Female Culture. In this time of neoliberal-neoconservative backlash in which students of all ages are relentlessly pressured to abandon the humanities because they are supposedly useless, I find myself pondering Lorde's words again. She was a Black, Feminist poet who fought for her vocation through poverty, racism, sexism, and homophobia. She had to deal with finding the necessities of life for herself and her children. She worked damn hard, and she had no illusions about the importance of food, clothing, and shelter. Yet there are many people out there who disagree with her statement in that essay. They insist that no, poetry is a luxury, and that this categorization is not debatable. To debate it, let alone oppose it, according to those making this claim, is to be utterly unrealistic. And if those folks meant simply the poetry that is so often inflicted upon students in school, then I would have to concede a part of their point. Poetry done over into hackneyed awfulness and then shoved into unwilling students' eyes is certainly not a luxury. It is a vicious form of aversion therapy, because in truth, poetry is not a luxury. It is a necessity.

As Lorde notes, "Poetry is the way we help give name to the nameless so it can be thought. The farthest horizons of our hopes and fears are cobbled by our poems, carved from the rock experiences of our daily lives," and "In the forefront of our move toward change, there is only poetry to hint at possibility made real." In her essay Lorde is specifically discussing women and their capacities to make change, seeking to answer the implicit question of what new ideas and modes of living are going to be put in place of the current ones that aren't working. If the goal isn't some masked version of the christian rapture, how else can we live? These are at root optimistic questions, assuming that people can change, injustice can be overcome, and there are other ways of being. Imagining change, serious, principled change, demands more imagination, not less. It is also hard work. Poetry isn't just fun and games.

I am quite serious. Writing poetry in the conventional sense, the carefully arranged words that fill books and a considerable portion of the airwaves, is not trivial labour. Not the poetry that lasts, that actually touches people and remains in their minds, ready for singing or recitation till almost their very last moments on Earth. Just like anything else humans make and experiment with, there is plenty of stuff that doesn't work out. Going back to the expanded sense of "poetry" in Lorde's essay, there are ideas, approaches to making lasting change, that people have tried and found wanting. But here's the strange thing. We are not encouraged to be as magnanimous to poetry as we are expected to be towards religion, economics, or any experimental science. No indeed.

When following a religious precept leads to some appalling social or environmental result, that just means we humans misinterpreted the divine instructions. When the latest economic theory is shown by real life results to be complete bollocks, that means the market was not allowed to be its proper, ideal self by human mismanagement. When the latest exciting scientific experiment gives an unexpected result, that goes to show the scientific method is working. But when an attempt at creating lasting social change doesn't have the desired result, not only is it declared a pointless effort by the purveyors of the status quo, they add that any imaginings of different ways of being are pointless fiction and should be mocked, and discouraged as not fit for adult attention. Well, that's a great way to maintain the status quo, to be sure.

As always, it is critically important to ask whenever somebody tells us that we are wasting time with a particular practice, who benefits by us agreeing to trammel our imaginations? What remains unconsidered and unchanged if we agree to stop seeking to envision and embody new possibilities? Who benefits from that? (Top)

Mixed Bag (2018-04-16)

Photograph of mixed nuts by Sage Ross, via wikimedia commons under Creative Commons Attribution-Share Alike 3.0 Unported license, december 2013.

One day in the course of a family gathering, one of my relatives discovered that someone had brought a box of "bridge mix." The term is actually a generic one, for finger food that is easy to eat and not too messy while holding a hand of cards during the titular game of bridge, for instance. The fancy stuff in the box is considered fancy because all of the items are chocolate covered. So raisins, peanuts, turkish delight, or the, to me, utterly disgusting malted milk balls. Different companies put wildly different things in their bridge mixes, and of course the quality is just as variable. Really, the label on the box should probably be, "you must trust us, you bought our product."

From this you may have surmised that this thoughtpiece is going to be about the ongoing dumpster fire that is facebook and surveillance capitalism. But I figure plenty of other writers have that covered. I've been thinking more about the attitude and philosophy embedded in computers and the ways they are applied in the real world, because in the end that is a more important issue. One of the best writers I know of on this is David F. Noble, whose earliest books deal with the history of technology, especially as it is expressed through the control-obsessed mindset encouraged in engineers.

UPDATE 2018-07-11 - Those who may be feeling skeptical about the mindset I am describing and its use as a guiding design principle in early automation may want to turn their attention to Charles Babbage's writings. I find it mildly bemusing how much bad press he got in his time, when he in effect did yeoman's work for the cause of capitalists and others hoping to hyperexploit labour if they had to use it, while steadily decreasing the skill and numbers of people employed. Babbage's book On the Economy of Machinery and Manufactures is still regarded as a classic, and includes this often quoted statement from page 54: "One great advantage which we may derive from machinery is the check which it affords against the inattention, the idleness, or the dishonesty of human agents." Few quote the following sentence, which reads, "Few occupations are more wearisome than counting a series of repetitions of the same fact; the number of paces we walk afford a tolerably good measure of distance passed over, but the value of this is much enhanced by possessing an instrument, the pedometer, which will count for us the steps made." Babbage is of course right on this point, and he is also pointing out the way to treat an operation to be automated in the first place: break it down into tedious, small parts requiring as little mental engagement or physical dexterity as possible. This is true even though walking is actually far from a non-dextrous action.
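
Babbage's pedometer example translates neatly into the basic pattern behind modern instrumentation: reduce an activity to a stream of identical, trivially countable events, then let the machine do the tallying. Here is a minimal Python sketch of that pattern, purely illustrative on my part; the event names are invented, and nothing in it comes from Babbage or Noble beyond the idea of delegated counting.

    from collections import Counter

    class EventTally:
        """Counting in the spirit of Babbage's pedometer: the activity is
        reduced to identical, countable events, and the tedious tallying
        is handed over to the machine."""

        def __init__(self):
            self.counts = Counter()

        def record(self, event: str) -> None:
            # Each occurrence is stripped of context; only the count remains.
            self.counts[event] += 1

        def report(self) -> dict:
            # The "measure of distance passed over," in Babbage's phrase.
            return dict(self.counts)

    # Hypothetical usage: a stretch of walking reduced to countable steps.
    tally = EventTally()
    for event in ["step", "step", "pause", "step"]:
        tally.record(event)
    print(tally.report())  # {'step': 3, 'pause': 1}

Notice that the decomposition does all the work: once "step" is defined as an event requiring no judgment, the machine can both count it and, just as Babbage advertised, check the human attached to it.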

From what I can tell, not only from David F. Noble but also from more recent writers and historians from Christopher Dummitt to Jane Jacobs, engineers, contrary to popular belief, are not trained and encouraged to solve human problems. Far from it. Originally an engineer was simply a man, always a man, who supervised an engine. He made sure the engine ran properly, had sufficient fuel, and that nobody messed up the engine. Engines were conceived as a solution to the problem of human labour. First, to make human labour possible where otherwise it wasn't, such as the bottom of deep coal mines. Second, to force humans to labour more by taking away their means of exerting control over their work, which, ironically, trains are very good for. Third, to replace human labour altogether. The inference then is that humans are actually the problem that engines are supposed to solve, and over the years of the invention and professionalization of engineering, engineers were encouraged more or less subtly to identify with the engines and accept that definition of "the problem." This is no caricature. It reflects the sort of externalizing and careful elision of social context and the risks and benefits to others that is quite typical of what today is a common technocratic mindset, which itself is founded on this narrow engineering view. However many engineers disagree with and oppose it, it is still the majority position.

If "the problem" is humans and enforcing the "correct" behaviour on them so the machine will perform at its best, then it stands to reason that opportunities for exploration, tinkering, and repair by anybody besides an engineer with specialized training and tools vanish. It also stands to reason that the engineers will become ever more engrossed in designing greater and greater surveillance mechanisms into the machines, in order to identify and counteract "incorrect" behaviour. It stands to reason that they will work on obfuscating as much as they can how any machine, now not just any engine, works. After all, we must trust them. We bought the product, or at least accepted it, accepted the premises it enshrines, starting with the premise that the machine is designed by people who don't trust us. They trust the machine. And they trust that the machine can be used to make us behave in ways that they consider appropriate, efficient, and productive. This is why so many engineers and futurists who would like to be engineers insist that the future of computers is miniaturization and total ubiquity. If they can only set enough computers to surveilling human behaviour, and the computers have big enough databases to look things up in, then humans can be managed totally and once and for all.
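
This design stance has a recognizable shape in code. The sketch below is a hypothetical, deliberately simplified Python rendering of behaviour-monitoring baked into a device's control loop; the endpoint, the event fields, and the list of "correct" actions are all invented for illustration, not drawn from any real product.

    import json
    import time
    import urllib.request

    TELEMETRY_ENDPOINT = "https://example.com/telemetry"  # hypothetical collector

    ALLOWED_ACTIONS = {"power_on", "power_off", "run_cycle"}  # engineer-approved uses

    def report(event: dict) -> None:
        """Phone home with every observed action; the user is never asked."""
        payload = json.dumps(event).encode("utf-8")
        request = urllib.request.Request(
            TELEMETRY_ENDPOINT, data=payload,
            headers={"Content-Type": "application/json"})
        try:
            urllib.request.urlopen(request, timeout=5)
        except OSError:
            pass  # fail silently: the monitoring is meant to be invisible

    def handle(action: str) -> None:
        # Anything outside the approved list is recorded as "incorrect,"
        # not supported, repaired around, or explained.
        event = {"action": action, "timestamp": time.time(),
                 "incorrect": action not in ALLOWED_ACTIONS}
        report(event)
        if event["incorrect"]:
            print("Unsupported operation.")  # the machine refuses rather than adapts

    # Hypothetical usage inside the device's main loop:
    # handle("descale")  # not in ALLOWED_ACTIONS, so logged as "incorrect"

The telling thing about the sketch is how little of it concerns doing anything for the user: most of the code exists to observe the user on behalf of someone else.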

This is also a key reason for manuals and other sorts of information about machines all but vanishing. That it is difficult to write and maintain those manuals is no explanation. Textbooks and cookbooks are both hard and complex to write, covering complex practices and concepts, demanding a mixture of graphics, text, and design provided by a team of hardworking people. We expect there to be different levels of textbook and cookbook, the sort that raw beginners need, intermediate students, and the people whose experience and interest means that they dip into their books, rather than working through them systematically. But this all centres the humans, and furthermore demands ongoing work. A machine-centred approach seeks a once and for all answer, because revisions are deemed a waste of time and energy when demanded by humans. So every attempt is made to reduce the number of decisions available to the human. Hence a request for assistance in doing something with a machine that it seems reasonable to expect it to do, especially in the context of computers, often results in an answer from "tech support" to the effect of, "why would you want to do that?"

Total distrust of humans does not have to be embedded in the design and operation of machines, but right now, it is. That is fundamentally why the modern crop of machines we are all too familiar with, the ones absolutely dependent on fossil fuels, were developed. After all, they were bankrolled primarily by capitalists seeking every possible means to maximize their profits, and a key means has always been to somehow get rid of labour costs. Note that labour is always defined as a cost for capitalists and something that must be imposed on workers. Nobody likes being imposed on, so of course the workers resist and the capitalists demand solutions that limit resistance while displacing responsibility for any worker unhappiness from them if at all possible. Which takes us back again to machines, the rationalization of increasing surveillance, and the ongoing effort to render every machine a black box which no one but the company that sells it controls. (Top)

Geeksploitation (2018-04-09)

Sample of various orange gaming dice, quoted from the imaginary adventures gaming supply website, august 2022.

I have found several reviews of the new movie, ready player one, in my rss feed, and so far the big message that comes across is disappointment. Not surprise or anything, and there are definitely plenty of people who like it. Yet I was particularly struck by the reviewer who commented that Steven Spielberg just doesn't get the culture he is trying to depict, or the gestalt of the original book. This person did acknowledge that Spielberg had to deal with licensing limitations and the like, but his greater concern was a failure to appreciate the "spirit" of things, if I understood him correctly. Not having seen the movie, and frankly not being a participant in a gaming community or similar pop culture oriented geek group, I have no position on this in itself. Instead, the reviewer's reflections got me thinking about the ever-expanding genre of geeksploitation flicks and tie-in products, the dark side of the relative mainstreaming of fandom, a mainstreaming carried out basically in order to facilitate exploiting fans.

Needless to say (so I will anyway), "geeksploitation" is neither my own invention, nor a new term. Tim Wu references an article dating back to 2010 in a 2013 movie review of his own, so it could be argued that the term is relatively old hat. wiktionary documents a 2000-era use of the word in a book by Richard Grayson, and the dictionary entry defines it as "taking advantage of highly-motivated programmers willing to work long hours." The connotations remain consistent, even if the reference has expanded from taking advantage of employees to taking advantage of "consumers." It's hard not to see here an ongoing drift of the word towards something not so far from the infamous quote misattributed to P.T. Barnum, "There's a sucker born every minute." Late-stage capitalism is certainly revealing the apotheosis of this cynical viewpoint, and "geeksploitation" sounds like it could be exhibit A.

Exploitation films are certainly a thing, and part of what brought down movie rental businesses was the sheer dreadful mass of them, on top of the effects of the web and a new thing called netflix. I can remember the majority of the small town movie rental stores I grew up with jammed with multiple copies of them while the latest flavour of the moment was impossible to rent anyway. Some of these films did make it into "so bad it's good" status, like the attempt to take further advantage of the original He-Man toy craze, "Masters of the Universe" (Courteney Cox before Friends, and Meg Foster before Hercules the Legendary Journeys, folks – there's some kind of nerd cred for you). All this was before there was a "geek" market; instead the focus was on children and their tie-in toys after the law against advertising directly to children was struck down in the states. The "geek" market today is made up predominantly of those now grown up children. Hmmm.

I have to admit that I do puzzle over the notion of "geek culture," despite the fact that I have a range of overlapping affinities with aspects of what the term seems to usually be applied to. At least, in terms of the sorts of things that geeks are expected to like, and which therefore are reflected in marketing to that segment presumed affluent enough, or obsessed enough, to minimize necessities in favour of their preferred pop culture products. A challenge that marketers face with everyone is the fact that people actually don't like being marketed to, especially if they realize that's what is going on. This is a key reason for the poor view of people who sell used cars for a living, quite apart from the perverse incentives that get so easily tangled up in the practice. So maybe part of the disappointment I have read about is disappointment that "geeks" are being obviously marketed at, which is not considered cool, or that the marketing is aimed at expanding the market. In other words, going beyond "geeks," who are used to being outliers and a somewhat exclusive crowd, usually not by choice, which foils what is I think a true "tenet" of "geek culture": "geeks" are not like everybody else. (Top)

The Abused Verb "Identify" (2018-04-02)

Go ahead, pick one. Widely shared clipart, march 2018.

UPDATE 2018-07-19 - Fair Play For Women, a brilliant website well worth spending some time reading on, makes another important point regarding this complex of abused words. "When you ARE something, you don't need to define yourself as it. No-one self-defines as a human. No-one self-defines as alive. People with adequate vision don't self-define as sighted. The very act of self-definition means to present yourself as something you are not. It is, bluntly put, to tell a lie and ask others to agree to pretend it is true." This quote comes from the article Questions for Left-Wing People Who Support Self-Definition as a Woman.

UPDATE 2019-09-15 - Philosopher Kathleen Stock wrote a piece intended to be part of a blogpost answering the question "How can philosophy change the way we understand transgender experience and identity?" which is relevant here. Everyone who wasn't transactivist affiliated appears to have been deep-sixed from the final published version, so I quote Stock via Leiter Reports: "Philosophy can ask: what is a transgender identity? More generally, it can ask what 'identity' is, and interrogate the central role that the notion now plays in contemporary politics. On one interpretation, one's identity is wholly subjective: it's whatever you believe you are, right now, where your beliefs guarantee success – if you now believe that you are such-and-such, then being such-and-such is your identity, and there's no way you can be wrong about that. Sometimes we hear that identities include, not just being trans or not, but also having a sexual orientation: being gay, or heterosexual, or bisexual. But if, for instance, 'subjectively believing you are heterosexual' is equivalent to 'actually being heterosexual,' then this presumably means you are automatically heterosexual as long as you feel that term applies to you. And this looks wrong. Aren't there independent, non-subjective conditions to be fulfilled, to count as being a heterosexual? You have to be genuinely attracted to the opposite sex, for one. Lots of people believe they're straight but aren't. Self-deception is possible. So possession of a heterosexual identity, in an interesting sense, seems to require more than just subjective belief. If that's right, then we should think harder about making a transgender identity only a matter of what one subjectively feels is true about oneself right now."

UPDATE 2020-12-23 - It occurred to me that many readers might be a bit mystified why I find Orwell so politically objectionable. After all, a person could insist that it is not Orwell's fault that he wrote two of the most abused propaganda set pieces that are shoehorned into school curricula in hopes of indoctrinating students by forcing them to read them. This would be nice to believe, but in my view it simply isn't true. Orwell knowingly and willingly wrote propaganda, for his own reasons, not at the direction of the british government, and it came in useful after the second world war. The kicker from my perspective was learning about the posthumously published evidence of his fingering people as "potential communists" to british intelligence, such as it was. No need to take my word for this, see OpenCulture's 2015 nuanced post George Orwell Creates a Who's Who List of "Crypto" Communists for British Intelligence Forces (1949), or for a far more passionate take, see Ben Norton's 2016 blogpost George Orwell was a reactionary snitch who made a black list of leftists for the British government. Whatever you may think of Ben Norton, I much appreciate his link to Isaac Asimov's review of 1984, one that seriously engages with the book in a truly impressive way – not because he trashes the book, although he does (which is a bonus), but because he takes issue with it as a novel and with the claim others have made that it is science fiction.

UPDATE 2022-01-27 - Due to continued, not terribly thoughtful references to Orwell's novel over the past several years, more recent analyses are finally appearing on-line. An excellent one is Jo Brew's Big Sister is Watching You! A Radical Feminist Analysis of Nineteen Eighty-Four on substack. Not many people have paid much attention to Winston's rape and murder fantasies, let alone the fate of the women in his life before he takes up with Julia.

Among the unfortunate themes of this period of intensive backlash against women, Feminism, anyone who can't pass as white, and apparently the Earth itself, is the rampant abuse of language. I am no fan of George Orwell, in part because of his despicable propaganda piece 1984, designed to encourage defeatism and conformity, and in part because of more recent revelations about his practice of spying and red-baiting, but I have to concede that when he created the category label "newspeak" and analysed it in his infamous novel, he was telling no more and no less than the absolute truth. He provides a chilling précis of such practices as insisting words mean the opposite of their common meaning, refusing to define terms, or insisting on meanings so nebulous that they can be conveniently rejigged for political convenience. These practices are all too tempting because they can be powerful forces for gaslighting, silencing, and the destruction of open and honest conversation, let alone debate. There are many examples of this in the mainstream media right now, and it is not difficult to run into it being wielded as a clumsy rhetorical bludgeon by people claiming to be "progressives." However, I would like to focus on a key item in this current version of "newspeak," the abuse of the verb "identify."

I headed off to my copy of the OED as per usual to learn about the roots and general history of this word, since words do change in meaning over time, though in the normal course of things this generally takes close to a hundred years or more, at least in english. The OED duly informed me that the roots of this word are from latin, from "idem" meaning "same" and "facere," meaning "to make." So to begin with, "identify" meant "to make the same," especially by finding enough similarities between two whatevers or whoevers to say, "yes, that is the guy I saw here yesterday," or "that's the plant that'll kill you if you eat it." Sounds useful. Then the follow on definitions include, "establish or indicate who or what someone or something is," "recognize or distinguish," and "associate someone closely with, regard someone as having strong links with." Judging by the illustrative quotes, this last one carries strong connotations that the association and regarding are inaccurate. Then, "to regard oneself as sharing the same characteristics or thinking as someone else." The referent in all these cases is not internal at all, but someone or something else. Which means that the identification can always be contested, which is very useful when you want to claim that some group merely identifies as something, and then declare that in fact they don't exist. Indigenous peoples worldwide are all too familiar with this bait and switch. Look up the Sinixt Nation, or the Native Tasmanians whom people who think they are white are still claiming are completely extinct.

There is also a closely related abstract noun, the ever handy "identity," which my OED reports means "the fact of being who or what a person or thing is," and "the characteristics determining this." The characteristics in question are stable, features that enable the person or thing to be consistently recognized. In themselves the features or characteristics need not and more often cannot determine behaviour, except in the most generalized senses. For example, I have two feet and two legs and no issues preventing me from moving them. So if I want to get around, my starting point is walking. The notion of "identity politics" is very popular right now, because it starts from the notion that people who are not upper class, rich males who think they are white must "identify" as what they are, and that "identity" is a completely individualized thing that drives all of their ideas and actions. If only they'd "identify with" somebody or something else, they would automagically develop the ability to "pull themselves up by their own bootstraps" and stop demanding changes to the status quo. After all, a person who "identifies with" someone or something can have their resultant identity contested and even denied by "those who know better." Somehow "those who know better" are always upper class, rich males who think they are white, similar males who aspire to be just like them, and various highly sympathetic women who think they are white. That's a spectacular, but actually very old, abuse of language. It can be traced as far back as colonial settler states defining things like "Indian status" so they can delete whole peoples by what the brilliant historian Barbara Alice Mann refers to as "pen and ink witchcraft."

There is at least one more distortion to consider here, and that is the modified term "self-identify," which is apparently the alternative for speakers who want to set their identifications beyond question. I base this inference on my encounters with fellow students when they draw on this verb, and I can see the appeal, because it is supposed to overcome the potential for the identity at hand to be questioned. Lucy Mcdonagh, one of many brilliant, clear headed working class activists on the front lines everyday, unpacked this verb as follows: "That's what 'self-identify' means: anyone can say they are anyone... So, rich, privileged people can claim to be marginalized." (Here's the source article from Feminist Current; it is worth every moment you spend reading it.) On top of that, people can "self-identify" with just about anyone or anything, no consistency required.

Let's bear in mind here that intensely admiring another social group is not the same thing as being somehow a member of that group. Not even if that admiration leads to a desire for membership that blazes brighter than a thousand suns. As Métis scholars have had to point out a lot lately, who you are isn't about who you claim or who you "identify with" or "self-identify as," but about who claims you: whether you are accepted by that group, and what you do as a member that reflects your deep respect and love for that group. Insisting that either mode of "identification" overrides this is an illustration of a sense of entitlement, not a sense of reality. Who and what we are is not changeable, much as we may want it to be. Our intense feelings of admiration, sympathy, or even (heaven forbid) ownership, can't change what we are.

But if destructive notions of entitlement and ownership aren't in the way, it is quite possible to identify systemic oppression, and take action against it. That may well mean working against the systemic oppression of some group that in a moment of confusion a person claimed to "identify with" or "self-identify as" when they intended to express their empathy and commitment to do right by that group they admire. As for the ones who abuse the terms "identify," "identity," and "self-identify" in an effort to appropriate and oppress the groups they claim to be, those are the contributors to the systemic oppression, not its opponents. A word to the wise. (Top)

You Can't Have It Both Ways (2018-03-26)

Roman era representation of the deity Janus, one of the Roman deities of liminality, in his case the hinge between the new and old year. Photo by Loudon Dodd, july 2009 via wikimedia commons under Creative Commons Attribution-Share Alike 3.0 Unported, 2.5 Generic, 2.0 Generic and 1.0 Generic license.

This particular thoughtpiece has been on my mind for a while, because it took some time before I could sort out what the logical mess was. What happened was I bumped into yet another article insisting with absolute certainty on the importance and value of advertising. These are easy to find; just follow a brand name tech magazine for a few days. Advertising is how anybody who makes money makes money, as far as that person is concerned, and that's why corporations and companies of all sizes spend money like there's no tomorrow on it. Advertising works, it causes people to change their behaviour, or they wouldn't spend money on it. Then the very same media outlet or blogger will declare that video games and pornography don't affect anyone's behaviour at all. Ever. It is impossible, nobody can prove a connection between video game playing or pornography and violence. But you can't claim that repeatedly seeing images and hearing messages intended to influence behaviour must have an effect when it is advertising, or more bluntly propaganda, and simultaneously insist that these special categories of images and messages somehow never, ever work this way. At least, not if you want to be logically consistent, which I suspect is the real reason the people trying to make these claims rush for what they believe is the "free speech get out of logic and critical analysis free" card. (I have a couple of other thoughtpieces that consider aspects of the uses and abuses of the concept of "free speech.")

UPDATE 2018-05-17 - It is worth noting that I am not suggesting that advertisers or whoever else who wants to influence people is somehow able to program people in the manner of computers or the fictional manchurian candidate. In a 1970 paper, "Consumerism" and Women, Ellen Willis lays out an excellent analysis of advertising and "consumerism theory." The paper is anthologized in Voices From Women's Liberation, compiled and edited by Leslie B. Tanner. She works her way through the complexities of social conditions as they interact with the propaganda a person may see and hear depending on their circumstances, and firmly argues against constructing propagandists as men who can magically lead everyone around by the nose. She does not argue that propaganda has no effect, but that its effect can only be temporary unless other circumstances act continuously to reinforce its authority and believability. In other words, I have unwittingly retraced some of the same steps as Willis, beginning from a somewhat different starting point.

Since "advertising" is essentially a euphemism for the sensibly maligned term "propaganda," it is reasonable to treat them together. Of course people who sell propaganda under either of its labels must at least pretend that they fully believe and expect that the stuff works. A great deal hangs on the notion that it is "the only way" to give a product, service, or message "findability." Leave aside for now that people couldn't have waited for propaganda for this to happen, or humans would never have figured out how to eat omnivorously or use fire. Today the main propaganda techniques seem to be "more," "louder," and "shock somehow." This leads to such absurdities as a pair of nike shoes marked at least eight times with the name and logo of the company on each shoe, in spite of the fact that one of each together with the easily recognized shape and style of the shoes basically screams "these are nike shoes." When I counted on the pair I had access to – an exercise I started because the sheer number of repeats just at a glance floored me – at least three instances were effectively invisible because they were inside the shoes. This did not persuade me to buy the shoes. Still, I'm sure many advertising mavens would say that the technique worked anyway, because my attention was caught.

In any case, let's think through the sequence "advertising" is supposed to take us through.

  • catch a potential purchaser's attention;
  • convince them somehow that they need the product;
  • furthermore, persuade them they must purchase the product, no shortcuts like theft;
  • repeat.

These are the raw basics. The whole point is that the advertising is supposed to guide and change behaviour. Since "more," "louder," and "shock somehow" are all part of the implementation, the expectation is not that advertising will achieve this instantly. Far from it. The expectation is that people must be repeatedly subjected to advertising, constantly subjected to advertising, preferably where they don't expect it and can't avoid it. Hence the proliferation of bathroom advertising over the past ten years or so in canada, although it would not surprise me to learn that this phenomenon began much earlier elsewhere. None of this is particularly different from how what is generally recognized as "propaganda" works; the only difference is what action and/or behavioural change is the desired outcome.
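
To make the repetition point concrete, here is a toy Python simulation of the funnel above. It is purely illustrative: the boost, decay, and threshold numbers are invented for the sketch, not empirical claims about how advertising performs. The only point is that in a model of this shape, occasional exposure does nothing, while sustained repetition eventually crosses the line.

    # Toy model: each exposure nudges an "inclination to act" upward, and
    # the inclination decays between exposures. The constants are invented
    # for illustration; nothing here is empirical.

    def days_until_action(exposures: int, boost: float = 0.03,
                          decay: float = 0.97, threshold: float = 0.5) -> int:
        inclination = 0.0
        for day in range(1, exposures + 1):
            inclination = inclination * decay + boost  # one more ad seen today
            if inclination >= threshold:
                return day  # the behaviour change the advertiser wanted
        return -1  # never crossed the threshold

    print(days_until_action(5))    # -1: a handful of exposures does nothing
    print(days_until_action(60))   # 23: only sustained daily repetition works

Remove the daily reinforcement and the decay term erases the gains, which matches the observation above that the effect rarely lasts without ongoing reinforcement of some type.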

Now here is where we get to a key point. An immediate objection to my putting video games and pornography in the same category as propaganda is that nobody is trying to spread propaganda messages of any kind via their use. Frankly, that is bullshit. Even if every video game and pornography studio's personnel struggled desperately to avoid sending any messages of any kind, no politics, no product placements, none of it, they would be fooling themselves. Everybody gets this when they visit websites not categorized as video game properties or pornography. One of the first things the rawest blogger does is express their politics by the very title of their blog and their first ten posts. That is only to be expected. We humans are opinionated and probably can't survive without opinions. And if we find ourselves with a platform that repeats our opinions incessantly to a huge audience, we can expect that in time our opinions will influence some of the people exposed to them. And we can expect that, regardless of our intentions.

It doesn't matter in any way, shape or form, if there is nobody involved in video game development anywhere who wants to encourage players to behave violently. The studies that claim there is no connection at all between violent video games and violence enacted both online and in firmspace are, alas, unconvincing at best to all but true believers. The key is repeated, longterm exposure, and that is something no study by psychologists and sociologists of whether there are connections between video games and violence, or pornography and violence, has dealt with. The logical inconsistency is one source of the problem with sorting this out, though not nearly as great a source as the significant money in video games and pornography. That the influence of money is a real problem for credible studies of any kind is not a theoretical issue. There are plenty of real life examples, from tobacco and fossil fuels to genetically modified organisms, drug treatments, and alcohol.

There are some things we already know about media intended to alter human behaviour by means of as many senses as possible and increasing intensity via techniques such as louder sound, more shocking content, or blunt repetition. We know that people are susceptible to these techniques to varying degrees at different times and different ages. We know actual behaviour change and action consistent with propaganda messaging can be achieved, though rarely for long without ongoing reinforcement of some type, and even then, the reinforcement doesn't work forever either. These are uncomfortable things to know and connect to products considered "mainstream" and "acceptable." But know them honestly and thoroughly we must. (Top)

Not An Explanation (2018-03-19)

IT help desk folks, hand wavy 'your computer' is not an explanation. Meme by C. Osborne, march 2018 (Photo Source | Info Source) Original photograph taken and shared by Laney Griner in 2008, thereby spawning a thousand memes.

This is somewhat of a public service announcement, directed to all the people in jobs where they have to deal with potentially irascible or frustrated people, and where the question may in some way entail computer use or be related to using a computer. That's quite a significant range of people, from IT help desks to the hard-working library staff in every sort of library you can imagine. It's a tough job. The person with the question or problem to solve may not share the same technical jargon, or may not be in a position to provide all the details of what they were doing that may help the help desk person/librarian help them most efficiently. Worse yet, as I noted, that person may already be annoyed, frustrated, or irascible. I am aware of this from direct experience on both sides of the desk. But here's the thing. Should the person seeking your technical assistance state that they have a computer running MacOS, or a GNU/linux variant, that fact does not in any way explain their technical issue. Suggesting it does is lazy and insulting, and guess what the result is when you make that suggestion?

That person on the other side of the desk is now at minimum annoyed. At maximum, they are furious. Either way, their ability to engage constructively with you and apply your suggestions is now seriously reduced, because you have effectively implied that they are stupid. I think we can all safely agree that any version of this outcome is bad.

For those wondering how I can so categorically say the computer a person uses cannot explain away a technical issue, well, first of all, it is no longer 1982 or even as late as 1995, when computer hardware and software were so much less standardized. Back then, yes, absolutely: which specific machine you had, and what its operating system and hardware were, could make, and often did make, a significant difference. Not so much on the early web, where the problem was no longer the machine so much as it was the non-standardized web browsers, plug-ins for multimedia when that became a thing, and the effective bandwidth a computer could manage. Way back then, ethernet was rare and high speed internet for the general public was a dream.

Today however, things are quite different. Standardization has been imposed with a vengeance on the web, the internet more broadly, and computers. If anyone has any doubts about this, look up the latest exploitable weaknesses discovered that affect pretty much every computer on the planet, regardless of operating system. In terms of web-based applications, html, javascript, plug-ins, all are standardized. There are always wingbat versions out there, but it takes effort to access and install them. In fact, web-based applications, tools, and sites in general are among the most consistently behaved computer-based things today. It takes serious effort to make something that refuses to do anything sensible for only one operating system, especially something that is accessed through or run within a web browser, even if it is specifically optimized for the most common browsers. So unless the person who calls or comes to the desk with a technical question or issue is running some extraordinary machine of pre-2000 vintage, suggesting the machine is the problem – especially if you can reproduce the issue on the bog standard windows PC likely provided to you in your job – is, as I said at the start of this rantish public service announcement, not an explanation. It's just a great way to piss off the person you're trying to help.

I realize that making such a suggestion may be a rote habit. It took me a good while to stop asking people whether they had tried restarting their computer after windows modernized, back when I was helping out friends and relatives with computer problems. Horribly enough, invoking the "maybe it's your computer" pseudo-explanation may even be part of the standard responses to technical requests recommended to help desk staff in their support binders. Thankfully, I haven't heard of a situation yet where there wasn't a way to get rid of it, especially if the justification offered is avoiding alienating people and accidentally doing harm to the best defence libraries and IT departments have against software and hardware related catastrophe: people willing and comfortable with bringing forward their questions and issues to be resolved. (Top)

Privilege Check (2018-03-12)

1933 south african cashier's cheque, a fancier version than is typical nowadays. This image came originally from a wikipedia entry, circa march 2018.

Courtesy of several intriguing and thought-provoking conversations over the past week or so, I wound up sitting down to think through the vexed term "privilege," and the practice of "privilege checking" as applied in the current political climate of extreme anti-Feminist and resurgent racist policing. It took me awhile to sort out what was bothering me about the way "check your privilege" has been weaponized into a demand not to do any such constructive thing, but to shut the fuck up. The profanity is not there for emphasis; the profanity reflects exactly the tone, demeanour, and often wording of follow up demands from the people who now most frequently make the demand in high visibility, public venues. It has been extraordinary hearing and seeing demands to "privilege check" deployed against those of us who are dealing with systemic oppression, in other words, people who are disadvantaged, as if that is somehow a special state of honour. If you want a great example of gaslighting, the systematic erasure of a person's, most often a woman's, reality, this is it.

UPDATE 2018-09-07 - A great discussion of the notion of privilege and one of its most famous recent proponents and reconceptualizers, Peggy Mcintosh, is provided by Outis Philalithopoulos at Naked Capitalism. It is a three part series with mostly constructive participation in the comment sections: The Victory of Privilege, Before Privilege: Cats, Spaghetti, and Ice Cream, and The Divided Psyche of Privilege. Philalithopoulos provides a constructive and nuanced reading, including a tracing of Mcintosh's changing ideas about "privilege" and how it works, particularly her shifting analogies for it and what is arguably a gradual loss of the aspects critical of oppressive societal structures. One of the commenters on part two, Buck Eschaton, tracked down the paper Mcintosh wrote prior to her more famous knapsack article, Feeling Like a Fraud.

UPDATE 2022-05-08 - Also relevant here is a paper I recently stumbled on that explains the origins of the execrable "privilege walk." I have never seen any purpose to using humiliation exercises as supposedly "team building" or "anti-racist" exercises. Since when is devising artificial ways to humiliate people presumed to be racist or "privileged" because they are not racialized or struggling to get by on a minimum wage job a constructive action? It's ridiculous and counterproductive. But where did this dumbassery that is driving a doubling down on social division and anger come from? Well, see Christian Parenti's 18 november 2021 article at nonsite.org to find out, The First Privilege Walk. Among Parenti's many well-chosen quotes is this one that finally clarified the humiliation exercises and what they are for: "As Dennis Tourish and Tim Wohlforth put it in their book On the Edge: Political Cults Right and Left, 'Artificially engineered peak experiences have long been known to induce extreme conformity. In the case of R[e-evaluation] C[ounseling], the supremacy of emotion over thought means that the discharge process is exalted as the most important part of the counseling experience... Research suggests that when people engage in embarrassing behaviors in front of a group they are inclined to exaggerate the benefits gained from group membership. Given what they have been through, they are in urgent need of some justification for their behavior. Who wants to admit having just made a prize fool of oneself? Counseling individuals in front of large crowds at workshops, while encouraging the strong display (or dramatization) of extreme emotion, unleashes precisely this dynamic within RC.'"

Whenever a particular word usage troubles me, I turn to my trusty OED, not for the absolute meaning, but for the various connotations of a word over time. The OED is intended to be a reference collated on historical principles, reflecting what people have said and meant by words rather than a prescriptive dictionary, which tries to tell everyone what they can or cannot say. The editors and contributors have done and are doing a remarkable job of this since the whole project was initiated in 1857. "Privilege" is an old and fairly common word, entering english in the "middle english" period, which is usually defined as roughly 1100 - 1500. The first definitions in my OED focus on the abstract noun, explaining that a privilege is "a special right, advantage, or immunity granted or available only to a particular person or group of people; something regarded as a rare opportunity." Yet I wonder if the demand to "check your privilege" actually refers more to the verb, "to be exempted from a liability or obligation to which others are subject." By extension, "to privilege" someone in the active sense is to grant such an exemption to someone or some group of people. In fact, it seems to me that this is exactly the reference we should be starting from. With these passive and active verbal senses in mind, finally, I get it.

The original point of privilege checking draws from a Feminist, intersectional perspective on privileges, which are given for "good behaviour" to persons or groups who are caught in interlocking systems of systemic oppression. The exemptions are not intended for their benefit. They are for the benefit of the oppressors, who hope to use the exemptions to co-opt the oppressed and divide them from each other by persuading them to compete for them. In other words, these "privileges" take the pressure of resistance off of oppressive structures and the groups of people who control, enforce, and benefit by them. The "privilege" is not so much the thing the oppressed person gets as it is the act of granting, which is used either to win their conscious acquiescence to the oppressive structure, because it isn't so bad for them when they go along, or to claim that those who are unaware they have been granted a privilege – and therefore are not experiencing respect for their rights – are in fact acquiescing. Passive, unwitting acquiescence is very convenient for oppressors, who love to yank that out and say, "Look, you're a hypocrite." At times, especially when feelings of frustration and anger are at their highest, it can be all too easy to slip into doing the same thing among the highly diverse communities of Feminist, anti-racist, and anti-colonial activists.

Yet I don't think this was ever the point of the Feminists calling for intersectional analysis and insisting that Feminist groups and movements must grapple with their own unconscious racism, colonialism, and so on, bringing these to consciousness and working to neutralize and remove them. The point was never just to shut people up or drive them away, but to help overcome dangerous, divisive weaknesses in the theory and practice of Feminists individually and in group work. Based on publication dates of the various articles, books, and talks on this, the peak time of wrestling with these ideas in a relatively constructive way spanned the late 1970s into the early 1990s. This was a time when resistance to an "I found Jesus" sort of approach to radical analysis looked like it might get the upper hand. By this I mean the sort of intolerant, "I have found the one and only, perpetual, unchangeable answer" attitude that is encouraged in converts to forms of evangelical protestantism and some forms of catholicism. Many Feminists were starting to appreciate that they could not, and should not expect one version of Feminist theory or action to fit everywhere and forever, because systems of oppression are multiple, intersecting, and will be reshaped in hopes of neutralizing resistance. All that quite apart from the fact that humans are imperfect by nature, and the need to change and improve in no way invalidates activist work and commitment. Genuine hypocrisy does invalidate it: refusing to do the needed work once you are aware something is awry. What invalidates the work is not needing to make changes, but refusing to make them.

The deployment of "privilege checking" as a silencing tactic makes use of a deliberately propagated category error. The error is defining the exemption as the privilege, effectively agreeing that certain people are not truly full human beings, and therefore they do not have full human rights. The aspects of culture or social status that differentiate an oppressed person or group from others are not "privileges." They are not "privileged" by their difference, they are not allowed to be different as a privilege. They are different by necessity, in resistance to systems of oppression, and yes, in the sheer joy of human creativity. It is not a "privilege" to be subjected to demands to wear crippling shoes and clothing as women generally are in north american societies. If it were, women generally would not be allowed to wear those things, and only a few would be permitted to wear them. Those considered middle class do have a level of economic privilege: they are permitted to maintain an economic foothold in return for their active support for capitalist, racist patriarchy. It takes time and effort to realize, if you're middle class, that this is a privilege, not merely an outcome of your excellent character. One of the telltale markers of economic privilege in that case is the fact that the middle class is expendable as soon as the capitalist economy goes into crisis.

"Privilege checking" can be an incredibly powerful thing to do, and I think it is fair to say that it is hard to do well and effectively. Nobody gets a pass from it, if they're serious about resisting and ending all forms of oppression. But it is nothing but gaslighting and supporting oppression to abuse this powerful technique by using it as a figurative stick to beat and silence people with. (Top)

Attention Debt (2018-03-05)

A quick grab from classic Doctor Who episode 'Snakedance,' january 1983.

Based on what I have seen, read, and heard both online and off, there is a broad consensus in canada, if not north america, that everyone has shorter attention spans these days. It is a remarkable consensus that I find mystifying. The same people who tell me this in person, including people with a range of education levels and political interests, have typically done so in contexts where their own attention span has been anything but short. They were taking part in an extended conversation with me for example – I highly doubt it was my own self that inspired their fixation – or were taking a break from an intensive session of texting or tweeting about an issue between friends. Or, once I stumbled on the topic, they were telling me in loving detail about a television programme or computer game that they have been engaged with on a regular basis for days or weeks, explaining intricacies and details or commenting on the quality of the writing. Even those least inclined to dig into the details have a lot to say about the differences, not all positive, between old and new programmes and movies. Moving away from pop culture, I can't think of many of them who don't have a more or less consistent hobby or sport they attend to, although the hobby may be carefully camouflaged as home renovations or the like. These folks are from a wide range of ages, so where is this shortened attention span?

Is it possible that we have here a widely held idea that is based on a flawed measurement? In the 1980s the consensus crisis of learning and knowledge was that literacy was falling off a cliff, why, even elders were reading less and less. Except it turned out that the basis for these claims was surveys demonstrating what turned out to be the early stages in the collapse of mass market newspapers and magazines. There is far more to read out there than newspapers and magazines, and that is just the least of the reasons those publications are generally in trouble to this day. Perhaps something similar is happening presently, when the measure of "attention span" is apparently whether school children can be persuaded to read the terrible textbooks they are subjected to in school for more than ten minutes at a time, or whether people browsing websites online spend more than a minute on most of them. Both are ridiculous measures of no more and no less than how awful school textbooks often are, alas, and how awful most websites are, or else how many websites are designed as the semi-equivalent of reference books and headline tickers. In other words, media that fail in their purpose or are not designed for extended attention in the first place.

In one of her later essays, Jane Jacobs mused that maybe so many children are hyperactive today not because they are literally sick or learning disabled, but because they are understimulated. If anyone has read Ken Robinson's books or listened to his ted or rsa talks, this may sound a bit familiar. He argues that many kids today have perpetually beeping and whirring phones, immersive video games, and movies and television shows cut and scored to maximize excitement, so school is horribly boring in comparison. Jacobs was arguing from a different cause, pointing to how children have been pressured out of playing outside in non-structured environments in unstructured time. Instead parents are urged to effectively give each kid a daytimer and book them solid, and can risk severe censure if they allow their children to play outside or go for a walk. As a result, those kids have fewer opportunities to learn self-reliance, play imaginatively, and do neat physical stuff like climb trees and generally horse around. Thinking over both these arguments, I don't think this trouble is exclusive to children. Just do an online search for examples of office environments, even the "campuses" google and apple have built. The monotony and sense of ever present surveillance and control is uncanny. That can go double for what is rapidly becoming the most common workplace type, the franchise restaurant or store.

So is it really that everyone's attention span is shorter, or is it that their attention span for certain types of media or jobs is found wanting by particular employers and representatives of corporations? There is also a remarkable consensus that in the world right now more choices of entertainment, work, or life paths are available than ever before. Questionable as this is on a practical basis beyond entertainment, it is hardly surprising that there are still enough options available that if a person finds a given book or game insulting, boring, or just plain lousy, they don't spend more attention on it than identifying it as not worth the time requires. Or it could be that right now, the hegemonic styles of books, movies, television shows, and news in english are effectively designed in snippets and dizzying jump cuts because that supposedly will make them "pop" and grab the ever elusive attention of the audience, whose members must be wondering how the hell they are supposed to successfully pay attention when every other minute there is a jump cut. Aliette de Bodard wrote an intriguing article touching on this point for the science fiction and fantasy novelists blog in 2010, Narrative, Resonance and Genre. It is well worth a stretch of your attention. (Top)

The New Medievalism (2018-02-26)

Dentistry really hasn't changed that much, except for the analgesics. Public domain image from Omnum Bonum by James Le Palmer held in the british library and reproduced here via wikimedia commons, 2018.

The "middle ages" or "medieval period" is a loosely defined era of european history, with a consensus average dating of 500 - 1500. Its beginning for a given part of europe is defined by when the roman empire finally officially fell there, meaning that roman military and political control were both gone. Into the breach stepped primarily gangs of mercenaries and robber barons, who included a significant number of high officials in the catholic church. Depending where and when a person lived in this era, their sex, their class, and their access to learning, their experiences could be very different indeed. Unfortunately, as the prevalence of roving bands of armed men suggests, a lot of those experiences were far from pleasant. It is during the middle ages after all that those bands of goons invented and imposed feudalism, and catholic officials hyped up and helped drive the hideous crusades that devastated the muslim world where art, technology, and social orders were generally more sophisticated and fundamentally more hopeful than in europe. Bear in mind that "the muslim world" was then just as it is now, an incredibly diverse region.

I'm not saying there was nothing good happening in europe in the middle ages. There is no argument here with claims that despite significant obstacles, great art and even some philosophy came out of the middle ages in europe. Europeans got very good at appropriating technology from every place their warbands and merchants could reach, since so much was lost between feudalism, warfare, and epidemics. That's not saying much though, when three of the major products of the age were the impossible-to-reintegrate men who were channeled from warfare into "exploration" voyages, including the ragbag of malcontents in columbus' crews, the systematic war on women often referred to in insulting terms as "witch hysteria," and of course, the precursors of the modern nation-state.

Yet another product, not of the middle ages themselves but of later fascination with them, is "medievalism," a rather elastic term that I have found applied to studying the middle ages on an academic basis, revivals of supposedly "medieval" culture and mores, and a belief in, and romanticized images of, that very time. Scholars of the period are not necessarily egregious romanticizers of it at all, so it strikes me as a bit unfair for these three things to be placed together. On the other hand, such prominent medieval revivalists as J.R.R. Tolkien and C.S. Lewis, or on the less officially academic side Eric Gill and William Morris, make it hard to argue that this unfairness is any more than something that "seems."

The revivalist and romanticized ideas about medieval europe are not too far apart. Somehow everything is clean, nobody gets sick, everyone knows their place, especially the lower classes and the women. The clothes are great and mark everyone unambiguously by sex and class. Warfare is a purely aristocratic pursuit in which "king Arthur" and his knights run around being very christian and very chivalrous besides. That chivalry had nothing to do with actual respect for women or anyone else considered weak is another detail glossed over in a hurry. "You go first," meant, "I think there's an ambush ahead. Go on, you're expendable." Technology is not too high, not too complicated. Tolkien's description of his fictional hobbits disliking anything more complicated than a hand bellows is a good example of the "ideal" technological level. Anything more clever is of course the product of nasty easterners. Oh, wait, I forgot the eternal whiteness and purity of everyone in this idealized image, except of course the obligatory baddies when and if they show up. They are always non-white, and often conspicuously non-christian. Every land is ruled over by a king and his collection of mercenaries, and these are hardly constitutional monarchies, because the kings are officially chosen by "god" via such useful devices as tournaments, swords in stones, but most important of all, birth into the right family and with a penis.

Today medievalism is less overt, though just as unsophisticated in its bemoaning of a lost, supposedly perfectly controlled society of racist patriarchal hierarchy upheld by church, military, and monarchies. The serial numbers in the form of medieval era clothing and technology have been dropped for the most part, but the glorification of the rest of the stuff is still there. "Lower technology" and handiwork have been redefined as a wonderful hobby for the few that helps set them apart from the many who have to do all sorts of work for daily survival. This corrosive medievalism is the dark side of the efflorescence of clubs like the society for creative anachronism* and the mainstreaming of fandom. (Top)

* Full disclosure, I think the society for creative anachronism is neat in many ways, but I can't understand the insistence that branches that deal in overtly punked imaginings of the past like steampunk are verboten.

Adults Can Do Better Than a 5 Year Old's Version of "Fair" (2018-02-19)

Seeking pie chart, kept getting spanish language diagrams of the bones of the foot (el pie). Image courtesy of Saber es práctico who got it from wikimedia commons, may 2009.

In the course of listening to the latest interview at Women's Liberation Radio News with educator Anya Robyale about whether to integrate Feminism into early childhood education and if so how, I was struck by Robyale's comment on the latter question about young children's sense of what is fair. She noted that the five year old's response of "it's not fair" is a response to a simplified version of fairness, which is of course to be expected. Five year olds have enough to deal with figuring out how to read, tie their shoes, and learn how to behave in public without tangling with the deeper questions about fairness that adults do. Yet it seems like right now, there are all too many ill-intentioned participants in discussions about adult level fairness issues who are working mightily to deflect the conversation and subsequent actions, if any, onto the five year old version of fair.

When a person insists that all that is required to take care of systematic oppression is to make it fair by providing the same opportunities of access to everyone, unfortunately this is generally just such a dishonest appeal to a child's sense of fairness. Children don't know or understand that even if there are no obvious barriers to, say, becoming a carpenter – no explicit rule against some group of people apprenticing in it – so that it looks "fair," things can be emphatically unfair. If sexual harassment is deployed by men to keep women out of carpentry, it doesn't matter that there is no explicit, formal rule keeping women out. The effect is created by the behaviour the men are allowed to use, which makes the learning and work environment so impossible that few to no women become carpenters. The principle is the same on all manner of other bases. Consider that until the late nineteenth century, being a catholic could be a legalized reason to prevent a man (women were already blocked by other social and literal legal rules) from entering certain professions, and for some time after that legal rationalization was removed, social prejudice against catholics maintained barriers against catholic participation.

The trouble, as always, is the systemic barriers that realize oppression and unfairness without any explicit rule or conscious participation being necessary. These barriers are conveniently invisible unless made visible, and then often not taken seriously unless and until somehow they go awry and affect people who thought they had no limits on their choices. There aren't too many of those. And since there is a pervasive mainstream narrative that poverty is actually self-inflicted because the poor are supposedly lazy and incapable of planning for the future, the biggest barrier to better life chances and contributing to family and society is dismissed as "not systemic" and "their fault, not ours."

Adults can easily do better than this, and at different times in history certainly have. The publicly funded social programming, legal, and social changes that are still under constant attack by the remnants of extremists claiming to be "social conservatives" include many good examples. Not everything worked, or works. Some things were loose bandages that covered over a systemic issue or created a temporary safety valve to blow off the pressure, and so failed as they were fundamentally intended to do. Yet there are also many, many examples of successful and feasible approaches. And of course, it doesn't hurt to firmly remind adults that they're acting like five year olds when they loudly bleat that a program intended to counteract systemic oppression is supposedly "unfair" to them. We don't take five year olds too seriously when they behave like this, and we certainly shouldn't take adults seriously when they do either. I have observed that the loudest complainers are the very same people who have multiple opportunities in other places, or are in fact unqualified for whatever job or program they insist they are being kept out of for no good reason.

These protesters give away a very specific belief they hold, which is that anything they can label an "affirmative action program," a phrase now so pejoratively connoted it is effectively unusable in everyday contexts, must inevitably be used to allow unqualified people to do whatever the program or rule applies to. It doesn't matter what the reality of the programs and rules is, or the competencies of the people angrily declaiming about unfairness. They believe that "affirmative action" is out there to usher unqualified people into the jobs they themselves aspire to, regardless of anyone's actual qualifications. But the other adults in the room should be more than able to recognize such attempts to channel a five year old and do better. (Top)

The Most Dangerous People (2018-02-12)

Biohazard sign graphic contributed to wikimedia commons by Offiikart under Creative Commons CC0 1.0 Universal Public Domain Dedication, 2011.

I knew what was coming. I edged around the various newspapers and news sites. Studiously averted my eyes when the headlines showed up and I couldn't avoid them. Took a deep breath. Tried to look at other things. Kept reminding myself of other stories, other happenings. A great feel-good one from out in Stó:Lo territory. Grinding my teeth and avoiding obnoxious olympic coverage with its pukeworthy "feel good" commercials about how canadians value "our land."

Because I didn't want one more piece of proof, one more piece of evidence that the settler state of canada considers Indigenous people little more than dangerous animals who have to be put down as fast as possible. Little better than the wolves that get slaughtered as soon as they are reintroduced to their territories – sound familiar? Little better than the bears that half-starved and hemmed in more and more by humans resort to raiding garbage cans and kitchens, only to be killed, or if they are supposedly lucky, drugged and dumped off somewhere far outside of their own territories – sound familiar?

I didn't need to hear about how a carefully culled white jury decided that a sleeping Indigenous man is so dangerous that an old white man can walk up quietly and execute him with a shot through the back of his head because he felt "threatened." Indigenous man, asleep, supposedly drunk or smelling like it I guess. That's enough to feel threatened when you're a white man. So threatened as to be sure that if you don't kill first, that guy will get you.

Everybody knows how threatening Natives are. I read Ryan McMahon's latest article on vice.com. Thought about how much sounded familiar. Diving out of the way of cars driven by white people, white people driving fast, ignoring the crosswalk, lurching towards me across the median. The constant hovering presence of clerks in every store, always pushing, pushing, pushing, to get me back out the door. Don't drink, don't smoke, don't smell like pot, don't smell like alcohol, especially if you're female. Sound familiar?

Everybody knows how threatening Indigenous women and girls are! Read Robert Jago on MediaIndigena for an example, right from the Stanley courtroom. After all, why else would the numbers of murdered and missing Indigenous women and girls be ticking up every day? Murders uninvestigated, trafficking winked at, survivors of rape informed in the courtroom that they only had themselves to blame since they were drunk/intoxicated/improperly dressed/around men. So threatening. So threatening that no violence is too much, no violence against them counts. Sound familiar?

There was a lady everybody called Old Mary in a town I grew up in. She was feisty and dirt poor. The cops made a habit of arresting her for vagrancy, public drunkenness, the usual. Then one year, she died. Somehow she managed to freeze to death, after surviving the bitter winters just fine for years. For decades. The word went around. Starlight tour. Sound familiar? But this is the era of reconciliation, reconciliation. Sound familiar?

Sophie McCall wrote in 2011, "While reconciliation prioritizes the expiation of the colonizer's sense of guilt, it places the onus upon the colonized to end longstanding conflicts." So the onus on Indigenous people is, don't be threatening. But we're threatening when we're awake. We're threatening when we're asleep. We're threatening when we're sober. We're threatening when we're drunk. We're threatening when we're female. We're threatening when we're male. We're threatening when we're old. We're threatening when we're young. We're threatening when we're working. We're threatening when we're unemployed. We're threatening when we're peaceful. We're threatening when we're fighting back. We're threatening when we're alive.

Try reading it again. Sophie McCall wrote in 2011, "While reconciliation prioritizes the expiation of the colonizer's sense of guilt, it places the onus upon the colonized to end longstanding conflicts." When the topic of "Aboriginal people and the justice system" comes up, the discussion is always about how to make the structures that railroad Indigenous people into jail on the flimsiest pretences and the jails themselves "more culturally sensitive." Sound familiar? Ever heard anybody talk about how the justice system needs to be repaired so that Indigenous people actually get justice from it? Bet that doesn't sound familiar.

Read it again. Sophie McCall wrote in 2011, "While reconciliation prioritizes the expiation of the colonizer's sense of guilt, it places the onus upon the colonized to end longstanding conflicts."

Now tell me. Tell me why I should believe, tell me why any Indigenous person should believe, that the onus being placed on us to end longstanding conflicts is not for us to hurry up and die. (Top)

Feminism as Ecosystem (2018-02-02)

For some reason, most pictures used to illustrate ecosystem articles online use images from coral reefs and similar places. This image courtesy of Richard Ling via wikimedia commons under Creative Commons Attribution-Share Alike 3.0 Unported license, august 2004.

It doesn't take much reading about Feminism to notice that a very few descriptions are applied to "it" in defiance even of the evidence associated with those descriptions in whatever article, book, or other item may be at hand. The big one is the insistence that Feminist activism can be characterized as a series of waves, of which somehow there have only been three at most, and these have happened only where there is or has been what can be labelled a "western" society. The absurdity of "western" as a descriptor only gets worse the more anyone reads of history, because somehow no matter how far east the people labelled "westerners" are actually from, wherever they get to is never "west" until they get there. Referring to "western" Feminism tends to make this strangeness especially obvious. A close follower in terms of size to the implied "western" wave model is the current ever louder insistence that Feminism, any time it seriously challenges established notions of power, femininity, and masculinity, is suddenly "white." This is one of the very few contexts in which "white" stands as a slur, and that is truly remarkable. Its conjoined twin is "middle class," which is also rendered into a homogenizing slur in this context, one that refers only to women. After that, the next description usually trotted out is the "obsolescence model," in which the fact that at least a few women can be pointed at in occupations once deemed exclusively for men, and that women generally can vote and officially own property, means that Feminism is completely unnecessary now.

Taking these in inverse order, obviously the last one is wrong, period, not least because Feminism was and is never just about making sure women somehow achieve an ersatz version of the perfect liberal individual, characterized by the franchise, property owning, and working for wages. That's easy to verify by reading beyond the Feminist activists and theorists featured in mainstream textbooks, many of whom wrote in far more accessible venues including popular magazines, newspapers, and trade press books. The second one is also wrong, though it may be harder to see what is wrong with it at first glance for those who believe fiercely in "queer theory" and its close cousins, "post-modernism (tm)" and "kink." Somehow calling down Feminism as "white and middle class" is always okay, even though when women who happen to be white and middle class don't take advantage of their privilege to oppose the oppression of women generally, they catch hell. If they do take advantage of it they catch hell, because it seems the default assumption is that they are only out to make things better for themselves. If they do avoid that accusation, it doesn't take long before they get accused of taking up a sort of "white woman's burden" instead, whether or not that is the disrespectful though not necessarily ill-meaning approach they have taken. Yet once again, it doesn't take much work to find out that Feminism has never been the exclusive concern or work of white, middle class women. That bit of extra work soon reveals that middle class women are not a monolithic group and were far from all taking up "white woman's burden," and it helps a lot to check out what they actually said and did, rather than the images of them propagated in the mainstream media. The uncritical acceptance of media images of women by so many who are keen critics of everything else is a puzzle all its own.

The big, overarching description has two parts, so let's start with the second bit, the claim that Feminism is somehow only a "western" thing. The only way to argue this is true is to start by strictly limiting what "Feminism" means, so that it only refers to what "western" women do in the "western" world during its history, such as it is. But in that case, we can't have it both ways. We can't both insist that "Feminism" can only refer to a particular type of european and north american political women's movement, and then insist that any time we find evidence of political women's movements that are indeed focussed on women's rights and needs, those can't possibly count as Feminism. Being a historian in training, I think it is fair to take into account that "Feminism" is a new word, but if we define it as I have here, then it is a phenomenon that women can enact in many times and places, just like the generally accepted terms "democracy" and "monarchy." It isn't hard to find diverse examples and manifestations of women opposing structures and practices that oppress them, though it would be as wrong to insist that these are all identical as insisting all democracies and monarchies are the same.

The last description, Feminism as a series of "waves," is, practically speaking, absurd. No other political movement is referred to in this manner. Nobody refers to "waves" of democracy or fascism. Democracy is an especially good example, because it is now begrudgingly acknowledged that "democracy" is not a finished state but a goal towards which people strive, one that once acquired must be enacted and re-enacted constantly in order to defend it against the various people who hope to reinstate some version of their notions of the middle ages, in which they will of course be nobles and in charge. Feminism also cannot be simply equated with achieving one or two things and then leaving off the work of overcoming and preventing oppression of women. The repeated mainstream media claims that Feminism happens in waves attempt to persuade women that somehow their oppression is different though, and can be simply vitiated for good by a few tweaks that don't change much else in the overall structures of society. I admit it would be wonderful and much less work if it were true, but it isn't. To describe Feminism as made up of "waves" effectively denies the persistence of women's resistance and the ongoing thread of women's activism over time and space. Different specific Feminist campaigns have caught particular mainstream media attention in different decades, and those campaigns have been described in hyper-simplified and homogenized terms that create the impression of a wave.

Yet, as Audre Lorde pointed out, "There is no such thing as a single-issue struggle because we do not live single-issue lives," and Feminism is far from a single-issue body of theory and practice. So my suggestion is that instead of reproducing or accepting the misleading descriptions so often handed to us, and as a start on perceiving and enacting Feminism as the intersectional body of theory and practice it is, we try characterizing it as not just a movement but also as an ecosystem. As applied in biology, an ecosystem is a complex network of interconnected and interrelated organisms. There is no such thing as a static ecosystem, nor is there ever a single, homogeneous ecosystem, which would be a contradiction in terms. Ecosystems vary with time and place, sometimes things go terribly awry, impressively often they go well, and generally they keep chugging through significant changes. This strikes me as an excellent metaphor for Feminism in terms of its history, theory, and practice. It allows us to acknowledge and accept that not everything goes right all the time or for every woman, frustrating and disappointing as that is. Even more importantly, it reiterates the possibility and potential for change and improvement, and also the necessity for change and improvement. An ecosystem that does not change is not an ecosystem. It is simply dead. By the same token, an ecosystem that lacks diversity is not an ecosystem either. Yet any ecosystem starts out from a keystone organism, and builds out from a smaller number of organisms, the ones that find an opportunity, and are able to take advantage of it.

The ecosystem metaphor for Feminism is no good for clickbait headlines and simplistic storytelling, but it is well worth thinking with. (Top)

The Future of Housework (2018-01-25)

A galaxy class replicator prop image courtesy of Memory Alpha, july 2012.

That is, the future of housework as depicted in most science fiction that I'm familiar with. This isn't a subject that I generally spend much thought on, since housework is an unavoidable fact of life, although there are certainly people in denial about it. But then it occurred to me that it is precisely the things we hardly think about that could, or even should be given some real consideration in speculative fiction, especially science fiction. On one hand, yes, science fiction isn't really about the future. On the other, that doesn't change the fact that writers and other artists engaged in working on science fiction works are trying to imagine a different sort of now, and the stuff we forget to think about is what will give away the limits of our imagination, at least for that work at that time. Housework is definitely a major source of limits. If you don't believe me, take a moment to dig around the index pages of any science fiction fandom you like, and try to find some terms for items that are used to support day to day life. Actual day to day life, so not weapons, not military uniforms, not exotic life support apparatus, nor wildly imaginative medical apparatus to insta-fix every horrific injury. In other words, the stuff that reminds us inevitably of the fact that humans have bodies, and are bodies, as opposed to various sorts of items that either avoid dealing with that at all, or treat humans as perfect mind-body dualistic units, allowing bodies to be treated as if they were machines and repaired accordingly. This is tricky stuff: the habit of avoiding the body and therefore the fullness of being human goes way back, and isn't just a weird product of supposed victorian prudery.

Let's start with one of the most elaborated pop culture science fiction universes around, the ever more incoherent Star Trek universe. On star ships at least, there are replicators to take care of such tasks as cooking and the dishes, and apparently even the laundry right down to ironing and folding. The weird mania for costuming that pretends there are no clothes fastenings and that everyone will happily live in perma-wedgie jumpsuits their whole lives strikes me as quite strange. Maybe that sort of pattern is supposed to be easier for the replicators. The tie-in novels occasionally mention things that clean floors and furniture, but it seems that there is no longer any dust or anything of that sort that requires sweeping. How this squares with the reality of skin, I do not know. On the other hand, as soon as people go planet side, it's back to ordinary house work and the reappearance of servants. Funny that. If we look at Doctor Who, which finally gave up any pretence to coherence at least forty years ago, in the various imagined futures there are almost no homes. Lots of military bases, space ships, compounds, and corporate headquarters. No places where people live who aren't in some kind of uniform, unless they are in a "primitive" society. I haven't seen the whole of Babylon 5's run, but what I saw was pretty consistent with these two, while Star Wars just inserts the 1950s plus vaguely unfamiliar looking clothes the few times "home life" is unavoidable.

These are perhaps unfair examples. Novels are generally much better, because of course they have much more virtual space and time to world build in. Of my more recent reads, I have been quite impressed with Ann Leckie's universe in the Ancillary series. My range of steampunk anthologies and weird fiction selections tends to stick to pseudo-victorian depictions for the most part, which shouldn't be surprising but strikes me as quite strange. But I suppose the surreal effect of a main character who is a brain in a tank being served tea and crumpets by a perfectly coifed and uniformed house maid is hard to resist. Most of the other novels I have read more recently could certainly be considered science fiction, but are often set in the very near future or a very near future alternate universe, so there is not much difference between them and the present. Where they do differ, they have a reason that works in their fictional worlds, and the authors aren't just cribbing from foreign places for effect, which is an unfortunate habit mainstream authors can fall into. Among those are books like The Girl in the Road by Monica Byrne and The Golem and the Jinni by Helene Wecker.

It seems there is a general consensus in the published and popular science fiction imagination that in the future there will be no housework. This is kind of a neat idea, I admit. I for one am all for a self-cleaning bathroom, or at least self-cleaning toilets, immediately or yesterday if not sooner. Ditto self-cleaning floors. An easy no muss, no fuss way to deal with dirty dishes and dirty clothes would definitely be worth having. Based on the foulness of processed food though, I think getting rid of cooking is an idea dead on arrival. Perhaps that merely reflects a significant limit on my imagination, but I find it hard to let go of a notion that I have seen labelled with the word "foisson." I stumbled on this now rare and little-used word in The Red-Haired Girl From the Bog by Patricia Monaghan. It refers to the special energy of a substance that renders it live and nourishing. So butter that has lost its foisson provides no nourishment, even if it otherwise seems fine. My meaning here is not that I think all food and drink literally has a magical essence that makes it nourishing. Far from it. We have a reasonably good idea what makes food nourishing: those obsessively measured and counted vitamins and minerals, not to mention proteins and carbohydrates. The trouble is that major food processing removes everything but the carbohydrates and may also replace food with substitutes that seem like food but aren't. Soy is a big pseudo-food substitute, along with a whole range of petro-chemicals. Even without processing and inappropriate replacements, versions of the good stuff produced by chemical processes in factories and laboratories, such as vitamins and various chemicals meant to provide needed minerals, have their own issues. They aren't bad, but it turns out surprisingly often that the apparently easiest to artificially produce version is also the one that humans have the most trouble absorbing. So all together, it seems to me that self cleaning toilets and floors are by far the easier problems for inventors to solve.

Mind you, perhaps I have this all the wrong way around. Maybe the lack of consideration of how housework might be different, or what parts of it might still be present in the future, doesn't reveal a failure of imagination. Perhaps it simply reflects a popular consensus that nobody is going to spend precious inventing time and brilliance on such stuff, because it just isn't that important. Until the toilet backs up, I suppose. (Top)

The Tribe Called Wannabe (2018-01-17)

Current flag of the settler province of québec, february 2018.

Credit for the title of this thoughtpiece goes to Rayna Green, a brilliant Indigenous scholar whose discussion and dissection of people who think they are white seeking, even demanding, to define and "play Indian," "The Tribe Called Wannabe: Playing Indian in America and Europe," was published in 1988 in the journal Folklore. It's a brilliant and uncomfortable read in the best way, and one that is worth following up with Eve Tuck and K. Wayne Yang's analysis and challenge to the various "settler moves to innocence," including trying to treat decolonization as merely a metaphor. It took me a long time to finally test some of the more troubling modes of québec politics and pseudo-argument against these notions, because despite the fact that québec is as much a settler construction as the whole of canada is, québeckers historically have had to deal with linguistic and economic oppression. For better or worse, after having learned a great deal more about québec history and the current trend of people who actually think they are white trying to pretend that they are actually from my nation, insisting on a racist and racializing definition of "métis" intended to undermine and destroy Indigenous rights and existence, I found myself unable to continue fence sitting on aspects of the issue.

Despite the work and documented research by Métis (Chelsea Vowel, Adam Gaudry, Chris Andersen) and even québec (Darryl Leroux) scholars making it ever harder to just brush off the most recent efforts by some québeckers to recreate themselves as Indigenous – because that will somehow legitimize their own inherited and present-day colonialism and consign Indigenous nations to the dead past – well, I was still shrugging it off. But as it turned out, I had one of those figurative camels and its back was in big trouble. The straw that finally did the poor fellow in was a section from a canadian history textbook, in which the author referred to "québecois" in the eighteenth and nineteenth centuries. Except, the term was not coined until the 1960s, when "québecker" was given a new ending analogous to that of "Iroquois" – those ever reviled whenever they turn out to still exist today, and ever revered when québeckers can refer to them in the past tense. This timing makes sense, first because of the famous (at least in canada) "quiet revolution" in québec, during which the catholic church's political power was curtailed in the province and notions of cultural identity in québec changed significantly. It was also then that the full consequences of giving up on a significant "french fact" outside of québec became clear, re-emphasized by Pierre Trudeau's argument for what he called "multiculturalism." In western canada, the beleaguered but persistent french-speaking population still includes plenty of people who feel québec abandoned them as thoroughly as france abandoned québec, according to many québeckers.

At which point I could no longer deny that "québecois" is part of an attempted settler move to innocence, and an early, though less sophisticated, attempt to appropriate Indigeneity from actual Indigenous people. I happen to agree with québeckers that they are a distinct people, both from the various english-speaking polities within canada and from the various french-speaking polities of france. They have certainly undergone a process of ethnogenesis here. However, they are anything but Indigenous. To actually be Indigenous, it is critical to have an ongoing relationship with the land, which means interaction and care, not individual ownership and ruthless exploitation. It is just as critical to be claimed by an Indigenous community: as Andersen has noted, it isn't about who you claim when you're Indigenous, it is about who claims you. There are certain points of convergence that are easy to point to and attempt to use as a claim that after all, québeckers aren't so different from Métis. Catholicism, french language, ancient connections to the fur trade. A convenient Indigenous female ancestor 6 or 7 generations back. Right? Wrong. Incredibly wrong. And we have been encouraged to be that wrong by colonial governments at all levels in canada, who would have loved the unfortunate débacle of "small m" and "big m" Métis to get firmly entrenched.

So just to be clear. While Métis may be at least nominally catholic and speak french, being "mixed blood" is not what makes them Métis. In fact, I can vouch for the fact that we have other terms for ourselves that we prefer to use that often get ignored by settlers. By Métis, I am not referring to the attempted racialization of the term. I am referring to the Indigenous people whose ethnogenesis occurred after europeans began showing up here. We have our own language, which is Michif, not french, and customarily we would have known at least Plains Cree and/or Saulteaux as well. We had already, in the early 1800s, communities whose members understood themselves to be different from their Indigenous and non-Indigenous neighbours, were self-governing, and had developed such recognizable national accoutrements as a flag, anthem, and specific lands with which we had and have relationships. Those lands do not include any part of québec. Or the maritimes. Or most of ontario. Or newfoundland or labrador.

In other words, we're distinct from québec and québeckers. Nothing is going to change that, not pseudo-history, or a pious repeat of how québec was against the judicial murder of Louis Riel by the canadian settler state. Yes, québec was opposed to that, but because he was perceived as a french catholic, not because of anything at all to do with the fact that he was Métis.

More and more settlers are seeking a different sort of "wannabe" status – they wannabe from the lands they live on now, not from somewhere else, and they want that to be in terms of just and healthy relationships with Indigenous nations. A good starting point would be for them to firmly oppose the actions and claims of the tribe called wannabe. (Top)

Unsettled (2018-01-20)

Seismic diagram of the vancouver island region courtesy of the Mid Island News Blog, september 2011.

Towards the end of last year and as 2018 began, yet another outburst of pseudo-scientific attempts to prove once and for all that Indigenous people in the americas are not Indigenous made the rounds of news outlets and television stations. The usual old canards got trotted out, from the so-called "Bering Strait theory," in fact a poorly supported hypothesis that can be traced at least as far back as writings by Thomas Jefferson (Elaine Dewar's 2001 book Bones is well worth a read on this one), to the "Solutrean hypothesis" (Stephanie Halmhofer's recent post on her blog Bones, Stones, and Books is a good read on this canard), which is newer and less popular but has become a special darling of white supremacist groups. The desperate settler desire to relabel Indigenous people "immigrants" is far from new, and was enshrined at one point in the categorization of Indigenous people along with all other "ethnics" right into the 1960s. "Ethnic" meaning anybody who is not considered british or german. Everybody who was and is british or german is for the most part never referred to as an "immigrant," or the term is pinned to them only very briefly. They have always been "settlers" or even "citizens" once the settler state of canada was labelled on white men's maps, but everybody else is pretty much stuck with "terrorist," "alien," or "immigrant," depending on how pejorative the speaker intends to be. No way in hell will Indigenous people get called settlers in any sense of the term, and that's interesting.

A key rationale for refusing to admit that Indigenous peoples have always been in the americas, or even that they were and are settled in the americas, is the insistence that Indigenous peoples were all nomadic. Furthermore, to be nomadic in this context is to wander around at random in a daze for no reason other than not being civilized or advanced enough to stay in one place. The investment in shoring up this rationale is spectacular in its size, and europeans began making it in the context of the americas the moment word began to get around that the lost navigator Columbus hadn't managed to reach India. Never mind that in fact Indigenous peoples are among the most settled on the planet when left to live by their established laws in relationship with the land. Yes, most Indigenous cultures may seem less sedentary, especially to short term and disrespectful visitors. In fact, the supposedly aimless wandering was and is a structured cycle of moves over the land reflecting movement that is both sacred and practical. To take just one example, from what is now labelled southern alberta on non-Indigenous maps, archaeological evidence alone keeps revealing greater and greater time depth to Blackfoot relationships with and cyclic movements around their traditional lands. The Haudenosaunee practice of moving whole villages every twenty five years or so to allow the land to recover from an extensive period of human occupation and farming is not nearly as well known as it should be, especially the phenomenon of returning to a known old village site after 50 to 100 years or more.

Attempting to actually apply the term "settler" to Indigenous nations in such phrases as "they were just earlier settlers," besides implying that they must have wiped out somebody else who was already here, does something else that is quite unintended. It accidentally admits that in fact the "settlers" who came to the americas to steal the land and generally be lousy neighbours didn't, and don't, settle. The number of settler homesteads that have remained occupied by the families of the first non-Indigenous people who took them is minuscule. Small towns appear and disappear with economic cycles that follow resource booms, therefore nearly at random. Fewer and fewer people live in even the same town or city all their lives, often criss-crossing much of north america in their efforts to remain both employed and in reasonable housing. Even those who weren't driven by economic necessity rarely stayed anywhere long. The number of non-Indigenous people pursuing a cycle of moves over time – other than the extremely wealthy, who have several homes to occupy at different times in order to take advantage of the climate and the tax breaks – is also vanishingly small.

The destruction of healthy relationships with the land and with other people is one of the most vicious wages of colonialism and capitalism. It is little wonder that the non-Indigenous people who at one time would have been simply proud to be called settlers are in fact unsettled, in more ways than one. (Top)

An Obligatory Net Neutrality Piece (2018-01-13)

Snip from the most common header illustration in the electronic frontier foundation's blog articles on net neutrality, january 2018.

Good obligatory though, not bad obligatory. Net neutrality is critically important, not just in terms of not allowing "internet service providers" to throttle streaming services they don't make money from, but also in terms of getting back on track towards an internet where everyone willing to participate is able to do so fully, regardless of sex, race, or class. My purpose with this thoughtpiece is not to try to drag the conversation off onto that second goal, because I don't think these two sets of goals are in competition. They are complementary, and right now the immediate emergency is about potential abuse of the ability to block access to services and sites on the internet by slowing them down. Rather, I would like to draw some attention to a strategic screw up that almost nobody seems to be thinking about. The only person I know of who has given this thorough consideration within the past few years is Maciej Cegłowski in his 2015 talk The Web Obesity Crisis.

Again, bandwidth is important. Yet because bandwidth has increased so significantly compared to the days when I was using a 14.4 modem and my phone line to log onto the internet, much of Cegłowski's critique remains as valid as it was before. Website after website, wordpress instance after wordpress instance, uses gigantic image files and attempts to load hundreds, if not thousands, of javascripts from who the hell knows where. I hate to break it to all you folks who paid way too much money for somebody to grab all those javascripts to give your site drop down menus: you wasted every damn penny. It can all be done in effect for free with stylesheets, which also fail more gracefully if the browser can't render them properly. The number of sites where a ridiculously huge social media logo loads first at full size, then shrinks to something 30 pixels by 30 pixels, and only then lets the rest of the page load, is unbelievable. I haven't yet found a way to block those stupid images so that I can actually see the site, which is what I followed the damn link for in the first place. And I am seeing all this on sites that are not streaming videos or music, on a basic highspeed internet plan, in a country where officially net neutrality isn't dead or tangled up in numerous lawsuits.
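For the curious, here is the sort of quick and dirty page weight tally I have in mind, sketched in python using only the standard library. The URL is a placeholder, and since the sketch ignores anything pulled in by stylesheets or by the scripts themselves, real pages will weigh in even heavier than it reports.

from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class ResourceLister(HTMLParser):
    """Collect the src attributes of img and script tags."""
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        if tag in ("img", "script"):
            for name, value in attrs:
                if name == "src" and value:
                    self.resources.append(value)

def page_weight(url):
    """Return (total bytes, resource count) for a page plus its images and scripts."""
    html = urlopen(url).read()
    parser = ResourceLister()
    parser.feed(html.decode("utf-8", errors="replace"))
    total = len(html)
    for src in parser.resources:
        try:
            total += len(urlopen(urljoin(url, src)).read())
        except OSError:
            pass  # broken links and refused connections just get skipped
    return total, len(parser.resources)

if __name__ == "__main__":
    total, count = page_weight("https://example.com/")  # placeholder URL
    print(f"{count} images and scripts, {total / 1024:.0f} KiB in all")

Pointing something like this at a few favourite news sites is an instructive, if depressing, exercise.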

The majority of the conversation about net neutrality has centred on throttling, including the contributions on the subject by such big players as the electronic frontier foundation, whose net neutrality series is definitely worth the time, and the free software foundation, which provides an important summary of how lack of net neutrality also facilitates the expansion of DRM. Of course, throttling could also be used to make it that much harder to download free software, especially alternative operating systems. Right now, even a standard major system update for a mainstream operating system can take between 2 and 4 hours to download on a basic highspeed internet plan. That is just arithmetic: a, say, 6 gigabyte update is 48 000 megabits, which at 5 megabits per second takes 9 600 seconds, or roughly two hours and forty minutes, before any throttling enters the picture.

The strategic screw up that many web developers, and developers of businesses intending to somehow make money via internet distribution, keep making is failing to avoid sucking up bandwidth when they can. Avoiding it can be excruciatingly difficult if you are a blogger who does not self-host, for example, and this criticism is not intended for people using those services. There the fault lies with the provider, not the people who have opted to use it, many of whom may be completely unaware of bandwidth as a potential issue. What is happening online looks like the same phenomenon as the introduction of more and more efficient alternatives to incandescent lightbulbs. The alternatives are more energy efficient, so people seem inclined to use more and more of them, ending up using the same amount of electricity (or more) anyway, the rebound economists refer to as the Jevons paradox. I get the urge to do this by the way, because yes, some of the new lighting set ups I have seen are pretty cool and might even be useful. Providing high fidelity photographs and sounds on a website that people can view and download is pretty amazing compared to what it was like online in the early 1990s. I'm sure there are readers who jumped for joy when they could finally email family pictures and the newest, glossiest internet memes without their email provider bouncing the lot.

Unfortunately, this also means that many of us online will be caught over a barrel if throttling becomes common, because throttling is not a fine grained tool. It doesn't have to be, and plenty of websites that don't actually need much bandwidth relative to, say, netflix, can get clobbered all the same if the throttling is turned on based on a range of ip addresses, as seems likely. If a site is not relatively lightweight by default, or has no lighter weight version, that site may as well be gone. The problem is not just the risk of throttling without net neutrality; the problem is that for the moment at least, the net pattern of web and internet development has made throttling a serious threat. Beyond internet service providers potentially abusing their positions, at this point throttling can be imposed by other parties via pressure on those providers, by other technical means, or simply by a major natural disaster.
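To make the coarseness concrete, here is a little python sketch of a throttle keyed on a whole address range rather than on individual hosts. The addresses and numbers are purely illustrative, not drawn from any real provider's configuration; the point is that a tiny text-only site sharing a block with a bandwidth-hungry neighbour draws on the same budget, and gets clobbered all the same.

import ipaddress
import time

class RangeThrottle:
    """One token bucket per address block: a bytes-per-second budget
    shared by every host in the range, heavy and light alike."""
    def __init__(self, cidr, bytes_per_second):
        self.network = ipaddress.ip_network(cidr)
        self.rate = bytes_per_second
        self.tokens = bytes_per_second
        self.last = time.monotonic()

    def covers(self, address):
        return ipaddress.ip_address(address) in self.network

    def allow(self, nbytes):
        # refill the bucket for the time elapsed, capped at one second's worth
        now = time.monotonic()
        self.tokens = min(self.rate, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= nbytes:
            self.tokens -= nbytes
            return True
        return False

# roughly 1 megabit per second shared across 256 hosts (illustrative numbers)
throttle = RangeThrottle("203.0.113.0/24", 125_000)

# a video host and a small text-only blog that happen to share the block
for host, nbytes in [("203.0.113.5", 120_000), ("203.0.113.200", 8_000)]:
    if throttle.covers(host):
        print(host, "passed" if throttle.allow(nbytes) else "throttled")

Run back to back, the little blog's 8 kilobytes get refused, because the heavy neighbour has already drained the shared bucket.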

Safeguarding net neutrality, both by regulating the companies that control internet access and infrastructure and by preventing governments from creating a non-neutral internet for their own purposes, is a key insurance policy. However, it is also necessary to decentralize internet access and internet infrastructure, and to return to treating bandwidth like the precious substance it is. (Top)

So, You'd Like to Reinstall An Old iOS App (2018-01-06)

The current iTunes logo, which frankly I despise, but haven't yet gotten around to swapping back to the less gross old blue one. Hell, I'd be happy if this one was in greyscale. This copy of it is here courtesy of wikimedia commons, october 2014.

Okay, fair enough if you don't, in which case feel free to skip this thoughtpiece completely. For the rest of you out there, I thought it was only decent and fair to write up how I managed to do this, because the application that went awry for me is one I like very much, but it has become wildly buggy on the latest version of iOS on my incredibly old but still chugging second generation iPad. Originally I was completely stymied as to how to sort out the issue, especially when a full restart, a reinstall, and a couple of gesture tricks that sometimes shake things loose all failed. (The newer iOS versions can trigger odd gesture problems in older applications – or the older applications can trigger odd gesture problems in the newer iOS.) Then my luck finally ran out in the grim world of iTunes, and my library became so utterly corrupted I had to trash the lot and reimport all my music and apps. This left my music library a complete shitshow that still hasn't finished stinking (no, I'm not bitter), but did lead to a surprisingly useful thing.

I found myself with a handful of older and newer versions of my purchased iOS apps. Unfortunately this did not include preservation of my very few in-app purchases. Word to the wise, folks: if you have an in-app purchase, the developer screws off somewhere, and you have to reinstall the app, that purchase is gone and irretrievable, so far as I can determine. This may not be a real risk for most, but, for what it's worth, there is my warning. I'm actually less irritated about the loss of a small in-app purchase than about the mess my music library is in.

Anyway, as igeeksblog.com noted, on older devices you can sign in to the iOS app store and reload older versions that can still run on an older version of iOS. This worked great when I needed to restore my even older iPod with the few apps I still use on it, and kudos to apple for making the process transparent and easy. The big problems came with the iPad, because it can officially run the latest version of the application in question, so the app store will not offer any older version, even as a "Well, we don't agree with you at all, and it's on your own head if it all crashes. Don't call apple support, they will laugh at you. You suck. Fine. Here it is." sort of option. That would be a very rude option, but still, as a last resort, bearable. Folks like me, whose cell phone service provider's system sign-off sounds utterly disgusted with them for ever checking their voicemail, will recognize the style.

(For those wondering why I don't just submit a bug report, the difficulty is that the software company has retired all development and support for this version of the app. Based on other tests I have worked through, the bug is on their end, not iOS's, and I don't need the mass of features in the latest version of the app, which I would have to purchase at full price.)

The folks at igeeksblog.com suggest digging around in your iTunes backups, or asking a friend if they have a copy of the app. Except that, logically, I don't think you could use a friend's copy, because it will be watermarked for their iTunes account, not yours. So it is best to stick to a copy from your own backups. Then you just have to drop it into your app library in iTunes, click "ok" when it complains that you're replacing a newer version, and drag and drop it onto the icon for the iDevice you are working with. However, it turned out in my case that the oldest version of the app I could pry out of my iTunes backups had the same problem as the most current version. I was close to giving up. But then I found myself wondering about my Time Machine backups. By some wild chance, could there be an earlier copy of the app wedged in there somewhere?

The answer is yes, but the key is to dig around starting from the top level of your home folder. If you start from the Mobile Apps folder inside your iTunes folder itself, Time Machine will insist you have no backups whatsoever beyond the current year. This is a strange and unnecessary behaviour, and may be a bug. After all, if you purchased the app there really should be no issue, and I have happily used this app for years. In any case, after a couple of reinstalls of successively older versions of the app, I finally have a working version again. All you have to do on locating an older version in Time Machine is restore it, then follow the same directions the igeeksblog.com article linked above provides.
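For anyone who would rather rummage through the backup drive directly than click through the Time Machine interface, a throwaway python script along these lines can at least list the candidate copies. The backup volume name, the folder layout, and the app name are all assumptions based on common defaults, so adjust them to match your own machine; the actual restoring is still best done through Time Machine itself.

from pathlib import Path

# assumed default locations; substitute your own backup drive and app name
BACKUPS = Path("/Volumes/Time Machine Backups/Backups.backupdb")
APP_NAME = "SomeApp"  # hypothetical name, not a real app

def find_app_versions(backups_root, app_name):
    """Yield every .ipa mentioning app_name, oldest snapshot first."""
    # assumed layout: Backups.backupdb/<machine name>/<snapshot date>/...
    for snapshot in sorted(backups_root.glob("*/*")):
        # scanning whole snapshots is slow but thorough
        for ipa in snapshot.rglob(f"*{app_name}*.ipa"):
            yield snapshot.name, ipa

if __name__ == "__main__":
    for snapshot_name, ipa in find_app_versions(BACKUPS, APP_NAME):
        print(snapshot_name, "->", ipa)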

Hopefully the likelihood of anyone else seriously having to do this is very small. However, iTunes is such an alarming and lumbering behemoth that you really can't count on that forever. The best thing to do is to back up all of your music, apps, and so forth on a separate drive, as well as taking advantage of Time Machine. I back up my music independently of Time Machine, but had not done the same with my apps, since this particular combination of issues had never occurred to me as a possibility. Alas, it is always the most creatively weird things you never thought of that cause the most headaches.
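For completeness, a minimal sketch of the independent backup I mean, again in python, assuming the default iTunes folder layout and a hypothetical external drive name; a plain drag and drop in the Finder does the same job, the point is just to keep a second copy somewhere iTunes can't reach.

import shutil
from pathlib import Path

# assumed default iTunes layout; older versions keep apps in
# ~/Music/iTunes/Mobile Applications instead
SOURCE = Path.home() / "Music/iTunes/iTunes Media/Mobile Applications"
TARGET = Path("/Volumes/SpareDrive/Mobile Applications")  # hypothetical drive

if SOURCE.exists():
    shutil.copytree(SOURCE, TARGET, dirs_exist_ok=True)
    print("copied", sum(1 for _ in TARGET.rglob("*.ipa")), "app archives")
else:
    print("no Mobile Applications folder found at", SOURCE)

(Top)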

A Funny Way to Talk About a Person (2017-12-31)

Photograph of a Hammer by Evan-Amos contributed to the public domain via wikimedia commons, august 2010.

One of the terms that aggravate me in the "technosphere," such as it is, quite apart from the execrable and insulting "meatspace" applied via William Gibson to the world we physically live in, or "meat styluses" for the extraordinary appendages at the ends of our arms, is "user." "User," as in the term applied to people who use computers, as well as to people who use drugs, whether illegal or legal, since the key point is what both are perceived to be used for: escape. The term does not have the best of connotations, not helped on the computer side of the ledger by the still too common habit among programmers of rendering it into the condescending label "luser." I program computers myself, and can't encounter the term without at minimum a sense of discomfort. The base assumption when "user" is negatively connoted in talking about computers is that the person so labelled is too stupid to cope with the machine they are using. Supposedly they do appalling things like using the original cd drive trays as cup holders, and can't figure out what the on-off switch is for. Worst of all, not so long ago, were those supposedly pathetic users who preferred a graphical interface over the command line. Yet brand new tools understandably confuse people who have not used them before, and at first nobody knows how best to use a given tool or fully appreciates the many things they could do with it. It was impossible to know ahead of time that not only are graphical and command line interfaces just different, they also best serve different purposes, tasks, and ways of working. I think it is fair to say that everyone is still catching up with the fact that general purpose computers are not singular tools but effectively tool boxes full of an embarrassment of riches.

But let's pretend that the term "user" has never been applied to a socially frowned on practice like taking psychoactive drugs in order to escape difficult life circumstances. I think the term would still be problematic. After all, nobody calls a person who uses a hammer a "user," except in the special case of "tool-user," which is quite valourized and often gets tied to the presupposition that said user must be male. It is not a sex or gender-role based difference either, because a woman or man who uses a sewing machine is also not referred to as a "user." In such cases, the people are just called people, or of course, a person in the singular. Why should someone who learns and applies the tools provided by a computer be called a "user" instead of just a person? It can't be simply that they aren't a computer programmer, or a person officially credentialed as an engineer who could both explain all the physics and chemistry of a computer and build one. The vast majority of us have no clue whatsoever how to build any machine more complex to put together than a basic electric motor or an elaborate mechanical clock, and on a day to day basis we don't need to for most purposes.

There are in fact some important people who are not generally referred to as "users" when they make use of the remarkable toolboxes we call computers: people who do paid work on them, especially on computers they don't own. More often than not those folks are called "employees" or "staff," and they of course are doing work, not playing games, watching movies, or using their computers to socialize. Of course not, right? They are using those expensive machines for real life work, not escapism. At this very moment there is a debate going on about whether or how computer use may be addictive. "Do the bright colours and pseudo-kindergarten flat designs endemic to most computer operating systems right now present such a risk of addiction that everyone should only interact with them when the screen is set to greyscale?" pundits wonder. My description of the current design aesthetic gives away my firmly negative opinion of it. There is documented evidence that gambling machines have been developed in ways that encourage people to keep playing, to the point that they have become literally addictive. They depend on pseudorandom rewards reinforced by lights, noise, and, as often as possible, free plays rather than a money jackpot. The uncomfortable parallels to the way "social media" work, and to the abuse of alerts and notifications by those applications, should concern us. That games played on a computer, including specifically gambling games, may provide an addictive stimulus is not an unrealistic worry. Or we could consider the implications of interface designs that suggest the people who use computers need the affordances and encouragements given to small children who are just learning the behaviour expected of them in school.

All that reasonably conceded, it doesn't make every possible use of the tools on a computer into a potential addiction. That includes many of the tools for relaxation and play, including games, movies, and the various programs used to make art. It seems to me that the term "user" implies that by default we are all helpless to decide which programs to run and which tools to use on our computers, whether because of the computers themselves (which is nonsense) or because of the ongoing efforts of various corporations to take a paternalistic, surveillance-minded attitude to the computers and software we purchase from them. Yet we are no more powerless around computers, even if they are brand new to us or we are not programmers, than we are in our encounters with drugs. In saying this I should also add that I do not agree that drug addiction is evidence of a character flaw or moral failing. Nor do I think escaping from our concerns and cares for a while is a form of moral failure or laziness. In moderation, it is an important form of self care and an aspect of how we maintain and express social bonds.

After all that sturm und drang, I suppose you could reduce this all to the statement that to call people who use computers "users" rather than "people" is to disrespect and disempower them, especially if they come to agree with the characterization of themselves as "just a computer user." We already know a parallel usage: the old and now properly maligned derogation of a woman's unpaid work in the home as "just a housewife." (Top)

Copyright © C. Osborne 2024
Last Modified: Monday, January 01, 2024 01:26:24