The Moonspeaker:
Where Some Ideas Are Stranger Than Others...
COMPUTERS BY WOMEN: A COUNTERFACTUAL EXPLORATION
A CONTRARIAN OVERVIEW
The rumblings began around the 2010s, the sort of rumblings that make for excellent clickbait headlines and handwringing in the mainstream press to tide them over in lieu of a recent natural disaster or "terrorist act." According to these rumblings, there was a growing crisis because girls and women were not remaining in or studying "STEM," the execrable acronym for "science, technology, engineering, and mathematics." After the usual mainstream media suspects had embarrassed themselves by talking up projects that condescended relentlessly to girls by trying to make "STEM" interesting via pink microscopes and doll-branded games with sciencey looking elements, it looked like "the issue" would fade out again. It didn't of course, because more thoughtful scholars began noticing how girls did in fact take these subjects, and how young women made up more and more of the "STEM" majors and eventual graduates. Yet these women seemed to have particular difficulties staying in their fields, and this could not be explained away by insisting they were abandoning their careers to have babies, although in reality they did have to deal with the obnoxious motherhood and family care penalty imposed on all women. The evidence of hostile work environments, miserable pay, delayed or absent promotion on merit, difficulty accessing capital to found businesses or scale up production of inventions, and other issues showed women did not merely give up on such careers. No, it seemed evident women in these fields made their own nuanced cost-benefit analyses, and all too often those analyses forced them to conclude they were wasting their time at best. In our present hyper-capitalist moment, more and more women entering these fields are conscious of the ways the still widely accepted insistence on paying them less than men with the same qualifications is just one more version of a mean-spirited tactic meant to encourage the hostility of their male colleagues.
These factors seem to be in near perfect-storm mode in the field of computer engineering and its outlets in hardware and software, especially software development. There is a lot structurally wrong with "STEM" from education to eventual employment, for everyone and for women and girls especially, but the rumblings are more than merely annoying. Not so long ago, "science," originally better known as "natural history," was specifically considered "for girls" since it did not require latin or greek to study. By the nineteenth century, even higher mathematics, which had been something of a holdout due to the continuing use of Euclid's textbook, no longer had an ancient language barrier.
Then further examination, and the interest encouraged by the world war ii codebreakers at bletchley park, began to spoil the perception that somehow women never had a thing to do with computers. Popular history writers made hay with the eccentric characters this aspect of world war ii military activity helped reveal, from Alan Turing to Charles Babbage. Somehow nobody else seemed to really have computers or develop them except anglophones and a few honorary non-anglophones from europe, but that is something that stands out now rather than in the 1980s-2010s. (The reality is quite different, as newer historical books and articles are showing from primary sources.) Then the conversation shifted again as historians popular and scholarly alike took note of Ada Lovelace and Grace Hopper. Still others began to notice how very many women there were in photographs at bletchley park, and not just on breaks. Slowly, slowly, information began to trickle out showing how these many women, even those women's royal naval service members who so frequently show up in photographs cajoling the codebreaking machinery through its paces, were never just blindly following male directions. They couldn't have been, or they couldn't have done their jobs. In 2005, David Alan Grier's book When Computers Were Human went some way towards seriously acknowledging that the original computers were women, many of them involved in astronomical research. Women were right there labouring on complex calculations in science and military applications, and then in the development, implementation, and adaptation of mechanical, then electromechanical, and finally electronic computers. Then they were ever so steadily run out, as documented in both britain and the united states by such authors as Janet Abbate in Recoding Gender, Marie Hicks in Programmed Inequality, and Nathan Ensmenger in The Computer Boys Take Over.
The effort it ultimately took to keep girls away from computers and anything involving repair or construction of machines, and then women away from electrical and then computer engineering, then ultimately computer science, was far from trivial. Through massive sex role stereotyping, government and business campaigns reinforcing efforts to trammel everyone's imagination about who should program computers, and hiring practices including resume filtering, in-person interviews, and personality profiles, the mass effort finally worked. After the efflorescence of women in computing from the 1950s into the 1970s, by 1995 feminist historian Dale Spender, in her groundbreaking but too little known study Nattering on the Net, could describe sociologist Sherry Turkle's 1988 findings related to women and computers:
Most of the young women surveyed made it clear that when they looked at the people who are drawn to computers they rejected them as role models – they didn't like what they could see.
Such a response fits the data on how women feel about science: it isn't that they can't do it, and it is not necessarily that they don't like the subject. What they turn away from is the image of the scientist or the computer hacker. It doesn't fit with their notions of themselves as women.
There is a sting in the tail of Turkle's data that Spender references here. It was a survey based on 25 young women attending harvard and mit. It was an early study of a very specific group of young women, but as a first step in characterizing the state of women's perceptions of science and computers, a worthy start. Never mind that those young women hardly objected to the mere appearance of those mostly young men in computer science; they disliked their personalities and behaviour, including, I suspect, the behaviour of those computer science students towards them. In the early to mid-1990s, Spender did not comment on the bitter irony of women not seeing themselves even as scientists at that point. It is surely pure accident that, despite featuring Mary Somerville in her still unsurpassed Women of Ideas and What Men Have Done to Them, she did not note that William Whewell coined the word "scientist" for her. The very exemplar of a scientist was a woman in the first place. But those young students had no way to know that, and the advent of readily available digitized copies of Somerville's books among many others was not to be for another ten to fifteen years.
Nevertheless, Spender's reflection on the result stands the test of time. To this day, women young or old who do indeed see themselves as scientists and/or computer hackers are not too enamoured of the typical options offered to them in their fields. Women scientists and engineers who work specifically with computers may well be voting with their feet precisely because of where the jobs are: military applications. The fact is that surveillance capitalism is itself a (re-)militarization of civilian-oriented advertising and propaganda. Considering women are so often the primary victims of civilian surveillance and attendant further abuse at the hands of men, women have pointed reason to do all they can to recognize when they are being tempted to help build the infrastructure of their own oppression. To be sure, many people would insist, and I would agree, that it doesn't have to be this way, that things can be better. Yet they don't seem able to think through how it would or could be different. They seem as caught up with the two primary models of computers-in-use as the men who, due to their social position and access to money and materials, were able to shape the earliest development of computers. These two models may be referred to as "the clean factory" and "the panopticon."
The harvard college observatory computer team, circa 1890. Image courtesy of wikimedia commons.
In the clean factory model, at long last all human labour the factory owners must hire people to do is reduced to zero. All the work is done by robots that never tire, and never err, thanks to their artificial intelligence. Robots do the production, and robots clean up any dust and debris. Such factories won't need lavatories, cafeterias, or cloakrooms. They needn't be safe for humans to be inside either; robots can be designed to run at temperatures, noise levels, and air quality humans cannot tolerate. This follows the logic that insists "efficiency" means paying the lowest wages to the fewest people who actually make the product which can be sold for a profit in a capitalist fundamentalist system. The profits will then be all the higher and all the fatter in the pockets of the owners of the factories. By contrast with this model focussed on removing people, the panopticon model is focussed on never losing track of any person for a moment. The proponents of the panopticon model sometimes make indirect reference to Jeremy Bentham's original design for what he conceived as a uniquely effective sort of prison. As a matter of marketing, they don't like to do this unless they are seeking military contracts, or rather, they didn't. Nowadays, with the level of society-wide militarization across north america and much of europe, the techbros are no longer so circumspect. They are eagerly marketing always-on microphones to listen and cameras to watch for the household, supposedly so that we can feel secure. They are sure we will all be happier to surrender paper and ink in favour of "ebooks" and doing all of our writing and other work on computers – by which the techbros mean computer terminals that must always be connected to the internet to work. After all, how else will they add what we read, write, and watch efficiently to their databases?
Meanwhile, the same techbros eagerly flog the data they have already hoovered up to every government, pointing to its pretended "security" applications for preventing crime, including that troublesomely malleable and political category of violence, "terrorism." There is a mania to this model all its own. Surveillance and data collection are far from harmless, even if the data-hoarding carries an uncanny echo of the joke first recorded in the early twentieth century eastern united states, with the punch line "There must be a pony in here somewhere!" In any case, one of the places the two models overlap is in the notion of security. If factories producing munitions have no people working in them, just strictly programmed robots, then notionally there can be no sabotage, no leak of information about the latest wunderwaffen. Nor indeed could human ethics or pangs of conscience interfere with the centrally directed outcomes of computer use.
The lines of development to modern computers are stubbornly tied to military applications. This includes those infamous and error-ridden logarithmic tables that drove Charles Babbage to wish they could be calculated by steam. Contrary to hopes and expectations, breaking down mathematical calculations into smaller steps and then having a crew of people carry out each step in a sequence reminiscent of factory production of cars or guns does not produce a guaranteed correct result. As Grier describes, human computers were prone, if not to outright carelessness, then to errors of transcription and errors of boredom. Readers who completed arithmetic drills in elementary school will recognize the paradox that stopping to think when doing rote operations not only slows down the person doing the calculating, it leads to mistakes. On top of that, such rote calculation is best done as a mental form of sprint, because attention wanders easily over longer periods. These effects increase rapidly as soon as the calculation at hand is not rote in nature. Babbage did not have reducing the burgeoning workforce of female computers of his time in mind as such, although he was thoroughly alive to how mechanical calculation could be both accurate and cheap because fewer people would be needed to run the machinery. The cost factor played at least as much if not more of a role in the eventual preponderance of women in computing jobs in observatories through much of the late nineteenth century. I suspect what made this a relatively unthreatening job for women to do was precisely how it could be reframed as "mechanical" and not requiring much by way of reasoning. As Hicks observes, women and machinery went together in the public mind in britain because women were associated with a sequence starting with professional men replaced by lower waged women, who subsequently were replaced by machines.
It is much easier to reduce a human workforce or participate in a war, if only the messy people element can be kept out of sight.
In other words, the people, mostly men, who have led and are leading development and deployment of computers are fundamentally seeking a means to create perfect deniability. If somehow a computer running an algorithm can be set up as a supposed authority, the subsequent license to do heinous things and claim "I was only following orders" seems airtight to the most extreme advocates of such applications. We are already seeing real life evidence of this behaviour, such as that great outcome of the 2008 financial crash, the corporate rental cartel. One such cartel in the united states is profiteering so brazenly that the department of justice, notoriously reluctant to investigate corporate crime, is looking into its price fixing. This is a far cry from Dale Spender's 1995 pragmatic comparison of computers to a telephone plus a good book. Today this comparison sounds almost whimsical, even though it is still true.
A more famous comparison and contrast is Ada Lovelace's of the abilities of Babbage's projected analytical engine versus his partially constructed difference engine. All too often it is reduced to her explanation that "We may say most aptly that the Analytical Engine weaves algebraical patterns just as the Jacquard-loom weaves flowers and leaves." Out of context like this, it is easy to guide the reader into taking it as evidence Lovelace did not really understand the principles or possibilities of either machine. Or, that she saw it as a means to make whimsical artworks or as a gateway to the current marketing bubble buzzterm, "artificial intelligence." However, put back into at least its original paragraph, it reads rather differently.
The distinctive characteristic of the Analytical Engine, and that which has rendered it possible to endow mechanism with such extensive faculties as bid fair to make this engine the executive right-hand of abstract algebra, is the introduction into it of the principle which Jacquard devised for regulating, by means of punched cards, the most complicated patterns in the fabrication of brocaded stuffs. It is in this that the distinction between the two engines lies. Nothing of the sort exists in the Difference Engine. We may say most aptly that the Analytical Engine weaves algebraical patterns just as the Jacquard-loom weaves flowers and leaves. Here, it seems to us, resides much more of originality than the Difference Engine can be fairly entitled to claim. We do not wish to deny to this latter all such claims. We believe that it is the only proposal or attempt ever made to construct a calculating machine founded on the principle of successive orders of differences, and capable of printing off its own results; and that this engine surpasses its predecessors, both in the extent of the calculations which it can perform, in the facility, certainty and accuracy with which it can effect them, and in the absence of all necessity for the intervention of human intelligence during the performance of its calculations. Its nature is, however, limited to the strictly arithmetical, and it is far from being the first or only scheme for constructing arithmetical calculating machines with more or less of success.
Lovelace is striving to describe two remarkable and highly useful tools, one partially realized, one possible but not yet built. Indeed, this perception of computers as useful tools is at least more obvious in women's comments on them from early in their history all the way into the late twentieth century. The Difference Engine performs according to a completely pre-defined pattern, grinding out arithmetical tables by successive orders of differences, while the Analytical Engine's calculations may be both accurate and unpredictable in advance, because it can be programmed to carry out general algebraic operations. This is what begins to open up the possibility of using machinery to model and predict. As most of us are well aware, having a tool to assist with these tasks comes in very useful.
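The principle of "successive orders of differences" that Lovelace credits to the Difference Engine can be sketched in a few lines of modern code. What follows is an illustrative reconstruction, not a description of Babbage's actual hardware; the function name and coefficient convention are my own choices. The point it demonstrates is the one Lovelace makes: once the first few values are seeded, every further table entry needs only repeated addition, never multiplication, which is exactly what made the calculation strictly arithmetical and mechanizable.

```python
# Tabulating a polynomial by the method of successive differences,
# the strictly arithmetical principle the Difference Engine mechanized.

def difference_table(poly, n):
    """Tabulate a polynomial at x = 0..n-1 using only additions after seeding.

    poly gives coefficients lowest order first, e.g. [41, 1, 1] is 41 + x + x^2.
    """
    degree = len(poly) - 1

    def p(x):
        return sum(c * x**k for k, c in enumerate(poly))

    # Seed: the first degree + 1 values, computed directly.
    seed = [p(x) for x in range(degree + 1)]

    # Build columns of differences until the constant column is reached.
    columns = [seed]
    while len(columns[-1]) > 1:
        prev = columns[-1]
        columns.append([b - a for a, b in zip(prev, prev[1:])])

    # state[0] is the current table value; state[i] the current i-th difference.
    state = [col[0] for col in columns]

    table = []
    for _ in range(n):
        table.append(state[0])
        # Advance: each entry absorbs the next higher-order difference,
        # a cascade of pure additions like the turning of the engine's wheels.
        for i in range(len(state) - 1):
            state[i] += state[i + 1]
    return table

# Example with x^2 + x + 41, the polynomial Babbage reportedly used to
# demonstrate his engine:
print(difference_table([41, 1, 1], 5))  # [41, 43, 47, 53, 61]
```

Notice that inside the main loop there is no multiplication at all, only addition, which is why the Difference Engine needed no "intervention of human intelligence during the performance of its calculations," and equally why it remained, in Lovelace's words, limited to the strictly arithmetical.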
By the time we get to actual general purpose computers, women are already prioritizing purposes not necessarily at the front of corporate or military minds. Quite apart from Spender's ebullient commentary on the computer as means of communication, in the 1960s Delia Derbyshire was among the first serious composers of electronic music. Yes, she started with mechanical means involving carefully timed and spliced tape loops. She also added electronic instruments and computers as soon as they were available. By the 1980s, female fans of the short-lived original Star Trek television series were adding computers to their equipment for sharing information about the program alongside fan created audio, video, and fiction.
Please note my point here is not based in some sort of biological determinism. Thanks to sex role stereotyping combined with the ways in which capitalist economies strive to keep all people as atomized as possible while demanding women find some way to carry out all social reproduction for free, women are incentivized to notice and adapt technology to make social reproduction possible. Women cannot survive at all if they cannot create and maintain social networks among themselves. And like it or not, if women can't survive, nobody can. Conversely, although men may value some or all of the same aspects of computers as women, they are under pressure to pretend they don't while striving after the goals for computer development and use established by the masculine-stereotyped military and corporate interests.
Meanwhile, there is a persistent undertone of hostility in much of the response to what are arguably the most stereotypically "feminized" of computers, those produced by apple incorporated. A brief foray into the (of course) opinionated snapshots of "hacker culture" in the Hacker's Dictionary aka The Jargon File provides early evidence of how the first consumer-grade apple computers were received in what can be loosely described as the computer science circles of the time. There is a cluster of less than enthusiastic terms under the letter M, including macdink, Macintoy, and Macintrash. These terms suggest their coiners took personal insult at the notion of computers designed for accessibility to people who were either unable or unwilling to spend hours relentlessly learning how to code in assembly or machine language, or to use a soldering gun to build their own machines. No doubt it was both frustrating and exciting to see once necessary drudgery vanish thanks to improvements in computer construction, speed, memory, and energy efficiency. I think there is a real critique caught up in the kvetching though, one taking issue with developments indicative of a move towards external and centralized control of these machines by rendering them into black boxes people cannot easily repair or reprogram. It is one thing to make it unnecessary to have arcane knowledge to operate a computer, and another entirely to seek to make it impossible to understand and alter the inner workings of a computer at all. The battle to prevent the total enclosure of general purpose computing is very much a part of, and response to, that critique. Far too much time was lost on sniping at apple for prioritizing a graphical user interface and applications allowing non-programmers to use those computers for entertainment and making art, and then for making the computers themselves at minimum less ugly by eschewing the colour beige.
If a computer is more than an appliance, and if we are going to go along with having computers in the home, better they be visible and not eyesores. They shouldn't be hidden in every possible other appliance or item in the house or the office, whether or not a full general purpose computer is genuinely necessary for the task the appliance or item is doing. Still, it's the tie to the stereotypical association of women with "the home" and wanting things to be "pretty" that adds to the hostile undertone. This is not to say apple hasn't played into this in the worst ways, between its more and more locked down operating systems and attempts to render its computers into fashion accessories and therefore dangerous electronic waste. Nor is it to say apple's development and marketing lines reflect actual respect for women. They don't.
With this background set out, the question remains. Suppose women had not been pushed out of the various branches of the computer industry, especially as developers of hardware and software. Furthermore, let's suppose the perverse incentives pushing development of all technology towards variants of the "clean factory" and "the panopticon" are no longer present. No longer present, but recognized, and so likely to influence what women imagine and try to realize. Well, what might computers look like then?
NO MAGIC, NO INSTANT PERFECTION
At the moment it is still very common for people to liken computers to brains, which probably does more to obscure understanding than otherwise. Therefore I would like to start from a different analogy and framing. Instead, the analogy here will be to fabric, and not because of the accidental compatibility of such an association with stereotypes about women and textiles. Indeed, there is a hint of this comparison in the connection between jacquard looms and eventual computers, although as we have seen, writers often think of the automated loom and the computer, rather than the elaborately woven fabric and the computer. I think it is worth drawing out (pun very much intended) some details about fabric and the reconstructed history of its development, because it has some properties well-suited to computers intended to follow different models than the clean factory and the panopticon. This will lead into a discussion of four alternate models to think with: the self-publishing house, the mobile personal library, the spontaneous network, and the tedium-saving device. In each case, the idea is to favour decentralization, hands-on control and operation by people who choose to do so, and a greater level of social accountability. In a way, this is an attempt to break computers entirely away from their conception and construction as an expression of militarism. It also strives to build on women's and girls' more common view of computers in patriarchal, capitalist societies where they are relatively available, which can be summarized as "computers are tools people use to make tedious tasks easier and faster."
To date, the known archaeological record keeps showing on a worldwide basis that women have been the primary inventors of specialized and reusable fibres, textiles, and subsequently tools for spinning, knitting, weaving, and repairing those fibres and textiles. Sticking specifically to cloth, as opposed to baskets, carpets, rugs, or nets, which have different affordances and often different materials, there are many important characteristics to notice. By nature, fabric wears out, and so it needs regular repair and replacement. The most useful fabric for wear and storage is time-consuming to make by hand, which adds to the motivation to make it last by careful processing, repair, and reuse. As archaeologist and philologist Elizabeth Wayland Barber notes, the majority of women's time in west asia went into making textiles for much of history; they literally put the clothes on their families' backs. Later, depending on the vagaries of sex differentiated labour patterns, men might be the ones who wove the fabric for sails alongside their work on fishing nets and rope, but it appears women originally made the cloth for those too. Unfortunately women's inventions were promptly used against them by men willing to use violence to force women to produce far in excess of the immediate needs of themselves or their own families. This is a general risk with inventing and deploying labour-saving devices in contexts where exploitation enforced by violence is legal, but that is a separate issue for the moment. The points to focus on here are that women invented tools to support their textile work using readily available materials and easily shared designs. They could upgrade the components and improve the design depending on the type of weaving and fibres available.
Well into the twentieth century, it would have been unheard of for a woman not to have the materials and equipment available to mend and sew clothes and other household cloth items, even if she didn't weave cloth herself. This is the sort of ubiquity greedy capitalists can only dream of, although they easily lose sight of what made this possible: it wasn't just the practical necessity, there was also a tremendous amount of creativity and social relationships bound up with textiles. Cloth was and still is everywhere, but over time as people came to live in larger communities and trade networks, looms were not as common. In the nineteenth century european colonists took particular interest in centralizing looms into factories, so that the number of independent women weavers dropped precipitously. Later the desire to speed up production of more elaborate fabrics helped drive the development of the jacquard loom. But how did women develop fibre technology to such extraordinary heights even before creating the larger, more elaborate looms that supported early larger scale production?
The archaeological evidence, including ancient figurines and two-dimensional representations of women, indicates that first, they invented string. This might seem a strange idea now, when various sorts of sticky tape, hook and loop fasteners, and zip lock bags are everywhere and string is no longer as commonplace in day to day use as it once was. Among the possible first inspirations for inventing string are managing long hair, and even more likely, keeping together small collections of regularly used tools. It might sound easier to use animal skins and sinews for these things, but this is far from the case. Processing animal skins to produce useful leather and sinew for sewing requires a great deal of physical labour, even for small animal skins. Leather pouches with drawstring closures might be the earliest commonly used long-wearing bags, but the sheer effort required to make them plus their limited size would tend to leave much to be desired by any palaeolithic person who needed to collect various types of materials and foods. Women would also have a strong incentive to find means to make it easier to carry babies and small children. Somewhere along the line, women invented knotted string bags, which likely also helped with the creation of nets, which are after all simply larger bags. In the majority of places humans live, the climate is not well-suited to going naked, and it seems as women moved into those places they began to develop woven fabric where useful fibre-giving plants and animals were available. And of course, in west asia people in general took special effort to domesticate fibre-giving plants and animals to better support the growing demand for cloth in daily life. Women also sorted out ways to dye wool and cotton fibres a myriad of colours, and subsequently clothing and woven rugs and carpets became "readable." That is, what may have begun as simple decoration rapidly developed into systems of symbols used for social purposes.
Before writing as we now know it was developed and in use, it is likely women memorized and shared patterns and techniques by both individual instruction and by encoding them in songs. But cloth could be rendered more literally readable because embroidery was also highly developed at an early period, starting as a development from tailoring and reinforcement of hard-wearing or otherwise weak parts of garments and bags. Cloth, string, and the various arts and crafts associated with them are very different from computers in that not only is nothing about them proprietary in principle, they are produceable without the sort of elaborate and expensive equipment required for making computers. Synthetic fabrics and dyes are very much the exceptions that prove the rule in this case.
So, with this very broad picture of the history of fabric and its production in mind, I think women would redesign computers starting from the principle that no one should be forced to use or own an electronic computer of any kind to get through their daily life. Among the most basic reasons for this, apart from treating non-renewable resources in a frugal and responsible manner, is the ethical requirement that privacy be respected and the tyranny of surveillance prevented. It should always be possible for a person to complete such tasks as purchases or registrations by going to a shop or office in person, telephoning via landline, or filling out and sending in an appropriate form. This would maintain at least two important benefits: resilience should there be any sort of natural or human disaster that disrupts electricity generation and/or electronic communications, and allowing people to choose to carry out such tasks in person. By "landline" I am not suggesting this might require redeploying copper telephone wiring again, but the point is of course that no one should be forced to have a cellular telephone. Nor should landlines be reduced solely to the equivalent of "voice over internet" calling, which has persistent privacy invasion issues. At the moment there are myriad household devices senselessly equipped with computers, including more recent examples set up with wifi for supposed "convenience." In a remarkable number of cases, the computers are completely unnecessary, and stopping production and distribution of the majority of so-called "smart" devices would also cut off the growing stream of e-waste, as well as of privacy and security breaches in the home via these often poorly supported and secured devices.
The next principle is that computers must be recognizable as what they are, not hidden or otherwise disguised as something else. A hidden or disguised computer is inherently an insecure device likely to be abused for surveillance or some form of data theft sooner or later. Striving to make computers pleasant to the eye, as well as smaller and less likely to emit excess heat and noise does not require disguising them as something else. This principle should not be taken as a license to continue the contemptuous and wasteful efforts of apple computer to recreate computers as disposable fashion accessories. As a correlate to this and to improve resilience in the face of potential loss of external power, all computers expected to remain plugged in should come with an emergency battery to allow graceful shut down if the power goes out, and it should always be possible to turn them off using a hard switch. Portable devices like music players, cellular phones, and tablets should also have easy to use on-off switches. And when a device is turned off, it should be off. Especially for "always plugged in" computers, it should not be necessary to unplug them altogether to be sure they are not drawing electricity. Furthermore, any emergency battery must have a separate switch to break the circuit between it and the computer.
At one time it made some sense to focus on increasing the processing power and storage capacity of personal computers. Not everyone wanted to carry out data- and processing-intensive tasks like digital image, audio, and video editing, but those who did had a reasonable interest in beefier machines than most people might want to have on a day to day basis. However, the practical processing capacity of even the worst laptop now equals what would have been a supercomputer as recently as the 1980s. The only reason to demand even more powerful chips and ever more massive storage is to support centralization and surveillance. As it stands, the extraordinary over-production of high power chips and memory storage is a product not of genuine utility or demand, but of a massive speculative "AI" bubble combined with a last gasp of the military-industrial complex in the form of a desperate effort to realize the dystopic dream of a total surveillance state. There is simply no reason to make computers faster or smaller now, and the effort to do so is inconsistent with maintaining and improving repairability. Rather than pouring resources into miniaturization efforts that are now running into quantum-level issues, it makes better sense to focus on improving the energy efficiency, repairability, and recyclability of components.
Unfortunately much of the current improvement in energy efficiency for computers has come from a combination of specialized software tuning and motherboard designs that enforce the use of software switches. This presents difficult problems both for repairability and for making it as hard as possible to abuse computers for surveillance purposes. Hence the principle of the right to repair demands a move to exclusively free/libre software. The only way to keep software switches honest is to ensure the code can be audited and patched to correct inevitable bugs and any attempts to meddle with them for nefarious purposes. Moving to exclusively free/libre software would entail a different approach to firmware and to separate "systems on a chip": the firmware must be auditable via software tools anyone can use, and replaceable with secure software if found wanting; and separate "systems on a chip" must be removable, that is, the unit the extra system sits on should be physically removable, with the computer still able to operate afterwards. If there is a real use case for the extra system, a person who wants it may leave the unit in place, though they may not be willing to do so if they find the extra system is insecure. A redesign allowing such module removals outside pristine laboratory conditions would further support economical and ready repair, and must also go hand in hand with recyclability. There will probably always be proprietary software around, but no one should be forced to use it at home or at work. Furthermore, a person's right to reject surveillance software and to use only free/libre software must always be respected. In the case of persons who are incarcerated or transitioning from prison back to freedom, the older methods of tracking their location and enforcing reporting requirements are still available.
The only reason to drop those older methods is a grim expectation of imprisoning, or keeping in a semi-imprisoned state, more and more people with fewer and fewer guards. That is not a reason any of us should go along with.
Suppose we have applied these principles, which at minimum significantly remove computers from a focus on military purposes. Computers are then repositioned as tools we manage and control, including exerting real influence over whether or not they are included in our daily lives. They may still be used for many familiar purposes, from working remotely in a wide range of jobs to producing digital art. However, their greatest potential may be their capacity to support decentralization. The obnoxious "technolibertarians" who come up with advertising slogans such as "information wants to be free" have recognized this, even as they strive to mystify it. "Information" has no wants, and what even counts as information is socially determined. Libertarians of any stripe are in favour of being able to take whatever they want from others without having to contribute anything themselves, helped along by any means of mystification and distraction available. The efforts to mystify computers into magic boxes supposedly only "techbros" understand, and to pretend "information" is an independent feeling entity, are just the most recent examples of such means. Networked computers combined with small-scale printing facilities, including the trusty desktop laser printer, make it far easier for us to do what our gregarious, social animal nature inspires us to do: share ideas and pursue relationships. This is exactly why the first computers to really break through outside of an office environment were apple computers designed to support learning by doing (programming optional) and modest desktop publishing, including graphics production. Apple was also ahead of the curve in providing colour graphics and supporting excellent sound reproduction. Serious video game players may still turn their noses up at apple computers in terms of both hardware and software to this day, but it was fundamentally the desire to make and share art that made the difference.
Large-size publishing and graphic reproduction is still beyond the means of a single person or household, but small and even medium-size production is not. The self-publishing house in hard copy is real, and so is its electronic equivalent thanks to the internet, especially the world wide web. Right now online publishing is in an awkward state, but with a re-imagined, decentralized approach to computer networking and use, the situation would certainly be different. For instance, in such conditions self-hosting would be more common, and there would be a strong case for public libraries to provide hosting, if not of websites other than their own, then of search indexes of local-area websites. A wider-area directory could be composed and provided via subscription to the libraries using a peer-to-peer method. This would re-establish a non-commercial, non-corporate controlled search infrastructure, making it easier for people to find and use smaller websites.
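As a rough sketch of how libraries might pool their search indexes peer-to-peer, consider the following. Every name and URL here is invented for illustration; the only assumption is that each library can publish a simple word-to-URL index of its local websites and pass it along to a peer by whatever channel the subscription uses.

```python
# Hypothetical sketch: each library builds an inverted index of local
# websites (word -> set of page URLs), then merges indexes received
# from peer libraries to answer wider-area queries.

def build_index(pages):
    """pages: dict mapping URL -> page text. Returns word -> set of URLs."""
    index = {}
    for url, text in pages.items():
        for word in set(text.lower().split()):
            index.setdefault(word, set()).add(url)
    return index

def merge_indexes(local, remote):
    """Union a peer's index into a copy of the local one."""
    merged = {word: set(urls) for word, urls in local.items()}
    for word, urls in remote.items():
        merged.setdefault(word, set()).update(urls)
    return merged

def search(index, query):
    """Return URLs containing every word in the query."""
    results = [index.get(word, set()) for word in query.lower().split()]
    return set.intersection(*results) if results else set()

# Two libraries index their own local sites...
town_a = build_index({"http://a.example/weaving": "weaving looms and thread"})
town_b = build_index({"http://b.example/looms": "building looms at home"})

# ...then share and merge, so either can answer queries covering both towns.
shared = merge_indexes(town_a, town_b)
print(sorted(search(shared, "looms")))
```

A real directory would add ranking and periodic re-crawling, but the core exchange is no more than this: each peer passes along its word-to-URL mapping, and merging is a union.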
By now our imagined groups of hard-working women redesigning the computer and computer networks will have ended up having to work with many other people to seriously restructure the economy. The control of computer hardware and software must be stripped from monopolies and corporations, along with the very notion of a corporation itself. Author Indrajit Samarajiva, whose educational and working background includes cognitive science, online business, and magazine startups, argues that corporations are fundamentally malevolent "AIs." This may seem like hyperbole on Samarajiva's part, but his arguments are very strong, including further evidence of how corporations are distilled capitalism, in that success in either incentivizes psychopathic choices and subsequently psychopathic day to day behaviour. With the very production and distribution of software and hardware decentralized and the problematic corporations and monopolies out of the way, continuing the redesign of both will be much easier. Nevertheless, it is not necessary to have this step completed before the redesign begins; in fact I would suggest it is precisely necessary to start both projects in parallel if one or both has not begun already. It is a mistake to leave bad conditions in place in hopes of having the perfect ones to start practical work from.
The notion of a mobile personal library is one of the things that causes the most angst among the people whose dreams revolve around making money by charging rent. At the moment they are in almost complete control of the mainstream publishing system, where they ruthlessly exploit authors and artists in general, and then wherever possible use the authors lucky enough to write one or more bestsellers to attack any proposal that challenges the mainstream monopolies. It is all too understandable that authors and other artists who have gone along with this have either been convinced they owe everything to the corporations that print and distribute their work, or are caught up in invidious contracts limiting what criticism, if any, they can make of the present system. Decentralization remains a watchword here. As for those who feel angry about what are widely referred to as "shadow libraries" and peer-to-peer sharing, at least as presented by the mainstream media they come across as unable or unwilling to acknowledge what drives the demand for such libraries. Fundamentally the issue is that the various media corporations do not value paying the original authors and artists, nor do they value providing affordable access to electronic versions of works, whether to borrow or to acquire as duly purchased copies. Without the baleful corporate influence in the mix, and with improved support of public libraries as hubs for legal online access to electronic copies, plus respect for privacy, those who cannot afford or do not wish to have purchased copies could maintain their own lists of stable links accessible through those very libraries. As to purchasing electronic copies, we already have the infrastructure and algorithms for that.
An excellent addition to the websites of stores and companies helping authors and artists sell their work would be to follow a practice already common at newer publishing concerns like substack and ghost, which explain how much of the money they collect goes to authors, and how much to the company to provide the servers and software services.
The spontaneous network is another use of computers already within reach, although it is not as widely practiced as it might be, since at the moment internet access is presented as wholly provided by "internet service providers." Unfortunately these are too often disservice providers under present centralized circumstances. But spontaneous networks are something else. It is surprisingly simple and easy to put together a local area network using basic consumer hardware, although ideally not the nearly impossible to secure and repair items filling the shelves today. Instead these would be more like business routers, capable of having their operating systems updated and their RAM cards and wifi radios upgraded. As it should, this just sounds like "the internet," because originally the internet was created by establishing wired links between different institutions. The point here, though, is to further improve the ease and comfort with which people can set up their own temporary wireless or wired networks in order to do work, run events, and so on. Here again, the point is to support respect for privacy and resilience in case of disaster or simple loss of access to the wider network for whatever reason. Working together to create a spontaneous network of computers is also a great way to help curb the tendency to mystification in how computers are used and set up. It should always be possible to carry out such tasks without expensive technicians. There will always be someone who will claim this would never work, or that it might prevent those who would like to be technicians from making a living. This is an obvious absurdity, however. On the canadian prairies, before regular phone lines were run out to or between smaller communities or isolated homes, farmers made use of the wire in their fences. These telephone systems could be impromptu and rough, but quite workable over impressively long distances.
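To give a sense of how little machinery a spontaneous network really needs, here is a minimal sketch of peer announcement using only standard UDP sockets. The loopback address stands in for a real local network's broadcast address, and the message text is invented for the example; nothing here depends on an internet service provider.

```python
import socket

# Hypothetical sketch of peers on a spontaneous local network finding each
# other: one socket listens, another announces itself. On a real LAN the
# announcement would go to the network's broadcast address; loopback
# stands in here so the example is self-contained.

listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("127.0.0.1", 0))      # bind to any free port
port = listener.getsockname()[1]

announcer = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
announcer.sendto(b"HELLO weaving-circle laptop", ("127.0.0.1", port))

# The listener now knows a peer exists and where to reach it.
message, address = listener.recvfrom(1024)
print(message.decode(), "from", address)

announcer.close()
listener.close()
```

Real mesh and ad hoc networking tools layer routing and encryption on top, but the underlying act of machines announcing themselves to neighbours is this simple, which is part of why demystifying it matters.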
We are encouraged to reframe computers and all sorts of other machines and tools as "labour saving" devices. The people doing the encouraging are those especially interested in two contradictory things: selling such devices at a serious mark-up as widely as possible (effectiveness depending on the social position and funds of the buyer), and deploying those devices in ways that reduce the number of people who must be paid to do various sorts of work. They are focussed on making personal profits and reducing their dependence on other people, whom they prefer to control and exploit. The skyrocketing levels of mental illness, general discontent, and even boredom among the very wealthy show this is no way to live. To say so is not the same as throwing away all technology more complex than a water mill and setting ourselves up in a medieval fantasy world. Instead, the key is to take a human, worker-centered view of machines and tools as "tedium-saving," or even "tedium-removing," devices. It is probably better to opt for the former, as some tedium is simply not avoidable. From that point of view, computers are true wonders of the age, from typing up documents to making audio-visual editing easier. Even more important than their capacity in this area, though, is how computers may be programmed to provide important affordances and supports for people who live with disabilities. For anyone of whatever level of ability, computers may help automate those deadly rote but necessary tasks that bored people inevitably make endless mistakes doing, allowing us to focus on the more important creative aspects of our work instead. Conversely, computers can continue supporting economical mass production, and in future help better plan when and if mass production is applied.
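A tiny, hypothetical illustration of the tedium-saving framing: the records and field names below are invented for the example, but the shape of the work is familiar. Instead of a person eyeballing hundreds of catalogue entries for missing details, the computer does the rote checking and the person reviews only what it flags.

```python
# Hypothetical sketch: flag catalogue records with missing required fields,
# so a human only has to look at the problem cases.

def flag_incomplete(records, required):
    """Return (record_index, missing_fields) for each record lacking a field."""
    flagged = []
    for i, record in enumerate(records):
        missing = [field for field in required if not record.get(field)]
        if missing:
            flagged.append((i, missing))
    return flagged

records = [
    {"title": "Loom repair guide", "author": "M. Weaver", "year": 1994},
    {"title": "Dye recipes", "author": "", "year": 1991},
    {"title": "Fence-wire telephony", "author": "L. Emerson"},
]

for index, missing in flag_incomplete(records, ["title", "author", "year"]):
    print(f"record {index} is missing: {', '.join(missing)}")
```

The point is not this particular check but the division of labour: the machine absorbs the deadly rote part, and human attention goes where judgment is actually needed.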
IN CONCLUSION
These ideas of how women might redesign and reframe computers would not in themselves neutralize the negative potential of these complex tools, though they would go a long way towards restoring a sense of human-scale responsibility for managing them. A great deal of what pushes women out of working in computer science and related fields is that women have even more reason to dislike and distrust social structures based on extreme competition and rigid hierarchies. Since these are the sort of social structures women are most often oppressed by, if they are not already caught up in one or more, this is an important barrier. Unfortunately, despite their lack of open and rigid hierarchies, the various free/libre software and hardware communities and companies are not immune to developing anti-woman cultures. The communities are particularly prone to "The Tyranny of Structurelessness," in which structures are very much present, but based in cliques and similar group formations, where any written policies or rules become a fig leaf for the most popular and powerful clique leaders running things. Smaller software and hardware companies can be better or worse. The best cases tend to be those with women among their founders and owners. I suspect that the majority of women who work professionally on and with computers today, whether on hardware, software, or both, are not recognized as such because of why they do so. Often their purpose is to support a different goal than foisting the computer into things somehow; instead they are focussed on using the computer as a tedium-saving device on the way to completing a larger goal. Hence the many women who are busy in "support work" on so many science and technology projects, and those who do the actual work to adapt and set up new computer systems and associated peripherals in offices, stores, classrooms, and homes.
- See Magic Died When Art and Science Split by Renée Bergland in nautilus magazine.
- For example, see Simon Singh's The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography (New York: Doubleday, 1999); Doron Swade's The Difference Engine: Charles Babbage and the Quest to Build the First Computer (New York: Viking, 2000); and Paul Gannon's Colossus: Bletchley Park's Greatest Secret (London: Atlantic Books, 2006). More technically inclined readers will want to look for the various books written and edited by B. Jack Copeland.
- Grier, David Alan. When Computers Were Human. Princeton: Princeton University Press, 2005.
- Abbate, Janet. Recoding Gender: Women's Changing Participation in Computing. Cambridge: MIT Press, 2012.
- Hicks, Marie. Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing. Cambridge: MIT Press, 2018.
- Ensmenger, Nathan. The Computer Boys Take over: Computers, Programmers, and the Politics of Technical Expertise. Cambridge: MIT Press, 2010.
- Spender, Dale. Nattering on the Net: Women, Power, and Cyberspace. Toronto: Garamond Press Ltd., 1995.
- Spender 1995, 173.
- Spender, Dale. Women of Ideas and What Men Have Done to Them. London: Pandora Press, 1988. See pages 230-236.
- "The Borg" and their cube spaceships in Star Trek: The Next Generation come close to a visual representation of this sort of factory. The show's writers never did give a convincing explanation of why the Borg should insist on "assimilating" organic beings. For a more specific and detailed analysis of the connections between nineteenth century british and european interest in automating away real human labour and the development of so-called "artificial intelligence" and machine learning, see Pietro Daniel Omodeo's review essay of Matteo Pasquinelli's book The Eye of the Master: A Social History of Artificial Intelligence, The Social Dialectics of AI.
- Despite my reluctance to cite wikipedia here, it is undeniable that the collection of diagrams and illustrations in the panopticon article is excellent. The Internet Archive has scans of original editions of Jeremy Bentham's design and notions of the panopticon, Panopticon; or, The Inspection House and Panopticon: Postscript.
- See Quote Investigator for an informal tracing of the joke's origins and history.
- Grier, especially the first part of the book, Chapters 1-5 (pages 11-88).
- See Babbage's books The Calculating Engine and On the Economy of Machinery and Manufactures at Project Gutenberg.
- For a poignant case study of this sort of exploitation, specifically of women, see Biography of Williamina Fleming, American Astronomer at Salient Women and The Maid Who Mapped the Heavens at narratively.
- Hicks 2018, 8-9.
- Conor Gallagher, naked capitalism: How Responsible Is the Rental Housing Cartel for the Explosion in Homelessness? 1 september 2024. As I complete the editing phase on this essay, the zionist entity's military is actively adopting "AI" for the exact purposes of pretending they are not responsible for committing genocide.
- Spender 1995, 176.
- Menabrea, L.F. "Sketch of the Analytical Engine Invented by Charles Babbage, Esq., with Notes by Ada Lovelace," Scientific Memoirs 3 (1843): 666-731. Page 696.
- For an introduction to Delia Derbyshire's life and work, see Delia Derbyshire: An Audiological Chronology and Sandy Levins, Wednesday's Women: Delia Derbyshire – Unsung Hero of Electronic Music 1937-2001.
- Bacon-Smith, Camille (Author); Hall Stephanie A. (Photographer). Enterprising Women: Television Fandom and the Creation of Popular Myth. Philadelphia: University of Pennsylvania Press, 1992.
- These links are direct to Eric S. Raymond's curated version of The Jargon File on his website.
- To date I have not found many sources on or offline focussed on the more whimsical and creative applications of computers and computer programming in the 1980s to 1990s, perhaps because this is also when commercial computer games took off, overwhelming in the records much of the more artisanal and experimental work. Threads to follow to this other material include the late Dennis M. Ritchie's Bell Labs Home Page (see especially the page on the early game Space Travel: Exploring the solar system and the PDP-7) and David C. Brock's Computer History Museum blogpost, The Earliest Unix Code: An Anniversary Source Code Release, 17 October 2019.
- At that time computer science programs were rare. At the university where I completed my physics degree, anyone who wanted to specialize in computer software or hardware entered the electrical engineering program.
- ifixit is a business specializing in supporting people in repairing devices that would otherwise end up in landfills, providing many repair and teardown guides for free and a highly informative blog explaining the basics of right to repair laws and why they are especially needed wherever computers are involved. On the software side, it is worth reading about free/libre software at the Free Software Foundation and the overlapping but not identical concept of open source software at opensource.com.
- It is true that this means in the next section I will sidestep the question of whether computers would even exist without pressure to routinize labour and automate it away, but I will return to the question later.
- The still unsurpassed book on the topic for west asia is Elizabeth Wayland Barber's Prehistoric Textiles: The Development of Cloth in the Neolithic and Bronze Ages (Princeton: Princeton University Press, 1991). So far I have not found any comparable book-length treatment for any part of the americas or sub-saharan africa.
- It is likely that cloth dyeing developed as much for its preservative effects as decorative ones. There is good daily evidence for this accessible to many of us, even if only by venturing into a second-hand clothing store. Notice how much longer a pair of black-dyed denim jeans last compared to blue-dyed jeans, or in the second-hand store, how while there are not necessarily more black denim jeans than blue, the black jeans are in better shape and so consistently more expensive.
- The introduction to Prehistoric Textiles, brief though it is, provides an excellent overview of how busy women were with textile making and repair (pages 2-5). A more popular-style treatment focussing on this aspect is her recently returned to print Women's Work: The First 20,000 Years (New York: W.W. Norton & Company Inc., 1994).
- Among the best-illustrated books for examining early evidence of strings and textiles are Marija Gimbutas' The Civilization of the Goddess: The World of Old Europe (San Francisco: HarperSanFrancisco, 1991) and The Language of the Goddess (London: Thames & Hudson Ltd., 2001).
- The symbolic language of embroidery and its deeper history are ongoing topics of research. Ethnographers and artists working in southeastern europe and west asia have completed some of the most detailed recent work. Among the most well known of these scholars is Mary B. Kelly, whose books include Goddess Embroideries of Eastern Europe, Goddess Embroideries of the Balkan Lands and the Greek Islands, Goddess Embroideries of the Northlands, and Goddess, Woman, Cloth: A Worldwide Tradition of Making and Using Ritual Textiles.
- I am simplifying and summarizing by necessity; do read the man himself at his site, indi.ca, starting with Corporations Are Already AI: The Stock Market is Sentient and it Hates You. For just his articles on "artificial intelligence" and corporations, he has a subject tag page that auto-updates and has its own RSS feed.
- This is the fundamental meaning of the older saying about not allowing the perfect to be the enemy of the good.
- For a brief overview of wireless mesh networking and its origins in the original wired internet that provides the major connections over larger areas and worldwide, see Chris Pollette and David Roos at howstuffworks, What Is Mesh Wifi, and How Does It Work?, 22 march 2024.
- See Lori Emerson's A Brief History of Barbed Wire Fence Telephone Networks, 31 august 2024. Emerson founded the Media Archaeology Lab in boulder colorado, making her a particularly great source.
- The united states has been on the pointy end proving how unsatisfying and destructive this approach to life in fact is, though the vast majority of citizens there never volunteered for this demonstration. One marker is the incredible levels of psychoactive drug prescriptions in that country, including to children between the ages of 0 and 5, as described in Number of People Taking Psychiatric Drugs in the United States by the citizens' commission on human rights international. Another is the infamously huge and growing prison population of the united states, the highest in the world despite that country being a nominal democracy. For specific numbers and excellent infographics, see the united states profile at the prison policy initiative. Readers who would prefer a more conservative source can start with Abby Rogers' article at business insider, Check Out Beautiful/Scary Satellite Pictures Of Every Prison In The US.
- The Tyranny of Structurelessness, by Jo Freeman aka Joreen. This famous paper has often been reproduced without Freeman's permission; this version is from her own website.
- The ongoing and grimly slow-moving car crash that is the present-day debian community is one of the most obvious, though still least openly discussed examples. Among the most vocal critics of its current state is former software developer Daniel Pocock.