The Distinctive Collections Archives at the Massachusetts Institute of Technology (MIT) is silent as the snowstorm blows outside. The silence seems to pile up with the falling snow. I am the only researcher in the archive, but there is a voice that I struggle to hear.
I'm looking for someone - let's call her the missing secretary. She played a crucial role in the history of computing, but her name has never been mentioned. I'm at MIT as part of my research into the history of talking machines. You may know them as 'chatbots': computer programs and interfaces that use dialogue as the primary means of interaction between humans and machines. Maybe you've talked to Alexa, Siri, or ChatGPT.
Despite the current buzz around generative artificial intelligence (AI), talking machines have a long history. In 1950, computer pioneer Alan Turing proposed a test of machine intelligence. The test asks whether a human can distinguish between a computer and a person through conversation alone. Turing's test stimulated research into AI and the emerging field of computing. We now live in the future he imagined: we talk to machines.
I'm interested in why early computer pioneers dreamed of talking to computers, and what was at stake in that idea. What does this mean for the way we understand computer technology and human-machine interaction today? I find myself at MIT, in the middle of this blizzard, because it was the birthplace of the mother of all bots: Eliza.
Eliza's speech
Eliza was a computer program developed in the 1960s by mustachioed MIT electrical engineering professor Joseph Weizenbaum. Through Eliza he wanted to enable a conversation between humans and computers.
Eliza took the user's typed messages, scanned them for keyword triggers, and applied transformation rules (patterns that rework fragments of the user's statement into a reply) to produce a response. In the best-known version, Eliza posed as a psychotherapist, an expert who responded to the user's needs. "Please tell me your problem" was the opening prompt. Not only could Eliza receive input in the form of natural language, it gave the "illusion of understanding."
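The keyword-and-transformation approach can be sketched in a few lines. The following is a minimal, illustrative Python version: the rules, function names and responses here are invented for this sketch, not taken from Weizenbaum's original script (which was written in MAD-SLIP).

```python
import re

# Toy Eliza-style rules: each pairs a keyword pattern with a template
# that reflects part of the user's statement back as a question.
# These rules are invented for illustration only.
RULES = [
    (re.compile(r"i need (.+)", re.IGNORECASE), "Why do you need {0}?"),
    (re.compile(r"i am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"my (.+)", re.IGNORECASE), "Tell me more about your {0}."),
]

# Pronoun swaps so the reply reads from the program's point of view.
REFLECTIONS = {"my": "your", "me": "you", "i": "you", "am": "are"}

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(message: str) -> str:
    """Apply the first matching rule; fall back to a neutral prompt."""
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."
```

For example, `respond("I need a break")` yields "Why do you need a break?", while a message that matches no keyword gets the fallback "Please go on." The "illusion of understanding" comes entirely from this mechanical reflection of the user's own words.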
The name of the program was a nod to the main character in George Bernard Shaw's play Pygmalion (1912), in which a Cockney flower seller learns to speak "like a lady". Like the 1964 film musical My Fair Lady, starring Audrey Hepburn, Eliza took the world by storm. Newspapers and magazines applauded the fulfillment of Turing's dream.
Even Playboy played with it. Eliza's legacy is significant. Siri and Alexa are the direct descendants of this program.
Accounts of Eliza tend to focus on a Frankensteinian story of the inventor's rejection of his own creation. Weizenbaum was shocked that users could be 'tricked' by such a simple piece of software. He renounced Eliza, and artificial intelligence as a whole, for decades afterwards - much to the chagrin of his colleagues.
But I'm not in the archive to hear Eliza's voice, or Weizenbaum's. In all these stories about Eliza, one woman keeps appearing: our missing secretary.
The missing secretary
In his accounts of Eliza, Weizenbaum repeatedly worries about a particular user:
My secretary watched me work on this program for a long time. One day she asked if she could talk to the system. Of course she knew she was talking to a machine. But after typing only a few sentences, she turned to me and said, "Would you please leave the room?"
Weizenbaum viewed her response as troubling evidence that "extremely brief exposure to a relatively simple computer program can induce powerful delusional thinking in very normal people". Her reaction sowed the seeds of his later horror at his creation.
But who was this 'very normal' person? And what did she think of Eliza? If the missing secretary played such an important role, why don't we hear from her? In this chapter of the history of talking machines, we only have one side of the conversation.
Back in the archive, I want to see if I can find the secretary's voice, to understand what we might learn from Eliza's user. I make my way through Weizenbaum's yellowed papers. Surely there must be evidence among the transcripts, code printouts, letters and notebooks? There are some clues: references to a secretary in letters to and from Weizenbaum. But no name.
I broaden my hunt to administrative documents. I look through departmental papers and the collections of Weizenbaum's workplace, Project MAC - the hallowed center of computer innovation at MIT. No luck. I contact MIT's HR office and alumni association. I stretch the patience of the ever-generous archivists. As my last day arrives, all I hear is silence.
Listening to silences
But the hunt has revealed something: how few organizations have historically cared about the people who produced, organized and preserved so much of their knowledge.
In the history of institutions like MIT, and of computing in general, the authors of these documents - often poorly paid, low-status women - have been largely written out. Our silent secretary is the quintessential erased, anonymous transcriber of the documents on which history is built.
The contributions of the users of talking machines - their labor, expertise, perspectives, creativity - are too often ignored. When the model is "talking," it's easy to think those contributions are effortless or unimportant. But belittling these contributions has real consequences, not just for the talking machine technology we design, but also for the way we value the human input into those systems.
In generative AI we talk about user input in terms of 'chat' and 'prompts'. But what kind of legal status can that talk claim? Should we, for example, be able to claim copyright on our prompts? What about the work those systems are trained on? How do we recognize those contributions?
The snowstorm is getting worse. An announcement says the campus will close early because of the weather. The voice of the missing secretary still escapes me. For now, the history of talking machines remains one-sided. It's a silence that haunts me as I trudge home through the muffled, snow-covered streets.
This article is republished from The Conversation under a Creative Commons license. Read the original article.