Joshua Foer. Moonwalking with Einstein: The Art and Science of Remembering Everything. Penguin Press. 320 Pages. $26.95.
Brian Christian. The Most Human Human: What Talking with Computers Teaches Us About What It Means to Be Alive. Doubleday. 320 Pages. $27.95.

Humanity has ever been a favorite topic of human writers. Start with the Iliad and the Odyssey — or, by some traditional reckonings, even long before, with the Bhagavad Gita — move through the tales of classical antiquity and the Middle Ages, past the works of the Renaissance, Baroque, and modern periods, and what you’ll have traversed is a corpus of prose, literary and otherwise, obsessed with considering the human condition.

This consideration has lately begun to narrow, and not merely because the popularity of ecumenical literature seems to wilt as that of insular memoir blooms. Consider the voguish pervasiveness of brain-science writing, a genre that endeavors to elucidate human actions by pinpointing where in the brain they originate, and how and why. Fine. Yet in too many of these books, the specific, scientific roots of human exploits are more significant than the exploits themselves. The expansive study of virtue tapers to the study of exact cause — the question becomes why, molecularly, a certain decision was made, and not necessarily whether the decision was right or good. “How to think?” is eclipsed by “How do we think?”

Oliver Sacks is the chieftain of the brain-science clan and one of its more-responsible members, although even his writing comes with a glaze of neuron-determinism. There is also Jonah Lehrer, a young polymath who in 2007 published the well-received and pleasant Proust Was a Neuroscientist, a book that contended (perhaps with some strain) that works by Proust, Cézanne, and Stravinsky foretold modern brain-science discoveries (Proust in this telling is said to have anticipated, with his famous madeleine, smell’s direct connection to the hippocampus).1  But for every brain-science writer like Sacks and Lehrer — sharp, stylish, generally measured — there are five others who lack eloquence, insight, and judiciousness.

Two new authors have bounded into the clutter: Joshua Foer, with his book Moonwalking With Einstein, and Brian Christian, with The Most Human Human. Both books concern the mind. Both pull intermittently from Malcolm Gladwell’s dog-eared playbook (tell a shocking anecdote, then explain it with brain science) but, when they do, keep the trick subordinate to a broader inquiry. For all their books’ differences, Foer and Christian engage with science to seek instruction, not causation. They keep their considerations of humanity broad. The brain matters to them, but the lessons of literature, philosophy, and history matter, too.

Moonwalking With Einstein is about memory. The story starts with Foer, on a whim, attempting to identify the world’s smartest person, undertaking some Google research to that end, and discovering Ben Pridmore, the reigning world memory champion, who is able to “memorize the precise order of 1,528 random digits in an hour” and of a shuffled deck of cards in 32 seconds. Pridmore knows 50,000 digits of pi. Foer is astounded by this. He considers how having such a memory would make his own life “qualitatively different — and better.” Reading would be richer, the lessons of books retained; navigating parties would be a cinch, names lined up in the mind and ready for recall. And he could finally remember, reliably, where he had parked. Especially intriguing to Foer are two sentences Pridmore gave to a newspaper reporter: “It’s all about technique and understanding how the memory works. Anyone could do it, really.”

Two weeks later, Foer is in New York City, covering the 2005 U.S. Memory Championship for Slate magazine. He meets Ed Cooke, a 24-year-old memory grand master from England, and the two hit it off. After the competition, hours of discussion, and several rounds of beer at a local bar, Ed offers to instruct Foer in the techniques of memory, to be his “memory coach.” Foer accepts, and a year later he wins the American memory championship.

How that triumph happened is a remarkable tale and the backbone of Moonwalking With Einstein (the title refers to a mnemonic Foer used in competition). Much of the book is not about Foer, though, but the world of memory and its peculiar characters. We are introduced to Solomon Shereshevsky, a Russian journalist who in May 1928, after being reproached by his editor for failing to take notes at a meeting, sought out the neuropsychologist Alexander Luria. Shereshevsky arrived at Luria’s door bewildered, having forgone note-taking at meetings not because he was lazy or impudent but because he always remembered everything said at them, word for word, and couldn’t conceive that others didn’t. He asked Luria to administer some memory tests, and Luria did so, first asking Shereshevsky (known in the scientific literature as “S”) to memorize a list of numbers, and then listening “in amazement as his shy subject recited back seventy digits, first forward and then backward.” Test after test returned the same result: “The man was unstumpable.” Luria finally realized that he would not be able to perform what, he later said, “one would think was the simplest task a psychologist can do: measure the capacity of an individual’s memory.”

S had a condition called synesthesia; stimulation of one of his senses produced a reaction in all the others. For S, voices were colors and textures, and words immediately became vibrant images, or tastes, or smells. His memories were not abstract. Rather, they were vivid graphic images, ordered in relation to each other. “The nonlinear associative nature of our brains makes it impossible for us to consciously search our memories in an orderly way,” Foer writes. S’s memories, by contrast, were “as regimentally ordered as a card catalog,” each memory sensation (a picture, a hue, a sound, a flavor, or an odor) assigned a precise location in his mental world. When he wanted to retrieve a memory, he simply went to the place where he had stored it.

What S was doing, albeit unknowingly, was using an ancient recall technique called the memory palace, in which ideas are arrayed in the mind throughout a well-known space (a childhood home, say, or even a well-trodden route to work) and can then be retrieved later.2  The human brain, Foer explains, has evolved to be quite proficient at remembering spaces. Spend five minutes at a party walking through a stranger’s home, and you’ll naturally form a mental map of the place — where the kitchen is, how to get to the bathroom, where you left your wife. Our brains are far less adept at remembering information like phone numbers, names, and world capitals, and so the memory palace allows us to do what S did naturally: convert nonvisual data into visual images, and place those images in precise spots where we can later find them. “The idea,” Foer writes, “is to create a space in the mind’s eye, a place you know well and can easily visualize, and then populate that imagined place with images representing whatever you want to remember.”  
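For the programming-minded, the technique can be thought of as a minimal data structure. Here is a sketch in Python: the route through a familiar space is an ordered list of loci, and each item to be remembered is assigned a spot along it. The route and the items below are invented examples, not Foer’s.

```python
# A memory palace as a toy data structure: an ordered walk through a
# familiar space, with one vivid image stored at each locus.
# The route and the stored items below are invented examples.

loci = ["front door", "hallway mirror", "kitchen table", "staircase"]

def build_palace(route, items):
    """Assign each item a fixed spot along the route, in walking order."""
    return dict(zip(route, items))

def recall(palace, route):
    """Retrieve items not by searching but by walking the route in order."""
    return [palace[locus] for locus in route if locus in palace]

errands = ["milk", "eggs", "bread"]
palace = build_palace(loci, errands)
print(recall(palace, loci))  # ['milk', 'eggs', 'bread']
```

The point the structure makes is Foer’s: retrieval is not a search but a walk, in order, through known places.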

Creation of the memory palace is credited to the Greek poet Simonides, who lived in the fifth century B.C., but the technique was first described, so far as we know, in a Latin textbook called the Rhetorica ad Herennium, authored sometime between 86 and 82 B.C. Though no earlier writing makes mention of it, the memory palace was evidently in use in the 400-odd years between Simonides’s time and the Rhetorica’s publication. Cicero, for one, thought the palace techniques so well-known that “he felt he didn’t need to waste ink describing them in detail.” Memory training was central to classical education. “In a world with few books, memory was sacrosanct,” and in the pages of extant books people with extraordinary recall abilities were exalted. Indeed, Foer writes that besides their goodness, “the single most common theme in the lives of the saints . . . is their often extraordinary memories.”

Today, of course, memory is no longer so revered or commonly excellent. What happened? Technology, specifically writing. Foer quotes from Plato’s Phaedrus, in which Socrates tells how the Egyptian king Thamus rebuffed Theuth, the god who created writing and offered to bestow it on the land, for fear the invention would atrophy the people’s minds: “They will cease to exercise their memory and become forgetful,” says Thamus, and they will “rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks.” This is precisely what came to pass.

In a way, then, the memory competitors that Foer meets are in the business of reviving a lost craft. “This book is our bible,” Ed tells Foer about the Rhetorica, before also assigning him excerpts of Quintilian’s Institutio Oratoria and Cicero’s De Oratore and a ream of other musings on memory from Thomas Aquinas, Albertus Magnus, Hugh of St. Victor, and Peter of Ravenna. It is through these documents that Foer treads as he undertakes his own memory training. For half an hour each morning, and a few minutes in the afternoon, he practices, memorizing lists of words and numbers. He installs numbers in his memory palace by using the “Major System,” invented in 1648, which provides a simple code for converting numbers into letters. The number 34, for example, translates as mr. Vowels can be freely interspersed, so 34 might find a place in the memory palace as an image of the Russian space station Mir. For long strings of numbers, Foer learns more-advanced PAO systems, in which each two-digit number is associated with a different person-action-object image; 34 could be Barack Obama speaking at a lectern (or, more memorably, Obama dancing merengue with a toad). A six-digit number can then be turned into a single picture by mashing together the person of the first two-digit pair, the action of the second, and the object of the third. Each number between 0 and 999,999 thus gets its own unique image.
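For readers who want the mechanics spelled out, a minimal sketch in Python of the Major System’s digit-to-consonant code and a toy PAO lookup follows. The digit mapping is the standard modern form of the system; the PAO table entries are invented illustrations, not Foer’s own images.

```python
# The Major System's digit-to-consonant code (standard modern form),
# plus a toy PAO table. The image choices are invented illustrations.

MAJOR = {
    "0": "s", "1": "t", "2": "n", "3": "m", "4": "r",
    "5": "l", "6": "j", "7": "k", "8": "f", "9": "p",
}

def major_consonants(number: str) -> str:
    """Turn a digit string into its consonant skeleton; vowels are then
    interspersed freely to form a memorable word: 34 -> 'mr' -> 'Mir'."""
    return "".join(MAJOR[d] for d in number)

# In a full PAO system every two-digit number from 00 to 99 gets a
# person, an action, and an object; two hypothetical entries shown.
PAO = {
    "34": ("Barack Obama", "dancing merengue with", "a toad"),
    "16": ("Albert Einstein", "moonwalking past", "a blackboard"),
}

def pao_image(six_digits: str) -> str:
    """Fuse a six-digit number into one composite image: the person of
    the first two-digit pair, the action of the second, the object of
    the third."""
    a, b, c = six_digits[:2], six_digits[2:4], six_digits[4:]
    return f"{PAO[a][0]} {PAO[b][1]} {PAO[c][2]}"

print(major_consonants("34"))  # mr
print(pao_image("341634"))     # Barack Obama moonwalking past a toad
```

The uniqueness claim is simple arithmetic: 100 persons, 100 actions, and 100 objects combine into 1,000,000 distinct images, one for each number from 0 through 999,999.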

After a few months of such training Foer hits a plateau: his playing-card memorization time will not drop. Again, he looks to the past for answers. Help comes from the 1960s literature on speed-typing, from which he learns about the so-called “autonomous stage.” When a person, after practicing a given task, figures he has gotten pretty good at it, “the parts of the brain responsible for conscious reasoning become less active.” The person begins conducting the task on “autopilot.” The key, Foer learns, is to push through this wall by “consciously keeping out of the autonomous stage” while focusing on technique, remaining “goal-oriented,” and “getting constant and immediate feedback” on performance. With speed-typing, that means typing just a bit faster than you’re comfortable with, pushing yourself to continue hitting the keys even as the errors pile up. Gradually, with focus, you’ll make fewer and fewer mistakes, even as your typing time drops. It’s much the same with memory training: Foer pushes himself to go faster and faster, and conscientiously marks his progress. “This is what differentiates the top memorizers from the second tier,” he writes. The best memorizers “approach memorization like a science” and “develop hypotheses about their limitations; they conduct experiments and track data.” “If I would have any chance at catapulting myself to the top tier of the competitive memory circuit,” Foer writes, “my practice would have to be focused and deliberate.” Foer focuses. He creates a spreadsheet to monitor his practice time and results, he graphs everything, and he keeps a journal. He buys a pair of industrial-grade earmuffs. His time starts to fall.

Moonwalking with Einstein is a book that could easily have followed the brain-science lodestar. Instead, Foer takes a smarter tack, following a multidisciplinary guide that does not place all its wisdom in sparking synapses (he even ponders whether all his memorizing isn’t just a massive waste of time). Christian attempts something similar with The Most Human Human.

In February 2011, Ken Jennings, the legendary Jeopardy! contestant who won 74 straight games, was soundly beaten on the show by Watson, a room-sized computer created by I.B.M. Facing certain defeat, Jennings jokingly scrawled on his video screen, “I, for one, welcome our new computer overlords.” Humor aside, the questions raised by such artificial-intelligence successes are real. What, if anything, did Watson’s victory prove? Did the computer “think”? Did it “act human”? What, after all, does doing either of those things really entail?

These are among the questions that Christian undertakes to answer. His book’s scaffolding, on which its many digressions hang, is his participation in the annual Loebner Prize, a contest in which a group of humans (“judges”) engages in a battery of instant-message conversations. In some of those conversations, the judges’ interlocutors are flesh-and-blood people (“confederates”); in others, they are sophisticated artificial-intelligence (AI) computer programs. For each conversation, a judge decides whether he was speaking to a human or a computer, and he rates his confidence in his determination on a 1–10 scale. Scores are tabulated, and the computer program that tricks the most judges into believing it human wins the coveted “Most Human Computer” award and all the money and prestige that accompany it. Christian is after a different honor, though: The confederate who convinces the most judges of his humanity receives the title “Most Human Human.”
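As a rough illustration of the tabulation, consider a toy sketch in Python. The records and the counting rule are assumptions extrapolated from the review’s description (the confidence ratings are recorded but unused in this simple version), not the contest’s official scoring.

```python
from collections import Counter

# Toy verdict records: (judge, interlocutor, kind, judged_human, confidence).
# Names, numbers, and the counting rule are assumptions drawn from the
# review's description, not the Loebner Prize's official scoring.
verdicts = [
    ("judge1", "Cleverbot", "computer", True,  6),
    ("judge2", "Cleverbot", "computer", False, 8),
    ("judge1", "Brian",     "human",    True,  9),
    ("judge2", "Brian",     "human",    True,  7),
]

def most_human(records, kind):
    """Tally, for interlocutors of the given kind, how many judges
    labeled them human; the top scorer takes the 'Most Human' title."""
    tally = Counter(name for _, name, k, judged_human, _ in records
                    if k == kind and judged_human)
    return tally.most_common(1)[0] if tally else None

print(most_human(verdicts, "computer"))  # ('Cleverbot', 1): Most Human Computer
print(most_human(verdicts, "human"))     # ('Brian', 2): Most Human Human
```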

The Loebner Prize launched in 1991, but the concept behind it dates to 1950, when it was proposed by British mathematician Alan Turing. Turing predicted that by 2000, five minutes of conversation would allow a computer to fool 30 percent of judges into believing it human. So far, no program has managed to hit this 30 percent mark, although in 2008 the top computer missed it by a single vote. The Turing test takes on significant meaning for Christian, who believes that “at bottom” it is “about the act of communication.” Its “deepest questions” are “practical ones” about connecting “meaningfully with each other” “within the limits of language and time.” He asks other questions, too: “How does empathy work? What is the process by which someone comes into our life and comes to mean something to us?” Such inquiries are the test’s fundamental ones, “the most central questions of being human.” “In a sense,” he continues, “this is a book about artificial intelligence, the story of its history and of my own personal involvement, in my own small way, in that history. But at its core, it’s a book about living life.”

In The Most Human Human, as in Moonwalking with Einstein, the author achieves his goal; in the end, Christian walks away from the Loebner Prize having pocketed the quintessentially human designation. He describes his preparation for the contest, which consists not of regimented practice with decks of cards and sheets filled with random numbers but of research and rumination on the nature of humanity. Brain-science nuggets are interspersed with the author’s other musings, on subjects vast and varied, this being, after all, “a book about living life.” First up is a disquisition on what Christian calls “authenticating”; he moves from telling a story about a man with phonagnosia (the man cannot recognize voices, even his mother’s) to relaying the history of speed dating to dissecting the structure of a particular AI program called Cleverbot. Then it’s on to a consideration of machine translation of literature, a story about an ultimately successful call to a cellular phone company’s customer service department, four paragraphs about 50 First Dates (the 2004 comedy starring Adam Sandler and Drew Barrymore), and a tale about a 1989 “chatbot” program that argued for an hour and a half with an unsuspecting human. Sprinkled on top are bits of dialogue from Sex and the City, short anecdotes involving the author’s friends, and Nietzsche quotations. All of this is in the first chapter after the introduction; subsequent chapters are similarly disjointed.

Ingested as single-serving snacks, Christian’s brief commentaries are enjoyable, but taken together as a meal, in chapter or book form, they become a muddle. The book doesn’t have a narrative, and it needs one.

A reader can understand what Christian is attempting here, a winding exploration of what it is to be human; can appreciate his curiosity and perspicacity, which are frequently evident; and still feel lost pushing through paragraphs with such subheads as “Intimacy: Form & Content” and “Suspicion; Roulette; Purée.” Occasionally a chapter coheres — “Getting Out of Book,” for example, is an impressive, incisive 30 pages about chess and grandmaster Garry Kasparov’s 1997 loss to I.B.M.’s Deep Blue computer — but more often it doesn’t, a victim of ambition. Unlike Foer’s memory competition, never absent from his book’s pages, the Turing test is frequently forgotten in Christian’s ramblings. It is referenced haphazardly, laboriously; one pictures the author, suddenly aware that he has wandered too far from the road, scrambling to get back on track. Add to the reader’s disorientation the annoyance of being bombarded from The Most Human Human’s pages with italicized words, not infrequently five or more of them shot from a single short paragraph. Amid the withering barrage, one perceives that the author is perhaps too fond of his own cleverness, his ability to reveal hidden truths: “I don’t want life to be solved; I don’t want it to be solvable,” Christian writes, and quickly follows with, “The reason to wake up in the morning is not the similarity between today and all other days, but the difference.” This is very irritating.

But to his credit, Christian, like Foer, eschews brain-science’s lowest-hanging fruit. He has written a book about humanity that looks to the vaunted (great literature and art) as well as the common (50 First Dates), the old (Sun Tzu) as well as the new (neuro-linguistic programming), and attempts to deglaze some essence from it all. But he includes too many ingredients. The author loses track of them, and so do we.

1 Two thousand seven was a good year for Proust and brain science: In addition to Lehrer’s book there was Proust and the Squid: The Story and Science of the Reading Brain, by Maryanne Wolf, a Tufts professor of child development. Proust and the Squid is a chronicle of how the human brain evolved to learn to read, and of how the development of this modern reading brain has in turn affected human development generally.

2 The late Tony Judt, for instance, used this technique when immobilized by amyotrophic lateral sclerosis to sort out, in wakeful nights, the memories that became the essays that became his recently published book The Memory Chalet.
