Mania/Suicide Attempt Prozac 08/07/2002 California College Professor Becomes Suicidal & Manic Summary:

Wallace was depressed & stressed because his grant applications were being rejected, so he started on Prozac. Here is what happened to him as he continued on the Prozac. [Paragraph eleven].

"As Wallace's work progressed, though, his mental illness grew worse, making him both depressed and occasionally grandiose. He went on strike in class, refusing to grade his students' papers and instead awarding them all A's. He fired off acid e-mail messages dismissing colleagues as sellouts. When Wallace climbed out the window of his 16th-floor apartment and threatened to jump, his girlfriend pulled him back and took him down to N.Y.U.'s psychiatric department, where doctors told him he had bipolar disorder".

_________________________________________________________________________

Approximating Life
New York Times

By CLIVE THOMPSON



''It's a good thing you didn't see me this morning,'' Richard Wallace warns me as he bites into his hamburger. We're sitting in a sports bar near his home in San Francisco, and I can barely hear his soft, husky voice over the jukebox. He wipes his lips clean of ketchup and grins awkwardly. ''Or you'd have seen my backup personality.''

The backup personality: that's Wallace's code name for his manic depression. To keep it in check, he downs a daily cocktail of psychoactive drugs, including Topamax, an anti-epileptic that acts as a mood stabilizer, and Prozac. Marijuana, too -- most afternoons, he'll roll about four or five joints the size of his index finger. The medications work pretty well, but some crisis always comes along to bring the backup personality to the front. This morning, a collection agency for Wallace's college loans wrote to say they'd begun docking $235 from the monthly disability checks he started getting from the government last year, when bipolar disorder was diagnosed. Oh, God, it's happening again, he panicked: His former employers -- the ones who had fired him from a string of universities and colleges -- would be cackling at his misfortune, happy they'd driven him out. Wallace, 41, had raged around the cramped apartment he shares with his wife and son, strewn with computer-science texts and action-doll figurines.

''Stuff like that really makes me insane, when I start thinking about my friends who are at Berkeley or Carnegie-Mellon with tenure and sabbaticals and promotions,'' he says, staring down at his plate. He looks awkward, as if he's borrowing someone else's body -- shifting his stocky frame in his chair, all rumpled jeans and unruly eyebrows. ''It's like I can't even talk to those people anymore. I live on a different planet.'' In June, after I visited him, his alienation from the academic establishment became more dramatic still: a former colleague, claiming Wallace had threatened him, took out a restraining order that prevents him from setting foot on the grounds of the University of California at Berkeley.

When he can't get along with the real world, Wallace goes back to the only thing he has left: his computer. Each morning, he wakes before dawn and watches conversations stream by on his screen. Thousands of people flock to his Web site every day from all over the world to talk to his creation, a robot called Alice. It is the best artificial-intelligence program on the planet, a program so eerily human that some mistake it for a real person. As Wallace listens in, they confess intimate details about their lives, their dreams; they talk to Wallace's computer about God, their jobs, Britney Spears.

It is a strange kind of success: Wallace has created an artificial life form that gets along with people better than he does.

Richard Wallace never really fit in to begin with. His father was a traveling salesman, and Richard was the only one of his siblings to go to college. Like many nerds, he wanted mostly to be left alone to research his passion, ''robot minimalism'' -- machines that require only a few simple rules to make complex movements, like steering around a crowded room. Simple, he felt, worked. He lived by the same ascetic code, scorning professors who got rich by patenting work they'd developed on government grants. ''Corporate welfare,'' he sniffed.

By 1992, Wallace's reputation was so strong that New York University recruited him to join the faculty. His main project, begun in December 1993, was a robot eye attached to the Internet, which visitors from afar could control. It was one of the first-ever Webcams, and Wallace figured that pioneering such a novel use of the Internet would impress his tenure committee. It didn't, and Wallace grew increasingly depressed as his grant applications were rejected one by one. At one point, a colleague found him quietly weeping at his desk, unable to talk. ''I had no clue what the rules were, what the game even was -- or that there was even a game,'' Wallace recalls. He started taking Prozac. How did all these successful senior professors do it, anyway?

One day he checked into his Webcam and noticed something strange: people were reacting to the robot eye in an oddly emotional way. It was designed so that remote viewers could type in commands like ''tilt up'' or ''pan left,'' directing the eye to poke around Wallace's lab. Occasionally it would break down, and to Wallace's amusement, people would snap at it as if it were real: ''You're stupid,'' they'd type. It gave him an idea: What if it could talk back?

Like all computer scientists, Wallace knew about a famous ''chat-bot'' experiment called Eliza. Back in 1966, an M.I.T. professor, Joseph Weizenbaum, created Eliza as a ''virtual therapist'' -- it would take a user's statement and turn it around as a question, emulating a psychiatrist's often-maddening circularity. (You: ''I'm mad at my mother.'' Eliza: ''Why are you mad at your mother?'') Eliza was quickly abandoned as a joke, even by its creator. It wasn't what scientists call ''strong'' A.I. -- able to learn on its own. It could only parrot lines Weizenbaum had fed it.
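Eliza's ''turn it around'' trick can be sketched in a few lines. The following is an illustrative toy, not Weizenbaum's actual script: a single hypothetical pattern that matches a first-person statement, reflects the pronouns, and echoes it back as a question, reproducing the mother example above.

```python
import re

# Pronoun reflections: first person -> second person.
# This tiny table is an illustrative assumption, not Eliza's real rule set.
REFLECTIONS = {"i'm": "you are", "i": "you", "my": "your", "me": "you"}

def reflect(fragment):
    # Swap each first-person word for its second-person counterpart.
    words = fragment.lower().split()
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def eliza_respond(statement):
    # One hypothetical rule: "I'm X" becomes "Why are you X?"
    match = re.match(r"i'?m (.*)", statement.lower().rstrip("."))
    if match:
        return "Why are you " + reflect(match.group(1)) + "?"
    # Fallback when no pattern matches, in the spirit of Eliza's stock prompts.
    return "Please tell me more."

print(eliza_respond("I'm mad at my mother."))  # Why are you mad at your mother?
```

The real Eliza worked from a larger script of keyword-ranked patterns, but the core mechanism was exactly this kind of shallow match-and-reflect, which is why it could only parrot what it had been fed.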

But Wallace was drawn to Eliza's simplicity. As a professor, he often felt like an Eliza-bot himself -- numbly repeating the same lessons to students over and over again, or writing the same monotonous descriptions of his work on endless, dead-end grant-application forms. He decided to create an updated version of Eliza and imbue it with his own personality -- something that could fire back witty repartee when users became irritable.

As Wallace's work progressed, though, his mental illness grew worse, making him both depressed and occasionally grandiose. He went on strike in class, refusing to grade his students' papers and instead awarding them all A's. He fired off acid e-mail messages dismissing colleagues as sellouts. When Wallace climbed out the window of his 16th-floor apartment and threatened to jump, his girlfriend pulled him back and took him down to N.Y.U.'s psychiatric department, where doctors told him he had bipolar disorder. Wallace resisted the diagnosis -- after all, didn't every computer scientist cycle through 72-hour sprees of creativity and then crash? ''I was in denial myself,'' he says now. '''I'm a successful professor, making $100,000 a year! I'm not one of those mental patients!'''

His supervisors disagreed. In April 1995, N.Y.U. told him his contract wouldn't be renewed.


Alice came to life on Nov. 23, 1995. That fall, Wallace relocated to Lehigh College in Pennsylvania, hired again for his expertise in robotics. He installed his chat program on a Web server, then sat back to watch, wondering what people would say to it.