
The human race on a key ring


Richard Dawkins observed in The Selfish Gene that from the point of view of a gene, a living body is merely a carrier to transport it into the future. I believe we are now entering the century of the Selfish Mind. Man has always been a creature restlessly seeking a reality beyond himself. We cannot know what a chimpanzee thinks about when he gazes at the stars, or what ideas a dolphin has about air. But we know what we think, and we have traveled so high in the atmosphere that we cannot breathe, and then beyond. We have placed humans on the Moon, sent our devices to other planets, and our signals are reaching out to the universe, not to be received until after our extinction, if then.

The earliest hominids must have had complex ideas, but they were trapped inside their minds. Out of the desire to share those ideas with other minds, they devised symbols, sounds and speech. I see you, I see this, I think this, I want to tell you. Many species make sounds--at first to warn or to frighten, then to express more complex needs. We don't know if speech itself was a goal, because we cannot be sure if they had a conception of what that might be. But from its first "words," mankind found itself driven to improve and extend its self-expression. I will not rehearse here the forms that drive has taken. A short list will do: Symbols, drawings, signs, writing, printing, analog information, now digital information. The storage, manipulation and transmission of digital data was a threshold step as crucial as the bone used as a tool in "2001." The bone became our key to the physical domain. The transistor chip became our key to the digital domain.

We have an urgent need, whether innate or evolved, to communicate as quickly and easily as possible outside our own minds. We can only shout so loud. Then come drums, beacons, messengers, mirrors, flags, the telegraph, radio, television, computers, the internet, the web, and now the time of PDAs. In the earliest days of the web, people created personal web pages to extend themselves into cyberspace. Then web sites invited users to have web pages without knowing much about HTML. With the advent of cell phones, the Web came into our portable possession. Then came texting and its simplified offspring Twitter. All of this involved the communication of information, otherwise known as Talking. That you already know.

Nicky and Paris at Fashion Week

What interests me is the sight of our grandchildren in the presence of other people who are actually there, glancing down at an iPhone cupped in their palms like gamblers checking their cash. Then texting with their hands under the surface of the dinner table, as if sliding an ace up a sleeve. Sometimes receiving a message as surprising as a Royal Flush. OMG!

I thought at first this compulsive desire was centered in teenagers. Then I began to observe it among editors, lawyers, cops, waiters, sports fans, construction workers, people in restaurants, even people watching movies. During President Obama's recent address to a joint session of Congress, a good many members of his audience could be observed twittering. This is as childish as throwing paper airplanes in class.

Members of both parties were observed. Rep. Joe Barton (R-Tex.) twittered: "Aggie basketball game is about to start on espn2 for those of you that aren't going to bother watching pelosi smirk for the next hour." A few minutes later his Twitter friends read: "Disregard that last Tweet from a staffer." How Barton, still sitting on the floor of Congress, pried the Blackberry from his staffer's cold, dead hands, I leave for you to imagine. Or did the staffer send both messages, the second one after Obama said it was time for us to focus on the future? Celebrities are already hiring ghost writers. OMG! Here I am in Paris's favorite boutique, at 4747 Sunset, open 12-9. OMG! I meant to text *my* favorite boutique!

Never mind. The point is that we are becoming a nation of twits. In the old days, when Mike Royko was stuck for a column subject, he'd call up the Chicago Daily News man at Police Headquarters or City Hall and ask what was happening. These days columnists seek to amaze us with their day's adventure on Twitter. OMG squared!

Friends and/or avatars

I will never become a Twit. I apparently have dozens or hundreds of "friends" on Facebook; the problem is, the account under my name is a fraud. But this is not the place to deplore Twitter or Facebook. They are facts of life. I am told I should have accounts on both: They will promote my reviews, let people feel more involved in my life, and make it easy for me to contact them. After nearly 2 million comments on my blog and 9,943 messages in my current "sent mail" file, that's just what I need. More friends.

I agree that many people find such sites useful. Cell phones have become an extension of the human ear. It is commonplace to find yourself standing next to people who are talking audibly, even loudly. They're hearing voices in their heads, but are not schizophrenic. What they are is elsewhere. I know how this feels. I am not at home right now. I am at an indeterminate place in my mind, talking with you. Perhaps you will respond. It has gotten to the point where I conceive of the message threads on my blog as places. I wrote an entry about my custom of "touching the bases" in favorite cities--walking around to special places where I confirm that they, and by implication myself, endure. Now I also touch bases on the internet.

Is this a bad thing? I've grown accustomed to it. Of course it would be good for me to spend more time in human conversation, but since the power of speech has been taken from me, the internet allows me a form of communication that has expanded to fill some of the silence. I suspect for those who have the power of speech, texting provides a place of silence, and in some ways can be more satisfying than speech. Teenagers once were famous for shutting the bedroom door and talking with friends for hours on the phone. They still spend time on the phone, but some of them may spend more time texting. Instead of telling one friend "I think he's cool," they reach dozens: "Who thinks he's cool?" The answers twitter in. Me. Not me. No way. This is not conversation, but it is contact. I am here. I am me. We are joined in a web. We keep the matrix afloat. At 3 a.m.: Anybody awake? Me. Me too. Me too.

Pulp fiction

Arthur C. Clarke wrote a novel titled Against the Fall of Night in 1948. A revised version was published in 1956 as The City and the Stars. It envisioned the earth a billion years from now, when the surviving humans occupy a domed city maintained by a vast computer. They occupy bodies, but their minds are stored in the computer, which activates them in rotation, providing a solution to the problem of population growth within a closed system. If Moore's Law holds even to a small extent, in a billion years there will be enough memory to store countless human memories, and no memory will be permanently annihilated. The inhabitants of the City have vicarious experiences supplied by the computer; they never leave the dome, and most don't believe leaving is possible. The environment outside will not support life. But there is one man...

Yes, but never mind Alvin. What intrigues me is the notion of the minds stored in the machine. Clarke was always on top of computing. When post-polio syndrome struck him, he used the net to get around. We exchanged a lot of emails, not simply because HAL 9000 and I shared the same home town, but because he would drop a line when he wanted to talk about movies. In 1997, when streaming video was relatively new, he was already prepared to talk with us live online at HAL's birthday party, Cyberfest.

I'm sure that, had he lived a few more years, Clarke would have observed people deeply absorbed in Twittering and realized that mentally they were in another place. Their environment at that moment has no physical existence, but it is real enough to them. Clarke might have been reminded of his human race under a dome, its inhabitants living virtual dreams. His science fiction was often uncannily predictive. In an AT&T videophone interview we did around his 80th birthday, he spoke of the computers named "Soul-Catchers" in his novel 3001. Those would be computers fast enough, and with enough memory, that an entire mind could be downloaded into one.

Alone in a crowd

The interaction of mind and computer has been more than merely intellectual for some time. Mental signals can activate nerve endings which operate prosthetic limbs. I'm interested in an invention called The Audeo, intended to help people without speech who still have their vocal centers intact, as I do. This device picks up the tiny signals the brain sends to instruct those centers to speak, and translates them via laptop into instructions for operating a wheelchair or other tools. It also allows people to "think" words and have them emerge as computerized speech. The Audeo's developer, the Ambient Corp. of Urbana-Champaign, has discussed analyzing many samples of my recorded voice and building a synthesized voice that will emerge sounding something like me.

But let's return to the idea of the dreamers in Clarke's city under a dome, and the Soul-Catchers which might contain an entire human mind, memory and personality. That vision is already in its earliest days. This generation is growing into symbiosis with cyberspace. In February 2009, MIT unveiled a wearable device that can project a virtual touch screen on a surface you are facing and allow you to call up and manipulate computer data and information. The surface could be a wall. It could be the shirt of the person you are talking to. We will see two people having an argument while waving their hands strangely in the air, just like Tom Cruise in "Minority Report." The elements for the prototype are available off the shelf, and cost $350. This blog has tech-heads among its readers, and I expect some of you to assemble this device at home. Andy Ihnatko, are you there?

The brain transmits tiny electrical signals. Eventually Twits will be able to twitter mentally, eliminating the Blackberry as a middleman. If a memory chip can one day be implanted in a human brain, a human could find himself occupying a new body. Where will the body be found? Why, at a clone farm, of course; they've already been invented in science fiction. Your body could be cloned and implanted with your mind, and you would be Benjamin Button, now living toward the future.

Alvin and the chip monks

But...but...what of the rights of that clone to grow up on its own and form its own memory and personality? Wouldn't this process be akin to abortion? And what would happen if you lingered on after your memory had been copied, and encountered yourself in an infant body? Would you make goo-goo sounds and funny faces at that helpless creature? Never mind. Perhaps totally brainless bodies could be produced. Blank slates. What if one of them had a glitch and started to think for itself? How would it feel, facing the prospect of being replaced by you?

I think one lifetime will continue to be enough for most people. But we are approaching an age when humans will be entwined in a global mind, one with online translation. Will this be a good thing? Will kids still grow up playing in the sunshine, gazing at the stars and asking their dog "Who's a good dog?" Or will they receive twitters: The stars aren't out yet where I am, dude.

There will be no end of problems for the human race as it merges with memory chips. Back in 1997, Sir Arthur touched on this in our interview.

Ebert: If my personality, emotions and spirit are stored in a Soul-Catcher, will it be aware of that, or will it think it's really me?

Clarke: Again there you have a question of identity. If you make many copies they could all claim to be you, and that's a game we use in science fiction. A friend of mine, Bill Temple, wrote a story called "The Four-Sided Triangle" about two men in love with the same girl. Of course they make a duplicate of her and that only makes things worse.

MIT's wearable $350 Sixth Sense

A demonstration of The Audeo at the University of Illinois

How wrong could I be?

Arthur C. Clarke on HAL 9000 and the immortality of memory

NPR's Peter Sagal: Remarks from an old fanboy about all these kids nowadays

