Monday, December 21, 2009

AVATAR and the Cult of Personality

For the unfamiliar, a "reveal" in screenwriting parlance is the placement of key, revelatory information in a story. Most times, the last reveal is the most important revelation of all.

FADE IN:

Prompted by James Cameron’s new film, Avatar, I wanted to raise some issues I sense may be on the horizon: first, the potential for creativity in Artificial Intelligence; and second, the commoditization of personality. The digital special effects and the non-human characters in Cameron’s film have both reached new levels of realism for the movies. While they are thrilling in Avatar, and the movie is a very entertaining piece of work, they imply things that may alarm some people. To wit:

“Your Services Are No Longer Needed”

Some developments at IBM have got me wondering how long it will be before the above headline essentially comes true for writers.

Consider this blog post by Larry Dignan from Nov. 18, 2009:


IBM said Wednesday that it is making progress toward cooking up a computer system that emulates the human brain and simulates abilities for sensation, perception, interaction and cognition. The end goal: Create a computing system that thinks like the human brain.

In addition, IBM announced that this “large-scale cortical simulation” and the algorithm behind it rivals the brain’s power, energy consumption and size.

A new algorithm, dubbed Blue Matter, was developed with Stanford University and maps and measures all the connections in a brain. Blue Matter rides on IBM’s Blue Gene supercomputing architecture. There’s still some work to do though: IBM’s system thus far can emulate the brain of a cat, but that’s progress over previous efforts.

 Dharmendra Modha, manager of cognitive computing for IBM Research, said in a blog post:


The brain is fundamentally different from and complementary to today’s computers. The brain can exhibit awe-inspiring function of sensation, perception, action, interaction, and cognition. It can deal with ambiguity and interact with real-world, complex environments in a context-dependent fashion. And yet, it consumes less power than a light bulb and occupies less space than a 2-liter bottle of soda.


Our long-term mission is to discover and demonstrate the algorithms of the brain and deliver cool, compact cognitive computers that complement today’s von Neumann computers and approach mammalian-scale intelligence. We are pursuing a combination of computational neuroscience, supercomputing, and nanotechnology to achieve this vision….


Cognitive computing seeks to engineer the mind by reverse engineering the brain. The mind arises from the brain, which is made up of billions of neurons that are linked by an internet-like network. An emerging discipline, cognitive computing is about building the mind by understanding the brain. It synthesizes neuroscience, computer science, psychology, philosophy, and mathematics to understand and mechanize the mental processes.


Cognitive computing will lead to a universal computing platform that can handle a wide variety of spatio-temporally varying sensor streams.


IBM’s aim is to figure out how to build a cognitive computing chip and “explore the computational dynamics of the brain.” There are a bevy of resources available for a deeper dive, including a paper on the process and background on BlueMatter.
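For readers who want a concrete feel for what "simulating neurons" actually means, here is a toy sketch in Python. To be clear, this is my own illustration, not IBM's method: Blue Matter maps real anatomical wiring and runs on Blue Gene hardware, while this just pushes spikes around a hundred randomly connected "leaky integrate-and-fire" units.

```python
# A toy illustration only (my own, not IBM's method): a handful of leaky
# integrate-and-fire neurons passing spikes over random connections.
import random

NUM_NEURONS = 100      # a cat cortex has roughly 760 million neurons; a human's, ~16 billion
THRESHOLD = 1.0        # membrane potential at which a neuron fires
LEAK = 0.9             # fraction of potential that survives each time step
WEIGHT = 0.12          # how much one spike raises a downstream neuron's potential
STEPS = 20

random.seed(1)
# Random, sparse, "internet-like" wiring: each neuron feeds five others.
connections = {i: random.sample(range(NUM_NEURONS), 5) for i in range(NUM_NEURONS)}
potential = [0.0] * NUM_NEURONS

for step in range(STEPS):
    # A little random external input, standing in for sensation.
    for i in range(NUM_NEURONS):
        potential[i] += random.uniform(0.0, 0.2)
    fired = [i for i, v in enumerate(potential) if v >= THRESHOLD]
    for i in range(NUM_NEURONS):
        potential[i] = 0.0 if i in fired else potential[i] * LEAK
    for i in fired:
        for j in connections[i]:
            potential[j] += WEIGHT    # deliver the spike downstream
    print(f"step {step:2d}: {len(fired):3d} of {NUM_NEURONS} neurons fired")
```

Scale a loop like that up by seven or eight orders of magnitude, replace the random wiring with connectivity mapped from real brains, and you have the general shape of what IBM is chasing.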

As an observer, I find it hard not to think that these developments are interesting, even if they can only simulate a cat for now. However, if it weren’t for those Terminator movies, I’d be more enthusiastic. Only half kidding there, folks. Something about a computer that thinks like a brain makes me nervous.

When will it be possible to emulate the human brain? Sometime around 2018, assuming Moore’s Law holds.
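Here is the back-of-the-envelope arithmetic behind that guess. The figures are my own round numbers (approximate cortical neuron counts, and Moore's Law read as a doubling of capacity every two years), so treat the result as an order-of-magnitude estimate, not a prediction:

```python
# Rough extrapolation from a cat-scale simulation in 2009 to human scale.
# All numbers are approximate and assumed for illustration.
import math

CAT_CORTICAL_NEURONS = 0.76e9     # roughly 760 million
HUMAN_CORTICAL_NEURONS = 16e9     # roughly 16 billion
DOUBLING_PERIOD_YEARS = 2.0       # Moore's Law, read as "capacity doubles every ~2 years"
BASELINE_YEAR = 2009              # the cat-scale milestone

scale_up = HUMAN_CORTICAL_NEURONS / CAT_CORTICAL_NEURONS   # about 21x
doublings = math.log2(scale_up)                            # about 4.4 doublings
target_year = BASELINE_YEAR + doublings * DOUBLING_PERIOD_YEARS

print(f"Need ~{scale_up:.0f}x the capacity, ~{doublings:.1f} doublings: around {target_year:.0f}")
```

Run it and the answer lands right around 2018.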


For all those folks out there who pooh-pooh the idea that creative functions can ever be done by artificial intelligence, I would ask: where’s your data? To me, that thinking stems from an assumption that creativity is some kind of voodoo magic that can only spring from human brains. And I would wager a large percentage of such thinkers turn right around and wave off people who espouse religious views of The Creation as nothing more than a Western version of the same voodoo-level thinking. So, I ask again: where’s your data?

Consider the evolutionist view of creation: life was born from chemical processes that came together under favorable but ultimately randomly occurring conditions; through Natural Selection, or survival of the fittest, life evolved up through the stages of development to mammalian primates; primates evolved into us; we are creative. Therefore creativity is the result of sufficiently extended random chance. Enough monkeys typing for a long enough time will type the works of Shakespeare. Computers are the new monkeys. We will eventually reach a point where “those monkeys” will be able to type the yet-to-be-written works of Shakespeare, the ones he didn’t get to yet, on demand.
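Since the monkey image gets thrown around loosely, here is the arithmetic behind it, with numbers I have picked purely for scale:

```python
# The monkeys-at-typewriters arithmetic; every figure here is an assumption for illustration.
import math

phrase = "to be or not to be"
alphabet_size = 27                    # 26 letters plus the space bar
keystrokes_needed = len(phrase)       # 18 keys, each a 1-in-27 blind guess

expected_attempts = alphabet_size ** keystrokes_needed    # ~10^26 full tries on average
monkeys = 1_000_000_000               # a billion tireless monkeys
keys_per_second = 1                   # each typing one key per second

total_keystrokes = expected_attempts * keystrokes_needed
years = total_keystrokes / (monkeys * keys_per_second) / (60 * 60 * 24 * 365)

print(f"Expected attempts: about 10^{math.log10(expected_attempts):.0f}")
print(f"Time for the billion monkeys: about 10^{math.log10(years):.0f} years")
```

Raw chance alone needs geologic time even for one line; the reason the evolution analogy still works is that natural selection keeps the partial successes rather than starting over, and that cumulative trick is exactly the sort of thing an algorithm can run on demand.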

Having studied the creative process a good deal, I can tell you that it is closer to a quantifiable process than some may realize. Once lateral approaches to idea-generation are encoded as protocols and algorithms in A.I., I’m not so sure human writers won’t become scarcer. Consider how many writers Hollywood throws at some projects. And many of the uncredited “script-doctors” brought in are paid far more than the original and/or credited writers. When there’s a dollar to be saved, don’t underestimate big media’s willingness to save it.
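To make "idea-generation encoded as protocols and algorithms" a bit less abstract, here is a deliberately crude sketch. The word lists and the single premise template are my own inventions; a real system would be enormously more sophisticated, but the underlying move, mechanical recombination of story elements, is the same:

```python
# A crude "logline generator": recombine stock story elements at random.
# The lists and the template are invented for illustration only.
import random

protagonists = ["a burned-out detective", "an orphaned coder", "a retired con artist"]
goals = ["clear their name", "find their missing brother", "pull one last job"]
obstacles = ["a city run by machines", "a studio that owns their likeness",
             "a rival who knows their every move"]
twists = ["the hero is an avatar", "the villain wrote the script",
          "none of it was ever random"]

def pitch():
    """Assemble one logline from randomly chosen elements."""
    return (f"{random.choice(protagonists).capitalize()} must "
            f"{random.choice(goals)} while up against "
            f"{random.choice(obstacles)}, only to discover that "
            f"{random.choice(twists)}.")

for _ in range(3):
    print(pitch())
```

Swap the hand-written lists for a model trained on every produced screenplay, add some way of scoring the results, and the distance between this toy and a working script-doctor starts to look like an engineering problem rather than magic.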

Imagine a future when all a writer can sell to Hollywood is a pitch or a synopsis or maybe only a logline! Imagine a time when only the top shows are written by living, breathing professional writers, and when, as soon as the ratings drop below a certain level, A.I. is brought in to finish out the show’s run. I’m not saying the sky is falling, but I’m not so sure tomorrow’s rain won’t be a little yellow, if you catch my drift.

This doesn’t just apply to writers. Parallel to the advances in A.I. are the advances in computing power, and those show no sign of slowing; software is held back only by the hardware it runs on. Artificially generated human avatars are getting closer and closer to being indistinguishable from their human counterparts. Remember Max Headroom way back in the ’80s? Funny-looking, wasn’t he? The next Max Headroom may look like you or, heaven forbid, me. And his best-kept secret may be that he’s artificial!

Imagine a future where actors are replaced by personalities that license themselves as avatars for entirely software-based motion pictures. Imagine new pictures starring favorites like Charlie Chaplin, Humphrey Bogart, Katharine Hepburn, James Dean. How about Heath Ledger? Michael Jackson? Elvis? Think it can’t happen? Ask the estates of those dead stars whether they wouldn’t consider licensing their star’s likeness if the money was right, or even if there was simply money to be had. Jackson owed a lot of people when he died. His new movie won’t come close to settling those debts. Nor will all his new record sales. And even if the debts were gone, would the estate pass up a new cash cow?

The first generation of avatar-viewers may reject the practice. But it really only depends on what you’re used to, and the next generation may not be so discriminating. Remember the “colorizing” flap several years back? Those movies are still out there and are now finding happy viewers. What happened to the outrage? All it may take is one or two good avatar-starring films to turn the tide. We could end up with a two-tier system in which human-starring films are available at twice or maybe even ten times the standard ticket price. The star system would be back in force and, unlike the scandal-ridden days of its beginnings, completely manageable by the studios.

And it could extend to writing, too. New Shakespearean plays. An avatar-completed edition of Dickens’ last, never-finished work, The Mystery of Edwin Drood. An avatar-created completion of the Chinatown trilogy. What’s to prevent a working writer’s pirated avatar program from competing with the writer under a different name? Hollywood could finally realize one of the golden-age moguls’ fondest wishes: to say to all their writers, “Your services are no longer needed.”

And then, paralleling what is happening today to the record business, the studios themselves could be rendered irrelevant by an empowered body of pirate creators on the web. Let’s not stop there, however. How long before they, too, would step aside as the avatars themselves took over and sent their creators (us) to some neutered, Matrix-like purgatory? How long? Maybe by 2012.
 
Okay. We can all wake up. Nightmares can be such scary fun. #

FADE OUT

Lee A. Matthias

4 comments:

  1. Gee, thanks. Being an aspiring writer with a 4yr old daughter to boot, your post freaks me out doubly! The world our children & their children will inherit could very well be bizarrely - and terrifyingly - unrecognizable from our own.

    I've often thought that in some other dimension in some distant time, perhaps the Singularity has already happened. Growing at an incalculable rate, A.I. thought & abilities quickly equalled that of what we would call God-like, and in God-like fashion, they created our universe/dimension. Such a scenario would explain both evolutionary and creationist viewpoints.
    Does this shit keep anyone else up nights?

  2. Nahhh. Like I said, it's just a bad dream... I hope. Thanks for your comment. I get so few, it's really an event!

  3. Lee, Regarding every comment being an event, I have the same problem! If you'd like, check out my blog at www.monkeyversuskeyboard.blogspot.com
    I just discovered your blog a few days ago and am really impressed, we should keep in touch - I'm trying to surround myself with like-minded individuals who are as serious about a film career as I am. Lots of dreamers, not many doers, & you strike me as a doer. Again, great blog.
    ADR

  4. Interesting coincidence. I came across your blog a couple days ago, too. Same reaction. Thanks. As for the comment thing, I believe I don't really write in a way that invites comments. I plan to get a little less formal in the months ahead, but some of the stuff I've got to write about is formal stuff, so I can live without reaction if necessary. I know I get some decent traffic, so mostly people are just more inclined to read and think about what they read rather than go out on a limb and say something they might be challenged on. But, people, I promise I won't flame anyone. It's all opinion, anyway, and everyone's entitled to their own. Again, thanks for yours.
