3. World Building
The Three Laws of Robotics
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
Isaac Asimov
When you hear the phrase world building, probably the first thing that comes to mind is planets: great colorful settings in outer space, worlds for grand adventures to take place on.
That's partly right, of course, but there's a lot more to it than that. Take the Three Laws of Robotics quoted above. These were devised by Isaac Asimov, with the help of his editor John W. Campbell, Jr., for his now-famous robot stories (beginning with the stories gathered in the collection I, Robot and developed through many more stories and novels). The Three Laws started as a logical outgrowth of the future Asimov imagined, which included intelligent, versatile robots. They soon became an inspiration for future plot lines. But more than that, they became a part of the world Asimov had built. They provided background and context. They also provided both story opportunity and preset limits. Star Trek's famed, and frequently violated, Prime Directive (prohibiting interference in other cultures) is a similar example.
Course content copyright © 2018 Jeffrey A. Carver