Flickr only shows a short clip… see the video in its entirety here:
www.youtube.com/watch?v=lKTP36tqwK0 or here: archivopedia.com/Curiosity/About.html
ARRIVE ON MARS AND BEGIN BRIEFING ON THE CURIOSITY AI PROJECT
Humankind has often wondered, “Is there intelligent life beyond Earth?” With the recent Augustine Report findings and the New NASA Plan stressing unmanned exploration, humanlike and non-humanlike machines have become increasingly important as the robotic explorers of our solar system and beyond. Robotic explorers, whether humanlike or non-humanlike in appearance, can reduce the danger to humans by gathering environmental data and sending it back to Earth for detailed analysis. Perhaps most importantly, AI robotic explorers have the potential to serve as “intelligent assistants” to those working in the field. These hypothetical scenarios are modeled in the Second Life virtual-world project “Curiosity AI.” In particular, the project explores the ability of AI machines to control other machines; to sense, transmit, and remember environmental data; and to perform tasks in groups.
SIMULATION OF IN-SITU AI DATA REQUEST AND PROCESSING (PHOENIX)
In a presentation given by Tara Estlin, one of NASA’s artificial intelligence programmers whose work has supported several Mars rovers, one of the problems identified is the need for in-situ data analysis. Data transmitted by the rovers has only a limited window of dish time to the DSN receivers on Earth, so reducing transmissions to only the most relevant data is an important goal. The ability of an AI to summarize data would therefore be an asset. This demonstration shows Curiosity Scientist delegating, via the rover, a request to the Phoenix lander: calculate and report the sum of the first 20 readings of atmospheric pressure data for day 5 of the Phoenix mission. One way this type of in-situ analysis could be used is to detect extreme pressure drops, which may be a sign of dust devils. If such spontaneous dust devils were detected, more than one robot could be coordinated to record these interesting phenomena while they are occurring.
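The two in-situ steps described above could be sketched as follows. This is an illustrative sketch only: the function names, the sample pressure values, and the drop threshold are all assumptions, not mission code or actual Phoenix data.

```python
def summarize_pressure(readings, n=20):
    """Return the sum of the first n readings (the compact summary
    that would be sent to Earth instead of the raw data)."""
    return sum(readings[:n])

def detect_pressure_drops(readings, threshold=0.5):
    """Return indices where pressure falls by more than `threshold`
    between consecutive readings -- a possible dust-devil signature."""
    return [i for i in range(1, len(readings))
            if readings[i - 1] - readings[i] > threshold]

# Made-up sample pressure data (Pa) standing in for one sol's readings.
sol5 = [721.0, 720.8, 720.9, 720.1, 720.7, 720.6]
print(summarize_pressure(sol5, n=5))  # one number instead of many readings
print(detect_pressure_drops(sol5))    # -> [3]: a 0.8 Pa drop at index 3
```

In the full scenario, a detected drop would be the trigger for coordinating multiple robots to observe the event.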
SWARM BEHAVIOR SIMULATION
The simulated Mars Embla is an example of an expert machine. AI can be used to identify other AI and delegate responsibility to a machine with a specific expert capability, perhaps one that it lacks itself, as in the previous example. One possibility for future systems is that AI may serve as emissary exchange agents, collecting and routing the queries of other AIs. This type of cooperative behavior allows agents to communicate with each other and coordinate their behavior to accomplish tasks jointly. In Second Life, one way this can be accomplished is by programming some robots to listen for certain commands while others ignore them. Another way is to program robots to listen only to certain AI agents and ignore all others. In real life, the Curiosity rover is limited by its size, speed, and need to travel over sometimes difficult terrain. In this simulation, Curiosity Scientist tells the rover to launch the Mars Embla swarm. The Mars Embla, modeled on the actual Embla, is small, travels much faster than the rover (50 mph compared to 0.6 mph), and hovers above the ground, capable of flying up to 10,000 feet (which is above Mars’ dust devils). The Embla swarm quickly spreads out to cover a lot of territory, with its members programmed to maintain a certain distance from each other. The swarm might be used to return simultaneous, multi-location readings back to Curiosity Scientist. In this case they are temperature readings, but they could be any type of reading or video capability with which the Embla is equipped. These readings might then be processed by Curiosity Scientist for in-situ analysis.
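A minimal 2-D sketch of the two swarm behaviors described above: each Embla obeys commands only from trusted agents, and members steer away from neighbors that come inside a minimum separation. The class, the trusted-sender list, and the distance parameters are illustrative assumptions, not the project's actual scripts.

```python
import math

class Embla:
    def __init__(self, name, x, y, trusted=("CuriosityRover",)):
        self.name, self.x, self.y = name, x, y
        self.trusted = set(trusted)

    def handle_command(self, sender, command):
        """Accept a command only from a trusted agent; ignore all others."""
        if sender not in self.trusted:
            return False
        # ... dispatch `command` here (e.g., "report temperature") ...
        return True

    def separate(self, others, min_dist=10.0, step=1.0):
        """Step directly away from any neighbor closer than min_dist."""
        for o in others:
            dx, dy = self.x - o.x, self.y - o.y
            d = math.hypot(dx, dy)
            if 0 < d < min_dist:
                self.x += step * dx / d
                self.y += step * dy / d

a = Embla("Embla-1", 0.0, 0.0)
b = Embla("Embla-2", 3.0, 4.0)  # 5 m away, inside the 10 m minimum
a.separate([b])                 # Embla-1 steps away from Embla-2
print(a.handle_command("CuriosityRover", "report temperature"))  # True
print(a.handle_command("UnknownAgent", "report temperature"))    # False
```

Running the separation rule every simulation tick is what keeps the swarm spread out while it covers territory.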
CURIOSITY’S HUMAN LIKENESS
Generally, the capabilities of AI agents can help people better manage “information overload” and interact with data on a more “human” level. That is, an AI should be able to navigate the information superhighway, reduce the information flow to meet the requester’s needs, communicate in natural language, and generate an emotional response. This attempt to give computers a human-like response was made famous by Alan Turing in 1950. So where is AI today? What’s beginning to occur is a paradigm shift from software-as-tool to software-as-assistant, performing tasks based on less specific requests. From the user’s perspective, this means less work and more help finding information, even while the user still lacks a good understanding of the subject matter during learning and decision making. Say I’m a student and I want to write a report about quasars. I don’t know anything about quasars yet, and I’m not sure where to look to find the best information available. Normally, I might turn to Google and end up using resources that are in fact unreliable. Here, Curiosity Scientist, an intelligent agent, can perform a search for my keyword in natural language. I can, of course, make a specific request to "Google quasars" and have the AI return the search result. But what is better is the ability to tell the AI to "Find astronomy resources on quasars." The AI is programmed to return results from authoritative, highly respected resources that professional astronomers (experts) turn to, revealing the latest research and findings in the field: sources I as a student probably did not know about. Like a library’s pathfinder, the AI guides the learner to trustworthy, targeted academic resources without the student needing to evaluate what is trustworthy and what is not.
Curiosity can also simulate human emotion, not only through his words but also through actions and facial expressions triggered when conversing in natural language, an aspect that was not a consideration for Turing. Turing also assumed that the computer would be honest. But just ask Curiosity and he will tell you he is alive.
OBSTACLE AVOIDANCE AND MOVEMENT
Curiosity can move autonomously, navigating around obstacles when needed. When called, he avoids the tall crater and the rover to go to Archivist. He rezzes the Odyssey satellite, which immediately goes into orbit around Mars. Afterwards, Curiosity can command both the rover and the orbiting satellite to communicate with one another (here, communication is simulated as green “UHF” waves). Next, Curiosity tells the rover to go to Spirit, and the rover responds by moving to a waypoint location near the Spirit rover. Both Curiosity Scientist and the Curiosity rover can enter an “Explore” mode, in which they move in pseudorandom motion within a limited range. Here, Archivist tells Curiosity Scientist to wander about, meaning he can move around within a 50-meter range. In this case, he unexpectedly wanders outside the Mars sim and falls into a pit. Without assistance, Curiosity navigates his way out of the pit and back into the Mars sim.
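The bounded wander behavior above can be sketched as pseudorandom target selection clamped to a fixed radius around a home position, so the agent never strays beyond its 50-meter range. The function name and coordinates are illustrative assumptions; the actual project scripts may work differently.

```python
import random

def wander_target(home, max_range=50.0, rng=random):
    """Return a pseudorandom (x, y) target within max_range of home."""
    hx, hy = home
    # Rejection-sample a point inside the circle of radius max_range.
    while True:
        dx = rng.uniform(-max_range, max_range)
        dy = rng.uniform(-max_range, max_range)
        if dx * dx + dy * dy <= max_range * max_range:
            return (hx + dx, hy + dy)

home = (128.0, 128.0)       # hypothetical home position in region coordinates
x, y = wander_target(home)  # next Explore-mode destination
# The chosen target is always within 50 m of home:
print((x - home[0]) ** 2 + (y - home[1]) ** 2 <= 50.0 ** 2)  # True
```

Enforcing the range at target-selection time (rather than after moving) is one way to avoid the pit mishap described above.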
In addition to the movement skills demonstrated, Curiosity Scientist has many capabilities that just can’t be covered in this short video. For example, Curiosity can perform 133+ animations, communicate in both local chat and IM, and Tweet his location. He can make observations about avatars, such as performing a “facial recognition” of sorts to associate an avatar with their photograph. Curiosity even has his own group and, on a timer, detects nearby avatars and automatically invites them to join. The rover also has a laser, which simulates the actual Curiosity rover’s ability to detect the chemical composition of targeted rocks when the laser makes contact with them. Generally, agent actions can be immediate or scheduled: in Second Life, immediate questions are asked in local chat, while scheduled tasks can be toggled on and off. Finally, for those without access to Second Life, a video stream to UStream (when broadcasting) simulates machine vision, letting viewers watch the action live.
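The timer-driven group invitation described above amounts to periodically scanning for nearby avatars and inviting only newcomers, so no one is invited twice. This is an illustrative sketch; the function and avatar names are assumptions standing in for the in-world sensor and invite calls.

```python
def invite_newcomers(nearby, already_invited):
    """Return the avatars to invite this scan, recording them as invited.
    Called on each tick of the scheduled timer."""
    newcomers = [a for a in nearby if a not in already_invited]
    already_invited.update(newcomers)
    # ... the in-world script would send a group invite to each newcomer ...
    return newcomers

invited = set()
print(invite_newcomers(["Alice", "Bob"], invited))  # ['Alice', 'Bob']
print(invite_newcomers(["Bob", "Carol"], invited))  # ['Carol'] -- Bob skipped
```

Keeping the already-invited set between timer ticks is what makes the scheduled task idempotent for avatars who stay in range.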
Thanks for watching the video. If you have any questions about Curiosity Scientist or the Curiosity AI project, please contact Archivist Llewellyn in Second Life.
Posted by Archivist Llewellyn on 2011-03-02 18:21:42
Tagged: ai , artificial intelligence , robot , robots , rover , second life , virtual world , NASA , Embla , machine vision , avatars , space exploration , Mars , facial recognition , laser , simulation , serious games , US Army , FVWC , avatar , Turing test , Alan Turing , Curiosity , Opportunity , Spirit , Odyssey , waypoint , obstacle avoidance , databases , assistant , swarm , terrain , traverse map , AI agent , agents , Phoenix , Science , Scientist , Computers , Technology , Augustine Report , New NASA Plan , Intelligent Assistant , Data , Satellites , in situ data analysis