Google drafted a “Robot Constitution”

Without constraints, robots could misunderstand human intentions and accidentally harm them


Google has written a “robot constitution” as one of several measures intended to limit the harm robots could cause.

The company hopes that its DeepMind Robotics division will one day succeed in building a personal assistant robot that can respond to requests, such as tidying the house or cooking a meal.

But such a seemingly simple request can be beyond a robot's understanding, and it can even be dangerous: a robot may not realize, for example, that it should not tidy the house so aggressively that its owner is harmed.



The company has now unveiled a series of new developments that it hopes will make it easier to develop robots that can both help with such tasks and do so without causing any harm. These systems are intended to “enable robots to make faster decisions and better understand and navigate their environment”, and to do so safely.

Among the new developments is a system called AutoRT, which uses artificial intelligence to understand people's intentions. It does this using a range of models, including a large language model (LLM) of the kind used in ChatGPT.

The system works by taking data from the robot's cameras and feeding it into a visual language model, or VLM, which perceives the environment and the objects in it and describes them in words. That description is then passed to the LLM, which generates a list of tasks that could be done with those objects and decides which ones should be attempted.
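As a rough illustration of that camera-to-VLM-to-LLM flow, the pipeline can be sketched as below. The function names and outputs here are hypothetical stand-ins; AutoRT's actual models and interfaces are not public in this form.

```python
# Minimal sketch of an AutoRT-style pipeline: camera frame -> scene
# description (VLM) -> candidate tasks (LLM) -> chosen task (LLM).
# All three functions are placeholders, not real model calls.

def describe_scene(camera_frame: str) -> str:
    """VLM step: turn a camera frame into a textual scene description."""
    # A real VLM would run on image data; here we simply echo a caption.
    return f"a kitchen table holding {camera_frame}"

def propose_tasks(scene_description: str) -> list[str]:
    """LLM step: list candidate tasks for the objects in the scene."""
    objects = ["cup", "sponge"]  # a real LLM would infer these from the text
    return [f"pick up the {obj}" for obj in objects]

def choose_task(tasks: list[str]) -> str:
    """LLM step: decide which task to attempt (here, simply the first)."""
    return tasks[0]

scene = describe_scene("a cup and a sponge")
tasks = propose_tasks(scene)
task = choose_task(tasks)
```

In the real system each step is a learned model rather than a hand-written function, but the division of labor is the same: the VLM handles perception, the LLM handles task generation and selection.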

But Google says that to truly integrate these robots into our daily lives, people need to be confident that they will behave safely. To this end, the LLM that makes decisions within the AutoRT system has been given what Google calls the Robot Constitution.

Google says it is “a set of safety-focused guidelines for robots to follow when choosing tasks”.

“These rules were inspired in part by Isaac Asimov’s Three Laws of Robotics, the first and most important of which states that a robot ‘cannot harm a human,'” Google wrote.

Other safety rules require that no robot should attempt tasks involving people, animals, sharp objects or power tools.

The system uses these rules to guide its behavior and avoid dangerous activities, much as ChatGPT can be instructed not to help people with illegal activities.
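In its simplest form, a constitution-style screen over proposed tasks amounts to rejecting anything that touches the forbidden categories. The keyword list below is illustrative only, not Google's actual rule set:

```python
# Illustrative task filter inspired by the rules described above: reject
# tasks that mention people, animals, sharp objects, or power tools.
FORBIDDEN_KEYWORDS = (
    "person", "human", "dog", "cat",  # people and animals
    "knife", "scissors",              # sharp objects
    "drill", "saw",                   # power tools
)

def is_allowed(task: str) -> bool:
    """Return True if the task mentions none of the forbidden keywords."""
    lowered = task.lower()
    return not any(word in lowered for word in FORBIDDEN_KEYWORDS)

proposed = ["wipe the table", "pick up the knife", "hand the cup to the person"]
safe_tasks = [t for t in proposed if is_allowed(t)]
```

The real system applies the constitution through the LLM's own reasoning rather than keyword matching, which is precisely why Google adds the classical safeguards discussed next.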

But Google also notes that even with these technologies, these large models cannot be trusted to be completely safe. In this context, Google has had to incorporate more traditional safety systems borrowed from classical robotics, including a system that prevents the robots from exerting too much force and a human supervisor who can physically shut them down.

https://www.independent.co.uk/tech

FİKRİKADİM


