
Monday, November 8, 2010

Patterns of nonverbal emotional communication between infants and mothers to help scientists develop a baby robot that learns

The first phase of the project was studying face-to-face interactions between mother and child, to learn how predictable early communication is, and to understand what babies need to act intentionally. The findings are published in the current issue of the journal Neural Networks in a study titled "Applying machine learning to infant interaction: The development is in the details."

The scientists studied 13 mothers and their babies, aged 1 to 6 months, as they played together in weekly five-minute sessions. There were approximately 14 sessions per dyad. The laboratory sessions were videotaped and the researchers applied an interdisciplinary approach to understanding the behavior.

The researchers found that in the first six months of life, babies develop turn-taking skills, the first step toward more complex human interactions. According to the study, babies and mothers settle into a pattern in their play, and that pattern becomes more stable and predictable with age, explains Daniel Messinger, associate professor of Psychology in the UM College of Arts and Sciences and principal investigator of the study.

"As babies get older, they develop a pattern with their moms," says Messinger. "When the baby smiles, the mom smiles; then the baby stops smiling and the mom stops smiling, and the babies learn to expect that someone will respond to them in a particular manner," he says. "Eventually the baby also learns to respond to the mom."

The next phase of the project is to use the findings to program a baby robot, with basic social skills and with the ability to learn more complicated interactions. The robot's name is Diego-San. He is 1.3 meters tall and modeled after a 1-year-old child. The construction of the robot was a joint venture between Kokoro Dreams and the Machine Perception Laboratory at UC San Diego.

The robot will need to shift its gaze from people to objects based on the same principles babies seem to use as they play and develop. "One important finding here is that infants are most likely to shift their gaze if they are the last ones to do so during the interaction," says Messinger. "What matters most is how long a baby looks at something, not what they are looking at."
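The study's models aren't reproduced here, but as a rough illustration of how a turn-taking gaze rule of this kind might be encoded in a robot's control loop, here is a minimal Python sketch; the function name, probabilities and factor are assumptions made for illustration, not values from the paper.

# Illustrative sketch (not from the study): a toy turn-taking rule in which the
# robot is more likely to shift its gaze if it was the last one to shift, and
# the longer it has fixated the current target the more likely a shift becomes.
import random

def should_shift_gaze(robot_shifted_last, seconds_on_target, rate_per_second=0.05):
    # Looking longer raises the chance of shifting, regardless of what the target is.
    p = rate_per_second * seconds_on_target
    # Having been the last one to shift makes another shift more likely (assumed factor).
    if robot_shifted_last:
        p *= 2.0
    return random.random() < min(p, 1.0)

# Example tick: the robot shifted last and has fixated the toy for 8 seconds.
print(should_shift_gaze(True, 8.0))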

The process comes full circle. The babies teach the researchers how to program the robot, and in training the robot the researchers gain insight into how human behavior develops, explains Paul Ruvolo, a sixth-year graduate student in the Computer Science Department at UC San Diego and co-author of the study.

"A unique aspect of this project is that we have state-of-the-art tools to study development on both the robotics and developmental psychology side," says Ruvolo. "On the robotics side we have a robot that mechanically closely approximates the complexity of the human motor system and on the developmental psychology side we have a fine-grained motion capture and video recording that shows the mother infant action in great detail," he says. "It is the interplay of these two methods for studying the process of development that has us so excited."

Ultimately, the baby robot will give scientists insight into what motivates a baby to communicate and will help answer questions about the development of human learning. The study is funded by the National Science Foundation.

Friday, August 27, 2010

MIT creates robot to clean up oil spills

MIT creates robot to clean up oil spills: "

After the serious ecological crisis caused by the largest oil spill in the Gulf of Mexico (and possibly the world), courtesy of BP, scientists and inventors around the globe set about finding a way to separate the water from the oil and thus clean up, as far as possible, the hundreds of thousands of barrels of oil that were dumped into the ocean.


MIT (the Massachusetts Institute of Technology) has developed a robot capable of cleaning the ocean. Its main virtues are its cost, its remote operation, and the fact that when the robots work in groups they can become completely autonomous if needed, assessing the situation and cleaning more effectively. These robots are called "Seaswarm" and take the form of a conveyor belt made of a nanowire mesh, which absorbs the oil while repelling seawater. Once in operation, the Seaswarm can burn the oil or, alternatively, store it in a bag and leave it floating on the surface to be collected and put to another use.


The robots cost approximately $20,000 each and, as mentioned, can operate as a fleet, communicating via GPS and WiFi to create an automated collection system that can work perfectly well without human support. And because they are so small (compared with the skimmers on large vessels), they can easily reach hard-to-access places such as estuaries and coastlines.


Once again MIT shows the technological capability it is known for. Perhaps in the (hopefully distant) future, when another spill occurs, fleets of these robots will clean it up effectively and everything will be done much faster, so that marine life is affected as little as possible.


Via: Physorg

"

Wednesday, August 18, 2010

Inventors 'adopt' a robot son

Inventors 'adopt' a robot son: "While the debate in Mexico centers on adoption by gay couples, in England "Aimec" tells jokes, mows the lawn and connects to the Internet"

While in Mexico the debate centers on adoption by same-sex couples, in England a pair of inventors has created a perfect son they never have to worry about or indulge: Aimec.

Aimec (Artificially Intelligent Mechanical Electronic Companion) can connect wirelessly to the Internet, where he looks up the things he doesn't understand, and he will talk about whatever you like. When he notices that his human "parents" are nearby, he makes comments or tells jokes.

He can also connect to household appliances such as the television to switch it on or off, as well as the house lights, among other functions.


Sus "padres" son los inventores Tony y Judie Ellis, quienes no tienen hijos y aseguran que en unos años, todos los hogares tendrán de estos útiles robots que cortan el pasto. Para el matrimonio "las aplicaciones potenciales son infinitas. Una cosa no se puede negar - los robots serán algo enorme. Estamos en la misma etapa que los computadores en 1980, cuando todo el mundo decía que nunca funcionarían."

What's more, Aimec's price is also perfect: he will cost just 300 dollars, according to the Daily Mail.

Mexicans create bomb-disposal robotic arm

Mexicans create bomb-disposal robotic arm: "The academic institution acknowledges that the time needed to finish the device could be cut from two years to just a few months if the project had more support"

Two Mexican scientists are developing a robotic arm capable of defusing bombs that is "superior to those already built in Japan and China" and will be ready in less than two years, the Center for Research and Advanced Studies (Cinvestav) reported.

The device, capable of moving at about 30 meters per minute, is being built entirely with Mexican technology, and its cost will not exceed 750,000 pesos (59,542 dollars), Cinvestav, one of the country's most important research centers, explained in a statement.

Scientists Rafael Castro Linares and Alejandro Rodríguez Ángeles are working on the arm, which can be operated remotely or even programmed in advance, reducing the risk to the people who use it to defuse explosives.

The robot will use two groups of motors: one to move the arm, which has seven degrees of freedom, and another to drive the mobile base, which has three degrees of freedom and can turn on its own axis.

In addition, its weight of between 12 and 15 kilos will, according to Cinvestav, give it great stability.

Its ability to manipulate delicate objects will open up other applications, such as the automotive industry and assistance for people with physical disabilities, the experts said.

The Cinvestav statement says that, if imported, a robot of this kind would cost around three million pesos (238,107 dollars), while its locally developed technology also lowers additional costs such as parts and maintenance.

Another feature its foreign competitors lack is the "friendly software" the researchers are also working on, with which "anyone can program and operate" the robot, the experts said.

The academic institution acknowledges that the time needed to finish the device could be cut from two years to just a few months if the project had "the support of public institutions or the private sector".

Thursday, July 22, 2010

Mexicans create a low-cost humanoid

Mexicans create a low-cost humanoid: "Cinvestav scientists are developing the robot "Mexone", capable of learning from experience to walk, climb stairs and even play poker"

Mexican scientists are working on the robot "Mexone", a humanoid that has already been built from the pelvis up and that will be able to learn from experience to walk, climb stairs and even "play poker", as its creators joked today.

"Mexone" es el robot de su tipo más avanzado en Latinoamérica, según los científicos mexicanos del Centro de Investigación y Estudios Avanzados (Cinvestav).

Autonomous and lightweight, it can move and rotate its limbs and joints through almost 40 degrees, and it can hold a deck of cards and many other objects with its fingers, which are inspired by those of primates.

Eduardo Bayro, a Cinvestav researcher, explained that despite all these advances the main contribution of "Mexone" lies in its artificial intelligence software, which is highly advanced and based on prototypes that make it notably cheaper than other, similar robots.

Much like human sleep, "Mexone" will store the information it accumulated during its activity while it is switched off, and will be able to retrieve and manage it once it is turned back on.

That, combined with the two cameras that make up its vision system, opens up a world of possible applications, including assistance for blind people, Bayro said.

"Mexone" es una sorpresa en el panorama de la robótica mexicana, aún poco explotada, y compite fuertemente en rentabilidad y características con robots de firmas como Sony y Honda, dijo.

Its production cost, estimated at around 100,000 dollars, is more than fifteen times lower than that of other humanoids, and it incorporates more advanced capabilities, such as controllable fingers and soles with sensors that will help it walk naturally.

"Pronto habrá robots como 'Mexone' ayudando en las tareas domésticas, en el cuidado de personas dependientes, en las fábricas y en la vigilancia de zonas inhóspitas", aseguró Bayro.

The robot's lower limbs, which are being developed in the US city of Boston, will give "Mexone" a height of one and a half meters and will support its eleven kilos of weight.

The legs could also contribute to the field of prosthetics and the rehabilitation of amputees, the researcher added.

Bayro praised the long-term range of possibilities offered by the "open architecture" of the project, which began a year ago at the Cinvestav unit in the city of Guadalajara, in western Mexico.

"Los gobiernos y las empresas ponen límites y plazos a la ciencia, pero la ciencia es un arte que debe avanzar por sí mismo", aseguró.

He also defended Mexico's capability, arguing that its scientists deserve "more support" than they currently receive.

"México no se debe quedar fuera de la ciencia internacional, sino avanzar al mismo nivel que Francia y Japón", recalcó.

Bayro himself said he fought to "break through the steel fence" that surrounds robotics worldwide, and that it once led a Japanese researcher to accuse him of being "a spy" in Paris.

"México debe mostrar al mundo las letras de Octavio Paz y Carlos Monsiváis, pero también su ciencia", sentenció.

Robot legs to replace the wheelchair

Robot legs to replace the wheelchair: "The New Zealand development will be available in the island nation at the end of the year and in the US by 2011"

A device known as the Rex "exoskeleton" would allow people with paralysis to stand up, walk a few steps and even climb stairs.

This New Zealand development, which promises to make wheelchairs obsolete, will be available in the island nation at the end of the year and in the US by 2011.

El "exoesqueleto" Rex costará unos 150 mil dólares, por lo que no será una alternativa económica.

The device can be used by people between 1.46 and 1.95 meters tall who weigh less than 100 kilos.

It is controlled with a joystick, like the ones used for video games.

China creates its first surgical robot

China creates its first surgical robot: ""Little Hand A" can take part in high-precision operating-room procedures"

Wednesday, July 14, 2010

In Bangkok, the sushi is served by a robot

In Bangkok, the sushi is served by a robot: "Diners at a new restaurant are amazed by the world's first "robot waiter", which serves Japanese dishes with millimetric precision"

Diners at a new Bangkok restaurant are amazed by the world's first "robot waiter", which serves Japanese dishes with millimetric precision and dances to pop music to the applause of the customers.

Dressed in a samurai warrior costume and with a screen instead of eyes, this peculiar server is the latest word in robotic technology applied to commercial ends, and perhaps the future of the restaurant business, according to its creator.

"No llega tarde, no se toma una pausa para fumar, y no pide una propina pese a que hace el trabajo de ocho personas por turno", asegura Lappassarada Thanapant, la dueña tailandesa de Hajime, un restaurante de cocina nipona que abrió sus puertas hace menos de dos meses en un moderno centro comercial de la ciudad.

Thanapant explains that the idea for the robot came to her six years ago, when, on a trip to Japan, she visited an experimental robotics fair out of curiosity and commissioned a local company to build a prototype robot with arms called "Motoman".

As soon as she was shown the right model, she did not hesitate to pay six million baht (185,000 dollars) for four super-efficient, fast "waiters".

"Son casi perfectos, no se pueden equivocar y la higiene es total porque no pueden tocar la comida o a los clientes", señala la propietaria, que confía en recuperar su inversión en menos de dos años gracias a la popularidad del invento.

Each table has a touch screen for ordering dishes, following the typical Japanese motorized-buffet system, at reasonable prices: 449 baht (14 dollars) for the basic menu and 555 baht (17 dollars) for a tasting menu of imported meat and fish.

As soon as the food is ready, the terminal warns diners that the "waiter" is about to arrive, so that no one misses the show.

Two of the robots are a simpler, one-armed version; they pick the dishes up from the kitchen and hand them to their more advanced colleagues, which have two arms, a mobile body and a "uniform".

The latter take the order and, after zipping along a rail to the corresponding table, deliver it through a small hatch that only opens when they are about to arrive, and through which they later take away the empty bowls.

Thanapant admits that clearing finished plates remains unfinished business, since the mechanized waiters detect whether the customer has finished using an optical sensor that sometimes fails.

"De vez en cuando, alguien decide comer la carne pero no la verdura, o se deja algo que quería comerse luego, y el robot se lo lleva enseguida de vuelta", indica.

Nor have they perfected a system for serving drinks or paying the bill, almost the only tasks still carried out by human staff, who are also responsible for placing each dish in the right spot for their metal colleagues to pick up.

The restaurant owner acknowledges that they are only getting started and that some details still need polishing, but she stresses that the public's reception has been so positive that she is even thinking of opening another restaurant in Bangkok, of course also staffed by robots.

And as if all that were not enough, every hour it is time for the show.

As soon as a well-known pop tune starts playing over the loudspeakers, the robots stop serving dishes and for a few minutes begin to "dance", waving their arms to the rhythm of the music and spinning like tops as they run up and down the rail to the applause of the diners.

"A los niños les encanta, muchas familias se quedan hasta la siguiente función para volver a ver el baile" , afirma con orgullo Thanapant.

Thursday, July 1, 2010

Robots show off their skills

Robots show off their skills: "More than 300 researchers from 25 countries gather in Spain for the Robotics: Science and Systems conference"

Dancing like Michael Jackson, picking up an egg without breaking it, folding clothes or following the mental instructions a person sends them are all things the latest generation of robots, presented today at the international conference hosted by Zaragoza, can do.

El congreso "Robotics: Science and Systems", el más importante de ámbito internacional, reúne en el Paraninfo de Zaragoza a unos 300 investigadores de robótica de 25 países, de compañías como Google o Microsoft e instituciones como la NASA, para analizar las últimas novedades en este campo, explicó en rueda de prensa el profesor José Neira, presidente del comité local de la reunión.

Although predicting the future of robotics is difficult, Neira admitted that film and television in many ways shape what these advances will look like.

As examples he cited the TV series "Knight Rider", about an ultramodern car that Volkswagen is already working on, aiming for a prototype by 2025, and the "Star Wars" films, in which a mind-controlled prosthetic hand is implanted, an idea that is currently being researched.

For the time being, however, the main uses of today's humanoid robots, equipped with sensors to see, hear, touch and move, are to make daily life easier, help people with reduced mobility and carry out jobs that are dangerous for humans, as could be seen today at the Paraninfo.

Thus, for the first time in Europe, the PR2 android from the US company Willow Garage was presented, the world's most advanced in the category of personal robots for assisting people with household chores, capable of folding a pile of laundry, sorting it and putting it away.

Otro androide, el "Nao", de 58 centímetros, de la empresa francesa Aldebaran Robotics y que está concebido como una plataforma de investigación y educación, demostró que puede bailar "Thriller" de Michael Jackson, levantarse del suelo y hasta contar historias.

This little robot will have a successor in "Romeo", a 1.50-meter prototype the company expects to have ready in October 2011 to help people with disabilities.

Also on display were the world's most advanced robotic hand (from the US firm Barrett Technology), which can pick up an egg without breaking it, and "Summit", the latest robot from the Spanish company Robotnik, highly mobile and with more sophisticated control than the robots used to defuse explosives, explained Roberto Guzmán of the company's Engineering Department.

The University of Zaragoza, the event's organizer, presented a project to control a robot's movements with the mind.

Two young volunteers, wearing caps with electrodes on their heads, concentrated on the movements they wanted the robot to make; the robot received, over a wireless communication network, the mental signals previously processed by a computer.

Monday, June 28, 2010

Robots: Modeling Biology

Robots: Modeling Biology: "

The Robots Podcast on Modeling Biology


The latest episode of the Robots podcast focusses on using robots to model biology. The first guest is Barbara Webb, who is director of the Insect Robotics Group at the University of Edinburgh and has published several seminal papers on the subject (her 2008 paper on 'Using robots to understand animal behavior' is a good place to start). Following an earlier interview on her work, Webb now addresses more complex questions: What is the importance of distributed control and embodiment in biological systems? And how do we find equally powerful solutions for robots?

This episode's second guest is Steffen Wischmann, who is a postdoctoral researcher at the Laboratory of Intelligent Systems at the EPFL and at the Department of Ecology and Evolution at the University of Lausanne, Switzerland. Wischmann has a long-standing, deep interest in robotic models and his work has covered both embodied and cognitive aspects of robot models. He outlines the value of robotic models for biology, describes their strengths and limitations, and explains their increasingly important role in research fields that cannot rely on a fossil record to understand the evolution of traits, such as animal communication. Read on or directly tune in!

"

Robots: Modeling Biology: "Barbara Webb from the University of Edinburgh discusses insect inspired robotics as a control system design approach. Steffen Wischman from the EPFL/UNIL then gives his view on when robots should be used to model biology and his interest in using artificial evolution."

Friday, June 18, 2010

Scoop: KUKA's youBot Mobile Manipulator Unveiled

Scoop: KUKA's youBot Mobile Manipulator Unveiled: "With a 5-DOF manipulation arm sitting on omnidirectional wheels, KUKA's youBot has open interfaces that roboticists can use to experiment with the machine"

Thursday, June 10, 2010

Mobile Manipulation Robots Having Fun

Mobile Manipulation Robots Having Fun: "Mobile Manipulation Robots Having Fun"

Following the Mobile Manipulation Challenge at ICRA 2010, Willow Garage has compiled an entertaining video. It shows some robots having fun, and at the same time showcases some of the world's most advanced service robot arms.



In order of appearance, the video features Willow Garage's own PR2 shaking hands with a Kuka Lightweight and a Meka arm, the Fraunhofer IPA's Care-O-Bot which uses a Schunk arm, an unknown mobile robot, the PR2 shaking hands with the Barrett arm, Aldebaran's Nao, the homer@UniKoblenz Team's manipulation challenge robot based on a Pioneer 3AT and a Katana arm, the University of Bonn's Dynamaid, another brief glimpse of the Barrett arm and finally an impressive demo of the PR2 and Care-O-Bot dancing in the robotic equivalent of a tight embrace.

Friday, June 4, 2010

Robots big and small showcase their skills

Robots big and small showcase their skills: "Two robotics events were designed to prove the viability of advanced technologies for robotic automation of manufacturing and microrobotics."

Make room, Bender, Rosie and R2D2! Your newest mechanical colleagues are a few steps closer to reality, thanks to lessons learned during two robotics events hosted by the National Institute of Standards and Technology (NIST) at the recent IEEE International Conference on Robotics and Automation (ICRA) in Anchorage, Alaska. The events -- the Virtual Manufacturing Automation Competition (VMAC) and the Mobile Microrobotics Challenge (MMC) -- were designed to prove the viability of advanced technologies for robotic automation of manufacturing and microrobotics.

In the first of two VMAC matches, contestants used off-the-shelf computer gaming engines to run simulations of a robot picking up boxes of various sizes and weights from a conveyor belt and arranging them on a pallet for shipping. The two teams in the competition -- both from Georgia Tech -- showed that their systems were capable of solving mixed palletizing challenges. To do this, the system had to receive a previously unseen order list, create a logical plan for stacking and arranging boxes on a pallet to fulfill that order, and then computer simulate the process to show that the plan worked. Getting all of the boxes onto the pallet is relatively straightforward; however, creating a stable, dense pallet is a difficult challenge for a robot.

The second manufacturing contest "road tested" a robot's mobility in a one-third scale factory environment. The lone participating team, the University of Zagreb (Croatia), demonstrated that it could successfully deliver packages simultaneously to different locations in the mock factory by controlling three robotic Automated Guided Vehicles (AGVs) at once.

In the microrobotics match-up, six teams from Canada, Europe and the United States pitted their miniature mechanisms -- whose dimensions are measured in micrometers (millionths of a meter) -- against each other in three tests: a two-millimeter dash in which microbots sprinted across a distance equal to the diameter of a pin head; a microassembly task inserting pegs into designated holes; and a freestyle competition showcasing a robot's ability to perform a specialized activity emphasizing one or more of the following: system reliability, level of autonomy, power management and task complexity.

In the two-millimeter dash, the microbot from Carnegie Mellon University broke the world record held by Switzerland's ETH Zurich (the event also was part of earlier NIST-hosted "nanosoccer" competitions) with an average time of 78 milliseconds. However, the achievement was short-lived. Less than an hour later, the French team (representing two French research agencies: the FEMTO-ST Institute and the Institut des Systèmes Intelligents et de Robotique, or ISIR) shattered the mark with an average time of 32 milliseconds.

ETH Zurich was the champion in the microassembly event with a perfect 12 for 12 score steering pegs approximately 500 micrometers long (about the size of a dust particle) into holes at the edge of a microchip. Runner-up was Carnegie Mellon whose microbot successfully placed 4 of 9 pegs.

ETH Zurich's robot also captured the freestyle event, amazing spectators with its unprecedented ability to maneuver in three dimensions within a water medium. In fact, in one demonstration, the Swiss device "flew" over the edge of the microassembly field, reversed direction and pushed out the pegs it had inserted earlier. Taking second place in the freestyle event was the team from Carnegie Mellon that demonstrated how three microbots could be combined into a single system and then disassembled again into separate units. Third place in the event went to the microbot from the Stevens Institute of Technology.

NIST conducted the VMAC in cooperation with IEEE and Georgia Tech, and collaborated on the MMC with the IEEE Robotics and Automation Society. More events of this kind with evolving challenges are planned for the future, as robotics technologies mature. NIST will work with university and industry partners on these events with the goal of advancing skills that future robots -- both full-size and micro-size -- will need to carry out their functions.

Soccer-playing robots get creative with physics-based planning

Soccer-playing robots get creative with physics-based planning: "Robot soccer players are warming up to compete in this month's RoboCup 2010 world championship in Singapore. A new algorithm will help newly created robots to predict the ball's behavior based on physics principles."

Robot soccer players from Carnegie Mellon University competing in this month's RoboCup 2010 world championship in Singapore should be able to out-dribble their opponents, thanks to a new algorithm that helps them to predict the ball's behavior based on physics principles.



That means that the CMDragons, the Carnegie Mellon team that competes in RoboCup's fast-paced Small-Size League, likely will be able to out-maneuver their opponents and find creative solutions to game situations that could even surprise their programmers. It's possible that the physics-based planning algorithm also might enable the players to invent some new kicks. "Over the years, we have developed many successful teams of robot soccer players, but we believe that the physics-based planning algorithm is a particularly noteworthy accomplishment," said Manuela Veloso, professor of computer science and leader of Carnegie Mellon's two robot soccer teams.

"Past teams have drawn from a repertoire of pre-programmed behaviors to play their matches, planning mostly to avoid obstacles and acting with reactive strategies. "To reach RoboCup's goal of creating robot teams that can compete with human teams, we need robots that can plan a strategy using models of their capabilities as well as the capabilities of others, and accurate predictions of the state of a constantly changing game," said Veloso, who is president of the International RoboCup Federation. In addition to the Small-Size League team, which uses wheeled robots less than six inches high, Carnegie Mellon fields a Standard Platform League team that uses 22-inch-tall humanoid robots as players. Both teams will join more than 500 other teams with about 3,000 participants when they converge on Singapore June 19-25 for RoboCup 2010, the world's largest robotics and artificial intelligence event.

RoboCup includes five different robot soccer leagues, as well as competitions for search-and-rescue robots, for assistive robots and for students up to age 19. The CMDragons have been strong competitors at RoboCup, winning in 2006 and 2007 and finishing second in 2008. Last year, the team lost in the quarterfinals because of a programming glitch, but had dominated teams up to that point with the help of a preliminary version of the physics-based planning algorithm. "Physics-based planning gives us an advantage when a robot is dribbling the ball and needs to make a tight turn, or any other instance that requires an awareness of the dynamics of the ball," said Stefan Zickler, a newly minted Ph.D. in computer science who developed the algorithm for his thesis. "Will the ball stick with me when I turn? How fast can I turn? These are questions that the robots previously could never answer."

The algorithm could enable the robots to concoct some new kicks, including bank shots, Zickler said. But the computational requirements for kick planning are greater than for dribbling, so limited computational power and time will keep this use to a minimum.
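The article doesn't include the algorithm itself, so the sketch below is only meant to convey the flavor of physics-based planning: before committing to a turn or a kick, the planner predicts where a decelerating ball will be. The one-dimensional model, the friction constant and the function name are all assumptions for illustration, not the CMDragons code.

# Toy physics-based prediction (illustrative only): a rolling ball decelerating
# at a constant rate due to friction. A planner can query the predicted state
# t seconds ahead before choosing a dribbling maneuver.
def predict_ball(pos, vel, t, friction=0.3):
    decel = friction if vel >= 0 else -friction
    t_stop = abs(vel) / friction        # time until the ball comes to rest
    t = min(t, t_stop)
    new_vel = vel - decel * t
    new_pos = pos + vel * t - 0.5 * decel * t * t
    return new_pos, new_vel

# A ball at 2 m/s is predicted to be 3.4 m away and moving at 1.4 m/s after 2 s.
print(predict_ball(0.0, 2.0, 2.0))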

Each Small-Size League team consists of five robots. The CMDragon robots include two kicking mechanisms -- one for flat kicks and another for chip shots. They also are equipped with a dribble bar that exerts backspin on the ball. Each team builds their own players; Michael Licitra, an engineer at Carnegie Mellon's National Robotics Engineering Center, built the CMDragons' highly capable robots. Like many robots in the league, the CMDragons have omni-directional wheels for tight, quick turns. In addition to physics-based planning, the CMDragons are preparing to use a more aggressive strategy than in previous years.

"We've noticed that in our last few matches against strong teams, the ball has been on our side of the field way too much," Zickler said. "We need to be more opportunistic. When no better option is available, we may just take a shot at the goal even if we don't have a clear view of it." In addition to Veloso and Zickler, the CMDragons include Joydeep Biswas, a Robotics Institute master's degree graduate and now a first-year Ph.D. student in robotics, and computer science undergraduate Can Erdogan. "Figuring out how to get robots to coordinate with each other and to do so in environments with high uncertainty is one of the grand challenges facing artificial intelligence," Veloso said. "RoboCup is focusing the energies of many smart young minds on solving this problem, which ultimately will enable using distributed intelligence technology in the general physical world."

Friday, May 28, 2010

Willow Garage's PR2 Robots Graduate

Willow Garage's PR2 Robots Graduate: "Eleven robots worth more than US $4 million head out to universities and research labs around the world"



Wednesday, May 26, 2010

Asimo can run.

Asimo can run.: "

Asimo the robot at the Museum of Emerging Science in Tokyo.

Cast: Derek Koch

"

Asimo can run. from Derek Koch on Vimeo.

Robots: The Nao Humanoid

Robots: The Nao Humanoid: "

Aldebaran's Nao robot


We've already reported on French company Aldebaran's Nao in a previous post. Nao has since grown up and made it into the RoboCup Standard Platform League. The latest episode of the Robots podcast interviews Luc Degaudenzi, Aldebaran's Vice President in Engineering, and his colleague Cédric Vaudel, who is Aldebaran's Sales Manager for North America. In addition, and as a premiere on the Robots Podcast, we also interview a robot. Nao introduces himself and then shares his own version of Star Wars. Read more or tune in!

"

Monday, May 24, 2010

robotik robotlar

robotik robotlar: "

Various robot applications: robot arm, Lego

Cast: gevv

"

robotik robotlar from gevv on Vimeo.

Thursday, May 20, 2010

The Conference Room That Re-Arranges Itself

The Conference Room That Re-Arranges Itself: "Just pick how you want it set up and the tables move themselves into position"

You can add a new entry to the long list of problems that can be solved by robots: arranging tables in a conference room. On my personal workplace hassle scale, I'm not sure that moving conference room furniture ranks much above "occasional nuisance." But Yukiko Sawada and Takashi Tsubouchi at the University of Tsukuba, Japan, evidently find shoving tables to be an unappealing task for humans. So they built a room that could re-arrange itself.

In this case, the tables are the robots. Select the arrangement you want from a graphical interface, and the tables will move to their new locations. The movement is monitored by an overhead camera with a fish-eye lens, and the software uses a trial-and-error approach to determine the best sequence of motion. But it's best to see the room in action for yourself. Check out the video the researchers presented at ICRA earlier this month.



In the paper, the authors explained the rationale for the project:

In these days, at conference rooms or event sites, people arrange tables to desired positions suitable for the event. If this work could be performed autonomously, it would cut down the man power and time needed. Furthermore, if it is linked to the Internet reservation system of the conference room, it would be able to arrange the tables to an arbitrary configuration by the desired time.

I'm not sure the cost and complexity of such a system could ever be low enough to be practical, but there's definitely something fun about watching the tables reconfigure themselves. And if you already have autonomous furniture, why not go all the way and add a reconfigurable wall?

Tuesday, May 18, 2010

Explained: Monte Carlo simulations

Explained: Monte Carlo simulations: "Speak to enough scientists, and you hear the words “Monte Carlo” a lot. “We ran the Monte Carlos,” a researcher will say. What does that mean?

The scientists are referring to Monte Carlo simulations, a statistical technique used to model probabilistic (or “stochastic”) systems and establish the odds for a variety of outcomes. The concept was first popularized right after World War II, to study nuclear fission; mathematician Stanislaw Ulam coined the term in reference to an uncle who loved playing the odds at the Monte Carlo casino (then a world symbol of gambling, like Las Vegas today). Today there are multiple types of Monte Carlo simulations, used in fields from particle physics to engineering, finance and more.

To get a handle on a Monte Carlo simulation, first consider a scenario where we do not need one: to predict events in a simple, linear system. If you know the precise direction and velocity at which a shot put leaves an Olympic athlete’s hand, you can use a linear equation to accurately forecast how far it will fly. This case is a deterministic one, in which identical initial conditions will always lead to the same outcome.

The world, however, is full of more complicated systems than a shot-put toss. In these cases, the complex interaction of many variables — or the inherently probabilistic nature of certain phenomena — rules out a definitive prediction. So a Monte Carlo simulation uses essentially random inputs (within realistic limits) to model the system and produce probable outcomes.

In the 1990s, for instance, the Environmental Protection Agency started using Monte Carlo simulations in its risk assessments. Suppose you want to analyze the overall health risks of smog in a city, but you know that smog levels vary among neighborhoods, and that people spend varying amounts of time outdoors. Given a range of values for each variable, a Monte Carlo simulation will randomly select a number within each range, and see how they combine — and repeat the process tens of thousands or even millions of times. No two iterations of the simulation might be identical, but collectively they build up a realistic picture of the population’s smog exposure.

“In a deterministic simulation, you should get the same result every time you run it,” explains MIT computer science professor John Guttag in his OpenCourseWare lecture on Monte Carlo simulations. However, Guttag adds, in “stochastic simulations, the answer will differ from run to run, because there’s an element of randomness in it.”

The aggregation of data makes it possible to identify, say, a median level of smog exposure. To be sure, Monte Carlo simulations are only as good as their inputs; accurate empirical data would be necessary to produce realistic simulation results.
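To make the smog example concrete, here is a minimal Monte Carlo sketch in Python. The input ranges and the exposure formula are invented for illustration; only the method itself -- draw random values within realistic limits, combine them, repeat many times, then aggregate -- follows the article.

# Minimal Monte Carlo sketch of the smog-exposure example (illustrative values).
import random

def median_exposure(n_iterations=100_000):
    exposures = []
    for _ in range(n_iterations):
        smog_level = random.uniform(20, 80)    # assumed neighborhood smog index
        hours_outdoors = random.uniform(0, 6)  # assumed time spent outside per day
        exposures.append(smog_level * hours_outdoors)
    exposures.sort()
    return exposures[len(exposures) // 2]      # median over all simulated runs

print("Median simulated exposure:", median_exposure())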


"

VEX Robotics World Championship Report

VEX Robotics World Championship Report: "


This year's VEX Robotics World Championship was bigger than ever, with more than 400 high school and university teams from around the world. It was held in Dallas, Texas again this year, so you can be sure we were there. The local Dallas Personal Robotics Group also got in on the act, offering their members as volunteers to help with the event. I managed to avoid serving as a judge this year, so I had more time to take photos, several of which you can see below. Grant Imahara awarded the top prizes to the Shanghai Luwan team from China and New Zealand's Free Range Robotics and Kristin Doves teams. To make things even crazier, the BEST National Championship was held alongside the VEX events. The Metro Homeschool team 229 from Blue Springs, Missouri took the first place BEST award. Read on for more photos and details from the official VEX press release or check out my almost 300 photos of the event.

"

Friday, May 7, 2010

Researchers create software for robot to improve rescue missions

Researchers create software for robot to improve rescue missions: "In disaster emergencies, such as the recent West Virginia mine explosion or the earthquake in Haiti, it is often unsafe for responders to enter the scene, prolonging the rescue of potential survivors. Now, researchers have developed software for a robot with a laser sensor that can enter dangerous structures to assess the structure's stability and locate any remaining people. This technology could lead to safer and more efficient rescue missions."



"We are developing computer graphics visualization software to allow the user to interactively navigate the 3-D data captured from the robot's scans," said Ye Duan, associate professor of computer science in the MU College of Engineering. "I worked with my students to develop computer software that helps the user to analyze the data and conduct virtual navigation, so they can have an idea of the structure before they enter it. The technology could save the lives of disaster victims and responders."

The remote-controlled robot, built by researchers at the Missouri University of Science and Technology, is designed to remotely transport a Light Detection and Ranging unit (LIDAR) so that responders, such as police, military, firefighters, and search and rescue teams, can know more about dangerous structures before entering. Once inside the structure, the robot takes multiple LIDAR scans at up to 500,000 point measurements per second. It also can scan through walls and windows. After the scans, the software forms the data points into sophisticated 3-D maps that can show individual objects, create floor plans and color-code areas inside the structure for stability. Depending on the data size, the maps can take from half an hour to two hours for the software to create.
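The article doesn't describe the mapping software itself, so the following is only a hypothetical sketch of the general idea of collapsing LIDAR returns into a simple floor plan: bin the (x, y, z) points into grid cells and mark as occupied any cell containing returns at wall height. The cell size and height limits are assumed values.

# Hypothetical sketch (not the MU software): turn a 3-D point cloud into a
# coarse 2-D occupancy grid that can serve as a rough floor plan.
def occupancy_grid(points, cell_size=0.25, min_height=0.2, max_height=2.0):
    occupied = set()
    for x, y, z in points:
        if min_height <= z <= max_height:          # ignore floor and ceiling returns
            occupied.add((int(x // cell_size), int(y // cell_size)))
    return occupied

# Three wall-height points and one floor point yield two occupied cells.
print(occupancy_grid([(1.0, 2.0, 1.0), (1.1, 2.0, 1.5), (3.0, 0.5, 0.9), (2.0, 2.0, 0.0)]))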

"Although the software and the robot can help in emergency situations, it could be commercialized for a variety of uses," Duan said. "This system could be used for routine structure inspections, which could help prevent tragedies such as the Minneapolis bridge collapse in 2007. It also could allow the military to perform unmanned terrain acquisition to reduce wartime casualties."

The researchers now are working on a proposal to make the robot faster and smaller than the current model, which resembles the NASA rovers sent to Mars and weighs about 200 pounds.

Duan's research has been published in the International Journal of CAD/CAM. The robot recently was named on the list of Kiplinger's "8 Robots That Will Change Your Life." Duan collaborated with MU students Kevin Karsh and Yongjian Xi, who developed the software and algorithms; Norbert Maerz, associate professor of geological engineering at Missouri S&T; and Missouri S&T students Travis Kassebaum, Kiernan Shea and Darrell Williams.

Robots Help With Deepwater Horizon Disaster

Robots Help With Deepwater Horizon Disaster: "


The image above, from the US Coast Guard's flickr stream, shows an ROV attempting to activate the Deepwater Horizon Blowout Preventer (BOP). The attempt failed and the massive Deepwater Horizon oil spill continues, threatening to become one of the biggest environmental disasters of all time. Efforts to stop the spill now include at least 10 underwater robots (in addition to 200 manned sea vessels). US Coast Guard ROVs located two of the major leaks. There have been unsuccessful attempts by six different ROVs to close the BOP. Other underwater robots are monitoring the disaster site, locating portions of the spill and dispensing subsea oil dispersants. BP has rented most of the ROVs they're using, but ExxonMobil has donated the use of one underwater robot plus a support vessel. ROVs working on one of the three major leaks today successfully installed a half-ton valve on the broken pipe and were able to shut it off. Next up for the robots is to assist with the lowering of a 100 ton containment dome over the disaster site to contain the spilling oil. This type of operation has never been attempted at a depth of 5,000 feet. If the containment dome doesn't work, scientists warn, the spill may get worse fast. Dr. Robert H. Weisberg of the University of South Florida says,

It's very likely that at some point oil will be entrained in the Loop Current. Once entrainment happens, the speed of the Loop Current could go from that point to the Dry Tortugas in a week, to Cape Hatteras in another two weeks. Getting into the Loop Current may take some time. But once in the Loop Current, the oil will move rather quickly.

If that happens the oil will threaten environments along the Gulf coast, the Florida Keys, and the Atlantic Seaboard. Particulate pollution from burn-offs and VOCs outgassing from the massive slick could threaten human health as well. USF is sending a special robotic sensor platform called the Weatherbird II into the spill zone to monitor how zooplankton are impacted by the cloud of toxic water. Tiny oil droplets harmless to larger animals can kill zooplankton, which are a key element in the undersea food chain. For up-to-date information on the disaster see the NOAA photo stream, NASA satellite images (and NASA MODIS rapid response sat images), and the EPA's live air quality monitoring network.

"

Robots With Knives: A Study of Soft-Tissue Injury in Robotics

Robots With Knives: A Study of Soft-Tissue Injury in Robotics: "What would happen if a knife-wielding robot struck a person?"



The idea of a robot in the kitchen cooking us meals sounds great. We better just watch out for that swinging knife.

To find out what would happen if a robot handling a sharp tool accidentally struck a person, German researchers set out to perform a series of stabbing, puncturing, and cutting experiments.

They fitted an articulated robotic arm with various tools (scalpel, kitchen knife, scissors, steak knife, and screwdriver) and programmed it to execute different striking maneuvers. They used a block of silicone, a pig's leg, and at one point a human volunteer's bare arm as their, uh, test surface.

The researchers -- Sami Haddadin, Alin Albu-Schaffer, and Gerd Hirzinger from the Institute of Robotics and Mechatronics, part of DLR, the German aerospace agency, in Wessling, Germany -- presented their results today at the IEEE International Conference on Robotics and Automation, in Anchorage, Alaska.

The main goal of the study was to understand the biomechanics of soft-tissue injury caused by a knife-wielding robot. But the researchers also wanted to design and test a collision-detection system that could prevent or at least minimize injury. Apparently the system worked so well that in some cases the researchers were willing to try it on human subjects.

We applaud the guy at the end of the video who put his body on the line in the name of robotic science.

The researchers acknowledge that there are huge reservations about equipping robots with sharp tools in human environments. It won't happen any time soon. (Sorry, you'll still have to chop that cucumber salad yourself). But they argue that only by getting more data can roboticists build safer robots.

The experiments involved the DLR Lightweight Robot III, or LWRIII, a 7-degree-of-freedom robot manipulator with a 1.1-meter reach and moderately flexible joints. The robot, which weighs 14 kilograms, is designed for direct physical interaction and cooperation with humans.

The tools the researchers tested included [photo, right]: (1) scalpel; (2) kitchen knife; (3) scissors; (4) steak knife; (5) screwdriver.




The researchers performed two types of experiments: stabbing and cutting, testing the different tools striking at various speeds, with and without the collision-detection system active.

In most cases, the contact resulted in deep cuts and punctures, with potentially lethal consequences. But remarkably, the collision-detection system, which relied on measurements from force-torque sensors on the robot's body, was able to significantly reduce the depth of the cuts in several cases, and even prevent penetration altogether.
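The paper's detection scheme relies on force-torque measurements, but its details aren't reproduced here; the sketch below stands in with the simplest conceivable version, a fixed force threshold that triggers an immediate retract. The threshold and gain are assumptions for illustration only.

# Illustrative stand-in for a collision-detection reflex (not the DLR controller).
def collision_detected(measured_force_newtons, threshold=15.0):
    # A real system would use model-based disturbance observers rather than a
    # fixed cut-off; the threshold here is an assumed value.
    return abs(measured_force_newtons) > threshold

def control_step(measured_force_newtons, commanded_velocity):
    if collision_detected(measured_force_newtons):
        return -0.1 * commanded_velocity   # reverse briefly to back the tool off
    return commanded_velocity

# Example: a 20 N spike while moving at 0.5 m/s commands a small retract of -0.05 m/s.
print(control_step(20.0, 0.5))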

This is the first study to investigate soft-tissue injuries caused by robots and sharp instruments. Previous studies by the same researchers, as well as other groups, have focused on blunt collisions involving non-sharp surfaces.

The video below shows impact experiments using crash-test dummies and large industrial robots. Ouch.

Georgia Tech Robot Masters the Art of Opening Doors and Drawers

Georgia Tech Robot Masters the Art of Opening Doors and Drawers: "Georgia Tech researchers have programmed a robot to autonomously approach and open doors, drawers, and cabinets"



To be useful in human environments, robots must be able to do things that people do on a daily basis -- things like opening doors, drawers, and cabinets. We perform those actions effortlessly, but getting a robot to do the same is another story. Now Georgia Tech researchers have come up with a promising approach.

Professor Charlie Kemp and Advait Jain at Georgia Tech's Healthcare Robotics Laboratory have programmed a robot to autonomously approach and open doors and drawers. It does that using omni-directional wheels and compliant arms, and the only information it needs is the location and orientation of the handles.

The researchers discussed their results yesterday at the IEEE International Conference on Robotics and Automation, in Anchorage, Alaska, where they presented a paper, "Pulling Open Doors and Drawers: Coordinating an Omni-Directional Base and a Compliant Arm with Equilibrium Point Control."

One of the neat things about their method is that the robot is not stationary while opening the door or drawer. "While pulling on the handle," they write in their paper, "the robot haptically infers the mechanism's kinematics in order to adapt the motion of its base and arm."

In other words, most researchers trying to make robots open doors, cabinets, and similar things rely on a simple approach: keep the robot's base in place and move its arms to perform the task. It's easier to do -- and in fact that's how most robot manipulation research is done -- but it limits the kinds of tasks a robot could accomplish.

The Georgia Tech researchers allow their robot to move its omni-directional base while simultaneously pulling things open -- an approach they say improves the performance of the task.
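The equilibrium point controller from the paper isn't reproduced here, but one small piece of "haptically inferring the mechanism's kinematics" can be illustrated hypothetically: from a few observed handle positions, decide whether the handle is moving along a line (a drawer) or along an arc (a hinged door) by estimating the radius of the circle through them. The threshold and sample points below are invented for illustration.

# Hypothetical sketch (not the Georgia Tech code): classify a pulled mechanism
# as prismatic (drawer) or rotary (door) from three observed handle positions.
import math

def circumradius(p1, p2, p3):
    # Radius of the circle through three 2-D points; infinite if they are collinear.
    a, b, c = math.dist(p2, p3), math.dist(p1, p3), math.dist(p1, p2)
    cross = (p2[0] - p1[0]) * (p3[1] - p1[1]) - (p2[1] - p1[1]) * (p3[0] - p1[0])
    if abs(cross) < 1e-9:
        return float('inf')
    return (a * b * c) / (2.0 * abs(cross))   # R = abc / (4 * area), area = |cross| / 2

def classify_mechanism(handle_points, radius_threshold=2.0):
    r = circumradius(*handle_points[:3])
    return "drawer (prismatic)" if r > radius_threshold else "door (rotary)"

# A handle tracing a gentle arc of roughly 0.8 m radius is classified as a door.
print(classify_mechanism([(0.0, 0.0), (0.1, 0.006), (0.2, 0.025)]))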

Wednesday, May 5, 2010

Willow Garage Giving Away 11 PR2 Robots Worth Over $4 Million

Willow Garage Giving Away 11 PR2 Robots Worth Over $4 Million: "The robotics company has announced the 11 institutions in the U.S., Europe, and Japan that will receive its advanced PR2 robot to develop new applications"



Willow Garage, the Silicon Valley company dedicated to advancing open robotics, is announcing this morning that it will award 11 PR2 robots to institutions and universities around the world as part of its efforts to speed up research and development in personal robotics.

The company, in Menlo Park, Calif., hopes that the 11 organizations [see list below] in the United States, Europe, and Japan that are receiving PR2 robots at no cost—a total worth over US $4 million—will use the robots to explore new applications and contribute back to the open-source robotics community.

An open robot platform designed and built by Willow, the Personal Robot 2, or PR2, has a mobile base, two arms, a variety of sensors, and 16 CPU cores for computation. But what makes the robot stand out is its software: the open-source Robot Operating System, or ROS, that offers full control of the PR2, including libraries for navigation, manipulation, and perception.

Yesterday I spoke with Eric Berger, Willow's co-director of the personal robotics platform program, who said they’re "really excited about the new applications that will come out of this."

As an example of the possibilities, he mentioned that earlier this year a group at UC Berkeley programmed a PR2 to fold towels. The video of the robot neatly folding a stack of towels went viral.

"People get very excited with the idea of robots doing something that's really useful in their homes," Berger says. "People have seen a lot of military robots, industrial robots, robot vacuum cleaners, but the idea of something like Rosie the Robot, I think it's very powerful."

With its PR2 Beta Program, Willow Garage hopes to foster scientific robotics research, promote the development of new tools to improve the PR2 and other robots, and also help researchers create practical demonstrations and applications of personal robotics.

For the researchers receiving a state-of-the-art personal robot platform worth several hundred thousand dollars, the possibility of working on real-world problems without having to waste time reinventing the robotic wheel, so to speak, is a big deal.

Even more significant, the researchers will be able to "share their software for use by other groups and build on top of each other's work," says Pieter Abbeel, the UC Berkeley professor who created the towel folding demo and is one of the PR2 recipients. "This will significantly boost the rate of progress in robotics, and personal robotics in particular."

"Just as the Mac and PC hardware inspired new applications for personal computers in the 1980s, the PR2 could be the key step in making personal robots a reality," says Ken Goldberg, an IEEE Fellow and UC Berkeley professor. "It's a very exciting step forward for robotics and we're very excited to participate."

Here's the list of lucky 11 PR2 recipients that Willow is releasing this morning:

* Albert-Ludwigs-Universität Freiburg with the proposal TidyUpRobot
The University of Freiburg's strength in mapping has led to multiple open-source libraries in wide use. Their group will program the PR2 to do tidy-up tasks like clearing a table, while working on difficult underlying capabilities, like understanding how drawers and refrigerators open, how to recognize different types of objects, and how to integrate this information with the robot's map. Their goal is to detect, grasp, and put away objects with very high reliability, and reproduce these results at other PR2 Beta Program sites.

* Bosch with the proposal Developing the Personal Robotics Market: Enabling New Applications Through Novel Sensors and Shared Autonomy
Bosch will bring their expertise in manufacturing, sensing technologies and consumer products. Bosch will be making robotic sensors available to members of the PR2 Beta Program, including a limited number of "skins" that will give the PR2 the ability to feel its environment. Bosch will also make their PR2 remotely accessible and will expand on the libraries they've released for ROS.

* Georgia Institute of Technology with the proposal Assistive Mobile Manipulation for Older Adults at Home
The Healthcare Robotics Lab at Georgia Tech will be placing the PR2 in an "Aware Home" to study how robots can help with homecare and creative assistive capabilities for older adults. Their research includes creating easier ways for adults to interact with robots, and enabling robots to interact with everyday objects like drawers, lamps, and light switches. Their human-robot interaction focus will help ensure that the software development is closely connected to real-world needs.

* Katholieke Universiteit Leuven with the proposal Unified Framework for Task Specification, Control and Coordination for Mobile Manipulation
KU Leuven in Belgium is a key player in the open-source robotics community. As one of the founding institutions for the Orocos Project, they will be improving the tools and libraries used to program robots in ROS, by, for example, integrating ROS with Blender. They will also be working on getting the PR2 and people to perform tasks together, like carrying objects through a crowded environment.

* MIT CSAIL with the proposal Mobile Manipulation in Human-Centered Environments
The diverse MIT CSAIL group will use the PR2 to study the key capabilities needed by robots that operate in human-centered environments, such as safe navigation, interaction with humans via natural language, object recognition, and planning for complex goals. Their work will allow robots to build the maps they need in order to move around in buildings as large as MIT’s 11-story Stata Center. They will also program the PR2 to put away groceries and do simple cleaning tasks.

* Stanford University with the proposal STAIR on PR2
PR1 was developed in Kenneth Salisbury's lab at Stanford, and ROS was developed from the STAIR (Stanford AI Robot) Project. We're very excited that the PR2 will become the new platform for the STAIR Project's innovative research. Their team will work on several applications, which include taking inventory, retrieving items scattered about a building, and clearing a table after a meal.

* Technische Universität München with the proposal CRAM: Cognitive Robot Abstract Machine
TUM will research giving the PR2 the artificial intelligence skills and 3D perception to reason about what it is doing while it performs various kitchen tasks. These combined improvements will help the PR2 perform more complicated tasks like setting a table, emptying a dishwasher, preparing meals, and other kitchen-related tasks.

* University of California, Berkeley with the proposal PR2 Beta Program: A Platform for Personal Robotics
The PR2 is now known as the "Towel-Folding Robot", thanks to the impressive efforts of Pieter Abbeel's lab at Berkeley. In two short months, they were able to get the PR2 to fold fifty towels in a row. Berkeley will tackle the much more difficult challenge of doing laundry, from dirty laundry piles to neatly folded clothes. In addition, their team is interested in hierarchical planning, object recognition, and assembly and manufacturing tasks (e.g. IKEA products) through learning by demonstration.

* University of Pennsylvania with the proposal PR2GRASP: From Perception and Reasoning to Grasping
The GRASP Lab proposal aims to tackle some of the challenges facing household robotics. These challenges include tracking people and planning for navigation in dynamic environments, and transferring handheld objects between robots and humans. Their contributions will include giving PR2 a tool belt to change its gripper on the fly, helping it track and navigate around people, and performing difficult two-arm tasks like opening spring-loaded doors.

* University of Southern California with the proposal Persistent and Persuasive Personal Robots (P^3R): Towards Networked, Mobile, Assistive Robotics
USC has already demonstrated teaching the PR2 basic motor skills so that it can adapt to different situations and tasks, such as pouring a cup. They will continue to expand on this work in imitation learning and building and refining skill libraries, while also doing research in human-robot interaction and self-calibration for sensors.

* University of Tokyo, Jouhou System Kougaku (JSK) Laboratory with the proposal Autonomous Motion Planning for Daily Tasks in Human Environments using Collaborating Robots
The JSK Laboratory at the University of Tokyo is one of the top humanoid robotics labs in the world. Their goal is to see robots safely and autonomously perform daily, human-like tasks such as retrieving objects and cleaning up domestic environments. They'll also be working on getting the PR2 to work together with other robots, as well as integrating the ROS, EusLisp, and OpenRAVE frameworks.

CyberWalk: Giant Omni-Directional Treadmill To Explore Virtual Worlds

CyberWalk: Giant Omni-Directional Treadmill To Explore Virtual Worlds: "Built by Italian and German researchers, it's the largest VR platform in the world"


It's a problem that has long annoyed virtual reality researchers: VR systems can create a good experience when users are observing or manipulating the virtual world (think Michael Douglas in "Disclosure") but walking is another story. Take a stroll in a virtual space and you might end up with your face against a real-world wall.

The same problem is becoming apparent in teleoperated robots. Imagine you were teleoperating a humanoid robot by wearing a sensor suit that captures all your body movements. You want to make the robot walk across a room at the remote location -- but the room you're in is much smaller. Hmm.

Researchers have built a variety of contraptions to deal with the problem, like a huge hamster ball for people.

Or a giant treadmill. The CyberWalk platform is a large 2D omni-directional platform that allows unconstrained locomotion, adjusting its speed and direction to keep the user always close to the center. Measuring 5 meters on a side, it's the largest VR platform in the world.

It consists of an array of synchronous linear belts. The array moves as a whole in one direction while each belt can also move in a perpendicular direction. Diagonal movement is possible by combining the two linear motions.

Built by a consortium of German, Italian, and Swiss labs, the machine currently resides at the Max Planck Institute for Biological Cybernetics, in Tübingen, Germany, where it's been in operation for over two years.

Last year at IROS, Alessandro De Luca and Raffaella Mattone from the Università di Roma "La Sapienza," in Rome, Italy, and Paolo Robuffo Giordano and Heinrich H. Bülthoff from the Max Planck Institute for Biological Cybernetics presented details of the machine's control system.

According to the researchers, previous work on similar platforms paid little attention to control algorithms, relying on simple PID and heuristic controllers.

The Italian and German researchers came up with a kinematic model for the machine and from there they devised a control strategy. Basically the challenge is that the control system needs to adapt to changes in the user's direction and speed -- variables that it can't measure directly, so it needs to estimate them.

By precisely monitoring the position of the user on the platform using a Vicon motion-capture system, the controller computes estimates for the two variables and tries to adjust the speeds of the linear belts to keep the user close to the center -- all without abrupt accelerations.
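As a rough illustration of this centering logic, here is a minimal Python sketch. It is not the controller from the paper: it simply assumes a commandable planar belt-surface velocity (x for the whole array, y for the individual belts), a motion-capture position measurement each control cycle, and placeholder gains, and it estimates the user's walking velocity before nudging them back toward the center without abrupt accelerations.

import numpy as np

class CenteringController:
    """Toy 2D centering controller in the spirit of the CyberWalk scheme.

    Simplified sketch, not the published controller: it assumes we can command
    a planar belt-surface velocity and that motion capture gives the user's
    position relative to the platform center every cycle. Gains and limits
    are illustrative placeholders.
    """

    def __init__(self, kp=0.8, alpha=0.2, max_accel=0.5, dt=0.01):
        self.kp = kp                  # pull toward the center [1/s]
        self.alpha = alpha            # low-pass factor for the velocity estimate
        self.max_accel = max_accel    # belt acceleration limit [m/s^2]
        self.dt = dt                  # control period [s]
        self.prev_pos = np.zeros(2)
        self.walk_vel = np.zeros(2)   # estimated walking velocity [m/s]
        self.belt_vel = np.zeros(2)   # commanded belt-surface velocity [m/s]

    def update(self, user_pos):
        """user_pos: (x, y) of the user relative to the platform center [m]."""
        p = np.asarray(user_pos, dtype=float)

        # The user's motion over the floor is walking plus being carried by the
        # belts, so the walking velocity is estimated as d(pos)/dt - belt_vel.
        raw = (p - self.prev_pos) / self.dt - self.belt_vel
        self.walk_vel = (1 - self.alpha) * self.walk_vel + self.alpha * raw
        self.prev_pos = p

        # Cancel the estimated walking velocity and add a gentle pull that
        # brings the user back toward the center.
        desired = -self.walk_vel - self.kp * p

        # Rate-limit the command so the user never feels abrupt accelerations.
        step = np.clip(desired - self.belt_vel,
                       -self.max_accel * self.dt, self.max_accel * self.dt)
        self.belt_vel = self.belt_vel + step
        return self.belt_vel

The structure mirrors what the article describes: estimate the quantities you cannot measure directly, then command the belts to cancel them while keeping accelerations gentle.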

The researchers also devised a way of using a frame of reference for the controller that varies with the user's direction. This method allowed the CyberWalk platform to provide a more natural walking experience, without making the user's legs cross when changing direction. The video above shows the results.

The CyberWalk platform is one of two locomotion devices developed as part of the European Union-funded Project CyberWalk. The other is a small-scale ball-array platform dubbed CyberCarpet.

The Technical University of Munich, another partner in the CyberWalk consortium, designed and built both platforms. And ETH Zurich, another partner, was responsible for the VR part -- creating a 3D VR model of ancient Pompeii and implementing the motion synchronization on the head-mounted display of the human walker.

You can read the researchers' paper, "Control Design and Experimental Evaluation of the 2D CyberWalk Platform," here.

A Robot That Balances on a Ball

A Robot That Balances on a Ball: "Masaaki Kumagai has built wheeled robots, crawling robots, and legged robots. Now he's built a robot that rides on a ball"


Dr. Masaaki Kumagai, director of the Robot Development Engineering Laboratory at Tohoku Gakuin University, in Tagajo City, Japan, has built wheeled robots, crawling robots, quadruped robots, biped robots, and biped robots on roller skates.

Then one day a student approached him to suggest they build a robot that would balance on a ball.

Dr. Kumagai thought it was a wonderful idea.

The robot they built rides on a rubber-coated bowling ball, which is driven by three omnidirectional wheels. The robot can not only stand still but also move in any direction and pivot around its vertical axis.

It can work as a mobile tray to transport objects such as cocktails, and it can also serve as an omnidirectional supporting platform to help people carry heavy objects.

Such a ball-balancing design is like an inverted pendulum, and thus naturally unstable, but it offers advantages: it has a small footprint and can move in any direction without changing its orientation.

In other words, whereas a two-wheel self-balancing robot has to turn before it can drive in a different direction, a ball-riding robot can promptly drive in any direction. Try that, Segway!
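To give a flavour of how such a robot can be driven, here is a deliberately simplified Python sketch; it is an illustration, not Dr. Kumagai's controller. A PD law turns the measured body lean into a desired ball velocity (rolling the ball toward the lean to catch the fall, as with any inverted pendulum), and that planar velocity is then shared among three omnidirectional wheels spaced 120 degrees apart. The contact geometry is ignored -- the wheels are treated as if they drove a flat plane -- and all gains are placeholders.

import numpy as np

WHEEL_ANGLES = np.radians([0.0, 120.0, 240.0])  # wheel azimuths around the ball
KP, KD = 6.0, 1.5                               # balance gains (illustrative)

def balance_command(tilt, tilt_rate):
    """tilt: horizontal projection of the body's lean as (x, y) components (rad);
    tilt_rate: its time derivative (rad/s). Returns the desired planar ball
    velocity: roll the ball toward the lean to keep the body upright."""
    tilt = np.asarray(tilt, dtype=float)
    tilt_rate = np.asarray(tilt_rate, dtype=float)
    return KP * tilt + KD * tilt_rate

def wheel_speeds(vx, vy, spin=0.0):
    """Distribute a planar velocity (and optional yaw spin) to three omniwheels,
    each driving tangentially at its azimuth (flat-plane approximation)."""
    return np.array([-np.sin(a) * vx + np.cos(a) * vy + spin
                     for a in WHEEL_ANGLES])

# Example: the robot leans slightly along +x, so the wheels roll the ball
# in that direction to catch the fall.
v = balance_command(tilt=[0.02, 0.0], tilt_rate=[0.0, 0.0])
print(wheel_speeds(*v))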

Dr. Kumagai and student Takaya Ochiai built three robots and tested them with 10-kilogram bricks. They even made them work together to carry a large wooden frame.

Monday, May 3, 2010

Mexican children head to Taiwan

Mexican children head to Taiwan: "A group of 14 children, between six and 12 years of age, will represent Mexico in the city of Kaohsiung, Taiwan, at a robotics event called the Open International Championship Smart Movie"

A group of 14 children, between six and 12 years of age, will represent Mexico in the city of Kaohsiung, Taiwan, where a robotics event called the Open International Championship Smart Movie will be held from May 6 to 8.

The Asian robotics open will bring together girls and boys from more than 52 countries, with the goal of creating a sports-style atmosphere that brings children and young people closer to science and technology in a playful way.

This year's challenge, announced by the FIRST LEGO League association that organizes the event, is to think up and imagine new ways of transporting people, goods and services by proposing intelligent moves, or "smart moves".

The aim is for participants to propose a solution that frees the planet from consumables such as gasoline, petroleum and its derivatives (used in tires, plastics and more) by means of clean transport.

In an interview, Francisco Brito Vidales, systems coordinator at Colegio Williams, noted that the group of children representing Mexico City belongs to that school's San Jerónimo campus and calls itself the Risk Takers.

The Risk Takers, he added, will present in Taiwan a scale model of a magnetic city in which opposing magnetic poles allow cars to levitate.

He explained that with more than five years of work and dozens of hours of research and consultations with specialists, the children created 'Magnetito', the Mexican robot with which they will seek to raise Mexico's profile against countries such as the United States, Japan and Germany.

The Risk Takers are children and young people who stand out for their creativity, analytical thinking, problem solving and ability to form a true team, commented professor Sergio Becerril, another of the coaches accompanying the group.

'They prepared not only to make the robot perform well on the missions set out in the competition rules, but also dug deeper, researching a technological proposal to change the way we get around,' he added.

They also worked on the robot's design to meet parameters such as robustness, versatility and adaptability, factors that count toward the competition score along with teamwork in problem solving, he noted.

The challenge for participants in the robotics event is to propose a project in which transport becomes smarter, pollutes less and moves more people and goods at lower cost.

Four other groups, from Querétaro, Toluca, Guadalajara and Monterrey, will join the team representing Mexico City.

The robot 'Magnetito', designed to complete the missions set for the competition, follows paths without colliding with obstacles, performs precision movements, follows marked routes, and manipulates objects and places them accurately in small areas or specific spots.

According to 12-year-old Francisco Brito, the Risk Takers are a group of 'normal', playful children who are above all tolerant, thoughtful, informed, supportive, open-minded and good communicators.

Tuesday, April 27, 2010

Explained: Thermoelectricity

Explained: Thermoelectricity: "Thermoelectricity is a two-way process. It can refer either to the way a temperature difference between one side of a material and the other can produce electricity, or to the reverse: the way applying an electric current through a material can create a temperature difference between its two sides, which can be used to heat or cool things without combustion or moving parts. It is a field in which MIT has been doing pioneering work for decades.

The first part of the thermoelectric effect, the conversion of heat to electricity, was discovered in 1821 by the Estonian physicist Thomas Seebeck and was explored in more detail by French physicist Jean Peltier, and it is sometimes referred to as the Peltier-Seebeck effect.

The reverse phenomenon, where heating or cooling can be produced by running an electric current through a material, was discovered in 1851 by William Thomson, also known as Lord Kelvin (for whom the absolute Kelvin temperature scale is named), and is called the Thomson effect. The effect is caused by charge carriers within the material (either electrons, or places where an electron is missing, known as “holes”) diffusing from the hotter side to the cooler side, similarly to the way gas expands when it is heated. The thermoelectric property of a material is measured in volts per Kelvin.

These effects, which are generally quite inefficient, began to be developed into practical products, such as power generators for spacecraft, in the 1960s by researchers including Paul Gray, the electrical engineering professor who would later become MIT’s president. This work has been carried forward since the 1990s by Institute Professor Mildred Dresselhaus, Theodore Harman and his co-workers at MIT’s Lincoln Laboratory, and other MIT researchers, who worked on developing new materials based on the semiconductors used in the computer and electronics industries to convert temperature differences more efficiently into electricity, and to use the reverse effect to produce heating and cooling devices with no moving parts.

The fundamental problem in creating efficient thermoelectric materials is that they need to be good at conducting electricity, but not at conducting thermal energy. That way, one side can get hot while the other gets cold, instead of the material quickly equalizing the temperature. But in most materials, electrical and thermal conductivity go hand in hand. New nano-engineered materials provide a way around that, making it possible to fine-tune the thermal and electrical properties of the material. Some MIT groups, including ones led by professors Gang Chen and Michael Strano, have been developing such materials.
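As a quantitative anchor (these are the standard textbook definitions, not figures from the article): the Seebeck coefficient S is the voltage produced per unit temperature difference, and the dimensionless figure of merit ZT captures exactly this trade-off, rewarding electrical conductivity while penalizing thermal conductivity:

S = -\frac{\Delta V}{\Delta T}, \qquad ZT = \frac{S^{2}\,\sigma\,T}{\kappa}

where \sigma is the electrical conductivity, \kappa the thermal conductivity and T the absolute temperature. Good thermoelectric materials reach ZT of roughly 1 or more, and nano-engineered materials like those described above aim to push it higher by suppressing \kappa without hurting \sigma.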

Such systems are produced for the heating and cooling of a variety of things, such as car seats, food and beverage carriers, and computer chips. Also under development by researchers including MIT’s Anantha Chandrakasan are systems that use the Peltier-Seebeck effect to harvest waste heat, for everything from electronic devices to cars and powerplants, in order to produce usable electricity and thus improve overall efficiency.


"

Sunday, April 25, 2010

R-2 headed for the International Space Station

R-2 headed for the International Space Station: "The robonaut is undergoing training to become the first roughly human-shaped mechanical astronaut to travel into space"

Next September, the strangest of astronauts will join the permanent crew of the International Space Station (ISS): a humanoid robot capable of using the same tools as humans and, above all, of helping them with very difficult and dangerous tasks.

Its name is Robonaut 2, or R-2, and for its first months it will remain confined to the station's US Destiny module while it passes the required endurance tests under space conditions, but the plan is for it to later move throughout the orbital outpost.

The idea is that robots of this kind will eventually even perform spacewalks, but the first prototype is not fitted with the protection systems needed to operate in the extreme temperatures of open space, NASA reported.

For now, R-2 is only half an astronaut, or half a robot: a head, a torso, two arms and two hands, weighing a total of about 150 kilograms.

The plan is to carry it into space aboard the shuttle Discovery.

The robot was developed by NASA and General Motors and can be used not only in space but also on Earth, for many industrial tasks.

"It is an example of a future generation of space and ground robots, not to replace humans but to accompany them and carry out key support work," said John Olson, director of the Exploration Systems Integration Office.

"The combined potential of humans and robots is a perfect demonstration that two plus two can add up to far more than four," he added.

The immediate plan for R-2 is to undergo tests inside the ISS under microgravity and radiation to check how it performs in space, NASA explained.

These operations will make it possible to test the robot working side by side with the astronauts.

R-2 is undergoing the required training to become the first roughly human-shaped mechanical astronaut on the ISS.

Friday, April 23, 2010

Tec de Monterrey student to collaborate with NASA

Tec de Monterrey student to collaborate with NASA
El Universal
Thursday, April 22, 2010

He will work on five engineering projects, including several robotic vehicles for exploring inaccessible places

Together with the US space agency, Tecnológico de Monterrey student David Alonso Quiroz Rochell will take part in developing robotic vehicles for the Greenland Robotic Tractor project.

The fourth-semester Mechatronics Engineering student at the Estado de México campus was invited by NASA's Goddard Space Flight Center to join the agency's Greenland Robotic Tractor project.

David was in contact with Michael Comberiate, a senior systems manager at NASA, and with Matías Soto of the University of Texas, who, after reviewing his work, invited him to take part alongside a group of 26 other engineering students from different countries.

The program involves five engineering projects, including several robotic vehicles, one of them designed to carry a ground-penetrating radar across the island of Greenland, unmanned and operated remotely via satellite links.

"Our participation as university students in the space agency also includes developing software for terrain exploration and mapping through a planetary network of robots," the student added.

"I think that for everyone selected, taking part in a NASA project means seeing your effort pay off. Participation and constant work are what allow us as students to realize our dreams and achieve them," he said.

Track record

David Alonso Quiroz Rochell won the bronze medal at I-Sweeep 2009 for creating the RV 800 robot, which could be used for exploration in places that are difficult for humans to access, such as searching for people after earthquakes or building collapses.

The Instituto Mexiquense de la Juventud (IMEJ) also awarded him the 2009 Premio Estatal de la Juventud (State Youth Award) in the Technological Innovation Trajectory category for his research and work developing robotic prototypes.

Robot Sumo

Robot Sumo

Robot Sumo from Heimo Aga on Vimeo.


During RobotChallenge 2010 in Vienna, Austria, the First European Robot Sumo Championship was held.

Two robots compete, each trying to push its opponent out of the ring. There are different classes: Standard (3kg), Mini (500g), Micro (100g), Nano (25g) and Humanoid Sumo.
Filmed entirely hand-held with EOS 5D II and 7D and Canon L lenses, post production in Final Cut Pro 7, color grading with Magic Bullet Mojo.

Cast: Heimo Aga

"

Robots: 50 Years of Robotics (Part 1)

Robots: 50 Years of Robotics (Part 1): "



Today we celebrate the 50th episode of Robots! For the occasion, the Robots podcast talked to 12 researchers about the most remarkable developments in robotics over the last 50 years and their predictions for the next half-century. This '50th Special' is split into two parts, with the second half airing in two weeks. In this first part, Rolf Pfeifer from the University of Zurich gives a general overview of developments in robotics, Mark Tilden from WowWee focuses on robot toys, Hiroshi Ishiguro on androids, Oscar Schofield on underwater robots, Steve Potter on brain-machine interfaces and Chris Rogers on education robots. Also coinciding with this 50th episode, the Robots website has gotten a major overhaul: apart from an updated layout, you can now easily browse episodes by topic, interviewee or tag, and you can interact with other listeners by leaving comments below episodes and on the new Robots forum. To do both, just log in once in the top bar of the website. Thanks to all our faithful listeners!

"

Domestic robot to help sick elderly live independently longer

Domestic robot to help sick elderly live independently longer

The recently started research project, named KSERA (Knowledgeable Service Robots for Aging), focuses in particular on COPD patients, people with chronic obstructive pulmonary disease. By 2030 this disease is expected to be the third leading cause of death worldwide, according to the World Health Organization. The disease mainly affects older people.

In three years several demonstration houses should be finished. They will be equipped with a robot and the domestic systems of a 'smart home' -- think of self-opening curtains. The central role is played by the 'domestic robot'. It follows patients through the house, learns their habits, watches them closely, gives sound advice, turns the air conditioning up or down a bit, and warns a doctor when the patient is not doing well. In addition, the robot also provides entertainment in the form of the Internet and videos. "We want to show what is possible in this area," says project coordinator Dr. Lydia Meesters about the goal of the project.

The TU/e researcher, from the Department of Industrial Engineering and Innovation Sciences, emphasizes that this new type of intelligent care house will not be a cold environment. "It should be as homely as possible. In an ideal situation the only technology you see will be the robot. It will be the contact for all the domestic systems. Otherwise the place will just look very homely."

Ethical issues will also be given special attention. The robot must give good advice to patients, but it should not be a policeman, Meesters explains. What to do, for example, when a COPD patient lights a cigarette? And what may the robot system pass on to 'the central operator', and what not? Meesters: "We need to define clear limits, for the robot will continuously measure and see very private data."

The project has a total budget of almost 4 million euros, 2.9 million of which will be furnished by the EU. Other parties involved are the Italian research center Istituto Superiore Mario Boella, Vienna University of Technology, Hamburg University, the Italian ICT company Consoft, the Central European Institute of Technology in Vienna and the Israeli care provider Maccabi Healthcare Services.

Light-based localisation for robotic systems

Light-based localisation for robotic systems: "Getting robotic systems to accurately detect both moving and static objects remains an obstacle to building more autonomous robots and more advanced surveillance systems. Innovative technology that uses light beams for localization and mapping may offer a solution."

The technology advances the current state of the art of Light Detection and Ranging (LIDAR), the optical equivalent of radar in which reflected beams of scattered light are used to determine the location of an object. Whereas most LIDAR systems use a one-step process to detect objects by scanning an area and measuring the time delay between transmission of a pulse and detection of the reflected signal, researchers working in the EU-funded IRPS project added a prior step.

They use LIDAR to first build a 3D map of the area, enabling their system to pinpoint the location of not just static objects but also moving ones -- be it a human, an open window or a leaking pipe -- to within a few millimetres. The researchers, from four EU countries plus Israel and Canada, have called the technology 3D LIMS (3D LIDAR Imaging and Measurement System) and foresee a broad range of applications for it, from navigating autonomous vehicles around airports to monitoring industrial equipment and enhancing security surveillance.

"This two-step LIDAR process, involving first calibration and then real-time navigation, is the key innovation. It allows the system to accurately and rapidly detect changes in the environment," explains Maurice Heitz, the manager of the IRPS project and a researcher at French technology firm CS Communication & Systèmes.

The technology not only detects objects with greater accuracy, but unlike camera-based robotic vision systems it is not affected by shadows, rain or fog, and provides angular and distance information for each pixel, making it suitable for use in virtually any environment.

Robotic airport buggies

To highlight the potential of 3D LIMS, the IRPS team built a prototype application in which the technology was used to navigate buggy-like autonomous vehicles that might one day transport passengers or luggage around an airport.

Showcased at Faro Airport in Portugal last December, the robotic porter application involved first building up a 3D image of the airport environment so the system would know the location of static features such as walls, columns, doors and staircases. The buggies then use onboard LIDAR to accurately calculate their position and detect obstacles as they move around the airport.

"Our vision is that one day people, perhaps elderly or with a disability, will go to the airport and by speaking to a porter control centre on their mobile phone or through a web interface on their PDA would be able to order a vehicle to take them to their boarding gate. The vehicle would transport them autonomously, weaving its way between moving objects such as passengers and piles of luggage," Heitz says.

The IRPS project manager notes that there is real demand for such a system by airport operators, who are finding it increasingly hard to meet the transport needs of passengers and their luggage because of the large size of modern airports. However, he says it will probably be many years before robotic buggies start buzzing around airports autonomously due to a combination of safety concerns and the need for further technological advances.

"Running a 3D LIMS system requires a lot of computer processing power and a large investment," he notes.

Other applications are closer to market. In the field of security surveillance, 3D LIMS could improve upon current techniques for detecting intruders or spotting changes inside a building.

"The system compares the current acquisition [of reflected light] to its reference acquisition, allowing it to detect any change in the environment," Heitz says.

In the case of industrial monitoring, for example, a 3D LIMS system operating in a power plant would be able to instantly and accurately detect something as small as a leaking pipe.

Though the project partners say commercial applications for their system are still a few years away, they are continuing to work on the technology and are seeking support for further research and development.

The IRPS project received funding from the ICT strand of the EU's Sixth Framework Programme for research.