Fabricating open-source baby robots | Pierre-Yves Oudeyer | TEDxCannes

Translator: Georges Pattinson
Reviewer: Remco Mollema

How do we understand the world around us? Of course, we use our brain. But we also use our hands.
We fabricate. Children endlessly build, destroy, and assemble to make sense of forces, objects, and people. To understand boats and water, they take wooden sticks and throw them in rivers. Scientists do the same. To understand ocean waves, they build giant aquariums in their labs. To understand the formation of spiral galaxies, they build computer simulations.

What if we want to understand ourselves? What if we want to understand things like “emotions”, “learning”, “curiosity”? Again, we need to fabricate. We can fabricate baby robots, provide them with mechanisms and models of the brain, of the body, of learning, then change these mechanisms and experiment with them systematically to see what happens. We can compare the behavior we observe with the mechanism inside, since we designed it, and this can be illuminating.

Let me give you a first example, about understanding the role of curiosity in child development. Human children learn many things, but they do so in a very progressive way, with a specific timing and ordering. For example, before they learn to walk on their two legs, they first learn to control their neck, then to roll on their belly, then to sit, to stand up, and to walk with their hands on the walls. Why and how do they follow such a progression? Of course, the social environment plays a big role. But there is another force which drives all of us:
this is curiosity. Curiosity, which pushes us to discover, to learn, to invent. Psychology and neuroscience have long understood that our brains like to explore novel activities for their own sake. But we still understand very little about curiosity. With my team,
we tried to improve on this, and we fabricated. We fabricated robots that learn, discover, and set their own goals. We fabricated the Playground experiment, for example. Let’s look at the video. Here we see a baby robot learning and making experiments to make sense of the world around it: it tries actions, observes their effects, and tries to detect regularities between these actions and their effects. This allows it to make predictions. And it chooses those experiments, those actions, like a little scientist: it chooses experiments which it thinks can provide a lot of progress in its predictions, which can provide new information.

And what we observed is that not only does this lead robots to acquire novel skills at their own initiative, for example learning how to grasp the object in front of them, but we also observed a spontaneous evolution and self-organization of their behavior, progressively increasing in complexity. We observed the emergence of cognitive stages which were not pre-programmed by the engineer. For example, the robot ends up systematically having vocal interactions with the other robot, and this was not preprogrammed by the engineer. It was a result of the dynamic interaction between learning, curiosity, the body, and the environment. The body itself is very important: if you change the body but keep the same learning mechanism, you will see different cognitive stages emerge, in a different order.

Let’s turn to a second example: what are the origins of language? How can a community of individuals agree on a shared system of sounds, a shared system of words, a shared system of concepts? This is what we explored and studied in the
Ergo-Robots experiment, which was recently shown at Fondation Cartier
pour l’Art Contemporain in Paris, and which was also
an opportunity for us to share our activities with the general public. It was also a collaboration with artist and filmmaker David Lynch, who did the scenography. Let’s look at the video. Here we see five little robots. They have the same curiosity-driven learning mechanisms I just presented, but in addition they have mechanisms that allow them to invent their own language, their own system of words to speak about the world around them, by playing simple language games invented by a great researcher called Luc Steels. Here we see one of
these language games: one robot shows an object to the other one and says what it calls it. The other does the same, and they update their vocabularies. Initially, they don’t share a language: they have different vocabularies, and there is a big mess, a lot of misunderstanding. But progressively their language converges, crystallizes, and the whole population of robots ends up sharing the same language. And it is their own language: if you run the experiment again, they will invent another language, which cannot be predicted by the engineer who built the robots.

Here again, the body is very important. If we have robots with, for example, rich tactile sensors and poor visual capabilities, their language will develop in such a way that there are a lot of words to speak about touch and very few words to speak about colors.

This leads us to quite an important question: thanks to robotics, could we consider the body as an experimental variable, something that one can change and experiment with systematically? With animals and humans, it is obviously difficult: can you imagine what it takes to put the body of a mouse around the brain of a snake? Of course, it is complicated. But with robots? Until quite recently, it was also very difficult, because making a robot involved heavy manufacturing techniques and a lot of engineers, and it took a long time to build a robot; and once you had built it, you didn’t want to change it, because it was too complicated.

But this is beginning to change, thanks to something that is now revolutionizing design and manufacturing: 3D printing. A 3D printer is a machine which can literally print any object, any shape, in three dimensions, in all kinds of materials, based on a 3D drawing you make on your own personal computer. And with my team at INRIA Flowers, we decided to take this to humanoid robots. You want to change and explore a new shape
for the head, for the hands, for the legs? In one afternoon, you make a drawing. The day after, you print it. And right away, you assemble and experiment. This led us to design the Poppy humanoid robot. Its full skeleton, in white here, is entirely 3D printed. Initially, this robot was designed to study the role of morphology in biped locomotion and human-robot interaction. For example, you see two alternative shapes for the legs, which make it more or less easy to control balance. Let’s look at the video of the Poppy robot.

3D printing actually allows unbelievable creativity and productivity. Beforehand, you needed a whole team of engineers to build such a robot, and many shapes were impossible. With 3D printing, those shapes are very easy to build, and you only need a handful of people. In my team, for example, it was made mainly by Matthieu Lapeyre, the main architect of the robot, and Pierre Rouanet, who did the software.

In addition to being 3D printed, the robot is also open-source, and openness is just as crucial as 3D printing. Open source means that everyone can download from the web the source files of the mechanics and of the software, so that anyone can rebuild, improve, or hack the robot for their own project. So every scientist in their lab, every engineer in a company, every student at school, every artist, every geek
in their garage, can rebuild the robot, build on the great ideas of others, and share their own ideas in turn. This multiplies creativity through social sharing and social innovation. Openness also allows everyone to look at every detail inside. If you think about the kind of experiments with baby robots I talked about before, their full explanatory power only comes alive when everyone can look at every detail inside them. Let’s look at the end of the video.

Just a couple of months after we announced the open-source release of this humanoid robot, more than a hundred institutions, labs, fablabs, geeks in their garages, and students in their schools and universities are already reusing and adapting the robot for their own projects, and sharing their ideas on the websites we created for this robot. We can expect
that in a few months we will see a whole ecosystem of offspring appearing on this website: robots based on Poppy with maybe four legs, eight arms, wheels, or maybe propellers that could allow them to fly. Why not? We are ahead of a fascinating and creative set of new experiments that are now possible. And if we come back to the experiments and questions I talked about at the beginning, it means that everyone, each of you, in your home, in your company, in your university, will be able to fabricate and to understand better things like emotions, learning, and curiosity.

Thank you very much.
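[Editor's illustration] The talk only describes the curiosity mechanism in words, so here is a minimal sketch of the underlying idea: choose the activity whose predictions have recently improved the most. This is a toy model, not the actual Playground code; the activity names, decay rates, and window size are all invented for illustration.

```python
import random

# Toy sketch of curiosity as "learning progress" (inspired by the talk's
# Playground experiment; all names and numbers here are invented).
# Each activity has a prediction-error curve; the agent prefers the
# activity whose error has dropped the most over a recent window.

class Activity:
    def __init__(self, learnable, rate):
        self.learnable = learnable   # can practice reduce the error?
        self.rate = rate             # error decay factor per practice step
        self.history = [1.0]         # prediction-error history

    def practice(self):
        if self.learnable:
            self.history.append(self.history[-1] * self.rate)
        else:
            # unlearnable "noise" activity: error never really improves
            self.history.append(random.uniform(0.9, 1.0))

def progress(activity, window=5):
    h = activity.history[-window:]
    return h[0] - h[-1]              # recent drop in error = learning progress

random.seed(1)
acts = {
    "grasp object": Activity(learnable=True, rate=0.9),
    "vocalize": Activity(learnable=True, rate=0.99),
    "watch noise": Activity(learnable=False, rate=1.0),
}
counts = {name: 0 for name in acts}
for t in range(300):
    # short bootstrap phase, then greedy choice by learning progress
    if t < 30:
        name = random.choice(list(acts))
    else:
        name = max(acts, key=lambda n: progress(acts[n]))
    acts[name].practice()
    counts[name] += 1
print(counts)
```

With these toy numbers, the agent tends to abandon activities once they plateau and to avoid the purely noisy one; the real system works on regions of a continuous sensorimotor space with smoothed error curves, not on three hand-named activities.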
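[Editor's illustration] The language games can likewise be sketched. The following is a drastically simplified variant of Luc Steels' naming game, not the Ergo-Robots implementation: each robot keeps a single preferred word per object and, on a failed exchange, the hearer adopts the speaker's word.

```python
import random
import string

# Drastically simplified naming game, in the spirit of the language games
# from the talk (an illustrative toy, not the Ergo-Robots code): each robot
# keeps one word per object; when two robots disagree, the hearer adopts
# the speaker's word, and the population converges on a shared lexicon.

class Robot:
    def __init__(self):
        self.lexicon = {}            # object -> preferred word

    def word_for(self, obj):
        if obj not in self.lexicon:  # invent a word for an unnamed object
            self.lexicon[obj] = "".join(random.choices(string.ascii_lowercase, k=4))
        return self.lexicon[obj]

def play_game(speaker, hearer, obj):
    word = speaker.word_for(obj)
    if hearer.word_for(obj) == word:
        return True                  # successful communication
    hearer.lexicon[obj] = word       # failure: hearer adopts speaker's word
    return False

def shared(robots, objects):
    # True once every robot uses the same word for every object
    return all(len({r.word_for(obj) for r in robots}) == 1 for obj in objects)

random.seed(0)
robots = [Robot() for _ in range(5)]
objects = ["ball", "cube", "stick"]
games = 0
while not shared(robots, objects) and games < 10000:
    speaker, hearer = random.sample(robots, 2)
    play_game(speaker, hearer, random.choice(objects))
    games += 1
print("converged after", games, "games")
```

Rerunning with a different seed yields a different, equally arbitrary shared vocabulary, which mirrors the talk's point that the converged language cannot be predicted by the engineer.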


  1. notaras1985 said:

    puppy terminator apocalypse is at hand…

    September 27, 2014
  2. zied hamdi said:

    Very interesting subject. I think this project will see big progress in the future, because there are a lot of people who like to work on open source, software or hardware.

    September 28, 2014
  3. Lloyd Cloer said:

    When open-source 3-D Printing matures, the masses will be unstoppable. #3dprinting  

    October 2, 2014
  4. rodolfo arboleda said:

    Someone should translate this into Spanish; it looks interesting.

    October 17, 2014
  5. Lounge lizard said:

    I think he quite unintentionally scared the [expletive] out of these people (at about 13:00).
    Did it sound like the clapping was a bit more subdued than normal?

    October 22, 2014
  6. Nelson LEON CASTILLO said:

    Thank you for letting us bring this to our student research group and our pupils here in Bogotá, Colombia, so that the kids can immerse themselves in the culture of open source, digital fabrication, and makers, and take the first steps toward the fourth industrial revolution of having a robot in every home.

    March 29, 2015
