
When Momo the Robot met children

This is a story of children meeting a robot. There's a video at the end.

Six months ago Prizztech (a non-profit business development company owned by the municipalities of the Satakunta region on Finland's west coast) discussed a research case with our robotics lead Olli Ohls. They wanted us to test the use of a humanoid robot in speech therapy with autistic children, working with a hospital district in the same region.

The robot would need to be able to speak Finnish and use some sign language. Now, would we happen to have such a robot?

Well, not really.


Our open source InMoov robot – which Olli had built earlier – sure could wave its arms about, but its fingers were too clumsy for sign language. We also had some challenges with its mechanical reliability, mostly due to how we had put it together. And the controlling library wasn't well suited for this kind of purpose. Not enough precision.

Wouldn’t work, I thought. Too challenging. Not enough time to improve it.

However, Olli persisted. A robot helping autistic children with sign language, though? With an actual hospital district? Trying something that hasn't been done anywhere? Isn't this what Chilicorn Fund – our social responsibility program – is all about?

So inevitably, after wasting some time on my hand-wringing, we took the case.

Fortunately our designer Minja Axelsson decided to wrap her thesis around this project. She studied the subject and planned, together with the hospital district's neuropsychologist and speech therapist, how the interaction between the robot and people should play out. What the robot needs to be able to do. How it should look. Under her direction, Olli and I began transforming our generic InMoov into Momo, the Friendly Sign Language Robot.

Luck struck twice with the hands; Open Bionics had earlier open-sourced their Ada hands, which seemed to do what we wanted. Then, with minimal effort, I found us a great student team from Metropolia polytechnic to build those hands. It took them a few months, but they delivered, just in time.

Embedded content: https://vimeo.com/274641090

Meanwhile, Olli and I had to rebuild the robot's arms entirely with a sturdier, more reliable design. I needed to port the robot control software to run on a Mac, reprogram the robot, and teach it some sign language.

Kludges and hacks

When we got the new hands attached, some time was lost trying to integrate them with the controlling software. It couldn't be done – not really, not in any sensible way. So, running out of time, I discarded any pretense of professionalism and piled hack upon hack. Separate Python daemons to talk directly to the left and right hand over serial. Breaking out of MyRobotLab's gesture code to echo commands over netcat to those daemons. It's all very horrible. I mean, look at this:

# make a fist
print commands.getoutput('/bin/echo "950,950,950,950,950" | /usr/bin/nc localhost 10000')
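
Each hand daemon, for the curious, was essentially a tiny relay between a local TCP port and the hand's serial port. The sketch below shows only the general idea; the device path, baud rate, port number and wire format are placeholder assumptions, not the actual code.

# hand_daemon.py – sketch of one per-hand daemon (device path, baud rate
# and port number are placeholders)
import serial   # pyserial
import socket

ser = serial.Serial('/dev/tty.usbmodem-left-hand', 9600)   # the hand's Arduino

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(('localhost', 10000))
srv.listen(1)

while True:
    conn, _ = srv.accept()
    data = conn.recv(1024).strip()       # e.g. "950,950,950,950,950"
    if data:
        ser.write(data + '\n')           # forward the finger positions as-is
    conn.close()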

A tablet was needed on the robot's chest to display pictures of the subject of the sign language, to study whether that had an impact. I bought an iPad Mini 4 and attached it with duct tape. Then I created an index.html with...

<img src="pics/showthis.jpg">

...that I served from my Mac through Python’s SimpleHTTPServer. From the robot’s control code, again, I would break out and rename files that we wanted to display:

print commands.getoutput('/bin/cp /Users/ttur/Documents/myrobotlab.1.0.2693/tablet/pics/ipad_'+img+'.jpg /Users/ttur/Documents/myrobotlab.1.0.2693/tablet/pics/showthis.jpg')

The Dolphin browser on the iPad would display the images accordingly. With a world-class flicker!
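
The server side was nothing fancy either: Python's built-in SimpleHTTPServer pointed at the tablet directory. Something along these lines – the port is an assumption:

# serve_tablet.py – sketch of serving the tablet page from the Mac
# (Python 2 era; the port is a placeholder)
import os
import SimpleHTTPServer
import SocketServer

os.chdir('/Users/ttur/Documents/myrobotlab.1.0.2693/tablet')
httpd = SocketServer.TCPServer(('', 8000), SimpleHTTPServer.SimpleHTTPRequestHandler)
httpd.serve_forever()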

With the individual actions resolved, how should we control the flow?

Minja provided a script for the session. Say hello, my name is Momo. Say I'm a robot. Look left and right. Say something else. Make the first sign language sign. Observe what happens. On a cue from the speech therapist in the room, choose the next action. Thumbs up, nice try, no reaction?

It was less than a week to the deadline. I just made one long program file with all the actions in the script, commented them all out, and figured I'd control the robot manually by uncommenting the relevant lines, running them, commenting them again, uncommenting the next relevant lines…
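
The file looked something along these lines. This is a reconstruction of its shape only; the helper names are placeholders, not the actual MyRobotLab gesture code:

# session script sketch – uncomment a block, hit run, comment it again
# (function names are placeholders, not the real gesture code)

# --- greeting ---
#say(u"Hei, minun nimeni on Momo.")   # "Hello, my name is Momo."
#say(u"Olen robotti.")                # "I am a robot."
#look_left()
#look_right()

# --- first sign ---
#sign_ball()
#show_image("ball")

# --- reactions, on a cue from the speech therapist ---
#thumbs_up()
#say(u"Hienoa!")                      # "Great!"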

And so it continues for circa 300 lines. In retrospect, do I recommend this approach? I do not. It is too easy to make mistakes while executing it.

To get the robot to speak Finnish, I ended up using the macOS system narrator voice called Satu. The implementation is not perfect – our language is small, and text-to-speech and speech-to-text implementations are woefully few – but at least you can control the pitch and rate, and add some pauses where necessary.

Pitch adjustment was necessary to make the voice somewhat genderless. Some words and names required creative spelling to get right; Teemu would be better written as "tee mo" for the text-to-speech.
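
If you want to hear the voice yourself on a Mac, the same shell-out approach used elsewhere in these hacks works with the system say command. A rough sketch; the rate, the pause tag and the wrapper are illustrative, not the exact setup used in the sessions:

# speak.py – sketch of driving the Satu voice via the macOS `say` command,
# in the same Python 2 spirit as the other hacks (values are illustrative)
import commands

def speak(text, rate=170):
    # -v selects the voice, -r the speaking rate (words per minute);
    # [[slnc 400]] embeds a 400 ms pause into the text
    print commands.getoutput('/usr/bin/say -v Satu -r %d "%s"' % (rate, text))

speak('Hei, [[slnc 400]] minun nimeni on Momo.')   # "Hello, my name is Momo."
speak('Olen robotti.')                             # "I am a robot."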

Duct tape

My stress level was peaking that weekend. Monday and Tuesday were booked for testing with the people the hospital district had arranged. Permissions had been obtained, parents convinced, and families had made various travel arrangements to get to the Antinkartano rehab centre in Ulvila. Just so that their children could have a brief discussion with a robot, and we could perhaps learn something.

Thursday night Olli was still 3D printing non-stop in his hotel room in the nearby city of Pori.


On Saturday I was alone at the hospital district's site after dark, trying to program the robot to make a better sign for "ball", for "flour", for "cat".

On Sunday Minja arrived and we got to properly test the flow for the first time. To my surprise it kind of worked, but how reliable could this possibly be?

We already had seven cables running between the rooms. My Mac kept losing the USB hub on the robot's back for some weird-ass reason, and the only way to get it back was to either reboot the Mac or power cycle the hub. So Olli added a separate extension power cord between the rooms just so I could restart the hub remotely. Eight cables.

We had to cut some corners. Minja wanted attention lights on the robot's arm as part of her study. I had a programmable solution in mind, but in the end I just bought an LED strip, taped it around the robot's arm, and added yet another extension cord between the rooms so I could manually switch it on and off. Nine cables.

One last-minute realisation was that the rehab centre WiFi was too unreliable, which compromised the tablet's connection to my Mac's web server. Instead of going double digits with the cables, I taped my phone to the floor of the corridor in between to serve as a mobile hotspot between the tablet and my Mac.

The final touch was a small white towel nicked from the hotel to cover the robot's back, so that the Arduino lights would not reflect off the window behind it and possibly distract. I later paid for the towel.

Live

Finally it was Monday morning, April 16th, 2018. A taxi was waiting at 6:30 am, the first session starting at 8 am. We had an hour for final tests and refinements. At five minutes to eight we thought we might be ready.

At eight the first session started. Minja was observing the room through various camera feeds and giving me instructions on what the robot needed to do. Olli was armed with tools and more cables, ready to fix anything that would break.


A young kid was escorted into the room and sat on a chair across a table from the robot. He looked straight at the robot. Curiously? Can’t really say. Fearfully? Fortunately not – that was my main concern. A brief silence in the room, uncommenting the first commands, clicking run…

“Hello, my name is Momo. I am a robot.”


Everything worked. Throughout the six sessions on the first day, and the six sessions on the second day.

Sure, I made a few mistakes commanding the robot. Chose the wrong line to say a few times. Misunderstood Minja's directions. Occasionally there were uncomfortably long pauses while I looked for the right code to uncomment. I even had to reboot the robot once mid-session, because I clicked a few millimetres in the wrong place and MyRobotLab jettisoned all the preloaded code.

It was a very intense experience. My window into the room was mainly the robot's eye camera; Minja and Olli were keeping an eye on the other cameras. When people made eye contact with the robot, they were looking directly at me. How will they react? What will they do? How do they feel about this? Between sessions I went for walks outside to hug trees, to remain reasonably calm.


Only a few sessions didn't run all the way through. Many were completed quite quickly, because the children decided to play along with the robot according to the script, mimicking its movements or trying their best.

One young chap even imitated the clumsy servo movements of the signs I had failed to perfect, for instance a sign for "ball" that is a bit angular towards its lower end. He did it in real-time sync with the robot, with no delay my eye could perceive.

Another young gentleman didn’t so much care for the script, but instead he laughed out loud the whole session through. It was contagious. It’s hard to control a robot while laughing.

In conclusion

This was a very interesting project and I believe there’s a good chance for a more ambitious continuation with proper funding.

As for the results, it is too early to say just how much robots can help in this area. Determining that would require proper tests run over a longer period of time. However, because we decided to invest in this spike, the likelihood of such tests taking place in the near future – with public funding, perhaps – is now that much higher.

This is the kind of value creation companies like Futurice are well equipped for. We can experiment with emerging technology, with our lean, user-centric approach, on real social challenges that carry a high potential for impact. The advances of recent years in 3D printing and cheap (IoT-driven) electronics make it easier for us to include hardware components such as this robot in our projects and to modify them as necessary. Open-source hardware is a very interesting domain, and I believe it will have a big impact on innovation in robotics. We will find ways to contribute to that.

I am happy and proud we have been able to do a project like this. I believe we will continue on this path.

And finally, a video clip. Thanks for reading!

Embedded content: https://vimeo.com/274637859

Author

  • Teemu Turunen
    Corporate Hippie