The Ideal Robot of the Future: How the Evolution of Emotional Robots and “Memory” Technology Will Shape Our Future - “In the Pipeline” Combining Memory and Technology in Pursuit of Future Horizons -

09.06.2022

As robots become more ubiquitous, what sort of relationships will we forge with them? And how will they use data to inform their actions—both the personal data that we’ve allowed our devices to surreptitiously track and the more complex experiential data that devices will be able to collect as sensing technology evolves? Shunsuke Aoki, founder and CEO of robotics start-up Yukai Engineering, discusses these questions and their implications in the field of memory with Shinya Okuda of Kioxia.

Yukai Engineering states its vision as “Using robotics to make the world more fun.” This goal is ingrained in the company’s very name: yukai means “fun” in Japanese. Aoki defines robots as “beings that appeal to people’s emotions and trigger action” and has made it his mission to spread this concept throughout Japan. He travels across the country proposing unique ideas for robotic products and ways to use robots to assist in business. Aoki explains his vision of a future where robots are fully integrated into our lives.

Aoki: At Yukai Engineering, we develop commercial robotic products. As our name implies, we want to create robots that will make the world a happier and more fun place and make their users happy.

Okuda: Sorry, I’m getting distracted by the tail on that cushion. [Laughs.]

Aoki: This is Qoobo, a therapy robot in the form of a cushion with a tail. It’s very simple: you pat or stroke the cushion, and Qoobo wags its tail. Like this. [Demonstrates.]

Okuda: It’s wagging its tail! It looks so happy! So it reacts to touch?

Aoki: Yes. At Yukai, we are constantly brainstorming robot ideas like this one. If there’s an idea we like, we create a prototype and test it.

Okuda: Our ancestors had tails, too. Do you think there is a part of us that has memories or feelings from that time?

Aoki: I think so. Even after millions of years of evolution, we as a species still care about where our tails used to be. Look at other aspects of ourselves. Children can discern animals faster than they can learn a language—an ability that I think is inherited from our ancestors, who had to learn which animals were dangerous in order to survive. I’m fascinated with the idea of building robots that speak to something primal in ourselves.

Okuda: I like being able to hold this cushion in my hands. The weight feels just right.

Aoki: That was an important factor in its design. If it’s too light, it doesn’t feel like a living being. Texture’s also important for the same reason. It takes a lot of work to design a “feel” that stays with you in your memories. By the way, we also released a smaller version of the Qoobo called the Petit Qoobo at the end of 2020. This version has a built-in microphone that reacts to sound. If there’s a loud noise, the Petit Qoobo kind of jumps a bit, and if you’re holding it close to you, you can feel a tiny rhythm coming from inside, as if its heart is pounding.

Robots and Smart Environments

Okuda: These design choices seem to imply that you are envisioning a world where we all live with robots.

Aoki: When most people think of robots, they think of beings that are basically the perfect household servant. Actually creating such a robot would require extremely advanced technology, which would make the robot too expensive for the average person to have inside their home. But at Yukai, we want to create a world where there is at least one robot in every household. I think we’re going to start to see households with a wide variety of robots, rather like households with many pets.

Okuda: I guess you mean multiple robots, each specialized in a different task, rather than one robot that can do everything.

Aoki: Yes. For a robot to be able to do everything, it needs to be packed with sensors and processors. It’s just simpler to, for example, install cameras and sensors in a room than to install a monitoring function in a robot. We are making our environments smarter—a trend that I think will be good for us in the long run. The role of robots is to study the information gathered by our environments to better understand us and predict our actions. That is the future we at Yukai are hoping to achieve.

Okuda: You are right about that trend. I have an AI speaker that I use to control my robot vacuum cleaner and TV. It’s nice to dream of a future of Astro Boys and Doraemon, but I can’t imagine the amount of work it would take to build one of those. [Ed.: Doraemon is a popular Japanese manga and anime character who is a robot with an interdimensional pocket that allows him to pull out a wide variety of useful gadgets. He has come from the future to aid a young boy, Nobita, who is no good academically or at sport.]

Aoki: There’s a team of cutting-edge robot researchers who are trying to develop a robot that can pick one plate at a time from a stack of plates without breaking any. But they still don’t have a clear idea of what purpose such an ability would serve. For example, it would be quicker, more efficient, and cheaper to place dishes in a dishwasher than to have a robot wash the dishes. Just look at our robot vacuum cleaners. Developing a bipedal robot that can use a vacuum cleaner is just so much more technologically demanding.

Okuda: Robotics has advanced quite a bit overseas, where they now have robots that can run, fly, or jump. So, a bipedal cleaning robot isn’t completely out of the realm of possibility. It certainly wouldn’t be affordable, though. [Laughs.] At the end of the day, it makes more sense to place sensors around the house, collect the data in one location, and entrust the storage and protection of the master data to, for example, Kioxia. Any knowledge gleaned from analyzing the data can then be transmitted to Qoobo and other little robots to help inform their responses to the user.

The Reassurance of Having “Someone” to Talk to

Aoki: I think today’s household appliances can be described as robots. But even as these appliances are becoming increasingly robot-like, we still don’t think of them or treat them like robots. To be fair, there is something unnatural about ordering someone you can’t see to vacuum the floor. It’s only human to want to talk directly to someone or something.

Okuda: You have a point. You can address a smart speaker from wherever you are, but it’s not the same as asking a robot directly to take care of a task. You want the robot stationed right in front of the smart speaker so you can address it directly. Talking to a speaker is boring. There’s something comforting about watching a robot physically respond to your command. It can be soothing and also make you feel more attached to the robot.

Most of my appliances at home can be operated by voice. It’s useful, but it would be even better if the appliances could respond to my actions. Imagine a ceiling camera that turns on the hot water heater for the shower or bath if it sees you walking towards the bathroom at a certain time. Of course, it could be annoying if the camera makes a false assumption, but I do think the next step for voice-operated commands is automated operations that are based on your actions, on eye contact, and on behavioral patterns.
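The kind of behavior-triggered rule Okuda imagines could be sketched roughly as follows. This is a hypothetical illustration only; the function name, event format, and the “learned” bathing window are all assumptions made for the example.

```python
# Hypothetical sketch of behavior-triggered automation: a rule that
# fires when a presence event matches a learned behavioral pattern.
from datetime import time

def should_start_water_heater(heading_to, now):
    """Turn on the heater if the user heads toward the bathroom
    during their usual evening bathing window (assumed learned)."""
    bathing_window = (time(20, 0), time(23, 0))
    return heading_to == "bathroom" and bathing_window[0] <= now <= bathing_window[1]

print(should_start_water_heater("bathroom", time(21, 30)))  # True
print(should_start_water_heater("kitchen", time(21, 30)))   # False
```

In a real system the window would be inferred from behavioral history rather than hard-coded, which is exactly where the false-assumption problem Okuda mentions comes in.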

A Robot That Can Read the Room

Okuda: Since college, I’ve been a fan of hard science-fiction worlds like the kind you see in the movie Minority Report. I love the gesture-based user interface featured in that movie.

Aoki: Have you ever imagined a future like the one depicted in the movie?

Okuda: Now that I’m a little bit older, I think using that interface would just be too much work. [Laughs.] At some point, your shoulders are going to be too weak for you to move your arms and hands around like that. So, I think the end goal is something like a brain-machine interface.

Aoki: We already have appliances, such as TVs, that react to voice commands like “switch on.”

Okuda: Right, but they’re not that widespread yet. Especially in Japan, where homes tend to be small.

Aoki: I do think eye-tracking and fingertip detection are the next steps in user interfaces. Correctly detecting finger movement would require installing sensors throughout the environment, but eye-tracking would be easier to implement. In the end, though, it would be nice to have interfaces that can read the room, so to speak. So if you enter the bathroom, it asks if you’d like the hot water turned on in case you’re there to take a shower or bath.

Okuda: I think that’s a good idea, but on the other hand, even Nobita sometimes gets fed up with Doraemon always trying to help. [Laughs.] A robot that cares too much might tell you to go to sleep when what you really want to do is play video games all night.

Aoki: I’m the opposite. I would love to have a Doraemon in my life because I can’t do anything right on my own. [Laughs.] I want lights that gradually dim as my bedtime approaches. Rather than doing whatever I want to do, I want to be conditioned by my environment. That’s the kind of future I’d like to see.

A Future Enabled by Robots and Massive Storage Capabilities

Developing an emotional attachment to robots could generate the kind of communication that produces data of a much higher quality than the big data accumulated to date. Researchers may be able to acquire a far larger trove of healthcare data to apply in their fields, while individual users may be able to revisit memories and vividly recreate their sensory experiences. As Okuda and Aoki continue their discussion, they imagine the lifestyles that could become possible with robots and high-capacity storage.

Aoki: I’d like to take this opportunity to learn more about your work, Shinya.

Okuda: My team and I create flash memory. I’m personally involved in developing process technologies for next-generation electronic devices. You’ll probably find the results of my work in your smartphone and laptop computer.

Aoki: What are process technologies?

Okuda: Manufacturing a flash memory unit involves numerous steps. These include coating the silicon substrate that forms the base with an ultrathin film only a few nanometers thick, imprinting the circuit design into the material, cleaning the unit, and putting the unit through a number of tests. The technologies used in these steps are collectively known as process technologies, but each step requires different technological expertise. For each step, expert engineers research and develop process technologies suitable for next-generation product designs, as well as technologies for speeding up the production line. I am in charge of coating technologies, which vary depending on the usage—insulation, for example, or storing electric charge. Hundreds of these process technologies go into making a single flash memory unit.

If even just one of these processes is flawed, the product will be defective. That’s why we keep a huge amount of log data from every step in the manufacturing process. If there is a problem, the engineers can look over the data and figure out a way to solve the issue. Mass production involves a huge number of physical parts, and to ensure all the parts go through the process without a hitch, we continuously accumulate data and use AI to analyze it. To put this in robotics terms, we accumulate massive volumes of behavioral data from our semiconductor manufacturing machines, store the data in memory units with enormous capacity, and use AI to analyze the data. In the future, we might develop an AI-assisted robot that reads the analysis and suggests fixes.
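One of the simplest forms the log analysis Okuda describes could take is flagging readings that deviate sharply from the rest of a step’s log. The sketch below is an illustration only, not Kioxia’s actual tooling; the function, the z-score threshold, and the film-thickness figures are all assumptions.

```python
# Illustrative sketch: flagging anomalous readings in process log data
# with a simple z-score check (one basic example of analyzing
# manufacturing logs; not an actual Kioxia method).
from statistics import mean, stdev

def flag_anomalies(readings, threshold=2.0):
    """Return indices of readings more than `threshold` standard
    deviations from the mean of the log."""
    mu = mean(readings)
    sigma = stdev(readings)
    if sigma == 0:
        return []
    return [i for i, x in enumerate(readings)
            if abs(x - mu) / sigma > threshold]

# Hypothetical film-thickness measurements (nm) from a coating step,
# with one out-of-spec reading slipped in at index 5.
log = [5.01, 4.98, 5.02, 5.00, 4.99, 7.50, 5.01, 4.97]
print(flag_anomalies(log))  # → [5] (the 7.50 reading)
```

In practice such checks would run continuously over far richer multivariate logs, and the AI analysis Okuda mentions would go well beyond a single threshold, but the principle of comparing each step’s data against accumulated history is the same.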

Aoki: Interesting.

Okuda: When we discover an issue during the development process, it’s often difficult to come up with the correct solution on the spot. It takes many engineers putting their heads together to come up with a solution model that they can test and adjust until they have fixed the issue. I’m hoping that eventually a robot will be able to propose these solutions.

Developing Infinite Storage for Healthcare Data

Okuda: Our semiconductor manufacturing machines use sensors to monitor the product as it’s being made so they can warn us before any abnormalities in the product’s features occur. It’s the same mechanism found today in smartwatches and other wearables that allows a device to use sensory data to warn the user before a serious medical event occurs. I read a news article the other day about someone who collapsed but whose wearable device detected the impact—as well as abnormalities in the person’s heartbeat—and called for help in time. These devices use data to predict heart disease and monitor the user’s health.

Aoki: One major topic in Japan is the aging population. I’ve heard there are now 10 million people in Japan with a high risk of dementia; it would be nice for these people to have wearable devices that can bring them some peace of mind. Charging the devices is a pain, though. When I wake up in the morning, there’s about a fifty-fifty chance that my devices are fully charged. Imagine how much more difficult it is for someone elderly.

Okuda: We need devices that can stay on all the time.

Aoki: Another issue related to devices and healthcare data is privacy. You can’t just put all this data into the cloud. And although cameras that can detect danger are useful, you also don’t want your every action to be recorded. This kind of sensitive data needs to be processed at the edge.

Okuda: That requires an edge computing infrastructure with enormous storage capacity. Say the system logs data once per minute instead of once per second. That’s still 60 readings an hour, 24 hours a day, 365 days a year. In five or ten years, the volume of data will become absolutely massive. And with better sensors, the system might pick up even more data per minute. Since we are a company that develops storage units, our goal is to create units that can collect this vast trove of data without a drop in performance. In other words, we want to develop memory that can store an infinite amount of data.
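Okuda’s back-of-the-envelope figures are easy to check. The sketch below works through the arithmetic; the 1 KB-per-reading size is purely an assumption for illustration.

```python
# Back-of-the-envelope check of the data volumes described above:
# one reading per minute vs. one per second, over several years.
READING_BYTES = 1024  # assumed size of one sensor reading

def readings_per_year(per_minute=1):
    return per_minute * 60 * 24 * 365

per_minute_year = readings_per_year()    # one reading per minute
per_second_year = readings_per_year(60)  # one reading per second

print(per_minute_year)   # 525600 readings per year
print(per_second_year)   # 31536000 readings per year
# Ten years of per-second readings at 1 KB each, in gigabytes:
print(per_second_year * 10 * READING_BYTES / 1e9)  # ≈ 323 GB
```

And that is a single sensor; multiply by the number of sensors in a home and by richer future readings, and the need for high-capacity edge storage becomes clear.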

Physicality’s Role in Data Quality

Aoki: How flash memory works is, you enter a specific address inside the memory, and the memory reads out the data located at that address, right? Now, I don’t know much about brain science, but that’s not how human memory works. You remember things in the form of a story or as an order of events. I’ve been reading a book by the neuroscientist Jeff Hawkins, and he maintains that information related to where something happened is intertwined in every brain cell.
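The contrast Aoki draws can be made concrete with a toy example: address-based lookup, where you must know exactly where the data lives, versus associative recall, where a partial cue pulls up linked memories. The data and names below are invented purely for illustration.

```python
# Toy contrast between the two retrieval models Aoki describes.

# Address-based lookup (as in flash memory): you must supply the
# exact address to read back the data stored there.
flash = {0x00: "beach trip", 0x01: "ramen shop", 0x02: "first job"}
print(flash[0x01])  # "ramen shop"

# Associative recall (closer to his description of human memory):
# a partial sensory cue retrieves every memory linked to it.
memories = [
    {"event": "ramen shop", "cues": {"smell of broth", "rainy night"}},
    {"event": "beach trip", "cues": {"salt air", "summer"}},
]

def recall(cue):
    return [m["event"] for m in memories if cue in m["cues"]]

print(recall("smell of broth"))  # ["ramen shop"]
```

The second model is exactly what Okuda describes next: a smell or flavor acting as the cue that brings a whole linked memory back.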

Okuda: Yes, memory and locational information are closely linked. Memory can also be triggered through sensory information such as smells and flavors. There was a man who had amnesia, and eating his favorite type of ramen brought his memory back. There’s a lot of data to which a memory is linked: locations, sensory experiences, emotions.

Sometimes you experience events that you do not enjoy, but which you then look back on fondly ten years later. You are remembering the exact same event, but you interpret it differently. It’s psychological. To actually replay a memory inside your head, you need to physically reshape your brain into what it was at the time you experienced it. That brings up another possibility: if we could implant someone else’s experiences inside your head—like in Total Recall—then you might be able to differentiate between your own memories and those of this stranger. But who knows for sure?

Aoki: Depending on how realistic the experience is, you may have trouble deciding if it’s a memory or reality.

Okuda: I think that level of realism is what we want to ultimately achieve with memory recreation. And to get there, we first need to figure out how to recreate sensory experiences.

Aoki: Interesting. I wouldn’t mind living in an apartment that records my memories. A place that records everything in my life and uses relevant data to tell me important things. Like, “You’ve been drinking too much.” [Laughs.]

Okuda: “What were you doing out so late?” [Laughs.]

Aoki: Right. And if I wanted to study English, the apartment might analyze what I need to find the motivation to do this and remind me. I really want a place like that—one that can push and pull me in all the right directions. And I want to live there with all my Yukai robots.

Okuda: I feel like Yukai robots could be engineered to gather more personal data through the act of hugging. There’s always some kind of emotional significance behind any physical action, whether it’s hugging, patting, stroking, or touching. For example, Yukai’s Amagami Hamu Hamu [robots that come in cat and dog form that replicate the playful biting of pets] might be able to tell something’s wrong if the user is unusually rough in the way they insert their finger in the robot’s mouth! [Laughs.] Or maybe you could design them so that they can read your pulse and measure your blood pressure when you insert your finger.

You usually don’t physically deal with the sensors inside your home. But Yukai’s products often encourage physical interaction, which is a great way of gathering more personal data—data that is different in both type and quality from the data a stationary sensor can accumulate. There’s data that even wearables such as an Apple Watch won’t be able to pick up, but which can be detected through physical actions like touching or sticking your finger into a robot.

Aoki: That is really fascinating.

Enabling Collaboration by Accumulating Stories Linked to Memories!

Okuda: The robots you create at Yukai are valuable in that they appeal to our emotions, but I think they’re also valuable in terms of their potential for gathering data.

Aoki: I don’t think the emotional link is necessarily something distinctive about our robots. I think we’re already wired to connect emotionally with robots.

Okuda: I think emotions are the key to our future with robots. But recording our emotional interactions with robots as data might be very difficult. It would be neat to develop devices that can trigger emotional reactions in users and employ sensors to record those reactions as data.

Aoki: There’s nothing really out there right now that does that.

Okuda: No. The fun thing about emotional reactions is that there isn’t a correct one. You might look back on an event and think, “Oh, that was fun,” or “That was delicious.” But someone else looking back at the same event might feel angry.

Aoki: There’s an American adventurer who once said that when he looks back on the happiest moments in his life, he always discovers something painful or sad that was intrinsic to that event.

Okuda: Happiness as both joy and the release from pain? I guess there are some things in life you never want to go through again, but which allow you to feel a great sense of accomplishment when you get past them. This idea of being able to feel such exuberant joy because of something painful in the past tells me that stories are an important aspect of our memories.

Imagine being able to accumulate this story data through a variety of means so that Yukai’s products and other cutting-edge devices can study it and store their analyses in Kioxia’s memory units. We may be able to employ this knowledge in the future to help make the world a better place.

Aoki: Well, I’m on board!

The content and profile are current as of the time of the interview (Jun 2022).