The Spoon

Daily news and analysis about the food tech revolution


Sony

October 30, 2024

A First Look at Roku Shoku, Sony’s Culinary Recording System to Capture and Replicate Chefs’ Recipes

This past week in Japan, Sony unveiled a project they’ve been developing in secret called Roku Shoku, a culinary recording system designed to capture exactly how a chef prepares a meal. The system also serves as a guidance tool, helping casual or inexperienced cooks create dishes with the precision of an expert with years of training.

Sony has been working on this project, which stands for “Record” (Roku) and “Cooking” (Shoku), for the past five years. Last week, the entertainment and consumer electronics giant held the first-ever press demonstration of the recording studio for The Spoon team.

“We have a recording studio here in Tokyo,” said Tomoko Nomoto, Project Leader for Roku Shoku. “We invite Michelin-starred chefs, or even grandmothers, to the studio and ask them to cook with our system. We then record their culinary data, including temperature, steam levels, and the entire cooking process.”

The project is led by a Sony R&D team out of Tokyo and is separate from the gastronomy research conducted at Sony AI, the company's AI research division (the formal Gastronomy program announced in 2020 has been sunset, but Sony continues to work on gastronomy-related projects). Since launching the Tokyo recording studio in 2021, the team has captured thousands of recipes across a range of cuisines, including Japanese, Chinese, French, Italian, and Thai.

The Roku Shoku system features induction cooktops with temperature sensors, scales to monitor and weigh ingredients, cameras to capture a chef’s movements, and an off-the-shelf Steam game controller to control the setup.

Nomoto shared that users can replicate meals precisely as chefs cook them, a claim I tested myself. You can watch me trying it in the video below.

First Ever Look at Sony's Roku Shoku Culinary Recording System

According to Nomoto, the goal is to use Roku Shoku both to document recipes for restaurants and food service locations and to preserve culinary creations for future use.

“The first step will be to work with restaurants that want to share a consistent experience worldwide or recreate dishes that are no longer available, like when a chef passes away or retires,” said Nomoto.

Spoon readers might recall Cloudchef, another system that records chef creations. Nomoto explained a key difference: Sony plans for Roku Shoku to enable only human chefs to recreate these meals, while Cloudchef eventually aims to use robots for meal replication. Currently, both systems are focused solely on human use (see Spoon’s Tiffany McClurg using the Cloudchef system here).

The company has launched a website where you can find out more and request a demo.

August 23, 2021

Forget Watching TV. Sony Wants Your TV To Watch You (and Monitor Your Eating)

The couch in front of our TV is a sacred no-judgment zone, the place where we wear sweats, eat pizza, and binge-watch Cobra Kai on Netflix.

But what if the TV watched and, well, judged us? That’s the dystopian future Sony is thinking about, at least if US patent number 11,081,227 is any indication.

Called “Monitoring and reporting the health condition of a television user,” the patent describes a television with embedded cameras and sensors that monitor everything a person does while slouched on the couch, including tracking the food they eat. The patent was awarded to Sony on August 3rd of this year.

So why is Sony looking to invade the safe space on our couch? According to the patent, the thinking is that a TV like this could monitor the viewer’s health by recording activities such as eating habits and looking for signs of an impending health problem.

Illustration of Remote with Biometric Sensors from Sony Patent #11,081,227

From the patent:

People spend a significant number of hours sitting in front of a television (TV). They may have a hidden health problem or a diminishing health condition. The system monitors one or more health vitals of the user such as heart rate, etc. While sitting in front of the TV, they may behave in an unhealthy manner. For example, they may eat too much while watching a TV program. The system may also monitor the types of food a person eats while watching TV.

The patent describes a system that would use a camera embedded in the TV to log a person’s food consumption and monitor eating patterns such as chewing speed, posture, and how fast they eat. The system would also identify the type of food, estimate portion size, and take a guess at how many calories are consumed. All that info could be synced with biometric monitoring (the patent suggests a heartbeat sensor in the remote) to paint a profile of a person’s health and how their TV watching habits impact it.

So who needs a TV to monitor their food intake? I guess if one consumes lots of empty calories while sitting in front of the tube, this might prod them to change their habits. But on the other hand, do we really need a TV to tell us we’re eating poorly?

And then there’s the problem of consumer privacy. Some of you may remember the controversy a few years back over a few lines in the terms of service of a Samsung smart TV that suggested the company might capture conversations and sell them to third parties. I can only imagine what people might think about cameras on their TV watching their every move.

In Sony’s defense, this isn’t close to being an actual product. Patents are oftentimes just corporate thought experiments, where R&D managers with budgets to burn ask “what if?” The likelihood of Sony actually building its big brother TV is probably pretty low.

Probably. But maybe someday they will, and if they do, it’ll be interesting to see if consumers are OK with the idea of a TV watching them, trying to make them better people. But hey, if you’re really watching Cobra Kai and downing whole quarts of ice cream every Friday night, maybe you could use the help of your Sony 70″ 4K OLED TV to perform a little self-care.

March 29, 2021

Food Tech Show Live: Sony Invests in our Robot Chef Future

The Spoon team recently got together on Clubhouse to talk about some of the most interesting food tech and future food stories of the week. This time around, we were also joined by food tech investor Brian Frank.

If you’d like to join us for the live recording, make sure to follow The Spoon’s Food Tech Live club on Clubhouse, where you’ll find us recording our weekly news review every Friday.

The stories we talked about this week include:

  • Cell-Cultured Fish Startup Bluu Biosciences Raises €7 million
  • The Rise of ‘Premium’ Cultured Meat Startups
  • Sony Invests in Analytical Flavor Systems and our Robot Chef Future
  • NASA Harvest Partners with CropX to Combine Soil Monitoring and Satellite Data
  • Ex-WeWorkers Launching Santa, A Hybrid ‘Retail Experience’ Startup Focused on ‘Small US Cities’

As always, you can find the Food Tech Show on Apple Podcasts, Spotify or wherever you listen to podcasts. You can also download direct or just click play below.

March 24, 2021

Sony Invests in Analytical Flavor Systems To Help Our Robot Chefs of the Future Better Predict What We Want to Eat

When Analytical Flavor Systems (AFS) CEO Jason Cohen talks about his company’s technology, it usually involves a story about helping a big food company. In particular, how AFS’s AI flavor platform, Gastrograph AI, can predict how a new product such as a bag of chips or energy drink might perform in a new country or with a new demographic group.

Ask him to tell you about his company’s flagship product in the future, though, and the conversation might well include discussing how it helped a robot chef decide what exactly to make us for dinner.

That’s because the New York-based startup has recently taken a corporate investment round from Sony, which late last year announced a dedicated project focused specifically on developing food-focused AI and robotics technology.

According to Cohen, Sony’s investment team believes AFS’s technology could help them achieve their goal of creating robot chefs that know exactly what to cook.

“Once you get a food robot up and running, you don’t know what people are gonna want to eat, what they’re going to prefer, like and dislike,” said Cohen in a Zoom call.

But with a technology like Gastrograph, robots of the future might be able to predict the flavor combinations consumers want more accurately than consumers themselves can.

“Consumers have no idea what they like and dislike,” said Cohen. “They’re actually really bad at determining whether, say, they want more vanilla in their vanilla cup. You ask them and they say ‘yes’, you put it down and they want less. Sony recognized that and that’s what part of this investment is for: future innovation and growth, to help them accomplish their goals of getting cooking robots into every home.”

In addition to Sony, AFS also took on corporate investment from BASF. According to Cohen, the German conglomerate’s plant breeding and genomics program is interested in using Gastrograph AI to help it in seed development, so it can ultimately predict outcomes with a higher success rate for the crops it is breeding.

“It takes a really long time, anywhere from a year to a couple of years, even decades, on some crops, in order to breed in specific traits,” said Cohen. “And so they work with us to rapidly screen and profile the fruits that they are market leaders in, or the fruits that they want to become market leaders in, and to breed better seed stock that they can then sell to farmers in different countries around the world.”

While Cohen wouldn’t disclose the amount of investment from Sony and BASF, he did say that this was an in-between “corporate” round that the company had been discussing with the two strategic investors before COVID hit, so when everything shut down in March of last year things were “already in place.” Next year, said Cohen, he expects the company will raise a Series A round of funding.

Speaking of COVID, I asked Cohen how his company did this past year. He said their core business of working with CPG innovation teams actually benefited, because companies wanted to keep their innovation pipelines moving during a time of severe travel restrictions, and AFS’s technology allowed them to do that. Since the normal big food practice of testing products in-market with focus groups was curtailed during lockdowns, some companies relied on Gastrograph AI to predict how a new product might do.

This recent investment follows a 2018 seed round investment of $4 million.

December 15, 2020

Sony AI Unveils Trio of Food Projects Including AI-Powered Recipes and Robots

Sony AI, an artificial intelligence and robotics research and development company spun out from Sony, today announced its “Gastronomy Flagship Project.” The new food-related endeavors include an AI-powered recipe creation app, a robot chef’s assistant and a community co-creation initiative.

Sony launched Sony AI last November and the unit became its own company in April of this year. Sony AI’s mission is to develop AI-related products for videogames, imaging and sensing, and gastronomy “with the aim of enhancing the creativity and techniques of chefs around the world,” according to today’s press announcement.

To that end, the three projects Sony AI announced today are:

  • AI-Powered Recipe Creation App – Gathering up data such as aroma, flavor, molecular structure, nutrients and more, this app will use AI algorithms to help chefs create novel food pairings, recipes and menus.
  • Chef Assisting Cooking Robot – This is pretty self-explanatory. Sony AI will develop a robot that can mimic the physical actions of a human chef to do everything from preparing to plating dishes.
  • Community Co-creation Initiative – This project is a little more vague, with Sony AI only saying it will “aim to contribute to the long-term sustainability of the community” through relationships with universities, research institutes, companies and more. The first step in this process is a “Chef Interview” video series on the Sony AI website.

Sony AI’s press release didn’t provide specific details around when or where these projects would launch (other than the video series).

But today’s announcement continues a trend we’ve been seeing with large electronics corporations doing advanced work on food robotics and AI. LG is working on robot waiters with Woowa Brothers in South Korea. And Panasonic is working with the Haidilao chain of hot pot restaurants in China to develop a robotic kitchen.

Sony itself is no slouch when it comes to food-related robots. The company collaborated on cooking robot research with Carnegie Mellon University a couple of years back. And it has a pretty grand vision for advanced AI-powered cooking robots and assistants that could make Michelin-star meals in your home.

November 21, 2019

Sony Sets up AI Unit to Work on Food

Sony announced this week that it has launched Sony AI, a new organization that will research and develop artificial intelligence specifically for games, imaging and sensors, and “gastronomy.” The new initiative will have offices in Japan, Europe and the U.S.

There aren’t many details around what exactly Sony will be working on, but Sony spokesman Shinichi Tobe told Al Jazeera yesterday that “AI and robotics will not replace chefs. We are aiming to offer new tools to expand their creativity with AI and robotics.”

This isn’t Sony’s first foray into food. In April of last year, Sony teamed up with Carnegie-Mellon University to work on food robots. As we reported at the time:

Sony said they were starting off with food-related robots because the complexities involved with food could later be applied to a wider range of industries. Specifically, it cited the ability to work with fragile and odd-shaped materials, as well as the ability to operate a robot in small spaces.

AI and robots are like peanut butter and chocolate, with AI serving as the “brain” for the robot “hands.” Things like computer vision, deep learning and synthetic data form the AI so the robot can determine which objects to grab and how to manipulate them.

Sony’s motivations may also be more societal in nature as the company’s home country of Japan is facing an aging population. Robots and other forms of automation could help with a potentially diminished labor force.

Food is a popular subject for robotics and AI researchers. Nvidia’s Lab in Seattle built a kitchen to train its robots to do everyday tasks. IBM partnered with spice company McCormick to use AI to develop new food products. And Korea’s Woowa Bros. hooked up with UCLA to work on food robots as well.

Something tells me we’ll be seeing more of these types of deals throughout next year.

June 3, 2019

Video: Sony’s Masahiro Fujita on Bringing AI and Robotics to Food

We were thrilled when Masahiro Fujita, Chief Research Engineer of the AI Collaboration Office of Sony agreed to be a speaker at our ArticulATE food robotics conference in April. He started Sony’s Robot Entertainment project in 1993 and led the development of the famous AIBO robotic dog.

During his ArticulATE presentation, Fujita talked about how historically, Sony has provided technology throughout the entertainment stack. Its studio Sony Pictures finances movies, filmmakers use Sony products to shoot and create movies, and consumers can watch movies on their Sony Blu-Ray or PlayStation Network.

Fujita said that Sony views the food world in much the same way. It wants to provide the underlying robotic and AI technology that can help creative types like chefs make their food, as well as the mechanisms for people at home to enjoy high-level cooking.

Because the technology is still so nascent and not ready for prime time with consumers, Sony is looking first at B2B applications. The company really wants its AI and robots to be able to make meals in a three-Michelin-star restaurant. Those of us waiting for robotic help at dinnertime at home are going to have to wait a while.

Check out Fujita’s full presentation (and a glimpse into the robo-chef future) from ArticulATE below:

ArticulATE 2019: Where Is It All Going? A Look Forward with Sony's Masahiro Fujita

April 16, 2019

Here’s The Spoon’s 2019 Food Robotics Market Map

Today we head to San Francisco for The Spoon’s first-ever food-robotics event. ArticulATE kicks off at 9:05 a.m. sharp at the General Assembly venue in SF, and throughout the daylong event the talk will be about all things robots, from the technology itself to the business and regulatory issues surrounding it.

When you stop and look around the food industry, whether it’s new restaurants embracing automation or companies changing the way we get our groceries, it’s easy to see why the food robotics market is projected to be a $3.1 billion market by 2025.

But there’s no one way to make a robot, and so to give you a sense of who’s who in this space, and to celebrate the start of ArticulATE, The Spoon’s editors put together this market map of the food robotics landscape.

This is the first edition of this map, which we’ll improve and build upon as the market changes and grows. If you have any suggestions for other companies or see ones we missed you think should be in there, let us know by leaving a comment below or emailing us at tips@thespoon.tech.

Click on the map below to enlarge it.

The Food Robotics Market 2019:

November 9, 2018

Sony Shows Off its Vision of the Robotic Kitchen in New Promo Video

Earlier this year, consumer electronics giant Sony announced a collaboration with Carnegie Mellon University to research robots for “optimizing food preparation, cooking and delivery.” This week, we got a sense of how those robots might look via a new Sony promo video.

Titled “AI x Robotics x Cooking,” the video shows off two scenarios. The first is an older gentleman who is prepping for what seems to be a Thanksgiving get together with friends and family. In it, the robot assistant, which looks like a long, plain countertop, quietly springs into action as the man tells it that there are more guests coming to the party.

Robotic arms rise up and chop vegetables by imitating the human. Potatoes are stacked with precision, and rather than dumping them into a pot, a clear container comes down over them on the counter and fills with water (evidently a high-tech seal keeps it from leaking out the bottom). There are blender, pouring, and cake-icing attachments, along with all sorts of sensors that automatically cook the food (though the robot in the video may be trying to murder the family, as it says the internal temperature of the served turkey is 146.3 degrees Fahrenheit; the safe cooking temperature for turkey is 165). And if all that automation wasn’t enough, a robotic assistant even slides out from under the counter to serve drinks.

While the robot in this video is definitely still the stuff of science fiction, it highlights how Sony is thinking about robots in the home. Japan has an aging population and robots will play an increasingly important role in providing home care (hence the older man highlighted in the video). But Japan’s shrinking population also means that it is facing a labor shortage in places like restaurants.

And the commercial scenario is what the second half of the Sony video is all about. In it, a hip young couple goes out for a fancy evening out in what kinda looks like the piano lounge of the Death Star. The robot in this setting is round, with people sitting around it like at a bar. Facial recognition is used to bring up the diners’ preferences, and robo-bartender arms whip up a cocktail. This robot also cooks using hot knives to sear while slicing, and creates a delicate edible flower arrangement dessert that will match what you are wearing (just watch the video).

Another thing that struck me while watching this video is how similar the home cooking bot’s form factor is to that of the Moley robot, which is in active development now. Both use long, narrow counters and robot arms sliding around on tracks to manipulate food and make meals. That probably makes sense given the limited space available in most kitchens to potentially install a full-fledged robot.

Of course, Sony’s robotic vision is bound to change as they do more research and uncover more use cases. But until then, this video gives us a little guidance as to what our robot future will bring.

April 19, 2018

Sony and Carnegie Mellon Team Up to Work on Food Robots

Sony Corporation announced yesterday that it has hooked up with Carnegie Mellon University (CMU) to collaborate on artificial intelligence (AI) and robotics research that will initially focus on “optimizing food preparation, cooking and delivery.”

In a press release, Sony said they were starting off with food-related robots because the complexities involved with food could later be applied to a wider range of industries. Specifically, it cited the ability to work with fragile and odd-shaped materials, as well as the ability to operate a robot in small spaces.

The research will happen mostly at CMU’s School of Computer Science in Pittsburgh. Partnering with a big tech company isn’t new for CMU; the school has previously worked with Uber on self-driving car technology.

Despite, or perhaps because of, the complexities involved with its preparation, food robotics is a hot area right now. Miso Robotics has Flippy, a robot which uses a series of cameras, thermal imaging and AI to properly cook a hamburger. Cafe X launched its second generation robot barista-in-a-box. Meanwhile, 6d bytes and Alberts have both launched smoothie-making robots.

More difficult than actually building useful robots may be tackling the issues surrounding human/robot interaction. Flippy was “retired” after just one day because human workers just couldn’t keep up with the fast robot. Cafe X and 6d bytes’ bots are self-contained units that have pretty much taken humans out of the equation altogether.

The delivery aspect of this partnership is also intriguing. Companies like Marble and Starship have started rolling out pilot projects for robot food delivery in cities across the country. Leveraging CMU’s experience with autonomous driving could rapidly advance the viability of small, grocery-carrying robots scurrying around city sidewalks.

We’ll keep tabs on this project to see what comes of it. Who knows, maybe Sony can develop a delivery robot that’s as cute as its Aibo robot dog.
