The Spoon

Daily news and analysis about the food tech revolution

kitchen robot

December 18, 2020

Moley’s Robotic Kitchen Goes on Sale

Moley’s robotic kitchen first burst onto the scene in 2015, wowing audiences at CES Asia with its science fiction-levels of cooking automation. Now after more than five years of development, the Moley is finally going on sale. But the kitchen of tomorrow is not going to be cheap.

It’s probably best to set the stage first. Getting a Moley requires a lot more work than just bolting a robotic arm above your existing stove. It’s an entirely integrated system of appliances, cupboards, a touchscreen, storage containers, pots, pans, utensils, a protective screen and yes, two articulating arms with robotic hands. The Moley basically takes up one entire wall of a kitchen.

Once it’s installed, or more accurately, built into your kitchen, there’s a little bit of setup for a person to do. Ingredients need to go into special containers and be identified in the system so the robot knows where to retrieve them. Utensils and pots are placed in special storage areas that slide out of view when not in use. But once all that’s done, the digital smarts and robot take over.

Using the built-in touchscreen, the user selects one of 5,000 (and growing) recipes. A clear, protective screen drops down and the two robotic arms slide out from their storage on a rail up above to start grabbing the pots, pans and utensils they need. The robot can fill a pot with water in the sink, turn on the induction burner and then, using sensors and cameras, retrieve all the ingredients it needs to make the dish. Moley will even let you know when you’re running low on a particular ingredient.

The robotic arm’s movements even have some pedigree. Tim Anderson, a former BBC MasterChef winner, “trained” the robot’s movements by recording his techniques in 3D, which were then translated into specific algorithms for the machine. It’s not hard to imagine Moley enlisting other chefs down the road and offering different downloads so you could have “Nigella Lawson” or “Marcus Samuelsson” cooking your meals.

Listen. It’s a super complex piece of high technology, and perhaps the best way to grok it is to see it in action in this in-depth launch video that Moley put out this week:

Moley Robotics First Product Launch 2020

The Moley kitchen is also a hybrid of sorts. If you prefer to do the cooking, you can keep the robot in its closet and do all the work yourself. Though, at these prices, I imagine you’d want to put that robot to work.

Moley is targeting both residential and commercial kitchens with its launch. Make that, very expensive residential kitchens. A base Moley kitchen without the robotic arms will cost you £128,000 (~ $173,000 USD) and a unit with the robotic arms will cost £248,000 (~ $335,000 USD).

Moley CEO and Founder, Mark Oleynik, told The Spoon in a phone interview this week that the first customer will be receiving their Moley robotic kitchen next year. (No further details were offered).

Oleynik also put the high price of the Moley into some perspective by likening it to a dishwasher. When those were first introduced, they were expensive and most people didn’t think they were necessary. Now dishwashers are affordable, mainstream and play a key role in our kitchen lives.

Oleynik also envisions a future where his robotic kitchen can help the elderly age in place. With just a few taps on a screen, people of any age can have a home-cooked, fresh meal prepared on the spot. It’s the same vision Sony has for its take on the future of kitchen robotics.

Samsung is also working on an articulating arm-based kitchen robot, and Oleynik welcomes the competition with a philosophy that a rising tide lifts all robotic kitchen boats.

The Moley is available for purchase now, and if any Spoon reader does get one, may we recommend pairing it with the WineCab Wine Wall robotic sommelier for the ultimate in futuristic dining.

May 30, 2018

How An Obscure Academic Project May Have Just Started A Kitchen Robot Revolution

Imagine it’s 2031 and you’ve sat down for dinner with your family.

It’s the middle of the week, so tonight’s meal is nothing too ambitious: mac and cheese or fajitas. As is the usual routine, you catch up with the family and share a few laughs until the meal is finally served, at which point everyone loads their plates and starts chowing down on what turns out to be a tasty dinner (the third one this week!). Soon your youngest – the finicky one – asks for seconds.

Congrats, parent, another successful meal. But don’t spend too much time patting yourself on the back, because here’s the thing: neither you nor your significant other spent any time preparing tonight’s dinner. Instead, tonight’s dinner – and every dinner this week – was prepared in its entirety by a robot, the very same robot that is now in the kitchen cleaning up after dinner and preparing dessert.

Futuristic? Yes. A science fiction movie cliche? Definitely. But the above scenario may also be a very realistic possibility, in large part due to an obscure research project involving 32 GoPro-adorned home cooks making dinner.

Creating A Technology Big Bang

With any technology that changes the world, there’s almost always a research breakthrough or two that helps unleash innovation. In today’s world of AI and robotics, most experts would agree that one of these technological “big bangs” was the 2012 ImageNet Challenge entry from a University of Toronto research team led by Geoff Hinton.

ImageNet is a crowdsourced database of millions of annotated images. The accompanying ImageNet Challenge is an annual contest in which teams of machine vision researchers pit their algorithms against one another on the ImageNet dataset, trying to achieve the highest degree of accuracy.

Hinton’s 2012 team achieved what is widely believed to be a breakthrough in AI research, using deep learning techniques to reach far greater accuracy (85%) than earlier approaches. In the six years since, there have been leaps forward each year – today’s ImageNet Challenge teams routinely achieve 95% accuracy, better than most humans – helping to drive significant progress in all corners of the AI world, from autonomous driving to augmented reality to industrial and consumer robotics.
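(For the curious: the accuracy figures in these challenges are conventionally reported as top-5 accuracy, where an answer counts as correct if the true label appears anywhere in the model's five highest-scoring guesses. A minimal sketch of that metric, using made-up labels rather than real ImageNet classes:)

```python
from typing import Sequence


def top5_accuracy(predictions: Sequence[Sequence[str]], truths: Sequence[str]) -> float:
    """Fraction of examples whose true label appears in the top-5 ranked guesses."""
    hits = sum(truth in top5[:5] for top5, truth in zip(predictions, truths))
    return hits / len(truths)


# Toy example: three images, each with a ranked list of guessed labels.
preds = [
    ["cat", "dog", "fox", "wolf", "lynx"],               # truth "dog" is in the top 5 -> hit
    ["pan", "pot", "kettle", "bowl", "cup"],             # truth "whisk" is not -> miss
    ["knife", "fork", "spoon", "chopstick", "skewer"],   # truth "spoon" -> hit
]
labels = ["dog", "whisk", "spoon"]
print(top5_accuracy(preds, labels))  # 2 of 3 hits -> 0.666...
```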

All of which brings us back to the kitchen.

And Now Into the Kitchen (The Epic Kitchen)

Now, a group of research academics is trying to create the equivalent of an ImageNet for the kitchen. Called EPIC-KITCHENS, the project is an ambitious effort to capture people performing natural tasks in their home kitchens, like cooking, cleaning and doing laundry, and then release the resulting millions of annotated images into the wild. The ultimate goal behind EPIC-KITCHENS is to create an open dataset of kitchen-centric objects, behaviors and interactions on which researchers across the world can focus their deep-learning algorithms, in the hope of advancing artificial intelligence in the kitchen.

Why the kitchen? According to the study’s lead, Dr. Dima Damen, the kitchen is one of the most complex environments in everyday life for artificial intelligence to master because it involves so many tasks and actions.

EPIC-KITCHENS 2018 TRAILER

“The most challenging type of object interactions tend to be in our kitchen,” said Damen in a phone interview I conducted last month. “We’re doing lots of tasks, on short notice, we’re multitasking. We might be adding something to our meal and moving something around. That makes the kitchen environment the most challenging environment for our types of perception.”

Damen, who is with the University of Bristol in the UK, partnered with researchers at the University of Toronto and Italy’s University of Catania to bring the project to life. The project took about a year to complete and involved 32 home cooks of ten nationalities in four cities in Europe (United Kingdom) and North America (Canada and the US). To capture their activity, each participant mounted a GoPro on their head and recorded 1-5 hours of preparing meals, cleaning and whatever else came naturally.

“We gave them a camera, sent them home, and said just record whatever you are doing in your kitchen for 3-5 days,” said Damen.

From there, the participants watched their videos back and narrated them, giving researchers an audio track from which to manually annotate the individual images – 11.5 million in all – captured in the 55 hours of video.

The result is a massive database its creators hope will help researchers in training their AI systems to better understand the kitchen. Like ImageNet, the creators also hope to foster competition with challenges and will track the progress with online leaderboards.

The data itself is something many will find somewhat mundane:

Distribution of actions in kitchen. Source: Epic Kitchens

The above distribution of annotated actions and objects is what you would probably expect: a really long list of things – vegetables, kitchenware, spices – found in the kitchen. The same goes for actions: the distribution breaks down pretty much all the verbs we perform in the kitchen, such as put, take, twist and so on.

And that’s the point, at least if you’re a researcher hoping to train an artificial intelligence system. Just as this type of granular data helped ImageNet Challenge teams achieve a 95% accuracy rate with their software, the EPIC-KITCHENS team hopes to reach a similar level of accuracy. By helping these systems understand what everyday objects are, and how people manipulate them in sequences of actions to perform the basic functions of life in the kitchen, like cooking and cleaning, the EPIC-KITCHENS data and what evolves out of it can provide a foundation on which technologists can eventually create robots that act like humans and perform human-like functions in the kitchen.
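(Each annotated segment in EPIC-KITCHENS is labeled as a verb-noun pair over a time span, so producing a distribution like the one above is essentially a counting exercise. A rough sketch in Python, using invented records rather than the project's real annotation files:)

```python
from collections import Counter

# Hypothetical annotation records in the EPIC-KITCHENS style:
# (verb, noun, start_second, end_second) for each narrated action segment.
segments = [
    ("take", "knife", 12.0, 13.5),
    ("cut", "onion", 14.0, 21.2),
    ("put", "knife", 21.5, 22.4),
    ("take", "pan", 30.1, 31.0),
    ("wash", "pan", 31.2, 40.0),
    ("take", "knife", 41.0, 41.8),
]

# Tally how often each verb and each noun appears across all segments.
verb_counts = Counter(verb for verb, _, _, _ in segments)
noun_counts = Counter(noun for _, noun, _, _ in segments)

print(verb_counts.most_common(1))  # [('take', 3)]
print(noun_counts.most_common(2))  # [('knife', 3), ('pan', 2)]
```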

The result could be an explosion in innovation in spaces like augmented reality, personalized food identification apps and, yes, cooking robotics. And while a fully-functional Rosie the home cooking robot could be the ultimate end-result of this research a decade from now, chances are we’ll see much more evolutionary improvements between now and then in the form of smarter appliances, more capable virtual assistants and more immersive guided cooking experiences.

And oh yeah: if you’re the type who wants to keep the robots out of the kitchen altogether, don’t worry. One of the biggest challenges with machine understanding of food is that the three-dimensional human comprehension of taste, smell and texture is extremely hard to replicate with machines. Add in AI’s difficulty understanding context, and it makes me think that while we may eventually get cooking robots, they may only be average cooks at best.

The real artists, the chefs – whether home-based or on TV – are probably safe from the robot invasion.

Probably.

September 10, 2017

Spoon Video Top 3: Blockchain’s Impact on Food System & Bosch’s Mykie Gets Smarter

This is our video recap of the top three trending stories from the past week.

In this recap, we look at the latest iteration of Bosch’s kitchen robot Mykie, the crazy flash-freezer/microwave combo which debuted at IFA, and the impact Blockchain will have on the food system.

© 2016–2025 The Spoon. All rights reserved.
