The Spoon

Daily news and analysis about the food tech revolution

Watch The Figure 01 Robot Feed A Human, Sort The Dishes, And Stammer Like Us Meatbags

by Michael Wolf
March 15, 2024 | Filed under:
  • News
  • Robotics, AI & Data

While much of the startup funding for food-centric robots has gone to task-specific food automation from the likes of Picnic and Chef Robotics, some of the more intriguing – and creepy – action is happening with humanoid robots.

The latest entry in the “watch a humanoid robot handle kitchen tasks” files comes from Figure, which just showed off the latest capabilities of its Figure 01 robot, demonstrating how it can identify food and sort through kitchen tasks.

What really stands out to me is the weirdly human voice of the robot, which includes very human-like pauses and slight stammers. As an example, in one exchange, a human interviewer asks Figure 01 to explain why it handed over an apple. Figure 01 responds with a quick “On it” and then goes on to explain, complete with an “uh” pause that makes you almost think there’s an actor behind the curtain spitting out the lines.

You can watch for yourself below. The exchange I am talking about happens 48 seconds into the video.

Video: Figure Status Update - OpenAI Speech-to-Speech Reasoning

According to Figure, the capabilities showcased in the video come from pairing OpenAI’s large language models, which provide the high-level visual and language intelligence, with Figure’s own neural networks, which power the robot’s almost human-like dexterity. The company has raised an eye-popping $754 million in funding.
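
Figure hasn’t published code for the system, but the division of labor it describes (a language model deciding what to do, learned visuomotor policies deciding how to move) maps onto a familiar two-rate control loop. The Python sketch below is purely illustrative: every class and method name is hypothetical, not Figure’s or OpenAI’s actual API. A slow planner turns a camera frame plus a spoken request into a behavior label, and a fast low-level policy turns that label into joint commands.

```python
"""Illustrative two-rate control loop, loosely modeled on Figure's public
description of the demo. All names and rates here are hypothetical."""

import random
import time
from dataclasses import dataclass


@dataclass
class Observation:
    camera_frame: bytes   # raw onboard camera image
    transcript: str       # speech-to-text of the human's request


class VisionLanguagePlanner:
    """Stand-in for the cloud language model: slow, returns a behavior label."""

    def plan(self, obs: Observation) -> str:
        # A real system would send the image and transcript to a multimodal
        # model; this stub just keyword-matches the spoken request.
        text = obs.transcript.lower()
        if "apple" in text or "something to eat" in text:
            return "pass_the_apple"
        if "dish" in text:
            return "sort_dishes"
        if "trash" in text:
            return "pick_up_trash"
        return "idle"


class LowLevelPolicy:
    """Stand-in for a learned visuomotor policy: fast, outputs joint targets."""

    def act(self, behavior: str, camera_frame: bytes) -> list[float]:
        # A real policy maps pixels plus a behavior embedding to joint-space
        # actions; this stub emits placeholder setpoints for 24 joints.
        return [random.uniform(-1.0, 1.0) for _ in range(24)]


def control_loop(obs: Observation, seconds: float = 1.0) -> None:
    planner = VisionLanguagePlanner()
    policy = LowLevelPolicy()

    behavior = planner.plan(obs)        # slow, language-level decision
    print(f"planner chose behavior: {behavior}")

    deadline = time.time() + seconds
    while time.time() < deadline:       # fast inner loop
        joint_targets = policy.act(behavior, obs.camera_frame)
        # joint_targets would be sent to the motor controllers here
        time.sleep(1 / 200)             # arbitrary 200 Hz, for illustration


if __name__ == "__main__":
    control_loop(Observation(camera_frame=b"", transcript="Can I have something to eat?"))
```

The point the sketch tries to capture is latency: language-level reasoning can tolerate a slow, possibly remote model, while the inner loop has to keep running at motor-control rates on the robot itself.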


Tagged:
  • Figure One
  • Food robotics
  • robot
