The Spoon

Daily news and analysis about the food tech revolution


“Alexa, Look Into My Eyes”: New Prototype Combines Human Gaze with Voice Control to Help You Cook

by Michael Wolf
September 11, 2020 | Filed under:
  • Interfaces
  • News
  • Smart Home

There’s no doubt that voice interfaces like Alexa and Google Assistant have had a huge impact on how we search for recipes, control our appliances, and add items to our grocery lists.

But what if that voice assistant had a contextual understanding of where you were looking while you cooked the evening meal?

That’s the big idea behind a new prototype from Synapse, a division of Cambridge Consultants. The Hobgoblin technology concept uses machine vision to determine where a person is looking when issuing a voice command, then applies that information to the cooking experience.

From the project page:

We have been exploring the use of computer-vision based sensing as context, and for this cooktop demonstration we augmented the VUI using gaze tracking to make what feels like a magical interaction. The cooktop infers which burner is being addressed in a voice command by using its camera tracking to know which burner you’re looking at. This way, when the system detects a person standing in front of it looking at a burner, commands can omit the burner designation, e.g. “turn that burner on,” or simply saying a level like “medium high.”

In the concept video, a user is cooking and says “Alexa, turn up the heat.” Using a camera that is built into the cooktop, Alexa is able to infer that the user is cooking because they are looking at the cooktop.
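The resolution logic described above — falling back to gaze context when a command omits the burner designation — can be sketched roughly as follows. This is a hypothetical illustration, not Synapse's actual implementation; the names (`GazeContext`, `resolve_command`) and the four-burner layout are assumptions:

```python
from dataclasses import dataclass
from typing import Optional

# Burner positions assumed for illustration; a real cooktop would
# report whatever layout its camera calibration uses.
KNOWN_BURNERS = {"front-left", "front-right", "back-left", "back-right"}

@dataclass
class GazeContext:
    # Burner the camera's gaze tracking says the user is looking at,
    # or None if no one is standing in front of the cooktop.
    burner_id: Optional[str]

def resolve_command(utterance: str, gaze: GazeContext) -> Optional[dict]:
    """Attach a burner to a voice command, using gaze as fallback context."""
    # If the utterance names a burner explicitly, that wins.
    for burner in KNOWN_BURNERS:
        if burner in utterance:
            return {"burner": burner, "command": utterance}
    # Otherwise, fill in the burner inferred from gaze tracking,
    # so "turn that burner on" or just "medium high" still resolves.
    if gaze.burner_id in KNOWN_BURNERS:
        return {"burner": gaze.burner_id, "command": utterance}
    # Ambiguous: no explicit burner and no gaze fix.
    return None

# "turn that burner on" while looking at the front-left burner:
print(resolve_command("turn that burner on", GazeContext("front-left")))
```

The key design point is that gaze never overrides an explicit burner name in the command; it only supplies the missing context when the speaker leaves it out.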

There’s no doubt that combining a contextual understanding of what is happening in a room with commands given to digital assistants like Alexa could enable far more powerful “smart kitchen” scenarios. One could easily imagine layering in other signals, such as facial recognition, to create more contextually relevant experiences — for example, a personalized knowledge base that understands a person’s cooking skills or favorite recipes.

You can see the Hobgoblin demo in action below:

Hobgoblin Smart Appliance Interface | This New User-Interface Tech Isn't Just for the Kitchen



Tagged:
  • Alexa
  • Cambridge Consultants
  • gaze
  • smart home
  • smart kitchen
  • Synapse
  • Voice interfaces
