The Spoon

Daily news and analysis about the food tech revolution

Interfaces

August 9, 2021

Make Beats at the Breakfast Table with Reese’s Puffs AR Cereal Box Drum Machine

After seeing Mark Ronson’s “Watch the Sound” series on music, my 10-year-old son is now very into drum machines and beats. And while he would love a vintage Roland TR-808 to kick off his burgeoning music career, I think instead I’ll get him a box of Reese’s Puffs cereal.

Not that I think a bowl of Reese’s Puffs is the breakfast of champions, but a new promotional box for the cereal out now features an augmented reality drum machine on the back. I received a press release about the new cereal box beatmaker this morning. Usually when I get these types of emails, I immediately toss them. But as I looked at the information, it actually seemed like a pretty cool use of technology, so I went out and bought a box this morning (with apologies to my wife, who does not yet know there is a giant box of sugar cereal jammed into our pantry).

Here’s how it works. On the back of the box is a diagram of the RP-FX drum pad. You scan a QR code with your mobile phone, which takes you to a Reese’s web app that accesses your phone’s camera. Place cereal puffs wherever you like on the drum pad spaces, then hover your phone’s camera over the box. Using AR, the app “reads” where you placed your puffs and generates a beat accordingly. Move the puffs around and the beat changes.
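For the curious, here’s a minimal sketch of the underlying idea: treat the printed pad as a grid, and turn detected puff positions into a step-sequencer pattern. The grid size, track names, and functions below are hypothetical, and the camera detection and AR tracking are assumed to happen elsewhere.

```typescript
// Hypothetical sketch: treat the printed drum pad as a 4x8 grid and turn detected
// puff positions into a step-sequencer pattern. Detection itself (camera + AR
// tracking) is assumed to happen elsewhere.

type Cell = { row: number; col: number };          // where a puff was "seen"
const TRACKS = ["kick", "snare", "hat", "clap"];   // one track per grid row
const STEPS = 8;                                   // columns = steps per bar

function cellsToPattern(cells: Cell[]): boolean[][] {
  // pattern[track][step] === true means "play this sound on this step"
  const pattern = TRACKS.map(() => new Array<boolean>(STEPS).fill(false));
  for (const { row, col } of cells) {
    if (row < TRACKS.length && col < STEPS) pattern[row][col] = true;
  }
  return pattern;
}

function playPattern(pattern: boolean[][], bpm = 90): void {
  const stepMs = 60_000 / bpm / 2; // eighth-note steps
  for (let step = 0; step < STEPS; step++) {
    setTimeout(() => {
      pattern.forEach((track, i) => {
        // stand-in for triggering an actual drum sample
        if (track[step]) console.log(`${TRACKS[i]} @ step ${step}`);
      });
    }, step * stepMs);
  }
}

// Example: puffs on the kick row at steps 0 and 4, a snare at step 2.
playPattern(cellsToPattern([{ row: 0, col: 0 }, { row: 0, col: 4 }, { row: 1, col: 2 }]));
```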

It’s definitely not high fidelity or Pro Tools quality production, but it actually works surprisingly well. Once you have your beat just as you like, you can use the app to record it, so you can share it with friends or use it as the basis of your next club banger.

You can hear the one I whipped up this morning here (or, you know, wait a few months and it’ll be all over the radio).

As noted earlier, I typically shy away from covering promotional stunts like this. But as a parent and a fan of both music and technology, this promotion is actually worth, well, promoting. Besides, using a cereal box to build your own beat sure beats digging for a cheap plastic toy at the bottom of one.

April 15, 2021

Voiceitt’s App Aims to Make Those with Speech Impairments Able to Use Voice Control

It’s easy to joke about digital assistants like Alexa not understanding us. But that’s because most of us take for granted that the words we speak are easily understood. For many people with non-standard speech patterns, this isn’t the case. People with cerebral palsy, stroke victims, or people with lifelong speech impairments may speak in a manner that digital assistants, or even other humans, can’t readily understand.

Tel Aviv, Israel-based startup Voiceitt is working to help fix that with its mobile app, which works as an individualized electronic translator. Once the app is installed on a phone or tablet, the user trains it by speaking different words. Once Voiceitt has learned those words, the app can then be used to “speak” on behalf of the user. The technology is language agnostic, so it can work in whatever language the app is delivered in.

You can see it in action in this video, where the user has trained the Voiceitt app to understand how he pronounces “burger” and how that means he would like a burger served in a particular way (with cheese, etc.). When it comes time to order, he says “burger” and the phone’s loudspeaker translates the order in a way that is easily understandable by the server.

Voiceitt also ties in directly with Amazon’s voice assistant, Alexa. (Voiceitt is funded in part by the Alexa Fund.) In this way, Voiceitt skips the intermediary step of saying the translation out loud on the phone. Instead, when a command is spoken (“Turn on the light”), Voiceitt’s app talks directly to Alexa through the API to execute the command. Turning on a light may not sound like much, but without voice control, people with cerebral palsy, for instance, are reliant on caregivers to come in and do basic tasks like turning on a light or adjusting the volume on the TV or radio.
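To make the idea concrete, here’s a minimal sketch, not Voiceitt’s actual code, of what a per-user translation layer could look like: a small trained vocabulary maps a user’s own pronunciations to the phrases they stand for, and a stand-in sendToAssistant function represents the Alexa handoff described above.

```typescript
// Hypothetical sketch of the per-user translation idea: each user trains a small
// vocabulary, and a recognized utterance is mapped to the phrase it stands for.
// sendToAssistant is a stand-in for the Alexa integration mentioned in the post.

type TrainedEntry = { label: string; command: string };

// Built up during training: the user's own pronunciation (as recognized by the
// app's acoustic model) on the left, the intended phrase on the right.
const userVocabulary: TrainedEntry[] = [
  { label: "burgr",   command: "I'd like a cheeseburger, no onions" },
  { label: "lite on", command: "Turn on the light" },
];

function translate(recognizedLabel: string): string | undefined {
  return userVocabulary.find(e => e.label === recognizedLabel)?.command;
}

async function sendToAssistant(command: string): Promise<void> {
  // Stand-in for forwarding the translated command to a voice assistant.
  console.log(`-> assistant: "${command}"`);
}

async function handleUtterance(recognizedLabel: string): Promise<void> {
  const command = translate(recognizedLabel);
  if (command) await sendToAssistant(command);
  else console.log("Not in the trained vocabulary yet - ask the user to train this word.");
}

handleUtterance("lite on"); // -> assistant: "Turn on the light"
```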

For our purposes here at The Spoon, this also means that a person with non-standard speech could also operate connected appliances like microwaves or water faucets, or order food and groceries for home delivery.

As noted, right now Voiceitt requires each user to train the app individually. It can’t extrapolate to words beyond the trained vocabulary spoken in that particular manner. But as more people use the app, Voiceitt’s database of speech patterns grows. Over time, Voiceitt’s artificial intelligence will process its library of data and recognize more patterns, becoming more fluent and more of a universal translator.

As that happens, Voiceitt could be installed on the business side of more locations, such as a restaurant’s drive-thru or a voice-controlled ordering kiosk. With the software built into the kiosk, users wouldn’t need the middleman of a tablet or phone; the kiosk itself would translate.

Voiceitt has raised $16 million in funding so far, including non-equity money from governments and other funds interested in bringing more equity to technology. The Voiceitt app will be available here in the U.S. at the end of May, with a subscription of $200 a year, but Voiceitt says it is working to partner with relevant agencies to make it more accessible to people who aren’t able to afford that cost.

March 30, 2021

Piestro Adds Pay-With-Your-Face Tech and Cubbies for Pickup

Robotic pizza vending startup Piestro announced today that it has partnered with PopID to integrate pay-with-your-face technology into Piestro’s machines.

The integration of PopID’s technology will provide Piestro customers a new contactless ordering and payment system. In addition to paying with traditional credit/debit cards, users can create a PopID account to enable payment via facial recognition. Once that setup is complete, users can choose PopID as a payment method on Piestro’s app or on the machine itself. Orders placed ahead of time via the app can be retrieved via the same facial recognition. This pay-with-your-face option will be extended to Piestro’s white label, co-branded machines as well.

That ability to order ahead and pick up your pizza is also a new bit of functionality for the Piestro device. I spoke by phone last week with Piestro CEO Massimo Noja De Marco, who said that nine automat-style cubbies will be built into Piestro machines. This means you’ll be able to order your pizza ahead of time and have it held in a cubby that you unlock with your phone (or face).

That Piestro and PopID are working together isn’t that much of a surprise. PopID is part of the Cali Group of companies, which also includes Kitchen United, which De Marco founded and where he served as Chief Concept Officer. On a more existential level, in a post-COVID world, vending machine companies are looking to implement more contactless methods of interaction and reduce the number of physical touchpoints. As a result, other vending machine startups that may have been wary about facial recognition over privacy concerns could be more amenable to the technology now.

Piestro is certainly at the vanguard of a number of different technology trends. In addition to being a fully autonomous, robotic pizza restaurant and adopting a facial payment system, Piestro is also working with Kiwibot to allow delivery robots to pick up and deliver orders from its machine, and is embarking on its second round of equity crowdfunding.

We’ll have to wait for all of this high-tech (and pizza) goodness, however. The first Piestros won’t roll out until early next year.

November 20, 2020

Kea Raises $10M to Bring its AI-Based Phone System to More Restaurants

Kea, which makes an AI-powered virtual phone assistant for restaurants, has raised a $10 million Series A round of funding. TechCrunch was first to report on the round, which was led by Marbruck, with Streamlined Ventures, Xfund, Heartland Ventures, DEEPCORE, Barrel Ventures and AVG Funds, as well as other angel investors participating. This brings the total amount raised by Kea to $17 million.

Basically, Kea is building an automated way for restaurants to answer the phone. The natural language processing software can hold a “conversation” with a customer to take and process their order. You can hear a demo of how it works on the Kea website.

As TechCrunch points out, many restaurants are understaffed and don’t always have a dedicated person to work the phones and take orders. During this time of off-premises eating, not answering the phone can translate into a lot of lost business. Plus, ordering directly from the restaurant instead of through a third-party like DoorDash or Uber Eats helps the restaurant avoid the sky-high commissions those services charge.

Kea is among a wave of natural language customer interaction systems coming to market. Google famously made news a couple years back with its Duplex AI-powered voice assistant for consumers to make automated restaurant reservations that sounded almost too human. Google also developed the CallJoy virtual phone assistant for small business owners. Clinc’s technology brings natural language conversations to the drive-thru, while McDonald’s acquired Apprente last year to add more voice capabilities to its drive-thru.

Kea told TechCrunch that its service is currently live in more than 250 restaurants including Papa John’s. With its new cash, the company is looking to be in 1,000 restaurants in 37 states next year.

November 11, 2020

Amazon Alexa Getting Better at Guessing Follow Up Requests

One big area where virtual assistants like Amazon Alexa and Google Assistant fall short of real human assistants is their inability to contextualize and anticipate what you’ll want next.

Currently, requests made to virtual assistants are often siloed, and go something like this:

“Alexa, how long should I steep tea?”

Alexa answers, with something like “Five minutes,” and then:

“Alexa, set a timer for five minutes.”

In a corporate blog post today (hat tip Geekwire), Amazon announced that Alexa is now getting better at bundling those types of requests together. Amazon refers to this as figuring out your “latent goal,” and actually provides tea steeping as an example. Asking Alexa how long to steep tea could have Alexa guess that your latent goal is to make tea. This, in turn, would trigger an immediate and automatic follow-up response from Alexa like “Would you like me to set a five minute timer?”

While this seems straightforward, as with so many AI-related tasks, understanding what people want isn’t exactly easy. From Amazon’s blog post:

The first step is to decide whether to anticipate a latent goal at all. Our early experiments showed that not all dialogue contexts are well suited to latent-goal discovery. When a customer asked for “recipes for chicken”, for instance, one of our initial prototypes would incorrectly follow up by asking, “Do you want me to play chicken sounds?”
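To illustrate why that first step matters, here’s a hypothetical sketch (not Amazon’s algorithm) of latent-goal triggering: a naive keyword match fires on any mention of “chicken,” while a simple confidence threshold suppresses the low-confidence chicken-sounds follow-up.

```typescript
// Hypothetical illustration of latent-goal triggering. A naive keyword matcher fires
// on any overlap ("chicken" -> chicken sounds), while thresholding a confidence score
// only offers a follow-up when the match is strong. This is not Amazon's algorithm.

type Suggestion = { followUp: string; score: (query: string) => number };

const suggestions: Suggestion[] = [
  {
    followUp: "Would you like me to set a five-minute timer?",
    // high score only when the query is about steeping/brewing durations
    score: q => (/how long.*(steep|brew)/i.test(q) ? 0.9 : 0.0),
  },
  {
    followUp: "Do you want me to play chicken sounds?",
    // naive keyword overlap: fires weakly on *any* mention of "chicken"
    score: q => (/chicken/i.test(q) ? 0.4 : 0.0),
  },
];

function latentGoalFollowUp(query: string, threshold = 0.7): string | null {
  const best = suggestions
    .map(s => ({ s, score: s.score(query) }))
    .sort((a, b) => b.score - a.score)[0];
  return best.score >= threshold ? best.s.followUp : null;
}

console.log(latentGoalFollowUp("how long should I steep tea?")); // timer follow-up
console.log(latentGoalFollowUp("recipes for chicken"));          // null: no chicken sounds
```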

Beyond tea, it’s not hard to think of how identifying latent goals could be useful in a smart kitchen. In the case of asking for chicken recipes, Alexa could follow up with offers to pre-heat an oven, or, more relevant to Amazon, offer to order you the necessary groceries for delivery that day (preferably from an Amazon grocery store).

Amazon says this latent goal capability is available now in English in the U.S. And while it doesn’t require any additional work from developers to activate, they can make their skills more discoverable with the Name-Free Interaction Tool Kit.

FWIW, I tried asking Alexa the tea steeping question, and it did not follow up with a timer suggestion. So its latent goals capabilities seem to still be, well, latent.

September 29, 2020

Amazon Launches Palm-based Contactless Payment Method

You have to, errrr, hand it to Amazon. The e-commerce giant today announced Amazon One, a new contactless payment method that relies on scanning your palm as you enter its store.

Amazon One is now an entry option at two of Amazon’s Go stores in Seattle (the 7th & Blanchard and South Lake Union stores, if you’re in the Emerald City). To use the new system, you insert your credit card into the terminal and hover your palm over the device. The terminal scans your palm print and from that point on, you just need to hold your hand over the One terminal upon entering the store. After that, the Go technology kicks in and automatically keeps track of and charges you for what you take from the store.

You do not need an Amazon account to use Amazon One, just a mobile phone and a credit card. But you can tie your Amazon account to One, should you choose.

In addition to its own physical stores, which include Go convenience stores, Go Grocery stores, Fresh grocery stores, Prime stores and more, Amazon envisions One being used by other retailers. From the blog post announcing the technology:

In most retail environments, Amazon One could become an alternate payment or loyalty card option with a device at the checkout counter next to a traditional point of sale system. Or, for entering a location like a stadium or badging into work, Amazon One could be part of an existing entry point to make accessing the location quicker and easier.

We’ve heard rumblings about some form of pay-with-your-palm coming from Amazon for a while now, so today’s announcement isn’t a surprise. It’s also not a surprise given Amazon’s devotion to speed and efficiency. Scanning your phone to enter a Go store may be easy, but waving your hand over a device is much easier and faster. This, in turn, could entice you to choose an Amazon store over the competition more often.

Amazon One is also coming out during a global pandemic and at a time when retailers are looking for more contactless payment methods. Amazon also licenses out its cashierless Go technology, and combining the two could be an attractive contactless option for retailers.

Of course, given Amazon’s increasing dominance in not only retail but many other facets of our everyday lives, people may be reluctant to hand over their biometric data like a palm print. In its One FAQ, Amazon said it chose palm prints because they are more private, and that you can delete your data from the service after signing up.

I don’t need to be a palm reader to see that One will probably play an increasingly important part of Amazon’s physical retail experience, and that we could see it in a lot of other stores in the coming years.

September 16, 2020

Apple App Clips is Out Today, Right in the Middle of the Contactless Heyday

Apple’s iOS 14 drops today, which means a bunch of new features are coming to your iPhone. Among the batch of goodies to be found in the update is the release of App Clips, which could help accelerate the adoption of mobile contactless payments across the food retail and restaurant space.

We covered App Clips back when the service was announced at Apple’s WWDC event. In a nutshell, App Clips allows you to pull down just a portion of a native mobile app to give you its basic functionality, without needing to download and set up the full app.

Let’s say, for example, you are at a Starbucks but don’t have the Starbucks app installed on your iPhone. Normally, you’d have to download the full app, set up an account and then enter a payment method before you could even start your order. Through App Clips, you can grab just enough of the Starbucks app to order and pay. No need to download the full app, no need for an account and since it uses Apple Pay, no need to enter credit card information.

The big limitation right now is that a cafe, restaurant or retailer has to be participating in the program and have developed an App Clip. But if/when they do, App Clips can be opened by scanning QR codes, tapping NFC tags, or opening links via Apple services like Messages or Safari. For a full rundown of how App Clips works, check out this post from 9 to 5 Mac.

App Clips is definitely arriving at the right time. The COVID-19 pandemic’s sustained presence here in the U.S. is pushing cafes, restaurants and grocery stores towards more contactless payment systems in an attempt to reduce human-to-human contact.

There are a ton of companies bringing contactless payment tech to market. Order for Me, PayJunction and Bbot are among the many startups building contactless payment systems for restaurants. Kroger launched a contactless payment pilot at its QFC store in Seattle last month. And there are a number of other companies doing contactless 2.0, basically through holograms, voice control and pay with your face.

The difference with App Clips, though, is Apple’s massive iPhone installed base. If this catches on, which seems very possible given its utility (no downloading full apps just to get a cup of coffee!), it could spur vast numbers of people to switch to contactless mobile payments. We just need to see if Apple has the right, well, touch.

September 11, 2020

“Alexa, Look Into My Eyes”: New Prototype Combines Human Gaze with Voice Control to Help You Cook

There’s no doubt that voice control interfaces like Alexa and Google have had a huge impact on the way we search for recipes, access our appliances and add things to our grocery lists.

But what if that voice assistant had a contextual understanding of where you were looking while cooking the evening meal?

That’s the big idea behind a new prototype from Synapse, a division of Cambridge Consultants. The new Hobgoblin technology concept utilizes machine vision to gather information about where a person is looking when issuing a voice command and applies that information to the cooking experience.

From the project page:

We have been exploring the use of computer-vision based sensing as context, and for this cooktop demonstration we augmented the VUI using gaze tracking to make what feels like a magical interaction. The cooktop infers which burner is being addressed in a voice command by using its camera tracking to know which burner you’re looking at. This way, when the system detects a person standing in front of it looking at a burner, commands can omit the burner designation, e.g. “turn that burner on,” or simply saying a level like “medium high.”

In the concept video, a user is cooking and says “Alexa, turn up the heat.” Using a camera that is built into the cooktop, Alexa is able to infer that the user is cooking because they are looking at the cooktop.
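Here’s a rough sketch of how gaze context could fill in the missing burner in an ambiguous command. It assumes the gaze-tracking and speech layers already exist; the types and function names are hypothetical, not Hobgoblin’s implementation.

```typescript
// Hypothetical sketch of the Hobgoblin idea: use gaze context to fill in the missing
// burner in an ambiguous voice command. The gaze-tracking and speech layers are
// assumed; this only shows how the two signals might be combined.

type Burner = "front-left" | "front-right" | "back-left" | "back-right";

interface CooktopContext {
  userPresent: boolean;
  gazedBurner: Burner | null; // which burner the camera thinks the user is looking at
}

function resolveCommand(utterance: string, ctx: CooktopContext): string {
  // Prefer an explicitly named burner; otherwise fall back to the gazed-at one.
  const explicit = utterance.match(/(front|back)[- ](left|right)/i);
  const burner: Burner | null = explicit
    ? (`${explicit[1].toLowerCase()}-${explicit[2].toLowerCase()}` as Burner)
    : ctx.userPresent
      ? ctx.gazedBurner
      : null;

  if (!burner) return "Which burner do you mean?";
  if (/turn up|higher/i.test(utterance)) return `Increasing heat on the ${burner} burner.`;
  if (/medium high/i.test(utterance)) return `Setting the ${burner} burner to medium-high.`;
  if (/turn .*on/i.test(utterance)) return `Turning on the ${burner} burner.`;
  return `Sorry, I didn't catch what to do with the ${burner} burner.`;
}

// "Alexa, turn up the heat" while looking at the front-left burner:
console.log(resolveCommand("turn up the heat", { userPresent: true, gazedBurner: "front-left" }));
```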

There’s no doubt that the ability to combine contextual understanding of what is happening in a room to power commands given to digital assistants like Alexa could create much more powerful potential “smart kitchen” user scenarios. One could easily imagine combining other information to create more contextually relevant experiences, including facial recognition to, for example, apply a personalized knowledge base that understands a person’s cooking capabilities or their favorite recipes.

You can see the Hobgoblin demo in action below:

Hobgoblin Smart Appliance Interface | This New User-Interface Tech Isn't Just for the Kitchen

August 10, 2020

As Menus Move to Mobile Phones, Research Shows AR Could Drive More Sales

Among the countless ways COVID is altering the meal journey is the humble menu. Gone are the germy, reusable laminated menus of the past, and while single-use paper menus are a cheap stopgap, the whole experience will move to our mobile phones.

There’s a problem with ordering through mobile menus, though: they aren’t very enticing. Unless you’re familiar with the restaurant you’re ordering from, you’re scrolling and swiping through a lengthy list of tiny 2D thumbnail images to find what you want.

But new research out of Oxford University shows that augmented reality (AR) could be the way to create appetizing menus on your mobile phone. Oxford’s study, conducted in Oxford, England last year in partnership with the AR company Qreal, a subsidiary of The Glimpse Group, gave some participants traditional menus and others AR-capable menus that presented the virtual food as it would look right in front of them on the table.

Highlights from the study, which were emailed to The Spoon, found that “Participants were significantly more likely to order a dessert if they viewed options in the AR menu (41.2%) versus the control menu (18%).” This was across age groups and sexes, as well as across familiarity with AR, so it wasn’t just tech-savvy folk indulging in a shiny new toy.

Not only that, but participants in the study using the AR menu also “spent significantly more on dessert than those in the control condition, $2.93 versus $1.38 (increase of 112%).”

As Mike Cadoux, General Manager of Qreal, summed it up with me over the phone last week, the addition of AR plays into the old adage that you “eat with your eyes first.”

“It’s like a test drive for a car,” said Cadoux. “Same way when you buy food, you want to think about what it’s like to eat it.”

If the results of this study had been released even six months ago, it probably would have been viewed as more of an interesting idea. A nice-to-have kind of thing, but definitely a can kicked down the road in favor of something more pressing.

The coronavirus, however, could accelerate AR’s adoption in the restaurant industry. First, as noted, even if you can (legally and emotionally) sit and dine in a restaurant, the menu is moving towards mobile, so restaurants need to rethink their digital strategy and how they present their food to customers to begin with.

But more pressing is the fact that the restaurant business was already moving towards off-premises eating before the pandemic and now relies on delivery and takeout to stay alive. This, in turn, means that more people will be selecting their meals from the comfort of their couch via mobile phone.

“Instead of a thumbnail of a picture,” Cadoux said, “You can view it in 3D and move it to an AR experience.” AR gives customers a better sense of what the food will look like, from all angles, when it’s on their own plates and tables.

In addition to restaurants, third-party delivery services, with their marketplaces and massive audiences, should also be looking closely at providing an AR option.

There are the economics of a shift to an AR menu for any restaurant or delivery service to consider. But thanks to Apple and Google, AR technology is easier than ever to implement. And while the Oxford study doesn’t prove outright that implementing AR menus will guarantee increased sales, it is a nice data point suggesting that AR is at least worth experimenting with.

June 15, 2020

Bevi Will Socially Distance Its Smart Water Coolers With Touchless Tech

As restaurants reopen and (some) employees go back to the office, ensuring sanitary, socially distanced public spaces is a major topic of discussion, and contactless is fast becoming a requirement for everything from restaurant menus to grocery deliveries to lunch.

Your office water cooler can join that list now, too. Today, Bevi, a tech company that makes smart water coolers for office and commercial spaces, announced a new touchless dispensing feature meant to make its machines feel more sanitary and socially distanced to users.

The Bevi machine dispenses both still and sparkling water in a variety of flavors. The system involves an internet-connected dispenser that hooks up to a tap water source. Up to now, users could set flavors, carbonation levels, and other preferences using a touchscreen built into the machine. But come July 13, both new and existing Bevi machines will offer touchless dispensing that utilizes an individual’s mobile phone, according to a press release sent to The Spoon.

On that date, Bevi will send an on-screen animation to all its machines that includes instructions on how to use touchless dispensing. To enable the animation, companies just have to run a simple software update. From there, users can scan a QR code, which will replicate Bevi’s dispensing menu on their own personal screen. The same options for drink customization (carbonation levels, flavor, etc.) will appear on the user’s phone just as they would have on the machine’s touchscreen.
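Bevi hasn’t published how the hand-off works under the hood, but a plausible minimal sketch looks something like this: the QR code resolves to a URL containing a machine ID, the phone renders the same menu in a browser, and the selection is posted back so the machine can dispense untouched. The URL scheme and endpoint below are assumptions for illustration only.

```typescript
// Hypothetical sketch of a QR-based touchless dispensing flow. Bevi hasn't published
// its actual protocol; the machine ID in the QR code and the /dispense endpoint below
// are assumptions used only to illustrate the pattern.

interface DrinkSelection {
  flavor: string;
  carbonation: "still" | "light" | "full";
  sizeOz: number;
}

// Step 1: the QR code on the machine resolves to a URL like
// https://example-dispenser.test/m/<machineId>, which the phone opens in a browser.
function machineIdFromQrUrl(url: string): string | null {
  const match = new URL(url).pathname.match(/^\/m\/([\w-]+)$/);
  return match ? match[1] : null;
}

// Step 2: the phone shows the same menu the touchscreen would, then posts the
// user's selection back so the machine can dispense without being touched.
async function requestDispense(machineId: string, selection: DrinkSelection): Promise<void> {
  await fetch(`https://example-dispenser.test/api/machines/${machineId}/dispense`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(selection),
  });
}

const id = machineIdFromQrUrl("https://example-dispenser.test/m/lobby-unit-7");
if (id) requestDispense(id, { flavor: "lime", carbonation: "light", sizeOz: 12 });
```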

On the surface, the update seems a small one, but these micro innovations from the tech world play an important role in making the world, or at least your office or local restaurant, a more sanitary place. While the degree of germophobia varies from one individual to the next, the pandemic has definitely called into question our use of shared touchscreens in public settings.

Various efforts are in place to address those concerns. Restaurants across the world are being urged to adopt contactless menus. My colleague Chris Albrecht makes a good argument for gesture control on kiosks and smart dispensers. Others are releasing facial recognition technology on their machines, so that a user need only have their face scanned to access their customer profile and past orders.

But facial recognition systems are expensive and come with a double side of privacy concerns. In lots of cases, it may be that a simple QR code is more feasible for a business to implement, especially if it’s for something simple like dispensing a lime-flavored water.

That seems to be Bevi’s thinking behind its new feature update. Doubtless we’ll see many other device-makers rolling out their own touchless functionality in the near future.

June 1, 2020

Now is the Time to Add Gesture Control to Self-Serve Kiosks

Self-serve kiosks could be one bit of technology used to help restaurants re-open. Switching to kiosks can help eliminate ordering from a worker behind a counter and thereby reduce human-to-human contact.

But the problem with a lot of kiosks is that they still require people to use a touchscreen when it comes to browsing a menu and ordering food. Those touchscreens are, well, touched by a lot of hands throughout the day. This means that restaurants will need to not only clean the screens on a regular basis, but also make sure those sanitation efforts are on display so customers know the kiosks are clean.

One feature that could help mitigate any concerns over using public kiosks is gesture tech — that is, the ability to wave and swipe your hands in front of a device without actually touching it to activate it.

More than just waving your hand under a faucet to get it to run, gesture controls have been around for a long time. Check out this demo of gesture control via Kinect on the Microsoft Xbox from 2014. It shows how you can scroll and select on-screen objects with your hand while never actually touching the screen. My recollection of using the system back then was that it was pretty clunky, but advancements in computing power, computer vision and even AI have assuredly resolved those issues.
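One common touchless-selection pattern is dwell-to-select: hold your hand’s on-screen cursor over a menu item for a moment and it counts as a tap. Here’s a minimal, hypothetical sketch of that logic, assuming some hand-tracking layer (camera plus landmark model) supplies a cursor position each frame.

```typescript
// Hypothetical dwell-to-select logic for a touchless kiosk: a hand-tracking layer is
// assumed to supply a cursor position each frame; hovering over a menu item for long
// enough counts as a "tap".

type Point = { x: number; y: number };
type MenuItem = { name: string; bounds: { x: number; y: number; w: number; h: number } };

const DWELL_MS = 800; // how long the hand must hover to select

function hitTest(p: Point, items: MenuItem[]): MenuItem | null {
  return items.find(i =>
    p.x >= i.bounds.x && p.x <= i.bounds.x + i.bounds.w &&
    p.y >= i.bounds.y && p.y <= i.bounds.y + i.bounds.h) ?? null;
}

function makeDwellSelector(items: MenuItem[], onSelect: (item: MenuItem) => void) {
  let current: MenuItem | null = null;
  let dwellStart = 0;

  // Call this once per tracking frame with the hand-cursor position and a timestamp.
  return (cursor: Point, nowMs: number) => {
    const hit = hitTest(cursor, items);
    if (hit !== current) {            // moved onto a different item (or off the menu)
      current = hit;
      dwellStart = nowMs;
    } else if (hit && nowMs - dwellStart >= DWELL_MS) {
      onSelect(hit);                  // hovered long enough: treat as a tap
      dwellStart = nowMs;             // reset so it doesn't re-fire every frame
    }
  };
}

// Usage: feed it cursor positions from whatever hand tracker the kiosk uses.
const update = makeDwellSelector(
  [{ name: "Cheeseburger", bounds: { x: 40, y: 120, w: 200, h: 80 } }],
  item => console.log(`Added ${item.name} to order`),
);
update({ x: 100, y: 150 }, 0);
update({ x: 100, y: 150 }, 900); // > DWELL_MS later: "Added Cheeseburger to order"
```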

You can start to see some of this contactless tech with PopID, which makes kiosks that let you pay with your face. Obviously, as with any facial recognition, there are privacy concerns. But, rightly or wrongly, those more abstract concerns could get set aside by people who don’t want to physically interact with a public machine or insert their credit card.

Obviously, an easier solution would be to skip the kiosk altogether and just have people use their phone to order. In that way, the only object they touch is their own. That is true, but not everybody has a smartphone. Also, while people may download apps for big companies they frequent like Starbucks or McDonald’s, they won’t download an app for every single restaurant they visit. Plus, mobile apps are expensive to build and not every mom and pop restaurant can afford to make them.

We are just starting to figure out what people want from restaurant interactions. Perhaps they will be fine with ordering from an actual person wearing a mask and all of this is for naught. But if the virus doesn’t go away or rebounds in a meaningful way, we could all be waving hello to gesture controlled kiosks.

January 4, 2020

Here’s Your Handy CES 2020 Food and Kitchen Tech Preview & Walking Guide

Heading to CES?

Make sure to wear comfortable shoes, bring some Tylenol, and get ready for lots of food and cooking tech!

Having gone to the world’s biggest consumer tech show for well over a decade, I’ve gotten pretty good at finding new products that are of interest to me. That said, even for someone like myself who’s spent more than his fair share of shoe leather getting around the ugly carpets of Vegas, finding the latest in food tech has always been something of a challenge at a show where entertainment, robotics and car tech news usually steal most of the headlines.

The good news is that all started to change last year with the debut of Impossible Burger 2.0, and, based on my pre-show planning over the past couple of weeks, I expect to see a whole bunch of food and kitchen tech news at this year’s CES.

I figured I’d share some of my research by putting together a guide to what’s going on in food tech and smart kitchen at CES 2020 to help you make the most of your time in Vegas. I’ve even added booth numbers for most of the products to help you get there.

During the next few days, I’d also suggest you check back in here at The Spoon for stories, videos and interviews from The Spoon editor team. And, if you plan on making foodtech news at CES with a cool product you think we should check out, drop us a note at the tip line.

So here we go! Check out these food and kitchen tech products at CES 2020:

Robot Pizza: Back in October, attendees of the Smart Kitchen Summit got to sample pizza made by the Seattle based pizza robot startup Picnic. For those of you who couldn’t make it to Seattle, now’s your chance: Picnic’s pizzabot will be serving pizza at CES. Lines for food are long at CES, and I expect the pizza-robot lines at the Las Vegas Convention Center to be even longer.

Robot Ramen: If pizza isn’t your thing, you might want to make your way over to the Taiwan Tech Area in the Sands Eureka Park (Sands 51411) to check out another Smart Kitchen Summit alum, Yo-kai Express, which will be dishing up noodles from their robot ramen vending machine.

Beerbots: You’re gonna need something to chase the pizza, so you might want to check out a beerbot like PicoBrew (Sands 41518) or stop by Treasure Island for FoodTech Live (ticket needed) to see MiniBrew or BEERMKR. Also, while I haven’t heard anything out of LG yet about their beerbot, I am waiting to see if they’ll have an update on the beer brewing appliance they debuted at last year’s CES.

Drinkbots: If you’d like something a little stiffer (I’d suggest waiting till after noon, but this is Vegas and you are an adult), try out a cocktail robot like the Drinkworks (Sands 42546 and FoodTech Live) or Bartesian (Sands 40852).

Wine Tech. Oh, so you’re a wine snob, are you? Don’t worry, you can find that too. Earlier this week Chris wrote about the Albi from Albicchiere (Sands 52722), a cool countertop appliance that stores and serves your wine. Invineo will be showing its connected wine dispenser off as well (Sands 50863).

DNA-Driven Food Choices. Food personalization is moving beyond simple suggestions, and in the future it will get downright personal by creating diet plans based on a person’s biological makeup. If you want to check out a couple of companies looking deep inside your body to make food recommendations, check out DNA Nudge (Sands Booth 44316) or Sun Genomics (FoodTech Live – ticket required for entry).

Smart Countertop Cooking. There will be an array of different countertop cooking appliances that are powered by smart software and cook in new and interesting ways. CES will be the first chance to see a working version of the long-promised Anova smart oven (see our post here), which you can see in the Sands (booth 40946). The Julia, which is a multicooker reminiscent of the Thermomix, features guided cooking videos delivered via a touchscreen interface. You can find the Julia in the Sands at booth 41367. Speaking of Thermomix, they’ll be showing off the TM6 at FoodTech Live (ticket required). If you’re the smoothie type, check out the cool next-gen Millo blender (Sands 40346); the company will also be showing off a smart table with wireless power.

Intelligent Surface Cooking. I expect some interesting news in terms of smart cooking surfaces. One cool demo I plan to check out is the GHSP concept that is both an induction cooking surface and a touch interface (North Hall 3111). I also expect the Wireless Power Consortium to be showing off their Ki cordless kitchen platform at their usual spot in the Las Vegas Convention Center, on the walkway near the South Hall (South Hall SL-2).

Smart Home Gardening/Vertical Farming Systems. I’ve been following smart gardening systems for years at CES, but this is the first year we’ve seen big appliance brands jump in. As Jenn wrote earlier this week, LG will be showing off a new indoor gardening system at CES (Central Hall 11100). Not to be outdone, GE will be coming with its own home gardening kitchen concept called “Home Grown” which it will be showcasing at the Haier/GE booth in the central hall (Central booth 16006).

Home Food Robots. We’re not sure how fully fleshed out the Autokitch cooking robot concept is, but they’ll have a booth at CES (Sands 53034). And while it’s not quite a fully robotic kitchen concept or bread making robot, the Tigoût is a pod-based baking machine from Argentina that’s worth checking out. Drop in a pod, out spits a souffle or a raspberry muffin. You can see the Tigoût in action at Sands 52768.

Coffee and Tea, Please. If you’re looking for a nice cup of tech-powered tea, you should check out the Teplo tea maker, which will be at the Panasonic booth in the Sands (Sands 42711). If you’re more of a coffee person, then you’re in luck if you have a ticket to FoodTech Live: a production-line version of the grind-and-brew Spinn will make its long-anticipated debut. The Terra Kaffe – which grinds, brews and steams milk – will also be in attendance at Food Tech Live.

Alexa, Give Me a Coke. Sure, this one is kinda gimmicky, but we are talking CES after all. Amazon and Coca-Cola are teaming up for a voice-powered Amazon Alexa “Coke Energy Wall” where attendees will be able to ask Alexa for a coke and a smile (delivered, as the PR describes it, via “one of Alexa’s witty responses”). You can find the Amazon Alexa Coke Energy Wall at Sands booth #40934.

Smart Fridges. Smart fridges have been debuting at CES for years, and this year they are more evolved than ever. LG will be showing off its new InstaView ThinQ refrigerator, which uses computer vision and AI for real-time inventory of what’s inside (Central Hall 11100). Samsung will be back with its latest edition of the Family Hub smart fridge line, this time powered by Whisk, a smart food AI platform they acquired in spring of 2019 (Central Hall 15006). If you’re interested in products that make existing dumb fridges smart, Smarter will be showing off its smart fridge cam platform at FoodTech Live (ticket required).

Fake Meat. Impossible stole the show last year at CES with the debut of the Impossible 2.0. In retrospect, it was a brilliant move for the fake meat unicorn to unveil their next-gen meat at CES 2019, mostly because it would be the first time most journalists and attendees would have a bite of a plant-based burger. It also didn’t hurt that the 2.0 is much better than the 1.0. This year Impossible will be back, kicking things off with a press conference at 5 PM on Jan 6th and then serving up twenty-five thousand free samples of their new pork product and the Impossible Burger (I’m guessing it will be the new recipe/3.0 edition) at the Central Plaza of the Las Vegas Convention Center.

Smart Grillin’. The Weber folks partnered up with June late last year to add some software-powered cooking intelligence to their grill. You can swing by their booth at the South Hall to see what those two have cooking on the Barbie (South Hall SP-2). Also, Chris wrote up the news this weekend about Yummly’s new entry in the increasingly crowded smart thermometer space. You can check out that new product if you make an appointment with Whirlpool and swing by their meeting space at the Wynn (Wynn hospitality suites). Speaking of smart thermometers, Meater is back and you can check out their latest if you have a ticket to FoodTech Live.

Tiny Adorable Dishwashers. Like many of you kitchen nerds, I’ve been excited about the Tetra. Only problem is, Heatworks, the company behind the Tetra, is a bit behind on shipping their sexy little countertop cleaning machine, and so it looks like they are staying home this year and focusing on getting it out the door in 2020. But don’t worry! If you’re looking to scratch the countertop cleaning machine itch, head on over to the Daan booth to check out the equally adorable Bob (Sands 50819).

Ok, that’s it. While this list isn’t exhaustive, it’s a good start. If anything really interesting pops up before tomorrow’s CES Unveiled, I’ll update the post (send me any news I’ve missed via our tip form), and for more detailed updates make sure to check back here at The Spoon all next week!
