The Spoon

Daily news and analysis about the food tech revolution


January 27, 2026

Why Subtle Tech and Countertop Appliances, Not Robots, Are Driving Kitchen Innovation

For much of the past couple of decades, talk of the future kitchen at CES has conjured tech-forward images of robotic arms sautéing vegetables, humanoids flipping burgers, and, more recently, AI-powered assistants hovering over the stove. But during a conversation I had with a panel of kitchen insiders a couple of weeks ago in Las Vegas at The Spoon’s Food Tech conference, they made a compelling case that the future of cooking looks slightly more mundane, yet far more useful.

I was joined by Robin Liss, CEO of Suvie; Jonathan Blutinger, senior design engineer at Smart Design; and Nicole Papantoniou, director of the Kitchen Appliances Lab at the Good Housekeeping Institute. Together, they painted a picture of a near-term kitchen future shaped less by futuristic robots and more by quiet, behind-the-scenes intelligence.

To set the table (sorry), I started the conversation by asking where we’ve actually been over the last decade when it comes to the smart kitchen. Papantoniou said a core mistake made by early smart kitchen products was trying to solve problems consumers did not actually have. “A lot of people were putting smart features into products that you didn’t really need,” she said. “I don’t think people understood why they needed Alexa to make coffee for them.” Instead, she argued, success today comes from friction reduction. “It’s becoming way easier, very seamless, and people use it without even realizing it now.”

That shift toward subtlety was echoed by Blutinger, who said many early smart kitchen products were over-engineered. “Just because you can doesn’t necessarily mean you should,” he said. “It should be coming from a human need.”

Slap Some AI on It

A huge percentage of booths at this year’s CES claimed their product was AI-powered, which had me wondering whether today’s market risks repeating the mistakes of the smart kitchen a few years ago, when everyone was “slapping Wi-Fi on everything.” Liss argued that AI today is fundamentally different from the Wi-Fi-first era of connected appliances. “Almost all these products have embedded software or cloud-connected software,” she said. “The way we look at AI is it’s not some all-encompassing model… it’s integrations into steps of the process.”

Blutinger said AI’s biggest problem may be the overuse of the term by marketers, and that while the AI-ification of products is inevitable, both the label and the tech will eventually recede into the background. “That word alone has created such a stigma around it,” he said. “The technology should not be upfront and personal. It should be invisible in a sense.”

Papantoniou agreed, predicting consumer acceptance will likely be higher once AI fades into the background. “Once people stop advertising that it’s AI and it’s just part of the normal product, it’ll be way more accepted.”

Hold the Humanoids

As with my other session at CES focused on food robots, I asked the panelists when, if ever, we’d see humanoid robots walking around our kitchens. And just as with that other panel, they were skeptical.

“I still think that’s really soon for us to be seeing it in the home kitchen,” said Papantoniou. “Five years is soon.”

Liss said the adoption of food robots in the home would hinge on safety and practicality. “Food is inherently dangerous, and kitchen appliances dealing with high heat are inherently dangerous,” she said, noting that even in commercial settings, “getting the robot not to hurt the workers around it… that’s the hard part.”

Instead of humanoids, the panel advocated task-specific automation.
“We are designed as humans to do so many range of tasks,” said Blutinger. “Like we have to be perfect for so many things. It’s not like cooking takes up 100% of our time. So if we’re trying to optimize for just automation in the kitchen, why do we need these complex articulated (robot) arms doing things? Why not just have like a simple little one degree of freedom rotating thing that just rotates our sauce?”

Why Countertop Appliances Keep Winning

Despite talk of built-in, do-everything cooking boxes, the panelists agreed that innovation will continue to favor specialized countertop devices.

“I would say that probably the reason you’re seeing so many, the proliferation of lots of little countertop appliances, which makes me very happy, is because the innovation is happening there,” said Liss. “And frankly, if you look at the breakout companies, the stock performance of Breville, Shark Ninja, are, you know, Breville is larger than Whirlpool, Shark Ninja is many multiples larger than Whirlpool. It’s because all of the innovation is happening on the countertop because of that replacement cycle challenge of major appliances.”

Papantoniou was blunt about the trade-offs that come with multifunction. “There is that stigma that multifunctional appliances don’t do everything well. And while it’s gotten a lot better, I would say like an air fryer function in an oven is not going to compete with your basket air fryer.”

The Future of The Kitchen Has More Personalization and Less Friction

For my final question, I asked the panelists to look ahead and describe what they see for the kitchen over the next few years, and it was clear they were aligned around a quieter vision of progress.

Papantoniou predicted broader adoption as fear subsides. “People are adopting it more and not being so scared of it and not judging it as harshly, I think, as they did in the past. I think people actually do want their coffee maker to start working while they’re still in their bedroom. So I think that’s gonna just be coming more,” she said.

Blutinger focused on usability. “I think just reduce friction in the kitchen. That’s the biggest thing if you’re trying to innovate in the kitchen space.”

Liss closed with a vision for the future centered on humans, not robots. “I think it’s healthier, more personalized food, cooked how you want it. You’re getting to spend, most importantly, is families getting to spend time with each other happily enjoying meals for those everyday weeknight meals rather than spending an hour, mom spending an hour prepping the food or wasting money on really expensive delivery, right? It’s like a better life for people because they’re eating healthy, good food at home, saving money, and spending time with their loved ones.”

You can watch the full session below.

CES 2026: The Kitchen of the Future: AI, Robotics & Smart Tech


December 18, 2025

Shinkei Hopes Bringing Robotics & AI to the Fishing Boat Leads to Fresher Fish and Less Waste

Walk into almost any grocery store and, chances are, what you see in the fish case is not at peak freshness. 

You wouldn’t think it’d be that way, especially in places like Seattle, where I live, because such care is given to getting the fish to market quickly. But according to a new venture-backed startup named Shinkei, the critical factor in determining freshness is not what happens after fish leave the boat, but instead what happens in the moments immediately after they are caught. 

I caught up with Shinkei CEO Saif Khawaja earlier this month to discuss how exactly his company’s tech delivers what he claims is Michelin-quality fish at commodity cost. According to Khawaja, conventional handling on the boat fails because it leaves most fish “flopping around,” which triggers stress responses that accelerate quality loss and shorten shelf life.

“Most fish available at a mass market retailer were handled on the boat in a way that releases stress hormone, lactic acid,” said Khawaja. “This stuff makes the meat more acidic, primes bacteria growth, and in turn speeds the shelf life and decay of meat quality.” 

Shinkei’s computer-vision-powered robot is designed to intervene immediately. Fish are placed, still alive, into a machine the company calls Poseidon, which uses computer vision to scan each fish and determine the fastest (and least stressful) processing path. Once the fish is scanned, the machine performs a fast sequence: a brain spike to euthanize it as quickly as possible, followed by a gill cut to drain the blood.

If this system seems, well, rough, it is. But the reality is that fish experience high stress from the moment they’re caught, and the faster the fisherman can move toward euthanization, the more humane the process (and the fresher and better-tasting the fish). Khawaja says each fish is processed in about six seconds, and the company’s goal is to get the fish into the system quickly after landing, ideally within roughly a minute, before quality begins to degrade meaningfully.

Speed on the Boat = Less Waste in the Store

While much of the pitch is focused on better taste, Shinkei’s technology also has a food waste angle. According to Khawaja, the solution helps reduce waste in the store. That’s because a suffocated fish might typically enter rigor mortis in about 7 hours, while Shinkei’s process extends that to up to 60 hours, creating a much larger buffer before decomposition starts. Khawaja says it also makes a difference by species: black cod handled in the traditional way lasts four to five days, whereas Shinkei-handled fish can stay fresh for up to two weeks.

Khawaja attributes the compounding effect to two factors: reducing stress (less acidification) and removing blood that would otherwise diffuse through the meat and feed bacterial growth. He says the resulting shelf-life extension gives food distributors more options for logistics, allowing fish to be trucked rather than flown. 

If Shinkei’s technology works as promised, one might expect to see all professional fishermen and processors installing the hardware at some point, right? Maybe…not. That’s because the company’s business model is to create a branded direct-to-consumer model for its fish: instead of selling the hardware outright, Shinkei places machines on partner boats under a zero-cost lease and retains ownership of the machines. It also requires an exclusive buying arrangement that grants Shinkei the right to purchase any catch processed using its machinery.

From there, the company sells the fish into foodservice channels and retail under the brand Seremony, where it’s trying to get “Seremony grade” to catch on. Khawaja says the company has sold into top-tier restaurants globally, including Michelin three-star destinations across multiple countries, and recently launched in Wegmans (Manhattan) and FreshDirect (New York).

Today, Khawaja says Shinkei works with eight boats, sourcing species like black cod, rockfish (including vermilion rockfish), and red snapper, plus some ad hoc species (salmon, black sea bass, and others). The boats fish waters off the US West Coast (from Alaska down to California), Texas, and Massachusetts.

When I asked Khawaja about the underlying technology, he told me they built their AI models in-house, collecting their own data and building a pipeline informed by work like facial recognition research (fish face, that is). The computer vision stack performs a set of inferences: identifying species, detecting key points, and generating cutting paths.
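Khawaja didn’t detail the model internals, but the sequence he describes (species identification, key-point detection, cutting-path generation) maps naturally onto a staged inference pipeline. Here is a minimal sketch of that flow; the class names, key-point labels, and model callables are all hypothetical illustrations, not Shinkei’s actual code:

```python
from dataclasses import dataclass

@dataclass
class Keypoint:
    name: str   # hypothetical anatomy label, e.g. "brain" or "gill_arch"
    x: float
    y: float

def run_pipeline(image, species_model, keypoint_model):
    """Staged inference as described: classify, locate key points, plan cuts.
    `species_model` and `keypoint_model` are stand-ins for learned models."""
    species = species_model(image)               # 1. identify the species
    keypoints = keypoint_model(image, species)   # 2. detect anatomical key points
    # 3. generate a cutting path from the detected key points
    cuts = []
    for kp in keypoints:
        if kp.name == "brain":
            cuts.append(("spike", (kp.x, kp.y)))   # brain spike to euthanize
        elif kp.name == "gill_arch":
            cuts.append(("cut", (kp.x, kp.y)))     # gill cut to drain blood
    # the spike must precede the gill cut, per the sequence described
    cuts.sort(key=lambda c: 0 if c[0] == "spike" else 1)
    return species, cuts
```

The explicit ordering step reflects the process described above: euthanize first, then bleed, with everything downstream of a single scan.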

He also talked about two new projects they are working on within the platform. One is Kronos, a weight-estimation model embedded in the machine that sends catch data back to the Shinkei sales team in real time so they can start selling fish before it reaches the dock. Another is Nira, which uses sensors to predict shelf life.

“We integrate sensor data into a model, and we will be able to generate ground truth at any point in the supply chain for what shelf life and quality is for that fish,” said Khawaja. 

The company recently raised $22 million and is currently at Series A. The Series A was co-led by Founders Fund and Interlagos, with new investments from Yamato Holdings, Shrug, CIV, Jaws, and Mantis.

Long-term, I wondered whether the company was open to expanding to a model in which it sells its hardware to fishermen who don’t feed their catches back into the Seremony pipeline. Khawaja didn’t completely shut the door, but for now, the company is “focused on building the brand and basically establishing and making Seremony grade as a certification.”

November 13, 2025

We Talked With Nectar About Their Plans to Build an AI for Better Tasting Alt Proteins

A few weeks ago, the philanthropic investment platform Food System Innovations announced that it had received a $2 million grant from the Bezos Earth Fund. FSI’s non-profit group NECTAR has been building a large dataset of consumers’ sensory responses to alt proteins, and the grant will help NECTAR continue its work, in partnership with Stanford University, on an AI model “that connects molecular structure, flavor, texture, and consumer preference.” The goal, according to NECTAR, is to create an open-source tool for CPGs and other food industry players to develop more flavorful—and hopefully better-selling—sustainable proteins.

I’d been following NECTAR for some time and have been closely tracking the impact of AI on food systems, so I thought it would be a good time to connect with NECTAR. I’d talked about the project briefly with Adam Yee, the chief food scientist who helped with the project, while I was in Japan, and this week I caught up with NECTAR managing director Caroline Cotto to get the full download on the project and where it’s all going.

Below is my interview with Caroline.

What are you building with this new Bezos Earth Fund grant?

“One of the things Nectar is doing is we just won a $2 million grant from the Bezos Earth Fund to take our sensory data and build a foundation model that will predict sensory. So we kind of bypass the need for doing these very expensive consumer panels, and then also predict market success from formulation. It’s intended to be sort of a food scientist’s best friend in terms of new product ideation.”

For people who don’t know Nectar, what’s the core mission, and how did this AI project start?

“Basically, Nectar is trying to amass the largest public data set on how sustainable protein products taste to omnivores. That’s what we have set out to do. We’re building that, and we are working heavily with academics to operationalize that data.

Over a year and a half ago, we started talking to the computer science folks at Stanford to say, like, what are things we could do with this novel data set that we’re creating? It happened to be around that time that the phase one Bezos Earth grant was opening up for their AI grand challenge. I connected Adam with the Stanford team, and they did some initial work on LLMs and found that it was able to do some of this support for food scientists. They published a paper together that came out in January for ICML, the largest machine learning conference, and we ended up winning that phase one grant, which then allowed us to apply for the phase two grant that we just found out about in October.”

From a technical standpoint, what kind of AI are you actually building?

“I am not an AI scientist myself here, so we are heavily partnered with Stanford and their computer science team, but it is an LLM base. We’re basically fine-tuning an LLM to be able to do this sensory prediction work, and it’s a multi-modal approach. There’s a similar project that’s been done out of Google DeepMind called Osmo for smell and olfactory, and we’re working with some of the folks that worked on that in order to model taste and sensory more broadly, and then connect that to sales outcomes.”

How does the Bezos Earth Fund AI Grand Challenge work in terms of phases and funding?

“It’s the Bezos Earth Fund AI Grand Challenge for Climate and Nature. It’s $30 million going to these projects. There were 15 phase two winners that each received $2 million and have to deliver over two years.

The phase one was a $50,000 grant to basically work on your idea and prepare a submission for phase two. We spent about six months preparing, trying to connect this Nectar data set with sales data and see which sensory attributes are most predictive of sales success, and also connecting the Nectar sensory data set to molecular-level ingredient data sets. Ideally the chain of prediction would be: can you predict sensory outcome from just putting in an ingredient list, and if so, what about sensory is predictive of sales success? We’re working on the different pieces of that predictive chain.”
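The “chain of prediction” described above amounts to composing two learned mappings: formulation to sensory, then sensory to sales. A toy sketch of that composition follows; the feature names and linear formulas are invented for illustration, standing in where the real project would use fine-tuned models:

```python
# Toy version of the two-stage chain: ingredients -> sensory -> sales.
# All feature names and weights here are invented, not NECTAR's models.

def predict_sensory(ingredients: dict) -> dict:
    """Stage 1: map formulation features to predicted sensory scores (0-10)."""
    fat = ingredients.get("fat_pct", 0.0)
    salt = ingredients.get("salt_pct", 0.0)
    return {
        "juiciness": min(10.0, 2.0 + 0.5 * fat),
        "flavor": min(10.0, 3.0 + 0.4 * fat + 1.5 * salt),
    }

def predict_sales_index(sensory: dict) -> float:
    """Stage 2: map sensory scores to a relative sales index."""
    return 0.6 * sensory["flavor"] + 0.4 * sensory["juiciness"]

def chain(ingredients: dict) -> float:
    """The full predictive chain: ingredient list in, sales index out."""
    return predict_sales_index(predict_sensory(ingredients))
```

The point of the composition is the one Cotto makes: if each stage can be learned separately, a formulator can test "would this recipe sell?" without running an expensive consumer panel for every candidate.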

What does your sensory testing process look like in practice?

“It’s all in-person blind taste testing. In our most recent study, we tested 122 plant-based meat alternatives across 14 categories. Each product was tried by a minimum of 100 consumers. They come to a restaurant where we’ve closed down the restaurant for the day, but we want to give them that more authentic experience. They try probably six products in a sitting, one at a time, and everything is blind, so they don’t know if they’re eating a plant-based product or an animal-based product and then they fill out a survey as they’re trying the product.”

How big is the data set now, and what’s coming next?

“We do an annual survey called the Taste of the Industry. For 2024, we tested about 45 plant-based meat products. For 2025, we tested 122 plant-based meat products. Outside of that, we have our emerging sector research, which are smaller reports. We’ve done two of those, and both have been on this category we’re calling balanced protein or hybrid products that combine meat. We’ve tested just under 50 products total in that category as well.

We’re testing blends of things like meat plus plant-based meat, meat plus mushrooms, meat plus mycoprotein, meat plus just savory vegetables in general. For 2026, our Taste of the Industry report is on dairy alternatives. We’re testing 100 dairy alternatives across 10 categories, and that will come out in March.”

When you overlap taste scores with sales data, what have you seen so far?

“The Nectar data set is mostly just focused on sensory. That’s the core of what we do. We are also interested in answering the question ‘do better-tasting products sell more?’ In our last report, we conducted an initial analysis overlapping sensory data with sales data, finding that better-tasting categories capture a greater market share than worse-tasting categories. In certain categories, that seems to be agnostic of price. Even though a product might be more expensive, if it tastes better, it is capturing a greater market share.

We’re currently working with some data providers to get more granular on this sales data connection, because that analysis was from publicly available sales data. In this AI project, we are trying to connect sensory performance with sales more robustly to see which aspects of sensory are predictive of sales success. It’s hard because there are a ton of confounding variables; we have to figure out how to control for marketing spend, store placement, placement on shelf, that sort of thing. But we have access to the Nielsen consumer panel, this huge data set of grocery store transactions over many years, from households that have agreed to have all of their transactions tracked. We’re able to see what consumers are purchasing over time, and we’re trying to connect the sensory data set to that.”

You also mentioned bringing ingredient lists and molecular data into the model. How does that fit in?

“We’re trying to say, there are a lot of black boxes in food product development because flavors are a black box. We don’t have a lot of visibility into companies’ actual formulations. We’re trying to determine if we can extract publicly available information from the ingredient list and identify the molecular-level components of those ingredients, and then determine if any correlations can be drawn between them.

It’s all of these factors plus images of the products and trying to see if we can predict that.”

What do you actually hope to deliver at the end of the two-year grant?

“The idea is to deliver an open source tool for the industry to use. The goal would be that you can put in all the constraints you have for sustainability, cost, nutrition, and demographic need, and that it would help you get to an endpoint where you don’t have to do a bunch of bench-top trials and then expensive sensory.”

How do you think about open source, data privacy, and companies actually using this tool?

“Data privacy is a big thing in this space. We don’t have any interest in companies sharing their proprietary formulations with us. The goal is that they would be able to utilize this tool, download it to their personal servers, and put in their private information and use it to make better products. If we’re rapidly increasing the speed at which these products come to market and they are actually successful, that would be a success for us.”

There are other efforts in this space, from NotCo to IFT. Where does Nectar fit?

“I think everybody is trying to do similar things, but with slightly different inputs and different approaches. We are open to collaborating and learning from people. Our end goal is a mission-driven approach here, not to make a ton of money, so it depends on whether or not those partners are aligned with that goal.

IFT has trained its model on all of the IFT papers that have been published over the many years of its organization being around. We’re training our model on our proprietary dataset around sensory data, so there’s some nuance between things. They’re really focused on developing formulations, but there is a limitation to what you can do with that tool. It’ll tell you, ‘here’s how to make a plant-based bacon, add bacon flavoring,’ but there are 10 huge suppliers that provide bacon flavoring, and it doesn’t provide a ton of granularity on at what concentration and from what supplier.”

What’s the bigger climate mission you’re trying to advance with this work?

“Nectar’s specific directive is, how do we make these products favorable and delicious? We know that we need to reduce meat consumption in order to stay within the two degrees of climate warming, and we’re not going to get there by just telling people, ‘eat less steak.’ We have to use that whole lever and make the products really delicious so that people will be incentivized to buy them more and reduce consumption of factory-farmed meat.”

Answers have been lightly edited for grammar and clarity.

October 6, 2025

Are Big Food Companies Really Embracing AI?

While some companies like NotCo have positioned themselves as the OpenAI of the food world, the truth is that the AI transformation of the food industry is still in the first inning. That is partly because the food system itself, a mix of legacy CPG giants, agricultural suppliers, ingredient developers, and regulators, moves at a glacial pace.

In my recent conversation with Jasmin Hume, founder and CEO of Shiru, she confirmed that the industry is still in the early stages, in large part because food companies have massive amounts of data and strong confidence in their own research and development.

“Food companies have world-class R&D teams, and those scientists want to see proof before adopting a new tool. It’s a lot of tire-kicking in the first meetings,” said Hume.

This slow pace does not mean AI is not making inroads. It is simply happening beneath the surface. From discovery platforms like Shiru’s to optimization tools in manufacturing and retail analytics, AI is slowly reshaping how food gets made.

But the true question is not if the food system will use AI, but who will own the models that make it useful. The answer is usually tied to who owns the data. Legacy food and ingredient companies have decades, even centuries, of proprietary chemical, biological, and sensory data. This makes them both powerful and hesitant to engage with AI models that might use that data to build their own foundation models.

Big food brands “are not going to very quickly turn over that data,” said Hume. Many are debating whether to build their own in-house systems, using models fine-tuned on proprietary data that never leaves their servers. Others are beginning to explore partnerships with companies like NotCo and Shiru that specialize in the discovery layer.

That need for validation may be the biggest differentiator between food AI and other industries. As Hume put it, “You have to bring it into the lab and make sure that it actually works. Otherwise, the predictions are worthless.”

When Hume and I discussed whether large players like Microsoft or Google would eventually dominate vertical-specific foundation models, she acknowledged that possibility. However, she stressed that today’s large foundation models are not yet equipped to deal with the physical and regulatory realities of food. “There’s a ton of very specific know-how that goes into making those models usable for applications like protein discovery or formulation.”

For Shiru competitor NotCo, this highly specific data and domain knowledge are what the company is banking on to solidify its position as a key player in building a foundation model for food, a term now featured prominently on its website.

“I think what people need to understand is that AI is truly about the data sets and the value of the data sets that you have and the logic of the algorithms,” said NotCo co-founder and CEO Matias Muchnick in an interview I had with him in July at Smart Kitchen Summit. “It’s really hard to get to where we were, and specifically also because we weren’t just an AI company. We are a CPG brand, and we learned a lot from being a CPG brand.”

In conversations with AI experts outside of food, several have said we are starting to see the big foundation models open up to allow companies to train them with vertical or highly specific domain knowledge. One pointed to Anthropic’s Model Context Protocol (MCP), which lets a foundation model connect to external data sets to process answers.

Another example is Thinking Machines’ newly announced fine-tuning API called Tinker, which could make it significantly easier for a food brand to train a model with domain-specific knowledge by removing the heavy infrastructure and engineering overhead typically required for custom AI development.

For Shiru, NotCo, and others developing food and ingredient-focused AI, there is still significant opportunity because the field is still so early.

“We’re just starting to see companies thinking about their own internal instances,” said Hume. “A lot of this is in progress, boardrooms are having these discussions right now.”

One of the biggest holdups for food brands is that data ownership and business-model alignment remain unsolved. Who owns the training data and the resulting outputs is a key question, and without clear answers, many companies will hold their data close, limiting the ability of shared platforms to reach critical mass.

For that reason, Hume believes partnerships and licensing models, not open data exchanges, will drive progress in the near term. Shiru’s model focuses on IP discovery and licensing, which allows the company to build intellectual property value without requiring massive manufacturing investments. “Our IP portfolio has doubled year over year since 2022,” said Hume. “Now the focus is on monetizing that through licensing and sales.”

The topic of food-specific foundation models and the adoption of AI by food brands is a fast-moving one, so you’ll want to make sure to listen to this episode to get caught up. You can listen to our entire conversation below or find it on Apple Podcasts, Spotify, or wherever you get your podcasts.

September 30, 2025

MIDEA Shows Off NFC-Enabled Smart Clips to Help Track Food

Earlier this month at IFA, Midea showed off a range of new smart kitchen gear, including a new refrigerator lineup that features a smart clip system to keep track of food expiration dates.

The smart fridge system, called the INSTA-FIT MASTER, includes what the company calls “AI PREPMASTER” and NFC-based “AI Food Clips.” The clips, which include an NFC communication chip, can be assigned to different food items by the home user within the app.

As can be seen in the video above, users of the new fridge deploy a new NFC smart clip by walking through a series of choices in the app, such as food category (meat, dairy products, vegetables, etc.), where the food is stored (freezer, refrigerator), and recommended storage duration. According to Midea representative Haoyu Wang, the light in the clip storage rack will turn red when the food is hitting the end of its freshness window.
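Based on the app flow described, each clip effectively carries a small record (category, storage location, recommended duration) that the fridge checks against the clock to decide the indicator color. A hypothetical sketch of that logic; Midea hasn’t published its actual data model, so every name here is an assumption:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class FoodClip:
    """Hypothetical model of one NFC clip assignment from the app flow."""
    category: str            # e.g. "meat", "dairy", "vegetables"
    location: str            # "freezer" or "refrigerator"
    stored_at: datetime      # when the food went into the fridge
    shelf_life: timedelta    # recommended storage duration from the app

    def light_color(self, now: datetime) -> str:
        """Red once the freshness window has run out, per Midea's description."""
        remaining = (self.stored_at + self.shelf_life) - now
        return "red" if remaining <= timedelta(0) else "green"
```

The interesting design point is that the clip itself only needs to hold an identifier; the record and the countdown can live in the fridge or the app, with NFC used to match clip to food item.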

In some ways, the Midea clip system is reminiscent of the Ovie smart clip system, only unlike Ovie the system is integrated with the fridge itself. To our knowledge, it’s the first effort by an appliance maker to use a fridge-integrated clip system for food tracking. Other smart fridge efforts to track food have often featured built-in fridge cams and/or an app in which consumers logged food as they put it into the fridge.

The company indicated that the AI Food Clips and the AI PREPMASTER system will ship in 2026.

September 30, 2025

DoorDash Rolls Out Delivery Robot and an AI-Powered Delivery Orchestration Platform

There’s a new delivery robot in town, and its name is Dot.

DoorDash announced the new delivery bot today, confirming months of rumors that they were working on their own robot. The robot, developed entirely by its internal DoorDash Labs team, is the delivery company’s first homegrown bet on delivery automation and street-level autonomy, signaling the company sees owning more of the underlying robotics stack that powers delivery as a strategic priority.

Dot, which is roughly one-tenth the size of a car, can reach speeds up to 20 mph. DoorDash claims the system is designed to travel across sidewalks, bike lanes, and neighborhood roads, providing it with flexibility in navigating mixed urban environments. Watching the video of the Dot (see below), it’s clear that the robot really moves.

DoorDash Dot

“(Dot) is small enough to navigate doorways and driveways, fast enough to maintain food quality, and smart enough to optimize the best routes for delivery,” said Stanley Tang, Co-Founder and Head of DoorDash Labs. “Every design decision, from its compact size to its speed to the sensor suite, came from analyzing billions of deliveries on our global platform and understanding what actually moves the needle for merchants and consumers.”

As part of the Dot rollout, the company also introduced its Autonomous Delivery Platform (ADP). The company describes ADP as an AI-driven dispatcher that selects between human Dashers, robots, drones, or other modes based on order type, distance, and merchant requirements.

The company described its effort to build out the autonomy stack in a post on its engineering blog. According to the company, Dot continuously ingests multi-modal sensor data (LiDAR, cameras, radar) to detect obstacles, classify terrain, and localize itself within complex urban settings. That raw sensor input feeds into a perception module, which constructs a dynamic environmental model, identifying pedestrians, street furniture, curbs, driveways, and motion patterns of nearby agents. Above the perception layer lies the planning layer, which reasons about safe and efficient paths, lane transitions, sidewalk maneuvers, and mode switching (e.g. segueing from a bike lane to a sidewalk). Finally, the control or actuation layer translates those planned trajectories into smooth motor commands, ensuring payload stability and maintaining robust compliance with safety constraints.
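The layered stack described in the post follows the classic sense-perceive-plan-act loop used across mobile robotics. A schematic sketch of how those layers hand off to one another; all of the names here are placeholders for illustration, not DoorDash’s actual APIs:

```python
def control_cycle(sensors, perception, planner, controller):
    """One tick of the layered autonomy loop described above.
    Each argument is a placeholder callable for the corresponding layer."""
    raw = sensors()             # multi-modal input: lidar, cameras, radar
    world = perception(raw)     # dynamic model: obstacles, terrain, nearby agents
    trajectory = planner(world) # safe path, lane/sidewalk transitions
    return controller(trajectory)  # smooth motor commands

# A toy run with stub layers, just to show the data flow:
if __name__ == "__main__":
    cmd = control_cycle(
        sensors=lambda: {"lidar": [], "camera": [], "radar": []},
        perception=lambda raw: {"obstacles": [], "terrain": "sidewalk"},
        planner=lambda world: ["forward", "forward", "turn_left"],
        controller=lambda traj: {"throttle": 0.4, "steer": -0.2},
    )
    print(cmd)
```

The value of keeping the layers separable, as the blog post implies, is that any one of them (a new sensor suite, a better planner) can be swapped without rewriting the rest of the stack.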

In a way, the announcement of ADP as an orchestration layer is perhaps more interesting than the debut of a DoorDash-native delivery robot, as it marks a big step forward in DoorDash’s effort to build a multimodal delivery network management system. The system will streamline “handoffs today while laying the groundwork for more reliable, efficient deliveries as autonomy scales.” The company stated that it plans to work with third-party delivery technology companies (perhaps drone delivery providers, as well as other sidewalk delivery companies like Coco, with which it has already partnered).

With DoorDash’s announcement, it’s worth looking at the potential impact on the delivery company’s third-party technology partners such as Coco Robotics, whose bright pink sidewalk robots have been operating under the DoorDash app in Los Angeles and Chicago since 2021. While DoorDash says it plans to keep working with third-party providers, the framing of Dot as “purpose-built for local commerce” suggests a long-term intent to shift from relying on external robot vendors to deploying its own fleet.

After the publication of this story, a DoorDash representative reached out with a comment on their relationship with Coco and other partners: “Coco is a long-term partner, offering sidewalk robot delivery for DoorDash customers in select U.S. markets – including Los Angeles, Miami, and Chicago – as well as in the EU, in Helsinki – and we’re focused on continuing to scale that partnership. Built for dense urban environments and sidewalks, Coco is designed to handle a range of delivery scenarios. With DoorDash handling millions of deliveries a day, different types of robots are required for different situations – from dense urban areas to suburban neighborhoods – which is why our multimodal strategy is so important, and why Coco is an integral part of that strategy.“

You also have to wonder if DoorDash’s move to create its own delivery bot is an answer to Uber’s strategic partnership with Serve. Some might remember that Serve came to Uber through its acquisition of Postmates and was spun out in 2021; Uber still owns a stake in Serve Robotics, and the two companies work closely together.

September 15, 2025

Fresco Partners Up With E.G.O. to Accelerate Smart Kitchen Software

As a former semiconductor analyst, I realized one truism of the tech world: makers of component building blocks for any hardware product eventually move up the stack, integrating with software solutions so their OEM partners see them as the preferred foundation on which to build their products.

So it’s not surprising to hear that one of the world’s leading makers of induction cooking hardware, E.G.O., has partnered with Fresco, a company that provides software to enable smart kitchen hardware solutions. The deal, announced at IFA last week, will allow E.G.O. to add smart kitchen functionality to its induction cooking systems and appliance control systems.

For Fresco, the partnership makes it possible to fast-track appliance partner integration. Because the software is integrated at the component level, it will require less customization for each new system powered by their smart kitchen software.

“We realized instead of trying to partner with all of the brands individually, that it’d be much smarter to go to the source,” said Ben Harris, CEO of Fresco, in an interview at IFA. “So we met E.G.O. here a year ago and aligned on the potential future of a partnership. I’m delighted to be here at IFA to now announce the kickoff of the partnership and sort of where this can ultimately go.”

According to Harris, the partnership means Fresco no longer needs to integrate one-on-one with dozens of different brands. Instead, by working directly with E.G.O., Fresco’s technology can be “available out of the box.” Harris explained: “It’s two things, both from a sales point of view, that we don’t need to speak to all hundred brands, but also we don’t need to do the small individual integrations with every one of them. The integration is available out of the box.”

It has been interesting to watch Fresco (formerly Drop) evolve from its early days as a maker of a connected kitchen scale to a software company that, historically, had to do significant customization for each appliance partner. With its new partnership with E.G.O., my guess is this will accelerate their partner growth while also making a smart software/connectivity stack a more standard part of the broader bill of materials for new appliances.

Fresco has remained fairly true to its early focus (post-scale) of being a smart kitchen software ingredient provider, while others like SideChef and Innit have focused more on commerce and, more recently, food and wellness-related AI solutions for CPG partners.

You can watch my conversation with Ben Harris at IFA below.

Fresco's Ben Harris Talks New Partnership with E.G.O at IFA

August 25, 2025

Japan’s Most Successful Food Robot Startup is Eyeing Humanoids As The Next Big Thing

One of the defining characteristics of early successful food robots has been focus.

Whether it’s the Flippy burger-flipping robot, Bear’s front-of-house robotic waiters, or the Autocado avocado-coring and processing robot, the ability of these focused-task robots to automate one or two core functions hyper-efficiently has been a – if not the – key ingredient for success that has set these machines apart from their less successful peers.

But as the world of AI and robotics increasingly talks up humanoids as ready for primetime, are multi-function robots that more closely resemble humans – both in appearance and in their seemingly unending ability to tackle different kinds of tasks – set to take the baton as the next big thing in food robots?

If you ask Yuji Shiraki, the CEO of one of Japan’s fastest-growing tech startups and a darling of the food robotics world, the answer is a cautious yes.

The idea to build a food robot first came to Shiraki during a visit with his grandmother. Over 90 years old, Shiraki’s grandmother could not cook for herself, and so he started to think about how a home cooking robot might help her. However, he soon realized that Japanese kitchens were too small for the type of robot he envisioned, and he started thinking about building robots for restaurants.

I first met Shiraki in 2022 at his roboticized pasta restaurant in Tokyo, E Vino Spaghetti, located across from Tokyo Station. Inside, its P-Robo robot boiled pasta, heated sauces, plated dishes, and even handled cleanup, all in just over a minute.

After the pasta restaurant, TechMagic built Oh My Dot, a ramen restaurant in Shibuya, where a robotic system prepared noodles using modular flavor packs. In both cases, the goal was to test the products with real customers while showcasing the company’s robotics to potential partners.

And the partners came. The company began working with KFC Japan, Nissin Foods (the company behind Cup Noodles), and most recently announced a partnership with Lawson.

One of the key reasons for the company’s early success was that its robots were highly tailored to specific tasks like preparing pasta, ramen, or bowl food. But now, as TechMagic and Shiraki look to the future, they see a path forward built around robots that are, like humans, much more adaptable and multi-functional.

The company outlined some of those functions in a recent announcement, saying they envision humanoids expanding human “hands” and “judgment.” Specific functions include automating repetitive tasks such as serving, sorting, and transporting food in restaurants and factories; flexible food preparation that uses AI-driven “hand technology” to perform complex cooking tasks; and customer interaction, where humanoids would optimize service and store operations using emotion and behavior recognition.

Shiraki offered clues to his bigger vision for TechMagic’s humanoid plans in a post on Facebook:

“When we began developing cooking robots in 2018, many said the ‘chances of success were slim.’ Yet today, we’ve grown to the point of competing for the top global share of operating units. The hurdles for humanoid robots may be just as high, but in the long term, we believe this is an extremely rational strategy. And beyond that, the development of humanoid robots is full of dreams and romance.”

Part of Shiraki’s motivation is to help position Japan as a leader in developing humanoid robots.

“While China and the U.S. are leading the way, we intend to contribute to labor-strapped industries with a Japan-born humanoid robot and expand globally.”

He says the company is hiring and looking for partners. One of the first steps is working with existing humanoid robots such as those from Unitree in the TechMagic development lab. Shiraki even showed off a video of the Unitree robot on Facebook.

Shiraki told me they are eyeing around a three-year time horizon to develop their first humanoid for the food business and that the company is now busy raising its next funding round to help fund the development.

“TechMagic is taking the technology it has cultivated in cooking and service robots to the next level, fusing it with humanoid robotics to create a new ‘future of food’,” said Shiraki. “We aim to build a social infrastructure that frees people from boring, harsh, and dangerous tasks, enabling them to live more creatively.”

August 14, 2025

Sure, AI Might End Humanity, But First It Could Help Keep Your Food Fresher

If you ask Steve Statler, our current supply chains are essentially the equivalent of an old-school combustion engine (at best), and at worst something akin to a horse-drawn carriage.

“We’re running our supply chains with 19th-century visibility,” said Statler, the CEO and cofounder of AmbAI and host of the Mr Beacon Podcast. “The future is automating it completely, so we see everything everywhere all at once. We improve safety, we reduce waste, we increase shelf life.”

And while things may be largely stuck in the past, where we track food from farm to fork relying on things like barcodes, manual scans, and occasional checkpoints – the end result of which is blind spots that lead to waste, quality loss, and safety risks – Statler believes we are on the precipice of dramatic change.

Statler believes much of the change will come as a result of the broad deployment of tiny, battery-free Bluetooth “stickers” and AI systems capable of reading, analyzing, and acting on their data in real time. “Basically, the cost of infrastructure to read these tags automatically is going down, down, down,” he said. “Over the next one to three years, these tags can harvest energy from the mobile devices surrounding us. And that’s the unlock.”

The size of a postage stamp, these ambient IoT tags continuously transmit information on temperature and location without human intervention. We will “improve safety, we reduce waste, we increase shelf life,” Statler said, describing a not-too-distant future where every pallet, package, or even piece of produce is monitored end-to-end.

According to Statler, these types of tags could change the way we track food inventory in our fridge, with “dynamic expiry dates” that respond to actual conditions rather than rough estimates. “You talk to Alexa and you say, ‘when is this milk or this salmon or this shrimp going to expire?’ and it will know,” said Statler. “We will have looked at the temperature over time that the product has been exposed to, and we can come up with a 21st-century model of how long the product will last.”
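A simple way to picture a “dynamic expiry date” is the common Q10 rule of thumb from food science: roughly, every 10 °C above a reference storage temperature doubles the rate at which a product ages. The sketch below applies that rule to a tag’s temperature log; the reference values and Q10 factor are illustrative assumptions, not figures from Statler or AmbAI.

```python
# Minimal dynamic-expiry sketch using a Q10-style acceleration model.
# base_life_hours, ref_temp_c, and q10 are illustrative assumptions.

def remaining_shelf_life_hours(temp_log, base_life_hours=168.0,
                               ref_temp_c=4.0, q10=2.0):
    """temp_log: list of (hours_elapsed, avg_temp_c) intervals from a tag."""
    consumed = 0.0
    for hours, temp_c in temp_log:
        # Aging accelerates by a factor of q10 per 10 degrees C above reference.
        rate = q10 ** ((temp_c - ref_temp_c) / 10.0)
        consumed += hours * rate
    return max(base_life_hours - consumed, 0.0)

# Milk kept refrigerated for 48 hours, then left out at 24 C for 2 hours:
print(remaining_shelf_life_hours([(48, 4.0), (2, 24.0)]))  # -> 112.0
```

The interesting shift is that the answer depends on the product’s actual temperature history rather than a printed date: two cartons bottled on the same day can have very different remaining lives once a tag has recorded how each one was handled.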

Statler pointed out that Alexa and other home assistants are capable of this today with a small software upgrade. I pointed out that allowing Alexa to track freshness by accessing Bluetooth data emitted from various devices and smart tags in your fridge would require consumer opt-in, especially given growing consumer concerns about privacy and access to their data.

“I think Amazon is very sensitive to that, and when they do this, and this is just me speculating, then they’ll do it with privacy in mind,” said Statler. “I believe that privacy, when done badly, can kill products.”

I also asked Statler if these types of small beacons are connecting with other IoT systems, like Strella or others that sense changing food chemistry to better predict and manage freshness, and he said they’re starting to, but it’s in the early stages. The primary focus of these beacon systems right now, Statler says, is on temperature and identity. The tags are also part of a larger trend toward serialization, where every individual product has its own digital passport for authenticity, traceability, and freshness management.

Feeding this data into AI could shift the industry from traditional supply chains to responsive ‘demand chains.’

“We can do a better job of making better products that people use,” Statler said. “And we can start to go from supply chains to ‘demand chains’ that are informing the production and distribution to be much more efficient.”

It was an interesting conversation, one in which Statler was clearly excited about the potential for AI in our supply chains and in our lives, but also saw a potential danger lurking.

“I’m a little pessimistic about where AI is going. I sort of have this dual view of artificial intelligence, which is it’s amazing, and this is why I got into computing years and years ago. But at the same time, there’s a real chance it’s going to kill us all or enslave us. And I think we have to kind of live with that duality in our heads and do our best to try and make sure that this technology evolves in a positive direction.”

You can listen to my full conversation with Steve by clicking play below, on Apple Podcasts, Spotify, or wherever you get your podcasts.


July 31, 2025

NotCo’s Next AI-Powered Innovation? Replicating Human Scent to Make My Dog Happy

As far as my family’s small Pomeranian, Zeus, is concerned, I’m a very distant second banana when it comes to the humans in his life. Sure, he’ll let me feed him and pay the cheese tax, but the reality is he’s only got eyes (and a nose) for one person in his life, which happens to be my wife.

Like many loyal dogs, when my wife is out of the house, Zeus finds comfort in lying on blankets, sweaters, or anything that may have a whiff of his favorite person’s scent. Where things get rough for the little guy is when we have to travel, but someday soon we’ll be able to bring a bottle of “mom” fragrance to provide a little canine aromatherapy when we drop him off at the dog sitter.

That’s at least according to NotCo CEO Matias Muchnick, with whom I sat down this week at the Smart Kitchen Summit to talk about his company and its journey as a pioneer in leveraging AI to develop new food (and now pet) products.

“We’re partnering with one of the biggest pet companies in the world to generate human scent,” said Muchnick. “Literally, it’s like a 23andMe for your smell.” The idea, according to Muchnick, is to use an AI model to do scent profiling and create a mist that replicates your scent, helping ease separation anxiety for pets when their humans leave home.

NotCo Wants to Create the 23andMe of Scent for our Pets

There’s no doubt that this new direction is leveraging some of the work that NotCo has done in building out a “Generative Aroma Translator”, which the company unveiled at the Spoon’s Food AI Summit last fall. “The system intakes your prompt, such as ‘an ocean scent on a breezy summer day on a tropical island’ to create a novel chemical formulation of that scent in one-shot,” said the company’s former chief product officer, Aadit Patel.

Now add in an extra layer of personalization, which includes your own odor and all the notes you pick up as you travel through the world.

“We will get you a report of your top notes of your own body whenever you get back home,” said Muchnick. “If you work in an office, it’s going to be an office, depending on the office that you work in. If you’re a mechanic, you’re going to have a lot of other odors.”

Muchnick kept quiet on who the partner is or what the actual product would look like, but did indicate this project is one of hundreds of new projects since the company doubled-down on being an AI-powered innovation engine for CPG brands.

“Our first investor decks in 2016 were all about AI,” Muchnick said. “But no one believed in it back then, so we had to prove the model ourselves.” NotCo’s path to validation came by launching its own consumer products, such as mayo, ice cream, and burgers, and capturing market share in Latin America and North America, after which big players couldn’t help but take notice.

Today, NotCo is firmly in phase two of its journey. Through partnerships with companies like Kraft Heinz, Starbucks, and PepsiCo, the company is showing how Giuseppe, its AI platform, can help brands rapidly create new product formulations and adapt to regulatory or consumer-driven upheaval, such as the recent push to remove synthetic dyes or respond to GLP-1-driven shifts in eating habits. He said the company has over 50 active color replacement projects.

The difference between now and just a couple of years ago is drastic when it comes to big food’s receptivity to working with AI. Curiosity and hesitation have melted away, replaced by eagerness and a sense of urgency.

Who he’s talking to has also changed. What used to be R&D director conversations are now CEO-level discussions. “AI is no longer optional,” said Muchnick. “If they don’t adapt, they’ll face the blockbuster effect. They’ll become obsolete.”

You can watch the replay of the full interview at The Spoon next week.

July 21, 2025

From Aspiring Pro Surfer to Delivery Robot CEO with Coco’s Zach Rash

Zach Rash wanted to be a professional surfer. So much so, that in high school, there was more surfing than academics.

That all changed when Rash reached UCLA and met Brad Squicciarini. It wasn’t long before the two spent every waking hour together in a small room building robots.

“We spent like our entire life in this like box at UCLA with no windows, and we’re just building robots from scratch, and it was the best job ever.”

Eventually, the real world came knocking as Rash and Squicciarini graduated and had to find jobs. After applying for many of the same positions, they eventually decided they should just start their own robot company.

“We just had a lot of really strong opinions about what it would take to get these things into the world and make them useful. So… decided to do it ourselves.”

Coco launched in early 2020. “We started building them in our living room and we couldn’t get more wheels… so it was a bit of a sketchy robot.” Still, their first merchant deployment went smoothly. “The first day of the business, I mean, we gave it to a merchant and Brad and I just took turns driving it and fixing it.”

They faced steep financing challenges: “We didn’t have any money… Even if you’re only building a few, you know, it’s still going to cost you tens of thousands of dollars.” They pitched more than 200 investors before raising a modest $50,000 to start. “We thought that was a lot of money and we built a few robots with that and kind of proved out that we could run a service, not just build the robot.”

Their persistence paid off. In June 2025, Coco raised $80 million, led by angel investors Sam Altman and his brother Max, alongside Pelion Venture Partners, Offline Ventures, and others. 

This brought Coco’s total funding to over $110 million, which Rash says the company plans to use to scale its operations and technology.

“Coco Robotics will use the new funds to improve the technology and to scale up its fleet,” Rash told TechCrunch. “The company expects to go from low‑thousands to 10,000 robots by the end of next year.”

According to the company, Coco bots have delivered over 500,000 items to date, working with retailers like Subway, Wingstop, Jack in the Box, Uber Eats and DoorDash.

It’s only been a few short years since Rash was largely concerned with surfing, but now, armed with funding and lots of interest from retail partners, he’s ready to ride the wave of growth at his robot delivery company.

“We’re building as many as we can as fast as we can.”

Zach will be speaking at SKS 2025 tomorrow, so make sure to get your tickets. You can listen to our conversation on the latest episode of The Spoon podcast below, on Apple Podcasts or Spotify, or wherever you get your podcasts.

July 9, 2025

Thermomix Has Long Been a Leader in Cooking Automation, But Now They’re Going Full Robot

For years, I’ve said that the Thermomix is quite possibly the most successful automated cooking appliance in the world. Sure, it’s not a humanoid robot or what we’ve come to expect from cooking robots in recent years, but the TM6 and TM7 are software-powered cooking appliances that automate and sequence functions in a way that feels surprisingly intelligent, especially compared to typical countertop or built-in kitchen appliances.

But now, if recent moves by Thermomix’s corporate parent, Vorwerk, are any indication, Thermomix may be going full robot. At last month’s Automatica conference in Munich, Thermomix and red-hot German robotics startup Neura Robotics announced a partnership in which Neura’s humanoid robot used Vorwerk’s Thermomix and Kobold vacuum cleaners to perform everyday household tasks.

According to Neura CEO David Reger, optimizing his robots to work with Vorwerk’s cooking and cleaning appliances is a step toward building an aging-in-place platform powered by humanoids.

“Together with Vorwerk, we are redefining household robotics – with intelligent assistants that provide concrete relief for people in their everyday lives: from cooking to independent living in old age,” said Reger.

Even more interestingly, Vorwerk also announced a partnership with AI and chip giant NVIDIA last month. According to the announcement, “Vorwerk is post-training NVIDIA Isaac GR00T N1, an open robot foundation model, to support families around the home, whether seniors looking to maintain their independence, or busy families in need of an extra pair of hands. To post-train the model, Vorwerk is leveraging the Isaac GR00T-Mimic data pipeline to generate large, diverse synthetic motion data to prepare robots for common household tasks such as cooking, cleaning, and more.”

“Together with NVIDIA Robotics we are now taking a significant step towards the connected and automated home,” wrote Vorwerk CEO Thomas Rodemann on Linkedin. “Our goal: creating integrated digital/physical ecosystems that support our community in their everyday lives and make the home more convenient for everyone – whether it’s providing busy families with an extra pair of hands or giving seniors more independence.”

When Jensen Huang showed up at CES in January and said that the ChatGPT moment for robotics is right around the corner, I’m not sure he was thinking about cooking food with the Thermomix, but maybe he was. Vorwerk would be a logical candidate to build true home robot assistants: it is already progressing rightward on the continuum from simple tool to fully capable robot, and it has been the most successful at integrating software with home cooking automation.

You can watch the video of the NVIDIA-powered robot making food with the Thermomix in the video below.



© 2016–2026 The Spoon. All rights reserved.
