
The Spoon

Daily news and analysis about the food tech revolution


AI

January 26, 2026

Gambit Robotics Hopes to Usher In a New Era of Guided Cooking Without Robots (Yet)

Coming out of CES earlier this month, you might think a new kitchen assistant from a startup called Gambit Robotics would look something like the dozens of humanoid robots roaming the show floor in Las Vegas.

Instead, the company’s newest product, launching on Kickstarter tomorrow, is something much more familiar, closer in spirit to the guided cooking systems that began to emerge in the smart kitchen over the past decade, albeit with a computer-vision-driven twist.

The eponymously named Gambit, described by the company as an “AI sous chef,” uses an AI-powered computer vision system mounted above the stove to detect heat patterns and track cooking progress. From that vantage point, the device can see what’s happening in the pan, monitor burner activity, and sense temperature changes. According to the company, users can drop in almost any recipe, whether from a website, a photo of a handwritten card, or a family favorite, and the system will break down the steps and follow along as you cook. The home cook is guided by a “conversational” voice interface and an associated Gambit app.

Company cofounder Nicole Maffeo says Gambit provides guidance and coaching you “can turn on or off,” including educational nudges designed to help users improve over time. “You can leave the kitchen,” she said.

The Future of Cooking Starts Here

As for the company’s longer-term ambitions, Maffeo says Gambit’s vision extends well beyond a single device. She and cofounder Eliot Horowitz see an eventual ecosystem of kitchen assistants, including devices that understand what’s in your pantry or fridge and connect shopping, planning, and execution.

The company is building on top of a platform created by Horowitz for his company called Viam, which I described as something of a “WordPress for robotics” when I interviewed him in November. Down the road, that ecosystem could include robotic arms or deeper appliance integrations. In the near term, however, Maffeo says the company is also exploring software licensing opportunities with appliance makers, particularly around its computer vision and thermal sensing stack.

“We don’t need to own every piece of hardware,” Maffeo said. “If there’s a hood above a stove, that software should be there.”

Gambit plans to price the hardware at roughly $500 at retail, with early Kickstarter backers receiving a modest discount. The company is pairing the device with a monthly subscription, expected to land between $9 and $15. Maffeo says Gambit is targeting Q3 of this year for shipping products to consumers.

Rather than a walking, talking kitchen robot chef, Gambit strikes me as much closer in function to the guided cooking systems that were a major focus of smart kitchen startups a decade ago. Companies such as Hestan, ChefSteps, and Thermomix paired software, sensors, and cooking hardware to create cooking assistants. While guided cooking eventually faded as many of those products failed to find the level of success their creators hoped for, Maffeo believes the timing is right this time around, thanks to advances in AI systems that can make these tools work better.

She may be right, but a couple of questions remain. The first is whether consumers will understand what this product actually does. Gambit is not a cooking device, but a cooking guidance system. The promised benefits are similar to those offered by earlier guided cooking products, such as the Hestan Cue, except Gambit delivers those benefits through a device mounted above the stove rather than through smart cookware or appliances.

The second question is whether consumers will be willing to pay for those benefits in the form of a $500 device and an ongoing subscription. Consumers have historically been reluctant to spend on entirely new kitchen product categories, and this is not an insignificant price tag.

There’s little doubt the technology itself is impressive, and it’s encouraging to see experienced entrepreneurs like Horowitz and Maffeo looking to the kitchen as a place to apply AI-enabled technology. I’ll be watching closely to see whether Gambit can help usher in a new era of guided cooking, this time powered by AI and, eventually, robotics.

The product launches on Kickstarter tomorrow, January 27th.

December 18, 2025

Shinkei Hopes Bringing Robotics & AI to the Fishing Boat Leads to Fresher Fish and Less Waste

Walk into almost any grocery store and, chances are, what you see in the fish case is not at peak freshness. 

You wouldn’t think it’d be that way, especially in places like Seattle, where I live, because such care is given to getting the fish to market quickly. But according to a new venture-backed startup named Shinkei, the critical factor in determining freshness is not what happens after fish leave the boat, but instead what happens in the moments immediately after they are caught. 

I caught up with Shinkei CEO Saif Khawaja earlier this month to discuss how exactly his company’s tech delivers what he claims is Michelin-quality fish at commodity costs. According to Khawaja, conventional handling on the boat fails because it leaves most fish “flopping around,” which triggers stress responses that accelerate quality loss and shorten shelf life.

“Most fish available at a mass market retailer were handled on the boat in a way that releases stress hormone, lactic acid,” said Khawaja. “This stuff makes the meat more acidic, primes bacteria growth, and in turn speeds the shelf life and decay of meat quality.” 

Shinkei’s computer-vision-powered robot is designed to intervene immediately. Fish are placed, still alive, into a machine the company calls Poseidon, which uses computer vision AI to scan each fish and determine the fastest (and least stressful) path forward. Once the fish is scanned, the machine performs a fast sequence: a brain spike to euthanize it as quickly as possible, followed by a gill cut to drain the blood.

If this system seems, well, rough, it is. But the reality is that fish experience high stress from the moment they’re caught, and the faster the crew can move toward euthanization, the more humane the process, and ultimately the fresher and better-tasting the fish. Khawaja says each fish is processed in about six seconds, and the company’s goal is to get the fish into the system quickly after landing, ideally within roughly a minute, before quality begins to degrade meaningfully.

Speed on the Boat = Less Waste in the Store

While much of the pitch is focused on better taste, Shinkei’s technology also has a food waste angle. According to Khawaja, the system helps reduce waste in the store. Typically, a suffocated fish might enter rigor mortis in about seven hours, but Shinkei’s process extends that to as much as 60 hours, creating a much larger buffer before decomposition starts. Khawaja says the difference also varies by species: black cod handled the traditional way lasts four to five days, whereas Shinkei-handled fish can stay fresh for up to two weeks.

Khawaja attributes the compounding effect to two factors: reducing stress (less acidification) and removing blood that would otherwise diffuse through the meat and feed bacterial growth. He says the resulting shelf-life extension gives food distributors more options for logistics, allowing fish to be trucked rather than flown. 

If Shinkei’s technology works as promised, one might expect to see all professional fishermen and processors installing hardware at some point, right? Maybe…not. That’s because the company’s business model centers on building a branded direct-to-consumer channel for its fish: instead of selling the hardware outright, Shinkei places machines on partner boats under a zero-cost lease and retains ownership of them. It also requires an exclusive buying arrangement that grants Shinkei the right to purchase any catch processed using its machinery.

From there, the company sells the fish into foodservice channels and retail under the brand Seremony, where they’re trying to get “Seremony grade” to catch on.  Khawaja says the company has sold into top-tier restaurants globally, including Michelin 3-star destinations across multiple countries, and recently launched in Wegmans (Manhattan) and FreshDirect (New York).

Today, Khawaja says Shinkei works with eight boats, sourcing species like black cod, rockfish (including vermilion rockfish), and red snapper, plus some ad hoc species (salmon, black sea bass, and others). The boats fish waters off the US West Coast (Alaska down to California), as well as Texas and Massachusetts.

When I asked Khawaja about the underlying technology, he told me they built their AI models in-house, collecting their own data and building a pipeline informed by work like facial recognition research (fish face, that is). The computer vision stack performs a set of inferences: identifying species, detecting key points, and generating cutting paths.
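To make the shape of that stack concrete, here is a minimal sketch of such a chained-inference pipeline. Everything below is illustrative: the function names, landmark names, and coordinates are hypothetical stand-ins, not Shinkei’s actual models or API.

```python
from dataclasses import dataclass

def classify_species(image):
    # Stand-in for a trained species classifier.
    return "black cod"

def detect_keypoints(image):
    # Stand-in for a trained landmark detector.
    return {"brain": (120, 40), "gill": (150, 60)}

@dataclass
class FishScan:
    species: str     # predicted species label
    keypoints: dict  # anatomical landmarks -> (x, y) pixel coordinates
    cut_paths: list  # ordered machine actions: brain spike, then gill cut

def plan_cuts(keypoints):
    """Turn detected landmarks into an ordered cut plan (toy geometry)."""
    return [("spike", keypoints["brain"]), ("cut", keypoints["gill"])]

def scan_fish(image):
    """Chain the inferences Khawaja lists: species -> keypoints -> cut paths."""
    keypoints = detect_keypoints(image)
    return FishScan(classify_species(image), keypoints, plan_cuts(keypoints))

scan = scan_fish(image=None)
```

In a real system, each stub would be a trained network, and the cut planner would account for species-specific anatomy rather than a fixed landmark lookup.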

He also talked about two new projects they are working on within the platform. One is Kronos, a weight-estimation model embedded in the machine that sends catch data back to the Shinkei sales team in real time so they can start selling fish before it reaches the dock. Another is Nira, which uses sensors to predict shelf life.

“We integrate sensor data into a model, and we will be able to generate ground truth at any point in the supply chain for what shelf life and quality is for that fish,” said Khawaja. 

The company recently raised a $22 million Series A, co-led by Founders Fund and Interlagos, with new investments from Yamato Holdings, Shrug, CIV, Jaws, and Mantis.

Long-term, I wondered whether the company was open to expanding to a model in which it sells its hardware to fishermen who don’t feed their catches back into the Seremony pipeline. Khawaja and Shinkei didn’t completely shut the door, but for now, they’re “focused on building the brand and basically establishing and making Seremony-grade as a certification.”

November 18, 2025

Can AI Help Chocolate Survive? NotCo and Swiss Chocolate Maker Barry Callebaut Think So

Is the world of chocolate heading toward the same fate as the dodo bird?

It may be a surprise to some to consider that a centuries-old, worldwide favorite like chocolate is on its way out, but the reality is that most experts agree that climate change, ingredient shortages, rising prices, and other global pressures have put chocolate in jeopardy. Some even predict that it could one day become extinct.

The impending peril has meant that every global chocolate supplier has started looking for ways to adapt to the future, and increasingly, one of those ways is to do what many companies both in and outside of food are doing: look to AI to accelerate change. Some, like Hershey’s, have developed their own tools such as Atlas, while others are looking to companies with deep experience building AI models focused on food to help transform their business.

In that second category is a new partnership between Barry Callebaut, one of the world’s largest premium chocolate makers, and NotCo, the AI-powered food company that has made a name for itself in recent years with its Giuseppe food AI platform. The deal calls for NotCo AI to embed what it describes as its foundational AI platform directly into Barry Callebaut’s R&D pipeline. The announcement says the collaboration gives Barry Callebaut access to the same engine that has helped NotCo accelerate formulation cycles, solve complex ingredient challenges, and unlock unexpected flavor and functionality breakthroughs for global CPG brands.

For NotCo, the deal marks its most significant category-wide integration yet and reinforces what CEO Matias Muchnick said he and his cofounders believed from the very beginning. NotCo is not simply a maker of plant-based food. It is a next-generation R&D operating system for the food industry.

“This is exactly what we built NotCo for,” Muchnick said at SKS 2025 in July. “The value of our platform comes from a decade of high-fidelity data, from formulations and ingredient chemistry to sensory outputs and manufacturing parameters, all connected so we can solve multi-dimensional problems faster and with no human bias.”

The two companies plan to feed Barry Callebaut’s 100-year knowledge base and ingredient data into NotCo’s AI foundation model and build what they are calling the chocolate industry’s first end-to-end AI innovation hub. The goal is to iterate on new formulations, explore functional ingredients, and optimize for sustainability, cost, and Nutri-Score constraints.

At Future Food Tech in the spring, Muchnick gave a presentation that emphasized their push to become the go-to partner for AI transformation. The company’s Kraft Heinz partnership had already given them some street cred, so it is no surprise that they have seen strong interest from global food brands.

“Every big food company is having board-level conversations: do we have the technology to adapt to new consumers, shortages, and regulations? And consistently the answer is no,” Muchnick said at SKS in July. “That is why they’re coming to us. Everything changed in the last six months.”

In a sense, food AI specialists like NotCo are in a race against time as bigger general-purpose foundation models from OpenAI, Anthropic, and others become easier for different industries to customize with their proprietary data. Most CPGs do not yet have it in their DNA to build AI-forward development cycles, but that is likely to change in the next five years as boards demand transformation while they watch competitors accelerate product development and, in categories like chocolate, identify new alternatives in a market where environmental pressure, inflation, cost volatility, and other external factors force their hand.

“The companies that don’t adopt AI the right way will get the Blockbuster effect,” said Muchnick. “They’ll become obsolete. The future food companies will be AI companies.”

November 13, 2025

We Talked With Nectar About Their Plans to Build an AI for Better Tasting Alt Proteins

A few weeks ago, the philanthropic investment platform Food System Innovations announced that it had received a $2 million grant from the Bezos Earth Fund. FSI’s non-profit group NECTAR has been building a large dataset of consumers’ sensory responses to alt proteins, and the grant will help NECTAR to continue working on, in partnership with Stanford University, an AI model “that connects molecular structure, flavor, texture, and consumer preference.” The goal, according to NECTAR, is to create an open-source tool for CPGs and other food industry players to develop more flavorful—and hopefully better-selling—sustainable proteins.

I’d been following NECTAR for some time and have been closely tracking the impact of AI on food systems, so I thought it would be a good time to connect with NECTAR. I’d talked about the project briefly with Adam Yee, the chief food scientist who helped with the project, while I was in Japan, and this week I caught up with NECTAR managing director Caroline Cotto to get the full download on the project and where it’s all going.

Below is my interview with Caroline.

What are you building with this new Bezos Earth Fund grant?

“One of the things Nectar is doing is we just won a $2 million grant from the Bezos Earth Fund to take our sensory data and build a foundation model that will predict sensory. So we kind of bypass the need for doing these very expensive consumer panels, and then also predict market success from formulation. It’s intended to be sort of a food scientist’s best friend in terms of new product ideation.”

For people who don’t know Nectar, what’s the core mission, and how did this AI project start?

“Basically, Nectar is trying to amass the largest public data set on how sustainable protein products taste to omnivores. That’s what we have set out to do. We’re building that, and we are working heavily with academics to operationalize that data.

Over a year and a half ago, we started talking to the computer science folks at Stanford to say, like, what are things we could do with this novel data set that we’re creating? It happened to be around that time that the phase one Bezos Earth grant was opening up for their AI grand challenge. I connected Adam with the Stanford team, and they did some initial work on LLMs and found that it was able to do some of this support for food scientists. They published a paper together that came out in January for ICML, the largest machine learning conference, and we ended up winning that phase one grant, which then allowed us to apply for the phase two grant that we just found out about in October.”

From a technical standpoint, what kind of AI are you actually building?

“I am not an AI scientist myself here, so we are heavily partnered with Stanford and their computer science team, but it is an LLM base. We’re basically fine-tuning an LLM to be able to do this sensory prediction work, and it’s a multi-modal approach. There’s a similar project that’s been done out of Google DeepMind called Osmo for smell and olfactory, and we’re working with some of the folks that worked on that in order to model taste and sensory more broadly, and then connect that to sales outcomes.”

How does the Bezos Earth Fund AI Grand Challenge work in terms of phases and funding?

“It’s the Bezos Earth Fund AI Grand Challenge for Climate and Nature. It’s $30 million going to these projects. There were 15 phase two winners that each received $2 million and have to deliver over two years.

The phase one was a $50,000 grant to basically work on your idea and prepare a submission for phase two. We spent about six months preparing, trying to connect this Nectar data set with sales data and see which sensory attributes are most predictive of sales success, and also connecting the Nectar sensory data set to molecular-level ingredient data sets. Ideally the chain of prediction would be: can you predict sensory outcome from just putting in an ingredient list, and if so, what about sensory is predictive of sales success? We’re working on the different pieces of that predictive chain.”

What does your sensory testing process look like in practice?

“It’s all in-person blind taste testing. In our most recent study, we tested 122 plant-based meat alternatives across 14 categories. Each product was tried by a minimum of 100 consumers. They come to a restaurant where we’ve closed down the restaurant for the day, but we want to give them that more authentic experience. They try probably six products in a sitting, one at a time, and everything is blind, so they don’t know if they’re eating a plant-based product or an animal-based product and then they fill out a survey as they’re trying the product.”

How big is the data set now, and what’s coming next?

“We do an annual survey called the Taste of the Industry. For 2024, we tested about 45 plant-based meat products. For 2025, we tested 122 plant-based meat products. Outside of that, we have our emerging sector research, which are smaller reports. We’ve done two of those, and both have been on this category we’re calling balanced protein or hybrid products that combine meat. We’ve tested just under 50 products total in that category as well.

We’re testing blends of things like meat plus plant-based meat, meat plus mushrooms, meat plus mycoprotein, meat plus just savory vegetables in general. For 2026, our Taste of the Industry report is on dairy alternatives. We’re testing 100 dairy alternatives across 10 categories, and that will come out in March.”

When you overlap taste scores with sales data, what have you seen so far?

“The Nectar data set is mostly just focused on sensory. That’s the core of what we do. We are also interested in answering the question ‘do better-tasting products sell more?’ In our last report, we conducted an initial analysis overlapping sensory data with sales data and found that better-tasting categories capture a greater market share than worse-tasting categories. In certain categories, that seems to be agnostic of price. Even though the product might be more expensive, if it tastes better, it is capturing a greater market share.

We’re currently working with some data providers to get more granular on this sales data connection, because that analysis was from publicly available sales data. In this AI project, we are trying to connect sensory performance with sales more robustly to see which aspects of sensory are predictive of sales success. It’s hard because there are a ton of confounding variables; we have to figure out how to control for marketing spend, store placement, placement on shelf, that sort of thing. But we have access to the Nielsen consumer panel, this huge data set of grocery store transactions over many years, from households that have agreed to have all of their transactions tracked. We’re able to see what consumers are purchasing over time, and we’re trying to connect the sensory data set to that.”

You also mentioned bringing ingredient lists and molecular data into the model. How does that fit in?

“We’re trying to say, there are a lot of black boxes in food product development because flavors are a black box. We don’t have a lot of visibility into companies’ actual formulations. We’re trying to determine if we can extract publicly available information from the ingredient list and identify the molecular-level components of those ingredients, and then determine if any correlations can be drawn between them.

It’s all of these factors plus images of the products and trying to see if we can predict that.”

What do you actually hope to deliver at the end of the two-year grant?

“The idea is to deliver an open source tool for the industry to use. The goal would be that you can put in all the constraints you have for sustainability, cost, nutrition, and demographic need, and that it would help you get to an endpoint where you don’t have to do a bunch of bench-top trials and then expensive sensory.”

How do you think about open source, data privacy, and companies actually using this tool?

“Data privacy is a big thing in this space. We don’t have any interest in companies sharing their proprietary formulations with us. The goal is that they would be able to utilize this tool, download it to their personal servers, and put in their private information and use it to make better products. If we’re rapidly increasing the speed at which these products come to market and they are actually successful, that would be a success for us.”

There are other efforts in this space, from NotCo to IFT. Where does Nectar fit?

“I think everybody is trying to do similar things, but with slightly different inputs and different approaches. We are open to collaborating and learning from people. Our end goal is a mission-driven approach here, not to make a ton of money, so it depends on whether or not those partners are aligned with that goal.

IFT has trained its model on all of the IFT papers that have been published over the many years of its organization being around. We’re training our model on our proprietary dataset around sensory data, so there’s some nuance between things. They’re really focused on developing formulations, but there is a limitation to what you can do with that tool. It’ll tell you, ‘here’s how to make a plant-based bacon, add bacon flavoring,’ but there are 10 huge suppliers that provide bacon flavoring, and it doesn’t provide a ton of granularity on at what concentration and from what supplier.”

What’s the bigger climate mission you’re trying to advance with this work?

“Nectar’s specific directive is, how do we make these products favorable and delicious? We know that we need to reduce meat consumption in order to stay within the two degrees of climate warming, and we’re not going to get there by just telling people, ‘eat less steak.’ We have to use that whole lever and make the products really delicious so that people will be incentivized to buy them more and reduce consumption of factory-farmed meat.”

Answers have been lightly edited for grammar and clarity.

October 6, 2025

Are Big Food Companies Really Embracing AI?

While some companies like NotCo have positioned themselves as the OpenAI of the food world, the truth is that the AI transformation of the food industry is still in the first inning. That is partly because the food system itself, a mix of legacy CPG giants, agricultural suppliers, ingredient developers, and regulators, moves at a glacial pace.

In my recent conversation with Jasmin Hume, founder and CEO of Shiru, she confirmed that the industry is still in the early stages, in large part because food companies have massive amounts of data and strong confidence in their own research and development.

“Food companies have world-class R&D teams, and those scientists want to see proof before adopting a new tool. It’s a lot of tire-kicking in the first meetings,” said Hume.

This slow pace does not mean AI is not making inroads. It is simply happening beneath the surface. From discovery platforms like Shiru’s to optimization tools in manufacturing and retail analytics, AI is slowly reshaping how food gets made.

But the true question is not whether the food system will use AI, but who will own the models that make it useful. The answer is usually tied to who owns the data. Legacy food and ingredient companies have decades, even centuries, of proprietary chemical, biological, and sensory data. This makes them both powerful and hesitant to engage with AI companies that might use that data to build their own foundation models.

Big food brands “are not going to very quickly turn over that data,” said Hume. Many are debating whether to build their own in-house systems, using models fine-tuned on proprietary data that never leaves their servers. Others are beginning to explore partnerships with companies like NotCo and Shiru that specialize in the discovery layer.

That need for validation may be the biggest differentiator between food AI and other industries. As Hume put it, “You have to bring it into the lab and make sure that it actually works. Otherwise, the predictions are worthless.”

When Hume and I discussed whether large players like Microsoft or Google would eventually dominate vertical-specific foundation models, she acknowledged that possibility. However, she stressed that today’s large foundation models are not yet equipped to deal with the physical and regulatory realities of food. “There’s a ton of very specific know-how that goes into making those models usable for applications like protein discovery or formulation.”

For Shiru competitor NotCo, this highly specific data and domain knowledge are what the company is banking on to solidify its position as a key player in building a foundation model for food, a term now featured prominently on its website.

“I think what people need to understand is that AI is truly about the data sets and the value of the data sets that you have and the logic of the algorithms,” said Muchnick in an interview I had with him in July at Smart Kitchen Summit. “It’s really hard to get to where we were, and specifically also because we weren’t just an AI company. We are a CPG brand, and we learned a lot from being a CPG brand.”

In conversations with AI experts outside of food, several have said we are starting to see the big foundation models open up to allow companies to train them with vertical or highly specific domain knowledge. One pointed to Anthropic’s Model Context Protocol (MCP), which lets a foundation model connect to external data sets to process answers.

Another example is Thinking Machines’ newly announced fine-tuning API called Tinker, which could make it significantly easier for a food brand to train a model with domain-specific knowledge by removing the heavy infrastructure and engineering overhead typically required for custom AI development.

For Shiru, NotCo, and others developing food and ingredient-focused AI, there is still significant opportunity because the field is still so early.

“We’re just starting to see companies thinking about their own internal instances,” said Hume. “A lot of this is in progress, boardrooms are having these discussions right now.”

One of the biggest holdups for food brands is that data ownership and business-model alignment remain unsolved. Who owns the training data and the resulting outputs is a key question, and without clear answers, many companies will hold their data close, limiting the ability of shared platforms to reach critical mass.

For that reason, Hume believes partnerships and licensing models, not open data exchanges, will drive progress in the near term. Shiru’s model focuses on IP discovery and licensing, which allows the company to build intellectual property value without requiring massive manufacturing investments. “Our IP portfolio has doubled year over year since 2022,” said Hume. “Now the focus is on monetizing that through licensing and sales.”

The topic of food-specific foundation models and the adoption of AI by food brands is a fast-moving one, so you’ll want to make sure to listen to this episode to get caught up. You can listen to our entire conversation below or find it on Apple Podcasts, Spotify, or wherever you get your podcasts.

August 14, 2025

Sure, AI Might End Humanity, But First It Could Help Keep Your Food Fresher

If you ask Steve Statler, our current supply chains are essentially the equivalent of an old-school combustion engine (at best), and at worst something akin to a horse-drawn carriage.

“We’re running our supply chains with 19th-century visibility,” said Statler, the CEO and cofounder of AmbAI and host of the Mr Beacon Podcast. “The future is automating it completely, so we see everything everywhere all at once. We improve safety, we reduce waste, we increase shelf life.”

And while food tracking remains largely stuck in the past, relying on barcodes, manual scans, and occasional checkpoints from farm to fork, with blind spots that lead to waste, quality loss, and safety risks, Statler believes we are on the precipice of dramatic change.

Statler believes much of the change will come as a result of broad deployment of tiny, battery-free Bluetooth “stickers” and AI systems capable of reading, analyzing, and acting on their data in real time. “Basically, the cost of infrastructure to read these tags automatically is going down, down, down,” he said. “Over the next one to three years, these tags can harvest energy from the mobile devices surrounding us. And that’s the unlock.”

The size of a postage stamp, these ambient IoT tags continuously transmit information on temperature and location without human intervention. “We improve safety, we reduce waste, we increase shelf life,” Statler said, describing a not-too-distant future where every pallet, package, or even piece of produce is monitored end-to-end.

According to Statler, these types of tags could change the way we track food inventory in our fridge, with “dynamic expiry dates” that respond to actual conditions rather than rough estimates. “You talk to Alexa and you say, ‘when is this milk or this salmon or this shrimp going to expire?’ and it will know,” said Statler. “We will have looked at the temperature over time that the product has been exposed to, and we can come up with a 21st-century model of how long the product will last.”
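Statler’s “21st-century model” amounts to integrating temperature exposure over time instead of stamping a fixed date. As a back-of-the-envelope sketch (my illustration, not AmbAI’s actual model), the common Q10 rule of thumb assumes the spoilage rate roughly doubles for every 10 °C above the reference storage temperature:

```python
def remaining_shelf_life(readings, base_life_hours, ref_temp_c=4.0, q10=2.0):
    """Estimate remaining shelf life from a tag's (hours, temp_c) exposure log.

    Q10 rule of thumb: the spoilage rate multiplies by `q10` for every
    10 degrees C above the reference storage temperature.
    """
    consumed = 0.0
    for hours, temp_c in readings:
        rate = q10 ** ((temp_c - ref_temp_c) / 10.0)
        consumed += hours * rate  # warm intervals burn shelf life faster
    return max(base_life_hours - consumed, 0.0)

# Hypothetical tag log: 48 h refrigerated at 4 C, then 2 h left out at 24 C,
# for milk rated at 240 h (10 days) of life at the 4 C reference temperature.
log = [(48, 4.0), (2, 24.0)]
hours_left = remaining_shelf_life(log, base_life_hours=240)
```

Under these assumed numbers, the two hours at 24 °C consume shelf life four times as fast as refrigerated time, so the 240-hour rating shrinks by 56 hours, leaving 184. Real models would be calibrated per product, but the principle, actual exposure history rather than a worst-case printed date, is the same.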

Statler pointed out that Alexa and other home assistants are capable of this today with a small software upgrade. I noted that allowing Alexa to track freshness by accessing Bluetooth data emitted from devices and smart tags in your fridge would require consumer opt-in, especially given growing consumer concerns about privacy and access to their data.

“I think Amazon is very sensitive to that, and when they do this, and this is just me speculating, then they’ll do it with privacy in mind,” said Statler. “I believe that privacy, when done badly, can kill products.”

I also asked Statler whether these small beacons are connecting with other IoT systems, like Strella, that sense changing food chemistry to better predict and manage freshness. He said they’re starting to, but it’s early days; the primary focus of these beacon systems right now is temperature and identity. The tags are also part of a larger trend toward serialization, where every individual product has its own digital passport for authenticity, traceability, and freshness management.

Feeding this data into AI could shift the industry from traditional supply chains to responsive ‘demand chains.’

“We can do a better job of making better products that people use,” Statler said. “And we can start to go from supply chains to ‘demand chains’ that are informing the production and distribution to be much more efficient.”

It was an interesting conversation, one in which Statler was clearly excited about the potential for AI in our supply chains and in our lives, but also saw a potential danger lurking.

“I’m a little pessimistic about where AI is going. I sort of have this dual view of artificial intelligence, which is it’s amazing, and this is why I got into computing years and years ago. But at the same time, there’s a real chance it’s going to kill us all or enslave us. And I think we have to kind of live with that duality in our heads and do our best to try and make sure that this technology evolves in a positive direction.”

You can listen to my full conversation with Steve by clicking play below, on Apple Podcasts, Spotify, or wherever you get your podcasts.


August 4, 2025

From AI as Health Advisor to Leaving Shark Tank, Here Are 5 Takeaways From My Conversation With Mark Cuban

Last week, I sat down with Mark Cuban at the Smart Kitchen Summit to talk about how he sees AI changing innovation and medicine, his motivation for starting Cost Plus Drugs, and why he decided to step away from Shark Tank after this upcoming season.

Below are five takeaways from my interview with Cuban.

Cuban’s Frustration With the Healthcare System Led Him to Start CostPlusDrugs.com

Cuban’s motivation for starting Cost Plus Drugs was rooted in frustration with a complex and often predatory prescription drug system. “First off, at Cost Plus Drugs, we sell more than just generics,” he said. “We do have brands. We just don’t have all of them yet.”

But Cuban made it clear that generics are where the company has made the most significant impact. “We’ve cut prices down for chemotherapy drugs like Imatinib from $2,000 or more to $21 to $40,” Cuban said. “And so those guys, those big guys, they don’t like us.”

By “those guys,” he means pharmacy benefit managers (PBMs), the powerful intermediaries he says are actively limiting access to drugs. “PBMs basically control the entire pharmaceutical industry. And they see us as competition.” Cuban said the company’s pricing model is completely transparent: “We only mark it up 15%. If you prefer mail order, the cost is $5 for the pharmacist and $5 for shipping. Or we have local pharmacies, and you can do a pickup there.”
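The pricing model Cuban describes reduces to simple arithmetic. The sketch below assumes the 15% markup applies to Cost Plus Drugs’ acquisition cost and that the $5 pharmacist fee and $5 shipping apply to mail orders; the function name and exact fee mechanics are illustrative, not taken from the company.

```python
def cost_plus_cash_price(acquisition_cost, mail_order=True):
    """Back-of-envelope cash price under the model Cuban describes:
    acquisition cost plus a 15% markup, and for mail orders a $5
    pharmacist fee plus $5 shipping."""
    price = acquisition_cost * 1.15  # transparent 15% markup
    if mail_order:
        price += 5.0 + 5.0  # pharmacist labor + shipping
    return round(price, 2)

# A drug that costs the pharmacy $10 would ship for $21.50 in this model.
price = cost_plus_cash_price(10.0)
```

Against Cuban’s Imatinib example, the low end of that $21 to $40 range is consistent with a generic whose acquisition cost is in the single digits.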

Cuban says his target customer is anyone stuck in the cracks of the healthcare system. “If you have a high deductible plan, you don’t have insurance, there’s a good chance that we carry your medication, and there’s an even better chance that you can pay cash through us and it’ll be cheaper than your deductible and out of pocket.”

Cuban Sees GLP-1 Pricing Becoming More Accessible

I asked Cuban where he sees pricing going for GLP-1 drugs like Ozempic and Wegovy. He recognizes their importance (and the consumer demand) and feels they will become more accessible – including via his site – over time.

“As it applies to GLP-1 drugs, there’s a drug that costs, that we carry that costs $50. It’s a brand drug. And it costs $50 a month instead of $400 or $1,300 a month,” he said. “I think those will come down in price because of the competition, and I think you’ll see new forms of GLP-1s and pills come out as well, which will also put the pricing down. And we’ll carry everything we can.”

He Sees AI as an Increasingly Important Healthcare Tool

Throughout our conversation, Cuban repeatedly came back to the disruptive potential of AI, suggesting it represents a bigger change, in tech and beyond, than anything he has seen in his career. That includes healthcare.

Cuban’s belief in AI’s potential as a health support tool isn’t theoretical – he already uses it himself.

“I do it all the time, right?” he said. “I have to take this thing called Synthroid for hypothyroidism, and I also need more iron after I got my blood tested. I had no idea that taking them both at the same time didn’t work. My doctor didn’t even realize that.” Cuban said he turned to ChatGPT, asking if he could take them both at the same time. “It said, ‘hell no, do not take them at the same time.’ It said you have to have three hours between them. And so now my TSH went down to right where it’d be perfect numbers. And my iron levels are going up as well.”

Cuban also said he’s still skeptical of ChatGPT’s responses, so he’ll check responses against a site designed for doctors called Open Evidence. “It’s my way of checking ChatGPT’s work.”

Shark Tank Will Remain the World’s “Greatest Commercial” Even After Cuban Leaves

After 15 seasons and hundreds of deals, Cuban announced he’s stepping away from Shark Tank. It wasn’t because he’s starting a new business or running for president. He just wanted to spend more time with his family.

“I did it just because of family time,” he said. “Because right about now, I might be shooting Shark Tank, right? And this is the time to spend with my family.”

Cuban still believes in the show’s power to help entrepreneurs: “On Shark Tank, you can have somebody from Idaho, from New York, from wherever, somebody who’s 18 years old, somebody who’s 80 years old, standing on that carpet, telling millions of people about their product.”

It’s the “world’s greatest commercial,” said Cuban.

The Importance of Becoming AI Literate

For Cuban, becoming AI literate is essential. “Learn everything you can about AI because it changes everything,” he said. He said that regardless of whether it’s starting a business, working a trade, or building a career in any field, understanding how to use AI will be required. “There’s going to be two types of companies,” he said, “those who are great at AI and everybody else.”

“There is no job that won’t be touched by artificial intelligence,” he said. “Whether it’s an optimization, in some cases replacement, some cases creating new jobs because you know how to use AI—it goes in all directions.”

Cuban may have stepped away from Shark Tank and sold the controlling interest in the Mavericks, but he definitely hasn’t slowed down. After 30 years in tech, helping to build the world of streaming and becoming one of the world’s most famous tech entrepreneurs, he’s excited about learning and adapting to the future.

You can watch our full interview below, on YouTube, or listen to it on Apple Podcasts, Spotify or wherever you get your podcasts.

Mark Cuban Talks Leaving Shark Tank and AI

July 3, 2025

Is IFT’s Launch of an AI Tool for Food Scientists an Indicator of Where Trade Associations Are Going in the Age of AI?

Interesting news out of IFT First this week, the food science expo in Chicago, where the longtime trade association announced its own AI tool called CoDeveloper.

According to the announcement, CoDeveloper is a platform built for food scientists by food scientists, offering a suite of AI-powered tools to help them formulate new products, reverse-engineer existing ones, and tap into decades of peer-reviewed food science research. Branded as a “co-scientist” named Sous, the platform is designed to live alongside R&D teams and support early-stage development work.

It’s an interesting move for the group, and as far as I can tell, the first time a trade association in the food space (or possibly any industry) has launched its own AI tool to help practitioners do their jobs. It also raises a larger question: could this be a sign of where trade associations are headed as AI becomes more integral to how we work?

It would make sense. Trade associations have historically provided value through education, convening, standards development, and general promotion. In a future where most industries are driven in large part by AI, why wouldn’t these associations, especially science-focused ones like IFT, want to get in on the action?

Of course, there has been no shortage of efforts across the food industry to develop food AI models, whether that’s startups looking to sell their AI as a SaaS platform or big food brands creating AI tools to differentiate themselves. Whether an available-to-everyone AI food product development tool is something hyper-competitive CPG companies would be interested in is yet to be seen, but I am sure that it will be something most members of the IFT community will want to take for a spin around the block.

June 30, 2025

Study: AI-Powered Drones Fuel Advances in Precision Ag for Early Detection of Crop Stress

Early stress detection via precision agriculture just got a serious upgrade, according to new research out of the Hebrew University of Jerusalem. Led by Dr. Ittai Herrmann, the team developed a drone-based platform that blends hyperspectral, thermal, and RGB imaging with powerful deep learning technology to precisely identify nitrogen and water deficiencies in sesame crops.

Sesame, known for its resilience to climate variations, is rapidly growing in global importance. However, accurately identifying early-stage crop stress has historically posed a significant challenge, limiting farmers’ ability to respond before problems become catastrophic. To tackle this, the researchers combined three advanced imaging technologies into a single drone system, creating a robust solution capable of decoding complex plant stress signals.

Hyperspectral imaging provides detailed spectral insights into plant chemistry, including nitrogen and chlorophyll levels, which are critical markers for plant nutrition. Thermal imaging spots subtle temperature changes in leaves that indicate water stress, while high-resolution RGB images provide clear visual context of overall plant health and structure.

What made this study cutting-edge was its use of multimodal convolutional neural networks (CNNs), an advanced AI approach that can unravel intricate data patterns and add context, which significantly enhances diagnostic precision. These advanced techniques unlocked the researchers’ ability to distinguish overlapping signals of plant stress, such as differentiating between nutrient and water deficiency, something that conventional methods often struggle to achieve. According to the researchers, by accurately pinpointing the exact stressor, farmers can now apply resources such as fertilizer and irrigation more strategically, reducing waste and environmental impact while increasing crop yields.
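The core idea of multimodal fusion, combining features from several sensor types before classifying the stressor, can be shown in miniature. This toy sketch is not the researchers’ model (their work uses deep CNNs over full images); it simply concatenates three per-modality feature vectors and applies a single softmax classifier with random weights, so the feature sizes, class names, and weights are all invented for illustration.

```python
import math
import random

random.seed(0)

# Stand-ins for pooled per-modality features a CNN backbone might produce.
hyperspectral = [random.gauss(0, 1) for _ in range(64)]  # spectral-band summary
thermal = [random.gauss(0, 1) for _ in range(16)]        # canopy-temperature summary
rgb = [random.gauss(0, 1) for _ in range(32)]            # visual-appearance summary

# Late fusion: one combined 112-dimensional feature vector.
fused = hyperspectral + thermal + rgb

# A single linear layer + softmax over the candidate stressors.
classes = ["healthy", "nitrogen_deficient", "water_stressed"]
weights = [[random.gauss(0, 1) for _ in fused] for _ in classes]
logits = [sum(w * x for w, x in zip(row, fused)) for row in weights]
m = max(logits)
exps = [math.exp(l - m) for l in logits]
probs = [e / sum(exps) for e in exps]  # probabilities sum to 1
```

The point of fusing modalities is visible even here: a classifier over the concatenated vector can weigh thermal evidence (water stress) against spectral evidence (nitrogen status) jointly, which is what lets the full model separate the two overlapping stress signals.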

While other researchers have studied using advanced AI techniques with drones to combat stress in walnut and specialty crops, the use of deep multimodal CNNs appears to be a leap forward in precision ag. It remains to be seen how quickly this technology reaches the farm level, but given the challenges of climate change, it’s easy to envision these types of advances in precision agriculture becoming invaluable tools for protecting against climate-related crop stress.



June 12, 2025

Starbucks Unveils Green Dot Assist, a Generative AI Virtual Assistant for Coffee Shop Employees

While most companies across the food value chain are embracing AI in some form, one major player that’s been notably quiet is Starbucks.

From mobile ordering to Web3 experiments, and computer vision-powered bioauthentication to automated drink-making, the Seattle-based coffee giant has never shied away from tooting its own horn about tech-forward initiatives. But when it came to generative AI, the most hyped tech trend of the past few years, Starbucks had kept relatively quiet, leaving many to wonder what it was working on and when it might reveal its plans.

That wait is over. This week, at a 14,000-employee conference in Las Vegas, the company unveiled Green Dot Assist, a generative AI-powered assistant designed to help baristas and store managers streamline their operations.

So, what is Green Dot Assist? In short, it’s a Microsoft Azure-powered virtual assistant currently being piloted in 35 Starbucks locations. The app assists with a range of tasks, from training new employees on how to prepare specific beverages to supporting shift managers with dynamic scheduling in response to real-time changes, such as last-minute call-outs.

Green Dot Assist even troubleshoots hardware issues. In a demo video shared by Starbucks, a barista named Dave uses the assistant to diagnose an espresso machine that’s pulling inconsistent shots. The AI provides 3D visual guides and prompts Dave to submit a service ticket—an experience that blends visual diagnostics with conversational support.

Packaged in an iPad app (apparently, Microsoft couldn’t convince the coffee chain to use Surface devices), Green Dot Assist combines training, support, and efficiency tools, all powered by Azure’s generative AI capabilities.

Given Starbucks’ longstanding emphasis on employee training, an AI-powered employee training guide and assistant makes sense. But my guess is this is just the beginning. In the longer term, I expect Starbucks to leverage AI to further enhance operational efficiency, particularly given the significant shift in order mix towards mobile ordering, which has led to increased wait times and customer frustration. This next wave will likely include more advanced automation, as we’ve already started to see with the chain’s push to roll out its Clover Vertica machine nationwide this year and – possibly – a new point-of-sale system announced this week at the company’s employee conference.

May 13, 2025

A Week in Rome: Conclaves, Coffee, and Reflections on the Ethics of AI in Our Food System

Last week, I was in Rome at the Vatican for a workshop on the ethical and social implications of artificial intelligence and automation in our food system.

The workshop was part of an ongoing three-year NIH-funded project focused on the ethics of AI in food. It took place at the Vatican’s Pontifical Academy for Life, the same institution that played a pivotal role in 2020 in getting Microsoft, IBM, and others to sign the Rome Call for AI Ethics, a cross-sector commitment to develop AI that “serves every person and humanity as a whole; that respects the dignity of the human person.”

I was invited to provide an overview of AI in the food system to help set the stage for the day’s conversations, which featured Michelin-starred chefs, Catholic priests, journalists, authors, and professors specializing in ethics, artificial intelligence, and more. I walked through some of the developments I’ve seen across the food system—in agriculture, next-gen food product development, restaurants, and the home. As I wrote recently for The Spoon, today “every major food brand has made significant investments — in people, platforms, products — as part of the AI-powered transformation.”

I posed questions like: What happens when AI dictates what we eat? Or if it engineers the “perfect sandwich”—something so addictive it floods demand and strains supply chains, as Mike Lee has imagined? What does it mean when a company builds a proprietary food AI trained on global culinary data? Does that dataset become the intellectual property of one corporation? And if AI can tailor nutrition down to the molecule, who controls those insights?

These are not just technical questions. They’re questions with deep implications for humanity.

One thing was clear throughout the day: everyone in the room recognized both the promise of AI as a tool for addressing complex challenges in the food system, and the risks posed by such a powerful, society-shaping technology. Among the questions raised: How do we balance the cultural and inherently human-centered significance of food—growing it, preparing it, sharing it at the family dinner table—with the use of AI and automation across kitchens, farms, and wellness platforms?

Above: The signed Rome Call for AI Ethics

As some attendees expressed, there’s a growing concern that the “soul” of food—its role in connection, tradition, and creativity—could be lost in a world where AI plays a central role.

For obvious reasons, being at the Vatican and in Rome at this time was a bit surreal: the two days of the workshop fell during the same week that the College of Cardinals gathered to select the next pope after last month’s passing of Pope Francis.

As we wrapped up our discussions, the Conclave began. And just as I was leaving Rome, white smoke rose from the chimney of the Sistine Chapel, signaling that a new pope had been chosen.

In his first address, Pope Leo XIV made it clear that he is thinking deeply about AI’s role in society, so much so that he chose his name in homage to a previous pope who guided the Church through an earlier technological upheaval.

“… I chose to take the name Leo XIV. There are different reasons for this, but mainly because Pope Leo XIII, in his historic encyclical Rerum Novarum, addressed the social question in the context of the first great industrial revolution. In our own day, the Church offers to everyone the treasury of her social teaching in response to another industrial revolution and to developments in the field of artificial intelligence that pose new challenges for the defence of human dignity, justice and labour.”

Also present at the workshop was our friend Sara Roversi, founder of the Future Food Institute. The Spoon and Future Food Institute co-founded the Food AI Co-Lab, a monthly virtual forum where experts across disciplines explore the intersection of food and AI.

Sara, Tiffany McClurg from The Spoon, and I grabbed coffee at a small café in Rome to reflect on the workshop and what it means for our ongoing work. We launched the Food AI Co-Lab in early 2024 as a space to gather our communities and talk through how AI is impacting the food system. So far, much of the conversation has focused on education—helping people understand what AI is and how to thoughtfully implement it in their organizations.

But we all agreed: the world has changed rapidly since we began. Nearly everyone is now seriously considering how to integrate AI into their companies, institutions, or personal lives. And so, the Co-Lab needs to evolve too. Our hour-long sessions, often featuring guest speakers, have been great for tracking innovation, but now it’s time to elevate the conversation. Ethics. Labor. Equity. Sustainability. These aren’t side topics—they’re central to how AI will shape the future of food.

If the world feels more chaotic than ever, one thing is certain: we need to prepare for faster, more unpredictable change. At the first workshop two years ago, most attendees were just learning about AI. There was plenty of fear about a runaway system invading the food chain.

Today, there’s greater recognition that AI is inevitable and that it can be a powerful tool for solving some of the food system’s most complex problems. There was even a bit more optimism this time.

But above all, there’s a clear understanding that we still have a long road ahead to strike the right balance: embracing AI as a tool while preserving what makes food so deeply human, so critical to our culture, communities, and shared existence.

You can learn more about the Food AI Ethics project led by Cal Poly at San Luis Obispo [here]. If you’d like to join us for future Food AI Co-Lab events, you can sign up via our LinkedIn Group or The Spoon Slack. We’ll keep you updated on upcoming events and speakers.

April 28, 2025

From Starday to Shiru to Givaudan, AI Is Now Tablestakes Across the Food Value Chain

Back in the early days of the cloud computing revolution, my former employer, GigaOM, hosted what was perhaps the biggest and most influential conference on the topic, called STRUCTURE.

One of the phrases that has stuck with me from those days is “data is the new oil,” which I heard declared from the STRUCTURE stage more than a handful of times. At the time, big data technologies were leveraging machine-learning-driven analytics tools to create new correlations and insights from disparate datasets faster than ever before. Those who controlled the data — and could mine it effectively — wielded enormous power.

Now, nearly two decades into the cloud era and three years after the AI “big bang” sparked by the launch of ChatGPT, those early days seem almost quaint by comparison. New AI-powered tools and companies are emerging every day. While much of the “data is the new oil” rhetoric back then felt like spin, today we’re seeing real, transformative progress, especially in new product development.

Food is no exception.

Take the news from Shiru this past week. The company, which uses AI to sort through plant-based food building blocks, announced that it had scaled its first AI-discovered products: OleoPro and uPro. These new approaches to identifying proteins — particularly oleogel structurants (structured fat systems) — are designed to support large-scale production.

As Shiru CEO Jasmin Hume put it:

“This moment is a turning point not just for Shiru, but for the food industry. Even though oleogels have been explored for years (there are over 500 publications on them in the last decade), commercially scaled examples have been elusive — until now. Our AI platform helped us identify the right proteins, but that was only part of the story. Our team then engineered a scalable and entirely new process for producing those proteins with the precise performance attributes required to succeed in real-world formulations.”

But it’s not just next-generation ingredient discovery. New CPG brands are also using AI to decipher early consumer signals and connect the dots before anyone else can launch the next big product. One example is Starday, a startup that recently raised $11 million. Starday uses AI to sift through millions of data points from social media feeds, surveys, point-of-sale data, and more to identify emerging opportunities in food that could lead to future hits.

“Imagine if you had 10,000 consumer insights folks that are watching every video on the internet, typing up what’s being said, tagging it, and then kind of building these regression models around how these trends are happening,” said Starday CEO Chaz Flexman in a recent interview with The Spoon. “We’re trying to do that on steroids. We take in about 10 million pieces of content every week, which is very significant.”

In the early big data heyday, companies could look at things like trending tweet mentions. Today, companies like Starday are able to dive into video content, extract context much faster, and build predictive intelligence to guide new product development.

Shiru and Starday are just two examples making headlines recently about how AI is reshaping the food industry. Others are innovating across different parts of the food value chain — from manufacturing optimization (Keychain) to intelligent automation (Chef Robotics), all the way back to the farm with companies like Agtonomy.

Even century-old flavor companies are getting into the act. This past week, Givaudan announced Myromi, a handheld digital aroma delivery device that leverages an AI platform called ATOM.

In short, AI is enabling both startups and established players to move much faster.

And they’re going to have to. In the current MAHA moment in the US, companies are urgently reevaluating ingredient lists and being forced to replace ingredients like food dyes and sugars. This new urgency is adding to what many had already been doing as they see climate change slowly but surely impacting how and what they can source for their products.

Back in 2010, there was a lot of talk about using big data to create better products, but no one was seriously using AI to build food products at that point (heck, Watson hadn’t even become a chef yet). Today, every major food brand has made significant investments — in people, platforms, products — as part of the AI-powered transformation.

In other words, if data is the new oil, it’s now clear that AI is the engine of innovation that is accelerating and driving change across every part of the food system.
