The Spoon

Daily news and analysis about the food tech revolution


AI

September 17, 2024

I’ve Seen The Future of Food Logging Apps, And It’s GPT Food Cam

If you’ve ever tried to log your food intake with an app, you probably have realized the following:

  1. Manual food logging with an app is a pain.
  2. Food logging apps are often inaccurate because they require users to estimate portion size, ingredients, etc.
  3. Food logging apps have cumbersome onboarding processes, ask for a lot of personal info, and usually try to upsell users into premium subscriptions.

For these reasons, I have not used a food-logging app for more than a few weeks, but that may soon change with GPT Food Cam. What is GPT Food Cam? In a few words, it’s a free food-logging app that lets you snap each meal or snack with your smartphone camera and uses AI to estimate calories. The app, which can be downloaded from the iOS App Store, doesn’t ask you to take a survey or require a subscription, and, from what I can tell within a day of use, it is really pretty darn good.

That’s my description, but what does Raj Singh, the longtime entrepreneur who is the visionary behind the app, have to say about it? According to Singh, who posted recently about the app on LinkedIn, GPT Food Cam is different from other food logging apps in three primary ways:

Instant Camera Access: The app opens straight to the camera, allowing users to quickly capture their meals without navigating menus. “I wanted it to be fast and low friction,” Singh said. “In social settings, it’s less intrusive to quickly snap a photo.”

Calorie Ranges Instead of Exact Figures: Singh said that because AI has its own limitations and portion sizes vary, the app provides a calorie range. “By presenting a range, it’s mostly right,” Singh said. “The goal is to build the habit of food logging and become a more mindful eater.”

Free and Unobtrusive: Unlike many apps that require subscriptions or bombard users with ads, GPT Food Cam is entirely free and supported by occasional, non-intrusive advertisements. “Right now, ads are making four times the revenue of the AI costs,” Singh said in a phone interview with The Spoon. “This allows us to keep the app free and potentially expand its features and availability to more countries.”

After working with a food coach who encouraged him to send photos of his meals for feedback, Singh sought a convenient digital solution to continue the practice. However, he found existing apps lacking—either too complex, costly, or both.

“They were designed for the 5% who need precision, but I wanted something simple, free, and for the other 95%,” Singh said.

According to Singh, GPT Food Cam leverages Gemini Flash, a fast and cost-effective AI model, to analyze images and estimate calorie content. Users simply snap a photo of their meal, and the app processes the image to provide an approximate calorie range.

“A lot of this is prompt engineering,” Singh explained. “We use ‘chain-of-thought’ prompting, where we break down the AI’s task into specific steps. The prompt instructs the AI to look at what’s in the picture, consider each ingredient independently, estimate serving sizes based on context—like whether it’s in a bowl or on a plate—and then estimate the calories of each item before adding them up.”

Singh emphasized that while AI isn’t perfect—with about 95% accuracy—it’s sufficient for promoting mindful eating. “AI has consistently been 95% accurate,” he said. “It’s great for recommendations and suggestions, but when it comes to critical workflows, it might get things wrong 5% of the time. For food logging, where precision isn’t as critical, this level of accuracy is acceptable.”

The creation of GPT Food Cam came after a serendipitous conversation with a friend. Singh’s friend, Zvika Ashkenazi, mentioned that his son, Ben Ashkenazi, was seeking an unpaid summer internship and wondered if Singh could mentor him. Singh soon began working with Ben, and six weeks later, GPT Food Cam was born.

“Ben is graduating from ASU in Computer Science in December,” Singh said. “He taught himself React, iOS development, and more this summer with minimal help from my network. He built this end-to-end.”

While GPT Food Cam emerged just in the last couple of months after Singh’s epiphany and Ben Ashkenazi’s coding work, Singh has been toying with the idea of a low-friction app to track food intake for a decade. In 2009, he tried to develop a similar application but soon realized the technology wasn’t mature enough.

“In 2009, I tried to create this exact app,” said Singh, who is currently the head of product for Mozilla’s Solo after the browser company acquired his startup Pulse in 2022. “It wasn’t good enough, and so we pivoted into a recipe company, which became Allthecooks.” Allthecooks would go on to become the number one recipe community on Android in 2010, with 30 million users, and would later be acquired by Cookpad.

Unlike then, “the tech is now here, making GPT Food Cam a reality,” Singh said. “Advancements in AI and image recognition have finally caught up with the vision I had over a decade ago.”

With the technology to make friction-free food logging a reality, Singh told me he wants to disrupt the food logging industry by offering a free, low-friction app, but he thinks it can do so with little involvement from him going forward.

“I build some things for fun. At the onset of a new project, I’m like, ‘This is not gonna make money, but the world needs it,’ or, ‘This is gonna be my next business, and I’m leaving where I’m at.'”

Singh made it clear he is happy at Mozilla and, in fact, used the product he conceived of building for Mozilla (Solo, an AI website builder) to create the website for GPT Food Cam. From here, he will let Ashkenazi run with the product, even if he periodically suggests some ideas to make it successful.

“I think it can be very, very disruptive. People are paying $10 a month for apps they don’t need to. This app can encourage better habits without the cost and complexity.”

Singh said he is also considering expanding the app’s capabilities and reach. With the ad revenue already exceeding the AI costs by a four-to-one margin, there’s potential to increase daily usage limits (currently, users are limited to six snaps a day) and make the app available in more countries.

Selfishly, I hope he and Ashkenazi succeed because, from what I’ve seen so far, I think the app is, in fact, potentially disruptive, and I hope to keep using it. Who knows, maybe Ashkenazi (with a little help from Singh) can put the app on a journey similar to that of Marco Arment’s Overcast, which began as a passion project born of Arment’s annoyance with the state of podcast apps and went on to become the most user-friendly podcast app in the world (and the most popular outside of Apple’s own).

September 9, 2024

IFA Smart Kitchen Roundup: Appliance Brands Try to Tap Into AI Zeitgeist With AI-Powered Food Recognition

This weekend at IFA, several big appliance brands used the show to tell the world that they are all in on AI, mainly through the integration of cameras into their ovens paired with software to enable personalized recipes and customized shopping lists.

Siemens showed off the iQ 700 oven, which has a built-in camera that recognizes over 80 different dishes and automatically adjusts to the ideal cooking settings. This feature lets users place food, like a frozen pizza, in the oven and hit start for optimized cooking. The updated model offers more food recognition capabilities than previous versions and includes an optional steam function to achieve a crispy crust on baked goods.

Hisense debuted the Hi9 Series Oven, equipped with AI-powered InCamera technology for intelligent baking with over 140 pre-programmed recipes. The company also introduced a smart fridge, the Hisense Refrigerator PureFlat Smart Series, with a description that sounds like it’s been taking cues from Samsung and the Family Hub. The company described the fridge as “a home appliance control center” that “allows you to adjust temperature settings remotely through the ConnectLife app.” The fridge also has AI-powered inventory tracking, though the company was light on details about how the feature works.

Beko also let everyone know that it is trying to jam AI into as many things as possible, including its ovens. Like Hisense and Siemens, Beko pointed to camera-assisted cooking. From the release: “Beko brings AI-assisted camera technology to its Smart Home ovens, delivering a self-improving cooking experience for optimal results in the kitchen whatever the dish. With food recognition and cooking suggestions across more than 30 different food types, the new Beko Autonomous Cooking technology uses AI to finish cooking according to personalized browning levels.”

Ovens with cameras and food recognition aren’t exactly new, as we’ve been seeing this feature for the better part of a decade since June (RIP) debuted the technology. The appliance industry often displays a herd mentality, and clearly, the herd feels they’ve got to show off their AI chops, even if the technology is somewhat pedestrian at this point.

Electrolux Debuts Taste Assist AI on AEG Line

Not every new AI-feature introduction at IFA was tied to integrated cameras and image recognition. Electrolux introduced its AI Taste Assist feature on its AEG line of kitchen appliances. According to the announcement, AI Taste Assist will take recipes from the Internet, import them, and send cooking instructions to the oven, but not before it recommends ways to enhance and optimize the cook. In a demo on-stage by Electrolux at IFA, the company emphasized how the new feature was meant to overcome what they called the “cooking gap”, which they described as the limitations of existing recipes and the enhanced capabilities of modern cooking equipment. The feature that Electrolux primarily promoted to bridge this gap was steam cooking, a feature that was injected into a lasagna recipe in an on-stage demo of the Taste Assist feature by Christopher Duncan, Electrolux’s SVP of Taste for Europe.

One notable absence at Electrolux’s IFA news conference was GRO, the next-generation modular kitchen concept the company announced in June of 2022. All indications are that the Swedish appliance brand has not made any progress in commercializing GRO, probably due in part to the company’s struggles over the past couple of years. The company laid off approximately three thousand employees last year, and earlier this year it saw the departure of its longtime CEO, Jonas Samuelson, as it continued to struggle post-pandemic and in the face of increased competition from Asian appliance brands.

SideChef Unveils AI Feature in App That Creates Step-by-Step Recipes From Photos of Food

SideChef recently introduced RecipeGen AI, a new beta feature that generates step-by-step recipes from a photo of any dish. Users can upload pictures of meals from restaurants or social media, and the app will provide a shoppable recipe based on the image.

From the release: “We are living in exciting times, where every inspiration can become a person’s reality,” says SideChef Founder & CEO Kevin Yu. “At SideChef we’re excited to be the first to use AI to allow any home cook to make their food inspiration a reality for themselves and loved ones, with a single photo!”

CNET writer Amanda Smith gave the app a test drive and came away with mixed feelings. While the app successfully identified many ingredients, it missed key components in some cases, such as sourdough focaccia and strawberry butter. It also occasionally added ingredients that weren’t in the dish, like bell peppers, leaving Smith feeling the accuracy was somewhat hit or miss.

Smith’s takeaway: Success “depends on the recipe. It has a hard time with nuance and, like other AI tools, tends to make it up if it’s unsure. It’s a handy little app that could be used to inspire new ideas and ingredient concoctions or if you’re in a restaurant and don’t want to bother the waiter with dish details.”

Samsung Food Also Debuts AI-Powered Shopping Lists From Photos

SideChef isn’t the only smart kitchen company debuting photo-to-recipes/shopping lists powered by AI in their apps. At IFA last week, Samsung announced new AI-powered meal planning and food management features. The Vision AI feature now allows users to add ingredients to their Food List by simply taking a photo with their smartphone, expanding beyond the previous limitations of Samsung’s Family Hub smart fridge. This list can be used to suggest recipes, prioritize items nearing expiration, and automatically update after meals are cooked or ingredients are purchased.

Additionally, the company announced a new premium tier called Samsung Food+, a $7/month subscription service that offers personalized weekly meal plans tailored to users’ nutritional goals and dietary preferences, and tracks macronutrients and caloric intake. The premium tier also integrates more advanced AI functionality, allowing users to customize recipes and receive a full week of meal recommendations, helping reduce food waste and simplify grocery shopping by making the app a central hub for food management and meal preparation.

September 4, 2024

From Data-Scraping to Discernment Layer: How NotCo’s Giuseppe AI Has Evolved Over the Past Decade

Almost a decade ago, while others experimenting with AI focused on algorithms for trading, diagnostics, or digital advertising, a company called NotCo was experimenting with AI by the name of Giuseppe to create plant-based foods that could match the taste and texture of their animal-based counterparts.

According to Aadit Patel, SVP of AI Product and Engineering at NotCo, the company’s founders (Patel would join a couple of years after the company was founded in 2015) realized early on that, in order to build an AI model that could help create plant-based products mimicking the taste, texture, and functionality of their animal-based counterparts, they would need a whole lot of data.

The problem was, as a startup, they didn’t have any.

When I asked Patel in a recent interview how the company overcame the infamous “cold start” problem—the challenge many embryonic AI models face before they have built large datasets on which to train—he told me they found the solution in a very public place: the U.S. government’s website.

“In the early days, when we had no money, we literally scraped the USDA website,” said Patel. “If you go to the USDA website, there’s a bunch of free data materials for you to use. And I guess no one had actually joined it together to create a comprehensive dataset… So the first versions of Giuseppe were built on that.”

This cobbled-together dataset formed the foundation for Giuseppe’s recommendations, leading to the creation of products like NotMilk, which uses unexpected combinations like pineapple and cabbage to replicate the taste of dairy milk.

As NotCo grew, so did Giuseppe’s capabilities. New analytical labs in San Francisco and Santiago, Chile, gave the company a wealth of new data on which to train its AI. Over time, the model’s ability to create innovative food products also improved.

One of the biggest hurdles in food development is the fragmented nature of the supply chain. Data is scattered across various entities—ingredient suppliers, flavor houses, manufacturers, and research institutions—each holding critical information that contributes to the success of a product. Over time, the company realized that to create an AI capable of building innovative products, it couldn’t rely solely on NotCo’s datasets. Instead, Giuseppe would need to integrate and analyze data from across this complex web of partners.

“What we’ve done with Giuseppe is figure out a way to incentivize this very fragmented ecosystem,” Patel said.

According to Patel, pulling together these disparate datasets from across the product development and supply chain would result in a more holistic understanding of what is needed for a successful product that is better aligned with market realities.

“We realized that if we just made an AI system that’s specific to CPG, we’d be losing out,” said Patel.

Generative AI and Flavor and Fragrance Development

One recent expansion of Giuseppe’s capabilities has been the exploration of new flavors and fragrances using generative AI. While GenAI models like ChatGPT have become infamous for creating sometimes strange and off-putting combinations when designing recipes and new food product formulations, Patel explained that the company has been able to overcome issues with general LLMs by creating what he calls a discernment layer. This layer filters and evaluates the multitude of generated possibilities, narrowing them down to the most promising candidates.

“Discernment is key because it’s not just about generating ideas; it’s about identifying the ones that are likely to succeed in the real world,” Patel said. “With generative AI, you can prompt it however you want and get an infinite amount of answers. The question is, how do we discern which of these 10,000 ideas are the ones most likely to work in a lab setting, a pilot setting, or beyond?”

The discernment layer works by incorporating additional data points and contextual knowledge into the model. For instance, it might consider a formulation’s scalability, cost-effectiveness, or alignment with consumer preferences. This layer also allows human experts to provide feedback and fine-tune the AI’s outputs, creating a process that combines AI’s creativity with the expertise of flavor and fragrance professionals.
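
In rough outline, a discernment layer of this kind can be thought of as a weighted filter over generated candidates. The criteria names and weights below are illustrative assumptions, not NotCo's actual scoring model:

```python
# A hedged sketch of a "discernment layer": score generated candidate
# formulations against practical criteria and keep only the top few.
# Criteria and weights are illustrative assumptions.

WEIGHTS = {"scalability": 0.4, "cost_effectiveness": 0.3, "consumer_fit": 0.3}

def discern(candidates, top_k=3):
    """candidates: list of (name, {criterion: score in [0, 1]}) pairs.
    Returns the top_k candidate names by weighted score, best first."""
    def weighted(scores):
        return sum(WEIGHTS[c] * scores.get(c, 0.0) for c in WEIGHTS)
    ranked = sorted(candidates, key=lambda pair: weighted(pair[1]), reverse=True)
    return [name for name, _ in ranked[:top_k]]
```

In practice the scores themselves would come from data (cost models, consumer panels) and from the human-expert feedback loop Patel describes, rather than being hand-assigned.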

Early tests have shown positive results. When tasked with creating a new flavor, both the AI and the human perfumers receive the same brief. When the results are compared in A/B tests, Patel says the outputs of Giuseppe’s generative AI were indistinguishable from those created by human experts.

“What we’ve built is a system where AI and human expertise complement each other,” said Patel. “This gives us the flexibility to create products that are not just theoretically possible but also market-ready.”

CPG Brands Still Have a Long Way to Go With AI-Enhanced Food Creation

Nearly a decade after building an AI model with data scraped from the USDA website, NotCo has evolved its AI to create new products through a collaborative approach, resulting in a modern generative AI model that incorporates inputs from partners up and down the food value chain. That approach is now used both for internal product development and with third-party CPG partners, many of whom Patel said approached the company after it announced its joint venture with Kraft Heinz.

“Ever since our announcement with Kraft Heinz and signing a joint venture, there’s been a lot of inbound interest from a lot of other large CPGs asking ‘What can you do for us?’ and ‘What is Giuseppe?’ They want to see it.”

When I told Patel I thought that big CPG brands have come a long way over the past twelve months in their embrace and planning for using AI, he slightly disagreed. He said that while there’s a lot of interest, most big brands haven’t actually transformed their business to fully create products with the help of AI.

“I would say there’s strong intent to adopt it, but I think there hasn’t been put forth like a concrete action plan to actually develop the first AI-enabled R&D workforce,” said Patel. “There is room, I think, for new AI tech for formulators, and room for best practices and lessons learned of adopting AI.”

You can watch my full interview with Aadit below.

The NotCo team will be at the Food AI Summit talking about their new efforts using generative AI to develop flavor and fragrance, so make sure to get your tickets here.

NotCo's Aadit Patel Talks About the Evolution of The Company's Food AI Giuseppe

August 20, 2024

The Idea of Smell-O-Vision Has Been Around for Over a Century. AI May Finally Make It Work

Since the early 1900s, the entertainment industry has been attempting to pair the experience of smell with video entertainment.

In 1916, the Rivoli Theater in New York City introduced scents into the theater during a movie called The Story of Flowers. In 1933, the Rialto Theater installed an in-theater smell system. Hans Laube developed a technique called Scentovision, which was introduced at the 1939 World’s Fair. A decade ago, Japanese researchers were also exploring “Smell-O-Vision” for home TVs, working on a television that used vaporizing gel pellets and emitted air streams from each corner of the screen into the living room.

However, none of these efforts took off, primarily because they didn’t work very well. These attempts at Smell-O-Vision failed because we’ve never been able to adequately recreate the world’s smells in an accurate or scalable way, largely because we’ve never been able to digitally capture them.

This doesn’t mean the fragrance and scent industry hasn’t been robust and growing, but it’s a very different task to create a singular fragrance for a consumer product than to develop something akin to a “smell printer” that emits scents on command. The latter requires a comprehensive digital understanding of scent molecules, something that has only recently become possible.

The digital understanding of the world of smells has accelerated in recent years, and one company leading the way is Osmo, a startup that has raised $60 million in funding. Osmo is led by Alex Wiltschko, a Harvard-trained, ex-Googler who received his PhD in olfactory neuroscience from Harvard in 2016. Wiltschko, who led a group at Google that spent five years using machine learning to predict how different molecules will smell, founded Osmo in early 2023 with the mission of “digitizing smell to improve the health and well-being of human life” by “building the foundational capabilities to enable computers to do everything our noses can do.”

Osmo employed AI to explore the connection between molecular structure and the perception of smell, demonstrating that a machine can predict scents with remarkable accuracy. They developed a machine-learning model using graph neural networks (GNNs), trained on a dataset of 5,000 known compounds, each labeled with descriptive smells like “fruity” or “floral.” This model was then tested on 400 novel compounds, selected to be structurally distinct from anything previously studied or used in the fragrance industry, to see how well it could predict their scents compared to human panelists.
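
The core mechanic of a graph neural network, message passing over a molecular graph, can be shown with a toy sketch. This is pure Python with no real chemistry; the atom features and the simple neighbor-averaging scheme are simplified assumptions for illustration, not Osmo's model:

```python
# Toy illustration of GNN-style message passing over a molecular graph.
# Each atom repeatedly averages its own feature vector with its
# neighbors', then the node embeddings are pooled into one graph
# embedding that a downstream model could map to odor labels.

def message_pass(features, adjacency, rounds=2):
    """features: {atom: [floats]}; adjacency: {atom: [neighbor atoms]}."""
    current = {atom: list(vec) for atom, vec in features.items()}
    for _ in range(rounds):
        updated = {}
        for atom, vec in current.items():
            neighbors = adjacency[atom]
            agg = list(vec)  # start from the atom's own features
            for n in neighbors:
                for i, x in enumerate(current[n]):
                    agg[i] += x
            updated[atom] = [x / (len(neighbors) + 1) for x in agg]
        current = updated
    return current

def pool(node_features):
    """Mean-pool node embeddings into a single graph embedding."""
    vecs = list(node_features.values())
    dim = len(vecs[0])
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]
```

Real models replace the plain averaging with learned, trained transformations at each round, which is where the predictive power over those 5,000 labeled compounds comes from.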

The model’s capabilities were further challenged in an “adversarial” test, where it had to predict scents for molecules that were structurally similar but smelled different. Osmo’s model correctly predicted scents 50% of the time in this difficult scenario. Additionally, the model was able to generalize well beyond the original training data, assessing other olfactory properties like odor strength across a massive dataset of 500,000 potential scent molecules.

The Principal Odor Map (POM) created by Osmo’s model outperformed human panelists in predicting the consensus scent of molecules, marking a significant advancement in olfactory science and demonstrating that AI can predict smells based on molecular structure better than individual human experts in many cases.

We’ve been able to digitally capture and categorize other sensory categories, such as vision, which has led to massive new industry value creation in robotics and autonomous vehicles. The biggest leaps have been a result of machine learning models, and now we’re seeing another massive leap forward in capabilities and product innovation through the application of generative AI.

One potential application Wiltschko describes is “teleporting scent,” where we’ll be able to capture a smell from one part of the world and digitally transfer it to another. To do this, he envisions a world where a local AI-guided molecular sensor could instantly identify the molecular makeup of any scent. From there, his odor map can create what is essentially a formula ready for teleportation without significant manual intervention by scent experts.

This idea, using AI to recreate scents based on a digital framework quickly, could lay the foundation for what film and TV makers have long dreamed of: creating technology that can recreate odors and smells at scale. In other words, we may finally enter a world where Smell-O-Vision becomes a reality. The potential for video entertainment, virtual reality, and other experiences in food service, travel, and more would no doubt lead to a multitude of new applications, much like we’ve seen over the past couple of decades with advances in computer and machine vision.

June 27, 2024

The Food AI Weekly Bulletin: Will AI Fakery Make Restaurant Reviews a Thing of the Past?

Welcome to the Food AI Weekly Bulletin, our new weekly wrapup that highlights important happenings at the intersection of AI and food.

Nowadays, there’s a lot of noise around how AI is changing food, so we decided to create a weekly brief to bring you what’s important, cut through the noise, and deliver actionable insights. If you’d like to sign up for the Food AI Weekly, you can do so here.

Highlights

Is AI Ruining Restaurant Reviews? A new study shows people cannot distinguish between real and AI-generated reviews.

AI Food Art Is Everywhere (And It’s Not Great for Freelancers) Generative AI tools like Midjourney and DALL-E are revolutionizing food imagery, but what does this mean for freelancers and creatives who traditionally provided these services?

First, Al Michaels. Next, How About an AI-powered Anthony Bourdain? The news of Al Michaels allowing AI to replicate his voice has almost everyone freaking out, but what does it mean for the future of AI-generated avatars of famous food personalities?

Swallowing A Robot. Endiatx has developed the Pillbot, a tiny robot that can be swallowed to explore the gastrointestinal tract, potentially revolutionizing diagnostics and personalized nutrition.

Food & Nutrition Centric LLMs Could Be an Investible Opportunity. VCs see potential in industry-specific AI models, particularly in the domains of biology, chemistry, and materials, as these specialized LLMs could offer unique investment opportunities.

Brightseed’s Forager AI Finds Novel Bioactives. Cranberry giant Ocean Spray teams up with Brightseed to uncover new bioactive compounds in cranberries.

Our Favorite AI Food Art of the Week. We’ll be making this a regular feature. If you’d like your art featured, submit it on our Spoon slack. 

We’re going to be exploring all of this at our Food AI Summit in September. Join us, won’t you? Super Early Bird pricing expires at the end of this month.

Is AI Ruining Restaurant Reviews?

What happens when humans can’t tell real restaurant reviews from fake ones? The restaurant industry has begun asking itself this question as a tidal wave of fake AI reviews floods online sites.

According to Yale professor Balazs Kovacs, humans are already losing their ability to discern the real from the fake. Kovacs recently unveiled the results of a study demonstrating AI’s ability to convincingly mimic human-written restaurant reviews. For his test, Kovacs fed Yelp reviews into GPT-4 and then asked a panel of human test subjects whether they could tell the difference. At first, the results generated by GPT-4 were too perfect, so Kovacs then prompted GPT-4 to insert typos and sub-par grammar into its output, at which point the panelists could no longer reliably tell the AI-generated reviews from the real ones.

While this raises obvious concerns about the authenticity of online reviews and the trustworthiness of consumer-generated content, it shouldn’t be surprising. Figure 01’s human-like speech tics were creepy, but mostly because of how human its awkward conversation seemed. With typos and sub-par grammar—in other words, what we see every day on social media—it makes sense that AI-generated reviews seemed more human.

One potential workaround is to use AI to detect and flag fake content, but early tests show that even AI can’t reliably tell what is real and what is fake. Another suggestion is to require reviewers to have purchased a product before reviewing it (similar to Amazon’s labels marking reviews from verified purchasers) and to apply the same approach to restaurants. My guess is that this will be the best (and potentially last) line of defense against the coming tidal wave of AI reviews.

AI Food Art Is Everywhere (And It’s Not Great for Freelancers)

One early application of generative AI, as it applies to food, is the creation of images. Midjourney, DALL-E, and other tools allow us to create instant realistic images with a few sentences. As a result, we’ve seen CPGs, food tech software companies, and restaurant tech startups jump on the generative art trend.

While that isn’t necessarily good news for actual artists (this WSJ article is a must-read on the impact of AI on freelancers and creatives), these tools have democratized professional-ish photos and art in the same way Canva made professional-style graphics and presentations available to anyone.

One company that’s benefitted significantly is Innit. The company, which in its early life focused on hiring celebrity chefs like Tyler Florence and spent tens of thousands of dollars on photo shoots for a single recipe, now whips up recipe images instantly with generative AI for its Innit FoodLM.

While most Internet-savvy marketing types at food brands, restaurants, and other food-related businesses have at least learned to dabble in generative AI prompt engineering, that hasn’t stopped some from trying to build a business out of it. Lunchbox created an AI food image generator built on DALL-E over a year ago (the website has since gone dark), and just this week I got pitched on a new AI-powered food image generator that wants to charge for its service, which is essentially a user interface that manages prompt engineering for an underlying model (most likely Midjourney or GPT-4). These services likely have a short lifespan; my guess is most marketing folks will learn to prompt popular image generators like Midjourney directly.

First, Al Michaels. Next, an AI-Powered Anthony Bourdain?

The Internet freaked out yesterday when news broke that Al Michaels has agreed to let an AI copy his voice, and rightly so. First off, it’s creepy. Second, this is exactly the kind of thing the Hollywood writers and actors guilds struck over for so long, so I’m guessing the Hollywood creative community isn’t exactly happy with Al. And finally, it goes to show that if you throw enough money at us humans, the temptation to cave to the bots will be too much.

My guess is we’ll eventually see AI-generated avatars of famous chefs. All it would take is for the estate of Julia Child or Anthony Bourdain to get a good enough offer and it won’t be long before we hear (and maybe see) their avatars.

Swallowing A Robot

According to Venturebeat, Endiatx has developed a tiny robot that can traverse your body, equipped with cameras, sensors, and wireless communication capabilities. The robot, called Pillbot, allows doctors to examine the gastrointestinal tract and can be used for both diagnostic and therapeutic purposes.

The company’s CEO, Torrey Smith, has swallowed 43 of these Pillbots, including one live on stage, which can be seen here. If this technology actually works (and those pills can be made smaller because, holy cow, that’s literally a big pill to swallow), it’s not hard to imagine these being used to dial in and optimize personalized nutrition regimens.

Food & Nutrition Centric LLMs Could Be an Investible Opportunity

Business Insider asked some VCs what they’re bored by when it comes to AI and what they’re excited about. Not surprisingly, they talked a lot about how it will be hard for startups to break through in foundational large language models, where big players like OpenAI and Google play. And, like any good VCs looking at an early market, they talked up picks and shovels.

Even as investors shift their focus to promising AI infrastructure startups, there may still be some opportunities for new LLM startups to win, especially when they’re trained for specific industries, explained Kahini Shah, a principal at Obvious Ventures.

“We’re excited about what we call Generative Science at Obvious, i.e., large, multi-modal models trained in domains such as biology, chemistry, materials,” she said.

Brightseed’s Forager AI Finds Novel Bioactives

Brightseed, a company that uses AI to accelerate bioactive and food compound discovery, announced that it has (in partnership with Ocean Spray) used its Forager AI to uncover novel bioactive compounds in cranberries. Forager identified multiple bioactives, such as terpenes, which Brightseed believes hold significant potential for human health. These findings, based on in silico analyses, will undergo further clinical validation and will be presented at the American Society of Nutrition’s NUTRITION 2024 conference.

This accelerated discovery of health-positive compounds is another example of the AI acceleration effect I wrote about yesterday. Things are beginning to move exponentially faster at every stage of the food value chain, which over time means our basic understanding of the rules underpinning what we do (such as food product development) gives way to entirely new rules rewritten in large part by AI.

Our Favorite AI Food Image of the Week: Hungry Monkey

We like looking at AI-generated food art and figured we’d show you some of our favorites on a weekly basis. 

If you’d like to submit your AI-created food art (or if you’ve found one you think we should feature), drop the image and the source/attribution (preferably a link) on our Spoon Slack.

June 26, 2024

‘All The Rules Are Changing’: Why AI is Accelerating Change to Every Part of the Food Business (and Beyond)

This week, I attended the Fancy Food Show in New York City. It’s long been one of my favorite food conferences, mostly because I just love walking around and sampling all the great food. I mean, who wouldn’t?

While the fantastic food samples on the show floor are reason enough for me to get on a plane to NYC, the real reason I was there was to give a keynote talk on how AI is changing the food business.

Granted, the crowd at Fancy Food isn’t your typical Silicon Valley audience, the types that get excited about technology for its own sake. Instead, these are usually successful small to medium-sized businesses making anywhere from $1 million to $250 million annually by selling your favorite hot sauce or healthier crackers.

In other words, the good stuff.

Since these are food brands first and not technology companies, I kept my talk straightforward. I discussed how AI has long been used in the food business, how new forms of AI (particularly generative AI) are advancing rapidly, and how, over the next decade, every rule governing their business—from sales and supply chain to customer acquisition and product development—will change dramatically.

If you just rolled your eyes, I understand; I’ve long been skeptical of hyperbolic warnings about ‘disruption,’ and by now, most of us are tired of hearing how AI is a big deal. But that didn’t stop me because, despite all the talk, I still think most people underestimate the significant difference AI will make in our daily lives in the next decade. In other words, most of us are unprepared for how dramatically the rules governing business and everyday professional life will change.

This belief was reinforced last week when I caught up with Samantha Rose, a long-time consumer-product entrepreneur. She transitioned from being an editor at a Yale magazine and an award-winning poet to building a highly successful housewares startup, which she sold in 2021 to Pattern Brands. Since then, she started a third-party logistics and business services company and is now raising funds for a new venture that buys distressed consumer product brands to turn them around. And, somewhere along the way, she was featured in a Chase card commercial.

In short, Sam has mastered the modern rules of today’s business. Yet, when I asked her about AI, she said, “I wish I could take a year off to study and become an expert on AI because I feel like all the rules are changing.”

I thought if someone as savvy as Sam feels the need to go back to school on AI, what chance do the rest of us have?

After my talk, I led a panel on AI, where we delved deeper into how businesses may change and how small food business entrepreneurs should prepare.

One theme that emerged from the session is that growing food brands need to pay attention to how consumer buying behavior will be radically impacted by AI. Imagine a future where we have our own AI copilots telling us what to eat, where to get the best deals, and more. In a world where everyone is guided by an AI or multiple AIs, how will that change consumer behavior when it comes to buying food?

This is already starting to happen and will undoubtedly be widely adopted in a decade.

And then there’s the purposeful creation of AI-derived information sent to consumers with the intent of changing their buying behavior. We’re seeing it in restaurants as AI reviews flood review sites, and they’re already good enough that consumers can’t tell the difference.

As a publisher, I can’t help but think about how Google deemphasizing website search results and pushing their own AI-generated answers will impact not only my business but also the type of information consumers consume to steer their behavior.

Bottom line: In every direction we look, industries and their associated value chains are changing faster than ever before. The rules are changing. Unfortunately, most of us can’t take the time off to study, so we’ll all have to learn on the fly.

I’ll share the suggestions I made for these businesses at the Fancy Food Show in a follow-up post.

May 22, 2024

We Used CloudChef’s Cooking Guidance System to Cook Like a Chef

Earlier this month, we visited Google in Chicago, where we got a chance to put the Cloudchef cooking guidance system through its paces.

For those not familiar with Cloudchef, the company uses computer vision to monitor a chef working through a recipe. Sensors and cameras monitor everything from the temperature of a protein to the moisture lost while reducing a sauce to the brownness of an onion and put it all into a machine-readable playback file that can be executed in a kitchen powered with Cloudchef’s software.

We put this playback system, which acts essentially as a cooking guidance system, into action. In the video below, you can watch The Spoon’s Tiffany McClurg being introduced to the system by Cloudchef CEO Nikhil Abraham, and then watch as she cooks a meal of fried rice using a recipe that was created and “captured” in a Google Mountain View kitchen by one of their in-house chefs.

According to Tiffany, she’d never cooked fried rice before. Using Cloudchef’s system, it took her about ten minutes to make a meal that tasted pretty darn good!

You can watch The Spoon’s newly trained chef cook the entire meal using Cloudchef in the video below.

The Spoon Cooks a Meal With CloudChef

May 15, 2024

The Story of Samsung Food with Nick Holzherr

Nick Holzherr, founder of Whisk and head of Samsung Food, is this week’s guest on the Spoon Podcast.

Those in the smart kitchen industry know Nick, in part because his company helped pioneer the early tech behind shoppable recipes, but also because his acquisition by Samsung is the culmination of one of the true success stories in this market.

Today, the technology that Nick and Whisk built is what powers Samsung Food, the AI-powered food and recipe platform that the consumer electronics giant debuted at CES 2024.

Some of the things we talk about in this latest episode of The Spoon Podcast include:

  • The story of how Whisk was the first recipe startup to explore how to use AI and apply it to recipes.
  • Nick’s experience going on the British version of The Apprentice and appearing before Lord Sugar (the British version of Donald Trump) to pitch the company. 
  • The growth of the company as Whisk started working with grocers in Europe and eventually appliance brands.
  • Nick fielding calls from three companies who presented offer sheets to buy the company.
  • What Nick is excited about and how he sees technologies like AI being applied in the kitchen in the future.

You can read the full transcript of the conversation below and listen to the full interview by clicking play on the podcast player below or heading to Apple Podcasts, Spotify, or wherever you get your podcasts.

If you’d like to connect with Nick in person, he will be at the Smart Kitchen Summit on June 4th. You can get your tickets here.

Conversation Transcript:

Michael Wolf: Nick just told me whatever I throw at him, he’s ready for, especially with that new microphone. You look great in that, in your little room there, the microphone.

Nick Holzherr: It’s brand new. Well, it’s three months old. I realized I needed to start upping my game when everyone’s podcasting now, right?

Michael Wolf: Everyone’s podcasting. So when’s the Nick Holzherr podcast start, by the way?

Nick Holzherr: I haven’t got my own podcast, but if anybody wants to invite me onto theirs, I’ve now got a good mic.

Michael Wolf: Yeah, well, Nick, I feel I’ve known you probably since 2016, 2017. I think Whisk was the first company I remember that was really talking up AI and recipes. That was one of your original stories, right?

Nick Holzherr: Yeah, I mean, we started with the problem that users want to use recipes anywhere, not just from one manufacturer or one publisher or one grocery company or one tech startup with a couple hundred recipes. And the only way to really make that happen is with AI. If you want a smart experience, that is. Use AI to parse all these recipes and make them smart. So we did that, but that was before deep learning, before it was accessible to startups. We were literally flying out on planes to America, buying the latest processors, latest chips, graphics cards, and using them with basically PhDs, ex-professors that we had hired to build these models. It was really early in AI, before TensorFlow, before these libraries. Of course, now it’s totally changed. Super early.

Michael Wolf: I was looking back in the history a little bit and I didn’t realize Whisk had started back in 2012. Back then, you were actually on a British TV show, The Apprentice, on the BBC. Were you pitching Whisk at the time?

Nick Holzherr: Yep. It actually was. It’s called the Apprentice, and it’s the same thing as basically Donald Trump has in the US, right? And we have a different person running it called Lord Sugar. The guy in the UK is called Lord Sugar and he was an advisor or member of the Labour Party, which is the left side of the government. So it’s kind of a little bit the opposite of Donald Trump.

Michael Wolf: Lord Sugar. Way better name.

Nick Holzherr: Yeah, he was originally Sir, I think, and then they made him a Lord. This is the British monarchy system, which is also hilarious. It was amazing fun. You do 12 weeks of tasks competing against other people. And then if you get to the final, you get to pitch your business plan. If you win, he gives you money and becomes a partner in your business. I got to the final and I didn’t win. But because it was such a popular TV show, and I got to pitch it to him and his advisors and therefore the UK, everyone wanted to invest in it, but not everyone, but enough people wanted to invest in it. Most people would take the meetings with me. So I got inbound from a lot of the grocery stores and publishers and stuff saying, hey, we’d love to work with you.

Michael Wolf: The Tescos of the world came in, you’re saying, hey, let’s meet. Was it immediate, like the next day?

Nick Holzherr: Yeah, literally that night, my LinkedIn and my inbox just filled up with loads of people who wanted to work with us, who said, ‘I love it. I know he (Lord Sugar) didn’t invest, but can I invest or can I work with you?’ And then the ones who hadn’t, when I messaged them months later and said, ‘Hey, this is the idea I pitched on the Apprentice,’ they would take the meeting, not necessarily because they were interested, but because they wanted to meet the guy who was on TV.

Michael Wolf: There was a curiosity in meeting you, right?

Nick Holzherr: And for a salesperson, that’s what you need, right, a foot in the door.

Michael Wolf: I feel The Apprentice was the predecessor to Shark Tank, which is also Dragon’s Den in the UK. The value for you as an entrepreneur is it’s just a giant commercial. The value is in the millions of dollars in terms of free publicity.

Nick Holzherr: Exactly. It’s huge. And it’s also actually fun. It is really, really fun. It is also challenging because it’s not a show about business success. It’s about conflict, essentially, right? Of people conflicting and trying to make something work. It’s like a business assault course. They put you through psychometric tests and psychology interviews to basically choose people that they think will conflict with each other and create good TV. So that part is stressful, but you also get to do some really cool things on the show, like build products super quickly and pitch them to big people. So it’s really fun. But ultimately it was the platform I needed to get Whisk started.

Michael Wolf: And so you got Whisk started and I think I connected with you in around 2016, 2017. From those early days when you had the business plan, what you pitched on Apprentice BBC versus when I met you around 2016, what had changed? Obviously you’d started the company, went from business plan to actually starting this thing, but had the vision changed at all?

Nick Holzherr: The core mission was to help people cook, make it more joyful as we talk about it today, and to help them in that process. That hadn’t changed. But what had changed was we started off with a consumer app and we couldn’t get it to what we thought was enough scale. So we pivoted from trying to build a consumer app to looking at all that tech we’d built. And you started off talking about AI. We’d built a lot of AI way back, 10, 12 years ago, so we were really early in that. And businesses wanted to use it. We knew that because, when we built our consumer app, we had built widgets that publishers could integrate on their sites. And through that, we had loads of conversations with big publishers, big retailers and big CPG brands. And a lot of them said, ‘we love this widget. Yes, we’ll integrate it, but can we also use some of the other tech?’

So, in 2014, 2015, and into 2016, we couldn’t make the consumer app work. So we pivoted into a B2B platform, offered all the tech that we had built for anybody to use, and built an API platform. At the height of it, we were powering half a billion monthly consumer interactions, which was pretty impressive, and we were powering it for a lot of the big players in the world, like a lot of big CPG brands. If you made a list of the top players, maybe half of those were using our technology.

Michael Wolf: Were you on big publisher websites?

Nick Holzherr: Yeah, the biggest, right? Food Network, Allrecipes, BBC Good Food, About.com, the big, big sites with many tens of millions of hits a month, users, those were integrating us and we were sending them across to grocery retailers like Walmart, Amazon, Instacart, the UK ones like Tesco. And the big CPG brands wanted to be part of it because they wanted to have their products featured. So when it’s butter, they want it to be the Unilever butter, or when it’s stock, the Unilever stock, right? So they cared about that. And they also wanted to connect their own websites to commerce. Because we had those integrations, they then said, can you power more of our platform? We said ‘yes’. Then all the big IoT companies came along, the appliance manufacturers came along and said, ‘hey, can we use your platform as well?’ So we were powering a whole bunch of different IoT players. And that’s, I think, where you and I first started talking.

Michael Wolf: So just real quickly, you were with the publishers and you were essentially doing shoppable recipes. So, like Bisquick, Nestle wants their yogurt or whatever. You were one of the pioneers in the whole shoppable recipe concept. And that was around the 2013, 2014 time frame.

Nick Holzherr: That’s right. I mean, there was Constant Commerce (now Constant.co), which is no longer around, but they were there at the same kind of time as us, super, super early. It was basically only Constant Commerce and us. And we took a different approach to them. They were more enterprise-y in their proposition, so some of the big enterprise players wanted them. But we were more open, like a let’s-integrate-the-whole-market kind of game. And we had a bit more of a B2C to B2B2C kind of play, where we integrated onto their sites, and when someone added to a shopping list, it would still be our shopping list, and users could save the recipe into our recipe box. Our consumer experience was always there, but it was integrated on other people’s sites and shops and brands.

Michael Wolf: And the appliance guys showed up around in the 2015, 2016 timeframe. And what were they asking you? ‘Hey, we want apps. We want people to have recipes.’ What were they showing up with, and what questions did they have?

Nick Holzherr: It was ‘we want to connect our appliances, so can you give us recipes in a structured way?’ Or can you make it work with our products?

Michael Wolf: It was the guided cooking recipe era.

Nick Holzherr: Exactly. But also shoppable recipes. I think everybody, in the early stages of developing an experience, thinks, ‘I want to make money. How do I make money? I make it by connecting to commerce. So, let me connect to commerce.’

So that was definitely part of the game as well. And then what happened was, at that point, what we were offering was really in its infancy. It was not a mainstream thing to do. And it was really around the 2016 and 2017 mark when suddenly the e-commerce market had grown. Penetration of e-commerce had doubled or tripled from what it was back in 2012. And so suddenly you had everyone wanting it. It was kind of a crazy time where we had struggled as a business until 2016, 2017, and suddenly we were inundated with everyone saying, ‘hey, can we use your platform?’ And we suddenly became profitable. We went from five people to 30 people in a year, not from any investment, just revenue. We had no investment at that point; everything was bootstrapped. We were making money, and the business was profitable. It was like, wow, this is really fun. What is it like to make money? A startup making money? What is this?

Michael Wolf: It’s nice to make money. And so how were you making money? You were doing these recipes, and were you taking a little bit of money from every recipe impression that went through? Was there a big appliance company that wanted to pay you to do a custom integration?

Nick Holzherr: All of those. So, we had three revenue models. We had the license fee model, a monthly fee for the API. We had a grocery per-click or per-new-customer fee, which was a small fee per transaction, but when you add it all up, it works out relatively well. And then we also had an ad model where, as a CPG company, if you’re Nestle and you want to get your yogurt sold, you pay to have your yogurt featured. It doesn’t say Nestle yogurt in the recipe, but Nestle yogurt is what the customer gets. And they don’t pay on a per-acquisition basis there, they pay per view, because you’re inspiring the customer, not just selling to the customer.

Michael Wolf: Okay, and so this thing’s going crazy. You had 30 employees and you’re making money. Life is good. And then, at some point, probably around this time, at kind of the apex of this ride, you get a call from Samsung?

Nick Holzherr: Yeah, I mean, there were three companies. Another one of the big appliance companies and one of the world’s biggest technology companies also rang up at roughly the same time. Within a six-month period, we had three serious acquirers wanting to buy us, and a bunch of other ones on the side saying, ‘maybe, could we, would you be open to it?’ But three of them were at term-sheet-level interest, like real interest, flying people into our offices saying, ‘hey, can we buy you?’ And it was a really good time; we were profitable, and we didn’t need to sell. And then, ultimately, we chose to go with Samsung.

Michael Wolf: And you’re looking and you’re just thinking in your head, ‘Lord Sugar, you should have made a better offer.’ So you had these three offers: a technology platform company and two big appliance companies, it sounds like. Were you inviting them in, or was their head of acquisitions calling you up?

Nick Holzherr: The latter. They were ringing us. And this was kind of what’s crazy about it, because initially I was even saying no to them, not to the acquisition teams, but to the initial integrations. I was like, why do you want to integrate with an appliance? Are we going to get any users from this? I was in the user’s head, thinking I’m not going to make any transactions. How many fridges have you got live? How many users are going to click on it? Am I going to make enough money on this to make it worth visiting your office? Because often with these large enterprises, the request is, ‘can you visit our office to pitch this certain person?’ And I was apprehensive. I was like, is this wasting my time?

Michael Wolf: You’re busy. You have 30 employees.

Nick Holzherr: We’re busy. And the funny thing is, of course, those conversations ended up with the M&A teams suddenly getting interested. Because they’re looking at it going, ‘hang on, how many people in the market are using it? Our competitors are using it.’ They’re thinking, ‘who’s going to end up owning this?’ They probably knew that M&A stuff was happening, because the people in the industry know each other quite well. So they probably all started worrying about what happens if the other person wins it. And that’s a great place to be as a startup. It was literally the perfect time. I could probably have gotten a better price for it if I’d held on another few years, but it’s impossible to price it, impossible to get the timing exactly right. It was certainly a good time, and I’m happy with how it worked out.

Michael Wolf: I think it was a good time, and Samsung is an interesting company. Did you have to fly out to Seoul to pitch?

Nick Holzherr: No, I had flown out to Seoul multiple times after we had a partnership with them, because we started powering loads of stuff for them, right? It became a really good partnership where we were powering way more than was initially requested. We were even building some new features based on their budgets and their requests and their needs. It was being used across lots of appliances. And I had a vision of it being used by lots of other divisions as well, which is what ultimately ended up happening, of course. But we couldn’t predict that at the time. I did fly a bunch to San Francisco, though, because the acquisition team was called Samsung Next. It was an innovation arm of Samsung. Basically, because it’s quite hard for the Korean team to buy stuff, given how things are structured, HQ asked the innovation arm, ‘hey, can you guys take a look at this?’ And they did, and they did the acquisition. So actually, our bosses, if you like, our acquirer, was the San Francisco entity of Samsung.

Michael Wolf: And so that deal comes through, you accept it. I remember being in Chicago at the Housewares show; I think that’s where I interviewed you about that deal. So it must have been around the March timeframe, at Housewares, in 2018 or 2019, one of those years. And it’s been an interesting ride.

Nick Holzherr: Yep.

Michael Wolf: A lot’s probably happened. So maybe talk about getting acquired by Samsung. What was that like initially, being absorbed into this giant Borg? It’s a massive company; even though the acquirer was Samsung Next, the San Francisco entity, it’s still Samsung.

Nick Holzherr: Yeah, it is. Actually, what I would say is, what I’ve been really pleasantly surprised by is how much autonomy they’ve given us and how much they’ve let us keep parts of the culture that matter. We were always a distributed team and they’ve let us keep that. And that was a massive advantage during COVID when it was so difficult if you were not a distributed team, but it also allowed us to kind of operate, I think, quite effectively. And keep some of the culture we have, like we go on annual offsites where we fly everyone to one place in the world. Like that’s not a normal thing in an enterprise to explain to your HQ that, hey, we’re going to fly the whole team to Greece or something. And they support that. So that’s been awesome.

Michael Wolf: You go to Greece for your offsites? What the heck?

Nick Holzherr: Well, we have to fly somewhere, and Greece is actually our next one; we haven’t been to Greece yet. The previous one was Cyprus, and we did do one in Korea. That was the most expensive one we did. We did Lisbon, Madrid, Budapest. If you look at them on a spreadsheet, these are actually the cheapest places to fly everyone to, but it’s also fun to spend a week together and meet people in person, especially if you’re working in a distributed environment. So we spend a lot of time building a distributed culture.

Michael Wolf: Nice!

Nick Holzherr: and making sure people felt valued and motivated and knew what they were doing. And Samsung let us keep that, so I’m really grateful to and respect Samsung for that. What was crazy was some of the enterprise ways of managing a business, right? That definitely added a layer of complexity, and that’s inevitable with any enterprise. And then I think the other thing that was crazy was scaling. They wanted us to scale our team from 30 people to 120, and they asked me to do it ASAP. I did it in nine months. Going from 30 to 120 in nine months is an experience that, well, maybe I’ll do it again, but it was intense: the amount of interviews you have to do, figuring out how to integrate all those people at the same time as integrating your company into a large enterprise like Samsung. All of that in one go was probably the most intensive six months of work I’ve ever done. Massive learning curve again, hard work, but I learned a bunch and I’m grateful for the experience.

Michael Wolf: Were you in on almost every one of those interviews and decisions? You were pretty hands-on, it sounds like. That is crazy. Okay, so you grow to 120, that’s impressive. And over time, I saw more and more getting integrated, and I think the culmination of that was this past CES, when it essentially transformed from Whisk to Samsung Food. Samsung Food was a big announcement for Samsung at CES 2024 in January.

Nick Holzherr: Yeah.

Michael Wolf: But the heart of that, the beating heart of that, was really the Whisk acquisition, right? It’s almost a rebranding of Whisk.

Nick Holzherr: Yeah, that’s right. And there are two sides to whether that was a good idea or a bad idea. I built the Whisk brand, right? I own a bunch of Whisk stuff; I’m drinking out of a Whisk flask right now. So, hey, killing Whisk as a brand stung, and some users felt that too. But ultimately, there is a thing with any enterprise: if you’re trying to build an app that comes across as not-from-here, you’re going to struggle to get integrated into the different hardware units and into the Samsung ecosystem. And ultimately, if you’re building a consumer experience, you have to leverage the distribution of a platform like Samsung. Samsung has one of the biggest device footprints in the world; I think it probably has the most devices of any company if you add up all the different units, because it’s so broad in what it does: TVs and mobiles and kitchen appliances, watches and now rings, everything. So we had to use that distribution, and the most effective way of doing that was to call ourselves Samsung Food. And that has actually worked, in terms of the integrations we have planned, the ones we’ve achieved, and the ones we now have on the table that are going to happen over the next six to twelve months. That’s part of what I’m actually most excited about in where we’re going, especially with health, because food and health are so closely coupled, with obesity and diabetes and other things. Yet no one has really solved that. People have worked on it, but no one’s really solved it. And I think Samsung has an important part to play, because it has health and food appliances. And if we do that, that’s actually a really important thing in the world that we’re doing, something I feel proud of and passionate about. So, Samsung Food as a brand, I know people are split on the opinion, but I think it’s a good name and there are lots of advantages to the name.
If you’re looking at 10 food apps or 20 food apps in your…

Michael Wolf: Yeah, you’re going to see Samsung food. Yeah.

Nick Holzherr: you’ll notice it and you’ll probably try it. Your propensity to try out the app is higher. But of course, Samsung hasn’t got a huge plethora of successful apps. So that’s part of also what we’re helping Samsung with.

Michael Wolf: I always have an 8:30 call, so I’m going to tell them I’ll be a couple minutes late. I wanted to go for an hour and then, all of a sudden…

Nick Holzherr: We can do part two later, a different day if you want.

Michael Wolf: Yeah, I’ll edit that part out. So, when I think of Samsung Food, or Samsung with your appliances, you have the wearables, you have what you built in Whisk; there are a lot of different parts of the puzzle there. And I think you’re almost on a collision course with some of the other folks who are coming at it from the precision nutrition side. I look at January AI, which is a pretty cool app: you take a photo of food and it can predict blood sugar. It seems like it makes sense that ultimately that type of function would be built into what Samsung has, because you guys have so many other parts of the puzzle. So I see you guys having a great foundation, a lot of it built on the appliances, the wearables, and the app side.

Nick Holzherr: Yeah, I think that’s what’s exciting about the sensors, right? If you’ve got access to sensors and distribution through the hardware and the brand, that gives you some good starting Lego bricks. It’s not the whole solution. It’s such a big problem in the world. A big proportion of the world is going to suffer from the health challenges of not eating healthily. Of course there’s going to be thousands of companies. I hope a lot of them are successful alongside us.

Michael Wolf: I mean, I think a lot of people are just thinking, hey, now we can just take a pill and I will no longer be at risk for type 2 diabetes, I’ll always be 50 pounds lighter. But I think that’s not realistic. You can’t have half the world’s population at risk for type 2 taking medication or a shot. I think what you guys are building could be really interesting. It’ll be a few years, though; a lot of these different things need to come together. It sounds like that’s what’s driving you. What else is driving you as you look forward five years, ten years down the line in this space? What is really exciting for you?

Nick Holzherr: I think the advances in AI are exciting for everyone in the space, including us. We tried to do stuff like vision AI, using your camera to detect items and food, two or three years ago, and failed. It wasn’t good enough, and that was despite having good engineers on it; it was hard to do. Yet now you come along and use an open source library from OpenAI and boom, it works. Sure, it needs tuning, it needs some work, maybe it takes a little bit of work to get it to production level, but it’s so easy. Stuff that we spent 10 years building, you can now use the OpenAI API for and get a good way there. Which is so scary for people who’ve invested a whole bunch of time and effort into building really, really advanced technology, because so much of it’s now easier than ever to do, which means it’s open to anybody. The scary part is: how do you win if everyone can do it? The fun thing from the consumer side is that it’s going to be possible to build some of the stuff we all hoped would be there five years ago. Now it will finally get to a point where it’ll be better and usable and actually start adding value. With the smart kitchen, there’s always the question of how far off we are from a truly smart kitchen that actually adds value to the user. When will it stop being a gimmick and start being smart and useful? We’re getting closer and closer to that being true. Of course it’s true in some ways already; some devices are fantastic at adding value. But we want to get to a macro level where I can ask my mom, do you want a smart kitchen? And she goes, yes, I do, because it adds loads of value.

Michael Wolf: It’s not just turning things on and off with an app and making things more difficult by adding more processes and app friction in the way; it’s actually making it more useful. So, going from those early days where you were buying GPUs and trying to figure out how to build AI, and going on TV, to where you are today, it’s been quite a ride. It’s been fun talking to you, Nick, and hearing about all this. Thanks for spending some time with us.

Nick Holzherr: Thank you, I look forward to seeing you in Seattle.

Michael Wolf: You’ll be seeing Nick in Seattle at Smart Kitchen Summit. Everyone who wants to see Nick in person, famous TV star, Apprentice star (I’m just embarrassing him), you can see him in Seattle. All right, Nick. Thank you, man. Don’t hang up yet. Don’t hang up yet.

Nick Holzherr: Thank you, thanks Mike, bye bye.

April 22, 2024

Micromart Wants to Create Just-Walk-Out Convenience Anywhere With Its Just-Plug-In Cabinets

Earlier this month, we learned that Amazon is phasing out its Just Walk Out technology at its Amazon Fresh grocery stores. The company didn’t say much about the reasoning behind it, but one likely reason is that customers never valued skipping the checkout line in a traditional grocery store shopping experience as much as Amazon anticipated.

But that doesn’t mean shoppers don’t value speed to completion and low-friction shopping experiences. Getting in and out quickly is highly desirable when watching a ballgame or picking up something quickly for lunch during the workday. That’s why Amazon will continue to keep its Just Walk Out technology in sports stadiums and in its Amazon Go fast-format convenience stores, which are typically located in busy downtown office corridors.

Still, do we need whole stores outfitted with cameras and sensors? What if we could condense all this down to a couple of cabinets that can sit in any condo or office lobby?

That’s the idea behind Micromart, an eponymously named micro-market platform from the same Toronto-based team behind Kitchenmate. Micromart puts AI-powered image recognition technology into standalone refrigerated cabinets that fit anywhere with a little floor space and a power outlet.

To open the locked refrigerated or freezer cabinet, the customer taps it with their phone. They open the cabinet, grab the item(s) off the shelf, and once they close the cabinet, a receipt is generated. If the item is a meal that needs to be heated, the customer can heat it in a “smart cooker” attached to the cabinet.

The addition of a food heating system is one of the major differentiators of the Micromart solution, something company CEO Yang Yu says they originally developed for Kitchenmate. Kitchenmate, which The Spoon covered back in 2019, started as a food-to-go service for condos and offices. According to Yu, it was while looking for available technology to enable easy unattended purchases of Kitchenmate meals that the company realized it would need to build its own smart fridges and commerce system.

“We started with the heater,” said Yu. “That was the only thing we had, but then we realized we needed to put the food somewhere, so we built a fridge. When we built the fridge, we were looking at AI companies that did just-walk-out technology, but all of them had issues, and they were all very expensive. And none of them were very accurate. So we had to build our own.”

After building just-walk-out technology for their fridge and deploying it in different locations, they realized the refrigerated cabinets and the heating system were the business. Not long after, Micromart was born.

One reason Yu and his team saw this as a potentially big business is the realization that many office buildings are shutting down cafeterias, often replacing them with just a couple of vending machines. While some solutions, like Farmer’s Fridge, provide fresh options, there aren’t many choices for fresh and hot food.

“Nobody wants to eat vending food,” said Yu. “There’s definitely success stories around healthy vending, but you’re not going to get the variety and the hot food that people expect out of a cafeteria.”

In addition to the refrigerated cabinets and the food heating system, the Micromart solution comes with software as a service that lets retailers track and forecast inventory, electronic price tags, and built-in digital ad displays the operator can customize. The company’s offering also includes a Shop consumer app that can be customized with the operator’s branding. Pricing for a three-cabinet system is $19,000 for the cabinets, plus transaction and monthly SaaS fees.

Micro-markets aren’t new. Researchers estimate that the micromarket business in the US was almost $4 billion in 2022 and expect it to grow by 13% through 2030. However, many of the solutions are not much more than refrigerators with RFID scanning or weight sensors built in. Other solutions, like those deployed at airports, require the customer to pick up the items and go through a self-checkout scan, often with a store employee eyeing them from close by. Micromart wanted to marry the lighter footprint of older cabinet systems with the more advanced Amazon Go-like vision systems.

“The whole premise behind this was that you could literally put it anywhere in North America,” said Yu. “All you need is a standard electrical outlet, and you plug it in, and it works.”

According to Yu, the Micromart solution will debut at the NAMA show in May.

April 19, 2024

The Food Tech News Show: A Look at Our Food Lives in The Year 2055

Welcome to the Food Tech News Show! You can watch the show live here at 1 Pacific/4 Eastern or on Streamyard, YouTube, or LinkedIn.

Fast Food Facial Recognition? - FTNS

This week, Mike and Carlos will be joined by Future Market’s Mike Lee to talk over some of the most interesting stories of the week. Mike Lee will also talk about his new book Mise, which paints four different scenarios depicting potential directions of our future food system.

Here are all the stories we’ll be talking about.

Bored & Hungry Closes – Food & Web3: A check in on where things are

In March 2022, NFT and crypto investor Andy Nguyen purchased Bored Ape #6184 along with three Mutant Apes and soon decided to establish a Bored Ape-themed restaurant named Bored & Hungry. The restaurant opened its doors on April 9, and by the end of its first day, it had served 1,500 burgers and had lines stretching around the block.

Two years later, Bored & Hungry has closed.

Last week, Nguyen announced on Instagram that the restaurant’s original location in Long Beach, California, was closing. He shared that they had sold the concept to a franchising company from Asia known as HUNGRY Dao.

Is AI-Powered Customer Interaction at Fast Food & Retail Giving Up too Much Privacy?

A Fast Company article titled “How fast food is becoming a new surveillance ground” looks at how new customer interaction layers using things like bio-authentication, cameras, profile information, and more are a new risk for gathering information about the public. 

And earlier this week, we saw Steak n Shake launch facial recognition nationwide for check-in.

Are we going through an airport or going to buy a surf and turf?

Vow Thinks Imitating the Meat We Eat Is a Bad Approach. Enter the Quail Parfait.

Green Queen: The Sydney-based startup is today launching its cultivated Japanese quail in Singapore’s Mandala Club, after the country’s regulator gave it the go-ahead to sell the product. But unlike other rollouts of cultivated meat, where chefs are supplied with the meat itself (which they then incorporate into dishes), Vow is taking a novel approach. What restaurant kitchens get is a parfait containing its cultivated quail.

Mike’s Book about The Future of Food

The four future visions in Mise (pronounced “meez”), which range from the year 2033 to 2067, were created to help people understand the potential long-term impact on our food system of things happening in the world today. The book identifies five major forces in society, technology, the economy, the environment, and politics (abbreviated and referred to as the STEEP factors) that will have a profound impact on the way the world produces and consumes food.

April 1, 2024

When It Comes to Making Generative AI Food Smart, Small Language Models Are Doing the Heavy Lifting

Since ChatGPT debuted in the fall of 2022, much of the interest in generative AI has centered around large language models. Large language models, or LLMs, are the giant, compute-intensive models powering the chatbots and image generators that seemingly everyone is using and talking about nowadays.

While there’s no doubt that LLMs produce impressive, human-like responses to most prompts, the reality is that most general-purpose LLMs struggle with deep domain knowledge around things like, say, health, nutrition, or the culinary arts. Not that this has stopped folks from using them, with occasionally bad or even laughable results when we ask for a personalized nutrition plan or a recipe.

LLMs’ shortcomings in creating credible and trusted results around those specific domains have led to growing interest in what the AI community is calling small language models (SLMs). What are SLMs? Essentially, they are smaller and simpler language models that require less computational power and fewer lines of code, and often, they are specialized in their focus.

From The New Stack:

Small language models are essentially more streamlined versions of LLMs, in regards to the size of their neural networks, and simpler architectures. Compared to LLMs, SLMs have fewer parameters and don’t need as much data and time to be trained — think minutes or a few hours of training time, versus many hours to even days to train a LLM. Because of their smaller size, SLMs are therefore generally more efficient and more straightforward to implement on-site, or on smaller devices.

The shorter development/training time, domain-specific focus, and the ability to put on-device are all benefits that could ultimately be important in all sorts of food, nutrition, and agriculture-specific applications.

Imagine, for example, a startup that wants to create an AI-powered personalized nutrition coach. Key features of such an application would be an understanding of the nutritional building blocks of food, personal dietary preferences and restrictions, and instant on-demand access at all times of day. A cloud-based LLM would likely fall short here, partly because it wouldn’t have up-to-date information on the various food and nutrition building blocks, and partly because it tends to be more susceptible to hallucination (as anyone who’s prompted an AI chatbot for recipe suggestions knows).

There are a number of startups in this space creating focused SLMs around food and nutrition, such as Spoon Guru, that are trained on specific nutrition and food data. Others, like Innit, are building food- and nutrition-specific data sets and an associated AI engine, what they term Innit LLM validator models, which essentially put food and nutrition intelligence guardrails around an LLM to make sure its output is good information and doesn’t offer, as Innit CEO Kevin Brown has suggested is possible, a recommendation for “Thai noodles with peanut sauce when asking for food options for someone with a nut allergy.”

The combination of LLMs for general conversational competency with SLMs for domain-specific knowledge around a subject like food is the best of both worlds: it pairs the seemingly realistic interaction capability of an LLM trained on vast swaths of data with the savant-like specificity of a language model focused on the particular domain you care about.
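To make the guardrail idea concrete, here is a minimal sketch of how a domain-specific validator might sit between an LLM and the user. This is purely illustrative: the allergen table, function names, and retry loop are my own assumptions, not Innit’s validator models or the BLADE architecture.

```python
# Illustrative sketch: a small domain "validator" gating an LLM's output.
# The allergen table and every function here are hypothetical examples.

ALLERGEN_INGREDIENTS = {
    "nut": {"peanut", "peanut sauce", "almond", "cashew"},
    "shellfish": {"shrimp", "crab", "lobster"},
}

def validate_suggestion(suggestion: str, user_allergies: list[str]) -> bool:
    """Return True only if the dish mentions none of the flagged allergens."""
    text = suggestion.lower()
    for allergy in user_allergies:
        for ingredient in ALLERGEN_INGREDIENTS.get(allergy, set()):
            if ingredient in text:
                return False
    return True

def guarded_recommendation(llm_generate, user_allergies, prompt, retries=3):
    """Ask the LLM, then gate each answer through the domain validator."""
    for _ in range(retries):
        suggestion = llm_generate(prompt)
        if validate_suggestion(suggestion, user_allergies):
            return suggestion
        # Feed the failure back so the next attempt can avoid it.
        prompt += f"\nDo not suggest: {suggestion} (contains a flagged allergen)."
    return "Sorry, I couldn't find a safe suggestion."
```

In a real system, the string-matching check would itself be a small, trained language model with proper nutrition data behind it, but the control flow is the point: the general-purpose LLM generates, and the domain layer decides whether the answer is allowed through.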

Academic researchers have created a model for fusing LLMs and SLMs to deliver this peanut-butter-and-chocolate combination, called BLADE, which “enhances Black-box LArge language models with small Domain-spEcific models. BLADE consists of a black-box LLM and a small domain-specific LM.”

As we envision a food future of highly specific specialized AIs helping us navigate personal and professional worlds, my guess is that the combination of LLM and SLM will become more common in building helpful services. Having SLM access on-device, such as through a smartwatch or phone, will be critical for speed of action and accessibility of vital information. Most on-device SLM agents will benefit from persistent access to LLMs, but hopefully, they will be designed to interact independently – even with temporarily limited functionality – when their human users disconnect by choice or through limited access to connectivity.

March 12, 2024

Announcing The Food AI Co-Lab, a New Collaboration Between The Spoon & Future Food Institute

If there was one thing we learned when we held the first-ever Food AI Summit last October, it’s that pretty much every food company believes its business will fundamentally change due to artificial intelligence.

Whether it’s companies building farm equipment, managing food supply chains, launching new grocery shopping formats, or creating new quick-service restaurant chains, no one along the food value chain will remain untouched by the rapid pace of change brought on by AI. In other words, we are in a once-in-a-generation rethink of business as usual, a tectonic shift that demands company leaders continuously learn, strategize, and collaborate to make sure their companies survive and even thrive into the future.

Because of this, we realized that we wanted to find a way to bring together our community and others within the food system to talk about the different impacts AI is having across various parts of the food system more than once a year. While we loved the fact that the big ideas shared at the Food AI Summit have already resulted in new partnerships and collaborations, we wondered whether bringing folks together more regularly – on a monthly basis or even more frequently – might have an even bigger impact.

Luckily for us, one of my favorite organizations – the Future Food Institute, led by Sara Roversi, one of the most consequential leaders in the future food space – had a similar idea. So when Sara approached me about joining forces in a new collaborative organization to do just that, I jumped at the chance.

So, alongside the FFI, I am super excited to announce today the launch of the Food AI Co-Lab!

What is the Food AI Co-Lab? It’s a collaboration that aims to be a meeting space and learning center for leaders who are building the future of food through artificial intelligence. We will explore different topics, engage with our community, and provide information such as industry surveys about what people are doing at the intersection of food and AI.

To kick things off, we will host monthly industry-focused meetings with thought leaders applying AI across various parts of the food system. Soon, we will also announce in-person events in the US and Italy where the community can get together, network, learn, and build their own collaborations.

If you’d like to join us on this journey, we encourage you to join our LinkedIn group and also register for our first virtual event, AI & The Future of Food, which will take place next Tuesday, March 19th. At that event, we’ll interview two thought leaders: Dr. Patrick Story, a professor of Philosophy at Cal Poly, San Luis Obispo, participating in a National Science Foundation-funded project analyzing the impact of automation and AI on the food system, and Kevin Brown, the CEO of Innit, a company building a platform that plugs into generative AI large language models to make them more food “fluent” and power AI-assisted food knowledge systems and services.

I hope to see you there, and I am excited to work with you to learn, collaborate, and build the future with the Food AI Co-Lab!
