The Spoon

Daily news and analysis about the food tech revolution


machine learning

January 8, 2020

CES 2020: Stratuscent’s Digital Nose Can “Smell” When Crops are Ripe or Food is Burning

Something good-smelling must be in the air at CES this week, because digital noses are becoming a bit of a thing at this year’s tech expo. Yesterday I dropped by the booth of Stratuscent, a Montreal, Quebec-based startup which is digitizing scents to detect freshness.

The company’s sensors, called eNoses, detect chemicals in the air to create a scent print — like a fingerprint for a smell. According to CEO David Wu, who gave me a tour, Stratuscent’s “secret sauce” is its superior AI and machine learning, which can quickly and accurately determine any number of complex scents, even ones too tricky for humans to smell. The company’s tech came from NASA, where it was originally used for leak detection.
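To make the "scent print" idea a bit more concrete, here is a minimal, hypothetical Python sketch of how a chemical sensor array reading might be matched against a library of known smells. Stratuscent's actual models are far more sophisticated (and proprietary); every sensor value, label, and the nearest-match approach below are assumptions for illustration only.

```python
# Hypothetical sketch: a chemical sensor array produces a vector of responses,
# and a classifier maps that vector to the closest known "scent print".
# All sensor values and labels here are made up.
import numpy as np

# Pretend library of known scent prints (rows = scents, values = sensor channels).
known_prints = {
    "ripe_banana":   np.array([0.82, 0.10, 0.45, 0.05]),
    "burning_sauce": np.array([0.15, 0.90, 0.30, 0.70]),
    "fresh_milk":    np.array([0.20, 0.05, 0.10, 0.02]),
}

def identify(reading: np.ndarray):
    """Return the closest known scent and a rough confidence score."""
    scored = {
        name: np.linalg.norm(reading - print_)   # distance in sensor space
        for name, print_ in known_prints.items()
    }
    best = min(scored, key=scored.get)
    # Convert distance to a crude 0-100% "confidence" for illustration only.
    confidence = 100.0 / (1.0 + scored[best])
    return best, round(confidence, 1)

print(identify(np.array([0.18, 0.88, 0.28, 0.66])))  # ('burning_sauce', 94.6)
```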

The eNose is pretty simple to use. Just wave the product in question under the eNose and it will determine what it is — along with a confidence percentage — in under thirty seconds. You can see Wu demonstrating the technology below:

CES 2020: Stratuscent's eNose is a Digital Smelling Machine

Wu told me that Stratuscent’s noses have a variety of applications, including sniffing ethylene, a chemical that indicates spoilage, in crop shipments. They’re also working with a dairy company to detect milk freshness. In the home, Wu told me that the eNose could also be integrated into smart kitchen appliances to identify cooking stages (your sauce is about to burn!) and alert users to food spoilage.

Stratuscent was founded in 2017 and has raised $4.3 million thus far. Wu said that in addition to its partnership with a dairy company, Stratuscent is pushing further into the food and agriculture space, and is also in conversations to work with indoor agriculture farmers.

Stratuscent isn’t the only player digitizing smell technology (what a world). Yesterday Chris wrote about Aryballe’s new Digital Nose 2.0, which also debuted at CES this week and also digitizes scent to detect freshness, cooking smells, etc.

Regardless, the digital scent landscape is just beginning to emerge. As foodborne illness outbreaks grow — and consumers become more conscious about reducing home food waste — I think there will be a growing market for this sort of technology. Which means there’s ample opportunity for more than one player to nose its way into the digital smelling space.

August 23, 2019

Ai Palette Raises $1.1M in Seed Funding for its Food Trend Prediction Platform

Ai Palette, a Singapore-based startup that uses machine learning and artificial intelligence to predict trends for food companies, announced it has raised S$1.45 million (roughly $1.1 million USD) in seed funding. The blog e27 first reported the news, writing that the round was led by Decacorn Capital with participation from the Singapore government’s SGInnovate and AgFunder, as well as existing investor Entrepreneur First.

According to Ai Palette’s website, the company uses “Artificial Intelligence and Machine Learning to draw insights from millions of data points to identify consumer needs in real time and combine it with the brand personality to create winning food product concepts.”

Using machine learning and AI to help food and CPG companies capitalize on emerging trends is an already crowded field. Companies like Spoonshot, Analytical Flavor Systems, Tastewise and Halla all use various implementations of machine learning and AI in this capacity.

The goal for all of these companies is to speed up the R&D process for food producers. The sooner a food trend can be spotted, the faster the process can be to get it to market. Additionally, AI allows companies to suck up even more data to provide more granular predictions based on things like regional preferences, or even come up with new flavor combinations that might not ordinarily be tried.

Ai Palette says it will use its new funding to scale up development and grow out its customer base across multiple markets.

June 27, 2019

Will Punchh’s New ML-Based Customer Lifetime Value Predictor Create More Data Darwinism?

Punchh, a software startup that creates digital marketing tools for physical retailers like restaurants, announced the release of its new machine learning-based “Predictive Customer Lifetime Value” (PCLV) application this week. But will this new technology just be another avenue for data darwinism?

CLV is a metric companies use to predict how much money they can reasonably expect from any one customer. The concept certainly makes you wonder whether restaurants are feeding you meat, or whether you’re the meat feeding the restaurant. Regardless, it’s something restaurants are using more. Fellow restaurant app-maker Toast did an explainer post on CLV a while back.

From the Punchh press release:

From the moment a customer makes their first purchase, Punchh instantly predicts their CLV, then constantly refines that prediction as the relationship between the brand and customer deepens. Based on that PCLV, retail marketers can create target segments with this data to, for example, encourage high CLV segments to enroll in rewards programs while offering low CLV segments incentives through coupons.
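As a rough illustration of the segmentation logic described in the release, here is a hypothetical Python sketch: predict a customer's lifetime value from early purchase behavior, then sort customers into marketing buckets. The features, thresholds, toy data, and model choice are invented for illustration and are not Punchh's actual system.

```python
# Hypothetical sketch of CLV-based segmentation: fit a simple model on early
# behavior vs. observed 12-month spend, then bucket new customers by prediction.
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy training data: [first_ticket_$, visits_in_first_30_days] -> observed 12-month spend.
X = np.array([[8, 1], [15, 2], [30, 4], [45, 6], [12, 1], [60, 8]])
y = np.array([40, 120, 450, 900, 70, 1400])

model = LinearRegression().fit(X, y)

def segment(first_ticket: float, early_visits: int) -> str:
    pclv = model.predict([[first_ticket, early_visits]])[0]
    if pclv >= 500:
        return "high CLV: invite to rewards program"
    elif pclv >= 100:
        return "mid CLV: standard campaigns"
    return "low CLV: winback coupons"

print(segment(50, 5))   # likely lands in the high-CLV bucket
print(segment(9, 1))    # likely lands in the low-CLV bucket
```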

While this PCLV may be useful to a restaurant for marketing purposes, it also feels like more data darwinism, as if my purchases will determine a company’s level of interest in me. If a restaurant predicts from my very first purchase that I’m a low-ticket customer, will it just ignore me? Or will I get worse service? I asked Punchh about this, and Xin Heng, Punchh’s Senior Director of Data, sent me the following response:

Xin: They won’t be ignored, they will just be put in a different bucket (or segmentation). In other words, low CLV customers will be continuously monitored and treated with winback campaigns. But those who are outside this segment can be subject to games, compression campaigns, a referral callout and more. It’s just about segmentation, but every customer is consistently monitored regardless of what segment they fall into.

Great(?), a restaurant will still be sending me emails no matter how much–or little–I spend! The company is billing the PCLV tool as a restaurant’s virtual data scientist, but it seems like moneyballing me could be just another way that data ruins dining out with too many predictions about my behavior.

We’ll soon see, as Punchh works with more than 160 brands including Pizza Hut, Del Taco, Denny’s and TGI Friday’s. The company has raised $31 million in funding and earlier this month opened up an engineering hub in Toronto.

May 15, 2019

Halla Raises $1.4M Seed Round, Pivots to Focus on AI-Powered Grocery Recommendations

Halla, a startup that uses machine learning and AI to power food recommendations for grocery shoppers, announced today that it has raised a $1.4 million seed round led by E&A Venture Capital with participation from SOSV. This brings the total amount of money Halla has raised to $1.9 million.

Halla has a B2B platform dubbed Halla I/O that helps recommend relevant food products to shoppers. As we wrote at the time of Halla’s launch last year, the “company created an entirely new model and a new taxonomy that doesn’t just look at what a food item is, but also the molecules that make it up, a map of attributes linked to other food as well as how people interact with that food.”

So if you are using a grocer’s app with Halla I/O built in, the app will serve up intelligent recommendations as you shop online. Buy salt, and it could recommend pepper. Buy salt, noodles, and beef, and it might guess that you are making a bolognese and recommend tomato sauce.
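For a rough sense of how basket-based recommendations like this can work, here is a minimal Python sketch that scores items by how often they co-occur with what is already in the cart. Halla's actual taxonomy is far richer (ingredient attributes, molecular data, shopper behavior); the baskets and scoring below are invented for illustration only.

```python
# Hypothetical sketch: recommend items that historically co-occur with the cart.
from collections import Counter
from itertools import combinations

past_baskets = [
    {"salt", "pepper", "olive oil"},
    {"salt", "noodles", "ground beef", "tomato sauce"},
    {"noodles", "ground beef", "tomato sauce", "parmesan"},
    {"salt", "pepper"},
]

# Count how often each pair of items appears together in a basket.
pair_counts = Counter()
for basket in past_baskets:
    for a, b in combinations(sorted(basket), 2):
        pair_counts[(a, b)] += 1

def recommend(cart: set, top_n: int = 3):
    scores = Counter()
    for item in cart:
        for (a, b), count in pair_counts.items():
            if item == a and b not in cart:
                scores[b] += count
            elif item == b and a not in cart:
                scores[a] += count
    return [item for item, _ in scores.most_common(top_n)]

print(recommend({"salt"}))                            # pepper comes out on top
print(recommend({"salt", "noodles", "ground beef"}))  # tomato sauce comes out on top
```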

If you read our coverage of Halla last year, you’d notice something different about the company now. Initially, its go-to market strategy included both grocery stores and restaurants. But in the ensuing year, Halla has abandoned its pitch to restaurants, choosing instead to focus exclusively on grocery retail.

“What we’ve found is that the market timing was screaming ‘where tech meets grocery,'” Halla Co-Founder and CEO Spencer Price told me by phone recently, “The restaurant space is more crowded for building recommendations.”

But all that work in the restaurant space didn’t go to waste. “The truth is we were able to keep all of our learnings from restaurant and made our grocery recommendations stronger,” Price said. “One core learning is that restaurant dishes and menu items, as long as they have descriptions, are just recipes without instructions.”

Halla now has more than 100,000 grocery items and one hundred million unique grocery transactions from retailers across the country in its data set, informing its machine learning algorithms. Price is quick to point out that Halla does not have any personally identifiable information on people. “We can make recommendations to customer X without knowing who customer X is,” Price said.

Though a grocery chain can move a lot of product and provide a lot of data for better purchasing recommendations, grocery chains as a whole do not move quickly. To get them to adopt a new technology is like turning a battleship — they need a lot of time to execute. “They’re not looking for speed,” Price said, “but a reliable solution.”

To this end, the biggest thing Halla’s funding buys them is time. “We’ve bought some runway,” said Price. The company now has some breathing room to take its time and conduct even more tests with slow-moving retailers. Halla is in tests with unnamed grocers right now, and offers its recommendations through an API on a pay-per-call model.

AI-based B2B food recommendation is almost its own mini-industry. Spoonshot, Analytical Flavor Systems, and Tastewise all use vast data sets to make product predictions and recommendations to restaurants and CPG companies. Other companies like AWM Smart Shelf are using a combination of prediction and smart digital signage to make in-store grocery purchase recommendations.

With online grocery shopping reaching a tipping point, shoppers adding one or two extra items to their carts because of intelligent recommendations could mean a nice sales boost for the grocery industry.

February 4, 2019

McCormick and IBM are Using AI to Develop Better-Tasting Spices

Today spice giant McCormick announced that it is partnering with IBM to create a research coalition to explore how artificial intelligence (AI) can improve flavor and food product development.

According to the press release, McCormick will use IBM Research AI for Product Composition to “explore flavor territories more quickly and efficiently” and “predict new flavor combinations from hundreds of millions of data points across the areas of sensory science, consumer preference, and flavor palettes.” In short: McCormick is applying IBM’s AI/machine learning power to their own taste data in an effort to develop better-tasting products more quickly — and with fewer duds.

The first product platform, called “ONE,” will debut mid-2019, and will include a set of one-dish Recipe Mixes, which I’m assuming are spice packets. The mixes are meant to season both a protein and a vegetable and come in flavors like Tuscan Chicken and New Orleans Sausage. McCormick is aiming to have them on grocery shelves by spring of this year.

To learn more about the ONE platform, we spoke with McCormick’s Chief Science Officer Hamed Faridi. “This technology uses multiple machine learning algorithms that are trained on information, including hundreds of millions of data points across the areas of sensory science, decades of past McCormick product formulas and information related to consumer taste preferences and palettes,” he told the Spoon. “What distinguishes the new system is its ability to learn and improve every time [it] is being used by our product and flavor developers.”

Unlike McCormick’s consumer-facing Flavorprint, which drew on recipe search histories to recommend new flavors and recipes, the ONE platform is purely internal. However, Faridi made it clear that the ONE platform would not replace consumer taste testing. “AI can’t taste flavors in the same way a human can,” he said. Still, it will seriously speed up new product development. Faridi said the AI system would let McCormick create new flavors up to three times faster, giving the company more agility to quickly develop products that take advantage of new dining trends.

Anytime the term “AI” — or the even trendier “machine learning” — is used by a Big Food company or fast food chain, it’s wise to take it with a grain of salt. As buzzwords, AI and machine learning can sometimes be more of a marketing gimmick than a value add.

That’s not to say that there aren’t several companies successfully leveraging AI to improve flavor. In fact, last year I wrote a piece predicting that services combining flavor and AI would be a new food tech trend. Analytical Flavor Systems has an AI-powered flavor prediction platform to help companies develop new food products with less trial and error. Plantjammer uses AI to help home cooks make better plant-based dishes. And Foodpairing applies AI to its flavor database to help professional chefs develop more innovative recipes. But these are smaller, tech-driven startups that have built their service based around AI from the beginning. For Big Food, AI is sometimes as much of a promotional tool as an actual service.

Since McCormick is working with IBM, its new platform seems like a more serious effort than, say, Domino’s. But is McCormick, as it states in the aforementioned press release, “ushering in a new era of flavor innovation and changing the course of the industry”? Probably not. But then again, I haven’t tried that New Orleans Sausage recipe mix.

December 6, 2018

Amazon Go is On a Massive Hiring Spree, and Not Just in the U.S.

Amazon Go, the retail store that uses cashierless technology so you can walk in, choose your items, and walk out without stopping to pay, has 338 open listings on its job site (big h/t to Sean Butler).

There are a few takeaways from this, but most notable is the sheer amount of investment in engineers on both the software and hardware side. There are a whopping 130 positions in software development, and 44 in hardware development.

But that’s just the start. While the majority of the listings are for Software Engineers, the listings also include everything from Data Collection Technician to Creative Director to Security Engineer to Senior Vision Research Scientist. There are even 7 listings for real estate and construction positions. Whew!

They’re also searching for a Specification Technologist to join the Amazon Meal Kits team and help out with product development. Meal kits are already some of the most popular items at Go stores, so it’s not surprising that Amazon is looking to amp up its offerings, especially as they expand into new cities.

Many of the jobs are quite recent, and were either posted or updated within the past month. Which means that Amazon is poised to make some serious Amazon Go expansion moves in the new year, and willing to invest some serious man (and woman) power to do it. Good thing too, since the company is considering a plan to open 3,000 Go stores by 2021.

It’s also worth noting where the Go jobs are located. While the locations don’t necessarily indicate where Amazon will set up future Go stores, it’s a good data point to learn where they will base R&D and development of their cashierless technology.

In the U.S. there are openings in Seattle (duh), Westborough, MA, San Francisco, and New York City. Abroad, there are listings in two cities in Israel: Tel Aviv and Haifa.

Perhaps most eye-catching on the list is Westborough, MA. That’s the home of Amazon Robotics, a subsidiary which works on Amazon’s mobile robotic fulfillment systems. According to job descriptions, that’s also where Amazon is building an Advanced Projects Group, which will develop “new technologies that go well beyond the current state of the art.”

The location is certainly strategic from a hiring standpoint: Westborough is less than an hour outside of Boston, making it an easy way to recruit tech-savvy post-grads from MIT and Harvard. I’m speculating here, but the Westborough job listings, with their proximity to Amazon Robotics, could also indicate plans on Amazon’s part to add more robots to its Go store experience.

Outside of the U.S., Amazon Go is hiring in Israel. This could simply be a way for Amazon to take advantage of Israel’s flourishing AI landscape and hire some top-notch computer scientists. But it could also indicate that Amazon is ready to expand its Go stores internationally.

It wouldn’t be the first company to bring cashierless tech to Israel. Trigo Vision recently partnered with Israel’s largest supermarket chain, Shufersal, to implement its checkout-free tech in all locations across Israel. However, Trigo Vision and Amazon aren’t direct competitors: Trigo licenses out its tech to existing retailers, while Amazon builds its Go stores from the ground up.

Of course, even outside of Israel Amazon still has plenty of competition in the cashierless tech space. Microsoft has been working on its own version and has reportedly been in partnership talks with Walmart. In San Francisco, Aipoly is developing its own walk-in-walk-out store solution and Standard Cognition recently opened up a store in San Francisco to show off its technology.

Which is all the more reason that Amazon needs to grow fast if it wants to keep up its unique value proposition in the food retail space. The high number of job listings, and their wide geographic reach, show that when it comes to Go stores (and most things grocery, in fact), Amazon isn’t slowing down anytime soon. Now we just have to wait and see when they launch a cashierless Whole Foods.

Thanks to Sean Butler, who posted on his Linkedin about Amazon Go’s massive open jobs list. Do you have a tip for us at the Spoon? We’d love to hear it. 

November 6, 2018

Taranis Harvests $20M for Aerial Imaging Tech that Detects Crop Diseases

Today crop threat detection company Taranis announced that they closed a $20 million Series B funding round led by Viola Ventures, with participation from existing investors Finistere Ventures, Vertex Ventures, and others. This latest round brings the company’s total funding to $30 million.

Founded in 2014 and based in Tel Aviv, Taranis uses aerial imaging to help farmers monitor their acreage for crop threats and irregularities, such as disease, weeds, soil nutrition, and harmful insects.

To do this, the startup combines multi-spectral imagery gathered from satellites, planes, and drones to keep constant tabs on farmers’ fields. The image resolution is so high, according to Taranis co-founder and CEO Ofir Schlam, that they can see a single beetle on a single leaf.

All images (they’ve captured 2 million in the past year alone) are uploaded into the Taranis database, which then analyzes them and creates a synthesized report of any potential threats/problems they “see”. Farmers can access said report through Taranis’ mobile app or via a web browser and decide how best to remedy any issues.

A soy plant with a potassium deficiency.
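To illustrate the reporting step, here is a hypothetical Python sketch that rolls per-image detections (which, in Taranis' case, would come from models analyzing aerial imagery) up into a field-level summary a farmer could act on. The fields, issues, and severity scores below are all invented.

```python
# Hypothetical sketch: aggregate per-image detections into a field-level report.
from collections import defaultdict

detections = [
    {"field": "north-40", "issue": "broadleaf weeds", "severity": 0.7},
    {"field": "north-40", "issue": "broadleaf weeds", "severity": 0.4},
    {"field": "north-40", "issue": "potassium deficiency", "severity": 0.9},
    {"field": "creek-side", "issue": "aphids", "severity": 0.2},
]

def field_report(detections, alert_threshold=0.5):
    by_field = defaultdict(list)
    for d in detections:
        by_field[d["field"]].append(d)
    report = {}
    for field, items in by_field.items():
        worst = max(items, key=lambda d: d["severity"])
        report[field] = {
            "issues_found": sorted({d["issue"] for d in items}),
            "worst_issue": worst["issue"],
            "needs_attention": worst["severity"] >= alert_threshold,
        }
    return report

for field, summary in field_report(detections).items():
    print(field, summary)
```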

Pricing varies depending on the type of crop: high-value crops like sugar beets or potatoes, which require more scans, would cost farmers around $15 to $20 per acre per season. Lower-lift crops like soybeans, wheat, or cotton would cost them only $5 per acre per season.

Considering the average U.S. farmer has 444 acres, the price of Taranis’ service can really add up. However, Schlam was quick to emphasize that the return is 3 to 5 times the price, and that their technology can increase crop yields by up to 7.5 percent. As regulations around drones relax and open up in rural areas, he also expects that they’ll be able to reduce their price.

While I’m wary of any company that claims to do anything so drastic as increase crop yields across the board by 7.5 percent, Taranis seems to have the technology and team to back it up. Schlam has a background in tech and intelligence, and another cofounder came from aerospace engineering. He told me that the company has been awarded three patents and has 24 others pending. Earlier this year, Taranis also acquired Mavrx, a San Francisco-based agricultural aerial imaging platform.

As of now, the company has around 60 contracted agronomists who manually train the system to identify problems. However, once the technology has learned that a certain discoloration equates to, say, a potassium deficiency, it can make the connection on its own from then on. Bad for any future agronomists, good for farmers.

Taranis isn’t alone in using technology to make farmers’ lives easier. On the ground, companies like CropX (also based in Tel Aviv) and Teralytic use wireless sensors to help farmers manage things like moisture and fertilizer in their fields. Most similarly, Walmart has filed a patent for an application which uses machine learning to monitor pests, though to our knowledge they haven’t done any pilots so far.

The startup currently works with 19,000 farms across 30 states in the U.S., with an additional footprint in Canada, Argentina, Brazil, Russia, Ukraine, and Australia. With their new funding, Schlam said Taranis would continue to scale up operations in their existing geographies. They’ll also invest in R&D so that their service can identify more types of pests and diseases.

July 30, 2018

Dishq Raises $400,000 Pre-Seed From Techstars and Arts Alliance

Dishq, the Bengaluru, India-based startup that uses artificial intelligence (AI) to deliver personalized food recommendations, announced today that it has raised a $400,000 “pre-seed” round of financing from several investors including the Techstars Farm to Fork accelerator and Arts Alliance. This brings the total amount of money raised by the company to $560,000.

Dishq combines a vast database of dishes and their attributes, as well as anonymized customer behavior analytics and food science research into a machine learning algorithm. The company’s AI platform plugs into any digital food business, such as Uber Eats or at an in-restaurant ordering kiosk, to provide recommendations for the end-user.

According to dishq’s Vasani, the company has 11 customers across 6 countries, spanning more than one thousand restaurants and foodservice locations. The software is powering 30 million recommendations per month, with more than 176,000 consumers receiving recommendations each month. For comparison, when I spoke with the company in January, it had 6 customers and was powering just two million recommendations a month.

In addition to Techstars and Arts Alliance, existing investors Zeroth and Artesian Venture Partners also participated in this round, as did The Syndicate Fund and angel investor Sven Hensen. Dishq says it’s going to use the new money to build out its engineering team and expand sales and marketing activities. In addition to the Bengaluru office, dishq will be growing its presence in London and just opened an office in St. Paul, MN, which is where the Techstars accelerator is located.

Now is a good time for dishq to bolster its coffers, as the AI-powered recommendation space is getting hot. Just yesterday, Halla announced its I/O platform for B2B food recommendations, and others such as FoodPairing and Plantjammer are bringing their machine learning to bear on predicting what people want to eat.

This money will also presumably help dishq get closer to its long-term goal of creating a real-time “taste analytics as a service,” which will help CPG brands better react to food trends as they are happening and take action.

Vasani spoke at our recent Smart Kitchen Summit: Europe, where he was on the panel Personalized, Shoppable and Guided: Recipes Are Not Dead. You can watch it here.

SKS Europe: Personalized, Shoppable and Guided: Recipes Are Not Dead from The Spoon on Vimeo.

May 23, 2018

Microsoft Gets Visual Food Logging Patent

Microsoft appears to be applying its computer vision and AI smarts to make watching what you eat easier. The Redmond giant was awarded a patent yesterday for “Food Logging From Images.” That basically means you can take a picture of your food and Microsoft will provide you with its nutritional information (calories, protein, vitamins, etc.).

Yesterday’s patent is a continuation of a previous Microsoft patent from May 2017 for “Restaurant-Specific Food Logging From Images.” Restaurants are called out specifically in this new patent because the Microsoft Food Logger would use GPS to know when you’re at a restaurant. From there, the Food Logger could use information from text menus online, via Yelp or a restaurant’s own site, to assess nutritional information.

The technology would supposedly also work outside restaurants, using image recognition to understand home cooked meals as well. And there are tools to allow the user to edit or correct any inaccuracies in what the Logger identifies. So if you slathered butter on a piece of bread, you could specify the amount.
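Here is a hypothetical Python sketch of the logging flow the patent describes: use location to prefer a restaurant-specific menu match when one exists, fall back to generic dish recognition otherwise, and let the user correct the result. The recognizer, menu data, and calorie numbers are stand-ins, not Microsoft's actual system.

```python
# Hypothetical sketch of a "food logging from images" flow.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LogEntry:
    dish: str
    calories: int
    source: str

# Toy nutrition data: restaurant-specific values (as if pulled from online menus)
# plus generic fallbacks for home-cooked meals.
MENU_NUTRITION = {
    ("Joe's Diner", "cheeseburger"): 780,
    ("Joe's Diner", "garden salad"): 220,
}
GENERIC_NUTRITION = {"cheeseburger": 650, "buttered toast": 180}

def recognize_dish(photo: bytes) -> str:
    """Stand-in for an image-recognition model."""
    return "cheeseburger"

def log_meal(photo: bytes, restaurant: Optional[str]) -> LogEntry:
    dish = recognize_dish(photo)
    # If GPS puts the user at a known restaurant, prefer its menu-specific data.
    if restaurant and (restaurant, dish) in MENU_NUTRITION:
        return LogEntry(dish, MENU_NUTRITION[(restaurant, dish)], f"{restaurant} menu")
    return LogEntry(dish, GENERIC_NUTRITION.get(dish, 0), "generic estimate")

entry = log_meal(b"...", restaurant="Joe's Diner")
entry.calories = 820   # the user can correct the estimate, e.g. extra butter
print(entry)
```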

The obvious use case for this patent is a mobile phone app, which is listed. However, Microsoft goes even further, saying this technology would work with camera-equipped glasses. From how the company describes it, if you walked into a restaurant wearing these hypothetical Food Logger glasses, you would get almost Terminator-like vision into the nutritional content of the meals people around you were eating.

The idea of taking a picture of food and automatically getting its nutritional content isn’t new. Apps like Lose It and Calorie Mama AI say they offer the same type of functionality. Samsung even recently added Calorie Mama’s technology into its Bixby virtual assistant.

Google, of course, has also been working on food recognition for a while. And this week it came to light that Google is reportedly adding human-powered food identification capabilities to Google Maps. Humans labeling pictures of food taken from different angles and in different conditions should help train Google’s image recognition algorithms.

Right now, this is just a patent for Microsoft, so who knows whether or how this will ever make it to market. But that market is huge, and it’s unlikely Microsoft will sit on the sidelines.

April 23, 2018

Intello Labs Uses AI to Help Farmers Get a Fair Price for Their Crops

When we talk about artificial intelligence (AI), we often speak in giant, world-shifting terms about revolutionizing a certain industry. But AI can also benefit a single person at a time. In the case of Intello Labs, its AI can be used to help prevent a poor farmer from getting screwed.

Food inspection is often still done manually. One person’s perfect tomato may be another’s piece of trash, and these basic biases can lead to an imbalance of power. A poor, rural farmer may not be educated on price points or on what “fresh” produce means to a buyer. As a result, they may want to sell tomatoes at a dollar apiece, but buyers may scoff, disputing the quality of those tomatoes, and only offer fifty cents. How are farmers to know how much the literal fruits of their labor are actually worth?

Intello Labs is working to help balance these scales through a combination of computer vision and artificial intelligence. Using their mobile phone app, the tomato farmer could take a picture of a bushel of tomatoes and upload it into Intello’s system. The company’s algorithms would examine the photo and give the tomatoes a rating based on a set of government (e.g., USDA) or other criteria. With this objective, algorithmic rating in place, each party in the negotiation knows the quality of the tomatoes being sold — and they can be priced accordingly.
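As a rough illustration of that rating step, here is a made-up Python sketch that maps simple image-derived features onto a grade. Intello's real system uses computer vision against formal criteria such as USDA standards; the features, weights, and thresholds below are invented.

```python
# Hypothetical sketch: combine image-derived quality features into a single grade.
def grade_tomatoes(avg_redness: float, blemish_ratio: float, size_uniformity: float) -> str:
    """All inputs are 0-1 scores that a vision model would produce from a photo."""
    score = 0.5 * avg_redness + 0.3 * (1 - blemish_ratio) + 0.2 * size_uniformity
    if score >= 0.85:
        return "Grade A (premium price)"
    if score >= 0.65:
        return "Grade B (standard price)"
    return "Grade C (discounted)"

print(grade_tomatoes(avg_redness=0.9, blemish_ratio=0.05, size_uniformity=0.8))  # Grade A
print(grade_tomatoes(avg_redness=0.6, blemish_ratio=0.30, size_uniformity=0.5))  # Grade C
```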

The company started with commodities like tomatoes and potatoes, but according to Sreevidya Ghantasala, Intello Labs’ Head of U.S. Operations, the company’s core technology can be customized for almost any food. It could be used to rate products like seafood and chicken, or even as a tool for plant disease identification. “We have a pest and disease application for six or seven different crops,” said Ghantasala. “Our system is highly customizable. If there’s something we don’t see in our library, we can update it in 2 to 3 months.”

Intello, which is headquartered in Bengaluru, India, has already gone live elsewhere in the country, working with 10,000 farmers at a farmers’ market in Rajasthan on wheat and grain analysis. The company has also worked with the Reliance Foundation in India to help 100,000 farmers with pest and disease detection for crops.

Pricing for Intello’s software is subscription-based, and Ghantasala wouldn’t provide specific numbers; she said cost depends on what is being analyzed and what users want to detect. The company was founded in May 2016 and has raised money from friends, family, and various accelerator programs. It now has 30 employees across offices in Bengaluru; Stockholm, Sweden; and Plano, Texas.

Intello isn’t the only one using computer vision and AI to generate objective food ratings. Here in the U.S., AgShift is using a similar mobile phone app to provide better data for food buyers in the supply chain to help reduce food waste. And grocery giant Walmart has implemented its own machine learning-based Eden technology to assess food freshness.

But according to Ghantasala, Intello’s ambitions go beyond food altogether. The company is working with oil and gas companies in Sweden to apply its computer vision to parts identification, and it wants to expand into hyperspectral imaging for more in-depth analysis.

Intello, it seems, wants to use its AI to change the world. But for now, it’s changing the world for one farmer at a time.

April 16, 2018

Motorleaf Uses AI to Predict Crop Yields for Indoor Farmers & Greenhouse Growers

Between unpredictable weather, pests, and degrading soil quality, farming is an extremely difficult way to make a living. Indoor farming, though less weather-dependent, carries its own set of burdens.

Montreal-based startup Motorleaf wants to lighten the lift for indoor farmers by improving crop yield predictions and optimizing growing conditions. The company hopes that their software, which CEO and co-founder Ally Monk likens to a “virtual agronomist,” will take the uncertainty out of farming.

To do this, Motorleaf first gathers data on the grow environment through machine vision, agricultural sensors, and historical information. It then applies algorithms and AI to help farmers determine the adjustments they need to make to the indoor grow environment to optimize their crop. Which means farmers can monitor CO2 levels, light spectrum, and other atmospheric conditions remotely through wireless devices or laptops.
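To make the prediction piece concrete, here is a minimal, invented Python sketch of the general approach: train a regression model on past greenhouse conditions and harvests, then estimate the coming yield from this week's sensor readings. Motorleaf's actual models and features are proprietary; everything below is an assumption for illustration.

```python
# Hypothetical sketch: predict greenhouse yield from sensor and historical data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Toy weekly records: [avg_temp_C, avg_CO2_ppm, light_hours, plant_age_weeks] -> kg harvested
X = np.array([
    [22, 800, 14, 10], [24, 900, 15, 12], [21, 750, 13, 9],
    [25, 950, 16, 14], [23, 850, 14, 11], [26, 1000, 16, 15],
])
y = np.array([120, 180, 100, 230, 150, 260])

model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

this_week = np.array([[24, 880, 15, 13]])  # current conditions from the sensors
print(f"Predicted tomato yield: {model.predict(this_week)[0]:.0f} kg")
```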

Customers can opt to install Motorleaf’s own hardware (a suite of IoT-enabled sensors), though they can also just connect Motorleaf’s software to a farm’s existing hardware to measure and remotely adjust environmental inputs. That interoperability makes Motorleaf an easy tool to install for larger greenhouse operations that already have their own monitoring hardware in place. “At the end of the day, we are a software company,” said Monk.

Motorleaf isn’t the only company helping indoor farmers manage the lifecycle of their crops. In fact, it’s not even the only company that sees itself as a “virtual agronomist.” What sets it apart, however, is its ability to predict crop yield. Monk claims that Motorleaf is the first company to use AI and machine learning to increase the accuracy of yield estimations.

This is a lot more important than an average person (read: the author) might think. Commercial greenhouses pre-sell produce before their harvest based on estimations given by agronomists — though they’re not always accurate. It’s extremely difficult to guarantee the quantity or quality of their crop before it’s harvested, and miscalculations can lead to loss of profits for both the buyer and producer, and also generate huge amounts of food waste.

Motorleaf claims that their software can cut yield prediction error by more than half — at least for some crops. Monk explained that each plant needs its own specialized software for yield prediction, likening farming to a recipe. “Maybe they think there’s a right recipe to growing kale; they need this many nutrients, this much light,” he explained. “We very strongly disagree with that. We think that any farming protocol needs to be dynamic, because if something happens in a greenhouse — which happens all the time — why would you stay rigid? You have to adapt.”

So far, their AI has only been proven to work for estimating tomato yield. However, they’re also deploying algorithms for peppers and silently developing technology for five other crops.

Photo: Motorleaf.

I was surprised to learn that indoor farming environments could be so volatile. After all, isn’t that the whole point of bringing them indoors? Apparently not. Monk explained that variable factors like sunlight, outside air temperatures, and human error can all affect greenhouse conditions. Even the plants themselves can do unexpected things that affect the greenhouse’s climate.

Motorleaf got CAD $100,000 from the FounderFuel accelerator in the summer of 2016, and later raised $850,000 (USD) for its seed round of funding. The startup is currently working with clients in Canada, the U.S., South Africa, South America, Mexico, the Netherlands, Poland, New Zealand, and the UK, and aims to be in Spain, France, and Germany by early 2019.

Monk concluded our call with what he called “a crazy thought,” one he had when he saw celebrity-branded color palettes. “Why can’t I have a Jamie Oliver taste palette? Why can’t I buy a radish that’s the exact kind he likes to cook with?” he asked. Farmers could use Motorleaf’s software to manipulate crops into having a certain taste and look, one that would be specific to, and branded by, celebrity chefs. Consumers could purchase produce with the same taste profile as those preferred by their favorite chefs, and even use them in those chefs’ recipes.

In the age of celebrity-branded meal kits and baking mixes, this idea isn’t too far-fetched. We’ve even seen companies like Bowery use AI to tweak the flavor, taste, and color of fruits and vegetables.

Motorleaf hasn’t started developing any of this technology yet, but Monk used it chiefly as an example to show how AI can open up “a whole slew of possibilities” for farming. He hopes that one of its applications will be to take the unpredictability out of farming, and put the power back in the hands of the growers.

March 30, 2018

Ingest.AI Unifies Disparate Data to Run Restaurants more Efficiently

If you learn one thing while covering restaurant software companies, it’s that there are a lot of restaurant software companies. Payment systems, HR, inventory management. Not to mention all of the software applications built on top of those like GrubHub, OpenTable, and a host of others.

The problem is that none of these systems talk to one another, so useful data sits in silos, unable to integrate and deliver holistic, business-wide insights for restaurants. The result can be inefficiencies that waste both human capital and food.

To solve this, Kenneth Kuo founded Ingest.AI (where he is CEO), a software layer that plugs into all these disparate restaurant systems, uses machine learning to extract data from each of them, and unifies it all into a single platform.

“We clean, classify and aggregate all the data to prep it for our second set of machine learning,” said Kuo.

Because Ingest.AI accesses and combines data from every part of the store, it can then tell a manager what will happen at a restaurant at any given time slice with “upwards of 90 percent accuracy.” This allows the manager to properly order the right amount of food and schedule staff accordingly.

And proper staffing can be a huge headache for a manager, especially in states with high minimum wages and punitive overtime laws. Ingest.AI can make dynamic staffing suggestions, deliver alerts when workers are nearing overtime, and schedule around that to ensure there aren’t too many or too few servers at any given time.

With its predictive analytics, Ingest.AI can also help in the back of the house with proper ordering. It knows when a particular ingredient is running low as well as how long it takes a vendor to make deliveries. With this info, Ingest.AI can automate the ordering so restaurants have enough inventory on hand in anticipation of busy times.
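As a rough sketch of that automated-ordering logic, here is a hypothetical Python example: given a forecast of daily usage, current stock, and a vendor's lead time, decide how much to reorder so the kitchen doesn't run out before the delivery arrives. The forecast function stands in for Ingest.AI's machine learning, and all the numbers are invented.

```python
# Hypothetical sketch of automated reordering driven by a demand forecast.

def forecast_daily_usage(ingredient: str) -> float:
    # Stand-in for an ML forecast; returns expected units used per day.
    return {"ground beef (lb)": 18.0, "buns (each)": 140.0}.get(ingredient, 0.0)

def reorder(ingredient: str, on_hand: float, lead_time_days: int,
            safety_days: float = 1.0) -> float:
    """Return how much to order now (0 if stock covers the lead time plus a safety buffer)."""
    daily = forecast_daily_usage(ingredient)
    needed_until_delivery = daily * (lead_time_days + safety_days)
    return max(0.0, needed_until_delivery - on_hand)

print(reorder("ground beef (lb)", on_hand=30, lead_time_days=2))   # order 24.0 lb
print(reorder("buns (each)", on_hand=600, lead_time_days=2))       # stock is sufficient -> 0.0
```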

Ingest.AI can also make smaller tweaks throughout the dining experience to increase incremental revenue. It will know, for example, that when parties of six or more people sit down, the first thing they do is order beer. The software will send out a notification to servers to suggest that the first thing they say to customers is “Hello, what kind of beer can we get for you, here’s what we have on tap…”

Restaurant managers aren’t typically data scientists, and connecting data from every aspect of the house all at once could quickly set them adrift in a sea of numbers. But Kuo is cognizant of this, and says he basically wants to answer two questions for the restaurant manager: “1. Did I make money? 2. Am I going to go under in the next week?”

You’d be forgiven for thinking this all sounds too good to be true: a magical AI layer that can talk to and predict just about anything in your restaurant, saving you time and money. It has a whiff of software snake oil. But Kuo has bona fides when it comes to artificial intelligence: prior to his startup, he worked on IBM Watson, using natural language processing to deliver personality insights.

There are two things that stand out for me when thinking about Ingest.AI. First, it has the capacity to replace a lot of other restaurant software startups out there. Ordermark unifies orders from different delivery services and Gebni provides dynamic pricing on menu items — but that’s all they do. Ingest.AI does those bits plus a lot more.

And second, honestly the food industry could be just the beginning for Ingest.AI. Every company I’ve worked for uses multiple software applications (Slack, Salesforce, Braintree, Workday, etc.) that don’t talk directly with one another. If Ingest.AI works as promised, there’s no reason it couldn’t expand beyond restaurants into any vertical.

But that is further down the road. Right now Ingest.AI is bootstrapped, based in Manhattan and was just inducted into the latest Food-X cohort. The company has nearly twenty customers paying anywhere from $150 – $250 per month for the service. Kuo says that it has a few deals in the pipeline and after that they will begin looking for a $1 – $1.5 million round of funding around November of this year.

So sure, there are a lot of restaurant software startups out there, but Ingest.AI seems like one to watch.
