The Spoon

Daily news and analysis about the food tech revolution


MIT

January 20, 2021

MIT Research Opens Up Potential of Lab-Grown Plants

By now Spoon readers are most likely familiar with the idea of lab-grown meat. After all, there are a number of companies around the world tackling the issue. But lab-grown plants? That’s something we haven’t heard as much about. However, it’s an idea that got a little boost this week, thanks to new research from MIT (hat tip: TechCrunch).

MIT News reports today that researchers at that university grew wood-like plant structures in a lab from cells extracted from Zinnia leaves. The process is akin to the way meat is grown in a lab: starter plant cells were placed in a growth medium, and then engineers added different hormones and compounds to “tune” the final structure created.

While still very early, this research and the work that follows it could have a real impact on agriculture and the way food is produced. Just as lab-grown meat looks to create actual meat without the environmental strains of raising animals, lab-grown plants could be grown anywhere, at any time, with less land and fewer inputs.

This new research is actually coming out at a fortuitous time in the agriculture industry. Indoor farms, or controlled ag facilities, are popping up across the country, changing the way our produce is grown. For instance, the first tomatoes grown in AppHarvest’s 60-acre indoor farm in Kentucky hit store shelves this week. The facility is projected to grow 45 million pounds of tomatoes every year.

What if, instead of just controlling a plant’s growing conditions, growers could also control the “manufacturing” of the plant itself, reducing growth time or developing different nutritional strains at the cellular level? Those controlled ag companies could truly “control” the entire process of growing food from the ground up.

That is still a ways off, but as we’ve seen with cultured meat, innovation in lab-grown food happens quickly.

October 28, 2019

SKS 2019: For the Future of Kitchen Design, Think Hydroponic Grow Cabinets and Robot Furniture

When you think about it, the basic design of a kitchen hasn’t changed much in the past 50 years. Most of them have a fridge, a sink, cabinets, a stove, an oven, and counters. Sure, there’s been innovation around smart appliances, but the layout of the kitchen itself has essentially remained the same.

But it doesn’t have to be that way. At SKS 2019, Veronica Schreibeis Smith of Vera Iconica Architecture and Suleiman Alhadidi of the MIT Media Lab spoke about how the kitchen is begging for a major design renovation to embrace evolving consumer needs.

If you’re at all interested in design, you should watch the whole video below. As a little teaser, here are some of the biggest takeaways from the discussion.

Kitchens can help you eat healthier
To make a kitchen more futuristic, we don’t necessarily need to transform everything into a robot. According to Schreibeis Smith, simple design solutions are all we really need to help people have more ease — and mindfulness — in the kitchen. That could take the form of climate-controlled cabinets to help preserve nutrients in food, or even hydroponic grow systems built into the kitchen itself.

For small kitchens, automation is key
We might not be headed towards a Jetsons-like robotic kitchen anytime soon, but that doesn’t mean that automation won’t play a part in the kitchen of the future. Alhadidi showed off his work at the MIT Media Lab, which is trying to create design solutions for millennials who live in small urban spaces and need all of their rooms to be multi-functional. Hint: the term “robotic furniture” comes up at least once.

If you want to learn more about how smart design can transform not only how we cook, but the kitchen itself, be sure to check out the whole video below.

SKS 2019: Designing for the Next 50 Years: Rethinking Kitchen Design

March 14, 2019

This Origami Robot was Built to Bag Groceries

Groceries are a bit of an odd duck when it comes to automation. They’re a bunch of irregular shapes with different weights, and oftentimes they are very delicate, so you can’t have just any robot hand trying to pick them up and move them around. A grocery-packing robot needs a certain… touch.

This was the inspiration for researchers at MIT and Harvard, who have developed an origami-inspired Magic Ball robot gripper that can gently pick up irregularly shaped objects and place them where they need to go. From the MIT Computer Science & Artificial Intelligence Lab blog post:

To give these soft robots a bit of a hand, researchers from MIT and Harvard have developed a new gripper that’s both soft and strong: a cone-shaped origami structure that collapses in on objects, much like a Venus flytrap, to pick up items that are as much as 100 times its weight. This motion lets the gripper grasp a much wider range of objects – such as soup cans, hammers, wine glasses, drones, and even a single broccoli floret.

You can check out the origami bot in action here:

Origami Robot Gripper
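
To make the idea concrete, here’s a toy sketch of the grasp behavior described in the quote above: rather than modeling an object’s shape, the cone simply collapses until the measured grip force crosses a threshold. This is not CSAIL’s actual controller; the sensor model, threshold, and step size are all hypothetical stand-ins.

```python
import random

FORCE_THRESHOLD = 2.0   # assumed grip-force cutoff (arbitrary units)
MAX_CLOSURE = 1.0       # 1.0 = origami skeleton fully collapsed

def read_grip_force(closure: float) -> float:
    """Hypothetical force sensor: resistance rises once the collapsing
    skeleton makes contact with the object."""
    contact_point = 0.6  # where this particular object starts resisting
    return max(0.0, (closure - contact_point) * 10.0) + random.uniform(0.0, 0.1)

def grasp(step: float = 0.05) -> float:
    """Collapse the skeleton in small steps until grip force says the
    object is held: no model of the object's shape required."""
    closure = 0.0
    while closure < MAX_CLOSURE:
        closure += step
        if read_grip_force(closure) >= FORCE_THRESHOLD:
            return closure
    raise RuntimeError("closed fully without detecting an object")

print(f"object gripped at closure {grasp():.2f}")
```

The appeal of this style of control is that one force threshold covers soup cans and broccoli florets alike; the gripper never needs to know what it’s holding.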

Funny enough, this breakthrough robot was built to bag groceries. From that same blog post:

“One of my moonshots is to create a robot that can automatically pack groceries for you,” says MIT professor Daniela Rus, director of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), and one of the senior authors of a new paper about the project.

There are actually a number of companies working on making robotic manipulation of groceries better. Soft Robotics has its “air actuated soft elastomeric end effectors” for delicate handling, and OpenAI has developed a robotic hand that can learn to do more human-like actions (like nestling groceries in a bag).

Rus’ breakthrough is coming at the right time. Robot-powered fulfillment centers are being built by a number of grocery outlets including Kroger, Albertsons and Ahold Delhaize. Right now, these robot systems are more like crates that shuttle food items around and deposit them with a human who takes the assembled food and places it into the appropriate order.

If this origami robot works as described, humans could be removed from the fulfillment equation altogether. This brings up a whole hornet’s nest of societal issues that we talk about often here, and is part of a much-needed larger discussion about the role of robots in human society. But for the purposes of this blog post, let’s just focus on the technology.

Using an origami picker in conjunction with an automated fulfillment center means your grocer can bag your order faster. Add in a self-driving delivery vehicle and you have an automated workflow that can run all day and night.

I’m sure this new origami gripper will come up during our chat with Albertsons on stage at our upcoming ArticulATE food robotics summit in San Francisco on April 16th. Get a grip on your ticket today!

February 8, 2019

Tech From MIT Uses RFID to Reveal Food Contamination

Given the job I have, my parents like to tell me about food tech-related news they come across. Last night they were trying to explain a story from CBS This Morning that aired yesterday, but they had trouble relaying it. “It’s a scanner, you use it at the market… something about E. coli and…”

Intrigued, I found the report they were talking about. It’s no wonder they couldn’t explain it: the story was vague and provided almost no details as to how the technology works. So, for my parents and anyone else who saw the CBS story and wanted a little more information, here are some details.

The technology in question is RFIQ (radio frequency IQ). Here’s a brief explainer from the MIT RFIQ research page:

Our system leverages RFID (Radio Frequency Identification) stickers that are already attached to hundreds of billions of objects. When an RFID powers up and transmits its signal, it interacts with material in its near vicinity (i.e., inside a container) even if it is not in direct contact with that container. This interaction is called “near-field coupling,” and it impacts the wireless signal transmitted by an RFID. Our system, RFIQ, extracts features from this signal and feeds it to a machine learning model that can classify and detect different types of adulterants in the container.

You can read the full RFIQ paper.
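
To make the pipeline in that description a little more concrete, here’s a minimal sketch of the signal-features-into-a-classifier idea: summary statistics are pulled from a complex-valued (I/Q) RFID capture and fed to an off-the-shelf classifier. The specific features, the synthetic data, and the choice of classifier are illustrative assumptions, not the actual RFIQ implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(iq_samples: np.ndarray) -> np.ndarray:
    """Summarize one RFID capture; near-field coupling with the container's
    contents shifts amplitude and phase, so statistics of both carry signal."""
    amplitude = np.abs(iq_samples)
    phase = np.unwrap(np.angle(iq_samples))
    return np.array([
        amplitude.mean(), amplitude.std(),
        phase.mean(), phase.std(),
        np.abs(np.fft.rfft(amplitude))[1:6].mean(),  # coarse spectral energy
    ])

# Synthetic stand-in data: an adulterant shifts the coupling slightly.
# Real training data would be labeled captures from actual tagged containers.
rng = np.random.default_rng(0)
def capture(amp, ph):
    return rng.normal(amp, 0.05, 256) * np.exp(1j * rng.normal(ph, 0.02, 256))

signals = [capture(1.0, 0.2) for _ in range(50)] + [capture(0.8, 0.5) for _ in range(50)]
labels = [0] * 50 + [1] * 50  # 0 = pure, 1 = adulterated

X = np.stack([extract_features(s) for s in signals])
clf = RandomForestClassifier(random_state=0).fit(X, labels)
print(clf.predict([extract_features(capture(0.8, 0.5))]))  # -> [1]
```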

According to the research overview, the technology can detect fake alcohol (like if methanol is mixed into a drink) with 97 percent accuracy, and tainted baby formula with 96 percent accuracy. In the CBS story, MIT Assistant Professor Fadel Adib said RFIQ could be used for a broader set of applications, including finding lead in water or E. coli on lettuce.

The bones of RFIQ sound akin to hyperspectral imaging, which studies how light reflects off objects to assess freshness, quality and foreign objects. But companies like ImpactVision and P&P Optica, which use hyperspectral imaging, don’t tout the technology as a way to detect foodborne illnesses.

The drawback to the RFIQ technology as it’s envisioned now is that, in order for it to work, each item has to have an RFID sticker on it, and the user has to carry around a small device that plugs into their phone to scan each item. This seems cumbersome and a big ask for food producers and consumers alike.

I’m sure Adib and his team have thought about this, and well beyond what I’m pondering. There is probably a more industrial-grade solution that could be implemented in bulk throughout the supply chain. The RFIQ technology is still five years out from reaching the market anyway, so who knows what breakthroughs and advancements the MIT team will make by then.

For now, I’m just happy that there are researchers tackling the problem of food contamination from different angles, and I’m happy to help fill in the blanks of my parents’ news watching.

July 26, 2017

MIT’s Pic2recipe Uses AI To Match Photos to Recipes

The smart folks at MIT have come up with a new food-technology AI breakthrough that has some interesting applications for meal prep and recipe curation. Called Pic2recipe, the system identifies recipes from images, delivering a list of possible matches and their ingredients. While it is a long way from launch, Pic2recipe has value not only as a standalone app; it could also become a key component of guided cooking systems.

Pic2recipe uses computer vision, a technology long associated with video search. Starting from a dataset—in this case, recipes from All Recipes and Food.com—the researchers annotated images with information that could then be identified and correlated to the matching recipe. To accomplish this final step, a neural network is trained to find patterns and make connections between pictures and recipes. At this point, Pic2recipe has a database of more than one million recipes.
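
For the curious, here’s a simplified sketch of the joint-embedding approach that paragraph describes: an image encoder and a recipe encoder are trained so matching image/recipe pairs land near each other in a shared vector space, and recipe lookup becomes a nearest-neighbor search. The layer sizes, encoders, and loss below are illustrative assumptions, not the MIT team’s actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

EMBED_DIM = 256  # assumed size of the shared image/recipe space

class ImageEncoder(nn.Module):
    """Maps an image tensor to a point in the shared embedding space."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, EMBED_DIM)

    def forward(self, images):                     # (B, 3, H, W)
        return F.normalize(self.fc(self.conv(images).flatten(1)), dim=-1)

class RecipeEncoder(nn.Module):
    """Maps a bag of ingredient/instruction token ids to the same space."""
    def __init__(self, vocab_size=10_000):
        super().__init__()
        self.bag = nn.EmbeddingBag(vocab_size, EMBED_DIM)

    def forward(self, token_ids):                  # (B, T) integer ids
        return F.normalize(self.bag(token_ids), dim=-1)

def pairwise_loss(img_vecs, rec_vecs, margin=0.3):
    """Pull each matching image/recipe pair together; push mismatches apart."""
    sims = img_vecs @ rec_vecs.T                   # (B, B) cosine similarities
    pos = sims.diag()                              # matched pairs on the diagonal
    neg = sims - torch.eye(sims.size(0)) * 1e9     # mask out the positives
    return F.relu(margin - pos[:, None] + neg).mean()

def rank_recipes(photo, recipe_vecs, image_encoder):
    """Retrieval: embed one photo, rank every stored recipe embedding."""
    q = image_encoder(photo.unsqueeze(0))          # (1, EMBED_DIM)
    return (q @ recipe_vecs.T).squeeze(0).argsort(descending=True)
```

At query time, every recipe in the database is embedded once; a new photo is then embedded and the closest recipes returned, which is how a single snapshot can surface a ranked list of candidate matches.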

“In computer vision, food is mostly neglected because we don’t have the large-scale datasets needed to make predictions,” MIT’s Yusuf Aytar said in a recent interview. “But seemingly useless photos on social media can actually provide valuable insight into health habits and dietary preferences.”

A test of what looks to be a beta version of the Pic2recipe system was hit and miss. This enhanced search engine was unable to match a salad or a vegetable stew to the correct recipe. On a third try, this one a picture of homemade minestrone soup, Pic2recipe fared far better: beef minestrone was fourth on the list with a 79% degree of confidence.

Moving forward, it’s easy to imagine Pic2recipe as an app that could feed into a guided recipe system. By showing the recipe ingredients, a user can make any modification necessary for a special diet. For example, the minestrone soup identified in the trial is actually vegan; the recipe shown in the search results can easily be modified by substituting ingredients. So, if you are in an Italian restaurant and snap a pic of your eggplant parmesan, once you make the recipe match, you can cut down on the salt and substitute a lower-fat cheese to make the dish healthier. Send that recipe to your guided cooking system and you are on your way to preparing it at home.

“This could potentially help people figure out what’s in their food when they don’t have explicit nutritional information,” adds Nick Hynes, a graduate student at MIT. “For example, if you know what ingredients went into a dish but not the amount, you can take a photo, enter the ingredients, and run the model to find a similar recipe with known quantities, and then use that information to approximate your own meal.”
