A recycling robot named Clarke could be the key to reducing waste


Admit it — you’re not entirely sure how to recycle. It’s understandable, really. With so many different materials in play, how are you supposed to know what needs to be thrown into a landfill and what can be reused? Humans might not be the best at the Three R’s (reduce, reuse, and recycle, of course), but another “R” is here to save us — a robot, affectionately named Clarke.

Developed by AMP Robotics, this robot uses artificial intelligence to recognize and sort food and beverage containers. Clarke has already been deployed in a municipal waste facility in Denver, Colorado, where it helps out with the trash-sorting system. Using a visible-light camera, it can spot milk, juice, and food cartons and pull them out with its robotic arm and suction cups. These items are then diverted away from the landfill and sent instead to the appropriate recycling facility.
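Conceptually, that loop pairs a vision classifier with a pick-or-pass decision. The sketch below is a hypothetical, heavily simplified illustration of that logic; the labels, function names, and stub classifier are invented for this example and are not AMP Robotics' actual software.

```python
# Toy sketch of a Clarke-style detect-and-divert loop.
# The classifier and the "arm" are stand-ins, not a real robotics API.

RECYCLABLE = {"milk_carton", "juice_carton", "food_carton"}

def classify(item_image):
    """Stand-in for the AI vision model: returns a label string.
    In this toy version, the 'image' is simply its own label."""
    return item_image

def sort_item(item_image):
    """Divert recognized cartons to recycling; let everything else pass."""
    label = classify(item_image)
    if label in RECYCLABLE:
        return "recycling"   # robotic arm + suction cup pick
    return "landfill"        # item continues down the belt

belt = ["milk_carton", "plastic_bag", "juice_carton"]
print([sort_item(x) for x in belt])  # → ['recycling', 'landfill', 'recycling']
```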


With a reliable rate of 60 items a minute, Clarke picks up recyclable waste with 90-percent accuracy and is about 50 percent faster than a human doing the same job. Ultimately, that results in a 50-percent reduction in sorting costs.
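A quick back-of-envelope check of those figures (assuming "50 percent faster" means Clarke works at 1.5 times a human's rate, a reading the article doesn't spell out):

```python
# Derived numbers, not stated in the article.
clarke_rate = 60                     # items per minute
accuracy = 0.90                      # fraction picked correctly
human_rate = clarke_rate / 1.5       # if Clarke is 1.5x a human's pace
correct_picks = clarke_rate * accuracy

print(human_rate, correct_picks)     # → 40.0 items/min human pace, 54.0 correct picks/min
```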

“The fundamental platform that we’ve created was a system to sort pretty much all the commodities that are in a recycling facility today,” AMP Robotics founder Matanya Horowitz told Engadget. “Whether it’s cardboard, No. 1 plastics, No. 2 plastics, or cartons — cartons just ended up being a great place for us to start.”

But because Clarke is an AI-based system, the more it works, the smarter it gets.

“Even though this first system is picking cartons, it’s actually watching and learning from all the other commodities that it’s seeing as well,” Horowitz added. “That’s what’s really exciting. The more systems that we have out there, the better they’re going to be.”

In the future, the hope is to introduce more granularity to Clarke’s sorting abilities. “Right now we can say, ‘That’s a No. 1 plastic’ but we want to be able to say ‘That’s a Pepsi bottle, that’s a Gatorade bottle’ and give recycling facilities even finer resolution on what’s going through [their lines],” Horowitz explained.

So do your best to learn what’s recyclable and what’s not — but remember that if you mess up, Clarke may be able to save the day. Aren’t robots great?

Lulu Chang
Fascinated by the effects of technology on human interaction, Lulu believes that if her parents can use your new app…
6 things Gemini Intelligence is about to do across your Android devices

Google is bringing Gemini Intelligence to Android, putting the best of Gemini on its most capable devices. The company wants Gemini to handle your work throughout the day, all while you stay in control and keep your data private. Google is rolling out these features starting with Samsung Galaxy and Google Pixel devices this summer, and they will reach other Android devices, including watches, cars, glasses, and laptops, later this year.

Your assistant is about to get a lot more hands-on, without you having to ask twice

Google’s next Chrome update is a big deal for Android users

Gemini is clearly becoming the centerpiece of Google’s AI strategy, and that focus is now extending deep into Chrome on Android. Starting in June, Chrome is getting a fresh wave of AI-powered features built around Gemini, and the goal is pretty simple: turn your browser into something that actually helps you think, plan, and act, instead of just showing you pages.

Chrome is about to get a little too helpful in the best way

Rice grain-sized sensor could give robots a delicate touch and keep them from breaking stuff

Robots are incredibly precise, but being gentle is not always their strong suit. A machine that can build a car with near-perfect accuracy can still apply too much pressure when working in places where even the smallest mistake matters, like inside a human eye or during delicate surgery. That is why researchers at Shanghai Jiao Tong University are developing a new type of force sensor that could help robots “feel” what they are touching more accurately.

The sensor is tiny, about the size of a grain of rice at just 1.7 millimeters wide, making it small enough to fit inside advanced surgical tools. What makes it especially interesting is that it does not rely on traditional electronics. Instead, it uses light to measure force from every direction, including pressure, sliding movements, and twisting.

Here is how it works. At the tip of an optical fiber sits a soft material that slightly changes shape when it comes into contact with something. That tiny deformation alters how light travels through the sensor. The altered light pattern is then sent through optical fibers to a camera, which captures it like an image. Researchers then use a machine learning model to study those light patterns and translate them into precise force readings. In simple terms, the system learns how to “read” touch through light alone, without needing a bunch of wires or multiple separate sensors packed into such a tiny space.
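To make that idea concrete, here is a deliberately simplified sketch: synthetic one-dimensional "light intensity" readings that vary with applied force, and a least-squares fit that learns to map readings back to force. The real sensor works from full camera images of light patterns with a more sophisticated learned model; every number below is invented for illustration.

```python
import numpy as np

# Toy version of "reading force through light": deformation changes the
# light signal, and a fitted model inverts that relationship.
rng = np.random.default_rng(0)
forces = rng.uniform(0, 1, size=200)                    # applied forces (arbitrary units)
light = 0.8 * forces + 0.1 + rng.normal(0, 0.01, 200)   # light reading, with sensor noise

# Calibration step: least-squares fit mapping light reading -> force.
A = np.stack([light, np.ones_like(light)], axis=1)
coef, *_ = np.linalg.lstsq(A, forces, rcond=None)

def read_force(light_reading):
    """Translate a new light reading into an estimated force."""
    return coef[0] * light_reading + coef[1]

print(round(read_force(0.5), 2))  # a reading of 0.5 corresponds to a force near 0.5
```

The same calibrate-then-invert pattern generalizes to the camera-image case, with the linear fit replaced by a model trained on images of the light pattern.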
