Metaspectral’s hyperspectral vision tech boosts plastics recycling accuracy
Key Highlights
- Metaspectral's technology uses hyperspectral cameras capturing 300 light frequencies to analyze material composition at a molecular level.
- Deep neural networks enable the system to detect subtle differences in plastics, such as variations caused by manufacturing processes or layered materials.
- The company has secured government funding to further develop methods for sorting challenging plastics, supporting sustainable recycling initiatives.
By Ron Shinn
Separating different types of plastics on a fast-moving recycling feed conveyor may not be rocket science, but it comes close.
Metaspectral, a Vancouver, B.C., high-tech company, has combined spatial and spectral analysis with real-time deep learning in its computer vision application, which can extract information at the sub-pixel level.
Metaspectral, founded in 2018, first applied the technology to separating waste plastic. It has since worked on applications for space, defense and agriculture.
“We learned about the challenges of sorting different kinds of plastic, primarily consumer packaging, and we started to research it a bit,” CEO Francis Doumet said. “I decided it was a good problem for us.”
The first project was to separate different grades of PET, such as the higher grades used in plastic bottles from the lower grades used in fruit containers. “From there, we gradually started working on more solutions,” he said.
The process eventually led to separating black plastics. Doumet said the company is currently working on separating shredded automotive residue.
Metaspectral’s intellectual property is in its software. It buys the hardware from various vendors, but the software will work with any hyperspectral camera and sensor platform.
The hardware can be purchased or leased from Metaspectral.
Doumet said the system is cost-competitive with other vision systems.
“We have competitors that are using the same sensors and cameras we are,” Doumet said. “What sets us apart is that we are able to figure out how to use very deep neural networks, which are basically very high order or very complex algorithms that can find relationships in the data that are otherwise hard to find.”
Using those relationships in real time allows Metaspectral to find features that would otherwise likely go undetected.
Doumet described one instance where a competitor’s vision unit failed to detect a PVC container, but the Metaspectral unit — installed on the line farther along than the competitor’s unit — found it and alerted the operator so the line could be stopped and the PVC removed.
“It happened to be a very thin layer of PVC that was sitting on top of something else,” Doumet said. “It is very possible that the other unit that was upstream from us saw the material that was sitting underneath and decided that it was the dominant material in the stack. But, in fact, there were different layers of material on top, and the top-most layer was the very thin piece of transparent PVC.”
Metaspectral hosts its Fusion platform, the software used to train models of different types of plastics, in the cloud. “It allows us to take in the data that the cameras see and build models to identify the different kinds of materials in them,” Doumet said.
The hyperspectral cameras capture far more data than the cameras on our phones, which see only three frequencies of light — red, green and blue. Hyperspectral cameras capture 300 frequencies of light and go deep into the infrared range that is invisible to our eyes.
“These cameras look at how light bounces back from the material at each one of these 300 different frequencies of light,” Doumet said. “That basically paints a scientific picture as to what the molecular composition of the material is.”
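The idea of matching a pixel's reflectance spectrum to a material can be illustrated with the spectral angle mapper (SAM), a standard baseline in hyperspectral analysis — this is a generic sketch, not Metaspectral's deep-learning approach, and the signatures below are synthetic stand-ins for measured reference spectra:

```python
import numpy as np

def spectral_angle(spectrum, reference):
    """Angle (radians) between two reflectance spectra; smaller = more similar."""
    cos = np.dot(spectrum, reference) / (
        np.linalg.norm(spectrum) * np.linalg.norm(reference)
    )
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def classify_pixel(spectrum, library):
    """Return the library material whose signature makes the smallest angle."""
    return min(library, key=lambda name: spectral_angle(spectrum, library[name]))

# Toy 300-band signatures (real ones would be measured from reference samples).
bands = np.linspace(0, 1, 300)
library = {
    "PET": np.exp(-((bands - 0.3) ** 2) / 0.01),
    "PP":  np.exp(-((bands - 0.7) ** 2) / 0.01),
}

# A noisy observed pixel whose spectrum resembles PET.
rng = np.random.default_rng(0)
pixel = library["PET"] + rng.normal(0, 0.02, 300)
print(classify_pixel(pixel, library))  # → PET
```

Because the angle ignores overall brightness, this kind of comparison is insensitive to illumination changes — one reason spectral shape, rather than raw intensity, is what identifies the material.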
Doumet described a recent successful test for a customer where Metaspectral’s system distinguished polypropylene (PP) that had been extruded from PP that had been injection molded.
“We did not think it was going to be possible, but there was in fact a difference,” Doumet said. “What that means to us is that the different processes the PP goes through alters the molecule a bit in the final product.”
Vision systems have always been susceptible to environmental conditions. The Metaspectral unit recalibrates and adjusts itself every hour for temperature changes. “We have built into the system a calibration panel that robotically slides down under the camera lens every hour, recalibrates everything, then retracts,” Doumet said.
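Calibration panels like the one Doumet describes are typically used for flat-field correction: raw sensor counts are converted to reflectance by subtracting a dark reference and dividing by the panel's response, band by band. A minimal sketch of that standard correction (the exact procedure inside Metaspectral's unit isn't public):

```python
import numpy as np

def to_reflectance(raw, dark, white, eps=1e-9):
    """Flat-field correction: convert raw counts to reflectance in [0, 1].

    raw, dark and white are per-band measurements: the scene, the sensor's
    dark current (shutter closed) and the calibration panel, respectively.
    """
    return (raw - dark) / np.maximum(white - dark, eps)

# Toy single-pixel example across 300 bands.
dark = np.full(300, 100.0)          # sensor offset with no light
white = np.full(300, 4000.0)        # calibration panel response
raw = dark + 0.5 * (white - dark)   # scene reflecting 50% of the panel level
print(to_reflectance(raw, dark, white).mean())  # → 0.5
```

Redoing this correction hourly compensates for sensor drift as temperature changes, which is why the panel has to slide back under the lens rather than being measured once at installation.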
There are also fans on the top of the unit blowing air down to push dust out.
Metaspectral has three systems in operation in British Columbia. A fourth unit is currently being installed in Atlanta.
The first vision system was installed at Merlin Plastics in 2022 and is used to perform automatic auditing of waste material in incoming bales, identify black plastics and perform quality control on optical sorters.
No special training was provided to Merlin.
Metaspectral has received $1.3 million from Canadian and British Columbian governments to develop methods for sorting hard-to-recycle plastics.
Contact:
Metaspectral, Vancouver, British Columbia, https://metaspectral.com
About the Author
Ron Shinn
Editor
Editor Ron Shinn is a co-founder of Plastics Machinery & Manufacturing and has been covering the plastics industry for more than 35 years. He leads the editorial team, directs coverage and sets the editorial calendar. He also writes features, including the Talking Points column and On the Factory Floor, and covers recycling and sustainability for PMM and Plastics Recycling.

