Scientists at the Massachusetts Institute of Technology (MIT) have developed a sub-terahertz-radiation receiving system that could help driverless cars steer safely through foggy and dusty conditions.

Unlike light-based image sensors, the on-chip system detects signals at sub-terahertz wavelengths.

To perceive objects, the sub-terahertz imaging system first sends a signal through a transmitter.

A receiver then measures the absorption and reflection of the rebounding sub-terahertz wavelengths and sends a signal to a processor, which recreates an image of the object.
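
As a rough illustration of that loop, the Python sketch below compares transmitted and returned power for each measurement and assembles the readings into a simple reflectivity image; the numbers and the helper function are invented for the example rather than taken from MIT's design.

```python
# Schematic sketch of the perceive-by-reflection loop with made-up numbers; the
# reflection_reading helper is hypothetical, not part of MIT's signal chain.
import numpy as np

def reflection_reading(tx_power_w, rx_power_w):
    """Fraction of the transmitted sub-terahertz power that came back; the
    remainder was absorbed or scattered away by the scene."""
    return rx_power_w / tx_power_w

# Hypothetical grid of received powers (in watts) for a 1W transmitted signal.
rx_grid = np.array([[0.02, 0.03, 0.02],
                    [0.03, 0.45, 0.40],   # a strongly reflecting object
                    [0.02, 0.42, 0.38]])

# The processor's job, in miniature: turn per-measurement readings into an image.
image = reflection_reading(1.0, rx_grid)
print(image)
```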

Traditional sub-terahertz systems, however, are large and expensive, which makes integrating the sensors into driverless cars a challenge.

To overcome this challenge, the researchers turned to a scheme of independent signal-mixing pixels known as ‘heterodyne detectors’, which are generally difficult to integrate densely into chips.

The team therefore shrank the heterodyne detectors until they were small enough to fit onto a chip.

The team built a prototype with a 32-pixel array integrated on a 1.2mm² device. Its pixels are almost 4,300 times more sensitive than those of the best existing on-chip sub-terahertz array sensors. With further development, the chip could potentially be used in driverless cars and autonomous robots.

An essential aspect of the design is ‘decentralisation’, under which each pixel, known as a ‘heterodyne’ pixel, produces its own frequency beat (the difference between two incoming sub-terahertz frequencies) and its own ‘local oscillation’, an electrical signal that alters the frequency of an input signal.

This ‘down-mixing’ process produces a signal in the megahertz range that a baseband processor can easily interpret.
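
The short Python sketch below illustrates the down-mixing idea numerically. The 240GHz received tone and 239.9GHz local oscillation are assumed figures, chosen only to show how mixing and filtering leave a clean 100MHz beat.

```python
# Minimal numerical sketch of heterodyne down-mixing. The 240GHz received tone
# and 239.9GHz local oscillation are assumed values for illustration, not the
# chip's actual frequencies.
import numpy as np

fs = 2e12                       # simulation sample rate, Hz
t = np.arange(40_000) / fs      # 20 ns of samples

f_rx = 240e9                    # incoming sub-terahertz tone (assumed)
f_lo = 239.9e9                  # local oscillation generated inside the pixel
rx = np.cos(2 * np.pi * f_rx * t)
lo = np.cos(2 * np.pi * f_lo * t)

# Mixing produces components at (f_rx - f_lo) and (f_rx + f_lo); a crude
# moving-average low-pass keeps only the difference term.
mixed = rx * lo
baseband = np.convolve(mixed, np.ones(400) / 400, mode="same")

# The surviving beat sits at f_rx - f_lo = 100 MHz, which a baseband
# processor can digitise and interpret directly.
spectrum = np.abs(np.fft.rfft(baseband))
freqs = np.fft.rfftfreq(len(baseband), 1 / fs)
print(f"dominant beat: {freqs[np.argmax(spectrum[1:]) + 1] / 1e6:.0f} MHz")
```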

The output signal can then be used to calculate the distance of objects, just as LiDAR calculates distance from the time it takes a laser to hit an object and rebound.
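
As a minimal worked example of that time-of-flight arithmetic, with an assumed 200-nanosecond echo delay:

```python
# Round-trip timing, the same principle LiDAR uses: distance is half the echo
# delay multiplied by the speed of light. The 200ns delay is illustrative.
C = 299_792_458.0  # speed of light, m/s

def distance_from_delay(round_trip_s):
    """Convert a measured round-trip delay into a one-way distance in metres."""
    return C * round_trip_s / 2

print(f"{distance_from_delay(200e-9):.1f} m")  # an echo after 200 ns -> ~30 m
```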

Furthermore, by combining the output signals of an array of pixels and steering the pixels in a particular direction, it is possible to produce high-resolution images of a scene. This helps not only with the detection but also with the recognition of objects, both of which are critical for autonomous vehicles and robots.
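
One standard way to combine an array's outputs in this fashion is delay-and-sum beamforming. The sketch below uses assumed pixel spacing, carrier frequency and beat frequency purely for illustration, since the article does not describe the chip's actual combining scheme; the combined power peaks when the array is steered towards the direction the echo came from.

```python
# Sketch of combining the array's pixel outputs to "look" in one direction at a
# time (delay-and-sum beamforming on the down-mixed signals). The pixel spacing,
# carrier frequency and beat frequency are assumptions for illustration only.
import numpy as np

n_pixels = 32
wavelength = 3e8 / 240e9           # assumed ~240GHz carrier, ~1.25mm wavelength
spacing = wavelength / 2           # assumed half-wavelength pixel spacing
f_beat = 100e6                     # assumed down-mixed beat frequency, Hz

def phase_profile(angle_deg):
    """Per-pixel phase offset of a wavefront arriving from angle_deg."""
    k = np.arange(n_pixels)
    return 2 * np.pi * spacing * k * np.sin(np.radians(angle_deg)) / wavelength

def steered_power(pixel_signals, steer_deg):
    """Phase-align the pixels towards steer_deg and measure the combined power;
    the power peaks when the steering angle matches the echo's direction."""
    weights = np.exp(1j * phase_profile(steer_deg))
    combined = (weights[:, None] * pixel_signals).sum(axis=0)
    return np.mean(np.abs(combined) ** 2)

# Simulate one echo arriving from 20 degrees off boresight.
t = np.arange(1_000) * 1e-9
pixels = np.exp(1j * (2 * np.pi * f_beat * t[None, :] - phase_profile(20.0)[:, None]))

for angle in (0.0, 10.0, 20.0, 30.0):
    print(f"{angle:>4.0f} deg -> relative power {steered_power(pixels, angle):8.1f}")
```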

MIT Microsystems Technology Laboratories (MTL) terahertz integrated electronics group director Ruonan Han said: “A big motivation for this work is having better ‘electric eyes’ for autonomous vehicles and drones.

“Our low-cost, on-chip sub-terahertz sensors will play a complementary role to LiDAR for when the environment is rough.”