Neuromorphic Sensing: Coming Soon to Consumer Products


What does “neuromorphic” mean today?

“You will get 10 different answers from 10 different people,” laughed Luca Verre, CEO of Prophesee. “As companies take the step from ‘this is what we believe’ to ‘how can we make this a reality,’ what neuromorphic means will change.”

Most companies doing neuromorphic sensing and computing have a similar vision in mind, he said, but implementations and strategies will be different based on varying product, market, and investment constraints.

“The reason why… all these companies are working [on neuromorphic technologies] is because there is a fundamental belief that the biological model has superior characteristics compared to the conventional,” he said. “People make different assumptions on product, on system integration, on business opportunities, and they make different implementations… But fundamentally, the belief is the same.”

Luca Verre (Source: Prophesee)

Verre’s vision is that neuromorphic technologies can bring technology closer to human beings, which ultimately makes for a more immersive experience and allows technologies such as autonomous driving and augmented reality to be adopted faster.

“When people understand the technology behind it is closer to the way we work, and fundamentally natural, this is an incredible source of reassurance,” he said.

Which markets first?

Prophesee is already several years into its mission to commercialize the event-based camera using its proprietary dynamic vision sensor technology. The company has collaborated with camera leader Sony to make a compact, high-resolution event-based camera module, the IMX 636. In this module, the photodiode layer is stacked directly on top of the CMOS layer using Sony’s 3D die stacking process.

According to Verre, the sector closest to commercial adoption of this technology is industrial machine vision.

“Industrial is a leading segment today because historically we pushed our third-generation camera into this segment, which was a bigger sensor and more tuned for this type of application,” he said. “Industrial has historically been a very active machine vision segment; in fact, it is probably one of the segments that adopted the CCD and CMOS technologies at the very beginning… definitely a key market.”

Prophesee’s and Sony’s jointly developed IMX 636 event-based sensor (Source: Prophesee)

The second key market for the IMX 636 is consumer technologies, driven by the shrink in size enabled by Sony’s die-stacking process. Consumer applications include IoT cameras, surveillance cameras, action cameras, drones, consumer robots, and even smartphones. In many cases, the event-based camera is used alongside a full-frame camera, detecting motion so that image processing can be applied to capture better quality images, even when the subject is moving.

“The reason is very simple: event-based cameras are great to understand motion,” he said. “This is what they are meant for. Frame-based cameras are more suited to understanding static information. The combination of the dynamic information from an event-based camera and static information from a frame-based camera is complementary if you want to capture a picture or video in a scene where there’s something moving.”

Event data can be combined with full-frame images to correct any blur on the frame, especially for action cameras and surveillance cameras.

“We clearly see some traction in this area, which of course is very promising because the volume typically associated with this application can be quite substantial compared to industrial vision,” he said.
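To make the hybrid idea concrete, here is a minimal sketch (illustrative only, not Prophesee’s Metavision API) of how an event stream, treated as (x, y, timestamp, polarity) tuples, could flag the moving regions of a frame captured during the same exposure window. The function names and the event-count threshold are hypothetical.

```python
# Illustrative sketch only -- not Prophesee's Metavision API. Events are assumed
# to be (x, y, t, polarity) tuples, the usual event-camera representation of
# per-pixel brightness changes.
import numpy as np

def motion_map(events, height, width, t_start, t_end):
    """Count events per pixel that fall inside one frame's exposure window."""
    counts = np.zeros((height, width), dtype=np.int32)
    for x, y, t, _polarity in events:
        if t_start <= t < t_end:
            counts[y, x] += 1
    return counts

def moving_regions(counts, threshold=5):
    """Pixels that fired many events during the exposure were moving, so they
    are the candidates for deblurring or sharpening on the frame side."""
    return counts >= threshold
```

A frame-side pipeline could then restrict its motion-compensation pass to the pixels where the mask is true, leaving static areas of the image untouched.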

Prophesee is also working with a customer on automotive driver-monitoring solutions, where Verre said event-based cameras bring advantages in terms of low-light performance, sensitivity, and fast detection. Applications here include blink detection, eye or face tracking, and micro-expression detection.

Approach to commercialization

Prophesee’s EVK4 evaluation kit (Source: Prophesee)

Prophesee has been working hard on driving commercialization of event-based cameras. The company recently released a new evaluation kit (EVK4) for the IMX 636. This kit is designed for industrial vision with a rugged housing but will work for all applications (Verre said several hundred of these kits have been sold). The company’s Metavision SDK for event-based vision has also recently been open-sourced in order to reduce friction in the adoption of event-based technology. The Metavision community has around 5,000 registered members today.

“The EVK is a great tool to further push and evangelize the technology, and it comes in a very typical form factor,” Verre said. “The SDK hides the perception of complexity that every engineer or researcher may have when testing or exploring a new technology… Think about engineers that have been working for a couple of decades on processing images that now see events… they don’t want to be stretched too much out of their comfort zone.”

New to the Metavision SDK is a simulator to convert full frames into events to help designers transition between the way they work today and the event domain. Noting a reluctance of some designers to move away from full frames, Verre said the simulator is intended to show them there’s nothing magic about events.

“[Events are] just a way of capturing information from the scene that contains much more temporal precision compared to images, and is actually much more relevant, because typically you get only what is changing,” he said.
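As a rough illustration of what such a frame-to-event conversion involves (the simulator’s actual implementation is not described here, so this is only a sketch of the standard contrast-threshold event model), each pixel emits an event whenever its log intensity has changed by more than a threshold since the last event it produced:

```python
# Sketch of a contrast-threshold frame-to-event conversion. This is a generic
# illustration, not the Metavision SDK's simulator; the threshold value and
# function signature are assumptions for the example.
import numpy as np

def frames_to_events(frames, timestamps, threshold=0.2, eps=1e-6):
    """Convert a sequence of grayscale frames into (x, y, t, polarity) events."""
    log_ref = np.log(frames[0].astype(np.float64) + eps)
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_frame = np.log(frame.astype(np.float64) + eps)
        diff = log_frame - log_ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for x, y in zip(xs, ys):
            polarity = 1 if diff[y, x] > 0 else -1
            events.append((x, y, t, polarity))
            log_ref[y, x] = log_frame[y, x]  # reset reference at pixels that fired
    return events
```

Pixels that do not change produce nothing, which is exactly the property Verre describes: the output contains only what is moving, with the timestamp carrying the temporal precision.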

The simulator can also reconstruct full-frame images from event data, which he says people find reassuring.
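The reverse direction can be sketched just as simply, assuming the same contrast threshold: each event is replayed as a step in log-intensity space on top of a starting frame. Real reconstruction methods are considerably more sophisticated; this only shows why the round trip feels familiar to frame-oriented engineers.

```python
# Naive event-to-frame reconstruction sketch (illustrative assumption, not a
# production algorithm): replay each event as a +/- contrast step.
import numpy as np

def events_to_frame(base_frame, events, threshold=0.2, eps=1e-6):
    """Apply (x, y, t, polarity) events as contrast steps in log-intensity space."""
    log_img = np.log(base_frame.astype(np.float64) + eps)
    for x, y, _t, polarity in events:
        log_img[y, x] += polarity * threshold
    return np.exp(log_img) - eps
```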

“The majority of customers don’t pose this challenge any longer because they understand that they need to see from a different perspective, similar to when they use technology like time of flight or ultrasound,” he said. “The challenge is when their perception is that this is another image sensor… for this category of customer, we made this tool that can show them the way to transition stepwise to this new sensing modality… it is a mindset shift that may take some time, but it will happen.”

Applications realized in the Prophesee developer community include restoring some sight for the blind, detecting and classifying contaminants in medical samples, particle tracking in research, robotic touch sensors, and tracking space debris.

Hardware road map

In terms of roadmap, Prophesee plans to continue development of both hardware and software, alongside new evaluation kits, development kits, and reference designs. This may include system reference designs which combine Prophesee sensors with specially developed processors. For example, Prophesee partner iCatch has developed an AI vision processor SoC that interfaces natively with the IMX 636 and features an on-chip event decoder. Japanese AI core provider DMP is also working with Prophesee on an FPGA-based system, and there are more partnerships in the works, said Verre.

The Prophesee and Sony IMX 636 is a fourth-generation product. Prophesee said future generations will reduce pixel pitch and ease integration with conventional computing platforms (Source: Prophesee)

“We see that there is growing interest from ecosystem partners at the SoC level, but also the software level, that are interested in building new solutions based on Prophesee technology,” he said. “This type of asset is important for the community, because it is another step towards the full solution — they can get the sensor, camera, computing platform, and software to develop an entire solution.”

Where does event-based sensor hardware go from here? Verre cited two key directions the technology will move in. The first is further reduction of pixel size (pixel pitch) and overall reduction of the sensor to make it suitable for compact consumer applications such as wearables. The second is facilitating the integration of event-based sensing with conventional SoC platforms.

Working with computing companies will be critically important to ensure next–generation sensors natively embed the capability to interface with the computing platform, which simplifies the task at the system level. The result will be smarter sensors, with added intelligence at the sensor level.

“We think events make sense, so let’s do more pre-processing inside the sensor itself, because it’s where you can make the least compromise,” Verre said. “The closer you get to the acquisition of the information, the better off you are in terms of efficiency and low latency. You also avoid the need to encode and transmit the data. So this is something that we are pursuing.”

As foundries continue to make progress in the 3D stacking process, stacking in two or even three layers using the most advanced CMOS processes can help bring more intelligence down to the pixel level.

How much intelligence in the pixel is the right amount?

Verre said it’s a compromise between increasing the cost of silicon and having sufficient intelligence to make sure the interface with conventional computing platforms is good enough.

“Sensors don’t typically use advanced process nodes, 28nm or 22nm at most,” he said. “Mainstream SoCs use 12nm, 7nm, 5nm, and below, so they’re on technology nodes that can compress the digital component extremely well. The size versus cost equation means at a certain point it’s more efficient, more economical [to put the intelligence] in the SoC.”

There is also a certain synergy to combining event-based sensors with neuromorphic computing architectures.

“The ultimate goal of neuromorphic technology is to have both the neuromorphic or event-based sensing and processing, but we are not yet there in terms of maturity of this type of solution,” he said. “We are very active in this area to prepare for the future — we are working with Intel, SynSense, and other partners in this area — but in the short term, the mainstream market is occupied by conventional SoC platforms.”

Prophesee’s approach here is pragmatic. Verre said the company’s aim is to try to minimize any compromises to deliver benefits that are superior to conventional solutions.

“Ultimately we believe that events should naturally stream asynchronously to a compute architecture that is also asynchronous in order to benefit fully in terms of latency and power,” he said. “But we need to be pragmatic and stage this evolution, and really capitalize on the existing platforms out there and work with key partners in this space that are willing to invest in software–hardware developments and to optimize certain solutions for certain markets.”
