BrainChip Execs Talk Advantages of Putting AI on Sensors at the Edge as Part of Embedded Vision Summit

BrainChip Holdings Ltd. (ASX: BRN, OTCQX: BRCHF), a leading provider of ultra-low power, high-performance AI technology, will present the Expert Bar session “Can You Put AI at the Sensor? (Not the Edge of the Cloud!)” at the Embedded Vision Summit on May 27 at 11:30 a.m. PDT. The virtual presentation will be broadcast live and will also be available on demand to attendees of the event.

The BrainChip team will help viewers better understand the requirements of sensors at the edge and how the challenges of traditional machine learning make it difficult to enable these devices properly. A solution that uses advanced neuromorphic computing as the engine for intelligent AI at the edge is better suited to solving critical problems such as privacy, security, latency, and low-power operation, while providing key features such as one-shot learning and computation on the device itself, without dependency on the cloud.

“Cloud use for AI might be effective in a data center setting but relying on it for the millions of edge sensors being deployed in emerging ‘smart’ markets is a recipe for disaster,” said Anil Mankar, Chief Development Officer at BrainChip. “How do those devices overcome latency in communicating with the cloud? Next-generation AI needs a solution that will provide resources to edge devices. We look forward to sharing with attendees of the Embedded Vision Summit how our Akida Neural Processing Unit has been developed to address these concerns and provide true device intelligence without the need for the cloud.”

BrainChip is delivering on next-generation demands by achieving efficient, effective AI functionality. The company’s Akida neuromorphic processors are revolutionary neural networking processors that bring artificial intelligence to the edge in a way that existing technologies cannot. The solution is high-performance, small, and ultra-low power, and enables a wide array of edge capabilities. The Akida NSoC and intellectual property can be used in applications including Smart Home, Smart Health, Smart City, and Smart Transportation. These applications include, but are not limited to, home automation and remote controls, industrial IoT, robotics, security cameras, sensors, unmanned aircraft, autonomous vehicles, medical instruments, object detection, sound detection, odor and taste detection, gesture control, and cybersecurity. The Akida NSoC is designed for use as a stand-alone embedded accelerator or as a co-processor, and includes interfaces for ADAS sensors, audio sensors, and other IoT sensors. Akida brings AI processing capability to edge devices for learning, enabling personalization of products without the need for retraining.
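
To make the idea of on-device learning without retraining concrete, the sketch below shows one common way such personalization can be done: storing a class prototype from a single example and classifying new inputs by nearest prototype. This is a minimal conceptual illustration in plain Python/NumPy, not BrainChip's Akida API; the `PrototypeClassifier` class and `embed()` feature extractor are hypothetical stand-ins.

```python
# Conceptual sketch only: on-device "one-shot" learning via class prototypes.
# This is NOT BrainChip's Akida API; embed() is a stand-in for whatever
# feature extractor runs on the edge device.
import numpy as np

class PrototypeClassifier:
    """Learns a new class from a single example by storing its embedding."""

    def __init__(self):
        self.prototypes = {}  # label -> normalized embedding vector

    def learn_one_shot(self, label, embedding):
        # A single labelled example becomes the class prototype; no retraining.
        self.prototypes[label] = embedding / np.linalg.norm(embedding)

    def classify(self, embedding):
        # Nearest prototype by cosine similarity.
        query = embedding / np.linalg.norm(embedding)
        return max(self.prototypes,
                   key=lambda label: float(query @ self.prototypes[label]))

def embed(sensor_frame):
    # Placeholder (hypothetical) for an on-device feature extractor.
    return sensor_frame.astype(np.float32).ravel()

# Usage: personalize the device with one example per class, then classify locally.
clf = PrototypeClassifier()
clf.learn_one_shot("owner_gesture", embed(np.random.rand(8, 8)))
clf.learn_one_shot("other_gesture", embed(np.random.rand(8, 8)))
print(clf.classify(embed(np.random.rand(8, 8))))
```

Because the "learning" step is just storing a vector, it can run entirely on the device, which is the property the paragraph above describes.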

Since 2012, the Embedded Vision Summit has been the premier conference and expo devoted to practical, deployable computer vision and visual AI. The Summit is organized by the Edge AI and Vision Alliance, an industry partnership operated by BDTI. Additional information about the event is available at https://embeddedvisionsummit.com/.

About BrainChip Holdings Ltd (ASX: BRN)

BrainChip is a global technology company that is producing a groundbreaking neuromorphic processor that brings artificial intelligence to the edge in a way that is beyond the capabilities of other products. The chip is high-performance, small, and ultra-low power, and enables a wide array of edge capabilities that include on-chip training, learning, and inference. The event-based neural network processor is inspired by the spiking nature of the human brain and is implemented in an industry-standard digital process. By mimicking brain processing, BrainChip has pioneered a processing architecture, called Akida™, that is both scalable and flexible enough to address the requirements of edge devices. At the edge, sensor inputs are analyzed at the point of acquisition rather than transmitted via the cloud to a data center. Akida is designed to provide a complete ultra-low power and fast AI edge network for vision, audio, olfactory, and smart transducer applications. The reduction in system latency provides faster response and a more power-efficient system that can reduce the large carbon footprint of data centers.
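
As a rough illustration of what "event-based, spiking" processing means in general, the sketch below implements a textbook leaky integrate-and-fire (LIF) neuron that only does work when a sensor event arrives. It is a generic conceptual example under my own simplifying assumptions, not a representation of Akida's actual architecture or numerics.

```python
# Conceptual sketch only: an event-driven leaky integrate-and-fire (LIF) neuron,
# illustrating the general idea of spiking/event-based processing at a sensor.
# It does not represent Akida's actual implementation.
from dataclasses import dataclass

@dataclass
class LIFNeuron:
    threshold: float = 1.0   # membrane potential needed to fire a spike
    leak: float = 0.9        # decay applied between incoming events
    potential: float = 0.0   # current membrane potential

    def on_event(self, weight: float) -> bool:
        """Process one incoming sensor event; return True if the neuron spikes."""
        self.potential = self.potential * self.leak + weight
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after firing
            return True
        return False

# The neuron only computes when an event arrives, rather than on every frame,
# which is one reason event-based processing is attractive for low-power edge use.
neuron = LIFNeuron()
events = [0.4, 0.5, 0.3, 0.0, 0.6]  # synthetic weighted input events
spikes = [neuron.on_event(w) for w in events]
print(spikes)
```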

Additional information is available at https://www.brainchipinc.com

Follow BrainChip on Twitter: https://www.twitter.com/BrainChip_inc

Follow BrainChip on LinkedIn: https://www.linkedin.com/company/7792006


Data & News supplied by www.cloudquote.io
Stock quotes supplied by Barchart
Quotes delayed at least 20 minutes.
By accessing this page, you agree to the following
Privacy Policy and Terms and Conditions.