Groundbreaking CV AI accelerator design announced at Flash Memory Summit could be game changer for real-time threat detection
Blueshift Memory, designer of a novel proprietary high-speed memory architecture, has announced the successful completion of a 13-month R&D project, funded by an Innovate UK Smart grant, to demonstrate the performance of its Cambridge Architecture™. At Flash Memory Summit the company is also showcasing the chip it designed during the project, an accelerator for AI-enhanced computer vision (CV) image recognition.
A paper to be presented at the Summit describes the development of the RISC-V-based chip and reports image-data processing accelerated by a factor of 16 to 128, together with ultra-low power consumption.
When used in a security camera monitoring a rapidly evolving active-shooter situation, for example, the chip can identify different types of firearm in real time and automatically trigger an alarm. This capability could be a game changer, potentially saving many lives.
The Cambridge Architecture has been developed to address the von Neumann bottleneck – the phenomenon whereby data transfer between the processor core and memory has become the limiting factor in computational speed. As computing tasks grow ever more data-hungry, the architecture removes a growing obstacle to computational efficiency, and it also delivers substantial energy savings by eliminating unnecessary movement of data.
“This is the first time that Blueshift Memory’s technology has been demonstrated in a real-life application, and the results are extremely promising,” said Peter Marosan, CTO and founder of Blueshift Memory. “We know that in more challenging, data-intensive use cases like servers for high-frequency trading, the Cambridge Architecture is capable of even higher levels of acceleration, up to 1000x or more, and this is the first step towards us reaching that market. This high performance will also be accompanied by dramatic energy savings, since moving large amounts of data around unnecessarily makes excessive demands on energy consumption.”