Artificial intelligence now fits inside a USB stick
Source: Aaron Souppouris


Everywhere you go, you'll always take the neural network with you


Movidius chips have been showing up in quite a few products recently. It's the company that helps DJI's latest drone avoid obstacles and FLIR's new thermal camera automatically spot people trapped in a fire, all through deep learning via neural networks. It also signed a deal with Google to integrate its chips into as-yet-unannounced products. Now, the chip designer has a product it says will bring the capacity for powerful deep learning to everyone: a USB accessory called the Fathom Neural Compute Stick.

The Fathom contains the Myriad 2 MA2450 VPU paired with 512MB of LPDDR3 RAM. The Myriad 2 is the chip found in the previously mentioned DJI and FLIR products. It's able to handle many processes simultaneously, which is exactly what neural networks call for. Because it's specifically designed for this -- its architecture is very different from the GPUs and CPUs that typically handle such processing -- it offers a lot of grunt without requiring much power. It can handle up to 150 gigaFLOPS (150 billion floating-point operations per second) while consuming no more than 1.2 watts.
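As a quick back-of-envelope check on that efficiency claim, using only the figures quoted above, a throwaway Python snippet:

    # Efficiency implied by the figures above: peak throughput per watt.
    peak_gflops = 150.0    # 150 billion floating-point operations per second
    max_power_watts = 1.2  # stated maximum power draw

    print(peak_gflops / max_power_watts, "GFLOPS per watt")  # 125.0 GFLOPS per watt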

Unlike Nvidia's Tegra-based solutions for deep learning, the Fathom isn't a standalone system. The idea is that you plug it into the USB 3.0 port of any system running Linux to get a "20-30x performance improvement in neural compute." You can use the Fathom to rapidly prototype neural networks, moving to something with a lot more power once you're ready to deploy.
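As a rough sketch of what "plug it into a Linux box" looks like from the software side, the snippet below simply checks whether the stick shows up on the USB bus using the pyusb library. The vendor ID is an assumption (Movidius hardware has commonly used 0x03e7); the article doesn't specify one.

    # Sketch: detect a Fathom-style accelerator on the USB bus of a Linux host.
    # Assumption: Movidius' USB vendor ID is 0x03e7; not stated in the article.
    import usb.core  # pip install pyusb

    ASSUMED_MOVIDIUS_VENDOR_ID = 0x03E7

    device = usb.core.find(idVendor=ASSUMED_MOVIDIUS_VENDOR_ID)
    if device is None:
        print("No Movidius device found; check the USB 3.0 connection.")
    else:
        print("Found Movidius device on bus %d, address %d" % (device.bus, device.address))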

Of course, this is neural networking, so it's not that simple. The Fathom accepts networks defined in Caffe and TensorFlow (two frameworks popular in deep learning circles) along with their accompanying datasets. You need to use a Movidius tool to execute the network on the Myriad 2 chip, where it'll run natively while sipping power. At first glance, it's a very similar process to CUDA and cuDNN (Nvidia's system for handing off neural networks to its graphics cards). That said, the whole point of Fathom is that it can be used in an environment where you don't have expensive graphics cards and processors.
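To make that workflow a little more concrete, here's a minimal sketch of the kind of network definition you'd hand over: a toy TensorFlow (1.x-style) graph written out as a GraphDef file. The input size and file names are illustrative, and since the article doesn't name the Movidius conversion tool, that step is left as a comment.

    # Minimal sketch: define a tiny TensorFlow graph and export it so an
    # offline tool (Movidius's converter, not named in the article) could
    # compile it to run on the Myriad 2.
    import tensorflow as tf  # TensorFlow 1.x-style API

    graph = tf.Graph()
    with graph.as_default():
        # 224x224 RGB input is an illustrative assumption, not a Fathom requirement.
        images = tf.placeholder(tf.float32, [1, 224, 224, 3], name="input")
        weights = tf.Variable(tf.truncated_normal([3, 3, 3, 16], stddev=0.1))
        conv = tf.nn.relu(tf.nn.conv2d(images, weights,
                                       strides=[1, 2, 2, 1], padding="SAME"))
        pooled = tf.reduce_mean(conv, axis=[1, 2])           # global average pool
        logits = tf.layers.dense(pooled, 10, name="logits")  # 10-way toy classifier

        with tf.Session() as sess:
            sess.run(tf.global_variables_initializer())
            # Write the graph definition to disk; the Movidius tool would take a
            # file like this (plus trained weights) and target the stick.
            tf.train.write_graph(sess.graph_def, "export", "toy_net.pb", as_text=False)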

The Fathom is a very interesting device. As anyone who's attempted to run even a basic neural network on an underpowered machine will tell you, it's slow going. At present, the best way to prototype a network is to use a cloud-based system, tapping into computing power far away. Being able to add a decent amount of compute to a regular laptop could massively simplify and reduce the cost of building a network.

But the potential for Fathom doesn't end there. It could prove very useful for robotics, drones, and the maker community at large. With a Fathom connected to a Raspberry Pi, for example, you could easily add some very advanced computer vision capabilities to something like a GoPro. The long game, of course, is to persuade more manufacturers to add Myriad chips into their devices, but something like the Fathom is a key step along the way.
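Here's a hedged sketch of that Raspberry Pi idea: grab camera frames with OpenCV and hand each one to the stick for classification. Only the OpenCV calls are real APIs; classify_on_stick() is a hypothetical stand-in for whatever inference interface Movidius ships with the Fathom.

    # Sketch: stream camera frames on a Raspberry Pi and classify each one on an
    # attached neural-compute accelerator. classify_on_stick() is a hypothetical
    # placeholder, not a real Movidius API.
    import cv2  # pip install opencv-python

    def classify_on_stick(frame):
        """Hypothetical placeholder: run the frame through a network on the stick."""
        raise NotImplementedError("replace with the vendor's inference call")

    capture = cv2.VideoCapture(0)  # first attached camera
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            resized = cv2.resize(frame, (224, 224))  # typical CNN input size
            label = classify_on_stick(resized)
            print("Detected:", label)
    finally:
        capture.release()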

DJI's obstacle avoidance is powered by the same chip as the Fathom. (Image: Movidius)

The AI community has reacted positively to the announcement. Facebook's Director of Artificial Intelligence Dr. Yann LeCun said he's "been hoping for a long time that something like Fathom would become available ... With Fathom, every robot, big and small, can now have state-of-the-art vision capabilities," while Google's AI Technical Lead Pete Warden said that "Fathom goes a long way towards helping tune and run these complex neural networks inside devices."

While some organizations are already receiving their Fathoms, the Neural Compute Stick won't go on general sale until this winter. There's no firm price yet, but we're told it'll be less than $100.

