Google SBC Coral USB Accelerator, Performs high-speed ML inferencing, Supports all major platforms, Supports TensorFlow Lite, Supports AutoML Vision Edge
Out of stock
- Brand: Google
- MPN: 114991790
- Part #: SBCGOG0002
- UPC: 192876503706
What PB Tech customers are saying about this product...
See More Reviews"Plug and play on my Unraid NAS, just set it up with Frigate and all good to go!"
"Certainly speeds up video analysis using software like Frigate."
"Works well, happy that it was in stock they are hard to get at the moment"
Product URL: https://www.pbtech.co.nz/product/SBCGOG0002/Google-SBC-Coral-USB-Accelerator-Performs-high-spe
Branch | New Stock | On Display |
---|---|---|
Albany | 0 | |
Glenfield | 0 | |
Queen Street | 0 | |
Auckland Uni | 0 | |
Newmarket | 0 | |
Westgate | 0 | |
Penrose | 0 | |
Henderson (Express) | 0 | |
St Lukes | 0 | |
Manukau | 0 | |
Hamilton | 0 | |
Tauranga | 0 | |
New Plymouth | 0 | |
Palmerston North | 0 | |
Petone | 0 | |
Wellington | 0 | |
Head Office | 0 | |
Hornby | 0 | |
Christchurch Central | 0 | |
Features
NOTE: Add $2 to get 1x Google Coral Camera, valued at over $55. While stocks last!
Datasheet
Get started guide
The Coral USB Accelerator adds an Edge TPU coprocessor to your system. It includes a USB-C socket you can connect to a host computer to perform accelerated ML inferencing.
The on-board Edge TPU is a small ASIC designed by Google that provides high-performance ML inferencing at a low power cost.
Performs high-speed ML inferencing
The on-board Edge TPU coprocessor is capable of performing 4 trillion operations (tera-operations) per second (TOPS), using 0.5 watts for each TOPS (2 TOPS per watt). For example, it can execute state-of-the-art mobile vision models such as MobileNet v2 at almost 400 FPS, in a power-efficient manner.
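As a rough illustration of what inferencing on the Edge TPU looks like in practice, here is a minimal classification sketch using Google's PyCoral library. The model and image filenames are placeholders; it assumes the pycoral package, the Edge TPU runtime, and an Edge TPU-compiled .tflite model are already installed on the host.

```python
from PIL import Image
from pycoral.adapters import classify, common
from pycoral.utils.edgetpu import make_interpreter

# Filenames below are placeholders: any Edge TPU-compiled classification
# model (*_edgetpu.tflite) and a matching test image will do.
interpreter = make_interpreter('mobilenet_v2_1.0_224_quant_edgetpu.tflite')
interpreter.allocate_tensors()

# Resize the image to the model's expected input size and load it.
size = common.input_size(interpreter)
image = Image.open('test_image.jpg').convert('RGB').resize(size, Image.LANCZOS)
common.set_input(interpreter, image)

# Run inference on the Edge TPU and print the top 3 classes.
interpreter.invoke()
for klass in classify.get_classes(interpreter, top_k=3):
    print(klass.id, klass.score)
```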
Supports all major platforms
Connects via USB to any system running Debian Linux (including Raspberry Pi), macOS, or Windows 10.
Supports TensorFlow Lite
No need to build models from the ground up. TensorFlow Lite models can be compiled to run on the Edge TPU.
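For context, a model compiled with the Edge TPU Compiler (edgetpu_compiler appends "_edgetpu" to the output filename) can also be loaded with the standard TensorFlow Lite interpreter plus the Edge TPU delegate, without PyCoral. The sketch below assumes the tflite_runtime package and the libedgetpu runtime are installed; the model filename is a placeholder.

```python
import numpy as np
import tflite_runtime.interpreter as tflite

# Model filename is a placeholder; it must be the output of edgetpu_compiler.
# On Linux the delegate library is libedgetpu.so.1; on macOS it is
# libedgetpu.1.dylib and on Windows edgetpu.dll.
interpreter = tflite.Interpreter(
    model_path='your_model_edgetpu.tflite',
    experimental_delegates=[tflite.load_delegate('libedgetpu.so.1')])
interpreter.allocate_tensors()

# Push a dummy tensor of the right shape/dtype through the model on the TPU.
input_details = interpreter.get_input_details()[0]
dummy = np.zeros(input_details['shape'], dtype=input_details['dtype'])
interpreter.set_tensor(input_details['index'], dummy)
interpreter.invoke()

output_details = interpreter.get_output_details()[0]
print(interpreter.get_tensor(output_details['index']).shape)
```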
Supports AutoML Vision Edge
Easily build and deploy fast, high-accuracy custom image classification models to your device with AutoML Vision Edge.
Features:
- Models are built using TensorFlow
- Fully supports MobileNet and Inception architectures, though custom architectures are possible
- Compatible with Google Cloud
Specifications
Spec | Detail |
---|---|
ML accelerator | Google Edge TPU coprocessor: 4 TOPS (int8); 2 TOPS per watt |
Connector | USB 3.0 Type-C* (data/power) |
Dimensions | 65 mm x 30 mm |
* Compatible with USB 2.0, but inferencing speed is slower.
System requirements
- One of the following operating systems:
  - Linux Debian 6.0 or higher, or any derivative thereof (such as Ubuntu 10.0+), and an x86-64 or ARM64 system architecture
  - macOS 10.15, with either MacPorts or Homebrew installed
  - Windows 10
- One available USB port (for the best performance, use a USB 3.0 port)
- Python 3.5, 3.6, or 3.7
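Once the requirements above are met and the Edge TPU runtime is installed, a quick way to confirm the accelerator is visible to the host (assuming the pycoral package is installed) is:

```python
from pycoral.utils.edgetpu import list_edge_tpus

# Returns one entry per detected Edge TPU (type 'usb' for this accelerator).
# An empty list usually means the libedgetpu runtime is missing or the
# device is not plugged in.
print(list_edge_tpus())
```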