What is Neuralet?
Neuralet is an open-source platform for running deep learning models on edge devices powered by GPUs, TPUs, and more.
With Neuralet, you can start running your favorite image classification and object detection deep learning models on various edge devices with minimal effort.
Use Neuralet if you need a platform that:
- Makes getting started with any model on any edge device super easy.
- Supports a wide variety of deep learning models on state-of-the-art edge devices.
- Is open-source, light-weight, secure, and fast.
How does it work?
Neuralet makes running deep learning models easy by providing a set of Docker containers packaged to run directly on each supported device. A separate Docker container is built for each model on each device, so getting started is as simple as possible.
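As a rough sketch of this per-model, per-device packaging, the container image names combine a device tag and a model tag. The `DEVICE` and `MODEL_NAME` values below are just an illustration (they match the Coral Dev Board example later in this page); any supported device/model pair follows the same pattern:

```shell
# Hypothetical illustration of Neuralet's image-naming convention:
# one image per (device, model) pair.
DEVICE=coral-dev-board
MODEL_NAME=efficientnet-edgetpu-L
IMAGE="neuralet/${DEVICE}:${MODEL_NAME}"

# This is the image you would build or pull for that device/model pair.
echo "$IMAGE"
```

Swapping in a different device or model name selects a different prebuilt container, with no other changes to your workflow.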
Tell us which device you are using, select the model of your choice, and Neuralet takes care of the rest. Start using Neuralet now!
Neuralet supports a wide variety of deep learning models for image classification and object detection tasks. You can choose from different versions of EfficientNet, Inception, MobileNet, and other popular architectures listed below.
Image Classification Models:
Object Detection Models:
Getting started: only three steps to Neuralet
Getting started with Neuralet is easy. Follow these three steps to see your model running:
STEP 1: Navigate to the Neuralet website or the Neuralet GitHub repository.
STEP 2: Select your preferences: which task are you interested in, which model do you want to use, and on which device?
STEP 3: Run the command. That's it!
A Simple Example
Let’s say you have just bought and set up Google’s Coral Dev Board, and you want to perform image classification using the EfficientNet model. EfficientNet is a classification model trained on the ImageNet dataset that can recognize 1000 different object categories. Following the steps above, we first head to the Neuralet website (or, equivalently, the Neuralet GitHub repository). Then we choose the “host” to be Coral Dev Board, and we select the version of EfficientNet (or any other model of our choice) suitable for our case, say efficientnet-edgetpu-L. We can also choose to run the model in the “queue mode” or run it with ROS. Let’s stick with the queue mode for the sake of this example. That’s all we have to do. Neuralet gives us the right command to execute.
We have two options to run the Docker container:
1- Build the container from source:
# 1- Clone the repository
git clone https://github.com/neuralet/neuralet

# 2- Build the container
MODEL_NAME=efficientnet-edgetpu-L
cd neuralet/coral-dev-board/$MODEL_NAME
docker build -t "neuralet/coral-dev-board:$MODEL_NAME" .

# 3- Run the container
docker run -it --privileged --net=host -v $(pwd)/../../:/repo neuralet/coral-dev-board:$MODEL_NAME

# 4- Run inference on a test image
python3 src/client.py [PATH-TO-IMAGE]

# 5- Terminate the server and stop the container
python3 src/client.py stop
2- Pull the container from Dockerhub:
# 1- Clone the repository
git clone https://github.com/neuralet/neuralet

# 2- Run the container
MODEL_NAME=efficientnet-edgetpu-L
cd neuralet/coral-dev-board/$MODEL_NAME
docker run -it --privileged --net=host -v $(pwd)/../../:/repo neuralet/coral-dev-board:$MODEL_NAME

# 3- Run inference on a test image
python3 src/client.py [PATH-TO-IMAGE]

# 4- Terminate the server and stop the container
python3 src/client.py stop
Congratulations! You have just run your first EfficientNet model on the Coral Dev Board using the Neuralet platform.
You can ask questions, report bugs, and request new features by opening an issue on our GitHub repo or by contacting us.
Never miss out on the latest news about Neuralet products by subscribing to our newsletter.