SensiML's open-source AutoML solution for Edge AI model development
Piccolo AI™
Piccolo AI is the open-source version of SensiML Analytics Studio intended for individual developers, researchers, and AI enthusiasts. For enterprise teams, contact SensiML for a licensed enterprise version of the software.
The Piccolo AI project includes:
- SensiML's ML Engine: The engine behind SensiML’s AutoML model building, sensor data management, model tracking, and embedded firmware generation
- Embedded ML SDK: SensiML's Inferencing and DSP SDK designed for building and running DSP & ML pipelines on edge devices
- Analytic Studio UI: An intuitive web interface for working with the SensiML ML Engine to build TinyML applications
- SensiML Python Client: Allows programmatic access to the REST API services through Jupyter Notebooks
Piccolo AI is currently optimized for the classification of time-series sensor data. Common use cases enabled by Piccolo AI include:
- Acoustic Event Detection
- Activity Recognition
- Gesture Detection
- Anomaly Detection
- Keyword Spotting
- Vibration Classification
To learn more about Piccolo AI features and capabilities, see the product page.
Get Started
The simplest way to get started learning and using the features in Piccolo AI is to sign up for an account on SensiML's managed SaaS service at SensiML Free Trial.
If you prefer to install and manage Piccolo AI yourself, you can get the latest version on GitHub.
Run Piccolo AI Locally
To try out Piccolo AI on your machine, we recommend using Docker.
Prerequisites
- Install and start docker and docker-compose - https://docs.docker.com/engine/
  - Note: Windows users will need to use the WSL 2 backend. Follow the instructions here
  - Note: If using Docker Desktop, go to Preferences > Resources > Advanced and set Memory to at least 12GB.
- Follow the post-installation instructions as well to avoid having to run docker commands as the sudo user
- Some functionality requires calling docker from within docker. To enable this, you will need to make the docker socket accessible to docker containers (a quick verification sketch follows this list):
  sudo chmod 666 /var/run/docker.sock
- By default the repository does not include docker images to generate model/compiler code for devices. You'll need to pull docker images for the compilers you want to use. See docker images:
  docker pull sensiml/sml_x86_generic:9.3.0-v1.0
  docker pull sensiml/sml_armgcc_generic:10.3.1-v1.0
  docker pull sensiml/sml_x86mingw_generic:9.3-v1.0
  docker pull sensiml/sensiml_tensorflow:0a4bec2a-v4.0
- Make sure you have the latest sensiml base docker image version. If not, you can do a docker pull:
  docker pull sensiml/base
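Before moving on, you can optionally confirm the prerequisites above are in place. This is a minimal verification sketch using standard Docker commands; the docker:cli image is only an illustrative way to test socket access from inside a container and is not something Piccolo AI requires:

  # Confirm docker runs without sudo (post-installation steps applied)
  docker run --rm hello-world
  # Confirm the docker socket is reachable from inside a container
  docker run --rm -v /var/run/docker.sock:/var/run/docker.sock docker:cli docker ps
  # Confirm the compiler and base images are present
  docker images | grep sensiml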
Start Piccolo AI
- Make sure docker is running by checking your system tray icons. If it is not running, open the Docker Desktop application in Windows to start the docker background service.
- Open an Ubuntu terminal through Windows Subsystem for Linux and clone the repository
  git clone https://github.com/sensiml/piccolo
  cd piccolo
- Use docker compose to start the services
  docker compose up
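If you would rather not keep a terminal attached to the services, the standard Docker Compose commands below run them in the background and let you follow the startup logs; this is ordinary Compose usage rather than anything specific to Piccolo AI:

  # Start the services in the background
  docker compose up -d
  # Follow the logs until startup completes
  docker compose logs -f
  # Stop and remove the services when you are done
  docker compose down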
Login via the Web UI Interface
Go to http://localhost:8000 in your browser to log in to the UI. See the Getting Started Guide to get started.
The default username and password are stored in the src/server/datamanager/fixtures/default.yaml file:
username: [email protected]
password: TinyML4Life
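Before logging in, you can check that the server is responding and inspect the fixture that holds the default account. These are plain curl and cat commands run from the repository root, not a Piccolo AI-specific interface:

  # Expect an HTTP response once the services have finished starting
  curl -I http://localhost:8000
  # View the fixture containing the default account
  cat src/server/datamanager/fixtures/default.yaml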
Step-by-Step Install Video
To help you get up and running quickly, you may also find our video installation guide useful. Click on the video below for a complete walkthrough of installing Piccolo AI on a Windows 10 / 11 PC:
Data Studio
The Data Studio is SensiML's standalone Data Capture, Labeling, and Machine Learning Model testing application. It is a companion application to Piccolo AI, but using it with your local Piccolo AI instance requires a subscription.
Upgrades
To upgrade, check out the latest code and run the database.intialize container script, which will perform any migrations. You may also need to pull the newest sensiml/base docker image when the underlying packages have changed.
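A minimal sketch of that upgrade flow is shown below; it assumes the compose service name matches the database.intialize script referenced above, so verify the exact name in the repository's docker-compose.yml before running it:

  # Fetch the latest code
  git pull
  # Refresh the base image if its underlying packages have changed
  docker pull sensiml/base
  # Run the initialization/migration step, then restart the services
  docker compose run --rm database.intialize
  docker compose up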
Documentation
Piccolo AI is the open-source version of SensiML ML Engine. As such, its features and capabilities align with the majority of those found in SensiML's ML Engine, for which documentation can be found here.
For more information about our documentation processes or to build them yourself, see the docs README.
Contribute
We welcome contributions from the community. For our contribution guidelines, see CONTRIBUTING.md.
Developer Guides
Documentation for developers coming soon!
Support
- To report a bug or request a feature, create a GitHub Issue. Please ensure someone else hasn't created an issue for the same topic.
- Need help using Piccolo AI? Reach out on the forum, and a fellow community member or SensiML engineer will be happy to help you out.