§Edge Impulse Runner
A Rust library for running inference with Edge Impulse models and uploading data to Edge Impulse projects. This crate provides safe and easy-to-use interfaces for:
- Running machine learning models on Linux and macOS
- Uploading training, testing and anomaly data to Edge Impulse projects
§Inference Modes
The crate supports two inference modes:
§EIM Mode (Default)
- Runs Edge Impulse models (.eim files) as separate model processes, communicating over Unix socket-based IPC
- Requires model files to be present on the filesystem
- Compatible with all Edge Impulse deployment targets
§FFI Mode
- Direct FFI calls to the Edge Impulse C++ SDK
- Models are compiled into the binary
- Faster startup and inference times
- Requires the `ffi` feature to be enabled
§Model Support
- Classification models: Multi-class and binary classification
- Object detection models: Bounding box detection with labels
- Anomaly detection: Visual and sensor-based anomaly detection
- Sensor types: Camera, microphone, accelerometer, positional sensors
- Continuous mode: Real-time streaming inference support (see the sketch after this list)
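The loop below is a minimal sketch of streaming use under the simplest assumption: you call `infer` yourself on each new window of samples. `read_window` is a hypothetical placeholder for your sensor capture code, and the crate's dedicated continuous-mode API may differ.

```rust
use edge_impulse_runner::{EdgeImpulseModel, InferenceResult};

// Hypothetical helper standing in for your sensor capture code; it
// should return one window of normalized samples per call.
fn read_window() -> Vec<f32> {
    vec![0.0; 1024]
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mut model = EdgeImpulseModel::new("path/to/model.eim")?;

    // Feed each new window of samples to the model as it arrives.
    loop {
        let features = read_window();
        let result = model.infer(features, None)?;
        if let InferenceResult::Classification { classification } = result.result {
            println!("Classification: {:?}", classification);
        }
    }
}
```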
§Data Ingestion
- Upload data to Edge Impulse projects
- Support for multiple data categories:
- Training data
- Testing data
- Anomaly data
- Handle various file formats:
- Images (JPG, PNG)
- Audio (WAV)
- Video (MP4, AVI)
- Sensor data (CBOR, JSON, CSV)
§Quick Start Examples
§EIM Mode (Default)
```rust
use edge_impulse_runner::{EdgeImpulseModel, InferenceResult};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create a new model instance with an EIM file
    let mut model = EdgeImpulseModel::new("path/to/model.eim")?;

    // Prepare normalized features (e.g., image pixels, audio samples)
    let features: Vec<f32> = vec![0.1, 0.2, 0.3];

    // Run inference
    let result = model.infer(features, None)?;

    // Process results
    match result.result {
        InferenceResult::Classification { classification } => {
            println!("Classification: {:?}", classification);
        }
        InferenceResult::ObjectDetection {
            bounding_boxes,
            classification,
        } => {
            println!("Detected objects: {:?}", bounding_boxes);
            if !classification.is_empty() {
                println!("Classification: {:?}", classification);
            }
        }
        InferenceResult::VisualAnomaly {
            visual_anomaly_grid,
            visual_anomaly_max,
            visual_anomaly_mean,
            anomaly,
        } => {
            let (normalized_anomaly, normalized_max, normalized_mean, normalized_regions) =
                model.normalize_visual_anomaly(
                    anomaly,
                    visual_anomaly_max,
                    visual_anomaly_mean,
                    &visual_anomaly_grid
                        .iter()
                        .map(|bbox| {
                            (
                                bbox.value,
                                bbox.x as u32,
                                bbox.y as u32,
                                bbox.width as u32,
                                bbox.height as u32,
                            )
                        })
                        .collect::<Vec<_>>(),
                );
            println!("Anomaly score: {:.2}%", normalized_anomaly * 100.0);
            println!("Maximum score: {:.2}%", normalized_max * 100.0);
            println!("Mean score: {:.2}%", normalized_mean * 100.0);
            for (value, x, y, w, h) in normalized_regions {
                println!(
                    "Region: score={:.2}%, x={}, y={}, width={}, height={}",
                    value * 100.0, x, y, w, h
                );
            }
        }
    }
    Ok(())
}
```
§FFI Mode
```rust
use edge_impulse_runner::{EdgeImpulseModel, InferenceResult};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create a new model instance in FFI mode (model compiled into the binary)
    let mut model = EdgeImpulseModel::new_ffi(false)?;

    // Prepare normalized features (e.g., image pixels, audio samples)
    let features: Vec<f32> = vec![0.1, 0.2, 0.3];

    // Run inference
    let result = model.infer(features, None)?;

    // Process results (same as EIM mode)
    match result.result {
        InferenceResult::Classification { classification } => {
            println!("Classification: {:?}", classification);
        }
        InferenceResult::ObjectDetection {
            bounding_boxes,
            classification,
        } => {
            println!("Detected objects: {:?}", bounding_boxes);
            if !classification.is_empty() {
                println!("Classification: {:?}", classification);
            }
        }
        InferenceResult::VisualAnomaly { .. } => {
            println!("Anomaly detection result");
        }
    }
    Ok(())
}
```
§Data Upload
```rust
use edge_impulse_runner::ingestion::{Category, Ingestion, UploadOptions};

// `upload_file` is async, so it must be awaited inside an async function
// driven by an async runtime (e.g., tokio).
async fn upload() -> Result<(), Box<dyn std::error::Error>> {
    // Create client with API key
    let ingestion = Ingestion::new("your-api-key".to_string());

    // Upload a file
    let result = ingestion
        .upload_file(
            "data.jpg",
            Category::Training,
            Some("label".to_string()),
            Some(UploadOptions {
                disallow_duplicates: true,
                add_date_id: true,
            }),
        )
        .await?;
    Ok(())
}
```
§Architecture
§Backend Abstraction
The crate uses a trait-based backend system that allows switching between different inference modes:
- EIM Backend: Uses Unix socket-based IPC to communicate with Edge Impulse model processes. The protocol is JSON-based and follows a request-response pattern for model initialization, classification requests, and error handling.
- FFI Backend: Direct FFI calls to the Edge Impulse C++ SDK, providing faster startup times and lower latency by eliminating IPC overhead.
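To make the abstraction concrete, here is a hedged sketch of what such a trait-based backend seam can look like. The `InferenceBackend` trait and both structs are hypothetical stand-ins, not the crate's actual `backends` API:

```rust
use std::error::Error;

// Hypothetical trait: the crate's real backend trait may differ in
// names and signatures.
trait InferenceBackend {
    fn infer(&mut self, features: Vec<f32>) -> Result<Vec<f32>, Box<dyn Error>>;
}

// EIM: talks to a separate model process over a Unix socket, exchanging
// JSON request-response messages.
struct EimBackend {
    // socket handle to the model process would live here
}

// FFI: calls directly into the C++ SDK compiled into this binary.
struct FfiBackend;

impl InferenceBackend for EimBackend {
    fn infer(&mut self, _features: Vec<f32>) -> Result<Vec<f32>, Box<dyn Error>> {
        // Serialize a JSON classification request, write it to the socket,
        // then read and decode the JSON response.
        unimplemented!("IPC round trip elided in this sketch")
    }
}

impl InferenceBackend for FfiBackend {
    fn infer(&mut self, _features: Vec<f32>) -> Result<Vec<f32>, Box<dyn Error>> {
        // Call the linked SDK directly; no serialization or IPC overhead,
        // which is where the faster startup and lower latency come from.
        unimplemented!("FFI call elided in this sketch")
    }
}
```

Because both backends sit behind one seam, the `EdgeImpulseModel` API that callers see is the same in either mode; the Quick Start examples above differ only in how the model is constructed.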
§Ingestion API
The ingestion module interfaces with the Edge Impulse Ingestion API over HTTPS, supporting both data and file endpoints for uploading samples to Edge Impulse projects.
§Prerequisites
Some functionality (particularly video capture) requires GStreamer to be installed:
- macOS: Install both runtime and development packages from gstreamer.freedesktop.org
- Linux: Install required packages (libgstreamer1.0-dev and related packages)
§Error Handling
The crate uses the `EdgeImpulseError` type to provide detailed error information:
```rust
use edge_impulse_runner::{EdgeImpulseError, EdgeImpulseModel};

fn main() {
    // Match on model creation
    match EdgeImpulseModel::new("model.eim") {
        Ok(mut model) => {
            // Match on inference
            match model.infer(vec![0.1, 0.2, 0.3], None) {
                Ok(_result) => println!("Success!"),
                Err(EdgeImpulseError::InvalidInput(msg)) => println!("Invalid input: {}", msg),
                Err(e) => println!("Other error: {}", e),
            }
        }
        Err(e) => println!("Failed to load model: {}", e),
    }
}
```
§Modules
- `backends`: Backend abstraction and implementations (EIM, FFI)
- `error`: Error types and handling
- `ffi`: Safe Rust bindings for the Edge Impulse C++ SDK (when the `ffi` feature is enabled)
- `inference`: Model management and inference functionality
- `ingestion`: Data upload and project management
- `types`: Common types and parameters
§Cargo Features
- `eim` (default): Enable EIM binary communication mode
- `ffi`: Enable FFI direct mode (requires the `edge-impulse-ffi-rs` dependency)
Only one backend should be enabled at a time. Enabling both features simultaneously is not supported and may cause conflicts.
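As a sketch of how application code can stay agnostic to the selected feature, the `load_model` helper below gates construction on the active backend. The helper itself is hypothetical; the constructors are the ones from the Quick Start examples above.

```rust
use edge_impulse_runner::EdgeImpulseModel;

// Hypothetical helper: picks the constructor matching the enabled Cargo
// feature. With neither feature enabled the function has no body and
// fails to compile, surfacing the one-backend requirement at build time.
fn load_model() -> Result<EdgeImpulseModel, Box<dyn std::error::Error>> {
    // EIM mode: load a .eim model file from disk.
    #[cfg(feature = "eim")]
    return Ok(EdgeImpulseModel::new("path/to/model.eim")?);

    // FFI mode: the model is compiled into this binary.
    #[cfg(all(feature = "ffi", not(feature = "eim")))]
    return Ok(EdgeImpulseModel::new_ffi(false)?);
}
```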
§Re-exports
pub use inference::messages::InferenceResponse;
pub use inference::messages::InferenceResult;
pub use error::EdgeImpulseError;
pub use inference::model::EdgeImpulseModel;
pub use types::BoundingBox;
pub use types::ModelParameters;
pub use types::ProjectInfo;
pub use types::SensorType;
pub use types::TimingInfo;