
BlindAI Preview (no longer maintained; merged into the main blindai repository)




BlindAI

Website Blog LinkedIn

BlindAI is an AI inference server with an added privacy layer, protecting the data sent to models.

Explore the docs »

Try Demo · Report Bug · Request Feature

Table of Contents
  1. About The Project
    • Built With
  2. Getting Started
    • Prerequisites
    • Installation
  3. Usage
  4. Getting Help
  5. License
  6. Contact

🔒 About The Project

BlindAI facilitates privacy-friendly AI model deployment by letting AI engineers upload models to, and delete them from, their secure server instance using our Python API. Clients can then connect to the server, upload their data, and run models on it without compromising on privacy.

Data sent by users to the AI model is kept confidential at all times. Neither the AI service provider nor the cloud provider (if applicable) can see the data.

Confidentiality is assured by hardware-enforced Trusted Execution Environments. We explain how they keep data and models safe in detail here.

Built With

Rust Python Intel-SGX Tract

(back to top)

🚀 Getting Started

We strongly recommend you get started with our Quick tour to discover BlindAI with a hands-on example using COVID-Net.

But here’s a taste of what using BlindAI could look like 🍒

AI company's side

Uploading and deleting models

An AI company wants to provide their model as an easy-to-use service. They upload it to the server, which assigns it a model ID.

response = client_1.upload_model(model="./COVID-Net-CXR-2.onnx")
MODEL_ID = response.model_id
print(MODEL_ID)

8afcdab8-209e-4b93-9403-f3ea2dc0c3ae

Once collaboration with clients is complete, the AI company can delete their model from the server.

# AI company deletes model after use
client_1.delete_model(MODEL_ID)
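
The upload/delete lifecycle above can be sketched with a pure-Python mock (no BlindAI server involved; `MockServer` and its in-memory storage are hypothetical stand-ins used only to illustrate the flow, not part of the BlindAI API):

```python
import uuid


class MockServer:
    """Hypothetical in-memory stand-in for a BlindAI server instance."""

    def __init__(self):
        self._models = {}

    def upload_model(self, model: str) -> str:
        # The real server assigns a unique model ID on upload.
        model_id = str(uuid.uuid4())
        self._models[model_id] = model
        return model_id

    def delete_model(self, model_id: str) -> None:
        # Deleting makes the model unavailable to subsequent clients.
        del self._models[model_id]


server = MockServer()
model_id = server.upload_model("./COVID-Net-CXR-2.onnx")
print(model_id)          # a fresh UUID, e.g. 8afcdab8-209e-4b93-9403-f3ea2dc0c3ae
server.delete_model(model_id)
```

The key design point this mirrors is that the server, not the uploader, mints the model ID, so clients only ever need the ID to address a model.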

Client's side

Running a model on confidential data

The client wants to feed their confidential data to the model while protecting it from third-party access. They connect to the server and run the model on a confidential image, `positive`.

pos_ret = client_2.run_model(MODEL_ID, positive)
print("Probability of Covid for positive image is", pos_ret.output[0].as_flat()[0][1])

Probability of Covid for positive image is 0.890598714351654
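
The returned probability can then be turned into a label on the client side. A minimal sketch (the `classify` helper and the 0.5 cutoff are illustrative choices, not part of the BlindAI API):

```python
def classify(prob_positive: float, threshold: float = 0.5) -> str:
    """Map the model's positive-class probability to a human-readable label."""
    return "COVID positive" if prob_positive >= threshold else "COVID negative"


# Using the probability returned in the run above:
print(classify(0.890598714351654))  # COVID positive
```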

For more examples, please refer to the Documentation.

Installation

🥇 Recommended 🥇

Deploying BlindAI on Azure DCsv3 VM

  • ✅ No requirement to have your own Intel SGX-ready device or a particular distribution.
  • ✅ Secure. Hardware security guarantees protect your data and model from any third-party access.
  • ❌ Can be more expensive than local deployment.

You can deploy the server in your Azure DCsv3 VM using our docker image with the following command:

docker run -it -e BLINDAI_AZURE_DCS3_PATCH=1 -p 9923:9923 -p 9924:9924 \
--device /dev/sgx/enclave --device /dev/sgx/provision \
-v /var/run/aesmd/aesm.socket:/var/run/aesmd/aesm.socket \
mithrilsecuritysas/blindai-preview-server:latest /root/start.sh

For alternative deployment methods (on-premise, testing only...) or more information, visit our installation guide.

(back to top)

🙋 Getting Help

📜 License

Distributed under the Apache License, version 2.0. See LICENSE.md for more information.

📇 Contact

Mithril Security - @MithrilSecurity - [email protected]

Project Link: https://github.com/mithril-security/blindai-preview

(back to top)