embedded.ai
Repository for DCA0306, an undergraduate course about Embedded Artificial Intelligence

Federal University of Rio Grande do Norte
Technology Center
Department of Computer Engineering and Automation
Embedded AI
References
- :books: Daniel Situnayake and Pete Warden. TinyML: Machine Learning with TensorFlow Lite on Arduino and Ultra-Low-Power Microcontrollers. [Link]
- :books: Gian Marco Iodice. TinyML Cookbook: Combine Artificial Intelligence and Ultra-low-power Embedded Devices to Make the World Smarter. [Link]
- :books: Aurélien Géron. Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow. [Link]
- :books: François Chollet. Deep Learning with Python. [Link]
Lessons
- Machine Learning Fundamentals
- You'll learn how machine learning models work, how to build them, and how to optimize them. By the end, you'll know the basics of building models that make data-driven predictions.
- :hourglass_flowing_sand: Estimated time: 10h
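The lesson's core loop, fit a model on training data and then score it on held-out data, can be sketched with scikit-learn (the library choice and toy dataset are illustrative, not necessarily what the lesson uses):

```python
# Minimal train/evaluate sketch: split data, fit a model, score on held-out data.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)                      # learn from the training split

acc = accuracy_score(y_test, model.predict(X_test))  # data-driven predictions
print(f"test accuracy: {acc:.2f}")
```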
- Git and Version Control
- You'll learn how to: a) organize your code using version control, b) resolve conflicts in version control, c) employ Git and GitHub to collaborate with others.
- :facepunch: Getting a Git repository
- :hourglass_flowing_sand: Estimated time: 5h
- Complementary materials
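"Getting a Git repository" comes down to two standard starting points; the commands below are plain Git (the repository name and placeholder URL are illustrative):

```shell
# Start a brand-new repository...
git init my-project              # creates my-project/ with an empty .git/

# ...or copy an existing one, history included:
# git clone <repository-url>

cd my-project
git status                       # the working tree starts clean
```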
Week 02: TinyML Fundamentals
- Three fundamental steps to explore a TinyML solution
- :page_facing_up: Further reading paper
- You'll learn how to: a) define mathematical functions using calculus; b) employ intermediate machine learning techniques.
- :hourglass_flowing_sand: Estimated time: 6h
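The "define mathematical functions using calculus" objective can be illustrated by pairing a function with its hand-derived derivative and checking it numerically (the function `f` here is illustrative):

```python
# Define a function and its analytic derivative, then verify the derivative
# with a central finite difference.
def f(x):
    return x**2 + 3 * x

def df(x):
    # d/dx (x^2 + 3x) = 2x + 3
    return 2 * x + 3

h = 1e-6
x = 2.0
numeric = (f(x + h) - f(x - h)) / (2 * h)   # central difference approximation

print(df(x), round(numeric, 4))  # both ≈ 7.0
```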
Week 03: TinyML Challenges
- What are the challenges for TinyML?
- AI lifecycle and ML workflow
- ML evaluation metrics
- Linear Algebra For Machine Learning
- You'll learn how to: a) understand the key ideas behind linear systems; b) apply these concepts to machine learning techniques.
- :hourglass_flowing_sand: Estimated time: 6h
- :page_facing_up: Further reading paper
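The ML evaluation metrics topic above covers the standard classification scores; a quick sketch with scikit-learn (library and toy labels are illustrative):

```python
# Common classification metrics on a small hand-made example.
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]  # one false negative, one false positive

print("accuracy :", accuracy_score(y_true, y_pred))   # 0.75
print("precision:", precision_score(y_true, y_pred))  # 0.75
print("recall   :", recall_score(y_true, y_pred))     # 0.75
print("f1       :", f1_score(y_true, y_pred))         # 0.75
```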
Week 04: Deep Learning Fundamentals I
- The big picture
- Introduction
- Hands-on DL fundamentals
- You'll learn how to: a) understand how neural networks are represented; b) understand how adding hidden layers can improve model performance; c) understand how neural networks capture nonlinearity in the data.
- :hourglass_flowing_sand: Estimated time: 8h
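Point (c) above can be seen directly on XOR, a problem no linear model can fit: one hidden layer with a nonlinear activation is enough. A minimal Keras sketch (Keras is assumed since the course references the TensorFlow books; layer sizes are illustrative):

```python
# A hidden layer with ReLU lets a network capture the nonlinearity of XOR.
import numpy as np
import tensorflow as tf

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype="float32")
y = np.array([[0], [1], [1], [0]], dtype="float32")  # XOR labels

model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(8, activation="relu"),      # hidden layer: nonlinearity
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

history = model.fit(X, y, epochs=500, verbose=0)
print(model.predict(X, verbose=0).round().flatten())
```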
Week 05: Deep Learning Fundamentals II
- A first image classification model using MLOps best practices
- Project :star2: :smiley_cat: :dog: :panda_face:
Week 06: Convolutional Neural Networks
Week 07: Using CNN to Classify Images
Week 08: Going Deeper with CNN
- Study of Classical Architectures
- LeNet-5
- Best practices
- Extensions using: batch normalization, dropout, data augmentation
- Sweeps (hyperparameter tuning)
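The three extensions listed above can be wired into a small LeNet-style Keras model as follows (layer sizes and augmentation choices are illustrative, not the course's exact architecture):

```python
# Batch normalization, dropout, and data augmentation in a LeNet-style CNN.
import tensorflow as tf

data_augmentation = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),   # augmentation is active only
    tf.keras.layers.RandomRotation(0.1),        # during training
])

model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    data_augmentation,
    tf.keras.layers.Conv2D(6, 5, activation="relu"),
    tf.keras.layers.BatchNormalization(),       # stabilizes and speeds up training
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(16, 5, activation="relu"),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(120, activation="relu"),
    tf.keras.layers.Dropout(0.5),               # regularization against overfitting
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.summary()
```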
Week 09: Going Deeper with CNN II
Week 10: Transfer Learning
Week 11: Edge Impulse crash course
- A brief overview of the Edge Impulse platform
- Data Acquisition
- Create an impulse design and a preprocessing task
- Training
- Understanding training evaluation metrics
- Model testing
- Live classification using a mobile phone
- AutoML configuration using EON Tuner
- Understanding the results of EON Tuner and versioning the model
- Set a primary model using EON Tuner and Transfer Learning
- Training an EON Tuner primary model using transfer learning
- Final remarks
Week 12: TFLite Optimizations and Quantization
- Post Training Quantization (PTQ)
- Introduction to TensorFlow Lite
- PTQ of MNIST
- A regression model using TensorFlow Lite
- Case study using Wandb developed by Ishan Dutta et al.
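Post-training quantization with the TFLite converter boils down to one converter flag; a hedged sketch on a tiny stand-in model (the course applies this to MNIST, the model below is illustrative):

```python
# Post-training quantization (PTQ): convert a Keras model to TFLite with
# dynamic-range quantization enabled.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # enables dynamic-range PTQ
tflite_model = converter.convert()                    # serialized FlatBuffer bytes

print(f"quantized model size: {len(tflite_model)} bytes")
```

For full integer quantization (weights and activations), the converter additionally needs a representative dataset to calibrate activation ranges.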