Metahuman-Emotion-Recognition
Emotionally responsive Virtual Metahuman CV with Real-Time User Facial Emotion Detection (Unreal Engine 5).
🧠💻 Real-Time Emotion Recognition Metahuman 🧠💻
Creating real-time facial and emotion recognition software to pair with an Unreal Engine 5 MetaHuman that predicts and mimics user emotions.
Supported Emotions:
- Anger 😡
- Fear 😨
- Happy 😄
- Neutral 😐
- Sad 😢
- Surprised 😲
Our Virtual Humans
They're terrifying, we know.
Example Results
(Showcasing Happy & Surprised)
Features
- Emotion Recognition
- Age & Gender Prediction
- Environment Variable Customization
- Headless mode
- Flexible Emotion Selection: choose between the most common emotion over every X captured emotions or the latest captured emotion (see the sketch below).
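For illustration, here is a minimal sketch of the two selection strategies. The function name and defaults are assumptions for this example, not the project's actual API:

```python
# Minimal sketch of the two selection modes (names are illustrative, not the project's API).
from collections import Counter

def select_emotion(history, mode="most_common"):
    """Pick the emotion to mimic from a list of recently captured labels."""
    if not history:
        return "Neutral"
    if mode == "latest":
        return history[-1]                        # use the newest capture only
    return Counter(history).most_common(1)[0][0]  # majority vote over the captured window

print(select_emotion(["Happy", "Happy", "Surprised"]))        # -> Happy
print(select_emotion(["Happy", "Surprised"], mode="latest"))  # -> Surprised
```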
Installation & Setup Process
Run locally:
- Clone this repository, cd into it, and install dependencies:
git clone https://github.com/Prem-ium/Metahuman-Emotion-Recognition.git
cd Metahuman-Emotion-Recognition
pip install -r requirements.txt
- Configure your .env file (see below and the example file for options)
- Run the main script:
python emotional-detection-main.py
- Open Unreal Engine Project & Run the Blueprint
- Click the button to trigger the text reader to process the most common emotion recorded.
- The Metahuman mimics the user's most common emotion.
- Repeat the previous two steps (button trigger and mimic) until you want to stop.
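The Blueprint's text reader consumes the emotion selected by the Python script. The exact file name and format are defined by the project, so the snippet below is only a sketch of that handoff, with emotion.txt as an assumed output file:

```python
# Illustrative handoff only: write the selected emotion to a text file that the
# Unreal Engine Blueprint's text reader can load when the button is pressed.
# The real output path and format are defined by the project, not by this sketch.
from pathlib import Path

def publish_emotion(emotion: str, out_file: str = "emotion.txt") -> None:
    Path(out_file).write_text(emotion)

publish_emotion("Happy")
```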
Environment Variables
Configure your variables in a .env file in the same directory as the script. All .env variables are optional and fall back to default values when not specified.
| Variable | Description | Default Value |
|---|---|---|
| HEADLESS | True or False. Whether to open a GUI for testing webcam accuracy. | True |
| PRODUCTION | True or False. Whether the program is running in Unreal Engine. | False |
| DELAY | Integer value of how many seconds the program will wait before starting the next iteration. | - |
| FILE_PATH | Path of the directory containing the model and weights. | - |
| WEIGHTS | Name of the model being used. | - |
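As a rough sketch (not the script's actual code), these variables could be loaded with python-dotenv. The fallback values for DELAY, FILE_PATH, and WEIGHTS below are placeholders, since their defaults are not documented:

```python
# Sketch of loading the .env configuration (assumes python-dotenv is installed).
# Fallbacks for DELAY, FILE_PATH, and WEIGHTS are placeholders, not documented defaults.
import os
from dotenv import load_dotenv

load_dotenv()  # reads the .env file in the current working directory

HEADLESS   = os.getenv("HEADLESS", "True").lower() == "true"     # GUI/headless toggle, see table above
PRODUCTION = os.getenv("PRODUCTION", "False").lower() == "true"  # whether running in Unreal Engine
DELAY      = int(os.getenv("DELAY", "5"))                        # seconds between iterations
FILE_PATH  = os.getenv("FILE_PATH", "./models/")                 # directory containing the model and weights
WEIGHTS    = os.getenv("WEIGHTS", "model_weights.h5")            # name of the model being used
```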
Donations
I've been working on this project for a few months now, and I'm really happy with how it's turned out. It detects a user's age, gender, and emotions and pairs them with a hyper-realistic virtual human built in Unreal Engine 5. I'm currently adding new features to the script, working on other similar programs to generate passive income, and making the script more user-friendly and accessible to a wider audience.
I'm accepting donations through GitHub Sponsors (no fees!) or Buy Me a Coffee. Any amount you can donate will be greatly appreciated.

Your donations will help me to cover the costs of hosting the project, developing new features, and marketing the project to a wider audience. Thank you for your support!
License
Capstone Group Members
- Python Scripting: Prem Patel (@Prem-ium)
- Integrating Script/Blueprint into Unreal Engine: Gabe Vindas (@GabeV95) & Dustin Lynn (@Onemorehell)
- Creating Unreal Engine Emotion Animations: Matthew Goetz