# Emo-Aware-LLM

An emotion-aware AI assistant that uses facial recognition to detect emotions and provide contextual responses using LLMs.

## Prerequisites

- Python 3.8+
- Webcam
- Microphone
- OpenAI API key

## Installation

1. Clone the repository:

   ```shell
   git clone https://github.com/siddvoh/Emo-Aware-LLM.git
   cd Emo-Aware-LLM
   ```

2. Create and activate a virtual environment:

   ```shell
   python3 -m venv venv
   source venv/bin/activate  # On Windows, use: venv\Scripts\activate
   ```

3. Install dependencies:

   ```shell
   pip install -r requirements.txt
   ```

4. Set up environment variables:

   ```shell
   cp .env.example .env
   # Add your OpenAI API key to the .env file
   ```
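Once the `.env` file is populated, the application presumably reads the key from the environment at startup. The sketch below shows one minimal way to do that; the variable name `OPENAI_API_KEY` is an assumption (check `.env.example` for the name the project actually uses):

```python
import os


def load_api_key(var: str = "OPENAI_API_KEY") -> str:
    """Read the OpenAI API key from the environment, failing loudly if absent.

    Assumes the key was exported into the environment (e.g. by a .env loader
    such as python-dotenv); the variable name here is a guess.
    """
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"{var} is not set; add it to your .env file")
    return key
```

Failing fast here, rather than letting an unset key surface later as an opaque API error, makes misconfiguration obvious on the first run.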

## Usage

Run the assistant:

```shell
python src/main.py
```
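An assistant of this kind typically maps the emotion detected from the webcam feed to a tailored system prompt before calling the LLM. The repository's actual module names and emotion labels are not shown here, so the following is a hypothetical sketch of that mapping step (labels follow the common FER categories):

```python
# Hypothetical sketch: adapt the LLM system prompt to the detected emotion.
# The emotion label set and prompt wording are assumptions, not the repo's own.
EMOTION_PROMPTS = {
    "happy": "The user appears happy. Match their upbeat tone.",
    "sad": "The user appears sad. Respond with empathy and support.",
    "angry": "The user appears angry. Stay calm and de-escalate.",
    "neutral": "The user appears neutral. Respond normally.",
}


def build_system_prompt(emotion: str) -> str:
    """Return an LLM system prompt tailored to the detected facial emotion.

    Unknown labels fall back to the neutral prompt so the assistant still
    responds sensibly if the detector emits an unexpected category.
    """
    base = "You are an emotion-aware assistant. "
    return base + EMOTION_PROMPTS.get(emotion, EMOTION_PROMPTS["neutral"])
```

The resulting string would be passed as the system message in the chat completion request, alongside the user's spoken or typed input.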

## License

MIT License
