Integration of face tracking and human-machine interaction with the ChatGPT API on the head of the INMOOV robot
This is my 4th-year project in the service robotics major at CPE Lyon.
- Perrichet Theotime
This project develops a solution for using the INMOOV robot head to perform face tracking and to establish human-machine interaction through the ChatGPT API. The goal is to enable the robot to detect and track human faces, then interact with users via an advanced language model accessed through the ChatGPT API. The project combines the robot's visual recognition capabilities with natural language processing techniques to create a natural interaction between humans and machines.
YouTube link to the pitch and tutorial of the project
Creating a virtual environment:
python3 -m venv env_inmoov
source env_inmoov/bin/activate
Installing the Python libraries:
pip install -r requirements.txt
- INMOOV robot (head)
- A camera
- A microphone
- 2 Arduino boards
- An OpenAI account
Face tracking (see the sketch below):
- Face detection
- Face tracking
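A minimal face-detection sketch, assuming OpenCV and its bundled Haar cascade; the actual face_detector.py may use a different detector or camera index:

```python
# Face detection sketch (assumption: OpenCV with the bundled Haar cascade).
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # camera index 0 is an assumption
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Centre of the face: the tracking loop can steer the head servos
        # to keep this point in the middle of the frame.
        cx, cy = x + w // 2, y + h // 2
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("face tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

The centre of the detected face gives the error signal the head servos can use to keep the face in the middle of the image.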
Human-machine interaction with the ChatGPT API (see the sketch below):
- Speech To Text
- Text To Speech
- ChatGPT
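A minimal sketch of the interaction pipeline, assuming the speech_recognition package (Google Web Speech API) for STT, the pre-1.0 openai package for ChatGPT, and pyttsx3 for TTS; the actual conversation.py may use other services or models:

```python
# Conversation pipeline sketch: STT -> ChatGPT -> TTS (library choices assumed).
import speech_recognition as sr
import pyttsx3
import openai

openai.api_key = "sk-..."  # placeholder: in this project the token is read from API/mpd.txt

recognizer = sr.Recognizer()
tts = pyttsx3.init()
history = [{"role": "system", "content": "You are the head of the InMoov robot."}]

with sr.Microphone() as mic:
    while True:
        audio = recognizer.listen(mic)                      # Speech To Text
        try:
            user_text = recognizer.recognize_google(audio)
        except sr.UnknownValueError:
            continue                                        # nothing understood
        history.append({"role": "user", "content": user_text})
        reply = openai.ChatCompletion.create(               # ChatGPT
            model="gpt-3.5-turbo", messages=history
        ).choices[0].message.content
        history.append({"role": "assistant", "content": reply})
        tts.say(reply)                                      # Text To Speech
        tts.runAndWait()
```

Keeping the full message history in `history` is what lets ChatGPT hold a coherent multi-turn conversation.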
Mouth movement driven by the audio amplitude (see the sketch below)
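A sketch of driving the jaw from the audio amplitude, assuming the TTS output is a 16-bit WAV file and that the jaw Arduino accepts one servo angle per line over serial; port name, angles and serial protocol are assumptions:

```python
# Jaw-from-audio sketch: RMS amplitude of each ~50 ms chunk mapped to a servo angle.
import time
import wave
import numpy as np
import serial

JAW_PORT = "/dev/ttyACM1"      # hypothetical port of the jaw Arduino
CLOSED, OPEN = 0, 40           # hypothetical jaw servo angles

ser = serial.Serial(JAW_PORT, 9600, timeout=1)
wav = wave.open("response.wav", "rb")          # TTS output (16-bit PCM assumed)
chunk = wav.getframerate() // 20               # ~50 ms of samples per chunk

frames = wav.readframes(chunk)
while frames:
    samples = np.frombuffer(frames, dtype=np.int16)
    rms = np.sqrt(np.mean(samples.astype(np.float64) ** 2))
    angle = int(np.interp(rms, [0, 10000], [CLOSED, OPEN]))   # amplitude -> angle
    ser.write(f"{angle}\n".encode())
    time.sleep(0.05)                           # stay roughly in step with playback
    frames = wav.readframes(chunk)

ser.write(f"{CLOSED}\n".encode())              # close the mouth at the end
```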
Using 2 Arduino boards (see the sketch below):
- Arduino 1: head movement management
- Arduino 2: mouth movement management
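On the Python side, each board is simply a separate serial port; the port names and command formats below are assumptions (check the actual ports with the Arduino IDE or `ls /dev/ttyACM*`):

```python
# Two-board sketch: one serial connection per Arduino (ports and protocol assumed).
import serial

head = serial.Serial("/dev/ttyACM0", 9600, timeout=1)  # Arduino 1: head servos
jaw = serial.Serial("/dev/ttyACM1", 9600, timeout=1)   # Arduino 2: jaw servo

head.write(b"90,90\n")   # example pan,tilt command (format assumed)
jaw.write(b"0\n")        # example jaw-closed command (format assumed)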
See this link
- An mpd.txt file containing the OpenAI API token
- An organisation.txt file containing the organisation identifier of the OpenAI account
Place both files in the API folder (see the sketch below for how they can be loaded).
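How the scripts might load these credentials; the file names come from this README, while the relative API/ path and the pre-1.0 openai package are assumptions:

```python
# Credential-loading sketch (API/ folder path assumed relative to the scripts).
import openai

with open("API/mpd.txt") as f:
    openai.api_key = f.read().strip()
with open("API/organisation.txt") as f:
    openai.organization = f.read().strip()
```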
- Upload the inmoov_jaw.ino code to the Arduino controlling the mouth
- Upload the face_detector.ino code to the Arduino controlling the robot head
python3 conversation.py   # starts the conversation with ChatGPT
python3 face_detector.py  # starts face detection
- Create a virtual environment
- Implement STT (Speech To Text) and TTS (Text To Speech) in the cloud
- Run Text To Speech locally
- Write code to hold a conversation with ChatGPT
- Produce a program combining the robot's various functions using multiprocessing (see the sketch after this list)
- Make the robot's mouth move according to the amplitude of the audio
- Ensure that audio playback and mouth movement stay coordinated
- Use a single Arduino
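A skeleton of how the features can be combined with multiprocessing; the function names are placeholders standing in for the loops in face_detector.py and conversation.py:

```python
# Multiprocessing skeleton: run face tracking and the conversation in parallel.
from multiprocessing import Process

def run_face_tracking():
    # placeholder: import and run the face-tracking loop (see face_detector.py)
    ...

def run_conversation():
    # placeholder: import and run the ChatGPT conversation loop (see conversation.py)
    ...

if __name__ == "__main__":
    tracking = Process(target=run_face_tracking)
    conversation = Process(target=run_conversation)
    tracking.start()
    conversation.start()
    tracking.join()
    conversation.join()
```

Separate processes keep the camera loop responsive even while the conversation process is waiting on the network or playing audio.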