# recog-ai-demo

**recog-ai-demo** is a web application that showcases an AI-powered recognition workflow. Leveraging large language models (LLMs), it harmonizes and compares module descriptions on a semantic level.
## Features

- **Module Description Harmonization:** Parses and harmonizes uploaded module descriptions so they can be compared with internally stored modules in a vector database.
- **Internal Module Suggestions:** Uses the vector database and semantic similarity to suggest internal modules with a high chance of being recognized for the uploaded external module (see the sketch after this list).
- **Module Comparison and Recognition Possibility:** Compares an external module with an internal module and evaluates the possibility of recognition based on predefined criteria.
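For illustration, the suggestion step might be sketched roughly as below: a semantic query against a chromadb collection of internal module descriptions. The collection name, storage path, embedding model, and function name are assumptions for this sketch, not code taken from this repository.

```python
# Hypothetical sketch of the "Internal Module Suggestions" step.
# Collection name, path, and embedding model are illustrative assumptions.
import chromadb
from chromadb.utils import embedding_functions

client = chromadb.PersistentClient(path="./chroma_store")
openai_ef = embedding_functions.OpenAIEmbeddingFunction(
    api_key="your_openai_api_key",  # normally loaded from the .env file
    model_name="text-embedding-ada-002",
)
collection = client.get_or_create_collection(
    name="internal_modules", embedding_function=openai_ef
)

def suggest_internal_modules(external_description: str, n: int = 5) -> list[str]:
    """Return ids of the internal modules most similar to the uploaded description."""
    result = collection.query(query_texts=[external_description], n_results=n)
    return result["ids"][0]
```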
## Installation

To install and run the application locally, follow these steps:

1. Clone the repository:

   ```bash
   git clone https://github.com/pascalhuerten/recog-ai-demo.git
   cd recog-ai-demo
   ```

2. Create a `.env` file in the project root and set your OpenAI API key:

   ```
   OPENAI_API_KEY=your_openai_api_key
   ```
3. Create a vector store (as an alternative to the proprietary vector store):

   - Prepare your module descriptions in a suitable format (e.g., JSON or plain text).
   - Modify the application code to read the module descriptions and create a vector store using chromadb, adjusting paths and configurations as needed. A hedged sketch of this step follows the installation instructions.
4. Build the Docker image:

   ```bash
   docker build -t recog-ai-demo .
   ```

5. Run the Docker container:

   ```bash
   docker run -p 80:80 recog-ai-demo
   ```

   The application will now be accessible at http://localhost:80.
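As a rough illustration of step 3, populating the vector store from your own module descriptions could look like the sketch below. The JSON layout (a file mapping module ids to description texts), the file name, and the collection name are assumptions, not the repository's actual configuration.

```python
# Hypothetical sketch of building the chromadb vector store (installation step 3).
# The file "internal_modules.json" with {"<module_id>": "<description>", ...}
# and all paths/names are assumptions for this sketch.
import json
import chromadb
from chromadb.utils import embedding_functions

client = chromadb.PersistentClient(path="./chroma_store")
openai_ef = embedding_functions.OpenAIEmbeddingFunction(
    api_key="your_openai_api_key",  # normally loaded from the .env file
    model_name="text-embedding-ada-002",
)
collection = client.get_or_create_collection(
    name="internal_modules", embedding_function=openai_ef
)

with open("internal_modules.json", encoding="utf-8") as f:
    modules = json.load(f)

# Embed and store every internal module description.
collection.add(ids=list(modules.keys()), documents=list(modules.values()))
```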
## Usage

- Access the application at http://localhost:80.
- Upload a module description file (PDF, TXT, or XML) or enter the description in the provided text area.
- Click "Find Modules" to get module suggestions based on the description.
- Select an external module and an internal module for comparison.
- Click "Select Module" to see the examination result, including the recognition possibility. A hedged sketch of this comparison step follows below.