
Security Findings Recommender System

Overview

The Security Findings Recommender System is designed to assist in identifying, managing, and reporting security vulnerabilities. Leveraging AI-powered analysis, the system generates detailed reports that categorize findings and provide actionable recommendations.

Project Goals

  • Automate Vulnerability Reporting: Streamline the process of generating vulnerability reports from raw data.
  • Enhance Findings with AI: Use AI to categorize findings and suggest solutions.
  • Improve Security Management: Help users prioritize and address security vulnerabilities according to severity.

Features

  • AI-powered categorization and solution generation
  • Customizable categories and solutions
  • Detailed single finding solutions and aggregated vulnerability multi-finding reports
  • API interface for easy integration

For a detailed architectural overview and component descriptions, refer to the Project Architecture Documentation.

System Overview

The system consists of three main layers:

  1. Data Layer: Manages core data structures such as VulnerabilityReport, Finding, and Solution.
  2. Aggregation Layer: Handles the grouping and processing of findings using classes like FindingBatcher, FindingGrouper, and AgglomerativeClusterer.
  3. LLM (Language Model) Layer: Provides natural language processing capabilities through services such as OLLAMAService and OpenAIService.
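As an illustration of how the data-layer types named above might relate, here is a minimal sketch; the fields and the `by_severity` helper are illustrative assumptions, not the repository's actual schema:

```python
from __future__ import annotations
from dataclasses import dataclass, field

# Hypothetical shapes for the data-layer types named in this README.
@dataclass
class Solution:
    short_description: str
    steps: list[str] = field(default_factory=list)

@dataclass
class Finding:
    title: str
    severity: str
    category: str | None = None    # filled in by the LLM layer
    solution: Solution | None = None

@dataclass
class VulnerabilityReport:
    findings: list[Finding] = field(default_factory=list)

    def by_severity(self, severity: str) -> list[Finding]:
        """Return all findings at the given severity level."""
        return [f for f in self.findings if f.severity == severity]

report = VulnerabilityReport([Finding("Outdated TLS", "high"),
                              Finding("Verbose banner", "low")])
print(len(report.by_severity("high")))  # → 1
```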

For more details, see the System Overview.

Prerequisites

Environment Setup

  • Copy the .env.docker.example file to .env.docker and fill in the required values.
  • For running without Docker, ensure the correct URL for the Ollama API is set in your .env file: OLLAMA_URL=http://localhost:11434.
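Concretely, the one value this README specifies for a non-Docker run is the Ollama URL, so a minimal `.env` looks like the fragment below (any further keys come from `.env.docker.example` and are not listed here):

```
# .env (local, non-Docker runs)
OLLAMA_URL=http://localhost:11434
```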

For detailed setup instructions, refer to the Prerequisites Documentation.

Installation

  1. Clone the Repository:
    git clone <repository-url>
    cd <repository-directory>

Docker (details in Docker Installation Guide)

  1. Build and run the application using Docker:

    docker compose up -d --build

By default, Compose does not start the Ollama service. To include it, enable the ollama profile:

    docker compose --profile ollama up -d --build

Local Development (details in Local Development Guide)

  1. Set Up Environment Variables: Ensure the .env file is configured as described in the prerequisites.

  2. Install Dependencies:

    pipenv install
  3. Run the Application:

    cd src && pipenv shell
    python app.py

Usage

Accessing the Application

  • API: Available at http://localhost:8000
  • Dashboard: Available at http://localhost:3000

Available Routes (excerpt)

  • GET /api/v1/tasks/: Retrieve all tasks.
  • DELETE /api/v1/tasks/{task_id}: Delete a specific task by ID.
  • POST /api/v1/recommendations/: Retrieve recommendations.
  • POST /api/v1/recommendations/aggregated: Retrieve aggregated recommendations.
  • POST /api/v1/upload/: Upload data for processing.
  • GET /: Root endpoint.

For a complete list of available routes, refer to the API Routes Documentation.

Example Requests

Get Available Recommendations

curl -X POST http://localhost:8000/api/v1/recommendations/ -d '{}' -H "Content-Type: application/json"

Check Task Status

curl http://localhost:8000/api/v1/tasks/1/status

For more usage examples, refer to the Usage Documentation.

Development

Local Development

To run the code without Docker, follow these steps:

  1. Install Dependencies:

    pipenv install
  2. Activate Virtual Environment:

    pipenv shell
  3. Run the Application:

    cd src && python app.py

Guides

This repository contains several detailed guides to help you get started and understand the system better. Have a look at the Guides Directory.

FAQs

For common questions and troubleshooting, refer to the FAQs Documentation.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Project Information

This project was created in the context of TUM's practical course Digital Product Innovation and Development, offered by fortiss, in the summer semester of 2024. The task was suggested by Siemens, who also supervised the project.

Contact

To contact fortiss or Siemens, please refer to their official websites.

Team Members