Kusho Record Replay Service

A Flask-based service that records API calls from production traffic, analyzes the resulting traffic patterns, and generates test cases from them.

Prerequisites

  • Docker and Docker Compose
  • Python 3.x (for local development)
  • MySQL 8.0
  • OpenAI API key (for test generation features)

Quick Start

  1. Clone the repository:

    git clone git@github.com:kusho-co/record-replay.git
    cd record-replay
  2. Configure the services in docker-compose.yml:

    services:
      collector:
        environment:
          - MYSQL_HOST=mysql
          - MYSQL_PORT=3306
          - MYSQL_USER=kusho
          - MYSQL_PASSWORD=kusho_password
          - MYSQL_DATABASE=kusho_traffic
          - OPENAI_ORGID=your_org_id
          - OPENAI_API_KEY=your_api_key
    
      mysql:
        environment:
          - MYSQL_ROOT_PASSWORD=root_password
          - MYSQL_DATABASE=kusho_traffic
          - MYSQL_USER=kusho
          - MYSQL_PASSWORD=kusho_password
  3. Start the services:

    docker-compose up -d

The service will be available at http://localhost:7071.
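
To confirm the stack is up, you can call the endpoint listing route documented under API Endpoints below. A minimal sketch using Python's requests package (install it with pip install requests):

    import requests

    # List the collector's available routes to verify it is reachable.
    resp = requests.get("http://localhost:7071/api/v1/endpoints")
    resp.raise_for_status()
    print(resp.json())  # {"status": "success", "endpoints": [...]}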


API Endpoints

Event Collection

Collect Events

POST /api/v1/events
  • Description: Collect and store traffic events.
  • Request Body:
    {
      "events": [
        {
          "id": "event1",
          "timestamp": "2024-12-29T12:00:00Z",
          "data": "event_data"
        }
      ]
    }
  • Response:
    {
      "status": "success",
      "message": "Stored <number> events"
    }
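
Example usage, as a minimal Python sketch with the requests package (the payload mirrors the request body above):

    import requests

    # Send one recorded traffic event to the collector.
    payload = {
        "events": [
            {
                "id": "event1",
                "timestamp": "2024-12-29T12:00:00Z",
                "data": "event_data"
            }
        ]
    }
    resp = requests.post("http://localhost:7071/api/v1/events", json=payload)
    resp.raise_for_status()
    print(resp.json())  # e.g. {"status": "success", "message": "Stored 1 events"}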

Traffic Analysis

Get Anomalies

GET /api/v1/analysis/anomalies
  • Description: Retrieve traffic anomalies.
  • Query Parameters:
    • hours (optional, default: 24): Number of past hours to analyze.
    • min_score (optional, default: 0.0): Minimum anomaly score.
  • Response:
    {
      "anomalies": [],
      "count": 0
    }
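
For example, to widen the window and filter out weak signals (a requests sketch; the hours and min_score values are illustrative):

    import requests

    # Fetch anomalies from the past 48 hours with a minimum score of 0.5.
    resp = requests.get(
        "http://localhost:7071/api/v1/analysis/anomalies",
        params={"hours": 48, "min_score": 0.5},
    )
    resp.raise_for_status()
    data = resp.json()
    print(f"{data['count']} anomalies found")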

Analyze Traffic

POST /api/v1/analysis/analyze
  • Description: Analyze recent traffic data.
  • Request Body:
    {
      "hours": 24
    }
  • Response:
    {
      "status": "success",
      "message": "Analyzed traffic for past 24 hours"
    }
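
A minimal requests sketch (for long windows, the background job endpoint below may be preferable):

    import requests

    # Trigger analysis of the last 24 hours of recorded traffic.
    resp = requests.post(
        "http://localhost:7071/api/v1/analysis/analyze",
        json={"hours": 24},
    )
    resp.raise_for_status()
    print(resp.json()["message"])  # "Analyzed traffic for past 24 hours"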

Start Analysis Job

POST /api/v1/analysis/start-job
  • Description: Start a background job for analyzing traffic and generating tests.
  • Request Body:
    {
      "hours": 24
    }
  • Response:
    {
      "status": "success",
      "job_id": "<job_id>",
      "message": "Started analysis job for past 24 hours"
    }

Get Job Status

GET /api/v1/analysis/job-status
  • Description: Get the status of an analysis job.
  • Query Parameters:
    • job_id (required): The ID of the job.
  • Response:
    {
      "job_id": "<job_id>",
      "job_status": "completed",
      "result": {},
      "error": null
    }
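
Together, the two job endpoints support a simple submit-and-poll workflow. A sketch in Python (the in-progress status value is not documented here, so the loop simply waits until job_status reads "completed"; the 5-second interval is an arbitrary choice):

    import time

    import requests

    BASE = "http://localhost:7071"

    # Start a background analysis job for the past 24 hours.
    resp = requests.post(f"{BASE}/api/v1/analysis/start-job", json={"hours": 24})
    resp.raise_for_status()
    job_id = resp.json()["job_id"]

    # Poll until the job completes or reports an error.
    while True:
        status = requests.get(
            f"{BASE}/api/v1/analysis/job-status",
            params={"job_id": job_id},
        ).json()
        if status.get("error"):
            raise RuntimeError(status["error"])
        if status["job_status"] == "completed":
            print(status["result"])
            break
        time.sleep(5)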

OpenAPI Export

Export OpenAPI

GET /api/v1/export/openapi
  • Description: Export the test suite in OpenAPI-compatible JSON.
  • Query Parameters:
    • base_url (required): Base URL for the API.
  • Response:
    {
      "openapi": "3.0.0",
      "paths": {}
    }
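
For example, to save the generated spec to disk (a sketch; the base_url value and output filename are illustrative):

    import json

    import requests

    # Download the OpenAPI document built from recorded traffic.
    resp = requests.get(
        "http://localhost:7071/api/v1/export/openapi",
        params={"base_url": "https://api.example.com"},
    )
    resp.raise_for_status()
    with open("openapi.json", "w") as f:
        json.dump(resp.json(), f, indent=2)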

Export OpenAPI for an Endpoint

GET /api/v1/export/openapi/endpoint
  • Description: Export OpenAPI-compatible data for a specific endpoint.
  • Query Parameters:
    • base_url (required): Base URL for the API.
    • url (required): Endpoint URL.
    • http_method (required): HTTP method (e.g., GET, POST).
  • Response:
    {
      "summary": "Endpoint description",
      "parameters": [],
      "responses": {}
    }
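
Example usage (a sketch; /users and GET stand in for whatever endpoint and method appear in your recorded traffic):

    import requests

    # Export the OpenAPI fragment for a single recorded endpoint.
    resp = requests.get(
        "http://localhost:7071/api/v1/export/openapi/endpoint",
        params={
            "base_url": "https://api.example.com",
            "url": "/users",        # a recorded endpoint path (illustrative)
            "http_method": "GET",
        },
    )
    resp.raise_for_status()
    print(resp.json())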

Miscellaneous

List Available Endpoints

GET /api/v1/endpoints
  • Description: List all available API endpoints.
  • Response:
    {
      "status": "success",
      "endpoints": ["/api/v1/events", "/api/v1/analysis/anomalies"]
    }

Development Setup

  1. Create a virtual environment:

    python -m venv venv
    source venv/bin/activate  # Linux/Mac
    # or
    .\venv\Scripts\activate  # Windows
  2. Install dependencies:

    pip install -r requirements.txt
  3. Run migrations:

    flask db upgrade
  4. Start the development server:

    python -m flask run --port 7071

Docker Configuration

The service uses two main containers:

  • collector: The Flask application service
  • mysql: MySQL 8.0 database server

Volumes:

  • mysql_data: Persistent storage for MySQL data
  • migrations: Database migration scripts
  • custom.cnf: Custom MySQL configuration
