
char level seq2seq machine translation on named entities

character-level machine translation on named entities, using FastAPI, spaCy, a PyTorch sequence-to-sequence model, and Docker

Report Bug · Request Feature

Table of Contents
  1. About The Project
  2. Getting Started
  3. Usage
  4. Roadmap
  5. Contributing
  6. License
  7. Contact
  8. Acknowledgments

About The Project

Main goals

Use this project to easily set up a machine translation API for authenticated users. Get 'normalized' entities from raw text (emails, files, chatbot conversations):

  1. Add your own translations in the database,
  2. Train your models using your translation pairs,
  3. Convert raw entities into actionable features.

(back to top)

Built With

(back to top)

Getting Started

Prerequisites

  • Install Docker and Docker-Compose

Installation

  • Clone the repo
    git clone https://github.com/vincentporte/machine_translation_fastapi_pytorch_docker.git
  • Build the docker image
    docker-compose build backend
  • Setup keys and credentials in .env
    POSTGRES_USER=db_user
    POSTGRES_PASSWORD=db_pass
    POSTGRES_DB=db_name
    SECRET_KEY=secret_key_for_users_management
    DATABASE_URL=postgres://db_user:db_pass@db:5432/db_name
  • Add your own NER model, see the spaCy docs
  • Run your containers
    docker-compose up -d; docker-compose logs -f
  • Initialize your database
    docker-compose exec backend aerich init-db
  • Add your dataset files and train your own seq2seq model
    docker-compose exec backend python app/services/training.py
  • Run tests
    docker-compose exec backend pytest
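The `DATABASE_URL` in `.env` follows the standard `scheme://user:password@host:port/dbname` form. As a quick sanity check (illustrative only, using the example credentials above), each component can be verified with Python's standard library:

```python
from urllib.parse import urlsplit

# Parse the example DATABASE_URL from .env into its components
url = urlsplit("postgres://db_user:db_pass@db:5432/db_name")

print(url.username)          # db_user
print(url.password)          # db_pass
print(url.hostname)          # db (the compose service name, not localhost)
print(url.port)              # 5432
print(url.path.lstrip("/"))  # db_name
```

Note that the host is `db`, the Postgres service name on the Docker network, so this URL only resolves from inside the compose stack.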

Setup superuser

  • Access DB cmd line
    docker exec -it mt_db psql -U db_user -h 127.0.0.1 -W db_name
  • Grant superuser rights
    UPDATE usermodel SET is_superuser = 't', is_verified = 't' WHERE email = '[email protected]';

Upgrade Database Model

  • Generate migration file
    docker-compose exec backend aerich migrate
  • Apply upgrade to DB
    docker-compose exec backend aerich upgrade

Deployment

  • Replace /config/nginx/nginx.conf with nginx.conf.live and update server_name refs
    server_name subdomain.domain.com;
  • Add CAA record in your DNS
    subdomain.domain.com. CAA 0 issue letsencrypt.org
  • Update refs in letsencrypt.sh
    domains=(subdomain.domain.com)
    email="[email protected]"
  • Run the init-letsencrypt.sh script to set up certificates
    sudo ./init-letsencrypt.sh

(back to top)

Usage

Users

Using FastAPI Users

Registering

curl \
    -H "Content-Type: application/json" \
    -X POST \
    -d "{\"email\": \"[email protected]\",\"password\": \"strongpassword\"}" \
    http://localhost/auth/register

Returns:

{
    "id":"800e9564-6804-4ab5-bc59-a088182227be",
    "email":"[email protected]",
    "is_active":true,
    "is_superuser":false,
    "is_verified":false
}
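The same registration call can be made from Python. A minimal sketch using only the standard library, with the endpoint and credentials taken from the curl example above:

```python
import json
from urllib import request

# Same payload as the curl example
payload = {"email": "[email protected]", "password": "strongpassword"}

req = request.Request(
    "http://localhost/auth/register",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# With the stack running, send the request and read the JSON response:
# with request.urlopen(req) as resp:
#     user = json.load(resp)
```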

Login

curl \
    -H "Content-Type: multipart/form-data" \
    -X POST \
    -F "[email protected]" \
    -F "password=strongpassword" \
    http://localhost:8000/auth/login

Returns:

{
    "access_token":"eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1c2VyX2lkIjoiODAwZTk1NjQtNjgwNC00YWI1LWJjNTktYTA4ODE4MjIyN2JlIiwiYXVkIjpbImZhc3RhcGktdXNlcnM6YXV0aCJdLCJleHAiOjE2MzAxMzk5OTJ9.w-ZWpm51fyybFivmKjun3qbXuqwXCgYyxGbPD1yhIr4",
    "token_type":"bearer"
}
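The access token is a standard JWT: its payload segment is base64url-encoded JSON, so the claims can be inspected without any JWT library (this only decodes, it does not verify the signature). A sketch using the token returned above:

```python
import base64
import json

token = "eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1c2VyX2lkIjoiODAwZTk1NjQtNjgwNC00YWI1LWJjNTktYTA4ODE4MjIyN2JlIiwiYXVkIjpbImZhc3RhcGktdXNlcnM6YXV0aCJdLCJleHAiOjE2MzAxMzk5OTJ9.w-ZWpm51fyybFivmKjun3qbXuqwXCgYyxGbPD1yhIr4"

# The middle segment holds the claims; restore base64 padding before decoding
payload = token.split(".")[1]
claims = json.loads(base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))

print(claims["user_id"])  # matches the id returned at registration
```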

Named Entity Recognition

Get entities from text

curl -X 'POST' \
    'http://localhost/ner' \
    -H 'accept: application/json' \
    -H 'Authorization: Bearer token' \
    -H 'Content-Type: application/json' \
    -d '{ "sentence": "un devis pour 500 flyers en quadri r/v, format a4 pour demain svp" }'

Returns:

{
  "entities": [
    {
      "text": "500",
      "entity": "EXEMPLAIRES",
      "pos": 0,
      "start": 14,
      "end": 17
    },
    {
      "text": "flyers",
      "entity": "PRODUCT",
      "pos": 1,
      "start": 18,
      "end": 24
    },
    {
      "text": "quadri r/v",
      "entity": "IMPRESSION",
      "pos": 2,
      "start": 28,
      "end": 38
    },
    {
      "text": "format a4",
      "entity": "FORMAT",
      "pos": 3,
      "start": 40,
      "end": 49
    }
  ],
  "ner": "imprimeur_4.3.20210312124255"
}
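The `start` and `end` values are character offsets into the submitted sentence, so each entity can be recovered by slicing. Checking the response above against its input sentence:

```python
sentence = "un devis pour 500 flyers en quadri r/v, format a4 pour demain svp"

# Entities as returned by /ner (only the fields needed for slicing)
entities = [
    {"text": "500", "entity": "EXEMPLAIRES", "start": 14, "end": 17},
    {"text": "flyers", "entity": "PRODUCT", "start": 18, "end": 24},
    {"text": "quadri r/v", "entity": "IMPRESSION", "start": 28, "end": 38},
    {"text": "format a4", "entity": "FORMAT", "start": 40, "end": 49},
]

# Each slice of the original sentence matches the entity text exactly
for ent in entities:
    assert sentence[ent["start"]:ent["end"]] == ent["text"]
```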

Translations

Add a translation pair to your training dataset

curl -X 'POST' \
    'http://localhost/products' \
    -H 'accept: application/json' \
    -H 'Authorization: Bearer token' \
    -H 'Content-Type: application/json' \
    -d '{ "entity_type": "FO", "source": "fo a4", "translation": "format ouvert : 210.0 x 297.0 mm" }'

Returns:

{
  "id": 2,
  "entity_type": "FO",
  "source": "fo a4",
  "translation": "format ouvert : 210.0 x 297.0 mm"
}

Export your seq2seq training dataset (user must be "verified")

curl -X 'POST' \
    'http://localhost/products/extract' \
    -H 'accept: application/json' \
    -H 'Authorization: Bearer token' \
    -d ''

Returns:

{
  "msg": "extracting"
}

Translate text entities

curl -X 'POST' \
    'http://localhost/translate' \
    -H 'accept: application/json' \
    -H 'Authorization: Bearer token' \
    -H 'Content-Type: application/json' \
    -d '{ "entities": [ { "text": "500", "entity": "EXEMPLAIRES", "pos": 0, "start": 14, "end": 17 }, { "text": "flyers", "entity": "PRODUCT", "pos": 1, "start": 18, "end": 24 }, { "text": "quadri r/v", "entity": "IMPRESSION", "pos": 2, "start": 28, "end": 38 }, { "text": "format a4", "entity": "FORMAT", "pos": 3, "start": 40, "end": 49 } ], "model": "imprimeur" }'

Returns:

{
  "entities": [
    {
      "text": "500",
      "entity": "EXEMPLAIRES",
      "pos": 0,
      "start": 14,
      "end": 17
    },
    {
      "text": "flyers",
      "entity": "PRODUCT",
      "pos": 1,
      "start": 18,
      "end": 24
    },
    {
      "text": "recto : quadri, verso : quadri",
      "entity": "IMPRESSION",
      "pos": 2,
      "start": 28,
      "end": 38
    },
    {
      "text": "format fini : 210.0 x 297.0 mm",
      "entity": "FORMAT",
      "pos": 3,
      "start": 40,
      "end": 49
    }
  ]
}
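The translated entities keep the offsets of the originals, so one way to rebuild a fully 'normalized' sentence is to splice the translations back in from right to left, so earlier offsets stay valid (a sketch; the API itself returns only the entity list):

```python
sentence = "un devis pour 500 flyers en quadri r/v, format a4 pour demain svp"

# Entities as returned by /translate (text may differ from the original span)
translated = [
    {"text": "500", "start": 14, "end": 17},
    {"text": "flyers", "start": 18, "end": 24},
    {"text": "recto : quadri, verso : quadri", "start": 28, "end": 38},
    {"text": "format fini : 210.0 x 297.0 mm", "start": 40, "end": 49},
]

# Replace spans right-to-left so offsets of earlier entities are untouched
out = sentence
for ent in sorted(translated, key=lambda e: e["start"], reverse=True):
    out = out[:ent["start"]] + ent["text"] + out[ent["end"]:]

print(out)
```

This yields `un devis pour 500 flyers en recto : quadri, verso : quadri, format fini : 210.0 x 297.0 mm pour demain svp`.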

For more examples, please refer to the Documentation

(back to top)

Roadmap

  • Train translation model with users' datasets
    • add translation pairs in DB
    • extract dataset and train PyTorch model
  • User verification by email
    • set up Mailgun API key
  • Named entity recognition to extract parts of text to translate
    • set up spaCy

See the open issues for a full list of proposed features (and known issues).

(back to top)

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

(back to top)

License

Distributed under the GPL-3.0 License. See LICENSE.txt for more information.

(back to top)

Contact

Vincent PORTE - [email protected]

Project Link: https://github.com/vincentporte/machine_translation_fastapi_pytorch_docker

(back to top)

Acknowledgments

(back to top)
