
CHATIDEA

CHATIDEA is a framework that allows the generation of a chatbot starting from a database's schema.

This file provides brief documentation for the implementation of the CHATIDEA framework, just enough to get contributors up and running. For detailed documentation, please refer to the full documentation; an online version is available on ReadTheDocs.

Requirements

This version of CHATIDEA requires the following to be installed on your machine (please refer to the official documentation of each tool for installation instructions).

Note that we manage everything with virtual environments, so just make sure you have Python, Poetry, and the ODBC drivers installed.

Before continuing, a database is needed. Be sure to have a DBMS running on your system or on a remote server, and remember to configure the appropriate SQL dialect in the file chatidea/database/broker.py.
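
As a quick sanity check, you can verify that an ODBC driver for your DBMS is visible on the system (this assumes unixODBC's odbcinst utility is available):

odbcinst -q -d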

Install all Python and Node.js dependencies in a virtual environment using the following commands:

poetry install
npm i --dev

Please note that if you want to train a custom NLU model, you must also install the NLU-specific environment. You can do this by entering the nlu-model directory and running poetry install again, as shown below. For further documentation on the quirks and gotchas of the NLU model, please refer to the nlu-model/README.md file.
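
For reference, from the repository root this amounts to:

cd nlu-model
poetry install
cd ..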

Then edit the .env file to fit your environment. If the .env file does not exist, copy the provided example template. This can be done using the following command.

cp .env.example .env
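
As an illustration only, a .env for a local setup typically holds values such as the database host and credentials. The variable names below are hypothetical; the actual keys are the ones listed in .env.example.

# Hypothetical example values; use the real variable names from .env.example
DB_HOST=localhost
DB_USER=chatidea
DB_PASSWORD=change-me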

Execute the Pipeline

The NLU pipeline is fully contained in the nlu-model directory, so be sure to change into it using the following command before executing the pipeline.

cd nlu-model

Please make sure to double-check any gotchas documented in the NLU model README file.

Generate Data and Train the Model

poetry run dvc repro
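
If you want to inspect or re-run the pipeline, DVC offers a few useful commands (the actual stage names depend on the dvc.yaml shipped in nlu-model):

poetry run dvc dag        # visualize the pipeline stages and their dependencies
poetry run dvc status     # check which stages are out of date
poetry run dvc repro -f   # force a full re-run of the pipeline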

Start the Rasa Server

poetry run rasa run --enable-api
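
Once the server is up (by default on port 5005), you can check that the HTTP API responds. The second request assumes the REST channel is enabled in the model's credentials file.

curl http://localhost:5005/status

curl -X POST http://localhost:5005/webhooks/rest/webhook \
     -H "Content-Type: application/json" \
     -d '{"sender": "tester", "message": "hello"}'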

Deploying Information

The project is configured to be deployed using containers: one container is expected for each microservice. This repository therefore holds Docker configuration files, using Dockerfiles and docker-compose (the latter managing the communication between the microservices). To build the deployment containers, run the following:

docker-compose build

To run the chatbot, execute the following command (remove the -d flag at the end to avoid running in detached mode).

docker-compose up -d
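
To check that the containers started correctly, you can list the services and follow their logs:

docker-compose ps
docker-compose logs -f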

To shut down the services, either press Ctrl-C (if you are not running in detached mode) or run the following command:

docker-compose down

Testing the Deployment on Apple Silicon and Other ARM Processors

There may be some problems with pyodbc until the maintainers provide a correctly built wheel of the library for Apple Silicon. If you run into errors after installing pyodbc, simply reinstall the library from source using the following command.

poetry run pip3 install --force-reinstall --no-binary :all: pyodbc
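
Afterwards, you can verify that the rebuilt library imports correctly:

poetry run python -c "import pyodbc; print(pyodbc.version)"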

At the moment, the project is based on Rasa. Unfortunately, the official Rasa Docker image does not support the ARM architecture. As a workaround, until an official version is released, an unofficial image can be downloaded from Docker Hub. To use it, replace the FROM image in the file nlu-model/Dockerfile by changing rasa/rasa to khalosa/rasa-aarch64:3.5.2. The following command does that automatically.

Note: sed's in-place option -i is not used, to ensure POSIX compatibility, as macOS systems do not ship with the GNU version of sed by default.

TEMP_FILE=$(mktemp) && \
    sed '1s;rasa/rasa;khalosa/rasa-aarch64:3.5.2;' nlu-model/Dockerfile > $TEMP_FILE && \
    mv $TEMP_FILE nlu-model/Dockerfile
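
You can confirm the substitution worked by checking the first line of the Dockerfile, and then rebuild the containers:

head -n 1 nlu-model/Dockerfile
docker-compose build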

Known Issues and Future Actions

  • Separate the NLU model's environment from the main app's environment
