This project is managed using a Makefile, which simplifies common tasks such as installing dependencies, linting, and testing.
To set up the project for the first time, you need to perform some initial steps:
- Install Poetry: Poetry is used for dependency management. You can install it by following the instructions at https://python-poetry.org/docs/#installation.
- Install Google Cloud CLI: This project uses Google Cloud services (such as Firestore), so you need to install the Google Cloud CLI (see https://cloud.google.com/sdk/docs/install).
- Install Make: Make sure `make` is installed on your system. On most Unix-based systems it is available by default. On Windows, you can use tools like `choco` or `scoop` to install `make`.
- Environment Variables: Create a `.env` file in the root directory and add the necessary environment variables (a sketch of how these are typically loaded follows this list). An example `.env` file:

  ```
  JWT__SECRET_KEY="secret"
  FIRESTORE__DATABASE="eu-dev"
  ```
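The double-underscore variable names map to nested settings (the project references Pydantic `BaseSettings`; see the resources at the end of this document). Below is a minimal, illustrative sketch assuming pydantic-settings with `env_nested_delimiter="__"`; the class and field names are examples, not the project's actual settings module.

```python
# Minimal sketch (not the project's real settings module) showing how
# variables like JWT__SECRET_KEY and FIRESTORE__DATABASE are typically
# loaded with pydantic-settings and a "__" nested delimiter.
from pydantic import BaseModel
from pydantic_settings import BaseSettings, SettingsConfigDict


class JWTSettings(BaseModel):
    secret_key: str  # hypothetical field, populated from JWT__SECRET_KEY


class FirestoreSettings(BaseModel):
    database: str  # hypothetical field, populated from FIRESTORE__DATABASE


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env", env_nested_delimiter="__")

    jwt: JWTSettings
    firestore: FirestoreSettings


if __name__ == "__main__":
    settings = Settings()
    print(settings.firestore.database)  # "eu-dev" with the example .env above
```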
Once the initial setup is complete, you can use the Makefile to manage the project.
To install all necessary dependencies and set up pre-commit hooks, run:
```bash
make install
```
Pre-commit hooks are used to check your code before committing. By default, the following tools are configured:
- `black`: Formats your code.
- `mypy`: Validates types.
- `isort`: Sorts imports in all files.
- `flake8`: Spots possible bugs.
To install the pre-commit hooks, use the `make install` command as mentioned above.
To lint and format your code, run:
```bash
make lint
make format
```
To run tests and generate a coverage report, use:
```bash
make test
```
To run tests automatically on every change, run:
```bash
make watch
```
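For orientation, here is a small, self-contained sketch of the kind of pytest test that `make test` and `make watch` run. It assumes a FastAPI-style API (an assumption based on the `api` package, not confirmed by this README); the `/healthcheck` route and app below are hypothetical and defined inline rather than imported from the project.

```python
# Hypothetical, self-contained example: a tiny FastAPI app and a pytest-style
# test, illustrating the shape of tests under tests/endpoints (assumption).
from fastapi import FastAPI
from fastapi.testclient import TestClient

app = FastAPI()


@app.get("/healthcheck")
def healthcheck() -> dict[str, str]:
    return {"status": "ok"}


client = TestClient(app)


def test_healthcheck_returns_ok() -> None:
    response = client.get("/healthcheck")
    assert response.status_code == 200
    assert response.json() == {"status": "ok"}
```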
To update Poetry dependencies and export them to `requirements.txt`, run:

```bash
make poetry-update
```
For a simple export without updating dependencies, run:

```bash
make poetry-export
```
$ tree "persony_admin"
.
βββ bigquery
β βββ schema_views # Configuration and views for BigQuery data analysis
βββ functions
β βββ src # Source code for serverless functions
βββ modelmind
β βββ _mocker # Utilities for mocking data and functionalities in tests
β βββ api # API endpoints, presentation, permissions, dependencies
β βββ clients # Integrations with third-party APIs
β βββ community # Domain knowledge external to core questionnaires
β β βββ engines
β β β βββ persony # Custom engine using community-based knowledge
β β βββ theory
β β βββ jung # Jungian personality theory implementation
β β βββ mbti # MBTI personality theory implementation
β βββ db # Persistence layer and database interactions
β βββ models # Core business logic and data manipulation
β β βββ analytics # Calculations based on questionnaire results
β β βββ engines # Logic for selecting questions from a questionnaire
β β βββ questionnaires # (Main) Core questionnaire models
β β βββ questions # Questions models that belong to a questionnaire
β β βββ results # Management of results from questionnaires
β βββ services # Auxiliary services, including event notification
β β βββ event_notifier # Service to handle event notifications within the system
β βββ utils # Utility functions and helpers
βββ terraform # Infrastructure as code configurations
βββ tests
βββ endpoints # Tests for API endpoints
βββ models # Tests for core business logic and models
This project uses Terraform for managing infrastructure as code on Google Cloud. We also use a Cloud Build trigger (`deploy_cloud_run_prod.yml`) for CI/CD to deploy our application to Google Cloud Run.
- Poetry: Read more about Poetry at https://python-poetry.org/docs/.
- Pre-commit: Learn more about pre-commit at https://pre-commit.com/.
- Pydantic BaseSettings: Read more about the `BaseSettings` class in the Pydantic documentation at https://docs.pydantic.dev/.