
Workflow Orchestration


A lightweight workflow orchestration package with few dependencies, created to make building a simple metadata-driven data workflow easy. It lets you define data operations with a .yaml template.

Warning

This package provides only the orchestration workload. That means you should not use a workflow stage to process large volumes of data that demand a lot of compute resources 😰.


πŸ“Œ Rules of This Workflow:

  1. The minimum frequency unit of the built-in scheduling is 1 minute 🕘
  2. A failed stage and its pending downstream stages cannot be re-run on their own ↩️
  3. All parallel tasks inside the workflow core engine use a multi-threading pool (Python 3.13 unlocks the GIL 🐍🔓)
  4. Passing secret values via environment variables in the YAML template is recommended 🔐
  5. Any datetime value is converted to the UTC timezone 🌐 (see the sketch after this list)
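
As a quick illustration of rule 5, this is the kind of normalization you can expect (a minimal stdlib sketch, not the engine's internal code):

from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# A Bangkok-local timestamp, as it might appear in a template parameter.
local_dt = datetime(2024, 1, 1, 9, 0, tzinfo=ZoneInfo("Asia/Bangkok"))

# The engine normalizes any datetime value to UTC before use.
utc_dt = local_dt.astimezone(timezone.utc)
print(utc_dt)  # 2024-01-01 02:00:00+00:00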

πŸ“ Workflow Diagrams:

This diagram shows where this application runs on the production infrastructure. You will see that this application only runs code, keeping it stress-less, which means you should set up the data layer separately from this core program before running this application.

flowchart LR
    A((fa:fa-user User))

    subgraph Docker Container
        direction TB
        G@{ shape: rounded, label: "πŸ“‘Observe<br>Application" }
    end

    subgraph Docker Container
        direction TB
        B@{ shape: rounded, label: "πŸƒWorkflow<br>Application" }
    end

    A <-->|action &<br>response| B
    B -...-> |response| G
    G -...-> |request| B

    subgraph Data Context
        D@{ shape: processes, label: "Logs" }
        E@{ shape: lin-cyl, label: "Audit<br>Logs" }
    end

    subgraph Config Context
        F@{ shape: tag-rect, label: "YAML<br>files" }
    end

    A ---> |push| H(Repo)
    H -.-> |pull| F

    B <-->|disable &<br>read| F

    B <-->|read &<br>write| E

    B -->|write| D

    D -.->|read| G
    E -.->|read| G

Warning

Disclaimer: The dynamic YAML statements are inspired by GitHub Actions and by my experience with data framework config patterns. 😬

Other workflow orchestration services that interest me and that I picked as inspiration for this project:

πŸ“¦ Installation

This project needs the ddeutil and ddeutil-io extension namespace packages as its base dependencies. If you want to install this package with the application add-ons, you should include the optional extra in the installation;

| Use-case       | Install Optional      | Support |
|----------------|-----------------------|---------|
| Python         | ddeutil-workflow      | ✅      |
| FastAPI Server | ddeutil-workflow[all] | ✅      |

Check the version of the current workflow package:

$ pip install ddeutil-workflow
$ workflow-cli version
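
If you prefer Python over the CLI for the version check, the standard library can read the installed distribution metadata:

from importlib.metadata import version

print(version("ddeutil-workflow"))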

Initialize a workflow project:

$ workflow-cli init

πŸ“– Documentation

For comprehensive API documentation, examples, and best practices:

🎯 Usage

These are examples that use a workflow file to run common Data Engineering use-cases.

Important

I recommend using the call stage for every action that you want to orchestrate as a workflow activity, because it can pass dynamic input arguments to the same call function, which means you spend less time maintaining your data workflows.

run-py-local:

   # The validation model type that is used to parse this template file.
   type: Workflow
   on:
      # If this workflow is deployed to a schedule, it will run every 5 minutes
      # with Asia/Bangkok timezone.
      - cronjob: '*/5 * * * *'
        timezone: "Asia/Bangkok"
   params:
      # Incoming execution parameters will be validated against these types.
      # They allow setting default values or templating.
      source-extract: str
      run-date: datetime
   jobs:
      getting-api-data:
         runs-on:
            type: local
         stages:
            - name: "Retrieve API Data"
              id: retrieve-api
              uses: tasks/get-api-with-oauth-to-s3@requests
              with:
                 # Arguments of the source data that you want to retrieve.
                 method: post
                 url: https://finances/open-data/currency-pairs/
                 body:
                    resource: ${{ params.source-extract }}

                    # You can use filters like in a Jinja template, but this
                    # package does not use Jinja itself.
                    filter: ${{ params.run-date | fmt(fmt='%Y%m%d') }}
                 auth:
                    type: bearer
                    keys: ${API_ACCESS_REFRESH_TOKEN}

                 # Arguments of the target data that you want to land.
                 writing_mode: flatten
                 aws:
                    path: my-data/open-data/${{ params.source-extract }}

                     # This authentication should be implemented in your custom call
                     # function. The template allows you to use environment variables.
                    access_client_id: ${AWS_ACCESS_CLIENT_ID}
                    access_client_secret: ${AWS_ACCESS_CLIENT_SECRET}

Before executing this workflow, you should implement the caller function first.

registry-caller/
  ╰─ tasks.py

This function is stored in a module that is imported from the WORKFLOW_CORE_REGISTRY_CALLER value (this config can be overridden by passing extra parameters with the registry_caller key).
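
For example, with the layout above, the registry value could point at the tasks module like this (a sketch; the module name is illustrative and must be importable from your runtime path):

import os

# Point the caller registry at the module that holds the tagged caller
# functions; "tasks" matches the illustrative layout above.
os.environ["WORKFLOW_CORE_REGISTRY_CALLER"] = "tasks"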

Note

You can use a Pydantic model as an argument of your caller function. The core workflow engine will automatically use the model_validate method before running your caller function.

from ddeutil.workflow import Result, CallerSecret, tag
from ddeutil.workflow.errors import StageError
from pydantic import BaseModel


class AwsCredential(BaseModel):
    path: str
    access_client_id: str
    access_client_secret: CallerSecret


class RestAuth(BaseModel):
    type: str
    keys: CallerSecret


@tag("requests", alias="get-api-with-oauth-to-s3")
def get_api_with_oauth_to_s3(
     method: str,
     url: str,
     body: dict[str, str],
     auth: RestAuth,
     writing_mode: str,
     aws: AwsCredential,
     result: Result,
) -> dict[str, int]:
    """Get the data from RestAPI via Authenticate with OAuth and then store to
    AWS S3 service.
    """
    result.trace.info("[CALLER]: Start get data via RestAPI to S3.")
    result.trace.info(f"... {method}: {url}")
    if method != "post":
        raise StageError(f"RestAPI does not support the {method} action.")
    # NOTE: If you want to use secret, you can use `auth.keys.get_secret_value()`.
    return {"records": 1000}

The above workflow template is the main executor pipeline that you want to run. If you want to schedule this workflow, you will want its parameters to change dynamically based on execution time; for example, run-date should change based on the date the workflow runs.

from ddeutil.workflow import Workflow, Result

workflow: Workflow = Workflow.from_conf('run-py-local')
result: Result = workflow.execute(
   params={"source-extract": "USD-THB", "run-date": "2024-01-01"}
)
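
For a scheduled run, you would typically compute such parameters at trigger time instead of hard-coding them; a minimal sketch, assuming the run-date parameter accepts a datetime object as its type declaration suggests:

from datetime import datetime, timezone

from ddeutil.workflow import Result, Workflow

workflow: Workflow = Workflow.from_conf('run-py-local')
result: Result = workflow.execute(
   params={
      "source-extract": "USD-THB",
      # Use the current UTC time so each scheduled run gets its own run date.
      "run-date": datetime.now(timezone.utc),
   }
)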

πŸͺ Configuration

The main configuration used to dynamically adjust this workflow engine for your use-case relies on environment variables only. If a configuration value is not set, the engine uses its default value and does not raise any error.

Important

The config value that you set in the environment should combine the prefix, component, and name as WORKFLOW_{component}_{name} (upper case).

| Name | Component | Default | Description |
|---|---|---|---|
| REGISTRY_CALLER | CORE | . | List of importable strings for the call stage. |
| REGISTRY_FILTER | CORE | ddeutil.workflow.templates | List of importable strings for the filter template. |
| CONF_PATH | CORE | ./conf | The config path that keeps all template .yaml files. |
| STAGE_DEFAULT_ID | CORE | false | A flag that enables the default stage ID used to catch an execution output. |
| GENERATE_ID_SIMPLE_MODE | CORE | true | A flag that enables generating IDs with the md5 algorithm. |
| DEBUG_MODE | LOG | true | A flag that enables logging at debug level. |
| TIMEZONE | LOG | Asia/Bangkok | A timezone string value that will be passed to the ZoneInfo object. |
| FORMAT | LOG | %(asctime)s.%(msecs)03d (%(name)-10s, %(process)-5d,%(thread)-5d) [%(levelname)-7s] %(message)-120s (%(filename)s:%(lineno)s) | A trace message console format. |
| FORMAT_FILE | LOG | {datetime} ({process:5d}, {thread:5d}) {message:120s} ({filename}:{lineno}) | A trace message format used to write to the target pointer. |
| DATETIME_FORMAT | LOG | %Y-%m-%d %H:%M:%S | The datetime format of the trace log. |
| TRACE_URL | LOG | file:./logs | A pointer URL of the trace log used to emit log messages. |
| TRACE_ENABLE_WRITE | LOG | false | A flag that enables writing the trace log. |
| AUDIT_URL | LOG | file:./audits | A pointer URL of the audit log used to write audit metrics. |
| AUDIT_ENABLE_WRITE | LOG | true | A flag that enables writing the audit log after the end of execution in the workflow release step. |
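
Putting the naming rule together with the table, setting a few of these values from Python before the engine loads might look like this (the values are illustrative, not recommendations):

import os

# WORKFLOW_{component}_{name}, upper case, as described above.
os.environ["WORKFLOW_CORE_CONF_PATH"] = "./conf"
os.environ["WORKFLOW_LOG_TIMEZONE"] = "Asia/Bangkok"
os.environ["WORKFLOW_LOG_DEBUG_MODE"] = "false"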

πŸš€ Deployment

This package can run as an application service that receives manual triggers from any node via a REST API built with the FastAPI package.

API Server

This server uses the FastAPI package as its base application.

(.venv) $ workflow-cli api --host 127.0.0.1 --port 80

Note

If this package is already deployed, it can run with multiple worker processes; $ workflow-cli api --host 127.0.0.1 --port 80 --workers 4

Docker Container

Pull the Docker container image of this package and run it:

$ docker pull ghcr.io/ddeutils/ddeutil-workflow:latest
$ docker run --rm ghcr.io/ddeutils/ddeutil-workflow:latest ddeutil-worker

πŸ’¬ Contribute

I do not think this project will spread around the world, because it has a specific purpose, and for a long-term solution you can write your own code without depending on this project. So, for now, you can open a GitHub issue on this project 🙌 to report a bug or request a new feature if you want one.
