An end-to-end evaluation pipeline for SegFormer models on semantic segmentation tasks, with support for various quantization methods.
(DRAFT) (WIP): not fully implemented yet.
For the version history, see the CHANGELOG.
- Model loading and quantization (float8, int8, int4, int2); see the sketch after this list
- Dataset processing and sharding
- Evaluation metrics computation (mean IoU, mean accuracy, overall accuracy)
- Integration with Weights & Biases for experiment tracking
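To illustrate the quantization step, here is a minimal sketch assuming the optimum-quanto backend (which offers qfloat8, qint8, qint4, and qint2 weight types); the checkpoint name is only an example, and the project's own `quantization.py` may be implemented differently.

```python
# Minimal sketch assuming optimum-quanto; not the project's actual implementation.
from transformers import SegformerForSemanticSegmentation
from optimum.quanto import quantize, freeze, qint8

# Example checkpoint; any SegFormer semantic-segmentation checkpoint works here.
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/segformer-b0-finetuned-ade-512-512"
)

quantize(model, weights=qint8)  # mark the weights for int8 quantization
freeze(model)                   # materialize the quantized weights
model.eval()                    # ready for evaluation
```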
- Install uv: `pip install uv`
- Install dependencies: `uv sync [--frozen]`
- Set the Weights & Biases API key (`WANDB_API_KEY`) in your environment variables
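If you prefer to authenticate from Python rather than relying only on the environment, something like the following should work (it assumes the `wandb` client is installed and `WANDB_API_KEY` is already exported):

```python
# Optional: authenticate W&B programmatically; WANDB_API_KEY must be set.
import os

import wandb

wandb.login(key=os.environ["WANDB_API_KEY"])
```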
Run the evaluation pipeline with `uv run [--locked] python -m src`.
Build the Docker image with `docker build -t segformer-quant-eval .` and run it with `docker run segformer-quant-eval`.
To build with a different Python version: `docker build --build-arg PYTHON_VERSION=<py_version> .`
Install the dev dependency group with `uv sync --only-group dev`, then run the tests with `uv run pytest tests/`.
Adjust the settings in `src/config.py` for model, dataset, and evaluation parameters.
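As a rough illustration only, a pydantic-based `src/config.py` (in line with the guidelines further below) could look like this; all field names and defaults here are hypothetical, not the project's actual settings.

```python
# Hypothetical sketch of src/config.py; field names and defaults are illustrative only.
from typing import Literal

from pydantic import BaseModel


class EvalConfig(BaseModel):
    model_name: str = "nvidia/segformer-b0-finetuned-ade-512-512"
    dataset_name: str = "scene_parse_150"  # example dataset id
    quantization: Literal["none", "float8", "int8", "int4", "int2"] = "none"
    num_shards: int = 4                    # dataset sharding
    batch_size: int = 8
    wandb_project: str = "segformer-quant-eval"


config = EvalConfig()  # override individual fields here for a specific run
```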
Documentation: see the SegFormer Quantization Pipeline docs site (built with mkdocs).
/
├── src/
│   ├── utils/
│   │   ├── data_processing.py
│   │   ├── evaluator.py
│   │   ├── general_utils.py
│   │   ├── model_loader.py
│   │   ├── quantization.py
│   │   └── wandb_utils.py
│   ├── app.py
│   └── config.py
└── pyproject.toml
- TDD
  - Implement tests before implementing the concrete functions
  - test_model_loading, test_image_preprocessing, test_quantization, test_predict, test_end_to_end (see the sketch below)
- Use pydantic and Python typing
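As a sketch of the test-first approach, one of the tests listed above (test_predict) might look roughly like the following; the fixture, checkpoint, and assertions are assumptions rather than the repository's actual tests.

```python
# Hypothetical sketch of tests/test_predict.py; not the repository's actual test.
import pytest
import torch
from transformers import SegformerForSemanticSegmentation


@pytest.fixture(scope="module")
def model():
    # Small example checkpoint to keep the test reasonably fast.
    return SegformerForSemanticSegmentation.from_pretrained(
        "nvidia/segformer-b0-finetuned-ade-512-512"
    )


def test_predict(model):
    # A dummy batch is enough to check the forward pass and the logits shape.
    pixel_values = torch.randn(1, 3, 512, 512)
    with torch.no_grad():
        logits = model(pixel_values=pixel_values).logits
    # SegFormer predicts logits at 1/4 of the input resolution.
    assert logits.shape == (1, model.config.num_labels, 128, 128)
```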
- mkdocs
  - Extend the workflow to copy only files in the nav of `mkdocs.yaml`
- bump
  - Check the steps summary output JSON; possible bug in GHA
  - Check if the version tag already exists and abort if so, to avoid associating the tag with an unrelated commit
- `README.md`
  - Include a badge for the tests
  - Insert links to the report and project within WandB
- Optional
  - Hugging Face
    - Include an option to call the HF API instead of saving the model locally
      - Might be useful for evaluation purposes
  - Docker
    - Evaluate callisto for fast cloud-native builds
    - Use pt or cuda base images to reduce loading time and image size, e.g. `pytorch/pytorch:2.5.1-cuda12.4-cudnn9-runtime` or `nvidia/cuda:12.6.3-base-ubuntu24.04`
- mkdocs
  - Add `.md` to LICENSE/LICENSES to avoid download instead of open
  - Remove/change the `#href` ↑(#toc) to avoid conflict with gh-pages
  - Remove/change the `#href` for the light/dark png to avoid conflict with gh-pages
  - Fix mkdocs not indenting checkbox `ul`
  - Fix mkdocs not including png via plain inline HTML; `assets/` not copied by mkdocs
- bump
  - Fix the `bump-my-version.yaml` rollback step to delete the auto-created branch after `failure()`
  - Handle error `fatal: could not read Username` & `Error: Process completed with exit code 128.`
- `CHANGELOG.md` auto-generation
  - Solution: do a manual bump occasionally and populate `CHANGELOG.md` beforehand
    - Conventional Commits `.gitmessage`
    - Tools like `git-changelog`
- Docker
  - Where are the site-packages located in the Dockerfile for copying to the runtime stage?
    - Solution: `.venv`
- Branch protection rules
  - Push to main via PR only
  - Use dedicated branch `dev-auto-push-to-main`
    - Incorporate the branch into the `bump-my-version.yaml` workflow
  - Create workflow `update_changelog.yaml`
This project is licensed under the BSD 3-Clause License. See the LICENSE file for details. Third-party licenses may also apply.