Welcome to Dough v. 0.5 (beta)

⬇️ Scroll down for Setup Instructions - Currently available on Linux & Windows, hosted version coming soon.

Dough is a tool for crafting videos with AI. Our goal is to give you enough control over video generations that you can make beautiful creations of anything you imagine that feel uniquely your own.

To achieve this, we allow you to guide video generations with precision using a combination of images (via Steerable Motion) and example videos (via Motion Director).

Below is a brief overview and some examples of outputs:

With Dough, you can make guidance frames using Stable Diffusion XL, IP-Adapter, Fooocus Inpainting, and more:

You can then assemble these frames into shots that you can granularly edit:

And then animate these shots by defining parameters for each frame and selecting guidance videos via Motion LoRAs:

As an example, here's a video that's guided with just images on high strength:

While here's a more complex one, with low strength images driving it alongside a guidance video:

And here's a more complex example combining high strength guidance with a guidance video strongly influencing the motion:

We're obviously very biased, but we think it'll be possible to create extraordinarily beautiful things with this, and we're excited to see what you make! Please share what you've made in our Discord.

Setup Instructions

Setting up on Runpod
  1. We recommend setting up persistent storage for a quick setup and for your projects to persist. To get it going, click into “Storage”, select “New Network Volume”. 50GB should be more than enough to start.

  2. Select a machine - any should work, but we recommend a 4090.

  3. During setup, open the relevant ports for Dough like below:

  4. When you’ve launched the pod, click into Jupyter Notebook:

  5. Follow the “Instructions for Linux” section below and come back here when you’ve gone through it.

  6. Once you’ve done that, grab the IP address for your instance:

Then put these two values together with a : between them, like this:

{Public IP}:{External Port}

In the above example, that would make it:

213.173.108.4:14810

Then go to this URL, and it should be running!
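
For reference, with the example values above, the address you open in a browser would typically look like the following (the http:// scheme is an assumption; adjust it if your pod serves HTTPS):

http://213.173.108.4:14810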

Important: remember to terminate the instance once you’re done - you can restart it by following the instructions from step 3 above.

Instructions for Linux:

Clone the repository

git clone --depth 1 -b staging https://github.com/banodoco/Dough.git

Install packages

Create a virtual environment using:

python3 -m venv dough-env
source ./dough-env/bin/activate
cd Dough

NOTE: the app will break on Python versions other than 3.10, and running it outside the virtual environment can cause conflicts with other packages.
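
If python3 on your system doesn’t point at 3.10, here is a minimal sketch for creating the environment against 3.10 explicitly (this assumes a python3.10 binary is already installed, e.g. from your distro’s repositories):

python3.10 --version            # confirm a 3.10 interpreter is available
python3.10 -m venv dough-env    # create the venv with 3.10 specifically
source ./dough-env/bin/activate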

apt-get update
apt install libpq-dev python3.10-dev -y

Install requirements

pip install -r requirements.txt

Copy the env file

Copy the “.env.sample” file and rename it to “.env”:

cp .env.sample .env

Run the app

You can run the app using:

sh entrypoint.sh

Instructions for Windows:

Clone the repository

git clone --depth 1 -b staging https://github.com/banodoco/Dough.git

Install packages

Create a virtual environment and install dependencies

python -m venv venv # don't use python3
.\venv\Scripts\activate
pip install -r requirements.txt
pip install websocket # extra dependency
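
As on Linux, the app expects Python 3.10, so it’s worth confirming which interpreter the activated venv is using (a quick sanity check, not part of the original steps):

python --version    # should report Python 3.10.x inside the venv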

Install torch dependencies (if not already present)

pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
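
To verify that the CUDA-enabled torch build was installed and can see the GPU (an extra sanity check, not part of the original steps; it should print True on a machine with a working NVIDIA driver):

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"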

Copy the env file

copy .env.sample .env

Run the app

.\entrypoint.bat

If you're having any issues, please share them in our Discord.
