
Added Dockerfile alias (#199)
* Added Dockerfile alias

* Added load_agent bucket compatibility and gcloud functionality to do it

* Added load trained model documentation

* Deleted Python version 3.6 and the python3 command from the Dockerfile container

* Solved a strange bug with Python modules
AlejandroCN7 authored Mar 31, 2022
1 parent 3a0780f commit 4d8e04f
Showing 4 changed files with 77 additions and 15 deletions.
22 changes: 14 additions & 8 deletions Dockerfile
@@ -51,14 +51,14 @@ RUN cd /usr/local/bin find -L . -type l -delete
# Install ping dependency
RUN apt update && apt install iputils-ping -y

-# Install Python
-RUN apt update
-RUN apt install software-properties-common -y
-RUN add-apt-repository ppa:deadsnakes/ppa
-RUN ln -s /usr/bin/pip3 /usr/bin/pip
-RUN ln -s /usr/bin/python${PYTHON_VERSION} /usr/bin/python
-RUN apt install python${PYTHON_VERSION} python${PYTHON_VERSION}-distutils -y
-RUN apt install python3-pip -y
+# Install Python version ${PYTHON_VERSION}
+RUN apt update \
+    && apt install software-properties-common -y \
+    && add-apt-repository ppa:deadsnakes/ppa \
+    && apt install python${PYTHON_VERSION} python${PYTHON_VERSION}-distutils -y \
+    && apt install python3-pip -y \
+    && ln -s /usr/bin/pip3 /usr/bin/pip \
+    && ln -s /usr/bin/python${PYTHON_VERSION} /usr/bin/python

# Install enchant for sinergym documentation
RUN apt-get update && echo "Y\r" | apt-get install enchant --fix-missing -y
@@ -77,6 +77,7 @@ RUN python -m pip install --upgrade pip
# Upgrade setuptools for possible errors (depending on python version)
RUN pip install --upgrade setuptools
RUN apt-get update && apt-get upgrade -y && apt-get install -y git

WORKDIR /sinergym
COPY requirements.txt .
COPY setup.py .
@@ -89,6 +90,11 @@ COPY check_run_times.py .
COPY try_env.py .
RUN pip install -e .${SINERGYM_EXTRAS}


# Uninstall the default Python 3.6 version
RUN apt-get remove --purge python3-pip python3 -y \
    && apt-get autoremove -y && apt-get autoclean -y

CMD ["/bin/bash"]

# Build: docker build -t sinergym:1.1.0 --build-arg ENERGYPLUS_VERSION=9.5.0 --build-arg ENERGYPLUS_INSTALL_VERSION=9-5-0 --build-arg ENERGYPLUS_SHA=de239b2e5f .
Expand Down
24 changes: 24 additions & 0 deletions docs/source/pages/gcloudAPI.rst
@@ -308,6 +308,30 @@ Hence, it is **necessary** to **set up this service account** and give privilege
In short, we create a new service account called **storage-account**. Then, we grant this account the *roles/owner* permission. The next step is to create a key file (json) called **google-storage.json** in our project root (gitignore will ignore this file in remote).
Finally, we export this file in **GOOGLE_CLOUD_CREDENTIALS** on our local computer so that the gcloud SDK knows it has to use that token to authenticate.
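
A minimal sketch of those steps with the gcloud CLI could look like this (the project ID, and therefore the service account e-mail, is a placeholder, not a value prescribed by this repository):

.. code:: sh

    gcloud iam service-accounts create storage-account
    gcloud projects add-iam-policy-binding <project_id> \
        --member="serviceAccount:storage-account@<project_id>.iam.gserviceaccount.com" \
        --role="roles/owner"
    gcloud iam service-accounts keys create google-storage.json \
        --iam-account=storage-account@<project_id>.iam.gserviceaccount.com
    export GOOGLE_CLOUD_CREDENTIALS=google-storage.json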

**********************
Load a trained model
**********************

For this purpose, we have a script called *load_agent.py*, which can be used both on a remote machine and locally on our computer, just like *DRL_battery.py*.

This script takes the path to the trained model as a parameter, loads the model, and performs the evaluation we want.

The list of parameters is:

- ``--environment`` or ``-env``: Environment name you want to use.
- ``--model`` or ``-mod``: Trained model (zip file) you want to use to execute the evaluation. This path can be a local file path on your computer (remote or host) or a Google Cloud Storage resource (a bucket URI like ``gs://<bucket_name>/<model_path>``).
- ``--episodes`` or ``-ep``: Number of episodes you want to evaluate the agent for in simulation (episode length can differ depending on the environment).
- ``--algorithm`` or ``-alg``: Algorithm the model was trained with (currently, *PPO*, *A2C*, *DQN*, *DDPG* and *SAC* are available).
- ``--reward`` or ``-rw``: Reward class you want to use for the reward function (the same reward as in training is recommended). Currently, the possible values are "linear" and "exponential".
- ``--normalization`` or ``-norm``: Apply the normalization wrapper to observations during evaluation. If it isn't specified, the wrapper will not be applied.
- ``--logger`` or ``-log``: Apply the Sinergym logger wrapper during evaluation. If it isn't specified, the wrapper will not be applied.
- ``--seed`` or ``-sd``: Seed for the evaluation, so that random components of the process can be reproduced.
- ``--remote_store`` or ``-sto``: Determine whether Sinergym output will be sent to a common resource (bucket); otherwise, it will be stored in container or host memory only.
- ``--group_name`` or ``-group``: Specify which MIG the host instance belongs to; this is important if ``--auto_delete`` is activated.
- ``--auto_delete`` or ``-del``: If this parameter is specified, the remote instance will be removed automatically when its job has finished.

This script loads the model. Once the model is loaded, it predicts the actions from the states over the requested episodes. The information is collected and sent to cloud storage if this has been specified; otherwise, it is stored in local memory.
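
For instance, a hypothetical evaluation of a *PPO* model stored in a bucket could be launched like this (the environment name, bucket path and episode count are illustrative, not prescribed by the script):

.. code:: sh

    python load_agent.py \
        --environment Eplus-demo-v1 \
        --model gs://<bucket_name>/<model_path> \
        --episodes 5 \
        --algorithm PPO \
        --reward linear \
        --logger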

***********************
Remote Tensorboard log
***********************
26 changes: 19 additions & 7 deletions load_agent.py
@@ -94,7 +94,7 @@
# ---------------------------------------------------------------------------- #
# Evaluation name #
# ---------------------------------------------------------------------------- #
-name = args.model.split('/')[-1] + 'EVAL-episodes' + str(args.episodes)
+name = args.model.split('/')[-1] + '-EVAL-episodes' + str(args.episodes)

# ---------------------------------------------------------------------------- #
# Environment definition #
@@ -131,19 +131,31 @@
# ---------------------------------------------------------------------------- #
# Load Agent #
# ---------------------------------------------------------------------------- #
# If the model is from a bucket, download the model
if 'gs://' in args.model:
    # Download from the given bucket (gcloud configured with privileges)
    client = gcloud.init_storage_client()
    # 'gs://<bucket_name>/<model_path>': take the bucket name and the inner path
    bucket_name = args.model.split('/')[2]
    model_path = args.model.split(bucket_name + '/')[-1]
    gcloud.read_from_bucket(client, bucket_name, model_path)
    model_path = './' + model_path
else:
    model_path = args.model


model = None
if args.algorithm == 'DQN':
-    model = DQN.load(args.model)
+    model = DQN.load(model_path)
elif args.algorithm == 'DDPG':
-    model = DDPG.load(args.model)
+    model = DDPG.load(model_path)
elif args.algorithm == 'A2C':
-    model = A2C.load(args.model)
+    model = A2C.load(model_path)
elif args.algorithm == 'PPO':
-    model = PPO.load(args.model)
+    model = PPO.load(model_path)
elif args.algorithm == 'SAC':
-    model = SAC.load(args.model)
+    model = SAC.load(model_path)
elif args.algorithm == 'TD3':
-    model = TD3.load(args.model)
+    model = TD3.load(model_path)
else:
    raise RuntimeError('Algorithm specified is not registered.')

20 changes: 20 additions & 0 deletions sinergym/utils/gcloud.py
@@ -2,6 +2,7 @@

import glob
import os
from pathlib import Path

import requests
from google.cloud import storage
@@ -22,6 +23,25 @@ def init_storage_client() -> storage.Client:
####################### GCLOUD BUCKETS MANIPULATION #######################


def read_from_bucket(client, bucket_name, blob_prefix):
    """Read a file or a directory (recursively) from a specified bucket into the local file system.

    Args:
        client: Google Cloud storage client object used to request resources.
        bucket_name: Source bucket name to read from.
        blob_prefix: Path to the data within the bucket (excluding gs://<bucket_name>).
    """
    bucket = client.get_bucket(bucket_name)
    blobs = bucket.list_blobs(prefix=blob_prefix)
    for blob in blobs:
        # Skip directory placeholder blobs (names ending in '/')
        if blob.name.endswith("/"):
            continue
        # Recreate the blob's directory structure locally before downloading
        file_split = blob.name.split("/")
        directory = '/'.join(file_split[0:-1])
        Path(directory).mkdir(parents=True, exist_ok=True)
        blob.download_to_filename(blob.name)


def upload_to_bucket(
        client: storage.Client,
        src_path: str,
