[docs] Fix indentation issues
Improper indentation will end up rendering as blockquotes.
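For instance (a minimal illustration, not taken from the changed files), reStructuredText parses text indented relative to a plain paragraph as a blockquote, while the same indentation under a directive becomes the directive's body:

```rst
This renders as a blockquote, because the indented line follows
plain paragraph text:

    some stray indented text

.. option:: job_config.data.inputs[].type

   This renders as the option's description, because the indented
   text belongs to the ``option`` directive.
```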
econchick committed Feb 3, 2021
1 parent f5c8375 commit 6017347
Showing 5 changed files with 69 additions and 69 deletions.
70 changes: 35 additions & 35 deletions docs/src/userguide/config/data_config.rst
@@ -13,17 +13,17 @@ Google Cloud Storage

Example configuration for `Google Cloud Storage`_:

.. code-block:: yaml

    name: my-cool-job
    pipeline_options:
      streaming: True
    job_config:
      data:
        inputs:
          - type: gcs
            location: gs://my-bucket/my-jobs-folder
            file_suffix: .ogg

.. _data-inputs-type:
.. option:: job_config.data.inputs[].type STR
@@ -75,14 +75,14 @@ Custom

Example configuration for a custom data input that is not supported by Klio:

.. code-block:: yaml

    name: my-cool-job
    job_config:
      data:
        inputs:
          - type: custom
            some_key: some_value

.. option:: job_config.data.inputs[].type

@@ -116,17 +116,17 @@ Google Cloud Storage

Example configuration for `Google Cloud Storage`_:

.. code-block:: yaml

    name: my-cool-job
    pipeline_options:
      streaming: True
    job_config:
      data:
        outputs:
          - type: gcs
            location: gs://my-bucket/my-jobs-folder
            file_suffix: .wav

.. option:: job_config.data.outputs[].type STR

@@ -172,14 +172,14 @@ Custom

Example configuration for a custom data output that is not supported by Klio:

.. code-block:: yaml

    name: my-cool-job
    job_config:
      data:
        outputs:
          - type: custom
            some_key: some_value

.. option:: job_config.data.outputs[].type

20 changes: 10 additions & 10 deletions docs/src/userguide/config/event_config.rst
@@ -371,20 +371,20 @@ Example configuration for reading elements from a specific avro file:
| **Runner**: Dataflow, Direct
| *Required*
.. option:: job_config.events.inputs[].file_pattern STR

    Pattern of file name(s) to read.
    This field is optional if ``job_config.events.inputs[].location`` is provided.

    | **Runner**: Dataflow, Direct
    | *Optional*

    .. note::

        If both ``job_config.events.inputs[].location`` and
        ``job_config.events.inputs[].file_pattern`` are provided,
        the two fields will be joined to find files matching ``file_pattern``
        located in the provided ``location`` path.


.. option:: job_config.events.inputs[].min_bundle_size INT
26 changes: 13 additions & 13 deletions docs/src/userguide/config/index.rst
@@ -267,9 +267,9 @@ Streaming

*Case:*

* **Runner**: DirectRunner
* **Events**: Consume ``KlioMessage`` events from a Google Pub/Sub subscription; write ``KlioMessage`` events to a Google Pub/Sub topic.
* **Data**: Read input binary data from a GCS bucket; write output binary data to a GCS bucket.


.. literalinclude:: examples/streaming.yaml
@@ -278,9 +278,9 @@ Streaming

*Case:*

* **Runner**: DataflowRunner
* **Events**: Consume ``KlioMessage`` events from a Google Pub/Sub subscription; write ``KlioMessage`` events to a Google Pub/Sub topic.
* **Data**: Read input binary data from a GCS bucket; write output binary data to a GCS bucket.


.. literalinclude:: examples/streaming-dataflow.yaml
@@ -292,20 +292,20 @@ Batch

*Case:*

* **Runner**: DirectRunner
* **Events**: Generate ``KlioMessage`` events from a local file; write ``KlioMessage`` events to a local file.
* **Data**: Read input binary data from a GCS bucket; write output binary data to a GCS bucket.


.. literalinclude:: examples/batch.yaml
:language: yaml

*Case:*

* **Runner**: DataflowRunner
* **Events**: Generate ``KlioMessage`` events from a local file; write ``KlioMessage`` events to a local file.
* **Data**: Read input binary data from a GCS bucket; write output binary data to a GCS bucket.
* Use `Python packaging for dependency management`_ instead of using/packaging with Docker to run on the workers

.. literalinclude:: examples/batch-dataflow.yaml
:language: yaml
16 changes: 8 additions & 8 deletions docs/src/userguide/pipeline/overview.rst
@@ -81,10 +81,10 @@ Below is an example of a transform that inherits from Beam's DoFn.
Klio enhances Beam by offering decorators that can be imported from ``klio.transforms``
and applied to methods on transforms to make use of functionality such as the examples below.

* :ref:`De/serialization of Klio Messages <serialization-klio-message>`
* :ref:`Inject klio context on methods and functions <accessing-klio-context>`
* :ref:`Handle timeouts <timeout>`
* :ref:`Retry on failure <retries>`
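The decorator pattern behind features like retry-on-failure can be sketched in plain Python. This is an illustrative stand-in only, with a hypothetical ``retry`` decorator and ``FlakyTransform`` class; it is not Klio's implementation (see ``klio.transforms`` for the real decorators):

```python
import functools

def retry(tries):
    """Re-invoke the wrapped method up to `tries` times on failure.

    Illustrative stand-in for a Klio-style retry decorator; the real
    one lives in klio.transforms.
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            last_exc = None
            for _ in range(tries):
                try:
                    return func(*args, **kwargs)
                except Exception as exc:
                    last_exc = exc
            raise last_exc
        return wrapper
    return decorator

class FlakyTransform:
    """A transform-like class whose method fails twice, then succeeds."""

    def __init__(self):
        self.calls = 0

    @retry(tries=3)
    def process(self, element):
        self.calls += 1
        if self.calls < 3:
            raise RuntimeError("transient failure")
        return element.upper()

t = FlakyTransform()
print(t.process("ok"))  # succeeds on the third attempt
```

Decorating the method keeps the retry policy declarative and out of the transform's business logic, which is the ergonomic the bullets above describe.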



@@ -131,10 +131,10 @@ Custom transforms can be imported and used in the ``run.py`` to put together the
Klio also offers composite :ref:`built-in transforms <builtin-transforms>` that can be used directly in the ``run.py`` function.

* :ref:`Data existence checks <data-existence-checks>`
* :ref:`Inject klio context on methods and functions <accessing-klio-context>`
* :ref:`Handle timeouts <timeout>`
* :ref:`Retry on failure <retries>`



6 changes: 3 additions & 3 deletions docs/src/userguide/pipeline/state.rst
@@ -5,9 +5,9 @@ State can be passed from one transform to the next within a pipeline using the ``data.payload``
(``bytes``) attribute. What one transform yields/returns will turn into the ``data.payload`` value
for the next transform, with three exceptions:

1. the yielded/returned value is equal to the ``data`` argument given to the transform;
2. the yielded/returned value is ``None``; or
3. the transform raises an exception, therefore dropping the message.
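These three rules can be sketched as a small pure-Python simulation. The helper name ``next_payload`` is hypothetical and for illustration only; Klio's actual behavior lives in its message-handling internals:

```python
def next_payload(current, transform):
    """Compute what data.payload would be for the next transform,
    following the three exceptions above.

    Returns None when the message is dropped (rule 3); otherwise
    returns the payload the next transform would receive.
    """
    try:
        result = transform(current)
    except Exception:
        return None  # rule 3: an exception drops the message
    if result is None or result == current:
        return current  # rules 1 and 2: payload passes through unchanged
    return result  # otherwise the returned value becomes the new payload

# The returned value replaces the payload...
print(next_payload(b"in", lambda data: b"out"))  # b'out'
# ...unless it is None or equal to the `data` argument:
print(next_payload(b"in", lambda data: None))    # b'in'
print(next_payload(b"in", lambda data: data))    # b'in'
```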


An illustrative example:
