[AIRFLOW-3557] Fix various typos (apache#4357)
BasPH authored and kaxil committed Dec 22, 2018
1 parent 701b877 commit 3be104b
Showing 9 changed files with 12 additions and 12 deletions.
6 changes: 3 additions & 3 deletions CHANGELOG.txt
@@ -24,7 +24,7 @@ Improvements:
[AIRFLOW-2622] Add "confirm=False" option to SFTPOperator
[AIRFLOW-2662] support affinity & nodeSelector policies for kubernetes executor/operator
[AIRFLOW-2709] Improve error handling in Databricks hook
-[AIRFLOW-2723] Update lxml dependancy to >= 4.0.
+[AIRFLOW-2723] Update lxml dependency to >= 4.0.
[AIRFLOW-2763] No precheck mechanism in place during worker initialisation for the connection to metadata database
[AIRFLOW-2789] Add ability to create single node cluster to DataprocClusterCreateOperator
[AIRFLOW-2797] Add ability to create Google Dataproc cluster with custom image
@@ -269,7 +269,7 @@ AIRFLOW 1.10.0, 2018-08-03
[AIRFLOW-2429] Make Airflow flake8 compliant
[AIRFLOW-2491] Resolve flask version conflict
[AIRFLOW-2484] Remove duplicate key in MySQL to GCS Op
-[ARIFLOW-2458] Add cassandra-to-gcs operator
+[AIRFLOW-2458] Add cassandra-to-gcs operator
[AIRFLOW-2477] Improve time units for task duration and landing times charts for RBAC UI
[AIRFLOW-2474] Only import snakebite if using py2
[AIRFLOW-48] Parse connection uri querystring
@@ -1504,7 +1504,7 @@ AIRFLOW 1.8.0, 2017-03-12
[AIRFLOW-784] Pin funcsigs to 1.0.0
[AIRFLOW-624] Fix setup.py to not import airflow.version as version
[AIRFLOW-779] Task should fail with specific message when deleted
-[AIRFLOW-778] Fix completey broken MetastorePartitionSensor
+[AIRFLOW-778] Fix completely broken MetastorePartitionSensor
[AIRFLOW-739] Set pickle_info log to debug
[AIRFLOW-771] Make S3 logs append instead of clobber
[AIRFLOW-773] Fix flaky datetime addition in api test
4 changes: 2 additions & 2 deletions CONTRIBUTING.md
@@ -166,10 +166,10 @@ There are three ways to setup an Apache Airflow development environment.
tox -e py35-backend_mysql
```

-If you wish to run individual tests inside of docker enviroment you can do as follows:
+If you wish to run individual tests inside of Docker environment you can do as follows:

```bash
-# From the container (with your desired enviroment) with druid hook
+# From the container (with your desired environment) with druid hook
tox -e py35-backend_mysql -- tests/hooks/test_druid_hook.py
```

2 changes: 1 addition & 1 deletion airflow/contrib/hooks/bigquery_hook.py
@@ -1594,7 +1594,7 @@ def insert_all(self, project_id, dataset_id, table_id,
self.log.info('All row(s) inserted successfully: {}:{}.{}'.format(
dataset_project_id, dataset_id, table_id))
else:
-error_msg = '{} insert error(s) occured: {}:{}.{}. Details: {}'.format(
+error_msg = '{} insert error(s) occurred: {}:{}.{}. Details: {}'.format(
len(resp['insertErrors']),
dataset_project_id, dataset_id, table_id, resp['insertErrors'])
if fail_on_error:
2 changes: 1 addition & 1 deletion airflow/contrib/hooks/emr_hook.py
@@ -23,7 +23,7 @@

class EmrHook(AwsHook):
"""
-Interact with AWS EMR. emr_conn_id is only neccessary for using the
+Interact with AWS EMR. emr_conn_id is only necessary for using the
create_job_flow method.
"""

2 changes: 1 addition & 1 deletion airflow/executors/celery_executor.py
@@ -74,7 +74,7 @@ def execute_command(command_to_exec):

class ExceptionWithTraceback(object):
"""
-Wrapper class used to propogate exceptions to parent processes from subprocesses.
+Wrapper class used to propagate exceptions to parent processes from subprocesses.
:param exception: The exception to wrap
:type exception: Exception
:param traceback: The stacktrace to wrap
2 changes: 1 addition & 1 deletion airflow/sensors/base_sensor_operator.py
@@ -48,7 +48,7 @@ class BaseSensorOperator(BaseOperator, SkipMixin):
When set to ``poke`` the sensor is taking up a worker slot for its
whole execution time and sleeps between pokes. Use this mode if the
expected runtime of the sensor is short or if a short poke interval
-is requried.
+is required.
When set to ``reschedule`` the sensor task frees the worker slot when
the criteria is not yet met and it's rescheduled at a later time. Use
this mode if the expected time until the criteria is met is. The poke
2 changes: 1 addition & 1 deletion tests/dags/.gitignore
@@ -1,5 +1,5 @@
# This line is to avoid accidental commits of example dags for integration testing
# In order to test example dags easily we often create symbolic links in this directory
# and run the Airflow with AIRFLOW__CORE__UNIT_TEST_MODE=True
-# this line prevents accidental commiting of such symbolic links
+# this line prevents accidental committing of such symbolic links.
example_*
2 changes: 1 addition & 1 deletion tests/models.py
@@ -1364,7 +1364,7 @@ def tearDownClass(cls):

def test_get_existing_dag(self):
"""
-test that were're able to parse some example DAGs and retrieve them
+Test that we're able to parse some example DAGs and retrieve them
"""
dagbag = models.DagBag(dag_folder=self.empty_dir, include_examples=True)

2 changes: 1 addition & 1 deletion tests/test_jobs.py
@@ -196,7 +196,7 @@ def test_backfill_examples(self):
"""
Test backfilling example dags
-Try to backfill some of the example dags. Be carefull, not all dags are suitable
+Try to backfill some of the example dags. Be careful, not all dags are suitable
for doing this. For example, a dag that sleeps forever, or does not have a
schedule won't work here since you simply can't backfill them.
"""
