[AIRFLOW-4321] Replace incorrect info of Max Size limit of GCS Object Size (apache#5106)
kaxil authored and XD-DENG committed Apr 15, 2019
1 parent 79fbf25 commit 91a4478
Showing 3 changed files with 9 additions and 9 deletions.
6 changes: 3 additions & 3 deletions airflow/contrib/operators/cassandra_to_gcs.py
@@ -57,9 +57,9 @@ class CassandraToGoogleCloudStorageOperator(BaseOperator):
:type schema_filename: str
:param approx_max_file_size_bytes: This operator supports the ability
to split large table dumps into multiple files (see notes in the
- filenamed param docs above). Google cloud storage allows for files
- to be a maximum of 4GB. This param allows developers to specify the
- file size of the splits.
+ filename param docs above). This param allows developers to specify the
+ file size of the splits. Check https://cloud.google.com/storage/quotas
+ to see the maximum allowed file size for a single object.
:type approx_max_file_size_bytes: long
:param cassandra_conn_id: Reference to a specific Cassandra hook.
:type cassandra_conn_id: str
6 changes: 3 additions & 3 deletions airflow/contrib/operators/mysql_to_gcs.py
@@ -59,9 +59,9 @@ class MySqlToGoogleCloudStorageOperator(BaseOperator):
:type schema_filename: str
:param approx_max_file_size_bytes: This operator supports the ability
to split large table dumps into multiple files (see notes in the
- filenamed param docs above). Google cloud storage allows for files
- to be a maximum of 4GB. This param allows developers to specify the
- file size of the splits.
+ filename param docs above). This param allows developers to specify the
+ file size of the splits. Check https://cloud.google.com/storage/quotas
+ to see the maximum allowed file size for a single object.
:type approx_max_file_size_bytes: long
:param mysql_conn_id: Reference to a specific MySQL hook.
:type mysql_conn_id: str
6 changes: 3 additions & 3 deletions airflow/contrib/operators/postgres_to_gcs_operator.py
@@ -70,9 +70,9 @@ def __init__(self,
:type schema_filename: str
:param approx_max_file_size_bytes: This operator supports the ability
to split large table dumps into multiple files (see notes in the
- filenamed param docs above). Google Cloud Storage allows for files
- to be a maximum of 4GB. This param allows developers to specify the
- file size of the splits.
+ filename param docs above). This param allows developers to specify the
+ file size of the splits. Check https://cloud.google.com/storage/quotas
+ to see the maximum allowed file size for a single object.
:type approx_max_file_size_bytes: long
:param postgres_conn_id: Reference to a specific Postgres hook.
:type postgres_conn_id: str
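For context on the parameter whose docstring is corrected above: `approx_max_file_size_bytes` caps the size of each exported file, and dumps larger than the cap are split across multiple GCS objects (the `{}` in `filename` receives the chunk index). Below is a minimal usage sketch with `MySqlToGoogleCloudStorageOperator` against the Airflow 1.10-era contrib import path; the DAG settings, query, bucket, and connection IDs are illustrative placeholders, not part of this commit.

```python
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.mysql_to_gcs import MySqlToGoogleCloudStorageOperator

# Placeholder DAG; dag_id, schedule, and start_date are illustrative only.
with DAG(
    dag_id="mysql_export_example",
    start_date=datetime(2019, 4, 15),
    schedule_interval=None,
) as dag:
    export_orders = MySqlToGoogleCloudStorageOperator(
        task_id="export_orders_to_gcs",
        sql="SELECT * FROM orders",        # placeholder query
        bucket="example-bucket",           # placeholder GCS bucket
        # "{}" is replaced with the chunk index when the dump is split
        filename="exports/orders_{}.json",
        # Split the dump into roughly 100 MB files; the hard per-object
        # ceiling is whatever https://cloud.google.com/storage/quotas allows.
        approx_max_file_size_bytes=100 * 1024 * 1024,
        mysql_conn_id="mysql_default",
        google_cloud_storage_conn_id="google_cloud_default",
    )
```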
