Commit 9bc0d39

criccomini committed Sep 13, 2017
2 parents 004a347 + 6ac2963 commit 9bc0d39
Showing 149 changed files with 1,836 additions and 1,721 deletions.
10 changes: 10 additions & 0 deletions .rat-excludes
@@ -25,3 +25,13 @@ CHANGELOG.txt
kerberos_auth.py
airflow_api_auth_backend_kerberos_auth_py.html
licenses/*
airflow/www/static/docs
parallel.js
underscore.js
jquery.dataTables.min.js
jqClock.min.js
dagre-d3.min.js
bootstrap-toggle.min.js
bootstrap-toggle.min.css
d3.v3.min.js
ace.js
45 changes: 45 additions & 0 deletions CHANGELOG.txt
@@ -1,3 +1,48 @@
AIRFLOW 1.8.2, 2017-09-04
-------------------------

9a53e66 [AIRFLOW-809][AIRFLOW-1] Use __eq__ ColumnOperator When Testing Booleans
333e0b3 [AIRFLOW-1296] Propagate SKIPPED to all downstream tasks
93825d5 Re-enable caching for hadoop components
33a9dcb Pin Hive and Hadoop to a specific version and create writable warehouse dir
7cff6cd [AIRFLOW-1308] Disable nanny usage for Dask
c6a09c4 Updating CHANGELOG for 1.8.2rc1
570b2ed [AIRFLOW-1294] Backfills can loose tasks to execute
3f48d48 [AIRFLOW-1291] Update NOTICE and LICENSE files to match ASF requirements
e10af9a [AIRFLOW-XXX] Set version to 1.8.2rc1
69bd269 [AIRFLOW-1160] Update Spark parameters for Mesos
9692510 [AIRFLOW 1149][AIRFLOW-1149] Allow for custom filters in Jinja2 templates
6de5330 [AIRFLOW-1119] Fix unload query so headers are on first row[]
b4e9eb8 [AIRFLOW-1089] Add Spark application arguments
a4083f3 [AIRFLOW-1078] Fix latest_runs endpoint for old flask versions
7a02841 [AIRFLOW-1074] Don't count queued tasks for concurrency limits
a2c18a5 [AIRFLOW-1064] Change default sort to job_id for TaskInstanceModelView
d1c64ab [AIRFLOW-1038] Specify celery serialization options explicitly
b4ee88a [AIRFLOW-1036] Randomize exponential backoff
9fca409 [AIRFLOW-993] Update date inference logic
272c2f5 [AIRFLOW-1167] Support microseconds in FTPHook modification time
c7c0b72 [AIRFLOW-1179] Fix Pandas 0.2x breaking Google BigQuery change
acd0166 [AIRFLOW-1263] Dynamic height for charts
7f33f6e [AIRFLOW-1266] Increase width of gantt y axis
fc33c04 [AIRFLOW-1290] set docs author to 'Apache Airflow'
2e9eee3 [AIRFLOW-1282] Fix known event column sorting
2389a8a [AIRFLOW-1166] Speed up _change_state_for_tis_without_dagrun
bf966e6 [AIRFLOW-1192] Some enhancements to qubole_operator
57d5bcd [AIRFLOW-1281] Sort variables by key field by default
802fc15 [AIRFLOW-1244] Forbid creation of a pool with empty name
1232b6a [AIRFLOW-1243] DAGs table has no default entries to show
b0ba3c9 [AIRFLOW-1227] Remove empty column on the Logs view
c406652 [AIRFLOW-1226] Remove empty column on the Jobs view
51a83cc [AIRFLOW-1199] Fix create modal
cac7d4c [AIRFLOW-1200] Forbid creation of a variable with an empty key
5f3ee52 [AIRFLOW-1186] Sort dag.get_task_instances by execution_date
f446c08 [AIRFLOW-1145] Fix closest_date_partition function with before set to True If we're looking for the closest date before, we should take the latest date in the list of date before.
93b8e96 [AIRFLOW-1180] Fix flask-wtf version for test_csrf_rejection
bb56805 [AIRFLOW-1170] DbApiHook insert_rows inserts parameters separately
093b2f0 [AIRFLOW-1150] Fix scripts execution in sparksql hook[]
777f181 [AIRFLOW-1168] Add closing() to all connections and cursors
bc8e912 [AIRFLOW-XXX] Updating CHANGELOG, README, and UPDATING after 1.8.1 release

AIRFLOW 1.8.1, 2017-05-09
-------------------------

9 changes: 9 additions & 0 deletions INSTALL
@@ -0,0 +1,9 @@
# INSTALL / BUILD instruction for Apache Airflow (incubating)
# fetch the tarball and untar the source

# [optional] run Apache RAT (release audit tool) to validate license headers
# RAT docs here: https://creadur.apache.org/rat/
java -jar apache-rat.jar -E ./.rat-excludes -d .

# install the release
python setup.py install
1 change: 1 addition & 0 deletions README.md
@@ -117,6 +117,7 @@ Currently **officially** using Airflow:
1. [eRevalue](https://www.datamaran.com) [[@hamedhsn](https://github.com/hamedhsn)]
1. [evo.company](https://evo.company/) [[@orhideous](https://github.com/orhideous)]
1. [FreshBooks](https://github.com/freshbooks) [[@DinoCow](https://github.com/DinoCow)]
1. [GameWisp](https://gamewisp.com) [[@tjbiii](https://github.com/TJBIII) & [@theryanwalls](https://github.com/theryanwalls)]
1. [Gentner Lab](http://github.com/gentnerlab) [[@neuromusic](https://github.com/neuromusic)]
1. [Glassdoor](https://github.com/Glassdoor) [[@syvineckruyk](https://github.com/syvineckruyk)]
1. [GovTech GDS](https://gds-gov.tech) [[@chrissng](https://github.com/chrissng) & [@datagovsg](https://github.com/datagovsg)]
4 changes: 3 additions & 1 deletion UPDATING.md
@@ -13,7 +13,9 @@ assists people when migrating to a new version.
- No updates are required if you are using ftpHook; it will continue to work as is.

### Logging update
Logs now are stored in the log folder as ``{dag_id}/{task_id}/{execution_date}/{try_number}.log``.
Airflow's logging has been rewritten to use Python's built-in `logging` module for system logging. Classes that extend the existing `LoggingMixin` send all of their logging through a central logger. The main benefits are easier configuration of logging through `default_airflow_logging.py` and the ability to use different handlers for logging.

Logs now are stored in the log folder as `{dag_id}/{task_id}/{execution_date}/{try_number}.log`.
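For illustration only, a minimal sketch of the pattern described above, using the `LoggingMixin` import path and `logger` property that appear elsewhere in this commit (`MyHook` is a hypothetical class; the path and property name changed in later Airflow releases):

```python
from airflow.utils.log.LoggingMixin import LoggingMixin


class MyHook(LoggingMixin):
    def run(self):
        # self.logger is supplied by LoggingMixin and goes through the
        # centrally configured handlers instead of a per-module logger.
        self.logger.info("running with the centrally configured logger")
```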

### New Features

10 changes: 6 additions & 4 deletions airflow/__init__.py
@@ -21,9 +21,10 @@
"""
from builtins import object
from airflow import version
from airflow.utils.log.LoggingMixin import LoggingMixin

__version__ = version.version

import logging
import sys

from airflow import configuration as conf
@@ -40,13 +41,15 @@


def load_login():
log = LoggingMixin().logger

auth_backend = 'airflow.default_login'
try:
if conf.getboolean('webserver', 'AUTHENTICATE'):
auth_backend = conf.get('webserver', 'auth_backend')
except conf.AirflowConfigException:
if conf.getboolean('webserver', 'AUTHENTICATE'):
logging.warning(
log.warning(
"auth_backend not found in webserver config reverting to "
"*deprecated* behavior of importing airflow_login")
auth_backend = "airflow_login"
@@ -55,7 +58,7 @@ def load_login():
global login
login = import_module(auth_backend)
except ImportError as err:
logging.critical(
log.critical(
"Cannot import authentication module %s. "
"Please correct your authentication backend or disable authentication: %s",
auth_backend, err
@@ -76,7 +79,6 @@ def __init__(self, namespace):
from airflow import hooks
from airflow import executors
from airflow import macros
from airflow import contrib

operators._integrate_plugins()
hooks._integrate_plugins()
12 changes: 8 additions & 4 deletions airflow/api/__init__.py
@@ -13,14 +13,16 @@
# limitations under the License.
from __future__ import print_function

import logging

from airflow.exceptions import AirflowException
from airflow import configuration as conf
from importlib import import_module

from airflow.utils.log.LoggingMixin import LoggingMixin

api_auth = None

log = LoggingMixin().logger


def load_auth():
auth_backend = 'airflow.api.auth.backend.default'
@@ -33,6 +35,8 @@ def load_auth():
global api_auth
api_auth = import_module(auth_backend)
except ImportError as err:
logging.critical("Cannot import {} for API authentication due to: {}"
.format(auth_backend, err))
log.critical(
"Cannot import %s for API authentication due to: %s",
auth_backend, err
)
raise AirflowException(err)
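The same refactor recurs throughout the remaining files in this commit: a module-level `log = LoggingMixin().logger` replaces direct `logging.*` calls, and eager `str.format()` interpolation is swapped for lazy `%s` arguments, so the message is only rendered when the record is actually emitted. A rough before/after sketch (the `auth_backend` and `err` values below are placeholders, not taken from the real code path):

```python
import logging

from airflow.utils.log.LoggingMixin import LoggingMixin

auth_backend = "airflow.api.auth.backend.default"  # placeholder value for the sketch
err = ImportError("example failure")                # placeholder value for the sketch

# Before: module-level logging with eager string formatting.
logging.critical("Cannot import {} for API authentication due to: {}"
                 .format(auth_backend, err))

# After: a logger obtained through LoggingMixin, with lazy %-style arguments
# that are interpolated only if the record is actually emitted.
log = LoggingMixin().logger
log.critical("Cannot import %s for API authentication due to: %s",
             auth_backend, err)
```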
14 changes: 9 additions & 5 deletions airflow/api/auth/backend/kerberos_auth.py
@@ -23,10 +23,12 @@
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

from future.standard_library import install_aliases

from airflow.utils.log.LoggingMixin import LoggingMixin

install_aliases()

import kerberos
import logging
import os

from airflow import configuration as conf
@@ -45,14 +47,16 @@

_SERVICE_NAME = None

log = LoggingMixin().logger


def init_app(app):
global _SERVICE_NAME

hostname = app.config.get('SERVER_NAME')
if not hostname:
hostname = getfqdn()
logging.info("Kerberos: hostname {}".format(hostname))
log.info("Kerberos: hostname %s", hostname)

service = 'airflow'

@@ -62,12 +66,12 @@ def init_app(app):
os.environ['KRB5_KTNAME'] = conf.get('kerberos', 'keytab')

try:
logging.info("Kerberos init: {} {}".format(service, hostname))
log.info("Kerberos init: %s %s", service, hostname)
principal = kerberos.getServerPrincipalDetails(service, hostname)
except kerberos.KrbError as err:
logging.warning("Kerberos: {}".format(err))
log.warning("Kerberos: %s", err)
else:
logging.info("Kerberos API: server is {}".format(principal))
log.info("Kerberos API: server is %s", principal)


def _unauthorized():
4 changes: 0 additions & 4 deletions airflow/api/common/experimental/get_task.py
@@ -12,13 +12,9 @@
# See the License for the specific language governing permissions and
# limitations under the License.

import logging

from airflow.exceptions import AirflowException
from airflow.models import DagBag

_log = logging.getLogger(__name__)


def get_task(dag_id, task_id):
"""Return the task object identified by the given dag_id and task_id."""
4 changes: 0 additions & 4 deletions airflow/api/common/experimental/get_task_instance.py
@@ -12,13 +12,9 @@
# See the License for the specific language governing permissions and
# limitations under the License.

import logging

from airflow.exceptions import AirflowException
from airflow.models import DagBag

_log = logging.getLogger(__name__)


def get_task_instance(dag_id, task_id, execution_date):
"""Return the task object identified by the given dag_id and task_id."""
1 change: 0 additions & 1 deletion airflow/bin/airflow
@@ -12,7 +12,6 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import logging
import os
from airflow import configuration
from airflow.bin.cli import CLIFactory
