
Merge pull request #19 from ingestly/add-consent-info
adjust a condition for ID handling
hjmsano authored Apr 12, 2020
2 parents 6106a5f + bd1e0a0 commit ca1acac
Showing 6 changed files with 18 additions and 13 deletions.
6 changes: 3 additions & 3 deletions BigQuery/log_format
@@ -5,13 +5,13 @@
   "endpoint_version":"1.0.0",
   "action":"%{json.escape(urldecode(subfield(req.url.qs, "action", "&")))}V",
   "category":"%{json.escape(urldecode(subfield(req.url.qs, "category", "&")))}V",
-  "cookie":"%{json.escape(urldecode(subfield(req.url.qs, "ck", "&")))}V",
+  "cookie":"%{json.escape(if(subfield(req.url.qs, "ck", "&") != "false", "true", "false"))}V",
   "consent":"%{json.escape(urldecode(subfield(req.url.qs, "consent", "&")))}V",
   "request_id":"%{digest.hash_sha256(req.url)}V",
   "ingestly_id":"%{json.escape(if(req.http.Cookie:ingestlyId, req.http.Cookie:ingestlyId, subfield(req.url.qs, "rootId", "&")))}V",
   "session_id":"%{json.escape(if(req.http.Cookie:ingestlySes, req.http.Cookie:ingestlySes, subfield(req.url.qs, "rootId", "&")))}V",
-  "is_id_new":"%{json.escape(if(req.http.Cookie:ingestlyId && subfield(req.url.qs, "ck", "&") == "true", "false", "true"))}V",
-  "is_session_new":"%{json.escape(if(req.http.Cookie:ingestlySes && subfield(req.url.qs, "ck", "&") == "true", "false", "true"))}V",
+  "is_id_new":"%{json.escape(if(req.http.Cookie:ingestlyId && subfield(req.url.qs, "ck", "&") != "false", "false", "true"))}V",
+  "is_session_new":"%{json.escape(if(req.http.Cookie:ingestlySes && subfield(req.url.qs, "ck", "&") != "false", "false", "true"))}V",
   "root_id":"%{json.escape(urldecode(subfield(req.url.qs, "rootId", "&")))}V",
   "since_init_ms":"%{if(subfield(req.url.qs, "sinceInitMs", "&"), subfield(req.url.qs, "sinceInitMs", "&"), "-1")}V",
   "since_prev_ms":"%{if(subfield(req.url.qs, "sincePrevMs", "&"), subfield(req.url.qs, "sincePrevMs", "&"), "-1")}V",
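The change loosens the cookie check: previously `is_id_new` and `is_session_new` treated an ID as existing only when the browser explicitly reported `ck=true`; now any value other than `false` (including a missing parameter) counts as cookies enabled. A minimal Python sketch of the two conditions, modeling a missing `ck` as an empty string; the helper and its inputs are illustrative, not part of the repository:

```python
def is_id_new(cookie_present: bool, ck: str, strict: bool) -> str:
    """Model the is_id_new condition from the log format.

    strict=True  -> old condition: ck == "true"
    strict=False -> new condition: ck != "false"
    """
    cookies_ok = (ck == "true") if strict else (ck != "false")
    # The ID is "not new" only when the cookie exists AND cookies are enabled.
    return "false" if cookie_present and cookies_ok else "true"

# With the ingestlyId cookie set but ck missing, the old condition
# reported a new ID while the new one does not:
print(is_id_new(True, "", strict=True))   # "true"  (old: treated as new)
print(is_id_new(True, "", strict=False))  # "false" (new: treated as existing)
```

An explicit `ck=false` still marks the ID as new under both conditions, so opted-out browsers are unaffected.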
5 changes: 5 additions & 0 deletions BigQuery/table_schema
@@ -14,6 +14,11 @@
     "type": "STRING",
     "mode": "NULLABLE"
   },
+  {
+    "name": "endpoint_version",
+    "type": "STRING",
+    "mode": "NULLABLE"
+  },
   {
     "name": "action",
     "type": "STRING",
6 changes: 3 additions & 3 deletions Elasticsearch/log_format
@@ -5,13 +5,13 @@
   "endpoint_version":"1.0.0",
   "action":"%{json.escape(urldecode(subfield(req.url.qs, "action", "&")))}V",
   "category":"%{json.escape(urldecode(subfield(req.url.qs, "category", "&")))}V",
-  "cookie":"%{json.escape(urldecode(subfield(req.url.qs, "ck", "&")))}V",
+  "cookie":"%{json.escape(if(subfield(req.url.qs, "ck", "&") != "false", "true", "false"))}V",
   "consent":%{if(subfield(req.url.qs, "consent", "&"), urldecode(subfield(req.url.qs, "consent", "&")), "\{\}")}V,
   "request_id":"%{digest.hash_sha256(req.url)}V",
   "ingestly_id":"%{json.escape(if(req.http.Cookie:ingestlyId, req.http.Cookie:ingestlyId, subfield(req.url.qs, "rootId", "&")))}V",
   "session_id":"%{json.escape(if(req.http.Cookie:ingestlySes, req.http.Cookie:ingestlySes, subfield(req.url.qs, "rootId", "&")))}V",
-  "is_id_new":"%{json.escape(if(req.http.Cookie:ingestlyId && subfield(req.url.qs, "ck", "&") == "true", "false", "true"))}V",
-  "is_session_new":"%{json.escape(if(req.http.Cookie:ingestlySes && subfield(req.url.qs, "ck", "&") == "true", "false", "true"))}V",
+  "is_id_new":"%{json.escape(if(req.http.Cookie:ingestlyId && subfield(req.url.qs, "ck", "&") != "false", "false", "true"))}V",
+  "is_session_new":"%{json.escape(if(req.http.Cookie:ingestlySes && subfield(req.url.qs, "ck", "&") != "false", "false", "true"))}V",
   "root_id":"%{json.escape(urldecode(subfield(req.url.qs, "rootId", "&")))}V",
   "since_init_ms":"%{if(subfield(req.url.qs, "sinceInitMs", "&"), subfield(req.url.qs, "sinceInitMs", "&"), "-1")}V",
   "since_prev_ms":"%{if(subfield(req.url.qs, "sincePrevMs", "&"), subfield(req.url.qs, "sincePrevMs", "&"), "-1")}V",
6 changes: 3 additions & 3 deletions README-JP.md
@@ -92,7 +92,7 @@ B. To disable the Analyzer, remove the `analysis` section (lines 22-4
 1. Open `Logging` in the CONFIGURE page.
 2. Click `CREATE ENDPOINT` and select `Google BigQuery`.
 3. Open the `attach a condition.` link near the highlighted `CONDITION` and select `CREATE A NEW RESPONSE CONDITION`.
-4. Enter a name like `Data Ingestion` and set `(resp.status == 204 && req.url ~ "^/ingestly-ingest/(.*?)/\?.*")` in the `Apply if…` field.
+4. Enter a name like `Data Ingestion` and set `(resp.status == 204 && req.url ~ "^/ingestly-ingest/(.*?)/\?.*" || resp.status == 200 && req.url ~ "^/ingestly-sync/(.*?)/\?.*")` in the `Apply if…` field.
 5. Fill information into each field:
    - `Name` : any name you like
    - `Log format` : copy and paste the content of the `BigQuery/log_format` file in this repository
@@ -108,7 +108,7 @@ B. To disable the Analyzer, remove the `analysis` section (lines 22-4
 1. Open `Logging` in the CONFIGURE page.
 2. Click `CREATE ENDPOINT` and select `Elasticsearch`.
 3. Open the `attach a condition.` link near the highlighted `CONDITION` and select `CREATE A NEW RESPONSE CONDITION`.
-4. Enter a name like `Data Ingestion` and set `(resp.status == 204 && req.url ~ "^/ingestly-ingest/(.*?)/\?.*")` in the `Apply if…` field.
+4. Enter a name like `Data Ingestion` and set `(resp.status == 204 && req.url ~ "^/ingestly-ingest/(.*?)/\?.*" || resp.status == 200 && req.url ~ "^/ingestly-sync/(.*?)/\?.*")` in the `Apply if…` field.
 5. Fill information into each field:
    - `Name` : any name you like
    - `Log format` : copy and paste the content of the `Elasticsearch/log_format` file in this repository
@@ -122,7 +122,7 @@ B. To disable the Analyzer, remove the `analysis` section (lines 22-4
 1. Open `Logging` in the CONFIGURE page.
 2. Click `CREATE ENDPOINT` and select `Amazon S3`.
 3. Open the `attach a condition.` link near the highlighted `CONDITION` and select `CREATE A NEW RESPONSE CONDITION`.
-4. Enter a name like `Data Ingestion` and set `(resp.status == 204 && req.url ~ "^/ingestly-ingest/(.*?)/\?.*")` in the `Apply if…` field.
+4. Enter a name like `Data Ingestion` and set `(resp.status == 204 && req.url ~ "^/ingestly-ingest/(.*?)/\?.*" || resp.status == 200 && req.url ~ "^/ingestly-sync/(.*?)/\?.*")` in the `Apply if…` field.
 5. Fill information into each field:
    - `Name` : any name you like
    - `Log format` : copy and paste the content of the `S3/log_format` file in this repository
6 changes: 3 additions & 3 deletions README.md
@@ -92,7 +92,7 @@ B. Remove `analysis` section (from line 22 to line 40) from `Elasticsearch/mappi
 1. Open `Logging` in CONFIGURE page.
 2. Click `CREATE ENDPOINT` button and select `Google BigQuery`.
 3. Open `attach a condition.` link near highlighted `CONDITION` and select `CREATE A NEW RESPONSE CONDITION`.
-4. Enter a name like `Data Ingestion` and set `(resp.status == 204 && req.url ~ "^/ingestly-ingest/(.*?)/\?.*")` into `Apply if…` field.
+4. Enter a name like `Data Ingestion` and set `(resp.status == 204 && req.url ~ "^/ingestly-ingest/(.*?)/\?.*" || resp.status == 200 && req.url ~ "^/ingestly-sync/(.*?)/\?.*")` into `Apply if…` field.
 5. Fill information into fields:
    - `Name` : anything you want.
    - `Log format` : copy and paste the content of `BigQuery/log_format` file in this repository.
@@ -108,7 +108,7 @@ B. Remove `analysis` section (from line 22 to line 40) from `Elasticsearch/mappi
 1. Open `Logging` in CONFIGURE page.
 2. Click `CREATE ENDPOINT` button and select `Elasticsearch`.
 3. Open `attach a condition.` link near highlighted `CONDITION` and select `CREATE A NEW RESPONSE CONDITION`.
-4. Enter a name like `Data Ingestion` and set `(resp.status == 204 && req.url ~ "^/ingestly-ingest/(.*?)/\?.*")` into `Apply if…` field.
+4. Enter a name like `Data Ingestion` and set `(resp.status == 204 && req.url ~ "^/ingestly-ingest/(.*?)/\?.*" || resp.status == 200 && req.url ~ "^/ingestly-sync/(.*?)/\?.*")` into `Apply if…` field.
 5. Fill information into fields:
    - `Name` : anything you want.
    - `Log format` : copy and paste the content of `Elasticsearch/log_format` file in this repository.
@@ -122,7 +122,7 @@ B. Remove `analysis` section (from line 22 to line 40) from `Elasticsearch/mappi
 1. Open `Logging` in CONFIGURE page.
 2. Click `CREATE ENDPOINT` button and select `Amazon S3`.
 3. Open `attach a condition.` link near highlighted `CONDITION` and select `CREATE A NEW RESPONSE CONDITION`.
-4. Enter a name like `Data Ingestion` and set `(resp.status == 204 && req.url ~ "^/ingestly-ingest/(.*?)/\?.*")` into `Apply if…` field.
+4. Enter a name like `Data Ingestion` and set `(resp.status == 204 && req.url ~ "^/ingestly-ingest/(.*?)/\?.*" || resp.status == 200 && req.url ~ "^/ingestly-sync/(.*?)/\?.*")` into `Apply if…` field.
 5. Fill information into fields:
    - `Name` : anything you want.
    - `Log format` : copy and paste the content of `S3/log_format` file in this repository. You can specify not only CSV but JSON format here (`{ ... }` form).
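The updated condition extends logging from ingest responses only (`204` on `/ingestly-ingest/`) to also capture sync responses (`200` on `/ingestly-sync/`). A quick Python sketch of what the combined condition accepts; Fastly evaluates `~` with PCRE, and Python's `re` behaves the same for these patterns. The example URLs are hypothetical:

```python
import re

# Pattern strings taken verbatim from the Apply if… condition above.
ingest = re.compile(r"^/ingestly-ingest/(.*?)/\?.*")
sync = re.compile(r"^/ingestly-sync/(.*?)/\?.*")

def matches_condition(status: int, url: str) -> bool:
    # Mirrors: (resp.status == 204 && ingest URL)
    #       || (resp.status == 200 && sync URL)
    return (status == 204 and bool(ingest.match(url))) or \
           (status == 200 and bool(sync.match(url)))

print(matches_condition(204, "/ingestly-ingest/v1/?action=view"))  # True
print(matches_condition(200, "/ingestly-sync/v1/?rootId=abc"))     # True
print(matches_condition(200, "/ingestly-ingest/v1/?action=view"))  # False
```

Note that status and path are paired: a `200` on the ingest path (or a `204` on the sync path) is still excluded.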