FAQ on changing data granularity
Added because this question is asked frequently on the Google group; I had the same question myself and had to resort to the forums because there was no documentation.
pdeva committed Sep 25, 2014
1 parent d53ccf7 commit bc66b42
Showing 1 changed file with 9 additions and 0 deletions.
9 changes: 9 additions & 0 deletions docs/content/Ingestion-FAQ.md
@@ -42,6 +42,15 @@ You can check `<BROKER_IP>:<PORT>/druid/v2/datasources/<YOUR_DATASOURCE>?interva
You can use the IngestSegmentFirehose with an index task to re-ingest existing Druid segments using a new schema and change the name, dimensions, metrics, rollup, etc. of the segment.
See [Firehose](Firehose.html) for more details on IngestSegmentFirehose.
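As a rough sketch of what such a task might look like (the exact task spec fields vary across Druid versions, and the `wikipedia` datasource name, interval, and aggregator here are illustrative assumptions; consult the Tasks documentation for the spec your version expects), an index task reading from the IngestSegmentFirehose takes roughly this shape:

```json
{
  "type": "index",
  "dataSource": "wikipedia",
  "aggregators": [
    { "type": "longSum", "name": "count", "fieldName": "count" }
  ],
  "firehose": {
    "type": "ingestSegment",
    "dataSource": "wikipedia",
    "interval": "2014-01-01/2014-02-01"
  }
}
```

The firehose's `dataSource` and `interval` select which existing segments to read, while the task's own schema (dimensions, metrics, rollup) defines what gets written back.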

## How can I change the granularity of existing data in Druid?

In many situations you may want to lower the granularity of older data. For example, data older than one month might only need hour-level granularity, while newer data keeps minute-level granularity.

To do this, use the IngestSegmentFirehose and run an indexer task. The IngestSegmentFirehose lets you read existing segments from Druid, aggregate them at a coarser granularity, and feed them back into Druid. It also lets you filter the data in those segments during re-ingestion, so if there are rows you want to delete, you can simply filter them out.

Typically the above is run as a periodic batch job that, say, once a day takes in one chunk of data and re-aggregates it.
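A hedged sketch of such a re-indexing task, rolling a month of data up to hour granularity while dropping unwanted rows (the `events` datasource, the `status` dimension, and the exact `granularitySpec` field names are illustrative assumptions; field names differ between Druid versions, so check the Tasks and Firehose docs for your release):

```json
{
  "type": "index",
  "dataSource": "events",
  "granularitySpec": {
    "type": "uniform",
    "gran": "HOUR",
    "intervals": ["2014-08-01/2014-09-01"]
  },
  "firehose": {
    "type": "ingestSegment",
    "dataSource": "events",
    "interval": "2014-08-01/2014-09-01",
    "filter": {
      "type": "not",
      "field": { "type": "selector", "dimension": "status", "value": "deleted" }
    }
  }
}
```

Here the `granularitySpec` controls the coarser rollup of the rewritten segments, and the `filter` on the firehose drops matching rows during re-ingestion rather than carrying them forward.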


## More information

Getting data into Druid can definitely be difficult for first-time users. Please don't hesitate to ask questions in our IRC channel or on our [Google Groups page](https://groups.google.com/forum/#!forum/druid-development).
