MINOR: Improve topic management instructions for Kafka Streams examples
Author: Michael G. Noll <[email protected]>

Reviewers: Matthias J. Sax <[email protected]>, Eno Thereska <[email protected]>, Ismael Juma <[email protected]>

Closes apache#2812 from miguno/trunk-streams-examples-docs
Michael G. Noll authored and ijuma committed Apr 6, 2017
1 parent afeadbe commit 6ba98f6
Showing 5 changed files with 16 additions and 11 deletions.
@@ -41,12 +41,13 @@
  * using specific data types (here: JSON POJO; but can also be Avro specific bindings, etc.) for serdes
  * in Kafka Streams.
  *
- * In this example, we join a stream of pageviews (aka clickstreams) that reads from a topic named "streams-pageview-input"
+ * In this example, we join a stream of pageviews (aka clickstreams) that reads from a topic named "streams-pageview-input"
  * with a user profile table that reads from a topic named "streams-userprofile-input", where the data format
  * is JSON string representing a record in the stream or table, to compute the number of pageviews per user region.
  *
- * Before running this example you must create the source topic (e.g. via bin/kafka-topics.sh --create ...)
- * and write some data to it (e.g. via bin-kafka-console-producer.sh). Otherwise you won't see any data arriving in the output topic.
+ * Before running this example you must create the input topics and the output topic (e.g. via
+ * bin/kafka-topics.sh --create ...), and write some data to the input topics (e.g. via
+ * bin/kafka-console-producer.sh). Otherwise you won't see any data arriving in the output topic.
  */
 public class PageViewTypedDemo {

@@ -49,8 +49,9 @@
  * with a user profile table that reads from a topic named "streams-userprofile-input", where the data format
  * is JSON string representing a record in the stream or table, to compute the number of pageviews per user region.
  *
- * Before running this example you must create the source topic (e.g. via bin/kafka-topics.sh --create ...)
- * and write some data to it (e.g. via bin-kafka-console-producer.sh). Otherwise you won't see any data arriving in the output topic.
+ * Before running this example you must create the input topics and the output topic (e.g. via
+ * bin/kafka-topics.sh --create ...), and write some data to the input topics (e.g. via
+ * bin/kafka-console-producer.sh). Otherwise you won't see any data arriving in the output topic.
  */
 public class PageViewUntypedDemo {

@@ -31,8 +31,9 @@
  * In this example, we implement a simple "pipe" program that reads from a source topic "streams-file-input"
  * and writes the data as-is (i.e. unmodified) into a sink topic "streams-pipe-output".
  *
- * Before running this example you must create the source topic (e.g. via bin/kafka-topics.sh --create ...)
- * and write some data to it (e.g. via bin-kafka-console-producer.sh). Otherwise you won't see any data arriving in the output topic.
+ * Before running this example you must create the input topic and the output topic (e.g. via
+ * bin/kafka-topics.sh --create ...), and write some data to the input topic (e.g. via
+ * bin/kafka-console-producer.sh). Otherwise you won't see any data arriving in the output topic.
  */
 public class PipeDemo {

@@ -39,8 +39,9 @@
  * represent lines of text; and the histogram output is written to topic "streams-wordcount-output" where each record
  * is an updated count of a single word.
  *
- * Before running this example you must create the source topic (e.g. via bin/kafka-topics.sh --create ...)
- * and write some data to it (e.g. via bin-kafka-console-producer.sh). Otherwise you won't see any data arriving in the output topic.
+ * Before running this example you must create the input topic and the output topic (e.g. via
+ * bin/kafka-topics.sh --create ...), and write some data to the input topic (e.g. via
+ * bin/kafka-console-producer.sh). Otherwise you won't see any data arriving in the output topic.
  */
 public class WordCountDemo {

@@ -40,8 +40,9 @@
  * represent lines of text; and the histogram output is written to topic "streams-wordcount-processor-output" where each record
  * is an updated count of a single word.
  *
- * Before running this example you must create the source topic (e.g. via bin/kafka-topics.sh --create ...)
- * and write some data to it (e.g. via bin/kafka-console-producer.sh). Otherwise you won't see any data arriving in the output topic.
+ * Before running this example you must create the input topic and the output topic (e.g. via
+ * bin/kafka-topics.sh --create ...), and write some data to the input topic (e.g. via
+ * bin/kafka-console-producer.sh). Otherwise you won't see any data arriving in the output topic.
  */
 public class WordCountProcessorDemo {

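The prerequisite steps these Javadoc changes describe can be sketched as shell commands. This is a hedged sketch for the PipeDemo topics only; the `localhost:2181`/`localhost:9092` addresses, the single-partition settings, the `--zookeeper` flag (used by the Kafka CLI of this era), and the `file-input.txt` data file are all assumptions about a local single-broker setup, not part of this commit:

```shell
# Create the input topic and the output topic for PipeDemo
# (assumes a local ZooKeeper at localhost:2181; settings are illustrative).
bin/kafka-topics.sh --create --zookeeper localhost:2181 \
  --replication-factor 1 --partitions 1 --topic streams-file-input
bin/kafka-topics.sh --create --zookeeper localhost:2181 \
  --replication-factor 1 --partitions 1 --topic streams-pipe-output

# Write some data to the input topic
# (assumes a broker at localhost:9092 and a hypothetical file-input.txt).
bin/kafka-console-producer.sh --broker-list localhost:9092 \
  --topic streams-file-input < file-input.txt
```

Without the first step the demo's output topic never receives records, which is exactly the pitfall the updated comments warn about.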