Error Reading JSON message from KAFKA topic

Hello everyone,

I am working on Pega 7.4, and for my use case:

I am streaming a JSON message to a Kafka topic configured on external infrastructure (ECP). The handshake is successful with the steps below:

1. I am able to stream messages to the Kafka topic: I store the JSON message in one string property, say .pyNote, and stream it to Kafka successfully using the DataSet-Execute method.
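For a JSON-configured data set to read the record back later, the value written to the topic has to be a serialized JSON document rather than raw text. A minimal sketch of that serialization step (the property and payload names here are illustrative, not taken from the actual ruleset):

```python
import json

def build_event_payload(note: str) -> bytes:
    # Wrap the text in a JSON object before producing it to Kafka;
    # a bare string like "test message" is not a well-formed JSON document.
    return json.dumps({"pyNote": note}).encode("utf-8")

payload = build_event_payload("application created")
print(payload)  # a JSON object, safe for a JSON-configured consumer to parse
```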

I am trying to retrieve the message and have tried the two options below:

A. The DataSet-Execute method on step page ReadEventsFromKafka of the work class, with operation "Browse", storing results in "OperationResult" of Code-Pega-List. However, I am getting an error while reading from the topic: "Exception during data set execution".

B. The DataFlow-Execute method on step page ReadEventsFromKafka of the work class, with operation "Start", using the steps below:

--> Call the data flow ReadKafkaMessages.

--> Define the data set name in the first shape, then define the custom activity (ReadMsg) in the next shape, using the same step page ReadEventsFromKafka. However, I am not able to read the message, and the Tracer shows the error below.

Note: SharedKafka2_MatrixONBEvents is configured as both consumer and producer.

Can you please help here?

com.pega.dsm.dnode.api.dataflow.StageException: Exception in stage: SharedKafka2_MatrixONBEvents
	at com.pega.dsm.dnode.api.dataflow.StageException.create(
	at com.pega.dsm.dnode.api.dataflow.DataFlowStage$StageOutputSubscriber.onError(
	at com.pega.dsm.dnode.api.dataflow.DataFlowStage$StageInputSubscriber.onError(
	at$SafeDataSubscriber.onError(
	at$SafeDataSubscriber.onError(
	at com.pega.dsm.dnode.impl.dataset.kafka.KafkaBrowseOperation.processRecords(
	at com.pega.dsm.dnode.impl.dataset.kafka.KafkaBrowseOperation.access$000(
	at com.pega.dsm.dnode.impl.dataset.kafka.KafkaBrowseOperation$1.emit(
	at$SafeDataSubscriber.subscribe(
	at$3.emit(
	at$SafeDataSubscriber.subscribe(
	at com.pega.dsm.dnode.api.dataflow.DataFlow$
	at com.pega.dsm.dnode.api.dataflow.DataFlow$
	at com.pega.dsm.dnode.util.PrpcRunnable.execute(
	at com.pega.dsm.dnode.impl.dataflow.DataFlowThreadContext$
	at java.util.concurrent.Executors$
	at$TrustedFutureInterruptibleTask.runInterruptibly(
	at java.util.concurrent.ThreadPoolExecutor.runWorker(
	at java.util.concurrent.ThreadPoolExecutor$
	at com.pega.dsm.dnode.util.PrpcRunnable$
	at com.pega.dsm.dnode.util.PrpcRunnable$
	at com.pega.dsm.dnode.util.PrpcRunnable.execute(
	at com.pega.dsm.dnode.impl.prpc.PrpcThreadFactory$
Caused by: com.pega.dsm.dnode.api.ExceptionWithInputRecord: java.lang.IllegalStateException: Cannot parse json
	... 22 more
Caused by: java.lang.IllegalStateException: Cannot parse json
	at com.pega.dsm.dnode.util.ClipboardPageJsonConverter.adoptJson(
	at com.pega.dsm.dnode.util.ClipboardPageJsonConverter.adoptJson(
	at com.pega.dsm.dnode.impl.dataset.kafka.KafkaBrowseOperation.convertRecordToClipboardPage(
	at com.pega.dsm.dnode.impl.dataset.kafka.KafkaBrowseOperation.processRecords(
	... 21 more
Caused by: com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'test': was expecting 'null', 'true', 'false' or NaN
 at [Source: (com.pega.dsm.dnode.util.ClipboardPageJsonConverter$ByteBufferInputStream); line: 1, column: 6]
	at com.fasterxml.jackson.core.JsonParser._constructError(
	at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(
	at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._reportInvalidToken(
	at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._reportInvalidToken(
	at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._matchToken2(
	at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._matchTrue(
	at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._nextTokenNotInObject(
	at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.nextToken(
	at com.pega.dsm.dnode.util.ClipboardPageJsonConverter.adoptJson(
	... 24 more
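For reference, the last "Caused by" pinpoints the root cause: the record read from the topic begins with the bare token `test` rather than a well-formed JSON document, so Jackson's parser rejects it at line 1, column 6. Any JSON parser behaves the same way; a minimal sketch of the failure in Python (the payloads here are illustrative, not the actual messages on the topic):

```python
import json

# A bare, unquoted token such as `test ...` is not valid JSON, which is
# the same condition Jackson reports as "Unrecognized token 'test'".
try:
    json.loads("test message streamed as plain text")
except json.JSONDecodeError as e:
    print(f"Cannot parse json: {e.msg}")

# The same text wrapped in a JSON object parses without error.
page = json.loads('{"pyNote": "test message streamed as plain text"}')
print(page["pyNote"])
```

In other words, if the producing side writes the raw string value instead of the serialized JSON object, a JSON-configured data set will fail exactly as in the trace.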

***Edited by Moderator: Pallavi to update platform capability tags***



March 8, 2019 - 9:31pm

Check this link; you should use a real-time data flow rule to consume the Kafka data set:

September 11, 2019 - 5:01am

I have used the above, but it did not resolve the problem. Can you please suggest a solution?

October 17, 2019 - 1:37pm
Response to Nikhil_Garge

Hi Nikhil,

I have a data set configured with JSON, and when I run it by default, I get results in "OperationResult" on the Clipboard.

But when I configure a data flow with the data set as producer/input and, as output, an activity configured to store to the database, the values are not stored in the database. Please let me know if you are aware of a solution.