KafkaImporter Error with non-string values

It seems that if I use any type other than TEXT, I get an error.
Is the KafkaImporter just for use with string data in TEXT fields?

E 8333 0 RowToColumnLoader.cpp:326 Input exception thrown: Invalid conversion from string to INTEGER. Row discarded

7747 0 RowToColumnLoader.cpp:326 Input exception thrown: Invalid conversion from string to FLOAT. Row discarded

In the example at: Kafka - OmniSci Docs

Use omnisql to create a table to store the stream.

create table stream1(name text, id int);

That example uses an INT column type.

Thank you,

Hi @darthseal ,

welcome to our community forum.

I think our docs are a little outdated, and we are working to fix them. Anyway, I tried creating a Kafka topic and loading data into a table with a TEXT and an INTEGER column without any issue.

I started all the processes needed by Kafka, then created a producer:

candido@zion-legion:~/kafka/kafka_2.13-2.7.0$ bin/kafka-console-producer.sh --topic quickstart-events --bootstrap-server localhost:9092
>ciao,1
>ciao ciao,2
>ciao ciao ciao,3

Then I used the KafkaImporter utility supplied with our database (I'm using the 5.4.1 release):

candido@zion-legion:/opt/mapd/omnisci-os-5.4.1-20200928-3d17eec6c1-Linux-x86_64$ sudo bin/KafkaImporter --brokers localhost:9092 --topic quickstart-events --host localhost --port 6274 --database omnisci -u admin -p HyperInteractive --table test_kafka --group-id test --batch 3
Field Delimiter: ,
Line Delimiter: \n
Null String: \N
Insert Batch Size: 3
3 Rows Inserted, 0 rows skipped.

Then I checked the records in the table test_kafka:

candido@zion-legion:/opt/mapd/omnisci-os-5.4.1-20200928-3d17eec6c1-Linux-x86_64$ bin/omnisql -p HyperInteractive
User admin connected to database omnisci
omnisql> select * from test_kafka;
f1|f2
ciao|1
ciao ciao|2
ciao ciao ciao|3
omnisql> \d test_kafka
CREATE TABLE test_kafka (
f1 TEXT ENCODING DICT(32),
f2 INTEGER);
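For context, KafkaImporter treats each Kafka message as one delimited row and tries to cast every field to the matching column type; rows that fail a cast are discarded, which is exactly the "Invalid conversion from string to INTEGER. Row discarded" message you saw. This is just a rough Python illustration of that behavior, not the importer's actual code:

```python
# Illustration only: roughly how a delimited message is split and cast
# before insert. Rows whose fields cannot be cast are discarded.

def parse_row(message, column_types, delimiter=","):
    """Split one Kafka message and cast each field to its column type."""
    fields = message.split(delimiter)
    if len(fields) != len(column_types):
        raise ValueError("column count mismatch")
    # int("abc") raises ValueError, mirroring the importer's
    # "Invalid conversion from string to INTEGER" error.
    return [cast(field) for cast, field in zip(column_types, fields)]

schema = [str, int]  # f1 TEXT, f2 INTEGER

print(parse_row("ciao,1", schema))  # row matches the schema, inserted
try:
    parse_row("ciao,abc", schema)   # "abc" is not an INTEGER
except ValueError:
    print("row discarded")
```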

Could you share the steps you followed and the data used with Kafka?

Regards
Candido

Hey Candido,

Thank you for responding; it was an input problem.

I was sending JSON `{"column1": "ciao", "column2": 1}` into Kafka, then tried a key:value string like `column1: ciao, column2: 1`.

Your example:

ciao,1
ciao ciao,2
ciao ciao ciao,3

made me realize I should be sending just the values into the importer.
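For anyone else with JSON source data, one option is to flatten each record into a delimited line before producing it to the topic. A minimal sketch, assuming the keys (here `column1`, `column2`, taken from the example above) are listed in the same order as the table's columns:

```python
import json

# Sketch: flatten a JSON record into the comma-delimited, values-only
# line that KafkaImporter expects. Column names here are just the ones
# from this thread's example, not anything KafkaImporter requires.

def json_to_csv_line(record_json, columns, delimiter=","):
    record = json.loads(record_json)
    return delimiter.join(str(record[c]) for c in columns)

line = json_to_csv_line('{"column1": "ciao", "column2": 1}',
                        ["column1", "column2"])
print(line)  # ciao,1
```

The resulting lines can then be fed to kafka-console-producer.sh (or sent with any Kafka client) exactly like the hand-typed `ciao,1` messages above.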

Hi @darthseal ,

I am happy the problem is solved :+1: