KafkaImporter batch size limitation


When I pass --batch 10000, KafkaImporter waits until 10000 records have accumulated.
So if my last fetched batch contains only 500 records, it never gets inserted into the omnisci table, because the 10000-record batch condition is never fulfilled.
Is there any alternative to this?


I have a branch where I added a time parameter as a commit interval.

So the commit would be performed when the number of records reaches the batch size, or when enough time has passed since the latest commit.

This could be useful when there are parts of the day where you get a lot of records and others where you get very few, and you don't want too many epochs on your table.
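The commit policy described above (flush on batch size *or* elapsed time) can be sketched roughly like this. This is a hypothetical illustration, not the actual branch code; `CommitPolicy`, its constructor parameters, and the method names are all made up for the example:

```java
import java.time.Duration;
import java.time.Instant;

// Hypothetical sketch: commit when the batch is full OR when a maximum
// interval has elapsed since the last commit, so a small trailing batch
// still gets inserted eventually.
public class CommitPolicy {
    private final int batchSize;        // e.g. the existing --batch value
    private final Duration maxInterval; // the proposed time parameter
    private Instant lastCommit;

    public CommitPolicy(int batchSize, Duration maxInterval, Instant now) {
        this.batchSize = batchSize;
        this.maxInterval = maxInterval;
        this.lastCommit = now;
    }

    // True when the accumulated records should be flushed to the table.
    public boolean shouldCommit(int accumulated, Instant now) {
        return accumulated >= batchSize
            || Duration.between(lastCommit, now).compareTo(maxInterval) >= 0;
    }

    // Record the commit time so the interval restarts.
    public void committed(Instant now) {
        lastCommit = now;
    }
}
```

With `--batch 10000` and, say, a 60-second interval, a trailing batch of 500 records would be committed once the interval expires rather than waiting forever.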

Alternatively, you can change the batch size of the importer.



I didn’t find any branch option in the KafkaImporter command line of omnisci.
Can you please elaborate?


It’s a personal branch, so nothing official until it’s merged with master, but I can give you a link to an executable.

Yes, please share the link.
I also tried writing a KafkaConsumer in Java, but the insertion speed is much lower than the KafkaImporter we use on the command line.
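One common reason a hand-rolled consumer inserts slowly is doing one insert per Kafka message instead of buffering rows and inserting them in bulk, which is roughly what KafkaImporter's batching does. A minimal sketch of that buffering idea, assuming a `flusher` callback that performs the actual bulk insert (for example a JDBC `executeBatch()`); the `BatchBuffer` class here is invented for illustration:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical helper: accumulate records and hand them to a bulk-insert
// callback in batches, instead of inserting one row per Kafka message.
public class BatchBuffer<T> {
    private final int batchSize;
    private final Consumer<List<T>> flusher; // performs the bulk insert
    private final List<T> buffer = new ArrayList<>();

    public BatchBuffer(int batchSize, Consumer<List<T>> flusher) {
        this.batchSize = batchSize;
        this.flusher = flusher;
    }

    public void add(T record) {
        buffer.add(record);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    // Call on shutdown so a final partial batch is not lost.
    public void flush() {
        if (!buffer.isEmpty()) {
            flusher.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }
}
```

Calling `flush()` on shutdown also addresses the original problem in this thread: the final partial batch still gets inserted.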

@Paramjeet_Kaur sent a private message with a link