StreamInsert from local directory


Hi, I have been trying to use ./StreamInsert from the SampleCode folder in the mapd directory. I am running the MapD server on a p2.xlarge EC2 instance with the MapD Community Edition AWS AMI. I am running the following command:

sudo cp ~/out.csv -| /raidStorage/prod/mapd/SampleCode/StreamInsert t mapd -u mapd -p <instance_id> --port 9091 --delim , --null NULL --batch 1000

I am running it on the server itself, not from Immerse. The out.csv file is around 6 GB and has 5 columns with a large number of rows. Once I run the command it pauses for some time, but in the Immerse web client I see no data inserted into the table. I have successfully ingested the CSV file into a MapD table using mapdql, but I could not get StreamInsert to work. Can you tell me where the problem is? Here are some sample rows of my CSV file:

Also here is the structure of the table I created in mapd
Column Type
local INT



Your stream pipeline looks incorrect.

Also, local is a reserved word, so you should not be able to use it as a column name.


Try changing

sudo cp ~/out.csv -| /raidStorage/prod/mapd/SampleCode/StreamInsert t mapd -u mapd -p --port 9091 --delim , --null NULL --batch 1000

to

sudo cat ~/out.csv | /raidStorage/prod/mapd/SampleCode/StreamInsert t mapd -u mapd -p <password> --port 9091 --delim , --null NULL --batch 1000

and let us know how it goes.
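You can sanity-check the pipe itself without touching the server. This is just a sketch: wc -l stands in for StreamInsert, and the sample data is made up. It illustrates why cp fails here: `cp file -` writes to a literal file named "-" rather than feeding the pipe, while `cat file |` actually streams the rows to the consumer.

```shell
# Create a small stand-in for out.csv (hypothetical data).
printf 'a,1\nb,2\nc,3\n' > /tmp/sample.csv

# Stream it through a pipe; wc -l plays the role of StreamInsert.
cat /tmp/sample.csv | wc -l   # prints 3 -- the consumer received all rows
```

If the consumer sees your row count here, the same `cat … |` form should deliver the data to StreamInsert.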

You may also want to raise your batch size from 1000 to something a bit bigger, say 100000.



Thanks, it’s working now. Thank you very much.