I want to execute a .sql file on MapD, but the pipe is broken


I am currently working on loading data generated with TPC-H into MapD.

I ran the command line below:
./mapdql -u mapd -p HyperInteractive -db TPC_100GB < value_inject.sql
value_inject.sql includes item.sql (72 GB), order.sql (10 GB), etc., and uses the COPY FROM command.

Loading 10 GB of data succeeded, but loading 100 GB fails with an error (the error is Broken pipe).
How can I load more than 10 GB of data into MapD using queries?

I have another question. Can I set CPU mode, GPU mode, or CPU+GPU mode separately using the MapD JDBC driver?


Hi @Hyuck -

There is no technical reason why you shouldn’t be able to upload that volume of data.

If you are having issues with unstable connections, I suggest you 1) break up the data into multiple files, 2) load the files directly on the database server and 3) make sure you are using a tool like screen/tmux so the process runs on the server instead of being controlled from your laptop.
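As a sketch of steps 1 and 2, you can split a large file into pieces and load each piece on the server itself. The file and table names below (lineitem.csv, lineitem) are examples for illustration, not taken from this thread; adjust paths and credentials to your setup.

```shell
# 1) Break the data into ~1 GB pieces without splitting a line in half.
#    GNU split's -C flag caps each piece at SIZE bytes of whole lines.
split -C 1G --additional-suffix=.csv lineitem.csv lineitem_part_

# 2) From a tmux/screen session on the database server, load each piece.
#    If one piece fails, only that piece needs to be re-loaded.
for part in lineitem_part_*.csv; do
  echo "COPY lineitem FROM '$PWD/$part';" \
    | ./mapdql -u mapd -p HyperInteractive -db TPC_100GB
done
```

Splitting on line boundaries matters here, since COPY FROM expects each file to contain complete records.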

If you mean using the JDBC driver through another tool, I suspect the answer is no, as that wouldn’t be part of the JDBC spec.

But if you are writing the Java code yourself, set_execution_mode is part of our Thrift specification (i.e. the communication layer between a client and the OmniSci database). The code would be client.set_execution_mode(session, TExecuteMode.CPU); or similar (I’m not a Java person).


Ah, I was able to load the 100 GB of data. Thank you!


Hello, Mr. Hyuck,
When I tried to load more than 100 GB of data into MapD, I got the same problem as you. It told me Thrift error: write() send(): Broken pipe. Could you tell me how you solved the problem? Thank you very much.


Hello, hchen,
I changed the MAPD_STORAGE directory path. It was on centos-root (50 GB), so I removed MapD and installed it again with MAPD_STORAGE on centos-home (889 GB). That solved the problem.
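For anyone hitting the same symptom, a quick way to check whether the partition backing the storage directory is the bottleneck (the path below is an assumed example; substitute your actual MAPD_STORAGE location):

```shell
# Which partition backs the storage directory, and how much free space is left?
df -h /var/lib/mapd        # assumed storage path; adjust to your install

# How much space the loaded data currently occupies:
du -sh /var/lib/mapd/data  # assumed data subdirectory; adjust to your install
```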


Thank you very much. I solved the problem by splitting the 18 GB file into several files; I think the original file was too large to insert into MapD in one piece.


Hi @hchen

As I mentioned above, there’s no practical limit to the file size you can upload. In @Hyuck’s case, he ran out of disk space (congrats on figuring that out!). In other cases, it might be a network issue or some other process issue.

But that said, breaking up the files into smaller pieces is usually a good idea, so that if something does happen during the load you can just re-load the smaller piece vs. starting completely over.



Thank you, I found the cause of the problem and solved it successfully.