Loading large dataset of zipped csv files

I’m having an issue with this command. I can submit smaller batches, but I have over 2000 files I need uploaded. Is there a better method of doing so besides zipping them up into small packages?

COPY flightdata FROM '/home/just/DataShare/import/test/data500.zip';

I receive this error when the batch gets too large:
Retrying connection
Thrift: Tue Aug 11 16:39:19 2020 TSocket::write_partial() send() <Host: localhost Port: 6274>: Broken pipe
Thrift error: write() send(): Broken pipe
Thrift connection error: write() send(): Broken pipe

Hi @JustXtreme,

Are you executing 2000 COPY FROM commands or are you using globbing?

If the former, switch to globbing instead, i.e.:

COPY flightdata FROM '/home/just/DataShare/import/test/data*';

I tried that with a group of zip files created with WinRAR, but I don’t think it likes how WinRAR handles the zips. I’m going to try creating a few zip batches manually, like the following
(each zip containing hundreds or thousands of CSV files? I’m not sure what the limits are):
data1.zip
data2.zip
data3.zip
Then I should be able to run

COPY flightdata FROM '/home/just/DataShare/import/test/data*';

Correct?
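The manual batching step above could also be scripted. A minimal sketch, assuming the CSVs all sit in one directory and that a batch size of 500 is acceptable (the path, batch size, and `batch_zip` helper name are all illustrative, not part of the thread):

```python
"""Pack many CSV files into fixed-size zip batches (data1.zip, data2.zip, ...)
so that a single glob such as data*.zip can load them all in one COPY."""
import zipfile
from pathlib import Path


def batch_zip(src_dir, batch_size=500, prefix="data"):
    """Group every *.csv under src_dir into numbered zip archives."""
    csvs = sorted(Path(src_dir).glob("*.csv"))
    for i in range(0, len(csvs), batch_size):
        out = Path(src_dir) / f"{prefix}{i // batch_size + 1}.zip"
        with zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as zf:
            for f in csvs[i:i + batch_size]:
                zf.write(f, arcname=f.name)  # store flat, no directory prefix


if __name__ == "__main__":
    # Hypothetical path taken from the thread; adjust to your import directory.
    batch_zip("/home/just/DataShare/import/test")
```

Storing the files with `arcname=f.name` keeps the archive entries flat, which avoids any dependence on how a GUI archiver (such as WinRAR) nests directories inside the zip.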

Correct, @JustXtreme,

but check that you have enough free space on disk to accommodate the data you are loading, because if you run out of space you will get