
Mapd java jdbc PreparedStatement bug

  • 1.  Mapd java jdbc PreparedStatement bug

    Posted 10-26-2017 08:45

    Hi,
    I'm sorry, my English is not good; I hope I don't express the wrong meaning.
    I used JDBC to batch-insert data with the data fields specified, but the inserted data ends up out of order, as shown in the two pictures above: a value meant for field 1 is inserted into field n instead.
    I hope to get your help. Does the current version of the JDBC driver support batch insert? If it does, could you tell me the correct way to batch-insert data, or point me to a website where the MapD JDBC source can be downloaded?
    Thank you!

  • 2.  RE: Mapd java jdbc PreparedStatement bug

    Posted 10-26-2017 13:44


    Thanks for your note, and thanks for trying out MapD.

    You have uncovered a bug in the addBatch bulk-insert process: it does not respect the order of the fields in the INSERT statement.

    As a workaround for now, you will have to order the columns in the INSERT INTO statement to match the exact order in which they are defined in the schema.

    So in your case the INSERT needs to look like

    INSERT INTO fr_fba123 (mmax, mmin, forcast_field, chmin, chmax, cach, myct) values (?,?,?,?,?,?,?)

    or more simply

    INSERT INTO fr_fba123 values (?,?,?,?,?,?,?)

    Issue for tracking https://github.com/mapd/mapd-core/issues/124
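    The workaround can be made mechanical: given the table's schema column order, build the positional INSERT string yourself. A minimal sketch (table and column names taken from this thread; the helper is illustrative and not part of the MapD driver):

    ```java
    import java.util.*;

    public class InsertBuilder {
        // Build an INSERT whose column list follows the schema order, so a
        // driver that ignores the column list in addBatch still gets each
        // value into the right column.
        static String buildInsert(String table, List<String> schemaColumns) {
            String placeholders = String.join(",",
                    Collections.nCopies(schemaColumns.size(), "?"));
            return "INSERT INTO " + table + " ("
                    + String.join(", ", schemaColumns) + ") VALUES ("
                    + placeholders + ")";
        }

        public static void main(String[] args) {
            List<String> cols = Arrays.asList("mmax", "mmin", "forcast_field",
                    "chmin", "chmax", "cach", "myct");
            System.out.println(buildInsert("fr_fba123", cols));
        }
    }
    ```

    The resulting string can be handed to `Connection.prepareStatement` and used with `addBatch`/`executeBatch` as usual.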


  • 3.  RE: Mapd java jdbc PreparedStatement bug

    Posted 10-27-2017 02:36

    Thank you very much for your reply.
    I have found a temporary solution: I first query the table's field order, and then arrange the INSERT parameters according to that order. I'm not certain this solution is correct, but because of project deadlines I will use this approach for now. :joy: :smiley:
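    The approach described above — fetch the table's field order first, then arrange the parameter values to match — can be sketched with a small helper (column names come from the earlier reply; the row values are made up for illustration):

    ```java
    import java.util.*;

    public class ColumnReorder {
        // Reorder named values into the table's schema column order, so the
        // batch insert can safely use the positional
        // "INSERT INTO t VALUES (?,...)" form.
        static List<Object> toSchemaOrder(List<String> schemaColumns,
                                          Map<String, Object> namedValues) {
            List<Object> ordered = new ArrayList<>();
            for (String col : schemaColumns) {
                ordered.add(namedValues.get(col));
            }
            return ordered;
        }

        public static void main(String[] args) {
            List<String> schema = Arrays.asList("mmax", "mmin", "forcast_field",
                    "chmin", "chmax", "cach", "myct");
            // Hypothetical sample row, keyed by column name.
            Map<String, Object> row = new HashMap<>();
            row.put("myct", 125); row.put("mmin", 256); row.put("mmax", 6000);
            row.put("cach", 256); row.put("chmin", 16); row.put("chmax", 128);
            row.put("forcast_field", 198);
            System.out.println(toSchemaOrder(schema, row));
        }
    }
    ```

    Each reordered list can then be bound positionally with `PreparedStatement.setObject` before calling `addBatch`.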

  • 4.  RE: Mapd java jdbc PreparedStatement bug

    Posted 02-02-2019 16:20

    Hi, unfortunately I encountered exactly the same issue, but using ScalikeJDBC.
    I first created a table like:

      CREATE TABLE test_table (
        first      TEXT          ENCODING DICT(16),
        second     TEXT          ENCODING DICT(16),
        third      TEXT          ENCODING DICT(16));

    Then I ran:

     def testInsert(): Unit = {
       val insert = Seq(
         Seq('first -> "aaa", 'second -> "bbb", 'third -> "ccc"),
         Seq('first -> "aaa", 'second -> "bbb", 'third -> "ccc"),
         Seq('first -> "aaa", 'second -> "bbb", 'third -> "ccc"))
       analyticsDb().localTx { implicit session ⇒
         SQL("""INSERT INTO test_table (third, second, first)
                VALUES ({third}, {second}, {first})""")
           .batchByName(insert: _*).apply()
       }
     }

    and the output of:

    select * from test_table;

    shows the values inserted into the wrong columns.

    Is there an ETA for this to be fixed?