How to control OpenGL memory?



I’m using the US flights demo database; I loaded two years’ worth of data, about 14,000,000 records.
Then I tried to add a POINTMAP chart to the dashboard, and an error message was shown.
With one year of data (7,000,000 records) there was no problem.

The error message is:

E0803 13:48:21.378784 8301 MapDHandler.cpp:1848] Exception: Not enough OpenGL memory to render the query results

I’d like to know how much memory these data require.
And how can I control OpenGL memory?

CentOS Linux release 7.3.1611 (Core)




The parameter is render-mem-bytes; it defaults to 500MB.

You will need to increase this if you see that issue. I would suggest increasing it in 100MB increments, since you do not have much memory available on that card.
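As a concrete sketch (the exact flag spelling and value units may differ between MapD versions, so treat this as an assumption to verify against your release), the buffer can be raised by passing the option to startmapd, with the value given in bytes:

```shell
# Sketch only: raise the render buffer from the 500MB default to ~600MB.
# render-mem-bytes takes a value in bytes; adjust the path to your install.
./startmapd --render-mem-bytes 600000000
```

If the error persists, bump the value by another 100MB and retry, keeping an eye on how much GPU memory the card actually has free.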



Hi !

I tried setting the startmapd parameter to 1000MB, and three years of data displayed with no problem.
Is this parameter hidden?
I could not find it in the MapD Core Guide.

Thanks a lot.



Glad it helped you out.

It is an advanced option. We are adding more documentation all the time for all of the options.



Thank you for your comment.
I’ll check the documentation.



I have two questions.

  1. That database has 21,591,502 records, but there are only 315 origin airports.
    I think OpenGL will map 315 geometries, not 21,591,502. Is that right?
    I don’t understand why OpenGL needs so much memory.

  2. When can I see the new documentation that covers the advanced options?

Thank you.



  1. The memory buffer should be considered a scratch area for all of the render processing, not just the buffer for the final render. In the MapD world a point can also include additional attributes which need to be ‘blended’ at the render level, so it’s not quite so simple to say there are only 315 points.

  2. Documentation is always being worked on. The best approach is to keep an eye on the online docs.



Hi, thanks for your reply.

  1. I understand now that the rendering process is not so simple.
  2. I’ll check online docs.

Thank you very much.


Just to add a bit more detail regarding memory requirements: even though there are only 315 distinct airport locations, Immerse currently only allows rendering by non-grouped measures, which means we need enough memory to render all 14M records. We have on our roadmap functionality to allow “grouped” rendering, i.e. rendering only one point per airport, which would take much less memory in this use case. We also have some optimizations in mind to make rendering take less memory and thus require smaller buffer sizes.
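A rough back-of-envelope illustrates why the whole table matters. The 32-bytes-per-record figure below is purely an assumption for illustration (the real per-record scratch cost depends on which attributes are being blended), not a number from MapD internals:

```shell
# Hypothetical ~32 bytes of render scratch per record:
echo "$((14000000 * 32)) bytes for two years"     # ~448MB, close to the 500MB default
echo "$((21591502 * 32)) bytes for three years"   # ~691MB, over the default
```

Under those assumptions, two years of data sits just under the 500MB default while three years exceeds it, and a 1000MB buffer covers both, which is consistent with what was observed in this thread.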



We have on our roadmap functionality to allow “grouped” rendering.

I’m really glad to hear that.
Do you mean that rendering will be grouped automatically in the near future?

Thanks a lot.