We are using MapD Charting to render geographic coordinates on a map.
We are encountering an issue where, once we load enough rows into our table, we get this error:
Exception: Not enough OpenGL memory to render the query results
We have tried setting enable-watchdog to false and allow-cpu-retry to true. However, we see an odd behavior: once a table passes a certain row-count threshold, we can only load exactly 10 million points, and at 10,000,001 points we receive the error above. We control the number of points by calling .cap() on rasterLayer("points").
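For reference, here is a sketch of the relevant server settings we are using (assuming the standard mapd.conf key = value format; render-mem-bytes is a flag we have seen mentioned for sizing the rendering buffer, but we have not confirmed it affects this limit):

```
# mapd.conf (sketch)
enable-watchdog = false
allow-cpu-retry = true
# render-mem-bytes = 1000000000  # rendering buffer size in bytes; unconfirmed whether this helps
```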
However, if the table holds a relatively small number of rows, we can load more than 10 million points. Specifically, we noticed that the map only renders 10 million points once the table size exceeds approximately 75 million rows. Why does it behave this way?
For reference, we are running MapD on a p2.8xlarge AWS EC2 instance.
Is our hardware not equipped to deal with such large datasets? Are there any configurations we can add to render more points on the map?
Our goal is to load approximately 6 billion rows of data, each with its own latitude/longitude coordinates.
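To sanity-check the hardware question, here is our rough back-of-envelope estimate. The assumptions are ours, not from MapD docs: lat/lon stored as 4-byte FLOAT columns, and a p2.8xlarge exposing 8 GPUs x 12 GiB = 96 GiB of aggregate GPU memory (per the AWS P2 instance specs):

```javascript
// Back-of-envelope GPU memory estimate for the target dataset.
const rows = 6e9;                  // target table size (6 billion rows)
const bytesPerCoordPair = 2 * 4;   // lat + lon as 4-byte FLOAT columns
const coordBytes = rows * bytesPerCoordPair;  // 48e9 bytes of raw coordinates
const gpuMemBytes = 8 * 12 * 1024 ** 3;       // ~103e9 bytes across 8 GPUs
console.log(
  (coordBytes / 1e9).toFixed(0) + " GB of coordinates vs " +
  (gpuMemBytes / 1e9).toFixed(0) + " GB of GPU memory"
);
```

By this estimate the coordinate columns alone would consume roughly half the aggregate GPU memory, before counting any rendering buffers or other columns, which makes us doubt a single p2.8xlarge is enough.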