Presto-odbc with large tables (>300 K rows) #49
Hi Nataliya, Nezih.
Hi. In driver.d, how could I make the SQLExecuteImpl method return the result set to the application in pages (or with similar functionality), rather than only at the end of batch processing?
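The paging pattern the question asks about can be illustrated with a small, hypothetical sketch (Python here, not the driver's actual D code): instead of accumulating every batch before returning from execute, the execute path exposes an iterator that yields rows page by page, and a fetch call drains it incrementally. The names `fetch_pages` and `PagedCursor` are illustrative inventions, not presto-odbc APIs.

```python
from typing import Iterator, List

def fetch_pages(total_rows: int, page_size: int) -> Iterator[List[int]]:
    """Stand-in for a backend that serves results in pages
    (Presto's REST protocol hands back results in chunks via nextUri
    links); yields one page at a time instead of materializing all rows."""
    for start in range(0, total_rows, page_size):
        yield list(range(start, min(start + page_size, total_rows)))

class PagedCursor:
    """Hypothetical cursor: rows become available to the application
    as soon as the first page arrives, not at the end of the batch."""
    def __init__(self, pages: Iterator[List[int]]):
        self._pages = pages
        self._buffer: List[int] = []

    def fetchmany(self, n: int) -> List[int]:
        # Pull pages only until we can satisfy this fetch.
        while len(self._buffer) < n:
            try:
                self._buffer.extend(next(self._pages))
            except StopIteration:
                break
        out, self._buffer = self._buffer[:n], self._buffer[n:]
        return out

cursor = PagedCursor(fetch_pages(total_rows=10, page_size=4))
print(cursor.fetchmany(3))  # rows returned after fetching only one page
```

The point of the design is that only one page plus a small remainder is buffered at any time, so memory use no longer grows with the full result-set size.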
@yitzchakl - I just did some checking. This may be possible (though a definitive answer will require yet more investigation). Notes to a future implementer:
Ok, thanks @markisaa. My main goal is to improve the performance of the execute; right now my testing results are not so good (I'm working with QlikView, not Tableau). Of course I'm running without the tracing log :-) Thanks in advance.
Sorry to hear that. Performance was definitely not high on the original design considerations; we wanted something that worked, and my internship only lasted so long. The only way to know what's slow would be to benchmark. It would not surprise me if the [mis]use of memory played a role here (though I'm sure there's plenty of code that can just be optimized anyway). I'm not in a position to work on this at the moment (swamped with my last term of university), but should that change I will see if I can help on this front. I'll also do some reading on how to properly re-enable the GC.
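To make the "benchmark first" advice concrete, here is a minimal, hypothetical micro-benchmark (Python, not the driver's D code) contrasting the two memory patterns discussed above: materializing every row before returning versus streaming rows one at a time. The row shape and counts are made up for illustration.

```python
import tracemalloc

def accumulate_all(n):
    # Materialize every row before returning -- the pattern suspected
    # of hurting the driver on 300K+ row result sets.
    return [("row-%d" % i,) for i in range(n)]

def stream_rows(n):
    # Yield rows one at a time; peak memory stays near one row's worth.
    for i in range(n):
        yield ("row-%d" % i,)

def peak_kib(fn, n):
    """Return peak traced allocation (KiB) while consuming fn(n)."""
    tracemalloc.start()
    for _ in fn(n):  # consume the rows without keeping them
        pass
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return peak / 1024

n = 100_000
print("accumulate:", round(peak_kib(accumulate_all, n)), "KiB")
print("stream:    ", round(peak_kib(stream_rows, n)), "KiB")
```

On a result set this size the accumulating version's peak grows linearly with row count while the streaming version's stays roughly flat, which is the kind of evidence a real benchmark of the driver would need to produce before optimizing anything.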
Hi @nezihyigitbasi,
We are seeing a bit of an issue with loading large tables (over 300K-500K rows) into Tableau with the Presto ODBC driver. It becomes very slow and seems to work only with an extract of the data.
Have you seen any issues like that before? What kinds of tables have you tested it on (row counts, number of columns, data types)?
Thanks Nezih.
Best,
Nataliya.