Thanks for your request! It actually boils down to two independent missing features in Hyper: parallel CTAS and parallel ARRAY scanning (more information below, in case you are interested).
We already have both of them on our backlog, but without any committed timeline so far.
Request for PARALLEL option for external table with ARRAY
I agree, reading across multiple files in parallel would be beneficial. We don't currently do so, but it is on our list... We would probably not expose this as an option, though. We would just do it under the hood without any additional hints in the query.
Note that we already do parallelize reading within individual files. A query like
SELECT SUM(x) FROM external('path/to/my/data.parquet');
is already parallelized, provided the individual files are sufficiently large.
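For reference, a multi-file scan of the kind discussed above could look like the sketch below (the file paths are placeholders, and it assumes the external() function accepting an ARRAY of paths, as in the original request). Today Hyper reads the listed files one after another, while rows within each file are read in parallel.

```sql
-- Hypothetical example paths; scans several Parquet files through one external table.
-- Currently the files in the ARRAY are read sequentially, one after another,
-- although reading within each (sufficiently large) file is parallelized.
SELECT SUM(x)
FROM external(ARRAY['path/to/part1.parquet',
                    'path/to/part2.parquet',
                    'path/to/part3.parquet']);
```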
> CTAS [...] is sequential
This is indeed a separate problem: it is not an issue with scanning external files, but with insertion into tables. Hyper's table insertion is currently always single-threaded.
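To make the distinction concrete, here is a sketch of the kind of CTAS statement being discussed (the table name and file paths are made up). Even if the scan of the external files were fully parallel, the insert side of this statement would still run on a single thread today.

```sql
-- Hypothetical CTAS: the scan of the Parquet files feeds a single-threaded insert
-- into the new Hyper table, which is the current bottleneck described above.
CREATE TABLE sales AS
    SELECT *
    FROM external(ARRAY['path/to/sales_2022.parquet',
                        'path/to/sales_2023.parquet']);
```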
Currently, CTAS using an external table with an ARRAY source is sequential: the source files in the array are read one after another.
It would be great if the SELECT (and maybe also the INSERT) could be done in parallel.
A PARALLEL => degree option could be used to set or limit the degree of parallelism.
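To illustrate the request, a hypothetical PARALLEL option on the external table could look like the sketch below. This syntax does not exist in Hyper today; it is only a mock-up of the proposal (and, per the comment above, Hyper would more likely parallelize under the hood without any such hint).

```sql
-- Hypothetical syntax only: PARALLEL is not an existing external() option.
-- The idea is to cap the number of source files read concurrently at 4.
CREATE TABLE sales AS
    SELECT *
    FROM external(ARRAY['path/to/sales_2022.parquet',
                        'path/to/sales_2023.parquet'],
                  PARALLEL => 4);
```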