Is your feature request related to a problem? Please describe.
The hybrid parquet scan is missing tests for predicate push down across the data types we claim to support: byte, short, int, long, float, double, date, decimal_32, decimal_64, string, struct, array, ...
We also support a large number of expressions (see spark-rapids/sql-plugin/src/main/scala/org/apache/spark/rapids/hybrid/HybridExecutionUtils.scala, lines 138 to 295 at commit 9db0338), but we haven't tested them as part of predicate push down either.
Describe the solution you'd like
Add tests for all the data types and expressions that we support in hybrid parquet scan filter pushdown.
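As a rough illustration of the kind of coverage meant here, below is a minimal sketch using plain Spark APIs (not the spark-rapids test harness): write a small parquet file covering several of the listed types and run one representative pushdown predicate per column. In a real test each predicate would be verified against both the regular and the hybrid scan; the object name, temp path, and column names are hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object PushdownTypeCoverageSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("pushdown-type-coverage-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Write a small parquet file covering several of the listed types.
    val df = Seq(
      (1.toByte, 1.toShort, 1, 1L, 1.0f, 1.0,
        java.sql.Date.valueOf("2024-01-01"), BigDecimal("1.23"), "a"),
      (2.toByte, 2.toShort, 2, 2L, 2.0f, 2.0,
        java.sql.Date.valueOf("2024-01-02"), BigDecimal("2.34"), "b")
    ).toDF("b", "s", "i", "l", "f", "d", "dt", "dec", "str")
    val path = "/tmp/pushdown_type_coverage"
    df.write.mode("overwrite").parquet(path)

    // One representative pushdown predicate per column/type. A real test
    // would compare hybrid-scan results against the CPU baseline instead
    // of just printing counts.
    val predicates = Seq(
      col("b") > 1.toByte,
      col("s") <= 2.toShort,
      col("i") === 1,
      col("l") =!= 2L,
      col("f") < 2.0f,
      col("d") >= 1.0,
      col("dt") === java.sql.Date.valueOf("2024-01-01"),
      col("dec") > BigDecimal("1.0"),
      col("str").startsWith("a")
    )
    predicates.foreach { p =>
      val count = spark.read.parquet(path).filter(p).count()
      println(s"$p -> $count rows")
    }
    spark.stop()
  }
}
```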
Additional context
Originally posted by @revans2 in #12112 (review)