Is your feature request related to a problem? Please describe.
Given that the rest of Deequ relies on Spark, it seems incongruous that there is no support for loading metrics from a Spark table. Saving to a JSON file works fine for now, but as we scale up, we would like to take advantage of the data catalog/governance that comes with using Spark tables (specifically with Databricks in our case, but I can imagine it being generally useful outside of that).
Describe the solution you'd like
An implementation of MetricsRepository using Spark tables as the data source.
Describe alternatives you've considered
This can be hacked together by dumping a Spark table to a JSON file and then reading that with the FileSystemMetricsRepository, but it's quite inelegant.
Additional context
Happy to take a crack at the implementation myself when I have more capacity in a few days.
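To make the idea concrete, here is a rough sketch of what such a repository could look like. This is only a hypothetical starting point, not a finished implementation: the table schema (a `data_set_date` column plus a `serialized_result` column holding JSON produced by Deequ's existing `AnalysisResultSerde`) and the class/parameter names are my own assumptions, and `load()` is left unimplemented since the multiple-results loader would need its own treatment.

```scala
import org.apache.spark.sql.SparkSession
import com.amazon.deequ.analyzers.runners.AnalyzerContext
import com.amazon.deequ.repository.{
  AnalysisResult, AnalysisResultSerde, MetricsRepository,
  MetricsRepositoryMultipleResultsLoader, ResultKey
}

// Hypothetical sketch: one row per saved result, with the AnalysisResult
// serialized to JSON via Deequ's AnalysisResultSerde. Table name and
// schema are assumptions, not an existing Deequ convention.
class SparkTableMetricsRepository(session: SparkSession, tableName: String)
  extends MetricsRepository {

  override def save(resultKey: ResultKey, analyzerContext: AnalyzerContext): Unit = {
    import session.implicits._
    val json = AnalysisResultSerde.serialize(
      Seq(AnalysisResult(resultKey, analyzerContext)))
    Seq((resultKey.dataSetDate, json))
      .toDF("data_set_date", "serialized_result")
      .write.mode("append").saveAsTable(tableName)
  }

  override def loadByKey(resultKey: ResultKey): Option[AnalyzerContext] = {
    // Naive scan-and-filter; a real implementation would push the key
    // down as a predicate instead of collecting the whole table.
    session.table(tableName)
      .select("serialized_result")
      .collect()
      .flatMap(row => AnalysisResultSerde.deserialize(row.getString(0)))
      .find(_.resultKey == resultKey)
      .map(_.analyzerContext)
  }

  override def load(): MetricsRepositoryMultipleResultsLoader =
    throw new NotImplementedError(
      "would mirror FileSystemMetricsRepository's loader over the table")
}
```

Reusing `AnalysisResultSerde` keeps the stored payload byte-compatible with what FileSystemMetricsRepository writes, so migrating existing JSON repositories into a table would just be a bulk insert.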