
Use Delta 1.1.0 for Spark 3.2.x #192

Merged 2 commits into NVIDIA:dev on Jun 17, 2024

Conversation

wjxiz1992 (Collaborator)

As title.

To fix #191

  • Change the Delta version to 1.1.0 to be compatible with Spark 3.2.0 (spark-rapids drops support for 3.1.x as of 24.08); see the sketch below
  • Add a note to the README
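
A quick way to sanity-check the new pairing (a sketch, not part of this PR; the app name and the `/tmp/delta_smoke` path are placeholders) is a PySpark 3.2.x session pinned to delta-core 1.1.0 with Delta's documented session settings:

```python
from pyspark.sql import SparkSession

# Sketch: start a Spark 3.2.x session with Delta Lake 1.1.0 on the classpath.
# The package coordinate matches the version this PR pins in the template file;
# the two spark.sql.* settings are Delta's documented session configuration.
spark = (
    SparkSession.builder
    .appName("delta-1.1.0-smoke-test")  # placeholder app name
    .config("spark.jars.packages", "io.delta:delta-core_2.12:1.1.0")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Round-trip a tiny Delta table to confirm the Delta/Spark pairing works.
spark.range(5).write.format("delta").mode("overwrite").save("/tmp/delta_smoke")
assert spark.read.format("delta").load("/tmp/delta_smoke").count() == 5
```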

Signed-off-by: Allen Xu <[email protected]>
@wjxiz1992 wjxiz1992 requested a review from GaryShen2008 June 15, 2024 02:29
@wjxiz1992 wjxiz1992 self-assigned this Jun 15, 2024
@sameerz sameerz added the dependencies Pull requests that update a dependency file label Jun 15, 2024
@wjxiz1992 wjxiz1992 merged commit 32fa7c6 into NVIDIA:dev Jun 17, 2024
3 checks passed
@@ -175,6 +175,9 @@ when you are about to shutdown the Metastore service.
 For [unmanaged tables](https://docs.databricks.com/lakehouse/data-objects.html#what-is-an-unmanaged-table),
 user doesn't need to create the Metastore service, appending `--delta_unmanaged` to arguments will be enough.
+
+NOTE: To enable Delta against different Spark versions, please modify the Delta package version accordingly in the template file.
+For more version compatibility information, please visit [compatibility with apache spark](https://docs.delta.io/latest/releases.html#compatibility-with-apache-spark).
Contributor:
Apache Spark should be capitalized properly.
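
To make the README note above concrete, here is a hypothetical helper (not part of this repo) that maps a Spark minor version to the matching delta-core coordinate; only the two versions this PR touches are listed, so extend it from the linked compatibility table before adding others:

```python
# Hypothetical helper: resolve the delta-core coordinate to drop into the
# template file for a given Spark version. Only the two versions this PR
# touches are listed; extend from Delta's compatibility table as needed.
DELTA_FOR_SPARK = {
    "3.1": "io.delta:delta-core_2.12:1.0.1",  # Delta 1.0.x line
    "3.2": "io.delta:delta-core_2.12:1.1.0",  # Delta 1.1.x line (this PR)
}

def delta_package(spark_version: str) -> str:
    """Return the delta-core Maven coordinate for a Spark version string."""
    minor = ".".join(spark_version.split(".")[:2])
    if minor not in DELTA_FOR_SPARK:
        raise ValueError(f"No Delta coordinate pinned for Spark {spark_version}")
    return DELTA_FOR_SPARK[minor]

print(delta_package("3.2.0"))  # io.delta:delta-core_2.12:1.1.0
```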

@@ -15,7 +15,7 @@
 # limitations under the License.
 #

-# 1. The io.delta:delta-core_2.12:1.0.1 only works on Spark 3.1.x
+# 1. The io.delta:delta-core_2.12:1.1.0 only works on Spark 3.2.x
Contributor:

We really should be using Delta Lake 2.0.x for consistency, as that's what we support in the RAPIDS Accelerator when accelerating Delta Lake. See https://docs.nvidia.com/spark-rapids/user-guide/24.04.01/additional-functionality/delta-lake-support.html#delta-lake-versions-supported-for-write
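
If the project does move to the Delta Lake 2.0.x line, only the coordinate in the template should need to change, since 2.0.x also targets Spark 3.2.x; the patch level below is an assumption to be verified against Delta's release table:

```python
# Assumed coordinate for the Delta Lake 2.0.x line (also built against
# Spark 3.2.x); verify the patch level against Delta's release table
# before pinning it in the template file.
DELTA_2_0_PACKAGE = "io.delta:delta-core_2.12:2.0.2"
```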

Labels: dependencies (Pull requests that update a dependency file)
Projects: None yet
Development: Successfully merging this pull request may close this issue:
[BUG] Delta related jobs failed due to Spark version incompatibility (#191)
3 participants