
[QST] Running using custom Spark Builds #11957

Closed
marwansalem opened this issue Jan 12, 2025 · 2 comments
Labels: question (Further information is requested)

Comments

@marwansalem

What is your question?
I have my own custom Spark version, which is based on Spark 3.4.1.

I am getting:
Exception in thread "main" java.lang.IllegalArgumentException: This RAPIDS Plugin build does not support Spark build 3.4.1-INX-1.1.0. Supported Spark versions: 3.2.0 {buildver=320}, 3.2.1 {buildver=321}, 3.2.1-cloudera-3.2.7171000 {buildver=321cdh}, 3.2.2 {buildver=322}, 3.2.3 {buildver=323}, 3.2.4 {buildver=324}, 3.3.0 {buildver=330}, 3.3.0-cloudera-3.3.7180 {buildver=330cdh}, 3.3.0-databricks {buildver=330db}, 3.3.1 {buildver=331}, 3.3.2 {buildver=332}, 3.3.2-cloudera-3.3.7190 {buildver=332cdh}, 3.3.2-databricks {buildver=332db}, 3.3.3 {buildver=333}, 3.3.4 {buildver=334}, 3.4.0 {buildver=340}, 3.4.1 {buildver=341}, 3.4.1-databricks {buildver=341db}, 3.4.2 {buildver=342}, 3.4.3 {buildver=343}, 3.4.4 {buildver=344}, 3.5.0 {buildver=350}, 3.5.1 {buildver=351}, 3.5.2 {buildver=352}, 3.5.3 {buildver=353}. Consult the Release documentation at https://nvidia.github.io/spark-rapids/docs/download.html

Is there a way to bypass this?

I'd appreciate your help.

@marwansalem added the "? - Needs Triage" (Need team to review and classify) and "question" (Further information is requested) labels on Jan 12, 2025
@tgravescs
Collaborator

tgravescs commented Jan 13, 2025

Hello, there is a config you can use to override this behavior when the version numbers don't match exactly. The thing to keep in mind is that if you really have a custom version with different behavior, we may not be 100% compatible with it.

The config is spark.rapids.shims-provider-override; its documentation in the plugin source reads:

.doc("Overrides the automatic Spark shim detection logic and forces a specific shims " +
    "provider class to be used. Set to the fully qualified shims provider class to use. " +
    "If you are using a custom Spark version such as Spark 3.2.0 then this can be used to " +
    "specify the shims provider that matches the base Spark version of Spark 3.2.0, i.e.: " +
    "com.nvidia.spark.rapids.shims.spark320.SparkShimServiceProvider. If you modified Spark " +
    "then there is no guarantee the RAPIDS Accelerator will function properly." +
    "When tested in a combined jar with other Shims, it's expected that the provided " +
    "implementation follows the same convention as existing Spark shims. If its class" +
    " name has the form com.nvidia.spark.rapids.shims.<shimId>.YourSparkShimServiceProvider. " +
    "The last package name component, i.e., shimId, can be used in the combined jar as the root" +
    " directory /shimId for any incompatible classes. When tested in isolation, no special " +
    "jar root is required"
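
If you want to confirm which shim providers a given RAPIDS jar actually bundles, one quick check is to list its contents (the jar file name below is just an example; substitute the jar you actually deploy):

    # List the shim service provider classes packaged in the combined jar
    jar tf rapids-4-spark_2.12-24.12.0.jar | grep SparkShimServiceProvider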

So if you know your version is based on 3.4.1, you can try:

--conf spark.rapids.shims-provider-override=com.nvidia.spark.rapids.shims.spark341.SparkShimServiceProvider
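
For context, a full spark-submit invocation with this override might look like the sketch below (the jar path and application jar are placeholders, not details from this thread):

    # Hypothetical spark-submit invocation with the shim override;
    # substitute the RAPIDS jar and application jar you actually use
    spark-submit \
      --jars /opt/sparkRapidsPlugin/rapids-4-spark.jar \
      --conf spark.plugins=com.nvidia.spark.SQLPlugin \
      --conf spark.rapids.shims-provider-override=com.nvidia.spark.rapids.shims.spark341.SparkShimServiceProvider \
      your-app.jar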

Let us know if you have any other questions or problems.

@mattahrens removed the "? - Needs Triage" (Need team to review and classify) label on Jan 14, 2025
@mattahrens
Collaborator

Please re-open with any further questions.
