Running mortgage_test.py::test_mortgage fails with a "'JavaPackage' object is not callable" error.

To run the test, use:

WITH_DEFAULT_UPSTREAM_SHIM=0 TEST_PARALLEL=0 TESTS=mortgage_test.py ./jenkins/databricks/test.sh
This is very similar to issue #8910, which is specific to EMR.
../../src/main/python/mortgage_test.py::test_mortgage[DATAGEN_SEED=1737684235, TZ=UTC, IGNORE_ORDER, INCOMPAT, APPROXIMATE_FLOAT, ALLOW_NON_GPU(ANY), LIMIT(100000)] FAILED [100%]
=================================== FAILURES ===================================
________________________________ test_mortgage _________________________________

mortgage = <conftest.MortgageRunner object at 0x7fae9642fbb0>

    @incompat
    @approximate_float
    @limit
    @ignore_order
    @allow_non_gpu(any=True)
    def test_mortgage(mortgage):
>       assert_gpu_and_cpu_are_equal_iterator(
            lambda spark : mortgage.do_test_query(spark))

../../src/main/python/mortgage_test.py:27:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../src/main/python/asserts.py:607: in assert_gpu_and_cpu_are_equal_iterator
    _assert_gpu_and_cpu_are_equal(func, 'ITERATOR', conf=conf, is_cpu_first=is_cpu_first)
../../src/main/python/asserts.py:506: in _assert_gpu_and_cpu_are_equal
    from_cpu = run_on_cpu()
../../src/main/python/asserts.py:491: in run_on_cpu
    from_cpu = with_cpu_session(bring_back, conf=conf)
../../src/main/python/spark_session.py:150: in with_cpu_session
    return with_spark_session(func, conf=copy)
/usr/lib/python3.10/contextlib.py:79: in inner
    return func(*args, **kwds)
../../src/main/python/spark_session.py:134: in with_spark_session
    ret = func(_spark)
../../src/main/python/asserts.py:221: in <lambda>
    bring_back = lambda spark: limit_func(spark).toLocalIterator()
../../src/main/python/asserts.py:202: in with_limit
    df = sorted_func(spark)
../../src/main/python/asserts.py:192: in with_sorted
    df = func(spark)
../../src/main/python/mortgage_test.py:28: in <lambda>
    lambda spark : mortgage.do_test_query(spark))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <conftest.MortgageRunner object at 0x7fae9642fbb0>
spark = <pyspark.sql.session.SparkSession object at 0x7fae96e46a10>

    def do_test_query(self, spark):
        from pyspark.sql.dataframe import DataFrame
        jvm_session = _get_jvm_session(spark)
        jvm = _get_jvm(spark)
        acq = self.mortgage_acq_path
        perf = self.mortgage_perf_path
        run = jvm.com.nvidia.spark.rapids.tests.mortgage.Run
        if self.mortgage_format == 'csv':
            df = run.csv(jvm_session, perf, acq)
        elif self.mortgage_format == 'parquet':
>           df = run.parquet(jvm_session, perf, acq)
E           TypeError: 'JavaPackage' object is not callable

../../src/main/python/conftest.py:496: TypeError
Might be similar to #11988. Even though the stack trace looks different, we suspect the underlying problem is a jar not being found on the classpath.
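For reference, here is a minimal diagnostic sketch of how that hypothesis could be checked from a PySpark driver session. It assumes py4j and an active SparkSession and is not part of the test harness; py4j hands back a JavaPackage placeholder when a class name cannot be resolved, and calling a method on that placeholder is what produces this TypeError.

from py4j.java_gateway import JavaPackage
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
jvm = spark.sparkContext._jvm

# Hypothetical check: if the tests jar is missing from the driver classpath,
# py4j resolves the name to a JavaPackage placeholder instead of a JavaClass,
# and run.parquet(...) then fails with "'JavaPackage' object is not callable".
run = jvm.com.nvidia.spark.rapids.tests.mortgage.Run
if isinstance(run, JavaPackage):
    print("Run did not resolve to a class: tests jar is likely missing from the classpath")
else:
    print("Run resolved to:", run)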