Commit 8589518 (parent 8c0833f)

update for 2412 release

Signed-off-by: liyuan <[email protected]>
28 files changed, +40 -40 lines changed

docs/get-started/xgboost-examples/csp/databricks/databricks.md (+2 -2)

@@ -21,7 +21,7 @@ Navigate to your home directory in the UI and select **Create** > **File** from
 create an `init.sh` scripts with contents:
 ```bash
 #!/bin/bash
-sudo wget -O /databricks/jars/rapids-4-spark_2.12-24.10.0.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.10.0/rapids-4-spark_2.12-24.10.0.jar
+sudo wget -O /databricks/jars/rapids-4-spark_2.12-24.12.0.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.12.0/rapids-4-spark_2.12-24.12.0.jar
 ```
 1. Select the Databricks Runtime Version from one of the supported runtimes specified in the
    Prerequisites section.
@@ -68,7 +68,7 @@ create an `init.sh` scripts with contents:
 ```bash
 spark.rapids.sql.python.gpu.enabled true
 spark.python.daemon.module rapids.daemon_databricks
-spark.executorEnv.PYTHONPATH /databricks/jars/rapids-4-spark_2.12-24.10.0.jar:/databricks/spark/python
+spark.executorEnv.PYTHONPATH /databricks/jars/rapids-4-spark_2.12-24.12.0.jar:/databricks/spark/python
 ```
 Note that since python memory pool require installing the cudf library, so you need to install cudf library in
 each worker nodes `pip install cudf-cu11 --extra-index-url=https://pypi.nvidia.com` or disable python memory pool

docs/get-started/xgboost-examples/csp/databricks/init.sh (+1 -1)

@@ -1,7 +1,7 @@
 sudo rm -f /databricks/jars/spark--maven-trees--ml--10.x--xgboost-gpu--ml.dmlc--xgboost4j-gpu_2.12--ml.dmlc__xgboost4j-gpu_2.12__1.5.2.jar
 sudo rm -f /databricks/jars/spark--maven-trees--ml--10.x--xgboost-gpu--ml.dmlc--xgboost4j-spark-gpu_2.12--ml.dmlc__xgboost4j-spark-gpu_2.12__1.5.2.jar
 
-sudo wget -O /databricks/jars/rapids-4-spark_2.12-24.10.0.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.10.0/rapids-4-spark_2.12-24.10.0.jar
+sudo wget -O /databricks/jars/rapids-4-spark_2.12-24.12.0.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.12.0/rapids-4-spark_2.12-24.12.0.jar
 sudo wget -O /databricks/jars/xgboost4j-gpu_2.12-1.7.1.jar https://repo1.maven.org/maven2/ml/dmlc/xgboost4j-gpu_2.12/1.7.1/xgboost4j-gpu_2.12-1.7.1.jar
 sudo wget -O /databricks/jars/xgboost4j-spark-gpu_2.12-1.7.1.jar https://repo1.maven.org/maven2/ml/dmlc/xgboost4j-spark-gpu_2.12/1.7.1/xgboost4j-spark-gpu_2.12-1.7.1.jar
 ls -ltr

docs/get-started/xgboost-examples/on-prem-cluster/kubernetes-scala.md (+1 -1)

@@ -40,7 +40,7 @@ export SPARK_DOCKER_IMAGE=<gpu spark docker image repo and name>
 export SPARK_DOCKER_TAG=<spark docker image tag>
 
 pushd ${SPARK_HOME}
-wget https://github.com/NVIDIA/spark-rapids-examples/raw/branch-24.10/dockerfile/Dockerfile
+wget https://github.com/NVIDIA/spark-rapids-examples/raw/branch-24.12/dockerfile/Dockerfile
 
 # Optionally install additional jars into ${SPARK_HOME}/jars/
 

docs/get-started/xgboost-examples/prepare-package-data/preparation-python.md (+1 -1)

@@ -5,7 +5,7 @@ For simplicity export the location to these jars. All examples assume the packag
 ### Download the jars
 
 Download the RAPIDS Accelerator for Apache Spark plugin jar
-* [RAPIDS Spark Package](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.10.0/rapids-4-spark_2.12-24.10.0.jar)
+* [RAPIDS Spark Package](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.12.0/rapids-4-spark_2.12-24.12.0.jar)
 
 ### Build XGBoost Python Examples
 

docs/get-started/xgboost-examples/prepare-package-data/preparation-scala.md (+1 -1)

@@ -5,7 +5,7 @@ For simplicity export the location to these jars. All examples assume the packag
 ### Download the jars
 
 1. Download the RAPIDS Accelerator for Apache Spark plugin jar
-   * [RAPIDS Spark Package](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.10.0/rapids-4-spark_2.12-24.10.0.jar)
+   * [RAPIDS Spark Package](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.12.0/rapids-4-spark_2.12-24.12.0.jar)
 
 ### Build XGBoost Scala Examples
 

examples/ML+DL-Examples/Optuna-Spark/README.md (+2 -2)

@@ -147,8 +147,8 @@ We use [RAPIDS](https://docs.rapids.ai/install/#get-rapids) for GPU-accelerated
 ``` shell
 sudo apt install libmysqlclient-dev
 
-conda create -n rapids-24.10 -c rapidsai -c conda-forge -c nvidia \
-    cudf=24.10 cuml=24.10 python=3.10 'cuda-version>=12.0,<=12.5'
+conda create -n rapids-24.12 -c rapidsai -c conda-forge -c nvidia \
+    cudf=24.12 cuml=24.12 python=3.10 'cuda-version>=12.0,<=12.5'
 conda activate optuna-spark
 pip install mysqlclient
 pip install optuna joblib joblibspark ipywidgets

examples/ML+DL-Examples/Optuna-Spark/optuna-examples/databricks/init_optuna.sh (+2 -2)

@@ -41,7 +41,7 @@ fi
 
 
 # rapids import
-SPARK_RAPIDS_VERSION=24.10.1
+SPARK_RAPIDS_VERSION=24.12.0
 curl -L https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/${SPARK_RAPIDS_VERSION}/rapids-4-spark_2.12-${SPARK_RAPIDS_VERSION}.jar -o \
   /databricks/jars/rapids-4-spark_2.12-${SPARK_RAPIDS_VERSION}.jar
 
@@ -54,7 +54,7 @@ ln -s /usr/local/cuda-11.8 /usr/local/cuda
 
 sudo /databricks/python3/bin/pip3 install \
   --extra-index-url=https://pypi.nvidia.com \
-  "cudf-cu11==24.10.*" "cuml-cu11==24.10.*"
+  "cudf-cu11==24.12.*" "cuml-cu11==24.12.*"
 
 # setup python environment
 sudo apt clean && sudo apt update --fix-missing -y

examples/ML+DL-Examples/Optuna-Spark/optuna-examples/databricks/start_cluster.sh (+1 -1)

@@ -12,7 +12,7 @@ json_config=$(cat <<EOF
   "spark_version": "13.3.x-gpu-ml-scala2.12",
   "spark_conf": {
     "spark.task.resource.gpu.amount": "1",
-    "spark.executorEnv.PYTHONPATH": "/databricks/jars/rapids-4-spark_2.12-24.10.1.jar:/databricks/spark/python:/databricks/python3",
+    "spark.executorEnv.PYTHONPATH": "/databricks/jars/rapids-4-spark_2.12-24.12.0.jar:/databricks/spark/python:/databricks/python3",
     "spark.executor.cores": "8",
     "spark.rapids.memory.gpu.minAllocFraction": "0.0001",
     "spark.plugins": "com.nvidia.spark.SQLPlugin",

examples/ML+DL-Examples/Optuna-Spark/optuna-examples/optuna-dataframe.ipynb (+2 -2)

@@ -444,14 +444,14 @@
 "24/12/11 23:47:52 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable\n",
 "Setting default log level to \"WARN\".\n",
 "To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).\n",
-"24/12/11 23:47:52 WARN RapidsPluginUtils: RAPIDS Accelerator 24.10.1 using cudf 24.10.0, private revision bd4e99e18e20234ee0c54f95f4b0bfce18a6255e\n",
+"24/12/11 23:47:52 WARN RapidsPluginUtils: RAPIDS Accelerator 24.12.0 using cudf 24.12.0, private revision bd4e99e18e20234ee0c54f95f4b0bfce18a6255e\n",
 "24/12/11 23:47:52 WARN RapidsPluginUtils: RAPIDS Accelerator is enabled, to disable GPU support set `spark.rapids.sql.enabled` to false.\n"
 ]
 }
 ],
 "source": [
 "def get_rapids_jar():\n",
-"    SPARK_RAPIDS_VERSION = \"24.10.1\"\n",
+"    SPARK_RAPIDS_VERSION = \"24.12.0\"\n",
 "    rapids_jar = f\"rapids-4-spark_2.12-{SPARK_RAPIDS_VERSION}.jar\"\n",
 "    if not os.path.exists(rapids_jar):\n",
 "        print(\"Downloading Spark Rapids jar\")\n",
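The diff above shows only the first lines of the notebook's `get_rapids_jar()` helper. A hedged, self-contained sketch of what such a helper does (the `rapids_jar_url` split is our addition for clarity; the URL pattern follows the Maven Central links elsewhere in this commit):

```python
import os
import urllib.request

SPARK_RAPIDS_VERSION = "24.12.0"

def rapids_jar_url(version: str) -> tuple:
    """Return (jar filename, Maven Central URL) for a given plugin version."""
    jar = f"rapids-4-spark_2.12-{version}.jar"
    url = (f"https://repo1.maven.org/maven2/com/nvidia/"
           f"rapids-4-spark_2.12/{version}/{jar}")
    return jar, url

def get_rapids_jar(version: str = SPARK_RAPIDS_VERSION) -> str:
    """Download the RAPIDS plugin jar into the working directory if absent."""
    jar, url = rapids_jar_url(version)
    if not os.path.exists(jar):
        print("Downloading Spark Rapids jar")
        urllib.request.urlretrieve(url, jar)
    return jar
```

Pinning the version in one variable is what makes bumps like this commit a one-line change per notebook.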

examples/ML+DL-Examples/Spark-Rapids-ML/pca/README.md (+1 -1)

@@ -10,7 +10,7 @@ Please refer to the Spark-Rapids-ML [README](https://github.com/NVIDIA/spark-rap
 ## Download RAPIDS Jar from Maven Central
 
 Download the [Spark-Rapids plugin](https://nvidia.github.io/spark-rapids/docs/download.html#download-rapids-accelerator-for-apache-spark-v24081).
-For Spark-RAPIDS-ML version 24.10.1, download the RAPIDS jar from Maven Central: [rapids-4-spark_2.12-24.10.1.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.10.1/rapids-4-spark_2.12-24.10.1.jar).
+For Spark-RAPIDS-ML version 24.12.1, download the RAPIDS jar from Maven Central: [rapids-4-spark_2.12-24.12.1.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.12.1/rapids-4-spark_2.12-24.12.1.jar).
 
 ## Running the Notebooks
 

examples/ML+DL-Examples/Spark-Rapids-ML/pca/notebooks/pca.ipynb (+3 -3)

@@ -43,7 +43,7 @@
 "24/10/04 18:04:27 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable\n",
 "Setting default log level to \"WARN\".\n",
 "To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).\n",
-"24/10/04 18:04:27 WARN RapidsPluginUtils: RAPIDS Accelerator 24.10.0 using cudf 24.10.0, private revision 9fac64da220ddd6bf5626bd7bd1dd74c08603eac\n",
+"24/10/04 18:04:27 WARN RapidsPluginUtils: RAPIDS Accelerator 24.12.0 using cudf 24.12.0, private revision 9fac64da220ddd6bf5626bd7bd1dd74c08603eac\n",
 "24/10/04 18:04:27 WARN RapidsPluginUtils: RAPIDS Accelerator is enabled, to disable GPU support set `spark.rapids.sql.enabled` to false.\n",
 "24/10/04 18:04:31 WARN GpuDeviceManager: RMM pool is disabled since spark.rapids.memory.gpu.pooling.enabled is set to false; however, this configuration is deprecated and the behavior may change in a future release.\n"
 ]
@@ -57,7 +57,7 @@
 "    import os\n",
 "    import requests\n",
 "\n",
-"    SPARK_RAPIDS_VERSION = \"24.10.1\"\n",
+"    SPARK_RAPIDS_VERSION = \"24.12.1\"\n",
 "    rapids_jar = f\"rapids-4-spark_2.12-{SPARK_RAPIDS_VERSION}.jar\"\n",
 "    if not os.path.exists(rapids_jar):\n",
 "        print(\"Downloading spark rapids jar\")\n",
@@ -539,7 +539,7 @@
 ],
 "metadata": {
 "kernelspec": {
-"display_name": "rapids-24.10",
+"display_name": "rapids-24.12",
 "language": "python",
 "name": "python3"
 },

examples/SQL+DF-Examples/micro-benchmarks/notebooks/micro-benchmarks-gpu.ipynb (+1 -1)

@@ -22,7 +22,7 @@
 "import os\n",
 "# Change to your cluster ip:port and directories\n",
 "SPARK_MASTER_URL = os.getenv(\"SPARK_MASTER_URL\", \"spark:your-ip:port\")\n",
-"RAPIDS_JAR = os.getenv(\"RAPIDS_JAR\", \"/your-path/rapids-4-spark_2.12-24.10.0.jar\")\n"
+"RAPIDS_JAR = os.getenv(\"RAPIDS_JAR\", \"/your-path/rapids-4-spark_2.12-24.12.0.jar\")\n"
 ]
 },
 {

examples/SQL+DF-Examples/tpcds/README.md (+1 -1)

@@ -23,6 +23,6 @@ Google Colab and connect it to a [GPU instance](https://research.google.com/cola
 </a>
 
 Here is the bar chart from a recent execution on Google Colab's T4 High RAM instance using
-RAPIDS Spark 24.10.0 with Apache Spark 3.5.0
+RAPIDS Spark 24.12.0 with Apache Spark 3.5.0
 
 ![tpcds-speedup](/docs/img/guides/tpcds.png)

examples/SQL+DF-Examples/tpcds/notebooks/TPCDS-SF10.ipynb (+1 -1)

@@ -30,7 +30,7 @@
 "outputs": [],
 "source": [
 "spark_version='3.5.0'\n",
-"rapids_version='24.10.0'"
+"rapids_version='24.12.0'"
 ]
 },
 {

examples/UDF-Examples/RAPIDS-accelerated-UDFs/README.md (+1 -1)

@@ -186,7 +186,7 @@ then do the following inside the Docker container.
 
 ### Get jars from Maven Central
 
-[rapids-4-spark_2.12-24.10.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.10.0/rapids-4-spark_2.12-24.10.0.jar)
+[rapids-4-spark_2.12-24.12.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.12.0/rapids-4-spark_2.12-24.12.0.jar)
 
 
 ### Launch a local mode Spark

examples/UDF-Examples/RAPIDS-accelerated-UDFs/pom.xml (+1 -1)

@@ -37,7 +37,7 @@
         <cuda.version>cuda11</cuda.version>
         <scala.binary.version>2.12</scala.binary.version>
         <!-- Depends on release version, Snapshot version is not published to the Maven Central -->
-        <rapids4spark.version>24.10.0</rapids4spark.version>
+        <rapids4spark.version>24.12.0</rapids4spark.version>
         <spark.version>3.1.1</spark.version>
         <scala.version>2.12.15</scala.version>
         <udf.native.build.path>${project.build.directory}/cpp-build</udf.native.build.path>
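The `rapids4spark.version` property bumped above is presumably consumed by a dependency entry elsewhere in this pom. A sketch of the conventional Maven usage (not the file's actual contents, which the diff does not show):

```xml
<dependency>
  <groupId>com.nvidia</groupId>
  <artifactId>rapids-4-spark_${scala.binary.version}</artifactId>
  <version>${rapids4spark.version}</version>
  <scope>provided</scope>
</dependency>
```

Centralizing the version in a property is why this file needs only a single-line change per release.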

examples/XGBoost-Examples/agaricus/notebooks/python/agaricus-gpu.ipynb (+1 -1)

@@ -73,7 +73,7 @@
 "Setting default log level to \"WARN\".\n",
 "To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).\n",
 "2022-11-30 06:57:40,550 WARN resource.ResourceUtils: The configuration of cores (exec = 2 task = 1, runnable tasks = 2) will result in wasted resources due to resource gpu limiting the number of runnable tasks per executor to: 1. Please adjust your configuration.\n",
-"2022-11-30 06:57:54,195 WARN rapids.RapidsPluginUtils: RAPIDS Accelerator 24.10.0 using cudf 24.10.0.\n",
+"2022-11-30 06:57:54,195 WARN rapids.RapidsPluginUtils: RAPIDS Accelerator 24.12.0 using cudf 24.12.0.\n",
 "2022-11-30 06:57:54,210 WARN rapids.RapidsPluginUtils: spark.rapids.sql.multiThreadedRead.numThreads is set to 20.\n",
 "2022-11-30 06:57:54,214 WARN rapids.RapidsPluginUtils: RAPIDS Accelerator is enabled, to disable GPU support set `spark.rapids.sql.enabled` to false.\n",
 "2022-11-30 06:57:54,214 WARN rapids.RapidsPluginUtils: spark.rapids.sql.explain is set to `NOT_ON_GPU`. Set it to 'NONE' to suppress the diagnostics logging about the query placement on the GPU.\n",

examples/XGBoost-Examples/mortgage/notebooks/python/MortgageETL.ipynb (+3 -3)

@@ -6,18 +6,18 @@
 "source": [
 "## Prerequirement\n",
 "### 1. Download data\n",
-"Dataset is derived from Fannie Mae’s [Single-Family Loan Performance Data](http://www.fanniemae.com/portal/funding-the-market/data/loan-performance-data.html) with all rights reserved by Fannie Mae. Refer to these [instructions](https://github.com/NVIDIA/spark-rapids-examples/blob/branch-24.10/docs/get-started/xgboost-examples/dataset/mortgage.md) to download the dataset.\n",
+"Dataset is derived from Fannie Mae’s [Single-Family Loan Performance Data](http://www.fanniemae.com/portal/funding-the-market/data/loan-performance-data.html) with all rights reserved by Fannie Mae. Refer to these [instructions](https://github.com/NVIDIA/spark-rapids-examples/blob/branch-24.12/docs/get-started/xgboost-examples/dataset/mortgage.md) to download the dataset.\n",
 "\n",
 "### 2. Download needed jars\n",
-"* [rapids-4-spark_2.12-24.10.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.10.0/rapids-4-spark_2.12-24.10.0.jar)\n",
+"* [rapids-4-spark_2.12-24.12.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.12.0/rapids-4-spark_2.12-24.12.0.jar)\n",
 "\n",
 "\n",
 "### 3. Start Spark Standalone\n",
 "Before running the script, please setup Spark standalone mode\n",
 "\n",
 "### 4. Add ENV\n",
 "```\n",
-"$ export SPARK_JARS=rapids-4-spark_2.12-24.10.0.jar\n",
+"$ export SPARK_JARS=rapids-4-spark_2.12-24.12.0.jar\n",
 "$ export PYSPARK_DRIVER_PYTHON=jupyter \n",
 "$ export PYSPARK_DRIVER_PYTHON_OPTS=notebook\n",
 "```\n",

examples/XGBoost-Examples/mortgage/notebooks/python/cv-mortgage-gpu.ipynb (+1 -1)

@@ -63,7 +63,7 @@
 "Setting default log level to \"WARN\".\n",
 "To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).\n",
 "2022-11-25 09:34:43,952 WARN resource.ResourceUtils: The configuration of cores (exec = 4 task = 1, runnable tasks = 4) will result in wasted resources due to resource gpu limiting the number of runnable tasks per executor to: 1. Please adjust your configuration.\n",
-"2022-11-25 09:34:58,155 WARN rapids.RapidsPluginUtils: RAPIDS Accelerator 24.10.0 using cudf 24.10.0.\n",
+"2022-11-25 09:34:58,155 WARN rapids.RapidsPluginUtils: RAPIDS Accelerator 24.12.0 using cudf 24.12.0.\n",
 "2022-11-25 09:34:58,171 WARN rapids.RapidsPluginUtils: spark.rapids.sql.multiThreadedRead.numThreads is set to 20.\n",
 "2022-11-25 09:34:58,175 WARN rapids.RapidsPluginUtils: RAPIDS Accelerator is enabled, to disable GPU support set `spark.rapids.sql.enabled` to false.\n",
 "2022-11-25 09:34:58,175 WARN rapids.RapidsPluginUtils: spark.rapids.sql.explain is set to `NOT_ON_GPU`. Set it to 'NONE' to suppress the diagnostics logging about the query placement on the GPU.\n"

examples/XGBoost-Examples/mortgage/notebooks/python/mortgage-gpu.ipynb (+1 -1)

@@ -84,7 +84,7 @@
 "22/11/24 06:14:06 INFO org.apache.spark.SparkEnv: Registering BlockManagerMaster\n",
 "22/11/24 06:14:06 INFO org.apache.spark.SparkEnv: Registering BlockManagerMasterHeartbeat\n",
 "22/11/24 06:14:06 INFO org.apache.spark.SparkEnv: Registering OutputCommitCoordinator\n",
-"22/11/24 06:14:07 WARN com.nvidia.spark.rapids.RapidsPluginUtils: RAPIDS Accelerator 24.10.0 using cudf 24.10.0.\n",
+"22/11/24 06:14:07 WARN com.nvidia.spark.rapids.RapidsPluginUtils: RAPIDS Accelerator 24.12.0 using cudf 24.12.0.\n",
 "22/11/24 06:14:07 WARN com.nvidia.spark.rapids.RapidsPluginUtils: spark.rapids.sql.multiThreadedRead.numThreads is set to 20.\n",
 "22/11/24 06:14:07 WARN com.nvidia.spark.rapids.RapidsPluginUtils: RAPIDS Accelerator is enabled, to disable GPU support set `spark.rapids.sql.enabled` to false.\n",
 "22/11/24 06:14:07 WARN com.nvidia.spark.rapids.RapidsPluginUtils: spark.rapids.sql.explain is set to `NOT_ON_GPU`. Set it to 'NONE' to suppress the diagnostics logging about the query placement on the GPU.\n"

examples/XGBoost-Examples/mortgage/notebooks/scala/mortgage-ETL.ipynb (+2 -2)

@@ -20,14 +20,14 @@
 "Refer to these [instructions](https://github.com/NVIDIA/spark-rapids-examples/blob/branch-23.12/docs/get-started/xgboost-examples/dataset/mortgage.md) to download the dataset.\n",
 "\n",
 "### 2. Download needed jars\n",
-"* [rapids-4-spark_2.12-24.10.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.10.0/rapids-4-spark_2.12-24.10.0.jar)\n",
+"* [rapids-4-spark_2.12-24.12.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.12.0/rapids-4-spark_2.12-24.12.0.jar)\n",
 "\n",
 "### 3. Start Spark Standalone\n",
 "Before Running the script, please setup Spark standalone mode\n",
 "\n",
 "### 4. Add ENV\n",
 "```\n",
-"$ export SPARK_JARS=rapids-4-spark_2.12-24.10.0.jar\n",
+"$ export SPARK_JARS=rapids-4-spark_2.12-24.12.0.jar\n",
 "\n",
 "```\n",
 "\n",

examples/XGBoost-Examples/taxi/notebooks/python/cv-taxi-gpu.ipynb (+1 -1)

@@ -62,7 +62,7 @@
 "Setting default log level to \"WARN\".\n",
 "To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).\n",
 "2022-11-30 08:02:10,103 WARN resource.ResourceUtils: The configuration of cores (exec = 2 task = 1, runnable tasks = 2) will result in wasted resources due to resource gpu limiting the number of runnable tasks per executor to: 1. Please adjust your configuration.\n",
-"2022-11-30 08:02:23,737 WARN rapids.RapidsPluginUtils: RAPIDS Accelerator 24.10.0 using cudf 24.10.0.\n",
+"2022-11-30 08:02:23,737 WARN rapids.RapidsPluginUtils: RAPIDS Accelerator 24.12.0 using cudf 24.12.0.\n",
 "2022-11-30 08:02:23,752 WARN rapids.RapidsPluginUtils: spark.rapids.sql.multiThreadedRead.numThreads is set to 20.\n",
 "2022-11-30 08:02:23,756 WARN rapids.RapidsPluginUtils: RAPIDS Accelerator is enabled, to disable GPU support set `spark.rapids.sql.enabled` to false.\n",
 "2022-11-30 08:02:23,757 WARN rapids.RapidsPluginUtils: spark.rapids.sql.explain is set to `NOT_ON_GPU`. Set it to 'NONE' to suppress the diagnostics logging about the query placement on the GPU.\n",

examples/XGBoost-Examples/taxi/notebooks/python/taxi-ETL.ipynb (+2 -2)

@@ -19,14 +19,14 @@
 "All data could be found at https://www1.nyc.gov/site/tlc/about/tlc-trip-record-data.page\n",
 "\n",
 "### 2. Download needed jars\n",
-"* [rapids-4-spark_2.12-24.10.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.10.0/rapids-4-spark_2.12-24.10.0.jar)\n",
+"* [rapids-4-spark_2.12-24.12.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.12.0/rapids-4-spark_2.12-24.12.0.jar)\n",
 "\n",
 "### 3. Start Spark Standalone\n",
 "Before running the script, please setup Spark standalone mode\n",
 "\n",
 "### 4. Add ENV\n",
 "```\n",
-"$ export SPARK_JARS=rapids-4-spark_2.12-24.10.0.jar\n",
+"$ export SPARK_JARS=rapids-4-spark_2.12-24.12.0.jar\n",
 "$ export PYSPARK_DRIVER_PYTHON=jupyter \n",
 "$ export PYSPARK_DRIVER_PYTHON_OPTS=notebook\n",
 "```\n",

examples/XGBoost-Examples/taxi/notebooks/python/taxi-gpu.ipynb (+2 -2)

@@ -73,7 +73,7 @@
 "Setting default log level to \"WARN\".\n",
 "To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).\n",
 "2022-11-30 07:51:19,480 WARN resource.ResourceUtils: The configuration of cores (exec = 2 task = 1, runnable tasks = 2) will result in wasted resources due to resource gpu limiting the number of runnable tasks per executor to: 1. Please adjust your configuration.\n",
-"2022-11-30 07:51:33,277 WARN rapids.RapidsPluginUtils: RAPIDS Accelerator 24.10.0 using cudf 24.10.0.\n",
+"2022-11-30 07:51:33,277 WARN rapids.RapidsPluginUtils: RAPIDS Accelerator 24.12.0 using cudf 24.12.0.\n",
 "2022-11-30 07:51:33,292 WARN rapids.RapidsPluginUtils: spark.rapids.sql.multiThreadedRead.numThreads is set to 20.\n",
 "2022-11-30 07:51:33,295 WARN rapids.RapidsPluginUtils: RAPIDS Accelerator is enabled, to disable GPU support set `spark.rapids.sql.enabled` to false.\n",
 "2022-11-30 07:51:33,295 WARN rapids.RapidsPluginUtils: spark.rapids.sql.explain is set to `NOT_ON_GPU`. Set it to 'NONE' to suppress the diagnostics logging about the query placement on the GPU.\n",
@@ -266,7 +266,7 @@
 "name": "stdout",
 "output_type": "stream",
 "text": [
-"Training takes 24.10 seconds\n"
+"Training takes 24.12 seconds\n"
 ]
 },
 {
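A release bump like this one is mechanical, so it is commonly scripted with `sed` over the affected files rather than edited by hand. A hypothetical sketch (not part of this commit), shown on a single sample string:

```shell
# Escape dots in the old version so sed matches them literally.
OLD='24\.10\.0'
NEW='24.12.0'
# In practice one would run this over `git grep -l '24\.10\.0'`;
# here we demonstrate on one filename string.
echo "rapids-4-spark_2.12-24.10.0.jar" | sed "s/${OLD}/${NEW}/g"
# → rapids-4-spark_2.12-24.12.0.jar
```

An unanchored global replace can also touch unrelated strings, as happened above in taxi-gpu.ipynb, where the log text "Training takes 24.10 seconds" (a runtime, not a version) was rewritten to "24.12 seconds".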
