Reduce mvn -Xmx option to 2g in publish_snapshot workflow #318

GitHub Actions / Report test results: failed Oct 26, 2023 in 0s

45,557 tests run, 873 skipped, 18 failed.

Annotations

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_apply_in_pandas_returning_no_column_names_and_wrong_amount

<_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {created_time:"2023-10-26T10:40:07.937211476+00:00", grpc_status:14}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py", line 51, in test_apply_in_pandas_returning_no_column_names_and_wrong_amount
    self.check_apply_in_pandas_returning_no_column_names_and_wrong_amount()
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 343, in check_apply_in_pandas_returning_no_column_names_and_wrong_amount
    self._test_apply_in_pandas(
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 817, in _test_apply_in_pandas
    df.groupby("id").applyInPandas(f, schema=output_schema).sort("id", "mean").toPandas()
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1730, in toPandas
    return self._session.client.to_pandas(query, self._plan.observations)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 849, in to_pandas
    table, schema, metrics, observed_metrics, _ = self._execute_and_fetch(
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1301, in _execute_and_fetch
    for response in self._execute_and_fetch_as_iterator(req, observations):
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1279, in _execute_and_fetch_as_iterator
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {created_time:"2023-10-26T10:40:07.937211476+00:00", grpc_status:14}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_apply_in_pandas_returning_wrong_column_names

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T10:50:20.922251332+00:00"}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py", line 48, in test_apply_in_pandas_returning_wrong_column_names
    self.check_apply_in_pandas_returning_wrong_column_names()
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 327, in check_apply_in_pandas_returning_wrong_column_names
    self._test_apply_in_pandas(
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 817, in _test_apply_in_pandas
    df.groupby("id").applyInPandas(f, schema=output_schema).sort("id", "mean").toPandas()
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 488, in groupBy
    _cols.append(self[c])
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1681, in __getitem__
    self.select(item).isLocal()
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1743, in isLocal
    result = self._session.client._analyze(method="is_local", plan=query).is_local
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1131, in _analyze
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T10:50:20.922251332+00:00"}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_array_type_correct

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T11:00:32.382983403+00:00"}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 207, in test_array_type_correct
    result = df.groupby("id").apply(udf).sort("id").toPandas()
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 488, in groupBy
    _cols.append(self[c])
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1681, in __getitem__
    self.select(item).isLocal()
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1743, in isLocal
    result = self._session.client._analyze(method="is_local", plan=query).is_local
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1131, in _analyze
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T11:00:32.382983403+00:00"}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_case_insensitive_grouping_column

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T11:10:44.771364761+00:00"}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 805, in test_case_insensitive_grouping_column
    df = self.spark.createDataFrame([[1, 1]], ["column", "score"])
  File "/__w/spark/spark/python/pyspark/sql/connect/session.py", line 506, in createDataFrame
    _schema = self._inferSchemaFromList(_data, _cols)
  File "/__w/spark/spark/python/pyspark/sql/connect/session.py", line 322, in _inferSchemaFromList
    ) = self._client.get_configs(
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1356, in get_configs
    configs = dict(self.config(op).pairs)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1397, in config
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T11:10:44.771364761+00:00"}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_coerce

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {created_time:"2023-10-26T11:20:57.120875124+00:00", grpc_status:14}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 246, in test_coerce
    result = df.groupby("id").apply(foo).sort("id").toPandas()
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 488, in groupBy
    _cols.append(self[c])
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1681, in __getitem__
    self.select(item).isLocal()
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1743, in isLocal
    result = self._session.client._analyze(method="is_local", plan=query).is_local
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1131, in _analyze
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {created_time:"2023-10-26T11:20:57.120875124+00:00", grpc_status:14}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_column_order

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {created_time:"2023-10-26T11:31:09.225619721+00:00", grpc_status:14}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py", line 45, in test_column_order
    self.check_column_order()
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 566, in check_column_order
    grouped_df = df.groupby("id")
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 488, in groupBy
    _cols.append(self[c])
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1681, in __getitem__
    self.select(item).isLocal()
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1743, in isLocal
    result = self._session.client._analyze(method="is_local", plan=query).is_local
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1131, in _analyze
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {created_time:"2023-10-26T11:31:09.225619721+00:00", grpc_status:14}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_complex_groupby

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T11:41:21.199453189+00:00"}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 259, in test_complex_groupby
    result = df.groupby(col("id") % 2 == 0).apply(normalize).sort("id", "v").toPandas()
  File "/__w/spark/spark/python/pyspark/sql/connect/group.py", line 269, in apply
    return self.applyInPandas(udf.func, schema=udf.returnType)  # type: ignore[attr-defined]
  File "/__w/spark/spark/python/pyspark/sql/connect/group.py", line 290, in applyInPandas
    cols=self._df.columns,
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 244, in columns
    return self.schema.names
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1737, in schema
    return self._session.client.schema(query)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 941, in schema
    schema = self._analyze(method="schema", plan=plan).schema
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1131, in _analyze
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T11:41:21.199453189+00:00"}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_datatype_string

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {created_time:"2023-10-26T11:51:32.488040854+00:00", grpc_status:14}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 399, in test_datatype_string
    result = df.groupby("id").apply(foo_udf).sort("id").toPandas()
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 488, in groupBy
    _cols.append(self[c])
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1681, in __getitem__
    self.select(item).isLocal()
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1743, in isLocal
    result = self._session.client._analyze(method="is_local", plan=query).is_local
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1131, in _analyze
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {created_time:"2023-10-26T11:51:32.488040854+00:00", grpc_status:14}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_decorator

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {created_time:"2023-10-26T12:01:44.940428336+00:00", grpc_status:14}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 237, in test_decorator
    result = df.groupby("id").apply(foo).sort("id").toPandas()
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 488, in groupBy
    _cols.append(self[c])
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1681, in __getitem__
    self.select(item).isLocal()
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1743, in isLocal
    result = self._session.client._analyze(method="is_local", plan=query).is_local
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1131, in _analyze
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {created_time:"2023-10-26T12:01:44.940428336+00:00", grpc_status:14}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_empty_groupby

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T12:11:56.444995246+00:00"}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 274, in test_empty_groupby
    result = df.groupby().apply(normalize).sort("id", "v").toPandas()
  File "/__w/spark/spark/python/pyspark/sql/connect/group.py", line 269, in apply
    return self.applyInPandas(udf.func, schema=udf.returnType)  # type: ignore[attr-defined]
  File "/__w/spark/spark/python/pyspark/sql/connect/group.py", line 290, in applyInPandas
    cols=self._df.columns,
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 244, in columns
    return self.schema.names
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1737, in schema
    return self._session.client.schema(query)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 941, in schema
    schema = self._analyze(method="schema", plan=plan).schema
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1131, in _analyze
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T12:11:56.444995246+00:00"}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_grouped_over_window

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T12:22:08.783852136+00:00"}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 705, in test_grouped_over_window
    df = self.spark.createDataFrame(data, ["id", "group", "ts", "result"])
  File "/__w/spark/spark/python/pyspark/sql/connect/session.py", line 506, in createDataFrame
    _schema = self._inferSchemaFromList(_data, _cols)
  File "/__w/spark/spark/python/pyspark/sql/connect/session.py", line 322, in _inferSchemaFromList
    ) = self._client.get_configs(
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1356, in get_configs
    configs = dict(self.config(op).pairs)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1397, in config
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T12:22:08.783852136+00:00"}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_grouped_over_window_with_key

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {created_time:"2023-10-26T12:32:19.929207937+00:00", grpc_status:14}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 734, in test_grouped_over_window_with_key
    timezone = self.spark.conf.get("spark.sql.session.timeZone")
  File "/__w/spark/spark/python/pyspark/sql/connect/conf.py", line 65, in get
    result = self._client.config(operation)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1397, in config
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {created_time:"2023-10-26T12:32:19.929207937+00:00", grpc_status:14}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_mixed_scalar_udfs_followed_by_groupby_apply

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T12:42:32.313176638+00:00"}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 667, in test_mixed_scalar_udfs_followed_by_groupby_apply
    df = df.withColumn("v2", udf(lambda x: x + 1, "int")(df["v1"])).withColumn(
  File "/__w/spark/spark/python/pyspark/sql/utils.py", line 190, in wrapped
    return getattr(functions, f.__name__)(*args, **kwargs)
  File "/__w/spark/spark/python/pyspark/sql/connect/functions.py", line 3954, in udf
    return _create_py_udf(f=f, returnType=returnType, useArrow=useArrow)
  File "/__w/spark/spark/python/pyspark/sql/connect/udf.py", line 66, in _create_py_udf
    str(session.conf.get("spark.sql.execution.pythonUDF.arrow.enabled")).lower()
  File "/__w/spark/spark/python/pyspark/sql/connect/conf.py", line 65, in get
    result = self._client.config(operation)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1397, in config
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T12:42:32.313176638+00:00"}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_positional_assignment_conf

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T12:52:44.490469464+00:00"}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 635, in test_positional_assignment_conf
    with self.sql_conf(
  File "/usr/lib/python3.9/contextlib.py", line 117, in __enter__
    return next(self.gen)
  File "/__w/spark/spark/python/pyspark/testing/sqlutils.py", line 181, in sql_conf
    old_values = [self.spark.conf.get(key, None) for key in keys]
  File "/__w/spark/spark/python/pyspark/testing/sqlutils.py", line 181, in <listcomp>
    old_values = [self.spark.conf.get(key, None) for key in keys]
  File "/__w/spark/spark/python/pyspark/sql/connect/conf.py", line 65, in get
    result = self._client.config(operation)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1397, in config
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T12:52:44.490469464+00:00"}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_self_join_with_pandas

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T13:02:55.531339579+00:00"}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 654, in test_self_join_with_pandas
    df = self.spark.createDataFrame(
  File "/__w/spark/spark/python/pyspark/sql/connect/session.py", line 506, in createDataFrame
    _schema = self._inferSchemaFromList(_data, _cols)
  File "/__w/spark/spark/python/pyspark/sql/connect/session.py", line 322, in _inferSchemaFromList
    ) = self._client.get_configs(
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1356, in get_configs
    configs = dict(self.config(op).pairs)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1397, in config
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T13:02:55.531339579+00:00"}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_timestamp_dst

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T13:13:07.82324843+00:00"}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 468, in test_timestamp_dst
    df = self.spark.createDataFrame(dt, "timestamp").toDF("time")
  File "/__w/spark/spark/python/pyspark/sql/connect/session.py", line 358, in createDataFrame
    schema = self.client._analyze(  # type: ignore[assignment]
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1131, in _analyze
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-26T13:13:07.82324843+00:00"}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.test_udf_with_key

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {created_time:"2023-10-26T13:23:19.584177139+00:00", grpc_status:14}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_grouped_map.py", line 477, in test_udf_with_key
    pdf = df.toPandas()
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1730, in toPandas
    return self._session.client.to_pandas(query, self._plan.observations)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 845, in to_pandas
    (self_destruct_conf,) = self.get_config_with_defaults(
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1367, in get_config_with_defaults
    configs = dict(self.config(op).pairs)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1397, in config
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1521, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1581, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:40345: Failed to connect to remote host: Connection refused {created_time:"2023-10-26T13:23:19.584177139+00:00", grpc_status:14}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py

python/pyspark/sql/tests/connect/test_parity_pandas_grouped_map.py.tearDownClass (pyspark.sql.tests.connect.test_parity_pandas_grouped_map.GroupedApplyInPandasTests)

[Errno 111] Connection refused
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/testing/connectutils.py", line 194, in tearDownClass
    cls.spark.stop()
  File "/__w/spark/spark/python/pyspark/sql/connect/session.py", line 669, in stop
    PySparkSession._activeSession.stop()
  File "/__w/spark/spark/python/pyspark/sql/session.py", line 1804, in stop
    self._jvm.SparkSession.clearDefaultSession()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1712, in __getattr__
    answer = self._gateway_client.send_command(
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1036, in send_command
    connection = self._get_connection()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 284, in _get_connection
    connection = self._create_new_connection()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 291, in _create_new_connection
    connection.connect_to_java_server()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 438, in connect_to_java_server
    self.socket.connect((self.java_address, self.java_port))
ConnectionRefusedError: [Errno 111] Connection refused