[SPARK-50814][DOCS] Remove unused SQL error pages
### What changes were proposed in this pull request?

Remove standalone SQL error pages that were made obsolete by the work completed in apache#44971.

Also fix the formatting of the error message for `QUERY_ONLY_CORRUPT_RECORD_COLUMN`, which was incorrect and overflowed the table cell it belongs to.

### Why are the changes needed?

These error pages are either already fully captured in `common/utils/src/main/resources/error/error-conditions.json`, or are obsolete and no longer needed (and are not rendered in the documentation output anyway).

The formatting of `QUERY_ONLY_CORRUPT_RECORD_COLUMN` before and after:

<img src="https://github.com/user-attachments/assets/476c57e0-dfa5-403e-8a7d-2d05301eb7a3" width=650 />
<img src="https://github.com/user-attachments/assets/106d5bca-6569-488c-9b9c-1a27345fc7a8" width=450 />

### Does this PR introduce _any_ user-facing change?

Yes, documentation formatting.

### How was this patch tested?

Built the docs locally and reviewed them in my browser.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes apache#49486 from nchammas/SPARK-50814-unused-error-docs.

Authored-by: Nicholas Chammas <[email protected]>
Signed-off-by: Max Gekk <[email protected]>
nchammas authored and MaxGekk committed Jan 15, 2025
1 parent 47d831e commit 21a37a7
Showing 14 changed files with 5 additions and 599 deletions.
10 changes: 5 additions & 5 deletions common/utils/src/main/resources/error/error-conditions.json
@@ -5485,12 +5485,12 @@
       "message" : [
         "Queries from raw JSON/CSV/XML files are disallowed when the",
         "referenced columns only include the internal corrupt record column",
-        "(named _corrupt_record by default). For example:",
-        "spark.read.schema(schema).json(file).filter($\"_corrupt_record\".isNotNull).count()",
-        "and spark.read.schema(schema).json(file).select(\"_corrupt_record\").show().",
+        "(named `_corrupt_record` by default). For example:",
+        "`spark.read.schema(schema).json(file).filter($\"_corrupt_record\".isNotNull).count()`",
+        "and `spark.read.schema(schema).json(file).select(\"_corrupt_record\").show()`.",
         "Instead, you can cache or save the parsed results and then send the same query.",
-        "For example, val df = spark.read.schema(schema).json(file).cache() and then",
-        "df.filter($\"_corrupt_record\".isNotNull).count()."
+        "For example, `val df = spark.read.schema(schema).json(file).cache()` and then",
+        "`df.filter($\"_corrupt_record\".isNotNull).count()`."
       ]
     },
     "REMOVE_NAMESPACE_COMMENT" : {
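For readers of the message above, here is a minimal Scala sketch of the query pattern that triggers `QUERY_ONLY_CORRUPT_RECORD_COLUMN` and the cache-based workaround the message recommends. The input path, schema, and object name are illustrative assumptions and are not part of this commit.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.{StringType, StructField, StructType}

object CorruptRecordSketch {
  def main(args: Array[String]): Unit = {
    // Local mode so the sketch is self-contained.
    val spark = SparkSession.builder()
      .appName("corrupt-record-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Illustrative schema: one data column plus the internal corrupt record column.
    val schema = StructType(Seq(
      StructField("name", StringType),
      StructField("_corrupt_record", StringType)
    ))
    val file = "/tmp/people.json" // hypothetical input path

    // Disallowed form: the query references only the internal corrupt record
    // column, which is what raises QUERY_ONLY_CORRUPT_RECORD_COLUMN.
    // spark.read.schema(schema).json(file).filter($"_corrupt_record".isNotNull).count()

    // Workaround from the error message: cache the parsed result, then query it.
    val df = spark.read.schema(schema).json(file).cache()
    val corruptCount = df.filter($"_corrupt_record".isNotNull).count()
    println(s"Corrupt records: $corruptCount")

    spark.stop()
  }
}
```

The commented-out line is the disallowed form; caching first materializes all parsed columns, so the subsequent filter on `_corrupt_record` succeeds.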
41 changes: 0 additions & 41 deletions docs/sql-error-conditions-codec-not-available-error-class.md (this file was deleted)
41 changes: 0 additions & 41 deletions docs/sql-error-conditions-collation-mismatch-error-class.md (this file was deleted)
52 changes: 0 additions & 52 deletions docs/sql-error-conditions-failed-read-file-error-class.md (this file was deleted)
41 changes: 0 additions & 41 deletions docs/sql-error-conditions-illegal-state-store-value-error-class.md (this file was deleted)
49 changes: 0 additions & 49 deletions docs/sql-error-conditions-invalid-aggregate-filter-error-class.md (this file was deleted)
41 changes: 0 additions & 41 deletions docs/sql-error-conditions-invalid-conf-value-error-class.md (this file was deleted)
41 changes: 0 additions & 41 deletions docs/sql-error-conditions-invalid-datetime-pattern-error-class.md (this file was deleted)
49 changes: 0 additions & 49 deletions docs/sql-error-conditions-invalid-delimiter-value-error-class.md (this file was deleted)
