
Write spark StringType columns as Utf8 YTsaurus type #21

Open
alextokarew opened this issue Sep 13, 2024 · 2 comments

@alextokarew (Collaborator)
We need to support writing Spark DataFrames with StringType columns to YTsaurus tables that have Utf8-typed columns. Two ways should be supported (a short usage sketch follows the list):

  1. df.write.option("string_to_utf8", "true").yt("//path/to/table") to convert all string columns to Utf8 on write;
  2. import yt.type_info as ti; df.write.schema_hint({"some_string_column": ti.Utf8}) to convert only the explicitly specified columns to Utf8.
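
A rough sketch of how the two write paths might look from PySpark. It assumes `spark` is an already configured SPYT-enabled SparkSession (session setup is outside the scope of this issue); the table paths under `//tmp` are placeholders, and the `string_to_utf8` option and `schema_hint` call are taken from the proposal above, not from a released API.

```python
import yt.type_info as ti

# Assumes `spark` is an existing SPYT-enabled SparkSession.
df = spark.createDataFrame([("foo",), ("bar",)], ["some_string_column"])

# Way 1: convert every StringType column to Utf8 on write.
df.write.option("string_to_utf8", "true").yt("//tmp/utf8_all_columns")

# Way 2: convert only the explicitly listed columns via a schema hint.
df.write.schema_hint({"some_string_column": ti.Utf8}).yt("//tmp/utf8_one_column")
```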
@alextokarew (Collaborator, Author)

@faucct I think we've implemented it. Please link the appropriate commit to this issue and close it.

@faucct (Collaborator) commented Nov 14, 2024

#26
