From f43c835ec98185ab85aa5a11182ea482566aa9d8 Mon Sep 17 00:00:00 2001
From: semyonsinchenko
Date: Tue, 29 Oct 2024 22:01:25 +0100
Subject: [PATCH] Fixes from comments

---
 README.md                | 2 +-
 docs/dev/contributing.md | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 49f9d89..cebc6ba 100644
--- a/README.md
+++ b/README.md
@@ -97,7 +97,7 @@ poetry env use python3.10 # any version bigger than 3.10 should work
 poetry install --with dev # that install tsumugi as well as jupyter notebooks and pyspark[connect]
 ```
 
-Now you can run jupyter and try the example notebook (`tsumugi-python/examples/basic_example.ipynb`): [Notebook](https://github.com/SemyonSinchenko/tsumugi-spark/blob/main/tsumugi-python/examples/basic_example.ipynb)
+Now you can run jupyter and try the example notebook (`tsumugi-python/examples/basic_example.ipynb`): [Notebook](https://github.com/mrpowers-io/tsumugi-spark/blob/main/docs/notebooks/basic_example.ipynb)
 
 ### Server
 
diff --git a/docs/dev/contributing.md b/docs/dev/contributing.md
index 0bfb252..6593d87 100644
--- a/docs/dev/contributing.md
+++ b/docs/dev/contributing.md
@@ -106,7 +106,7 @@ To maintain a consistent style across all Python code, the following rules shoul
 
 1. All classes that wrap code generated from protobuf messages should be implemented as dataclasses, unless there is a compelling reason not to do so.
 2. Each of these classes should have a private method `_to_proto(self) -> "protobuf class"` that converts the dataclass to the proto-serializable class.
 
-## Runnign examples or testing clients
+## Running examples or testing clients
 
 To simplify testing and development, there is a script that builds a server plugin, downloads and unpacks the Spark distribution, and runs the Spark Connect Server with all the necessary configurations. To run it, use `make run_spark_server`. After that, the server will be available at `sc://localhost:15002`.
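
For context on the docs touched by this patch: once `make run_spark_server` is running, a client can connect to the `sc://localhost:15002` endpoint mentioned in the contributing guide. A minimal sketch, assuming `pyspark[connect]` is installed from the `--with dev` dependency group (the `make_session` helper name is hypothetical, not part of the project; the session is not created here so the sketch can be read without a live server):

```python
# Endpoint started by `make run_spark_server` (from the docs above).
SPARK_CONNECT_URL = "sc://localhost:15002"


def make_session(url: str = SPARK_CONNECT_URL):
    """Build a Spark Connect session against the local dev server."""
    # Imported lazily so this sketch can be read without pyspark installed.
    from pyspark.sql import SparkSession

    # `SparkSession.builder.remote(...)` is the standard Spark Connect
    # client entry point in PySpark.
    return SparkSession.builder.remote(url).getOrCreate()
```

Calling `make_session()` with the server up returns a session usable by the example notebook linked in the README change above.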