
[MAINTENANCE] Unpin snowflake-sqlalchemy #10838

Open
wants to merge 15 commits into develop

Conversation


@NathanFarmer NathanFarmer commented Jan 9, 2025

Unpin snowflake-sqlalchemy

The broken release (1.7.0) was not yanked upstream, so we exclude it explicitly.

There are a few other breaking changes, but they only meaningfully affect tests.
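
For context, a version specifier can keep the existing lower bound while carving out a single broken release with `!=`. A minimal sketch of that behavior using the packaging library; the constraint string here is an assumption for illustration, not the literal line added in this PR's diff:

    # Sketch: how an "exclude one release" specifier behaves (packaging library).
    # The constraint string is assumed, not copied from this PR.
    from packaging.specifiers import SpecifierSet

    spec = SpecifierSet(">=1.2.3,!=1.7.0")
    assert "1.7.0" not in spec  # the broken release is excluded
    assert "1.6.1" in spec      # older releases still satisfy the range
    assert "1.7.1" in spec      # later, fixed releases are allowed again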

  • Description of PR changes above includes a link to an existing GitHub issue
  • PR title is prefixed with one of: [BUGFIX], [FEATURE], [DOCS], [MAINTENANCE], [CONTRIB]
  • Code is linted - run invoke lint (uses ruff format + ruff check)
  • Appropriate tests and docs have been updated

@NathanFarmer NathanFarmer self-assigned this Jan 9, 2025
@NathanFarmer NathanFarmer requested a review from a team as a code owner January 9, 2025 19:40

netlify bot commented Jan 9, 2025

Deploy Preview for niobium-lead-7998 ready!

🔨 Latest commit: 543aeaa
🔍 Latest deploy log: https://app.netlify.com/sites/niobium-lead-7998/deploys/67883e9a01905900088ab8b1
😎 Deploy Preview: https://deploy-preview-10838.docs.greatexpectations.io

@@ -1,4 +1,4 @@
pandas<2.2.0; python_version >= "3.9"
snowflake-connector-python>=2.5.0; python_version < "3.11"
snowflake-connector-python>2.9.0; python_version >= "3.11" # earlier versions fail to build on 3.11
snowflake-sqlalchemy>=1.2.3,<1.7.0 # pinned due to breaking in 1.7
Contributor Author


They didn't yank the broken version


codecov bot commented Jan 9, 2025

❌ 20 Tests Failed:

Tests completed: 21219 | Failed: 20 | Passed: 21199 | Skipped: 5858
View the top 3 failed tests by shortest run time
tests.data_context.test_data_context_in_code_config::test_DataContext_construct_data_context_id_uses_id_of_currently_configured_expectations_store
Stack Traces | 0.152s run time
aws_credentials = None

    @pytest.mark.aws_deps
    @mock_s3
    def test_DataContext_construct_data_context_id_uses_id_of_currently_configured_expectations_store(
        aws_credentials,
    ):
        """
        What does this test and why?
    
        A DataContext should have an id. This ID should come from either:
        1. configured expectations store store_backend_id
        2. great_expectations.yml
        3. new generated id from DataContextConfig
        This test verifies that DataContext._construct_data_context_id
        uses the store_backend_id from the currently configured expectations store
        when instantiating the DataContext
        """
    
        store_backend_id_filename = StoreBackend.STORE_BACKEND_ID_KEY[0]
        bucket = "leakybucket"
        expectations_store_prefix = "expectations_store_prefix"
        validation_results_store_prefix = "validation_results_store_prefix"
        data_docs_store_prefix = "data_docs_store_prefix"
        data_context_prefix = ""
    
        # Create a bucket in Moto's mock AWS environment
        conn = boto3.resource("s3", region_name="us-east-1")
        conn.create_bucket(Bucket=bucket)
    
        # Create a TupleS3StoreBackend
        # Initialize without store_backend_id and check that the store_backend_id is generated correctly
        s3_expectations_store_backend = TupleS3StoreBackend(
            filepath_template="my_file_{0}",
            bucket=bucket,
            prefix=expectations_store_prefix,
        )
        # Make sure store_backend_id is not the error string
        store_error_uuid = uuid.UUID("00000000-0000-0000-0000-00000000e003")
        s3_expectations_store_backend_id = s3_expectations_store_backend.store_backend_id
>       assert s3_expectations_store_backend_id != store_error_uuid
E       AssertionError: assert UUID('00000000-0000-0000-0000-00000000e003') != UUID('00000000-0000-0000-0000-00000000e003')

tests/data_context/test_data_context_in_code_config.py:160: AssertionError
tests.data_context.store.test_store_backends::test_TupleS3StoreBackend_with_s3_put_options
Stack Traces | 0.166s run time
aws_credentials = None

    @mock_s3
    @pytest.mark.aws_deps
    def test_TupleS3StoreBackend_with_s3_put_options(aws_credentials):
        bucket = "leakybucket"
        conn = boto3.client("s3", region_name="us-east-1")
        conn.create_bucket(Bucket=bucket)
    
        my_store = TupleS3StoreBackend(
            bucket=bucket,
            # Since not all out options are supported in moto, only Metadata and StorageClass is passed here.  # noqa: E501 # FIXME CoP
            s3_put_options={
                "Metadata": {"test": "testMetadata"},
                "StorageClass": "REDUCED_REDUNDANCY",
            },
        )
    
        assert my_store.config["s3_put_options"] == {
            "Metadata": {"test": "testMetadata"},
            "StorageClass": "REDUCED_REDUNDANCY",
        }
    
        my_store.set(("AAA",), "aaa")
    
        res = conn.get_object(Bucket=bucket, Key="AAA")
    
        assert res["Metadata"] == {"test": "testMetadata"}
        assert res["StorageClass"] == "REDUCED_REDUNDANCY"
    
>       assert my_store.get(("AAA",)) == "aaa"

.../data_context/store/test_store_backends.py:838: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.../data_context/store/_store_backend.py:121: in get
    value = self._get(key, **kwargs)
.../data_context/store/tuple_store_backend.py:545: in _get
    return self._get_by_s3_object_key(client, s3_object_key)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <great_expectations.data_context.store.tuple_store_backend.TupleS3StoreBackend object at 0x7f756ad3b490>
s3_client = <botocore.client.S3 object at 0x7f756ae5d520>, s3_object_key = 'AAA'

    def _get_by_s3_object_key(self, s3_client, s3_object_key):
        try:
            s3_response_object = s3_client.get_object(Bucket=self.bucket, Key=s3_object_key)
        except (s3_client.exceptions.NoSuchKey, s3_client.exceptions.NoSuchBucket) as e:
            raise InvalidKeyError(  # noqa: TRY003 # FIXME CoP
                f"Unable to retrieve object from TupleS3StoreBackend with the following Key: {s3_object_key!s}"  # noqa: E501 # FIXME CoP
            ) from e
    
        return (
>           s3_response_object["Body"]
            .read()
            .decode(s3_response_object.get("ContentEncoding", "utf-8"))
        )
E       LookupError: unknown encoding: utf-8,aws-chunked

.../data_context/store/tuple_store_backend.py:569: LookupError
tests.data_context.store.test_store_backends::test_TupleS3StoreBackend_with_prefix
Stack Traces | 0.192s run time
aws_credentials = None

    @mock_s3
    @pytest.mark.aws_deps
    def test_TupleS3StoreBackend_with_prefix(aws_credentials):
        """
        What does this test test and why?
    
        We will exercise the store backend's set method twice and then verify
        that calling the get and list methods returns the expected keys.
    
        We will also check that the objects are stored on S3 at the expected location,
        and that the correct S3 URL for the object can be retrieved.
    
        """
        bucket = "leakybucket"
        prefix = "this_is_a_test_prefix"
        base_public_path = "http://www.test.com/"
    
        # create a bucket in Moto's mock AWS environment
        conn = boto3.resource("s3", region_name="us-east-1")
        conn.create_bucket(Bucket=bucket)
    
        my_store = TupleS3StoreBackend(
            filepath_template="my_file_{0}",
            bucket=bucket,
            prefix=prefix,
        )
    
        # We should be able to list keys, even when empty
        keys = my_store.list_keys()
        assert len(keys) == 1
    
        my_store.set(("AAA",), "aaa", content_type="text/html; charset=utf-8")
>       assert my_store.get(("AAA",)) == "aaa"

.../data_context/store/test_store_backends.py:491: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.../data_context/store/_store_backend.py:121: in get
    value = self._get(key, **kwargs)
.../data_context/store/tuple_store_backend.py:545: in _get
    return self._get_by_s3_object_key(client, s3_object_key)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <great_expectations.data_context.store.tuple_store_backend.TupleS3StoreBackend object at 0x7f756c6dcd90>
s3_client = <botocore.client.S3 object at 0x7f756c6f1fa0>
s3_object_key = 'this_is_a_test_prefix/my_file_AAA'

    def _get_by_s3_object_key(self, s3_client, s3_object_key):
        try:
            s3_response_object = s3_client.get_object(Bucket=self.bucket, Key=s3_object_key)
        except (s3_client.exceptions.NoSuchKey, s3_client.exceptions.NoSuchBucket) as e:
            raise InvalidKeyError(  # noqa: TRY003 # FIXME CoP
                f"Unable to retrieve object from TupleS3StoreBackend with the following Key: {s3_object_key!s}"  # noqa: E501 # FIXME CoP
            ) from e
    
        return (
>           s3_response_object["Body"]
            .read()
            .decode(s3_response_object.get("ContentEncoding", "utf-8"))
        )
E       LookupError: unknown encoding: utf-8,aws-chunked

.../data_context/store/tuple_store_backend.py:569: LookupError
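
Both S3-backed failures above fail in the same place: the mocked GetObject response now reports ContentEncoding as "utf-8,aws-chunked", which bytes.decode() cannot resolve as a codec. This is not something this PR changes, but as a minimal sketch of how a reader could tolerate such compound headers (pick_text_encoding is a hypothetical helper, not part of the codebase):

    import codecs
    from typing import Optional

    def pick_text_encoding(content_encoding: Optional[str], default: str = "utf-8") -> str:
        """Return the first entry of a comma-separated ContentEncoding header that
        the Python codec registry recognizes, falling back to a default."""
        for candidate in (content_encoding or "").split(","):
            candidate = candidate.strip()
            try:
                codecs.lookup(candidate)
                return candidate
            except LookupError:
                continue
        return default

    assert pick_text_encoding("utf-8,aws-chunked") == "utf-8"  # drops the transfer coding
    assert pick_text_encoding(None) == "utf-8"                 # missing header -> default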


@@ -86,7 +86,7 @@ def test_queryable_asset_should_pass_test_connection(
)

inspector: Inspector = sa.inspection.inspect(snowflake_ds.get_engine())
inspector_tables = inspector.get_table_names()
Contributor Author

@NathanFarmer NathanFarmer Jan 10, 2025


This call started returning dict_keys instead of a list. We only call it in tests. Open issue here: snowflakedb/snowflake-sqlalchemy#566
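
For readers unfamiliar with the difference: a dict_keys view compares unequal to a list and lacks list methods, so assertions written against the old list return type break. A minimal, Snowflake-free illustration of why the test code needs a list() wrapper (the dict below is a stand-in, not the real inspector call):

    # Stand-in for the new dict_keys return value of inspector.get_table_names()
    names_view = {"table_a": None, "table_b": None}.keys()

    assert names_view != ["table_a", "table_b"]        # a keys view never equals a list
    assert list(names_view) == ["table_a", "table_b"]  # list() restores the old semantics
    assert "table_a" in names_view                     # membership checks work either way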

@@ -21,11 +21,14 @@

@pytest.mark.snowflake
class TestSnowflake:
    @pytest.mark.xfail(
        raises=sa.exc.ProgrammingError
    )  # inspector.get_table_names() fails with this role
Contributor Author


I'm looking into how to reproduce this for a snowflake-sqlalchemy GitHub issue, but in the meantime, xfailing this test allows us to unpin.

@NathanFarmer NathanFarmer added this pull request to the merge queue Jan 15, 2025
@github-merge-queue github-merge-queue bot removed this pull request from the merge queue due to failed status checks Jan 15, 2025
@NathanFarmer NathanFarmer added this pull request to the merge queue Jan 15, 2025
@github-merge-queue github-merge-queue bot removed this pull request from the merge queue due to failed status checks Jan 15, 2025
@NathanFarmer NathanFarmer added this pull request to the merge queue Jan 15, 2025
@github-merge-queue github-merge-queue bot removed this pull request from the merge queue due to failed status checks Jan 15, 2025