Amazon SageMaker is a fully managed service for data science and machine learning (ML) workflows. You can use Amazon SageMaker to simplify the process of building, training, and deploying ML models.
To train a model, you can include your training script and dependencies in a Docker container that runs your training code. A container provides an effectively isolated environment, ensuring a consistent runtime and reliable training process.
The Snowpark library provides an intuitive API for querying and processing data in a data pipeline. Using the Snowpark library, you can build applications that process data in Snowflake without moving data to the system where your application code runs. You can also automate data transformation and processing by writing stored procedures and scheduling those procedures as tasks in Snowflake.
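As an illustration of the pushdown pattern described above, here is a minimal, hypothetical Snowpark Python sketch (not code from this repository); the table name, warehouse, database, and schema defaults are placeholder assumptions you would replace with your own values.

```python
def build_connection_parameters(account, user, password,
                                database="HOL_DB", schema="PUBLIC",
                                warehouse="COMPUTE_WH"):
    """Assemble the dict that Session.builder.configs() expects.

    The database/schema/warehouse defaults are assumptions for this sketch.
    """
    return {
        "account": account,
        "user": user,
        "password": password,
        "database": database,
        "schema": schema,
        "warehouse": warehouse,
    }

def count_rows(connection_parameters, table_name):
    """Count rows of a Snowflake table without pulling data to the client."""
    # Imported lazily so this module loads even where
    # snowflake-snowpark-python is not installed.
    from snowflake.snowpark import Session

    session = Session.builder.configs(connection_parameters).create()
    # count() is translated to SQL and executed inside Snowflake;
    # only the scalar result is transferred back.
    return session.table(table_name).count()
```

The same `Session` object can also register stored procedures, which you can then schedule as Snowflake tasks for automated transformations.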
This GitHub repository demonstrates how to run SageMaker training jobs that use the Snowpark Python API to fetch data from Snowflake.
The following figure shows the high-level architecture of the proposed solution, which uses Snowflake as a data source and the Snowpark Python API to train ML models with Amazon SageMaker.
- An AWS Account.
- An IAM user with SageMaker and CodeBuild permissions.
- A Snowflake account - you can sign up here.
For the initial setup, we suggest using Cloud9 on an m5.large instance type with 64 GB of storage.
We aim to explain how to create a custom image for Amazon SageMaker Studio that has Snowpark already installed. The advantage of creating an image and making it available to all SageMaker Studio users is that it provides a consistent environment for SageMaker Studio users, which they could also run locally. To create the custom Conda environment for Snowpark, please follow the instructions here.
After you complete this step, the outcome should be a `snowflake-env-kernel` attached to your SageMaker Studio domain.
Secrets Manager enables you to replace hardcoded credentials in your code, including passwords, with an API call to Secrets Manager to retrieve the secret programmatically. This helps ensure the secret can't be compromised by someone examining your code, because the secret no longer exists in the code.
We recommend storing the Snowflake `account`, `user`, and `password` in AWS Secrets Manager.
- Navigate to AWS Secrets Manager on the console and choose **Store new secret**.
- Choose **Other type of secret**, add rows for `account`, `user`, and `password`, and fill in your Snowflake account ID, username, and password.
- Choose **Next**.
- Give the secret a name: `dev/ml/snowflake`.
- Choose **Store**.
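Once stored, the secret can be retrieved programmatically. The following is a sketch using boto3, assuming the secret name `dev/ml/snowflake` created in the steps above; the region is an assumption, so use the region where you stored the secret.

```python
import json

def parse_secret_string(secret_string):
    """A secret stored as key/value rows comes back as a JSON string."""
    return json.loads(secret_string)

def get_snowflake_credentials(secret_name="dev/ml/snowflake",
                              region_name="us-east-1"):
    """Fetch the Snowflake account/user/password from AWS Secrets Manager."""
    # Imported lazily so the pure helper above stays usable without boto3.
    import boto3

    client = boto3.client("secretsmanager", region_name=region_name)
    response = client.get_secret_value(SecretId=secret_name)
    return parse_secret_string(response["SecretString"])
```

Your training code can then call `get_snowflake_credentials()` instead of hardcoding credentials, provided its IAM role is allowed to read the secret.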
Note: Please make sure you have run the Build a custom SageMaker Studio image with Snowpark already installed step, so that the `snowflake-env-kernel` kernel is set up in SageMaker Studio.
Run the Getting Started with Snowpark for Machine Learning on SageMaker workshop to populate the Snowflake tables.
When opening the `0_setup.ipynb` notebook in SageMaker Studio to load HOL data to Snowflake, choose the `snowflake-env-kernel` kernel you created in the previous step.
After you finish running the `0_setup.ipynb` notebook from the Getting Started with Snowpark for Machine Learning on SageMaker workshop, you should have the `HOL_DB.PUBLIC.MAINTENANCE_HUM` table populated.
Alternatively, you can open a terminal in SageMaker Studio (File -> New -> Terminal) and execute:

```
git clone https://github.com/aws-samples/amazon-sagemaker-training-jobs-with-snowflake-and-snowpark
```
- Open the `snowflake_bring_your_own_container_training` notebook; you can choose any kernel. Run the notebook cell by cell and read the instructions.
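For orientation, launching a training job with a custom container generally follows the pattern sketched below using the SageMaker Python SDK. This is a hypothetical illustration, not the notebook's exact code: the ECR image URI, IAM role ARN, instance type, and the `secret-name` hyperparameter are all placeholder assumptions.

```python
def build_estimator(image_uri, role_arn):
    """Configure a SageMaker training job that runs a custom container.

    image_uri and role_arn are placeholders for your ECR image and IAM role.
    """
    # Imported lazily so this sketch loads without the sagemaker SDK.
    import sagemaker

    return sagemaker.estimator.Estimator(
        image_uri=image_uri,
        role=role_arn,
        instance_count=1,
        instance_type="ml.m5.xlarge",  # assumed instance type
        # Hypothetical hyperparameter telling the training script which
        # Secrets Manager secret holds the Snowflake credentials.
        hyperparameters={"secret-name": "dev/ml/snowflake"},
    )
```

Calling `.fit()` on the returned estimator would start the training job; inside the container, the training script can then use the secret to open a Snowpark session and fetch its data directly from Snowflake.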
See CONTRIBUTING for more information.
This library is licensed under the MIT-0 License. See the LICENSE file.