
Scripting a Snowflake Stage for AWS

  • Writer: Tim Burns
  • Oct 8, 2021
  • 1 min read

Scripting a Snowflake Stage for AWS is straightforward, but it's easy to get tripped up on the details.


1. Create a Shared Environment File to Store the Important Details (Keep Private)


export EXPORT_BUCKET=<my bucket>
export EXPORT_STAGE="stage/kexp"
export EXPORT_BUCKET_ARN=arn:aws:s3:::$EXPORT_BUCKET
export STAGE_LOCATION_ARN=$EXPORT_BUCKET_ARN/stage

export SNOWFLAKE_INTEGRATION_ROLE=<the IAM role Snowflake will assume>
export TRUSTED_ENTITY=<your Snowflake account's IAM user ARN>
export TRUSTED_CONDITION=<the storage integration's external ID>
export S3_STORAGE_INTEGRATION=<the name of the Snowflake storage integration>
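
Source this file before running the make targets below (e.g. source snowflake.env; the file name is whatever you chose) so every step sees the same values, and keep it out of version control, since it will eventually hold the trust credentials.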

2. Build out the role in IAM with a CloudFormation template

aws_cloudformation$ make deploy-stage-role
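
If the make target wraps aws cloudformation deploy, the underlying call looks roughly like this; the template, stack, and parameter names here are assumptions, not the actual Makefile contents:

# Sketch only: template/stack/parameter names are assumptions.
aws cloudformation deploy \
    --template-file snowflake-stage-role.yaml \
    --stack-name snowflake-stage-role \
    --capabilities CAPABILITY_NAMED_IAM \
    --parameter-overrides \
        RoleName=$SNOWFLAKE_INTEGRATION_ROLE \
        StageLocationArn=$STAGE_LOCATION_ARN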

3. Build the Storage Integration as a foundational layer and save the results

snowflake/ddl$ make create-storage-integration
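
Assuming the target runs its DDL through SnowSQL, the statements look roughly like this; the account-id placeholder and the IF NOT EXISTS choice are mine, not necessarily the Makefile's:

# Sketch: create the integration, then print the values needed in step 4.
snowsql -q "
CREATE STORAGE INTEGRATION IF NOT EXISTS $S3_STORAGE_INTEGRATION
    TYPE = EXTERNAL_STAGE
    STORAGE_PROVIDER = 'S3'
    ENABLED = TRUE
    STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::<account id>:role/$SNOWFLAKE_INTEGRATION_ROLE'
    STORAGE_ALLOWED_LOCATIONS = ('s3://$EXPORT_BUCKET/stage/');
DESC INTEGRATION $S3_STORAGE_INTEGRATION;
"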

4. Fill in the shared environment file with the output from building the integration
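
Two properties from the DESC INTEGRATION output map straight into the shared environment file (the property names are Snowflake's):

# STORAGE_AWS_IAM_USER_ARN -> the Snowflake account's IAM user
# STORAGE_AWS_EXTERNAL_ID  -> the storage integration identifier
export TRUSTED_ENTITY=<value of STORAGE_AWS_IAM_USER_ARN>
export TRUSTED_CONDITION=<value of STORAGE_AWS_EXTERNAL_ID>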


5. Use the STORAGE_AWS_EXTERNAL_ID as the trust condition when redeploying the role

aws_cloudformation$ make deploy-stage-role
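
What this second pass changes is the role's trust policy: Snowflake's IAM user may assume the role only when it presents the external ID. A minimal sketch of the document the template needs to render, shown here as a standalone heredoc (the Makefile produces it through CloudFormation):

# Sketch: the trust relationship the deployed role should end up with.
cat > trust-policy.json <<EOF
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "AWS": "$TRUSTED_ENTITY" },
    "Action": "sts:AssumeRole",
    "Condition": { "StringEquals": { "sts:ExternalId": "$TRUSTED_CONDITION" } }
  }]
}
EOF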

6. Create the stage object

snowflake/ddl$ make create-kexp-stage
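
Again assuming a SnowSQL wrapper, and a stage name guessed from the make target, the DDL is roughly:

# Sketch: create the stage, then list what's already in it.
snowsql -q "
CREATE OR REPLACE STAGE kexp_stage
    URL = 's3://$EXPORT_BUCKET/$EXPORT_STAGE'
    STORAGE_INTEGRATION = $S3_STORAGE_INTEGRATION;
LIST @kexp_stage;
"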

The end result should be a list of all the files that the ongoing Step Functions job has loaded into the stage.
