I want to talk about storage integrations in Snowflake, which I find quite powerful for migrations and similar use cases.
A storage integration lets Snowflake connect to several external object stores, such as AWS S3, Google Cloud Storage, and Azure Blob Storage.

To set this up, we first need to create a storage integration in our Snowflake account. But before doing that, we must make sure an IAM role exists with permission to access the bucket.
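For data loading, the role needs read and list access to the bucket (and write/delete access too if you plan to unload data back to S3). A minimal permissions policy might look like this; the bucket name and `data/` prefix are placeholders matching the example below:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:GetObjectVersion"],
      "Resource": "arn:aws:s3:::my-bucket/data/*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::my-bucket",
      "Condition": { "StringLike": { "s3:prefix": ["data/*"] } }
    }
  ]
}
```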

Let’s take S3 as an example.
Once the IAM role is ready, copy its ARN and run the following command in Snowflake to create the storage integration:

CREATE STORAGE INTEGRATION my_s3_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/myRandomSnowflakeRole'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/data/');

Once the command executes successfully, the next step is to allow Snowflake to assume the IAM role on the AWS side.
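Snowflake generates an IAM user ARN and an external ID for the integration. You can look them up like this:

```sql
-- Lists the integration's properties; the two we need for the
-- AWS trust policy are:
--   STORAGE_AWS_IAM_USER_ARN  -> goes into the "AWS" principal
--   STORAGE_AWS_EXTERNAL_ID   -> goes into "sts:ExternalId"
DESC STORAGE INTEGRATION my_s3_integration;
```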

Go to IAM in AWS, select the role you created earlier, open the `Trust Relationships` tab, and choose `Edit trust relationship`.
Then, update the JSON policy by replacing the values of "AWS" and "sts:ExternalId" with the STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID values that Snowflake reports for the integration.
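The resulting trust policy looks roughly like this; the angle-bracketed values are placeholders you substitute from your own integration:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "<STORAGE_AWS_IAM_USER_ARN>"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": { "sts:ExternalId": "<STORAGE_AWS_EXTERNAL_ID>" }
      }
    }
  ]
}
```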

After this setup, we can use the COPY INTO command in Snowflake to load data from the S3 bucket. For example:

COPY INTO {SNOWFLAKE_TABLE}
FROM {S3_FILE_PATH}
STORAGE_INTEGRATION = {STORAGE_INTEGRATION}
FILE_FORMAT = (
  TYPE = CSV
  NULL_IF = ('NULL')
  COMPRESSION = GZIP
  FIELD_DELIMITER = '|'
  ESCAPE = '\\'
  FIELD_OPTIONALLY_ENCLOSED_BY = '"'
);
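Filling the template in with the names from this walkthrough (the target table `my_table` is a hypothetical name), a concrete call might look like:

```sql
-- Load pipe-delimited, gzipped CSV files from the allowed S3 location
-- into a target table, using the integration created earlier.
-- my_table is a hypothetical table name for illustration.
COPY INTO my_table
FROM 's3://my-bucket/data/'
STORAGE_INTEGRATION = my_s3_integration
FILE_FORMAT = (
  TYPE = CSV
  NULL_IF = ('NULL')
  COMPRESSION = GZIP
  FIELD_DELIMITER = '|'
  FIELD_OPTIONALLY_ENCLOSED_BY = '"'
);
```

In practice you would often wrap the integration in a named external stage (CREATE STAGE ... STORAGE_INTEGRATION = ...) and copy from the stage instead, but referencing the integration directly, as above, also works.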