Snowflake (Edge) Setup Guide

Snowflake offers a cloud-based data storage and analytics service, generally termed data warehouse-as-a-service. Companies can use it to store and analyze data using cloud-based hardware and software.

Snowflake automatically provides one data warehouse when you create an account. Each data warehouse can contain one or more databases, although this is not mandatory.

The data from your Pipeline is staged in Hevo’s S3 bucket before being loaded into your Snowflake warehouse.

The Snowflake data warehouse may be hosted on any of the following cloud providers:

  • Amazon Web Services (AWS)

  • Google Cloud Platform (GCP)

  • Microsoft Azure (Azure)

To connect your Snowflake instance to Hevo, you can either use a private link that directly connects to your cloud provider through a Virtual Private Cloud (VPC) or connect via a public network using the Snowflake account URL.

A private link enables communication and network traffic to remain exclusively within the cloud provider’s private network while maintaining direct and secure access across VPCs. It allows you to transfer data to Snowflake without going through the public internet or using proxies to connect Snowflake to your network. Note that even with a private link, the public endpoint is still accessible, and Hevo uses that to connect to your database cluster.

Please reach out to Hevo Support to retrieve the private link for your cloud provider.
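
For reference, a public-network connection uses your Snowflake account identifier, that is, the part of the account URL before .snowflakecomputing.com. The following is a minimal sketch using the snowflake-connector-python package to verify credentials and reachability outside of Hevo; the account, user, warehouse, and database values are placeholders.

```python
import snowflake.connector

# Minimal public-network connection check (all values are placeholders, not Hevo's internal logic).
# The account identifier is the portion of your account URL before ".snowflakecomputing.com",
# for example "xy12345.us-east-1" from https://xy12345.us-east-1.snowflakecomputing.com.
conn = snowflake.connector.connect(
    account="xy12345.us-east-1",   # placeholder account identifier
    user="HEVO_USER",              # placeholder database user
    password="********",           # placeholder password
    warehouse="HEVO_WH",           # placeholder warehouse
    database="HEVO_DB",            # placeholder database
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT CURRENT_REGION(), CURRENT_WAREHOUSE()")
    print(cur.fetchone())
finally:
    conn.close()
```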


Modifying Snowflake Destination Configuration in Edge

You can modify some settings of your Snowflake Edge Destination after its creation. However, any configuration changes will affect all the Pipelines using that Destination.

To modify the configuration of your Snowflake Destination in Edge:

  1. In the detailed view of your Destination, do one of the following:

    • Click the More icon to access the Destination Actions menu, and then click Edit Destination.

      Access Edit Destination

    • In the Destination Configuration section, click EDIT.

      Access Edit Destination

  2. On the <Your Destination Name> editing page:

    Edit Destination Configuration

    • You can specify a new name for your Destination, not exceeding 255 characters.

    • In the Authentication section, you can modify your Authentication type and update the necessary fields based on the selected type:

      • Key Pair:

        • Private Key: Click the attach icon to upload your encrypted or non-encrypted private key file. As the database user configured in your Destination cannot be changed, ensure that the public key corresponding to the uploaded private key is assigned to that user (a key pair generation sketch follows these steps).

        • Passphrase: Click Change to clear the field. If you uploaded an encrypted private key, provide the passphrase used to encrypt it; otherwise, leave the field blank.

      • Access Credentials:

        Change Database Password

        • Database Password: Click Change to update the password for the user configured in your Destination.
  3. Click TEST & SAVE to check the connection to your Snowflake Destination and then save the modified configuration.
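
If you use Key Pair authentication, Snowflake expects an RSA key pair: the encrypted (or non-encrypted) private key file is uploaded in Hevo, and the matching public key is assigned to the database user in Snowflake. The sketch below is one way to generate such a pair; it assumes the Python cryptography package, and the user name, passphrase, and file name are placeholders. The passphrase used here is what you would enter in the Passphrase field.

```python
import base64

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# Generate a 2048-bit RSA key pair for Snowflake key pair authentication.
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Encrypted PKCS#8 private key -- this is the file you upload in the Private Key field.
passphrase = b"my-strong-passphrase"  # placeholder; the same value goes in the Passphrase field
with open("rsa_key.p8", "wb") as f:
    f.write(
        key.private_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PrivateFormat.PKCS8,
            encryption_algorithm=serialization.BestAvailableEncryption(passphrase),
        )
    )

# Base64-encoded public key -- assign it to the database user configured in your Destination,
# for example (user name is hypothetical):
#   ALTER USER HEVO_USER SET RSA_PUBLIC_KEY='<output of the print below>';
public_der = key.public_key().public_bytes(
    encoding=serialization.Encoding.DER,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)
print(base64.b64encode(public_der).decode())
```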


Data Type Mapping

Hevo maps a Source data type internally to a unified data type, referred to as the Hevo Data Type in the table below. This data type is used to represent the Source data from all supported data types in a lossless manner. The Hevo data types are then mapped to the corresponding data types that are supported in each Destination.

Hevo Data Type                        Snowflake Data Type
ARRAY                                 ARRAY
BOOLEAN                               BOOLEAN
BYTEARRAY                             BINARY
BYTE                                  BYTEINT
DATE                                  DATE
DATETIME, TIMESTAMP                   TIMESTAMP_NTZ
DECIMAL                               NUMBER
DOUBLE                                DOUBLE
FLOAT                                 FLOAT
INTEGER                               INTEGER
JSON                                  VARIANT
LONG                                  BIGINT
SHORT                                 SMALLINT
TIME                                  TIME
TIMESTAMPTZ, TIMETZ, ZONEDDATETIME    TIMESTAMP_TZ
VARCHAR                               VARCHAR
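
The table above can be read as a simple lookup from Hevo data types to Snowflake data types. The dictionary below is only an illustrative restatement of the mapping, not Hevo's internal implementation.

```python
# Illustrative restatement of the Hevo -> Snowflake type mapping (not Hevo's internal code).
HEVO_TO_SNOWFLAKE = {
    "ARRAY": "ARRAY",
    "BOOLEAN": "BOOLEAN",
    "BYTEARRAY": "BINARY",
    "BYTE": "BYTEINT",
    "DATE": "DATE",
    "DATETIME": "TIMESTAMP_NTZ",
    "TIMESTAMP": "TIMESTAMP_NTZ",
    "DECIMAL": "NUMBER",
    "DOUBLE": "DOUBLE",
    "FLOAT": "FLOAT",
    "INTEGER": "INTEGER",
    "JSON": "VARIANT",
    "LONG": "BIGINT",
    "SHORT": "SMALLINT",
    "TIME": "TIME",
    "TIMESTAMPTZ": "TIMESTAMP_TZ",
    "TIMETZ": "TIMESTAMP_TZ",
    "ZONEDDATETIME": "TIMESTAMP_TZ",
    "VARCHAR": "VARCHAR",
}

print(HEVO_TO_SNOWFLAKE["DATETIME"])  # TIMESTAMP_NTZ
```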

Destination Considerations

  • Snowflake converts the Source table and column names to uppercase while mapping to the Destination table. For example, the Source table, Table_namE_05, is converted to TABLE_NAME_05. The same convention applies to column names.
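
Because of this, you query a replicated table in Snowflake by its uppercase name (or with any unquoted identifier, which Snowflake resolves to uppercase). A minimal illustration of the naming conversion, using plain Python; the helper name is hypothetical:

```python
def destination_identifier(source_name: str) -> str:
    # Source identifiers are mapped to uppercase Snowflake identifiers.
    return source_name.upper()

assert destination_identifier("Table_namE_05") == "TABLE_NAME_05"
```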

Limitations

  • Hevo replicates a maximum of 4096 columns to each Snowflake table, of which six are Hevo-reserved metadata columns used during data replication. Therefore, your Pipeline can replicate up to 4090 (4096-6) columns for each table.

  • If a Source object has a column value exceeding 16 MB, Hevo marks the Events as failed during ingestion, as Snowflake allows a maximum column value size of 16 MB.
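
If you want to catch such records before ingestion, a pre-check on the Source side can flag Events that would exceed these limits. The following is a rough sketch under the limits stated above; the record format and helper name are hypothetical.

```python
MAX_COLUMNS = 4090                   # 4096 minus the six Hevo-reserved metadata columns
MAX_VALUE_BYTES = 16 * 1024 * 1024   # Snowflake's 16 MB column value limit

def check_record(record: dict) -> list[str]:
    """Return a list of problems that would cause the Event to fail (hypothetical helper)."""
    problems = []
    if len(record) > MAX_COLUMNS:
        problems.append(f"too many columns: {len(record)} > {MAX_COLUMNS}")
    for column, value in record.items():
        size = len(str(value).encode("utf-8"))
        if size > MAX_VALUE_BYTES:
            problems.append(f"column {column!r} value is {size} bytes (> 16 MB)")
    return problems

print(check_record({"id": 1, "payload": "x" * 10}))  # [] -> safe to ingest
```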

Last updated on Feb 05, 2025
