
Redshift

Local/Built-in Scheduler

Engine Adapter Type: redshift

Installation

pip install "sqlmesh[redshift]"

Connection options

| Option | Description | Type | Required |
|--------|-------------|:----:|:--------:|
| `type` | Engine type name - must be `redshift` | string | Y |
| `user` | The username to use for authentication with the Amazon Redshift cluster | string | N |
| `password` | The password to use for authentication with the Amazon Redshift cluster | string | N |
| `database` | The name of the database instance to connect to | string | N |
| `host` | The hostname of the Amazon Redshift cluster | string | N |
| `port` | The port number of the Amazon Redshift cluster | int | N |
| `ssl` | Whether SSL is enabled. SSL must be enabled when authenticating using IAM (default: `True`) | bool | N |
| `sslmode` | The security of the connection to the Amazon Redshift cluster. `verify-ca` and `verify-full` are supported. | string | N |
| `timeout` | The number of seconds before the connection to the server times out | int | N |
| `tcp_keepalive` | Whether TCP keepalive is used (default: `True`) | bool | N |
| `application_name` | The name of the application | string | N |
| `preferred_role` | The IAM role preferred for the current connection | string | N |
| `principal_arn` | The ARN of the IAM entity (user or role) for which you are generating a policy | string | N |
| `credentials_provider` | The class name of the IdP that will be used for authenticating with the Amazon Redshift cluster | string | N |
| `region` | The AWS region of the Amazon Redshift cluster | string | N |
| `cluster_identifier` | The cluster identifier of the Amazon Redshift cluster | string | N |
| `iam` | Whether IAM authentication is enabled. IAM must be `True` when authenticating using an IdP | bool | N |
| `is_serverless` | Whether the Amazon Redshift cluster is serverless (default: `False`) | bool | N |
| `serverless_acct_id` | The account ID of the serverless cluster | string | N |
| `serverless_work_group` | The name of the workgroup for the serverless endpoint | string | N |
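
As an illustration, these options map onto a gateway's connection settings in a SQLMesh config. Below is a minimal sketch using SQLMesh's Python config API; the gateway name, credentials, and hostname are placeholders, and the `Config`, `GatewayConfig`, and `RedshiftConnectionConfig` classes are assumed to be importable as shown:

from sqlmesh.core.config import Config, GatewayConfig, ModelDefaultsConfig
from sqlmesh.core.config.connection import RedshiftConnectionConfig

# A minimal sketch of a Redshift gateway; values in angle brackets
# are placeholders.
config = Config(
    model_defaults=ModelDefaultsConfig(dialect="redshift"),
    gateways={
        "redshift": GatewayConfig(
            connection=RedshiftConnectionConfig(
                user="<username>",
                password="<password>",
                host="<cluster hostname>",
                port=5439,  # Redshift's default port
                database="<database name>",
            ),
        ),
    },
)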

Airflow Scheduler

Engine Name: redshift

To share a common implementation between the local and Airflow schedulers, SQLMesh's Redshift engine implements its own hook and operator.

To enable support for this operator, the Apache Airflow Amazon provider package should be installed on the target Airflow cluster, along with SQLMesh and its Redshift extra:

pip install "apache-airflow-providers-amazon"
pip install "sqlmesh[redshift]"

The operator requires an Airflow connection to determine the target Redshift account. Refer to AmazonRedshiftConnection for details on how to define a connection string.
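
As a sketch, one way to define such a connection is through Airflow's `AIRFLOW_CONN_<conn_id>` environment variable mechanism, with the connection expressed as a URI. The credentials and hostname below are placeholders, and in practice the variable would be set in the environment of the Airflow scheduler and workers rather than in Python:

import os

# A sketch only: Airflow resolves connections from environment
# variables named AIRFLOW_CONN_<CONN_ID>; the URI values below are
# placeholders.
os.environ["AIRFLOW_CONN_SQLMESH_REDSHIFT_DEFAULT"] = (
    "redshift://<user>:<password>@<cluster-hostname>:5439/<database>"
)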

By default, the connection ID is set to sqlmesh_redshift_default, but it can be overridden using the engine_operator_args parameter to the SQLMeshAirflow instance as in the example below:

from sqlmesh.schedulers.airflow import SQLMeshAirflow

sqlmesh_airflow = SQLMeshAirflow(
    "redshift",
    default_catalog="<database name>",
    engine_operator_args={
        "redshift_conn_id": "<Connection ID>"
    },
)
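
The generated DAGs can then be exposed to the Airflow scheduler by assigning them into the DAG file's global namespace (a sketch assuming the `dags` property of the `SQLMeshAirflow` instance):

# Make the generated DAGs discoverable by the Airflow scheduler.
for dag in sqlmesh_airflow.dags:
    globals()[dag.dag_id] = dag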