SQLAlchemy Integration¶
This section covers SQLAlchemy dialect implementations for Amazon Athena.
Dialects¶
- class pyathena.sqlalchemy.rest.AthenaRestDialect(json_deserializer=None, json_serializer=None, **kwargs)[source]¶
SQLAlchemy dialect for Amazon Athena using the standard REST API cursor.
This dialect uses the default Cursor implementation, which retrieves query results via the GetQueryResults API. Results are returned as Python tuples with type conversion handled by the default converter.
This is the standard dialect for general-purpose Athena access and is suitable for most use cases where specialized result formats (Arrow, pandas) are not required.
- Connection URL Format:
awsathena+rest://{access_key}:{secret_key}@athena.{region}.amazonaws.com/{schema}
Example
>>> from sqlalchemy import create_engine
>>> engine = create_engine(
...     "awsathena+rest://:@athena.us-west-2.amazonaws.com/default"
...     "?s3_staging_dir=s3://my-bucket/athena-results/"
... )
See also
Cursor: The underlying cursor implementation.
AthenaDialect: Base dialect class.
- supports_statement_cache: bool = True¶
indicates if this dialect supports caching.
All dialects that are compatible with statement caching should set this flag to True directly on each dialect class and subclass that supports it. SQLAlchemy tests that this flag is locally present on each dialect subclass before it will use statement caching. This is to provide safety for legacy or new dialects that are not yet fully tested to be compliant with SQL statement caching.
Added in version 1.4.5.
See also
engine_thirdparty_caching
- classmethod import_dbapi() ModuleType[source]¶
Import the DBAPI module that is used by this dialect.
The Python module object returned here will be assigned as an instance variable to a constructed dialect under the name .dbapi.
Changed in version 2.0: The Dialect.import_dbapi() class method is renamed from the previous method Dialect.dbapi(), which would be replaced at dialect instantiation time by the DBAPI module itself, thus using the same name in two different ways. If a Dialect.dbapi() classmethod is present on a third-party dialect, it will be used and a deprecation warning will be emitted.
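The import_dbapi() hook can be sketched with a minimal stand-in dialect. This is an illustration only, not pyathena's actual implementation (which returns the pyathena module); sqlite3 is used here purely as a placeholder PEP 249 module.

```python
import importlib
from types import ModuleType


class SketchDialect:
    """Stand-in dialect; sqlite3 is a placeholder DBAPI module."""

    @classmethod
    def import_dbapi(cls) -> ModuleType:
        # Import and return the DBAPI module; SQLAlchemy assigns the
        # result to the constructed dialect as .dbapi
        return importlib.import_module("sqlite3")


dbapi = SketchDialect.import_dbapi()
print(dbapi.paramstyle)  # every PEP 249 module exposes paramstyle
```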
- CACHE_HIT = 0¶
- CACHE_MISS = 1¶
- CACHING_DISABLED = 2¶
- NO_CACHE_KEY = 3¶
- NO_DIALECT_SUPPORT = 4¶
- __init__(json_deserializer=None, json_serializer=None, **kwargs)¶
- bind_typing = 1¶
define a means of passing typing information to the database and/or driver for bound parameters.
See BindTyping for values.
Added in version 2.0.
- colspecs: dict[type[Any], type[Any]] = {<class 'sqlalchemy.sql.sqltypes.DATE'>: <class 'pyathena.sqlalchemy.types.AthenaDate'>, <class 'sqlalchemy.sql.sqltypes.DATETIME'>: <class 'pyathena.sqlalchemy.types.AthenaTimestamp'>, <class 'sqlalchemy.sql.sqltypes.TIMESTAMP'>: <class 'pyathena.sqlalchemy.types.AthenaTimestamp'>}¶
A dictionary of TypeEngine classes from sqlalchemy.types mapped to subclasses that are specific to the dialect class. This dictionary is class-level only and is not accessed from the dialect instance itself.
- connect(*cargs: Any, **cparams: Any) DBAPIConnection¶
Establish a connection using this dialect’s DBAPI.
The default implementation of this method is:
def connect(self, *cargs, **cparams):
    return self.dbapi.connect(*cargs, **cparams)
The *cargs, **cparams parameters are generated directly from this dialect's Dialect.create_connect_args() method.
This method may be used for dialects that need to perform programmatic per-connection steps when a new connection is procured from the DBAPI.
- Parameters:
*cargs – positional parameters returned from the Dialect.create_connect_args() method
**cparams – keyword parameters returned from the Dialect.create_connect_args() method.
- Returns:
a DBAPI connection, typically from the PEP 249 module level .connect() function.
See also
Dialect.create_connect_args()
Dialect.on_connect()
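The default connect() flow above can be sketched end to end with sqlite3 standing in for the DBAPI (pyathena's real dialect forwards to pyathena's own connect() with Athena-specific arguments; the create_connect_args() body here is hypothetical):

```python
import sqlite3


class SketchDialect:
    # Stand-in for Dialect.dbapi as populated by import_dbapi()
    dbapi = sqlite3

    def create_connect_args(self, url):
        # Hypothetical: a real dialect derives these from the URL object
        return ([":memory:"], {})

    def connect(self, *cargs, **cparams):
        # The default implementation simply forwards to the DBAPI
        return self.dbapi.connect(*cargs, **cparams)


dialect = SketchDialect()
cargs, cparams = dialect.create_connect_args(None)
conn = dialect.connect(*cargs, **cparams)
print(conn.execute("SELECT 1").fetchone())  # (1,)
```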
- connection_characteristics = {'isolation_level': <sqlalchemy.engine.characteristics.IsolationLevelCharacteristic object>, 'logging_token': <sqlalchemy.engine.characteristics.LoggingTokenCharacteristic object>}¶
- construct_arguments: list[tuple[type[SchemaItem | ClauseElement], Mapping[str, Any]]] | None = [(<class 'sqlalchemy.sql.schema.Table'>, {'bucket_count': None, 'compression': None, 'file_format': None, 'location': None, 'row_format': None, 'serdeproperties': None, 'tblproperties': None}), (<class 'sqlalchemy.sql.schema.Column'>, {'cluster': False, 'partition': False, 'partition_transform': None, 'partition_transform_bucket_count': None, 'partition_transform_truncate_length': None})]¶
Optional set of argument specifiers for various SQLAlchemy constructs, typically schema items.
To implement, establish as a series of tuples, as in:
construct_arguments = [
    (schema.Index, {"using": False, "where": None, "ops": None}),
]
If the above construct is established on the PostgreSQL dialect, the Index construct will now accept the keyword arguments postgresql_using, postgresql_where, and postgresql_ops. Any other argument specified to the constructor of Index which is prefixed with postgresql_ will raise ArgumentError.
A dialect which does not include a construct_arguments member will not participate in the argument validation system. For such a dialect, any argument name is accepted by all participating constructs, within the namespace of arguments prefixed with that dialect name. The rationale here is so that third-party dialects that haven’t yet implemented this feature continue to function in the old way.
See also
DialectKWArgs - implementing base class which consumes DefaultDialect.construct_arguments
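The validation rule can be illustrated with a plain-Python sketch (the function and variable names here are illustrative, not SQLAlchemy's internal implementation; the allowed names mirror a few of the Table arguments declared above):

```python
# Allowed dialect-specific keyword arguments for a hypothetical construct,
# mirroring part of the construct_arguments entry for Table above.
ALLOWED = {"location", "compression", "file_format"}
PREFIX = "awsathena_"


def validate_dialect_kwargs(kwargs):
    """Reject prefixed kwargs whose suffix is not declared in construct_arguments."""
    for name in kwargs:
        if name.startswith(PREFIX) and name[len(PREFIX):] not in ALLOWED:
            raise ValueError(f"Argument {name!r} is not accepted")


validate_dialect_kwargs({"awsathena_location": "s3://my-bucket/path/"})  # ok
try:
    validate_dialect_kwargs({"awsathena_bogus": 1})
except ValueError as exc:
    print(exc)
```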
- create_connect_args(url: URL) tuple[tuple[str], MutableMapping[str, Any]]¶
Build DB-API compatible connection arguments.
Given a URL object, returns a tuple consisting of a (*args, **kwargs) suitable to send directly to the dbapi’s connect function. The arguments are sent to the Dialect.connect() method which then runs the DBAPI-level connect() function.
The method typically makes use of the URL.translate_connect_args() method in order to generate a dictionary of options.
The default implementation is:
def create_connect_args(self, url):
    opts = url.translate_connect_args()
    opts.update(url.query)
    return ([], opts)
- Parameters:
url – a URL object
- Returns:
a tuple of (*args, **kwargs) which will be passed to the Dialect.connect() method.
See also
URL.translate_connect_args()
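What the default implementation produces can be approximated with the standard library alone. This is a rough sketch: SQLAlchemy's URL object does this parsing itself, and the option names below are illustrative rather than the exact keys translate_connect_args() emits.

```python
from urllib.parse import urlparse, parse_qsl

url = ("awsathena+rest://:@athena.us-west-2.amazonaws.com/default"
       "?s3_staging_dir=s3://my-bucket/athena-results/")

parts = urlparse(url)
# Rough analogue of url.translate_connect_args() plus url.query
opts = {"host": parts.hostname, "database": parts.path.lstrip("/")}
opts.update(dict(parse_qsl(parts.query)))

print(opts["database"])        # default
print(opts["s3_staging_dir"])  # s3://my-bucket/athena-results/
```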
- create_xid()¶
Create a random two-phase transaction ID.
This id will be passed to do_begin_twophase(), do_rollback_twophase(), do_commit_twophase(). Its format is unspecified.
- cte_follows_insert: bool = True¶
target database, when given a CTE with an INSERT statement, needs the CTE to be below the INSERT
- classmethod dbapi() ModuleType¶
- dbapi_exception_translation_map: Mapping[str, str] = {}¶
A dictionary of names that will contain as values the names of pep-249 exceptions (“IntegrityError”, “OperationalError”, etc) keyed to alternate class names, to support the case where a DBAPI has exception classes that aren’t named as they are referred to (e.g. IntegrityError = MyException). In the vast majority of cases this dictionary is empty.
- ddl_compiler¶
alias of AthenaDDLCompiler
- default_metavalue_token: str = 'DEFAULT'¶
for INSERT… VALUES (DEFAULT) syntax, the token to put in the parenthesis.
- default_schema_name: str | None = None¶
the name of the default schema. This value is only available for supporting dialects, and is typically populated during the initial connection to the database.
- default_sequence_base: int = 1¶
the default value that will be rendered as the “START WITH” portion of a CREATE SEQUENCE DDL statement.
- delete_returning_multifrom: bool = False¶
if the dialect supports RETURNING with DELETE..FROM
Added in version 2.0.
- denormalize_name(name)¶
convert the given name to a case insensitive identifier for the backend if it is an all-lowercase name.
This method is only used if the dialect defines requires_name_normalize=True.
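For backends whose case-insensitive form is uppercase, the rule looks roughly like this (a sketch; SQLAlchemy's version also consults requires_name_normalize and quoting rules):

```python
def denormalize_name(name: str) -> str:
    """Sketch: convert an all-lowercase name to the backend's
    case-insensitive (here: uppercase) form; leave mixed case alone."""
    if name is not None and name.lower() == name:
        return name.upper()
    return name


print(denormalize_name("my_table"))  # MY_TABLE
print(denormalize_name("MyTable"))   # MyTable
```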
- detect_autocommit_setting(dbapi_conn: DBAPIConnection) bool¶
Detect the current autocommit setting for a DBAPI connection.
- Parameters:
dbapi_connection – a DBAPI connection object
- Returns:
True if autocommit is enabled, False if disabled
- Return type:
bool
This method inspects the given DBAPI connection to determine whether autocommit mode is currently enabled. The specific mechanism for detecting autocommit varies by database dialect and DBAPI driver; however, it should be done without network round trips.
Note
Not all dialects support autocommit detection. Dialects that do not support this feature will raise NotImplementedError.
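As an example of "no network round trips", a sqlite3-backed dialect can answer this from a local connection attribute (a sketch; the attribute consulted varies by driver):

```python
import sqlite3


def detect_autocommit_setting(dbapi_conn) -> bool:
    # sqlite3 signals autocommit mode via isolation_level being None;
    # this is a local attribute read, no query is sent to the database.
    return dbapi_conn.isolation_level is None


autocommit_conn = sqlite3.connect(":memory:", isolation_level=None)
manual_conn = sqlite3.connect(":memory:")
print(detect_autocommit_setting(autocommit_conn))  # True
print(detect_autocommit_setting(manual_conn))      # False
```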
- property dialect_description¶
- dispatch: dispatcher[Dialect] = <sqlalchemy.event.base.DialectEventsDispatch object>¶
- do_begin(dbapi_connection)¶
Provide an implementation of connection.begin(), given a DB-API connection.
The DBAPI has no dedicated “begin” method and it is expected that transactions are implicit. This hook is provided for those DBAPIs that might need additional help in this area.
- Parameters:
dbapi_connection – a DBAPI connection, typically proxied within a ConnectionFairy.
- do_begin_twophase(connection: Connection, xid: Any) None¶
Begin a two phase transaction on the given connection.
- Parameters:
connection – a _engine.Connection.
xid – xid
- do_close(dbapi_connection)¶
Provide an implementation of connection.close(), given a DBAPI connection.
This hook is called by the _pool.Pool when a connection has been detached from the pool, or is being returned beyond the normal capacity of the pool.
- do_commit(dbapi_connection)¶
Provide an implementation of connection.commit(), given a DB-API connection.
- Parameters:
dbapi_connection – a DBAPI connection, typically proxied within a ConnectionFairy.
- do_commit_twophase(connection: Connection, xid: Any, is_prepared: bool = True, recover: bool = False) None¶
Commit a two phase transaction on the given connection.
- Parameters:
connection – a _engine.Connection.
xid – xid
is_prepared – whether or not TwoPhaseTransaction.prepare() was called.
recover – if the recover flag was passed.
- do_execute(cursor, statement, parameters, context=None)¶
Provide an implementation of cursor.execute(statement, parameters).
- do_execute_no_params(cursor, statement, context=None)¶
Provide an implementation of cursor.execute(statement).
The parameter collection should not be sent.
- do_executemany(cursor, statement, parameters, context=None)¶
Provide an implementation of cursor.executemany(statement, parameters).
- do_ping(dbapi_connection: DBAPIConnection) bool¶
ping the DBAPI connection and return True if the connection is usable.
- do_prepare_twophase(connection: Connection, xid: Any) None¶
Prepare a two phase transaction on the given connection.
- Parameters:
connection – a _engine.Connection.
xid – xid
- do_recover_twophase(connection: Connection) List[Any]¶
Recover list of uncommitted prepared two phase transaction identifiers on the given connection.
- Parameters:
connection – a _engine.Connection.
- do_release_savepoint(connection, name)¶
Release the named savepoint on a connection.
- Parameters:
connection – a _engine.Connection.
name – savepoint name.
- do_rollback(dbapi_connection: PoolProxiedConnection) None¶
Provide an implementation of connection.rollback(), given a DB-API connection.
- Parameters:
dbapi_connection – a DBAPI connection, typically proxied within a ConnectionFairy.
- do_rollback_to_savepoint(connection, name)¶
Rollback a connection to the named savepoint.
- Parameters:
connection – a _engine.Connection.
name – savepoint name.
- do_rollback_twophase(connection: Connection, xid: Any, is_prepared: bool = True, recover: bool = False) None¶
Rollback a two phase transaction on the given connection.
- Parameters:
connection – a _engine.Connection.
xid – xid
is_prepared – whether or not TwoPhaseTransaction.prepare() was called.
recover – if the recover flag was passed.
- do_savepoint(connection, name)¶
Create a savepoint with the given name.
- Parameters:
connection – a _engine.Connection.
name – savepoint name.
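The do_savepoint() / do_rollback_to_savepoint() / do_release_savepoint() trio maps to plain SQL on most backends. With sqlite3 as a stand-in (not pyathena; Athena itself does not support transactions), the sequence looks like this:

```python
import sqlite3

conn = sqlite3.connect(":memory:", isolation_level=None)  # autocommit; BEGIN issued manually
conn.execute("CREATE TABLE t (x INTEGER)")
conn.execute("BEGIN")
conn.execute("INSERT INTO t VALUES (1)")
conn.execute("SAVEPOINT sp_1")              # do_savepoint(connection, "sp_1")
conn.execute("INSERT INTO t VALUES (2)")
conn.execute("ROLLBACK TO SAVEPOINT sp_1")  # do_rollback_to_savepoint(connection, "sp_1")
conn.execute("RELEASE SAVEPOINT sp_1")      # do_release_savepoint(connection, "sp_1")
conn.execute("COMMIT")

print(conn.execute("SELECT x FROM t").fetchall())  # [(1,)]
```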
- do_set_input_sizes(cursor: DBAPICursor, list_of_tuples: _GenericSetInputSizesType, context: ExecutionContext) Any¶
invoke the cursor.setinputsizes() method with appropriate arguments
This hook is called if the Dialect.bind_typing attribute is set to the BindTyping.SETINPUTSIZES value. Parameter data is passed in a list of tuples (paramname, dbtype, sqltype), where paramname is the key of the parameter in the statement, dbtype is the DBAPI datatype and sqltype is the SQLAlchemy type. The order of tuples is in the correct parameter order.
Added in version 1.4.
Changed in version 2.0: setinputsizes mode is now enabled by setting Dialect.bind_typing to BindTyping.SETINPUTSIZES. Dialects which accept a use_setinputsizes parameter should set this value appropriately.
- do_terminate(dbapi_connection)¶
Provide an implementation of connection.close() that tries as much as possible to not block, given a DBAPI connection.
In the vast majority of cases this just calls .close(); however, some asyncio dialects may call upon different API features.
This hook is called by the _pool.Pool when a connection is being recycled or has been invalidated.
Added in version 1.4.41.
- engine_config_types: Mapping[str, Any] = {'echo': <function bool_or_str.<locals>.bool_or_value>, 'echo_pool': <function bool_or_str.<locals>.bool_or_value>, 'future': <function asbool>, 'max_overflow': <function asint>, 'pool_recycle': <function asint>, 'pool_size': <function asint>, 'pool_timeout': <function asint>}¶
a mapping of string keys that can be in an engine config linked to type conversion functions.
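The mapping drives string-to-type coercion of engine configuration values read from config files. Roughly, in a plain-Python sketch (function and key names illustrative; asbool approximates SQLAlchemy's helper of the same name):

```python
def asbool(value):
    """Sketch of an asbool-style coercion for config strings."""
    if isinstance(value, str):
        return value.lower() in ("true", "yes", "on", "1")
    return bool(value)


engine_config_types = {"pool_size": int, "pool_recycle": int, "echo": asbool}


def coerce_config(raw):
    # Apply the registered converter for known keys; pass others through
    return {k: engine_config_types.get(k, lambda v: v)(v) for k, v in raw.items()}


cfg = coerce_config({"pool_size": "5", "echo": "true", "url": "sqlite://"})
print(cfg)  # {'pool_size': 5, 'echo': True, 'url': 'sqlite://'}
```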
- classmethod engine_created(engine: Engine) None¶
A convenience hook called before returning the final _engine.Engine.
If the dialect returned a different class from the get_dialect_cls() method, then the hook is called on both classes, first on the dialect class returned by the get_dialect_cls() method and then on the class on which the method was called.
The hook should be used by dialects and/or wrappers to apply special events to the engine or its components. In particular, it allows a dialect-wrapping class to apply dialect-level events.
- exclude_set_input_sizes: Set[Any] | None = None¶
set of DBAPI type objects that should be excluded in automatic cursor.setinputsizes() calls.
This is only used if bind_typing is BindTyping.SET_INPUT_SIZES
- execution_ctx_cls¶
alias of DefaultExecutionContext
- favor_returning_over_lastrowid: bool = False¶
for backends that support both a lastrowid and a RETURNING insert strategy, favor RETURNING for simple single-int pk inserts.
cursor.lastrowid tends to be more performant on most backends.
- property full_returning¶
Deprecated since version 2.0: full_returning is deprecated, please use insert_returning, update_returning, delete_returning
- classmethod get_async_dialect_cls(url: URL) Type[Dialect]¶
Given a URL, return the Dialect that will be used by an async engine.
By default this is an alias of Dialect.get_dialect_cls() and just returns the cls. It may be used if a dialect provides both a sync and async version under the same name, like the psycopg driver.
Added in version 2.
See also
Dialect.get_dialect_cls()
- get_check_constraints(connection: Connection, table_name: str, schema: str | None = None, **kw: Any) List[ReflectedCheckConstraint]¶
Return information about check constraints in table_name.
Given a string table_name and an optional string schema, return check constraint information as a list of dicts corresponding to the ReflectedCheckConstraint dictionary.
This is an internal dialect method. Applications should use Inspector.get_check_constraints().
- get_columns(connection: Connection, table_name: str, schema: str | None = None, **kw)¶
Return information about columns in table_name.
Given a _engine.Connection, a string table_name, and an optional string schema, return column information as a list of dictionaries corresponding to the ReflectedColumn dictionary.
This is an internal dialect method. Applications should use Inspector.get_columns().
- get_default_isolation_level(dbapi_conn)¶
Given a DBAPI connection, return its isolation level, or a default isolation level if one cannot be retrieved.
May be overridden by subclasses in order to provide a “fallback” isolation level for databases that cannot reliably retrieve the actual isolation level.
By default, calls the _engine.Interfaces.get_isolation_level() method, propagating any exceptions raised.
Added in version 1.3.22.
- classmethod get_dialect_cls(url: URL) Type[Dialect]¶
Given a URL, return the Dialect that will be used.
This is a hook that allows an external plugin to provide functionality around an existing dialect, by allowing the plugin to be loaded from the url based on an entrypoint, and then the plugin returns the actual dialect to be used.
By default this just returns the cls.
- get_dialect_pool_class(url: URL) Type[Pool]¶
return a Pool class to use for a given URL
- get_driver_connection(connection: DBAPIConnection) Any¶
Returns the connection object as returned by the external driver package.
For normal dialects that use a DBAPI compliant driver this call will just return the connection passed as argument. For dialects that instead adapt a non DBAPI compliant driver, like when adapting an asyncio driver, this call will return the connection-like object as returned by the driver.
Added in version 1.4.24.
- get_foreign_keys(connection: Connection, table_name: str, schema: str | None = None, **kw) list[ReflectedForeignKeyConstraint]¶
Return information about foreign_keys in table_name.
Given a _engine.Connection, a string table_name, and an optional string schema, return foreign key information as a list of dicts corresponding to the ReflectedForeignKeyConstraint dictionary.
This is an internal dialect method. Applications should use _engine.Inspector.get_foreign_keys().
- get_indexes(connection: Connection, table_name: str, schema: str | None = None, **kw) list[ReflectedIndex]¶
Return information about indexes in table_name.
Given a _engine.Connection, a string table_name and an optional string schema, return index information as a list of dictionaries corresponding to the ReflectedIndex dictionary.
This is an internal dialect method. Applications should use Inspector.get_indexes().
- get_isolation_level(dbapi_connection: DBAPIConnection) Literal['SERIALIZABLE', 'REPEATABLE READ', 'READ COMMITTED', 'READ UNCOMMITTED', 'AUTOCOMMIT']¶
Given a DBAPI connection, return its isolation level.
When working with a _engine.Connection object, the corresponding DBAPI connection may be procured using the _engine.Connection.connection accessor.
Note that this is a dialect-level method which is used as part of the implementation of the _engine.Connection and _engine.Engine isolation level facilities; these APIs should be preferred for most typical use cases.
See also
_engine.Connection.get_isolation_level() - view current level
_engine.Connection.default_isolation_level - view default level
:paramref:`.Connection.execution_options.isolation_level` - set per _engine.Connection isolation level
:paramref:`_sa.create_engine.isolation_level` - set per _engine.Engine isolation level
- get_isolation_level_values(dbapi_conn: DBAPIConnection) Sequence[Literal['SERIALIZABLE', 'REPEATABLE READ', 'READ COMMITTED', 'READ UNCOMMITTED', 'AUTOCOMMIT']]¶
return a sequence of string isolation level names that are accepted by this dialect.
The available names should use the following conventions:
use UPPERCASE names. isolation level methods will accept lowercase names but these are normalized into UPPERCASE before being passed along to the dialect.
separate words should be separated by spaces, not underscores, e.g. REPEATABLE READ. isolation level names will have underscores converted to spaces before being passed along to the dialect.
The names for the four standard isolation names to the extent that they are supported by the backend should be READ UNCOMMITTED, READ COMMITTED, REPEATABLE READ, SERIALIZABLE.
if the dialect supports an autocommit option it should be provided using the isolation level name AUTOCOMMIT.
Other isolation modes may also be present, provided that they are named in UPPERCASE and use spaces not underscores.
This function is used so that the default dialect can check that a given isolation level parameter is valid, else raises an _exc.ArgumentError.
A DBAPI connection is passed to the method, in the unlikely event that the dialect needs to interrogate the connection itself to determine this list, however it is expected that most backends will return a hardcoded list of values. If the dialect supports “AUTOCOMMIT”, that value should also be present in the sequence returned.
The method raises NotImplementedError by default. If a dialect does not implement this method, then the default dialect will not perform any checking on a given isolation level value before passing it onto the Dialect.set_isolation_level() method. This is to allow backwards-compatibility with third party dialects that may not yet be implementing this method.
Added in version 2.0.
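The naming conventions above imply a simple normalization step before validation. A plain-Python sketch (not SQLAlchemy's exact code; the function names are illustrative):

```python
VALID_LEVELS = {"READ UNCOMMITTED", "READ COMMITTED", "REPEATABLE READ",
                "SERIALIZABLE", "AUTOCOMMIT"}


def normalize_isolation_level(value: str) -> str:
    # lowercase names and underscores are normalized before reaching the dialect
    return value.upper().replace("_", " ")


def check_isolation_level(value: str) -> str:
    level = normalize_isolation_level(value)
    if level not in VALID_LEVELS:
        raise ValueError(f"Invalid isolation level: {value!r}")
    return level


print(check_isolation_level("repeatable_read"))  # REPEATABLE READ
```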
- get_materialized_view_names(connection: Connection, schema: str | None = None, **kw: Any) List[str]¶
Return a list of all materialized view names available in the database.
This is an internal dialect method. Applications should use _engine.Inspector.get_materialized_view_names().
- Parameters:
schema – schema name to query, if not the default schema.
Added in version 2.0.
- get_multi_check_constraints(connection, **kw)¶
Return information about check constraints in all tables in the given schema.
This is an internal dialect method. Applications should use Inspector.get_multi_check_constraints().
Note
The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.
Added in version 2.0.
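The default implementation shared by the whole get_multi_*() family boils down to a loop over the corresponding single-table method (schematically; the real code also filters by kind and handles schema and filter_names, and all names below are stand-ins):

```python
def get_multi(table_names, single_table_method):
    """Sketch of the DefaultDialect fallback: call the per-table
    reflection method once per table and collect the results."""
    return {name: single_table_method(name) for name in table_names}


# Hypothetical per-table method standing in for get_check_constraints()
def fake_get_check_constraints(table_name):
    return [{"name": f"ck_{table_name}", "sqltext": "x > 0"}]


result = get_multi(["orders", "users"], fake_get_check_constraints)
print(result["orders"][0]["name"])  # ck_orders
```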
- get_multi_columns(connection, **kw)¶
Return information about columns in all tables in the given schema.
This is an internal dialect method. Applications should use Inspector.get_multi_columns().
Note
The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.
Added in version 2.0.
- get_multi_foreign_keys(connection, **kw)¶
Return information about foreign_keys in all tables in the given schema.
This is an internal dialect method. Applications should use _engine.Inspector.get_multi_foreign_keys().
Note
The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.
Added in version 2.0.
- get_multi_indexes(connection, **kw)¶
Return information about indexes in all tables in the given schema.
This is an internal dialect method. Applications should use Inspector.get_multi_indexes().
Note
The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.
Added in version 2.0.
- get_multi_pk_constraint(connection, **kw)¶
Return information about primary key constraints in all tables in the given schema.
This is an internal dialect method. Applications should use Inspector.get_multi_pk_constraint().
Note
The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.
Added in version 2.0.
- get_multi_table_comment(connection, **kw)¶
Return information about the table comment in all tables in the given schema.
This is an internal dialect method. Applications should use _engine.Inspector.get_multi_table_comment().
Note
The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.
Added in version 2.0.
- get_multi_table_options(connection, **kw)¶
Return a dictionary of options specified when the tables in the given schema were created.
This is an internal dialect method. Applications should use _engine.Inspector.get_multi_table_options().
Note
The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.
Added in version 2.0.
- get_multi_unique_constraints(connection, **kw)¶
Return information about unique constraints in all tables in the given schema.
This is an internal dialect method. Applications should use Inspector.get_multi_unique_constraints().
Note
The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.
Added in version 2.0.
- get_pk_constraint(connection: Connection, table_name: str, schema: str | None = None, **kw) ReflectedPrimaryKeyConstraint¶
Return information about the primary key constraint on table_name.
Given a _engine.Connection, a string table_name, and an optional string schema, return primary key information as a dictionary corresponding to the ReflectedPrimaryKeyConstraint dictionary.
This is an internal dialect method. Applications should use Inspector.get_pk_constraint().
- classmethod get_pool_class(url: URL) Type[Pool]¶
- get_schema_names(connection, **kw)¶
Return a list of all schema names available in the database.
This is an internal dialect method. Applications should use _engine.Inspector.get_schema_names().
- get_sequence_names(connection: Connection, schema: str | None = None, **kw: Any) List[str]¶
Return a list of all sequence names available in the database.
This is an internal dialect method. Applications should use _engine.Inspector.get_sequence_names().
- Parameters:
schema – schema name to query, if not the default schema.
Added in version 1.4.
- get_table_comment(connection: Connection, table_name: str, schema: str | None = None, **kw)¶
Return the “comment” for the table identified by table_name.
Given a string table_name and an optional string schema, return table comment information as a dictionary corresponding to the ReflectedTableComment dictionary.
This is an internal dialect method. Applications should use Inspector.get_table_comment().
- Raise:
NotImplementedError for dialects that don’t support comments.
Added in version 1.2.
- get_table_names(connection: Connection, schema: str | None = None, **kw)¶
Return a list of table names for schema.
This is an internal dialect method. Applications should use _engine.Inspector.get_table_names().
- get_table_options(connection: Connection, table_name: str, schema: str | None = None, **kw)¶
Return a dictionary of options specified when table_name was created.
This is an internal dialect method. Applications should use _engine.Inspector.get_table_options().
- get_temp_table_names(connection: Connection, schema: str | None = None, **kw: Any) List[str]¶
Return a list of temporary table names on the given connection, if supported by the underlying backend.
This is an internal dialect method. Applications should use _engine.Inspector.get_temp_table_names().
- get_temp_view_names(connection: Connection, schema: str | None = None, **kw: Any) List[str]¶
Return a list of temporary view names on the given connection, if supported by the underlying backend.
This is an internal dialect method. Applications should use _engine.Inspector.get_temp_view_names().
- get_unique_constraints(connection: Connection, table_name: str, schema: str | None = None, **kw: Any) List[ReflectedUniqueConstraint]¶
Return information about unique constraints in table_name.
Given a string table_name and an optional string schema, return unique constraint information as a list of dicts corresponding to the ReflectedUniqueConstraint dictionary.
This is an internal dialect method. Applications should use Inspector.get_unique_constraints().
- get_view_definition(connection: Connection, view_name: str, schema: str | None = None, **kw)¶
Return plain or materialized view definition.
This is an internal dialect method. Applications should use _engine.Inspector.get_view_definition().
Given a _engine.Connection, a string view_name, and an optional string schema, return the view definition.
- get_view_names(connection: Connection, schema: str | None = None, **kw)¶
Return a list of all non-materialized view names available in the database.
This is an internal dialect method. Applications should use _engine.Inspector.get_view_names().
- Parameters:
schema – schema name to query, if not the default schema.
- has_index(connection, table_name, index_name, schema=None, **kw)¶
Check the existence of a particular index name in the database.
Given a _engine.Connection object, a string table_name and string index name, return True if an index of the given name on the given table exists, False otherwise.
The DefaultDialect implements this in terms of the Dialect.has_table() and Dialect.get_indexes() methods, however dialects can implement a more performant version.
This is an internal dialect method. Applications should use _engine.Inspector.has_index().
Added in version 1.4.
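The DefaultDialect fallback described above composes has_table() and get_indexes(). A sketch with stand-in callables (not SQLAlchemy's actual signature, which takes a connection and dialect state):

```python
def has_index(table_name, index_name, has_table, get_indexes):
    """Sketch of the DefaultDialect fallback for Dialect.has_index()."""
    if not has_table(table_name):
        return False
    return any(ix.get("name") == index_name for ix in get_indexes(table_name))


# Hypothetical reflected state standing in for a live database
tables = {"users": [{"name": "ix_users_email"}]}

print(has_index("users", "ix_users_email",
                lambda t: t in tables, lambda t: tables[t]))  # True
print(has_index("ghost", "ix_any",
                lambda t: t in tables, lambda t: tables.get(t, [])))  # False
```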
- has_schema(connection: Connection, schema_name: str, **kw: Any) bool¶
Check the existence of a particular schema name in the database.
Given a _engine.Connection object and a string schema_name, return True if a schema of the given name exists, False otherwise.
The DefaultDialect implements this by checking the presence of schema_name among the schemas returned by Dialect.get_schema_names(), however dialects can implement a more performant version.
This is an internal dialect method. Applications should use _engine.Inspector.has_schema().
Added in version 2.0.
- has_sequence(connection: Connection, sequence_name: str, schema: str | None = None, **kw: Any) bool¶
Check the existence of a particular sequence in the database.
Given a _engine.Connection object and a string sequence_name, return True if the given sequence exists in the database, False otherwise.
This is an internal dialect method. Applications should use _engine.Inspector.has_sequence().
- has_table(connection: Connection, table_name: str, schema: str | None = None, **kw)¶
For internal dialect use, check the existence of a particular table or view in the database.
Given a
_engine.Connectionobject, a string table_name and optional schema name, return True if the given table exists in the database, False otherwise.This method serves as the underlying implementation of the public facing
Inspector.has_table()method, and is also used internally to implement the “checkfirst” behavior for methods like_schema.Table.create()and_schema.MetaData.create_all().Note
This method is used internally by SQLAlchemy, and is published so that third-party dialects may provide an implementation. It is not the public API for checking for table presence. Please use the
Inspector.has_table() method.
Changed in version 2.0:
_engine.Dialect.has_table() now formally supports checking for additional table-like objects:
any type of views (plain or materialized)
temporary tables of any kind
Previously, these two checks were not formally specified and different dialects would vary in their behavior. The dialect testing suite now includes tests for all of these object types, and dialects, to the degree that the backing database supports views or temporary tables, should seek to support locating these objects for full compliance.
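These dialect-level checks back the public sqlalchemy.inspect() API, which is the recommended entry point for applications. A minimal sketch of the Inspector calls follows; the Athena URL in the comment uses placeholder region and bucket names, and the snippet actually runs against an in-memory SQLite engine purely so it is self-contained — the same Inspector calls work unchanged against an Athena engine.

```python
from sqlalchemy import Column, Integer, MetaData, Table, create_engine, inspect

# Against Athena this would be, e.g. (placeholder region/bucket):
# engine = create_engine(
#     "awsathena+rest://:@athena.us-west-2.amazonaws.com/default"
#     "?s3_staging_dir=s3://my-bucket/athena-results/"
# )
engine = create_engine("sqlite:///:memory:")  # self-contained stand-in

metadata = MetaData()
Table("events", metadata, Column("id", Integer, primary_key=True))
metadata.create_all(engine)  # exercises the dialect's has_table() "checkfirst" path

inspector = inspect(engine)
print(inspector.has_table("events"))   # True
print(inspector.has_table("missing"))  # False
```

Calling metadata.create_all() twice is safe for the same reason: the "checkfirst" behavior consults the dialect's has_table() before emitting CREATE TABLE.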
- has_terminate: bool = False¶
Whether or not this dialect has a separate “terminate” implementation that does not block or require awaiting.
- include_set_input_sizes: Set[Any] | None = None¶
set of DBAPI type objects that should be included in automatic cursor.setinputsizes() calls.
This is only used if bind_typing is BindTyping.SET_INPUT_SIZES
- initialize(connection: Connection) None¶
Called during strategized creation of the dialect with a connection.
Allows dialects to configure options based on server version info or other properties.
The connection passed here is a SQLAlchemy Connection object, with full capabilities.
The initialize() method of the base dialect should be called via super().
Note
as of SQLAlchemy 1.4, this method is called before any
_engine.Dialect.on_connect()hooks are called.
- inline_comments: bool = False¶
Indicates the dialect supports comment DDL that’s inline with the definition of a Table or Column. If False, this implies that ALTER must be used to set table and column comments.
- insert_executemany_returning: bool¶
dialect / driver / database supports some means of providing INSERT…RETURNING support when dialect.do_executemany() is used.
- insert_executemany_returning_sort_by_parameter_order: bool¶
dialect / driver / database supports some means of providing INSERT…RETURNING support when dialect.do_executemany() is used along with the :paramref:`_dml.Insert.returning.sort_by_parameter_order` parameter being set.
- insert_null_pk_still_autoincrements = False¶
- insertmanyvalues_implicit_sentinel: InsertmanyvaluesSentinelOpts = symbol('NOT_SUPPORTED')¶
Options indicating the database supports a form of bulk INSERT where the autoincrement integer primary key can be reliably used as an ordering for INSERTed rows.
Added in version 2.0.10.
See also
engine_insertmanyvalues_returning_order
- insertmanyvalues_max_parameters: int = 32700¶
Alternate to insertmanyvalues_page_size, will additionally limit page size based on number of parameters total in the statement.
- insertmanyvalues_page_size: int = 1000¶
Number of rows to render into an individual INSERT..VALUES() statement for
ExecuteStyle.INSERTMANYVALUES executions.
The default dialect defaults this to 1000.
Added in version 2.0.
See also
:paramref:`_engine.Connection.execution_options.insertmanyvalues_page_size` - execution option available on
_engine.Connection, statements
- is_disconnect(e: DBAPIModule.Error, connection: pool.PoolProxiedConnection | interfaces.DBAPIConnection | None, cursor: interfaces.DBAPICursor | None) bool¶
Return True if the given DB-API error indicates an invalid connection
- ischema_names: dict[str, type[Any]] = {'array': <class 'sqlalchemy.sql.sqltypes.String'>, 'bigint': <class 'sqlalchemy.sql.sqltypes.BIGINT'>, 'binary': <class 'sqlalchemy.sql.sqltypes.BINARY'>, 'boolean': <class 'sqlalchemy.sql.sqltypes.BOOLEAN'>, 'char': <class 'sqlalchemy.sql.sqltypes.CHAR'>, 'date': <class 'sqlalchemy.sql.sqltypes.DATE'>, 'decimal': <class 'sqlalchemy.sql.sqltypes.DECIMAL'>, 'double': <class 'sqlalchemy.sql.sqltypes.DOUBLE'>, 'float': <class 'sqlalchemy.sql.sqltypes.FLOAT'>, 'int': <class 'sqlalchemy.sql.sqltypes.INTEGER'>, 'integer': <class 'sqlalchemy.sql.sqltypes.INTEGER'>, 'json': <class 'sqlalchemy.sql.sqltypes.JSON'>, 'map': <class 'sqlalchemy.sql.sqltypes.String'>, 'real': <class 'sqlalchemy.sql.sqltypes.FLOAT'>, 'row': <class 'pyathena.sqlalchemy.types.AthenaStruct'>, 'smallint': <class 'sqlalchemy.sql.sqltypes.SMALLINT'>, 'string': <class 'sqlalchemy.sql.sqltypes.String'>, 'struct': <class 'pyathena.sqlalchemy.types.AthenaStruct'>, 'timestamp': <class 'sqlalchemy.sql.sqltypes.TIMESTAMP'>, 'tinyint': <class 'pyathena.sqlalchemy.types.TINYINT'>, 'varbinary': <class 'sqlalchemy.sql.sqltypes.BINARY'>, 'varchar': <class 'sqlalchemy.sql.sqltypes.VARCHAR'>}¶
- classmethod load_provisioning()¶
set up the provision.py module for this dialect.
For dialects that include a provision.py module that sets up provisioning followers, this method should initiate that process.
A typical implementation would be:
@classmethod
def load_provisioning(cls):
    __import__("mydialect.provision")
The default method assumes a module named
provision.py inside the owning package of the current dialect, based on the __module__ attribute:
@classmethod
def load_provisioning(cls):
    package = ".".join(cls.__module__.split(".")[0:-1])
    try:
        __import__(package + ".provision")
    except ImportError:
        pass
Added in version 1.3.14.
- loaded_dbapi¶
- max_constraint_name_length: int | None = None¶
The maximum length of constraint names if different from
max_identifier_length.
- max_index_name_length: int | None = None¶
The maximum length of index names if different from
max_identifier_length.
- name: str = 'awsathena'¶
identifying name for the dialect from a DBAPI-neutral point of view (i.e. ‘sqlite’)
- non_native_boolean_check_constraint = True¶
- normalize_name(name)¶
convert the given name to lowercase if it is detected as case insensitive.
This method is only used if the dialect defines requires_name_normalize=True.
- on_connect() Callable[[Any], None] | None¶
return a callable which sets up a newly created DBAPI connection.
The callable should accept a single argument “conn” which is the DBAPI connection itself. The inner callable has no return value.
E.g.:
class MyDialect(default.DefaultDialect):
    # ...

    def on_connect(self):
        def do_on_connect(connection):
            connection.execute("SET SPECIAL FLAGS etc")

        return do_on_connect
This is used to set dialect-wide per-connection options such as isolation modes, Unicode modes, etc.
The “do_on_connect” callable is invoked by using the
_events.PoolEvents.connect()event hook, then unwrapping the DBAPI connection and passing it into the callable.Changed in version 1.4: the on_connect hook is no longer called twice for the first connection of a dialect. The on_connect hook is still called before the
_engine.Dialect.initialize() method, however.
Changed in version 1.4.3: the on_connect hook is invoked from a new method on_connect_url that passes the URL that was used to create the connect args. Dialects can implement on_connect_url instead of on_connect if they need the URL object that was used for the connection in order to get additional context.
If None is returned, no event listener is generated.
- Returns:
a callable that accepts a single DBAPI connection as an argument, or None.
See also
Dialect.connect() - allows the DBAPI connect() sequence itself to be controlled.
Dialect.on_connect_url() - supersedes Dialect.on_connect() to also receive the _engine.URL object in context.
- on_connect_url(url: URL) Callable[[Any], Any] | None¶
return a callable which sets up a newly created DBAPI connection.
This method is a new hook that supersedes the
_engine.Dialect.on_connect() method when implemented by a dialect. When not implemented by a dialect, it invokes the _engine.Dialect.on_connect() method directly to maintain compatibility with existing dialects. There is no deprecation for _engine.Dialect.on_connect() expected.
The callable should accept a single argument "conn" which is the DBAPI connection itself. The inner callable has no return value.
E.g.:
class MyDialect(default.DefaultDialect):
    # ...

    def on_connect_url(self, url):
        def do_on_connect(connection):
            connection.execute("SET SPECIAL FLAGS etc")

        return do_on_connect
This is used to set dialect-wide per-connection options such as isolation modes, Unicode modes, etc.
This method differs from
_engine.Dialect.on_connect() in that it is passed the _engine.URL object that's relevant to the connect args. Normally the only way to get this from the _engine.Dialect.on_connect() hook is to look on the _engine.Engine itself; however, this URL object may have been replaced by plugins.
Note
The default implementation of
_engine.Dialect.on_connect_url() is to invoke the _engine.Dialect.on_connect() method. Therefore, if a dialect implements this method, the _engine.Dialect.on_connect() method will not be called unless the overriding dialect calls it directly from here.
Added in version 1.4.3: added
_engine.Dialect.on_connect_url()which normally calls into_engine.Dialect.on_connect().- Parameters:
url – a
_engine.URL object representing the _engine.URL that was passed to the _engine.Dialect.create_connect_args() method.
- Returns:
a callable that accepts a single DBAPI connection as an argument, or None.
See also
_engine.Dialect.on_connect()
- preexecute_autoincrement_sequences: bool = False¶
True if ‘implicit’ primary key functions must be executed separately in order to get their value, if RETURNING is not used.
This is currently oriented towards PostgreSQL when the
implicit_returning=False parameter is used on a Table object.
- preparer¶
alias of
AthenaDMLIdentifierPreparer
- reflection_options: Sequence[str] = ()¶
Sequence of string names indicating keyword arguments that can be established on a
Table object which will be passed as "reflection options" when using :paramref:`.Table.autoload_with`.
Current example is "oracle_resolve_synonyms" in the Oracle Database dialects.
- requires_name_normalize: bool = False¶
Indicates symbol names are returned by the database in UPPERCASED if they are case insensitive within the database. If this is True, the methods normalize_name() and denormalize_name() must be provided.
- reset_isolation_level(dbapi_conn)¶
Given a DBAPI connection, revert its isolation to the default.
Note that this is a dialect-level method which is used as part of the implementation of the
_engine.Connection and _engine.Engine isolation level facilities; these APIs should be preferred for most typical use cases.
See also
_engine.Connection.get_isolation_level() - view current level
_engine.Connection.default_isolation_level - view default level
:paramref:`.Connection.execution_options.isolation_level` - set per
_engine.Connection isolation level
:paramref:`_sa.create_engine.isolation_level` - set per
_engine.Engine isolation level
- returns_native_bytes: bool = False¶
indicates if Python bytes() objects are returned natively by the driver for SQL “binary” datatypes.
Added in version 2.0.11.
- sequences_optional: bool = False¶
If True, indicates if the :paramref:`_schema.Sequence.optional` parameter on the
_schema.Sequence construct should signal to not generate a CREATE SEQUENCE. Applies only to dialects that support sequences. Currently used only to allow PostgreSQL SERIAL to be used on a column that specifies Sequence() for usage on other backends.
- server_side_cursors: bool = False¶
deprecated; indicates if the dialect should attempt to use server side cursors by default
- server_version_info: Tuple[Any, ...] | None = None¶
a tuple containing a version number for the DB backend in use.
This value is only available for supporting dialects, and is typically populated during the initial connection to the database.
- set_connection_execution_options(connection: Connection, opts: Mapping[str, Any]) None¶
Establish execution options for a given connection.
This is implemented by
DefaultDialect in order to implement the :paramref:`_engine.Connection.execution_options.isolation_level` execution option. Dialects can intercept various execution options which may need to modify state on a particular DBAPI connection.
Added in version 1.4.
- set_engine_execution_options(engine: Engine, opts: Mapping[str, Any]) None¶
Establish execution options for a given engine.
This is implemented by
DefaultDialect to establish event hooks for new Connection instances created by the given Engine which will then invoke the Dialect.set_connection_execution_options() method for that connection.
- set_isolation_level(dbapi_connection: DBAPIConnection, level: Literal['SERIALIZABLE', 'REPEATABLE READ', 'READ COMMITTED', 'READ UNCOMMITTED', 'AUTOCOMMIT']) None¶
Given a DBAPI connection, set its isolation level.
Note that this is a dialect-level method which is used as part of the implementation of the
_engine.Connection and _engine.Engine isolation level facilities; these APIs should be preferred for most typical use cases.
If the dialect also implements the
Dialect.get_isolation_level_values() method, then the given level is guaranteed to be one of the string names within that sequence, and the method will not need to anticipate a lookup failure.
See also
_engine.Connection.get_isolation_level() - view current level
_engine.Connection.default_isolation_level - view default level
:paramref:`.Connection.execution_options.isolation_level` - set per
_engine.Connection isolation level
:paramref:`_sa.create_engine.isolation_level` - set per
_engine.Engine isolation level
- statement_compiler¶
alias of
AthenaStatementCompiler
- supports_alter: bool = False¶
True if the database supports ALTER TABLE - used only for generating foreign key constraints in certain circumstances
- supports_constraint_comments: bool = False¶
Indicates if the dialect supports comment DDL on constraints.
Added in version 2.0.
- supports_for_update_of = False¶
- supports_is_distinct_from = True¶
- supports_multivalues_insert: bool = True¶
Target database supports INSERT…VALUES with multiple value sets, i.e. INSERT INTO table (cols) VALUES (…), (…), (…), …
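Because this flag is True, a statement built with a list passed to .values() compiles to a single INSERT carrying several value sets. A minimal sketch of the rendered shape; it compiles against SQLite's dialect only because that dialect is importable without AWS credentials and also sets this flag, so the snippet is self-contained:

```python
from sqlalchemy import Column, Integer, MetaData, Table, insert
from sqlalchemy.dialects import sqlite

t = Table("t", MetaData(), Column("a", Integer), Column("b", Integer))

# Passing a list to .values() requests a single multi-row INSERT;
# dialects with supports_multivalues_insert=False refuse to compile it.
stmt = insert(t).values([{"a": 1, "b": 2}, {"a": 3, "b": 4}])
print(stmt.compile(dialect=sqlite.dialect()))
# INSERT INTO t (a, b) VALUES (?, ?), (?, ?)
```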
- supports_native_boolean: bool = True¶
Indicates if the dialect supports a native boolean construct. This will prevent
_types.Boolean from generating a CHECK constraint when that type is used.
- supports_native_decimal: bool = True¶
indicates if Decimal objects are handled and returned for precision numeric types, or if floats are returned
- supports_native_enum: bool = False¶
Indicates if the dialect supports a native ENUM construct. This will prevent
_types.Enum from generating a CHECK constraint when that type is used in "native" mode.
- supports_native_uuid: bool = False¶
indicates if Python UUID() objects are handled natively by the driver for SQL UUID datatypes.
Added in version 2.0.
- supports_sane_multi_rowcount: bool = True¶
Indicate whether the dialect properly implements rowcount for
UPDATE and DELETE statements when executed via executemany.
- supports_sane_rowcount: bool = True¶
Indicate whether the dialect properly implements rowcount for
UPDATE and DELETE statements.
- property supports_sane_rowcount_returning¶
True if this dialect supports sane rowcount even if RETURNING is in use.
For dialects that don’t support RETURNING, this is synonymous with
supports_sane_rowcount.
- supports_schemas = True¶
- supports_server_side_cursors: generic_fn_descriptor[bool] | bool = False¶
indicates if the dialect supports server side cursors
- supports_simple_order_by_label: bool = True¶
target database supports ORDER BY <labelname>, where <labelname> refers to a label in the columns clause of the SELECT
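With this flag True (as it is both here and on the generic compiler used below), a label defined in the columns clause can be referenced by name in ORDER BY rather than repeating the expression. A minimal self-contained sketch:

```python
from sqlalchemy import Column, Integer, MetaData, Table, select

t = Table("t", MetaData(), Column("a", Integer))

# "total" is defined in the columns clause and referenced by name in ORDER BY;
# the compiler emits "ORDER BY total" instead of repeating "t.a + :a_1".
stmt = select((t.c.a + 1).label("total")).order_by("total")
print(stmt)
# SELECT t.a + :a_1 AS total
# FROM t ORDER BY total
```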
- supports_views = True¶
- type_compiler¶
alias of
AthenaTypeCompiler
- type_compiler_cls¶
alias of
GenericTypeCompiler
- type_descriptor(typeobj)¶
Provide a database-specific
TypeEngine object, given the generic object which comes from the types module.
This method looks for a dictionary called
colspecs as a class or instance-level variable, and passes on to _types.adapt_type().
- update_returning_multifrom: bool = False¶
if the dialect supports RETURNING with UPDATE..FROM
Added in version 2.0.
- use_insertmanyvalues: bool = False¶
if True, indicates “insertmanyvalues” functionality should be used to allow for
insert_executemany_returning behavior, if possible.
In practice, setting this to True means:
if
supports_multivalues_insert, insert_returning and use_insertmanyvalues are all True, the SQL compiler will produce an INSERT that will be interpreted by the DefaultDialect as an ExecuteStyle.INSERTMANYVALUES execution that allows for INSERT of many rows with RETURNING by rewriting a single-row INSERT statement to have multiple VALUES clauses, also executing the statement multiple times for a series of batches when large numbers of rows are given.
The parameter is False for the default dialect, and is set to True for SQLAlchemy internal dialects SQLite, MySQL/MariaDB, PostgreSQL, SQL Server. It remains at False for Oracle Database, which provides native "executemany with RETURNING" support and also does not support
supports_multivalues_insert. For MySQL/MariaDB, those MySQL dialects that don't support RETURNING will not report insert_executemany_returning as True.
Added in version 2.0.
See also
engine_insertmanyvalues
- use_insertmanyvalues_wo_returning: bool = False¶
if True, and use_insertmanyvalues is also True, INSERT statements that don’t include RETURNING will also use “insertmanyvalues”.
Added in version 2.0.
See also
engine_insertmanyvalues
- compiler_linting: Linting¶
- type_compiler_instance: TypeCompiler¶
instance of a
Compiled class used to compile SQL type objects.
Added in version 2.0.
- identifier_preparer: IdentifierPreparer¶
This element will refer to an instance of
IdentifierPreparer once a DefaultDialect has been constructed.
- default_isolation_level: IsolationLevel | None¶
the isolation that is implicitly present on new connections
- skip_autocommit_rollback: bool¶
Whether or not the :paramref:`.create_engine.skip_autocommit_rollback` parameter was set.
Added in version 2.0.43.
- class pyathena.sqlalchemy.pandas.AthenaPandasDialect(json_deserializer=None, json_serializer=None, **kwargs)[source]¶
SQLAlchemy dialect for Amazon Athena with pandas DataFrame result format.
This dialect extends AthenaDialect to use PandasCursor, which returns query results as pandas DataFrames. This integration enables seamless use of Athena data in data analysis and machine learning workflows.
- Connection URL Format:
awsathena+pandas://{access_key}:{secret_key}@athena.{region}.amazonaws.com/{schema}- Query Parameters:
In addition to the base dialect parameters:
- unload: If "true", use UNLOAD for Parquet output (better performance for large datasets)
- engine: CSV parsing engine ("c", "python", or "pyarrow")
- chunksize: Number of rows per chunk for memory-efficient processing
Example
>>> from sqlalchemy import create_engine
>>> engine = create_engine(
...     "awsathena+pandas://:@athena.us-west-2.amazonaws.com/default"
...     "?s3_staging_dir=s3://my-bucket/athena-results/"
...     "&unload=true&chunksize=10000"
... )
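The query-string parameters can also be assembled with sqlalchemy.engine.URL.create() instead of string concatenation, which avoids manual escaping mistakes; create_engine() accepts the resulting URL object directly. A sketch with placeholder region and bucket names:

```python
from sqlalchemy.engine import URL

url = URL.create(
    drivername="awsathena+pandas",
    host="athena.us-west-2.amazonaws.com",  # placeholder region
    database="default",
    query={
        "s3_staging_dir": "s3://my-bucket/athena-results/",  # placeholder bucket
        "unload": "true",
        "chunksize": "10000",
    },
)
# Note: render_as_string() percent-encodes query values such as the s3:// path.
print(url.render_as_string())
```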
- supports_statement_cache: bool = True¶
indicates if this dialect supports caching.
All dialects that are compatible with statement caching should set this flag to True directly on each dialect class and subclass that supports it. SQLAlchemy tests that this flag is locally present on each dialect subclass before it will use statement caching. This is to provide safety for legacy or new dialects that are not yet fully tested to be compliant with SQL statement caching.
Added in version 1.4.5.
See also
engine_thirdparty_caching
- create_connect_args(url)[source]¶
Build DB-API compatible connection arguments.
Given a
URL object, returns a tuple consisting of a (*args, **kwargs) suitable to send directly to the dbapi's connect function. The arguments are sent to the Dialect.connect() method which then runs the DBAPI-level connect() function.
The method typically makes use of the
URL.translate_connect_args() method in order to generate a dictionary of options.
The default implementation is:
def create_connect_args(self, url):
    opts = url.translate_connect_args()
    opts.update(url.query)
    return ([], opts)
- Parameters:
url – a
URLobject- Returns:
a tuple of
(*args, **kwargs) which will be passed to the Dialect.connect() method.
See also
URL.translate_connect_args()
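The default body shown above can be exercised directly with a URL object: everything from the query string ends up in the options dict that eventually reaches the DBAPI-level connect(). Note that AthenaPandasDialect overrides this method, so its actual behavior may differ; the sketch below (placeholder region and bucket) only mirrors the documented default implementation:

```python
from sqlalchemy.engine import make_url

url = make_url(
    "awsathena+rest://athena.us-west-2.amazonaws.com/default"
    "?s3_staging_dir=s3://my-bucket/athena-results/"
)

# Mirrors the default create_connect_args() body documented above.
opts = url.translate_connect_args()  # host, database, credentials if present
opts.update(url.query)               # query-string options merge in
print(opts)
```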
- classmethod import_dbapi() ModuleType[source]¶
Import the DBAPI module that is used by this dialect.
The Python module object returned here will be assigned as an instance variable to a constructed dialect under the name
.dbapi.
Changed in version 2.0: The
Dialect.import_dbapi() class method is renamed from the previous method Dialect.dbapi(), which would be replaced at dialect instantiation time by the DBAPI module itself, thus using the same name in two different ways. If a Dialect.dbapi() classmethod is present on a third-party dialect, it will be used and a deprecation warning will be emitted.
- CACHE_HIT = 0¶
- CACHE_MISS = 1¶
- CACHING_DISABLED = 2¶
- NO_CACHE_KEY = 3¶
- NO_DIALECT_SUPPORT = 4¶
- __init__(json_deserializer=None, json_serializer=None, **kwargs)¶
- bind_typing = 1¶
define a means of passing typing information to the database and/or driver for bound parameters.
See
BindTypingfor values.Added in version 2.0.
- colspecs: dict[type[Any], type[Any]] = {<class 'sqlalchemy.sql.sqltypes.DATE'>: <class 'pyathena.sqlalchemy.types.AthenaDate'>, <class 'sqlalchemy.sql.sqltypes.DATETIME'>: <class 'pyathena.sqlalchemy.types.AthenaTimestamp'>, <class 'sqlalchemy.sql.sqltypes.TIMESTAMP'>: <class 'pyathena.sqlalchemy.types.AthenaTimestamp'>}¶
A dictionary of TypeEngine classes from sqlalchemy.types mapped to subclasses that are specific to the dialect class. This dictionary is class-level only and is not accessed from the dialect instance itself.
- connect(*cargs: Any, **cparams: Any) DBAPIConnection¶
Establish a connection using this dialect’s DBAPI.
The default implementation of this method is:
def connect(self, *cargs, **cparams):
    return self.dbapi.connect(*cargs, **cparams)
The
*cargs, **cparams parameters are generated directly from this dialect's Dialect.create_connect_args() method.
This method may be used for dialects that need to perform programmatic per-connection steps when a new connection is procured from the DBAPI.
- Parameters:
*cargs – positional parameters returned from the
Dialect.create_connect_args() method
**cparams – keyword parameters returned from the
Dialect.create_connect_args()method.
- Returns:
a DBAPI connection, typically from the PEP 249 module level
.connect() function.
See also
Dialect.create_connect_args()Dialect.on_connect()
- connection_characteristics = {'isolation_level': <sqlalchemy.engine.characteristics.IsolationLevelCharacteristic object>, 'logging_token': <sqlalchemy.engine.characteristics.LoggingTokenCharacteristic object>}¶
- construct_arguments: list[tuple[type[SchemaItem | ClauseElement], Mapping[str, Any]]] | None = [(<class 'sqlalchemy.sql.schema.Table'>, {'bucket_count': None, 'compression': None, 'file_format': None, 'location': None, 'row_format': None, 'serdeproperties': None, 'tblproperties': None}), (<class 'sqlalchemy.sql.schema.Column'>, {'cluster': False, 'partition': False, 'partition_transform': None, 'partition_transform_bucket_count': None, 'partition_transform_truncate_length': None})]¶
Optional set of argument specifiers for various SQLAlchemy constructs, typically schema items.
To implement, establish as a series of tuples, as in:
construct_arguments = [
    (schema.Index, {"using": False, "where": None, "ops": None}),
]
If the above construct is established on the PostgreSQL dialect, the
Index construct will now accept the keyword arguments postgresql_using, postgresql_where, and postgresql_ops. Any other argument specified to the constructor of Index which is prefixed with postgresql_ will raise ArgumentError.
A dialect which does not include a
construct_arguments member will not participate in the argument validation system. For such a dialect, any argument name is accepted by all participating constructs, within the namespace of arguments prefixed with that dialect name. The rationale here is so that third-party dialects that haven't yet implemented this feature continue to function in the old way.
See also
DialectKWArgs - implementing base class which consumes DefaultDialect.construct_arguments
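For this dialect, whose name is awsathena, the specifiers listed above mean a Table accepts keyword arguments such as awsathena_location or awsathena_file_format, and a Column accepts awsathena_partition, with any other awsathena_-prefixed argument rejected when the dialect is installed. A sketch with a placeholder bucket path; the values are retrievable through the standard dialect-kwargs accessors:

```python
from sqlalchemy import Column, Integer, MetaData, Table

t = Table(
    "sales",
    MetaData(),
    Column("amount", Integer),
    Column("dt", Integer, awsathena_partition=True),  # marks a partition column
    awsathena_location="s3://my-bucket/sales/",       # placeholder bucket path
    awsathena_file_format="PARQUET",
)

# Both accessor styles read the same stored dialect options.
print(t.dialect_options["awsathena"]["location"])  # s3://my-bucket/sales/
print(t.kwargs["awsathena_file_format"])           # PARQUET
```

If pyathena is not importable, SQLAlchemy emits a "can't validate argument" warning and stores the values unvalidated; the accessors behave the same either way.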
- create_xid()¶
Create a random two-phase transaction ID.
This id will be passed to do_begin_twophase(), do_rollback_twophase(), do_commit_twophase(). Its format is unspecified.
- cte_follows_insert: bool = True¶
target database, when given a CTE with an INSERT statement, needs the CTE to be below the INSERT
- classmethod dbapi() ModuleType¶
- dbapi_exception_translation_map: Mapping[str, str] = {}¶
A dictionary of names that will contain as values the names of pep-249 exceptions (“IntegrityError”, “OperationalError”, etc) keyed to alternate class names, to support the case where a DBAPI has exception classes that aren’t named as they are referred to (e.g. IntegrityError = MyException). In the vast majority of cases this dictionary is empty.
- ddl_compiler¶
alias of
AthenaDDLCompiler
- default_metavalue_token: str = 'DEFAULT'¶
for INSERT… VALUES (DEFAULT) syntax, the token to put in the parenthesis.
- default_schema_name: str | None = None¶
the name of the default schema. This value is only available for supporting dialects, and is typically populated during the initial connection to the database.
- default_sequence_base: int = 1¶
the default value that will be rendered as the “START WITH” portion of a CREATE SEQUENCE DDL statement.
- delete_returning_multifrom: bool = False¶
if the dialect supports RETURNING with DELETE..FROM
Added in version 2.0.
- denormalize_name(name)¶
convert the given name to a case insensitive identifier for the backend if it is an all-lowercase name.
This method is only used if the dialect defines requires_name_normalize=True.
- detect_autocommit_setting(dbapi_conn: DBAPIConnection) bool¶
Detect the current autocommit setting for a DBAPI connection.
- Parameters:
dbapi_connection – a DBAPI connection object
- Returns:
True if autocommit is enabled, False if disabled
- Return type:
bool
This method inspects the given DBAPI connection to determine whether autocommit mode is currently enabled. The specific mechanism for detecting autocommit varies by database dialect and DBAPI driver; however, it should be done without network round trips.
Note
Not all dialects support autocommit detection. Dialects that do not support this feature will raise
NotImplementedError.
- property dialect_description¶
- dispatch: dispatcher[Dialect] = <sqlalchemy.event.base.DialectEventsDispatch object>¶
- do_begin(dbapi_connection)¶
Provide an implementation of
connection.begin(), given a DB-API connection.
The DBAPI has no dedicated "begin" method and it is expected that transactions are implicit. This hook is provided for those DBAPIs that might need additional help in this area.
- Parameters:
dbapi_connection – a DBAPI connection, typically proxied within a
ConnectionFairy.
- do_begin_twophase(connection: Connection, xid: Any) None¶
Begin a two phase transaction on the given connection.
- Parameters:
connection – a
_engine.Connection.
xid – xid
- do_close(dbapi_connection)¶
Provide an implementation of
connection.close(), given a DBAPI connection.This hook is called by the
_pool.Pool when a connection has been detached from the pool, or is being returned beyond the normal capacity of the pool.
- do_commit(dbapi_connection)¶
Provide an implementation of
connection.commit(), given a DB-API connection.
- Parameters:
dbapi_connection – a DBAPI connection, typically proxied within a
ConnectionFairy.
- do_commit_twophase(connection: Connection, xid: Any, is_prepared: bool = True, recover: bool = False) None¶
Commit a two phase transaction on the given connection.
- Parameters:
connection – a
_engine.Connection.
xid – xid
is_prepared – whether or not
TwoPhaseTransaction.prepare() was called.
recover – if the recover flag was passed.
- do_execute(cursor, statement, parameters, context=None)¶
Provide an implementation of
cursor.execute(statement, parameters).
- do_execute_no_params(cursor, statement, context=None)¶
Provide an implementation of
cursor.execute(statement).
The parameter collection should not be sent.
- do_executemany(cursor, statement, parameters, context=None)¶
Provide an implementation of
cursor.executemany(statement, parameters).
- do_ping(dbapi_connection: DBAPIConnection) bool¶
ping the DBAPI connection and return True if the connection is usable.
- do_prepare_twophase(connection: Connection, xid: Any) None¶
Prepare a two phase transaction on the given connection.
- Parameters:
connection – a
_engine.Connection.
xid – xid
- do_recover_twophase(connection: Connection) List[Any]¶
Recover list of uncommitted prepared two phase transaction identifiers on the given connection.
- Parameters:
connection – a
_engine.Connection.
- do_release_savepoint(connection, name)¶
Release the named savepoint on a connection.
- Parameters:
connection – a
_engine.Connection.
name – savepoint name.
- do_rollback(dbapi_connection: PoolProxiedConnection) None¶
Provide an implementation of
connection.rollback(), given a DB-API connection.
- Parameters:
dbapi_connection – a DBAPI connection, typically proxied within a
ConnectionFairy.
- do_rollback_to_savepoint(connection, name)¶
Rollback a connection to the named savepoint.
- Parameters:
connection – a
_engine.Connection.
name – savepoint name.
- do_rollback_twophase(connection: Connection, xid: Any, is_prepared: bool = True, recover: bool = False) None¶
Rollback a two phase transaction on the given connection.
- Parameters:
connection – a
_engine.Connection.
xid – xid
is_prepared – whether or not
TwoPhaseTransaction.prepare() was called.
recover – if the recover flag was passed.
- do_savepoint(connection, name)¶
Create a savepoint with the given name.
- Parameters:
connection – a
_engine.Connection.
name – savepoint name.
- do_set_input_sizes(cursor: DBAPICursor, list_of_tuples: _GenericSetInputSizesType, context: ExecutionContext) Any¶
invoke the cursor.setinputsizes() method with appropriate arguments
This hook is called if the
Dialect.bind_typing attribute is set to the BindTyping.SETINPUTSIZES value. Parameter data is passed in a list of tuples (paramname, dbtype, sqltype), where paramname is the key of the parameter in the statement, dbtype is the DBAPI datatype and sqltype is the SQLAlchemy type. The order of tuples is in the correct parameter order.
Added in version 1.4.
Changed in version 2.0: setinputsizes mode is now enabled by setting
Dialect.bind_typing to BindTyping.SETINPUTSIZES. Dialects which accept a use_setinputsizes parameter should set this value appropriately.
- do_terminate(dbapi_connection)¶
Provide an implementation of
connection.close() that tries as much as possible to not block, given a DBAPI connection.
In the vast majority of cases this just calls .close(), however some asyncio dialects may call upon different API features.
This hook is called by the
_pool.Pool when a connection is being recycled or has been invalidated.
Added in version 1.4.41.
- engine_config_types: Mapping[str, Any] = {'echo': <function bool_or_str.<locals>.bool_or_value>, 'echo_pool': <function bool_or_str.<locals>.bool_or_value>, 'future': <function asbool>, 'max_overflow': <function asint>, 'pool_recycle': <function asint>, 'pool_size': <function asint>, 'pool_timeout': <function asint>}¶
a mapping of string keys that can be in an engine config linked to type conversion functions.
- classmethod engine_created(engine: Engine) None¶
A convenience hook called before returning the final
_engine.Engine.
If the dialect returned a different class from the
get_dialect_cls() method, then the hook is called on both classes, first on the dialect class returned by the get_dialect_cls() method and then on the class on which the method was called.
The hook should be used by dialects and/or wrappers to apply special events to the engine or its components. In particular, it allows a dialect-wrapping class to apply dialect-level events.
- exclude_set_input_sizes: Set[Any] | None = None¶
set of DBAPI type objects that should be excluded in automatic cursor.setinputsizes() calls.
This is only used if bind_typing is BindTyping.SET_INPUT_SIZES
- execution_ctx_cls¶
alias of
DefaultExecutionContext
- favor_returning_over_lastrowid: bool = False¶
for backends that support both a lastrowid and a RETURNING insert strategy, favor RETURNING for simple single-int pk inserts.
cursor.lastrowid tends to be more performant on most backends.
- property full_returning¶
Deprecated since version 2.0: full_returning is deprecated, please use insert_returning, update_returning, delete_returning
- classmethod get_async_dialect_cls(url: URL) Type[Dialect]¶
Given a URL, return the
Dialect that will be used by an async engine.

By default this is an alias of Dialect.get_dialect_cls() and just returns the cls. It may be used if a dialect provides both a sync and async version under the same name, like the psycopg driver.

Added in version 2.0.
See also
Dialect.get_dialect_cls()
- get_check_constraints(connection: Connection, table_name: str, schema: str | None = None, **kw: Any) List[ReflectedCheckConstraint]¶
Return information about check constraints in
table_name.

Given a string table_name and an optional string schema, return check constraint information as a list of dicts corresponding to the ReflectedCheckConstraint dictionary.

This is an internal dialect method. Applications should use Inspector.get_check_constraints().
- get_columns(connection: Connection, table_name: str, schema: str | None = None, **kw)¶
Return information about columns in
table_name.

Given a _engine.Connection, a string table_name, and an optional string schema, return column information as a list of dictionaries corresponding to the ReflectedColumn dictionary.

This is an internal dialect method. Applications should use Inspector.get_columns().
- get_default_isolation_level(dbapi_conn)¶
Given a DBAPI connection, return its isolation level, or a default isolation level if one cannot be retrieved.
May be overridden by subclasses in order to provide a “fallback” isolation level for databases that cannot reliably retrieve the actual isolation level.
By default, calls the _engine.Interfaces.get_isolation_level() method, propagating any exceptions raised.

Added in version 1.3.22.
- classmethod get_dialect_cls(url: URL) Type[Dialect]¶
Given a URL, return the Dialect that will be used.

This is a hook that allows an external plugin to provide functionality around an existing dialect, by allowing the plugin to be loaded from the url based on an entrypoint, and then the plugin returns the actual dialect to be used.
By default this just returns the cls.
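The default behavior ("just returns the cls") and the plugin-redirection pattern can be sketched in plain Python. This is an illustrative sketch with hypothetical class names and a simplified string URL, not SQLAlchemy's actual entrypoint machinery:

```python
class BaseDialect:
    @classmethod
    def get_dialect_cls(cls, url):
        # Default behavior: the class itself is the dialect to use.
        return cls


class PluginDialect(BaseDialect):
    # Hypothetical wrapper: inspect the URL and hand back a different
    # dialect class to actually service the connection.
    @classmethod
    def get_dialect_cls(cls, url):
        if "plain" in url:
            return BaseDialect
        return cls
```

Here `PluginDialect.get_dialect_cls("awsathena+plain://...")` would redirect to `BaseDialect`, while any other URL keeps `PluginDialect` itself.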
- get_dialect_pool_class(url: URL) Type[Pool]¶
return a Pool class to use for a given URL
- get_driver_connection(connection: DBAPIConnection) Any¶
Returns the connection object as returned by the external driver package.
For normal dialects that use a DBAPI compliant driver this call will just return the connection passed as argument. For dialects that instead adapt a non DBAPI compliant driver, like when adapting an asyncio driver, this call will return the connection-like object as returned by the driver.

Added in version 1.4.24.
- get_foreign_keys(connection: Connection, table_name: str, schema: str | None = None, **kw) list[ReflectedForeignKeyConstraint]¶
Return information about foreign_keys in
table_name.

Given a _engine.Connection, a string table_name, and an optional string schema, return foreign key information as a list of dicts corresponding to the ReflectedForeignKeyConstraint dictionary.

This is an internal dialect method. Applications should use _engine.Inspector.get_foreign_keys().
- get_indexes(connection: Connection, table_name: str, schema: str | None = None, **kw) list[ReflectedIndex]¶
Return information about indexes in
table_name.

Given a _engine.Connection, a string table_name and an optional string schema, return index information as a list of dictionaries corresponding to the ReflectedIndex dictionary.

This is an internal dialect method. Applications should use Inspector.get_indexes().
- get_isolation_level(dbapi_connection: DBAPIConnection) Literal['SERIALIZABLE', 'REPEATABLE READ', 'READ COMMITTED', 'READ UNCOMMITTED', 'AUTOCOMMIT']¶
Given a DBAPI connection, return its isolation level.
When working with a _engine.Connection object, the corresponding DBAPI connection may be procured using the _engine.Connection.connection accessor.

Note that this is a dialect-level method which is used as part of the implementation of the _engine.Connection and _engine.Engine isolation level facilities; these APIs should be preferred for most typical use cases.

See also

_engine.Connection.get_isolation_level() - view current level

_engine.Connection.default_isolation_level - view default level

:paramref:`.Connection.execution_options.isolation_level` - set per _engine.Connection isolation level

:paramref:`_sa.create_engine.isolation_level` - set per _engine.Engine isolation level
- get_isolation_level_values(dbapi_conn: DBAPIConnection) Sequence[Literal['SERIALIZABLE', 'REPEATABLE READ', 'READ COMMITTED', 'READ UNCOMMITTED', 'AUTOCOMMIT']]¶
return a sequence of string isolation level names that are accepted by this dialect.
The available names should use the following conventions:
use UPPERCASE names. Isolation level methods will accept lowercase names, but these are normalized into UPPERCASE before being passed along to the dialect.

separate words should be separated by spaces, not underscores, e.g. REPEATABLE READ. Isolation level names will have underscores converted to spaces before being passed along to the dialect.

the names for the four standard isolation levels, to the extent that they are supported by the backend, should be READ UNCOMMITTED, READ COMMITTED, REPEATABLE READ, SERIALIZABLE

if the dialect supports an autocommit option it should be provided using the isolation level name AUTOCOMMIT.

Other isolation modes may also be present, provided that they are named in UPPERCASE and use spaces not underscores.

This function is used so that the default dialect can check that a given isolation level parameter is valid, else raises an _exc.ArgumentError.

A DBAPI connection is passed to the method, in the unlikely event that the dialect needs to interrogate the connection itself to determine this list, however it is expected that most backends will return a hardcoded list of values. If the dialect supports "AUTOCOMMIT", that value should also be present in the sequence returned.

The method raises NotImplementedError by default. If a dialect does not implement this method, then the default dialect will not perform any checking on a given isolation level value before passing it onto the Dialect.set_isolation_level() method. This is to allow backwards-compatibility with third party dialects that may not yet be implementing this method.

Added in version 2.0.
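The naming conventions above (lowercase accepted but folded to UPPERCASE, underscores converted to spaces) can be sketched as follows. These are hypothetical helper functions for illustration, not SQLAlchemy internals; ValueError stands in for the _exc.ArgumentError the default dialect would raise:

```python
VALID_LEVELS = (
    "READ UNCOMMITTED", "READ COMMITTED",
    "REPEATABLE READ", "SERIALIZABLE", "AUTOCOMMIT",
)


def normalize_isolation_level(name: str) -> str:
    # Fold to UPPERCASE and replace underscores with spaces, per the
    # conventions documented for get_isolation_level_values().
    return name.upper().replace("_", " ")


def check_isolation_level(name: str) -> str:
    # Validate a user-supplied level against the dialect's accepted
    # names, as the default dialect does before set_isolation_level().
    level = normalize_isolation_level(name)
    if level not in VALID_LEVELS:
        raise ValueError(f"invalid isolation level: {name!r}")
    return level
```

For example, `check_isolation_level("repeatable_read")` normalizes to `"REPEATABLE READ"` before validation.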
- get_materialized_view_names(connection: Connection, schema: str | None = None, **kw: Any) List[str]¶
Return a list of all materialized view names available in the database.
This is an internal dialect method. Applications should use _engine.Inspector.get_materialized_view_names().

- Parameters:

schema – schema name to query, if not the default schema.
Added in version 2.0.
- get_multi_check_constraints(connection, **kw)¶
Return information about check constraints in all tables in the given
schema.

This is an internal dialect method. Applications should use Inspector.get_multi_check_constraints().

Note

The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.

Added in version 2.0.
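The fallback strategy this note describes, invoking the single-table method once per reflected name, amounts to something like the following sketch. The callables and sample data are stand-ins for illustration, not the actual DefaultDialect code:

```python
def get_multi(get_names, get_single, schema=None):
    # Call the single-table reflection method for each object name,
    # mirroring the DefaultDialect fallback described above.
    return {name: get_single(name) for name in get_names(schema)}


# Stand-in single-table reflection data for demonstration:
tables = {"t1": ["ck_a"], "t2": []}
result = get_multi(lambda schema: sorted(tables), tables.__getitem__)
```

A dialect that can batch this lookup server-side (one metadata query for the whole schema) is what the note means by "a faster implementation".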
- get_multi_columns(connection, **kw)¶
Return information about columns in all tables in the given
schema.

This is an internal dialect method. Applications should use Inspector.get_multi_columns().

Note

The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.

Added in version 2.0.
- get_multi_foreign_keys(connection, **kw)¶
Return information about foreign_keys in all tables in the given
schema.

This is an internal dialect method. Applications should use _engine.Inspector.get_multi_foreign_keys().

Note

The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.

Added in version 2.0.
- get_multi_indexes(connection, **kw)¶
Return information about indexes in all tables in the given schema.

This is an internal dialect method. Applications should use Inspector.get_multi_indexes().

Note

The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.

Added in version 2.0.
- get_multi_pk_constraint(connection, **kw)¶
Return information about primary key constraints in all tables in the given
schema.

This is an internal dialect method. Applications should use Inspector.get_multi_pk_constraint().

Note

The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.

Added in version 2.0.
- get_multi_table_comment(connection, **kw)¶
Return information about the table comment in all tables in the given
schema.

This is an internal dialect method. Applications should use _engine.Inspector.get_multi_table_comment().

Note

The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.

Added in version 2.0.
- get_multi_table_options(connection, **kw)¶
Return a dictionary of options specified when the tables in the given schema were created.
This is an internal dialect method. Applications should use
_engine.Inspector.get_multi_table_options().

Note

The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.

Added in version 2.0.
- get_multi_unique_constraints(connection, **kw)¶
Return information about unique constraints in all tables in the given
schema.

This is an internal dialect method. Applications should use Inspector.get_multi_unique_constraints().

Note

The _engine.DefaultDialect provides a default implementation that will call the single table method for each object returned by Dialect.get_table_names(), Dialect.get_view_names() or Dialect.get_materialized_view_names() depending on the provided kind. Dialects that want to support a faster implementation should implement this method.

Added in version 2.0.
- get_pk_constraint(connection: Connection, table_name: str, schema: str | None = None, **kw) ReflectedPrimaryKeyConstraint¶
Return information about the primary key constraint on table_name.

Given a _engine.Connection, a string table_name, and an optional string schema, return primary key information as a dictionary corresponding to the ReflectedPrimaryKeyConstraint dictionary.

This is an internal dialect method. Applications should use Inspector.get_pk_constraint().
- classmethod get_pool_class(url: URL) Type[Pool]¶
- get_schema_names(connection, **kw)¶
Return a list of all schema names available in the database.
This is an internal dialect method. Applications should use
_engine.Inspector.get_schema_names().
- get_sequence_names(connection: Connection, schema: str | None = None, **kw: Any) List[str]¶
Return a list of all sequence names available in the database.
This is an internal dialect method. Applications should use
_engine.Inspector.get_sequence_names().

- Parameters:
schema – schema name to query, if not the default schema.
Added in version 1.4.
- get_table_comment(connection: Connection, table_name: str, schema: str | None = None, **kw)¶
Return the “comment” for the table identified by
table_name.

Given a string table_name and an optional string schema, return table comment information as a dictionary corresponding to the ReflectedTableComment dictionary.

This is an internal dialect method. Applications should use Inspector.get_table_comment().

- Raise:

NotImplementedError for dialects that don’t support comments.
Added in version 1.2.
- get_table_names(connection: Connection, schema: str | None = None, **kw)¶
Return a list of table names for
schema.

This is an internal dialect method. Applications should use _engine.Inspector.get_table_names().
- get_table_options(connection: Connection, table_name: str, schema: str | None = None, **kw)¶
Return a dictionary of options specified when
table_name was created.

This is an internal dialect method. Applications should use _engine.Inspector.get_table_options().
- get_temp_table_names(connection: Connection, schema: str | None = None, **kw: Any) List[str]¶
Return a list of temporary table names on the given connection, if supported by the underlying backend.
This is an internal dialect method. Applications should use
_engine.Inspector.get_temp_table_names().
- get_temp_view_names(connection: Connection, schema: str | None = None, **kw: Any) List[str]¶
Return a list of temporary view names on the given connection, if supported by the underlying backend.
This is an internal dialect method. Applications should use
_engine.Inspector.get_temp_view_names().
- get_unique_constraints(connection: Connection, table_name: str, schema: str | None = None, **kw: Any) List[ReflectedUniqueConstraint]¶
Return information about unique constraints in
table_name.

Given a string table_name and an optional string schema, return unique constraint information as a list of dicts corresponding to the ReflectedUniqueConstraint dictionary.

This is an internal dialect method. Applications should use Inspector.get_unique_constraints().
- get_view_definition(connection: Connection, view_name: str, schema: str | None = None, **kw)¶
Return plain or materialized view definition.
This is an internal dialect method. Applications should use
_engine.Inspector.get_view_definition().

Given a _engine.Connection, a string view_name, and an optional string schema, return the view definition.
- get_view_names(connection: Connection, schema: str | None = None, **kw)¶
Return a list of all non-materialized view names available in the database.
This is an internal dialect method. Applications should use
_engine.Inspector.get_view_names().- Parameters:
schema – schema name to query, if not the default schema.
- has_index(connection, table_name, index_name, schema=None, **kw)¶
Check the existence of a particular index name in the database.
Given a
_engine.Connection object, a string table_name and string index name, return True if an index of the given name on the given table exists, False otherwise.

The DefaultDialect implements this in terms of the Dialect.has_table() and Dialect.get_indexes() methods, however dialects can implement a more performant version.

This is an internal dialect method. Applications should use _engine.Inspector.has_index().

Added in version 1.4.
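The DefaultDialect strategy described above, has_table() first and then a scan of get_indexes(), can be sketched with stand-in callables (illustrative only, not the actual SQLAlchemy implementation):

```python
def has_index(has_table, get_indexes, table_name, index_name):
    # Mirror the documented fallback: no table, no index; otherwise
    # look for the index name among the table's reflected indexes.
    if not has_table(table_name):
        return False
    return any(ix["name"] == index_name for ix in get_indexes(table_name))


# Stand-in reflection data for demonstration:
indexes = {"users": [{"name": "ix_users_id"}]}
exists = has_index(indexes.__contains__, indexes.__getitem__,
                   "users", "ix_users_id")
```

A "more performant version" would instead issue a single metadata query for the named index, skipping the full index reflection.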
- has_schema(connection: Connection, schema_name: str, **kw: Any) bool¶
Check the existence of a particular schema name in the database.
Given a
_engine.Connection object, a string schema_name, return True if a schema of the given name exists, False otherwise.

The DefaultDialect implements this by checking the presence of schema_name among the schemas returned by Dialect.get_schema_names(), however dialects can implement a more performant version.

This is an internal dialect method. Applications should use _engine.Inspector.has_schema().

Added in version 2.0.
- has_sequence(connection: Connection, sequence_name: str, schema: str | None = None, **kw: Any) bool¶
Check the existence of a particular sequence in the database.
Given a
_engine.Connection object and a string sequence_name, return True if the given sequence exists in the database, False otherwise.

This is an internal dialect method. Applications should use _engine.Inspector.has_sequence().
- has_table(connection: Connection, table_name: str, schema: str | None = None, **kw)¶
For internal dialect use, check the existence of a particular table or view in the database.
Given a
_engine.Connection object, a string table_name and optional schema name, return True if the given table exists in the database, False otherwise.

This method serves as the underlying implementation of the public facing Inspector.has_table() method, and is also used internally to implement the “checkfirst” behavior for methods like _schema.Table.create() and _schema.MetaData.create_all().

Note

This method is used internally by SQLAlchemy, and is published so that third-party dialects may provide an implementation. It is not the public API for checking for table presence. Please use the Inspector.has_table() method.

Changed in version 2.0: _engine.Dialect.has_table() now formally supports checking for additional table-like objects:

any type of views (plain or materialized)
temporary tables of any kind
Previously, these two checks were not formally specified and different dialects would vary in their behavior. The dialect testing suite now includes tests for all of these object types, and dialects to the degree that the backing database supports views or temporary tables should seek to support locating these objects for full compliance.
- has_terminate: bool = False¶
Whether or not this dialect has a separate “terminate” implementation that does not block or require awaiting.
- include_set_input_sizes: Set[Any] | None = None¶
set of DBAPI type objects that should be included in automatic cursor.setinputsizes() calls.
This is only used if bind_typing is BindTyping.SET_INPUT_SIZES
- initialize(connection: Connection) None¶
Called during strategized creation of the dialect with a connection.
Allows dialects to configure options based on server version info or other properties.
The connection passed here is a SQLAlchemy Connection object, with full capabilities.
The initialize() method of the base dialect should be called via super().
Note
as of SQLAlchemy 1.4, this method is called before any
_engine.Dialect.on_connect() hooks are called.
- inline_comments: bool = False¶
Indicates the dialect supports comment DDL that’s inline with the definition of a Table or Column. If False, this implies that ALTER must be used to set table and column comments.
- insert_executemany_returning: bool¶
dialect / driver / database supports some means of providing INSERT…RETURNING support when dialect.do_executemany() is used.
- insert_executemany_returning_sort_by_parameter_order: bool¶
dialect / driver / database supports some means of providing INSERT…RETURNING support when dialect.do_executemany() is used along with the :paramref:`_dml.Insert.returning.sort_by_parameter_order` parameter being set.
- insert_null_pk_still_autoincrements = False¶
- insertmanyvalues_implicit_sentinel: InsertmanyvaluesSentinelOpts = symbol('NOT_SUPPORTED')¶
Options indicating the database supports a form of bulk INSERT where the autoincrement integer primary key can be reliably used as an ordering for INSERTed rows.
Added in version 2.0.10.
See also
engine_insertmanyvalues_returning_order
- insertmanyvalues_max_parameters: int = 32700¶
Alternate to insertmanyvalues_page_size, will additionally limit page size based on number of parameters total in the statement.
- insertmanyvalues_page_size: int = 1000¶
Number of rows to render into an individual INSERT..VALUES() statement for
ExecuteStyle.INSERTMANYVALUES executions.

The default dialect defaults this to 1000.
Added in version 2.0.
See also
:paramref:`_engine.Connection.execution_options.insertmanyvalues_page_size` - execution option available on
_engine.Connection, statements
- is_disconnect(e: DBAPIModule.Error, connection: pool.PoolProxiedConnection | interfaces.DBAPIConnection | None, cursor: interfaces.DBAPICursor | None) bool¶
Return True if the given DB-API error indicates an invalid connection
- ischema_names: dict[str, type[Any]] = {'array': <class 'sqlalchemy.sql.sqltypes.String'>, 'bigint': <class 'sqlalchemy.sql.sqltypes.BIGINT'>, 'binary': <class 'sqlalchemy.sql.sqltypes.BINARY'>, 'boolean': <class 'sqlalchemy.sql.sqltypes.BOOLEAN'>, 'char': <class 'sqlalchemy.sql.sqltypes.CHAR'>, 'date': <class 'sqlalchemy.sql.sqltypes.DATE'>, 'decimal': <class 'sqlalchemy.sql.sqltypes.DECIMAL'>, 'double': <class 'sqlalchemy.sql.sqltypes.DOUBLE'>, 'float': <class 'sqlalchemy.sql.sqltypes.FLOAT'>, 'int': <class 'sqlalchemy.sql.sqltypes.INTEGER'>, 'integer': <class 'sqlalchemy.sql.sqltypes.INTEGER'>, 'json': <class 'sqlalchemy.sql.sqltypes.JSON'>, 'map': <class 'sqlalchemy.sql.sqltypes.String'>, 'real': <class 'sqlalchemy.sql.sqltypes.FLOAT'>, 'row': <class 'pyathena.sqlalchemy.types.AthenaStruct'>, 'smallint': <class 'sqlalchemy.sql.sqltypes.SMALLINT'>, 'string': <class 'sqlalchemy.sql.sqltypes.String'>, 'struct': <class 'pyathena.sqlalchemy.types.AthenaStruct'>, 'timestamp': <class 'sqlalchemy.sql.sqltypes.TIMESTAMP'>, 'tinyint': <class 'pyathena.sqlalchemy.types.TINYINT'>, 'varbinary': <class 'sqlalchemy.sql.sqltypes.BINARY'>, 'varchar': <class 'sqlalchemy.sql.sqltypes.VARCHAR'>}¶
- classmethod load_provisioning()¶
set up the provision.py module for this dialect.
For dialects that include a provision.py module that sets up provisioning followers, this method should initiate that process.
A typical implementation would be:
@classmethod
def load_provisioning(cls):
    __import__("mydialect.provision")
The default method assumes a module named
provision.py inside the owning package of the current dialect, based on the __module__ attribute:

@classmethod
def load_provisioning(cls):
    package = ".".join(cls.__module__.split(".")[0:-1])
    try:
        __import__(package + ".provision")
    except ImportError:
        pass
Added in version 1.3.14.
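The package computation in that default implementation simply drops the last dotted component of __module__:

```python
def owning_package(module_name: str) -> str:
    # Same expression the default load_provisioning() uses to locate
    # the package that should contain the provision module.
    return ".".join(module_name.split(".")[0:-1])
```

For instance, a dialect defined in `pyathena.sqlalchemy.rest` would resolve to the package `pyathena.sqlalchemy`, so the default would attempt to import `pyathena.sqlalchemy.provision` (and silently ignore an ImportError if no such module exists).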
- loaded_dbapi¶
- max_constraint_name_length: int | None = None¶
The maximum length of constraint names if different from
max_identifier_length.
- max_index_name_length: int | None = None¶
The maximum length of index names if different from
max_identifier_length.
- name: str = 'awsathena'¶
identifying name for the dialect from a DBAPI-neutral point of view (i.e. ‘sqlite’)
- non_native_boolean_check_constraint = True¶
- normalize_name(name)¶
convert the given name to lowercase if it is detected as case insensitive.
This method is only used if the dialect defines requires_name_normalize=True.
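A minimal sketch of this contract, assuming a backend that reports case-insensitive names in UPPERCASE (as e.g. Oracle does); this is illustrative only, not PyAthena's actual implementation:

```python
def normalize_name(name: str) -> str:
    # Names reported entirely in the backend's case-insensitive form
    # (assumed UPPERCASE here) are folded to lowercase for SQLAlchemy;
    # mixed-case names are left alone, since they must be case sensitive.
    if name.isupper():
        return name.lower()
    return name
```

So a reflected `MY_TABLE` becomes `my_table`, while a quoted mixed-case `MyTable` passes through unchanged.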
- on_connect() Callable[[Any], None] | None¶
return a callable which sets up a newly created DBAPI connection.
The callable should accept a single argument “conn” which is the DBAPI connection itself. The inner callable has no return value.
E.g.:
class MyDialect(default.DefaultDialect):
    # ...

    def on_connect(self):
        def do_on_connect(connection):
            connection.execute("SET SPECIAL FLAGS etc")

        return do_on_connect
This is used to set dialect-wide per-connection options such as isolation modes, Unicode modes, etc.
The “do_on_connect” callable is invoked by using the
_events.PoolEvents.connect() event hook, then unwrapping the DBAPI connection and passing it into the callable.

Changed in version 1.4: the on_connect hook is no longer called twice for the first connection of a dialect. The on_connect hook is still called before the _engine.Dialect.initialize() method, however.

Changed in version 1.4.3: the on_connect hook is invoked from a new method on_connect_url that passes the URL that was used to create the connect args. Dialects can implement on_connect_url instead of on_connect if they need the URL object that was used for the connection in order to get additional context.
If None is returned, no event listener is generated.
- Returns:
a callable that accepts a single DBAPI connection as an argument, or None.
See also
Dialect.connect() - allows the DBAPI connect() sequence itself to be controlled.

Dialect.on_connect_url() - supersedes Dialect.on_connect() to also receive the _engine.URL object in context.
- on_connect_url(url: URL) Callable[[Any], Any] | None¶
return a callable which sets up a newly created DBAPI connection.
This method is a new hook that supersedes the
_engine.Dialect.on_connect() method when implemented by a dialect. When not implemented by a dialect, it invokes the _engine.Dialect.on_connect() method directly to maintain compatibility with existing dialects. No deprecation of _engine.Dialect.on_connect() is expected.

The callable should accept a single argument “conn” which is the DBAPI connection itself. The inner callable has no return value.
E.g.:
class MyDialect(default.DefaultDialect):
    # ...

    def on_connect_url(self, url):
        def do_on_connect(connection):
            connection.execute("SET SPECIAL FLAGS etc")

        return do_on_connect
This is used to set dialect-wide per-connection options such as isolation modes, Unicode modes, etc.
This method differs from
_engine.Dialect.on_connect() in that it is passed the _engine.URL object that’s relevant to the connect args. Normally the only way to get this from the _engine.Dialect.on_connect() hook is to look on the _engine.Engine itself, however this URL object may have been replaced by plugins.

Note

The default implementation of _engine.Dialect.on_connect_url() is to invoke the _engine.Dialect.on_connect() method. Therefore if a dialect implements this method, the _engine.Dialect.on_connect() method will not be called unless the overriding dialect calls it directly from here.

Added in version 1.4.3: added _engine.Dialect.on_connect_url() which normally calls into _engine.Dialect.on_connect().

- Parameters:

url – a _engine.URL object representing the _engine.URL that was passed to the _engine.Dialect.create_connect_args() method.

- Returns:
a callable that accepts a single DBAPI connection as an argument, or None.
See also
_engine.Dialect.on_connect()
- preexecute_autoincrement_sequences: bool = False¶
True if ‘implicit’ primary key functions must be executed separately in order to get their value, if RETURNING is not used.
This is currently oriented towards PostgreSQL when the
implicit_returning=False parameter is used on a Table object.
- preparer¶
alias of
AthenaDMLIdentifierPreparer
- reflection_options: Sequence[str] = ()¶
Sequence of string names indicating keyword arguments that can be established on a
Table object which will be passed as “reflection options” when using :paramref:`.Table.autoload_with`.

A current example is “oracle_resolve_synonyms” in the Oracle Database dialects.
- requires_name_normalize: bool = False¶
Indicates symbol names are returned by the database in UPPERCASED if they are case insensitive within the database. If this is True, the methods normalize_name() and denormalize_name() must be provided.
- reset_isolation_level(dbapi_conn)¶
Given a DBAPI connection, revert its isolation to the default.
Note that this is a dialect-level method which is used as part of the implementation of the
_engine.Connection and _engine.Engine isolation level facilities; these APIs should be preferred for most typical use cases.

See also

_engine.Connection.get_isolation_level() - view current level

_engine.Connection.default_isolation_level - view default level

:paramref:`.Connection.execution_options.isolation_level` - set per _engine.Connection isolation level

:paramref:`_sa.create_engine.isolation_level` - set per _engine.Engine isolation level
- returns_native_bytes: bool = False¶
indicates if Python bytes() objects are returned natively by the driver for SQL “binary” datatypes.
Added in version 2.0.11.
- sequences_optional: bool = False¶
If True, indicates if the :paramref:`_schema.Sequence.optional` parameter on the
_schema.Sequence construct should signal to not generate a CREATE SEQUENCE. Applies only to dialects that support sequences. Currently used only to allow PostgreSQL SERIAL to be used on a column that specifies Sequence() for usage on other backends.
- server_side_cursors: bool = False¶
deprecated; indicates if the dialect should attempt to use server side cursors by default
- server_version_info: Tuple[Any, ...] | None = None¶
a tuple containing a version number for the DB backend in use.
This value is only available for supporting dialects, and is typically populated during the initial connection to the database.
- set_connection_execution_options(connection: Connection, opts: Mapping[str, Any]) None¶
Establish execution options for a given connection.
This is implemented by
DefaultDialect in order to implement the :paramref:`_engine.Connection.execution_options.isolation_level` execution option. Dialects can intercept various execution options which may need to modify state on a particular DBAPI connection.

Added in version 1.4.
- set_engine_execution_options(engine: Engine, opts: Mapping[str, Any]) None¶
Establish execution options for a given engine.
This is implemented by
DefaultDialect to establish event hooks for new Connection instances created by the given Engine which will then invoke the Dialect.set_connection_execution_options() method for that connection.
- set_isolation_level(dbapi_connection: DBAPIConnection, level: Literal['SERIALIZABLE', 'REPEATABLE READ', 'READ COMMITTED', 'READ UNCOMMITTED', 'AUTOCOMMIT']) None¶
Given a DBAPI connection, set its isolation level.
Note that this is a dialect-level method which is used as part of the implementation of the
_engine.Connection and _engine.Engine isolation level facilities; these APIs should be preferred for most typical use cases.

If the dialect also implements the Dialect.get_isolation_level_values() method, then the given level is guaranteed to be one of the string names within that sequence, and the method will not need to anticipate a lookup failure.

See also

_engine.Connection.get_isolation_level() - view current level

_engine.Connection.default_isolation_level - view default level

:paramref:`.Connection.execution_options.isolation_level` - set per _engine.Connection isolation level

:paramref:`_sa.create_engine.isolation_level` - set per _engine.Engine isolation level
- statement_compiler¶
alias of
AthenaStatementCompiler
- supports_alter: bool = False¶
True if the database supports ALTER TABLE - used only for generating foreign key constraints in certain circumstances
- supports_constraint_comments: bool = False¶
Indicates if the dialect supports comment DDL on constraints.
Added in version 2.0.
- supports_for_update_of = False¶
- supports_is_distinct_from = True¶
- supports_multivalues_insert: bool = True¶
Target database supports INSERT…VALUES with multiple value sets, i.e. INSERT INTO table (cols) VALUES (…), (…), (…), …
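The statement shape this flag advertises, one INSERT carrying several parenthesized value sets, looks like the following when rendered by hand. This is a hypothetical helper for illustration only; SQLAlchemy's compiler does the real rendering:

```python
def render_multivalues_insert(table: str, columns: list, num_rows: int) -> str:
    # Render INSERT INTO t (a, b) VALUES (?, ?), (?, ?), ... with one
    # placeholder tuple per row: the form a multivalues insert produces.
    cols = ", ".join(columns)
    tuple_sql = "(" + ", ".join("?" for _ in columns) + ")"
    values = ", ".join(tuple_sql for _ in range(num_rows))
    return f"INSERT INTO {table} ({cols}) VALUES {values}"
```

With two columns and two rows this yields `INSERT INTO t (a, b) VALUES (?, ?), (?, ?)`, sending all rows in a single statement rather than one execute per row.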
- supports_native_boolean: bool = True¶
Indicates if the dialect supports a native boolean construct. This will prevent
_types.Boolean from generating a CHECK constraint when that type is used.
- supports_native_decimal: bool = True¶
indicates if Decimal objects are handled and returned for precision numeric types, or if floats are returned
- supports_native_enum: bool = False¶
Indicates if the dialect supports a native ENUM construct. This will prevent
_types.Enum from generating a CHECK constraint when that type is used in “native” mode.
- supports_native_uuid: bool = False¶
indicates if Python UUID() objects are handled natively by the driver for SQL UUID datatypes.
Added in version 2.0.
- supports_sane_multi_rowcount: bool = True¶
Indicate whether the dialect properly implements rowcount for
UPDATE and DELETE statements when executed via executemany.
- supports_sane_rowcount: bool = True¶
Indicate whether the dialect properly implements rowcount for
UPDATEandDELETEstatements.
- property supports_sane_rowcount_returning¶
True if this dialect supports sane rowcount even if RETURNING is in use.
For dialects that don’t support RETURNING, this is synonymous with
supports_sane_rowcount.
- supports_schemas = True¶
- supports_server_side_cursors: generic_fn_descriptor[bool] | bool = False¶
indicates if the dialect supports server side cursors
- supports_simple_order_by_label: bool = True¶
target database supports ORDER BY <labelname>, where <labelname> refers to a label in the columns clause of the SELECT
- supports_views = True¶
- type_compiler¶
alias of
AthenaTypeCompiler
- type_compiler_cls¶
alias of
GenericTypeCompiler
- type_descriptor(typeobj)¶
Provide a database-specific
TypeEngineobject, given the generic object which comes from the types module.This method looks for a dictionary called
colspecsas a class or instance-level variable, and passes on to_types.adapt_type().
- update_returning_multifrom: bool = False¶
if the dialect supports RETURNING with UPDATE..FROM
Added in version 2.0.
- use_insertmanyvalues: bool = False¶
if True, indicates “insertmanyvalues” functionality should be used to allow for
insert_executemany_returningbehavior, if possible.In practice, setting this to True means:
if
supports_multivalues_insert,insert_returninganduse_insertmanyvaluesare all True, the SQL compiler will produce an INSERT that will be interpreted by theDefaultDialectas anExecuteStyle.INSERTMANYVALUESexecution that allows for INSERT of many rows with RETURNING by rewriting a single-row INSERT statement to have multiple VALUES clauses, also executing the statement multiple times for a series of batches when large numbers of rows are given.The parameter is False for the default dialect, and is set to True for SQLAlchemy internal dialects SQLite, MySQL/MariaDB, PostgreSQL, SQL Server. It remains at False for Oracle Database, which provides native “executemany with RETURNING” support and also does not support
supports_multivalues_insert. For MySQL/MariaDB, those MySQL dialects that don’t support RETURNING will not reportinsert_executemany_returningas True.Added in version 2.0.
See also
engine_insertmanyvalues
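The batching behavior described above can be illustrated with a simplified, self-contained sketch. This is not SQLAlchemy's internal implementation; the function name, placeholder style, and statement shapes are illustrative stand-ins for the single-row-INSERT-to-multi-VALUES rewrite that "insertmanyvalues" performs.

```python
# Simplified sketch of the "insertmanyvalues" idea: a single-row INSERT
# template is expanded into multi-VALUES statements, one per batch of
# rows, so many rows can be inserted (with RETURNING) in few executions.
# The SQL shapes and names here are illustrative only.
def batch_insert_statements(table, columns, rows, batch_size=1000):
    placeholder_group = "(" + ", ".join(["%s"] * len(columns)) + ")"
    statements = []
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        # one VALUES group per row in this batch
        values_clause = ", ".join([placeholder_group] * len(batch))
        statements.append(
            f"INSERT INTO {table} ({', '.join(columns)}) "
            f"VALUES {values_clause}"
        )
    return statements
```

With four rows and a batch size of three, this produces two statements: one with three VALUES groups and one with a single group, mirroring how large row sets are split into a series of batched executions.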
- use_insertmanyvalues_wo_returning: bool = False¶
if True, and use_insertmanyvalues is also True, INSERT statements that don’t include RETURNING will also use “insertmanyvalues”.
Added in version 2.0.
See also
engine_insertmanyvalues
- compiler_linting: Linting¶
- type_compiler_instance: TypeCompiler¶
instance of a
Compiledclass used to compile SQL type objectsAdded in version 2.0.
- identifier_preparer: IdentifierPreparer¶
This element will refer to an instance of
IdentifierPrepareronce aDefaultDialecthas been constructed.
- default_isolation_level: IsolationLevel | None¶
the isolation that is implicitly present on new connections
- skip_autocommit_rollback: bool¶
Whether or not the :paramref:`.create_engine.skip_autocommit_rollback` parameter was set.
Added in version 2.0.43.
- class pyathena.sqlalchemy.arrow.AthenaArrowDialect(json_deserializer=None, json_serializer=None, **kwargs)[source]¶
SQLAlchemy dialect for Amazon Athena with Apache Arrow result format.
This dialect extends AthenaDialect to use ArrowCursor, which returns query results as Apache Arrow Tables. Arrow format provides efficient columnar data representation, making it ideal for analytical workloads and integration with data science tools.
- Connection URL Format:
awsathena+arrow://{access_key}:{secret_key}@athena.{region}.amazonaws.com/{schema}
- Query Parameters:
In addition to the base dialect parameters:
- unload: If “true”, use UNLOAD for Parquet output (better performance for large datasets)
Example
>>> from sqlalchemy import create_engine >>> engine = create_engine( ... "awsathena+arrow://:@athena.us-west-2.amazonaws.com/default" ... "?s3_staging_dir=s3://my-bucket/athena-results/" ... "&unload=true" ... )
- supports_statement_cache: bool = True¶
indicates if this dialect supports caching.
All dialects that are compatible with statement caching should set this flag to True directly on each dialect class and subclass that supports it. SQLAlchemy tests that this flag is locally present on each dialect subclass before it will use statement caching. This is to provide safety for legacy or new dialects that are not yet fully tested to be compliant with SQL statement caching.
Added in version 1.4.5.
See also
engine_thirdparty_caching
- create_connect_args(url)[source]¶
Build DB-API compatible connection arguments.
Given a
URLobject, returns a tuple consisting of a(*args, **kwargs)suitable to send directly to the dbapi’s connect function. The arguments are sent to theDialect.connect()method which then runs the DBAPI-levelconnect()function.The method typically makes use of the
URL.translate_connect_args()method in order to generate a dictionary of options.The default implementation is:
def create_connect_args(self, url): opts = url.translate_connect_args() opts.update(url.query) return ([], opts)
- Parameters:
url – a
URLobject- Returns:
a tuple of
(*args, **kwargs)which will be passed to theDialect.connect()method.
See also
URL.translate_connect_args()
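The shape of the default implementation above can be sketched with only the standard library: split the URL into host/schema keyword arguments, then merge in the query string. This is a rough, dialect-agnostic illustration under stated assumptions (the keyword names `schema_name` and the helper name are illustrative; real dialects operate on a sqlalchemy.engine.URL object, not a string).

```python
# Dialect-agnostic sketch of what create_connect_args() does: turn a
# connection URL into (args, kwargs) for the DBAPI connect() function.
# Keyword names here are illustrative, not PyAthena's actual API.
from urllib.parse import urlsplit, parse_qsl

def sketch_create_connect_args(url: str):
    parts = urlsplit(url)
    opts = {
        "user": parts.username or None,
        "password": parts.password or None,
        "host": parts.hostname,
        "schema_name": parts.path.lstrip("/") or None,
    }
    # Query-string parameters (e.g. s3_staging_dir) are merged in,
    # mirroring opts.update(url.query) in the default implementation.
    opts.update(dict(parse_qsl(parts.query)))
    return [], opts

args, kwargs = sketch_create_connect_args(
    "awsathena+rest://:@athena.us-west-2.amazonaws.com/default"
    "?s3_staging_dir=s3://my-bucket/athena-results/"
)
```

Here the empty `{access_key}:{secret_key}` portion of the example URL yields `None` credentials, the path becomes the schema, and `s3_staging_dir` arrives as an ordinary keyword argument.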
- classmethod import_dbapi() ModuleType[source]¶
Import the DBAPI module that is used by this dialect.
The Python module object returned here will be assigned as an instance variable to a constructed dialect under the name
.dbapi.Changed in version 2.0: The
Dialect.import_dbapi()class method is renamed from the previous method.Dialect.dbapi(), which would be replaced at dialect instantiation time by the DBAPI module itself, thus using the same name in two different ways. If a.Dialect.dbapi()classmethod is present on a third-party dialect, it will be used and a deprecation warning will be emitted.
- CACHE_HIT = 0¶
- CACHE_MISS = 1¶
- CACHING_DISABLED = 2¶
- NO_CACHE_KEY = 3¶
- NO_DIALECT_SUPPORT = 4¶
- __init__(json_deserializer=None, json_serializer=None, **kwargs)¶
- bind_typing = 1¶
define a means of passing typing information to the database and/or driver for bound parameters.
See
BindTypingfor values.Added in version 2.0.
- colspecs: dict[type[Any], type[Any]] = {<class 'sqlalchemy.sql.sqltypes.DATE'>: <class 'pyathena.sqlalchemy.types.AthenaDate'>, <class 'sqlalchemy.sql.sqltypes.DATETIME'>: <class 'pyathena.sqlalchemy.types.AthenaTimestamp'>, <class 'sqlalchemy.sql.sqltypes.TIMESTAMP'>: <class 'pyathena.sqlalchemy.types.AthenaTimestamp'>}¶
A dictionary of TypeEngine classes from sqlalchemy.types mapped to subclasses that are specific to the dialect class. This dictionary is class-level only and is not accessed from the dialect instance itself.
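The lookup that colspecs enables can be sketched in a few lines of plain Python. The classes and the `adapt_type` helper below are illustrative stand-ins (not the sqlalchemy or pyathena types themselves), showing only the mapping idea: a generic type resolves to the dialect-specific subclass when one is registered.

```python
# Sketch of the colspecs lookup: map a generic type to the
# dialect-specific implementation registered for it. Class names are
# illustrative stand-ins for the sqlalchemy/pyathena types above.
class TIMESTAMP: ...
class AthenaTimestamp(TIMESTAMP): ...

COLSPECS = {TIMESTAMP: AthenaTimestamp}

def adapt_type(typeobj):
    # walk the type's MRO so subclasses of a registered generic type
    # also resolve to the dialect-specific implementation
    for cls in type(typeobj).__mro__:
        if cls in COLSPECS:
            return COLSPECS[cls]()
    return typeobj  # no dialect-specific override registered
```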
- connect(*cargs: Any, **cparams: Any) DBAPIConnection¶
Establish a connection using this dialect’s DBAPI.
The default implementation of this method is:
def connect(self, *cargs, **cparams): return self.dbapi.connect(*cargs, **cparams)
The
*cargs, **cparamsparameters are generated directly from this dialect’sDialect.create_connect_args()method.This method may be used for dialects that need to perform programmatic per-connection steps when a new connection is procured from the DBAPI.
- Parameters:
*cargs – positional parameters returned from the
Dialect.create_connect_args()method**cparams – keyword parameters returned from the
Dialect.create_connect_args()method.
- Returns:
a DBAPI connection, typically from the PEP 249 module level
.connect()function.
See also
Dialect.create_connect_args()Dialect.on_connect()
- connection_characteristics = {'isolation_level': <sqlalchemy.engine.characteristics.IsolationLevelCharacteristic object>, 'logging_token': <sqlalchemy.engine.characteristics.LoggingTokenCharacteristic object>}¶
- construct_arguments: list[tuple[type[SchemaItem | ClauseElement], Mapping[str, Any]]] | None = [(<class 'sqlalchemy.sql.schema.Table'>, {'bucket_count': None, 'compression': None, 'file_format': None, 'location': None, 'row_format': None, 'serdeproperties': None, 'tblproperties': None}), (<class 'sqlalchemy.sql.schema.Column'>, {'cluster': False, 'partition': False, 'partition_transform': None, 'partition_transform_bucket_count': None, 'partition_transform_truncate_length': None})]¶
Optional set of argument specifiers for various SQLAlchemy constructs, typically schema items.
To implement, establish as a series of tuples, as in:
construct_arguments = [ (schema.Index, {"using": False, "where": None, "ops": None}), ]
If the above construct is established on the PostgreSQL dialect, the
Index construct will now accept the keyword arguments postgresql_using, postgresql_where, and postgresql_ops. Any other argument specified to the constructor of Index which is prefixed with postgresql_ will raise ArgumentError.
construct_argumentsmember will not participate in the argument validation system. For such a dialect, any argument name is accepted by all participating constructs, within the namespace of arguments prefixed with that dialect name. The rationale here is so that third-party dialects that haven’t yet implemented this feature continue to function in the old way.See also
DialectKWArgs- implementing base class which consumesDefaultDialect.construct_arguments
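A minimal sketch of the validation that construct_arguments enables, using the Table argument names from the mapping shown above: a construct accepts only dialect-prefixed keywords whose bare names are registered, and rejects unknown ones. The function and set names here are illustrative, not SQLAlchemy's DialectKWArgs internals.

```python
# Hedged sketch of dialect-keyword validation. The bare argument names
# come from the Table entry in the construct_arguments mapping above;
# the helper itself is illustrative only.
ATHENA_TABLE_ARGS = {
    "bucket_count", "compression", "file_format",
    "location", "row_format", "serdeproperties", "tblproperties",
}

def validate_dialect_kwargs(kwargs: dict, dialect_name: str = "awsathena"):
    prefix = dialect_name + "_"
    validated = {}
    for key, value in kwargs.items():
        if not key.startswith(prefix):
            continue  # not addressed to this dialect
        bare = key[len(prefix):]
        if bare not in ATHENA_TABLE_ARGS:
            raise ValueError(
                f"Argument {key!r} is not accepted by dialect {dialect_name!r}"
            )
        validated[bare] = value
    return validated
```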
- create_xid()¶
Create a random two-phase transaction ID.
This id will be passed to do_begin_twophase(), do_rollback_twophase(), do_commit_twophase(). Its format is unspecified.
- cte_follows_insert: bool = True¶
target database, when given a CTE with an INSERT statement, needs the CTE to be below the INSERT
- classmethod dbapi() ModuleType¶
- dbapi_exception_translation_map: Mapping[str, str] = {}¶
A dictionary of names that will contain as values the names of pep-249 exceptions (“IntegrityError”, “OperationalError”, etc) keyed to alternate class names, to support the case where a DBAPI has exception classes that aren’t named as they are referred to (e.g. IntegrityError = MyException). In the vast majority of cases this dictionary is empty.
- ddl_compiler¶
alias of
AthenaDDLCompiler
- default_metavalue_token: str = 'DEFAULT'¶
for INSERT… VALUES (DEFAULT) syntax, the token to put in the parenthesis.
- default_schema_name: str | None = None¶
the name of the default schema. This value is only available for supporting dialects, and is typically populated during the initial connection to the database.
- default_sequence_base: int = 1¶
the default value that will be rendered as the “START WITH” portion of a CREATE SEQUENCE DDL statement.
- delete_returning_multifrom: bool = False¶
if the dialect supports RETURNING with DELETE..FROM
Added in version 2.0.
- denormalize_name(name)¶
convert the given name to a case insensitive identifier for the backend if it is an all-lowercase name.
This method is only used if the dialect defines requires_name_normalize=True.
- detect_autocommit_setting(dbapi_conn: DBAPIConnection) bool¶
Detect the current autocommit setting for a DBAPI connection.
- Parameters:
dbapi_connection – a DBAPI connection object
- Returns:
True if autocommit is enabled, False if disabled
- Return type:
bool
This method inspects the given DBAPI connection to determine whether autocommit mode is currently enabled. The specific mechanism for detecting autocommit varies by database dialect and DBAPI driver; however, it should be done without network round trips.
Note
Not all dialects support autocommit detection. Dialects that do not support this feature will raise
NotImplementedError.
- property dialect_description¶
- dispatch: dispatcher[Dialect] = <sqlalchemy.event.base.DialectEventsDispatch object>¶
- do_begin(dbapi_connection)¶
Provide an implementation of
connection.begin(), given a DB-API connection.The DBAPI has no dedicated “begin” method and it is expected that transactions are implicit. This hook is provided for those DBAPIs that might need additional help in this area.
- Parameters:
dbapi_connection – a DBAPI connection, typically proxied within a
ConnectionFairy.
- do_begin_twophase(connection: Connection, xid: Any) None¶
Begin a two phase transaction on the given connection.
- Parameters:
connection – a
_engine.Connection.xid – xid
- do_close(dbapi_connection)¶
Provide an implementation of
connection.close(), given a DBAPI connection.This hook is called by the
_pool.Poolwhen a connection has been detached from the pool, or is being returned beyond the normal capacity of the pool.
- do_commit(dbapi_connection)¶
Provide an implementation of
connection.commit(), given a DB-API connection.- Parameters:
dbapi_connection – a DBAPI connection, typically proxied within a
ConnectionFairy.
- do_commit_twophase(connection: Connection, xid: Any, is_prepared: bool = True, recover: bool = False) None¶
Commit a two phase transaction on the given connection.
- Parameters:
connection – a
_engine.Connection.xid – xid
is_prepared – whether or not
TwoPhaseTransaction.prepare()was called.recover – if the recover flag was passed.
- do_execute(cursor, statement, parameters, context=None)¶
Provide an implementation of
cursor.execute(statement, parameters).
- do_execute_no_params(cursor, statement, context=None)¶
Provide an implementation of
cursor.execute(statement).The parameter collection should not be sent.
- do_executemany(cursor, statement, parameters, context=None)¶
Provide an implementation of
cursor.executemany(statement, parameters).
- do_ping(dbapi_connection: DBAPIConnection) bool¶
ping the DBAPI connection and return True if the connection is usable.
- do_prepare_twophase(connection: Connection, xid: Any) None¶
Prepare a two phase transaction on the given connection.
- Parameters:
connection – a
_engine.Connection.xid – xid
- do_recover_twophase(connection: Connection) List[Any]¶
Recover list of uncommitted prepared two phase transaction identifiers on the given connection.
- Parameters:
connection – a
_engine.Connection.
- do_release_savepoint(connection, name)¶
Release the named savepoint on a connection.
- Parameters:
connection – a
_engine.Connection.name – savepoint name.
- do_rollback(dbapi_connection: PoolProxiedConnection) None¶
Provide an implementation of
connection.rollback(), given a DB-API connection.- Parameters:
dbapi_connection – a DBAPI connection, typically proxied within a
ConnectionFairy.
- do_rollback_to_savepoint(connection, name)¶
Rollback a connection to the named savepoint.
- Parameters:
connection – a
_engine.Connection.name – savepoint name.
- do_rollback_twophase(connection: Connection, xid: Any, is_prepared: bool = True, recover: bool = False) None¶
Rollback a two phase transaction on the given connection.
- Parameters:
connection – a
_engine.Connection.xid – xid
is_prepared – whether or not
TwoPhaseTransaction.prepare()was called.recover – if the recover flag was passed.
- do_savepoint(connection, name)¶
Create a savepoint with the given name.
- Parameters:
connection – a
_engine.Connection.name – savepoint name.
- do_set_input_sizes(cursor: DBAPICursor, list_of_tuples: _GenericSetInputSizesType, context: ExecutionContext) Any¶
invoke the cursor.setinputsizes() method with appropriate arguments
This hook is called if the
Dialect.bind_typingattribute is set to theBindTyping.SETINPUTSIZESvalue. Parameter data is passed in a list of tuples (paramname, dbtype, sqltype), whereparamnameis the key of the parameter in the statement,dbtypeis the DBAPI datatype andsqltypeis the SQLAlchemy type. The order of tuples is in the correct parameter order.Added in version 1.4.
Changed in version 2.0: setinputsizes mode is now enabled by setting
Dialect.bind_typingtoBindTyping.SETINPUTSIZES. Dialects which accept ause_setinputsizesparameter should set this value appropriately.
- do_terminate(dbapi_connection)¶
Provide an implementation of
connection.close() that tries as much as possible to not block, given a DBAPI connection. In the vast majority of cases this just calls .close(); however, some asyncio dialects may call upon different API features.
This hook is called by the
_pool.Poolwhen a connection is being recycled or has been invalidated.Added in version 1.4.41.
- engine_config_types: Mapping[str, Any] = {'echo': <function bool_or_str.<locals>.bool_or_value>, 'echo_pool': <function bool_or_str.<locals>.bool_or_value>, 'future': <function asbool>, 'max_overflow': <function asint>, 'pool_recycle': <function asint>, 'pool_size': <function asint>, 'pool_timeout': <function asint>}¶
a mapping of string keys that can be in an engine config linked to type conversion functions.
- classmethod engine_created(engine: Engine) None¶
A convenience hook called before returning the final
_engine.Engine.If the dialect returned a different class from the
get_dialect_cls()method, then the hook is called on both classes, first on the dialect class returned by theget_dialect_cls()method and then on the class on which the method was called.The hook should be used by dialects and/or wrappers to apply special events to the engine or its components. In particular, it allows a dialect-wrapping class to apply dialect-level events.
- exclude_set_input_sizes: Set[Any] | None = None¶
set of DBAPI type objects that should be excluded in automatic cursor.setinputsizes() calls.
This is only used if bind_typing is BindTyping.SET_INPUT_SIZES
- execution_ctx_cls¶
alias of
DefaultExecutionContext
- favor_returning_over_lastrowid: bool = False¶
for backends that support both a lastrowid and a RETURNING insert strategy, favor RETURNING for simple single-int pk inserts.
cursor.lastrowid tends to be more performant on most backends.
- property full_returning¶
Deprecated since version 2.0: full_returning is deprecated, please use insert_returning, update_returning, delete_returning
- classmethod get_async_dialect_cls(url: URL) Type[Dialect]¶
Given a URL, return the
Dialectthat will be used by an async engine.By default this is an alias of
Dialect.get_dialect_cls()and just returns the cls. It may be used if a dialect provides both a sync and async version under the same name, like thepsycopgdriver.Added in version 2.
See also
Dialect.get_dialect_cls()
- get_check_constraints(connection: Connection, table_name: str, schema: str | None = None, **kw: Any) List[ReflectedCheckConstraint]¶
Return information about check constraints in
table_name.Given a string
table_nameand an optional stringschema, return check constraint information as a list of dicts corresponding to theReflectedCheckConstraintdictionary.This is an internal dialect method. Applications should use
Inspector.get_check_constraints().
- get_columns(connection: Connection, table_name: str, schema: str | None = None, **kw)¶
Return information about columns in
table_name.Given a
_engine.Connection, a stringtable_name, and an optional stringschema, return column information as a list of dictionaries corresponding to theReflectedColumndictionary.This is an internal dialect method. Applications should use
Inspector.get_columns().
- get_default_isolation_level(dbapi_conn)¶
Given a DBAPI connection, return its isolation level, or a default isolation level if one cannot be retrieved.
May be overridden by subclasses in order to provide a “fallback” isolation level for databases that cannot reliably retrieve the actual isolation level.
By default, calls the
_engine.Interfaces.get_isolation_level()method, propagating any exceptions raised.Added in version 1.3.22.
- classmethod get_dialect_cls(url: URL) Type[Dialect]¶
Given a URL, return the
Dialectthat will be used.This is a hook that allows an external plugin to provide functionality around an existing dialect, by allowing the plugin to be loaded from the url based on an entrypoint, and then the plugin returns the actual dialect to be used.
By default this just returns the cls.
- get_dialect_pool_class(url: URL) Type[Pool]¶
return a Pool class to use for a given URL
- get_driver_connection(connection: DBAPIConnection) Any¶
Returns the connection object as returned by the external driver package.
For normal dialects that use a DBAPI compliant driver this call will just return the
connectionpassed as argument. For dialects that instead adapt a non DBAPI compliant driver, like when adapting an asyncio driver, this call will return the connection-like object as returned by the driver.Added in version 1.4.24.
- get_foreign_keys(connection: Connection, table_name: str, schema: str | None = None, **kw) list[ReflectedForeignKeyConstraint]¶
Return information about foreign_keys in
table_name.Given a
_engine.Connection, a stringtable_name, and an optional stringschema, return foreign key information as a list of dicts corresponding to theReflectedForeignKeyConstraintdictionary.This is an internal dialect method. Applications should use
_engine.Inspector.get_foreign_keys().
- get_indexes(connection: Connection, table_name: str, schema: str | None = None, **kw) list[ReflectedIndex]¶
Return information about indexes in
table_name.Given a
_engine.Connection, a stringtable_nameand an optional stringschema, return index information as a list of dictionaries corresponding to theReflectedIndexdictionary.This is an internal dialect method. Applications should use
Inspector.get_indexes().
- get_isolation_level(dbapi_connection: DBAPIConnection) Literal['SERIALIZABLE', 'REPEATABLE READ', 'READ COMMITTED', 'READ UNCOMMITTED', 'AUTOCOMMIT']¶
Given a DBAPI connection, return its isolation level.
When working with a
_engine.Connectionobject, the corresponding DBAPI connection may be procured using the_engine.Connection.connectionaccessor.Note that this is a dialect-level method which is used as part of the implementation of the
_engine.Connectionand_engine.Engineisolation level facilities; these APIs should be preferred for most typical use cases.See also
_engine.Connection.get_isolation_level()- view current level_engine.Connection.default_isolation_level- view default level:paramref:`.Connection.execution_options.isolation_level` - set per
_engine.Connectionisolation level:paramref:`_sa.create_engine.isolation_level` - set per
_engine.Engineisolation level
- get_isolation_level_values(dbapi_conn: DBAPIConnection) Sequence[Literal['SERIALIZABLE', 'REPEATABLE READ', 'READ COMMITTED', 'READ UNCOMMITTED', 'AUTOCOMMIT']]¶
return a sequence of string isolation level names that are accepted by this dialect.
The available names should use the following conventions:
- use UPPERCASE names. Isolation level methods will accept lowercase names, but these are normalized into UPPERCASE before being passed along to the dialect.
- separate words should be separated by spaces, not underscores, e.g. REPEATABLE READ. Isolation level names will have underscores converted to spaces before being passed along to the dialect.
- the names for the four standard isolation levels, to the extent that they are supported by the backend, should be READ UNCOMMITTED, READ COMMITTED, REPEATABLE READ, SERIALIZABLE.
- if the dialect supports an autocommit option, it should be provided using the isolation level name AUTOCOMMIT.
- other isolation modes may also be present, provided that they are named in UPPERCASE and use spaces, not underscores.
This function is used so that the default dialect can check that a given isolation level parameter is valid, else raises an
_exc.ArgumentError.A DBAPI connection is passed to the method, in the unlikely event that the dialect needs to interrogate the connection itself to determine this list, however it is expected that most backends will return a hardcoded list of values. If the dialect supports “AUTOCOMMIT”, that value should also be present in the sequence returned.
The method raises
NotImplementedErrorby default. If a dialect does not implement this method, then the default dialect will not perform any checking on a given isolation level value before passing it onto theDialect.set_isolation_level()method. This is to allow backwards-compatibility with third party dialects that may not yet be implementing this method.Added in version 2.0.
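The normalization conventions described above can be sketched as follows. This is an illustrative approximation of the check the default dialect performs, assuming a hardcoded level list; the function name and error message are not SQLAlchemy's actual internals.

```python
# Sketch of isolation-level name normalization: lowercase names are
# uppercased and underscores become spaces before the value is checked
# against the dialect's accepted levels. Illustrative only.
SUPPORTED_LEVELS = (
    "READ UNCOMMITTED", "READ COMMITTED",
    "REPEATABLE READ", "SERIALIZABLE", "AUTOCOMMIT",
)

def normalize_isolation_level(name: str) -> str:
    normalized = name.upper().replace("_", " ")
    if normalized not in SUPPORTED_LEVELS:
        raise ValueError(
            f"Invalid isolation level {name!r}; valid levels: {SUPPORTED_LEVELS}"
        )
    return normalized
```

Under this scheme, "repeatable_read" and "REPEATABLE READ" resolve to the same level, while an unsupported name is rejected before ever reaching Dialect.set_isolation_level().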
- get_materialized_view_names(connection: Connection, schema: str | None = None, **kw: Any) List[str]¶
Return a list of all materialized view names available in the database.
This is an internal dialect method. Applications should use
_engine.Inspector.get_materialized_view_names().- Parameters:
schema –
schema name to query, if not the default schema.
Added in version 2.0.
- get_multi_check_constraints(connection, **kw)¶
Return information about check constraints in all tables in the given
schema.This is an internal dialect method. Applications should use
Inspector.get_multi_check_constraints().Note
The
_engine.DefaultDialectprovides a default implementation that will call the single table method for each object returned byDialect.get_table_names(),Dialect.get_view_names()orDialect.get_materialized_view_names()depending on the providedkind. Dialects that want to support a faster implementation should implement this method.Added in version 2.0.
- get_multi_columns(connection, **kw)¶
Return information about columns in all tables in the given
schema.This is an internal dialect method. Applications should use
Inspector.get_multi_columns().Note
The
_engine.DefaultDialectprovides a default implementation that will call the single table method for each object returned byDialect.get_table_names(),Dialect.get_view_names()orDialect.get_materialized_view_names()depending on the providedkind. Dialects that want to support a faster implementation should implement this method.Added in version 2.0.
- get_multi_foreign_keys(connection, **kw)¶
Return information about foreign_keys in all tables in the given
schema.This is an internal dialect method. Applications should use
_engine.Inspector.get_multi_foreign_keys().Note
The
_engine.DefaultDialectprovides a default implementation that will call the single table method for each object returned byDialect.get_table_names(),Dialect.get_view_names()orDialect.get_materialized_view_names()depending on the providedkind. Dialects that want to support a faster implementation should implement this method.Added in version 2.0.
- get_multi_indexes(connection, **kw)¶
Return information about indexes in all tables in the given
schema.This is an internal dialect method. Applications should use
Inspector.get_multi_indexes().Note
The
_engine.DefaultDialectprovides a default implementation that will call the single table method for each object returned byDialect.get_table_names(),Dialect.get_view_names()orDialect.get_materialized_view_names()depending on the providedkind. Dialects that want to support a faster implementation should implement this method.Added in version 2.0.
- get_multi_pk_constraint(connection, **kw)¶
Return information about primary key constraints in all tables in the given
schema.This is an internal dialect method. Applications should use
Inspector.get_multi_pk_constraint().Note
The
_engine.DefaultDialectprovides a default implementation that will call the single table method for each object returned byDialect.get_table_names(),Dialect.get_view_names()orDialect.get_materialized_view_names()depending on the providedkind. Dialects that want to support a faster implementation should implement this method.Added in version 2.0.
- get_multi_table_comment(connection, **kw)¶
Return information about the table comment in all tables in the given
schema.This is an internal dialect method. Applications should use
_engine.Inspector.get_multi_table_comment().Note
The
_engine.DefaultDialectprovides a default implementation that will call the single table method for each object returned byDialect.get_table_names(),Dialect.get_view_names()orDialect.get_materialized_view_names()depending on the providedkind. Dialects that want to support a faster implementation should implement this method.Added in version 2.0.
- get_multi_table_options(connection, **kw)¶
Return a dictionary of options specified when the tables in the given schema were created.
This is an internal dialect method. Applications should use
_engine.Inspector.get_multi_table_options().Note
The
_engine.DefaultDialectprovides a default implementation that will call the single table method for each object returned byDialect.get_table_names(),Dialect.get_view_names()orDialect.get_materialized_view_names()depending on the providedkind. Dialects that want to support a faster implementation should implement this method.Added in version 2.0.
- get_multi_unique_constraints(connection, **kw)¶
Return information about unique constraints in all tables in the given
schema.This is an internal dialect method. Applications should use
Inspector.get_multi_unique_constraints().Note
The
_engine.DefaultDialectprovides a default implementation that will call the single table method for each object returned byDialect.get_table_names(),Dialect.get_view_names()orDialect.get_materialized_view_names()depending on the providedkind. Dialects that want to support a faster implementation should implement this method.Added in version 2.0.
- get_pk_constraint(connection: Connection, table_name: str, schema: str | None = None, **kw) ReflectedPrimaryKeyConstraint¶
Return information about the primary key constraint on table_name.
Given a
_engine.Connection, a stringtable_name, and an optional stringschema, return primary key information as a dictionary corresponding to theReflectedPrimaryKeyConstraintdictionary.This is an internal dialect method. Applications should use
Inspector.get_pk_constraint().
- classmethod get_pool_class(url: URL) Type[Pool]¶
- get_schema_names(connection, **kw)¶
Return a list of all schema names available in the database.
This is an internal dialect method. Applications should use
_engine.Inspector.get_schema_names().
- get_sequence_names(connection: Connection, schema: str | None = None, **kw: Any) List[str]¶
Return a list of all sequence names available in the database.
This is an internal dialect method. Applications should use
_engine.Inspector.get_sequence_names().- Parameters:
schema – schema name to query, if not the default schema.
Added in version 1.4.
- get_table_comment(connection: Connection, table_name: str, schema: str | None = None, **kw)¶
Return the “comment” for the table identified by
table_name.Given a string
table_nameand an optional stringschema, return table comment information as a dictionary corresponding to theReflectedTableCommentdictionary.This is an internal dialect method. Applications should use
Inspector.get_table_comment().- Raise:
NotImplementedErrorfor dialects that don’t support comments.
Added in version 1.2.
- get_table_names(connection: Connection, schema: str | None = None, **kw)¶
Return a list of table names for
schema.This is an internal dialect method. Applications should use
_engine.Inspector.get_table_names().
- get_table_options(connection: Connection, table_name: str, schema: str | None = None, **kw)¶
Return a dictionary of options specified when
table_namewas created.This is an internal dialect method. Applications should use
_engine.Inspector.get_table_options().
- get_temp_table_names(connection: Connection, schema: str | None = None, **kw: Any) List[str]¶
Return a list of temporary table names on the given connection, if supported by the underlying backend.
This is an internal dialect method. Applications should use
_engine.Inspector.get_temp_table_names().
- get_temp_view_names(connection: Connection, schema: str | None = None, **kw: Any) List[str]¶
Return a list of temporary view names on the given connection, if supported by the underlying backend.
This is an internal dialect method. Applications should use
_engine.Inspector.get_temp_view_names().
- get_unique_constraints(connection: Connection, table_name: str, schema: str | None = None, **kw: Any) List[ReflectedUniqueConstraint]¶
Return information about unique constraints in
table_name.Given a string
table_nameand an optional stringschema, return unique constraint information as a list of dicts corresponding to theReflectedUniqueConstraintdictionary.This is an internal dialect method. Applications should use
Inspector.get_unique_constraints().
- get_view_definition(connection: Connection, view_name: str, schema: str | None = None, **kw)¶
Return plain or materialized view definition.
This is an internal dialect method. Applications should use Inspector.get_view_definition().
Given a Connection, a string view_name, and an optional string schema, return the view definition.
- get_view_names(connection: Connection, schema: str | None = None, **kw)¶
Return a list of all non-materialized view names available in the database.
This is an internal dialect method. Applications should use Inspector.get_view_names().
schema – schema name to query, if not the default schema.
- has_index(connection, table_name, index_name, schema=None, **kw)¶
Check the existence of a particular index name in the database.
Given a
Connection object, a string table_name and string index name, return True if an index of the given name on the given table exists, False otherwise.
The DefaultDialect implements this in terms of the Dialect.has_table() and Dialect.get_indexes() methods, however dialects can implement a more performant version.
This is an internal dialect method. Applications should use Inspector.has_index().
- has_schema(connection: Connection, schema_name: str, **kw: Any) bool¶
Check the existence of a particular schema name in the database.
Given a
Connection object and a string schema_name, return True if a schema of the given name exists, False otherwise.
The DefaultDialect implements this by checking the presence of schema_name among the schemas returned by Dialect.get_schema_names(), however dialects can implement a more performant version.
This is an internal dialect method. Applications should use Inspector.has_schema().
- has_sequence(connection: Connection, sequence_name: str, schema: str | None = None, **kw: Any) bool¶
Check the existence of a particular sequence in the database.
Given a
Connection object and a string sequence_name, return True if the given sequence exists in the database, False otherwise.
This is an internal dialect method. Applications should use Inspector.has_sequence().
- has_table(connection: Connection, table_name: str, schema: str | None = None, **kw)¶
For internal dialect use, check the existence of a particular table or view in the database.
Given a
Connection object, a string table_name and optional schema name, return True if the given table exists in the database, False otherwise.
This method serves as the underlying implementation of the public facing Inspector.has_table() method, and is also used internally to implement the “checkfirst” behavior for methods like Table.create() and MetaData.create_all().
This method is used internally by SQLAlchemy, and is published so that third-party dialects may provide an implementation. It is not the public API for checking for table presence. Please use the
Inspector.has_table() method.
Changed in version 2.0: Dialect.has_table() now formally supports checking for additional table-like objects:
any type of views (plain or materialized)
temporary tables of any kind
Previously, these two checks were not formally specified and different dialects varied in their behavior. The dialect testing suite now includes tests for all of these object types, and dialects whose backing database supports views or temporary tables should seek to support locating these objects for full compliance.
- has_terminate: bool = False¶
Whether or not this dialect has a separate “terminate” implementation that does not block or require awaiting.
- include_set_input_sizes: Set[Any] | None = None¶
set of DBAPI type objects that should be included in automatic cursor.setinputsizes() calls.
This is only used if bind_typing is BindTyping.SET_INPUT_SIZES
- initialize(connection: Connection) None¶
Called during strategized creation of the dialect with a connection.
Allows dialects to configure options based on server version info or other properties.
The connection passed here is a SQLAlchemy Connection object, with full capabilities.
The initialize() method of the base dialect should be called via super().
Note
as of SQLAlchemy 1.4, this method is called before any
Dialect.on_connect() hooks are called.
- inline_comments: bool = False¶
Indicates the dialect supports comment DDL that’s inline with the definition of a Table or Column. If False, this implies that ALTER must be used to set table and column comments.
- insert_executemany_returning: bool¶
dialect / driver / database supports some means of providing INSERT…RETURNING support when dialect.do_executemany() is used.
- insert_executemany_returning_sort_by_parameter_order: bool¶
dialect / driver / database supports some means of providing INSERT…RETURNING support when dialect.do_executemany() is used along with the Insert.returning.sort_by_parameter_order parameter being set.
- insert_null_pk_still_autoincrements = False¶
- insertmanyvalues_implicit_sentinel: InsertmanyvaluesSentinelOpts = symbol('NOT_SUPPORTED')¶
Options indicating the database supports a form of bulk INSERT where the autoincrement integer primary key can be reliably used as an ordering for INSERTed rows.
Added in version 2.0.10.
See also
engine_insertmanyvalues_returning_order
- insertmanyvalues_max_parameters: int = 32700¶
Alternate to insertmanyvalues_page_size, will additionally limit page size based on number of parameters total in the statement.
- insertmanyvalues_page_size: int = 1000¶
Number of rows to render into an individual INSERT..VALUES() statement for
ExecuteStyle.INSERTMANYVALUES executions.
The default dialect defaults this to 1000.
Added in version 2.0.
See also
Connection.execution_options.insertmanyvalues_page_size - execution option available on Connection and on statements
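To illustrate how insertmanyvalues_page_size and insertmanyvalues_max_parameters interact, here is a minimal sketch of the capping arithmetic. This is not PyAthena or SQLAlchemy code; the function name and exact formula are assumptions for illustration only.

```python
def effective_page_size(page_size: int, max_parameters: int, params_per_row: int) -> int:
    """Illustrative only: cap the insertmanyvalues batch so the total
    number of bound parameters stays within max_parameters."""
    if params_per_row <= 0:
        return page_size
    return max(1, min(page_size, max_parameters // params_per_row))


# With the documented defaults (page size 1000, 32700 max parameters),
# a 100-column insert would be batched in pages of 327 rows.
print(effective_page_size(1000, 32700, 100))
```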
- is_disconnect(e: DBAPIModule.Error, connection: pool.PoolProxiedConnection | interfaces.DBAPIConnection | None, cursor: interfaces.DBAPICursor | None) bool¶
Return True if the given DB-API error indicates an invalid connection
- ischema_names: dict[str, type[Any]] = {'array': <class 'sqlalchemy.sql.sqltypes.String'>, 'bigint': <class 'sqlalchemy.sql.sqltypes.BIGINT'>, 'binary': <class 'sqlalchemy.sql.sqltypes.BINARY'>, 'boolean': <class 'sqlalchemy.sql.sqltypes.BOOLEAN'>, 'char': <class 'sqlalchemy.sql.sqltypes.CHAR'>, 'date': <class 'sqlalchemy.sql.sqltypes.DATE'>, 'decimal': <class 'sqlalchemy.sql.sqltypes.DECIMAL'>, 'double': <class 'sqlalchemy.sql.sqltypes.DOUBLE'>, 'float': <class 'sqlalchemy.sql.sqltypes.FLOAT'>, 'int': <class 'sqlalchemy.sql.sqltypes.INTEGER'>, 'integer': <class 'sqlalchemy.sql.sqltypes.INTEGER'>, 'json': <class 'sqlalchemy.sql.sqltypes.JSON'>, 'map': <class 'sqlalchemy.sql.sqltypes.String'>, 'real': <class 'sqlalchemy.sql.sqltypes.FLOAT'>, 'row': <class 'pyathena.sqlalchemy.types.AthenaStruct'>, 'smallint': <class 'sqlalchemy.sql.sqltypes.SMALLINT'>, 'string': <class 'sqlalchemy.sql.sqltypes.String'>, 'struct': <class 'pyathena.sqlalchemy.types.AthenaStruct'>, 'timestamp': <class 'sqlalchemy.sql.sqltypes.TIMESTAMP'>, 'tinyint': <class 'pyathena.sqlalchemy.types.TINYINT'>, 'varbinary': <class 'sqlalchemy.sql.sqltypes.BINARY'>, 'varchar': <class 'sqlalchemy.sql.sqltypes.VARCHAR'>}¶
- classmethod load_provisioning()¶
set up the provision.py module for this dialect.
For dialects that include a provision.py module that sets up provisioning followers, this method should initiate that process.
A typical implementation would be:
@classmethod
def load_provisioning(cls):
    __import__("mydialect.provision")
The default method assumes a module named
provision.py inside the owning package of the current dialect, based on the __module__ attribute:

@classmethod
def load_provisioning(cls):
    package = ".".join(cls.__module__.split(".")[0:-1])
    try:
        __import__(package + ".provision")
    except ImportError:
        pass
Added in version 1.3.14.
- loaded_dbapi¶
- max_constraint_name_length: int | None = None¶
The maximum length of constraint names if different from
max_identifier_length.
- max_index_name_length: int | None = None¶
The maximum length of index names if different from
max_identifier_length.
- name: str = 'awsathena'¶
identifying name for the dialect from a DBAPI-neutral point of view (i.e. ‘sqlite’)
- non_native_boolean_check_constraint = True¶
- normalize_name(name)¶
convert the given name to lowercase if it is detected as case insensitive.
This method is only used if the dialect defines requires_name_normalize=True.
- on_connect() Callable[[Any], None] | None¶
return a callable which sets up a newly created DBAPI connection.
The callable should accept a single argument “conn” which is the DBAPI connection itself. The inner callable has no return value.
E.g.:
class MyDialect(default.DefaultDialect):
    # ...

    def on_connect(self):
        def do_on_connect(connection):
            connection.execute("SET SPECIAL FLAGS etc")

        return do_on_connect
This is used to set dialect-wide per-connection options such as isolation modes, Unicode modes, etc.
The “do_on_connect” callable is invoked by using the
PoolEvents.connect() event hook, then unwrapping the DBAPI connection and passing it into the callable.
Changed in version 1.4: the on_connect hook is no longer called twice for the first connection of a dialect. The on_connect hook is still called before the Dialect.initialize() method however.
Changed in version 1.4.3: the on_connect hook is invoked from a new method on_connect_url that passes the URL that was used to create the connect args. Dialects can implement on_connect_url instead of on_connect if they need the URL object that was used for the connection in order to get additional context.
If None is returned, no event listener is generated.
- Returns:
a callable that accepts a single DBAPI connection as an argument, or None.
See also
Dialect.connect() - allows the DBAPI connect() sequence itself to be controlled.
Dialect.on_connect_url() - supersedes Dialect.on_connect() to also receive the URL object in context.
- on_connect_url(url: URL) Callable[[Any], Any] | None¶
return a callable which sets up a newly created DBAPI connection.
This method is a new hook that supersedes the
Dialect.on_connect() method when implemented by a dialect. When not implemented by a dialect, it invokes the Dialect.on_connect() method directly to maintain compatibility with existing dialects. No deprecation of Dialect.on_connect() is expected.
The callable should accept a single argument “conn” which is the DBAPI connection itself. The inner callable has no return value.
E.g.:
class MyDialect(default.DefaultDialect):
    # ...

    def on_connect_url(self, url):
        def do_on_connect(connection):
            connection.execute("SET SPECIAL FLAGS etc")

        return do_on_connect
This is used to set dialect-wide per-connection options such as isolation modes, Unicode modes, etc.
This method differs from
Dialect.on_connect() in that it is passed the URL object that’s relevant to the connect args. Normally the only way to get this from the Dialect.on_connect() hook is to look on the Engine itself, however this URL object may have been replaced by plugins.
The default implementation of
Dialect.on_connect_url() is to invoke the Dialect.on_connect() method. Therefore if a dialect implements this method, the Dialect.on_connect() method will not be called unless the overriding dialect calls it directly from here.
Added in version 1.4.3: added Dialect.on_connect_url() which normally calls into Dialect.on_connect().
- Parameters:
url – a
URL object representing the URL that was passed to the Dialect.create_connect_args() method.
- Returns:
a callable that accepts a single DBAPI connection as an argument, or None.
See also
Dialect.on_connect()
- preexecute_autoincrement_sequences: bool = False¶
True if ‘implicit’ primary key functions must be executed separately in order to get their value, if RETURNING is not used.
This is currently oriented towards PostgreSQL when the
implicit_returning=False parameter is used on a Table object.
- preparer¶
alias of
AthenaDMLIdentifierPreparer
- reflection_options: Sequence[str] = ()¶
Sequence of string names indicating keyword arguments that can be established on a
Table object which will be passed as “reflection options” when using Table.autoload_with.
Current example is “oracle_resolve_synonyms” in the Oracle Database dialects.
- requires_name_normalize: bool = False¶
Indicates symbol names are returned by the database in UPPERCASED if they are case insensitive within the database. If this is True, the methods normalize_name() and denormalize_name() must be provided.
- reset_isolation_level(dbapi_conn)¶
Given a DBAPI connection, revert its isolation to the default.
Note that this is a dialect-level method which is used as part of the implementation of the
Connection and Engine isolation level facilities; these APIs should be preferred for most typical use cases.
See also
Connection.get_isolation_level() - view current level
Connection.default_isolation_level - view default level
Connection.execution_options.isolation_level - set per Connection isolation level
create_engine.isolation_level - set per Engine isolation level
- returns_native_bytes: bool = False¶
indicates if Python bytes() objects are returned natively by the driver for SQL “binary” datatypes.
Added in version 2.0.11.
- sequences_optional: bool = False¶
If True, indicates if the Sequence.optional parameter on the Sequence construct should signal to not generate a CREATE SEQUENCE. Applies only to dialects that support sequences. Currently used only to allow PostgreSQL SERIAL to be used on a column that specifies Sequence() for usage on other backends.
- server_side_cursors: bool = False¶
deprecated; indicates if the dialect should attempt to use server side cursors by default
- server_version_info: Tuple[Any, ...] | None = None¶
a tuple containing a version number for the DB backend in use.
This value is only available for supporting dialects, and is typically populated during the initial connection to the database.
- set_connection_execution_options(connection: Connection, opts: Mapping[str, Any]) None¶
Establish execution options for a given connection.
This is implemented by
DefaultDialect in order to implement the Connection.execution_options.isolation_level execution option. Dialects can intercept various execution options which may need to modify state on a particular DBAPI connection.
Added in version 1.4.
- set_engine_execution_options(engine: Engine, opts: Mapping[str, Any]) None¶
Establish execution options for a given engine.
This is implemented by
DefaultDialect to establish event hooks for new Connection instances created by the given Engine, which will then invoke the Dialect.set_connection_execution_options() method for that connection.
- set_isolation_level(dbapi_connection: DBAPIConnection, level: Literal['SERIALIZABLE', 'REPEATABLE READ', 'READ COMMITTED', 'READ UNCOMMITTED', 'AUTOCOMMIT']) None¶
Given a DBAPI connection, set its isolation level.
Note that this is a dialect-level method which is used as part of the implementation of the
Connection and Engine isolation level facilities; these APIs should be preferred for most typical use cases.
If the dialect also implements the Dialect.get_isolation_level_values() method, then the given level is guaranteed to be one of the string names within that sequence, and the method will not need to anticipate a lookup failure.
See also
Connection.get_isolation_level() - view current level
Connection.default_isolation_level - view default level
Connection.execution_options.isolation_level - set per Connection isolation level
create_engine.isolation_level - set per Engine isolation level
- statement_compiler¶
alias of
AthenaStatementCompiler
- supports_alter: bool = False¶
True if the database supports ALTER TABLE - used only for generating foreign key constraints in certain circumstances
- supports_constraint_comments: bool = False¶
Indicates if the dialect supports comment DDL on constraints.
Added in version 2.0.
- supports_for_update_of = False¶
- supports_is_distinct_from = True¶
- supports_multivalues_insert: bool = True¶
Target database supports INSERT…VALUES with multiple value sets, i.e. INSERT INTO table (cols) VALUES (…), (…), (…), …
- supports_native_boolean: bool = True¶
Indicates if the dialect supports a native boolean construct. This will prevent
Boolean from generating a CHECK constraint when that type is used.
- supports_native_decimal: bool = True¶
indicates if Decimal objects are handled and returned for precision numeric types, or if floats are returned
- supports_native_enum: bool = False¶
Indicates if the dialect supports a native ENUM construct. This will prevent
Enum from generating a CHECK constraint when that type is used in “native” mode.
- supports_native_uuid: bool = False¶
indicates if Python UUID() objects are handled natively by the driver for SQL UUID datatypes.
Added in version 2.0.
- supports_sane_multi_rowcount: bool = True¶
Indicate whether the dialect properly implements rowcount for
UPDATE and DELETE statements when executed via executemany.
- supports_sane_rowcount: bool = True¶
Indicate whether the dialect properly implements rowcount for
UPDATE and DELETE statements.
- property supports_sane_rowcount_returning¶
True if this dialect supports sane rowcount even if RETURNING is in use.
For dialects that don’t support RETURNING, this is synonymous with
supports_sane_rowcount.
- supports_schemas = True¶
- supports_server_side_cursors: generic_fn_descriptor[bool] | bool = False¶
indicates if the dialect supports server side cursors
- supports_simple_order_by_label: bool = True¶
target database supports ORDER BY <labelname>, where <labelname> refers to a label in the columns clause of the SELECT
- supports_views = True¶
- type_compiler¶
alias of
AthenaTypeCompiler
- type_compiler_cls¶
alias of
GenericTypeCompiler
- type_descriptor(typeobj)¶
Provide a database-specific
TypeEngine object, given the generic object which comes from the types module.
This method looks for a dictionary called colspecs as a class or instance-level variable, and passes on to adapt_type().
- update_returning_multifrom: bool = False¶
if the dialect supports RETURNING with UPDATE..FROM
Added in version 2.0.
- use_insertmanyvalues: bool = False¶
if True, indicates “insertmanyvalues” functionality should be used to allow for insert_executemany_returning behavior, if possible.
In practice, setting this to True means:
if supports_multivalues_insert, insert_returning and use_insertmanyvalues are all True, the SQL compiler will produce an INSERT that will be interpreted by the DefaultDialect as an ExecuteStyle.INSERTMANYVALUES execution that allows for INSERT of many rows with RETURNING by rewriting a single-row INSERT statement to have multiple VALUES clauses, also executing the statement multiple times for a series of batches when large numbers of rows are given.
The parameter is False for the default dialect, and is set to True for the SQLAlchemy internal dialects SQLite, MySQL/MariaDB, PostgreSQL, and SQL Server. It remains False for Oracle Database, which provides native “executemany with RETURNING” support and also does not support supports_multivalues_insert. For MySQL/MariaDB, those MySQL dialects that don’t support RETURNING will not report insert_executemany_returning as True.
Added in version 2.0.
See also
engine_insertmanyvalues
- use_insertmanyvalues_wo_returning: bool = False¶
if True, and use_insertmanyvalues is also True, INSERT statements that don’t include RETURNING will also use “insertmanyvalues”.
Added in version 2.0.
See also
engine_insertmanyvalues
- compiler_linting: Linting¶
- type_compiler_instance: TypeCompiler¶
instance of a
Compiled class used to compile SQL type objects.
Added in version 2.0.
- identifier_preparer: IdentifierPreparer¶
This element will refer to an instance of
IdentifierPreparer once a DefaultDialect has been constructed.
- default_isolation_level: IsolationLevel | None¶
the isolation that is implicitly present on new connections
- skip_autocommit_rollback: bool¶
Whether or not the create_engine.skip_autocommit_rollback parameter was set.
Added in version 2.0.43.
Type System¶
- class pyathena.sqlalchemy.types.AthenaTimestamp[source]¶
SQLAlchemy type for Athena TIMESTAMP values.
This type handles the conversion of Python datetime objects to Athena’s TIMESTAMP literal syntax. When used in queries, datetime values are rendered as
TIMESTAMP 'YYYY-MM-DD HH:MM:SS.mmm'.
The type supports millisecond precision (3 decimal places), which matches Athena’s TIMESTAMP type precision.
Example
>>> from sqlalchemy import Column, Table, MetaData
>>> from pyathena.sqlalchemy.types import AthenaTimestamp
>>> metadata = MetaData()
>>> events = Table('events', metadata,
...     Column('event_time', AthenaTimestamp)
... )
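For illustration, the literal rendering this type performs can be sketched as a plain function. This is a simplified approximation, not the actual AthenaTimestamp implementation; the helper name is hypothetical.

```python
from datetime import datetime


def render_timestamp_literal(value: datetime) -> str:
    """Illustrative sketch: render a datetime as an Athena TIMESTAMP literal
    with millisecond (3-digit) precision."""
    # Truncate microseconds down to milliseconds, matching the documented precision.
    millis = value.microsecond // 1000
    return f"TIMESTAMP '{value.strftime('%Y-%m-%d %H:%M:%S')}.{millis:03d}'"


print(render_timestamp_literal(datetime(2024, 1, 15, 12, 30, 45, 123456)))
```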
- render_literal_cast = True¶
render casts when rendering a value as an inline literal, e.g. with
TypeEngine.literal_processor().
Added in version 2.0.
- render_bind_cast = True¶
Render bind casts for
BindTyping.RENDER_CASTS mode.
If True, this type (usually a dialect level impl type) signals to the compiler that a cast should be rendered around a bound parameter for this type.
Added in version 2.0.
See also
BindTyping
- literal_processor(dialect: Dialect) _LiteralProcessorType[datetime] | None[source]¶
Return a conversion function for processing literal values that are to be rendered directly without using binds.
This function is used when the compiler makes use of the “literal_binds” flag, typically used in DDL generation as well as in certain scenarios where backends don’t accept bound parameters.
Returns a callable which will receive a literal Python value as the sole positional argument and will return a string representation to be rendered in a SQL statement.
Tip
This method is only called relative to a dialect specific type object, which is often private to a dialect in use and is not the same type object as the public facing one, which means it’s not feasible to subclass a
TypeEngine class in order to provide an alternate TypeEngine.literal_processor() method, unless subclassing the UserDefinedType class explicitly.
To provide alternate behavior for TypeEngine.literal_processor(), implement a TypeDecorator class and provide an implementation of TypeDecorator.process_literal_param().
types_typedecorator
- class pyathena.sqlalchemy.types.AthenaDate[source]¶
SQLAlchemy type for Athena DATE values.
This type handles the conversion of Python date objects to Athena’s DATE literal syntax. When used in queries, date values are rendered as
DATE 'YYYY-MM-DD'.
Example
>>> from sqlalchemy import Column, Table, MetaData
>>> from pyathena.sqlalchemy.types import AthenaDate
>>> metadata = MetaData()
>>> orders = Table('orders', metadata,
...     Column('order_date', AthenaDate)
... )
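The DATE rendering is simpler than the TIMESTAMP case and can be sketched as follows. Again this is an illustrative approximation, not the actual AthenaDate implementation; the helper name is hypothetical.

```python
from datetime import date


def render_date_literal(value: date) -> str:
    """Illustrative sketch: render a date as an Athena DATE literal."""
    # date.isoformat() already produces the YYYY-MM-DD form Athena expects.
    return f"DATE '{value.isoformat()}'"


print(render_date_literal(date(2024, 3, 1)))
```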
- render_literal_cast = True¶
render casts when rendering a value as an inline literal, e.g. with
TypeEngine.literal_processor().
Added in version 2.0.
- render_bind_cast = True¶
Render bind casts for
BindTyping.RENDER_CASTS mode.
If True, this type (usually a dialect level impl type) signals to the compiler that a cast should be rendered around a bound parameter for this type.
Added in version 2.0.
See also
BindTyping
- literal_processor(dialect: Dialect) _LiteralProcessorType[date] | None[source]¶
Return a conversion function for processing literal values that are to be rendered directly without using binds.
This function is used when the compiler makes use of the “literal_binds” flag, typically used in DDL generation as well as in certain scenarios where backends don’t accept bound parameters.
Returns a callable which will receive a literal Python value as the sole positional argument and will return a string representation to be rendered in a SQL statement.
Tip
This method is only called relative to a dialect specific type object, which is often private to a dialect in use and is not the same type object as the public facing one, which means it’s not feasible to subclass a
TypeEngine class in order to provide an alternate TypeEngine.literal_processor() method, unless subclassing the UserDefinedType class explicitly.
To provide alternate behavior for TypeEngine.literal_processor(), implement a TypeDecorator class and provide an implementation of TypeDecorator.process_literal_param().
types_typedecorator
- class pyathena.sqlalchemy.types.Tinyint[source]¶
SQLAlchemy type for Athena TINYINT (8-bit signed integer).
TINYINT stores values from -128 to 127. This type is useful for columns that contain small integer values to optimize storage.
- class pyathena.sqlalchemy.types.AthenaStruct(*fields: str | tuple[str, Any])[source]¶
SQLAlchemy type for Athena STRUCT/ROW complex type.
STRUCT represents a record with named fields, similar to a database row or a Python dictionary with typed values. Each field has a name and a data type.
- Parameters:
*fields – Field specifications. Each can be either:
- A string (field name, defaults to STRING type)
- A tuple of (field_name, field_type)
Example
>>> from sqlalchemy import Column, Table, MetaData, types
>>> from pyathena.sqlalchemy.types import AthenaStruct
>>> metadata = MetaData()
>>> users = Table('users', metadata,
...     Column('address', AthenaStruct(
...         ('street', types.String),
...         ('city', types.String),
...         ('zip_code', types.Integer)
...     ))
... )
See also
AWS Athena STRUCT Type: https://docs.aws.amazon.com/athena/latest/ug/rows-and-structs.html
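To make the field-spec convention concrete, the following sketch normalizes field specifications and renders the kind of DDL type string a STRUCT corresponds to. It is illustrative only and not the actual AthenaStruct compilation logic; the function name and the exact spacing of the output are assumptions.

```python
def struct_ddl(*fields) -> str:
    """Illustrative sketch: render field specs as an Athena STRUCT DDL type
    string. Bare strings default to the string type, mirroring the documented
    AthenaStruct behavior; tuples carry (field_name, type_name)."""
    normalized = [(f, "string") if isinstance(f, str) else tuple(f) for f in fields]
    body = ", ".join(f"{name}: {typ}" for name, typ in normalized)
    return f"STRUCT<{body}>"


print(struct_ddl(("street", "string"), ("zip_code", "int")))
```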
- property python_type: type¶
Return the Python type object expected to be returned by instances of this type, if known.
Basically, for those types which enforce a return type, or are known across the board to do such for all common DBAPIs (like
int for example), will return that type.
If a return type is not defined, raises NotImplementedError.
Note that any type also accommodates NULL in SQL, which means you can also get back None from any type in practice.
- class pyathena.sqlalchemy.types.AthenaMap(key_type: Any = None, value_type: Any = None)[source]¶
SQLAlchemy type for Athena MAP complex type.
MAP represents a collection of key-value pairs where all keys have the same type and all values have the same type.
- Parameters:
key_type – SQLAlchemy type for map keys. Defaults to String.
value_type – SQLAlchemy type for map values. Defaults to String.
Example
>>> from sqlalchemy import Column, Table, MetaData, types
>>> from pyathena.sqlalchemy.types import AthenaMap
>>> metadata = MetaData()
>>> settings = Table('settings', metadata,
...     Column('config', AthenaMap(types.String, types.Integer))
... )
See also
AWS Athena MAP Type: https://docs.aws.amazon.com/athena/latest/ug/maps.html
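The key/value defaulting described above can be sketched as a small renderer of the corresponding DDL type string. This is illustrative only, not the actual AthenaMap compilation logic; the function name and output spacing are assumptions.

```python
def map_ddl(key_type: str = "string", value_type: str = "string") -> str:
    """Illustrative sketch: render an Athena MAP DDL type string.
    Both sides default to string, mirroring AthenaMap's documented defaults."""
    return f"MAP<{key_type}, {value_type}>"


print(map_ddl("string", "int"))
```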
- property python_type: type¶
Return the Python type object expected to be returned by instances of this type, if known.
Basically, for those types which enforce a return type, or are known across the board to do such for all common DBAPIs (like
int for example), will return that type.
If a return type is not defined, raises NotImplementedError.
Note that any type also accommodates NULL in SQL, which means you can also get back None from any type in practice.
- class pyathena.sqlalchemy.types.AthenaArray(item_type: Any = None)[source]¶
SQLAlchemy type for Athena ARRAY complex type.
ARRAY represents an ordered collection of elements of the same type.
- Parameters:
item_type – SQLAlchemy type for array elements. Defaults to String.
Example
>>> from sqlalchemy import Column, Table, MetaData, types
>>> from pyathena.sqlalchemy.types import AthenaArray
>>> metadata = MetaData()
>>> posts = Table('posts', metadata,
...     Column('tags', AthenaArray(types.String))
... )
See also
AWS Athena ARRAY Type: https://docs.aws.amazon.com/athena/latest/ug/arrays.html
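As with MAP, the element-type defaulting can be sketched as a DDL-string renderer; because complex types compose, the sketch also shows nesting. Illustrative only, not the actual AthenaArray compilation logic; the function name is an assumption.

```python
def array_ddl(item_type: str = "string") -> str:
    """Illustrative sketch: render an Athena ARRAY DDL type string.
    Elements default to string, mirroring AthenaArray's documented default."""
    return f"ARRAY<{item_type}>"


# Complex types nest: an array of arrays of strings.
print(array_ddl(array_ddl()))
```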
- property python_type: type¶
Return the Python type object expected to be returned by instances of this type, if known.
Basically, for those types which enforce a return type, or are known across the board to do such for all common DBAPIs (like
int for example), will return that type.
If a return type is not defined, raises NotImplementedError.
Note that any type also accommodates NULL in SQL, which means you can also get back None from any type in practice.
Compilers¶
- class pyathena.sqlalchemy.compiler.AthenaTypeCompiler(dialect: Dialect)[source]¶
Type compiler for Amazon Athena SQL types.
This compiler translates SQLAlchemy type objects into Athena-compatible SQL type strings for use in DDL statements. It handles the mapping between SQLAlchemy’s portable types and Athena’s specific type syntax.
Athena has specific requirements for type names that differ from standard SQL. For example, FLOAT maps to REAL in CAST expressions, and various string types (TEXT, NCHAR, NVARCHAR) all map to STRING.
The compiler also supports Athena-specific complex types:
- STRUCT/ROW: Nested record types with named fields
- MAP: Key-value pair collections
- ARRAY: Ordered collections of elements
See also
AWS Athena Data Types: https://docs.aws.amazon.com/athena/latest/ug/data-types.html
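The type-name translation described above can be sketched as a lookup. The real compiler implements per-type visit methods rather than a flat table, so the dictionary and function name here are assumptions for illustration; the specific mappings (FLOAT to REAL, TEXT/NCHAR/NVARCHAR to STRING) are the ones documented above.

```python
# Hypothetical lookup table for illustration only.
GENERIC_TO_ATHENA = {
    "FLOAT": "REAL",      # FLOAT maps to REAL in CAST expressions
    "TEXT": "STRING",
    "NCHAR": "STRING",
    "NVARCHAR": "STRING",
}


def athena_type_name(generic_name: str) -> str:
    """Illustrative sketch: translate a generic SQL type name to its
    Athena-compatible spelling, passing unknown names through unchanged."""
    upper = generic_name.upper()
    return GENERIC_TO_ATHENA.get(upper, upper)


print(athena_type_name("float"))
```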
- class pyathena.sqlalchemy.compiler.AthenaStatementCompiler(dialect: Dialect, statement: ClauseElement | None, cache_key: CacheKey | None = None, column_keys: Sequence[str] | None = None, for_executemany: bool = False, linting: Linting = Linting.NO_LINTING, _supporting_against: SQLCompiler | None = None, **kwargs: Any)[source]¶
SQL statement compiler for Amazon Athena queries.
This compiler generates Athena-compatible SQL statements from SQLAlchemy expression constructs. It handles Athena-specific SQL syntax including:
Function name mapping (e.g., char_length -> length)
Lambda expressions in functions like filter()
CAST expressions with Athena type requirements
OFFSET/LIMIT clause ordering (Athena uses OFFSET before LIMIT)
Time travel hints (FOR TIMESTAMP AS OF, FOR VERSION AS OF)
The compiler ensures that generated SQL is compatible with Presto/Trino syntax used by Athena engine versions 2 and 3.
See also
AWS Athena SQL Reference: https://docs.aws.amazon.com/athena/latest/ug/ddl-sql-reference.html
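The OFFSET/LIMIT ordering point above is easy to get wrong, since many dialects render LIMIT first. A minimal sketch of the clause ordering, with a hypothetical function name and not the compiler's actual rendering code:

```python
def append_limit_offset(sql, limit=None, offset=None):
    """Illustrative sketch: append OFFSET before LIMIT, following the
    Presto/Trino clause order that Athena expects."""
    if offset is not None:
        sql += f" OFFSET {offset}"
    if limit is not None:
        sql += f" LIMIT {limit}"
    return sql


print(append_limit_offset("SELECT * FROM t", limit=10, offset=20))
```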
- class pyathena.sqlalchemy.compiler.AthenaDDLCompiler(dialect: AthenaDialect, statement: CreateTable, schema_translate_map: dict[str | None, str | None] | None = None, render_schema_translate: bool = False, compile_kwargs: dict[str, Any] | None = None)[source]¶
DDL compiler for Amazon Athena CREATE TABLE and related statements.
This compiler generates Athena-compatible DDL statements including support for Athena-specific table options:
External table creation (EXTERNAL keyword for Hive-style tables)
Iceberg table creation (managed tables with ACID support)
File formats: PARQUET, ORC, TEXTFILE, JSON, AVRO, etc.
Row formats with SerDe specifications
Compression settings for various file formats
Table locations in S3
Partitioning (both Hive-style and Iceberg transforms)
Bucketing/clustering for optimized queries
The compiler uses backtick quoting for DDL identifiers (different from DML which uses double quotes) and handles Athena’s reserved words.
Example
A table created with this compiler might generate:
CREATE EXTERNAL TABLE IF NOT EXISTS my_schema.my_table (
    id INT,
    name STRING
)
PARTITIONED BY (dt STRING)
STORED AS PARQUET
LOCATION 's3://my-bucket/my-table/'
TBLPROPERTIES ('parquet.compress' = 'SNAPPY')
See also
AWS Athena CREATE TABLE: https://docs.aws.amazon.com/athena/latest/ug/create-table.html
- property preparer: IdentifierPreparer¶
- __init__(dialect: AthenaDialect, statement: CreateTable, schema_translate_map: dict[str | None, str | None] | None = None, render_schema_translate: bool = False, compile_kwargs: dict[str, Any] | None = None)[source]¶
Construct a new
Compiled object.
- Parameters:
dialect –
Dialect to compile against.
statement –
ClauseElement to be compiled.
schema_translate_map –
dictionary of schema names to be translated when forming the resultant SQL
See also
schema_translating
compile_kwargs – additional kwargs that will be passed to the initial call to
Compiled.process().
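The lookup semantics of schema_translate_map can be sketched in a few lines. This is a simplified illustration of the behavior, not SQLAlchemy's implementation; note that a None key supplies a schema for tables defined without one.

```python
def translate_schema(schema, schema_translate_map=None):
    """Mimic the schema_translate_map lookup at compile time.

    A None key in the map translates tables that have no explicit
    schema; schemas not present in the map pass through unchanged.
    """
    if schema_translate_map and schema in schema_translate_map:
        return schema_translate_map[schema]
    return schema


print(translate_schema(None, {None: "prod_schema"}))      # prod_schema
print(translate_schema("staging", {None: "prod_schema"}))  # staging
```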
Identifier Preparers¶
- class pyathena.sqlalchemy.preparer.AthenaDMLIdentifierPreparer(dialect: Dialect, initial_quote: str = '"', final_quote: str | None = None, escape_quote: str = '"', quote_case_sensitive_collations: bool = True, omit_schema: bool = False)[source]¶
Identifier preparer for Athena DML (SELECT, INSERT, etc.) statements.
This preparer handles quoting and escaping of identifiers in DML statements. It uses double quotes for identifiers and recognizes Athena’s SELECT statement reserved words to determine when quoting is necessary.
Athena’s DML syntax follows Presto/Trino conventions, which differ from DDL syntax (which uses Hive conventions with backticks).
See also
AthenaDDLIdentifierPreparer: Preparer for DDL statements.
AWS Athena Reserved Words: https://docs.aws.amazon.com/athena/latest/ug/reserved-words.html
- reserved_words: set[str] = {'all', 'and', 'any', 'array', 'as', 'asc', 'at', 'bernoulli', 'between', 'both', 'by', 'call', 'cascade', 'case', 'cast', 'column', 'constraint', 'contains', 'corresponding', 'create', 'cross', 'cube', 'current', 'current_catalog', 'current_date', 'current_path', 'current_role', 'current_schema', 'current_time', 'current_timestamp', 'current_user', 'deallocate', 'delete', 'desc', 'describe', 'distinct', 'drop', 'element', 'else', 'end', 'escape', 'every', 'except', 'exec', 'execute', 'exists', 'extract', 'false', 'first', 'for', 'from', 'full', 'group', 'grouping', 'having', 'in', 'inner', 'insert', 'intersect', 'into', 'is', 'join', 'last', 'lateral', 'leading', 'left', 'like', 'localtime', 'localtimestamp', 'natural', 'normalize', 'not', 'null', 'nullif', 'on', 'only', 'or', 'order', 'ordinality', 'outer', 'overlaps', 'partition', 'position', 'prepare', 'range', 'recursive', 'right', 'rollup', 'row', 'rows', 'select', 'some', 'system', 'table', 'tablesample', 'then', 'trailing', 'true', 'uescape', 'unbounded', 'union', 'unnest', 'using', 'values', 'when', 'where', 'window', 'with'}¶
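The quoting decision such a preparer makes can be sketched as follows. This is illustrative only (an abridged reserved-word set and a hypothetical function name, not SQLAlchemy's implementation): an identifier is double-quoted when it is reserved or not a plain lowercase identifier, and embedded double quotes are escaped by doubling.

```python
import re

DML_RESERVED = {"select", "from", "where", "order", "group"}  # abridged

def quote_dml_identifier(name, reserved=DML_RESERVED):
    """Quote an identifier for Athena DML (Presto/Trino conventions)."""
    needs_quote = (
        name.lower() in reserved
        or not re.fullmatch(r"[a-z_][a-z0-9_]*", name)
    )
    if needs_quote:
        # Embedded double quotes are escaped by doubling them.
        return '"' + name.replace('"', '""') + '"'
    return name


print(quote_dml_identifier("order"))   # "order"
print(quote_dml_identifier("my_col"))  # my_col
```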
- class pyathena.sqlalchemy.preparer.AthenaDDLIdentifierPreparer(dialect: Dialect, initial_quote: str = '`', final_quote: str | None = None, escape_quote: str = '`', quote_case_sensitive_collations: bool = True, omit_schema: bool = False)[source]¶
Identifier preparer for Athena DDL (CREATE, ALTER, DROP) statements.
This preparer handles quoting and escaping of identifiers in DDL statements. It uses backticks for identifiers (Hive convention) rather than double quotes (Presto/Trino convention used in DML).
Key differences from the DML preparer:
- Uses backtick (`) as the quote character
- Recognizes DDL-specific reserved words
- Treats underscore (_) as an illegal initial character
See also
AthenaDMLIdentifierPreparer: Preparer for DML statements.
AWS Athena DDL Reserved Words: https://docs.aws.amazon.com/athena/latest/ug/reserved-words.html
- reserved_words = {'all', 'alter', 'and', 'array', 'as', 'authorization', 'between', 'bigint', 'binary', 'boolean', 'both', 'by', 'case', 'cashe', 'cast', 'char', 'column', 'commit', 'conf', 'constraint', 'create', 'cross', 'cube', 'current', 'current_date', 'current_timestamp', 'cursor', 'database', 'date', 'dayofweek', 'decimal', 'delete', 'describe', 'distinct', 'double', 'drop', 'else', 'end', 'exchange', 'exists', 'extended', 'external', 'extract', 'false', 'fetch', 'float', 'floor', 'following', 'for', 'foreign', 'from', 'full', 'function', 'grant', 'group', 'grouping', 'having', 'if', 'import', 'in', 'inner', 'insert', 'int', 'integer', 'intersect', 'interval', 'into', 'is', 'join', 'lateral', 'left', 'less', 'like', 'local', 'macro', 'map', 'more', 'none', 'not', 'null', 'numeric', 'of', 'on', 'only', 'or', 'order', 'out', 'outer', 'over', 'partialscan', 'partition', 'percent', 'preceding', 'precision', 'preserve', 'primary', 'procedure', 'range', 'reads', 'reduce', 'references', 'regexp', 'revoke', 'right', 'rlike', 'rollback', 'rollup', 'row', 'rows', 'select', 'set', 'smallint', 'start', 'table', 'tablesample', 'then', 'time', 'timestamp', 'to', 'transform', 'trigger', 'true', 'truncate', 'unbounded', 'union', 'uniquejoin', 'update', 'user', 'using', 'utc_timestamp', 'values', 'varchar', 'views', 'when', 'where', 'window', 'with'}¶
- illegal_initial_characters = {'$', '0', '1', '2', '3', '4', '5', '6', '7', '8', '9', '_'}¶
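Because of illegal_initial_characters, identifiers beginning with a digit, $, or _ must always be quoted in DDL. A minimal sketch of that decision (illustrative only, with an abridged reserved-word set and a hypothetical function name; the real preparer applies additional rules):

```python
DDL_ILLEGAL_INITIAL = set("$_0123456789")
DDL_RESERVED = {"table", "create", "drop", "alter"}  # abridged

def quote_ddl_identifier(name, reserved=DDL_RESERVED):
    """Backtick-quote an identifier for Athena DDL (Hive conventions)."""
    if name[0] in DDL_ILLEGAL_INITIAL or name.lower() in reserved:
        # Embedded backticks are escaped by doubling them.
        return "`" + name.replace("`", "``") + "`"
    return name


print(quote_ddl_identifier("_internal"))  # `_internal`
print(quote_ddl_identifier("name"))       # name
```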
- __init__(dialect: Dialect, initial_quote: str = '`', final_quote: str | None = None, escape_quote: str = '`', quote_case_sensitive_collations: bool = True, omit_schema: bool = False)[source]¶
Construct a new IdentifierPreparer object.
- initial_quote
Character that begins a delimited identifier.
- final_quote
Character that ends a delimited identifier. Defaults to initial_quote.
- omit_schema
Prevent prepending the schema name. Useful for databases that do not support schemas.