A variety of methods exist to redefine the behavior of existing types as well as to provide new ones.

A frequent need is to force the “string” version of a type, that is the one rendered in a CREATE TABLE statement or other SQL function like CAST, to be changed. For example, an application may want to force the rendering of BINARY for all platforms except for one, in which it wants BLOB to be rendered. Usage of an existing generic type, in this case LargeBinary, is preferred for most use cases. But to control types more accurately, a compilation directive that is per-dialect can be associated with any type:
from sqlalchemy.ext.compiler import compiles
from sqlalchemy.types import BINARY


@compiles(BINARY, "sqlite")
def compile_binary_sqlite(type_, compiler, **kw):
    return "BLOB"
The above code allows the usage of BINARY, which will produce the string BINARY against all backends except SQLite, in which case it will produce BLOB.
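As a quick sketch of the directive's effect, the rendered string for a type can be inspected directly with TypeEngine.compile(); the inline dialect instantiation below is purely for illustration:

```python
from sqlalchemy.ext.compiler import compiles
from sqlalchemy.types import BINARY
from sqlalchemy.dialects import sqlite


@compiles(BINARY, "sqlite")
def compile_binary_sqlite(type_, compiler, **kw):
    return "BLOB"


# SQLite gets the custom rendering; other backends keep the default
print(BINARY().compile(dialect=sqlite.dialect()))  # BLOB
print(BINARY().compile())  # BINARY
```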
See the section Changing Compilation of Types, a subsection of Custom SQL Constructs and Compilation Extension, for additional examples.
The TypeDecorator allows the creation of custom types which add bind-parameter and result-processing behavior to an existing type object. It is used when additional in-Python marshalling of data to and/or from the database is required.

Note: The bind- and result-processing of TypeDecorator is in addition to the processing already performed by the hosted type, which is customized by SQLAlchemy on a per-DBAPI basis to perform processing specific to that DBAPI. While it is possible to replace this handling for a given type through direct subclassing, it is never needed in practice and SQLAlchemy no longer supports this as a public use case.
ORM Tip: The TypeDecorator can be used to provide a consistent means of converting some type of value as it is passed into and out of the database. When using the ORM, a similar technique exists for converting user data from arbitrary formats, which is to use the validates() decorator. This technique may be more appropriate when data coming into an ORM model needs to be normalized in some way that is specific to the business case and isn’t as generic as a datatype.
class sqlalchemy.types.TypeDecorator(*args, **kwargs)

Allows the creation of types which add additional functionality to an existing type.
This method is preferred to direct subclassing of SQLAlchemy’s built-in types as it ensures that all required functionality of the underlying type is kept in place.
Typical usage:
import sqlalchemy.types as types


class MyType(types.TypeDecorator):
    '''Prefixes Unicode values with "PREFIX:" on the way in and
    strips it off on the way out.
    '''

    impl = types.Unicode

    cache_ok = True

    def process_bind_param(self, value, dialect):
        return "PREFIX:" + value

    def process_result_value(self, value, dialect):
        return value[7:]

    def copy(self, **kw):
        return MyType(self.impl.length)
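The two conversion methods above can be exercised in isolation as a sketch of the round trip; the dialect argument is unused by this particular recipe, so None stands in for it:

```python
import sqlalchemy.types as types


class MyType(types.TypeDecorator):
    '''Prefixes Unicode values with "PREFIX:" on the way in and
    strips it off on the way out.
    '''

    impl = types.Unicode

    cache_ok = True

    def process_bind_param(self, value, dialect):
        return "PREFIX:" + value

    def process_result_value(self, value, dialect):
        # "PREFIX:" is seven characters long
        return value[7:]


t = MyType()
bound = t.process_bind_param("hello", None)
print(bound)  # PREFIX:hello
print(t.process_result_value(bound, None))  # hello
```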
The class-level impl attribute is required, and can reference any TypeEngine class. Alternatively, the load_dialect_impl() method can be used to provide different type classes based on the dialect given; in this case, the impl variable can reference TypeEngine as a placeholder.
The TypeDecorator.cache_ok class-level flag indicates if this custom TypeDecorator is safe to be used as part of a cache key. This flag defaults to None, which will initially generate a warning when the SQL compiler attempts to generate a cache key for a statement that uses this type. If the TypeDecorator is not guaranteed to produce the same bind/result behavior and SQL generation every time, this flag should be set to False; otherwise if the class produces the same behavior each time, it may be set to True. See TypeDecorator.cache_ok for further notes on how this works.
Types that receive a Python type that isn’t similar to the ultimate type used may want to define the TypeDecorator.coerce_compared_value() method. This is used to give the expression system a hint when coercing Python objects into bind parameters within expressions. Consider this expression:
mytable.c.somecol + datetime.date(2009, 5, 15)
Above, if “somecol” is an Integer variant, it makes sense that we’re doing date arithmetic, where above is usually interpreted by databases as adding a number of days to the given date. The expression system does the right thing by not attempting to coerce the “date()” value into an integer-oriented bind parameter.

However, in the case of TypeDecorator, we are usually changing an incoming Python type to something new - TypeDecorator by default will “coerce” the non-typed side to be the same type as itself. For example, below we define an “epoch” type that stores a date value as an integer:
import datetime
from datetime import timedelta

import sqlalchemy.types as types


class MyEpochType(types.TypeDecorator):
    impl = types.Integer

    cache_ok = True

    epoch = datetime.date(1970, 1, 1)

    def process_bind_param(self, value, dialect):
        return (value - self.epoch).days

    def process_result_value(self, value, dialect):
        return self.epoch + timedelta(days=value)
Our expression of somecol + date with the above type will coerce the “date” on the right side to also be treated as MyEpochType.
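This coercion can be observed directly on the expression tree; the table below is hypothetical, constructed only to inspect the bound parameter's type:

```python
import datetime
from datetime import timedelta

from sqlalchemy import Column, MetaData, Table
import sqlalchemy.types as types


class MyEpochType(types.TypeDecorator):
    impl = types.Integer

    cache_ok = True

    epoch = datetime.date(1970, 1, 1)

    def process_bind_param(self, value, dialect):
        return (value - self.epoch).days

    def process_result_value(self, value, dialect):
        return self.epoch + timedelta(days=value)


# hypothetical table for illustration
t = Table("mytable", MetaData(), Column("somecol", MyEpochType))
expr = t.c.somecol + datetime.date(2009, 5, 15)

# the right-hand bind parameter picks up MyEpochType via the default
# coerce_compared_value() behavior
print(type(expr.right.type).__name__)  # MyEpochType
```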
This behavior can be overridden via the TypeDecorator.coerce_compared_value() method, which returns a type that should be used for the value of the expression. Below we set it such that an integer value will be treated as an Integer, and any other value is assumed to be a date and will be treated as a MyEpochType:
    def coerce_compared_value(self, op, value):
        if isinstance(value, int):
            return Integer()
        else:
            return self
Warning: Note that the behavior of coerce_compared_value is not inherited by default from that of the base type. If the TypeDecorator is augmenting a type that requires special logic for certain types of operators, this method must be overridden. A key example is when decorating the JSON and JSONB types; the default rules of TypeEngine.coerce_compared_value() should be used in order to deal with operators like index operations:
class MyJsonType(TypeDecorator):
    impl = postgresql.JSON

    cache_ok = True

    def coerce_compared_value(self, op, value):
        return self.impl.coerce_compared_value(op, value)
Without the above step, index operations such as mycol['foo'] will cause the index value 'foo' to be JSON encoded.
Class signature

class sqlalchemy.types.TypeDecorator (sqlalchemy.types.ExternalType, sqlalchemy.sql.expression.SchemaEventTarget, sqlalchemy.types.TypeEngine)
sqlalchemy.types.TypeDecorator.cache_ok = None

inherited from the ExternalType.cache_ok attribute of ExternalType

Indicate if statements using this ExternalType are “safe to cache”.

The default value None will emit a warning and then not allow caching of a statement which includes this type. Set to False to disable statements using this type from being cached at all without a warning. When set to True, the object’s class and selected elements from its state will be used as part of the cache key. For example, using a TypeDecorator:
class MyType(TypeDecorator):
    impl = String

    cache_ok = True

    def __init__(self, choices):
        self.choices = tuple(choices)
        self.internal_only = True
The cache key for the above type would be equivalent to:
>>> MyType(["a", "b", "c"])._static_cache_key
(<class '__main__.MyType'>, ('choices', ('a', 'b', 'c')))
The caching scheme will extract attributes from the type that correspond to the names of parameters in the __init__() method. Above, the “choices” attribute becomes part of the cache key but “internal_only” does not, because there is no parameter named “internal_only”.
The requirements for cacheable elements are that they are hashable and that they indicate the same SQL rendered for expressions using this type every time for a given cache value.
To accommodate for datatypes that refer to unhashable structures such as dictionaries, sets and lists, these objects can be made “cacheable” by assigning hashable structures to the attributes whose names correspond with the names of the arguments. For example, a datatype which accepts a dictionary of lookup values may publish this as a sorted series of tuples. Given a previously un-cacheable type as:
class LookupType(UserDefinedType):
    '''a custom type that accepts a dictionary as a parameter.

    this is the non-cacheable version, as "self.lookup" is not
    hashable.

    '''

    def __init__(self, lookup):
        self.lookup = lookup

    def get_col_spec(self, **kw):
        return "VARCHAR(255)"

    def bind_processor(self, dialect):
        # ...  works with "self.lookup" ...
Where “lookup” is a dictionary. The type will not be able to generate a cache key:
>>> type_ = LookupType({"a": 10, "b": 20})
>>> type_._static_cache_key
<stdin>:1: SAWarning: UserDefinedType LookupType({'a': 10, 'b': 20}) will not
produce a cache key because the ``cache_ok`` flag is not set to True.
Set this flag to True if this type object's state is safe to use
in a cache key, or False to disable this warning.
symbol('no_cache')
If we did set up such a cache key, it wouldn’t be usable. We would get a tuple structure that contains a dictionary inside of it, which cannot itself be used as a key in a “cache dictionary” such as SQLAlchemy’s statement cache, since Python dictionaries aren’t hashable:
>>> # set cache_ok = True
>>> type_.cache_ok = True
>>> # this is the cache key it would generate
>>> key = type_._static_cache_key
>>> key
(<class '__main__.LookupType'>, ('lookup', {'a': 10, 'b': 20}))
>>> # however this key is not hashable, will fail when used with
>>> # SQLAlchemy statement cache
>>> some_cache = {key: "some sql value"}
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: unhashable type: 'dict'
The type may be made cacheable by assigning a sorted tuple of tuples to the “.lookup” attribute:
class LookupType(UserDefinedType):
    '''a custom type that accepts a dictionary as a parameter.

    The dictionary is stored both as itself in a private variable,
    and published in a public variable as a sorted tuple of tuples,
    which is hashable and will also return the same value for any
    two equivalent dictionaries.  Note it assumes the keys and
    values of the dictionary are themselves hashable.

    '''

    cache_ok = True

    def __init__(self, lookup):
        self._lookup = lookup

        # assume keys/values of "lookup" are hashable; otherwise
        # they would also need to be converted in some way here
        self.lookup = tuple((key, lookup[key]) for key in sorted(lookup))

    def get_col_spec(self, **kw):
        return "VARCHAR(255)"

    def bind_processor(self, dialect):
        # ...  works with "self._lookup" ...
Where above, the cache key for LookupType({"a": 10, "b": 20}) will be:
>>> LookupType({"a": 10, "b": 20})._static_cache_key
(<class '__main__.LookupType'>, ('lookup', (('a', 10), ('b', 20))))
New in version 1.4.14: added the cache_ok flag to allow some configurability of caching for TypeDecorator classes.

New in version 1.4.28: added the ExternalType mixin which generalizes the cache_ok flag to both the TypeDecorator and UserDefinedType classes.
Comparator(expr)

A Comparator that is specific to TypeDecorator.

User-defined TypeDecorator classes should not typically need to modify this.
Class signature

class sqlalchemy.types.TypeDecorator.Comparator (sqlalchemy.types.Comparator)
sqlalchemy.types.TypeDecorator.Comparator.operate(op, *other, **kwargs)

Operate on an argument.

This is the lowest level of operation, and raises NotImplementedError by default.

Overriding this on a subclass can allow common behavior to be applied to all operations. For example, overriding ColumnOperators to apply func.lower() to the left and right side:
class MyComparator(ColumnOperators):
    def operate(self, op, other):
        return op(func.lower(self), func.lower(other))
sqlalchemy.types.TypeDecorator.Comparator.reverse_operate(op, other, **kwargs)

Reverse operate on an argument.

Usage is the same as operate().
sqlalchemy.types.TypeDecorator.__init__(*args, **kwargs)

Construct a TypeDecorator.

Arguments sent here are passed to the constructor of the class assigned to the impl class level attribute, assuming the impl is a callable, and the resulting object is assigned to the self.impl instance attribute (thus overriding the class attribute of the same name).

If the class level impl is not a callable (the unusual case), it will be assigned to the same instance attribute ‘as-is’, ignoring those arguments passed to the constructor.

Subclasses can override this to customize the generation of self.impl entirely.
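The pass-through of constructor arguments can be sketched as follows, using a minimal decorator around String:

```python
import sqlalchemy.types as types


class MyType(types.TypeDecorator):
    impl = types.String

    cache_ok = True


# the (50) is forwarded to String(); self.impl becomes a String instance
t = MyType(50)
print(isinstance(t.impl, types.String))  # True
print(t.impl.length)  # 50
```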
sqlalchemy.types.TypeDecorator.bind_expression(bindparam)

Given a bind value (i.e. a BindParameter instance), return a SQL expression which will typically wrap the given parameter.

Note: This method is called during the SQL compilation phase of a statement, when rendering a SQL string. It is not necessarily called against specific values, and should not be confused with the TypeDecorator.process_bind_param() method, which is the more typical method that processes the actual value passed to a particular parameter at statement execution time.

Subclasses of TypeDecorator can override this method to provide custom bind expression behavior for the type. This implementation will replace that of the underlying implementation type.
sqlalchemy.types.TypeDecorator.bind_processor(dialect)

Provide a bound value processing function for the given Dialect.

This is the method that fulfills the TypeEngine contract for bound value conversion which normally occurs via the TypeEngine.bind_processor() method.

Note: User-defined subclasses of TypeDecorator should not implement this method, and should instead implement TypeDecorator.process_bind_param() so that the “inner” processing provided by the implementing type is maintained.

Parameters: dialect – Dialect instance in use.
sqlalchemy.types.TypeDecorator.coerce_compared_value(op, value)

Suggest a type for a ‘coerced’ Python value in an expression.

By default, returns self. This method is called by the expression system when an object using this type is on the left or right side of an expression against a plain Python object which does not yet have a SQLAlchemy type assigned:

    expr = table.c.somecolumn + 35

Where above, if somecolumn uses this type, this method will be called with the value operator.add and 35. The return value is whatever SQLAlchemy type should be used for 35 for this particular operation.
sqlalchemy.types.TypeDecorator.coerce_to_is_types = (<class 'NoneType'>,)

Specify those Python types which should be coerced at the expression level to “IS <constant>” when compared using == (and same for IS NOT in conjunction with !=).

For most SQLAlchemy types, this includes NoneType, as well as bool.

TypeDecorator modifies this list to only include NoneType, as typedecorator implementations that deal with boolean types are common.

Custom TypeDecorator classes can override this attribute to return an empty tuple, in which case no values will be coerced to constants.
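A brief sketch of the effect, using a hypothetical MyBool decorator around Boolean:

```python
from sqlalchemy import Column, MetaData, Table
import sqlalchemy.types as types


class MyBool(types.TypeDecorator):
    """Hypothetical decorator around Boolean, for illustration."""

    impl = types.Boolean

    cache_ok = True


flag = Table("t", MetaData(), Column("flag", MyBool)).c.flag

# NoneType is in coerce_to_is_types, so comparing to None renders IS NULL
print(flag == None)  # t.flag IS NULL

# bool is not in TypeDecorator's list, so True becomes a bound parameter
# rather than an IS comparison
print(flag == True)
```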
sqlalchemy.types.TypeDecorator.column_expression(column)

Given a SELECT column expression, return a wrapping SQL expression.

Note: This method is called during the SQL compilation phase of a statement, when rendering a SQL string. It is not called against specific values, and should not be confused with the TypeDecorator.process_result_value() method, which is the more typical method that processes the actual value returned in a result row subsequent to statement execution time.

Subclasses of TypeDecorator can override this method to provide custom column expression behavior for the type. This implementation will replace that of the underlying implementation type.

See the description of TypeEngine.column_expression() for a complete description of the method’s use.
sqlalchemy.types.TypeDecorator.comparator_factory

A Comparator class which will apply to operations performed by owning ColumnElement objects.

The comparator_factory attribute is a hook consulted by the core expression system when column and SQL expression operations are performed. When a Comparator class is associated with this attribute, it allows custom re-definition of all existing operators, as well as definition of new operators. Existing operators include those provided by Python operator overloading such as ColumnOperators.__add__() and ColumnOperators.__eq__(), as well as those provided as standard attributes of ColumnOperators such as ColumnOperators.like() and ColumnOperators.in_().

Rudimentary usage of this hook is allowed through simple subclassing of existing types, or alternatively by using TypeDecorator. See the documentation section Redefining and Creating New Operators for examples.
sqlalchemy.types.TypeDecorator.compare_values(x, y)

Given two values, compare them for equality.

By default this calls upon TypeEngine.compare_values() of the underlying “impl”, which in turn usually uses the Python equals operator ==.

This function is used by the ORM to compare an original-loaded value with an intercepted “changed” value, to determine if a net change has occurred.
sqlalchemy.types.TypeDecorator.copy(**kw)

Produce a copy of this TypeDecorator instance.

This is a shallow copy and is provided to fulfill part of the TypeEngine contract. It usually does not need to be overridden unless the user-defined TypeDecorator has local state that should be deep-copied.
sqlalchemy.types.TypeDecorator.get_dbapi_type(dbapi)

Return the DBAPI type object represented by this TypeDecorator.

By default this calls upon TypeEngine.get_dbapi_type() of the underlying “impl”.
sqlalchemy.types.TypeDecorator.literal_processor(dialect)

Provide a literal processing function for the given Dialect.

This is the method that fulfills the TypeEngine contract for literal value conversion which normally occurs via the TypeEngine.literal_processor() method.

Note: User-defined subclasses of TypeDecorator should not implement this method, and should instead implement TypeDecorator.process_literal_param() so that the “inner” processing provided by the implementing type is maintained.
sqlalchemy.types.TypeDecorator.load_dialect_impl(dialect)

Return a TypeEngine object corresponding to a dialect.

This is an end-user override hook that can be used to provide differing types depending on the given dialect. It is used by the TypeDecorator implementation of type_engine() to help determine what type should ultimately be returned for a given TypeDecorator.

By default returns self.impl.
sqlalchemy.types.TypeDecorator.process_bind_param(value, dialect)

Receive a bound parameter value to be converted.

Custom subclasses of TypeDecorator should override this method to provide custom behaviors for incoming data values. This method is called at statement execution time and is passed the literal Python data value which is to be associated with a bound parameter in the statement.

The operation could be anything desired to perform custom behavior, such as transforming or serializing data. This could also be used as a hook for validating logic.
sqlalchemy.types.TypeDecorator.process_literal_param(value, dialect)

Receive a literal parameter value to be rendered inline within a statement.

Note: This method is called during the SQL compilation phase of a statement, when rendering a SQL string. Unlike other SQL compilation methods, it is passed a specific Python value to be rendered as a string. However it should not be confused with the TypeDecorator.process_bind_param() method, which is the more typical method that processes the actual value passed to a particular parameter at statement execution time.

Custom subclasses of TypeDecorator should override this method to provide custom behaviors for incoming data values that are in the special case of being rendered as literals.

The returned string will be rendered into the output string.
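As a sketch, a hypothetical TagString type below prefixes values when they are rendered inline via the literal_binds compiler flag; the class and prefix are invented for illustration:

```python
from sqlalchemy import literal, select
import sqlalchemy.types as types


class TagString(types.TypeDecorator):
    """Hypothetical type that prefixes values rendered as inline literals."""

    impl = types.String

    cache_ok = True

    def process_literal_param(self, value, dialect):
        # the returned string is rendered into the SQL output
        return "TAG:" + value


stmt = select(literal("x", TagString()))
print(stmt.compile(compile_kwargs={"literal_binds": True}))
```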
sqlalchemy.types.TypeDecorator.process_result_value(value, dialect)

Receive a result-row column value to be converted.

Custom subclasses of TypeDecorator should override this method to provide custom behaviors for data values being received in result rows coming from the database. This method is called at result fetching time and is passed the literal Python data value that’s extracted from a database result row.

The operation could be anything desired to perform custom behavior, such as transforming or deserializing data.
sqlalchemy.types.TypeDecorator.result_processor(dialect, coltype)

Provide a result value processing function for the given Dialect.

This is the method that fulfills the TypeEngine contract for result value conversion which normally occurs via the TypeEngine.result_processor() method.

Note: User-defined subclasses of TypeDecorator should not implement this method, and should instead implement TypeDecorator.process_result_value() so that the “inner” processing provided by the implementing type is maintained.
sqlalchemy.types.TypeDecorator.type_engine(dialect)

Return a dialect-specific TypeEngine instance for this TypeDecorator.

In most cases this returns a dialect-adapted form of the TypeEngine type represented by self.impl. Makes usage of dialect_impl(). Behavior can be customized here by overriding load_dialect_impl().
A few key TypeDecorator recipes follow.

A common source of confusion regarding the Unicode type is that it is intended to deal only with Python unicode objects on the Python side, meaning values passed to it as bind parameters must be of the form u'some string' if using Python 2 and not 3. The encoding/decoding functions it performs are only to suit what the DBAPI in use requires, and are primarily a private implementation detail.

The use case of a type that can safely receive Python bytestrings, that is strings that contain non-ASCII characters and are not u'' objects in Python 2, can be achieved using a TypeDecorator which coerces as needed:
from sqlalchemy.types import TypeDecorator, Unicode


class CoerceUTF8(TypeDecorator):
    """Safely coerce Python bytestrings to Unicode
    before passing off to the database."""

    impl = Unicode

    def process_bind_param(self, value, dialect):
        # on Python 3, incoming bytestrings are ``bytes``; decode them
        # to ``str`` before they reach the DBAPI
        if isinstance(value, bytes):
            value = value.decode("utf-8")
        return value
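The coercion can be exercised directly; this sketch assumes Python 3 semantics where a bytestring is ``bytes``, and passes None for the unused dialect argument:

```python
from sqlalchemy.types import TypeDecorator, Unicode


class CoerceUTF8(TypeDecorator):
    """Safely coerce Python bytestrings to Unicode
    before passing off to the database."""

    impl = Unicode

    def process_bind_param(self, value, dialect):
        if isinstance(value, bytes):
            value = value.decode("utf-8")
        return value


t = CoerceUTF8()
print(t.process_bind_param(b"caf\xc3\xa9", None))  # café
print(t.process_bind_param("already unicode", None))  # already unicode
```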
Some database connectors like those of SQL Server choke if a Decimal is passed with too many decimal places. Here’s a recipe that rounds them down:
from decimal import Decimal

from sqlalchemy.types import TypeDecorator, Numeric


class SafeNumeric(TypeDecorator):
    """Adds quantization to Numeric."""

    impl = Numeric

    def __init__(self, *arg, **kw):
        TypeDecorator.__init__(self, *arg, **kw)
        self.quantize_int = -self.impl.scale
        self.quantize = Decimal(10) ** self.quantize_int

    def process_bind_param(self, value, dialect):
        if (
            isinstance(value, Decimal)
            and value.as_tuple()[2] < self.quantize_int
        ):
            value = value.quantize(self.quantize)
        return value
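A quick sketch of the quantization for a NUMERIC(10, 2) column; the dialect argument is unused by this recipe so None is passed:

```python
from decimal import Decimal

from sqlalchemy.types import TypeDecorator, Numeric


class SafeNumeric(TypeDecorator):
    """Adds quantization to Numeric."""

    impl = Numeric

    def __init__(self, *arg, **kw):
        TypeDecorator.__init__(self, *arg, **kw)
        self.quantize_int = -self.impl.scale
        self.quantize = Decimal(10) ** self.quantize_int

    def process_bind_param(self, value, dialect):
        if (
            isinstance(value, Decimal)
            and value.as_tuple()[2] < self.quantize_int
        ):
            value = value.quantize(self.quantize)
        return value


# NUMERIC(10, 2): values are quantized down to two decimal places
t = SafeNumeric(10, 2)
print(t.process_bind_param(Decimal("3.14159"), None))  # 3.14
```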
Timestamps in databases should always be stored in a timezone-agnostic way. For most databases, this means ensuring a timestamp is first in the UTC timezone before it is stored, then storing it as timezone-naive (that is, without any timezone associated with it; UTC is assumed to be the “implicit” timezone). Alternatively, database-specific types like PostgreSQL’s “TIMESTAMP WITH TIMEZONE” are often preferred for their richer functionality; however, storing as plain UTC will work on all databases and drivers. When a timezone-intelligent database type is not an option or is not preferred, the TypeDecorator can be used to create a datatype that converts timezone-aware timestamps into timezone-naive and back again. Below, Python’s built-in datetime.timezone.utc timezone is used to normalize and denormalize:
import datetime


class TZDateTime(TypeDecorator):
    impl = DateTime

    cache_ok = True

    def process_bind_param(self, value, dialect):
        if value is not None:
            if not value.tzinfo:
                raise TypeError("tzinfo is required")
            value = value.astimezone(datetime.timezone.utc).replace(tzinfo=None)
        return value

    def process_result_value(self, value, dialect):
        if value is not None:
            value = value.replace(tzinfo=datetime.timezone.utc)
        return value
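The normalization can be sketched directly: an aware timestamp at UTC+5 is stored as naive UTC, and comes back out tagged with timezone.utc (None stands in for the unused dialect argument):

```python
import datetime

from sqlalchemy.types import TypeDecorator, DateTime


class TZDateTime(TypeDecorator):
    impl = DateTime

    cache_ok = True

    def process_bind_param(self, value, dialect):
        if value is not None:
            if not value.tzinfo:
                raise TypeError("tzinfo is required")
            value = value.astimezone(datetime.timezone.utc).replace(tzinfo=None)
        return value

    def process_result_value(self, value, dialect):
        if value is not None:
            value = value.replace(tzinfo=datetime.timezone.utc)
        return value


t = TZDateTime()
aware = datetime.datetime(
    2021, 1, 1, 12, 0, tzinfo=datetime.timezone(datetime.timedelta(hours=5))
)
stored = t.process_bind_param(aware, None)
print(stored)  # 2021-01-01 07:00:00  (naive, normalized to UTC)
print(t.process_result_value(stored, None))  # 2021-01-01 07:00:00+00:00
```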
Receives and returns Python uuid() objects. Uses the PG UUID type when using PostgreSQL, CHAR(32) on other backends, storing them in stringified hex format. Can be modified to store binary in CHAR(16) if desired:
import uuid

from sqlalchemy.types import TypeDecorator, CHAR
from sqlalchemy.dialects.postgresql import UUID


class GUID(TypeDecorator):
    """Platform-independent GUID type.

    Uses PostgreSQL's UUID type, otherwise uses
    CHAR(32), storing as stringified hex values.

    """

    impl = CHAR

    cache_ok = True

    def load_dialect_impl(self, dialect):
        if dialect.name == "postgresql":
            return dialect.type_descriptor(UUID())
        else:
            return dialect.type_descriptor(CHAR(32))

    def process_bind_param(self, value, dialect):
        if value is None:
            return value
        elif dialect.name == "postgresql":
            return str(value)
        else:
            if not isinstance(value, uuid.UUID):
                return "%.32x" % uuid.UUID(value).int
            else:
                # hexstring
                return "%.32x" % value.int

    def process_result_value(self, value, dialect):
        if value is None:
            return value
        else:
            if not isinstance(value, uuid.UUID):
                value = uuid.UUID(value)
            return value
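The non-PostgreSQL path can be exercised with a stand-in dialect object (the FakeDialect class below is invented for illustration; only its .name attribute is consulted by this recipe):

```python
import uuid

from sqlalchemy.types import TypeDecorator, CHAR
from sqlalchemy.dialects.postgresql import UUID


class GUID(TypeDecorator):
    """Platform-independent GUID type: PG UUID, else CHAR(32) hex."""

    impl = CHAR

    cache_ok = True

    def load_dialect_impl(self, dialect):
        if dialect.name == "postgresql":
            return dialect.type_descriptor(UUID())
        else:
            return dialect.type_descriptor(CHAR(32))

    def process_bind_param(self, value, dialect):
        if value is None:
            return value
        elif dialect.name == "postgresql":
            return str(value)
        else:
            if not isinstance(value, uuid.UUID):
                return "%.32x" % uuid.UUID(value).int
            else:
                return "%.32x" % value.int

    def process_result_value(self, value, dialect):
        if value is None:
            return value
        else:
            if not isinstance(value, uuid.UUID):
                value = uuid.UUID(value)
            return value


class FakeDialect:
    """Hypothetical stand-in; only .name is consulted here."""

    name = "sqlite"


g = GUID()
u = uuid.uuid4()
stored = g.process_bind_param(u, FakeDialect())
print(len(stored))  # 32 hex characters
print(g.process_result_value(stored, FakeDialect()) == u)  # True
```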
This type marshals Python data structures to/from JSON; the example below uses Python’s builtin json module, and can be modified to use simplejson or another encoder:
import json

from sqlalchemy.types import TypeDecorator, VARCHAR


class JSONEncodedDict(TypeDecorator):
    """Represents an immutable structure as a json-encoded string.

    Usage::

        JSONEncodedDict(255)

    """

    impl = VARCHAR

    cache_ok = True

    def process_bind_param(self, value, dialect):
        if value is not None:
            value = json.dumps(value)
        return value

    def process_result_value(self, value, dialect):
        if value is not None:
            value = json.loads(value)
        return value
The ORM by default will not detect “mutability” on such a type as above - meaning, in-place changes to values will not be detected and will not be flushed. Without further steps, you instead would need to replace the existing value with a new one on each parent object to detect changes:
obj.json_value["key"] = "value" # will *not* be detected by the ORM
obj.json_value = {"key": "value"} # *will* be detected by the ORM
The above limitation may be fine, as many applications may not require that the values are ever mutated once created. For those which do have this requirement, support for mutability is best applied using the sqlalchemy.ext.mutable extension. For a dictionary-oriented JSON structure, we can apply this as:

json_type = MutableDict.as_mutable(JSONEncodedDict)


class MyClass(Base):
    # ...

    json_data = Column(json_type)
The default behavior of TypeDecorator is to coerce the “right hand side” of any expression into the same type. For a type like JSON, this means that any operator used must make sense in terms of JSON. For some cases, users may wish for the type to behave like JSON in some circumstances, and as plain text in others. One example is if one wanted to handle the LIKE operator for the JSON type. LIKE makes no sense against a JSON structure, but it does make sense against the underlying textual representation. To get at this with a type like JSONEncodedDict, we need to coerce the column to a textual form using cast() or type_coerce() before attempting to use this operator:
from sqlalchemy import type_coerce, String

stmt = select(my_table).where(
    type_coerce(my_table.c.json_data, String).like('%foo%')
)
TypeDecorator provides a built-in system for working up type translations like these based on operators. If we wanted to frequently use the LIKE operator with our JSON object interpreted as a string, we can build it into the type by overriding the TypeDecorator.coerce_compared_value() method:
from sqlalchemy.sql import operators
from sqlalchemy import String


class JSONEncodedDict(TypeDecorator):
    impl = VARCHAR

    cache_ok = True

    def coerce_compared_value(self, op, value):
        if op in (operators.like_op, operators.not_like_op):
            return String()
        else:
            return self

    def process_bind_param(self, value, dialect):
        if value is not None:
            value = json.dumps(value)
        return value

    def process_result_value(self, value, dialect):
        if value is not None:
            value = json.loads(value)
        return value
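The operator-dependent coercion can be sketched by inspecting the bound parameter's type on each kind of expression; the table here is hypothetical:

```python
import json

from sqlalchemy import Column, MetaData, String, Table
from sqlalchemy.sql import operators
from sqlalchemy.types import TypeDecorator, VARCHAR


class JSONEncodedDict(TypeDecorator):
    impl = VARCHAR

    cache_ok = True

    def coerce_compared_value(self, op, value):
        if op in (operators.like_op, operators.not_like_op):
            return String()
        else:
            return self

    def process_bind_param(self, value, dialect):
        if value is not None:
            value = json.dumps(value)
        return value

    def process_result_value(self, value, dialect):
        if value is not None:
            value = json.loads(value)
        return value


t = Table("t", MetaData(), Column("json_data", JSONEncodedDict(255)))

# LIKE comparisons bind their value as String, so it is not JSON encoded
like_expr = t.c.json_data.like("%foo%")
print(type(like_expr.right.type).__name__)  # String

# other comparisons still coerce to the JSON type itself
eq_expr = t.c.json_data == {"key": "value"}
print(type(eq_expr.right.type).__name__)  # JSONEncodedDict
```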
Above is just one approach to handling an operator like “LIKE”. Other applications may wish to raise NotImplementedError for operators that have no meaning with a JSON object such as “LIKE”, rather than automatically coercing to text.
As seen in the section Augmenting Existing Types, SQLAlchemy allows Python functions to be invoked both when parameters are sent to a statement, as well as when result rows are loaded from the database, to apply transformations to the values as they are sent to or from the database. It is also possible to define SQL-level transformations as well. The rationale here is for when only the relational database contains a particular series of functions that are necessary to coerce incoming and outgoing data between an application and persistence format. Examples include using database-defined encryption/decryption functions, as well as stored procedures that handle geographic data. The PostGIS extension to PostgreSQL includes an extensive array of SQL functions that are necessary for coercing data into particular formats.
Any TypeEngine, UserDefinedType or TypeDecorator subclass can include implementations of TypeEngine.bind_expression() and/or TypeEngine.column_expression(), which when defined to return a non-None value should return a ColumnElement expression to be injected into the SQL statement, either surrounding bound parameters or a column expression. For example, to build a Geometry type which will apply the PostGIS function ST_GeomFromText to all outgoing values and the function ST_AsText to all incoming data, we can create our own subclass of UserDefinedType which provides these methods in conjunction with func:
from sqlalchemy import func
from sqlalchemy.types import UserDefinedType


class Geometry(UserDefinedType):
    def get_col_spec(self):
        return "GEOMETRY"

    def bind_expression(self, bindvalue):
        return func.ST_GeomFromText(bindvalue, type_=self)

    def column_expression(self, col):
        return func.ST_AsText(col, type_=self)
We can apply the Geometry type into Table metadata and use it in a select() construct:
geometry = Table(
    'geometry',
    metadata,
    Column('geom_id', Integer, primary_key=True),
    Column('geom_data', Geometry),
)

print(
    select(geometry).where(
        geometry.c.geom_data == 'LINESTRING(189412 252431,189631 259122)'
    )
)
The resulting SQL embeds both functions as appropriate. ST_AsText is applied to the columns clause so that the return value is run through the function before passing into a result set, and ST_GeomFromText is run on the bound parameter so that the passed-in value is converted:
SELECT geometry.geom_id, ST_AsText(geometry.geom_data) AS geom_data_1
FROM geometry
WHERE geometry.geom_data = ST_GeomFromText(:geom_data_2)
The TypeEngine.column_expression() method interacts with the mechanics of the compiler such that the SQL expression does not interfere with the labeling of the wrapped expression. For example, if we rendered a select() against a label() of our expression, the string label is moved to the outside of the wrapped expression:
print(select(geometry.c.geom_data.label('my_data')))
Output:
SELECT ST_AsText(geometry.geom_data) AS my_data
FROM geometry
Another example is where we decorate BYTEA to provide a PGPString, which will make use of the PostgreSQL pgcrypto extension to encrypt/decrypt values transparently:
from sqlalchemy import (
    create_engine,
    String,
    select,
    func,
    MetaData,
    Table,
    Column,
    type_coerce,
    TypeDecorator,
)
from sqlalchemy.dialects.postgresql import BYTEA


class PGPString(TypeDecorator):
    impl = BYTEA

    cache_ok = True

    def __init__(self, passphrase):
        super(PGPString, self).__init__()

        self.passphrase = passphrase

    def bind_expression(self, bindvalue):
        # convert the bind's type from PGPString to
        # String, so that it's passed to psycopg2 as is without
        # a dbapi.Binary wrapper
        bindvalue = type_coerce(bindvalue, String)
        return func.pgp_sym_encrypt(bindvalue, self.passphrase)

    def column_expression(self, col):
        return func.pgp_sym_decrypt(col, self.passphrase)


metadata_obj = MetaData()
message = Table(
    "message",
    metadata_obj,
    Column("username", String(50)),
    Column("message", PGPString("this is my passphrase")),
)

engine = create_engine("postgresql://scott:tiger@localhost/test", echo=True)
with engine.begin() as conn:
    metadata_obj.create_all(conn)

    conn.execute(
        message.insert(),
        {"username": "some user", "message": "this is my message"},
    )

    print(
        conn.scalar(
            select(message.c.message).where(message.c.username == "some user")
        )
    )
The pgp_sym_encrypt and pgp_sym_decrypt functions are applied to the INSERT and SELECT statements:
INSERT INTO message (username, message)
VALUES (%(username)s, pgp_sym_encrypt(%(message)s, %(pgp_sym_encrypt_1)s))
{'username': 'some user', 'message': 'this is my message',
'pgp_sym_encrypt_1': 'this is my passphrase'}
SELECT pgp_sym_decrypt(message.message, %(pgp_sym_decrypt_1)s) AS message_1
FROM message
WHERE message.username = %(username_1)s
{'pgp_sym_decrypt_1': 'this is my passphrase', 'username_1': 'some user'}
SQLAlchemy Core defines a fixed set of expression operators available to all column expressions. Some of these operations have the effect of overloading Python's built-in operators; examples of such operators include ColumnOperators.__eq__() (table.c.somecolumn == 'foo'), ColumnOperators.__invert__() (~table.c.flag), and ColumnOperators.__add__() (table.c.x + table.c.y). Other operators are exposed as explicit methods on column expressions, such as ColumnOperators.in_() (table.c.value.in_(['x', 'y'])) and ColumnOperators.like() (table.c.value.like('%ed%')).
When the need arises for a SQL operator that isn't directly supported by the already supplied methods above, the most expedient way to produce this operator is to use the Operators.op() method on any SQL expression object; this method is given a string representing the SQL operator to render, and the return value is a Python callable that accepts any arbitrary right-hand side expression:
>>> from sqlalchemy import column
>>> expr = column('x').op('>>')(column('y'))
>>> print(expr)
x >> y
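The callable returned by Operators.op() can be used anywhere a SQL expression is accepted. As a minimal sketch (using hypothetical table and column names, rendered with the default compiler), a custom operator may be applied inside a WHERE clause:

```python
from sqlalchemy import column, select, table

# hypothetical table with two columns
t = table("t", column("x"), column("y"))

# apply a custom "&" operator inside a WHERE clause
stmt = select(t).where(t.c.x.op("&")(t.c.y) != 0)
print(stmt)
```

The custom operator participates in the larger expression just like any built-in operator, including parenthesization according to operator precedence.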
When making use of custom SQL types, there is also a means of implementing custom operators as above that are automatically present upon any column expression that makes use of that column type, without the need to directly call Operators.op() each time the operator is to be used. To achieve this, a SQL expression construct consults the TypeEngine object associated with the construct in order to determine the behavior of the built-in operators as well as to look for new methods that may have been invoked.
TypeEngine defines a "comparison" object implemented by the Comparator class to provide the base behavior for SQL operators, and many specific types provide their own sub-implementations of this class. User-defined Comparator implementations can be built directly into a simple subclass of a particular type in order to override or define new operations. Below, we create an Integer subclass which overrides the ColumnOperators.__add__() operator, which in turn uses Operators.op() to produce the custom SQL itself:
from sqlalchemy import Integer

class MyInt(Integer):
    class comparator_factory(Integer.Comparator):
        def __add__(self, other):
            return self.op("goofy")(other)
The above configuration creates a new class MyInt, which establishes the TypeEngine.comparator_factory attribute as referring to a new class, subclassing the Comparator class associated with the Integer type.
Usage:
>>> sometable = Table("sometable", metadata, Column("data", MyInt))
>>> print(sometable.c.data + 5)
sometable.data goofy :data_1
The implementation for ColumnOperators.__add__() is consulted by an owning SQL expression, by instantiating the Comparator with itself as the expr attribute. This attribute may be used when the implementation needs to refer to the originating ColumnElement object directly:
from sqlalchemy import Integer, func

class MyInt(Integer):
    class comparator_factory(Integer.Comparator):
        def __add__(self, other):
            return func.special_addition(self.expr, other)
New methods added to a Comparator are exposed on an owning SQL expression object using a dynamic lookup scheme, which exposes methods added to Comparator onto the owning ColumnElement expression construct. For example, to add a log() function to integers:
from sqlalchemy import Integer, func

class MyInt(Integer):
    class comparator_factory(Integer.Comparator):
        def log(self, other):
            return func.log(self.expr, other)
Using the above type:
>>> print(sometable.c.data.log(5))
log(:log_1, :log_2)
When using Operators.op() for comparison operations that return a boolean result, the Operators.op.is_comparison flag should be set to True:
from sqlalchemy import Integer

class MyInt(Integer):
    class comparator_factory(Integer.Comparator):
        def is_frobnozzled(self, other):
            return self.op("--is_frobnozzled->", is_comparison=True)(other)
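A brief usage sketch (re-stating the class with its imports so the example is self-contained); the rendered bound-parameter name follows the same pattern as the earlier examples:

```python
from sqlalchemy import Integer, column

class MyInt(Integer):
    class comparator_factory(Integer.Comparator):
        def is_frobnozzled(self, other):
            return self.op("--is_frobnozzled->", is_comparison=True)(other)

# the new method is available on any expression of this type
expr = column("data", MyInt()).is_frobnozzled(5)
print(expr)  # data --is_frobnozzled-> :data_1
```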
Unary operations are also possible. For example, to add an implementation of the PostgreSQL factorial operator, we combine the UnaryExpression construct along with a custom_op to produce the factorial expression:
from sqlalchemy import Integer
from sqlalchemy.sql.expression import UnaryExpression
from sqlalchemy.sql import operators

class MyInteger(Integer):
    class comparator_factory(Integer.Comparator):
        def factorial(self):
            return UnaryExpression(self.expr,
                modifier=operators.custom_op("!"),
                type_=MyInteger)
Using the above type:
>>> from sqlalchemy.sql import column
>>> print(column('x', MyInteger).factorial())
x !
The UserDefinedType class is provided as a simple base class for defining entirely new database types. Use this to represent native database types not known by SQLAlchemy. If only Python translation behavior is needed, use TypeDecorator instead.
Object Name | Description
---|---
UserDefinedType | Base for user defined types.

class sqlalchemy.types.UserDefinedType

Base for user defined types.
This should be the base of new types. Note that for most cases, TypeDecorator is probably more appropriate:

import sqlalchemy.types as types

class MyType(types.UserDefinedType):
    cache_ok = True

    def __init__(self, precision=8):
        self.precision = precision

    def get_col_spec(self, **kw):
        return "MYTYPE(%s)" % self.precision

    def bind_processor(self, dialect):
        def process(value):
            return value
        return process

    def result_processor(self, dialect, coltype):
        def process(value):
            return value
        return process
Once the type is made, it’s immediately usable:
table = Table('foo', metadata_obj,
    Column('id', Integer, primary_key=True),
    Column('data', MyType(16))
)
The get_col_spec() method will in most cases receive a keyword argument type_expression which refers to the owning expression of the type as being compiled, such as a Column or cast() construct. This keyword is only sent if the method accepts keyword arguments (e.g. **kw) in its argument signature; introspection is used to check for this in order to support legacy forms of this function.
New in version 1.0.0: the owning expression is passed to the get_col_spec() method via the keyword argument type_expression, if it receives **kw in its signature.
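As a sketch of how this keyword might be used (the DDL strings and the primary-key rule here are illustrative assumptions), a type can inspect the owning expression when rendering its column specification:

```python
import sqlalchemy.types as types
from sqlalchemy import Column

class MyType(types.UserDefinedType):
    cache_ok = True

    def get_col_spec(self, **kw):
        # because **kw is accepted, type_expression is passed in;
        # it may be a Column, a cast() construct, or absent
        expr = kw.get("type_expression")
        if isinstance(expr, Column) and expr.primary_key:
            # hypothetical: render a narrower form for primary key columns
            return "MYTYPE(16)"
        return "MYTYPE(32)"
```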
The UserDefinedType.cache_ok class-level flag indicates if this custom UserDefinedType is safe to be used as part of a cache key. This flag defaults to None which will initially generate a warning when the SQL compiler attempts to generate a cache key for a statement that uses this type. If the UserDefinedType is not guaranteed to produce the same bind/result behavior and SQL generation every time, this flag should be set to False; otherwise if the class produces the same behavior each time, it may be set to True. See UserDefinedType.cache_ok for further notes on how this works.
New in version 1.4.28: Generalized the ExternalType.cache_ok flag so that it is available for both TypeDecorator as well as UserDefinedType.
Class signature

class sqlalchemy.types.UserDefinedType (sqlalchemy.types.ExternalType, sqlalchemy.types.TypeEngine)
sqlalchemy.types.UserDefinedType.cache_ok = None

inherited from the ExternalType.cache_ok attribute of ExternalType

Indicate if statements using this ExternalType are "safe to cache".
The default value None will emit a warning and then not allow caching of a statement which includes this type. Set to False to disable statements using this type from being cached at all without a warning. When set to True, the object's class and selected elements from its state will be used as part of the cache key. For example, using a TypeDecorator:
class MyType(TypeDecorator):
    impl = String

    cache_ok = True

    def __init__(self, choices):
        self.choices = tuple(choices)
        self.internal_only = True
The cache key for the above type would be equivalent to:
>>> MyType(["a", "b", "c"])._static_cache_key
(<class '__main__.MyType'>, ('choices', ('a', 'b', 'c')))
The caching scheme will extract attributes from the type that correspond to the names of parameters in the __init__() method. Above, the "choices" attribute becomes part of the cache key but "internal_only" does not, because there is no parameter named "internal_only".
The requirements for cacheable elements are that they are hashable and that they indicate the same SQL rendered for expressions using this type, every time for a given cache value.
To accommodate for datatypes that refer to unhashable structures such as dictionaries, sets and lists, these objects can be made “cacheable” by assigning hashable structures to the attributes whose names correspond with the names of the arguments. For example, a datatype which accepts a dictionary of lookup values may publish this as a sorted series of tuples. Given a previously un-cacheable type as:
class LookupType(UserDefinedType):
    '''a custom type that accepts a dictionary as a parameter.

    this is the non-cacheable version, as "self.lookup" is not
    hashable.

    '''

    def __init__(self, lookup):
        self.lookup = lookup

    def get_col_spec(self, **kw):
        return "VARCHAR(255)"

    def bind_processor(self, dialect):
        # ...  works with "self.lookup" ...
Where “lookup” is a dictionary. The type will not be able to generate a cache key:
>>> type_ = LookupType({"a": 10, "b": 20})
>>> type_._static_cache_key
<stdin>:1: SAWarning: UserDefinedType LookupType({'a': 10, 'b': 20}) will not
produce a cache key because the ``cache_ok`` flag is not set to True.
Set this flag to True if this type object's state is safe to use
in a cache key, or False to disable this warning.
symbol('no_cache')
If we did set up such a cache key, it wouldn’t be usable. We would get a tuple structure that contains a dictionary inside of it, which cannot itself be used as a key in a “cache dictionary” such as SQLAlchemy’s statement cache, since Python dictionaries aren’t hashable:
>>> # set cache_ok = True
>>> type_.cache_ok = True
>>> # this is the cache key it would generate
>>> key = type_._static_cache_key
>>> key
(<class '__main__.LookupType'>, ('lookup', {'a': 10, 'b': 20}))
>>> # however this key is not hashable, will fail when used with
>>> # SQLAlchemy statement cache
>>> some_cache = {key: "some sql value"}
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: unhashable type: 'dict'
The type may be made cacheable by assigning a sorted tuple of tuples to the “.lookup” attribute:
class LookupType(UserDefinedType):
    '''a custom type that accepts a dictionary as a parameter.

    The dictionary is stored both as itself in a private variable,
    and published in a public variable as a sorted tuple of tuples,
    which is hashable and will also return the same value for any
    two equivalent dictionaries.  Note it assumes the keys and
    values of the dictionary are themselves hashable.

    '''

    cache_ok = True

    def __init__(self, lookup):
        self._lookup = lookup

        # assume keys/values of "lookup" are hashable; otherwise
        # they would also need to be converted in some way here
        self.lookup = tuple(
            (key, lookup[key]) for key in sorted(lookup)
        )

    def get_col_spec(self, **kw):
        return "VARCHAR(255)"

    def bind_processor(self, dialect):
        # ...  works with "self._lookup" ...
Where above, the cache key for LookupType({"a": 10, "b": 20})
will be:
>>> LookupType({"a": 10, "b": 20})._static_cache_key
(<class '__main__.LookupType'>, ('lookup', (('a', 10), ('b', 20))))
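To illustrate the effect (a sketch that re-states the cacheable class and uses the same private _static_cache_key attribute shown in the doctests above), two equivalent dictionaries now produce identical, hashable cache keys regardless of insertion order:

```python
from sqlalchemy.types import UserDefinedType

class LookupType(UserDefinedType):
    cache_ok = True

    def __init__(self, lookup):
        self._lookup = lookup
        # publish the dictionary as a hashable, order-independent
        # sorted tuple of tuples
        self.lookup = tuple(
            (key, lookup[key]) for key in sorted(lookup)
        )

    def get_col_spec(self, **kw):
        return "VARCHAR(255)"

key_one = LookupType({"a": 10, "b": 20})._static_cache_key
key_two = LookupType({"b": 20, "a": 10})._static_cache_key

# equivalent dictionaries yield the same key, usable in a dict
assert key_one == key_two
cache = {key_one: "some sql value"}
```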
New in version 1.4.14: added the cache_ok flag to allow some configurability of caching for TypeDecorator classes.
New in version 1.4.28: added the ExternalType mixin which generalizes the cache_ok flag to both the TypeDecorator and UserDefinedType classes.
sqlalchemy.types.UserDefinedType.coerce_compared_value(op, value)

Suggest a type for a 'coerced' Python value in an expression.

Default behavior for UserDefinedType is the same as that of TypeDecorator; by default it returns self, assuming the compared value should be coerced into the same type as this one. See TypeDecorator.coerce_compared_value() for more detail.
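A sketch of overriding this hook (the type name and coercion rules are illustrative assumptions, modeled on the kind of behavior TypeDecorator.coerce_compared_value() describes): a type representing an indexable structure might coerce integer and string comparison values differently:

```python
import sqlalchemy.types as types
from sqlalchemy import Integer, String

class MyIndexableType(types.UserDefinedType):
    cache_ok = True

    def get_col_spec(self, **kw):
        return "MYINDEXABLE"

    def coerce_compared_value(self, op, value):
        # hypothetical rules: integer comparison values are treated
        # as Integer, strings as String, everything else as this type
        if isinstance(value, int):
            return Integer()
        elif isinstance(value, str):
            return String()
        else:
            return self
```

With rules like these, an expression such as mycol == 5 would treat the right-hand value as an Integer bind parameter rather than attempting to process it with this type's own bind handling.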
It is important to note that database types which are modified to have additional in-Python behaviors, including types based on TypeDecorator as well as other user-defined subclasses of datatypes, do not have any representation within a database schema. When using the database introspection features described at Reflecting Database Objects, SQLAlchemy makes use of a fixed mapping which links the datatype information reported by a database server to a SQLAlchemy datatype object. For example, if we look inside of a PostgreSQL schema at the definition for a particular database column, we might receive back the string "VARCHAR". SQLAlchemy's PostgreSQL dialect has a hardcoded mapping which links the string name "VARCHAR" to the SQLAlchemy VARCHAR class, and that's how when we emit a statement like Table('my_table', m, autoload_with=engine), the Column object within it would have an instance of VARCHAR present inside of it.
The implication of this is that if a Table object makes use of type objects that don't correspond directly to the database-native type name, then when we create a new Table object against a new MetaData collection for this database table elsewhere using reflection, it will not have this datatype. For example:
>>> from sqlalchemy import Table, Column, MetaData, create_engine, PickleType, Integer
>>> metadata = MetaData()
>>> my_table = Table("my_table", metadata, Column('id', Integer), Column("data", PickleType))
>>> engine = create_engine("sqlite://", echo='debug')
>>> my_table.create(engine)
INFO sqlalchemy.engine.base.Engine
CREATE TABLE my_table (
id INTEGER,
data BLOB
)
Above, we made use of PickleType, which is a TypeDecorator that works on top of the LargeBinary datatype, which on SQLite corresponds to the database type BLOB. In the CREATE TABLE, we see that the BLOB datatype is used. The SQLite database knows nothing about the PickleType we've used.

If we look at the datatype of my_table.c.data.type, as this is a Python object that was created by us directly, it is PickleType:
>>> my_table.c.data.type
PickleType()
However, if we create another instance of Table using reflection, the use of PickleType is not represented in the SQLite database we've created; we instead get back BLOB:
>>> metadata_two = MetaData()
>>> my_reflected_table = Table("my_table", metadata_two, autoload_with=engine)
INFO sqlalchemy.engine.base.Engine PRAGMA main.table_info("my_table")
INFO sqlalchemy.engine.base.Engine ()
DEBUG sqlalchemy.engine.base.Engine Col ('cid', 'name', 'type', 'notnull', 'dflt_value', 'pk')
DEBUG sqlalchemy.engine.base.Engine Row (0, 'id', 'INTEGER', 0, None, 0)
DEBUG sqlalchemy.engine.base.Engine Row (1, 'data', 'BLOB', 0, None, 0)
>>> my_reflected_table.c.data.type
BLOB()
Typically, when an application defines explicit Table metadata with custom types, there is no need to use table reflection because the necessary Table metadata is already present. However, additional steps must be taken for the case where an application, or a combination of applications, needs to make use of both explicit Table metadata which includes custom, Python-level datatypes, as well as Table objects whose Column objects are reflected from the database but which nevertheless still need to exhibit the additional Python behaviors of the custom datatypes.
The most straightforward is to override specific columns as described at Overriding Reflected Columns. In this technique, we simply use reflection in combination with explicit Column objects for those columns for which we want to use a custom or decorated datatype:
>>> metadata_three = MetaData()
>>> my_reflected_table = Table("my_table", metadata_three, Column("data", PickleType), autoload_with=engine)
The my_reflected_table object above is reflected, and will load the definition of the "id" column from the SQLite database. But for the "data" column, we've overridden the reflected object with an explicit Column definition that includes our desired in-Python datatype, the PickleType. The reflection process will leave this Column object intact:
>>> my_reflected_table.c.data.type
PickleType()
A more elaborate way to convert from database-native type objects to custom datatypes is to use the DDLEvents.column_reflect() event handler. If for example we knew that we wanted all BLOB datatypes to in fact be PickleType, we could set up a rule across the board:
from sqlalchemy import BLOB
from sqlalchemy import event
from sqlalchemy import PickleType
from sqlalchemy import Table
@event.listens_for(Table, "column_reflect")
def _setup_pickletype(inspector, table, column_info):
    if isinstance(column_info["type"], BLOB):
        column_info["type"] = PickleType()
When the above code is invoked before any table reflection occurs (note also it should be invoked only once in the application, as it is a global rule), upon reflecting any Table that includes a column with a BLOB datatype, the resulting datatype will be stored in the Column object as PickleType.
In practice, the above event-based approach would likely have additional rules in order to affect only those columns where the datatype is important, such as a lookup table of table names and possibly column names, or other heuristics, in order to accurately determine which columns should be established with an in-Python datatype.