A Table object can be instructed to load information about itself from the corresponding database schema object already existing within the database. This process is called reflection. In the simplest case you need only specify the table name, a MetaData object, and the autoload=True flag. If the MetaData is not persistently bound, also add the autoload_with argument:
>>> messages = Table('messages', meta, autoload=True, autoload_with=engine)
>>> [c.name for c in messages.columns]
['message_id', 'message_name', 'date']
The above operation will use the given engine to query the database for information about the messages table, and will then generate Column, ForeignKey, and other objects corresponding to this information as though the Table object were hand-constructed in Python.
When tables are reflected, if a given table references another one via foreign key, a second Table object is created within the MetaData object to represent the referenced table. Below, assume the table shopping_cart_items references a table named shopping_carts. Reflecting the shopping_cart_items table also causes the shopping_carts table to be loaded:
>>> shopping_cart_items = Table('shopping_cart_items', meta, autoload=True, autoload_with=engine)
>>> 'shopping_carts' in meta.tables
True
The MetaData object has an interesting “singleton-like” behavior, such that if you requested both tables individually, MetaData will ensure that exactly one Table object is created for each distinct table name. The Table constructor returns the already-existing Table object if one already exists with the given name. For example, below we can access the already generated shopping_carts table just by naming it:
shopping_carts = Table('shopping_carts', meta)
Of course, it’s a good idea to use autoload=True with the above table regardless, so that the table’s attributes will be loaded if they have not been already. The autoload operation only occurs if the table hasn’t already been loaded; once loaded, new calls to Table with the same name will not re-issue any reflection queries.
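As a quick illustration of this behavior (reusing the meta and engine objects from the examples above), requesting the table again simply hands back the object that reflection already placed in the MetaData:
shopping_carts = Table('shopping_carts', meta, autoload=True, autoload_with=engine)
# same object as before; no new reflection queries are issued for an already-loaded table
assert shopping_carts is meta.tables['shopping_carts']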
Individual columns can be overridden with explicit values when reflecting tables; this is handy for specifying custom datatypes, constraints such as primary keys that may not be configured within the database, etc.:
>>> mytable = Table('mytable', meta,
...     Column('id', Integer, primary_key=True),   # override reflected 'id' to have primary key
...     Column('mydata', Unicode(50)),              # override reflected 'mydata' to be Unicode
...     autoload=True)
The reflection system can also reflect views. Basic usage is the same as that of a table:
my_view = Table("some_view", metadata, autoload=True)
Above, my_view is a Table object with Column objects representing the names and types of each column within the view “some_view”.
It’s usually desirable to have at least a primary key constraint when reflecting a view, if not foreign keys as well; view reflection doesn’t extrapolate these constraints.
Use the “override” technique for this, specifying explicitly those columns which are part of the primary key or have foreign key constraints:
my_view = Table("some_view", metadata,
Column("view_id", Integer, primary_key=True),
Column("related_thing", Integer, ForeignKey("othertable.thing_id")),
autoload=True
)
The MetaData object can also get a listing of tables and reflect the full set. This is achieved by using the reflect() method. After calling it, all located tables are present within the MetaData object’s dictionary of tables:
meta = MetaData()
meta.reflect(bind=someengine)
users_table = meta.tables['users']
addresses_table = meta.tables['addresses']
MetaData.reflect() also provides a handy way to delete all rows in a database; iterating over sorted_tables in reverse order ensures that rows in dependent (child) tables are deleted before the rows they reference:
meta = MetaData()
meta.reflect(bind=someengine)
for table in reversed(meta.sorted_tables):
    someengine.execute(table.delete())
A low level interface which provides a backend-agnostic system of loading lists of schema, table, column, and constraint descriptions from a given database is also available. This is known as the “Inspector”:
from sqlalchemy import create_engine
from sqlalchemy.engine import reflection
engine = create_engine('...')
insp = reflection.Inspector.from_engine(engine)
print(insp.get_table_names())
The Inspector class performs database schema inspection. The Inspector acts as a proxy to the reflection methods of the Dialect, providing a consistent interface as well as caching support for previously fetched metadata.
An Inspector object is usually created via the inspect() function:
from sqlalchemy import inspect, create_engine
engine = create_engine('...')
insp = inspect(engine)
The inspection method above is equivalent to using the Inspector.from_engine() method, i.e.:
from sqlalchemy.engine.reflection import Inspector

engine = create_engine('...')
insp = Inspector.from_engine(engine)
Where above, the Dialect may opt to return an Inspector subclass that provides additional methods specific to the dialect’s target database.
Constructing an Inspector directly initializes a new Inspector. Its single parameter, bind, is a Connectable, which is typically an instance of Engine or Connection. For a dialect-specific instance of Inspector, see Inspector.from_engine().
Inspector.default_schema_name returns the default schema name presented by the dialect for the current engine’s database user. For example, this is typically public for PostgreSQL and dbo for SQL Server.
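For instance, with the insp object constructed earlier (the value returned depends entirely on the backend and the connected user):
print(insp.default_schema_name)  # e.g. 'public' on PostgreSQL, 'dbo' on SQL Server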
Inspector.from_engine() constructs a new dialect-specific Inspector object from the given engine or connection. Its bind parameter is a Connectable, which is typically an instance of Engine or Connection. This method differs from a direct constructor call of Inspector in that the Dialect is given a chance to provide a dialect-specific Inspector instance, which may provide additional methods. See the example at Inspector.
Inspector.get_columns() returns information about the columns in table_name. Given a string table_name and an optional string schema, it returns column information as a list of dicts, one per column, with keys such as name, type, nullable, and default.
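A minimal sketch of typical usage, assuming the insp object from above and a users table such as the one reflected earlier:
for col in insp.get_columns('users'):
    # each entry is a plain dict describing one column
    print(col['name'], col['type'])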
Inspector.get_foreign_keys() returns information about the foreign keys in table_name. Given a string table_name and an optional string schema, it returns foreign key information as a list of dicts, one per constraint, with keys such as constrained_columns, referred_table, and referred_columns.
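A similar sketch, here against the shopping_cart_items table used earlier; each dict names the constrained columns and the table and columns they refer to:
for fk in insp.get_foreign_keys('shopping_cart_items'):
    # columns on this table -> the referred table and its columns
    print(fk['constrained_columns'], '->', fk['referred_table'], fk['referred_columns'])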
Inspector.get_indexes() returns information about the indexes in table_name. Given a string table_name and an optional string schema, it returns index information as a list of dicts, one per index, with keys such as name, column_names, and unique.
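For example, again assuming a users table exists in the target database:
for ix in insp.get_indexes('users'):
    # index name, the columns it covers, and whether it is a unique index
    print(ix['name'], ix['column_names'], ix['unique'])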
Inspector.get_pk_constraint() returns information about the primary key constraint on table_name. Given a string table_name and an optional string schema, it returns primary key information as a dictionary with keys such as constrained_columns and name.
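A brief sketch, assuming the same users table:
pk = insp.get_pk_constraint('users')
print(pk['constrained_columns'])  # list of the primary key's column names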
Inspector.get_primary_keys() returns information about the primary keys in table_name. It is deprecated since version 0.7; use get_pk_constraint() instead. Given a string table_name and an optional string schema, it returns primary key information as a list of column names.
Inspector.get_schema_names() returns all schema names.
Inspector.get_table_names() returns all table names within a particular schema. The names are expected to be real tables only, not views; views are instead returned using the Inspector.get_view_names() method.
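A short sketch combining the two listing methods above, using the insp object from earlier:
for schema in insp.get_schema_names():
    # list the tables visible in each schema
    print(schema, insp.get_table_names(schema=schema))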
Inspector.get_table_options() returns a dictionary of options specified when the table of the given name was created. This currently includes some options that apply to MySQL tables.
Inspector.get_unique_constraints() returns information about the unique constraints in table_name. Given a string table_name and an optional string schema, it returns unique constraint information as a list of dicts, one per constraint, with keys such as name and column_names.
New in version 0.8.4.
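As a rough sketch, again assuming a users table in the target database:
for uc in insp.get_unique_constraints('users'):
    # constraint name and the columns it covers
    print(uc['name'], uc['column_names'])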
Inspector.get_view_definition() returns the definition for view_name. The optional schema parameter retrieves from a non-default schema; for special quoting, use quoted_name.
Inspector.get_view_names() returns all view names in schema. The optional schema parameter retrieves names from a non-default schema; for special quoting, use quoted_name.
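A small sketch tying the two view methods together, assuming the target database actually contains views:
for view_name in insp.get_view_names():
    # the definition comes back as a string of SQL; its exact form is backend-dependent
    print(view_name)
    print(insp.get_view_definition(view_name))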
Inspector.reflecttable() is given a Table object and loads its internal constructs based on introspection. This is the underlying method used by most dialects to produce table reflection. Direct usage looks like:
from sqlalchemy import create_engine, MetaData, Table
from sqlalchemy.engine import reflection
engine = create_engine('...')
meta = MetaData()
user_table = Table('user', meta)
insp = reflection.Inspector.from_engine(engine)
insp.reflecttable(user_table, None)
It’s important to note that the reflection process recreates Table metadata using only information which is represented in the relational database. This process by definition cannot restore aspects of a schema that aren’t actually stored in the database; state that is not available from reflection includes, for example, client-side column defaults and other configuration that exists only in the application’s Python code.
The relational database also in many cases reports on table metadata in a different format than what was specified in SQLAlchemy. The Table objects returned from reflection cannot always be relied upon to produce DDL identical to that of the original Python-defined Table objects. Areas where this occurs include server defaults, column-associated sequences, and various idiosyncrasies regarding constraints and datatypes. Server-side defaults may be returned with cast directives (typically PostgreSQL will include a ::<type> cast) or different quoting patterns than originally specified.
Another category of limitation includes schema structures for which reflection is only partially or not yet defined. Recent improvements to reflection allow things like views, indexes and foreign key options to be reflected. As of this writing, structures like CHECK constraints, table comments, and triggers are not reflected.