Auditing extension for Flask using Flask-SQLAlchemy, Alembic, and PostgreSQL. This package tracks changes to database records within specified tables. The current user affecting the target table will also be recorded.
- Stores versioned records with old and changed data in an `activity` table
- Uses database triggers to record activity records, keeping INSERTs, UPDATEs, and DELETEs as fast as possible
- Uses SQLAlchemy events to track actor IDs in a `transaction` table
- Tables and triggers can be easily configured using Alembic migrations
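To make that concrete, a single activity row conceptually captures a before/after snapshot of the changed record. The column names below are illustrative assumptions (they follow the upstream PostgreSQL-Audit schema this project was forked from); check your generated `activity` table for the actual layout.

```python
# Illustrative sketch only: the rough shape of one activity record for an
# UPDATE to a users row. Column names are assumed from upstream
# PostgreSQL-Audit and may differ in this fork.
example_activity = {
    "table_name": "users",
    "verb": "update",
    "old_data": {"id": 1, "name": "Alice"},  # row before the change
    "changed_data": {"name": "Alicia"},      # only the columns that changed
    "transaction_id": 7,                     # links to the actor who made the change
}

print(example_activity["changed_data"])  # {'name': 'Alicia'}
```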
This project was forked from PostgreSQL-Audit, but does not attempt to maintain backwards compatibility. It draws inspiration from other projects such as SQLAlchemy-Continuum, Papertrail, and Audit Trigger.
```shell
pip install flask-audit-logger
```
Create a single `AuditLogger()` instance after your models have been declared, e.g. in `app/models.py`:

```python
from sqlalchemy import BigInteger
from sqlalchemy.orm import Mapped, mapped_column
from flask_sqlalchemy import SQLAlchemy

from flask_audit_logger import AuditLogger

db = SQLAlchemy()

class User(db.Model):
    __tablename__ = "users"
    __table_args__ = ({"info": {"audit_logged": True}},)

    id: Mapped[int] = mapped_column(BigInteger, primary_key=True, autoincrement=True)
    name: Mapped[str]

# ... more model declarations ...

audit_logger = AuditLogger(db)
```

Identify the tables you want audited by adding `{"info": {"audit_logged": True}}` to a model's `__table_args__`.
The `audit_logged` key determines which tables get database triggers.
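Because the flag lives in SQLAlchemy's standard `Table.info` dict, you can inspect which tables are opted in with plain SQLAlchemy. A minimal sketch, using bare `Table` objects rather than Flask-SQLAlchemy models purely for illustration:

```python
from sqlalchemy import BigInteger, Column, MetaData, Table

metadata = MetaData()

# Same `info` convention as the model above, on a plain Table
users = Table(
    "users",
    metadata,
    Column("id", BigInteger, primary_key=True),
    info={"audit_logged": True},
)
posts = Table("posts", metadata, Column("id", BigInteger, primary_key=True))

# Every table whose info dict opts in to audit logging
audited = [t.name for t in metadata.tables.values() if t.info.get("audit_logged")]
print(audited)  # ['users']
```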
Finally, run the migration, which will create the audit tables, functions, and triggers. Here I'm using Flask-Migrate, but you can use Alembic directly if you wish.

```shell
flask db migrate -m 'setup audit_logger'
flask db upgrade
```
If you need an audit trail for another table in the future, add `{"info": {"audit_logged": True}}` to that model's `__table_args__` tuple. When you generate the next migration, the newly tracked table will be detected and the correct triggers will be created.
```python
audit_logger = AuditLogger(
    db,
    get_actor_id=...,  # callback to get the current user's ID; defaults to using flask_login.current_user
    schema=...,        # schema for the activity and transaction tables; defaults to "public"
    actor_cls=...,     # actor model name as a string; required if it is not "User"
)
```

By default, `transaction.actor_id` will be set using `flask_login.current_user`.
Use the `get_actor_id` constructor option if you want to change this behavior:
```python
from flask import g

def get_current_user_id():
    return g.my_current_user.id

audit_logger = AuditLogger(db, get_actor_id=get_current_user_id)
```

The activity and transaction tables are created in the `public` schema by default.
If you want these tables to live in a different schema, pass in the schema name when you instantiate the AuditLogger.
```python
audit_logger = AuditLogger(db, schema="audit_logs")
```

You will also need to make sure Alembic supports multiple schemas. This can be done through an `env.py` configuration.
```python
def run_migrations_online():
    # ...
    context.configure(
        # ...
        include_schemas=True,  # required for Alembic to manage more than the "public" schema
    )
```

The `AuditLogger.actor_cls` should align with your `current_user` type.
By default, this package assumes the User model is also the actor class.
This can be customized by passing in the model name as a string when the AuditLogger is created.
```python
audit_logger = AuditLogger(db, actor_cls="SuperUser")
```

This model will be used to populate the `AuditLogTransaction.actor` relationship.
For example, the following query loads the first activity and its responsible actor.
```python
from sqlalchemy import select
from sqlalchemy.orm import joinedload

AuditLogActivity = audit_logger.activity_cls
AuditLogTransaction = audit_logger.transaction_cls

activity = db.session.scalar(
    select(AuditLogActivity)
    .options(joinedload(AuditLogActivity.transaction).joinedload(AuditLogTransaction.actor))
    .limit(1)
)
print(activity.transaction.actor)
# <SuperUser 123456>
```

You may want to ignore version tracking on specific database columns.
This can be done by adding an `"exclude"` list of column names to `__table_args__`. In this case, replace `{"audit_logged": True}` with your configuration dict.
```python
# app/models.py
class User(db.Model):
    __tablename__ = "users"
    __table_args__ = ({"info": {"audit_logged": {"exclude": ["hair_color"]}}},)

    id: Mapped[int] = mapped_column(BigInteger, primary_key=True, autoincrement=True)
    name: Mapped[str]
    hair_color: Mapped[str]
```
```shell
flask db migrate -m 'exclude hair_color'
```

```python
# migrations/versions/xxxx_exclude_hair_color.py
def upgrade_():
    # ### commands auto generated by Alembic - please adjust! ###
    op.init_audit_logger_triggers("users", excluded_columns=["hair_color"])
    # ### end Alembic commands ###

def downgrade_():
    # ### commands auto generated by Alembic - please adjust! ###
    op.remove_audit_logger_triggers("users")
    # ### end Alembic commands ###
```

- This package does not play nicely with Alembic Utils
- Changes to `excluded_columns` are not remembered. You will need to edit `downgrade_()` manually to properly revert changes
- The test suite must be run with multiple `pytest` commands. Because Alembic, Flask-SQLAlchemy, etc. only work with one `Flask()` instance, the different `AuditLogger` configurations require separate test apps.
Create a virtualenv with the python version specified in `.tool-versions`:

```shell
asdf install
uv sync --dev
```

Next, create a `.envrc` with test database credentials, then run the tests.

```shell
direnv edit .
> export FLASK_AUDIT_LOGGER_TEST_USER=garrett # use whatever postgres user you prefer
> export FLASK_AUDIT_LOGGER_TEST_DB=flask_audit_logger_test

createdb $FLASK_AUDIT_LOGGER_TEST_DB

# Run a subset of the test suite using default settings
pytest tests/defaults

# Run the entire test suite, including linting checks
tox
```

Note that multiple folders exist in the test module.
Each tests a slightly different flask_app configuration.
Use the `defaults` folder for generic tests. Add tests to `custom_actor` when you want to use a separate User model to track changes. Add tests to `separate_schema` when you want tables to exist in a non-public schema.