6.2.0
New features
- Add new `safir.uws` and `safir.arq.uws` modules that provide the framework of an IVOA Universal Worker Service implementation.
- Add new `safir.testing.uws` module that provides a mock UWS job runner for testing UWS applications.
- `safir.arq` is now available as a separate PyPI package, `safir-arq`, so that it can be installed in environments where the full Safir dependency may be too heavyweight or conflict with other packages. The `safir[arq]` dependency will continue to work as before (by installing `safir-arq` behind the scenes).
- Add new `abort_job` method to instances of `safir.arq.ArqQueue`, which tells arq to abort a job that has been queued or is in progress. To successfully abort a job that has already started, the arq worker must enable support for aborting jobs. (Example after this list.)
- Add new utility function `safir.arq.build_arq_redis_settings`, which constructs the `RedisSettings` object used to create an arq Redis queue from a Pydantic Redis DSN.
- Add new `safir.arq.WorkerSettings` class that models the acceptable parameters for an arq `WorkerSettings` object or class that Safir applications have needed. (An example combining this with `build_arq_redis_settings` appears after this list.)
- Add new `safir.pydantic.SecondsTimedelta` and `safir.pydantic.HumanTimedelta` types for use in Pydantic models. These behave the same as `datetime.timedelta` fields but use custom validation. Both support a stringified number of seconds as input, and the latter also supports the interval strings parsed by `safir.datetime.parse_timedelta`. (Example after this list.)
- Add new types `safir.pydantic.EnvAsyncPostgresDsn` and `safir.pydantic.EnvRedisDsn`, which validate PostgreSQL and Redis DSNs but rewrite them based on the environment variables set by tox-docker. Programs using these types for their configuration will therefore automatically honor tox-docker environment variables when running the test suite. `EnvAsyncPostgresDsn` also enforces that the scheme of the DSN is compatible with asyncpg and the Safir database support. (Example after this list.)
- Add the decorator `safir.database.retry_async_transaction`, which retries a function or method a configurable number of times if it raises a SQLAlchemy exception from the underlying database API. This is primarily intended to retry database operations aborted due to transaction isolation. (Example after this list.)
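A minimal sketch of the new `abort_job` method. Only the method name and the `safir.arq.ArqQueue` base class come from the entry above; the boolean return value is an assumption.

```python
from safir.arq import ArqQueue


async def cancel_job(queue: ArqQueue, job_id: str) -> bool:
    """Ask arq to abort a queued or running job.

    For a job that has already started, the arq worker must be running
    with arq's support for aborting jobs enabled.
    """
    return await queue.abort_job(job_id)
```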
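The two arq entries fit together: a hedged sketch of building arq Redis settings from a Pydantic DSN and referencing them from a worker configuration. The second (password) argument to `build_arq_redis_settings` and the `WorkerSettings` field names used here (`functions`, `redis_settings`, `queue_name`) are assumptions, not taken from the entries above.

```python
from pydantic import RedisDsn, TypeAdapter

from safir.arq import WorkerSettings, build_arq_redis_settings

# Validate a Redis DSN the way an application configuration normally would.
redis_dsn = TypeAdapter(RedisDsn).validate_python("redis://localhost:6379/0")

# Turn the DSN into the RedisSettings object arq expects. The second
# (password) argument is an assumption about the signature.
redis_settings = build_arq_redis_settings(redis_dsn, None)


async def hello(ctx: dict) -> str:
    """Trivial arq worker function used only for illustration."""
    return "hello"


# Field names here are assumptions about safir.arq.WorkerSettings.
worker_settings = WorkerSettings(
    functions=[hello],
    redis_settings=redis_settings,
    queue_name="arq:queue",
)
```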
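A minimal sketch of the new timedelta types in a Pydantic model. The exact interval-string syntax accepted by `HumanTimedelta` is whatever `safir.datetime.parse_timedelta` supports, so the "1h 30m" value below is an assumption.

```python
from datetime import timedelta

from pydantic import BaseModel

from safir.pydantic import HumanTimedelta, SecondsTimedelta


class Settings(BaseModel):
    # Accepts a timedelta or a stringified number of seconds.
    token_lifetime: SecondsTimedelta = timedelta(hours=1)

    # Additionally accepts human-readable interval strings.
    job_timeout: HumanTimedelta = timedelta(minutes=5)


settings = Settings.model_validate(
    {"token_lifetime": "3600", "job_timeout": "1h 30m"}
)
# Both fields are plain datetime.timedelta values after validation.
```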
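A minimal sketch of the DSN types in a configuration model. The field names, example DSNs, and the postgresql:// scheme are illustrative assumptions; the point is that when tox-docker's environment variables are set during tests, the validated URLs are rewritten to point at the test containers without any test-specific code.

```python
from pydantic import BaseModel

from safir.pydantic import EnvAsyncPostgresDsn, EnvRedisDsn


class Config(BaseModel):
    database_url: EnvAsyncPostgresDsn
    redis_url: EnvRedisDsn


# Outside the test suite these validate as normal DSNs; under tox-docker,
# the host and port are rewritten from its environment variables.
config = Config(
    database_url="postgresql://safir@localhost/safir",
    redis_url="redis://localhost:6379/0",
)
```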
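A minimal sketch of the retry decorator on a database store method. Bare decorator usage and the store/session structure are assumptions; the entry above says the retry count is configurable, but the parameter is not named here, so it is omitted.

```python
from sqlalchemy import text
from sqlalchemy.ext.asyncio import async_scoped_session

from safir.database import retry_async_transaction


class JobStore:
    def __init__(self, session: async_scoped_session) -> None:
        self._session = session

    @retry_async_transaction
    async def mark_complete(self, job_id: int) -> None:
        # If the statement is aborted because of a transaction isolation
        # conflict, the decorator retries the whole method.
        async with self._session.begin():
            await self._session.execute(
                text("UPDATE job SET phase = 'COMPLETED' WHERE id = :id"),
                {"id": job_id},
            )
```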
Other changes
- `safir.database.create_database_engine` now accepts the database URL as a Pydantic `Url` as well as a `str`. (Example below.)
- Allow the Slack webhook URL argument to `SlackWebhookClient` and `SlackRouteErrorHandler` to be given as a Pydantic `SecretStr` instead of a `str`. This simplifies code in applications that get that value from a secret. (Example below.)
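A sketch of passing a validated Pydantic URL straight to the engine factory, reusing the DSN type from above. The configuration model and password handling are assumptions about a typical application setup.

```python
from pydantic import BaseModel, SecretStr

from safir.database import create_database_engine
from safir.pydantic import EnvAsyncPostgresDsn


class Config(BaseModel):
    database_url: EnvAsyncPostgresDsn
    database_password: SecretStr


config = Config(
    database_url="postgresql://safir@localhost/safir",
    database_password=SecretStr("INSECURE-PASSWORD"),
)

# Previously the URL had to be converted with str(); the Pydantic Url can
# now be passed directly.
engine = create_database_engine(
    config.database_url, config.database_password.get_secret_value()
)
```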
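A sketch of constructing the webhook client with a `SecretStr`. The `safir.slack.webhook` import path and the other constructor arguments (application name and structlog logger) reflect the existing interface as understood here; only the acceptance of `SecretStr` is new in this release.

```python
import structlog
from pydantic import SecretStr

from safir.slack.webhook import SlackWebhookClient

# Typically this value comes from a secret in the application configuration.
webhook_url = SecretStr("https://hooks.slack.com/services/EXAMPLE")

logger = structlog.get_logger("example-app")

# The SecretStr no longer needs to be unwrapped with get_secret_value()
# before being passed to the client.
client = SlackWebhookClient(webhook_url, "example-app", logger)
```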
What's Changed
- DM-45137: Switch to a shared Ruff configuration by @rra in #263
- DM-45138: Add support for aborting arq jobs by @rra in #266
- DM-45281: Add new timedelta data types for Pydantic models by @rra in #269
- DM-45281: Add Pydantic types for Postgres and Redis DSNs by @rra in #270
- DM-45281: Break apart `safir.database` for ease of maintenance by @rra in #271
- DM-45281: Add decorator to retry async transactions by @rra in #272
- DM-45281: Allow create_database_engine to take a Url by @rra in #273
- DM-45281: Accept `SecretStr` as the Slack webhook URL by @rra in #274
- Update cryptography requirement from <43 to <44 by @dependabot in #276
- DM-45281: Add function to build arq RedisSettings by @rra in #275
- DM-45281: Switch to testcontainers from tox-docker by @rra in #277
- DM-45281: Convert to nox for the build system by @rra in #279
- DM-45281: Separate safir-arq from safir by @rra in #280
- DM-45281: Add UWS support library by @rra in #281
- DM-45281: Collect change log for 6.2.0 release by @rra in #282
Full Changelog: 6.1.0...6.2.0