Rollback and reapply norms in development? #37
Comments
I’ve thought about the idea, but never had the time to implement it. I do think Conformity could be a bit more intelligent.
I've been thinking about this a little bit; here are my current thoughts.

Background

In development, it is often desirable to work with a "development" database, and rework its schema multiple times before committing to a particular approach and deploying to production. Often the development database has useful working state in it which is inconvenient or slow to recreate. Datomic doesn't allow schema to be retracted or excised, so it is not possible to completely roll back a migration in the same way that you could in a traditional SQL database.

Idea

When conforming schemas, Conformity can (optionally) check whether the norms that were transacted still match what they say now. If there is a difference, then the user can use multiple strategies to bring the db into alignment. This is intended only for use in development; schema migration in production is a separate issue.

Schema strategy:

- No check
- Warn - print a warning to the console (probably best for production)
- Rename - rename all conformed attributes that have changed to a synthetic name, and then reapply the new attributes

Data strategy:

- Warn - warn that the data is now using renamed attributes
- Retract - retract any data that was transacted since the modified norm was applied
- Excise - excise any data that was transacted since the modified norm was applied

I'm not sure what should happen for data that was applied in a norm. Maybe the same as any data that was transacted since the norm? This is a little bit hazy for me, so some of the details above may not make sense or be possible, but this is the general direction I'm thinking of.
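To make the idea above concrete, here is a purely hypothetical sketch of what a development-only entry point could look like. Everything in it except `c/ensure-conforms` is invented for illustration: the function name, the option map, and the strategy keywords do not exist in Conformity.

```clojure
;; Hypothetical sketch only: ensure-conforms-dev!, its option map, and the
;; strategy keywords are invented here; only c/ensure-conforms is
;; Conformity's real API.
(require '[io.rkn.conformity :as c])

(defn ensure-conforms-dev!
  "Like c/ensure-conforms, but first compares each already-applied norm
  against its current definition and applies the configured strategies
  on a mismatch. Development use only."
  [conn norm-map {:keys [schema-strategy data-strategy]
                  :or   {schema-strategy :warn
                         data-strategy   :warn}}]
  ;; 1. compare a stored signature of each applied norm with its current tx-data
  ;; 2. on mismatch, dispatch on schema-strategy (:no-check, :warn, :rename)
  ;;    and on data-strategy (:warn, :retract, :excise)
  ;; 3. then conform as usual
  (c/ensure-conforms conn norm-map))
```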
My approach is to use datomock in my REPL and in my tests for the initial work of fleshing out the schema, with a helper function that looks more or less like this:

```clojure
(defn mock-conn! [conn]
  (def mconn (app.db.core/migrate-schema! (datomock/fork-conn conn)))
  (def mdb (d/db mconn)))
```

I then iterate on a mock connection in the REPL, and the tests give me some really good feedback. This also has the benefit of validating my transactions as I go.

It is not perfect (yet). I want to improve on the flow and be able to swap out the real Datomic connection with some development middleware so I can actually test my full app with the mocked/conformed database. I'm relying on my tests for this feedback, and I still find myself having to add additional transactions after I've conformed and have had the tests pass.
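The same forking trick extends naturally to a `clojure.test` fixture, so every test sees an isolated, freshly-migrated fork. This is a hedged sketch: `app.db.core/migrate-schema!` and `app.db.core/real-conn` are assumed application helpers (not real library API), while `datomock.core/fork-conn` is datomock's actual entry point.

```clojure
;; Sketch: give every test an isolated, freshly-migrated fork of the real
;; connection. migrate-schema! and real-conn are assumed app helpers;
;; dm/fork-conn is datomock's real function.
(require '[datomock.core :as dm]
         '[clojure.test :refer [use-fixtures]])

(def ^:dynamic *conn* nil)

(defn with-mock-conn [run-tests]
  ;; fork from the long-lived dev connection; writes stay local to the fork
  (binding [*conn* (app.db.core/migrate-schema!
                     (dm/fork-conn app.db.core/real-conn))]
    (run-tests)))

(use-fixtures :each with-mock-conn)
```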
> On Jul 17, 2017, at 3:36 PM, Daniel Compton wrote:
>
> I've been thinking about this a little bit, here's my current thoughts.
>
> Background
>
> In development, it is often desirable to work with a "development" database, and rework its schema multiple times before committing to a particular approach and deploying to production. Often the development database has useful working state in it which is inconvenient or slow to recreate.
>
> Datomic doesn't allow schema to be retracted or excised, so it is not possible to completely roll-back a migration in the same way that you could in a traditional SQL database.
>
> Idea
>
> When conforming schemas, Conformity can (optionally) check whether the norms that were transacted still match what they say now. If there is a difference, then the user can use multiple strategies to bring the db into alignment. This is intended only for use in development, schema migration in production is a separate issue.
I think if the feature is meant to be used in development only it would make sense to have a function handle re-applying norms, taking a configuration specifying conflict resolution strategies.
> Schema strategy:
>
> - No check
> - Warn - Print a warning to the console (probably best for production)
> - Rename - rename all conformed attributes that have changed to a synthetic name, and then reapply the new attributes
>
> Data:
>
> - Warn - warn that the data is now using renamed attributes
> - Retract - retract any data that was transacted since the modified norm was applied
> - Excise - excise any data that was transacted since the modified norm was applied
These seem good. We can go ahead with these and leave the interface open to extension (passing functions is probably reasonable, unless a protocol turns out to be necessary).
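One cheap way to keep that extension point open is to represent strategies as a plain map of keyword to function, so a caller can pass either a built-in keyword or their own function. A purely illustrative sketch; none of these names are Conformity's:

```clojure
;; Illustrative only: strategies as a keyword->fn map, extensible by
;; passing a user-supplied fn instead of a keyword.
(def default-schema-strategies
  {:no-check (fn [_conflict] nil)
   :warn     (fn [conflict]
               (println "WARNING: norm" (:norm conflict)
                        "has changed since it was conformed"))})

(defn resolve-conflict
  "Dispatch on the strategy: a fn is called directly, a keyword is looked
  up in the strategy map, and unknown keywords fall back to :warn."
  [strategies strategy conflict]
  (let [f (if (fn? strategy)
            strategy
            (get strategies strategy (:warn strategies)))]
    (f conflict)))
```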
> I'm not sure what should happen for data that was applied in a norm. Maybe the same as any data that was transacted since the norm?
Me neither.
Side note: I imagine the hashing strategy for schema is going to be challenging. (It would be totally awesome to write a hash of the norm on conform)
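On the hashing side, a minimal scheme (illustrative names only, not Conformity's API) is to compute a signature of each norm's tx-data at conform time, store it alongside the conformity marker, and diff on later runs. Clojure's `hash` is deterministic for edn values; a cryptographic digest over `pr-str` is another option if cross-version stability matters.

```clojure
;; Illustrative sketch of change detection via a stored per-norm signature.
(defn norm-signature [norm]
  (hash (:txes norm)))

(defn changed-norms
  "Given the current norm-map and a map of norm-name -> stored signature,
  return the names whose tx-data no longer matches what was conformed.
  Norms never conformed before are not reported."
  [norm-map stored]
  (for [[norm-name norm] norm-map
        :let [sig (norm-signature norm)]
        :when (and (contains? stored norm-name)
                   (not= sig (stored norm-name)))]
    norm-name))
```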
> This is a little bit hazy for me, so some of the details above may not make sense or be possible, but this is the general direction I'm thinking of.
I think the general direction is good. I want to try as hard as possible not to force people to duplicate their norms files though. Regardless, I'd be happy to enhance the library for the use case described.
Thanks for writing this up.
In case it's useful as a dissenting data point, I don't personally feel the anticipated benefit of this feature sufficiently offsets the increase in complexity or API surface area. To me, it feels more like a workflow issue than a tooling one.

In the past, I've always addressed the problem of arriving at a correct norm by iterating on small, focused schema alterations via some mix of REPL testing and Datomic's mem DB before committing final PR-worthy changes to the schema description -- similar in spirit to the approach described by @kennethkalmer. This hasn't been nearly painful enough that I've wanted extra tooling support for it. Just wanted to throw that out there in case it helps highlight an alternative path. Happy to discuss further -- in particular, I'd be curious to see a concrete code situation that constitutes a strong pro argument for this change; perhaps I've just been lucky?
One strong use case we have is that our front-end developers run our back-end Datomic application, but don't usually work with the code much. They need a persistent database so that they can keep the state they've built up over time when developing a feature for a few days. When I push an update which changes an existing migration (one that hasn't made it to prod yet), I'd like it to automatically migrate the data, or at least warn them that their schema isn't up to date. Currently, I just tell them when to drop the entire DB, which works but isn't ideal, and sometimes I forget to tell them, leading to strange errors.
Duct's migrator.ragtime has a really neat feature where it hashes the migration, and if the hash of the migration changes, it rolls back the migration, then reapplies it. This is really useful in development scenarios, where you want to be able to experiment with different schemas, or incrementally modify an existing one during development.
Would something like this make sense for Conformity?
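The ragtime-style behaviour described above can be sketched as a generic control loop, independent of any particular store. Here `apply!` and `rollback!` are placeholders standing in for whatever the migration library actually does, and the hash comparison mirrors the "reapply on changed hash" idea:

```clojure
;; Generic sketch of hash / rollback / reapply: apply!, rollback!, and the
;; migration shape are hypothetical stand-ins, not ragtime's real API.
(defn sync-migrations!
  "For each migration: apply it if it has never been applied; if its
  stored hash differs from the current one, roll it back and reapply;
  otherwise leave it alone."
  [{:keys [applied-hashes apply! rollback!]} migrations]
  (doseq [{:keys [id] :as m} migrations
          :let [h      (hash (:tx-data m))
                stored (get applied-hashes id)]]
    (cond
      (nil? stored)   (apply! m)
      (not= stored h) (do (rollback! m)
                          (apply! m)))))
```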