
imperativeStatus and validationModes #41

Open
nickevansuk opened this issue Oct 22, 2019 · 12 comments

@nickevansuk
Contributor

nickevansuk commented Oct 22, 2019

Use case

Representing different property imperative configurations across validationModes

Proposal

For a particular subset of validationModes (e.g. C1Request, C2Request), a specific imperativeConfiguration should be applied.

If no matching validationMode is found for the current mode, the default values for the imperativeConfiguration from the model root should be used. This means that validationMode and imperativeConfiguration properties are only required in the model where there is variance in their use between modes; otherwise the default can be used without cluttering the model files.

If a matching validationMode is found for the current mode, then the imperativeConfiguration it references should be used, and the values of requiredFields, recommendedFields and shallNotInclude in the model root should be completely ignored.

shallNotInclude should be added just within imperativeConfiguration, to represent properties that are not expected to be found in a particular imperativeConfiguration, but are still in the spec.

Example

{
   "type": "Lease",
   "subClassOf": "https://schema.org/Thing",
   "validationMode": {
     "C1Request": "request",
     "C2Request": "request"
   },
   "imperativeConfiguration": {
      "request": {
         "hasId": false,
         "requiredFields": [
            "type",
            "leaseExpires"
         ],
         "recommendedFields": [],
         "shallNotInclude": [],
         "requiredOptions": []
      }
   },
   ...
}
Implementation requirements

Rules

  • New rule to recognise shallNotInclude (called ShallNotIncludeFieldsRule?) and produce error messages as appropriate; note that shallNotInclude does not exist in the default mode and only exists within imperativeConfiguration (see the sketch after this list).
  • Update to the existing RequiredFieldsRule, RequiredOptionalFieldsRule and RecommendedFieldsRule to respect the input validationMode and resolve/use the relevant imperativeConfiguration instead of the default.
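
To illustrate the core check the new rule would perform (the Rule-class boilerplate, error creation and severity handling are omitted, and none of the names below are the library's actual API):

// Sketch only: returns the forbidden fields that are actually present on a
// node, given the imperative configuration resolved for the current mode.
function findForbiddenFields(node, imperativeConfiguration) {
  const forbidden = imperativeConfiguration.shallNotInclude || [];
  return forbidden.filter((field) => typeof node[field] !== 'undefined');
}

// e.g. findForbiddenFields({ '@type': 'Lease', 'id': 'https://example.org/lease/1' },
//                          { shallNotInclude: ['id'] })  =>  ['id']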

Data model tests

  • Data model tests to ensure that all fields featured in the imperativeConfiguration within requiredFields, recommendedFields, shallNotInclude, and requiredOptions exist in inSpec within the same model (a sketch follows this list)
  • Data model tests to ensure that all validationModes specified are valid (the valid list is inferred from meta.json; see below)
  • Data model tests to ensure that all validationModes reference valid imperativeConfigurations within the same model
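
A sketch of the first of these tests, assuming the test suite can load each model as a parsed JSON object (the helper below and the exact assertion style are assumptions):

// Sketch only: collects any fields referenced by an imperativeConfiguration
// that are missing from the model's inSpec list. A data model test would
// assert that this returns an empty array for every model. requiredOptions
// may need separate handling depending on the shape of its entries.
const FIELD_LISTS = ['requiredFields', 'recommendedFields', 'shallNotInclude'];

function fieldsMissingFromSpec(model) {
  const inSpec = model.inSpec || [];
  const missing = [];
  Object.entries(model.imperativeConfiguration || {}).forEach(([configName, config]) => {
    FIELD_LISTS.forEach((listName) => {
      (config[listName] || []).forEach((field) => {
        if (!inSpec.includes(field)) {
          missing.push(`${configName}.${listName}: ${field}`);
        }
      });
    });
  });
  return missing;
}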

Making modes accessible

The data model library should include an accessible enumeration of validationModes which are configured in meta.json.

This should be the source of truth for available validationMode values, and can be used to drive the validator GUI.

"validationModeGroup":[
  {
    "name": "Opportunity Data Publishing"
    "validationModeList": [
      {
        "validationMode": "RPDEFeed" // note will just be handled as default everywhere as unrecognised
        "name": "Open Data Listings RPDE Feed"
      },
      {
        "validationMode": "BookableRPDEFeed"
        "name": "Open Data Bookable RPDE Feed"
      }
    ]
  },
  {
    "name": "Open Booking API"
    "validationModeList": [
      {
        "validationMode": "C1Request"
        "name": "C1 Request"
      },
      {
        "validationMode": "C1Response"
        "name": "C1 Response"
      },
      {
        "validationMode": "C2Request"
        "name": "C2 Request"
      }
      ...
    ]
  }
]
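
The flat enumeration that the rules and the GUI consume could then be derived from these groups; a sketch (the accessor name is illustrative, not the published data-models API):

// Sketch only: derives the flat list of recognised validation modes from the
// grouped structure above.
function getAvailableValidationModes(meta) {
  return (meta.validationModeGroup || []).reduce(
    (modes, group) => modes.concat((group.validationModeList || []).map((entry) => entry.validationMode)),
    [],
  );
}

// e.g. => ['RPDEFeed', 'BookableRPDEFeed', 'C1Request', 'C1Response', 'C2Request', ...]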

Representation in the Validator GUI

The validation mode group mentioned above can then be represented in the Validator GUI as a button next to the "VALIDATE" button, which uses existing design patterns to allow selection of the validation mode.

The validation mode must also be able to be set in the URL of the validator, such that someone hyperlinking to a specific sample has the mode preset.
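
For the URL requirement, the validator site could read the mode from a query parameter on load; a sketch in client-side JavaScript (the parameter name and the helpers availableValidationModes and setSelectedValidationMode are assumptions, not existing site code):

// Sketch only: preselects a validation mode passed in the URL, e.g.
// https://validator.openactive.io/?validationMode=C1Request
const params = new URLSearchParams(window.location.search);
const presetMode = params.get('validationMode');
if (presetMode && availableValidationModes.includes(presetMode)) {
  setSelectedValidationMode(presetMode); // placeholder for the GUI's state update
}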

[Mockup: validation mode selector next to the VALIDATE button]

Sample grouping

The existing example_list.json needs to be adjusted as below to include grouping capability similar to the above.

[
  {
    "name": "Opportunity Data Publishing",
    "exampleList": [
        {
          "file": "sessionseries_example_1.json",
          "name": "SessionSeries with ScheduledSession Example"
        },
        {
          "file": "scheduledsession_example_1.json",
          "name": "ScheduledSession with SessionSeries Example"
        },
        {
          "file": "sessionseries-split_example_1.json",
          "name": "SessionSeries Example"
        }
    ]
  },
  {
    "name": "Open Booking API",
    "exampleList": [
        {
          "file": "c1_request_example_1.json",
          "name": "C1 Request Example",
          "validationMode": "C1Request"
        },
        {
          "file": "c1_response_example_1.json",
          "name": "C1 Response Example (simple)",
          "validationMode": "C1Response"
        },
        {
          "file": "c1_response_example_2.json",
          "name": "C1 Response Example (with Attendee Details)",
          "validationMode": "C1Response"
        }
    ]
  }
]

Samples within the Validator GUI

The GUI of the validator then needs to render these groups for the examples.

As the example list is quite long, consider how the list will scale as the number of examples increases.
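
One option that keeps a long list manageable is to render each group as a labelled section of a dropdown; a sketch assuming the grouped example_list.json shape above (the real GUI is free to use a different pattern):

// Sketch only: turns the grouped example list into <optgroup>/<option> markup
// for a select element, so each group reads as a labelled section.
function renderExampleOptions(exampleGroups) {
  return exampleGroups
    .map((group) => {
      const options = group.exampleList
        .map((example) => `<option value="${example.file}">${example.name}</option>`)
        .join('');
      return `<optgroup label="${group.name}">${options}</optgroup>`;
    })
    .join('');
}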

[Screenshot: example list in the Validator GUI]

henryaddison added a commit that referenced this issue Oct 24, 2019
these aren't yet added to the model specifications but based on the plan in #41
@henryaddison
Contributor

@nickevansuk I have made a start on some of the rules and model specs for this, as you can see in the two PRs linked above. A third one will follow soon (the work is done; I just need to push it when I can access it again). This probably marks about the end of the initial timebox for working on the model validator library's rules (I might be able to squeeze in moving the source of truth for available validation modes from the validator library to the data-models repo).

@henryaddison
Contributor

@nickevansuk I am getting through the work for this issue, so I'm thinking a bit about how these changes will be deployed.

Adding validationModeGroup to the metadata in data-models doesn't break anything, but the validator site will come to rely on it. My suggestion is therefore to make a new minor version of data-models (1.6.0 from 1.5.6, I think) so the changes to the validator site can specify only later versions.

However, the changes to the examples/samples are more complex: rather than a pure addition, it is a change to data-models which is not backwards compatible with the validator site code (before, the validator site expects examples to be a flat list of objects; after, it is groups of lists). I assume this is not a case of creating a new version of the spec in data-models but instead should be a change of version of the @openactive/data-models npm package, which I think strictly should be a new major version (2.0) to indicate the breaking change.

Basically, do you agree with me updating the npm package's version to 2.0.0 in #46?

@nickevansuk
Contributor Author

Yep, updating the models to 2.0 is fine; we're doing fairly major tooling work across the board, so that makes sense to me.

@thtg88

thtg88 commented Oct 29, 2019

If I understand this correctly, does this mean we will also have to bump the major version in models-lib/package.json?

@henryaddison
Contributor

@thtg88 Other than the change to the Barcode.json file name, none of the changes I'm making affect the models-lib (yet), and even the file rename doesn't seem to be an issue for the code generator, so you shouldn't HAVE to. It would probably be wise, though, to update the version in models-lib once it's released to avoid falling behind.

@henryaddison
Contributor

henryaddison commented Oct 31, 2019

I have noticed that the Rules in the data-model-validator lib are quite highly coupled to a version of the spec in the data-models lib, due to the way rules target fields, models and validation modes. This is unavoidable - for new versions of the spec there will need to be some new rules, and some old rules will no longer apply. However, the way it is currently implemented means the rules are at risk of accidentally being ignored due to typos in the strings used by targetFields, targetModels and targetValidationModes.

At some point I guess it will be necessary to have per-version collections of the rules to apply during validation, and therefore to more explicitly include the version when constructing a new instance of a Rule class (maybe this is already possible, since the constructor takes an OptionsHelper object), and to add some validation of the values of targetModels, targetFields and targetValidationModes against the models, fields and validation modes that exist in the data-models spec for that version.

@nickevansuk
Contributor Author

nickevansuk commented Oct 31, 2019

Yes, great point about having the rules target specific versions - if this isn't there already, presumably the way to do this would be by adding targetVersions to the rules, which for now would all be either 2.0 or 2.x (depending on the below).

We had a design discussion originally about versions of the model and decided that we didn't want a set of model files for each minor version, as that would be a nightmare to maintain! Instead we opted for 2.x to represent the "latest" minor version, as any changes made in a minor version by definition need to be non-breaking, and so the model should not become more restrictive over time, only more expansive.

I'm not sure what version ends up being used by the time you get to the rule itself (i.e. this.options.version within the rule), but whichever that is, we should probably be consistent with it in terms of the granularity of targetVersions.

So if this.options.version is 2.x then:

  • Allow only 2.x, 3.x etc (as defined by the values of the object in versions.js) to be specified in the targetVersions

If this.options.version is 2.0 then:

  • Allow either 2.1 or 2.x to be specified in the targetVersions (as defined by the union of keys and values in versions.js), to allow us to be consistent, and have rules applicable to minor versions if required

Could we add "targetVersions support and some validation of the values of targetModels, targetFields and targetValidationModes against the models, fields and validation modes that exist in the data models spec for that version" to the spreadsheet to pick up? Sounds like it will be useful if we're about to embark on extensive rule writing?
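
As a rough illustration of how a targetVersions check could sit alongside the existing target checks inside a rule (all names here are speculative, and the mapping of minor versions to 2.x via versions.js is glossed over):

// Sketch only: a hypothetical helper deciding whether a rule applies to the
// version currently being validated (e.g. this.options.version).
function isRuleApplicableToVersion(rule, currentVersion) {
  if (!rule.targetVersions || rule.targetVersions === '*') {
    return true; // no restriction: the rule applies to every version
  }
  return rule.targetVersions.includes(currentVersion);
}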

@henryaddison
Contributor

@nickevansuk & @thill-odi,

With the merging of the changes to the validator site above (openactive/data-model-validator-site#56), I believe that completes the work for this issue (bar possibly our comments about validating validationModes, targetModels and targetFields - this has been added to the bottom of the spreadsheet and perhaps belongs in its own issue).

The work is merged into master, but I will leave it to you to confirm all is good and to deploy the updates to the production site (not least because I don't know how it's hosted!).

@nickevansuk
Contributor Author

@henryaddison I'm still getting an error at https://validator-staging.openactive.io/ - I assume it works on your machine?

@nickevansuk
Contributor Author

In fact @henryaddison, here are the logs from Heroku:

2019-11-07T10:39:00.902400+00:00 heroku[web.1]: State changed from crashed to starting
2019-11-07T10:39:05.148534+00:00 heroku[web.1]: Starting process with command `npm start`
2019-11-07T10:39:07.357507+00:00 app[web.1]: 
2019-11-07T10:39:07.357538+00:00 app[web.1]: > [email protected] start /app
2019-11-07T10:39:07.357539+00:00 app[web.1]: > node -r dotenv/config dist/server/index.js
2019-11-07T10:39:07.357540+00:00 app[web.1]: 
2019-11-07T10:39:08.256822+00:00 app[web.1]: /app/dist/server/server.js:81
2019-11-07T10:39:08.256875+00:00 app[web.1]:         var _ref = _asyncToGenerator( /*#__PURE__*/regeneratorRuntime.mark(function _callee(req, res) {
2019-11-07T10:39:08.256877+00:00 app[web.1]:                                                    ^
2019-11-07T10:39:08.256879+00:00 app[web.1]: 
2019-11-07T10:39:08.256880+00:00 app[web.1]: ReferenceError: regeneratorRuntime is not defined
2019-11-07T10:39:08.256883+00:00 app[web.1]:     at /app/dist/server/server.js:81:52
2019-11-07T10:39:08.256884+00:00 app[web.1]:     at Function.createServer (/app/dist/server/server.js:208:8)
2019-11-07T10:39:08.256885+00:00 app[web.1]:     at Object.<anonymous> (/app/dist/server/index.js:13:18)
2019-11-07T10:39:08.256887+00:00 app[web.1]:     at Module._compile (module.js:652:30)
2019-11-07T10:39:08.256888+00:00 app[web.1]:     at Object.Module._extensions..js (module.js:663:10)
2019-11-07T10:39:08.256889+00:00 app[web.1]:     at Module.load (module.js:565:32)
2019-11-07T10:39:08.256891+00:00 app[web.1]:     at tryModuleLoad (module.js:505:12)
2019-11-07T10:39:08.256892+00:00 app[web.1]:     at Function.Module._load (module.js:497:3)
2019-11-07T10:39:08.256893+00:00 app[web.1]:     at Function.Module.runMain (module.js:693:10)
2019-11-07T10:39:08.256894+00:00 app[web.1]:     at startup (bootstrap_node.js:191:16)
2019-11-07T10:39:08.264065+00:00 app[web.1]: npm ERR! code ELIFECYCLE
2019-11-07T10:39:08.264360+00:00 app[web.1]: npm ERR! errno 1
2019-11-07T10:39:08.265502+00:00 app[web.1]: npm ERR! [email protected] start: `node -r dotenv/config dist/server/index.js`
2019-11-07T10:39:08.265632+00:00 app[web.1]: npm ERR! Exit status 1
2019-11-07T10:39:08.265867+00:00 app[web.1]: npm ERR! 
2019-11-07T10:39:08.266253+00:00 app[web.1]: npm ERR! Failed at the [email protected] start script.
2019-11-07T10:39:08.266467+00:00 app[web.1]: npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
2019-11-07T10:39:09.689298+00:00 heroku[web.1]: State changed from starting to crashed
2019-11-07T10:39:09.626363+00:00 app[web.1]: 
2019-11-07T10:39:09.626549+00:00 app[web.1]: npm ERR! A complete log of this run can be found in:
2019-11-07T10:39:09.626645+00:00 app[web.1]: npm ERR!     /app/.npm/_logs/2019-11-07T10_39_08_267Z-debug.log
2019-11-07T10:39:09.675431+00:00 heroku[web.1]: Process exited with status 1

@henryaddison
Contributor

Ah, I'd foolishly assumed the issues with the staging site were unrelated. I will take a look.

@henryaddison
Contributor

On the plus side, I can re-create it on my machine with npm start rather than running the dev server.
