
Clean validation of Tavern files is required to allow using it in a productive way #724

Open
hsq125 opened this issue Oct 18, 2021 · 1 comment

hsq125 commented Oct 18, 2021

I've been stuck for hours on a simple test with one config file and one external stage file.

Every time there is a single mistake, I'm left with a 500-line stack trace and no clear indication of what is wrong.

to_verify = {'test_name': 'Full path, from client sync to period renewal.', 'includes': [{'test_name': 'Database delete stage', 'i...prod/command/SynchronizeClients'}, 'response': {'status_code': 502, 'headers': {'content-type': 'application/json'}}}]}
schema = {'name': 'Test schema', 'desc': 'Matches test blocks', 'schema;any_request_json': {'func': 'validate_request_json', 't... 'map', 'mapping': {'username': {'type': 'str', 'required': True}, 'password': {'type': 'str', 'required': False}}}}}}}

    def verify_generic(to_verify, schema):
        """Verify a generic file against a given schema
    
        Args:
            to_verify (dict): Filename of source tests to check
            schema (dict): Schema to verify against
    
        Raises:
            BadSchemaError: Schema did not match
        """
        logger.debug("Verifying %s against %s", to_verify, schema)
    
        here = os.path.dirname(os.path.abspath(__file__))
        extension_module_filename = os.path.join(here, "extensions.py")
    
        verifier = core.Core(
            source_data=to_verify,
            schema_data=schema,
            extensions=[extension_module_filename],
        )
    
        try:
>           verifier.validate()

../../../../../dependencies/python/aws38/lib/python3.8/site-packages/tavern/schemas/files.py:106: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <pykwalify.core.Core object at 0x7f0663c92490>, raise_exception = True

    def validate(self, raise_exception=True):
        """
        """
        log.debug(u"starting core")
    
        self._start_validate(self.source)
        self.validation_errors = [unicode(error) for error in self.errors]
        self.validation_errors_exceptions = self.errors
    
        if self.errors is None or len(self.errors) == 0:
            log.info(u"validation.valid")
        else:
            log.error(u"validation.invalid")
            log.error(u" --- All found errors ---")
            log.error(self.validation_errors)
            if raise_exception:
>               raise SchemaError(u"Schema validation failed:\n - {error_msg}.".format(
                    error_msg=u'.\n - '.join(self.validation_errors)))
E               pykwalify.errors.SchemaError: <SchemaError: error code 2: Schema validation failed:
E                - Cannot find required key 'name'. Path: '/includes/0'.
E                - Cannot find required key 'description'. Path: '/includes/0'.
E                - Key 'test_name' was not defined. Path: '/includes/0'.
E                - Key 'includes' was not defined. Path: '/includes/0'.: Path: '/'>

../../../../../dependencies/python/aws38/lib/python3.8/site-packages/pykwalify/core.py:194: SchemaError

The above exception was the direct cause of the following exception:

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f0662a263a0>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

../../../../../dependencies/python/aws38/lib/python3.8/site-packages/_pytest/runner.py:311: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>       lambda: ihook(item=item, **kwds), when=when, reraise=reraise
    )

../../../../../dependencies/python/aws38/lib/python3.8/site-packages/_pytest/runner.py:255: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <_HookCaller 'pytest_runtest_call'>, args = ()
kwargs = {'item': <YamlItem Full path, from client sync to period renewal.>}
notincall = set()

    def __call__(self, *args, **kwargs):
        if args:
            raise TypeError("hook calling supports only keyword arguments")
        assert not self.is_historic()
        if self.spec and self.spec.argnames:
            notincall = (
                set(self.spec.argnames) - set(["__multicall__"]) - set(kwargs.keys())
            )
            if notincall:
                warnings.warn(
                    "Argument(s) {} which are declared in the hookspec "
                    "can not be found in this hook call".format(tuple(notincall)),
                    stacklevel=2,
                )
>       return self._hookexec(self, self.get_hookimpls(), kwargs)

../../../../../dependencies/python/aws38/lib/python3.8/site-packages/pluggy/hooks.py:286: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <_pytest.config.PytestPluginManager object at 0x7f066319fac0>
hook = <_HookCaller 'pytest_runtest_call'>
methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/home/detailoc/dev/dependencies/python/aws38/li...xception' from '/home/detailoc/dev/dependencies/python/aws38/lib/python3.8/site-packages/_pytest/threadexception.py'>>]
kwargs = {'item': <YamlItem Full path, from client sync to period renewal.>}

    def _hookexec(self, hook, methods, kwargs):
        # called from all hookcaller instances.
        # enable_tracing will set its own wrapping function at self._inner_hookexec
>       return self._inner_hookexec(hook, methods, kwargs)

../../../../../dependencies/python/aws38/lib/python3.8/site-packages/pluggy/manager.py:93: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

hook = <_HookCaller 'pytest_runtest_call'>
methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/home/detailoc/dev/dependencies/python/aws38/li...xception' from '/home/detailoc/dev/dependencies/python/aws38/lib/python3.8/site-packages/_pytest/threadexception.py'>>]
kwargs = {'item': <YamlItem Full path, from client sync to period renewal.>}

>   self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
        methods,
        kwargs,
        firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
    )

../../../../../dependencies/python/aws38/lib/python3.8/site-packages/pluggy/manager.py:84: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/home/detailoc/dev/dependencies/python/aws38/li...xception' from '/home/detailoc/dev/dependencies/python/aws38/lib/python3.8/site-packages/_pytest/threadexception.py'>>]
caller_kwargs = {'item': <YamlItem Full path, from client sync to period renewal.>}
firstresult = False

    def _multicall(hook_impls, caller_kwargs, firstresult=False):
        """Execute a call into multiple python functions/methods and return the
        result(s).
    
        ``caller_kwargs`` comes from _HookCaller.__call__().
        """
        __tracebackhide__ = True
        results = []
        excinfo = None
        try:  # run impl and wrapper setup functions in a loop
            teardowns = []
            try:
                for hook_impl in reversed(hook_impls):
                    try:
                        args = [caller_kwargs[argname] for argname in hook_impl.argnames]
                    except KeyError:
                        for argname in hook_impl.argnames:
                            if argname not in caller_kwargs:
                                raise HookCallError(
                                    "hook call must provide argument %r" % (argname,)
                                )
    
                    if hook_impl.hookwrapper:
                        try:
                            gen = hook_impl.function(*args)
                            next(gen)  # first yield
                            teardowns.append(gen)
                        except StopIteration:
                            _raise_wrapfail(gen, "did not yield")
                    else:
                        res = hook_impl.function(*args)
                        if res is not None:
                            results.append(res)
                            if firstresult:  # halt further impl calls
                                break
            except BaseException:
                excinfo = sys.exc_info()
        finally:
            if firstresult:  # first result hooks return a single value
                outcome = _Result(results[0] if results else None, excinfo)
            else:
                outcome = _Result(results, excinfo)
    
            # run all wrapper post-yield blocks
            for gen in reversed(teardowns):
                try:
                    gen.send(outcome)
                    _raise_wrapfail(gen, "has second yield")
                except StopIteration:
                    pass
    
>           return outcome.get_result()

../../../../../dependencies/python/aws38/lib/python3.8/site-packages/pluggy/callers.py:208: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <pluggy.callers._Result object at 0x7f0663c92640>

    def get_result(self):
        """Get the result(s) for this hook call.
    
        If the hook was marked as a ``firstresult`` only a single value
        will be returned otherwise a list of results.
        """
        __tracebackhide__ = True
        if self._excinfo is None:
            return self._result
        else:
            ex = self._excinfo
            if _py3:
>               raise ex[1].with_traceback(ex[2])

../../../../../dependencies/python/aws38/lib/python3.8/site-packages/pluggy/callers.py:80: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/home/detailoc/dev/dependencies/python/aws38/li...xception' from '/home/detailoc/dev/dependencies/python/aws38/lib/python3.8/site-packages/_pytest/threadexception.py'>>]
caller_kwargs = {'item': <YamlItem Full path, from client sync to period renewal.>}
firstresult = False

    def _multicall(hook_impls, caller_kwargs, firstresult=False):
        """Execute a call into multiple python functions/methods and return the
        result(s).
    
        ``caller_kwargs`` comes from _HookCaller.__call__().
        """
        __tracebackhide__ = True
        results = []
        excinfo = None
        try:  # run impl and wrapper setup functions in a loop
            teardowns = []
            try:
                for hook_impl in reversed(hook_impls):
                    try:
                        args = [caller_kwargs[argname] for argname in hook_impl.argnames]
                    except KeyError:
                        for argname in hook_impl.argnames:
                            if argname not in caller_kwargs:
                                raise HookCallError(
                                    "hook call must provide argument %r" % (argname,)
                                )
    
                    if hook_impl.hookwrapper:
                        try:
                            gen = hook_impl.function(*args)
                            next(gen)  # first yield
                            teardowns.append(gen)
                        except StopIteration:
                            _raise_wrapfail(gen, "did not yield")
                    else:
>                       res = hook_impl.function(*args)

../../../../../dependencies/python/aws38/lib/python3.8/site-packages/pluggy/callers.py:187: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

item = <YamlItem Full path, from client sync to period renewal.>

    def pytest_runtest_call(item: Item) -> None:
        _update_current_test_var(item, "call")
        try:
            del sys.last_type
            del sys.last_value
            del sys.last_traceback
        except AttributeError:
            pass
        try:
            item.runtest()
        except Exception as e:
            # Store trace info to allow postmortem debugging
            sys.last_type = type(e)
            sys.last_value = e
            assert e.__traceback__ is not None
            # Skip *this* frame
            sys.last_traceback = e.__traceback__.tb_next
>           raise e

../../../../../dependencies/python/aws38/lib/python3.8/site-packages/_pytest/runner.py:170: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

item = <YamlItem Full path, from client sync to period renewal.>

    def pytest_runtest_call(item: Item) -> None:
        _update_current_test_var(item, "call")
        try:
            del sys.last_type
            del sys.last_value
            del sys.last_traceback
        except AttributeError:
            pass
        try:
>           item.runtest()

../../../../../dependencies/python/aws38/lib/python3.8/site-packages/_pytest/runner.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <YamlItem Full path, from client sync to period renewal.>

    def runtest(self):
        # Do a deep copy because this sometimes still retains things from previous tests(?)
        self.global_cfg = copy.deepcopy(load_global_cfg(self.config))
    
        self.global_cfg.setdefault("variables", {})
    
        load_plugins(self.global_cfg)
    
        self.global_cfg["tavern_internal"] = {"pytest_hook_caller": self.config.hook}
    
        # INTERNAL
        # NOTE - now that we can 'mark' tests, we could use pytest.mark.xfail
        # instead. This doesn't differentiate between an error in verification
        # and an error when running the test though.
        xfail = self.spec.get("_xfail", False)
    
        try:
            fixture_values = self._load_fixture_values()
            self.global_cfg["variables"].update(fixture_values)
    
            call_hook(
                self.global_cfg,
                "pytest_tavern_beta_before_every_test_run",
                test_dict=self.spec,
                variables=self.global_cfg["variables"],
            )
    
>           verify_tests(self.spec)

../../../../../dependencies/python/aws38/lib/python3.8/site-packages/tavern/testutils/pytesthook/item.py:184: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

test_spec = {'test_name': 'Full path, from client sync to period renewal.', 'includes': [{'test_name': 'Database delete stage', 'i...prod/command/SynchronizeClients'}, 'response': {'status_code': 502, 'headers': {'content-type': 'application/json'}}}]}
with_plugins = True

    def verify_tests(test_spec, with_plugins=True):
        """Verify that a specific test block is correct
    
        Todo:
            Load schema file once. Requires some caching of the file
    
        Args:
            test_spec (dict): Test in dictionary form
    
        Raises:
            BadSchemaError: Schema did not match
        """
        here = os.path.dirname(os.path.abspath(__file__))
    
        schema_filename = os.path.join(here, "tests.schema.yaml")
        schema = load_schema_file(schema_filename, with_plugins)
    
>       verify_generic(test_spec, schema)

../../../../../dependencies/python/aws38/lib/python3.8/site-packages/tavern/schemas/files.py:152: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

to_verify = {'test_name': 'Full path, from client sync to period renewal.', 'includes': [{'test_name': 'Database delete stage', 'i...prod/command/SynchronizeClients'}, 'response': {'status_code': 502, 'headers': {'content-type': 'application/json'}}}]}
schema = {'name': 'Test schema', 'desc': 'Matches test blocks', 'schema;any_request_json': {'func': 'validate_request_json', 't... 'map', 'mapping': {'username': {'type': 'str', 'required': True}, 'password': {'type': 'str', 'required': False}}}}}}}

    def verify_generic(to_verify, schema):
        """Verify a generic file against a given schema
    
        Args:
            to_verify (dict): Filename of source tests to check
            schema (dict): Schema to verify against
    
        Raises:
            BadSchemaError: Schema did not match
        """
        logger.debug("Verifying %s against %s", to_verify, schema)
    
        here = os.path.dirname(os.path.abspath(__file__))
        extension_module_filename = os.path.join(here, "extensions.py")
    
        verifier = core.Core(
            source_data=to_verify,
            schema_data=schema,
            extensions=[extension_module_filename],
        )
    
        try:
            verifier.validate()
        except pykwalify.errors.PyKwalifyException as e:
            logger.exception("Error validating %s", to_verify)
>           raise BadSchemaError() from e
E           tavern.util.exceptions.BadSchemaError

../../../../../dependencies/python/aws38/lib/python3.8/site-packages/tavern/schemas/files.py:109: BadSchemaError


========================= 1 failed, 1 passed in 0.76s ==========================

Process finished with exit code 1

This is just one example; Tavern crashes like this in various places depending on the mistake. It is not possible to use it productively this way.
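For what it's worth, the four pykwalify errors in the traceback above boil down to one mistake: the entry under `includes` uses `test_name` where an include file requires `name` and `description`. A quick pre-flight check along these lines surfaces that directly, without the 500-line traceback. This is only a sketch: `REQUIRED_INCLUDE_KEYS` comes from the errors shown above, while `ALLOWED_INCLUDE_KEYS` (adding `variables` and `stages`) is an assumption about what an include file may contain, not Tavern's actual schema.

```python
# Hypothetical pre-flight check, derived from the pykwalify errors above.
# Required keys are taken from the error messages; the allowed set is an
# assumption, not Tavern's real schema.
REQUIRED_INCLUDE_KEYS = {"name", "description"}
ALLOWED_INCLUDE_KEYS = REQUIRED_INCLUDE_KEYS | {"variables", "stages"}  # assumption


def check_includes(test_spec):
    """Return human-readable problems for each entry under `includes`."""
    problems = []
    for i, inc in enumerate(test_spec.get("includes", [])):
        for key in sorted(REQUIRED_INCLUDE_KEYS - inc.keys()):
            problems.append(f"includes[{i}]: missing required key {key!r}")
        for key in sorted(inc.keys() - ALLOWED_INCLUDE_KEYS):
            problems.append(f"includes[{i}]: unexpected key {key!r}")
    return problems


# The spec from the traceback, reduced to the part that fails validation.
spec = {
    "test_name": "Full path, from client sync to period renewal.",
    "includes": [{"test_name": "Database delete stage"}],
}
for problem in check_includes(spec):
    print(problem)
```

Running this on the failing spec prints three one-line problems pointing at `includes[0]`, which is far easier to act on than the raw traceback.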

hsq125 changed the title from "Errors in Tavern files are not properly indicated to user, which makes it a nightmare to use" to "Clean validation of Tavern files is required to allow using it in a productive way" on Oct 19, 2021
michaelboulton (Member) commented:

This is partly why I'm switching to jsonschema in the next release; it should be able to provide more useful errors, like:

../../.tox/py38-generic/lib/python3.8/site-packages/tavern/_core/schema/jsonschema.py:134: in verify_jsonschema
    raise BadSchemaError(msg) from e
E   tavern._core.exceptions.BadSchemaError:
E   ---
E
E   Additional properties are not allowed ('blllblbb' was unexpected)
E
E     - name: Send with basic auth
E       blllblbb: gg
E       request:
E         url: "{global_host}/authtest/basic"
E         method: GET
E         auth:
E           - "fakeuser"
E           - "fakepass"
E       response:
E         status_code: 200
E         json:
E           auth_type: basic
E           auth_user: fakeuser
E           auth_pass: fakepass

For the time being, you can also use the pytest -q flag, which won't print so many lines of code in each traceback.
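Another stopgap, if you invoke Tavern's validation from your own wrapper script, is to print only the bullet list from the pykwalify error instead of the full traceback. This is a sketch that assumes the error-string format shown in the traceback above (the "Schema validation failed:\n - ..." layout); it is not part of Tavern's or pykwalify's API.

```python
import re


def summarize_schema_error(text):
    """Extract just the ' - ...' bullet lines from a pykwalify-style
    SchemaError string (format as seen in the traceback above)."""
    bullets = re.findall(r"^\s*-\s.+$", text, flags=re.M)
    return [line.strip().lstrip("- ") for line in bullets]


# Sample error text copied from the traceback in this issue.
err = (
    "Schema validation failed:\n"
    " - Cannot find required key 'name'. Path: '/includes/0'.\n"
    " - Key 'test_name' was not defined. Path: '/includes/0'."
)
for line in summarize_schema_error(err):
    print(line)
```

A wrapper can catch BadSchemaError, walk its `__cause__` chain to the pykwalify SchemaError, and feed `str(error)` through this helper to show four lines instead of five hundred.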
