
Specification validator #805

Open
jakubklimek opened this issue Nov 22, 2024 · 4 comments
Labels
idea (Ideas with no specific timeline)

Comments

@jakubklimek
Contributor

Currently, more and more things can go wrong with the specification due to user error and the lack of validation in the UIs. These errors then come up in various forms, e.g. #804.

It would be nice to have an extensible specification validator to warn about these situations.
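
To sketch what I have in mind (purely hypothetical types, nothing like this exists in the codebase yet), the validator could be a set of independently registered rules, each reporting findings with a severity:

```ts
// Hypothetical sketch only; none of these types exist in Dataspecer today.

type Severity = "error" | "warning" | "notice";

interface ValidationFinding {
  severity: Severity;
  message: string;
  /** IRI or internal id of the offending entity, if known. */
  subject?: string;
}

/** A single, self-contained check over the whole specification. */
interface ValidationRule<Spec> {
  id: string;
  validate(specification: Spec): ValidationFinding[];
}

/** Runs all registered rules and aggregates their findings. */
class SpecificationValidator<Spec> {
  private rules: ValidationRule<Spec>[] = [];

  register(rule: ValidationRule<Spec>): void {
    this.rules.push(rule);
  }

  validate(specification: Spec): ValidationFinding[] {
    return this.rules.flatMap((rule) => rule.validate(specification));
  }
}
```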

jakubklimek added the idea label on Nov 22, 2024
@sstenchlak
Member

This is quite a challenging question: what should we test, and how?

Verifying whether the backend works and can generate something could be handled during testing in the GitHub pipeline. I would probably prefer setting this up, as it would test a substantial part of the codebase.

I wouldn't focus on testing the user interface, as it changes frequently, and errors can typically be identified quickly during regular use of the application.

Additionally, we could spin up something similar to the CLI we had in the past and integrate it into the GitHub pipeline, which would follow the long-term vision of having a Dataspecer GitHub Action.
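
As a rough illustration only (the `generateAll` helper below is assumed, not an existing API, and this is not the old CLI), such a pipeline step could simply run all generators and fail the build when anything errors out:

```ts
// Hypothetical CLI entry point for a CI check.
// `generateAll` is an assumed helper that runs every generator over the
// specification and collects error messages instead of aborting.
declare function generateAll(specificationPath: string): Promise<string[]>;

async function main(): Promise<void> {
  const specificationPath = process.argv[2];
  const errors = await generateAll(specificationPath);
  for (const message of errors) {
    console.error(message);
  }
  // A non-zero exit code makes the GitHub Actions job fail.
  process.exit(errors.length > 0 ? 1 : 0);
}

main();
```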

@jakubklimek
Contributor Author

You are talking about testing the Dataspecer code. That is indeed needed as well. But the idea was aimed at providing the user with information about the specification they have created in Dataspecer: are there IRI conflicts? Missing labels? Missing definitions? XML Schema problems? etc.
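
For illustration, a single such check could be as simple as the following sketch (the entity shape here is made up, not the actual Dataspecer model):

```ts
// Hypothetical minimal model of a specification entity.
interface SpecificationEntity {
  iri: string;
  label?: Record<string, string>;      // language -> text
  definition?: Record<string, string>; // language -> text
}

/** Warn about entities that lack a label or a definition. */
function checkLabelsAndDefinitions(entities: SpecificationEntity[]): string[] {
  const warnings: string[] = [];
  for (const entity of entities) {
    if (!entity.label || Object.keys(entity.label).length === 0) {
      warnings.push(`Missing label: ${entity.iri}`);
    }
    if (!entity.definition || Object.keys(entity.definition).length === 0) {
      warnings.push(`Missing definition: ${entity.iri}`);
    }
  }
  return warnings;
}
```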

@sstenchlak
Member

Okay, it seems you're suggesting not to test the code itself but rather to focus on validating user input, such as the provided IRIs, or ensuring there are no inconsistencies.

If that's the case, a more "correct" approach might be to let the generators report errors, warnings, and notices, because an error in one use case may be a correct approach in another. For example: (not) providing cardinalities for relations in vocabularies, not using IRIs, etc.
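
A sketch of that idea (hypothetical types, not the current generator interface): each generator reports findings, and the same situation can get a different severity depending on the generator and use case:

```ts
// Hypothetical sketch; the real generator interfaces look different.

type Severity = "error" | "warning" | "notice";

interface GeneratorFinding {
  severity: Severity;
  message: string;
}

// Example: a missing cardinality would block a structure-oriented generator
// but would only be a notice for a vocabulary generator. The generator name
// checked here is made up for illustration.
function reportMissingCardinality(
  generator: string,
  relationIri: string,
): GeneratorFinding {
  const strict = generator === "xml-schema";
  return {
    severity: strict ? "error" : "notice",
    message: `Relation ${relationIri} has no cardinality defined.`,
  };
}
```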

@jakubklimek
Contributor Author

I agree. However, this list of warnings/errors should be available while we are working on the specification, not only when we try to generate it. So it would either mean something like running the generators again and again, or adding a "validate" mode to the generators for this purpose. And it is not just the generators: e.g., IRI conflicts are not detected by any generator, but they can be detected by a Dataspecer core validation service, the CME, etc.
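
For example, an IRI-conflict check that would live in such a core validation service rather than in any generator could look roughly like this (hypothetical, minimal data shape):

```ts
/** Hypothetical check: report IRIs that are used by more than one entity. */
function findIriConflicts(entities: { id: string; iri: string }[]): string[] {
  const entityIdsByIri = new Map<string, string[]>();
  for (const entity of entities) {
    const ids = entityIdsByIri.get(entity.iri) ?? [];
    ids.push(entity.id);
    entityIdsByIri.set(entity.iri, ids);
  }
  return [...entityIdsByIri.entries()]
    .filter(([, ids]) => ids.length > 1)
    .map(([iri, ids]) => `IRI conflict: ${iri} is used by ${ids.join(", ")}`);
}
```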
