Documentation Lacks Information on the YAML Validation #128

Open
jjscheel opened this issue May 12, 2023 · 2 comments
Labels
documentation Improvements or additions to documentation

Comments

@jjscheel

As near as I can tell, the documentation does not explain how to determine whether the input YAML files have been validated. The "Usage Examples" section shows output from the tool, but it does not mention the "No errors..." message.

Some questions we might want to answer:

  • Is the "No errors..." message the only indication of success, or are there other, better ways to tell?
  • Is validation performed per HART, and must this message therefore be checked for every HART?
  • What happens in a multi-HART situation where HARTs 0, 2, and 3 are fine but HART 1 is bad? Does the tool complete and provide all results, or does it end at the first failure?
  • What are common error messages?
  • Given an error, what is the common way to correct it?
  • What is the approach for handling multiple errors? Should the first one be corrected and the tool re-run, or should you attempt to fix them all?
  • Is there a way to get more information for an error, such as enabling a verbose mode?

This is not a critical gap, but it would be extremely nice to address, given that this tool will be called out for explicit use in the "RISC-V Compatible Trademark Permission Process" described on the RISC-V Branding Guidelines & Materials web page.

@jjscheel added the documentation label on May 12, 2023
@pawks (Collaborator) commented May 15, 2023

* Is the error the only indication of success or are there other, better ways to tell?

Currently there are two indications of success: the absence of errors, and a log message stating "Dumping normalised yaml ...".

* Is validation performed per HART, and must this message therefore be checked for every HART?

Yes, validation is per HART, although some checks happen at the overall level too (such as: at least one hart must have hart id 0).
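As a rough sketch of that split (hypothetical field names and rules, not riscv-config's actual implementation), per-HART validation plus one overall check might look like:

```python
# Illustrative sketch only -- hypothetical field names, not riscv-config's code.
def validate_spec(spec):
    """Run one overall check, then per-hart checks, collecting all errors."""
    errors = []
    hart_ids = spec.get("hart_ids", [])
    # Overall check: at least one hart must have hart id 0.
    if 0 not in hart_ids:
        errors.append("hart_ids: at least one hart must have hart id 0")
    # Per-hart checks: every listed hart needs its own, well-formed node.
    for hid in hart_ids:
        node = spec.get(f"hart{hid}")
        if node is None:
            errors.append(f"hart{hid}: missing hart node")
        elif "ISA" not in node:
            errors.append(f"hart{hid}: ISA: required field not provided")
    return errors

spec = {"hart_ids": [0, 1], "hart0": {"ISA": "RV64IMAFDC"}}
print(validate_spec(spec))  # hart1 is listed but has no node
```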

* What happens in a multi-HART situation where HARTs 0, 2, and 3 are fine but HART 1 is bad? Does the tool complete and provide all results, or does it end at the first failure?

The tool generally reports all the errors in the input YAML during a single run. Each error is printed in the log with the key hierarchy of the offending field, so it is easy to identify which field in the input YAML caused the error.
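The key-hierarchy idea can be sketched with a toy walker (not the tool's actual reporting format): every error is tagged with the full path of keys leading to the offending field, and all errors from one run are collected rather than stopping at the first.

```python
# Toy sketch: tag each error with its full key path; collect everything.
def find_errors(node, path=""):
    errors = []
    if isinstance(node, dict):
        for key, value in node.items():
            child = f"{path}/{key}" if path else key
            errors += find_errors(value, child)
    elif node is None:  # stand-in for "field fails its rule"
        errors.append(f"{path}: value missing")
    return errors

doc = {"hart0": {"ISA": "RV64IMAFDC", "mstatus": {"reset-val": None}},
       "hart1": {"ISA": None}}
print(find_errors(doc))
# ['hart0/mstatus/reset-val: value missing', 'hart1/ISA: value missing']
```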

* What are common error messages?

This is a difficult question to answer. Each CSR has different rules, as defined by the spec. Anything that violates the constraints imposed by the ISA spec is an error.

* Given an error, what is the common way to correct it?

The only correct way to fix an error is to verify that the definition provided is indeed in line with the spec. The spec is the golden reference for any errors related to the definitions.

* What is the approach for handling multiple errors? Should the first one be corrected and the tool re-run, or should you attempt to fix them all?

This is more of a user preference than a requirement/recommendation.

* Is there a way to get more information for an error, such as enabling verbose mode or something?

The logging verbosity can be controlled using --verbose debug on the CLI.

@neelgala (Collaborator) commented

@jjscheel this seems like a good FAQ section that could be added to the docs. Would you like to volunteer to start a PR? We can add more as necessary.
