
Workflow for robot specific integration tests

Marcel Stemmeler edited this page Jul 26, 2023 · 2 revisions

This wiki page describes a general step-by-step approach for adding new robot-specific integration tests. This is most useful when adding or adjusting a robot system and should always be done with care, as these tests are the main way of ensuring a robot system functions properly. Please don't rush these steps, and remember why we create tests in the first place.

  1. If you haven't already, create a new feature branch for your changes.

  2. Go to the directory OpenRobertaServer/src/test/resources/crossCompilerTests/common/template.

  3. Add new Template:
    The usual approach is to copy an existing template that uses the same start block as your robot system. The template should have the same name as the ".properties" file of your robot system.

  4. Add to testSpec.yml:
    Add your robot system to testSpec.yml; the required entries are self-explanatory when compared with the other robot systems already listed there.
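
    The authoritative schema is whatever the existing entries in testSpec.yml use, so copy one of those. Purely as an illustration (the key names and robot name below are made up, not taken from the real file), an entry could look something like:

    ```yaml
    # Illustrative sketch only -- mirror the exact structure of an
    # existing robot entry in testSpec.yml instead of this.
    robots:
      myrobot:             # must match the name of your ".properties" file
        template: myrobot  # template added in the previous step
    ```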

  5. Go to /robotSpecific and add a new directory for your robot, again give it the same name as your ".properties" file.

  6. This is where you will create your test programs. To make a new test program, simply create it in the Open Roberta Lab, export it and add it to this directory.

    Note: When creating tests, write multiple test files that together cover every aspect of your robot system. Every block, except for some general blocks such as math and logic blocks, needs to appear in at least one test for the integration tests to pass. One test XML could cover sound blocks, while another tests driving blocks. All tests should also be testable on the real robot, so don't just add every block; instead, try to make the functionality as observable as possible when executing on the real robot.

  7. Go to /_expected/robotSpecific and, in each test type, add a directory with the same name as your ".properties" file.
    In these, create a file with the correct file ending for each of your XML files in /robotSpecific/<your robot>. These files can be created manually, but an easier way using IntelliJ is explained in the following steps.
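
    Putting steps 2 to 7 together, and assuming a hypothetical robot named "myrobot" with hypothetical test programs drive.xml and sound.xml, the layout under src/test/resources/crossCompilerTests could look roughly like this (the actual test-type directories and file endings depend on your robot and on what already exists in /_expected):

    ```
    crossCompilerTests/
    ├── common/
    │   └── template/
    │       └── myrobot.xml            # template from step 3
    ├── robotSpecific/
    │   └── myrobot/                   # exported test programs from step 6
    │       ├── drive.xml
    │       └── sound.xml
    └── _expected/
        └── robotSpecific/
            └── <test type>/
                └── myrobot/           # expected files from step 7
                    ├── drive.<ending>
                    └── sound.<ending>
    ```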

  8. The differences between the expected and generated files can be checked using an IntelliJ plugin called "Validation-File Comparison".

    Install it and add the following in the plugin settings:

    Output-File Directory Path: target/unitTests/_expected
    Validation-File Directory Path: src/test/resources/crossCompilerTests/_expected

  9. To quickly check the integration tests added, run the file ReuseIntegrationAsUnitTest.java.

    Using the default keyboard shortcut, simply press SHIFT + ALT + V to open the validation-file comparison.

    Now look for your new files.

    When running this for the first time, carefully check that the generated output on the left side is correct, then copy it over to the right side. Remember, the right side is what the file is supposed to look like, so adding wrong code here can make debugging later mistakes very difficult.

  10. To check if all blocks are used, run TestToolboxBlocksAreUsedInTestFiles.java.

  11. Now, when these tests finish without any errors, run the integration tests using Maven with: mvn clean install -PrunIT

Remember: in order to run the integration tests, you need to point the environment variable robot_crosscompiler_resourcebase at your local checkout of the ora-cc-rsc repository.
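
On Linux or macOS, setting the variable and running the tests can look like the following (the path is a placeholder for your own checkout):

```shell
# Point the cross-compiler resources at your local ora-cc-rsc checkout
# (placeholder path -- adjust to your machine).
export robot_crosscompiler_resourcebase=/path/to/ora-cc-rsc

# Run the integration tests from the repository root.
mvn clean install -PrunIT
```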

  12. If everything is green, you are done. Good job!