
Simplify the process of adding a new metric #5

Open
niketagrawal opened this issue Feb 16, 2022 · 4 comments
Assignees: maxspahn
Labels: documentation (Improvements or additions to documentation), enhancement (New feature or request)

Comments

@niketagrawal
Collaborator

No description provided.

@maxspahn
Collaborator

maxspahn commented Feb 16, 2022

@niketagrawal
Collaborator Author

niketagrawal commented Mar 15, 2022

Developer notes on the current implementation:

  • To create a new metric, inherit from the metric class and implement the computemetric() method in metrics.py (see the sketch after this list).
  • To use a metric for benchmarking a specific type of robot, it must be instantiated for that robot type in the postprocessor script, for example:
self._metrics['time2Goal'] = TimeToReachGoalMetric(
    "time2Goal",
    ["q0", "q1", "goal_0_0", "goal_1_0", "t"],
    {"m": 2, "des_distance": self._experiment.primeGoal().epsilon()},
)
- The first argument is a string representing the type of metric. It does not need to be passed as an argument here; it could be an instance attribute of the metric class.
- The second argument is the list of columns in res.csv that must be selected for postprocessing. The chosen columns differ for each combination of robot type and metric, which currently requires a unique instantiation per combination; this could be avoided by renaming the columns in res.csv to represent the parameter type rather than the specific variable names.
- The third argument comes from experiment.py; this could be standardized in the same manner.
  • Requirements for creating a new metric

    • Knowledge of the computation logic for the metric; this is needed to implement the computemetric() method.
    • Knowledge of how the new metric will be instantiated for a particular type of robot; this is needed to specify which columns from res.csv are required for the metric computation.
  • The arguments used to instantiate the metrics in the postprocessor script can be standardized so that they can be embedded in the metric class itself.

  • self.experiment.robot_type() can be used to construct the second and third arguments of the class instantiation.

  • A new metric may not apply to all robot types; it depends on what the metric computes.
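
A minimal sketch of what such a metric could look like, assuming a base class shaped roughly like the one in metrics.py; the base class below is a stand-in, PathLengthMetric is purely illustrative, and the real method name and constructor signature may differ.

import numpy as np

# Stand-in for the base class in metrics.py; the actual names and
# signatures may differ.
class Metric:
    def __init__(self, name, required_columns, params):
        self._name = name
        self._required_columns = required_columns
        self._params = params

    def computeMetric(self, df):
        raise NotImplementedError


class PathLengthMetric(Metric):
    """Illustrative metric: accumulated joint-space path length."""

    def computeMetric(self, df):
        # df is assumed to be a pandas DataFrame read from res.csv and
        # restricted to self._required_columns (e.g. ["q0", "q1"]).
        positions = df[self._required_columns].to_numpy()
        steps = np.diff(positions, axis=0)
        return float(np.linalg.norm(steps, axis=1).sum())

If the columns in res.csv were renamed to represent parameter types as suggested above, the required_columns list could be derived per robot type inside the class instead of being passed in by the postprocessor.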

@niketagrawal
Collaborator Author

niketagrawal commented Mar 15, 2022

Expected functionality (tentative): a metric class should be instantiated for a particular type of robot based on the arguments passed in the postprocessor command. Currently, all metric classes are instantiated for all robot types in advance.
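
A minimal sketch of that behaviour, assuming a hypothetical --metrics flag on the postprocessor command and metric classes that can be constructed from the experiment alone; the flag name, import path, and constructors are assumptions, not the current interface.

import argparse

# Assumed import path; TimeToReachGoalMetric exists in metrics.py,
# PathLengthMetric is the illustrative metric sketched above.
from metrics import TimeToReachGoalMetric, PathLengthMetric

AVAILABLE_METRICS = {
    "time2Goal": TimeToReachGoalMetric,
    "pathLength": PathLengthMetric,
}

def selected_metrics(experiment, argv=None):
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--metrics", nargs="+", choices=AVAILABLE_METRICS, default=["time2Goal"]
    )
    args = parser.parse_args(argv)
    # Only the requested metrics are instantiated; each metric derives its
    # columns and parameters from the experiment, as suggested above.
    return {name: AVAILABLE_METRICS[name](experiment) for name in args.metrics}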

@maxspahn
Collaborator

maxspahn commented Apr 2, 2022

This could be done with the registry pattern. Similar to #25 for planners.
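
A minimal sketch of such a registry for metrics, loosely following the decorator-based registration proposed for planners in #25; names and signatures are illustrative, not the project's actual API.

METRIC_REGISTRY = {}

def register_metric(name):
    """Class decorator that makes a metric discoverable by name."""
    def decorator(cls):
        METRIC_REGISTRY[name] = cls
        return cls
    return decorator

@register_metric("time2Goal")
class TimeToReachGoalMetric:
    def __init__(self, experiment):
        # Column selection and parameters (e.g. the desired distance from
        # experiment.primeGoal().epsilon()) would be derived here instead
        # of being passed in by the postprocessor.
        self._experiment = experiment

    def computeMetric(self, df):
        raise NotImplementedError  # computation logic for the metric

# The postprocessor then only needs the metric's name:
# metric = METRIC_REGISTRY["time2Goal"](experiment)

With this in place, adding a new metric would only require defining the class and registering it; the postprocessor itself would not need to change.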

c-salmi assigned c-salmi and maxspahn and unassigned c-salmi on Jul 26, 2022
c-salmi added the enhancement (New feature or request) and documentation (Improvements or additions to documentation) labels on Jul 26, 2022