
feat(eval): add framework tip with documentation link #934

Merged 1 commit into main on Feb 10, 2025

Conversation

@youngbeom-shin (Member) commented on Feb 10, 2025

What this PR does:

Add explanatory text and a link to the evaluation framework documentation under the framework selection, to help users understand the differences between frameworks.

Which issue(s) this PR fixes:

Fixes #

Type of changes
Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Documentation update

Feature/Issue validation/testing:

Please describe the tests that you ran to verify your changes and relevant result summary.

  • Test A

  • Test B

  • Logs

Special notes for your reviewer:

Checklist:

  • I have added unit/e2e tests that prove my fix is effective or that this feature works.
  • I have commented my code, particularly in hard-to-understand areas.
  • I have made corresponding changes to the documentation.
  • I have reviewed my own code and ensured that it follows the project's style guidelines.

Release note:


MR Summary:

The summary is added by @codegpt.

This Merge Request introduces a new feature that enhances the user interface by adding explanatory text and a documentation link under the framework selection in the evaluation section. This aims to assist users in understanding the differences between various evaluation frameworks. Key updates include:

  1. Addition of a paragraph with a link to the evaluation framework documentation in the NewEvaluation.vue component.
  2. Updates to the English and Chinese localization files (evaluation.js) to add new text entries for the explanatory text and link description (both changes are sketched below).
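Since the diff itself is not reproduced in this thread, here is a minimal sketch of what the described change could look like. The i18n key names (`framework_tip`, `framework_tip_link`), the CSS class, and the documentation URL are illustrative assumptions, not values taken from the PR.

```vue
<!-- NewEvaluation.vue (sketch): explanatory paragraph under the framework selector.
     Key names, class, and URL are placeholders, not the actual ones from this PR. -->
<p class="framework-tip">
  {{ $t('evaluation.framework_tip') }}
  <a href="https://example.com/docs/evaluation-frameworks" target="_blank" rel="noopener">
    {{ $t('evaluation.framework_tip_link') }}
  </a>
</p>
```

```js
// locales/en/evaluation.js (sketch): corresponding new text entries.
// The Chinese locale file would gain the same keys with translated values.
export default {
  // ...existing entries
  framework_tip: 'Evaluation frameworks differ in supported tasks, datasets, and metrics.',
  framework_tip_link: 'See the evaluation framework documentation for details.',
}
```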

@zhendi (Collaborator) left a comment

LGTM

@starship-github

MR Evaluation:

This feature is still under testing; evaluations are generated by AI and might be inaccurate.

After evaluation, the code changes in this Merge Request received a score of 90.

Analysis for the evaluation score:
  • The code change may not include corresponding unit tests (a hedged test sketch follows this list).
  • The code change may not include corresponding integration tests.
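To illustrate the kind of unit test the evaluation flags as missing, here is a minimal sketch assuming Vitest and @vue/test-utils (neither is confirmed by this PR) and the placeholder i18n keys from the sketch above; mounting the real component may require additional props or stubs.

```js
// NewEvaluation.spec.js (sketch): verifies the framework tip and its link render.
import { describe, it, expect } from 'vitest'
import { mount } from '@vue/test-utils'
import NewEvaluation from '@/components/NewEvaluation.vue'

describe('NewEvaluation framework tip', () => {
  it('renders the explanatory text and documentation link', () => {
    const wrapper = mount(NewEvaluation, {
      global: {
        // Stub i18n so $t returns the key itself, making assertions deterministic.
        mocks: { $t: (key) => key },
      },
    })
    expect(wrapper.text()).toContain('evaluation.framework_tip')
    expect(wrapper.find('a').attributes('href')).toBeTruthy()
  })
})
```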
Tips

CodeReview Commands (invoked as MR or PR comments)

  • @codegpt /review to trigger a code review.
  • @codegpt /evaluate to trigger the code evaluation process.
  • @codegpt /describe to regenerate the summary of the MR.
  • @codegpt /secscan to scan for security vulnerabilities in the MR or the repository.
  • @codegpt /help to get help.

CodeReview Discussion Chat

There are two ways to chat with Starship CodeReview:

  • Review comments: Directly reply to a review comment made by StarShip.
    Example:
    • @codegpt How to fix this bug?
  • Files and specific lines of code (under the "Files changed" tab):
    Tag @codegpt in a new review comment at the desired location with your query.
    Examples:
    • @codegpt generate unit testing code for this code snippet.

Note: Be mindful of the bot's finite context window.
It's strongly recommended to break down tasks such as reading entire modules into smaller chunks.
For a focused discussion, use review comments to chat about specific files and their changes, instead of using the MR/PR comments.

CodeReview Documentation and Community

  • Visit our Documentation for detailed information on how to use Starship CodeReview.

About Us:

Visit the OpenCSG StarShip website for the Dashboard and detailed information on CodeReview, CodeGen, and other StarShip modules.

@hiveer merged commit b74b64a into main on Feb 10, 2025 (3 checks passed).
@hiveer deleted the csghub__add-framework-tip-with-documentation-link branch on February 10, 2025 at 09:38.
@starship-github

The StarShip CodeReviewer was triggered but terminated because it encountered an issue: The MR state is not opened.

