
Clean and enhance evaluators that work on AnnData #22

Open
mengerj opened this issue Jan 24, 2025 · 1 comment
Labels
enhancement New feature or request

Comments

@mengerj (Owner) commented Jan 24, 2025

Description of feature

Evaluation approaches:

Clustering quality:
  • How well do the embeddings cluster by biological information versus by batch?
  • Compare against the raw data and/or the initial embeddings.
  • Use the scib evaluator (a clustering-score sketch follows this list).

Annotation quality:
  • On a new dataset, if both the data and the context are embedded, how often is the annotation correct?
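A minimal sketch of what such a clustering evaluation could look like, assuming the embedding is stored in `adata.obsm` and that cell-type and batch annotations live in `adata.obs`. The key names and the helper itself are illustrative, not this repository's actual API; the scib package offers a fuller metric suite, while this sketch uses plain scanpy/scikit-learn instead:

```python
import scanpy as sc
from sklearn.metrics import adjusted_rand_score, silhouette_score


def clustering_scores(adata, embedding_key="X_emb", label_key="cell_type", batch_key="batch"):
    """Score how well an embedding separates biological labels vs. batch.

    Higher label scores and lower batch scores indicate an embedding that
    captures biology while mixing batches.
    """
    emb = adata.obsm[embedding_key]

    # Leiden clustering on a kNN graph built from the embedding.
    sc.pp.neighbors(adata, use_rep=embedding_key)
    sc.tl.leiden(adata, key_added="eval_leiden")

    return {
        # Agreement of clusters with biological labels (higher is better).
        "ari_label": adjusted_rand_score(adata.obs[label_key], adata.obs["eval_leiden"]),
        # Agreement of clusters with batch (lower is better).
        "ari_batch": adjusted_rand_score(adata.obs[batch_key], adata.obs["eval_leiden"]),
        # Silhouette of the biological labels in embedding space.
        "silhouette_label": silhouette_score(emb, adata.obs[label_key]),
    }
```

The same function could be run on the raw data or the initial embeddings by pointing `embedding_key` at a different `.obsm` slot, which gives the comparison baseline mentioned above.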
@mengerj added the enhancement label on Jan 24, 2025
@mengerj (Owner, Author) commented Feb 20, 2025

Create a workflow function that takes the test datasets and a model name as input and applies all relevant evaluation functions to every dataset (see the sketch below).
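One possible shape for that workflow, assuming the evaluation functions share an `(adata, model_name)` signature; the function name, parameter names, and the usage example are placeholders, not the project's actual interface:

```python
import anndata as ad


def evaluate_model(model_name, dataset_paths, evaluators):
    """Apply every evaluator to every test dataset for one model.

    Parameters are illustrative: `dataset_paths` maps dataset name to an
    .h5ad path, and `evaluators` maps metric name to a callable taking
    (adata, model_name) and returning a dict of scores.
    """
    results = {}
    for name, path in dataset_paths.items():
        adata = ad.read_h5ad(path)
        results[name] = {
            metric: fn(adata, model_name) for metric, fn in evaluators.items()
        }
    return results


# Hypothetical usage:
# scores = evaluate_model(
#     "my-model",
#     {"pbmc": "data/pbmc_test.h5ad"},
#     {"clustering": lambda adata, m: clustering_scores(adata)},
# )
```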
