[WIP] feat: add multimodal fewshot evaluation #365
Open
Luodian wants to merge 1 commit into main from feat/multimodal_fewshot_eval
Changes from all commits: lmms_eval/api/task.py
@@ -1034,19 +1034,14 @@ def concat_tar_parts(tar_parts, output_tar):
         if "create_link" in dataset_kwargs:
             dataset_kwargs.pop("create_link")
 
-        if "load_from_disk" in dataset_kwargs and dataset_kwargs["load_from_disk"]:
-            dataset_kwargs.pop("load_from_disk")
-            # using local task in offline environment, need to process the online dataset into local format via
-            # `ds = load_datasets("lmms-lab/MMMU")`
-            self.dataset = datasets.load_from_disk(path=self.DATASET_PATH, name=self.DATASET_NAME)
-        else:
-            self.dataset = datasets.load_dataset(
-                path=self.DATASET_PATH,
-                name=self.DATASET_NAME,
-                download_mode=datasets.DownloadMode.REUSE_DATASET_IF_EXISTS,
-                download_config=download_config,
-                **dataset_kwargs if dataset_kwargs is not None else {},
-            )
+        # Check if the key exists first
+        self.dataset = datasets.load_dataset(
+            path=self.DATASET_PATH,
+            name=self.DATASET_NAME,
+            download_mode=datasets.DownloadMode.REUSE_DATASET_IF_EXISTS,
+            download_config=download_config,
+            **dataset_kwargs if dataset_kwargs is not None else {},
+        )
 
         if self.config.process_docs is not None:
             for split in self.dataset:
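Note on this hunk: `datasets.load_dataset` already reuses the local cache when `download_mode=REUSE_DATASET_IF_EXISTS`, which is presumably why the separate `load_from_disk` branch was dropped. A minimal standalone sketch of the resulting call; the dataset path, name, and `download_config` values here are illustrative, not from this PR (in the task they come from `self.DATASET_PATH`, `self.DATASET_NAME`, and the task's `dataset_kwargs`):

import datasets

# Illustrative values, standing in for the task's attributes.
download_config = datasets.DownloadConfig(max_retries=3)
dataset_kwargs = {}

ds = datasets.load_dataset(
    path="lmms-lab/MMMU",
    name=None,
    download_mode=datasets.DownloadMode.REUSE_DATASET_IF_EXISTS,  # reuse cache if present
    download_config=download_config,
    **(dataset_kwargs if dataset_kwargs is not None else {}),
)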
@@ -1114,6 +1109,7 @@ def fewshot_context(
         apply_chat_template: bool = False,
         fewshot_as_multiturn: bool = False,
         chat_template: Optional[Callable] = None,
+        is_multimodal: bool = False,
     ) -> str:
         """Returns a fewshot context string that is made up of a prepended description
         (if provided), the `num_fewshot` number of examples, and an appended prompt example.
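Only the `is_multimodal` keyword is new in this hunk. For readability, the full signature after the change would look roughly like the sketch below; the earlier parameters are inferred from the surrounding diff context and from upstream lm-evaluation-harness, so treat them as an approximation:

from typing import Callable, Optional

def fewshot_context(
    self,
    doc: dict,
    num_fewshot: int,
    system_instruction: Optional[str] = None,
    apply_chat_template: bool = False,
    fewshot_as_multiturn: bool = False,
    chat_template: Optional[Callable] = None,
    is_multimodal: bool = False,  # new in this PR
) -> str:
    ...

Because the flag defaults to False, existing call sites keep their old behavior.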
@@ -1162,48 +1158,73 @@ def fewshot_context(
 
         # if few-shot - append examples after the system prompt
         if num_fewshot > 0:
-            if apply_chat_template:
-                labeled_examples.extend(self.sampler.get_chat_context(doc, num_fewshot, fewshot_as_multiturn))
-            else:
-                labeled_examples += self.sampler.get_context(doc, num_fewshot)
+            if is_multimodal is False:
+                if apply_chat_template:
+                    labeled_examples.extend(self.sampler.get_chat_context(doc, num_fewshot, fewshot_as_multiturn))
+                else:
+                    labeled_examples += self.sampler.get_context(doc, num_fewshot)
+            else:
+                if apply_chat_template:
+                    labeled_examples_text, labeled_examples_multimodal = self.sampler.get_multimodal_chat_context(doc, num_fewshot, fewshot_as_multiturn)
+                    labeled_examples.extend(labeled_examples_text)
+                else:
+                    labeled_examples_text, labeled_examples_multimodal = self.sampler.get_multimodal_context(doc, num_fewshot)
+                    labeled_examples += labeled_examples_text
 
         example = self.doc_to_text(doc)
-        if apply_chat_template:
-            if self.multiple_input:
-                return chat_template(labeled_examples)
-            if isinstance(example, str):
-                self.append_target_question(labeled_examples, example, fewshot_as_multiturn)
-            # for loglikelihood create a list of questions with appended choices
-            elif isinstance(example, list):
-                labeled_examples_list = []
-                # copy chat history for each example and append the answer
-                for ex in example:
-                    chat = copy.deepcopy(labeled_examples)
-                    self.append_target_question(chat, ex, fewshot_as_multiturn)
-                    labeled_examples_list.append(chat_template(chat))
-                return labeled_examples_list
-            # if example is an integer, append the choice or convert to string
-            elif isinstance(example, int):
-                if self.config.doc_to_choice is not None:
-                    choices = self.doc_to_choice(doc)
-                    self.append_target_question(labeled_examples, choices[example], fewshot_as_multiturn)
-                else:
-                    self.append_target_question(labeled_examples, str(example), fewshot_as_multiturn)
-            # return lm.apply_chat_template(labeled_examples)
-            return chat_template(labeled_examples)
-        else:
-            if self.multiple_input:
-                return labeled_examples
-            if isinstance(example, str):
-                return labeled_examples + example
-            elif isinstance(example, list):
-                return [labeled_examples + ex for ex in example]
-            elif isinstance(example, int):
-                if self.config.doc_to_choice is not None:
-                    choices = self.doc_to_choice(doc)
-                    return labeled_examples + choices[example]
-                else:
-                    return labeled_examples + str(example)
+        if is_multimodal is False:
+            if apply_chat_template:
+                if self.multiple_input:
+                    return chat_template(labeled_examples)
+                if isinstance(example, str):
+                    self.append_target_question(labeled_examples, example, fewshot_as_multiturn)
+                # for loglikelihood create a list of questions with appended choices
+                elif isinstance(example, list):
+                    labeled_examples_list = []
+                    # copy chat history for each example and append the answer
+                    for ex in example:
+                        chat = deepcopy(labeled_examples)
+                        self.append_target_question(chat, ex, fewshot_as_multiturn)
+                        labeled_examples_list.append(chat_template(chat))
+                    return labeled_examples_list
+                # if example is an integer, append the choice or convert to string
+                elif isinstance(example, int):
+                    if self.config.doc_to_choice is not None:
+                        choices = self.doc_to_choice(doc)
+                        self.append_target_question(labeled_examples, choices[example], fewshot_as_multiturn)
+                    else:
+                        self.append_target_question(labeled_examples, str(example), fewshot_as_multiturn)
+                # return lm.apply_chat_template(labeled_examples)
+                return chat_template(labeled_examples)
+            else:
+                if self.multiple_input:
+                    return labeled_examples
+                if isinstance(example, str):
+                    return labeled_examples + example
+                elif isinstance(example, list):
+                    return [labeled_examples + ex for ex in example]
+                elif isinstance(example, int):
+                    if self.config.doc_to_choice is not None:
+                        choices = self.doc_to_choice(doc)
+                        return labeled_examples + choices[example]
+                    else:
+                        return labeled_examples + str(example)
+        else:
+            if apply_chat_template:
+                raise NotImplementedError("Multimodal chat template not implemented yet")
+            else:
+                if self.multiple_input:
+                    return labeled_examples, labeled_examples_multimodal
+                if isinstance(example, str):
+                    return labeled_examples + example, labeled_examples_multimodal
+                elif isinstance(example, list):
+                    return [labeled_examples + ex for ex in example]
+                elif isinstance(example, int):
+                    if self.config.doc_to_choice is not None:
+                        choices = self.doc_to_choice(doc)
+                        return labeled_examples + choices[example], labeled_examples_multimodal
+                    else:
+                        return labeled_examples + str(example), labeled_examples_multimodal
 
     def apply_filters(self):
         if hasattr(self, "_filters"):

Review comment on lines +1167 to +1172:
Since is_multimodal is always False, as it is not passed in when calling the fewshot_context function.
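Editor's note: the diff assumes two new sampler methods, `get_multimodal_context` and `get_multimodal_chat_context`, each returning a `(text, visuals)` pair, but their implementations are not shown in this PR. A hypothetical sketch of the contract they imply, with invented field names:

from typing import Any, Dict, List, Sequence, Tuple

def get_multimodal_context(fewshot_docs: Sequence[Dict[str, Any]]) -> Tuple[str, List[Any]]:
    """Hypothetical sketch only: return the few-shot text and its visuals separately."""
    text, visuals = "", []
    for shot in fewshot_docs:
        # "question"/"answer"/"images" are assumed field names; real tasks
        # would derive the strings via doc_to_text/doc_to_target
        text += shot["question"] + " " + shot["answer"] + "\n\n"
        visuals.extend(shot.get("images", []))
    return text, visuals

# usage
docs = [{"question": "Q: What is shown?", "answer": "A: a cat", "images": ["<PIL.Image>"]}]
ctx_text, ctx_visuals = get_multimodal_context(docs)

Under this contract, fewshot_context concatenates the text exactly as before and threads the visuals back to the caller as the second element of the returned tuple.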
Based on my understanding of the changes to this fewshot_context call, is_multimodal is currently always False? (see lmms-eval/lmms_eval/api/task.py, lines 441 to 449 at c5abe57)
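The cited lines are not reproduced in this thread. Paraphrasing the kind of call site the comment refers to (not the exact code at that revision), the flag is simply never supplied, so it keeps its default:

# Paraphrase of the in-repo call site; argument names are approximate.
fewshot_ctx = self.fewshot_context(
    doc,
    0 if self.config.num_fewshot is None else self.config.num_fewshot,
    system_instruction,
    apply_chat_template,
    fewshot_as_multiturn,
    chat_template,
    # is_multimodal is not passed here, so it stays False
)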
Yes, it's not called in the current lmms-eval standard pipeline, but another project calls it by importing from lmms_eval. The actual code is the following.
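The snippet itself did not render in this thread. A hypothetical reconstruction of what such an external call could look like; the class path assumes the module layout matches upstream, and all argument values are illustrative:

# Hypothetical external usage; not the author's actual code.
from lmms_eval.api.task import ConfigurableTask

def build_multimodal_context(task: ConfigurableTask, doc: dict) -> tuple:
    # With is_multimodal=True and no chat template, fewshot_context
    # returns (text_context, visuals) per the diff above.
    return task.fewshot_context(
        doc,
        num_fewshot=2,
        is_multimodal=True,
    )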
So I set it to WIP and hope we can refine this PR together lol.
I think Fanyi @pufanyi did similar things many months ago.