
Add starcoder past-kv shape for TSModelForCausalLM class #371

Merged
merged 7 commits into from
Sep 26, 2023

Conversation

changwangss
Contributor

@changwangss changwangss commented Jul 7, 2023

What does this PR do?

This PR adds support for the bigcode/starcoder past-kv shape during generation; please check and review. @echarlaix
Here is the related PR in optimum: huggingface/optimum#1170
I couldn't find a tiny test model; could you help us create one if necessary?
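For reviewers unfamiliar with the layout difference, here is a minimal sketch of how the per-layer past-kv shapes differ between model families. This is not code from the PR; the helper name is made up, and the layouts are written from my understanding of the Transformers implementations (gpt_bigcode uses multi-query attention with a single fused key/value tensor per layer):

```python
def past_kv_shapes(model_type, batch, num_heads, seq_len, d_k, num_layers):
    """Per-layer past_key_values shapes for a few decoder families (sketch)."""
    if model_type == "gpt_bigcode":
        # Multi-query attention: key and value share a single head and are
        # fused into one tensor per layer, with no per-head dimension.
        return [(batch, seq_len, 2 * d_k)] * num_layers
    if model_type == "bloom":
        # Bloom flattens batch and heads; the key is transposed vs. the value.
        return [((batch * num_heads, d_k, seq_len),
                 (batch * num_heads, seq_len, d_k))] * num_layers
    # Standard decoders: separate key and value tensors per layer.
    return [((batch, num_heads, seq_len, d_k),
             (batch, num_heads, seq_len, d_k))] * num_layers
```

This is why the generic dummy-input code path (which assumes the standard `(batch, num_heads, seq_len, d_k)` pair) needs a dedicated branch for `gpt_bigcode`.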

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

@changwangss changwangss requested a review from echarlaix July 7, 2023 09:55
@HuggingFaceDocBuilderDev

HuggingFaceDocBuilderDev commented Jul 7, 2023

The documentation is not available anymore as the PR was closed or merged.

Collaborator

@echarlaix echarlaix left a comment


Thanks for the addition!

@@ -273,13 +273,17 @@ def forward(
num_attention_heads = self.normalized_config.num_attention_heads
hidden_size = self.normalized_config.hidden_size
d_k = hidden_size // num_attention_heads

if self.config.model_type != "bloom":
if self.config.model_type == "gpt_bigcode":

Could you add a test to verify inference is behaving as expected? (Using INCModelForCausalLM can be enough for now.)

@hshen14
Copy link
Collaborator

hshen14 commented Jul 23, 2023

@echarlaix could you please review and merge if no more comments? thx

@echarlaix
Copy link
Collaborator

> @echarlaix could you please review and merge if no more comments? thx

Sure, can a test be added before we merge?

@changwangss
Copy link
Contributor Author

changwangss commented Sep 25, 2023

When I use INCModelForCausalLM to get the traced model, the following error is raised. Could you please help? @echarlaix @PenghuiCheng

model_id = "hf-internal-testing/tiny-random-GPTBigCodeModel"
model = INCModelForCausalLM.from_pretrained(model_id, export=True)
Traceback (most recent call last):
  File "/home/changwa1/optimum-intel/tests/generation/tmp.py", line 10, in <module>
    model = INCModelForCausalLM.from_pretrained(model_id, export=True)
  File "/home/changwa1/anaconda3/envs/rc1/lib/python3.9/site-packages/optimum/modeling_base.py", line 372, in from_pretrained
    return from_pretrained_method(
  File "/home/changwa1/chang_optimum-intel/optimum/intel/neural_compressor/modeling_base.py", line 240, in _from_transformers
    traced_model = jit_trace(model, task, use_cache)
  File "/home/changwa1/chang_optimum-intel/optimum/intel/generation/modeling.py", line 74, in jit_trace
    model_inputs = prepare_jit_inputs(model, task, use_cache)
  File "/home/changwa1/chang_optimum-intel/optimum/intel/generation/modeling.py", line 55, in prepare_jit_inputs
    dummy_inputs = onnx_config.generate_dummy_inputs(framework="pt")
KeyError: 'past_key_values'

Here is my debugging so far:
If we want to generate dummy_inputs with past-kv, onnx_config = onnx_config_class(model.config, use_past_in_inputs=True) is necessary for the gpt_bigcode model, but with export=True that flag is missed and onnx_config = onnx_config_class(model.config) is used instead:

onnx_config = onnx_config_class(model.config)

After switching to onnx_config = onnx_config_class(model.config, use_past=True, use_past_in_inputs=True), the root cause was identified by @PenghuiCheng:
the model_inputs generated by the ONNX config are already correct, so why should we reformat the model_inputs? @jiqing-feng
https://github.com/huggingface/optimum-intel/blob/main/optimum/intel/generation/modeling.py#L55

When I get the traced model with INCModelForCausalLM.from_pretrained(model_id, export=True) at
commit id e0c7486, the example_inputs from the ONNX config include position_ids, but the TSModelForCausalLM forward doesn't support it.

@echarlaix
Collaborator

> Here is my debugging so far: if we want to generate dummy_inputs with past-kv, onnx_config = onnx_config_class(model.config, use_past_in_inputs=True) is necessary for the gpt_bigcode model, but with export=True that flag is missed and onnx_config = onnx_config_class(model.config) is used instead.

Yes, you're right. Since huggingface/optimum#1358, use_past defines whether the model outputs will have past_key_values, and use_past_in_inputs defines whether the model should expect past_key_values as inputs (this is to dissociate the decoder from the decoder_with_past). To enable the past_key_values to be generated when needed, we need to set use_past_in_inputs=use_cache.
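In other words, the flag logic looks roughly like this (a sketch assuming an optimum-style constructor; `build_onnx_config` is a hypothetical helper, not optimum API):

```python
def build_onnx_config(onnx_config_class, config, use_cache):
    """Instantiate an ONNX config so dummy inputs match the traced model's signature."""
    if use_cache:
        # use_past alone only controls the outputs; use_past_in_inputs must be
        # set explicitly so generate_dummy_inputs() includes past_key_values.
        return onnx_config_class(config, use_past=True, use_past_in_inputs=True)
    return onnx_config_class(config)
```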

> When I get the traced model with INCModelForCausalLM.from_pretrained(model_id, export=True) at commit id e0c7486, the example_inputs from the ONNX config include position_ids, but the TSModelForCausalLM forward doesn't support it.

huggingface/optimum#1381 introduces position_ids for some architectures to fix inference; TSModelForCausalLM will need to be updated accordingly (happy to take care of this if needed!)
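For context, position_ids are usually derived from the attention mask when not passed explicitly: the cumulative count of attended tokens minus one, sliced to the new tokens during cached decoding. A generic sketch (not the exact optimum implementation; here padding positions are clamped at 0, and a single sequence is represented as a list of 0/1 mask values):

```python
def make_position_ids(attention_mask, past_length=0):
    """Derive position ids from a 0/1 attention mask for one sequence.

    Returns the positions for the tokens after past_length, which is
    what a decoder with a KV cache feeds in at each generation step.
    """
    positions = []
    count = 0
    for m in attention_mask:
        count += m
        # Positions count attended tokens; left-padding keeps position 0.
        positions.append(max(count - 1, 0))
    return positions[past_length:]
```

This is why a forward that drops position_ids breaks left-padded batches: the default arange positions no longer line up with the attended tokens.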

Collaborator

@echarlaix echarlaix left a comment


Thanks a lot for this integration @changwangss, could you add a test before we merge?

@changwangss
Copy link
Contributor Author

changwangss commented Sep 26, 2023

I would like to add a unit test, but TSModelForCausalLM's forward needs to support position_ids first, so I added that in order to write the test Ella mentioned.

Signed-off-by: changwangss <[email protected]>
@echarlaix echarlaix merged commit fc71567 into huggingface:main Sep 26, 2023
10 checks passed