fix(dsp): in `HFClientVLLM`, actually use kwargs in the payload instead of discarding them #719
Conversation
Thanks for the additions @remiconnesson! I believe this may not be needed, as HFClientVLLM uses the kwargs through inheritance from HF, which inherits from LM. Feel free to highlight any inconsistencies in this kwargs handling.
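The inheritance argument above can be sketched as follows. This is a minimal, hypothetical reconstruction (the real classes live in `dsp.modules` and are more involved): kwargs handling inherited from `LM` only helps if each subclass `__init__` actually forwards `**kwargs` up the chain.

```python
# Simplified, hypothetical class bodies illustrating the inheritance chain.
class LM:
    def __init__(self, model, **kwargs):
        # LM collects generation kwargs in one place (defaults are illustrative).
        self.kwargs = {"temperature": 0.0, "max_tokens": 150, **kwargs}

class HFModel(LM):
    pass

class HFClientVLLM(HFModel):
    def __init__(self, model, **kwargs):
        # The inherited kwargs handling only works if this __init__
        # forwards **kwargs instead of dropping them.
        super().__init__(model, **kwargs)

client = HFClientVLLM("some-model", n=1)
```

If `HFClientVLLM.__init__` discarded `**kwargs` instead of forwarding them, `client.kwargs` would never see the caller's `n=1`.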
LGTM!

kwargs are necessary for guided completion. But from the link you showed me, only a subset of the kwargs are used. The commit on setting the model is not necessary but, this being said, I don't see any reason to discard all of the kwargs and not pass them to the LLM call.
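A hedged sketch of why the discarded kwargs matter for guided completion: vLLM's OpenAI-compatible server accepts extra request fields such as `guided_choice` for guided decoding, so those kwargs must survive into the request payload. The payload shape and model name below are illustrative, not the actual client code.

```python
# Illustrative only: extra kwargs such as vLLM's guided_choice must be
# unpacked into the request payload, not discarded.
init_kwargs = {"temperature": 0.0, "max_tokens": 50}
call_kwargs = {"guided_choice": ["positive", "negative"]}

payload = {
    "model": "some-model",  # hypothetical model name
    "prompt": "Classify: great movie!",
    **init_kwargs,
    **call_kwargs,  # dropping these silently breaks guided completion
}
```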
got it! could you also run
c88d6e9 to e74c361
@arnavsinghvi11 it's done :)

Thanks @remiconnesson!
Hi @arnavsinghvi11, can you confirm whether this commit has been released? Will reinstalling the DSPy package solve the kwargs issue with HFClientVLLM? This does not seem to work with HFClientVLLM as of now, and I get multiple outputs for a classification task even after setting n=1 or top_p=1 in the config.
Hi @manjubsavanth, this should be resolved after pulling and reinstalling DSPy with this latest commit.
@arnavsinghvi11 thanks, I have installed dspy by pulling from git, but am getting the below error:
@manjubsavanth indeed, I pushed a fix here: #762
Hi @remiconnesson, I continue to get the error. I have installed the package using "%pip install git+https://github.com/stanfordnlp/dspy.git". Does it need a merge with main?
Hi @manjubsavanth, the fix is pushed but indeed not yet merged with main.
If you can't wait for the merge, you can apply it yourself in the meantime: open the following file (or wherever the new env is on your machine) and apply the change you see there:
Hi @remiconnesson, now I get the below error. I built a wheel locally with the changes you had suggested.
In the `dsp.modules.HFClientVLLM` class, the `kwargs` are supposed to be passed to the vLLM OpenAI endpoint (cf. the docs) but are discarded instead. This is related to the issue here: #718

In this PR, there are some modifications to the `dsp.modules.HFClientVLLM` class:
- In `__init__`, the `kwargs` were actually unused. In this PR those are attached to `self.kwargs` instead of being discarded.
- The `_generate` method used only its own `kwargs`, disregarding the value initialized in `__init__`. This is fixed in this PR.
- Two parameters (`temperature` and `max_tokens`) in the `_generate` method were made mandatory because of direct `kwargs["<parameter>"]` access, which would raise `KeyError`. In this PR, those two dictionary accesses have been deleted.
- In the `_generate` method, the kwargs were never actually passed to the vLLM endpoint. In this PR we fix this by unpacking the `kwargs` into the payload so that the extra arguments reach the endpoint.
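The fixed behavior described above can be sketched as follows. Class and method names here are simplified stand-ins (the real implementation lives in `dsp.modules` and builds an HTTP request): init-time kwargs are kept on `self`, call-time kwargs are merged over them, and the result is unpacked into the payload instead of being discarded.

```python
# Simplified sketch of the fix (hypothetical class/method names).
class HFClientVLLMSketch:
    def __init__(self, model, **kwargs):
        self.model = model
        self.kwargs = kwargs  # previously these were silently dropped

    def _build_payload(self, prompt, **kwargs):
        # Call-time kwargs take precedence over init-time ones.
        merged = {**self.kwargs, **kwargs}
        # No direct kwargs["temperature"] / kwargs["max_tokens"] lookups,
        # so missing parameters no longer raise KeyError.
        return {"model": self.model, "prompt": prompt, **merged}

client = HFClientVLLMSketch("some-model", temperature=0.2)
payload = client._build_payload("Hello", max_tokens=16)
```

With this shape, both the kwargs given at construction time and those given per call end up in the request body sent to the endpoint.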