Given the broad interest in machine learning, we have already thought about a number of use cases involving traditional ML tasks like translation and validation as a Proxeus node. With the current explosion of support for large language model projects, I would like to see if there's value in connecting this to our workflow.
A simple Proxeus LLM node would accept a {input.prompt} and provide an {output.generated} value that can be embedded in the workflow, e.g. used inside of a document. The system administrator would provide an LLM_API_KEY and LLM_API_URL or other options in the environment settings. The node would connect to an API service from a provider like Hugging Face or Argo, or even an LLM proxy to obtain a response.
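To make the idea concrete, here is a minimal sketch of what the node's core logic could look like. The `LLM_API_KEY` and `LLM_API_URL` settings come from the proposal above; the payload shape and response field are assumptions modeled on the Hugging Face Inference API and would differ for other providers or proxies.

```python
import json
import os
import urllib.request

def build_request(prompt, api_url, api_key):
    """Assemble an HTTP request for a generic LLM inference endpoint.

    The {"inputs": ...} payload shape follows the Hugging Face Inference
    API; this is an assumption, not a fixed part of the proposal.
    """
    payload = {"inputs": prompt}
    return urllib.request.Request(
        api_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def generate(prompt):
    """Map the node's {input.prompt} to an {output.generated} value."""
    req = build_request(
        prompt,
        os.environ["LLM_API_URL"],   # set by the system administrator
        os.environ["LLM_API_KEY"],
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    # Response shape varies by provider; Hugging Face text-generation
    # models return a list of {"generated_text": ...} objects.
    return body[0]["generated_text"]
```

The returned string would then be exposed to the workflow as `{output.generated}`, so it can be embedded in a document template like any other workflow variable.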