This application exposes the Ollama interface data format as a proxy in front of the OpenAI data format. Many proxies already exist that translate the OpenAI format for other models; for the broadest coverage, this project suggests integrating with OneAPI, which supports essentially all models, although the project currently hits parsing errors with the response formats of some OneAPI models. Many tools now support Ollama, but Ollama itself only runs local models. If you want those tools to use online models without deploying any local model for Ollama, you can use this proxy to do so.
For example:
By integrating OneAPI here, I can use my locally aggregated model platform and customize the model names exposed to clients; the mapping can later be repointed at other models.
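The core idea — accepting an Ollama-shaped request, remapping the model name, and forwarding it in OpenAI's shape — can be sketched roughly as below. This is a minimal illustration, not the project's actual code; the function names, the `MODEL_MAP` table, and the specific model names in it are all assumptions for the example.

```python
# Illustrative sketch (not the project's real code): translate an Ollama
# /api/chat request body into an OpenAI /v1/chat/completions body, with a
# configurable model-name mapping, and translate the response back.

# Hypothetical mapping from the model name a client requests via the
# Ollama-style API to the upstream OpenAI-compatible / OneAPI model.
MODEL_MAP = {
    "llama3": "gpt-4o-mini",  # example entry, purely illustrative
}

def ollama_to_openai(body: dict) -> dict:
    """Convert an Ollama chat request into OpenAI chat-completions format."""
    model = body.get("model", "")
    return {
        # Fall back to the requested name when no mapping is configured.
        "model": MODEL_MAP.get(model, model),
        "messages": body.get("messages", []),
        "stream": body.get("stream", False),
    }

def openai_to_ollama(resp: dict, model: str) -> dict:
    """Convert a non-streaming OpenAI response back into Ollama's shape."""
    content = resp["choices"][0]["message"]["content"]
    return {
        "model": model,
        "message": {"role": "assistant", "content": content},
        "done": True,
    }
```

A proxy built this way lets any Ollama-aware tool talk to an online, OpenAI-compatible backend: the tool sees only the customized model names, while the mapping decides which upstream model actually serves the request.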