Ollama support in AI Positron Enterprise

mstr
Posts: 3
Joined: Tue Dec 14, 2021 3:26 pm

Ollama support in AI Positron Enterprise

Post by mstr »

Hi,

my company has set up an Ollama server that hosts several LLMs, e.g. Mistral. Having learned that Ollama is partially compatible with the OpenAI API, I tried to connect to the Ollama server using the OpenAI connector in the AI Positron Enterprise add-on. Although authorization succeeds, every action I try to perform with the add-on returns HTTP 404.
However, in a response to this post feature-request/topic26840.html, Radu mentioned that it is possible to use the AI Positron Enterprise add-on with Ollama/Mistral.

So my questions are:

Is it possible to connect to an Ollama server (not running locally on localhost) with the OpenAI connector in AI Positron Enterprise? If so, what is the required configuration, and is it possible to change the LLM used by the add-on? (As I mentioned, the Ollama server hosts several LLMs.)

And if it's not possible to use Ollama with the OpenAI connector, are you planning to support Ollama or other open-source AI solutions in the future?

Thank you in advance.

Magda
Radu
Posts: 9205
Joined: Fri Jul 09, 2004 5:18 pm

Re: Ollama support in AI Positron Enterprise

Post by Radu »

Hi Magda,
I'm attaching a screenshot showing exactly how I configured Oxygen AI Positron Enterprise to connect to my localhost Ollama server and use the "llama3" model.
ollama-config.png
If you have Ollama on a separate HTTP server and you cannot connect to it from the Oxygen AI Positron Enterprise add-on, this may simply be a connectivity problem. For example, a company-specific proxy web server might block locally installed applications from reaching that particular HTTP address.
So yes, connecting Oxygen AI Positron Enterprise to Ollama and using different models should be possible.
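For readers setting this up themselves, here is a minimal sketch of how a request to an Ollama server's OpenAI-compatible API is shaped. The host name is hypothetical; Ollama listens on port 11434 by default, and the model is selected per request via the "model" field of the JSON body:

```python
import json

# Hypothetical Ollama server address; adjust to your own setup.
# Ollama listens on port 11434 by default.
BASE_URL = "http://ollama.example.com:11434"

def build_chat_request(model, prompt):
    """Build the URL and JSON body for Ollama's OpenAI-compatible
    chat completions endpoint. The model is chosen per request."""
    url = BASE_URL.rstrip("/") + "/v1/chat/completions"
    body = {
        "model": model,  # any model pulled on the server, e.g. "mistral"
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(body)

url, payload = build_chat_request("mistral", "Hello")
print(url)
```

Switching models is then just a matter of changing the "model" value to another model installed on the server.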
That said, from what I tested with the "llama3" model, most of our built-in actions did not work properly with it: the generated DITA XML content was incomplete. In my opinion, "llama3" is an inferior model to GPT 3.5 and 4, so you might not derive much benefit from using it to work with DITA XML content. Mistral worked a bit better, but I had problems with most of our pre-defined actions there as well.

Regards,
Radu
Radu Coravu
<oXygen/> XML Editor
http://www.oxygenxml.com
mstr
Posts: 3
Joined: Tue Dec 14, 2021 3:26 pm

Re: Ollama support in AI Positron Enterprise

Post by mstr »

Thank you, Radu, for your reply. We finally managed to make it work. :) The v1/chat/completions part of the address, plus providing the correct model name in the config file, did the trick.
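For anyone hitting the same HTTP 404: the address configured in the OpenAI connector has to include the full OpenAI-compatible path, not just the server root. A minimal sketch of the difference (the host below is hypothetical):

```python
# Hypothetical Ollama server; only the path differs between the two URLs.
base = "http://ollama.example.com:11434"

wrong = base                            # server root -> the add-on gets HTTP 404
right = base + "/v1/chat/completions"   # OpenAI-compatible chat endpoint

print(right)
```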

Thanks again

Magda
Radu
Posts: 9205
Joined: Fri Jul 09, 2004 5:18 pm

Re: Ollama support in AI Positron Enterprise

Post by Radu »

Hi Magda,
Great, I'm glad this works for you now!
Regards,
Radu
Radu Coravu
<oXygen/> XML Editor
http://www.oxygenxml.com