Ollama support in AI Positron Enterprise
Are you missing a feature? Request its implementation here.
- Posts: 3
- Joined: Tue Dec 14, 2021 3:26 pm
Ollama support in AI Positron Enterprise
Hi,
My company has set up an Ollama server that features several LLMs, e.g. Mistral. Having learned that Ollama is partially compatible with the OpenAI API, I tried to connect to the Ollama server using the OpenAI connector in the AI Positron Enterprise add-on. Although the authorization is successful, the response I get when trying to perform an action using the add-on is HTTP 404.
However, in his response to this post feature-request/topic26840.html, Radu mentioned that it is possible to use the AI Positron Enterprise add-on with Ollama/Mistral.
So my questions are:
Is it possible to connect to an Ollama server (not running locally on localhost) with the OpenAI connector in AI Positron Enterprise? If so, what is the required configuration, and is it possible to change the LLM used by the add-on (as I mentioned before, the Ollama server features several LLMs)?
And if it's not possible to use Ollama with the OpenAI connector, are you planning to support Ollama or other open source AI solutions in the future?
Thank you in advance.
Magda
- Posts: 9424
- Joined: Fri Jul 09, 2004 5:18 pm
Re: Ollama support in AI Positron Enterprise
Hi Magda,
I'm attaching a screenshot showing exactly how I configured Oxygen AI Positron Enterprise to connect to my localhost Ollama server and use the "llama3" model.
ollama-config.png
If you have Ollama on a separate HTTP server and cannot connect to it from the Oxygen AI Positron Enterprise add-on, this might just be a connection problem. For example, a company-specific proxy web server may in general prevent locally installed applications from connecting to that particular HTTP address.
So yes, connecting Oxygen AI Positron Enterprise to Ollama and using different models should be possible.
From what I tested with the "llama3" model, though, most of our built-in actions did not work properly with it; the generated DITA XML content was incomplete. In my opinion, "llama3" is an inferior model to GPT 3.5 and 4, so you might not derive much benefit from using it to work with DITA XML content. Working with Mistral was a bit better, but again I had problems with most of our pre-defined actions as well.
Regards,
Radu
You do not have the required permissions to view the files attached to this post.
Radu Coravu
<oXygen/> XML Editor
http://www.oxygenxml.com
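As a rough sketch (not from Radu's post): before pointing the connector at a remote Ollama server, it can help to confirm the server is reachable and see which model names it offers. Ollama exposes an OpenAI-compatible model listing under /v1/models; the host below is hypothetical, so replace it with your own server's address.

```python
# Hypothetical host; replace with your Ollama server's address.
base_url = "http://ollama.example.com:11434"

# Ollama lists its installed models under the OpenAI-compatible
# /v1/models endpoint - a quick way to confirm available model names.
models_url = f"{base_url}/v1/models"

print(models_url)

# To actually perform the request (requires network access to the server):
# import urllib.request, json
# with urllib.request.urlopen(models_url) as resp:
#     print(json.load(resp))
```

Whatever model name appears in that listing is the name to put in the add-on's configuration.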
- Posts: 3
- Joined: Tue Dec 14, 2021 3:26 pm
Re: Ollama support in AI Positron Enterprise
Thank you, Radu, for your reply. We finally managed to make it work.
Appending the v1/chat/completions part to the server address, plus providing the correct model name in the config file, did the trick.
Thanks again
Magda
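For anyone hitting the same HTTP 404: a minimal sketch of the request shape involved (the host and the "mistral" model name below are hypothetical examples). Ollama's OpenAI-compatible chat endpoint lives under /v1/chat/completions, which is why that path segment has to be part of the address the connector uses, and the "model" field in the request body selects which of the server's installed models handles the request.

```python
import json

# Hypothetical host; replace with your Ollama server's address.
base_url = "http://ollama.example.com:11434"

# The OpenAI-compatible chat endpoint - missing this path yields HTTP 404.
endpoint = f"{base_url}/v1/chat/completions"

# "model" must match one of the models installed on the Ollama server.
payload = {
    "model": "mistral",
    "messages": [{"role": "user", "content": "Hello"}],
}

print(endpoint)
print(json.dumps(payload))
```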