This plugin provides a simple interface for sending a query to Ollama and streaming the response back as it arrives, without leaving Amplenote.
Ollama is an app that runs large language models locally and exposes a local HTTP API for generation. Responses are streamed as they are produced, and generation happens entirely on the local machine.
Ollama must be started with the Amplenote plugin origin allowed (so the browser's CORS checks permit the request): `OLLAMA_ORIGINS=https://plugins.amplenote.com ollama serve`
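For reference, Ollama's `/api/generate` endpoint streams newline-delimited JSON objects, each carrying a `response` text fragment and a `done` flag on the final chunk. A minimal sketch of assembling such a stream into the full reply (shown in Python for clarity; the actual plugin runs as JavaScript inside Amplenote, and the model name below is just an example):

```python
import json

def collect_stream(lines):
    """Concatenate the "response" fragments from Ollama's
    newline-delimited JSON stream, stopping when "done" is true."""
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Example chunks, shaped like a POST /api/generate stream
# (model name "llama3" is illustrative):
stream = [
    '{"model":"llama3","response":"Hello","done":false}',
    '{"model":"llama3","response":", world","done":false}',
    '{"model":"llama3","response":"!","done":true}',
]
print(collect_stream(stream))  # → Hello, world!
```

Because each chunk arrives as soon as the model emits it, the plugin can display partial output immediately rather than waiting for the complete response.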