
Tutorial: Using Ollama or Remote LLM 🤖

How to use Ollama with the useAI hook.

👉 Make sure the Ollama plugin is installed.

Install Ollama

Download Ollama from the official site: https://ollama.com/download

Then pull the LLM you want to run:

ollama pull llama3.2
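To check that the model was downloaded and is available locally, you can list your installed models with the standard Ollama CLI:

ollama list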

Usage

The comment //exec: node is always required for the response to stream correctly.

Stream the response to the console

//exec: node
const streamResponse = true;
await useLocalAI("llama3.2", streamResponse, "Create a short story");

Wait for the result to chain it into another task

//exec: node
const streamResponse = false;
const result = await useLocalAI("llama3.2", streamResponse, "Translate in French this sentence: Hello everyone.");
print(result);
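For example, you can chain the first result into a second prompt. A minimal sketch using the same useLocalAI hook, assuming the non-streaming call returns the response text as a string, as the example above suggests (the prompts are purely illustrative):

//exec: node
const streamResponse = false;
// First task: generate a short story
const story = await useLocalAI("llama3.2", streamResponse, "Create a two-sentence story.");
// Second task: reuse the first result inside a new prompt
const summary = await useLocalAI("llama3.2", streamResponse, "Summarize this story in one sentence: " + story);
print(summary);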

OpenRouter

Use a remote LLM with OpenRouter. Create your API key here: 👉 https://openrouter.ai/settings/keys

//exec: node
const streamResponse = true;
const baseURL = "https://openrouter.ai/api/v1";
const apiKey = "API_KEY"; // replace with your OpenRouter API key
await useOpenAIApi(baseURL, apiKey, "sao10k/l3.3-euryale-70b", streamResponse, "Create a short story");
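To capture the response instead of streaming it, this sketch assumes useOpenAIApi behaves like useLocalAI and returns the result when streamResponse is false (an assumption, not confirmed by this recipe):

//exec: node
const streamResponse = false; // assumption: the result is returned when not streaming, as with useLocalAI
const baseURL = "https://openrouter.ai/api/v1";
const apiKey = "API_KEY"; // replace with your OpenRouter API key
const result = await useOpenAIApi(baseURL, apiKey, "sao10k/l3.3-euryale-70b", streamResponse, "Translate in French this sentence: Hello everyone.");
print(result);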
