langchain-hs-0.0.2.0: Haskell implementation of Langchain
Copyright: (c) 2025 Tushar Adhatrao
License: MIT
Maintainer: Tushar Adhatrao <[email protected]>
Stability: experimental
Safe Haskell: None
Language: Haskell2010

Langchain.LLM.Ollama

Description

Ollama implementation of LangChain's LLM interface, supporting:

  • Text generation
  • Chat interactions
  • Streaming responses
  • Callback integration

Example usage:

-- Create Ollama configuration
ollamaLLM = Ollama "llama3" [stdOutCallback]

-- Generate text
response <- generate ollamaLLM "Explain Haskell monads" Nothing
-- Right "Monads in Haskell..."

-- Chat interaction
let messages = UserMessage "What's the capital of France?" :| []
chatResponse <- chat ollamaLLM messages Nothing
-- Right "The capital of France is Paris."

-- Streaming
streamHandler = StreamHandler print (putStrLn "Done")
streamResult <- stream ollamaLLM messages streamHandler Nothing
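Taken together, the snippets above can be assembled into a minimal program. This is a sketch following the API shown in this page's examples (`generate`, `chat`); it assumes a local Ollama server is running and that the constructors shown above are exported:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Data.List.NonEmpty (NonEmpty (..))
import Langchain.LLM.Ollama

main :: IO ()
main = do
  -- Model name plus callbacks, as in the configuration example above
  let ollamaLLM = Ollama "llama3" [stdOutCallback]

  -- One-shot generation: Left on failure, Right on success
  eResp <- generate ollamaLLM "Explain Haskell monads" Nothing
  either (putStrLn . ("Error: " ++)) print eResp

  -- Chat with a non-empty message history
  let messages = UserMessage "What's the capital of France?" :| []
  eChat <- chat ollamaLLM messages Nothing
  either (putStrLn . ("Error: " ++)) print eChat
```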

Documentation

data Ollama Source #

Ollama LLM configuration. Contains:

  • Model name (e.g., "llama3:latest")
  • Callbacks for event tracking

Example:

>>> Ollama "nomic-embed" [logCallback]
Ollama "nomic-embed"

Constructors

Ollama 

Fields

Instances

Instances details
Show Ollama Source # 
Instance details

Defined in Langchain.LLM.Ollama

LLM Ollama Source #

Ollama implementation of the LLM typeclass. Note: the Params argument is currently ignored (see TODOs).

Example instance usage:

-- Generate text with error handling
result <- generate ollamaLLM "Hello" Nothing
case result of
  Left err -> putStrLn $ "Error: " ++ err
  Right res -> putStrLn res
Instance details

Defined in Langchain.LLM.Ollama

Associated Types

type LLMParams Ollama 
Instance details

Defined in Langchain.LLM.Ollama

Runnable Ollama Source # 
Instance details

Defined in Langchain.LLM.Ollama

Associated Types

type RunnableInput Ollama 
Instance details

Defined in Langchain.LLM.Ollama

type RunnableOutput Ollama 
Instance details

Defined in Langchain.LLM.Ollama
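Because Ollama is also Runnable, the same model can be driven through the generic Runnable entry point. A sketch, assuming `invoke` is the class method and that `RunnableInput`/`RunnableOutput` for Ollama are the chat history and reply text respectively, as the associated types suggest:

```haskell
askViaRunnable :: Ollama -> IO ()
askViaRunnable llm = do
  let msgs = UserMessage "Ping?" :| []
  -- invoke :: Runnable r => r -> RunnableInput r -> IO (Either String (RunnableOutput r))
  -- (assumed signature; check the Runnable class in langchain-hs)
  eOut <- invoke llm msgs
  either (putStrLn . ("Error: " ++)) print eOut
```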

type LLMParams Ollama Source # 
Instance details

Defined in Langchain.LLM.Ollama

type RunnableInput Ollama Source # 
Instance details

Defined in Langchain.LLM.Ollama

type RunnableOutput Ollama Source # 
Instance details

Defined in Langchain.LLM.Ollama

data OllamaParams Source #

OllamaParams contains the same fields as GenerateOps and ChatOps from ollama-haskell.

Constructors

OllamaParams 

Fields

Instances

Instances details
Show OllamaParams Source # 
Instance details

Defined in Langchain.LLM.Ollama

Eq OllamaParams Source # 
Instance details

Defined in Langchain.LLM.Ollama

defaultOllamaParams :: OllamaParams Source #

Default values for OllamaParams.
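defaultOllamaParams is meant as a base for record updates: override only the fields you need and pass the result as the optional params argument. A sketch; the field name `temperature` is an illustrative assumption mirroring GenerateOps/ChatOps in ollama-haskell:

```haskell
-- Hypothetical field name; see GenerateOps/ChatOps in ollama-haskell
-- for the actual record fields that OllamaParams mirrors.
customParams :: OllamaParams
customParams = defaultOllamaParams { temperature = Just 0.7 }

-- Passed as the final (Maybe params) argument to generate/chat:
-- generate ollamaLLM "Explain Haskell monads" (Just customParams)
```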