| Copyright | (c) 2025 Tushar Adhatrao |
|---|---|
| License | MIT |
| Maintainer | Tushar Adhatrao <[email protected]> |
| Stability | experimental |
| Safe Haskell | None |
| Language | Haskell2010 |
Langchain.LLM.Ollama
Description
Ollama implementation of LangChain's LLM interface, supporting:
- Text generation
- Chat interactions
- Streaming responses
- Callback integration
Example usage:
```haskell
-- Create Ollama configuration
ollamaLLM = Ollama "llama3" [stdOutCallback]

-- Generate text
response <- generate ollamaLLM "Explain Haskell monads" Nothing
-- Right "Monads in Haskell..."

-- Chat interaction
let messages = UserMessage "What's the capital of France?" :| []
chatResponse <- chat ollamaLLM messages Nothing
-- Right "The capital of France is Paris."

-- Streaming
let streamHandler = StreamHandler print (putStrLn "Done")
streamResult <- stream ollamaLLM messages streamHandler Nothing
```
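To make the streaming pattern concrete without requiring a running Ollama server, here is a self-contained sketch. The `StreamHandler` record and its field names below are stand-ins for illustration, not necessarily the library's actual definition:

```haskell
import Data.IORef (newIORef, modifyIORef', readIORef)

-- Stand-in for the library's StreamHandler: one action per token,
-- one action when the stream finishes. Field names are illustrative.
data StreamHandler = StreamHandler
  { onToken    :: String -> IO ()
  , onComplete :: IO ()
  }

-- Drive a handler over a list of chunks, mimicking a streamed response.
runStream :: StreamHandler -> [String] -> IO ()
runStream h chunks = mapM_ (onToken h) chunks >> onComplete h

main :: IO ()
main = do
  buf <- newIORef ""
  let handler = StreamHandler
        { onToken    = \t -> modifyIORef' buf (++ t)
        , onComplete = putStrLn "Done"
        }
  runStream handler ["Mon", "ads ", "in ", "Haskell"]
  readIORef buf >>= putStrLn  -- accumulated text from the chunks
```

The real `stream` function drives the handler with tokens as the model produces them; this sketch just replays a fixed chunk list through the same callback shape.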
Synopsis
- data Ollama = Ollama {}
- data OllamaParams = OllamaParams {}
- defaultOllamaParams :: OllamaParams
Documentation
data Ollama Source #

Ollama LLM configuration. Contains:
- Model name (e.g., "llama3:latest")
- Callbacks for event tracking
Example:
>>> Ollama "nomic-embed" [logCallback]
Ollama "nomic-embed"
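The callback list attached to the configuration is invoked on LLM events. As a self-contained sketch of that pattern (the `Event` constructors and callback shape here are assumptions for illustration, not the library's actual definitions):

```haskell
-- Stand-in event type and callbacks; names are illustrative only.
data Event = LLMStart | LLMEnd String | LLMError String

type Callback = Event -> IO ()

-- Pure rendering of an event, kept separate so it is easy to test.
describeEvent :: Event -> String
describeEvent LLMStart       = "LLM call started"
describeEvent (LLMEnd out)   = "LLM finished: " ++ out
describeEvent (LLMError err) = "LLM failed: " ++ err

-- A stdout-logging callback in the spirit of stdOutCallback.
stdOutCallback' :: Callback
stdOutCallback' = putStrLn . describeEvent

-- Fire every registered callback for an event.
notify :: [Callback] -> Event -> IO ()
notify cbs ev = mapM_ ($ ev) cbs

main :: IO ()
main = notify [stdOutCallback'] (LLMEnd "Paris")
```

Keeping callbacks as plain functions over an event type means several handlers (logging, metrics, tracing) can be attached to one `Ollama` value and fired together.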
Constructors
Ollama
Instances
data OllamaParams Source #
OllamaParams contains the same fields as GenerateOps and ChatOps from ollama-haskell.
Constructors
OllamaParams

Fields
Instances
Show OllamaParams Source #

Defined in Langchain.LLM.Ollama

Methods

showsPrec :: Int -> OllamaParams -> ShowS #

show :: OllamaParams -> String #

showList :: [OllamaParams] -> ShowS #

Eq OllamaParams Source #

Defined in Langchain.LLM.Ollama
defaultOllamaParams :: OllamaParams Source #
Default values for OllamaParams.
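Since OllamaParams mirrors the option records from ollama-haskell, the usual pattern is to start from defaultOllamaParams and override individual fields with record-update syntax. The sketch below uses a stand-in record with hypothetical field names to illustrate the pattern; consult the Fields list above for the actual names:

```haskell
-- Stand-in mirroring the defaultOllamaParams pattern; the field
-- names below are hypothetical, for illustration only.
data Params = Params
  { temperature :: Double
  , numCtx      :: Int
  } deriving (Show, Eq)

defaultParams :: Params
defaultParams = Params { temperature = 0.8, numCtx = 2048 }

-- Override only the fields you care about via record update;
-- every other field keeps its default value.
lowTempParams :: Params
lowTempParams = defaultParams { temperature = 0.2 }

main :: IO ()
main = print lowTempParams
```

This "defaults plus record update" style keeps call sites short and means new optional fields can be added to the params record without breaking existing code.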