Start Interactive Chat
Description
Run a conversational agent inside your R session. Tools execute as direct function calls; no MCP server is needed.
Usage
chat(provider = NULL, model = NULL, tools = NULL, session = NULL)
Arguments
provider: LLM provider: "anthropic", "openai", or "ollama". Defaults to the config value or "anthropic".

model: Model name. Defaults to the config value or the provider default.

tools: Character vector of tool names or categories to enable. Categories: file, code, r, data, web, git, chat, memory. Use "core" for file+code+git, or "all" for everything (default).

session: Session resume control. NULL (default) starts fresh, TRUE resumes the latest session, or a character session key resumes a specific session.
Value
The session object (invisibly).
Examples
# Start chatting with defaults from config
chat()

# Use a specific provider/model
chat(provider = "ollama", model = "llama3.2")

# Minimal tools for focused work
chat(tools = "core")
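The `session` argument described above can also be exercised from the examples; a short sketch (the session key shown is a hypothetical placeholder, not a real key):

```r
# Resume the most recent session
chat(session = TRUE)

# Resume a specific session by key
# (replace "my-session-key" with a key from your own saved sessions)
chat(session = "my-session-key")
```

Because `chat()` returns the session object invisibly, you can also capture it with `s <- chat()` for later inspection.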