Chat¶
The chat UI element provides an interactive chatbot interface for conversations. It can be customized with different models, including built-in AI models or custom functions.
- class marimo.ui.chat(model: Callable[[List[ChatMessage], ChatModelConfig], object], *, prompts: List[str] | None = None, on_message: Callable[[List[ChatMessage]], None] | None = None, show_configuration_controls: bool = False, config: ChatModelConfigDict | None = None)¶
A chatbot UI element for interactive conversations.
Example - Using a custom model.

You can define a custom chat model: a callable that takes in the history of messages and a configuration. The response can be an object, a marimo UI element, or plain text.

```python
def my_rag_model(messages, config):
    question = messages[-1].content
    docs = find_docs(question)
    prompt = template(question, docs, messages)
    response = query(prompt)
    if is_dataset(response):
        return dataset_to_chart(response)
    return response


chat = mo.ui.chat(my_rag_model)
```
Example - Using a built-in model.

You can use a built-in model from the mo.ai module:

```python
chat = mo.ui.chat(
    mo.ai.openai(
        "gpt-4o",
        system_message="You are a helpful assistant.",
    ),
)
```
Attributes.

- value: the current chat history

Initialization Args.

- model (Callable[[List[ChatMessage], ChatModelConfig], object]): a callable that takes in the chat history and returns a response
- prompts: optional list of prompts to start the conversation
- on_message: optional callback function to handle new messages
- show_configuration_controls: whether to show the configuration controls
- config: optional ChatModelConfigDict to override the default configuration; keys include:
  - max_tokens
  - temperature
  - top_p
  - top_k
  - frequency_penalty
  - presence_penalty
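For illustration, a config override might be built as a plain dict with the keys listed above. The values below are illustrative, not recommended defaults:

```python
# A dict matching the ChatModelConfigDict keys listed above.
# The specific values are arbitrary examples.
chat_config = {
    "max_tokens": 512,
    "temperature": 0.2,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
}
```

This dict would then be passed as `mo.ui.chat(my_model, config=chat_config)`.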
Public methods

Inherited from UIElement:

- form([label, bordered, loading, ...]): Create a submittable form out of this UIElement.
- send_message(message, buffers): Send a message to the element rendered on the frontend from the backend.

Inherited from Html:

- batch(**elements): Convert an HTML object with templated text into a UI element.
- center(): Center an item.
- right(): Right-justify.
- left(): Left-justify.
- callout([kind]): Create a callout containing this HTML element.
- style([style]): Wrap an object in a styled container.
Public Data Attributes:

Inherited from UIElement:

- value: The element’s current value.

Inherited from Html:

- text: A string of HTML representing this element.
Basic Usage¶
Here’s a simple example using a custom echo model:
```python
import marimo as mo


def echo_model(messages, config):
    return f"Echo: {messages[-1].content}"


chat = mo.ui.chat(echo_model, prompts=["Hello", "How are you?"])
chat
```
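Because a chat model is just a callable, you can exercise it outside the UI. The sketch below uses SimpleNamespace as a stand-in for the message objects marimo passes (assuming only the content and role attributes, which is what the real ChatMessage exposes):

```python
from types import SimpleNamespace


def echo_model(messages, config):
    # Echo the most recent message back to the user.
    return f"Echo: {messages[-1].content}"


# Call the model directly with stand-in message objects.
history = [SimpleNamespace(role="user", content="Hello")]
print(echo_model(history, config=None))  # Echo: Hello
```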
Using a Built-in AI Model¶
You can use marimo’s built-in AI models, such as OpenAI’s GPT:
```python
import marimo as mo

chat = mo.ui.chat(
    mo.ai.openai(
        "gpt-4",
        system_message="You are a helpful assistant.",
    ),
    show_configuration_controls=True,
)
chat
```
Accessing Chat History¶
You can access the chat history using the value attribute:

```python
chat.value
```

This returns a list of ChatMessage objects, each containing role and content attributes.
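As a sketch of working with the history, the snippet below uses a minimal dataclass with the same role/content shape as ChatMessage (the real objects come from marimo; this stand-in just illustrates the access pattern):

```python
from dataclasses import dataclass


@dataclass
class Message:
    # Stand-in for marimo's ChatMessage: same role/content attributes.
    role: str
    content: str


history = [
    Message(role="user", content="Hi"),
    Message(role="assistant", content="Hello!"),
    Message(role="user", content="What is marimo?"),
]

# Pull out just the user's messages, e.g. for logging or re-prompting.
user_messages = [m.content for m in history if m.role == "user"]
print(user_messages)  # ['Hi', 'What is marimo?']
```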
Custom Model with Additional Context¶
Here’s an example of a custom model that uses additional context:
```python
import marimo as mo


def rag_model(messages, config):
    question = messages[-1].content
    docs = find_relevant_docs(question)
    context = "\n".join(docs)
    prompt = f"Context: {context}\n\nQuestion: {question}\n\nAnswer:"
    response = query_llm(prompt, config)
    return response


mo.ui.chat(rag_model)
```
This example demonstrates how you can implement a Retrieval-Augmented Generation (RAG) model within the chat interface.
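The helpers in the example (find_relevant_docs, query_llm) are placeholders you would supply yourself. One minimal way to sketch the retrieval half, assuming a small in-memory corpus and toy word-overlap scoring rather than a real vector store:

```python
DOCS = [
    "marimo notebooks are stored as pure Python files.",
    "The chat UI element accepts a callable model.",
    "Slider elements produce numeric values.",
]


def find_relevant_docs(question, docs=DOCS, k=2):
    """Rank docs by word overlap with the question.

    A toy stand-in for a real embedding/vector-store lookup.
    """
    q_words = set(question.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]


print(find_relevant_docs("How does the chat element work?"))
```

In practice you would replace the overlap score with an embedding similarity search, but the interface (question in, ranked snippets out) is the same.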
Built-in Models¶
marimo provides several built-in AI models that you can use with the chat UI element.
```python
import marimo as mo

mo.ui.chat(
    mo.ai.openai(
        "gpt-4",
        system_message="You are a helpful assistant.",
        api_key="sk-...",
    ),
    show_configuration_controls=True,
)
```
```python
mo.ui.chat(
    mo.ai.anthropic(
        "claude-3-5-sonnet-20240620",
        system_message="You are a helpful assistant.",
        api_key="sk-...",
    ),
    show_configuration_controls=True,
)
```
- class marimo.ai.models.openai(model: str, *, system_message: str = 'You are a helpful assistant specializing in data science.', api_key: str | None = None, base_url: str | None = None)¶
OpenAI ChatModel
Args:

- model (str): The model to use. Available models can be found on the OpenAI models page.
- system_message (str): The system message to use.
- api_key (Optional[str]): The API key to use. If not provided, the API key will be retrieved from the OPENAI_API_KEY environment variable or the user’s config.
- base_url (Optional[str]): The base URL to use.
- class marimo.ai.models.anthropic(model: str, *, system_message: str = 'You are a helpful assistant specializing in data science.', api_key: str | None = None, base_url: str | None = None)¶
Anthropic ChatModel
Args:

- model (str): The model to use.
- system_message (str): The system message to use.
- api_key (Optional[str]): The API key to use. If not provided, the API key will be retrieved from the ANTHROPIC_API_KEY environment variable or the user’s config.
- base_url (Optional[str]): The base URL to use.