Building a Web Application with the agent#
In the previous parts of this series (here and here), you saw how to define tools and create an LLM-based agent capable of answering queries by calling those tools. This part demonstrates how to build an interactive web interface so end-users can interact with this functionality in a browser. Different frameworks can be used, as detailed below.
Creating a webapp#
First, you’ll set up the webapp framework and create the necessary infrastructure within Dataiku. Choose your preferred framework and follow the corresponding steps.
Dash applications can be created as Code webapps with the following steps:
Create a new webapp by clicking on </> > Webapps
Click +New webapp, choose Code webapp, then click the Dash button, select the An empty Dash app option, and give the webapp a meaningful name
In the Code env option of the Settings tabs, select the Python code environment with the packages defined in the prerequisites for this tutorial series
You’ll need to add the following packages specific to Dash to the code env
dash                       # tested with 2.18.2
dash-bootstrap-components  # tested with 1.6.0
Gradio applications run in Code Studios in Dataiku. To create a new application, follow the steps outlined in Gradio: your first web application. In short, the steps are:
Create a Code Studio template that includes a code env with the required packages defined in the prerequisites for this tutorial series
You’ll need to add the following packages specific to Gradio to the code env
gradio # tested with 3.48.0
Create a Code Studio based on the template
Add the full script provided at the end of this tutorial to the Code Studio in the gradio/app.py file
If you have access to the Code Studio’s workspace via a VSCode or JupyterLab block, you can see the full path of the file at /home/dataiku/workspace/code_studio-versioned/gradio/app.py
Voila applications run in Code Studios in Dataiku. To create a new application, follow the steps outlined in Voilà: your first web application. In short, the steps are:
Create a Code Studio template with JupyterLab Server and Voila blocks
When adding these blocks, include the required packages (duckduckgo_search==7.1.1) defined in the prerequisites in the Additional Python modules option
Create a Code Studio based on the template
Using the JupyterLab interface, add the full script provided at the end of this tutorial to the Code Studio in the code_studio-versioned/visio/app.ipynb file
Note
The predefined tools need to be present in a location accessible via code. You can place the file (available here for download) in </> > Libraries. You can find detailed instructions in the previous tutorial, which also explains why this approach is useful.
For similar reasons of modularity, helper functions common to the application scripts are also placed in a separate file. Specifically, the functions create_chat_session(), get_customer_details(), search_company_info(), and process_tool_calls() are included in utils.py (also available for download). It needs to be placed in the same location as tools.json, following the same steps.
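For orientation, here is a minimal, hypothetical sketch of what two of these helpers might look like. It is reconstructed from how the full scripts below use them, not taken from the downloadable utils.py; in particular, the system prompt text and the tool-registration step are assumptions. The webapp scripts import the helpers with from utils import create_chat_session, process_tool_calls.

import json

def create_chat_session(llm, project):
    """Hypothetical sketch: open a chat completion with a system prompt."""
    chat = llm.new_completion()
    chat.with_message("You are a helpful customer-information assistant.", role="system")
    # Assumption: the real utils.py also registers the tool definitions from
    # tools.json with the session; the exact call is in the downloadable file.
    return chat

def process_tool_calls(tool_calls):
    """Hypothetical sketch: dispatch the first tool call and return its output."""
    call = tool_calls[0]  # the scripts below handle one tool call per turn
    name = call["function"]["name"]
    args = json.loads(call["function"]["arguments"])
    if name == "get_customer_details":
        return json.dumps(get_customer_details(**args))
    if name == "search_company_info":
        return json.dumps(search_company_info(**args))
    return f"Unknown tool: {name}"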
Passing on the task to the agent#
After choosing your webapp framework, the crucial step is implementing the LLM agent functionality. This follows a consistent pattern: regardless of the framework, the chat session is defined the same way.
Similar to the agent in Part 2, the chat session is created by calling the create_chat_session() function, which sets up an LLM via the LLM Mesh with the system prompt.
The application sends the collected customer information to the agent (you’ll see how each framework gathers this input below). A loop then processes tool calls and responses until no more tool calls are needed, at which point the agent returns its final response.
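Stripped of framework-specific input handling, this shared loop (taken from the full scripts at the end of this tutorial) looks like the following:

chat = create_chat_session(llm, project)
chat.with_message("Tell me about the customer with ID fdouetteau", role="user")
while True:
    response = chat.execute()
    if not response.tool_calls:
        # No tool requested: this is the agent's final answer
        chat.with_message(response.text, role="assistant")
        break
    # Record the requested tool calls, run them, and feed the output back
    chat.with_tool_calls(response.tool_calls, role="assistant")
    tool_call_result = process_tool_calls(response.tool_calls)
    chat.with_tool_output(tool_call_result, tool_call_id=response.tool_calls[0]["id"])

Each framework below wraps this loop in its own input and output plumbing.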
Calling the agent#
The next step is connecting the user interface to the agent’s functionality. Here’s how each framework runs the agent.
Dash wires everything up with callbacks to process user queries. Using the @app.callback decorator, connect the button to a callback function that invokes the agent. The update_output() function lets the user enter a customer ID and click the button to trigger the agent, which processes the input via a chat session and returns the final response.
@app.callback(
    [Output("result", "value"), Output("chat-state", "data")],
    Input("search", "n_clicks"),
    [State("customer_id", "value"), State("chat-state", "data")],
    prevent_initial_call=True,
    running=[(Output("auto-toast", "is_open"), True, False),
             (Output("search", "disabled"), True, False)]
)
def update_output(n_clicks, customer_id, chat_state):
    """Callback function that handles agent interactions"""
    if not customer_id:
        return no_update, no_update
    # Create new chat session
    chat = create_chat_session(llm, project)
Gradio’s chat interface uses a similar function that processes the current message and all previous conversation turns. The chat_with_agent() function that calls the agent has two parameters:
message: the current user message
history: a list of (user, assistant) message tuples
The user input and conversation history are forwarded to the chat_with_agent() function. The agent then processes the input via a chat session and returns the final response.
def chat_with_agent(message, history):
    """Chat function that handles agent interactions"""
    chat = create_chat_session(llm, project)
    # Add history to chat context
    for user_msg, assistant_msg in history:
        chat.with_message(user_msg, role="user")
        chat.with_message(assistant_msg, role="assistant")
    chat.with_message(message, role="user")
In Voila, you’ll define a function process_agent_response() to deliver queries to the LLM via a chat session. It has two parameters: chat, which is the chat session, and query, which is the user’s message. The agent processes the user’s query via the chat session.
def process_agent_response(chat, query):
    """Process the agent's response and handle any tool calls"""
    chat.with_message(query, role="user")
Creating the layout#
Finally, to provide a UI for this agent functionality, you’ll build an interface with components that allow users to interact with it. Each framework offers its own approach.
The layout gathers the user input (e.g., a message with a customer ID) and passes it to the agent functions for each framework. The agent then returns the result to be displayed in the UI.
Create a Dash layout like Figure 1, consisting of an input Textbox for entering a customer ID and an output Textarea. The callback function described above takes the customer ID entered in the input Textbox, passes it to the agent via the chat session, and renders the final agent response in the output Textarea.
# Dash app layout
app.layout = html.Div([
    dbc.Row([html.H2("Using LLM Mesh with an agent in Dash")]),
    dbc.Row(dbc.Label("Please enter the ID of the customer:")),
    dbc.Row([
        dbc.Col(dbc.Input(id="customer_id", placeholder="Customer Id"), width=10),
        dbc.Col(dbc.Button("Search", id="search", color="primary"), width="auto")
    ], justify="between"),
    dbc.Row([dbc.Col(dbc.Textarea(id="result", style={"min-height": "500px"}), width=12)]),
    dbc.Toast(
        [html.P("Searching for information about the customer", className="mb-0"),
         dbc.Spinner(color="primary")],
        id="auto-toast",
        header="Agent working",
        icon="primary",
        is_open=False,
        style={"position": "fixed", "top": "50%", "left": "50%", "transform": "translate(-50%, -50%)"},
    ),
    dcc.Store(id="chat-state"),
    dcc.Store(id="step", data={"current_step": 0}),
], className="container-fluid mt-3")

Figure 1: LLM Agentic – webapp.#
Unlike Dash’s component-based approach, Gradio offers a more conversation-focused interface. Using its ChatInterface class, create a layout like Figure 1 that includes:
A chat message input field for queries
The conversation history, including the agent’s replies
Optional features like example prompts
app = gr.ChatInterface(
    fn=chat_with_agent,
    title="Customer Information Assistant",
    description="Ask me about customers using their ID ...",
    examples=["The id is fdouetteau",
              "Find out about id wcoyote",
              "who is customer tcook"]
)
app.launch(server_port=7860, root_path=browser_path)

Figure 1: LLM Agentic – webapp.#
Voila uses JupyterLab’s ipywidgets (imported here as widgets) to provide the UI for user interactions. The query_input widget provides a textbox to collect the user query, and a button triggers on_button_click(). That function calls process_agent_response() with the query and displays the returned message in the result widget.
# Create widgets
label = widgets.Label(value="Enter your query about a customer")
query_input = widgets.Text(
    placeholder="Tell me about customer fdouetteau",
    continuous_update=False,
    layout=widgets.Layout(width='50%')
)
result = widgets.HTML(value="")
button = widgets.Button(description="Ask")
# Create the chat session
chat = create_chat_session(llm, project)
def on_button_click(b):
    """Handle button click event"""
    query = query_input.value
    if query:
        try:
            response = process_agent_response(chat, query)
            result.value = f"<div style='white-space: pre-wrap;'>{response}</div>"
        except Exception as e:
            result.value = f"<div style='color: red'>Error: {str(e)}</div>"
button.on_click(on_button_click)
# Layout
display(widgets.VBox([
    widgets.HBox([label]),
    widgets.HBox([query_input, button]),
    widgets.HBox([result])
], layout=widgets.Layout(padding='20px')))

Figure 1: LLM Agentic – webapp.#
Conclusion#
You now have an application that:
Uses an LLM-based agent to process queries
Imports predefined tools to complement LLM capabilities
Provides a user-friendly web interface
You could enhance this interface by adding a history of previous searches (see the sketch below) or creating a more detailed and cleaner results display. This example provides a foundation for building more complex LLM-based browser applications, leveraging tool calls and webapp interfaces.
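As an illustration, a search-history enhancement for the Dash variant could look like the following hypothetical sketch; the history-store and history-list component ids and the record_search() callback are illustrative additions, not part of the tutorial code:

# Hypothetical: first add these two components to app.layout
#   dcc.Store(id="history-store", data=[]),
#   html.Ul(id="history-list"),
@app.callback(
    [Output("history-store", "data"), Output("history-list", "children")],
    Input("search", "n_clicks"),
    [State("customer_id", "value"), State("history-store", "data")],
    prevent_initial_call=True,
)
def record_search(n_clicks, customer_id, past_searches):
    """Append each searched customer ID to the stored history and render it."""
    if not customer_id:
        return no_update, no_update
    past_searches = (past_searches or []) + [customer_id]
    return past_searches, [html.Li(cid) for cid in past_searches]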
Dash application code
import dataiku
from dash import html, dcc, no_update, set_props
import dash_bootstrap_components as dbc
from dash.dependencies import Input, Output, State
import json
from utils import get_customer_details, search_company_info, process_tool_calls, create_chat_session

# The Dash `app` object is predefined by Dataiku's Code webapp backend
dbc_css = "https://cdn.jsdelivr.net/gh/AnnMarieW/dash-bootstrap-templates/dbc.min.css"
app.config.external_stylesheets = [dbc.themes.SUPERHERO, dbc_css]

# LLM setup
LLM_ID = ""  # LLM ID for the LLM Mesh connection + model goes here
client = dataiku.api_client()
project = client.get_default_project()
llm = project.get_llm(LLM_ID)
# Dash app layout
app.layout = html.Div([
    dbc.Row([html.H2("Using LLM Mesh with an agent in Dash")]),
    dbc.Row(dbc.Label("Please enter the ID of the customer:")),
    dbc.Row([
        dbc.Col(dbc.Input(id="customer_id", placeholder="Customer Id"), width=10),
        dbc.Col(dbc.Button("Search", id="search", color="primary"), width="auto")
    ], justify="between"),
    dbc.Row([dbc.Col(dbc.Textarea(id="result", style={"min-height": "500px"}), width=12)]),
    dbc.Toast(
        [html.P("Searching for information about the customer", className="mb-0"),
         dbc.Spinner(color="primary")],
        id="auto-toast",
        header="Agent working",
        icon="primary",
        is_open=False,
        style={"position": "fixed", "top": "50%", "left": "50%", "transform": "translate(-50%, -50%)"},
    ),
    dcc.Store(id="chat-state"),
    dcc.Store(id="step", data={"current_step": 0}),
], className="container-fluid mt-3")
@app.callback(
    [Output("result", "value"), Output("chat-state", "data")],
    Input("search", "n_clicks"),
    [State("customer_id", "value"), State("chat-state", "data")],
    prevent_initial_call=True,
    running=[(Output("auto-toast", "is_open"), True, False),
             (Output("search", "disabled"), True, False)]
)
def update_output(n_clicks, customer_id, chat_state):
    """Callback function that handles agent interactions"""
    if not customer_id:
        return no_update, no_update
    # Create new chat session
    chat = create_chat_session(llm, project)
    # Start conversation about customer
    content = f"Tell me about the customer with ID {customer_id}"
    chat.with_message(content, role="user")
    conversation_log = []
    while True:
        response = chat.execute()
        if not response.tool_calls:
            # Final answer received
            chat.with_message(response.text, role="assistant")
            conversation_log.append(f"Final Answer: {response.text}")
            break
        # Handle tool calls
        chat.with_tool_calls(response.tool_calls, role="assistant")
        tool_call_result = process_tool_calls(response.tool_calls)
        chat.with_tool_output(tool_call_result, tool_call_id=response.tool_calls[0]["id"])
        # Log the step
        tool_name = response.tool_calls[0]["function"]["name"]
        tool_args = response.tool_calls[0]["function"]["arguments"]
        conversation_log.append(f"Tool: {tool_name}\nInput: {tool_args}\nResult: {tool_call_result}\n{'-'*50}")
    return "\n".join(conversation_log), {"messages": chat.cq["messages"]}
Gradio application code
import dataiku
import gradio as gr
import os
import re
import json
from utils import get_customer_details, search_company_info, process_tool_calls, create_chat_session
# LLM setup
LLM_ID = "" # LLM ID for the LLM Mesh connection + model goes here
client = dataiku.api_client()
project = client.get_default_project()
llm = project.get_llm(LLM_ID)
def chat_with_agent(message, history):
    """Chat function that handles agent interactions"""
    chat = create_chat_session(llm, project)
    # Add history to chat context
    for user_msg, assistant_msg in history:
        chat.with_message(user_msg, role="user")
        chat.with_message(assistant_msg, role="assistant")
    chat.with_message(message, role="user")
    while True:
        response = chat.execute()
        if not response.tool_calls:
            # Final answer received
            chat.with_message(response.text, role="assistant")
            return response.text
        # Handle tool calls
        chat.with_tool_calls(response.tool_calls, role="assistant")
        tool_name = response.tool_calls[0]["function"]["name"]
        tool_args = response.tool_calls[0]["function"]["arguments"]
        # Process tool call and get result
        tool_call_result = process_tool_calls(response.tool_calls)
        chat.with_tool_output(tool_call_result, tool_call_id=response.tool_calls[0]["id"])
# Gradio interface setup: resolve the Code Studio browser path,
# expanding any ${VAR} references from the environment
browser_path = os.getenv("DKU_CODE_STUDIO_BROWSER_PATH_7860")
env_var_pattern = re.compile(r'(\${(.*)})')
env_vars = env_var_pattern.findall(browser_path)
for env_var in env_vars:
    browser_path = browser_path.replace(env_var[0], os.getenv(env_var[1], ''))
# Create Gradio chat interface
app = gr.ChatInterface(
    fn=chat_with_agent,
    title="Customer Information Assistant",
    description="Ask me about customers using their ID ...",
    examples=["The id is fdouetteau",
              "Find out about id wcoyote",
              "who is customer tcook"]
)
app.launch(server_port=7860, root_path=browser_path)
Voila application notebook
import dataiku
import ipywidgets as widgets
import json
import os
from utils import get_customer_details, search_company_info, process_tool_calls, create_chat_session
# LLM setup
LLM_ID = "" # LLM ID for the LLM Mesh connection + model goes here
client = dataiku.api_client()
project = client.get_default_project()
llm = project.get_llm(LLM_ID)
def process_agent_response(chat, query):
    """Process the agent's response and handle any tool calls"""
    chat.with_message(query, role="user")
    while True:
        response = chat.execute()
        if not response.tool_calls:
            # Final answer received
            chat.with_message(response.text, role="assistant")
            return response.text
        # Handle tool calls
        chat.with_tool_calls(response.tool_calls, role="assistant")
        tool_name = response.tool_calls[0]["function"]["name"]
        tool_args = response.tool_calls[0]["function"]["arguments"]
        # Process tool call and get result
        tool_call_result = process_tool_calls(response.tool_calls)
        chat.with_tool_output(tool_call_result, tool_call_id=response.tool_calls[0]["id"])
# Create widgets
label = widgets.Label(value="Enter your query about a customer")
query_input = widgets.Text(
    placeholder="Tell me about customer fdouetteau",
    continuous_update=False,
    layout=widgets.Layout(width='50%')
)
result = widgets.HTML(value="")
button = widgets.Button(description="Ask")
# Create the chat session
chat = create_chat_session(llm, project)
def on_button_click(b):
    """Handle button click event"""
    query = query_input.value
    if query:
        try:
            response = process_agent_response(chat, query)
            result.value = f"<div style='white-space: pre-wrap;'>{response}</div>"
        except Exception as e:
            result.value = f"<div style='color: red'>Error: {str(e)}</div>"
button.on_click(on_button_click)
# Layout
display(widgets.VBox([
    widgets.HBox([label]),
    widgets.HBox([query_input, button]),
    widgets.HBox([result])
], layout=widgets.Layout(padding='20px')))