Dataiku Agents#
Tutorials#
You can find tutorials on this subject in the Developer Guide: Agents and Tools for Generative AI.
List your agents#
The following code lists all the agents and their IDs, which you can reuse in the code samples later on this page.
import dataiku

client = dataiku.api_client()
project = client.get_default_project()
llm_list = project.list_llms()
for llm in llm_list:
    if 'agent:' in llm.id:
        print(f"- {llm.description} (id: {llm.id})")
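As the filter above suggests, an agent's LLM id contains the `agent:` prefix. A minimal standalone sketch of that filter, using made-up ids for illustration only (the id convention is an assumption based on the sample above):

```python
# Hypothetical helper mirroring the filter in the listing above:
# an LLM id is treated as an agent when it contains the "agent:"
# prefix (assumption based on the id convention in the sample).
def is_agent_id(llm_id: str) -> bool:
    return "agent:" in llm_id

# Made-up ids for illustration only.
ids = ["openai:gpt-4o", "agent:PROJECT.myAgent"]
agent_ids = [i for i in ids if is_agent_id(i)]
print(agent_ids)  # -> ['agent:PROJECT.myAgent']
```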
Use your agent#
Native DSSLLM#
Use the native DSSLLM completion. For more information, refer to Perform completion queries on LLMs:
import dataiku

AGENT_ID = ""  # Fill with your agent id
client = dataiku.api_client()
project = client.get_default_project()
llm = project.get_llm(AGENT_ID)
completion = llm.new_completion()
resp = completion.with_message("How to run an agent?").execute()
if resp.success:
    print(resp.text)
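A completion query can carry more than one message: `with_message` appends a message and returns the query object, which is why calls chain as above. The class below is a simplified stand-in used only to illustrate that chaining pattern, not the real DSS query object:

```python
# Illustrative stand-in for the chaining behavior of with_message();
# the real completion query object is provided by the Dataiku API.
class FakeCompletionQuery:
    def __init__(self):
        self.messages = []

    def with_message(self, message, role="user"):
        self.messages.append({"role": role, "content": message})
        return self  # returning self allows chained calls

query = FakeCompletionQuery()
query.with_message("You are a concise assistant.", role="system") \
     .with_message("How to run an agent?")
print(len(query.messages))  # -> 2
```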
Streaming (in a notebook)#
import dataiku
from dataikuapi.dss.llm import DSSLLMStreamedCompletionChunk, DSSLLMStreamedCompletionFooter
from IPython.display import display, clear_output

AGENT_ID = ""  # Fill with your agent id
client = dataiku.api_client()
project = client.get_default_project()
llm = project.get_llm(AGENT_ID)
completion = llm.new_completion()
completion.with_message("Who is the customer fdouetteau? Please provide additional information.")
gen = ""
for chunk in completion.execute_streamed():
    if isinstance(chunk, DSSLLMStreamedCompletionChunk):
        gen += chunk.data["text"]
        clear_output()
        display("Received text: %s" % gen)
    elif isinstance(chunk, DSSLLMStreamedCompletionFooter):
        print("Completion is complete: %s" % chunk.data)
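The streaming loop above incrementally concatenates the text carried by each chunk. Here is a self-contained sketch of that accumulation, with simulated chunks standing in for a live DSS stream (assumption: each text chunk exposes its payload under `chunk.data["text"]`, as in the sample):

```python
# Simulated chunk type standing in for DSSLLMStreamedCompletionChunk.
class FakeChunk:
    def __init__(self, text):
        self.data = {"text": text}

def accumulate(chunks):
    gen = ""
    for chunk in chunks:
        gen += chunk.data["text"]  # same concatenation as the loop above
    return gen

print(accumulate([FakeChunk("How to "), FakeChunk("run an agent")]))  # -> How to run an agent
```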
Using the DKUChatLLM#
Use the DKUChatLLM wrapper. For more information, refer to LangChain integration:
from dataiku.langchain.dku_llm import DKUChatLLM
AGENT_ID = "" # Fill with your agent id
langchain_llm = DKUChatLLM(llm_id=AGENT_ID)
resp = langchain_llm.invoke("How to run an agent?")
print(resp.content)
Streaming (in a notebook)#
from IPython.display import display, clear_output
from dataiku.langchain.dku_llm import DKUChatLLM

AGENT_ID = ""  # Fill with your agent id
langchain_llm = DKUChatLLM(llm_id=AGENT_ID)
resp = langchain_llm.stream("Who is the customer fdouetteau? Please provide additional information.")
gen = ""
for r in resp:
    clear_output()
    gen += r.content
    display(gen)
Asynchronous Streaming (in a notebook)#
import asyncio
from IPython.display import display, clear_output
from dataiku.langchain.dku_llm import DKUChatLLM

async def func(response):
    gen = ""
    async for r in response:
        clear_output()
        gen += r.content
        display(gen)

AGENT_ID = ""  # Fill with your agent id
langchain_llm = DKUChatLLM(llm_id=AGENT_ID)
resp = langchain_llm.astream("Who is the customer fdouetteau? Please provide additional information.")
await func(resp)
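`astream` returns an async iterator, which is why the consuming function is declared `async` and iterated with `async for`. The same pattern can be exercised with a simulated async generator in place of the model, with no DSS connection needed (assumption: each streamed part exposes its partial text on `.content`, as used above):

```python
import asyncio

# Stand-in for a streamed LangChain message part; the real parts
# expose their partial text on .content, as in the loop above.
class FakePart:
    def __init__(self, content):
        self.content = content

async def fake_astream():
    for text in ("How ", "to ", "run ", "an agent"):
        yield FakePart(text)

async def collect(stream):
    gen = ""
    async for r in stream:
        gen += r.content
    return gen

result = asyncio.run(collect(fake_astream()))
print(result)  # -> How to run an agent
```

In a notebook, where an event loop is already running, you would `await collect(...)` directly instead of calling `asyncio.run`.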
Reference documentation#
Classes#
DSSClient: Entry point for the DSS API client.
DKUChatLLM: Langchain-compatible wrapper around Dataiku-mediated chat LLMs.
DSSLLM: A handle to interact with a DSS-managed LLM.
DSSLLMCompletionQuery: A handle to interact with a completion query.
DSSProject: A handle to interact with a project on the DSS instance.
Functions#
get_default_project(): Get a handle to the current default project, if available.
get_llm(): Get a handle to interact with a specific LLM.
list_llms(): List the LLMs usable in this project.
new_completion(): Create a new completion query.
execute(): Run the completion query and retrieve the LLM response.
execute_streamed(): Run the completion query and retrieve the LLM response as streamed chunks.
with_message(): Add a message to the completion query.