Dataiku Agents#

For more details on Dataiku Agents, please refer to our documentation: refdoc:generative-ai/agents/index.
For more details on the LLM Mesh, please refer to our documentation: Introduction.
For more details on the LLM Mesh API, please refer to: LLM Mesh.

Tutorials

You can find tutorials on this subject in the Developer Guide: Agents and Tools for Generative AI.

List your agents#

The following code lists all the agents and their IDs, which you can reuse in the code samples later on this page.

import dataiku

client = dataiku.api_client()
project = client.get_default_project()
llm_list = project.list_llms()
for llm in llm_list:
    if 'agent:' in llm.id:
        print(f"- {llm.description} (id: {llm.id})")
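As the filter above suggests, agent LLM ids carry an `agent:` prefix. If you only need the bare agent identifier, you can strip that prefix with a small helper (a sketch; `extract_agent_id` is our own helper name, not part of the Dataiku API):

```python
def extract_agent_id(llm_id: str):
    """Return the bare agent id if llm_id refers to an agent, else None."""
    prefix = "agent:"
    if llm_id.startswith(prefix):
        return llm_id[len(prefix):]
    return None

print(extract_agent_id("agent:my_agent"))  # my_agent
print(extract_agent_id("openai:gpt-4"))    # None (not an agent)
```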

Use your agent#

Native DSSLLM#

This example uses the native DSSLLM completion API. For more information, refer to Perform completion queries on LLMs:

import dataiku

AGENT_ID = "" # Fill with your agent id
client = dataiku.api_client()
project = client.get_default_project()
llm = project.get_llm(AGENT_ID)
completion = llm.new_completion()
resp = completion.with_message("How to run an agent?").execute()
if resp.success:
    print(resp.text)

Streaming (in a notebook)#

import dataiku
from dataikuapi.dss.llm import DSSLLMStreamedCompletionChunk, DSSLLMStreamedCompletionFooter
from IPython.display import display, clear_output

AGENT_ID = "" # Fill with your agent id
client = dataiku.api_client()
project = client.get_default_project()
llm = project.get_llm(AGENT_ID)
completion = llm.new_completion()
completion.with_message("Who is the customer fdouetteau? Please provide additional information.")

gen = ""
for chunk in completion.execute_streamed():
    if isinstance(chunk, DSSLLMStreamedCompletionChunk):
        gen += chunk.data["text"]
        clear_output()
        display("Received text: %s" % gen)
    elif isinstance(chunk, DSSLLMStreamedCompletionFooter):
        print("Completion is complete: %s" % chunk.data)
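The streaming loop above rebuilds the full answer by concatenating the `text` field of each chunk's `data` payload. Isolated as a plain function, the accumulation step looks like this (a sketch; `join_chunk_texts` is a hypothetical helper, not a Dataiku API):

```python
def join_chunk_texts(chunk_payloads):
    """Concatenate the partial "text" fields of streamed chunk payloads."""
    return "".join(p.get("text", "") for p in chunk_payloads)

print(join_chunk_texts([{"text": "Hel"}, {"text": "lo"}]))  # Hello
```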

Using the DKUChatLLM#

This example uses the DKUChatLLM wrapper. For more information, refer to LangChain integration:

from dataiku.langchain.dku_llm import DKUChatLLM

AGENT_ID = "" # Fill with your agent id
langchain_llm = DKUChatLLM(llm_id=AGENT_ID)
resp = langchain_llm.invoke("How to run an agent?")
print(resp.content)
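Because DKUChatLLM follows the LangChain chat-model interface, invoke also accepts a list of (role, content) pairs, which is convenient for adding a system prompt. A minimal sketch, assuming a valid AGENT_ID on a live Dataiku instance (the system prompt wording is ours):

```python
# Multi-turn prompt in the (role, content) form accepted by LangChain chat models.
messages = [
    ("system", "You are a concise assistant for Dataiku users."),
    ("human", "How to run an agent?"),
]

# On a live Dataiku instance (with AGENT_ID filled in as in the samples above):
# from dataiku.langchain.dku_llm import DKUChatLLM
# langchain_llm = DKUChatLLM(llm_id=AGENT_ID)
# resp = langchain_llm.invoke(messages)
# print(resp.content)
```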

Streaming (in a notebook)#

from IPython.display import display, clear_output
from dataiku.langchain.dku_llm import DKUChatLLM

AGENT_ID = "" # Fill with your agent id
langchain_llm = DKUChatLLM(llm_id=AGENT_ID)
resp = langchain_llm.stream("Who is the customer fdouetteau? Please provide additional information.")

gen = ""
for r in resp:
    clear_output()
    gen += r.content
    display(gen)

Asynchronous Streaming (in a notebook)#

import asyncio
from IPython.display import display, clear_output
from dataiku.langchain.dku_llm import DKUChatLLM

async def func(response):
    gen = ""
    async for r in response:
        clear_output()
        gen += r.content
        display(gen)

AGENT_ID = "" # Fill with your agent id
langchain_llm = DKUChatLLM(llm_id=AGENT_ID)
resp = langchain_llm.astream("Who is the customer fdouetteau? Please provide additional information.")


await func(resp)

Reference documentation#

Classes#

dataikuapi.DSSClient(host[, api_key, ...])

Entry point for the DSS API client

dataikuapi.dss.langchain.DKUChatModel(*args, ...)

Langchain-compatible wrapper around Dataiku-mediated chat LLMs

dataikuapi.dss.llm.DSSLLM(client, ...)

A handle to interact with a DSS-managed LLM.

dataikuapi.dss.llm.DSSLLMCompletionQuery(llm)

A handle to interact with a completion query.

dataikuapi.dss.project.DSSProject(client, ...)

A handle to interact with a project on the DSS instance.

Functions#

dataikuapi.DSSClient.get_default_project()

Get a handle to the current default project, if available.

dataikuapi.dss.project.DSSProject.get_llm(llm_id)

Get a handle to interact with a specific LLM

dataikuapi.dss.project.DSSProject.list_llms([...])

List the LLMs usable in this project

dataikuapi.dss.llm.DSSLLM.new_completion()

Create a new completion query.

dataikuapi.dss.llm.DSSLLMCompletionQuery.execute()

Run the completion query and retrieve the LLM response.

dataikuapi.dss.llm.DSSLLMCompletionQuery.execute_streamed()

Run the completion query and retrieve the LLM response as streamed chunks.

dataikuapi.dss.llm.DSSLLMCompletionQuery.with_message(message)

Add a message to the completion query.

langchain.chains.llm.LLMChain.astream

langchain.chains.llm.LLMChain.invoke

langchain.chains.llm.LLMChain.stream