LLM Mesh#
For usage information and examples, please see LLM Mesh
- class dataikuapi.dss.llm.DSSLLM(client, project_key, llm_id)#
A handle to interact with a DSS-managed LLM.
Important
Do not create this class directly, use
dataikuapi.dss.project.DSSProject.get_llm()
instead.
- new_completion()#
Create a new completion query.
- Returns:
A handle on the generated completion query.
- Return type:
DSSLLMCompletionQuery
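As a sketch of the basic flow (the helper below is illustrative, not part of the API; the llm_id used to obtain the handle is connection-specific):

```python
def ask(llm, question):
    """Send a single-turn question through a DSSLLM handle and return the reply.

    `llm` is the handle returned by DSSProject.get_llm(); returns None when
    the query did not succeed.
    """
    q = llm.new_completion()
    q.with_message("You are a helpful assistant.", role="system")
    q.with_message(question)
    resp = q.execute()
    return resp.text if resp.success else None
```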
- new_completions()#
Create a new multi-completion query.
- Returns:
A handle on the generated multi-completion query.
- Return type:
DSSLLMCompletionsQuery
- new_embeddings(text_overflow_mode='FAIL')#
Create a new embedding query.
- Parameters:
text_overflow_mode (str) – How to handle longer texts than what the model supports. Either ‘TRUNCATE’ or ‘FAIL’.
- Returns:
A handle on the generated embeddings query.
- Return type:
DSSLLMEmbeddingsQuery
- new_images_generation()#
Create a new image generation query.
- as_langchain_llm(**data)#
Create a langchain-compatible LLM object for this LLM.
- Returns:
A langchain-compatible LLM object.
- Return type:
DKULLM
- as_langchain_chat_model(**data)#
Create a langchain-compatible chat LLM object for this LLM.
- Returns:
A langchain-compatible chat model object.
- Return type:
DKUChatModel
- as_langchain_embeddings(**data)#
Create a langchain-compatible embeddings object for this LLM.
- Returns:
A langchain-compatible embeddings object.
- Return type:
DKUEmbeddings
- class dataikuapi.dss.llm.DSSLLMListItem(client, project_key, data)#
An item in a list of LLMs.
Important
Do not instantiate this class directly; use
dataikuapi.dss.project.DSSProject.list_llms()
instead.
- to_llm()#
Convert the current item to an LLM handle.
- Returns:
A handle for the LLM.
- Return type:
DSSLLM
- property id#
- Returns:
The ID of the LLM.
- Return type:
string
- property type#
- Returns:
The type of the LLM.
- Return type:
string
- property description#
- Returns:
The description of the LLM.
- Return type:
string
- class dataikuapi.dss.llm.DSSLLMCompletionsQuery(llm)#
A handle to interact with a multi-completion query. Completion queries allow you to send a prompt to a DSS-managed LLM and retrieve its response.
Important
Do not create this class directly, use
dataikuapi.dss.llm.DSSLLM.new_completions()
instead.
- property settings#
- Returns:
The completion query settings.
- Return type:
dict
- new_completion()#
Create a new single completion query inside this multi-completion query.
- Return type:
DSSLLMCompletionsQuerySingleQuery
- execute()#
Run the completions query and retrieve the LLM response.
- Returns:
The LLM response.
- Return type:
DSSLLMCompletionsResponse
- with_json_output()#
Request the model to generate a valid JSON response, for models that support it.
Note that some models may also require you to explicitly request JSON output in the user or system prompt.
- class dataikuapi.dss.llm.DSSLLMCompletionsQuerySingleQuery#
A handle on a single completion query within a multi-completion query.
- new_multipart_message(role='user')#
Start adding a multipart-message to the completion query.
Use this to add image parts to the message.
- Parameters:
role (str) – The message role. Use 'system' to set the LLM behavior, 'assistant' to store predefined responses, or 'user' to provide requests or comments for the LLM to answer. Defaults to 'user'.
- Return type:
DSSLLMCompletionQueryMultipartMessage
- with_message(message, role='user')#
Add a message to the completion query.
- Parameters:
message (str) – The message text.
role (str) – The message role. Use 'system' to set the LLM behavior, 'assistant' to store predefined responses, or 'user' to provide requests or comments for the LLM to answer. Defaults to 'user'.
- with_tool_calls(tool_calls, role='assistant')#
Add tool calls to the completion query.
- Parameters:
tool_calls (list[dict]) – Calls to tools that the LLM requested to use.
role (str) – The message role. Defaults to 'assistant'.
- with_tool_output(tool_output, tool_call_id, role='tool')#
Add a tool message to the completion query.
- Parameters:
tool_output (str) – The tool output, as a string.
tool_call_id (str) – The tool call id, as provided by the LLM in the conversation messages.
role (str) – The message role. Defaults to 'tool'.
- class dataikuapi.dss.llm.DSSLLMCompletionsResponse(raw_resp)#
A handle to interact with a multi-completion response.
Important
Do not create this class directly, use
dataikuapi.dss.llm.DSSLLMCompletionsQuery.execute()
instead.
- property responses#
The array of responses.
- class dataikuapi.dss.llm.DSSLLMCompletionQuery(llm)#
A handle to interact with a completion query. Completion queries allow you to send a prompt to a DSS-managed LLM and retrieve its response.
Important
Do not create this class directly, use
dataikuapi.dss.llm.DSSLLM.new_completion()
instead.
- property settings#
- Returns:
The completion query settings.
- Return type:
dict
- execute()#
Run the completion query and retrieve the LLM response.
- Returns:
The LLM response.
- Return type:
DSSLLMCompletionResponse
- execute_streamed()#
Run the completion query and retrieve the LLM response as streamed chunks.
- Returns:
An iterator over the LLM response chunks
- Return type:
Iterator[Union[DSSLLMStreamedCompletionChunk, DSSLLMStreamedCompletionFooter]]
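A sketch of consuming a streamed completion. The chunk payload layout is not documented here, so the text-extraction step is injected as a callable rather than assumed:

```python
def collect_stream(query, get_text):
    """Accumulate the text of a streamed completion.

    `query` is a DSSLLMCompletionQuery; `get_text` maps a streamed chunk to
    its text fragment (or None). The footer chunk typically carries metadata
    rather than text, so returning None for it is expected.
    """
    parts = []
    for chunk in query.execute_streamed():
        fragment = get_text(chunk)
        if fragment:
            parts.append(fragment)
    return "".join(parts)
```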
- new_multipart_message(role='user')#
Start adding a multipart-message to the completion query.
Use this to add image parts to the message.
- Parameters:
role (str) – The message role. Use 'system' to set the LLM behavior, 'assistant' to store predefined responses, or 'user' to provide requests or comments for the LLM to answer. Defaults to 'user'.
- Return type:
DSSLLMCompletionQueryMultipartMessage
- with_json_output()#
Request the model to generate a valid JSON response, for models that support it.
Note that some models may also require you to explicitly request JSON output in the user or system prompt.
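A minimal JSON-mode sketch (the helper and the prompt wording are illustrative). It requests JSON output and also mentions JSON in the system prompt, since some models require that:

```python
def ask_json(query, prompt):
    """Ask for a structured reply using JSON mode.

    `query` is a fresh DSSLLMCompletionQuery; returns the parsed JSON object,
    or None when the query did not succeed.
    """
    query.with_message("Reply as a JSON object with a single key 'answer'.",
                       role="system")
    query.with_message(prompt)
    query.with_json_output()
    resp = query.execute()
    return resp.json if resp.success else None
```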
- with_message(message, role='user')#
Add a message to the completion query.
- Parameters:
message (str) – The message text.
role (str) – The message role. Use 'system' to set the LLM behavior, 'assistant' to store predefined responses, or 'user' to provide requests or comments for the LLM to answer. Defaults to 'user'.
- with_tool_calls(tool_calls, role='assistant')#
Add tool calls to the completion query.
- Parameters:
tool_calls (list[dict]) – Calls to tools that the LLM requested to use.
role (str) – The message role. Defaults to 'assistant'.
- with_tool_output(tool_output, tool_call_id, role='tool')#
Add a tool message to the completion query.
- Parameters:
tool_output (str) – The tool output, as a string.
tool_call_id (str) – The tool call id, as provided by the LLM in the conversation messages.
role (str) – The message role. Defaults to 'tool'.
- class dataikuapi.dss.llm.DSSLLMCompletionQueryMultipartMessage(q, role)#
Important
Do not create this class directly, use
dataikuapi.dss.llm.DSSLLMCompletionQuery.new_multipart_message()
or dataikuapi.dss.llm.DSSLLMCompletionsQuerySingleQuery.new_multipart_message()
instead.
- with_text(text)#
Add a text part to the multipart message.
- with_inline_image(image, mime_type=None)#
Add an image part to the multipart message.
- Parameters:
image – The image content, as bytes or a base64-encoded str.
mime_type (str) – The image MIME type, or None to use the default.
- add()#
Add this message to the completion query.
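Putting the three calls together (the helper and its instruction text are illustrative, and the mime_type value is an assumption about the input image):

```python
def describe_image(query, image_bytes):
    """Attach an image and a text instruction as one multipart user message.

    `query` is a DSSLLMCompletionQuery; the image can be raw bytes or a
    base64-encoded str, and mime_type may be omitted to use the default.
    """
    msg = query.new_multipart_message(role="user")
    msg.with_text("Describe this image in one sentence.")
    msg.with_inline_image(image_bytes, mime_type="image/png")
    msg.add()
    return query.execute()
```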
- class dataikuapi.dss.llm.DSSLLMCompletionResponse(raw_resp)#
A handle to interact with a completion response.
Important
Do not create this class directly, use
dataikuapi.dss.llm.DSSLLMCompletionQuery.execute()
instead.
- property json#
- Returns:
The LLM response parsed as a JSON object.
- property success#
- Returns:
Whether the completion query succeeded.
- Return type:
bool
- property text#
- Returns:
The raw text of the LLM response.
- Return type:
Union[str, None]
- property tool_calls#
- Returns:
The tool calls of the LLM response.
- Return type:
Union[list, None]
- property log_probs#
- Returns:
The log probs of the LLM response.
- Return type:
Union[list, None]
- class dataikuapi.dss.llm.DSSLLMEmbeddingsQuery(llm, text_overflow_mode)#
A handle to interact with an embedding query. Embedding queries allow you to transform text into embedding vectors using a DSS-managed model.
Important
Do not create this class directly, use
dataikuapi.dss.llm.DSSLLM.new_embeddings()
instead.
- add_text(text)#
Add text to the embedding query.
- Parameters:
text (str) – Text to add to the query.
- add_image(image)#
Add an image to the embedding query.
- Parameters:
image – The image content, as bytes or a base64-encoded str.
- execute()#
Run the embedding query.
- Returns:
The results of the embedding query.
- Return type:
DSSLLMEmbeddingsResponse
- class dataikuapi.dss.llm.DSSLLMEmbeddingsResponse(raw_resp)#
A handle to interact with an embedding query result.
Important
Do not create this class directly, use
dataikuapi.dss.llm.DSSLLMEmbeddingsQuery.execute()
instead.
- get_embeddings()#
Retrieve vectors resulting from the embeddings query.
- Returns:
A list of lists containing all embedding vectors.
- Return type:
list
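The end-to-end embedding flow can be sketched as follows (the helper is illustrative; vectors are assumed to come back in the order the texts were added):

```python
def embed_texts(llm, texts, overflow_mode="TRUNCATE"):
    """Embed a batch of texts with a DSSLLM handle, one vector per input.

    'TRUNCATE' is used here so texts longer than the model supports are cut
    rather than failing the whole query.
    """
    query = llm.new_embeddings(text_overflow_mode=overflow_mode)
    for text in texts:
        query.add_text(text)
    return query.execute().get_embeddings()
```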
- class dataikuapi.dss.knowledgebank.DSSKnowledgeBankListItem(client, data)#
An item in a list of knowledge banks.
Important
Do not instantiate this class directly; use
dataikuapi.dss.project.DSSProject.list_knowledge_banks()
instead.
- to_knowledge_bank()#
Convert the current item to a knowledge bank handle.
- Returns:
A handle for the knowledge bank.
- Return type:
DSSKnowledgeBank
- as_core_knowledge_bank()#
Get the dataiku.KnowledgeBank object corresponding to this knowledge bank.
- Return type:
dataiku.KnowledgeBank
- property project_key#
- Returns:
The project key of the knowledge bank.
- Return type:
string
- property id#
- Returns:
The id of the knowledge bank.
- Return type:
string
- property name#
- Returns:
The name of the knowledge bank.
- Return type:
string
- class dataikuapi.dss.knowledgebank.DSSKnowledgeBank(client, project_key, id)#
A handle to interact with a DSS-managed knowledge bank.
Important
Do not create this class directly, use
dataikuapi.dss.project.DSSProject.get_knowledge_bank()
instead.
- as_core_knowledge_bank()#
Get the dataiku.KnowledgeBank object corresponding to this knowledge bank.
- Return type:
dataiku.KnowledgeBank
- class dataiku.KnowledgeBank(id, project_key=None)#
This is a handle to interact with a Dataiku Knowledge Bank flow object.
- as_langchain_retriever(search_type='similarity', search_kwargs=None, **retriever_args)#
Get this knowledge bank as a Langchain Retriever object.
- as_langchain_vectorstore()#
Get this knowledge bank as a Langchain VectorStore object.
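A retrieval sketch using the Langchain retriever (the helper is illustrative; the search_kwargs "k" entry follows the usual Langchain retriever convention and is an assumption here):

```python
def top_passages(kb, question, k=3):
    """Fetch the k most similar chunks from a knowledge bank.

    `kb` is a dataiku.KnowledgeBank handle; each returned item is the
    page_content of a retrieved Langchain document.
    """
    retriever = kb.as_langchain_retriever(search_type="similarity",
                                          search_kwargs={"k": k})
    return [doc.page_content for doc in retriever.invoke(question)]
```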
- class dataikuapi.dss.langchain.DKULLM(*args: Any, **kwargs: Any)#
Langchain-compatible wrapper around Dataiku-mediated LLMs
Note
Direct instantiation of this class is possible from within DSS, though it’s recommended to instead use
dataikuapi.dss.llm.DSSLLM.as_langchain_llm()
.
Example:
llm = dkullm.as_langchain_llm()

# single prompt
print(llm.invoke("tell me a joke"))

# multiple prompts with batching
for response in llm.batch(["tell me a joke in English", "tell me a joke in French"]):
    print(response)

# streaming, with stop sequence
for chunk in llm.stream("Explain photosynthesis in a few words in English then French", stop=["dioxyde de"]):
    print(chunk, end="", flush=True)
- llm_id: str#
LLM identifier to use
- max_tokens: int = 1024#
The maximum number of tokens to generate per completion.
- temperature: float = 0#
A non-negative float that tunes the degree of randomness in generation.
- top_k: int = None#
Number of tokens to pick from when sampling.
- top_p: float = None#
Sample from the top tokens whose probabilities add up to p.
- class dataikuapi.dss.langchain.DKUChatModel(*args: Any, **kwargs: Any)#
Langchain-compatible wrapper around Dataiku-mediated chat LLMs
Note
Direct instantiation of this class is possible from within DSS, though it’s recommended to instead use
dataikuapi.dss.llm.DSSLLM.as_langchain_chat_model()
.
Example:
from langchain_core.prompts import ChatPromptTemplate

llm = dkullm.as_langchain_chat_model()
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")
chain = prompt | llm
for chunk in chain.stream({"topic": "parrot"}):
    print(chunk.content, end="", flush=True)
- llm_id: str#
LLM identifier to use
- max_tokens: int = 1024#
The maximum number of tokens to generate per completion.
- temperature: float = 0#
A non-negative float that tunes the degree of randomness in generation.
- top_k: int = None#
Number of tokens to pick from when sampling.
- top_p: float = None#
Sample from the top tokens whose probabilities add up to p.
- bind_tools(tools: Sequence[Dict[str, Any] | Type[langchain_core.pydantic_v1.BaseModel] | Callable | langchain_core.tools.BaseTool], tool_choice: dict | str | Literal['auto', 'none', 'required', 'any'] | bool | None = None, **kwargs: Any)#
Bind tool-like objects to this chat model.
- Args:
- tools: A list of tool definitions to bind to this chat model.
Can be a dictionary, pydantic model, callable, or BaseTool. Pydantic models, callables, and BaseTools will be automatically converted to their schema dictionary representation.
- tool_choice: Which tool to request the model to call.
Options are:
the name of the tool (str): call the corresponding tool;
“auto”: automatically select a tool (or no tool);
“none”: do not call a tool;
“any” or “required”: force at least one tool call;
True: call the one given tool (requires tools to be of length 1);
a dict of the form {“type”: “tool_name”, “name”: “<<tool_name>>”}, {“type”: “required”}, {“type”: “any”}, {“type”: “none”}, or {“type”: “auto”}.
- kwargs: Any additional parameters to bind.
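The tool_choice=True option above can be sketched as follows (the helper is illustrative, and the exact layout of a dict-form tool schema depends on the model provider, so it is passed in rather than assumed):

```python
def call_single_tool(chat_model, tool_schema, prompt):
    """Force the model to call the one provided tool.

    tool_choice=True requires exactly one tool, per the documentation above;
    `chat_model` is a DKUChatModel (or any object exposing bind_tools).
    """
    bound = chat_model.bind_tools([tool_schema], tool_choice=True)
    return bound.invoke(prompt)
```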
- class dataikuapi.dss.langchain.DKUEmbeddings(*args: Any, **kwargs: Any)#
Langchain-compatible wrapper around Dataiku-mediated embedding LLMs
Note
Direct instantiation of this class is possible from within DSS, though it’s recommended to instead use
dataikuapi.dss.llm.DSSLLM.as_langchain_embeddings()
.
- llm_id: str#
LLM identifier to use
- embed_documents(texts: List[str]) List[List[float]] #
Call out to the Dataiku-mediated embedding LLM.
- Args:
texts: The list of texts to embed.
- Returns:
List of embeddings, one for each text.
- async aembed_documents(texts: List[str]) List[List[float]] #
Asynchronous variant of embed_documents().
- embed_query(text: str) List[float] #
Embed a single query text.
- async aembed_query(text: str) List[float] #
Asynchronous variant of embed_query().
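A common use of embed_query/embed_documents is similarity ranking. A dependency-free sketch (the helper is illustrative and works with any object exposing those two methods, such as the DKUEmbeddings wrapper above):

```python
import math

def rank_by_similarity(embedder, query, docs):
    """Rank documents by cosine similarity to a query, most similar first."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0

    qv = embedder.embed_query(query)
    scored = zip(docs, embedder.embed_documents(docs))
    return [d for d, v in sorted(scored, key=lambda p: cosine(qv, p[1]), reverse=True)]
```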