LLM Mesh#
For usage information and examples, please see LLM Mesh
- class dataikuapi.dss.llm.DSSLLM(client, project_key, llm_id)#
A handle to interact with a DSS-managed LLM.
Important
Do not create this class directly, use
dataikuapi.dss.project.DSSProject.get_llm()
instead.
- new_completion()#
Create a new completion query.
- Returns:
A handle on the generated completion query.
- Return type:
- new_completions()#
Create a new multi-completion query.
- Returns:
A handle on the generated multi-completion query.
- Return type:
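As a usage sketch of the two query factories above (the instance URL, API key, project key, and LLM id below are placeholders to replace with your own; each item in `responses` is assumed to expose the same `text` attribute as a single-completion response):

```python
def run_single_and_multi_completion():
    # Placeholder connection details: substitute your own instance URL,
    # API key, project key, and LLM id from the LLM Mesh.
    from dataikuapi import DSSClient

    client = DSSClient("https://dss.example.com", "YOUR_API_KEY")
    llm = client.get_project("MY_PROJECT").get_llm("openai:my-connection:gpt-4o-mini")

    # Single completion: one prompt, one response.
    completion = llm.new_completion()
    completion.with_message("Write a haiku about data pipelines.")
    resp = completion.execute()
    if resp.success:
        print(resp.text)

    # Multi-completion: batch several prompts in one query.
    completions = llm.new_completions()
    for prompt in ["Define ETL.", "Define ELT."]:
        completions.new_completion().with_message(prompt)
    for r in completions.execute().responses:
        print(r.text)
```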
- new_embeddings(text_overflow_mode='FAIL')#
Create a new embedding query.
- Parameters:
text_overflow_mode (str) – How to handle longer texts than what the model supports. Either ‘TRUNCATE’ or ‘FAIL’.
- Returns:
A handle on the generated embeddings query.
- Return type:
- new_images_generation()#
Create a new image generation query.
- as_langchain_llm(**data)#
Create a langchain-compatible LLM object for this LLM.
- Returns:
A langchain-compatible LLM object.
- Return type:
dataikuapi.dss.langchain.llm.DKULLM
- as_langchain_chat_model(**data)#
Create a langchain-compatible chat LLM object for this LLM.
- Returns:
A langchain-compatible LLM object.
- Return type:
dataikuapi.dss.langchain.llm.DKUChatModel
- as_langchain_embeddings(**data)#
Create a langchain-compatible embeddings object for this LLM.
- Returns:
A langchain-compatible embeddings object.
- Return type:
dataikuapi.dss.langchain.embeddings.DKUEmbeddings
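A sketch of the langchain adapters above (connection details are placeholders; `invoke` and `embed_query` are the standard langchain chat-model and embeddings interfaces the returned objects are documented to be compatible with):

```python
def use_langchain_adapters():
    # Placeholder connection details; assumes langchain is installed
    # alongside the Dataiku API client.
    from dataikuapi import DSSClient

    client = DSSClient("https://dss.example.com", "YOUR_API_KEY")
    project = client.get_project("MY_PROJECT")

    # Chat-style models plug into any langchain code expecting a chat model.
    chat_model = project.get_llm("openai:my-connection:gpt-4o-mini").as_langchain_chat_model()
    print(chat_model.invoke("Say hello.").content)

    # Embedding models plug in wherever langchain expects an Embeddings object.
    embeddings = project.get_llm("openai:my-connection:text-embedding-3-small").as_langchain_embeddings()
    vector = embeddings.embed_query("hello world")
    print(len(vector))
```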
- class dataikuapi.dss.llm.DSSLLMListItem(client, project_key, data)#
An item in a list of LLMs.
Important
Do not instantiate this class directly, instead use
dataikuapi.dss.project.DSSProject.list_llms()
.
- to_llm()#
Convert the current item.
- Returns:
A handle for the LLM.
- Return type:
- property id#
- Returns:
The id of the LLM.
- Return type:
string
- property type#
- Returns:
The type of the LLM.
- Return type:
string
- property description#
- Returns:
The description of the LLM.
- Return type:
string
- class dataikuapi.dss.llm.DSSLLMCompletionsQuery(llm)#
A handle to interact with a multi-completion query. Completion queries allow you to send a prompt to a DSS-managed LLM and retrieve its response.
Important
Do not create this class directly, use
dataikuapi.dss.llm.DSSLLM.new_completions()
instead.
- property settings#
- Returns:
The completion query settings.
- Return type:
dict
- new_completion()#
Create a new single query in this multi-completion query.
- execute()#
Run the completions query and retrieve the LLM response.
- Returns:
The LLM response.
- Return type:
- with_json_output(schema=None, strict=None, compatible=None)#
Request the model to generate a valid JSON response, for models that support it.
Note that some models may also require you to explicitly request JSON output in the user or system prompt.
Caution
JSON output support is experimental for locally-running Hugging Face models.
- Parameters:
schema (dict) – (optional) If specified, request the model to produce a JSON response that adheres to the provided schema. Support varies across models/providers.
strict (bool) – (optional) If a schema is provided, whether to strictly enforce it. Support varies across models/providers.
compatible (bool) – (optional) Allow DSS to modify the schema in order to increase compatibility, depending on known limitations of the model/provider. Defaults to automatic.
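A sketch of constrained JSON output (connection details are placeholders; the schema is an example, and the response is read back through the `json` property of the completion response):

```python
def ask_for_json():
    # Placeholder connection details; the schema below is only an example.
    from dataikuapi import DSSClient

    client = DSSClient("https://dss.example.com", "YOUR_API_KEY")
    llm = client.get_project("MY_PROJECT").get_llm("openai:my-connection:gpt-4o-mini")

    schema = {
        "type": "object",
        "properties": {
            "city": {"type": "string"},
            "population": {"type": "integer"},
        },
        "required": ["city", "population"],
    }

    completion = llm.new_completion()
    completion.with_json_output(schema=schema)
    completion.with_message("Give the largest city in France and its population.")
    resp = completion.execute()
    if resp.success:
        print(resp.json)  # the LLM response parsed as a JSON object
```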
- with_structured_output(model_type, strict=None, compatible=None)#
Instruct the model to generate a response as an instance of a specified Pydantic model.
This relies on with_json_output() and requires a model that supports JSON output with a schema.
Caution
Structured output support is experimental for locally-running Hugging Face models.
- Parameters:
model_type (pydantic.BaseModel) – A Pydantic model class used for structuring the response.
strict (bool) – (optional) see
with_json_output()
compatible (bool) – (optional) see
with_json_output()
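A sketch of structured output with Pydantic (connection details are placeholders; `CityInfo` is an example model, and reading the result through the `parsed` property is an assumption based on the response class below):

```python
def ask_for_structured_output():
    # Placeholder connection details; CityInfo is an example Pydantic model.
    from dataikuapi import DSSClient
    from pydantic import BaseModel

    class CityInfo(BaseModel):
        city: str
        population: int

    client = DSSClient("https://dss.example.com", "YOUR_API_KEY")
    llm = client.get_project("MY_PROJECT").get_llm("openai:my-connection:gpt-4o-mini")

    completion = llm.new_completion()
    completion.with_structured_output(CityInfo)
    completion.with_message("Give the largest city in France and its population.")
    resp = completion.execute()
    if resp.success:
        print(resp.parsed)  # assumed to be a validated CityInfo instance
```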
- class dataikuapi.dss.llm.DSSLLMCompletionsQuerySingleQuery#
- new_multipart_message(role='user')#
Start adding a multipart message to the completion query.
Use this to add image parts to the message.
- Parameters:
role (str) – The message role. Use system to set the LLM behavior, assistant to store predefined responses, and user to provide requests or comments for the LLM to answer. Defaults to user.
- Return type:
- with_message(message, role='user')#
Add a message to the completion query.
- Parameters:
message (str) – The message text.
role (str) – The message role. Use system to set the LLM behavior, assistant to store predefined responses, and user to provide requests or comments for the LLM to answer. Defaults to user.
- with_tool_calls(tool_calls, role='assistant')#
Add tool calls to the completion query.
- Parameters:
tool_calls (list[dict]) – Calls to tools that the LLM requested to use.
role (str) – The message role. Defaults to assistant.
- with_tool_output(tool_output, tool_call_id, role='tool')#
Add a tool message to the completion query.
- Parameters:
tool_output (str) – The tool output, as a string.
tool_call_id (str) – The tool call id, as provided by the LLM in the conversation messages.
role (str) – The message role. Defaults to tool.
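A sketch of a tool-calling round trip (connection details are placeholders; declaring tools via `settings["tools"]` with a function-calling-style spec, and reading each requested call's id from `call["id"]`, are assumptions to check against your DSS version):

```python
def tool_call_round_trip():
    # Placeholder connection details. The tool declaration layout below is
    # an assumption modeled on the common function-calling convention.
    from dataikuapi import DSSClient

    client = DSSClient("https://dss.example.com", "YOUR_API_KEY")
    llm = client.get_project("MY_PROJECT").get_llm("openai:my-connection:gpt-4o-mini")

    tools = [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }]

    completion = llm.new_completion()
    completion.settings["tools"] = tools
    completion.with_message("What is the weather in Paris?")
    resp = completion.execute()

    if resp.tool_calls:
        call = resp.tool_calls[0]  # assumed to carry its id under "id"
        # Echo the requested calls back, then supply the tool's output so
        # the LLM can produce its final answer.
        followup = llm.new_completion()
        followup.settings["tools"] = tools
        followup.with_message("What is the weather in Paris?")
        followup.with_tool_calls(resp.tool_calls)
        followup.with_tool_output('{"temp_c": 18}', tool_call_id=call["id"])
        print(followup.execute().text)
```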
- class dataikuapi.dss.llm.DSSLLMCompletionsResponse(raw_resp, response_parser=None)#
A handle to interact with a multi-completion response.
Important
Do not create this class directly, use
dataikuapi.dss.llm.DSSLLMCompletionsQuery.execute()
instead.
- property responses#
The array of responses
- class dataikuapi.dss.llm.DSSLLMCompletionQuery(llm)#
A handle to interact with a completion query. Completion queries allow you to send a prompt to a DSS-managed LLM and retrieve its response.
Important
Do not create this class directly, use
dataikuapi.dss.llm.DSSLLM.new_completion()
instead.
- property settings#
- Returns:
The completion query settings.
- Return type:
dict
- execute()#
Run the completion query and retrieve the LLM response.
- Returns:
The LLM response.
- Return type:
- execute_streamed()#
Run the completion query and retrieve the LLM response as streamed chunks.
- Returns:
An iterator over the LLM response chunks
- Return type:
Iterator[Union[DSSLLMStreamedCompletionChunk, DSSLLMStreamedCompletionFooter]]
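A sketch of streamed execution (connection details are placeholders; reading each text fragment from `chunk.data["text"]` is an assumption about the chunk payload):

```python
def stream_completion():
    # Placeholder connection details. Assumes each streamed chunk exposes
    # its text fragment in a dict under chunk.data["text"].
    from dataikuapi import DSSClient
    from dataikuapi.dss.llm import DSSLLMStreamedCompletionChunk

    client = DSSClient("https://dss.example.com", "YOUR_API_KEY")
    llm = client.get_project("MY_PROJECT").get_llm("openai:my-connection:gpt-4o-mini")

    completion = llm.new_completion()
    completion.with_message("Tell a short story about a robot.")
    for chunk in completion.execute_streamed():
        if isinstance(chunk, DSSLLMStreamedCompletionChunk):
            print(chunk.data.get("text", ""), end="", flush=True)
        # The final item is a DSSLLMStreamedCompletionFooter with closing metadata.
```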
- new_multipart_message(role='user')#
Start adding a multipart message to the completion query.
Use this to add image parts to the message.
- Parameters:
role (str) – The message role. Use system to set the LLM behavior, assistant to store predefined responses, and user to provide requests or comments for the LLM to answer. Defaults to user.
- Return type:
- with_json_output(schema=None, strict=None, compatible=None)#
Request the model to generate a valid JSON response, for models that support it.
Note that some models may also require you to explicitly request JSON output in the user or system prompt.
Caution
JSON output support is experimental for locally-running Hugging Face models.
- Parameters:
schema (dict) – (optional) If specified, request the model to produce a JSON response that adheres to the provided schema. Support varies across models/providers.
strict (bool) – (optional) If a schema is provided, whether to strictly enforce it. Support varies across models/providers.
compatible (bool) – (optional) Allow DSS to modify the schema in order to increase compatibility, depending on known limitations of the model/provider. Defaults to automatic.
- with_message(message, role='user')#
Add a message to the completion query.
- Parameters:
message (str) – The message text.
role (str) – The message role. Use system to set the LLM behavior, assistant to store predefined responses, and user to provide requests or comments for the LLM to answer. Defaults to user.
- with_structured_output(model_type, strict=None, compatible=None)#
Instruct the model to generate a response as an instance of a specified Pydantic model.
This relies on with_json_output() and requires a model that supports JSON output with a schema.
Caution
Structured output support is experimental for locally-running Hugging Face models.
- Parameters:
model_type (pydantic.BaseModel) – A Pydantic model class used for structuring the response.
strict (bool) – (optional) see
with_json_output()
compatible (bool) – (optional) see
with_json_output()
- with_tool_calls(tool_calls, role='assistant')#
Add tool calls to the completion query.
- Parameters:
tool_calls (list[dict]) – Calls to tools that the LLM requested to use.
role (str) – The message role. Defaults to assistant.
- with_tool_output(tool_output, tool_call_id, role='tool')#
Add a tool message to the completion query.
- Parameters:
tool_output (str) – The tool output, as a string.
tool_call_id (str) – The tool call id, as provided by the LLM in the conversation messages.
role (str) – The message role. Defaults to tool.
- class dataikuapi.dss.llm.DSSLLMCompletionQueryMultipartMessage(q, role)#
Important
Do not create this class directly, use
dataikuapi.dss.llm.DSSLLMCompletionQuery.new_multipart_message()
ordataikuapi.dss.llm.DSSLLMCompletionsQuerySingleQuery.new_multipart_message()
.
- with_text(text)#
Add a text part to the multipart message
- with_inline_image(image, mime_type=None)#
Add an image part to the multipart message
- Parameters:
image (Union[str, bytes]) – The image
mime_type (str) – None for default
- add()#
Add this message to the completion query
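A sketch of building a multipart message with an image part (connection details and the image path are placeholders; the target LLM must be a multimodal model):

```python
def ask_about_image():
    # Placeholder connection details and image path; requires a
    # multimodal model behind the LLM id.
    from dataikuapi import DSSClient

    client = DSSClient("https://dss.example.com", "YOUR_API_KEY")
    llm = client.get_project("MY_PROJECT").get_llm("openai:my-connection:gpt-4o-mini")

    with open("chart.png", "rb") as f:
        image_bytes = f.read()

    completion = llm.new_completion()
    msg = completion.new_multipart_message()  # role defaults to "user"
    msg.with_text("Describe what this chart shows.")
    msg.with_inline_image(image_bytes, mime_type="image/png")
    msg.add()  # attach the assembled message to the query

    resp = completion.execute()
    if resp.success:
        print(resp.text)
```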
- class dataikuapi.dss.llm.DSSLLMCompletionResponse(raw_resp=None, text=None, finish_reason=None, response_parser=None, trace=None)#
Response to a completion
- property json#
- Returns:
LLM response parsed as a JSON object
- property parsed#
- property success#
- Returns:
The outcome of the completion query.
- Return type:
bool
- property text#
- Returns:
The raw text of the LLM response.
- Return type:
Union[str, None]
- property tool_calls#
- Returns:
The tool calls of the LLM response.
- Return type:
Union[list, None]
- property log_probs#
- Returns:
The log probs of the LLM response.
- Return type:
Union[list, None]
- property trace#
- class dataikuapi.dss.llm.DSSLLMEmbeddingsQuery(llm, text_overflow_mode)#
A handle to interact with an embedding query. Embedding queries allow you to transform text into embedding vectors using a DSS-managed model.
Important
Do not create this class directly, use
dataikuapi.dss.llm.DSSLLM.new_embeddings()
instead.
- add_text(text)#
Add text to the embedding query.
- Parameters:
text (str) – Text to add to the query.
- add_image(image)#
Add an image to the embedding query.
- Parameters:
image – Image content as bytes or str (base64)
- execute()#
Run the embedding query.
- Returns:
The results of the embedding query.
- Return type:
- class dataikuapi.dss.llm.DSSLLMEmbeddingsResponse(raw_resp)#
A handle to interact with an embedding query result.
Important
Do not create this class directly, use
dataikuapi.dss.llm.DSSLLMEmbeddingsQuery.execute()
instead.
- get_embeddings()#
Retrieve vectors resulting from the embeddings query.
- Returns:
A list of lists containing all embedding vectors.
- Return type:
list
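A sketch of the embeddings flow above (connection details are placeholders; the LLM id must point to an embedding model):

```python
def embed_documents():
    # Placeholder connection details; the id must reference an embedding model.
    from dataikuapi import DSSClient

    client = DSSClient("https://dss.example.com", "YOUR_API_KEY")
    emb_model = client.get_project("MY_PROJECT").get_llm(
        "openai:my-connection:text-embedding-3-small"
    )

    # TRUNCATE cuts over-long texts instead of failing the whole query.
    query = emb_model.new_embeddings(text_overflow_mode="TRUNCATE")
    for doc in ["First document.", "Second document."]:
        query.add_text(doc)

    vectors = query.execute().get_embeddings()
    print(len(vectors), "vectors of dimension", len(vectors[0]))
```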
- class dataikuapi.dss.knowledgebank.DSSKnowledgeBankListItem(client, data)#
An item in a list of knowledge banks.
Important
Do not instantiate this class directly, instead use
dataikuapi.dss.project.DSSProject.list_knowledge_banks()
.
- to_knowledge_bank()#
Convert the current item.
- Returns:
A handle for the knowledge bank.
- Return type:
- as_core_knowledge_bank()#
Get the dataiku.KnowledgeBank object corresponding to this knowledge bank.
- Return type:
- property project_key#
- Returns:
The project key of the knowledge bank.
- Return type:
string
- property id#
- Returns:
The id of the knowledge bank.
- Return type:
string
- property name#
- Returns:
The name of the knowledge bank.
- Return type:
string
- class dataikuapi.dss.knowledgebank.DSSKnowledgeBank(client, project_key, id)#
A handle to interact with a DSS-managed knowledge bank.
Important
Do not create this class directly, use
dataikuapi.dss.project.DSSProject.get_knowledge_bank()
instead.
- as_core_knowledge_bank()#
Get the dataiku.KnowledgeBank object corresponding to this knowledge bank.
- Return type:
- class dataiku.KnowledgeBank(id, project_key=None)#
This is a handle to interact with a Dataiku Knowledge Bank flow object
- as_langchain_retriever(search_type='similarity', search_kwargs=None, **retriever_args)#
Get this Knowledge bank as a Langchain Retriever object
- as_langchain_vectorstore()#
Get this Knowledge bank as a Langchain Vectorstore object
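A sketch of using a knowledge bank as a langchain retriever (intended to run inside DSS where the dataiku package is available; the knowledge bank id, query, and search kwargs are placeholders, and the vectorstore-style `search_kwargs` layout is an assumption):

```python
def retrieve_context(kb_id):
    # Sketch intended to run inside DSS, where the dataiku package is
    # available. kb_id and the search parameters are placeholders.
    import dataiku

    kb = dataiku.KnowledgeBank(kb_id)
    retriever = kb.as_langchain_retriever(
        search_type="similarity",
        search_kwargs={"k": 4},  # assumed vectorstore-style kwargs
    )
    # Standard langchain retriever interface: returns a list of Documents.
    docs = retriever.invoke("How do I configure the LLM Mesh?")
    for doc in docs:
        print(doc.page_content[:80])
```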