Importing tables as datasets#

For usage information and examples, see Importing tables as datasets.

class dataikuapi.dss.project.TablesImportDefinition(client, project_key)#

Temporary structure holding the list of tables to import

add_hive_table(hive_database, hive_table)#

Add a Hive table to the list of tables to import

Parameters:
  • hive_database (str) – the name of the Hive database

  • hive_table (str) – the name of the Hive table
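
A minimal sketch of queuing several Hive tables on one import definition. The helper name is illustrative, and `table_import` is assumed to be a `TablesImportDefinition` (obtained elsewhere, e.g. from the project); only the `add_hive_table` call comes from the API above.

```python
def queue_hive_tables(table_import, hive_database, hive_tables):
    """Queue several Hive tables from one database for import.

    `table_import` is assumed to be a TablesImportDefinition.
    Each add_hive_table call only records the table in the temporary
    structure; nothing is imported until prepare()/execute() run.
    """
    for hive_table in hive_tables:
        table_import.add_hive_table(hive_database, hive_table)
    return table_import
```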

add_sql_table(connection, schema, table, catalog=None)#

Add a SQL table to the list of tables to import

Parameters:
  • connection (str) – the name of the SQL connection

  • schema (str) – the schema of the table

  • table (str) – the name of the SQL table

  • catalog (str) – the database of the SQL table. Leave to None to use the default database associated with the connection
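
A minimal sketch of queuing a batch of SQL tables. The helper and the `(schema, table)` pair convention are illustrative; only `add_sql_table` and its `catalog=None` default (use the connection's default database) come from the API above.

```python
def queue_sql_tables(table_import, connection, tables, catalog=None):
    """Queue several SQL tables for import.

    `tables` is an iterable of (schema, table) pairs. Leaving
    catalog as None keeps the default database associated with
    the connection.
    """
    for schema, table in tables:
        table_import.add_sql_table(connection, schema, table, catalog=catalog)
    return table_import
```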

add_elasticsearch_index_or_alias(connection, index_or_alias)#

Add an Elasticsearch index or alias to the list of tables to import

Parameters:
  • connection (str) – the name of the Elasticsearch connection

  • index_or_alias (str) – the name of the index or alias
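
A minimal sketch of queuing Elasticsearch indices or aliases; the helper name is illustrative, and only the `add_elasticsearch_index_or_alias` call comes from the API above.

```python
def queue_es_indices(table_import, connection, indices):
    """Queue Elasticsearch indices (or aliases) for import.

    `table_import` is assumed to be a TablesImportDefinition and
    `indices` an iterable of index or alias names on the given
    Elasticsearch connection.
    """
    for index_or_alias in indices:
        table_import.add_elasticsearch_index_or_alias(connection, index_or_alias)
    return table_import
```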

prepare()#

Run the first step of the import process. In this step, DSS will check the tables whose import you have requested and prepare dataset names and target connections

Returns:

an object that allows you to finalize the import process

Return type:

TablesPreparedImport
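
The whole first step can be sketched as follows. This assumes the import definition is obtained from the project via `init_tables_import()` (as in the DSS project API); the helper name and the connection/table values are placeholders.

```python
def prepare_tables_import(project, connection, sql_tables):
    """Queue SQL tables on a DSS project and run the preparation step.

    `project` is assumed to be a dataikuapi DSSProject exposing
    init_tables_import(); `sql_tables` is a list of (schema, table)
    pairs. prepare() has DSS check the requested tables and resolve
    dataset names and target connections, returning a
    TablesPreparedImport used to finalize the import.
    """
    table_import = project.init_tables_import()
    for schema, table in sql_tables:
        table_import.add_sql_table(connection, schema, table)
    return table_import.prepare()
```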

class dataikuapi.dss.project.TablesPreparedImport(client, project_key, candidates)#

Result of preparing a tables import. The import can now be finalized

execute()#

Starts executing the import in the background and returns a dataikuapi.dss.future.DSSFuture to wait on the result

Returns:

a future to wait on the result

Return type:

dataikuapi.dss.future.DSSFuture
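
The final step can be sketched as below. The helper name is illustrative; `execute()` comes from the API above, and blocking on the returned future with `wait_for_result()` follows the usual `DSSFuture` pattern.

```python
def run_prepared_import(prepared_import):
    """Execute a TablesPreparedImport and block until it completes.

    execute() starts the import in the background and returns a
    DSSFuture; wait_for_result() blocks until the import finishes
    and returns its result.
    """
    future = prepared_import.execute()
    return future.wait_for_result()
```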