Common subjects on webapps

  • Impersonation with webapps
  • Accessible resources from webapps
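
Both pages deal with how a webapp backend identifies, and works on behalf of, the user currently viewing it. As a minimal sketch of the impersonation topic (not taken from these pages), the snippet below resolves the viewing user from the browser headers forwarded by DSS; it assumes a Flask-based backend (a standard or Dash webapp) and uses the public API client's get_auth_info_from_browser_headers method.

    # Minimal sketch: identify the user currently viewing the webapp.
    # Assumes a Flask-based webapp backend (standard or Dash), where the
    # incoming request carries the browser headers forwarded by DSS.
    import dataiku
    from flask import request

    def current_user_login():
        client = dataiku.api_client()
        # Resolve authentication info from the end user's browser headers
        auth_info = client.get_auth_info_from_browser_headers(dict(request.headers))
        return auth_info["authIdentifier"]

The same call also accepts a with_secrets argument to additionally retrieve the user's secrets; the linked pages cover impersonation and resource access in detail.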
