All-in-One Components

All-in-One components are new in Dash 2.0! To understand and make All-in-One components, you’ll first need to understand Basic Callbacks, Pattern-Matching Callbacks, Layout, and Sharing Data Between Callbacks.

All-in-One Components are a convention for encapsulating layout and callbacks into a reusable structure. This pattern uses standard Dash components with pattern-matching callbacks and Dash 2.0’s dash.callback interface.

All-in-One Component Use Cases

All-in-One Components can be useful when:

Inspiration

Here are some of the types of components you might create with the All-in-One component pattern:

The All-in-One Component Convention

Example 1: Simple Component

The All-in-One Convention is best explained through an example.

Here is an example demonstrating how to build a simple reusable component — a Markdown block with text configurable by a dropdown.

aio_components.py

from dash import Dash, Output, Input, State, html, dcc, callback, MATCH
import uuid

# All-in-One Components should be suffixed with 'AIO'
class MarkdownWithColorAIO(html.Div):  # html.Div will be the "parent" component

    # A set of functions that create pattern-matching IDs for the subcomponents
    class ids:
        dropdown = lambda aio_id: {
            'component': 'MarkdownWithColorAIO',
            'subcomponent': 'dropdown',
            'aio_id': aio_id
        }
        markdown = lambda aio_id: {
            'component': 'MarkdownWithColorAIO',
            'subcomponent': 'markdown',
            'aio_id': aio_id
        }

    # Make the ids class a public class
    ids = ids

    # Define the arguments of the All-in-One component
    def __init__(
        self,
        text,
        colors=None,
        markdown_props=None,
        dropdown_props=None,
        aio_id=None
    ):
        """MarkdownWithColorAIO is an All-in-One component that is composed
        of a parent `html.Div` with a `dcc.Dropdown` color picker ("`dropdown`") and a
        `dcc.Markdown` ("`markdown`") component as children.
        The markdown component's color is determined by the dropdown colorpicker.
        - `text` - The Markdown component's text (required)
        - `colors` - The colors displayed in the dropdown
        - `markdown_props` - A dictionary of properties passed into the dcc.Markdown component. See https://dash.plotly.com/dash-core-components/markdown for the full list.
        - `dropdown_props` - A dictionary of properties passed into the dcc.Dropdown component. See https://dash.plotly.com/dash-core-components/dropdown for the full list.
        - `aio_id` - The All-in-One component ID used to generate the markdown and dropdown components' dictionary IDs.

        The All-in-One component dictionary IDs are available as
        - MarkdownWithColorAIO.ids.dropdown(aio_id)
        - MarkdownWithColorAIO.ids.markdown(aio_id)
        """
        colors = colors if colors else ['#001f3f', '#0074D9', '#85144b', '#3D9970']

        # Allow developers to pass in their own `aio_id` if they're
        # binding their own callback to a particular component.
        if aio_id is None:
            # Otherwise, use a uuid that has virtually no chance of collision.
            # Uuids are safe in Dash deployments with multiple processes
            # because this component's callback is a stateless pattern-matching
            # callback: the actual ID doesn't matter as long as it's unique
            # and matches the `MATCH` pattern.
            aio_id = str(uuid.uuid4())

        # Merge user-supplied properties into default properties
        dropdown_props = dropdown_props.copy() if dropdown_props else {}
        if 'options' not in dropdown_props:
            dropdown_props['options'] = [{'label': i, 'value': i} for i in colors]
        if 'value' not in dropdown_props:
            dropdown_props['value'] = dropdown_props['options'][0]['value']

        # Merge user-supplied properties into default properties
        markdown_props = markdown_props.copy() if markdown_props else {} # copy the dict so as to not mutate the user's dict
        if 'style' not in markdown_props:
            markdown_props['style'] = {'color': dropdown_props['value']}
        if 'children' not in markdown_props:
            markdown_props['children'] = text

        # Define the component's layout
        super().__init__([  # Equivalent to `html.Div([...])`
            dcc.Dropdown(id=self.ids.dropdown(aio_id), **dropdown_props),
            dcc.Markdown(id=self.ids.markdown(aio_id), **markdown_props)
        ])

    # Define this component's stateless pattern-matching callback
    # that will apply to every instance of this component.
    @callback(
        Output(ids.markdown(MATCH), 'style'),
        Input(ids.dropdown(MATCH), 'value'),
        State(ids.markdown(MATCH), 'style'),
    )
    def update_markdown_style(color, existing_style):
        existing_style['color'] = color
        return existing_style

Then in our application app.py, we can use this component like:

from aio_components import MarkdownWithColorAIO
from dash import Dash, html

app = Dash(__name__)

app.layout = MarkdownWithColorAIO('## Hello World')

if __name__ == "__main__":
    app.run(debug=False)

Or with custom top-level and subcomponent properties:

app.layout = MarkdownWithColorAIO(
    'Custom colors',
    colors=['cornflowerblue', 'darkolivegreen', 'darkslateblue'],
    dropdown_props={
        'persistence': True
    }
)

Or bind our own custom callbacks to a subcomponent:

app.layout = html.Div([
    MarkdownWithColorAIO(
        'Custom callback',
        aio_id='color-picker'
    ),

    html.Div(id='color-picker-output')
])

@callback(
    Output('color-picker-output', 'children'),
    Input(MarkdownWithColorAIO.ids.dropdown('color-picker'), 'value')
)
def display_color(value):
    return f'You have selected {value}'

Key Points

The custom callback written with MarkdownWithColorAIO.ids.dropdown('color-picker') is equivalent to:

@callback(
    Output('color-picker-output', 'children'),
    Input({
        'component': 'MarkdownWithColorAIO',
        'subcomponent': 'dropdown',
        'aio_id': 'color-picker'
    }, 'value')
)

and the callback defined within the class is equivalent to:

@callback(
    Output({
        'component': 'MarkdownWithColorAIO',
        'subcomponent': 'markdown',
        'aio_id': MATCH
    }, 'style'),
    Input({
        'component': 'MarkdownWithColorAIO',
        'subcomponent': 'dropdown',
        'aio_id': MATCH
    }, 'value'),
    State({
        'component': 'MarkdownWithColorAIO',
        'subcomponent': 'markdown',
        'aio_id': MATCH
    }, 'style'),
)

All-in-One Convention Summary

All-in-One Components adhere to the following convention:

  1. AIO Suffix - Component names are suffixed with AIO (“All-in-One”). For example, DataTableAIO or MarkdownWithColorAIO. This suffix indicates to other Dash developers that it is an “All-in-One” component.
  2. <Component>.ids.<subcomponent> - The ids class exposes the pattern-matching IDs to the developer using the component.
  3. <subcomponent>_props= - Exposing the Embedded Component’s Properties - All-in-One components should provide keyword arguments in the form <subcomponent>_props (e.g. markdown_props and dropdown_props in the example above) that are passed down into the components that comprise the All-in-One component. This allows developers to customize the All-in-One component. The <subcomponent> names should be descriptive and should match the names in the ids class.
  4. dash.callback instead of app.callback - All-in-One components should define callbacks using the module-level dash.callback (or from dash import callback) instead of binding callbacks to the user-defined app. This enables components to be imported directly without requiring the developer to pass app into the component.
  5. aio_id - All-in-One components are composed of several Dash components and so they don’t have a single id like regular Dash components - Each subcomponent has its own id! These subcomponent IDs are pattern-matching IDs generated by the component. So, the All-in-One component convention does not allow a top-level id= keyword argument. Instead, they should provide an aio_id argument that is used when constructing the Pattern-Matching dictionary ID.
  6. Callbacks Are Defined on Import - The @callback declaration is a class-level declaration, executed when the component is imported. This ensures the callbacks are defined and registered before the app starts, satisfying an important requirement of Dash: all callbacks must be registered before the server starts.
  7. Readable Source - All-in-One components are composed of the standard Dash components and callbacks. It is expected that Dash developers should be able to read the source code of each other’s All-in-One components to understand its layout structure, the logic in component’s callbacks, and how the component’s properties are passed down into the sub components.
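Point 6 can be illustrated without Dash at all: decorators in a class body run when the module is imported. Below is a minimal sketch with a stand-in for dash.callback (the names callback_stub and DemoAIO are hypothetical, for illustration only):

```python
# Stand-in for dash's @callback decorator: it records the function at
# class-definition (i.e. import) time, before any app exists.
registered = []

def callback_stub(*args, **kwargs):
    def register(fn):
        registered.append(fn.__name__)
        return fn
    return register

class DemoAIO:
    # This decorator runs as soon as the module is imported, so the
    # callback is registered before the app starts (point 6 above).
    @callback_stub('style', 'value')
    def update(value):
        return value

print(registered)  # -> ['update']
```

This is why importing an All-in-One component is enough to register its callbacks, with no `app` object in sight.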

Using a Subset of the Convention

The convention outlined above is just that: a convention. A community-adopted convention means that the components the community shares, publishes, and consumes will have a similar, familiar, and customizable API.

These conventions aren’t enforced. If you aren’t publishing your component to the community, you may skip writing the docstring or you may not wire through each subcomponent’s properties with the <subcomponent>_props keywords. Adhering to every step of the convention takes more effort but provides the most flexibility for consumers of your component.

If your component doesn’t have a layout, then you can skip the classes and pattern-matching dictionary IDs altogether and simply write a function that returns a Dash component.

Example 2: DataTableAIO - Sharing Data Between __init__ and callback

Background

MarkdownWithColorAIO isn’t a particularly useful component.

The example below defines a more general purpose and real-world example: DataTableAIO.

The DataTable component provides clientside filtering, sorting, and paging “natively” in JavaScript, in the browser. This functionality requires that all of the data be passed from the server to the browser.

For large datasets, it’s impractical to send all of the data from the server to the browser. The data should be kept on the server side in memory or in the database and only subsets of the dataset should be sent to the client.
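The core idea, sending only the slice the browser asked for, can be sketched in plain Python (the dataset below is a stand-in, not part of the component):

```python
# Server-side paging in a nutshell: the full dataset stays on the server;
# the browser requests a page and receives only that slice.
data = list(range(100_000))          # stand-in for a large server-side dataset
page_current, page_size = 3, 20      # requested by the DataTable in the browser

# Same slicing rule that DataTableAIO's `page_df` applies with Pandas `iloc`
page = data[page_current * page_size:(page_current + 1) * page_size]

print(len(page), page[0])  # -> 20 60
```

The browser only ever holds one page of rows, however large the dataset is on the server.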

The DataTable supports filtering, sorting, and paging with custom Python callbacks.

This functionality is general purpose—the same filtering, sorting, and paging logic demonstrated in the DataTable with Pandas examples can be copied from one project to the next.

This makes DataTable with a Pandas callback the perfect candidate for an “All-in-One” component: instead of copying the same 150 lines of code whenever you use a DataTable with Pandas, create an All-in-One component and reuse it.

Usage

DataTableAIO, as implemented below, performs filtering, sorting, and paging in a
callback in Pandas using the same recipe in the DataTable & Python Callbacks documentation.

DataTableAIO is complex and is implemented in over 250 lines of code. To start, we’ll demonstrate how this component may be used:

  1. Pass a df into DataTableAIO directly.

app.py

from aio_components import DataTableAIO

from dash import Dash, callback
import plotly.express as px

df = px.data.iris()

app = Dash(__name__)

app.layout = DataTableAIO(df)

Note how we didn’t need to convert df into a list of dictionaries.
Since All-in-One components are written in Python, they can implement their own
custom serialization code. In this implementation, DataTableAIO converts
the DataFrame into a list of dictionaries and creates the columns from the df.
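The shapes involved can be sketched without pandas (the column names here are illustrative, not the component's API):

```python
# DataTable's `data` property is a list of per-row dicts, and `columns`
# is a list of {'name', 'id'} dicts; DataTableAIO derives both from the df.
records = [
    {'sepal_width': 3.5, 'species': 'setosa'},
    {'sepal_width': 3.0, 'species': 'versicolor'},
]
columns = [{'name': c, 'id': c} for c in records[0]]

print(columns)
```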

  2. Access the DataFrame in a separate callback.

DataTableAIO may be defined dynamically in a callback and so the df may not
be available in the global scope.

This implementation stores the df that was passed into it in Redis
so that it can be accessed in other callbacks.

DataTableAIO filters, sorts, and pages the data automatically.
These filtering, sorting, and paging functions are also exposed in the
DataTableAIO API as DataTableAIO.filter_df, DataTableAIO.sort_df, and
DataTableAIO.page_df respectively.

This example creates a DataTableAIO dynamically in a callback,
retrieves the df from a separate callback, and filters the data using the
same function that DataTableAIO uses internally.

from aio_components import DataTableAIO, redis_store  # `aio_components.py` as defined below

from dash import Dash, callback, html, dcc, Output, Input, State
import plotly.express as px

app = Dash(__name__)

app.layout = html.Div([
    dcc.Dropdown(
        ['iris', 'gapminder'],
        'iris',
        id='dataset-selector'
    ),
    html.Div(id='datatable-container'),
    html.Div(id='graphs')
])

# Display a DataTableAIO based on the selected dataset
@callback(
    Output('datatable-container', 'children'),
    Input('dataset-selector', 'value'))
def display_datatable(value):
    df = getattr(px.data, value)()
    return DataTableAIO(df, aio_id='my-dataset')

# Graph the filtered data
@callback(
    Output('graphs', 'children'),
    Input(DataTableAIO.ids.datatable('my-dataset'), 'filter_query'),
    State(DataTableAIO.ids.store('my-dataset'), 'data')
)
def update_graph(filter_query, store):
    # Fetch the data that `DataTableAIO` stored in Redis
    df = redis_store.load(store['df'])
    # Filter the data using the method that `DataTableAIO` exposes
    # and uses internally
    df = DataTableAIO.filter_df(df, filter_query)

    # Graph each column of the data
    divs = html.Div([
        html.Div([
            html.B(c),
            dcc.Graph(figure=px.histogram(x=df[c]))
        ]) for c in df.columns
    ])
    return divs

if __name__ == "__main__":
    app.run(debug=True)

Install

The DataTableAIO example below uses the redis, fakeredis, and pyarrow packages:

pip install pyarrow fakeredis redis

DataTableAIO Implementation

aio_components.py

Copy and paste the following file to e.g. aio_components.py to use DataTableAIO in your projects as above.

from dash import callback, html, dash_table, dcc, Input, Output, State, MATCH
import fakeredis
import hashlib
import io
import json
import os
import pandas as pd
import plotly
import redis
import warnings
import uuid

class redis_store:
    """Save data to Redis using the hashed contents as the key.
    Serialize Pandas DataFrames as memory-efficient Parquet files.

    Otherwise, attempt to serialize the data as JSON, which may have a
    lossy conversion back to its original type. For example, numpy arrays will
    be deserialized as regular Python lists.

    Connect to Redis with the environment variable `REDIS_URL` if available.
    Otherwise, use FakeRedis, which is only suitable for development and
    will not scale across multiple processes.
    """
    if 'REDIS_URL' in os.environ:
        r = redis.StrictRedis.from_url(os.environ["REDIS_URL"])
    else:
        warnings.warn('Using FakeRedis - Not suitable for Production Use.')
        r = fakeredis.FakeStrictRedis()

    @staticmethod
    def _hash(serialized_obj):
        return hashlib.sha512(serialized_obj).hexdigest()

    @staticmethod
    def save(value):
        if isinstance(value, pd.DataFrame):
            buffer = io.BytesIO()
            value.to_parquet(buffer, compression='gzip')
            buffer.seek(0)
            df_as_bytes = buffer.read()
            hash_key = redis_store._hash(df_as_bytes)
            data_type = 'pd.DataFrame'
            serialized_value = df_as_bytes
        else:
            serialized_value = json.dumps(value, cls=plotly.utils.PlotlyJSONEncoder).encode('utf-8')
            hash_key = redis_store._hash(serialized_value)
            data_type = 'json-serialized'

        redis_store.r.set(
            f'_dash_aio_components_value_{hash_key}',
            serialized_value
        )
        redis_store.r.set(
            f'_dash_aio_components_type_{hash_key}',
            data_type
        )
        return hash_key

    @staticmethod
    def load(hash_key):
        data_type = redis_store.r.get(f'_dash_aio_components_type_{hash_key}')
        serialized_value = redis_store.r.get(f'_dash_aio_components_value_{hash_key}')
        try:
            if data_type == b'pd.DataFrame':
                value = pd.read_parquet(io.BytesIO(serialized_value))
            else:
                value = json.loads(serialized_value)
        except Exception as e:
            print(e)
            print(f'ERROR LOADING {data_type} {hash_key}')
            raise e
        return value


# DataTable-specific filtering logic from https://dash.plotly.com/datatable/callbacks
_operators = [
    ['ge ', '>='],
    ['le ', '<='],
    ['lt ', '<'],
    ['gt ', '>'],
    ['ne ', '!='],
    ['eq ', '='],
    ['contains '],
    ['datestartswith ']]


def _split_filter_part(filter_part):
    for operator_type in _operators:
        for operator in operator_type:
            if operator in filter_part:
                name_part, value_part = filter_part.split(operator, 1)
                name = name_part[name_part.find('{') + 1: name_part.rfind('}')]

                value_part = value_part.strip()
                v0 = value_part[0]
                if (v0 == value_part[-1] and v0 in ("'", '"', '`')):
                    value = value_part[1: -1].replace('\\' + v0, v0)
                else:
                    try:
                        value = float(value_part)
                    except ValueError:
                        value = value_part
                # word operators need spaces after them in the filter string,
                # but we don't want these spaces later
                return name, operator_type[0].strip(), value

    return [None] * 3

class DataTableAIO(html.Div):
    class ids:
        datatable = lambda aio_id: {
            'component': 'DataTableAIO',
            'subcomponent': 'datatable',
            'aio_id': aio_id
        }
        store = lambda aio_id: {
            'component': 'DataTableAIO',
            'subcomponent': 'store',
            'aio_id': aio_id
        }
    ids = ids

    def __init__(self, df=None, aio_id=None, **datatable_props):
        """DataTableIO is an All-in-One component that is composed of a parent `html.Div`
        with a `dcc.Store` and a `dash_table.DataTable` as children.
        The dataframe filtering, paging, and sorting is performed in a built-in
        callback that uses Pandas.

        The DataFrame is stored in Redis as a Parquet file via the
        `redis_store` class. The `dcc.Store` contains the Redis key to the
        DataFrame, which can be retrieved with `redis_store.load(store['df'])`
        in a separate callback.

        The underlying functions that filter, sort, and page the data are
        accessible via `filter_df`, `sort_df`, and `page_df` respectively.

        - `df` - A Pandas dataframe
        - `aio_id` - The All-in-One component ID used to generate the `dcc.Store` and `DataTable` components' dictionary IDs.
        - `**datatable_props` - Properties passed into the underlying `DataTable`
        """
        if aio_id is None:
            aio_id = str(uuid.uuid4())

        # Infer DataTable column types from the Pandas DataFrame
        columns = []
        columns_cast_to_string = []
        for c in (df.columns if df is not None else []):  # df may be None if `data=` was passed instead
            column = {'name': c, 'id': c}
            dtype = pd.api.types.infer_dtype(df[c])
            if dtype.startswith('mixed'):
                columns_cast_to_string.append(c)
                df[c] = df[c].astype(str)

            if pd.api.types.is_numeric_dtype(df[c]):
                column['type'] = 'numeric'
            elif pd.api.types.is_string_dtype(df[c]):
                column['type'] = 'text'
            elif pd.api.types.is_datetime64_any_dtype(df[c]):
                column['type'] = 'datetime'
            else:
                columns_cast_to_string.append(c)
                df[c] = df[c].astype(str)
                column['type'] = 'text'
            columns.append(column)

        if columns_cast_to_string:
            warnings.warn(
                'Converted the following mixed-type columns to ' +
                'strings so that they can be saved in Redis or JSON: ' +
                f'{", ".join(columns_cast_to_string)}'
            )

        derived_kwargs = datatable_props.copy()

        # Store the DataFrame in Redis and the hash key in `dcc.Store`
        # Allow the user to pass in `df=` or `data=` as per `DataTable`.
        store_data = {}
        if df is None and 'data' in datatable_props:
            store_data['df'] = redis_store.save(
                pd.DataFrame(datatable_props['data'])
            )
        elif df is not None and 'data' not in datatable_props:
            store_data['df'] = redis_store.save(df)
        elif df is not None and 'data' in datatable_props:
            raise Exception('The `df` argument cannot be supplied with the data argument - it\'s ambiguous.')
        else:
            raise Exception('No data supplied. Pass in a dataframe as `df=` or a list of dictionaries as `data=`')

        # Allow the user to pass in their own columns, otherwise define our own.
        if df is not None:
            if 'columns' not in datatable_props:
                derived_kwargs['columns'] = columns

        # Allow the user to override these properties, otherwise provide defaults
        derived_kwargs['page_current'] = derived_kwargs.get('page_current', 0)
        derived_kwargs['page_size'] = derived_kwargs.get('page_size', 10)
        derived_kwargs['page_action'] = derived_kwargs.get('page_action', 'custom')
        derived_kwargs['filter_action'] = derived_kwargs.get('filter_action', 'custom')
        derived_kwargs['filter_query'] = derived_kwargs.get('filter_query', '')
        derived_kwargs['sort_action'] = derived_kwargs.get('sort_action', 'custom')
        derived_kwargs['sort_mode'] = derived_kwargs.get('sort_mode', 'multi')
        derived_kwargs['sort_by'] = derived_kwargs.get('sort_by', [])

        super().__init__([
            dcc.Store(data=store_data, id=self.ids.store(aio_id)),
            dash_table.DataTable(id=self.ids.datatable(aio_id), **derived_kwargs)
        ])

    @staticmethod
    def filter_df(df, filter_query):
        """Filter a Pandas dataframe as per the `filter_query` provided by
        the DataTable.
        """
        filtering_expressions = filter_query.split(' && ')
        for filter_part in filtering_expressions:
            col_name, operator, filter_value = _split_filter_part(filter_part)

            if operator in ('eq', 'ne', 'lt', 'le', 'gt', 'ge'):
                # these operators match pandas series operator method names
                df = df.loc[getattr(df[col_name], operator)(filter_value)]
            elif operator == 'contains':
                df = df.loc[df[col_name].str.contains(filter_value)]
            elif operator == 'datestartswith':
                # this is a simplification of the front-end filtering logic,
                # only works with complete fields in standard format
                df = df.loc[df[col_name].str.startswith(filter_value)]
        return df

    @staticmethod
    def sort_df(df, sort_by):
        """Sort a Pandas dataframe as per the DataTable `sort_by` property.
        """
        if len(sort_by):
            df = df.sort_values(
                [col['column_id'] for col in sort_by],
                ascending=[
                    col['direction'] == 'asc'
                    for col in sort_by
                ],
                inplace=False
            )
        return df

    @staticmethod
    def page_df(df, page_current, page_size):
        """Page a Pandas dataframe as per the DataTable `page_current`
        and `page_size` parameters.
        """
        return df.iloc[page_current * page_size: (page_current + 1) * page_size]

    @callback(
        Output(ids.datatable(MATCH), 'data'),
        Input(ids.datatable(MATCH), 'page_current'),
        Input(ids.datatable(MATCH), 'page_size'),
        Input(ids.datatable(MATCH), 'sort_by'),
        Input(ids.datatable(MATCH), 'filter_query'),
        State(ids.store(MATCH), 'data')
    )
    def filter_sort_page(page_current, page_size, sort_by, filter_query, store):
        df = redis_store.load(store['df'])
        df = DataTableAIO.filter_df(df, filter_query)
        df = DataTableAIO.sort_df(df, sort_by)
        df = DataTableAIO.page_df(df, page_current, page_size)
        return df.to_dict('records')
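Before walking through the component, it helps to see what `_split_filter_part` produces for a single filter part. The snippet below reimplements the relevant steps so it stands alone (the column name is illustrative):

```python
# Tracing the splitting logic for the filter part '{sepal_width} ge 3.5':
filter_part = '{sepal_width} ge 3.5'

name_part, value_part = filter_part.split('ge ', 1)  # matched operator: 'ge '
name = name_part[name_part.find('{') + 1: name_part.rfind('}')]
value = float(value_part.strip())                    # unquoted value -> try float

print((name, 'ge', value))  # -> ('sepal_width', 'ge', 3.5)
```

`filter_df` then maps `'ge'` onto the pandas `Series.ge` method via `getattr`.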

Walkthrough

All-in-One Component Conventions

DataTableAIO adheres to the All-in-One component convention outlined above.

Redis

DataTableAIO accepts a df, and its stateless pattern-matching callback filters, sorts, and pages that data. The stateless callback is defined outside of the __init__ function so that it is registered before the Dash app starts. The __init__ function itself may be called after the Dash app starts if DataTableAIO is returned within the children property of a separate component. In production, the callback that returned DataTableAIO might run on a separate process from the callback that ultimately filters, sorts, and pages the df. This means that df needs to be stored somewhere accessible to both processes.

Redis is a durable, in-memory database that is well suited for storing data in production. A Redis database can be made available to all of the independent processes running a stateless Dash app.

In this implementation, the aio_components.redis_store class serializes the dataframe as a Parquet object and stores it in Redis, keyed by the hash of the dataset’s contents. Because the key is derived from the data itself, only unique versions of the dataset are stored in memory.
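The deduplication follows directly from content-addressed keys. A standalone sketch (the content_key helper is hypothetical, mirroring what redis_store._hash does with the serialized bytes):

```python
import hashlib
import json

def content_key(obj):
    # Key = hash of the serialized contents, as in `redis_store._hash`:
    # identical datasets produce identical keys, so each unique dataset
    # is stored in Redis only once, no matter how many users load it.
    serialized = json.dumps(obj, sort_keys=True).encode('utf-8')
    return hashlib.sha512(serialized).hexdigest()

a = content_key([1, 2, 3])
b = content_key([1, 2, 3])
c = content_key([1, 2, 4])

print(a == b, a == c)  # -> True False
```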

The hash of the dataset is stored in the user’s session via dcc.Store. Filtering, sorting, or paging the table triggers the built-in pattern-matching callback, which retrieves the data from Redis via the hash key stored in that session’s dcc.Store.

The component’s consumer can also retrieve this dataset hash with a callback that has DataTableAIO.ids.store as Input or State.

In the future, redis_store may become a first-class member of dash. In the meantime, you can use this code and modify it as you see fit.

Note: A common alternative is to key the stored data by each user’s session. Storing data by session key can be expensive: each user visiting the app gets their own copy of the data in Redis, even when the datasets are the same. It can also be difficult to determine when to clear data from Redis, as sessions can be long-lived even without activity. Storing the data by hash is more efficient, especially when the application uses a small number of datasets.

Exposing Callback Internals - DataTableAIO.filter_df

DataTableAIO.filter_df, DataTableAIO.sort_df, and DataTableAIO.page_df are stateless functions that filter, sort, and page a DataFrame with Pandas. DataTableAIO uses these functions internally in its pattern-matching callback.

These functions are exposed to developers as part of the public DataTableAIO API. This allows developers to use the same functional methods that the table uses in their callbacks, ensuring the same results and preventing code duplication.

If your All-in-One component’s callback encapsulates complex but reusable functionality, then we recommend exposing those stateless functions as part of the component API so that end users can benefit. This is simply a convention of All-in-One components.

All-in-One Component Limitations

Community Discussion

The All-in-One components pattern emerged from discussions in the Dash Community Forums. Many thanks to everyone in the community who contributed to these discussions.