
How to Design an Advanced Multi-Page Analytics Dashboard with Powerful Filtering, Live KPIs, and Rich Visualizations Using Panel

In this tutorial, we build an advanced custom dashboard with Panel that runs end to end in Google Colab. At each step of the implementation, we look at how to generate synthetic data, apply rich filters, visualize time-series trends, compare segments and regions, and simulate live KPI updates. We design the system step by step to understand how each widget, callback, and reactive function comes together to create a smooth, efficient analytics experience.

import sys, subprocess


def install_deps():
   pkgs = ["panel", "hvplot", "pandas", "numpy", "bokeh"]
   subprocess.check_call([sys.executable, "-m", "pip", "install", "-q"] + pkgs)


try:
   import panel as pn
   import hvplot.pandas
   import pandas as pd
   import numpy as np
except ImportError:
   install_deps()
   import panel as pn
   import hvplot.pandas
   import pandas as pd
   import numpy as np


pn.extension()


rng = np.random.default_rng(42)
dates = pd.date_range("2024-01-01", periods=365, freq="D")
segments = ["A", "B", "C"]
regions = ["North", "South", "East", "West"]


base = pd.DataFrame(
   {
       "date": np.tile(dates, len(segments) * len(regions)),
       "segment": np.repeat(segments, len(dates) * len(regions)),
       "region": np.repeat(np.tile(regions, len(segments)), len(dates)),
   }
)
base["traffic"] = (
   100
   + 40 * np.sin(2 * np.pi * base["date"].dt.dayofyear / 365)
   + rng.normal(0, 15, len(base))
)
trend = {"A": 1.0, "B": 1.5, "C": 2.0}
base["traffic"] *= base["segment"].map(trend)
base["conversions"] = (base["traffic"] * rng.uniform(0.01, 0.05, len(base))).astype(int)
base["revenue"] = base["conversions"] * rng.uniform(20, 60, len(base))
df = base.reset_index(drop=True)

We install the necessary dependencies and load Panel, hvPlot, pandas, and NumPy so that the dashboard runs smoothly in Colab. We then generate a full year of synthetic daily time-series data across every segment and region, giving us rich data to analyze. By the end of this block, we have a clean daily dataset ready for everything that follows.
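To see why the `np.tile`/`np.repeat` pattern above lines up correctly, here is a minimal sketch (with a smaller toy calendar, not taken from the tutorial) verifying that every (segment, region) pair covers every date exactly once:

```python
import numpy as np
import pandas as pd

# Toy-scale version of the tutorial's construction: tile dates, repeat segments,
# and repeat a tiled region list so the three columns interleave correctly.
dates = pd.date_range("2024-01-01", periods=3, freq="D")
segments = ["A", "B"]
regions = ["North", "South"]

frame = pd.DataFrame(
    {
        "date": np.tile(dates, len(segments) * len(regions)),
        "segment": np.repeat(segments, len(dates) * len(regions)),
        "region": np.repeat(np.tile(regions, len(segments)), len(dates)),
    }
)

# 3 dates x 2 segments x 2 regions = 12 rows, one per combination
counts = frame.groupby(["segment", "region"])["date"].nunique()
print(len(frame), counts.min(), counts.max())  # 12 3 3
```

Each (segment, region) group contains all three dates with no duplicates, which is exactly the property the full 365 × 3 × 4 frame relies on.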

segment_sel = pn.widgets.CheckBoxGroup(name="Segment", value=segments[:2], options=segments, inline=True)
region_sel = pn.widgets.MultiChoice(name="Region", value=["North"], options=regions)
metric_sel = pn.widgets.Select(name="Metric", value="traffic", options=["traffic", "conversions", "revenue"])
date_range = pn.widgets.DateRangeSlider(
   name="Date Range",
   start=df["date"].min(),
   end=df["date"].max(),
   value=(df["date"].min(), df["date"].max()),
)
smooth_slider = pn.widgets.IntSlider(name="Rolling Window (days)", start=1, end=30, value=7)


def filtered_df(segment, region, drange):
   d1, d2 = drange
   mask = (
       df["segment"].isin(segment)
       & df["region"].isin(region or regions)
       & (df["date"] >= d1)
       & (df["date"] <= d2)
   )
   sub = df[mask].copy()
   if sub.empty:
       return df.iloc[:0]
   return sub


@pn.depends(segment_sel, region_sel, metric_sel, smooth_slider, date_range)
def timeseries_plot(segment, region, metric, window, drange):
   data = filtered_df(segment, region, drange)
   if data.empty:
       return pn.pane.Markdown("### No data for current filters")
   grouped = data.sort_values("date").groupby("date")[metric].sum()
   line = grouped.hvplot.line(title=f"{metric.title()} over time", ylabel=metric.title())
   if window > 1:
       smooth = grouped.rolling(window).mean().hvplot.line(line_width=3, alpha=0.6)
       return (line * smooth).opts(legend_position="top_left")
   return line

We create the core widgets and a filtering function that controls the entire dashboard. We wire the time-series plot to the widgets with the reactive @pn.depends decorator, so changes to segments, regions, metrics, date windows, and the smoothing window take effect instantly. With this setup, we can explore ideas fluidly and see results in real time.
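One subtle detail in `filtered_df` is the `region or regions` idiom: when the MultiChoice selection is empty, the filter falls back to all regions instead of returning nothing. A minimal sketch of that mask logic on a toy frame:

```python
import pandas as pd

regions = ["North", "South"]
toy = pd.DataFrame(
    {
        "segment": ["A", "A", "B", "B"],
        "region": ["North", "South", "North", "South"],
        "traffic": [10, 20, 30, 40],
    }
)

def select(segment, region):
    # An empty `region` list is falsy, so `region or regions` selects everything.
    mask = toy["segment"].isin(segment) & toy["region"].isin(region or regions)
    return toy[mask]

print(select(["A"], ["North"])["traffic"].sum())  # 10: only A/North
print(select(["A"], [])["traffic"].sum())         # 30: empty region -> all regions
```

Without the fallback, clearing the region widget would blank out every plot; with it, the dashboard degrades gracefully to "all regions".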

@pn.depends(segment_sel, region_sel, metric_sel, date_range)
def segment_bar(segment, region, metric, drange):
   data = filtered_df(segment, region, drange)
   if data.empty:
       return pn.pane.Markdown("### No data to aggregate")
   agg = data.groupby("segment")[metric].sum().sort_values(ascending=False)
   return agg.hvplot.bar(title=f"{metric.title()} by Segment", yaxis=None)


@pn.depends(segment_sel, region_sel, metric_sel, date_range)
def region_heatmap(segment, region, metric, drange):
   data = filtered_df(segment, region, drange)
   if data.empty:
       return pn.pane.Markdown("### No data to aggregate")
   pivot = data.pivot_table(index="segment", columns="region", values=metric, aggfunc="sum")
   return pivot.hvplot.heatmap(title=f"{metric.title()} Heatmap", clabel=metric.title())

We add two more visual layers: a segment-level bar chart and a segment-by-region heatmap. Both charts respond to the same global filters, so they update automatically with every selection. This gives us a deeper breakdown of patterns across segments and regions without writing any extra callback code.
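The heatmap is driven by a `pivot_table` that sums the chosen metric into a segment × region grid. A small sketch of that aggregation step on its own:

```python
import pandas as pd

toy = pd.DataFrame(
    {
        "segment": ["A", "A", "B", "B", "A"],
        "region": ["North", "South", "North", "South", "North"],
        "revenue": [100, 200, 300, 400, 50],
    }
)

# Segments become rows, regions become columns; duplicate cells are summed,
# so A/North aggregates 100 + 50 = 150.
pivot = toy.pivot_table(index="segment", columns="region", values="revenue", aggfunc="sum")
print(pivot)
```

hvPlot's `.heatmap()` then simply colors each cell of this 2-D grid, which is why the chart needs no extra aggregation logic of its own.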

kpi_source = df.copy()
kpi_idx = [0]


def compute_kpi(slice_df):
   if slice_df.empty:
       return 0, 0, 0
   total_rev = slice_df["revenue"].sum()
   avg_conv = slice_df["conversions"].mean()
   cr = (slice_df["conversions"].sum() / slice_df["traffic"].sum()) * 100
   return total_rev, avg_conv, cr


# Panel's Number indicator formats via str.format on a {value} placeholder
kpi_value = pn.indicators.Number(name="Total Revenue (window)", value=0, format="${value:,.0f}")
conv_value = pn.indicators.Number(name="Avg Conversions", value=0, format="{value:.1f}")
cr_value = pn.indicators.Number(name="Conversion Rate", value=0, format="{value:.2%}")


def update_kpis():
   step = 200
   start = kpi_idx[0]
   end = start + step
   if start >= len(kpi_source):
       kpi_idx[0] = 0
       start, end = 0, step
   window_df = kpi_source.iloc[start:end]
   kpi_idx[0] = end
   total_rev, avg_conv, cr = compute_kpi(window_df)
   kpi_value.value = total_rev
   conv_value.value = avg_conv
   cr_value.value = cr / 100


pn.state.add_periodic_callback(update_kpis, period=1000, start=True)

We simulate a rotating stream of KPIs that refreshes every second, creating a live-dashboard experience. We compute total revenue, average conversions, and the conversion rate over a sliding window and push the values to Panel's Number indicators. This lets us watch the metrics evolve continuously, like a real monitoring system.
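The streaming illusion comes from the cursor held in the mutable one-element list `kpi_idx`, which advances by a fixed step and wraps back to zero at the end of the frame. A minimal, Panel-free sketch of that rotation logic:

```python
# Sketch of the rotating-window cursor used by update_kpis. The closure keeps
# its position in a one-element list, mirroring the tutorial's kpi_idx trick.
def make_windows(n_rows, step):
    idx = [0]

    def next_window():
        start = idx[0]
        end = start + step
        if start >= n_rows:          # past the end: wrap around
            idx[0] = 0
            start, end = 0, step
        idx[0] = end                 # remember where the next call resumes
        return start, min(end, n_rows)

    return next_window

nw = make_windows(n_rows=5, step=2)
print([nw() for _ in range(4)])  # [(0, 2), (2, 4), (4, 5), (0, 2)]
```

Each periodic tick slices the next window out of the frame, so the indicators cycle through the whole year of data and then start over.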

controls = pn.WidgetBox(
   "### Global Controls",
   segment_sel,
   region_sel,
   metric_sel,
   date_range,
   smooth_slider,
   sizing_mode="stretch_width",
)


page_overview = pn.Column(
   pn.pane.Markdown("## Overview: Filtered Time Series"),
   controls,
   timeseries_plot,
)


page_insights = pn.Column(
   pn.pane.Markdown("## Segment & Region Insights"),
   pn.Row(segment_bar, region_heatmap),
)


page_live = pn.Column(
   pn.pane.Markdown("## Live KPI Window (simulated streaming)"),
   pn.Row(kpi_value, conv_value, cr_value),
)


dashboard = pn.Tabs(
   ("Overview", page_overview),
   ("Insights", page_insights),
   ("Live KPIs", page_live),
)


dashboard

We combine all the pieces into a clean multi-page layout using tabs. We organize the dashboard into an Overview page, an Insights page, and a Live KPIs page, making navigation easy and intuitive. With this, we get a complete, functional analytics application that runs directly in Google Colab.
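Outside a notebook, the same app can be served as a standalone web application. A minimal sketch, assuming the code above is saved as `dashboard_app.py` (the filename is our assumption) with the final line changed to `dashboard.servable()`:

```shell
# Serve the Panel app and open it in a browser tab
panel serve dashboard_app.py --show
```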

In conclusion, we see how Panel widgets, hvPlot visualizations, and periodic callbacks combine to create a powerful analytics dashboard. Every module, from the reactive filtering behind the bar charts to the live KPI streaming, fits together to produce a unified interface that works seamlessly. We end up with a complete, functional system that we can extend toward real-world reporting, experimentation, or production-grade dashboards.




Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the power of artificial intelligence for social good. His most recent endeavor is the launch of an AI media platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable to a wide audience. The platform draws more than two million monthly views, illustrating its popularity among readers.
