Fix for [BUG] Error handling timezones #305 #318


Merged 24 commits on Mar 6, 2025

Commits
777ffad
fix: check if update_data contains update before batch_update
jvdd Aug 25, 2024
4396f01
add test + avoid same error when verbose=True
jvdd Aug 25, 2024
277ce36
:broom: create _hf_data_container if correct trace type
jvdd Aug 26, 2024
43d01a8
:pray: python 3.7 not supported on Apple Silicon
jvdd Aug 27, 2024
cc3921e
remove WIP
jvdd Aug 27, 2024
ef81227
:pen: more verbose asserts
jvdd Aug 29, 2024
6de51be
:pen: more verbose asserts
jvdd Aug 29, 2024
d61017a
:pray: more sleep time
jvdd Aug 30, 2024
b2f9975
:pray:
jvdd Aug 30, 2024
ebf6aa6
:raised_hands:
jvdd Aug 30, 2024
f38a5c1
:thinking: fix for [BUG] Error handling timezones #305
jonasvdd Sep 9, 2024
1f88adb
:see_no_evil: linting
jonasvdd Sep 9, 2024
b31dce7
:dash: Refactor timezone handling in PlotlyAggregatorParser
jonasvdd Oct 23, 2024
f898de2
Update minmax operator image
emmanuel-ferdman Oct 24, 2024
0223450
Merge pull request #321 from emmanuel-ferdman/main
jonasvdd Oct 28, 2024
2c6deff
Merge pull request #316 from predict-idlab/fw_resampler_no_update
jonasvdd Dec 6, 2024
8914937
Drop duplicate sentence
ivanovmg Dec 12, 2024
bb30c91
Merge pull request #327 from ivanovmg/fix/contributing
jonasvdd Dec 15, 2024
18da98f
Feat/plotly6 (#338)
jonasvdd Feb 21, 2025
8f13b18
:thinking: fix for [BUG] Error handling timezones #305
jonasvdd Sep 9, 2024
ed880cc
:see_no_evil: linting
jonasvdd Sep 9, 2024
98e4ebc
:dash: Refactor timezone handling in PlotlyAggregatorParser
jonasvdd Oct 23, 2024
bb5c367
Merge branch 'bug/305_timezones' of github.com:predict-idlab/plotly-r…
jonasvdd Feb 21, 2025
5108824
:pushpin: bug: Fix timezone handling for DST in PlotlyAggregatorParse…
jonasvdd Mar 5, 2025
17 changes: 11 additions & 6 deletions .github/workflows/test.yml
@@ -19,13 +19,15 @@ on:

jobs:
test:

runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
matrix:
os: ['windows-latest', 'macOS-latest', 'ubuntu-latest']
python-version: ['3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
python-version: ['3.8', '3.9', '3.10', '3.11', '3.12', '3.13']
exclude:
- os: ubuntu-latest
python-version: '3.12'
defaults:
run:
shell: bash
@@ -70,12 +72,15 @@ jobs:
run: |
poetry run pytest --cov=plotly_resampler --junitxml=junit/test-results-${{ matrix.python-version }}.xml --cov-report=xml tests
- name: Upload pytest test results
uses: actions/upload-artifact@v3
# Use always() to always run this step to publish test results when there are test failures
if: ${{ always() && hashFiles('junit/test-results-${{ matrix.python-version }}.xml') != '' }}
uses: actions/upload-artifact@v4
with:
name: pytest-results-${{ matrix.python-version }}
name: pytest-results-${{ matrix.python-version }}-${{ matrix.os }}-${{ github.run_number }}
path: junit/test-results-${{ matrix.python-version }}.xml
# Use always() to always run this step to publish test results when there are test failures
if: ${{ always() }}
overwrite: true
retention-days: 7
compression-level: 5

- name: Upload coverage to Codecov
uses: codecov/codecov-action@v3
2 changes: 0 additions & 2 deletions CONTRIBUTING.md
@@ -3,8 +3,6 @@
First of all, thank you for considering contributing to `plotly-resampler`.<br>
It's people like you that will help make `plotly-resampler` a great toolkit. 🤝

As usual, contributions are managed through GitHub Issues and Pull Requests.

As usual, contributions are managed through GitHub Issues and Pull Requests.
We invite you to use GitHub's [Issues](https://github.com/predict-idlab/plotly-resampler/issues) to report bugs, request features, or ask questions about the project. To ask use-specific questions, please use the [Discussions](https://github.com/predict-idlab/plotly-resampler/discussions) instead.

2 changes: 1 addition & 1 deletion examples/other_apps/streamlit_app.py
@@ -16,7 +16,7 @@

__author__ = "Jeroen Van Der Donckt"

# Explicitely set pio.templates in order to have colored traces in the streamlit app!
# Explicitly set pio.templates in order to have colored traces in the streamlit app!
# -> https://discuss.streamlit.io/t/streamlit-overrides-colours-of-plotly-chart/34943/5
import plotly.io as pio

4 changes: 2 additions & 2 deletions examples/requirements.txt
@@ -1,9 +1,9 @@
pyfunctional>=1.4.3
dash-bootstrap-components>=1.2.0
dash-extensions==1.0.1 # fixated on this version as more recent versions do not work
dash-extensions==1.0.20 # fixated on this version as more recent versions do not work
ipywidgets>=7.7.0
memory-profiler>=0.60.0
line-profiler>=3.5.1
pyarrow>=6.0.0
pyarrow>=17.0.0
kaleido>=0.2.1
flask-cors>=3.0.10
2 changes: 1 addition & 1 deletion plotly_resampler/aggregation/aggregators.py
@@ -99,7 +99,7 @@ class MinMaxOverlapAggregator(DataPointSelector):
"""Aggregation method which performs binned min-max aggregation over 50% overlapping
windows.

![minmax operator image](https://github.com/predict-idlab/plotly-resampler/blob/main/docs/sphinx/_static/minmax_operator.png)
![minmax operator image](https://github.com/predict-idlab/plotly-resampler/blob/main/mkdocs/static/minmax_operator.png)

In the above image, **bin_size**: represents the size of *(len(series) / n_out)*.
As the windows have 50% overlap and are consecutive, the min & max values are
18 changes: 15 additions & 3 deletions plotly_resampler/aggregation/plotly_aggregator_parser.py
@@ -38,7 +38,11 @@ def to_same_tz(
return None
elif reference_tz is not None:
if ts.tz is not None:
assert ts.tz.__str__() == reference_tz.__str__()
# compare if these two have the same timezone / offset
try:
assert ts.tz.__str__() == reference_tz.__str__()
except AssertionError:
assert ts.utcoffset() == reference_tz.utcoffset(ts.tz_convert(None))
return ts
else: # localize -> time remains the same
return ts.tz_localize(reference_tz)
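The `utcoffset()` fallback in this hunk matters because two timestamps can share a timezone *name* yet carry different UTC offsets on either side of a DST transition, so a plain string comparison of the zones is not enough. A minimal sketch of that pitfall, using pandas only (the dates are illustrative):

```python
import pandas as pd

# Same named zone, different UTC offsets across the DST boundary:
ts_winter = pd.Timestamp("2024-01-15 12:00", tz="Europe/Brussels")  # CET, +01:00
ts_summer = pd.Timestamp("2024-07-15 12:00", tz="Europe/Brussels")  # CEST, +02:00

assert str(ts_winter.tz) == str(ts_summer.tz)          # names match...
assert ts_winter.utcoffset() != ts_summer.utcoffset()  # ...offsets don't
```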
@@ -78,7 +82,15 @@ def get_start_end_indices(hf_trace_data, axis_type, start, end) -> Tuple[int, in
# convert start & end to the same timezone
if isinstance(hf_trace_data["x"], pd.DatetimeIndex):
tz = hf_trace_data["x"].tz
assert start.tz == end.tz
try:
assert start.tz.__str__() == end.tz.__str__()
except (TypeError, AssertionError):
# This fix is needed for DST (when the timezone is not fixed)
assert start.tz_localize(None) == start.tz_convert(tz).tz_localize(
None
)
assert end.tz_localize(None) == end.tz_convert(tz).tz_localize(None)

start = PlotlyAggregatorParser.to_same_tz(start, tz)
end = PlotlyAggregatorParser.to_same_tz(end, tz)
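The wall-clock fallback in the hunk above covers the case where the zoom range arrives with a fixed UTC offset (as relayout data can) while the `DatetimeIndex` carries a named zone. A small sketch of that equivalence check, assuming a summer timestamp whose fixed offset matches the zone's DST offset:

```python
import pandas as pd

tz = "Europe/Brussels"
# Relayout data may carry a fixed offset rather than the index's named zone:
start = pd.Timestamp("2024-07-15 12:00:00+02:00")

# The tz representations differ...
assert str(start.tz) != str(tz)
# ...but the wall-clock times agree once converted to the index's zone
assert start.tz_localize(None) == start.tz_convert(tz).tz_localize(None)
```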

@@ -190,7 +202,7 @@ def aggregate(
agg_x = (
start_idx
+ hf_trace_data["x"].start
+ indices * hf_trace_data["x"].step
+ indices.astype(hf_trace_data["x"].dtype) * hf_trace_data["x"].step
)
else:
agg_x = hf_x[indices]
104 changes: 58 additions & 46 deletions plotly_resampler/figure_resampler/figure_resampler.py
@@ -13,7 +13,7 @@
import os
import warnings
from pathlib import Path
from typing import List, Optional, Tuple
from typing import List, Optional, Tuple, Union

import dash
import plotly.graph_objects as go
@@ -26,15 +26,9 @@
MinMaxLTTB,
)
from .figure_resampler_interface import AbstractFigureAggregator
from .jupyter_dash_persistent_inline_output import JupyterDashPersistentInlineOutput
from .utils import is_figure, is_fr

try:
from .jupyter_dash_persistent_inline_output import JupyterDashPersistentInlineOutput

_jupyter_dash_installed = True
except ImportError:
_jupyter_dash_installed = False

# Default arguments for the Figure overview
ASSETS_FOLDER = Path(__file__).parent.joinpath("assets").absolute().__str__()
_DEFAULT_OVERVIEW_LAYOUT_KWARGS = {
@@ -242,7 +236,6 @@ def __init__(
self._host: str | None = None
# Certain functions will be different when using persistent inline
# (namely `show_dash` and `stop_callback`)
self._is_persistent_inline = False

def _get_subplot_rows_and_cols_from_grid(self) -> Tuple[int, int]:
"""Get the number of rows and columns of the figure's grid.
@@ -535,7 +528,9 @@ def show_dash(
constructor via the ``show_dash_kwargs`` argument.

"""
available_modes = ["external", "inline", "inline_persistent", "jupyterlab"]
available_modes = list(dash._jupyter.JupyterDisplayMode.__args__) + [
"inline_persistent"
]
assert (
mode is None or mode in available_modes
), f"mode must be one of {available_modes}"
@@ -576,25 +571,6 @@
init_dash_kwargs["external_scripts"] = ["https://cdn.jsdelivr.net/npm/lodash/lodash.min.js" ]
# fmt: on

if mode == "inline_persistent":
mode = "inline"
if _jupyter_dash_installed:
# Inline persistent mode: we display a static image of the figure when the
# app is not reachable
# Note: this is the "inline" behavior of JupyterDashInlinePersistentOutput
app = JupyterDashPersistentInlineOutput("local_app", **init_dash_kwargs)
self._is_persistent_inline = True
else:
# If Jupyter Dash is not installed, inline persistent won't work and hence
# we default to normal inline mode with a normal Dash app
app = dash.Dash("local_app", **init_dash_kwargs)
warnings.warn(
"'jupyter_dash' is not installed. The persistent inline mode will not work. Defaulting to standard inline mode."
)
else:
# jupyter dash uses a normal Dash app as figure
app = dash.Dash("local_app", **init_dash_kwargs)

# fmt: off
div = dash.html.Div(
children=[
@@ -620,18 +596,19 @@
**graph_properties,
),
]
app.layout = div

# Create the app, populate the layout and register the resample callback
app = dash.Dash("local_app", **init_dash_kwargs)
app.layout = div
self.register_update_graph_callback(
app,
"resample-figure",
"overview-figure" if self._create_overview else None,
)

height_param = "height" if self._is_persistent_inline else "jupyter_height"

# 2. Run the app
if mode == "inline" and height_param not in kwargs:
height_param = "height" if mode == "inline_persistent" else "jupyter_height"
if "inline" in mode and height_param not in kwargs:
# If app height is not specified -> re-use figure height for inline dash app
# Note: default layout height is 450 (whereas default app height is 650)
# See: https://plotly.com/python/reference/layout/#layout-height
@@ -646,9 +623,11 @@
self._host = kwargs.get("host", "127.0.0.1")
self._port = kwargs.get("port", "8050")

# function signature is slightly different for the Dash and JupyterDash implementations
if self._is_persistent_inline:
app.run(mode=mode, **kwargs)
# function signatures are slightly different for the (Jupyter)Dash and the
# JupyterDashInlinePersistent implementations
if mode == "inline_persistent":
jpi = JupyterDashPersistentInlineOutput(self)
jpi.run_app(app=app, **kwargs)
else:
app.run(jupyter_mode=mode, **kwargs)

Expand All @@ -665,18 +644,10 @@ def stop_server(self, warn: bool = True):
This only works if the dash-app was started with [`show_dash`][figure_resampler.figure_resampler.FigureResampler.show_dash].
"""
if self._app is not None:
servers_dict = (
self._app._server_threads
if self._is_persistent_inline
else dash.jupyter_dash._servers
)
servers_dict = dash.jupyter_dash._servers
old_server = servers_dict.get((self._host, self._port))
if old_server:
if self._is_persistent_inline:
old_server.kill()
old_server.join()
else:
old_server.shutdown()
old_server.shutdown()
del servers_dict[(self._host, self._port)]
elif warn:
warnings.warn(
@@ -685,6 +656,47 @@
+ "\t- the dash-server wasn't started with 'show_dash'"
)

def construct_update_data_patch(
self, relayout_data: dict
) -> Union[dash.Patch, dash.no_update]:
"""Construct the Patch of the to-be-updated front-end data, based on the layout
change.

Attention
---------
This method is tightly coupled with Dash app callbacks. It takes the front-end
figure its ``relayoutData`` as input and returns the ``dash.Patch`` which needs
to be sent to the ``figure`` property for the corresponding ``dcc.Graph``.

Parameters
----------
relayout_data: dict
A dict containing the ``relayoutData`` (i.e., the changed layout data) of
the corresponding front-end graph.

Returns
-------
dash.Patch:
The Patch object containing the figure updates which needs to be sent to
the front-end.

"""
update_data = self._construct_update_data(relayout_data)
if not isinstance(update_data, list) or len(update_data) <= 1:
return dash.no_update

patched_figure = dash.Patch() # create patch
for trace in update_data[1:]: # skip first item as it contains the relayout
trace_index = trace.pop("index") # the index of the corresponding trace
# All the other items are the trace properties which needs to be updated
for k, v in trace.items():
# NOTE: we need to use the `patched_figure` as a dict, and not
# `patched_figure.data` as the latter will replace **all** the
# data for the corresponding trace, and we just want to update the
# specific trace its properties.
patched_figure["data"][trace_index][k] = v
return patched_figure
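The per-key loop above is what makes the Patch cheap: only the changed trace properties travel to the front-end, instead of each affected trace being replaced wholesale. A pure-Python sketch of the same update logic (the `apply_trace_updates` helper and the sample dicts are hypothetical, for illustration only; no Dash required):

```python
def apply_trace_updates(figure: dict, update_data: list) -> dict:
    """Mimic the patch loop: update only the changed keys of each trace."""
    for trace in update_data[1:]:  # first item holds the relayout info
        trace = dict(trace)        # don't mutate the caller's dict
        idx = trace.pop("index")   # which trace to patch
        for k, v in trace.items():
            figure["data"][idx][k] = v  # per-key update, not replacement
    return figure

fig = {"data": [{"name": "sensor-0", "x": [0, 1, 2], "y": [5, 6, 7]}]}
updates = [{"relayout": {}}, {"index": 0, "x": [0, 2], "y": [5, 7]}]
fig = apply_trace_updates(fig, updates)
# "name" is preserved; only "x" and "y" were patched
```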

def register_update_graph_callback(
self,
app: dash.Dash,