Move .rolling_exp functions from reduce to apply_ufunc #8114

Merged · 6 commits · Sep 19, 2023
7 changes: 7 additions & 0 deletions doc/whats-new.rst
@@ -84,6 +84,9 @@ Bug fixes
issues (:issue:`7817`, :issue:`7942`, :issue:`7790`, :issue:`6191`, :issue:`7096`,
:issue:`1064`, :pull:`7827`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.
- ``.rolling_exp`` functions no longer mistakenly lose non-dimensioned coords
(:issue:`6528`, :pull:`8114`).
By `Maximilian Roos <https://github.com/max-sixty>`_.
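To make the fix concrete, here is a minimal sketch of the behavior this entry describes (the data and coord names are illustrative, and `.rolling_exp` requires numbagg to be installed):

```python
import numpy as np
import xarray as xr

# A DataArray with a non-dimension (scalar) coordinate "label".
da = xr.DataArray(
    np.arange(5.0),
    dims="time",
    coords={"time": np.arange(5), "label": "sensor-a"},
)

# Before this fix, "label" was silently dropped from the result;
# with the fix it is carried through.
result = da.rolling_exp(time=3).mean()
assert "label" in result.coords
```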

Documentation
~~~~~~~~~~~~~
@@ -96,6 +99,10 @@ Internal Changes
By `András Gunyhó <https://github.com/mgunyho>`_.
- Refactor of encoding and decoding times/timedeltas to preserve nanosecond resolution in arrays that contain missing values (:pull:`7827`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.
- Transition ``.rolling_exp`` functions to use ``.apply_ufunc`` internally rather
than ``.reduce``, as the start of a broader effort to move non-reducing
functions away from ``.reduce`` (:pull:`8114`).
By `Maximilian Roos <https://github.com/max-sixty>`_.
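As a rough sketch of the distinction this entry draws (the identity function below is hypothetical, not code from the PR): `.reduce` is built around functions that collapse a dimension, while `apply_ufunc` can declare a dimension as both an input and an output core dim, so non-reducing functions keep the original shape:

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(6.0).reshape(2, 3), dims=("x", "time"))

# Reducing pattern: the applied function collapses "time".
reduced = da.reduce(np.nanmean, dim="time")  # dims: ("x",)

# Non-reducing pattern: "time" is both an input and an output core dim,
# so the result keeps its shape, which is what .rolling_exp needs.
same_shape = xr.apply_ufunc(
    lambda a: a,
    da,
    input_core_dims=[["time"]],
    output_core_dims=[["time"]],
)  # dims: ("x", "time")
```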

.. _whats-new.2023.08.0:

4 changes: 3 additions & 1 deletion xarray/core/computation.py
@@ -433,6 +433,7 @@ def apply_dict_of_variables_vfunc(
join="inner",
fill_value=None,
on_missing_core_dim: MissingCoreDimOptions = "raise",
keep_attrs="override",
):
"""Apply a variable level function over dicts of DataArray, DataArray,
Variable and ndarray objects.
@@ -445,7 +446,7 @@
for name, variable_args in zip(names, grouped_by_name):
core_dim_present = _check_core_dims(signature, variable_args, name)
if core_dim_present is True:
result_vars[name] = func(*variable_args)
result_vars[name] = func(*variable_args, keep_attrs=keep_attrs)
Collaborator: I think `func` can be any function here, and there is no guarantee that it will consume this argument (admittedly, I am surprised that this does not error in our test suite, so I may be wrong). Also, isn't this at the DataArray level, while the missing attrs are at the Dataset level?

Contributor: Yes, good chance that we are undertesting here. But the error is raised for the attribute on `z1` (the DataArray level), and this propagates `keep_attrs`.

Contributor @kmuehlbauer (Sep 18, 2023): OK, this only works for `core_dim_present`, @max-sixty, so the attributes would also have to be removed when `on_missing_core_dim == "copy"`. But as @mathause already mentioned, it's debatable whether it should lose its attrs.

else:
if on_missing_core_dim == "raise":
raise ValueError(core_dim_present)
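To make the thread above concrete, a hedged reconstruction of the case under discussion (the exact test fixture is not shown here, so this setup is an assumption): with `on_missing_core_dim="copy"`, a variable lacking the core dim is copied through untouched, so its attrs survive even when `keep_attrs=False` strips them from the processed variables:

```python
import numpy as np
import xarray as xr

# "z1" has the core dim "x"; "z2" is a 0-d variable that lacks it.
ds = xr.Dataset({"z1": ("x", np.arange(3.0)), "z2": ((), 1.0)})
ds["z1"].attrs["units"] = "m"
ds["z2"].attrs["units"] = "m"

out = xr.apply_ufunc(
    lambda a: a,
    ds,
    input_core_dims=[["x"]],
    output_core_dims=[["x"]],
    keep_attrs=False,
    on_missing_core_dim="copy",  # "z2" is copied as-is, attrs and all
)
# "z1" goes through the function, so keep_attrs=False drops its attrs;
# "z2" is copied, so its attrs survive, which is the asymmetry debated above.
```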
@@ -522,6 +523,7 @@ def apply_dataset_vfunc(
join=dataset_join,
fill_value=fill_value,
on_missing_core_dim=on_missing_core_dim,
keep_attrs=keep_attrs,
)

out: Dataset | tuple[Dataset, ...]
31 changes: 25 additions & 6 deletions xarray/core/rolling_exp.py
@@ -5,6 +5,7 @@

import numpy as np

from xarray.core.computation import apply_ufunc
from xarray.core.options import _get_keep_attrs
from xarray.core.pdcompat import count_not_none
from xarray.core.pycompat import is_duck_dask_array
@@ -128,9 +129,18 @@ def mean(self, keep_attrs: bool | None = None) -> T_DataWithCoords:
if keep_attrs is None:
keep_attrs = _get_keep_attrs(default=True)

return self.obj.reduce(
move_exp_nanmean, dim=self.dim, alpha=self.alpha, keep_attrs=keep_attrs
)
dim_order = self.obj.dims

return apply_ufunc(
move_exp_nanmean,
self.obj,
input_core_dims=[[self.dim]],
kwargs=dict(alpha=self.alpha, axis=-1),
output_core_dims=[[self.dim]],
exclude_dims={self.dim},
keep_attrs=keep_attrs,
on_missing_core_dim="copy",
).transpose(*dim_order)
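One detail worth noting about the new code path (the array below is illustrative): `apply_ufunc` moves core dims to the end of the dimension order, so a `("time", "x")` input would come back as `("x", "time")`; the trailing `.transpose(*dim_order)` restores the original order:

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.random.rand(4, 3), dims=("time", "x"))

# Without the transpose, the core dim "time" would end up last.
out = da.rolling_exp(time=2).mean()  # requires numbagg
assert out.dims == ("time", "x")
```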

def sum(self, keep_attrs: bool | None = None) -> T_DataWithCoords:
"""
@@ -155,6 +165,15 @@ def sum(self, keep_attrs: bool | None = None) -> T_DataWithCoords:
if keep_attrs is None:
keep_attrs = _get_keep_attrs(default=True)

return self.obj.reduce(
move_exp_nansum, dim=self.dim, alpha=self.alpha, keep_attrs=keep_attrs
)
dim_order = self.obj.dims

return apply_ufunc(
move_exp_nansum,
self.obj,
input_core_dims=[[self.dim]],
kwargs=dict(alpha=self.alpha, axis=-1),
output_core_dims=[[self.dim]],
exclude_dims={self.dim},
keep_attrs=keep_attrs,
on_missing_core_dim="copy",
).transpose(*dim_order)
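Finally, an end-to-end usage sketch of the refactored methods (values are illustrative; numbagg must be installed for `.rolling_exp`):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(
    np.arange(6.0),
    dims="time",
    coords={"time": np.arange(6)},
    attrs={"units": "m"},
)

# Exponentially weighted rolling sum with a span of 3 along "time".
# The result keeps the same dims, the coords, and (by default) the attrs.
ewm_sum = da.rolling_exp(time=3, window_type="span").sum()
assert ewm_sum.dims == ("time",)
assert ewm_sum.attrs == {"units": "m"}
```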