[DO NOT MERGE] Numpy2 #50

Closed
wants to merge 39 commits
Commits
7286aed
initial fix with ruff
alexmgoldberg Jan 18, 2024
cc2a358
true NPY201 fixes
jeremyleung521 Jan 19, 2024
dfe572d
remove ruff lines
jeremyleung521 Jan 19, 2024
80aae91
Merge pull request #1 from jeremyleung521/numpy2
alexmgoldberg Jan 19, 2024
0432f4e
initial fix with ruff
alexmgoldberg Jan 18, 2024
1aee1c3
true NPY201 fixes
jeremyleung521 Jan 19, 2024
8e657af
remove ruff lines
jeremyleung521 Jan 19, 2024
7543537
Merge branch 'develop' of ssh://github.com/westpa/westpa into numpy2
jeremyleung521 Apr 5, 2024
f2c8f18
some changes to the cython code for numpy 2
jeremyleung521 Apr 5, 2024
21b7f1d
bump minimum version for dependencies
jeremyleung521 Apr 5, 2024
c50e8f7
Merge branch 'numpy2' of github.com:jeremyleung521/westpa into numpy2
jeremyleung521 Sep 4, 2024
d68fbbd
Merge branch 'develop' of github.com:jeremyleung521/westpa into numpy2
jeremyleung521 Sep 4, 2024
7d78590
not in h5diff.py
jeremyleung521 Sep 4, 2024
7e90af7
pin to numpy2, <3
jeremyleung521 Sep 4, 2024
bbd018a
drop python 3.8, bump to 3.12
jeremyleung521 Sep 4, 2024
946ea86
Merge branch 'develop' into numpy2
jeremyleung521 Oct 14, 2024
a3140b9
Merge branch 'develop' of ssh://github.com/westpa/westpa into numpy2
jeremyleung521 Dec 11, 2024
b125317
update westpa dependencies for numpy2
jeremyleung521 Dec 11, 2024
b8b29a5
pin numpy versions correctly
jeremyleung521 Dec 12, 2024
6fd3be2
Merge branch 'develop' of ssh://github.com/westpa/westpa into numpy2
jeremyleung521 Dec 12, 2024
4941e97
remove np.msort
jeremyleung521 Dec 12, 2024
d9559a8
cpdef is probably more appropriate
jeremyleung521 Dec 12, 2024
58b06d6
Merge branch 'develop' of ssh://github.com/westpa/westpa into numpy2
jeremyleung521 Dec 12, 2024
822f11a
drop py<= 3.8 support, hopefully allow numpy2 testing?
jeremyleung521 Dec 12, 2024
9bc590a
print conda env before moving on in testing
jeremyleung521 Dec 12, 2024
51b72b1
update ref file for numpy2
jeremyleung521 Dec 12, 2024
01c0ba2
Revert "update ref file for numpy2"
jeremyleung521 Dec 12, 2024
fb86e43
fix numpy str/repr changes in numpy2
jeremyleung521 Jan 9, 2025
e5d847e
use tmp_path instead of tmp_dir
jeremyleung521 Jan 9, 2025
65f539b
for some reason the fixtures folder is not recognized locally by pyte…
jeremyleung521 Jan 9, 2025
b0dc47b
fix fixtures folder
jeremyleung521 Jan 10, 2025
ec34da3
readability first for output
jeremyleung521 Jan 16, 2025
6dd5d8f
can't set to -1 if it's unsigned int
jeremyleung521 Jan 16, 2025
6afd220
let's see if these new files solve the mab tests
jeremyleung521 Jan 16, 2025
2ec8300
Revert "let's see if these new files solve the mab tests"
jeremyleung521 Jan 16, 2025
af491cc
do stable argsort for mab
jeremyleung521 Jan 22, 2025
f80dd0f
use np.trapezoid instead of np.trapz due to deprecation
jeremyleung521 Jan 22, 2025
d899fc3
force stable sorting everywhere
jeremyleung521 Jan 22, 2025
df855a9
test on numpy1 and numpy2
jeremyleung521 Jan 22, 2025
2 changes: 1 addition & 1 deletion .github/workflows/build.yaml
@@ -72,7 +72,7 @@ jobs:

- uses: actions/setup-python@v5
with:
python-version: '3.11'
python-version: '3.12'

- name: Build sdist
run: pipx run build --sdist
12 changes: 7 additions & 5 deletions .github/workflows/test.yaml
@@ -20,22 +20,23 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python 3.11
- name: Set up Python 3.12
uses: actions/setup-python@v5
with:
python-version: 3.11
python-version: 3.12
- name: Linting
run: |
pip install pre-commit
pre-commit run --all-files

test:
name: Test on ${{ matrix.os }}, Python ${{ matrix.python-version }}
name: Test on ${{ matrix.os }}, Python ${{ matrix.python-version }}, Numpy ${{ matrix.numpy-version }}
runs-on: "${{ matrix.os }}"
strategy:
matrix:
os: ["ubuntu-latest", "macos-13", "macos-latest"] # macos-13 is x86-64, macos-latest is arm64
python-version: ["3.9", "3.10", "3.11", "3.12"]
numpy-version: ['1', '2']

steps:
- uses: actions/checkout@v4
@@ -53,7 +54,7 @@ jobs:
- uses: conda-incubator/setup-miniconda@v3
with:
python-version: ${{ matrix.python-version }}
environment-file: devtools/conda-envs/test_env.yaml
environment-file: devtools/conda-envs/test_env_numpy${{ matrix.numpy-version }}.yaml
activate-environment: test_env
channel-priority: true
auto-update-conda: true
@@ -64,6 +65,7 @@ jobs:
# conda setup requires this special shell
shell: bash -l {0}
run: |
conda list
conda info --all
python -m pip install . -v --no-deps
conda list
@@ -96,7 +98,7 @@ jobs:
- uses: "actions/checkout@v4"
- uses: "actions/setup-python@v5"
with:
python-version: "3.11"
python-version: "3.12"
- name: "Install HDF5 with brew if macos-arm64"
if: ${{ matrix.os == 'macos-latest' }}
run: "brew install hdf5"
2 changes: 1 addition & 1 deletion devtools/conda-envs/test_env.yaml
@@ -5,7 +5,7 @@ channels:
dependencies:
- python
- cython
- numpy<2
- numpy
- scipy
- h5py
- pyyaml
25 changes: 25 additions & 0 deletions devtools/conda-envs/test_env_numpy1.yaml
@@ -0,0 +1,25 @@
name: test_env
channels:
- conda-forge
- defaults
dependencies:
- python
- cython
- numpy=1
- scipy
- h5py
- pyyaml
- pyzmq
- matplotlib-base
- blessings
- ipykernel
- mpi4py
- tqdm
- mdtraj
- netCDF4
# testing
- pytest
- pytest-cov
- pytest-rerunfailures
- pytest-timeout
- codecov
24 changes: 24 additions & 0 deletions devtools/conda-envs/test_env_numpy2.yaml
@@ -0,0 +1,24 @@
name: test_env
channels:
- conda-forge
- defaults
dependencies:
- python
- cython
- numpy
- scipy
- h5py
- pyyaml
- pyzmq
- matplotlib-base
- blessings
- ipykernel
- mpi4py
- tqdm
- mdtraj
# testing
- pytest
- pytest-cov
- pytest-rerunfailures
- pytest-timeout
- codecov
6 changes: 4 additions & 2 deletions pyproject.toml
@@ -7,8 +7,7 @@ requires = [
"wheel",
"Cython >=0.29.16; python_version >='3.10'", # Note: sync with setup.py
"Cython <3.0.3, >=0.29.16; python_version <'3.10'",
"oldest-supported-numpy; python_version <'3.9'",
"numpy >=1.25.0, <2; python_version >='3.9'",
"numpy >=2.0.0, <3", # Force numpy>=2 during build time so it works on both 1.XX and 2.XX
"scipy >=0.19.1",
"versioneer-518"
]
@@ -45,3 +44,6 @@ exclude = '''
)
)
'''

[tool.ruff.lint]
select = ['NPY201']
4 changes: 2 additions & 2 deletions setup.py
@@ -109,7 +109,7 @@ def extensions():
]

INSTALL_REQUIRES = [
"numpy >= 1.16.0, <2",
"numpy >= 1.25, <3",
"scipy >= 0.19.1",
"h5py >= 2.10",
"mdtraj >= 1.9.5",
@@ -139,7 +139,7 @@ def extensions():
version=versioneer.get_version(),
keywords='',
cmdclass=versioneer.get_cmdclass(),
python_requires=">=3.6",
python_requires=">=3.9",
zip_safe=False,
classifiers=CLASSIFIERS,
entry_points={'console_scripts': console_scripts},
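The build-time pin in pyproject.toml ("numpy >=2.0.0, <3", per the in-diff comment) together with the runtime range above ("numpy >= 1.25, <3" on Python >= 3.9) follows the usual NumPy 2 migration pattern: build the extensions against NumPy 2 headers, run against either major series. A minimal sketch of the runtime constraint, as a hypothetical check that is not part of this diff:

```python
import sys
import numpy as np

# Mirrors the declared runtime floors: Python >= 3.9 and 1.25 <= numpy < 3.
major, minor = (int(part) for part in np.__version__.split(".")[:2])
assert sys.version_info >= (3, 9), sys.version
assert (1, 25) <= (major, minor) < (3, 0), np.__version__
```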
8 changes: 4 additions & 4 deletions src/westpa/cli/tools/w_assign.py
@@ -474,17 +474,17 @@ def go(self):

# Recursive mappers produce a generator rather than a list of labels
# so consume the entire generator into a list
labels = [np.string_(label) for label in self.binning.mapper.labels]
labels = [np.bytes_(label) for label in self.binning.mapper.labels]

self.output_file.create_dataset('bin_labels', data=labels, compression=9)

if self.states:
nstates = len(self.states)
state_map[:] = nstates # state_id == nstates => unknown state
state_labels = [np.string_(state['label']) for state in self.states]
state_labels = [np.bytes_(state['label']) for state in self.states]

for istate, sdict in enumerate(self.states):
assert state_labels[istate] == np.string_(sdict['label']) # sanity check
assert state_labels[istate] == np.bytes_(sdict['label']) # sanity check
state_assignments = assign(sdict['coords'])
for assignment in state_assignments:
state_map[assignment] = istate
@@ -558,7 +558,7 @@ def go(self):
shuffle=True,
chunks=h5io.calc_chunksize(pops_shape, weight_dtype),
)
h5io.label_axes(pops_ds, [np.string_(i) for i in ['iteration', 'state', 'bin']])
h5io.label_axes(pops_ds, [np.bytes_(i) for i in ['iteration', 'state', 'bin']])

pi.new_operation('Assigning to bins', iter_stop - iter_start)
last_labels = None # mapping of seg_id to last macrostate inhabited
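np.string_ is one of the aliases removed in NumPy 2.0; np.bytes_ is the equivalent spelling, and both produce plain Python bytes, so the labels written to the HDF5 output are unchanged. A small illustration of the rename (not taken from this diff):

```python
import numpy as np

# np.bytes_ behaves the same as the old np.string_: it yields a bytes scalar,
# so HDF5 label datasets round-trip byte-for-byte as before.
label = np.bytes_('bound_state')
assert isinstance(label, bytes) and label == b'bound_state'
```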
2 changes: 1 addition & 1 deletion src/westpa/cli/tools/w_fluxanl.py
@@ -11,7 +11,7 @@
from westpa.tools.dtypes import iter_block_ci_dtype as ci_dtype
import westpa.mclib as mclib

fluxentry_dtype = np.dtype([('n_iter', n_iter_dtype), ('flux', weight_dtype), ('count', np.uint)])
fluxentry_dtype = np.dtype([('n_iter', n_iter_dtype), ('flux', weight_dtype), ('count', np.int32)])

target_index_dtype = np.dtype(
[
4 changes: 2 additions & 2 deletions src/westpa/cli/tools/w_ntop.py
@@ -190,9 +190,9 @@ def go(self):
weights = all_weights.take(segs)

if what == 'lowweight':
indices = np.argsort(weights)[:count]
indices = np.argsort(weights, kind='stable')[:count]
elif what == 'highweight':
indices = np.argsort(weights)[::-1][:count]
indices = np.argsort(weights, kind='stable')[::-1][:count]
else:
assert what == 'random'
indices = np.random.permutation(len(weights))
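np.argsort defaults to an unstable quicksort, so segments with exactly equal weights can come back in a different order between NumPy releases and platforms; kind='stable' breaks ties by original position, which keeps the selected lowweight/highweight segments (and the reference outputs used in the tests) reproducible. A short illustration of the tie-breaking (not from this diff):

```python
import numpy as np

weights = np.array([0.25, 0.10, 0.25, 0.10])

# With a stable sort, the two equal weights at indices 1 and 3 keep their
# original relative order, so the result is deterministic across platforms.
order = np.argsort(weights, kind='stable')
assert order.tolist() == [1, 3, 0, 2]
```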
9 changes: 7 additions & 2 deletions src/westpa/cli/tools/w_red.py
@@ -3,6 +3,11 @@
from westpa import rc
from westpa.tools import WESTParallelTool

try:
from numpy import trapezoid
except ImportError:
from numpy import trapz as trapezoid


class DurationCorrector(object):
@staticmethod
@@ -128,9 +133,9 @@ def correction(self, iters, freqs=None):

for i, tau in enumerate(taugrid):
if i > 0 and tau < maxduration:
integral1[i] = np.trapz(f_tilde[: i + 1], taugrid[: i + 1])
integral1[i] = trapezoid(f_tilde[: i + 1], taugrid[: i + 1])

integral2 = np.trapz(integral1, taugrid)
integral2 = trapezoid(integral1, taugrid)

if integral2 == 0:
return 0.0
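NumPy 2.0 renames np.trapz to np.trapezoid (same trapezoidal-rule integral, new name), and the try/except import above keeps w_red working on the 1.x series as well. A minimal usage sketch of that fallback:

```python
import numpy as np

try:
    from numpy import trapezoid  # NumPy >= 2.0
except ImportError:
    from numpy import trapz as trapezoid  # NumPy 1.x

# Trapezoidal rule for y = x**2 on [0, 1]; the exact integral is 1/3.
x = np.linspace(0.0, 1.0, 101)
assert abs(trapezoid(x**2, x) - 1.0 / 3.0) < 1e-3
```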
12 changes: 8 additions & 4 deletions src/westpa/core/binning/assign.py
@@ -134,7 +134,7 @@ def boundaries(self, boundaries):
self._boundaries = []
self.labels = labels = []
for boundset in boundaries:
boundarray = np.asarray(boundset, dtype=coord_dtype, order='C')
boundarray = np.ascontiguousarray(boundset, dtype=coord_dtype)
db = np.diff(boundarray)
if (db <= 0).any():
raise ValueError('boundary set must be strictly monotonically increasing')
@@ -146,8 +146,12 @@
_boundaries = self._boundaries
binspace_shape = tuple(self._boundlens[:] - 1)
for index in np.ndindex(binspace_shape):
bounds = [(_boundaries[idim][index[idim]], boundaries[idim][index[idim] + 1]) for idim in range(len(_boundaries))]
labels.append(repr(bounds))
bounds = [
(float(str(_boundaries[idim][index[idim]])), float(str(boundaries[idim][index[idim] + 1])))
for idim in range(len(_boundaries))
]
print(repr(bounds))
labels.append(str(bounds))

def assign(self, coords, mask=None, output=None):
try:
@@ -184,7 +188,7 @@ def __init__(self, functions):
self.functions = functions
self.nbins = len(functions)
self.index_dtype = np.min_scalar_type(self.nbins)
self.labels = [repr(func) for func in functions]
self.labels = [str(func) for func in functions]

def assign(self, coords, mask=None, output=None):
if output is None:
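NumPy 2.0 changes scalar reprs (NEP 51): repr(np.float64(0.5)) now reads 'np.float64(0.5)' rather than '0.5', so bin labels built from repr() of boundary arrays would differ between NumPy versions and no longer match labels stored in existing HDF5 files. Converting the boundary values to plain Python floats (or formatting with str()) keeps the labels identical under both 1.x and 2.x. A small illustration (not from this diff):

```python
import numpy as np

value = np.float64(0.5)

# str() was left alone by NEP 51; repr() of a NumPy scalar was not.
# Plain Python floats sidestep the change entirely.
assert str(value) == '0.5'
assert repr(float(value)) == '0.5'
# Under NumPy >= 2.0, repr(value) is 'np.float64(0.5)'; under 1.x it was '0.5'.
```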
10 changes: 7 additions & 3 deletions src/westpa/core/binning/mab.py
@@ -173,8 +173,8 @@ def map_mab(coords: np.ndarray, mask: np.ndarray, output: List[int], *args, **kw
if skip is None:
skip = [0] * ndim

allcoords = np.copy(coords)
allmask = np.copy(mask)
allcoords = coords.copy()
allmask = mask.copy()

weights = None
isfinal = None
@@ -294,16 +294,20 @@ def detect_bottlenecks(unmasked_coords, unmasked_weights, n_coords, n):
"""
# Grabbing all unmasked coords in current dimension, plus corresponding weights
# Sort by current dimension in coord, smallest to largest
sorted_indices = unmasked_coords[:, n].argsort()
sorted_indices = unmasked_coords[:, n].argsort(kind='stable')

# Grab sorted coords and weights
coords_srt = unmasked_coords[sorted_indices, :]
weights_srt = unmasked_weights[sorted_indices]

# Also sort in reverse order for opposite direction
coords_srt_flip = np.flipud(coords_srt)
weights_srt_flip = np.flipud(weights_srt)

# Initialize the max directional differences along current dimension as None (these may not be updated)
bottleneck_coords, bottleneck_coords_flip = None, None
maxdiff, maxdiff_flip = -np.inf, -np.inf

# Looping through all non-boundary coords
# Compute the cumulative weight on either side of each non-boundary walker
for i in range(1, n_coords - 1):
4 changes: 2 additions & 2 deletions src/westpa/core/data_manager.py
@@ -1549,7 +1549,7 @@ def create_dataset_from_dsopts(group, dsopts, shape=None, dtype=None, data=None,
# dsopts['file'] = str(dsopts['file']).format(n_iter=n_iter)
h5_auxfile = h5io.WESTPAH5File(dsopts['file'].format(n_iter=n_iter))
h5group = group
if not ("iter_" + str(n_iter).zfill(8)) in h5_auxfile:
if ("iter_" + str(n_iter).zfill(8)) not in h5_auxfile:
h5_auxfile.create_group("iter_" + str(n_iter).zfill(8))
group = h5_auxfile[('/' + "iter_" + str(n_iter).zfill(8))]

@@ -1652,7 +1652,7 @@ def create_dataset_from_dsopts(group, dsopts, shape=None, dtype=None, data=None,
if 'file' in list(dsopts.keys()):
import h5py

if not dsopts['h5path'] in h5group:
if dsopts['h5path'] not in h5group:
h5group[dsopts['h5path']] = h5py.ExternalLink(
dsopts['file'].format(n_iter=n_iter), ("/" + "iter_" + str(n_iter).zfill(8) + "/" + dsopts['h5path'])
)
4 changes: 2 additions & 2 deletions src/westpa/core/h5io.py
@@ -344,10 +344,10 @@ def label_axes(h5object, labels, units=None):
if len(units) and len(units) != len(labels):
raise ValueError('number of units labels does not match number of axes')

h5object.attrs['axis_labels'] = np.array([np.string_(i) for i in labels])
h5object.attrs['axis_labels'] = np.array([np.bytes_(i) for i in labels])

if len(units):
h5object.attrs['axis_units'] = np.array([np.string_(i) for i in units])
h5object.attrs['axis_units'] = np.array([np.bytes_(i) for i in units])


NotGiven = object()
2 changes: 1 addition & 1 deletion src/westpa/core/kinetics/_kinetics.pyx
@@ -8,7 +8,7 @@ ctypedef np.uint16_t index_t
ctypedef np.float64_t weight_t
ctypedef np.uint8_t bool_t
ctypedef np.int64_t seg_id_t
ctypedef np.uint_t uint_t # 32 bits on 32-bit systems, 64 bits on 64-bit systems
ctypedef np.uintp_t uint_t # 32 bits on 32-bit systems, 64 bits on 64-bit systems

cdef double NAN = np.nan

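In the NumPy 1.x Cython declarations, np.uint_t maps to C unsigned long, which is only 32 bits on 64-bit Windows and is caught up in NumPy 2.0's rework of the long-based aliases; np.uintp_t is pointer-sized on every platform, which is what the original comment actually describes. A quick way to see the width guarantee from Python (illustrative only):

```python
import ctypes
import numpy as np

# np.uintp always matches the platform pointer width: 32 bits on 32-bit
# systems, 64 bits on 64-bit systems, including 64-bit Windows.
assert np.dtype(np.uintp).itemsize == ctypes.sizeof(ctypes.c_void_p)
print(np.dtype(np.uintp).itemsize * 8, 'bits')
```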
2 changes: 1 addition & 1 deletion src/westpa/core/kinetics/matrates.py
@@ -61,7 +61,7 @@ def get_steady_state(rates):

vals = np.abs(vals)
log.debug('eigenvalues: {!r}'.format(list(reversed(sorted(vals)))))
asort = np.argsort(vals)
asort = np.argsort(vals, kind='stable')
vec = vecs[:, asort[-1]]
ss = np.abs(vec)

2 changes: 1 addition & 1 deletion src/westpa/core/reweight/_reweight.pyx
@@ -21,7 +21,7 @@ ctypedef numpy.uint16_t index_t
ctypedef numpy.float64_t weight_t
ctypedef numpy.uint8_t bool_t
ctypedef numpy.int64_t trans_t
ctypedef numpy.uint_t uint_t # 32 bits on 32-bit systems, 64 bits on 64-bit systems
ctypedef numpy.uintp_t uint_t # 32 bits on 32-bit systems, 64 bits on 64-bit systems
ctypedef unsigned short Ushort
ctypedef double complex Cdouble
