Merge pull request #559 from sheriferson/feature/github-actions
Python 3.9/3.11 support, dropping Python 3.6 support, and moving to GitHub Actions
sheriferson authored Dec 13, 2023
2 parents 2599bc1 + 9643dc1 commit 622b8ff
Showing 31 changed files with 237 additions and 129 deletions.
20 changes: 0 additions & 20 deletions .bumpversion.cfg

This file was deleted.

2 changes: 1 addition & 1 deletion .circleci/config.yml
@@ -49,7 +49,7 @@ jobs:
          name: install test dependencies
          command: |
            . venv/bin/activate
            pip install -r test_requirements.txt
            pip install -r requirements-test.txt
      # run tests!
      - run:
85 changes: 85 additions & 0 deletions .github/workflows/github-actions-ci.yml
@@ -0,0 +1,85 @@
name: ci/github-actions

on:
  pull_request:
    branches: [ master, feature/** ]

  push:
    branches: [ master, feature/** ]


jobs:
  run-light-tests:
    env:
      SKIP_LIMS: true
      TEST_INHOUSE: false
      ALLOW_TEST_DOWNLOADS: false
      TEST_API_ENDPOINT: "http://api.brain-map.org"
      IPFX_TEST_TIMEOUT: 60

    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.9", "3.11"]

    steps:
      - uses: actions/checkout@v3
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          sudo apt-get update
          sudo apt-get install -y hdf5-tools curl
          python -m pip install --upgrade pip
          pip install -r requirements.txt
          pip install allensdk
      - name: Run tests
        run: |
          pip install -r requirements-test.txt
          mkdir -p test-results
          git lfs install
          git config lfs.url 'https://github.com/AllenInstitute/ipfx.git/info/lfs'
          git lfs env
          git lfs pull
          python -m pytest --junitxml=test-results/junit.xml --verbose

  onprem-tests:
    name: on-prem tests
    runs-on: ["self-hosted"]
    strategy:
      matrix:
        image: ["ipfx_py39:latest", "ipfx_py311:latest"]
    steps:
      - uses: actions/checkout@v4
      - name: run test in docker
        run: |
          docker run \
            --env-file ~/env.list \
            --mount type=bind,source=$PWD,target=/local1/github_worker,bind-propagation=rshared \
            -v /data/informatics/module_test_data/:/data/informatics/module_test_data/ \
            -v /allen/aibs/informatics/module_test_data/:/allen/aibs/informatics/module_test_data/ \
            -v /allen/programs/celltypes/production/mousecelltypes/:/allen/programs/celltypes/production/mousecelltypes/ \
            -v /allen/programs/celltypes/workgroups/279/:/allen/programs/celltypes/workgroups/279/ \
            -v /allen/programs/celltypes/production/humancelltypes/:/allen/programs/celltypes/production/humancelltypes/ \
            --workdir /local1/github_worker --rm \
            --user 1001:1001 \
            ${{ matrix.image }} \
            /bin/bash -c "python -m venv .venv; \
              source .venv/bin/activate; \
              pip install --upgrade pip; \
              pip install numpy; \
              pip install -r requirements.txt; \
              export TEST_API_ENDPOINT=http://api.brain-map.org; \
              export SKIP_LIMS=false; \
              export TEST_INHOUSE=true; \
              export ALLOW_TEST_DOWNLOADS=true; \
              export IPFX_TEST_TIMEOUT=60; \
              pip install -r requirements-test.txt; \
              git config lfs.url 'https://github.com/AllenInstitute/ipfx.git/info/lfs'; \
              git lfs env; \
              git lfs pull; \
              python -m pytest \
                --junitxml=test-reports/test.xml --verbose"
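
For local debugging, the light-test job above can be approximated outside CI. The following is a rough sketch, not an official procedure: it assumes a clone of the repository, Git LFS, and a supported Python (3.9 or 3.11) on the PATH, and simply mirrors the environment variables and commands from the workflow (the virtualenv step is optional and borrowed from the on-prem job).

```
# Approximate the "run-light-tests" job locally (sketch only).
export SKIP_LIMS=true
export TEST_INHOUSE=false
export ALLOW_TEST_DOWNLOADS=false
export TEST_API_ENDPOINT="http://api.brain-map.org"
export IPFX_TEST_TIMEOUT=60

python -m venv .venv && source .venv/bin/activate
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install allensdk
pip install -r requirements-test.txt

# Test data is tracked with Git LFS against the repository's LFS endpoint.
git lfs install
git config lfs.url 'https://github.com/AllenInstitute/ipfx.git/info/lfs'
git lfs pull

mkdir -p test-results
python -m pytest --junitxml=test-results/junit.xml --verbose
```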
42 changes: 42 additions & 0 deletions .github/workflows/nightly-onprem.yml
@@ -0,0 +1,42 @@
name: IPFX Nightly onprem tests
on:
  schedule:
    - cron: '05 6 * * *'


jobs:
  onprem:
    name: on-prem tests
    runs-on: ["self-hosted"]
    strategy:
      matrix:
        image: ["ipfx_py39:latest", "ipfx_py311:latest"]
        branch: ["master", "feature/**"]
    steps:
      - uses: actions/checkout@v4
        with:
          ref: ${{ matrix.branch }}
      - name: run test in docker
        run: |
          docker run \
            --mount type=bind,source=$PWD,target=/local1/github_worker,bind-propagation=rshared \
            -v /data/informatics/module_test_data/:/data/informatics/module_test_data/ \
            -v /allen/aibs/informatics/module_test_data/:/allen/aibs/informatics/module_test_data/ \
            -v /allen/programs/celltypes/production/mousecelltypes/:/allen/programs/celltypes/production/mousecelltypes/ \
            -v /allen/programs/celltypes/workgroups/279/:/allen/programs/celltypes/workgroups/279/ \
            -v /allen/programs/celltypes/production/humancelltypes/:/allen/programs/celltypes/production/humancelltypes/ \
            --workdir /local1/github_worker --rm \
            --user 1001:1001 \
            ${{ matrix.image }} \
            /bin/bash -c "python -m venv .venv; \
              source .venv/bin/activate; \
              pip install --upgrade pip; \
              pip install numpy; \
              pip install -r requirements.txt; \
              export TEST_API_ENDPOINT=http://api.brain-map.org; \
              pip install -r requirements-test.txt; \
              git config lfs.url 'https://github.com/AllenInstitute/ipfx.git/info/lfs'; \
              git lfs env; \
              git lfs pull; \
              python -m pytest \
                --junitxml=test-reports/test.xml --verbose"
13 changes: 11 additions & 2 deletions .gitignore
@@ -78,5 +78,14 @@ sandbox/
#linters
.mypy_cache

#virtualenv
venv
# virtualenv / environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# MacOS Finder
.DS_Store
1 change: 1 addition & 0 deletions AUTHORS.rst
@@ -12,3 +12,4 @@ Credits
* Nile Graddis @nilegraddis
* Sergey Gratiy @sgratiy
* Yang Yu @gnayuy
* Sherif Soliman @sheriferson
3 changes: 3 additions & 0 deletions CHANGELOG.md
@@ -6,6 +6,9 @@ All notable changes to this project will be documented in this file.
### Added

### Changed
- Removed Python 3.6 support
- Updated dependencies and library code to support Python 3.9 through 3.11
- Moved CI and testing to GitHub Actions

## [1.0.8] - 2023-06-29
Changed:
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -96,7 +96,7 @@ visual git plugins, we prefer the following convention for branch naming:
GH-712/bugfix/auto-reward-key
GH-9999/feature/parallel-behavior-analysis
```
* Create an environment and install necessary requirements: `requirements.txt` and `test_requirements.txt`
* Create an environment and install necessary requirements: `requirements.txt` and `requirements-test.txt`
* Start writing code!
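
A minimal sketch of the environment-setup step above, assuming a POSIX shell and a supported Python (3.9+); any virtual-environment tool should work equally well:

```
python -m venv .venv
source .venv/bin/activate
pip install --upgrade pip
pip install -r requirements.txt
pip install -r requirements-test.txt
```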

### Style guidelines
1 change: 0 additions & 1 deletion MANIFEST.in
@@ -1,6 +1,5 @@
include requirements.txt
include requirements-test.txt
include requirements-dev.txt
include AUTHORS.rst
include CHANGELOG.md
include README.md
6 changes: 2 additions & 4 deletions README.md
@@ -22,8 +22,6 @@ We welcome contributions! Please see our [contribution guide](https://github.com

Deprecation Warning
-------------------
The 1.0.0 release of ipfx brings some new features, like NWB2 support, along with improvements to our documentation and testing. We will also drop support for
- NWB1
- Python 2

Older versions of ipfx will continue to be available, but may receive only occasional bugfixes and patches.
The 2.0.0 release of IPFX drops support for Python 3.6, which reached end of life and stopped receiving security updates on December 23, 2021.
IPFX is now tested on Python 3.9 and higher.
21 changes: 21 additions & 0 deletions docker/README.md
@@ -0,0 +1,21 @@
# Docker images for on premise testing

This directory contains Dockerfiles for building the images used to run on-prem tests that require internal Allen Institute resources. The on-prem tests run on GitHub self-hosted runners, inside Docker containers built from these Dockerfiles.

Our light and on-prem tests are defined in [our workflow file](../.github/workflows/github-actions-ci.yml "Link to GitHub Actions workflow for light and on-prem tests").

- See [here](https://docs.github.com/en/actions/hosting-your-own-runners/about-self-hosted-runners) for more information on self-hosted runners.
- See [here](https://docs.github.com/en/actions/hosting-your-own-runners/adding-self-hosted-runners) for more information on adding self-hosted runners to a GitHub repository.

## Building images

If you are an Allen Institute developer, you will have instructions on how to access the machine running the IPFX self-hosted runner.

On this machine, you can build the Docker images for the Python versions we test against by running, for example:

```
cd ipfx/docker/py39
docker build -t ipfx_py39:latest .
```
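
The on-prem workflow matrix also references an `ipfx_py311:latest` image; presumably it is built the same way from the `docker/py311` directory added in this change:

```
cd ipfx/docker/py311
docker build -t ipfx_py311:latest .
```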

And this should be sufficient for the on-prem tests defined in our GitHub workflow to run.
10 changes: 10 additions & 0 deletions docker/py311/Dockerfile
@@ -0,0 +1,10 @@
FROM python:3.11

RUN apt-get update \
&& apt-get install -y \
hdf5-tools \
curl \
git-lfs \
&& rm -rf /var/lib/apt/lists/*

RUN curl https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | bash
10 changes: 5 additions & 5 deletions docker/py36/Dockerfile → docker/py39/Dockerfile
@@ -1,10 +1,10 @@
FROM python:3.6
FROM python:3.9

RUN apt-get update \
&& apt-get install -y \
hdf5-tools \
curl
curl \
git-lfs \
&& rm -rf /var/lib/apt/lists/*

RUN curl https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | bash

RUN apt-get install git-lfs
RUN curl https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | bash
2 changes: 1 addition & 1 deletion ipfx/attach_metadata/sink/metadata_sink.py
@@ -112,7 +112,7 @@ def _ensure_plural_targets(
            targets = self.targets
        elif isinstance(targets, dict):
            targets = [targets]
        elif isinstance(targets, collections.Sequence):
        elif isinstance(targets, collections.abc.Sequence):
            targets = list(targets)
        else:
            raise ValueError(
34 changes: 0 additions & 34 deletions ipfx/attach_metadata/sink/nwb2_sink.py
@@ -92,7 +92,6 @@ def _commit_nwb_changes(self):
        recorded until _reload_nwbfile
        """

        set_container_sources(self.nwbfile, self._h5_file.filename)
        self.nwbfile.set_modified(True)
        # Because the NWB schema versions of NWB data produced by MIES are older
        # we do not want to cache the newer schema versions that IPFX is currently using
@@ -236,36 +235,3 @@ def serialize(self, targets: Optional[OneOrMany[Dict[str, Any]]] = None):
                file_.write(self._data.getvalue())

        self._reload_nwbfile()


def set_container_sources(
    container: hdmf.container.AbstractContainer,
    source: str
):
    """Traverse an NWBFile starting at a given container, setting the
    container_source attribute inplace on each container.
    Parameters
    ----------
    container : container_source will be set on this object as well as on
        each of its applicable children.
    source : The new value of container source
    """
    children = [container]
    while children:
        current = children.pop()

        # 💀💀💀
        # container_source is set on write, but cannot be overrwritten, making
        # read -> modify -> write elsewhere
        # pretty tricky!
        # this is a fragile workaround
        if hasattr(current, "_AbstractContainer__container_source"):
            setattr(
                current,
                "_AbstractContainer__container_source",
                source
            )

        if hasattr(current, "children"):
            children.extend(current.children)
7 changes: 6 additions & 1 deletion ipfx/dataset/create.py
@@ -31,7 +31,12 @@ def is_file_mies(path: str) -> bool:
    with h5py.File(path, "r") as fil:
        if "generated_by" in fil["general"].keys():
            generated_by = dict(fil["general"]["generated_by"][:])
            return generated_by.get("Package", "None") == "MIES"
            try:
                decoded_generation_info = {k.decode(): v.decode() for k, v in generated_by.items()}
            except:
                decoded_generation_info = generated_by

            return decoded_generation_info.get("Package", "None") == "MIES"

    return False

2 changes: 1 addition & 1 deletion ipfx/version.txt
@@ -1 +1 @@
1.0.8
2.0.0
4 changes: 2 additions & 2 deletions ipfx/x_to_nwb/ABFConverter.py
@@ -79,10 +79,10 @@ def __init__(self, inFileOrFolder, outFile, outputFeedbackChannel, compression=T
        nwbFile.add_ic_electrode(electrodes)

        for i in self._createStimulusSeries(electrodes):
            nwbFile.add_stimulus(i)
            nwbFile.add_stimulus(i, use_sweep_table=True)

        for i in self._createAcquiredSeries(electrodes):
            nwbFile.add_acquisition(i)
            nwbFile.add_acquisition(i, use_sweep_table=True)

        with NWBHDF5IO(outFile, "w") as io:
            io.write(nwbFile, cache_spec=True)
4 changes: 2 additions & 2 deletions ipfx/x_to_nwb/DatConverter.py
@@ -71,10 +71,10 @@ def generateList(multipleGroupsPerFile, pul):
        nwbFile.add_ic_electrode(electrodes)

        for i in self._createAcquiredSeries(electrodes, elem):
            nwbFile.add_acquisition(i)
            nwbFile.add_acquisition(i, use_sweep_table=True)

        for i in self._createStimulusSeries(electrodes, elem):
            nwbFile.add_stimulus(i)
            nwbFile.add_stimulus(i, use_sweep_table=True)

        if multipleGroupsPerFile:
            outFileFmt = outFile
File renamed without changes.
23 changes: 12 additions & 11 deletions requirements.txt
@@ -1,17 +1,18 @@
argschema<2.0.0
allensdk
argschema
dictdiffer
h5py==2.10.0
marshmallow==3.0.0rc6
matplotlib>=1.4.3
h5py
marshmallow
matplotlib
methodtools
numpy>=1.15.4,<1.19.0
pandas>=0.25.1,<=0.25.3
numpy
pandas
pg8000
pillow
pyabf<2.3.0
pynwb>=1.3.2,<2.0.0
pyYAML<6.0.0
scipy>=0.15.1
simplejson>=3.10.0
pyabf
pynwb==2.2.0
pyYAML
ruamel.yaml<0.18.0
scipy
simplejson
watchdog