Added dependency upgrades for python3.12 support (#1951)
* Python3.12 changes

* Added the python3.12 version

* Code changes for the inner pyproject.toml

* Fixes for the scikit-learn

* Added the missing git repository url

* Added the custom github repo url

* Added the gcc linux as well

* Added the new gcc environment as well

* Upgraded alibi-detect to remove the creme dependency

* Added alibi as well to the restriction

* Committing the changes for gcc and alibi

* Changes for the scikit-learn

* #1951 resolved the comments and rebased with master

* #1951 reverted the code to address the comments

* #1951 Fixed the issues

* #1951 Rebased with master and addressed the comments for python3.12 upgrade

* #1951 Upgraded the tensorflow for compatibility with python version

* update alibi libraries TF versions

* add min torch and tf versions for python 3.12 support

* update torch min version in mlflow runtime to support python 3.12

* update lockfiles by running `make lock`

* Updated the MLServer version to 1.6.1

* Update tests.yml

* Revert the version changes testing

* Updated the lock files using make lock command

* build(deps): bump mlflow from 2.17.2 to 2.18.0 in /runtimes/mlflow (#1970)

Bumps [mlflow](https://github.com/mlflow/mlflow) from 2.17.2 to 2.18.0.
- [Release notes](https://github.com/mlflow/mlflow/releases)
- [Changelog](https://github.com/mlflow/mlflow/blob/master/CHANGELOG.md)
- [Commits](https://github.com/mlflow/mlflow/compare/v2.17.2...v2.18.0)

---
updated-dependencies:
- dependency-name: mlflow
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
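
The `update-type: version-update:semver-minor` label above follows semantic-versioning conventions: 2.17.2 → 2.18.0 changes the middle component. A minimal sketch of how such a bump could be classified (the function name and logic are illustrative, not dependabot's actual implementation):

```python
def classify_bump(old: str, new: str) -> str:
    """Classify a dotted-numeric version bump as major, minor, or patch (sketch)."""
    old_parts = [int(p) for p in old.split(".")]
    new_parts = [int(p) for p in new.split(".")]
    # Compare component by component; the first differing level names the bump.
    for level, (o, n) in zip(("major", "minor", "patch"), zip(old_parts, new_parts)):
        if n != o:
            return level
    return "none"

print(classify_bump("2.17.2", "2.18.0"))  # → minor
```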


* upgrade alibi libraries to point to master

* upgrade tf version as dev deps

* update lock files

* update lock

* add tests for py3.11 and py3.12

* generate grpc protos

* add tf-keras dep

* update lockfile for hf runtime

* revert back mlflow deps list

* use legacy keras in alibi-detect / alibi-explain tests

* Add docs about supported python versions

* adjust test to use python 3.10

---------

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: Sherif Akoush <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
3 people authored Dec 17, 2024
1 parent 5fb0c8f commit e87943a
Showing 34 changed files with 4,516 additions and 3,946 deletions.
18 changes: 5 additions & 13 deletions .github/workflows/tests.yml
@@ -13,7 +13,7 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        python-version: ["3.9", "3.10"]
+        python-version: ["3.9", "3.10", "3.11", "3.12"]
         target: ["dataplane", "model-repository"]
     runs-on: ubuntu-22.04
     steps:
@@ -36,9 +36,7 @@ jobs:
   lint:
     strategy:
       matrix:
-        python-version:
-          - "3.9"
-          - "3.10"
+        python-version: ["3.9", "3.10", "3.11", "3.12"]
     runs-on: ubuntu-22.04
     steps:
       - uses: actions/checkout@v4
@@ -64,9 +62,7 @@ jobs:
           - macos-13
         # NOTE: There's no pre-built `grpcio` wheel for Python 3.11 yet
         # https://github.com/grpc/grpc/issues/32454
-        python-version:
-          - "3.9"
-          - "3.10"
+        python-version: ["3.9", "3.10", "3.11", "3.12"]
         is-pr:
           - ${{ github.event_name == 'pull_request' }}
         exclude:
@@ -111,9 +107,7 @@ jobs:
         os:
           - ubuntu-22.04
           - macos-13
-        python-version:
-          - "3.9"
-          - "3.10"
+        python-version: ["3.9", "3.10", "3.11", "3.12"]
         tox-environment:
           - sklearn
           - xgboost
@@ -166,9 +160,7 @@ jobs:
         os:
           - ubuntu-22.04
           - macos-12
-        python-version:
-          - "3.9"
-          - "3.10"
+        python-version: ["3.9", "3.10", "3.11", "3.12"]
     runs-on: ${{ matrix.os }}
     steps:
       - name: Maximize build space
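
GitHub Actions expands the `matrix` keys above into a cartesian product of jobs, so adding two Python versions doubles the job count per target. A quick way to reason about the resulting set (a sketch for counting combinations, not how Actions itself is implemented):

```python
from itertools import product

# Matrix axes from the first job in tests.yml after this change.
python_versions = ["3.9", "3.10", "3.11", "3.12"]
targets = ["dataplane", "model-repository"]

# Each (python-version, target) pair becomes one job in the matrix.
jobs = list(product(python_versions, targets))
print(len(jobs))  # → 8
```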
5 changes: 3 additions & 2 deletions README.md
@@ -100,8 +100,9 @@ MLServer is licensed under the Apache License, Version 2.0. However please note
 | 3.8 | 🔴 |
 | 3.9 | 🟢 |
 | 3.10 | 🟢 |
-| 3.11 | 🔵 |
-| 3.12 | 🔵 |
+| 3.11 | 🟢 |
+| 3.12 | 🟢 |
+| 3.13 | 🔴 |

## Examples

5 changes: 3 additions & 2 deletions docs-gb/README.md
@@ -98,8 +98,9 @@ Out of the box, MLServer provides support for:
 | 3.8 | 🔴 |
 | 3.9 | 🟢 |
 | 3.10 | 🟢 |
-| 3.11 | 🔵 |
-| 3.12 | 🔵 |
+| 3.11 | 🟢 |
+| 3.12 | 🟢 |
+| 3.13 | 🔴 |

## Examples

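
The support tables above (🟢 supported, 🔴 unsupported) can be expressed as a simple interpreter guard. The helper below is a hypothetical illustration of that mapping, not part of MLServer's codebase:

```python
import sys

# Support status taken from the table above: True = supported (🟢).
SUPPORTED = {
    (3, 8): False,
    (3, 9): True,
    (3, 10): True,
    (3, 11): True,
    (3, 12): True,
    (3, 13): False,
}


def is_supported(major: int, minor: int) -> bool:
    """Return whether this release is documented as supporting the Python version."""
    return SUPPORTED.get((major, minor), False)


if not is_supported(*sys.version_info[:2]):
    print(f"Warning: Python {sys.version_info[0]}.{sys.version_info[1]} is untested")
```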
35 changes: 23 additions & 12 deletions mlserver/grpc/dataplane_pb2.py


46 changes: 46 additions & 0 deletions mlserver/grpc/dataplane_pb2_grpc.py
@@ -1,9 +1,32 @@
 # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
 """Client and server classes corresponding to protobuf-defined services."""
 import grpc
+import warnings

 from . import dataplane_pb2 as dataplane__pb2

+GRPC_GENERATED_VERSION = "1.67.1"
+GRPC_VERSION = grpc.__version__
+_version_not_supported = False
+
+try:
+    from grpc._utilities import first_version_is_lower
+
+    _version_not_supported = first_version_is_lower(
+        GRPC_VERSION, GRPC_GENERATED_VERSION
+    )
+except ImportError:
+    _version_not_supported = True
+
+if _version_not_supported:
+    raise RuntimeError(
+        f"The grpc package installed is at version {GRPC_VERSION},"
+        + f" but the generated code in dataplane_pb2_grpc.py depends on"
+        + f" grpcio>={GRPC_GENERATED_VERSION}."
+        + f" Please upgrade your grpc module to grpcio>={GRPC_GENERATED_VERSION}"
+        + f" or downgrade your generated code using grpcio-tools<={GRPC_VERSION}."
+    )
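
The generated guard relies on grpc's internal `first_version_is_lower` helper to compare the installed `grpcio` against the version the stubs were generated with. A simplified stand-in that captures the numeric-comparison idea (the real helper also handles pre-release suffixes such as `.dev0`):

```python
def version_is_lower(installed: str, required: str) -> bool:
    """Compare dotted numeric versions; True if installed < required (sketch only)."""
    def to_tuple(version: str) -> tuple:
        # Keep only purely numeric components, e.g. "1.67.1" -> (1, 67, 1).
        return tuple(int(part) for part in version.split(".") if part.isdigit())

    return to_tuple(installed) < to_tuple(required)

print(version_is_lower("1.60.0", "1.67.1"))  # → True
print(version_is_lower("1.67.1", "1.67.1"))  # → False
```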


class GRPCInferenceServiceStub(object):
"""
@@ -21,51 +44,61 @@ def __init__(self, channel):
             "/inference.GRPCInferenceService/ServerLive",
             request_serializer=dataplane__pb2.ServerLiveRequest.SerializeToString,
             response_deserializer=dataplane__pb2.ServerLiveResponse.FromString,
+            _registered_method=True,
         )
         self.ServerReady = channel.unary_unary(
             "/inference.GRPCInferenceService/ServerReady",
             request_serializer=dataplane__pb2.ServerReadyRequest.SerializeToString,
             response_deserializer=dataplane__pb2.ServerReadyResponse.FromString,
+            _registered_method=True,
         )
         self.ModelReady = channel.unary_unary(
             "/inference.GRPCInferenceService/ModelReady",
             request_serializer=dataplane__pb2.ModelReadyRequest.SerializeToString,
             response_deserializer=dataplane__pb2.ModelReadyResponse.FromString,
+            _registered_method=True,
         )
         self.ServerMetadata = channel.unary_unary(
             "/inference.GRPCInferenceService/ServerMetadata",
             request_serializer=dataplane__pb2.ServerMetadataRequest.SerializeToString,
             response_deserializer=dataplane__pb2.ServerMetadataResponse.FromString,
+            _registered_method=True,
         )
         self.ModelMetadata = channel.unary_unary(
             "/inference.GRPCInferenceService/ModelMetadata",
             request_serializer=dataplane__pb2.ModelMetadataRequest.SerializeToString,
             response_deserializer=dataplane__pb2.ModelMetadataResponse.FromString,
+            _registered_method=True,
         )
         self.ModelInfer = channel.unary_unary(
             "/inference.GRPCInferenceService/ModelInfer",
             request_serializer=dataplane__pb2.ModelInferRequest.SerializeToString,
             response_deserializer=dataplane__pb2.ModelInferResponse.FromString,
+            _registered_method=True,
         )
         self.ModelStreamInfer = channel.stream_stream(
             "/inference.GRPCInferenceService/ModelStreamInfer",
             request_serializer=dataplane__pb2.ModelInferRequest.SerializeToString,
             response_deserializer=dataplane__pb2.ModelInferResponse.FromString,
+            _registered_method=True,
         )
         self.RepositoryIndex = channel.unary_unary(
             "/inference.GRPCInferenceService/RepositoryIndex",
             request_serializer=dataplane__pb2.RepositoryIndexRequest.SerializeToString,
             response_deserializer=dataplane__pb2.RepositoryIndexResponse.FromString,
+            _registered_method=True,
         )
         self.RepositoryModelLoad = channel.unary_unary(
             "/inference.GRPCInferenceService/RepositoryModelLoad",
             request_serializer=dataplane__pb2.RepositoryModelLoadRequest.SerializeToString,
             response_deserializer=dataplane__pb2.RepositoryModelLoadResponse.FromString,
+            _registered_method=True,
         )
         self.RepositoryModelUnload = channel.unary_unary(
             "/inference.GRPCInferenceService/RepositoryModelUnload",
             request_serializer=dataplane__pb2.RepositoryModelUnloadRequest.SerializeToString,
             response_deserializer=dataplane__pb2.RepositoryModelUnloadResponse.FromString,
+            _registered_method=True,
         )


@@ -193,6 +226,9 @@ def add_GRPCInferenceServiceServicer_to_server(servicer, server):
         "inference.GRPCInferenceService", rpc_method_handlers
     )
     server.add_generic_rpc_handlers((generic_handler,))
+    server.add_registered_method_handlers(
+        "inference.GRPCInferenceService", rpc_method_handlers
+    )
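
Conceptually, both `add_generic_rpc_handlers` and the newer `add_registered_method_handlers` let the server resolve a fully-qualified RPC path like `/inference.GRPCInferenceService/ServerLive` to a handler. A toy stand-in of that path-to-handler lookup (not the grpc implementation):

```python
class ToyServer:
    """Minimal stand-in mapping '/Service/Method' paths to handler objects."""

    def __init__(self):
        self.handlers = {}

    def add_registered_method_handlers(self, service, rpc_method_handlers):
        # Mirror the generated code's call shape: service name + method dict.
        for method, handler in rpc_method_handlers.items():
            self.handlers[f"/{service}/{method}"] = handler


server = ToyServer()
server.add_registered_method_handlers(
    "inference.GRPCInferenceService", {"ServerLive": "handler-stub"}
)
print("/inference.GRPCInferenceService/ServerLive" in server.handlers)  # → True
```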


# This class is part of an EXPERIMENTAL API.
Expand Down Expand Up @@ -229,6 +265,7 @@ def ServerLive(
wait_for_ready,
timeout,
metadata,
_registered_method=True,
)

@staticmethod
Expand Down Expand Up @@ -258,6 +295,7 @@ def ServerReady(
wait_for_ready,
timeout,
metadata,
_registered_method=True,
)

@staticmethod
Expand Down Expand Up @@ -287,6 +325,7 @@ def ModelReady(
wait_for_ready,
timeout,
metadata,
_registered_method=True,
)

@staticmethod
Expand Down Expand Up @@ -316,6 +355,7 @@ def ServerMetadata(
wait_for_ready,
timeout,
metadata,
_registered_method=True,
)

@staticmethod
Expand Down Expand Up @@ -345,6 +385,7 @@ def ModelMetadata(
wait_for_ready,
timeout,
metadata,
_registered_method=True,
)

@staticmethod
Expand Down Expand Up @@ -374,6 +415,7 @@ def ModelInfer(
wait_for_ready,
timeout,
metadata,
_registered_method=True,
)

@staticmethod
Expand Down Expand Up @@ -403,6 +445,7 @@ def ModelStreamInfer(
wait_for_ready,
timeout,
metadata,
_registered_method=True,
)

@staticmethod
Expand Down Expand Up @@ -432,6 +475,7 @@ def RepositoryIndex(
wait_for_ready,
timeout,
metadata,
_registered_method=True,
)

@staticmethod
Expand Down Expand Up @@ -461,6 +505,7 @@ def RepositoryModelLoad(
wait_for_ready,
timeout,
metadata,
_registered_method=True,
)

@staticmethod
Expand Down Expand Up @@ -490,4 +535,5 @@ def RepositoryModelUnload(
wait_for_ready,
timeout,
metadata,
_registered_method=True,
)
11 changes: 8 additions & 3 deletions mlserver/grpc/model_repository_pb2.py

