Patching for generativeai Python package #1329

Merged · merged 22 commits on Dec 3, 2023
Changes from 4 commits
8 changes: 8 additions & 0 deletions Dockerfile.tmpl
@@ -634,6 +634,10 @@ RUN mkdir -p ~/src && git clone https://github.com/SohierDane/BigQuery_Helper ~/
sed -i 's/)/packages=["bq_helper"])/g' ~/src/BigQuery_Helper/setup.py && \
pip install -e ~/src/BigQuery_Helper && \
/tmp/clean-layer.sh

RUN pip install wrapt \
google-generativeai && \
/tmp/clean-layer.sh
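
For context, wrapt is installed alongside google-generativeai because the patch added below relies on wrapt's post-import hook mechanism. A minimal sketch of that pattern, for illustration only (the hook name and print are placeholders, not part of this PR):

import wrapt

@wrapt.when_imported('google.generativeai')
def _example_hook(module):
    # wrapt calls this exactly once, right after `import google.generativeai`
    # first succeeds, passing in the freshly imported module object.
    print('hook fired for', module.__name__)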

# Add BigQuery client proxy settings
ENV PYTHONUSERBASE "/root/.local"
@@ -647,6 +651,10 @@ ADD patches/sitecustomize.py /root/.local/lib/python3.10/site-packages/sitecusto
# Override default imagemagick policies
ADD patches/imagemagick-policy.xml /etc/ImageMagick-6/policy.xml

# Add generativeai Python patch
ADD patches/generativeaipatch.py /root/.local/lib/python3.10/site-packages/generativeaipatch.py
CMD ["python", "/root/.local/lib/python3.10/site-packages/generativeaipatch.py"]
Contributor
CMD is the Dockerfile instruction that sets the command run when a container is started from the image. Do not update the CMD.

You can add logic that needs to be called to sitecustomize.py, which gets called when a notebook session starts up.

Contributor
e.g.

if not hasattr(sys, 'frozen'):
    sys.meta_path.insert(0, GcpModuleFinder())
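
Building on that example, a minimal sketch of how the generativeai patch could be triggered from sitecustomize.py instead of a CMD, assuming the patch module lives in the same user site-packages directory as sitecustomize.py (the import placement is illustrative, not the final implementation):

if not hasattr(sys, 'frozen'):
    sys.meta_path.insert(0, GcpModuleFinder())
    # Importing the patch module only registers the wrapt post-import hook;
    # nothing executes until a notebook actually imports google.generativeai.
    import generativeaipatch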


# Add Kaggle module resolver
ADD patches/kaggle_module_resolver.py /opt/conda/lib/python3.10/site-packages/tensorflow_hub/kaggle_module_resolver.py
RUN sed -i '/from tensorflow_hub import uncompressed_module_resolver/a from tensorflow_hub import kaggle_module_resolver' /opt/conda/lib/python3.10/site-packages/tensorflow_hub/config.py && \
24 changes: 24 additions & 0 deletions patches/generativeaipatch.py
@@ -0,0 +1,24 @@
import wrapt
import os

@wrapt.when_imported('google.generativeai')
def post_import_logic(module):
    old_configure = module.configure
    def new_configure(*args, **kwargs):
        if ('default_metadata' in kwargs):
            default_metadata = kwargs['default_metadata']
        else:
            default_metadata = []
        kwargs['transport'] = 'rest' # Only support REST requests for now
        default_metadata.append(("x-kaggle-proxy-data", os.environ['KAGGLE_DATA_PROXY_TOKEN']))
        default_metadata.append(('x-kaggle-authorization', f"Bearer {os.environ['KAGGLE_USER_SECRETS_TOKEN']}"))
        kwargs['default_metadata'] = default_metadata
        if ('client_options' in kwargs):
            client_options = kwargs['client_options']
        else:
            client_options = {}
        client_options['api_endpoint'] = os.environ['KAGGLE_DATA_PROXY_URL'] + '/palmapi'
        kwargs['client_options'] = client_options
        old_configure(*args, **kwargs)
    module.configure = new_configure
    module.configure() # generativeai can use GOOGLE_API_KEY env variable, so make sure we have the other configs set
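
To illustrate the intended effect in a notebook session (a sketch, not part of the diff; the api_key value is a placeholder):

import google.generativeai as genai  # the post-import hook fires here and calls configure()
                                     # with the Kaggle proxy endpoint and auth metadata
genai.configure(api_key='PLACEHOLDER')  # explicit calls also go through new_configure,
                                        # so the proxy settings are re-applied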