
Releasing version 2.138.1
oci-dex-release-bot authored Nov 12, 2024
2 parents ea06d06 + 1a59c55 commit 22fd62c
Showing 144 changed files with 10,360 additions and 175 deletions.
14 changes: 14 additions & 0 deletions CHANGELOG.rst
@@ -3,6 +3,20 @@ Change Log
All notable changes to this project will be documented in this file.

The format is based on `Keep a Changelog <http://keepachangelog.com/>`_.
====================
2.138.1 - 2024-11-12
====================

Added
-----
* Support for calling Oracle Cloud Infrastructure services in the me-alain-1 region
* Support for connection refresh in the GoldenGate service
* Support for secret compartment id in import and export operations of deployment wallet in the GoldenGate service
* Support for creating metadata only backups in the GoldenGate service
* Support for Llama 3.2 unit shape in Generative AI service
* Support for Llama 3.2 vision in Generative AI Inference service
* Support for Cohere CommandR response format in Generative AI Inference service
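
The new Cohere response-format models can be exercised through the chat API. Below is a minimal sketch, assuming SDK 2.138.1, config-file authentication, and that `CohereChatRequest` accepts the new format via a `response_format` keyword; the compartment OCID and model ID are placeholders.

```python
import oci

# Standard config-file auth; assumes ~/.oci/config with a DEFAULT profile.
config = oci.config.from_file()
client = oci.generative_ai_inference.GenerativeAiInferenceClient(config)
models = oci.generative_ai_inference.models

# Ask a Cohere Command R model to answer as JSON using the new response-format types.
chat_request = models.CohereChatRequest(
    message="List three OCI regions.",
    max_tokens=200,
    response_format=models.CohereResponseJsonFormat(),  # keyword assumed to be added in 2.138.1
)

chat_details = models.ChatDetails(
    compartment_id="ocid1.compartment.oc1..exampleuniqueID",  # placeholder
    serving_mode=models.OnDemandServingMode(model_id="cohere.command-r-plus"),  # placeholder
    chat_request=chat_request,
)

print(client.chat(chat_details).data.chat_response.text)
```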

====================
2.138.0 - 2024-11-05
====================
5 changes: 5 additions & 0 deletions docs/api/generative_ai_inference.rst
@@ -34,6 +34,9 @@ Generative Ai Inference
oci.generative_ai_inference.models.CohereLlmInferenceResponse
oci.generative_ai_inference.models.CohereMessage
oci.generative_ai_inference.models.CohereParameterDefinition
oci.generative_ai_inference.models.CohereResponseFormat
oci.generative_ai_inference.models.CohereResponseJsonFormat
oci.generative_ai_inference.models.CohereResponseTextFormat
oci.generative_ai_inference.models.CohereSystemMessage
oci.generative_ai_inference.models.CohereTool
oci.generative_ai_inference.models.CohereToolCall
@@ -48,6 +51,8 @@ Generative Ai Inference
oci.generative_ai_inference.models.GeneratedText
oci.generative_ai_inference.models.GenericChatRequest
oci.generative_ai_inference.models.GenericChatResponse
oci.generative_ai_inference.models.ImageContent
oci.generative_ai_inference.models.ImageUrl
oci.generative_ai_inference.models.LlamaLlmInferenceRequest
oci.generative_ai_inference.models.LlamaLlmInferenceResponse
oci.generative_ai_inference.models.LlmInferenceRequest
Original file line number Diff line number Diff line change
@@ -0,0 +1,11 @@
CohereResponseFormat
====================

.. currentmodule:: oci.generative_ai_inference.models

.. autoclass:: CohereResponseFormat
:show-inheritance:
:special-members: __init__
:members:
:undoc-members:
:inherited-members:
Original file line number Diff line number Diff line change
@@ -0,0 +1,11 @@
CohereResponseJsonFormat
========================

.. currentmodule:: oci.generative_ai_inference.models

.. autoclass:: CohereResponseJsonFormat
:show-inheritance:
:special-members: __init__
:members:
:undoc-members:
:inherited-members:
Original file line number Diff line number Diff line change
@@ -0,0 +1,11 @@
CohereResponseTextFormat
========================

.. currentmodule:: oci.generative_ai_inference.models

.. autoclass:: CohereResponseTextFormat
:show-inheritance:
:special-members: __init__
:members:
:undoc-members:
:inherited-members:
Original file line number Diff line number Diff line change
@@ -0,0 +1,11 @@
ImageContent
============

.. currentmodule:: oci.generative_ai_inference.models

.. autoclass:: ImageContent
:show-inheritance:
:special-members: __init__
:members:
:undoc-members:
:inherited-members:
Original file line number Diff line number Diff line change
@@ -0,0 +1,11 @@
ImageUrl
========

.. currentmodule:: oci.generative_ai_inference.models

.. autoclass:: ImageUrl
:show-inheritance:
:special-members: __init__
:members:
:undoc-members:
:inherited-members:
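
`ImageContent` and `ImageUrl` back the Llama 3.2 vision support. A hedged sketch of a multimodal chat turn, assuming the GENERIC request format and that `ImageUrl` accepts a base64 data URL; the file name, model ID, and compartment OCID are placeholders.

```python
import base64
import oci

config = oci.config.from_file()
client = oci.generative_ai_inference.GenerativeAiInferenceClient(config)
models = oci.generative_ai_inference.models

# Encode a local image as a data URL (assumed to be accepted by ImageUrl).
with open("diagram.png", "rb") as f:
    data_url = "data:image/png;base64," + base64.b64encode(f.read()).decode()

message = models.UserMessage(content=[
    models.TextContent(text="Describe this diagram."),
    models.ImageContent(image_url=models.ImageUrl(url=data_url)),
])

chat_details = models.ChatDetails(
    compartment_id="ocid1.compartment.oc1..exampleuniqueID",  # placeholder
    serving_mode=models.OnDemandServingMode(
        model_id="meta.llama-3.2-90b-vision-instruct"  # placeholder model ID
    ),
    chat_request=models.GenericChatRequest(messages=[message], max_tokens=300),
)

result = client.chat(chat_details).data
print(result.chat_response.choices[0].message.content[0].text)
```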
2 changes: 2 additions & 0 deletions docs/api/golden_gate.rst
@@ -89,6 +89,7 @@ Golden Gate
oci.golden_gate.models.DefaultCancelDeploymentUpgradeDetails
oci.golden_gate.models.DefaultCancelSnoozeDeploymentUpgradeDetails
oci.golden_gate.models.DefaultDeploymentWalletExistsDetails
oci.golden_gate.models.DefaultRefreshConnectionDetails
oci.golden_gate.models.DefaultRestoreDeploymentDetails
oci.golden_gate.models.DefaultRollbackDeploymentUpgradeDetails
oci.golden_gate.models.DefaultSnoozeDeploymentUpgradeDetails
@@ -164,6 +165,7 @@ Golden Gate
oci.golden_gate.models.PostgresqlConnectionSummary
oci.golden_gate.models.RedisConnection
oci.golden_gate.models.RedisConnectionSummary
oci.golden_gate.models.RefreshConnectionDetails
oci.golden_gate.models.RemoveResourceLockDetails
oci.golden_gate.models.RescheduleDeploymentUpgradeDetails
oci.golden_gate.models.RescheduleDeploymentUpgradeToDateDetails
Original file line number Diff line number Diff line change
@@ -0,0 +1,11 @@
DefaultRefreshConnectionDetails
===============================

.. currentmodule:: oci.golden_gate.models

.. autoclass:: DefaultRefreshConnectionDetails
:show-inheritance:
:special-members: __init__
:members:
:undoc-members:
:inherited-members:
Original file line number Diff line number Diff line change
@@ -0,0 +1,11 @@
RefreshConnectionDetails
========================

.. currentmodule:: oci.golden_gate.models

.. autoclass:: RefreshConnectionDetails
:show-inheritance:
:special-members: __init__
:members:
:undoc-members:
:inherited-members:
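
These stubs document the models behind the new connection-refresh support in GoldenGate. A minimal sketch, assuming the operation is exposed as `refresh_connection` on `GoldenGateClient`; the connection OCID is a placeholder.

```python
import oci

config = oci.config.from_file()
client = oci.golden_gate.GoldenGateClient(config)

# DefaultRefreshConnectionDetails selects the DEFAULT refresh type.
response = client.refresh_connection(
    connection_id="ocid1.goldengateconnection.oc1..exampleuniqueID",  # placeholder
    refresh_connection_details=oci.golden_gate.models.DefaultRefreshConnectionDetails(),
)

# The refresh runs asynchronously; track it through the returned work request.
print(response.headers.get("opc-work-request-id"))
```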
5 changes: 5 additions & 0 deletions examples/showoci/CHANGELOG.rst
@@ -4,6 +4,11 @@ All notable changes to this project will be documented in this file.

The format is based on `Keep a Changelog <http://keepachangelog.com/>`_.

=====================
24.11.12 - 24.11.12
=====================
* Added unique id to Vault CSVs

=====================
24.08.27 - 24.08.27
=====================
2 changes: 1 addition & 1 deletion examples/showoci/README.md
@@ -237,7 +237,7 @@ ssh opc@UsageVM

```text
sudo yum -y update
sudo yum -y git
sudo yum -y install git
sudo dnf -y module install python39
sudo dnf -y install python39-pip
sudo dnf -y install python39-setuptools
2 changes: 1 addition & 1 deletion examples/showoci/showoci.py
@@ -128,7 +128,7 @@
import os
import time

version = "24.08.27"
version = "24.11.12"

##########################################################################
# check OCI version
2 changes: 1 addition & 1 deletion examples/showoci/showoci_data.py
@@ -20,7 +20,7 @@


class ShowOCIData(object):
version = "24.08.27"
version = "24.11.12"

############################################
# ShowOCIService - Service object to query
5 changes: 3 additions & 2 deletions examples/showoci/showoci_output.py
@@ -22,7 +22,7 @@


class ShowOCIOutput(object):
version = "24.08.27"
version = "24.11.12"

##########################################################################
# spaces for align
@@ -9356,7 +9356,8 @@ def __csv_security_kms_vaults(self, region_name, kms_vaults):
'software_key_version_count': ar['software_key_version_count'],
'time_created': ar['time_created'][0:16],
'lifecycle_state': ar['lifecycle_state'],
'id': ar['id'],
'vault_id': ar['id'],
'id': ar['id'] + ":" + region_name,
'freeform_tags': self.__get_freeform_tags(ar['freeform_tags']),
'defined_tags': self.__get_defined_tags(ar['defined_tags'])
}
2 changes: 1 addition & 1 deletion examples/showoci/showoci_service.py
@@ -39,7 +39,7 @@
# class ShowOCIService
##########################################################################
class ShowOCIService(object):
version = "24.08.27"
version = "24.11.12"
oci_compatible_version = "2.129.4"
thread_lock = threading.Lock()
collection_ljust = 40
Original file line number Diff line number Diff line change
@@ -223,10 +223,14 @@ def unit_shape(self):
- LARGE_COHERE_V2
- SMALL_COHERE
- SMALL_COHERE_V2
- SMALL_COHERE_4
- EMBED_COHERE
- LLAMA2_70
- LARGE_GENERIC
- LARGE_COHERE_V2_2
- LARGE_GENERIC_4
- SMALL_GENERIC_V2
- LARGE_GENERIC_2


:return: The unit_shape of this CreateDedicatedAiClusterDetails.
@@ -245,10 +249,14 @@ def unit_shape(self, unit_shape):
- LARGE_COHERE_V2
- SMALL_COHERE
- SMALL_COHERE_V2
- SMALL_COHERE_4
- EMBED_COHERE
- LLAMA2_70
- LARGE_GENERIC
- LARGE_COHERE_V2_2
- LARGE_GENERIC_4
- SMALL_GENERIC_V2
- LARGE_GENERIC_2


:param unit_shape: The unit_shape of this CreateDedicatedAiClusterDetails.
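
The newly added unit shapes are requested when sizing a cluster through the management API. A sketch, assuming config-file auth and a compartment with Generative AI quota; the compartment OCID is a placeholder and `LARGE_GENERIC_2` is simply one of the shapes introduced in this release.

```python
import oci

config = oci.config.from_file()
client = oci.generative_ai.GenerativeAiClient(config)

details = oci.generative_ai.models.CreateDedicatedAiClusterDetails(
    display_name="hosting-cluster-example",
    compartment_id="ocid1.compartment.oc1..exampleuniqueID",  # placeholder
    type="HOSTING",
    unit_shape="LARGE_GENERIC_2",  # one of the unit shapes added in this release
    unit_count=1,
)

response = client.create_dedicated_ai_cluster(details)
print(response.data.lifecycle_state)
```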
24 changes: 20 additions & 4 deletions src/oci/generative_ai/models/dedicated_ai_cluster.py
@@ -15,7 +15,7 @@ class DedicatedAiCluster(object):
Dedicated AI clusters are compute resources that you can use for fine-tuning custom models or for hosting endpoints for custom models. The clusters are dedicated to your models and not shared with users in other tenancies.

To use any of the API operations, you must be authorized in an IAM policy. If you're not authorized, talk to an administrator who gives OCI resource access to users. See
`Getting Started with Policies`__ and `Getting Access to Generative AI Resouces`__.
`Getting Started with Policies`__ and `Getting Access to Generative AI Resources`__.

__ https://docs.cloud.oracle.com/iaas/Content/Identity/policiesgs/get-started-with-policies.htm
__ https://docs.cloud.oracle.com/iaas/Content/generative-ai/iam-policies.htm
@@ -73,6 +73,10 @@ class DedicatedAiCluster(object):
#: This constant has a value of "SMALL_COHERE_V2"
UNIT_SHAPE_SMALL_COHERE_V2 = "SMALL_COHERE_V2"

#: A constant which can be used with the unit_shape property of a DedicatedAiCluster.
#: This constant has a value of "SMALL_COHERE_4"
UNIT_SHAPE_SMALL_COHERE_4 = "SMALL_COHERE_4"

#: A constant which can be used with the unit_shape property of a DedicatedAiCluster.
#: This constant has a value of "EMBED_COHERE"
UNIT_SHAPE_EMBED_COHERE = "EMBED_COHERE"
@@ -93,6 +97,18 @@ class DedicatedAiCluster(object):
#: This constant has a value of "LARGE_GENERIC_4"
UNIT_SHAPE_LARGE_GENERIC_4 = "LARGE_GENERIC_4"

#: A constant which can be used with the unit_shape property of a DedicatedAiCluster.
#: This constant has a value of "SMALL_GENERIC_V2"
UNIT_SHAPE_SMALL_GENERIC_V2 = "SMALL_GENERIC_V2"

#: A constant which can be used with the unit_shape property of a DedicatedAiCluster.
#: This constant has a value of "LARGE_GENERIC_2"
UNIT_SHAPE_LARGE_GENERIC_2 = "LARGE_GENERIC_2"

#: A constant which can be used with the unit_shape property of a DedicatedAiCluster.
#: This constant has a value of "LARGE_GENERIC_V2"
UNIT_SHAPE_LARGE_GENERIC_V2 = "LARGE_GENERIC_V2"

def __init__(self, **kwargs):
"""
Initializes a new DedicatedAiCluster object with values from keyword arguments.
@@ -144,7 +160,7 @@ def __init__(self, **kwargs):

:param unit_shape:
The value to assign to the unit_shape property of this DedicatedAiCluster.
Allowed values for this property are: "LARGE_COHERE", "LARGE_COHERE_V2", "SMALL_COHERE", "SMALL_COHERE_V2", "EMBED_COHERE", "LLAMA2_70", "LARGE_GENERIC", "LARGE_COHERE_V2_2", "LARGE_GENERIC_4", 'UNKNOWN_ENUM_VALUE'.
Allowed values for this property are: "LARGE_COHERE", "LARGE_COHERE_V2", "SMALL_COHERE", "SMALL_COHERE_V2", "SMALL_COHERE_4", "EMBED_COHERE", "LLAMA2_70", "LARGE_GENERIC", "LARGE_COHERE_V2_2", "LARGE_GENERIC_4", "SMALL_GENERIC_V2", "LARGE_GENERIC_2", "LARGE_GENERIC_V2", 'UNKNOWN_ENUM_VALUE'.
Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.
:type unit_shape: str

@@ -486,7 +502,7 @@ def unit_shape(self):
**[Required]** Gets the unit_shape of this DedicatedAiCluster.
The shape of dedicated unit in this AI cluster. The underlying hardware configuration is hidden from customers.

Allowed values for this property are: "LARGE_COHERE", "LARGE_COHERE_V2", "SMALL_COHERE", "SMALL_COHERE_V2", "EMBED_COHERE", "LLAMA2_70", "LARGE_GENERIC", "LARGE_COHERE_V2_2", "LARGE_GENERIC_4", 'UNKNOWN_ENUM_VALUE'.
Allowed values for this property are: "LARGE_COHERE", "LARGE_COHERE_V2", "SMALL_COHERE", "SMALL_COHERE_V2", "SMALL_COHERE_4", "EMBED_COHERE", "LLAMA2_70", "LARGE_GENERIC", "LARGE_COHERE_V2_2", "LARGE_GENERIC_4", "SMALL_GENERIC_V2", "LARGE_GENERIC_2", "LARGE_GENERIC_V2", 'UNKNOWN_ENUM_VALUE'.
Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.


@@ -505,7 +521,7 @@ def unit_shape(self, unit_shape):
:param unit_shape: The unit_shape of this DedicatedAiCluster.
:type: str
"""
allowed_values = ["LARGE_COHERE", "LARGE_COHERE_V2", "SMALL_COHERE", "SMALL_COHERE_V2", "EMBED_COHERE", "LLAMA2_70", "LARGE_GENERIC", "LARGE_COHERE_V2_2", "LARGE_GENERIC_4"]
allowed_values = ["LARGE_COHERE", "LARGE_COHERE_V2", "SMALL_COHERE", "SMALL_COHERE_V2", "SMALL_COHERE_4", "EMBED_COHERE", "LLAMA2_70", "LARGE_GENERIC", "LARGE_COHERE_V2_2", "LARGE_GENERIC_4", "SMALL_GENERIC_V2", "LARGE_GENERIC_2", "LARGE_GENERIC_V2"]
if not value_allowed_none_or_none_sentinel(unit_shape, allowed_values):
unit_shape = 'UNKNOWN_ENUM_VALUE'
self._unit_shape = unit_shape
2 changes: 1 addition & 1 deletion src/oci/generative_ai/models/endpoint.py
@@ -15,7 +15,7 @@ class Endpoint(object):
To host a custom model for inference, create an endpoint for that model on a dedicated AI cluster of type HOSTING.

To use any of the API operations, you must be authorized in an IAM policy. If you're not authorized, talk to an administrator who gives OCI resource access to users. See
`Getting Started with Policies`__ and `Getting Access to Generative AI Resouces`__.
`Getting Started with Policies`__ and `Getting Access to Generative AI Resources`__.

__ https://docs.cloud.oracle.com/iaas/Content/Identity/policiesgs/get-started-with-policies.htm
__ https://docs.cloud.oracle.com/iaas/Content/generative-ai/iam-policies.htm
2 changes: 1 addition & 1 deletion src/oci/generative_ai/models/model.py
@@ -15,7 +15,7 @@ class Model(object):
You can create a custom model by using your dataset to fine-tune an out-of-the-box text generation base model. Have your dataset ready before you create a custom model. See `Training Data Requirements`__.

To use any of the API operations, you must be authorized in an IAM policy. If you're not authorized, talk to an administrator who gives OCI resource access to users. See
`Getting Started with Policies`__ and `Getting Access to Generative AI Resouces`__.
`Getting Started with Policies`__ and `Getting Access to Generative AI Resources`__.

__ https://docs.cloud.oracle.com/iaas/Content/generative-ai/training-data-requirements.htm
__ https://docs.cloud.oracle.com/iaas/Content/Identity/policiesgs/get-started-with-policies.htm
Original file line number Diff line number Diff line change
@@ -24,9 +24,9 @@ class GenerativeAiInferenceClient(object):
"""
OCI Generative AI is a fully managed service that provides a set of state-of-the-art, customizable large language models (LLMs) that cover a wide range of use cases for text generation, summarization, and text embeddings.

Use the Generative AI service inference API to access your custom model endpoints, or to try the out-of-the-box models to [chat](#/en/generative-ai-inference/latest/ChatResult/Chat), [generate text](#/en/generative-ai-inference/latest/GenerateTextResult/GenerateText), [summarize](#/en/generative-ai-inference/latest/SummarizeTextResult/SummarizeText), and [create text embeddings](#/en/generative-ai-inference/latest/EmbedTextResult/EmbedText).
Use the Generative AI service inference API to access your custom model endpoints, or to try the out-of-the-box models to [chat](#/EN/generative-ai-inference/latest/ChatResult/Chat), [generate text](#/EN/generative-ai-inference/latest/GenerateTextResult/GenerateText), [summarize](#/EN/generative-ai-inference/latest/SummarizeTextResult/SummarizeText), and [create text embeddings](#/EN/generative-ai-inference/latest/EmbedTextResult/EmbedText).

To use a Generative AI custom model for inference, you must first create an endpoint for that model. Use the [Generative AI service management API](/#/en/generative-ai/latest/) to [create a custom model](#/en/generative-ai/latest/Model/) by fine-tuning an out-of-the-box model, or a previous version of a custom model, using your own data. Fine-tune the custom model on a [fine-tuning dedicated AI cluster](#/en/generative-ai/latest/DedicatedAiCluster/). Then, create a [hosting dedicated AI cluster](#/en/generative-ai/latest/DedicatedAiCluster/) with an [endpoint](#/en/generative-ai/latest/Endpoint/) to host your custom model. For resource management in the Generative AI service, use the [Generative AI service management API](/#/en/generative-ai/latest/).
To use a Generative AI custom model for inference, you must first create an endpoint for that model. Use the [Generative AI service management API](#/EN/generative-ai/latest/) to [create a custom model](#/EN/generative-ai/latest/Model/) by fine-tuning an out-of-the-box model, or a previous version of a custom model, using your own data. Fine-tune the custom model on a [fine-tuning dedicated AI cluster](#/EN/generative-ai/latest/DedicatedAiCluster/). Then, create a [hosting dedicated AI cluster](#/EN/generative-ai/latest/DedicatedAiCluster/) with an [endpoint](#/en/generative-ai/latest/Endpoint/) to host your custom model. For resource management in the Generative AI service, use the [Generative AI service management API](#/EN/generative-ai/latest/).

To learn more about the service, see the [Generative AI documentation](/iaas/Content/generative-ai/home.htm).
"""
10 changes: 10 additions & 0 deletions src/oci/generative_ai_inference/models/__init__.py
@@ -22,6 +22,9 @@
from .cohere_llm_inference_response import CohereLlmInferenceResponse
from .cohere_message import CohereMessage
from .cohere_parameter_definition import CohereParameterDefinition
from .cohere_response_format import CohereResponseFormat
from .cohere_response_json_format import CohereResponseJsonFormat
from .cohere_response_text_format import CohereResponseTextFormat
from .cohere_system_message import CohereSystemMessage
from .cohere_tool import CohereTool
from .cohere_tool_call import CohereToolCall
@@ -36,6 +39,8 @@
from .generated_text import GeneratedText
from .generic_chat_request import GenericChatRequest
from .generic_chat_response import GenericChatResponse
from .image_content import ImageContent
from .image_url import ImageUrl
from .llama_llm_inference_request import LlamaLlmInferenceRequest
from .llama_llm_inference_response import LlamaLlmInferenceResponse
from .llm_inference_request import LlmInferenceRequest
@@ -70,6 +75,9 @@
"CohereLlmInferenceResponse": CohereLlmInferenceResponse,
"CohereMessage": CohereMessage,
"CohereParameterDefinition": CohereParameterDefinition,
"CohereResponseFormat": CohereResponseFormat,
"CohereResponseJsonFormat": CohereResponseJsonFormat,
"CohereResponseTextFormat": CohereResponseTextFormat,
"CohereSystemMessage": CohereSystemMessage,
"CohereTool": CohereTool,
"CohereToolCall": CohereToolCall,
@@ -84,6 +92,8 @@
"GeneratedText": GeneratedText,
"GenericChatRequest": GenericChatRequest,
"GenericChatResponse": GenericChatResponse,
"ImageContent": ImageContent,
"ImageUrl": ImageUrl,
"LlamaLlmInferenceRequest": LlamaLlmInferenceRequest,
"LlamaLlmInferenceResponse": LlamaLlmInferenceResponse,
"LlmInferenceRequest": LlmInferenceRequest,
8 changes: 6 additions & 2 deletions src/oci/generative_ai_inference/models/base_chat_request.py
@@ -69,7 +69,9 @@ def get_subtype(object_dictionary):
def api_format(self):
"""
**[Required]** Gets the api_format of this BaseChatRequest.
The API format for the model's request.
The API format for the model's family group.
COHERE is for the Cohere family models such as the cohere.command-r-16k and cohere.command-r-plus models.
GENERIC is for other model families such as the meta.llama-3-70b-instruct model.

Allowed values for this property are: "COHERE", "GENERIC"

Expand All @@ -83,7 +85,9 @@ def api_format(self):
def api_format(self, api_format):
"""
Sets the api_format of this BaseChatRequest.
The API format for the model's request.
The API format for the model's family group.
COHERE is for the Cohere family models such as the cohere.command-r-16k and cohere.command-r-plus models.
GENERIC is for other model families such as the meta.llama-3-70b-instruct model.


:param api_format: The api_format of this BaseChatRequest.
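
A small illustration of the clarified docstring, assuming each concrete chat-request subclass pins the discriminator for you.

```python
import oci

models = oci.generative_ai_inference.models

# Each BaseChatRequest subclass sets api_format to its model family.
cohere_req = models.CohereChatRequest(message="Hello")
generic_req = models.GenericChatRequest(
    messages=[models.UserMessage(content=[models.TextContent(text="Hello")])]
)

print(cohere_req.api_format)   # COHERE  -> cohere.command-r-16k, cohere.command-r-plus, ...
print(generic_req.api_format)  # GENERIC -> meta.llama-3-70b-instruct and other families
```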
