
[Theia AI] Support Azure OpenAI #14711

Open
fipro78 opened this issue Jan 9, 2025 · 3 comments
fipro78 (Contributor) commented Jan 9, 2025

Feature Description:

We host an OpenAI gpt-4o model in Azure and wanted to connect Theia AI to it via OpenAI Compatible Models, as described here: https://theia-ide.org/docs/user_ai/#openai-compatible-models-eg-via-vllm

{
    "ai-features.AiEnable.enableAI": true,
    "ai-features.openAiCustom.customOpenAiModels": [
        {
            "model": "gpt-4o",
            "url": "https://<custom>.openai.azure.com/openai/deployments/<my_deployment>/chat/completions?api-version=2024-08-01-preview",
            "id": "my_model",
            "apiKey": "<my_api_key>"
        }
    ],
    "ai-features.agentSettings": {
        "Universal": {
            "languageModelRequirements": [
                {
                    "purpose": "chat",
                    "identifier": "my_model"
                }
            ]
        },
        "Orchestrator": {
            "languageModelRequirements": [
                {
                    "purpose": "agent-selection",
                    "identifier": "my_model"
                }
            ]
        }
    }
}

This doesn't work. I always get

{"error":{"code":"404","message": "Resource not found"}}

After searching for the cause, we noticed that the issue is related to OpenAI on Azure. This is briefly described in the openai-node GitHub repository.

We created a small Node.js project to verify this. When using the OpenAI class in our small reproducer, we also get the 404 error when trying to access the Azure OpenAI resource. After switching to the AzureOpenAI class in the reproducer, the issue is gone and we get a correct response.

While looking into the Theia sources, I think we narrowed the issue down to OpenAiModel.

One option would be a configuration setting to indicate that an Azure OpenAI endpoint is being accessed, and based on that use either OpenAI or AzureOpenAI. But that also means the result needs to be parsed differently, as far as we saw. Another option would be a new ai-azureopenai module, but I am not sure whether that would involve too much copied code.

sdirix (Member) commented Jan 9, 2025

Hi @fipro78, thanks for the report!

As the AzureOpenAI client is provided by the same client library as the regular OpenAI client, it feels natural to me to include support for Azure OpenAI in @theia/ai-openai. We could, for example, enrich the OpenAiModel or add an AzureOpenAiModel, sharing code between it and OpenAiModel where possible/reasonable.

If you need Azure support right now in your Theia-based application, then it's probably easiest to rebind the OpenAiLanguageModelsManager to your own subclass, overriding the createOrUpdateLanguageModels call. In there you then create AzureOpenAiModels for Azure URLs and fall back to the regular implementation for all others.
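The Azure check in such a rebound manager could be as simple as a hostname test. The helper below is a hypothetical sketch, assuming Azure OpenAI endpoints follow the *.openai.azure.com host name pattern:

```typescript
// Hypothetical helper for a rebound manager: decide whether a configured
// model URL points at an Azure OpenAI resource, assuming Azure endpoints
// use the *.openai.azure.com host name pattern.
function isAzureOpenAiUrl(url: string): boolean {
    try {
        const { hostname } = new URL(url);
        return hostname.endsWith('.openai.azure.com');
    } catch {
        // Not a parsable URL: treat it as a regular OpenAI-compatible endpoint.
        return false;
    }
}

console.log(isAzureOpenAiUrl(
    'https://example.openai.azure.com/openai/deployments/gpt-4o/chat/completions'
)); // true
console.log(isAzureOpenAiUrl('https://api.openai.com/v1')); // false
```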

sdirix added the TheiaAI label on Jan 9, 2025
fipro78 (Contributor, Author) commented Jan 10, 2025

@sdirix
Thanks for the fast reply.

I am trying to follow your suggestion to see if my approach would work. Unfortunately, I cannot extend OpenAiModel. I always get the following error when trying to import it:

Cannot find module '@theia/ai-openai/lib/node' or its corresponding type declarations.

As I don't know how to fix this (I am not sure whether the class is even exported so that it can be extended), I simply copied the whole code and modified only the places that needed to be changed.

I am trying to rebind my custom manager, but it is somehow not used. Can you tell me what I am doing wrong?

import { ContainerModule } from '@theia/core/shared/inversify';
import { VecuBuilderAiContribution } from './vecu-builder-ai-contribution';
import { RemoteConnectionProvider, ServiceConnectionProvider } from '@theia/core/lib/browser';
import { OPENAI_LANGUAGE_MODELS_MANAGER_PATH, OpenAiLanguageModelsManager } from '@theia/ai-openai/lib/common';
import { AzureOpenAiLanguageModelsManagerImpl } from './azure-openai-language-models-manager-impl';


export default new ContainerModule((bind, unbind, isBound, rebind) => {
    
    rebind(OpenAiLanguageModelsManager).toDynamicValue(ctx => {
        const provider = ctx.container.get<ServiceConnectionProvider>(RemoteConnectionProvider);
        return provider.createProxy<OpenAiLanguageModelsManager>(OPENAI_LANGUAGE_MODELS_MANAGER_PATH, AzureOpenAiLanguageModelsManagerImpl);
    }).inSingletonScope();
});

sdirix (Member) commented Jan 10, 2025

Hi @fipro78,

Happy to help!

I am trying to follow your suggestion to see if my approach would work. Unfortunately, I cannot extend OpenAiModel. I always get the following error when trying to import it:

Cannot find module '@theia/ai-openai/lib/node' or its corresponding type declarations.

As I don't know how to fix this (I am not sure whether the class is even exported so that it can be extended), I simply copied the whole code and modified only the places that needed to be changed.

In Theia it's normal to do deep imports. That's possible because we don't do any bundling in the published packages.

So to import the OpenAiModel you can use the following import

import { OpenAiModel } from '@theia/ai-openai/lib/node/openai-language-model';

Note

In some packages an import from <package-name>/lib/node, <package-name>/lib/common, or even <package-name> would work, as some of them provide convenience index files that expose a subset of the package functionality.

I am trying to rebind my custom manager, but it is somehow not used. Can you tell me what I am doing wrong?

import { ContainerModule } from '@theia/core/shared/inversify';
import { VecuBuilderAiContribution } from './vecu-builder-ai-contribution';
import { RemoteConnectionProvider, ServiceConnectionProvider } from '@theia/core/lib/browser';
import { OPENAI_LANGUAGE_MODELS_MANAGER_PATH, OpenAiLanguageModelsManager } from '@theia/ai-openai/lib/common';
import { AzureOpenAiLanguageModelsManagerImpl } from './azure-openai-language-models-manager-impl';


export default new ContainerModule((bind, unbind, isBound, rebind) => {
    
    rebind(OpenAiLanguageModelsManager).toDynamicValue(ctx => {
        const provider = ctx.container.get<ServiceConnectionProvider>(RemoteConnectionProvider);
        return provider.createProxy<OpenAiLanguageModelsManager>(OPENAI_LANGUAGE_MODELS_MANAGER_PATH, AzureOpenAiLanguageModelsManagerImpl);
    }).inSingletonScope();
});

I think there is a mix-up of functionalities here. The OpenAiLanguageModelsManager lives on the backend, and therefore the customization should be implemented and rebound there. What you want to rebind is this service.

To achieve this, you should create a backend module and invoke the rebind there, for example like this:

export default new ContainerModule((bind, unbind, isBound, rebind) => {
    bind(AzureOpenAiLanguageModelsManagerImpl).toSelf().inSingletonScope();
    rebind(OpenAiLanguageModelsManager).toService(AzureOpenAiLanguageModelsManagerImpl);
});

You don't need to customize the RPC connection handling. When the frontend asks for the proxy service of the OpenAiLanguageModelsManager, the backend will now use the AzureOpenAiLanguageModelsManagerImpl without the frontend ever knowing that this happened.

In fact, you don't need to customize the frontend at all, at least not for the changes discussed here ;)

To integrate your backend module with Theia, you then need to declare it in your theiaExtensions like this.
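Assuming the backend module is compiled to lib/node/azure-openai-backend-module (the file name is just an example and depends on your project layout), the declaration in the extension's package.json would look roughly like this:

```json
{
  "theiaExtensions": [
    {
      "backend": "lib/node/azure-openai-backend-module"
    }
  ]
}
```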

Hope this helps. Let me know if anything does not work for you.
