jupyter lab --AiExtension.model_parameters bedrock-chat:anthropic.claude-3-haiku-20240307-v1:0='{"guardrails":{"guardrailIdentifier":"********","guardrailVersion":"2"}}' OR
Installed Versions :
Boto3 : 1.35.16
Langchain : 0.1.20
jupyter_ai : 1.0.15
jupyter_ai_magics : 1.0.15
jupyterlab : 3.6.8 (can't upgrade to 4.x, as the AWS SageMaker instance doesn't support it)
LLM model used : anthropic.claude-3-haiku-20240307-v1:0
As you can see from the commands above, I have tried both the id form and the guardrailIdentifier form, but both fail with the error messages below.
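For clarity, these are the two guardrail parameter shapes that were attempted, as described in the numbered errors below. The values here are placeholders, not real identifiers:

```python
# Attempt 1: langchain-style keys ("id"/"version"), which ended in the
# boto3 'Unknown parameter in input: "guardrail"' validation error.
guardrails_id_style = {"id": "<guardrail-id>", "version": "2"}

# Attempt 2: boto3-style keys ("guardrailIdentifier"/"guardrailVersion"),
# which ended in KeyError: 'id' inside langchain_community.
guardrails_identifier_style = {
    "guardrailIdentifier": "<guardrail-id>",
    "guardrailVersion": "2",
}
```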
1. Error when tried with the guardrail id parameter (refer to the command above)
Traceback (most recent call last):
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_community/llms/bedrock.py", line 545, in _prepare_input_and_invoke
response = self.client.invoke_model(**request_options)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/botocore/client.py", line 569, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/botocore/client.py", line 980, in _make_api_call
request_dict = self._convert_to_request_dict(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/botocore/client.py", line 1047, in _convert_to_request_dict
request_dict = self._serializer.serialize_to_request(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/botocore/validate.py", line 381, in serialize_to_request
raise ParamValidationError(report=report.generate_report())
botocore.exceptions.ParamValidationError: Parameter validation failed:
Unknown parameter in input: "guardrail", must be one of: body, contentType, accept, modelId, trace, guardrailIdentifier, guardrailVersion
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai/chat_handlers/base.py", line 125, in on_message
await self.process_message(message)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai/chat_handlers/default.py", line 61, in process_message
response = await self.llm_chain.apredict(input=message.body, stop=["\nHuman:"])
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/llm.py", line 333, in apredict
return (await self.acall(kwargs, callbacks=callbacks))[self.output_key]
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/_api/deprecation.py", line 157, in awarning_emitting_wrapper
return await wrapped(*args, **kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/base.py", line 428, in acall
return await self.ainvoke(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/base.py", line 212, in ainvoke
raise e
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/base.py", line 203, in ainvoke
await self._acall(inputs, run_manager=run_manager)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/llm.py", line 298, in _acall
response = await self.agenerate([inputs], run_manager=run_manager)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/llm.py", line 165, in agenerate
return await self.llm.agenerate_prompt(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 570, in agenerate_prompt
return await self.agenerate(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 530, in agenerate
raise exceptions[0]
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 719, in _agenerate_with_cache
result = await self._agenerate(messages, stop=stop, **kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai_magics/providers.py", line 689, in _agenerate
return await self._generate_in_executor(*args, **kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai_magics/providers.py", line 289, in _generate_in_executor
return await loop.run_in_executor(executor, _call_with_args)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_community/chat_models/bedrock.py", line 300, in _generate
completion, usage_info = self._prepare_input_and_invoke(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_community/llms/bedrock.py", line 552, in _prepare_input_and_invoke
raise ValueError(f"Error raised by bedrock service: {e}")
ValueError: Error raised by bedrock service: Parameter validation failed:
Unknown parameter in input: "guardrail", must be one of: body, contentType, accept, modelId, trace, guardrailIdentifier, guardrailVersion
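To make the first failure concrete: the botocore validator only accepts the keys listed in the error message, so a top-level "guardrail" key is rejected before any request is sent to Bedrock. A rough stand-in for that check (a hypothetical helper, not botocore's actual code):

```python
# Keys accepted by InvokeModel in this botocore version, per the error above.
ALLOWED_INVOKE_MODEL_PARAMS = {
    "body", "contentType", "accept", "modelId", "trace",
    "guardrailIdentifier", "guardrailVersion",
}

def check_invoke_model_params(params):
    # Hypothetical stand-in for botocore's parameter validation:
    # any key outside the declared API shape fails before the call is made.
    unknown = set(params) - ALLOWED_INVOKE_MODEL_PARAMS
    if unknown:
        raise ValueError(
            "Parameter validation failed: Unknown parameter in input: "
            + ", ".join(sorted(unknown))
        )
```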
2. Error when tried with the guardrailIdentifier parameter (refer to the command above)
Traceback (most recent call last):
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_community/llms/bedrock.py", line 480, in _guardrails_enabled
and bool(self.guardrails["id"])
KeyError: 'id'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai/chat_handlers/base.py", line 125, in on_message
await self.process_message(message)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai/chat_handlers/default.py", line 61, in process_message
response = await self.llm_chain.apredict(input=message.body, stop=["\nHuman:"])
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/llm.py", line 333, in apredict
return (await self.acall(kwargs, callbacks=callbacks))[self.output_key]
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/_api/deprecation.py", line 157, in awarning_emitting_wrapper
return await wrapped(*args, **kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/base.py", line 428, in acall
return await self.ainvoke(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/base.py", line 212, in ainvoke
raise e
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/base.py", line 203, in ainvoke
await self._acall(inputs, run_manager=run_manager)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/llm.py", line 298, in _acall
response = await self.agenerate([inputs], run_manager=run_manager)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/llm.py", line 165, in agenerate
return await self.llm.agenerate_prompt(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 570, in agenerate_prompt
return await self.agenerate(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 530, in agenerate
raise exceptions[0]
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 719, in _agenerate_with_cache
result = await self._agenerate(messages, stop=stop, **kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai_magics/providers.py", line 689, in _agenerate
return await self._generate_in_executor(*args, **kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai_magics/providers.py", line 289, in _generate_in_executor
return await loop.run_in_executor(executor, _call_with_args)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_community/chat_models/bedrock.py", line 300, in _generate
completion, usage_info = self._prepare_input_and_invoke(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_community/llms/bedrock.py", line 519, in _prepare_input_and_invoke
if self._guardrails_enabled:
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_community/llms/bedrock.py", line 485, in _guardrails_enabled
raise TypeError(
TypeError: Guardrails must be a dictionary with 'id' and 'version' keys.
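The second failure comes from the opposite expectation: judging by the traceback, this version of langchain_community requires the guardrails dict to use "id" and "version" keys, and raises when it sees boto3-style keys instead. A paraphrase of that check, reconstructed from the traceback rather than copied from the source:

```python
def guardrails_enabled(guardrails):
    # Reconstruction of _guardrails_enabled in
    # langchain_community/llms/bedrock.py, per the traceback above.
    try:
        return (
            isinstance(guardrails, dict)
            and bool(guardrails["id"])
            and bool(guardrails["version"])
        )
    except KeyError as e:
        # A missing "id" or "version" key surfaces as the TypeError seen above.
        raise TypeError(
            "Guardrails must be a dictionary with 'id' and 'version' keys."
        ) from e
```

So each parameter shape satisfies one layer and is rejected by the other, which is why both attempts fail in this environment.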
Can you confirm whether AWS Bedrock guardrails are supported in the jupyter_ai extension?
@mamahajan This issue is most likely occurring because you are using Jupyter AI v1.x, which is no longer maintained now that JupyterLab 3 has reached its end of life. We will not backport any features to Jupyter AI v1.x.
Closing this issue, as it stems from an outdated environment. Please open another issue if you can reproduce it with the latest release of Jupyter AI. Thank you for reaching out!
Problem statement : I am trying to use an AWS Bedrock guardrail with the jupyter_ai extension.
Reference issues :
langchain-ai/langchain-aws#156
langchain-ai/langchain-aws#25