[BUG] Unexpected schema validation error comes from model interface #3149

Open

yuye-aws opened this issue Oct 23, 2024 · 5 comments
@yuye-aws
Member

What is the bug?
When predicting with an ML model, even if the connector payload is fully filled, I still receive an error from the model interface.

{
  "error": {
    "root_cause": [
      {
        "type": "status_exception",
        "reason": "Error validating input schema: Validation failed: [$.parameters: required property 'inputs' not found] for instance: {\"algorithm\":\"REMOTE\",\"parameters\":{\"prompt\":\"hello, who are you?\"},\"action_type\":\"PREDICT\"} with schema: {\n    \"type\": \"object\",\n    \"properties\": {\n        \"parameters\": {\n            \"type\": \"object\",\n            \"properties\": {\n                \"inputs\": {\n                    \"type\": \"string\"\n                }\n            },\n            \"required\": [\n                \"inputs\"\n            ]\n        }\n    },\n    \"required\": [\n        \"parameters\"\n    ]\n}"
      }
    ],
    "type": "status_exception",
    "reason": "Error validating input schema: Validation failed: [$.parameters: required property 'inputs' not found] for instance: {\"algorithm\":\"REMOTE\",\"parameters\":{\"prompt\":\"hello, who are you?\"},\"action_type\":\"PREDICT\"} with schema: {\n    \"type\": \"object\",\n    \"properties\": {\n        \"parameters\": {\n            \"type\": \"object\",\n            \"properties\": {\n                \"inputs\": {\n                    \"type\": \"string\"\n                }\n            },\n            \"required\": [\n                \"inputs\"\n            ]\n        }\n    },\n    \"required\": [\n        \"parameters\"\n    ]\n}"
  },
  "status": 400
}
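For readability, the auto-generated input schema embedded in the error message pretty-prints as:

{
  "type": "object",
  "properties": {
    "parameters": {
      "type": "object",
      "properties": {
        "inputs": {
          "type": "string"
        }
      },
      "required": ["inputs"]
    }
  },
  "required": ["parameters"]
}

Note that it still requires parameters.inputs, even though the connector's request body only references ${parameters.prompt}.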

How can one reproduce the bug?

  1. Create a connector for the Claude v2 model from the connector blueprint. Only change ${parameters.inputs} to ${parameters.prompt}.
POST /_plugins/_ml/connectors/_create
{
  "name": "Amazon Bedrock",
  "description": "Test connector for Amazon Bedrock",
  "version": 1,
  "protocol": "aws_sigv4",
  "credential": {
    "access_key": "...",
    "secret_key": "..."
  },
  "parameters": {
    "region": "...",
    "service_name": "bedrock",
    "model": "anthropic.claude-v2"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "headers": {
        "content-type": "application/json"
      },
      "url": "https://bedrock-runtime.${parameters.region}.amazonaws.com/model/${parameters.model}/invoke",
      "request_body": """{"prompt":"\n\nHuman: ${parameters.prompt}\n\nAssistant:","max_tokens_to_sample":300,"temperature":0.5,"top_k":250,"top_p":1,"stop_sequences":["\\n\\nHuman:"]}"""
    }
  ]
}
  2. Register a model with the connector. (The interface that registration auto-generates can be inspected afterwards; see the diagnostic sketch after these steps.)
POST /_plugins/_ml/models/_register
{
  "name": "anthropic.claude-v2",
  "function_name": "remote",
  "description": "test model",
  "connector_id": "6uQTs5IB3hywALe5wWxu"
}
  3. Run a model prediction with only the inputs parameter. You'll receive a connector payload validation error, since the prompt parameter is required. This is correct.
POST /_plugins/_ml/models/IY9qt5IB1dQFBoCTo8nv/_predict/
{
  "parameters": {
    "inputs": "hello, who are you?"
  }
} 
  4. Run a model prediction with only the prompt parameter. You'll receive the model interface error shown above.
POST /_plugins/_ml/models/IY9qt5IB1dQFBoCTo8nv/_predict/
{
  "parameters": {
    "prompt": "hello, who are you?"
  }
}
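As a diagnostic aside, the interface that step 2 attached to the model can be inspected by fetching the model. A sketch, assuming the get-model response includes the interface field (with its input/output JSON-schema strings) when one was auto-generated:

GET /_plugins/_ml/models/IY9qt5IB1dQFBoCTo8nv

For this connector, the returned input schema should match the one quoted in the error above, still requiring parameters.inputs.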

What is the expected behavior?

I should receive the model response as long as the connector payload is filled. I also find it quite strange that the bug does not reproduce if I omit step 3.

What is your host/environment?

  • OS: macOS
  • Version: main
  • Plugins: ml-commons


@yuye-aws added the bug and untriaged labels on Oct 23, 2024
@yuye-aws
Member Author

Hi @b4sjoo! I understand you are quite busy these days. Can you take a look at this issue when you're free? No need to hurry.

@yuye-aws
Member Author

The behavior of the model interface feature should remain consistent no matter how the user runs prediction on the ML model.

@b4sjoo
Collaborator

b4sjoo commented Oct 23, 2024

I am sorry, but this is expected... The automated model interface only works when you strictly follow the blueprint. If you change something, then you need to update the model interface accordingly. We will improve this in the future, but for now it works this way.
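For anyone who hits this in the meantime, a minimal sketch of what "update the model interface accordingly" could look like, assuming the update model API (PUT /_plugins/_ml/models/<model_id>) accepts an interface field containing JSON-schema strings; the schema below simply swaps the required inputs property for prompt, matching the modified blueprint:

PUT /_plugins/_ml/models/IY9qt5IB1dQFBoCTo8nv
{
  "interface": {
    "input": "{\"type\":\"object\",\"properties\":{\"parameters\":{\"type\":\"object\",\"properties\":{\"prompt\":{\"type\":\"string\"}},\"required\":[\"prompt\"]}},\"required\":[\"parameters\"]}"
  }
}

With an interface like this in place, the step-4 request carrying only prompt should pass schema validation.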

@b4sjoo removed the untriaged label on Oct 23, 2024
@b4sjoo self-assigned this on Oct 23, 2024
@pyek-bot
Contributor

@yuye-aws Curious, what happens when you omit step 3? Is the request with prompt successful?

@b4sjoo added the enhancement label and removed the bug label on Oct 23, 2024
@yuye-aws
Member Author

> @yuye-aws Curious, what happens when you omit step 3? Is the request with prompt successful?

Yes. It's successful.

Labels: enhancement New feature or request
Projects: Status: Untriaged
3 participants