[Doc] Add Lambda Function URL Connector Blueprint to ML Inference Processor
Overview
I propose adding a Connector Blueprint for AWS Lambda Function URLs to the ML-Commons docs. This would allow users to invoke models hosted behind Lambda Function URLs directly from OpenSearch.
Background
Currently, the OpenSearch ML Inference processor provides Connector Blueprints for various externally hosted models (OpenAI, Amazon SageMaker, Amazon Bedrock, Cohere, etc.). However, there is no Connector Blueprint for connecting to models hosted behind AWS Lambda Function URLs.
Lambda Function URLs expose a Lambda function through a dedicated HTTPS endpoint, providing a flexible, lightweight way to host custom ML models or run other processing tasks inside Lambda.
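For context, a Lambda function behind such a Function URL might look like the following minimal sketch. The handler name, request shape (`"inputs"` field), and dummy response are assumptions for illustration only, chosen to match the `request_body` in the blueprint below; a real function would run actual model inference:

```python
import json

def lambda_handler(event, context):
    """Hypothetical Function URL handler: reads the "inputs" field sent
    by the connector and returns a placeholder model response."""
    body = json.loads(event["body"])
    text = body["inputs"]
    # A real function would run inference here; we return a dummy
    # "embedding" derived from the input length as a stand-in.
    result = {"response": [float(len(text))]}
    return {
        "statusCode": 200,
        "headers": {"content-type": "application/json"},
        "body": json.dumps(result),
    }
```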
Proposal
I propose adding a Connector Blueprint for Lambda Function URL with the following structure. I have already verified that this approach works correctly for invoking Lambda Function URLs:
```json
{
  "name": "Lambda_connector",
  "description": "Remote connector for Lambda",
  "version": 1,
  "protocol": "aws_sigv4",
  "credential": {
    "roleArn": "<CONNECTOR_ROLE_ARN>"
  },
  "parameters": {
    "region": "<REGION>",
    "service_name": "lambda"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "headers": { "content-type": "application/json" },
      "url": "<LAMBDA_FUNCTION_URL>",
      "pre_process_function": "StringBuilder builder = new StringBuilder();\nbuilder.append(\"\\\"\");\nbuilder.append(params.text_docs[0]);\nbuilder.append(\"\\\"\");\ndef parameters = \"{\" + \"\\\"inputs\\\":\" + builder + \"}\";\nreturn \"{\" + \"\\\"parameters\\\":\" + parameters + \"}\";",
      "request_body": "{\"inputs\": \"${parameters.inputs}\"}"
    }
  ]
}
```
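For readers unfamiliar with Painless: the `pre_process_function` above simply wraps the first input document in a small JSON envelope, which the `request_body` template then unpacks via `${parameters.inputs}`. Its effect can be reproduced in Python as a sanity check (an illustrative translation, not code from the blueprint):

```python
import json

def pre_process(text_docs):
    # Mirrors the Painless pre_process_function: quote the first
    # document and nest it under {"parameters": {"inputs": ...}}.
    quoted = '"' + text_docs[0] + '"'
    parameters = '{' + '"inputs":' + quoted + '}'
    return '{' + '"parameters":' + parameters + '}'
```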