[ML] Adding manage_inference to the kibana_system role v2
Original PR: https://github.com/elastic/elasticsearch/pull/108262
Reverted because I merged too soon without Security's approval: https://github.com/elastic/elasticsearch/pull/108371
This PR adds the manage_inference privilege to the kibana_system role so the default user can interact with the inference APIs. This came from a discussion about the security assistant not being able to interact with the inference API using the internal elasticsearch user within Kibana.
Creating this PR to continue discussion.
Pinging @elastic/ml-core (Team:ML)
I can answer 1. -- over in the Security Solution Assistant, we're trying to leverage the new _inference API to automatically set up and deploy ELSER so that we can enable the Knowledge Base functionality by default (so long as the appropriate ML resources exist). To do this we would be calling the below with an asInternalUser esClient; however, the internal user does not currently have this privilege:
```ts
// Temporarily use esClient for current user until `kibana_system` user has `inference_admin` role
// See https://github.com/elastic/elasticsearch/pull/108262
// const esClient = (await context.core).elasticsearch.client.asInternalUser;
const esClient = (await context.core).elasticsearch.client.asCurrentUser;

const elserResponse = await esClient.inference.putModel({
  inference_id: 'elser_model_2',
  task_type: 'sparse_embedding',
  model_config: {
    service: 'elser',
    service_settings: {
      model_id: elserId,
      num_allocations: 1,
      num_threads: 1,
    },
    task_settings: {},
  },
});
```
We could fall back to using the trained models API, as the internal user already has the manage_ml privilege, which covers that API. However, we were hoping to start trialing the _inference API so we could begin to provide feedback and use cases to the platform team.
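For context, here is a rough sketch of what that trained-models fallback could look like against the elasticsearch-js client surface. The `MlClient` interface, the `setupElserViaTrainedModels` name, and the sizing values are illustrative placeholders, not actual Kibana code:

```typescript
// Minimal stand-in for the parts of the elasticsearch-js esClient used below.
interface MlClient {
  ml: {
    putTrainedModel(req: { model_id: string; input: { field_names: string[] } }): Promise<unknown>;
    startTrainedModelDeployment(req: {
      model_id: string;
      number_of_allocations: number;
      threads_per_allocation: number;
    }): Promise<unknown>;
  };
}

// Hypothetical fallback covered by the manage_ml privilege that kibana_system already has.
async function setupElserViaTrainedModels(esClient: MlClient, elserId: string): Promise<void> {
  // Register the model so Elasticsearch downloads its definition.
  await esClient.ml.putTrainedModel({
    model_id: elserId,
    input: { field_names: ['text_field'] },
  });
  // Deploy it with the same sizing as the _inference request above.
  await esClient.ml.startTrainedModelDeployment({
    model_id: elserId,
    number_of_allocations: 1,
    threads_per_allocation: 1,
  });
}
```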
Would it be possible to have ELSER/Knowledge Base already set up and enabled by default? Or can that not be done without being set up via the internal user?
The manage_inference privilege gives access to the APIs below. Here are some docs: https://www.elastic.co/guide/en/elasticsearch/reference/master/inference-apis.html
Generally, it allows setting up and deleting inference endpoints to interact with third-party services like Cohere and OpenAI. It also allows interacting with the trained models APIs: https://www.elastic.co/guide/en/elasticsearch/reference/master/ml-df-trained-models-apis.html
```java
private static final Set<String> MANAGE_INFERENCE_PATTERN = Set.of(
    "cluster:admin/xpack/inference/*",
    "cluster:monitor/xpack/inference*", // no trailing slash to match the POST InferenceAction name
    "cluster:admin/xpack/ml/trained_models/deployment/start",
    "cluster:admin/xpack/ml/trained_models/deployment/stop",
    "cluster:monitor/xpack/ml/trained_models/deployment/infer"
);
```
Would granting a subset of the privileges from manage_inference (from the set you listed above) suffice? I see other partial roles have been granted, e.g. "cluster:admin/analyze"?
> Would granting a subset of the privileges from manage_inference (from the set you listed above) suffice? I see other partial roles have been granted, e.g. "cluster:admin/analyze"?
Hmm, I'm not super familiar with how this works, but is that a partial role? The only place I can find it is here: https://github.com/elastic/elasticsearch/blob/main/x-pack/plugin/security/src/main/java/org/elasticsearch/xpack/security/action/SecurityActionMapper.java#L24
I would expect to see that tied to an *Action class, something like this:

```java
public static final ActionType<Response> INSTANCE = new ActionType<>("cluster:admin/scripts/painless/execute");
```
From here: https://github.com/elastic/elasticsearch/blob/4ef4b9d4204ba750c0cdad3e180c8a43d4778456/modules/lang-painless/src/main/java/org/elasticsearch/painless/action/PainlessExecuteAction.java#L125
I think to do a partial role, we'd have to create a new role and initialize it the same way as manage_inference, but give it access to fewer routes.
The ml routes are actually already granted to the kibana_system user from this line: https://github.com/elastic/elasticsearch/blob/main/x-pack/plugin/core/src/main/java/org/elasticsearch/xpack/core/security/authz/store/KibanaOwnedReservedRoleDescriptors.java#L85
And defined here: https://github.com/elastic/elasticsearch/blob/main/x-pack/plugin/core/src/main/java/org/elasticsearch/xpack/core/security/authz/privilege/ClusterPrivilegeResolver.java#L124
@spong needs to create an inference endpoint (a put/write/create request) and perform inference on documents (a post/write request), so we'd need to grant the ability to do those things. I suppose we could restrict access to deleting and reading the inference endpoints, but those operations seem less concerning to me 🤷‍♂️
> needs to create an inference endpoint (a put/write/create request) and perform inference on documents (a post/write request) so we'd need to grant the ability to do those things.
I think this would be a good option, if it is not too much trouble 🙏
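Put concretely, the two operations in question are the PUT that creates the endpoint and the POST that runs inference. A hedged sketch, assuming the elasticsearch-js inference namespace (putModel as in the snippet above, plus an inference call); the interface, function name, and input text are placeholders:

```typescript
// Minimal stand-in for the parts of the esClient inference API used below.
interface InferenceClient {
  inference: {
    putModel(req: { inference_id: string; task_type: string; model_config: object }): Promise<unknown>;
    inference(req: { inference_id: string; input: string[] }): Promise<unknown>;
  };
}

async function createEndpointAndInfer(esClient: InferenceClient): Promise<void> {
  // The put/write/create request: define the ELSER-backed endpoint.
  await esClient.inference.putModel({
    inference_id: 'elser_model_2',
    task_type: 'sparse_embedding',
    model_config: {
      service: 'elser',
      service_settings: { num_allocations: 1, num_threads: 1 },
    },
  });
  // The post/write request: run documents through the endpoint.
  await esClient.inference.inference({
    inference_id: 'elser_model_2',
    input: ['example document text'], // placeholder input
  });
}
```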
> those operations seem less concerning to me 🤷‍♂️
While those operations may seem unproblematic by themselves, if an attacker were able to compromise the internal user, all the permissions we have granted may open up additional attack vectors, leading to additional damage.
We try to strike a balance between limiting the access of this user and trying not to impede engineers 😅
I'm concerned that creating many fine-grained permissions will result in a bizarre situation where a user has permission to deploy a model but not permission to delete it or even use it. Those extra permissions also create documentation and maintenance tasks, plus the inevitable support issues that would arise from such a complicated permissions model.
Given that there is a workaround using the ML trained models APIs, and kibana_system has manage_ml, I recommend closing this PR for now and revisiting the issue once we have concrete use cases that can only be satisfied by manage_inference.
Closing as recommended by Dave.