
[ML] Adding manage_inference to the kibana_system role

Open jonathan-buttner opened this issue 1 year ago • 1 comments

This PR adds the manage_inference cluster privilege to the kibana_system role so the default internal user can interact with the inference APIs. This came out of a discussion about the security assistant not being able to call the inference API using the internal Elasticsearch user within Kibana.

jonathan-buttner avatar May 03 '24 17:05 jonathan-buttner

Pinging @elastic/ml-core (Team:ML)

elasticsearchmachine avatar May 03 '24 17:05 elasticsearchmachine

Heya @jonathan-buttner !

I have a few clarifying questions:

  1. Why do these APIs need to be called as the kibana_system user?
  2. What does the manage_inference privilege allow? I don't see it documented anywhere.

kc13greiner avatar May 06 '24 14:05 kc13greiner

I can answer 1. -- over in the Security Solution Assistant, we're trying to leverage the new _inference API to automatically set up and deploy ELSER so that we can enable the Knowledge Base functionality by default (as long as the appropriate ML resources exist). To do this we would be making the call below with an asInternalUser esClient; however, the internal user does not currently have this privilege:

      // Temporarily use esClient for current user until `kibana_system` user has `inference_admin` role
      // See https://github.com/elastic/elasticsearch/pull/108262
      // const esClient = (await context.core).elasticsearch.client.asInternalUser;
      const esClient = (await context.core).elasticsearch.client.asCurrentUser;
      const elserResponse = await esClient.inference.putModel({
        inference_id: 'elser_model_2',
        task_type: 'sparse_embedding',
        model_config: {
          service: 'elser',
          service_settings: {
            model_id: elserId,
            num_allocations: 1,
            num_threads: 1,
          },
          task_settings: {},
        },
      });

We could fall back to using the TrainedModelsAPI, since the internal user already has the manage_ml privilege, which covers that API. However, we were hoping to start trialing the _inference API so we could begin to provide feedback and use cases to the platform team.
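For context, the trained-models fallback mentioned above would build its requests roughly like this. This is a minimal illustrative sketch, not code from this PR: the helper names, the `text_field` input mapping, and the deployment settings are assumptions, chosen to mirror the `num_allocations`/`num_threads` values in the `_inference` call above.

```typescript
// Hypothetical sketch of the trained-models fallback: instead of the
// `_inference` API, register and deploy ELSER through the ML trained
// models APIs, which the internal user can already reach via `manage_ml`.

interface PutTrainedModelRequest {
  model_id: string;
  body: { input: { field_names: string[] } };
}

interface StartDeploymentRequest {
  model_id: string;
  number_of_allocations: number;
  threads_per_allocation: number;
}

// Build the request that registers the ELSER model definition.
function buildPutTrainedModelRequest(elserId: string): PutTrainedModelRequest {
  return {
    model_id: elserId,
    body: { input: { field_names: ['text_field'] } },
  };
}

// Build the request that starts a deployment with minimal resources,
// mirroring num_allocations / num_threads in the _inference call above.
function buildStartDeploymentRequest(elserId: string): StartDeploymentRequest {
  return {
    model_id: elserId,
    number_of_allocations: 1,
    threads_per_allocation: 1,
  };
}
```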

spong avatar May 06 '24 15:05 spong

Hey @kc13greiner 👋

  2. What does the manage_inference privilege allow? I don't see it documented anywhere.

The manage_inference privilege gives access to the APIs below. Here are the docs: https://www.elastic.co/guide/en/elasticsearch/reference/master/inference-apis.html

Generally, it allows setting up and deleting inference endpoints that interact with third-party services like Cohere and OpenAI. It also allows interacting with the trained models APIs: https://www.elastic.co/guide/en/elasticsearch/reference/master/ml-df-trained-models-apis.html

    private static final Set<String> MANAGE_INFERENCE_PATTERN = Set.of(
        "cluster:admin/xpack/inference/*",
        "cluster:monitor/xpack/inference*", // no trailing slash to match the POST InferenceAction name
        "cluster:admin/xpack/ml/trained_models/deployment/start",
        "cluster:admin/xpack/ml/trained_models/deployment/stop",
        "cluster:monitor/xpack/ml/trained_models/deployment/infer"
    );
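As the comment on the monitor pattern hints, these action patterns are prefix wildcards, so `cluster:monitor/xpack/inference*` (no slash before the `*`) matches the bare InferenceAction name `cluster:monitor/xpack/inference` as well as sub-actions, while the `/*` pattern requires the slash. A small illustrative matcher (not Elasticsearch's actual automaton-based implementation) demonstrates the difference:

```typescript
// Illustrative-only matcher for action-name patterns of the form "prefix*".
// Elasticsearch's real privilege matching is more general; this just shows
// why the trailing slash (or lack of one) matters.
function matchesPattern(pattern: string, action: string): boolean {
  if (pattern.endsWith('*')) {
    return action.startsWith(pattern.slice(0, -1));
  }
  return pattern === action;
}

const MANAGE_INFERENCE_PATTERNS = [
  'cluster:admin/xpack/inference/*',
  'cluster:monitor/xpack/inference*', // no trailing slash: also matches the bare action name
  'cluster:admin/xpack/ml/trained_models/deployment/start',
  'cluster:admin/xpack/ml/trained_models/deployment/stop',
  'cluster:monitor/xpack/ml/trained_models/deployment/infer',
];

function isAllowed(action: string): boolean {
  return MANAGE_INFERENCE_PATTERNS.some((p) => matchesPattern(p, action));
}
```

With this sketch, `isAllowed('cluster:monitor/xpack/inference')` is true (the `*` directly follows `inference`), but `isAllowed('cluster:admin/xpack/inference')` is false, because the admin pattern only matches actions under `inference/`.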

jonathan-buttner avatar May 06 '24 15:05 jonathan-buttner

@jonathan-buttner @spong Thanks for the info! Reviewing and discussing with the team 🚀

kc13greiner avatar May 07 '24 12:05 kc13greiner

@jonathan-buttner Sorry, that wasn't an approval yet. I just wanted to give an update that I was discussing with the team. I apologize for the confusing wording.

kc13greiner avatar May 07 '24 13:05 kc13greiner

Accidentally merged this without security's approval. They asked us to revert for now and we'll continue discussing on a new PR.

jonathan-buttner avatar May 07 '24 15:05 jonathan-buttner