Python: Ignore certain parameters when using `get_tool_call_object`
When using `get_tool_call_object`, it would be good if it were possible to exclude certain parameters from the resulting tool object, namely the parameters that are automatically passed in by the kernel (e.g. `kernel`, `service`, etc.).
Consider the following example:
```python
import asyncio
import json
from typing import Annotated

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai.utils import get_tool_call_object
from semantic_kernel.functions import kernel_function


class MyPlugin:
    @kernel_function(description="My function")
    def my_func(
        self,
        param: Annotated[str, "The parameter"],
        kernel: Kernel,
    ): ...


async def main():
    kernel = Kernel()
    kernel.add_plugin(plugin=MyPlugin(), plugin_name="MyPlugin")
    print(json.dumps(get_tool_call_object(kernel, filter={}), indent=2))


asyncio.run(main())
```
The function includes `kernel`, which is automatically passed in when the function is called; however, it also appears in the tool object:
```json
[
  {
    "type": "function",
    "function": {
      "name": "MyPlugin-my_func",
      "description": "My function",
      "parameters": {
        "type": "object",
        "properties": {
          "param": {
            "description": "The parameter",
            "type": "string"
          },
          "kernel": {
            "description": "",
            "type": "string"
          }
        },
        "required": [
          "param",
          "kernel"
        ]
      }
    }
  }
]
```
This could mean that the LLM attempts to pass a value for `kernel`, even though that parameter should not come from the LLM.
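Until this is supported natively, one option is to post-process the tool objects before sending them to the model. The following is a minimal workaround sketch, assuming the tool objects follow the JSON shape shown above; the `INJECTED_PARAMS` set and the `strip_injected_params` helper are hypothetical and not part of Semantic Kernel:

```python
# Hypothetical post-processing helper that strips kernel-injected parameters
# from the tool objects returned by get_tool_call_object. The parameter names
# listed here are an assumption about what the kernel injects automatically.
INJECTED_PARAMS = {"kernel", "service", "execution_settings", "arguments"}


def strip_injected_params(tools: list[dict]) -> list[dict]:
    for tool in tools:
        params = tool.get("function", {}).get("parameters", {})
        # Drop the injected parameters from the schema properties.
        props = params.get("properties", {})
        for name in INJECTED_PARAMS:
            props.pop(name, None)
        # Keep the "required" list consistent with the remaining properties.
        if "required" in params:
            params["required"] = [
                name for name in params["required"] if name not in INJECTED_PARAMS
            ]
    return tools
```

With this, `strip_injected_params(get_tool_call_object(kernel, filter={}))` would produce a tool object that only exposes `param` to the model.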
@cecheta which version are you using? I don't think this function is in there any more; it was replaced with something else. I do agree that a way to control which parameters get included makes sense. @moonbox3, you are working on this part currently, could you consider this as well?
This was raised when using an older version of Semantic Kernel, most likely 0.9.6b1; however, I would imagine that it's still the same using `FunctionCallBehavior.EnableFunctions()`?
Yes, correct. We still have code that does something similar to build the JSON schema required for tool calling. I am working on something related to this currently, and will take this into consideration as I do the work.
We are actually tracking this in #6846, and we first need to write an ADR around it as it needs to be supported cross-platform.
Closing this since we're tracking the work in #6846.
So how does this work now?
We haven't handled this in a specific way, aside from what we have documented here: https://learn.microsoft.com/en-us/semantic-kernel/concepts/ai-services/chat-completion/function-calling/?pivots=programming-language-python#reserved-parameter-names-for-auto-function-calling
I'd still like to improve this, when possible, to better handle any type of JSON schema we send to the model for these reserved parameters.
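For reference, a minimal sketch of what the documented approach looks like, assuming the behavior described in the linked page: parameters with reserved names such as `kernel` are injected by the framework during auto function calling and (per that documentation) are not meant to be supplied by the model. The plugin shape mirrors the example at the top of this issue; the return value is illustrative only.

```python
from typing import Annotated

from semantic_kernel import Kernel
from semantic_kernel.functions import kernel_function


class MyPlugin:
    @kernel_function(description="My function")
    def my_func(
        self,
        param: Annotated[str, "The parameter"],
        # Reserved parameter name: assumed to be injected by the kernel at
        # call time during auto function calling, rather than filled in by
        # the model (see the reserved-parameter-names documentation above).
        kernel: Kernel,
    ) -> str:
        # `kernel` is available here even though the model never supplies it.
        return f"Handled: {param}"
```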
Thanks @moonbox3 this was helpful.