
GSoC Project Proposal: Functions AI Agent Callbacks

Open lkingland opened this issue 10 months ago • 5 comments

Description:

Functions is well-suited for AI agent integration. The serverless nature and isolated runtime environment of Functions make them ideal for creating lightweight, purpose-built services that can be dynamically created and invoked by agents.

Expected Outcome: This project would be a combination of research and practicum.

First, an analysis of current AI agent interaction patterns, emergent protocols, and available frameworks. Second, the development of a proof-of-concept integration between Functions and AI agents. This would involve, at a minimum, invocation, with a stretch goal of implementation and deployment by the agent based on a human prompt.

Recommended Skills:

Strong language skills and the ability to both research deeply and communicate clearly. Familiarity with the Go programming language, web services, and Kubernetes. Familiarity with serverless paradigms and microservices. Experience with AI/ML frameworks or LLM integrations is a plus.

lkingland avatar Feb 12 '25 04:02 lkingland

@lkingland as an FYI, @Cali0707 had a few functions being target of LLM-driven tool calls [1] [2] [3] as part of https://github.com/keventmesh/llm-tool-provider, does the project go beyond that?

[1] https://github.com/keventmesh/llm-tool-provider/tree/main/tools/average-resource-consumption [2] https://github.com/keventmesh/llm-tool-provider/tree/main/tools/current-time [3] https://github.com/keventmesh/llm-tool-provider/tree/main/tools/resource-cost-calculator

pierDipi avatar Feb 25 '25 12:02 pierDipi

A good expansion would be to see if it makes sense to have a function middleware exposing an MCP Server https://modelcontextprotocol.io/introduction and the function being the tool, and document the gaps with MCP Servers for Knative (since MCP is a "stateful" protocol: https://github.com/modelcontextprotocol/specification/discussions/102)
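To make the middleware idea concrete, here is a hand-rolled sketch of the MCP wire shape (`tools/list` and `tools/call` over JSON-RPC 2.0) with the Function advertised as a single tool. It uses no MCP SDK; the tool name and the dispatch logic are illustrative assumptions:

```go
package main

// Illustrative only: a hand-rolled sketch of the "function as MCP tool"
// shape. Method names follow the MCP specification, but the middleware
// behavior and tool name are hypothetical.

import (
	"encoding/json"
	"fmt"
)

type rpcRequest struct {
	JSONRPC string          `json:"jsonrpc"`
	ID      int             `json:"id"`
	Method  string          `json:"method"`
	Params  json.RawMessage `json:"params,omitempty"`
}

type tool struct {
	Name        string `json:"name"`
	Description string `json:"description"`
}

// dispatch routes an MCP-style request to the wrapped Function.
func dispatch(req rpcRequest) (any, error) {
	switch req.Method {
	case "tools/list":
		// The middleware advertises the underlying Function as one tool.
		return []tool{{Name: "my-function", Description: "Invokes the deployed Knative Function"}}, nil
	case "tools/call":
		// A real middleware would HTTP-POST params to the Function's URL.
		return map[string]string{"status": "invoked"}, nil
	default:
		return nil, fmt.Errorf("unknown method %q", req.Method)
	}
}

func main() {
	raw := []byte(`{"jsonrpc":"2.0","id":1,"method":"tools/list"}`)
	var req rpcRequest
	if err := json.Unmarshal(raw, &req); err != nil {
		panic(err)
	}
	res, _ := dispatch(req)
	out, _ := json.Marshal(res)
	fmt.Println(string(out))
}
```

The statefulness gap mentioned above shows up precisely here: a real MCP session also carries initialization and server-to-client notifications, which a stateless middleware in front of a Function cannot trivially hold.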

pierDipi avatar Feb 25 '25 12:02 pierDipi

Hello Team, I am Burhanuddin Ezzi, a cs major undergrad. I am interested in contributing to this issue as part of GSoC.

Recommended Skills:

  1. Strong language skills and the ability to both research deeply and communicate clearly
  2. Familiarity with the Go programming language, web services, and Kubernetes
  3. Familiarity with serverless paradigms and microservices
  4. Experience with AI/ML frameworks or LLM integrations is a plus

I have started by 1) researching this topic, 2) learning Kubernetes (up to the point of building CRDs), and 3) learning Go, since I am new to it.

I am now working on: a) understanding the Knative project, b) understanding the func repo, and c) following the contributor guide to set up a local test environment. I will ask for help if needed on a, b, and c.

In parallel, I am also looking at 3), and I have a good grasp of 4), ML frameworks (though I am unsure what LLM integration is supposed to mean here).

Is there anything else I should be looking at?

Try solving the issues in the repo; the prerequisites for this proposal are already mentioned above.

@BHAVISHYA2005 I would appreciate it if you could point out a beginner-friendly issue so that I can get started as fast as possible.

burhanuddin6 avatar Mar 09 '25 11:03 burhanuddin6

A good expansion would be to see if it makes sense to have a function middleware exposing an MCP Server https://modelcontextprotocol.io/introduction and the function being the tool, and document the gaps with MCP Servers for Knative (since MCP is a "stateful" protocol: modelcontextprotocol/specification#102)

Although stateless operation support is on the current roadmap, I doubt that support will arrive soon, given the major changes required to the protocol.

Following the discussions, I think there are a few ways to go about this:

  • Replace SSE with gRPC or WebSockets. Given that the current protocol specification requires SSE for the notification mechanism (asking the client for LLM output, which also acts as a human in the loop), removing it will be complicated, especially with many current server implementations depending on it.

    Also, WebSockets are already part of the SDKs, so working around this for Knative Functions should be straightforward.

  • Build a stateful MCP server and give it access to a serverless environment as a tool. This also seemingly satisfies the original motivation for this project.

    The core idea is to build an MCP server with the current stateful approach and then, with the help of the client library, build a comprehensive set of tools the server has access to. This would support providing execution environments, prompt-based function creation/invocation/deployment, etc. It also aligns with the MCP roadmap item for server sandboxing, which would work out of the box if this approach is adopted.

I lean towards the second option, since it has the potential to make developer workflows easier and improve Knative adoption. Finding a good abstraction for the MCP tools layer is the hard part, I believe, since we have to balance maximum utility with developer experience.

Thoughts? @pierDipi @lkingland

Shubham-Rasal avatar Mar 20 '25 16:03 Shubham-Rasal

@lkingland as an FYI, @Cali0707 had a few functions being target of LLM-driven tool calls [1] [2] [3] as part of https://github.com/keventmesh/llm-tool-provider, does the project go beyond that?

[1] https://github.com/keventmesh/llm-tool-provider/tree/main/tools/average-resource-consumption [2] https://github.com/keventmesh/llm-tool-provider/tree/main/tools/current-time [3] https://github.com/keventmesh/llm-tool-provider/tree/main/tools/resource-cost-calculator

Thanks for these links and additional context.

Yes, this project is different in that it integrates an MCP server into the func CLI directly, enabling any LLM to create and deploy functions directly (including implementing their handlers in a more accessible environment).

David will be adding LLM-related templates to our Functions template repository, so these implementations are potentially a good resource. /cc @gauron99

lkingland avatar Jun 05 '25 08:06 lkingland

A good expansion would be to see if it makes sense to have a function middleware exposing an MCP Server https://modelcontextprotocol.io/introduction and the function being the tool, and document the gaps with MCP Servers for Knative (since MCP is a "stateful" protocol: modelcontextprotocol/modelcontextprotocol#102)

You're right in mentioning there are challenges running an MCP server as a hosted service due to some early design decisions.

For phase one, we're focused on integrating Functions with agentic AI using MCP in its designed context (a service running locally for an MCP-enabled client such as an IDE), so this is not an urgent problem.

However, we are also working on creating a Function template which allows a Function to itself be an MCP server. Coupled with phase one, this would essentially allow an LLM to add functionality to itself. We're hoping the protocol is extended to support this case by then; otherwise we'll have to get a bit creative, for example using some of the solutions mentioned by @Shubham-Rasal above.

lkingland avatar Jun 05 '25 08:06 lkingland

@Shubham-Rasal that is indeed the direction we're taking for now: your second option, creating a local tool so that (for example) agents within an IDE can use Functions as a tool. Middleware would come later.

lkingland avatar Jun 05 '25 08:06 lkingland

As an update to this, our proposal was accepted by GSoC, and we've begun implementation.
https://github.com/knative/func/issues/2806

lkingland avatar Jun 05 '25 08:06 lkingland

@burhanuddin6 thank you for your interest. There are many issues in our "issues" section, which we are trying hard to triage, marking some as "Good First Issue".

lkingland avatar Jun 05 '25 08:06 lkingland