MCP Server for Semantic Link Labs
With all the great capabilities of Semantic Link Labs, which is now quite broad and rich, it would be killer to be able to use its functions through MCP and an agent to accelerate development and administration work (e.g. with VS Code notebooks).
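As a rough illustration of what this could look like (not a proposal for the actual design), here is a minimal sketch using the official Python MCP SDK's `FastMCP` to expose one Semantic Link Labs function as a tool. The choice of `list_capacities` is just illustrative, and the sketch assumes it runs in an environment where `sempy_labs` can authenticate to Fabric.

```python
# Minimal sketch: expose a Semantic Link Labs function as an MCP tool.
# Assumes the `mcp` Python SDK and `semantic-link-labs` are installed and
# that the process can authenticate to Fabric.
from mcp.server.fastmcp import FastMCP

import sempy_labs as labs

mcp = FastMCP("semantic-link-labs")


@mcp.tool()
def list_capacities() -> str:
    """List the Fabric capacities visible to the current identity."""
    df = labs.list_capacities()      # illustrative sempy_labs call; any DataFrame-returning function fits
    return df.to_csv(index=False)    # return plain text so the agent can read the result directly


if __name__ == "__main__":
    # stdio transport so VS Code / Copilot agent mode can launch it as a local MCP server
    mcp.run(transport="stdio")
```

A real implementation would presumably register many tools (or generate them from the library's public API) rather than hand-wrapping each function like this.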
Alternatives considered: Another approach is generating the documentation in markdown as custom instructions for GitHub Copilot, so it has full context of how to work with all the Semantic Link Labs functions (their references and nuances). It also needs some baseline knowledge of Fabric (e.g. the major item types). This also works, but every developer would need to create and maintain it themselves, versus having something come from the repo or having an MCP server.
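For context, the custom-instructions alternative can be approximated with a short script that dumps the installed library's public functions (signatures plus docstrings) into a markdown file referenced from the repo's Copilot instructions. This is only a sketch; coverage depends on what the installed `sempy_labs` version exports.

```python
# Sketch: generate a markdown reference of sempy_labs functions for use as
# GitHub Copilot custom instructions. Only uses the standard library plus
# whatever sempy_labs exposes at the top level.
import inspect

import sempy_labs as labs

lines = ["# Semantic Link Labs function reference\n"]
for name in sorted(dir(labs)):
    obj = getattr(labs, name)
    if name.startswith("_") or not inspect.isfunction(obj):
        continue
    try:
        sig = str(inspect.signature(obj))
    except (TypeError, ValueError):
        sig = "(...)"
    doc = inspect.getdoc(obj) or "No docstring available."
    lines.append(f"## `{name}{sig}`\n\n{doc}\n")

with open("semantic_link_labs_reference.md", "w", encoding="utf-8") as f:
    f.write("\n".join(lines))
```

The downside noted above still applies: each developer would have to regenerate and maintain this file as the library evolves, whereas an MCP server (or something shipped from the repo) would keep that in one place.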
It's a great idea, and at the same time an MCP server for Fabric already exists. Why propose reinventing the wheel for Semantic Link Labs? If Fabric's MCP covers the required orchestration and agent patterns, shouldn't we be talking about leveraging and extending that, instead of fragmenting effort with another MCP server implementation? https://www.linkedin.com/posts/thisissanthoshr_githubcopilot-microsoftfabric-analytics-activity-7343922198954356739-3dyR/
Thanks for the feedback! This is already being discussed and we will see how to best expose this to the end user.