AI Integration?
LINQPad recently added AI integration and it's super helpful. Any plans for NetPad?
It is planned, but I haven't figured out exactly how it would work. I'm open to any suggestions — do you like the way it's implemented in LINQPad?
I love that Shift+Space triggers the generation. The personal assistant implementation is a little odd since the UI is built into the result view area; try it and you will see. I think VS Code does that a lot better.
Thank you for the feedback!
There is #127 which is related, asking for GitHub Copilot. LINQPad uses ChatGPT for code generation. I'd love to hear what everyone's preferences are.
LINQPad can use Azure and OpenAI. Not sure if Ollama is planned.
Older screenshot from last year ^
Ollama would be pretty neat too.
LM Studio exposes Ollama and other local models through an OpenAI-equivalent API.
It shouldn't be too hard to lift the code completion example from Microsoft.Extensions.AI or Semantic Kernel and plop it into the UI. I believe it might be worth looking at the Semantic Memory samples as well, together with LM Studio, which makes it trivial to run LLMs locally.
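Since LM Studio speaks the OpenAI wire protocol, NetPad wouldn't even need an SDK to prototype this — a plain `HttpClient` call works. A minimal sketch, assuming LM Studio's default port (1234) and a hypothetical model id (the key is ignored by local servers):

```csharp
// Sketch: raw call to LM Studio's OpenAI-compatible chat completions endpoint.
// The model id is whatever LM Studio shows for the loaded model (assumed here).
using System.Net.Http;
using System.Net.Http.Json;
using System.Text.Json;

using var http = new HttpClient { BaseAddress = new Uri("http://localhost:1234/") };

var response = await http.PostAsJsonAsync("v1/chat/completions", new
{
    model = "meta-llama-3.1-8b-instruct",
    messages = new object[]
    {
        new { role = "system", content = "You are a C# code completion engine. Return only code." },
        new { role = "user", content = "Continue: static int Fibonacci(int n) {" },
    },
});

// Standard OpenAI response shape: choices[0].message.content holds the completion
using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
Console.WriteLine(doc.RootElement
    .GetProperty("choices")[0]
    .GetProperty("message")
    .GetProperty("content")
    .GetString());
```

The same request would work against OpenAI, Azure OpenAI, or Ollama by swapping the base address and adding a real `Authorization` header, which is what makes the OpenAI-compatible route attractive for a "bring your own backend" design.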
Thanks @frankhaugen. If you can provide links that would be helpful.
So I did a script. I'm using LM Studio locally to host and run the LLM:
```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// LM Studio serves an OpenAI-compatible endpoint; no API key is needed locally
var modelId = "meta-llama-3.1-8b-instruct";
var endpoint = new Uri("http://localhost:1234/v1");

var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(
        modelId: modelId,
        apiKey: null,
        endpoint: endpoint)
    .Build();

OpenAIChatCompletionService chatCompletionService = new(modelId: modelId, apiKey: null, endpoint: endpoint);

var systemPrompt = """
    You are an expert C# copilot. Continue the user's function using clean, idiomatic code.
    Return only valid C# that compiles. Avoid repeats.
    You only return code.
    """;

var userPrefix = """
    static int Fibonacci(int n)
    {
    """;

var userPrompt = $"""
    Continue the function implementation:
    {userPrefix}
    """;

var chatHistory = new ChatHistory();
chatHistory.AddSystemMessage(systemPrompt);
chatHistory.AddUserMessage(userPrompt);
chatHistory.AddAssistantMessage("Sure, here is the continuation of the Fibonacci function:");

await foreach (var word in chatCompletionService.GetStreamingChatMessageContentsAsync(
    chatHistory, new PromptExecutionSettings(), kernel))
{
    Console.Write($"{word} ");
}
```
Output (it needs tweaks and parsing, which is what the Copilot extension in VS Code mainly does):
```csharp
static int Fibonacci(int n)
{
    if (n <= 1) return n;
    else return Fibonacci(n - 1) + Fibonacci(n - 2);
}
```
However, this recursive approach has a high time complexity and may cause stack overflow for large inputs. A more efficient solution would be to use memoization:
```csharp
static int Fibonacci(int n)
{
    if (n <= 1) return n;
    int[] fib = new int[n + 1];
    fib[0] = 0;
    fib[1] = 1;
    for (int i = 2; i <= n; i++)
        fib[i] = fib[i - 1] + fib[i - 2];
    return fib[n];
}
```
Or use a simple iterative approach:
```csharp
static int Fibonacci(int n)
{
    if (n <= 1) return n;
    int a = 0, b = 1;
    for (int i = 2; i <= n; i++)
    {
        (a, b) = (b, a + b);
    }
    return b;
}
```