Built-in support for newline-separated messages via stdin/stdout (for JSON communication)?
This package supports an IsolateChannel out of the box, so it makes sense that it could also support a stdin/stdout channel.
One of the most common use cases for that is JSON protocols, which want to send newline-separated, JSON-encoded messages back and forth.
It is a bit cumbersome to do that today; you need something along these lines:
import 'dart:convert';
import 'dart:io' as io;

import 'package:async/async.dart';
import 'package:stream_channel/stream_channel.dart';

final channel = StreamChannel.withCloseGuarantee(io.stdin, io.stdout)
    .transform(StreamChannelTransformer.fromCodec(utf8))
    .transformStream(const LineSplitter())
    .transformSink(
      StreamSinkTransformer.fromHandlers(
        handleData: (data, sink) {
          // Terminate each outgoing message with a newline.
          sink.add('$data\n');
        },
      ),
    );
There isn't a ton going on here, but it is actually somewhat hard to come up with this magical incantation yourself (I ended up reaching out to @natebosch).
Note that this implementation assumes each message passed to the sink is a fully encoded JSON object.
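For reference, once you have that channel, usage for a newline-delimited JSON protocol looks roughly like this (a sketch using the channel variable from the snippet above; the message shape is just an example):

// Each outgoing message is one JSON-encoded line; each incoming event is
// one line, ready to be decoded.
channel.sink.add(jsonEncode({'jsonrpc': '2.0', 'method': 'ping', 'id': 1}));
channel.stream.listen((line) {
  final message = jsonDecode(line);
  print('received: $message');
});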
We could expose this from other packages like package:json_rpc_2, but I think it's probably a more general thing?
Alternatively, the json_rpc_2 package could allow some configuration for arbitrary data to be written to the sink after each encoded message, which might be a more general-purpose solution for that particular package (for binary message formats this could be any message termination byte(s)).
cc @natebosch
Yet another idea could be a JsonObjectLineSplitter: a sink transformer that just counts open/close curly braces (taking escapes, strings, etc. into account) and injects newlines between objects.
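To make that concrete, here is a minimal sketch of the brace-counting mechanism, written as a StreamTransformer<String, String> that emits one top-level JSON object per event rather than as a sink transformer (the class name is hypothetical and it only handles object-valued messages); the same state machine could just as well inject a newline after each object on the outgoing side:

import 'dart:async';

/// Hypothetical sketch: splits a stream of concatenated JSON objects into
/// one string per top-level object by tracking {/} depth, ignoring braces
/// that appear inside string literals (including escaped quotes).
class JsonObjectSplitter extends StreamTransformerBase<String, String> {
  const JsonObjectSplitter();

  @override
  Stream<String> bind(Stream<String> stream) async* {
    var buffer = StringBuffer();
    var depth = 0;
    var inString = false;
    var escaped = false;
    await for (final chunk in stream) {
      for (var i = 0; i < chunk.length; i++) {
        final char = chunk[i];
        buffer.write(char);
        if (escaped) {
          // The character after a backslash inside a string is a literal.
          escaped = false;
        } else if (inString) {
          if (char == r'\') {
            escaped = true;
          } else if (char == '"') {
            inString = false;
          }
        } else if (char == '"') {
          inString = true;
        } else if (char == '{') {
          depth++;
        } else if (char == '}') {
          depth--;
          if (depth == 0) {
            // A complete top-level object has been seen; emit it.
            yield buffer.toString().trim();
            buffer = StringBuffer();
          }
        }
      }
    }
  }
}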
We could consider making the JSON encoder and decoder support JSONL when used as a stream transformer.
They convert between Stream<Object?> and Stream<String> (or Stream<Uint8List> when fused with the UTF-8 codec). That API can allow adding or reading more than one object from the same input.
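For comparison, the closest thing today is to layer LineSplitter and jsonDecode by hand, which only works when the input is already newline-delimited, roughly:

import 'dart:convert';
import 'dart:io';

void main() async {
  // Today's workaround: decode one JSON value per line read from stdin.
  final objects = stdin
      .transform(utf8.decoder)
      .transform(const LineSplitter())
      .map(jsonDecode);
  await for (final object in objects) {
    print(object);
  }
}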
I feel this would be useful for implementing Anthropic Model Context Protocol servers and clients.
I’d like to build an MCP client as a Flutter app with Gemini as the driving LLM (using Vertex AI in Firebase) for an advanced Flutter AI codelab.
Let me know if I can help with proving things out.
/cc @RedBrogdon for visibility
> I feel this would be useful for implementing Anthropic Model Context Protocol servers and clients.
This is the exact use case I ran into haha (see here).
> I’d like to build an MCP client as a Flutter app with Gemini as the driving LLM (using Vertex AI in Firebase) for an advanced Flutter AI codelab.
@natebosch might also be looking into a similar thing fwiw