Use Node.js runtime & move more components to RSC
Hey! I saw you ran into a few issues streaming with the edge runtime. This PR does a few things:
- Moves your client boundaries lower in the tree by changing the `"use client"` placement. This means the initial `page` will be a server component, as well as `components/chat`.
- Moves from the edge runtime to the Node.js runtime. You should pair this with selecting 1 vCPU in your Function settings on Vercel. This will help you get faster function responses (faster than edge, which might not be obvious).
- Removes a few bits of unused code that I saw in the editor.
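Concretely, "moving the boundary down" means the top-level page no longer carries the directive; a minimal sketch, assuming the paths from the bullets above (`app/page.tsx`, `components/chat`), not the actual diff:

```tsx
// app/page.tsx — no "use client" directive here, so the page renders
// as a Server Component. It can still compose components that opt into
// the client boundary further down the tree where hooks or browser
// APIs are actually needed.
import { Chat } from '@/components/chat'

export default function Page() {
  return <Chat />
}
```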
@leerob is attempting to deploy a commit to the morphic Team on Vercel.
A member of the Team first needs to authorize it.
Also, added you here 😄
https://vercel.com/templates/next.js/morphic-ai-answer-engine-generative-ui
The latest updates on your projects. Learn more about Vercel for Git ↗︎
| Name | Status | Preview | Comments | Updated (UTC) |
|---|---|---|---|---|
| morphic | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | May 8, 2024 9:06pm |
@leerob
Thank you for reviewing this project's code!
```ts
export const runtime = 'edge'
```

Actually, I didn't write this line at first, and streaming worked fine in my local environment. (It doesn't even appear in the `ai/rsc` docs.) However, when I deployed to Vercel, it didn't stream, which is why I added it. Related: https://github.com/vercel/ai/issues/1187

Do I need to change the Vercel settings? 1 vCPU is already selected in my Function settings. Your PR also works fine locally, but it doesn't stream in the Vercel environment.
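For anyone trying the Node.js runtime instead, the switch is just the route segment config at the top of the page or route file (path illustrative); since `'nodejs'` is the default, deleting the line has the same effect:

```ts
// app/page.tsx (illustrative path)
// Next.js route segment config — 'nodejs' is the default runtime,
// so removing `export const runtime = 'edge'` is equivalent.
export const runtime = 'nodejs'
```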
- Preview: https://morphic-git-fork-leerob-fixes-morphic-ai.vercel.app/
- Main branch: https://morphic-ai.vercel.app/
So cool! 🚀
@leerob Can you explain how streaming is faster on the Node.js runtime than on edge? I'm imagining it has to do with CPU speed. Are there significant price differences when using a full Node runtime?
@leerob
Thank you for your commit with the fix. We recently encountered an error with Next.js v14.2.3, so we are pinning v14.2.1: https://github.com/miurla/morphic/issues/85
Hey @aleksa-code, reviewed your comment on #85 briefly. Do you believe the text not streaming when using the node runtime is related to the Next.js/Vercel CLI version?
Vercel has recently moved away from edge.
The argument for using the Node runtime is interesting, but I would need more sources to be convinced. I think the argument goes as follows (I can't remember if I saw this on X or not): with more compute power, text streams faster, so the function spends less time executing. Although the Node runtime costs more per unit of time, the end result in price is about the same, if not better. So with the Node runtime you get faster streaming and potential cost savings.

I'd like to see whether that streaming speed-up is significant and whether using the Node runtime opens you up to other costs.
Hi @albertdbio. My comment was related to the 405 error mentioned in #85 😅, which seems to be resolved now, so I removed the Vercel CLI version flag that I had in my project. I think Morphic should be working fine with the newest version of Next.js as well, without giving 405.
Regarding the Edge and Node runtime, I don't have an answer about why streaming is not working. I also learned about Vercel ditching Edge from Theo's video and by finding this PR here. In my case, I only use Edge when necessary, and I would happily switch to using only Node. This would remove the confusion of when to use it or not.
What I am curious about is what this means for Vercel's free Hobby tier. Since you cannot switch to more CPU and memory on that tier, requests will time out when, for example, streaming responses from the OpenAI API. Does that mean the Hobby tier will be upgraded to Standard (CPU/memory) in the future? Or, if we want to stay on the free tier, would we still need to use the Edge runtime?
👋 There was a bug up until 14.2.2 with streaming + server actions + Node.js that has since been resolved. Thanks for helping pinpoint things here, appreciate it. Should work now, if you want to approve that deployment.
Could you also make sure you're on "Standard" performance here?
https://vercel.com/changelog/manage-your-vercel-functions-cpu-and-memory-in-the-dashboard
It is now set to "Standard".
~~Vercel Runtime Timeout Error: Task timed out after 30 seconds~~
~~For our application, 30 seconds is not enough.~~
> Task timed out after 15.02 seconds

It seems to time out at 15 seconds. Changing `maxDuration` to 60 seconds resolved the error.
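For reference, `maxDuration` is a Next.js route segment config exported from the page or route handler file (path illustrative):

```ts
// app/page.tsx (illustrative path)
// Next.js route segment config — raise the function timeout to 60 s.
// Note: the maximum allowed value depends on your Vercel plan.
export const maxDuration = 60
```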
@leerob Thank you for your great contribution! 🥇