Alex Yang
That's amazing, let me check
Good news is that our core module is mostly correct: https://arethetypeswrong.github.io/?p=%40llamaindex%2Fcore%400.2.4
I think this is because we didn't use a bundler in the llamaindex module for some reason. Maybe I should bring it back
I know the solution in [the repo](https://github.com/andrewbranch/example-subpath-exports-ts-compat/tree/main), but I think it's too costly right now because I don't want to put much effort into bundling. Each bundler has advantages but also disadvantages...
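For context, the pattern that repo demonstrates is roughly the one sketched below (made-up dist paths, not our actual package.json): node10 module resolution ignores `exports`, so you keep a top-level `main`/`types` for the root entry and add a `typesVersions` map for each subpath.

```json
{
  "name": "@llamaindex/core",
  "main": "./dist/index.cjs",
  "types": "./dist/index.d.ts",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.js",
      "require": "./dist/index.cjs"
    },
    "./schema": {
      "types": "./dist/schema/index.d.ts",
      "import": "./dist/schema/index.js",
      "require": "./dist/schema/index.cjs"
    }
  },
  "typesVersions": {
    "*": {
      "schema": ["./dist/schema/index.d.ts"]
    }
  }
}
```

The `typesVersions` block is what lets `import { ... } from "@llamaindex/core/schema"` type-check under `moduleResolution: node`, since that mode never reads `exports`.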
Working on this right now
Blocked by https://github.com/huozhi/bunchee/pull/579. Just realized there is a bug in bunchee 😢
OK, the core module supports node10 now. I can't bundle the llamaindex package because there are too many third-party npm issues
I found https://github.com/jparkerweb/semantic-chunking, which might only work in Node.js
Need to refactor the node parser to support async
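Rough sketch of what I mean (simplified types and a made-up `embed` callback, not the real llamaindex interfaces): a parser that has to call an embedding model to find chunk boundaries can't stay synchronous, so `getNodesFromDocuments` would need to return a Promise.

```typescript
// Sketch only: TextDocument/TextNode stand in for the real llamaindex types,
// and `embed` stands in for whatever embedding model the parser is given.
interface TextDocument { text: string }
interface TextNode { text: string }

interface AsyncNodeParser {
  // The important change: the return type becomes a Promise.
  getNodesFromDocuments(docs: TextDocument[]): Promise<TextNode[]>;
}

class SemanticSplitter implements AsyncNodeParser {
  constructor(private embed: (texts: string[]) => Promise<number[][]>) {}

  async getNodesFromDocuments(docs: TextDocument[]): Promise<TextNode[]> {
    const nodes: TextNode[] = [];
    for (const doc of docs) {
      const sentences = doc.text.split(/(?<=[.!?])\s+/).filter(Boolean);
      if (sentences.length === 0) continue;
      const vectors = await this.embed(sentences); // the async boundary
      let current: string[] = [sentences[0]];
      for (let i = 1; i < sentences.length; i++) {
        // Start a new node when adjacent sentences look dissimilar.
        if (cosine(vectors[i - 1], vectors[i]) < 0.7) {
          nodes.push({ text: current.join(" ") });
          current = [];
        }
        current.push(sentences[i]);
      }
      nodes.push({ text: current.join(" ") });
    }
    return nodes;
  }
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na * nb) || 1);
}
```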
This is cool. I've wanted to do this for a long time. Since our workflow core is very simple now, it should be easy to teach an LLM how to do this
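Something like the toy sketch below is what I mean by "simple" (not the actual llamaindex workflow API; names here are made up): if the core is just async steps that take one event and return the next, an LLM only has to emit small step bodies.

```typescript
// Toy sketch of an event-driven workflow core, not the real implementation:
// steps are plain async functions from one event to the next, and the runner
// dispatches events until a "stop" event appears.
type WorkflowEvent = { type: string; data?: unknown };
type Step = (ev: WorkflowEvent) => Promise<WorkflowEvent>;

class TinyWorkflow {
  private steps = new Map<string, Step>();

  // Register a handler for one event type.
  addStep(eventType: string, step: Step): this {
    this.steps.set(eventType, step);
    return this;
  }

  // Run events through registered steps until a "stop" event is produced.
  async run(start: WorkflowEvent): Promise<WorkflowEvent> {
    let ev = start;
    while (ev.type !== "stop") {
      const step = this.steps.get(ev.type);
      if (!step) throw new Error(`No step registered for event: ${ev.type}`);
      ev = await step(ev);
    }
    return ev;
  }
}

// Usage: each step is "event in, event out", small enough that an LLM could
// plausibly generate the body from a natural-language description.
const wf = new TinyWorkflow()
  .addStep("start", async (ev) => ({ type: "summarize", data: ev.data }))
  .addStep("summarize", async (ev) => ({
    type: "stop",
    data: `summary of: ${String(ev.data)}`,
  }));

wf.run({ type: "start", data: "hello world" }).then((out) =>
  console.log(out.data),
);
```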