[bounty] $150 make embedded LLM feature work on windows and linux
i created a new feature which lets you run ollama, embedded in screenpipe, in one click, so there is no need to worry about running an LLM yourself; it will also power some new features i'm thinking of and reduce friction with pipes
atm it works on macos, but windows and linux come with a bunch of extra files (as always), so these just need to be moved correctly at build time, i guess into the tauri resources folder
- https://github.com/mediar-ai/screenpipe/blob/main/screenpipe-app-tauri/scripts/pre_build.js
- https://github.com/mediar-ai/screenpipe/blob/main/screenpipe-app-tauri/src-tauri/tauri.windows.conf.json
- https://github.com/mediar-ai/screenpipe/blob/main/screenpipe-app-tauri/src-tauri/tauri.linux.conf.json
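for context, bundling usually means listing the copied files under a `resources` entry in the platform config; a hedged sketch of what tauri.windows.conf.json might gain (exact nesting depends on the tauri version, and the file paths are assumptions, not the actual ollama release layout):

```json
{
  "bundle": {
    "resources": ["ollama.exe", "lib/ollama/*"]
  }
}
```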
also, if this works, i'm guessing it would solve our CUDA issue, meaning everyone gets CUDA-enabled whisper etc. without having to worry about installing those files
definition of done:
- [ ] release-app.yml builds on windows with the ollama installation in pre_build.js (currently commented out)
- [ ] release-app.yml builds on linux with the ollama installation in pre_build.js (currently commented out)
- [ ] windows users can enable, start, and stop the LLM sidecar through the UI and it works (this is already done, just need to remove the "macos" if)
- [ ] linux users can enable, start, and stop the LLM sidecar through the UI and it works (this is already done, just need to remove the "macos" if)
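For the last two items, the "macos" if is presumably a platform gate in the app's UI; a minimal sketch of removing it, assuming a hypothetical `sidecarSupported` helper and Tauri-style platform strings (the real check lives somewhere in the app code):

```javascript
// Hypothetical platform gate for the LLM sidecar toggle; illustrative only,
// not the actual screenpipe code.
function sidecarSupported(platform) {
  // before: only macos was allowed
  // return platform === "macos";
  // after: windows and linux are allowed too
  return ["macos", "windows", "linux"].includes(platform);
}

console.log(sidecarSupported("linux")); // true once the gate is widened
```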
to be clear, this is mostly fixing pre_build and testing that it works on these OSes; the pre build currently downloads ollama from releases and moves files into the app, but the libs need to be properly moved into the resources
/bounty 150
~~## $100 bounty • Screenpi.pe~~
~~### Steps to solve:~~
~~1. Start working: Comment /attempt #507 with your implementation plan~~
~~2. Submit work: Create a pull request including /claim #507 in the PR body to claim the bounty~~
~~3. Receive payment: 100% of the bounty is received 2-5 days post-reward. Make sure you are eligible for payouts~~
~~Thank you for contributing to mediar-ai/screenpipe!~~
~~Add a bounty • Share on socials~~
| Attempt | Started (GMT+0) | Solution |
|---|---|---|
| 🟢 @varshith257 | Oct 16, 2024, 1:01:51 AM | WIP |
| 🟢 @Neptune650 | Oct 18, 2024, 9:47:15 AM | #529 |
/attempt #507
| Algora profile | Completed bounties | Tech | Active attempts | Options |
|---|---|---|---|---|
| @varshith257 | 1 mediar-ai bounty + 15 bounties from 7 projects | Go, Rust, Scala & more | 463 | Cancel attempt |
PS: if you can only do one OS, i can split this into two bounties
Sure
@varshith257 can i know which os you are currently working on?
@Abiji-2020 Thanks for checking in! I have worked on both; just a few quick fixes left and I will wrap up the work
oh okayy
/attempt #507
| Algora profile | Completed bounties | Tech | Active attempts | Options |
|---|---|---|---|---|
| @Neptune650 | 1 mediar-ai bounty + 1 bounty from 1 project | C++, C, Python & more | 321 | Cancel attempt |
💡 @Neptune650 submitted a pull request that claims the bounty. You can visit your bounty board to reward.
@Neptune650 I also worked on this and fixed it for Windows, and I was just waiting on a few small fixes for Linux that I planned to finish tonight (I got busy with my university exams, so I didn't have time to submit a PR sooner). But you submitted the PR first.
PS: As stated in the issue, the bounty can be split per OS. I am willing to submit a PR for Windows while you proceed with the Linux changes in your PR, or else we can collaborate. WDYT?
@louis030195 Should I submit a new pull request or should I collaborate on @Neptune650's pull request?
@louis030195 I apologize, but wasn't it listed as $150?
/tip $50 @Neptune650
sorry didn't see the bounty was incorrect
Much appreciated!