pawkanarek

Results: 31 comments of pawkanarek

My solution was to copy this class https://github.com/luberda-molinet/FFImageLoading/blob/v2.4.11/source/FFImageLoading.Common/Cache/DownloadCache.cs into my project and initialize it with the `ImageService` instance:

```csharp
Configuration config = new Configuration(); // your project config
#if DEBUG
// without...
```
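A fuller version of that wiring might look roughly like the following. This is a minimal sketch, not the library's documented recipe: it assumes the copied class is named `MyDownloadCache` (a hypothetical name for your copy of `DownloadCache.cs`, whose constructor takes a `Configuration`) and FFImageLoading 2.x, where `Configuration` exposes a `DownloadCache` property and the service is started via `ImageService.Instance.Initialize(config)`:

```csharp
using FFImageLoading;
using FFImageLoading.Config;

public static class ImageLoadingSetup
{
    public static void Init()
    {
        // Your project configuration; fill in HttpClient, logging, etc. as needed.
        var config = new Configuration();

        // MyDownloadCache is the (hypothetically named) copy of FFImageLoading's
        // DownloadCache.cs placed in this project, modified as needed.
        config.DownloadCache = new MyDownloadCache(config);

        ImageService.Instance.Initialize(config);
    }
}
```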

Thanks @papafe. I've also noticed that if I manually trigger `realm.Refresh()` after replacing items, then nothing happens, but when I trigger it after `items.Clear()` then I'm receiving a new extra event...

Thanks for investigating so fast :) Yes, I also think that `Reset` is not that important, because we have `Remove` events, so that's good. Also I agree that RealmCollection doesn't...
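For reference, the kind of setup where this shows up can be sketched roughly as below. This is only a sketch: it assumes a hypothetical `Item` model defined the classic `RealmObject` way, and uses the standard `INotifyCollectionChanged` events that `AsRealmCollection()` exposes on an `IRealmCollection<T>`:

```csharp
using System;
using System.Collections.Specialized;
using System.Linq;
using Realms;

// Hypothetical model class used only for illustration.
public class Item : RealmObject
{
    public string Name { get; set; }
}

public static class NotificationDemo
{
    public static void Run()
    {
        var realm = Realm.GetInstance();

        // IRealmCollection<T> implements INotifyCollectionChanged,
        // so Add/Remove/Reset actions can be observed directly.
        var items = realm.All<Item>().AsRealmCollection();
        items.CollectionChanged += (sender, e) =>
            Console.WriteLine($"CollectionChanged: {e.Action}");

        // Replace the contents inside a write transaction...
        realm.Write(() =>
        {
            realm.RemoveAll<Item>();
            realm.Add(new Item { Name = "replacement" });
        });

        // ...then force delivery of any pending notifications.
        realm.Refresh();
    }
}
```

With this kind of wiring, the extra notification mentioned above would surface in the `CollectionChanged` handler once `realm.Refresh()` delivers the pending change set.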

Hi @martijn00, do you have plans to update ExoPlayer to the newest version?

I would also love to change the JSON serializer, for example to this: https://github.com/neuecc/MessagePack-CSharp (an extremely fast MessagePack serializer for C# / .NET, .NET Core, Unity, Xamarin).

I also had the same problem after updating to macOS 14. My current workaround is to launch with the additional `--no-half` flag:

```bash
./webui.sh --no-half
```

That was a suggestion that I...

Does `--no-half` have a negative impact on performance in terms of speed? I cannot check this, as I don't have a second machine with an older macOS version, and I didn't...

To make it work I also did a full cleanup of the repository with the command `git clean -fdx`, but remember to back up your output images first, as I forgot about...

I modified the code a little bit to add some sanity checks.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def train():
    gemma2it = AutoModelForCausalLM.from_pretrained("google/gemma-2b-it")  # sanity check model
    tokenizer = AutoTokenizer.from_pretrained("NousResearch/gemma-2b-it-tokenizer")
    model = AutoModelForCausalLM.from_pretrained("google/gemma-2b-it", torch_dtype=torch.bfloat16)
    dataset...
```

Hi @zorrofox, and thanks for the insight! It looks like my transformers fork didn't include the change from that PR. What kind of fine-tuning performance are you talking about? Do you want to know...