Liezl Maree
Closed in favor of [3cba6](https://github.com/talmolab/sleap/commit/3cba6d061177f3bcdfb9ea06107eb00bfa1c3733)
I think you are right - I also dislike how large the inference GUI has become. I like the organization of the stack widget, but adding a scroll widget is...
Hi @getzze, Yes, you are adding too many features for the GUI to handle! Kudos 😎 The hold-up in merging this has indeed been displaying all the new features. I...
@coderabbitai review
Hi @neugun, Thanks for creating the issue (from discussion #1650). I did a bit of tracing and have everything documented below. Workaround --- The workaround would be to set up the...
Hi @neugun, Just pinging you to say that I've finished the diagnosis. Thanks, Liezl
Hi @olinesn, Yes, `-n` and `--max_instances` are synonymous; the latter is just more verbose and possibly more readable for others. Thanks, Liezl
Long awaited, but finally integrated. This issue is now fixed in SLEAP 1.4.1.
Hi @WeissShahaf, This parameter can be set through your training_config.json using the `"batch_size"` key. To train through the CLI, you would then reference this training_config.json where the batch size is...
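For illustration, here is a minimal sketch of editing the batch size in an existing training_config.json and then pointing `sleap-train` at it. It assumes `"batch_size"` sits under an `"optimization"` section of the config and uses placeholder file names; adjust both to match your own project.

```python
# Minimal sketch: set the batch size in an existing training_config.json,
# then train via the CLI by referencing the edited config.
# Assumption: "batch_size" lives under the "optimization" section of the config;
# file names below are placeholders.
import json

config_path = "training_config.json"

with open(config_path) as f:
    config = json.load(f)

config["optimization"]["batch_size"] = 4  # e.g. lower this if you run out of GPU memory

with open(config_path, "w") as f:
    json.dump(config, f, indent=4)

# Then, from the command line:
#   sleap-train training_config.json labels.v001.slp
```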
Hi @WeissShahaf, There are a few downsides if you decrease the number of down-sampling blocks too far... --- ### Context Deeper networks can capture more contextual information, which is...
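As a rough illustration of that trade-off, below is a back-of-envelope sketch (not SLEAP code) of how the receptive field grows with the number of down-sampling blocks in a UNet-style encoder; the kernel size and number of convolutions per block are assumed values.

```python
# Back-of-envelope sketch (not SLEAP code): approximate receptive field of a
# UNet-style encoder as a function of the number of down-sampling blocks.
# Assumes each block applies `convs_per_block` 3x3 convolutions followed by
# 2x2 max pooling; the pooling kernel's own contribution is ignored for simplicity.
def approx_receptive_field(n_down_blocks: int, convs_per_block: int = 2, kernel: int = 3) -> int:
    rf = 1      # receptive field in input pixels
    stride = 1  # cumulative stride of the feature map
    for _ in range(n_down_blocks):
        rf += convs_per_block * (kernel - 1) * stride  # conv layers widen the RF
        stride *= 2                                    # pooling doubles the stride
    return rf

for n in range(1, 6):
    print(f"{n} down blocks -> receptive field ~{approx_receptive_field(n)} px")
```

Fewer down-sampling blocks therefore mean a much smaller receptive field, so the network sees less context around each landmark.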