Louis Maddox

158 comments by Louis Maddox

I presume that the right place to 'intervene' (process the relative path) is here, but I don't want to rush in with a thoughtless implementation: https://github.com/mkdocstrings/autorefs/blob/a6e373b5ee6e6f8a6a30a5addd189c0abcd85eb2/src/mkdocs_autorefs/plugin.py#L80-L97 e.g. in my example above...

It's not at all bad! The state is relatively simple in the demo I've made and there are only 2 modules. :innocent: I just traced back where to hook into...

If you could give me a pointer to where to go next I'd appreciate it, I hit a bit of a dead end there. I'm going to take a break...

Thanks for chiming in @oprypin, I was aware there is a Crystal handler too! 🙂

> Hi. I'd just like to point out one thing. The dot character has...

## :handshake: Realigning

> [!IMPORTANT]
>
> If different suggestions are confusing different people, then I'd suggest narrowing the proposed DSL to what everyone agrees on. It seems like...

I got it working by editing `src/model/recognizer/attention_recognition_head.py`, at lines 109 and 141. **109** (see https://github.com/JasonBoy1/TextZoom/issues/16#issue-732964244):

```py
state = state.index_select(1, predecessors.squeeze())
```

⇣

```py
state = state.index_select(1, predecessors.squeeze().long())  # `.long()` added...
```
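As an aside on why the `.long()` cast helps (my reading of the error, not verified against TextZoom itself): `torch.index_select` requires an integer index tensor, so if `predecessors` comes back as a float tensor the call fails with a dtype error until it is cast. A minimal standalone reproduction, with hypothetical shapes:

```python
import torch

state = torch.randn(2, 4, 8)  # e.g. (layers, batch, hidden); shapes are illustrative
predecessors = torch.tensor([[2.0], [0.0], [1.0], [3.0]])  # float, as beam search can yield

try:
    state.index_select(1, predecessors.squeeze())
except RuntimeError as e:
    print("without .long():", e)  # raises: index must be an int32/int64 tensor

# casting the index tensor to int64 makes the same call succeed
reordered = state.index_select(1, predecessors.squeeze().long())
print(reordered.shape)  # same shape as `state`
```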

Hey Leandro, that's cool! I built a layer myself (it didn't take much effort), so this isn't urgent. I found a link to an AWS repo which linked to another...

These are the results I get on a 3090; not sure if they're meant to correspond to the table in the README or whether something's changed:

```
UForm-Gen Throughput: 193.65 tokens/s (run 1)...
```

Came here to raise the same issue: this native messaging ping-pong "hello world" demo is broken on Linux for me :-1:

For the record, I was reporting the bug from Linux.