Lars Heimdal
This might help you:

```typescript
import { onMount } from 'svelte';
import type { UserResource } from '@clerk/types';

function handleClerkChange(event: Event) {
  const user: UserResource = (event as CustomEvent<UserResource>).detail;
  // do stuff
}

onMount(() => {
  document.addEventListener('clerk-sveltekit:user', handleClerkChange);
  return () => {
    document.removeEventListener('clerk-sveltekit:user', handleClerkChange);
  };
});
```
Seems to also happen if I turn it into a `cat` op:

```python
class CatModel(nn.Module):
    def forward(self, x: torch.Tensor):
        """
        Args:
            x: [N, H, W, C]
        """
        return torch.cat([-x.unsqueeze(-1), x.unsqueeze(-1)], ...
```
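For completeness, the `cat` variant probably looks like this when filled out (the snippet above is cut off, so the concatenation dim is an assumption):

```python
import torch
import torch.nn as nn

class CatModel(nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [N, H, W, C]; stack -x and x along a new trailing axis,
        # producing an output of shape [N, H, W, C, 2].
        # NOTE: dim=-1 is an assumption; the original snippet is truncated here.
        return torch.cat([-x.unsqueeze(-1), x.unsqueeze(-1)], dim=-1)
```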
That also fails
Nope, doesn't work either
This also fails with the same concatenation error:

```python
import torch.nn as nn
import torch
from tinynn.graph.quantization.quantizer import PostQuantizer
from tinynn.converter import TFLiteConverter

class EncoderLayer(nn.Module):
    def __init__(
        self,
        d_model: int...
```
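The repro above is cut off; for context, the conversion path being exercised looks roughly like the tinynn post-training-quantization examples. A sketch, with placeholder shapes and hyperparameters, and call signatures from memory that may need checking against the tinynn docs:

```python
import torch
from tinynn.graph.quantization.quantizer import PostQuantizer
from tinynn.converter import TFLiteConverter

model = EncoderLayer(256)               # d_model=256 as in the later snippets
model.eval()
dummy_input = torch.randn(1, 10, 256)   # placeholder [batch, seq, d_model] shape

# Trace the model and insert observers for post-training quantization
quantizer = PostQuantizer(model, dummy_input, work_dir='out')
ptq_model = quantizer.quantize()

# Calibrate with representative data (random here, just for the repro)
with torch.no_grad():
    for _ in range(8):
        ptq_model(torch.randn(1, 10, 256))

# Convert to an actual quantized model, then to TFLite
with torch.no_grad():
    ptq_model.eval()
    ptq_model = quantizer.convert(ptq_model)
    torch.backends.quantized.engine = quantizer.backend
    converter = TFLiteConverter(ptq_model, dummy_input, tflite_path='out/encoder_ptq.tflite')
    converter.convert()
```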
FYI having two separate encoders works (but I need them to be the same):

```python
class Dummy(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder0 = EncoderLayer(256)
        self.encoder1 = EncoderLayer(256)

    def forward(self, x, y):
        ...
```
This seems to be a decent workaround for the moment:

```python
class Dummy(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = EncoderLayer(256)

    def forward(self, x, y):
        x_cat = torch.cat([x, y], dim=0)
        x_cat = ...
```
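Since that snippet is also truncated, here is a minimal sketch of the full batching workaround: both inputs are concatenated along the batch dimension, run through the single shared encoder, and split back apart afterwards (the final `chunk` is an assumption about what the cut-off code does):

```python
import torch
import torch.nn as nn

class Dummy(nn.Module):
    def __init__(self):
        super().__init__()
        # Single shared encoder; EncoderLayer is the class from the repro above
        self.encoder = EncoderLayer(256)

    def forward(self, x: torch.Tensor, y: torch.Tensor):
        # Run both inputs through the shared encoder in one batched call
        x_cat = torch.cat([x, y], dim=0)
        x_cat = self.encoder(x_cat)
        # NOTE: splitting the result back into the two inputs is an assumption;
        # the original snippet is truncated at this point
        x_out, y_out = torch.chunk(x_cat, 2, dim=0)
        return x_out, y_out
```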
Using this implementation of RMSNorm instead of the built-in one also fails:

```python
class RMSNorm(nn.Module):
    def __init__(self, normalized_shape: int, eps=1e-8):
        """
        Root Mean Square Layer Normalization
        :param normalized_shape: ...
```
FYI `torch.rsqrt` also fails with the same "QUint8" error. Maybe that's possible to support? [link](https://www.tensorflow.org/mlir/tfl_ops#tflrsqrt_tflrsqrtop):

```python
class RMSNorm(nn.Module):
    def __init__(self, dim: int, eps: float = 1e-6) -> None:
        super().__init__()
        ...
```
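If `rsqrt` itself is the unsupported op, one possible interim workaround is an RMSNorm variant that avoids `torch.rsqrt` by dividing by `torch.sqrt` instead. A sketch (the class name is just illustrative, the `dim`/`eps` defaults mirror the snippet above, and whether the division quantizes cleanly is untested here):

```python
import torch
import torch.nn as nn

class RMSNormNoRsqrt(nn.Module):
    """RMSNorm written without torch.rsqrt: x / sqrt(mean(x^2) + eps) * weight."""

    def __init__(self, dim: int, eps: float = 1e-6) -> None:
        super().__init__()
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Mean of squares over the feature dimension
        ms = x.pow(2).mean(dim=-1, keepdim=True)
        # Divide by sqrt instead of multiplying by rsqrt
        return x / torch.sqrt(ms + self.eps) * self.weight
```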