Isaac Godfried

Results: 87 comments by Isaac Godfried

Hi @albertsun. We at @CoronaWhy are currently developing machine learning models to forecast COVID-19 hospitalizations and new cases at the county level, as well as causal models for attribution....

Sure just did. Let me know if you need help.

Edit: it doesn't... this is happening now too.

Actually, it seems to happen where `no_scale` is set to true. It also seems to happen with copied dataset params.

Informer seems to be broken now as well.

```
 ncalls  tottime  percall  cumtime  percall  filename:lineno(function)
 4224/1    0.068    0.000   76.199   76.199  {built-in method builtins.exec}
      1    0.000    0.000   76.199   76.199  trainer.py:1(<module>)
      1    0.003    0.003   72.211   72.211  trainer.py:173(main)
      1    0.001    0.001...
```

So, looking things over, the speed bottlenecks are generally where you would expect. For instance, in `simple_decode` the `model(src)` line takes up 90% of the compute. In PyTorch...
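A minimal sketch of how a per-function breakdown like this can be collected with the stdlib `cProfile`/`pstats` modules. The `model` and `simple_decode` below are hypothetical pure-Python stand-ins for the real flow-forecast objects, just to show the measurement pattern where a single `model(src)` call inside a loop dominates cumulative time:

```python
import cProfile
import io
import pstats

# Hypothetical stand-ins for the real model and decode loop; in
# flow-forecast, `simple_decode` calls `model(src)` repeatedly.
def model(src):
    # dummy "forward pass" that dominates runtime, like model(src) does
    return sum(x * x for x in src)

def simple_decode(src, steps=50):
    out = []
    for _ in range(steps):
        out.append(model(src))  # the hot line in the real profile
    return out

profiler = cProfile.Profile()
profiler.enable()
simple_decode(list(range(10_000)))
profiler.disable()

# Sort by cumulative time so the dominant call chain appears first
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats()
report = stream.getvalue()
```

Sorting by `cumulative` (rather than `tottime`) is what surfaces "90% of the compute is under this one call" patterns, since it attributes child time to the caller.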

Using `torch.compile` in `simple_decode`: 421 ms ± 11.8 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
Not using `torch.compile` in `simple_decode`: 266 ms ± 18...
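The "mean ± std. dev. of 7 runs, 1 loop each" format above is IPython's `%timeit` output; a hedged stdlib sketch of the same measurement methodology is below. `torch.compile` itself is not invoked here; a dummy `workload` stands in for the compiled vs. eager `simple_decode` call being compared:

```python
import statistics
import timeit

# Stand-in workload; in the real comparison this would be the
# simple_decode call with a compiled or an eager model.
def workload():
    return sum(i * i for i in range(100_000))

# repeat=7, number=1 mirrors "7 runs, 1 loop each"
runs = timeit.repeat(workload, repeat=7, number=1)
mean_ms = statistics.mean(runs) * 1e3
std_ms = statistics.stdev(runs) * 1e3
print(f"{mean_ms:.1f} ms ± {std_ms:.1f} ms per loop "
      f"(mean ± std. dev. of 7 runs, 1 loop each)")
```

Note that a slower compiled run like the one above is plausible when the benchmark includes compilation overhead or uses only one loop per run; `torch.compile` typically needs warm-up iterations before its steady-state speed shows.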

```
Timer unit: 1e-09 s

Total time: 0.0194293 s
File:
Function: __getitem__ at line 102

Line #      Hits         Time  Per Hit   % Time  Line Contents
==============================================================
   102                                           @profiler
   103                                           def...
```

```
Line #      Hits         Time  Per Hit   % Time  Line Contents
==============================================================
   347                                           @profile
   348                                           def torch_single_train(model: PyTorchForecast,
   349                                                                  opt: optim.Optimizer,
   350                                                                  criterion: Type[torch.nn.modules.loss._Loss],
   351                                                                  data_loader: DataLoader,
   352                                                                  takes_target: bool,...
```
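Line-by-line tables like the one above come from the third-party `line_profiler` package: decorate the function with `@profile` and run the script under `kernprof -l -v`. A hedged sketch with a hypothetical stand-in body (the real `torch_single_train` iterates a DataLoader and steps the optimizer per batch); the try/except fallback keeps the script runnable when `line_profiler` is not installed:

```python
try:
    # `from line_profiler import profile` works in line_profiler >= 4.1;
    # under `kernprof -l`, `profile` is also injected as a builtin.
    from line_profiler import profile
except ImportError:
    def profile(func):
        # no-op fallback so the code still runs outside kernprof
        return func

@profile
def torch_single_train(batches):
    # hypothetical stand-in body; the real function loops over the
    # DataLoader, computes the criterion, and steps the optimizer
    total = 0.0
    for batch in batches:
        total += batch
    return total
```

Running `kernprof -l -v script.py` then prints per-line Hits/Time/% Time for every line of the decorated function, which is how the `torch_single_train` table above was produced.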