Harsh Sutaria

3 comments by Harsh Sutaria

Hi everyone, I’d love to work on adding the **AdaBelief optimizer** to `mlx.optimizers` as proposed here. **Planned implementation**: an API similar to `Adam` / `AdamW` (`learning_rate`, `betas`, `eps`, `weight_decay`, optional...
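For context, here is a minimal sketch of the AdaBelief update rule in plain Python. This is not MLX's actual optimizer API, just an illustration of the algorithm: AdaBelief differs from Adam by accumulating the squared deviation of the gradient from its exponential moving average, `(g_t - m_t)**2`, instead of `g_t**2`. The class name and parameter defaults below are assumptions for the sketch.

```python
import math

class AdaBeliefSketch:
    """Scalar-parameter sketch of AdaBelief (Zhuang et al., 2020)."""

    def __init__(self, learning_rate=1e-3, betas=(0.9, 0.999), eps=1e-8):
        self.lr = learning_rate
        self.betas = betas
        self.eps = eps
        self.m = 0.0  # EMA of gradients (first moment)
        self.s = 0.0  # EMA of squared deviation (g - m)^2 ("belief")
        self.t = 0    # step counter for bias correction

    def update(self, param, grad):
        b1, b2 = self.betas
        self.t += 1
        self.m = b1 * self.m + (1 - b1) * grad
        # Key difference from Adam: variance around the EMA, not raw g^2.
        self.s = b2 * self.s + (1 - b2) * (grad - self.m) ** 2 + self.eps
        m_hat = self.m / (1 - b1 ** self.t)
        s_hat = self.s / (1 - b2 ** self.t)
        return param - self.lr * m_hat / (math.sqrt(s_hat) + self.eps)

# Usage: minimize f(x) = x^2, whose gradient is 2x.
x = 5.0
opt = AdaBeliefSketch(learning_rate=0.1)
for _ in range(200):
    x = opt.update(x, 2 * x)
```

With a step size bounded by the learning rate (as in Adam), 200 steps from `x = 5.0` bring the parameter close to the minimum at 0.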

Thanks for the review and feedback! I've refactored the code and made the necessary changes. Could you please take another look?

Hi @awni, just checking in. I've addressed all the requested changes and pushed the updates. Whenever you get a moment, could you please review the latest commits? Thanks!