兮尘
@VipMinF This library is built on AspectJ precisely to achieve non-intrusive extension. Removing it would be like stripping out its soul, wouldn't it?
- [x] Personal project
- [x] Project name: 时光猫
- [x] Download link: [酷安 http://www.coolapk.com/apk/182751](http://www.coolapk.com/apk/182751)
+1 for supporting initialization priority for business modules.
It was probably a local-debugging issue. Also, when applying, the domain must be the root domain, and when using it, `client_id` and `client_secret` must be set correctly. [Notes on using Gitment](http://linxueyuanstdio.github.io/2018/01/31/2018-01-31-gitment/)
Is the Kotlin version out yet?
I can't make sense of this garbled text... could you describe the problem in more detail?
Did you change `lr_init`? `lr_init` is the initial learning rate, while `CosineAnnealingLR` is a learning rate scheduler.
Please refer to `model/utils/lr_schedule.py`, which defines the `LRSchedule` object. Warm-up (`lr_warm`, `end_warm`) and decay (`start_decay`, `end_decay`) are used to schedule the learning rate. The learning rate will be `lr_init` only when...
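For intuition, a warm-up-then-decay schedule of this general shape could be sketched as follows. This is a minimal plain-Python sketch: the parameter defaults, the `lr_min` floor, and the linear interpolation are assumptions for illustration, not the exact formula used in `lr_schedule.py`.

```python
def lr_at_step(step, lr_init=1e-3, lr_warm=1e-4, end_warm=1000,
               start_decay=5000, end_decay=10000, lr_min=1e-5):
    """Hypothetical warm-up + decay schedule.

    - Before `end_warm`: linearly ramp from `lr_warm` up to `lr_init`.
    - Between `end_warm` and `start_decay`: hold `lr_init` constant.
    - Between `start_decay` and `end_decay`: linearly decay to `lr_min`.
    - After `end_decay`: stay at `lr_min`.
    """
    if step < end_warm:
        frac = step / end_warm
        return lr_warm + frac * (lr_init - lr_warm)
    if step < start_decay:
        return lr_init
    if step < end_decay:
        frac = (step - start_decay) / (end_decay - start_decay)
        return lr_init + frac * (lr_min - lr_init)
    return lr_min
```

So under this sketch, the learning rate equals `lr_init` only on the plateau between the end of warm-up and the start of decay.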
You can refer to https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.MultiplicativeLR.html for PyTorch's `MultiplicativeLR`.
Emmm... it seems you are trying to reproduce this with PyTorch? The LR scheduler is only a trick to improve performance; you can try any learning-rate schedule on your own. Or you can even...