Yuchen Jin
Thank you for your explanation. I think there are two cases in the current implementation. 1. If `upload_id` is used, the files are separated into subfolders. The chunk file...
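As a minimal sketch of the two cases, assuming `du.configure_upload` takes a `use_upload_id` flag as in the current docs (the exact folder layout in the comments is my assumption, not taken from the source):

```python
import dash
import dash_uploader as du

app = dash.Dash(__name__)

# Case 1: use_upload_id=True -> each upload session gets its own subfolder,
# e.g. /tmp/uploads/<upload_id>/<filename> (assumed layout).
# Case 2: use_upload_id=False -> all files land directly in /tmp/uploads/.
du.configure_upload(app, "/tmp/uploads", use_upload_id=True)

app.layout = du.Upload(id="uploader")

if __name__ == "__main__":
    app.run_server(debug=True)
```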
That is exactly what I want to test. I think I could write some code to check the performance in this case once I finish the other in-progress PRs.
I met the same problem when trying to deploy the service on a different server. In my attempts, I hacked into the Python source code and made some changes to...
Hi @np-8, it is because I only want `dash-uploader` served from my `flask` app, not a `dash` app. The uploader component is deployed for another `dash` app. By the way, I hack...
@np-8 Thank you! Your reference is useful. I will try to implement this feature this week, starting my work from the `0.4.x` version. If successful, I will...
@np-8 @Vipulsh Hello! I have created a pull request to solve this issue; see #36 for details. Because I am not an expert in web development, please help me...
`torchsummary` uses a tensor with batch size 2 to test the network and gather the information of each layer. See the code here: https://github.com/sksq96/pytorch-summary/blob/011b2bd0ec7153d5842c1b37d1944fc6a7bf5feb/torchsummary/torchsummary.py#L58 Even if you configure `batch_size` in the...
> Thanks a lot. I wanted to ask why does it take '1' as batch size when I input a shape similar to an image, like `(3,28,28)`? Because in that...
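For reference, a small runnable sketch (the toy model is mine, not from the thread) showing that the printed batch size comes from the `batch_size` argument, while the test forward pass itself always uses a batch of 2:

```python
import torch.nn as nn
from torchsummary import summary

# Toy CNN just for illustration.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(16 * 28 * 28, 10),
)

# torchsummary builds a random tensor of shape (2, 3, 28, 28) internally for
# the test forward pass; batch_size only changes the number shown in the
# "Output Shape" column of the printed table.
summary(model, input_size=(3, 28, 28), batch_size=1, device="cpu")
```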
Your invocation is not correct. Change the code like this:

```python
summary(autoencoder, (4396,))
```
The current version supports your need. Please install the newest GitHub version instead of the release on PyPI, because `torchsummary.summary_string` does not exist in [`torchsummary 1.5.1`](https://pypi.org/project/torchsummary/#history). Here is...
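A rough usage sketch, assuming the GitHub master exports `summary_string` from the package root and returns `(report, (total_params, trainable_params))`; the stand-in model below is hypothetical:

```python
import torch
import torch.nn as nn
from torchsummary import summary_string  # on GitHub master, not in the 1.5.1 release

# Hypothetical stand-in for the autoencoder discussed above.
model = nn.Sequential(nn.Linear(4396, 128), nn.ReLU(), nn.Linear(128, 4396))

# Unlike summary(), summary_string() returns the report instead of printing it,
# so it can be logged or written to a file.
report, (total_params, trainable_params) = summary_string(
    model, input_size=(4396,), device=torch.device("cpu")
)
print(report)
```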