Dmitry Kolesnikov

Results 32 comments of Dmitry Kolesnikov

To reduce RAM usage, you can either decrease the number of patches during inference by increasing the stride or the crop size, or you can resize the...
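The effect of stride and crop size on the patch count can be sketched with simple arithmetic. This is a minimal illustration of why those two parameters reduce memory load; the exact tiling logic inside `patched_yolo_infer` may differ, and the function names here are illustrative, not the library's API:

```python
import math

def patches_per_axis(image_size: int, crop_size: int, stride: int) -> int:
    """Number of crops needed to cover one image axis with the given stride."""
    if crop_size >= image_size:
        return 1
    return math.ceil((image_size - crop_size) / stride) + 1

def total_patches(width: int, height: int, crop: int, stride: int) -> int:
    """Total crops for a width x height image with square crops."""
    return patches_per_axis(width, crop, stride) * patches_per_axis(height, crop, stride)

# A larger stride (less overlap) or a larger crop size means fewer patches,
# and therefore fewer forward passes held in memory during inference.
print(total_patches(1920, 1080, 640, 320))  # dense overlap -> many patches
print(total_patches(1920, 1080, 640, 640))  # no overlap    -> fewer patches
print(total_patches(1920, 1080, 960, 480))  # larger crops  -> fewer patches
```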

@hanbangzou To update the library to the latest version, run the following command in your terminal:
```
pip install --upgrade patched_yolo_infer
```
After successfully updating the library, you can use...

@hanbangzou How are your results? Did the new repository update help you solve the memory problem? PS: All usage examples are provided in [Google Colab](https://colab.research.google.com/drive/1XCpIYLMFEmGSO0XCOkSD7CcD9SFHSJPA?usp=sharing)

@hanbangzou I'm glad to hear that the update has resolved the issue! Good luck!

I advise you to move the image files one level up, so that the structure looks like this:
```
COCO_dataset/
|-- annotations/
|   |-- instances_train.json
|   |-- instances_val.json...
```
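The recommended layout can also be created programmatically. A minimal sketch with `pathlib`; the `images/train` and `images/val` subfolders are an assumption about the truncated part of the tree, so adjust them to your own split names:

```python
from pathlib import Path

def make_coco_layout(root: str) -> Path:
    """Create the COCO-style directory skeleton described above.

    The images/ subfolders are an assumption about the truncated
    part of the tree; only annotations/ is confirmed by the example.
    """
    base = Path(root)
    (base / "annotations").mkdir(parents=True, exist_ok=True)
    for split in ("train", "val"):
        (base / "images" / split).mkdir(parents=True, exist_ok=True)
        # Empty placeholders for instances_train.json / instances_val.json
        (base / "annotations" / f"instances_{split}.json").touch()
    return base

root = make_coco_layout("COCO_dataset")
print(sorted(p.relative_to(root).as_posix() for p in root.rglob("*")))
```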

Yes, I agree. Apparently, the storage location of the photos depends on the CVAT version. My export in this format looks like the example. So, as an option, you can...

I assume that if there is a task that is not assigned to any of the subsets, such as train, validation, or test, then it creates a data folder. But...

Good afternoon. Development in this area is underway. Most likely, a separate library called `patched_obb_infer` from our development team will be released in the next month. So stay tuned for...

@Alisoltan82 Unfortunately, there isn't a ready-made tool to directly export patch inference results to COCO format. However, you can still obtain the results of the patch inference algorithm in a...
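As a workaround, per-image detections (boxes, class ids, confidences) can be serialized to a COCO-style dict by hand. A minimal sketch, where the input box format `(x1, y1, x2, y2)` and all function and variable names are assumptions for illustration, not part of the `patched_yolo_infer` API:

```python
import json

def detections_to_coco(boxes, class_ids, scores, image_id=1, category_names=None):
    """Convert one image's detections to a COCO-style dict.

    Input boxes are assumed to be (x1, y1, x2, y2) in pixels; COCO
    stores bbox as [x, y, width, height]. Names here are illustrative.
    """
    categories = [{"id": i, "name": n} for i, n in enumerate(category_names or [])]
    annotations = []
    for ann_id, ((x1, y1, x2, y2), cls, score) in enumerate(
        zip(boxes, class_ids, scores), start=1
    ):
        annotations.append({
            "id": ann_id,
            "image_id": image_id,
            "category_id": int(cls),
            "bbox": [float(x1), float(y1), float(x2 - x1), float(y2 - y1)],
            "area": float((x2 - x1) * (y2 - y1)),
            "score": float(score),
            "iscrowd": 0,
        })
    return {"images": [{"id": image_id}],
            "annotations": annotations,
            "categories": categories}

coco = detections_to_coco(
    boxes=[(10, 20, 110, 220)], class_ids=[0], scores=[0.9],
    category_names=["person"],
)
print(json.dumps(coco, indent=2))
```

The resulting dict can then be written to a `.json` file and consumed by tools that expect COCO-format detection results.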

Unfortunately, I haven't had time to implement this option yet. I'll let you know if this feature appears in one of the next updates