FreeBuddy

Generic Bluetooth headphones

TheLastGimbus opened this issue 2 years ago · 0 comments (status: Open)

It would be nice to support generic Bluetooth headphones: ones that don't have a separate case/buds or any fancy functions. It could still be helpful to provide a battery widget for them, and other nice features in the future.
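For headphones that implement the standard GATT Battery Service (UUID 0x180F), the Battery Level characteristic (0x2A19) is defined as a single byte in the range 0–100. A minimal sketch of validating such a reading; the function name is hypothetical and this is not FreeBuddy's actual implementation (FreeBuddy is a Flutter app, so the real code would be Dart):

```python
def parse_battery_level(value: bytes) -> int:
    """Parse a GATT Battery Level (0x2A19) read: one byte, 0-100 percent.

    Illustrative sketch only. Raises ValueError on empty, oversized,
    or out-of-range readings.
    """
    if len(value) != 1:
        raise ValueError(f"expected a single byte, got {len(value)}")
    level = value[0]
    if level > 100:
        raise ValueError(f"battery level out of range: {level}")
    return level
```

A generic battery widget could poll this characteristic for any connected device that advertises the service, falling back to "unsupported" when it is absent.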

TheLastGimbus · Dec 05 '23 21:12

Hi @byminji

You are right, the uploaded zip seems broken. I've just uploaded a new version; could you give it another try? The md5sum should be 0472cd368133615e187e10c29179d9b7. If it still doesn't work, I will try another cloud service to host the data.

Best, Yue

zhaoyue-zephyrus · Nov 14 '23 01:11

Hi @zhaoyue-zephyrus, thank you for the quick fix :) The re-uploaded version is working. However, I ran into one more issue while fine-tuning the avion_pretrain_lavila_vitb_best.pt model on the EK100 CLS task in my environment (note that I disabled --use-flash-attn because my GPU is a V100, which flash-attn does not support).

Below is the error message:

[Screenshot of the error traceback, Nov 14 '23]

It seems that disabling --use-flash-attn during fine-tuning causes a state-dict key mismatch when loading a pretrained model that was trained with flash attention. My command is:

```shell
export EXP_PATH=workspace/expr/finetune_ek100_cls_v100_4gpu_adamw_no_flash_attn
mkdir $EXP_PATH
PYTHONPATH=.:third_party/decord/python/ torchrun \
    --nproc_per_node=8 scripts/main_lavila_finetune_cls.py \
    --root /mnt/tmp/datasets/EK100/EK100_320p_15sec_30fps_libx264/ \
    --video-chunk-length 15 --disable-flash-attn \
    --grad-checkpointing \
    --use-fast-conv1 \
    --batch-size 64 \
    --optimizer adamw \
    --fused-decode-crop \
    --use-multi-epochs-loader \
    --pretrain-model workspace/weights/avion_pretrain_lavila_vitb_best.pt \
    --output-dir $EXP_PATH 2>&1 | tee $EXP_PATH/log.txt
```

byminji · Nov 14 '23 07:11

Hi @byminji

This is expected, since the flash-attention path uses a different API. I can add a script to convert the weights to regular attention, but it won't be done until the end of this month due to an upcoming deadline. You can also try it yourself, since the main difference is only in these few lines.

I'll keep the issue open and come back later.

Best, Yue

zhaoyue-zephyrus · Nov 14 '23 08:11

@zhaoyue-zephyrus Thanks! I will first try it myself.

byminji · Nov 14 '23 08:11