Error in BIDS conversion example
There is an error in the BIDS conversion example when using the caching process.
/home/runner/work/moabb/moabb/examples/plot_bids_conversion.py failed leaving traceback:
Traceback (most recent call last):
File "/home/runner/work/moabb/moabb/examples/plot_bids_conversion.py", line 50, in <module>
_ = dataset.get_data(cache_config=dict(path=temp_dir, save_raw=True))
File "/home/runner/work/moabb/moabb/moabb/datasets/base.py", line 342, in get_data
data[subject] = self._get_single_subject_data_using_cache(
File "/home/runner/work/moabb/moabb/moabb/datasets/base.py", line 436, in _get_single_subject_data_using_cache
sessions_data = self._get_single_subject_data(subject)
File "/home/runner/work/moabb/moabb/moabb/datasets/alex_mi.py", line 59, in _get_single_subject_data
raw = Raw(self.data_path(subject), preload=True, verbose="ERROR")
File "<decorator-gen-269>", line 10, in __init__
File "/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/mne/io/fiff/raw.py", line 153, in __init__
self._preload_data(preload)
File "/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/mne/io/base.py", line 594, in _preload_data
self._data = self._read_segment(data_buffer=data_buffer)
File "<decorator-gen-231>", line 12, in _read_segment
File "/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/mne/io/base.py", line 470, in _read_segment
_ReadSegmentFileProtector(self)._read_segment_file(
File "/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/mne/io/base.py", line 2526, in _read_segment_file
return self.__raw.__class__._read_segment_file(
File "/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/mne/io/fiff/raw.py", line 420, in _read_segment_file
one = read_tag(
File "/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/mne/io/tag.py", line 475, in read_tag
tag.data = fun(fid, tag, shape, rlims)
File "/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/mne/io/tag.py", line 263, in _read_simple
return _frombuffer_rows(fid, tag.size, dtype=dtype, shape=shape, rlims=rlims)
File "/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/mne/io/tag.py", line 112, in _frombuffer_rows
out = np.frombuffer(fid.read(read_size), dtype=dtype)
ValueError: buffer size must be a multiple of element size
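For context, this exact ValueError comes from numpy when the byte buffer handed to np.frombuffer is truncated mid-element, which is consistent with a corrupted or partially cached .fif file. A minimal sketch of the failure mode (the 10-byte buffer is a made-up stand-in for a truncated read):

```python
import numpy as np

# np.frombuffer requires len(buffer) to be a multiple of the dtype's itemsize.
# A truncated file read (10 bytes where 4-byte float32 elements are expected)
# reproduces the same error seen in the CI log.
buf = b"\x00" * 10  # 10 is not a multiple of 4
try:
    np.frombuffer(buf, dtype=np.float32)
except ValueError as e:
    print(e)  # buffer size must be a multiple of element size
```

This points at the file on disk being shorter than the FIF tag headers claim, rather than at a bug in the reading code itself.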
The examples/plot_bids_conversion.py example is not built for now, to avoid triggering the error in CI, but we still need to fix it.
The issue is triggered in _get_single_subject_data, which does not depend on the BIDS conversion, so it should also happen when the cache is not used. I will try to replicate it locally.
I have the same versions of mne and numpy, but I can't reproduce the error. I also tried re-downloading AlexMI, but it did not change anything.
In the same log (https://github.com/NeuroTechX/moabb/actions/runs/6532207010/job/17734953029), I see we also had an issue while reading some .mat datasets:
Unexpected failing examples:
/home/runner/work/moabb/moabb/examples/plot_vr_pc_p300_different_epoch_size.py failed leaving traceback:
Traceback (most recent call last):
File "/home/runner/work/moabb/moabb/examples/plot_vr_pc_p300_different_epoch_size.py", line 108, in <module>
X_train, y_train, _ = dataset.get_block_repetition(
File "/home/runner/work/moabb/moabb/moabb/datasets/braininvaders.py", line 936, in get_block_repetition
X, labels, meta = paradigm.get_data(self, subjects)
File "/home/runner/work/moabb/moabb/moabb/paradigms/base.py", line 278, in get_data
data = [
File "/home/runner/work/moabb/moabb/moabb/paradigms/base.py", line 279, in <listcomp>
dataset.get_data(
File "/home/runner/work/moabb/moabb/moabb/datasets/base.py", line 342, in get_data
data[subject] = self._get_single_subject_data_using_cache(
File "/home/runner/work/moabb/moabb/moabb/datasets/base.py", line 436, in _get_single_subject_data_using_cache
sessions_data = self._get_single_subject_data(subject)
File "/home/runner/work/moabb/moabb/moabb/datasets/braininvaders.py", line 899, in _get_single_subject_data
return _bi_get_subject_data(self, subject)
File "/home/runner/work/moabb/moabb/moabb/datasets/braininvaders.py", line 166, in _bi_get_subject_data
data = loadmat(os.path.join(file_path, os.listdir(file_path)[0]))["data"]
File "/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/scipy/io/matlab/_mio.py", line 227, in loadmat
matfile_dict = MR.get_variables(variable_names)
File "/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/scipy/io/matlab/_mio5.py", line 332, in get_variables
res = self.read_var_array(hdr, process)
File "/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/scipy/io/matlab/_mio5.py", line 292, in read_var_array
return self._matrix_reader.array_from_header(header, process)
File "_mio5_utils.pyx", line 666, in scipy.io.matlab._mio5_utils.VarReader5.array_from_header
File "_mio5_utils.pyx", line 695, in scipy.io.matlab._mio5_utils.VarReader5.array_from_header
File "_mio5_utils.pyx", line 769, in scipy.io.matlab._mio5_utils.VarReader5.read_real_complex
File "_mio5_utils.pyx", line 446, in scipy.io.matlab._mio5_utils.VarReader5.read_numeric
File "_mio5_utils.pyx", line 351, in scipy.io.matlab._mio5_utils.VarReader5.read_element
File "_streams.pyx", line 171, in scipy.io.matlab._streams.ZlibInputStream.read_string
File "_streams.pyx", line 164, in scipy.io.matlab._streams.ZlibInputStream.read_into
OSError: could not read bytes
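This second failure looks like the same root cause: MATLAB v5 files store variables zlib-compressed, and scipy's ZlibInputStream hits end-of-file in the middle of a compressed stream when the cached file is truncated. Plain zlib shows the analogous failure mode (the payload here is arbitrary illustration data, not real dataset content):

```python
import zlib

# Compress some data, then truncate it to mimic a partially downloaded
# or partially cached .mat file; decompression fails mid-stream.
payload = zlib.compress(b"some matlab variable data" * 100)
truncated = payload[: len(payload) // 2]
try:
    zlib.decompress(truncated)
except zlib.error as e:
    print(e)  # e.g. "incomplete or truncated stream"
```

So both tracebacks are consistent with corrupted files on disk rather than with two independent bugs in mne and scipy.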
Could this all be caused by an issue with the CI caching of the datasets, @sylvchev?
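If the CI cache is the suspect, one way to check would be to hash every cached file and diff the result between a fresh download and a restored cache. A sketch (this helper is hypothetical, not part of the moabb API, and pointing it at ~/mne_data assumes the default MNE data directory):

```python
import hashlib
from pathlib import Path


def checksum_tree(root):
    """Return {relative_path: sha256_hexdigest} for every file under root.

    Diffing this mapping between a fresh download and the CI cache
    would pinpoint which dataset files got truncated or corrupted.
    """
    out = {}
    for p in sorted(Path(root).rglob("*")):
        if p.is_file():
            out[str(p.relative_to(root))] = hashlib.sha256(p.read_bytes()).hexdigest()
    return out


# e.g. checksum_tree(Path.home() / "mne_data")
```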
Following the installation guide (Installing MOABB from sources with a development environment), I got this error when running the tests, and I thought it might be related. During installation I also received "Extra [deeplearning] is not specified." I ran the provided test command:
(mne-1.5.1_0) auan@Monday-Night:~/developery/moabb$ python -m unittest moabb.tests
E
======================================================================
ERROR: moabb (unittest.loader._FailedTest.moabb)
----------------------------------------------------------------------
ImportError: Failed to import test module: moabb
Traceback (most recent call last):
File "/home/auan/anaconda3/lib/python3.11/unittest/loader.py", line 154, in loadTestsFromName
module = __import__(module_name)
^^^^^^^^^^^^^^^^^^^^^^^
File "/home/auan/developery/moabb/moabb/__init__.py", line 4, in <module>
from .benchmark import benchmark
File "/home/auan/developery/moabb/moabb/benchmark.py", line 10, in <module>
from moabb import paradigms as moabb_paradigms
File "/home/auan/developery/moabb/moabb/paradigms/__init__.py", line 9, in <module>
from moabb.paradigms.cvep import *
File "/home/auan/developery/moabb/moabb/paradigms/cvep.py", line 5, in <module>
from moabb.datasets import utils
File "/home/auan/developery/moabb/moabb/datasets/__init__.py", line 9, in <module>
from . import compound_dataset
File "/home/auan/developery/moabb/moabb/datasets/compound_dataset/__init__.py", line 2, in <module>
from .base import CompoundDataset
File "/home/auan/developery/moabb/moabb/datasets/compound_dataset/base.py", line 5, in <module>
from ..base import BaseDataset
File "/home/auan/developery/moabb/moabb/datasets/base.py", line 13, in <module>
from moabb.datasets.bids_interface import StepType, _interface_map
File "/home/auan/developery/moabb/moabb/datasets/bids_interface.py", line 26, in <module>
import mne_bids
ModuleNotFoundError: No module named 'mne_bids'
----------------------------------------------------------------------
Ran 1 test in 0.000s
FAILED (errors=1)
Debian 12, AMD 3250U, Python 3.11
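For what it's worth, the ModuleNotFoundError above can be checked for all the relevant dependencies at once before launching the test suite. A minimal sketch (the module list is assumed from the tracebacks in this thread, not taken from moabb's pyproject):

```python
import importlib.util


def missing_modules(names):
    """Return the subset of importable-module names that are not installed."""
    return [m for m in names if importlib.util.find_spec(m) is None]


# Modules the failing imports above depend on.
needed = ["mne", "mne_bids", "pyriemann", "scipy", "numpy"]
missing = missing_modules(needed)
if missing:
    print("install first:", ", ".join(missing))
```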
After running
pip install mne_bids
pip install pyriemann
pip install edflib
I began to have different issues instead; the log is quite long (erirorror.txt). I should be able to troubleshoot otherwise, or just wait for the upcoming release.
Oops, sorry, I think it was totally unrelated. I'm pretty sure I installed something wrong.