asammdf GUI does not extract bus signals in specific conditions without warning
Background:
I am parsing a candump file into an .mf4 file. I took this as an example. When I extract the bus signals of the generated file, it works perfectly fine. However, the mdf.extend() function is really slow (3.8M lines: >1 h), so I decided to collect all data into arrays and append the signal only once. This turned out to be much faster (3.8M lines: 25 s).
Issue: If I try to extract the bus signals from the latter file, it seems to load the .dbc files, but after that the loading bar disappears and nothing happens. I would expect to get a message if there is an error.
Observations:
It seems that, for some reason, the data is saved in a different manner. For example, in the tabular window the BusChannel column shows e.g. 1 in the first case, but [1] in the latter.
So it is actually a GUI issue, plus the question of how to store the data so that it can be processed properly :)
OS: Win11, GUI: 7.5.0.dev2, asammdf==8.2.7
Maybe this gives additional hints as to where my problem comes from:
If I open the saved broken file with Python, I get the following error:
asammdf - ERROR - Error during CAN logging processing: Traceback (most recent call last):
File "C:\LocalProg\WPy64-311602\python-3.11.6.amd64\Lib\site-packages\asammdf\blocks\mdf_v4.py", line 10837, in _process_bus_logging
self._process_can_logging(index, group)
File "C:\LocalProg\WPy64-311602\python-3.11.6.amd64\Lib\site-packages\asammdf\blocks\mdf_v4.py", line 10957, in _process_can_logging
bus_msg_ids = msg_ids[bus_ids == bus]
~~~~~~~^^^^^^^^^^^^^^^^
IndexError: too many indices for array: array is 1-dimensional, but 2 were indexed
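The mechanism behind this traceback can be reproduced in plain NumPy. The variable names below mirror the traceback; that bus_ids came back 2-D (because the samples were stored as an (N, 1) array) is my assumption based on the error message:

```python
import numpy as np

# Hypothetical stand-ins for asammdf's internal arrays: msg_ids is 1-D,
# but bus_ids came out 2-D because the samples were stored with shape (N, 1).
msg_ids = np.array([0x100, 0x200, 0x300], dtype="<u4")
bus_ids = np.array([[1], [1], [2]], dtype="<u1")  # shape (3, 1) instead of (3,)

try:
    bus_msg_ids = msg_ids[bus_ids == 1]  # 2-D boolean mask on a 1-D array
except IndexError as exc:
    # too many indices for array: array is 1-dimensional, but 2 were indexed
    print(exc)

# With a flat (3,) bus_ids the same expression works fine:
bus_msg_ids = msg_ids[bus_ids.ravel() == 1]
print(bus_msg_ids)  # [256 512]
```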
You probably have the wrong array dtypes when you append the new signals
I used np.array(samples, dtype=STD_DTYPE) with
STD_DTYPE = np.dtype(
    [
        ("CAN_DataFrame.BusChannel", "<u1"),
        ("CAN_DataFrame.ID", "<u4"),
        ("CAN_DataFrame.IDE", "<u1"),
        ("CAN_DataFrame.DLC", "<u1"),
        ("CAN_DataFrame.DataLength", "<u1"),
        ("CAN_DataFrame.DataBytes", "(64,)u1"),
        ("CAN_DataFrame.Dir", "<u1"),
        ("CAN_DataFrame.EDL", "<u1"),
        ("CAN_DataFrame.BRS", "<u1"),
        ("CAN_DataFrame.ESI", "<u1"),
    ]
)
in both cases.
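The dtype is indeed identical in both cases; what differs is the shape of the stacked array. A minimal sketch of the append pattern (dtype shortened to two fields for the demo) shows the difference, and presumably also explains why the tabular window shows [1] instead of 1 — each cell holds a length-1 array:

```python
import numpy as np

STD_DTYPE = np.dtype(
    [("CAN_DataFrame.BusChannel", "<u1"), ("CAN_DataFrame.ID", "<u4")]
)  # shortened for the demo

# Each loop iteration appends a shape-(1,) record array ...
samples = [np.zeros(1, dtype=STD_DTYPE) for _ in range(3)]

# ... so stacking the list yields a 2-D column, not a flat record array:
stacked = np.array(samples, dtype=STD_DTYPE)
print(stacked.shape)            # (3, 1)
print(stacked.flatten().shape)  # (3,) -- the shape bus logging can process
```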
I saw that mdf.extend() takes invalidation_bits. Do I have to pass them to the signal somehow if I collect the data manually?
This is the code I use to store the data:
def write_mf4_file(messages, output_file):
    """Write messages to an mf4 file using asammdf."""
    samples = []
    timestamps = []
    for timestamp, can_bus, can_id, data in tqdm(messages, position=0, leave=True):
        std_buffer = np.zeros(1, dtype=STD_DTYPE)
        channel = int(can_bus[3]) + 1
        std_buffer["CAN_DataFrame.BusChannel"] = channel
        std_buffer["CAN_DataFrame.ID"] = can_id
        std_buffer["CAN_DataFrame.IDE"] = 0 if channel == 3 else 1
        std_buffer["CAN_DataFrame.Dir"] = 0  # RX
        size = len(data)
        std_buffer["CAN_DataFrame.DataLength"] = size
        std_buffer["CAN_DataFrame.DataBytes"][0, :size] = bytearray(data)
        std_buffer["CAN_DataFrame.DLC"] = size
        std_buffer["CAN_DataFrame.ESI"] = 0
        std_buffer["CAN_DataFrame.BRS"] = 0
        std_buffer["CAN_DataFrame.EDL"] = 0
        timestamps.append(timestamp)
        samples.append(std_buffer)
        # print(timestamp, can_bus, can_id, data)

    mdf = MDF(version="4.11")
    attachment = None
    acquisition_source = SourceInformation(
        source_type=SOURCE_BUS, bus_type=BUS_TYPE_CAN
    )
    mdf.append(
        Signal(
            name="CAN_DataFrame",
            samples=np.array(samples, dtype=STD_DTYPE),
            timestamps=np.array(timestamps, dtype="<f8"),
            attachment=attachment,
            source=acquisition_source,
        )
    )
    # Save the mf4 file
    mdf.save(output_file, overwrite=True)
    mdf.close()
I found that flattening the array does the trick:
mdf.append(
    Signal(
        name="CAN_DataFrame",
        samples=np.array(samples, dtype=STD_DTYPE).flatten(),
        timestamps=np.array(timestamps, dtype="<f8"),
        attachment=attachment,
        source=acquisition_source,
    )
)
However, the issue remains that asammdf GUI (v7.5.0.dev2) shows no warning message when extraction fails.
How about this:
def write_mf4_file(messages, output_file):
    """Write messages to an mf4 file using asammdf."""
    timestamps = []
    std_buffer = np.zeros(len(messages), dtype=STD_DTYPE)
    for i, (timestamp, can_bus, can_id, data) in enumerate(tqdm(messages, position=0, leave=True)):
        channel = int(can_bus[3]) + 1
        std_buffer["CAN_DataFrame.BusChannel"][i] = channel
        std_buffer["CAN_DataFrame.ID"][i] = can_id
        std_buffer["CAN_DataFrame.IDE"][i] = 0 if channel == 3 else 1
        std_buffer["CAN_DataFrame.Dir"][i] = 0  # RX
        size = len(data)
        std_buffer["CAN_DataFrame.DataLength"][i] = size
        std_buffer["CAN_DataFrame.DataBytes"][i][0, :size] = bytearray(data)
        std_buffer["CAN_DataFrame.DLC"][i] = size
        std_buffer["CAN_DataFrame.ESI"][i] = 0
        std_buffer["CAN_DataFrame.BRS"][i] = 0
        std_buffer["CAN_DataFrame.EDL"][i] = 0
        timestamps.append(timestamp)

    mdf = MDF(version="4.11")
    attachment = None
    acquisition_source = SourceInformation(
        source_type=SOURCE_BUS, bus_type=BUS_TYPE_CAN
    )
    mdf.append(
        Signal(
            name="CAN_DataFrame",
            samples=std_buffer,
            timestamps=np.array(timestamps, dtype="<f8"),
            attachment=attachment,
            source=acquisition_source,
        )
    )
    # Save the mf4 file
    mdf.save(output_file, overwrite=True)
    mdf.close()
This produces the error:
File c:\users\i022056\onedrive - claas\dokumente\masterarbeit\16_poc2\06_testdaten\candump2mdf - kopie.py:79, in write_mf4_file(messages, output_file)
77 size = len(data)
78 std_buffer["CAN_DataFrame.DataLength"][i] = size
---> 79 std_buffer["CAN_DataFrame.DataBytes"][i][0, :size] = bytearray(data)
80 std_buffer["CAN_DataFrame.DLC"][i] = size
81 std_buffer["CAN_DataFrame.ESI"][i] = 0
IndexError: too many indices for array: array is 1-dimensional, but 2 were indexed
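The failing line can be fixed by indexing the field in one step: std_buffer["CAN_DataFrame.DataBytes"] is already a 2-D (N, 64) array, so row i is 1-D, and the extra [0, :size] index then overruns it. A minimal sketch (dtype shortened to one field for the demo):

```python
import numpy as np

STD_DTYPE = np.dtype([("CAN_DataFrame.DataBytes", "(64,)u1")])  # shortened

std_buffer = np.zeros(2, dtype=STD_DTYPE)  # field has shape (2, 64)
data = b"\x11\x22\x33"
size = len(data)
i = 0

# Pick the row and the columns in a single index expression:
std_buffer["CAN_DataFrame.DataBytes"][i, :size] = bytearray(data)
print(std_buffer["CAN_DataFrame.DataBytes"][i, :size])  # [17 34 51]
```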