Issue with reading intan file
Hi, I am trying to read an Intan .rhs file as shown in the code below, but I get the following errors. Could you help me figure out what's going wrong here?
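(The original code snippet and traceback are not preserved in this thread. For context, a read along those lines would look roughly like the sketch below; the file name is a placeholder and the stream choice depends on the recording.)

```python
import spikeinterface.extractors as se

# Placeholder path; for Intan files, stream_id "0" is usually the amplifier stream.
recording = se.read_intan("merged_overnight_part1.rhs", stream_id="0")
```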
@kshtjkumar, it looks like the memmap is failing.
Is it possible this file is corrupted? What version of RHS is it? We haven't updated the RHS portion of read_intan for a while. It might be best to open an issue on Neo; over there, if you can share the file, we can see if we can figure out what the problem is.
Also, just to confirm: are you using the Intan format where all the data is in a single file, i.e. all of your data is in the .rhs file?
@h-mayorquin, just an FYI: the general workflow is that you choose one-file vs. multi-file saving and how often you want to split the file (i.e. make a new file every 30 minutes), and then you give the recording a name. For one-file (header-attached), the .rhd/.rhs gets that name plus a timestamp. For multi-file (one-file-per-channel or one-file-per-signal), the root folder gets the name plus a timestamp. So based on the file naming convention we would guess this file comes from a single-file recording, because the only way it could come from a multi-file recording would be if the user edited the info file inside the multi-file folder (which we don't support at all). That said, because there is text after the timestamp, we know the file name was edited later (the "part1" was added on). I'm just worried there was some attempt to crop the file, leading to corruption.
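For illustration, with hypothetical names, the two layouts look like:

```
myrecording_230101_120000.rhs    # one-file (header-attached): a single .rhs per split
myrecording_230101_120000/       # multi-file: folder with the info file plus per-channel / per-signal data files
```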
Yes, it's a single merged file.
I don't think it's corrupted, because the timestamps are correct and the individual 60-second files open without any issue!
How did you merge? Based on this, it seems like the way we construct the memmap may not work with the way you merged.
You could instead load each file individually and then merge them in SpikeInterface, which could get rid of this problem.
Intan has its own merger tool on its website; I have been using that for almost all files! Intan stores each 60 seconds of recording as one file, and the recording I am using is an overnight recording.
Yeah, I've seen their merger tool, but I'm wondering if it changes the structure of the header, which would lead to Neo not working for these files. Since SpikeInterface uses Neo under the hood, we would need to see if we could patch this at the Neo level. The error you're getting says that the information Neo extracts from the merged header doesn't match the size of the actual data, so this could be an issue at the Neo level. If you don't want to go through that process (i.e. submit an issue on Neo and share the merged file so we can debug), my advice would be to just iterate through all the files, make a bunch of recording objects, and then merge them within SpikeInterface, rather than trying to load the one giant merged file.
Which version of the software are you using? With RHX (their newest version, 3.0) you can change a setting so that each file is however long you want (for example, create a new file only every 3 hours), which makes merging easier and is less prone to this issue of an incorrect number of bytes in the file.
Ok, I am unaware of the version number, but I am assuming it's the older one from 2022! I will certainly update that. Could you help me with how to combine them from the folder using SpikeInterface? I can try that and see if it works for me.
So you would basically just make one recording per file. You could do something like:
```python
import spikeinterface as si
import spikeinterface.extractors as se

list_of_files = ['file1.rhs', 'file2.rhs', 'file3.rhs']
list_of_recordings = []
for file in list_of_files:
    list_of_recordings.append(se.read_intan(file))  # pass stream_id etc. as needed
recording = si.concatenate_recordings(list_of_recordings)
```
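Since the files live in a folder (as asked above), the list would more likely be built programmatically; a possible sketch, with a hypothetical folder name and glob pattern (check that sorting by name matches the recording order):

```python
from pathlib import Path

# Intan names its 60 s files with timestamps, so a lexicographic sort
# normally matches the recording order.
folder = Path("overnight_recording")
list_of_files = sorted(folder.glob("*.rhs"))
```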
Then you would just attach your probe to the recording normally.
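As a rough sketch of that probe step (assuming probeinterface and a toy linear layout; the real geometry and channel wiring depend on your hardware):

```python
from probeinterface import generate_linear_probe

# Toy linear probe just to illustrate the API; replace with your actual probe.
probe = generate_linear_probe(num_elec=recording.get_num_channels(), ypitch=20)
probe.set_device_channel_indices(range(recording.get_num_channels()))
recording = recording.set_probe(probe)
```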
For your future info @kshtjkumar
For @h-mayorquin, it appears based on their simulator that it may be possible to save .rhs files also as one-file-per-channel etc. I guess we will eventually need to update RHS for the extra formats at some point :)
@zm711 This is REALLY REALLY useful for me and I think for the other users.
Awesome! I will try this!
Thank you for this! 👍
@kshtjkumar let us know if that works for you.
@zm711 maybe we should add a small "how to" to the corresponding section and if this works for @kshtjkumar we can close this issue after adding the documentation for this. What do you think?
You think an Intan-specific how-to? I don't know if we have any other Neo-based file formats where this would be the necessary solution. I agree we need to document it, but where? (Agree to keep this open until we have it documented.) :)
No, I think a general "how to concatenate recordings if your data is contiguous in time but divided across many files" with the code you shared above:
```python
list_of_files = ['file1.rhs', 'file2.rhs', 'file3.rhs']
list_of_recordings = []
for file in list_of_files:
    list_of_recordings.append(se.read_intan(file))  # pass stream_id etc. as needed
recording = si.concatenate_recordings(list_of_recordings)
```
We can then mention the Intan case as an example of this.
Similarly, I wrote up the channel case in a docstring for NeuroExplorer:
https://github.com/SpikeInterface/spikeinterface/blob/2983463b655a6f10130b8017913f2ca90d4fd9fe/src/spikeinterface/extractors/neoextractors/neuroexplorer.py#L22-L33
That should be in a how-to as well, I think.
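For reference, that channel case (one stream per channel that should be stitched side by side rather than end to end) is typically handled with channel aggregation; a hedged sketch, with hypothetical file and stream names:

```python
import spikeinterface as si
import spikeinterface.extractors as se

# Hypothetical stream names; list the available streams first if unsure.
stream_names = ["signal0", "signal1"]
recordings = [se.read_neuroexplorer("data.nex", stream_name=name) for name in stream_names]
recording = si.aggregate_channels(recordings)
```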
Yeah that sounds like a good idea! As soon as @kshtjkumar confirms that works we can add that!
Hi, yes, I tried it with multiple files and it works perfectly for merging the .rhs files.
Awesome! We will work on the docs so we will close this when the docs are done :)
