
How to remove the limit on the number of annotations? #187

Open
Sruthi-sk opened this issue Feb 24, 2023 · 5 comments

Comments

@Sruthi-sk

I have markers which I am trying to save as annotations in an EDF file:

import numpy as np
import pyedflib

# marker channel recorded alongside the EEG, shaped (1, n_samples)
marker_signal = recorded_data[marker_channel].reshape(1, -1)

signal_names = eeg_channel_names + ['marker']
signals = np.append(eeg_signals, marker_signal, axis=0)

pmin, pmax = signals.min(), signals.max()
channel_info = pyedflib.highlevel.make_signal_headers(
    signal_names, sample_frequency=sf,
    physical_min=pmin, physical_max=pmax, dimension='uV')

# Annotation format: [[onset_in_seconds, duration, description], ...]
marker_timepoints = np.nonzero(marker_signal[0] != 0)[0]
annotations = [[t / sf, 0, str(int(marker_signal[0, t]))]
               for t in marker_timepoints]

header = {'annotations': annotations, 'equipment': device}

pyedflib.highlevel.write_edf(edf_file_name, signals=signals,
                             signal_headers=channel_info, header=header,
                             file_type=-1)

[image: missing_annotations]

As I understand it, the limit on the number of annotations equals the number of seconds of data. I thought of using pyedflib.edfwriter.set_number_of_annotation_signals to raise this limit, but the documentation says the maximum is 64 annotation signals, while I need over 300 annotations in 1 minute.
I understand the alternative is simply to write the marker channel as shown in the image above, but I wanted to know if there is another way.
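For reference, here is how I would try the low-level EdfWriter route. This is a sketch, not confirmed working: `set_number_of_annotation_signals`, `setSignalHeaders`, `writeSamples`, and `writeAnnotation` are the pyedflib calls as I understand them, and the two helper function names are mine:

```python
import numpy as np

def marker_annotations(marker, sf):
    """Convert a 1-D marker channel into EDF+ annotation triples
    [onset_in_seconds, duration, description]."""
    idx = np.nonzero(marker != 0)[0]
    return [[float(i) / sf, 0, str(int(marker[i]))] for i in idx]

def write_with_extra_annotation_signals(fname, signals, signal_headers,
                                        annotations,
                                        n_annotation_signals=64):
    """Sketch: use the low-level EdfWriter so we can request more
    annotation signals (pyedflib caps this parameter at 64)."""
    import pyedflib
    writer = pyedflib.EdfWriter(fname, len(signals),
                                file_type=pyedflib.FILETYPE_EDFPLUS)
    writer.set_number_of_annotation_signals(n_annotation_signals)
    writer.setSignalHeaders(signal_headers)
    writer.writeSamples(signals)
    for onset, duration, text in annotations:
        writer.writeAnnotation(onset, duration, text)
    writer.close()
```

Even at the cap of 64 annotation signals, whether this actually lifts the limit enough depends on how annotations are distributed across data records.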

@Sruthi-sk Sruthi-sk changed the title How to remove limit on the number of annotations How to remove the limit on the number of annotations? Feb 24, 2023
@skjerns
Collaborator

skjerns commented Feb 28, 2023

I don't know exactly what's going on, but I suspect that annotations are limited by the number of records.

EDF saves data in records (i.e. blocks/chunks). The record length is usually 1 second, but it can differ. I guess only one annotation can be saved per record, but I might be wrong about that. It might be possible to reduce the record length; however, there is an open bug #159 about this, so you will probably need to install from that branch to test it. Currently we are waiting for another maintainer to approve merging the bugfix.
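The back-of-envelope arithmetic for this, under the (unconfirmed) assumption of one annotation slot per data record per annotation signal:

```python
def max_annotations(n_seconds, record_duration_s=1.0,
                    n_annotation_signals=1):
    """Rough annotation capacity, assuming one annotation per
    data record per annotation signal."""
    n_records = round(n_seconds / record_duration_s)
    return n_records * n_annotation_signals

# 60 s of data with 1-second records and one annotation signal gives
# roughly 60 slots, far short of the >300 needed; shrinking the record
# length or adding annotation signals raises the ceiling.
```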

You could try setting the record length manually to something lower and check if that allows you to write more annotations?

It is probably possible to add additional annotation channels; maybe that would multiply the number of annotations that can be saved. I am not sure whether this is implemented in pyedflib, as I haven't worked with annotations in a long time.

@Sruthi-sk
Author

Thanks, @skjerns. I don't think pyedflib lets us add additional annotation channels, but I will try the second suggestion of setting the record length manually. Did you mean that I should modify this?
record_duration = record_duration_seconds * 1000 * 100
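If that is the line in question, the factor suggests the duration is stored internally in units of 10 microseconds. That is my reading of the conversion, not confirmed behaviour:

```python
def record_duration_units(record_duration_seconds):
    """Convert a record duration in seconds to the internal unit
    implied by the * 1000 * 100 factor (one unit per 10 microseconds)."""
    return int(record_duration_seconds * 1000 * 100)

# 1-second records -> 100_000 internal units
```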

@skjerns
Collaborator

skjerns commented Mar 1, 2023

Something along those lines, yes.

However, you will probably also need to manually re-set the sampling rate/frequency and/or smp_per_record.

@kevincar

Thanks for the reference to the pull request. I was also running into this limitation. I have a system that writes data in real time to an EDF from an EEG headset/experiment.

I noticed that when the number of annotations is greater than the number of seconds in the run of data, annotations are truncated. To test this, I wrote 60 seconds of random samples and played around with how many annotations to write; past a point it simply stops writing annotations.

[image: Figure 4-2]

Would it make sense to raise an error (or at least a warning, or some sort of explanation) when the writeAnnotation function (or perhaps writeClose) is called and all data records are already full, so that the user learns that no more annotations can be written and the remainder will be truncated because there aren't enough data records to store them?
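Something like this hypothetical pre-write check illustrates the idea; the function name and the one-annotation-per-record assumption are mine, not pyedflib's:

```python
import warnings

def check_annotation_capacity(n_annotations, n_records,
                              n_annotation_signals=1):
    """Warn when more annotations are queued than the data records can
    hold, assuming one annotation per record per annotation signal."""
    capacity = n_records * n_annotation_signals
    if n_annotations > capacity:
        warnings.warn(
            f"{n_annotations} annotations but only {capacity} slots; "
            "the excess will be silently truncated")
        return False
    return True
```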

@skjerns
Collaborator

skjerns commented Nov 18, 2023

That would be a great addition! Would you feel confident enough to attempt a PR?
