eelbrain.load.mne

Tools for importing data through mne.

events([raw, merge, proj, name, bads, ...])
    Load events from a raw fiff file.

epochs(ds[, tmin, tmax, baseline, decim, ...])
    Load epochs as NDVar.

variable_length_epochs(ds, tmin[, tmax, ...])
    Load data epochs where each epoch has a different length

mne_epochs(ds[, tmin, tmax, baseline, ...])
    Load epochs as mne.Epochs.

variable_length_mne_epochs(ds, tmin[, tmax, ...])
    Load mne Epochs where each epoch has a different length

add_epochs(ds[, tmin, tmax, baseline, ...])
    Load epochs and add them to a dataset as NDVar.

add_mne_epochs(ds[, tmin, tmax, baseline, ...])
    Load epochs and add them to a dataset as mne.Epochs.

epochs_ndvar(epochs[, name, data, exclude, ...])
    Convert an mne.Epochs object to an NDVar.

evoked_ndvar(evoked[, name, data, exclude, ...])
    Convert one or more mne Evoked objects to an NDVar.

raw_ndvar(raw[, i_start, i_stop, decim, ...])
    Raw data as NDVar

sensor_dim(info[, picks, sysname, connectivity])
    Create a Sensor dimension from an mne.Info object.

stc_ndvar(stc, subject, src[, subjects_dir, ...])
    Convert one or more mne.SourceEstimate objects to an NDVar.

forward_operator(fwd, src[, subjects_dir, ...])
    Load forward operator as NDVar

inverse_operator(inv, src[, subjects_dir, ...])
    Load inverse operator as NDVar

DatasetSTCLoader(data_dir)
    Load source estimates on disk into Dataset for use in statistical tests

Managing events with a Dataset

To load events as a Dataset:

>>> ds = load.mne.events(raw_file_path)

By default, the Dataset contains a variable called "trigger" with trigger values, and a variable called "i_start" with the indices of the events:

>>> print(ds[:10])
trigger   i_start
-----------------
2         27977
3         28345
1         28771
4         29219
2         29652
3         30025
1         30450
4         30839
2         31240
3         31665

These events can be modified in ds (adding event labels as Factor, discarding events, modifying i_start, …) before being used to load data epochs.
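For example, trigger values might be labeled with a Factor and a subset of events retained; the trigger codes and condition labels below are purely illustrative:

>>> ds['condition'] = Factor(ds['trigger'], labels={1: 'A', 2: 'B', 3: 'C', 4: 'D'})
>>> ds = ds.sub("condition != 'D'")  # discard events of one condition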

Epochs can be loaded as NDVar with load.mne.epochs(). Epochs are loaded based only on the "i_start" variable, so any modification to this variable will affect the epochs that are loaded:

>>> ds['epochs'] = load.mne.epochs(ds)
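The epoch time window and baseline correction can also be specified explicitly via the parameters listed in the summary above; the values below are only illustrative, with the baseline given as a (start, stop) tuple as in mne:

>>> ds['epochs'] = load.mne.epochs(ds, tmin=-0.1, tmax=0.6, baseline=(None, 0))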

Epochs can also be loaded as an mne-python mne.Epochs object:

>>> mne_epochs = load.mne.mne_epochs(ds)

Using Threshold Rejection

In case threshold rejection is used, the number of epochs returned by load.mne.epochs(ds, reject=reject_options) might not be the same as the number of events in ds (whenever epochs are rejected). For those cases, load.mne.add_epochs() will automatically resize the Dataset:

>>> epoch_ds = load.mne.add_epochs(ds, -0.1, 0.6, reject=reject_options)

The returned epoch_ds will contain the epochs as an NDVar, ds['meg']. If no epochs were rejected during loading, the length of epoch_ds is identical to that of the input ds. If epochs were rejected, epoch_ds is a correspondingly shorter copy of the original ds.
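A quick way to check whether anything was dropped is to compare the number of cases (a sketch; reject_options stands for whatever rejection threshold is appropriate for the data):

>>> epoch_ds = load.mne.add_epochs(ds, -0.1, 0.6, reject=reject_options)
>>> epoch_ds.n_cases == ds.n_cases  # False if any epochs were rejected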

mne.Epochs can be added to ds in the same fashion with:

>>> ds = load.mne.add_mne_epochs(ds, -0.1, 0.6, reject=reject_options)

Separate events files

If events are stored separately from the raw files, they can be loaded with load.mne.events() by supplying the path to the events file as the events parameter:

>>> ds = load.mne.events(raw_file_path, events=events_file_path)

Loading source estimates into a Dataset

Previously exported stc files can be loaded into a Dataset with the DatasetSTCLoader class. The stcs must reside in subdirectories named by condition. Supply the path to the data, and the constructor will detect the factors’ levels from the names of the condition directories. Call set_factor_names() to indicate the names of the experimental conditions, and finally load the data with make_dataset().

>>> loader = load.mne.DatasetSTCLoader("path/to/exported/stcs")
>>> loader.set_factor_names(["factor1", "factor2"])
>>> ds = loader.make_dataset(subjects_dir="mri/")