
Commit

behinger committed May 10, 2024
2 parents b9b86da + cced1d2 commit e6a6648
Showing 4 changed files with 67 additions and 5 deletions.
File renamed without changes.
21 changes: 16 additions & 5 deletions README.md
@@ -1,14 +1,17 @@
# UnfoldDecode
[![Stable](https://img.shields.io/badge/docs-stable-blue.svg)](https://behinger.github.io/UnfoldDecode.jl/stable/)
[![Dev](https://img.shields.io/badge/docs-dev-blue.svg)](https://behinger.github.io/UnfoldDecode.jl/dev/)
[![Build Status](https://github.com/behinger/UnfoldDecode.jl/actions/workflows/CI.yml/badge.svg?branch=main)](https://github.com/behinger/UnfoldDecode.jl/actions/workflows/CI.yml?query=branch%3Amain)
[![Coverage](https://codecov.io/gh/behinger/UnfoldDecode.jl/branch/main/graph/badge.svg)](https://codecov.io/gh/behinger/UnfoldDecode.jl)

Beta-stage toolbox to decode ERPs with overlap, e.g. from eye-tracking experiments.

> [!WARNING]
> No unit-tests implemented as of 2024-01-09 - use at your own risk!
Toolbox to decode ERPs with overlap, e.g. from eye-tracking experiments.
Currently the following algorithms are implemented:

Currently only the overlap corrected LDA¹ proposed by [Gal Vishne, Leon Deouell et al.](https://doi.org/10.1101/2023.06.28.546397) is implemented, but more to follow.
- [back-to-back regression](https://doi.org/10.1016/j.neuroimage.2020.117028) (`solver_b2b`, [tutorial how to use](https://unfoldtoolbox.github.io/Unfold.jl/dev/HowTo/custom_solvers/#Back2Back-regression))
- overlap-corrected LDA¹ proposed by [Gal Vishne, Leon Deouell et al.](https://doi.org/10.1101/2023.06.28.546397), with more to follow.

¹ actually, any MLJ-supported classification/regression model is already supported (see the sketch after the usage example below)

@@ -22,12 +25,20 @@ uf_lda = fit(UnfoldDecodingModel,des,evt,dat,LDA(),"fixation"=>:condition)
```

This does the trick - you should probably do an Unfold.jl tutorial first, though!
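As footnote ¹ notes, the decoder is not limited to LDA: in principle any MLJ model can be dropped in. A minimal, hypothetical sketch reusing `des`, `evt` and `dat` from the example above (the SVC model and its `@load` call need the LIBSVM interface package installed and are only meant as an illustration):

```julia
using MLJ, UnfoldDecode

# load any MLJ-supported classifier instead of LDA, e.g. a support-vector classifier
SVC = @load SVC pkg=LIBSVM

# same call as above, only the model changes
uf_svc = fit(UnfoldDecodingModel, des, evt, dat, SVC(), "fixation" => :condition)
```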

## Installation
The package is not yet registered, thus you have to do:
```julia
using Pkg
Pkg.add(url="https://github.com/behinger/UnfoldDecode.jl")
using UnfoldDecode
```
Once it is registered, this will simplify to `Pkg.add("UnfoldDecode")`.

## Loading Data
Have a look at PyMNE.jl to read the data. You need a data matrix plus a DataFrames.jl event table (similar to EEGLAB's EEG.event).
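A minimal sketch of what loading could look like, assuming a PythonCall-based PyMNE.jl; the file name, latencies and event labels are purely illustrative:

```julia
using PyMNE, PythonCall, DataFrames

# read continuous EEG via MNE-Python (here from an EEGLAB .set file; the name is made up)
raw = PyMNE.io.read_raw_eeglab("sub-01.set")

# channels × time data matrix as a plain Julia array
dat = pyconvert(Matrix{Float64}, raw.get_data())

# event table with one row per event; column names depend on your design
evt = DataFrame(latency = [101, 503, 997], event = ["fixation", "stimulus", "fixation"])
```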

## Limitations
No time generalization is available, but straightforward to implement with the current tooling.
- Not thoroughly tested; no unit tests yet!
- Missing features: e.g. no time generalization is available yet, but it is straightforward to implement with the current tooling.

## Citing

2 changes: 2 additions & 0 deletions src/UnfoldDecode.jl
@@ -15,9 +15,11 @@ include("decoding.jl")
include("fit.jl")
include("helper.jl")
include("overlap_corrected.jl")
include("b2b.jl")

export UnfoldDecodingModel
export coeftable
export fit
export solver_b2b

end
49 changes: 49 additions & 0 deletions src/b2b.jl
@@ -0,0 +1,49 @@
# Code currently duplicated in Unfold.jl
# https://github.com/unfoldtoolbox/Unfold.jl/edit/main/src/solver.jl

# Basic implementation of https://doi.org/10.1016/j.neuroimage.2020.117028
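# Assumed to be provided by the enclosing module / its dependencies:
# Kfold (MLBase), Diagonal/diagind/diag (LinearAlgebra), Progress/next! (ProgressMeter),
# dropMissingEpochs, @maybe_threads and LinearModelFit (Unfold).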
solver_b2b(X, data, cross_val_reps) = solver_b2b(X, data, cross_val_reps = cross_val_reps)
function solver_b2b(
    X,
    data::AbstractArray{T,3};
    cross_val_reps = 10,
    multithreading = true,
    showprogress = true,
) where {T<:Union{Missing,<:Number}}

    X, data = dropMissingEpochs(X, data)

    # E: time × predictors × predictors influence matrix, W: time × predictors × channels
    E = zeros(size(data, 2), size(X, 2), size(X, 2))
    W = Array{Float64}(undef, size(data, 2), size(X, 2), size(data, 1))

    prog = Progress(size(data, 2) * cross_val_reps, 0.1; enabled = showprogress)
    @maybe_threads multithreading for m = 1:cross_val_reps
        # split the epochs into two random halves
        k_ix = collect(Kfold(size(data, 3), 2))
        X1 = @view X[k_ix[1], :]
        X2 = @view X[k_ix[2], :]

        for t = 1:size(data, 2)
            Y1 = @view data[:, t, k_ix[1]]
            Y2 = @view data[:, t, k_ix[2]]

            # decode: regress the design matrix onto the data of the first half
            G = (Y1' \ X1)
            # re-encode: regress the decoded predictors onto the design of the second half
            H = X2 \ (Y2' * G)

            E[t, :, :] += Diagonal(H[diagind(H)])
            ProgressMeter.next!(prog; showvalues = [(:time, t), (:cross_val_rep, m)])
        end
    end

    # average E over cross-validation repetitions, then compute W per time point
    for t = 1:size(data, 2)
        E[t, :, :] = E[t, :, :] ./ cross_val_reps
        W[t, :, :] = (X * E[t, :, :])' / data[:, t, :]
    end

    # extract diagonal
    beta = mapslices(diag, E, dims = [2, 3])
    # reshape to conform to ch x time x pred
    beta = permutedims(beta, [3 1 2])
    modelinfo = Dict("W" => W, "E" => E, "cross_val_reps" => cross_val_reps) # no history implemented (yet?)
    return LinearModelFit(beta, modelinfo)
end
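For reference, a hedged sketch of how `solver_b2b` could be passed to `Unfold.fit` as a custom solver, following the back-to-back regression tutorial linked in the README; the formula `f`, event table `evts`, epoched `data` (channels × time × epochs) and `times` vector are placeholders, and the exact `fit` signature may differ between Unfold.jl versions:

```julia
using Unfold, UnfoldDecode

# wrap solver_b2b to match the (X, data) solver interface, fixing its keyword arguments
b2b_solver = (x, y) -> solver_b2b(x, y; cross_val_reps = 5)

# mass-univariate fit on epoched data, but estimated with the B2B solver
m = fit(UnfoldModel, f, evts, data, times; solver = b2b_solver)
results = coeftable(m)
```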
