Welcome to the ZI, here's some information to get you started.

ToDo
Do a run-through of the study with Josh (I'd suggest actually doing the study once), so you get an idea of what the experiment looked like :)
Do the EEG preprocessing task that I gave to Alper, who is currently doing an M/EEG internship with us, to get some first hands-on experience with EEG data (MEG and EEG are pretty similar)
Follow the MNE tutorials on decoding and any other tutorials you think might help you (e.g. preprocessing); there's a small sketch after this list of what that looks like in code
For decoding, consider a Coursera course on machine learning basics, or the MNE decoding materials
(optional/later) get the MEG and MRI safety instructions (so you are allowed to join measurements)
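To give you a first idea of what the MNE decoding tutorials build up to, here is a minimal sketch that filters raw data, cuts epochs, and runs time-resolved decoding. It uses MNE's bundled "sample" dataset, so the file name, stim channel, and event codes below are specific to that dataset, not to our study:

```python
# Minimal preprocessing + decoding sketch along the lines of the MNE tutorials.
# Uses MNE's bundled "sample" dataset; paths and event codes are specific to it.
import mne
from mne.datasets import sample
from mne.decoding import SlidingEstimator, cross_val_multiscore
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

data_path = sample.data_path()
raw = mne.io.read_raw_fif(
    data_path / "MEG" / "sample" / "sample_audvis_raw.fif", preload=True
)

# Basic preprocessing: band-pass filter, then cut epochs around stimulus onsets
raw.filter(0.5, 30.0)
events = mne.find_events(raw, stim_channel="STI 014")
event_id = {"auditory/left": 1, "visual/left": 3}  # decode auditory vs. visual
epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=0.5,
                    picks="meg", baseline=(None, 0), preload=True)

X = epochs.get_data()    # shape: (n_trials, n_channels, n_times)
y = epochs.events[:, 2]  # class labels per trial

# Time-resolved decoding: fit one classifier per time point
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
time_decoder = SlidingEstimator(clf, scoring="roc_auc", n_jobs=-1)
scores = cross_val_multiscore(time_decoder, X, y, cv=5).mean(axis=0)
print(f"peak AUC {scores.max():.2f} at t={epochs.times[scores.argmax()]:.3f}s")
```

The same SlidingEstimator pattern is roughly what would apply to our MEG epochs later, just with emotional conditions as labels instead of auditory/visual.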
Organizational
Gordon Feld's group is part of the Department of Clinical Psychology, which Prof. Kirsch heads. Besides that, Fungi Gerchen also has a research group in the department.
We have a weekly stand-up meeting on Tuesdays at 10:00 where Gordon's group updates the other members on what they're going to do that week, either in person or, often, via Zoom.
On Wednesdays at 9:30 there is the lab meeting, which all department members attend. Usually there is a presentation in that meeting; if you want, you can also present something at some point, e.g. your Bachelor's thesis or the results of the current project.
We mainly communicate via Rocket.Chat, an open-source Slack clone. You can log in at https://chat.zi-mannheim.de and install the app on your phone and/or laptop.
You'll find more information on the ZI intranet and/or at http://wiki.zi.local. I'll give you the password for the intranet; the wiki can only be accessed from within the ZI network.
IT stuff
Once your contract stuff is through, you'll have an IT account called vera.kluetz.
With it, you can access your mail at owa.zi-mannheim.de (or use Outlook), join Rocket.Chat, log in to any ZI laptop, and more.
Most important services are only accessible from within the ZI network, and only from ZI laptops or Citrix. If you are physically at the ZI, you can use a ZI laptop and plug it into the network; if you are at home, you can log in via the Fortinet VPN with a personal access token (which we need to request).
If you want to use your own computer, you can log in to the network via the web interface at citrix.zi-mannheim.de and from there use a Remote Desktop session to log in to a ZI-internal workstation.
For your work I would suggest klipscalc, our department's workstation with 64 cores and 256 GB of RAM; alternatively, you can also use a personal Linux VM. You can log in to klipscalc.zi.local via the program ThinLinc.
Project summary
The project you'll be working on is called Emo-React-Prestudy; it was programmed and conducted last year by Joshua Rocha, who has since started a PhD in our lab, so he is also a good person to ask if there are questions. There's a really short study description here and the master's thesis advert here. The goal is to create a classifier that can pick up on the different emotional reactions we recorded in the MEG, based on the Keltner et al. database. The project is called a "prestudy" because our actual, long-term goal is to replicate the study by Schönauer et al. (2017), in which they showed that they could differentiate memory processes during sleep based on what participants learned before sleep. We want to extend their approach and show that it also works with emotional stimuli. Before the "prestudy", we had already tried recording 3x10 nights of either positive/negative or neutral pictures right away, but did not have much luck with decoding there; you can find details of that study here: EMO-React. Last year, Esmondo (no longer in the lab) worked on the picture data and tried different classification approaches without much success, but he also didn't get very deep into it. He documented his approaches in the issues here: https://github.com/CIMH-Clinical-Psychology/EMO_REACT/issues; please have a look at them! That might be a good starting point.
However, we suspected that the pictures were not emotionally arousing enough, so we switched to GIFs, and our impression is that they do indeed elicit far stronger emotional responses. Therefore, we decided to record another dataset, this time without any sleep, just to validate that we can actually pick up on differences in emotional valence from the MEG. This is where the current dataset "Emo-React-Prestudy" comes in :)
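To make the decoding goal concrete, here is a purely hypothetical sketch of trial-wise valence classification. The file name, the "valence" metadata column, and the label scheme are invented for illustration and will not match the actual Emo-React-Prestudy layout:

```python
# Hypothetical sketch of trial-wise valence decoding from MEG epochs.
# The file name and the "valence" metadata column are invented for
# illustration; check the real dataset for the actual structure.
import mne
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

epochs = mne.read_epochs("sub-01_task-gifs-epo.fif")  # hypothetical file
# Suppose each GIF trial carries a valence label in the epoch metadata
y = (epochs.metadata["valence"] == "positive").to_numpy()

# Flatten each trial into one feature vector (channels x time points)
X = epochs.get_data().reshape(len(epochs), -1)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f} (chance ~0.5 for balanced classes)")
```

This is just one of many possible approaches; Esmondo's issues linked above document several others, and part of the project is figuring out what actually works on this data.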
Reading list:
Schönauer, M., Alizadeh, S., Jamalabadi, H., et al. (2017). Decoding material-specific memory reprocessing during sleep in humans. Nature Communications, 8, 15404. https://doi.org/10.1038/ncomms15404
Cowen, A. S., & Keltner, D. (2017). Self-report captures 27 distinct categories of emotion bridged by continuous gradients. Proceedings of the National Academy of Sciences, 114(38), E7900-E7909. https://doi.org/10.1073/pnas.1702247114
Schreiner, T., Petzka, M., Staudigl, T., et al. (2021). Endogenous memory reactivation during sleep in humans is clocked by slow oscillation-spindle complexes. Nature Communications, 12, 3112. https://doi.org/10.1038/s41467-021-23520-2
Kern, S., Nagel, J., Gerchen, M. F., Guersoy, C., Meyer-Lindenberg, A., Kirsch, P., Dolan, R. J., Gais, S., & Feld, G. B. (2023). Reactivation strength during cued recall is modulated by graph distance within cognitive maps. bioRxiv, 2023.07.31.551234. https://doi.org/10.1101/2023.07.31.551234
https://doi.org/10.1098/rstb.2019.0293