interactiveADELE (Automated Documenter to Extract Learning from EEG) is a webapp for processing and interactively visualizing data from various biometric sensors in a holistic manner. Currently, the following sensing packages and data are supported:
- Cognionics 64-channel EEG headset (although any 64-electrode setup that produces `.eeg`, `.vhdr`, and `.vmrk` files should work)
- Tobii Pro Glasses 2 (provides audio and POV + eye videos)
The most direct way to set up the environment for running ADELE is to create it from the `f22_env.yml` file through conda:

```
conda env create -f f22_env.yml
conda activate intAdele
```

Note: `intAdele` is the name of the created environment.
Then, place the relevant files in the `static` folder, in a subfolder titled `YYYY_MM_DD_TXX_UZZZ` (year, month, day, trial, user). As mentioned earlier, the relevant files may consist of the following:
- `.vhdr`, `.eeg`, and `.vmrk` files for EEG data
- `fullstream` and `eyesstream` video files (if captured) from the eye-tracking glasses
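A small sketch of validating the trial-folder naming convention described above. The helper name and the exact digit counts (two-digit trial, three-digit user) are assumptions inferred from the `YYYY_MM_DD_TXX_UZZZ` placeholder, not part of ADELE's actual code:

```python
import re

# Hypothetical helper: the pattern is inferred from the YYYY_MM_DD_TXX_UZZZ
# placeholder (assumes a two-digit trial id and a three-digit user id).
FOLDER_RE = re.compile(r"^(\d{4})_(\d{2})_(\d{2})_T(\d{2})_U(\d{3})$")

def parse_trial_folder(name):
    """Split a trial folder name into its date, trial, and user components."""
    m = FOLDER_RE.match(name)
    if m is None:
        raise ValueError(f"{name!r} does not match YYYY_MM_DD_TXX_UZZZ")
    year, month, day, trial, user = (int(g) for g in m.groups())
    return {"year": year, "month": month, "day": day,
            "trial": trial, "user": user}
```

For example, `parse_trial_folder("2022_09_14_T01_U003")` yields `{"year": 2022, "month": 9, "day": 14, "trial": 1, "user": 3}`.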
The provided data may need further pre-processing before being visualized with Dash Plotly. Currently, that functionality lives in the `filePreProcessing.ipynb` notebook, which performs tasks such as audio transcription and sentiment analysis, and outputs files that are used by the webapp. This pre-processing must be performed before running the visualizations.
Finally, the `dashboard_f22.py` script needs to be given the name of the data folder (in the format above, `YYYY_MM_DD_TXX_UZZZ`) and run using:

```
python dashboard_f22.py
```
After opening the webapp, most likely at http://127.0.0.1:8050/, it should display something similar to the following gif:
Below is an explanation of the different components of the webapp that may be visible (depending on the input data):
- Title:
- Trial name and metric used to create epochs
- Slider:
- Select between different epochs of the event, placed in chronological order.
- Epochs are chosen with a decision-tree-based method; the number of epochs is customizable as a parameter, along with a biometric such as heart rate, skin temperature, etc.
- Defaults to heart rate.
- 3D Brain:
- Displays the three most (red) and least (blue) active electrodes, within the Theta band (4-8 Hz)
- Network of Electrodes:
- Displays the most and least active electrodes across the whole event, within the Alpha (8-12 Hz) and Beta (12-30 Hz) bands.
- Electrodes with black borders are the ones relevant to the selected epoch. The thickness of an edge in the network signifies how frequently 'connections' were made between the two electrodes it joins.
- Report:
- Provides information on biometrics such as heart rate and blinks (if available) for the epoch.
- Displays information on high and low activity electrodes for the epoch, the Brodmann areas the electrodes correspond to, and in turn, the primary brain regions (and their functions) the Brodmann areas correspond to.
- Graphs of Biometrics:
- Plot of a specified biometric for the epoch window.
- Video Streams and POV view:
- Synchronized videos of the POV of the subject and their eyes.
- Blink Detection:
- The current iteration of this panel plots the EAR (Eye Aspect Ratio) value, a measure of how closed the eye is based on key-point detection; troughs indicate a blink.
- Eye stream must be present in the data, and the video needs to be analyzed prior to running the visualization.
- Soon to be replaced with a more robust, but more time-consuming, method for blink detection.
- Speech Transcription, Speaker Diarization, Sentiment Analysis:
- Transcribes audio data collected from Tobii Pro Glasses using the Whisper library
- Speaker diarization, the act of separating different speaker sources, is performed using PyAnnote's speaker diarization pipeline
- Sentiment analysis is generated with a Hugging Face transformer
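The 3D Brain and Network panels above rank electrodes by their activity within a frequency band. A minimal, stdlib-only sketch of that kind of ranking is shown below; the real app presumably uses NumPy/MNE-style tooling, and the function names, the naive DFT, and the ranking criterion here are illustrative assumptions rather than ADELE's actual implementation:

```python
import math
import cmath

def band_power(signal, fs, f_lo, f_hi):
    """Mean power of the naive-DFT bins falling in [f_lo, f_hi) Hz.

    Illustrative stdlib-only DFT; a real pipeline would use an FFT
    (e.g. Welch's method). `fs` is the sampling rate in Hz.
    """
    n = len(signal)
    powers = []
    for k in range(n // 2 + 1):  # non-negative frequencies only
        freq = k * fs / n
        if f_lo <= freq < f_hi:
            coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            powers.append(abs(coeff) ** 2 / n)
    return sum(powers) / len(powers) if powers else 0.0

def rank_electrodes(channels, fs, f_lo=4.0, f_hi=8.0):
    """Sort electrode names by band power (Theta by default), most active first."""
    scored = {name: band_power(sig, fs, f_lo, f_hi)
              for name, sig in channels.items()}
    return sorted(scored, key=scored.get, reverse=True)
```

For instance, an electrode carrying a 6 Hz oscillation would rank above a flat channel when scored on the Theta band (4-8 Hz), and the same signal would score higher in Theta than in Alpha (8-12 Hz).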
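The trough-based blink detection described in the Blink Detection panel can be sketched as a simple falling-edge threshold test on the EAR series. The threshold value and function name below are illustrative assumptions, not the app's actual parameters:

```python
def detect_blinks(ear_series, threshold=0.2):
    """Return indices where the EAR value first drops below `threshold`.

    Each falling edge (EAR crossing from >= threshold to < threshold)
    is counted as one blink; the threshold is an illustrative value.
    """
    blinks = []
    below = False
    for i, ear in enumerate(ear_series):
        if ear < threshold and not below:
            blinks.append(i)   # start of a trough -> one blink
            below = True
        elif ear >= threshold:
            below = False      # eye reopened; ready for the next trough
    return blinks
```

For example, `detect_blinks([0.30, 0.31, 0.12, 0.10, 0.30, 0.30, 0.15, 0.30])` returns `[2, 6]`: two troughs, each counted once even though the first spans two frames.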