This repo contains:

- `electron-lsl-demo/` — Electron app that publishes two LSL marker streams:
  - `MouseEvents` (JSON markers `move` and `click`; includes `t_ms`, `x`, `y`, `btn`)
  - `MicMarkers` (manual `mic_start`/`mic_stop` markers)
  - Also includes `lsl-bridge.py` (Python WebSocket → LSL bridge)
- `xdf-analysis/` — Python tools to inspect XDF files (`analyze_xdf.py`, `requirements.txt`)
This README explains how to run the demo, record with LabRecorder, and analyze the saved XDF recordings.
- Node.js + npm (for Electron)
- Python 3.8+ (for the bridge and analysis)
- LabRecorder (to record streams to `.xdf`) — see links below
LabRecorder resources:
- GitHub (source / releases): https://github.com/labstreaminglayer/App-LabRecorder
- liblsl releases (LSL runtime library): https://github.com/sccn/liblsl/releases
Install LabRecorder first (download the appropriate Windows binary and install it).
- Install Node dependencies for the Electron app:

  ```powershell
  cd 'C:\dev\scratch\LSL Test\electron-lsl-demo'
  npm install
  ```

- Prepare a Python environment for the analyzer (recommended: venv):

  ```powershell
  cd 'C:\dev\scratch\LSL Test\xdf-analysis'
  python -m venv .venv
  .\.venv\Scripts\Activate.ps1
  python -m pip install --upgrade pip
  pip install -r requirements.txt
  ```

- Install Python dependencies for the bridge (if you will run it manually; used in the steps below):

  ```powershell
  python -m pip install --user pylsl websockets
  ```

You can run both the Python LSL bridge and the Electron app together manually in two terminals.
Terminal 1 — start the Python bridge:

```powershell
cd 'C:\dev\scratch\LSL Test\electron-lsl-demo'
python .\lsl-bridge.py
```

Terminal 2 — start Electron:

```powershell
cd 'C:\dev\scratch\LSL Test\electron-lsl-demo'
npm start
```

Notes:
- The bridge publishes real LSL marker streams via `pylsl`. The preload will use the native Node LSL addon if it is installed; otherwise it sends markers to the bridge.
- If the Electron window shows `LSL: fallback (no network)`, the bridge isn't connected. Restart the bridge and then restart Electron (or check the DevTools console for WebSocket errors).
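For orientation, the bridge pattern can be sketched roughly as below: accept JSON markers over a WebSocket and republish each one on an LSL string-marker outlet. This is a simplified illustration, not the contents of `lsl-bridge.py`; the port (8765), the `source_id` values, and the mic/mouse routing rule are assumptions made for the sketch.

```python
# Rough sketch of a WebSocket -> LSL marker bridge (illustrative only; see
# lsl-bridge.py for the real implementation). Port, source_id values, and the
# routing rule are assumptions.
import asyncio
import json

import websockets
from pylsl import StreamInfo, StreamOutlet

# One irregular-rate string channel per marker stream.
mouse_outlet = StreamOutlet(StreamInfo("MouseEvents", "Markers", 1, 0, "string", "demo-mouse"))
mic_outlet = StreamOutlet(StreamInfo("MicMarkers", "Markers", 1, 0, "string", "demo-mic"))

async def handle(websocket):
    # Recent websockets versions call the handler with just the connection;
    # older versions also pass a `path` argument.
    async for message in websocket:
        marker = json.loads(message)
        payload = json.dumps(marker)
        # Route mic_* markers to MicMarkers, everything else to MouseEvents.
        if str(marker.get("kind", "")).startswith("mic_"):
            mic_outlet.push_sample([payload])
        else:
            mouse_outlet.push_sample([payload])

async def main():
    async with websockets.serve(handle, "127.0.0.1", 8765):
        await asyncio.Future()  # run until interrupted

if __name__ == "__main__":
    asyncio.run(main())
```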
- Launch the LabRecorder application.
- Click Update in the LabRecorder window. You should see the streams:
  - `MouseEvents`
  - `MicMarkers`
- Check those streams (tick the boxes), choose a save folder, then click Start.
- For your setup the save path (example) is:
  `C:\Users\vdingram\Documents\CurrentStudy\sub-P001\ses-S001\eeg`
  LabRecorder will save an XDF file in that folder (the filename depends on LabRecorder settings and your Study template).
- Interact with the Electron window while recording:
  - Move the mouse (`move` markers)
  - Click (`click` markers)
  - Toggle the Mic button (pushes `mic_start`/`mic_stop` markers)
- Stop recording in LabRecorder → this finalizes the `.xdf`.
Important:

- The `mic` markers are manual event markers (NOT audio). Use them as trial or block boundaries (e.g., `mic_start` marks the start of a response window; `mic_stop` marks the end).
- If LabRecorder does not list the streams after clicking Update:
  - Ensure the bridge is running (start it before Electron).
  - Allow `python.exe` and `electron.exe` through Windows Firewall.
  - Use the pylsl check (below) to confirm the streams exist.
Quick stream check (run in any terminal while the bridge and Electron are running):

```powershell
python -c "from pylsl import resolve_streams; print([s.name() for s in resolve_streams()])"
```

After LabRecorder saves the XDF (example path above), move or copy that `.xdf` into the `xdf-analysis` folder so the analysis tools can find it easily.
Example (PowerShell):

```powershell
# adjust the file name to the actual saved file; $env:USERNAME expands to your Windows user name
Copy-Item "C:\Users\$env:USERNAME\Documents\CurrentStudy\sub-P001\ses-S001\eeg\sub-P001_ses-S001_task-YourTask_run-001_eeg.xdf" `
  "C:\dev\scratch\LSL Test\xdf-analysis\test.xdf"
```

(You can also use `Move-Item` instead of `Copy-Item` if you want to relocate the file.)
Activate the analysis venv (if not already active) and run the analyzer:

```powershell
cd 'C:\dev\scratch\LSL Test\xdf-analysis'
.\.venv\Scripts\Activate.ps1
python .\analyze_xdf.py --file ".\test.xdf"
python .\analyze_xdf.py --file ".\test.xdf" --show-all
```

- Timeline (moves + clicks; mic windows shaded):

  ```powershell
  python .\analyze_xdf.py --file ".\test.xdf" --timeline --plot
  ```

- Heatmap of mouse positions:

  ```powershell
  python .\analyze_xdf.py --file ".\test.xdf" --heatmap --plot
  ```

- Event-rate timeseries (events per second by kind):

  ```powershell
  python .\analyze_xdf.py --file ".\test.xdf" --event-rate --plot
  ```

- Compute reaction times (for each `mic_start` → first `click` after it, printed in ms; see the sketch after this list):

  ```powershell
  python .\analyze_xdf.py --file ".\test.xdf" --reaction-times
  ```

- Combine plots:

  ```powershell
  python .\analyze_xdf.py --file ".\test.xdf" --timeline --heatmap --event-rate --plot
  ```

Notes:

- `--plot` is a convenience flag: it causes the timeline/heatmap/event-rate plots (depending on the flags given) to render and show.
- If plotting libraries are missing, the script will print an informative message; make sure `matplotlib` and `seaborn` are installed inside the venv.
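For reference, the pairing that `--reaction-times` reports can be sketched as follows. The timestamps here are synthetic; the real implementation lives in `analyze_xdf.py` and operates on the recorded marker streams (loading is sketched near the end of this README).

```python
# Illustrative pairing for --reaction-times: for each mic_start, take the first
# click that follows it and report the difference in milliseconds.
# Synthetic timestamps stand in for the recorded MicMarkers/MouseEvents streams.
mic_starts = [2.0, 10.0, 18.5]       # LSL timestamps (seconds) of mic_start markers
clicks = [2.8, 3.1, 11.2, 19.0]      # LSL timestamps (seconds) of click markers

for start in sorted(mic_starts):
    # First click at or after this mic_start, if any.
    first_click = next((c for c in sorted(clicks) if c >= start), None)
    if first_click is None:
        print(f"mic_start at {start:.3f} s: no click followed")
    else:
        print(f"mic_start at {start:.3f} s: reaction time {(first_click - start) * 1000:.1f} ms")
```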
- Ensure LabRecorder is installed.
- Start the demo:

  ```powershell
  cd 'C:\dev\scratch\LSL Test\electron-lsl-demo'
  npm start
  ```

- Open LabRecorder, click Update, select `MouseEvents` and `MicMarkers`, and set the save folder to:
  `C:\Users\vdingram\Documents\CurrentStudy\sub-P001\ses-S001\eeg`
- Click Start in LabRecorder and run the task in the Electron window (move/click, toggle mic).
- Click Stop in LabRecorder. Copy the saved `.xdf` into `xdf-analysis` as `test.xdf` (see the copy command above).
- Activate the venv and run the analyzer:

  ```powershell
  cd 'C:\dev\scratch\LSL Test\xdf-analysis'
  .\.venv\Scripts\Activate.ps1
  python .\analyze_xdf.py --file ".\test.xdf" --timeline --heatmap --reaction-times --plot
  ```
- Bridge not connected:
  - Start the Python bridge before starting Electron.
  - Check the Electron DevTools (Ctrl+Shift+I) Console for WebSocket errors.
- Streams still not visible in LabRecorder:
  - Check that `pylsl` resolves the streams (see the quick stream check above).
  - Check whether Windows Firewall or antivirus is blocking UDP multicast/discovery.
- Analyzer errors:
  - Activate the venv and confirm dependencies are installed (`pip install -r requirements.txt`).
  - XDF file locked by LabRecorder — ensure LabRecorder has finished/stopped recording.
- `MouseEvents` payload example: `{"t_ms": 21020.7, "kind": "move", "x": 403, "y": 518}`
  - `t_ms`: `performance.now()` in milliseconds (app-local, monotonic)
  - `kind`: `move` or `click`
  - `x`, `y`: pixel coordinates
  - `btn`: mouse button for clicks (0 = left, etc.)
- `MicMarkers` payload example: `{"t_ms": 24129, "kind": "mic_start"}`
  - These are manual markers (no audio). Use `mic_start`/`mic_stop` as trial/recording window boundaries or as anchors to compute reaction times.
- `analyze_xdf.py` uses `pyxdf.load_xdf(..., synchronize_clocks=True)` to align LSL clocks and make timestamps comparable across streams (see the sketch below).
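A minimal loading sketch, assuming the copied `test.xdf` from the steps above; field access follows the standard `pyxdf` stream layout (`info`, `time_series`, `time_stamps`), and the per-marker JSON parsing mirrors the payload examples above.

```python
# Minimal sketch: load the recording and print the first few markers per stream.
# Assumes test.xdf is in the current directory (copied as in the steps above).
import json

import pyxdf

streams, header = pyxdf.load_xdf("test.xdf", synchronize_clocks=True)

for stream in streams:
    name = stream["info"]["name"][0]          # e.g. "MouseEvents" or "MicMarkers"
    samples = stream["time_series"]           # list of 1-element lists of JSON strings
    timestamps = stream["time_stamps"]        # LSL timestamps, aligned across streams
    print(f"{name}: {len(timestamps)} markers")
    for sample, ts in zip(samples[:3], timestamps[:3]):
        marker = json.loads(sample[0])        # e.g. {"t_ms": 21020.7, "kind": "move", ...}
        print(f"  {ts:.3f} s  {marker.get('kind')}")
```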