fMRI Data Visualization with Julia in Minutes
1. Introduction
Recently, I needed to investigate fMRI research [1] whose data was collected from people reading a chapter of Harry Potter. Words were shown to the subjects one at a time, with the surrounding words blurred. The work demonstrated that, given hand-crafted language features and visual features of two words, one can distinguish which of the two words the reader is reading at a given moment by looking at the fMRI data.
On the first day I started investigating this data, I was lucky enough to be at the Julia Lab at MIT. Simon Danisch, author of the awesome Makie.jl plotting package, gave me an example script he had used to visualize other 3D streaming data. It took 30 minutes to adapt that example and visualize my fMRI data without any external processing tool. So, let's see how easy it is to visualize this data in Julia.
2. Setup
2.1. Downloading Data
Here, we download the raw fMRI data of subject_1 from the accompanying website of the paper.
server = "http://www.cs.cmu.edu/afs/cs/project/theo-73/www/plosone/files/"
download(server * "subject_1.mat", "./subject_1.mat")
2.2. Installing & Adding Packages
We install MAT.jl, a package for reading MATLAB data files. It does not work on Julia 1.0 yet, so we install it from a fork that works fine for now. We also install Makie.jl and its dependencies.
pkg"add GeometryTypes#master Observables#master AbstractPlotting#master Makie#master https://github.com/halleysfifthinc/MAT.jl#v0.7-update"
Include the required packages in the current workspace:
using Makie, GeometryTypes, Observables
using AbstractPlotting: slider, button, vbox, hbox
using MAT
3. Data Preparation
The subject's data is a matrix that holds voxel activation values over time. Rows correspond to time steps and columns correspond to voxels. You can check subject["meta"] for more meta information about the whole dataset.
subject = matread("./subject_1.mat");
data = subject["data"];
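matread returns a plain Julia Dict, so its top-level fields can be listed with keys. A minimal self-contained sketch with a toy stand-in dictionary (not the real subject_1.mat):

```julia
# Toy stand-in for the Dict that matread("./subject_1.mat") returns
subject = Dict("data" => zeros(10, 5),
               "meta" => Dict("colToCoord" => zeros(5, 3)))

sort(collect(keys(subject)))  # lists the top-level fields: ["data", "meta"]
```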
T is the number of time steps and V is the number of voxels. Each time step is a 2-second interval. Voxel locations are examined in the next cell.
T,V = size(data)
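Since each row covers a 2-second interval, row i maps to an elapsed time of 2(i - 1) seconds. A tiny hypothetical helper (assuming the first row starts at t = 0, which the dataset does not state explicitly):

```julia
# Hypothetical helper: seconds elapsed at the start of row i,
# assuming row 1 starts at t = 0 and each row is a 2-second interval
row_to_seconds(i) = 2 * (i - 1)

row_to_seconds(1)   # 0
row_to_seconds(31)  # 60
```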
3.1. Voxel Locations
subject["meta"]["colToCoord"] is a V×3 matrix that holds the 3D location of each voxel. I converted it to an array of CartesianIndex to make the next steps easier.
colToCoord = vec(mapslices(x -> CartesianIndex(Tuple(Int.(x))), subject["meta"]["colToCoord"]; dims = 2));
colToCoord[1] # display an example entry
The dimensions of the 3D rectangular box that encloses all voxels:
lx, ly, lz = maximum(Int.(subject["meta"]["colToCoord"]); dims = 1)
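maximum(...; dims = 1) returns a 1×3 matrix of per-column maxima, which Julia destructures element-wise into three scalars. A self-contained sketch with toy coordinates:

```julia
# Toy V×3 coordinate matrix (3 voxels)
coords = [3 1 2;
          5 4 6;
          4 9 1]

# 1×3 result of per-column maxima destructures into three scalars
lx, ly, lz = maximum(coords; dims = 1)

(lx, ly, lz)  # (5, 9, 6)
```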
3.2. Time Series of 3D Spatial Voxel Data
We have voxel activation data and voxel locations. For each time step, we create a 3D array of size (lx, ly, lz) and fill the voxel locations with the corresponding activation values.
fmri_time_series = map(1:T) do i
    voxels = zeros(lx, ly, lz)
    voxels[colToCoord] = data[i, :]
    voxels
end;
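The fill step relies on the fact that indexing an array with a vector of CartesianIndex scatter-assigns the right-hand side into those positions, leaving everything else untouched. A toy illustration:

```julia
# Toy voxel locations and activations for one time step
locs = [CartesianIndex(1, 1, 1), CartesianIndex(2, 2, 2)]
vals = [0.5, 0.9]

vol = zeros(2, 2, 2)
vol[locs] = vals   # scatter-assign activations into the 3D volume

vol[2, 2, 2]       # 0.9; unfilled positions stay 0.0
```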
typeof(fmri_time_series)
3.3. Normalization
We normalize the activation values to the 0-1 range so they display correctly.
fm = fmri_time_series
mini, maxi = mapreduce(extrema, (a, b) -> (min(a[1], b[1]), max(a[2], b[2])), fm)
fm_n = map(fm) do vol
    Float32.((vol .- mini) ./ (maxi - mini))
end
ex_volume = first(fm_n);
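The mapreduce call takes each volume's (min, max) pair and folds them into one global (min, max), so every time step is scaled identically. A self-contained sketch with toy "volumes":

```julia
# Toy stand-ins for the per-time-step volumes
vols = [[1.0 2.0], [0.0 5.0], [3.0 4.0]]

# extrema gives each volume's (min, max); the reducer merges pairs
mini, maxi = mapreduce(extrema, (a, b) -> (min(a[1], b[1]), max(a[2], b[2])), vols)

(mini, maxi)  # (0.0, 5.0)
```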
4. Visualization
Plotting and GUI with Makie.jl
style = Theme(raw = true, camera = campixel!)

# 3D Brain Image
axis = range(0, stop = 1, length = size(ex_volume, 1))
scene3d = contour(axis, axis, axis, ex_volume, alpha = 0.1, levels = 4)
cntr = last(scene3d)
volume = cntr[4]

# Heatmap Slices and Their Sliders
axis2 = range(-1, stop = 1, length = size(ex_volume, 1))
hscene = Scene(camera = cam3d!, show_axis = false)
planes = (:xy, :xz, :yz)

# XY-XZ-YZ Planes' Sliders
sliders = ntuple(3) do i
    s_scene = slider(style, 1:size(volume[], i), start = size(volume[], i) ÷ 2)
    s = last(s_scene)
    idx = s[:value]
    plane = planes[i]
    indices = map(1:3) do j
        planes[j] == plane ? 1 : (:)
    end
    hmap = last(heatmap!(
        hscene, axis2, axis2, volume[][indices...],
        fillrange = true, interpolate = true
    ))
    lift(idx, volume) do _idx, vol
        idx = (i in (1, 2)) ? (size(vol, i) - _idx) + 1 : _idx
        transform!(hmap, (plane, axis2[_idx]))
        indices = map(1:3) do j
            planes[j] == plane ? idx : (:)
        end
        if checkbounds(Bool, vol, indices...)
            hmap[3][] = view(vol, indices...)
        end
    end
    s_scene
end

# Time Step Slider
t = slider(style, 1:length(fm_n))
tslider = last(t)
lift(tslider[:value]) do idx
    if checkbounds(Bool, fm_n, idx)
        volume[] = fm_n[idx]
    end
end

# XY-XZ-YZ Planes Toggle Button
b1 = button(style, "Plane: "; dimensions = (150, 150))
on(b1[end][:clicks]) do clicks
    if b2.plots[1][1][] == "2D"
        cam3d!(hscene)
        cam = cameracontrols(hscene)
        cam.projectiontype[] = AbstractPlotting.Orthographic
        if clicks % 3 == 1
            cam.upvector[] = Vec3f0(0, 0, 1)
            cam.eyeposition[] = Vec3f0(0, 300, 0)
            cam.lookat[] = Vec3f0(0, 0, 0)
            b1.plots[1][1][] = "Plane: XZ"
        elseif clicks % 3 == 0
            cam.upvector[] = Vec3f0(1, 0, 0)
            cam.eyeposition[] = Vec3f0(0, 0, 300)
            cam.lookat[] = Vec3f0(0, 0, 0)
            b1.plots[1][1][] = "Plane: YZ"
        elseif clicks % 3 == 2
            cam.upvector[] = Vec3f0(0, 0, 1)
            cam.eyeposition[] = Vec3f0(300, 0, 0)
            cam.lookat[] = Vec3f0(0, 0, 0)
            b1.plots[1][1][] = "Plane: XY"
        end
        update_cam!(hscene, cam)
    end
end

# 3D/2D Toggle Button
b2 = button(style, "3D"; dimensions = (100, 100))
on(b2[end][:clicks]) do clicks
    if iseven(clicks)
        cam3d!(hscene)
        center!(hscene)
        b1.plots[1][1][] = "Plane: XYZ"
        b2.plots[1][1][] = "3D"
    else
        cam2d!(hscene)
        center!(hscene)
        cam3d!(hscene)
        cam = cameracontrols(hscene)
        cam.projectiontype[] = AbstractPlotting.Orthographic
        cam.upvector[] = Vec3f0(1, 0, 0)
        cam.eyeposition[] = Vec3f0(0, 0, 300)
        cam.lookat[] = Vec3f0(0, 0, 0)
        update_cam!(hscene, cam)
        b1.plots[1][1][] = "Plane: YZ"
        b2.plots[1][1][] = "2D"
    end
end

# Layout
hbox(
    vbox(scene3d, hscene),
    vbox(sliders..., t, b2, b1)
)
5. Demo
6. Conclusion
It was great fun to render the brain in 3D and watch voxel activations on it. Along the way, we also saw that the language/reading-related parts of the brain are located in the left hemisphere (assuming the subjects are right-handed).
7. References
[1] Wehbe, Leila, et al. "Simultaneously uncovering the patterns of brain regions involved in different story reading subprocesses." PLoS ONE 9.11 (2014)