Revolutionizing Neuroimaging Education with Supercomputing and Neurodesk

(Picture: Prof. Pim Pullens, Ghent University, with neuroimaging software)

The challenge of analysing neuroimaging data 

Neuroimaging is a research field that studies the human brain using quantitative MRI [1] techniques. Pim Pullens is an MRI physicist at Ghent University Hospital, Director of Operations of the Ghent Institute for Functional and Metabolic Imaging (GIfMI) and a professor of magnetic resonance imaging (MRI). He teaches neuroimaging analysis at Ghent University: “I want to give my students a real-life situation where they have to analyse MRI data to answer a research question. However, analysing MRI data often requires specific software, and that software may in turn demand specific hardware and operating systems. So, it’s not straightforward to teach students neuroimaging analysis if they have to install the software on their laptops. It can take multiple days to set up, or it simply doesn’t work.”

Analysing MRI data is also time-consuming. It is common to spend months on an analysis, using many different software tools. Datasets can be very large and keep growing, and some analyses take over eight hours per dataset on a single computer. So even if students manage to install the required software, their laptops are often not powerful enough, or lack the disk space, for the analysis.

Screenshot of neuroimaging software running in a web browser. This application is running on the HPC infrastructure of Ghent University.

Supercomputing and Neurodesk to the rescue

Therefore, Prof. Pullens’ students use the supercomputing infrastructure of the Vlaams Supercomputer Centrum (VSC) combined with Neurodesk [2]. This community-driven, open-source platform [3] provides a containerised data analysis environment that facilitates the analysis of neuroimaging data.
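
To make the idea of a containerised environment concrete, here is a minimal sketch, in Python, of how a single analysis step might be run through a container instead of locally installed software. The container image name, the file names and the use of Apptainer are illustrative assumptions, not a description of the exact Ghent set-up.

```python
import subprocess

# Minimal illustration (assumed image and file names): run FSL's brain
# extraction tool "bet" inside a containerised FSL image via Apptainer,
# so nothing has to be installed on the laptop itself.
container = "fsl_6.0.7.4.sif"             # containerised FSL build (assumption)

subprocess.run(
    [
        "apptainer", "exec", container,   # execute a command inside the container
        "bet",                            # FSL brain extraction tool
        "sub-01_T1w.nii.gz",              # input anatomical MRI scan
        "sub-01_T1w_brain.nii.gz",        # skull-stripped output image
    ],
    check=True,                           # raise an error if the tool fails
)
```

Inside Neurodesk the tools come ready to use; the point of the sketch is simply that the software and all its dependencies live in the container, not on the student’s machine.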

Prof. Pim Pullens (GIfMI, Ghent University): “The neuroimaging community mostly uses Linux and macOS, Unix-like environments in which it can be tricky to get everything working. Thanks to its containerised environment, Neurodesk makes the tools available without having to install software on a local computer. By using the supercomputing infrastructure of the VSC, rather than the students’ own laptops, we no longer have to worry about laptops crashing or about students panicking because they have lost their work; only a decent internet connection is required. It’s also great for Master’s or PhD students because they can run their analyses quickly. They can install Neurodesk on their computers, develop their workflow, and easily transfer it to the supercomputer for larger-scale analysis. This really simplifies the whole process.”
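
As an illustration of the “develop locally, scale up on the supercomputer” workflow described above, the hedged Python sketch below submits one batch job per subject to a Slurm-managed cluster. The scheduler, resource values and file names are assumptions made for the example, not the actual VSC configuration.

```python
import subprocess

# Hypothetical example: submit one job per subject so that the step
# developed locally runs in parallel on the cluster. The scheduler (Slurm),
# resources and file names are assumptions made for illustration.
subjects = ["sub-01", "sub-02", "sub-03"]

for sub in subjects:
    job_script = f"""#!/bin/bash
#SBATCH --job-name=bet_{sub}
#SBATCH --time=01:00:00
#SBATCH --mem=8G

apptainer exec fsl_6.0.7.4.sif bet {sub}_T1w.nii.gz {sub}_T1w_brain.nii.gz
"""
    # sbatch accepts a job script on standard input
    subprocess.run(["sbatch"], input=job_script, text=True, check=True)
```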
 

Overview of the Neurodesk environment. Image source: https://www.biorxiv.org/content/10.1101/2022.12.23.521691v1.full


 

Ensuring reproducibility and computational stability

For anyone involved in data analysis, it’s a common experience: you install software, complete your analysis, and then, a year or so later, you need to repeat the process. By that time, however, the software may have been upgraded, and the new version can introduce compatibility issues that break existing analysis pipelines: the software no longer behaves as it did, which causes considerable frustration. This is where another advantage of Neurodesk comes into play: it preserves different software versions, allowing you to return to the exact pipeline you used one or two years ago and repeat your experiments seamlessly.
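
One hedged way to picture this is to pin the exact container versions used in an analysis and store them alongside the results, so the identical environment can be brought back later. The manifest format and image names below are invented for illustration and are not an official Neurodesk mechanism.

```python
import json
import subprocess
from datetime import date

# Illustrative only: record which containerised tool versions an analysis
# used, so the very same pipeline can be re-run a year or two later.
pinned_containers = {
    "fsl": "fsl_6.0.7.4.sif",             # assumed image names, pinned by version
    "freesurfer": "freesurfer_7.4.1.sif",
}

with open("analysis_manifest.json", "w") as f:
    json.dump({"date": str(date.today()), "containers": pinned_containers}, f, indent=2)

# Later: read the manifest back and rerun the step with the same pinned image.
with open("analysis_manifest.json") as f:
    containers = json.load(f)["containers"]

subprocess.run(
    ["apptainer", "exec", containers["fsl"],
     "bet", "sub-01_T1w.nii.gz", "sub-01_T1w_brain.nii.gz"],
    check=True,
)
```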

Prof. Pim Pullens: “A research paper from Neurodesk [4] demonstrates an important point: when you analyse large datasets on different systems (Unix, Linux or Windows) without Neurodesk, you end up with different results. With Neurodesk, however, you consistently get the same result on every system. This consistency is crucial, because the algorithms used in data analysis involve many computational steps, and each step can introduce small numerical errors.”

Simplified access through Open OnDemand

Thanks to the Open OnDemand [5] portal offered by the VSC, supercomputing has become even more user-friendly: users interact with the supercomputer clusters from within a web browser. This is particularly valuable for users who are not familiar with command-line tools, as they can work with the system almost as easily as with a regular computer.

Prof. Pullens (GIfMI UGent): “The main concern people had about using HPC was the lack of familiar user interfaces. However, with Open OnDemand, there is a desktop environment where you can simply click and start programs just like you do on your computer.”

Another way Open OnDemand helps is by enabling data transfer without needing specific software on the students’ laptops. Prof. Pim Pullens: “My students have little to no experience in computing, programming, or using terminal commands. The most challenging aspect was learning to transfer data from their own computers to directories on the supercomputer using the terminal. However, with just one lesson and using Open OnDemand, they could analyse datasets and display MRI data in their web browser.”

Invaluable support from Neurodesk and the VSC team of Ghent University  

Prof. Pim Pullens: “I had assistance from the main developer of Neurodesk, Steffen Bollmann (University of Queensland, Brisbane), who even visited us personally from Australia. Also, the support from the VSC HPC team of Ghent University made the process remarkably straightforward. The student list is maintained on a specific university platform, and VSC accounts were automatically created for that group. All directories were set up correctly, with appropriate permissions, ensuring everything was in order. As a result, when students log into the graphical interface via the web browser, everything is pre-configured. This support was invaluable, and I am pleased that the students can work with the system easily and efficiently.”

Prof. Pullens concludes: “As I said, my students have no experience working with a supercomputer. However, thanks to the VSC infrastructure, Open OnDemand and Neurodesk, I can get them to analyse their data within an hour, which shows how user-friendly and efficient this set-up is.”
 


[1] MRI: Magnetic resonance imaging (MRI) is a medical imaging technique used in radiology to form pictures of the anatomy and the physiological processes inside the body. MRI scanners use strong magnetic fields, magnetic field gradients, and radio waves to generate images of the organs in the body (source: Wikipedia)

[2] Neurodesk: https://www.neurodesk.org/

[3] Source: Renton, A.I., Dao, T.T., Johnstone, T., Civier, O., Sullivan, R. P., White, D. J., … Narayanan, A. & Bollmann, S. Neurodesk: an accessible, flexible and portable data analysis environment for reproducible neuroimaging. Nat Methods (2024). 

[4] https://www.biorxiv.org/content/10.1101/2022.12.23.521691v2.full

[5] More information: https://openondemand.org/


Published on 01/10/2024

Prof. Pim Pullens

Prof. Pim Pullens is a magnetic resonance imaging (MRI) physicist at the Department of Radiology, UZ Gent, and at the Ghent Institute for Functional and Metabolic Imaging (GIfMI), and a professor at Ghent University. His aim is to introduce quantitative MR imaging and the associated measurement and analysis processes to clinicians, researchers and students. He strives to connect researchers, clinicians, students and imaging experts to create synergy and fully exploit the infrastructure and expertise at Ghent University Hospital and Ghent University.