Full description
Work in progress...

NeuriPhy - Neuroimaging Dataset for Physics-Informed Learning

This dataset was developed in the context of my master's thesis, "Physics-Guided Deep Learning for Sparse Data-Driven Brain Shift Registration", which investigates the integration of physics-based biomechanical modeling into deep learning frameworks for brain shift registration. The core objective of this project is to improve the accuracy and reliability of intraoperative brain shift prediction by enabling deep neural networks to interpolate sparse intraoperative data under biomechanical constraints. Such capabilities are critical for image-guided neurosurgery systems, especially when full intraoperative imaging is unavailable or impractical.

The dataset integrates and extends data from two publicly available sources: ReMIND and UPENN-GBM. It comprises 207 patient cases (45 from ReMIND and 162 from UPENN-GBM), each stored in its own folder with all relevant data grouped per case. It contains preoperative imaging (unstripped), synthetic ground-truth displacement fields, anatomical segmentations, and keypoints, structured to support machine learning and registration tasks. For details on image acquisition and other aspects of the original datasets, see their original links above.

Contents

Imaging Data:
- T1ce: Preoperative contrast-enhanced T1-weighted MRI scans.
- T2: Preoperative T2-weighted MRI scans, mostly T2-SPACE, but also native T2 and T2-BLADE acquisitions depending on the case.

All MRI scans are in NIfTI format and have been resampled to the same isotropic resolution (1x1x1 mm). Intra-patient rigid coregistration was performed as part of preprocessing with the "General Registration (BRAINS)" extension of 3D Slicer.
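Because every scan shares the same 1 mm isotropic grid, mapping between voxel indices and world coordinates reduces to a single affine multiplication using the NIfTI affine. The sketch below uses a hypothetical affine (identity rotation with an example origin); in practice the matrix should be read from each file's header (e.g., `img.affine` in nibabel):

```python
import numpy as np

# Hypothetical NIfTI affine for a 1x1x1 mm isotropic scan: identity
# rotation/scaling with an example origin at (-90, -126, -72) mm.
# The real matrix comes from the image header, not from this sketch.
affine = np.array([
    [1.0, 0.0, 0.0, -90.0],
    [0.0, 1.0, 0.0, -126.0],
    [0.0, 0.0, 1.0, -72.0],
    [0.0, 0.0, 0.0, 1.0],
])

def voxel_to_world(affine: np.ndarray, ijk: np.ndarray) -> np.ndarray:
    """Map an (N, 3) array of voxel indices to world coordinates in mm."""
    ijk_h = np.hstack([ijk, np.ones((len(ijk), 1))])  # homogeneous coordinates
    return (affine @ ijk_h.T).T[:, :3]

voxels = np.array([[0, 0, 0], [90, 126, 72]])
print(voxel_to_world(affine, voxels))
# with this affine, voxel (90, 126, 72) lands at the world origin (0, 0, 0)
```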
Synthetic Displacement Fields: Biomechanically simulated ground-truth displacement fields were generated using a meshless approach, solving the differential equations of nonlinear elasticity with explicit methods, as described in [1, 2, 3, 4]. For each patient, 1 to 5 simulations were successfully performed, each with a different gravity vector orientation corresponding to a plausible surgical entry point, creating variability in the resulting deformations. Overall, the dataset contains 394 simulations aimed at predicting the intraoperative state after tumor-resection-induced brain shift. Each simulation includes the initial and displaced (final) coordinates of several points in the brain volume, which were used to generate the displacement field via a multi-level B-spline interpolation algorithm. These displacement fields are intended mainly as supervision for deep learning-based registration methods.

Keypoints: Sparse 3D keypoints and their descriptors were generated with the 3D SIFT-Rank algorithm on the T1ce images (or T2 where T1ce was unavailable). Keypoints are provided for each case in both voxel space and world coordinates (RAS?), making them suitable for sparse registration or landmark-based evaluation.

Segmentations: Brain segmentations were automatically generated using SynthSeg, a deep learning model capable of robust whole-brain segmentation for scans of any contrast and resolution. Tumor segmentations are included from the original datasets. All segmentations are provided in NRRD format.

Data Structure

Each patient folder contains the following subfolders:
- images/: Preoperative MRI scans (T1ce, T2) in NIfTI format.
- segmentations/: Brain and tumor segmentations in NRRD format.
- simulations/: Biomechanically simulated displacement fields, with initial and final point coordinates (LPS) in .npz and .txt formats, respectively.
- keypoints/: 3D SIFT-Rank keypoints and their descriptors in both voxel space and world coordinates (RAS?) as .key files.
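A simulation's initial and final point coordinates can be turned into per-point displacement vectors with a few lines of numpy. The sketch below builds a toy .npz file with assumed key names (`initial`, `final`; the actual keys in the dataset's files may differ, so inspect them with `np.load(...).files` first), and also shows the standard LPS-to-RAS axis flip, which may be needed when comparing simulation points (stored in LPS) against keypoints in world coordinates:

```python
import numpy as np

# Build a toy simulation file mimicking the assumed layout: (N, 3) arrays of
# initial and displaced (final) point coordinates in LPS, in mm.
# NOTE: "initial"/"final" are hypothetical key names for illustration only.
rng = np.random.default_rng(0)
initial = rng.uniform(-80, 80, size=(500, 3))
final = initial + rng.normal(0, 2, size=(500, 3))  # toy brain-shift offsets
np.savez("sim_example.npz", initial=initial, final=final)

sim = np.load("sim_example.npz")
disp = sim["final"] - sim["initial"]      # per-point displacement vectors (mm)
magnitudes = np.linalg.norm(disp, axis=1)
print(f"mean displacement: {magnitudes.mean():.2f} mm")

# LPS -> RAS: negate the first two axes (Left->Right, Posterior->Anterior);
# the superior axis is shared by both conventions.
initial_ras = sim["initial"] * np.array([-1.0, -1.0, 1.0])
```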
The folder naming and organization are consistent across patients for ease of use and scripting.

Source Datasets

ReMIND: A multimodal imaging dataset of 114 brain tumor patients who underwent image-guided surgical resection at Brigham and Women’s Hospital, containing preoperative MRI, intraoperative MRI, and 3D intraoperative ultrasound data. It includes over 300 imaging series and 350 expert-annotated segmentations, such as tumors, resection cavities, cerebrum, and ventricles. Demographic and clinico-pathological information (e.g., tumor type, grade, eloquence) is also provided.

UPENN-GBM: Comprises multi-parametric MRI scans from de novo glioblastoma (GBM) patients treated at the University of Pennsylvania Health System. It includes co-registered and skull-stripped T1-weighted, contrast-enhanced T1-weighted, T2-weighted, and FLAIR images. The dataset features high-quality tumor and brain segmentation labels, initially produced by automated methods and subsequently corrected and approved by board-certified neuroradiologists. Alongside imaging data, the collection provides comprehensive clinical metadata, including patient demographics, genomic profiles, survival outcomes, and tumor progression indicators.

Use Cases

This dataset is tailored for researchers and developers working on:
- Deformable image registration
- Physics-informed machine learning
- Intraoperative brain shift modeling
- Sparse data interpolation and deep learning
- Multi-modal image alignment in neuroimaging

It is especially well-suited for evaluating learning-based registration methods that incorporate physical priors or aim to generalize under sparse supervision.

Notes
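Since the per-case layout is uniform, a case can be indexed with a short stdlib sketch. The subfolder names below come from the Data Structure section; the individual file names are hypothetical placeholders used only to build a mock case for demonstration:

```python
from pathlib import Path

def index_case(case_dir: Path) -> dict:
    """Collect per-case files following the dataset's documented subfolders.

    Subfolder names match the description above; the glob patterns reflect
    the stated formats (NIfTI, NRRD, .npz, .key).
    """
    return {
        "images": sorted((case_dir / "images").glob("*.nii*")),
        "segmentations": sorted((case_dir / "segmentations").glob("*.nrrd")),
        "simulations": sorted((case_dir / "simulations").glob("*.npz")),
        "keypoints": sorted((case_dir / "keypoints").glob("*.key")),
    }

# Build a tiny mock case to demonstrate (file names here are made up;
# with the real data, point `root` at the downloaded dataset instead).
root = Path("neuriphy_demo")
case = root / "case_001"
for sub, name in [("images", "t1ce.nii.gz"), ("segmentations", "brain.nrrd"),
                  ("simulations", "sim_01.npz"), ("keypoints", "t1ce.key")]:
    (case / sub).mkdir(parents=True, exist_ok=True)
    (case / sub / name).touch()

files = index_case(case)
print({k: [p.name for p in v] for k, v in files.items()})
```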
External Organisations
University of Lisbon; Pitié-Salpêtrière Hospital; University of Cambridge
Associated Persons
Tiago Assis (Creator); Reuben Dorent (Creator); Nuno C. Garcia (Creator); Ines P. Machado (Creator)
Issued: 2025-05-12
Other Information
Deep Biomechanically-Guided Interpolation for Keypoint-Based Brain Shift Registration
URL: http://research-repository.uwa.edu.au/en/publications/6bfaded7-53ea-4eb8-b38c-6943278cce1d
Conference paper
Identifiers
- DOI : 10.5281/ZENODO.15381865
- global : 8fee9354-e10b-491b-86c2-f3c78e24dbf1
