MIND: modality independent neighbourhood descriptor for multi-modal deformable registration.

Title: MIND: modality independent neighbourhood descriptor for multi-modal deformable registration
Publication Type: Journal Article
Year of Publication: 2012
Authors: Heinrich M.P., Jenkinson M., Bhushan M., Matin T., Gleeson F.V., Brady S.M., Schnabel J.A.
Journal: Medical Image Analysis
Volume: 16
Issue: 7
Pages: 1423-35
Date Published: 2012 Oct
Publication Language: eng
ISSN: 1361-8423
Keywords: Algorithms, Artificial Intelligence, Humans, Image Enhancement, Image Interpretation, Computer-Assisted, Imaging, Three-Dimensional, Magnetic Resonance Imaging, Pattern Recognition, Automated, Reproducibility of Results, Sensitivity and Specificity, Subtraction Technique, Tomography, X-Ray Computed
Abstract

Deformable registration of images obtained from different modalities remains a challenging task in medical image analysis. This paper addresses this important problem and proposes a modality independent neighbourhood descriptor (MIND) for both linear and deformable multi-modal registration. Based on the similarity of small image patches within one image, it aims to extract the distinctive structure in a local neighbourhood, which is preserved across modalities. The descriptor is based on the concept of image self-similarity, which has been introduced for non-local means filtering for image denoising. It is able to distinguish between different types of features such as corners, edges and homogeneously textured regions. MIND is robust to the most considerable differences between modalities: non-functional intensity relations, image noise and non-uniform bias fields. The multi-dimensional descriptor can be efficiently computed in a dense fashion across the whole image and provides point-wise local similarity across modalities based on the absolute or squared difference between descriptors, making it applicable for a wide range of transformation models and optimisation algorithms. We use the sum of squared differences of the MIND representations of the images as a similarity metric within a symmetric non-parametric Gauss-Newton registration framework. In principle, MIND would be applicable to the registration of arbitrary modalities. In this work, we apply and validate it for the registration of clinical 3D thoracic CT scans between inhale and exhale as well as the alignment of 3D CT and MRI scans. Experimental results show the advantages of MIND over state-of-the-art techniques such as conditional mutual information and entropy images, with respect to clinically annotated landmark locations.
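The core idea in the abstract — describing each voxel by the patch distances to its neighbours within the *same* image, normalising by a local variance estimate, and comparing the resulting descriptors across modalities by a simple sum of squared differences — can be sketched in a few lines. The following is a minimal, hypothetical 2D NumPy illustration, not the authors' implementation: it uses a box-filter patch mean instead of the paper's Gaussian patch weighting, only the four axial offsets as the search neighbourhood, wrap-around border handling, and invented function names (`mind_descriptor`, `mind_ssd`).

```python
import numpy as np

def _box_mean(a, radius=1):
    """Mean over a (2r+1)^2 patch via shifted sums (wrap-around borders;
    a simplification of the Gaussian patch weighting used in the paper)."""
    out = np.zeros_like(a)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
    return out / (2 * radius + 1) ** 2

def mind_descriptor(img, offsets=((0, 1), (1, 0), (0, -1), (-1, 0))):
    """Per-pixel MIND-style channels: patch distance to each offset
    neighbour, normalised by a local variance estimate, then exp(-d/v)."""
    img = np.asarray(img, dtype=np.float64)
    dists = []
    for dy, dx in offsets:
        shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
        dists.append(_box_mean((img - shifted) ** 2))  # patch-wise SSD
    dists = np.stack(dists)                  # (n_offsets, H, W)
    variance = dists.mean(axis=0) + 1e-8     # mean patch distance as variance
    return np.exp(-dists / variance)         # descriptor channels per pixel

def mind_ssd(img_a, img_b):
    """Point-wise multi-modal similarity: SSD between the two images'
    descriptors, usable as a registration cost at each pixel."""
    return ((mind_descriptor(img_a) - mind_descriptor(img_b)) ** 2).sum(axis=0)
```

Because both the patch distances and the variance estimate scale identically under an affine intensity change, the descriptor is (up to the stabilising epsilon) unaffected by it — a toy analogue of the robustness to non-functional intensity relations claimed in the abstract.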

DOI: 10.1016/j.media.2012.05.008
PubMed Link: http://www.ncbi.nlm.nih.gov/pubmed/22722056?dopt=Abstract
Alternate Journal: Med Image Anal
Created on 26 October 2015 - 10:25.
