Title: The unified extreme learning machines and discriminative random fields for automatic knee cartilage and meniscus segmentation from multi-contrast MR images

Authors: Zhang, Kunlei

Keywords: DRNTU::Engineering::Electrical and electronic engineering

Issue Date: 2013

Source: Zhang, K., Lu, W., & Marziliano, P. (2013). The unified extreme learning machines and discriminative random fields for automatic knee cartilage and meniscus segmentation from multi-contrast MR images. Machine Vision and Applications, 24(7), 1459-1472.

Series/Report no.: Machine Vision and Applications

Abstract: Segmenting articular cartilage and meniscus from magnetic resonance (MR) images is an essential task for the assessment of knee pathology. Most previous classification-based works on cartilage and meniscus segmentation rely only on independent labellings by a classifier and do not consider spatial context: the labels of most image voxels are in fact dependent upon their neighbours. In this study, we present an automatic knee segmentation system working on multi-contrast MR images, in which a novel classification model unifying an extreme learning machine (ELM)-based association potential and a discriminative random field (DRF)-based interaction potential is proposed. The DRF model introduces spatial dependencies between neighbouring voxels into the independent ELM classification. We exploit a rich set of features from multi-contrast MR images to train the proposed classification model and perform loopy belief propagation for the inference. The proposed model is evaluated on multi-contrast MR datasets acquired from 11 subjects, with results outperforming the independent classifiers in terms of segmentation accuracy for both cartilages and menisci.

URI: https://hdl.handle.net/10356/98648

DOI: 10.1007/s00138-012-0466-9

Fulltext Permission: none

Fulltext Availability: No Fulltext
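The abstract describes unifying per-voxel classifier outputs (association potentials) with a DRF interaction potential and running loopy belief propagation for inference. As a rough illustration of that idea only, here is a minimal max-product loopy BP sketch on a 2-D label grid; the Potts smoothing term, toy unary scores, and grid size are assumptions made for this sketch, not the paper's actual ELM-based model or its MR features:

```python
import numpy as np

def loopy_bp_labels(unary, smooth=1.0, iters=10):
    """Max-product loopy belief propagation on a 4-connected pixel grid.

    unary  : (H, W, L) log association potentials, e.g. per-pixel
             log-scores from an independent classifier (an ELM in the
             paper; any classifier works for this sketch).
    smooth : strength of an assumed Potts interaction potential that
             penalises neighbouring pixels taking different labels.
    Returns an (H, W) array of approximate MAP labels.
    """
    H, W, L = unary.shape
    pair = -smooth * (1.0 - np.eye(L))   # Potts log-potential, (L, L)
    # msgs[d, y, x, :] = message arriving at (y, x) from its neighbour in
    # direction d: 0 = from above, 1 = from below, 2 = from left, 3 = from right
    msgs = np.zeros((4, H, W, L))
    opp = (1, 0, 3, 2)                   # opposite of each arrival direction
    for _ in range(iters):
        total = unary + msgs.sum(axis=0)         # current beliefs, (H, W, L)
        new = np.zeros_like(msgs)
        for d in range(4):
            h = total - msgs[opp[d]]             # exclude the receiver's own message
            m = (h[:, :, :, None] + pair).max(axis=2)
            m -= m.max(axis=2, keepdims=True)    # normalise for stability
            if d == 0:   new[0, 1:, :] = m[:-1, :]    # sender is the pixel above
            elif d == 1: new[1, :-1, :] = m[1:, :]    # sender is the pixel below
            elif d == 2: new[2, :, 1:] = m[:, :-1]    # sender is the pixel left
            else:        new[3, :, :-1] = m[:, 1:]    # sender is the pixel right
        msgs = new
    return (unary + msgs.sum(axis=0)).argmax(axis=2)

# Toy example: a two-class image, left half label 0, right half label 1,
# with two pixels whose unary evidence weakly favours the wrong label.
truth = np.zeros((8, 8), dtype=int)
truth[:, 4:] = 1
unary = np.where(truth[:, :, None] == np.arange(2), 2.0, 0.0)
unary[3, 1] = [0.0, 0.5]   # weak wrong evidence inside the left half
unary[5, 6] = [0.5, 0.0]   # weak wrong evidence inside the right half
labels = loopy_bp_labels(unary, smooth=1.0, iters=10)
```

An independent per-pixel argmax over `unary` would mislabel the two noisy pixels; the interaction potential lets their neighbours' evidence correct them, which is the behaviour the abstract attributes to adding the DRF term on top of the independent classifier.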
Appears in Collections: EEE Journal Articles
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.