
Dipl.-Inf. Eva Kollorz

Alumnus of the Pattern Recognition Lab of the Friedrich-Alexander-Universität Erlangen-Nürnberg

The vision of my work is to improve ultrasound image quality and to automatically segment and classify nodules in 3-D ultrasound data.


Segmentation and classification of thyroid nodules


Thyroid nodule segmentation is a hard task due to the varying echo structures, textures, and echogenicities in ultrasound (US) images, as well as speckle noise. In current clinical practice, the nodule is measured manually and approximately in two section planes, and its size is estimated with the ellipsoid formula: width x depth x length (in cm) x 0.5 [1] (see Fig. 1).
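The clinical volume estimate above can be computed directly; a minimal sketch (the function name is illustrative, and the factor 0.5 approximates the exact ellipsoid constant pi/6 ≈ 0.524, as commonly done in clinical routine [1]):

```python
def nodule_volume_ml(width_cm: float, depth_cm: float, length_cm: float) -> float:
    """Estimate nodule volume in ml with the clinical ellipsoid formula.

    The factor 0.5 is the usual clinical approximation of pi/6 (~0.524)
    for an ideal ellipsoid; 1 cm^3 corresponds to 1 ml.
    """
    return width_cm * depth_cm * length_cm * 0.5

# Example: a nodule measuring 2.0 x 1.5 x 3.0 cm
print(nodule_volume_ml(2.0, 1.5, 3.0))  # 4.5 (ml)
```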



Fig. 1: Screenshot of the 2-D measurement (axial section plane) of an echo-complex thyroid nodule, partial input for the ellipsoid formula (left); paper form for recording nodule attributes and position (right).


The aforementioned nodule attributes are recorded on paper (see Fig. 1). This procedure is user-dependent (e.g., the physician chooses which slice to store), and the recorded 2-D slices convey only a limited three-dimensional impression of the thyroid nodule. For these reasons, we work with 3-D US volumes.

The aim of this project is to automatically determine the volume of a thyroid nodule and to classify its echo structure, texture, and echogenicity. The processing pipeline consists of three stages: preprocessing, segmentation, and classification.


We work with different benign nodule categories, e.g., isoechoic and complex echoic nodules (see Fig. 2).



Fig. 2: Different nodule categories (from left to right): isoechoic ("echogleich"), complex echoic ("echokomplex"), complex echoic and hypoechoic ("echoarm")/cystic.


Criteria for nodule classification are:

  • echo structure: solid, mixed, cystic
  • echo texture: homogeneous, heterogeneous
  • echogenicity: isoechoic, hyperechoic, hypoechoic


To preprocess our US image data, we investigated different filters [2] as well as image fuzzification [3]. We studied Haralick features [4], Haar-like features, filter banks (e.g., Leung-Malik), the gray-level run-length matrix (GLRLM), Laws' texture energy measures, scale-invariant feature transform (SIFT) features [5], as well as textons on 2-D US slices. For classification, a support vector machine (SVM) was used.
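To illustrate the kind of texture features involved, here is a minimal, dependency-free sketch of a gray-level co-occurrence matrix (GLCM) and two of the Haralick statistics derived from it [4]; the function names and the tiny quantized test image are illustrative, not part of the actual pipeline:

```python
from collections import Counter

def glcm(image, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for one offset (dx, dy).

    `image` is a 2-D list of quantized gray levels; the result maps each
    pair of co-occurring levels (i, j) to its relative frequency p(i, j).
    """
    counts = Counter()
    h, w = len(image), len(image[0])
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                counts[(image[y][x], image[ny][nx])] += 1
    total = sum(counts.values())
    return {pair: c / total for pair, c in counts.items()}

def haralick_contrast(p):
    """Haralick contrast: sum over (i, j) of (i - j)^2 * p(i, j)."""
    return sum((i - j) ** 2 * v for (i, j), v in p.items())

def haralick_homogeneity(p):
    """Inverse difference moment: sum of p(i, j) / (1 + (i - j)^2)."""
    return sum(v / (1 + (i - j) ** 2) for (i, j), v in p.items())

# Tiny 4-level test image with blocky texture
img = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [2, 2, 3, 3],
    [2, 2, 3, 3],
]
p = glcm(img)
print(round(haralick_contrast(p), 3))  # 0.333
```

In the actual pipeline such statistics, computed over several offsets and combined with the other feature types listed above, would form the feature vector fed to the SVM.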


Currently, we analyze semi-automatic segmentation methods (see Fig. 3), e.g., random walker (RW) [6], power watersheds (PW) [7] and graph cuts (GC). The segmentation output will be used for the classification stage.
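To give a flavor of the random walker approach [6], the following is a minimal 1-D sketch, not the actual implementation: edge weights follow the Gaussian weighting w = exp(-beta * (g_i - g_j)^2), and each unseeded pixel's object probability is the weighted average of its neighbors' probabilities (the discrete Dirichlet problem), solved here by simple Gauss-Seidel iteration:

```python
import math

def random_walker_1d(intensities, obj_seed, bg_seed, beta=1.0):
    """Two-label random walker segmentation on a 1-D image (sketch).

    Returns a boolean mask (True = object) obtained by thresholding each
    pixel's probability of reaching the object seed first at 0.5.
    """
    n = len(intensities)
    # Gaussian edge weights between neighboring pixels
    w = [math.exp(-beta * (intensities[i] - intensities[i + 1]) ** 2)
         for i in range(n - 1)]
    prob = [0.5] * n
    prob[obj_seed], prob[bg_seed] = 1.0, 0.0
    for _ in range(500):  # Gauss-Seidel sweeps on the combinatorial Laplacian
        for i in range(n):
            if i in (obj_seed, bg_seed):
                continue  # seed probabilities stay fixed
            num = den = 0.0
            if i > 0:
                num += w[i - 1] * prob[i - 1]; den += w[i - 1]
            if i < n - 1:
                num += w[i] * prob[i + 1]; den += w[i]
            prob[i] = num / den
    return [p >= 0.5 for p in prob]

# Bright nodule on the left, dark background on the right
print(random_walker_1d([0.9, 0.8, 0.2, 0.1], obj_seed=0, bg_seed=3, beta=5.0))
# [True, True, False, False]
```

The weak edge between the second and third pixel (large intensity jump, hence small weight) is where the label boundary settles; in 2-D/3-D the same linear system is solved over the full pixel/voxel lattice.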



Fig. 3: Segmentation results: random walker with white seed points (object and background), red: segmented object, blue: background (left); blue: power watersheds segmentation result, green & red: gold standard segmentations, yellow: intersection of gold standard segmentations (right).



Further literature:

[1] Lyshchik, Andrej; Drozd, Valentina; Reiners, Christoph; Accuracy of three-dimensional ultrasound for thyroid volume measurement in children and adolescents; Thyroid; 14(2); 2004; pp. 113-120

[2] Kollorz, Eva; Hahn, Dieter; Linke, Rainer; Goecke, Tamme; Hornegger, Joachim; Kuwert, Torsten; Quantification of Thyroid Volume Using 3-D Ultrasound Imaging; IEEE Trans. on Medical Imaging; 27(4); 2008; pp. 457-466

[3] Cheng, H. D.; Chen, Jim-Rong; Li, Jiguang; Threshold selection based on fuzzy c-partition entropy approach; Pattern Recognition; 31(7); 1998; pp. 857-870

[4] Haralick, Robert M.; Shanmugam, K.; Dinstein, Its'hak; Textural Features for Image Classification; IEEE Trans. on Systems, Man, and Cybernetics; SMC-3(6); 1973; pp. 610-621

[5] Lowe, David G.; Distinctive Image Features from Scale-Invariant Keypoints; International Journal of Computer Vision; 60(2); 2004; pp. 91-110

[6] Grady, Leo; Random Walks for Image Segmentation; IEEE Trans. on Pattern Analysis and Machine Intelligence; 28(11); 2006; pp. 1768-1783

[7] Couprie, Camille; Grady, Leo; Najman, Laurent; Talbot, Hugues; Power Watersheds: A Unifying Graph Based Optimization Framework; IEEE Trans. on Pattern Analysis and Machine Intelligence; 2010; to appear