This is an Open Access publication published under CSC-OpenAccess Policy.
3D Position Tracking System for Flexible Cystoscopy
Munehiro Nakamura, Yusuke Kajiwara, Tatsuhito Hasegawa, Haruhiko Kimura
Pages - 418 - 429     |    Revised - 15-08-2013     |    Published - 15-09-2013
Volume - 7   Issue - 4    |    Publication Date - September 2013
KEYWORDS
Flexible Cystoscopy, Position Tracking, Optical Flow, Zero-mean Normalized Cross-Correlation, Handling Pattern.
ABSTRACT
Flexible cystoscopy is an examination that allows physicians to look inside the bladder. During flexible cystoscopy, inexperienced physicians tend to lose track of the observed region because of the complex handling patterns of a flexible cystoscope and the lack of distinctive features inside the bladder. In this paper, we propose a system that tracks the observed region from cystoscopic images, as a diagnostic support tool for beginner physicians. Our system discriminates three handling patterns of a flexible cystoscope: bending, rotation, and insertion. To discriminate the handling patterns accurately, we use the degree of bending, rotation, and insertion as features for the discrimination, together with ZNCC-based optical flows. These features are learned by a Random Forest classifier, which discriminates sequential handling patterns of the cystoscope through a time-series analysis. Experimental results on ten videos recorded during flexible cystoscopy show that each of the three handling patterns was correctly discriminated with over 90% accuracy on average. In addition, we reproduced the observation in a virtual bladder model that we also propose.
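The abstract's "ZNCC-based optical flow" refers to block matching in which zero-mean normalized cross-correlation scores candidate displacements of an image patch between consecutive frames. A minimal sketch of that scoring and matching step is shown below; the function names, patch sizes, and exhaustive search strategy are illustrative assumptions, not the authors' actual implementation:

```python
import numpy as np

def zncc(patch_a: np.ndarray, patch_b: np.ndarray) -> float:
    """Zero-mean Normalized Cross-Correlation of two equally sized patches.

    Returns a score in [-1, 1]; 1 means a perfect match up to brightness
    and contrast changes (ZNCC is invariant to affine intensity shifts).
    """
    a = patch_a.astype(np.float64) - patch_a.mean()
    b = patch_b.astype(np.float64) - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:
        # A flat (constant) patch has no texture to correlate; treat as no match.
        return 0.0
    return float((a * b).sum() / denom)

def match_patch(template: np.ndarray, search: np.ndarray, step: int = 1):
    """Exhaustively slide `template` (from frame t) over `search` (frame t+1).

    Returns the (x, y) of the best-scoring placement and its ZNCC score;
    the offset from the template's original position is one flow vector.
    """
    th, tw = template.shape
    best_score, best_xy = -2.0, (0, 0)
    for y in range(0, search.shape[0] - th + 1, step):
        for x in range(0, search.shape[1] - tw + 1, step):
            s = zncc(template, search[y:y + th, x:x + tw])
            if s > best_score:
                best_score, best_xy = s, (x, y)
    return best_xy, best_score
```

Repeating this match over a grid of patches yields the sparse optical-flow field; the per-patch displacements could then be summarized into the bending/rotation/insertion features the classifier consumes.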
Dr. Munehiro Nakamura
Department of Natural Science and Engineering, Kanazawa University, Kanazawa 9201192, Japan
m-nakamura@blitz.ec.t.kanazawa-u.ac.jp
Dr. Yusuke Kajiwara
Department of Information Science, Ritsumeikan University, Kusatsu 525877, Japan
Mr. Tatsuhito Hasegawa
Department of Natural Science and Engineering, Kanazawa University, Kanazawa 9201192, Japan
Professor Haruhiko Kimura
Department of Natural Science and Engineering, Kanazawa University, Kanazawa 9201192, Japan