General first-order target registration error model considering a coordinate reference frame in an image-guided surgical system

  • Original Article
  • Medical & Biological Engineering & Computing

Abstract

Point-based rigid registration (PBRR) techniques are widely used in many aspects of image-guided surgery (IGS). Accurately estimating target registration error (TRE) statistics is of essential value for medical applications such as optical surgical tool-tip tracking and image registration. For example, knowing the TRE distribution statistics of the surgical tool tip can help the surgeon make correct decisions during surgery. Meanwhile, the pose of a surgical tool is usually reported relative to a second rigid body whose local frame is called the coordinate reference frame (CRF). In an n-ocular tracking system, fiducial localization error (FLE) should be considered inhomogeneous, meaning that it differs between fiducials, and anisotropic, meaning that it differs across directions. In this paper, we extend the TRE estimation algorithm relative to a CRF from the homogeneous and anisotropic FLE case to heterogeneous FLE cases. Arbitrary weightings can be adopted in solving the registration problems in the proposed TRE estimation algorithm. Monte Carlo simulation results demonstrate the proposed algorithm’s effectiveness for both homogeneous and inhomogeneous FLE distributions. The results are further compared with those of two other algorithms. When the FLE distribution is anisotropic and homogeneous, the proposed TRE estimation algorithm’s performance is comparable with that of the first algorithm. When the FLE distribution is heterogeneous, the proposed TRE estimation algorithm outperforms the two classical algorithms in all test cases when the ideal weighting scheme is adopted in solving the two registrations. Possible clinical applications include the online estimation of surgical tool-tip tracking error with respect to a CRF in IGS.

This paper presents a target registration error model that accounts for a coordinate reference frame in surgical navigation.
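
To illustrate the kind of Monte Carlo study described in the abstract, the following Python sketch perturbs a set of tool-body fiducials with inhomogeneous, anisotropic FLE, solves the rigid registration, and accumulates tool-tip TRE statistics. This is not the authors' estimator: the fiducial geometry, FLE covariances, and tip location are made-up example values, the registration is ordinary unweighted least squares (cf. ref. 1, Arun et al.) rather than an arbitrarily weighted one, and the CRF composition is omitted for brevity.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tool-body fiducial positions (mm) and tool-tip target, tool frame.
fiducials = np.array([[0., 0., 0.], [50., 0., 0.], [0., 40., 0.], [0., 0., 30.]])
tip = np.array([80., 0., -20.])

# Inhomogeneous, anisotropic FLE: a different 3x3 covariance per fiducial
# (example numbers only; the z axis is noisier, as in optical tracking).
fle_covs = [np.diag(s) for s in ([0.01, 0.01, 0.09],
                                 [0.02, 0.01, 0.16],
                                 [0.01, 0.03, 0.12],
                                 [0.02, 0.02, 0.10])]

def rigid_fit(src, dst):
    # Ordinary (unweighted) least-squares rigid registration, Arun-style SVD.
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])   # guard against reflection
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

# Monte Carlo: perturb the measured fiducials, register back to the model,
# and record the resulting tool-tip error.
tre = []
for _ in range(20000):
    noisy = np.array([f + rng.multivariate_normal(np.zeros(3), C)
                      for f, C in zip(fiducials, fle_covs)])
    R, t = rigid_fit(fiducials, noisy)     # model -> measured
    tre.append(R @ tip + t - tip)          # tip error induced by the registration
tre = np.array(tre)
print("RMS TRE (mm):", np.sqrt((tre**2).sum(1).mean()))
print("TRE covariance:\n", np.cov(tre.T))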

References

  1. Arun KS, Huang TS, Blostein SD (1987) Least-squares fitting of two 3-d point sets. IEEE Trans Pattern Anal Mach Intell 5(5):698–700

  2. Babin P, Giguere P, Pomerleau F (2019) Analysis of robust functions for registration algorithms. In: 2019 International conference on robotics and automation (ICRA). IEEE, pp 1451–1457

  3. Balachandran R, Fitzpatrick JM (2009) Iterative solution for rigid-body point-based registration with anisotropic weighting. In: Medical imaging 2009: visualization, image-guided procedures, and modeling. International Society for Optics and Photonics, vol 7261, p 72613D

  4. Balakrishnan G, Zhao A, Sabuncu MR, Guttag J, Dalca AV (2019) Voxelmorph: a learning framework for deformable medical image registration. IEEE Trans Med Imaging 38(8):1788–1800

  5. Balakrishnan G, Zhao A, Sabuncu MR, Guttag J, Dalca AV (2019) Voxelmorph: a learning framework for deformable medical image registration. IEEE Trans Medical Imaging 38(8):1788–1800

  6. Chen ECS, Morgan I, Jayarathne U, Ma B, Peters TM (2017) Hand–eye calibration using a target registration error model. Healthcare Technol Lett 4(5):157–162

  7. Dalca AV, Balakrishnan G, Guttag J, Sabuncu MR (2018) Unsupervised learning for fast probabilistic diffeomorphic registration. In: International conference on medical image computing and computer-assisted intervention. Springer, pp 729–738

  8. Dalca AV, Balakrishnan G, Guttag J, Sabuncu MR (2019) Unsupervised learning of probabilistic diffeomorphic registration for images and surfaces. Medical Image Analysis 57:226–236

  9. Danilchenko A, Fitzpatrick JM (2011) General approach to first-order error prediction in rigid point registration. IEEE Trans Medical Imaging 30(3):679–693

  10. Deng H, Birdal T, Ilic S (2019) 3d local features for direct pairwise registration. In: Proceedings of the IEEE conference on computer vision and pattern recognition, pp 3244–3253

  11. Dillon NP, Siebold MA, Mitchell JE, Blachon GS, Balachandran R, Fitzpatrick JM, Webster RJ (2016) Increasing safety of a robotic system for inner ear surgery using probabilistic error modeling near vital anatomy. In: Medical imaging 2016: image-guided procedures, robotic interventions, and modeling. International Society for Optics and Photonics, vol 9786, p 97861g

  12. Dillon NP, Siebold MA, Mitchell JE, Blachon GS, Balachandran R, Fitzpatrick JM, Webster RJ III (2016) Increasing safety of a robotic system for inner ear surgery using probabilistic error modeling near vital anatomy. In: Medical imaging 2016: image-guided procedures, robotic interventions, and modeling. International Society for Optics and Photonics, vol 9786, p 97861g

  13. Fitzpatrick JM (2009) Fiducial registration error and target registration error are uncorrelated. In: Medical imaging 2009: visualization, image-guided procedures, and modeling. International Society for Optics and Photonics, vol 7261, p 726102

  14. Haidegger T, Győri S, Benyó B, Benyó Z (2010) Stochastic approach to error estimation for image-guided robotic systems. In: 2010 Annual international conference of the IEEE engineering in medicine and biology. IEEE, pp 984–987

  15. Hu Y, Gibson E, Barratt DC, Emberton M, Noble JA, Vercauteren T (2019) Conditional segmentation in lieu of image registration. In: International conference on medical image computing and computer-assisted intervention. Springer, pp 401–409

  16. Hu Y, Modat M, Gibson E, Li W, Ghavami N, Bonmati E, Wang G, Bandula S, Moore CM, Emberton M, et al. (2018) Weakly-supervised convolutional neural networks for multimodal image registration. Medical Image Analysis 49:1–13

  17. Krebs J, Delingette H, Mailhé B, Ayache N, Mansi T (2019) Learning a probabilistic model for diffeomorphic registration. IEEE Trans Medical Imaging 38(9):2165–2176

  18. Li Z, Wu L, Yu H, Ren H (2017) Kinematic comparison of surgical tendon-driven manipulators and concentric tube manipulators. Mech Mach Theory 107:148–165

  19. Luo J, Sedghi A, Popuri K, Cobzas D, Zhang M, Preiswerk F, Toews M, Golby A, Sugiyama M, Wells WM, et al. (2019) On the applicability of registration uncertainty. In: MICCAI. Springer, pp 410–419

  20. Ma B, Choi J, Huai HM (2014) Target registration error for rigid shape-based registration with heteroscedastic noise. In: Medical imaging 2014: image-guided procedures, robotic interventions, and modeling. International Society for Optics and Photonics, vol 9036, p 90360u

  21. Ma B, Moghari MH, Ellis RE, Abolmaesumi P (2010) Estimation of optimal fiducial target registration error in the presence of heteroscedastic noise. IEEE Trans Medical Imaging 29(3):708–723

  22. Ma B, Peters TM, Chen EC (2016) Estimation of line-based target registration error. In: Medical imaging 2016: image-guided procedures, robotic interventions, and modeling. International Society for Optics and Photonics, vol 9786, p 978626

  23. Ma J, Jiang X, Jiang J, Zhao J, Guo X (2019) Lmr: learning a two-class classifier for mismatch removal. IEEE Trans Image Process 28(8):4045–4059

  24. Ma J, Wu J, Zhao J, Jiang J, Zhou H, Sheng QZ (2019) Nonrigid point set registration with robust transformation learning under manifold regularization. IEEE Trans Neural Netw Learn Syst 30(12):3584–3597

  25. Ma J, Zhao J, Jiang J, Zhou H, Guo X (2019) Locality preserving matching. Int J Comput Vis 127(5):512–531

  26. Maiseli B, Gu Y, Gao H (2017) Recent developments and trends in point set registration methods. Journal of Visual Communication and Image Representation

  27. Min Z, Ren H, Meng MQ (2017) TTRE: a new type of error to evaluate the accuracy of a paired-point rigid registration. In: 2017 IEEE/RSJ international conference on intelligent robots and systems (IROS), pp 953–960

  28. Min Z, Ren H, Meng MQ (2020) Statistical model of total target registration error in image-guided surgery. IEEE Trans Autom Sci Eng 17(1):151–165

  29. Min Z, Liu L, Meng MQH (2019) Generalized non-rigid point set registration with hybrid mixture models considering anisotropic positional uncertainties. In: International conference on medical image computing and computer-assisted intervention. Springer, pp 547–555

  30. Min Z, Meng MQH (2017) General first-order TRE model when using a coordinate reference frame for rigid point-based registration. In: 2017 IEEE 14th international symposium on biomedical imaging (ISBI 2017). IEEE, pp 169–173

  31. Min Z, Meng MQH (2019) Robust generalized point set registration using inhomogeneous hybrid mixture models via expectation maximization. In: 2019 International conference on robotics and automation (ICRA). IEEE, pp 8733–8739

  32. Min Z, Ren H, Meng MQH (2017) TTRE: a new type of error to evaluate the accuracy of a paired-point rigid registration. In: 2017 IEEE/RSJ international conference on intelligent robots and systems (IROS). IEEE, pp 953–960

  33. Min Z, Wang J, Meng MQH (2018) Joint registration of multiple generalized point sets. In: International workshop on shape in medical imaging. Springer, pp 169–177

  34. Min Z, Wang J, Meng MQH (2018) Robust generalized point cloud registration using hybrid mixture model. In: 2018 IEEE International conference on robotics and automation (ICRA). IEEE, pp 4812–4818

  35. Min Z, Wang J, Meng MQH (2019) Joint rigid registration of multiple generalized point sets with hybrid mixture models. IEEE Transactions on Automation Science and Engineering

  36. Min Z, Wang J, Meng MQH (2019) Robust generalized point cloud registration with orientational data based on expectation maximization. IEEE Transactions on Automation Science and Engineering

  37. Min Z, Wang J, Song S, Meng MQH (2018) Robust generalized point cloud registration with expectation maximization considering anisotropic positional uncertainties. In: 2018 IEEE/RSJ International conference on intelligent robots and systems (IROS). IEEE, pp 1290–1297

  38. Moghari MH, Abolmaesumi P (2009) Distribution of target registration error for anisotropic and inhomogeneous fiducial localization error. IEEE Trans Medical Imaging 28(6):799–813

  39. Moghari MH, Abolmaesumi P (2010) Understanding the effect of bias in fiducial localization error on point-based rigid-body registration. IEEE Trans Medical Imaging 29(10):1730–1738

  40. Moghari MH, Ma B, Abolmaesumi P (2008) A theoretical comparison of different target registration error estimators. In: International conference on medical image computing and computer-assisted intervention. Springer, pp 1032–1040

  41. Nadeau C, Ren H, Krupa A, Dupont PE (2015) Intensity-based visual servoing for instrument and tissue tracking in 3d ultrasound volumes. IEEE Transactions on Automation Science and Engineering 12(1):367–371

  42. Picard F, Deakin AH, Riches PE, Deep K, Baines J (2019) Computer assisted orthopaedic surgery: past, present and future. Medical Engineering & Physics 72:55–65

  43. Qiu L, Zhang Y, Xu L, Niu X, Zhang Q, Zhang L (2018) Estimating maximum target registration error under uniform restriction of fiducial localization error in image guided system. IEEE Trans Med Imaging 37(4):881–892

  44. Rahimian P, Kearney JK (2015) Optimal camera placement for motion capture systems in the presence of dynamic occlusion. In: Proceedings of the 21st ACM symposium on virtual reality software and technology. ACM, pp 129–138

  45. Ren H, Kazanzides P (2011) A paired-orientation alignment problem in a hybrid tracking system for computer assisted surgery. J Intell Robotic Sys 63(2):151–161

  46. Ren H, Lim CM, Wang J, Liu W, Song S, Li Z, Herbert G, Tse ZTH, Tan Z (2013) Computer-assisted transoral surgery with flexible robotics and navigation technologies: a review of recent progress and research challenges, vol 41

  47. Ren H, Vasilyev NV, Dupont PE (2011) Detection of curved robots using 3d ultrasound. In: IROS 2011, IEEE/RSJ International conference on intelligent robots and systems, pp 2083–2089

  48. Robu MR, Ramalhinho J, Thompson S, Gurusamy K, Davidson B, Hawkes D, Stoyanov D, Clarkson MJ (2018) Global rigid registration of ct to video in laparoscopic liver surgery. International Journal of Computer Assisted Radiology and Surgery 13(6):947–956

  49. Sedghi A, Kapur T, Luo J, Mousavi P, Wells WM (2019) Probabilistic image registration via deep multi-class classification: characterizing uncertainty. In: Uncertainty for safe utilization of machine learning in medical imaging and clinical image-based procedures. Springer, pp 12–22

  50. Shamir RR, Joskowicz L, Spektor S, Shoshan Y (2009) Localization and registration accuracy in image guided neurosurgery: a clinical study. Int J Comput Assisted Radiol Surgery 4(1):45

  51. Siebold MA, Dillon NP, Webster RJ, Fitzpatrick JM (2015) Incorporating target registration error into robotic bone milling. In: Medical imaging 2015: image-guided procedures, robotic interventions, and modeling. International Society for Optics and Photonics, vol 9415, p 94150r

  52. Siebold MA, Dillon NP, Webster RJ III, Fitzpatrick JM (2015) Incorporating target registration error into robotic bone milling. In: Medical imaging 2015: image-guided procedures, robotic interventions, and modeling. International Society for Optics and Photonics, vol 9415, p 94150r

  53. Sielhorst T, Bauer M, Wenisch O, Klinker G, Navab N (2007) Online estimation of the target registration error for n-ocular optical tracking systems. In: International conference on medical image computing and computer-assisted intervention. Springer, pp 652–659

  54. Simpson AL, Ma B, Ellis RE, Stewart AJ, Miga MI (2011) Uncertainty propagation and analysis of image-guided surgery. In: Medical imaging 2011: visualization, image-guided procedures, and modeling. International Society for Optics and Photonics, vol 7964, p 79640H

  55. Simpson AL, Ma B, Vasarhelyi EM, Borschneck DP, Ellis RE, James Stewart A (2014) Computation and visualization of uncertainty in surgical navigation. Int J Med Robotics Comput Assisted Surg 10(3):332–343

  56. Sinha A, Billings SD, Reiter A, Liu X, Ishii M, Hager GD, Taylor RH (2019) The deformable most-likely-point paradigm. Medical Image Analysis 55:148–164

  57. Song S, Li Z, Ren H, Yu H (2015) Shape reconstruction for wire-driven flexible robots based on Bezier curve and electromagnetic positioning. Mechatronics 29(99):28–35

  58. Taylor RH (2020) Computer-integrated interventional medicine: a 30 year perspective. In: Handbook of medical image computing and computer assisted intervention. Elsevier, pp 599–624

  59. Thompson S, Penney G, Dasgupta P, Hawkes D (2013) Improved modelling of tool tracking errors by modelling dependent marker errors. IEEE Trans Medical Imag 32(2):165–177

  60. Thompson S, Schneider C, Bosi M, Gurusamy K, Ourselin S, Davidson B, Hawkes D, Clarkson MJ (2018) In vivo estimation of target registration errors during augmented reality laparoscopic surgery. Int J Comput Assisted Radiol Surg 13(6):865–874

  61. Vercauteren T, Unberath M, Padoy N, Navab N (2019) Cai4cai: the rise of contextual artificial intelligence in computer-assisted interventions. Proceedings of the IEEE

  62. Wang J, Qi L, Meng MQH (2015) Robot-assisted occlusion avoidance for surgical instrument optical tracking system. In: 2015 IEEE International conference on information and automation, pp 375–380

  63. Wang J, Song S, Ren H, Lim CM, Meng MQ (2018) Surgical instrument tracking by multiple monocular modules and a sensor fusion approach. IEEE Trans Autom Sci Eng: 1–11

  64. Wang J, Meng MQH, Ren H (2015) Towards occlusion-free surgical instrument tracking: a modular monocular approach and an agile calibration method. IEEE Trans Autom Sci Eng 12(2):588–595

  65. West JB, Maurer CR (2004) Designing optically tracked instruments for image-guided surgery. IEEE Trans Med Imag 23(5):533–545

  66. Wiles AD, Likholyot A, Frantz DD, Peters TM (2008) A statistical model for point-based target registration error with anisotropic fiducial localizer error. IEEE Trans Medical Imag 27(3):378–390

  67. Wiles AD, Peters TM (2007) Improved statistical TRE model when using a reference frame. In: International conference on medical image computing and computer-assisted intervention. Springer, pp 442–449

  68. Wiles AD (2009) Navigation accuracy of image-guided interventional systems. vol 70

  69. Wu K, Daruwalla ZJ, Wong KL, Murphy D, Ren H (2015) Development and selection of Asian-specific humeral implants based on statistical atlas: toward planning minimally invasive surgery. Int J Comput Assisted Radiol Surg 10(8):1333–1345

  70. Wu L, Yang X, Chen K, Ren H (2015) A minimal poe-based model for robotic kinematic calibration with only position measurements. IEEE Trans Autom Sci Eng 12(2):758–763

  71. Wu L, Yang X, Chen K, Ren H (2015) A minimal poe-based model for robotic kinematic calibration with only position measurements. IEEE Trans Autom Sci Eng 12(2):758–763

  72. Yaniv Z (2016) Registration for orthopaedic interventions. In: Computational radiology for orthopaedic interventions. Springer, pp 41–70

Funding

This project is partially supported by the Hong Kong RGC GRF grant no. 14210117, RGC NSFC/RGC Joint Research Scheme no. N_CUHK448/17, ITC ITF grant no. ITS/236/15, and Shenzhen Science and Technology Innovation projects JCYJ20170413161616163 awarded to Prof. Max Q.-H. Meng.

Author information

Corresponding author

Correspondence to Max Q.-H. Meng.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

Appendix

In this appendix, we verify that the TRE vector calculated using the proposed algorithm is consistent with that obtained using Wiles’ algorithm. In Wiles’ algorithm, the estimated TRE in a single registration is expressed in the principal axes’ frame of a rigid body. In the last row of Eq. 2, the term \({{~}_{ots}^{crf} }{\mathbf {R}^{\star }}\cdot ({~}^{ots}{\mathbf {t}\mathbf {r}\mathbf {e}_{to}({{~}^{ots}}{\mathbf {p}})})\) becomes \({{~}_{trf}^{crf} }{\mathbf {R}^{\star }}\cdot (^{trf}{\mathbf {t}\mathbf {r}\mathbf {e}_{to}({{~}^{trf}}{\mathbf {p}})})\) when Wiles’ algorithm is used, where \(^{trf}{\mathbf {t}\mathbf {r}\mathbf {e}_{to}({{~}^{trf}}{\mathbf {p}})}\) is defined as

$$ \begin{array}{ll} ^{trf}{\mathbf{t}\mathbf{r}\mathbf{e}_{to}({{~}^{trf}}{\mathbf{p}})}&=\mathbf{T}_{e,p}\cdot^{trf}{\mathbf{p}}^{\star}-^{trf}{\mathbf{p}}^{\star}\\ &=\mathbf{R}_{e,p}\cdot^{trf}{\mathbf{p}}^{\star}+\mathbf{t}_{e,p}-^{trf}{\mathbf{p}}^{\star} \end{array} $$
(12)

where the term \(\mathbf {T}_{e,p}\) represents the erroneous transformation matrix relating the OTS frame and the TRF. We can rewrite \(\mathbf {T}_{e,p}\) explicitly:

$$ \mathbf{T}_{e,p}=({{~}_{trf}^{ots}}{\mathbf{T}}^{\star})^{-1}\cdot{{~}_{trf}^{ots}}{\mathbf{T}}={{~}_{ots}^{trf}}{\mathbf{T}}^{\star}\cdot{{~}_{trf}^{ots}}{\mathbf{T}} $$
(13)

where \({{~}_{trf}^{ots}}{\mathbf {T}}^{\star }=[{{~}_{trf}^{ots}}{\mathbf {R}}^{\star },{{~}_{trf}^{ots}}{\mathbf {t}}^{\star }]\). The inverse is then:

$$ \begin{array}{ll} ({{~}_{trf}^{ots}}{\mathbf{T}}^{\star})^{-1}&=[({{~}_{trf}^{ots}}{\mathbf{R}}^{\star})^{\mathsf{T}},-({{~}_{trf}^{ots}}{\mathbf{R}}^{\star})^{\mathsf{T}}\cdot{{~}_{trf}^{ots}}{\mathbf{t}}^{\star}]\\&=[{{~}_{ots}^{trf}}{\mathbf{R}}^{\star},-{{~}_{ots}^{trf}}{\mathbf{R}}^{\star}\cdot{{~}_{trf}^{ots}}{\mathbf{t}}^{\star}] \end{array} $$
(14)

With \({{~}_{trf}^{ots}}{\mathbf {T}}=[{{~}_{trf}^{ots}}{\mathbf {R}},{{~}_{trf}^{ots}}{\mathbf {t}}]\), we obtain the explicit expression of \(\mathbf {T}_{e,p}\): \(\mathbf {T}_{e,p}=[\mathbf {R}_{e,p},\mathbf {t}_{e,p}]=[{{~}_{ots}^{trf}}{\mathbf {R}}^{\star }\cdot {{~}_{trf}^{ots}}{\mathbf {R}},{{~}_{ots}^{trf}}{\mathbf {R}}^{\star }\cdot {{~}_{trf}^{ots}}{\mathbf {t}}-{{~}_{ots}^{trf}}{\mathbf {R}}^{\star }\cdot {{~}_{trf}^{ots}}{\mathbf {t}}^{\star }]\). Substituting the expression of \(\mathbf {T}_{e,p}\) from Eq. 13 into the first row of Eq. 12, we obtain another expression for \(^{trf}{\mathbf {t}\mathbf {r}\mathbf {e}_{to}({{~}^{trf}}{\mathbf {p}})}\):

$$ \begin{array}{ll} ^{trf}{\mathbf{t}\mathbf{r}\mathbf{e}_{to}({{~}^{trf}}{\mathbf{p}})}&={{~}_{ots}^{trf}}{\mathbf{T}}^{\star}\cdot{{~}_{trf}^{ots}}{\mathbf{T}}\cdot^{trf}{\mathbf{p}}^{\star}-^{trf}{\mathbf{p}}^{\star}\\ &={{~}_{ots}^{trf}}{\mathbf{T}}^{\star}\cdot({{~}_{trf}^{ots}}{\mathbf{T}}\cdot^{trf}{\mathbf{p}}^{\star}-{{~}_{trf}^{ots}}{\mathbf{T}}^{\star}\cdot^{trf}{\mathbf{p}}^{\star}) \end{array} $$
(15)

Noticing that \(({{~}_{trf}^{ots}}{\mathbf {T}}\cdot ^{trf}{\mathbf {p}}^{\star }-{{~}_{trf}^{ots}}{\mathbf {T}}^{\star }\cdot ^{trf}{\mathbf {p}}^{\star })\) in Eq. 15 equals \(^{ots}{\mathbf {t}\mathbf {r}\mathbf {e}_{to}({{~}^{ots}}{\mathbf {p}})}\) (cf. Eq. 4), and combining this with Eq. 15, we obtain:

$$ ^{trf}{\mathbf{t}\mathbf{r}\mathbf{e}_{to}({{~}^{trf}}{\mathbf{p}})}={{~}_{ots}^{trf}}{\mathbf{R}}^{\star}\cdot^{ots}{\mathbf{t}\mathbf{r}\mathbf{e}_{to}({{~}^{ots}}{\mathbf{p}})} $$
(16)
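
Equation 16 can be checked numerically. The following sketch is illustrative only: the poses, perturbations, and target point are arbitrary example values, and the interpretations of Eqs. 4 and 12 follow the definitions quoted in this appendix (the main-text equations are not reproduced here). It builds a ground-truth pose of the TRF in the OTS frame and a slightly erroneous estimate of it, then confirms that the TRE vector expressed in the TRF equals the rotated TRE vector expressed in the OTS frame.

import numpy as np

def rot(axis, ang):
    # Rodrigues rotation matrix about a unit axis (helper for this example).
    a = np.asarray(axis, float)
    a = a / np.linalg.norm(a)
    K = np.array([[0, -a[2], a[1]], [a[2], 0, -a[0]], [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(ang) * K + (1 - np.cos(ang)) * (K @ K)

R_star = rot([1, 2, 3], 0.7)                     # {}^{ots}_{trf}R*: true TRF -> OTS rotation
t_star = np.array([100., -50., 800.])            # {}^{ots}_{trf}t*: true translation
R_meas = rot([0., 1., 1.], 1e-3) @ R_star        # slightly erroneous rotation
t_meas = t_star + np.array([0.1, -0.05, 0.2])    # slightly erroneous translation

p_trf = np.array([30., -10., 5.])                # target point in TRF coordinates

a = R_meas @ p_trf + t_meas                      # erroneous mapping into the OTS frame
b = R_star @ p_trf + t_star                      # true mapping into the OTS frame
tre_ots = a - b                                  # {}^{ots}tre_to (cf. Eq. 4)

t_inv = -R_star.T @ t_star                       # translation of the inverse true pose (Eq. 14)
tre_trf = (R_star.T @ a + t_inv) - (R_star.T @ b + t_inv)   # Eq. 15

# Eq. 16: {}^{trf}_{ots}R* is R_star transposed, so the two expressions must agree.
assert np.allclose(tre_trf, R_star.T @ tre_ots)
print(tre_trf, R_star.T @ tre_ots)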

Finally, multiplying both sides of Eq. 16 by \({{~}_{trf}^{crf} }{\mathbf {R}^{\star }}\), we obtain the following:

$$ \begin{array}{ll} {{~}_{trf}^{crf} }{\mathbf{R}^{\star}}\cdot (^{trf}{\mathbf{t}\mathbf{r}\mathbf{e}_{to}({{~}^{trf}}{\mathbf{p}})})&={{~}_{trf}^{crf} }{\mathbf{R}^{\star}}\cdot{{~}_{ots}^{trf}}{\mathbf{R}}^{\star}\cdot^{ots}{\mathbf{t}\mathbf{r}\mathbf{e}_{to}({{~}^{ots}}{\mathbf{p}})}\\ &={{~}_{ots}^{crf} }{\mathbf{R}^{\star}}\cdot (^{ots}{\mathbf{t}\mathbf{r}\mathbf{e}_{to}({{~}^{ots}}{\mathbf{p}})}) \end{array} $$
(17)

Thus far, we have shown that the expression of the composed TRE vector in the CRF space derived in this article is equal to the classical one. From Eq. 16, we can further derive the relationship between the covariance matrices of the random variables \(^{trf}{\mathbf {t}\mathbf {r}\mathbf {e}_{to}({{~}^{trf}}{\mathbf {p}})}\) and \(^{ots}{\mathbf {t}\mathbf {r}\mathbf {e}_{to}({{~}^{ots}}{\mathbf {p}})}\) as follows:

$$ cov[^{trf}{\mathbf{t}\mathbf{r}\mathbf{e}_{to}({{~}^{trf}}{\mathbf{p}})}]={{~}_{ots}^{trf}}{\mathbf{R}}^{\star}\cdot cov[^{ots}{\mathbf{t}\mathbf{r}\mathbf{e}_{to}({{~}^{ots}}{\mathbf{p}})}]\cdot{{{~}_{ots}^{trf}}{\mathbf{R}}^{\star}}^{\mathsf{T}} $$
(18)

For the sake of clarity, the covariance matrix of the composed TRE vector \(cov[{{~}_{comb}}{\mathbf {t}\mathbf {r}\mathbf {e}}{({{~}^{crf}}{\mathbf {p}})}]\) in Wiles’ algorithm is given as follows:

$$ \begin{array}{ll} &{{~}^{crf}}{{\varSigma}}_{tre,comb}({^{crf}{\mathbf{p}}})\\ &={{~}^{crf}}{{\varSigma}}_{tre,oc}({^{crf}{\mathbf{p}}})+{{~}_{trf}^{crf} }{\mathbf{R}^{\star}}\cdot cov[^{trf}{\mathbf{t}\mathbf{r}\mathbf{e}_{to}({{~}^{trf}}{\mathbf{p}})}]({{~}_{trf}^{crf} }{\mathbf{R}^{\star}})^{\mathsf{T}} \end{array} $$
(19)

where the values of \({{~}^{crf}}{{\varSigma }}_{tre,oc}({^{crf}{\mathbf {p}}})\) and \(cov[^{trf}{\mathbf {t}\mathbf {r}\mathbf {e}_{to}({{~}^{trf}}{\mathbf {p}})}]\) can be acquired directly from Wiles’ algorithm for a single registration, as the two terms are expressed in their respective principal axes’ frames. Substituting Eqs. 8 and 18 into Eq. 19, we can conclude that the expression in Eq. 19 is in fact equivalent to that in Eq. 8.
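
A sampling-free sanity check of the covariance relationships in Eqs. 18 and 19 can be written in the same style. The sketch below is illustrative only: all rotations and the two single-registration TRE covariances are made-up values rather than outputs of Wiles’ algorithm, and the Eq. 8 form (not reproduced in this excerpt) is taken to be the composition that uses the OTS-frame term directly through \({{~}_{ots}^{crf}}{\mathbf {R}^{\star }}\). It rotates a TRE covariance from the OTS frame into the TRF as in Eq. 18 and composes it as in Eq. 19, confirming that the composed covariance is the same either way.

import numpy as np

def rot(axis, ang):
    # Rodrigues rotation matrix about a unit axis (helper for this example).
    a = np.asarray(axis, float)
    a = a / np.linalg.norm(a)
    K = np.array([[0, -a[2], a[1]], [a[2], 0, -a[0]], [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(ang) * K + (1 - np.cos(ang)) * (K @ K)

# Example true rotations between the frames.
R_ots_trf = rot([1, 0, 1], 0.4)                  # {}^{ots}_{trf}R*
R_crf_trf = rot([0, 1, 2], -0.8)                 # {}^{crf}_{trf}R*
R_crf_ots = R_crf_trf @ R_ots_trf.T              # {}^{crf}_{ots}R* by composition

# Prescribed (example) covariances: tool-to-OTS TRE in the OTS frame,
# and OTS-to-CRF TRE in the CRF.
cov_tre_ots = np.diag([0.04, 0.09, 0.25])
cov_tre_oc_crf = np.diag([0.02, 0.02, 0.10])

# Eq. 18: the tool-to-OTS TRE covariance expressed in the TRF.
cov_tre_trf = R_ots_trf.T @ cov_tre_ots @ R_ots_trf

# Eq. 19 (composition via the TRF term) versus composing directly with the
# OTS-frame term through {}^{crf}_{ots}R*; both must agree.
comb_via_trf = cov_tre_oc_crf + R_crf_trf @ cov_tre_trf @ R_crf_trf.T
comb_via_ots = cov_tre_oc_crf + R_crf_ots @ cov_tre_ots @ R_crf_ots.T
assert np.allclose(comb_via_trf, comb_via_ots)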

About this article

Cite this article

Min, Z., Meng, M.QH. General first-order target registration error model considering a coordinate reference frame in an image-guided surgical system. Med Biol Eng Comput 58, 2989–3002 (2020). https://doi.org/10.1007/s11517-020-02265-y

Keywords

Navigation