
Curvature-based force estimation for an elastic tube

Published online by Cambridge University Press:  13 February 2023

Qingyu Xiao
Affiliation:
Department of Biomedical Engineering, Georgia Institute of Technology/Emory University, Atlanta, GA, USA, 30332
Xiaofeng Yang
Affiliation:
Department of Radiation Oncology, Emory University, Atlanta, GA, USA, 30322
Yue Chen*
Affiliation:
Department of Biomedical Engineering, Georgia Institute of Technology/Emory University, Atlanta, GA, USA, 30332
*Corresponding author. E-mail: yue.chen@bme.gatech.edu

Abstract

Contact force is one of the most important forms of feedback for robots to achieve accurate control and safe interaction with the environment. For continuum robots, the contact force can be estimated from shape feedback, which avoids the difficulty of mounting dedicated force sensors on a continuum robot body with strict dimension constraints. In this paper, we use local curvatures to estimate the magnitude and location of single or multiple contact forces based on Cosserat rod theory. We validate the proposed method on a thin elastic tube, with the curvatures obtained from fiber Bragg grating (FBG) sensors or image feedback. For the curvature feedback obtained from multicore FBG sensors, the overall force magnitude estimation error is $0.062 \pm 0.068$ N and the overall location estimation error is $3.51 \pm 2.60$ mm. For the curvature feedback obtained from images, the overall force magnitude estimation error is $0.049 \pm 0.048$ N and the overall location estimation error is $2.75 \pm 1.71$ mm. The results demonstrate that the curvature-based force estimation method is able to estimate the contact force accurately.

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press

1. Introduction

Continuum robots can easily change their configurations due to external contact, which affects the robot motion and eventually alters the control effort [Reference Chen, Wang, Galloway, Godage, Simaan and Barth1, Reference Xiao, Musa, Godage, Su and Chen2]. The contact force is of great significance especially in medical settings such as cardiac ablation surgery, where the ablated tissue area largely depends on the contact force [Reference Kesner and Howe3]. Much work has been done to estimate contact forces. Contact force can be measured directly by embedding miniature force sensors on the robot body [Reference Mitsuishi, Sugita and Pitakwatchara4, Reference Valdastri, Harada, Menciassi, Beccai, Stefanini, Fujie and Dario5]. However, these methods are only effective when the contact occurs at the sensor location, for instance at the tip of the robot. In addition, strict dimension constraints limit the wide application of these approaches in needle-sized continuum robots, especially in minimally invasive surgical tools whose tips are often equipped with multiple treatment or diagnostic units [Reference Alipour, Meyer, Dumoulin, Watkins, Elahi, Loew, Schweitzer, Olson and Chen6, Reference Gunderman, Schmidt, Viswanathan, Halperin, Tokuda, Seethamraju and Chen7]. Alternatively, the task-space contact force of a continuum robot can be estimated from joint-space sensor measurements. For example, Bajo et al. [Reference Bajo and Simaan8, Reference Bajo and Simaan9] proposed a method for tendon-driven continuum robot contact force sensing, but it requires a dedicated force cell at the actuation unit. Joint-space force cells may not be available for some continuum devices such as concentric tubes [Reference Chen, Poorman, Comber, Pitt, Liu, Godage, Yu, Grissom, Barth and Webster10], steerable needles [Reference Chen, Godage, Sengupta, Liu, Weaver and Barth11], or devices for MR-guided interventions [Reference Chen, Howard, Godage and Sengupta12–Reference Kwok, Lee, Chen, Wang, Hu, Chow, Zhang, Stevenson, Kwong, Luk and Schmidt15], where MRI-conditional force sensors are not readily available.

Alternatively, the force can be estimated implicitly from measurements of the continuum robot configuration change. This is possible because the intrinsic compliance of continuum robots allows the robot configuration to change under contact forces [Reference Webster and Jones16]. Many models have been proposed to predict the continuum robot shape under contact forces, including the beam-based method [Reference Stilli, Kolokotronis, Fraś, Ataka, Althoefer and Wurdemann17], the calibration method [Reference Wang and Simaan18], the variable curvature method [Reference Mahl, Hildebrandt and Sawodny19], and Cosserat rod theory [Reference Rucker, Jones and Webster20]. Contact forces, in turn, can be estimated by matching the measured shape with the model shape.

Related works have estimated the contact force using shape-based methods, which compare the measured shape with the model shape as point clouds. Qiao et al. [Reference Qiao, Borghesan, De Schutter and Poorten21] proposed a shape-based method that estimates contact forces by minimizing the difference between the measured shape and the model shape with Kalman filters. However, the estimation error is relatively large, with a mean force error of 30.59% and a mean location error of 17.20%. Aloi et al. [Reference Aloi and Rucker22] proposed a shape-based method using a Fourier transformation. This method estimates the force in a distributed fashion and can detect the force location by observing the "peaks" of the distributed force profile. However, it cannot provide the exact force magnitude and location in the case of a point contact.

Compared with shape-based methods, curvatures can provide more robust information for force estimation because they remain invariant under any translation or rotation of the continuum robot. The curvatures along the continuum robot can be measured using multicore fiber Bragg grating (FBG) sensors [Reference Gouveia, Jorge, Baptista and Frazao23, Reference Xu, Yurkewich and Patel24] or estimated from intraoperative images with curvature estimation algorithms [Reference Frette, Virnovsky and Silin25–Reference Pottmann, Wallner, Yang, Lai and Hu29]. Several curvature-based methods have been investigated for force estimation. Qiao et al. [Reference Qiao, Willems, Borghesan, Ourak, De Schutter and Poorten30] used FBG sensors to measure the curvatures along the manipulator and calculated the forces (mean error of 9.8%) and locations (mean error of 3.6%) directly using the constitutive law. Al-Ahmad et al. [Reference Al-Ahmad, Ourak, Vlekken and Poorten31] applied an unscented Kalman filter to the measured curvatures, achieving a force magnitude error of 7.3% and a location error of 4.6%. However, the location estimation error remains large because these methods cannot achieve subnode accuracy, and they require expensive FBG sensors embedded in the instrument.

In this paper, we present a curvature-based force estimation method that achieves subnode accuracy. We also show that the curvature-based method has an advantage over the shape-based method in terms of the convexity of the loss map. Moreover, a local frame representation of Cosserat rod theory is derived to compute curvatures faster by avoiding the time-consuming integration of the rotation matrix. Besides obtaining curvature feedback from FBG sensors, we also investigate the feasibility of obtaining curvatures from image feedback using the line integral method. The results show that the contact force locations and magnitudes can be accurately estimated from curvatures provided by either FBG sensors or images. The contributions of this paper include:

  • Distributed the contact force to adjacent nodes to solve the model with subnode accuracy.

  • Derived a local frame representation of Cosserat rod theory to achieve faster computation.

  • Investigated the method to compute local curvature based on image feedback.

  • Experimentally validated the proposed estimation method on single and multiple contact force cases.

The structure of the paper is as follows. Section 2 derives a local frame representation of Cosserat rod theory for curvature calculation. Section 3 describes how to solve the model with subnode accuracy and how to compute the feedback curvatures from multicore FBG sensors and from images. Section 4 presents the experimental results for force magnitude and location estimation. Finally, Section 5 concludes the paper.

2. Model curvature calculation

2.1. Review of Cosserat rod model

The Cosserat rod model describes the equilibrium state of a small segment of a thin rod subjected to internal and external distributed forces and moments. Previous works [Reference Jones, Gray and Turlapati32] have detailed the differential geometry of a thin rod with Cosserat rod theory, so the deformation can be estimated by solving the ordinary differential equations (ODEs) in (1)–(4) given external forces $\boldsymbol{F}_{e}$ and external moments $\boldsymbol{L}_e$. To distinguish variables in different frames, we use lowercase letters for variables in the local frame and uppercase letters for variables in the global frame.

(1) \begin{align} \dot{\boldsymbol{P}} = R \boldsymbol{v} \end{align}
(2) \begin{align} \dot{R} = R \hat{\boldsymbol{u}} \end{align}
(3) \begin{align} \dot{\boldsymbol{N}} = -\boldsymbol{F}_e \end{align}
(4) \begin{align} \dot{\boldsymbol{M}} = -\dot{\boldsymbol{P}}\times \boldsymbol{N} - \boldsymbol{L}_e \end{align}

where the dot in $\dot{\boldsymbol{x}}$ denotes the derivative of $\boldsymbol{x}$ with respect to the arc length $s$, the hat in $\hat{\boldsymbol{x}}$ maps the vector $\boldsymbol{x}$ to a $3\times 3$ skew-symmetric matrix, $\boldsymbol{P}$ is the shape in the global frame, $R$ is the rotation matrix of the local frame relative to the global frame, and $\boldsymbol{N}$ and $\boldsymbol{M}$ are the internal force and moment in the global frame, which obey the constitutive law

(5) \begin{align} \boldsymbol{N} = RK_{SE}\left (\boldsymbol{v}-\boldsymbol{v}^*\right ) \end{align}
(6) \begin{align} \boldsymbol{M} = RK_{BT}\left (\boldsymbol{u}-\boldsymbol{u}^*\right ) \end{align}

where $\boldsymbol{v}$ and $\boldsymbol{u}$ are the differential geometry parameters of the rod, and $\boldsymbol{v}^*$ and $\boldsymbol{u}^*$ are the corresponding parameters of the rod with no external loads. $K_{SE}$ and $K_{BT}$ are the stiffness matrices. The boundary conditions for solving the ODEs (1)–(4) are

(7) \begin{align} \boldsymbol{N}\left (loc_{tip}\right ) = \boldsymbol{F}_{tip} \end{align}
(8) \begin{align} \boldsymbol{M}\left (loc_{tip}\right ) = \boldsymbol{T}_{tip} \end{align}

where $\boldsymbol{F}_{tip}$ and $\boldsymbol{T}_{tip}$ are the external force and torque applied at the tip, and $loc_{tip}$ is the tip position in arc length. The rod shape $\boldsymbol{P}$ can thus be computed by solving the boundary value problem (BVP).
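For illustration, the following is a minimal sketch of a shooting-method solver for the global-frame BVP (1)–(4) with boundary conditions (7)–(8). It is our own sketch rather than the authors' implementation; the callables $\boldsymbol{F}_e(s)$ and $\boldsymbol{L}_e(s)$, the base pose (origin with identity orientation), and the SciPy-based solver choices are assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import fsolve

def hat(v):
    """Map a 3-vector to its 3x3 skew-symmetric matrix (the hat operator)."""
    return np.array([[0.0, -v[2],  v[1]],
                     [v[2],  0.0, -v[0]],
                     [-v[1], v[0],  0.0]])

def solve_global_bvp(L, K_SE, K_BT, v_star, u_star, F_e, L_e, F_tip, T_tip):
    """Shooting-method sketch of the global-frame Cosserat BVP (1)-(4), (7)-(8).
    State y = [P(3), R(9, row-major), N(3), M(3)]; F_e(s), L_e(s) are callables."""
    def ode(s, y):
        R = y[3:12].reshape(3, 3)
        N, M = y[12:15], y[15:18]
        v = v_star + np.linalg.solve(K_SE, R.T @ N)   # inverted Eq. (5)
        u = u_star + np.linalg.solve(K_BT, R.T @ M)   # inverted Eq. (6)
        dP = R @ v                                    # Eq. (1)
        dR = R @ hat(u)                               # Eq. (2)
        dN = -F_e(s)                                  # Eq. (3)
        dM = -np.cross(dP, N) - L_e(s)                # Eq. (4)
        return np.concatenate([dP, dR.ravel(), dN, dM])

    def tip_residual(guess):                          # guess = [N(0), M(0)]
        y0 = np.concatenate([np.zeros(3), np.eye(3).ravel(), guess])
        yL = solve_ivp(ode, (0.0, L), y0, rtol=1e-8).y[:, -1]
        return np.concatenate([yL[12:15] - F_tip, yL[15:18] - T_tip])

    base_guess = fsolve(tip_residual, np.zeros(6))    # iterate on N(0), M(0)
    y0 = np.concatenate([np.zeros(3), np.eye(3).ravel(), base_guess])
    return solve_ivp(ode, (0.0, L), y0, dense_output=True)
```

Note that each residual evaluation requires a full forward integration, so the shooting iteration is comparatively expensive; the local frame formulation derived next avoids this iteration entirely.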

2.2. ODEs for model curvature calculation

The widely accepted Cosserat rod model in (1)–(4) is expressed in the global frame, but it is beneficial to express the model in the local frame to calculate the curvatures. We first take the derivative of $\boldsymbol{M}$ in (6) and combine it with (4):

(9) \begin{equation} R \hat{\boldsymbol{u}} K_{BT}\left (\boldsymbol{u}-\boldsymbol{u}^*\right ) + R K_{BT}\dot{\boldsymbol{u}} = -\dot{\boldsymbol{P}}\times \boldsymbol{N}-\boldsymbol{L}_e \end{equation}

Then, using the relations $\dot{\boldsymbol{P}}\times \boldsymbol{N} = \hat{\dot{\boldsymbol{P}}}\boldsymbol{N}$, $\dot{\boldsymbol{P}} = R\boldsymbol{v}$, and $\hat{(R\boldsymbol{v})} = R\hat{\boldsymbol{v}}R^T$ (which holds when $R \in SO(3)$), we obtain

(10) \begin{equation} R \hat{\boldsymbol{u}}K_{BT}(\boldsymbol{u}-\boldsymbol{u}^*) + RK_{BT}\dot{\boldsymbol{u}} = -R\hat{\boldsymbol{v}}R^T \boldsymbol{N}-\boldsymbol{L}_e \end{equation}

Multiplying both sides by $R^T$ and solving for $\dot{\boldsymbol{u}}$, we have

(11) \begin{equation} \dot{\boldsymbol{u}} = -K_{BT}^{-1}\left ( \hat{\boldsymbol{u}}K_{BT}(\boldsymbol{u}-\boldsymbol{u}^*) + \hat{\boldsymbol{v}}R^T\boldsymbol{N} + R^T\boldsymbol{L}_e \right ) \end{equation}

We define the local variables $\boldsymbol{n} = R^T\boldsymbol{N}$ and $\boldsymbol{l}_e = R^T\boldsymbol{L}_e$, which are simply $\boldsymbol{N}$ and $\boldsymbol{L}_e$ expressed in the local frame. Hence, (11) becomes

(12) \begin{equation} \dot{\boldsymbol{u}} = -K_{BT}^{-1}\left ( \hat{\boldsymbol{u}}K_{BT}(\boldsymbol{u}-\boldsymbol{u}^*) + \hat{\boldsymbol{v}}\boldsymbol{n} + \boldsymbol{l}_e \right ) \end{equation}

To solve for $\boldsymbol{n}$, we multiply both sides of (3) by $R^T$ and add $\dot{R}^T\boldsymbol{N}$, which gives

(13) \begin{equation} R^T\dot{\boldsymbol{N}} + \dot{R}^T\boldsymbol{N} = -R^T\boldsymbol{F}_e + \dot{R}^T\boldsymbol{N} \end{equation}

Again, we define $\boldsymbol{f}_e = R^T\boldsymbol{F}_e$. Notice that the left-hand side of (13) is the derivative of $R^T\boldsymbol{N}$, which is $\dot{\boldsymbol{n}}$, and $\dot{R}$ on the right-hand side can be replaced using (2). Thus, we have

(14) \begin{equation} \dot{\boldsymbol{n}} = -\boldsymbol{f}_e+\hat{\boldsymbol{u}}\boldsymbol{n} \end{equation}

To sum up, the curvatures of a general rod can be calculated by

(15) \begin{align} \dot{\boldsymbol{u}} = -K_{BT}^{-1}\left ( \hat{\boldsymbol{u}}K_{BT}(\boldsymbol{u}-\boldsymbol{u}^*) + \hat{\boldsymbol{v}}\boldsymbol{n} + \boldsymbol{l}_e \right ) \end{align}
(16) \begin{align} \dot{\boldsymbol{n}} = -\boldsymbol{f}_e + \hat{\boldsymbol{u}}\boldsymbol{n} \end{align}

with boundary conditions

(17) \begin{align} \boldsymbol{u}\left(loc_{tip}\right) = \textbf{0} \end{align}
(18) \begin{align} \boldsymbol{n}\left(loc_{tip}\right) = \boldsymbol{f}_{tip} \end{align}

where $loc_{tip}$ and $\boldsymbol{f}_{tip}$ are the tip location and the external force applied at the tip, respectively. Equations (15) and (16) can be simplified in the special case of a rod that is straight ($\boldsymbol{u}^* = \textbf{0}$ and $\boldsymbol{v}^* = [0\quad 0\quad 1 ]^T$), has a circular cross-section ($K_{11} = K_{22} = K_{33}/2$), is shear-free and inextensible ($\boldsymbol{v} = [0\quad 0\quad 1 ]^T$), and carries no external moment ($\boldsymbol{l}_e=\textbf{0}$). The simplified ODEs can be written as

(19) \begin{align} \dot{u}_x &= -n_y/K_{BT,11} \end{align}
(20) \begin{align} \dot{u}_y &= n_x/K_{BT,22} \end{align}
(21) \begin{align} \dot{n}_x &= -f_x + u_y n_z \end{align}
(22) \begin{align} \dot{n}_y &= -f_y - u_x n_z \end{align}
(23) \begin{align} \dot{n}_z &= -f_z-u_yn_x + u_xn_y \end{align}

where the subscripts $x$, $y$, and $z$ denote the first, second, and third components of a vector, and $K_{BT,ij}$ is the entry of $K_{BT}$ at row $i$ and column $j$. Equations (21)–(23) are the same as (16). Note that both the integration variables and the boundary conditions are defined in the local frame, so the curvatures $u_x$ and $u_y$ can be obtained by simple backward integration from the distal point ($s = L$) to the proximal point ($s = 0$), with no iterative computation needed to solve a standard BVP. Therefore, the computational complexity is determined only by the number of nodes along the arc length. Moreover, solving (19)–(23) yields the same result as solving the model in (1)–(4), but the former is faster for curvature calculation.
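As an illustration of this single backward pass, the sketch below integrates (19)–(23) as printed, starting from the tip boundary conditions (17)–(18), with a simple Euler step. It is our own sketch, not the authors' code; the distributed-force callable `f_dist(s)` (expressed in the local frame, N/m) and the explicit Euler scheme are assumptions.

```python
import numpy as np

def curvatures_backward(L, ds, f_dist, f_tip, K11, K22):
    """Backward Euler sketch of Eqs. (19)-(23): integrate from s = L to s = 0.
    f_dist(s) -> local-frame distributed force [fx, fy, fz] (N/m) at arc length s;
    f_tip is the local-frame tip force (boundary condition (18))."""
    n_nodes = int(round(L / ds)) + 1
    u = np.zeros((n_nodes, 2))            # (u_x, u_y) stored at each node
    n = np.array(f_tip, dtype=float)      # n(L) = f_tip, Eq. (18)
    ux, uy = 0.0, 0.0                     # u(L) = 0, Eq. (17)
    for k in range(n_nodes - 1, -1, -1):
        u[k] = [ux, uy]
        fx, fy, fz = f_dist(k * ds)
        dux = -n[1] / K11                                 # Eq. (19)
        duy = n[0] / K22                                  # Eq. (20)
        dn = np.array([-fx + uy * n[2],                   # Eq. (21)
                       -fy - ux * n[2],                   # Eq. (22)
                       -fz - uy * n[0] + ux * n[1]])      # Eq. (23)
        ux -= dux * ds                    # step backward: s decreases by ds
        uy -= duy * ds
        n -= dn * ds
    return u
```

Because only one pass over the nodes is needed, the cost grows linearly with the number of nodes, which is the speed advantage claimed above.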

3. Force estimation approach

3.1. Point forces for Cosserat model

The force in Cosserat rod theory is a distributed force with units of N/m, whereas the objective of this paper is to estimate the magnitude and location of point forces with units of N. A conversion between point forces and distributed forces is therefore required. First, we collect the applied point forces into a force vector containing the locations and magnitude components of all forces:

(24) \begin{equation} \boldsymbol{f}_{vec} = \left [s^1,f^1_{pt,x},f^1_{pt,y},s^2,f^2_{pt,x},f^2_{pt,y},\ldots,s^h,f^h_{pt,x},f^h_{pt,y}\right ]^T \end{equation}

where $s^i$ is the location of the $i^{th}$ point force, $f_{pt,x}^i$ and $f_{pt,y}^i$ are the components of the $i^{th}$ point force, and $h$ is the total number of forces acting on the continuum robot. The arc length of the manipulator is divided into $q-1$ segments, which gives

(25) \begin{equation} \boldsymbol{Loc} = \left [loc_1, loc_2, \ldots, loc_q\right ] \end{equation}

where $loc_i$ is the location of the $i^{th}$ node and $\boldsymbol{Loc}$ is the full list of nodes. To achieve subnode accuracy, we convert the point forces in $\boldsymbol{f}_{vec}$ to distributed forces on the nodes $\boldsymbol{Loc}$. Figure 1 shows an example of distributing the point force $f^i_{pt}$ to the two adjacent nodes $j-1$ and $j$: the point force is distributed linearly according to the arc-length distance between the two nodes. The distributed forces are therefore calculated by

(26) \begin{align} f^{j-1} &= \frac{f^i_{pt}}{\left(loc_j - loc_{j-1}\right)^2}\left(s^i - loc_{j-1}\right) \end{align}
(27) \begin{align} f^{j} &=\frac{f^i_{pt}}{\left(loc_j - loc_{j-1}\right)^2}\left(loc_{j} - s^i\right) \end{align}

where $f^j$ (N/m) is the distributed force, $f_{pt}^i$ (N) is the point force, and $loc_{j-1}$ and $loc_j$ are the two nodes adjacent to $s^i$. Note that $s^i$ lies between the nodes $loc_{j-1}$ and $loc_j$, which allows the model to be solved with subnode accuracy. If multiple point forces are close together and distributed to the same node, the forces on that node are superposed. Once the distributed forces along the rod are known, the mechanics model is complete and the curvature can be calculated by integrating (19)–(23). A minimal sketch of this conversion is given below.
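The sketch follows (26)–(27) directly; it is our illustration, not the authors' code. The node list is assumed to be a sorted array of arc lengths, and forces assigned to the same node are simply summed.

```python
import numpy as np

def distribute_point_forces(f_vec, loc):
    """Convert f_vec = [s^1, fx^1, fy^1, ..., s^h, fx^h, fy^h] (locations in the
    same units as loc, forces in N) into nodal distributed forces (N/m)
    following Eqs. (26)-(27), superposing forces that land on the same node."""
    f_vec = np.asarray(f_vec, dtype=float)
    loc = np.asarray(loc, dtype=float)
    f_dist = np.zeros((len(loc), 2))
    for i in range(len(f_vec) // 3):
        s_i = f_vec[3 * i]
        f_pt = f_vec[3 * i + 1: 3 * i + 3]
        j = int(np.searchsorted(loc, s_i))     # loc[j-1] <= s_i <= loc[j]
        j = min(max(j, 1), len(loc) - 1)
        seg = loc[j] - loc[j - 1]
        f_dist[j - 1] += f_pt * (s_i - loc[j - 1]) / seg ** 2   # Eq. (26)
        f_dist[j] += f_pt * (loc[j] - s_i) / seg ** 2           # Eq. (27)
    return f_dist
```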

Figure 1. Distribution of point force to adjacent nodes to achieve subnode accuracy.

Figure 2. Loss maps computed from (a) shape and (b) curvature.

3.2. Curvature-based force estimation

A robust approach to force estimation is to minimize the least-squares loss between the calculated curvatures and the measured curvatures:

(28) \begin{equation} loss_u(\boldsymbol{f}_{vec}) = \lVert \boldsymbol{u}_{cal} - \tilde{\boldsymbol{u}}\rVert ^2 \end{equation}

where $\boldsymbol{u}_{cal}$ is the curvature calculated from $\boldsymbol{f}_{vec}$ and $\tilde{\boldsymbol{u}}$ is the curvature measured by sensors.

Analogously to the curvature loss, the shape loss $loss_{P}(\boldsymbol{f}_{vec})$ could also be used as the objective function of the optimization algorithm. Figure 2 shows a simulated example of the shape loss $loss_{P}(\boldsymbol{f}_{vec})$ and the curvature loss $loss_{u}(\boldsymbol{f}_{vec})$ for various force magnitudes and locations. The loss map is calculated by the following steps: (1) choose $\boldsymbol{f}_{vec} = [200\quad 0.3\quad 0 ]^T$ as the ground truth and calculate the curvature $\tilde{\boldsymbol{u}}$ and shape $\tilde{\boldsymbol{P}}$ using Cosserat rod theory; $\tilde{\boldsymbol{u}}$ and $\tilde{\boldsymbol{P}}$ are then treated as the measured sensor data. (2) Calculate $\boldsymbol{u}_{cal}$ and $\boldsymbol{P}_{cal}$ with the same method, but with $\boldsymbol{f}_{vec}$ whose first component (location) ranges from 100 to 290 mm and whose second component (force) ranges from 0 to 0.5 N. (3) Use (28) to compute the loss for each pair of force magnitude and location. As illustrated in Fig. 2, the curvature loss (b) is markedly more convex than the shape loss (a), so more robust results can be obtained [Reference Bubeck33]. Therefore, we use the curvature-based method to estimate the force.
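A sketch of the resulting estimator is given below; it is an assumed implementation, not the authors' code. `solve_curvature(f_vec)` is a hypothetical helper that distributes the point forces to the nodes, integrates (19)–(23), and returns the model curvatures at the measurement locations, and the use of SciPy's `least_squares` is our choice of optimizer.

```python
import numpy as np
from scipy.optimize import least_squares

def estimate_forces(u_meas, f_vec0, solve_curvature):
    """Minimize the curvature loss of Eq. (28) over the force vector f_vec.
    u_meas: measured curvatures; f_vec0: initial guess [s^1, fx^1, fy^1, ...];
    solve_curvature(f_vec): model curvatures at the same measurement locations."""
    def residual(f_vec):
        # stacked residuals so that the squared norm equals loss_u in Eq. (28)
        return (solve_curvature(f_vec) - u_meas).ravel()
    result = least_squares(residual, np.asarray(f_vec0, dtype=float))
    return result.x          # estimated locations and force components
```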

Figure 3. Computing the curvature using a template disk. $A_I$ is the area enclosed by the template disk and the shape, and $b$ is the radius of the template disk.

3.3. Curvature feedback

The curvature feedback of an elastic tube can be obtained from either FBG sensors or images. For curvature measurement with FBG sensors, a multicore FBG fiber is typically used to compute the curvature from the change of the Bragg wavelength [Reference Moore and Rogge34]. In this paper, the curvature measurement from the FBG sensors is obtained directly from the Shape Sensing v1.3.1 software provided by the supplier.

To obtain curvature feedback from intraoperative images, one of the major challenges is that the shape of the tube in the image is noisy, especially when the image resolution is low. This is because the tube shape can only be captured discretely by the image pixels, so the resulting tube centerline is not continuous. It is therefore impractical to compute the curvature from basic differential geometry, that is, as the norm of the derivative of the tangent direction: differentiating a noisy shape amplifies the noise and degrades the quality of the curvature feedback. Many methods have been proposed to reduce the noise of image-based curvature feedback, including segmentation methods [Reference Nguyen and Debled-Rennesson27, Reference Nguyen and Debled-Rennesson28] and integral invariant methods [Reference Lin, Chiu, Widder, Hu and Boston26, Reference Pottmann, Wallner, Yang, Lai and Hu29]. In this paper, we implemented the integral invariant method proposed in ref. [Reference Frette, Virnovsky and Silin25] to compute the curvature feedback of the elastic tube from an image, which can be described as

(29) \begin{equation} \Vert \boldsymbol{u} \Vert \approx \frac{3A_I}{b^3} - \frac{3\pi }{2b} \end{equation}

where $\Vert \boldsymbol{u} \Vert$ is the norm of the curvature at the center of the template disk, $A_I$ is the area enclosed by the template disk and the shape, and $b$ is the radius of the template disk (see Fig. 3). Note that this integral invariant method can only compute the curvature feedback of a tube undergoing in-plane bending. Special care is needed to compute curvature feedback from images of a tube with out-of-plane bending, which will be the focus of our future work.
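The snippet below sketches one way to evaluate (29) on a binary image; it is our assumption of how the template disk is applied, not the authors' code. `mask` is assumed to be the filled region bounded by the curve (as in Fig. 3), and the returned curvature is in 1/pixel, so it must be scaled by the pixel resolution to obtain 1/mm.

```python
import numpy as np

def curvature_from_disk(mask, cx, cy, b):
    """Integral-invariant curvature estimate at pixel (cx, cy) via Eq. (29).
    mask: boolean image, True inside the region bounded by the curve;
    b: template-disk radius in pixels; returns |u| in 1/pixel."""
    rows, cols = np.mgrid[0:mask.shape[0], 0:mask.shape[1]]
    disk = (cols - cx) ** 2 + (rows - cy) ** 2 <= b ** 2
    A_I = np.count_nonzero(mask & disk)          # intersection area, A_I (pixel^2)
    return 3.0 * A_I / b ** 3 - 3.0 * np.pi / (2.0 * b)
```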

4. Experiments and results

In this section, we validate the curvature-based force estimation method on single, double, and triple point-force cases, with the curvature measured by FBG sensors or calculated from image feedback.

4.1. Force estimation result using multicore FBG sensors

4.1.1. Experiment setup

As illustrated in Fig. 4, the experiments were performed by applying multiple forces to a straight Nitinol tube. An ATI force sensor (ATI Industrial Automation, United States) with a 3D-printed probe was mounted on a fixed vertical board. The probes were designed with different heights to contact the Nitinol tube at various locations and orientations (Fig. 4(c)). For each probe mounted on the force sensor, a counterpart probe (Fig. 4(a) and (b)) was also designed to keep the same distance from the probe head to the wall. This allows the force sensor with its probe to be interchanged with its counterpart such that the shape of the Nitinol tube remains unchanged. For example, the Nitinol tube in Fig. 4(a) and (b) has the same shape, but the location of the force sensor is swapped. Thus, one force sensor can be used to measure multiple external forces. A fiber Bragg grating sensor (FBGS International NV, Belgium) was inserted inside the Nitinol tube to measure the curvature.

Figure 4. Experiment setup for force estimation. The ATI force sensor with a probe is mounted on a fixed vertical wall. The FBG sensor is inserted inside the Nitinol tube for curvature measurement. A counterpart of each force-sensor probe is designed to keep the shape of the Nitinol tube in (a) and (b) the same, so the force sensor can swap locations. The different probe heights in (c) allow for 3D shape deformation.

4.1.2. System parameter calibration

The system calibration was conducted in two steps: (1) location calibration and (2) stiffness calibration. Since the FBG fiber is transparent, it is hard to identify the locations of the gratings inside the fiber, but the relative locations (spacing between adjacent gratings, 20 mm) are specified in the user manual. The objective of location calibration is to ensure that the measured curvatures are aligned with the corresponding positions on the Nitinol tube. Three single-force cases at different locations were recorded, and the location bias was minimized by adding an offset to the results:

(30) \begin{equation} s_{est} = s_{cal} + s_{bias} \end{equation}

where $s_{est}$ is the estimated location of the external force, $s_{cal}$ is the result of the model, and $s_{bias}$ is the bias that offsets the error. After location calibration, we conducted stiffness calibration to match the calculated force magnitude with the measured force magnitude. Note that the stiffness change has a negligible impact on the location estimation, so the location calibration remains valid after the stiffness calibration. The calibration results are listed in Table I.
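For a constant offset, the least-squares fit of $s_{bias}$ reduces to the mean residual over the calibration cases, as in the short sketch below (our assumption of the fitting step, not the authors' procedure).

```python
import numpy as np

def fit_location_bias(s_true, s_cal):
    """Fit s_bias in Eq. (30) from calibration cases: the constant offset that
    minimizes the squared location error is the mean of (s_true - s_cal)."""
    return float(np.mean(np.asarray(s_true) - np.asarray(s_cal)))
```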

Table I. Calibrated parameters.

4.1.3. Force estimation results

The error analysis of the force estimation results is grouped into three categories: single-force, double-force, and triple-force estimation. For each category, the experiment was performed 13 times by applying varying contact forces (0.27–1.96 N) at 13 different locations on the Nitinol tube. The step size of the ODE integration for curvature calculation is $ds = 1$ mm.

As summarized in Table II, the RMSEs of the single-, double-, and triple-force magnitude estimation are $0.037 \pm 0.069$ N, $0.060 \pm 0.046$ N, and $0.127 \pm 0.081$ N, respectively. The proposed method therefore estimates the force magnitude accurately, although a larger number of forces slightly reduces the magnitude estimation accuracy. The RMSEs of the single-, double-, and triple-force location estimation are $2.95 \pm 2.17$ mm, $2.58\pm 2.06$ mm, and $4.69\pm 4.06$ mm, respectively, indicating that the location error grows as the number of external forces increases.

Table II. Force estimation RMSE using FBG sensor feedback.

Figure 5. Experimental results for single force estimation. (a) Force magnitude estimation. (b) Force location estimation.

Figures 5 and 6 summarize the force estimation results for single and multiple forces, respectively. The blue dots are the predicted results and the red dots are the measured results. Most of the cases in Fig. 5 are predicted accurately except case 1 (magnitude error of 8.72%). This error occurs because the curvature can only be measured by the first few gratings (the spacing between adjacent gratings is 20 mm), while the remaining gratings read zero because no internal moment exists beyond the location where the force is applied. In Fig. 6, each case has two or three values, corresponding to the two or three applied forces. The red and blue dots align well with each other, which demonstrates the accuracy of the proposed estimation method.

Figure 6. Experimental results for double and triple force estimation. (a) Force magnitude estimation. (b) Force location estimation. During each experiment, the circle marker indicates the first force, the diamond marker the second force, and the triangle marker the third force (triple force scenario).

4.2. Force estimation result using image feedback

In this subsection, we investigate the potential application of the curvature-based force estimation method in image-guided interventions. We assume the continuum robot undergoes in-plane bending so that its shape can be captured by a single image. The 3D shape of a continuum robot with out-of-plane bending could, however, be reconstructed from images captured at multiple viewing angles or from 3D imaging techniques such as MRI [Reference Kwok, Chen, Chau, Luk, Nilsson, Schmidt and Zion35] or 3D ultrasound imaging [Reference Dai, Lei, Roper, Chen, Bradley, Curran, Liu and Yang36].

4.2.1. Workflow to estimate the force from images

The process to estimate the contact force can be summarized as: (a) continuum robot centerline detection; (b) computation of the curvature feedback from the detected centerline; and (c) estimation of the contact force from the curvature feedback. Here, we assume the continuum robot shape can be accurately identified via various image segmentation techniques [Reference Manakov, Kolpashchikov, Danilov, Nikita Vitalievich Laptev and Gerget37] or by mounting active tracking coils on the continuum device [Reference Chen, Howard, Godage and Sengupta12, Reference Yue Chen, Wang, Kwong, Stevenson and Schmidt38], so we skip the image segmentation step in this preliminary study. To obtain this idealized continuum robot shape from image feedback, we first deform the Nitinol tube with a random force, overlay the deformed tube (Fig. 7(a)) on a 128 × 170 grid with a pixel resolution of 0.5 mm × 0.5 mm (the ultrasound imaging resolution), and then threshold the grid (Fig. 7(b)) to obtain Fig. 7(c). From the resulting tube image, the centerline of the continuum robot (Fig. 7(d)) is detected with the skeletonization algorithm proposed in refs. [Reference Kerschnitzki, Kollmannsberger, Burghammer, Duda, Weinkamer, Wagermaier and Fratzl39, Reference Lee, Kashyap and Chu40], a well-accepted image processing algorithm in computer vision. A sample continuum robot image and the corresponding skeletonization result are presented in Fig. 7(d). After detecting the centerline, the curvature feedback of the robot is computed through (29). Finally, the contact force is estimated by feeding the computed curvature feedback to the curvature-based force estimation method.
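A sketch of steps (a) and (b) of this workflow is shown below. It uses scikit-image's skeletonization routine as a stand-in for the thinning algorithm cited above and is our own illustration, not the authors' code; the grayscale input and the threshold value are assumptions.

```python
import numpy as np
from skimage.morphology import skeletonize

def centerline_pixels(gray, threshold=0.5):
    """Threshold the tube image into a binary mask and extract a one-pixel-wide
    centerline; returns the (row, col) coordinates of the centerline pixels."""
    mask = gray > threshold          # binary tube silhouette (cf. Fig. 7(c))
    skeleton = skeletonize(mask)     # one-pixel-wide centerline (cf. Fig. 7(d))
    rows, cols = np.nonzero(skeleton)
    return np.column_stack([rows, cols])
```

The curvature along the detected centerline can then be evaluated with (29) and passed to the force estimator.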

Figure 7. (a) Shape of the deformed rod. (b) The blurred grid within the threshold. (c) The resulting image of the deformed rod. (d) Centerline (red pixels) detection using skeletonization algorithm.

Figure 8. Curvatures computed from the image.

4.2.2. Force estimation results

As presented in Fig. 8, the curvature feedback can be computed accurately from the image data, which is a critical step for the curvature-based force estimation method. After obtaining the curvature feedback, we use it to estimate the force magnitudes and the corresponding locations. The step size of the ODE integration for curvature calculation is $ds = 1$ mm. The results of the curvature-based force estimation are presented in Figs. 9 and 10. As summarized in Table III, the overall force magnitude estimation error is $0.049 \pm 0.048$ N and the overall location estimation error is $2.75 \pm 1.71$ mm. For single-force estimation, the force magnitude error is $0.045 \pm 0.040$ N and the location error is $2.20 \pm 1.73$ mm. For double-force estimation, the force magnitude error is $0.051 \pm 0.048$ N and the location error is $3.11 \pm 1.64$ mm. For triple-force estimation, the force magnitude error is $0.050 \pm 0.037$ N and the location error is $2.59 \pm 1.41$ mm. The results show that the curvature-based method estimates the force locations accurately for both single and multiple forces, whereas the force magnitude accuracy degrades as more forces act on the rod. Although moderate errors exist, these results validate the feasibility of applying the curvature-based force estimation method with image data to estimate contact forces.

Table III. Force estimation RMSE using image feedback.

Figure 9. Single force estimation result based on curvature computed from image. (a) Force magnitude estimation. (b) Force location estimation.

Figure 10. Experimental results for double and triple forces estimation based on image feedback. (a) Force magnitudes estimation. (b) Force locations estimation. During each experiment, the circle marker indicates the first force, diamond marker indicates the second force, and the triangle marker indicates the third force (triple force scenario).

5. Conclusions

In this article, we developed a curvature-based force estimation method to estimate the contact force magnitude and location. The proposed method was validated on a straight 290 mm Nitinol tube with single or multiple forces applied at different locations, and the curvature feedback was obtained from multicore FBG sensors and from images. The results show that the model estimates the force magnitude and location accurately. For the curvature feedback obtained from multicore FBG sensors, the overall force magnitude estimation error is $0.062 \pm 0.068$ N and the overall location estimation error is $3.51 \pm 2.60$ mm. For the curvature feedback obtained from images, the overall force magnitude estimation error is $0.049 \pm 0.048$ N and the overall location estimation error is $2.75 \pm 1.71$ mm.

However, accurate force estimation relies on reliable curvature calculation of the deformed continuum robot. One of the main challenges for practical implementation is accurately extracting the centerline from multiple images when the robot undergoes 3D deformation: 3D shape reconstruction may suffer from larger noise or error caused by multicamera calibration, and computing the curvature from a 3D reconstructed shape remains challenging. Therefore, our future work will study 3D contact force estimation, machine learning methods for curvature calculation, and evaluation of the force estimation performance in clinical settings with 3D image feedback such as 3D ultrasound and magnetic resonance imaging.

Author contributions

Qingyu Xiao, Xiaofeng Yang, and Yue Chen designed the study. Qingyu Xiao conducted data gathering. Qingyu Xiao and Yue Chen wrote the article.

Financial support

This research was partially funded by Georgia Institute of Technology Startup Grant.

Conflict of interest

Not applicable.

Ethical standards

Not applicable.

References

Chen, Y., Wang, L., Galloway, K., Godage, I., Simaan, N. and Barth, E., “Modal-based kinematics and contact detection of soft robots,” Soft Robot. 8(3), 298–309 (2021).
Xiao, Q., Musa, M., Godage, I. S., Su, H. and Chen, Y., “Kinematics and stiffness modeling of soft robot with a concentric backbone,” J. Mech. Robot. 15(5), 051011 (2023).
Kesner, S. B. and Howe, R. D., “Robotic catheter cardiac ablation combining ultrasound guidance and force control,” Int. J. Robot. Res. 33(4), 631–644 (2014).
Mitsuishi, M., Sugita, N. and Pitakwatchara, P., “Force-feedback augmentation modes in the laparoscopic minimally invasive telesurgical system,” IEEE/ASME Trans. Mechatron. 12(4), 447–454 (2007).
Valdastri, P., Harada, K., Menciassi, A., Beccai, L., Stefanini, C., Fujie, M. and Dario, P., “Integration of a miniaturised triaxial force sensor in a minimally invasive surgical tool,” IEEE Trans. Biomed. Eng. 53(11), 2397–2400 (2006).
Alipour, A., Meyer, E. S., Dumoulin, C. L., Watkins, R. D., Elahi, H., Loew, W., Schweitzer, J., Olson, G., Chen, Y., Tao, S. and Guttman, M., “MRI conditional actively tracked metallic electrophysiology catheters and guidewires with miniature tethered radio-frequency traps: Theory, design, and validation,” IEEE Trans. Biomed. Eng. 67(6), 1616–1627 (2019).
Gunderman, A. L., Schmidt, E. J., Viswanathan, A. N., Halperin, H. R., Tokuda, J., Seethamraju, R. T. and Chen, Y., “MR-Guided Tissue Puncture with On-line Imaging for High-resolution Theranostics,” In: 2020 International Symposium on Medical Robotics (ISMR) (IEEE, 2020) pp. 57–61.
Bajo, A. and Simaan, N., “Finding Lost Wrenches: Using Continuum Robots for Contact Detection and Estimation of Contact Location,” In: 2010 IEEE International Conference on Robotics and Automation (IEEE, 2010) pp. 3666–3673.
Bajo, A. and Simaan, N., “Kinematics-based detection and localization of contacts along multisegment continuum robots,” IEEE Trans. Robot. 28(2), 291–302 (2011).
Chen, Y., Poorman, M. E., Comber, D. B., Pitt, E. B., Liu, C., Godage, I. S., Yu, H., Grissom, W. A., Barth, E. J. and Webster, R. J. III, “Treating Epilepsy Via Thermal Ablation: Initial Experiments with an MRI-Guided Concentric Tube Robot,” In: Frontiers in Biomedical Devices, vol. 40672 (American Society of Mechanical Engineers, 2017) p. V001T02A002.
Chen, Y., Godage, I. S., Sengupta, S., Liu, C. L., Weaver, K. D. and Barth, E. J., “MR-conditional steerable needle robot for intracerebral hemorrhage removal,” Int. J. Comput. Ass. Rad. 14(1), 105–115 (2019).
Chen, Y., Howard, J., Godage, I. and Sengupta, S., “Closed loop control of an MR-conditional robot with wireless tracking coil feedback,” Ann. Biomed. Eng. 47(11), 2322–2333 (2019).
Gunderman, A., Schmidt, E., Morcos, M., Tokuda, J., Seethamraju, R. T., Halperin, H., Viswanathan, A. and Chen, Y., “MR-tracked deflectable stylet for gynecologic brachytherapy,” IEEE/ASME Trans. Mechatron. 27(1), 407–417 (2021).
Gunderman, A. L., Sengupta, S., Siampli, E., Sigounas, D., Kellner, C., Oluigbo, C., Sharma, K., Godage, I., Cleary, K. and Chen, Y., “A surgical platform for intracerebral hemorrhage robotic evacuation (ASPIHRE): A non-metallic MR-guided concentric tube robot,” arXiv preprint arXiv:2206.09848 (2022).
Kwok, K. W., Lee, K. H., Chen, Y., Wang, W., Hu, Y., Chow, G. C., Zhang, H. S., Stevenson, W. G., Kwong, R. Y., Luk, W. and Schmidt, E. J., “Interfacing fast multi-phase cardiac image registration with MRI-based catheter tracking for MRI-guided electrophysiological ablative procedures,” Circulation 130(Suppl. 2), A18568 (2014).
Webster, R. J. III and Jones, B. A., “Design and kinematic modeling of constant curvature continuum robots: A review,” Int. J. Robot. Res. 29(13), 1661–1683 (2010).
Stilli, A., Kolokotronis, E., Fraś, J., Ataka, A., Althoefer, K. and Wurdemann, H. A., “Static Kinematics for an Antagonistically Actuated Robot Based on a Beam-Mechanics-Based Model,” In: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE, 2018) pp. 6959–6964.
Wang, L. and Simaan, N., “Geometric calibration of continuum robots: Joint space and equilibrium shape deviations,” IEEE Trans. Robot. 35(2), 387–402 (2019).
Mahl, T., Hildebrandt, A. and Sawodny, O., “A variable curvature continuum kinematics for kinematic control of the bionic handling assistant,” IEEE Trans. Robot. 30(4), 935–949 (2014).
Rucker, D. C., Jones, B. A. and Webster, R. J. III, “A geometrically exact model for externally loaded concentric-tube continuum robots,” IEEE Trans. Robot. 26(5), 769–780 (2010).
Qiao, Q., Borghesan, G., De Schutter, J. and Poorten, E. V., “Force from shape—Estimating the location and magnitude of the external force on flexible instruments,” IEEE Trans. Robot. 37(5), 1826–1833 (2021).
Aloi, V. A. and Rucker, D. C., “Estimating Loads Along Elastic Rods,” In: 2019 International Conference on Robotics and Automation (ICRA) (IEEE, 2019) pp. 2867–2873.
Gouveia, C., Jorge, P. A. S., Baptista, J. M. and Frazao, O., “Temperature-independent curvature sensor using FBG cladding modes based on a core misaligned splice,” IEEE Photonic. Tech. Lett. 23(12), 804–806 (2011).
Xu, R., Yurkewich, A. and Patel, R. V., “Curvature, torsion, and force sensing in continuum robots using helically wrapped FBG sensors,” IEEE Robot. Automat. Lett. 1(2), 1052–1059 (2016).
Frette, O. I., Virnovsky, G. and Silin, D., “Estimation of the curvature of an interface from a digital 2D image,” Comp. Mater. Sci. 44(3), 867–875 (2009).
Lin, W.-Y., Chiu, Y.-L., Widder, K. R., Hu, Y. H. and Boston, N., “Robust and accurate curvature estimation using adaptive line integrals,” Eurasip J. Adv. Sig. Process. 1(1), 20102014 (2010).
Nguyen, T. P. and Debled-Rennesson, I., “Curvature Estimation in Noisy Curves,” In: International Conference on Computer Analysis of Images and Patterns (Springer, 2007) pp. 474–481.
Nguyen, T. P. and Debled-Rennesson, I., “Curvature and Torsion Estimators for 3D Curves,” In: International Symposium on Visual Computing (Springer, 2008) pp. 688–699.
Pottmann, H., Wallner, J., Yang, Y.-L., Lai, Y.-K. and Hu, S.-M., “Principal curvatures from the integral invariant viewpoint,” Comput. Aided Geom. Des. 24(8-9), 428–442 (2007).
Qiao, Q., Willems, D., Borghesan, G., Ourak, M., De Schutter, J. and Poorten, E. V., “Estimating and Localizing External Forces Applied on Flexible Instruments by Shape Sensing,” In: 2019 19th International Conference on Advanced Robotics (ICAR) (IEEE, 2019) pp. 227–233.
Al-Ahmad, O., Ourak, M., Vlekken, J. and Poorten, E. V., “FBG-based estimation of external forces along flexible instrument bodies,” Front. Robot. AI 8, 718033 (2021).
Jones, B. A., Gray, R. L. and Turlapati, K., “Three Dimensional Statics for Continuum Robotics,” In: 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems (IEEE, 2009) pp. 2659–2664.
Bubeck, S., “Convex optimization: Algorithms and complexity,” arXiv preprint arXiv:1405.4980 (2014).
Moore, J. P. and Rogge, M. D., “Shape sensing using multi-core fiber optic cable and parametric curve solutions,” Opt. Exp. 20(3), 2967–2973 (2012).
Kwok, K.-W., Chen, Y., Chau, T. C. P., Luk, W., Nilsson, K. R., Schmidt, E. J. and Zion, T. T., “MRI-based visual and haptic catheter feedback: Simulating a novel system's contribution to efficient and safe MRI-guided cardiac electrophysiology procedures,” J. Cardiov. Magn. Reson. 16(1), 13 (2014).
Dai, X., Lei, Y., Roper, J., Chen, Y., Bradley, J. D., Curran, W. J., Liu, T. and Yang, X., “Deep learning-based motion tracking using ultrasound images,” Aip. Conf. Proc. 48(12), 7747–7756 (2021).
Manakov, R. A., Kolpashchikov, D. Y., Danilov, V. V., Nikita Vitalievich Laptev, I. P. S. and Gerget, O. M., “Visual Shape and Position Sensing Algorithm for a Continuum Robot,” In: IOP Conference Series: Materials Science and Engineering, vol. 1019 (IOP Publishing, 2021) p. 012066.
Yue Chen, T. T. Z., Wang, W., Kwong, R. Y., Stevenson, W. G. and Schmidt, E. J., “Intra-cardiac MR imaging & MR-tracking catheter for improved MR-guided EP,” J. Cardiov. Magn. Reson. 17(1), 12 (2015).
Kerschnitzki, M., Kollmannsberger, P., Burghammer, M., Duda, G. N., Weinkamer, R., Wagermaier, W. and Fratzl, P., “Architecture of the osteocyte network correlates with bone material quality,” J. Bone Miner. Res. 28(8), 1837–1845 (2013).
Lee, T.-C., Kashyap, R. L. and Chu, C.-N., “Building skeleton models via 3-D medial surface axis thinning algorithms,” CVGIP: Graph. Models Image Process. 56(6), 462–478 (1994).