1. Introduction
An increasing number of people suffer from arthritis or other chronic joint diseases. For example, in the United States, the number of people aged 65 and older with these conditions is expected to exceed 41 million by 2030 [
1]. Among them, knee arthritis contributes considerably to morbidity and results in a reduced quality of life. The primary surgical treatment for these patients is total knee arthroplasty (TKA), namely, the replacement of the worn knee joint with a prosthetic implant [
2,
3,
4,
5,
6]. As shown in
Figure 1, the TKA implant is composed of three components: the femoral component, the tibia component and the polyethylene plastic spacer.
A successful TKA surgery improves knee function and relieves knee joint pain. However, improper placement of the knee prosthesis may accelerate the wear of the plastic spacer, which results in a reduced working life and causes severe pain to patients [
2]. The success of a TKA surgery depends on several factors, including the appropriate alignment of the components, the rotational congruency between the prosthetic proximal tibia and the prosthetic femoral condyles, and the ligamentous balance of the knee joint [
3]. Recent research [
4] shows that the overall TKA failure rate is about 2.3%, and more than 74% of TKA failures are caused by non-infection reasons such as instability, aseptic loosening and stiffness. In fact, most TKA failures are directly or indirectly caused by operative mechanical factors such as malalignment [
5]. Traditionally, the quality of the TKA surgery is determined exclusively by the experience of the surgeons. Digitally guided assistance equipment would therefore be valuable for improving the surgery quality and reducing the failure rate caused by operative factors.
Many efforts have been made to develop the instrumentation of the knee joint prosthesis, for the purpose of surgery quality evaluation or long-term monitoring. Some investigations focused on measuring the contact force inside the joint prosthesis under different conditions [
7,
8,
9,
10,
11,
12,
13,
14,
15]. In [
7,
8,
9], a tibia tray was designed to measure the six load components in TKA. In-vivo experiments under walking and stair-climbing conditions were conducted during the postoperative follow-up of 6 to 10 months, in which the contact forces of the prosthesis components were measured for five subjects. Similarly, a set of force-sensing components was used in [
10,
11,
12,
13] to measure the contact forces for three subjects during exercise and recreational activities after the TKA surgery, and the in-vivo experiments validated the effectiveness of the method. An implantable tibia prosthesis with multiaxial force sensing was implemented and reported in [
13]. In [
14], a sensing system inside the knee implant for the force measurement was designed, fabricated and tested. The sensing system could be used to measure the contact force up to 1.5 times the body weight, and it was suitable for long-term monitoring. In [
15,
16,
17], an instrumented smart knee prosthesis for the in-vivo measurement of the contact force and kinematics was proposed. This system could be used to monitor the knee prosthesis after the implantation with three magnetic sensors and a permanent magnet, and to prevent the possible damage to the prosthesis by detecting the load imbalance or abnormal forces and kinematics in the knee prosthesis. In [
18], a knee implant fatigue monitoring system using a floating-gate sensor array was introduced for long-term, battery-less fatigue monitoring. All the above-mentioned designs were successful in measuring the knee prosthesis contact force, and some could also measure the kinetic movement of the knee prosthesis. Nevertheless, they were designed to monitor the knee prosthesis after the TKA surgery. It is equally meaningful to develop techniques that improve the implantation quality during the TKA surgery itself. The motivation of this work is to provide a device that helps surgeons improve the TKA surgery quality during the operation. Such a device is used only during the TKA surgery, for a measurement period of less than one hour.
Some efforts have been made to provide the auxiliary methods to help the surgeons to improve the quality in the surgery procedure. In [
19], a “VERASENSE Knee System” is presented, in which an array of sensors provides the dynamic, intraoperative feedback regarding tibiofemoral position and quantitative pressure at peak contact points in the medial and lateral compartments in the TKA surgery. The kinematic tracking can also be assessed. In [
20], a wireless knee joint force measurement system is proposed to increase the accuracy of the ligament balancing procedure. In [
21], a force amplitude- and location-sensing device has been designed to improve the ligament balancing procedure in TKA. However, the above-mentioned systems only acquire single-modal sensor data, which may miss some important information. The sensing system can be made more comprehensive with additional sensor data such as image data.
In this work, a wireless visualized sensing system has been proposed and implemented for multimodal signal sensing inside the knee joint prosthesis. The system is used during the surgery as a trial component for adjustment and calibration of the implant, without reforming the standard knee prosthesis or changing the standard clinical procedures. The proposed system consists of a small-sized sensing device, a wireless data receiver and a data processing workstation. The system can be used to acquire both the direct images and the contact force distribution inside the prosthesis. The real-time images can help the surgeons directly see and understand the real situation in the joint. More importantly, the image sensors have a higher resolution than piezoelectric, magneto-resistive and other types of physical sensors, so the proposed system can acquire much richer information about the knee joint prosthesis through vision sensing.
The proposed sensing device is used during the surgery while the patient lies flat. Once the sensing device is placed in the knee joint, the surgeons move the shin manually, and the device records the inside images and the contact forces. Further signal processing reconstructs the instant pose and the kinematic trajectory of the relative movement between the femoral component and the spacer. The surgeons can use the obtained information to calibrate the position of the components in the joint prosthesis. During the surgery, the sensing device is temporarily inserted into the knee implant as a trial component and is replaced by the real spacer component once the femoral component and tibia component are fitted to the proper position. The influence on the TKA procedure is therefore quite small, and TKA surgeons can easily use it within the standard clinical procedures. In general, the proposed sensing device cannot be used to monitor the progression of the prosthesis over time after the surgery, partly due to the limited battery lifetime given the high power consumption associated with image data acquisition and processing.
The rest of the paper is organized as follows. In
Section 2, the proposed system architecture and the design considerations are presented.
Section 3 describes the hardware design details of the sensing device as well as the wireless data receiver. In
Section 4, the detailed data processing procedures and algorithms will be introduced.
Section 5 gives the experimental results. This work is concluded in
Section 6.
2. Design Considerations and System Design
The proposed wireless sensing measurement system consists of three parts, namely, the multimodal sensing device, the wireless data receiver, and the workstation, as shown in
Figure 2.
The small-sized multimodal signal sensing device has the same shape and size as the real spacer of the knee joint prosthesis. During the surgery, before the final placement of the real spacer, the sensing device is placed in its position inside the joint prosthesis as a trial of the spacer. The sensing device can acquire two types of data: (1) the images of the femoral component viewed from the spacer side; and (2) the contact force distribution between the spacer and the femoral component. To facilitate the image data processing, some simple but easily recognizable markers are printed on the surface of the femoral component.
The wireless data receiver is used to receive the sensing data acquired by the sensing device through a 416 MHz wireless link. The data receiver is then connected to the workstation through a USB cable. The signal processing is carried out in the workstation, which recognizes the real-time relative position between the spacer and the femoral component from the multimodal sensing data. The algorithm implemented in the workstation can also reconstruct the 3D kinematic trajectory of the femoral component with respect to the spacer. The surgeons can use the reconstructed information to evaluate whether the knee joint prosthesis is properly installed, and make adjustments if necessary.
Generally speaking, the force data acquired through this system is only 2D data, and it is difficult to obtain the 3D information from the force data only. The image data is also 2D data, but the high-resolution images also contain the depth information. With the proposed kinetic pose estimation algorithm in this work, the presented system can output a 3D motion trajectory which is not available from other systems that acquire the force data only.
There are some abnormal TKA situations that cannot be identified using the force data only, and there are also some abnormal situations that cannot be detected only using the image data.
Figure 3 gives some examples to show the advantage of multimodal sensing data over single-modal data. As shown in
Figure 3a, if the knee joint is appropriately installed, the spacer and the femoral component are well aligned with a certain relative pose in both the static and kinetic conditions, and the ligaments on the two sides of the knee joint are well balanced. The sensing device presented in [
21] could be used to determine if the ligament is well balanced by comparing the contact forces of the two sides, since it could only measure the contact force distribution between the spacer and the femoral component. Such a device could detect the inappropriate condition as shown in
Figure 3b, in which the spacer is tilted, and the contact force distribution is not in equilibrium when the patient stands straight. However, the sensing device in [
21] could not report the inappropriate situation as shown in
Figure 3c. In this situation, the contact forces on the two sides of the knee joint are equal, and therefore the device in [
21] would incorrectly judge it as a good situation, although the spacer and the femoral component are misaligned. Nevertheless, the malalignment can be detected by processing the direct images taken inside the joint using the multimodal sensing device presented in this work. In general, the surgery flaws shown in both
Figure 3b,c can be detected using the presented multimodal sensing system.
3. Hardware Implementation
The hardware implementation of the sensing system is described in this section.
3.1. Multimodal Sensing Device
The functional diagram of the image and force sensing device is shown in
Figure 4. It consists of five key functional blocks, i.e., the image sensor, the force sensor array together with an analog-to-digital converter (ADC) for signal conversion, the wireless data transmitter, the sensor interface chip to bridge the sensors and the transmitter, and the microcontroller (MCU) for the system initialization. There are four white LEDs distributed evenly around the image sensor lens. These four LEDs provide the necessary lighting for the image sensing, since the sensing device works in the dark or dim environment inside the knee joint. The illuminance can be tuned by adjusting the driving current of the LEDs to avoid overexposure or underexposure.
The MSP430 MCU by Texas Instruments Inc. provides the initial settings for all the other chips in the sensing device during the power-up phase. Since the wireless data transmitter and the sensors operate autonomously, the MCU is programmed to power down to reduce power consumption once the device starts full operation.
The OV7670 image sensor used in this device is a CMOS sensor manufactured by Omnivision with a maximum resolution of 640 × 480. In this application, it can be programmed to output images with 480 × 480 or 240 × 240 resolution. A view angle of 140° is achieved by using a specially designed wide-angle macro lens. Since the image sensor is very close to the femoral component, this wide view angle macro lens can guarantee that most of the surface of the femoral component can be “seen” by the image sensor.
The sensing device contains six low-profile force sensors manufactured by Honeywell based on silicon-implanted piezoresistors. Each sensor has a measurement range of 0–45 N. A 6-channel 24-bit ΣΔ ADC by ADI is used to quantize the force sensors’ output.
The wireless data transmitter chip is a highly integrated system-on-a-chip (SoC) which is an improved design of that reported in [
22]. The ultra-low power SoC was originally designed to transmit the image sensor data. It is mainly composed of a minimum shift keying (MSK) transmitter working in the 400 MHz band and an image data compressor. In this application, the transmitter is configured to work at 416 MHz, and it provides a raw data rate of 3 Mbps. The image data compressor provides a near-lossless data compression ratio of ~3, which helps to boost the transmission frame rate. In this application, the transmitter SoC can be configured to transmit 480 × 480 images at a frame rate of ~3 fps, or 240 × 240 images at a frame rate of 6–8 fps. The SoC contains one charge-pump boost regulator and three programmable low-dropout (LDO) linear regulators, which provide 4 V, 2.5 V, 1.8 V and 1.2 V power supplies to all the other circuits in the SoC and the other functional blocks in the sensing device. The 4 V power supply is used to drive the four white LEDs, which serve as the flash lights for image sensing. A coil antenna with −8.9 dBi peak gain and a voltage standing wave ratio (VSWR) of less than 2.0 at the 416 MHz center frequency is used, with proper matching between the antenna and the RF power amplifier. The RF transmitter has an energy efficiency of 1.3 nJ/bit, not including the power amplifier.
Since the wireless transmitter SoC was originally designed to transmit the image data, it can only receive the data from the image sensor through an 8-bit parallel data port. The dedicated sensor interface chip is used to pack the sensing data from the force sensor array and the image sensor into the format of 8-bit parallel image data, which is readable by the SoC. In practice, the first four data rows of each 480 × 480 or 240 × 240 image are replaced by the force sensor data with careful consideration of data synchronization. From the point of view of the transmitter SoC, it simply receives and transmits the “image” data from the interface chip. By doing so, each image frame loses its first four rows, which is acceptable considering that each image contains at least 240 rows.
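As an illustration of this packing scheme, the following Python sketch shows how six 24-bit force words could be embedded into the first rows of a frame and recovered on the receiver side. It is a minimal sketch under assumed parameters (frame size, byte order and row layout are placeholders), not the interface chip's actual logic.

```python
import numpy as np

FRAME_H, FRAME_W = 240, 240   # assumed image size (one of the two supported modes)
NUM_FORCE_CH = 6              # six force sensors
ADC_BYTES = 3                 # 24-bit ADC word -> 3 bytes per channel

def pack_frame(image, force_words):
    """Replace the first four rows of a grayscale frame with the force sensor data.

    image       : (FRAME_H, FRAME_W) uint8 array from the image sensor
    force_words : list of 6 non-negative integers, each a 24-bit ADC code
    """
    frame = image.copy()
    payload = bytearray()
    for code in force_words:
        payload += int(code).to_bytes(ADC_BYTES, "big")     # 6 x 3 = 18 bytes
    header = np.zeros(4 * FRAME_W, dtype=np.uint8)           # four rows reserved for sensor data
    header[:len(payload)] = list(payload)
    frame[:4, :] = header.reshape(4, FRAME_W)
    return frame

def unpack_frame(frame):
    """Receiver side: recover the force codes and strip the first four rows."""
    raw = frame[:4, :].reshape(-1)[:NUM_FORCE_CH * ADC_BYTES]
    forces = [int.from_bytes(bytes(raw[i * ADC_BYTES:(i + 1) * ADC_BYTES]), "big")
              for i in range(NUM_FORCE_CH)]
    return forces, frame[4:, :]
```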
The sensing device is supplied by a 3 V lithium manganese battery with a capacity of 170 mAh. There is a potential risk in using this type of battery since it is not dedicated to medical applications. However, this battery is only used for the experiments before the clinical trial. In the future, the lithium manganese battery can be replaced by a safer battery, such as a lithium/iodine cell, for the real medical product. The currently used battery has a peak current of 30 mA, which occurs when the image sensor is enabled, with the image sensor drawing 15 mA from a regulated 2.5 V supply and the flash lights consuming ~14 mA. The transmitter SoC has a peak current of ~6 mA. Note that the image sensor and the transmitter are not enabled simultaneously to avoid an excessive peak current. The force sensors and the ADC have a peak current of less than 1 mA. Overall, the sensing device consumes a peak current of ~30 mA.
Since all the circuits in the sensing device are powered on only when necessary, the sensing device draws an average current of ~10 mA from the 3 V supply. Roughly 40% of the total average current is contributed by the transmitter SoC, and almost all of the remaining 60% is contributed by the image sensor together with its flash lights. The force sensors and the following ADC are enabled at a very low duty cycle, and their contribution to the average current is less than 1%.
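As a rough back-of-the-envelope check (ignoring peak loads and battery derating), the 170 mAh battery and the ~10 mA average current correspond to an operating time on the order of 170 mAh / 10 mA ≈ 17 h, which comfortably covers the less-than-one-hour intraoperative measurement described in the introduction.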
As shown in
Figure 5a, the force sensors and the image sensor reside on the same printed circuit board (PCB) which has the same outline as the spacer. The PCB is sealed inside a transparent shell which is composed of the upper shell and lower shell. The central region of the upper shell is transparent and polished to ensure the image quality. The entire sensing device is shown in
Figure 5b. It has the same shape and size as the real spacer used in the knee joint prosthesis. Since there are many versions of spacers with various sizes to meet the requirements of different patients, the sensing devices in the real product should also be available in as many versions to match the spacers. The shell of the sensing device is made of medical-grade polycarbonate. Since the device is not permanently implanted, wear is not of concern. Note that the sensing device is used when the patient lies flat. It is estimated that the load applied to the shell of the device is less than 20 kg. The mechanical structure of the sensing device is designed so that it can tolerate loads of up to 50 kg. Experiments have been performed to validate the mechanical reliability of the sensing device under a 50 kg load.
The performance of the sensing device is summarized in
Table 1.
3.2. Wireless Data Receiver
The data receiver in the proposed wireless visualized measurement system is used to receive the multimodal data from the sensing device. The block diagram of the receiver is shown in
Figure 6. The key parts of the data receiver are the RF receiver and the FPGA-based MSK demodulator. The digital demodulator receives the digitized intermediate frequency (IF) signals from a pair of 8-bit 24 Msps ADCs, and performs the MSK demodulation. Note that in the sensing device, the image sensor output and the force sensors’ output are packed together. Correspondingly, the image data and the force data are separated from the received data in the data receiver. The data receiver is then connected to the workstation through a USB cable for further data processing.
The PCB and package of the data receiver is shown in
Figure 7. The data receiver has a size of 124 × 86 × 22 mm3. It contains a 4000 mAh lithium-iron battery, and the battery life is about 10 h.
4. Multimodal Sensor Data Processing
The multimodal sensing data is eventually sent to the workstation for further processing to acquire the relative pose of the components in the knee joint prosthesis.
The force sensors’ data can be used to check the ligament balance of the knee prosthesis, and the details of the force data processing can be found in the paper previously published by our group [
21]. As shown in
Figure 8, the image data processing is composed of three steps, i.e., the image data pre-processing, the instant pose reconstruction from each individual image, and the kinetic pose reconstruction based on multiple consecutive images.
4.1. Image Pre-Processing
In the image pre-processing part, the images are denoised, the contrast is enhanced, and the lens distortion is corrected. Note that the contrast enhancement is necessary since all the images are taken under low illumination inside the knee prosthesis. The distortion introduced by the wide-angle macro lens must also be corrected before further processing. The steps of the pre-processing are shown in
Figure 8.
The Block-Matching and 3D filtering (BM3D) algorithm [
23] is used for the denoising. In this algorithm, similar blocks in the femoral component surface images are grouped to eliminate the noise, and the image quality is improved in terms of both the peak signal-to-noise ratio (PSNR) and the subjective visual quality.
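For reference, a minimal denoising sketch is given below. It assumes the open-source Python bm3d package and a grayscale image normalized to [0, 1]; the file name and the noise standard deviation are placeholder values, and this is not necessarily the implementation used in this work.

```python
import numpy as np
import cv2
from bm3d import bm3d   # open-source BM3D implementation (assumed; PyPI package "bm3d")

# Load one femoral-component image as grayscale and normalize to [0, 1].
noisy = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0

# sigma_psd is the assumed noise standard deviation (placeholder value).
denoised = bm3d(noisy, sigma_psd=0.05)

cv2.imwrite("frame_0001_denoised.png", np.clip(denoised * 255.0, 0, 255).astype(np.uint8))
```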
For the contrast enhancement, a modified Multi-Scale Retinex (MSR) algorithm [24] is used to enhance the edge features of the images. For each input image data matrix S(x, y), the output of the Retinex processing is the reflectance matrix R(x, y) given by:
R(x, y) = S(x, y) / [F(x, y) ∗ S(x, y)],
in which F(x, y) is a Gaussian filter and ∗ denotes the two-dimensional convolution. F(x, y) is given by:
F(x, y) = K·exp(−(x² + y²)/c²),
where c is the scale factor and K is a normalization constant. To simplify the calculation, the logarithmic matrix r(x, y) is used, and r(x, y) is calculated as:
r(x, y) = log S(x, y) − log[F(x, y) ∗ S(x, y)].
Following the method in [25], three Gaussian filters F1(x, y), F2(x, y) and F3(x, y) with different scale factors are used to generate three outputs r1(x, y), r2(x, y) and r3(x, y), and the weighted summation
rMSR(x, y) = W1·r1(x, y) + W2·r2(x, y) + W3·r3(x, y)
is calculated to find the final MSR output.
In this application, based on the experiments on the image data taken in the emulated environment, the MSR processing is only applied to the G channel of the original RGB image data, which yields the best subjective quality. Also based on the experimental data, the following set of parameters are used for the MSR processing, i.e., c1 = 15, c2 = 80, c3 = 250, and W1 = W2 = W3 = 1/3.
The R, G and B channels after the MSR processing are normalized separately, to balance the relative intensities of the three color channels.
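The MSR step and the per-channel normalization described above can be illustrated with the following Python sketch (a minimal re-implementation for illustration only; the file names are placeholders). It applies the Retinex to the G channel with the scales c = 15, 80 and 250 and equal weights, then normalizes the three channels separately.

```python
import cv2
import numpy as np

def single_scale_retinex(channel, c):
    """r(x, y) = log S(x, y) - log(F * S)(x, y), with F of scale factor c."""
    s = channel.astype(np.float32) + 1.0                          # avoid log(0)
    # F(x, y) ~ exp(-(x^2 + y^2) / c^2) corresponds to a Gaussian with std c / sqrt(2).
    blurred = cv2.GaussianBlur(s, (0, 0), sigmaX=c / np.sqrt(2.0))
    return np.log(s) - np.log(blurred)

def msr(channel, scales=(15, 80, 250), weights=(1 / 3, 1 / 3, 1 / 3)):
    """Weighted sum of single-scale Retinex outputs."""
    return sum(w * single_scale_retinex(channel, c) for w, c in zip(weights, scales))

def normalize(channel):
    """Stretch a channel to the full 0-255 range."""
    lo, hi = channel.min(), channel.max()
    return ((channel - lo) / (hi - lo + 1e-6) * 255.0).astype(np.uint8)

img = cv2.imread("frame_0001_denoised.png")                       # BGR image from the previous step
b, g, r = cv2.split(img.astype(np.float32))

g_enhanced = msr(g)                                               # MSR applied to the G channel only

# Normalize the three channels separately to balance their relative intensities.
enhanced = cv2.merge([normalize(b), normalize(g_enhanced), normalize(r)])
cv2.imwrite("frame_0001_enhanced.png", enhanced)
```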
The last step in the pre-processing is to correct the lens distortion. The camera calibration method from OpenCV [
26] is used. A test board with a chessboard pattern is utilized to find the camera matrix, distortion coefficients, rotation and translation vectors, etc., which are then used for the distortion correction.
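A typical OpenCV chessboard calibration and undistortion flow is sketched below for illustration; the chessboard size, image paths and termination criteria are placeholders, not the exact values used in this work. For a 140° wide-angle lens, OpenCV's fisheye model may be a better fit, but the standard model is shown here for simplicity.

```python
import glob
import cv2
import numpy as np

PATTERN = (9, 6)          # inner corners of the chessboard (placeholder)
obj_pts, img_pts = [], []

# Template of 3D corner coordinates on the board plane (Z = 0), unit: one square.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

for path in glob.glob("calib/*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_pts.append(objp)
        img_pts.append(corners)

# Camera matrix, distortion coefficients, and per-view rotation/translation vectors.
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, gray.shape[::-1], None, None)

# Undistort a femoral-component image with the estimated parameters.
img = cv2.imread("frame_0001_enhanced.png")
undistorted = cv2.undistort(img, K, dist)
cv2.imwrite("frame_0001_undistorted.png", undistorted)
```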
4.2. Establishment of the Pose Reconstruction Problem
To facilitate the image data processing, five pairs of control points are marked on the femoral component, by printing some special markers on its surface. Those control points are numbered as
1L/1R, 2L/2R, 3L/3R, 4L/4R and
5L/5R. As shown in the side view of the femoral component given in
Figure 9, the five pairs of control points are spaced approximately evenly on the femoral component surface with an angle difference of 30°. Any two pairs of control points form a rectangle. The femoral component is usually made of reflective materials, which makes it difficult to take clear images of its surface. However, the features of most concern in the images are the five pairs of control points. A non-reflective material with high contrast should be chosen to print the control points, so that they can easily be distinguished even if the entire image is affected by the surface reflection issue.
Only three types of markers are used, namely, the triangle, round and square markers. 1L/1R markers are triangle, 2L/2R and 3L/3R markers are round, and 4L/4R and 5L/5R markers are square. Since the movement of the femoral component has limited freedom, the control point numbers can be easily determined from the shapes of the markers. For example, if the surface image of the femoral component contains two round markers and two square markers, the control point numbers are recognized as 3L/3R and 4L/4R.
The image data processing of this system thus reduces to finding the relative position between the spacer (sensing device) and the femoral component from the surface images of the femoral component “seen” by the spacer, with the aid of the control points.
Geometrically, the spacer and the femoral component are represented by two coordinate systems, namely, the spacer coordinate system {
OCXCYCZC}, and the femoral coordinate system {
OFXFYFZF}, as shown in
Figure 10. For the spacer coordinate system, the origin OC is the center of the spacer bottom plane, the X axis OCXC is defined as the long axis of the spacer bottom plane, the Y axis OCYC is perpendicular to OCXC on the spacer bottom plane, and the Z axis OCZC is perpendicular to the spacer bottom plane. Ideally, when the femur is aligned with the tibia, the femoral coordinate system {OFXFYFZF} is fully parallel to {OCXCYCZC}, and the two origins OF and OC are separated only vertically. The spacer coordinate system is used as the camera coordinate system in this implementation. The relative position of the spacer and the femoral component can be described by the rotational angles [
ϕ,
θ,
ψ], and the 3-dimensional translation
tCF = [
tx,
ty,
tz]
T between the two origins.
ϕ is the roll angle about the
OCYC axis.
θ is the flexion-extension rotation angle, namely, the pitch angle about the
OCXC axis.
ψ is the internal–external rotation angle, namely, the yaw angle about the
OCZC axis between the two coordinate systems.
tx is the mediolateral translation, namely, the horizontal distance between the centers of the femoral component and the spacer along the
OCXC axis. For a successful TKA surgery, it is expected that
ϕ,
ψ and
tx are all zero when the tibia moves relative to the femur.
In the image data processing, it is more convenient to use the rotation matrix R instead of the rotational angles [
ϕ,
θ,
ψ]. [
ϕ,
θ,
ψ] can be easily calculated from
R based on the definition of
R as follows:
The relative pose reconstruction problem is thus converted to the problem of finding RCF and tCF between the two coordinate systems {OCXCYCZC} and {OFXFYFZF}.
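The mapping from R to [ϕ, θ, ψ] depends on the rotation convention fixed by Equation (5). Purely as an illustration, the angles can also be extracted numerically with SciPy; the intrinsic Y-X-Z order below (roll about Y, pitch about X, yaw about Z, matching the axis assignment above) is an assumption.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def angles_from_rotation(R_CF):
    """Extract [phi, theta, psi] (degrees) from the rotation matrix R_CF.

    Assumed order: roll phi about Y, pitch theta about X, yaw psi about Z,
    as defined for the spacer coordinate system in the text.
    """
    phi, theta, psi = Rotation.from_matrix(R_CF).as_euler("YXZ", degrees=True)
    return phi, theta, psi

# Example: a pure 10-degree flexion (rotation about the X axis).
R_example = Rotation.from_euler("X", 10, degrees=True).as_matrix()
print(angles_from_rotation(R_example))   # approximately (0.0, 10.0, 0.0)
```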
It is not easy for the sensing device to directly recognize the coordinate system {OFXFYFZF}. However, with the wide view angle of the image sensor in the sensing device, any femoral component image taken by the sensing device contains at least two pairs of control points, which can define an auxiliary control point coordinate system {OAXAYAZA}. There is a series of such control point coordinate systems {OAXAYAZA} defined by the control points, and the rotation matrix RAF and the translation vector tAF between any {OAXAYAZA} and {OFXFYFZF} are exactly known. On the other hand, RCA and tCA between {OCXCYCZC} and {OAXAYAZA} can be calculated using the image data processing shown next.
Assume that the coordinates of a given point in these three coordinate systems are
CC = [
xC,
yC,
zC]
T,
CA = [
xA,
yA,
zA]
T, and
CF = [
xF,
yF,
zF]
T, respectively. The coordinate transformation gives the following equations:
Substituting (6) and (7) into (8) gives:
Comparing (6) and (9), it follows that:
The problem of solving for
RCF and
tCF is finally converted to the coplanar perspective-4-points problem [
27] to find [
RCA,
tCA], with [
RAF,
tAF] as the known parameters.
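Once [RCA, tCA] has been estimated from an image and [RAF, tAF] is looked up for the recognized control points, the final pose follows by composing the two transforms. The sketch below assumes the common convention CC = R·CA + t for each frame pair; Equation (10) fixes the exact form used in the paper, and all numerical values are placeholders.

```python
import numpy as np

def compose(R_CA, t_CA, R_AF, t_AF):
    """Chain the A->C and F->A frame transforms into the F->C transform.

    Assumes C_C = R_CA @ C_A + t_CA and C_A = R_AF @ C_F + t_AF,
    which gives R_CF = R_CA @ R_AF and t_CF = R_CA @ t_AF + t_CA.
    """
    R_CF = R_CA @ R_AF
    t_CF = R_CA @ t_AF + t_CA
    return R_CF, t_CF

# Example with placeholder values: the control-point frame is rotated 30 degrees
# about X and offset by 20 units relative to the femoral frame.
theta = np.deg2rad(30.0)
R_AF = np.array([[1, 0, 0],
                 [0, np.cos(theta), -np.sin(theta)],
                 [0, np.sin(theta),  np.cos(theta)]])
t_AF = np.array([0.0, 0.0, 20.0])
R_CA, t_CA = np.eye(3), np.array([0.0, 5.0, 40.0])   # output of the P4P solver (placeholder)
print(compose(R_CA, t_CA, R_AF, t_AF))
```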
4.3. Instant Pose Reconstruction
After the pre-processing, the control points in the femoral component images are recognized. The images are first filtered by an adaptive threshold filter to generate binary images. The contours of the control point markers are then recognized using the method in [
28]. The numbers of the control points are recognized by the markers’ shapes. For each individual image, four adjacent control points which form the biggest rectangle area in this image are chosen to establish the control point coordinate system {
OAXAYAZA}.
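For illustration, the thresholding and contour-based marker recognition described above could be sketched with OpenCV as follows. The threshold block size, the area filter and the polygon-approximation rule used to separate triangle, square and round markers are placeholder choices, not the exact implementation of this work.

```python
import cv2

def detect_markers(gray):
    """Return a list of (shape, center) tuples found in a pre-processed grayscale image."""
    # Adaptive threshold to obtain a binary image robust to uneven illumination.
    binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                   cv2.THRESH_BINARY_INV, 31, 5)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    markers = []
    for cnt in contours:
        if cv2.contourArea(cnt) < 50:            # ignore small blobs (placeholder threshold)
            continue
        peri = cv2.arcLength(cnt, True)
        approx = cv2.approxPolyDP(cnt, 0.04 * peri, True)
        if len(approx) == 3:
            shape = "triangle"                   # 1L/1R markers
        elif len(approx) == 4:
            shape = "square"                     # 4L/4R and 5L/5R markers
        else:
            shape = "round"                      # 2L/2R and 3L/3R markers
        m = cv2.moments(cnt)
        center = (m["m10"] / m["m00"], m["m01"] / m["m00"])
        markers.append((shape, center))
    return markers
```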
An analytic and non-iterative method is then proposed to solve the coplanar perspective-4-points problem to find [
RCA,
tCA] as described in the previous subsection. There are many classical methods to solve this perspective problem, including the non-iterative EPnP method [
29], the iterative linear solution using unbiased statistics [
30,
31], some simple methods based on the P3P problem [
32,
33] and many linear analytic solutions [
34,
35,
36,
37,
38]. Compared to these classical methods, the proposed method has the advantage of low computational complexity.
The geometry of the proposed method is explained in
Figure 11. As shown in
Figure 11, the four control points
C1,
C2,
C3,
C4 are projected to the image plane as the image points
m1,
m2,
m3,
m4, and the correspondence between the control points and the image points is known. The point
O is the perspective center of the camera. Solving the estimation problem is equivalent to solving for the depths of
OC1,
OC2,
OC3, and
OC4. For each control point Ci (i = 1, 2, 3, 4), the unit vector from O toward Ci is known from the corresponding image point mi, and the angles between the direction of OC1 and the directions of OC2, OC3 and OC4 are therefore also known. H1, H2 and H3 are auxiliary points constructed from C1 on the lines OC2, OC3 and OC4, respectively.
To simplify this problem, assume that
OC1 has a length of
x and
C1H1,
C1H2,
C1H3 are equal to k1x, k2x and k3x, respectively. How to find the values of
k1,
k2,
k3 and
x is explained as follows. In the triangle
C1C2C3:
where
D1 and
D2 are the two side lengths of rectangle
C1C2C3C4. According to the law of cosines:
According to the geometric relationship:
Substituting (14) and (15) into (13) gives:
Equation (16) can be expanded to (17):
Substituting (17) into (11)/(12) gives:
Similarly, in the triangle
C1C4C3:
While in the triangle
C1C2C4:
Combining (17), (19) and (21) leads to an equation without
k1 or
k3:
In (17)–(22), the coefficients
ai,
bi,
ci,
di,
ei,
fi,
gi,
hi,
ii,
ji (i = 1, 2, 3) are used without explanation. These coefficients all have analytical expressions in terms of the known angles and side lengths, which are shown in Appendix A. Equations (18), (20) and (22) can be combined and written in the form of a matrix operation:
The unique solution of
k2 can be found from these homogeneous linear equations, based on the right singular vectors spanning the null space of the 3 × 5 matrix [
20]. Consequently,
k1,
k3 and length
x can be calculated from (17), (19) and (11), respectively. Note that:
Consequently,
RCA and
tCA can be calculated as:
RCF and tCF can then be calculated using (10), and the rotational angles [ϕ, θ, ψ] can be calculated from RCF using (5).
Obviously, the proposed method can solve the problem without any iteration, and the computation complexity can be characterized as O(1). As a result, the instant pose reconstruction can be implemented in real time, while requiring limited computation overhead.
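The null-space computation itself can be carried out with a standard SVD, as sketched below with NumPy; the 3 × 5 coefficient matrix is a random placeholder here, and the extraction of k2 from the structured null-space basis follows the analytical procedure of this section and is not repeated.

```python
import numpy as np

def null_space_basis(M, tol=1e-10):
    """Return an orthonormal basis (as rows) of the null space of M via SVD."""
    _, s, Vt = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return Vt[rank:]            # right singular vectors associated with (near-)zero singular values

# Example with a placeholder 3 x 5 coefficient matrix assembled from (18), (20) and (22).
M = np.random.default_rng(0).normal(size=(3, 5))
basis = null_space_basis(M)
print(basis.shape)              # (2, 5): a generic 3 x 5 matrix has a 2-dimensional null space
```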
4.4. Kinetic Pose Reconstruction
During the TKA surgeries, the instant relative pose reconstructed by the image data and the ligament balance indicated by the force sensor data should be checked by slowly moving the tibia with respect to the femur. As shown in
Figure 12, there are some typical angles between the tibia and the femur, such as 0°, 45°, 90° and 130° [
39].
It is also of great significance to check the kinematic trajectory of the femoral component “seen” by the spacer when moving the tibia with respect to the femur. A successful surgery will lead to a trajectory that is centrally symmetric and smooth. The symmetry of the kinetic trajectory indicates the balance of the prosthesis, and the smoothness indicates that the knee joint can move freely without interference between the prosthesis components.
In the instant pose reconstruction, the coordinate of the femoral component coordinate origin in the spacer coordinate system is described as
tCF = [
tx,
ty,
tz]
T. The change of
tCF = [
tx,
ty,
tz]
T when moving the tibia with respect to the femur can be used as the kinetic trajectory. However, this trajectory has a limited amplitude. To amplify it, another point P, defined as the lowest central point of the femoral component, is used, and the trajectory of P in the spacer coordinate system is traced instead. As shown in
Figure 13, the point P has a coordinate of [0, 0, −50]
T, then in the spacer coordinate system, the coordinate of point P is given by:
The kinetic pose reconstruction then simply traces the trajectory of point P, whose coordinate in the camera coordinate system is CP,C.
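Assuming the transform in (26) takes the common form CP,C = RCF·P + tCF (an assumption about the exact equation, which is fixed in the paper), the trajectory sketching step could look as follows; the pose list below is a placeholder.

```python
import numpy as np
import matplotlib.pyplot as plt

P_F = np.array([0.0, 0.0, -50.0])   # lowest central point of the femoral component (femoral frame)

def point_in_spacer_frame(R_CF, t_CF, p=P_F):
    """Map a femoral-frame point into the spacer (camera) coordinate system.

    Assumes the convention C_C = R_CF @ C_F + t_CF for the frame transform.
    """
    return R_CF @ p + t_CF

def kinetic_trajectory(poses):
    """poses: list of (R_CF, t_CF) pairs reconstructed from consecutive images."""
    return np.array([point_in_spacer_frame(R, t) for R, t in poses])

# Example with placeholder poses (identity rotations, slowly varying translation).
poses = [(np.eye(3), np.array([0.0, 0.0, 40.0 + dz])) for dz in np.linspace(0, 5, 20)]
traj = kinetic_trajectory(poses)

# Plot the trajectory of P as seen from the spacer, projected onto the X-Z plane.
plt.plot(traj[:, 0], traj[:, 2], marker="o")
plt.xlabel("X (spacer frame)")
plt.ylabel("Z (spacer frame)")
plt.title("Trajectory of point P")
plt.show()
```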