Open-source software in Biomedical Education: from tracking to modeling movements

Davide Piovesan received his M.S.M.E. in 2003 and his D.Eng. in Mechanical Measurement in 2007 from the University of Padova, Italy. His dissertation presented a set of experimental and analytical validation techniques for human upper limb models. From 2004 to 2008 he was a visiting scholar and post-doctoral fellow at the Ashton Graybiel Spatial Orientation Lab at Brandeis University, under the supervision of Professors Paul DiZio and James R. Lackner. There, he worked on the mechanics of movement adaptation in non-inertial environments as part of a NASA extramural funding program. He joined Northwestern University in 2008, working as a post-doctoral fellow at the Rehabilitation Institute of Chicago under the supervision of Professor Ferdinando (Sandro) Mussa-Ivaldi. Davide is currently an Assistant Professor in the Mechanical Engineering department at Gannon University and director of the Biomedical Engineering Program. His main interest is to gain insight into the role of biomechanics in the neural control of movements, with applications to rehabilitation engineering.


Project Overview
A curriculum in biomedical engineering requires a set of laboratory experiences that allow students to become familiar with the medical equipment and simulation software commonly used in the health care industry. Typically, engineering tools such as force plates, electromyography (EMG), and motion capture systems are used to acquire subjects' data, which serve as input for simulation software so as to characterize human movement performance. Movement analysis is a topic of great importance for students, especially when considering sport-specific motions or impairments caused by injury or disease. The possible output parameters of motion capture studies include, but are not limited to, joint angles, velocities, and accelerations of the upper and lower extremities. By using these data as inputs for modeling software, it is possible to infer the relative elongation of the muscles performing a given action and the force they generate.
In small colleges, it is uncommon to have sufficient resources to purchase a commercial motion capture apparatus. The high cost of such equipment tends to limit the ability of small educational facilities to offer a full practical experience with this technology. The present work demonstrates that simple movements can be analyzed using low-cost digital cameras together with a set of open-source and freeware software. Having eliminated the cost barrier, we developed a set of bioengineering laboratory experiments providing students with a full "hands on" experience of motion capture and data post-processing.
The project was divided into three modules:
1) Design of a camera-based setup and acquisition of raster video data.
2) Extraction of limb trajectories from the raster images via freeware software.
3) Processing of the kinematic data as input for a refined musculo-skeletal model to calculate muscle properties during the movement.
We studied eating as one of the basic motions necessary for individuals to live independently and experience a sufficient quality of life.
The goal of this procedure is to demonstrate that motion capture analysis for educational purposes can be accomplished without purchasing expensive commercial motion capture apparatuses.

Design of a camera-based setup and acquisition of raster image data.
High-end motion capture solutions have traditionally been available only to well-funded research laboratories, large production firms, or the motion capture industry. The equipment is expensive, and often prohibitively so for small educational institutions hoping to gain greater insight into motion analysis.
Optical motion capture devices can be divided into two subcategories: active and passive. An active device is composed of a set of markers that actively transmit a stroboscopic infrared light. Each marker placed on the subject transmits a signal at a different frequency that is recorded by a set of infrared cameras. This allows the system to recognize each marker and avoids accidental swapping if the markers' projections on the plane of a camera become too close to each other. To describe the markers' trajectories in Cartesian space, the position of each marker is triangulated from the acquisitions of the different cameras. In a passive system, the active markers placed on the subject are replaced with reflective dots, whose reflections are tracked in the visible spectrum by a set of video cameras. It is more common for the reflective dots to be swapped by the tracking algorithm of one or several of the cameras over the course of the range of motion, hindering the tracking process. These types of apparatuses can be priced in the tens of thousands of dollars: according to META Motion, such equipment can range anywhere from $15,000 to $500,000 [1], far exceeding the monetary capabilities of small cost-conscious institutions. This module presents the setup procedure of a passive motion capture system built using two inexpensive 1080p, 60 fps digital cameras. The cost of each camera with such capabilities ranges from $70 to $200. The frame rate of the camcorders we selected is comparable to that of entry-level commercial motion capture systems that use reflective-marker technology.
Although motion capture allows for the examination of virtually any movement, we determined that hygienic motions are among the most vital for living independently. Any injury that impairs these movements affects the quality of life of the individual, and it is therefore important to fully understand the muscle forces and joint torques at play in each of these actions. Feeding oneself involves both mono-articular and bi-articular muscles, since it requires both flexion of the forearm and motion of the shoulder.
To improve the tracking capability, the subject is fitted with a black long-sleeve shirt to offer a more dramatic contrast against the off-white color of the walls. External markers of different colors are placed on bony landmarks to track their movement. In this experiment, fluorescent markers are positioned on the styloid process of the radius, the epicondyle of the humerus, the head of the humerus, the acromion process, and the scapular spine to track the positions of the wrist, elbow, shoulder, and scapula during the eating motion (Figures 1-2). Depending on the model complexity chosen to analyze the data, not all of the markers will be used in the data processing phase.

Figure 2: View from Camera A (Right) and Camera B (Left) -End Position
The subject sat on a high stool. The initial position of the arm was along the side of the body, with the hand in the proximity of the pelvis; the final position of the hand was close to the mouth. The subject was instructed to execute the movement at a natural speed.
Each camera was set up to record all of the markers throughout the entire motion; to this end, the test subject was positioned at a 45-degree angle to each camera. Moreover, the recording area was situated in a corner of the room to make use of the walls, both for tracking purposes and for color contrast. Markers, similar to those used on the subject, were placed on the walls at a known distance (0.6 m) to serve as reference points for calibrating the tracking system. It was important that each camera be at the same distance from the wall, as well as from the subject, to output accurate and reliable data. The distance that allowed the best compromise between pixel resolution and lens distortion required that each camera be placed 2.20 meters away from the wall opposite it. Two tripods were used to hold the camcorders at a height of 1.34 meters. The test subject was situated 1.50 meters from the camera (Figure 3).
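The calibration described above reduces to a single ratio. As a minimal sketch (in Python, as an open-source stand-in for the MATLAB post-processing; the pixel coordinates below are hypothetical), the two wall markers a known 0.6 m apart yield a pixels-per-meter factor:

```python
import math

def pixels_per_meter(p1, p2, known_distance_m=0.6):
    """Calibration factor from two wall markers a known distance apart.

    p1 and p2 are (x, y) pixel coordinates of the two reference markers.
    """
    d_px = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return d_px / known_distance_m

# Hypothetical pixel coordinates of the two wall markers:
scale = pixels_per_meter((412.0, 310.0), (958.0, 318.0))
# Any tracked pixel distance can now be divided by `scale` to obtain meters.
```

Any subsequent marker trajectory expressed in pixels is converted to meters by dividing by this factor.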
Synchronization between the two cameras was obtained by a simple clapping motion of a third-party participant. The motion was performed several times, after which another clapping motion was used to signal the end of the recording. The resulting videos were then synchronized and processed using open-source video editing and motion-tracking software, allowing for further analysis as described in the next section.
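The clap-based synchronization can be sketched numerically. Assuming the clap is the loudest transient in each camera's audio track (a simplification; real footage may require manual inspection), the time offset is the difference between the two amplitude peaks. The signals below are synthetic:

```python
import numpy as np

def clap_offset(audio_a, audio_b, rate_hz):
    """Estimate the time offset between two recordings from the clap transient.

    The clap is assumed to be the loudest event in each track; the offset is
    the difference between the sample indices of the two amplitude peaks.
    """
    i_a = int(np.argmax(np.abs(audio_a)))
    i_b = int(np.argmax(np.abs(audio_b)))
    return (i_a - i_b) / rate_hz  # seconds by which track A lags track B

# Synthetic example: the same clap burst, with track B delayed by 0.5 s at 48 kHz.
rate = 48000
clap = np.exp(-np.linspace(0, 10, 2000)) * np.sin(np.linspace(0, 400, 2000))
a = np.zeros(rate * 2); a[1000:3000] = clap
b = np.zeros(rate * 2); b[25000:27000] = clap
print(clap_offset(a, b, rate))  # -0.5
```

Trimming each video by the estimated offset aligns the two recordings frame by frame.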
This experiment, however simple, does have at least one major point of concern that could greatly affect the outcome of the motion analysis project. If the shirt is loose-fitting, the markers may experience some unwanted movement caused by the shirt sliding. Though apparently trivial, this may cause discrepancies in the position values later on. This potential source of error can be reduced by applying the markers directly to the skin or by using a tight-fitting shirt.

Extraction of limbs' trajectories from raster images using "Tracker"
The analysis for the motion capture project began by uploading the video files into a freeware educational program called Tracker [2][3][4]. Tracker is a free video analysis and modeling tool built on the Open Source Physics (OSP) Java framework [5].
Although Tracker does not require the fluorescent markers to be present, they ensured that the automated tracking feature in the software worked flawlessly; essentially, the markers allowed for tracking with higher fidelity. The markers placed on the subject created a set of well-defined regions of pixels, each with a different RGB level. The "Autotracker" function in the software works by creating a set of template images containing the isolated pixel clusters. This process requires a manual estimation of the cluster area to be tracked. A spatial correlation between the color map of each frame and that of the newly created template is then performed frame after frame. The process is optimized for each frame, where the correlation between frame and template is computed by shifting the images pixel by pixel in each direction. This algorithm is very similar to the reassignment process of spectrograms [6], and it allows the marker template to be tracked frame by frame with sub-pixel resolution.
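The template-matching idea behind Autotracker can be illustrated with a brute-force normalized cross-correlation (a sketch of the principle only, on a grayscale array; Tracker's actual implementation also refines the correlation peak to sub-pixel accuracy and evolves the template over time):

```python
import numpy as np

def match_template(frame, template):
    """Locate `template` in `frame` by normalized cross-correlation.

    The template is slid pixel by pixel over the frame; the correlation
    peak gives the best-matching top-left corner (no sub-pixel refinement).
    """
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    best, best_rc = -np.inf, (0, 0)
    for r in range(frame.shape[0] - th + 1):
        for c in range(frame.shape[1] - tw + 1):
            w = frame[r:r + th, c:c + tw]
            w = w - w.mean()
            denom = np.sqrt((w * w).sum()) * t_norm
            if denom == 0:
                continue  # flat region: correlation undefined
            score = (w * t).sum() / denom
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc

# A bright cross-shaped 'marker' embedded in a dark synthetic frame:
frame = np.zeros((20, 20))
frame[12:15, 5:8] = [[0.2, 0.9, 0.2], [0.9, 1.0, 0.9], [0.2, 0.9, 0.2]]
template = frame[12:15, 5:8].copy()
print(match_template(frame, template))  # (12, 5)
```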
Knowing the position of maximal correlation from frame to frame, it is possible to extract a time array of coordinates for each marker. When the tracking algorithm has been run on both camera recordings, a series of time-dependent position points in the Cartesian plane parallel to each camera plane is stored within the software. All data points, at the completion of tracking, can then be exported as text files for further processing. The data can be analyzed in software such as MATLAB, Excel, or equivalent open-source programs [7]. An example of tracking four markers is presented in Figure 4.
To transform the trajectories from raster image space to physical Cartesian space, a system of reference was chosen for each video file. In each case, the origin of the reference frame was taken coincident with the marker located at the acromion process. This location is also the origin of the reference frame used in the pre-compiled simulation model that will be described in the next section. Next, the tracking system was calibrated by assigning the known distance (0.6 m) to the markers located on the wall behind the test subject (Figure 4). Perspective distortions were not taken into consideration.
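The raster-to-physical transformation amounts to a shift of origin and a change of scale. A minimal sketch follows (the pixel values and the 1000 px/m factor are hypothetical; the image y-axis is flipped because pixel rows grow downward while the physical vertical axis points up):

```python
import numpy as np

def to_physical(pixels, acromion_px, scale_px_per_m):
    """Convert tracked pixel coordinates to meters in the model frame.

    `pixels` is an (N, 2) array of (x, y) marker positions. The origin is
    shifted to the acromion marker and the vertical axis is flipped.
    """
    p = (np.asarray(pixels, float) - np.asarray(acromion_px, float)) / scale_px_per_m
    p[:, 1] *= -1.0  # image rows grow downward
    return p

wrist_track = [(640.0, 820.0), (660.0, 700.0)]   # hypothetical pixel data
acromion = (600.0, 300.0)
out = to_physical(wrist_track, acromion, 1000.0)
# rows are approximately [0.04, -0.52] and [0.06, -0.4] (meters)
```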

Processing of kinematic data
The program Tracker is only capable of producing two-dimensional time-dependent data points for each analysis (position, velocity, and acceleration). Further processing is necessary to obtain a three-dimensional trajectory of the limb that can then be imported into specific musculo-skeletal modeling software.
In order to use our acquired data with a musculo-skeletal model, the following operations need to be performed:
a. Synchronization and creation of a 3-dimensional vector for each marker
b. Coordinate transformation to match the frame of reference chosen within the model
c. Data formatting to load the motion-capture data into the simulation software

Every facet of engineering uses simulations to gain insight into probable working conditions and the consequent system performance. Simulations are a useful tool to gain insight into human features that are not directly measurable. To obtain accurate and trustworthy results, simulations must represent the actual system as closely as possible. On the other hand, excessive complexity can undermine the comprehension of the phenomenon in question. As a result, a few assumptions were made in the data processing. In particular, we restricted our analysis to the trajectories of only three markers, namely the acromion, the epicondyle of the humerus, and the styloid process of the radius, as described in Figure 3. This simplification allowed for the construction of a basic model representing an open kinetic chain of the upper right extremity. We represented the arm as a two-link, two-joint, 3-dimensional model with 4 DOFs. The shoulder is represented as a ball-and-socket joint with 3 rotational DOFs; the elbow is represented as a single-DOF hinge. Figure 5 illustrates a stick figure of the data extracted by Tracker, where the data have been formatted using MATLAB.
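Step (a), assembling a 3-dimensional vector from the two synchronized 2-dimensional tracks, can be sketched under a strong simplifying assumption: treating the two views as orthogonal (camera A seeing the x-z plane, camera B the y-z plane) and ignoring the actual 45-degree geometry and perspective effects. The coordinates below are hypothetical:

```python
import numpy as np

def merge_views(xy_a, xy_b):
    """Assemble a 3-D marker trajectory from two synchronized 2-D tracks.

    Simplified geometry (an assumption, not the exact camera placement):
    camera A supplies the x coordinate, camera B the y coordinate, and the
    shared vertical coordinate z is averaged between the two views.
    """
    a = np.asarray(xy_a, float)   # (N, 2): (x, z) as seen by camera A
    b = np.asarray(xy_b, float)   # (N, 2): (y, z) as seen by camera B
    z = 0.5 * (a[:, 1] + b[:, 1])
    return np.column_stack([a[:, 0], b[:, 0], z])

# Two hypothetical frames of an elbow marker, in meters:
elbow_a = [(0.10, -0.30), (0.12, -0.25)]
elbow_b = [(0.02, -0.30), (0.03, -0.26)]
traj3d = merge_views(elbow_a, elbow_b)   # shape (2, 3)
```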

Figure 5: Stick figure of the movement, generated using MATLAB
A valuable and well-established tool for the modeling of biomechanical systems is OpenSim [8][9][10][11]. This open-source freeware software includes a variety of validated models of both the upper and lower limbs, on which it is possible to build under a Creative Commons license [12]. The model that we utilized is "arm26.osim", a simplified model of the upper extremity with two joints and six muscles, intended primarily for education and demonstrations [9,10].
The muscles embedded in the model are the long and short heads of the biceps (BIClong, BICshort), the brachialis (BRA), and the three heads of the triceps (TRIlat, TRImed, TRIlong). These are among the most important mono-articular and bi-articular muscles involved in performing the movement of feeding oneself. The two joints are the elbow and the gleno-humeral joint (shoulder). The original model incorporated only one DOF at the shoulder, neglecting ab-adduction and internal-external rotation, which were added as part of the exercise. Unfortunately, modifying a model in OpenSim is not as straightforward as interacting with CAD software: a basic knowledge of programming is required to modify the main model file, which is written in XML format.
A simple calculation was performed to determine the scale factor between the data and the original model, which was then entered into OpenSim to accommodate the size difference between subject and model. The scale factor added to the system automatically cues OpenSim to adjust the weight of the whole skeleton to match the increase in size. In scaling the model we assumed that the forearm and upper arm scale by a uniform ratio; however, this is not necessarily true.
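The scaling computation is a single ratio applied uniformly to the model; as a sketch (the segment lengths below are illustrative, not measured from our subject or taken from arm26):

```python
def scale_factor(subject_segment_m, model_segment_m):
    """Uniform scale factor between the subject and the generic model.

    A single ratio is assumed for the whole arm (as in our processing),
    even though the forearm and upper arm may in reality scale differently.
    """
    return subject_segment_m / model_segment_m

# Hypothetical upper-arm lengths, in meters:
s = scale_factor(0.31, 0.29)
print(round(s, 3))  # 1.069
```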
The coordinates of the recorded data need to be matched with a set of virtual markers within the model. The virtual markers are used to "drag" the arm along the desired recorded trajectory. The matching between recorded data and virtual markers is obtained using an optimization algorithm, part of the OpenSim package, that minimizes the Euclidean norm between markers. To partially correct for a non-uniform scale factor between limb segments, it is possible to apply a different "weight" to each marker when importing the recorded data into OpenSim. This process gives more importance to reducing the distance between real and virtual markers for elements that carry more weight. Figure 6 shows the effect of changing the relative weight between the two bony landmarks of the forearm at the end of the motion. In the left panel, high importance is given to matching the elbow bony landmark (5:1). The right panel shows the same procedure when weighting the wrist more (1:5). In the central panel, all the markers have the same weight (1:1).
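The effect of marker weighting can be reproduced in miniature. The sketch below (an illustration of the principle, not OpenSim's algorithm) fits a single shoulder angle of a planar two-link arm, with a fixed elbow bend, to slightly inconsistent "recorded" elbow and wrist markers; raising the elbow weight pulls the solution toward the elbow data, raising the wrist weight pulls it toward the wrist data. All lengths and positions are hypothetical:

```python
import numpy as np

L1, L2, BEND = 0.31, 0.26, 0.5   # hypothetical segment lengths (m) and elbow bend (rad)

def weighted_error(q, recorded, weights):
    """Weighted sum of squared distances between recorded and virtual markers."""
    u = lambda ang: np.array([np.cos(ang), np.sin(ang)])
    elbow = L1 * u(q)                    # virtual elbow marker
    wrist = elbow + L2 * u(q + BEND)     # virtual wrist marker
    d2 = ((np.array([elbow, wrist]) - recorded) ** 2).sum(axis=1)
    return float((weights * d2).sum())

def best_shoulder_angle(recorded, weights):
    """Grid search for the shoulder angle minimizing the weighted error."""
    qs = np.linspace(0.0, np.pi / 2, 1801)
    errs = [weighted_error(q, recorded, weights) for q in qs]
    return qs[int(np.argmin(errs))]

# Elbow marker consistent with q = 0.3 rad, wrist marker with q = 0.4 rad:
recorded = np.array([0.31 * np.array([np.cos(0.3), np.sin(0.3)]),
                     0.31 * np.array([np.cos(0.4), np.sin(0.4)])
                     + 0.26 * np.array([np.cos(0.9), np.sin(0.9)])])
for w in ([5.0, 1.0], [1.0, 1.0], [1.0, 5.0]):   # elbow:wrist weight ratios
    q = best_shoulder_angle(recorded, np.array(w))
    print(w, round(float(q), 3))
```

The three weightings yield three different compromise angles, mirroring the three panels of Figure 6.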
After uploading the recorded kinematic data into OpenSim, it is possible to calculate the kinematic and dynamic properties that each muscle must have in order for the subject to perform the movement. The results of the simulation include the length of, and force exerted by, each muscle-tendon system implemented in the model. These parameters are obtained using the inverse dynamics capabilities of OpenSim. The optimization algorithm used in OpenSim utilizes the motion of the model to solve the inverse dynamics equations of the open kinematic chain, where the unknowns are the joint torques. Since the system is redundant (in this case, 4 DOFs are actuated by 6 muscles), the algorithm minimizes a cost function representing the sum of the squares of the muscles' activations.
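The redundancy-resolution step can be illustrated with a least-norm sketch. Here the cost is the sum of squared muscle forces rather than activations, the moment arms are invented illustrative numbers (not the arm26 values), and the non-negativity constraint on muscle forces that OpenSim enforces is omitted:

```python
import numpy as np

# Moment-arm matrix R (DOFs x muscles): 2 joint torques produced by 6 muscles.
# Columns: BIClong, BICshort, BRA, TRIlong, TRIlat, TRImed (illustrative values, m).
R = np.array([
    [0.030, 0.000, 0.000, -0.025, 0.000, 0.000],    # shoulder flexion row
    [0.040, 0.040, 0.020, -0.020, -0.020, -0.020],  # elbow flexion row
])
tau = np.array([2.0, 8.0])   # desired joint torques (N*m)

# The pseudoinverse gives the force vector of minimum Euclidean norm
# satisfying R f = tau -- the unconstrained analogue of minimizing the
# sum of squared activations.
f = np.linalg.pinv(R) @ tau
print(np.allclose(R @ f, tau))  # True: the torques are exactly reproduced
```

Infinitely many force combinations produce the same torques; the cost function selects one of them, which is why a physiological criterion such as squared activation is needed at all.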

Figure 7: Simulation Results
The left panel of Figure 7 displays the initial and final positions of the simulation. The right panel shows the length variation and applied force of all six muscles implemented in the model. These results reflect the changes expected of each muscle, in that the flexors shorten (BIClong, BICshort, BRA) while the extensors lengthen (TRIlat, TRImed, TRIlong). The force that each muscle exerts can also be observed. The distribution of force among muscles is a critical factor in biomechanics, and it is seldom presented in textbooks.

Conclusion
We presented three simple modules for a bioengineering laboratory experience that relate motion capture to the biomechanical analysis of simple movements. These experiences can be performed utilizing inexpensive camcorders and a set of open-source and freeware software. We acknowledge that MATLAB is not freeware, but it can easily be replaced by several open-source products (e.g., [7]). These experiences could be expanded by simultaneously recording the electromyographic (EMG) activity of the muscles under examination. This would allow a more informative analysis of the muscle forces exerted during the movement by comparing the results of OpenSim with known EMG-driven models [13]. Although more intricate analyses may be possible with advanced commercial equipment, a great learning opportunity exists even for small cost-conscious institutions that cannot afford such expensive equipment and could not otherwise give their students a full "hands on" experience. There are great opportunities that finances cannot inhibit; all one must do is look!