STMicroelectronics SensorTile Reference Design: Gait Classification Prototype using Stride Measurements Kirk Patrick Rustia Shayan Hosseinpouli Advisor: Dr. William J. Kaiser
STMicroelectronics SensorTile Reference Design
Page 1 of 31
Table of Contents

1. Introduction ........................................................ 3
   1.1. Motivation ..................................................... 3
   1.2. Theory ......................................................... 3
2. System Architecture and Algorithms .................................. 4
   2.1 Materials and Setup ............................................. 4
   2.2 Algorithms ...................................................... 5
3. System Sensor Data Acquisition and Experimental Methods ............. 6
   3.1 Experimental Methods for Training and Testing Data Acquisition .. 6
   3.2 Photos and Video ................................................ 9
4. System Sensor Data ................................................. 10
5. Signal Feature Extraction .......................................... 12
6. Neural Network Implementation ...................................... 21
7. Demonstration Data ................................................. 23
   7.1 Unsuccessful Attempt ........................................... 23
   7.2 Final Attempt .................................................. 25
8. Source Code ........................................................ 27
   8.1 Source Code Modules ............................................ 27
   8.2 Functionality of Modules ....................................... 27
   8.3 Google Drive Links ............................................. 29
9. Design Guidance .................................................... 29
   9.1 Overall Experience ............................................. 29
   9.2 Challenges ..................................................... 29
   9.3 Conclusion and Next Steps ...................................... 30
   9.4 Acknowledgments ................................................ 30
10. References ........................................................ 31
1. Introduction
1.1. Motivation
This project applies machine learning to stride measurements with the goal of classifying an individual's gait. Given the ten-week deadline, we decided to build a prototype rather than a finished product, leaving room for further development outside the lab. People considered to have an irregular gait include those who have suffered lower-body or leg injuries from accidents and those with disorders, such as multiple sclerosis, that severely affect movement. These individuals may be unable to walk, unable to maintain proper balance while walking, or may experience knee pain when attempting a longer step; such individuals are usually given a mechanical aid in the form of a wheelchair or crutches. This product could be of great help to doctors and physicians in evaluating whether a patient needs a walking aid. The prototype would also have broader applications throughout the biomedical field, and could even be of use in athletics for rehabilitating athletes. In addition, to keep the product cost-effective, we sought to build the sensor system with the minimum number of sensors needed to properly fulfill this objective.
1.2. Theory
Gait evaluation covers broad categories of movement, ranging from walking and running to jumping and other motions of the leg and ankle. The key gait features that constitute adequate walking ability must therefore be identified. Given the scope of time we had, we decided on three classifications: short strides, long strides, and sitting. Short strides consist of walking normally in a straight line from one end of an 11.8-inch floor tile to the other. Long strides repeat the same process but with the knee raised, covering three tiles per step instead of one. Finally, irregular behavior comprises movements that do not involve walking forward in a straight line, including jumping, standing, and sitting; our experiments used sitting as this class. Using these classifications, we acquired data with an IoT (Internet of Things) sensor device, the STMicro SensorTile. The SensorTile is composed of several different sensors measuring parameters that include linear acceleration, angular acceleration, and temperature. These sensors are
then integrated into a C program run on an embedded Linux server, the BeagleBone IoT platform, which records the three-axis acceleration values over a set time interval. Other programs are then moved to the Linux server to calculate correlation values from the obtained data; from there, a neural network (specifically, the Fast Artificial Neural Network, or FANN) is trained using features derived from the calculated data. Ultimately, 70-80% of our trials were used to train the neural network, and the remainder were used to assess its accuracy; a perfect confusion matrix would demonstrate the correctness of our classification. To ensure a low-power, efficient device, the algorithm must rely on only a limited amount of sensor data from the angular or linear acceleration sensors. It is possible to use only one of the sensors in the correlation process, or even a two-axis combination such as y-z angular acceleration. The combination is entirely up to the designer, depending on the desired result and the applied algorithm; as our findings show, sensor use need not be uniform across the two modules used to acquire data.
2. System Architecture and Algorithms
2.1 Materials and Setup
As mentioned above, we gather the data for all of our motions using the STMicro SensorTile's linear and angular acceleration sensors, i.e. its accelerometer and gyroscope. We selected the SensorTile as our measurement system because of its low-power Bluetooth connection (roughly 65 mA of current), provided by its BlueNRG-MS chip. Should the designer already have sensor algorithms to apply, the SensorTile can also act as a hub and receive them either wirelessly over Bluetooth or over a wired connection to a debugging device; however, the built-in sensor algorithms are already sufficient for it to act as a proper sensor. The embedded IoT platform receiving the data from the SensorTile is the BeagleBone Green Wireless. Running Linux, this platform offers 4 GB of storage through a memory card, and thus can serve as the hub for a multitude of algorithms and programs. The BeagleBone also possesses Bluetooth capabilities that allow it to act as either a parent or a child device in a network, and it can connect to WiFi to accomplish other tasks. Furthermore, the platform carries a TI Sitara microprocessor that makes it both quick and efficient and grants it the
ability to manipulate algorithms, clock timings, and measurement frequencies for optimal real-time interaction with data. To allow a better understanding of gait, two SensorTile systems were used, flashed and soldered so that only the cradles are needed. We secure the SensorTile cradles in a small case, and attach one sensor to the ankle (Sensor 2) and another to the quadriceps area (Sensor 1). A dual-sensor algorithm downloaded onto the BeagleBone records the data of both sensors simultaneously over a ten-second cycle. The SensorTile and BeagleBone documentation is listed in the reference section for your perusal. [1][2] For those unfamiliar with the SensorTile and BeagleBone systems, the reference section also includes a link to tutorials. [3]

In short, here is the list of materials required to implement this classification:

1. 2x STMicroelectronics SensorTile system
2. BeagleBone Green Wireless module, or other compatible IoT platform
   ● Preferably, order the product entitled 'Seeed Studio BeagleBone Green Wireless IOT Developer Prototyping Kit for Google Cloud Platform', available on the DigiKey or Mouser websites.
3. 3x Mini-USB cables
4. 1x STMicroelectronics NucleoTile (optional, for debugging purposes)
5. 1x Micro-USB cable (optional, for use with the NucleoTile)
2.2 Algorithms
There are five key algorithms corresponding to the procedural steps described in the "System Sensor Data Acquisition and Experimental Methods" section below. The first is Dual_SensorTile.sh, the main command used on the remote server of your choice to gather data from both sensors. It is composed of two C programs, Acquire_LowPass_Continuous1.c and Acquire_LowPass_Continuous2.c, which correspond to sensors 1 and 2 respectively and acquire all nine forms of sensor data; both have been modified to include a timestamp at 50-millisecond intervals. The second algorithm is waveform_feature_xcorr.c, a C file that handles both autocorrelation and cross-correlation, depending on how you proceed with your measurement. It has been modified to select which of the nine sensor-data column sets to correlate, either with itself for autocorrelation or with another column set for cross-correlation. Please
note that this algorithm is not mandatory for a complete classification, depending on both the test and the algorithm used for classification. The third and fourth algorithms are the FANN training and testing algorithms, xor_train.c and xor_test.c respectively. These were downloaded as part of a library renowned for its ease of setup; the link is in the reference section. [4] After data sets and features are acquired, the data is split into two groups: one first trains the neural network, and the other then tests it. The final algorithm we chose to integrate is reset_bluetooth.sh, which quickly resets the connection of both sensors when run from the terminal.
3. System Sensor Data Acquisition and Experimental Methods

3.1 Experimental Methods for Training and Testing Data Acquisition
Gait Classification aims to distinguish between three motion types: short strides, long strides, and sitting. Because machine learning performs the discrimination, sample data must be gathered to train the neural network. For the neural network to be properly trained, it is highly advised to obtain consistent sample data; a properly trained neural network increases the performance and success of your system. After setting your project goals and devising a plan, one should think about the following question: how can I construct an environment that gathers consistent and meaningful data? One should also take into account external factors that might affect data integrity. Practical methods are recommended to increase the consistency and integrity of the data.

Experimental data acquisition was performed in the corridors of Boelter Hall at UCLA. The length of each floor tile of Boelter Hall was chosen to define the length of a short stride. The short stride walking behavior was defined as stepping on every tile, with the toes aligned with the top of each tile while walking. The long stride walking behavior was defined as stepping over two tiles onto every third tile, again trying to align the toes with the top of the last tile when stepping down. The sitting behavior was defined as simply sitting down on a chair. Videos showing how to perform each of the three motions are available in section 3.2.

Data acquisition for the Gait Classification Prototype using Dual_Sensor_Data_Acquisition_V_1.1 can be performed by following the instructions below.
1.
Download Dual_Sensor_Data_Acquisition_V_1.1 to your computer from the link below: https://drive.google.com/drive/folders/14EwQMMeojMXlwkqeF0RlWlBMAUrBRM1-
2. Create a new directory using the following command:
   mkdir Dual_Sensor_Data_Acquisition_V_1.1
3. Using FileZilla, transfer the contents of the Google Drive folder into the directory you just created.
4. Compile with the following commands:
   make clean
   make all
   ● The directory Dual_Sensor_Data_Acquisition_V_1.1 should include the files shown in Figure 1 below at this point. You can list the contents with the 'ls' command.
Figure 1: Contents of Dual_Sensor_Data_Acquisition_V_1.1
● You need to edit Acquire_LowPass_Continuous_1.c and Acquire_LowPass_Continuous_2.c so that they print the timestamps in the motion data output files.
5. Open Acquire_LowPass_Continuous_1.c in the vim editor using the command:
   vi Acquire_LowPass_Continuous_1.c
6. Go to the section that contains the code shown in Figure 2 below:
Figure 2: Initial Code in Acquire_LowPass_Continuous_1.c
7. Add the sample time to the arguments of fprintf(). The result should be similar to Figure 3 below:
Figure 3: Modifications in Acquire_LowPass_Continuous_1.c
8. Go to the section that contains the following code:
Figure 4: Initial Code in Acquire_LowPass_Continuous_1.c
● Modify the code as in Figure 5 below.
Figure 5: Modifications in Acquire_LowPass_Continuous_1.c
● At this point, we can repeat the same steps as above for Acquire_LowPass_Continuous_2.c.
9. Open Acquire_LowPass_Continuous_2.c in the vim editor.
10. Repeat steps 6-8 (including the Figure 5 modification) for Acquire_LowPass_Continuous_2.c.
● At this point, the SensorTiles' MAC addresses need to be added to motion_data_sensortile_1.sh and motion_data_sensortile_2.sh.
● Make sure both SensorTiles are paired to but disconnected from the BeagleBone. Refer to the reference manual on the IoT Wiki page for how to pair the SensorTiles with the BeagleBone. [3]
11. Open motion_data_sensortile_1.sh in the vim editor. You should see content similar to Figure 6 below.
Figure 6: motion_data_sensortile_1.sh
12. Replace the existing MAC address with your SensorTile1's MAC address.
13. Open motion_data_sensortile_2.sh in the vim editor.
14. Replace the existing MAC address with your SensorTile2's MAC address.
15. Open reset_bluetooth.sh in the vim editor.
16. Add SensorTile1's and SensorTile2's MAC addresses in address1 and address2 respectively.
Figure 7: reset_bluetooth.sh
17. Compile with the following commands:
    make clean
    make all
18. Run with the following command:
    ./Dual_SensorTile_Acquire.sh -t 1
Now, dual sensor data acquisition can be performed in a synchronized manner. Make sure that after each trial of data acquisition, the motion data output file is renamed to a descriptive name so that dual sensor data acquisition will not overwrite the file in the next trial. The SensorTiles blink when they are on; however, while dual sensor data acquisition is running they will not blink, because they are connected to the BeagleBone at that time. Refer to the README file for any clarifications needed.

NOTE: In case of any BLE errors, first check that the MAC addresses are correct. Then run the Bluetooth reset bash script using the following command:
./reset_bluetooth.sh
3.2 Photos and Video
Gait Classification uses two sensors for data acquisition. Mark your SensorTiles #1 and #2 to avoid confusion. Place SensorTile #1 on the right leg slightly above the knee, and SensorTile #2 on the right ankle. The SensorTiles should be aligned, facing forward with the STM logo right side up. The placement of the SensorTiles is displayed in Figure 8. The SensorTile on the knee will be referred to as SensorTile1 and the SensorTile on the ankle as SensorTile2.
Figure 8: Placement of two SensorTiles on the right leg: SensorTile1 slightly above the knee, SensorTile2 on the ankle.
The video for data acquisition of the long stride motion is available at the following link: https://youtu.be/72NiaGn63vg

The video for data acquisition of the short stride motion is available at the following link: https://youtu.be/Rswj_TJWMMQ

The video for data acquisition of the sitting motion is available at the following link: https://youtu.be/yeBcN1LHx9E

Below is a link to a video demonstration of complete and detailed system implementation and operation: https://www.youtube.com/watch?v=lJZYZqtoEH4
4. System Sensor Data
There were two main attempts at motion classification for walking behavior. In the initial attempt, 10 trials of data were obtained for each of short stride walking, long stride walking, and standing; 7 trials were used to train the neural network and 3 to test the system for each motion. In the final attempt, 15 trials of data were acquired for each of short stride walking, long stride walking, and sitting; 10 trials were used to train the neural network and 5 to test the system for each motion. Two SensorTiles were used to acquire data, placed on the knee and the ankle of the right leg as depicted in the previous section. Data collected by the SensorTiles was transmitted over BLE to a Seeed Studio BeagleBone, which served as the processing unit. Example figures for all the data acquired in this project are available at the Google Drive link below:
https://drive.google.com/open?id=1LTkPQ1VjuUi8uDCdb-BN_48AofwYoxQ6
Figures 9-11 show examples of the motion data output files produced by dual sensor data acquisition.
Figure 9: Motion Data output for SensorTile1 short stride trial #11
Figure 10: Motion Data output for SensorTile1 long stride trial #11
Figure 11: Motion Data output for SensorTile1 sitting trial #11
5. Signal Feature Extraction
Features are what the neural network is trained on to perform the classification. To find features, one should first examine the raw data visually. SensorTile data acquisition provides accelerometer, gyroscope, and magnetometer data in the x, y, and z axes. After visually examining the raw data files, we noticed that Accel_y from SensorTile1 discriminates between short stride walking, long stride walking, and sitting; in SensorTile2, Accel_z and Gyro_z discriminate among the three motions. Running different algorithms on the raw data files allows a thorough analysis of discriminatory behavior. The algorithms used to perform this analysis were cross-correlation, autocorrelation, and the mean. After applying them to the motion data output files, the mean of Accel_y from SensorTile1, the autocorrelation of Accel_y from SensorTile1, the mean of Gyro_z from SensorTile2, and the mean of Accel_z from SensorTile2 exhibited characteristics that allow discrimination between the three motions.

Gait Classification therefore uses 4 features to classify between the three motions of short stride walking, long stride walking, and sitting. As the value of a feature may vary in range depending on the motion axis, it is strongly recommended to normalize the features and use a mapping to narrow the range of their values. The arctangent function was used to map the features to the range -π/2 to π/2; the mapped values are the final features used in Gait Classification. The features are the arctangent of the mean of Accel_y from SensorTile1, the arctangent of the autocorrelation of Accel_y from SensorTile1, the arctangent of the mean of Gyro_z from SensorTile2, and the arctangent of the mean of Accel_z from SensorTile2.

The following steps demonstrate how to extract the features for Gait Classification.
1. Export the motion data output files to the computer and open them in Excel.
2. Find the mean of the Accel_y column in each of SensorTile1's motion data output files.
3. Record the results, which should look like Table 1:

Table 1: Mean of Accel_y data from each motion data output file from SensorTile1.

File     Mean of Accel_y    File    Mean of Accel_y    File       Mean of Accel_y
Short11  -1017.974747       Long11  -1003.70202        Sitting11  -65.62121212
Short12  -1013.530303       Long12  -926.520202        Sitting12  -60.99494949
Short13  -1021.697143       Long13  -958.4848485       Sitting13  -60.48989899
Short14  -1021.89899        Long14  -950.0505051       Sitting14  -64.07575758
Short15  -1030.121212       Long15  -938.5336788       Sitting15  -56.91919192
Short16  -1019.385787       Long16  -990.2979798       Sitting16  -55.12626263
Short17  -1022.171717       Long17  -994.5858586       Sitting17  -50.44949495
Short18  -1027.807292       Long18  -979.6313131       Sitting18  -52.84343434
Short19  -1014.878788       Long19  -1004.464646       Sitting19  -52.72222222
Short20  -1018.481865       Long20  -995.2121212       Sitting20  -51.98989899
Short21  -1024.050505       Long21  -991.7025641       Sitting21  -51.30808081
Short22  -998.6969697       Long22  -1004.722222       Sitting22  -50.67676768
Short23  -1033.505051       Long23  -969.9949495       Sitting23  -50.58080808
Short24  -1016.843434       Long24  -983.5252525       Sitting24  -48.4040404
Short25  -1024.343434       Long25  -971.5151515       Sitting25  -52.5
4. Find the mean for the column of Accel_z in SensorTile2’s motion data output files for each file.
5. Record the results, which should look like Table 2:

Table 2: Mean of Accel_z data from each motion data output file from SensorTile2.

File     Average of Accel_z   File    Average of Accel_z   File       Average of Accel_z
Short11  9.315508021          Long11  -157.9946237         Sitting11  277.5925926
Short12  14.85561497          Long12  -213.7150538         Sitting12  267.0324324
Short13  19.50289017          Long13  -127.4526316         Sitting13  266.1263158
Short14  16.62130178          Long14  -187.9888889         Sitting14  271.7956989
Short15  17.10215054          Long15  -86.44086022         Sitting15  264.6402116
Short16  1.586021505          Long16  -101.5578947         Sitting16  332.7566138
Short17  4.397790055          Long17  -54.81283422         Sitting17  333.1925134
Short18  9.967914439          Long18  -85.74594595         Sitting18  328.7748691
Short19  19.6944444           Long19  -77.03703704         Sitting19  323.2356021
Short20  11.90526316          Long20  -82.44897959         Sitting20  325.5631579
Short21  2.64361702           Long21  -54.82887701         Sitting21  325.2169312
Short22  6.23404255           Long22  -77.15469613         Sitting22  324.0695187
Short23  8.77956989           Long23  -200.4202128         Sitting23  324.0591398
Short24  6.56521739           Long24  -144.6631579         Sitting24  284.4864865
Short25  13.4475138           Long25  -176.4413408         Sitting25  285.5
6. Find the mean of the Gyro_z column in each of SensorTile2's motion data output files.
7. Record the results, which should look like Table 3:

Table 3: Mean of Gyro_z data from each motion data output file from SensorTile2.

File     Average of Gyro_z   File    Average of Gyro_z   File       Average of Gyro_z
Short11  26.21925134         Long11  93.61827957         Sitting11  3.513227513
Short12  23.05882353         Long12  85.51075269         Sitting12  3.491891892
Short13  36.0982659          Long13  77.94736842         Sitting13  3.584210526
Short14  22.36686391         Long14  77.16666667         Sitting14  3.887096774
Short15  22.64516129         Long15  118.9946237         Sitting15  3.666666667
Short16  26.22043011         Long16  97.75789474         Sitting16  3.714285714
Short17  25.38674033         Long17  71.12834225         Sitting17  3.657754011
Short18  28.48128342         Long18  68.68108108         Sitting18  3.706806283
Short19  22.95               Long19  92.44444444         Sitting19  3.701570681
Short20  13.32105263         Long20  74.57653061         Sitting20  3.715789474
Short21  33.72340426         Long21  100.3101604         Sitting21  3.724867725
Short22  32.79255319         Long22  108.2541436         Sitting22  3.689839572
Short23  37.06451613         Long23  82.64361702         Sitting23  3.682795699
Short24  38.13043478         Long24  102.4578947         Sitting24  3.681081081
Short25  26.04972376         Long25  77.74860335         Sitting25  3.908602151
8. To perform cross-correlation and autocorrelation on the motion data output files, download the feature extraction package from the following Google Drive link:
   https://drive.google.com/open?id=1mCtj3DPQDliE9nENHHzAGTutlLRIcTka
9. Download Feature_Extraction_Correlation_4-11-2018.tar to the BeagleBone.
10. Untar it with the following command:
    tar -xvf Feature_Extraction_Correlation_4-11-2018.tar
11. Go to the Feature_Extraction_Correlation_4-11-2018 directory.
12. Open waveform_feature_xcorr.c in the vim editor.
13. Move to the section of the code that displays:
Figure 12: waveform_feature_xcorr.c
14. Declare the following parameters and add the following line to the section of the code:
Figure 13: Modifications to waveform_feature_xcorr.c
15. Move to the following section of the code:
Figure 14: waveform_feature_xcorr.c
16. Change the code to acquire the following result:
Figure 15: Modifications to waveform_feature_xcorr.c
17. Move to the following section of the code:
Figure 16: waveform_feature_xcorr.c
18. Change the code to acquire the following result in Figure 17:
Figure 17: Modifications to waveform_feature_xcorr.c
19. Compile with the following commands:
    make clean
    make all
20. Place all the motion data output files obtained from dual sensor data acquisition in the motion_data directory inside the Feature_Extraction_Correlation_4-11-2018 directory.
21. Download the files from the following Google Drive folder:
    https://drive.google.com/open?id=1In-9DyXUivumVm73OmVlN0dbtRIAmJx_
22. Place the 6 .dat files inside the Feature_Extraction_Correlation_4-11-2018 directory.
23. For each of the 6 file lists, run the following command:
    ./compute_cross_correlation_auto.sh 3 15 20 1
24. After each run of compute_cross_correlation_auto.sh, rename the file int_xcorr_summary.csv so the bash script will not overwrite it in the next run.
25. You are left with 6 files: the integrated cross correlation summary files for the three motions for each of the two SensorTiles.
26. The integrated cross correlation summary files can be found in the following Google Drive folder:
    https://drive.google.com/open?id=1sQeVRO70LcD1Z7hQ8SQL8ga56AOatAuj

Below is an example of an integrated cross correlation summary file.
Figure 18: Integrated cross correlation summary for short strides for SensorTile1.
Note the indices above each data row. The first index identifies the motion data output file in the xcorr file list that compute_cross_correlation_auto.sh was run on; the next index identifies the other motion data output file, in the order the files appear in the xcorr file list. The autocorrelation for Accel_y is the value that appears in the 1xx column of the rows whose two indices are the same, such as 1, 1; 2, 2; 3, 3. Continue with the following steps.
27. Extract the values that appear in the 1xx column of the rows whose two indices are the same, for 1, 1 through 15, 15. You will arrive at a result like Table 4:
Table 4: Autocorrelation of Accel_y for SensorTile1

File     Autocorrelation of Accel_y   File    Autocorrelation of Accel_y   File       Autocorrelation of Accel_y
Short11  1332.275757                  Long11  382.12439                    Sitting11  14.085385
Short12  1345.859253                  Long12  254.788086                   Sitting12  12.299966
Short13  892.907349                   Long13  456.547485                   Sitting13  12.090223
Short14  1458.25061                   Long14  588.510864                   Sitting14  11.625508
Short15  1423.42981                   Long15  212.2724                     Sitting15  10.746694
Short16  1545.145996                  Long16  342.725769                   Sitting16  10.076032
Short17  1644.133545                  Long17  508.320312                   Sitting17  8.069046
Short18  1303.133423                  Long18  323.230408                   Sitting18  9.235287
Short19  1082.99292                   Long19  583.78772                    Sitting19  9.275327
Short20  1060.900391                  Long20  407.624817                   Sitting20  8.959215
Short21  1127.767944                  Long21  401.99649                    Sitting21  8.849077
Short22  1064.938843                  Long22  245.841339                   Sitting22  8.630483
Short23  1154.57312                   Long23  412.947357                   Sitting23  8.546937
Short24  1108.045288                  Long24  484.843109                   Sitting24  7.987212
Short25  1129.228882                  Long25  421.793213                   Sitting25  9.802538
28. Now that feature extraction is complete, take the arctangent of all the features to map them to the range -π/2 to π/2. You will arrive at results similar to Tables 5-7.

Table 5: Arctangent of features for short stride walking.
File     S1 autocorr Accel_Y   S2 Gyro_Z     S2 Accel_Z    S1 Accel_Y
Short11  1.570045732           1.53267489    1.463857967   -1.569813984
Short12  1.570053307           1.527456137   1.50358311    -1.569809677
Short13  1.56967639            1.543101248   1.519566739   -1.569817563
Short14  1.570110574           1.526117086   1.510705001   -1.569817757
Short15  1.570093798           1.526665454   1.512390651   -1.569825568
Short16  1.570149139           1.532676602   1.008245664   -1.569815344
Short17  1.570188104           1.531426039   1.34721113    -1.569818018
Short18  1.570028946           1.535699967   1.470808983   -1.569823382
Short19  1.56987296            1.527250887   1.520064155   -1.569810988
Short20  1.569853732           1.495867724   1.486996574   -1.569814474
Short21  1.56990962            1.541152017   1.209162228   -1.569819813
Short22  1.569857306           1.540311046   1.411741757   -1.569795022
Short23  1.569930206           1.543822888   1.457384292   -1.569828746
Short24  1.569893837           1.544576567   1.419640277   -1.569812892
Short25  1.569910767           1.532427044   1.496569759   -1.569820092
Table 6: Arctangent of features for long stride walking.

File    S1 autocorr Accel_Y   S2 Gyro_Z     S2 Accel_Z     S1 Accel_Y
Long11  1.568179384           1.560115058   -1.564467082   -1.569800015
Long12  1.566871517           1.559102424   -1.566117233   -1.56971702
Long13  1.568605978           1.557967861   -1.562950436   -1.569753014
Long14  1.569097124           1.557838089   -1.565476914   -1.569743752
Long15  1.566085434           1.562392784   -1.55922824    -1.569730835
Long16  1.56787855            1.560567331   -1.55922824    -1.56978653
Long17  1.568829066           1.55673816    -1.552554448   -1.569790884
Long18  1.567702568           1.556237305   -1.559134497   -1.569775535
Long19  1.569083377           1.559979441   -1.557816287   -1.569800772
Long20  1.568343095           1.557388086   -1.558668209   -1.569791516
Long21  1.568308748           1.560827577   -1.552559784   -1.56978796
Long22  1.566728685           1.561559068   -1.557836079   -1.569801027
Long23  1.568374715           1.558696769   -1.565806851   -1.569765394
Long24  1.568733807           1.56103653    -1.563883827   -1.569779576
Long25  1.568425501           1.557935069   -1.565128781   -1.569767007
Table 7: Arctangent of features for sitting.

File       S1 autocorr Accel_Y   S2 Gyro_Z     S2 Accel_Z    S1 Accel_Y
Sitting11  1.499919674           1.293491494   1.567193941   -1.555558531
Sitting12  1.489673711           1.291883422   1.567051481   -1.554402995
Sitting13  1.488272711           1.298713802   1.56703873    -1.554266147
Sitting14  1.484989793           1.318995157   1.567117109   -1.555191067
Sitting15  1.478011635           1.304544278   1.56701763    -1.553229367
Sitting16  1.471874842           1.307801595   1.567791136   -1.552658142
Sitting17  1.447494627           1.303925849   1.567795068   -1.550977118
Sitting18  1.462936219           1.307295138   1.567754741   -1.551874758
Sitting19  1.463398253           1.306939483   1.567702618   -1.551831267
Sitting20  1.459639491           1.30790319    1.567724736   -1.551564193
Sitting21  1.458267582           1.308514898   1.567721466   -1.551308687
Sitting22  1.455442349           1.306139173   1.567710579   -1.551065979
Sitting23  1.454324889           1.305656349   1.56771048    -1.551028558
Sitting24  1.446244284           1.30553856    1.567281236   -1.550139833
Sitting25  1.469133622           1.320323213   1.567293714   -1.551751011
The features have now been extracted and are ready to be input to the neural network for training. Use the first ten sets of features for training, and the remaining five sets for testing.
6. Neural Network Implementation
The Gait Classification Prototype uses the Fast Artificial Neural Network (FANN) library to classify between the motions of short stride walking, long stride walking, and sitting. Now that the features are extracted, they will be input to the FANN neural network for training and testing purposes. Installation of FANN follows these steps:
1. Connect your BeagleBone to a WiFi network with Internet access.
2. Navigate to the root directory using the following Linux command:
   cd
3. Update the apt-get utility using the following Linux command:
   apt-get update
4. Install cmake using the following Linux command:
   apt-get install cmake
5. Download the FANN library using the following Linux command:
   git clone https://github.com/libfann/fann.git
6. Navigate into the fann directory using the following Linux command:
   cd fann
7. Compile and install FANN using the following commands (note the period, ".", indicating the local directory):
cmake .
sudo make install
8. Link the library: ldconfig
9. Test the installation using the bundled example:
cd examples
make
./xor_train
./xor_test
Gait Classification uses the xor_train and xor_test modules to train and test the neural network. The neural network's architecture consists of 3 hidden layers with 4 neurons in each hidden layer. The network takes 4 inputs, the features that have been extracted, and gives 1 output, the type of motion it predicts based on training. One can modify the architecture of the neural network for better optimization; this is done in xor_train.c. The architecture used for Gait Classification is depicted below.
Figure 19: FANN neural network architecture.
In creating the training and testing datasets, the numbers -1, 0, and 1 have been assigned as the output to represent the motions of short stride walking, long stride walking, and sitting, respectively. Do the following:
1. Download the training and testing datasets from the link below:
https://drive.google.com/open?id=1ejYJtnCsTPsif0oyYvR44xRaFi_dubeI
2. Place the two .data files in the examples directory on the BeagleBone.
3. Train and test the network:
./xor_train
./xor_test
7. Demonstration Data
7.1 Unsuccessful Attempt
It is important to show one of the unsuccessful attempts at motion classification. To classify among 3 motion types, it is recommended to use at least 4 features. With fewer features, the performance of the system decreases, and it is unlikely that the system can discriminate correctly. To illustrate the importance of quality features, the neural network was trained and tested in an unsuccessful motion classification attempt using only 3 features. The results are provided below.
Figure 20: Training Errors in an unsuccessful case
Figure 21: Testing Errors in an unsuccessful case
Below is the confusion matrix for testing of the unsuccessful classification attempt. None of the long stride walking trials were predicted correctly; all of them were falsely predicted as short stride walking. Moreover, the error involved in predicting short stride walking was extremely high, at 47%.
Figure 22: Testing Confusion Matrix for the Unsuccessful Classification Attempt.
7.2 Final Attempt
The final results can be seen below. First, we analyze the neural network after it has been trained (using xor_train.c) with our features for the 10 designated trials (Figure 23).
Figure 23: The results of the neural network after training.
We have given short strides the classification of '-1', long strides '0', and sitting '1'. In the rightmost column of the "difference" portion, our data was classified with a maximum difference of 0.015, which corresponds to a minimum accuracy of around 98.5%. Figure 24 below shows the confusion matrix of the trained neural network.
Figure 24: The confusion matrix of the trained neural network.
Next, we test the neural network (through xor_test.c) using the five designated testing datasets. The results are shown in Figure 25.
Figure 25: The results of the neural network after testing.
The difference between the designated classification constant for each motion and the actual test output is small, again giving a minimum accuracy of 98.5%. The confusion matrix for the test data is analyzed in Figure 26.
Figure 26: The confusion matrix of the neural network after passing the test data.
Once again, we see a perfect confusion matrix. As the matrices for both the training and test data are perfectly diagonal across a significant number of datasets, we can say that we have created a successful classifier and neural network for the three types of motion we selected. A video of the overall process and demonstration is linked in the reference section below [5].
8. Source Code
8.1 Source Code Modules
Gait Classification uses the Dual Sensor Data Acquisition and Feature Extraction Correlation modules for data acquisition and feature extraction. Dual Sensor Data Acquisition relies on two C files, Acquire_LowPass_Continuous_1.c and Acquire_LowPass_Continuous_2.c, which were developed by Charles Zaloom and modified throughout this project. The module relies on the gatttool interface to the BLE motion system on the SensorTile to produce the motion data output files. The Feature Extraction Correlation module delivers autocorrelation and crosscorrelation of the data in the motion data output files. The autocorrelation of Accel_Y from SensorTile 1 was used as a feature.
8.2 Functionality of Modules
The Feature Extraction Correlation module was modified to perform autocorrelation and crosscorrelation on any of the axes of motion in the SensorTile data. The module functions by reading two columns from the motion data output files, normalizing the data, and performing autocorrelation and crosscorrelation operations on it. The following figures display part of the modifications that were made in the feature extraction section.
Figure 27: waveform_feature_xcorr.c
Figure 28: waveform_feature_xcorr.c
Feature Extraction Correlation performs autocorrelation and crosscorrelation on two files at a time. In waveform_feature_xcorr.c, &amplitude_vector1x[i] and &amplitude_vector1y[i] are the pointers into which the column vectors of data from the first motion data output file are stored. Similarly, &amplitude_vector2x[i] and &amplitude_vector2y[i] are the pointers into which the column vectors from the second motion data output file are stored. By changing the order in which those pointers appear in the argument list of sscanf(), one can change which column vector is read from the motion data output file. The columns of the motion data output files are sample time, Accel_x,
Accel_y, Accel_z, Gyro_x, Gyro_y, Gyro_z, Magneto_x, Magneto_y, and Magneto_z. By rearranging the pointers discussed above, one can read the column vector of choice for feature extraction.
8.3 Google Drive Links
The steps to modify the source code were discussed throughout the preceding sections. Additionally, the modified source code is available at the following Google Drive link.
https://drive.google.com/open?id=1qTHNuAcx3AMTQCifwvG-bvhF5tr9pq39
9. Design Guidance
9.1 Overall Experience
This was, overall, a very fun and rewarding experience in creating an embedded system related to wireless health and sensors. While the steps were not extremely difficult to understand, the project was still fairly challenging for engineering students with limited background in C. However, one can easily strengthen their Linux skills through this project alone, since it requires several scripts and commands to speed up the process of gathering data and launching algorithms. We also felt that this project showed the impressive capabilities of the SensorTile and its ease of handling. With nine different sensors to choose from, we were at times overwhelmed by the numerous possibilities and forms our project could take. Taking the time to learn each step of the project really helped us develop as engineers, and we hope to develop this project even further.
9.2 Challenges
The main challenges revolved around deciding what sort of classification could be done in a 10-week span. As such, there was no immediate goal, and the prototype benchmark was only set after obtaining data and features for a two-motion classification in the first two weeks of the project. Related to this was the question of whether we had too few or too many features at various points, especially when training and testing in the early stages. Finally, the instability of the BeagleBone connection was at times a hindrance, since it required repeating several tests and dedicating extended periods of time to testing, especially because the SensorTiles had an estimated battery life of only 15 minutes before requiring a recharge. However, these challenges were managed through constant recording of our progress and weekly consultation with our advisor, Dr. Kaiser.
9.3 Conclusion and Next Steps
We have succeeded in creating a classifier that distinguishes long strides from short strides, as well as sitting. As previously mentioned, the ultimate goal is to classify gait through stride length, and we believe our developments are a step in the right direction. There are several next steps we discussed over the course of the ten weeks. First, we would want to develop a termination algorithm that activates when a test is disturbed by irregular behavior; this was an enticing idea that we explored for the entirety of the project but could not implement due to time constraints. We would also like to add a medium stride classification and, consequently, standardize a final test that determines gait classification from the resulting neural network. Overall, the project's speed could be further improved if some of the procedures, such as averaging values, were automated. Following the development of this test, it would be ideal to create a user-friendly interface for whoever decides to use this product. We realize, however, that there are many other ways our project can grow. We would therefore like not only to promote this project to hospitals and other research facilities, but also to present it to specialists who can advise us on how to improve it for rehabilitation.
9.4 Acknowledgments
We would like to acknowledge Charles Zaloom for creating the source code for the Acquire_LowPass_Continuous.c file, and Steffen Nielsen for the FANN library and source code. Furthermore, we would like to acknowledge Xu Zhang for developing the Dual_SensorTile.sh and reset_bluetooth.sh scripts and for providing us with expert advice. Finally, a special thank you to Dr. Kaiser for his constructive criticism as we developed our project and for the opportunity to create and share it.
10. References
[1]: BeagleBone Documentation: https://www.mouser.com/pdfdocs/Seeed-BBG_SRM_V3_20150804.pdf
[2]: STMicro SensorTile Documentation: http://www.st.com/content/ccc/resource/technical/document/user_manual/group0/bc/b1/ad/c8/36/de/40/92/DM00320099/files/DM00320099.pdf/jcr:content/translations/en.DM00320099.pdf
[3]: BeagleBone and SensorTile Tutorial: http://iot.ee.ucla.edu/UCLA-STMicro/index.php/UCLA_-_STMicroelectronics
[4]: FANN Library Download and Tutorial: http://leenissen.dk/fann/html/files/fann-h.html
[5]: Demonstration and Process Video: https://www.youtube.com/watch?v=lJZYZqtoEH4&t=930s