This week went by really really fast. I woke up and it was already Friday! Time sure does fly when you are having fun.
This week, I tried the different light filters to determine which filter would give a more accurate pulse rate. The pulse rate from the webcam GUI was compared against the pulse oximeter. For the fairer-skin-toned person (my office mate), the green light filter gave a more accurate pulse rate. However, for a darker-skin-toned person (myself), the yellow light filter gave a more accurate pulse rate.
Also, while doing research this week, I learned that there is an association between resting heart rate and diabetes. The researchers found that participants with faster resting heart rates had an increased risk of being diabetic or pre-diabetic. This was very interesting. If the research that I am doing this summer could help with the detection of diabetes, that would be amazing.
I also spent part of my week working on my Research Symposium Poster. Minus a few adjustments, I think that I am done with it.
My goal for this week was to use the webcam to track the face, locate a region of interest (ROI), and then use that ROI to measure the heart rate (HR). These things were done with success, but I still have a few bugs/kinks to work out. Check out my videos and pics below:
Video 1: Face and ROI Tracking with motion
Video 2: Face and ROI Tracking Static
Video 3: Heart Rate Detection using Webcam
rPPG output using ambient lighting inside. The top graph (wave) is the raw signal. The bottom graph (wave) is the power spectral density curve, where the peak represents the actual heart rate.
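The raw-signal-to-PSD-peak step in the graphs above can be sketched in a few lines. This is a minimal sketch assuming NumPy/SciPy; the function name, filter order, and the 0.7–4 Hz band (roughly 42–240 BPM) are my own choices, not the exact parameters of my program.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch

def estimate_hr(signal, fs=30.0):
    """Estimate heart rate (BPM) from a raw rPPG trace sampled at fs Hz.

    Band-pass to a plausible HR range (0.7-4 Hz ~ 42-240 BPM), then take
    the frequency of the power spectral density peak.
    """
    b, a = butter(3, [0.7, 4.0], btype="band", fs=fs)
    filtered = filtfilt(b, a, signal)
    freqs, psd = welch(filtered, fs=fs, nperseg=min(len(filtered), 256))
    # Restrict the peak search to the heart-rate band
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(psd[band])]
    return peak_freq * 60.0  # Hz -> beats per minute
```

So a clean 72 BPM pulse shows up as a PSD peak near 1.2 Hz, and the webcam HR is just that peak frequency times 60.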
Video 4: Using Pulse Ox to compare webcam HR
I was trying the program using natural lighting outside. However, yesterday (6/28/19), when I tried to run the program again using an additional green light source, the program kept freezing and wouldn’t take the HR. I will have to debug the code and figure out the issue.
Attempting to run the program again using additional light source
I’ve also set up a meeting for this weekend with my friend Doug to help me learn how to build an app since I have no experience building an app.
This week, I continued to code using Python, this time with success. Last week, I was learning how to code; this week, I was implementing some of the coding techniques involved in rPPG. My first big coding task was to implement thresholding on a live video. So basically, it takes the video and converts it to black and white (see images and video below).
Black and white thresholding! Look at the eyes!
Thresholding Pic 2
After thresholding, I worked on code that would implement facial recognition. The way the code works, it recognizes a face by placing a box around it, then tries to recognize the eyes by placing green boxes around them. When I tried it on a focused picture of myself (nothing too much in the background), it worked perfectly. When I tried it again on a picture of my daughter (with a very busy background), it recognized her face, but it also recognized several inanimate objects as eyes, such as a picture of a chin and a heart.
Face Recognition on Me
Facial Recognition on my daughter Kailen
My third big coding task was color tracking. This one was a little tougher for me, but with the help of Gary (a fellow PATHS-UP RET) and Yong (our mentor), I was able to get it done and extend the concept. The first time, I was only tracking the color yellow. I noticed that the code is much better at picking up darker yellows (see the yellow paper vs. the yellow key). Then I expanded it to track two colors at once (yellow and red).
Other tasks that I have completed include drafting a plan for the Computing for Health Summer Camp. I also found a really interesting article online titled “Measurement of heart rate variability using off-the-shelf smart phones.” It describes how their approach reduces the absolute errors of HRV metrics compared with related work using RGB color signals from short video clips taken with the smartphone’s camera, making it possible to produce reliable HRV metrics for remote health monitoring in a convenient and comfortable way. The difference between their paper and what I plan to do with my research is that they used a chrominance-based remote PPG algorithm, while Rice and I are using a distancePPG algorithm, and I will potentially be creating a mobile app.
I also was able to play around with a gaze detector app. It was pretty cool. It detects facial regions while you gaze at the computer screen.
My daily quest in scientific research always begins with a picturesque one-mile walk from my car to the office.
It’s normally breezy in the morning, so walking to “work” is always nice and pleasant. In the afternoon, though, that one-mile walk quickly turns from pleasant into a workout. Lol! However, the work that I am doing and will be doing this summer is worth every ounce of sweat.
This week’s work has focused on learning about PPG and distancePPG, and learning to code using Python. Past research at Rice by Dr. Ashok Veeraraghavan and his team involving PPG (photoplethysmography) has focused on using PPG and a web camera to monitor, without contact, vital signs such as pulse rate (PR) and pulse rate variability (PRV). DistancePPG detects the blood volume changes in the microvascular bed of tissues in the face. Their research focused on certain parts of the face, such as the forehead, cheeks, and under-eye area.
Fig 2. Overall steps involved in the distancePPG algorithm for estimating camera-based PPG. From DistancePPG: Robust non-contact vital signs monitoring using a camera, by A. Kumar, A. Veeraraghavan, and A. Sabharwal, 2015, https://www.ncbi.nlm.nih.gov/pubmed/26137365. Copyright 2015.
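The core idea of recovering a PPG signal from video can be sketched very simply: average the pixel intensities over a skin region in each frame, so the tiny blood-volume changes show up as a time series. This is a simplified illustration assuming NumPy (the real distancePPG algorithm in Fig 2 does much more, like weighting regions by signal quality); the function name is my own.

```python
import numpy as np

def roi_mean_signal(frames, roi):
    """Mean green-channel value of a fixed ROI across video frames.

    frames: list of HxWx3 uint8 arrays; roi: (x, y, w, h).
    Green (channel index 1 in both RGB and BGR) carries the strongest
    blood-volume signal, which is why it is commonly used for PPG.
    """
    x, y, w, h = roi
    return np.array([f[y:y + h, x:x + w, 1].mean() for f in frames])
```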
However, this is what the patient interface can potentially look like.
Image above is from “App Uses Your Webcam to Detect Your Heart and Breathing Rates,” by J. Mulroy, 2011, https://www.pcworld.com/article/244211/app_uses_your_webcam_to_detect_your_heart_and_breathing_rates.html

There are several benefits to using distancePPG and non-contact vital signs monitoring, especially in low-income and predominately minority communities. This type of vital signs monitoring can (1) be an invaluable tool for physicians who need to make rapid life-and-death decisions, (2) help physicians and patients make better-informed decisions as patients’ long-term vital signs data becomes available, and (3) reduce the cost of care (especially since healthcare costs are very expensive for low-income and minority communities).
My potential independent research for this project takes this concept of distancePPG using webcams and makes it more accessible to minority communities by testing whether the concept can work using cameras on smartphones. I am choosing to focus on smartphone cameras instead of webcams because nearly everyone in minority communities has access to a smartphone camera, while many do not have access to a webcam and/or a laptop.
So to begin my research, I have been looking at articles that discuss the possible implications of using a smartphone camera with PPG. I have also started learning to code using Python.
The image above is of code in Python that I debugged.
The image above is of me writing a command using if-elif-else statements.
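For a flavor of the if-elif-else pattern I was practicing, here is a toy example (the function and cutoffs are made up for illustration, loosely themed on this project):

```python
def classify_heart_rate(bpm):
    """Label a resting heart rate as low, normal, or high."""
    if bpm < 60:
        return "low"
    elif bpm <= 100:
        return "normal"
    else:
        return "high"
```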
Besides learning, I have also done some really fun things this week. One of the cool things I had the opportunity to engage with was a PATHS-UP 2018 research project that allowed me to find my pulse using a pulse sensor and Arduino code (see image below). I also listened to some very interesting research presentations from the Scalable Health Lab researchers. In my opinion, the most intriguing presentation was about F.L.A.S.H. by Anil V. F.L.A.S.H. stands for Family Level Assessment of Screen-Use in the Home. The purpose of F.L.A.S.H. was to determine the amount of time a specific child spent looking at the T.V. screen.
Finding my pulse using a pulse sensor and Arduino code
Last but not least, I want to shout out my 2019 PATHS-UP peeps. Thanks for your help, snacks, banter, and candor this week. See you next week!
Back row (L to R): Pam, Gary, Ali, Yong (Rice ECE PhD graduate student; PATHS-UP mentor). Front row (L to R): Me, Jimmy, and Azka
K-12 Educators Disseminating Research from Rice University, Arizona State University, and University of Texas-El Paso