And The Coding Continues: PATHS-UP Week 2

This week, I continued to code using Python, this time with some success. Last week, I was learning how to code; this week, I was implementing some of the coding techniques involved in rPPG. My first big coding task was to implement thresholding on a live video. Basically, it takes the video and converts it to black and white (see the images and video below).

Black and white thresholding! Look at the eyes!
Thresholding Pic 2
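
In case it helps anyone trying this, here is a simplified sketch of the kind of OpenCV code that does live thresholding. It is not my exact script: it assumes the default webcam at index 0, and the cutoff value of 127 is just a typical starting point.

```python
import cv2

# Open the default webcam (index 0) -- an assumption about the setup.
cap = cv2.VideoCapture(0)

while True:
    ret, frame = cap.read()
    if not ret:
        break

    # Convert the frame to grayscale, then apply a fixed binary threshold:
    # pixels brighter than 127 become white (255), everything else black (0).
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, thresh = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY)

    cv2.imshow("Thresholded", thresh)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```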

After thresholding, I worked on code that would implement facial recognition. The way the code works is that it recognizes the face by placing a box around it, then tries to recognize the eyes by placing green boxes around them. When I tried it on a focused picture of myself (not too much in the background), it worked perfectly. When I tried it again on a picture of my daughter (with a very busy background), it recognized her face, but it also recognized several inanimate objects as eyes, such as a picture of a chin and a heart.


Face Recognition on Me
Facial Recognition on my daughter Kailen
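
For anyone curious, a typical way to do this uses OpenCV's built-in Haar cascades. This is a simplified sketch rather than my exact code: the cascade file names are the ones that ship with OpenCV, and "face.jpg" is just a placeholder input.

```python
import cv2

# Load the Haar cascades that ship with OpenCV for faces and eyes.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

img = cv2.imread("face.jpg")  # placeholder input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Detect faces, then look for eyes only inside each detected face region.
for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
    cv2.rectangle(img, (x, y), (x + w, y + h), (255, 0, 0), 2)  # box around the face
    roi_gray = gray[y:y + h, x:x + w]
    roi_color = img[y:y + h, x:x + w]
    for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi_gray):
        cv2.rectangle(roi_color, (ex, ey), (ex + ew, ey + eh), (0, 255, 0), 2)  # green boxes on eyes

cv2.imshow("Detections", img)
cv2.waitKey(0)
cv2.destroyAllWindows()
```

Restricting the eye search to the detected face region (as in the sketch) should help cut down on the kind of false eye detections a busy background can cause.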

My third big coding task was color tracking. This one was a little bit tougher for me, but with the help of Gary (a fellow PATHS-UP RET) and Yong (our mentor), I was able to get it done and extend the concept. The first time, I was only tracking the color yellow. I noticed that the tracker is much better at picking up darker yellows (see the yellow paper vs. the yellow key). Then I expanded it to track two colors at once (yellow and red).

color tracking yellow 1
color tracking yellow 2 (darker yellow)
color tracking burgundy
color tracking 2 colors
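
One common way to do this kind of color tracking is with HSV masks, which is roughly the approach here. This is only a sketch, not my exact code, and the yellow and red ranges below are rough guesses that would need tuning for the lighting and the exact shade.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # assumes the default webcam

# Rough HSV ranges for yellow and red -- these bounds are assumptions
# and usually need tuning (darker shades shift the value/saturation).
yellow_lower = np.array([20, 100, 100], dtype=np.uint8)
yellow_upper = np.array([35, 255, 255], dtype=np.uint8)
red_lower = np.array([0, 120, 70], dtype=np.uint8)
red_upper = np.array([10, 255, 255], dtype=np.uint8)

while True:
    ret, frame = cap.read()
    if not ret:
        break

    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

    # Build one mask per color, then combine them so both are tracked at once.
    yellow_mask = cv2.inRange(hsv, yellow_lower, yellow_upper)
    red_mask = cv2.inRange(hsv, red_lower, red_upper)
    combined = cv2.bitwise_or(yellow_mask, red_mask)

    # Keep only the pixels that fall in either color range.
    tracked = cv2.bitwise_and(frame, frame, mask=combined)

    cv2.imshow("Color tracking", tracked)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```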

Other tasks I have completed include planning a draft for the Computing for Health Summer Camp. I also found a really interesting article online titled “Measurement of heart rate variability using off-the-shelf smart phones,” which describes how their approach reduces the absolute errors of HRV metrics compared with related work using RGB color signals from short video clips taken with the smartphone camera. That makes it possible to produce reliable HRV metrics for remote health monitoring in a convenient and comfortable way. The difference between their paper and what I plan to do with my research is that they used a chrominance-based remote PPG algorithm, while Rice and I are using a distance rPPG algorithm, and I will potentially be creating a mobile app.

I was also able to play around with a gaze detector app. It was pretty cool; it detects facial regions when you gaze at the computer screen.

The research continues….


4 thoughts on “And The Coding Continues: PATHS-UP Week 2”

  1. Hey Sheila!

    I am actually going to have to learn Python later on this summer, so this is so cool to see in action.

    How are you enjoying the challenges of coding and the details it entails? Do you have to take notes over it so it doesn’t have the same errors as the next code?

    1. I have enjoyed learning to code with Python. As with any coding, there will be challenges, but I am not sure anyone enjoys the challenges, especially when you spend hours trying to debug something and it turns out to be one little thing.

      Yes, I do take notes. I use the notebook that Christina provided us to take notes on everything.

  2. I’m so J that you got face tracking done the 2nd week. What’s the next step? Do you want to still use this model to design a mobile app?

    1. My next step was doing real-time face tracking using the webcam (which I got done during week 3) and having the screen actually take the heart rate (which I also got done in week 3). My plan is to have the mobile app work the same way as the webcam, but I will see how this goes.
