


LAFCam:
Leveraging Affective Feedback Camcorder

Editing home videos is cumbersome and boring. Yet it needs to be done: you easily end up with hours of material, and nobody in your circle of friends wants to watch it all.

Several attempts have been made to automate this process, but visual content is far too complex for a machine to decide whether a shot of the pool is more important than one of grandma.

LAFCam takes a different approach: instead of looking at what the camera sees, the camera looks at the camera operator. If she/he laughs, the camera probably captured something funny. If she/he experiences an emotional moment, such as meeting a friend not seen in a long time, the camera marks the event and also stores a picture of the operator at that moment.

I wrote a video editing application that uses these cues to index and browse hours of video more effectively. Laughs are detected automatically with a Hidden Markov Model and help create an automatic summary of the footage. Skin conductance (familiar from lie detectors) is used to cut final shots of varying lengths.
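
To illustrate the skin-conductance idea, here is a minimal sketch (not the original LAFCam code) of how per-second conductance readings could drive shot selection: segments where the operator's arousal stays above a threshold become shots, and lowering the threshold yields a longer cut of the same footage. The function name, sample data, threshold, and minimum shot length are all illustrative assumptions.

# Minimal sketch, assuming conductance is a list of per-second samples
# aligned with the video timeline; thresholds are illustrative only.
def select_shots(conductance, fps=1.0, threshold=None, min_len=2.0):
    """Return (start_sec, end_sec) shots where arousal exceeds a threshold."""
    if threshold is None:
        # Simple baseline plus margin; a real system could adapt this per user.
        baseline = sum(conductance) / len(conductance)
        threshold = baseline * 1.2

    shots, start = [], None
    for i, value in enumerate(conductance):
        t = i / fps
        if value >= threshold and start is None:
            start = t                      # arousal rises: open a shot
        elif value < threshold and start is not None:
            if t - start >= min_len:       # keep only shots of useful length
                shots.append((start, t))
            start = None
    if start is not None:
        shots.append((start, len(conductance) / fps))
    return shots

# Example (made-up data): two arousal peaks produce two shots.
samples = [2.0, 2.1, 2.0, 3.5, 3.8, 3.9, 3.6, 2.2, 2.1, 4.0, 4.2, 2.0]
print(select_shots(samples, fps=1.0))   # -> [(4.0, 7.0), (9.0, 11.0)]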

A more detailed description, including images of what the Hidden Markov Model had to detect, as well as the paper, is available on the LAFCam page.



