
post Evan’s photos of the Emotional Contagion Event at the Dana Center

February 15th, 2008

img_1710.jpg
Harry, Neil and myself, getting ready to talk, Dana Event

img_1702.jpg

Pablo taking part in the live event

img_1697.jpg

Evan setting up upstairs.

img_1682.jpg

The first stage of Chameleon

post beginning emotional algorithms

February 13th, 2008

This is how we are starting it off: with six basic emotions and two people facing each other.

Emotions are:

(“neutral”, “happy”, “surprised”, “disgusted”, “angry”, “sad”)

What we need to know is the most probable responses to each of these.

For example

If I was happy – main response, secondary response, tertiary response of other?

If I was sad – main response, secondary response, tertiary response of other?

If I was neutral – main response, secondary response, tertiary response of other?

If I was disgusted – main response, secondary response, tertiary response of other?

If I was surprised – main response, secondary response, tertiary response of other?

If I was angry – main response, secondary response, tertiary response of other?

After I was angry, for example, what would be my own follow-up emotion?

If I displayed anger, for example, what would my most probable next display of emotion be?
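
Until the answers come back, a placeholder structure for recording them could simply be a ranked list per emotion, roughly like the sketch below (the variable names are my own and the lists are left empty rather than guessed):

// The six basic emotions the first prototype works with.
var EMOTIONS = new Array("neutral", "happy", "surprised", "disgusted", "angry", "sad");

// For each emotion I display: the other person's most probable responses
// (main, secondary, tertiary) and my own likely follow-up emotions.
// Both lists stay empty until Chris and Dylan send their answers.
var responseTable = {};
for (var i = 0; i < EMOTIONS.length; i++) {
    responseTable[EMOTIONS[i]] = { response: new Array(), followUp: new Array() };
}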

post Emotional Algorithms.

February 13th, 2008

Spent the day working on the first prototype. The images need replacing – will shoot stills over the next few days, maybe tomorrow? Also, it is rather blunt – one image for disgust, anger, etc. Basically – shot in profile – seven images for each emotion. The male and female face each other. Each other's emotional response triggers the appropriate emotional response. Looks good, but the lighting isn't right. Should be shown on two monitors, flat screen. Matt and I can pose tomorrow? Also want to shoot Pablo looking into the mirror for the first time. Tomorrow?

Looking at the algorithmic codes that trigger the work. Need more work. Will send them on to Chris Frith and Dylan Evans. Dylan sent through these questions last week, a lot of which I don't know the answers to. Reading Dylan's short book on emotions at the moment. Simple, but great. Chris's book arrived today.

Dylan Evans

“Facial feature tracking is hard enough. My PhD student in Bristol was using Active Appearance Models for tracking, but this requires manual labelling of training footage, which is tedious and introduces errors. Automatic landmarking algorithms enable a more accurate fitting. Do you know what the MIT team is doing to track facial features?”

emotion[EMOTION_NEUTRAL][POSSIBLE_RESPONSE] = new Array(EMOTION_NEUTRAL, EMOTION_NEUTRAL, EMOTION_HAPPY, EMOTION_SAD);
emotion[EMOTION_NEUTRAL][POSSIBLE_FOLLOW_UP] = new Array(EMOTION_NEUTRAL, EMOTION_HAPPY, EMOTION_SURPRISED, EMOTION_DISGUSTED, EMOTION_ANGRY, EMOTION_SAD);

emotion[EMOTION_HAPPY][POSSIBLE_RESPONSE] = new Array(EMOTION_HAPPY, EMOTION_SURPRISED);
emotion[EMOTION_HAPPY][POSSIBLE_FOLLOW_UP] = new Array(EMOTION_NEUTRAL, EMOTION_HAPPY, EMOTION_SURPRISED);

emotion[EMOTION_SURPRISED][POSSIBLE_RESPONSE] = new Array(EMOTION_NEUTRAL, EMOTION_HAPPY, EMOTION_DISGUSTED, EMOTION_ANGRY, EMOTION_SAD);
emotion[EMOTION_SURPRISED][POSSIBLE_FOLLOW_UP] = new Array(EMOTION_NEUTRAL, EMOTION_HAPPY, EMOTION_DISGUSTED, EMOTION_ANGRY, EMOTION_SAD);

emotion[EMOTION_DISGUSTED][POSSIBLE_RESPONSE] = new Array(EMOTION_SAD, EMOTION_ANGRY, EMOTION_DISGUSTED, EMOTION_SURPRISED);
emotion[EMOTION_DISGUSTED][POSSIBLE_FOLLOW_UP] = new Array(EMOTION_SAD, EMOTION_ANGRY, EMOTION_NEUTRAL, EMOTION_SURPRISED);

emotion[EMOTION_ANGRY][POSSIBLE_RESPONSE] = new Array(EMOTION_SAD, EMOTION_ANGRY, EMOTION_SURPRISED, EMOTION_DISGUSTED);
emotion[EMOTION_ANGRY][POSSIBLE_FOLLOW_UP] = new Array(EMOTION_SAD, EMOTION_ANGRY, EMOTION_SURPRISED, EMOTION_NEUTRAL);

emotion[EMOTION_SAD][POSSIBLE_RESPONSE] = new Array(EMOTION_SAD, EMOTION_ANGRY, EMOTION_SURPRISED);
emotion[EMOTION_SAD][POSSIBLE_FOLLOW_UP] = new Array(EMOTION_NEUTRAL, EMOTION_SAD, EMOTION_ANGRY);
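
To see how these tables might actually drive the two faces, here is a minimal driver sketch. The constant values, the simple rank-based weighting and the exchange() helper are my own assumptions for illustration – they are not part of the prototype code above.

// Assumed constant values – the definitions aren't shown in the post.
var EMOTION_NEUTRAL = 0, EMOTION_HAPPY = 1, EMOTION_SURPRISED = 2,
    EMOTION_DISGUSTED = 3, EMOTION_ANGRY = 4, EMOTION_SAD = 5;
var POSSIBLE_RESPONSE = 0, POSSIBLE_FOLLOW_UP = 1;

var emotion = new Array(6);
for (var e = 0; e < 6; e++) emotion[e] = new Array(2);
// ...the rows listed above would be assigned here, e.g.:
emotion[EMOTION_HAPPY][POSSIBLE_RESPONSE] = new Array(EMOTION_HAPPY, EMOTION_SURPRISED);
emotion[EMOTION_HAPPY][POSSIBLE_FOLLOW_UP] = new Array(EMOTION_NEUTRAL, EMOTION_HAPPY, EMOTION_SURPRISED);

// Earlier entries are treated as more probable (main, secondary, tertiary...).
function weightedPick(options) {
    var weights = new Array(), total = 0, i;
    for (i = 0; i < options.length; i++) { weights[i] = 1 / (i + 1); total += weights[i]; }
    var r = Math.random() * total;
    for (i = 0; i < options.length; i++) { r -= weights[i]; if (r <= 0) return options[i]; }
    return options[options.length - 1];
}

// One exchange: face A displays an emotion, face B reacts to it,
// then face A drifts to one of its own follow-up states.
function exchange(currentA) {
    return {
        reactionB: weightedPick(emotion[currentA][POSSIBLE_RESPONSE]),
        nextA: weightedPick(emotion[currentA][POSSIBLE_FOLLOW_UP])
    };
}

Calling exchange(EMOTION_HAPPY) repeatedly, feeding each nextA back in, would produce the kind of drifting dialogue between the two faces that the prototype plays out.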

af09.JPG

bf13.JPG

post technology that mimics eye contact perception

February 13th, 2008

Filed under: CHAMELEON PROJECT,interesting research — Tina @ 3:18 am

The portable device, dubbed the eyebox2, can be attached to public area advertisements and uses a camera that monitors eye movements in real time to automatically detect when people are looking at it from up to 10 meters away and at a horizontal range of 2-3 meters.

Using infrared-emitting diodes and recognising the red-eye effect, the device mimics eye-contact perception in humans, allowing it to accurately pinpoint which television screen, billboard or product shelf people are looking at.

This enables advertisers to track the number of people who make eye contact with their ads, said Professor Roel Vertegaal, the chief developer of the eyebox2 and director of the Human Media Laboratory at Queen’s University.

Look into this further.

post reading emotions

February 13th, 2008

In different emotional states, we direct our gaze to different parts of the face first. Using eye tracking, studies show that for some emotions we look at the eyebrow/brow area first, while for happiness we look to the mouth. Could use this to advantage – could abstract the imagery in real time. When healthy people look at faces, they spend a lot of time looking at the eyes and the mouth, as shown in the figure below. People with damage to the amygdala, with agenesis of the corpus callosum, and with autism all look at faces abnormally.

emotiontracking.jpg

using eye tracking to work out how we view a face to make sense of the expression

studies02.jpg

Reworking Paul Ekman’s FACS database. Bringing particular features of the faces forward – does the emotion still get read? Is it stronger? How much can you abstract the image before the emotional expression gets lost?
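
One quick way to try this could be to knock the whole face back and re-draw a single region at full strength, as in the rough browser-canvas sketch below (the image file, element id, region box and blur amount are placeholders of mine, not values from the project):

// Blur and dim the whole face, then re-draw only the mouth region sharply,
// to test how far the image can be abstracted before the emotion is lost.
var img = new Image();
img.src = 'af09.JPG';                        // one of the expression stills
img.onload = function () {
    var canvas = document.getElementById('output');
    var ctx = canvas.getContext('2d');

    ctx.filter = 'blur(8px) brightness(60%)'; // knock the rest of the face back
    ctx.drawImage(img, 0, 0);

    ctx.filter = 'none';
    var sx = 120, sy = 300, sw = 200, sh = 120; // placeholder mouth region
    ctx.drawImage(img, sx, sy, sw, sh, sx, sy, sw, sh);
};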

post real time emotional contagion capture

February 13th, 2008

Filed under: CHAMELEON PROJECT,working on installation ideas — Tina @ 2:56 am

Met with Evan Raskob over at Borough to work on the live emotional contagion patch – seems to work OK – he has it working with two cameras – live feed in – captures for ten seconds, plays back at 10 per cent for a minute. Looks good. Then worked on the first prototype for CHAMELEON. Using emotional algorithms – need to send them off to Chris Frith and Dylan Evans for some advice about the algorithmic codes.
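
The capture-then-slow-playback idea itself is simple enough to sketch. Below is a rough browser-JavaScript version, not Evan's actual patch; the frame rate, element ids and buffer sizes are placeholders of mine:

// Grab roughly ten seconds of webcam frames, then replay them slowed to 10%.
var CAPTURE_FPS = 10, CAPTURE_SECONDS = 10, PLAYBACK_RATE = 0.1;
var frames = new Array();

var video = document.getElementById('camera');   // <video> element (placeholder id)
var canvas = document.getElementById('output');  // <canvas> element (placeholder id)
var ctx = canvas.getContext('2d');

navigator.mediaDevices.getUserMedia({ video: true }).then(function (stream) {
    video.srcObject = stream;
    video.play();
    // Store one frame every 1/CAPTURE_FPS seconds until the buffer is full.
    var grab = setInterval(function () {
        ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
        frames.push(ctx.getImageData(0, 0, canvas.width, canvas.height));
        if (frames.length >= CAPTURE_FPS * CAPTURE_SECONDS) {
            clearInterval(grab);
            playBack();
        }
    }, 1000 / CAPTURE_FPS);
});

// Step through the buffered frames at a tenth of the capture speed,
// stretching the captured moment out over a much longer playback.
function playBack() {
    var i = 0;
    var show = setInterval(function () {
        ctx.putImageData(frames[i], 0, 0);
        i++;
        if (i >= frames.length) clearInterval(show);
    }, (1000 / CAPTURE_FPS) / PLAYBACK_RATE);
}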

img_0336.jpg

Evan Raskob at work – the set-up.

img_0339.jpg

img_0330.jpg

Off to the Tate

post observer

February 13th, 2008

Filed under: CHAMELEON PROJECT,interesting opportunities — Tina @ 2:37 am

The Observer were in contact today wanting to do something on the scientific side of the project, more about the idea of emotional contagion. Sent through pics, and Hugo's and Chris's contacts. Hugo has just had a baby.

Chris said he would talk to them. Chris has just published a book – Making Up the Mind.

http://www.amazon.com/Making-Mind-Brain-Creates-Mental/dp/1405160225

post Neil’s talk – Emotional Contagion event

February 12th, 2008

Met with Neil Harrison to discuss talk, timings at the Dana Event. Neil will be talking about:

“Human society operates through cohesive social relationships. One characteristic of our social interactions is an ‘intuitive’ ability to understand others’ mental and emotional states. In parallel we have a tendency to mimic the body posture, gesticulations, emotional expressions and even physiological bodily responses of others. I shall discuss the mechanisms underlying this automatic mimicry, its role in empathy and intersubjectivity and how individual differences in mimetic systems may underlie disorders such as Autism and Tourette Syndrome.”

bm19anfl.JPG

am13anfl.JPG

am06anfl.JPG

images from the Karolinska database of emotional expression – these images have formed the visual database of the first stage of CHAMELEON.

post how to assess the smaller micro expressions?

February 12th, 2008

Had a chat with physiologist Harry Witchell about the project. To analyse emotional expression he is using “Image-Pro Plus” image analysis software. The process for Harry is quite analogue – he applies black dots to the face and then analyses. He mentioned an interesting group at Carnegie Mellon looking at facial emotion recognition. Harry believes his knowledge of facial expressions is more implicit – he says he can tell you exactly which muscles to move to achieve certain facial expressions – but he believes the people who have the most understanding of the subtleties of expression are the animators. A lot of the technology, though complex, still cannot make sense of the smaller micro-expressions that make up different emotional reactions. Constantly need to keep in mind we are working with fluid, human faces, not static computer models. The fleeting nature of emotions.

ferment01.jpg

ferment06.jpg

still from FERMENT, 3-minute video, 2006

post Why bother interacting? How to create engagement? How to probe emotional expressions?

February 12th, 2008

Filed under: CHAMELEON PROJECT — Tina @ 8:27 pm

At the Nan Goldin exhibition at Kiasma, there were about 40 people in the room, spending hours with her video installation that talked about her sister’s suicide. Was powerful, raw, emotive. Was traditionally narrative driven.

Today I met again with Prof Nadia Berthouze, who works in emotion computing and human-computer interaction at UCLIC, and her master’s student Matt, to talk about how to evaluate the interactive projects. With the Feel Series works, I am creating prototypes, evaluating them – working out what does and doesn’t work – and then moving on to the next prototype. The same strategy of incremental building blocks will create Chameleon. Matt is interested in creating a strategy for how we go about this and what the most important questions to ask are. This could potentially become his master’s project.

We started with the idea – what makes people engaged? What makes people want to invest time with the work? We need to study engagement and interaction – this could be done via video, physiological feedback and questionnaires.

Secondly, we began discussing the interaction. If we are using real-time human expression to create the interaction, how will people behave when they enter the exhibition space? We need to define the requirements for the complete technology. What is needed to stimulate facial expressions – does the imagery need to be extra expressive? Amplifying emotional expressions in order to provoke a change in the emotional expression of the audience? HCI people researching emotion expression in gaming have pretty much found that most people are still in front of computer games – blank expressions. How can we ignite emotional expressions when people enter the space? Also, what will the technology be analyzing – will it be analyzing both body and face, or another mode such as movement and sound? We need to amplify emotion so it arrives at the face, and understand the dynamics of how emotions emerge into a facial expression.
Also, at what stage does the audience realize that the work is interacting with them?

How do we arrive at an emotional expression? Some tribes in Africa don’t express pain at all. Last week in Estonia it was quite disconcerting, as very few people emitted any emotional expression at all. Different cultures are trained from a very early age not to express certain emotions. How much do we need to look at the cross-cultural differences? How subtle is the technology, and how predictable and robust is it? How powerful do the images need to be to result in an emotional expression?

Also, we are so socialised in how and when to display emotional expressions. We can mask our expression at any time. Reading Dylan Evans’s Emotions – he mentions experiments where lone participants in a room were asked to watch clips from a comedy, a horror story, a love story. Their facial expressions were monitored via a video camera. The facial expressions were quite monotone throughout. They were then asked to watch the films with another person in the room. The emotional expressions displayed were a lot stronger when other people were present. Multi-user interaction could be key in generating emotional expressions.

img_0111.JPG
Pablo on the Spanish trains – nearly four months old and way too young to mask any emotions. Every emotion erupts on his face.
