
post Meeting with Evan Raskob

April 4th, 2008

We were stuck in jetlag land last night. Pablo finally slept at about 2am, and then we were awake till 6am. Woke up for an 11am meeting with Evan Raskob, the programmer working on the Chameleon Project.

We worked through some of the algorithms, but really we need to sit down with Chris Frith, the social neuroscientist working on the project. We looked at the piece that explores the propagation of emotions. We slowed it down, played with scale and paced the propagation so we could see what was going on. It’s a really complex set of probabilities that is hard to understand, especially in the midst of jetlag. It’s looking much better than the first version, but it’s pretty essential to sit down with both Chris and Evan for a few hours so we can all make sure we understand each other.

propagation031.jpg

We discussed what was next. We need to try a version of the propagation piece that only captures the facial expression and not upper body. We need to experiment with how it looks spatially – can it work in a row? A cross? What is the best way for it to work? Can the figures just be facing each other, as if in dialogue?

We need to get the version of the two heads facing each other working in Processing.

kevinandthomas011.jpg

kevinandthomas021.jpg

kevinandthomas031.jpg

We also need to work with the two-faces piece and investigate ways of scouring the web to find appropriate text to transpose onto the work. We will try the API of wefeelfine.org, for example.

Returned XML samples:

The API is free under the Creative Commons Attribution-NonCommercial-Sharealike license ( http://creativecommons.org/licenses/by-nc-sa/2.5/ ).

Sites that use this API must provide attribution by including the following HTML on their site:

Powered by: We Feel Fine.
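As a working note, pulling usable sentences out of the returned XML is straightforward. Here is a minimal sketch in Python (the actual piece would do this from Processing; the sample XML and its `feeling`/`sentence` attributes mirror the We Feel Fine API format as I recall it, so treat the exact names as assumptions):

```python
# Sketch: extracting (feeling, sentence) pairs from a We Feel Fine-style
# XML response. Attribute names follow the API docs as I remember them --
# assumptions, not gospel.
import xml.etree.ElementTree as ET

SAMPLE = """<feelings>
  <feeling feeling="happy" sentence="i feel happy when the sun is out"/>
  <feeling feeling="lost" sentence="i feel lost in this new city"/>
</feelings>"""

def extract_sentences(xml_text, wanted=None):
    """Return (feeling, sentence) pairs, optionally filtered by feeling."""
    root = ET.fromstring(xml_text)
    pairs = []
    for node in root.iter("feeling"):
        f = node.get("feeling")
        s = node.get("sentence")
        if s and (wanted is None or f == wanted):
            pairs.append((f, s))
    return pairs

print(extract_sentences(SAMPLE, wanted="happy"))
```

Filtering by a specific feeling would let us match scraped text to whatever emotional state a portrait is currently in.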

Finally, we discussed the integration of touch. Evan mentioned someone who is pretty switched on with haptics. It would be great to meet him next week.

post The propagation of emotions – first prototype.

April 2nd, 2008

Last night Evan Raskob, the wonderful programmer/artist I am working with on the Chameleon Project, sent me the first stage of the emotional contagion propagation piece.

stage041.jpg

mock-up of how the emotional contagion piece could work in the exhibition space.

propagation.tiff

screen capture of how the piece is working now – all the screens are shown on one display so we can work through how the algorithms are actually working, or not working.

Viewed the first rough last night. He’s written it using Processing. Fantastic, as it allows me to view it over the web – and when we get it fine-tuned a bit, it will allow me to show the project to the other collaborators over the web so we can all get a sense of what is going on.

Aesthetically and algorithmically it’s a first stage – sort of interesting. A beginning – the first thing that comes to me is that the movement of the faces makes me focus more on the changing shape of the background than on the facial expressions. Secondly, I find myself reading the personalities of the people via the clothes they are wearing. Thirdly, I am not getting any sense of emotional contagion – I am just seeing the faces emote in a way that I can’t understand. Maybe it’s too fast… There doesn’t seem to be a rhythm. How do we make it more explicit?

I think the way we have to work with it is to look at the pacing and weighting of certain personalities – some overdrive others – and also at the weighting and pace of each emotional response. We are going to meet Thursday at 11am in London to discuss the next stage of the project.

Evan says that closer analysis of how the people affect each other is necessary, because it’s so complicated. He knows they are spreading emotions to one another, but we’d have to play with this a bit to get some optimal values. Because there are so many variables involved, we’d have to look at how to add some controls for changing the percentages (each emotional state has a list of possible emotions it can go to, with percentages… there would be about 50 sliders in all if we simply added controls for all of them).
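To pin down what those percentages mean: each state is essentially a weighted transition table, and each weight is one of the ~50 sliders Evan mentions. A minimal sketch in Python (the real piece is in Processing; the state names and numbers here are invented for illustration):

```python
# Sketch of the weighted-transition idea: each emotional state lists the
# states it can move to, with percentages. The states and weights below
# are made up -- in the piece itself each weight would be a slider.
import random

TRANSITIONS = {
    "neutral": [("happy", 0.5), ("sad", 0.3), ("neutral", 0.2)],
    "happy":   [("happy", 0.6), ("neutral", 0.4)],
    "sad":     [("sad", 0.5), ("neutral", 0.5)],
}

def next_state(current, rng=random):
    """Pick the next emotional state according to its transition weights."""
    roll = rng.random()
    cumulative = 0.0
    for state, weight in TRANSITIONS[current]:
        cumulative += weight
        if roll < cumulative:
            return state
    return current  # numerical safety net

# Each row of the table should sum to 1.0 (i.e. 100%).
for state, options in TRANSITIONS.items():
    assert abs(sum(w for _, w in options) - 1.0) < 1e-9
```

Slowing the piece down, as we did in the meeting, is effectively just sampling `next_state` less often – which is why pacing and the weights themselves are the two obvious knobs to play with.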

In Banff, at the Liminal Screen residency at the Banff New Media Institute, I was asked a question: what if you can’t map emotional contagion? Good question – what if we can’t? What if we are making it all up – maybe it is too complex. Maybe we have to re-look at how I am thinking about these emotional algorithms. Anyway, all will be clear on Thursday when we meet.

post organising stages.

March 1st, 2008

Had another meeting with Rana. Discussed stages, which parts of the project she should supervise, and what we should achieve before I come back to the Media Lab on the 29th March.

stage one: Emotional Algorithms

1. working out emotional algorithms with Chris
2. reshoot database in Banff
3. contact Evan to see if the project can work online so all collaborators can view it

stage two: Look at the propagation of emotions

1. Look at more complex algorithms and how emotions spread between people.
2. reshoot this still database in Banff (using still photography) (need camera?)

stage three: Multi-modal interaction?
1. test touch as a further mode of interaction. (Touch will bring the audience closer to the portraits, making it easier for the cameras to read them. There could be major issues with the cameras reading the audience spontaneously, and touch could potentially resolve this problem.)
2. shoot a database of Pablo. As the audience touches the portrait of the baby, it becomes like a live Tamagotchi?

stage four: integration
1. Integration of Rana’s technology into the project. We will do this using Max – her technology will spit out info every few frames.
2. Look at one camera – running on laptop – interacting with digital portraits.
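On the Max integration: one common route for getting per-frame info out of Max is OSC packets over UDP (via [udpsend]). Below is a sketch of decoding one simple OSC message with only the standard library – the address "/emotion" and the idea that the classifier emits one float per emotion are my assumptions, not anything we have agreed:

```python
# Sketch: decoding a minimal OSC message (address + float32 arguments),
# as Max's [udpsend] might emit every few frames. The "/emotion" address
# and one-float-per-emotion payload are illustrative assumptions.
import struct

def _read_padded_string(data, offset):
    """Read a NUL-terminated string padded to a multiple of 4 bytes."""
    end = data.index(b"\x00", offset)
    text = data[offset:end].decode("ascii")
    offset = end + 1
    offset += (-offset) % 4  # skip OSC padding
    return text, offset

def parse_osc(packet):
    """Decode an OSC message whose arguments are all float32 ('f')."""
    address, offset = _read_padded_string(packet, 0)
    typetags, offset = _read_padded_string(packet, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag != "f":
            raise ValueError("only float args handled in this sketch")
        (value,) = struct.unpack(">f", packet[offset:offset + 4])
        args.append(value)
        offset += 4
    return address, args
```

In practice the Processing side would sit on a UDP socket and feed each decoded packet into the portrait's emotional state, but the wire format is the part worth nailing down with Rana first.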

thoughts:
how does sound work with the project? Talking with Banff about what and when to record there.

Should we continue to pursue a live emotional contagion tool to really understand how social groups begin to build trust – also what are the most emotionally contagious actions?

talking a lot about engagement – scale, pace, dynamic. What is the immediate response? How do you get someone to want to engage with the work?

how is the imagery treated? Does how we look at the face and read emotions change in different emotional states?

How much can you abstract the image to amplify the emotion?

An initial video database and still database will be shot in Banff.

img_0628.jpg

img_0626.jpg

img_0627.jpg

post Evan’s photos of the Emotional Contagion Event at the Dana Center

February 15th, 2008

img_1710.jpg
Harry, Neil and myself, getting ready to talk, Dana Event

img_1702.jpg

Pablo taking part in the live event

img_1697.jpg

Evan setting up upstairs.

img_1682.jpg

The first stage of Chameleon

ANAT | Arts SA | Australian Government – The Visual Arts Strategy