
James Rennie Residency


Commissioned by Creative Partnerships and the Ashton Group, Pete Hamilton and I spent two weeks researching how interactive, sensory new technologies can be applied to create immediate audio-visual expression for use within the disabled community.

The project is based at James Rennie, a special school in Carlisle, working with children with a range of mental and physical impairments from Key Stage 1 through to post-16.

We used existing hardware in the school in conjunction with new software to create a base of technological possibilities that enables teachers to create new projects specific to students and spaces, allowing the students to express themselves in a custom-created and ever-changing environment.

Over the two-week period we connected SoundBeam (an invisible sensor beam which, when broken, sends a MIDI signal) to VJamm (VJ software). This process allowed students at the school to trigger visuals through movement.
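
As a rough illustration of the kind of mapping involved (not the exact routing inside VJamm, which handled the MIDI itself), the Python sketch below listens for the note SoundBeam sends when the beam is broken and maps it to a clip. The port name, note numbers and the trigger_clip stub are all assumptions.

# Minimal sketch: listen for SoundBeam "beam broken" MIDI notes and map
# them to video clips. Device name, note numbers and trigger_clip() are
# illustrative assumptions, not the setup used at the school.
import mido  # pip install mido python-rtmidi

PORT_NAME = "SoundBeam"      # assumed MIDI input port name
NOTE_TO_CLIP = {             # assumed note-to-clip mapping
    60: "toilet.avi",
    62: "treasure.avi",
    64: "chocolate_land.avi",
}

def trigger_clip(clip_name: str) -> None:
    # Stand-in for whatever the VJ software does when a clip is triggered.
    print(f"Triggering clip: {clip_name}")

def main() -> None:
    with mido.open_input(PORT_NAME) as port:
        for msg in port:
            # SoundBeam sends a note-on when the beam is broken.
            if msg.type == "note_on" and msg.velocity > 0:
                clip = NOTE_TO_CLIP.get(msg.note)
                if clip:
                    trigger_clip(clip)

if __name__ == "__main__":
    main()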

There were two main applications of this process.

Collective storytelling – The students of Key Stage 2 each created and painted a character or object, and these were then used to build a story with the help of a teacher. The paintings were scanned, cut out in Photoshop and placed into VJamm over painted backgrounds. As the story was read aloud, the students broke the beam and their images were projected in front of them. By the end of this process the students had created a short film to fit their story.

Real-time reflexive learning – This process used blue-screen techniques to create an environment in which a student could place themselves in a visual situation (e.g. jumping down the toilet), perform the action (the jumping), and then immediately review it (watching themselves jump down the toilet). Pete and I wrote a paper about this process, available to read below:


Play Ground Tactics – real-time reflexive learning

The following is a collaborative paper with Peter David Hamilton

Abstract

This paper explores the application of augmented reality as a tool for reflexive learning in playful environments. Augmenting Expectation in Playful Arena Performances with Ubiquitous Intimate Technologies by Bayliss, Lock and Sheridan proposes the ‘Performance Triad’ (PT) model as a method for the analysis, deconstruction and understanding of performance in playful arenas. In the cited example, the PT model (See Figure 1.) operates in a specific cultural and social context, but when we apply this model to an education environment, particularly with regard to special needs, then it can be used as a tool for reflexive learning.

Figure 1. Performance Triad Model (Bayliss, Lock and Sheridan, 2003)

Background

Dave Lynch and Pete Hamilton were commissioned by The Ashton Group Theatre (1), on behalf of Creative Partnerships Cumbria (2), to train staff at James Rennie School (3), a school in Carlisle for children with Profound Motor Learning Disability, in animation software and video production.

The idea was that as much of the school as possible should take part, right across the age groups, and that the project should be inclusive of:

• The Inclusion Programme; a partnership with local mainstream schools.
• The Comenius Project; an exchange project involving James Rennie School with similar schools across the EU.

It should also address the deep learning agenda, in particular three areas:
• Learning to Learn
• Assessment for Learning
• Student Voice.

…[We] felt that given the essentially reflective nature of the Deep Learning Agenda (DLA), film and new media skills would be really useful tools for the school. Film has the capacity to be re-wound, re-watched and analysed. (Ashton, 2007)

The project ran for two weeks in July 2007, culminating in a presentation of the work of staff and students to the school. A preliminary visit revealed that the school was already using assistive technology in the classroom; IntelliKeys (4) boards and trigger buttons were used as durable alternatives to standard computer input devices like the mouse or keyboard. There was also SoundBeam (5), hardware which uses MIDI signals and movement sensors to detect proximity, which we combined with live performance video-mixing (VJ) (6) software called VJamm (7) to create an interactive environment.

As part of the training remit, we experimented with a video feed to create a live composition using a ‘blue screen’, a process very similar to that used in television weather reports to superimpose foreground elements onto a dynamic background. Weather reports require meteorologists to choreograph their movements in direct response to the changing weather patterns behind them. In our experiment, students performed scenes from stories they had written in the classroom. A camera was positioned in front of the screen on a tripod and was connected (8) to a computer running VJamm. With the facilities at hand, we were able to superimpose the performer into a scene from the story.
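
For readers curious about the keying step itself, the following Python/OpenCV sketch shows the basic idea of blue-screen compositing; it stands in for what VJamm did in real time, and the colour thresholds, file name and camera index are illustrative assumptions only.

# Blue-screen (chroma key) compositing sketch: the performer is keyed out of
# a blue backdrop and superimposed on a still from the story. All thresholds
# and file names are assumptions.
import cv2
import numpy as np

background = cv2.imread("story_scene.png")   # assumed background image
cap = cv2.VideoCapture(0)                    # camera on the tripod

while True:
    ok, frame = cap.read()
    if not ok:
        break
    bg = cv2.resize(background, (frame.shape[1], frame.shape[0]))

    # Anything inside the assumed blue HSV range is treated as backdrop
    # and replaced by the background image.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    blue = cv2.inRange(hsv, np.array([100, 80, 80]), np.array([130, 255, 255]))
    performer = cv2.bitwise_and(frame, frame, mask=cv2.bitwise_not(blue))
    scene = cv2.bitwise_and(bg, bg, mask=blue)

    cv2.imshow("composite", cv2.add(performer, scene))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()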

Applying the PT model

We have used the PT model as a basis for investigation. While installing our ‘bio-feedback mechanism’, we addressed several technical issues relating to the performance of the equipment. The hardware limitations caused a visual hysteresis (9) which allowed one person, the participant, to become an instant observer of their own virtual performance. In this instance a black curtain was erected in the school hall behind a table, with a crash mat placed on the floor in front of it. We were then able to key (10) out the background.

In the virtual space, the story required each of the children to jump down an animated toilet, to travel to Chocolate Land in search of buried treasure. This was a real-time performance environment where the child was engaged as performer. In the physical space they played the role of participant and were directed to jump down on to the mat. Once they had landed, they were able to look up and observe themselves seconds earlier making the jump within the virtual environment. The rest of the children in the hall completed the triad as observers who would eventually become the participants and virtual performers.
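
To make the ‘out of real-time’ idea concrete, the sketch below buffers the camera feed so the projection always runs a few seconds behind the performer. In our setup the lag was a side-effect of the hardware rather than a programmed buffer, and the delay length and camera index here are assumptions.

# Deliberate playback delay ("visual hysteresis"): frames are held in a
# fixed-length buffer so the projected image shows the performer a few
# seconds in the past.
from collections import deque
import cv2

DELAY_SECONDS = 3    # assumed lag between action and projection
FPS = 25             # assumed capture rate

cap = cv2.VideoCapture(0)
buffer = deque(maxlen=DELAY_SECONDS * FPS)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    buffer.append(frame)

    # Project the oldest buffered frame once a full delay's worth of frames
    # has accumulated; until then, show the live frame.
    delayed = buffer[0] if len(buffer) == buffer.maxlen else frame
    cv2.imshow("projection", delayed)

    if cv2.waitKey(int(1000 / FPS)) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()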

Using the PT model as a basis for research allowed us to dissect the performance into sections and investigate the benefits of using this process in a school environment. Operating simultaneously within a virtual and a real space, with a delay in feedback, creates an environment where the students can explore and react to themselves. We suggest that further research into ‘out of real-time’ immersive learning environments could create a tool for reflexive learning at all levels of education.

Terms

(1) http://www.ashtongroup.co.uk

(2) http://www.creative-partnerships.com/Cumbria

(3) http://cms.jamesrennie.cumbria.sch.uk

(4) http://store.cambiumlearning.com/ProgramPage.aspx?parentId=074003405&functionID=009000008&site=itc

(5) http://www.soundbeam.co.uk

(6) http://en.wikipedia.org/wiki/VJ_%28video_performance_artist%29

(7) http://www.vjamm.com

(9) The phenomenon in which the value of a physical property lags behind changes in the effect causing it.

(10) http://en.wikipedia.org/wiki/Keying_%28graphics%29

References

ASHTON, R. 2007, Here, Now and Then: A media, story-making and performance research project for James Rennie School. Unpublished.

BAYLISS, A., LOCK, S. and SHERIDAN, J. G. 2003, Augmenting Expectation in Playful Arena Performances with Ubiquitous Intimate Technologies, in Proceedings of Pixel Raiders, Sheffield.

FITZPATRICK, G., HUSBANDS, P., JUNGMANN, M., LUTZ, R. and VILLAR, N., Exploring the Boundaries between Perception and Action, Lancaster University, Infolab21 and University of Sussex, Department of Informatics and Artificial Intelligence.

LESH, N., RICH, C. and SIDNER, C. L. 2000, Collagen: Applying Collaborative Discourse Theory, Mitsubishi Electric Research Laboratories.

PECS – Picture Exchange Communication System [online], CALL Centre, University of Edinburgh. Available from: http://callcentre.education.ed.ac.uk/SCN/Level_A_SCA/Using_Symbols_SCB/Hot_PECS_News_HTA/hot_pecs_news_hta.html [Accessed: 01.07.07].


Dave

Drystone waller turned film maker/animator/VJ/educator, now an artist/entrepreneur/projection bomber interested in using technology and intervention as a medium for social change. Currently researching experimental projection methods from moving vehicles, interactive programming and blending tea.
