Critique 4 – Final

2D Representation of Multi-screen Lapse, “Sounds of Light”

My final presentation included various works that represent the broad scope of my practical research. Aside from the Google hyperlapse, none of the works can be properly experienced in an online format. The works were created to occupy a space, provide interactivity, be tactile, or be actively consumed. They are referred to here only for documentary purposes.

[Video stills: Waikato Lapse, Te Toto Gorge, Carpark Lapse]


A series of experimental moving image digital works was created to exhibit the diverse scope of the project and its realms of inquiry. The outputs demonstrate the variety of methods and technologies used to create the imagery, and several proof-of-concept works show potential means of interacting with or viewing the pieces. All works were motivated by the desire to break away from traditional means of moving image media creation and consumption.

One of my biggest challenges in creating these pieces was letting go of control over how the user experiences the work. After months of editing many of the components, it is nerve-racking to accept that a viewer may spend little or no time investigating what the work contains or how it works. This is the opposite of traditional video delivery, where the creator dictates exactly what the viewer will see and has more control over interpretation.

Final works presented include:

  • Multi-screen Time Lapse, Sounds of Light: With shooting spanning more than a year, this was one of the most time-consuming efforts in the project. The work uses multiple means of time lapse and hyperlapse creation, including DSLR cameras, motion-controlled time lapse, hacked point-and-shoot digital cameras and the Ladybug3 360 camera. The imagery is then broken out across multiple screens and synchronized via a wireless network and master/client software. Displayed together with custom-scored music, the work demonstrates both the seemingly infinite possibilities of image manipulation and consumption, and how the final work can only be realized as a summation of its many parts. Many of the time lapses study how light interacts with various spaces and environments and how it changes over time and season. ‘Beauty’ is sought in the man-made environment (the city) and juxtaposed with beauty in ‘natural’ environments; time lapse is used to reveal this beauty in the otherwise seemingly mundane. I was also interested in the idea of simultaneous perspectives of the same moment in time, and occasionally dissected the 360 footage to reveal alternate views of the same shot. The shots are combined using various montage techniques and accompanied by music that varies in tempo and complexity to support the edit. With no prior script or shot list, the work relies upon the edit, music and visual aesthetic in an attempt to elicit a response from the viewer.
  • 360 Interactive Documentary, Reflections: This interactive documentary uses a fusion of traditional film techniques and 360-degree video to tell a selection of the family stories of Val Sanford in Hamilton, New Zealand. Inspiration for the piece was derived from my interest in the objects we surround ourselves with in our homes and the personal meanings and memories they trigger. The documentary allows the user to navigate through Val’s historical homestead and click on items in the rooms to watch a video story drawn from a traditionally filmed interview. The work demonstrates how a 360-degree interactive environment can be used as a non-linear means of delivering a story, and the method could easily be adapted to deliver historical, cultural and other stories interactively. The online demo is a lower-quality version intended to reduce download times; the work was originally designed for an offline viewing experience with higher resolutions and higher-quality media files.
  • iPad Night Lapse: This interactive hyperlapse video can be viewed on a mobile device (iPad, iPhone, Android) and allows the viewer to alter the viewport of the video by touch and gyroscopic means. Software for this type of interaction is still in its infancy and options are limited. I have investigated several current options while closely observing what is happening in emerging commercial sectors. I had originally chosen the Kolor Eyes iOS app, but newer versions of the app do not seem to perform as well on older devices, so I am using an app from a company called FINWE for this demonstration.
  • 360 Kinect Lapse: A series of 360 time lapse videos supported by a soundtrack of environmental sounds and projected on a large screen. This work again uses locations both in nature and in the city, this time from high vantage points, in an attempt to fully utilize the 360-degree spherical view. A Microsoft Kinect sensor is used with some rudimentary software as a proof of concept for an interactive large-scale projected video installation: a viewer is able to enter the space and control the time lapses. Usability is limited by the use of non-custom software, but it is acceptable as a proof of concept. A demo of the interactivity can be viewed here.
  • Leap Motion 3D Composite, Float: This work combines emerging technology, the Leap Motion as a control device, with the technique of creating a 360 video from stitched 2D video sources in compositing software, i.e. After Effects with a plugin designed for authoring planetarium exhibits (a spherical environment mesh distorter). Here I was interested in the idea of ‘floating’ in space, using repetition/patterns (loops), reflections and color manipulation for contrast and to isolate body movements. Supported by a somewhat ethereal soundtrack, I aim to allow the user to assist in creating the experience. Using the bundled software of the Leap Motion, the user is given a primitive touchless means of manipulating the 3D environment. The sensitivity and possibilities of the Leap Motion are promising, and the user experience would be greatly enhanced by custom software. An online demo of Float can be viewed here.
  • Google Hyperlapse New Zealand: This work was inspired by Teehan + Lax Labs and their JavaScript software that manipulates Google Street View data. After watching their video, Google Street View Hyperlapse, I was interested in creating my own video using their software, and in using the Ladybug3 to generate original content for hyperlapse shots (time lapse with a moving camera over greater distances than motion control would allow) for the multi-screen piece. This work is, in part, a reflection of spending my first year in New Zealand on the road – fleeting memories of glances out the window as we traveled around both islands, living out of a car.
  • This supporting website, which serves as documentation and a repository of practical research.
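The master/client synchronization used for the multi-screen piece can be sketched in outline. This is a minimal illustration only, not the actual software used in Sounds of Light: the port number, drift tolerance and JSON packet format here are assumptions for the example. The idea is that the master periodically broadcasts its playback position over the wireless LAN, and each client measures how far it has drifted and jumps frames to realign.

```python
# Hedged sketch of master/client playback sync over a wireless LAN.
# Port, tolerance and packet format are illustrative assumptions.
import json
import socket

SYNC_PORT = 9999          # hypothetical UDP port for sync packets
DRIFT_TOLERANCE = 0.05    # seconds of drift a client tolerates before correcting

def make_sync_packet(frame: int, fps: float) -> bytes:
    """Master side: encode the current playback position as JSON."""
    return json.dumps({"frame": frame, "fps": fps}).encode()

def correction(packet: bytes, local_frame: int) -> int:
    """Client side: frames to jump (negative = fall back) to realign
    with the master; 0 when drift is within tolerance."""
    state = json.loads(packet.decode())
    drift_seconds = (local_frame - state["frame"]) / state["fps"]
    if abs(drift_seconds) <= DRIFT_TOLERANCE:
        return 0
    return state["frame"] - local_frame

def broadcast_position(sock: socket.socket, frame: int, fps: float = 25.0) -> None:
    """Master loop body: broadcast the playback position once per tick."""
    sock.sendto(make_sync_packet(frame, fps), ("<broadcast>", SYNC_PORT))
```

In practice each client would run `correction()` against every packet it receives and seek its video player by the returned number of frames; the tolerance keeps clients from constantly jittering over a lossy wireless link.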


The research of this project centered on montage techniques and uses of time-based digital media. Its aim was to experimentally utilize traditional and emerging technologies in order to create, manipulate and deliver a series of original works. It expanded to include non-traditional means of documentary, means of interaction/display, and the re-purposing of data (imagery) to create new works. Although the works vary in their methods and content, there are some recurring elements that have driven my inquiry. I became especially interested in how light behaves and interacts with various spaces, structures and environments. I was also attracted to multiple types of reflections: not only physical manifestations of light, mirror imagery and manipulation, but also mental reflections, memories and investigations of the past.

The project refers to both historical and contemporary methods and techniques of image capture, montage and relationships with time. Importance was placed upon the technical aesthetic and upon new technologies that may be used to create or display the pieces. Fundamental inspiration was originally derived from director/cinematographer Ron Fricke, who pioneered motion-controlled time lapse camera equipment in his film Baraka. Fricke used camera movement through time lapse sequences, together with the juxtaposition of imagery and subject matter, to create meaning. His use of new technologies in time lapse cinematography transformed the process of time lapse moving image capture.

Montage in film/video often relies upon close relationships between audio and visual stimuli. Throughout the project, various topics were studied, including the relationships between an artist’s montage techniques and how the user/viewer interprets or perceives meaning in the work; temporal and spatial perception through the moving image; and the convergence of technologies in relation to the creation and consumption of the resulting new media. The literature studied spanned several disciplines, from Russian film theorists such as Eisenstein to new media writers Manovich and Jenkins, and included investigation of the role of the moving image beyond the single screen.

My practice was challenged and refined by utilizing and synthesizing skills acquired from my past work as a still photographer, cinematographer, editor, IT professional, artist and computer geek. The resulting body of work has pushed the boundaries of my technical and creative abilities, and its diversity is evidence of the scope of the research undertaken.

The presentation delivered in my final critique can be viewed below. It requires Flash or an iOS device.