Where Do We Go From Here?
Client : Hull 2017
Audio : Joseph Reeves
Footage & photography : James Medcraft
Where Do We Go From Here? uses a specially choreographed interplay of light, shadow and sound to guide people through Hull's Old Town. It encourages people to explore the city's night-time streets as dormant robots awaken, responding to the city's architecture, interacting with one another and with Hull's residents and visitors.
The site-specific installations focus on three areas around Hull's Old Town, each featuring a different configuration of repurposed industrial robots of varying sizes, placed from ground level to rooftop. The robots communicate through woven networks and act as light guides, creating kinetic animations that offer an inquisitive acquaintance with the city. With a wide range of light effects, from beams to constellations, shadows and reflections, the robots animate and highlight unseen places and encourage people to see Hull in a new light. Specially sourced and curated soundscapes add to the experience.
The commission's exhilarating mix of art and technology embodies key themes for Hull as it reflects on a successful year as UK City of Culture and looks towards the future.
My involvement spanned the entire project, from concept design through to delivery, including the development of some unique Cinema 4D plugins designed for controlling the robots, which can be seen in the video below.
Some of the features that were developed specifically for the project:
- Integration into the Cinema 4D Mograph system for full flexibility when designing movements for multiple robots.
- Custom inverse kinematics for predicting exactly how the robot would behave in reality and showing that in the C4D viewport
- Calculating linear and rotational velocities, axis speed limits and boundaries of the robot model being used
- For each robot in a group, a fully functional robot program is exported that can be uploaded directly to the real-world robot controller
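The axis speed-limit checking described above can be sketched in plain Python. This is not the studio's plugin code; the limit values and function names are illustrative, assuming a six-axis arm whose real limits would come from the robot model's datasheet.

```python
# Hypothetical per-axis speed limits (degrees per second) for a six-axis arm.
AXIS_SPEED_LIMITS = [120.0, 120.0, 180.0, 300.0, 300.0, 420.0]

def axis_velocities(pose_a, pose_b, dt):
    """Angular velocity (deg/s) of each axis moving from pose_a to pose_b in dt seconds."""
    return [(b - a) / dt for a, b in zip(pose_a, pose_b)]

def violating_axes(pose_a, pose_b, dt, limits=AXIS_SPEED_LIMITS):
    """Return the indices of any axes that would exceed their speed limit."""
    return [i for i, v in enumerate(axis_velocities(pose_a, pose_b, dt))
            if abs(v) > limits[i]]

# Example: axis 0 sweeps 90 degrees in 0.5 s (180 deg/s), exceeding its 120 deg/s limit.
bad = violating_axes([0, 0, 0, 0, 0, 0], [90, 10, 10, 10, 10, 10], 0.5)
```

A check like this, run per animation frame, is what lets the viewport flag motion the physical robot could never execute before a program is ever exported.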
Light group ray tracing:
- A Mograph based system for calculating the orientation of surfaces to determine where to bounce light around a space with mirrors
- Simulates light from a source with beam angle divergence & decay for each bounce on a surface
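The bounce calculation at the heart of such a system reduces to the standard mirror-reflection formula plus a per-bounce decay. A minimal sketch (illustrative names, not the production tool), assuming unit surface normals:

```python
def reflect(direction, normal):
    """Reflect a direction vector off a surface with unit normal: r = d - 2(d.n)n."""
    d = sum(di * ni for di, ni in zip(direction, normal))
    return tuple(di - 2 * d * ni for di, ni in zip(direction, normal))

def bounce_intensity(initial, decay_per_bounce, bounces):
    """Beam intensity after a number of mirror bounces, with multiplicative decay."""
    return initial * (decay_per_bounce ** bounces)

# A beam travelling down-and-right hits a floor mirror (normal pointing up)
# and reflects up-and-right.
r = reflect((1.0, -1.0, 0.0), (0.0, 1.0, 0.0))   # -> (1.0, 1.0, 0.0)
```

Beam-angle divergence can then be layered on by growing the beam radius with the distance travelled along each reflected segment.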
In addition to these, a range of Python effectors was developed to create custom animation tools, making complex movements much simpler to author and reducing the amount of keyframing.
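The idea behind replacing keyframes with procedural motion can be shown in a few lines of plain Python. This is a standalone sketch of the general technique, not the Cinema 4D effector code itself; the oscillation parameters are illustrative.

```python
import math

def sweep_angle(t, period=8.0, amplitude=45.0, phase=0.0):
    """A smooth oscillating sweep, evaluated from time rather than keyframes."""
    return amplitude * math.sin(2 * math.pi * (t / period) + phase)

def group_poses(t, count, phase_step=0.5):
    """Angles for a group of robots, phase-offset so motion ripples through the group."""
    return [sweep_angle(t, phase=i * phase_step) for i in range(count)]
```

Because each robot's pose is a function of time and its index in the group, one expression drives the whole ensemble where hand-keyframing would need a curve per robot.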