DESCRIPTION
The Wiggle Room is an immersive, interactive installation that transforms children’s movements into digital magic. Developed in Unity as part of Belfast’s 2024 cultural celebrations, the project bridges the physical and virtual worlds through advanced camera tracking technology.

Children were invited to step into a space where their real‑world movements directly controlled virtual avatars, creating intuitive and responsive interactions. As they moved through the environment, they could engage with digital objects and playful systems that encouraged creativity and exploration.

The installation ran from September 19th through to early 2025 at the MAC (Metropolitan Arts Centre), delighting young visitors and showcasing the role of technology in cultural storytelling.

ROLE
As the primary Unity Developer, I was responsible for translating design proofs into a fully realised interactive experience. My work combined software programming with hardware integration, ensuring that the physical tracking systems connected seamlessly with the digital environment.

Key contributions included:
•     Developing a user tracking system that automatically detected when participants entered or exited the camera’s range, enabling smooth transitions between physical presence and digital expression (a sketch of this pattern follows the list).
•     Implementing an intelligent object management system that orchestrated environmental interactions, dynamically responding to user presence and activity.
•     Bridging design concepts with technical implementation, ensuring that the creative vision translated into an intuitive, engaging experience for children.
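
To illustrate the enter/exit detection described above, here is a minimal Unity C# sketch of one way such presence tracking can be structured: the set of body IDs reported this frame is compared against the previous frame's set, and events fire when a participant appears or disappears. The class, method, and event names are hypothetical and assume the tracking provider supplies a list of currently tracked body IDs; this is not the production code.

using System;
using System.Collections.Generic;
using UnityEngine;

// Hypothetical presence monitor: compares the set of tracked body IDs
// reported each frame against the previous frame's set, and raises
// events when a participant enters or leaves the camera's range.
public class ParticipantPresenceMonitor : MonoBehaviour
{
    public event Action<int> ParticipantEntered;
    public event Action<int> ParticipantExited;

    private readonly HashSet<int> _previousIds = new HashSet<int>();

    // Called once per frame with whatever body IDs the tracking
    // provider currently reports (the source of these IDs is assumed).
    public void UpdatePresence(IEnumerable<int> trackedIds)
    {
        var currentIds = new HashSet<int>(trackedIds);

        foreach (var id in currentIds)
            if (!_previousIds.Contains(id))
                ParticipantEntered?.Invoke(id);   // new participant in range

        foreach (var id in _previousIds)
            if (!currentIds.Contains(id))
                ParticipantExited?.Invoke(id);    // participant left range

        _previousIds.Clear();
        _previousIds.UnionWith(currentIds);
    }
}

Other systems (avatar spawning, the object manager) can subscribe to the two events rather than polling the camera directly, which keeps the interaction logic independent of any particular tracking hardware.
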
CHALLENGES
The biggest challenge came mid‑development, when the project shifted from Orbbec cameras to ZED 2i cameras for body tracking. This required rapid adaptation and reconfiguration of the tracking systems, testing both technical flexibility and project management skills.
Rather than derailing progress, the change became an opportunity to demonstrate adaptability and problem‑solving under pressure. By approaching the transition systematically, we maintained the original timeline and delivered the installation on schedule.
This experience reinforced the importance of building flexible, modular systems that can adapt to evolving technical requirements — a principle that continues to guide my development approach.
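
As an illustration of that principle, the sketch below shows the kind of abstraction it points to: interaction code depends on a small tracking interface, so swapping camera vendors means writing a new provider rather than rewriting game logic. The interface and type names here are illustrative assumptions, not the project's actual code.

using System.Collections.Generic;
using UnityEngine;

// Illustrative abstraction: interaction code depends only on this
// interface, so changing camera vendor (e.g. Orbbec -> ZED 2i)
// means adding a new provider, not touching the interaction systems.
public interface IBodyTrackingProvider
{
    // World-space joint positions for every body currently tracked,
    // keyed by a provider-assigned body ID.
    IReadOnlyDictionary<int, Vector3[]> GetTrackedBodies();
}

// A ZED-2i-backed provider would translate the vendor SDK's skeleton
// data into the shared format; only a stub is shown here.
public class Zed2iTrackingProvider : IBodyTrackingProvider
{
    private readonly Dictionary<int, Vector3[]> _bodies = new Dictionary<int, Vector3[]>();

    public IReadOnlyDictionary<int, Vector3[]> GetTrackedBodies() => _bodies;

    // In a real system this would be fed by the vendor SDK's callbacks;
    // here it simply stores whatever skeleton data is pushed in.
    public void Ingest(int bodyId, Vector3[] joints) => _bodies[bodyId] = joints;
}
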
TECHNICAL
•     Engine & Programming: Unity3D with C# for all core interaction and tracking systems.
•     Networking: Configured a multi‑machine network topology with static IPs across several Linux machines for body tracking computation, with Unity as the central interaction node (a receiver sketch follows this list).
•     Time Synchronisation: Configured the Windows host machine as an NTP (Network Time Protocol) server to ensure synchronous timing across all systems.
•     Hardware Integration: Deployed and calibrated ZED 2i camera systems for precise, real‑time body tracking in a public exhibition environment.
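
To make the networking bullet concrete, here is a minimal sketch of how the Unity node might receive tracking data sent by the Linux machines over UDP. The port number, text-based packet format, and class names are assumptions for illustration only; the real wire format and transport details are project-specific.

using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;
using UnityEngine;

// Illustrative receiver: listens on a UDP port for tracking packets
// from the Linux body-tracking machines and hands the latest payload
// to the Unity main thread for parsing.
public class TrackingDataReceiver : MonoBehaviour
{
    [SerializeField] private int listenPort = 9000; // assumed port

    private UdpClient _client;
    private Thread _receiveThread;
    private volatile string _latestPacket;

    private void Start()
    {
        _client = new UdpClient(listenPort);
        _receiveThread = new Thread(ReceiveLoop) { IsBackground = true };
        _receiveThread.Start();
    }

    private void ReceiveLoop()
    {
        var remote = new IPEndPoint(IPAddress.Any, 0);
        try
        {
            while (true)
            {
                byte[] data = _client.Receive(ref remote);       // blocks until a packet arrives
                _latestPacket = Encoding.UTF8.GetString(data);   // assumes a text-based payload
            }
        }
        catch (SocketException) { }          // socket closed during shutdown
        catch (System.ObjectDisposedException) { }
    }

    private void Update()
    {
        // Parse and apply the most recent packet on the main thread;
        // the actual packet structure is project-specific.
        if (_latestPacket != null)
        {
            Debug.Log($"Received tracking data: {_latestPacket}");
            _latestPacket = null;
        }
    }

    private void OnDestroy()
    {
        _client?.Close();   // unblocks Receive and ends the background loop
    }
}
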
VIDEO
