Living Archives develops a practical, research-led pathway for bringing an iconic fashion and textile archive into connected physical and digital experiences. The project focuses on cultural storytelling, provenance, craftsmanship, and measurable audience engagement. It will deliver a tested concept design plus a workflow map and schematic that heritage institutions and technology partners can adopt with confidence — including clear guidance on what enhances engagement, what detracts, and what capabilities and skills are required.
How do you translate a physical garment into a world? Not a display. Not a retrospective. A world you stand inside.
The central design premise: the garment is the architecture. Colour palette, surface textures, spatial geometry, typographic language, environment assets — all extracted directly from the cloth, not referenced from it. The project worked from garment scans outward, building a complete design system of texture libraries, 2D assets, and 3D environment materials — simultaneously faithful to Westwood's iconic visual DNA and entirely specific to these physical objects and this personal archive.
The challenge was to honour one of the most visually distinctive brands in fashion history without producing a retrospective: the garments are not the exhibit. They are the material the world is made from, and the story they serve is Asiya's.
The design development leaned heavily on Google's generative AI tools throughout ideation. Project Genie (Google DeepMind / Google Labs) was used to explore the garment-world concept in navigable real time — uploading scans of the Five Centuries Ago print and walking through the worlds it generated, testing the visual logic of turning textile into terrain before committing to a build approach.
Skybox AI generated 360° photospheres of the three scene environments — Brixton market, the Westwood atelier, a 90s basement club — seeded with the garment's specific colour language to ensure AI-generated spaces carried the right atmosphere before any hand-authored texture was applied. Google Stitch drove rapid UI and interface design iteration, exporting directly to Figma for refinement.
The archive material — VHS footage, Polaroids, handwritten letters — was processed through Topaz Video AI and Runway Gen-3 to make stills breathe and degraded footage legible, always retaining the grain and warmth of the original formats. The texture of the time is the aesthetic. Enhancement without erasure.
The WebXR PoC demonstrates the spatial logic of a future room-scale gallery installation. Every camera position maps to where a physical visitor would stand. Every audio source is positioned for a directional speaker array that doesn't yet exist. Every design decision is made with the full installation in mind: projection-mapped walls carrying the garment-world, d&b Soundscape spatial audio with three simultaneous sonic horizons, LiDAR tracking updating the visual field in real time as visitors move.
The physical installation design centres on the garment itself — at the geometric vanishing point of all three worlds, housed in a conservation vitrine, the actual object around which the digital memory is built.
The 2D design work underpins everything: a complete visual identity derived from the garment — colour palette, texture system, typographic language sourced from Asiya's handwritten letters, annotation marks that appear on walls and floors as navigational elements at architectural scale. In the room-scale version, visitors feel miniature — as if walking through an enlarged scrapbook, the annotations floor-to-ceiling around them.
The translation from 2D asset to spatial environment is the core design skill the project exercises: a textile print becomes a procedural GLSL shader system that generates the world at infinite resolution. A Polaroid becomes a three-dimensional portal. A felt-tip annotation becomes a wall-height navigational mark. Every 2D element has a spatial life.
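The production shader system itself is GLSL; purely as an illustration of the resolution-independence idea behind it — a procedural field that returns the same deterministic value for any coordinate, so detail is bounded by sampling density rather than source-image pixels — here is a minimal Python sketch (all function names and constants are invented for this example, not taken from the project):

```python
import math

def lattice_hash(x: float, y: float) -> float:
    """Deterministic pseudo-random value in [0, 1) for an integer lattice point."""
    return math.modf(math.sin(x * 127.1 + y * 311.7) * 43758.5453)[0] % 1.0

def value_noise(u: float, v: float) -> float:
    """Smoothly interpolated lattice noise, samplable at any coordinate."""
    x0, y0 = math.floor(u), math.floor(v)
    fx, fy = u - x0, v - y0
    # smoothstep easing between the four surrounding lattice corners
    sx, sy = fx * fx * (3 - 2 * fx), fy * fy * (3 - 2 * fy)
    a = lattice_hash(x0, y0)
    b = lattice_hash(x0 + 1, y0)
    c = lattice_hash(x0, y0 + 1)
    d = lattice_hash(x0 + 1, y0 + 1)
    top = a + (b - a) * sx
    bottom = c + (d - c) * sx
    return top + (bottom - top) * sy

def sample_print(u: float, v: float) -> float:
    """A toy 'textile print' field: stripes warped by noise, in [0, 1].

    Because the field is a pure function of (u, v), it can be rendered
    at any zoom level or output resolution without pixelation.
    """
    warp = value_noise(u * 4.0, v * 4.0)
    return 0.5 + 0.5 * math.sin((u + warp * 0.25) * 20.0)
```

The same principle, expressed as a GLSL fragment shader, is what lets a scanned print become environment-scale surface material rather than a stretched bitmap.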
THROUGH delivers a tested concept design and technology appraisal to Living Archives as its primary output — the research framework the academic team uses to evaluate immersive technology options for cultural heritage institutions. Alongside this: a working WebXR experience demonstrating the spatial logic, a fully documented production pipeline from garment scan to navigable world, and a clear institutional pitch for a room-scale installation at venues including the V&A, Tate Modern, and the Whitworth Manchester.
The experience is proof that the most powerful thing you can do with an archive is not display it. It is to let people stand inside what it remembers.
Steven Bamidele's debut concept album THE CRASH! begins with a man — faithless, directionless, drifting through life until a spaceship crash lands in his living room. From the wreckage, a woman alien emerges, and through their relationship, meaning re-enters the world.
The album carries this narrative in abstraction — gestured at, never stated. My role as creative director was to make the allegory inhabitable and place the audience inside it.
The evening is divided into two acts — a live podcast (Vinyl Resting Place Pod Q&A) and a live musical performance. Rather than treating these as separate segments, I conceived them as two phases of a single theatrical journey.
Act I — The Living Room
The audience enters a fully staged living room. I decided the podcast Q&A should take place within the domestic world — the before. Towards its close, a theatrical crash sequence erupts and the audience is rushed out to evacuate. This moment of displacement mirrors the unnamed character's rupture. His world is broken open. The audience does not yet know what they will return to.
Act II — The Crash
The audience returns to find the room transformed. Projection mapping floods the walls and the movement artist moves through the space as the woman alien — the source of all magic. Steven performs a curated selection from THE CRASH!, the visual world responding to the emotional arc of each song: cold and colourless in disconnection, saturated and alive when the characters draw close.
To brief the projection mapping artist and movement artist with precision, I designed a structured prompt sheet for Steven — a series of questions developed to draw out the narrative, visual, and emotional vocabulary of the album. Questions ranged from the broad (themes of liminal space, faith, connection) to the granular (how do people physically move through this world? what elemental forces exist that we have no concept of?).
This document became the shared creative compass for all three collaborators. From it, I extracted a world-building framework: two-state colour logic, the importance of blue as the colour of meaning, the Romeo-and-Juliet shape of the central relationship, and the core visual principle that the alien's presence should be felt rather than depicted literally.
I am currently producing a stimulus sheet for the projection mapping artist, developing prompts that translate the album's emotional architecture into visual sequences for the walls of the venue.
An immersive dance-theatre experience about losing touch with yourself.
When you arrive, someone will fit you with a device. It goes around your torso. You're told it's for your health — government mandated, nothing to worry about. Around you, the world is in crisis. Just follow the instructions.
Tethered is a promenade dance-theatre experience set in a near future where wearable technology has been deployed to monitor and regulate the human body. As you move through the show, the device you're wearing pulses and vibrates in response to what's happening on stage — syncing you to the inner lives of two characters living with early-onset dementia, brought on by the technological apocalypse, whose hold on memory and self is becoming harder to keep.
Objects pass between performers and audience throughout. Each one carries the weight of someone's memory. Each exchange is an act of connection in a world that has quietly made connection harder.
But not everyone's device is the same. And not everything you've been told about it is true.
Tethered fuses physical theatre, dance, live music and haptic wearable technology to explore what it means to trust or distrust your own body. It asks who benefits when we stop listening to ourselves, and what we might reclaim when we start again.
This is a show you watch, feel and make choices inside. Some of those choices are small. One of them isn't.
I led the experiential and creative R&D direction of Tethered, defining how performers and audiences would interact within a multi-sensory, immersive performance environment. Working from an existing script, I focused on translating narrative themes into embodied audience experience rather than rewriting story or dialogue.
A key part of this was assigning the audience an active role within the world of the piece, ensuring they were not passive observers but embodied participants whose presence helped flesh out the performance's internal logic and world-building. I designed the interaction framework through which audiences inhabited this role, shaping how gaze, proximity, and spatial positioning created moments of intimacy, tension, and recognition between performers and audience members.
Objects played a central role in this approach — selected and deployed as memory carriers, allowing audiences to physically hold, witness, and transfer meaning through material interaction.
Alongside this, I developed a proof-of-concept integrating haptic wearable technology into the narrative of the performance, exploring how sensory feedback could function as an additional experiential layer that deepened audience embodiment and emotional resonance without overriding the live performance.
Debuted at 93 Feet Live, London.
Edinburgh Fringe Festival 2025.
Ongoing development.
Tunnel Groove was the second event in an evolving immersive nightlife series exploring how sound, space, and visual illusion can reshape collective experience. Building on the first iteration — which fused live saxophone with a techno DJ set — this version shifted tone into a funky house–led experience, inspired by the visual language of funfairs, crazy houses, and playful perceptual distortion. Set within tunnel environments in central London, the event aimed to reframe nightlife as an immersive, world-led experience rather than a conventional club format.
I led the project end to end, originating the creative concept and translating it into a cohesive, multi-sensory world designed to be playful, disorienting, and participatory. I chose projection mapping across the tunnel walls, engulfing the audience to achieve this effect.
I shaped the event across all touchpoints: creating the visual mood boards; collaborating with a projection mapping artist to realise the environment; curating the musical identity by sourcing DJs and defining set concepts; and physically configuring the space to support audience flow and collective atmosphere. I also created the event's dynamic poster with Adobe Creative Suite and authored the marketing copy, ensuring the public-facing narrative aligned with the internal logic of the experience.
Tunnel Groove was conceived as an experimental yet repeatable format rather than a one-off event. The success of this second iteration demonstrated how the core concept could be adapted and evolved over time, allowing each edition to explore a distinct aesthetic and musical identity while retaining a consistent experiential structure.
The series began with Techno Tunnel Rave — a techno DJ set fused with live saxophone and an immersive art installation inside the COLAB Tower tunnel, July 2025.
Digital Catapult's investigation into ethical AI within the creative industries, developed in partnership with Target3D and Hat Trick Productions.
Ahead of the technical sprint, I designed and delivered exploratory workshops investigating hyper-realistic digital characters as a way to examine digital identity, representation, and presence. Working hands-on with Unreal Engine's MetaHuman Creator as a creative tool, I helped establish the visual and experiential language that later informed the direction of the project.
Upon receiving funding to create a demonstration, I conducted research into the real-world challenges and tensions surrounding ethical AI in the creative industries. Rather than informing the AI system's underlying knowledge (which drew on large language models), my findings were translated into training and guidance materials published alongside the experience, supporting organisations to engage critically and responsibly with the use of AI.
My focus was on ensuring the experience functioned not just as a technical showcase, but as a meaningful point of engagement — helping audiences and organisations grapple with the lived, emotional, and ethical realities of AI in practice.
Debuted at SXSW Texas 2024.