Creative Technologist Roy Rodenhäuser illustrates how tech can facilitate virtual fashion workflows

With the pandemic ushering in a new age of remote working and social distancing, the industry has seen many examples of technology being used innovatively to address the resulting challenges. Fashion events came to a grinding halt in the wake of the outbreak, leaving businesses pondering ways to keep their brands afloat.

What if we could create a digital version of a fashion show while enabling designers to view and customise their clothes on a digital model? Can technology provide solutions to the fashion industry and ease its workflows?

We spoke to Verizon Media RYOT Studio creative technologist Roy Rodenhäuser who recently fused technology and creativity in order to create a simulated experience of a fashion show in a home setting through what he calls an ‘acrylic-to-pixel demo.’

Here are some excerpts:

What spurred you on to create it?

Part of my work as a Creative Technologist at RYOT Studio, Verizon Media, consists of exploring and prototyping use cases for multiple industries, combining creativity and technology with the goal of producing memorable experiences that connect people with brands.

Soon after the pandemic started, physical events such as fashion shows and music concerts had to be cancelled, rescheduled, or moved completely online. At the same time, my workspace transitioned from a shared office environment to my home. 

With the acrylic-to-pixel prototype, I wanted to paint a picture of a workspace where technology would seamlessly augment a traditional workflow so as to enhance the craft as opposed to disrupt it, all of this depicted in the form of a fashion show unfolding in a home setting instead of a large venue.

My recent involvement in The Fabric of Reality virtual fashion show has been a source of learning and motivation, and during the production, I had the opportunity to peek into the work of highly talented designers and understand some of the advantages and challenges that originate from a digital fashion workflow.

What are your expectations from this prototype? 

I expect a combination of edge computing, real-time rendering, augmented reality, and 5G to enable a workspace for craftspersons, artists, and designers that allows them to focus on their creations and existing workflows while benefiting from the advantages of the digital.

Imagine a creative space where technology’s visual footprint is nonexistent, with remote computing resources dynamically arranging themselves according to one’s needs during a particular task: not a radical change, rather a seamless and almost natural augmentation of the working environment.

How do you think brands can leverage this kind of imagery?

If developed further, it could be leveraged by brands as a productivity tool for fashion designers to immediately preview designs, thereby cutting down prototyping time.

Also, think of a fashion event where artists design garments that are modelled live in front of an audience through a combination of motion-capture performance and live painting. This could be an interesting concept for fashion brands and designers to explore.

How do you plan on taking forward your craft?

Joachim Hensch, former managing director at Hugo Boss Textile Industries, with whom I had the pleasure of discussing the acrylic-to-pixel concept, asked readers to ‘Think of the endless possibilities and the easiness for designers to step into AR, not leaving their traditional and haptic work but just adding a new layer on it, which in addition is easy to transfer, move, adjust, multiply.’

This is the kind of thinking that is informing the direction I’m taking for further development. There’s plenty of room for improvement in details such as cloth simulation and pattern recognition, in both the computer-vision and textile senses of the term, and yet I think that addressing and connecting with the craft of the people involved in the fashion supply chain deserves my primary attention as I take this forward.

Can you give us a breakdown as to how you created the imagery?

While learning the basics of working with Marvelous Designer, I realised that when exporting a 3D model of a garment, the corresponding UV map is generated from the 2D layout of the patterns I designed using the software. I thought I could use this aspect to create an acrylic-to-pixel concept for fashion! So I started with the creation of the dress in Marvelous Designer, designing the dress patterns in a 2D layout while fitting them on a character in a 3D view.
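To illustrate why this works: because the exported UV coordinates line up with the 2D pattern layout, an image of the painted layout can serve directly as the garment’s texture. The minimal Python sketch below shows how a normalised UV coordinate would index into such a painted canvas image; it only illustrates the mapping, it is not part of Rodenhäuser’s actual pipeline, and the function and variable names are hypothetical.

```python
import numpy as np

def sample_canvas(canvas, uv):
    """Return the colour the garment shows at a normalised UV coordinate.

    `canvas` is an image of the painted pattern layout (H x W x 3 array),
    `uv` is a (u, v) texture coordinate in the 0..1 range, as exported
    alongside the 3D mesh.
    """
    h, w = canvas.shape[:2]
    u, v = uv
    # UV space puts the origin at the bottom-left, image arrays at the
    # top-left, so the v axis is flipped before indexing into pixels.
    x = min(int(u * (w - 1)), w - 1)
    y = min(int((1.0 - v) * (h - 1)), h - 1)
    return canvas[y, x]

# Tiny example: paint the left half of a 512x512 canvas red and sample a
# vertex whose UV coordinate falls inside that painted region.
canvas = np.zeros((512, 512, 3), dtype=np.uint8)
canvas[:, :256] = (200, 30, 30)
print(sample_canvas(canvas, (0.25, 0.5)))  # -> [200  30  30]
```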

Then, I exported the character wearing the dress to Mixamo and applied several animations to it. I set up a virtual scene in Unreal for pixel streaming to a HoloLens device and imported the character wearing the dress into it. 

If you have worked with live video compositing in Unreal, you are probably already familiar with applying a video feed to a material. I won’t go into detail, but essentially that’s what I did: I fed the video stream from a webcam to the dress material in Unreal. This opens up many possibilities, for example using chroma keying to erase parts of the garment with green paint!
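As a rough illustration of the chroma-key idea outside of Unreal, here is a hedged Python/OpenCV sketch that grabs a webcam frame and turns green-painted regions transparent. The HSV thresholds, camera index, and file name are assumptions for the example and would need tuning for real paint and lighting; the demo itself does this keying inside Unreal’s material pipeline.

```python
import cv2
import numpy as np

# Rough HSV range for green acrylic paint; would need tuning for the
# actual pigment and lighting conditions.
GREEN_LOW = np.array([40, 70, 70])
GREEN_HIGH = np.array([80, 255, 255])

def key_out_green(frame):
    """Return the frame as BGRA, with green-painted areas made transparent."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    green_mask = cv2.inRange(hsv, GREEN_LOW, GREEN_HIGH)  # 255 where green
    alpha = cv2.bitwise_not(green_mask)                   # 0 where green
    b, g, r = cv2.split(frame)
    return cv2.merge([b, g, r, alpha])

cap = cv2.VideoCapture(0)  # webcam pointed at the painted canvas
ok, frame = cap.read()
if ok:
    # Regions painted green end up transparent in the saved texture, i.e.
    # "erased" from the garment once the texture is applied to the dress.
    cv2.imwrite("dress_texture.png", key_out_green(frame))
cap.release()
```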

Finally, I placed the webcam so as to face the printed UV map of the dress patterns, which also served as the canvas for painting, and ran the application.
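For a sense of how the camera’s oblique view of the printed sheet could be squared up with the garment’s UV space, here is a small Python/OpenCV sketch using a perspective warp. The corner coordinates, texture size, and function name are illustrative assumptions; it only sketches the geometric idea, not how the demo itself is implemented.

```python
import cv2
import numpy as np

TEX_SIZE = 1024  # output texture resolution; one side of the UV square

def rectify_canvas(frame, corners):
    """Warp the camera's oblique view of the printed pattern sheet into a
    square image aligned with the garment's 0..1 UV space.

    `corners` lists the sheet's four corners in the camera image, ordered
    top-left, top-right, bottom-right, bottom-left (measured by hand or
    detected from markers taped to the sheet).
    """
    src = np.float32(corners)
    dst = np.float32([[0, 0], [TEX_SIZE, 0],
                      [TEX_SIZE, TEX_SIZE], [0, TEX_SIZE]])
    warp = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame, warp, (TEX_SIZE, TEX_SIZE))

# Example with hand-measured corner pixels for a fixed camera position:
# corners = [(212, 148), (1068, 131), (1103, 874), (188, 896)]
# texture = rectify_canvas(frame, corners)
```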


In terms of equipment, I used an HP Z1 workstation equipped with an NVIDIA RTX 2070 Super GPU, an HD webcam, and a HoloLens device with a mobile phone mounted inside it to record the experience.

We are thrilled to witness such examples of artists making innovative use of technology to bolster creativity and art. We hope leaders of the fashion industry build upon this idea to enable easier workflows.