A Q&A with Evercoast CEO Ben Nunez on the state of Hologram Technology in the age of COVID-19
It’s been (gulp!) 43 years since my four-year-old brain first tried to process the idea of holograms, thanks to Princess Leia and Obi-Wan Kenobi. Yet since that time we haven’t advanced nearly enough to approach the possibilities inherent in that iconic scene. There have been some notable attempts to get there, but when it comes to hologram technology, it’s still very much early days. As with everything else, though, COVID-19 is acting as an accelerant for the space.
Back in the spring, as I watched the events I had been looking forward to attending over the summer (Rolling Stones, Billy Joel, Elton John) all go up in a pandemic-fueled ball of flames, the potential for a remote live viewing experience buoyed by hologram technology suddenly seemed more relevant than ever, even necessary. It seems — at long last — as if hologram technology is finally poised for a serious breakthrough.
To dive into this further, I reached out to Ben Nunez, the CEO of Evercoast, an integrated software and cloud rendering platform that streams real-time live broadcast and on-demand pre-recorded holograms, to get his point of view on the state of hologram technology today, what’s holding it back, and what we should expect from it going forward.
Ben Nunez, CEO of Evercoast
Chris Young: Describe the current state of the hologram industry in general from a technological standpoint.
Ben Nunez: The latest advancements we’re excited about are the ones that are enabling the creation and streaming of volumetric content at scale. For the few years it’s been around, volumetric has been very difficult and expensive to create, and thus we haven’t seen a lot of it. That’s changing now, particularly with systems like Evercoast’s that are small-scale, portable, scalable, still very high quality, and yet a fraction of the cost of other systems in the market.
Live streaming of volumetric data is another huge advancement we’ve made and are excited to launch this fall. The ability to have a live fashion show in 3D with photoreal humans is at our doorstep.
COVID created a tailwind to trends that were already in motion: remote, immersive, virtual, work from home, 2D to 3D…and combined with other technologies that are advancing like 5G, edge computing, machine learning, XR, and computer vision, volumetric is primed to see significant growth in the years ahead.
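For readers wondering what “volumetric capture” involves under the hood: systems like the one Nunez describes typically combine depth data from multiple calibrated sensors into a single textured 3D model of a person. As a rough, hypothetical illustration (not Evercoast’s actual pipeline), the sketch below shows the first step — back-projecting one depth frame from a commodity RGB-D sensor into a 3D point cloud using the standard pinhole camera model:

```python
# Illustrative sketch: back-project one depth frame into a 3D point cloud using
# the pinhole camera model. Real volumetric systems fuse many such views from
# calibrated sensors into a textured mesh; this shows only the first step.
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Convert a depth image (meters) into an (N, 3) array of camera-space points.

    depth  : (H, W) array of depth values in meters (0 = no reading)
    fx, fy : focal lengths in pixels (from the sensor's intrinsics)
    cx, cy : principal point in pixels (from the sensor's intrinsics)
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # per-pixel coordinates
    z = depth
    x = (u - cx) * z / fx                            # pinhole back-projection
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                  # drop pixels with no depth

if __name__ == "__main__":
    # Synthetic example; real values would come from a depth sensor's SDK.
    fake_depth = np.full((480, 640), 1.5)            # a flat surface 1.5 m away
    cloud = depth_to_point_cloud(fake_depth, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
    print(cloud.shape)                               # (307200, 3)
```

Fusing dozens of these per-sensor clouds per second, meshing them, and streaming the result is where the scale and cost challenges Nunez mentions come in.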
Chris Young: With Fashion Week right around the corner, how would you characterize the fashion industry’s embrace of hologram tech pre- and post-COVID?
Ben Nunez: We think it’s important to distinguish between hologram and 3D. The former is a loosely used term to describe what many believe is a projection of people and objects into thin air, à la Princess Leia in Star Wars or even Tupac at Coachella.
What volumetric technology enables is the creation of true 3D content that can be consumed on any device for uses like aiding decisions across the entire fashion value chain, from pre-production between suppliers and designers, to consumer purchasing at the point of sale.
One of the early examples of volumetric in fashion was “Ashley Graham, Unfiltered” — an experience created by Joanna Nikas, a New York Times Styles editor, that explores issues around body image with model and activist Ashley Graham.
The promise ahead for volumetric in the fashion industry is rooted in the use of photoreal 3D human models, in full motion. For consumers, the ability to try on clothes without physically trying on clothes, the virtual dressing room, has been a dream for a long while. The technology is there to do it now.
Cloth simulation and tools that can map outfits to a human body, maintaining the physics of how that cloth flows and drapes given the weight and type of fabric along with the motion of the body, will make that possible.
This type of approach can be used in the pre-production process designers go through with suppliers in making purchasing decisions as well. Good fashion teams are creating versions of their apparel using these cut and simulation tools, and the missing link is a high-quality volumetric character in motion that clothing can be mapped onto.
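The cloth-physics piece Nunez describes is, at its core, a simulation problem. As a rough illustration (not Evercoast’s actual tooling), here is a minimal mass-spring cloth sketch in Python/NumPy, where the spring stiffness stands in for fabric type and the pinned vertices stand in for attachment points on a body:

```python
# Minimal mass-spring cloth sketch (illustrative only).
# A grid of point masses connected by springs is stepped with Verlet integration;
# stiffness approximates fabric type, pinned vertices approximate attachment to a body.
import numpy as np

W, H = 12, 12              # cloth grid resolution
REST = 0.05                # rest length between neighboring vertices (meters)
STIFFNESS = 0.9            # how strongly springs resist stretching
GRAVITY = np.array([0.0, -9.81, 0.0])
DT = 1.0 / 60.0            # simulation time step

# Vertex positions laid out as a hanging sheet.
pos = np.array([[x * REST, -y * REST, 0.0] for y in range(H) for x in range(W)])
prev = pos.copy()          # previous positions for Verlet integration
pinned = [0, W - 1]        # pin the two top corners (e.g. to shoulders)

# Structural springs: each vertex connects to its right and lower neighbor.
springs = []
for y in range(H):
    for x in range(W):
        i = y * W + x
        if x + 1 < W:
            springs.append((i, i + 1))
        if y + 1 < H:
            springs.append((i, i + W))

def step():
    """Advance the cloth by one time step."""
    global pos, prev
    # Verlet integration: new = 2*pos - prev + a*dt^2 (gravity only; body
    # collisions and wind would be added to the acceleration term).
    new = 2.0 * pos - prev + GRAVITY * DT * DT
    prev, pos = pos, new
    # Relax spring constraints a few times so the fabric keeps its shape.
    for _ in range(5):
        for a, b in springs:
            delta = pos[b] - pos[a]
            dist = np.linalg.norm(delta) + 1e-9
            correction = (dist - REST) / dist * STIFFNESS * 0.5 * delta
            pos[a] += correction
            pos[b] -= correction
        for i in pinned:                 # keep pinned vertices fixed
            pos[i] = prev[i]
```

Production cloth solvers add self-collision, collision against an animated body mesh, and per-fabric drape parameters, but the basic loop — integrate, then enforce constraints — is the same idea.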
3D content is a manifest destiny of sorts. The world we live in is, after all, 3D.
Chris Young: What makes Evercoast unique in the space?
Ben Nunez: Cost, use of commercially available sensors, small spaces, ease of use, general accessibility. We’re attacking the cost of creation with by far the lowest total cost of ownership in the industry, while maintaining a professional level of quality. We’re making volumetric ubiquitous by lowering the cost of production.
Chris Young: In which sector or industry do you think the technology has been most readily adopted and why?
Ben Nunez: Music is proving to be an exciting vertical for volumetric. The creativity volumetric capture enables is unparalleled. And with COVID delivering a crushing blow to live music, volumetric gives artists a highly compelling way to connect with fans and create experiences we haven’t seen previously.
Ironically, fashion seems to have been slower to adopt the technology than one would expect. It’s such a perfect market for the use of 3D human capture, but it’s also an industry that has historically been slower to adopt technology. With purchasing decisions moving online, consumers will continue to demand more personalized and advanced ways of choosing products that are best for them. And with limited business travel and in-person modeling in the supply chain, the need for designers and suppliers to interact remotely in more immersive environments will drive greater demand.
Chris Young: What’s your boldest prediction for the space in 2020?
Ben Nunez: We’ll all fall asleep on December 31, 2020 and wake up on January 1, our digital calendars will all say 2020 still, and we’ll all realize it was a long, bad dream from another dimension where our holographic selves experienced a really rough, crazy year.
Short of that, with just a few months left in the year, we predict a live stream of a fashion and music show into an online virtual experience that will span devices and platforms and become the most widely viewed volumetric experience ever created.
Chris Young: What’s your biggest fear?
Ben Nunez: 3D content is a manifest destiny of sorts. The world we live in is, after all, 3D. The rest is figuring out our own path as this transition takes place. We don’t see anything to be afraid of, per se, except making bad calls and following dead-end technology trends. But our team has been at this for a long time, and we’re confident in our instincts.
Shudu is the world’s first digital supermodel, and the rise of synthetic and deepfaked humans, if used in nefarious ways, could cause problems in our world. Then again, fears of 3D synthetic humans and deepfakes will call even more attention to the need for authentic, volumetrically captured content, so it really doesn’t keep me up at night.