AR


World premiere at Tribeca: Breonna’s Garden AR experience

Breonna's Garden, an AR experience created to honor the life of Breonna Taylor, is one of several innovative cinematic experiences at the Tribeca Festival taking viewers to new worlds.

Imagine standing in New York City and then instantly journeying to the bottom of an ocean with a marine biologist. Or walking through a moonlit forest inhabited by a monster. Or getting a tour of Barack Obama’s White House from the former president himself. Or crash-landing on an unknown planet 300 light years from Earth with only a robot as your guide.

Republique, the interactive movie

These unique experiences were all made possible thanks to the Tribeca Festival's evolution in cutting-edge virtual reality and immersive storytelling. For nearly a decade, it has been the premier showcase for thrilling technology that allows creative thinkers and explorers to immerse themselves in, and interact with, the 360-degree world around them. All you need is the willingness to let go — and, on occasion, a headset.

Tribeca’s multimedia program ignited in 2013 with the introduction of the Storyscapes section, a series of five experimental works that capitalized on interactive, web-based, and cross-platform approaches to visual creation. A year later, acclaimed directors Daniel Scheinert and Daniel Kwan introduced Possibilia, a non-linear film in which cast members made real-time choices onstage in front of a live audience. And in 2015, Jeremy Bailenson from Stanford University brought its Virtual Human Interaction Lab to the festival. For the first time, audiences could play quarterback for a football team and walk a mile in someone else’s shoes.

Breonna Taylor: a special tribute

That same spirit continues today, and the 2021 edition of the Tribeca Festival has extended an invitation to the public: take an adventure with a friend as they confront their fears through music, encounter cryptic AI, and honor Breonna Taylor with a special tribute from her sister, Ju'Niyah Palmer. These cinematic experiences are all part of the Tribeca Festival's immersive lineup.

With incredible stories from world-class artists, these immersive experiences push the boundaries of storytelling. Presented by AT&T, Tribeca Festival will celebrate cutting-edge selections with the Storyscapes Award, honoring artists who use innovative ideas to bridge the gap between technology and storytelling.

“2021 is the most dynamic year yet for Tribeca Immersive, especially as it relates to the diversity of experiences,” said Loren Hammonds, VP, Immersive Programming, Senior Programmer, Film & Immersive. “Attendees, both in-person and online, will experience everything from live AI performers and animated adventures to a location-based augmented reality memorial.”

The 2021 Tribeca Immersive program centers on outdoor interactive experiences. In-person installations will be hosted at various locations throughout NYC; these experiences are free and open to everyone throughout the Festival, and the location of each will be announced on the festival's website. Selections available virtually can be accessed via the Tribeca Festival website.

We Are At Home, a VR installation

The lineup includes a world premiere, Breonna's Garden, a 15-minute AR experience created in collaboration with Ju'Niyah Palmer to honor the life of her sister, Breonna Taylor. Part of the Juneteenth programming, the cinematic experience Breonna's Garden – She who plants a garden plants hope was created by Lady PheOnix Ruach Shaddai, with Stuart 'Sutu' Campbell as key collaborator.

The full list of immersive experiences includes 28 other titles covering a wide variety of subjects, from Bystanding: The Feingold Syndrome, an immersive docufiction sharing the confessions of people who watched a kayaker drowning for four-and-a-half minutes and did not jump in, to We Are At Home, a VR installation based on the poem The Hangman at Home by Carl Sandburg, which poses a single question to viewers: "What does the hangman think about when he goes home at night from work?"

Other titles include JAILBIRDS, which ProVideo Coalition mentioned recently, and Republique, an interactive film that plunges the participant into the emotions felt during an attack in the Paris Metro through three parallel storylines, letting you switch from one to another. There are, in fact, experiences for a wide variety of tastes, a clear indication of how vibrant this section of the festival is.

This Unique Nikon Lens Can Capture 360-Degree Views With No Stitching

A designer duo has created a first-of-its-kind lens that can record 360-degree spherical video content that doesn’t need to be stitched in post-processing and can be used with any conventional camera.

Rob Englert and Meyer Giordano are two experienced industrial and interaction designers with a particular interest in augmented reality (AR) and virtual reality (VR) who have worked with brands such as Bose, Chobani, KODAK, RIDGID, and others. Together, they founded the (sphere) optics brand, under which they developed the unique (sphere) Pro1 lens, which can capture everything in a full 360-degree view, creating shooting opportunities that otherwise would not be possible.

What makes it unique is that the lens eliminates the stitching process normally found in spherical or VR content production, where the perspectives of multiple cameras and lenses are combined. Creators can also use their existing camera and workflow with the Pro1, as opposed to needing wholly separate equipment.

The idea for the lens was born out of personal experience after Englert lost his younger brother. This life-changing moment happened long before everyone had smartphones in their pockets, which left Englert with almost no video memories of his brother.

This fueled his drive to use all of his skills as an industrial designer "to explore different ways to capture moments in time" that can later be revisited over and over again. He says that he would "give almost anything for just a couple more minutes" with his brother, and although that is no longer possible, he hopes it can be made possible for others in the future.

Lens Design

Originally, the novel lens was developed as part of the duo's ongoing work on AR and VR technologies, with non-fungible tokens (NFTs) included in the project's funding process. Both designers were producing 360-degree videos using multi-camera arrays in a housing that they had designed and 3D-printed. The process was arduous because the content was recorded across several cameras and the final output needed stitching: they separated each video into frames, combined each set of frames into a panorama, and then recompiled the panoramas into a video. Even then, the final product would still have visible stitch lines where the views were merged.
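To make the pain of that earlier workflow concrete, here is a minimal sketch of the kind of frame-by-frame multi-camera stitching loop the duo describes, using OpenCV's high-level stitcher. The file names and camera count are hypothetical, and the duo's actual pipeline has not been published.

```python
# A sketch of a frame-by-frame multi-camera stitching loop, for illustration
# only. File names and the camera count are hypothetical.
import cv2

NUM_CAMERAS = 4  # hypothetical size of the multi-camera array
captures = [cv2.VideoCapture(f"camera_{i}.mp4") for i in range(NUM_CAMERAS)]
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
writer, size = None, None

while True:
    frames = []
    for cap in captures:
        ok, frame = cap.read()
        if not ok:
            frames = None
            break
        frames.append(frame)
    if frames is None:
        break  # one of the sources ran out of frames

    # Combine this set of per-camera frames into a single panorama.
    status, pano = stitcher.stitch(frames)
    if status != cv2.Stitcher_OK:
        continue  # skip frame sets the stitcher cannot align

    if writer is None:
        size = (pano.shape[1], pano.shape[0])
        writer = cv2.VideoWriter("panorama.mp4",
                                 cv2.VideoWriter_fourcc(*"mp4v"), 30.0, size)
    # The stitcher's output size can drift between frames; keep it constant.
    writer.write(cv2.resize(pano, size))

for cap in captures:
    cap.release()
if writer is not None:
    writer.release()
```

Even in this simplified form, per-frame alignment is slow and fragile, which is exactly the overhead a single-lens capture removes.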

Both designers spent a lot of time reviewing how this process could be improved, which is where the idea of a single-lens, single-camera setup originated. Instead of using optical design software, they created potential shapes as 3D models and then simulated the results using regular 3D rendering software, which, after some trial and error, eventually gave them a design worth prototyping. They still had to test it in a real-life scenario, which meant the prototype had to be made by hand. Once finessed, they began working with optics professionals to further refine the design and get one step closer to a finished lens.

To cover 100 percent of the environment with the lens, they started with a regular circular fisheye lens with a field of view of approximately 180 degrees and mounted a mirror in front of it that reflects its image downward, like a periscope. They then took the cross-section of this setup and extruded it in a circle around the axis of the mirror. The result was several donut-shaped lens elements around a cone-shaped mirror, providing the 180-degree vertical field of view of the original fisheye design as well as a 360-degree horizontal view from being revolved around the center.

The current design of the lens has 12 elements: one reflective — the mirror — and 11 refractive, including two torus-shaped elements that surround the mirror. Most of the elements are just as unique as the design itself and cannot be found in any other existing lens.

The parts are made from specialist engineering plastics — using a family of materials called cyclic olefin copolymers — in a process called single-point diamond turning, which is the only way to generate the complex aspheric forms that make the lens design possible. These plastics have similar optical properties to glass, but are much lighter and easier to shape, and are used in top-end scientific applications, like space telescopes.

The duo chose the Nikon F-mount because it's easily adaptable to most other common standards, making it easier for content creators to use the equipment they already have, instead of investing in a completely new system, while maintaining full control over the image at all times. The single-lens construction also allows capturing content that a standard VR setup couldn't due to space limitations.

The lens has a fixed f/8 aperture and a 1mm focal length. It is 150 millimeters (5.9 inches) wide, 198 millimeters (7.8 inches) long, and weighs 1.8 kilograms (4 pounds).

Currently, these lenses are produced extremely slowly. They are fabricated one unit at a time and assembled by hand, which makes the individual cost quite high. However, with a large order, the duo says that costs could be significantly reduced by molding most of the elements. The two believe that this lens can benefit a wide range of industries, from film, documentary, gaming, and entertainment all the way to engineering, military, surveying, and more.

Photo and Video Footage

For stills, the lens produces a circular image with a black void in the center. The area near the center of the image corresponds to the forward end of the lens, and the outer periphery of the image circle is the direction facing the camera body. In their words, “It looks a lot like the black hole from ‘Interstellar.’”
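To make that geometry easier to picture, here is a small sketch of how a viewing direction could map onto the donut-shaped image, assuming an idealized projection in which elevation varies linearly with radius between the central void and the outer rim. The real lens's projection curve has not been published, and the numbers in the example are hypothetical.

```python
# A sketch of the donut-image geometry described above, assuming an
# idealized projection where elevation varies linearly with radius.
# The real (sphere) Pro1 mapping is defined by its ST map, not this.
import math

def direction_to_donut(azimuth_deg, elevation_deg,
                       cx, cy, r_inner, r_outer):
    """Map a viewing direction to pixel coordinates in the circular image.

    elevation +90 deg = straight ahead of the lens (the central void),
    elevation -90 deg = back toward the camera body (the outer rim).
    """
    t = (90.0 - elevation_deg) / 180.0        # 0 at the void, 1 at the rim
    r = r_inner + t * (r_outer - r_inner)     # radius grows with rearward tilt
    a = math.radians(azimuth_deg)             # angle around the lens axis
    return cx + r * math.cos(a), cy + r * math.sin(a)

# e.g. the horizon (elevation 0) lands exactly halfway across the donut
# (image center and radii here are hypothetical values):
print(direction_to_donut(0.0, 0.0, cx=2024, cy=2024, r_inner=300, r_outer=2024))
```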

The converted equirectangular version of the same image, shot in Big Bear, CA.
The (sphere) Pro1 lens was used to help NASA document the James Webb Space Telescope at Goddard Space Flight Center as part of its journey to space.

The circular image can be used as-is if desired, but most 360-degree video players use the standard equirectangular format. For this reason, (sphere) provides an ST map, an image that acts as a positional lookup table telling the computer how to rearrange the pixels. ST maps are also commonly used to remove lens distortion; internally, the approach is similar to what Adobe Camera Raw's distortion correction does. The duo says that Adobe could add specific support for the lens if it desired.
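As an illustration of what such a lookup does, here is a minimal sketch of applying an ST map with OpenCV. It assumes the common convention that the red channel stores the normalized horizontal (U) coordinate and the green channel the vertical (V) coordinate measured from the bottom; (sphere)'s actual map format may differ, and the file names are hypothetical.

```python
# A minimal sketch of applying an ST map, assuming the common red = U,
# green = V (bottom-up) convention; (sphere)'s exact format may differ.
import os
os.environ["OPENCV_IO_ENABLE_OPENEXR"] = "1"  # allow EXR maps, if used
import cv2
import numpy as np

src = cv2.imread("sphere_circular.png")  # native donut image (hypothetical file)
st = cv2.imread("st_map.exr", cv2.IMREAD_UNCHANGED).astype(np.float32)

# OpenCV loads channels as BGR, so index 2 is red (U) and 1 is green (V).
map_x = st[:, :, 2] * (src.shape[1] - 1)
map_y = (1.0 - st[:, :, 1]) * (src.shape[0] - 1)  # flip bottom-up V to rows

# For every output pixel, fetch the source pixel the ST map points at.
equirect = cv2.remap(src, map_x, map_y, interpolation=cv2.INTER_LINEAR)
cv2.imwrite("sphere_equirect.png", equirect)
```

Because the map is just data, the same remap runs unchanged for any lens whose maker ships an ST map, which is what makes it a convenient distribution format.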

The designers have also created 3D meshes that can be imported into game engines, such as Unity or Unreal, to unwrap the image in real time, "which is very useful either to act as a monitor for recording or to facilitate live streaming to VR headsets." Overall, the designers claim, the process of converting (sphere) videos to VR experiences is simple and can be done on iOS and Android devices.

Get Involved

To fund the project and its ongoing development, the team offers high-resolution, fully immersive VR moments as limited-edition NFTs on Mintable. These moments were filmed using the lens, and buyers can use their VR headset to virtually enter the place and time captured. The purchase includes the video in standard equirectangular format at 8K resolution for viewing via VR headset, along with the native circular projection of the (sphere) Pro1 lens at 4,048 x 4,048 resolution.

Additionally, the company offers a token that can be exchanged for a physical copy of the (sphere) Pro1 lens as more units become available. You can also view a (sphere) gallery with more image samples here.

Snap is Working on AR Glasses, Selfie Drone: Report

Snap is working on next-generation glasses that will be able to layer Snapchat lenses (augmented reality effects) onto the environment without needing to use a smartphone camera, according to a new report. The company has also reportedly revived its plans to introduce a selfie drone.

According to a report from The Information, noted by Engadget, the next-generation camera glasses aren't actually being developed with the intention of deployment to the average consumer. Instead, the company is focusing its efforts on developers and creators, those who are currently responsible for making the app's most popular Snapchat AR filters.

Snap is reportedly gearing up to officially announce these new Spectacles in May during its developer conference. Theoretically, if the glasses are successful, Snap could aim to launch more widely.

While Snap is best known for its social media app, the company has successfully brought hardware to market. Snap launched the Spectacles 3 in 2019 and still offers them for sale today. The original Snap Spectacles were a viral hit in 2016, when they launched in special pop-up vending machines in select cities; eventually, they became available to order on Amazon. In 2018, Snap launched its second-generation Spectacles, which allowed you to capture photos using the glasses.

That's not the only hardware Snap is investigating. The report goes on to say that the company is once again looking into producing a selfie drone. In 2017, Snap reportedly acquired the LA-based drone company Ctrl Me Robotics and also looked at acquiring China-based Zero Zero Robotics. Back then, The Information reported that Zero Zero had reached out to Snap for funding help, which was later denied by the robotics company's CEO.

Since then, news about Snap and drones has been largely quiet, but that is apparently set to change. Snap reportedly did eventually invest $20 million in Zero Zero, though it did not outright acquire the company. The new report indicates that Snap's own engineers have since been working on developing an in-house product. It's not clear when such a drone would come to market or how it would differentiate itself in what has become a very crowded space.

Arcturus brings volumetric video to 5G mobile devices

Large, high-quality interactive 3D files can now be streamed to 5G mobile devices and viewed directly through a browser, along with AR/VR, thanks to Arcturus and telecom giant DOCOMO.

Although volumetric video has seen steadily increasing use in everything from entertainment to education, the recent introduction of lidar cameras on mobile devices is poised to turn anyone with a compatible phone or tablet into a new type of content creator. But for that, you need the technology to follow the dream. That's what the partnership between Arcturus and Japan's top wireless carrier, NTT DOCOMO, brings to the table: it allows users to stream volumetric videos of any length over a mobile network for the first time.

The barrier up until now was that the complexity and size of the files – which include detailed 3D geometry and multiple camera angles – meant that volumetric videos were not suitable for streaming anything beyond short clips. Now, according to Arcturus, anyone on DOCOMO’s 5G network – from the average user to professionals in industries ranging from ecommerce to healthcare and beyond – can stream the future of video directly to their mobile devices.

Streaming AR and VR in real-time

In fact, this new partnership changes how it all works, giving 5G mobile users the ability to stream volumetric videos of any length without a drop in quality, so they can explore every angle of the video directly from their browser. This also means that large, high-quality interactive 3D files can now be viewed directly through a browser, along with Augmented Reality or Virtual Reality projects, all in real-time.

“We want to put volumetric video technology in the hands of anyone with a mobile device,” said Kamal Mistry, CEO of Arcturus. “DOCOMO’s 5G capabilities are among the best in the world, which gives us the opportunity to really push the technology forward by leaps and bounds.”

To bring volumetric videos to mobile devices, DOCOMO is leveraging its 5G network speeds and CDN expertise together with Arcturus's compression tools, adaptive bitrate streaming solution and video player, which adapts to a device's bandwidth to maintain the highest quality possible. With these tools, playback length is no longer an issue, and users watching in a web browser, or experiencing the videos in AR or VR, can explore travel hotspots, browse retail locations, watch full workout videos and more, all from their chosen point of view, and all on their mobile devices.
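Arcturus hasn't published how its player picks a quality level, but the general idea of adaptive bitrate streaming is simple enough to sketch. The encoding ladder, bitrates and headroom factor below are all assumptions for illustration, not Arcturus's actual values.

```python
# A generic illustration of adaptive-bitrate selection, not Arcturus's
# actual implementation. The encoding ladder and headroom are assumptions.
from dataclasses import dataclass

@dataclass
class Rendition:
    name: str
    bitrate_kbps: int  # average bitrate of this volumetric encoding tier

# Hypothetical encoding ladder, lowest to highest quality.
LADDER = [
    Rendition("low", 3_000),
    Rendition("medium", 8_000),
    Rendition("high", 20_000),
    Rendition("ultra", 45_000),
]

def pick_rendition(measured_kbps: float, headroom: float = 0.8) -> Rendition:
    """Pick the highest tier whose bitrate fits the measured bandwidth,
    keeping some headroom so playback doesn't stall on small dips."""
    budget = measured_kbps * headroom
    best = LADDER[0]
    for tier in LADDER:
        if tier.bitrate_kbps <= budget:
            best = tier
    return best

# e.g. on a 5G link momentarily delivering ~30 Mbps:
print(pick_rendition(30_000).name)  # -> "high"
```

The player re-runs this decision as bandwidth fluctuates, which is why playback length stops being a constraint: quality degrades gracefully instead of the stream stalling.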

Check the Virtual Booth online

“Working with Arcturus, we are able to anticipate the needs of our customers now and in the future and offer them solutions before they know they want them,” said Naoto Matoba, manager, innovation management department at NTT DOCOMO. “Volumetric videos offer an entirely new way to experience content, and now that we can offer them to anyone with a mobile device, we expect the interest to grow rapidly.”

This partnership follows Arcturus’s release of its industry-leading volumetric video content delivery platform, which allows artists to easily edit and compress clips, then distribute them to third-party tools, including post-production software, real-time game engines and more. As the global rollout of 5G technology continues, individual and commercial studios alike have access to the tools they will need to engage with the next stage of the evolution of video.

The partnership between DOCOMO and Arcturus will be showcased during the “DOCOMO Open House 2021”, an online event created to highlight the latest news and features from Japan’s top wireless carrier. Volumetric videos will be featured in the “Virtual Booth,” beginning on Thursday, February 4 and running through Sunday, February 7.


Pricing and availability

Arcturus's "HoloEdit," the first non-linear editor and post-production tool for volumetric video, and "HoloStream," a tool that offers streaming options for volumetric video, are both available now, with annual licenses for commercial and private use. Special education pricing is available upon request, says Arcturus.

Arcturus is the global leader in authoring, editing, and distributing volumetric videos. Founded in 2016 by innovators with deep backgrounds in storytelling and technology from Pixar, DreamWorks, Autodesk, Google and YouTube, Arcturus was ideally suited to create "HoloSuite," a SaaS post-production platform offering tools for editing and distributing volumetric video. HoloSuite consists of "HoloEdit," an optimized pipeline for the editing and compression of volumetric video, and "HoloStream," an adaptive streaming solution that streams polygons directly to the viewer.

Arcturus also helps others navigate the volumetric video ecosystem. Consulting work includes Hulu's Emmy-nominated "Light as a Feather" Magic Leap activation at VidCon, and Madonna's volumetric holograms at the 2019 Billboard Music Awards, the first-ever live broadcast of a volumetric performance and the winner of the Advanced Imaging Society Lumiere award.

You can now see Smithsonian Museum’s exhibitions through Instagram AR

Since we can’t really visit many places nowadays, there are solutions that let us experience them at least virtually. And now, you can even do it through Instagram. The platform has added exhibitions from the Smithsonian Museum and two other museums to its AR effects lineup, so you can “visit” exhibitions from your phone. Other […]


Zylia navigable audio system: a 3D sound experience

Here is a demo for you to listen to: the whole 3D audio recording of the Poznań Philharmonic Orchestra concert, with 34 musicians, 30 microphones, and 600 audio channels, totals 3 hours and 700 GB of audio data. Enjoy!

Zylia, an industry leader in the field of 3D audio recording, continues to promote its solution for everything from VR and AR to games and concert streaming. The most recent example is a 4-minute demo that takes you to the stage as the Poznań Philharmonic Orchestra plays the overture from W. A. Mozart's "The Marriage of Figaro". The video, which premiered recently on YouTube and Facebook, shows how it works: each listener can chart their own audio path and get a real being-there sound experience.

With 34 musicians on stage and 30 Zylia 3rd-order Ambisonics microphones, the company showed the possibilities of the Zylia 6 Degrees of Freedom Navigable Audio system – a breakthrough in sound streaming. Zylia's system offers a new way of experiencing sound through the Internet, which may be just what we need, now that so many things are being offered online, either pre-recorded or live.

Zylia notes that the virtual concert with the Poznań Philharmonic Orchestra is a "real being-there sound experience", but I dare to suggest that it goes beyond that: you're not just sitting in the third row in front of the stage. Through the magic of the system, you can move around the stage and listen to the music from different spots, following the musical flow or focusing on a specific instrument. No other sound system allows you to do that, as far as I know!

Number of streaming events continues to grow

While it may not be as perfect as some wish, Zylia's 6 Degrees of Freedom Navigable Audio system puts on an impressive show. We've covered the system before, here at ProVideo Coalition, and this new example will immerse and convince you. Just get your headphones and sit back while watching the video. It's not difficult to imagine what this can do for a whole variety of musical experiences, not to mention how much sense the system makes for Virtual Reality experiences, where the viewer is placed inside a 3D space and 3D sound is a necessity.

The demo with the Poznań Philharmonic Orchestra comes at a time when the number of streaming events continues to grow. After an initial period when no one knew exactly what to do next and everyone hoped that "next month we will be back to normal again", everybody, from musicians and comedians to actors and television personalities, is now connecting with their public through Facebook, YouTube, Twitter, Instagram, Twitch, Zoom, TikTok and their own websites.

The pandemic moved the whole world online, and from live performances at home or at venues without an in-house audience to pre-recorded material – we have all seen our share of Zoom concerts – artists are trying to cheer up their fans and offset their losses from canceled or postponed live events. Either way, the public demand for streamed footage is huge and still growing. The American magazine Billboard publishes a weekly list of live streams and virtual concerts to watch during the coronavirus crisis. Since March 2020, the list has covered over 800 events, and that's just the beginning, as there is almost everything there: concerts, digital festivals, virtual exhibitions, panel talks, readings, movies, theater plays and many others.

Broadcasting concerts during a pandemic

Audio is a key part of any performance, and we've seen over these last months how audio remains a problem: while webcams and video technology have reached a point where the image is good enough for professional use in many cases, audio is not always at the same level of quality. As streaming, live or pre-recorded, becomes not an afterthought but plan A for many artists, big or small, those players need to find a way to stand out from the growing crowd when showing their work through the Internet.

Zylia believes the company has a solution: its unique 6 Degrees of Freedom Navigable Audio recording can be a breakthrough in the streaming of sound. Their work with the Poznań Philharmonic Orchestra, from Poland, is a real-world example of what can be achieved. "In the last months, we faced a problem of making concerts without a public. We've been broadcasting the concerts, but it produced a lot of problems on how to show the acoustics of the hall," says Łukasz Borowicz, the Principal Guest Conductor of the Poznań Philharmonic Orchestra.

Zylia's solution, used during the Poznań Philharmonic Orchestra concert, made it possible to record the entire sound field around and within the performance. For the listener, it means they can walk through the audio space freely: they can approach the stage, or even step onto it to stand next to a musician. At every point, the sound they hear is a bit different, as in real life. "This is something particularly important in the Zylia system," adds Borowicz. "It makes it possible to listen to the orchestra not only from one fixed place. You can go to the balcony or the last row and the acoustics change."

Behind the Scenes of a concert

The recording with the Poznań Philharmonic Orchestra using Zylia’s 6 Degrees of Freedom Navigable Audio system was made with:

  • 30 ZM-1S mics – 3rd order Ambisonics microphones with synchronization
  • Two MacBook Pros – for recording
  • A single Linux PC workstation – as a backup for the recordings
  • 600 audio channels – 20 channels from each ZM-1S mic multiplied by 30 units

The result can be experienced in the demonstration video and explored further in the Behind the Scenes video also published by Zylia. The listener can navigate smoothly around the scene and experience the sound from any place: a close perspective, the backstage, or the audience. Notice how the sound changes while passing by the musicians. The "6 Degrees of Freedom" in the name of Zylia's solution refers to the six directions of possible movement: up and down, left and right, forward and backward, plus rotation left and right, tilting forward and backward, and rolling sideways.
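Zylia's actual renderer interpolates full Ambisonics sound fields, but a toy sketch conveys the core idea of position-driven listening: as the listener moves, the mix re-weights the nearby microphones. Everything below (mono signals, inverse-distance weighting, the example positions) is a simplification for illustration, not Zylia's algorithm.

```python
# A toy illustration of position-driven mixing for navigable audio. Zylia's
# 6DoF renderer works on Ambisonics sound fields and is far more
# sophisticated; mic positions and signals here are hypothetical.
import numpy as np

def navigable_mix(listener_pos, mic_positions, mic_signals, eps=1e-6):
    """Blend mono mic signals by inverse distance to the listener.

    listener_pos: (3,) array, metres
    mic_positions: (M, 3) array of microphone positions
    mic_signals: (M, N) array of per-mic audio samples
    """
    d = np.linalg.norm(mic_positions - listener_pos, axis=1)
    weights = 1.0 / (d + eps)
    weights /= weights.sum()      # normalize so the mix keeps unit gain
    return weights @ mic_signals  # (N,) mixed signal at this position

# Hypothetical example: three mics on a line, listener near the first one.
mics = np.array([[0.0, 0, 0], [2.0, 0, 0], [4.0, 0, 0]])
signals = np.random.randn(3, 48_000)  # one second at 48 kHz per mic
out = navigable_mix(np.array([0.5, 0, 0]), mics, signals)
```

Re-running the mix every few milliseconds as the listener's tracked position updates is what produces the "walking past the musicians" effect described above.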

For the team at Zylia responsible for the project, this is an example of what the future can bring. Tomasz Żernicki, CEO of Zylia, says: "When we look into the future, what we see is a completely different world of interaction with creative art. We see streaming platforms enabling people to participate in concerts of any artists they want. Platforms that they can enter from their homes and join the experience with their friends. We see hyper-realistic games, both visually and audibly, allowing players to fully immerse themselves in the plot. Finally, we see virtual reality creations, available for everybody from anywhere in the world and limited only by the imagination of their designers. Zylia Navigable Audio is a huge step in that direction, and we are very proud of the impact that we are having on the future of 3D audio technology."

One final word, from Wojciech Nentwig, director of the Poznań Philharmonic Orchestra: “I am not a fan of experiencing classical music through electronic media. However, I admit that the recording made by the Zylia team surprised me very pleasantly. I am glad that thanks to this innovative form of recording, classical music can reach even wider audiences. Congratulations to the Zylia team for an interesting research and such achievement.”

Realtime SetMapping used for Amazon’s “Inside The Boys”


When Amazon approached Aggressive to design the set and show package for its "Prime Rewind: Inside The Boys," the studio spearheaded a technique to eliminate green screen and post-production.

Projection mapping, LED-driven visuals and elements of Augmented Reality and Extended Reality are integral parts of the toolbox of Aggressive (www.aggressive.tv), the New York production and design studio. Headed by Grammy Award-winning filmmakers Alex Topaller and Dan Shapiro, the creative production company crafts commercials and music videos for some of the world's biggest brands and artists, such as Samsung, Boeing, Ford, Toyota, Michael Jackson, Pharrell, Stone Temple Pilots and Tiesto, among many others.

So it’s no surprise that when Amazon and Embassy Row approached the studio to design the set and show package for their “Prime Rewind: Inside The Boys” series – which takes viewers on a behind-the-scenes exploration of its hit action-comedy “The Boys” – the company built on its experience and spearheaded a technique that unified the look of the entire project while eliminating the need for complex green screen shooting and months of intense post-production. At the same time, this process dramatically reduced the number of crew required to be present on set.


The Realtime SetMapping technique

Aggressive has dubbed this technique Realtime SetMapping, and is ready to roll it out on a broad scale to clients across the spectrum of broadcast promotion, branded content and advertising. For “Inside The Boys,” which comprises nine 30-minute episodes hosted by TV personality Aisha Tyler, Aggressive created a show open and accompanying branding.

To execute the package, they created an immersive virtual CG set where Tyler could dish on the series, interview cast members and provide fans with additional show-related content. Displayed on an array of connected 4K LED panels, this CG environment tracked camera positions, allowing the cameras to move freely in any direction while the backdrop shifted and changed perspective correctly for each corresponding camera angle, in real time.
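Aggressive hasn't detailed its tracking math, but the principle behind perspective-correct LED backdrops is the classic off-axis ("generalized perspective") projection, well known from Kooima's formulation: each tracked camera position yields a fresh asymmetric frustum against the fixed LED wall. Here is a minimal sketch of that idea; the screen corners and near/far values are whatever the stage and engine define.

```python
# A minimal sketch of the off-axis projection behind camera-tracked LED
# walls, after Kooima's generalized perspective projection. This only
# illustrates the principle, not Aggressive's pipeline.
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """Frustum for camera position pe looking at a rectangular screen.

    pa, pb, pc: lower-left, lower-right, upper-left screen corners (world).
    """
    vr = (pb - pa) / np.linalg.norm(pb - pa)   # screen right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)   # screen up axis
    vn = np.cross(vr, vu)
    vn /= np.linalg.norm(vn)                   # screen normal, toward the eye

    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -np.dot(va, vn)                        # eye-to-screen distance
    l = np.dot(vr, va) * near / d              # asymmetric frustum extents
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard OpenGL-style frustum matrix from the asymmetric extents.
    # (Kooima's full method also rotates into the screen basis and
    # translates by -pe to form the matching view matrix.)
    return np.array([
        [2*near/(r-l), 0,            (r+l)/(r-l),            0],
        [0,            2*near/(t-b), (t+b)/(t-b),            0],
        [0,            0,           -(far+near)/(far-near), -2*far*near/(far-near)],
        [0,            0,           -1,                      0],
    ])
```

Recomputing this frustum per camera, per frame, is what keeps the backdrop's perspective correct from every tracked angle.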

AR and XR content was then layered on top of the footage being captured on the virtual set, with all the elements recorded simultaneously. The BTS video above shows how the Realtime SetMapping technique works.


Minimal post production

“Our Realtime SetMapping technique allowed multiple cameras to move freely through the space for each of the episodes, capturing the host interacting with guests in a dynamic, believable and authentic way,” explains Aggressive Executive Creative Director Alex Topaller.

“The virtual set enabled custom depth-of-field and dynamic lighting effects driven by the LED tiles, real-time video guest appearances, as well as a slew of in-show animations and Easter eggs that we were able to trigger on cue,” adds Executive Creative Director Dan Shapiro. “With all the visuals being rendered in real-time, last minute adjustments to art direction, virtual propping, virtual set dressing and lighting were done smoothly, easily and efficiently on the shoot day. This allowed us to deliver the final results in camera, which in turn led to minimal post production.”


From “Inside The Boys” to Alicia Keys

“Blending cutting edge Realtime SetMapping with smart visual design, we shaped the look of the show and transformed it into a novel live execution that pushed the boundaries of the medium and elevated the concept of the entire season,” notes Topaller.

Aggressive also developed and directed an hour-long XR live performance, which premiered on September 18th, 2020, to launch Alicia Keys’ eponymous new album ALICIA. American Express and Momentum engaged the company to create the event as part of American Express’ prestigious UNSTAGED concert series.

While the event was "virtual", filmed on a stage and streamed around the globe, it was key for Aggressive to deliver the authenticity and emotional impact of a real concert experience, so the studio opted for a multi-camera, long-form broadcast approach. This way, Alicia and her band could play each act in its entirety, settling into a groove and allowing Aggressive to capture the spontaneous energy and magic of the performance in the moment, as it happened.