
CAPCOM GO! Apollo VR Planetarium: a celebration of Apollo 11

Put your Virtual Reality headset on and enter your own FREE planetarium in CAPCOM GO! Apollo VR Planetarium, created to celebrate the 50th anniversary of Apollo 11 in July 2019.

Initially released in December 2019 and last updated in January 2020, CAPCOM GO! Apollo VR Planetarium, from NSC Creative, is a FREE Virtual Reality experience that serves as an introduction to a series of DLCs (downloadable content) that expand the original idea. Constructed to celebrate the launch of Apollo 11, NASA’s first mission to land humans on the Moon, the title transports the viewer to, as the creators say, “a state-of-the-art facility with the latest opto-mechanical starball, film projectors and the first ever holoscope!”

The VR planetarium, compatible with PCVR headsets such as the Valve Index, the Oculus Rift, the Vive family or the portable Quest (via Link), is an easy way to get a glimpse of the catalog available from the title’s creator, NSC Creative, an award-winning UK studio with 20 years of experience in immersive media and a specialist in immersive experiences for theme parks, VR, AR, MR, domes, 3D/4D, museums and science centres.

The planetarium in CAPCOM GO! Apollo VR Planetarium is your mission control room. Take a seat behind the control desk and use your VR controllers to explore each console. Your planetarium comes equipped with the ability to watch eye-popping trailers and full-length immersive films which are out of this world, revealing some of the titles produced by NSC Creative. Using the VIDEO MODE, put a video tape in the player and enjoy the show!

Take a seat and immerse yourself

Besides a series of activities to try, the app, which serves also as the base upon which the series is expanded, includes trailers for the following NSC Creative planetarium shows:

  • CAPCOM GO! The Apollo Story
  • We Are Stars
  • We Are Aliens
  • Astronaut 3D

The STARBALL MODE opens a window towards the big sky. Enjoy a relaxing tour of the night sky: your new star projector can resolve over a billion stars and comes with custom space-race-themed constellation artwork slides. Change the precession to see the cosmos roll past overhead. (Please note this is not a 100% accurate simulation of the night sky.)


The HOLOSCOPE mode is another key element of the experience offered by the app. No other planetarium in the world has one of these. Grab an object from your desk and place it on the transmogrification pad to project a holographic visualization into the planetarium. You can almost reach out and touch it! Hear about the amazing feats of engineering that took Apollo Astronauts to the moon and back.

CAPCOM GO! Apollo VR Planetarium can be the start of a space adventure, as you take a seat and immerse yourself in one of NSC Creative’s 3D dome movies, explore the interactive planetarium controls to discover stars, constellations, holograms and lasers, celebrate the Moon landings, explore the wonders of the cosmos and take a tour of the International Space Station. If you want to take the experience further, there are three other Virtual Reality apps available:

CAPCOM GO! The Apollo Story (Paid)

CAPCOM GO! The Apollo Story is an award-winning planetarium film produced by the National Space Centre, UK, that celebrates the achievements of the Apollo program and what it took to land the first humans on the Moon. Screening in 100+ planetariums worldwide, it can now be watched in the comfort of your own VR headset in the CAPCOM GO! Apollo VR Planetarium. This DLC content gives you access to the full-length 26-minute, 4K 3D 60fps 360° film with English, French, German and Italian narration and subtitles.

We Are Stars (Paid)

We Are Stars is an award-winning planetarium film produced by the National Space Centre, UK, that explores the creation of the Universe, our Solar System and the origins of life on planet Earth. Screening in planetariums worldwide, it can now be watched in the comfort of your own VR headset in the CAPCOM GO! Apollo VR Planetarium.

We join the Time Master, narrated by Hollywood superstar Andy Serkis: a Victorian gent with his very own time tent who whisks us off on a 13.8 billion year adventure. With expert input from leading scientists, cosmologists, astrophysicists, astrochemists, planetary scientists and astrobiologists, the film presents humanity’s current understanding of where everything, including us, came from.

This DLC content gives you access to the full-length 26-minute 3D film with English narration and subtitles.

ISS 360° Tour with Tim Peake (Free)

ISS 360° Tour with Tim Peake is an educational planetarium film produced by the National Space Centre, UK that takes you through a narrated tour of the International Space Station. Screening in planetariums worldwide, you can now watch it in the comfort of your own VR headset in the CAPCOM GO! Apollo VR Planetarium.

Working together with BAP (the British Association of Planetaria), NSC Creative has produced a short immersive 360° experience which takes the viewer around the ISS (International Space Station) to see what it is like to live and work in space. It is narrated by British astronaut Tim Peake, who successfully completed his six-month Principia mission. We also get to see footage of Tim shot onboard the ISS specifically for this project using a DSLR and fisheye lens.

The project was supported by the UKSA (UK Space Agency) and ESA (European Space Agency) with additional support from Explorer Dome, Centre for Life and the National Space Centre.

This DLC content gives you access to the 15-minute film as a free addition to the primary experience.

Expand your Virtual Reality library

The CAPCOM GO! series is available for fulldome experiences, iOS, Android, Google Cardboard/Daydream, Oculus Rift S, HTC Vive, Valve Index and Windows Mixed Reality. Although you can watch the content on a flat screen, virtual reality headsets offer, as expected, the most immersive experience, and one that you can explore from the comfort of home.

The two paid titles in the collection, We Are Stars and CAPCOM GO! The Apollo Story, can be acquired for less than $10, making the collection a bargain if you want to expand your Virtual Reality library with more space-related titles. We’ve previously suggested, here at ProVideo Coalition, titles such as The Edgar Mitchell Overview Effect VR Experience, Space Explorers, the largest production ever filmed in space, or the Apollo 11 VR Experience.

Maxon: May 3D & Motion Design Show coming soon

From using Cinema 4D to tell cinematic stories to a project around a mythological femme fatale, the May edition of the 3D & Motion Design Show covers a variety of work by 3D and motion graphics artists.

Maxon’s free virtual event for 3D visual artists is packed with industry luminaries sharing insights, workflows and techniques for the Maxon suite of creative tools. The lineup of speakers for the May 3D & Motion Design Show is one good reason to attend the virtual event, a series of artist-focused presentations for 3D and motion graphics artists. During the May show, on May 19, 2021, Maxon will host presentations from professionals across different segments of the industry.

On the heels of the three-day April 3D & Motion Design Show, which launched Cinema 4D S24, attendees will get a closer look at how some of the most dynamic 3D artists are leveraging the Maxon suite of creative tools.

Featured speakers for the May session include:

  • Brandon Parvini – a freelance motion designer and director based in Los Angeles. With over a decade of experience, Parvini moves between various lead roles, including creative direction, art direction, design, senior 3D artist, technical direction, look development and consultation, excelling in the development of nontraditional pipelines and workflows.
  • Chris Bjerre – a San Francisco-based multidisciplinary artist with over a decade of experience in the motion graphics industry. His diverse and vast body of work includes feature films, commercials, title sequences, music videos, games, VR, and experiential design.
  • Frederic Colin – a director and designer based in Paris. His recent CG passion project, “Medusa: The Fallen Goddess” used Maxon tools to recreate the story of this mythological femme fatale.
  • Martin Vanners – after doing photography for over 15 years, Martin Vanners now uses Cinema 4D to tell cinematic stories. Martin will be demonstrating how he uses his photography skills to art direct his 3D renderings.

Presentations for all Maxon 3D and Motion Design Shows are streamed live and available on demand shortly after, including on the Maxon YouTube channel. Viewers will be able to interact and send in questions via chat for the live Q&A sessions with artists.

Vazen has a new 50mm T2.1 1.8X anamorphic lens for full frame

Vazen has unveiled the 50mm T2.1 1.8X anamorphic lens, adding an ultra wide-angle option to its full frame EF/PL 1.8x anamorphic lens line-up.

Known for its Micro Four Thirds lens set, which comprises 28mm, 40mm and 65mm focal lengths, the Chinese company Vazen is expanding its full frame lineup with a new lens, the 50mm T2.1 1.8X anamorphic, which joins the Vazen 85mm T2.8 1.8X Anamorphic, the first of a promised three-lens set. The 85mm was announced in August 2020, with the two other lenses promised, then, for late 2020 or early 2021. The time is now, apparently, as Vazen announced the new Vazen 50mm T2.1, which joins the available 85mm. According to the company, a 135mm lens is expected to be ready to ship in 1-2 months to complete the 3-lens set (50 – 85 – 135) in phase 1.

We’ve covered the Micro Four Thirds lenses here at ProVideo Coalition, and when announcing the Vazen 65mm T2 1.8x Anamorphic lens we noted, following the information available from Vazen, that the lens was “the world’s first 1.8x anamorphic prime designed for Micro Four Thirds cameras with 4:3 sensors. Characterized by its front anamorphic design, the VZ Anamorphic prime delivers a buttery smooth oval bokeh, signature blue horizontal flare and the widescreen cinematic look.”

One more note that may be of interest if you’re a user of Canon EOS R mirrorless cameras: the kit introduced for Micro Four Thirds is also available, according to Vazen, with Canon RF mount, if you want to explore this anamorphic solution with a full frame sensor.

For large format cinema cameras

The new Vazen 50mm T2.1 is designed to fully cover large format cinema cameras such as the RED Monstro, Alexa LF, Kinefinity Mavo LF and Z-CAM E2-F8. The lens features a super compact and lightweight design: weighing merely 3.42 pounds (1.55kg) and measuring 5.24” (13.3cm) long, it is one of the world’s smallest 1.8x anamorphic lenses for full frame cameras. Its compactness allows it to be balanced on gimbals and rigs very easily, says Vazen.

The 50mm has the same focus and aperture ring positions as the 85mm for easy lens switching on set, and a consistent 86mm front filter thread is handy for installing ND filters or diopters. The front diameter is a standard 95mm for matte box mounting. The entire lens is built of aluminum, and the independent aperture and focus rings are fitted with 0.8 mod cine gears. The lens is available with interchangeable PL and EF mounts; both mounts and shims ship with the lens in a Vazen hard case.

All Vazen 1.8x anamorphic lenses feature a front anamorphic design, delivering a buttery smooth oval bokeh, a signature blue (but not oversaturated) horizontal flare and the widescreen cinematic look. The lens also features an ultra-wide 65° horizontal field of view (similar to a 28mm spherical lens), and the closest focusing distance is 3.6” from the sensor.
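The “similar to a 28mm spherical lens” comparison follows from simple trigonometry: a 1.8x front anamorphic squeezes 1.8 times more horizontal scene onto the same sensor width, widening the effective field of view. A quick sketch of that arithmetic (assuming a 36mm-wide full frame sensor; function names are ours, for illustration, not Vazen’s published optical data):

```python
import math

SENSOR_WIDTH_MM = 36.0  # assumed full frame horizontal dimension

def spherical_hfov(focal_mm, sensor_w=SENSOR_WIDTH_MM):
    """Horizontal field of view, in degrees, of a spherical lens."""
    return math.degrees(2 * math.atan(sensor_w / (2 * focal_mm)))

def anamorphic_hfov(focal_mm, squeeze, sensor_w=SENSOR_WIDTH_MM):
    """An anamorphic captures `squeeze` times more horizontal scene
    on the same sensor width, so the effective capture width grows."""
    return math.degrees(2 * math.atan(sensor_w * squeeze / (2 * focal_mm)))

print(f"50mm with 1.8x squeeze: {anamorphic_hfov(50, 1.8):.1f} degrees")  # ~65.9
print(f"28mm spherical:         {spherical_hfov(28):.1f} degrees")        # ~65.5
```

Both figures land within a degree of the 65° Vazen quotes, which is why the 50mm anamorphic frames horizontally like a 28mm spherical lens.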


A $2,000 discount for the pair

The lens, Vazen claims, delivers “outstanding sharpness, even wide open, which is unparalleled by other PL/EF anamorphic lenses with a similar squeeze ratio.” Vazen chose to adopt a 1.8x squeeze design to balance the anamorphic character and the resolution of the image. The 1.8x squeeze can produce a cinematic widescreen 2.39:1 aspect ratio when paired with 4:3 sensors. When paired with 16:9 sensors, much less data (than with a 2X anamorphic lens) needs to be cropped away to create the desired 2.39:1 ratio.
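The squeeze-ratio trade-off is easy to check numerically. This minimal sketch (function names are ours, for illustration) desqueezes a sensor’s native aspect ratio and computes how much horizontal image must be cropped to reach 2.39:1:

```python
def desqueezed_ratio(sensor_w, sensor_h, squeeze):
    """Aspect ratio after horizontally desqueezing anamorphic footage."""
    return (sensor_w / sensor_h) * squeeze

def width_crop_fraction(sensor_w, sensor_h, squeeze, target=2.39):
    """Fraction of the desqueezed width discarded to reach `target`."""
    native = desqueezed_ratio(sensor_w, sensor_h, squeeze)
    return max(0.0, 1 - target / native)

print(f"4:3 sensor, 1.8x squeeze: {desqueezed_ratio(4, 3, 1.8):.2f}:1")  # 2.40:1
print(f"16:9 + 1.8x, crop to 2.39:1: {width_crop_fraction(16, 9, 1.8):.0%}")  # ~25%
print(f"16:9 + 2.0x, crop to 2.39:1: {width_crop_fraction(16, 9, 2.0):.0%}")  # ~33%
```

On 4:3 the 1.8x squeeze lands almost exactly on the 2.39:1 scope ratio, and on 16:9 roughly a quarter of the width is cropped versus a third with a 2x lens, which is the data saving the paragraph above describes.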

The lens is currently available to order from authorized resellers and from the Vazen website, and is expected to ship in late August, with free priority shipping provided. The US retail price is USD 8,000 per lens, and a $2,000 discount will be offered for a 2-lens (50 + 85) purchase.

Seven companies launch the Volumetric Format Association

We’ve seen the term being used more and more, and now seven companies – Verizon, ZEISS, RED Digital Cinema, Unity, Intel, NVIDIA, and Canon – have announced the Volumetric Format Association.

Volumetric video has been a buzzword in recent months. ProVideo Coalition has covered the technology several times, from the announcement of the volumetric films premiere at the 77th Venice International Film Festival in September 2020, when Intel Studios showcased volumetric production with two originals honored with selection in the festival’s VR Expanded program, to the Polymotion Stage Truck, the world’s first volumetric studio on wheels, also presented last September.

Last February ProVideo Coalition announced a major step in the use of volumetric video, as Arcturus, the global leader in authoring, editing, and distributing volumetric videos, announced a new partnership with Japan’s top wireless carrier, NTT DOCOMO, to stream volumetric videos using the power of 5G. This May, Arcturus announced a $5 million seed round led by BITKRAFT Ventures with participation from leaders in the sports, telecom, gaming, and music space. The funding will be used to scale the software development team, focus efforts on sales growth, and expand the product line with an emphasis on live streaming features.

A more immersive and interactive experience

It’s against that background that the Volumetric Format Association (VFA), the first industry association dedicated to ensuring interoperability across the volumetric video ecosystem, is born, launched by seven founding member companies: founding charter member Verizon as well as ZEISS, RED Digital Cinema, Unity, Intel, NVIDIA, and Canon. The VFA brings together companies from key industry verticals in the volumetric ecosystem to work together to establish a collection of specifications driving adoption of volumetric capture, processing, encoding, delivery, and playback.

Volumetric video is an innovative way of experiencing content in holographic 3D, allowing the viewer a more immersive and interactive experience. It is characterized by the process of simultaneously capturing content from multiple cameras which can then be viewed from any angle at any point in time. The Volumetric Format Association seeks to facilitate collaboration, innovation, and technology sharing of the new bandwidth-intensive industry format, in order to drive faster development, adoption and ecosystem growth.

“Verizon is proud to be the driving force behind creating this group which we believe will be crucial to laying the foundation for future innovation using volumetric video,” said Denny Breitenfeld, president of the VFA and director of volumetric technology at Verizon. “With 5G’s massive bandwidth, super-fast speeds and ultra-low latency, things like volumetric video capture will be taken to the next level, enabling innovators to turn real people into lifelike, moving 3D holograms in just minutes. We look forward to working with leaders in this industry to develop specifications that will propel volumetric video forward.”

Experience sports like never before

Volumetric Format Association membership allows companies to share intellectual property within the confines of the organization while protecting the value of that IP. The VFA Technical Steering Committee has defined four initial working groups to begin the work of building end-to-end specifications:

  • Capture Acquisition
  • Interchange of Data
  • Decode & Render
  • Persistent Metadata

3D content experiences across sports, entertainment, productivity, enterprise services, and more are fast becoming a reality. The VFA is working to make that reality accessible and practical by developing specifications that will be a catalyst for industry growth. To that end the Volumetric Format Association is opening membership to innovative companies in the volumetric ecosystem. VFA welcomes input, creativity, and collaboration as together they pioneer a new medium.

“Canon is a strong believer in volumetric video technology as evidenced by the new kind of sporting experience we delivered at the 2019 Rugby World Cup in Japan. In addition, we opened a volumetric video studio in Kawasaki, Japan that is used by the most advanced creators,” said Mr. Atsushi Date, senior general manager, Canon SV Business Development Center. “Canon is confident that volumetric video technology will enable the world to enjoy sports and entertainment more in many ways that have never been experienced before, and we have high expectations for the results of VFA standardization. This is a big first step of many to come in the rollout of the Canon Volumetric Business.”

Democratizing volumetric video solutions

“As media content evolves from 2D to 3D, volumetric video will impact industries from entertainment to a wide range of enterprise services. In addition to the breadth of our XPU portfolio that spans network, FPGA, CPU and GPU technologies, Intel is working with the Volumetric Format Association and our ecosystem partners to bring advanced and cost-effective volumetric streaming to a variety of customers,” said Lynn Comp, vice president, Data Platforms Group, general manager, Visual Infrastructure Division & NPG Strategy at Intel. “The VFA is an important step in democratizing end-end volumetric video solutions, enabling global supply chains and supporting interoperable solutions and innovation.”

“Volumetric video is changing how customers experience virtual interactions by providing complete immersion with full interactivity that is powered by NVIDIA GPUs,” said Bob Pette, general manager of professional visualization at NVIDIA. “Accelerated computing technology is pushing the boundaries of virtual experiences and bringing together partners to evolve the ecosystem as a whole will be critical to achieving the next generation of AI-driven 3D content.”

“As a brand that was founded on innovation, disruption, and pushing the bleeding-edge of image capture technology, RED is proud and excited to be part of the Volumetric Format Association,” said Brian Henderson, vice president, business development at RED. “We will continue to evolve alongside this emerging industry. As a member of the VFA, we will continue to re-define previously held standards of overall image quality, resolution and camera performance in order to provide the best possible images for creators and innovators to work with.”


The biggest shift since B+W went to color

“The Volumetric Format Association is leading the way in how content, especially in sports and M&E, is being presented, and we are seeing the biggest shift since B+W went to color,” said Tian Pei, head of business affairs for sports at Unity. “Volumetric technologies encompass the process of capturing, viewing, and interacting with the real world and here at Unity we are pushing the limits of what’s possible and transforming the interactive experience forever.”

“As board members of the Volumetric Format Association, ZEISS strives to abridge the current adoption barriers within the motion picture industry and beyond”, says Snehal Patel, head of cinema sales at ZEISS. Patel continues, “with over 100 years of experience in optical design and innovation, ZEISS is primed to provide key insights into lens technology and its relation to Volumetric workflow. Furthermore, through the VFA specifications, ZEISS hopes to contribute to the accuracy of lens metadata which will play a vital role in the volumetric user experience.”

As the Volumetric Format Association members move rapidly towards the establishment of their specifications, companies interested in joining the organization can contact the association by email or through its website.

NVIDIA Broadcast 1.2: turn any room into a home studio

No more dog barks, room echo or video noise: those are some of the reasons to use the new NVIDIA Broadcast 1.2. All you need is a compatible NVIDIA GPU.

NVIDIA Broadcast is a universal plugin that works with most popular live streaming, voice chat and video conferencing apps. Introduced last September, the app is supported on any NVIDIA GeForce RTX, TITAN RTX or Quadro RTX GPU, using their dedicated Tensor Core AI processors to help the app’s AI networks run in real-time, right alongside your other software. The features in NVIDIA Broadcast can be used in anything from game broadcasting to video conferencing with Zoom or any other situation where a sound and image connection to the world is needed.

As millions of people discovered when forced to work from home, backgrounds and noise became part of any home broadcast, problems that software companies tried to solve with different apps. NVIDIA had been playing with green screen effects and noise reduction for all those streaming from home, so the next evolution was a tool that offered an integrated solution for NVIDIA GPU users. That’s how NVIDIA Broadcast appeared, to allow the world’s 20 million live streamers – and all the others – to turn their bedrooms, living rooms and kitchens into a broadcast studio.

The NVIDIA Broadcast app has now been updated to version 1.2, bringing new room echo removal and video noise removal features, updated general noise removal and the ability to stack multiple effects for NVIDIA RTX users. Version 1.2 includes new effects that enhance your microphone and webcam quality, plus features requested by the streaming community. NVIDIA Broadcast technologies are also being integrated natively into top streaming apps such as OBS Studio, AVerMedia and Notch.

The new Room Echo Removal

According to the company, “NVIDIA Broadcast uses AI to improve the quality of your microphone, speakers and webcam. RTX GPU owners can apply AI effects to their own microphone, or any incoming audio, removing unwanted noise. AI features also enhance the camera, including Background Removal, which removes a cluttered background from the camera feed, or Auto Frame which keeps the subject centered in the camera, even if they move. These effects run in real-time using Tensor Cores found on GeForce RTX GPUs. It’s the perfect tool to enhance both live streaming and video conferencing.”

The new Room Echo Removal feature is a key element in NVIDIA Broadcast 1.2, as it removes the echoey sound of your voice in rooms with poor acoustics. That echo may not feel so bad at first, but after a long streaming session or more than 400 days of meetings from home, your audience or colleagues will thank you for getting rid of it.

Because better sound deserves a better image, the updated version also includes a beta feature, Video Noise Removal, that aims to make video noise a thing of the past… even for lower quality cameras. While NVIDIA acknowledges that reducing output “noise” – the static that the audience can see on the video feed, especially in low-light situations – will take some time to perfect, the function now included is a first step, helping to reduce video static and deliver a cleaner image.


Able to run multiple AI effects

NVIDIA says that existing features also receive improvements. The new update further enhances the quality and performance of the popular audio noise removal tool, with added profiles to better separate the sound of cats, dogs and even insects (cicadas be gone!). Auto Frame is also getting an update with a “buffer zone” that will let the subject move within the frame, without the camera updating. Now, it’ll move only when the talent leaves the middle-third of the screen.

Feedback from users has played a key part in the introduction of new features. For example, one bit of feedback, now incorporated in the app, is being able to run multiple AI effects on a single device. Now users can combine Background Blur with Auto Frame, or any combination they choose.

Follow the link to download the updated version of the NVIDIA Broadcast app.

Read this article about Internet video editing and then TikTok

If you’re an old fart working in editing and post-production (like me I mean… I’m not a spring chicken anymore) and you haven’t taken time to dive into the world that is short-form internet video, let this article from Vulture called In the Messy Land of Internet Video, the Editor Is King be your introduction. Or better than that is 25 Edits That Define the Modern Internet Video. It’s a nice round-up of what Vulture magazine deems the best-edited (viral?) internet videos. The list runs the gamut of the short life of short-form internet video, reaching all the way back nine years (an eternity on the internet) to old YouTube videos, some Vines and of course TikTok. While the stodgy Casablanca-loving film intellectual might dismiss all that is silly on the portrait-oriented slab of Gorilla Glass they hold in their hand, they are missing some great escapist entertainment. And even if a lot of it is mindless, dumb (even misogynistic at times), there’s a ton of it that is well thought out, well produced and amazingly edited. Even if a lot of that lot is made to look like it’s simple and sloppy.

What’s truly amazing about some of this content is that a lot of it is basically edited in camera. And by “in camera” I mean in the app that it is created in. While you can take your mobile-captured clips and do a cut in iMovie, Premiere Rush or even the comparatively sophisticated LumaFusion, much of it is done entirely within the TikTok (or similar) app. And since the more detailed editing techniques of J/L cuts, audio transitions, trimming and frame f*****g aren’t really part of the package, a lot of that editing is done by trial and error. And redoing. And redoing a redo.

Those of us old enough to remember hooking two VHS VCRs together and editing with the pause button because we didn’t have a flying erase head can probably appreciate a lot of what is going on in these apps. Perhaps you experienced the early tape-to-tape editing consoles like a Sony RM-440, which were limited in their ability to fix a problem, where you had to start your edit from scratch if you didn’t see the glitch at the top of the project. Or maybe you’ll remember the in-camera shooting/editing exercise of film school, where if it didn’t look right when you were done you had to do it over again.

Think back to those days and you can appreciate the hours spent with a tiny mobile phone on a mini-tripod with a ring light redoing hidden transition cut to get it just right. Hell, just search TikTok head spin transition on YouTube and you’ve got your whole day set.

Ignore this stuff at your own peril while you edit away on the 6th video in your 20 video corporate interview series. You’re probably making more money than most of the TikTok transition YouTubers in the search above, but you might not be having as much fun. I know, I’m deep into my own corporate video series between this blog post and taking the dog out for a walk, since we have a gasoline shortage right now and I don’t dare drive to the office.

On a personal note, I’ve got two young boys and as they’ve wanted to get into this internet world of content creation they’ve realized it’s a lot harder and more time consuming than they thought to actually make engaging content for the world to see. I set them up a TikTok channel for our new(ish) dog and they enjoy creating videos about him. But they got a lesson in the algorithm recently.

Early SirRiggy videos were mainly simple, silly affairs and despite hash-tagging away, views were limited.


A dog plays with his ball. #dog #dogsoftiktok #dogs

♬ Cute – Tik Tok

Once they started getting this idea of in-camera-TikTok-app editing things got a bit more sophisticated.


It is time to fetch the stuffy Sir Riggy!! #dog #dogs #dogsoftiktok #stuffy #bench #dogpark #dogparkfun


And then came the realization that maybe tagging a consumer product might help in the viewership.


Dogs like ice cream too. #dog #dogs #dogsoftiktok #dogsters #dogstersicecream

♬ Ice Cream Man – Geof Johnson

But leave it to a random encounter with a dog gone crazy, a dead possum and a realization of “dad, give me your phone, this will make a great TikTok” to be the video that suddenly got views into the thousands and, in little boys’ minds … went viral.


Riggy found this on are morning walk#dog #dogs #dogsoftiktok #possum

♬ original sound – user7186778854348

It isn’t always the editing, or the concept, or the quality. There are a lot of unknowns that go into making the internet video that you can be proud of, or that gets a lot of views, or that gets a lot of likes and comments, or that ends up making you millions. Sometimes it’s out of your control, but you don’t get anything from content creation if you don’t try it. And hopefully have fun in the process.

Ok, this blog post went off the rails. The purpose of it was to link to the 25 Edits That Define the Modern Internet Video article so you might enjoy it as much as I did. Under his eye.

New NVIDIA Studio laptops for photographers and video editors

New GeForce RTX gaming and Studio laptop models – more than 140 – are now available, including new RTX 3050 and 3050 Ti models, some bringing Studio laptops to the mainstream.

Both AMD and NVIDIA are in a race for the fastest, most powerful graphics cards… and your money. NVIDIA has just made its move, with the introduction of NVIDIA Studio laptops from Dell, HP, Lenovo, Gigabyte, MSI and Razer. The new Studio laptops are powered by GeForce RTX 30 Series and NVIDIA RTX professional laptop GPUs, including designs with the new GeForce RTX 3050 Ti and 3050 laptop GPUs, and the latest 11th Gen Intel mobile processors.

The invitation from NVIDIA is clear: create in record time using the new solutions. With the latest NVIDIA Ampere architecture, Studio laptops accelerate creative workflows with real-time ray tracing, AI and dedicated video acceleration. Creative apps run faster than ever, taking full advantage of new AI features that save time and enable all creators to apply effects that previously were limited to the most seasoned experts.

Thirteen new Studio laptops were introduced this time, including:

  • Nine new models from Dell: The professional-grade Precision 5560, 5760, 7560 and 7760, creator dream team XPS 15 and XPS 17, redesigned Inspiron 15 Plus and 16 Inspiron Plus, and the ready for small business Vostro 7510. The Dell Precision 5560 and XPS 15 debut with elegant, thin, world-class designs featuring creator-grade panels.
  • HP debuts the updated ZBook Studio G8, the world’s most powerful laptop of its size, featuring an incredible DreamColor display with a 120Hz refresh rate and a wide array of configuration options including GeForce RTX 3080 16GB and NVIDIA RTX A5000 laptop GPUs.
  • Lenovo introduced the IdeaPad 5i Pro, with a factory-calibrated, 100 percent sRGB panel, available in 14 and 16-inch configurations with RTX 3050, as well as the ThinkBook 16p, powered by an RTX 3060.
  • Gigabyte, MSI and Razer also refreshed their Studio laptops, originally launched earlier this year, with new Intel 11th Gen CPUs, including the Gigabyte AERO 15 OLED and 17 HDR, MSI Creator 15 and Razer Blade 15.

New NVIDIA Studio laptops for photographers and video editorsGeForce RTX 3050 Ti and 3050 Studio laptops are perfect for graphic designers, photographers and video editors, bringing high performance and affordable Studio laptops to artists and students, says NVIDIA. The new line of NVIDIA Studio laptops introduces a wider range of options, making finding the perfect system easier than ever.

  • Creative aficionados that are into photography, graphic design or video editing can do more, faster, with new GeForce RTX 3050 Ti and 3050 laptops, and RTX A2000 professional laptops. They introduce AI acceleration, best-in-class video hardware encoding and GPU acceleration in hundreds of apps. With reduced power consumption and 14-inch designs as thin as 16mm, they bring RTX to the mainstream, making them perfect for students and creators on the go.
  • Advanced creators can step up to laptops powered by GeForce RTX 3070 and 3060 laptop GPUs or NVIDIA RTX A4000 and A3000 professional GPUs. They offer greater performance in up to 6K video editing and 3D rendering, providing great value in elegant Max-Q designs that can be paired with 1440p displays, widely available in laptops for the first time.
  • Expert creators will enjoy the power provided by the GeForce RTX 3080 laptop GPU, available in two variants, with 8GB or 16GB of video memory, or the NVIDIA RTX A5000 professional GPU, with 16GB of video memory. The additional memory is perfect for working with large 3D assets or editing 8K HDR RAW videos. At 16GB, these laptops provide creators working across multiple apps with plenty of memory to ensure these apps run smoothly.

New NVIDIA Studio laptops for photographers and video editorsSupport for top creative apps

The laptops are powered by third-generation Max-Q technologies. Dynamic Boost 2.0 intelligently shifts power between the GPU, GPU memory and CPU to accelerate apps, improving battery life. WhisperMode 2.0 controls the acoustic volume for the laptop, using AI-powered algorithms to dynamically manage the CPU, GPU and fan speeds to deliver quieter acoustics. For 3D artists, NVIDIA DLSS 2.0 utilizes dedicated AI processors on RTX GPUs called Tensor Cores to boost frame rates in real-time 3D applications such as D5 Render, Unreal Engine 4 and NVIDIA Omniverse.

The Studio ecosystem is flush with support for top creative apps. In total, more than 60 have RTX-specific benefits. Here are some:

  • GeForce RTX 30 Series and NVIDIA RTX professional Studio laptops save time (and money) by enabling creators to complete creative tasks faster.
  • Video specialists can expect to edit 3.1x faster in Adobe Premiere Pro on Studio laptops with an RTX 3050 Ti, and 3.9x faster with a RTX 3060, compared to CPU alone.
  • Studio laptops shave hours off a single project by reducing time in playback, unlocking GPU-accelerated effects in real-time video editing and frame rendering, and faster exports with NVIDIA encoding.
  • Color grading in Blackmagic Design’s DaVinci Resolve, and editing using features such as face refinement and optical flow, is 6.8x faster with an RTX 3050 Ti than on CPU alone.
  • Edits that took 14 minutes with RTX 3060 would have taken 2 hours with just the CPU.
  • 3D artists working with Blender who are equipped with a laptop featuring a GeForce RTX 3080-powered system can render an astonishing 24x faster than CPU alone.
  • A heavy scene that would take 1 hour to render on a Macbook Pro 16 only takes 8 minutes to render on an RTX 3080.
  • Adobe Photoshop Lightroom completes Enhance Details on RAW photos 3.7x faster with an RTX 3050 Ti, compared to an 11th Gen Intel i7 CPU, while Adobe Illustrator users can zoom and pan canvases twice as fast with an RTX 3050.

New NVIDIA Studio laptops for photographers and video editorsRegardless of your creative field, Studio laptops with GeForce RTX 30 Series and RTX professional laptop GPUs will speed up your workflow, says NVIDIA.  Starting at $799, GeForce RTX 3050 laptops add immersive ray tracing to the latest games, accelerate performance by up to 2x with NVIDIA DLSS, and increase the speed of video editing by up to 7x vs. a CPU. Compared to GeForce GTX 10 Series laptops, DLSS and the company’s latest innovations can deliver an astonishing 4x boost in games, and up to 2.5x in video editing applications, making your gaming better and your content creation far faster.

World premiere at Tribeca: Breonna’s Garden AR experience

World premiere at Tribeca: Breonna's Garden AR experienceBreonna’s Garden, an AR experience created to honor the life of Breonna Taylor is one of the multiple innovative cinematic experiences at the Tribeca Festival that will take viewers to a new world.

Imagine standing in New York City and then instantly journeying to the bottom of an ocean with a marine biologist. Or walking through a moonlit forest inhabited by a monster. Or getting a tour of Barack Obama’s White House from the former president himself. Or crash-landing on an unknown planet 300 light years from Earth with only a robot as your guide.

World premiere at Tribeca: Breonna's Garden AR experience
Republique, the interactive movie

These unique experiences were all made possible thanks to the Tribeca Festival’s evolution in cutting-edge virtual reality and immersive storytelling. For nearly a decade, it’s become the premiere showcase for thrilling technology that allows creative thinkers and explorers to immerse and interact with the 360-degree world around them. All you need is the willingness to let go — and, on occasion, a headset.

Tribeca’s multimedia program ignited in 2013 with the introduction of the Storyscapes section, a series of five experimental works that capitalized on interactive, web-based, and cross-platform approaches to visual creation. A year later, acclaimed directors Daniel Scheinert and Daniel Kwan introduced Possibilia, a non-linear film in which cast members made real-time choices onstage in front of a live audience. And in 2015, Jeremy Bailenson from Stanford University brought its Virtual Human Interaction Lab to the festival. For the first time, audiences could play quarterback for a football team and walk a mile in someone else’s shoes.

World premiere at Tribeca: Breonna's Garden AR experienceBreonna Taylor: a special tribute

That same spirit continues today, and the 2021 edition of the Tribeca Festival has launched an invitation to the public: take an adventure with a friend as they confront their fears through music, confront cryptic AI, and honor Breonna Taylor with a special tribute from her sister, Ju’Niyah Palmer. These cinematic experiences are all a part of Tribeca Festival’s immersive lineup.

With incredible stories from world-class artists, these immersive experiences push the boundaries of storytelling. Presented by AT&T, Tribeca Festival will celebrate cutting-edge selections with the Storyscapes Award, honoring artists who use innovative ideas to bridge the gap between technology and storytelling.

“2021 is the most dynamic year yet for Tribeca Immersive, especially as it relates to the diversity of experiences,” said Loren Hammonds, VP, Immersive Programming, Senior Programmer, Film & Immersive. “Attendees, both in-person and online, will experience everything from live AI performers and animated adventures to a location-based augmented reality memorial.”

The 2021 Tribeca Immersive Program will happen through outdoor interactive experiences. Outdoor in-person Immersive installations will be located at various locations throughout NYC; these experiences are free and open to everyone throughout the Festival. Those available virtually can be accessed via the Tribeca Festival website. Outdoor locations for each immersive experience will be announced on the festival’s website.

World premiere at Tribeca: Breonna's Garden AR experienceWe Are At Home, a VR installation

The lineup includes a world premiere, Breonna’s Garden, a 15 minutes long AR experience created in collaboration with Ju’Niyah Palmer to honor the life of her sister, Breonna Taylor. Part of Juneteenth programming, the cinematic experience Breonna’s Gard – She who plants a garden plants hope, has Lady PheOnix Ruach Shaddai as project creator with Stuart ‘Sutu’ Campbell as key collaborator.

The full list of immersive experiences includes 28 other titles, covering a wide variety of subjects, from Bystanding: The Feingold Syndrome, an immersive docufiction sharing the confessions of people who witnessed a kayak-rower drown for four-and-a-half minutes and did not jump in, to We Are At Home, a VR installation based on the poem The Hangman at Home by Carl Sandburg, which poses a single question to viewers: “What does the hangman think about when he goes home at night from work?”

Other titles present are JAILBIRDS, which ProVideo Coalition mentioned recently, or Republique, the interactive movie, which is, as the name suggests, an interactive film that plunges the participant into the emotions felt during an attack in the Paris Metro through three parallel storylines, letting you switch from one to another. There, are, in fact, experiences for a wide variety of tastes, a clear indication of how vibrant this section of the festival is.

JAILBIRDS VR: world premiere at Tribeca and NewImages

JAILBIRDS VR: world premiere at Tribeca and NewImagesJAILBIRD: Bwa Kayiman, is a VR narrative experience on a human scale, a fantastic tale about the prison system and human freedom. Discover how a comic book story was adapted to Virtual Reality.

Between virtual reality, comics and animation, JAILBIRDS: Bwa Kayiman, trilogy first episode, is a bittersweet tale about men freedom. In his work, Director Thomas Villepoux uses VR codes and interactive storytelling techniques in an innovative composition. This innovative drama relies on the viewer engagement by alternating objective and subjective points of view for the first time, creating a subtle blend of viewer movement and camera movement.

JAILBIRDS: Bwa Kayiman is a fantastic tale based on the visual universe of Philippe Foerster, Belgian author of black poetry comics (freely adapted from “Paulot s’évade” in “Certains l’aiment Noir” published by Fluide Glacial in 1985). How do you adapt a comic book with such a particular style in a medium that uses real time animation – that is, video game tools that have their own aesthetic?

Fred Remuzat handled the first work. He is an art director specialized in CGI, who can make a character exist in a pencil sketch and then translate it into polygons and vectors. Together, Fred Remuzat and Thomas Villepoux defined the visual style and the design of the characters. Then the animation studio The Pack created the 17 special “shader” that enables this “pencil drawing render”.

JAILBIRDS VR: world premiere at Tribeca and NewImagesA metaphor for virtual reality

Real actors were chosen to embody the characters: Thomas Lemarquis (Snowpiercer (2013), 3 Days to Kill (2014), X-Men: Apocalypse (2016), Blade Runner 2049 (2017)) (Chief Warden), Barry Johnson (Felix) and Elliot Delage (Booker). A particular technical device was set according to three axes: same sound recording system for dialogues as for a shooting, filming facial expressions for characters animation, and filming body movements as an animation reference. An acting performance that gives strength and humanity to the characters.

JAILBIRDS VR: world premiere at Tribeca and NewImagesSet sometimes in a nightmarish prison, sometimes in beautiful landscapes, the experience makes full use of the assets of the VR medium as narrative tools. The story resonates strangely with the promise of virtual reality: a prisoner escapes each night as his eyes magically detach from his body and set off to discover the world that is physically inaccessible to him.

In fact, by using Virtual Reality as the medium to reach audiences, JAILBIRDS: Bwa Kayiman poses an interesting question: What better metaphor for virtual reality? In fact, our eyes escaping to discover the world while our body is trapped in the physical reality is very much what VR offers. In these troubled times it is an experience which finds a particular echo. You’ll need a VR headset to try the experience. Online access will be available with the MOR – Museum of Other Realities) application, which is available via Steam and Vive platforms.

JAILBIRDS VR: world premiere at Tribeca and NewImagesXR3 aims to fully democratize access to VR

JAILBIRDS: Bwa Kayiman will be available from June 9th on both platforms. The MOR app is available to users of the following PCVR headsets: Rift, HTC Vive, Valve Index and Windows Mixed Reality. Oculus Quest’s users can access it by using Oculus Link or Virtual Desktop application. Even if you do not own a VR headset, it will be possible to watch JAILBIRDS: Bwa Kayiman through in situ access, within the framework of the three events and also thanks to the network of partners “lieux satellites” deployed throughout the world and in which dedicated VR stations will be set up.

JAILBIRDS VR: world premiere at Tribeca and NewImagesJAILBIRDS: Bwa Kayiman won some prizes before: the Prix Wallimage – stereopsia du meilleur projet VR in 2016, le Prix Courant-3D – Nouvelle Aquitaine du meilleur projet immersif in 2017 and the Prix SACD du meilleur scénario VR au “Pitch projets numériques” du Festival d’Annecy in 2018. Now it will have its world premiere in two festivals. The Virtual Reality story will be shown to professionals and to the public during the two events, both international leaders in the immersive creation sector.

The world premiere of JAILBIRDS: Bwa Kayiman happens at a pivotal moment in which through an unprecedented partnership, NewImages, Tribeca and Cannes XR are joining forces to fully democratize access to VR and immersive creation. In fact, from June 9th to the 20th the three festivals will partner in order to produce a unique and VR dedicated platform available to everyone. All VR works in competition will be visible online via the virtual exhibition XR3.

Pioneer of a new exhibition concept and result of each event’s expertise in the immersive arts, XR3 will gather around fifty works among which several world premieres previews. All selections will be available inside the Museum of Other Realities (MOR), a virtual exhibition space with an inventive scenography specially designed for the occasion. Thanks to its hybrid dimension and its different modes of access, the XR3 platform thus offers the works presented to a worldwide influence and to all audiences.