Press Release – Broadcast Solutions and Zero Density enter strategic partnership, adding new countries and strengthening cooperation

As part of this new cooperation, Broadcast Solutions will extend the list of countries in which it distributes Zero Density products. Besides several Nordic countries, where Broadcast Solutions already partners with Zero Density, the cooperation now also covers the UK, the DACH region, Hungary and Spain.

Antti Laurila, Global Sales Director at Broadcast Solutions, comments on the partnership: “We are thrilled to bring Zero Density tools to even more European countries. This partnership perfectly matches our strategic expansion in Europe. Zero Density is a global leader in Virtual Studio technology and pushes the boundaries of what is possible. Their 3D Virtual Studio and Augmented Reality platform is a game-changer, and we look forward to making it available to our customers.”


“Broadcast Solutions is one of the biggest systems integrators in Europe, offering complex products and services, and we’re proud to take our collaboration to the next level,” said Ulaş Kaçmaz, VP of Sales and Marketing at Zero Density. “We’ve been partnering with the Nordic branches of Broadcast Solutions, and it is evident that the know-how and capabilities of the team are vast in all areas of broadcast. We are proud to extend the partnership to other countries.”

The extended partnership between Broadcast Solutions and Zero Density takes effect immediately. To learn more about the products, Broadcast Solutions’ sales and engineering colleagues from all over Europe have visited Zero Density’s headquarters for extensive training.

Zero Density is an international technology company dedicated to developing creative products for industries such as broadcasting, augmented reality, live events and esports. Zero Density offers the next level of virtual production with real-time visual effects. It provides an Unreal Engine-native platform, Reality Engine™, with advanced real-time compositing tools and its proprietary keying technology. Reality Engine™ is the most photo-realistic real-time 3D Virtual Studio and Augmented Reality platform in the industry.

More info

Emerson College Adopts Broadcast Pix BPswitch Integrated Production Switcher for Sports Production

Installed in late 2018 during winter break, the new BPswitch was used throughout last season when the men’s basketball team won its first NEWMAC championship title.

The BPswitch anchors the control room in the Bobbi Brown and Steven Plofker Gym, which hosts all home games for men’s and women’s basketball as well as men’s and women’s volleyball. The control room is located a floor above the basketball court, and is designed with a traditional three-row setup. Tim MacArthur, associate director for media technologies and production, said the college is exploring the idea of using the BPswitch-based control room for remote productions of other campus sports as well.

“The big draw for us was getting into BPfusion for graphics and the integration with the Daktronics scoreboard,” explained MacArthur. “The all-in-one device was also very appealing to us. There is a real ease of workflow with Broadcast Pix.”

BPfusion is a software option that works with BPswitch’s integrated NewBlueNTX multi-layer 3D motion graphics CG to streamline the creation of data-intensive graphics for sports, elections, and more. It uses templates to create customized graphics that update automatically using data from scoreboards, RSS feeds, and other IP-based sources without re-keying the data.
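The template-driven pattern described above, in which a layout is designed once and its fields update automatically from an external feed, can be sketched roughly as follows. This is an illustrative example only, not BPfusion code; the template text and feed fields are invented for the illustration.

```python
from string import Template

# Hypothetical scorebug layout: designed once, with named fields that are
# filled from the latest data snapshot instead of an operator re-keying them.
scorebug = Template("$home $home_score - $away_score $away")

def render(template: Template, feed: dict) -> str:
    """Fill the template's fields from a feed snapshot (e.g. a scoreboard poll)."""
    return template.substitute(feed)

# A snapshot as it might arrive from a scoreboard or other IP-based source:
feed = {"home": "Lions", "home_score": 54, "away_score": 48, "away": "Engineers"}
print(render(scorebug, feed))  # Lions 54 - 48 Engineers
```

Each time the data source changes, re-rendering the same template produces an updated, consistently styled graphic.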

“The ease of use of the NewBlueNTX was another big draw,” MacArthur added. “The graphics are a step up from what we’ve done in the past, with easily modified templates that look great.”

All four of the switcher’s outputs are used to populate monitors in the control room, including two large panels in the front of the room and individual monitors for the TD and audio stations. MacArthur said the BPview™ integrated multi-view was customized for sports production, but there is a separate layout for an awards gala that is hosted in the gym every April.

Emerson College used its new BPswitch GX integrated production switcher last season to produce live coverage when the men’s basketball team won its first NEWMAC championship title.

While the students do not use the BPswitch’s file-based macros during productions, MacArthur said there is one macro that preps the switcher for production, so there is a consistent look no matter which student is serving as technical director. Coverage is produced in 720p and streamed live to the Emerson Lions YouTube channel.

With very limited funding for the gym’s video production system, students rely on mostly hand-me-down equipment. The new BPswitch, for example, replaced an existing Broadcast Pix Granite™ system that had been donated to the school. MacArthur said it was a seamless upgrade, and the school was able to keep its existing Broadcast Pix control panel.

There are dozens of sports productions throughout the school year, all produced with volunteer student crews. There is also no formal instruction for the BPswitch (or any other video equipment) in the gym’s control room. Instead, student volunteers teach themselves and each other how to use the equipment.

According to MacArthur, typical games have a three-camera setup with play-by-play and color commentators along with instant replay. Some games have as many as six cameras as well as a sideline commentator, but even single camera productions are run through the BPswitch to take advantage of the BPfusion graphics.

More info

IDEA to Create Specifications for Next-Gen Immersive Media, Including Light Field Technology

Founding members, including CableLabs®, Light Field Lab Inc, Otoy, and Visby, created IDEA to serve as an alliance of like-minded technology, infrastructure, and creative innovators working to facilitate the development of an end-to-end ecosystem for the capture, distribution, and display of immersive media.

Such a unified ecosystem must support all displays, including highly anticipated light field panels. Recognizing that the essential launch point is to create a common media format specification that can be deployed on commercial networks, IDEA already has begun work on the new Immersive Technology Media Format (ITMF). ITMF will serve as an interchange and distribution format that will enable high-quality conveyance of complex image scenes, including six-degrees-of-freedom (6DoF), to an immersive display for viewing. Moreover, ITMF will enable the support of immersive experience applications including gaming, VR, and AR, on top of commercial networks.

Recognized for its potential to deliver an immersive true-to-life experience, light field media can be regarded as the richest and most dense form of visual media, thereby setting the highest bar for features that the ITMF will need to support and the new media-aware processing capabilities that commercial networks must deliver. Jon Karafin, CEO and co-founder of Light Field Lab, explains that “a light field is a representation describing light rays flowing in every direction through a point in space. New technologies are now enabling the capture and display of this effect, heralding new opportunities for entertainment programming, sports coverage, and education. However, until now, there has been no common media format for the storage, editing, transmission, or archiving of these immersive images.”
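Karafin’s definition can be made concrete with the common two-plane parameterization, in which every ray is indexed by where it crosses a viewpoint plane (u, v) and an image plane (s, t), giving a 4D sample grid. The sketch below is a minimal illustration of that data structure with invented dimensions; it is not tied to ITMF or any Light Field Lab format.

```python
import numpy as np

# Two-plane light-field parameterization L(u, v, s, t): an 8x8 grid of
# viewpoints, each seeing a 64x64 image. The 4D array approximates
# "light rays flowing in every direction through a point in space".
U, V, S, T = 8, 8, 64, 64
lf = np.zeros((U, V, S, T), dtype=np.float32)

# A sub-aperture image is the 2D slice seen from a single viewpoint (u, v)...
center_view = lf[U // 2, V // 2]    # shape (64, 64)

# ...while averaging across all viewpoints (with per-view shifts chosen by
# focal depth; the trivial zero-shift case is shown) synthetically refocuses.
refocused = lf.mean(axis=(0, 1))    # shape (64, 64)
```

The density Karafin describes follows directly: even this toy grid stores 64 full images per frame, which is why light field media sets the highest bar for a distribution format.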


“We’re working on specifications and tools for a variety of immersive displays — AR, VR, stereoscopic 3D, and light field technology, with light field being the pinnacle of immersive experiences,” said Dr. Arianne Hinds, Immersive Media Strategist at CableLabs. “As a display-agnostic format, ITMF will provide near-term benefits for today’s screen technology, including VR and AR headsets and stereoscopic displays, with even greater benefits when light field panels hit the market. If light field technology works half as well as early testing suggests, it will be a game changer, and the cable industry will be there to help support distribution of light field images with the 10G platform.”

Starting with Otoy’s ORBX scene graph format, a well-established data structure widely used in advanced computer animation and computer games, IDEA will provide extensions to expand the capabilities of ORBX for light field photographic camera arrays, live events, and other applications. Further specifications will include network streaming for ITMF and transcoding of ITMF for specific displays, archiving, and other applications. IDEA will preserve backwards compatibility with the existing ORBX format.

IDEA anticipates releasing an initial draft of the ITMF specification in 2019. The alliance also is planning an educational seminar to explain more about the requirements for immersive media and the benefits of the ITMF approach. The seminar will take place in Los Angeles this summer.

More info

ChyronHego Launches PRIME Graphics 3.0

ChyronHego today announced the release of PRIME Graphics 3.0, the latest version of its universal graphics platform that stacks an array of diverse applications into a single design and playout solution.

The advanced 3D graphics authoring and playout solution from ChyronHego has been enhanced with support for the newly adopted SMPTE ST 2110 standard and with HDR-enabled 16-bit color, which allows the platform to deliver 10-bit HDR output in the customer’s choice of HLG or S-Log3 formats.


“As markets become more competitive and revenue per channel is shrinking on average, the need to output more content with less equipment is a common challenge for broadcasters everywhere. That’s why media enterprises around the world, and in broadcast markets of all sizes, are embracing PRIME Graphics,” said Boromy Ung, chief product officer, ChyronHego.

“With its ‘One Platform, Multiple Applications’ model, PRIME Graphics is the ideal solution for everything from news, sports, and entertainment, to government and corporate applications. Equipped to support the most sophisticated broadcast operations, PRIME Graphics 3.0 assists users in taking full advantage of the latest in IP-based workflows and in delivering the rich high-resolution, high-dynamic-range images that give productions a competitive edge.”

Driven by the latest version of ChyronHego’s most powerful rendering engine, PRIME Graphics 3.0 addresses five mission-critical use cases within a single, easy-to-use platform: the most renowned CG, a powerful clip player, a video wall solution, a graphics-driven touch-screen platform, and a complete branding solution using ChyronHego’s NewsTicker family of options. PRIME Graphics adapts smoothly to customer requirements by offering all of these use cases within a single, easy-to-use, 4K- and IP-ready graphics design and playout system.


Resolution-agnostic and software-based, PRIME Graphics 3.0 leverages advanced 64-bit GPU- and CPU-based technologies for maximum power in rendering graphics and effects, as well as HDR-enabled 16-bit color. By supporting both HLG and S-Log3 formats, PRIME Graphics 3.0 offers customers a future-proof solution that enables them to choose the standard they need when they need it.
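For readers unfamiliar with what an HLG output path entails, the sketch below shows the reference HLG opto-electronic transfer function from ITU-R BT.2100, which maps linear scene light to the non-linear HDR signal. It is included only to illustrate the kind of transfer function such a format implies; it is not ChyronHego code.

```python
import math

# ITU-R BT.2100 HLG OETF constants.
A = 0.17883277
B = 1 - 4 * A                     # 0.28466892
C = 0.5 - A * math.log(4 * A)     # ~0.55991073

def hlg_oetf(e: float) -> float:
    """Map linear scene light E in [0, 1] to the non-linear HLG signal E'.

    A square-root segment handles dark scene light; a logarithmic segment
    compresses highlights, which is what gives HLG its SDR compatibility.
    """
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

print(hlg_oetf(1 / 12))  # ~0.5, the knee joining the two segments
```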

For news production, PRIME Graphics 3.0 provides full integration with CAMIO, ChyronHego’s MOS-based and NRCS-connected graphics asset management solution, giving news and sports broadcasters the ability to extend their newsroom capabilities significantly. PRIME Graphics also offers a complete video wall solution, with the ability to feed content to any size canvas, to any number of outputs, and to any resolution per output — all synchronized and automated by PRIME.

More info

Twin Pines reconstructs 16th-century Seville for “The Plague”

A team of 35 VFX artists created over 500 shots over a period of ten months for this story set in 16th-century Seville.

The biggest challenge was digitally recreating the Seville of the period with the same historical rigour as a history documentary. To this end, the team worked throughout the production process alongside a whole series of historical advisors, documentary makers and an art director for VFX. The exhaustive development of visual layers and the computer-generated creation of 3D elements relied heavily on maps, etchings and paintings from the period.

“The job of reconstructing the city was formidable given that there is practically nothing existing today of 16th-century Seville”, explained Juanma Nogales, VFX supervisor at Twin Pines. “It required a painstaking process combining historical seriousness with aesthetic taste. And that’s where the collaboration between all the different teams came in: art (Pepe Domínguez), make-up (Yolanda Piña), photography (Pau Esteve), VFX (Juan Ventura, Juanma Nogales), postproduction (José Moyano, Iván Benjumea)… Unless everybody is on the same page, working in the same direction, such an ambitious production as The Plague would not have been possible. And the results are there for everyone to see.”

Twin Pines started working on the project long before shooting actually began. First there was an initial phase of contextualisation, followed by execution, and this took up to ten months of work.

“One of the biggest technical problems we came across was that, unlike other productions we had done before, we didn’t shoot on open locations but on different sets in which the cameras moved freely through 360 degrees. Using immersive techniques, we had to insert the computer-generated elements,” recalled Nogales.

One of the most complex parts to replicate was the port of Seville, which was the most important in the world at that time. Twin Pines solved the problem by recording several images of the Guadalquivir river around the area of the Isleta in Coria del Río, which were then digitally processed to achieve an end result faithful to history. The team at Twin Pines also had to model the port’s ships and galleons in 3D, which were then integrated into the different scenes.

At the beginning of the first episode, the main character appears on horseback looking down on Seville from a distance. The only real thing in this scene was the horse. In fact, the character was on a hill overlooking a motorway and, using a green screen, 16th-century Seville was reconstructed using computer-generated 3D elements.

Also worth underscoring is Twin Pines’ work reproducing the crowds of people. For instance, in one of the episodes, a space holding around 15,000 people was filled with just 100 extras, with the help of a greenscreen and a lot of individuals recreated digitally after scanning the extras and their clothing in 3D.

At the same time, a large part of the buildings and monuments that feature in the series are not actually from Seville. The team used similar buildings as references, many of which were photographed and rebuilt in 3D, such as the cathedral, the city walls, the poor quarters, the city gates and St George’s Castle. Nowadays these buildings or areas are completely different, or the monuments that still exist are surrounded by a different landscape.

In addition, the visual effects in The Plague included major digital extension of the sets, the reconstruction of natural spaces, the integration of backdrops, and much more. The whole process generated around 50 terabytes of VFX data and over 19,500 hours of rendering. Nuke Studio software by Foundry played an instrumental role in the project by enabling VFX, editing and finishing within a single application.

Juanma Nogales is taking part in a talk at FMX on Wednesday 25 April at 17:00 to discuss the use of Nuke Studio in The Plague. FMX, held annually in Stuttgart (Germany), is among the most important symposiums on digital visual arts in Europe.

Created by Alberto Rodríguez and Rafael Cobos, the first season of The Plague, consisting of six 50-minute episodes, can be seen on Movistar+. Following the fantastic results since it was first screened, the company has already confirmed a second season for 2019.

With a budget of 10 million euros, around 200 actors, over 2,000 extras and 400 technical experts were involved in this superproduction. The Plague was the first TV series included in the official section of an A-list international film festival, at the San Sebastián Festival. Premiering on January 12, the show became the best-performing series premiere on Movistar+ in 2018. The Spanish telco recently launched a drama channel across Latin America called Movistar Series, with 12 original series to debut throughout 2018, including The Plague.

ZDF relies on Robycam cable camera system for international football match Germany – Brazil

During the friendly, which took place on March 27 at Berlin Olympic Stadium, ZDF used the Robycam cable camera system along with a Broadcast Solutions production team.

Together with augmented reality elements implemented by the graphics service provider netventure production GmbH, new graphic elements could be inserted while the camera moved through the stadium. The Robycam signals, including the camera’s actual position data in the stadium, were combined with netventure’s virtual graphics in the computer and then transmitted. Using these innovative tools, ZDF was able to offer its viewers additional content and information even while the cable camera system was moving through the stadium.


The football match was not ZDF’s first use of Robycam. The German broadcaster had already used the Robycam system during the production of “Mainz celebrates” on the occasion of the Day of German Unity in 2017, and in early 2018 during the Biathlon World Cup in Ruhpolding. Robycam is a cable camera system that enables gyro-stabilized 3D camera movements. Together with a powerful controller, the system allows movements in all three axes: pan, tilt and roll. All three axes are stabilized, and an auto-horizon feature keeps the image stable in wind or during swings.

Unlike other systems on the market, Robycam has a fifth winch that carries the fibre-optic cable in master-slave mode. As a result, using the Polyspast mode, the system offers greater variability in height.

The Robycam system uses four automatic winches, controlled in real time, and a sophisticated motion control system that allows millimetre-accurate, fast camera moves at speeds of up to 8 m/s in all axes. The largest system has an operating range of 250 m x 250 m and more. Robycam is fully redundant (UPS units on each computer) and, of course, holds all necessary DGUV 17/18 certifications (formerly BGV-C1) for use above people.
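The geometry behind a four-winch cable camera is straightforward: to place the camera at a target point, each winch must pay out a cable whose length equals the distance from its anchor to that point. The sketch below illustrates this with assumed anchor positions (corners of a 250 m x 250 m area, 40 m up); it is not Robycam’s actual control code, which must also handle stabilization, speed limits and redundancy.

```python
import math

# Assumed pylon anchor positions (x, y, z) in metres at the corners of a
# 250 m x 250 m operating area, 40 m above the pitch.
ANCHORS = [(0, 0, 40), (250, 0, 40), (250, 250, 40), (0, 250, 40)]

def cable_lengths(camera_pos):
    """Cable length each winch must pay out to hold the camera at camera_pos:
    simply the Euclidean distance from each anchor to the target point."""
    return [math.dist(camera_pos, a) for a in ANCHORS]

# Camera hovering at the centre of the area, 10 m above the pitch: by
# symmetry all four cables come out equal.
lengths = cable_lengths((125, 125, 10))
```

A real controller coordinates the four winch speeds so the lengths change consistently along the planned camera path, which is what makes millimetre-accurate moves possible.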


The next production where you can see the system live is the Champions League semi-final Bayern Munich vs Real Madrid in Munich on April 25. An additional and special use case is planned for the SportsInnovation 2018 event (8 – 9 May) in Düsseldorf, where, in front of numerous media and club representatives, the Robycam system will perform the camera flights for the test matches in the ESPRIT Arena. Two days earlier (May 6), in the same stadium, Robycam will produce the flying cable camera pictures during the second-division football match Fortuna Düsseldorf vs Holstein Kiel.

More info

ChyronHego and Epic Games to Integrate Unreal Engine With AR and Virtual Set Software

ChyronHego today announced a partnership with leading game developer Epic Games to integrate the Unreal Engine with ChyronHego’s family of augmented reality (AR) graphics and virtual set solutions.

With the integration, news broadcasters and other customers of ChyronHego’s Neon and Plutonium software will be able to leverage Unreal’s industry-leading rendering and real-time special effects capabilities to add powerful new photorealistic and hyper-realistic elements to their on-air virtual sets.

“In an environment that’s more competitive than ever, our broadcast news customers are on a constant search for innovative ways to tell a better story and captivate viewers,” said Olivier Cohen, senior product manager, virtual solutions, ChyronHego. “Virtual sets that harness the amazing graphics capabilities of world-class gaming engines are the wave of the future for news, sports, and weather broadcasting, and Epic Games is the perfect partner to take us there. The integration with Unreal is just the latest link in our CAMIO Universe strategy to place the industry’s most powerful storytelling tools at news producers’ fingertips and drive template-based, unified news and weather workflows.”


“Since we launched Unreal Engine 4, it has become one of the world’s most powerful rendering engines for the game industry — but its flexibility, real-time performance, and robust set of tools make it ideal in just about any type of content creation workflow,” said Marc Petit, general manager, Unreal Engine Enterprise at Epic Games. “ChyronHego’s AR and virtual set solutions for broadcast production are the ideal match for Unreal Engine. By partnering with one of the leading broadcast graphics providers, we can continue to expand our presence in the broadcast, film, and entertainment industries.”

ChyronHego will work to integrate the CAMIO graphic asset management server with Unreal Engine, including creating broadcast-specific camera movements. Unreal’s 3D graphics engine enables producers to generate AR graphics through the templated workflows of the CAMIO Universe. Via custom user interfaces built with ChyronHego Live Assist panels, producers can then present the AR graphics on air using ChyronHego’s Plutonium and Neon virtual set and robotic camera tracking solutions.

Cohen added, “As one of the world’s most powerful rendering engines from one of the world’s foremost gaming companies, Unreal will bring new levels of openness and scalability to our virtual set solutions. Unreal Engine is easily customizable and expandable, and five million users around the world will benefit from Epic Games’ vast reach across the global community of gamers and game developers. It means news broadcasters will be able to render effects in their virtual sets that rival anything their viewers have seen in the gaming world, with special effects like real-time shaders, bumps, sliders, and highly photorealistic objects.”

More info