MX1 at the 2019 NAB Show

As a wholly-owned subsidiary of SES, MX1 is a trusted partner for broadcasters, content owners, and TV platform operators, as well as content aggregators for the sports and news industry. MX1 focuses on helping its customers reduce effort and costs and increase ROI by tapping new sources of revenue.

MX1 will highlight a number of solutions at SES’s booth (SU1410) during the 2019 NAB Show, including a demonstration of a new broadcast-scale Online Video Platform (OVP), a powerful new automated targeted ad insertion solution, the company’s OU Flex service, and an enhanced sports and events management tool paired with a new live events booking platform.

Boost OTT Monetisation With MX1’s Fully Managed Online Video Platform (OVP)

At the 2019 NAB Show, MX1 will demonstrate its new fully managed OVP, which provides operators with a complete end-to-end solution, access to the SES and MX1 global contribution and distribution network, and seamless integration with the company’s content processing and management services. Fully integrated into the MX1 360 Unified Media Platform, the solution reduces complexity, provides access to analytics data, and enables easy management across the complete media supply chain — from origination to distribution and monetisation — on one single platform. Its cloud-based nature also ensures seamless global scaling in line with the number of users, channels, and volume of content delivered.

Increase Revenue With Automated Targeted Ad Insertion

As the advertising industry increasingly pivots towards targeted advertising, broadcasters and operators need a dependable way to build an offering that drives monetisation and ROI for both linear and on-demand content. MX1 now offers an innovative new solution that provides near-plug-and-play automated dynamic ad insertion capabilities. Integrated with MX1’s broadcast and online streaming infrastructure, the solution enables content providers to remove and replace adverts reliably at a frame-accurate, broadcast-grade level. In addition, its integration with connected advertising sales platforms, such as SpotX and Google Ad Manager, allows replacement ads to be targeted to specific segments of the audience.
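MX1 has not published implementation details, but the general shape of frame-accurate dynamic ad insertion can be sketched: each marked ad break is swapped for a spot returned by an ad decision service for the viewer's segment. Everything in the sketch below (the break structure, the ad-server call, and the splice plan) is hypothetical and illustrative only, not MX1's product or the SpotX / Google Ad Manager APIs.

```python
# Illustrative only: a simplified dynamic ad insertion flow. All names here
# (AdBreak, fetch_targeted_ad, build_splice_plan) are hypothetical and do not
# reflect MX1's product or the SpotX / Google Ad Manager APIs.
from dataclasses import dataclass

@dataclass
class AdBreak:
    start_frame: int       # first frame of the original advert
    duration_frames: int   # length of the break, in frames

def fetch_targeted_ad(ad_server_url: str, segment: str, duration_frames: int) -> str:
    """Ask a connected ad sales platform for a spot matching the audience segment.
    Stubbed: returns a URI describing the request rather than a real asset."""
    return f"{ad_server_url}?segment={segment}&frames={duration_frames}"

def build_splice_plan(breaks: list[AdBreak], segment: str, ad_server_url: str) -> list[dict]:
    """For each marked break, plan a replacement at exactly the same frame
    boundaries, which is what 'frame-accurate' replacement amounts to."""
    return [{
        "splice_in_frame": brk.start_frame,
        "splice_out_frame": brk.start_frame + brk.duration_frames,
        "replacement_asset": fetch_targeted_ad(ad_server_url, segment, brk.duration_frames),
    } for brk in breaks]

plan = build_splice_plan([AdBreak(start_frame=45000, duration_frames=750)],
                         segment="sports-fans-uk",
                         ad_server_url="https://ads.example.com/decision")
print(plan)
```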

Broadcast Live Events Via IP With OU Flex

MX1’s OU Flex service enables sports and live events to be broadcast reliably and without interruption, providing the next-generation connectivity required by news organisations and anyone in the business of broadcasting live events, particularly from remote locations. Offering the best of the video and data worlds, the OU Flex solution combines traditional DVB links with full-service IP connections and offers rock-solid connectivity at bandwidths high enough to support HD and UHD, even in congested network spots and in areas with limited bandwidth. Seamless, high-quality live broadcasts are enhanced with IP data connections for both production coordination and content streaming, allowing broadcasters, content creators, and event organisers to focus on growing their business and offering the best possible experience to their viewers — whatever the device.

Enhanced Interfaces Increase Efficiency and Flexibility for Sports and Live Events

At the 2019 NAB Show, MX1 will present its new sports and events management interface, aimed at sports and event content owners and distributors, which enables everything from individual matches to full team-sport competitions to be shared with any group of licensees from around the world. The new interface allows sports and event organisations and rights holders to easily manage and share broadcast-quality media containing matches, clips, and highlights, as well as related images, documents, metadata, and more.

Also on display at the show will be the company’s new live events booking interface, which connects MX1’s sports and events customers with its 24/7 live events booking team, simplifying the planning, monitoring, and coordination of live-event content distribution. The live booking interface features one-click sharing of live events with licensees from around the world, who receive real-time notifications and access to mission-critical distribution details. The integrated solution can be used both for single live event management and for more complex sports tournaments with multiple concurrent live events.

As with all MX1 products, any new services launched by the company are managed through the MX1 360 unified media platform.

More info

The Late Show with Stephen Colbert Selects Bannister Lake to Integrate Live Data-Driven Graphics

Bannister Lake, the leading provider of professional broadcast data aggregation and visualization solutions, today announced that the company’s graphic integration services were used to generate key graphics for the special live election night broadcast of The Late Show with Stephen Colbert on Nov. 6. To help audiences visualize the unfolding results of the midterm elections, The Late Show’s production team turned to Bannister Lake to prepare and integrate graphics into the live broadcast showing the House of Representatives and the Senate populated with the number of elected officials.

Bannister Lake Helps Visualize Midterm Election Results for The Late Show With Stephen Colbert

The Late Show used Google Sheets to tie data to the graphics and to manually make changes as required. During the election special, Bannister Lake’s role focused on transforming The Late Show’s creative graphic concepts into templates ready for playout. The graphic work was performed by Bannister Lake’s Creative and Technical Director Al Savoie, and data integration was carried out by The Late Show’s production staff.

“Bannister Lake has a long-standing relationship with key members of the graphics team at The Late Show, and when we were asked to help out with its election coverage, we immediately said yes,” said Savoie. “We have done countless election broadcasts integrating data and graphics, but we’re especially honored to have been chosen by The Late Show’s team and to take part in midterm election history.”

Bannister Lake provides software and services for managing and visualizing data for a broad range of projects including real-time election results, social media, financial information, sports, news, and weather. Bannister Lake’s flagship software solution, Chameleon, is the industry’s most advanced broadcast data engine, allowing users to input any kind of data to populate and manage graphic templates. With Chameleon’s Google Sheets Custom Reader, media producers can automatically pull data content from Google Sheets cells and seamlessly populate graphic templates. In addition, producers are able to take full advantage of Google Sheets’ sharing capabilities, granting access and editing rights to multiple users contributing to the content. Data can be organized by topic and displayed per Google Sheets tab, and the Custom Reader can handle multiple sheets and tabs, offering an efficient and elegant way to drive complex broadcast graphics.
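Chameleon's Custom Reader is proprietary, but the underlying pattern it describes (reading cell values from a shared Google Sheet and mapping them onto graphic template fields) can be sketched with the open-source gspread library. The sheet name, tab name, and column headings below are assumptions chosen purely for illustration.

```python
# A minimal sketch of the general pattern, not Chameleon itself: read rows
# from a shared Google Sheet and map them onto graphic template fields.
# The sheet name, tab name, and column headings are hypothetical.
import gspread

gc = gspread.service_account(filename="credentials.json")   # service-account auth
worksheet = gc.open("Election Night").worksheet("Senate")    # one tab per topic

def rows_to_template_fields(ws):
    """Yield one dict of template fields per data row; the header row supplies the keys."""
    for record in ws.get_all_records():            # e.g. {"Party": "DEM", "Seats": 51}
        yield {"party": record["Party"], "seats": record["Seats"]}

for fields in rows_to_template_fields(worksheet):
    print(fields)   # in production these values would populate the on-air template
```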

More info

The battle to reach ‘convergence 4.0’


By Kuldip Singh Johal, Vice President Sales – Subscription Broadcasting, UEI

The remote control could now be at the heart of convergence, providing users with a single device with which to perform synergised functions, controlling everything from the television to your home’s temperature, security and lighting. However, with established technology manufacturers competing against smaller, yet more agile, counterparts, who will win convergence 4.0?

While there are a handful of manufacturers most of us associate with smart home technology, many pay-TV operators and telecommunication companies are turning their backs on these devices, creating space for some of the smaller brands to rise to the top. Not only are devices created by these big brands more expensive for pay-TV operators to deploy, but they are also less adaptable to operators’ needs. Conversely, by partnering with smaller manufacturers to create bespoke solutions, pay-TV operators can be in control of their own destiny and create their own ecosystem on which they can build in the future. This is a fluidity that isn’t offered by larger branded devices, whose roadmaps for convergence are already set out.

Solutions developed with telcos and pay-TV operators in mind also give them more autonomy in their approach to the market, rather than forcing them to follow trends dictated by large manufacturers. This is a key issue in convergence: with pay-TV operators on side, the smaller manufacturers have the potential to tap into different insights and develop new capabilities.

Currently, many consumers are turned off by the idea of convergence due to the difficulties they face in configuring and setting up smart devices. For the average person, configuring multiple devices can be daunting and seem to require engineering know-how, which can deter consumers from buying new devices or attempting to integrate them. Difficulties with configuring, discovering and controlling devices are among the biggest pain points for consumers of smart devices and need to be considered in relation to convergence.

It is vital that manufacturers and industry leaders address these pain points in order to make convergence and configuration a frictionless, simple process for the end user; doing so is also likely to increase uptake of smart devices. Users require intuitive devices that automatically recognise new devices and help the user configure them, yet many of the key players in the industry have yet to offer devices capable of this. If manufacturers, and the bigger market challengers such as Amazon and Google, want to attract audiences, they need to consider end-to-end solutions that simplify the end user’s migration to the smart home. For example, devices should offer simple, voice-based processes to increase ease of use.

Additionally, the winner of convergence 4.0 will produce devices capable of ‘learning’ set skills. As well as being intuitive in recognising other devices, these devices must also intuit what a user requires when they issue certain commands. Ultimate convergence will come when users are able to ask their device to enter ‘movie mode’, for instance, and the device will not only play a movie but also draw the curtains and dim the lights.

It is our view that, as many smart devices make use of voice control, the two issues are intrinsically linked. Therefore, to win convergence 4.0, you must also be one step ahead in the fight for the voice-assistant market. The uptake of voice-controlled devices is growing significantly, with a recent study finding that 1 in 6 adults in the US now owns a voice-activated smart speaker and that 65% say they wouldn’t want to go back to a life without these devices. Their popularity and ease of use show that this technology should be a key feature of the future of convergence.

Given the current difficulties associated with configuring devices, the real winner of convergence will be the company that can make integrating these devices into the home as easy and seamless as possible. At the moment, convergence is being driven primarily by the market as a pre-emptive strike to anticipate the needs of consumers. While the big names currently have a monopoly on this market, convergence 4.0 could be a case of the tortoise and the hare as smaller manufacturers step up their approach to the market. With a more insightful view of the requirements of not only the user but also the telecommunication companies, these brands could be better placed to tailor their offerings precisely to users’ needs.

Dramatic increase in OTT drives major streaming development

Jérôme Vieron, PhD – Director of Research & Innovation for ATEME

The number one differentiator will always be the content provided. Whereas before we learnt to love the only content on offer, now we can pick and choose. As a result, the competition to provide the most compelling movies and box-sets is fierce. However, there is a further game-changer: we also have high expectations of the quality of service provided, and these have been raised even further by new formats including 4K/UHD, HDR and HFR.


With this in mind, it’s not surprising that operators and technology vendors alike are focusing on ways to further enhance content delivery. As a result, this demand is powering a continuous string of innovations, especially around the issue of streaming.

The most recent of these developments embraces artificial intelligence (AI). Once the subject of science fiction, AI is now being used by global tech giants such as Amazon and Google to predict the behaviours of their users, with health organisations like the NHS also looking into the technology to help alleviate pressure on doctors and nurses.

The broadcast industry is also seeing more widespread adoption of AI. It is now being used to analyse thousands of assets as part of the streaming process, and in doing so has been shown to save operators around 30% of content delivery costs while also improving delivery quality.

Most operators now find that traditional streaming can result in buffering and other delays. Research by Conviva shows that while watching a half-hour show, the average viewer spends less than 18 seconds waiting for a video to re-buffer. However, even this short time is too long when consumer expectations are high and the market so competitive.

The current solution within the industry for addressing this is adaptive streaming and its successor, content adaptive streaming. Adaptive streaming works by detecting a user’s bandwidth and CPU capacity in real time and adjusting the quality of the video stream accordingly. Although widely used, it relies on a fixed set of bitrates, which means that for roughly half the content the bitrate will be too high, and for the other half too low. If it is too high, the stream may stall; if it is too low, quality is left on the table, meaning the content is never fully optimised.
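As a rough illustration of the client side of adaptive streaming, the sketch below simply picks the highest rendition the measured throughput can sustain. The bitrate ladder and safety margin are assumptions; real players also smooth their throughput estimates and account for buffer levels.

```python
# A minimal sketch of client-side adaptive streaming: pick the highest
# rendition whose bitrate fits the measured throughput. The ladder is
# hypothetical; real players also factor in buffer occupancy and CPU load.
LADDER_KBPS = [400, 800, 1600, 3000, 6000, 12000]  # example bitrate ladder

def select_rendition(measured_throughput_kbps: float, safety_margin: float = 0.8) -> int:
    """Return the bitrate (kbps) of the best rendition the connection can sustain."""
    budget = measured_throughput_kbps * safety_margin
    suitable = [b for b in LADDER_KBPS if b <= budget]
    return max(suitable) if suitable else min(LADDER_KBPS)  # fall back to the lowest profile

print(select_rendition(2500))   # -> 1600
print(select_rendition(300))    # -> 400 (lowest profile; may still stall)
```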

As a result, industry pioneers such as Netflix have been working on remedying this shortfall. Netflix has been leading the way with per-title encoding and even recently announced per-shot encoding, but these are proprietary technologies and not available to other operators.

Recognising this shortfall, other developers have been working on technology that adjusts bitrates based on the complexity of the content rather than just the internet connection. The result is content adaptive streaming, which uses AI to compute all the necessary information, such as motion estimation, to make intelligent bit-allocation decisions. Using a variable bitrate to reach constant quality allows bits to be saved when complexity drops on slow scenes – using fewer profiles on easier content.
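The exact models are proprietary, but the principle can be sketched: estimate each scene's complexity (a crude stand-in below for the AI-driven analysis) and allocate bits so that quality, rather than bitrate, stays constant. The complexity scores and scaling rule are illustrative assumptions, not ATEME's method.

```python
# A rough sketch of content-adaptive bit allocation: spend fewer bits on
# simple, slow scenes and more on complex ones, targeting constant quality.
# The complexity scores and scaling rule are placeholders, not ATEME's model.
def allocate_scene_bitrates(scene_complexities: list[float],
                            average_bitrate_kbps: float) -> list[float]:
    """Scale each scene's bitrate by its complexity relative to the mean,
    so the stream averages `average_bitrate_kbps` while quality stays level."""
    mean_complexity = sum(scene_complexities) / len(scene_complexities)
    return [average_bitrate_kbps * (c / mean_complexity) for c in scene_complexities]

# A talking-head scene (0.4) gets far fewer bits than a fast action scene (1.8).
print(allocate_scene_bitrates([0.4, 1.0, 1.8], average_bitrate_kbps=3000))
# -> [1125.0, 2812.5, 5062.5]
```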

The traditional approach is to keep chunks at fixed lengths. The ecosystem usually requires each chunk to start with an I-frame so that profile switches can occur between chunks, but with fixed-size chunks this implies arbitrary I-frame placement. A scene cut just before a chunking point therefore results in a major compression inefficiency, as the picture is effectively intra-coded twice in quick succession.

Content adaptive streaming combines a scene cut detection algorithm in the video encoder with rules to keep chunk size reasonable and minimise drift, in order to prepare the asset for more efficient packaging. This not only brings cost saving benefits due to reduced traffic, storage and other overheads, but also improves the quality of experience for the consumer.
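That dynamic-chunking idea can be sketched as a boundary-placement rule: end a chunk at a detected scene cut when one falls inside an acceptable duration window, and otherwise fall back to the nominal length. The durations and thresholds below are illustrative assumptions, not ATEME's actual rules.

```python
# Illustrative dynamic chunking: align chunk boundaries with detected scene
# cuts when possible, so each chunk starts on a natural I-frame and no frame
# has to be intra-coded twice. Durations are in seconds and purely illustrative.
def place_chunk_boundaries(scene_cuts: list[float], total_duration: float,
                           target: float = 4.0, min_len: float = 2.0,
                           max_len: float = 6.0) -> list[float]:
    boundaries, start = [], 0.0
    while start + max_len < total_duration:
        window = [c for c in scene_cuts if min_len <= c - start <= max_len]
        # Prefer the scene cut closest to the target length; otherwise cut at the target.
        end = min(window, key=lambda c: abs((c - start) - target)) if window else start + target
        boundaries.append(end)
        start = end
    return boundaries + [total_duration]

print(place_chunk_boundaries(scene_cuts=[3.2, 9.5, 12.1], total_duration=20.0))
# -> [3.2, 7.2, 12.1, 16.1, 20.0]
```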

Content adaptive streaming solutions have been developed with interoperability in mind, so individual parameters such as dynamic chunking can be turned on and off. Operators also have the option to use the specific resolutions they want, even if these appear to be suboptimal to the system.

Any development that enhances quality while at the same time cuts overheads demands to be investigated further. It represents a win for the operator and a win for the viewer too – which all suggests that content adaptive streaming could be the future method of choice.

NBCUniversal and Google partner to produce VR experiences

These VR projects include original content from “Saturday Night Live,” Bravo Media’s “Vanderpump Rules” and SYFY WIRE, with more NBCUniversal networks, channels and shows to follow in the coming months, including projects from E!, NBC, NBC News, NBC Sports, SYFY, Telemundo and USA Network.

They’ll be made available on YouTube, and can be viewed on the web or using any mobile phone. For the most immersive experience, you can watch in VR using Google Cardboard or Daydream View.

NBCUniversal and Google will collaborate on at least 10 multi-episode VR productions that will allow fans to get closer to the shows they love—placing them inside the scene and allowing them to look all around—by leveraging NBCUniversal’s content and Google’s technology. The experiences are being produced using Jump, Google’s platform for VR video capture that combines high-quality VR cameras and automated stitching. Future select VR experiences will also be available in VR180, a new VR format providing 4K, three-dimensional video.

“We are constantly looking for opportunities to bring consumers new ways to experience content from across the NBCUniversal portfolio,” said Ron Lamprecht, Executive Vice President, NBCUniversal Digital Enterprises. “This partnership combines the creative expertise of NBCUniversal with Google’s VR capabilities to create these engaging experiences. We look forward to working with Google and YouTube on more collaborations like this in the future.”

“NBCUniversal’s networks and shows have a proven track record of high-quality storytelling that audiences can’t get enough of. Bringing them to VR lets fans connect with that content in a whole new way,” said Amit Singh, VP of Business & Operations for VR & AR at Google. “NBCU’s teams were able to easily capture engaging VR content using the latest VR Jump cameras. And with YouTube, audiences can experience it on any device, bringing them closer to their favorite series.”

As part of this partnership, Bravo and Google today launched two 360 VR experiences featuring Lisa Vanderpump and the cast of “Vanderpump Rules,” with more to launch in the coming weeks.

TOUR VANDERPUMP DOGS WITH LISA VANDERPUMP

Take a 360 tour of Lisa Vanderpump’s recently opened Hollywood dog rescue center, Vanderpump Dogs. Fans will get a VIP tour from Lisa herself, who shows off every detail, from her trademark pink walls to the doggie biscuit bar. There are also guest appearances from Tom Schwartz, Katie Maloney, and their dogs Gordo and Butter.

 

JAMES, SCHEANA AND LALA GET READY FOR A PARTY AT TOM TOM

Go 360 with an exclusive bonus clip from the season finale of “Vanderpump Rules.” Pour a glass of champagne and join Scheana, Lala, and James as they get glammed up for the Tom Tom party. Plus, take a tour of the “Vanderpump Rules” interview set where the cast sets the story straight.

SYFY WIRE created 360 and 180 videos for Emerald City Comic Con, including interactive games such as Sidekick Search.

More videos and information here