Esports Hub Launches Staffordshire University into a New League

By Ellen Camloh

It’s not an uncommon mission statement for anyone in higher education. But what’s not so common is that the jobs Mr. Leese is referring to are in an industry that didn’t even exist just a few years ago.

So when it came time for the University to build a new facility to prepare students for their future careers, he and his colleagues had to put aside the equipment that had worked for decades.

Instead, they focused all their attention on forward-looking technology: technology that didn’t even exist just a few years ago.

Ahead of the Game

Take one look at the course curriculum for the newest addition to Staffordshire University’s business school, and you’ll see it’s not like any other management programme in academia.

Year One compulsory modules:

  • Competitive Gaming Culture
  • Resourcing Esports Events
  • Esports Ecosystems
  • Single Player Esports Event
  • Esports Events Experience
  • Esports Broadcasting

That’s because the University’s brand-new course—started only in September of 2018—is anything but typical.

In fact, it’s the first of its kind in the United Kingdom, focusing strictly on the business side of an emerging industry expected to see record revenues in 2019, thanks in large part to the influx of companies and investors entering the market.

With the sector growing and high-profile brands rushing into a new space, Staffordshire University looked at the future of its graduates and saw an opportunity for them to be on the leading edge of a US $1.1bn (£844 million) industry.

And the Bachelor of Arts in Esports (Honours) was born.

Acquired Skills

Staffordshire University, located in Stoke-on-Trent, England, is no noob when it comes to teaching technology.

For one thing, the campus’s roots date back more than a hundred years, to the site of a school of science and technology that opened in 1914.

Other experience points: the University’s computer science programmes include certifications in Cyber Security, studies in Artificial Intelligence and Robotics, and partnerships with Cisco, Amazon, and Microsoft.

What’s more, it offers no fewer than four bachelor’s degree programmes in video games. It has previously been ranked best business school in the world for its use of Twitter. And it pioneered the world’s first Games PR and Community Management degree.

It was just a natural levelling-up for Staffordshire University to become the first in the UK to offer a degree programme dedicated to the business of esports management.

Attract Mode

In 2017, at the time of the course’s conception, the Business School’s Associate Dean for Recruitment, Rachel Gowers, observed that industry was driving the creation of new jobs, saying that “companies are looking for people who are both entrepreneurial and tech savvy.”

That’s because jobs in esports aren’t limited to the players sitting in front of the game consoles. For companies to make money in esports, just like any other sport, it takes more than just athletes—it takes a whole ecosystem of specialisations.

These jobs can be anything from highly visible roles like on-camera shoutcasters and hosts, coaches and analysts, to behind-the-scenes pros including event managers, partnership and sponsorship reps, production crews, team or organisation managers, finance specialists, and PR execs.

With its combination of business and technology expertise, the University was well-positioned to pioneer a programme that could both advance the career development of the next generation of students, and have a direct impact on the industry’s presence and growth in the UK.

The resulting course draws from academic best practices in business curriculum, giving students instruction in marketing, HR, finance and event management—but it’s entirely modernised and contextualised, to make it far more relevant for students pursuing careers in this ever-evolving industry.

Additionally, one of the school’s lecturers, Stuart Kosters, said that to give students the best chance of getting a job, course developers worked closely with industry employers, for whom hands-on experience is vital when recruiting.

What better selling point to drive the hands-on experience home to prospective students than to create an Esports Hub—a state-of-the-art facility where they could bring their own esports competitions and productions to life?

Challenge Accepted

The course developers turned to the technical specialists in the Media & Communications staff to specify and deploy the equipment and infrastructure for the new facility. The tech staff were already responsible for running, supporting and maintaining the gear used in the school’s other production studios.

“The newsroom that we have here is a four-camera studio in a typical ‘newsy’ setup with green screen and lighting,” says Chris Leese.

Students in the school’s television studies or sports journalism programmes, for example, produce live news shows and special broadcasts, and report from the football grounds, all using mostly conventional broadcast equipment. But Chris knew it wouldn’t be the same for esports.

The Esports Hub would require 4K streaming, as well as vision mixing large numbers of simultaneous, high-res computer sources over the network (think a dozen gamers competing in the same game) together with 4K camera sources, graphics and audio.

What’s more, Chris says, “because this was a new venture for us, it had to be something that was going to be flexible, something we could expand on if we had to make changes at any point.”

It was clear right away that investing in traditional broadcast equipment was not going to be sufficient.

“We had to look into future capabilities, or we’d be fitting out a studio that was going to be outdated almost immediately,” says Chris. “We had to go IP.”

Student-Friendly, Staff-Friendly Technology

Designing a brand new, 4K-ready, all-IP space from scratch for the first esports business degree in the country was a fairly daunting task, considering it had never been done before.

But what made it even more challenging—and exciting, says Chris—is that he and his technology staff colleague, Matt Lewis, hadn’t actually built a production studio before.

“The TV studio had already been installed before we started our current roles,” says Chris, “and before that, we didn’t have any broadcast engineering experience or a background in outfitting a studio. We’ve just developed that knowledge and understanding over time.”

But the staff have a rule of thumb they use whenever they’re deciding on any new equipment that comes in.

“Be it portable cameras, an audio controller, or anything else that’s going to ultimately get installed, the first requirement is that it’s got to be what we call ‘student friendly.’”

That ‘rule’ has helped them make equipment choices that allow students to grasp production concepts and create their projects a lot more quickly.

“We’re not training students to be broadcast engineers, so we don’t need to delve into in-depth lessons about how signals are routed or how equipment is cabled,” Chris says. “It’s got to be something that they can easily pick up. And that was one of the biggest selling points for going to NewTek.”

Gaining NDI Experience Points

The technical staff designed a workflow around NDI®, NewTek’s encoding technology for delivering frame-accurate video over IP, and circumvented conventional video routers altogether.

Using everyday, standard 1Gb IT connectivity, the new facility can run 13 high-frame-rate, high-res gaming workstations over the network simultaneously, with NDI Scan Converter software making each gamer’s PC available as a video source.

This allows productions to be configured for two teams of 6 players, plus a connected “spectator mode” workstation to determine which in-game feeds will be used. Game consoles such as Nintendo Switch or PlayStation can be added to the network via NewTek Spark Pro, converting HDMI video devices to NDI sources.
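
For readers curious what this looks like on the software side, the short sketch below uses the NDI SDK (the same technology underlying Scan Converter and Spark Pro) to list every NDI source visible on the local network, which is how each gamer’s PC shows up to the vision mixer. It is a minimal, hedged illustration using standard SDK calls, not part of the University’s actual installation.

```cpp
// Minimal sketch: enumerate NDI sources on the local network with the NDI SDK.
// Assumes the NDI SDK (Processing.NDI.Lib.h) and runtime are installed.
#include <cstdio>
#include <cstdint>
#include <Processing.NDI.Lib.h>

int main()
{
    if (!NDIlib_initialize()) return 1;                 // load the NDI runtime

    // Create a finder with default settings (searches the local network).
    NDIlib_find_instance_t finder = NDIlib_find_create_v2(nullptr);
    if (!finder) return 1;

    // Give sources (Scan Converter PCs, Spark Pro units, cameras) time to announce themselves.
    NDIlib_find_wait_for_sources(finder, 5000 /* ms */);

    uint32_t num_sources = 0;
    const NDIlib_source_t* sources = NDIlib_find_get_current_sources(finder, &num_sources);

    for (uint32_t i = 0; i < num_sources; ++i)
        std::printf("Found NDI source: %s\n", sources[i].p_ndi_name);

    NDIlib_find_destroy(finder);
    NDIlib_destroy();
    return 0;
}
```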

Vision mixing for all gaming PCs and consoles (plus graphics and audio) is centralised in the 44-input NewTek VMC1, which, with its companion 2-stripe Control Panel, is “student friendly” enough to be operated without ongoing staff support.

In addition to the networked inputs from the computerised sources, the facility has 3 studio cameras as well as a ceiling-mounted PTZ camera, all with their 4K signals converted to SDI and then brought into the VMC1 via a pair of NewTek NC1 I/O modules, so that they’re available for mixing and monitoring over IP as well. (Only the VMC1 has a 10Gb network connection.)

A dedicated switch isolates the esports hub from the University’s administrative network, keeping additional traffic from bogging it down. All in all, says Chris, the project involved far more IT connectivity and networking than he and his teammates had implemented before.
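
To see why the gaming PCs can live on ordinary 1Gb ports while the VMC1 alone gets a 10Gb connection, the back-of-the-envelope estimate below assumes roughly 125 Mbit/s for a full-bandwidth 1080p60 NDI stream (an assumed typical figure; real rates vary with resolution, frame rate and content). A single stream fits easily on a 1Gb port, but thirteen of them converging on one mixer would not.

```cpp
// Back-of-the-envelope NDI bandwidth estimate. The per-stream figure is an
// illustrative assumption, not a measurement from the Staffordshire facility.
#include <cstdio>

int main()
{
    const double mbps_per_stream  = 125.0;    // assumed full-bandwidth NDI 1080p60 stream, Mbit/s
    const int    num_workstations = 13;       // gaming PCs published via NDI Scan Converter
    const double link_1g          = 1000.0;   // Mbit/s
    const double link_10g         = 10000.0;  // Mbit/s

    const double aggregate = mbps_per_stream * num_workstations;  // traffic converging on the mixer

    std::printf("Per-PC stream : %.0f Mbit/s (%.0f%% of a 1Gb port)\n",
                mbps_per_stream, 100.0 * mbps_per_stream / link_1g);
    std::printf("All %d streams: %.0f Mbit/s (%.0f%% of 1Gb, %.0f%% of 10Gb)\n",
                num_workstations, aggregate,
                100.0 * aggregate / link_1g, 100.0 * aggregate / link_10g);
    return 0;
}
```

Under those assumptions, the mixer is the only point where a 1Gb link would saturate, which is consistent with the VMC1 being the one device given a 10Gb connection.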

“Doing this project really pushed our knowledge forward massively in terms of broadcast technologies and also networking,” he says. “We didn’t have years of extensive knowledge beforehand, but the technology available made us fairly comfortable with going into it and doing it ourselves.”

With traditional workflows, most facilities of a similar scope would require consulting with broadcast engineers and specialists, he says.

“It just speaks volumes about how easy it is to use NDI.”

New Game

September 2018 marked the opening of the programme, which, predictably, filled to capacity when it launched.

“Within months of the students using the hub,” Chris says, “we were already expanding on and developing different elements to incorporate into it.”

For example, the school recently hosted a broadcast event showing the esports varsity (or eVarsity) team playing against another local university.

The students wanted to take the feed from the broadcast studio on one part of the campus to a social venue on the complete other side of the campus, about half a mile away, and at the same time pull a camera feed from that location back into the studio and use it as part of the broadcast.

“With traditional setups we wouldn’t have been able to do that. We just wouldn’t have had the cabling in place to send over the signal, and streaming it would have added a massive delay,” he says. “So NDI has already allowed us to expand and achieve new capabilities that we’d never have done with a broadcast studio.”

He mentions another event coming up that will be completely driven by the students, who are now scheduling their own independent events. Because it will take place over the weekend, they won’t have any technical support from staff.

But they’re already self-sufficient, says Chris. “They’ve got equipment they like to use, they’ve got the training, they’ve asked us any questions they might have had. Now they can effectively do the whole production on their own.”

More info

Austin Peay State University Delivers Hands-On, Educational Production Experiences with Hitachi HDTV Cameras

The Department of Communication at Austin Peay State University (APSU) is proud to offer students real-world, hands-on experience in a full array of television and sports broadcasting disciplines. Building on positive experience with cameras from Hitachi Kokusai Electric America, Ltd. (Hitachi Kokusai) in the university’s athletic arenas, the department has expanded its use of Hitachi cameras into its instructional television studio by purchasing three new Z-HD5000s.

Hitachi Kokusai will exhibit in booth C4409 at the 2019 NAB Show from April 8 to 11.

Located just 45 minutes from Nashville in downtown Clarksville, Tennessee, APSU is a four-year public university with more than 10,000 undergraduate and graduate students. APSU Television (APSU-TV) operates 24 hours per day and features student-produced news, public affairs, sports and special event programming. The APSU Department of Communication also offers the only Sports Broadcasting major in the state of Tennessee, providing students with live production experience on APSU football and basketball broadcasts, as well as video scoreboard productions.

While APSU-produced programming can be seen on outlets ranging from the campus cable channel to the ESPN+ streaming service, the department’s primary focus is on the educational experience.

“Even though our department does productions, they are all embedded within our curriculum,” said Kathy Lee Heuston, professor and interim chair, Department of Communication at APSU. “Even our sports broadcasts are part of the curriculum. We properly train the students, and then they apply those skills on actual productions.”

That emphasis on experiential learning was core to the university’s earlier purchase of six Z-HD5000s from Hitachi Kokusai for its basketball and football venues, the Dunn Center and Fortera Stadium.

“Our objective was to have a professional camera to teach the students on, so they would be prepared when they went out into the real world,” said Lee Heuston. “The Z-HD5000 met all of the requirements while best fitting the budget, making it the first time the department was able to purchase professional-level cameras.”

“While streaming Ohio Valley Conference games on ESPN+ was not a consideration when we bought the Hitachi cameras, it was a huge benefit when we started with ESPN+ that we already had cameras that met ESPN standards,” added Steve Sawyer, video production coordinator at APSU.

When it came time to upgrade the department’s educational television studio, the Hitachi cameras were again a natural fit.

“Our previous studio cameras were not professional-grade and were due for replacement, so we needed to upgrade to make everything function the way we wanted,” explained Lee Heuston. “Adding more Z-HD5000s gave us consistency, so once the students learn how to use them in the studio, they’re immediately able to use them in our other venues.”

While the three new Z-HD5000s are deployed in fixed positions in the studio, the arena cameras are deployed in various combinations of tripod-based, ladder-mount and handheld operation depending on the sport. Sawyer highlights the cameras’ durability as particularly valuable for the rigors of their athletic productions.

“The Z-HD5000s are very rugged,” he said. “We’re constantly setting up and tearing down the cameras, and moving them across campus. We’ve had absolutely no problems – they’re like tanks.”

The visual quality of the Hitachi cameras has also benefited APSU’s productions.

“We use other cameras in certain situations, such as robotic basketball backboard cameras, and there’s just no comparison in quality,” said Sawyer. “When we do need to take those other shots, we use them very sparingly, because you can definitely see the difference.”

The quality improvement compared to APSU’s earlier studio cameras was similarly evident.

“When we had the older cameras in the studio alongside content coming from the Hitachi cameras in the arena, you could see the quality difference,” said Lee Heuston. “Now, with the Z-HD5000s feeding our NewTek TriCasters, we get very clear pictures and the results really look professional.”

Last but certainly not least, Sawyer compliments the intuitive nature of the Z-HD5000s as well-suited to their educational goals.

“The Hitachi cameras are very easy to teach,” he said. “Everything is very well-placed in the layout of the controls.”

Those instructional benefits align nicely with the department’s primary objective.

“We are very fortunate to be able to use the Hitachi cameras in our classroom environment as well as in our productions, to give our students hands-on experience with professional equipment that will benefit them after graduation,” concluded Lee Heuston.

More info

Magewell to Showcase Robust Production and Distribution Solutions “from Start to Finish” at 2019 NAB Show

Magewell continues to expand its innovative offerings into new product categories, bringing its highly-acclaimed quality, reliability and price/performance value deeper into professional media workflows. At the 2019 NAB Show, the company will demonstrate its latest feature-rich tools spanning key steps in the content production chain, from capture and conversion to playout and streaming. Magewell will exhibit in booth SU5724 at the show, taking place April 8-11 in Las Vegas.

Convert

Highlighted among Magewell’s newest products on display will be the Pro Convert family of standalone NDI® converters. Making the transition from traditional video architectures to IP-based workflows more affordable and practical, Pro Convert devices let users bring HDMI or SDI sources into live, IP production networks using NewTek’s popular NDI technology. Available in 4K and 1080p60 models with a choice of input interfaces, the plug-and-play devices automatically detect the input signal format to easily and cost-effectively connect existing equipment into next-generation IP media infrastructures.

For users of Magewell capture products who prefer a software-based conversion solution, the company will also demonstrate its new Magewell Bridge software for NDI. Magewell Bridge allows video and audio from any Magewell ingest device to appear as a live source to any other NDI-enabled software and systems on the network.

Capture

Magewell’s comprehensive array of capture devices continues to grow, enabling end-users, integrators and OEMs to choose the perfect ingest solution for their exact needs. Featured products will include the USB Capture Plus family of plug-and-play, external, USB 3.0 capture devices; the ultra-compact, power-efficient Eco Capture series of M.2 cards; and the company’s flagship Pro Capture line of PCIe cards. The newest release, the Pro Capture Dual SDI 4K Plus, simultaneously captures two channels of 4K video at full 60 frames per second with 4:4:4 chroma sampling over single-link 12G-SDI, dual-link 6G-SDI or quad-link 3G-SDI connections.

Stream

Making live streaming easy even for non-professional users, Magewell’s Ultra Stream HDMI standalone streaming encoder lets them record or stream high-quality video with one click using on-device buttons or an intuitive smartphone app. Users can stream to popular services including YouTube, Facebook Live or Twitch or to a custom-specified server, and can record video to a USB drive or the associated smartphone. The compact device encodes video up to 1080p60 from an HDMI input and also supports 4K sources, down-converting them automatically to HD.

Playout

Flex I/O PCIe input-output cards combine Magewell’s highly-regarded video capture benefits with high-quality playout capabilities. Available with SDI or HDMI interfaces, the first two Flex I/O models each feature four input channels and two outputs, all of which can be used simultaneously with independent resolution, frame rate and processing settings for maximum flexibility.

More info

Magewell Expands Pro Convert NDI Encoder Roster with Fourth Powerful Model

Magewell has unveiled the new Pro Convert SDI Plus standalone NDI® encoder, the fourth entry in its rapidly-expanding Pro Convert family of devices for bringing traditional video signals into IP-based production networks using NewTek’s popular NDI technology. The new model converts SDI-connected HD or 2K input sources into NDI streams with extremely low latency and is available immediately, as are all three previously-announced Pro Convert models.

All four converters will be featured in Magewell’s booth (SU5724) at the upcoming 2019 NAB Show, taking place April 8-11 in Las Vegas.

The quartet of Pro Convert configurations offers content producers a flexible choice of input connectivity and encoding resolutions. The new Pro Convert SDI Plus and recently-introduced Pro Convert HDMI Plus encode source signals into full-bandwidth NDI streams up to 1080p60 HD from 3G-SDI or HDMI interfaces, respectively. The Pro Convert HDMI Plus can also accept a 4Kp60 input, down-converting it automatically to HD for encoding.

Meanwhile, the two flagship products in the Pro Convert series natively support 4K inputs and encoding. The Pro Convert HDMI 4K Plus transforms sources up to 4096×2160 at 60 frames per second through an HDMI 2.0 input interface, while the Pro Convert SDI 4K Plus converts 6G-SDI signals up to 4K at 30 fps into NDI streams.

The Pro Convert family incorporates NewTek’s NDI Embedded SDK to ensure compatibility with the growing ecosystem of NDI-enabled solutions, and wraps its core functionality in a robust feature set. Automatic input format detection and network configuration provide plug-and-play ease of use, converting source video at its native resolution and frame rate by default without any manual setup effort. Loop-through connectivity allows input signals to be sent simultaneously to additional displays or equipment, while an intuitive browser-based interface enables users to control advanced conversion settings and FPGA-based video processing such as up/down/cross-conversion, de-interlacing and image adjustments.

The ultra-compact Pro Convert devices are ideal for both in-studio and portable field use. Value-added features for live production applications include a 1/4″-20 thread for standard camera-mounting accessories, preview and program tally lights, and NDI-based PTZ camera control. The units can be powered by the included AC adapter or via Power over Ethernet (PoE) for further deployment simplicity.

“Our Pro Convert family has gained tremendous momentum, enabling more content producers to easily and cost-effectively make the transition to IP-based live production workflows,” said James Liu, VP of Engineering at Magewell. “We have a history of offering customers a comprehensive choice of input interfaces and video formats so they aren’t forced to buy more than they actually need, and we’re pleased to continue this tradition with the introduction of the Pro Convert SDI Plus.”

More info

Mediaproxy To Show New Exception-Based Monitoring and Content Matching Workflows at NAB 2019

Mediaproxy will be showcasing its updated LogServer software suite for compliance, monitoring and analysis at NAB 2019 on booth SU2802.

That broadcasting has become more complex with the advent of OTT services is an understatement. Playout is no longer the final point of quality control. Going further down the content delivery chain, CDN edge points, targeted ad-insertion, multi-language support, and event-based channels require the expert scrutiny of broadcast engineers. The need to manage a more complex ecosystem with an ever-growing list of logging and compliance requirements has become a priority for content owners and regulators alike.

At this year’s show, Mediaproxy will introduce the concept of exception-based monitoring via IP penalty boxes to enable broadcasters and MSOs working at scale to more efficiently handle quality control and compliance.

For network operators managing multiple stations and playouts, Mediaproxy has introduced a new live source comparison feature for real-time identification of mismatched content. One or more live sources can be compared based on video content, alerting operators immediately when they do not match.

Support for NewTek NDI and SMPTE 2110 media-over-IP formats further enhances the options available for compliance recording and monitoring solutions. By capturing NDI and SMPTE 2110 sources, including metadata, Mediaproxy LogServer allows broadcasters to ensure full compliance during all stages of IP deployment.

Erik Otto, CEO of Mediaproxy, explains, “Reining in control over a large number of OTT streams can be daunting, which is why having a unified system for monitoring compliance and identifying issues across all traditional and OTT playouts is critical. To ensure that the right content is at the right place at the right time, Mediaproxy LogServer enables operators to log and monitor outgoing ABR streams as well as Transport Stream and OTT stream metadata including SCTE triggers, closed captions, and audio information, all from one place. It is the key to surviving the multiformat game of the future and we are delighted to demonstrate our latest advances in this area at NAB 2019.”

More info

Magewell Ships 4K HDMI to NDI® Encoder and Unveils New HD Model

Enabling users to bring traditional HDMI video signals into live, IP-based production and AV infrastructures using NewTek’s NDI® technology, the two converters will be showcased alongside other Magewell innovations in stand 8-G430 at the upcoming ISE 2019 exhibition in Amsterdam.

The latest expansion of the Pro Convert family offers users and systems integrators a flexible choice of input connectivity and encoding resolution. The Pro Convert HDMI 4K Plus transforms sources up to 4K Ultra HD at full 60 frames per second via an HDMI 2.0 input interface, while the newly-unveiled Pro Convert HDMI Plus encodes HDMI source signals into full-bandwidth NDI streams up to 1080p60 HD. The Pro Convert HDMI Plus can also accept a 4Kp60 HDMI input signal, down-converting it automatically to HD for encoding.

A third model, the Pro Convert SDI 4K Plus, converts 6G-SDI signals up to 4K at 30 fps into NDI streams. It is slated to ship early in 2019.

“The ability to connect existing HDMI or SDI equipment investments into IP-based media networks is a key to making the new IP production paradigm affordable and practical,” said Nick Ma, CEO and CTO at Magewell. “Pro Convert devices are designed to help customers effortlessly connect their current sources into NDI workflows, and we are excited to now be shipping the first models. Our flagship 4K models are ideal for users who are currently producing in Ultra HD or plan to in the future, while our 1080p60 models offer a lower-cost alternative for users who only require HD.”

All Pro Convert models feature extremely low latency and are truly plug-and-play, with automatic input format detection and DHCP-based network configuration eliminating the need for manual setup. For users wanting greater control of the conversion process and the powerful features of the Pro Convert devices, an intuitive browser-based interface provides access to status monitoring, advanced settings and FPGA-based video processing including up/down/cross-conversion, de-interlacing and image adjustments.

Power for the ultra-compact Pro Convert devices can be supplied via Power over Ethernet (PoE) or the included power adapter. Loop-through connectivity on each unit allows the input signal to be sent simultaneously to additional displays or equipment without external splitters or routers, enabling sources to be easily used in new IP workflows without disrupting existing video infrastructures.

Value-added features for live production applications include a 1/4″-20 thread for standard camera-mounting accessories, preview and program tally lights, and NDI-based PTZ camera control. An included breakout cable allows extension of the tally lights and PTZ control interface for increased connection flexibility.

More info

Nickelodeon’s Revolutionary VR Experience Turtle-Powered by NDI®

by Brian Leopold

“Hey,” Donnie says. “How are you doing? I hear you’ve got some questions to ask us.” You take another step forward and glance down at your feet. A wide puddle of water spreads across the cracked concrete floor and you catch a glimpse of your reflection. It looks like you, but somehow, your head has morphed into… Hey Arnold. Awesome. You love Hey Arnold.

This is the experience journalists and a few choice super fans were treated to at this year’s Comic-Con Convention in San Diego, thanks to an incredible virtual reality environment cooked up by the talented staff of Nickelodeon Entertainment Lab. And at the heart of the experience, working tirelessly behind-the-scenes to allow layers of disparate technology to interact seamlessly with one another, NewTek’s NDI® was doing the heavy lifting, bringing the new world of Rise of the Teenage Mutant Ninja Turtles to life in a way that’s never been possible before.

Chris Young, Senior Vice President of the Nickelodeon Entertainment Lab, headed up this virtual reality project, designed to create the ultimate PR splash to publicize the reboot of the Teenage Mutant Ninja Turtles franchise. Nickelodeon’s new turtle series Rise of the Teenage Mutant Ninja Turtles is a prequel to the original show, and features the TMNT crew before they became crime fighters. The new show is set in a dense, urban background reminiscent of New York City, along with NYC’s mythical, hidden underground worlds. By creating a virtual press junket, Young was hoping to be able to immerse journalists into the Turtles’ new world.

“At the Nickelodeon Entertainment Lab, we believe the future is rendered in real-time,” Young says, and at 2018’s Comic-Con, he and his crew of engineers and dreamers set out to prove it.

Chris Young is Senior Vice President of Nickelodeon Entertainment Lab, established to take a long-range look at emerging technology and new platforms to deliver content

“We wanted journalists and superfans at Comic-Con to have this unique opportunity to step inside the Turtles’ art-directed world,” Young says. “We wanted them to get a first-hand look at the new show and be able to interview Mikey and Donnie in virtual reality. Our plan was to film the interview using live-action cameras composited with gaming footage in mixed reality. Then, at the end of the interview, we would hand the journalist a thumb drive of their interview with the turtles.”

All of which sounds like a tremendous idea, but how to pull it off? “Well,” Young admits. “The devil is in the details.”

David Gerhardt uses Adobe Character Animator to give life to one of the two TMNT characters taking part in the virtual press interviews. A different animator was responsible for the control of each of the two Turtles

How Did They Do It? NDI, of Course

Fortunately, this is just the sort of challenge that the Nickelodeon Entertainment Lab was created to take on. The Lab was set up a few years ago to experiment with emerging technologies in the hopes of identifying outlets for Nickelodeon’s universe of characters, both present and future. Pulling off a project as ambitious as a virtual press junket proved to be quite a challenge for the Lab, and required bringing together technologies from a number of different disciplines and forcing them to play together nicely. That’s where NewTek’s revolutionary NDI technology came into play. NDI acted as the unifying force in the Lab’s virtual reality project, allowing a wide range of programs and devices to interact with one another.

Using TriCaster allows live editing between animation and live-action angles and iso-recording of all feeds, as well as outputting the program feed to a thumb drive for take-home sharing and iso-editing

Chris Young explains. “It all starts with Adobe Character Animator. We stream that into Unreal Engine (the source-available game engine developed by Epic Games), using NDI technology to get it into the game. So, the person wearing the VR headset in the game is seeing the animated Turtles streaming over NDI in the game. From there, we’re also streaming NDI into live compositing software, where we’re compositing the footage together, both virtual camera shots of the Turtles, and live action footage of the journalists.”

“The journalists are shot on green screen,” Young explains. “And that green screen key is composited into a back plate coming out of the game.”
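
The pattern Young describes, an application publishing its output as a source that any other tool on the network can pick up, is what the NDI SDK’s send API provides. The sketch below is a generic, hedged illustration of that pattern, pushing a single BGRA frame from a placeholder buffer under a hypothetical source name; it is not the Lab’s actual Character Animator or Unreal Engine integration.

```cpp
// Minimal sketch of publishing video as an NDI source (generic illustration,
// not Nickelodeon's pipeline). Assumes the NDI SDK and runtime are installed.
#include <vector>
#include <cstdint>
#include <Processing.NDI.Lib.h>

int main()
{
    if (!NDIlib_initialize()) return 1;

    // Describe the source as it will appear to receivers on the network.
    NDIlib_send_create_t send_desc;
    send_desc.p_ndi_name  = "Demo Character Feed";   // hypothetical source name
    send_desc.p_groups    = nullptr;
    send_desc.clock_video = true;                    // pace sends to the declared frame rate
    send_desc.clock_audio = false;

    NDIlib_send_instance_t sender = NDIlib_send_create(&send_desc);
    if (!sender) return 1;

    // One 1920x1080 BGRA frame standing in for a real render buffer.
    std::vector<uint8_t> pixels(1920 * 1080 * 4, 0);

    NDIlib_video_frame_v2_t frame = {};
    frame.xres                 = 1920;
    frame.yres                 = 1080;
    frame.FourCC               = NDIlib_FourCC_type_BGRA;
    frame.frame_rate_N         = 60000;              // 60 fps
    frame.frame_rate_D         = 1000;
    frame.p_data               = pixels.data();
    frame.line_stride_in_bytes = 1920 * 4;

    NDIlib_send_send_video_v2(sender, &frame);       // a real application would do this every frame

    NDIlib_send_destroy(sender);
    NDIlib_destroy();
    return 0;
}
```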

But that was only the first step in the complicated virtual-reality universe concocted by the fertile minds at the Nickelodeon Entertainment Lab.

A journalist immerses himself in the virtual Turtle world wearing a VR headset in front of a green screen

“All of those signals were then live-streamed back over NDI to our TriCaster® system,” Young tells me. “Using the TriCaster, we were able to live edit between all the animated and live-action camera angles, as well as record iso-feeds of all the different angles, in addition to the program edit. Then, once the interview was over, we were able to throw the program feed on a thumb drive to give to the journalists. From there, they could either share the interview immediately over their social channels, or go back and use the iso-edits to repackage the interview in a way that worked best for telling the bigger story they wanted to tell.”

The same journalist as above interviews the Turtles inside the new show’s virtual reality environment. Each interviewer was given the option of choosing one of several Nickelodeon character heads to cover their virtual reality headsets

Keeping Someone Else’s Head on Straight

Through experience, the staff at the Nickelodeon Entertainment Lab has come to realize that nothing ruins a virtual reality experience faster than seeing yourself wearing a clunky VR headset rig. So, rather than destroy the illusion for journalists, the Lab’s designers came up with an ingenious solution to the headset problem; they gave the journalists new heads. Before the interview, each journalist was allowed to select their favorite character from the Nickelodeon universe of animated characters, and that head was keyed over their own, eliminating those pesky, illusion-destroying VR headsets.

“We thought it would be super-fun for everyone to be a Nick character,” Young says. “So, the idea was to cover the journalist’s head and VR headsets with a Nick character head. Although the journalists chose which character they wanted to be, it wasn’t revealed to them until they looked down at a puddle we put in the ground. So, there would be this great moment when they’d see themselves and say, ‘Wow, I’m Hey Arnold.’ It made for a great interview.”

Bringing The Turtles to Life

Pulling off this live virtual-reality project required a well-rehearsed team, according to Young. It took a crew of nine to bring the new Turtles’ universe to life at Comic-Con, along with a room full of blazing-fast gaming computers and other video gear.

“We had two puppeteers who live-animated the characters of Donnie and Mikey,” Young says. “They worked with the Turtles’ animation show unit to extract animation cycles from the actual show episodes. Those cycles were triggered using MIDI controllers and the two puppeteers were able to puppet live, viewing the output on NDI monitors, as well as a composite feed of the two characters together. That way, each animator knew what they were doing, as well as what the other character was doing. And then, just off to the right of the animators, were actors Josh Brenner and Brandon Mychal Smith, who voice the characters of Donnie and Mikey in the series.”

Since the Nickelodeon crew only had access to the actors for a few hours during Comic-Con, it was essential that the system worked without any problems.

NDI: The Glue Holding Turtle Reality Together

And that’s why the Nickelodeon Entertainment Lab chose NewTek’s NDI technology to meld all these different platforms into a single seamless world. Chris Young has been a fan of NewTek products for many years, and using NDI was a natural extension of that.

“NDI does this amazing thing in a magical way,” Young says. “The great thing about NDI is that you can send it out as a source and pretty much pick it up on any machine and it just works, regardless of format or frame rate. It gives us an amazing amount of flexibility. We also wanted to route so many different sources from so many different systems with so many different requirements for each source into our project, and NDI is an elegant solution for moving video across your local area network.”

Rehearsal Is Key

Once the crew at the Lab came up with their bold plan to combine all these divergent systems and technologies into a single virtual reality experience, one question remained. Would it work? And more importantly, would it work at the convention, with its requirements for rapid set-up and tear-down? In order to find out, the Lab’s engineering staff set up a mock-up of the system in their Burbank studio and began putting the system through its paces.

“We taped the dimensions of our booth out on the studio floor,” Young says. “Then, we set up a bunch of tables and all the computers with the right cable lengths for everything, so we could really understand the set-up. We rehearsed it for a couple of weeks straight to get the protocol down cold. We built the eight-foot-high green screen cube for our journalists. Then, we hauled people out of the hallways at the Lab, anyone we could find, and made them play the role of journalists. In the end, it turned into a mini-broadcast production unit, built around the TriCaster system.” As a result of all that rehearsal, the system performed like a champ at Comic-Con, all thanks to NDI, which Young credits for holding the project together.

“Anytime you do a live event and involve advanced technology and talent, there are so many things that can go wrong,” he says. “So, we were thankful that NDI was super-solid and allowed us to find the rhythm to pull the project off.”

Using NewTek’s Connect Spark to Work Out the Bugs

The crew at the Lab also uses another piece of NewTek gear in their studio in Southern California to help design their virtual reality worlds.

“When the Connect Spark device came out, I literally bought it Day-One off the NewTek website,” Young says. “It solved a major problem for us.”

NewTek’s Connect Spark is a portable IP video encoder that converts a 4K video signal to NDI and delivers it to the network for use with compatible systems, devices, and applications. Many of Nickelodeon’s current projects involve creating large-scale roaming VR experiences, and one of the biggest problems facing game designers in that realm is understanding what the end-users are seeing with an eye toward improving the experience.

“There’s nothing worse than the player telling you, ‘I think that thing over there should move.’ And you’re saying, ‘What thing? I can’t see what you’re talking about.’ So, the minute I saw the Connect Spark, I realized I could put it on a backpack and send wireless video at sixty-frames-per-second back to the designers, so they can get a true view of what the VR players are actually experiencing while they’re playing. It’s absolutely critical to understanding what needs to be changed.”

Real-Time is the Future of Entertainment

When they’re not creating a Turtle-powered world of wonder for journalists, the engineers and designers at the Nickelodeon Entertainment Lab are focused almost exclusively on virtual reality, augmented reality and mixed reality projects. According to Young, real-time entertainment is the wave of the future, which is why Nickelodeon Labs is working so hard to be one of the first riding the wave.

“The world is going to be full of executables and binary files,” he proclaims. “The days of QuickTime and 2-D video will fall away.”

“I think the idea of immersing yourself into your entertainment is an idea that’s coming at us like a runaway train. That’s why we’re trying to understand that world with so much energy and effort.”

And most likely, NewTek’s cutting edge technologies will continue to figure heavily into the Nickelodeon Entertainment Lab’s future plans.

“There are so many projects we have in our queue, and we’re just dying to get to them all,” Young says. “We don’t have enough time in the day to do all the cool things we want to do.”

Read this article on NewTek’s website