AJA is releasing a couple of new mini-converters, but most of their news comes in the form of software updates. They have new software partners, new unified installers and features for their desktop products, and continued progress on their CION camera, which is now shipping. CION's raw data can now be routed and captured over their 3G-SDI hardware, which opens up some pretty slick new workflow possibilities, albeit in a narrow set of potential use cases.
Across the aisle, Blackmagic Design has a slew of new products on display. There are all sorts of new variations on their URSA camera line, and some smaller cameras as well. There are new versions of Resolve and Fusion, with Resolve gaining multi-camera editing and multi-track audio features. There are new 12G-SDI routers, recorders, and Teranex converters, as well as the DNxIO, their Thunderbolt I/O interface for Avid, with hardware support for encoding H.265 or ProRes. DNxHR will be an option in the future, similar to how Avid's previous hardware was designed.
The hardware encoder in the DNxIO seemed a bit surprising to me until I had some time to think about it. Obviously modern systems have no trouble encoding ProRes or DNxHR without dedicated hardware. Where this will make the most significant impact is on 4Kp60 capability. There isn't enough bandwidth on the 20Gb/s Thunderbolt bus to transport uncompressed 4K data at 60fps. Even at 4:2:2, you can't get 10-bit color into that data rate. That makes pre-compressing the data (into H.265, ProRes, or DNxHR) a requirement between the SDI inputs and the Thunderbolt transfer to the computer itself. AJA did something very similar with ProRes in their Io HD to fit HD into an IEEE 1394b bitstream, and Avid used to fit HD into a FireWire 400 bitstream with their Adrenaline and Mojo interfaces. So it's not a new idea, and I find it interesting to see it repeated every time a larger format is introduced in the industry.
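To put some rough numbers on that bandwidth claim, here is my own back-of-envelope arithmetic, assuming UHD 3840x2160 rather than DCI 4K (which would be about 7% higher):

```python
# Rough data-rate arithmetic for uncompressed 4K at 60fps.
# Assumes UHD resolution; these are my own illustrative numbers.
WIDTH, HEIGHT, FPS = 3840, 2160, 60
pixels_per_sec = WIDTH * HEIGHT * FPS  # 497,664,000 pixels/s

# 4:2:2 10-bit: one 10-bit luma sample plus one 10-bit (shared) chroma
# sample per pixel, so 20 bits per pixel of actual payload.
raw_422_10bit = pixels_per_sec * 20

# In practice, 10-bit samples are usually padded out to 16-bit words in
# host memory, so the actual transfer to the computer is closer to
# 32 bits per pixel.
padded_422 = pixels_per_sec * 32

print(f"4:2:2 10-bit payload:  {raw_422_10bit / 1e9:.1f} Gb/s")
print(f"padded for transfer:   {padded_422 / 1e9:.1f} Gb/s")
```

The raw payload works out to about 10 Gb/s, but once the samples are padded into 16-bit words you are near 16 Gb/s, and my understanding is that Thunderbolt 2's 20Gb/s figure is a nominal link rate that delivers noticeably less than that in real-world PCIe throughput, especially once protocol overhead and any DisplayPort traffic are accounted for.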
NVidia did not have much of a visible presence at the show this year, but they did release a new flagship professional GPU card last week. The Quadro M6000 is based on their Maxwell architecture, and has 12GB of RAM to supply its 3072 CUDA cores. I am looking forward to trying one of these out sometime, hopefully hooked up to a nice 4K display.
Sony has lots of new products, including the HDC-4300, a new 3-chip 4K camera with high frame rates for sports replays, and the PMW-PZ1 4K SxS player on the high end. They were also one of many companies showing off High Dynamic Range displays. After looking at them, I can see where they are going and why, I just haven't wrapped my head around the how. They do make a significant difference in the viewing experience, and I think it is a positive change. But what is entailed in actually producing HDR content? I am going to have to do more research on the specifics, but this is my current understanding. High bit depths are required to effectively store HDR content, but merely having 12 or 14 bits of color data does not mean that an image is HDR. RAW recording is a good starting point for HDR production, and there is a totally different colorspace involved. I need to learn more about the processing steps to prepare HDR images for display. Among other items to note, the new HDMI 2.0a specification released last week added support for HDR, but I am still looking for details on what that really means. One thing it does mean is that we will be hearing a lot more about HDR in the not too distant future.
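As a taste of what "a totally different colorspace" involves: one of the HDR transfer functions being discussed is SMPTE ST 2084, the "PQ" curve, which maps absolute luminance up to 10,000 nits into a normalized signal. A quick sketch (the constants come from the published spec; the function name is my own):

```python
# SMPTE ST 2084 "PQ" inverse EOTF: absolute luminance in nits -> 0..1 signal.
# Constants as defined in the ST 2084 specification.
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def pq_encode(nits: float) -> float:
    """Map absolute luminance (0..10000 cd/m^2) to a normalized PQ code value."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

print(f"100 nits   -> {pq_encode(100):.3f}")    # ~0.508
print(f"10000 nits -> {pq_encode(10000):.3f}")  # 1.000
```

Notice that diffuse white around 100 nits already lands near the middle of the signal range, leaving the entire upper half of the code values for highlights, which helps explain why 10 or 12 bits are needed: 8 bits simply does not leave enough steps across that range to avoid banding.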