It started with black and white television on those old tiny CRT bubble screens set into huge wooden free-standing boxes with a speaker across the bottom. The cameras that broadcast a picture to that monster were also beasts. They used vacuum tubes and were the size of a hot water heater.
Next, color television. I was not alive for the transition to color TV, but I can only imagine how exciting that was. We see in color (most of us), not black and white, so when the time came that you could watch your favorite television show in vivid chromatic bliss, I am sure people looked at television in a whole new way. I still smile when I see an old motel sign that says, “we have color TV”.
Fast forward many years and we improve television again, this time by changing the signal altogether. The earlier move from black and white to color did not change the underlying analog foundation of television broadcasting. But about 60 years later, we went “digital”, and this ushered in a new type of television: high definition television. I was there for that transition!
My first HD show was in 1999 at the Newport Jazz Festival in Newport, Rhode Island. The television truck was out of Japan (NHK) and I remember using first-generation Sony HD cameras that required SMPTE fiber. The engineers were so nervous about running glass cable (for fear of failure) that they made the camera guys run two extra cables to every camera! And we had six cameras covering the concert! The show went well, and the HD images simply jumped off the screen in all their glory.
Later, in 2003, New England Sports Network went high def on their coverage of the Boston Red Sox. I was there for that as well! I got a brand new Sony HDC-910 camera with a new Canon 75x HD lens to operate at the high left field camera position. The serial number was 12. The guy running the high first camera across the field from me got serial number 2. I was now able to follow the home run balls out of Fenway Park in 1080i HD. Man, was that viewfinder sharp (and still is today)! I remember shooting a game (just after we made the HD switch) in a light rain, and it was as if the raindrops were in 3D. I could see every single drop. But this was not 3D, this was 2D…
This morning I saw something that truly jumped off the screen. I was invited to take a look at some cutting edge technology. I got to touch the future of television. In fact, when I arrived, I felt like I was in the future.
Just south of Boston, Massachusetts, engineers and camera operators from around the globe are working with six cameras (actually twelve camera heads). These six cameras are stereo 3D high definition cameras, and they will be covering a concert in the third dimension. The Burbank, California company providing the gear is called 3ality Digital.
I am going to do my best to keep this blog as simple as possible, because this technology is very complex. The 3D television process works, but it takes a lot of precision calibration and personnel to get it done correctly. If you want to see bad 3D, watch the 3D Blu-ray version of the “Journey to the Center of the Earth” remake. I thought I had gone cross-eyed after I took off the red/green glasses. 3ality Digital does things much better.
If you want to see good 3D, buy “Polar Express” on Blu-ray. I am really looking forward to “Avatar”.
In layman’s terms, there are two types of 3D technology: the old style that uses red/green glasses, and a newer, better system that uses polarized glasses. Today, I will be talking about the latter; we will not be wearing those ugly red/green glasses. This polarized technology may also someday remove the glasses from the viewer entirely, by incorporating a special polarized glass surface right on the 3D television.
The 3ality Digital cameras are Sony HDC-1500s. I have used them before shooting sports, though we use the older Sony HDC-900 series cameras at Fenway Park. The HDC-1500s are 2/3-inch, 3-CCD, 1080/720-switchable HDTV broadcast cameras. So what, nothing new here. But when you put two of these cameras together, creating a left and a right eye, you have 3D HDTV. It is putting them together correctly that is the really hard part.
There are two basic setups for capturing the world in 3D. Here is how it works:
The first method of setting up a left and right eye in the field is side by side. These two HDC-1500 handheld Sony broadcast cameras are normally mounted on a cameraman’s shoulder or in an expansion sled fitted with a big lens. In the pictures above, the two cameras are mounted onto a special plate designed by 3ality Digital.
The entire weight of this system balances quite well on a Vinten Vector 750 pan head.
This plate is fitted with motors that can push the two cameras together or pull them apart remotely, under computer supervision. Because the cameras can move within the sled, the camera operator is able to zoom into stuff while the computer corrects the convergence of the two cameras, allowing for perfect three-dimensional television. When I say perfect, I mean 3D that looks real and does not give the viewer a headache.
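The geometry behind this is simple enough to sketch. This is my own back-of-the-envelope pinhole-camera illustration, not 3ality Digital's actual algorithm: the on-sensor disparity between the two eyes grows with both focal length and interaxial spacing, which is why zooming in forces the motors to pull the cameras closer together.

```python
def sensor_disparity_mm(interaxial_mm, focal_mm, subject_m, convergence_m):
    """On-sensor horizontal disparity for a point at `subject_m` meters,
    with the rig set for zero parallax at `convergence_m` meters.
    Simple pinhole model; a real rig also corrects lens geometry."""
    to_mm = 1000.0
    return focal_mm * interaxial_mm * (
        1.0 / (subject_m * to_mm) - 1.0 / (convergence_m * to_mm)
    )

def interaxial_for_disparity_mm(target_disp_mm, focal_mm, subject_m, convergence_m):
    """Interaxial spacing that preserves a target disparity after a zoom:
    double the focal length and the cameras must sit twice as close."""
    per_mm_of_interaxial = sensor_disparity_mm(1.0, focal_mm, subject_m, convergence_m)
    return target_disp_mm / per_mm_of_interaxial
```

For example, a 60 mm interaxial at 50 mm focal length gives the same disparity as a 30 mm interaxial at 100 mm, which is the correction the sled motors make as the operator zooms.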
This setup is complex. It is all computer driven, with a video guy and a Stereographer (I will talk about this job later) controlling the images, iris and positioning of the two cameras in the sled. The camera operator sees just the “left” eye in his LCD camera-mounted viewfinder when shooting the action, so it is like operating a single camera in the familiar 2D world. But when the camera guy focuses or zooms at his tripod handles, both cameras perform these actions at the same time, at the same speed, with perfect precision in both the left and right “camera” eyes. Understand? This must happen using computers, servos and gearing so that both cameras act like a single camera.
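The lockstep idea can be sketched in a few lines. This is a hypothetical software illustration of what the real rig does with servos and gearing: one set of operator controls drives both eyes with identical commands, so the lenses can never drift apart.

```python
class Lens:
    """Minimal stand-in for one servo-controlled broadcast lens."""
    def __init__(self):
        self.focal_mm = 10.0
        self.focus_m = 5.0

class GangedRig:
    """One set of operator controls driving both eyes in lockstep."""
    def __init__(self, left, right):
        self.eyes = (left, right)

    def zoom_to(self, focal_mm):
        for lens in self.eyes:
            lens.focal_mm = focal_mm   # identical command to both eyes

    def focus_to(self, distance_m):
        for lens in self.eyes:
            lens.focus_m = distance_m

    def matched(self):
        """True when both eyes agree on zoom and focus."""
        left, right = self.eyes
        return left.focal_mm == right.focal_mm and left.focus_m == right.focus_m
```

Because every command fans out to both lenses, `matched()` stays true no matter what the operator does at the tripod handles.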
The video out of each camera is sent down single-mode fiber, using Telecast boxes plugged directly into the SMPTE fiber connection on the back of each camera. Each camera uses two single-mode fiber strands in a TAC 6 cable loom. The cameras are powered locally with AC power supplies. The motors also need power, and there was a Pelican case hanging off the side with electronics for the dual lens gear setups. I am not certain why they do not just plug into existing (camera and facility I/O) SMPTE fiber and use CCUs to provide power, but I did not get to ask all the questions I wanted to during my visit.
This side-by-side method of shooting 3D seemed the most straightforward to me: simply put two cameras next to each other to get a left and right eye, like a human head. But I found out that this type of 3D works best with cameras set up far away from the action. These side-by-side cameras are good for the wider 3D shots. If you want to get up close and personal with foreground, you must get even more complicated.
The second method of shooting 3D is to set up two cameras in an L-shaped configuration and shoot them through an expensive prism and a $10,000 sheet of special polished, coated glass.
This setup looks crazy, but apparently it is very effective at creating realistic three-dimensional television. This configuration uses two Sony HDC-1500 “t-cameras”. These cameras are not full-sized rigs; they are just the 3-CCD optical block from their handheld camera brothers. Basically, they are a square box with a lens on it. The 3ality guys are using 22x Fujinon HD ENG lenses on all their 3D setups. These types of cameras were developed for remote-controlled heads, like the Cineflex gyro-stabilized helicopter aerial camera mounts. Today, they are being given a new application inside the 3D prism sled because of their light weight. I asked if a full-sized camera could be used in these sleds, and I was told that they can mount RED ONE, Phantom high-speed and F23/F35 digital cinema cameras into these fiberglass, motor-adjusting mounts.
Similar to the side-by-side setups, these L-shaped systems also adjust using sliding plates to keep the two images in the right place as the camera guy covers the action. No need for a screwdriver or Leatherman tool in the field. All of this must be set up beforehand and calibrated. These cameras slide on the plates by remote control, with a computer checking the 3D image convergence between the two cameras.
The lenses have gearing and controls to keep both cameras working like a single camera. There are a bunch of little boxes with multi-pin cables attached to them. The lenses must be matched exactly in f-stop, zoom position and focus location before the gears are applied. The camera operator uses a small HD LCD on-camera monitor to view the action from just the left “eye” camera. The operator does not wear 3D glasses and does not get to see the 3D output at this camera viewfinder.
The control room was set up in a staging area outside the main arena. There were tons of monitors, and a few of them were 42-inch special polarized 3D LCD television screens. They were each set up to show the three-dimensional images coming from the six 3D cameras in the arena. The director would sit down in front of the six screens, wear polarized glasses and direct his cameras as they shoot the concert. During my visit, the monitors were displaying test patterns while the cameras were being set up. Someday, these types of televisions will be available for purchase at Best Buy. This is the next step in consumer television after high definition.
I mentioned the “Stereographer” earlier in this post, and this is a VERY important job. The Directors Guild of America has now recognized this role by that name, so it is the official title of anyone who sets the stereo depth. When the job is performed correctly, you get realistic 3D. When done incorrectly, you get a headache and end up cross-eyed. I was calling these people “stereologists”. There is no such thing! I guess I should bring a pen and paper when I go off to check out cool new tech to write about. I was corrected and I have fixed this blog article.
When I shoot sports, we have a video “shader” in the truck adjusting chroma, white balance, filter changes and opening/closing the iris during the show. I just point the camera and take the pictures. I pan, tilt, focus and zoom. That is it. Other people in the truck make all the cameras match. (I do know how to properly run a camera myself, just so you know…)
Now, thanks to 3D, there is another department needed, in addition to the video shader, to make it all look good: the Stereographer. This “stereo op” (as I will say in my own slang) uses a device that looks like a remote film focus-pull controller. I think it was, but I forgot to ask. This device controls the physical positions of the two cameras in the stereo 3D camera setup. The guy controlling this can spin a knob and slide a lever to move the cameras around in the sled so that the 3D is perfect to the human eye watching with polarized glasses.
So how do they know whether to move the cameras together or apart to make the best 3D experience? They look at these little “lined” screens and watch a little green bar move around. A computer knows how the two images should converge to make the on-screen 3D experience great, so the stereographer just needs to make the adjustments to match the computer readout.
I could compare this to the little “power bar” at the bottom of the screen when you are playing a golf video game. When you keep the little line in the sweet spot, the virtual golfer hits the ball correctly. So I guess these stereo whatever-you-call-thems are basically playing a video game. They must keep the bar green to keep the cameras lined up as the cameraman zooms into stuff, changing his focal length and 3D point of view.
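The “keep the bar in the green” idea can be sketched as a depth-budget check. This is my own illustration, not 3ality Digital's software, and the budget percentages here are invented round numbers: screen parallax is compared against a comfortable range expressed as a fraction of frame width, and anything outside that range flags the stereographer to adjust the rig.

```python
def parallax_status(disparity_px, frame_width_px,
                    max_positive_pct=2.0, max_negative_pct=1.0):
    """Classify screen parallax against a comfort budget given as a
    percentage of frame width (the budget numbers are illustrative).
    Positive parallax pushes objects behind the screen; negative
    parallax pulls them out in front of it."""
    pct = 100.0 * disparity_px / frame_width_px
    if -max_negative_pct <= pct <= max_positive_pct:
        return "green"   # in the sweet spot
    return "red"         # adjust interaxial/convergence
```

On a 1920-pixel-wide frame, a 19-pixel disparity (about 1% of width) would read “green”, while 60 pixels behind the screen or 30 pixels in front would push the bar into the red.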
While walking around, I met a great guy named Ted Kenny. He works for 3ality Digital and really knows his stuff. He told me that the key to good 3D is to make sure you get it right at the acquisition phase. You can fix things in a computer later, but not all the time. It is possible to shift the two images in post to get a better 3D experience, but sometimes the footage was recorded so poorly that it is not fixable. Ted goes with the school of thought of getting it right in the first place. You can then play around with the perfect 3D imagery in post as a creative tool, not as a correction tool. A creative 3D editor could move a person or an item around in 3D space for emotional effect, for example.
This show will not have a 3D line cut; everything will be switched and edited in post production for the final 3D project. There was a ROADIE HD 53-foot television truck parked outside, and I think they were taking the left eye of all six cameras to cut a 2D show, but I did not get that far today to explore it.
The 3D show will be recorded to HDCAM SR. Each of the six Sony SRW-5800 digital video cassette decks pictured above retails for about $120,000! So you are looking at roughly three-quarters of a million dollars’ worth of decks to record the six cameras for this concert in 3D.
Each of the SR decks can record two streams (a left and a right eye) from one camera at 1080/30p 4:2:2 high definition. If they did not have this capability, you would see twelve decks mounted into this rack space to get the job done. When recording two streams like this, the videotape travels very fast over the heads, so whoever is in charge of the cassettes will be slamming new ones into the machines very often.
So, how do you broadcast a 3D high definition signal? You don’t. You broadcast an HD signal like normal. Again, in layman’s terms, the HD signal carries a split-screen image of the left and right eyes (cameras) in one 16:9 picture. This image is later woven together using a piece of hardware that performs multiplexing (a “muxer”, or something like that) to form one doubled, ghosting, blurry image. The image is only blurry to the naked eye: once the viewer puts the polarized glasses on, the picture becomes clear and three-dimensional. With the proper 3D television, of course.
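The split-screen trick above can be sketched with a couple of array operations. This is a simplified illustration of the idea, not any broadcaster's actual pipeline: each eye is squeezed to half width so the pair fits in one normal HD frame, and the receiver splits it back apart.

```python
import numpy as np

def mux_side_by_side(left, right):
    """Pack a left/right frame pair into one frame of the original width
    by horizontally decimating each eye. Taking every other column is a
    crude squeeze; real hardware low-pass filters before downsampling."""
    return np.concatenate([left[:, ::2], right[:, ::2]], axis=1)

def demux_side_by_side(packed):
    """Split a side-by-side frame back into half-width left/right eyes."""
    width = packed.shape[1]
    return packed[:, : width // 2], packed[:, width // 2 :]
```

Each eye ends up with half its horizontal resolution, which is the real-world cost of squeezing two pictures into one standard HD signal.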
And yes, these futuristic 3D televisions have a switch that lets you watch standard, boring, two-dimensional high definition television. I miss black and white.
That is about it. I can’t believe you read all that! I hope most of it is accurate, and I hope to do more work with the California-based 3ality Digital in the future. This company was founded by very creative people. I got to speak to a guy named Steve Schklair. Steve came from a film background. He was a VP at Digital Domain during many films, including “Titanic”. Steve started developing the 3D technology 10 years ago. As a DP/producer, he shot his first movie in high definition way back in 1989. They used the first generation of Sony cameras for this, the Sony 100.
My first HD shoot was ten years later!!!
Steve told me that he has not shot film in quite a while. But he has been shooting. He said he loves using digital and embraces the power of post production when “filming” with digital cinema cameras. This is the future of television and film. Now let’s just double up everything and give you 3D!
A big thanks to Mike Narracci for his involvement with 3D television.