Posts from technology
I haven’t found time to fully digest Ben Thompson’s series on the future of TV, let alone think through the implications for the UK model, but the posts make fascinating reading. I found myself nodding in agreement, frowning as I pondered how the ideas translated (which typically means: how the existence of the BBC distorts things, for good or ill), and occasionally grimacing in disagreement.
Well worth your time, and mine.
[edit: also worth a look is The Verge’s response to Amazon Studios’ first round of TV shows. Spoiler: they think the shows suck. Television, it turns out, is hard. Who knew?]
Learning to light continues to be one of the most fun parts of my work. This tutorial doesn’t do anything beyond basic three-point lighting for a three-camera interview, but it’s weirdly compelling:
I like it partly because it illustrates why making films can be slow. That’s a lot of work, people and gear to make an interviewee and their interviewer look good.
If you’re looking at films you’ve made or commissioned and thinking “we used great cameras, why does it still look rubbish compared to broadcast?” — this is why. Prep, gear and crew all cost money: sometimes they’re worth it.
This piece from Zacuto covers material that should be familiar to most film-makers:
…but the sequence which goes hard light → hard shadow on scrim/Hitchcock gag → using that scrim to turn the same hard light into a soft source is very nicely thought-out. Sometimes demonstrations are about finding the minimal sequence of operations which makes your point.
Now you’re all set to start cutting with the confidence that your clips will sound great from the moment they land in your timeline.
Film-maker Dan McComb has been writing some excellent FCPX tutorials of late, and I particularly liked this piece on audio processing and synchronization for interviews. I don’t go to quite these sorts of lengths myself — in part because I tend not to shoot interviews — but it’s reassuring to know that my overall workflow and tool use is similar. Two notes in particular:
Firstly, Dan advocates cleaning up audio during ingest and edit prep, rather than going back later. I’d absolutely endorse that sort of thinking when working with FCPX. With previous NLEs I’d lock picture and then go through a grade and dub process, but with FCPX I’ll usually do basic corrections prior to cutting and let them ripple down through the edit process.
The result is a much more watchable edit at all stages, and that often leads to smoother client viewings when one’s working with inexperienced clients. One of the reasons I like FCPX is that it imposes essentially no penalty for working this way.
Secondly, I’d only vaguely heard of iZotope RX, and wasn’t aware that it worked within FCPX. Useful.
So Panasonic have surprised everyone by not announcing a replacement for the AF101 at NAB. That sucks. The brave new world of large-sensor cinematography therefore looks like this — and this is very much a personal take:
Canon 5DmkIII. Nice enough, but unless you dismantle the thing and remove the antialiasing filter (yes, people are really doing this) it’s not a vast improvement on the mkII. Better low light, well-controlled aliasing/moiré, decent audio; low resolution, old codec, contaminated HDMI output. My vote: ‘meh.’
Other Canons: just old, now.
Nikon D800. Better in low light than we’d feared, decent audio, clean HDMI, decent codec, FX/DX crop serves as a 1.5x extender; same old horrid aliasing/moiré issues, low resolution. My vote: ‘meh.’
Nikon D4: Love the idea of FX/DX/1:1 crop modes, but by all accounts it’s plain soft.
Older Nikons: bwah-hah-hah-hah! (And I write that as a D7000 shooter).
Panasonic AF101. Proper camera, HD-SDI, XLR audio, etc; dodgy highlight & shadow handling, still no really good run-and-gun lenses. You can get lovely pictures out of the AF101, but it’s edgy to use. My vote: ‘meh, but I don’t regret advising my client to buy one last year.’ They’re extremely good value.
Panasonic GH2. Look, I know nobody takes me seriously on this, but I still think a hacked GH2 is the best bargain this side of £5,000. I may yet put my money where my mouth is.
Sony FS100. Bonkers hunt-the-button design and hilariously dim-witted omission of SDI and ND filters can’t hide a superb sensor. Great choice.
Sony FS700. Like the FS100, but with SDI and ND filters. And 8-sec 240fps 1080p burst recording. WANT. Forget the “4K sometime in the future, once we’ve worked out how to record it and decided how much we’re going to sting you for the privilege”, out of the box this is a great camera. Once you’ve worked out where the buttons are. And stopped remembering that you could have bought two AF101s or a dozen GH2s for the same money.
Canon C300. Lovely camera, but for twice the price of the FS700 I just can’t see it making sense for the likes of me. If I did drama or commercial production, maybe, but I’m a documentary/factual/event guy. The C300 is small and discreet, perhaps… but if that’s your thing, stick with a DSLR and save £10,000. Canon really needs a new codec, too.
RED anything. Yeah. No. Thanks. Different world.
So your decision boils down to ‘Can I afford a proper camera?’: if yes, choose your Sony. If no, I’ll tell you to buy a GH2 and you’ll go straight to whichever Canon costs about what you’re happy to spend. Whatever.
Then Blackmagic do this
Bonkers. Utterly mad. Love it love it love it. They’re insane.
Now, low light looks a bit dodgy to me. The sensor is slightly smaller than m43, about a 2.3x crop by my calculations… which is a bit of a pain. Perhaps. Or maybe very good news for those of us who’ve actually tried focus-pulling APS-C handheld. Battery life is only ~90 minutes, and slapping an extended pack on looks to be tricky given the physical design. Media is relatively costly, sort-of but not really. It’s not exactly clear what they’re doing with lens mounts and aperture control, nor what ‘ZF mount’ means. Does the EF mount power Canon IS lenses? etc etc etc.
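The crop-factor arithmetic is easy to check, by the way. A quick sketch, assuming the widely reported active sensor area of roughly 15.8 × 8.9 mm for the Blackmagic camera (the 36 × 24 mm full-frame and 17.3 × 13.0 mm Micro Four Thirds figures are standard):

```python
import math

def crop_factor(width_mm, height_mm):
    """Crop factor = full-frame diagonal / sensor diagonal."""
    full_frame_diag = math.hypot(36.0, 24.0)   # ~43.27 mm
    return full_frame_diag / math.hypot(width_mm, height_mm)

# Reported Blackmagic Cinema Camera sensor area (assumption): ~15.8 x 8.9 mm
print(round(crop_factor(15.8, 8.9), 2))    # -> 2.39
# Micro Four Thirds (17.3 x 13.0 mm) for comparison
print(round(crop_factor(17.3, 13.0), 2))   # -> 2.0
```

Roughly 2.39x against 2.0x for Micro Four Thirds, which squares with the ‘slightly smaller than m43, about a 2.3x crop’ estimate above.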
But… but… wow, that’s a great idea.
Damn, I really want one of those.
Aaaand an FS700.
I’m one of the few people who shoots video with a Nikon D7000, more-or-less by accident: I bought the thing for stills, and only subsequently discovered that it’s much better for video than its reputation suggests. For the most part its flaws don’t prevent it from capturing lovely images, but it’s sufficiently compromised that one has to take Nikon’s video aspirations with a pinch of salt.
So the Nikon D4 intrigues me, in that if you believe all the blog comments out there it’s become the video DSLR to have within minutes of its launch. It may well be so, but there are plenty of things we don’t know. Chiefly:
- Line skipping, or software downsampling with a low-pass filter? If it’s the former, the D4 is dead in the water for video — as is every other DSLR at this point except the Panasonic GH2 and the Canon 1Dx.
- Exposure metering during Live View/video? There doesn’t seem to be a live histogram, let alone a waveform, but hopefully there’s something. If not, we may be able to work around it using the zebras in a third-party EVF, but… well, sheesh.
- Aperture control in Live View. Still not clear how this works. Is it stepless? Does it really work?
- 20 minute recording time, apparently. If that’s due to European tax laws, well yah boo sucks and break out the Ninja/NanoFlash/whatever. If it’s instead a heat protection issue — how reliable is the camera for long-form recording via HDMI?
- Audio. There’s clearly hope, but is it really any good?
- Low-light performance in video. The D7000 is excellent, but in video it’s clearly not as good as for stills. How does the D4 fare?
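On that first question, it’s worth spelling out why line skipping is so damaging. A toy sketch — plain NumPy, nothing to do with the D4’s actual readout — decimating a fine repeating pattern by taking every third row versus low-pass filtering first:

```python
import numpy as np

# 1080 "sensor rows" carrying the worst case for decimation:
# a stripe pattern that repeats every two rows.
rows = np.tile([0.0, 1.0], 540)

skipped = rows[::3]                      # line skipping: keep every 3rd row
filtered = rows.reshape(-1, 3).mean(1)   # crude box low-pass, then decimate

print(skipped[:6])    # full-contrast alternation survives: aliasing/moire
print(filtered[:6])   # contrast collapses towards the mean: clean but softer
```

The skipped version keeps a full-contrast stripe at a false frequency — that’s the moiré you see on brick walls and fabric — while the filtered version trades fine detail for a clean result. Hence the question matters more than any spec-sheet number.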
Until we have more information on all of these, it’s wholly premature to assert the D4’s supremacy.
Here’s the thing, though: one of these with a 24-120 ƒ/4 VR would cover a huge range of focal lengths; rigged up it could deliver stunning quality. If everything turns out right it could be right up there with dramatically more expensive cameras like the Sony F3 and Canon C300. And I’d absolutely have one over a Panasonic AF100. Chances are Nikon have bottled or flubbed it. But maybe, just maybe, they haven’t.
So yes, it’s an exciting camera. But please, let’s not declare it a winner until we know more. That’s just noise.
One of the things we’re doing with the Ri project is setting up production processes from scratch. The whole shebang, from crewing through post-production and delivery, for an in-house production unit and using external freelancers. The key question when doing this is: how much baggage does one carry across from previous production models?
Part of the answer to that rests in another question: how far can you push production values with tiny crews?
Twenty years ago, nobody was expected to be a camera operator, sound recordist, director, researcher and editor, all at once, from the moment they joined the industry. Today, we absolutely expect production staff to be wielding cameras (thank you Sony for the PD150’s legacy) and cutting their own rushes (hat-tip to Apple for Final Cut Pro). But these are (or were) specialist skills, and training and support needs are big variables when you’re setting up a new web channel.
I’ll be blogging here about the process as we learn how to do this, but one thing is already clear: I can’t begin to say how valuable my broadcast experience was. I learned a huge amount from working on big studio shoots, where a 50-strong crew worked seamlessly. Without that background, we couldn’t have done DemoJam, where we threw most — but not quite all — of the studio process away and got away with something much more sustainable. By the skin of our teeth, granted.
But I’m also profoundly grateful that my first real broadcast gig was with a then-tiny team, making Local Heroes for BBC2. The production company, Screenhouse, consisted in those days of me and the producer/director working from his attic, Adam the presenter in Bristol, a production manager in Harrogate, and another researcher in London, all working from spare rooms, bedrooms, the kitchen table, the sofa, or the local coffee shop. Our fax machines ran hot and our phone bills were huge, but we made transmission.
Nobody told me how ridiculous that setup was; nobody let slip that I was being asked to do impossible things. So I got on and did them.
That, I think, is often the key. Don’t let people know that what they’re doing is impossible. Hire smart people who can learn fast; let them make mistakes; be there to catch them; turn over enough work that you can afford to carry the learning curve.
Bumpy ride? Absolutely. It’s going to be an exciting year.
We’d be remiss not to point to Aleks Krotoski’s Guardian article about storytelling, which draws its chief influence from Frank Rose’s snappily-titled The Art of Immersion: How the Digital Generation is Remaking Hollywood, Madison Avenue and the Way We Tell Stories: Entertainment in a Connected World.
If I’m honest, though, said book remains on my Amazon wish list. I’m due some serious catch-up…
Final Cut Pro X is out, and there’s much wailing and gnashing of teeth. Walter Murch may have walked back to Avid, and there’s nary a hint of Bruce the Wonder Yak. Vocal critics decry the lack of export formats and monitoring support. “This is not,” they say, “a professional tool.”
What’s changed, I think, is what ‘professional’ means. In a post-tape, post-broadcast world FCP X looks elegantly minimalist. Those are the circles in which I now move. If my former broadcast colleagues don’t think this is a ‘professional’ environment, well, I can sympathise. But the world is moving on.
There’s lots of focus on features from previous Final Cut versions that are missing in X. More interesting, I think, is what should be added to X. Which isn’t quite the same thing. My current wish-list is a mix of things that, on reflection, I think should return — and whole new approaches that would have made no sense previously.
More than one clip browser. We were used to having multiple bins open at once, and while the discipline of having just one browser is refreshing, it’s also limiting. Let me spawn more, please. I have a whole spare monitor waiting for them.
Tag-based filters and effects. Let me tag clips on the storyline, and then apply video and audio filters to those tags. For example, I’d tag dialogue as ‘voice1’ and ‘voice2’, and drag preferred EQs to those tags in the storyline index. Instant global EQ; tag groups become sub-mixes and I’ve regained most of what I miss from Soundtrack Pro. To the tag set “EXT, dayfornight” I apply a day-for-night grade. This would make me very happy.
Global filters. In Soundtrack Pro I habitually drop a soft limiter on the master mix; lots of people chuck a broadcast safe on their final video render. Give me a special tag ‘global’ and I can do those within the filter-on-tag interface.
Shared metadata. Lion brings Xsan into OS X for the first time; between that, low-bandwidth codec support, and Thunderbolt storage we have all the tools we need to make shared storage and editing a core feature. This has to be part of the plan, right? — but let’s have it sooner rather than later. Thanks.
Photoshop layers support. Bring that back, please. Make a PSD a compound clip of a 10-second freeze, let me step into it and enable/disable/extract individual layers. Thanks.
Timeline zoom and autoscroll. The storyline doesn’t autoscroll on playback? Tell me that’s a bug. Also: when I zoom in and out on the storyline, I expect the playhead to remain centred. We fought for that in FCP for years, and we were right.
FCP7 import. Tags for bins and we’re done, surely? Premiere Pro can do it; it’s kinda necessary.
Multiclip editing. It’s coming, right? Oh, but how are you going to show me a multitrack preview when there’s only one viewer window? Backed yourself into a corner there, huh?
Subtitle (closed caption) authoring. I’ve yet to find a genuinely pleasant workflow for producing subtitle tracks.
ProRes LT. Project rendering options include vanilla, HQ, and 4444 flavours of ProRes, but not LT. Shame; it’s a great compromise.
Show me the clips I just synchronised. Minor detail: I’ve tagged clips with their take, which makes it easy to select all the material from that take and ‘Synchronise Clips.’ Trouble is, the synced clip ends up back in the Event, named for one of the video clips, and I have a devil of a job finding the damned thing. Since tagging it with the take is likely the next thing I’ll want to do… take me back to the Event and highlight the synced clip, please? Or tag it so it appears in the current collection, maybe?
Note that I’ve not talked about external monitoring, OMF export, YUV grading and all the other ‘professional’ stuff. Just the bits I’d actually use. The more I think about it, the more I think applying filters to tags is the killer here. And that’s interesting because such an approach would have made no sense at all in earlier editing packages. If it looks like progress and smells like progress…
For me, right now, Final Cut Pro X is probably closer to being the editing tool I need than Final Cut Studio 3 was. That’s impressive. But I want it to be clearly better. It’s not. Yet.
[Update 25/6: added Subtitle (closed captioning) to wishlist. I’ll keep updating as I explore more; also added ProRes LT.]
[Update 26/6: added ‘show me the clips I just synchronised’]