The editors' toolkit of the future

Today another editor on my course came into my room sounding rather excited about Microsoft's latest user-interface announcement - multi-touch in Windows 7:

Video: Multi-Touch in Windows 7

This, of course, is a subject we've all discussed before - what if editing could have the user interface of Minority Report?

In the film, Tom Cruise reviews video footage of a crime (one that has yet to happen, but that's irrelevant here) and utilises all manner of time-lapse, zoom and selection tools via a pair of gloves and a projector.

Minority Report screen capture


The potential has always been obvious: to start with, mapping functions on an editing system to certain gestures (in the same way they can be mapped to buttons on a ShuttlePro), with more specialist applications then developed to relate those gestures to particular areas of the screen or room, or to whichever toolset you want to access at the time. With the Wii remote, and now the promised multi-touch, mapping gestures to multiple features is becoming part of regular usage, and the future looks bright. We're getting there. Now we just need to combine them so that a physical connection isn't needed to input data at all.

Another 'future' development that came to my attention recently was in the 20th June issue of Broadcast, the UK TV trade paper. Within a more general section hypothesising about the technology that will be available in 2012 (with a fair number of mentions of stereoscopic techniques, also known as 3D and previously discussed in this blog), there was a mention of the implications of tapeless filming for post-production. If rushes/dailies are recorded straight to disc, could that disc be part of a network which also includes an edit suite? How soon could editing theoretically begin after filming starts? And if an editor is on set, or receiving footage in real time in a remote location, how would that alter their role, as well as those of the director and cinematographer?

As a recent entrant to the post-production business, it's easy for me to think of the current processes as how they've always been. My technological progression has more or less been from Premiere to various versions of Final Cut and Avid. As part of the Editing MA at the NFTS we do two exercises on Steenbecks with 16mm film, and I have a vague recollection of witnessing linear editing at Oxford Road in Manchester during a BBC outreach programme I took part in during my school years. Unless an article or book is specifically referring to the physical techniques involved, the view seems to be that the editor's role and input have stayed largely the same through the conversion to digital, even if the techniques have changed. But as I prepare to start my professional career (with any luck) with the tools that I think will see me through for the time being at least, I do wonder how different the job will look and be in 40 years' time.

Animation progress: Import, Export, Repeat.

Production designer James makes some last-minute adjustments to the set

The second stage of editing on my NFTS graduation animation, in which the live-action background rushes are assembled according to our original animatic, has been completed. The backgrounds from the first shoot were done a few weeks ago and given to the animator as a low-res .mov so that she could start work on the line drawings and get the early stages of the animation going. This finished sequence will now replace that half-finished version in the first layer of her After Effects timeline.

Handles have also been added to a duplicate sequence to be graded (to allow colouring and compositing to start), so that they're available for any extensions to the head or tail of a shot should they turn out to be necessary - even though we timed everything as closely as possible during the animatic phase. The film was shot with this in mind, so the sequence with handles that will be graded is almost twice as long as the animatic and the final film; it's the same number of shots, though, so it shouldn't be a major additional burden on the grading. Notes will be made on the EDL recording which frames (a multiple of 10, for easy trimming) at the start and end of each shot are the handle.

We'll be exporting a full-res (1920 x 1080) TARGA sequence from the online suite, because the school edit suites only have Sony HVR-M15Es and we're not linked to any of the higher-res decks, and once I've removed the handles I'll be giving my animator a TARGA sequence (plus any additional frames as necessary)... so it seems to make sense. To us, and that's possibly what counts the most. It's all trial and error really - the way we're going about it may not be the most efficient, but given the resources we have, plus the need for the animator and editor to have immediate access to the graded handles once the grader's left the job (he's a former student of the school - grading a grad project at this point in the year is a tricky proposition), it's what we've come up with.
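(For anyone curious what removing the handles actually involves, here's a rough sketch of how it could be scripted - not the tool we're actually using; the folder layout, the naming and the 10-frame handle length are just illustrative assumptions.)

    import shutil
    from pathlib import Path

    # Hypothetical layout: each shot is a folder of numbered TARGA frames
    # (e.g. graded_targa/shot_01/frame_0001.tga ...) with a 10-frame handle
    # at the head and tail, as noted on the EDL.
    HANDLE = 10  # frames at each end, kept as a multiple of 10 for easy trimming

    def trim_handles(shot_dir: Path, out_dir: Path, handle: int = HANDLE) -> None:
        frames = sorted(shot_dir.glob("*.tga"))
        keep = frames[handle:len(frames) - handle]  # drop the head and tail handles
        out_dir.mkdir(parents=True, exist_ok=True)
        for frame in keep:
            shutil.copy2(frame, out_dir / frame.name)

    if __name__ == "__main__":
        graded = Path("graded_targa")         # graded shots from the online suite
        delivery = Path("animator_delivery")  # what the animator actually receives
        for shot in sorted(graded.iterdir()):
            if shot.is_dir():
                trim_handles(shot, delivery / shot.name)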

Screening of work in progress

Inspired by a comment by beowulf.grimbly:

As part of the film school process, we constantly have reviews of the film while we're working on the edit, so that the tutors can advise when something doesn't seem to be working out and ask the right sorts of questions about how necessary certain scenes are, or about the ordering of the ones we have, and so on. But at the start it was difficult to get used to (despite having been part of the selection process for admittance to the school - 11 people took a five-day course there for the 6 places). Screenings, no matter how late in the edit, were inevitably accompanied by some form of disclaimer: how there hadn't been time to do one bit, how it wasn't yet close to what we were actually going for because we hadn't had time to physically and/or technically achieve it, or - once we started learning to work with directors and needed our own time to find the film - how it wasn't quite in line with the director's vision.

But for the most part we've learnt to get past that now. Whether it's the self-confidence of knowing it's not the final edit (and, now that we've had a bit of experience, hopefully being able to persuade anyone who asks of that fact), or that we've grown used to the process, or just that it's too valuable while at film school not to get every ounce of opinion you can on your film (even if the suggestions proffered aren't ultimately taken up and a different solution is tried), it's something I hadn't really noticed until working with the composer on the short I've been cutting for the last three weeks. Until now they've been used to working primarily with picture locks, but we really wanted to see how much some original music could set the tone and move the film on a bit, so we brought it in fairly early in the edit. And all of the old discussions about screening rough cuts came back to me - back when we used to work on the same rushes for exercises and had screenings every few days so that we could see what everyone else was doing with the same material. Funnily enough, though we may have 'borrowed' ideas from another cut, we still never ended up with even two films that were vaguely alike. Seeing the different stories you could tell by choosing different shots at different moments was possibly one of the most pivotal moments of my first term at film school, and if I'd been hiding behind my seat from the shame of having to show my unfinished work to other people, I think I'd have missed a lot. And I suppose that will always hold true while you still consider yourself to be learning a craft - i.e. for as long as you continue to do it.

What I guess I've really learnt from the experience is how to make the most of the early days - to make a proper rough representation of how the film will be, with the majority of appropriate shots in the right places. Early edits can be fairly demoralising when cuts don't seem to flow or characters aren't really coming alive. But finding the bits which aren't working can really help on the way to getting a respectable cut, especially when you get that first onscreen insight into what makes your character tick. And if it's all been a struggle and things still aren't working, an outside perspective on the world you think you've been trying to create can really help to plant a new idea in there. For now, I'm trying not to discount anything which may ultimately be of benefit to the film, and the comments made always help to remind me what I should be looking out for as an editor.

Vantage Point

Earlier today I went to see Vantage Point. I'd already earmarked the film from the previews as something worth seeing from an editing point of view, and off the top of my head I couldn't think of a better example to show people what editing is.

Certain elements of plot are given away below, so don't read if you're spoiler-sensitive.

3D - a technological breakthrough or major threat to filmmaking as we know it?

Picture this:

Ever since you first got into music, you've been waiting for one of the world's biggest bands, U2, to play a live gig that you can get to. Finally, they announce their tour. You buy your ticket, you take some holiday away from work, you drive down to the venue, you queue for three days. When they finally open the gates, you run to the front barrier as fast as you can. This is your chance to be the closest you will ever be to the greatest band in your world. You get to the front... and find that just in front of the barrier you're heading towards is a rather large camera crane. With TWO cameras mounted, and a larger-than-normal crew. All in between you and the stage.

The reason? U2 3D. Now in cinemas nationwide.

Last Friday, 22nd February, I was able to go through the NFTS to a masterclass at the Cineworld on Shaftesbury Avenue (currently showing the film), hosted by the UK Film Council, with a panel of people who had been involved in the film and its technology from the start.

Things have come on a long way from the red/blue glasses and headaches usually associated with 3D. Polarisation has been around for a while now, but the ultimate application they're heading for is home viewing of sports without the need for eyewear at all (via hi-res lenticular screens... apparently already working on a small scale). But they decided to build the technology on the most complex thing they could think of: a nine-camera-position (18 cameras altogether, since two cameras have to work together as human eyes would to produce a natural 3D effect), multiple-venue, 14-song live music tour of South America. In natural lighting. We didn't actually get to see any of the final film because of time constraints, but you start to get an idea of the scale of the project when you learn that they were in post for a year (including R&D).

They actually did the basic edit in 2D on an Avid, but had to think differently from the outset. For a start, fast cuts of the type usually associated with live concert footage were right out if the film was to be released in 3D. In their place, layering effects were used, and cuts were made only when the drama of the shot naturally took you to a different angle. Balancing the depth of the 3D between one shot and the next was vital in order to avoid rapid eye fatigue from constant refocusing, and recreation of shots where the cameras weren't working perfectly together (through tape changes, lens issues, foreground objects, or any number of other reasons) had to be done to the pixel. An IMAX grade had to be performed separately because of the need to print onto film rather than distribute digitally - and the 3D had to be imagined at that scale too when deciding how extensive it should be.

In this film the 3D wasn't used as a gimmick - it was a means of immersing the audience within a scene, rather than relying on things flying out of the screen at them. And that seems fairly appropriate. But the claim put forward that this is as big a technological step as going from silent films to talkies seems a little far-fetched. DreamWorks' plan to go entirely 3D from 2009, as opposed to adding 3D elements to a finished 2D film, sounds exciting, but certain film genres will never lend themselves to 3D. The limit on the speed of cuts will surely be a major sticking point in narrative film, and it seems possible to me that it will push a bit too hard on the suspension of experience which allows editing to work in the first place - watching a 2D image jump across to the other side of a room isn't intuitive to the human eye or to personal experience, yet it still works. To effectively place someone in a room, then have them jump around within it, then to another scene in a different location, then back to that room... physically it seems disruptive, psychologically invasive and voyeuristic, and generally uncomfortable. You learn fairly early on that there are more important aspects of a cut than continuity on the screen or within the frame - but along this 'z' axis out of the screen, continuity will be key if people aren't to reject the images because their eyes are constantly refocusing. The 3D elements also point the audience to a very specific part of the overall image - leaving a massive space open for action to be missed, and for trust to be lost if that space is exploited. Once that trust is lost, we may as well all give up.

It sounds like a great tool, and it’s clear that a massive amount of thought, time and skill has gone into the development even in these relatively early days - but as things stand, I don’t see a major place for it in the non-specialised filmmaking process.

http://www.reald.com/
http://www.ukfilmcouncil.org.uk/13641

Dead is the king.

Our Richard III exercise is over. It was pretty enlightening, and we had some great tutors - namely Alex Mackie and Roger Crittenden. They were totally supportive, whilst pointing out possible weaknesses and parts which just didn't really flow - right up to the very last moment. Literally. On the morning of the slightly flexible 12 noon deadline, the first part of my section (part two of six) was running ABCDEF. By 12.45 it had been exported for joining up to the rest as ACBEDF, via a few different permutations, including the attempted removal of a scene which I was glad had stayed in when I saw all the parts together. Slightly nerve-wracking, especially as I was trimming the five new scene transitions that the reorganisation created (A to C, C to B, B to E, E to D and D to F) right up to the last possible minute.

Still, the result cleared up a major plot point which had never really come across as well as it could have. The screenplay had already reorganised Will Shakespeare's scenes (logical in the theatre, potentially section after section in modern-day film terms), so I can't feel too bad about my last-minute shuffling. My most recent documentary edit used scene rearrangement from a very early stage, but this is the first time I've extensively reshaped fiction in this way - our short films at the school don't lend themselves to much of that sort of thing. But having seen how effective it was, my mind feels blown open for future edits on all projects.

You really can read all of the books that you want on the theory of editing - but you just can't learn how to edit from them. Because editing has to be instinctive, it has to be natural, you have to feel it… and even the most poetic instruction manual is still an instruction manual.

Job satisfaction

There's something vaguely depressing in studying editing (and indeed working as an editor), in that the only cuts people will tend to comment on are the bad ones. Most of the job is making the entire film look as if it flows naturally: sentences running into one another (even if spoken days apart during filming); a character's actions and reactions making logical sense as the cause and effect of another's; giving moments of significance the exact emphasis they need without signalling far and wide "HEY, LOOK AT THIS" (though this can be unavoidable with a learned audience who know the tricks, and all you can hope for is to be subtle even on that level); turning 360-degree camera angles around a table in a dinner scene into a conversation where everyone's looking where they should be, irrespective of the fact that half of the actors may have gone off to the lunchtime grazing tables; giving all necessary information without dragging it out in the telling... The editing should stay invisible, in much the same way as the majority of editors seem content to sit back and watch as the cinematographers are credited for the length and timing of shots*, the director and writer for the storytelling, and the actors for the sound effects added in post.

So really, the greatest compliment you can receive at the end of a scene is a comment on the story itself - an "oooh, that's not going to last long" in reference to a newly formed relationship between two key characters, or an "oh my god, he's mad" after a key incident (assuming, of course, that that was the impression you intended to convey), is praise of the highest order. It's also an opportunity to discuss those strange people, with all their issues, whom you've been getting to know recently. And just being able to do that makes any other potential gripes about the job disappear.

*This does happen. Though like the directors and writers, they’re also often blamed when it doesn’t work.