VFX Breakdowns


When I was a kid, I wanted to be a special effects wrangler. In fact, the short story “Armaments Race” by Arthur C. Clarke [wik] really appealed to me – I thought that making swords and guns and armor and tanks for movies would be a fine way to spend my life.

When the magazine Cinefex started publishing, I became an avid reader. It was such a great and loving look into what was going on behind the visuals of all the movies I loved. I sometimes bought it, if it was a particularly good issue, but it was pretty expensive, so I shared copies around with a circle of friends who were also into movies. I still have my issue on the special effects for Blade Runner, somewhere on my shelf of rarities.

Of course, computers have totally changed visual effects, and that change is only going to get more profound now that visual effects are no longer a reality problem and have become a software problem. That trend is going to continue, as well, to the point where eventually the impediment will be imagination and our ability to muster and manage artistic details.

I’m going to throw out an idea, here: a good future career would be to produce hyper-realistic models of famous actors/actresses of the past, and begin training up AI motion databases of how the originals moved and walked. The time is not far off when you could make a pretty good living renting out a Marilyn Monroe or Orson Welles (or Tom Cruise) for less than the original would cost.

Another, longer-term idea, which may be rendered moot by CGI, is to go around shooting stock footage of street scenes in various places, with an eye to eventually offering it to movie makers and CGI photogrammetrists as untouched and unused footage of historical times. Imagine what some high-resolution, high-quality film stock of 1950s New York, that hasn’t appeared anywhere else, would be worth. Maybe a lot. Or maybe it would just be a hobby. A few years ago I tried to kickstart an idea to produce stock for a nonexistent movie, by assembling actors and cameras, doing several location shoots, then uploading the whole mess to an indexed digital archive: “just add dialogue!” It turned out that doing that, even with prosumer gear, would have been prohibitively expensive. But a lot changes in 10 years.

The time is already here when actors have to work with a pretend intelligent raccoon in their scenes (Guardians of the Galaxy) and I don’t think it’ll be much longer before there aren’t actors at all for many roles – movies may well be made with entirely virtual characters in virtual settings. Why not? At the bottom of this posting I’ll include a link to a really gorgeous sci-fi short feature that was entirely “photographed” in a game engine: Unreal Engine. And last night Anna and I went to Dune, which – I am pretty sure – was a whole lot of green screen compositing. It looked to me as though almost every scene was a VFX scene, and I experienced no sense at any time that I was seeing special effects. Things have come a long way.

This posting is inspired by some of the commentary on a posting over at Mano’s [ftb], regarding the accidental shooting on the set of Rust. Some of the commenters appeared to be under the illusion that it’s hard to realistically render the recoil of a gun firing, etc. The reality of the matter is that VFX artists have been doing that for years, already. Why was Rust using actual prop guns? Probably because practical effects can still be cheaper than digital VFX – guns don’t kill people, cost-cutting does? If you look at various VFX archives, you’ll see plenty of footage of actors rigged for motion capture (“mocap”) including mocap weapons. How else do you think effects like a giant rocket launcher-wielding raccoon are achieved?

On the set of Titanfall, nobody died that day

Incidentally, VFX are sometimes a good way to protect valuable props. If you’re unfamiliar with the on-set incident: Kurt Russell unknowingly smashed one of the original Martin guitars, made in 1870, in The Hateful Eight, a mediocre and pretentious violence exploitation piece by Quentin Tarantino. [the guitar incident] The Martin Guitar Company, which owned the museum piece, has said it will not be lending out any more guitars to Hollywood.

For what it’s worth, I have a great deal of respect for stunt-people, a probably-vanishing breed of visual effects artists who are being replaced by mocap artists driving digital dummies. [Although I notice that Andy Serkis, the mocap artist who played Gollum in The Lord of the Rings, has managed to segue into actual acting, and good for him] There was a pretty cool documentary about Zoë Bell, who was Lucy Lawless’ stunt double on Xena [double dare] – I found it fascinating to see what they used a stunt double for; some of the scenes are “we need ‘Xena’ to walk down this hall and then jump out of the way of an arrow” – based on practical effects rather than just “jump out of the way and we’ll render in an arrow.” A real arrow doesn’t look as obviously arrow-ish, can’t be given a subtle glow as it’s rendered, and might hurt someone – but it’s cheaper than shipping your scene out to a VFX house that probably has a “TODO” list 20 miles long.

A cloud tank [source]

As I have watched industries rise and fall in my lifetime, I have noticed that creative talent is remarkably survivable in the face of technological change. I mention that only because, as technology rambles ahead, there will be winners and losers, and the losers will be the ones who insist on doing things the old way because they can’t see the creative possibilities of other ways, or simply aren’t creative to begin with. And it’s the pure creative energy of VFX which is why I love the field so much. I remember happy-laughing at the VFX guys making Walking With Dinosaurs rendering water-splashes for a velociraptor running through a creek by running through a creek themselves wearing big wooden velociraptor feet (painted chromakey green) to get the water effects right, and on budget. As we all know here, “on time and on budget” is also an important and valuable part of creativity.

I remember some footage that was shot for October Sky, where the contrail of a rocket ascending into the sky was crucial to a scene. I imagined, naturally, that someone built a rocket, they set up a bunch of cameras, everyone held their breath, and $100,000 was spent in an instant. Instead, the actual effects were done in a “cloud tank” – an old-school Hollywood effect where you have a very clean, very clear tank of very still water, and you drag a hypodermic that’s emitting a bit of white paint as it moves.

Then, there is the brilliance of “Forced Perspective” – another ancient Hollywood trick – in which the camera (since it just records a 2D flat rendering of a scene) is tricked into seeing size and positioning that is completely wrong. In one scene of Saving Private Ryan the soldiers are looking down a field at a little town in the distance, but what is actually going on is a bunch of guys dressed as soldiers lying on their bellies looking at a little model of a town, about 3 feet in front of them. I wonder, when movies are all made entirely in 3D/mocap, whether anyone will remember those tricks? I guess they will be irrelevant.
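The trick works because a camera only records angular size: scale an object’s height and its distance from the lens by the same factor and the image is identical. A back-of-the-envelope sketch (the specific numbers are illustrative, not from the actual shoot):

```python
import math

def angular_size_deg(height_ft: float, distance_ft: float) -> float:
    """Angle subtended at the lens by an object of the given height
    at the given distance (both in feet)."""
    return math.degrees(2 * math.atan(height_ft / (2 * distance_ft)))

# A 3 ft model town placed 3 ft in front of the camera...
model = angular_size_deg(3, 3)
# ...fills exactly the same slice of the frame as a 300 ft town 300 ft away.
real = angular_size_deg(300, 300)

print(round(model, 2), round(real, 2))  # → 53.13 53.13
```

As long as the lens is stopped down far enough that both planes stay in focus, the flat image gives no clue which version was photographed.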

One of the channels I love on youtube is focused on VFX, and does a really good job of combining in-film visuals with the VFX used to generate them. There are lots of VFX channels on youtube, including a few that specialize in low-budget short VFX-heavy pieces, such as Freddie Wong’s Rocketjump [youtube] – Freddie has migrated from being a hobbyist to a professional, because he’s a highly creative person. The current generation of video editing tools is good enough to liberate the creative desktop, and you can make pretty cool stuff for hundreds of dollars instead of hundreds of thousands of dollars.

This is one example of the FameFocus channel’s many VFX breakdowns:

See why Keanu doesn’t need to wear a helmet? Just once I want Hollywood to shoot a motorcycle scene that doesn’t amount to “10,000 ways to get killed on a motorcycle.”

And, as I mentioned earlier, Irradiated is a film made entirely in a game engine.

The folks who make Unreal Engine have also produced a human face-rendering program called MetaHuman, which is probably worth a posting in its own right. Want that digital Marilyn Monroe? MetaHuman can do it. Add some mocap acting and she can sing “Happy birthday, mister president” better than the original, who was not famous for her chanteuse chops. One other side-effect of all this delicious VFX is that it may finally break down people’s tendency to believe what they see. No, republicans, there are no giant intelligent rocket launcher-wielding raccoons in antifa.

One thing we get right, in our decadent culture, is entertainment. Shame about the other stuff.

Comments

  1. says

    I didn’t write about it in June, but we just passed the 25th anniversary of “Lone Star”, a film made for US$3 million. It goes to show that a good or great story and acting can carry even the cheapest film, and special effects can’t make a bad film good. Effects only have value if they facilitate telling the story, and aren’t the story themselves. Shakespeare was right: the play’s the thing.

    https://en.wikipedia.org/wiki/Lone_Star_(1996_film)

    I was recently roped into watching a “horror movie” and to say it was tedious was an understatement. I rolled my eyes so often the person I was watching with thought I was falling asleep. Who cares that it made $200 million at the gate?

  2. sonofrojblake says

    “losers will be the ones who insist on doing things the old way”

    Phil Tippett invented go-motion. Then he saw what the computers could do for Jurassic Park, and famously said “I’m extinct”. Then he invented the Dino Input Device, which allowed CGI animators to puppeteer CG dinos. Phil Tippett is a creative winner.

  3. sarah00 says

    Just an FYI – Andy Serkis was an actor long before he was helping revolutionise motion capture. His IMDb credits go back to 1989.

  4. xohjoh2n says

    During the SciFi Farscape run they used to have these little commentary clips in the commercial breaks – presumably because the US breaks are longer than the UK ones so we needed some extra filler to pad out the hour – I can’t find an online copy anywhere but in one Ben Browder was talking about how valued the various members of production crew were and the increasing role of automation/CGI then said something like “…but everyone wants to get rid of the actors, because we’re a pain in the ass. But I don’t think audiences undervalue the actors…”

  5. says

    sarah00@#5:
    Just an FYI – Andy Serkis was an actor long before he was helping revolutionise motion capture. His IMDb credits go back to 1989.

    I did not know that! Thanks for the correction.

  6. consciousness razor says

    Some of the commenters appeared to be under the illusion that it’s hard to realistically render the recoil of a gun firing, etc. The reality of the matter is that VFX artists have been doing that for years, already.

    Same with film and TV music (and a lot of other music). I think many people have no idea how much is done with software these days. And it’s really fascinating to see how VFX has changed at the same time. I’d say that more people are at least aware of VFX and know that it’s pretty much ubiquitous now (although they may not recognize it when they see it), whereas the very realistic synth music out there still flies under the radar for the most part.

    With music, it’s not as if live recordings are going away or something. I can’t imagine that happening. So that’ll be important in some form or another — not always larger chunks of audio but at least samples and the like. But of course, even regular recordings like that are enhanced or processed in all sorts of ways. So, don’t be under the illusion that you’re hearing a raw, untouched recording that hasn’t been manipulated at all. That’s just not how it works, because that kind of stuff never leaves the mixing room.

    Sometimes, people will use a pretty light touch, without doing much to it, so it sounds natural or whatever…. But I feel like maybe photographers understand the point pretty well, that almost everything about the process means that you’re not getting something like “the real image that you would’ve seen with your own eyes.” That’s just … not photography.

    Anyway, if you think you’re hearing “real” percussion or strings or whatever, there’s a pretty good chance that it’s a VFX/CGI/animated type of thing instead. Or that’s often a big part of it at any rate, which might be mixed in with a recording to get the best of both worlds.

    You’ve been fooled in the past, I guarantee it — maybe for upwards of 20 years at this point. And that’s okay. (By the way, for those who seriously think it’s not “real music,” then first of all, fuck you. Also, you should’ve told us to get off your lawn a few decades ago, because it’s just too late for that now.)

    The easiest way to tell? Maybe just ask yourself this: do they really have the budget to hire a bunch of musicians (possibly hundreds of them) to spend a bunch of time recording this particular material? Much of the time, they don’t, even when it’s a big-budget film and especially when it’s a smaller one, a TV show, a commercial, etc. So: that’s not what they did.

    But I’m old enough (just barely, I tell myself) that it’s still pretty incredible what you can do with nothing but a PC. It’s just fucking unbelievable, even when I’m the one doing it. And it’s pretty wild that (when it’s done well) almost nobody can tell the difference between that and “the real thing.” You still could sometimes, if you’re really going to sit down and analyze things to death, but probably not if you’re just listening like a normal person.

  7. Pierce R. Butler says

    The Darfsteller:

    “The Darfsteller” is a 1955 science fiction novelette by American writer Walter M. Miller, Jr., which won the first Hugo Award for Best Novelette. It was originally published in Astounding Science Fiction of January 1955.

    It is the 21st century story of an old stage actor who has become a theater janitor in order to remain near “show biz”. The theater has been overtaken by robot actors, made to look like humans, which act out plays under the direction of each venue’s central computer. …

  8. Ice Swimmer says

    Re the classic Hollywood stars: Not only is it possible to “fake” visuals with VFX, but machine learning systems can already produce fake speech as well. They demoed this in Finland by creating a fake “announcement” read by a female radio host.

    They taught a voice cloning system her voice and had it convert an announcement, read by a man in a dialect she doesn’t speak, into her voice – thus having her announce, in the rather distinctive Rauma dialect (Rauma giäl), that she was quitting her radio job to become head coach of the ice hockey team in Rauma. It sounded real.

    They did need an hour of pure speech from both her and the Rauma giäl -speaking man (both had read the Finnish translation of Don Quijote), and the whole thing took a few weeks or so to do and cost quite a lot.

  9. says

    I recently renewed my Netflix sub mostly because a friend mentioned that a second season of Love, Death & Robots was up. Both seasons of the anthology are excellent examples of what becomes possible when creativity is let loose, and of how astonishing modern CGI can be.

  10. Daniel Holland says

    While there is obviously a lot of VFX in Dune, including the spaceships, shield effects, and the sandworms, there is actually less green screen than one might expect. According to the lead actor himself, Timothée Chalamet did only 2 scenes in front of a green screen. A lot of the effects, such as the vibrating sand, the ornithopters, or the Baron’s fat body, were practical effects, elaborate sets, and make-up artistry.
