I had a sudden trip to Boston that I needed to work into my schedule, which meant I had to get up early, vote, go to the post office, and then drive fast and furious down to Washington airport to get the 6:00pm JetBlue flight northbound.
As it happened, the flight was 3 hours late, so I had plenty of time. I spend a lot of time “in transit” and I usually let my mind wander; most of the postings in this blog are amalgamations of some form of mind wandering or other.
It was a strangely beautiful day. There was a bit of frost in the morning, which filled the air with the slightest amount of mist and that broke the sunlight into shafts of clean soft brightness against the fall colors.
When people find a photograph striking, it seems as though some look for contrast first, then color, while others look for color first, then contrast. That’s one reason (I believe) why ‘rules’ of composition and photography are B.S.: the catalogue of “photography that does not suck” is a mix of something for everyone. There are no objective rules at all; it’s all personal.
I find this image funny because I was actually thinking of the beautiful fall colors of the mountains, but firing the camera straight at the sun, I knew, would create a high contrast situation where the trees by the road would look very dark. Cameras only record a limited dynamic range, unlike our eyes – which is why some conspiracy theorists think the moon photos are fake: “where are the stars?” Simple: the film’s dynamic range wasn’t wide enough to record them, because the scene was very contrasty. If the camera had been exposing for the star-field, it would have recorded the stars just fine, but the astronauts would have been completely blown out.
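The missing-stars argument comes down to clipping: whatever lands outside the medium’s recordable range after exposure is simply lost. Here’s a minimal sketch of that idea – the `record` helper and the luminance values are made up for illustration, not real photometry:

```python
# Toy model of limited dynamic range: exposure shifts the scene's
# luminances (measured in stops above the noise floor), then the
# film/sensor clips anything outside its recordable range.

def record(scene_stops, exposure_stops, dr_stops=7):
    """Clip scene luminances to a medium with `dr_stops` of range
    after applying an exposure shift (all values in stops)."""
    return [min(max(l + exposure_stops, 0), dr_stops) for l in scene_stops]

# Hypothetical luminances: a sunlit astronaut vs. faint stars.
astronaut, stars = 14, 1

exposed_for_astronaut = record([astronaut, stars], exposure_stops=-7)
exposed_for_stars = record([astronaut, stars], exposure_stops=0)

print(exposed_for_astronaut)  # [7, 0] - stars crushed to the noise floor
print(exposed_for_stars)      # [7, 1] - stars visible, astronaut blown out
```

Expose for the astronaut and the stars fall below the floor; expose for the stars and the astronaut pins at pure white. Our eyes dodge this trade-off; film can’t.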
This photo also amuses me because I know that with a tiny bit of photoshopping, it could look very sinister indeed. Hollywood cinematographers realized a while ago that you could shoot nighttime scenes just fine in daylight by compensating the exposure – adjusting the tonal range at filming time, using a blue filter to simulate the kind of visual tone-stretch our eyes get when they look into shadows (which are full of blue light). Our eyes cheat: they’re multi-modal cameras that record detail in the shadows (rods) and highlights (cones) separately; our brains mash it all together and give us the illusion of a continuous tonal range where one does not really exist.
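The day-for-night trick is easy to sketch: pull the exposure down a couple of stops, then bias the color toward blue. A toy example on a single RGB pixel – the function and the gain numbers are mine, picked for illustration, not any standard grading formula:

```python
# Toy day-for-night grade: underexpose, then cast the result blue,
# roughly the way "shoot in daylight, print it as moonlight" works.

def day_for_night(rgb, stops=-2.0, blue_gain=1.3):
    """Apply exposure compensation (in stops) and a blue color cast
    to an RGB pixel with channels in [0, 1]."""
    scale = 2.0 ** stops                    # -2 stops = quarter the light
    r, g, b = (c * scale for c in rgb)
    return (min(r * 0.8, 1.0),              # pull red down
            min(g * 0.9, 1.0),              # pull green down a little
            min(b * blue_gain, 1.0))        # push blue up, clip at white

daylight_pixel = (0.9, 0.8, 0.7)  # hypothetical sunlit midtone
print(day_for_night(daylight_pixel))
```

The pixel comes out dark and blue-shifted even though it was metered in full sun, which is the whole point of the technique.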
It’s also fascinating to me how our expectations of vintage technology affect our understanding of current technology. When you see an image that has vignetting, for example, that’s a modern technique for simulating the lens “fall off” of great big vintage lenses from the day when we shot wet plates that were about 5ASA in terms of film speed. The “wide open portrait lens” look favored by Hurrell and other Hollywood photographers was also a look that resulted from the slow film (around 100ASA) that was popular at the time, and the smooth grainless look was a result of big negatives, not fine grain. The biggest of these conventions is, of course, black and white. Black and white photography was an accident of history, not an aesthetic choice: photos were black and white because commercially practical color photography had not been invented yet. But we still find black and white modes on modern digital cameras – I use Hipstamatic a lot on my supercomputer-powered iPhone 6, and I get endless amusement out of how it simulates shallow depth of field and slow film with big grain. My fancy Canon 5D mostly sits in its bag, forlorn and unloved. The pictures it produces are a lot of data, and it takes a great deal of time and attention to retouch them, but if I just bang something out with Hipstamatic I can forgive myself for flaws in the image, which means I do a lot more photography.
I forget the model number, but a few years ago I was at a show and Bob Blakley was running around with a digital camera by Pentax (I think it was) that had all the retro looks and styling of a vintage 1960s 35mm camera – except it had an LCD panel on the back that you could use in addition to the viewfinder. The camera had a bunch of “old film modes” that were rendered in real time on the LCD panel; using the camera and looking at the LCD was like looking back into the 1950s, the film emulation was very good. All of this amounts to “making a really great camera deliberately emulate a fairly bad one.”
Back in the day when I was shooting wet plate/ambrotypes, people used to ask me all the time how I got that effect in Photoshop – was there an action pack they could use to get a similar look, with the crazy borders and creamy tones? I finally scanned a few plates and posted high resolution versions of the edges to a few stock photo sites. I sometimes stumble across my plate-edges in book illustrations or covers – supercomputers and massive amounts of memory being used to simulate a very primitive process that nobody in their right mind wants to mess with any more.