Adventures in Art, Photography & Design
PHOTO | GRAPHIC | IMAGE | ARTS
Five unstructured and unrepentant ruminations on art, vision, signs, symbols, graphic depiction and photographic imagery.
SHADES and SHADOWS
Photography as we know it cannot exist without light. Photography is the process of recording patterns of light in differing wavelengths (colors) and intensities (light-to-dark values) onto some form of recording medium: film or digital sensors (CCD or CMOS). In fact, photography as a process is very similar to the process of recording audio information.
In the vernacular, when we refer to "light" we are referring to what is known as the "visible" part of the electromagnetic spectrum. The majority of the spectrum of electromagnetic radiation is invisible to the human eye—from gamma rays to radio waves. This does not mean that wavelengths we cannot perceive are invisible to the eyes or sensory apparatus of other creatures or to certain types of electromagnetic recording devices. These devices may use special film, filters or digital sensors that are sensitive to wavelengths of light that cannot be seen with the naked human eye. Infrared photography and night vision goggles are good examples, as are x-ray imaging, magnetic resonance imaging and scanning electron microscopes.
The human eye, magnificent as it is, is only sensitive to a very narrow slice of the total spectrum of radiation that abounds in the universe. Everything else is invisible to us.
Wavelengths of light that we cannot see can be captured and then "upsampled" or "downsampled" into colors we can easily differentiate. This technique for "transmuting" the invisible into the visible is commonly used in scientific, astronomical and satellite photography, and is described as "false color imagery". In the case of infrared photography, yellow and red may be arbitrarily "mapped" over warm areas and blue and green mapped over cooler areas. This allows a graphic depiction of heat patterns that can be immediately grasped visually by the human observer. This process is analogous to recording sounds far above the range of human hearing and reducing the pitch of those sounds into a range that we can hear for quick pattern recognition and human analysis. Note that this kind of "graphic depiction" of data is not necessary for mathematical analysis, but is a tool for enhancing human comprehension of otherwise invisible information.
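The mapping described above can be sketched in a few lines of code. This is a minimal, purely illustrative Python sketch of a false-color ramp; the breakpoints and colors are arbitrary choices, not a standard palette, just as the essay notes that the mapping itself is arbitrary.

```python
def false_color(value, vmin=0.0, vmax=1.0):
    """Map a scalar reading (e.g. an infrared intensity) to an RGB triple.

    Cooler values map toward blue/green, warmer values toward yellow/red,
    mirroring the arbitrary "false color" convention described above.
    """
    t = (value - vmin) / (vmax - vmin)  # normalize reading to 0..1
    t = max(0.0, min(1.0, t))           # clamp out-of-range readings
    if t < 0.5:                          # cool half: blue -> green
        f = t / 0.5
        return (0, int(255 * f), int(255 * (1 - f)))
    else:                                # warm half: yellow -> red
        f = (t - 0.5) / 0.5
        return (255, int(255 * (1 - f)), 0)

print(false_color(0.0))  # coldest reading: (0, 0, 255), pure blue
print(false_color(1.0))  # hottest reading: (255, 0, 0), pure red
```

Any other ramp would be equally "valid", which is precisely the point: the colors illustrate the data, they do not reproduce it.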
Put another way, it is a method of taking raw data and converting it into an illustration for human consumption. One can also think of this process as separating the important information from background noise... based on what the presenter considers important or not important. There is no fundamental reality or baseline associated with various forms of graphic depiction. Thus, when it comes to correlation with the original captured dataset, all graphic depictions and representational transmutations should be understood as useful "for purposes of illustration". This does not imply that the graphic depictions themselves may not be aesthetically pleasing or even beautiful. The experience of being able to grasp the invisible can be downright profound.
Is photography itself more "real" than graphic depictions? Possibly, but not in the way you may think.
The Naked Eye
Because photography originated as a way of recording the light reflected off the surface of objects in the "real world", it had a direct correlation with what we might naturally see under similar lighting conditions. A photograph of a particular face or flower superficially looked like that face or flower as seen directly with our eyes. A graphic depiction of the same face or flower may still be recognized in its particularity, but graphic depictions, such as a caricature drawing or a posterized image, need to be "read" (i.e., interpreted) as much as "seen". Looking at photographs seems effortless by comparison, since the photograph "looks" similar to what we would naturally see with the naked eye.
This apparent correlation between what is seen and what the photograph looks like has led to many misunderstandings about the nature of the photographic image vs. the graphic image. In courtrooms, for example, photography is routinely used as evidence, whereas graphic depictions are considered to be "illustrations" to "help clarify" events in the minds of the jury.
Our society has placed photography on a higher level of reality than other kinds of graphic depictions, even higher than eye-witness testimony, which is considered (rightly) to be fallible. Photographic evidence is considered somehow to be infallible.
Yet every professional photographer, indeed every student of Photoshop®, understands just how malleable photography can be. Even the amateurish composite photographs that make their way around as email attachments may require expert analysis to debunk. Notwithstanding obvious clues like light sources and the angles of cast shadows, digital imagery is by nature modular, composed of discrete building blocks (the individual pixels), much like a brick wall, and therefore very easy to modify, modulate and alter to any extent. Film-based photography is harder to alter without leaving a trace, as the structure of the film grain itself creates a continuous pattern that is hard to manipulate without disrupting.
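That modularity can be made concrete with a short sketch (Python here, purely illustrative): a digital image is just an addressable grid of numbers, so any region can be overwritten without disturbing its neighbors or leaving any physical trace.

```python
# A digital image is simply a grid of pixel values, each independently
# addressable -- unlike film grain, which forms a continuous physical pattern.
WIDTH, HEIGHT = 8, 8
image = [[(0, 0, 0)] * WIDTH for _ in range(HEIGHT)]  # an 8x8 black "photo"

# Altering a region is trivial: overwrite a 2x2 block of pixels with white.
for y in range(2):
    for x in range(2):
        image[y][x] = (255, 255, 255)

print(image[0][0])  # the altered pixel: (255, 255, 255)
print(image[7][7])  # a neighboring pixel is untouched: (0, 0, 0)
```

A skilled retoucher blends the seams, of course, but the underlying operation is exactly this simple: replace some numbers with other numbers.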
In other words, the "commonsense" preference for photographs over other types of evidence in the courtroom, never rational, is now obsolete.
Pixels in the Sky (with Diamonds)
In the last 25 years, the movement toward a fully digital media future has largely taken place. Analog broadcast television is history. Landline telephones are on their way out. Who uses fax machines any more? Files are sent via email or Skype®, or uploaded to servers for later download. Business cards, once a "must-have" for business, are now a dying art form, largely replaced by sig blocks, company websites, LinkedIn and Facebook pages. Wi-Fi is on its way to becoming ubiquitous, and the iPad® is the perfect way for photographers and designers to carry around their portfolios and show their work.
In both photography and graphic design, Adobe Photoshop® and Illustrator® are as common today as Ektachrome®, X-Acto® knives and Rapidograph® pens used to be. If you are old enough to have experienced working in the pre-digital age, you may smile at such references, but in all likelihood, you don't miss those times. My wife and I made the full switch to digital photography at the time that all the E-6 labs in Los Angeles were shutting down. Prior to that we had continued to use film for quite a long overlap period, until we were reduced to having to take our film to the UCLA Medical Labs for processing.
Photography, unlike the graphic design world, did not endure a long, awkward digital childhood and adolescence. Early "page layout" programs were horrible, and many designers refused to use them for years. Early versions were not even capable of kerning, and therefore were not remotely suited for professional use. Digital photography had a long period of development, but its development took place hidden away from the general public in very high-end film and video post-production studios. Companies like Aurora®, Dubner®, Quantel® and Via-Video® built very large image processing systems that were so expensive they could only be purchased by big studios or by small, early-adopting, risk-taking shops. I was one of the small risk-takers.
I went into business with my father, Saul Montibon, and our first image processing machine was a Via-Video System One® image processing system. It weighed 1200 lbs in the crate, cost as much as the average American house at the time, and had no hard drive... just two 8" floppy drives. The A Drive was used to boot up the machine and contained the software to run it, the B Drive was used to store images while working on them.
Output was handled through a Honeywell® grayscale printer, which could print a video image of your work onto heat-sensitive paper, and also through a Matrix® film recorder. We ordered our film recorder with three film backs: 35mm, 4x5 and 8x10. The images output to this monster were okay... low-res by today's standards, and the system itself could only display 16 usable colors on-screen at any given time out of a total palette of 4096 colors—16 levels each of red, green and blue. That's better than a box of crayons, but far less than the 16 million-plus colors generated by average computers today. But the colors were decent, and I quickly learned how to optimize the output, and even how to do multiple exposures when we needed more color range in an image. We also had a standard-res black-and-white video camera for "digitizing" images. Quality color scanners did not yet exist (you have to understand, this was the era of black-ink dot matrix printers)! I "built" and optimized a lot of images "by hand"—meaning literally one pixel at a time. This was time-consuming but gave me the most control over the final look of the image.
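The arithmetic behind that 4096-color palette is simple: 16 levels per channel is 4 bits per channel, so a full color fits in 12 bits. The sketch below shows the math; the packing scheme is illustrative, not a claim about how the System One actually laid out its memory.

```python
# 16 levels per channel = 4 bits each; three channels = 12 bits per color.
LEVELS = 16

def pack_rgb444(r, g, b):
    """Pack three 4-bit channel levels (0-15) into one 12-bit color word."""
    assert all(0 <= c < LEVELS for c in (r, g, b))
    return (r << 8) | (g << 4) | b

def unpack_rgb444(word):
    """Recover the (r, g, b) channel levels from a 12-bit color word."""
    return (word >> 8) & 0xF, (word >> 4) & 0xF, word & 0xF

print(LEVELS ** 3)  # 4096 possible colors, vs. 256**3 (16.7 million) today
```

Displaying only 16 of those 4096 colors at once meant choosing a working palette per image, which is why optimizing output, and tricks like multiple exposures, mattered so much.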
Using that image processing system, I was introduced to computer graphics at a very fundamental level: the "do-or-die" level. At this time there were no computer graphics books in your local bookstore, nor were there computer graphics or "new media" programs at your local colleges. If you had a problem, you had to solve it yourself. So, for the first year-and-a-half, every single project I produced involved doing things I had never done before.
In any case, I have many fond (and many not-so-fond) memories of that time. I used that Via-Video machine on a daily basis for over 10 years, even after having access to far more advanced equipment. It never crashed once. That machine, and associated gear, ended up being accepted into the Computing Museum of America in San Diego. It didn't contain any diamonds that I am aware of, but I'm sure the gold and silver used in its innards (yes, the real thing) alone would be worth something today. Unless the current caretakers have melted the thing down, I am confident that I could plug it in today and it would still work. They don't build 'em like they used to.
Clear as Molasses
There was a small group of people in Los Angeles who had similar systems at the time, and we all knew each other, but none of us knew much more about these systems than anyone else did. We had no mentors. We were all just experimenting, pulling our hair out and figuring it out on our own. One of my friends, Bob Bechtold of Bechtold Studios, knew far more about the video side of the equation than I did. My expertise was on the art and design side of the picture. Unlike many early experimenters with computer graphics, who would sketch out ideas on paper and then input the sketches into the computer (generally by digitizing tablet at that time), I decided early on that I would try to work entirely in the digital realm right from the get-go. And with few exceptions (such as a comically ill-fated animated short produced with some friends for the 1984 Olympiad of Animation), I did. Therefore, everything I learned about digital media, I learned by solving problems in the middle of the night... nearly every single night for years on end. It was certainly not a boring time... In fact, most of the time, with my client deadlines, it was absolutely hair-raising.
I used that machine to produce art, design and photographic work. It is largely because of such machines that the world of photography did not have to endure years of feeble software like the graphic design world did. Unlike the music world, which is generally way ahead of the curve where new tools are concerned, the art world was the last to catch on. In a 1986 LA Times interview with me and Paul Shimmel (at that time the Curator of the Newport Harbor Art Museum), Shimmel, in reference to digital media (at the time referred to as computer graphics), made the statement, "It's not the next wave of what we call art". Times change.
The Hi-Fi Eye
Of course, these days, image resolutions are extremely high, color palettes are huge and gamuts are very wide. I routinely work on single-image (non-stitched) files that are 350-500 MB in size. Many photographers working in medium format photography work with files that are far larger. As software becomes more sophisticated, working on such large files requires ever more RAM and CPU power. Thankfully, the cost of digital technology drops each year. My current preferred system (a maxed-out 17" MacBook Pro®) weighs only 5.6 pounds and is thousands of times more powerful than my trusty old Via-Video System One. But without that first primitive, beautiful, tank-like system, I may not be doing what I am doing today.
(I should note that the Via Video System One was not beautiful in any conventional sense, visually or from a product design standpoint. It was a bulky, boxy, beige thing with heavy, boxy black peripherals. It was beautiful because it worked precisely as advertised and never stopped working in all the years I used it. It was the very definition of "industrial strength".)
In the audio world, the term "hi-fi" refers to a system that can reproduce the recorded sound event with a high level of fidelity to the original source material. There is a lot of discussion (much of it heated) in audiophile circles about whether the fidelity should be to a natural sound event (say a live acoustic guitar concert) or to the recording of that event, (in the form of the master recording). This ongoing dispute is similar to the line of thought outlined earlier regarding whether the image has an a priori existence, or whether the "image" is the thing that comes into existence post-capture and post-manipulation. In music recording, the producer and the recording engineer have a massive amount of control over the final "sound" of the recording. Like photographers, they have access to a very large array of processing tools, and can modulate every aspect of the recording. Thus, what ends up on the "master tape", (or master file), may bear little resemblance to what the musicians played live, particularly if the recording was made in a recording studio and not in a concert hall. It may be pitch corrected, mixed, over-dubbed, equalized, compressed and who knows what else.
The simplest way to record music is with a single microphone straight into a recorder with no modulation of the signal whatsoever. This would be analogous to a straight photograph, (i.e., pointing the camera at something and depressing the shutter release button). But, does this necessarily make for the best result? As we shall explore in the next section, even something as simple as altering the microphone placement can change the result as much as changing the position of the camera or the photographed subject in relation to the light source(s). These are things that have to do with human judgement.
In reality, there is nothing as simple as a "pure" snapshot, or a "pure" recording. Even snapshots are framed by the photographer, and often the subjects are instructed to "smile". The closest thing we have to "pure" images are surveillance images, because once certain decisions are made (i.e., camera type, position, etc.), and assuming the camera is fixed and not a movable PTZ IP camera, each subsequent image is shot using the same parameters. But even those "fixed parameters" are modulated by the ambient conditions (day/night environment, street light fixtures, illuminated store signs, headlights from passing cars, suspects wearing hoodies to darken their faces, and so on), not to mention the original equipment choices. A megapixel IP camera produces far sharper, and more useful, images than a standard VGA-resolution analog video camera. All choices, including equipment choices, down to cable types and power supplies, can have a significant impact on the final result.
Never forget that in art, as in life, purity is a dangerous concept. Adolph Hitler was a "purist". Purity is dangerous precisely because it is a concept that rarely, if ever, correlates with anything in the real world. Even deep space is not a pure void, a true vacuum. It is filled with intergalactic dust, charged particles and dark matter. There is no such thing as "pure", empty space any more than there is such a thing as a "pure", unmodulated photograph.
This thought holds enormous consequences for the use of photography as evidence in the courtroom. Even unadulterated photographs can be "skewed" through lighting, camera angles and lens choices to emphasize different things. The photograph can indeed lie.