You don’t always *need* to shoot in RAW. There, I’ve said it. And by so doing, I have contradicted just about every respected digital photography guru on the planet. And somewhere out there, Jared Polin (Fro Knows Photo) just felt a disturbance in the Force.
You do NOT always NEED to shoot EVERY photo in RAW format (or RAW+JPEG if your camera supports it). I will explain my reasoning below, but first let me mostly agree with the 99% of photographers who say RAW is always required. There really are no downsides to shooting anything and everything in RAW. The upsides are great, especially if you have the time to commit to editing and the storage space to keep the original files. I’ve read in a few places that the current crop of RAW file formats from Canon, Nikon, and others affords approximately 8 full stops of exposure latitude (in the midtones, anyway).
That’s an insane amount of exposure error that you can correct in post production. Not only that, but you also have immense control over color correction, lens aberration correction, sharpening, noise reduction, and half a dozen other nifty levers to push or pull. Some of those parameters can be adjusted with JPEG as well, but there is always a loss of image quality if you go more than a stop or two in either direction. Highlights and shadows are especially difficult to fix with JPEG as compared to RAW.
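To put that latitude in perspective: each stop represents a doubling (or halving) of light, so N stops of latitude corresponds to a 2^N : 1 brightness ratio. A quick sketch of the arithmetic (the exact stop counts above are approximate figures, not guarantees for any particular camera):

```python
# Each exposure "stop" doubles (or halves) the amount of light recorded.
# N stops of latitude therefore spans a 2**N : 1 brightness ratio.
def stops_to_ratio(stops: int) -> int:
    return 2 ** stops

print(stops_to_ratio(2))  # the ~2 stops JPEG tolerates -> 4x
print(stops_to_ratio(8))  # the ~8 stops claimed for RAW -> 256x
```

So the roughly 8 stops attributed to RAW covers a 256:1 range of exposure error, versus about 4:1 for the stop or two you can safely push a JPEG.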
Why not RAW?
So why in the world would you ever want to shoot in JPEG instead of RAW? There is one main reason for me: time. Back when I was working in print publishing and was often assigned to photograph local events in NYC, speed was critical. I had to rush to the event, take my shots, and then hurry back to the editorial office to hand over the files (or sometimes email them if I was far enough away). The images had to be correct in the camera, or at least close enough to be publication-ready. Sure, there were graphic designers to do touch-ups, but anything that required a lot of editing might very well not get published if a deadline was looming. I’ve heard similar stories from fellow photographers who shoot sporting events: it has to be well composed and properly exposed “in camera” or you likely won’t get paid.
I also learned the bulk of my photographic chops during the tail-end of the film era, so there was far more necessity to get it right when shooting, and there was no such thing as “I’ll just fix it in post.” Now that there is far more latitude in editing photos, the pure necessity of getting the exposure right at the time the photo is shot is far less pressing. Nevertheless, I would argue that striving for exposure accuracy on the front end will serve to make any photographer better at their craft (and reduce the amount of time spent staring at a screen instead of being out there shooting that next great image!).
And, of course, RAW files themselves are several times larger than an equivalent JPEG. Storage is cheap, but data transmission isn’t always free. JPEGs are easy to view and transmit on many mobile devices; RAW, not so much (at least at the time of this writing).
So yeah, RAW Most of the Time
Having said all that — and having vigorously shaken my fist at the clouds — I will also say that RAW is a great tool to have. If you have a situation in which there is no room for error (first kiss at a wedding, for example) or if you are in an environment with very tricky lighting that you cannot correct with your own lights, you’d be crazy not to shoot RAW. It won’t fix your soft-focused images, but you can pull a real stinker of an image from the “useless” category back to “pretty good” if you’ve blown the exposure badly.
But Still, You Don’t Need RAW
If you find yourself in need of 6+ stops of exposure correction more than very rarely, you don’t need RAW. You need practice.
Sure, keep your camera set to JPEG+RAW. Just don’t use RAW as a substitute for understanding the fundamentals of photographic theory.