Video Codecs Explained for Filmmakers: RAW vs. LOG vs. Compressed and When to Use Each
The Codec Decision That Wrecked the Edit
A DP on a 10-day indie feature chooses ProRes 4444 for every internal camera recording because it's the highest-quality option the camera offers. The shoot produces 14 TB of footage. The post house's editing workstation has a 12 TB RAID array. The editor opens the project, lays in the first assembly cut, and the system begins dropping frames. The drives are fast enough for ProRes 4444 at 1080p -- but not at 4K. Every external drive brought in as overflow is formatted differently. Three weeks into the edit, the project is on its fourth drive configuration and nobody can find the original camera card backups.
The codec wasn't wrong. The decision to use that codec without accounting for the post infrastructure was. A different codec choice -- say, ProRes 422 HQ -- would have cut storage requirements by roughly a third with no visible quality penalty at the delivery resolution.
This post explains how codecs actually work, what the technical specifications mean in practice, and how to choose a codec that fits your camera, your post workflow, and your delivery target.
Codec data referenced in this guide comes from published manufacturer documentation from Apple, Blackmagic Design, Sony, and ARRI, as well as the Society of Motion Picture and Television Engineers (SMPTE) standards for digital cinema.
How Video Codecs Actually Work
A codec (short for coder-decoder) is an algorithm that compresses video data for storage and decompresses it for playback or editing. Three technical axes determine what a codec can and can't do for you: compression type, bit depth, and chroma subsampling.
Intraframe vs. interframe compression is the first decision axis. Intraframe codecs (also called I-frame-only codecs) compress each frame independently. ProRes, DNxHD, and BRAW all use intraframe compression. Every frame is a complete image, which means you can cut to any frame without decoding surrounding frames first -- critical for editing. Interframe codecs (H.264, H.265, XAVC-S) compress sequences of frames by storing only the differences between them. This produces dramatically smaller files but makes editing slower because the decoder must reconstruct each frame from its neighbors.
Bit depth describes how many tonal values the codec can represent per color channel. An 8-bit codec can describe 256 levels per channel (256 x 256 x 256 = 16.7 million colors). A 10-bit codec can describe 1,024 levels per channel (over 1 billion colors). The difference matters most in color grading: an 8-bit grade on a scene with subtle gradients will produce visible banding in skies and skin tones. A 10-bit grade on the same material holds smoothly through aggressive corrections.
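The arithmetic is easy to verify yourself. A quick sketch:

```python
# Tonal levels per channel and total representable colors at each bit depth.
for bits in (8, 10, 12):
    levels = 2 ** bits      # levels per color channel
    colors = levels ** 3    # three channels: R, G, B
    print(f"{bits}-bit: {levels:,} levels/channel, {colors:,} colors")
```

Running it confirms the figures above: 256 levels and 16.7 million colors at 8-bit, 1,024 levels and just over 1 billion colors at 10-bit.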
Chroma subsampling describes the ratio of color information to luminance information. The human eye resolves luma (brightness) at higher resolution than chroma (color), so codecs discard color information to reduce file size. The notation 4:2:2 means full luminance resolution, half horizontal chroma resolution, and half vertical chroma resolution. 4:2:0 halves horizontal chroma and then halves it again vertically. 4:4:4 preserves full color at every pixel -- used for VFX work and green screen. For most narrative work, 4:2:2 is the minimum for professional delivery; 4:2:0 is acceptable for acquisition on some cameras but should be avoided if you're doing any compositing.
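Counting samples in a 2x2 pixel block makes the savings concrete; a sketch assuming equal-sized, uncompressed samples:

```python
# Chroma samples per plane (Cb or Cr) in a 2x2 pixel block for each scheme.
SCHEMES = {"4:4:4": 4, "4:2:2": 2, "4:2:0": 1}

full = 4 + 2 * SCHEMES["4:4:4"]  # 4 luma + 4 Cb + 4 Cr = 12 samples
for name, chroma in SCHEMES.items():
    total = 4 + 2 * chroma       # 4 luma samples + Cb + Cr samples
    print(f"{name}: {total} samples per 2x2 block ({total / full:.0%} of 4:4:4)")
```

Before any other compression is applied, 4:2:2 carries two-thirds of the data of 4:4:4, and 4:2:0 carries half.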
LOG gamma curves are a separate concept from codec choice, but they're often confused. LOG is a picture profile that compresses the camera sensor's dynamic range into the codec's tonal range by recording a flat, desaturated image. The codec stores the LOG data -- you then apply a LUT (look-up table) in post to restore contrast and color. ARRI Log C, Sony S-Log3, Blackmagic Film, and RED Log3G10 are all LOG curves. Shooting LOG extends dynamic range but requires a color grading step. If a LOG clip ships without grading, it looks washed out. This is a workflow decision, not a codec decision -- but it affects which codec you should choose, because LOG material benefits more from higher bit depth.
The Three Codec Categories
| Category | Examples | Bit Depth | Chroma | Typical Bitrate | 1-Hour Storage |
|---|---|---|---|---|---|
| RAW | ARRIRAW, R3D, BRAW | 12-16 bit | 4:4:4 | 220-900 Mbps | 99-405 GB |
| Intermediate / Intraframe | ProRes 4444, ProRes 422 HQ, DNxHR 444 | 10-12 bit | 4:2:2 or 4:4:4 | 110-500 Mbps | 50-225 GB |
| Delivery / Interframe | H.265 (HEVC), H.264, XAVC-S | 8-10 bit | 4:2:0 or 4:2:2 | 25-200 Mbps | 11-90 GB |
The most important takeaway from this table: a 1-hour RAW recording at 220 Mbps requires nearly nine times the storage of the same duration in H.265 at 25 Mbps. That gap multiplies across a 12-day shoot.
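The per-hour column in the table is pure arithmetic from the bitrate. A minimal helper, assuming decimal units (1 GB = 1,000 MB):

```python
def gb_per_hour(mbps: float) -> float:
    """Storage for one hour of footage at a bitrate given in megabits/second."""
    return mbps / 8 * 3600 / 1000  # Mbps -> MB/s -> MB/hour -> GB/hour

# Reproduce the table's endpoints.
for label, mbps in [("RAW low", 220), ("RAW high", 900),
                    ("H.265 low", 25), ("H.265 high", 200)]:
    print(f"{label}: {gb_per_hour(mbps):.0f} GB/hour")
```

220 Mbps works out to 99 GB/hour and 25 Mbps to roughly 11 GB/hour -- the endpoints of the RAW and Delivery rows.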
Three Real-World Codec Decisions
Example 1: Narrative Feature, ARRI ALEXA 35
A 15-day narrative feature shooting on the ARRI ALEXA 35 with a DIT on set. The DP wants maximum dynamic range and the director is planning extensive color grading to match three different locations.
Codec chosen: ARRIRAW (4.6K, 4:4:4, 12-bit), averaging 310 Mbps.
Storage required: at 310 Mbps, one hour of footage is roughly 140 GB. Fifteen shoot days at a 12:1 shooting ratio yields on the order of 150 hours of material -- approximately 21 TB of raw camera data, before 3-2-1 backup copies.
Decision made: The production hired a DIT to manage LTO tape backup on set and rented a 72 TB Thunderbolt RAID for the edit. Total storage cost was $3,400 for the shoot -- justified because ARRIRAW preserves the sensor's full 17-stop dynamic range for the grade.
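Example 1's storage math generalizes into a small calculator; a sketch assuming decimal units and total recorded footage hours as the input (the 21 TB figure corresponds to roughly 150 hours at 310 Mbps):

```python
def shoot_storage_tb(mbps: float, footage_hours: float, copies: int = 1) -> float:
    """Total shoot storage in TB for a bitrate, footage total, and copy count."""
    gb_per_hour = mbps / 8 * 3600 / 1000  # Mbps -> GB per footage hour
    return gb_per_hour * footage_hours * copies / 1000

print(f"camera originals: {shoot_storage_tb(310, 150):.1f} TB")
print(f"with three 3-2-1 copies: {shoot_storage_tb(310, 150, copies=3):.1f} TB")
```

Three copies of the ARRIRAW originals land near 63 TB -- consistent with the 72 TB RAID the production rented rather than something sized to the camera originals alone.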
Example 2: Corporate Documentary, Sony FX6
A two-person corporate documentary crew shooting on a Sony FX6. The client wants 4K delivery to YouTube and LinkedIn. No dedicated editor -- the DP edits their own footage on a MacBook Pro.
Codec chosen: XAVC-L 4K at 150 Mbps (4:2:0, 10-bit) -- the FX6 records XAVC-I or XAVC-L, not XAVC-S. Long-GOP interframe H.264, and 10-bit H.264 has little hardware decode support, so smooth editing on a MacBook Pro means proxy transcoding.
Result: A 4-day shoot produced 1.2 TB of footage -- manageable on two 2 TB SSDs. The DP transcoded to ProRes 422 Proxy in DaVinci Resolve before editing, then reconnected original XAVC files for the final export.
Decision made: The DP added two hours of proxy transcode time to the post budget. Without that step, playback on the laptop dropped below real-time on 4K XAVC at full resolution.
Example 3: Music Video, Blackmagic URSA Mini Pro 12K
A music video director renting a Blackmagic URSA Mini Pro 12K for a one-day shoot. Delivery target is 4K streaming. The colorist works in DaVinci Resolve on a Mac Studio.
Codec chosen: Blackmagic RAW (BRAW) at 8:1 compression, 12-bit -- a data rate of several hundred megabytes per second at 12K, depending on frame rate.
Storage required: A 10-hour shoot day at a 20:1 shooting ratio (common for music videos) produced roughly 2.8 TB of BRAW files.
Decision made: BRAW at 8:1 gave the colorist the sensor's full 14-stop dynamic range while keeping the day's footage to roughly three 1 TB CFast 2.0 cards. BRAW decodes natively in Resolve with no proxy workflow, so the edit and grade happened in the same session.
Step-by-Step: Choosing a Codec for Your Project
Step 1: Identify your delivery target. If you're delivering DCP for theatrical, you need a codec that can be transcoded to JPEG 2000 without quality loss -- meaning intermediate codecs (ProRes, DNx) or RAW. If you're delivering H.264 or H.265 for streaming, you have more flexibility at the acquisition stage.
Step 2: Audit your post infrastructure. Check the read speed of your editing drive (in MB/s) and compare it to the codec's data rate. ProRes 422 HQ at 4K runs at approximately 708 Mbps (88 MB/s). An SSD running at 500 MB/s handles that easily. A USB 3.0 spinning hard drive running at 100 MB/s does not.
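That comparison can be automated before the shoot; a rough sketch, where the 50% headroom factor is an assumption to cover seeks and filesystem overhead:

```python
def drive_can_sustain(drive_mb_s: float, codec_mbps: float,
                      streams: int = 1, headroom: float = 0.5) -> bool:
    """Rough check: can a drive sustain real-time playback of N codec streams?"""
    required_mb_s = codec_mbps / 8 * streams  # convert megabits to megabytes
    return required_mb_s <= drive_mb_s * headroom

print(drive_can_sustain(500, 708))  # 500 MB/s SSD, one ProRes 422 HQ 4K stream
print(drive_can_sustain(100, 708))  # 100 MB/s USB 3.0 spinning disk
```

The SSD passes and the spinning disk fails, matching the comparison above. Multicam timelines multiply `streams`, which is where even fast drives run out.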
Step 3: Decide whether you'll shoot LOG or a baked-in look. If you're shooting LOG, choose a codec with at least 10-bit depth and 4:2:2 chroma. 8-bit 4:2:0 LOG is possible but limits grading headroom significantly. If not, 8-bit 4:2:0 at a high enough bitrate (at least 100 Mbps) is acceptable for delivery-grade content.
Step 4: Calculate total storage including backup copies. Use the Codec Storage Calculator to enter your codec bitrate, shooting days, and shooting ratio and get a total storage requirement. Budget for at least two copies of every card (camera original plus one backup) before the footage reaches the edit drive.
Step 5: Match codec to camera card capacity. If your camera uses CFexpress Type B cards (typically 512 GB to 1 TB), verify that your chosen codec doesn't force card swaps mid-scene. At ARRIRAW 4.6K (310 Mbps), a 512 GB card lasts approximately 220 minutes; at the multi-hundred-MB/s data rates of high-resolution RAW formats like 12K BRAW, the same card can fill in under half an hour. Plan card capacity before you rent.
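Card duration is the same unit conversion in reverse; a sketch assuming decimal card capacities:

```python
def card_minutes(card_gb: float, mbps: float) -> float:
    """Record time in minutes for a card of `card_gb` at a bitrate of `mbps`."""
    mb_per_second = mbps / 8
    return card_gb * 1000 / mb_per_second / 60

print(f"512 GB at 310 Mbps: {card_minutes(512, 310):.0f} minutes")
```

Run this for every codec and card size you're considering before the rental order goes in.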
Step 6: Confirm your editor's workstation can decode the codec in real time. H.265 decoding is CPU-intensive. If the editor's machine doesn't have hardware H.265 decode (available on Apple Silicon, recent NVIDIA GPUs, and Intel CPUs since roughly the 7th generation), you'll need to proxy. Plan the proxy workflow before the shoot, not after.
Pro Tips and Common Mistakes
Pro Tip: When shooting with interframe codecs (H.264, H.265, XAVC-S), always enable your camera's highest-bitrate mode for acquisition, even if your delivery target is lower resolution. A 4K H.265 file at 150 Mbps transcodes to a cleaner 1080p deliverable than a 1080p file shot at 50 Mbps, because the higher-bitrate source retains more chroma and luminance detail through the transcode.
Pro Tip: DNxHR and ProRes are functionally equivalent for most post workflows. The difference is ecosystem: ProRes is native on Mac and in Final Cut Pro; DNxHR is native on Windows and in Avid Media Composer. If your post house uses Avid, deliver DNxHR interchange files. If they use Resolve or Final Cut, ProRes is the expected format. Using the wrong intermediate codec won't break the project, but it adds an unnecessary transcode step.
Pro Tip: BRAW and R3D (RED RAW) are camera-proprietary RAW formats that decode directly in DaVinci Resolve without a separate RAW processing step. ARRIRAW is also supported in Resolve, but outside it you need ARRI's reference tooling or an application with the ARRIRAW SDK integrated. If you're renting an ARRI camera and your colorist works outside Resolve, confirm their software can decode ARRIRAW before the shoot.
Pro Tip: For run-and-gun documentary work where card management is a constraint, ProRes 422 (not HQ) at 4K represents a solid middle ground: 470 Mbps bitrate, 10-bit 4:2:2, intraframe editing, and approximately 210 GB per hour. It's grading-friendly without the card-management overhead of RAW.
Common Mistake: Shooting 4:2:0 footage and then attempting green screen compositing in post. 4:2:0 chroma subsampling reduces the color information needed to generate a clean key. The matte edges will be soft and you'll spend hours rotoscoping around problems that 4:2:2 acquisition would have avoided. If green screen is on the schedule, choose a camera and codec that offers at least 4:2:2 -- or shoot RAW.
Common Mistake: Choosing a codec based on what's "best" in absolute terms without considering the full chain from camera to delivery. ARRIRAW is exceptional -- but if your editor's machine can't decode it without proxy transcoding, and your colorist doesn't have an ARRI RAW license, and your DCP house charges extra to ingest ARRIRAW, the "best" codec has just added three weeks and $1,500 to your post schedule.
Frequently Asked Questions
What's the difference between a codec and a container format?
A container (also called a wrapper) is the file format that holds the encoded video data -- MOV, MXF, MP4, MKV. A codec is the algorithm that compresses the video inside that container. ProRes video can be wrapped in either a MOV or MXF container. H.264 video can live in an MP4, MOV, or MKV file. The container determines compatibility with software and delivery systems; the codec determines image quality and file size. Some streaming platforms require a specific container even if they accept multiple codecs -- Netflix requires MXF for mastering deliverables, for example.
Why do some cameras record in H.264 internally but output ProRes externally?
Most camera sensors generate more data than internal compression captures. External recorders (like the Atomos Shogun or Blackmagic Video Assist) receive a cleaner, higher-bandwidth signal via HDMI or SDI and encode it to ProRes or DNxHR on their own faster storage. The result is better quality than the internal card recording. This is why a Sony FX3 recording XAVC-S internally at 200 Mbps can simultaneously feed an Atomos capturing ProRes 422 HQ at roughly 700 Mbps -- the same sensor data, encoded two different ways at two different quality levels.
When does bit depth actually matter?
Bit depth matters most when you're grading aggressively, compositing, or working with LOG footage. For a run-and-gun wedding or corporate video shot with minimal grading, the visible difference between 8-bit and 10-bit is small. For a narrative feature where a colorist is matching three different locations, pushing exposure corrections of 2+ stops, and working with sky gradients, 10-bit is the minimum professional standard. The jump from 10-bit to 12-bit adds value primarily in VFX pipelines where multiple compositing operations accumulate rounding errors.
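The grading-headroom argument can be made concrete with a toy model: quantize a dim region of the signal, push it up in the grade, and count how many distinct code values survive. A sketch (the 2-stop push and the bottom-quarter capture region are illustrative assumptions):

```python
def levels_after_push(bits: int, stops: float = 2.0) -> int:
    """Distinct code values left after stretching a dim region to full range.

    Models a signal that occupied only 1/2**stops of the range at capture:
    the grade spreads it across the full output range, but the quantized
    steps -- and therefore the banding risk -- were fixed at capture time.
    """
    max_code = 2 ** bits - 1
    captured_fraction = 1 / 2 ** stops
    return int(max_code * captured_fraction) + 1

print(levels_after_push(8))   # 8-bit source
print(levels_after_push(10))  # 10-bit source
```

An 8-bit source leaves 64 levels to cover the full output range after a 2-stop push -- coarse enough to band in a sky gradient -- while a 10-bit source leaves 256.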
Is H.265 better than H.264 for acquisition?
H.265 (HEVC) compresses approximately twice as efficiently as H.264 at equivalent quality. At a given bitrate, H.265 retains more detail. At a given quality level, H.265 produces a smaller file. The tradeoff is decode complexity: H.265 is more processor-intensive to decode, which means older editing workstations require proxy workflows. For delivery, H.265 is increasingly the standard -- most platforms stream 4K to viewers in HEVC, and YouTube and other services accept it for upload. For acquisition, H.265 is fine if your post machine can handle it; otherwise, shoot H.264 at a higher bitrate.
Can you tell the difference between ProRes 422 and ProRes 4444 in the final delivery?
In a standard theatrical or streaming delivery from a well-shot scene with standard grading, probably not. ProRes 4444 retains the alpha channel (transparency data) and full 4:4:4 chroma sampling, making it essential for compositing and VFX work. For clean production footage that goes through standard color correction and a 4:2:2 delivery encode, ProRes 422 HQ is visually indistinguishable from ProRes 4444 in the output -- but takes roughly a third less storage during acquisition and edit.
What codec should a beginner use for their first short film?
ProRes 422 if your camera supports it (most Blackmagic cameras, some Sony cinema cameras, and any camera paired with an Atomos recorder). H.264 at the highest available bitrate (150 Mbps+) if ProRes is not an option on your camera. 8-bit 4:2:0 is acceptable for anything going straight to YouTube; shoot 10-bit 4:2:2 or better if you're planning a color grade. Avoid shooting anything below 100 Mbps for narrative work -- the compression artifacts will appear in skies, shadow gradients, and fast-moving subjects.
Related Tools
The Codec Storage Calculator lets you enter bitrate, shooting days, ratio, and shooting hours to calculate total raw storage requirements before you rent drives. For understanding how codec choice interacts with dynamic range, the Dynamic Range Comparison Tool shows how different camera sensors perform at various ISO settings. Once you know your codec and shooting ratio, the Storage and Footage Calculator gives you a complete breakdown of card, edit drive, and archive storage needed for the full project. If you're still in the planning stage, the post on how much storage a film production actually needs walks through backup strategy and RAID configuration in detail.
Conclusion
Codec choice is a logistics decision as much as a quality decision. The right codec is the one your camera can write, your drives can sustain, your editor's machine can decode, your colorist can grade, and your delivery platform can accept -- all without introducing a conversion step that adds time or degrades quality. Shooting RAW on a production where nobody has planned for RAW storage, RAW decoding, or RAW delivery is a more expensive mistake than shooting ProRes.
This guide focuses on acquisition and editorial codecs. Delivery encoding -- the H.264 or H.265 file you master out for streaming -- is a separate technical step with its own standards for bitrate ladders and platform-specific requirements.
If you've made a codec decision that caused a real post problem -- a format that your NLE couldn't handle, a card capacity crisis on a shoot day, or a delivery rejection -- what was the codec and what would you do differently?