[2] As of 2023, professional 4K digital cameras were approximately equal to 35mm film in resolution and dynamic range.
[4] The first practical semiconductor image sensor was the charge-coupled device (CCD),[5] based on MOS capacitor technology.
The Digital High Definition image was transferred to a 35mm negative via an electron beam recorder for theatrical release.
The offline editing (Avid) and the online post and color work (Roland House / da Vinci) were also all digital.
In May 2000, Vidocq, directed by Pitof, began principal photography, shot entirely on a Sony HDW-F900 camera; the film was released in September of the following year.
According to Guinness World Records, Vidocq is the first full-length feature filmed in digital high resolution.
[13] In June 2000, Star Wars: Episode II – Attack of the Clones began principal photography, shot entirely on a Sony HDW-F900 camera, as Lucas had previously announced.
In May 2001, Once Upon a Time in Mexico was also shot on a Sony HDW-F900 camera in 24-frame-per-second high-definition digital video, a format partially developed by George Lucas,[14] after Robert Rodriguez was introduced to the camera at Lucas' Skywalker Ranch facility while editing the sound for Spy Kids.
[21][22][23] Today, cameras from companies like Sony, Panasonic, JVC and Canon offer a variety of choices for shooting high-definition video.
Single-chip cameras designed specifically for the digital cinematography market use one sensor (much like digital still cameras), with dimensions similar to those of a 16 mm or 35 mm film frame or even (as with the Vision 65) a 65 mm film frame.
All formats designed for digital cinematography are progressive scan, and capture usually occurs at the same 24-frames-per-second rate established as the standard for 35mm film.
When distributed in the form of a Digital Cinema Package (DCP), content is letterboxed or pillarboxed as appropriate to fit within one of these container formats.
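As a rough illustration of that fitting step, the sketch below scales a picture to sit inside a DCI 2K container (2048×1080 per the DCI specification) and reports the black bars required. The helper name and the example source resolutions are assumptions for illustration, not part of any real mastering tool.

```python
# Sketch: fitting a picture into a DCI container by letterboxing/pillarboxing.
# The 2K container size (2048x1080) follows the DCI spec; fit_in_container
# and the sample resolutions are illustrative assumptions.

def fit_in_container(src_w, src_h, box_w=2048, box_h=1080):
    """Scale the source to fit inside the container, preserving aspect
    ratio, then report the black bars needed to pad to full container size."""
    scale = min(box_w / src_w, box_h / src_h)
    new_w, new_h = round(src_w * scale), round(src_h * scale)
    pad_x = (box_w - new_w) // 2   # pillarbox bars (left/right)
    pad_y = (box_h - new_h) // 2   # letterbox bars (top/bottom)
    return (new_w, new_h), (pad_x, pad_y)

# A 1.78:1 (16:9) source pillarboxes inside the 1.90:1 DCI 2K container:
size, bars = fit_in_container(1920, 1080)
# size == (1920, 1080), bars == (64, 0)
```

A 2.39:1 "scope" picture instead picks up letterbox bars top and bottom, since it is wider than the container's 1.90:1 aspect ratio.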
During 2009, at least two major Hollywood films, Knowing and District 9, were shot in 4K on the Red One camera, followed by The Social Network in 2010.
This trend has accelerated with increased capacity and reduced cost of non-linear storage solutions such as hard disk drives, optical discs, and solid-state memory.
These files can be easily copied to another storage device, typically to a large RAID (redundant array of independent disks) connected to an editing system.
Once the data has been copied from the on-set media to the storage array, the media are erased and returned to the set for more shooting.
Post-production work not requiring real-time playback performance (typically lettering, subtitling, versioning, and similar visual effects) can be migrated to slightly slower RAID stores.
Short-term archiving, if needed at all, is accomplished by moving the digital files into "slower" RAID arrays (still of either managed or unmanaged type, but with lower performance), where playback capability is poor to non-existent (except via proxy images), but minimal editing and metadata harvesting remain feasible.
Because editing requires decompressing extra frames to reconstruct any single image, inter-frame compression can cause performance problems for editing systems.
If the full frame, called the I-frame, is lost due to a transmission or media error, none of the P-frames or B-frames that reference it can be displayed.
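That dependency chain can be sketched as follows. The frame names and the decode rule are a simplified illustration of a group of pictures (GOP), not a real codec: each P/B frame here merely records which earlier frames it references.

```python
# Sketch of inter-frame dependency in a GOP (group of pictures).
# Frame labels and the decode rule are simplified illustrations, not a
# real codec: P/B frames just record which frames they reference.

def decodable(frames, lost):
    """Return which frames can be displayed after some are lost.
    A frame decodes only if it survives and every frame it references
    also decodes."""
    ok = {}
    for name, refs in frames:          # frames listed in decode order
        ok[name] = name not in lost and all(ok[r] for r in refs)
    return [n for n, good in ok.items() if good]

gop = [("I0", []), ("P1", ["I0"]), ("B2", ["I0", "P1"]), ("P3", ["P1"])]

decodable(gop, lost=set())       # all four frames display
decodable(gop, lost={"I0"})      # nothing displays: every chain leads back to I0
decodable(gop, lost={"P1"})      # only I0 displays; P1's dependants fail too
```

Losing the I-frame wipes out the whole group, whereas losing a later P-frame only takes down the frames that reference it.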
In July 2005, they released the first version of the Digital Cinema System Specification,[33] which encompasses 2K and 4K theatrical projection.
[35] Theater owners initially balked at installing digital projection systems because of high cost and concern over increased technical complexity.
However, new funding models, in which distributors pay a "digital print" fee to theater owners, have helped to alleviate these concerns.
Digital projection also offers increased flexibility in showing trailers and pre-show advertisements, and allows theater owners to more easily move films between screens or change how many screens a film is playing on. In addition, the higher quality of digital projection provides a better experience, helping to attract consumers who can now access high-definition content at home.
This is particularly true in the case of high-end digital cinematography cameras that use a single large Bayer-pattern CMOS sensor.
Generally, with a Bayer-pattern sensor, actual resolution will fall somewhere between the "native" value and half that figure, with different demosaicing algorithms producing different results.
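The arithmetic behind that gap can be sketched as follows. In an RGGB Bayer mosaic, half the photosites sample green and a quarter each sample red and blue; full RGB at every pixel is then interpolated (demosaiced). The function name and the sensor dimensions below are illustrative assumptions.

```python
# Sketch: why a Bayer-pattern sensor's measured resolution falls below its
# photosite count. In an RGGB mosaic, half the photosites sample green and
# a quarter each sample red and blue; bayer_samples and the 4096x2160
# dimensions are illustrative assumptions.

def bayer_samples(width, height):
    """Per-channel photosite counts for an RGGB Bayer mosaic."""
    total = width * height
    return {"green": total // 2, "red": total // 4, "blue": total // 4}

# A "4K" 4096x2160 mosaic:
bayer_samples(4096, 2160)
# {'green': 4423680, 'red': 2211840, 'blue': 2211840}
```

Since only half the photosites measure the luminance-dominant green channel, measured resolution lands somewhere between the native figure and half of it, depending on how well the demosaicing algorithm reconstructs the missing samples.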
Some filmmakers have years of experience achieving their artistic vision using the techniques available in a traditional photochemical workflow, and prefer that finishing/editing process.
[36] As long as the negative does not completely degrade, it will always be possible to recover the images from it in the future, regardless of changes in technology, since all that will be involved is simple photographic reproduction.
"[41] Christopher Nolan has speculated that the film industry's adoption of digital formats has been driven purely by economic factors, as opposed to digital being a superior medium to film: "I think, truthfully, it boils down to the economic interest of manufacturers and [a production] industry that makes more money through change rather than through maintaining the status quo."
Archiving digital material is turning out to be extremely costly, and it creates issues in terms of long-term preservation.