Talk:Digital cinematography

Major Rewrite
So, I've extensively reorganized, edited or rewritten almost the entire article (with the exception of the Cameras section) over the last two days. It was a slow weekend. I got rid of the POV warnings, because I think I addressed the POV issues. I also added a few new sections, slimmed down the text related to some of the tangents the article was wandering off on, and eliminated some off-topic material entirely.

It's not perfect, but I think it's a much better starting point for the kind of article that should be here than the previous incarnation, which had become practically incoherent in places as digital video supporters and detractors argued back and forth with alternating sentences in the text.

I'd be interested in comments/corrections/suggestions, etc.

--Chris Kenny 05:19, 16 October 2006 (UTC)
 * Thomson Grass Valley refer to themselves and their products at their own site extensively as 'Grass Valley' equipment, 'Grass Valley' cameras, etc. They do not call themselves 'Thomson'. Maybe you better do the same. —Preceding unsigned comment added by 90.5.12.208 (talk) 05:26, 16 January 2008 (UTC)

Overall, you've done quite a good job, but you still don't address what I see as a major failing of this article: over the years the various manufacturers of digital cameras have done a bang-up job of convincing the general public that most of the films they see have been shot with digital cameras, which is simply not the case. Maybe one day, but not right now. I'm thinking of the high school student who has been given the task of writing an assignment on modern film making. As I pointed out in a section that you've now removed, for the movies that the average reader is likely to actually see, the print he or she sees in the cinema is still overwhelmingly likely to have been derived from footage that was initially shot on film, printed from a "dupe" negative that was in turn copied from a master negative that will have been hand-cut and hand-assembled from the original camera negative (albeit using editing information derived from a computerized "off line" editing system as a guide). In other words, the basic process hasn't really changed for a century. The Digital Intermediate process, while it is becoming more common, is still a long way from being the standard process, because it is so expensive. For ordinary films, the traditional approach is still by far the cheapest, as it avoids the (currently very expensive) film scanning and printing processes, which often vastly exceed the cost of the film stock itself.

Although it is true you can avoid the expense of film scanning by shooting with a digital camera, the catch is that at present, if you want a cinema release, you have no option but to use a very expensive digital film printer to produce the printing negative. With film origination you don't necessarily need that step, or if you do, only for the parts that need the digital intermediate process. If the project is only going to be released as HD video, you certainly could avoid that step, but then I'd really like to know what distinguishes that from any other HDTV production!

Elekas 02:24, 21 October 2006 (UTC)


 * Elekas, I don't think that the misconception you talk about is a major failing of this article. It does state that "As of mid-2006 only a small percentage of high-end movie productions have used digital cinema cameras."  I think that one sentence is reasonable enough.  Wikipedia isn't a soapbox... it should present facts and whatnot (i.e. prevailing opinion, where the facts are unclear).


 * As far as DI goes... this article isn't necessarily about it. DI does greatly factor into the costs.  I think the article sort of does cover that, although very few words are given to it.  It does mention that digital is cheaper if a DI is used, and that film isn't as expensive when you have a photochemical finish.  From second-hand information, DI isn't ridiculously expensive... I vaguely remember that you can get a basic one done for $150k (although this figure may be wrong; the CML mailing list has some better figures).

Glennchan 19:24, 21 October 2006 (UTC)

Predictability
I don't think that film is more predictable than video, since video is more or less WYSIWYG whereas film isn't. The problem with film is that dailies can be graded multiple ways.

In terms of monitoring, this is a big issue. For example, the Viper's raw output has a very strong green cast (which can be removed in post production). Some cinematographers do not find this image very useful, and prefer to apply a look-up table to correct the color cast and to apply a print film LUT. This is much the same problem as dailies grading for film, but people get far more worked up over it. With film, you only get one thing from the dailies (and it doesn't look odd like the Viper green cast). With video, people are looking at the "actual" video... and it looks 'wrong'.

[whether to use magenta filter with Viper thread on the CML list] - sort of related: [on set color correction and monitoring thread]

2- We should try to avoid opinion (word it neutrally / talk about both sides' positions) and stick with references. Don't follow me, because I interject my own opinion here. :D

Glennchan 02:16, 17 October 2006 (UTC)

Yet another rarely mentioned advantage of shooting on film! The "rushes" that get projected on set the next morning or whenever are normally just a direct contact positive taken off the camera negative. In a lot of cases what you see on the screen is pretty much what you're going to see in the cinema! No green cast, no colour correction or white balance, etc...


 * Forgive my possible ignorance here, but I thought most films transfer their camera negatives to video.(?) This video is used for rushes and for offline editing.  It is not a direct contact positive that is used for rushes.  As well, a direct contact positive isn't necessarily a good indicator of the final image anyway (for film or video distribution).  Glennchan 19:17, 21 October 2006 (UTC)

Split most of article off into a "Digital vs. film cinematography" article?
OK, this article has been bugging me for months, and I've just figured out why. The problem isn't so much that it's not neutral (though, obviously, as discussed extensively below, it's not). It's that it isn't even the right article. The digital cinematography article should actually be about digital cinematography. It should have information about digital cinema cameras, different formats people shoot in, movies shot digitally (I have some nice links to various interviews with DoPs on digital features), digital cinema workflow, etc.

Practically this entire article is about digital vs. film. Even the parts of the article which ostensibly address some of what I've listed above are permeated with digital vs. film comparisons. For instance, most of the section on formats is taken up by a discussion of whether digital 4K is as good as film 4K. That just doesn't belong there.

If I were drawing up an outline for a digital cinematography article from scratch, I'd probably include the film vs. digital debate as a section... but there's already a large amount of material here on this subject, and more and more just gets added as people argue both sides of every minor point back and forth in the article in alternating paragraphs (or sometimes alternating sentences, which leads to some pretty incoherent stuff).

At this point, I think the best solution is to split most of the content of this article off into a "Digital vs. film cinematography" article, and write a very different article from scratch under the title "Digital cinematography". Because I frankly don't see how we're going to get to a decent "digital cinematography" article by incremental steps starting with what we've got now.

I don't want to do anything this drastic without some discussion, so... what do people think?

--Chris Kenny 00:52, 15 October 2006 (UTC)


 * I agree with what you're saying. It could be organized something like:
 * What digital cinematography is
 * History
 * Cameras
 * Techniques?
 * Issues -i.e. backup, monitoring, dailies / grading, do you use magenta filters on the viper, etc.
 * Digital versus film (with lots of subtopics; quality, pros versus cons, workflow, cost, etc.)


 * Glennchan 02:05, 15 October 2006 (UTC)


 * OK, I'm going to start by doing some serious rearranging. I'll keep this all as one article for now, but I'll move all the film vs. video stuff that isn't in the film vs. video section down there, and then create a structure more like what Glennchan proposes above for the rest of the article. If anyone thinks this is totally out of line, revert it... but like I said, this has been bugging me, and I'm going to try to fix it.
 * --Chris Kenny 03:02, 15 October 2006 (UTC)

I think that would be an excellent idea. However, you must take care to present the facts as they are, and then attempt to explain why they are so.


 * In my opinion, the section titled "Digital vs. film cinematography" ought to be another separate article.
 * N7I2S5 22:40, 7 June 2011 (UTC)

Changes to Viper information
Made by Marker Karahadian, Plus8digital. The previously posted comments regarding recording formats were simply wrong and entirely misleading. Our company provided Vipers for Michael Mann's Collateral and the soon to be released Miami Vice feature. Both were recorded to tape. We are by far the largest source of Viper cameras worldwide, and we did not offer or support any disk-based recording format until April 2005. 66.46.234.2 04:35, 18 May 2006 (UTC)

Definition of a pixel
From this article:


 * In general, when film is scanned the terms "2K" or "4K" specifically mean 2,000 or 4,000 trios of Red, Green, and Blue pixels across the width of the scan, that is, a "2K" scan will actually be made up of about 6,000 pixels.

This does not agree with common definitions of what a pixel is. From pixel:


 * The intensity of each pixel is variable; in color systems, each pixel has typically three or four dimensions of variability such as red, green and blue, or cyan, magenta, yellow and black.

In other words, a pixel (in an RGB display device) is made up of a red, green and blue element. These elements are not pixels themselves. 4K means 4096 RGB pixels.

This non-standard use of terminology leads to an extremely misleading comparison with Bayer filtered cameras later in the same paragraph. While it's true that Bayer sensors don't capture full RGB data at every pixel, the difference is not nearly as large as the article's non-standard definition of a pixel (which would suggest that film scans are ~12000x6000 while 4K Bayer data is ~4000x2000 -- one ninth of the size!) would lead one to believe. A proper examination of this issue would have to examine luminance and chrominance resolution separately and take into account the sensitivities of the human visual system. This comparison should probably be made on the Bayer filter page, and just linked from here.

--ZnU 03:54, 6 September 2006 (UTC)
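For illustration, the two competing "pixel" definitions in this thread can be compared with a quick back-of-the-envelope sketch (this is my own illustration, not from the article; it uses the standard definition of a pixel as one RGB triple and counts the individual R/G/B samples separately under the other reading):

```python
# Sketch of the two "pixel" definitions discussed above. Under the standard
# definition, a pixel is one RGB triple; the non-standard reading in the
# article would count each R, G, and B element as its own "pixel".

def pixel_counts(width, height):
    """Return (rgb_pixels, subpixel_elements) for an RGB image."""
    pixels = width * height
    return pixels, pixels * 3

# A DCI 4K frame (4096x2160) under the standard definition:
pixels, elements = pixel_counts(4096, 2160)
print(pixels)    # 8847360  -- ~8.8 million RGB pixels
print(elements)  # 26542080 -- individual R/G/B samples
```

Counting subpixel elements as pixels inflates the apparent resolution by a factor of three, which is where the misleading film-vs-Bayer comparison comes from.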

I think I'd agree with you. The whole issue involves several important factors, which complicates things. In pragmatic terms, there is no 'true' 4K RGB camera that shoots 4K luminance and chrominance resolution. Super 35mm film doesn't do it (AFAIK)- depending on how you take into account its grain/noise, how you interpret its modulation transfer function, etc. Most practical tests show that film scanned at 4K is only slightly sharper than 2K.

In terms of video, there are no 4K RGB systems available.

If you think about it in terms of truth in advertising... it depends on how you define resolution. If you are after *perfect* resolution, then even HDTV may not qualify as full 1920x1080 resolution- 4:2:2 color subsampling artifacts can create luminance errors (4:2:2 schemes subsample luma and chroma, not luminance and chrominance). See Chroma_Subsampling.

In practical systems, you have issues like:
 * noise/grain
 * optics; prism/3-chip designs imply a certain distance from lens to sensor
 * optical low-pass filtering (OLPF), necessary to prevent aliasing
 * signal processing; doing the inverse of OLPF would increase resolution
 * practical sensor limitations
 * with Bayer designs, the de-mosaic algorithm plays a big role
 * how you define resolution; if you are measuring line pairs of B+W lines, you need to specify a minimum amplitude/difference and acceptable noise

Glennchan 21:16, 8 September 2006 (UTC)
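The 4:2:2 subsampling point above can be illustrated numerically. This is a rough sketch of my own (sample-count bookkeeping only, ignoring filtering and reconstruction): 4:2:2 halves the horizontal chroma sample rate relative to 4:4:4 while leaving luma untouched.

```python
# Count luma and chroma samples for a frame under different horizontal
# chroma subsampling factors (1 = 4:4:4, 2 = 4:2:2).

def sample_counts(width, height, h_chroma_divisor):
    """Return (luma_samples, chroma_samples); two chroma channels (Cb, Cr)."""
    luma = width * height
    chroma = (width // h_chroma_divisor) * height * 2
    return luma, chroma

luma_444, chroma_444 = sample_counts(1920, 1080, 1)  # 4:4:4
luma_422, chroma_422 = sample_counts(1920, 1080, 2)  # 4:2:2

print(chroma_422 / chroma_444)  # 0.5 -- half the chroma samples of 4:4:4
```

So a "full 1920x1080" 4:2:2 signal carries only half the chroma information of a 4:4:4 one, which is the sense in which even HDTV may not qualify as full resolution.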

Requests for expansion
Any requests for specific expansions?--Fallout boy 13:40, 27 August 2005 (UTC)

Maybe we could get a section on the democratization of filmmaking which digital cinematography (miniDV and HDV in particular) has enabled. On one hand, the cost of filmmaking has gone down significantly since you don't have to pay for film. On the other hand, miniDV hasn't exactly put that many new/outsider filmmakers' work into cinemas. Although if you look outside theatrical release and at TV or the internet (i.e. YouTube), homemade videos are definitely gaining popularity. Glennchan 06:57, 16 July 2006 (UTC)

Digital video vs. film for lower budgets
I have a real problem with this section. Being a low budget indie filmmaker, I look to articles like this for advice for us low budget shooters, but the majority of this article seems to be aimed at major budget feature production with gobs of money to throw at the production, so much so that format is a minuscule part of the budget. There are a lot of lower budget features being made, and with the cost of camera rentals and film stock comprising a higher percentage of their budgets, the question is whether digital may be the way to go.

This article needs to focus on 16mm as well, which is still in wide use for lower budget and television work. The fact is that HD exceeds 16mm in terms of grain and resolution (unless you are using one of those really expensive low grain stocks). The technical view that HD exceeds 16mm is supported by the Discovery Channel, which last year, seeking to increase their catalog of HD content, requested that HD projects be sourced on either HD or 35mm, not 16mm.

Now, about the "video village" and all the other expenses big budget films bring with them. For lower budget films, usually all you get with film is a view through the eyepiece; video assist for low budget films is pretty much non-existent. With an HD cam I can run a video monitor, rewind, and make sure I got the shot. I don't need to be like George Lucas, surrounded by three giant plasma displays as he captures the shot. The fact is, low budget HD filming is very similar to low budget 16mm, but with a few advantages.

The fact is, I have a cinematographer with a Sony F-900 whom I can hire for the same price as film, spend way less on stock, eliminate development fees, transfer to Betacam for editing, and finish in an HD studio for one day to make the master. I can save a lot over film right now.

This article makes it seem like HD costs the same as or more than film, with an image that is inferior to 35mm. The first claim is wrong in many cases; the second is an opinion which, given the prevalence of major directors using HD, is widely disputed. Also, the assumption that CGI makes up the bulk of a film ignores all the films that use little to no VFX. My new film has one VFX sequence, a simple green-screen composite shot where the protagonist views himself in the past as a memory.

Another point supporting HD as a valid choice is how many television shows have abandoned film in place of HD. The last season of Star Trek: Enterprise was all shot on HD and the main cinematographer loved it, saying he got the images he wanted with better low-light sensitivity and wondered why they hadn't gone to HD before.

At any rate, HD is another choice for filmmakers who must balance aesthetics and budget to achieve the final product. I for one would like to spend less on stock to get a few more dollars to a damn good actor, because neither film nor video makes a scene live, only acting does.


 * I think that the main problem is simply that the article shouldn't be taking advocacy stances on the issues. Both digital and film have their pros and cons, and it certainly would be germane to address some of these issues. But the problem is that, as the article reads at the moment, it analyzes everything from the perspective of one particular type of filmmaking. As such, it's not really relevant one way or another, as it ignores a great many other modes of production which are still being practiced. While I personally disagree with your HD/16mm opinions, I certainly don't disagree with your preferences or working style. And this is something which needs to be kept in mind. Opinions notwithstanding, certainly everyone can agree that each format has relative strengths and weaknesses. To advocate either absolutely is to disregard the possibility that they may peacefully co-exist for a long time.


 * In essence, this article needs to be descriptive, not prescriptive. Breaking down an "average" shoot's costs and doing a point-by-point comparison is not, IMHO, the domain of an encyclopedia article on emerging technology; rather, it should be in a WikiBook section on filmmaking (don't we already have several on-going?). The reader wants to know what digital cinematography is, what defines it, how it is used, and how it compares. Cost considerations belong to how-to guides and production office arguments. Girolamo Savonarola 17:19, 13 March 2006 (UTC)

Well, I agree, and I know you disagree with the 16mm vs. HD comment I made. That is what is wonderful about motion pictures- it is an art, and one that everyone has their own opinions about!

I think you are right, this article should be about digital cinematography, not about digital vs. film. However, this should point out the pros and cons, and as it is now it seems to just point out the cons, many of which are treated as facts but are, in essence, opinions.

As for what I work with and what I would like to work with, well... I do my best. I would prefer to shoot on 65mm! There is one film, I don't remember which at this time, where the director shot all interiors and exterior night scenes on HD and all exterior daylight scenes on 35mm, optimizing his looks for low light and bright sunlight.


 * That'd be Collateral. PS, when you're writing on the talk pages, you're supposed to "sign them" with your user name. You can do this by typing four tildes (~~~~). Thanks! Girolamo Savonarola 01:53, 14 March 2006 (UTC)

>>The reader wants to know what digital cinematography is, what defines it, how it is used, and how it compares. Cost considerations belong to how-to guides and production office arguments.

The trouble is, the "average reader" is being misled. If even half of what the pro-digital-cinematography camp claim as fact were true, the film making world would be a vastly different place from what it actually is. The fact that you include MiniDV as a "serious" digital cinematography format says it all: this is really just turning into another "wannabe George Lucas" forum. Elekas 02:51, 14 March 2006 (UTC)
 * Who are you talking to, Elekas? I didn't write the article, nor do I disagree with you. Girolamo Savonarola 04:10, 14 March 2006 (UTC)

The above quote "The reader wants to know etc" is from your posting. I suppose by "you" I'm referring to the Wikipedia generally.

My main concern is that a Wikipedia article on an industry which is set to have a profound effect on our popular culture does not give the appearance of being written or edited by people with any actual involvement in that industry. (Although they will of course loudly pretend otherwise.) Elekas 06:53, 23 March 2006 (UTC)

This article is about digital taking over in mainstream filmmaking, and I find it to be accurate. Really, the truth is that digital has made it possible for more people to make films; most of these films will never be seen. In terms of filmmaking on a professional level this article is 100% true. Digital is not there yet. For greenscreen work with an SD television finish, HD is fine. For features with a budget of $1 million and up, film is still the solid choice. What this article says about digital for day exteriors is accurate.

Digital video vs. film expansion
I can't help but see this turning into a soapbox for format conflicts, I've deleted some of the POV statements and inaccurate parts (if "it is obvious very few of the protagonists have any real association with the film industry!" isn't POV and inaccurate, I don't know what is). Some other statements:

As mentioned above, however, with the exception of Rodriguez, none of this seems to have been put into practice.

I checked every director there, and none of them have used film on a recent project.

However, taken outside into "on location" situations where there is far less control over the lighting, video cameras tend to perform poorly.

I wouldn't consider this to be a disadvantage only of DV. Film can overexpose just as easily.

''Although it is true the "per minute stock cost" of videotape is much less than an equivalent amount of film, in most cases this is more than offset by the cost of the extra monitoring equipment required. In any event, even if the cost of shooting digitally could be reduced to zero, the overall effect on the cost of producing the average feature would be negligible, since film costs normally make up a tiny part of a film's budget. So currently even very cheap "made for cable" movies are nearly always shot on film.''

That really only applies to bigger budget films. The cost per minute of 35mm is about $1,000 vs. $0.17 for DV. Multiply by about 600 minutes of footage for a feature film, and there is a major difference in cost.
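The arithmetic behind that cost gap can be sketched quickly (an illustration of my own, using the rough per-minute figures quoted in this thread; actual stock and processing rates vary widely by project and year):

```python
# Rough stock-cost comparison using the per-minute figures quoted above.
# Work in cents to keep the arithmetic exact.

FILM_CENTS_PER_MIN = 100_000  # ~$1,000/min for 35mm (figure from the comment)
DV_CENTS_PER_MIN = 17         # ~$0.17/min for MiniDV (figure from the comment)
FOOTAGE_MIN = 600             # ~600 minutes of footage shot for a feature

film_total = FILM_CENTS_PER_MIN * FOOTAGE_MIN / 100  # dollars
dv_total = DV_CENTS_PER_MIN * FOOTAGE_MIN / 100      # dollars

print(film_total)  # 600000.0 -- ~$600k on 35mm stock
print(dv_total)    # 102.0    -- ~$100 on DV tape
```

On a shoestring budget that is the difference between shooting and not shooting; on a $40 million production it is indeed a rounding error, which is the crux of the disagreement in this section.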

With modern timecode systems, there is no particular advantage to having the sound and pictures recorded on the same medium.

''*Although it is true that video cassettes and Hard Disk arrays can hold more footage than the 20 minute maximum of a "1,000 foot" 35mm film magazine, in practice it is extremely rare for the average "take" to last more than a couple of minutes. Most Directors of Photography are loath to "put all their eggs in one basket" anyway.''

Every DP I've met will maximize the amount of film used per roll; using only a few minutes of a roll only increases costs significantly.

Apart from that, for release all digitally-originated footage still has to be transferred onto film by means of an expensive film printer

The exact same thing is true for film; that is not a disadvantage or an advantage for either format.

I've also deleted the section about digital projection: Wikipedia is not a crystal ball --Fallout boy 22:28, 26 October 2005 (UTC)

VariCam & DVX100b
Should the Varicam be mentioned? And maybe for an indie-filmmaking section, it could include the upcoming HVX200 (Panasonic) 1080-24p 16:9 variable frame-rate camcorder for only $6,000. Also, in the Mini DV section, the XL2 is mentioned, although most people will make the claim that the DVX100B is superior (it's in the same price range). It is MUCH more often used in filmmaking (if you do an imdb.com technical search, the DVX comes up with dozens upon dozens of results of movies which used the camera, while the XL2 comes up with barely any).

And, the article seems to make a few naming mistakes with the CineAlta cameras. At one point, it notes that Sin city is shot with the HDC-950; but should it be HDC-f950? Earlier on in the article, the new improved CineAlta is called the HDW950 -- is this different from the HDC-950 or HDC-f950?

The DVX100B is not superior to the XL2, being only a native 4:3 camera (the XL2 is native 16:9) and having a fixed lens (the XL2 has interchangeable lenses). Although it may appear to be "...MUCH more often used in filmmaking (if you do an imdb.com technical search, the dvx comes up with dozens upon dozens of results of movies which used the camera, while the XL2 comes up with barely any.)", these are in fact low budget and wannabe "features" - getting on IMDb does not signify true mainstream international production credentials. By this criterion you could say that the Canon XL1s and Sony PD150 are superior to the DVX100B, which would clearly tell you nothing about their status as 'premier' filmmaking tools either. The Varicam is rarely used for theatrical production and the HVX200 is not even fully released. Could we have a proper technical film historian working on this, not someone who gleans their info from a few DV forums.


 * I think that the Varicam should definitely be mentioned. The HVX200 and its rivals by Canon, Sony and JVC maybe too. Peter S. 02:19, 1 February 2006 (UTC)

ESpanavision link

 * A group Ex-Panavison employees comment on the real story behind the push for "digital Cinematography"

Why is this link even here? The text is POV itself, and the website even more so. —Last Avenue [ talk | contributions ] 05:05, 16 February 2006 (UTC)


 * This guy has surfaced before on the Panavision article (see edit history and talk pages), and I think he's the anonymous IP or two editing right now. Very POV, doesn't meet Verifiability tests often. Girolamo Savonarola 04:03, 11 March 2006 (UTC)

POV-check
'Twould be nice to see some 3rd party eyes on this one...recent anon IP(s) (from the 210.8.232.x range) are adding some material which may be POV. Girolamo Savonarola 04:06, 11 March 2006 (UTC)

May be POV? I'd say blatantly so. As I am not technically proficient, can someone explain how digital film and visual effects relate? Because to my layman's eyes, this section on costs is attributing the costs of visual effects to shooting digitally. How are they related? 24.137.111.194 10:35, 24 April 2006 (UTC)


 * The benefits of shooting digitally when using CG effects means that you can skip the process of scanning the film into the computer for compositing purposes. This can reduce time and money if you are using a lot of digital effects and compositing. Comedian x 16:18, 20 May 2006 (UTC)


 * Yep, a lot of POV in this baby. It's mostly just a big sermon about how DV isn't as good or as cheap or as versatile as people say. Is DV as good as 35mm? No, not yet, but it's a hell of a lot cheaper, for the equipment and the "film", and a hell of a lot easier to work with if you're doing any CGI.
 * At the point where DV is equal to 35mm in quality, 35mm filmmaking will die. It may take holdouts like Spielberg a long time to give up the old ways, but consumer 35mm stills are dying fast, and film will too, for most of the same reasons. Mrdarklight 20:06, 23 May 2006 (UTC)


 * Very very very POV.

Logical fallacy
This is from the article:


 * For the last 25 years, many respected filmmakers like Francis Ford Coppola and George Lucas have made the claim that digital techniques will make films cheaper to produce. However, in the last 25 years, the average production budget has jumped by 300% (from $20 million to $80 million[citation needed]), despite the embrace of many new types of digital equipment and techniques. Movies are continually spending more and more on computer-generated images (CGI) and editing. On average, they spend far more on CGI than 1950s and 1960s epics did on special effects and extras (even after inflation).

It seems to me like the article is trying to say that new digital techniques like CGI have made it more costly to make films. That's a logical fallacy; it's not CGI or digital techniques that have raised production costs, it's the fact that CGI has opened up a new window of opportunity in storytelling, so filmmakers use CGI to do more fantastic things they couldn't do without it. They have become more ambitious. So, if someone were to try to replicate the special effects you see in a film like King Kong (2005) but without CGI or digital techniques, it would probably be a whole lot more expensive than if they had used CGI like Peter Jackson and Weta did. But I guess one could make the argument that without CGI or digital techniques in filmmaking, one wouldn't be as "ambitious" and want to spend as much money. But I believe what George Lucas or Coppola meant was that if one were to make films similar to the ones being made in their times (same special effects) but using digital techniques, production costs would decrease (all else being equal). 24.23.51.27 23:18, 6 June 2006 (UTC)


 * I don't really have a clue whether your assessment holds water, but here's my two cents - saving money on non-CGI movies that are just straight live-action shooting by using digital... I can understand the idea of cost benefits there. But let's face it - the lab costs of film vs. digital on something the size of a Hollywood blockbuster are a drop in the bucket as far as both the CGI budget and total budget go. You can certainly argue the benefits of an all-digital workflow for something like Lucas or Jackson does, but it's hard to believe that the cost difference is actually much of a factor. But I'm not a line producer. Girolamo Savonarola 00:38, 7 June 2006 (UTC)

Most of the money in Hollywood budgets goes towards the stars, then labour, then equipment costs. A $200 million film may only have a production budget of about $40 million (fairly arbitrary numbers here). I'm not sure what the point is here, but that's sort of how the cookie crumbles. Film costs represent only a very, very small portion of the budget.

Another factor is that the industry wanted to move towards making big-budget blockbuster films, since they were more profitable than lots of smaller films. Jaws and Star Wars are examples of ridiculously profitable blockbusters, where they made lots of additional money off ancillary rights (toys, other merchandising, etc.). So the trend towards big blockbusters has nothing to do with the actual cost of moviemaking... it's just that the studios wanted more blockbuster movies. At the same time, the studios also put out "small budget" films (~$20 million) since those make money too.

I think the quote actually refers to "independent" filmmaking, where you're working with very small budgets (i.e. <=$5 million) and where film costs do consume a large part of the budget. With miniDV technology now, it has allowed a lot of people to make their own films and sort of democratized filmmaking... you can shoot a film on a shoestring budget. Glennchan 06:42, 16 July 2006 (UTC)

Resolution Issues
I'd like to see a citation for where the difference between the film-scanning use of "2k" and "4k" and the more generally accepted applications of the two labels comes from. "2k" for 16x9 material would be 1920 x 1080, well over two million pixels. "2k" for film scanning, the article asserts, is only 6,000 pixels. That's less than standard resolution 640 x 480 (307,200 pixels)! Robert Harris, a noted name in film restoration, scans old prints at 4k (only 12,000 pixels by the article's math) and creates digital files suitable for printing to archival prints. Something clearly doesn't add up.--03:54, 2 September 2006 (UTC)Cg-realms


 * 2K and 4K only measure the number of pixels across the top edge of the screen, not in the whole image. Calculator time:  1920 x 1080 = 2,073,600.  That is two megapixels.  Since 1920 is close to 2000, it is also the de facto standard for "2K".  (They don't go with exact numbers here.) 4K would double both numbers:   3840 x 2160 = 8,294,400.  That is eight megapixels, not four, because you have doubled the pixel count both vertically and horizontally.  Films achieve the 2.35 to 1 aspect ratio by cropping the screen, so Star Wars and Superman Returns are actually 1920 x 817 = 1,568,640, or about one and a half megapixels.  The theatre near my house has a digital projector, and even at this resolution, the image is outstanding for both digital and film-shot movies.  (Except for Attack of the Clones, which was just okay.) While film SOMETIMES looks as sharp as digital, it more often falls far below. Only Batman Begins in IMAX looked better to me.  (And even that had a projection screw-up that day - big black chunks sliding across the screen, despite a delay to clean the projector.)  Algr 15:45, 2 September 2006 (UTC)
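The calculator arithmetic above is easy to double-check with a short sketch (my own illustration; the 817-line height of a 2.35:1 crop from a 1920-wide frame is an approximation, as in the comment):

```python
# Verify the pixel-count arithmetic from the comment above.

def total_pixels(width, height):
    return width * height

print(total_pixels(1920, 1080))  # 2073600 -- ~2 MP, the de facto "2K"
print(total_pixels(3840, 2160))  # 8294400 -- ~8 MP: "4K" doubles BOTH axes

# A 2.35:1 crop of a 1920-wide frame is about 817 lines tall:
print(round(1920 / 2.35))        # 817
print(total_pixels(1920, 817))   # 1568640 -- ~1.5 MP for the scope crop
```

The key point the numbers make is that doubling the label from 2K to 4K quadruples (not doubles) the pixel count, since both dimensions scale.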


 * I think the following statement needs a reference:
 * '''In general, it is widely accepted that film exceeds the resolution of HDTV formats and the 2K digital cinema format, but there is still significant debate about whether 4K digital acquisition can match the results achieved by scanning 35mm film at 4K, as well as whether 4K scanning actually extracts all the useful detail from 35mm film in the first place.'''
 * 4K digital acquisition would arguably be the same or higher resolution than film, although it depends on how you factor in aliasing, how you measure resolution, etc. Also, if you consider practice, you don't find that many cinematographers complaining about resolution - exposure latitude is a bigger deal (e.g. see the CML list).


 * The following article talks about some of the issues involved; it doesn't compare digital cinema acquisition to film acquisition though.


 * Determining resolution in digital acquisition seems straightforward, but is significantly complicated by the way digital camera sensors work in the real world. This is particularly true in the case of high-end digital cinematography cameras that use a single large Bayer-pattern CMOS sensor. A Bayer-pattern sensor does not sample full RGB data at every point; each pixel is biased toward red, green or blue[2], and a full color image is assembled from this checkerboard of color by processing the image through a demosaicing algorithm. Generally with a Bayer-pattern sensor, actual resolution will fall somewhere between the "native" value and half this figure, with different demosaicing algorithms producing different results.
 * I think this explanation could be made clearer, and the resolution issues bumped into the Bayer filter article. I think the practical thing people need to know is that actual resolution is somewhere between 0.5x and 1.0x the frame size, or roughly 70% of the frame size. It highly depends on optical low-pass filtering, the demosaicing algorithm, how much aliasing is acceptable, what situations you're measuring, etc.
 * Glennchan 20:00, 21 October 2006 (UTC)
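To make the point above concrete, here is a toy sketch (pure illustration, not any real camera's pipeline) of why a Bayer sensor's effective resolution falls below its photosite count: each photosite records only one of the three colour values, and the discarded values must be interpolated back by a demosaicing step.

```python
# Toy illustration of Bayer-pattern sampling (RGGB layout assumed).
# A single-sensor camera keeps only one colour channel per photosite.

def bayer_channel(row, col):
    """Which channel an RGGB Bayer photosite at (row, col) samples."""
    if row % 2 == 0:
        return 'R' if col % 2 == 0 else 'G'
    return 'G' if col % 2 == 0 else 'B'

def mosaic(rgb_image):
    """Discard two of the three channel values at every photosite,
    as a single-sensor camera does at capture time."""
    idx = {'R': 0, 'G': 1, 'B': 2}
    return [[px[idx[bayer_channel(r, c)]]
             for c, px in enumerate(row)]
            for r, row in enumerate(rgb_image)]

# A 2x2 patch of a pure-red scene: every pixel is (255, 0, 0).
scene = [[(255, 0, 0)] * 2 for _ in range(2)]
raw = mosaic(scene)
# Only the single R photosite saw the red light; the G and B sites read 0.
# A demosaicing algorithm must reconstruct the missing values by
# interpolation, which is why effective resolution ends up between
# 0.5x and 1.0x the photosite count.
print(raw)  # [[255, 0], [0, 0]]
```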


 * Some of the resolution issues are discussed at the ASC website here. The whole article is actually quite relevant, really. Girolamo Savonarola 13:56, 22 October 2006 (UTC)

Superman Returns
Reaction to the image quality on Superman Returns was not particularly favorable either.

Do we have a reference for this? All the reviews I saw praised its look, and criticizing a movie about Superman for having a "comic book" finish is rather silly. Algr 22:19, 21 August 2006 (UTC)


 * Yep, heard good things on cinematography message boards, too. Peter S. 23:02, 21 August 2006 (UTC)

I think this comment was referencing the fact that the image was heavily processed and therefore did not represent what a standard image out of the Genesis looked like. I'm sure if Superman had been shot on film it would have been processed in the same way. unsigned comment


 * Welcome to the world of color grading... :) Girolamo Savonarola 00:19, 1 September 2006 (UTC)

Automated Negative Conforming
I am not aware that negatives are conformed using an automated machine as mentioned in one of the questionable sections. Specifically, it says, "An automated machine then duplicates the project on film by cutting up and splicing the original negatives, using the edit marks produced on the computer system." Robert Elliott 12:17, 1 October 2006 (UTC)


 * That would be news to me too. See negative cutting - negative cutting software exists for the purpose of data management, not physical cutting, which is always done manually. It couldn't be done easily by a machine either, since you have to scrape off the emulsion of one frame and then cement the frame. Girolamo Savonarola 20:37, 1 October 2006 (UTC)

Context of notable directors' comments about film being cheaper
I think those directors made those comments in the context of indie/low-budget filmmaking, not blockbuster stuff. Could we get some sort of reference to the particular quotes, or remove that erroneous argument?

Sensors
For high-end cameras, it is becoming common for a single sensor to be used rather than for three sensors to be used in conjunction with a prism.

I think a citation is needed for this one or at least it needs to be more specific regarding which high-end cameras that are being talked about. I thought all professional digital video cameras have 3CCDs.


 * I think this is referring to high-end digital cameras being used for feature films. Namely the Genesis, D20, Viper, etc... Girolamo Savonarola 23:43, 13 November 2006 (UTC)

First publicized digital film
"However, David Kaiserman's Driven Together, filmed in 1999, and shown in August 2000, was one of the first all-digital films shown publicly Kaiserman citation. " Someone removed this stating it wasn't significant. However, it clearly was the first documented screening of an all-digital film at a public forum, specifically at a movie theater. Whether it turned out to be a blockbuster or got widespread theatrical release is not the issue. The fact that a full-length all-digital film was premiered to the public is what makes it important. The first of anything rarely gets widespread distribution - but it is still credited as the first. Also, given that there is a clear citation documenting date and time, this makes it a verifiable statistic. The first of something new is never guaranteed to get widespread release or distribution. This is a valid statistic.
 * I don't think that its accomplishment is very well-stated, to be quite frank. (Though I wasn't the one who deleted it.) Define "all digital". Plenty of things were shot on DigiBeta before 1999, to say nothing of miniDV. Was it projected digitally as well? Furthermore, your citation says nothing whatsoever about the screening being a digital first, much less "one of the first". Are you David Kaiserman, and if so, can you elucidate the matter? Girolamo Savonarola 01:03, 29 November 2006 (UTC)
 * I was the one who removed this information since the claim was unclear and because I felt that the information was not notable (it seemed to say that this film was the first no-budget film to be shot on DV and be shown in a theatre). Doing some basic research, it seems that other films predate Kaiserman's.  For example, "The Celebration" debuted in 1998 and received attention at Cannes, was a pioneer in Dogme95 films, and received a theatrical release.  Glennchan 03:06, 29 November 2006 (UTC)

 

"The Celebration that you speak of followed the Dogme95 standard, which required "The final picture must be transferred to the Academy 35mm film". Driven Together was shot in DV, edited in DV, and projected from a video projector from a DV player, making it SHOT, EDITED, and PREMIERED in digital. This makes it the first "ALL-DIGITAL" film recorded.


 * This article is about digital cinematography, so digital projection isn't quite relevant (that's the digital cinema article). Secondly, premiering in "digital" is not very notable at all. Anyone can rent out a theatre and screen their film to the public. It is a much bigger feat to transfer your digital material to 35mm film. This process is much more expensive than "digital" projection. The Celebration could easily have been projected digitally, but if they had done so it wouldn't be that notable/important/significant. Glennchan 05:24, 3 December 2006 (UTC)


 * Do you really believe that no one actually shot, edited, and projected a film digitally before 1999? I know for a fact that I have - the technology has existed for decades. The only reason why it's been getting press lately is because the projectors are improving in quality. But having an all-digital pipeline from start to finish is far older than your film. As stated before, the DigiBeta format alone predates it by six years.


 * Furthermore, looking into this more, The Last Broadcast was released in 1998, via satellite delivery to digital cinemas, was shot on digital video, and edited digitally. It mainly got press because of the satellite delivery mechanism, which probably is also indicative that it wasn't the first to do the all digital route either, merely the first to include satellite delivery into the pipeline. Either way, it debunks the Driven Together claim, and is vastly more citable. Girolamo Savonarola 07:04, 3 December 2006 (UTC)

Superman Returns - Critical response to digital cinematography
On the CML list (for professional cinematographers), many didn't like the plasticky look of the flesh-tones although that look was created in the color grade to get a comic book feel. So perhaps the technology was good (hard to say/judge), some cinematographers disliked the look. Glennchan 05:05, 6 December 2006 (UTC)


 * Well, even if the technology was "bad", you have to remember that cinematographers are more likely to be able to judge and spot many more subtle image artifacts than the average moviegoer. And lots of DPs were viewing the film very critically from the start, since it was one of the first big films shot on the Genesis. The question is not so much whether or not the technology had flaws (which is more or less a fact, inasmuch as it doesn't behave like film and thus will fall short of it), as whether or not the flaws were judged considerable enough to make a difference. What might be worth noting is that while a sizeable number of cinematographers and techs still had reservations about the quality of the above-mentioned factors, the audience and critics left the issue generally unremarked, implying that they did not consider it an issue. That's how I'd phrase it, perhaps? Girolamo Savonarola 09:59, 6 December 2006 (UTC)


 * Well, also consider that critics rarely comment on the cinematography of a film directly. A lack of mention could mean anything, really. That being said, it didn't look particularly bad or good in my opinion. The quality seems comparable to film, although that's hard to tell without looking at the ungraded images or knowing whether cinematographers are playing to the camera's strengths. Or perhaps the camera is not really a big part of the equation(!). Certainly, Superman Returns did well commercially, so we'll likely see more digitally shot movies in the future. Glennchan 10:39, 6 December 2006 (UTC)

Analog or digital?
Film is kind of digital and kind of analog. The actual particles are either one color or another, right? Their size and placement is analog, similar to dither in digital. I think there is a meaningful way to say the maximum spatial frequency that film can capture, but I don't know what that measurement is. But I think that saying film doesn't have a resolution is not true. — Omegatron 17:59, 23 February 2007 (UTC)

Same concerns about the discussion of digitization and quantization noise. Film has a noise floor, too, so saying that the digitization process is lossy is not necessarily true. We need to be as specific and accurate as possible. I wonder if this should be moved to its own article. — Omegatron 18:07, 23 February 2007 (UTC)


 * Thing about noise floors and film is that it is very particular to the circumstances of the film being shot. Each film stock has different sensitivity characteristics. Newer emulsion technologies have increased resolution and color space, while the type of development and exposure will also affect the quality and quantity of film grain. Girolamo Savonarola 01:59, 24 February 2007 (UTC)

Did Michael Mann use Viper for RHD/LA?
Did Michael Mann use the Viper for his shortlived CBS series RHD/LA? --24.249.108.133 20:22, 26 February 2007 (UTC)

No, this series started production in Summer 2002, the Viper wasn't available until later that year. --Onejaguar 16:53, 6 June 2007 (UTC)

Lossless compression in avi files
"AVI files can be lossless." AVI (Audio Video Interleave) is a container, not a codec. Containers hold the video and audio streams, and possibly other streams as well (subtitles, multiple language tracks). Containers have nothing to do with video compression, or indeed any type of compression; video and audio codecs do. Perhaps MPEG-2 or DivX/Xvid (video codecs) was meant instead of AVI (container).

67.70.9.32 19:27, 4 March 2007 (UTC)


 * I'm going to rework the whole compression section and remove the stuff that doesn't have anything to do with digital cinematography, which is most of it, including the AVI stuff mentioned above.


 * --Chris Kenny 16:55, 5 March 2007 (UTC)


 * So, that's done. I'm going to be polishing, etc. over the next few days.


 * --Chris Kenny 19:29, 5 March 2007 (UTC)

Industry Acceptance of Digital Cinematography
Quentin Tarantino, who according to the article vowed to always shoot on film, is about to release a film shot on digital. —The preceding unsigned comment was added by Photomonkey (talk • contribs).
 * If you're referring to Grindhouse, Tarantino's portion of the movie was shot on film, while Robert Rodriguez's portion was shot digitally (at least according to IMDb, which is sometimes less than reliable). Green451 15:28, 18 April 2007 (UTC)
 * Ah! I normally give up with IMDb as most of their stuff is incomplete. Thanks for the info --Photomonkey 14:08, 19 April 2007 (UTC)

Should it be Electronic Cinematography instead?
I would like to suggest that "Digital cinematography" is actually the wrong title for what this article is really about. It should be "Electronic Cinematography". The idea of using video technology to produce movies for theaters dates back as far as the 1940s. In 1971, 200 Motels was shot on video using a special system designed for theatrical prints.

Some movies shot on analog video for theatrical release:


 * 200 Motels (1971)
 * Monty Python Live at the Hollywood Bowl (1982)
 * Julia and Julia (1988)
 * White Hot (1989)
 * The Blair Witch Project (1999)

When digital video arrived, it improved image quality and made high-end systems much easier to use. But it did not fundamentally change the production process, or add any new reasons to shoot filmless. Julia and Julia and White Hot were shot with the MUSE 1035i HDTV system, and so would still likely look better than something like Bamboozled, which was shot in digital SD. Algr 06:26, 9 July 2007 (UTC)


 * No. You could certainly justify making "Electronic Cinematography" redirect to this article and expanding the history section but everyone uses digital technology today and everyone inside and outside the movie industry refers to it as "Digital Cinematography" or (perhaps incorrectly) "Digital Cinema." The article mostly covers contemporary digital techniques so the title should stay as-is. --Onejaguar 16:21, 9 July 2007 (UTC)
 * Just a note . . . Julia and Julia was shot with the MUSE 1035i HDTV system but transferred to 35 mm film before being released in theaters. I'm not technically inclined so I don't know if that's significant. MovieMadness (talk) 16:21, 13 February 2008 (UTC)
 * That is still the standard today, since most theaters in 2008 still project film. Cinematography refers to how a movie was shot, not how it is viewed. Algr (talk) 20:59, 13 February 2008 (UTC)

Image copyright problem with Image:Courteney Cox in November.jpg
The image Image:Courteney Cox in November.jpg is used in this article under a claim of fair use, but it does not have an adequate explanation for why it meets the requirements for such images when used here. In particular, for each page the image is used on, it must have an explanation linking to that page which explains why it needs to be used on that page. Please check


 * That there is a non-free use rationale on the image's description page for the use in this article.
 * That this article is linked to from the image description page.

This is an automated notice by FairuseBot. For assistance on the image use policy, see Media copyright questions. --04:24, 20 May 2008 (UTC)

IP keeps vandalizing the Article
Martin Scorsese started using digital cinematography (HDCAM) combined with film for his Rolling Stones documentary Shine a Light. For his new George Harrison documentary he is using digital cinematography as well (RED ONE).

However, some IPs keep vandalizing the article by putting him back among the directors who don't shoot digitally.

I suppose an admin should temp-ban these IPs. —Preceding unsigned comment added by 84.190.91.194 (talk) 00:56, 12 November 2008 (UTC)

Director Martin Scorsese began shooting his newest feature-length documentary with three RED ONE cameras under the helm of Ellen Kuras, 11/2008. 87.159.143.134 (talk) 20:39, 2 January 2009 (UTC)

F23
Just so you are aware: the F23 is listed as one of the cameras used on the recent Star Wars prequels. Both of those movies came out before that camera was even released to the public. Also, the Sony F950 that did work on those films is not included in that segment. (207.138.149.30 (talk) 18:13, 2 December 2008 (UTC))

CinemaDNG
Is CinemaDNG on-topic for this article? (And if so, where?) I recently created the article, and I've added it to lists of terms used in movies (etc), but it may be too technical / low-level for this page. Barry Pearson 12:14, 26 September 2009 (UTC)

Oliver Stone
Why was Oliver Stone listed as an enemy of digital cinematography? He shot several movies digitally. —Preceding unsigned comment added by 78.53.43.70 (talk) 09:18, 13 December 2009 (UTC)


 * Oliver Stone issue addressed in the comments above. --berr 216.15.63.67 (talk) 15:25, 8 August 2010 (UTC)

Article thoroughly riddled with POV, reads like it was written by industry techs with an agenda.
Article reads like an opinion essay advocating digital technology as "destined" to take over the market, complete with ample ungrammatical industry-speak such as "shot digital".

It's also worth noting that this article is 3 times as long as the article on motion picture film, which seems equally telling of the undue weight in how the two articles are written.

("shot on digital" as a noun is also ungrammatical, and yet is listed as the title of a non-redirected stub page listing movies "shot on digital". This reminds me of some English-impoverished Hollywood producer who said that movies "shot on real" meaning with live actors, would become less popular).

The proper English is "on digital (noun)" or better yet, "digitally".

Since most of the people in a position to edit this article are involved in the trade (and would have no professional incentive to expand the article on Movie camera to match this one, or remove POV in this article to match that one), it is only natural that they would be tempted to lapse into boosterism.

--berr 216.15.63.67 (talk) 15:21, 8 August 2010 (UTC)


 * Here's just one example, I actually removed this paragraph from the section on Industry acceptance, pending someone rewriting it to remove POV, which I doubt will happen, as it obviously hasn't since the issue was first brought up at the top of the talk page in 2006...


 * Digital cinematography accounts for a larger fraction of feature movie shooting every year, and seems destined to eventually eclipse film-based acquisition, much as digital photo cameras have largely replaced film based photo cameras in the still photography world.

--berr 216.15.63.67 (talk) 15:53, 8 August 2010 (UTC)


 * I think those statements can be backed up with references - certainly there must be statistics on how many films are shot digitally each year. Algr (talk) 08:58, 9 August 2010 (UTC)

The article, which has been written by many people by the way, only states facts by mentioning Digital cinematography accounts for a larger fraction of feature movie shooting every year, and seems destined to eventually eclipse film-based acquisition, much as digital photo cameras have largely replaced film based photo cameras in the still photography world. The market share of mechanical cameras (and cinema projectors as well) has dropped to a one-digit percentage since 2007. That's no POV, that's simply the market. No new mechanical cameras have been developed by the most relevant manufacturers of mechanical equipment (Aaton, Arri, Panavision) since then; however, they have all introduced digital cinema systems.

And honestly, the discussion of whether such facts are POV is completely boring. It has been the exact same discussion with typewriters, mechanical still cameras, LP vs. CD, Super 8 film vs. VHS then DVD then Blu-ray, DCI vs. 35mm mechanical projection in cinemas, and so on and so forth. In *all* areas of media systems, no matter whether acquisition, production or distribution, the digital systems *always* dwarfed their mechanical counterparts at *all* levels of production, no matter whether amateur or professional.

Therefore I will remove the neutrality-disputed tag, and please feel free to quote *any* manufacturer who says that mechanical camera sales will grow back or that mechanical cameras will have relevant sales in the future. As of 2010, *all* manufacturers, including Panavision, Arri, Sony, Red and Aaton, state the exact opposite.

Also, please tell us what besides the part you deleted is POV in your opinion. The deleted part simply states facts and the obvious trend all manufacturers agree on. And to be precise: digital cinema cameras already sell in the thousands per year; mechanical cameras for cinema are sold in the dozens. In fact, even consumer-grade equipment today is starting to be used by ASC members and for top TV series (as an example, take the Canon 5D, which was used for Dr. House, or by Shane Hurlbut, ASC, http://www.hurlbutvisuals.com/ , for his recent fully budgeted feature film). So do you really want the article to inform readers that there will be a glorious comeback of mechanical cameras instead of stating the obvious? 78.53.47.151 (talk) 15:59, 12 August 2010 (UTC) 78.53.47.151 (talk) 15:49, 12 August 2010 (UTC)

Inherent qualities of film stock vs Digital information
The first thing I'd raise is the claim that music CDs will always have holes in them, empty spaces of no sound that no amount of sampling can compensate for. Although not easily recognized on the surface, many people feel something is lacking as opposed to listening to an analog recording. Playing the LP of Sgt. Pepper is a much richer experience than listening to the CD for most people, although they don't know why. The digital revolution makes portability and other aspects easier, but there is a trade-off no matter what people say. I love my iTunes but miss my Dual turntable and vinyl (damaged in a move, with a very small check for compensation).

Comparing that to the fact that pixels are square brings up the point that no matter how high the resolution is, true curves will not exist, and the brain will pick up on this in digital film, as with music, as something lacking - similar to the screening of photographs in non-digital printed work. The lower the screen used, the more noticeable it is that the image is no longer analog. The higher the screen, the less noticeable, but it is still no longer an analog image. It becomes discrete dots, like TV did. When I pause a digital film on my TV the pixels stand out. "The chunks" we used to call them, but whatever they are called now, it is not just the TV resolution; they seem more pronounced when pausing a digitally shot movie than when pausing a movie that was shot on film.

Like early video, I see a flatness to the digital film image. The look of digital film is very interesting, and I feel that it will really come down to what is better for a project, not what is more dominant, and the cost is meaningless to $5 million-plus productions.

The above is simply what I feel is the major difference between digital and analog applications. This is a prelude to my real point, and I am curious for an answer.

My point is simply: can digital film create the same images that Welles used for his March of Time newsreel in Citizen Kane, dragging the negative on the floor and otherwise damaging it to create a lifetime of old footage? Kubrick used 16mm TV stock in the firefight at the air force base. Can these effects be recreated in digital film?

However, Sin City could never be made on film. But it still retains that flatness. Plus, this is one of the movies I paused, and it had pronounced "jags", another term we used when we switched from screening photos to digital photos in printing.

So I am very interested in whether the effects in the Welles and Kubrick films, among others, can be duplicated with digital film, just as a movie like Sin City couldn't (or would be very hard and expensive to) be duplicated on analog - which is not really analog anyway; 24fps blah blah blah, persistence of vision blah blah blah.

Does anyone know if these effects are difficult for digital film without losing that natural feeling?

I am also curious about how digital film is projected. Is it converted into something similar to 24fps, or is it like video?

Anyway, thanks for any answers about this, but beyond this issue I do believe these methods will co-exist, with filmmakers using the best method for their projects.

Thanks, Alphaboo —Preceding unsigned comment added by 24.188.98.241 (talk) 06:52, 27 August 2010 (UTC)

CD-quality audio (or 16-bit/44.1 kHz, 2-channel, to be precise) is rarely used for digital cinematography. Typical recording formats are 24-bit/96 kHz on 4 or 8 channels - this quality easily beats any analogue recording system in the $$.$$$ price range. DCI, the de facto standard for digital cinema, also uses 24-bit @ 96 kHz - and modern home entertainment such as Blu-ray also has higher sample rates and bit depths compared to CD. Audio on *film* in the cinema, however, is way inferior to CD. Usually you will get, when not in a modern digital cinema, data-reduced audio tracks when Dolby Digital is used; that's magnitudes below CD quality (Dolby Digital on 35mm is typically ~96 kbit per audio track, CD is ~700 kbit per track).
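The uncompressed PCM rates behind that comparison can be derived directly from bit depth and sample rate (the Dolby Digital figure, by contrast, is a quoted compressed rate, not computable this way); a quick back-of-envelope sketch:

```python
# Back-of-envelope comparison of uncompressed PCM data rates.
# Rate per channel = bit depth x sample rate.

def pcm_kbit_per_channel(bits, rate_hz):
    """One channel's uncompressed PCM data rate in kbit/s."""
    return bits * rate_hz / 1000

cd   = pcm_kbit_per_channel(16, 44_100)  # CD audio
prod = pcm_kbit_per_channel(24, 96_000)  # 24-bit/96 kHz production audio

print(f"CD:        {cd:.1f} kbit/s per channel")    # 705.6
print(f"24/96 kHz: {prod:.1f} kbit/s per channel")  # 2304.0
print(f"ratio:     {prod / cd:.2f}x")               # 3.27x
```

As the numbers show, a single uncompressed CD channel works out to roughly 706 kbit/s, and 24-bit/96 kHz carries over three times as much data per channel.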

Having worked for many years mainly in 24-bit/96 kHz, I can easily say that the quality of this digital audio is *much* better than *anything* we ever got from even the best and most expensive analogue gear. Classic LP and tape are, sorry, terrible in comparison, and even more sophisticated analogue devices - such as professional large-format audio tape - can't touch the quality.

Simulating the problems of film in the digital world has become quite easy, for most of the defects film has at least: film, which moves mechanically frame by frame, is never accurate in picture stability. Typically a projector will move 0.2-4%; cameras are better but also move 0.x%. Emulating this can be achieved in the digital world by motion-blurring the image. Noise, also called grain, is easily recorded from blank film stock and then pasted into the noiseless digital file - often done in VFX and compositing to combine fully synthetic and analogue layers in a composited image. Burn-ins (from stalled projectors, for example) and overexposed images (from stopping the film transport with the gate/shutter open) are done digitally by (many) specialised plugins. More sophisticated issues film recording devices can have, such as stains, dirt and hairs stuck in the optical path, or disasters in the lab (even intentional ones such as cross-processing), are today also emulated by specialised DI tools. Fading, which is often a problem especially with older film stocks, is easily simulated with any decent secondary color correction.

What remains more challenging is emulating scratches, etc. - to perfectly emulate these types of film damage, a good artist is still mandatory.

For audio in the studio there are also TONS of simulators for distortion/overdrive/speakers/tubes etc., which emulate the defects of fully analogue (or old discrete electronics) gear, by the way.

However, the much larger part of the market is DI and restoration - in fact, getting rid of all the issues film has or can exhibit quality-wise.

Images on film as well as on digital media today are always *flat*. Not surprisingly, as both media record 3D to 2D. The booming stereoscopic (aka 3D) productions are *fully* digital; very few use the niche system "Technicolor 3D" - the market share of film is below 5%, of digital presentation >95%, for today's 3D cinema.

And in fact, today it can be pretty challenging to tell if something was shot fully digitally or mechanically - no matter whether still photo or moving image. If film shows its defects (as 500 ASA 16mm stock *will* be noisy and grainy) then it's easy to spot; however, most of these defects today are reduced in postproduction.

Cinemas, by the way, are switching from mechanical to digital projection in the thousands. Very few film formats (classic IMAX being one of them) can compete with digital projection quality-wise, and even the best mechanical cinemas will always wear and tear copies, while the digital copies remain unchanged. Most projection is done at 24 frames per second, fewer at 25 and 30. For 3D, however, most digital projection is 48 (rarely 50 or 60) - and many artists (among them J. Cameron) deeply regret that we are still stuck with the anachronistic 24 frames a second. I have to agree: 48, 50 and 60 look so much better with fast images - and luckily almost all digital cinema projectors on sale today are also able to run 60/50 or at least 48. (24 was a compromise for film as well; many formats used higher frame rates.) 78.53.43.211 (talk) 00:53, 6 September 2010 (UTC)

Red Epic
There are a lot of professional filmmakers using the new Red Epic, and I have provided some information here for discussion on this new camera. The Epic has a 5K Mysterium-X™ sensor and a 27-layer ASIC, the most advanced processor of its type in the world, enabling the EPIC to capture up to 120 frames per second, each frame at full 14MP resolution. This will provide more than enough clarity for any type of media, for independent and professional filmmakers alike. Acclaimed filmmaker Peter Jackson is using this camera to film his latest masterpiece, The Hobbit. Providing a native dynamic range of over 13 stops and resolution that exceeds 35 mm motion picture film, this camera can more than handle low light and all the other nuances that a traditional film-based camera can. The RED also comes with newly developed HDRx™ extended dynamic range technology, and the EPIC boasts an amazing dynamic range of up to 18 stops. Purpose-built for perfect multi-camera synchronization, the EPIC comes to market at a time when 3D capture requires the sophistication of a new generation of innovative technology. Jonplante (talk) 20:07, 29 November 2011 (UTC)

Bias Removal
Removed a sentence that asserted that the change from film to digital in cinemas was an inherent degradation of quality without citing sources. Analog vs. digital is an aesthetic debate, it is incorrect to assert that one is objectively better than the other. The criticism section provides viewpoints of major film-makers that dislike or have concerns about digital cinematography. SC2Mashimaro (talk) 21:49, 10 May 2013 (UTC)
 * Missed that one. We had a problem with a single user who had an extreme anti-digital bent and who tried to repeatedly use "sources" to support the POV which were either old and not-proper comparisons, a single person's opinion or even Kodak.  That user attempted to use socks to push their POV and has been blocked indefinitely for repeated sockpuppet offenses.  Keep a look out for new (or old) socks. --Oakshade (talk) 22:29, 10 May 2013 (UTC)

I don't want to jump right in on editing this without a bit of discussion first, but it feels to me like there's still quite a bit of bias in this article. For instance, in the final paragraph of the opening section, digital video is more or less dismissed as inferior in terms of quality. There is a cite, but it doesn't, as far as I can tell, support the point. A lot of the discussion here seems to be quite old, so I'm wondering if maybe it's time for a refresh. VanBoek (talk) 19:29, 10 January 2017 (UTC)
 * Agreed. At the time when one biased user kept on making major re-writes, some of their blatant film-is-better content was quickly removed, while many of the more subtle biased lines like what you just mentioned remained. I'll support a neutral rewrite effort. --Oakshade (talk) 20:09, 10 January 2017 (UTC)


 * I've just removed some more of this: adding material that is not supported by the cites given is a no-no. Yes, there is a valid case that the very best film cinematography might be better than digital -- in particular, 70 mm film is still significantly better in terms of resolution than 4K digital. But it is just not the case that average, run-of-the-mill 35 mm film is, or was, better than modern 2017-era 4K digital cinematography. Different, yes, but not better. -- The Anome (talk) 11:26, 3 April 2017 (UTC)

Needs a thorough review
I just emended the opening sentence for clarity (a very minor edit); then, on reading the article, I found parts of it opinionated, dated, or simply unclear. The transition to all-digital production continues, but opinions differ on how far it has progressed. Cameras are still evolving rapidly, and lighting techniques are only beginning to exploit the characteristics (no, I didn't say advantages) of digital acquisition.

This piece needs a re-write now, and further revisions later to match the continuing changes in production techniques. I would undertake it myself, but though I try to keep up, I no longer have daily contact with film production. We need a writer who is disinterested and fully up-to-date on production techniques. Jim Stinson (talk) 00:28, 8 November 2015 (UTC)

External links modified
Hello fellow Wikipedians,

I have just modified 3 external links on Digital cinematography. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:
 * Added archive https://web.archive.org/web/20160201064107/http://www.indiewire.com/article/as-studios-abandon-35-mm-film-small-theaters-struggle-to-transition-to-digital to http://www.indiewire.com/article/as-studios-abandon-35-mm-film-small-theaters-struggle-to-transition-to-digital
 * Added archive https://web.archive.org/web/20130521164844/http://motion.kodak.com/motion/Products/Lab_And_Post_Production/Archival_Films/2332.htm to http://motion.kodak.com/motion/Products/Lab_And_Post_Production/Archival_Films/2332.htm
 * Added archive https://web.archive.org/web/20130523095937/http://motion.kodak.com/motion/Products/Customer_Testimonials/Wally_Pfister/index.htm to http://motion.kodak.com/motion/Products/Customer_Testimonials/Wally_Pfister/index.htm#ixzz2Ti47T2NZ
 * Added tag to http://www.mdisc.com/what-is-mdisc/

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at ).

Cheers.— InternetArchiveBot  (Report bug) 02:40, 13 December 2016 (UTC)

RAW does not equal effective 4:4:4
In most digital cameras, whether still or video, chroma subsampling happens not at the software level but at the hardware level, as the sensors are physically built with a filter matrix that records an image with lower color resolution than brightness resolution (both spatially, in pixels, and in color depth). That's the only reason modern digital cameras are actually able, at the hardware level, to physically capture frame rates of 24p or higher, whereas true RGB scanners take much longer to record a single image.

RAW, however, is only a *STORAGE AND OUTPUT* format, which may prevent additional blocking, banding, or fringing artifacts from digital compression after the image is taken, but changes nothing about pixel and color resolution. In other words, the RAW storage and output format may be 4:4:4 while the camera records an image that is already color-subsampled, because that's how the sensor works at the bare hardware level; this subsampled image is then simply stored and output inside a 4:4:4 RAW container. It's like a spoonful of flour inside a 1-gallon sack.
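The hardware-level point above can be illustrated with a small sketch. This is a hypothetical illustration (the function name and the tiny sensor dimensions are made up for the example, and it assumes a common RGGB Bayer-style mosaic): each photosite samples only one color, so the color information is reduced before any storage format is chosen, and a lossless RAW container can only faithfully preserve those samples, not add color resolution back.

```python
# Hypothetical sketch: on an RGGB Bayer mosaic, each 2x2 block of
# photosites contains one red, two green, and one blue sample, so no
# color channel is sampled at the full pixel grid's resolution.
def bayer_sample_counts(height, width):
    """Count photosites per color on an RGGB mosaic of even dimensions."""
    red = (height // 2) * (width // 2)    # one red photosite per 2x2 block
    blue = (height // 2) * (width // 2)   # one blue photosite per 2x2 block
    green = height * width - red - blue   # two green photosites per 2x2 block
    return {"R": red, "G": green, "B": blue}

# A tiny imaginary 4x6 sensor: 24 photosites in total.
counts = bayer_sample_counts(4, 6)
print(counts)  # only 6 of 24 photosites see red, 6 blue, 12 green

# A RAW file stores exactly these per-photosite samples; the missing
# color values are interpolated (demosaiced) later in software, so the
# 4:4:4 container holds less real color information than its label implies.
```

The arithmetic matches the user's point: whatever container the samples land in, roughly three quarters of each frame's red and blue values never existed at capture time and must be estimated during demosaicing.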

The section on color subsampling needs to be edited accordingly. Right now, the article makes it sound as if color subsampling were a mere software issue that could be prevented by recording to RAW, when in fact it's a hardware issue that recording to RAW does nothing to change. --2003:EF:1700:6247:A49E:679C:7080:8964 (talk) 10:29, 12 September 2020 (UTC)