The comments about attention spans, video games and frame rates are pretty much universally missing the point.
At Thanksgiving I was talking to my 18 year old brother about video games, movies, etc. He had never been educated on the frame rates involved with various formats but he made a comment about how much he disliked watching movies on his friend's TV - using several disparaging analogies, etc. When he described the phenomenon, I was able to piece together that his friend's TV had motion-smoothing and explained how that worked. It was like a lightbulb went off for him and he could understand what was going on. He was glad to know such settings could be disabled.
Someone who spends most of their time playing high frame-rate video games is not inherently going to like higher frame-rate movie formats. He likes them in his games in part because they reduce input and response latency (something I've discussed in several previous posts as being unique to interactive media and a non-issue in cinema).
As far as attention span goes, I can tell you that before he ever turned 7 years old, I used to watch artistic computer animation compilations with him (the kind with essentially no dialogue - maybe two sentences in 45 minutes) and he would watch intently, ask one contextual question during the viewing and be fully engaged the rest of the time.
Some people in each generation will have taste and some will not. The same goes for people that enjoy a given form of entertainment.
As far as computer animation in movies, let's not forget just how far the boundaries were pushed by Final Fantasy: The Spirits Within in regards to realistic human figures as compared to all the other CG movies around that time. It took a while for everyone else to catch up and if you look at what some of those animators have been doing recently compared to "The Hulk" examples, it's a pretty stark contrast. Just because CG makes extensive use of technology does not mean that the artists stop being one of the biggest differentiating factors. :)
[EDIT: There was a typo before so this was directed at the wrong person.]
@endlos You want to know why those apps are not getting the attention the BMCC is? Because with the BMCC you are talking about expanding a segment of the pro philosophy, workflow, paradigm and quality into a much lower price range. It's about asking people to do more with their images - asking prosumers, amateurs and indie people to take advantage of a new opportunity to work like their higher-budgeted peers would (or to adopt a variant of that approach) in regards to getting a bit more serious about color, detail and grading. It's about not saying "good enough" in terms of quality.
Key components:
- Massive increase in quality available at a given price point.
- Preservation/refinement of a workflow designed for professionals (in regards to color grading) and emphasis on a tweak-able RAW acquisition format.
- To be crystal clear: the camera is designed to encourage people to really focus on getting good quality.
There are tons of issues that go along with that camera that are more or less important to given people, but let's compare that to Apple's Final Cut Pro X (I'll ignore Motion for the moment because I think a smaller community could contribute to that discussion at present).
- Final Cut Pro X was not an evolution of the industry-standard editing approaches. It could not be seen as Final Cut Pro 8, etc., and it did not encourage a lower-income subset of the existing market to think more like their higher-paid counterparts had done. Instead it completely restructured the approach taken and asked everyone (professionals included) to adopt it. The fact that certain facets of the approach had been more clearly signaled in earlier consumer products than in the professional ones also left a bad taste in many people's mouths.
- BMCC delivers a product that directly responds to what many indie filmmakers (and would-be filmmakers) had been asking for. The benefits were clearly visible and many parts of the online community felt listened to and respected by the design decisions. Some people wanted to wait for the next evolution (in terms of mount or sensor size) but few people said anything amounting to "this is the wrong direction".
- There was no obvious increase in image quality with FCP X. Let me be crystal clear about that: if you spend a similar amount of money on a new or used competing product vs. Final Cut Pro X new, you will be able to buy something else that can get you similar quality. You are paying for the workflow approach you prefer. I am not saying that one is better or worse - I know several people that really enjoy FCP X and I'm not trying to bag on the app. But it is not (and never has been) a product that brought higher image quality to a massively lower price point - it just brought Apple's price point down.
- Final Cut Pro X launched at $300. That's a big cut from ca. $1,000, but less impressive compared to some of the competitors. Premiere Pro and Vegas already occupied a price point between the two, and Premiere Pro currently offers a rental program that the competitors do not. On top of that, the low-end variants (such as Sony Vegas HD Movie Studio Platinum) have expanded to include more and more functionality at price points under $100 (not to mention less mainstream efforts such as Lightworks being even more aggressive).
- All this doesn't even take into account that Apple already had Final Cut Express available at $200. FCP X represents a price increase for that (discontinued) product.
- The BMCC represents the addition to the marketplace of a new product without the removal or discontinuation of another.
- To re-examine the respective differences, the gap in price between FCP 7 Studio (which included more in the way of bundled applications) and FCP X (which costs less but got rid of some of the bundled software) is $600. If we consider the excluded apps as having value of their own, the proportional difference is smaller than that. Apple had already reduced the pricing of FCP 7 by $300 compared to a previous edition - so they completely changed functionality but continued with their decreasing pricing strategy.
- By comparison, the BMCC arrived into a market that had no RAW movie options under $9,700 - yet it launched for just under $3K, with $1k professional color grading software bundled at no additional cost. While that may look like a similar percentage to the Apple pricing change, that's a price cut of more than $6,500 in the market with no competitors (then or now) available at the same price point for RAW recording (without factoring in the software). If we look at the overall price difference (as it relates to the funds available for the potential consumers) this is a very big deal.
And in regards to the discussion of Redmatica and Logic: I remember when I was ready to switch sequencers to Logic, right as Apple bought EMagic and discontinued the PC version. I remember all the headaches of AU and the discontinuation of VST support - not to mention the frequent QuickTime/AU incompatibility issues that cropped up with subsequent OS upgrades. I also remember that quality did not improve in the plug-ins compared to the TDM, VST, MAS, DirectX (etc.) alternatives already in play at the time.
I remember their consolidation of the Logic range and the eventual price cuts, and I compare those to the price cuts by many competing companies on products they were about to discontinue. I do this not as evidence that Apple has any plans to discontinue their audio products, but to refute the argument that those cuts could be used as evidence that Apple will not discontinue those product lines.
I remember how one of the first things Apple did after they bought EMagic was to port underlying technology from Logic to GarageBand. I also remember that GarageBand had very serious limitations in terms of sample rate and bit depth - limits that were exceeded by various freeware applications on the PC that adopted a different workflow. In fact, if you were just recording live players, you could have gotten better sample rates and bit depths with inexpensive shareware like Cool Edit 2000. In other words, Apple bought a company primarily making professional tools, and one of the first things they dedicated their resources to was an inescapably consumer/amateur-oriented product with some very hardwired limitations. While Apple may use Redmatica to leverage their professional line, they might just as easily emphasize the lower end here.
Apple has a history of creating new approaches, discontinuing old ones and forcing their entire customer base to either adopt the new approach or jump ship. Some markets are much more open to this than others. Microsoft, for all their faults, only recently discontinued support for Windows 3.11. Different companies take different approaches and people value one priority over another.
I really like Apple's industrial design. I consider them to be market leaders in that area for good reason. I find their software design to be much more of a mixed bag and I disagree with many choices they've made in the software area. In other words, I feel both positive and negative things towards them and have no desire to see them put on a pedestal or unfairly critiqued.
But I see very little to support your thesis that Apple is doing for video editing software what BMCC is doing for low-priced cinema cameras. To some people it may be just as important, but it is fundamentally different - and to people like myself, it is less helpful.
In regards to everyone discussing I-frame/intra-codec compression on the GH2, GH3 and 5DMkIII, here are my own notes, based on spending a lot of time testing (and on my experience as a consultant to tech companies in the past, which is not as an engineer).
Question 1: Is there any reason to believe that the GH2 and GH3 would be sharper at a given intra-codec bitrate than another camera, like the 5DMkII or 5DMkIII?
A: The primary determinant of a camera's video-mode sharpness (other than the lens used, etc.) is its approach to downscaling/downsampling the image, in combination with the processor used. The codec format and bitrate determine how much of that information is preserved, but a huge amount of information is lost before it even gets that chance.
The GH2 features a superior downscaling/downsampling algorithm to that used in the Canon APS-C DSLRs and the 5DMkII and 5DMkIII. That means there is more information for the codec to start with. Therefore, given an equal bitrate and equally effective codec algorithms, the GH2 would be capable of recording more detail.
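To make the downsampling point concrete, here is a toy sketch (mine, not from any camera's actual firmware - real scalers are more sophisticated) contrasting the two broad approaches: line-skipping, which discards rows/columns outright, versus averaged binning, which uses every photosite. The function names and the test pattern are illustrative assumptions.

```python
import numpy as np

def skip_downscale(img, factor):
    """Line-skipping: keep every Nth row/column, discard the rest.
    Fast, but fine detail is simply thrown away (and can alias/moire)."""
    return img[::factor, ::factor]

def bin_downscale(img, factor):
    """Averaged binning: average each factor x factor block.
    Every sample contributes, so more real image energy survives
    to reach the codec."""
    h, w = img.shape
    h, w = h - h % factor, w - w % factor
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# An 8x8 pattern of alternating dark/bright columns: fine detail at the
# pixel pitch, the kind of content a scaler has to handle.
img = np.tile(np.array([0.0, 1.0]), (8, 4))

skipped = skip_downscale(img, 2)  # lands only on the 0.0 columns: all black
binned = bin_downscale(img, 2)    # averages to a uniform 0.5: energy kept
```

Here skipping happens to sample only the dark columns, so the output is uniformly black - the bright detail never reaches the encoder - while binning at least preserves the scene's average brightness. This is the "information lost before the codec even gets a chance" effect from the answer above.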
Question 2: How big a role does bitrate play in determining image quality?
A: Increasing the bitrate increases the range of material that can be represented at the codec's best ability. At some point you hit diminishing returns, where essentially no realistic scenario receives any further benefit. Usually this happens before you reach the bitrate it would take to reproduce the data uncompressed. But if your codec is really weird, it is theoretically possible for it to be less efficient than uncompressed at high enough bitrates.
At 70 or 80 Mbps, frame size is still much smaller than what would normally be used for a JPEG of similar dimensions (as a somewhat unrelated reference point, given the differing efficiency of H.264 encoding), and compression artifacts can easily manifest depending on content and the specifics of the encoder. The most common H.264 artifacts are macroblocking and mosquito noise. If your scene does not have sufficient detail to suffer from one of these (or some of the less common artifacts), then any gains from increasing the bitrate will be subtle at best. Note that changing from GOP1 (I-frame only) to IPB encoding makes a much more dramatic difference, as it affects the way grain (or shadow detail) is handled in a scene by changing the "update frequency" with which the codec treats what it considers to be "low detail" or "low priority" areas.
Raising the bitrate will not make much difference if you are not spotting artifacts in the first place. A general softness is not the result of a low bitrate: macroblocking often is.
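The frame-size comparison above is easy to check with back-of-the-envelope arithmetic. This sketch assumes 24 fps and I-frame-only encoding (so the whole per-frame budget goes to a single frame); those assumptions are mine, not stated in the original discussion.

```python
# Per-frame budget at the bitrates discussed above.
# Assumptions (mine): 24 fps, all-I encoding, so the average bitrate
# divides evenly across frames.

def bytes_per_frame(bitrate_mbps, fps=24):
    """Average compressed size of one frame, in kilobytes."""
    return bitrate_mbps * 1_000_000 / fps / 8 / 1000

for mbps in (70, 80):
    print(f"{mbps} Mbps @ 24 fps -> ~{bytes_per_frame(mbps):.0f} KB/frame")
```

That works out to roughly 365-417 KB per frame. A high-quality 1920x1080 JPEG commonly runs from several hundred KB to over 1 MB, so even at these elevated bitrates each frame's budget is on the low side of a comparable still - which is why content-dependent artifacts like macroblocking can still appear.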
Question 3: Can we get a sense of how the GH2 handles/would handle bitrates around 70 to 80 Mbps, and possibly compare them to the 5DMkIII?
A: Yes, there are multiple settings that Nick Driftwood made for the GH2 much earlier on that did just that. Note that they do not use the newer high-detail matrices and various other tricks, but they still outperform the 5DMkIII in terms of resolved detail. Look up SMBU (v1) from last year and some of the early Quantum X settings from early this year that target lower bitrates (Pictoris is available in two matrices with both CBR and VBR options, and there may be others).
Question 4: Does the bitrate of the GH3 All-I settings affect GH2 settings or give us reason to create new ones?
A: No. Those bitrates were already tested and provided as settings, and the newer matrices may have issues at lower bitrates. In addition, the specifics of the codec handling on the GH3 may mean a difference in performance at the same bitrate (remember that the GH2 could often look better at the same bitrate than the GH1, even with similar GOP settings).