
Damphousse

Members · 913 posts

Posts posted by Damphousse

  1. 2 hours ago, joema said:

    It is ironic that networks require this since the technical quality they deliver is often so poor. Note this frame grab of NBC footage from the Olympics. It is smeared, blurry, full of artifacts. Their excuse would probably be "it's not us, it's Comcast". However transmission of network content is a signal chain that's only as strong as the weakest link. If they permit gross degradation of image quality at any point in the chain, then being persnickety about technical matters at other points is simply lost in the noise. It implies they don't really care about image quality.

    [attached frame grab of NBC Olympic footage]

     

    The technical quality of NBC Olympic content delivered to end users was so bad that the below footage from 1894 was actually better. Imagine that -- some of the first film footage ever shot, and it's better than what NBC delivered. Despite having supercomputers on a chip, satellites in space, and optical fiber spanning the globe, the delivered quality was worse than an old piece of film.

     

    Oh, brother.  If you are willing to pay Comcast $80 a month for a highly compressed crap picture, who is the idiot in this scenario, NBC or you?

    My advice is to use 1930s technology called an antenna and watch NBC for free with a much sharper picture.  Does NBC have to come to your house and spoon-feed you?  I plug my $50 antenna into my Windows 8.1 PC with Media Center and have a free DVR.  I don't have cable because I don't want to pay for a crappier picture.

    Using Comcast as an excuse to do a pro job with an M1 is weak.  I am an amateur, but for my day job no one would even think of cutting corners like that.  Honestly, for a lot of pro shoots camera rentals are the least of their expenses.  If someone is thinking of doing this as a job I would probably just get a loan and buy a C100 Mk II.  Use it for the project and sell it.  Someone sold one for $3,500 on eBay in August.  The camera sells for $3,999 new at B&H.  Do the math (rough numbers are sketched below).  Even with ridiculous eBay fees you are probably out $1,000 when it is all said and done.  And your 1080p video looks great.

    I don't know.  $1,000 investment for your day job?  Peanuts in my industry.
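
    A minimal sketch of that buy-new, sell-used math.  The ~13% combined selling fee rate is my own assumption for illustration; the two prices are the ones cited above.

    # Rough net cost of buying a C100 Mk II new and reselling it after the project.
    # The fee_rate is an assumed figure, not an actual eBay/PayPal quote.
    new_price = 3999.00      # new at B&H, per the post above
    resale_price = 3500.00   # recent eBay sale cited above
    fee_rate = 0.13          # assumed combined selling fees

    net_from_sale = resale_price * (1 - fee_rate)
    total_cost = new_price - net_from_sale
    print(f"Net cost of the camera for the project: ${total_cost:,.0f}")  # ~$954

    With those assumptions you are out roughly $950 to $1,000, which is where the "out $1,000" figure above comes from.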

  2. 11 minutes ago, mercer said:

    Maybe, but they're not going to price the G80 at the same price as the GX80/85. So it will definitely be more than $799. My guess is at least $999. So, then the question becomes... How much will they price the GH5... I highly doubt, if the spec rumors are true, that it will be $1500. 

    The stills market dwarfs the video market and I don't see anyone interested in stills paying $2,000 for an MFT stills camera from a second-tier company.  They would have to be banking on selling it purely to people interested in video.  If I were just interested in stills, or did 95% stills and 5% video which is pretty typical, I would just buy a Canon 7D Mk II and save myself $500.  Not to mention the stable of lenses the Canon would give you access to.

  3. URSA Mini 4.6K for $5,000.  In a controlled studio setting you can mitigate most of the major caveats the camera has.  You also get 4k raw right out of the box.

    On that budget I don't know how you can do better.  You get 4k raw.  HDDs are cheap so archiving shouldn't be a problem (a rough storage estimate is sketched below).  CFast media isn't cheap though.  But if you have a good shoot and offload technique you should be okay.
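
    Here is a back-of-the-envelope estimate of what archiving that footage looks like.  The sensor dimensions are the camera's advertised 4.6K photosite count; the bit depth, frame rate, and compression ratio are my own illustrative assumptions, not official Blackmagic figures.

    # Rough raw data-rate and storage estimate for a ~4.6K camera.
    # Bit depth, frame rate, and compression ratio are assumptions for illustration.
    width, height = 4608, 2592   # ~4.6K photosites
    bit_depth = 12               # assumed bits per photosite
    fps = 24
    compression = 3              # assumed ~3:1 compressed raw

    bytes_per_frame = width * height * bit_depth / 8 / compression
    mb_per_second = bytes_per_frame * fps / 1e6
    gb_per_hour = mb_per_second * 3600 / 1e3
    print(f"~{mb_per_second:.0f} MB/s, roughly {gb_per_hour:.0f} GB per hour")

    At roughly 500 GB per hour with those assumptions, a cheap multi-terabyte HDD swallows several hours per drive, while 128 or 256 GB CFast cards fill up fast, which is exactly why the shoot-and-offload routine matters.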

  4. 3 hours ago, JurijTurnsek said:

    Bluetooth earphones and wireless charging have both been available for some years now. Go for it, Apple does not exist in a vacuum. Have fun charging your earphones, though.

    The only sensible thing would be to make wireless earphones that can charge and work with a supplied cable when the battery runs out. Cables are only really obtrusive a fraction of the time (like for exercise).

    Yeah, besides the tangle of cords I haven't really been bothered by headphones.  And all those cables mean I've never lost a pair of wired headphones, while I have spent a small fortune on Bluetooth headsets.  I refuse to pay for premium Bluetooth headphones because they get lost so easily.  AirPods are $159 a pair!  I can't wait to read the internet rants when people lose those.  And what happens if you lose one?  Can you just buy the other side for half price?

    The charging thing is a major no-no for me.  My Beats noise-cancelling headphones run forever on AAA batteries.  On a 10 hour flight I have no interest in shutting everything down and waiting for hours while my headphones recharge.  With my current setup using an external travel battery I can use my phone continuously for 10 hours and arrive at my destination with everything charged and ready to go for a full day.

    When I am done with my jog I just toss my wired earbuds in the center console of my car.  They are there and ready to go the next day.  No thought or planning required.  The amount of stuff we have to carry around and charge these days is ridiculous.  It is so nice to have something that is always ready to go and that you can leave in a car on a sunny 95F day no problem.

    5 hours ago, Mattias Burling said:

    I remember all the angry mobs who cried about the first iPad not having one but several full sized USB ports :)

     

    Actually quite a few people complain pretty loudly on the internet about the lack of USB... and lack of SD/microSD... and the high price... and the engineered obsolescence with OS updates.  It's kind of like Canon.  Just because you are the most profitable company in your space doesn't mean that people don't have valid complaints about your products.

    Apple does certain things really well but I honestly find flaws with all the current mobile OS ecosystems.  I'm happy that some people enjoy the idea of wireless headphones as much as you and your colleagues but for me that is a solution looking for a problem.

    4 hours ago, Phil A said:

    What I wonder with the expectation regarding 4k60p is, why would anyone shooting with a mobile phone consider that? I'd guess there's really no interest from regular consumers. 1080p gives them 120fps which is "better" slow motion.

    I made a quick check by asking 5 people at work (industrial manufacturing company) if they set their iPhone 6s (we have them as work phones) to 4k30fps. I got a "can Instagram or Facebook show that already?" and everyone stuck with the standard 1080p30fps. Actually only one of them didn't have to look into the settings to find out (the others didn't even know there is a setting); he said 4k uses too much memory and he made a conscious decision.

    I feel like there's as good as no interest from consumers in 4k60p. Like with the cameras, enthusiasts are a small market wedged between consumers who don't give a f*ck and pros who buy purpose built equipment, so we can't really expect the industry to make us happy.

    I have no recollection of anyone showing me a cell phone slow motion video... ever.  Granted I don't hang around teenagers who are into extreme sports.

    Most people take a quick video of their kid or something they saw and post it to Facebook or send it to me via Whatsapp.

    The irony is that, with the lack of expandable memory, iPhones are the last kind of phone people are going to want to shoot 4k of any type on.

  5. 2 hours ago, Axel said:

    It isn't the limitations of our cameras.

    Sorry I wasn't referring to your camera.  I was speaking about my camera and the other people who posted who complained of things such as banding with 8bit acquisition codecs on certain cameras.

    My point was I agreed with the other posters who cited mitigating technical flaws as one motivation for using film grain.  I was actually surprised by this thread.  A little film grain as a technical tool was something of a nice discovery for me.  I know I got back into analog photography because of artifacts like banding in digital photography.  At least with photography getting the film look is easy.  Just shoot film.

  6. 1 hour ago, DBounce said:

    All your huffing and puffing will not stop the winds of change. Mark my words. The days of emulating the traditional film look are numbered. It will soon only be used for nostalgic purposes. The iPhone and GoPro generation has no warm and fuzzies for film. They do not feel the need to add grainy distortion to otherwise clean footage.

    Just thought you should read this...

    Quote

    Nearly a third of film photographers are younger than 35 years of age, and “support for traditional film is growing,” says Ilford Photo. The company, best known for its analog photo products, reported these findings after doing an international survey of film users.

    The “comprehensive” survey was conducted at the tail end of 2014, and “thousands of users” from over 70 countries around the world participated. “The results were inspiring,” says Ilford.

    http://petapixel.com/2015/02/04/30-film-shooters-younger-35-says-ilford/

    I don't know what you mean by the "iPhone" generation.  Do you mean the content producers or the consumers?  I can tell you right now consumers don't have warm fuzzies about 99% of what we argue about on here.  And if you mean the content producers, I would have to guess the people that are really passionate about this field, study and take cues from movies shot on film.

    I also have to say as someone who does film photography your characterization of film is very strange.  My advice is to have a look at a print made from a well exposed Kodak TMAX 100 medium format roll of film and then come back and tell us what "grainy distortion" you saw.  With film the amount of grain varies from nonexistent at normal viewing distances to gigantic.  You have to decide how much to add.  If all your work looks like "grainy distortion" that's a problem with your technique or the particular tool you chose.  For a lot of circumstances to me the idea is to subtly dial in the amount of grain you need to get the needed effect while avoiding turning things up so much the grain walks off the screen and shakes hands with the audience.

    A lot of "film looks" I see at the movies don't look like anything I get with film photography.  I think in a way you may be on to something.  There are generations of people that are being taught the "film look" is gigantic grain and "teal and orange" or a monochromatic color palette.  It sounds to me like you are commenting on the abuse of the film look vs a subtle implementation of the film look.

  7. A bunch of people wrote some good stuff in this thread, too many to quote.  Before I really got into editing and color correcting my videos I thought adding grain was just to emulate a film look.  Now I realize that, with the inherent flaws in digital cameras, post processing such as denoising, and high compression for web delivery, grain is a valuable tool.  I also realize using a good grain plugin and understanding the nuances of applying it are crucial.

    Given the limitations of our cameras I can't imagine taking film grain out of my tool box.  Just about every guide I've seen on denoising DSLR footage mentions adding grain to mitigate some of the ill effects of the process.
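
    For what it's worth, a minimal sketch of the kind of grain pass those guides describe.  The strength value and the monochromatic-noise choice are my own illustrative assumptions, not any particular plugin's behavior.

    import numpy as np

    # Minimal sketch of adding synthetic grain to a denoised frame (values in 0..1).
    # A real grain plugin also models grain size and response per tonal range.
    def add_grain(frame: np.ndarray, strength: float = 0.02, seed: int = 0) -> np.ndarray:
        rng = np.random.default_rng(seed)
        # One noise field shared by all channels reads more like film grain
        # than independent per-channel (chroma) noise would.
        noise = rng.normal(0.0, strength, size=frame.shape[:2])[..., None]
        return np.clip(frame + noise, 0.0, 1.0)

    # Example: a flat gray frame gains subtle texture that helps mask denoising
    # smears and dithers away banding before heavy web compression.
    frame = np.full((1080, 1920, 3), 0.5, dtype=np.float32)
    grained = add_grain(frame)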

  8. 12 hours ago, animan said:

    most insurance that covers theft won't pay you anything in case the bag is unsupervised or in a place where it can be taken without you noticing

    Really?  They would consider a bag in an overhead compartment that you looked away from for a few seconds "unsupervised"?  I've never looked into this type of insurance but that seems a bit unreasonable.  If that is the case then that is pretty useless.  Is being locked in the trunk of an unattended car considered "unsupervised"?

  9. I've had a phone that can do 4k for years.  My advice?  Curb your enthusiasm.  4k definitely adds more detail in good light, but you still have the abysmal dynamic range and the noise in anything other than ideal light.

  10. 4 hours ago, Andrew Reid said:

    Now a surprise

    The 3x crop mode with 1:1 sensor output (full pixel output of a cropped region) delivers absolutely superb image quality with no moire or aliasing.

     

    That is pretty standard with lower level Canons in my experience.  I noticed the same thing on the T3i and the 50D.  I used the 3x mode as much as I could, using wide angle lenses and stepping back a lot.  I believe it has to do with the aliasing filter.

    Honestly I would buy the camera if they did a slight crop and full readout.  They could just downscale it to 1080p if they want to protect their 4k offerings.  As is though I just can't spend the dough.  I'll stick with the BMPCC.

  11. 6 hours ago, IronFilm said:

    Kinda strange neither F5, FS5, or FS700 raw is on that list!
    RED MX cameras are also not listed.

    I don't work for Netflix so all I know is what is on the web which anyone else can easily Google.  The impression I get is they are continually evaluating cameras.  My guess would be the list isn't comprehensive.  I saw this back in the day with DSLRs and stock agencies.  They had an approved list but cameras were coming out so frequently that as long as your camera was in the same ballpark and met the minimum specs it was fine.  They did QC and would let you know if they had a problem.

    Honestly I thought this thread started with an insightful post by Jimmy.  Contrary to jax_rox's ludicrous pronouncements the people at Netflix are very knowledgeable and didn't make their decision lightly.  And honestly characterizing it as Netflix's decision is misleading.  Of course if someone doesn't bother to read a little background it makes sense that they would just think this is a Netflix thing.  The fact of the matter is Netflix is a member of the UHD Alliance.  And the UHD Alliance has laid out specific criteria for their certification.  If you look at the companies and people that are a part of the alliance you rapidly realize they have more experience and knowledge than someone on the internet ranting about Canon's awful color.

     

    Quote

    Resolution: 3,840x2,160 pixels, otherwise known as 4K.

    Color depth: 10-bit (important for HDR, as most other TVs are just 8-bit).

    Color gamut: Wide, including the ability to show at least 90 percent of the P3 color gamut (check out Ultra HD 4K TV color, part 1 and part 2 for more on this).

    High dynamic range: Specifically the ability to use SMPTE ST2084's electro-optical transfer function, which Dolby helped create (check out What is HDR? for more info).

    Minimum brightness and contrast ratios: This is probably the most interesting one, as it's great for consumers and shows the different players here at work. There are two possible minimum specs. A minimum brightness of 1,000 nits, along with a black level of a maximum of 0.05 nits (20,000:1 contrast ratio), or a minimum brightness of 540 nits, along with a black level of a maximum of 0.0005 nits (1,080,000:1).

    http://www.cnet.com/news/what-is-uhd-alliance-premium-certified/
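
    The two contrast figures in that quote are just the minimum brightness divided by the maximum black level.  A quick sketch of the arithmetic, with the two tiers read (my characterization, not the quote's) as a bright-LCD tier and a deep-black-OLED tier:

    # contrast ratio = minimum peak brightness / maximum black level (both in nits)
    bright_tier = 1000 / 0.05      # 20,000:1
    dark_tier = 540 / 0.0005       # 1,080,000:1
    print(f"{bright_tier:,.0f}:1 and {dark_tier:,.0f}:1")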

    I really feel for the TV makers.  The fact of the matter is a high quality 1080p smart TV is perfectly adequate for everybody.  There is not a whole lot of true 4K content out there and even when it is available it doesn't make me want to toss a perfectly good TV and spend $1,000+ on a new set.  I can see why the industry is trying to set a high bar and wow customers.  I'm not sure it is going to work.  Like DSLRs I think what people have is good enough.

    Having said that, TV has a bad history of using substandard media and being careless with it.  I was amazed to learn early episodes of Doctor Who were recorded on tape AND original tapes were recycled and taped over as a cost-saving measure.  There is a current worldwide search going on for the lost episodes.

     

    Quote

    Eleven Doctor Who episodes were discovered (nine of which have not been seen for 46 years) by Philip Morris, director of Television International Enterprises Archive, by tracking records of tape shipments made by the BBC to Africa for transmission. Morris says, “The tapes had been left gathering dust in a store room at a television relay station in Nigeria."

    http://www.bbcamerica.com/anglophenia/2013/10/lost-doctor-who-episodes-from-the-60s-now-available-on-itunes

    Anyone who knows the history of TV and has actually watched a couple of videos detailing Netflix's workflow and archiving would not reach the conclusion they are somehow thoughtless, ignorant, or careless.  Netflix doesn't run the UHD Alliance and they don't make TVs or video cameras.  Given the environment they are working in and the constraints put on them by external forces I don't think whatever compromises they have to make are even worth mentioning considering the history of the industry they operate in.

  12. 7 hours ago, jax_rox said:

    Lordy.

    I never said that, ever. Feel free to read through my comments, and show me where, exactly, I said that Netflix are turning down Alexa footage and accepting GH4.

    Uhh...

    23 hours ago, jax_rox said:

    I guess, technically, a show originated on the A7sII or GH4 would have more chance of getting through than something shot on an Alexa...

     

    12 hours ago, jax_rox said:

    A blanket rule is a bit strange, and suggests that an A7sII or GH4 shot show would have more chance of being approved (as they are originating in UHD) than an Alexa originating in 2k or 3.2k despite the fact that we all know the Alexa footage would probably look better.

    Sounds to me like someone just can't admit their hyperbole is flat out wrong.

    Anyone who watches the Netflix how-to video and comes away from it with the impression that "an A7sII or GH4 shot show would have more chance of being approved (as they are originating in UHD) than an Alexa" needs to see a psychiatrist.

    Look, jax_rox, I realize that when you started this little tirade you didn't know about the raw requirement, which is understandable.  But now that you know, please quit chanting "4K, A7SII, GH4" over and over again.  The ironic thing is Netflix said 4k AND raw.  You are the only one that seems to be fixated on 4K and can't read any words beyond that.

  13. Just in case anyone wants to know the real deal, here is a document discussing Netflix's criteria and approved cameras.

    Approved Camera List

    Canon C300 Mk II
    Canon C500
    Panasonic VariCam 35
    Panasonic VariCam LT
    RED Dragon
    RED Weapon
    Sony F55
    Sony F65
    Blackmagic Design  URSA Mini 4.6K
    Blackmagic Design  URSA 4.6K

    And of course this caveat...

    Secondary Cameras:

    ● Any cameras other than the primary camera (crash, POV, drone, underwater, etc.) must be  approved by Netflix. 

    ● Test footage should be shot and provided to dailies and post-production to ensure compatibility with the primary camera.

     

    Absolutely nothing about accepting the GH4 while simultaneously rejecting an Alexa.

    Netflix Originals Cameras.pdf

  14. 8 minutes ago, jax_rox said:

    I'm not suggesting that's what's happening. But the blanket rule suggests that you can't use an Alexa to shoot, say, B-cam, but you could use an A7sII. The point of the argument isn't 'wow the GH4 is the best camera ever', the point of the argument is 'how ridiculous is it that technically a GH4 could pass but an Alexa won't'.

    And Netflix have no say over non-original content. I'm not suggesting they won't accept Alexa gear, just that on Netflix original content it's not acceptable.

    Which is crazy, surely. 

    But again, feel free to not understand the argument once more and throw some more random condescending comments around..

    Just to reiterate.  There is no scenario where Netflix will accept GH4 footage but will turn down Alexa footage.  You can bold whatever you want and sneak in "B-cam" and use waffle words like "suggests" but all that internet forum dancing around does not change the facts on the ground.

    You seem to want to fixate on the word 4K and want to completely ignore the word raw.  I realize that advances your bizarre hyperbole, but there is nothing in Netflix's criteria that says, "we will never bend the rule on 4k but sure, non-raw is fine".  Don't just cherry-pick what you want to hear to fuel your internet rage.  Read their guidelines.  Watch their how-to video.  There is a very specific reason they require raw.  They spend half the video talking about raw, and it makes sense and frankly to me is more important than 4k.

    Really, man, it just isn't cool coming on here and spreading rumors and getting people all hyped up over something that is just not true.  Netflix is not turning down Alexa for GH4.  Please stop saying that, or at a minimum please provide links to actual proof, as I have.

  15. 1 hour ago, jax_rox said:

    No, they shoot in UHD natively. There's nothing that suggests Netflix only accepts content originated in raw acquisition, just that they won't accept anything originated with a spatial resolution <UHD.

    That's the point. A blanket rule is a bit strange, and suggests that an A7sII or GH4 shot show would have more chance of being approved (as they are originating in UHD) than an Alexa originating in 2k or 3.2k despite the fact that we all know the Alexa footage would probably look better.

    There are plenty of productions shooting Varicam and F55/65 that don't shoot raw and still end up with mass distribution, including on Netflix. 

    Look man there is no scenario where Netflix is turning down Alexa footage and accepting GH4.  Don't be ridiculous.

    Netflix originals require 4k and raw.  Other stuff that they have on their service can be Alexa, B&W film, whatever.  My advice is sign up for a free trial and just watch a few shows.  This is really not something to argue about.  Tons of B&W film stuff from decades ago on there... and Alexa stuff.  And definitely no GH4 Netflix originals.

    https://youtu.be/b_vGQyWm3o4?t=2m30s

  16. 3 hours ago, jax_rox said:

    I think it's an example of people making decisions who don't really understand the nuance of acquisition. They're looking from a perspective of delivering the highest quality you can get on your TV set.

    I guess, technically, a show originated on the A7sII or GH4 would have more chance of getting through than something shot on an Alexa...

    Which is strange to us, but I guess if you're looking at a paper spec sheet, and you see one originating in UHD, the other upscaled, you would probably assume the one being upscaled would be of poorer quality..

    The A7SII and GH4 shoot raw?!  Or were you just saying something completely false and hyperbolic for comedic effect?

     

    Quote

    Netflix commission stipulates the need to capture 4K RAW...

    http://www.missiondigital.co.uk/marcella-the-netflix-challenge/

  17. 1 hour ago, Hanriverprod said:

    I talked to my dp who shoots korean studio films regularly, and he said he could use the camera since he probably won't go past 5.6. But this is for a narrative project. If you are filming landscapes, architecture, real estate, or for scientific purposes I can see the need to stopping down past 5.6/8.

    Still, I think it's messed up if this issue was known by BMD and prerelease reviewers but swept under the rug to get it out there and see how it plays out by keeping quiet as long as possible. If you read through the magenta thread in Blackmagic's forum an early adopter was being attacked for identifying this issue very early on, even pointing it out on some prerelease footage. I find it hard to believe that BMD was not aware of this while developing the camera when a few users picked up on it right away on its release. Their QC can't be that bad. If they knew, that's just shady. But hey, in this day and age, screw integrity when a buck is a buck.

    Yeah I have no idea.  I'm not a pro.  It just would never occur to me to point my camera at a white wall and stop down to f/11 when kicking the tires.  My inclination would be to get the sharpest lens I could, open it up to its sharpest aperture, and then stress test the sensor.

    Having said that if I was buying a camera for thousands of dollars I would not be amused if it went magenta in the corners just because I visited f/11.  Shame on them if they snuck this out the door with that kind of known flaw.

  18. 47 minutes ago, Hanriverprod said:

    The customers who are buying the camera and using it already either don't care that they are cut off at f/8 on certain focal lengths or are content lying to themselves, so it probably won't affect BMD's business; they should do the right thing and just clear the air, since it has already taken them a year to admit that the problem does exist.

    Just out of curiosity, unless you are going for mega depth of field why would you go past f/8 on a Super 35 sensor?  Sharpness really starts to drop at f/11 on a lot of lenses (see the rough diffraction numbers sketched at the end of this post).

    I'm not trying to defend a camera flaw but unless I am being sloppy and not using the correct amount of ND I really avoid stopping down that much.

    I actually inadvertently discovered this with a 35mm film camera.  I mean I knew about it theoretically but I thought it was just something pixel peepers worried about.  Well, I had a must-get shot so I bracketed it at various apertures just to cover my depth-of-field bases, and the results were astonishing.  Sharpness at f/16 and f/22 was obviously worse on the 35mm negative itself, not just in a print.

    Maybe people haven't worked with their camera much at f/11 and beyond.  I never realized simply stopping down could reveal something like this magenta issue.  I'll have to make sure I test all my future cameras.
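
    For anyone curious why stopping down softens things, here is a minimal diffraction sketch.  The 550 nm wavelength and the ~5.5 µm pixel pitch (roughly a 25 mm wide sensor divided by ~4.6K photosites) are my own illustrative assumptions.

    # Airy disk diameter vs. an assumed sensor pixel pitch.
    wavelength_um = 0.55      # green light, ~550 nm (assumed)
    pixel_pitch_um = 5.5      # rough pitch for a ~25 mm wide, ~4.6K sensor (assumed)

    for f_number in (5.6, 8, 11, 16, 22):
        airy_diameter_um = 2.44 * wavelength_um * f_number  # first-minimum diameter
        print(f"f/{f_number}: Airy disk ≈ {airy_diameter_um:.1f} µm "
              f"(~{airy_diameter_um / pixel_pitch_um:.1f} pixels wide)")

    Once that blur disk spans several pixels it starts eating per-pixel detail, which lines up with sharpness visibly falling off somewhere past f/8 to f/11 on small-photosite sensors.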

  19. 5 hours ago, Justin Bacle said:

    The one I bought had an original Canon battery and lasted like 10 minutes when recording :s

    I bought a couple of Patona batteries. On my last shoot (a 4-hour event, but I always shut down the camera between takes) I did not even empty one battery. So that's pretty acceptable :)

    Those batteries you got with the camera are potentially 8 years old and lord knows what was done with them.

  20. 1 minute ago, BrorSvensson said:

    sorry for that, i thought i knew more than i did!

    Lol.  Not the crime of the century, just a sensitive topic.

    Leaving out the beefier codec in the 1DC, consumer grade Canon DSLRs have a pretty thin 8-bit codec.  It does not hold up well with a bunch of pushing and pulling.  Various color profiles have been peddled for years and ultimately I abandoned my T3i for video and now use a BMPCC.  It is a whole different beast.  It has a true log profile in a 10-bit 4:2:2 wrapper and of course it also has raw.

    I don't know what the exact technical difference is between a log profile and a general run-of-the-mill flat profile, but I do know I wasted a lot of time with the latter.
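
    For what it's worth, here is a toy sketch of one reason the log-plus-10-bit combination grades better than a flat 8-bit profile: it devotes far more code values to the shadows.  The log formula below is a generic illustration, not Canon's or Blackmagic's actual transfer function.

    import math

    # Count how many code values land in the darkest part of the scene
    # for a generic log curve vs. a simple gamma-style "flat" profile.
    def generic_log(linear, scale=1000.0):
        return math.log2(1 + scale * linear) / math.log2(1 + scale)

    def flat_gamma(linear, gamma=2.2):
        return linear ** (1.0 / gamma)

    shadow_cutoff = 0.05  # bottom ~5% of scene-linear light (deep shadows)
    for bits in (8, 10):
        levels = 2 ** bits - 1
        print(f"{bits}-bit: log ≈ {round(generic_log(shadow_cutoff) * levels)} "
              f"shadow code values, flat ≈ {round(flat_gamma(shadow_cutoff) * levels)}")

    More code values in the shadows means more room to push and pull before the steps show up as banding, and going from 8-bit to 10-bit roughly quadruples that headroom again, which matches my experience with the thin Canon codec versus the BMPCC.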

  21. On 8/31/2016 at 0:18 PM, dhessel said:

    My biggest issue with BM is that I really don't like the way the company, at least when it comes to cameras, operates. It is awesome that BM offers free versions of their software but the way they conduct themselves otherwise is downright disrespectful at times. They over-promise and under-deliver, they are deceptive and not really trustworthy. With the Mini two features were lost: GPS in September 2015 and global shutter in March 2016. Yet they announced at NAB that the camera would be shipping in July 2015 when they were clearly nowhere near ready to ship. Some say it was over-optimism but I don't buy it; I think they knew full well they wouldn't make that date and announced it anyway. Then for the following eight months they moved the ship date one month at a time, stringing customers along even though I am sure they also knew that it would not be shipping that following month in the many instances when they changed the date. It seems they like to create a buzz, offer a BS ship date to get people excited and focused on the product, then string them along until it is finally ready. 

    There are many other instances dating back to their very first camera, and their communication is still an issue. The biggest one for me is how they treated their URSA owners. At NAB they said that URSA owners would be given priority for the 4.6K sensors and would get their turret upgrades first. They even announced a lower price for the 4K URSA so new customers could buy it then and pre-order the turret for the same price as just pre-ordering the 4.6K URSA. That way they could use the camera now and would also be given first priority on the new sensors. Yet when the time came they backed out of their promise and released the 4.6K Mini first. Now I understand it is a business and clearly the Mini was the new flagship camera. With it being so overdue they decided to release it instead of living up to their promise. Understandable, but how did they handle it? An apology or at least an explanation to the URSA owners? Nope, they said absolutely nothing at all. They released the Mini and didn't even bother to mention the URSA in their announcement email or video. They left the URSA owners in the dark without any official announcement or explanation. The URSA owners still have no idea when they might be getting theirs but they are clearly at the bottom of priorities now. 

    If someone can come up and offer what BM does I am sure many BM users would happily jump ship, especially if they are respectful and communicate well. Until then BM probably isn't going anywhere. 

    Who in the video world doesn't know how Blackmagic operates?  When the Pocket first came out I balked.  On this very forum I got into arguments when I said no way for $1,000... for ME.  I didn't tell anyone else what to do with their money.  I just said I would pass.  Well, a few firmware enhancements and a $500 price cut later and I became a proud and satisfied owner of a BMPCC.  It is still my main video shooter.  My cell phone is my B camera.  T3i... Don't even bother.

    Wait for the camera to launch and the inevitable firmware updates and price cuts.  When things are just the way you like them, then buy.

    Blackmagic is a bizarre company.  They put out some nice and very interesting stuff but they are bizarre.  Just work around the oddball stuff.
