
sunyata (Member, 391 posts)

Everything posted by sunyata

  1. it's an idea i've wondered about since i was a kid playing with prisms and pinhole cameras.. why do you have to commit to a specific focal point if all the light is available in the area of a circle.. why couldn't you just record all the light? this is basically what they have done, but in order to be able to focus later, you have to also record the vectors that the light is traveling in.. this is why that lens is fixed, you need an array of light sensors. here's the paper that was done by the founder while he was at Stanford, before the company was launched: https://www.lytro.com/downloads/resources/renng-thesis.pdf it's not new in theory but to make a consumer camera it required all the other digital components to exist. from what i understand the difficulty really is in developing the algorithms specific to this novel device. and i do think it's a game changer.. there are really interesting interactive implications too.. you could develop a file format that contains scatter light vector info - or whatever they are calling it - and work with mozilla and google on a plugin. it's gonna be a big file, but how cool would it be to have all images online be capable of focus. add hdri to the format and it's starting to feel like a bad sci-fi movie set in the not too distant future.
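For anyone curious how "focus later" works mechanically, here's a rough sketch of the shift-and-add refocusing idea from that thesis, assuming the light field has already been decoded into a grid of sub-aperture views. The array shapes and the `refocus` helper are illustrative, not Lytro's actual pipeline:

```python
import numpy as np

# Hedged sketch of light-field refocusing: with the capture stored as a grid
# of sub-aperture views (one small image per lens position u,v), refocusing
# is shift-and-add -- shift each view in proportion to its distance from the
# center of the aperture, then average. alpha picks the focal plane.
def refocus(light_field, alpha):
    """light_field: array of shape (U, V, H, W); alpha: focus parameter."""
    U, V, H, W = light_field.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            dy = int(round(alpha * (u - U // 2)))
            dx = int(round(alpha * (v - V // 2)))
            out += np.roll(light_field[u, v], (dy, dx), axis=(0, 1))
    return out / (U * V)

lf = np.random.rand(5, 5, 32, 32)   # toy 5x5 grid of 32x32 views
near = refocus(lf, alpha=1.0)       # one focal plane
far = refocus(lf, alpha=-1.0)       # another plane, from the same capture
```

The point of the sketch is just that both `near` and `far` come from one recording, which is what the post means by not committing to a focal point at capture time.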
  2. I don't use resolve, but it's highly likely it's image caching based on your viewer settings.. if you have to load frames before you get realtime playback, that's probably it. well hot damn.. check it out, from the davinci resolve website: "Real time On-The-Fly extremely high quality proxy generation without disk cache."
  3. I'm still downloading, but unless you're doing a ram preview or cached playblast, i'd be really surprised if you could play a 4k 10bit file on your laptop, even with the SSD, without dropped frames.. 10bit 4:2:2 carries roughly 1.7x the data of an 8bit 4:2:0 4k file. if it was compressed enough to play back in realtime on a non-raid system, it would look very degraded. i have a SAS raid that gets around 700MB/s sustained and I doubt it will play in RT. you need to render proxies or use an app that has dynamic image caching. the paradox of beautiful 4k: you can't see it.
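To make the data-rate point concrete, here's a back-of-envelope calculator for uncompressed rates (my own arithmetic, UHD 3840x2160 at 24fps assumed; real codecs compress well below these numbers but the ratio between formats holds):

```python
# Uncompressed video data rate in megabytes per second.
# samples_per_pixel: 1.5 for 4:2:0, 2.0 for 4:2:2, 3.0 for 4:4:4.
def rate_mb_s(width, height, fps, bits_per_sample, samples_per_pixel):
    bits_per_frame = width * height * samples_per_pixel * bits_per_sample
    return bits_per_frame * fps / 8 / 1e6

r420_8 = rate_mb_s(3840, 2160, 24, 8, 1.5)    # ~299 MB/s uncompressed
r422_10 = rate_mb_s(3840, 2160, 24, 10, 2.0)  # ~498 MB/s uncompressed
print(r422_10 / r420_8)  # ~1.67x more data for 10bit 4:2:2
```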
  4. yellow-- yea, sorry, that was barely coherent eh.. i was thinking maybe there was an issue internally or externally with superwhites getting forced to 255, but now that i've looked at the footage, i'm back to thinking there's nothing to worry about, unless you want to keep levels safe for a CRT? it looked fine to me.. but i was importing into flame and then checked a dpx sequence i made with ffmpeg in nuke. i know quicktime can do weird stuff in after effects and premiere. when you say QT player clips, do you mean a hard clamp, as in nothing reads over 235 when you sample whites, or blown out (just checking)? if so, that sounds like a proper quicktime codec install might be needed, for interpreting as you say.
  5. hey yellow, i'm fully retracting that previous statement based on your info here.. question: everything above 235 in the clips you found was reading as white? it seems like the internal signal could be 235 max no matter what you choose as the format.
  6. hey julian-- i'm now starting to realize that this could be a problem with codecs on export from davinci or premiere, it looks like that's what most people are using. not really recovering super white info or making sure levels are "legal", but rather more like preventing a possible Y,CbCr screw up on export before upload to vimeo. this could also be based on render settings. if you reimport your exported video, right after you export it for vimeo, and see that it has changed at that stage, that would answer it. if it looks too bright on import and has the "super whites" above 235 straight from the camera, then the internal signal might not be getting to 255. this kinda makes sense if it's Y,CbCr internally.
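For anyone following along, the 235-vs-255 confusion in these posts comes down to the standard 8-bit video-range scaling. A minimal sketch of the two mappings (the helper names are mine; the 16..235 luma range is the BT.601/709 convention):

```python
# "Super whites" are code values above 235 in video-range Y'CbCr.
def full_to_video(v):
    """Map full-range luma 0..255 into the legal video range 16..235."""
    return round(16 + v * 219 / 255)

def video_to_full(y):
    """Expand video-range luma back to 0..255, clipping what falls outside."""
    return min(255, max(0, round((y - 16) * 255 / 219)))

print(full_to_video(255))  # 235 -> legal white; nothing lands above it
print(video_to_full(250))  # super whites above 235 expand past 255 and clip
```

So if an app misreads which range a clip uses, whites either look clamped at 235 or blow out on expansion, which matches the symptoms being described.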
  7. jnorman34-- Not trying to be snooty here at all, I just want to clarify that rendering and playback are different terms. Playback at a locked framerate is what you want. That depends on the compression of the source and the speed of the drives. As Sean stated, external drives are usually the fast ones, not internal. I wouldn't advise recording 4k 8bit 4:2:0 as a workflow for an i7 laptop. Separate issue, but if possible I'd go with the BMPCC, save the $700 and get a nice external RAID (or SSD if you're into portability over space). Then, working with 1080 10bit (whatever compression depends on the camera), you might be able to work in RT with the SSD/RAID solution and a good video card. It will cost more to get to high quality 10bit HD with the GH4 than it will with the BMPCC. I don't see the advantage of recording 4:2:0 4k 8bit unless you have a 4k TV set; otherwise I'd go for higher quality HD. Good Luck.
  8. So there is interest in an m43 10bit 4k 4:2:2 internal (like a firmware hack on the GH4) and a full frame or medium format 2k 10bit uncompressed. i really like the idea of a 120mm sensor with higher dynamic range that can shoot video, i know hasselblad is doing something similar for photos. http://***URL removed***/news/2014/03/03/hasselblad-officially-launches-50mp-medium-format-cmos-camera with a longer lens like an 80mm on a 120mm cmos sensor that could shoot 24p, that would be very cool. edit: turns out the size of the hasselblad "medium format cmos sensor" is actually only 32.9 x 43.8mm
  9. I'm skipping the classics (except one at the bottom); here are newer movies from 2000+ that I thought looked amazing:
     The Good Thief (modern noir)
     Let the Right One In (minimalist vampire)
     Sin Nombre (got face tattoos?)
     A Single Man (squeaky clean, my favorite coffee shot)
     Valhalla Rising (a new gritty sandal realism)
     Senna (rush was a pale imitation)
     Everyone Else
     Beyond the Black Rainbow (cool like a Cray supercomputer)
     More Than Honey (incredible macro photography)
     Certified Copy (i still have no idea, but it's incredible)
     Blue is the Warmest Color (so many extreme close-ups)
     The one "classic" that I would add though is "The Conformist", directed by Bernardo Bertolucci, cinematography by Vittorio Storaro, because this is out on Blu-Ray now by Arrow Video in the UK. I have successfully converted it to region 1, but if you're in the UK there's no problem: http://www.amazon.com/Conformist-Dual-Format-Edition-Blu-ray/dp/B005PLP5X6
  10. There is a difference between dynamic range in pure EE terms and what a sensor gets you in a camera in exposure stops.. The chip used in the BMPCC is rated at >88dB intra scene dynamic range and the BMPC is rated at 60dB, but the BMPCC advertises 13 exposure stops, BMPC advertises 12. http://www.cambridgeincolour.com/tutorials/dynamic-range.htm I can't find specifics on "stops" for the A7S either. I think they are trying to avoid that detail, possibly because it's actually low (due to the 8bit limit)? I don't know. I agree with the cat..
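The gap between datasheet dB and advertised stops is easy to check. Assuming the dB figure is specified as 20·log10(signal/noise), as most image-sensor datasheets do, one stop is about 6.02 dB:

```python
import math

# Rough conversion from sensor dynamic range in dB to exposure stops.
# Real-world usable stops come out lower than the datasheet number.
def db_to_stops(db):
    return db / (20 * math.log10(2))  # one stop = ~6.02 dB

print(db_to_stops(88))  # ~14.6 stops on the datasheet vs 13 advertised
print(db_to_stops(60))  # ~10.0 stops vs 12 advertised
```

Which shows the advertised stop counts don't map cleanly onto the EE numbers in either direction, supporting the post's point that the two ratings measure different things.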
  11. Which of these 4 camera profiles would you prefer, let's say that they are priced competitively and the other features are equal. If you choose "other", please state your dream camera specs (within realm of what you feel is currently near possible and you might afford). I tried to model the choices on realistic trade-offs, one camera I wish existed but doesn't, and a new 4k consumer camera like the A7S. I am biased towards higher depth vs more pixels personally, full disclosure.
  12. This is correct. I commented on this in a couple other threads. I'm not on a mac and I didn't want to keep hammering the point w/o posting some sort of test that uses this app, to show what really happens with visible 4:2:0 artifacts. It's not so useful to start with good looking footage, a gradient or color chart would be better. The confusion is in thinking that all we need to do is re-sample pixels to get "true 10bit 4:4:4", but in fact what we would really need to do is re-sample the original light, which of course is not possible... You WILL get a little anti-aliasing from this technique, which might help reduce the appearance of noise, or it might make the scene too soft. This is not a good way to capture 10bit 2k though... you still want to use the HDMI out on the GH4, or some other solution, to have footage that contains 1024 levels of luma information (or close) from the original scene.
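A toy illustration of where the extra precision in the "4k 8bit to 2k 10bit" claim actually comes from (my own minimal example, not the app in question):

```python
# Averaging a 2x2 block of 8-bit pixels can land on quarter steps between
# 8-bit code values -- that is the whole basis of the "10bit from 4k" idea.
# It smooths edges a little, but it cannot recover tonal detail that the
# original 8-bit capture never stored.
def average_2x2(block):
    """Average a 2x2 block of 8-bit luma samples (returns a float)."""
    return sum(block) / 4.0

print(average_2x2([100, 100, 100, 101]))  # 100.25 -> between 8-bit steps
print(average_2x2([100, 100, 100, 100]))  # 100.0 -> flat areas stay quantized
```

The second line is the catch: wherever the source is already flat (or block-quantized), averaging produces nothing new, which is why this is not equivalent to sampling the light at 10 bits.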
  13. yea, i've noticed a high level of confrontational people on this forum too, but it's not the dissent that is a problem I think, it's the personal attacks. saying someone still lives at home and uses "mommy's credit card", lol. you could just start a warning system with clearly stated rules not to break? no personal "ad hominem" attacks, and get a moderator, even if that moderator is a user. i'd also advise you to take a less active role in responding to negative comments.. just ignore some of them or issue a warning, but don't engage in an argument.. that just feeds the trolls. don't feel the need to fight back if someone criticizes the site, just consider if it's worthy of an official response. from a ux perspective, rather than sorting based on quality of post (or only allowing certain people to speak), just have a comment collapse if it gets too many negative votes, just like youtube.
  14. It's possible that blackmagic just responded to Andrew's questions with boilerplate responses, in other words, they are the ones copying and pasting.. just playing devil's advocate. But I saw an interview where the president admitted that formatting a disk in the camera was way down on his list of to-dos, and the response gives a false sense of "almost being ready". My general sense is that blackmagic is a little far over their skis on products, and firmware updates will be far from satisfying for most little glitches with older, less important stuff.
  15. It's also not the same as sampling the original light at 2k 10bit, even in the luminance (Y) channel... which is the real issue, not the fact that you're short a few samples in the chromatic channels. This is confusing a lot of people. Someone should just do a test that shows banding in green (just use a ramp that goes from 0,1,0 to 0,10,0).. then resize with the app and compare to the same ramp made in 2k 10bit uncompressed. I'm not on a mac. You will need a program that can save out a native 10bit dpx file like Nuke to generate the 2k 10bit ramp.. (not sure about AE). People need visual proof to sort all this out I think.
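In the absence of that mac app, here's a minimal version of the ramp test described above (my own sketch, values normalized 0..1 before quantizing): a very dark green ramp collapses to a handful of levels at 8 bits while a 10-bit version keeps distinct steps.

```python
# Count distinct quantized levels in a dark ramp (0 -> ~4% green, like the
# 0,1,0 to 0,10,0 ramp suggested above) at 8 vs 10 bits per sample.
def quantize(value, bits):
    levels = 2 ** bits - 1
    return round(value * levels)

width = 2048
ramp = [i / (width - 1) * (10 / 255) for i in range(width)]

levels_8 = len({quantize(v, 8) for v in ramp})
levels_10 = len({quantize(v, 10) for v in ramp})
print(levels_8, levels_10)  # 11 vs 41 distinct steps -> visible banding at 8-bit
```

Rendering both ramps as images would show the 8-bit one as eleven flat bands, which is the visual proof the post is asking for.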
  16. thesubversive -- so i think you're going to get better results grading after converting, because not only will you be in a higher depth, but those border contrast areas will be smoothed out a little, but if you want to resolve to 1080 and get dynamic range to work with, i think you should look at a native 10bit capture solution instead... which could be the GH4 with HDMI, or BMPCC etc.
  17. that is correct but, in your example, you would never have a better dataset to sample from than that 2-bit data, so you would always represent your higher bit data as a different representation of the original 2 bit data. so something that might have millions of shades of luma would always appear as 1.75 (in your example), you would never be able to see a clearer representation of the original light... even with more pixels. if you think about a building way off in the distance with dimly lit windows and someone standing inside: even with infinite pixel density, we might never see that person w/o enough levels of luma... no matter how many times you sample that data, that person is not visible.
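The dim-window example can be made concrete with a toy model (a hypothetical 2-bit sensor, my numbers): no matter how many samples you average, detail below the quantization floor never comes back.

```python
# A figure whose true luma varies between 0.30 and 0.34 quantizes to the
# same 2-bit code as an empty window at 0.32 -- so after capture, every
# average of those samples is identical and the person is simply gone.
def quantize_2bit(v):
    return round(v * 3) / 3  # 2 bits -> 4 levels: 0, 1/3, 2/3, 1

scene_a = [0.30, 0.34, 0.30, 0.34]  # person standing in the window
scene_b = [0.32, 0.32, 0.32, 0.32]  # empty window

def avg(samples):
    return sum(quantize_2bit(s) for s in samples) / len(samples)

print(avg(scene_a) == avg(scene_b))  # True -> indistinguishable after capture
```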
  18. I feel I should also add something of a warning for people that might think you can get great 10bit from 4k 8bit.. there are areas of quantization in your source 4k (large blocks of pixels that are forced to the same color values) that can be much larger than the pixel matrix that you are subsampling from, no matter what the algo is that you use, the blocks remain visually. in effect, you are smoothing just the contrast border areas of those quantized blocks... and later if you push the gamma in color correction, they will re-appear. This is not the equivalent of re-sampling the light at 2k, 10bit 4:4:4 native, but I commend the effort and am sure it will be useful.
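A pure-python toy of that warning (no real codec involved, block sizes chosen for illustration): when a compression block is larger than the 2x2 downsampling window, the block survives the downscale untouched.

```python
# Two quantized 8x8 blocks side by side, one code value apart in 8-bit-ish
# terms. The 2x2 averaging window fits entirely inside each block, so the
# downscaled image contains no new in-between tones -- the step is still
# there to be amplified when you push the gamma in grading.
img = [[100] * 8 + [104] * 8 for _ in range(8)]

def downscale_2x(img):
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4.0
             for x in range(len(img[0]) // 2)]
            for y in range(len(img) // 2)]

small = downscale_2x(img)
print(sorted({v for row in small for v in row}))  # [100.0, 104.0] -- blocks intact
```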
  19. Thomas, maybe you should post this file on one of the more recent forums, this thread seems pretty dead now...
  20. agreed, "one mind, any software".. but voted other for flame.
  21. the guys working on the axiom are also having troubles with this but it should be fixable.. the sensor has consistent pattern noise that needs to be subtracted within the FPGA. this could be fixed with a firmware update. the beauty of programming FPGAs is that you can do a lot after you ship your product.. it's like a logic gate blackboard.. which makes it challenging to program, but also very flexible. they need to fix this.
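The correction itself is conceptually simple, which is why it's a plausible firmware fix. A minimal sketch of fixed-pattern-noise removal (plain python over toy 2x2 frames; on the Axiom this would run per pixel in the FPGA):

```python
# Average many dark frames (sensor capped) to estimate the consistent
# pattern, then subtract that estimate from every captured frame.
def average_frames(frames):
    n = len(frames)
    return [[sum(f[y][x] for f in frames) / n
             for x in range(len(frames[0][0]))]
            for y in range(len(frames[0]))]

def subtract_pattern(frame, pattern, black_level=0):
    return [[max(black_level, frame[y][x] - pattern[y][x])
             for x in range(len(frame[0]))]
            for y in range(len(frame))]

dark_frames = [[[5, 9], [7, 5]], [[5, 11], [7, 5]]]  # toy dark captures
pattern = average_frames(dark_frames)                # -> [[5.0, 10.0], [7.0, 5.0]]
clean = subtract_pattern([[25, 30], [27, 25]], pattern)
print(clean)  # all 20.0 -> flat frame after pattern removal
```

Since the pattern is consistent frame to frame, a one-time calibration stored in firmware is enough, which matches the "fixable after shipping" point above.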
  22. Cmosis just came out with a 4/3 sensor that I could see black magic releasing an updated product around, maybe after NAB? Although it has a smaller pixel pitch and slightly lower dynamic range than fairchild's >88dB bmpcc chip, it does 3360 x 2496 @ 104fps in 10bit and can be clocked internally or externally, and it costs less than the fairchild sensor. They must be doing something with that...
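As a sanity check on what that sensor mode implies downstream (my arithmetic, not a figure from the CMOSIS datasheet):

```python
# Raw pixel throughput for a given readout mode, before any compression --
# the kind of number the readout path and FPGA design have to absorb.
def raw_gbit_s(width, height, fps, bits):
    return width * height * fps * bits / 1e9

print(raw_gbit_s(3360, 2496, 104, 10))  # ~8.72 Gbit/s of raw pixel data
```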
  23. i think someone has been looking at the axiom website with the ursa interchangeable sensor concept. the danger of open source.
  24. Yes, I think Black Magic must be relieved that Sony didn't do 10bit 4:2:2 with this sensor and price. If it did 10bit I think I'd forget about the Panasonic GH4.