Everything posted by kye
-
@Parker Nice honest channel launch - subscribed! A request if I may... or two... 🙂 1) Please talk about the most advanced topics you are comfortable talking about. I say this because everyone else seems to start with the basics and never gets past them. This will be a lot of work in the writing stage for each video, but it will also help you, as talking about something forces you to really understand it. 2) Please talk about how to create a story from the footage that you get. What I mean is, when shooting weddings and the little one-day profiles, you find the story in the edit instead of planning it beforehand. Obviously I know that these things have a bit of predictability to them, but there's definitely an art to finding the story in the edit. I watched the video with isolating headphones and found it fine. Maybe you should balance your voice with the music on headphones and then turn the music down by a further 3dB or so as a bit of a safety margin for less pristine listening conditions. It would make the mixing phase pretty straightforward to have a rule like that. Or better yet, get a traditional musician-style microphone stand and arrange it so the bar is horizontal, put a colour checker where your head will be, focus on it, hit record, then go sit down and just push the horizontal bar out of the way (it will just pivot out of shot), and not only will you be in focus, but you'll also have a colour checker at the start of every shot. You could also mount a small whiteboard to use as a slate for future shot identification, and you can put the mic stand to the side out of shot, so there's no need to put a tripod on the chair where you'll be sitting. Also, on the next shot you can just swing it back into place again without having to move the whole stand. Alternatively, you could hang something from the boom mic, which should be almost directly over your head and just out of shot. Awesome! 
Content is king, so as long as you edit the videos nicely enough that they're not unwatchable, just film it and get it out there. I watch far more content of people doing things (building houses, machining, travelling, renovating, etc) than professionally edited stuff, simply because it's mostly more interesting. It's pretty hard to be a good enough writer to match the authenticity of someone who just opens their mouth and speaks in response to life; it's pretty hard to be a good enough actor to match the authenticity of someone who is actually living in their character's shoes and is genuinely seeing things happen and reacting for the first time; and it's pretty darn hard to write a story that is as surprising yet logical and plausible as reality itself. This is what film-makers find hard to understand about vlogging and content creators. The film-making mostly sucks, but the acting and story are equal to the pinnacle of what Hollywood has to offer.
-
Good post. I wrote a DCPX plugin that reduces the bit depth of images and found that you can reduce the bit depth significantly before it becomes visible on real-world images. I think 6 bits was fine for many test images of skin tones, for example. Probably the biggest enemy of an 8-bit codec is the gradients that occur when filming flat-coloured surfaces like any interior wall. Not only is the light going to be unevenly applied to the wall, and not only is the lens going to vignette by a little bit, but the closer these are to perfect, the wider the posterisation banding is going to be. Still, you have to try pretty hard to get visible banding from 8-bit if the image is exposed well and isn't in a crazy-flat log profile.
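For anyone curious how quickly banding appears, here's a minimal sketch (plain NumPy, not the plugin mentioned above) that quantises a smooth gradient to various bit depths and counts how many distinct code values survive - the fewer the levels across a narrow range, the wider the posterisation bands:

```python
import numpy as np

def quantize(img, bits):
    """Quantize a float image in [0, 1] to the given bit depth."""
    levels = 2 ** bits - 1
    return np.round(img * levels) / levels

# A smooth, narrow-range horizontal gradient, like an evenly lit wall.
gradient = np.linspace(0.4, 0.6, 1920)

for bits in (10, 8, 6, 4):
    q = quantize(gradient, bits)
    # Count the distinct code values the gradient collapses into.
    print(bits, "bits ->", len(np.unique(q)), "distinct levels")
```

A 1920-pixel gradient spanning only 20% of the signal range collapses to just a handful of levels at low bit depths, which is exactly the flat-wall scenario described above.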
-
Watched the video, haven't watched the film; I thought the video was funny and it made me glad I didn't watch the movie. It reminded me of this video: Of course, I enjoyed Guardians of the Galaxy, but he does have a point about movie making and editing and story...
-
Thoughts on the Blackmagic Pocket Cinema Camera 6K Pro - why EF mount?
kye replied to Andrew Reid's topic in Cameras
I suspect you are a devotee of the "higher resolution sharper images" school of thought. This is also called "the video look".

I did tests that mathematically analysed various codecs, and the H.264/H.265 codecs were mathematically more accurate than ProRes at similar bitrates (which makes sense, considering that maths is how the designers of compression algorithms test those algorithms, and ProRes is based on an older compression algorithm). However, humans are not computers. My suspicion is that the preference for ProRes over H.264 etc lies in the errors.

For starters, ProRes is ALL-I, so it renders motion very well, which is not something taken into consideration by the mathematics used to test algorithms. Yes, H.264 etc can do ALL-I, although most implementations do not enable it.

Secondly, to my eye at least, ProRes tends to give a softer look to an image than H.264, which appears spiky and sharp/brittle in comparison. Anyone familiar with film will know that it isn't sharp at all, and anyone shooting RAW will be familiar with this. ProRes has a very similar sharpness to RAW images, and requires sharpening in post to tune the image if you want a more modern-feeling image. H.264 and H.265 feel very different to ProRes. Yes, camera manufacturers tend to sharpen the image before applying H.264/5 compression, so that will add to this effect as well, but when you're comparing ProRes to H.264 you're comparing the whole image pipeline, and this is what tends to happen. Unless you're a fan of the "video look", these images likely have too much sharpening applied, and I find that my skills in softening an image with blurs are more developed than my skills in sharpening images, purely because of how much sharpening is typically applied. All of the above would be similar for Motion JPEG implementations as well, and the cameras with that as a codec also have a very good reputation.

Finally, I suggest this - after reading this post, look around you and notice your surroundings. Do they look 'sharp'? Look at your hands and the texture of your skin, look at the fabrics of your clothes, look at the device you're reading this on. Do they look 'sharp'? No. Reality isn't sharp. It's detailed, sure, but in an understated way; it just IS, without all the details yelling at you to notice them. Unlike H.264/H.265.
-
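To illustrate the "mathematically accurate" part of the codec tests described above, here's a rough sketch of the kind of metric codec designers optimise for - PSNR between a reference frame and its decoded version (plain NumPy; the noisy frame here is a stand-in for codec error, not a real decode). A per-frame number like this says nothing about motion rendering or the visual character of the errors, which is exactly the limitation being argued:

```python
import numpy as np

def psnr(reference, decoded, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    mse = np.mean((reference.astype(np.float64) - decoded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10 * np.log10(peak ** 2 / mse)

# Hypothetical example: an 8-bit frame and a lightly degraded copy of it.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(1080, 1920), dtype=np.uint8)
noisy = np.clip(frame.astype(np.int16) + rng.integers(-2, 3, frame.shape),
                0, 255).astype(np.uint8)
print(f"PSNR: {psnr(frame, noisy):.1f} dB")
```

Two encodes can score identically on PSNR while one smears motion and the other adds edge ringing, which is one way the maths and the eye can disagree.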
We were definitely spoiled, and the price going up is more reflective of the optical quality of some of those lenses. Covid and buying boredom/nostalgia will have driven the whole market up somewhat, and it might correct in the next few years, but I don't think prices will go down that much. There is a diminishing supply, with lenses ageing and breaking, and there is rising demand, both from technology driving up visual media (in both video and stills photography) and from a rising middle class in countries like China and India that has added billions of people to those with discretionary income: https://en.wikipedia.org/wiki/Middle_class#Recent_global_growth
-
I guess you raise an interesting point. Vintage lenses used to be a good way of getting a high-image-quality lens for relatively little money, but now that prices have skyrocketed for almost all the recognisable brands, outside some bargains in garage sales etc, maybe that's not the case anymore. Considering that China has stepped in and is making new copies of older lens formulas under brands like 7artisans and TTArtisan, and considering that we can go a long way towards matching the coatings of old with diffusion filters and the like, maybe the time of vintage lenses for the masses is over?
-
Do zoom lenses flare with more / a longer series of circles than primes? Is it because of the larger number of elements in zoom lenses?
-
I don't think that matters so much really. If they have enough money and are willing to listen to industry and customers to build something that people actually want then I don't think it's a problem. My two favourite cameras are the GH5 and the OG BMPCC. The GH5 is the result of a long line of Panasonic listening to customers and gradually pushing things further and further, and it's the most convenient and usable camera I have. The OG BMPCC is the nicer image, by a long way, and has no legacy whatsoever. If customers placed much emphasis on lineage and history then the OG BMPCC would have been a flop, and yet it was a massive hit, even though the camera itself had quality issues, many significant design issues, and came from a company with no service network. Give people what they want and the customers will come.
-
I think the phrase you're looking for is "I'm not addicted. I can stop any time I like."
-
Which party are you talking about exactly that Nikon would be late to? https://imaging.nikon.com/history/chronicle/cousins18-e/index.htm https://imaging.nikon.com/history/chronicle/cousins19-e/index.htm
-
Disappointing Panasonic GH5 Mark II specs leak in Japan – Where is the GH6?
kye replied to Andrew Reid's topic in Cameras
Yes, but what do we call it? The GH5mk2 thread? The GH6 thread? Hell, if we're not believing the rumours then maybe they're going straight to the GH7 and going for a 12K sensor!
-
That's a good video, but doesn't address the different bit depths as the source is an 8-bit signal from the C100.
-
Interest in vintage lenses is directly correlated to sensor resolution, so it's up up up up up....... I saw a post the other day from someone saying that the second-hand price of one particular vintage lens had gone up 4X in about the last year.
-
They could use sensors from Fairlight and leapfrog the generic video look that almost all Sony-sensor cameras give these days. Now that really would be something.
-
Starting from scratch might be an advantage, especially considering they don't have a cine line to protect or internal departmental politics to navigate. Canon has made a mess of their prosumer offerings, with the camera's architecture potentially being a limiting factor, whereas starting fresh gives you an opportunity to design things from the ground up for the way things work now, rather than either having to design custom chips (à la ARRI) or having to shoehorn new chips into a legacy architecture (à la Canon). Blackmagic did well by essentially designing multiple cameras from the ground up to fit underserved niches, but Nikon would be doing that with (I assume) far more resources, stronger relationships with industry, proven manufacturing capability, a large professional servicing network, a large customer base with brand trust and loyalty, and a huge body of lenses to draw from.
-
The more I learn about each aspect of the image pipeline, the more I realise that everything matters. If you have a camera like the A7S3 with a good sensor, good image processor, and good codec, then the fact that it's 8-bit isn't too bad, because the other things are fine.

I've run into issues shooting 8-bit C-Log with the Canon XC10. This should be a reasonable example of 8-bit done well, as it's a Canon cine/video camera with a 4K 300Mbps 8-bit codec, but alas... Here's a shot, which I will admit is underexposed; it gives a usable image when we convert the colour space, but unfortunately falls apart if we zoom into the seat in the bottom left. Is this an 8-bit issue? Is this a poor codec issue? I'll leave that as an exercise for the reader.

Let's take another example, which after a transform - look at the noise! Now, is that ISO noise? 8-bit quantisation? A "poor" 300Mbps codec on a Canon cine camera? The vectorscope clearly shows the 8-bit quantisation. Yes, this is an extreme example with a low-contrast scene, a flat log profile, and an 8-bit codec, but this was a $2K Canon cine camera with a 300Mbps codec.

So, I replaced it with a GH5, with the decisive factor being the internal 10-bit. Take a 10-bit HLG frame with almost zero contrast and actually try to break it by applying more contrast than you would ever use in real life - and it holds together. Is that the 10-bit? Is that the 150Mbps 422 codec? Is that the fact it's a downsampled 5K image? Who knows, but it's a night and day difference: with the 8-bit example I struggle to get a good image from a real-world shot, and with the 10-bit image I can't break it even if I try.

I like to think about it like this - 8-bit can be good enough if you have a good enough implementation of it (unlike the XC10), but a good 10-bit implementation gives you security that you're not going to run into bit depth issues, so for me it's a safety thing.
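The "try to break it" experiment can be sketched numerically: record a low-contrast ramp at 8 and 10 bits, then stretch it hard in the grade and see how coarse the surviving steps are. This is a simplified model (pure quantisation, no codec or noise), so it only illustrates the bit-depth part of the story:

```python
import numpy as np

def quantize(x, bits):
    """Quantize a float signal in [0, 1] to the given bit depth."""
    levels = 2 ** bits - 1
    return np.round(x * levels) / levels

# A low-contrast scene: values spanning only 10% of the signal range,
# as a flat log profile might record shadows.
ramp = np.linspace(0.45, 0.55, 4096)

for bits in (8, 10):
    recorded = quantize(ramp, bits)
    # Aggressive grade: stretch that 10% band across the full output range.
    graded = (recorded - 0.45) / 0.10
    steps = np.diff(np.unique(graded))
    print(f"{bits}-bit: {len(np.unique(graded))} levels after grading, "
          f"largest step {steps.max():.4f}")
```

The 10-bit version keeps roughly four times as many levels through the same grade, which matches the "can't break it" experience, even before codec quality enters the picture.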
-
Of course it is better to know where a problem is and fix it properly, but let's not kid ourselves - the OP doesn't even have time to listen to the final edit to see where any problems might be, so the basket of solutions is pretty empty when there is such a lack of care-factor, especially considering that poor video quality is tolerable but poor audio makes something unwatchable. Besides, any half-decent limiter can put a pretty heavy limit on a signal without much, if any, perceivable effect. Anyone familiar with tape saturation will know that this type of limiting can be very sonically benign. I used to use them for compression and sonic effects when writing electronic music, and you'd have to go seriously out of your way to push one hard enough to get anything sonically interesting in terms of distortion. I read somewhere that there is a bone in the ear that we hear through and that it causes high levels of second harmonic distortion, which is then removed by the auditory processing parts of the brain. I'm not sure if that's true, but it's pretty obvious that even-order harmonic distortion is orders of magnitude less offensive than odd-order harmonic distortion, so anything that creates even instead of odd harmonics will be far more pleasing, or can be pushed much further before becoming unpleasant.
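As a rough illustration of the even-vs-odd point (not a model of tape or of the ear), here's a sketch that passes a sine tone through a symmetric and an asymmetric soft clipper and reads off the 2nd and 3rd harmonic levels from an FFT. A symmetric transfer curve produces only odd harmonics; biasing it introduces even ones:

```python
import numpy as np

fs, f0 = 48000, 1000          # sample rate and test-tone frequency (Hz)
t = np.arange(fs) / fs        # exactly one second, so FFT bins are 1 Hz apart
tone = 0.8 * np.sin(2 * np.pi * f0 * t)

def harmonic_level(signal, harmonic):
    """Magnitude of the n-th harmonic of f0 from a one-second FFT."""
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    return spectrum[harmonic * f0]

symmetric = np.tanh(2 * tone)           # symmetric clip: odd harmonics only
asymmetric = np.tanh(2 * tone + 0.5)    # biased clip: even harmonics appear

for name, sig in (("symmetric", symmetric), ("asymmetric", asymmetric)):
    print(name, "2nd:", harmonic_level(sig, 2), "3rd:", harmonic_level(sig, 3))
```

The symmetric clipper's 2nd harmonic sits at numerical zero while its 3rd is substantial; shifting the operating point off-centre is what brings in the even-order content that tends to read as more pleasant.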
-
@zerocool22 Why not just use a limiter? Not sure about FCPX, but Resolve has one built into every track, and every bus, and also the master output.
-
Thanks, I've asked this question previously but hadn't gotten a definitive answer. A couple of interesting things from that spec sheet:
- It can do 100fps
- It can do 100fps with rolling shutter and 50fps with a global shutter
I wasn't aware that either the BMMCC or OG BMPCC has 100fps or a global shutter enabled - does this mean that the sensor from all those years ago is only semi-utilised? Wow. Seriously Blackmagic - do an update on these cameras!!
-
I see minimalism has found another follower...
-
Do you have any links outlining the dual-gain architecture of the sensor? Some time ago I looked for more information on that aspect and couldn't find anything, and am curious to read more about it 🙂 I definitely agree that small sensor doesn't have to mean vintage look. Lots of wider aperture lenses are able to give a decent amount of background separation. If anyone isn't clear on this, then the following might be of interest: http://www.yedlin.net/NerdyFilmTechStuff/MatchLensBlur.html
-
This came up in my feed - not sure if it's been posted already, but a search didn't reveal any hits: There's some great info in the comments about equipment, settings and process.
-
@zerocool22 When you play the video you can see levels via the Mixer section of the edit page (it's the panel in the bottom-right corner, with levels just going into the yellow band on the main mix), and they show via colour if something is close to clipping. You have to enable the Mixer via the button at the top-right, similar to how you turn the Inspector window on and off. Also, if you have a few audio tracks you might have to drag out the width of the Mixer panel so you can see each track. It won't show if an audio clip was clipped in the original recording, although I wouldn't think this is something you'd need to refer to frequently, especially considering the damage has already been done, but maybe there's something else going on... In a sense, the more useful measures are whether a track is clipping (which is recorded audio + clip volume + track volume) or whether the whole mix is clipping. It's a similar thing in the Fairlight page, although you have more places to see individual levels for tracks and busses etc. Is there something specific you are doing, or a specific situation you are in?
-
Wonderful! I think there are a few interesting points here.

Firstly, I disagree when you say the OG isn't more gorgeous than the 4K. There's a view that because they all shoot RAW they're the same, because you can make RAW look like anything, but that's simply not true. For example, even in colourist circles the ability of Steve Yedlin to emulate film is often viewed as practically impossible, and Steve even says in his blog that it's not possible with the limited tools available (referring to Resolve and Baselight) and that he had to write his own custom software to do it. But setting aside professional colourists for a second and getting to the people who would realistically buy a BM pocket cinema camera (P2K, P4K, P6K, or P6KP), these people aren't likely to be able to grade the P4K to look like the P2K, or vice versa (and let's be honest - a great proportion of them only go as far as applying a LUT). So while in theory the P2K isn't more gorgeous than the P4K, it is in practice, when the average user has next to no ability to colour grade.

Moving on to the comments about resolution and sharpness, I think this relates to marketing, and to the ability to colour grade as well. It relates to marketing in a couple of ways. Firstly, people often pick up the P2K for a retro look, because they've been brainwashed by TV companies into thinking that more resolution is more modern, when in reality most cinemas still project in 2K. In addition, they often pair the P2K with vintage lenses, which of course give a vintage look, which lacks sharpness. In fact, though, the P2K and BMMCC can just as easily be paired with modern cinema glass, and they are, by people who aren't after a vintage look. Secondly, as people watch more and more content from YouTube rather than content shot on Alexas and graded by professional colourists, they get used to a sharper look as the default look. Once again, this is marketing, because the default look is oversharpened: the TV manufacturer who made your camera is baking that look into the camera in order to manipulate you into wanting an HD/FHD/4K/8K TV.

Sharpness also comes down to colour grading, and this is something I am experimenting with at the moment. The P2K and BMMCC are very soft straight out of camera, which is fine if you know what you're doing, but alas, these files aren't being graded by professional colourists; they're being graded by people who don't even understand how LUTs work. On top of this, sharpening is a whole art form, but it is almost never discussed and there are basically no tutorials. Why? Because your TV-manufacturer camera oversharpens rather than undersharpens, and instead of learning to sharpen in post you learned to use vintage lenses and diffusion filters to get a sensible level of sharpness.
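Since sharpening in post is so rarely discussed, here's a minimal sketch of the standard unsharp-mask idea on a 1-D scanline (plain NumPy; grading tools expose the same amount/radius controls under various names). A positive amount adds the overshoot halo of the "video look", while a negative amount softens back toward the blur:

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    """Normalised 1-D Gaussian kernel."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def unsharp_mask(row, amount, sigma=2.0, radius=6):
    """Sharpen (amount > 0) or soften (amount < 0) a 1-D scanline:
    out = original + amount * (original - blurred)."""
    blurred = np.convolve(row, gaussian_kernel(sigma, radius), mode="same")
    return row + amount * (row - blurred)

# A hard edge, like the silhouette of a subject against a bright wall.
edge = np.concatenate([np.zeros(32), np.ones(32)])

sharpened = unsharp_mask(edge, amount=1.5)   # overshoot: the 'video look' halo
softened = unsharp_mask(edge, amount=-0.5)   # pulls detail back toward the blur
print("sharpened peak:", sharpened.max())    # rises above 1.0 (overshoot)
print("softened max slope:", np.diff(softened).max())
```

The overshoot above 1.0 on the sharpened edge is exactly the halo that oversharpened cameras bake in; undersharpened sources like the P2K leave you room to dial this in yourself.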
-
One thing that struck me from the tests was how neutral the over- and under-exposures are. That's absolutely not the case on most other cameras, and I speak from experience here, because I have routinely exposed things poorly in the past and then suffered in post when things didn't behave how I anticipated/hoped they would! Once again, the cameras that most remind me of this trait are the original BMPCC and BMMCC, which apply very little highlight rolloff, so the way you use them is to ETTR and pull things down in post. They look very similar (I did tests), at least when overexposing; if things are underexposed the results aren't so good. It might even be that the usability of the Alexa's DR is more valuable than the sheer amount of it.
-
Simply gorgeous image... and it was shot in Prores 422, not even HQ!