Everything posted by kye

  1. These are very different cameras. What type of shooting do you do and what kind of videos do you make? Maybe some more info would help point out any weaknesses or things to watch for.
  2. So, let me see if I caught everything there:
       • It's an 8K camera that can't record 8K
       • Canon don't even have an external 8K recorder
       • They made an 8K video but we can't watch it in 8K
       • A YouTuber with a RED posted an 8K video on YT over two years ago, but Canon hasn't worked out how to do it yet?
     Did I miss anything?
  3. Cool videos! Really nice colours and movement. Not sure about Cine-D vs V-Log, but why not just shoot a test and try both modes, lit only by your LED light? Pulling the footage into your NLE should answer most of your questions.
  4. kye

    bmp4k adventures

    Doh! What focal length would suit you best? Many favour the 24-28mm equivalent, although the 35mm equivalent is my favourite, and the 50mm equivalent is popular too. I have the 14mm Panasonic and it's not too bad, and very compact.
  5. I'd keep my eye on it but wait. Every new model seems to get better by quite a lot, and Insta360 seems to be the DJI of 360 cameras, so they're the ones to watch. You're right that 4K isn't enough for 360 degrees, and they're not going to push the bitrate much beyond 100Mbps, at least for the two-lens consumer versions. That's why I say to wait for the 8K versions. It's highly promising new tech, but it's just not quite there yet for the kind of quality that people on this forum demand as a minimum standard.
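     To put rough numbers on why 4K gets thin once it's spread over 360 degrees, here's a back-of-the-envelope Python sketch (it assumes an equirectangular stitched frame and roughly a 40 degree horizontal field of view for the 'normal' camera; all figures are illustrative assumptions, not specs of any particular camera):

```python
# Back-of-the-envelope pixel density: a stitched 360 clip vs a normal camera.
# Assumes an equirectangular output frame (its width covers the full 360 degrees)
# and a conventional camera with roughly a 40 degree horizontal field of view.

def px_per_degree_360(stitched_width_px: int) -> float:
    """Horizontal pixel density of an equirectangular 360 frame."""
    return stitched_width_px / 360.0

def px_per_degree_normal(frame_width_px: int, hfov_degrees: float = 40.0) -> float:
    """Horizontal pixel density of a conventional camera frame."""
    return frame_width_px / hfov_degrees

print(round(px_per_degree_360(3840), 1))     # "4K" 360 capture: ~10.7 px per degree
print(round(px_per_degree_360(7680), 1))     # "8K" 360 capture: ~21.3 px per degree
print(round(px_per_degree_normal(1920), 1))  # plain 1080p at a 40 degree FOV: 48 px per degree
```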
  6. I thought you got good at running by watching those "exercise motivation" videos on YT? Tell me you're not tired just watching that...
  7. Yeah. The more 3D stuff I watch, the more I think you have to be reeeeeally careful if you move the camera! Can you plug a USB cable into it when it's mounted? If so, you'd likely be using it with a pole mounted in its blind spot, so you could potentially just coil a USB cable up the pole? Log on an 8-bit, low-bitrate consumer camera doesn't sound that promising... Great! Another product that will only work in countries that are so cold all you do in winter is sit inside wanting to kill yourself, and then when it's barely above freezing everyone strips off naked in parks and old people and babies die because the temperature got all the way up to the temperature we run our refrigerators at.
  8. Cool. So make those changes to yours and then compare them again, then fix that, etc etc. Post when it's as close as you can get it.
  9. Open up each still in a browser window, swap back and forth, and tell me what you see. List as many differences as you can. This is actually a great exercise in learning how to see.
  10. These are very nice. One of the main challenges I see with this LUT pack is the variation in skin tones. Many of the images posted have skin tones where some areas of skin are obviously yellow and others are obviously pink, or where the skin tone is completely pink and there is nothing towards the orange side of that spectrum. I will be the first to admit that I'm not a keen judge of colour, but to me good skin tones seem to sit in the ambiguous area between the "why is that person yellow? are they sick?" and "why is that person pink? are they sick?" tonal ranges. Some examples show colour in both of these areas, and I suspect it's to do with excessive saturation in the final grade.

     I've messed around with these LUTs on my own footage across a number of different shoots and it does push the skin tones quite far towards the magenta, but when I look at Alexa footage it doesn't seem to be graded with that level of push, so I think controlling that aspect of the colour is an important part of grading this material (and I suspect the same applies when grading Alexa files, simply because of how well the LUT pack matches the Alexa; unfortunately I haven't had the pleasure of playing with Alexa footage myself). To this end I find a lot of benefit in using hue adjustments to ensure that the hues in skin tones are contained within that sweet spot between yellow and pink. I'm curious to hear what other people's views on this are, maybe it's a personal taste thing?
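     If it helps make that "sweet spot" idea concrete, here's a minimal Python sketch. It isn't any NLE's actual tool, and the hue window values are purely illustrative assumptions; it just reports whether sampled skin-tone pixels sit inside a nominated yellow-to-pink hue range:

```python
# Minimal sketch: check whether sampled skin-tone pixels sit inside a chosen
# hue window between yellow and pink. The window values are illustrative
# assumptions, not a standard; colorsys hue runs 0..1 with 0 at red.
import colorsys

SKIN_HUE_MIN = 0.02   # just on the orange side of pure red (assumed)
SKIN_HUE_MAX = 0.10   # approaching yellow (assumed)

def skin_hue_ok(r: int, g: int, b: int) -> bool:
    h, _l, _s = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    return SKIN_HUE_MIN <= h <= SKIN_HUE_MAX

samples = [(200, 150, 120), (210, 140, 160), (190, 170, 100)]
for rgb in samples:
    print(rgb, "in window" if skin_hue_ok(*rgb) else "drifting out")
```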
  11. I'm confused, is it "backpacking cats" or "back packing cats"?
  12. kye

    Framerate

     I just operate under the concept that you choose your base frame rate and carefully match it when shooting normal-speed clips, but when shooting slow-motion you just conform it properly and it kind of doesn't matter. For example, if you're on a 25fps timeline and you shoot 50p slow-motion then it will conform to 50% of real speed, but if you shoot 60p it will conform to roughly 42% of real speed. I'm not really sure there would be that many situations where a 50% speed shot is required and a ~42% speed shot wouldn't also be acceptable. In that sense I kind of view 50p and 60p as being the same.
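     The conform arithmetic is just the timeline rate divided by the capture rate; a quick Python sketch of the numbers above:

```python
# Playback speed after conforming slow-motion footage to a timeline frame rate.
def conformed_speed(capture_fps: float, timeline_fps: float) -> float:
    """Fraction of real-time speed once the clip is conformed to the timeline."""
    return timeline_fps / capture_fps

print(conformed_speed(50, 25))   # 0.5    -> 50% of real speed
print(conformed_speed(60, 25))   # 0.4166 -> roughly 42% of real speed
print(conformed_speed(60, 24))   # 0.4    -> 40% on a 24fps timeline
```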
  13. Interesting discussion, but we need to be careful we don't deviate from what actually matters, which is how the image looks to people. @amanieux is correct that there is lossless and lossy, but there is no real 'visually lossless' category, unless they saved space by removing the metadata or something, but that's basically pointless. It's more accurate to talk about how much visual loss is caused by how much compression. For example, mp3s are very highly compressed (down to single digit percentages) but the perceived sound quality remains a lot higher than the file sizes would suggest, because they're very smart about what information they remove, and have chosen to remove the things that humans are least sensitive to.

     In visual terms there are equivalents to this, and certain changes do a lot more perceived 'damage' to an image than others. For example, if you took a video signal and made every pixel 2% brighter then we basically wouldn't notice and no-one would complain. But if you made every pixel 2% brighter, but only on every second frame, it would flicker like mad and would be a total shit-show, despite the video actually having half as much total error. Obviously this is a ridiculous example, but when we compress video we are essentially doing two things: we are choosing how much total deviation there will be (all else being equal, more compression = more deviation from the source material) and we are choosing where and how that deviation will be allocated, both within the frame and across frames. When you heavily compress an image you often find horrific blocking artefacts in plain shadow areas, and also on flat surfaces next to edges. This is due to how the image is compressed and which algorithms are used to do so. It's less obvious today since the compression in JPG and MPG has kind of taken over, but those of us who recall comparing a GIF file, which only had a small number of colours but could do flat surfaces and edges perfectly, with a JPG image, which could do many more colours but handled flat surfaces and edges abysmally, will know what I mean.

     It's kind of like when we talk about ISO performance of a new camera - one reviewer says "ISO 1000 is the limit" and the next person says "ISO 6400 is totally usable" and we're really none the wiser, because all we know is that they have different tolerances for noise in their images. In video we do have some standard measurements, like signal-to-noise ratio, but unfortunately that's difficult to calculate, and because perception isn't purely mathematical it's also not a reliable predictor of how good something looks. In a sense, what we should have is a standard test of what percentage of people could tell the difference between the source and a compressed version over a range of delays between the images. That would give us something like "compression example 1 was visibly different to 80% of people over a 1 second gap and only 30% of people over a 40s gap, whereas compression example 2 was visibly different to 60% of people over a 1 second gap and only 10% of people over a 30s gap", and that would be useful when compared with other things like file sizes etc. Unfortunately, I don't know of such a standard test, and even if there is one, no-one seems to be using it.
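     If the brightness example sounds abstract, here's a toy Python sketch with made-up numbers, not a real codec comparison: the flickering version has half the total error of the evenly-lifted one, but a much larger frame-to-frame jump, which is the part our eyes object to.

```python
# Toy illustration of the brightness example above: two degraded versions of a
# constant mid-grey clip, with pixel values expressed as fractions of full scale.

frames = 100
source = [0.50] * frames

even_lift = [v + 0.02 for v in source]                                   # every frame 2% brighter
flicker = [v + (0.02 if i % 2 else 0.0) for i, v in enumerate(source)]   # only every second frame

def total_abs_error(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

def max_frame_to_frame_jump(clip):
    return max(abs(clip[i + 1] - clip[i]) for i in range(len(clip) - 1))

print(round(total_abs_error(source, even_lift), 2))   # 2.0
print(round(total_abs_error(source, flicker), 2))     # 1.0  -> half the total error
print(max_frame_to_frame_jump(even_lift))             # 0.0  -> steady, basically invisible
print(round(max_frame_to_frame_jump(flicker), 2))     # 0.02 -> flickers like mad
```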
  14. Hell, even just reading out select sentences from this thread would be funny. It would sound like crazy people all standing around blabbering about kind of the same stuff but kind of not. It would even have quite a crescendo when you get to this post here and it reveals what you did and what they've just watched.
  15. In addition to @Towd's good advice, I'd suggest bookmarking the demo videos from the big cameras as colour reference standards - eg, the video from ARRI for the LF, the Canon C300, some of the Sony ones, Panasonic EVO, etc. These videos are demoing the camera, and to show off the camera they spend a truckload on a grade to make it look great, but they're not trying to show 'a look', they're trying to show the camera in as neutral a way as possible. If you swap back and forth between your grade and these demo videos then you'll have a good, reliable reference to make sure your eyes aren't drifting too far from a neutral position. You can even use the demo videos as references to find images from nice films (film trailers, for example) that you can collect into a 'look book' of reference styles. All these references will help you not run away with yourself, give you inspiration from reliable sources, and also partly make up for the lack of a properly calibrated colour grading setup. I don't know about the various copyright implications involved, but (in theory!!!) you could download those reference videos from YT with a video downloader, then pull still images from them in your NLE, and save them into a folder that you've bookmarked in your NLE so you can pull them up at any point during grading for a quick reference. You know, in theory...
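     Purely as a sketch of that in-theory workflow (it assumes yt-dlp and ffmpeg are installed and on your PATH, the URL and timestamps are placeholders rather than real links, and output naming/format handling is simplified):

```python
# Sketch: grab a reference video and pull a few stills from it into a folder
# you can bookmark in your NLE. All names below are placeholders.
import subprocess
from pathlib import Path

REFERENCE_URL = "https://example.com/placeholder-demo-video"   # placeholder, not a real link
out_dir = Path("grade_references")
out_dir.mkdir(exist_ok=True)

video_path = out_dir / "reference.mp4"
subprocess.run(["yt-dlp", "-o", str(video_path), REFERENCE_URL], check=True)

# One still per nominated timestamp, saved alongside the video.
for stamp in ["00:00:10", "00:01:30", "00:02:45"]:
    still = out_dir / f"ref_{stamp.replace(':', '-')}.png"
    subprocess.run(
        ["ffmpeg", "-ss", stamp, "-i", str(video_path), "-frames:v", "1", str(still)],
        check=True,
    )
```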
  16. kye

    Lenses

     Thanks! I feel like I got more value than I paid for. One of the things I noticed in my lens tests was that when I was comparing lenses with slightly different focal lengths at the same f-stop, the slight difference in DOF (the longer lens has a slightly larger physical aperture) actually made quite a bit of difference to the 3D pop, even at something like f5.6 (which is f11 in FF terms). In some sense there really isn't an alternative to faster glass. LOL, maybe. Although when I was looking for comparisons I kept finding them in various places, what surprised me was that even with the right keywords Google wasn't that good at unearthing them. I wouldn't be too worried about how much we're impacting the global marketplace here in this thread.
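     For anyone who wants the equivalence arithmetic behind that, here's a quick sketch (the 2.0 crop factor assumes MFT, and the focal length and f-numbers are just example values):

```python
# Depth of field (and the "pop") tracks the full-frame equivalent aperture,
# i.e. f-number multiplied by the crop factor.
def ff_equivalent(focal_mm: float, f_number: float, crop: float = 2.0) -> tuple:
    """Return (equivalent focal length, equivalent f-number) on full frame."""
    return focal_mm * crop, f_number * crop

print(ff_equivalent(25, 5.6))   # (50.0, 11.2) -> f5.6 on MFT behaves like ~f11 on FF
print(ff_equivalent(25, 1.4))   # (50.0, 2.8)  -> why there's no substitute for faster glass
```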
  17. I think they're different cameras. There is a fundamental aesthetic difference between digital and optical image stabilisation, and also between the over-capture and framing workflows. For my travel and family videos I am tempted to add a 360 cam as a b-cam to my GH5, but they would be for very different situations and types of shot. Lok's video shows the stitching on the EVO having problems with closer objects because the cameras are a lot further apart - something to consider if you're unsure about framing and will have objects close to the camera. Interesting that the BBC is using it. I suspect their quality threshold is roughly the same as mine, where 4K wasn't acceptable at all and the new ~6K ones are just on the cusp. When 8K h265 becomes mainstream and they put it in a form factor like the X, I think that will start to be useful in a serious way, much the same as how GoPros became a viable part of a low-budget production kit over time as the quality improved. If you can do a half-decent job and be the first person to do it (or if previous attempts weren't good enough but you cross some kind of threshold) then you've got a shot at getting that "first one" PR and success. But you'd have to be quick!
  18. Lok was impressed with the 3D-without-goggles feature:
  19. A trick in grading you can do to 'see better' is to add a contrast/saturation effect at the very end to both shots. I'd suggest:
       • Start with normal settings and get the right colour space transformation
       • Sat=0 and just match the overall contrast and gamma curve
       • Sat=0 with added contrast to fine-tune the gamma adjustments
       • Sat=150-200% with normal contrast to exaggerate the colours and match the WB, the saturation of various hues, and any tints to shadows / highlights
       • Back to normal sat and contrast and play the "what looks different" game
     In a sense, a lot of grading to match things is just looking for what looks wrong, fixing it, and keeping on doing that until it's good enough. The other part of grading is looking for what might make the image better (whatever that means for the project), trying things, and when you find something that improves it, dialling in the strength of it and then looking for the next thing.
  20. kye

    Lenses

     You're welcome!! Of course, nothing in my lens tests is stopping you from just putting a 50mm on the camera and then leaving the house... Wider than the 28? Or similar to, but wider than, the Helios? If you're talking about the Helios, then here: http://www.reduser.net/forum/showthread.php?152436-Russian-Soviet-USSR-Lens-Survival-Guide You're welcome too!! Ah, pigeons located, cats thrown... my work is done!
  21. Awesome - 3D stuff is moving along nicely. In terms of the fisheye stuff, when you're viewing a 3D image with VR goggles you're cropped into the image significantly, plus the fisheye is eliminated in software. This is a screen grab from a mobile VR test app I wrote: when you have the goggles on, that line is straight...
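     For anyone curious how the fisheye gets 'eliminated in software', here's a generic numpy sketch (nothing to do with my actual app; the field of view, view direction and nearest-neighbour sampling are all simplifying assumptions) that samples a flat rectilinear view out of an equirectangular 360 frame:

```python
# Sample a rectilinear (flat) view from an equirectangular 360 frame.
# Nearest-neighbour only; illustrative, not production code.
import numpy as np

def rectilinear_view(equirect: np.ndarray, out_w: int = 640, out_h: int = 480,
                     hfov_deg: float = 90.0, yaw_deg: float = 0.0,
                     pitch_deg: float = 0.0) -> np.ndarray:
    src_h, src_w = equirect.shape[:2]
    focal = (out_w / 2) / np.tan(np.radians(hfov_deg) / 2)   # pinhole focal length in pixels

    # Ray direction for every output pixel (camera looks down +z).
    xs, ys = np.meshgrid(np.arange(out_w) - out_w / 2,
                         np.arange(out_h) - out_h / 2)
    dirs = np.stack([xs, ys, np.full_like(xs, focal, dtype=float)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)

    # Aim the virtual camera with a combined yaw/pitch rotation.
    yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
    ry = np.array([[np.cos(yaw), 0, np.sin(yaw)], [0, 1, 0], [-np.sin(yaw), 0, np.cos(yaw)]])
    rx = np.array([[1, 0, 0], [0, np.cos(pitch), -np.sin(pitch)], [0, np.sin(pitch), np.cos(pitch)]])
    dirs = dirs @ (ry @ rx).T

    # Convert ray directions to longitude/latitude, then to source pixel coordinates.
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])           # -pi .. pi
    lat = np.arcsin(np.clip(dirs[..., 1], -1, 1))          # -pi/2 .. pi/2
    u = ((lon / np.pi + 1) / 2 * (src_w - 1)).astype(int)
    v = ((lat / (np.pi / 2) + 1) / 2 * (src_h - 1)).astype(int)
    return equirect[v, u]

# Example: a dummy 2000x1000 equirectangular frame in, a 640x480 flat view out.
dummy = np.zeros((1000, 2000, 3), dtype=np.uint8)
print(rectilinear_view(dummy).shape)   # (480, 640, 3)
```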
  22. Roll two dice and then open a dictionary to a random page, go to the line number matching the first die, and take the word at the position on that line matching the second die. Do that a dozen times and write them all down. Pick the stupidest / worst / most bizarre three and make a script inspired by or including those words. You can choose to apply a three-act structure or conflict or drama or tension or whatever the hell you want, but the vast majority of "cucumber buzzsaw hysteria" or "spontaneous curry nightingale" films will be fun to make and fun to watch.
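     Or, if rolling dice feels like too much work, a quick Python version of the same idea (the word-list path is an assumption; any word list will do):

```python
# A lazier version of the dice-and-dictionary trick: pull a dozen random words
# and print a few three-word prompts. Uses the word list many Unix systems ship
# at /usr/share/dict/words.
import random

with open("/usr/share/dict/words") as f:
    words = [w.strip() for w in f if w.strip().isalpha()]

dozen = random.sample(words, 12)
print("Rolled:", ", ".join(dozen))

# Pick the three stupidest by hand, or just take random triples as prompts.
for _ in range(3):
    print("Prompt:", " ".join(random.sample(dozen, 3)))
```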
  23. I was tempted to list it, but it depends on what type of stills you shoot. For example, I doubt it can match the range of shutter speeds of a stills camera. But you're right that raw frame grabs are nicer than the stills from the majority of stills cameras on sale today.