kye

Posts posted by kye

  1. 3 hours ago, herein2020 said:

    That is very true, post processing is every bit as important as properly shooting the scene. 

    Even if it had ALL-I I would never use it. The much larger file sizes, more expensive memory card requirements, and the minimal improvements in editing performance combined with even less improvements in image quality (if any) make ALL-I a very unattractive proposition for me. For my specific type of work I have never been in the middle of editing and wished that I had RAW or ALL-I footage.

    ALL-I gives a HUGE performance boost when editing!  If I ask an NLE to go to a specific frame in a file, Long-GOP requires the decoder to find the previous keyframe and decode every frame between that keyframe and the target frame.  ALL-I just goes to the frame and decodes that one frame.  ALL-I footage also plays backwards with the same CPU/GPU performance as playing forwards.

    For the same bitrates ALL-I does take a slight quality hit, but it's not that much.  

    The bitrates aren't that different either - the GH5 has 200Mbps and 400Mbps ALL-I modes, but there is nothing stopping Panasonic from adding other bitrates as well.

    ALL-I is one of the reasons that BM can get a modestly spec'd computer to play 12K BRAW footage, when almost no computer on earth can play the 8K h265 files from the Canon R5.
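    The seek-cost difference can be sketched with a toy model.  This is illustrative only - the GOP size and frame numbers are made up, not any real codec's behaviour:

    ```python
    # Toy model of random-seek decode cost: Long-GOP vs ALL-I.
    # GOP size and frame numbers are illustrative, not from any real codec.

    def frames_to_decode(target_frame, gop_size):
        """Frames a decoder must process to display `target_frame`.

        Long-GOP: decode from the previous keyframe forward to the target.
        ALL-I is the special case gop_size == 1 (every frame is a keyframe).
        """
        last_keyframe = (target_frame // gop_size) * gop_size
        return target_frame - last_keyframe + 1

    # Seeking to frame 119 with a 30-frame GOP: decode frames 90..119.
    print(frames_to_decode(119, gop_size=30))  # 30 frames of work
    # Same seek in an ALL-I file: decode just the one frame.
    print(frames_to_decode(119, gop_size=1))   # 1 frame of work
    ```

    Playing backwards makes the gap worse still, since Long-GOP pays that keyframe-walk cost for every single displayed frame.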

  2. 3 hours ago, herein2020 said:

    I think the reason you think the S1H is a lot better is because it is more likely that the buyers of the S1H are shooting higher end productions where they can properly light the scene, use better lenses, etc. Unless the S1H is shooting 5.9K raw (which BTW even the S5 can do now to an external recorder), the S5 and the S1H should produce nearly identical footage if all else is equal (lighting, staging, set design, lens, etc.). 

    I agree...  Also, S1H buyers are more likely to have higher skill levels to get the most out of the camera's footage - post processing shouldn't be underestimated.  The more I learn about colour grading, the more I view camera footage (even rec709 h264 files) as an unprocessed image: input to an entire image processing pipeline.

    Does the S5 have all the 10-bit 422 ALL-I codecs yet?  I haven't kept up with it, but when it launched it was like the GH5 at launch, limited to unremarkable 72Mbps type codecs.

  3. The XC10 can undercrank.  I shot some pretty amazing night footage once at about 1fps, which I didn't use in the edit but was quite useful at the time because the camera could see the boat passing nearby, how far away it was, how many people were on it, and many other things, when to the naked eye the boat was completely in darkness apart from its headlights.

    I was talking about filming 24fps and then playing it back slower, ie 12fps / 50% speed, or lower.  Most recent usage was the dramatic scene where a woman is crying and chasing a train which is pulling out of the station and has her lover on it.  I felt it matched the aesthetic very well.

    My understanding of cinematic language is that 50p conformed to 24 is typically used in Hollywood to make something seem a bit surreal, the way we experience reality during peaks of emotion.  Shooting 50p and playing it back at 24p slows time but still gives a continuity of medium, ie, it doesn't break the illusion of continuous motion.  It is also a subtle enough speed reduction that the footage doesn't feel overly artificial, whereas 120p at 20% feels completely artificial and calls attention to itself.

    Having less than 24fps displayed to the audience will break the continuity of motion.  This can be used to great effect for sequences where the perception of a character is highly impaired, such as due to heavy drug use, injury, or severe trauma.  In situations like this the use will likely be seen as fitting into the context, but if it had no alignment to the story then it would pull the audience out of the experience and call attention to the medium, so it can't be used without careful consideration.

    Slowing down 24fps footage will do both of the above things - it will slow down the events to slower than real-time and it will show an altered perception of reality.  This may be appropriate in situations like the point of view of a car accident victim coming in and out of consciousness, or the above mentioned woman chasing the train, but the level of emotional engagement has to be right - it would work for the woman if her lover leaving would mean that her life would be basically over but it likely wouldn't work if her lover was going away for a week and forgot his spare socks.

    I think frame rate effects are something that you can only use effectively when you understand the aesthetic associations involved and are using these to support the dramatic content of the story.  Failure to do so is one of the things that I see with vloggers who have copied the Casey Neistat vlog formula but don't understand it, so they use slow-motion in ways that say "I just discovered this function of my camera" rather than "here's something that looks awesome in slow-motion, and the thing and the slow-motion effect are both relevant to the story".  

    One thing that is interesting is the use of slow-motion pedestrians walking in a city.  It's a stereotypical image that I find works quite well occasionally, typically when the music is on the higher-intensity end of things, but also where the context is high-intensity.  If someone always wanted to go to NY and this was their trip of a lifetime then that context might warrant it, but a NY vlogger going out to buy milk would have to try a lot harder to 'justify' the altered-state high-emotional context of that shot.  The wedding film-maker shooting a wedding couple in the middle of Times Square that I saw when I was there might get away with using it in the wedding reel, as the wedding is likely a high-emotion situation too.

    I see film-making as a three-level thing: 1) understand the technical aspects, 2) understand the aesthetic of each technical aspect, and 3) apply the right technical choices in order to support the story at each given moment.

  4. Recently I've seen a few instances where a TV show or movie slowed down 24p footage, effectively creating a <12fps shot.

    The interesting thing is that often these productions are modern, shot on Alexas, and could just as easily have shot 50p or 120p+, but didn't.  The moments that were slowed down were obvious shots for slow motion, so it's not like the editor went in an unexpected direction that the director or script couldn't have anticipated.

    The thing that strikes me is that it's a different effect to shooting 50p and slowing it down.  A different aesthetic.  TBH it seems more timeless and more emotional than the typical 50p, high-emotion, cue-the-big-music, pivotal story moment that is commonly done.  It feels more like classic cinema.

    Have you noticed this?  How does it feel to you?

    Would you do it deliberately?

  5. 2 hours ago, Happy Daze said:

    This post is not particularly about my friends and family.

    My main point was that people are generally ignorant about frame rates, dynamic range, calibration etc. I feel that the TV manufacturers should spend more time calibrating the TVs they sell to a similar standard to what the creative industry sees as a standard. TVs out of the box can be set to ridiculously bright/saturated levels; I assume they do this because when they are on sale in retailers they want them to stand out as bright and beautiful against all of the other sets that are up for sale. TVs these days are technically superior and are very capable of being calibrated to look the way that creators intended their material to be viewed, and I know that a lot of people find their TVs quite complex and feel a sense of achievement if they can just take it from the box, connect to WIFI and start watching Netflix, Amazon and Youtube.

    When you consider that Covid is likely to change the way people view going to the cinema (assuming that cinemas can survive the current pandemic, which I think is unlikely), most people will consume their media choices on the TV or other devices in the home. It wouldn't hurt manufacturers to calibrate their TVs sensibly so that the experience is as close to the original creation as possible out of the box. Then those that want to can ruin the experience as they desire, but I feel that a calibrated look should be the starting point.

    As for frame rates, again the standard appears to be interpolation enabled out of the box (should the TV support it); my question is why? During setup, a simple question such as "would you prefer to watch your movies and programming the way they were intended, or would you prefer the enhanced experience?" along with a message explaining how to change that decision in the future would do. There is a lot of discussion surrounding shutter angles etc; with interpolation the whole point is moot.

    Presently the only people that are going to appreciate the effort and the financial commitment you make to your creation are other people who create; the public will happily consume hours of Youtube videos filmed on a 720p Chinese action cam or reruns of low res TV shows from the 70s & 80s. So my point is that if TVs were calibrated properly from the box, more people may get to appreciate the difference a little more easily.

    So it's not about taste or the sub-conscious, it's about people accepting what they have bought into because they probably don't know any better and they trust that the expensive TV that they just unpacked must be set up OK, right?

    TV manufacturers are in the business of selling TVs.

    If you want to talk about the mis-alignment between what film-makers want, what consumers can see, and what is getting used to sell TVs, then you're wasting your time talking about anything other than resolution, and specifically 4K.  Most people can't tell the difference between 1080p and 4K at normal / sensible viewing distances, and most video content even shot in 4K looks awful when viewed close enough to see the horrific compression that gets applied to it by streaming services.

    HDR is something that I think is worthwhile, and an argument can be made for why that should be something that is worth re-buying your TV for, but 4K in the consumer market is just a gimmick to sell things.

    I find that the world makes zero sense when you don't look at it from the right angle - the design of a TV isn't to give the best cinematic experience, it's to put dollars into the pockets of electronic company CEOs!

    Lots of discussions here on the forums are around how depressing the world has become, for a variety of reasons, and I think that a major contributor is the idea that the world should be a certain way, and that that way is different to how it is.  This is simply a recipe for frustration and depression.  

    I think Fuzzynormal said it best....

    13 hours ago, fuzzynormal said:

    You build what you want to an acceptable baseline that satisfies you and then release it into the wild.  What happens after that is anybody's guess.

     

  6. If you care about it that much then I'd say get more discerning friends!

    My experience is that people see / hear / smell / taste things differently and uniquely, not worse or better.  They are likely not sensitive to the things that you are attuned to, and vice versa.  When I ask my wife about colour grading she has all kinds of opinions, but she doesn't know the words to use, so it's not an easy subject to communicate about.  My overall impression is that she's sensitive to different things than I am, rather than not caring.

    This was my experience when I was into high-end audio equipment too - it's very common for the wives of audiophiles to be able to tell that you did a minor upgrade without being told, often noticing it from the other room.

    The other thing to consider is that much of film-making is the deliberate crafting of things that other people find sub-conscious or completely un-conscious.  The difference between good and bad editing for example results in the viewer being able to follow the plot vs being confused by certain elements, and this is often accomplished (especially by the great directors / editors who show and don't tell) by the careful arranging of lines, shots, and edits which the viewer will correctly interpret but likely won't be aware of.

    I suggest that rather than rant on the internet because the world doesn't align to your particular preferences, you take this as an opportunity to learn what they do care about and focus on that.  After all, you're making content for other people, and not yourself, right?  

  7. 4 hours ago, BenEricson said:

    You are correct in your research.

    Let me rephrase what I said. I believe that if you're after that overall softer / filmic look, the bmpcc will get you there easier and quicker. There's no sense shooting with a GH5 if you can get that look with less work. A colorist friend of mine often says, "the least amount of moves wins."

    I definitely agree with the sentiment "the least moves wins", although I would refine that considerably.

    I used to find myself basically fighting the footage, unable to get the footage to look even half-way decent.  This was because my skill level was basically zero at that point.  As I gradually learned, I started being able to make a few good adjustments before the 'tweaking' started doing more harm than good.  Now, I think of it as a skill-level thing - the more skill you have the more adjustments you can make before you're making it worse and not better.

    My new workflow is now to apply a look (Kodak 2393 LUT with some blurring/texture as shown above is my current favourite but I'm adjusting and optimising over time) and then to WB, adjust levels, and do any localised corrections required, but all underneath the look.  The secondaries are normally Hue-v-Hue / Hue-v-Lum / Hue-v-Sat but I will also do local adjustments if required.  These are often to match shots rather than significantly push things around.  I'm often doing things like darkening / desaturating distracting objects in the background or doing a large and soft power-window to brighten the subject.  

    I regard this as being quite minimal, considering that the 2393 is by far the largest adjustment in the grade and it was created by people far above my skill level, so I'm still responsible for the minority of the grade.  With my control surface I'm able to rip through an edit only spending a short amount of time per shot, by only making a few relatively repeatable adjustments.

  8. 6 hours ago, TomTheDP said:

    Blackmagic probably has better highlight roll off then. Though highlight roll off can be controlled in post as long as the dynamic range is there. 

    Every camera has great rolloff if the DR is there, just apply a curve.  If I can get a compliment on the highlight rolloff from a budget P&S camera from 2009 then there are no excuses!

    It's also possible to get a nice rolloff on areas that are clipped, although you don't get any detail back from them obviously.  
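    As a sketch of what "just apply a curve" means - a simple shoulder that eases values into white instead of clipping hard.  The knee point and the exponential shoulder here are my own arbitrary choices, not any camera's actual transfer function:

    ```python
    import math

    def rolloff(x, knee=0.8):
        """Soft highlight rolloff: linear below `knee`, then an exponential
        shoulder that approaches (but never reaches) 1.0.

        A hard clip would be min(x, 1.0); this eases into white instead.
        """
        if x <= knee:
            return x
        # Shoulder: compress everything above the knee into [knee, 1.0).
        return knee + (1.0 - knee) * (1.0 - math.exp(-(x - knee) / (1.0 - knee)))

    # Values below the knee pass through untouched; values above it are
    # squeezed progressively harder the brighter they get.
    for v in (0.5, 0.9, 1.0, 2.0):
        print(f"{v:.1f} -> {rolloff(v):.3f}")
    ```

    In a grading tool you'd draw this shape on the custom curve rather than write code, but the principle is the same: the gentler the shoulder, the more "filmic" the clip point feels.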

    The more I learn about colour grading, the more that I realise that grading is in the same category as production design or lighting, you can't expect your footage to look great if you don't do any set dressing, hair or makeup, or just shoot with whatever light happens to be there when you happen to show up, so why would it look great if you didn't do any colour grading either?

  9. 2 hours ago, IronFilm said:

    We had the theory today that the rise in popularity of Soju could be linked to the increase in popularity of K Pop. 
    But yes, watching Korean TV series could be another avenue. 

    Another personal theory I've got is that Soju is being used as a substitute for RTDs. 
    Especially as you can't buy stronger RTDs any longer, they've been banned 😞 😞 No more 12% Cody's of my youth! (although, I'd prefer 8% as the sweet spot) 

    Could be, they're very easy to drink!

    Speaking of Korean TV series, they're surprisingly well produced and image quality is often remarkably high.  My wife says the storytelling is also very good, so if you're inclined to a bit of soap then they might be worth a watch.  Netflix is full of them if you can navigate their algorithm in the right direction 🙂 

  10. Everyone wants to talk about 4K and 6K and 8K and 12K, and to talk about 8-bit vs 10-bit vs 12-bit vs 14-bit, but when it actually comes to the quality of the image, people don't want to know.

    I was interested in it but found so little online that I ended up having to do my own study on the SNR of various codecs.  It's here:

    You're talking about the quality / performance of the same flavours of codec but with differing encoding methods (software/hardware).  I'd bet you won't find anything and you'll have to do your own testing to find answers.  

    I found it almost impossible to get information on the T2 chip that even specified if it was supported, and if so, what codecs / bit-depths / subsampling it supported.

    Good luck, but my advice is to give up asking.

  11. 40 minutes ago, IronFilm said:

    I didn't even realize there was so many different flavors of Soju until I started working on this ad campaign last week! Guess it has been a long long time since I last drank Soju before this. 

    We bought one of each and tried them all at a dinner party, with my wife (who is heavily into watching Korean TV series) explaining the customs around drinking it, with the eldest person serving first, how to hold the bottle as you pour, who has the glass highest when toasting, etc etc.  

    The only challenge was the 'original' flavour, which reminded me of grappa / methylated spirits!  Urrgh.

  12. 2 hours ago, herein2020 said:

    That was what I was alluding to in regards to the hours of research. I have had every incompatibility imaginable; cases that blocked critical slots, power supplies missing the proper connector to power the video card, memory that was simply the wrong kind (and memory almost always is not returnable), CPU coolers that would not fit the motherboard, motherboard risers that got stripped while screwing them into the mounting board, BIOS's that refused to see the hard drive, BIOS's that refused to see the add in video card, RAID controllers that would do everything you could imagine except let you boot from any drives connected to them, etc. etc.  

    To me adding storage, a video card, and NVME quad port card is a piece of cake compared to dealing with the rest of it. Even for non video editing computers, I no longer build any of them. I just get used enterprise class gear off of eBay. You can get great deals on high end used desktops at the 3yr mark on eBay because that's when many business leases end. And I'll take a 3yr old enterprise class desktop/laptop/networking equipment any day over brand new consumer grade hardware.

    Yeah, that's what I thought you were getting at.

    2 hours ago, herein2020 said:

    I actually know a photographer who still uses a 5DIII for photography and the best part is....it's the only camera she has ever owned (7yrs)....and she has a single lens which she has never removed from the camera since the day she got it (Canon 50mm). She also has never owned a single flash and when we started talking gear she didn't know the difference between an EF mount lens and an RF mount. She is actually a well known photographer in my area and charges over $300/hr for sessions; AND she is booked for weeks straight. It all seemed crazy to me but she is very friendly, great with families and posing kids, and that is all she does.  She literally knows nothing about flashes, gear, diffusion, etc, and shoots 100% natural light, yet she has managed to become a successful photographer in my area; and for post processing she uses a single Lightroom preset which is her signature "look". Meeting her really reminded me that you can make nearly anything work these days and the gear is only a small piece of the big picture.

    Wow...  you mean that hiring a photographer (or videographer) isn't just hiring their camera, and that the operator is more than just a technician?????  

    HOLD THE FRONT PAGE!!

    But seriously...  yeah.  All this equipment stuff is BS if you're putting it ahead of the creative side.  No-one does their best work when fighting with the tech, even if the tech is nice tech.  "The acting was uninspired, the lighting was awful, and the story was un-engaging, but overall it was a great film because it was shot on a great camera with high resolution and great colour science" - said no-one ever.

  13. 1 hour ago, greenscreen said:

    Because Resolve power is based on GPU but there isn't only one NLE on earth, isn't it? 😉

    This conversation gets better and better....   The simple choice of wanting 8K output not only dictates the hardware that the OP has to use and necessitates a proxy workflow (which the OP seems to be against), but now it dominates the choice of NLE!

    Never mind if I hate the computer, workflow, or software....  I have 8K!

    I think the OP should really be asking themselves how much 8K is worth in comparison to the other things it looks like they have to trade-off.  I used to be very 4K-centric and was wanting to future-proof my videos as much as possible (which is a genuine thing considering I am basically the family historian so videos will get more interesting over time rather than less) but as I gradually discovered what I liked in an image and what made the most difference to getting those results I have gone down to 1080p.  

    My interest in 4K was naive and it was only through learning more about the craft that I realised how little it actually matters.  Everyone is different in their priorities, but when priority #1 means having to compromise on #2, #3, #4, and #5...  it's a good time to question how that stacks up.

  14. 37 minutes ago, IronFilm said:

    "Free" only in the sense of computing power needed, in that you're enabling a "4K capable" PC to edit 8K via the proxies cheat. 

    But you're still spending thousands of dollars more on storage, and thousands of dollars more on your equivalent camera. 
    Not counting the value of your time either. 

    None of this is likely to make any financial sense whatsoever. 

    But perhaps you don't care about money at all. Fair enough, heck I've spent thousands of dollars on Ironman racing over the years, there is no financial logic to that either! And I'd rather someone has the hobby of 8K filmmaking than playing golf. 

    I was advocating for 1080p Prores / DNxHD proxies, so that only requires a computer capable of 1080p ALL-I playback, which these days is almost all of them.

    If he wants to shoot 8K and master in 8K then good luck to him - it's the editing experience that is the question.  Plus, who knows what kind of hype around 8K is present in the marketplace these days - until clients work out that 8K is pretty well useless and doesn't improve IQ even if it can be broadcast, there might still be money in offering 8K as a service, and considering the state of film-making in 2020 I understand using anything that gives a competitive edge.

  15. 8 hours ago, IronFilm said:

    I bet if I did a deep dive I could absolutely get a better bang for my buck system than HP can do! 
    HP isn't perfect. They're not optimizing for the same things as I am optimizing for, neither could they stay up to the minute like a proper computer nerd can. 

    The other challenge that @herein2020 was avoiding was that of incompatibility.

    My dad used to work for a large educational institution and ordered a custom-built PC to replace their main file server, so naturally ordered the latest motherboard, CPU, RAM, RAID controller and drives.   Long story short, two months after getting it he still hadn't managed to get an OS to install correctly, and neither had the other people on the hundred-page thread discussing the incompatibility, in which multiple people verified that the manufacturers of the various components were all blaming each other for the problem and no-one was working on a solution.  So my dad did what everyone else in the thread did and gave up.  He was lucky enough to be able to lean on their wholesaler to take the equipment back with a full refund, but others weren't so lucky, and the thread continued for another year as various people swapped out parts for other models/brands to see what the best fully-functional system was.

    Or you just buy something that someone else has already checked for you.

    There's a reason that many serious software packages are only supported on certain OS versions and certain hardware configurations.  It's because their clients value reliability and full-featured service and support rather than getting a 12% improvement on performance.

  16. 3 hours ago, Oliver Daniel said:

    It’s beautiful to hold and makes shooting very joyful. 

    While the C70 has many advantages and attractive features, I think the above is probably the most important factor of any camera because it impacts the level of creativity and inspiration of the video, not just what colour or clarity the pixels are in the output files.

    I consistently find that creativity evaporates when I feel like I'm fighting the equipment rather than it having my back and supporting the work.  In the creative pursuits this is a night-and-day difference, but of course it is also different for all people, so it's about the synergy between the camera and the user rather than one camera suiting everyone.

    Great stuff!

  17. 10 hours ago, shooter said:

    Yes, it is. Multi-million dollar Hollywood films don't need it. Different production values. Different gear. Etc. We cannot compare their cameras with ours. Our 1080p is far away of theirs. Our 8K is a way to improve our production value. Different targets too. We can shoot 8K today, why to end with a master below than that!?? Why not take advantage of a future proof?

    Fair enough.

    Unfortunately, your budget isn't sized appropriately for the resolutions you're talking about.

    I think you have three paths forward:

    • Give up on the laptop and add a zero to your budget, making it $20000 instead of $2000, then go find where people are talking about things like multi-GPU-watercooling setups and where to put the server racks and how to run the cables to the control room
    • Do nothing and wait for Apple to release the 16" MBP with their new chipset in it (this could be a few years wait though and no guarantees about 8K)
    • Work with proxies

    Proxies are the free option, at least in dollar terms, and you probably don't need to spend any money to get decent enough performance.  I'd suggest rendering 1080p proxies in either Prores HQ or DNxHD HQ.  This format should be low enough resolution for a modest machine to work with acceptable performance, but high enough resolution and colour depth so that you can do almost all your editing and image processing on the proxy files, and they will be a decent enough approximation of how the footage looks.  Things like NR and texture effects would need to be adjusted while looking at the source footage directly, but apart from that you should be able to work with the Proxy files and then just swap to the source files and render the project in whatever resolution you want to deliver and master in.
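    A minimal sketch of generating such a proxy with ffmpeg - the filenames are placeholders, and the function just assembles the command list so the flags are easy to check (in ffmpeg's `prores_ks` encoder, profile 3 is ProRes 422 HQ):

    ```python
    # Sketch: build an ffmpeg command for a 1080p ProRes HQ proxy.
    # Filenames are placeholders; run the list with subprocess.run(cmd).

    def proxy_command(src, dst):
        return [
            "ffmpeg", "-i", src,
            "-vf", "scale=-2:1080",   # scale to 1080p, preserve aspect ratio
            "-c:v", "prores_ks",      # ffmpeg's ProRes encoder
            "-profile:v", "3",        # profile 3 = ProRes 422 HQ
            "-c:a", "copy",           # pass audio through untouched
            dst,
        ]

    cmd = proxy_command("A001_clip.mov", "A001_clip_proxy.mov")
    print(" ".join(cmd))
    ```

    Most NLEs (Resolve included) will generate proxies for you from a menu, but knowing what's happening under the hood makes it easier to batch the job or hand it off to a spare machine overnight.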
