
Posts posted by kye

  1. Is there any way to tell what focal length a zoom lens was at from looking at your footage?  EXIF data?  Any other tricks?

    I shoot with zooms and would like to know what focal lengths ended up in the final edit.  (and no, I don't have logs or anything - I shoot completely unplanned home and travel videos, most of the time I'm just trying to capture and anticipate whatever is going on).

    I suspect the answer is no, but maybe someone will surprise me and make my day?
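    For what it's worth, some cameras do write lens metadata into their video files, and ExifTool can read it when it's there - whether any given camera records it is another matter. A minimal sketch (the clip filename is hypothetical, and ExifTool must be installed for the lookup to actually run):

```python
import shutil
import subprocess

# Ask exiftool for any focal-length tags in the clip's metadata.
# "clip.MOV" is a hypothetical filename; substitute your own footage.
cmd = ["exiftool", "-FocalLength", "-FocalLengthIn35mmFormat", "clip.MOV"]

if shutil.which("exiftool"):  # only run if exiftool is actually installed
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout or "No focal length tags found.")
```

    Whether those tags survive into the files your NLE exports is a different question - this only inspects the camera originals.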

  2. On 7/9/2018 at 9:23 PM, tihon84 said:

    Roman posted this video on Vimeo, and I think it looks amazing.  Look at these colors.  This is the look I'm searching for all the time.

    Shot on URSA Mini pro

    Nice video..  lens flares and colours look really good.

    But the content!  All test videos should be so entertaining!!

  3. On 7/15/2018 at 9:36 PM, DBounce said:

    In response to the thread title... apparently, yes!

    That video is wonderful..  any camera in the hands of a skilled operator is a good camera.  This is precisely my problem - good camera, unskilled operator lol

    3 hours ago, webrunner5 said:

    So the DJI Gimbal will balance all of it, or does it have a counter balance weight you have to screw on?

    I see the Moment guys always using counterweights and that anamorphic adapter looks pretty sizeable, so my guess would be yes to counterweights.  Even if it was powerful enough, you'd save a lot of battery life by having the weight relatively balanced to begin with, although the battery life on these things is pretty amazing now.

  4. The Blackmagic cameras all seem to share a similar look and feel in how they handle light.  It is very nice, almost like they have put a diffuser over the sun.  There seems to be nothing harsh about the image.

    What is it about their cameras that creates this effect?  Is it the DR?  Colour science?  High bitrate (either ProRes or raw)?

    I really want to own a camera that has this look but my style of film-making relies heavily on stabilisation, zoom lenses, etc so they're just not the right tool for the job.

    16 hours ago, Papiskokuji said:

    Just wanted to share a short piece I shot with my BMMCC during a sunny afternoon in Paris !

    Nice video - I particularly like the fun / different editing style!

  5. 8 minutes ago, Snowbro said:

    The best is the people who purposely grade their footage to look like log and post it to YouTube as cinematic footage :) haha. I see a lot of this now.

    Low contrast looks sure can look gorgeous - one of my favourites is Peaky Blinders on Netflix.  It just looks spectacular!

  6. For those of you who shoot (or are connoisseurs of) beautiful travel films, what types of movement do you prefer them to contain?

    I recently binge-watched a couple of hours of BMPCC travel films and I noticed that the vibe was quite distinctive, partly because the lack of stabilisation meant that most shots were static.

    What mix of freeform movement vs controlled movements (pans / tilts) vs static shots do people think suits the genre the most?

    I shoot mostly travel / home videos and the vibe I'm going for is kind of like the nicest version of real-life - in the same way that you'd take holiday photos using the best light / best angles / nicest smiles to create the warmest memories.

  7. One thing I noticed about YT microphone reviews (from people who don't know much about audio) is that they tend to be judging the EQ of the microphone instead of the quality of the sound overall.  I'd hear a microphone and think it would be fine with a bit of EQ and then the person would say something like "it sounds thin and hollow - not even usable!" and I'd just roll my eyes.

    @IronFilm is this something you also see across lots of reviews?

    In a way it's like filming in RAW and then saying the camera footage looks "too grey - not even usable!!".

  8. On 7/19/2018 at 2:02 PM, jonpais said:

    a7 III with the Voigtlander 65mm f/2.

    In a way that's a side effect of the large aperture lenses - those things are so fat it's more like the lens is squishing your fingers against the grip instead of the other way around!!

  9. @wolf33d interesting information - thanks.  The internet isn't keen to tell you the weaknesses of various products!

    In terms of the size, I'm sure that lots of accessories are (or will be) available for making the camera bigger - if only there were accessories available for making a camera smaller and lighter!!  Then I might buy a C700 and as many accessories as it takes to make it the size of an RX100

  10. 3 hours ago, Snowbro said:

    Hey guys; so I was telling a little story and used the Rode VideoMicro for the first time on my 1DX the other day. I know Canon doesn't have great preamps, but I didn't realize how bad it can be. I am not great at sound (just use Premiere CC); I tried to fix the whooshing sound the best I could, without sounding like a robot haha. Can you tell me if it sounds ok? Or if you think I need to use a better mic..

    Nice video, especially the funny bits from Grandma!

    I did a bunch of research before I ended up with RVMP+ and the TAKSTAR SGC-598 and Pixel MC-50 seemed to get positive reviews, and at $40 or less are HUGELY cheaper than the Rode or the Azden.

    However the reason I went with the Rode was that those two mics are quite large and I'm trying to fly under the radar with my setup, so that matters to me.  I'd watch a bunch of reviews to see if there's a reason to pay more than $40...

  11. 13 hours ago, Robert Collins said:

    Ok so here is what I don't get about this.... (BTW I understand that 2,000MB/s disks are great for operating system start-ups.)

    So, say my Sony records at 100Mb/s which is 12MB/s, and my Mavic Pro records at 60Mb/s which is 8MB/s. I can download RED 4K raw which will be 20MB/s and 160Mb/s.

    So what essential use is a 2,000MB/s hard disk in terms of read or write speeds for video?

    And I know I sound a bit of a whinger. But essentially I am in the 'don't mind spending money' camp, as long as I get decent performance gains.

    And I just feel there is a bunch of smoke and mirrors trying to sell us i9s in wafer-thin laptops, eGPUs because we have wafer-thin laptops, thermal throttling, Thunderbolt 3, Titan Vs, fast RAM etc without a whole lot of evidence that it does a lot of good - and if it does, what?

    And really I don't think everything should be set up in a way that you need a degree in particle physics to work out what you should buy.

    My understanding is that you're right, in that it doesn't really matter whether you have a disk speed of 2x your bitrate or 10x your bitrate.  However, I have a thought.

    Fast drives tend to be SSDs instead of spinning disks, and one thing about spinning disks is their poor latency, which does have an impact on performance.  So, if the drives in question are different technologies, then perhaps it's not the straight read speed that matters but other differences?
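    The Mb/s vs MB/s confusion is worth making explicit - disk speeds are usually quoted in megabytes per second while camera bitrates are in megabits per second, a factor of 8 apart. A quick sanity check:

```python
def headroom(disk_mb_per_s: float, stream_mbit_per_s: float) -> float:
    """How many simultaneous copies of a stream the disk could sustain.

    Disk speed is in megabytes/second, stream bitrate in megabits/second,
    so divide the bitrate by 8 to put both in the same unit.
    """
    return disk_mb_per_s / (stream_mbit_per_s / 8)

print(headroom(2000, 100))  # 2000 MB/s SSD vs a 100 Mb/s camera: 160x headroom
print(headroom(2000, 160))  # vs 160 Mb/s RED footage: 100x headroom
```

    Which is why, for playing back a single stream, sequential speed stops mattering long before 2,000MB/s - it's latency and seek behaviour where spinning disks actually fall behind.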

  12. This is a good intro to the structure of proxies and caches in Resolve - it's from v12 but the overall structure is likely to be similar if not exactly the same in 15.

    There's also a google doc of the diagram: https://docs.google.com/drawings/d/1pbBbA4I2q3RZYrOSELPXgfdzp0X6B9OngVimTeSmbuo/edit

    Resolve is a complicated beast.

    I had to watch a few tutorials on manual Online / Offline editing workflows to find one that worked, especially because my proxy files had slightly different filenames (the extensions differed from the originals), but this was one of the ones I watched and it might give an idea of the logic involved.

     

  13. Hi All,

    As someone who shoots 305Mbit/s 4K and edits on a 2016 MBP with Resolve, there's one thing you should all keep in mind about Resolve performance.

    Resolve has several in-built features for caching and rendering, some are manual and some are automatic, and they can be used in any combination you like (ie, you can use all of them at the same time if you choose to).  However, for editing 4K footage on a low powered machine they may simply not be enough, which is why some people use an Online / Offline workflow that they manage manually.

    This manual online / offline workflow is complicated and takes some time to get working and understand, but it works really well.

    I personally transcode my own proxy footage to "ProRes Proxy" at 720p and edit with that - timelines with this footage play flawlessly forwards and backwards at more than 60p with no lag, and editing is a breeze.  This is with effects disabled, of course.
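    For anyone wanting to replicate this kind of proxy generation outside Resolve, ffmpeg can produce 720p ProRes Proxy files. This is a sketch, not a recommendation of exact settings - the filenames are made up, and it relies on ffmpeg's prores_ks encoder, where profile 0 corresponds to ProRes Proxy:

```python
import shutil
import subprocess

src, proxy = "A001_C001.mov", "A001_C001_proxy.mov"  # hypothetical filenames

# prores_ks profile 0 = ProRes Proxy; scale=-2:720 gives a 720-pixel-high
# frame while keeping the aspect ratio (and an even width, as ProRes needs).
cmd = [
    "ffmpeg", "-i", src,
    "-c:v", "prores_ks", "-profile:v", "0",
    "-vf", "scale=-2:720",
    "-c:a", "copy",  # leave the audio untouched
    proxy,
]

if shutil.which("ffmpeg"):  # only run if ffmpeg is actually installed
    subprocess.run(cmd, check=True)
```

    Keeping the proxy's filename derived from the original makes relinking back to the 4K source much less painful.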

    As Resolve is an NLE, a colour correction suite, a professional sound mixing and mastering suite, and now a VFX suite, we have to be clear with our language around these things.

    I can EDIT whatever resolution footage I like because I use proxies.  My proxy workflow doesn't help with mixing lots of audio tracks though.  Nor will it help if you want to do precise colour or VFX work.  You may be able to do simpler VFX work at a lower resolution and then just bump up the resolution when rendering out, but you may not, depending on your specific situation.

    Resolve is great in that you can have a 1080 timeline, you can edit 720p proxy footage on it, viewing it at a range of resolutions while you do so, then you can swap back to the 4K source footage and then render out at whatever resolution you like - all from that same 1080 timeline.  In this way, you can do non-critical things at lower resolutions and get the performance benefits, but not limit the quality of the final output.

    So when someone says "I edit 4K footage in Resolve" the first step is understanding what they are talking about SPECIFICALLY.  When people say that I just automatically change it in my head to say "I use Resolve with 4K footage in some unspecified way for some unspecified purpose" and then go from there.

    I hope this helps.

    K.

  14. Has anyone shot anything really impressive with the A7III yet?

    I did a big comparison of the 'look' of the BMPCC vs the A73 by binging on BMPCC travel films for 2-3 hours and then trying to binge on A7III films, but the only real observation I made was that A7III owners can't edit for sh*t!

    The only A7III videos I've seen that aren't terrible are these..

    Matti Haapoja (he had focus issues - I think from not using the right focus modes)

    Christian Mate Grab:

     

  15. I agree with previous comments about this not being value for money.  This is a new space and early adopters will pay heavily for the privilege - either navigating software limitations and customisations as @Don Kotlos mentioned, or by buying plug-in solutions that aren't great value from a power:dollar perspective.

    For those unfamiliar with the eGPU space, this is a pretty good resource (this link is for Mac, but the site covers everything): https://egpu.io/setup-guide-external-graphics-card-mac/

    9 hours ago, Shirozina said:

    The XPS 9570 still suffers from CPU and GPU throttling when run hard (like my 9560) but as it has Thunderbolt it can use an external GPU. These slim laptops are just not designed to deal with the thermal stress of sustained high CPU and GPU rates but if you can offload this work to an external box it may solve it. I have a 4 disk RAID0 on my 9560 via Thunderbolt which gives me 500 mbps data rates but the Thunderbolt chip side of the laptop gets very hot. If you have a big raid storage box and eGPU + laptop you have to ask yourself why are you not using a small desktop ( micro ATX form factor) PC and a Screen......

    Totally agree.

    Basically everything is more expensive for laptops - just look at the prices for the BM external monitor cards for PCI vs USB....  The 4K PCI card is $199, but I think the Ultrastudio 4K is the cheapest external 4K converter and it's $995!

    I think the only reason not to have a desktop computer for editing is that laptops are portable.  Obviously if you're a pro working from an office (with controlled lighting etc) then this argument doesn't apply, but if you're like me and edit on the move, or someone who doesn't want to run two computers (and manage all the syncing that requires), then a laptop is the only option.  Travel film-makers, YT creators with demanding publishing schedules, etc are in this situation.

    I read that more and more producers and directors want a colourist on set to provide feedback on the 'look' of footage, so flexibility might be worth something.  Or, if you're at the lower-end of the market and using Resolve as your all-in-one (I hear the Media Management features are excellent for ingesting footage) but can only afford one computer then a laptop might also be a compromise that makes sense.

    I'm waiting for the support for multiple eGPUs to take off, and then it won't matter what the computer is because you'll be plugging in 4 or 6 of them and having a real-time render farm.  Resolve should be well suited for this as I hear it's more reliant on GPU than CPU, and if they're partnering with Apple that might give them access to the MacOS bits that might need to change there too.  Plus, the ability to sell multiple eGPUs to each person would be a huge deal.

  16. 1 hour ago, Tone1k said:

    "Lucky to not understand"... Patronising much? I'm in Melbourne, a friend of mine is a product manager at BMD though not the camera department and I've had conversations about this with him, though not directly relating to BMD. I understand perfectly.

    Now it's my turn to not understand.  If you know all of that, then why did you ask this?

    On 7/8/2018 at 11:34 AM, Tone1k said:

    While you say that BM have asked most of the questions asked here already in the development stages, I'm a little more interested in the product testing stages pre-release.

    And then why did you criticise when my answer was clearly stated as being generic about testing?

    It sounds like you're the one with the contacts etc, and we should be asking you!

  17. 1 hour ago, Tone1k said:

    If you read my original question, I said I understand devices like these can have bugs that need to be ironed out. What I don't understand is how a camera can be released with major imaging problems that show up under normal shooting conditions. It essentially fails at its main task. 

    OK, let me have another go.

    A camera can be released with what you determine to be major imaging problems because... it wasn't you making the decision, and they care about different things than you do.

    I'd suggest that you're very lucky to not understand.  It's not uncommon for decisions in large corporations to be made straight after the boss says something like "either the product is going to be out on the street tomorrow morning or you are!"

  18. 3 hours ago, Tone1k said:

    Thanks for the reply. While I understand that any piece of equipment can have many small bugs, major image issues for a device whose job it is to capture images should not be allowed to make their way into production.

    LOL..  I guess you know better than all the people who actually do this for a living.  I look forward to you releasing your perfect camera!!

  19. 20 minutes ago, Savannah Miller said:

    Notice how in one quote the guy said, "We truly have tested every digital platform out there, and C500 is the best we've ever seen."  Now either that's out of context, or they're paying him to say that because what is making him say this with other cameras like RED Dragon, Arri Alexa, F65 etc. also available?

    Yeah, context is king.  Every tool has pros and cons depending on the situation.

    That's one of the biggest challenges in these forums, we're all shooting different stuff in different situations but we don't reliably communicate what our unique needs are.

  20. On 7/8/2018 at 11:34 AM, Tone1k said:

    Hi JB, 

    Thanks for your input here on this forum. 

    While you say that BM have asked most of the questions asked here already in the development stages, I'm a little more interested in the product testing stages pre-release.

    While the Ursa Mini line has had far fewer image quality issues than 1st-gen BMD cameras like the Production and Production 4K, image quality issues like sunspots in highlights, high levels of flicker, and FPN (in the Ursa Mini 4K) seem like obvious issues that should be picked up before release of the camera. Why do issues like this get through if product testing occurs prior to release? Surely a test of the original Production camera would have included shooting a frame with a light source in it, and the sunspot would show up?

    While I trust that BM have learned from their past mistakes, and I know that a working version of the Pocket 4K has been doing the rounds with viewings in retailers here in Australia to hopefully get feedback pre release, do cameras go through more thorough testing now compared to a few years ago or can we expect the Pocket 4K, with a new sensor (to BMD) to have image quality issues on release and then BMD address them afterwards? 

    Cheers. 

    I'm definitely not JB, but I can talk about testing, having been involved in software development and testing in my day job.

    The short version of why things are released with bugs is this:

    • A company figures out that they can build a camera with X features in Y time, and they think that it will fill a niche and make money
    • They start to develop it, and due to how dependencies in projects work, development takes longer than anticipated
    • The company knows that releasing a product late is a huge mistake, especially in a rapidly developing market, but they also know that releasing a product that is flawed is also a bad idea
    • The company goes into TESTING, where people are using the camera, noting down issues, annoyances, and product features in a big database
    • Everything in the database is ranked (according to importance) and then allocated to a tech to fix
    • Once an issue is fixed it is then sent back to the person who found it to test it again
    • It is common for a change to fix something but break something else, and it's also common for a problem to be caused by two things (eg, hardware and software, or two different software modules) not being completely aligned.  Communication needs to occur - discussions to understand what is happening, what needs to be done, the implications, etc.
    • At some point (normally the publicised release date) a huge meeting is held and all the remaining items to be fixed are reviewed by management and the decision to release it anyway is made.  It is very very very rare for something to miss the delivery deadline because of the number of issues.
    • The process of identifying, tracking, fixing, testing, continues during the lifetime of the product (and is why there are firmware updates to a product)

    In reality there will be thousands, maybe tens of thousands of items involved in a process like this.  Nothing is ever perfect.  It is not possible to test every function with every combination of data. 
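    The ranking-and-release steps above can be sketched in a few lines. Everything here is illustrative - toy data and made-up field names, not any real bug tracker's API:

```python
# Toy model of the triage described above: every issue gets a severity
# (1 = showstopper), and at the release date management ships anyway
# unless an unfixed showstopper remains.
issues = [
    {"id": 101, "title": "menu typo",            "severity": 4, "fixed": True},
    {"id": 102, "title": "FPN in shadows",       "severity": 2, "fixed": False},
    {"id": 103, "title": "camera fails to boot", "severity": 1, "fixed": True},
]

# Work the remaining list from most to least important.
queue = sorted((i for i in issues if not i["fixed"]), key=lambda i: i["severity"])

# At the release meeting: ship unless an unfixed severity-1 issue remains.
blockers = [i for i in queue if i["severity"] == 1]
decision = "slip" if blockers else "ship with known issues"
print(decision, [i["title"] for i in queue])
```

    Note that the "ship with known issues" branch is the common one - the leftover queue simply becomes the first firmware update.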

    Here's a quote from an article about developing the software for the space shuttle:  [Edit: here's the link to the below]

    Quote

    Because of the nature of the software as it is delivered, the verification team concentrates on proving that it meets the customer's requirements and that it functions at an acceptable level of performance. Consistent with the concept that the software is assumed untested, the verification group can go into as much detail as time and cost allow. Primarily, the test group concentrates on single software loads, such as ascent, on-orbit, and so forth. To facilitate this, it is divided into teams that specialize in the operating system and detail, or functional verification; teams that work on guidance, navigation, and control; and teams that certify system performance. These groups have access to the software in the SPF, which thus doubles as a site for both development and testing. Using tools available in the SPF, the verification teams can use the real flight computers for their tests (the preferred method). The testers can freeze the execution of software on those machines in order to check intermediate results, alter memory, and even get a log of what commands resulted in response to what inputs.

    After the verification group has passed the software, it is given an official Configuration Inspection and turned over to NASA. At that point NASA assumes configuration control, and any changes must be approved through Agency channels. Even though NASA then has the software, IBM is not finished with it.

    The software is usually installed in the SAIL for prelaunch, ascent, and abort simulations, the Flight Simulation Lab (FSL) in Downey for orbit, de-orbit, and entry simulations, and the SMS for crew training. Although these installations are not part of the preplanned verification process, the discrepancies noted by the users of the software in the roughly 6 months before launch help complete the testing in a real environment. Due to the nature of real-time computer systems, however, the software can never be fully certified, and both IBM and NASA are aware of this. There are simply too many interfaces and too many opportunities for asynchronous input and output.

    I highlighted the relevant passages in bold.

    Obviously, NASA has more at stake with software problems than a consumer electronics company, and even then, they can't possibly test everything.

    There is a typical divide in culture in an organisation around Risk.  

    • IT and engineering professionals are normally trained in a culture of excellence where, thanks to advanced mathematical training, there is an underlying and often unconscious mindset that there is one answer to a question, and therefore one solution to a problem, with everything else being sub-optimal.  These teams are often incentivised by KPIs and bonuses around system reliability.
    • Sales, marketing, and product managers operate in "the market", which is complicated, messy, and basically a shit-fight.  They know that sales (and therefore profit) are related to perception more than facts, and that every day a product's shipping date is delayed is lost sales.  They know that nothing will ever be 'perfect' and are fully ready to 'explain away' any shortcomings of the product once it's in the market, but they can't do a single thing or sell a single unit until it is actually released.  These people have KPIs and often have large percentages of their income based on sales bonuses.  They care about quality, but only as it impacts sales.
    • Often, Sales, marketing, and product managers think that IT and engineering professionals are ivory tower elitists who will 'gold plate' everything until the company goes bankrupt and products have to be ripped from their hands in order for the company to ever be finished and for anyone to ever get paid.
    • Often, IT and engineering professionals think that Sales, marketing, and product managers are reckless, dodgy cowboys who take no pride in quality, have no qualms about shafting the consumer with fast-talk, and have no integrity, and that they need to prevent products from being released too soon, otherwise their lack of quality will immediately sink the company and no-one will get paid ever again.

    I hope this illuminates why products ship with bugs.  It's a fundamental issue, and the final result is always a compromise.

  21. 22 hours ago, BTM_Pix said:

    Ah, I see what you were getting at now.

    It's the same deal in essence with a Thunderbolt controller and graphics card but, yes, they are getting the smaller footprint from using the MXM module.

    You could argue that conceptually they've just chopped a smaller desktop case in half that has an external power supply ;) 

    The smaller footprint is useful if you are going to be moving about but without access to AC power it becomes a bit moot I suppose if you are looking at truly mobile editing. 

    Indeed - conceptually they're the same!

    I guess I see two major elements to the picture, one is size, and the second is the scalable architecture I mentioned before.  With those two they can start putting two, three, four, six, twelve, and then onwards to dozens of them in the same box.

    I remember programming a computer that had 2048 processors back at university, and geez, if you knew how to program it right it sure could fly!  I didn't go far enough into parallel computing to get a glimpse of how you write a parallel algorithm for an unknown number of processors, but they must have worked it out by now with these multi-core machines we all have.
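    They have, more or less - most languages now ship a worker-pool abstraction that sizes itself to whatever core count it finds at runtime, so the same code runs unchanged on 2 cores or 2048. A minimal sketch in Python (the "render_frame" work is a stand-in, not real frame processing):

```python
from multiprocessing import Pool, cpu_count

def render_frame(n: int) -> int:
    # Stand-in for real per-frame work (decode, grade, encode...).
    return n * n

if __name__ == "__main__":
    # The pool sizes itself to the machine's core count, so the program
    # never needs to know in advance how many processors it will get.
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(render_frame, range(8))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

    The catch, of course, is that this only helps when the work splits into independent chunks - which video frames largely do, which is why NLEs lean so heavily on it.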
