About elgabogomez
  1. Look, 2K for exhibition is OK, I’m not arguing that. But if your point is that a movie from 2018 being distributed in 2K is reason enough for 4K to be “not relevant”, then I would choose a film “made” in 2K, not one shot on film, which is basically resolution independent. Depending on the condition of the negative or the exhibition print, it can display anywhere from 480p- to 8K-comparable detail. And if well-shot, properly processed, carefully conserved originals can be obtained, there is a chance for a future “re-master” in other resolutions and formats. From IMDb’s limited info there is no way to know whether the telecine conversion was done in 2K, 4K or 6K, or in 10-bit log or 16-bit floating point; for that you would need to contact the editor, the lab or the DP. If you can get Spike Lee’s workflow of 35mm and 16mm converted to digital, then you are a very lucky guy.
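To get a feel for why the scan resolution and bit depth mentioned above matter, here is a rough back-of-the-envelope sketch. The resolutions (2K = 2048×1556, 4K = 4096×3112 full aperture) are assumed typical scan sizes, and real DPX/EXR files add packing overhead that this ignores:

```python
# Rough per-frame film-scan sizes (assumptions: 3 RGB channels,
# full-aperture scan resolutions, no container/packing overhead).

def scan_frame_bytes(width, height, bits_per_channel, channels=3):
    """Uncompressed bytes needed for one scanned film frame."""
    return width * height * channels * bits_per_channel // 8

two_k_10bit  = scan_frame_bytes(2048, 1556, 10)   # 2K, 10-bit log
four_k_16bit = scan_frame_bytes(4096, 3112, 16)   # 4K, 16-bit float

print(two_k_10bit)    # 11950080 bytes, ~11.4 MiB per frame
print(four_k_16bit)   # 76480512 bytes, ~72.9 MiB per frame
```

At 24 fps, the gap between those two choices is the difference between roughly 16 GB and 105 GB per minute of footage, which is why the scan format decision usually sits with the lab and the post workflow rather than being listed on IMDb.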
  2. BlacKkKlansman is not a digitally captured film.
  3. The latest GoPro, sure, you can find ways to use it after the trip. If stills are a concern, let your kid do the video work and convince the wife to keep using the a5100.
  4. If you are ok with a green tint on your toast, I can loan you my Sony a5100. Great affordable toaster.
  5. I haven’t seen a Blackmagic Production Camera 4K for $700.00 anywhere. If anyone knows of that kind of deal, please PM me.
  6. If you ask “regular” people, they prefer hyper-realistic paintings... but that doesn’t negate the power of other types of painting. Abstract, surreal, impressionist... Are those all just for the painters?
  7. None, unfortunately; the only DSLR-like cameras with an anamorphic mode are the Panasonic GH series (GH4, GH5, GH5S). You can crop the X-T3’s 16:9 image to 4:3 in post, though. Yes, you lose detail and information, but at 4K it’s not that bad. The “open gate” modes of the GH5 and GH5S give you almost the same field of view as the X-T3’s 4:3 “crop in post”, with more information and detail. You need to go to bigger cinema cameras to get anamorphic modes, the cheapest APS-C option being the Ursa Mini 4.6K, then the Kinefinity Mavo, and then up in price to a second-hand Alexa or Red Epic.
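The crop math behind that “crop in post” suggestion can be sketched quickly. This assumes the X-T3 records UHD 3840×2160 and that the GH5’s anamorphic open-gate mode is 4992×3744 (both taken as assumptions here, not checked against the manuals):

```python
# Crop a 16:9 UHD frame to 4:3 for anamorphic framing by trimming
# the sides and keeping the full sensor height.

def crop_to_4_3(width, height):
    """Return the (width, height) of a centered 4:3 crop."""
    new_width = height * 4 // 3
    return new_width, height

w, h = crop_to_4_3(3840, 2160)
print(w, h)            # 2880 2160
print(w * h)           # 6220800 -> ~6.2 MP left after the crop

gh5_open_gate = 4992 * 3744
print(gh5_open_gate)   # 18690048 -> ~18.7 MP
```

The point the post makes falls out of the numbers: the cropped X-T3 frame keeps about a third of the pixels that an open-gate 4:3 recording would, even though the framing ends up similar.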
  8. Henry, the problem is the graphics card in your laptop; Resolve 14 requires 4GB of video RAM for 4K. On a desktop you could get away with a 2GB card half of the time, or with a couple of nodes, or with a 2K-max timeline, but the other half of the time Resolve doesn’t have enough VRAM. If I remember correctly, the GeForce 840M tops out at 1GB of RAM.
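A rough sketch of why 4K grading eats VRAM so fast, under the assumptions that Resolve processes internally in 32-bit float, keeps RGBA buffers, and holds roughly one buffer per active node (the per-node buffering is an assumption for illustration, not Resolve’s documented behavior):

```python
# Back-of-the-envelope VRAM math for a 4K grading timeline.

def frame_bytes(width, height, channels=4, bytes_per_channel=4):
    """Bytes for one frame at 32-bit float RGBA."""
    return width * height * channels * bytes_per_channel

uhd = frame_bytes(3840, 2160)
print(uhd / 2**20)         # 126.5625 -> one UHD float frame is ~127 MiB

# A source buffer, an output buffer and a handful of node buffers
# (eight total here) already come close to exhausting a 1 GB card:
print(8 * uhd / 2**30)     # ~0.99 GiB
```

That is consistent with the experience described in the post: a 1GB card can limp through a 2K timeline or a grade with very few nodes, but a UHD timeline with a normal node tree simply doesn’t fit.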
  9. I would have guessed that it covered a mid point between s35 and ff...
  10. Andrew, did you ever try your Lomo on a Sony FF? How much of the sensor is usable?
  11. OK, fair enough. That’s counting EF lenses as “native” for the R system, I suppose. But at this point they still need to be tested as being as good as on a mirrored Canon to be counted as such. It’s still too early to say they work better adapted to the R than to the Sony A, which has been able to adapt them for years. I don’t want to defend the system I have, but it can be argued that right now it’s a far more adaptable system for every set of lenses, from C-mount to medium format (which you have done in the past), and that includes everything from FD to EF to every Nikon lens you could get in photo stores for very little money. When it comes to video/cine lens sets, the only set “matched” in stops, look and color that Canon has is the cine line, which can also be adapted to any mirrorless system, and it is neither autofocus nor inexpensive.
  12. What would you call a complete set of lenses? For video/cine? Or for photography? I normally agree with your comments, but this one needs more context, because the new Canon and Nikon mirrorless systems cannot be called better at this point, so I really don’t know what you are trying to say.
  13. I guess you can call me a Sony fanboy... and I want the X-T3! Battery is my one concern.
  14. Kye, I hope such tests get done; gimbals are relatively new and can do great things. I’m just saying that the effort required to do them is bigger and more expensive than just getting all the gear together and building a machine that can test them. I’ve personally only used a Ronin, as a camera assistant on one feature and as a camera operator on a short. I’ve played with a friend’s Zhiyun Crane for a few hours and found it easier to set up. But the feature was with an Alexa Mini and the short with a C100... the Zhiyun flying my a6300 was obviously a much easier task. For many things I prefer to use a Glidecam, but that’s a biased opinion because I’ve used them for years. So as a user I strongly believe that the end result of a project has more to do with the user (or team of users) than with the tool.
  15. It can certainly be done, but look at the cinema5d tests for dynamic range... there are several ways to compromise results. Not convinced by that example? Look at the tests made by the EBU to determine whether a camera can be used for broadcast: as soon as you add extra tests as technology allows, how can your new results compare to your old ones?... Or look at the anamorphic test by ShareGrid: how many people do you need involved to get comparable and quantifiable results? That’s what the challenge is, and now consider doing retests because the motors were acting strange on the day you rented the Gemini or the Alexa Mini...