About tomekk

  1. Pros will tell you that there are no cheap options for professional-grade HDR editing; the cheapest pro option is probably the EIZO CG3145 at roughly $20-30k USD. However, some HDR TVs are used on a budget, and you can use a LUT for calibration. The LG C8 series OLED 4K or the Samsung QLED 4K/8K (although I am not sure about using a LUT with these) seem to be good. Read through the topics on different forums, for example: http://www.reduser.net/forum/showthread.php?173930-State-of-the-GRADING-monitor-2019 https://liftgammagain.com/forum/index.php?threads/lg-c8-experience.12401/
  2. If it is a dedicated server which you do not own, then read the small print. "Server maintenance" can mean a lot of things. It is highly suspicious that their SLA is 3 hrs to replace anything in a server, and server support could be really basic (almost nothing) for +-50/month. For example, a power-supply or faulty-drive swap is a roughly 30-minute job if you take into account reading the ticket, finding the part, getting to the server, replacing the part, logging it, and dealing with the faulty one, and that is if everything goes smoothly. Add to this the time it takes to notice a faulty component, logging
  3. A dedicated server will be more expensive when you run into problems you cannot solve. I monitor and fix servers for corporations, and you get these problems: hardware failures, O/S problems, or you cannot access your server remotely to fix it (that requires a visit to the DC if you want to do it cheaply; the rate for a remote-hands fix is over 100 GBP/hr, and you have to buy and ship the part to the DC if it is a hardware problem). Check the cost of parts for the dedicated server you are buying. As for O/S troubleshooting when something stops working, well, if you can fix it yourself then that is fine, s
  4. Just an idea: would it not be possible to estimate it by shooting in continuous mode, waiting for the buffer to fill up, and then, once shooting slows down, timing how long it takes to move those files from the buffer to the card? Provided you have a fast enough SD card that is not going to be a bottleneck, you could estimate the speed of the controller. =========== Wikipedia: UHS-II, specified in version 4.0, further raises the data transfer rate to a theoretical maximum of 156 MB/s (full duplex) or 312 MB/s (half duplex) using an additional row of pins[33][34] (a to
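     The estimate described above boils down to one division: data held in the buffer over the time it takes to drain to the card. A minimal sketch, assuming hypothetical numbers (the function name, frame count, frame size, and drain time are all illustrative, not measured):

     ```python
     def estimate_write_speed_mb_s(buffered_frames: int,
                                   frame_size_mb: float,
                                   drain_seconds: float) -> float:
         """Estimate sustained buffer-to-card write speed.

         buffered_frames: frames held in the camera buffer when shooting stalls
         frame_size_mb:   average raw file size in MB
         drain_seconds:   time until the card-access light goes out after you stop
         """
         total_mb = buffered_frames * frame_size_mb
         return total_mb / drain_seconds

     # Hypothetical example: 20 raw frames of 30 MB each, drained in 8 seconds
     speed = estimate_write_speed_mb_s(20, 30.0, 8.0)
     print(f"~{speed:.0f} MB/s sustained")  # ~75 MB/s sustained
     ```

     The card only stops being the bottleneck if its sustained write speed comfortably exceeds this number, which is why a fast UHS-II card is assumed.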
  5. If anything, I think ML will be the only thing keeping me from selling all my Canon gear, sigh.
  6. Could someone point me to how far Canon is behind Sony in terms of investment in image sensors? I have heard the rumour about the $9b investment Sony is making; what about Canon? I cannot find anything on their website or anywhere else about how they are investing in their sensors... are they upgrading their factories at all? Has anyone heard anything? I know about the 120MP sensor, the 200mm x 200mm sensor, and the ultra-low-light sensor, but it all seems like extracting the last potential from their current technology... what comes after that?
  7. tomekk

    I hate big cameras

    Yes, it is. It is even funnier when you realise you could be pulling in 150k USD/year and still be living in a shoebox apartment, as in London... It is not about how much you earn but about what you can buy with it, IMHO.
  8. It is a little bit confusing, but what ML does is not the equivalent of 12-bit vs 14-bit raw photos. Shades are mapped out differently in "true" 12-bit vs 14-bit raw. ML just truncates the bottom bits and does not change values in the higher stops, AFAIK. Anyway, a theory is just a theory; check in real life. If you see a difference in grading, that is great, but it would be nice to see those clips/DNGs too.
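     A toy sketch of the truncation described above (the function name is mine, and real ML raw recording involves more than this): dropping the two least-significant bits turns a 14-bit sample into a 12-bit one without remapping the higher stops, unlike a sensor that natively quantises into 12 bits.

     ```python
     def truncate_14_to_12(v14: int) -> int:
         """Drop the two least-significant bits of a 14-bit sample (0..16383).

         Simple truncation: every value keeps its relative place in the tonal
         range (just divided by 4). Only the bottom two bits, which sit in the
         deepest shadows and are mostly noise, are discarded.
         """
         return v14 >> 2

     # Highlights keep their relative position: 16000/16383 ~= 4000/4095
     print(truncate_14_to_12(16000))  # 4000

     # Deep shadows collapse: codes 0..3 all become 0, 4..7 become 1, etc.
     print([truncate_14_to_12(v) for v in range(8)])  # [0, 0, 0, 0, 1, 1, 1, 1]
     ```

     This is why the higher stops grade the same either way: their values are unchanged apart from the common factor of 4.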
  9. It is a very good visualisation of how much total data difference there is, but if I understand it correctly, ML is just cutting out the darkest blacks, which contain mostly noise, and does not redistribute shades in the higher stops of light. If you can use the noise bits to your advantage, then of course use 14 bits.
  10. @kidzrevil You are right, but they cut out bits that contain mainly noise, where there are no visible shades anyway.
  11. The current consensus is that there is a difference between 10-bit and 12/14-bit, but not so much between 12-bit and 14-bit. The last 2 bits are mainly noise, but I cannot find the exact quote from a1ex (the main ML developer) right now. http://www.eoshd.com/comments/topic/21004-12-or-10-bit-raw-magic-lantern/?page=3#comment-168058 - 10-bit vs 14-bit after lifting shadows. @kidzrevil - I would love to see a difference between 12 and 14 bits. Could you post your findings in this thread by sharing a short clip that shows the difference and, if possible, a few DNGs?
  12. Good points. That's why I'd rather focus on being great today with today's tech. What's the point of having my movies future-proofed if they suck and I'm forgotten tomorrow? (Although, I think, good 4K 60fps raw with good DR is my minimum for future-proofing before VR takes off ;)).
  13. Not everyone wants to make porn "movies" ;).
  14. Well, it seems, we have learned that there is a difference, and it is not a slight one but a noticeable one ;).
  15. hehe, yeah I know. I just thought I would add this small piece of information for the sake of completeness :).