Everything posted by tupp

  1. I never tested it. I merely have a little knowledge of how high-end down-conversions (and up-conversions) have worked since the early days of DV. Plus, the math and theory are straightforward and "dumb-simple." I think we basically agree here. As I have maintained throughout this thread, summing in a down-conversion is merely swapping resolution for bit depth -- not increasing color depth. One can sacrifice resolution for greater bit depth, but one can never increase the color depth of a digital image (not without introducing something artificial). However, I am not sure whether the "color accuracy" can be increased during a down-conversion. A lot depends on what is depicted by those four pixels. One of those four summed pixels might be 199, another 202, and the other two 200 -- hence, 801. Obviously, smoother surfaces/areas (such as the example you gave) can cause minute "color accuracy" discrepancies. The main instance in which such minute discrepancies become apparent is banding in smooth areas. Banding has been discussed in this thread, and, nevertheless, the color depth of the original image is maintained in a down-conversion when the pixels are summed -- even with banding. No, there is definitely a benefit in increasing the bit depth during a down-conversion. It is important to understand that there is a world of difference between color depth and what you call "color accuracy." If you do not sum and also increase the bit depth in a down-conversion, you throw away valuable color depth -- even if the "color accuracy" remains "8-bit" on some smooth sections of the image. Such a sacrifice in color depth will be apparent in the more complex, "cluttered" sections of the image that have zillions of complex transitions between color tones. 
If you don't sum the pixels and don't increase the bit depth, you may or may not have occasional banding (8-bit accuracy), but you will certainly have reduced color depth (apparent in the more complex areas of the image). If you do sum and do increase the bit depth, you likewise may or may not have occasional banding, but you will nonetheless have maintained the color depth of the original image (no reduced color depth). The increased bit depth is not artificial -- it is merely sacrificing resolution for bit depth to maintain color depth. Most people don't realize that resolution is a major factor in color depth, and that fact is usually the misunderstood point in the aforementioned down-conversions. In fact, you could have a system with a bit depth of "1," and, with enough resolution, have the same degree of color depth as a 12-bit, 444 system. Actually, there exist countless images that have absolutely no bit depth, yet every one of those images has color depth equal to or greater than that of 12-bit, 444 images. By the way, the mathematical relationship between color depth, resolution and bit depth is very simple in digital RGB imaging systems: COLOR DEPTH = (RESOLUTION x BIT DEPTH)³
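The summing described above can be sketched in a few lines. This is a hypothetical illustration (the 4x4 values below are made up, echoing the 199/202/200/200 example from the post), not any particular program's actual down-scaler:

```python
# Hypothetical 4x4 patch of one 8-bit color channel (values are made up).
src = [
    [199, 202, 200, 200],
    [200, 200, 201, 199],
    [198, 201, 200, 202],
    [200, 199, 200, 200],
]

def sum_downscale(plane):
    """Sum each 2x2 block of pixels. Four 8-bit values (0..255 each)
    sum to 0..1020, which requires 10 bits to store -- resolution is
    traded for bit depth, and no pixel information is discarded."""
    h, w = len(plane), len(plane[0])
    return [[plane[y][x] + plane[y][x + 1] +
             plane[y + 1][x] + plane[y + 1][x + 1]
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

out = sum_downscale(src)
print(out)  # [[801, 800], [798, 802]] -- the top-left block is the post's "801"
```

Note that the 10-bit sums are exact; averaging to stay at 8 bits would divide each sum by 4 and round, discarding the two low-order bits.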
  2. I'll have to take your word that those programs average (without rounding) when they downscale. However, summing is more accurate, and increased bit depth is implicit with summing. That's fine, but more importantly, does Resolve yield greater bit depth in the final down-scaled file, without rounding the average?
  3. A down-scale from UHD to full HD is a 4:1 ratio. So, regardless of the resolution per color channel, about 3/4 of the information per color channel would be thrown away when down-scaling from UHD to full HD without summing or averaging the adjacent pixels in the original digital image (and without increasing the bit depth of the final image -- assuming the final chroma sub-sampling is identical to the original).
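The "simple" down-scale described above (no summing, no averaging) amounts to keeping one pixel per 2x2 block. A minimal sketch, using a made-up 4x4 patch of one color channel:

```python
# Hypothetical 4x4 patch of one 8-bit color channel (values are made up).
src = [
    [199, 202, 200, 200],
    [200, 200, 201, 199],
    [198, 201, 200, 202],
    [200, 199, 200, 200],
]

def naive_downscale(plane):
    """Keep every other pixel in each direction, discarding the other
    three pixels of each 2x2 block -- i.e. 3/4 of the samples."""
    return [row[0::2] for row in plane[0::2]]

out = naive_downscale(src)
print(out)  # [[199, 200], [198, 200]] -- 12 of the 16 original samples are gone
```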
  4. I did not make any claims regarding the scaling method used by any programs. By "simple down-scaling," I mean reducing the resolution of the image without summing or averaging the adjacent pixels in the original image and without increasing the bit depth in the final image. Such a simple conversion throws away the information of the unused pixels from the original image. It is irrelevant that Resolve generally operates at 32-bit depth with an RGBY color space if it doesn't increase the bit depth when down-scaling. Again, to down-scale a digital image and retain all of the color depth of the original image, the adjacent pixels in the original image must somehow be summed or averaged, and the bit depth must be increased in the final image. If your program/transcoder is not doing both of those things, then you are losing color depth information.
  5. Works on most any digital imaging file. All it does is retain the color depth of the original, higher-res file by swapping resolution for bit depth. By the way, contrary to some of the more recent comments, the method discussed in this thread is very different from simply scaling down an image. A simple down-scaling throws away information and does not retain the original color depth of the image.
  6. Then perhaps it would be best for you to stop being dramatic. All I have done is link references, state fact, ask questions and suggest folks be careful when dealing with Lenovo machines.
  7. You are incorrect on both "stands." No exaggeration on my part -- I simply linked articles that report facts, and I also linked a press release directly from Lenovo and a warning from the US government. Furthermore, I have never been a Lenovo customer nor a product owner. I have no clue what you are trying to say. Please just say what you mean. What? What does my age have to do with the fact that Lenovo has repeatedly snuck persistent malware into the BIOSes of its machines, even after lying about it multiple times? I am no expert on laptops that are good for editing, and the power of such machines is constantly progressing. However, with a brief web search it shouldn't be too difficult to find a comparison article on current units that fit the bill. No doubt, most of the non-Lenovos won't have insidious malware in the BIOS. Again, no clue as to what you mean here. Please say what you mean. Good for you! You've been warned about Lenovo machines with links citing facts about their practices. You're on your own now. This Lenovo fiasco is rather recent. Unless you are less than eight years old, it wasn't "pre-existent" for you. Same to you, bud. Perhaps you might eventually experience the real-life consequences of not heeding multiple security warnings.
  8. In light of Lenovo's practices, avoiding its products is motivated more by wisdom than fear. Perhaps a more accurate and comprehensible analogy would be that of the choice between going down a dark alley or a well lit street. Would you choose to traverse a dark alley in which you know creeps lurk at night, or would you choose the bright street in which you can see everything? Likewise, do you choose a laptop manufacturer who keeps sneaking creepy, tenacious malware/crapware deep into the BIOS and who repeatedly lies about it, or do you choose an honest manufacturer who is just trying to create a good product?
  9. In addition to the items you already mentioned (sound, power distro, etc.), take wide-angle photos/videos from each corner of any shooting room. Scout/plan staging areas for equipment, hair/makeup, craft services, wardrobe, actor privacy/dressing rooms, etc. Take light meter readings, and bring a compass to determine which direction windows/doors/openings face. Gauge the cooperativeness of the property owner/caretaker. If you plan on sending light through a window, note the height of the window from ground level outside.
  10. Looks like Lenovo was messing with the Thinkpads after all. What was that again regarding "bitches" and "misinformation?" People, please be careful when choosing Lenovo machines.
  11. Well, that NHK rack was one of the first HEVC encoders in the world, and it was encoding 8K as well, without FPGAs. So, of course, it's not going to be efficient and miniaturized. I have no idea if NHK has continued developing their system to make it smaller, but the size of a first attempt is not the point. I don't know... someone who really wanted to shoot 8K in 2013 (who had some funding) would probably not be hindered by the size of that first NHK encoder. Keep in mind, when the 4K Dalsa Origin first appeared in 2003, it had a huge body that was tethered to a desktop computer and a RAID array. Likewise, the Quadruplex SD video recorders that were widely used in the late 1950s and early 1960s were humongous, yet TV productions still managed to shoot. Me, too, but currently I am not very keen on dealing with anything past 4K.
  12. Very interesting! Great find! Well, back in July of 2013, NHK was compressing streams from their 8K Hi-Vision camera in real time down to 85 Mbps, 8K, HEVC/H.265, so they probably used a single recorder for their 8K demos. I would imagine that there are other "one-off" methods of recording 8K in existence, but we don't know about them yet.
  13. While I am solidly in the greater DR and bit depth camp (I'd rather shoot with an Alexa or a Sony F35 than a Red Epic or Canon's 8K), it is important to play the "devil's advocate" at this juncture in the thread and remind all that color depth actually consists of part bit depth and part resolution. So, the higher the resolution, the greater the color depth. In fact, you could have a system with a bit depth of "1" and, with enough resolution (and ignoring banding in capture), have the same degree of color depth as a 12-bit, 444 system. The formula for color depth in digital RGB imaging systems is: Color Depth = (Bit Depth x Resolution)³
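The 1-bit claim above follows the halftone intuition: treat an n x n block of 1-bit pixels as one output tone. A rough sketch of that arithmetic (an illustration of the post's claim, not a proof, and it ignores the resolution cost):

```python
import math

def levels_per_block(n):
    """An n x n block of 1-bit pixels can have anywhere from 0 to n*n
    of its pixels turned on, giving n*n + 1 distinguishable tone
    levels per block."""
    return n * n + 1

# A 64x64 block of 1-bit pixels yields 4097 levels -- just over the
# 4096 levels (2**12) of a single 12-bit sample.
levels = levels_per_block(64)
bits = math.log2(levels)
print(levels)  # 4097
```

So, under this simplification, a 1-bit system with 64x64 times the resolution matches the per-channel tonal range of a 12-bit system, which is the resolution-for-bit-depth trade-off the formula expresses.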
  14. Certainly, the scope of the convergence of form factor and hybrid concepts will synergistically target future-proof users and compellingly empower technically sound bandwidth to optimize superior functionalized systems and streamline value-added catalysts for heightened verticals.
  15. Just use the Forza 18K camera and you will get plenty of "convergence" -- your heatsinks will "converge" with your CPUs/GPUs when they all liquify as you try to edit and color grade.
  16. How do you come to that conclusion? Did Lenovo say that? Keep in mind, there has been more than one instance. I never stated that it was a problem specifically with "Thinkpads." Again, there has been more than one instance, so I wouldn't put it past them to try to sneak it into some Thinkpads. Most of the reports are calling it malware -- it's in the BIOS, it reincarnates itself after you think it's deleted, and it phones home. You can call it whatever you like. Is that why Lenovo issued this statement back in February when they got caught, and why they were yet again caught hiding spyware in the BIOS just last month? Furthermore, is that why they seem to be doing it once again in their smartphones? Even the Department of Homeland Security is posting warnings on Lenovo machines. Well, I've done my part in making people aware of the risk. Not sure where you think that I misinformed anyone, nor why you would try to dismiss the threat.
  17. Perhaps we don't understand the ramifications of having manufacturer-installed malware in our BIOS. Here's a hint -- it's not good.
  18. Be careful using Lenovo machines -- they put s#!t in their firmware. No "and" needed.
  19. I have never been impressed with Apple hardware (nor software), and I am all for hackintosh projects. However, exercise caution when using Lenovo products -- the company has a history of installing spyware in the firmware and OS of their machines.
  20. Nope... unfortunately, the world doesn't work that "simply." Sometimes the best things succeed, but a lot of the time they don't (especially in this current age of mediocrity). That's why Oracle is so prevalent in spite of mass dissatisfaction with its products... that's why lobbyists influence laws and government projects (in the USA)... that's why we have to listen to Miley Cyrus, Justin Bieber and Kanye West, instead of artists as talented as the Beatles or Burt Bacharach. Nevertheless, Chrome is free and it has more users than its proprietary counterparts. Since you mentioned a web browser, how about Firefox? Off the top of my head, there's also Android, Thunderbird, Wordpress, Audacity, VLC, Handbrake and Blender, etc. Of course, there is a bunch of open source software that dominates network and web installations, such as Apache, MySQL, SSL, Drupal and PHP, etc., not to mention most of the prominent programming languages. No need to apologize. Everyone is entitled to their opinion.
  21. One would think so, but they probably have never heard of it, and, again, FUD (such as this very point that I'm countering). Actually, it has happened. There are countless examples of businesses who have dropped MS Office for Open/Libre Office and have saved a bundle, without suffering any loss of productivity. Heck, there are entire governments who have switched to open source software. Once more, we are dealing with FUD and user conditioning. People resist change, even if the alternative is better. The FUD makes it much worse. Again, I ask you, please give specific examples of how Microsoft Office is better than Libre Office. In the first place, GIMP isn't the only open source image editor. Secondly, yes, that too has happened. Again, I ask you, what features, specifically, in Photoshop are superior to its open source counterparts? Also, FUD and user conditioning. No, but it has a lot to do with folks' resistance. Can you share a double-blind study to the contrary (that is not sponsored by Microsoft)? Not sure what a science background has to do with GUI design, except maybe it helps when field testing. Everyone is entitled to their opinion, which is all you are putting forward. In my opinion, Windows is clunky, unprofessional, quirky and full of crapware/bloatware. I would also like to point out that whatever desktop environment you were using on Ubuntu, it is only one of zillions that are available on Linux/Unix systems. Any design elements specific to Ubuntu probably went through Canonical -- the corporation that started and maintains the Ubuntu distro. Canonical is owned by billionaire Mark Shuttleworth. If you want a UI that is not clunky, I suggest you go with one of the many tiling open source window managers. Power users with tiling window managers invariably run circles around Windows, Mac and Linux desktop users. No doubt. No, it doesn't. What?!! Wow! Perhaps one of us has an agenda, after all. That's fine. 
That explains a lot about the notions you put forth. Nevertheless, there are plenty of software developers and content creators who use open source software and free content. I don't think that it is "important" to pay for content. There are a lot of ways that content can sustain itself and make a profit. Again, we are getting philosophical and departing from the topic of this thread, which is open source software for production. I propose that they make money exactly as they are currently doing it. What's the problem? That is certainly the simple answer.
  22. Saying it doesn't make it so. Sorry, but I gotta call BS. There is plenty of open source software which exceeds proprietary software in commerce -- open source is usually more concise, more efficient and more innovative. Furthermore, there are countless professionals working full time on tons of open source software. Let's consider one of the most proprietary software providers in "commerce" -- Oracle. Ask users of Oracle software if they would rather use open source alternatives, and see what kind of response you get. Productivity? Please name features (exactly) of proprietary business software that are superior to those in open source. You mean powerful features in Photoshop, such as Content-Aware Fill, 32-bit editing and raw capability? All of those fundamental features were available in GIMP years before they appeared in Photoshop... YEARS BEFORE. I concede that the proprietary outfits got a head start in NLEs, but open source will catch up fairly soon. Furthermore, Linux proprietary NLEs, compositors (and other production software) have dominated in the past -- Piranha, Maya and Ant (the first RED/4K-optimized NLE) come to mind. Also, I wouldn't classify Resolve as an NLE. Please. Audacity is not as robust as Pro Tools, but free and open sourced Ardour certainly is. I hear this a lot, but I have yet to find anything that can be done in Microsoft Office that is not possible in Libre Office. I just named a few, and there are plenty more (Firefox, Chrome, Android, Linux distros, BSD projects, etc.). Again, just saying that doesn't make it so. I have given examples in which open source code is superior to and more advanced than proprietary code. There are countless examples in which people are frustrated with proprietary bloatware and all of the crap that goes with it. We who use open source software don't suffer any of those problems. Of course, we are not even touching on security, in which open source software has a huge advantage. If that were only true. 
Unfortunately, that's not the way it works. For one thing, you are neglecting FUD. Again, there are thousands of paid developers who work full time on open source software, plus there are the ones who code purely out of enthusiasm for the product. As earth-shattering as these revelations are, they have no bearing on the quality of software. Sometimes one invests time and money (energy) into a film without the intention of recovering their costs. I do it all the time, because I think that a project is worthwhile or I think that I might get some good footage from the deal. What is your point here, and how does it apply to open source software? Are you seriously implying that users and developers of open source software are stealing? Well, for one thing, those who currently develop open source software would continue creating interesting things. There are plenty of interesting movies and musical projects which are done on spec. In regards to food, water and shelter, that is another matter which is probably better discussed on some philosophy site. We are talking about open source and Linux production software here. Open source software is free.
  23. Not really. Yes. This is the typical FUD scenario -- early adopter of Red Hat, then got disinterested. I've never used Red Hat. That's fine. I would rather have open source and free software. Disagree wholeheartedly. With open source and free software, I can do almost anything that can be done with proprietary software. Furthermore, open source software often can do more than proprietary software, as a lot of the innovation occurs in open-source code. I would rather use software from a coder who is enthusiastic than from one who is merely drawing a paycheck.
  24. A Linux beta of Resolve 12 is available as a free download.
  25. Wow! Great to see more from the Forbes 70 after nothing for almost two years! The 40mm footage says it all. The shot of the girl and the two dogs under the tree made with the 180mm and the teleconverter is great, too. @Inazuma & @TheRenaissanceMan: As I understand it, the Forbes 70 is essentially a fancy, medium-format DOF adapter, employing a BMPCC to capture the images. Apparently, significant ND was used in these shots, possibly without an IR blocking filter. @richg101: No problem with it being HD (decent DR and pro color specs are more important to me). On the other hand, 4K Super-16 sensors are already here, in a few machine vision cameras and in both the Blackmagic Studio Camera and Micro Cinema Camera (both UHD and slightly wider than Super-16). Given the resources/funding to modify the optics behind the focal plane, what would be the disadvantages of using an APS-C/Super-35mm sensor? Also, have you considered configuring the unit so that the lens mounts nearer to the bottom of the front? By the way, in your narrated video, you mention the differences between the Alexa and Forbes 70 frame sizes. Keep in mind that the Alexa 65 has a humongous sensor -- 54.12mm x 25.58mm. However, if you could crank these things out, the Forbes 70 would no doubt get more use than the Alexa 65. To me, this is the most exciting thread since NAB! The Dog Schidt lenses are great, but please don't let the Forbes 70 languish.