Everything posted by Mokara

  1. Why not? Canon have done that in the past as well, with incremental upgrades that sometimes get out of step. The G20 followed by the G30 about three months later springs to mind. If you bought a G20 when it was released you likely would have been less than happy about that. When these sorts of things happen it is very often because a specific upgrade has been delayed for some reason, but something else has to be released in the interim to remain competitive. Then changes in the competitive landscape necessitate the release of the next product much sooner than would otherwise happen. Sony bodies remain essentially the same; the main upgrade in any cycle is usually the sensor, processor or LSI, sometimes more than one if there has been a particularly long gap between updates. They may be holding off on the processor upgrades until the next FS7, with the FS5 targeted at the sort of market something like the C100 occupies, for instance. For that market the specs are likely fine; the biggest drawback of the original FS5 would have been AF performance versus the C100M2, and the FS5 II is probably an attempt to address that deficiency. If the C100M3 comes out with higher specs, such as internal 4K60p, then you would probably see an FS5 III sooner rather than later, but otherwise it will show up about 2-3 years from now. For higher performance applications you would need to get the next gen FS7, or (potentially) an a7S III for less formal productions.
  2. Those tests compare systems, not processors. They don't control for the effects of the OS and other supporting hardware that may impact overall performance.
  3. It depends. Recent camera releases have had their LSI chip updated, and sometimes an updated sensor depending on how old the earlier sensor was. The first camera with the latest LSI was the RX100 (and one of the RX10 models, IIRC), so it is not necessarily the most expensive lines that get updates first. It all depends on the development cycle and how many cameras of a particular model they sell. The main upgrade in the FS5 II might be the LSI, which means greatly improved AF functionality. At some point they are going to release the next generation processor, and that is when we can expect to see performance increases. Maybe the first camera that sees it will be the a7S III, or it could even be something with a shorter development cycle, such as the RX100. The a7S III can be expected to include a new sensor as well as the latest LSI, and if the next processor is available in time it may well include that too.
  4. Are the specs on that page you linked correct? According to those the max frame rate for 4K is 30 fps, they don't say anything about higher frame rates (in the specs).
  5. Because to do that they would need to hold an x86 architecture license. Currently only Intel, AMD and VIA have licenses, and I doubt Intel have any desire to make that lineup broader outside of niche applications. Apple might be able to make something similar, but it would not be compatible with the PC universe. It is more likely that any move to use their own processors would result in a complete severing of the connection Apple has to the PC world.
  6. Hardware encoding is determined by what the processor inside is capable of, and the DV6 apparently has a more consumer orientated encoder than the DV5, probably because it has been optimised for use in DSLRs/MILCs/compacts. Presumably the DV6 has other advantages over the DV5 when it comes to RAW, and that is probably why they used it in the C200 rather than the DV5.
  7. I would guess that the sensor is returning something like 14 bits of data per pixel, but the processor is only actually using some lesser amount, such as 10 bits. When you increase or decrease the color channels you adjust exactly which 10 bits of the 14 gathered are used. So by moving everything up two notches and then reducing exposure by a corresponding amount, you are effectively increasing ISO without increasing gain. If NR kicks in at a particular gain setting, the net outcome is that for all practical purposes you could get an extra two stops of exposure before that happens using the OP's method. I would guess that you can do the opposite as well, essentially adding an internal ND function to the camera if you had need for that. The downside, however, is that data collected at the top and bottom of those 14 bits is likely to be less accurate, and that may cause other unanticipated issues, such as WB or color cast problems (since in-camera correction of those properties requires some headroom in the color channels).
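    A rough sketch of that bit-window idea, assuming a hypothetical 14-bit sensor sample reduced to a 10-bit output; the shift amount stands in for the exposure offset described above, and the names are purely illustrative:

      # Hypothetical illustration: a 14-bit sensor sample reduced to 10 bits
      # by choosing which 10-bit window of the 14 is kept. Moving the window
      # up or down mimics changing exposure without changing analog gain.
      def pick_window(sample_14bit: int, window_shift: int) -> int:
          """Return a 10-bit value taken from a 14-bit sample.

          window_shift = 4 keeps the top 10 bits (the default mapping);
          smaller shifts keep lower bits, i.e. favour shadow detail.
          """
          assert 0 <= sample_14bit < (1 << 14)
          assert 0 <= window_shift <= 4
          return (sample_14bit >> window_shift) & 0x3FF  # mask to 10 bits

      sample = 0b10110110101101          # arbitrary 14-bit value
      print(pick_window(sample, 4))      # default window (top 10 bits)
      print(pick_window(sample, 2))      # window moved down "two stops"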
  8. It is not as if Apple mobile chips are better than anyone else's, and there is no reason they will be better than Intel's chips on desktops/laptops either. The reason for development stagnation is that the technology is approaching physical limits and software demands are flattening out. For all of the spin that Intel is not moving fast enough, remember that Apple themselves lag behind when it comes to keeping their hardware up to date, with the result that high end PCs typically have better hardware. I would guess that the real reason they are doing this is to make hardware that is not compatible with the PC universe in the belief that software would migrate to their systems. And maybe so that there would no longer be a direct comparison between PCs and Apple computers; that way they could camouflage their slow development cycle.
  9. I think the problem with Canon is that the message they took home from the 5D2 experience was that users wanted smallish ILC video cameras, when in fact what they wanted was hybrids. That is why they went down the wrong path and consequently opened the door for the likes of Panasonic and Sony to exploit, since those companies had a better understanding of where the consumer market was heading. There were lessons to be learned from the 5D2 experience, but unfortunately it was not Canon that learned them. The problem is that mechanically a lot of that older EF lens base is not responsive enough. It is not good for manual focusing on MILC cameras either, since those lenses tend to have very short throw distances that make critical focus awkward. Plus, requiring the glass to be further away from the camera because of the mirrorbox results in lenses being larger than they need to be. A new mount with new lenses will result in smaller, lighter systems. If you move to a MILC format, optimal lens performance would benefit from a mount designed specifically for that, and if you really wanted to use older lenses you could always add an adapter.
  10. If it has a grip like the 5D it will be a non-starter for me, since the grip on that is way too big for my hands. Canon prosumer cameras feel very awkward for me to handle; they are just too big for anything other than a tripod. 5% of primary users perhaps, but the purpose of a hybrid is that you can shoot both stills and video when you want to. Even stills photographers might want to shoot video from time to time as the situation requires, and they probably don't want to carry around a dedicated camera just for that. In fact they likely don't - hence they don't shoot video, because they don't have the tools to do it in the first place. Extrapolating the absence of tools to mean the absence of desire is unwise. It is that flexibility that makes a particular camera attractive in the modern market, and why cameras like the a7 and GH series are taking off.
  11. With the relatively low cost of media these days, why would anyone bother with compressed raw on a camera like this, especially considering the additional artifacts it introduces?
  12. A full frame prosumer camera will not be like the M50. It will almost certainly have the additional electronics that the 1D/5D/7D cameras have, which should enable PDAF and hardware 4K at the same time. The M50 is a basic model with stripped down internals to save on materials costs; FF prosumer MILCs from Canon will not be done that way, and we can expect superior performance. The M50 will be to the new prosumer FF MILCs what the Rebels are to the 5D series in terms of performance.
  13. Because the additional processing being done in the 5D4 is performed by a second processor that is not present in the M50? The Digic 8 processing capabilities are likely not all that different from the Digic 6+; the main difference is the added logic for the 4K hardware encoder. Running the hardware encoder is going to generate a lot of heat, and that will limit what else the processor can do, since it has to remain within a safe thermal envelope or it will fail. If they stuck active cooling onto the processor it probably would be able to do both 4K and PDAF at the same time, but a fan in a small MILC body is not practical. The reason these things are not implemented together is the limitations of the hardware inside the equipment. It has nothing to do with some kind of artificial market segmentation.
  14. NX2 rumors

    Codependency and interdependency are the same thing. What you are calling "codependency" is actually dependency, and redefining terms to suit yourself does not negate someone else's argument just because they used those terms. And by the way, you do need other people to survive; no one is an island. If you doubt that, try discarding everything made by someone else, then walk off into the woods some day and see how long you last without any sort of input from anyone else (and that includes all the tools they have made). No doubt there will be the odd one or two people who would be able to survive solely by their own hand, but almost everyone else would die in short order. There is a TV program called "Naked and Afraid" that tries to look at that as entertainment, and if you have ever watched that show it becomes immediately clear that almost everyone in the modern world, even self-proclaimed "survivalists", would be dead within a month if left solely to their own devices without any assistance whatsoever from other people.
  15. That is what he is saying. 4K DPAF is technically feasible, but not with the hardware choices made for the consumer products. The processor has the capability of doing one or the other function, but not both at the same time, at least not without additional hardware support. There are reasons for doing it this way, first and foremost being the cost and what the target market would be prepared to pay. Deliberately ceding market share to the competition is a stupid strategy, and whatever else I might think of Canon I am pretty sure they are not stupid. They don't have these features in the camera because it would make it too expensive; you are not going to sell many cameras in the entry level market if they cost $2k. Those sorts of people will buy some other camera that costs less instead. However, what it does mean is that if Canon were to produce a prosumer MILC, it most likely WOULD have the added hardware that would make these features practical to implement.
  16. NX2 rumors

    Not true. You can get beyond the inherent sensor dynamic range by stacking images acquired at different gains and/or ND settings to get a final image with a DR much greater than the individual images.
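    A minimal sketch of that stacking idea, assuming two hypothetical linear exposures of the same scene taken a known number of stops apart; the brighter frame supplies cleaner shadows and the darker frame preserves the highlights it clips. The function and values are illustrative, not any camera's actual pipeline:

      import numpy as np

      def stack_exposures(dark, bright, stops_apart, clip_level=0.98):
          """Merge two linear 0..1 exposures into one higher-DR image.

          'bright' was shot stops_apart stops more exposed than 'dark'.
          Where 'bright' is clipped, fall back to the dark frame.
          """
          gain = 2.0 ** stops_apart
          return np.where(bright < clip_level, bright / gain, dark)

      # Toy usage with random data standing in for real frames.
      rng = np.random.default_rng(0)
      dark = rng.uniform(0.0, 1.0, (4, 4))
      bright = np.clip(dark * 4.0, 0.0, 1.0)   # two stops brighter, clips
      print(stack_exposures(dark, bright, stops_apart=2.0))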
  17. That is not what they are saying at all. In order to shoot 4K and use DPAF the 5D4 needs to use MJPEG to reduce processor overhead to the point where it is feasible. This in turn requires very high bit rates that the UHS-I slot would have trouble with, and that most likely is the "technical reason" they are referring to. The 4K shot with the M50 is done with hardware encoding, allowing much lower bit rates to be used. But apparently that doesn't leave enough processor overhead to handle DPAF as well. It is not deliberate crippling, it is just a consequence of the limitations of the processor. At least some of the high end DSLRs include additional dedicated processors for focusing as well; it is possible that the M50 lacks that in order to cut costs.
  18. NX2 rumors

    You can't. The fastest cards available will be able to just cope with 10 bit at most, and that is assuming you only use a 4K crop (in other words, no oversampling).
  19. NX2 rumors

    The difference is due not to the interface itself, but to what is happening internally in those devices. Just because an interface is capable of a particular speed does not mean that the device can generate or receive data at that rate. To use a UHS-II spec label, however, they have to be capable of meeting the 300 MB/s data transfer requirement. In the case of a camera like the NX, for example, there is a bunch of processing going on that limits the availability of the data, and that results in lower speeds when recording natively. However, we know the minimum base internal data rate of the camera, since it does a 6.5K sensor read at 8 bits/30 fps in preparing data for 4K video. That is a bandwidth of approximately 630 MB/s. The camera does processing on that data, however, and the bottleneck for data delivery is the processing itself, not the UHS-II interface. If you found a way to sidestep that processing you should be able to deliver enough data to swamp the interface, since the camera is dealing internally with at least twice the amount specified by UHS-II. If you only used a crop of the sensor, however, you would (in theory) be able to deliver a RAW data feed as I explained earlier.
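    Roughly the arithmetic behind that 630 MB/s figure, assuming a 6480 x 3240 (6.5K-wide, 16:9) read at 8 bits per pixel and 30 fps; the exact read dimensions are an assumption here:

      # Back-of-envelope check of the internal read bandwidth quoted above.
      width, height = 6480, 3240          # assumed 6.5K-wide, 16:9 read
      bits_per_pixel, fps = 8, 30
      bytes_per_second = width * height * bits_per_pixel * fps / 8
      print(f"{bytes_per_second / 1e6:.0f} MB/s")   # ~630 MB/s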
  20. NX2 rumors

    If the camera has the UHS-II spec then it must be capable of that sort of transfer speed by definition. That is the whole point of a spec: if something complies with it, it has to meet it. If there is no processing going on then there should be nothing stopping the camera from achieving that. We know the camera can natively shoot full sensor 12 bit RAW+JPEG at 15 fps at a minimum (it can likely do more than that in practice if it only collects data and nothing else). What it can't do is process the files fast enough to clear the buffer, but presumably if the data is written directly to media without any processing then that should not be an impediment. The camera is capable of outputting a full sensor read at 30 fps with at least 8 bits of depth; it may do more but just not normally use the extra bits due to computational overhead. So the data is there, it is just a question of what form it can be directed to recording media in. If he is recording a crop rather than 6K then it should be almost possible to do what he says he is doing (it would require 350 MB/s). A 12 bit 4K RAW image at 25 fps would require a data transfer rate of 300 MB/s, which would be within spec for the card slot. The bigger question is what cards he would be using; AFAIK the best UHS-II cards have write speeds in the 260-280 MB/s range, and to do that consistently without dropped frames would be asking a lot. More realistic would be 8-10 bit 4K RAW, which would provide enough headroom for fluctuations. 8 bit 4K RAW at 25 fps would only require 200 MB/s for example, while 10 bits would require 250 MB/s.
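    The data rates quoted above work out roughly like this, assuming an uncompressed UHD (3840 x 2160) RAW stream at 25 fps and treating 1 MB as 10^6 bytes:

      # Uncompressed 4K RAW data rates at 25 fps for a few bit depths.
      width, height, fps = 3840, 2160, 25
      for bits in (8, 10, 12):
          mb_per_s = width * height * bits * fps / 8 / 1e6
          print(f"{bits}-bit: {mb_per_s:.0f} MB/s")
      # 8-bit: ~207 MB/s, 10-bit: ~259 MB/s, 12-bit: ~311 MB/s,
      # in line with the ~200/250/300 MB/s figures quoted above.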
  21. Bad aliasing artifacts at around 5:50 on that building in the rear.
  22. The lens issue is about to go away since most of the major third party lens manufacturers seem to be getting on the E mount bandwagon now. Right now the A7III is probably the premier hybrid choice for the prosumer user.