Everything posted by Axel

  1. The upcoming A6700 will very probably have 60p UHD and even better AF. It will almost certainly also have an HLG profile. Now if it also had all that with 10-bit video (and a fully articulated display) ... GH what? Be that as it may, as soon as a new version is available, all Sonys drop in price dramatically. That's a sad fact now that you've made up your mind to get one. Should you wait? Your decision. I would at least try to get a used one.
  2. No, I can't tell "with confidence". I've only ever made 5-6 test shots with slomos. One was a fast push-in from 6 feet to a close-up of my microphone shock mount on the table, AF set to center. With this 5x slomo, it nailed the focus perfectly in every frame. But then, the subject was black with sharp outlines, lying on bright wood. What this test showed clearly was that the image quality is really poor then. That's why I didn't do more slomo tests.

AF is good in general, but it all depends on the mode, if you choose the right one. The most reliable modes are those that involve tracking, for example face detection. They can be called perfect. But there are caveats there as well. See Max Yuryev's and Jason Wong's tips on YouTube. A vlogger's face may stay constantly sharp as he/she moves forwards and backwards, but there may be perceptible breathing of the out-of-focus background, and this also depends on the AF speed and (here it gets complicated) on the lens used. Fact is, the AF works best if it has little to do: when the lens is stopped down to its maximum resolution and the focal length allows hyperfocal distance.

What's the point then? Here is a real-world example from a wedding videographer's view. It's a bright day, and you are outside. In 2160p, the display gets so dark you really can't see your image! In AF mode, there's no peaking to check critical focus. So what I do is either focus manually through the viewfinder (with IBIS and OIS you need only the most basic rig, like a more comfortable *grip*, if anything at all) or, if I am to use a gimbal, use a bright field monitor (e.g. the SmallHD Focus, which has peaking in AF mode). Touch AF doesn't work then, for obvious reasons. I then stop down the aperture to - typically - f/7. Yes, I lose nice DoF effects, but the AF won't fail then. One of the biggest advantages of a camera with good low-light capabilities is being able to use, say, f/5.6 in an available-light situation. The same goes for shooting at 120 fps: you need five times the light for that. Then again, since the HD quality of this camera is nothing to write home about anyway, you'll want to use a lower ISO to get a cleaner image.

Lenses: too many aspects to consider. If good and reliable AF is essential, use good Sony lenses. Or perhaps the new Sigma 16mm; the reviews suggest it's the perfect gimbal lens for the A6500. There is no easy answer ...
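The "five times the light" figure is easy to verify with a shutter-speed comparison. A minimal sketch, assuming the usual 180-degree-rule shutters (1/50 s for 24/25p, 1/250 s for 120 fps - the exact values depend on the camera's available settings):

```python
import math

# 24p is typically shot at a 1/50 s shutter; 120 fps forces roughly 1/250 s.
# The light shortfall in stops is log2 of the exposure-time ratio
# (a simplification: it ignores any aperture or ISO compensation).
shortfall_stops = math.log2((1 / 50) / (1 / 250))

print(round(shortfall_stops, 2))  # about 2.32 stops, i.e. "five times the light"
```

So dropping from 1/50 s to 1/250 s costs a bit more than two stops, which is exactly the 5x light factor.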
  3. Yes, of course! Good explanation. If it's the last clip (appended at the end), you could also hold p and drag it slightly to the right (this produces a slug). That'd be the fade-in. Then comes Matthias' method: this will fade out.
  4. Look, Andrew, this makes no sense. It's very hard to switch if you are used to another NLE. You are constantly looking "for the blue button", as Mike Matzdorff put it. It's actually easier for a newbie. Either you learn FCP as if you knew nothing about editing, or you will go on comparing and complaining. It would be the same the other way around.
  5. You can save the keyframed opacity as an effect preset for fade-in, fade-out or both at once. Double-clicking will apply either of them. You can lift a clip from the primary storyline and apply cmd+t to either the left or right bracket or the whole clip. You are married to Premiere, Andrew. Move on.
  6. Though you can do that (as Deadcode wrote), it's not recommended. FCPX is not good with clips that are isolated from their card-structure bundle. Through the import dialog, FCP will copy selections from the card by wrapping them in a QuickTime container. That's the way it works. But keep in mind: 1. because of the skimmer, you can browse the card very quickly, find your shots and import only selections - in case you think you need to save disk space. 2. you can alternatively just import everything and do the qualifying and organization later on (you did it in the Finder? - waste of time). Importing forces you to copy from the card, either into the Library or to an external folder. Linking to clips without copies ("leave files in place", the Premiere way) is possible (by copying only the MTS files to disk or by drag & drop), but you shouldn't do that! 3. either way, you can start editing while the clips are still being imported - in effect, that's editing directly from the card.

The concept of the primary storyline: in it (e, w, d, ctrl+shift+y) you create the spine of your narration. If you have text (voice-over commentary or an interview), you put that in the primary storyline, and all illustrating clips are then connected (q). For a music video, you'd first select the music track and hit q. From there on, you connect everything to the black slug. Connected clips can move freely (no need to use p). The magnetic timeline is locked. To lock a connected clip in its position relative to the music, just overwrite it to the primary storyline by hitting cmd+alt+down.

The following 10.3 demo is staged, of course. But it's done by a master. You think FCPX is inferior to Premiere? Watch it from beginning to end, particularly what he does in the timeline, and imagine what an ordeal this would have been in a track-based timeline:
  7. I'm afraid I'm about to make another TLDR posting ;-) What's meant by dynamic range? Someone above said common rec_709 displays could show 7 stops of light at best. This is the standard. Let's therefore agree to call it SDR. Many modern cameras can record 10-13 stops. A linear recording (picture-profile related) would only be able (hypothetically, see below) to store an excerpted range of nine stops. Due to broadcast rules, the values below 16 and above 235 were considered 'illegal', because they couldn't be truthfully reproduced. That's the background to 7 stops of light at best. But none of the non-LOG profiles bakes in a truly linear curve, because that would look terrible. They reserve most of the values for the midtones and rob some of the wasted 20 superwhite values. The image of a standard picture profile - often called Standard - tends to look punchy as well as natural. Most of the time this is actually what one wants to achieve. There are also profiles that favor skintones ('Portrait' - for Sony mirrorless the creative style Autumn Leaves has become popular because it additionally has a color shift complementary to skin), landscapes and so on. I don't know the exact numbers, but let's assume that 50-60 values are reserved for the skin range then.

A LOG quantization curve tries to distribute more or less (there is usually a knee for superwhite) the same number of values to every stop of light within that 256-value scheme. For the Sony A7S II, which claims to record 14 stops @ Slog3, this would mean 18 values for each stop (it may be somewhat more complicated, but for the sake of simplicity, let's agree upon that simplification). One term that recently came up in conjunction with the new GH5S' dual ISO is *usable dynamic range*. Tom Antos' tests to verify the actual dynamic range of cameras showed that all LOG recordings capture the noise floor too, so one has to subtract one stop for the shadows.

But there are limiting factors for the highlights too. If a daylight sky's gradient shows banding in Slog3 in a graded rec_709 version, where in-between values are interpolated with floating-point accuracy in post, one can imagine how this will affect HDR. Admittedly, I am not a broadcast engineer. But taking all of the above into account, I'd say that 10 bits or more are needed to really extend the DR. Please comment on and correct my arguments.

Now for a subjective point of view: we still live in an SDR world. Seven stops. Light is just white, a dull color mix of RGB. 13 stops of light crammed into rec_709, lifting shadows, preserving highlights, results in the kind of artificial-looking images we often see when people record in LOG. If light is the subject, if it's prominent in the image, I consider this counterproductive and pathetic. It's better to let the highlights clip, to let the 100 IRE white eat away the detail. I found an image to demonstrate what I mean: Always? No, but the unwritten law to never let the highlights clip is stupid, imo. Light is not just an informal part of the environment, it's an epiphany. I like a punchy image more than an expressionless one. Oh, Larry, well done! I really see detail in the clouds, terrific!
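The back-of-the-envelope arithmetic above can be sketched in a few lines, assuming the idealized case where a log curve spreads the code values evenly across all stops (real curves like S-Log3 keep a legal range and a superwhite knee, so the true figures differ):

```python
def values_per_stop(bits: int, stops: int) -> int:
    """Idealized log curve: code values spread evenly across the stops."""
    return (2 ** bits) // stops

# 8-bit, 14 stops (the A7S II @ Slog3 example from the post)
print(values_per_stop(8, 14))   # 18 values per stop
# the same range recorded in 10-bit has far more room
print(values_per_stop(10, 14))  # 73 values per stop
```

The jump from 18 to 73 values per stop is the whole argument for 10-bit when grading log footage for HDR.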
  8. As for serious CGI (he mentioned that in connection with Jurassic Park and so forth), you have to consider that in 2018 this still involves so-called render farms. Any traditional desktop machine won't cut it. These complexes are so expensive that you need exchangeable parts to keep them up to date. Compact size and design (Apple's specialty) are not asked for. More and more, the GPUs do most of the job, preferably in realtime (we are on the verge of realtime already, see Blender's EEVEE), and it's paramount to be able to swap graphics cards AND motherboards with no restrictions from a hermetic OS or undersized housings. Once you have successfully built a setup for the software you are working with, the operating system doesn't make a huge difference anymore. I use an iMac because it's a simple all-in-one machine, not overpriced if you count the parts and the time needed to frankenstein them together for a Hackintosh. What I don't get is why anyone would want macOS for Adobe, where it notoriously performs worse than on Windows 10 ...
  9. I don't know about the "bad videos". As we all know, YouTube filters the millions of clips according to your own viewing habits. I see my special interests reflected in the suggestions YouTube offers me when I open it. I do not have to search for the needle in the haystack. The less I fall for presumed clickbait, the more nuggets appear beneath all this crap. They are there, and in bigger quantities than on Vimeo, where everything looks so decent at first glance, but turns out to be pretentious crap later, and mostly boring. I like the idea of the open secret: something that's already there and waits to be discovered. If it's any good, more people are likely to find it sooner or later. I recently saw an analysis of The Exorcist by Rob Ager (by accident - I didn't look for it, because I wouldn't have had any idea about it in the first place). He argues that there is an open secret about this film, ignored by practically everybody, but impossible to ignore once you view the film again. The girl had been sexually abused and has suppressed this experience. Her "possession" is the answer. Over-interpreted? Look it up! What is needed is a guide on how to place one's own clips on YouTube so that they are found by the right audience. And, in general, you should be able to improve the quality. Pay for higher bitrates or better compression? Why not?
  10. If it's *just* about money, if the only votes that count are those of the shareholders, this industry is doomed. Applicable to everything. If we are measured by the degree to which we can be exploited, our kidneys will be sold and the rest becomes soap. Watch The Cooler. The old casino mafia ruled this frivolous business with cruelty - and passion. Then the bankers appeared and took over. And the world turned to shit.
  11. The reasoning behind this: the GH5S has a bigger sensor compared to the GH5 (17.3 mm x 13 mm), but not as big as that of my A6500 (23.5 mm x 15.6 mm). Low-light performance with my camera is still better. But I don't care at all about low light. Dynamic range is also slightly better: 13 stops (Sony) over 12 stops (Pana). The Pocket (as well as the Micro, which has the advantage of 50/60p @ 1080) has 13 stops. People have now introduced the term usable dynamic range in connection with the cleaner shadows of the GH5S. But there is another side to this: gradation of the highlights. You'd usually ETTR and bring the highlights down to IRE 100 in post. Dave Dugdale has made a video in which he shows 8-bit to be much less problematic compared to 10-bit than is commonly believed. But this is for rec_709, where you squeeze 13 stops into 6 stops: ultra-low dynamic range. The Pocket, with its tiny sensor, is by no means a low-light performer. Shadows will be very noisy (you see it in froess' fairground clip, but also in the shadows in the forest clip). The noise does look a lot like film grain, but of course you would want to avoid it nonetheless. You can do so by setting zebra to 100% and adjusting exposure so that the highlights just start to clip. This camera really is (contrary to its reputation) the easiest of them all in this regard. Of course, there has to be enough light and/or you have to use fast lenses and additionally speedboost them. But. Then. The colors are beautiful to start with, and with 10 or 12 bit, they stay beautiful in post. Now, finally, we hear the bell ringing for SDR. I bet if you saw a high-key image with sun and sky in HDR, shot with the Pocket, you'd say wow! Shot with Slog in 8-bit? Probably eew ... I wonder if the GH5S could be the right compromise. Good enough images, good resolution, low noise in the shadows, enough bit depth for good highlights. That's paramount. Missing AF, IBIS and high ISOs don't need to show in the final image.
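Why shadows are where 8-bit hurts can be made concrete with a small sketch. It counts how many code values a display-referred curve spends on each stop, using a plain 2.4 power-law gamma over the full 0-255 range as a stand-in for the rec_709 transfer function (an assumption - the real curve has a linear toe and legal-range limits, so the exact counts differ):

```python
def codes_in_stop(stops_below_white: int, bits: int = 8, gamma: float = 2.4) -> int:
    """Number of code values spent on one stop of light.

    Plain power-law gamma over the full code range; a simplification
    of the real rec_709 transfer curve.
    """
    max_code = 2 ** bits - 1
    top = 2.0 ** (-stops_below_white)         # linear light at the top of the stop
    hi = round(max_code * top ** (1 / gamma))
    lo = round(max_code * (top / 2) ** (1 / gamma))
    return hi - lo

print(codes_in_stop(0))  # brightest stop: 64 codes
print(codes_in_stop(5))  # six stops down: 15 codes - shadows get few values
```

The brightest stop gets dozens of codes while deep shadows get a handful, which is why lifted shadows band and show noise first.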
  12. Wonderful. How I rue having sold this camera! It sometimes showed the limits of its resolution, but that's the extent of it. Thanks, froess!
  13. Provocative thoughts on IBIS: terribly overrated. Nowadays no one seems to be able to conceive of a shot without it. The point is that with UHD, small shakes and jitters instantly reduce resolution considerably. I know. People want to be able to hold the camera body in their hands - a body that's ergonomically unfit to be held that way - while looking casually at the swing-out display. The switch from traditional camcorders (no one wants to shoot with one nowadays) to DSLRs brought the rig industry into being. Rigs were never perfect out of the box. You didn't buy a rig and you were ready to go. You had to adapt a rig, make it fit in size and angles to your own body and preferred way of shooting (field monitor, or rather a bigger EVF?), practice with it, buy additional parts. There was no in-body image stabilization, so designing your personal setup for stabilization was top priority. Additionally, you had to train your muscles and breathing (exhale while concentrating). Zen And The Art Of Body Camera Stabilization. Smart technical solutions to compensate for common problems (not enough light, no patience to focus, those things) make us lazy. I doubt very much that they contribute to better images - rather the contrary. Like I said, deliberately provocative. You can also tell I'm considering the GH5S.
  14. Thanks, jonpais. Important to know that with bright outlines you need to pan more slowly. I am a fan of slow pans anyway. As for the example with the silver buttons in the low-key scene: can't you just subdue those highlights through grading?
  15. @jonpais 720p 60? Did you try other settings? It could be worth trying AV out in Resolve. Once that works (it may or may not work better), you can launch FCP in parallel. Sounds weird, but it may help.
  16. I was skeptical about HD as well. I first saw it in autumn 2004 at a trade show with the then-new Sony FX-1. The worries were unsubstantiated, since the images were shown on SD TVs then. I wasn't impressed. The first time I saw UHD was at a trade show again. Some JVC camcorder, stitching four HD videos together. Horrible colors, terrible edge sharpening. The best part was where they showed a fish tank that was supposed to look real. The audience was impressed. I said, no, the fish look dead. I have a more convincing screensaver ... Groundhog Day.
  17. Demo reels and "test shots" featuring once-new technology are always cringeworthy in retrospect. On your comparison of B&W with Technicolor: in one of his docs on cinema history, Martin Scorsese shows that color was used creatively early on. They learned to hold back very quickly. 2D projection had to be 50 nits peak (15 foot-lamberts). With brighter projection, you'd lose contrast again. The blacks never had been very convincing in cinema either. It's true that analog film in particular (but digital cinema packages also) can hold more stops of light. More than could be shown. This whole HDR affair is about new display technology more than about camera technology. Cameras that can record 10-15 stops in 10-bit or higher have been with us for a few years now. It's just that until recently most of it was "lost in translation" for distribution. The downside will probably be that HDR will be less 'forgiving'. Many affordable cameras were just a tad better than what the 8-bit rec_709 YouTube clip they were bought for demanded.

I was among the 4k skeptics. Resolution is not about image quality. The trend moved in the wrong direction. If 4k was sharper than HD, it was because it hadn't been true HD before. And for the sake of more pixels, everybody was happy to allow heavier compression. Despite storage costs having become so low (I have to think of John Oliver's How Is This Still A Thing?), and in spite of the warnings that compression artifacts degrade perceived image quality the most (see Yedlin again). But I wasn't "opposed" to 4k. To those who feared problems with make-up and the like, I said, why would it make a difference? Would I light and frame differently? No. Why? I had been shooting DV. Did I avoid long shots with a lot of background detail? No. Why? The same with HFR. We've discussed this ad nauseam. I had reservations. But I could name the reason: the comparative lack of motion blur takes away momentum. If you know about that, you can shoot accordingly.

As I see it, you can still shoot 24p in UHD. The resolution goes down as the camera moves? So be it. If you are fixated on resolution, you will eventually stop motion altogether. No more fights, no car chases. Pristine calendar stills of the graveyard, soon with a 1000-nit sun playing behind the headstones ...
  18. Give it a couple of years, yeah. That's reasonable. I won't try to shoot a wedding in HDR now or in summer. I do, however, have some more ambitious shorts in planning, whose visual quality concerns me now. In short: I think so. My iMac display is *not* HDR, but it has ~500 nits. LCD, which means the blacks are grey. I have it backlit with a 6500 K LED bar for perceived contrast. I can't stand watching Netflix on my ~4-year-old Samsung TV anymore. In comparison, all images look muddy and faded. Once it is replaced by an HDR TV, it's clear to me that I wouldn't want to invest any further effort into producing rec_709 images. There are two sources that discuss HDR vs resolution vs brightness in detail:

1. The Yedlin "Resolution Myths Debunked" video. The bottom line: with true 1080p, we have passed a threshold. We won't be able to see individual pixels at reasonable viewing distances. What is more, since all images are scaled - always! - upscaling on a device with higher resolution will improve the perceived resolution dramatically. HD on a UHD display looks better than UHD on an HD display. Fact. Resolution is only good if you can't see (or rather feel) its limits. So resolution must be "invisible". Resolution is often confused with perceived sharpness. Beyond the said threshold, contrast adds more sharpness, brilliance and clarity than more pixels do.

2. The lectures on rec_2020, which include 4k (UHDTV1), 8k (UHDTV2), HFR, wide color gamut and HDR. This is a complicated matter, but all engineers agree that an extended dynamic range contributes most of all factors to perceived image quality. As a side note regarding resolution: it's an indisputable *fact* that 4k @ standard frame rates is only HD for moving images. 4k demands bigger pictures (interchangeable with shorter viewing distances, as with a retina display), and the motion blur then diminishes the spatial resolution. 50/60p for 4k, 120p for 8k. Like it or not.

You can't be a pixel peeper and resolution fundamentalist and at the same time insist on cinematic 24p. We have to define the word benefit here. At the present point, it may not be reasonable or economically advisable to buy the hardware. If these were generally accepted arguments, EOSHD would probably die.
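The "passed a threshold" claim can be checked with simple geometry. A minimal sketch, assuming the common ~1 arcminute figure for normal visual acuity and a 55" 1080p TV viewed from 3 m (my own illustrative numbers, not Yedlin's):

```python
import math

def pixel_angle_arcmin(diagonal_in: float, h_pixels: int, distance_m: float) -> float:
    """Angular size of one pixel, in arcminutes, for a 16:9 display."""
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)  # panel width
    pitch_m = width_m / h_pixels                             # one pixel's width
    return math.degrees(math.atan(pitch_m / distance_m)) * 60

# 55" 1080p TV viewed from 3 m: below the ~1 arcmin acuity limit,
# so individual pixels are already invisible at that distance
print(round(pixel_angle_arcmin(55, 1920, 3.0), 2))  # ~0.73 arcmin
```

At ~0.73 arcminutes per pixel, a viewer with normal acuity can no longer resolve the 1080p grid, which is the sense in which more pixels stop adding visible sharpness.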
  19. Thanks again. One question, hopefully a simple one: what kind of Thunderbolt port is that? The reviews for the BM device date back to 2013. Will an adapted connection allow it to draw power from the Mac? Nowhere on the German sellers' sites is it specified whether there are different versions for different ports (and neither on the BM homepage). With taxes, I could get the UltraStudio AND a Flame for ~1000 €. Now I'm spoilt for choice as to whether I should spend 300-400 € more on an Inferno, since my A6500's 8-bit (which is also limited to 30p @ UHD) may turn out to be insufficient. jonpais, invaluable information!
  20. Not only do the interfaces (for *Macs*) cost more than the monitor/computer. You might come to the conclusion that - for HDR - it'd be wiser to jump to Windows. You'll probably be able to build a custom PC (for *Resolve*) for the price of the AJA that'd outperform your iMac. Since Apple obviously recognizes the potential of HDR (HDR on the Apple TV 4K, and more and more on the iTunes Store VOD), we may see an iMac with an HDR display. You'd have a superior monitor with a computer for ~$2700; it would reverse the situation. When? Here come third-party manufacturers. All it needs is a little box with Thunderbolt in and HDMI out that correctly flags the intended color/HDR space to the monitor. Does it have to be very expensive? You could spend a fortune on the AJA, and three weeks later see the device (perhaps even from BM) advertised for under 300 bucks ... As good as Resolve has become, it's still inferior for editing compared to FCP (and will remain so as long as it's married to tracks). And FCP adds CC features that turn out to be half-baked. It's an editor, period. It doesn't have - for instance - an audio mixer. Nevertheless, with audio events quickly and neatly sorted into role lanes, everything is perfectly prepared for (e.g.) Fairlight, if an audio mixer is what you desperately missed. You can have the best of both worlds. HDR is the next big thing.
  21. I don't know. I've been using it since the day it became free software. I've read the manual (all of it), I bought the Hurkman video training, I searched for and tested roundtrip workflows. I witnessed how the two applications became more and more compatible. The last time I used Resolve, it didn't yet auto-adjust the pitch of retimed clips, for instance - no big deal. Roundtripping with APPLE'S Color was more awkward than with Resolve now.
  22. I don't mean FCP. Perhaps (as Mark Spencer says) the AJA is the only officially supported interface for FCP. The old DeckLink Mini Monitor (HD) also worked so-so with FCP. The image sometimes froze. The solution then (with the cheesegrater Mac Pro) was to have Resolve open at the same time - where it worked perfectly. My maximum tolerable price for an interface would be ~600 €. Roundtripping is pretty reliable, and grading in Resolve is still better. More so since the new CC tools in FCP seem to have their problems.
  23. Now we're talking sense. I'd be happy to skip over to Resolve for grading and monitoring at an acceptable price, which shouldn't be a big problem now that FCP understands what HDR means in the first place. Thoughts on that?
  24. Have no GH5. Have no HDR TV. How do you judge things like midtones and saturation? Of course you can't see the DR on an iMac. But does the preview suffice? Do you just trust the scopes?
  25. Look, what bothers me most is not that Apple missed a bug here (though it's not exactly awesome that this slipped their QA). It's the reaction of the FCP X fanbase, among them famous Ripple trainer Mark Spencer, who openly denies that something is wrong - despite clear evidence. Some individuals treated Ubsdell like a traitor, and I find that contemptible. I'm not one of those blind believers. I still think FCP is best for editing. And still, even if the new CC tools all worked as expected, Resolve is faster for serious grading. Nodes are more appropriate for organization, there is an easy wipe/splitscreen mode, and I love the still store. Fortunately, I can have the best of both worlds, because I find the roundtripping very easy and reliable.