
Policar


Posts posted by Policar

  1. 2 hours ago, webrunner5 said:

    Yeah, my Panasonic AF100A is a total pain in the butt to take video with, because there is no auto anything, and every scene you have to white balance, black balance, adjust f-stops or use the ND filters, etc., etc., etc. It is crazy how hard it is, but when you get it all right, man, it can look pretty unbelievable, just the way you want it, not what Panasonic wants.

    Now my Panasonic G7, you turn it on and shoot. You can have it on auto everything. I guess if Apple made a camera this might be the one for the masses. It is almost perfect all the time. Run and gun, bam, it works. But it looks videoish as heck. But if that is what you want, it is easy as hell to do.

    Apple already makes the world's most-used camera, the iPhone. I think it produces a great image for what it is, great color and the HDR mode rocks for stills.

    You still need a crew to use an Alexa properly (less so an Amira, which even does grading in-camera if you want), but it's definitely built for luddites and to fit into existing workflows, more like the G7 than the AF100, I suppose (having never used either). So I think these brand preferences boil down more to what approach you take than to what each offers for the money.

  2. The Sennheiser sound signature is mid-centric, with rolled off highs and a slower impulse response resulting in the notorious "veil." I haven't used the HD800 (although I'm trying the HE-1 this weekend and will hopefully get to try it, too) but I expect it would be the only model you'd care for. Grados are much brighter, Stax as well.

     

  3. Yeah, I don't hear it. A friend had both the M50 (the old one) and the HD650 and the HD650 was in an entirely different league, vastly better in every conceivable category. But he was driving it with thousands of dollars of hardware (amp/DAC). 

    I do find the Sennheisers polite/veiled. If you love detail and sparkle, they aren't for you.

    If comfort isn't an issue, the $20 Monoprice headphones (Monoprice 8323) sound about as good as the M50/7506/HD280. I hear the Samsons do, too.

    The Stax are very bass-thin, however, which is too bad. They're no Beats, which have thicker, richer bass. But the detail is great. Imo they don't sound very good with modern recordings.

    The HD600 series does sound very veiled, but driven from a high-impedance source it actually sounds quite good imo. Very warm. Try it through 100+ ohm impedance outputs and you might change your mind.

    While I don't love the M50, I agree it's a great recommendation at the price point. Similar to the Sonys or HD280s but with better sound for listening, yet still accurate enough for basic monitoring. The only advantage with the Sony is its ubiquity; it's likely what your mixer or editor is using as a reference.

    I apologize for any perceived attitude. I'm a camera enthusiast who still shoots as a hobby (I used to shoot TV professionally), but now that I'm working in post I get to work on some even higher-end projects with the most cutting-edge camera systems and lenses. And I wanted to share some of the latest news that I was incredibly excited about, as well as my experiences with different camera systems. Every day I'm working with Alexa or Varicam footage shot with the highest-end lenses, which is especially fun for me since I'd shot with almost every camera system previously, but I get an even better impression working with the files in post and interfacing with people on the cutting edge, learning their preferences and prognostications. And I never got to the level on set where I could shoot Ultra Primes next to Aluras one day and then C Series anamorphics the next, but I do get to work with that footage now on a daily basis. But I get that this (and my confidence in the people around me and what they say, not in my own opinions, which I try not to mix in, and I apologize for my misunderstanding regarding the rec2020 color space*) isn't welcome here because of the perceived arrogance, and I won't be posting here anymore.

    *That said, Canon's white papers do indicate a camera that fills most of the gamut (which naturally includes imaginary colors the eye can't see, and so arguably needn't be filled), and we've seen before that a camera needn't resolve a full "4k" to be 4k compliant, yet Netflix does exclude some cameras from common use that claim a 4k spec. So there is a middle ground, and I believe this is where the divide falls. The advice I'm hearing from the top brass at top labs is that 15 stops and support for high-bit-depth, high-bit-rate rec2020 will be what producers ask for from a camera when shooting HDR. Canon had HDR in mind specifically when developing the C300 Mk II and C700. That said, I will have to trust you that the GH4 can also fill almost the entire rec2020 gamut, as I have not read the white papers and my only experience with that camera is without an external recorder or V-Log, where it still performs well for the price. As regards the 12-stop spec in HDR10 for acquisition, that is news to me, and interesting to consider given that Dolby (where I consulted for free simply due to my enthusiasm for the product; I have friends with financial interests, but I don't have one) is looking toward much higher numbers.

    Thanks for clearing that up; I was confused because I assumed it was light sensitive. I agree about the Q7 being clunky to work with even though it's very impressive in other ways. I always found the C500 through the Q7 to be extremely sharp at 4k, far sharper than the Epic or Alexa, but the color straight to ProRes seems odd to me and I've seen mosquito noise and aliasing that's a bit annoying. I've never worked with the raw files, but it seems like a camera that's halfway there to me.

    Some of that is fair. But for those looking into a camera that will meet future HDR specs for YouTube, Netflix, etc., the C300 Mk II is the lowest-end option that has a good chance of being approved. That said, this site has never been about what standards others have approved, but about getting great results with what you have. For the money, the GH4 definitely offers a good image.

    The rest... take it or leave it. We all have different goals. (Mine isn't further consulting work; it's simply that I think Dolby's standard is superior.) I don't have a crystal ball, but I do have access to information that many people don't. I don't know enough about gamuts and chromaticities to claim any camera can cover any given gamut beyond what is printed in white papers, and it's clear from Canon's white papers that the C300 Mk II does not cover rec2020 in full (I don't believe anything does). But its native gamut lines up very closely with rec2020, it covers far, far more of it than most cheaper cameras, and it was designed specifically to cover more of it than the competition.

  8. 22 minutes ago, markr041 said:

    Who is talking about REC709? REC2020 is already an official spec of HDR - this generation. The GH4/Shogun combo can do REC2020 color no worse than the Canon. Canon beats it by a few stops in DR when the GH4 uses Vlog L (12 stops) (if Canon meets the goal of 15 stops), but the GH4 is equal in providing expanded color gamut, color gradations and color sampling, which are just as important for HDR video. In short, that combo is way beyond the specs of REC709. And 12 stops is within the specs of HDR10, an agreed-on standard for HDR.

    The phrase "doesn't meet the standard for professional use" is an incorrect statement (not to mention arrogant), as there is no "standard" for "professional" use of HDR, although there are current standards for HDR (like HDR10), which the GH4 meets as does the Canon.

    I have knowledgeable friends who shoot professional video by any definition of professional and use GH4's. They do not have secret relationships with companies that try to sell products by setting arbitrary standards in the interest of maximizing profits (nor do I); they are just professionals creating art and getting paid for the quality of their artistic product.

    Good for you. I have friends shooting and grading major features on Alexas and who are on the ACES board. And I recently worked as a consultant at Dolby for their new Dolby Vision workflows.

    I'm glad your made-up HDR standard works for you and your friends with GH4s. Which can't deliver a full rec2020 image, unlike the C300 Mk II, but it's fine that you think it can. I'll let Technicolor know that they should throw out their research. It's true that there is no current standard beyond 10-bit and rec2020 (and a 1000-nit target) for HDR10, because those are display standards and not acquisition standards, and they're the lowest-end ones, which will soon be abandoned anyway. But it's fine; 28 Days Later is on Blu-ray, and that meets the 1080p standard... because it was uprezzed to it. It still looks like shit.

    I'm trying to provide some insight into the future, even possibly breaking NDA because I'm so enthusiastic about this forthcoming tech that I've been lucky enough to have demoed for me. And for those who are interested in where things are going, I want to offer up some advice so they don't make the wrong purchasing decisions. The standards being developed will require 15 stops and rec2020. At least.

    But you're right. You and your friends have it figured out. I'll call Dolby up and tell them to quit it.

  9. 54 minutes ago, markr041 said:

    "The C300 Mk II with the new SLOG 2 upgrade is the cheapest thing on the market that is HDR-ready, and it was actually designed as such."

    Nonsense. The combination of Panasonic GH4 + Atomos Shogun Flame gives you 12 stops of dynamic range (using Vlog L), 10-bit 4:2:2 and UHD, with extended color gamut at very high bitrates. That is about the same as the C300 Mk II ($12,000) provides at that resolution, for far less money. And don't think you don't need the Shogun or something like it for the Canon - the Canon viewfinders cannot show what HDR video looks like in the field. The Shogun can, at 1500 nits and with the appropriate HDR LUTs.

    Shooting in log at HDR specs but seeing only REC709 without LUTs is just not going to provide good HDR video. An appropriate HDR monitor is needed. Someone here evidently thinks shooting in log and underexposing is the correct way to shoot HDR; that is exactly the kind of thinking that reflects what you think you have to do when you cannot monitor HDR video in the field.

    The HDR acquisition spec among those developing next generation standards is 15 stops, though, and rec2020. Not 12 stops and rec709. I've used the GH4 and it simply doesn't have that much highlight latitude. I have friends at Dolby, Technicolor, Deluxe, CO3, etc. developing the next-gen HDR grading systems and they're all operating under the 15 stop spec. To reveal my sources more specifically would break NDA, but I know for a fact that Canon targeted 15 stops as it was considered baseline for HDR. 

    To be fair, these labs are remastering content from film and from cameras spec'd at 14 stops and under. I'm sure you can create compelling personal HDR-ready content with that system, but it doesn't meet, or even come close to, the standards being developed for professional use. And when distributors and exhibitors further and more formally standardize, the 15-stop/rec2020 spec (or better) is what's going to be expected. Whether the C300 Mk II meets its 15-stop claim is another question (before the firmware update it probably didn't; I have a number of friends who worked on developing that camera, and it was an ugly, protracted development whose early failures mask a sensor that's now impressive with the SLOG 2 firmware update), but anything less isn't even close. (Fwiw, I would rate the Alexa Mini at 15+ stops, and internally Arri does as well. Likewise, the F65 has footage that can be graded for HDR even if it wouldn't meet specs for official support, so yes, it's not cut and dried.)

    I'm not saying this to argue with you but to educate you. I have no doubt that you can make impressive content on your camera that looks impressive on an HDR display. But this is not an approach I would recommend to anyone as anything other than a hobby and the results will never make full use of the technology being developed, not even close. That said, from this experience you are building now, you will be on the forefront of shooting and grading HDR, and that is an extremely valuable skill going forward.

  10. 21 minutes ago, BenEricson said:

    When you say "speed of the camera," that's where it becomes confusing. I think saying a slow sensor or fast sensor makes sense, since that's directly comparable to a film stock. 

    That makes sense. But now that cameras sort of are their sensor, it does get muddy. I'm still curious what the OP meant.

  11. 1 hour ago, Eric Calabros said:

    If I remember correctly, Philips research showed that people can't tolerate 10,000 nits. Yes, it's breathtaking for the first minute, but after that you feel it's destroying your eyes. 4000 was acceptable, and 2000 was most preferred.

    At the moment, we can only do this to future-proof our work:

    Shoot log, 10-bit, widest color gamut available. Underexpose to save the highlights, but be careful that unavoidable clipped highlights stay a very small part of the whole image area.

    If that's what Philips is saying, it's ridiculous. Step outside during a sunny day and you're exposing yourself to brightness far in excess of 10,000 nits all around you, orders of magnitude brighter. Having seen 10,000 and 4000 side by side, I can confirm that neither is fatiguing, even in indoor light, though you might choose to turn the brightness down at night as you would with any display. And simply because a display can get that bright doesn't mean it will. It's like saying 8k is too sharp: it might be unnecessarily sharp, but you can always show a less sharp image. Only very small parts of the frame ever get that bright.

    Dolby, who doesn't have a horse in this race as Philips does (hardware manufacturers want lower levels to be perceived as preferable so they can sell displays before the technology advances, whereas Dolby just wants the best standard that will eventually get adopted), has found that 10,000 nits is what 90% of consumers consider "adequate." Based on my experiences with HDR, I wouldn't consider anything below 4000 nits even "true" HDR (though it might still be quite impressive), and 10,000 nits for me is the baseline of what's ideal, similar to how I think 2k or 1080p is an ideal "good enough" resolution. Anything beyond that is great but diminishing returns, and only really appeals to that last 10%. Of course, 2000 nits will still look much better than 100, which is the standard now. It's considerably more than halfway to 10,000 in log base 2.
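
    To put a number on "considerably more than halfway," here's the arithmetic as a quick Python sketch. The 100-nit SDR reference is the current standard mentioned above; the other levels are the ones from this discussion:

    ```python
    import math

    sdr = 100  # nits, current SDR reference white mentioned above

    print(math.log2(2000 / sdr))   # ~4.32 stops above SDR white
    print(math.log2(10000 / sdr))  # ~6.64 stops above SDR white

    # 4.32 / 6.64 ~ 0.65, so 2000 nits is about 65% of the way to
    # 10,000 nits in log terms: "considerably more than halfway."
    print(math.log2(2000 / sdr) / math.log2(10000 / sdr))
    ```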

    As regards future proofing, I don't think any camera I have will come close to matching HDR spec so I don't worry about it. The C300 Mk II with the new SLOG 2 upgrade is the cheapest thing on the market that is HDR-ready, and it was actually designed as such. 

  12. 1 hour ago, Cinegain said:

    In context, though, I personally coupled 'slow' and 'old' to the operation of the camera, not the sensitivity at which it shoots, which you seem to agree would make more sense?

    I got started shooting film and that was all I shot for a while. So when I shoot digital I'm still approaching it how I did when I shot 16mm. And when I shoot stills I'm approaching it how I did when I shot 120 and 4x5. So while I might expose a little differently for a digital camera, same as I'd expose differently for slide film and color negative, I'm always thinking about it that way. And for me (and I think most film guys) "speed" refers to ISO of the film or stop of the lens set. 

    While you're clearly thinking about things in a more complex and advanced way than I am, you'll have to forgive those of us who are old and slow ourselves. I think it's worth respecting some of these old terms, if only because the plurality of shooters still abide by them and they have a definitive and clear meaning. If you ask any DP about a camera's speed, they won't think of ergonomics, which concern the AC and operator more than the DP anyway. So while most people here are approaching things on a more holistic and aware level and don't need to respect such outdated ways of thinking, I still think respecting some of the old terminology will help with guys like me who still use that outdated approach. The OP seems like an experienced shooter, and anyone with any experience on set would use the term "speed" correctly, which confuses me.

  13. I thought speed referred to ISO sensitivity? Maybe I am stuck in the film days still... and maybe that's why I still think 850 ISO is fast.

    You might have to talk with Zeiss about renaming their Super Speeds... While I agree that you'll usually want to give your AC at least T2.8, I would rather have the speed in camera and lenses when I need it than have to rent a generator or do a tie-in. And I see DPs opening up all the way even on major features to get the sensitivity needed; after all, the Alexa and Red (and film) are comparatively slow. What I don't see is anyone pushing them beyond 1600 ISO (though I often see the C500 going to 3200 ISO, hence my belief that it's not slow; maybe compared with the Varicam it is?).

    The OP seems to be an experienced shooter so I have to assume he means speed in terms of sensitivity, but then he mentions the Epic, which is definitely a slower camera.

  14. 1 hour ago, DaveAltizer said:

    I shot this for a company that owns the footage so unfortunately I can't share files with you, but I'll post some screen grabs from it on Monday. The camera was so slow. All the old C cameras are slow. But the image from the C500 is undeniably great. We only shot to 4K ProRes HQ as the editors at the company I worked for aren't ready for raw and the turnaround needs to be quick. From the dailies I've seen, the footage from the C500 with the Odyssey looks better than any Epic MX footage I've shot in the past. Idk why we never rented this camera before on RED shoots. I think the RED name holds a lot of value to producers. Idk. But this forum really encouraged me to branch out and try these old slow cameras. I recommend you guys do too!

    Isn't 850 ISO (well, 500-1000 ISO) about standard for a base ISO? I don't understand what you mean. Slow to work with? Or did you find the ISO inaccurate, or are you used to Sonys at base 2000?

  15. On 11/18/2016 at 9:49 PM, Liam said:

    I'm lost. There are displays that actually show up to 15 stops of DR now? Or still just an image with 15 stops compressed, but not as much? What does 10 bit have to do with HDR? Just another advancement that's buddying up with HDR in YouTube and better screens? They definitely don't have to go hand in hand, right? A good HDR display still uses all of its DR with a video not made for HDR, right? Not just essentially lifting the blacks and everything to match a bad display? A video made IN HDR, when played on a normal screen, will clip the boundaries? Or compress it, giving it that flatter look? Will old content be poorly converted in order to appear to be HDR?

    Sorry... the last I heard on this topic was like, "Wow, there's a display that has EIGHT stops of DR!! The futuuuuuure!" so I'm pretty behind. I should try to figure it out on my own, really, but if I'm missing something important, please share. Also, are pretty much all 4k TVs these days that say HDR liars?

    I've never seen stops as a measurement of a TV's contrast ratio (if that's what you mean by dynamic range, the misnomer is confusing me?), but it's trivially easy to convert between contrast ratio (X:1) and stops. Fwiw, LCDs have had greater than eight stops of contrast for many, many years. For reference, nice prints on paper can have at most 4-5 stops of contrast, which is part of the reason high-contrast film (slide film) was popular for print but insufficient for digital distribution. To convert for a TV, take the log base 2 of the contrast ratio and you'll have your contrast in stops (see the sketch below).

    Dolby claims in their Dolby Vision white paper that Blu-ray standards limit content to a range from 100 nits down to 0.117 nits, which is under 10 stops of contrast. Perhaps this is what you are referring to. But that's not true of the display itself, just the arbitrary standard, and OLEDs and plasmas can reach very, very high contrast ratios; they just don't get that bright. (And neither has an intrinsic "dynamic range": that's a measure of the signal captured or represented in a capture, not of the display used to show it.) What defines HDR is a combination of contrast ratio, brightness, and color gamut. The massive increase in brightness is what's most significant.
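
    Since both of those figures rest on the same log-base-2 conversion described above, here it is as a minimal Python sketch. The 100-to-0.117-nit range is the Dolby white paper figure just quoted; the 1000:1 panel is only a hypothetical example:

    ```python
    import math

    def contrast_to_stops(ratio):
        """Convert a contrast ratio (X:1) to stops: log base 2 of the ratio."""
        return math.log2(ratio)

    print(contrast_to_stops(1000))         # a hypothetical 1000:1 panel: ~10.0 stops
    print(contrast_to_stops(100 / 0.117))  # Blu-ray range per Dolby: ~9.7 stops, i.e. under 10
    ```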

    I'm not sure what the current spec for HDR is; there are many competing standards and, within those, many tiers. 720p and 1080p were decided upon as HDTV standards, but only from among a number of competing standards, and for HDR we don't even have the set of standards finalized. But I think the current spec is 1000 nits for most manufacturers (though you see HDR labels on far dimmer displays) and maybe 10-bit color? Dolby Vision I believe is 4000 nits and higher, with a goal of 10,000 nits and higher. I'd read their white paper; it's interesting without getting very technical.

    Today's standards for HDR are not too high, but the displays are not too impressive. The 15-stop standard (for acquisition; the display itself surely has a much higher contrast ratio than 15 stops, and I can explain why, but if you think about it it's obvious) and the 4000/10,000-nit standards are, I believe, Dolby's goals for the future. It's true that there are still competing standards (HDR10, Dolby Vision, and most requiring only 1000 nits to get the label), but I personally wouldn't consider them HDR based on what I've seen of them. Like 4k, they don't look good enough or different enough to catch on and are mostly just marketing or halo items.

  16. Interesting to see an informed, centrist response about something that matters on a website full of brand evangelizing about something that... ultimately doesn't.

    Better than it being the other way around. 

  17. 5 hours ago, Mat Mayer said:

    Sorry but I am too busy to read it all.

    Could someone just list the basic specs, please? A few bullet points would be really appreciated.

    I assume Panasonic G7 won't be good enough, but do you think the GH5 will have what it takes internally?

    Thanks.

    Ps. To those complaining, at least they are not going to 8k which I assume would be a much bigger task. Maybe with HDR and then Dolby Vision we will be set for 10 years?

    The specs I'm hearing are 15 stops of DR and rec2020, for acquisition. Then 10-bit, 4000-nit, wide gamut for the panel itself. Obviously not many current systems meet these specs, and there are many, many competing standards. After all, 1024x720 was once "HD."

    The result is breathtaking, though. Especially on the 10,000+ nit display. Only one other tech demo impressed me as much this year and it felt less mature. What's cool is you'll be able to see colors you've never seen before.

     

  18. I agree! I still have my ST60 Panasonic plasma even though it's old tech by now. I do think the state of the art OLED sets are better but my eyes aren't good enough to need 4k at normal viewing distances anyway, though most people I know are getting 4k displays now. The dithering on plasmas makes them a little soft to begin with, but the ST60 is fine.

    I saw HDR demoed on a smaller 1080p screen after seeing state-of-the-art 4k projection and there's no comparison. I actually don't think 4k looks any better unless you walk right up to the screen or it's projected on a huge screen. I'm beginning to see the advantage for acquisition (for cropping in or whatever), but I think Netflix and YouTube and Amazon are pushing 4k for marketing rather than quality: it's meant to get people to replace their displays. I find it really laughable that people think this is something that matters. The old 1080p plasma screens have better acutance and the illusion of better resolution at normal viewing distances. However, if you have the money, a 4k OLED would be even better!

    Your plasma is 100 nits at full brightness: brightest highlight, every setting maxed out. I've seen two HDR displays demoed; one was 4,000 nits and the other was 10,000 nits or more. So imagine all that contrast and better resolution and better color detail, and then on top of that it goes 100x brighter. And not just brighter: the darker areas are still as dark and well rendered (actually much better). The sun looks like the sun. It doesn't look like an image of the sun. It's amazing tech. On top of that you're getting much richer reds and greens. It's just a massive jump in quality.

    Fwiw, current-gen HDR-certified displays are 600 to 1000 nits; that's what's commercially available. So while those will look really, really good, they're nowhere near what's possible. I think energy conservation standards may prevent HDR from taking off, however. HDR projection standards will also never compare with home monitors in terms of contrast or brightness. So this may be something that never emerges as mainstream or properly implemented. State of the art is per-pixel 3x LED (similar to Sony's "Crystal LED" technology), and LED efficiency is already high; this is very, very expensive and still too inefficient for widespread use. OLED doesn't cut it for brightness/efficiency. Standard LED/LCD doesn't cut it for contrast. So we may be left with 1000-nit faux-HDR, which should look much better than anything you've ever seen, but nowhere near what's being demoed. Sony has 4000-nit displays at trade shows. That's very interesting, because at that point it does feel very different. Only small areas of the screen can be that bright at once; a large area that bright would be almost painful to look at. We'll see when and if this technology becomes commercially available. 4000 nits is a ways away, and I wouldn't consider anything less than that representative of HDR.

  19. 1 hour ago, Inazuma said:

    What cameras can record in HDR?

    Why is an HDR screen required to display HDR when you could just bake the dynamic range into a normal video file?

    The camera spec I believe is 15 stops, but I saw film and F65 footage that looked fine as HDR.

    That's similar to asking why an HD screen is required to view HD when you could just downscale to SD. The truth is, tone mapping only goes so far. With HDR, it's the difference between listening to a very compressed track (dynamic-range compressed in mastering, not MP3 compressed, though that too) on your iPhone with bad headphones and being at the concert live. It's the difference between a cheesy tone-mapped image and being there. It's really incredible and difficult to describe, because no screens exist now that can approximate the high-end test beds.

    Imagine that the image you're seeing of a day exterior isn't an image but a window, with as much contrast as your eye can see and as many colors, or even colors you have never seen before. The sun can be so bright on an HDR set that it's unpleasant to look at. Imagine if your TV could get as bright as looking into a 60W bulb and as black as pure darkness. Sure, you can compress that range, but why would you want to?
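
    To make "tone mapping only goes so far" concrete, here's a minimal Python sketch of one classic global operator, Reinhard's L/(1+L). The 100-nit normalization is my own assumption for illustration; this isn't the mapping any actual HDR mastering system uses:

    ```python
    # A minimal sketch of global Reinhard tone mapping: scene luminance in
    # nits goes in, a display value in [0, 1) comes out. The 100-nit SDR
    # reference white is an assumption for illustration only.
    def reinhard(nits, sdr_white=100.0):
        L = nits / sdr_white
        return L / (1.0 + L)

    for nits in (100, 1000, 10000):
        print(nits, round(reinhard(nits), 3))
    # 100 -> 0.5, 1000 -> 0.909, 10000 -> 0.99: the last two orders of
    # magnitude of scene brightness get squeezed into the top ~10% of the
    # display's range, which is why a compressed image looks flat next to HDR.
    ```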

    I've seen demos of cutting-edge HDR displays. Unfortunately, a small screen consumes nearly as much power as a small house, thanks to the bank of air conditioners needed behind the unit to cool it. But the image is unbelievable. The jump from HDTV to HDR is much bigger than from 1080p to 4k. As big as SD to HD, easily.

    The high-end first-gen sets are likely very impressive, so it's good to see YouTube pushing the technology. The ecosystem does seem very immature. But HDR is mind-blowing.

  21. 36 minutes ago, fuzzynormal said:

    Unless you're a creative wunderkind that made something uniquely awesome (and entertainingly so) you're probably not going to get into any serious festivals on your own. 

    Actually, making something awesome and new would itself probably get you rejected from most film festivals.

    I actually had this happen. I made a short that was narrowly rejected from a very high-profile festival because the voice was too strong and one of the programmers was afraid it was too weird for their brand. Every lower-tier festival rejected it outright because it was way too weird for them. Very discouraging, but I was at fault for not knowing the community I was applying to be part of.

    You should just do what you want and enjoy yourself. You will probably have fun at the local festival, which is more than I've ever gotten into! If not, don't apply to it next time. Lesson learned. Easy. Find people who are doing what you like. That's step one. Breaking in is step two. Do step one first. Trust me. The same goes for film school, etc. Watch the shorts they're producing, meet the students and faculty. Is this the community you want to be a part of? If so, it's worth considering applying.

    LA is an extraordinarily difficult place to produce quality work cheaply ($10,000 for a short, let alone a feature, sounds 10x too low compared with a high-end thesis film). If you have a strong technical skill set, you can benefit from this when some of the money trickles down your way. If you learn Avid or Nuke or become a wizard at motion graphics, you can make six figures in your twenties without much trouble, but I won't suggest you'll like what you're doing. There are lots of good local actors in most communities, even just in community theater, but you have to look for them and write a script that's strong enough that they want to be in it. You have to be the community they want to be a part of. Or you can pay them. :)

  22. Yeah, I agree with the above. 

    But the post above that, which rates DSLRs above the CX00 line for video quality, is totally absurd. The CX00 is leagues ahead. And I've tried everything and worked with and for the biggest post houses there are, using everything there is. No, you won't get an Alexa with any cheaper camera. But if you're good and careful you'll get everything you need to intercut with an Alexa from the C300, or even the C100. And nothing else will do that. Comparing an A7S with a C300 and putting the A7S first is so absurd it's almost more sad than absurd. That said, the A7S is a brilliant specialist camera for low light, and workable in general situations if you spend thousands more to rig it up with a Q7+ and a bunch of accessories and don't mind grading out weird colors.
