
Leaderboard

Popular Content

Showing content with the highest reputation on 02/03/2018 in all areas

  1. PannySVHS

    Lenses

    You humble cat. Not the one standing proudly on your back:)
    2 points
  2. @Mark Romero 2 - here are my thoughts on shooting architecture / interiors, etc. with the A6500/NX1 (!). My claims are based on my own experience with some famous (and extremely picky about detail and resolution) architecture offices and luxury real estate dealers, possessed by "gigapixel mania". These guys seem to have invented pixel peeping... I own and use the A6500 and the NX1. As you know, these two cameras are exceptional in 4K when talking detail and sharpness - much "sharper" than even far more expensive dedicated video/production cameras. But... there is a price to pay for this sharpness, and the "micro-jitter" or "stroboscope effect" is one of the disadvantages. Sometimes, when shooting 4K even on a tripod, panning with the A6500/NX1 and stopping down the lens to f5.6-f11 for maximum sharpness and resolution, you will encounter the stroboscoping (jitter) effect. Let's take a look at the different use cases:

  1. Panning. When panning, try to pan smoothly; personally I pan a 90 degree angle with these two cameras in 12+ seconds - not faster. I do it this way when shooting with a gimbal, but also when using the cameras on a tripod.

  2. Using a motorized gimbal. Today's gimbals are fantastic. But still, even people who claim to be "very experienced" often do a superficial balancing and calibration. Spend a lot of time on precise balance and (micro) calibration of your gimbal; you will see a huge difference when it is done right. As you might know, the Zhiyun or Moza DO NOT counterbalance vertical shake. Therefore, please put a piece of mousepad (cut to the shape you need) between the camera and the gimbal plate. My piece of mousepad is about 2mm thick and it helps a lot in avoiding jitter, because it is quite efficient at buffering vertical shake (when walking, stepping, or running stairs).

  3. Practice running with the gimbal. You might say this is trivial advice. It's not! Most guys believe they do it the right way. They don't. Two months ago I paid 600 EUR for a 10-hour seminar/course with a gimbal and steadyshot pro. It was great, because there I learned I was a noob. It can take some users years to learn to use a gimbal / steadypod properly. You have to move and walk like a ninja, to use your hands and arms independently of your body, etc. It's not as easy as it might sound.

  4. Frame rate / shutter speed. This is essential. You are based in an NTSC country. As you shoot interiors / property, you don't need to shoot 24fps. Shoot 30fps. As seen in your video, you usually use 3-6 second takes. When shooting 30fps in NTSC, you can even slow this down a little in post by putting your shot into a slightly faster timeline than it was shot in. So, when you shoot 30fps, try to keep the shutter speed at 1/50. This will make each frame more "blurry" than shooting at 1/60 - and will diminish the "stroboscoping effect" / jitter substantially. Put an ND filter on your lenses; don't try to reduce the amount of light by increasing your shutter speed. Increasing the shutter speed and panning will accentuate the stroboscoping effect.

  5. Use OSS / IBIS if possible... but test the same takes WITHOUT... there are some cases where micro jitter disappears with IBIS off... Just test...

  6. You have to rethink the term "interior video". When you shoot an interior, (mostly) nothing will move. So why do you shoot video? Shoot stills! Let me give you some examples from your video: the takes at 0:20-0:25, 0:25-0:28, 0:51-0:58, 1:19-1:24, etc. WHY do you shoot video? Why? Shoot stills and create the pan / zoom-in effect IN POST... Shoot on a tripod. Shoot 3+ exposures (depending on contrast and DR expectations) for each take, in RAW at base ISO (100 for the A6500). When shooting like this, you can get an unbelievable amount of DR and great colors within 5 minutes. Blend your shots and apply luminosity masks in post, if necessary... You will get DR you could never get from the baked-in h264 when filming...

  Now... your 10-18mm Sony lens is great... but it's a 15mm equivalent on the crop A6500. You have a FOV of 76 degrees vertically, which should be more than sufficient... You have a horizontal angle of view with this lens of 99 degrees (a quick check of these numbers follows below). IF you need a wider angle (= more FOV) - my picky customers refuse to accept wider angles (FOV) than about 120 degrees, because they find them "unnatural" - take more photos by panning and stitching in post. When talking about a 120 degree field of view, we are talking about 3 horizontal takes. BUT: use a panoramic tripod head (here in Germany from 130-190 euros) for this; you have to find the proper nodal point before stitching... If needed, shoot vertically, but then you need more takes to stitch in post... There is plenty of cheap or almost free stitching software out there. Then import your post-processed photo (after RAW post-processing, blending, stitching) into your NLE and create the pan / zoom effect in spectacular resolution and DR... NO ONE will know it's not film...

  BTW: shooting at f11 with the 10-18mm on the A6500 is NOT useful; as with most APS-C cameras, f11 is where diffraction starts to kick in (quite visible in stills). At this wide angle (10-18mm), f8 should be enough to get maximum sharpness and resolution. Have fun!
    2 points
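For anyone who wants to verify the angle-of-view figures quoted above, they follow from the standard formula. A quick worked check, assuming the usual Sony APS-C sensor dimensions of roughly 23.5 × 15.6 mm and the 10mm end of the lens:

\[
\theta = 2\arctan\!\left(\frac{d}{2f}\right)
\]
\[
\theta_{\text{horizontal}} = 2\arctan\!\left(\frac{23.5}{2 \times 10}\right) \approx 99^\circ,
\qquad
\theta_{\text{vertical}} = 2\arctan\!\left(\frac{15.6}{2 \times 10}\right) \approx 76^\circ
\]
\[
\text{full-frame equivalent focal length} \approx 10\,\text{mm} \times 1.5 = 15\,\text{mm}
\]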
  3. Matter of taste and personal choice. Just because someone has a strong preference for one system over the other doesn't make them a 'fan boy'. There's a lot I like about Mac OS over Windows and all of it is to do with functionality, security, driverless plug & play, uninterrupted creative flow, aesthetics, app ecosystem, app exclusives like EditReady, codecs like ProRes, file encryption, Finder vs Explorer and a better user experience (for me) - and nothing to do with the brand "Apple". Hackintosh is great, but even when you have every single aspect working fine, there are a lot of errors in the logs that go unnoticed but do point to it not being quite right, and updates could break things at any stage. Mine has been rock solid, it's just the wireless stuff... At one point the WiFi was dropping out when I plugged things into USB! Since I'd rather be editing videos than installing custom KEXTs, I think going back to an official Mac would be a creative move. With Windows 10, yes, I know it has come a long way since 7 (shudder), but the look of the UI still makes me vomit. Not exactly easy on the eyes, is it? And the Control Panel - holy crap - what a stinking mess. Two in one - an old one and a new one - the new one is practically useless - the old one is from 1995! WHY?! Why have two?!
    2 points
  4. Inazuma

    This guy is hilarious

    Everyone has different experiences. I really disliked the Mac interface when I tried it years ago. Windows gives me no problems.
    2 points
  5. mercer

    Lenses

    So, as much as I love my Canon lenses, I am having a lot of fun messing around with vintage lenses and ML Raw. I still cannot believe how amazing the Minolta 35mm 1.8 is on the 5D3. Here are a few more shots... My next big goal is to convert a 58mm 1.2. But in the meantime, I think I'm going to buy a Helios... it is one vintage lens that has eluded me for the past 5 years. I found one that is listed as New Other in EF mount? Did the Soviets make a Helios in EF mount? I believe it's the 44-M 7... maybe? Also, has anyone used any of the new Zenit Helios lenses? I saw a 50mm 1.2 that looks intriguing. And here's one more from the Minolta. Also, I want to add, so there isn't any confusion for users who are unaware... Minolta lenses will not adapt to Canon EF without the lens being modified. There are adapters, but they have a corrective glass element inside that is not of good quality.
    2 points
  6. Honestly, I was trying my best to keep an open mind and not disagree with you, but if this is what you think, without even experiencing it in person, then I disagree with everything you've been saying. It's frustrating that you've already come to a conclusion without actually experiencing it. HDR is more nuanced. It's more detailed. It's more pleasing in every single regard, including color. If you can't see that in the iPhone photos, fine, but take my word for it - in person, the kind of saturation in the 2K BD is gross. As is the contrast. I'm sorry again, but you're just not speaking from a credible position here. If by saturation you mean there are fewer variations in color (i.e. the technical computer definition), sure, but it makes the image look like it was shot on an iPhone 4. I could easily make the HDR look just as sh*tty as the SDR version if I cranked up the saturation and fake contrast controls on the TV. Maybe I'll do that later to show you. This is literally the statement where you lost me completely. This is absolutely 100% unequivocally incorrect. SDR colors look like Crayola colors in comparison to the HDR version. I guess I disagree with your definition of "rich colors" entirely. Please, PLEASE stop any further disbelief in HDR until you do this kind of true comparison in person. Until then, I don't think you have credibility speaking on this subject.
    1 point
  7. webrunner5

    Lenses

    We want to see pictures of Hot Women, not out of focus Weeds!
    1 point
  8. 2K (not HD) 444 ProRes is about 38MB/sec; ArriRAW (2.8K Bayer for 2K delivery) is 168MB/sec. Yes, it's only about a 77% reduction in file size, which is significant on TV shows but perhaps not on the largest feature films. I suppose "tiny fraction" is an exaggeration. But ArriRAW has its own gamma mapping to a 12 bit container from a dual 14 bit ADC that then converts to a 16 bit signal in the camera. So, if you were starting with the true RAW signal, which is either 28 bit or 16 bit depending on how you look at it, the reduction in file size would be dramatically more. In the case of ArriRAW, the RAW data itself has its gamma stretched (similar to, but different from, Log) to save space. So perhaps ArriRAW is not the best example because it compresses the gamma, too, and a 77% reduction in file size isn't that big for your needs (it is for mine).

  I'm not sure what I "don't get." My own experience shooting 10bit SLOG 2 on the F5 indicated that the codec wasn't well implemented for that flat a gamma, and I ended up not liking that camera when it was first released. (Overexposed by a stop, it's not so bad, and it's better now.) I think what you miss is that most serious shooters are running these tests for themselves. Problems like sensor banding in the C300 Mk II reducing the stated 15 stops of dynamic range, SLOG 2 on the A7S being too "thin", and Red's green and red chromaticities being placed too close are well documented at the ASC and the ACES boards.

  Furthermore, the Alexa is pretty darned good even at 422, which you posit is too thin for log. (And Log C is very flat as gammas go.) Many TV shows shoot 1080p 422 (not even HQ) for the savings in file size. They still shoot log, and the images still have good tonality, if slightly less flexibility than 444 ProRes or ArriRAW affords. Just because a few log profiles aren't all they're cracked up to be doesn't mean log profiles are inherently bad or wasteful.
    1 point
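For reference, the ~77% figure in the post above follows directly from the two quoted data rates:

\[
1 - \frac{38\ \text{MB/s (2K ProRes 444)}}{168\ \text{MB/s (ArriRAW)}} \approx 1 - 0.226 \approx 77\%
\]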
  9. Can you send me a link to the video you mention where you discuss bit depth per stop in various formats/gammas? I want to make sure I watch the right one. It is an interesting topic and worth exploring. There are, no doubt, trade-offs with log gammas screwing with tonality. But by distributing data differently (I believe most camera sensors have 14 bit ADCs in RAW, but that data is not stored efficiently) you can maintain good tonality in a smaller package. Which is the whole point of log capture. No one says it's better than RAW capture, but in the case of the Alexa, for instance, 10 bit Log C 444 is maybe 99.9% as good - and a tiny fraction of the size.

  Furthermore, dynamic range is not the question so much as tonality is. With adequate dithering (or, in the case of most cameras, noisy sensors doing the job for you) you can obviate banding for any given dynamic range at an arbitrarily low bit depth (see the small sketch after this post). At a certain point it'll just be dithered black and white pixels - but no banding! The color and tonality, however, will suffer terribly. I shoot a bit with a Sigma DP2 and I do notice a lot of poor tonality on that camera relative to the gold standard of 4x5 slide film, despite both having poor dynamic range, and even in RAW. I believe it has a pretty low bit ADC.

  While I admire your reasoning and rigor, I agree with @jonpais for the most part. I agree that a ten bit image, properly sourced, will hold up better in the grade than an 8 bit one, but will look the same to the eye ungraded. While I know (secondhand) of some minor "cover ups" by camera manufacturers, none are too nefarious; consistently it's stuff you can figure out for yourself by running camera tests, things people online identified anyway, and which were eventually rectified to some extent. Most camera manufacturers are surprisingly transparent if you can talk to their engineers, and there are white papers out there; however, this is over my head.

  Where I disagree with Jon is his statement that a given log profile from any camera is adequate for HDR content production. In theory, since HDR standards are poorly defined, this might be true. But it doesn't mean it's giving you the full experience. My only exposure to HDR (other than displays at Best Buy, and trying HDR video on an iPhone X) has been a Dolby 10,000 nit demonstration and a few subsequent conversations with Dolby engineers. The specs I was given for HDR capture by them were 10 bit log capture or RAW capture, Rec.2020 or greater color space, and 15 stops of dynamic range or greater. Of course, there are many HDR standards, and Dolby was giving the specs for top of the line HDR. But still, this was the shorthand for what Dolby thought was acceptable, and it's not something any consumer camera offers. They are, however, bullish on consumer HDR in the future. FWIW, the 10,000 nit display is mind-blowingly good.

  Just because Sony seems to be careless at implementing log profiles (which is weird, since the F35 is excellent and the F3 is good, too) doesn't mean log profiles are universally useless. The idea is to compress the sensor data into the most efficient package while sacrificing as little tonality as possible. The problem arises when you compress things too far, either in terms of too low a bit depth or (much worse) too much compression. I do see this with A7S footage. And I think it's the reason Canon won't allow Canon Log 2 gammas in its intermediate 8 bit codec on the C200.
  I realize you wouldn't consider the Varicam LT and Alexa consumer-level, but the images from their 10 bit log profiles are great, with rich tonality and color that does not fall apart in the grade. Furthermore, I suspect the C200's RAW capture would actually fulfill even Dolby's requirements for high-end HDR, and $6000 is not that expensive considering. Out of curiosity, do you use Canon Log on the C100? It's quite good - not true log, but well-suited for an 8 bit wrapper.
    1 point
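A minimal sketch of the dithering point made above, assuming only Python with NumPy (nothing camera-specific): quantising a smooth gradient to a very low bit depth produces a hard staircase (banding), while adding noise before quantisation trades the staircase for grain whose local average still tracks the original ramp.

```python
import numpy as np

ramp = np.linspace(0.0, 1.0, 4096)   # smooth 0..1 gradient
levels = 2 ** 3 - 1                   # quantise to 3 bits -> 8 code values

# Plain quantisation: only 8 distinct values, i.e. visible stair-stepping (banding).
hard = np.round(ramp * levels) / levels

# Dither: add +/- half a quantisation step of noise before rounding.
noise = np.random.uniform(-0.5, 0.5, ramp.shape) / levels
dithered = np.clip(np.round((ramp + noise) * levels), 0, levels) / levels

print("distinct values, hard quantised:", len(np.unique(hard)))
print("mean abs error, hard:     %.4f" % np.abs(hard - ramp).mean())
print("mean abs error, dithered: %.4f" % np.abs(dithered - ramp).mean())

# Averaging over small windows shows the difference: the hard-quantised ramp
# averages to a staircase, the dithered one averages back to the smooth ramp.
window = 64
hard_avg = hard.reshape(-1, window).mean(axis=1)
dith_avg = dithered.reshape(-1, window).mean(axis=1)
ramp_avg = ramp.reshape(-1, window).mean(axis=1)
print("max local error, hard averaged:     %.4f" % np.abs(hard_avg - ramp_avg).max())
print("max local error, dithered averaged: %.4f" % np.abs(dith_avg - ramp_avg).max())
```

Note that the per-pixel error actually goes up with dither, which is the tonality cost the post describes; what disappears is the staircase structure that reads as banding.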
  10. Yeah, and with Luca's NX-L adapter the NX500 is even better... Super 35mm 4K in your pocket for cheap, with EF mount. Colour is much better than the A6000 too!
    1 point
  11. mercer

    Lenses

    Thanks, Marty. I appreciate that. I’m trying, sometimes with more success than other times.
    1 point
  12. PannySVHS

    Lenses

    Holy cow, Matt! That is some massive color! Is it filmed at a higher bitrate with Magic Lantern? @mercer Glenn, your shots and color look like cinema. Not just cool but DOPe!
    1 point
  13. I had that exact machine. Sold it for a 2015 iMac. After testing a 4GHz model with the 4GB video card and just 16GB RAM, it was much faster than my MP and had a much, much nicer monitor. I sold the trashcan and bought a loaded late 2015 off Craigslist for about half the price of new. My current machine has the 27" 4GHz i7, 32GB RAM (easy upgrade from OWC, don't pay the Apple tax for stock RAM), 4GB video card and a 1TB flash drive.

  The problem with the trashcan is the display lag and how it scales text and icons - the older video cards' performance with 4K monitors is only so-so, and there are few supported monitors. 2013 was really the beginning of 4K display support, so performance isn't great. There are better monitors now, but the hardware is going to be a bottleneck - it's just not great with a 4K monitor. I tried mine with a few different monitors and it consistently wouldn't see the monitor when waking from sleep or on startup if there was any sort of power outage (I live in Florida when not traveling; thunderstorms cause frequent power outages), so I had to plug an old 1080 monitor into the trashcan and open the monitor settings to re-establish the connection. Scaling text and icons was a constant struggle too; I had to keep changing the sizes of everything when re-doing the connection to the 4K monitor, because every time it made a new connection it would revert back to the stock settings of impossibly small icons and text. It was a royal PITA to have to do this on a regular basis.

  The late 2015 iMac update quietly added a wide gamut P3 monitor; it's the sweet spot in the recent iMac lineup. You can find a fully loaded machine like the 4GHz model I bought for under $2000 US - some are selling in the $1700 range on eBay - and it'll wipe the floor with the trashcan. Just my 0.02. Chris

  P.S. It was/is the quietest computer I've ever used though; how they achieved that much power without the crazy cooling PC towers use is amazing. Still, it wasn't enough for me to keep it.
    1 point
  14. PCs can be smooth, fast, all that, I'm not denying it. However, security can be an issue. Lots of malicious PC-oriented stuff out there. Expert users will be more careful, install safeguards, etc. But Mac users - of all skill levels - don't even need to THINK about it. It just isn't a consideration. Keep it updated automatically, erm, that's about it.

  Little aesthetic things - there's more user interface and experience consistency across all visual elements of the Mac OS desktop, Finder and the apps you have open. The Windows Start menu is a piece of shit, as is the Control Panel. So tacky. Makes me feel cheap inside. Look at the icons in the Mac OS Launcher. Works of art. Look at the P3 5K display and the colour you get on the latest iMacs. PC monitors are 90% plastic junk and look shit on the desk. The exceptions are exceptionally expensive, yet you get an aluminium 5K cinema P3 display practically for free on an iMac. It's these things that add up and make you feel like you are using something well engineered and well designed rather than just a work tool.

  Drivers. Hate drivers. A Mac so rarely needs them, unless it's an incredibly custom piece of equipment; again, this is not even something Mac users think about. No need to tinker. No need to keep them updated manually or search the web. Windows can't even PRINT something without drivers having to be installed sometimes. And the constant updates... jesus christ.

  Windows 10 feels like a more up-to-date facade on an old building. The structure feels like it is creaking. The two Control Panels situation is a mess. The boot screen is ugly as shit. The need to make it all work with infinite combinations of hardware involves fundamental compromises in long-term reliability and security. It can be "just working" and then one day it isn't. Retina display scaling is atrocious in 4K. Fonts look like crap.

  So that's why I do 99% of what I do on a Mac and leave my Windows rig as a Hackintosh with the option to occasionally boot up in Windows 10 for gaming or VR. Don't get me wrong - the hardware is super powerful. But then so is a Mac. People think they are simplistic, dumbed down and underpowered. It's not true, and the power that is there is utilised better, like a console. The system OS is matched to the hardware 100%, whereas on a PC it is full of fallback legacy functionality and compatibility strands.
    1 point
  15. Hi guys, I just bought a GH5s and I am planning to use the C4K mode a lot for weddings. But I have a question about it. Since most monitors and televisions are 16:9, either 1920x1080 or 3840x2160, I think it is best if the films I make for people are exported at 3840x2160. That way I am sure their television isn't doing any weird scaling. I usually add black bars in post to crop to 2.35:1, but I deliver in 3840x2160 with the crop bars hardcoded in the export. This is what I have been doing for a long time now, but now that I have C4K, or 4096x2160, at my disposal, what would be the best workflow in Premiere? I was thinking that I should leave my sequence settings at 3840x2160 and, when I add the C4K files, just let Premiere 'scale frame size' them to 3840x2160. In other words, the C4K files are 'zoomed out' to 93.75%. Then just add the crop bars like I normally would and export as 2160p. Does that sound good to you guys? Another reason I am thinking of doing it this way is that I use multiple cameras and not all of them have a C4K mode. So mixing them on the timeline becomes easier and nothing is upscaled. Any disadvantages to this method? Or do you know a better way of working and delivering with these formats? Thanks!
    1 point
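For what it's worth, the scaling numbers in the question check out, assuming Premiere scales the C4K clip uniformly to fit the 3840 px sequence width as described:

\[
\frac{3840}{4096} = 0.9375 = 93.75\%, \qquad 2160 \times 0.9375 = 2025\ \text{px active height}
\]
\[
\text{2.35:1 picture height at 3840 px wide: } \frac{3840}{2.35} \approx 1634\ \text{px}
\]

So the scaled C4K frame (about 2025 px tall) still more than covers the roughly 1634 px visible inside the 2.35:1 letterbox, and nothing gets upscaled.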
  16. focal excursion? Somehow that sounds obscene! What are you going on a Safari or what. I don't get it?
    1 point
  17. Bit depth and DR are two different things. I don't see why you expect one *must* change just because the other changes. (Yes, they might, but that doesn't mean they're always connected.)
    1 point
  18. Me too. I'm bored with video/photography lately!!!!! Again, Jon, I appreciate that you're reporting back from HDR land! I can definitely see how dual ISO, or rather dual-voltage tech, could fit right in!
    1 point
  19. As for serious CGI (he mentioned that in connection with Jurassic Park and so forth), you have to consider that in 2018 this still involves so-called render farms. Any traditional desktop machine won't cut it. These complexes are so expensive that you need exchangeable parts to keep them up to date. Compact size and design (Apple's speciality) are not asked for. More and more, the GPUs do most of the work, preferably in real time (we are on the verge of real time already, see Blender's EEVEE), and it's paramount to be able to swap graphics cards AND motherboards with no restrictions from a hermetic OS or undersized housings. Once you have successfully built a setup for the software you are working with, the operating system doesn't make a huge difference anymore. I use an iMac because it's a simple all-in-one machine, not overpriced if you count the parts and the time needed to frankenstein them together for a Hackintosh. What I don't get is why anyone would want macOS for Adobe, where it notoriously performs worse than on Windows 10...
    1 point
  20. To me it was obvious what the #1 focus of ARRI's demo reel for their new camera was: skin tones.
    1 point
  21. Inazuma

    This guy is hilarious

    Andrew is an apple fanboy confirmed
    1 point
  22. It seems to me that the low frame rate is the only problem. If you play it back at 1/4 speed you notice that it moves the same amount in each frame, and it's not actually jumping back and forth (which is what I consider jitter). It almost feels like the frame rate is less than 24 fps, but I don't have the time to measure it. If the shot were more blurry it would be less obvious, but I would say increasing the frame rate would be the best way to go. This tack-sharp kind of video isn't emulating film anyway, so there's no need to keep it at 24fps.
    1 point
  23. OK. I own and use the NX1 and the A6500 too, and shoot high-end architecture / real estate / property and interiors too. I've had these issues too (with both cameras) but have now got rid of them. I will write down my detailed experience and some solutions later today, as I'm now heading out for some sport.
    1 point
  24. Kisaha

    new updates NX1 ???

    GH5 is not "marginally" better than the GH4. That is not even close. I do not think that we will see another NX camera.
    1 point
  25. He has a computer that has lasted 3 years and he thinks that is some sort of record or something? That thing is damn near new. There are tons of computers, Mac and PC, that are still going at three times that age. WTF. For what he paid for it, it ought to last until he is dead.
    1 point
  26. Still nothing to say it is NOT a Sony sensor. Until I see a Panasonic executive swear under oath that the sensor in the GH5s did not come out of a Sony factory, I will accept it is a Sony sensor. It doesn't matter who makes it though, as long as it works as intended in the camera it is in (and that it seems to do). All the makers are pretty incestuous anyway (e.g. Sony and Nikon are still the two biggest shareholders in Olympus, I think?). I would love to have a GH5s and maybe I will at some point, and that would apply even if the sensor was made by Kraft or Lego or whoever.
    1 point
  27. I am surprised Canon don't do their own EOS spinner. It would be lacking some features though to protect the higher end model.
    1 point
  28. This is hardly a rumor; it was simply a three-line response to the countless requests for comment that says "maybe you will get a gift while you fight". This means absolutely nothing, and out of the very long list of what this could mean, an NX3 is very very very far down that list. Here is my list of what this could mean, in order of probability:
  1. Probably means nothing and we are not getting anything.
  2. A software update that helps with compatibility with newer phones.
  3. A software update that helps with compatibility with newer phones & gives minor phone interface improvements, because it was easier to connect their new cell phone camera app to the NX1 than to make the old app work with the newer cameras.
  ...
  10,000. They have taken all the tech they learned with the NX to make a concept camera of the future for display at tech shows, and to showboat what they could do if they wanted.
  ...
  1,000,000. They have an NX3 that will be released in two years that is nothing like a camera as we know it, and more like a cellphone with a 1" sensor like all the other cell phone companies are coming out with now. (Actually, this is probably not that unlikely, but it really has nothing to do with cameras or the NX1.)
  ...
  10,000,000. They have an NX3 that is an iteration of the NX2 and can use NX lenses.
    1 point
  29. HurtinMinorKey

    DIY VR180 Setup

    Thanks! I've posted in the BMD forums and will see if I get lucky!
    1 point
  30. Yes, this is crap about YouTube; it should remember your connection speed each time and only drop the quality if your connection is suddenly a lot slower or you are downloading a 100GB game patch on Steam in the background. Vimeo, meanwhile - I agree with all the comments about their reliability... It's pretty shocking. I don't know how such a well-funded and successful business can continue to neglect the basic fact that they need to deliver their core business... Streaming. Videos. Smoothly! I suppose there is a balance to be had between YouTube's pixelated first 10 seconds and Vimeo's 10 minute wait before even one frame moves.
    1 point
  31. Sage

    GH5 to Alexa Conversion

    Working on something now that I think you're going to like ; )
    1 point
  32. keessie65

    new updates NX1 ???

    On Facebook there is now a discussion about a new NX-3 coming. Who knows more about this rumour? https://www.facebook.com/groups/SamsungNX1.for.video/permalink/1123178037816446/
    1 point
  33. For every blogger like you, Andrew, there are 10 sponsored by Canon. Who's going to be louder? As I've said before (and you know it too), Canon only works with those who are as positive about them as possible. Here's a new example: on a completely unrelated site, someone was saying that the best camera for vlogging is a Canon 6D because some YouTuber recommended it. Turns out Canon has been sending 6Ds to high-profile YouTube vloggers to push the camera for that purpose. You can't deny marketing like that is effective.
    1 point
  34. 1 point
  35. I agree. There is no way in hell Panasonic comes up with a sensor that is that drastic of a change from the GH5 in that short of time on their own, no way.
    1 point
  36. I really don't think Philip Bloom ought to endorse any product at all. Test them, tell it like it is, and don't take money doing it. Anytime money is involved it is easy to get the results they want you to get, maybe not the truth. Now, if he needs money, go out and shoot for a living and give up reviews. Pretty simple concept. I just don't think a person ought to do both for money. And he ain't doing this Kinefinity thing just for the hell of it. He is getting gear, or money, or both.
    1 point
  37. In the article, it says that Paramount did a couple of test screenings and found it to be too intellectual... which explains a lot... I remember being in a second-year psych course at university back in the early 2000s, and the professor asked a question - can't remember the actual question or what prompted him to say this - but I remember him saying it almost on a consistent basis: "You are technically smarter than 85% of the population." Which made me think... it is true in some respects; you need to get the top grades (85% or more) to get into university... but is this true with respect to everybody? I'm starting to believe it is... I've been running into a lot of them too, and I'm sure some of you guys have too...

  Examples: some cashier that can't do basic math, a barista that can't get a simple order right. Ever get a time when you specifically say "no mayo" or something similar, and you repeat it a couple of times too, and they say "yes, yes... no mayo..." - and the sandwich comes back with mayo. Then you look at the sandwich and ask what the hell is this... and she is looking at you weirdly... This has been happening to me on the regular. And I generally go when it's not that busy, because you don't want to hurt their pea brains with small requests like this amongst all the other requests. I know... the above sounds a bit petty, especially if you've never experienced it. But if you experience it on the regular... I especially try to avoid confrontation too... I completely try to avoid the dumb peeps... I always say, these people work here for a reason.

  Going back to this movie: if it's only going to appeal to the top 15 percenters... then I too would choose those that know how to subscribe to and use Netflix.
    1 point
  38. If it's *just* about money, if the only votes that count are those of the shareholders, this industry is doomed. Applicable to everything. If we are measured by the degree to which we can be exploited, our kidneys will be sold and the rest becomes soap. Watch The Cooler. The old casino mafia ruled this frivolous business with cruelty - and passion. Then the bankers appeared and took over. And the world turned to shit.
    1 point
  39. Matt Kieley

    Lenses

    I had to sell my G7, so to get the most out of my EOS M, I'm sticking to crop mode with c-mount lenses. I've added a Cosmicar 6mm 1.2 and a Tokina 12.5-75mm 1.2 to my c-mount lens collection. I tested all of them along with the Meteor 5-1 17-69mm lens from my K3 16mm camera. All are tested wide open. I don't plan on shooting wide open all the time, but I wanted to test sharpness and DOF at wide-open apertures. The shots: Tokina 12.5-75mm at 12.5mm, Tokina at 50mm, Meteor 5-1 at 17mm, Meteor at 69mm, Cosmicar 6mm, Cosmicar 8.5mm, Cosmicar 12.5mm, Kern Paillard 26mm. Love this set-up so far.
    1 point
  40. Shoot 4k. Use a fader ND. Shoot c-log. Expose to the right. I’ve found the compression is much much better wide open/nearly wide open. Maybe this is the lens, but the f11 stuff I shot came out mushy. It is good for what it is. If it had a 2.8 constant lens, this camera would be a beast.
    1 point
  41. Never owned one, but I can't believe no one has replied. I think Mercer had one?
    1 point
  42. Oh, I won't argue with any of your comments. If you can only have or afford one camera, you are not going to buy a GH2. A G7 puts one to shame overall. These new cameras make everyone's life easier, but I just think beautiful 1080p output is, well, just beautiful. We just seem to be losing that in these new cameras.
    1 point
  43. I've certainly seen some nice shots from a hacked GH2. Having never owned one, I always have to imagine how good it would look without the compression. It's funny because the GH5S is bringing back the multi-aspect sensor - something the GH2 already had and we lost with the GH3, GH4, and GH5. To my eyes, the GX80 has a slightly different color algorithm than the GH4, and I'm satisfied. The GX80's size, colors, 4K, and IBIS trump the hacked GH2's 1080p, but right now the GH2 is half the cost - not bad. Personally, the GX80 is still a value option that's hard to beat. It gets the job done for me. I still haven't seen any features in any of those new Panasonic cameras that would merit the price jump. Secondly, the GX80's successor is going to have to be pretty damn good to convert me.
    1 point