Everything posted by maxotics

  1. I watched Dave Dugdale's review of the GH4 last night. When Dave, about the nicest, least confrontational camera expert on the internet, says he's going to sell all his Canon gear, then Canon has a problem (I often drive Andrew nuts by saying "if you don't have something nice to say then..."). In other words, if Canon has lost Dave, it is just a matter of time before they lose the consumer video market. Andrew has already taken care of us enthusiasts ;) Dave makes a very strong argument for Panasonic. The first point is that the video, in detail, makes all the Canon cameras, even the 5D3, look blurry. He actually shows you. He then goes on to full-frame's first trump card--shallow DOF. He shows one of the most interesting focal-reducer (Metabones) experiments, demonstrating how you can get very close to a good shallow DOF with MFT. Then, with low light, Canon has an edge, but not by much. He did his review before the Sony a7. NOTHING in Canon's arsenal can compete with that. Dave caps it off with what I feel very strongly about too--the smaller the camera, the more one uses it. At this writing, I believe Dave's main two bags are GH4 and A7S. If you read any marketing book, it says somewhere that it is 10x easier to keep a customer than to get a new one. Like many posters above, I don't care what Canon does. I can barely keep up with Panasonic, Sony, and Blackmagic. And if I REALLY have to have the best full-frame, sorry Canon, I'll pick the higher DR of Nikon.
  2. It's hard to read this topic because I dislike talking about phones almost as much as I hate talking on them :) From my photography side, I can't see the CM1 replacing my Ricoh GR, which I bought a few weeks ago (and wish I had bought a long time ago). In photography, the larger the sensor the better. I have seen no way around it. Video benefits from smaller sensors, and MFT seems to be the sweet-spot compromise. The video on the GR is full of aliasing/moire (it has no AA filter, I believe). It's barely usable. The video on my a6000 is very good, but I can't fit the camera in my pocket. As a photo camera the only GR peer is the Sigma--but that camera makes a huge trade-off in speed for low-ISO IQ. Here's such a snap with the Ricoh GR, a barely-out-of-pocket shot > Here's a shot where you can see the benefits of a prime lens matched to a large sensor. The Sigma would have been better, but it would never have been able to take the shot above. The CM1 can probably take something that looks good in a small print, but would fall apart when larger. > On the Android versus iOS thing. I bought an Android phone (the Samsung S3) because I wanted software that allowed me to control cameras. Unfortunately, I feel Google/Android has been a failure here. Theoretically, with Android's open architecture, someone should have been able to create an application that controls 3 GH4s or a7s and an external audio recorder, gets them to start at the same time, remembers their file names, and then gives you the ability to sync all three shots before hitting your NLE (see the sketch below for the kind of bookkeeping involved). Obviously, the camera manufacturers have to create APIs and camera-side hooks (Sony has, but doesn't seem to be gaining much traction). Panasonic seems to have the best Android app... but I don't think it's any better than the iOS one. The good thing about this phone, and the reason I'm happy Andrew is reviewing it, is that it DOES have a lot of promise from the software side. I remain hopeful.
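A minimal sketch of that multi-camera controller idea. To be clear, `CameraClient` and its methods are entirely hypothetical--no cross-brand API like this exists--the point is only the trigger scheduling and start-time bookkeeping such an app would need before you could auto-sync clips in an NLE:

```python
import time
from dataclasses import dataclass, field

@dataclass
class CameraClient:
    """Hypothetical wrapper around one device's Wi-Fi remote-control API."""
    name: str
    clips: list = field(default_factory=list)

    def start_recording(self, label: str) -> None:
        # A real client would send the vendor's "record start" command here;
        # we just log the wall-clock start time so clips can be aligned later.
        self.clips.append({"label": label, "started_at": time.time()})

def start_all(cameras: list, label: str) -> None:
    """Trigger every device as near-simultaneously as possible, then report
    each camera's start offset for syncing the takes in post."""
    t0 = time.time()
    for cam in cameras:
        cam.start_recording(label)
    for cam in cameras:
        offset_ms = (cam.clips[-1]["started_at"] - t0) * 1000
        print(f"{cam.name}: '{label}' started {offset_ms:.1f} ms after trigger")

rig = [CameraClient("GH4-A"), CameraClient("GH4-B"), CameraClient("audio-recorder")]
start_all(rig, "scene01_take03")
```

In practice the per-device offsets would come from network latency, which is why an app like this would record them rather than assume simultaneity.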
  3. All scientific and artistic advancement is built on a pile of discarded and seemingly useless ideas and findings. One man's garbage becomes another's treasure, and the other way around too :)
  4. About "4K, so what, it isn't all about resolution". I'd extend your answer to point out that most 1080 is not really 1080 in actual resolution, because of the space between pixels on a sensor and the resolution problems it creates. This is obvious to anyone who pulls their hair out trying to eliminate moire! 4K downscaled to 1080 virtually eliminates moire in most circumstances. More importantly, a pixel in any Bayer camera (which is all of them in video) only captures one of the three color values (red, green, or blue) and borrows the rest from neighboring pixels. When you downsample 4K to 1080 you are getting 4 original color values into each pixel (averaged from the 4 in the 4K image)--see the sketch below. Hats off that you reviewed this camera! If I were shooting, say, concerts in good light, this would be a hard camera to dismiss, for the reasons you point out.
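A minimal numpy sketch of that averaging step--a generic 2x2 box downsample, not any camera's actual scaler. Each 1080 pixel becomes the mean of four measured 4K pixels, which is why both single-pixel aliasing and interpolated Bayer color improve:

```python
import numpy as np

def downsample_2x2(img4k: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of a 4K frame into one 1080 pixel.
    img4k: (H, W, 3) array with even H and W, e.g. 2160x3840."""
    h, w, c = img4k.shape
    blocks = img4k.reshape(h // 2, 2, w // 2, 2, c)
    return blocks.mean(axis=(1, 3))

# Toy 4x4 "4K" patch -> 2x2 "1080" patch: every output pixel is
# built from four measured samples instead of one, so single-pixel
# artifacts (moire, interpolated Bayer color) are averaged down.
frame = np.random.rand(4, 4, 3)
print(downsample_2x2(frame).shape)  # (2, 2, 3)
```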
  5. In Andrew's first look at the NX1 he compares against the LX100. http://www.eoshd.com/2014/11/samsung-nx1-review-glory-technology/ Also, I'd search "LX100" on vimeo for other sample footage.
  6. In my 20s I would probably have run people over in the street to make sure I got to a movie on time. If I missed even 10 seconds it was totally ruined for me. It would NEVER have occurred to me to leave a movie in the middle--I never did. If people were talking I physically grabbed them and told them to shut up, no matter how many of them there were. Thirty years later and I can live with missing the beginning of the movie (though I try not to), and I now find it easy, not hard, to walk out of many movies. As for people talking, ironically, I now often find their commentary a lot more entertaining :) Daniel, the point of all my comments is that Interstellar could have been a lot better. I didn't walk out of it. It was watchable. There were good ideas in it. However, I believe that when YOU get a chance to watch more of the classics you will not only become inspired (again, you don't have to go back far, GATTACA is already a classic, at least in my book), you will get more enjoyment from good films. That is, when I saw GATTACA I was BLOWN AWAY. It is a great feeling, when you've watched so many movies, to see something new and great. I felt the same way watching "The Wire". When I saw "The Fifth Element" it got horrible reviews. I thought it an instant classic. I agreed with some of what the reviews said, but the cool way he brought fashion into science fiction was beautiful to behold. Since that movie I have dreamed of a day when stewardesses dress like that :) That's why I ask about Interstellar. Is there something in it that, in time, will be recognized as beautiful/interesting? If I've missed something, maybe I'll watch it again.
  7. As Andrew pointed out, the NX1 uses a new codec, which is a pain even for people who work in this stuff all the time. If you want low light, the A7S is your camera. If you want ergonomics and features for movies, and haven't used a Panasonic before, then the GH4 will NOT let you down. I'd only get the NX1 if you had money to burn and you just wanted to have fun on the bleeding edge. As nice as the footage is, RAW delivers better organic dynamic range, IMHO. If any camera keeps looking like a steal it's the LX100! At least on the web, it looks as good as anything out there.
  8. My question is more about how important external recorders are if, in the end, you're going to end up with 1080 (assuming no cropping in post). So do you really need an external recorder for the a7s if you're going to use 1080 footage? Or is the 1080 from the LX100 as good as its 4K downsampled in post? I understand that you always want the highest-data footage. Maybe this is a bad question ;)
  9. This is a question I think may deserve its own thread. Understood if you move it, Andrew. How does the a7's in-camera 1080 compare to its 4K downscaled to 1080 in post? Same for the Panny GH4 and LX100, etc. Does the a7 with externally recorded 4K end up as better 1080 than the camera's internal 1080? Does internal GH4 4K scaled down to 1080 end up as better 1080 than the a7's internal 1080? Can you end up with better graded footage from GH4 4K than from a7-source 1080... etc. How important will an external 4K recorder be for the a7s?
  10. I believe you can compare all films. Agreed, Hollywood films and independents have different audiences, but they use the same material and devices and have the same goal of getting the viewer to say, "I trusted this person to entertain me and they did." I don't get a lot of enjoyment from Wes Anderson's films. I enjoyed Rushmore because I felt it was my life on screen, even my name! :) However, I understand what he's trying to do and respect the film-making decisions he makes, because they work great for my wife and kids. They love his films; many of my friends do. If Hmcindie enjoyed Interstellar, then who am I to knock it? I'm glad he did. Anyway, I can only criticize Wes Anderson about what I don't like, not about his fundamental film-making, which is excellent. Or put another way, I really appreciate Anderson as an artist even if it isn't my cup of tea. The problem with Interstellar is that the director doesn't seem to make choices based on style, but on wrong-headed concepts of what makes a story emotional. It isn't extremely shallow DOF, or a 3-hour running length, or a father who goes into space without a proper goodbye from his daughter. As Bioskop said above, Nolan should have focused on what he's good at--the crazy science-fiction time-travel stuff. I wouldn't know how to write a better script than Nolan, but I know there are people out there who do know. I do know enough about science to have improved the movie, but at my age this isn't bragging or ego. I WANT to be blown away by mass-market science fiction movies. My guess is that, despite all his fame and money, Nolan is too insecure to hire people more talented than him in their respective areas. If the spacecraft in Interstellar are going to be able to land and take off from planets with twice Earth's gravity, then DO NOT show them being put into space by an Apollo-era rocket. It's NOT necessary for anything in the movie. Again, with Anderson, I might not like an effect, but I can't argue that the effect fits into his style. In the end, I enjoyed Grand Budapest, a type of film I don't normally like, a lot more than Interstellar, which is a type of film I naturally enjoy (so I enjoyed Pacific Rim and didn't care how corny it was).
  11. Hi JCS, I can accept any fictional time-space theory. It was the blatant small-science sloppiness that got to me. When Flash Gordon became a hit, a long, long time ago, you knew it was inaccurate that rocket exhaust would flame upwards in level flight, but you also knew there was no other way for them to get the effect. They created a model, put a Roman candle in it, and filmed. In Star Trek you knew it made no sense that the ship's phasers would each go at an angle to the target, but you also knew it would look stupid on your tiny little TV if they showed it as a straight line (unless it was a little blip of photon torpedo). I'm sure you get my point. Historically, the problem with special effects was they often just couldn't film what would be more believable (from a science point of view) in a visual fashion. The tech just wasn't there. But the filmmakers KNEW what they would have done if they could have. Today we have F-35s with vertical thrusters, which were used in Harrier jump jets beginning in the 1960s, and pretty much common knowledge by the 1990s. Yet the design of the spaceships in Interstellar is 1930s. If the water wasn't 1-foot deep on that planet, what would have happened then? The whole movie I felt the director was saying "f-you, I'll make my spaceships any way I want because I'm THE genius here. I know that because my last movie made a billion dollars." The problem is rampant in movies, maybe always has been. Like Redford's "All is Lost". I'm no sailing expert, but even I noticed all the unrealistic shots. For example, if there is one thing marine electronics are, it is WATERPROOF. Yet we're to believe they all short-circuited. It's the audience's vanity that they want to be part of something special so badly they can't say the emperor has no clothes. My daughter says most of her friends would look at her as a freak if she said what she really thought about Nolan's movies. As for Interstellar's cinematography: again, I don't see anything special there. The b-movie "Moon" was 100x more evocative of space and interstellar psychology. There was one scene, in Interstellar, where the organ music is building and building and building, and then they run out of fuel and it stops, and the character says something like "we ran out of fuel," and I so badly wanted to turn to my movie companion and say, "no, they ran out of organ music." It was SO obvious. I almost laughed out loud. An actor should have said what Bette Davis said in "Dark Victory": "Max (Steiner), only one of us is going up those stairs."
  12. I've been a film nut since my teens and I'm in my 50s now. Nine times out of ten, I now exit the movie theater wondering why I still bother. In order for me to suspend my disbelief, the filmmaker can't dare me to ignore the incongruous. Like many of Tim Burton's movies, Interstellar came across as a movie made by a visually imaginative person with no interest in subtle character development and story-telling. Am I just too old to get it? What am I missing? To put it bluntly, though the film was watchable, I thought it immature. Inazuma, what about Interstellar did you find incredible? How is it a movie that defines movies? Have you seen the original "The Day the Earth Stood Still", or "2001: A Space Odyssey", or even "GATTACA" or "Contact", which are fairly recent? There was only one thing that Interstellar got right. The black character who waited 23 years, alone, in the ridiculous time warp remained cool, while the white one became a sociopath ;)
  13. Which means they replaced the camera. I think if there wasn't noticeable damage on my GM1 they might have fixed it. That's another thing to keep in mind. I don't know about the GX7, but the metal housing of the GM1 is extremely thin. Which means if you knock it even slightly it will probably get dented. Again, this is a benefit if you want the lightest possible camera. However, it also means that even if the GM1 develops an internal problem they may blame it on you. Sounds like you basically got a new GX7 for 150 pounds. That's good!
  14. Ebrahim, "Tim's Vermeer" is out now (by the guy who invented the TriCaster, btw). Everyone on this blog would enjoy that documentary. In it he explores how Vermeer must have used optical equipment to paint the proper dynamic range in an image. Anyway... To expand on what you wrote, our eyes have about 6.5 stops of dynamic range (and also only about 5 megapixels of center resolution). What gives us 21 stops of dynamic range isn't the mechanics of our eye but the compositing nature of our brain. When you look from bright to dark, the brain essentially HDRs the image in your head (as you say, the iris changes). There are few "blown out" areas or shadows without detail because our eyes adjust. One day, I believe camera sensors will do this. The camera computer will take sections of sensor data, change the exposure a bit, then create one HDR image (a toy sketch of this kind of exposure fusion is below). Right now, they can only do it with the whole sensor (unlike our brains, which pick and choose sensory data). Some people confuse the dynamic range of a camera (a Nikon is close to 20, a Sigma 8, etc.) with the dynamic range of human vision. They are really two different things. The dynamic range of a sensor is the gradations of light it can capture in a scene; however, those gradations are based on a single reference point. When you change one, you change them all. The dynamic range of our vision is not based on one reference point of brightness, but on a reference point of image sense. Our brain is the master photographer. Indeed, philosophically, can we create mechanical images that are better than what's in our head? If we take a Nikon image, say, and want to bring back detail in our blown-out areas, we might change the "dynamics" at a part of the image that's at the 19th stop of DR. However, we don't get full detail to work with, because if we need 6 stops, we're only getting 17 to 20, not 16 to 22. Our eye gets the full 6 at the "window" to use. And it does so smoothly, which is difficult to do in image editors (assuming we have multiple exposed shots) without tons of time and skill. As for it being a hoax: that's certainly possible! However, I don't see how it isn't possible, if you accept that the color will suffer from temporal and other accuracy issues. They can vibrate a color filter over the sensor just as DIYers vibrated focusing glass in those early DOF adapters. What Sony is doing is what they did with the A7S: creating specialized sensors for specialized needs using known technical trade-offs. (Keep in mind, the A7S doesn't get its low light for free. It loses DR and resolution in good light. Sony believes there are enough people who will pay for that trade-off--time will tell; so far, it looks like a good gamble.)
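A toy sketch of that "brain HDR" idea: merge bracketed exposures with per-pixel weights that favor mid-tones, the way generic exposure-fusion algorithms do. This is purely an illustration of the concept, not a claim about how any vendor implements it; the gain values and Gaussian width are arbitrary:

```python
import numpy as np

def fuse_exposures(stack: list) -> np.ndarray:
    """Naive exposure fusion: weight each bracketed frame per-pixel by how
    close it is to mid-grey, so blown-out and crushed pixels contribute least.
    stack: list of (H, W) luminance arrays normalized to 0..1."""
    frames = np.stack(stack)                       # (N, H, W)
    weights = np.exp(-((frames - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize across frames
    return (weights * frames).sum(axis=0)

# Simulate a dark interior (0.02) next to a bright window (0.98), captured
# at three exposures; the fused result keeps usable detail in both regions.
scene = np.array([[0.02, 0.98]])
brackets = [np.clip(scene * gain, 0, 1) for gain in (0.5, 1.0, 2.0)]
print(fuse_exposures(brackets))
```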
  15. "Ergonomic nightmare" is a bit rough. But I agree Canon has the best ergonomics of any camera. That said, you can't get through the viewfinder focus peaking/zoom in the Canons (except EOS-M of course). What I love about the Sony A7 line is it is SMALL (again, Canon 5Ds, love them, but they're big) and the EVF has a lot of benefits in eyeballing exposure and again, vintage glass use. If money and space are no object I don't understand why any pro wouldn't have an a7 and (Canon or NIkon). For my old eyes, Sony is a Godsend.
  16. It's all starting to make sense. These cameras will be gobbled up by sports and wildlife photographers as quickly as wedding photographers/videographers are bringing the A7Ss online. I was just out with my A7 and a6000, with vintage glass. Sony SO gets it :) Thanks for keeping me up to date with these stories, Andrew!
  17. Hey, some evidence I was right in one of my first guesses (sorry to blow my own horn): "single pixel color sample by moving color filter on electrified track". At first I thought the 16-bit A/D sampling was suspicious, but now I believe they're using higher-precision readings to reduce the amount of mathematical calculation--interesting! Andrew, you're probably right that Sony has signal-processing chip strengths here that Sigma doesn't. Quirky, what's with all the negative waves this early in the rumor ;) Photography has ALWAYS, and will ALWAYS, be pushed ahead by crazy ideas. I can safely say that if people weren't into these challenges there would be no photography or film. Sony and Panasonic are doing really cool things from an electrical-engineering standpoint. FORGET THE MARKETING. Will I need this camera? No. I don't have any interest in slow-motion video. But I might one day. You might one day. Don't get angry at the scientists because they work for money-focused corporations :) This sensor will be a specialist product.
  18. Hi Andrew, as you know, Sigma cameras take seemingly forever to write RAW data. I assume it's because, as Benjamin pointed out, there is a lot of heavy math going into the interpretation of color readings. More evidence of the difficult math is that Adobe has never created a RAW reader for these files, and the Sigma software is notoriously slow and buggy processing them. The newer Sigma cameras use a high-resolution top (blue?) layer to make a better compromise between resolution and color. I tried one of the new cameras; still very slow. So the question is, has Sony figured out an electronic or mathematical way around Foveon (vertical color sampling) problems? I somewhat doubt it. Or are they using high resolution, over-sampling, and the less critical nature of video color to sacrifice color accuracy for aliasing-free video at high frame rates? Like the A7S, which isn't anything new, but a chip made the way a low-light videographer would make it, my belief is that Sony hasn't developed a new technology here. They're just making chips that do one thing well, along the lines of what Sunyata said (as you know, as nice as the a7S is, it loses dynamic range at low ISO). Thoughts?
  19. Hi Benjamin, I have not had any "unpredictable" outputs with my Sigma cameras in mixed light sources. I have noticed problems with the red channel, however. So I'm a bit confused about what you said, that the sensor just reads the depth of the light to ascertain color. I'm not saying you're wrong, I'm just saying I'm still confused. How does the Foveon work, then? Does it take 3 separate readings from the sensel and use math to calculate color, or, if it doesn't use filters at each depth, does it have some sort of probe (for lack of a better word) at 3 depths of silicon, one for each color? And if either of those is the case, why don't any other monochrome sensors have color possibilities? That makes it difficult for me to believe that Foveon gets different colors from the same sensel material. As for the Leica monochrome, aren't color filters added after the chip is made? Machine-vision systems still use mostly monochrome, right? Anyway, my point wasn't to bash Leica, only to point out that all sensor designs are compromises in some form and that some photographers will pay a lot to get near full-frame monochrome images (I agree, it's probably an expensive short run from the manufacturer). If Sony is going to use some sort of material for color reading, what could it possibly be? Do you have any idea? I mean, that alone would be game-changing. That the chip is going to have such a high FPS and a global shutter seems too good to be true.
  20. Yes, even whiskey wouldn't hurt this conversation ;) I don't see how they can do it without physically moving some kind of filter over the sensor. One of the problems with sensors, of course, is the space between the actual light-reading silicon. I believe a similar problem is present here: the time between filter changes.
  21. You mean the color filter could be slower than that and work? The Sony sensor might use the technique used in the first color photo, except at a pixel level. The camera would take a red reading in 0.0139 s (1/72nd of a second), shift to the green filter, take a second reading, then shift to the blue filter and take the 3rd reading--all within 0.0417 s (1/24th of a second). Look at these images: http://educacionporlaaccion.blogspot.com/2013/04/evolucion-historica-del-color.html
  22. I use them. I hate everything about them except the final image. If I have good light, and time, and need medium-format-quality stills in my pocket, they are the ONLY game in town. The same for your BMPCC if you want 1080 RAW video in your pocket. My gut feeling is this new Sony sensor is more about semantics than any true pixel-level RGB sampling (like Foveon). The problem is something along the lines of the Heisenberg uncertainty principle: in our world, you can know a light beam's color or its intensity, but not both at any given instant. All sensors are intensity-only. Whether Bayer or Foveon, filters are put in front of the sensor to estimate color. If you think there is NO problem in this, then why does Leica have a monochrome (no color filter) camera that sells for $6,000? (It probably cost them $50 to make it, but no matter ;) ) If you're a B/W purist, those filters degrade your black and white image. So there are only two ways around this problem (unless Sony has discovered an electrically conductive material that can read color): you can stack color-filtered sensors on top of each other (like Foveon) or next to each other (like Bayer). With Foveon, you get true color pixels with little color distortion (if in strong light); with Bayer, you get high-sensitivity color pixels, but when you combine them horizontally you get aliasing/moire problems. Theoretically, you could take a grid of color filters, RGB, and vibrate them across the sensor so that each sensel takes one reading per color. So if you had a global shutter that ran at 72 frames a second, it could take the red in 1/3rd of a 24th of a second, then the blue, then the green. Perhaps they use a seriously precise stepper motor to do this. If the sensels are rectangular, maybe they use that to capture all three colors at any instant, but the vibration changes the pixel-center focus color and the pixel just averages them all together. It's all very interesting stuff, to me at least, but my guess is that though it may make for a good video application, it won't be good enough for still photography (at least professional or enthusiast). The reason is that Foveon's limits come from PHYSICS; they aren't a failure of Sigma. They simply can't find a substance that will take the color value of light and still send enough light to the sensel below it, then the next one below. If Sony can change a color filter over the sensel, it could eliminate color moire problems on a still subject, but if the subject moves then the color may change between filter changes (in that 1/3rd of a 24th of a second). Color problems are back! (There's a toy simulation of this failure mode below.) I believe understanding this stuff makes one a better photographer, or filmmaker, even if it has no immediate practical use on set. For example, if you noticed a lot of moire in the background of your shot you might go out and buy all kinds of blur filters. Every time you got rid of the moire you might find the image not sharp enough. If you knew this stuff about sensors, however, you'd open the aperture up a bit and increase the blur in the background while keeping your subject in focus.
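A toy simulation of the sequential-filter guess, using the timing from the previous post (three reads of 1/72 s each inside one 1/24 s frame). All of it is speculation about the rumored sensor; `scene_rgb` and the edge timing are made up. For a static pixel the three reads recombine into correct color; if the subject moves between reads, the channels sample different scene instants and you get exactly the color fringing described:

```python
import numpy as np

T_FRAME = 1 / 24  # one output video frame
T_READ = 1 / 72   # one filter position per read
assert abs(3 * T_READ - T_FRAME) < 1e-12  # three reads fill one frame exactly

def scene_rgb(t: float) -> np.ndarray:
    """True color at one pixel: a white edge sweeps past at t = 0.02 s."""
    return np.full(3, 1.0) if t < 0.02 else np.full(3, 0.1)

def sequential_capture(t_start: float) -> np.ndarray:
    """Read R, then G, then B through the moving filter, 1/72 s apart,
    and combine the three single-channel reads into one RGB pixel."""
    return np.array([scene_rgb(t_start + i * T_READ)[i] for i in range(3)])

print("static subject:", sequential_capture(0.03))  # [0.1 0.1 0.1] -> true color
print("moving subject:", sequential_capture(0.0))   # [1.0 1.0 0.1] -> color fringe
```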
  23. Any idea how it "scraps the traditional bayer RGB filter"? Manufacturers have been trying for decades. Forgetting everything else, that alone is revolutionary if done without extra noise in one of the color channels. As for what camera they'll put it in. Maybe Nikon will surprise us ;)
  24. Will it be easier for Canon to develop 4K for their full-frame, or for Panasonic to develop a full-frame for their 4K? The video coming out of that LX100 is what 1080p was always meant to be (it solves the lack of color information inherent in 1080-sensel Bayer patterns). I mean, it's SO clean. It's been relatively easy for me to resist the impulse of the false photo god in the GH4. But the LX100's video "photo" mode in a small form factor... oh, the word of Panasonic does sound true!!!! :)
  25. Hi. I've created a fair amount of camera-control-type stuff; here's a robotic panorama thing I built: http://maxotics.com/?p=21 Five years ago it was a choice between Mac or PC; now there is still that, plus iOS and Android. I'm still interested in it. I tried to get people on the Magic Lantern forum focused on a stable version of ML RAW for the EOS-M, but couldn't get there. Anyway, it looks like there are now two people on the forum who can program cameras to do something (a bare-bones example of that plumbing is below). The question is, what kinds of programs/software would filmmakers on this forum want? How many of them want it? How badly? Behind me sit boxes of Arduinos, USB interfaces, old Windows tablets, etc.
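For anyone curious what the plumbing looks like: a minimal sketch that triggers a USB-tethered camera from Python by shelling out to the gphoto2 command-line tool. gphoto2 and these two flags are real; whether your particular body is supported is another matter, and the interval logic here is purely illustrative:

```python
import subprocess
import time

def capture(filename: str) -> None:
    """Trigger a capture on a USB-tethered camera via the gphoto2 CLI and
    download the file. Requires gphoto2 installed and a supported camera."""
    subprocess.run(
        ["gphoto2", "--capture-image-and-download", "--filename", filename],
        check=True,
    )

# Crude intervalometer: one frame every 5 seconds for a short timelapse.
for i in range(10):
    capture(f"frame_{i:03d}.jpg")
    time.sleep(5)
```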