no_connection

Members
  • Posts

    385
  • Joined

  • Last visited

Everything posted by no_connection

  1. Did they solve the freezing bug yet, where it freezes windows for a few seconds for no apparent reason?
  2. I had it, along with an X-Pro1 on loan from a friend, while I thought about buying it because of all the craze about the magic X-Pro1. Don't get me wrong, it was sharp and did a lot of things right, but there was just something about the not-quite-in-focus bokeh that feels wrong. Maybe it was the lens I had, maybe it was me, who knows. I don't remember the settings used, so some in-camera processing might be making it look slightly worse, but it shows what I mean. (It's a 50% crop of a larger image.)
  3. I can see it being nice, but the lack of nice affordable stabilized lenses kind of holds it back for me. I used the 18-55 2.8-4 with the X-Pro1 and the bokeh was worse than my D80 kit zoom, and then there was that horrible hiss. Not sure about the X-T20's RS, but it needs to be way better than the X-T2's to be a serious contender.
  4. Using the anamorphic mode on the GH5 would give limited tilt-shift in post, I think.
  5. Focal length or pan speed does not really matter as long as you can see and measure the effect. In short: you have a vertical line like a lamp post; when you pan at a constant speed you take the distance it moved between two frames, then you check how much the bottom part of the line lags behind the top. Then you calculate the lag as a % of the distance moved. Then you take the frame time, roughly 42ms for 24fps, and apply your % to that. The X-T2 would be at about 72% lag if its RS is 30ms. Which, when you think about it, is quite a lot: the bottom part of the image is lagging a significant portion of the frame behind. The a6300 pretty much lags an entire frame between top and bottom, which is why 30fps had to be cropped (only 33ms to work with).
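The measurement described above boils down to a few lines; a rough Python sketch (the 30 ms readout and 72% figures are the ones quoted in the post, the pixel numbers below are made up for illustration):

```python
# Rolling-shutter estimate from a constant-speed pan, as described above.
# Inputs are measured from two consecutive frames:
#   moved_px - how far the top of a vertical line moved between frames
#   lag_px   - how far the bottom of that line lags behind the top

def rolling_shutter_ms(lag_px, moved_px, fps):
    """Estimate sensor readout time (ms) from the bottom-of-frame lag."""
    frame_time_ms = 1000.0 / fps       # ~41.7 ms at 24 fps
    lag_fraction = lag_px / moved_px   # lag as a fraction of pan distance
    return lag_fraction * frame_time_ms

# Going the other way: a 30 ms readout at 24 fps means the bottom of the
# frame lags about 72% of a frame behind the top.
print(30.0 / (1000.0 / 24) * 100)  # 72.0
```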
  6. I want to know what the E-M10 MIII gets.
  7. I don't see why X-T2 would be incorrect, I think I got about that number when I used the promo clip and calculated it.
  8. I would be more worried about the lithium cells inside a confined pipe, only one way out and that is through a piston with a human mounted on top. It did show how useful a heat camera can be tho, I want one of those.
  9. That is some pretty neat construction. The iPad is probably just using its compass to detect the magnetic field of the permanent magnets in the motor, and as it's a very strong field you don't really need an accurate tool anyway; you can see it saturate the sensor almost instantly. The twitchy part is they don't seem to know where the sensor is located or what it does.
  10. That focus assist was interesting; do any other Panasonic cameras have that?
  11. What is the latency in ms for the different frame rates? Or expressed in frames if you want that. What's with the sudden hate against using existing interface solutions? h264 encoding chips are not new and ought to be cheap at this point, same with the USB side. And when it comes to getting the data, you could just have ethernet as transport via USB; it's not like the 15Mbit stream is that demanding. What would be interesting is if they made a host-mode USB port where you could plug in a wifi adapter: just set it up through the cable first, then it would work as an access point. UDP multicast and you have a wireless monitor, or several depending on how many phones you have. The app side would be fairly similar, I think. Latency would not be much more than cable, maybe 10ms max, and that is insignificant compared to other things. You could add a wired ethernet adapter too, which would be useful as you can add a professional access point and have large-area coverage. I would not rely on recording through wifi tho.
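For illustration, the "UDP multicast and you have a wireless monitor" idea above could look roughly like this in Python (a toy sketch pinned to the loopback interface so it runs on one machine; the group address and port are made up, and a real implementation would be carrying H.264 packets over the access point, not a test string):

```python
import socket
import struct

GROUP, PORT = "239.255.42.42", 5004  # made-up multicast group and port

def make_sender():
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Pin to loopback so the sketch is self-contained; a camera would
    # send out over its wifi or ethernet interface instead.
    s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_IF,
                 socket.inet_aton("127.0.0.1"))
    return s

def make_receiver():
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("", PORT))
    # Join the multicast group; every monitor app on the network does this.
    mreq = struct.pack("4s4s", socket.inet_aton(GROUP),
                       socket.inet_aton("127.0.0.1"))
    s.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    s.settimeout(2.0)
    return s

receiver = make_receiver()
sender = make_sender()
sender.sendto(b"pretend this is a video packet", (GROUP, PORT))
data, _ = receiver.recvfrom(2048)
print(data)
```

Any number of receivers can join the same group; the sender transmits each packet once, which is the whole appeal over per-client TCP streams.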
  12. This has to be one of the more unique events captured!
  13. Looks like a simple HDMI-to-USB encoder with some software on top of it that changes the h264 hardware encoding settings. Latency looks to be 300ms or so; I don't see any numbers for it. It would be trivial to measure, so the omission does raise some concern. A wifi link would not really add much latency or many problems if implemented right, but then you'd have to certify it and that would be a pain, I imagine. With that kind of distance a 5GHz link would be capable of 150Mbit with little issue, as long as the phone plays nice that is.
  14. Someone needs to mod a gopro into the body of one; that would be interesting. As long as the existing sensor isn't too embedded into a cage or something.
  15. No, there are no repeating frames; that is 8 frames blended together so you can easily see that the movement is uniform over time (the distance is the same for each frame). I also measured the edge of the blur compared to the movement between frames and it is 1/50ss. However, half of the dimmer tail of it is very faint, as the bright parts swamp any dark trail, making the knob below the M in 60MM look like 1/100 or even higher. This is expected behavior and is why high contrast is more prone to "strobing" than a more uniform background like a forest. There is also the aspect that it is a linear phenomenon, and the final grade will affect how the resulting "gradient" of motion blur looks. Now there may or may not be some processing going on making the problem worse (I did not read the gh2 thread), and I don't have enough data to go on with only one video, but what we can see is a pretty sharp fence jumping from one place to another, making it look very non-smooth and non-fluid. That is something that needs fixing one way or another if those pans are gonna look smooth. One thing to test: do a pan but take a photo in raw mode and compare that to a similarly shot video frame. That would tell us if there is digital processing involved in camera, or if it is some sensor behavior. Just tossing out thoughts here.
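The blur-edge measurement above can be written out as a small sanity check (a sketch; 25 fps is assumed here only so the 180-degree case comes out even, the post does not state the clip's frame rate):

```python
# Back out shutter speed from a pan: the length of the blur trail,
# as a fraction of the distance moved per frame, equals the fraction
# of the frame time the shutter was open.

def shutter_from_blur(blur_px, moved_px, fps):
    """Shutter time (seconds) implied by a blur trail of blur_px on a
    pan that moves moved_px per frame."""
    frame_time = 1.0 / fps
    return frame_time * (blur_px / moved_px)

# A 180-degree shutter exposes for half the frame time, so at 25 fps a
# trail spanning half the per-frame movement implies 1/50 s:
print(1.0 / shutter_from_blur(50, 100, 25))  # 50.0
```

This also shows why a faint tail biases the reading high: if bright areas swamp the dark half of the trail, the measured blur_px shrinks and the implied shutter looks like 1/100 instead of 1/50, exactly as described above.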
  16. Won't UV light kill off any fungus? Maybe hit it with a high dose of that once in a while, as long as the lens coating and material allow.
  17. Gonna miss it too. Wonder if this guy will post some nice footage. https://www.youtube.com/user/jdbastro/videos
  18. You could test down to 1/30 or so and see if it makes a difference. I'm not entirely sure that is really 1/50; I'll try and measure it. *edit* It does indeed look like 1/50ss, but due to the high contrast a lot of it is washed out, leaving maybe a 1/100 edge or even less. And it is in high-contrast areas where it's worse to begin with, so it might need more motion blur to help. I noticed that the video I made with the first settings you see in my first screenshot looked smoother than the original video, so it might need some blur to look ok at fast pans, which makes sense if you think about it. At slower pans it might be too buttery instead.
  19. Here is maybe a better view. You can see that over 8 frames there is little to no difference in distance between frames. You can also see that the blur on the knob on the fence thingie is very sharp, way less than the 180° of movement you would expect from 1/50ss.
  20. I am not sure what you are referring to as jitter. I analyzed parts of the video where you see the shadow of several frames (think frame-based motion blur) and found no evidence of any difference in frame timing/position. You can even see some parts where every frame lands exactly half a fence rib apart, making the pins "stand still". If IBIS was indeed moving, the ribs would not land in the same place every frame; nor would they if the frame timing was wrong. Here is a quick and dirty "test" of the footage, so please excuse me if I got the problem in question wrong. I attached a screenshot of one part of it where you can see the edge of the ribs compared to previous frames, and they show pretty much the same distance traveled between each frame. All I can see is a strobing effect from having high-contrast objects jumping from one place to another. And that is a real problem too, but it's not jitter. Playing back low-framerate content and having it look smooth is a whole chapter in itself, and so is capturing it. A 180° shutter isn't a be-all-end-all solution; it depends on the scene.
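The "quick and dirty test" above amounts to tracking one edge across frames and checking that the per-frame steps are constant; a minimal numeric sketch (the positions below are made-up numbers, not measurements from the clip):

```python
# If the pan is constant-speed and frame timing is even, the per-frame
# displacement of a tracked edge should be (near) constant. Any IBIS
# wobble or timing error would show up as spread in the steps.

def displacement_jitter(positions_px):
    """Return per-frame steps, their mean, and their spread (max-min)."""
    steps = [b - a for a, b in zip(positions_px, positions_px[1:])]
    mean = sum(steps) / len(steps)
    spread = max(steps) - min(steps)
    return steps, mean, spread

# Edge positions over 8 frames of a steady pan (hypothetical values):
steps, mean, spread = displacement_jitter([0, 12, 24, 36, 48, 60, 72, 84])
print(mean, spread)  # 12.0 0
```

Zero spread is the "no jitter" result described above: the strobing would then be purely a high-contrast/low-framerate artifact, not a timing one.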
  21. https://obsproject.com/ It is used by streamers of all kinds; pretty much the go-to application to stream stuff to the web, which is why it has a lot of tutorials and help online. If you type OBS and greenscreen into youtube you will find tutorials for days. Do note that probably most of them are streamers, not people who know what they are doing, but they can still show you how to set up the program. There are other applications out there too, if you look around.
  22. Thing is, you want to light your subject like it was in the real environment, or in the case of a "flat" world, stylize it to taste. There is no fixing it in post if the light comes from the wrong direction. If they meant filming with a flat profile to get the highest dynamic range, then that is a bad idea in this case: as you already have complete control over the lighting, you can make it "flat" that way if the camera struggles. I would suggest you set up a quick greenscreen in OBS or similar to have a realtime preview when playing around with lights, to get the right feel and "pop" compared to the background. Or even set up a script that does it for photos so you can do it somewhat realtime. If you can't get good light for the subject, then any advantage in resolution or quality goes out the window.
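The "script that does it for photos" idea above could start as small as this (a hypothetical sketch using NumPy; the green_margin threshold is arbitrary, and a real keyer like OBS's also handles spill and soft edges):

```python
import numpy as np

def chroma_key(fg, bg, green_margin=40):
    """fg, bg: HxWx3 uint8 arrays. Replace greenscreen pixels of fg
    with bg. A pixel counts as greenscreen when its green channel
    exceeds both red and blue by green_margin (arbitrary threshold)."""
    fg = fg.astype(np.int16)  # avoid uint8 wraparound in the subtraction
    mask = (fg[..., 1] - fg[..., 0] > green_margin) & \
           (fg[..., 1] - fg[..., 2] > green_margin)
    out = np.where(mask[..., None], bg, fg)
    return out.astype(np.uint8)

# Tiny 1x2 example: one green pixel, one gray pixel, blue background.
fg = np.array([[[10, 250, 10], [128, 128, 128]]], dtype=np.uint8)
bg = np.array([[[0, 0, 255], [0, 0, 255]]], dtype=np.uint8)
print(chroma_key(fg, bg).tolist())  # [[[0, 0, 255], [128, 128, 128]]]
```

Running this over test photos while adjusting the lights gives the kind of rough "does the subject pop against the background" check suggested above, without a full OBS setup.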