
hoodlum

Members
  • Content Count

    319
  • Joined

  • Last visited

  • Days Won

    1

hoodlum last won the day on November 17 2012

hoodlum had the most liked content!

About hoodlum

  • Rank
    Frequent member

Profile Information

  • Gender
    Not Telling


  1. In the latest iOS 13.2 beta, Apple has finally moved the selection of video resolution and frame rate into the Camera app. Unfortunately, only the iPhone 11 will be getting this change.
  2. DPReview must have updated their info. We still don't know whether line skipping or binning was used to accomplish this. https://www.dpreview.com/reviews/nikon-z50-initial-review-what-s-new-how-it-compares/2
  3. A very nice video was posted by Matteo Bertoli, with most of it shot using the ultra-wide lens in 4K60p and then conformed to 24p. I would strongly recommend reading the details on how he created the footage (stabilization, iPhone Camera app, Resolve + LUT), along with his comments on the output quality.
  4. Fortunately you can adjust tint now in the new Photos App.
  5. Here are some details on the video editing capabilities. https://sixcolors.com/post/2019/09/13-features-of-ios-13-video-editing/ "Tap on the dial icon and you can adjust image qualities, including exposure, highlights, shadows, contrast, brightness, black point, saturation, vibrance, warmth, tint, sharpness, definition, noise reduction, and vignetting..... Moreover, all of that image adjustment is non-destructive, so if you ever decide you don’t like any of the changes, you can revert back without losing anything."
  6. I am surprised the X-PRO3 went with a tilt-down screen.
  7. The difference is that Google/Apple selectively merge regions of each image based on what has moved from frame to frame. If you just merge whole images, you will capture the blur from every movement. This will get more complex over time; currently some of this cannot be replicated in post-processing, and it may never be.
  8. Reviews started to come out today and I came across this interesting detail on how Smart HDR is handled. While I don't expect video to get this level of adjustment, it does look like a serious update to how photos are taken and could bode well for the video quality. https://www.theverge.com/2019/9/17/20868727/apple-iphone-11-pro-max-review-camera-battery-life-screen-midnight-green-price

     "Early reports indicate that the iPhone 11 sensor has a higher ISO range and faster possible shutter speeds. But Apple told me that the real improvements are due to a bump from an 8-bit rendering pipeline to 10 bits, and something it calls “semantic rendering,” which is basically an update to Smart HDR that recognizes individual elements of an image and adjusts them appropriately.

     From my conversations with Apple, semantic rendering basically goes like this: The iPhone starts taking photos to a buffer the instant you open the camera app. So by the time you actually press the shutter button, it’s captured four underexposed frames and the photo you want. Then it grabs one overexposed frame. (This is all basically the same as the iPhone XS and the Pixel 3, except the Pixel doesn’t grab that overexposed frame.)

     Smart HDR looks for things in the photos it understands: the sky, faces, hair, facial hair, things like that. Then it uses the additional detail from the underexposed and overexposed frames to selectively process those areas of the image: hair gets sharpened, the sky gets de-noised but not sharpened, faces get relighted to make them look more even, and facial hair gets sharpened up.

     Smart HDR is also now less aggressive with highlights and shadows. Highlights on faces aren’t corrected as aggressively as before because those highlights make photos look more natural, but other highlights and shadows are corrected to regain detail. The whole image gets saved and shows up in your camera roll. This all happens instantly every time you take a photo."
  9. iOS 13 will add multi-camera support for the XS, for those who are not looking to upgrade. Now if only they did the same for Night Mode: https://developer.apple.com/videos/play/wwdc2019/249/
  10. It will have a new 24MP mode that they are calling Deep Fusion. I am curious which sensor Apple is using, as they could be using a 48MP sensor that is then downscaled to 24MP after merging the images.
  11. The ultra-wide doesn't have it, although it is less necessary at this focal length.
  12. Telephoto cameras all have smaller sensors, in some cases no OIS, and slower-aperture lenses than the wide camera on the iPhone. This applies to all Android phones as well. What this means is that you need to shoot at higher shutter speeds, which, combined with the smaller sensor, introduces a lot more noise (especially in low light), almost negating the benefit of the telephoto lens due to heavier noise reduction. I still question the benefit of these telephoto lenses, although at least the new iPhone's telephoto camera now has a faster lens.
  13. The enhanced dynamic range is now available in 4K60p. It looked great at 30p on the XR.
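The line-skipping vs. binning question in post 2 comes down to whether discarded photosites are thrown away or averaged in. A toy readout makes the difference concrete; the array here is a made-up stand-in for a sensor, not real camera data:

```python
import numpy as np

# Hypothetical 8x8 "sensor" readout; values are arbitrary.
sensor = np.arange(64, dtype=float).reshape(8, 8)

# Line skipping: read every other row/column, discard the rest.
# Cheap, but aliasing-prone and noisier, since half the light is wasted.
skipped = sensor[::2, ::2]

# 2x2 binning: average each 2x2 block, so every photosite contributes.
# Same output resolution, but better noise behavior.
binned = sensor.reshape(4, 2, 4, 2).mean(axis=(1, 3))

print(skipped.shape, binned.shape)  # (4, 4) (4, 4)
```

Both paths produce the same output resolution, which is why you can't tell them apart from the spec sheet alone.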
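The selective merging described in post 7 can be sketched as a per-pixel motion mask: pixels that agree across frames get averaged (reducing noise), while pixels that moved fall back to the reference frame (avoiding ghosting). This is a minimal illustration of the idea, not Apple's or Google's actual pipeline, and the threshold value is arbitrary:

```python
import numpy as np

def selective_merge(frames, threshold=10.0):
    """Merge a burst by averaging only where pixels agree with the
    reference frame; regions that moved keep the reference value,
    so motion blur is not baked into the merge."""
    ref = frames[0].astype(float)
    acc = ref.copy()
    weight = np.ones_like(ref)
    for f in frames[1:]:
        f = f.astype(float)
        static = np.abs(f - ref) < threshold  # per-pixel "no motion" mask
        acc[static] += f[static]
        weight[static] += 1
    return acc / weight
```

A naive average of the same burst would blend the moved pixels too, which is exactly the blur the post says you can't fix afterwards.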
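The semantic rendering flow quoted in post 8 (bracketed frames plus per-region processing) can be sketched very roughly as follows. The masks, blend weights, and region choices here are all illustrative assumptions; in the real pipeline the masks would come from Apple's segmentation models and the per-region processing is far more involved:

```python
import numpy as np

def semantic_hdr(under, normal, over, sky_mask, face_mask):
    """Toy sketch of per-region bracket merging: recover highlight
    detail from the underexposed frame in the sky, and shadow detail
    from the overexposed frame on faces. 50/50 blends are arbitrary."""
    out = normal.astype(float).copy()
    # Sky: blend in the underexposed frame to pull back highlights.
    out[sky_mask] = 0.5 * out[sky_mask] + 0.5 * under[sky_mask]
    # Faces: blend in the overexposed frame to lift shadow detail.
    out[face_mask] = 0.5 * out[face_mask] + 0.5 * over[face_mask]
    return out
```

The key point from the quote survives even in this toy form: different regions of one photo are built from different source frames.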
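The exposure math behind post 12's argument can be worked through with back-of-the-envelope numbers. The focal lengths and apertures below are example specs chosen for illustration, not exact iPhone figures:

```python
from math import log2

# Illustrative specs: wide 26mm-equiv f/1.8, telephoto 52mm-equiv f/2.8.
wide_focal, wide_f = 26, 1.8
tele_focal, tele_f = 52, 2.8

# Reciprocal rule: handheld shutter ~ 1/focal length (no OIS), so the
# telephoto needs one stop faster shutter per doubling of focal length.
shutter_stops = log2(tele_focal / wide_focal)

# Light gathered scales with (1/f-number)^2, so the slower aperture
# costs log2((tele_f/wide_f)^2) stops.
aperture_stops = log2((tele_f / wide_f) ** 2)

# Total ISO increase needed to hold the same exposure.
iso_multiplier = 2 ** (shutter_stops + aperture_stops)
print(round(iso_multiplier, 1))  # ~4.8x higher ISO on the telephoto
```

Even before accounting for the smaller sensor, the telephoto in this example needs roughly 4.8x the ISO, which is where the extra noise, and then the heavier noise reduction, comes from.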