Everything posted by tupp

  1. Blackmagic cameras are sensitive to IR, and tungsten sources emit a high percentage of IR. So, you might want to try an IR cut filter.
  2. Evidently, the TIFF format has metadata designations that include camera make, model, color profile, orientation, exposure, etc. (see the linked exif file below). The highest quality still file that your camera produces will give the best image quality. However, it can be unwieldy or visually detrimental to directly edit raw camera files. Some still photographers first run their raw camera files through a raw image "processor/developing" program (such as Lightroom), before retouching them in an editing program such as Photoshop. I use the open source Darktable as my processing program and GIMP as my photo editor. I just checked the exif info in a tiff that I converted from a raw file using Darktable and the metadata is extensive: tif_exif_info.txt (Note that I was using a manual lens on a Canon EOSM, so there is no lens/f-stop info).
  3. Every raw file "development" program that I have used initially defaults to the camera's settings. Of course, one can usually change from the default settings to adjust white balance, exposure, camera profile, etc., but it probably depends on which program you are using to convert from raw to .tiff.
  4. Shooting raw stills and comparing them to processed frames can reveal the extent of sharpening or other image processing that is occurring in the camera.
  5. Don't trust auto-focus for anything critical, even if you can see the image momentarily magnified. Again, seeing raw stills could help, as those images should bypass a lot of the in-camera image processing.
  6. Did you use the same 12-35mm f2.8 lens on both cameras, or did you use two different 12-35mm f2.8 lenses? I wonder what the comparison would be like with both cameras set to "natural" and with the sharpening dialed all the way down in both cameras. It would be helpful to also see raw stills. By the way, are you auto-focusing or manually focusing?
  7. It appears that the GX80 has some artificial sharpening, judging from the halo around some of the lines/numerals on its chart. By the way, did you use the same lens with each camera?
  8. I want neither. This is just a discussion. The film look and the "cinematic" look are different, but they are not mutually exclusive goals. Let's not "put on airs."
I'm no huge fan of Seinfeld, but I would say that Seinfeld was more "cinematic" than the home films that I linked. As I recall, on Seinfeld they occasionally did use camera movement (staging), inserts and nearby CUs (lensing) and they had motivated lighting to convey certain environments. It was not entirely a flatly lit, distantly shot, three camera sitcom. Even if Seinfeld were utterly "uncinematic," how does your point differ from my example of home movies not being cinematic, but being undeniably captured on film?
I don't think that we disagree here. I am not advocating emphasis on post, nor am I suggesting that we should dwell on technical aspects. I merely addressed your linked comparison (and the comparison that you quoted) between digital cameras and film stocks. My point is simply that there are certain variables that can make video look like it was shot on film. Some of these variables must be wrangled while shooting and some are dealt with in post, but none of them have anything to do with being "cinematic."
Not necessarily. The Vilm camera had low dynamic range compared to current digital cameras. Also, much of the early video that FilmLook processed was captured with analog cameras of low dynamic range. Of course, having extra dynamic range helps.
In regards to bit depth, I don't think that it is an important factor in mimicking film. There are plenty of film stocks that have low color depth, so high bit depth (as a factor of digital color depth) is not necessarily crucial in making video look like film. By the way, as you may have gathered from the parenthetical part of the previous sentence, bit depth is not color depth. Also, bit depth and dynamic range are independent properties.
I agree that peculiarities in how emulsion rolls off the highlights/bright areas are an important characteristic to address in emulating film with video. After using up a significant portion of the dynamic range to deal with the highlights, it certainly is beneficial to have more room left over in the middle and low end for nice contrast range and color depth. However, huge dynamic range is not crucial in merely making video look like film.
Peculiarities of emulsion grain likely should be considered when trying to emulate film with video. I don't know anything about post grain simulations, but film grain is a somewhat controversial and complex topic in regards to film look. However, I have always understood that with negative stock, the grain clumps are usually larger and more overlapping in the bright areas, while grains are smaller but more separated and distinct in the dark areas. Also, the way noise appears in the shadows in digital is somewhat analogous to the more visible grain in shadows with film, and there have certainly been a few posts in this forum about how noise from certain cameras feels "organic."
To me, whether or not film has an edge in color is largely subjective. In addition, I couldn't say that all film has a greater color depth than all digital sensors. The color depth of different film stocks varies dramatically, as does that of digital cameras/sensors. Special processing also significantly affects color depth and contrast in emulsions. Larger film formats yield greater color depth. Higher bit depth does yield greater color depth in digital, but, again, bit depth and color depth are not the same thing.
No, I'm not confusing display technology with capture technology. I most certainly made a distinction between the viewing bit depth and the capture range in the YouTube scenario. I was additionally making the point that low bit depth in the post-capture stage doesn't influence whether or not something looks like it was shot on film.
Apparently, we agree here. I don't have an iPhone (I never fell for the Apple trick), but I have no doubt that I could shoot something "cinematic" with it.
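The distinction drawn above between bit depth and dynamic range can be illustrated with a toy sketch. Bit depth only fixes how many discrete code values a channel has; it says nothing by itself about capture range in stops. The stop figures and labels below are hypothetical, for illustration only, not measured camera or film specs:

```python
def code_values(bits: int) -> int:
    """Discrete tonal levels per channel at a given bit depth."""
    return 2 ** bits

# Two hypothetical encodings: same bit depth, different dynamic range,
# showing the two properties are independent.
encodings = [
    ("8-bit, 7.5-stop negative scan", 8, 7.5),
    ("8-bit, 13-stop log video", 8, 13.0),
]
for name, bits, stops in encodings:
    print(f"{name}: {code_values(bits)} code values, {stops} stops")
```

Both encodings have the same 256 code values per channel, yet one spans nearly twice the dynamic range of the other.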
  9. I don't think that's it. It is important to differentiate between what looks "filmic"/"cinematic" and what looks like it was shot with film. Consider this home movie and this home movie. In regards to lighting, lensing and staging, there is nothing particularly filmic/cinematic about these home movies, but both of them were unmistakably shot on film. Also, the dynamic range of the stock was probably equivalent to a 7 1/2 stop or less capture range. Just before digital took over, negative stock was generally rated at only 7 1/2 to 8 1/2 stops capture range, with normal processing. So, the technical capture range of today's pro and prosumer cameras is greater than film stocks. Furthermore, those home movies are channeled through 8-bit, YouTube pipes, so the bit depth of what we are viewing is only 8 bits, yet the footage is obviously captured on film. It's something else... a combination of variables that can be quantified for the most part. The Filmlook people and Eddie Barber (with his Vilm camera) were early pioneers in making video look like film, and they evidently did a thorough analysis of the variables involved. I never used Filmlook but I did shoot with the Vilm camera, and I was able to glean a little on how it is done.
  10. Uh, I understand that he was asking about his adapter/focal-reducer, but if I read correctly, he is not sure what mount to get with his lenses. As you subsequently concurred, I suggested that he get Nikon F mounts on his lenses. However, I also noted that many Canon EF adapters/focal-reducers are more expensive than their counterparts. So, he might be better off by getting such adapters with a different mount (that would allow Nikon F lenses to be adapted).
  11. Go for the Nikon mount on your lenses. You will always be able to adapt the lens with a Nikon mount onto anything with a Canon EF mount (given that the version of the same lens with an EF mount also fits). However, the opposite is not true -- a lens with a Canon EF mount will be incompatible with quite a few adapters and focal reducers, a few of which are only available with a Nikon mount. Furthermore, the Canon EF version of some focal reducers and adapters are significantly more expensive than those with most other mounts, because of the electronics required for Canon EF lenses. Go with the Nikon mount on your lenses.
  12. If your input frame rate is not a multiple of the output frame rate (or vice versa), then you might need to blend some of the input frames for the output to seem smooth. Frame blending can involve a meshing of interlaced fields from adjacent frames (as done in typical pull-downs) or involve a digital blending of adjacent progressive frames. A touch of motion blur can be added digitally if needed. I don't do much post work, but I would guess that rendering time is the biggest drawback of digital frame blending and of adding motion blur. An interlaced pull-down/pull-up uses fewer resources than some digital frame blending processes. Of course, if you are making a simple conversion in which your input frame rate is a multiple of the output frame rate (e.g., 48fps -> 24fps), you are simply dropping unneeded frames, and very little computer power is needed. There must be examples of various frame rate conversions and digital motion blur on YouTube, Vimeo, and DailyMotion, etc. At any rate (pun unintended), it is probably best/easiest to capture in the frame rate that you will be outputting.
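The two conversion cases described above can be sketched in a few lines. This is a minimal illustration that treats frames as lists of pixel values; the function names are made up for the example, not from any real video library:

```python
def drop_frames(frames, factor):
    """Integer-ratio conversion (e.g. 48fps -> 24fps): keep every Nth frame."""
    return frames[::factor]

def blend_pair(a, b, weight):
    """Digitally blend two adjacent progressive frames (0 = all a, 1 = all b)."""
    return [(1 - weight) * x + weight * y for x, y in zip(a, b)]

# Tiny one-pixel "frames" to show the two cases.
frames = [[0.0], [1.0], [2.0], [3.0]]
print(drop_frames(frames, 2))                 # 48fps -> 24fps: drop every other frame
print(blend_pair(frames[0], frames[1], 0.5))  # non-integer ratio: blend neighbors
```

The dropping case is nearly free computationally, while the blending case must touch every pixel of every output frame, which is why rendering time grows so quickly for non-integer-ratio conversions.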
  13. Strobes are measured in watt-seconds (not watts) and in guide numbers. Guide numbers rate the actual illumination output of a strobe, while watt-seconds usually measure the electrical expenditure of a strobe's "power pack." A watt-second is equivalent to the expenditure of power of one watt for one second. So, a momentary flash from a 1,000 watt-second strobe is equivalent to leaving the shutter open for one full second with a 1,000 watt constant light source (all other variables being equal). That's a lot of light.
Let's make another comparison between a 1,000 watt constant light and a 1,000 watt-second strobe. If you shoot 24fps video with a 180-degree shutter, your shutter speed is 1/48th of a second. So, shooting 24fps video with a 1,000 watt constant light is only yielding the equivalent of 1/48th of 1,000 watt-seconds -- only 20.8 watt-seconds.
Generally, if a monoblock strobe and a strobe with a separate power pack have the same watt-second rating, the monoblock will be brighter. Monoblocks are more efficient because they have no head cable to incur line loss.
If you intend to do a lot of stills outdoors and/or shoot stills indoors with large sources (umbrellas, soft boxes), then you will probably be much happier with strobes. Strobes are a lot more powerful than most constant sources, and strobes can freeze/sharpen action. Strobes allow one to make daytime exteriors look dark, with a short shutter speed and short flash duration. Most LED and fluorescent sources can't even come close to achieving this power/time density, and using focused tungsten and HMI sources to do the same would probably fry the subject.
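The watt-second arithmetic above can be checked with a couple of lines. The helper name is illustrative, not from any photographic library:

```python
def equivalent_watt_seconds(watts: float, shutter_seconds: float) -> float:
    """Energy a constant source delivers during one exposure (W * s)."""
    return watts * shutter_seconds

# 24fps with a 180-degree shutter -> 1/48 s exposure time.
shutter = 1.0 / 48.0
constant = equivalent_watt_seconds(1000, shutter)
print(f"1,000 W constant light over 1/48 s: {constant:.1f} W-s")  # ~20.8 W-s
print("1,000 W-s strobe flash: 1000.0 W-s")
```

So per exposure, the strobe delivers roughly 48 times the light energy of the 1,000 watt constant source at that shutter speed.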
  14. Thank you!!!! That's perfect -- no video necessary! Considering Panasonic mirrorless now!
  15. Thank you for your suggestion, but I have actually been directly involved in the last two pages, and it is not clear whether or not others understand exactly what I am asking. There seems to be confusion between "aperture" and "shutter." If someone with a Panasonic camera and lens would merely observe the lens aperture (set to f11-f22) during a 1 second exposure, the answer would be clear. So, are you saying that you observed the lens aperture momentarily stopping down when the shutter was released?
  16. I hope so. However, could someone with a Panasonic camera and lens please confirm this by simply observing the aperture stopping down during a 1 second exposure? Thanks!
  17. Thank you for the reply, but I am not sure that I have made my question clear. I am not talking about the shutter -- I am referring to the lens aperture (f-stop). SLRs (and most DSLRs when shooting stills not in "live view") have the aperture (not the shutter) always wide open for focusing and framing. Then, when the shutter button is pressed, the mirror goes up, the aperture stops down to its desired setting and the shutter opens and closes. So, my question is: Can Panasonic mirrorless cameras do the same -- can the aperture (not the shutter) on the lens of a Panasonic mirrorless camera be always wide open for stills until the shutter button is pressed? Anyone who has a Panasonic camera and lens should be able to easily tell what actually happens to the aperture by looking through the front element of the lens when the shutter is clicked. Set the shutter speed to 1 second, set the aperture to its smallest setting (f11-f22?), and click the shutter. It should be obvious whether or not the aperture stops down during the 1 second exposure. Of course, this process will not occur in "Constant Preview" mode. Thanks!
  18. All of them. Thanks! Are the Panasonic cameras actually keeping the aperture physically open, or are they just boosting gain/ISO?
  19. Does anyone know if any other Panasonic mirrorless cameras (G7, G85, GH5, etc.) can also keep the aperture open when focusing for stills?
  20. Wondering if anyone knows if the exposure simulation on a G7 or a GX85 or a G85 can be overridden to give a usable EVF/LCD when shooting stills with manual studio strobes. On some mirrorless cameras, the viewfinder/LCD goes dark when shooting with manual strobes, as the iris must be closed down for the much brighter strobes. DSLRs don't have this problem, as one can just use the optical viewfinder. Thanks!
  21. Canon EFs have good, clean glass, but keep in mind that, because of their electronic iris, you might run into some situations in which those lenses will be frustrating or nearly impossible to use. If you want to do an aperture ramp during a pan, you might be operating from a camera menu and/or a touch screen. Very annoying. Also, certain adapters (namely tilt) will not allow any aperture adjustment with an EF lens. So, unless you have handy a spare Canon body or spare powered EF adapter, you are "SOL" if you want to change the f-stop. The same drawbacks apply to most other lenses with electronic irises.
  22. I use mostly open-source (free) software, and GIMP is my main image editor. I don't use LUTs, so I cannot give details, but I do know that GIMP supports LUTs through the G'MIC plug-in. I have heard of another GIMP LUT plug-in, but I can't remember its name. There is also an open-source LUT converter called LUT_TO_LUT. I use open-source Darktable as my image "developer," and I am fairly sure that it also accepts LUTs.
  23. Some Nikkor lenses will not work with the Metabones EF speedboosters, when using a Nikon-F-to-Canon-EF adapter. Those EF speedboosters have a flange surrounding the front optical element that is positioned slightly forward from that front element. The protective metal "tongue" on the back of some of the Nikkor lenses hits this forward flange, thus preventing the lens from being mounted. Also, keep in mind that EF speedboosters are about 30%-50% more expensive than those with other mounts.