
KnightsFan
Posts: 1,351

Reputation Activity
KnightsFan got a reaction from IronFilm in Is the Samyang VDSLR kit of primes still relevant in 2019?
I don't know about whether they are relevant to the market as a whole, but to me they are since they remain unique for their features and price. There are no other brands with anywhere near the number of focal lengths, all with similar size, clickless manual aperture, decently long focus throw, and standard gear positions. And they are all very fast, and in EF mount. The only problem is they make fairly ugly images, in my subjective opinion.
-
KnightsFan got a reaction from Jirej Productions Ltd in The right Sigma 18-35mm for my Metabones Speedbooster Ultra
It looks like your 18-35 might be a Sigma SA mount lens, perhaps mislabeled as Canon EF by the seller.
(Based on looking up pictures of lens mounts, I don't actually have the lens in SA mount)
I found this page which has a detailed pic of an SA mount http://conurus.com/forum/sigma-sa-mount-thread-t266-15.html?sid=90c0074805e17063cfbc549a5315e1b0
-
KnightsFan reacted to Attila Bakos in ProRes and H.265 Comparison
It records a lot of stuff, including aperture and ISO; use ExifTool to read it.
Here's a sample output:
ExifTool Version Number : 11.29
File Name : DSCF7556.MOV
Directory : .
File Size : 606 MB
File Modification Date/Time : 2019:01:26 19:27:17+01:00
File Access Date/Time : 2019:01:26 19:34:59+01:00
File Creation Date/Time : 2019:01:26 19:34:59+01:00
File Permissions : rw-rw-rw-
File Type : MOV
File Type Extension : mov
MIME Type : video/quicktime
Major Brand : Apple QuickTime (.MOV/QT)
Minor Version : 0.0.0
Compatible Brands : qt
Movie Header Version : 0
Time Scale : 25000
Duration : 25.00 s
Preferred Rate : 1
Preferred Volume : 100.00%
Preview Time : 0 s
Preview Duration : 0 s
Poster Time : 0 s
Selection Time : 0 s
Selection Duration : 0 s
Current Time : 0 s
Next Track ID : 4
Track Header Version : 0
Track Create Date : 2019:01:14 16:13:13
Track Modify Date : 2019:01:14 16:13:13
Track ID : 1
Track Duration : 25.00 s
Track Layer : 0
Track Volume : 0.00%
Image Width : 4096
Image Height : 2160
Graphics Mode : srcCopy
Op Color : 0 0 0
Compressor ID : hvc1
Source Image Width : 4096
Source Image Height : 2160
X Resolution : 72
Y Resolution : 72
Bit Depth : 24
Color Representation : nclc 1 6 6
Video Frame Rate : 25
Time Code : 3
Balance : 0
Audio Format : lpcm
Audio Channels : 3
Audio Bits Per Sample : 16
Audio Sample Rate : 1
Matrix Structure : 1 0 0 0 1 0 0 0 1
Media Header Version : 0
Media Create Date : 2019:01:14 16:13:13
Media Modify Date : 2019:01:14 16:13:13
Media Time Scale : 25000
Media Duration : 25.00 s
Handler Class : Media Handler
Handler Type : Time Code
Gen Media Version : 0
Gen Flags : 0 0 0
Gen Graphics Mode : ditherCopy
Gen Op Color : 32768 32768 32768
Gen Balance : 0
Text Font : System
Text Face : Plain
Text Size : 12
Text Color : 0 0 0
Background Color : 0 0 0
Font Name :
Other Format : tmcd
Format : Digital Camera
Information : FUJIFILM DIGITAL CAMERA X-T3
Movie Stream Name : FUJIFILM MOV STREAM 0130
Make : FUJIFILM
Camera Model Name : X-T3
Modify Date : 2019:01:14 16:12:24
Artist :
Thumbnail Offset : 985974
Thumbnail Length : 8447
Copyright :
Exposure Time : 1/100
F Number : 4.0
ISO : 640
Sensitivity Type : Standard Output Sensitivity
Date/Time Original : 2019:01:14 16:12:24
Create Date : 2019:01:14 16:12:24
Shutter Speed Value : 1/100
Exposure Compensation : 0
Metering Mode : Multi-segment
Version : 0130
Internal Serial Number : FF02B4624018 Y54774 2018:08:28 930330111016
Quality : NORMAL
Sharpness : 0 (normal)
White Balance : Kelvin
Saturation : 0 (normal)
Noise Reduction : 0 (normal)
Fuji Flash Mode : Not Attached
Flash Exposure Comp : 0
Focus Mode : Unknown (65535)
Slow Sync : Off
Picture Mode : Manual
Shadow Tone : 0 (normal)
Highlight Tone : 0 (normal)
Grain Effect : Off
Auto Bracketing : Off
Blur Warning : None
Focus Warning : Good
Exposure Warning : Good
Film Mode : F0/Standard (Provia)
Min Focal Length : 18
Max Focal Length : 55
Max Aperture At Min Focal : 2.8
Max Aperture At Max Focal : 4
Rating : 0
Frame Rate : 25
Frame Width : 4096
Frame Height : 2160
Lens Info : 18-55mm f/2.8-4
Movie Data Size : 634417152
Movie Data Offset : 1050112
Aperture : 4.0
Avg Bitrate : 203 Mbps
Image Size : 4096x2160
Megapixels : 8.8
Rotation : 0
Shutter Speed : 1/100
Thumbnail Image : (Binary data 8447 bytes, use -b option to extract)
Light Value : 8.0
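As an aside (my own sketch, not from the post): ExifTool's default text output is plain "Tag Name : Value" lines, so pulling specific fields out is a one-colon split. The sample values below are copied from the dump above.

```python
# Parse ExifTool's default "Tag : Value" text output into a dict.
# partition() splits at the FIRST colon, so values containing colons
# (dates, timecode) survive intact.
sample = """\
Camera Model Name               : X-T3
F Number                        : 4.0
ISO                             : 640
Shutter Speed                   : 1/100
"""

def parse_exiftool(text):
    meta = {}
    for line in text.splitlines():
        key, sep, value = line.partition(":")
        if sep:  # skip lines without a colon
            meta[key.strip()] = value.strip()
    return meta

meta = parse_exiftool(sample)
print(meta["ISO"])       # 640
print(meta["F Number"])  # 4.0
```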
-
KnightsFan reacted to Jim Giberti in Black Magic Pocket OG 2019
Yeah, this can't be overstated.
All of the critical differences mentioned in discussions are in blown up stills.
Actual running footage, let alone footage compressed for broadcast and web delivery, is simply indistinguishable to a viewer. Period.
I did a blind test with a few people at different times in the studio the other day and the responses were exactly what I expected - "What difference am I supposed to see...they all look the same" kind of stuff.
What people should be talking about is how "organic" and "film-like" BRAW and Gen 4 processed in Resolve actually look. It's sort of the Holy Grail that so many people have been looking for.
4k, raw, detail, file size, smooth roll off in shadows and highlights, deep accurate colors, ease of editing.
The only people still debating the issue aren't the ones shooting with the new cameras and codecs. The debate is over for all of them/us.
-
KnightsFan reacted to IronFilm in Audio: voice over recording
However you save so little money while losing out on so very very very much!
Not worth it.
-
KnightsFan reacted to Snowfun in Audio: voice over recording
Managed to get a Rode AI-1 kit to play with. After reading the recommendations for a USB mic (which makes a lot of sense) I looked around and liked the look of it especially since I can use the mic with its XLR (plus adapter) on the P4k too if required. Might also get an XLR lav to use with the AI USB. Thanks for comments.
-
KnightsFan got a reaction from IronFilm in Audio: voice over recording
I often record dialog for video games and sometimes voiceovers for film projects. I have a corner of my room lined on a couple sides with thick, wool sleeping bags (really dense and heavy). I record with an AKG CK93 running into a Zoom F4 used as a USB audio interface, usually with Reaper as the software. I monitor with MDR-7506's. It sounds great. It's extremely budget-efficient as all of the components are things that I use on set. Well within 3 figures if you look for used equipment.
- You could switch out the mic for a cheaper cardioid (or omni, if you have to), but stay away from shotguns indoors.
- The Zoom F4 can be swapped for a cheaper H6.
- You can use Audacity to record for free--though I highly recommend Reaper if you do any audio post work at all. It's phenomenal!
- You may want a pop filter. I haven't gotten one yet.
I would definitely get ambient sound to fill the silence if there is no other audio playing. You can just get ambient sound from the room where you are doing the VO. It will probably just be faint hiss, but will remove that "jarring" factor of silence. If you go with laboratory sounds, you could record them in stereo. That way the mono VO will stand out against the ambient sounds better.
-
KnightsFan got a reaction from Snowfun in Audio: voice over recording
I often record dialog for video games and sometimes voiceovers for film projects. I have a corner of my room lined on a couple sides with thick, wool sleeping bags (really dense and heavy). I record with an AKG CK93 running into a Zoom F4 used as a USB audio interface, usually with Reaper as the software. I monitor with MDR-7506's. It sounds great. It's extremely budget-efficient as all of the components are things that I use on set. Well within 3 figures if you look for used equipment.
- You could switch out the mic for a cheaper cardioid (or omni, if you have to), but stay away from shotguns indoors.
- The Zoom F4 can be swapped for a cheaper H6.
- You can use Audacity to record for free--though I highly recommend Reaper if you do any audio post work at all. It's phenomenal!
- You may want a pop filter. I haven't gotten one yet.
I would definitely get ambient sound to fill the silence if there is no other audio playing. You can just get ambient sound from the room where you are doing the VO. It will probably just be faint hiss, but will remove that "jarring" factor of silence. If you go with laboratory sounds, you could record them in stereo. That way the mono VO will stand out against the ambient sounds better.
-
KnightsFan got a reaction from IronFilm in Using 2 mount adapters in simultaneous
I do this all the time. With Nikon F to Canon EF adapters, you can use a flathead screwdriver to tighten the leaf springs before mounting the lens, so that the adapter has zero play. It is just as physically solid as if the Nikon lens were a native Canon lens.
-
KnightsFan reacted to IronFilm in Z Cam releasing S35 6k 60fps camera this year
Looks like they're seriously looking into MFT now and are going to do an MFT S35 camera!
All thanks to user feedback requesting it :-)
-
KnightsFan got a reaction from IronFilm in Z Cam releasing S35 6k 60fps camera this year
Yeah, I've been watching those developments closely. Price is still a mystery, but hopefully we'll hear within the next few weeks. I'm very excited, because the two main problems I have with the E2 are the smaller sensor and the low MP count. I'd like to have decently high-resolution still images, and close to the native FOV with my vintage FF lenses.
An S35 version with an M43 mount seems to be something they are looking into, though they seem more interested in EF at the moment. So maybe we'll finally have that spiritual successor to the LS300 as far as lenses are concerned.
-
KnightsFan got a reaction from leslie in Framerate
You should shoot with your distribution frame rate in mind. If you are shooting for PAL televisions, then you should shoot in 25 fps for normal motion. If you want 2x slow motion, shoot in 50 and conform to 25. If you want 2.3976x slow motion, shoot in 59.94 and conform to 25, etc. (I know you aren't talking about slow motion, I just mention it to be clearer)
Essentially, at the beginning of an edit you will pick a timeline framerate to edit in, based on artistic choice or the distribution requirements. Any file that is NOT at the timeline framerate will need to be interpolated to some extent to play back at normal speed. Mixing any two frame rates that are not exact multiples of each other will result in artifacts, though there are ways to mitigate those problems with advanced interpolation algorithms. So you shouldn't mix 23.976 and 59.94. If you have a 23.976 timeline, the 59.94 footage will need to be modified to show 2.5 video frames per timeline frame. You can't show .5 frames, so you have to do some sort of frame blending or interpolation, which introduces artifacts. Depending on the image content, the artifacts might not be a problem at all.
The same would apply for putting 23.976 footage on a 29.97 timeline, or any other combination of formats.
The only way to avoid all artifacts completely is to shoot at exactly the frame rate you will use on the timeline and in the final deliverable, or conform the footage for slow/fast motion.
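The conform arithmetic above reduces to a couple of one-liners; a quick Python sketch (the function names are mine, the numbers match the examples in the post):

```python
def slowmo_factor(shot_fps, timeline_fps):
    """How many times slower than real time footage plays once conformed:
    every captured frame is simply shown at the timeline rate."""
    return shot_fps / timeline_fps

def needs_interpolation(shot_fps, timeline_fps):
    """True if real-time playback on this timeline requires frame blending:
    the source-frames-per-timeline-frame ratio isn't a whole number."""
    ratio = shot_fps / timeline_fps
    return abs(ratio - round(ratio)) > 1e-9

print(slowmo_factor(50, 25))               # 2.0 -> 2x slow motion
print(slowmo_factor(59.94, 25))            # ~2.3976x slow motion
print(needs_interpolation(59.94, 23.976))  # True: 2.5 source frames per frame
print(needs_interpolation(50, 25))         # False: exact multiple, no blending
```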
-
KnightsFan got a reaction from thephoenix in Framerate
There is not much visual difference between 24 and 25 if you stick to one or the other, but you should not mix and match them. Use 25 for PAL television or 24 if you want to use the film convention. 30 is slightly different on scenes with a lot of motion.
I am almost 100% certain YouTube keeps the original framerate. I think they can even do variable frame rate? I could be mistaken on that one though. I would be very surprised if Vimeo converted, but I don't use Vimeo so I am not positive.
Yes, exactly.
You can speed it up. If you speed it up 2x it will look a little jerky simply because it is essentially a 15 fps file now. If you speed it up to some weird interval, like 1.26x, then there will be frame blending and that will probably look pretty bad, depending on the content of your shot (a static shot with nothing moving won't have any issues with frame blending, whereas a handheld shot walking down a road will look really bad).
Technically, yes, you can do that. If you want your final product to be a 48fps file, and you are sure such a file is compatible with your release platform(s), then it should work. I think that it is a phenomenal idea to try it out as an experiment--but definitely try it out thoroughly before doing this in a serious/paid project--there is a very good chance it will not look the way you expect if you are still figuring stuff out.
Also, for any project, only do this if you want the large majority of your 30fps clips sped up. If you want MOST to be normal speed and a COUPLE to be sped up, then use a 30fps timeline and speed up those couple clips.
If I were you, I'd go out one weekend and shoot a bunch of stuff in different frame rates, then try out different ways of editing. Just have fun, and try all the things that we tell you not to do and see what you think of our advice!
-
KnightsFan got a reaction from tupp in So Is a7 III Still The Dynamic Range King? (Not trolling, just asking)
I'm not 100% sure about this, but it's my current understanding.
The reason you are incorrect is that doubling the light intensity doesn't necessarily mean doubling the bit value. In other words, the linear factor between scene intensity and bit value does not have to be 1. For example: if each additional bit means a quadrupling of intensity instead of a doubling, it is still a linear relationship, and 12 bits can hold 24 stops.
As @tupp was saying, there is a difference between dynamic range as a measure of the signal measured in dB, and dynamic range as a measure of the light captured from the scene measured in stops. They are not the same measure. A 12 bit CGI image has signal DR just like a 12 bit camera file does, but the CGI image has no scene-referred, real world dynamic range value.
It seems that all modern camera sensors respond linearly to light, roughly at a factor of 1 comparing real-world light to bits. I do not know exactly why this is the case, but it does not seem to be the only conceivable way to do it.
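A quick arithmetic check of the quadrupling example above (my own sketch, not from the thread): if each additional bit multiplies the representable intensity by a fixed ratio, the stops spanned are just bits times log2 of that ratio.

```python
import math

def stops_spanned(bits, intensity_ratio_per_bit):
    # One stop is a doubling of light, so a ratio of r per bit
    # contributes log2(r) stops per bit.
    return bits * math.log2(intensity_ratio_per_bit)

print(stops_spanned(12, 2))  # 12.0 -> doubling per bit: 12 stops
print(stops_spanned(12, 4))  # 24.0 -> quadrupling per bit: 24 stops
```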
Again, I am not 100% sure about this, so if this is incorrect, I'd love an explanation!
-
KnightsFan reacted to kye in Framerate
I just operate under the concept that you choose your base frame rate and carefully match it to shooting normal speed clips, but when shooting slow-motion you just conform it properly and it kind of doesn't matter in a way.
For example, if you're on a 25fps timeline and you shoot 50p slow-motion, then it will conform to 50% of real speed, but if you shoot 60p it will conform to about 40% of real speed. I'm not really sure there would be that many situations where a 50% speed shot is required and a 40% speed shot wouldn't also be acceptable. In that sense I kind of view 50p and 60p as being the same.
-
KnightsFan reacted to amanieux in Blackmagic Pocket Cinema Camera 4K gets BRAW in extensive FPGA hardware update delivered via software
Don't want to split hairs, but "lossy compression" can be "visually lossless" or "visually lossy." "Lossless compression" is unambiguous: it means compression that does not change a single bit of the uncompressed data, so it is visually indistinguishable because it is identical.
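The "not a single bit changes" definition is easy to demonstrate with any lossless codec, e.g. zlib in Python (my example, not from the post):

```python
import zlib

data = bytes(range(256)) * 100           # arbitrary "uncompressed" payload
packed = zlib.compress(data, level=9)    # lossless compression
restored = zlib.decompress(packed)

# Bit-for-bit identical, by definition of lossless:
assert restored == data
print(len(data), "->", len(packed), "bytes")
```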
-
KnightsFan got a reaction from thephoenix in Framerate
YouTube will play anything. If you shot 23.976, then stick with that. I don't know for certain, but I bet Vimeo will also play anything. The only time you really have to be careful when deciding which format to use is with television broadcast or specific festivals, since modern computers can play anything.
For slow motion, you can shoot anything higher than your timeline frame rate and conform it. If your NLE has the option, you should conform footage instead of manually slowing it down. That way you will avoid any frame artifacts in case your math wasn't correct. But to directly answer the question, slowing 59.94 to 23.976 is a great way to get slow motion.
-
KnightsFan got a reaction from BrunoCH in Framerate
You should shoot with your distribution frame rate in mind. If you are shooting for PAL televisions, then you should shoot in 25 fps for normal motion. If you want 2x slow motion, shoot in 50 and conform to 25. If you want 2.3976x slow motion, shoot in 59.94 and conform to 25, etc. (I know you aren't talking about slow motion, I just mention it to be clearer)
Essentially, at the beginning of an edit you will pick a timeline framerate to edit in, based on artistic choice or the distribution requirements. Any file that is NOT at the timeline framerate will need to be interpolated to some extent to play back at normal speed. Mixing any two frame rates that are not exact multiples of each other will result in artifacts, though there are ways to mitigate those problems with advanced interpolation algorithms. So you shouldn't mix 23.976 and 59.94. If you have a 23.976 timeline, the 59.94 footage will need to be modified to show 2.5 video frames per timeline frame. You can't show .5 frames, so you have to do some sort of frame blending or interpolation, which introduces artifacts. Depending on the image content, the artifacts might not be a problem at all.
The same would apply for putting 23.976 footage on a 29.97 timeline, or any other combination of formats.
The only way to avoid all artifacts completely is to shoot at exactly the frame rate you will use on the timeline and in the final deliverable, or conform the footage for slow/fast motion.
-
KnightsFan got a reaction from thephoenix in Framerate
You should shoot with your distribution frame rate in mind. If you are shooting for PAL televisions, then you should shoot in 25 fps for normal motion. If you want 2x slow motion, shoot in 50 and conform to 25. If you want 2.3976x slow motion, shoot in 59.94 and conform to 25, etc. (I know you aren't talking about slow motion, I just mention it to be clearer)
Essentially, at the beginning of an edit you will pick a timeline framerate to edit in, based on artistic choice or the distribution requirements. Any file that is NOT at the timeline framerate will need to be interpolated to some extent to play back at normal speed. Mixing any two frame rates that are not exact multiples of each other will result in artifacts, though there are ways to mitigate those problems with advanced interpolation algorithms. So you shouldn't mix 23.976 and 59.94. If you have a 23.976 timeline, the 59.94 footage will need to be modified to show 2.5 video frames per timeline frame. You can't show .5 frames, so you have to do some sort of frame blending or interpolation, which introduces artifacts. Depending on the image content, the artifacts might not be a problem at all.
The same would apply for putting 23.976 footage on a 29.97 timeline, or any other combination of formats.
The only way to avoid all artifacts completely is to shoot at exactly the frame rate you will use on the timeline and in the final deliverable, or conform the footage for slow/fast motion.
-
KnightsFan reacted to amanieux in Blackmagic Pocket Cinema Camera 4K gets BRAW in extensive FPGA hardware update delivered via software
Disagree. Lossless raw is still raw because it is bit-for-bit identical to uncompressed raw, but lossy raw is no longer raw, unless you use another definition of raw than mine, which is "data straight out of the sensor." Altering a single bit of data disqualifies it as raw; otherwise, where do you put the limit on how lossy the compression can be while still qualifying as "lossy raw"? Is H.264 a very lossy raw?
-
KnightsFan got a reaction from thebrothersthre3 in HLG explained
@mirekti HDR just means that your display is capable of showing brighter and darker parts of the same image at the same time. It doesn't mean every video made for an HDR screen needs to have 16 stops of DR; it just means that the display standard and technology are not the limiting factor for what the artist/creator wants to show.
-
KnightsFan reacted to kye in HLG explained
Actually, I think that it's a deeper issue.
Just look at the terminology - "recovering highlights" is something that you do when things have gone brighter than white. This only makes sense if white has a shared definition, which it does if everyone is publishing in rec709.
HLG is actually a delivery standard, so when someone shoots in HLG and is going to deliver in HLG, there is no recovery: if the camera clips the HLG file, then that data is clipped. Same as if you shoot rec709 for delivery in rec709. If this guy were talking about filming in LOG, no one would assume that he's delivering in LOG, so the conversation would be in the context of the default and standard delivery colour space / gamma.
The point of HLG is that it's an alternative to rec709, and so now there is no default / standard / goes-without-saying reference point.
Edit: coming from a camera, clipped HLG is the same as clipped 709. That is, some cameras may not actually be clipped because they might include super-whites in the output file, like I know the Canon cinema cameras do (as well as others, I'm sure). Ah, I love the smell of confusion in the morning.
-
KnightsFan reacted to kye in HLG explained
Well, he makes no sense through most of that video but was absolutely right about one thing - when he said "I don't know the technicalities of what's actually going on".
My understanding would be this:
- HLG includes the whole DR of the camera, and the other profiles he tested don't.
- FCPX is confusing him. Here's what I think is happening: FCPX takes the HLG file (which isn't clipping anything) and then converts it automatically to rec709, "clipping" it heavily but retaining that data in super-whites.
- When he makes adjustments to lower the exposure, those values above 100 get brought back into range.
- He thinks that FCPX pushing the exposure beyond 100 somehow means the GH5 "clipped" (it didn't).
- He thinks that lowering the exposure in FCPX and getting the highlights back means you can somehow recover clipped highlights (you can't). If something is clipped, then the values are lost (digital clipping is data loss).
- FCPX is "helping" by automatically doing things without asking you.
TL;DR - HLG has greater DR; exposing for HLG is different than rec709; FCPX is "helping" and confusing people; and this guy isn't the person to be listening to for this stuff.
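A toy illustration of the clipped-vs-super-whites distinction (my own sketch; the 0-100 scale mirrors the values in the post):

```python
def camera_clip(values, ceiling=100.0):
    """Hard clip at the ceiling: anything above it is lost for good."""
    return [min(v, ceiling) for v in values]

def lower_exposure(values, stops=1.0):
    """Pull exposure down: one stop halves every value."""
    return [v / (2 ** stops) for v in values]

scene = [50.0, 100.0, 140.0]  # 140 = a super-white highlight

# File kept super-whites: lowering exposure brings 140 back into range.
print(lower_exposure(scene))               # [25.0, 50.0, 70.0]

# Camera (or conversion) clipped first: the highlight detail is gone.
print(lower_exposure(camera_clip(scene)))  # [25.0, 50.0, 50.0]
```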
-
KnightsFan got a reaction from IronFilm in So Is a7 III Still The Dynamic Range King? (Not trolling, just asking)
The problem is there are many ways to measure DR. If you read "the Sony a7III has 14 stops of DR" and "the Arri Alexa has 14 stops of DR" both may be correct, but are utterly meaningless statements unless you also know how they were measured.
Many years ago, Cinema5D pegged the a7sII at like 14 stops. However, they later standardized their measurement to use SNR = 2, which put the a7sII at 12. But whichever way you measure, it's ~2 stops less than the Alexa. Many members here will tell you that Cinema5D is untrustworthy, so take that as you will. I have yet to find another site that even pretends to do scientific, standardized tests of video DR.
Cinema5D puts the XT2 and XT3 at just over 11, so that confirms your finding. And again if you change your methods, maybe it will come out at 13, or 8, or 17--but in every case it should be a stop less than the a7sII when measured the same way.
Bit depth doesn't necessarily correlate exactly to dynamic range. You can make a 2 bit camera that has 20 stops of DR: anything below a certain value is a 0, anything 20 stops brighter is a 3, and then stick values for 1 and 2 somewhere in between. It would look terrible, obviously, because higher bit depth reduces banding and other artifacts. There is pretty much no scenario in which an 8 bit encoding has an advantage over 10 bit encoding of the same image.
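The 2-bit thought experiment can be made concrete (hypothetical thresholds; my sketch, not an actual camera design):

```python
import math

def two_bit_code(intensity, black=1.0, total_stops=20):
    """Quantize a 20-stop range above `black` into just 4 code values (0..3).
    Huge dynamic range, terrible tonal resolution: 5 stops per code step."""
    stops = max(0.0, math.log2(max(intensity, black) / black))
    return min(3, int(stops / (total_stops / 4)))

print(two_bit_code(0.5))      # 0: at or below black
print(two_bit_code(2 ** 7))   # 1: 7 stops above black
print(two_bit_code(2 ** 20))  # 3: 20 stops above black -> top code
```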
-
KnightsFan got a reaction from IronFilm in cinematic color?
I did some VR development a few years ago when I had access to a university Oculus. It was a ton of fun. Even simple things like mapping a 360 video so you can freely look around is amazing, let alone playing VR games with the handset and everything.
I guess what I love in games is where the game never forces you to use a certain item to defeat the monster, thereby encouraging creativity to overcome tasks. You could get that specific item, or you could find a way to bypass the monster altogether--but then that same monster may come up later in the game. That's where cinematic techniques like color come in. The developer may use color to psychologically influence a player to make a decision, which makes it much more rewarding to find a different way to accomplish the task.
For a great use of color in a game, think of Mirror's Edge, where objects you interact with are bright red and yellow against a mostly white world. It makes it much easier to identify things you can climb without stopping and breaking your momentum. Films can use color in a similar way, to draw attention to certain objects, but in a game the fact that attention is drawn to an object actually changes how the game is played and, in some cases, the actual plot, whereas a movie still exists on a linear timeline no matter where you look.