Everything posted by KnightsFan

  1. I totally understand paying for different software features, like separate purchases for raw recording, or log recording, or higher frame rates. Subscriptions with monthly payments, on the other hand... how does that even work? Are there enough theoretical updates to warrant paying each month? I can't think of more than 1-2 software items I'd want updated on any camera I've owned. What can they do month after month? They can't increase dynamic range. So if the only updates are little things I either never use or work around easily, then I'd just unsubscribe. Not sure why I would pay for a subscription and not get functional updates that I care about. Unless the manufacturer shuts the camera off remotely if I don't pay monthly, which would of course be hideous, and I would never buy into that system. However... if the subscription model was like a constant rental, where you pay $x/month and you not only get software, but also hardware upgrades as they arrive, that might be a neat pricing model. When I buy a camera for X and sell it Y months later for Z, I always calculate my cost to own over that time period. If a manufacturer cuts out the buying and selling part, I see it as a win. For example, I could conceivably pay some fixed price per month to always have the latest Fuji XHx model--with some kind of future reporting on when new models will arrive and what they will have, of course. This is a very unlikely model for any company to move to; I'm just writing it out as something that would be interesting and actually pretty acceptable to me.
  2. This. I have no interest in mirrorless cinema lenses locking me into a particular mount, particularly manual cinema lenses. Nikon lenses in PL mount, however, that would be extremely welcome.
  3. I think the patent says 2k, not 4k, and I don't think many people want to shoot <2k even for compressed raw. https://patents.google.com/patent/US8872933B2/en (A rumored reason is that the SI-2K, which existed prior to Red's patent, shot 1920x1080 CineForm raw. Red added 80 pixels and stepped into history.)
  4. I fully support your endeavor! I'm not negative on what you're doing. Ideally it is best to standardize, I just worry that the list won't grow very large because of the purchase requirement. Unless you can get buy-in from a big reviewer who gets their hands on a lot of models (or maybe you are a reviewer who gets your hands on lots of models personally).
  5. Sounds like either you or the other person measured it wrong (or possibly both of you did).
  6. Here's my list: https://outerspaceoatmeal.com/tools/RollingShutterComparison.html You can add it or not, up to you. Most of my numbers come from DVXuser, CineD, and a couple from other primary sources where the test method has been shared. Global shutter cameras are self-explanatory, so I link to the product page. Yeah, I mean it's ideal to always do it the same way, I'm just not sure many people will buy a specific Arduino to fill out this table. The difference with DR is that it's extremely subjective. Rolling shutter is not. People can measure it incorrectly--which they can do regardless of their intended test method--but they can't measure it correctly and then arrive at a different conclusion than someone else. Edit: And to be clear, measuring signal to noise ratio is also objective.
  7. In that case none of my values will go into your table. Seems like a waste, though--it's a raw speed so there's no subjectivity.
  8. Nice! I have my own database of rolling shutter values that I can get to you. The one column I would add to the table is the ratio of the rolling shutter readout time to the frame duration; that value normalizes the skew per frame (rough sketch of the math below).
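To illustrate, a minimal sketch of that extra column; the function name and the example numbers are my own, not values from the table:

```python
def skew_per_frame(readout_ms: float, fps: float) -> float:
    """Fraction of the frame interval spent reading out the sensor.
    1.0 means readout takes the whole frame; lower is better."""
    frame_duration_ms = 1000.0 / fps
    return readout_ms / frame_duration_ms

# A hypothetical 15 ms readout occupies ~36% of a 24 fps frame,
# but 90% of a 60 fps frame, so the same sensor skews worse per frame at 60.
print(round(skew_per_frame(15, 24), 2))  # 0.36
print(round(skew_per_frame(15, 60), 2))  # 0.9
```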
  9. I have no idea who licenses what anywhere in the industry lol. Between Nikon buying Red, Blackmagic joining the L mount alliance, several global shutter cameras announced in quick succession, and AI video generation/editing taking off, I do think this might be the point at which I least recommend buying a brand new camera. 2024 might be a big year for market shifts.
  10. Probably Nikon has wanted to acquire a cinema company for a while. When Nikon first announced Z mount, they mentioned cinema a lot, but never backed that up with a full feature set. There were little things, like the tripod locating pin (so critical for solid cinema rigs). And of course supporting Raw outputs. But Nikon didn't make the big jumps, like timecode, internal NDs, XLRs. Maybe they wanted to, but didn't quite have the tech, personnel, etc., and buying out a smaller company was the easiest option. So really there are three possible futures. 1. Red's tech moves to Nikon's mirrorless cameras: Redcode perhaps, accessory compatibility (e.g. their new EVF), the global shutter sensors. 2. Nikon's tech moves to Red: Z mount, autofocus, or even simple things like LCD screens and mirrorless-size EVFs. And the other huge category: lenses. Perhaps some of Nikon's excellent optics will find their way into cinema housings, either in PL or Z mount. 3. Nothing changes, but Nikon owns a more diversified product line.
  11. That was a Z Cam E2-M4. I saw similar results with the XT3 back when I used that primarily. Fwiw, I also saw similar color issues comparing Canon 5D3 8 bit vs Magic Lantern raw. The color blocks are most obvious in relatively uniform gradients, such as skies. This tree shot isn't the best for viewing since it's so busy with high frequency changes, but you can still see it pretty easily when you zoom to 100% or view on a 4K monitor, especially in motion. Most obvious is the greenish splotch in the bottom area that I highlighted, and the upper highlighted area has red and green splotches. If it's not a big enough deal for what you do, then great! To me, it's a big enough deal, since I have the option of 10 bit. There's no downside: file size is the same, and I've never encountered overheating on any camera ever (shooting narrative, I take relatively short clips with time in between). That's not to say the difference is uber important... I mean I am splitting hairs about something that has very little bearing on the final product. I'm posting here to show what the difference is, not to tell you that it matters for you. All else being equal, I'll always use 10 bit on the cameras I've tested.
  12. I don't know if this is universal, or just the cameras I've tested, but I've found that recording 8 bit produces blocky color artifacts that are visible even without log recording or color grading. See my example in the other thread, and note that the comparison is with roughly equivalent bitrate (in this particular example, the 10 bit file ended up slightly smaller, but within a couple percent). A toy example of the quantization side of this is below.
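The toy example, just to show why subtle gradients run out of code values at 8 bit; real-world blocking also involves chroma subsampling and compression, which this ignores:

```python
import numpy as np

# Quantize a subtle, sky-like ramp the way 8 bit and 10 bit pipelines would.
gradient = np.linspace(0.0, 0.05, 1920)            # gentle 0-5% luminance ramp
codes_8 = np.unique(np.round(gradient * 255))      # distinct 8 bit code values
codes_10 = np.unique(np.round(gradient * 1023))    # distinct 10 bit code values
print(len(codes_8), len(codes_10))                 # ~14 steps vs ~52 steps
```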
  13. When people talk about motion cadence, I think it's a conflation of many possible sources, where rolling shutter is just one piece and is often not the prominent one. The various motion problems I have identified are below. I don't see a lot of discussion of 5 and 6 on film forums; I see stuff about 4 every now and then.

      Obvious Settings
      1. Frame rate (24 vs 30 vs etc.)
      2. Shutter speed

      Sensor Tech
      3. Rolling shutter
      4. Weird artifacts almost like double exposures. My camera explicitly has a mode that captures at two shutter speeds simultaneously for higher dynamic range. I've seen artifacts like that on other cameras, but not advertised as a specialty mode.

      Display Issues
      5. Display scan rate, frame rate, ghosting, trails. Some screens scan slowly, like rolling shutter on the display end. Others have ghosting effects, where the previous frame is still slightly visible, or trails. I see a lot more information about displays on video game tech sites than on film tech sites. Refresh rate, as mentioned earlier, includes pulldowns or judder to fit frame rates that don't divide evenly.
      6. Decode speed (laggy motion with H.265 on older computers, and stutter from high bitrate files on old mechanical drives).
  14. Yup, it was always amusing when people watched 24p on a 60 fps monitor and claimed it looked better than 30p. Maybe some people like the effect of pulldowns or frame blending, but to me it's a strong BS indicator. I am fortunate enough to have a 120 Hz monitor, which is a great number because it's divisible by 24, 30, and 60 (quick illustration below). It's great that high refresh rates on our screens are the norm now! I really dislike rolling shutter. It's one of my least favorite imperfections. I'm not saying "global shutter or bust," but the faster the better, and 10 ms is around the cutoff where I'm happy. Quick controlled pans don't bother me so much as the vague wobble when it's on a steadicam or handheld. My favorite movies to make have plenty of action, running, fighting, etc., so it's way more present to me than for most corporate or wedding shooters. I'd sacrifice a stop of DR and noise from modern full frame sensors to get RS in the 5 ms range instead of the 20 ms range. The nice thing is that plenty of budget cinema cameras have fast readouts these days, like the UMP 4.6K G2, FX6, and of course Komodo.
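The quick illustration I mentioned, as a rough sketch (the function name is just mine): frame rates that divide the refresh rate evenly repeat every frame the same number of times, while the rest need uneven repeats, i.e. pulldown judder.

```python
from math import floor

def repeat_pattern(frame_rate: float, refresh_rate: float, frames: int = 4) -> list[int]:
    """How many refresh cycles each source frame stays on screen."""
    pattern, prev = [], 0
    for i in range(1, frames + 1):
        boundary = floor(i * refresh_rate / frame_rate)  # refresh at which frame i ends
        pattern.append(boundary - prev)
        prev = boundary
    return pattern

print(repeat_pattern(24, 120))  # [5, 5, 5, 5] -> even cadence
print(repeat_pattern(30, 120))  # [4, 4, 4, 4] -> even cadence
print(repeat_pattern(24, 60))   # [2, 3, 2, 3] -> 3:2 pulldown judder
```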
  15. Theoretically I agree, but in practice no box cameras have built the required ecosystem of accessories to create compact camcorder ergonomics. The two missing pieces are the side handle and the monitor. There are plenty of "dumb" side handles, but to match camcorder ergonomics it needs lots of buttons. The FS7 handle, with its multiple function buttons, joystick, etc., is a starting point. There are very few good monitor options under 5", and you need bulky batteries or lots of cables. By that time you've got a cinema rig. We could really do with some more open standards for camera controls, video, and audio. Lots of vendor lock-in these days in terms of accessories.
  16. I loved old camcorder ergonomics. Large battery on the back, flip-out screen on the side, nice slot for your hand on the right. The Sony NEX-VG900 was a neat concept, but I guess it didn't sell, since it remained a one-off.
  17. A trend towards "fewer wealthier photographers" is generally an accurate description of many industries, and is a cause for economic concern. Often when people talk about what AI can't do, they jump to comparing to the top 0.0000001% of humanity. AI might never achieve what Michelangelo, or Scorsese, or Bach, or Pink Floyd did. But if it can achieve what the bottom 30% of artists can--that's a lot of artists losing money.
  18. If a tool that reduces the cost to produce audiovisual content by 50% isn't game changing, I don't know what is. And this will be a heckuva lot more than 50% reduction over the next 3 years (in my prediction). Same. I think my favorite use case for AI would be something like Wonder Studio, where I can shoot everything with a couple actors, and then replace them with animated characters. That would make something with animatable characters (for example, a Star Wars scene with lots of droids) really easy with a limited cast size. Or if I could go out in the wilderness and shoot with amazing real scenery, and perfectly composite people in later without motion tracking and chroma key artifacts.
  19. I don't think a photo real animation with no back end labour can be described as just a better animation tool. Current animation tools, critically, take years of practice and hundreds of paid hours to create each individual work. A production going from "writer, director, and 10,000 hours of professional, lifelong technical artists" to "writer, director, and a 2 month subscription to OpenAI" is, in my opinion, something to pay attention to and expect disruption from, whether you categorize it as a "just a better tool" or not. Switching perspectives a little, these tools are absolutely perfect for hobbyists like me. I'm never going to hire artists, so my productions go from crap CGI to amazing CGI, and no one loses a job. There are no downsides! If that's the angle you're coming from, then I agree with you. However, for anyone making a living off of video work, there's a very very large chance that the amount of money that anyone is willing to pay for ANY kind of creative content creation is going to decrease, fast.
  20. It won't always look shitty. Remember 30 years ago when CGI looked like Legos photographed in stop motion against a flickery blue screen? Let's wait 30 years on AI generated imagery. No technology can take away the enjoyment of doing something, though it can take away the economic viability of selling it. Which indirectly affects us, because if fewer cameras are sold, people like you and I will face higher equipment prices. Certainly AI is already used in the gaming industry to make assets ahead of time. It will be a bit longer before the computational power exists at the end user to fully leverage AI in real time at 60+ fps. When you have a 13 millisecond rendering budget, it's a delicate balance between clever programming and artistically deciding what you can get away with--and that requires another leap in intelligence levels. Very few humans are able to design top-tier real time renderers. AI will get there, but it's a vastly more complex task than offline image generation. But yes, AI today already threatens every technical game artist the same way it does the film and animation industries, and will likely be the dominant producer of assets in a couple years. In the near term, humans might still make hero assets, but every rock, tree, and building in the background will be AI. Human writers and voice actors might still voice the main character, but in an RPG with 500 background characters and a million lines of dialog, it is cheaper and higher quality for AI to write and voice generic dialog.
  21. A few years ago I said something here on this forum about how AI could even replace wedding photographers/videographers. My point was that I didn't know what the technology would look like, but it would eventually be possible. My wild brainstorm was something along the lines of setting up a dozen video cameras, then having AI use that information to generate a whole edit, with closeups and clear audio, nicer lighting, etc. That tech doesn't look far off now. Budget weddings won't even need the video cameras, just a couple photos and a description of what happened. It won't be "real," but will the influencers and influencees care?
  22. It's technically trivial to make a timecode input that writes TC metadata in the file header; a rough sketch of the arithmetic is below. (Source: I've created timecode readers on Arduinos, Android apps, and webpages to keep virtual production units synced.) I think it's a lack of interest among customers. And to be clear, accurate timecode generators require more specialized hardware than your average off-the-shelf board--I'm talking about timecode input, which is what people mostly use on cameras since the generator is either an audio recorder or a dedicated TC generator. I see these prices and am reminded that the demand side of supply/demand is more important on tech items like that.
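The arithmetic a timecode input has to write, as a minimal non-drop-frame sketch. The exact header field depends on the container (QuickTime-style timecode tracks, for example, store the start as a frame count), and the function names here are just mine:

```python
def timecode_to_frames(tc: str, fps: int) -> int:
    """Convert "HH:MM:SS:FF" to an absolute frame count (non-drop-frame)."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(n: int, fps: int) -> str:
    ff = n % fps
    ss = (n // fps) % 60
    mm = (n // (fps * 60)) % 60
    hh = n // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# A camera would stamp the first recorded frame with whatever the TC input reads:
start = timecode_to_frames("01:23:45:12", fps=24)    # 120612
assert frames_to_timecode(start, 24) == "01:23:45:12"
```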
  23. You're right-- I knew I had that in my head from somewhere, but for some reason I missed it when I skimmed the article again yesterday. Apologies for my share of misinformation!
  24. I was referring to audio. Octopus can use any random audio interface with USB, which is also very common. Are you sure? If so then you're right, it's not quite the same. I'd be kinda surprised if it doesn't have internal scratch audio though. I can't remember seeing anything about it not having any--but I might have missed something! Anyway, a USB dongle for a 3.5mm scratch mic is what, $10? So clearly you and many other people need 3.5mm and scratch audio at a minimum and I'm certainly not dismissing that. My point is that I don't. For me it's either true scratch audio, or I'm running a real recorder. I'm happy with adding a dongle and scratch mic if I need to. I'd love timecode input, but there also aren't many (any?) options <$1k that have TC in. Digital Bolex didn't. I'd say wait and see what their API looks like. If you can have some simple code that reads LTC from a USB dongle (a simplified sketch of what that involves is below), then I'd say that's already better than almost everything else at the price, including Z Cam (which I say as a Z Cam owner with the ridiculous dongle, haha). I guess I just think you're expecting too many big-production features considering the price they're targeting, while glossing over the unique feature that they have instead--and I mean truly no-one-else-has-it unique. All I'm saying is I'd trade internal audio for USB audio, but I don't expect everyone else to. Yeah, 100% this camera is not for the kind of productions you usually work on.
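The simplified sketch of reading LTC from audio (my own illustration, not Octopus' API): classify the biphase-mark transition intervals, find the sync word, and read the BCD fields. A real reader also needs clock recovery, reverse-play handling, and the drop-frame flag.

```python
import numpy as np

SYNC = [0,0,1,1,1,1,1,1,1,1,1,1,1,1,0,1]  # LTC sync word (bits 64-79)

def transition_intervals(samples):
    """Sample counts between zero crossings of the LTC audio signal."""
    crossings = np.where(np.diff(np.sign(samples)) != 0)[0]
    return np.diff(crossings)

def intervals_to_bits(intervals, bit_period):
    """Biphase-mark: one full-period gap is a 0; two half-period gaps are a 1."""
    bits, half_pending = [], False
    for t in intervals:
        if t > 0.75 * bit_period:
            bits.append(0)
            half_pending = False
        elif half_pending:
            bits.append(1)
            half_pending = False
        else:
            half_pending = True
    return bits

def bcd(bits, start, length):
    # LTC fields are transmitted least-significant bit first
    return sum(b << i for i, b in enumerate(bits[start:start + length]))

def decode_frames(bits):
    """Scan for sync words and decode each 80-bit LTC frame to HH:MM:SS:FF."""
    out = []
    for i in range(max(0, len(bits) - 79)):
        if bits[i + 64:i + 80] == SYNC:
            f = bits[i:i + 80]
            frames  = bcd(f, 0, 4)  + 10 * bcd(f, 8, 2)
            seconds = bcd(f, 16, 4) + 10 * bcd(f, 24, 3)
            minutes = bcd(f, 32, 4) + 10 * bcd(f, 40, 3)
            hours   = bcd(f, 48, 4) + 10 * bcd(f, 56, 2)
            out.append(f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}")
    return out

# Usage, assuming 48 kHz mono samples of 24 fps LTC in a NumPy array `audio`:
# bits = intervals_to_bits(transition_intervals(audio), bit_period=48000 / (80 * 24))
# print(decode_frames(bits))
```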
  25. I don't think the target audience is the "film industry" to be honest. Probably more likely to find its way into the hands of tinkerers, auteurs who already have other cameras, and maybe some very specific applications like low budget crash cams when you can't afford Komodos. Z Cam relied on 3rd party monitors from day 1. Their standard was HDMI for video; Octopus' is USB for audio. Both are consumer connectors with extremely wide support among consumer products. It's worth pointing out that this Octopus camera has the same audio and TC solution as the OG BMPCC and most hybrid cams, except that it can also record synced, bit-perfect audio from your favorite Zoom recorder (or thousands of low budget audio interfaces). I don't necessarily disagree with you, IronFilm, but it's a different audience and their tradeoff was in favor of smaller size, weight, and price. So I wouldn't use the term "huge middle finger" personally; more like they decided on a different audience and feature set.