
24p is outdated


zlfan

On 12/8/2023 at 8:32 PM, fuzzynormal said:

But, seriously, how could AI ever surpass some gal or guy that has earned wisdom (plus the context of it) and knows how to use that experience artistically?

Well the most obvious reasons it can are: (i) because AI has almost 100 years of data (i.e. film history) to draw from, and (ii) because it likely has (or eventually will have) access to data mined from all these programs we're using right now to make our art. I mean why else do you think some of these apps with these fantastic tools are being offered for us to use for free? And remember AI doesn't need the entirety of that data - it just needs a large enough sample size to crack the code.

The mistake we continue to make as humans is thinking that the things that make us complex cannot be reduced to 1s and 0s. But it totally can, if given enough data.

And again, none of this will ever end our human need to create or be creative. It will however make it harder for us to monetize our creativity in economically profitable and sustainable ways.

Link to comment
Share on other sites


Sure, AI will have data.

But it's inherently backward-looking. It's not going to be an innovator; at best, it produces mash-ups of existing auteurs.

You have to prompt it with whose work you want to plagiarise, and it's really good at copying that.

It can't innovate anything new. All it can do is regurgitate the old.

Before you had Kubrick, how would you tell it to emulate a Kubrick sensibility?

I absolutely agree with @Ty Harper that with enough data it will be able to differentiate the movies that got nominated for an academy award from those that didn't, those that did well in the box office from those that didn't, etc.

What it won't be able to do, at least not by analysing only the finished film, is know that the difference between one movie's success and the next one's failure is that the director of the first was connected in the industry while the second lacked that level of influence.  But if we give it access to enough data, it will know that too, and it will tell a very uncomfortable story about how highly nepotism ranks in predicting individual success...
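As a toy illustration of that "enough data" point, here is a minimal sketch of how even a trivial nearest-centroid classifier starts separating nominated films from the rest once it has labelled examples. All the numbers and feature names below are invented for the example:

```python
# Hypothetical features per film: (runtime_hours, critic_score, marketing_spend_m)

def centroid(rows):
    """Mean of each feature column."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Made-up training data for the sketch.
nominated = [(2.5, 91, 40), (2.1, 88, 25), (2.8, 95, 60)]
passed_over = [(1.6, 55, 80), (1.9, 62, 120), (1.7, 48, 30)]

c_yes, c_no = centroid(nominated), centroid(passed_over)

def predict(film):
    """Label a new film by whichever group's centroid is closer."""
    return "nominated" if dist2(film, c_yes) < dist2(film, c_no) else "passed over"

print(predict((2.4, 90, 50)))  # resembles the nominated cluster
```

Real systems would use far richer features and models, but the principle is the same: with enough labelled examples, the separation falls out of the data.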

I also agree with @JulioD that the wisdom will be backwards-looking, but let's face it, how many of the Hollywood blockbusters are innovative?  Sure, there is the odd tweak here or there that is enabled by modern production techniques, and the technology of the day changes the environment that stories are set in, but a good boy-meets-girl rom-com won't have changed much in its fundamentals because humans haven't changed in our fundamentals.

Perhaps the only thing not mentioned is that while AI will be backwards looking, and only able to imitate / remix past creativity, humans inevitably use all the tools at their disposal, and like other tools before it, I think that AI will be used by a minority of people to provide inspiration for the creation of new things and new ideas, and also, it will give the creative amongst us the increased ability to realise our dreams.

Take feature films for example.  Lots of people set out to make their first feature film but the success rate is stunningly low for which ones get finished.  Making a feature is incredibly difficult.  Then how many that do get made are ever seen by anyone consequential?  Likely only a small fraction too.

Potentially these ideas might have been great, but those involved just couldn't get them finished, or get them seen.  AI could give everyone access to this.  It will give everyone else the ability to spew out mediocre dross, but that's the current state of the industry anyway isn't it?  YT is full of absolute rubbish, so it's not like this will be a new challenge...


22 hours ago, kye said:

No, it's not an echo chamber, and people are free to have whatever perspectives they want.

But take this thread as an example.  It started off by saying that 24p was only chosen as a technical compromise, and that more is better.  

Here we are, 9 pages later, and what have we learned?

  • The OP has argued that 60p is better because it's better.  What does better even mean?  What goal are they trying to achieve?  They haven't specified.  They've shown no signs of knowing what the purpose of cinema really is.
  • You prefer 60p.  But you also think that cinema should be as realistic as possible, which doesn't make any sense whatsoever.  You are also not interested in making things intentionally un-realistic.
  • Everyone else understands that 24p is better because they understand the goal is for creative expression, not realism.

If we talk about literally any other aspect of film-making, are we going to get the same argument again, where you think something is crap because you have a completely different set of goals to the rest of us?

Also, the entire tone from the OP was one of confrontation and arguing for its own sake.  Do you think there was any learning here?

I am under no illusions.  I didn't post because I thought you or the OP had an information deficit, but were keen to learn and evolve your opinion.  I posted because the internet is full of people who think technical specifications are the only things that matter and don't think about cameras in the context of the end result, they think of them as some sort of theoretical engineering challenge with no practical purpose.

A frequently quoted parallel is that no-one cared about what paint brushes Michelangelo used to paint the Sistine Chapel except 1) painters at a similar level who are trying to take every advantage to achieve perfection, and 2) people that don't know anything about painting and think the tools make the artist.

I like the tech just as much as the next person, but at the end of the day "better" has to be defined against some sort of goal, and your goal is diametrically opposed to the goal of the entire industry that creates cinema and TV.  Further to that, the entire method of thinking is different too - yours is a goal to push to one extreme (the most realistic) and the goal of cinema and TV is to find the optimum point (the right balance between things looking real and un-real).

Agreed. Not to mention the OP probably couldn't tell the difference between 24p and 48p (the claim that Titanic was 48p turned out to be factually wrong).

I also find it funny when people think 50/60p is anything close to real life, as 50p is still heavily compromised technologically. To the human eye it is certainly much smoother than 24p, but it also has this weird motion that sits somewhere between 100p and 24p, where the footage ends up looking less real and somehow hollow compared to the cinema standard. If you want your production to have a videogamey/behind-the-scenes/soapy/whatever look, then go ahead. But no, it doesn't look real.

When we talk about realism that can fool the eye, it starts at 100 fps minimum. Yet I think none of the cinema projectors currently in use can technically show 100 fps material, and most TVs in use don't have the ability either. YouTube is capped at 60 fps, not to mention streaming services. There is a long way to go.
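For reference, the per-frame display times behind these rates are easy to work out. A quick back-of-envelope sketch (the NTSC-derived rates are exact 1000/1001 fractions of the round numbers):

```python
from fractions import Fraction

# Common picture rates; the NTSC-derived ones are 1000/1001 fractions.
rates = {
    "cinema 24p": Fraction(24),
    "NTSC film 23.976p": Fraction(24000, 1001),
    "PAL 50p": Fraction(50),
    "NTSC 59.94p": Fraction(60000, 1001),
    "HFR 100p": Fraction(100),
}

for name, fps in rates.items():
    ms_per_frame = 1000 / float(fps)  # how long each frame stays on screen
    print(f"{name:>18}: {float(fps):7.3f} fps = {ms_per_frame:5.2f} ms/frame")
```

So each 24p frame sits on screen for roughly 42 ms, versus 10 ms at 100p, which is a big part of why their motion reads so differently.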

But once we are there, even then 24p will have its place because, as has been said multiple times in this thread, people experiencing movies crave an escape from reality, and 24p is perfect for that.

But don't tell me 50p = realism.

 


11 hours ago, Jedi Master said:

This couldn't be further from the truth. As someone who designs CPUs for a living, I probably know a little more about this than most. Yes, all computers perform binary logical and arithmetic operations, but modern ones are far more sophisticated than a pocket calculator, and it's not just speed.

It doesn't take much to implement a pocket calculator. One of the first, the HP-35, used a 1-bit CPU with a serial ALU. More recent calculators tend to use more general-purpose CPUs, but the sophistication needed is not that great. By contrast, modern desktop and laptop CPUs have 64-bit data paths, can address gigabytes of memory, run at multi-gigahertz speeds, are superscalar (can execute more than one instruction per clock cycle), and implement sophisticated branch prediction and speculative execution of instructions. They implement virtual memory, hyperthreading, virtualization, multiple SIMD instruction set extensions, floating-point coprocessors, PCI Express, and DDR4 and DDR5 memory interfaces, and have megabytes of on-chip cache. Some even have hardware support for encoding and decoding H.264/H.265, ProRes, and VP9. Yes, modern desktop and laptop CPUs have multiple cores, but not billions of them (four to 24 cores is typical). What they have billions of is transistors.

Comparing a modern CPU with a pocket calculator is like comparing a Ford Model T with a Lamborghini. 

Supercomputers used to be very fast single-core machines (like the Cray-1), but modern ones use thousands of the same CPUs and GPUs used in desktop PCs. These computers are increasing in power and sophistication every year, and combined with the advancements in AI, will soon be able to do things no one dreamed of ten years ago.

The human brain, by contrast, isn't as fast as a modern computer, but is massively parallel in a way that's as yet unmatched by even the most powerful supercomputers. We can still do things computers can't, but the gap is closing.

Interesting stuff, no doubt : ) But I think there's something you've missed in kye's entry: the figurative speech : ) And no, I'm not saying you're a machine :- ) On the contrary, I praise your human input. People comment on people. I'm not so sure the same will hold for AI : D

; )

The same applies to the world of affections and to reading between the lines, where art, for instance, pops up. Humans are able to do that.

Will machines be able to compete at the same level? I have my doubts.

There's a subliminal world that I still think it's science fiction to expect computers to understand and incorporate.

Psychology (which people at the film school I attended studied) tells us there are unconscious and subconscious components of the human mind.

Soul... : X Even atheism has no name for it... LOL ; )

Will they be able to match it?

I have my sincere doubts this will ever be possible.

(yeah, the territory where the philosophical problem of faith flows too, and philosophy is not an empty science, and neither is psychology ;- )


Also, sorry, me again (better to leave it written here before any hater comes asking me to learn the basics of psychology rather than only the English language LMAO : D), but those Wikipedia entries are too poor, and since I guess that in a filmmaking forum psychology makes all the sense, not only technology, here is a better one:

https://diversity.social/unconscious-vs-subconscious/

I'm actually inclined to see this whole talk about AI as lacking input from other fields of science, as if we only had the exact sciences... and that is a big failure IMO :- )

 


I don't understand why this is so hard for some to understand. Just watch the footage. The more frames per second, the more it looks like a Broadway show in a theater, not a movie like the ones most of us grew up with. Are there technical advantages to "seeing more data"? Yes, absolutely. Is that "better"? No, not always.

Here's the rub: if a film takes you out of the story, it's bad. That's exactly what happens when I watch high frame rate stories. Is the inverse true? When watching 24p, does it take you out of the story? I'm rather certain it almost never does.


12 hours ago, JulioD said:

But it's inherently backward-looking. It's not going to be an innovator; at best, it produces mash-ups of existing auteurs.

You have to prompt it with whose work you want to plagiarise, and it's really good at copying that.

It can't innovate anything new. All it can do is regurgitate the old.

Do you believe that humans have a non-physical and/or magical ability to innovate using information outside of that which we learned? Human thoughts are also mashups of our experiences. We start with nothing and gradually take in information during our lifetime.


On 12/9/2023 at 2:39 AM, Jedi Master said:

I do admit that as someone in a highly technical career field, and who is involved at a hobby level in lots of things that mostly involve tech, I'm probably much less artistically inclined than most people here. Where some see art, I see the tech behind the art, and that is what interests and fascinates me. I'd much rather read the ACES specification, or the history behind why it's 23.976 fps rather than 24 fps, than read someone's opinion on why Citizen Kane is a good or a great movie. Where some look at a Vermeer painting and wax lyrical about its artistic merit, I wonder how the heck he nailed the perspective in his paintings so accurately and whether he used mechanical or optical aids. That's just my nature.

Unrealistic art is just as technically challenging as realistic art. A lot of research, experimentation, and programming went into the shaders and tools used to make Up, which is highly impactful emotionally. The technique of creating emotion is itself technical. As a casual viewer, it's completely fine to hold the opinion that the more frames the better. It's your opinion. However, for someone who is creating movies, whether on a personal art project or as one of hundreds of professionals on a blockbuster, one tool cannot simply be better than another. Different frame rates create different emotions, the same way different focal lengths do. Saying 60p is better is like saying horror movies are better than romances.

On the topic of horror, however, it's well known that one of the critical elements of horror is not showing the monster. Obscuring the monster through camera angles and shadows is a critical element of scaring people. That's not an artistic note; it's simply scarier. In fact, most effective storytelling is about saying just enough that the primary story takes place in the audience's head. If you disagree with that, then I'm not even sure fiction is something you enjoy, which is perfectly okay, but it also renders everything else moot!

When people say 24p has a dreamy effect, another way to say it is that giving the audience less information allows them to create more in their head.

 

 

Something else I will add to the discussion about 24p vs 60p is that I have never seen a really good movie shot in 60p. By that I mean I have never seen a movie with a top-class story, lighting, direction, editing, and acting that is also 60p. It's hard to compare The Lord of the Rings with The Hobbit on the merit of frame rate alone, because I bet that, all else being equal, a 48p Lord of the Rings would be more enjoyable than a 24p Hobbit.


11 hours ago, Emanuel said:

Will they be able to match it?

I have my sincere doubts this will ever be possible.

And I have no doubt that it will eventually be possible. The human brain is just a very sophisticated, massively parallel chemical computer, but there's nothing about it that couldn't be implemented given sufficiently advanced technology. We're not there yet, but progress is being made.


Sigh : )

 

2 hours ago, Jedi Master said:

And I have no doubt that it will eventually be possible. The human brain is just a very sophisticated, massively parallel chemical computer, but there's nothing about it that couldn't be implemented given sufficiently advanced technology. We're not there yet, but progress is being made.

To mimic the human unconscious/subconscious?? Each person is unique. A machine is soulless. No binary digital language/system/world looks able to match that uniqueness.

The same way one animal species is not able to be another one. Science estimates a range from 3 million to 100 million species, and rising. Only about 1.7 million species had been identified the last time the count was announced...

Demographics say 109 billion people have lived over the course of 192,000 years. Has anyone ever succeeded in not being able to distinguish one person from another? Will it ever be possible?

Singularity is part of the universe and of this equation, just as art is impossible to impersonate as original when fake. It becomes a formula, no longer art.

I just guess you're watching too many Hollywood movies... : D

- EAG :- )


*Yes, the plural of species is species. However, we can use it with "are" as a plural as well as with "is" in the singular. If we place an "a" before it, it is no longer possible to use it as a plural ; )

The idea that art can stop being singular is actually a contradiction of what the concept of art is.

Is AI the killing of art itself?

Should we see AI as a threat to art, then?

I solidly don't think so. A mere tool. Just another one. No reinventing of the wheel : P

TBH, all this talk seems to me more like a geek meetup than anything else. It looks like they've all been alerted to this thread LOL : D Or else filmmaking today is invaded by preachers of machines rather than by those who realize what movies are really made of. Very different from writing "made from" (I am on my English basics LMAO ; ) Not (only) technology per se, count on it! :- )


4 hours ago, Emanuel said:

To mimic the human unconscious/subconscious?? Each person is unique. A machine is soulless. No binary digital language/system/world looks able to match that uniqueness.

Never say never. People who do usually end up eating their words eventually.

What is a “soul” anyway?

Yes, consciousness will be a tough nut to crack, especially since we don’t even know yet what causes it in animals. 

Look how far we’ve come in just a few decades. In the mid-1970s a CPU had a few thousand transistors. Today they have tens of billions. In the 1960s we had ELIZA. Today we have ChatGPT.
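Those two endpoints imply a doubling rate roughly in line with Moore's law. A back-of-envelope check, using order-of-magnitude assumptions (~5,000 transistors in the mid-1970s, ~20 billion today):

```python
import math

# Rough endpoints from the post above (order-of-magnitude assumptions).
early, today = 5_000, 20_000_000_000
years = 2023 - 1975

# Number of doublings needed to get from 'early' to 'today'.
doublings = math.log2(today / early)
print(f"{doublings:.1f} doublings, one every {years / doublings:.1f} years")
```

That works out to roughly 22 doublings over ~48 years, i.e. one doubling about every two years, which is the usual statement of Moore's law.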


14 hours ago, KnightsFan said:

Do you believe that humans have a non-physical and/or magical ability to innovate using information outside of that which we learned? Human thoughts are also mashups of our experiences. We start with nothing and gradually take in information during our lifetime.

Of course no art is created in isolation.  It’s always affected by what came before and what others are doing. 

AI, though, is inherently inward-looking. It can ONLY be based on what's gone before.

It inherently can only copy or emulate as a mash-up. And even then, it can only do so through the right prompting.

In traditional painting apprenticeships, the students would copy the works of masters.

Then they would make new works.

AI can't do that. It can only copy, paste, and mash up.


7 hours ago, Jedi Master said:

Never say never. People who do usually end up eating their words eventually.

What is a “soul” anyway?

Yes, consciousness will be a tough nut to crack, especially since we don’t even know yet what causes it in animals. 

Look how far we’ve come in just a few decades. In the mid-1970s a CPU had a few thousand transistors. Today they have tens of billions. In the 1960s we had ELIZA. Today we have ChatGPT.

What is a soul? Good question...

Nothing we have today surprises me. Really : ) It's natural technological evolution. So what?

You mean consciousness will be hard work... Right : ) I meant the unconscious and subconscious territories of the mind. No tangible reading by physical means, which includes machines, is possible there at all.

As I said, I think ignoring human-science lenses such as psychology or philosophy, among others, doesn't help... ; )

Let's see if we'll ever end up eating our words ;- )


On 12/10/2023 at 12:51 AM, kye said:

I also agree with @JulioD that the wisdom will be backwards-looking, but let's face it, how many of the Hollywood blockbusters are innovative?  Sure, there is the odd tweak here or there that is enabled by modern production techniques, and the technology of the day changes the environment that stories are set in, but a good boy-meets-girl rom-com won't have changed much in its fundamentals because humans haven't changed in our fundamentals.

Hollywood has always been and continues to be formulaic. Occasionally some new things are introduced, like influences from New Wave or graphic novels, but then those aspects just become part of the palette of options. 

