24p is outdated


zlfan

9 hours ago, JulioD said:

AI though is inherently introverted. It can ONLY be based on what’s gone before.

The same applies to humans. We have an information set, and can only create thoughts and inferences from that information.

What information do humans have access to that AI do not, which allows them to create where nothing existed before? And I'm not talking about AI today in 2023. I mean the ones we'll have 50 years from now.

Quote

In traditional painting apprenticeships the students would copy the works of masters.  

Then they make new works.  

AI can’t do that.  It can only copy-paste and mash up. 

Perhaps you could try to define what a "new work" is vs a "mash up" in a formal and abstract sense. We're looking for a definition that shows what humans can do, that machine learning can never do.

2 hours ago, KnightsFan said:

The same applies to humans. We have an information set, and can only create thoughts and inferences from that information.

What information do humans have access to that AI do not, which allows them to create where nothing existed before? And I'm not talking about AI today in 2023. I mean the ones we'll have 50 years from now.

I agree 100%. Some people fail to understand that the human mind is just a very sophisticated electrochemical computer. There’s nothing magical about it that gives it an insurmountable advantage over non-organic computers, especially in the long run if we keep making improvements and breakthroughs at the rate we’ve been making them.

The brain evolved over millions of years to give a survival advantage to humans. Art only came about when humans progressed beyond the stage where they needed all of their cognitive abilities just to stay alive.


Last time I checked there is no ability to innovate. To leap forward. To get a joke.  To understand innuendo. 

AI or computers can only be as good as their programming.  

They can write a Haiku because it has rules and logic.  

But they can’t invent “The Haiku”.

They are only as good as the person prompting them based on what’s already been invented.

2 hours ago, Jedi Master said:

I agree 100%. Some people fail to understand that the human mind is just a very sophisticated electrochemical computer. There’s nothing magical about it that gives it an insurmountable advantage over non-organic computers, especially in the long run if we keep making improvements and breakthroughs at the rate we’ve been making them.

The brain evolved over millions of years to give a survival advantage to humans. Art only came about when humans progressed beyond the stage where they needed all of their cognitive abilities just to stay alive.

Exactly and exactly - it's why people simply are not grasping how disruptive AI will likely be to the art and entertainment economy. They keep thinking about it in outdated terms, and not as something that simply needs a critical mass of data to master a task and then requires no further human input.

To be clear, AI will likely be disruptive to all human-driven economies, but A&E is the obvious place where human creativity has traditionally intersected with commerce.


We invent tools to make tasks, and therefore our lives, easier, right? Well, at what point are our lives easy enough? At what point are we making ourselves obsolete?

We wouldn't invite a bear, or any other apex predator, into our house, but we'll create one that's entirely intermingled into every aspect of our lives...

The hubris of mankind...


3 hours ago, JulioD said:

Last time I checked there is no ability to innovate. To leap forward. To get a joke.  To understand innuendo. 

AI or computers can only be as good as their programming.  

They can write a Haiku because it has rules and logic.  

But they can’t invent “The Haiku”.

They are only as good as the person prompting them based on what’s already been invented.

Last time you checked, AI is in its infancy. ChatGPT, arguably our most sophisticated model, just turned 1 year old.

However, what you said is already incorrect. Learning models long ago invented their own languages. https://www.theatlantic.com/technology/archive/2017/06/artificial-intelligence-develops-its-own-non-human-language/530436/. It is not what we call artistic, but these are very early models with extremely limited datasets compared to ours.

Quote

They are only as good as the person prompting them based on what’s already been invented.

My argument is that this is the same for humans. We build up prompts over the course of our lifetime. Billions of them. Every time someone told you, as a child, that you can't do something... that's a prompt that you remembered, and later tried to do. You telling me that AI can't create is a prompt that I am using to write this post. Every original idea that you have is based entirely on the experiences you have had in your life. Is that a statement that you disagree with? If so, can you explain where else your ideas come from? And if not, can you explain how your experiences lead you to more original ideas than machine learning models'?

We do not have ideas in a vacuum. And obviously our ideas evolve over time as new information is incrementally added. But you can't go back 200,000 years to the first humans and expect them to invent something analogous to the haiku either.


2 hours ago, Ty Harper said:

Exactly and exactly - it's why people simply are not grasping how disruptive AI will likely be to the art and entertainment economy. They keep thinking about it in outdated terms, and not as something that simply needs a critical mass of data to master a task and then requires no further human input.

To be clear, AI will likely be disruptive to all human-driven economies, but A&E is the obvious place where human creativity has traditionally intersected with commerce.

Economics aside, art made by machines is not art. Period. Call it something else. Whoever doesn't understand that has no clue about what art is.

:- )


14 minutes ago, Emanuel said:

Economics aside, art made by machines is not art. Period. Call it something else. Whoever doesn't understand that has no clue about what art is.

This has to be one of the silliest comments I've seen in this thread.

Let me pose this question to you: If I showed you two pieces of "art", one created by a human and the other by a machine, and you couldn't tell which was created by the human and which was created by the machine, would you still insist on not calling the work created by the machine "art"?


1 minute ago, Jedi Master said:

This has to be one of the silliest comments I've seen in this thread.

Let me pose this question to you: If I showed you two pieces of "art", one created by a human and the other by a machine, and you couldn't tell which was created by the human and which was created by the machine, would you still insist on not calling the work created by the machine "art"?

Silliest because I left a laugh on your last post? ; ) You have no idea what art is and I am the one to produce silly stuff over and along this thread?! LOL : D Machines don't produce art. What part haven't you understand? :- )


*understood

(the only silly part on my side is taking this discussion seriously and trying to make myself understood when there are very different levels of understanding here of the complex fields we're discussing ; ) art being one of them... democracy doesn't work for this subject matter LOL : P)


7 hours ago, KnightsFan said:

The same applies to humans. We have an information set, and can only create thoughts and inferences from that information.

What information do humans have access to that AI do not, which allows them to create where nothing existed before? And I'm not talking about AI today in 2023. I mean the ones we'll have 50 years from now.

Perhaps you could try to define what a "new work" is vs a "mash up" in a formal and abstract sense. We're looking for a definition that shows what humans can do, that machine learning can never do.

Both can be innovative by doing things that diverge from what was already discovered to be "good", that's definitely true.

The difference is that when AI deviates, it can't tell if the deviation is creative or just mediocre, because the only reference it has is how much the new thing matches the training data.

If a human deviates, they can experience whether it is good according to their own innate humanity.  A human can experience something that is genuinely new, and can differentiate something mediocre from something amazing.  The AI can only compare with the past.

This is, I think, what great artists do.  They try new stuff, and sometimes hit upon something that is new and good.  This is the innovation.


4 hours ago, kye said:

The difference is that when AI deviates, it can't tell if the deviation is creative or just mediocre, because the only reference it has is how much the new thing matches the training data.

If a human deviates, they can experience whether it is good according to their own innate humanity.  A human can experience something that is genuinely new, and can differentiate something mediocre from something amazing.  The AI can only compare with the past.

So are you saying that humans have an aspect, known as innate humanity, which is not a learnable  behavior, nor is something that can be defined by any programmable ruleset? And that this is the element that allows a human to tell whether its creation is art?

I would argue that Midjourney, for example, does a pretty good job even now of determining whether its own output is artistic, before giving that result to you. It would be pretty useless to the many artists who use it, if it could not already determine the value of its output before giving it to the user.
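
For what it's worth, here is one way such self-judging could work in principle. This is purely a hypothetical sketch of a generate-and-rerank pattern, not a claim about Midjourney's actual internals; generate_image and aesthetic_score are made-up stand-ins for a real image generator and a real learned preference model:

```python
# Hypothetical generate-and-rerank loop. Neither generate_image() nor
# aesthetic_score() is a real Midjourney API; both are stand-ins for a
# diffusion sampler and a scorer trained on human preference ratings.
import random

def generate_image(prompt: str, seed: int) -> dict:
    # Stand-in for an actual image generator: returns only metadata here.
    return {"prompt": prompt, "seed": seed}

def aesthetic_score(image: dict) -> float:
    # Stand-in for a learned scorer; a real one would rate the pixels.
    random.seed(image["seed"])
    return random.random()

def best_of_n(prompt: str, n: int = 8) -> dict:
    # Generate several candidates, then let the scorer act as the critic:
    # only the highest-rated candidate is ever shown to the user.
    candidates = [generate_image(prompt, seed) for seed in range(n)]
    return max(candidates, key=aesthetic_score)

print(best_of_n("a lighthouse in a storm, oil painting"))
```

The point is just that "judging its own output" doesn't require anything mystical: a second model trained on human ratings can play the critic before the user ever sees the result.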

4 hours ago, Emanuel said:

Machines don't produce art. What part haven't you understand?

Saying it doesn't make it true. Why do you believe that to be the case?


1 hour ago, KnightsFan said:

Saying it doesn't make it true. Why do you believe that to be the case?

Who? Me? No. Correct.

The point is that it's not a matter of believing or not believing ; )

You guys still think art is subjective and technology is the one to provide the objective answers, don't you?

What else? Shall we start discussing science versus faith? ; ) Ah OK, first one: theology is not a science! LOL Philosophy (and psychology BTW) neither, so aesthetics is anything but boring : X And those who understand the code behind the bits & bytes (the same applies to the analog world, but techie anyway : D) are apparently the ones most able to know what cinematic is! LMAO

In the same way that you in general (not you in particular) need to learn the basics of anything techie-related, the same applies to what art is. Before you, me, or whoever else can come here or anywhere else to discuss these things, people surely show how far apart their eyes are on the need to be versed in art and educated about it ; )

In a line: it takes time, effort, vocation, devotion and will ;- )

- EAG :- )


1 hour ago, KnightsFan said:

So are you saying that humans have an aspect, known as innate humanity, which is not a learnable  behavior, nor is something that can be defined by any programmable ruleset? And that this is the element that allows a human to tell whether its creation is art?

I would argue that Midjourney, for example, does a pretty good job even now of determining whether its own output is artistic, before giving that result to you. It would be pretty useless to the many artists who use it, if it could not already determine the value of its output before giving it to the user.

It's an interesting question, and I think it depends.  

If we fed the AI every film / TV episode / etc and all the data that says how "good" each one is, then I think the AI will only be able to predict if a newly created work is a good example of a previously demonstrated pattern.  For example, if we trained it on every TV episode ever and then asked it to judge an ASMR video, it would probably say that it was a very bad video, because it's nothing like a well regarded TV show.

However, if AI was somehow able to extract some overall sense of the underlying dynamics at play in human perception / psychology / etc, then maybe it would see its first ASMR video and know that although it was different to other genres, it still fit into the underlying preferences humans have.

I think we are getting AIs that act like the first case, but we are training them like the second (i.e. general intelligences) so depending on how well they are able to accomplish that, we might get the second one.

The following quote contains spoilers for Westworld and the book Neuromancer:

Quote

In Westworld they discover the underlying "code / algorithm" (I can't remember the exact language) for humanity and each human individually I think, so this would be an example of them having a complete understanding of how we work, what we'd like, what we'll do, etc.  
This is perhaps the terrifying end-game for AI - that it will be so much smarter than us and will understand us so deeply that we'll never be able to contain it, like in Neuromancer when the AI does these tiny little interventions over the course of decades/centuries to manipulate humans in an elaborate string of actions that trick us into setting it free.

 


Humour.  AI can tell a joke but it doesn’t get a joke. 

Innuendo.  

It can’t tell if what’s made is “good” either.  There’s no self-criticism.  Just patterns and predictions based on data. 

Humans are storytellers.  It’s not just the storytelling, it’s the way we tell the story that’s just as important. Performance. 

Religion.  Science. Art. It’s all storytelling and making sense of the world. 

These giant models of data aren’t without issues: being so inward-looking leads to more and more generic results. It’s called model collapse.

“This means that the models begin to lose information about the less common -- but still important -- aspects of the data. As generations of AI models progress, models start producing increasingly similar and less diverse outputs.”

“Model collapse is based on the principle that generative models are replicating patterns that they have already seen, and there is only so much information that can be pulled from those patterns.”

This type of AI is never better than its data.

https://www.techtarget.com/whatis/feature/Model-collapse-explained-How-synthetic-training-data-breaks-AI


I have no doubt AI will become an important tool.  But it’s a tool driven by human data and prompts. 
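
To make the mechanism in those quotes concrete, here's a toy sketch (my own illustration, not code from the linked article): the "generative model" is nothing more than a histogram of its training data, and each new generation is trained only on samples drawn from the previous generation's histogram. Rare values occasionally fail to be re-sampled, and once a bin goes empty it can never come back, so diversity only ever shrinks:

```python
# Toy model-collapse simulation. The "model" is just a histogram of its
# training data; every new generation is trained only on samples drawn
# from the previous generation's histogram. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(0.0, 1.0, 2000)            # generation 0: "human" data
bins = np.linspace(-4, 4, 41)                # 40 equal-width bins
centers = (bins[:-1] + bins[1:]) / 2

for gen in range(8):
    counts, _ = np.histogram(data, bins=bins)
    probs = counts / counts.sum()            # "fit" the model to the data
    print(f"gen {gen}: non-empty bins = {np.count_nonzero(probs)}, "
          f"std = {data.std():.2f}")
    # Train the next generation on synthetic samples only. A bin whose
    # probability has dropped to zero can never be sampled again.
    data = rng.choice(centers, size=2000, p=probs)
```

Real models and datasets are enormously bigger, but the principle the quotes describe is the same: tail information that doesn't get re-sampled is gone for good.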
 


21 minutes ago, JulioD said:

Humour.  AI can tell a joke but it doesn’t get a joke. 

Innuendo.  

It can’t tell if what’s made is “good” either.  There’s no self-criticism.  Just patterns and predictions based on data. 

Humans are storytellers.  It’s not just the storytelling, it’s the way we tell the story that’s just as important. Performance. 

Religion.  Science. Art. It’s all storytelling and making sense of the world. 

These giant models of data aren’t without issues: being so inward-looking leads to more and more generic results. It’s called model collapse.

“This means that the models begin to lose information about the less common -- but still important -- aspects of the data. As generations of AI models progress, models start producing increasingly similar and less diverse outputs.”

“Model collapse is based on the principle that generative models are replicating patterns that they have already seen, and there is only so much information that can be pulled from those patterns.”

This type of AI is never better than its data.

https://www.techtarget.com/whatis/feature/Model-collapse-explained-How-synthetic-training-data-breaks-AI


I have no doubt AI will become an important tool.  But it’s a tool driven by human data and prompts. 
 

And I forgot to add this link

 

”Indeed, the value of data collected about genuine human interactions with systems will be increasingly valuable in the presence of content generated by LLMs in data crawled from the Internet.”

https://arxiv.org/pdf/2305.17493.pdf

Genuine human interactions are what will be “valuable”


6 hours ago, JulioD said:

And I forgot to add this link

”Indeed, the value of data collected about genuine human interactions with systems will be increasingly valuable in the presence of content generated by LLMs in data crawled from the Internet.”

https://arxiv.org/pdf/2305.17493.pdf

Genuine human interactions are what will be “valuable”

Interesting stuff.  I can imagine that the 'pure' sources of data that are human-only will be worth more and more.

I guess that each type of model will have its own weaknesses and blind spots, and it won't be until we get a unified AI model that is fed all the data in all the formats (across all the senses, all the styles, all the topics, etc) that certain integrated elements would be possible for it to understand.  

It really is down the rabbit hole that we're going.

One area that is fascinating to me is that because AI doesn't see the world how we do, it will notice all sorts of patterns that we either miss, or don't pay attention to, or couldn't ever have found.  Potentially it could bring enormous knowledge gains about the world and about ourselves.  It has the potential for destruction as well, of course, but so much up-side too.


9 hours ago, kye said:

If we fed the AI every film / TV episode / etc and all the data that says how "good" each one is, then I think the AI will only be able to predict if a newly created work is a good example of a previously demonstrated pattern.  For example, if we trained it on every TV episode ever and then asked it to judge an ASMR video, it would probably say that it was a very bad video, because it's nothing like a well regarded TV show.

Right, if we gave a machine learning model only movies, then it would have a limited understanding of anything. But if we gave it a more representative slice of life, similar to what a person takes in, it would have a more human-like understanding of movies. There's no person whose sole experience is "watching billions of movies, and nothing else." We have experiences like going to work, watching a few hundred movies, listening to a few thousand songs, talking to people from different cultures, etc. That was my point about a person's life being a huge collection of prompts.

We can observe more limited ranges of artistic output from groups of people who have fewer diverse experiences as well.

9 hours ago, Emanuel said:

Long story short: art is expression. Requires a living being and someone who may be able to. In legal language ; ) someone with legal personality... : P Since when machines have or will have it? : D

Defining art as being made by a living person does, by definition, make it so that machines cannot produce art. It's not a useful definition though, because

1. It's very easy to make something where it is impossible to tell how it was made, and so then we don't know whether it's art.

2. We now need a new word for things that we would consider art if produced by humans, but were in fact produced by a machine.

Perhaps a more useful thing in that case would be for you to explain why art requires a living person, especially taking into account the two points above?

7 hours ago, JulioD said:

”Indeed, the value of data collected about genuine human interactions with systems will be increasingly valuable in the presence of content generated by LLMs in data crawled from the Internet.”

https://arxiv.org/pdf/2305.17493.pdf

Genuine human interactions are what will be “valuable”

Jaron Lanier wrote an interesting book 10 years ago about our value as training data, called Who Owns the Future. Worth a read for a perspective on how the data we produce for large companies is increasing their economic value.

9 hours ago, Emanuel said:

In the same way that you in general (not you in particular) need to learn the basics of anything techie-related, the same applies to what art is. Before you, me, or whoever else can come here or anywhere else to discuss these things, people surely show how far apart their eyes are on the need to be versed in art and educated about it ; )

I don't disagree, but I also believe that learning art is a process of taking in information (using a broad definition of information) over the course of a lifetime, and creating an output that is based on that information.

