Katrikura

Reputation Activity

  1. Like
    Katrikura reacted to horshack in Analysis of Nikon's new universal N-Log LUT   
    Nikon recently released a new universal N-Log technical LUT for all their cameras, which replaces Nikon's original camera-specific LUTs. The original LUTs handled highlights particularly poorly. In fact they didn't really handle them at all - they inexplicably clipped all N-Log values above ~68% to white.
    Here is my analysis of the new LUT. First here are the tone/gamma curves for both the original vs new LUTs, which I calculated by converting both 3D LUTs to 1D and graphing the resulting curves. Notice how the original LUT clips all highlights above 68% to white, whereas the new LUT has a nice, smooth roll-off.
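For anyone wanting to reproduce this kind of analysis, the 3D-to-1D conversion can be sketched roughly as below. This is a minimal NumPy sketch under my own assumptions (the function name is made up, and it assumes the LUT has already been parsed from its .cube file into an (N, N, N, 3) array); it samples the LUT's neutral R=G=B diagonal, which is one common way to extract a tone curve:

```python
import numpy as np

def tone_curve_from_3d_lut(lut, samples=256):
    """Collapse a 3D LUT to a 1D tone curve by sampling its neutral
    (R=G=B) diagonal.  `lut` is an (N, N, N, 3) array; returns the
    output tone value for `samples` evenly spaced neutral inputs."""
    n = lut.shape[0]
    xs = np.linspace(0.0, 1.0, samples)
    out = np.empty(samples)
    for k, x in enumerate(xs):
        f = x * (n - 1)              # fractional grid position on the diagonal
        i0 = int(np.floor(f))
        i1 = min(i0 + 1, n - 1)
        t = f - i0
        # linear interpolation between adjacent grey-diagonal entries
        rgb = (1 - t) * lut[i0, i0, i0] + t * lut[i1, i1, i1]
        out[k] = rgb.mean()          # collapse RGB to a single tone value
    return out

# Synthetic example: a LUT that clips everything above ~68% to white,
# mimicking the behaviour described for the original camera-specific LUTs.
n = 33
g = np.linspace(0.0, 1.0, n)
clipped = np.where(g > 0.68, 1.0, g)
lut = np.stack(np.meshgrid(clipped, clipped, clipped, indexing="ij"), axis=-1)
curve = tone_curve_from_3d_lut(lut, samples=101)
print(curve[50], curve[90])  # mid-grey passes through at 0.5; 90% input clips to 1.0
```

Graphing `curve` against the input ramp for both the old and new LUTs gives the kind of before/after tone-curve comparison described in the post.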

    Here is a visual comparison showing the original vs new LUT in action. This is a high DR N-RAW capture from a Z6 III, with the original vs new LUT compared, along with a comparison to the same scene using Davinci Resolve's CST with tone mapping.

    For reference, here is how Nikon describes the new LUT in the comments at the top of the LUT file:
    # Development Color Space = REC709
    # Output Gamma Curve = BT1886
    # Output Tonemap = MEDIUM_CONTRAST
    # Highlight Rolloff = Rolloff 2 - medium
  2. Like
    Katrikura reacted to Clark Nikolai in I took a cinema camera on holiday, and it changed everything   
    These are really good. I especially like the ones of the streets at night. The light is where the subject is so it's okay if the rest is too dark to see.
    I know what you mean about things taking time to set up with a cinema camera. I've been working on a project for over a year now that's mostly tripod shots of things in the city. People do ignore you after a while. Some come over to chat about cameras. Very few take issue with it. I guess I look like some photography hobbyist out getting shots. In a tourist and film school town like Vancouver, there are so many people taking pictures and shooting their school projects that it's just more of the same.
    I also learned a trick to not look at what you're shooting. Set it up on a tripod, point it where you want, frame, focus and start rolling but then look away or down at your phone and check your emails or something. Then let the action happen in the frame. If someone looks over at you it looks like you're not rolling because your attention is not where the lens is pointing.
    I also set it at the native ISO and native colour temperature and leave it there. Mostly a 180º shutter (but occasionally longer for more light). Shooting raw is the best way to go; in post you can do pretty much anything.
    Recently I've been letting interior light stay a bit warm and exterior daytime light be bluish. I think it's better than correcting things to be always white. Gives more mood and a sense of the light at the time and place.
     
  3. Like
    Katrikura reacted to kye in I took a cinema camera on holiday, and it changed everything   
    That's all I can think of for the moment, and it'll take a long time for me to digest what I do with this info.
    One final observation was how hot the camera got.
    For the unfamiliar, the BMMCC has huge grills on each side for the integrated fan (and as one of the smallest cameras on the market, it really shows there are no excuses), and for those who haven't experienced it - it pumps out hot air the whole time!
    I previously mentioned that the camera was exposed to sweaty conditions, which was absolutely true (I was drenched with sweat basically the whole time I was outside) but part of that was that the camera was this little hand-warmer blowing hotter than normal air on my hands.
    This really gave me an appreciation of how serious the challenge of camera thermal management is.  I'd suggest that technology has likely gotten better since they made this camera, but the amount of data processing has also gone up, so perhaps that evens it out - I don't know.
    It made me appreciate that I didn't need to worry about other cameras that may well have overheated in these conditions.  
    The heat this gives off, combined with the deep shadows and moody images from shooting in the rain, really made me think this would be a great camera to shoot with in winter.  Not only would the images be moody and cinematic, but it would keep your hands warm!
  4. Like
    Katrikura reacted to kye in I took a cinema camera on holiday, and it changed everything   
    Grenade #6: I worked out how to sharpen and process the footage.
    Prior to the BMMCC, I had only shot with cameras that sharpened (over-sharpened) their footage.  That's the GF3, 700D, XC10, GH5, and GX85.  I got good at blurring the footage to get a nice level of sharpness (a level of sharpness that only the pros seem to be able to achieve - amateurs seem to want to slice everyone's eyeballs - maybe it's self-loathing over not being skilled at using their NLE?) but I could never work out how to sharpen up RAW footage and footage from the OG BMMCC and BMPCC.
    In retrospect I really don't know why I couldn't do this, and it sort of seems silly now, but it was a barrier to my ability to do what I wanted with the footage.  
    (It's not that I wanted to over-sharpen the footage, but in order to grade optimally you need to be able to go too far in order to choose the right amount of something.)

    Push push push....

    More more more!!

    High-end smartphone look achieved!!
  5. Like
    Katrikura reacted to kye in I took a cinema camera on holiday, and it changed everything   
    Grenade #5: OIS with a good implementation is mostly good enough.
    I'm used to IBIS and more recently Dual-IS (which is IBIS and OIS combined) and had previously shot with OIS-only and disregarded it because it doesn't stabilise roll.  I'm also used to the 'tripod mode' on the GH5 which maintains a motionless frame with the IBIS like it's locked-off, which I missed on the GX85 which gives a bit of a floaty image.
    The BMMCC maintains a very steady frame with the OIS, but doesn't fight you if you want to move the frame, so it creates still compositions but also allows smooth movement.
    I can still tell from the footage when I was tired or low in blood-sugar or just generally struggling - it was over 30C / 86F and 70% humidity, so pretty demanding on the arms to hold a very small and lightweight camera perfectly still for hours and hours.  
    To counter the roll motion, I took to using my left hand to 'cup' the camera and lens and put my thumb up onto the side of the monitor, which helped stabilise any roll motion / jitters.  It didn't work perfectly, as in this orientation any rotation of my forearm translates to roll motion in the camera, but for the most part the jitters are small enough that using stabilisation in post can clean them up almost perfectly.  There was still a bit of sudden-blur on the odd frame here or there where there was motion-blur while the shutter was open, but considering the subject matter is a large city with the grunge and grit as well as the glitz and glamour, it's aesthetically appropriate.
  6. Like
    Katrikura reacted to kye in I took a cinema camera on holiday, and it changed everything   
    Grenade #4: Low light cameras and lenses aren't strictly needed.
    This was a surprise, but ISO 800 and an F2.8 lens is (almost) all you need, even for low-light situations.
    The BMMCC has an ISO 1600 mode, but I did a side-by-side and the 1600 shot was still noisier than the 800 shot even after it had been darkened by a stop in post to match exposure.  The BMMCC is a pretty noisy camera though, and I did ETTR whenever I could.
    For the really low-light situations I ended up taking off the vND, which is still something like 1-stop when at its lightest setting, but even that wasn't needed for most situations.
    The secret to making this combo work is to actually have blacks in the frame.  This seems to be something that only professionals do now, amateurs all have milky shadows for some unknown reason - perhaps they think they're emulating film when in fact film would crush your blacks faster than the YT comments section will crush your dreams.
    These are all shot at ISO 800, with me cheating an extra stop by going to a 360° shutter.  After doing that cheat, I ended up dropping exposure significantly in post on most shots, so I probably didn't need to cheat that much:




    These are also shot at ISO 800, but might have been with the 50mm F1.2 prime I also took with me, and were tripod shots:


    (These last two shots also had a WB change in post to get the cool tones, but most of the other shots I've posted had no WB changes in post at all)
    I am contemplating an 85mm F1.4 prime to give extra reach for very low-light situations but it's a pretty speciality thing.
    I also took with me the 7.5mm F2, 17mm F1.4 and 50mm F1.2 primes.  The reason I took these is specifically for very low-light situations like going to a look-out at night and wanting to capture the city lights cleanly and get a little bit of exposure on the things they're illuminating rather than simply getting a black frame with individual pixels lit up by the lights directly.
    Having a dual-ISO camera would really simplify this, because if the second native ISO was 2.5 or 3 stops above then the F2.8 lens would turn the T2.8 exposure into a T1.0 or T1.2 exposure which makes a big difference into reaching into the blackness.
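The stop arithmetic here checks out: each stop of extra ISO is worth a factor of sqrt(2) in f-number. A quick sketch (the helper function is mine, purely illustrative, not from any camera spec):

```python
import math

def equivalent_f_number(f_number, extra_iso_stops):
    """f-number that would gather the same extra light as raising
    ISO by `extra_iso_stops` (each aperture stop = sqrt(2) in f-number)."""
    return f_number / math.sqrt(2) ** extra_iso_stops

print(round(equivalent_f_number(2.8, 3.0), 2))  # 0.99 -> roughly T1.0
print(round(equivalent_f_number(2.8, 2.5), 2))  # 1.18 -> roughly T1.2
```

So a second native ISO 2.5-3 stops up really does make a T2.8 lens behave like a T1.0-T1.2 one for exposure purposes, as the post says.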
    Here's a table of common scenarios, where ISO 800 at T2.8 and 1/50 shutter speed = exposure value of 5.5:

    As you can see, many of the night situations you'd find yourself in are brighter than 5.5 EV, with 6 and 7 being common night values.
    The other thing to take into account is that 5.5EV is probably better than most of us can see at night with our eyes, so if it's too dark to film, you might not be inclined to be there in the first place (unless you enjoy walking down dark alleyways in foreign cities at night with expensive camera gear).
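Since the referenced table didn't survive as an image, the baseline figure can be reproduced from the standard exposure-value relation, EV = log2(N²/t), adjusted for ISO relative to 100. A sketch using nominal f/2.8 (the post's ~5.5 presumably reflects slightly different rounding or true T-stop values):

```python
import math

def scene_ev(f_number, shutter_s, iso):
    """Scene light level, expressed as EV at ISO 100, that a given
    exposure is metered for: EV = log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100)

# The post's baseline: ISO 800, T2.8, 1/50s shutter
print(round(scene_ev(2.8, 1 / 50, 800), 1))  # 5.6, close to the ~5.5 quoted
```

Each +1 EV corresponds to a doubling of scene brightness, which is why the difference between EV 5.5 and the "6 and 7" night scenes matters so much.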
  7. Like
    Katrikura reacted to kye in I took a cinema camera on holiday, and it changed everything   
    Grenade #3: Restricted zoom range and cropped monitoring give scale to compositions.
    This is a combo one.
    After decades of shooting travel and street, I compose on instinct, and if I want to override that I have to concentrate.  Often while shooting my whole capacity is being used elsewhere, so I get what I compose on auto-pilot, but the gotcha is that I compose using the monitor, which is normally 16:9.
    If you compose for 16:9 and later want to crop to a wider ratio, subjects end up too tall in the frame, so it doesn't work.
    Cameras often have completely rubbish crop lines for their display - some are barely visible - let alone strong enough to get my autopilot to use them for compositions.  The BMMCC is different - the frame guides are 100% opacity black bars.  This is perfect because that real estate can still be used to display settings, but isn't displaying misleading parts of the frame.
    So, this part encourages me to not make the subjects too big in the frame, but the 100mm limit of the 12-35mm lens goes even further - it makes far away subjects even smaller in the frame.




    What I find is that this can really give a sense of scale to the world - we tend to measure everything against the size of humans after all.  I've seen people talk about how the greats of cinematography aren't afraid to make the humans small in the frame and how this is something less awesome people don't do - this combo helps!



    This point might be a little lost if you're reading this on your phone instead of on a 70" TV like you should be.  Movement also helps: with most of these shots the composition is static, so the buildings are motionless and the movement of the people really helps them stand out in the frame.
  8. Like
    Katrikura reacted to kye in I took a cinema camera on holiday, and it changed everything   
    Not clickbait, despite sounding like it.   I'd say that it threw a spanner in the works, but it was more like a hand-grenade.
    I went to South Korea on holiday, and despite having a mostly-sorted setup based on GX85, I took the BMMCC and ended up shooting almost the whole trip on it.
    The rig


    The rig consists of:
      • OG BMMCC - the 1080p one from almost a decade ago
      • 12-35mm F2.8 lens
      • IR/UV cut filter
      • cheap vND filter (tried to buy a new one there but retail shops didn't stock what I wanted)
      • Ikan 3.5" monitor
      • Smallrig monitor mount (tilts forwards)
      • curly HDMI cable from Amazon (really tidies the rig up)
      • 3 x LP-E6 batteries (two are older Wasabi ones and one is a genuine Canon one I bought there)
      • Peak Design arca-swiss plate (mounted on the bottom)
      • random wrist strap (looped through the arca-swiss plate)
      • SanDisk 128GB SD card
    This ended up being a killer setup.  A few highlights of the rig's performance are...
    Professional equipment
    The BMMCC is a cinema camera with fans and designed to shoot in harsh environments.  The 12-35 is a professional lens.  No BS overheating snowflake influencer crap here.  This gave me confidence to use it in (light) rain, heavy humidity and serious sweatiness, and basically to not baby it.
    Fixed aperture zoom lens
    When setting up for a shot, the vND setting from the last shot is probably in the ballpark for the current shot, even when going straight from one end of the zoom range to the other.
    Flexible zoom range
    The equivalent zoom range was from about 30mm to 100mm.  Sometimes it wasn't quite as wide as I'd have liked, but it suited the environment: Seoul, with a population of around 10M people, is one of the world's megacities and is seriously compact, so most compositions would simply contain too much stuff if they had a wider FOV.  You always miss shots when travelling, but I felt like I didn't miss that many.
    OIS on a cinema camera, even at the wide end
    Not a lot of OIS options for 12mm lenses outside of zoom lenses.
    Light weight
    This is a full cinema camera rig with 12 items, and yet weighs about 850g / 1.9lb.  My GH5 weighs 750g / 1.65lb with battery and SD card, BUT NO LENS.
    Dynamic range
    No choosing between the sky or shadows.  When shooting uncontrolled situations like this you often see things in post that you didn't see while you were shooting, so the flexibility is super-useful.
    Ok, so that's all lovely and all, but what was so transformative about it?  These are things you could probably work out from the specs...
     
    Grenade #1: Shooting slowly is (mostly) shooting honestly.
    It's a cinema camera, so it's slow as f to shoot with.  Yes, I know that practice makes you faster, but my phone will expose and focus in the blink of an eye, so comparatively it's far slower.  
    So, you see a composition and you stand in the right place.  Then you adjust the monitor angle if not shooting from the hip (as I prefer).  Then you adjust the ND to expose.  Then you adjust the zoom for the composition.  Then you adjust the focus.  With this kind of setup you have to rely on peaking, so you adjust it back and forth to see how much peaking is the maximum, then zero in on that.  Then you hit record.
    This means several things:
      • You are immediately obvious when you stop and start fiddling with a camera in public.  There's no hiding.  People aren't stupid, especially these days.
      • You cannot be this obvious without getting comfortable with it.  If you don't learn to be comfortable then you'll mentally implode before getting any shots, forcing you to relax and just play that role.
      • By the time that you actually hit record, most of the people who were staring at you will have gotten bored and gone back to what they were doing.  They don't know you hit record, so the shot will contain people who aren't suddenly paying attention to you.
    This was a revelation for me because it forces a different way of shooting.
    When you have a fast camera, you can act like a street photographer.  You watch the people, you see something about to happen, you quickly point the camera, and capture THE DECISIVE MOMENT.  This means you are shooting specific people doing specific things.  Good freaking luck doing that with a fully-manual cine camera.
    When you have a slow camera, you probably can't anticipate moments far enough in advance to be ready in time, so you think differently.  You find a composition and shoot it, and anonymous people drift in and out of frame in ways you didn't specifically anticipate.  Sure, you can frame up a background and then wait for people to walk through it, and you could even see someone interesting a few hundred meters away and be ready when they walk past, but it's still a good distance from seeing something 2s before it happens and grabbing it.
    This has a massive caveat though.  It's not a good way to shoot people you know, unless you're directing them or they're basically stationary.
    The way I've come to understand the difference between cinema cameras and video cameras is that video cameras are designed to capture the world as it happens, and cinema cameras expect the world to bend around them.  There's lots of overlap now with that line blurring, but the concept is still a useful one, and the cameras with the best image quality still tend to be very self-centred.
     
    Grenade #2: Fixed WB is awesome.
    I used to shoot auto-WB because I used to think that you wanted 'correct' WB.  This mostly works, but leaves you with tiny WB 'errors' that change during the shot.  I used to think that the alternative was a fixed WB that would either be correct (if you took a manual grey-card reading at every location or whenever the lighting changed) or you'd use a fixed WB and then have to change it in post.
    This is probably still true if you're doing something where the WB has to be 'correct', but the only situations I can think of where this would apply is for professional work.
    I shot the whole trip on 5600K.
    So, why wasn't this a 'problem' creating shots with 'wrong' WB?
    Well, shots with warm light sources look warm:


    Shots with cool light sources (like a blue sky) look cool:


    Shots taken on a 'grey old day' look grey:

    ... and light sources that aren't on the warm/cool line will show as being coloured but don't look wrong - they just sort of look like that's what they looked like:


    and sometimes can even look beautiful:



    More to come, too many images to attach!
  9. Like
    Katrikura reacted to Emanuel in In memoriam: John Bailey, ASC (1942-2023)   
    I hadn't heard of such a loss until now... Just a few minutes ago I had the chance to search about him and received the shock of the sad news... : (
    https://variety.com/2023/film/obituaries-people-news/john-bailey-dead-academy-president-cinematographer-1235787930/
    https://theasc.com/news/in-memoriam-john-bailey-asc-1942-2023
    https://www.imdb.com/name/nm0007037/
    I got the unique chance to meet him at a workshop held at the Lisbon Film School about a quarter of a century ago, when his presence, together with that of his lovely wife, E.T. editor Carol Littleton, made for a singular experience with this passionate couple from the world of movies and filmmaking.
    For a few days, we had the chance to hear them talk about their work in two distinct workshops as the Lisbon Film School's visiting guest lecturers, drawing on their personal experiences: the way they met in the old continent they simply loved, their first meeting being, IIRC, in Vienna, Austria.
    Ordinary People (1980), directed by Robert Redford in his feature directorial debut, The Big Chill (1983), Silverado (1985), The Accidental Tourist (1988), made with his partner in crime Lawrence Kasdan, Groundhog Day (1993) or Mishima: A Life in Four Chapters (1985), produced by Francis Ford Coppola and George Lucas, directed by Paul Schrader with whom he also helped to make American Gigolo (1980)...
    https://theasc.com/videos/lighting-tech-tips-household-lightbulb-russell-carpenter-asc-1/john-bailey-asc-the-stylization-of-mishima
    Those were just a few of the case studies from those memorable days. I really miss his input from that week, from the bottom of my heart.
    A long farewell, dear John - it was great to have had the happiness of knowing a bit more of your presence among us.
    I'll never forget your typically humble character, expressed in the way you addressed each one of us, your workshop attendees - whether classmates or the other film school instructors, you gave no one the distinct treatment so customary in those elitist circles, more often than it should be. Your vivid enthusiasm in close, personal, face-to-face communication stays in our memories, as does your helpful advice to bless our careers, always full of generosity, love and endless good faith - a big thanks! < 3
    - EAG
  10. Like
    Katrikura reacted to mercer in Magic Lantern update! Original EOS R recording HD 14bit RAW   
    As I'm sure you know and you just misspoke, you need an external drive for 12bit 4K, not an external recorder. You can shoot internal 12bit 1080p, but ML Raw has an undeniably better IQ, in all respects, than the FP's FHD.
    I agree about the IBIS, but I don't really see a point for ML Raw with the R5? You can already shoot internal raw. But all of the Canon IBIS cameras are Digic X, so it will be a long time before they break that code.
    It will be interesting to see what they can do with the EOS-R and hopefully the M50. I'd actually be really happy if they were able to enable continuous MLV raw on the 5D4.
    But I'm still really impressed with what the 5D3 with ML Raw and a Canon IS lens is capable of handheld. So maybe I'm just easily impressed. 
  11. Like
    Katrikura reacted to kye in Philip Bloom = The iPhone 15 log   
    Our host isn't the biggest fan of that particular person and has requested previously that his content not be shared.
  12. Haha
    Katrikura reacted to MrSMW in Philip Bloom = The iPhone 15 log   
    Let’s see how long this post stays up.
    I have started a countdown…
  13. Like
    Katrikura reacted to Emanuel in Another Christmas greeting   
    Clever ; ) Loved the sausages part! LOL : ) This is what real production means: conforming creative choices to the stuff we have at hand... Well done again in this yearly tradition of yours; it will surely be appreciated by the local community :- )
    Happy Christmas holidays!
    - EAG
  14. Like
    Katrikura reacted to Grimor in Another Christmas greeting   
    Hello to all EOShd members. I haven't been here much this year, but I consider this forum to be my second home. This year has been hard personally (divorce with young children) but we must continue in life with good spirits. So like every year (and it's been 6 now) I link you to my police Christmas greeting:
    In the tech side it is Sony FX6 in UHD mode at 100fps (S&Q) and S-log, Hi Base Iso. Rokinon lenses. 
    I hope you like it and I send you a hug.
    Goodbye, friends!! 
  15. Like
    Katrikura reacted to markr041 in Sandisk Pro Blade?   
    I accidentally shot 8bit cDNG 4K RAW on the internal sd card (I forgot to switch to the external drive):
     
  16. Like
    Katrikura reacted to markr041 in DJI Pocket 3?   
    OK, here you go: using the D-Log M 10bit setting and the official DJI D-Log M LUT (I hate LUTs, but what can you do):
    Harsh highlights and very processed-sounding audio (set to stereo, front). Not stressed in lower light, as expected. Colors are good, but somehow not so appealing. I nixed a bunch of outdoor shots because they were all overexposed (based on using auto exposure), which is not a problem with current GoPros (it used to be).
    Noise reduction was turned off in the camera, sharpness set to -1, not -2, from 0.
    GoPros look to me to have better handling of highlights, more appealing color, and much better audio. The Pocket 3 does the job in low light, where the GoPro essentially cannot operate.
    I do not think anyone would shoot a mediocre attempted blockbuster film with the Pocket 3 (as was done with the fx3), and it's not a crash cam.
    But if one wants to move with the camera in dim places and loves portability this is the only alternative.
    It does, btw, charge really fast and it is a nice screen. But that audio...
    I will try again outdoors in bright light.
  17. Like
    Katrikura reacted to Emanuel in DJI Pocket 3?   
    4K 120fps on a 1" sensor in such an inconspicuous form factor is incredibly sweet... I am in love again! :- *
  18. Like
    Katrikura reacted to androidlad in DJI Pocket 3?   
  19. Like
    Katrikura reacted to ND64 in 5 concerning trends in photo/video forums   
    1- The rate of decline in engagement across all the photo/video forums and websites is depressing. Even comment sections are mini ghost towns compared to the same places ten years ago. Maybe social media is stealing a lot of the free time that used to be spent on the traditional web. 
    2- Many of the people who could write informative blog posts are now thinking "why bother writing any more when no one reads any more?". Today they're making videos, and trying hard to make them 10 minutes long, which means they have to add a lot of water to the milk. 
    3- I don't see any other industry with so much negativity about its own major brands. Telling people they don't need and shouldn't buy newly released products is the norm in our corner of the internet! There are hype moments before and after press release days, but overall the discouragement is far bigger. Look at car enthusiast or audiophile online communities... they constantly encourage each other to buy more!
    4- The lack of communication between experienced users and newcomers is hurting everyone, and sometimes it's sad. Many people upgrading from a smartphone are making mistakes rooted in misunderstandings that were discussed, explained and solved seven years ago. They just don't know where to find the knowledge.
    5- A tendency to reduce everything to a "matter of taste" has emerged, to the point that the whole concept of critique appears moot, as if there were no right and wrong ways of doing things!
     
    Maybe it's overthinking. I don't know... just wanted to share my thoughts. 
  20. Like
    Katrikura reacted to Andrew Reid in Seen Oppenheimer... pretty good   
    Here are my thoughts
    https://www.eoshd.com/news/oppenheimer-review-70mm-imax-screening/
  21. Like
    Katrikura reacted to IronFilm in Don't panic about AI - it's just a tool   
    Indeed, just look at the leap forwards in improvement from GPT2 to GPT3
    Or each generation of Midjourney: V1 vs V2 vs V3 vs V4 vs V5 (and those 5 generations only took a single year to happen!!!). 
    https://aituts.com/midjourney-versions/ 
    We might laugh at the efforts of generative AI video right now, but they're no worse than Midjourney V1 was.... 
    Perhaps 50/50 odds we'll have the Midjourney V5 equivalent for video by 2028:
    https://manifold.markets/ScottAlexander/in-2028-will-an-ai-be-able-to-gener 
    Or maybe even higher odds than that... 
    https://manifold.markets/firstuserhere/will-we-have-end-to-end-ai-generate-12f2be941361 
    https://manifold.markets/firstuserhere/will-we-have-end-to-end-ai-generate-de41c9309e38 
    I agree with your disagreeing.
    That's a good analogy! 
    And if it is carefully/appropriately managed, you can even have a change in voice actor who is doing these characters, and almost none of the fans will notice or care. 
    Another good analogy. It is indeed very likely, I feel, that the country as a whole will be massively better off and wealthier thanks to AI. But... there will also be huge numbers of individuals (such as those middle aged textile workers) who will be a lot worse off. 
    We'll be able to have super niche "micro celebrity AI avatars"
    At the moment, celebrities need a certain amount of broad appeal. As you said, they need to avoid offending their fans, so they end up appealing to the lowest common denominator, because what might appeal to one section of the fan base could drive away other fans who get offended by it. But once you're freed from the physical constraints, an "AI celebrity" could cater to any and all of these micro-niche fanbases. 
    "I think there is a world market for about five computers." ~ IBM's president, Thomas J Watson (said in the early 1940's) 
    Nah, my Raspberry Pi can run an LLM. (Ok, only a baby-ChatGPT that's quite cut down and somewhat crippled. But even if I want to run an LLM that's quite close to the power of GPT3, that costs me much, much less than $1/hr - more like a handful of cents per hour. It is cheap to run an LLM.)
    It's predicted as highly likely that even GPT4 will be runnable on consumer-grade hardware by next year:
    https://manifold.markets/LarsDoucet/will-a-gpt4equivalent-model-be-able 
    What you're thinking about, is the costs to train GPT4 from scratch. That's VERY EXPENSIVE! 
    But still, it isn't quite as bad as you think. If a government wanted to do it, then absolutely any government in the OECD could do this - they could do it ten times over. Likewise, there are hundreds, if not thousands, of companies in the world which could train the next GPT4 if they wanted to. (GPT4 would've cost roughly the same order of magnitude as $100M, way out of reach for you and me, but easily within reach of many, many other organizations.) 
    But they won't, because the costs to train their own GPT4 vs the profits they could make (as AI is quickly becoming a very competitive space!) just isn't worth it. 
    The good news though, is that costs for training are dropping drastically fast! 
    Look at this prediction, it is highly likely that before 2030 it will cost under $10K to train from scratch a GPT3 quality LLM (i.e. any keen hobbyist can do it themselves!):
    https://manifold.markets/Gigacasting/will-a-gpt3-quality-model-be-traine 
    And that's yet another reason why there are not hundreds of other companies training their own GPT4, why put that risk into it if you're not already an industry leader in this? When your $100M+ investment could quickly within a few short years be worth next to nothing. You need a solid business plan to recoup your costs fast. OpenAI can do that, because they're massively funded with Microsoft's backing, and they have a first mover advantage. 
    Too late, that genie left the bottle long ago. 
     
  22. Like
    Katrikura reacted to kye in Don't panic about AI - it's just a tool   
    Great post.  As a fellow computer science person, I agree with your analysis, especially that it will get better and better - so good that we will learn more about the human condition because of how good it gets.  This is also not something new: in the early days of computer graphics, someone wrote a simulation of how birds fly in formation, and it was so accurate that biologists and animal behaviour scientists studied the algorithms - this is how the 'rules' of birds flying in formation were initially discovered. 
    I just wanted to add to the above quote by saying that studios have already made large strides in this direction with the comic-book genre films, whose characters are the stars and not the actors that play them.  This is an extension of things like the James Bond films.  These were all films where the character was constant and the actor was replaceable.  
    VFX films are the latest iteration of this, where the motion-capture actors, voice actors and animators are far less known.  AI replacing those creatives to make CGI characters will be the next step, and AI making realistic-looking characters the step after that.
    For those reading that aren't aware of the potential success of completely virtual characters and how people can bond with a virtual person, I direct your attention to Hatsune Miku, a virtual pop star:
    Link: https://en.wikipedia.org/wiki/Hatsune_Miku
    She was created in 2007, which in the software world is an incredibly long time ago, and in the pop star world is probably even longer!
    But did it work?
    That's a figure from over a decade ago and equates to just over USD$70,000,000, which is almost USD$100M in today's money.  I couldn't find any reliable more recent estimates, but she is clearly a successful commercial brand when you review the below.
     
    What does this mean in reality, though? It's not like she topped the charts.  Here is a concert from 2016 - she is rear-projected onto a pane of glass mounted on the stage.
    She was announced as a performer at Coachella 2020, which was cancelled due to covid.
    Japan might be more suited to CGI characters than the west is, although that is changing - take the Replika story for example.  Replika is a female virtual AI companion who messages and sends pics to subscribers, including flirty suggestive ones.  The owners of Replika decided that the flirty stuff should be a separate paid feature and turned it off for the free version - the users reacted strongly.  So strongly, in fact, that it's now an active field of research for psychologists trying to figure out how to understand, manage and regulate these things.  It's one thing for tech giants to 'curate' your online interactions, but it's another when the tech giants literally control your girlfriend.
    Background: https://theconversation.com/i-tried-the-replika-ai-companion-and-can-see-why-users-are-falling-hard-the-app-raises-serious-ethical-questions-200257
    There are also other things to take into consideration.  Fans are very interested in knowing as much as possible about their idols, but idols are real people with human psychological needs and limitations; virtual idols have neither.  The virtual idols that share their entire lives with their fans will be even more relatable than the human stars that need privacy and get frustrated and yell at paparazzi etc.  These virtual idols will be able to be PR-perfect in all the right ways (i.e. just human enough to be relatable but not so human that they accidentally offend people).  
    There is already a huge market for personalised messages from stars, virtual idols will be able to create these in virtually infinite amounts.  Virtual stars will be able to perform at simultaneous concerts, make public appearances wherever and whenever is optimal, etc.  
    And if you still need another example of how we underestimate technology... 
    "Computers in the future may weigh no more than 1.5 tons." - Popular Mechanics magazine, 1949.
  23. Like
    Katrikura reacted to KnightsFan in Don't panic about AI - it's just a tool   
    Nice article! My perspective is as a software engineer, at a company that is making a huge effort to leverage AI faster and better than the rest of the industry. I am generally less optimistic than you that AI is "just a tool" and that it will not result in large swaths of the creative industry losing money.
    The first point I always make is that it's not about whether AI will replace all jobs, it's about the net gain or loss. As with any technology, AI tools both create and destroy jobs. The question for the economy is how many. Is there a net loss or a net gain? And of course we're not only concerned with the number of jobs, but also how much money each job is worth. Across a given economy--for example, the US economy--will AI-generated art cause clients/studios/customers to put more, or less, net money into photography? My feeling is less.
    For example, my company ran an ad campaign using AI-generated photos. It was done in collaboration with both AI specialists to write prompts and artists to conceptualize and review. So while we still used a human artist, it would have taken many more people working many more hours to achieve the same thing. The net result was that we spent less money on creative for that particular campaign, meaning less money in the photography industry. It's difficult for me to imagine that AI will result in more money being spent on artistic fields like photography. I'm not talking about money that creatives spend on gear, which is a flow of money from creatives out; I'm talking about the inflow from non-creatives, towards creatives.
    The other point I'll make is that I don't think anyone should worry about GPT-4. It's very competent at writing code, but as a software engineer, I am confident that the current generation of AI tools cannot do my job. However, I am worried about what GPT-5, or GPT-10, or GPT-20 will do. I see a lot of articles--not necessarily Andrew's--that confidently say AI won't replace X because it's not good enough. It's like looking at a baby and saying, "that child can't even talk! It will never replace me as a news anchor." We must assume that AI will continue to improve exponentially at every task, for the foreseeable future. In this sense, "improve" doesn't necessarily mean "give the scientifically accurate answer" either. Machine learning research goes in parallel with psychology research. A lot of machine learning breakthroughs actually provide ideas and context for studies on human learning, and vice versa. We will be able to both understand and model human behavior better in future generations.
    My third point is that I disagree that people are fundamentally moved by other people's creations. You write
    I think that only a very small fraction of moviegoers care at all about who made the content. This sounds like an argument made in favor of practical effects over CGI, and we all know which side won that. People like you and I might love the practical effects in Oppenheimer simply for being practical, but the big CGI franchises crank out multiple films each year worth billions of dollars. If your argument is that the people driving the entertainment market will pay more for carefully crafted art than for generic, by-the-numbers stories and effects, I can't disagree more.
    Groot, Rocket Raccoon, and Shrek sell films and merchandise based off face and name recognition. What percent of fans do you think know who voiced them? 50%, i.e. 100 million+ people? How many can name a single animator for those characters? What about Master Chief from Halo (originally a one-dimensional character, literally from Microsoft) - how many people can tell you who wrote, voiced, or animated any of the Bungie Halo games? In fact, most Halo fans feel more connected to the original Bungie character than to the one from the Halo TV series, despite the latter having a much more prominent actor portrayal.
    My final point is not specifically about AI. I live in an area of the US where, decades ago, everyone worked in good paying textile mill jobs. Then the US outsourced textile production overseas and everyone lost their jobs. The US and my state economies are larger than ever. Jobs were created in other sectors, and we have a booming tech sector--but very few laid off, middle aged textile workers retrained and started a new successful career. It's plausible that a lot of new, unknown jobs will spring up thanks to AI, but it's also plausible that "photography" shrinks in the same way that textiles did.
  24. Like
    Katrikura reacted to BTM_Pix in DJI Pocket 3?   
    Bitrate is a fixed 100Mbps, which is near as dammit the upper limit of the variable bitrate that my iPhone 12 uses (a typical comparator for what people would use a Pocket for), and I don't find it lacking for the purpose.
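    For a sense of what a fixed 100Mbps stream means for card space, the arithmetic is simple. A rough sketch only - it ignores audio and container overhead, so real files run slightly larger:

```python
# Rough storage footprint of a fixed-bitrate video stream.
# 100 Mbps is the figure quoted above; audio and container
# overhead are ignored, so real files will be slightly larger.

def megabytes_per_minute(bitrate_mbps: float) -> float:
    """Convert megabits/second into megabytes per minute of footage."""
    return bitrate_mbps / 8 * 60

print(megabytes_per_minute(100))  # 750.0 MB/min, i.e. about 45 GB per hour
```

    So a 64GB card holds somewhere around an hour and twenty minutes of footage at that rate.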
    More important for me is that the Osmo Pockets have Cinelike D which is a good compromise and very easy to grade.
    I had to do a little project at the weekend using the LX10/15 (with my Cinelike D hack activated) and the Pocket and they matched easily both in terms of colour and acceptably enough in image quality.
    On the Pocket 2, the image is derived from a 4.6K sensor, which gives about 1.2x lossless zoom when shooting in 4K, and roughly 2x and 1.6x in 1080p and 2.7K respectively.
    In the edit, I punched in to around 2x on a 4K file from my weekend project and was comfortable with the result so I wouldn't be bothered going up to say 1.5x as a safety and then adding the rest in the edit.
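    Those lossless zoom figures are just the ratio of the sensor readout width to the output frame width. A quick sketch - the pixel widths here are my own assumptions for a "4.6K" sensor rather than official DJI specs, hence the small differences from the figures quoted above:

```python
# Lossless ("crop") zoom = sensor readout width / output frame width.
# The 4608px readout width is an assumed value for a "4.6K" sensor,
# not an official DJI spec.
SENSOR_WIDTH = 4608
OUTPUT_WIDTHS = {"4K UHD": 3840, "2.7K": 2704, "1080p": 1920}

def lossless_zoom(sensor_w: int, output_w: int) -> float:
    """Zoom factor available before the output frame outresolves the sensor."""
    return sensor_w / output_w

for mode, width in OUTPUT_WIDTHS.items():
    print(f"{mode}: {lossless_zoom(SENSOR_WIDTH, width):.1f}x")
```

    Anything beyond that ratio is upscaling, which is why punching in further is better saved for the edit where you can judge the result.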
    A big advantage of the Pocket is adding the control stick which will give you full pan/tilt/zoom controls right from the handle.
    That usability trumps a small loss of image quality for the purposes I use it for.
     
    The 24mm in that spec is coming from it being a drone camera though.
    The actual lens on a Pocket 2 is 20mm f1.8.
    One of the beauties of this camera, though, is the magnetic clip-on wide angle adapter (which gives you 15mm) and of course the anamorphic versions too.
    As well as simultaneously being able to use magnetic ND filters.
    If DJI do bring out a new version 3 with an optical zoom and/or 10-bit capture then that would be great, but I have to say that even as it stands now the Pocket 2 is a very compelling camera for travel and, for me, a vastly superior proposition to my iPhone for that role.
    It's not only because of the creative possibilities of the integrated gimbal (you're not just stabilised but can do tracking etc), or the anamorphics and NDs, or the ability to have real hardware controls with the control stick.
    It's that you can take it out of your pocket, hold the power button on and be ready to shoot 2 seconds later, rather than titting about unlocking the phone, opening an app and making sure you are in the correct mode.
    Another bonus is that if you do have a need for remote shooting or self-shooting with tracking then you can operate it from your phone, which you can't do with your phone as you are already using it to film with 😉 
  25. Like
    Katrikura reacted to Andrew Reid in Don't panic about AI - it's just a tool   
    Filmmaking is an art form. In the realm of creativity, people often crave the authenticity that only real humans can provide. This is why blockbusters often enlist well-known actors to generate interest and why, despite our daily reliance on them, our personal computers don't make headlines in gossip columns.
    https://www.eoshd.com/news/dont-panic-about-ai-its-just-a-tool/