
ntblowz

Members
  • Posts: 1,880
  • Joined
  • Last visited

Reputation Activity

  1. Like
    ntblowz got a reaction from herein2020 in Canon C70 User Experience   
    The C70's raw file size is much more manageable than the C200/C300's 1 Gbps monster, and that's only 10-bit in 4K 50/60p versus 12-bit on the C70.
     
    The 2K raw is even usable for weddings too lol (for the ceremony; not sure about the reception because of the low light on the S16 crop).
     
    This raw update really makes the C70 attractive! No need for a bulky external monitor + battery if you want a compact raw cinema camera.
     

  2. Like
    ntblowz got a reaction from webrunner5 in Canon C70 User Experience   
    The C70's raw file size is much more manageable than the C200/C300's 1 Gbps monster, and that's only 10-bit in 4K 50/60p versus 12-bit on the C70.
     
    The 2K raw is even usable for weddings too lol (for the ceremony; not sure about the reception because of the low light on the S16 crop).
     
    This raw update really makes the C70 attractive! No need for a bulky external monitor + battery if you want a compact raw cinema camera.
     

  3. Like
    ntblowz reacted to Gianluca in Unreal and python background remover   
    Hello everyone... I preferred to open a new topic rather than continue to abuse "My Journey To Virtual Production".
    With the carnival I was able to do some more tests with Unreal, and this time I recorded my subject against a yellow wall... As you can see, in controlled working conditions the results can be really good... Obviously there is still a lot of room for improvement: for example, I have to synchronize the two video tracks by recording a common audio track, and I have to balance the gimbal better (I have a Crane-M, and with the mobile phone mounted I exceed the weight it can support, so it vibrates a lot). But apart from that, if I had an actress who did something sensible apart from shooting all the time 🙂 I might at this point be able to shoot something decent... What do you think? Opinions, advice?
     
  4. Like
    ntblowz reacted to herein2020 in Canon C70 User Experience   
    I tested out the new C70 firmware today and shot in Cinema RAW LT at 30 FPS. On a whim, I directed the RAW footage to record to my V30 card, and it actually recorded fine.
    I just shot a few seconds of footage to test playback on my editing system and to see if my SD cards could handle it. My 1TB SanDisk Extreme Pro cards definitely seem rock solid; it was almost comical seeing 411 min of recording time available when recording to the 1TB card in RAW LT.
    When I tested RAW LT at 60 FPS, though, the buffer overflowed immediately. Still, my storage concerns were definitely unwarranted; this camera, despite its faults, is really turning into a rock-solid workhorse for me. Below is a quick table I put together showing the codecs I care about and their recording times.
     
    SanDisk Extreme Pro 1TB V30:
      Canon RAW LT          - 29.97 FPS - 422 min
      Canon XF-AVC Long GOP - 29.97 FPS - 852 min
      Canon XF-AVC Long GOP - 59.94 FPS - 524 min
    ProGrade 256GB V90:
      Canon RAW ST          - 29.97 FPS - 68 min
      Canon RAW LT          - 29.97 FPS - 105 min
      Canon RAW LT          - 59.94 FPS - 52 min
      Canon XF-AVC Long GOP - 29.97 FPS - 213 min
      Canon XF-AVC Long GOP - 59.94 FPS - 131 min
    Also, Canon RAW LT at 29.97 FPS seems to have a bitrate of 322 Mb/s, which is a data rate of about 40 MB/s, only half the 80 MB/s my SanDisk V30 card registered during its tests, so I should be pretty safe with it. I will definitely keep the ProGrade cards in my bag, but it looks like for 90% of my work I am fine with keeping two 1TB SanDisk Pro cards in the camera with it set to dual-slot recording.
    If I shoot a project that requires raw footage I will probably just use two ProGrade cards; it's not worth the headache of trying to remember which footage gets recorded where depending on the frame rate.
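     
    As a sanity check on the numbers above, here is a minimal back-of-the-envelope sketch in Python. The 322 Mb/s bitrate and 1 TB capacity come from the post itself; everything else is just unit conversion, not Canon documentation:
    ```python
    # Rough recording-time check, assuming the figures quoted above.
    bitrate_mbps = 322                      # Canon RAW LT at 29.97 FPS, megabits/s
    data_rate_mb_s = bitrate_mbps / 8       # ~40 MB/s, as stated in the post
    card_mb = 1_000_000                     # 1 TB card in decimal megabytes
    recording_min = card_mb / data_rate_mb_s / 60
    print(f"{data_rate_mb_s:.1f} MB/s, ~{recording_min:.0f} min on a 1 TB card")
    # -> 40.2 MB/s, ~414 min, in the same ballpark as the 411-422 min reported
    ```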
  5. Like
    ntblowz got a reaction from filmmakereu in Canon EOS R5C   
    Got the R5C today; checked the video AF and it locked on fine.
    512GB only lasts 30 min of 8K 50p; in 5.9K 50p it lasts nearly an hour, which is more doable.
    Might have to use the speedbooster again for 5.9K raw 🤔
    And battery life is much worse than the R5: 4K 50p lasts 40-something minutes, so I guess I will be pairing it with my 18W PD power bank for long tripod shoots.
    Btw, there is absolutely no video in photo mode. You have to switch modes to do video and vice versa.
    Body- and operation-wise it feels like a mini C70 when shooting video; better balance with lenses than the R5, I reckon.
     



  6. Like
    ntblowz got a reaction from Video Hummus in Canon EOS R5C   
    Got the R5C today; checked the video AF and it locked on fine.
    512GB only lasts 30 min of 8K 50p; in 5.9K 50p it lasts nearly an hour, which is more doable.
    Might have to use the speedbooster again for 5.9K raw 🤔
    And battery life is much worse than the R5: 4K 50p lasts 40-something minutes, so I guess I will be pairing it with my 18W PD power bank for long tripod shoots.
    Btw, there is absolutely no video in photo mode. You have to switch modes to do video and vice versa.
    Body- and operation-wise it feels like a mini C70 when shooting video; better balance with lenses than the R5, I reckon.
     



  7. Haha
    ntblowz reacted to kye in Canon EOS R5C   
    AUS only issue - that sounds about right!
  8. Haha
    ntblowz reacted to KnightsFan in Canon RF 5.2mm f/2.8L Dual Fisheye 3D VR Lens   
    3D porn is last decade, we're way beyond that haha
  9. Like
    ntblowz reacted to dlfimage in Canon RF 5.2mm f/2.8L Dual Fisheye 3D VR Lens   
    I have had this lens now for about 3 weeks, shot a few different types of scenes, and am really impressed with the quality and overall ease of use. Canon's VR Utility renders/converts the video and pictures much faster than I thought it would.
    Some of you old guys (yeah, I'm 51, bite me) are sounding like my grandfather did when TVs showed up on the scene: "nobody's gonna waste time with that thing" or "What kind of idiot just sits on the couch and looks into a box?" Well, between Facebook (Meta), Microsoft, and Disney there are literally billions of dollars getting dumped into this format, because it is clear to them, and to me, that in the next 3-5 years damn near half the people you know will have some form of VR gear in their family or household. It really is incredible if you let yourself actually consider the ramifications.
    I took my first picture in a family party setting. A few days ago I took my Oculus over there to share those clips and pics, and literally everyone was amazed to be "in" that party again, seeing everything exactly as it was, like you're standing in that spot again. In fact my uncle, the oldest person there, was the one to point out how absolutely amazing it would be to be standing in his childhood home again, looking at his brothers and sisters and seeing it all again; "like time traveling to the past" is what he said.
    I don't think many people are recognizing this aspect of VR yet, but it's going to be huge. I've already had my first wedding party request a VR shoot; I might make this the focus of my future photography business.
     
  10. Like
    ntblowz reacted to BTM_Pix in My Journey To Virtual Production   
    So....getting into the time machine and going back to where the journey was a few weeks ago.
    To recap, my interest in these initial steps is in exploring methods of externally controlling the virtual camera inside Unreal Engine rather than getting live video into it to place in the virtual world.
    To me, that is the bigger challenge as, although far from trivial, the live video and keying aspect is a known entity to me and the virtual camera control is the glue that will tie it together.
    For that, we need to get the data from the outside real world in terms of the camera position, orientation and image settings in to Unreal Engine and then do the processing to act upon that and translate it to the virtual camera.
    To achieve this, Unreal Engine offers Plug Ins to get the data in and Blueprints to process it.
    The Plug Ins on offer for different types of data for Unreal Engine are numerous and range from those covering positional data from VR trackers to various protocols to handle lens position encoders, translators for PTZ camera systems and more generic protocols such as DMX and MIDI.
    Although it is obviously most closely associated with music, it is this final one that I decided to use.
    The first commercially available MIDI synth was the Sequential Prophet 600, which I bought (and still have!) a couple of years after it was released in 1982, and in the intervening four decades (yikes!) I have used MIDI for numerous projects outside of just music, so it's not only a protocol that I'm very familiar with but one that also offers a couple of advantages for this experimentation.
    The first is that, due to its age, it is a simple, well-documented protocol to work with, and the second is that, due to its ubiquity, there are tons of cheaply available control surfaces that greatly help when you are prodding about.
    And I also happen to have quite a few lying around including these two, the Novation LaunchControl and the Behringer X-Touch Mini.


    The process here, then, is to connect either of these control surfaces to the computer running Unreal Engine and use the MIDI plug in to link their various rotary controls and/or switches to the operation of the virtual camera.
    Which brings us now to using Blueprints.
    In very simple terms, Blueprints allow the linking of elements within the virtual world to events both inside it and external.
    So, for example, I could create a Blueprint that when I pressed the letter 'L' on the keyboard could toggle the light on and off in a scene or if I pressed a key from 1 to 9 it could vary its intensity etc.
    For my purpose, I will be creating a Blueprint that effectively says "when you receive a value from pot number 6 of the LaunchControl then change the focal length on the virtual camera to match that value" and "when you receive a value from pot number 7 of the LaunchControl then change the focus distance on the virtual camera to match that value" and so on and so on.
    By defining those actions for each element of the virtual camera, we will then be able to operate all of its functions externally from the control surface in an intuitive manner.
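    As a rough illustration of that mapping idea (outside Unreal, since the Blueprint itself is visual), here is a minimal Python sketch; the pot numbers, parameter names and ranges are assumptions for illustration, not Unreal's API:
    ```python
    # Hypothetical sketch: rescale a 7-bit MIDI CC value (0-127) onto a camera
    # parameter range, in the spirit of "pot 6 -> focal length, pot 7 -> focus".

    def scale(cc_value: int, lo: float, hi: float) -> float:
        """Map a MIDI value in 0-127 linearly onto [lo, hi]."""
        return lo + (cc_value / 127.0) * (hi - lo)

    # CC number -> (virtual camera parameter, min, max); all assumed values
    MAPPING = {
        6: ("focal_length_mm", 14.0, 135.0),
        7: ("focus_distance_m", 0.3, 10.0),
    }

    camera = {"focal_length_mm": 35.0, "focus_distance_m": 1.0}

    def on_control_change(cc_number: int, cc_value: int) -> None:
        if cc_number in MAPPING:
            name, lo, hi = MAPPING[cc_number]
            camera[name] = scale(cc_value, lo, hi)

    on_control_change(6, 127)   # pot 6 fully clockwise -> longest focal length
    print(camera)
    ```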
    Blueprints are created using a visual scripting language comprising nodes that are connected together; it is accessible, but contains enough options, and full access to the internals, to create deep functionality if required.
    This is the overview of the Blueprint that I created for our camera control.

    Essentially, what is happening here is that I have created a number of functions within the main Blueprint which are triggered by different events.
    The first function that is triggered on startup is this Find All MIDI Devices function, which finds our MIDI controller and is shown in expanded form here.

    This function cycles through the attached MIDI controllers and lists them on the screen and then steps on to find our specific device that we have defined in another function.
    Once the MIDI device is found, the main Blueprint then processes incoming MIDI messages and conditionally distributes them to the various different functions that I have created for processing them and controlling the virtual camera such as this one for controlling the roll position of the camera.
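    For anyone who wants to poke at the same device-discovery and dispatch steps outside Unreal, here is a small Python sketch using the mido library; it is not what the Blueprint does internally, and the device name and handler functions are assumptions:
    ```python
    # Hypothetical analogue of "find the MIDI device, then route incoming control
    # changes to per-parameter handlers", using the mido library.
    import mido

    TARGET_DEVICE = "Launch Control"     # assumed substring of the port name

    def set_focal_length(value): print("focal length   <-", value)
    def set_focus_distance(value): print("focus distance <-", value)

    HANDLERS = {6: set_focal_length, 7: set_focus_distance}  # CC number -> handler

    names = mido.get_input_names()       # list every attached MIDI input
    print("MIDI inputs:", names)
    port_name = next(n for n in names if TARGET_DEVICE in n)

    with mido.open_input(port_name) as port:
        for msg in port:                 # blocks, yielding messages as they arrive
            if msg.type == "control_change" and msg.control in HANDLERS:
                HANDLERS[msg.control](msg.value)
    ```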

    When the Blueprint is running, the values of the parameters of the virtual camera are then controlled by the MIDI controller in real time.
    This image shows the viewport of the virtual camera in the editor with its values changed by the MIDI controller.

    So far so good in theory but the MIDI protocol does present us with some challenges but also some opportunities.
    The first challenge is that the value carried by most "regular" message types that you would use as a vehicle to transmit data, such as note on, control change and so on, only ranges from 0 to 127 and doesn't support fractional or negative numbers.
    If you look at the pots on the LaunchControl, you will see that they are regular pots with finite travel so if you rotate it all the way to the left it will output 0 and will progressively output a higher value until it reaches 127 when it is rotated all the way to the right.
    As many of the parameters that we will be controlling have much greater ranges (such as focus position) or have fractional values (such as f stop) then another solution is required.
    If you look at the pots on the X-Touch Mini, these are digital encoders, so they have endless travel: they report relative movement rather than the absolute values that the pots on the LaunchControl do.
    This enables them to take advantage of being mapped to MIDI events such as Pitch Bend which can have 16,384 values.
    If we use the X-Touch Mini, then, we can create the Blueprint in such a way that it either increments or decrements a value by a certain amount rather than absolute values, which takes care of parameters with much greater range than 0 to 127, those which have negative values and those which have fractional values.
    Essentially we are using the encoders or switches on the X-Touch Mini as over-engineered plus and minus buttons, with the extent of each increment/decrement set inside the Blueprint.
    Workable, but not ideal, and it also makes it trickier to determine a step size for making large stepped changes (ISO 100 to 200 to 400 to 640 is only four steps on your camera, but a lot more if you have to go in increments of, say, 20), and of course every focus change would be a looooong pull.
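    To make that concrete, here is a minimal Python sketch of the increment/decrement idea, assuming a hypothetical encoder that reports +1/-1 ticks; the step sizes and the upper part of the ISO ladder are illustrative assumptions, not values from the Blueprint:
    ```python
    # Hypothetical relative-encoder handling: stepped parameters walk a ladder one
    # entry per tick, continuous parameters move by a fixed step per tick.

    ISO_LADDER = [100, 200, 400, 640, 800, 1250, 1600, 3200, 6400, 12800]

    class SteppedParam:
        """Moves one ladder entry per encoder tick (e.g. ISO)."""
        def __init__(self, ladder, index=0):
            self.ladder, self.index = ladder, index
        def tick(self, delta):
            self.index = max(0, min(len(self.ladder) - 1, self.index + delta))
            return self.ladder[self.index]

    class LinearParam:
        """Moves by a fixed amount per encoder tick (e.g. focus distance)."""
        def __init__(self, value, step, lo, hi):
            self.value, self.step, self.lo, self.hi = value, step, lo, hi
        def tick(self, delta):
            self.value = max(self.lo, min(self.hi, self.value + delta * self.step))
            return self.value

    iso = SteppedParam(ISO_LADDER)                              # 100 -> 200 -> 400 -> 640
    focus = LinearParam(value=1.0, step=0.05, lo=0.3, hi=10.0)  # metres per tick

    for _ in range(3):
        print("ISO", iso.tick(+1))
    print("focus", round(focus.tick(-1), 2), "m")
    ```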
    There is also the aspect that, whilst perfectly fine for initial development, these two MIDI controllers are not only wired but also pretty big.
    As our longer term goal is for something to integrate with a camera, the requirement would quickly be for something far more compact, self powered and wireless.
    So, the solution I came up with was to lash together a development board and a rotary encoder and write some firmware that would make it a much smaller MIDI controller that would transmit to Unreal Engine over wireless Bluetooth MIDI.

    Not pretty but functional enough.
    To change through the different parameters to control, you just press the encoder in and it then changes to control that one.
    If that seems like less than real-time camera control, because you have to switch between parameters, then you are right, it is; but this form is not its end goal, so the encoder is mainly for debugging and testing.
    The point of such a module is for a camera to talk to it, with the module then processing that data for onward transmission to Unreal Engine, so it's just an interface platform for other prototypes.
    In concert with the Blueprint, because we now control both the format of the data being transmitted and how it is then interpreted, we have all the options we need to encode and decode it so that we can fit what we need within MIDI's constraints and easily handle the fractional, negative and large numbers that we will need.
    It also offers the capability of different degrees of incremental control as well as direct control.
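    One possible encode/decode scheme of that kind, sketched in Python; the parameter range and the use of the 14-bit pitch-bend payload are assumptions for illustration, not necessarily the scheme used here:
    ```python
    # Hypothetical packing of a signed, fractional parameter into a 14-bit MIDI
    # pitch-bend payload (0..16383), with the matching unpacking on the Unreal side.

    RANGE_LO, RANGE_HI = -8.0, 8.0     # assumed parameter range
    MAX_14BIT = 16383

    def encode(value: float) -> int:
        """Map [RANGE_LO, RANGE_HI] onto the 14-bit integer payload."""
        value = max(RANGE_LO, min(RANGE_HI, value))
        return round((value - RANGE_LO) / (RANGE_HI - RANGE_LO) * MAX_14BIT)

    def decode(payload: int) -> float:
        """Inverse mapping back to the fractional parameter value."""
        return RANGE_LO + (payload / MAX_14BIT) * (RANGE_HI - RANGE_LO)

    print(encode(-2.5), round(decode(encode(-2.5)), 3))   # round-trips to ~-2.5
    ```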
    But that's for another day.
    Meanwhile, here is a quick video of the output of it controlling the virtual camera in coarse incremental mode.
     
  11. Like
    ntblowz reacted to Mmmbeats in Canon C70 User Experience   
    In other news - I just purchased a C70, which I will be collecting on Monday 😍.
  12. Like
    ntblowz reacted to ac6000cw in Olympus OM-1   
    The user manual for the OM-1 is available now - https://cs.olympus-imaging.jp/en/support/imsg/digicamera/download/manual/om/man_om1_e.pdf
  13. Like
    ntblowz reacted to Andrew Reid in First impressions of the Panasonic GH6 I have on loan   
    Not a review yet, just initial impressions.
  14. Like
    ntblowz reacted to Video Hummus in Associated Press / Sony deal weirdness   
    Kinda smells this way. Just like YouTubers getting thousands of dollars of free gear and saying they aren't getting paid! They then openly resell it on eBay. Total bullshit.
  15. Like
    ntblowz reacted to Davide DB in Panasonic GH6   
    90 minutes of applause!
    I just don't understand why it is assumed that EVERYONE aspires to have an FF camera, and that therefore those who don't have an FF camera must necessarily suffer from small penis syndrome. It's the premise that is wrong.
    In my own small way (deep technical diving), I wouldn't think of taking an FF camera on a dive; with its housing and port it is as big as a washing machine. M43 is the perfect compromise between quality and size.
    For decades sports photographers used APS-C cameras because it was the right tool for the job.
    And for eons cinematography has been Super 35mm.
    Now legions of YouTubers have decided that if you don't have an FF camera you're a loser. Of course I respect all those who need an FF sensor for their work, but they should stop explaining to me what I need...
     
    Regarding Gerald: if you're a YouTuber for a living, you play by the rules of your job and shut up, just like all the other workers who clock in in the morning.
  16. Thanks
    ntblowz got a reaction from Emanuel in Panasonic GH6   
    Just saw Potato Jet's review; really impressed with DR Boost on the GH6. The Canon mirrorless cameras' DR is so lacking.

  17. Thanks
    ntblowz got a reaction from Emanuel in Panasonic GH6   
    The magenta skin tone is fixed too  (Canon's C-Log3 also has a magenta skin tone problem, grr)

  18. Like
    ntblowz got a reaction from kye in Panasonic GH6   
    The magenta skin tone is fixed too  (Canon's C-Log3 also has a magenta skin tone problem, grr)

  19. Like
    ntblowz reacted to Gianluca in My Journey To Virtual Production   
    In this example I used a Python script that gives you perfectly outlined images without any green screen. The camera must be placed on a tripod, and then a background plate without the subject is enough; the script does everything else... It is not 100% reliable, but when it works it's really perfect...
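    For anyone curious, a minimal sketch of this kind of static-background ("clean plate") subtraction is below, written with OpenCV; it is not Gianluca's actual script, and the file names and threshold value are assumptions:
    ```python
    # Hypothetical clean-plate subtraction for a locked-off (tripod) camera.
    import cv2

    background = cv2.imread("empty_plate.png")      # frame without the subject
    frame = cv2.imread("frame_with_subject.png")    # frame with the subject

    # Difference against the clean plate, then threshold to get a rough matte
    diff = cv2.absdiff(frame, background)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 25, 255, cv2.THRESH_BINARY)

    # Clean up small holes and specks in the matte
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    # Apply the matte to keep only the subject
    subject = cv2.bitwise_and(frame, frame, mask=mask)
    cv2.imwrite("subject_matte.png", mask)
    cv2.imwrite("subject_only.png", subject)
    ```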
    I then used the "composure" plugin in unreal to try to put real and virtual in perspective ...
     
    Now I'm experimenting with special effects to be inserted directly into Unreal...
  20. Like
    ntblowz reacted to kye in Panasonic GH6   
    +1 for Media Division's review.
    I looked through the dozen or so GH6 videos in my feed and stopped when I saw Media Division had done one, and just watched that. They are the most level-headed, filmmaking-centric channel out there - both technically capable and filmmaking-aware, so they don't get caught up in tech for its own sake.
    It looks like an absolute cracker of a camera.
    My impression is that although it's not teaching FF lessons like the GH4 and GH5 did on their release, it'll still be a solid presence in the market, especially for people who don't need AF.
    I'm also looking forward to Andrew's review.
  21. Haha
    ntblowz reacted to MrSMW in Olympus OM-1   
    Ha, just read Andrew's tweet:
    "OM-1 is a nice camera but will sell exactly 0 units at £2000"
    Well, one muppet has pre-ordered a pair, so we're 2 up on your 0 already 😉
  22. Like
    ntblowz reacted to Gianluca in My Journey To Virtual Production   
    This is a super simple example I'm working on at the moment
     
    I want to understand how closely the real world and the virtual world can match without any effort, as in this case... sorry for the low resolution and bad mask, but it's a very quick job
     
    The Unreal camera is set to 21mm and my camera is at 12mm on APS-C (18mm equivalent); it's pretty similar but not perfect...
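    As a side note, the equivalence quoted above is just crop-factor arithmetic; a tiny sketch, assuming a 1.5x APS-C crop (the exact factor depends on the sensor):
    ```python
    # Hypothetical crop-factor check: a 12 mm lens on a 1.5x APS-C sensor frames
    # roughly like an 18 mm lens on full frame.
    crop_factor = 1.5
    aps_c_focal_mm = 12
    print(aps_c_focal_mm * crop_factor, "mm full-frame equivalent")   # -> 18.0
    ```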
     
    Does anyone know how to stabilize the footstool to a fixed point in Fusion, and after that track the Unreal world and assign its coordinates to the footstool?
  23. Like
    ntblowz reacted to John Matthews in Olympus OM-1   
    This guy has some low-light slow-mo footage. Interesting review, but I hate mouth-open thumbnails - sorry about that! He only has 500 subs.
     
  24. Like
    ntblowz reacted to androidlad in Olympus OM-1   
    It seems the OM-1 has really good internal NR, applied at the RAW debayer level:
    https://www.geh-photo.org/post/die-om-1-im-praxistest?continueFlag=143e81367b1d01290587846ff81073b9

    Left: E-M1X, right: OM-1
     

    OM-1 at ISO 40000, yes, forty thousand.
  25. Like
    ntblowz reacted to Gianluca in My Journey To Virtual Production   
    This topic has made me mad... 
    Unreal Engine is very complicated to understand, and there are issues every two minutes with everything you want to do if you are not an expert... 
     
    I've watched many tutorials on YouTube, starting with the Oculus Quest (which I have) and others, but today I may have found my personal workflow... 
     
    Unreal 4.27 and the Virtual Plugin for Android ($80): very easy to set up, but at the moment there are issues with anything other than 24p (I want 25p and there's a bug). 
     
    I really don't want to use Composure to merge the virtual with the real, so I have chosen to add two tracks in DaVinci Resolve (the Unreal world, and the real subject with a matte) in sync (using the waveform of the music). 
     
    The matte will come from RunwayML (€15 a month for 1080p) and I'm still forced to use some green screen in the background (without lights) to help the rotoscoping... This way I can move around my subject (my 5-year-old son) in my living room without divorcing my wife. 
     
    In the next few days maybe I will be able to post some samples... 