    [ANSWERED] Thinking about NDI and IP Video (CPU vs GPU video)

    Hardware
    craigw (last edited by mark):

      Hi! In the future, when NDI and IP video become the norm, will the graphics card still do the actor processing, or will it use the CPU?

      macOS Ventura, 2017 Mac Pro, 32 GB RAM, AMD FirePro D700 6144 MB, Isadora v3.2.6, Los Angeles CA

      DusX (Tech Staff) @craigw:

        @craigw

        I imagine we will want to continue doing the processing on the GPU; it is simply much faster at parallel processing than a CPU will ever be. Systems are likely to change in other ways to allow faster transfer of video frames between the GPU and CPU. This is already happening with the M1 processor in new Macs.

        Troikatronix Technical Support

        • New Support Ticket Link: https://support.troikatronix.com/support/tickets/new
        • My Add-ons: https://troikatronix.com/add-ons/?u=dusx
        • Professional Services: https://support.troikatronix.com/support/solutions/articles/13000109444-professional-services

        Running: Win 11 64-bit, i7, M.2 PCIe SSDs, 32 GB DDR4, NVIDIA RTX 4070 | located in Ontario, Canada.

        mark @craigw (last edited by mark):

          @craigw said:

          Hi! In the future, when NDI and IP video become the norm, will the graphics card still do the actor processing, or will it use the CPU?

          In the end, the data from NDI, a movie, or even a webcam always starts its life as a bitmap on the CPU. (Well, maybe not always, but I don't know of any examples to the contrary.) Then you need to get that bitmap uploaded from the CPU to the GPU so that you can do fast processing on it. This is all quite fast, because GPUs are designed to let you perform CPU-to-GPU transfers very quickly, using things like direct memory access and other special techniques.

          The harder part is getting the GPU image back to the CPU. This is inherently inefficient, because GPUs (which are arguably designed for gaming, not for video processing) are not built to make these transfers fast; in a game, you never need to do it. Sending the GPU image out to NDI requires a transfer from GPU to CPU, not to mention costly compression to prepare the image to be sent over the network. Because of this, such output will always be a kind of "big ticket item" when it comes to performance, especially if you start doing it with several high-resolution images.
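          A quick back-of-the-envelope calculation shows why that readback is a "big ticket item". This sketch is not Isadora-specific (the helper names and the assumption of 4-byte BGRA pixels are illustrative); it just counts the uncompressed bytes that must cross the GPU-to-CPU bus per second, before any compression work even begins:

          ```python
          def frame_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
              """Uncompressed size of one frame, assuming 4-byte BGRA pixels."""
              return width * height * bytes_per_pixel

          def readback_rate_mib(width: int, height: int, fps: int) -> float:
              """MiB/s that must cross the GPU->CPU bus for one video stream."""
              return frame_bytes(width, height) * fps / 2**20

          # One 1080p60 stream: roughly 475 MiB/s of raw pixel data.
          print(round(readback_rate_mib(1920, 1080, 60)))   # -> 475

          # One 4K (2160p60) stream: nearly 2 GiB/s.
          print(round(readback_rate_mib(3840, 2160, 60)))   # -> 1898
          ```

          Multiply that by several high-resolution outputs and it is easy to see why the GPU-to-CPU leg, rather than the upload, dominates the cost of feeding NDI.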

          As @DusX points out, memory on an M1 is "integrated", meaning the distinction between CPU memory and GPU memory no longer seems to exist. I am awaiting my M1 laptop, and I am extremely curious to find out whether this holds true in practice.

          Best Wishes,
          Mark

          Media Artist & Creator of Isadora
          Macintosh SE-30, 32 Mb RAM, MacOS 7.6, Dual Floppy Drives
