    Images > 4096 Pixels Wide Don't Work with Intel HD Graphics 3000

    Troubleshooting and Bug Reports
    • mark

      Dear All,

      A little warning that there is some kind of bug in the driver of the Intel HD Graphics 3000, the integrated chip used to save power on the more recent MacBook Pros. Textures over 4096 pixels across will not render properly. Gory details below; this is a copy of the bug report I submitted to Apple.
      The Radeon 6750M works fine with textures up to 8192 pixels.
      Best Wishes,
      Mark
      ------------
      Dear All,
      There seems to be another anomaly in the driver when testing my app against the Radeon 6750M vs. the Intel HD Graphics 3000. This came up whilst trying out a large texture of 6000 x 4500 px. If someone else could confirm this problem, I would appreciate it.
      To ensure my textures are valid, I'm doing the following:
      1) Checking the resolution against GL_MAX_TEXTURE_SIZE or GL_MAX_RECTANGLE_TEXTURE_SIZE_EXT as appropriate. On both cards, the result of this check is 8192.
      2) If the texture size is within those limits, I use GL_PROXY_TEXTURE_2D like so:
      glTexImage2D(GL_PROXY_TEXTURE_2D, 0, 4, mTextureWidth, mTextureHeight, 0, mTexFormat, mTexType, NULL);
      if (glGetError() == GL_NO_ERROR) {
         glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &testWidth);    
         glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &testHeight);    
      }
      I then check to ensure testWidth and testHeight are not 0.
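      Roughly, the whole check boils down to something like this (just a sketch for illustration -- the function name and the hard-coded GL_RGBA/GL_UNSIGNED_BYTE format here are not the exact code in my app):

      #include <OpenGL/gl.h>

      /* Returns 1 if a width x height texture should be accepted by the driver, 0 otherwise. */
      static int checkTextureSize(GLsizei width, GLsizei height)
      {
          GLint maxSize = 0, testWidth = 0, testHeight = 0;

          /* Step 1: compare against the driver's advertised maximum (8192 on both cards). */
          glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize);
          if (width > maxSize || height > maxSize)
              return 0;

          /* Step 2: ask the proxy target whether this specific texture would fit. */
          glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                       GL_RGBA, GL_UNSIGNED_BYTE, NULL);
          if (glGetError() != GL_NO_ERROR)
              return 0;
          glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &testWidth);
          glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &testHeight);

          /* The proxy reports 0 x 0 when the texture cannot be created. */
          return (testWidth != 0 && testHeight != 0);
      }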
      For both the 6750M and the HD Graphics 3000, the results of these tests indicate my 6000 x 4500 pixel texture is fine. So I go about creating it (glTexImage2D create/glTexSubImage2D upload to GPU) and rendering it.
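      The create/upload path looks roughly like this (again just a sketch -- mPixelData stands in for whatever buffer holds the image; the rectangle-texture target is shown, but the same applies to GL_TEXTURE_2D):

      GLuint texID = 0;
      glGenTextures(1, &texID);
      glBindTexture(GL_TEXTURE_RECTANGLE_ARB, texID);

      /* Create: allocate the texture storage without supplying any pixels yet. */
      glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, 4, mTextureWidth, mTextureHeight,
                   0, mTexFormat, mTexType, NULL);

      /* Upload: hand the actual pixel data to the GPU. On the HD Graphics 3000
         this step appears to be ignored once mTextureWidth exceeds 4096. */
      glTexSubImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, 0, 0, mTextureWidth, mTextureHeight,
                      mTexFormat, mTexType, mPixelData);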
      The 6000 x 4500 px texture works on the 6750M, but it doesn't work on the HD Graphics 3000. Further testing reveals that the upper limit for the HD Graphics 3000 is 4096. Anything above this fails.
      I tried using GL_TEXTURE_2D (POT Textures) instead of GL_TEXTURE_RECTANGLE_ARB but the result was the same.
      Here are some snapshots
      At 4096 x 3072, Mr. Hubble is happy. (snapshot at http://troikatronix.com/files/hd-graphics-3000-tex-4096x3072.png)
      At 4104 x 3072, Mr. Hubble is sad. (snapshot at http://troikatronix.com/files/hd-graphics-3000-tex-4104x3072.png)
      At 4360 x 3072, Mr. Hubble is very sad. (snapshot at http://troikatronix.com/files/hd-graphics-3000-tex-4360x3072.png)
      It seems like the driver limits the row bytes at 4096. As you increase the horizontal resolution, it appears that the row bytes are "stuck." Or, once past 4096, the GPU refuses to accept data from glTexSubImage2D. Here's a little movie that shows what happens as the resolution increases from 4096 to 5120:
      http://troikatronix.com/files/hd-graphics-3000-test-desktop.m4v
      Considering the "garbage" that one sees moving up the frame in this movie, I strongly suspect the GPU is refusing to load new data into the texture -- it's just using what was already there from the time it worked at 4096.
      I quickly modified one of the NeHe tutorials to exhibit this problem. You can download the sample project at http://troikatronix.com/files/simple-opengl-example.zip
      All insights and info much appreciated.
      Best Wishes,
      Mark

      Media Artist & Creator of Isadora
      Macintosh SE-30, 32 Mb RAM, MacOS 7.6, Dual Floppy Drives

      • eight

        Hi Mark,

        Did you try turning off Automatic Graphics Switching? That's what I had to do to get past a "no CUDA-capable device is detected" error on my MBP Retina with NVIDIA: System Preferences -> Energy Saver -> uncheck Automatic Graphics Switching, which keeps the NVIDIA card active at all times.

        --8

        Analysis: http://post.scriptum.ru | Synthesis: http://onewaytheater.us
        Twitter: https://twitter.com/eight_io | Flickr: http://www.flickr.com/photos/eight_io/
        Github: https://github.com/eighteight | MulchCam: https://mulchcam.com
        MulchTune: https://itunes.apple.com/us/app/mulch-tune/id1070973465 | Augmented Theatre: https://augmentedtheatre.com

        • mark

          Dear Eight,

          Well, I'm testing a new Isadora version. I purposely test in both integrated and discrete mode to ensure Izzy plays nice with both. In so doing, I found this bug in the driver itself. So it's not a matter of working around it; it's a matter of submitting a bug report to Apple and saying "fix it"!
          BTW, to skip the System Preferences thing, you might want to download [gfxCardStatus](http://codykrieger.com/gfxCardStatus). It puts the option for which card you are using right in the menu bar. Very handy and free.
          Best Wishes,
          Mark

          Media Artist & Creator of Isadora
          Macintosh SE-30, 32 Mb RAM, MacOS 7.6, Dual Floppy Drives

          • eight

            Hi Mark,

            FWIW: I ran your test program on my Retina's integrated HD 4000 (512 MB VRAM) -- with the provided 6000x4500 texture it runs fine. I then created a 12000x4500 image, and the program crashes with the integrated card, while it still works with the NVIDIA GeForce GT 650M (1024 MB VRAM).
            If my calculation is correct (6000x4500x8 = 216 MB and 12000x4500x8 = 432 MB, assuming 8 bytes per pixel), this behavior is in line with the HD 4000's 512 MB memory rating: 216 MB fits comfortably, while 432 MB plus everything else on the GPU pushes past the limit. If it's 4 bytes per pixel, the 12000x4500 crash would be a surprise, but it could still perhaps be explained by the memory overhead a texture requires on the GPU.
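            Quick sketch of that arithmetic (the helper name is only for illustration):

            #include <stddef.h>

            /* Size in bytes of an uncompressed width x height texture. */
            static size_t textureBytes(size_t width, size_t height, size_t bytesPerPixel)
            {
                return width * height * bytesPerPixel;
            }

            /* textureBytes( 6000, 4500, 8) = 216,000,000 bytes, about 216 MB */
            /* textureBytes(12000, 4500, 8) = 432,000,000 bytes, about 432 MB */
            /* textureBytes(12000, 4500, 4) = 216,000,000 bytes, about 216 MB */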
            --8

            Analysis: http://post.scriptum.ru | Synthesis: http://onewaytheater.us
            Twitter: https://twitter.com/eight_io | Flickr: http://www.flickr.com/photos/eight_io/
            Github: https://github.com/eighteight | MulchCam: https://mulchcam.com
            MulchTune: https://itunes.apple.com/us/app/mulch-tune/id1070973465 | Augmented Theatre: https://augmentedtheatre.com
