The sad state of OpenGL on MacOS X

If you want to do hardware-accelerated 3D graphics on Macs, your only option is OpenGL, which is fine except for a few Mac-specific problems:

  • Outdated OpenGL support: Did MacOS X ever have up-to-date OpenGL support? With MacOS X 10.7 Lion, OpenGL 3.2 support was added to Apple’s operating system; until then, only OpenGL 2.1 was available. Currently, 4.2 is the latest version and is already supported on Linux and Windows – some Macs even have 4.2-capable GPUs! Yes, the 4.0 specs are only one year old and were only 6 months old when 10.7 came out, but: Apple is a member of Khronos and thus knows ahead of time what the next specs will look like – that’s how NVidia and ATI manage to provide updated drivers within weeks after a new standard becomes available…
  • Core profile only: With 3.2, OpenGL was split into a compatibility and a core profile. The driver developers (in most cases the hardware vendors, on MacOS: Apple) must implement a core profile and are free to also implement a compatibility profile. In the compatibility profile, all your old code still runs and you also get access to the new features; in core, all the old, deprecated stuff goes away. For new projects, core is the way to go IMHO (see the context-creation sketch after this list), but if you have legacy code, this decision means extra work porting your code base to a 100% core application. Supporting OS X 10.6 and 10.7 at the same time can become a real pain!
  • Buggy drivers: Apple’s drivers are far from bug-free. The situation is subjectively worse than with ATI drivers a few years ago. Uniform buffers perform very badly, and glGetActiveUniformName reports wrong names for uniforms from uniform blocks which have no instance name (Bug ID 11335828). Attributeless rendering is broken (OS X 10.7.3, Bug ID 11335272 & 8997308) – it doesn’t ‘just’ not work (with some kind of wrong OpenGL error), the application crashes because of a floating point exception inside the driver (a minimal sketch of this rendering pattern follows after this list)! Large 3D texture allocations can result in garbage data and possibly a crash of OS X (Bug ID 11226795). Uniform structs can return garbage depending on the order of their elements (Bug ID 10518401)! I also had driver crashes related to shadow sampler lookups which disappeared when I changed the arithmetic at a different point in the shader code (trivial changes like adding 0 to a variable…) – talk about broken GLSL compilers…
  • Bad developer support: Of all the bugs I reported, only once did I get a useful answer (and that, surprisingly, within a day): the bug is known and will get fixed in the next release (the attributeless rendering bug). The normal reaction is no reaction, maybe a note that the bug is known together with the Bug ID of the original report – which doesn’t help you at all, because you can’t look that report up and you don’t get any further updates on it. I have (non-graphics related) bug reports open for years now without updates and without fixes. Apple’s forums are also of little help, it’s very rare to get answers from Apple staff via that channel…
  • No hi-end GPUs: You don’t get much choice of GPUs for notebooks, and these GPUs are never comparable to their desktop counterparts. But even the desktop Macs lack the option of hi-end graphics – the iMacs have mobile chips (non-upgradeable) and the MacPros shipped with GPUs that were only mid-range when they were released (and it’s even worse now that the MacPros haven’t gotten an update since 2010…). You can’t just put your favorite PCIe board in a MacPro. If you want to develop on a system that will be mid-range by the time you hit the market, you need a hi-end system today – so don’t get a Mac.
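
For new code the decision is simple: just ask for a core profile at context creation time. Here is a minimal sketch of how that looks with GLFW 3 (GLFW is only used for illustration here, the post itself doesn’t depend on it; window size and title are placeholders):

    // Sketch: request an OpenGL 3.2 core profile context via GLFW 3 and print the version.
    // Note: GLFW is an assumption for this example, not something the post uses.
    #include <GLFW/glfw3.h>
    #include <cstdio>

    int main() {
        if (!glfwInit()) return 1;

        // Ask for a 3.2 core profile; OS X additionally requires the
        // forward-compatibility flag for any core profile context.
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
        glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);

        GLFWwindow* window = glfwCreateWindow(640, 480, "core profile test", NULL, NULL);
        if (!window) { glfwTerminate(); return 1; }   // no core profile available
        glfwMakeContextCurrent(window);

        // On OS X 10.7/10.8 this reports a 3.2 context at best; current Linux and
        // Windows drivers can give you a 4.x context here.
        printf("GL_VERSION: %s\n", (const char*)glGetString(GL_VERSION));

        glfwDestroyWindow(window);
        glfwTerminate();
        return 0;
    }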
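
The ‘attributeless rendering’ mentioned above is nothing exotic, by the way: you bind an empty VAO, issue a draw call without any vertex attributes and derive the positions from gl_VertexID in the vertex shader. A stripped-down sketch of that pattern (shader compilation and error checking omitted; ‘program’ stands for an already linked shader program):

    // Attributeless full-screen triangle: no VBO, no vertex attributes,
    // positions come from gl_VertexID alone.
    const char* vertexShaderSrc =
        "#version 150 core\n"
        "void main() {\n"
        "    vec2 pos = vec2((gl_VertexID << 1) & 2, gl_VertexID & 2);\n"
        "    gl_Position = vec4(pos * 2.0 - 1.0, 0.0, 1.0);\n"
        "}\n";

    // ... compile and link vertexShaderSrc (plus a fragment shader) into 'program' ...

    GLuint emptyVAO;
    glGenVertexArrays(1, &emptyVAO);   // the core profile still requires a bound VAO
    glBindVertexArray(emptyVAO);

    glUseProgram(program);
    glDrawArrays(GL_TRIANGLES, 0, 3);  // this kind of draw call triggered the driver crash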

But why is the situation so bad? I guess part of the problem is Apple’s decision to develop the drivers itself, similar to how Microsoft develops the Direct3D front end in-house. This way we have to wait longer for OpenGL updates, as NVidia and AMD can’t just ship updated drivers. The GLSL compiler is based on llvm and is probably responsible for some of the bugs I found. Implementing such a complex low-level API is not an easy task, so why do you want to do it yourself, Apple? Why not just use the implementations from the experts at NVidia, AMD and Intel, who at least know their own hardware like the back of their hands?

I have little hope of seeing this improve. Even OpenGL 3.3, a minor update that should work on all 3.2 capable cards, is not planned for the next release of OS X…

Update 6/16/2013: The ‘next release’ stated above was 10.8. The ‘next version’ as of today, MacOS X Mavericks (10.9), will finally support OpenGL 4.1.



6 thoughts on “The sad state of OpenGL on MacOS X”
  • Matt says:

    They’ve got their issues for sure, but driver development is hard and time-intensive. The UBO spec in particular is a nightmare. I personally think GL4 is a total mess. And it reimplements OpenCL inside of OpenGL which is insane. And having only a Core profile makes it much easier to optimize stuff since all the crap from the old gl specs is gone. I don’t know why they even created the Compatibility profile since it means the runtime can’t make any assumptions and it means people will not update their code.

    The GLSL compiler is not based on llvm. And the nvidia and ati drivers are quite buggy themselves, especially if you use really new stuff. If you see bad performance from UBO you probably need to either discard your buffers or double buffer because you can’t modify one that’s being used to render. GL makes this unclear in general, but that’s an API limitation more than anything.

    Though if you really have pressing issues you should mail the apple ogl mailing list.
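
    A minimal sketch of the buffer handling suggested here, assuming a uniform buffer that is re-filled every frame: re-specifying the data store ‘orphans’ the old storage, so rendering that still references it can finish while a fresh copy is filled. The function name and parameters below are placeholders for illustration only.

        // Illustrative sketch of UBO orphaning; not code from the post or the comment.
        void updateUniformBuffer(GLuint ubo, GLuint bindingPoint,
                                 const void* data, GLsizeiptr size)
        {
            glBindBuffer(GL_UNIFORM_BUFFER, ubo);
            // Re-specifying the store orphans the old storage: draws that still
            // reference it can finish while we fill a fresh buffer.
            glBufferData(GL_UNIFORM_BUFFER, size, NULL, GL_STREAM_DRAW);
            glBufferSubData(GL_UNIFORM_BUFFER, 0, size, data);
            glBindBufferBase(GL_UNIFORM_BUFFER, bindingPoint, ubo);
        }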

    • Robert says:

      You are absolutely right, driver development is hard and the UBO spec isn’t the simplest part of the specs. That’s why I’m wondering why Apple tries to do it on their own instead of letting NVidia/AMD/Intel do the porting of their code. Yes, those have problems as well, but I’ve seen much less trouble with NVidia and even AMD than with Apple :-(

      Apple is using llvm inside of their OpenGL stack (http://llvm.org/devmtg/2007-05/10-Lattner-OpenGL.pdf) and I think the other sources claiming that it’s used for GLSL as well (e.g. http://www.phoronix.com/scan.php?page=news_item&px=OTI2NA) are accurate. Unless that has changed lately; in that case I would be interested in sources on how it’s done now.

      The problems I find get a bug report so Apple is informed.

      • Robert Wm. "Ruedii" says:

        Linux has been using LLVM for its OpenGL drivers for some time without any issues. However, the developers also kept support for the old drivers for 3 years until the LLVM-based drivers were deemed stable, instead of launching them while LLVM was not yet up to the task.

        However, they don’t compile everything with -O3 the way Apple is known to, and they accurately blacklist code that won’t compile correctly with -O3, which Apple is known not to do. This includes the LLVM compiler itself.

        They also fix bugs rather than just posting workarounds for them.

  • Chris Bentley says:

    Hi Robert,

    hopefully we can improve your developer support experience. Feel free to drop me a line.

    chrisb

    Chris Bentley
    Mac 3D Manager
    Advanced Micro Devices, Inc.
    90 Central St.
    Boxborough MA, 01719

    email: chris.bentley@amd.com
    cell: (508) 259-2139
    main: (978) 795-8718

  • Nicol Bolas says:

    “With 3.2 you can chose between a compatibility and a core profile in OpenGL.”

    This is a common misconception. In 3.2 and above, the core profile is the only required profile. The compatibility profile is entirely optional. And as far as I’m concerned, the sooner it dies, the better. To me, MacOSX supporting only core OpenGL from here on out is a good thing.

    Granted, their unwillingness to support anything more than 3.2 is a very, very bad thing. So it doesn’t exactly even out overall.

    • Robert says:

      You are right, only core is mandatory (I reformulated that sentence to make it clear).
      My point wasn’t that the missing compatibility profile is a ‘bug’, but that it makes the transition of old code to modern OpenGL harder. You might argue that forcing the developers to port the whole app to core is better in the long run than ending up with a mixture of modern and deprecated functions.
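
      To give a rough idea of what that transition means in practice (a simplified illustration, not code from this discussion): a triangle that a compatibility context still accepts in immediate mode needs buffer, vertex-array and shader setup in core:

        // Deprecated immediate mode: only works in a compatibility context.
        glBegin(GL_TRIANGLES);
        glVertex2f(-0.5f, -0.5f);
        glVertex2f( 0.5f, -0.5f);
        glVertex2f( 0.0f,  0.5f);
        glEnd();

        // Core profile: the same triangle needs a VBO, a VAO and a shader program.
        const float verts[] = { -0.5f, -0.5f,  0.5f, -0.5f,  0.0f, 0.5f };
        GLuint vbo, vao;
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
        glGenVertexArrays(1, &vao);
        glBindVertexArray(vao);
        glEnableVertexAttribArray(0);
        glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, (void*)0);
        // ... plus compiling and binding a GLSL vertex/fragment shader pair ...
        glDrawArrays(GL_TRIANGLES, 0, 3);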
