
Which graphics card? OpenGL 3.3 or later compatible graphics cards

Unless you really have to support ten-year-old graphics cards for some reason, I strongly recommend targeting OpenGL 2.0 instead of 1.4 (in fact, I'd even go as far as targeting version 2.1).

Since using "shaders that are core in 3.0" necessarily means the graphics card must be capable of at least some version of GLSL, this rules out any hardware that cannot provide at least OpenGL 2.0. Which means that if someone has OpenGL 1.4 and can still run your shaders, they are running 8- to 10-year-old drivers. There is little to gain from that, apart from a support nightmare.

Targeting OpenGL 2.1 is reasonable; there are hardly any systems nowadays which don't support it (even assuming a minimum of OpenGL 3.2 may be an entirely reasonable choice).

The market price for an entry-level OpenGL 3.3 compatible card with roughly 1000x the processing power of a high-end OpenGL 1.4 card was around $25 some two years ago. If you ever intend to sell your application, you have to ask yourself whether someone who cannot afford (or does not want to afford) that card is someone you'd reasonably expect to pay for your software.

Having said that, supporting OpenGL 2.x and OpenGL 3.2+ at the same time is a nightmare, because there are non-trivial changes in the shading language which go far beyond #define in varying and which will bite you regularly.

Therefore, I have personally chosen never again to target anything lower than version 3.2 with instanced arrays and shader objects. This works on all hardware that can reasonably be expected to have the processing power to run a modern application, and it includes the users who were too lazy to upgrade their driver to 3.3, providing the same features in a single code path. OpenGL 4.x features are loadable as extensions where available, which is fine.

But, of course, everybody has to decide for himself/herself which shoe fits best.

Enough of my blah blah, back to the actual question: about not duplicating code for extensions/core, you can in many cases use the same names, function pointers, and constants. In practice, most (not all!) extensions are identical to the respective core functionality and work just the same. However, be warned: as a blanket statement, this is illegal, undefined, and dangerous.

But how do you know which ones you can use and which ones will eat your cat? Look at gl.spec - a function which has an alias entry is identical to and indistinguishable from its alias, and you can safely use these interchangeably. Extensions which are problematic often have an explanatory comment somewhere as well (such as "This is not an alias of PrimitiveRestartIndexNV, since it sets server instead of client state."), but do not rely on those comments; rely on the alias field.
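To make the alias point concrete, here is a minimal loading sketch. It assumes SDL2's SDL_GL_GetProcAddress as the loader, and the glDrawArraysInstanced / glDrawArraysInstancedARB pair is only an illustration; check the alias field in gl.spec (or the newer gl.xml registry) yourself before treating any particular pair as interchangeable.

    #include <SDL2/SDL.h>
    #include <SDL2/SDL_opengl.h>

    /* One pointer serves both the core name (GL 3.1+) and its aliased ARB name. */
    typedef void (APIENTRY *DrawArraysInstancedFn)(GLenum mode, GLint first,
                                                   GLsizei count, GLsizei primcount);
    static DrawArraysInstancedFn draw_arrays_instanced;

    /* Call after the GL context exists; returns 0 if neither name resolves. */
    static int load_instancing(void)
    {
        draw_arrays_instanced =
            (DrawArraysInstancedFn)SDL_GL_GetProcAddress("glDrawArraysInstanced");
        if (!draw_arrays_instanced)
            draw_arrays_instanced =
                (DrawArraysInstancedFn)SDL_GL_GetProcAddress("glDrawArraysInstancedARB");
        return draw_arrays_instanced != NULL;
    }

Note that a non-NULL pointer alone is not a guarantee of support on every platform; checking the advertised GL version or extension string as well is the safer habit.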


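On the "3.2 with instanced arrays" baseline above, this is roughly what startup looks like, again assuming SDL2 (the answer names no windowing library, and the extension strings are only examples of the idea; substitute whatever your own baseline needs). A GL 4.x-era feature is probed the same way and simply switched off when absent.

    #include <SDL2/SDL.h>

    /* Request a 3.2 core context and probe optional extensions afterwards. */
    SDL_Window *create_gl32_window(void)
    {
        if (SDL_Init(SDL_INIT_VIDEO) != 0)
            return NULL;

        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);

        SDL_Window *win = SDL_CreateWindow("demo",
            SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
            1280, 720, SDL_WINDOW_OPENGL);
        if (!win || !SDL_GL_CreateContext(win))
            return NULL;                       /* 3.2 core is the hard floor here */

        /* Features beyond 3.2 are picked up as extensions when present:
         * instanced arrays became core in 3.3, tessellation in 4.0. */
        SDL_bool instancing   = SDL_GL_ExtensionSupported("GL_ARB_instanced_arrays");
        SDL_bool tessellation = SDL_GL_ExtensionSupported("GL_ARB_tessellation_shader");
        (void)instancing; (void)tessellation;  /* feed these into your renderer caps */

        return win;
    }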


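And on the "#define in varying" remark: the trick is to prepend a dialect-specific preamble so that one shader body compiles as either GLSL 1.20 or 1.50. The sketch below (the names and the GLEW loader are my assumptions) shows the idea, and also why it only goes so far: keyword remapping does not cover the deeper shading-language differences the answer warns about.

    #include <GL/glew.h>   /* any loader exposing GL 2.0+ works; GLEW is an assumption */

    static const char *preamble_150 =
        "#version 150\n"
        "out vec4 fragColor;\n"
        "#define FRAG_OUT  fragColor\n"
        "#define TEXTURE2D texture\n";

    static const char *preamble_120 =
        "#version 120\n"
        "#define in varying\n"            /* remap fragment inputs */
        "#define FRAG_OUT  gl_FragColor\n"
        "#define TEXTURE2D texture2D\n";

    static const char *frag_body =
        "in vec2 uv;\n"
        "uniform sampler2D tex;\n"
        "void main() { FRAG_OUT = TEXTURE2D(tex, uv); }\n";

    /* Call with a current context and an initialized loader (glewInit()). */
    GLuint compile_fragment(int have_gl3)
    {
        const GLchar *src[2] = { have_gl3 ? preamble_150 : preamble_120, frag_body };
        GLuint sh = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(sh, 2, src, NULL); /* the strings are concatenated in order */
        glCompileShader(sh);
        return sh;                        /* check GL_COMPILE_STATUS in real code */
    }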