'''OpenGL extension OES.required_internalformat

This module customises the behaviour of the
OpenGL.raw.GLES1.OES.required_internalformat to provide a more
Python-friendly API

Overview (from the spec)

    The ES 1.1 API allows an implementation to store texture data internally
    with arbitrary precision, regardless of the format and type of the data
    supplied by the application. Similarly, ES allows an implementation to
    choose an arbitrary precision for the internal storage of image data
    allocated by glRenderbufferStorageOES.

    While this allows flexibility for implementations, it does mean that an
    application does not have a reliable means to request the implementation
    maintain a specific precision or to find out what precision the
    implementation will maintain for a given texture or renderbuffer image.

    For reference, "Desktop" OpenGL uses the <internalformat> argument to
    glTexImage*, glCopyTexImage* and glRenderbufferStorageEXT as a hint,
    defining the particular base format and precision that the application
    wants the implementation to maintain when storing the image data.
    Further, the application can choose an <internalformat> with a different
    base internal format than the source format specified by <format>. The
    implementation is not required to exactly match the precision specified
    by <internalformat> when choosing an internal storage precision, but it
    is required to match the base internal format of <internalformat>.

    In addition, ES 1.1 does not allow an implementation to fail a request
    to glTexImage2D for any of the legal <format> and <type> combinations
    listed in Table 3.4, even if the implementation does not natively
    support data stored in that external <format> and <type>. However, there
    are no additional requirements placed on the implementation. The ES
    implementation is free to store the texture data with lower precision
    than originally specified, for instance. Further, since ES removes the
    ability to query the texture object to find out what internal format it
    chose, there is no way for the application to find out that this has
    happened.

    This extension addresses the situation in two ways:

        1) This extension introduces the ability for an application to
           specify the desired "sized" internal formats for texture image
           allocation.

        2) This extension guarantees to maintain at least the specified
           precision of all available sized internal formats.

    An implementation that exports this extension is committing to support
    all of the legal values for <internalformat> in Tables 3.4, 3.4.x, and
    3.4.y, subject to the extension dependencies described herein. That is
    to say, the implementation is guaranteeing that choosing an
    <internalformat> argument with a value from these tables will not cause
    an image allocation request to fail. Furthermore, it is guaranteeing
    that for any sized internal format, the renderbuffer or texture data
    will be stored with at least the precision prescribed by the sized
    internal format.

The official definition of this extension is available here:
http://www.opengl.org/registry/specs/OES/required_internalformat.txt
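
A minimal usage sketch (illustrative, not from the spec): with this
extension exported, passing a sized internal format such as GL_RGBA8_OES
as <internalformat> is guaranteed to succeed and to be stored with at
least 8 bits per component. The width, height and image_data names below
are placeholders supplied by the application:

    from OpenGL.GLES1 import (
        glTexImage2D, GL_TEXTURE_2D, GL_RGBA, GL_UNSIGNED_BYTE,
    )
    glTexImage2D(
        GL_TEXTURE_2D, 0,
        GL_RGBA8_OES,       # sized internal format defined by this extension
        width, height, 0,
        GL_RGBA, GL_UNSIGNED_BYTE, image_data,
    )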
'''
from OpenGL import platform, constant, arrays
from OpenGL import extensions, wrapper
import ctypes
from OpenGL.raw.GLES1 import _types, _glgets
from OpenGL.raw.GLES1.OES.required_internalformat import *
from OpenGL.raw.GLES1.OES.required_internalformat import _EXTENSION_NAME
def glInitRequiredInternalformatOES():
    '''Return boolean indicating whether this extension is available'''
    from OpenGL import extensions
    return extensions.hasGLExtension( _EXTENSION_NAME )
### END AUTOGENERATED SECTION
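
# A minimal usage sketch (illustrative, not part of the autogenerated module):
# an application would typically run the availability check above once a GLES1
# context is current, then rely on the sized internal formats. This assumes
# the OES_framebuffer_object extension for the renderbuffer entry point;
# width and height are application-supplied placeholders.
#
#   from OpenGL.GLES1.OES.framebuffer_object import (
#       glRenderbufferStorageOES, GL_RENDERBUFFER_OES,
#   )
#   if glInitRequiredInternalformatOES():
#       # GL_DEPTH_COMPONENT24_OES is a sized format from this extension, so
#       # the implementation must keep at least 24 bits of depth precision.
#       glRenderbufferStorageOES(
#           GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT24_OES, width, height,
#       )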