Retired Document
Important: OpenGL ES was deprecated in iOS 12. To create high-performance code on GPUs, use the Metal framework instead. See Metal.
Glossary
This glossary contains terms that are used specifically for the Apple implementation of OpenGL ES as well as terms that are common in OpenGL ES graphics programming.
- aliased
Said of graphics whose edges appear jagged; can be remedied by performing antialiasing operations.
- antialiasing
In graphics, a technique used to smooth and soften the jagged (or aliased) edges that are sometimes apparent when graphical objects such as text, line art, and images are drawn.
- attach
To establish a connection between two existing objects. Compare bind.
- bind
To create a new object and then establish a connection between that object and a rendering context. Compare attach.
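To make the distinction concrete, here is a minimal sketch in C, assuming an OpenGL ES 2.0 context is current; the function and variable names are illustrative only:

```c
#include <OpenGLES/ES2/gl.h>

// Assumes an OpenGL ES 2.0 context is current on this thread.
// The framebuffer and renderbuffer parameters are assumed to have been created earlier.
static void BindAndAttachExample(GLuint framebuffer, GLuint colorRenderbuffer)
{
    // Bind: create a new buffer object and connect it to the current context.
    GLuint vertexBuffer;
    glGenBuffers(1, &vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);

    // Attach: connect two objects that already exist -- here, a previously
    // created color renderbuffer is attached to a previously created framebuffer.
    glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                              GL_RENDERBUFFER, colorRenderbuffer);
}
```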
- bitmap
A rectangular array of bits.
- buffer
A block of memory managed by OpenGL ES dedicated to storing a specific kind of data, such as vertex attributes, color data or indices.
- clipping
An operation that identifies the area of drawing. Anything not in the clipping region is not drawn.
- clip coordinates
The coordinate system used for view-volume clipping. Clip coordinates result from applying the projection matrix and precede perspective division.
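As a quick summary of how the transforms described in this glossary chain together (a sketch in generic notation, not taken from the original document): a vertex in object coordinates is carried into eye coordinates by the modelview matrix $M$, into clip coordinates by the projection matrix $P$, and perspective division then yields normalized device coordinates.

$$
v_{\text{clip}} = P\,M\,v_{\text{object}}, \qquad
v_{\text{ndc}} = \left(\tfrac{x_c}{w_c},\ \tfrac{y_c}{w_c},\ \tfrac{z_c}{w_c}\right)
\quad\text{where } v_{\text{clip}} = (x_c, y_c, z_c, w_c).
$$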
- completeness
A state that indicates whether a framebuffer object meets all the requirements for drawing.
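For example, completeness can be queried at runtime; a minimal sketch, assuming the framebuffer object in question is currently bound:

```c
#include <OpenGLES/ES2/gl.h>
#include <stdio.h>

// Assumes the framebuffer object to test is bound to GL_FRAMEBUFFER.
static int IsFramebufferComplete(void)
{
    GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
    if (status != GL_FRAMEBUFFER_COMPLETE) {
        // The status code identifies which requirement is not met.
        printf("Framebuffer incomplete: 0x%04x\n", status);
        return 0;
    }
    return 1;
}
```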
- context
A set of OpenGL ES state variables that affect how drawing is performed to a drawable object attached to that context. Also called a rendering context.
- culling
Eliminating parts of a scene that can’t be seen by the observer.
- current context
The rendering context to which OpenGL ES routes commands issued by your app.
- current matrix
A matrix used by OpenGL ES 1.1 to transform coordinates from one system to another, such as the modelview matrix, the projection matrix, and the texture matrix. GLSL ES uses user-defined matrices instead.
- depth
In OpenGL, the z coordinate that specifies how far a pixel lies from the observer.
- depth buffer
A block of memory used to store a depth value for each pixel. The depth buffer is used to determine whether a pixel can be seen by the observer. When depth testing is enabled, each fragment rasterized by OpenGL ES must pass a depth test that compares the incoming depth value to the value stored in the depth buffer; only fragments that pass the depth test are written to the framebuffer.
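A minimal sketch of enabling depth testing, assuming the bound framebuffer has a depth renderbuffer attached; the function name is illustrative:

```c
#include <OpenGLES/ES2/gl.h>

// Enable depth testing so fragments farther from the observer are discarded.
static void EnableDepthTesting(void)
{
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LESS);          // pass fragments closer than the stored depth
    glClearDepthf(1.0f);           // reset the depth buffer to the far plane
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
}
```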
- double buffering
The practice of using two buffers to avoid resource conflicts between two different parts of the graphic subsystem. The front buffer is used by one participant and the back buffer is modified by the other. When a swap occurs, the front and back buffer change places.
- drawable object
An object allocated outside of OpenGL ES that can be used as part of an OpenGL ES framebuffer object. On iOS, the only type of drawable object is the `CAEAGLLayer` class, which integrates OpenGL ES rendering into Core Animation.
- extension
A feature of OpenGL ES that’s not part of the OpenGL ES core API and therefore not guaranteed to be supported by every implementation of OpenGL ES. The naming conventions used for extensions indicate how widely accepted the extension is. The name of an extension supported only by a specific company includes an abbreviation of the company name. If more than one company adopts the extension, the extension name is changed to include `EXT` instead of a company abbreviation. If the Khronos OpenGL Working Group approves an extension, the extension name changes to include `OES` instead of `EXT` or a company abbreviation.
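Because extensions are optional, an app should test for them at runtime before using them. A minimal sketch, assuming a current context; the extension name in the usage comment is just an example:

```c
#include <OpenGLES/ES2/gl.h>
#include <string.h>

// Returns nonzero if the current context advertises the named extension.
// Note: strstr is a simple substring check, which is adequate for a sketch.
static int HasExtension(const char *name)
{
    const GLubyte *extensions = glGetString(GL_EXTENSIONS);
    return extensions != NULL && strstr((const char *)extensions, name) != NULL;
}

// Example usage: if (HasExtension("GL_OES_vertex_array_object")) { ... }
```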
- eye coordinates
The coordinate system with the observer at the origin. Eye coordinates are produced by the modelview matrix and passed to the projection matrix.
- filtering
A process that modifies an image by combining pixels or texels.
- fog
An effect achieved by fading colors to a background color based on the distance from the observer. Fog provides depth cues to the observer.
- fragment
The color and depth values calculated when rasterizing a primitive. Each fragment must pass a series of tests before being blended with the pixel stored in the framebuffer.
- system framebuffer
A framebuffer provided by an operating system. This type of framebuffer supports integrating OpenGL ES into an operating system’s windowing system. iOS does not use system framebuffers. Instead, it provides framebuffer objects that are associated with a Core Animation layer.
- framebuffer attachable image
The rendering destination for a framebuffer object.
- framebuffer object
A framebuffer that is managed entirely by OpenGL ES. A framebuffer object contains state information for an OpenGL ES framebuffer and its set of images, called renderbuffers. Framebuffer objects are built into OpenGL ES 2.0 and later, and all iOS implementations of OpenGL ES 1.1 are guaranteed to support framebuffer objects (through the `OES_framebuffer_object` extension).
- frustum
The region of space that is seen by the observer and that is warped by perspective division.
- image
A rectangular array of pixels.
- interleaved data
Arrays of dissimilar data that are grouped together, such as vertex data and texture coordinates. Interleaving can speed data retrieval.
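A sketch of interleaving positions and texture coordinates in one array and describing the layout with a stride; the struct, attribute locations, and function name are illustrative, and the interleaved data is assumed to be in a buffer object bound to `GL_ARRAY_BUFFER`:

```c
#include <OpenGLES/ES2/gl.h>
#include <stddef.h>

// One interleaved record per vertex: position followed by texture coordinates.
typedef struct {
    GLfloat position[3];
    GLfloat texCoord[2];
} Vertex;

// Assumes the interleaved vertex data is in a buffer bound to GL_ARRAY_BUFFER,
// and that attribute locations 0 and 1 match the bound shader program.
static void ConfigureInterleavedAttributes(void)
{
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                          (const GLvoid *)offsetof(Vertex, position));
    glEnableVertexAttribArray(1);
    glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                          (const GLvoid *)offsetof(Vertex, texCoord));
}
```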
- mipmaps
A set of texture maps, provided at various resolutions, whose purpose is to minimize artifacts that can occur when a texture is applied to a geometric primitive whose onscreen resolution doesn’t match the source texture map. Mipmapping derives from the Latin phrase multum in parvo, which means “many things in a small place.”
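A sketch of requesting mipmapped filtering for a texture, assuming a 2D texture with its base image already uploaded is bound to `GL_TEXTURE_2D`:

```c
#include <OpenGLES/ES2/gl.h>

// Assumes a 2D texture with its base image uploaded is bound to GL_TEXTURE_2D.
static void UseMipmappedFiltering(void)
{
    glGenerateMipmap(GL_TEXTURE_2D);   // build the chain of smaller levels
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);   // trilinear minification
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}
```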
- modelview matrix
A 4 x 4 matrix used by OpenGL to transform points, lines, polygons, and positions from object coordinates to eye coordinates.
- multisampling
A technique that takes multiple samples at a pixel and combines them with coverage values to arrive at a final fragment.
- mutex
A mutual exclusion object in a multithreaded app.
- packing
Converting pixel color components from a buffer into the format needed by an app.
- pixel
A picture element—the smallest element that the graphics hardware can display on the screen. A pixel is made up of all the bits at the location x, y, in all the bitplanes in the framebuffer.
- pixel depth
In a pixel image, the number of bits per pixel.
- pixel format
A format used to store pixel data in memory. The format describes the pixel components (red, green, blue, alpha), the number and order of components, and other relevant information, such as whether a pixel contains stencil and depth values.
- premultiplied alpha
A pixel whose other components have been multiplied by the alpha value. For example, a pixel whose RGBA values start as (1.0, 0.5, 0.0, 0.5) would, when premultiplied, be (0.5, 0.25, 0.0, 0.5).
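The conversion itself is a simple multiplication; a sketch in plain C, assuming color components stored as normalized floats (the struct and function names are illustrative):

```c
// Multiply the color components of an RGBA color by its alpha value.
typedef struct { float r, g, b, a; } RGBAColor;

static RGBAColor Premultiply(RGBAColor c)
{
    RGBAColor result = { c.r * c.a, c.g * c.a, c.b * c.a, c.a };
    return result;   // (1.0, 0.5, 0.0, 0.5) becomes (0.5, 0.25, 0.0, 0.5)
}
```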
- primitives
The simplest elements in OpenGL—points, lines, polygons, bitmaps, and images.
- projection matrix
A matrix that OpenGL uses to transform points, lines, polygons, and positions from eye coordinates to clip coordinates.
- rasterization
The process of converting vertex and pixel data to fragments, each of which corresponds to a pixel in the framebuffer.
- renderbuffer
A rendering destination for a 2D pixel image, used for generalized offscreen rendering, as defined in the OpenGL specification for the `OES_framebuffer_object` extension.
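A sketch of creating a depth renderbuffer and attaching it to the currently bound framebuffer object; the dimensions are placeholders supplied by the app:

```c
#include <OpenGLES/ES2/gl.h>

// Creates a 16-bit depth renderbuffer and attaches it to the bound framebuffer.
static GLuint CreateDepthRenderbuffer(GLsizei width, GLsizei height)
{
    GLuint depthRenderbuffer;
    glGenRenderbuffers(1, &depthRenderbuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, width, height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                              GL_RENDERBUFFER, depthRenderbuffer);
    return depthRenderbuffer;
}
```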
- renderer
A combination of hardware and software that OpenGL ES uses to create an image from a view and a model.
- rendering context
A container for state information.
- rendering pipeline
The order of operations used by OpenGL ES to transform pixel and vertex data to an image in the framebuffer.
- render-to-texture
An operation that draws content directly to a texture target.
- RGBA
Red, green, blue, and alpha color components.
- shader
A program that computes surface properties.
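A sketch of compiling a shader with the OpenGL ES 2.0 C API (error handling abbreviated; the GLSL ES source string is supplied by the app):

```c
#include <OpenGLES/ES2/gl.h>
#include <stdio.h>

// Compiles GLSL ES source of the given type (GL_VERTEX_SHADER or
// GL_FRAGMENT_SHADER); returns 0 on failure.
static GLuint CompileShader(GLenum type, const char *source)
{
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &source, NULL);
    glCompileShader(shader);

    GLint compiled = 0;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
    if (!compiled) {
        GLchar log[512];
        glGetShaderInfoLog(shader, sizeof(log), NULL, log);
        printf("Shader compile error: %s\n", log);
        glDeleteShader(shader);
        return 0;
    }
    return shader;
}
```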
- shading language
A high-level, C-like language used to write shaders that produce advanced imaging effects.
- stencil buffer
Memory used specifically for stencil testing. A stencil test is typically used to identify masking regions, to identify solid geometry that needs to be capped, and to overlap translucent polygons.
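A sketch of a typical masking use of the stencil buffer, assuming the bound framebuffer has a stencil attachment: first draw the mask shape, writing 1s into the stencil buffer, then draw the scene only where the stencil value equals 1 (function names are illustrative):

```c
#include <OpenGLES/ES2/gl.h>

// Pass 1: write 1 into the stencil buffer wherever the mask geometry is drawn.
static void BeginStencilMask(void)
{
    glEnable(GL_STENCIL_TEST);
    glClearStencil(0);
    glClear(GL_STENCIL_BUFFER_BIT);
    glStencilFunc(GL_ALWAYS, 1, 0xFF);
    glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);  // stencil only
    // ... draw the mask geometry here ...
}

// Pass 2: draw normally, but only where the stencil buffer contains 1.
static void DrawInsideStencilMask(void)
{
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glStencilFunc(GL_EQUAL, 1, 0xFF);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
    // ... draw the masked scene here ...
}
```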
- tearing
A visual anomaly caused when part of the current frame overwrites previous frame data in the framebuffer before the current frame is fully rendered on the screen. iOS avoids tearing by processing all visible OpenGL ES content through Core Animation.
- tessellation
An operation that reduces a surface to a mesh of polygons, or a curve to a sequence of lines.
- texel
A texture element used to specify the color to apply to a fragment.
- texture
Image data used to modify the color of rasterized fragments. The data can be one-, two-, or three-dimensional, or it can be a cube map.
- texture mapping
The process of applying a texture to a primitive.
- texture matrix
A 4 x 4 matrix that OpenGL ES 1.1 uses to transform texture coordinates to the coordinates that are used for interpolation and texture lookup.
- texture object
An opaque data structure used to store all data related to a texture. A texture object can include such things as an image, a mipmap, and texture parameters (width, height, internal format, resolution, wrapping modes, and so forth).
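A sketch of creating a texture object and setting a few of its parameters; the image data, dimensions, and function name are placeholders supplied by the app:

```c
#include <OpenGLES/ES2/gl.h>

// Creates a 2D texture object from RGBA8 pixel data provided by the app.
static GLuint CreateTexture(GLsizei width, GLsizei height, const void *pixels)
{
    GLuint texture;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    return texture;
}
```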
- vertex
A three-dimensional point. A set of vertices specify the geometry of a shape. Vertices can have a number of additional attributes, such as color and texture coordinates. See vertex array.
- vertex array
A data structure that stores a block of data that specifies such things as vertex coordinates, texture coordinates, surface normals, RGBA colors, color indices, and edge flags.
- vertex array object
An OpenGL ES object that records a list of active vertex attributes, the format each attribute is stored in, and the location of the data describing vertices and attributes. Vertex array objects simplify the effort of reconfiguring the graphics pipeline.
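On iOS, OpenGL ES 2.0 typically exposes vertex array objects through the `OES_vertex_array_object` extension (they are core in OpenGL ES 3.0). A sketch, assuming that extension is available, a vertex buffer created earlier, and a shader program that uses attribute location 0:

```c
#include <OpenGLES/ES2/gl.h>
#include <OpenGLES/ES2/glext.h>

// Records a vertex buffer binding and attribute layout in a vertex array object,
// so the whole configuration can later be restored with a single bind call.
static GLuint CreateVertexArray(GLuint vertexBuffer)
{
    GLuint vao;
    glGenVertexArraysOES(1, &vao);
    glBindVertexArrayOES(vao);

    glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 3 * sizeof(GLfloat), NULL);

    glBindVertexArrayOES(0);   // done recording
    return vao;
}
```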
Copyright © 2018 Apple Inc. All Rights Reserved. Updated: 2018-06-04