3GPP — 3rd Generation Partnership Project - An industry-standard multimedia file format designed for use on third-generation (3G) mobile networks.
AAC — Advanced Audio Coding. It was initially also called MPEG-2 NBC, for "Non-Backwards Compatible", as opposed to the backwards-compatible MPEG-1 and MPEG-2 BC (5.1-channel) standards. It is considered the current state of the art in general audio coding and the natural successor of MPEG-1/2 Layer III (MP3) within the MPEG-4 multimedia standard, which uses MP4 as the container format for all kinds of content.
AC3 — Initially known as Audio Coding 3, AC3 is a synonym for Dolby Digital these days. Dolby Digital is an advanced audio compression technology that can encode up to six discrete channels at bitrates of up to 448 kbit/s.
AMD — Advanced Micro Devices, Inc. (abbreviated AMD; NYSE: AMD) is a manufacturer of integrated circuits based in Sunnyvale, California. It is the second-largest supplier of x86-compatible processors, and a leading supplier of non-volatile flash memory. It was founded in 1969 by a group of defectors from Fairchild Semiconductor, including Jerry Sanders. AMD's current CEO is Dr. Hector Ruiz. The current president and Chief Operating Officer is Dirk Meyer.
API — application programming interface.
ARM — The ARM architecture (originally the Acorn RISC Machine) is a 32-bit RISC processor architecture that is widely used in a number of embedded designs. Due to their power-saving features, ARM CPUs are dominant in the mobile electronics market, where low power consumption is a critical design goal.
ASF — Advanced Systems Format (originally Advanced Streaming Format). Microsoft's answer to RealMedia and streaming media in general.
ATSC — Advanced Television Systems Committee. In 1978, the Federal Communications Commission (FCC) empaneled the Advisory Committee on Advanced Television Service (ACATS) as an investigatory and advisory committee to develop information that would assist the FCC in establishing an advanced broadcast television (ATV) standard for the United States. This committee created a subcommittee, the ATSC, to explore the need for and to coordinate development of the documentation of Advanced Television Systems. In 1993, the ATSC recommended that efforts be limited to a digital television system (DTV), and in September 1995 issued its recommendation for a Digital Television System standard, which was approved with the exclusion of compression format constraints (picture resolution, frame rate, and frame sequence).
AVC/H.264 — The latest ratified video coding standard, the result of joint development by the ITU-T Video Coding Experts Group (VCEG) and ISO/IEC MPEG. The standard is known as H.264 (its ITU-T name), MPEG-4 Part 10 (ISO/IEC 14496-10), or MPEG-4 Advanced Video Coding (AVC).
AVI — Audio/Video Interleaved - A file format designed by Microsoft in which chunks of audio and video data alternate one after another. During playback the soundtrack is synchronized with the video.
Bandwidth — A range within a band of frequencies or wavelengths.
Bitrate — The rate of video/audio data transfer, measured in kilobits per second. The higher the bitrate, the more disk space the stream occupies.
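The size/bitrate relationship above can be sketched with simple arithmetic; the bitrates and running time below are illustrative values, not figures from this glossary:

```python
# Rough file-size estimate for an encoded stream: size = bitrate * duration.

def stream_size_mb(bitrate_kbps: float, duration_s: float) -> float:
    """Return the approximate stream size in megabytes (1 MB = 10**6 bytes)."""
    bits = bitrate_kbps * 1000 * duration_s
    return bits / 8 / 1_000_000

# A 90-minute movie at 4000 kbit/s video plus 448 kbit/s AC3 audio:
total = stream_size_mb(4000 + 448, 90 * 60)
print(f"{total:.1f} MB")  # 3002.4 MB
```

The same formula run in reverse is how encoders pick a target bitrate to fit a stream onto fixed-capacity media such as a DVD.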
C, C++, C# — programming languages.
C-Cube — C-Cube Microsystems is the leading global provider of digital video silicon and digital broadcast systems. This technology is used worldwide to enable digital video communications and consumer applications.
CAVLC, CABAC — Context-Adaptive Variable Length Coding and Context-Adaptive Binary Arithmetic Coding are tools for entropy coding of the bitstream syntax (macroblock type, motion vectors + reference index, etc.). CAVLC is the default entropy coding method in AVC/H.264. CABAC is a more powerful method that can reduce the bitrate by a further 10-15% (especially at high bitrates). Both methods are lossless and therefore never harm quality; however, CABAC slows down both encoding and decoding.
Closed Captions — Textual video overlays that are not normally visible, as opposed to open captions, which are a permanent part of the picture. Captions are usually a textual representation of the spoken audio. In the United States, the official NTSC Closed Caption standard requires that all TVs larger than 13 inches include circuitry to decode and display caption information stored on line 21 of the video signal. DVD-Video can provide closed caption data, but the subpicture format is preferred for its versatility.
Codec — coder/decoder or compression/decompression software or hardware module. Codecs are used to encode and decode (or compress and decompress) various types of data that would otherwise use up huge amount of disk space, such as raw sound and video files. Codecs can be used with either streaming (live media data) or files-based (AVI, WAV, MP4 etc) content.
COM — The Component Object Model is an object-oriented programming model used by numerous applications.
D1 — Full TV video resolution: 720x576 (PAL-based) or 704x480 (NTSC-based). Subsets of D1 are CIF (352x288 and 352x240, respectively) and QCIF (176x144 and 176x120, respectively).
Deinterlace — Deinterlacing is the process of converting an interlaced video stream into a progressive one.
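A minimal sketch of one deinterlacing strategy ("bob"): each field, which holds half the lines of a frame, is expanded into a full progressive frame. Frames are modeled as lists of rows for illustration; a real filter would interpolate rather than duplicate lines:

```python
# "Bob" deinterlacing sketch: split a frame into fields, then line-double
# each field back up to full height.

def split_fields(frame):
    """Split a frame into its top (even-line) and bottom (odd-line) fields."""
    return frame[0::2], frame[1::2]

def bob(field):
    """Line-double a field into a full progressive frame."""
    out = []
    for row in field:
        out.extend([row, row])  # naive duplication; real filters interpolate
    return out

frame = ["line0", "line1", "line2", "line3"]
top, bottom = split_fields(frame)
print(top)       # ['line0', 'line2']
print(bob(top))  # ['line0', 'line0', 'line2', 'line2']
```

Because each field becomes its own output frame, bob doubles the frame rate; "weave" deinterlacing instead recombines the two fields into one frame, trading motion artifacts for full vertical resolution.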
Demultiplexing — The opposite of multiplexing. In this process a combined audio/video stream will be separated into the number of streams it consists of.
DirectShow® — Microsoft® DirectShow® is an architecture for streaming media on the Microsoft Windows® platform.
DirectX — Microsoft® DirectX® is a set of low-level application programming interfaces (APIs) for creating games and other high-performance multimedia applications. It includes support for two-dimensional (2-D) and three-dimensional (3-D) graphics, sound effects and music, input devices, and networked applications such as multiplayer games.
DSP — Digital Signal Processor.
DTV — Digital Television. DTV can carry more channels in the same bandwidth (6 MHz, or 7 MHz in Europe) than analog TV, and can deliver high-definition TV programs.
DXVA — Microsoft DirectX Video Acceleration. DXVA is a specification for hardware acceleration of digital video decoding processing.
EDK — The Xilinx Embedded Development Kit (EDK) provides a suite of tools, libraries, and IP to create an embedded PowerPC and/or MicroBlaze system on a Xilinx FPGA.
Elementary stream (ES) — A single video or audio stream without a container. For instance, a basic MPEG-2 video stream (.m2v or .mpv) is an MPEG-2 ES; on the audio side there are AC3, MP2, etc. Most DVD authoring programs require ES as input.
FPGA — A Field Programmable Gate Array or FPGA is a semiconductor device containing programmable logic components and programmable interconnects. The programmable logic components can be programmed to duplicate the functionality of basic logic gates (such as AND, OR, XOR, NOT) or more complex combinatorial functions such as decoders or simple math functions.
GOP — Group Of Pictures in MPEG streams.
HDTV — High-Definition Television, a new type of television that provides much better resolution than current televisions based on the NTSC standard. HDTV is a digital TV broadcasting format where the broadcast transmits widescreen pictures with more detail and quality than found in a standard analog television, or other digital television formats. HDTV is a type of Digital Television (DTV) broadcast, and is considered to be the best quality DTV format available.
Inter prediction — the process of predicting blocks of pixels based on temporal dependency between two or more frames. Also referred to as temporal prediction.
Interlaced — Interlaced is a video storage mode. An interlaced video stream doesn't contain frames (pictures as we know them) but fields with each field containing half of the lines of one frame (all even or all odd lines).
Intra prediction — the process of predicting blocks of pixels based on spatial dependency (i.e. within the frame). Also referred to as Spatial prediction.
ISMA — Internet Streaming Media Alliance - Group promoting the use of RTSP, RTP, and a subset of the MPEG-4 specification.
ISO — International Organization for Standardization. ISO also makes MPEG standard documents, reference software, and conformance streams publicly available for free.
JPEG — JPEG is a lossy compression technique for color images. Although it can reduce file sizes to about 5% of their normal size, some detail is lost in the compression.
MJPEG 2000 — M-JPEG2000 (MJ2K) is a video adaptation of the JPEG2000 standard. It treats a video stream as a series of still photos, compressing each individually.
Motion estimation — The process of estimating motion vectors during the encoding process.
MOV — QuickTime Movie file.
MP3 — MPEG Audio Layer III, standardized by ISO as part of MPEG-1 (ISO/IEC 11172-3) and MPEG-2 (ISO/IEC 13818-3).
MP4 — file format that was designed for storing MPEG-4 data in a file.
MPEG — The Moving Picture Experts Group is a working group of ISO/IEC in charge of the development of international standards for compression, decompression, processing, and coded representation of moving pictures, audio and their combination.
MPEG LA, LLC — Reseller of technology platform patent licenses, enabling users to acquire worldwide patent rights necessary for a particular technology standard or platform from multiple patent holders in a single transaction as an alternative to negotiating separate licenses. Any use of video or software other than consumer personal use in any manner that complies with the MPEG-2, MPEG-4 or AVC standard is expressly prohibited without a license under applicable patents in the MPEG-2, MPEG-4 or AVC patent portfolio, which license is available from MPEG LA, L.L.C. Additional information may be obtained from MPEG LA, L.L.C. See http://www.mpegla.com.
MPEG-1 — Audio and video compression format developed by the Moving Picture Experts Group. Official description: Coding of moving pictures and associated audio for digital storage media at up to about 1.5 Mbit/s.
MPEG-1 System Stream (SS) — multiplexed streams complying with ISO/IEC 11172-1 standard.
MPEG-2 — Audio and video compression format developed by ISO/IEC JTC1/SC29/WG11 and known as ISO/IEC 13818. The MPEG-2 video coding standard is primarily aimed at coding CCIR-601 or higher resolution video with fairly high quality at challenging bitrates of 4 to 9 Mbit/s. It aims at providing CCIR/ITU-R quality for NTSC, PAL, and SECAM, and also at supporting HDTV quality at data rates above 10 Mbit/s, real-time transmission, and progressive and interlaced scan sources.
MPEG-4 SP/ASP — (Simple Profile/Advanced Simple Profile) MPEG-4 is a standard for graphics and video compression that builds on MPEG-1, MPEG-2, and Apple QuickTime technology. MPEG-4 files are smaller than comparable JPEG or QuickTime files, so they are designed to transmit video and images over narrower bandwidth and can mix video with text, graphics, and 2-D and 3-D animation layers. MPEG-4 was standardized in October 1998 as ISO/IEC 14496.
Multiplexing — Usually video and audio are encoded separately, so to make a playable movie the two have to be joined (you could of course play audio and video separately in two players, but keeping them in sync would be rather hard). During multiplexing the audio and video tracks are interleaved into one audio/video stream, and navigational information is added so that the player can, for example, fast-forward or rewind while keeping audio and video in sync.
NQI — Nonlinear Quality Index
NTSC — National Television System Committee. The NTSC is responsible for setting television and video standards in the United States and Japan (in Europe and the rest of the world, the dominant television standards are PAL and SECAM). The NTSC standard for television defines a composite video signal with a refresh rate of 60 half-frames (interlaced) per second. Each frame contains 525 lines and can contain 16 million different colors.
OEM — original equipment manufacturer.
OMAP, OMAP2, OMAP3 technologies — OMAP is Texas Instruments' proprietary microprocessor family for multimedia applications.
PAL — Phase Alternating Line, the dominant television standard in Europe. The United States uses a different standard, NTSC. Whereas NTSC delivers 525 lines of resolution at 60 half-frames per second, PAL delivers 625 lines at 50 half-frames per second. Many video adapters that enable computer monitors to be used as television screens support both NTSC and PAL signals.
PDA — personal digital assistant (Pocket PC).
PES — Packetized Elementary Stream consists of a continuous sequence of PES packets of one elementary stream with one stream ID. When PES packets are used to form a PES stream, they shall include Elementary Stream Clock Reference (ESCR) fields and Elementary Stream Rate (ES_Rate) fields.
PNX Nexperia/TriMedia — Nexperia™ is the Philips brand for a unique group of products that streamline development of next-generation, connected multimedia appliances. From highly integrated, programmable system-on-chip (SoC) and companion ICs to reference designs, system software, and development tools, flexible Nexperia solutions help manufacturers meet market demands for innovative, appealing products in targeted consumer and communications markets.
Program stream — The Program Stream is similar to the MPEG-1 Systems multiplex. It results from combining one or more Packetized Elementary Streams (PES) that have a common time base into a single stream. The Program Stream is designed for use in relatively error-free environments and is suitable for applications that may involve software processing. Program stream packets may be of variable and relatively large length. The MPEG-2 Program Stream is defined in ISO/IEC 13818-1.
PSNR — peak signal to noise ratio.
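PSNR is the most common objective quality metric in video coding, computed from the mean squared error between the original and degraded pictures as 10·log10(MAX²/MSE), where MAX is 255 for 8-bit samples. A minimal sketch over flat pixel sequences (the sample values are illustrative):

```python
# PSNR in dB for 8-bit samples: PSNR = 10 * log10(MAX^2 / MSE), MAX = 255.
import math

def psnr(original, degraded, peak=255):
    """Peak signal-to-noise ratio between two equal-length pixel sequences."""
    mse = sum((a - b) ** 2 for a, b in zip(original, degraded)) / len(original)
    if mse == 0:
        return float("inf")  # identical signals: infinite PSNR
    return 10 * math.log10(peak ** 2 / mse)

ref = [52, 55, 61, 66]   # original pixels
enc = [54, 55, 60, 68]   # pixels after lossy encoding
print(f"{psnr(ref, enc):.1f} dB")  # 44.6 dB
```

As a rule of thumb, higher PSNR means a reconstruction closer to the original; typical lossy video encodes land in the 30-50 dB range.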
PVR — Personal Video Recorder. Also referred to as a DVR (digital video recorder).
QA — quality assurance.
QuickTime — A video and animation system developed by Apple Computer. QuickTime is built into the Macintosh operating system and is used by most Mac applications that include video or animation. PCs can also run files in QuickTime format, but they require a special QuickTime driver. QuickTime supports most encoding formats, including Cinepak, JPEG, and MPEG. QuickTime is competing with a number of other standards, including AVI and ActiveMovie.
RTP — RTP is the Internet-standard protocol for the transport of real-time data, including audio and video. It can be used for media-on-demand as well as interactive services such as Internet telephony. RTP consists of a data and a control part. The latter is called RTCP.
RTSP — Real Time Streaming Protocol, a protocol for controlling streaming media servers, clients, and infrastructure.
SCTE — The Society of Cable Telecommunications Engineers is an organization that develops training for cable television installers and engineers; in this role it is analogous to the Society of Broadcast Engineers for broadcast television. SCTE is also an ANSI-recognized standards-developing organization for the cable industry.
SDK — software development kit.
SECAM — Séquentiel Couleur à Mémoire, the television standard in Russia and France. SECAM delivers 625 lines at 50 half-frames per second.
Set-Top-Boxes — The term set-top box describes a device that connects to a television and some external signal source and turns the signal into content that is then displayed on the screen.
SoC — System on a Chip.
Streaming — Data is streaming when it moves continuously from one piece of hardware to another and doesn't have to arrive in its entirety before the destination device can start doing something with it.
Subpicture — Graphic bitmap overlays used in DVD-Video to create subtitles, captions, karaoke lyrics, menu highlighting effects, and so on.
TI — Texas Instruments Incorporated (TI). TI's platform is one of the newest application processors for cell phones and mobile devices, offering up to 40 percent better performance for a variety of applications while consuming as little as half the power of current TI processors.
Transport stream — The Transport Stream combines one or more Packetized Elementary Streams (PES) with one or more independent time bases into a single stream. Elementary streams sharing a common timebase form a program. The Transport Stream is designed for use in environments where errors are likely, such as storage or transmission in lossy or noisy media. Transport stream packets are 188 bytes long. The MPEG-2 transport stream is defined in the ISO/IEC standard (13818-1).
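The fixed 188-byte packet size makes a transport stream easy to walk: every packet starts with the sync byte 0x47, and the 13-bit PID that identifies the elementary stream lives in the next two bytes. A minimal sketch (the packet contents below are fabricated for illustration):

```python
# Walk MPEG-2 transport stream packets: fixed 188-byte packets, each
# starting with sync byte 0x47; the 13-bit PID spans bytes 1-2.

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def packet_pids(data: bytes):
    """Yield the PID of each complete 188-byte TS packet in `data`."""
    for off in range(0, len(data) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        pkt = data[off:off + TS_PACKET_SIZE]
        if pkt[0] != SYNC_BYTE:
            raise ValueError(f"lost sync at offset {off}")
        yield ((pkt[1] & 0x1F) << 8) | pkt[2]

# Two fabricated packets: one with PID 0x0000 (the PAT), one with PID 0x0100.
pat = bytes([0x47, 0x40, 0x00]) + bytes(185)
pes = bytes([0x47, 0x41, 0x00]) + bytes(185)
print(list(packet_pids(pat + pes)))  # [0, 256]
```

Filtering packets by PID like this is the first step any demultiplexer performs before reassembling the PES packets of each program.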
VB .NET — programming language.
Via Licensing Corp — An independent subsidiary of Dolby Laboratories, enabling users to acquire worldwide patent rights necessary for a particular technology standard or platform from multiple patent holders in a single transaction as an alternative to negotiating separate licenses. Any use of video or software other than consumer personal use in any manner that complies with the MPEG-2 AAC, MPEG-4 Audio or AVC Video standard is expressly prohibited without a license under applicable patents in the MPEG-2 AAC, MPEG-4 Audio or AVC Video patent portfolio, which license is available from Via Licensing Corporation. Additional information may be obtained from Via Licensing Corp. See http://www.vialicensing.com.
VOB Files — All DVD movies are stored in so-called VOB files. VOB files usually contain multiplexed Dolby Digital or MPEG Audio and MPEG-2 video.
VOD — Video on demand (VOD) systems allow users to select and watch video content over a network as part of an interactive television system. VOD systems either "stream" content, allowing viewing while the video is being downloaded, or "download" it in which the program is brought in its entirety to a set-top box before viewing starts.
VQM — Video Quality Metric.
WAV — The Windows standard format for waveform sound files.
Wavelet — A mathematical function used in compressing images. Images compressed using wavelets are smaller than JPEG images and can be transferred and downloaded more quickly. Wavelet technology can compress color images at ratios from 20:1 to 300:1, and grayscale images from 10:1 to 50:1.
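The core idea can be shown with one level of the simplest wavelet, the Haar transform: pairwise averages carry the coarse image, pairwise differences carry the detail, and compression gains come from quantizing away the many near-zero differences. A minimal 1-D sketch (pixel values are illustrative):

```python
# One level of the 1-D Haar wavelet transform and its exact inverse.

def haar_step(signal):
    """Return (averages, details) for one Haar decomposition level."""
    avgs = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    dets = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avgs, dets

def haar_inverse(avgs, dets):
    """Reconstruct the original signal from one Haar level."""
    out = []
    for a, d in zip(avgs, dets):
        out.extend([a + d, a - d])  # a+d and a-d recover the original pair
    return out

pixels = [9, 7, 3, 5]
a, d = haar_step(pixels)
print(a, d)                # [8.0, 4.0] [1.0, -1.0]
print(haar_inverse(a, d))  # [9.0, 7.0, 3.0, 5.0]
```

Image codecs such as JPEG2000 apply this kind of decomposition recursively in two dimensions (with more sophisticated wavelets than Haar) before quantizing and entropy-coding the detail coefficients.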
x86 (Intel, AMD) — x86 or 80x86 is the generic name of a microprocessor architecture first developed and manufactured by Intel. The x86 architecture currently dominates the desktop computer, portable computer, and small server markets.
YUV — YUV formats fall into two distinct groups: the packed formats, where Y, U (Cb) and V (Cr) samples are packed together into macropixels stored in a single array, and the planar formats, where each component is stored as a separate array and the final image is a fusing of the three separate planes.
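The packed/planar distinction shows up directly in buffer sizes. A sketch comparing two common layouts, identified here by their usual FourCC names: I420, a planar 4:2:0 format with chroma subsampled 2x2, and YUY2, a packed 4:2:2 format storing Y0 U Y1 V macropixels:

```python
# Buffer sizes for a planar 4:2:0 layout (I420) vs a packed 4:2:2 layout
# (YUY2), assuming 8 bits per sample and even dimensions.

def i420_size(width, height):
    """Planar 4:2:0: full-res Y plane + two quarter-res chroma planes."""
    y = width * height
    u = v = (width // 2) * (height // 2)
    return y + u + v

def yuy2_size(width, height):
    """Packed 4:2:2: each 2-pixel macropixel stores Y0 U Y1 V = 4 bytes."""
    return width * height * 2

w, h = 720, 576  # D1/PAL resolution, as defined earlier in this glossary
print(i420_size(w, h))  # 622080 bytes (1.5 bytes per pixel)
print(yuy2_size(w, h))  # 829440 bytes (2 bytes per pixel)
```

The 1.5-byte-per-pixel cost of 4:2:0 is why planar I420/YV12 buffers are the usual working format for MPEG-family codecs, while packed formats are more common at capture and display interfaces.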