Documentation ¶
Overview ¶
Package video contains bindings for the gstvideo C API.
Index ¶
- Constants
- Variables
- func CalculateDisplayRatio(videoWidth, videoHeight, videoParNum, videoParDenom, displayParNum, ... uint) (darNum, darDenom uint, ok bool)
- func ConvertSample(sample *gst.Sample, toCaps *gst.Caps, timeout time.Duration) (*gst.Sample, error)
- func ConvertSampleAsync(sample *gst.Sample, toCaps *gst.Caps, timeout time.Duration, ...)
- func GuessFramerate(dur time.Duration) (destNum, destDenom int, ok bool)
- func MakeRawCaps(formats []Format) *gst.Caps
- func MakeRawCapsWithFeatures(formats []Format, features *gst.CapsFeatures) *gst.Caps
- type Alignment
- type ChromaFlags
- type ChromaMethod
- type ChromaResample
- type ChromaSite
- type ColorBalance
- type ColorBalanceChannel
- type ColorBalanceType
- type ColorMatrix
- type ColorPrimaries
- type ColorPrimariesInfo
- type ColorRange
- type Colorimetry
- type ConvertSampleCallback
- type CropMetaInfo
- type FieldOrder
- type Flags
- type Format
- type FormatFlags
- type FormatInfo
- func (f *FormatInfo) Bits() uint
- func (f *FormatInfo) ComponentDepth(component uint) uint
- func (f *FormatInfo) ComponentHSub(component uint) uint
- func (f *FormatInfo) ComponentWSub(n uint) uint
- func (f *FormatInfo) Flags() FormatFlags
- func (f *FormatInfo) Format() Format
- func (f *FormatInfo) HasAlpha() bool
- func (f *FormatInfo) HasPalette() bool
- func (f *FormatInfo) IsComplex() bool
- func (f *FormatInfo) IsGray() bool
- func (f *FormatInfo) IsLE() bool
- func (f *FormatInfo) IsRGB() bool
- func (f *FormatInfo) IsTiled() bool
- func (f *FormatInfo) IsYUV() bool
- func (f *FormatInfo) Name() string
- func (f *FormatInfo) NumComponents() uint
- func (f *FormatInfo) NumPlanes() uint
- func (f *FormatInfo) Plane(n uint) uint
- func (f *FormatInfo) PlaneOffset(n uint) uint
- func (f *FormatInfo) PlaneStride(n uint) uint
- func (f *FormatInfo) TileHS() uint
- func (f *FormatInfo) TileMode() TileMode
- func (f *FormatInfo) TileWS() uint
- type Info
- func (i *Info) ChromaSite() ChromaSite
- func (i *Info) Colorimetry() *Colorimetry
- func (i *Info) Convert(srcFormat, destFormat gst.Format, srcValue int64) (out int64, ok bool)
- func (i *Info) FPS() *gst.FractionValue
- func (i *Info) FieldHeight() int
- func (i *Info) FieldOrder() FieldOrder
- func (i *Info) FieldRateN() int
- func (i *Info) FlagIsSet(f Flags) bool
- func (i *Info) FlagSet(f Flags) *Info
- func (i *Info) FlagUnset(f Flags) *Info
- func (i *Info) Flags() Flags
- func (i *Info) Format() Format
- func (i *Info) Free()
- func (i *Info) FromCaps(caps *gst.Caps) *Info
- func (i *Info) HasAlpha() bool
- func (i *Info) Height() int
- func (i *Info) InterlaceMode() InterlaceMode
- func (i *Info) IsEqual(info *Info) bool
- func (i *Info) IsGray() bool
- func (i *Info) IsInterlaced() bool
- func (i *Info) IsRGB() bool
- func (i *Info) IsYUV() bool
- func (i *Info) MultiviewFlags() MultiviewFlags
- func (i *Info) MultiviewMode() MultiviewMode
- func (i *Info) Name() string
- func (i *Info) NumComponents() uint
- func (i *Info) NumPlanes() uint
- func (i *Info) PAR() *gst.FractionValue
- func (i *Info) Size() int64
- func (i *Info) ToCaps() *gst.Caps
- func (i *Info) Views() int
- func (i *Info) Width() int
- func (i *Info) WithAlign(align *Alignment) *Info
- func (i *Info) WithFPS(f *gst.FractionValue) *Info
- func (i *Info) WithFormat(format Format, width, height uint) *Info
- func (i *Info) WithInterlacedFormat(format Format, interlaceMode InterlaceMode, width, height uint) *Info
- func (i *Info) WithPAR(f *gst.FractionValue) *Info
- type InterlaceMode
- type KeyEvent
- type MouseEvent
- type MultiviewFlags
- type MultiviewFramePacking
- type MultiviewMode
- type Navigation
- type NavigationCommand
- type NavigationEvent
- type NavigationEventType
- type NavigationMessage
- type NavigationMessageType
- type NavigationQuery
- type NavigationQueryType
- type OrientationMethod
- type PackFlags
- type TileMode
- type TileType
- type TransferFunction
Constants ¶
const (
	TagVideoColorspage gst.Tag = C.GST_META_TAG_VIDEO_COLORSPACE_STR
	TagVideoOrientation gst.Tag = C.GST_META_TAG_VIDEO_ORIENTATION_STR
	TagVideoSize gst.Tag = C.GST_META_TAG_VIDEO_SIZE_STR
	TagVideo gst.Tag = C.GST_META_TAG_VIDEO_STR
)
Additional video meta tags
const (
	ColorimetryBT2020 string = C.GST_VIDEO_COLORIMETRY_BT2020
	ColorimetryBT202010 string = C.GST_VIDEO_COLORIMETRY_BT2020_10
	ColorimetryBT2100HLG string = C.GST_VIDEO_COLORIMETRY_BT2100_HLG
	ColorimetryBT2100PQ string = C.GST_VIDEO_COLORIMETRY_BT2100_PQ
	ColorimetryBT601 string = C.GST_VIDEO_COLORIMETRY_BT601
	ColorimetryBT709 string = C.GST_VIDEO_COLORIMETRY_BT709
	ColorimetrySMPTE240M string = C.GST_VIDEO_COLORIMETRY_SMPTE240M
	ColorimetrySRRGB string = C.GST_VIDEO_COLORIMETRY_SRGB
)
Pre-defined colorimetries
const CapsFeatureFormatInterlaced string = C.GST_CAPS_FEATURE_FORMAT_INTERLACED
CapsFeatureFormatInterlaced is the name of the caps feature indicating that the stream is interlaced.
Currently it is only used for video with 'interlace-mode=alternate' to ensure backwards compatibility for this new mode. In this mode each buffer carries a single field of interlaced video. BufferFlagTopField and BufferFlagBottomField indicate whether the buffer carries a top or bottom field. The order of buffers/fields in the stream and the timestamps on the buffers indicate the temporal order of the fields. Top and bottom fields are expected to alternate in this mode. The frame rate in the caps still signals the frame rate, so the notional field rate will be twice the frame rate from the caps.
Variables ¶
var TypeFormat = glib.Type(C.gst_video_format_get_type())
TypeFormat is the GType for a GstVideoFormat.
Functions ¶
func CalculateDisplayRatio ¶
func CalculateDisplayRatio(videoWidth, videoHeight, videoParNum, videoParDenom, displayParNum, displayParDenom uint) (darNum, darDenom uint, ok bool)
CalculateDisplayRatio will, given the Pixel Aspect Ratio and size of an input video frame, and the pixel aspect ratio of the intended display device, calculate the actual display ratio the video will be rendered with.
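A minimal usage sketch (not from the package's own examples), assuming the core bindings are imported as gst, this package as video, and fmt/time from the standard library. A 720x576 frame with 16:15 pixels shown on a square-pixel (1:1) display works out to a 4:3 display aspect ratio.
func ExampleCalculateDisplayRatio() {
	// A 720x576 frame with 16:15 pixels shown on a square-pixel (1:1) display.
	darNum, darDenom, ok := video.CalculateDisplayRatio(720, 576, 16, 15, 1, 1)
	if ok {
		fmt.Printf("display aspect ratio: %d:%d\n", darNum, darDenom) // 4:3
	}
}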
func ConvertSample ¶
func ConvertSample(sample *gst.Sample, toCaps *gst.Caps, timeout time.Duration) (*gst.Sample, error)
ConvertSample converts a raw video buffer into the specified output caps.
The output caps can be any raw video format or any image format (jpeg, png, ...).
The width, height and pixel-aspect-ratio can also be specified in the output caps.
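A sketch of taking a PNG snapshot from a raw video sample. The helper name, the caps string, and the timeout are illustrative choices, and it assumes gst.NewCapsFromString is available in your go-gst version.
// snapshotPNG converts a raw video sample (for example one pulled from an
// appsink) into a PNG-encoded sample. The caps string and the 5 second
// timeout are arbitrary choices for illustration.
func snapshotPNG(sample *gst.Sample) (*gst.Sample, error) {
	pngCaps := gst.NewCapsFromString("image/png, width=320, height=240")
	return video.ConvertSample(sample, pngCaps, 5*time.Second)
}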
func ConvertSampleAsync ¶
func ConvertSampleAsync(sample *gst.Sample, toCaps *gst.Caps, timeout time.Duration, cb ConvertSampleCallback)
ConvertSampleAsync converts a raw video buffer into the specified output caps.
The output caps can be any raw video format or any image format (jpeg, png, ...).
The width, height and pixel-aspect-ratio can also be specified in the output caps.
The callback will be called after conversion, when an error occurs, or if the conversion did not finish within the timeout.
func GuessFramerate ¶
func GuessFramerate(dur time.Duration) (destNum, destDenom int, ok bool)
GuessFramerate will, given the nominal duration of one video frame, check some standard framerates for a close match (within 0.1%) and return one if possible.
If no close match is found, it calculates an arbitrary framerate and returns false. It also returns false if a duration of 0 is passed.
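A small sketch, using a nominal frame duration of 40ms (an exact PAL frame period), which should match the standard 25/1 rate:
func ExampleGuessFramerate() {
	// 40ms is an exact PAL frame period, so this should match 25/1.
	num, denom, ok := video.GuessFramerate(40 * time.Millisecond)
	if ok {
		fmt.Printf("guessed framerate: %d/%d\n", num, denom)
	}
}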
func MakeRawCaps ¶
func MakeRawCaps(formats []Format) *gst.Caps
MakeRawCaps returns a generic raw video caps for formats defined in formats. If formats is empty or nil, returns a caps for all the supported raw video formats, see RawFormats.
func MakeRawCapsWithFeatures ¶
func MakeRawCapsWithFeatures(formats []Format, features *gst.CapsFeatures) *gst.Caps
MakeRawCapsWithFeatures returns a generic raw video caps for formats defined in formats with features. If formats is empty or nil, returns a caps for all the supported video formats, see RawFormats.
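A short sketch of building raw caps; printing the caps assumes the core bindings stringify *gst.Caps as usual.
func ExampleMakeRawCaps() {
	// Restrict to two common YUV formats.
	subset := video.MakeRawCaps([]video.Format{video.FormatI420, video.FormatNV12})
	// Passing nil (or an empty slice) selects every format in RawFormats().
	everything := video.MakeRawCaps(nil)
	fmt.Println(subset)
	fmt.Println(everything)
}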
Types ¶
type Alignment ¶
type Alignment struct {
	// extra pixels on the top
	PaddingTop uint
	// extra pixels on bottom
	PaddingBottom uint
	// extra pixels on the left
	PaddingLeft uint
	// extra pixels on the right
	PaddingRight uint
}
Alignment represents parameters for the memory of video buffers. This structure is usually used to configure the bufferpool if it supports the BufferPoolOptionVideoAlignment.
type ChromaFlags ¶
type ChromaFlags int
ChromaFlags are extra flags that influence the result from NewChromaResample.
const (
	ChromaFlagNone ChromaFlags = C.GST_VIDEO_CHROMA_FLAG_NONE // (0) – no flags
	ChromaFlagInterlaced ChromaFlags = C.GST_VIDEO_CHROMA_FLAG_INTERLACED // (1) – the input is interlaced
)
Type castings
type ChromaMethod ¶
type ChromaMethod int
ChromaMethod represents different subsampling and upsampling methods.
const (
	ChromaMethodNearest ChromaMethod = C.GST_VIDEO_CHROMA_METHOD_NEAREST // (0) – Duplicates the chroma samples when upsampling and drops when subsampling
	ChromaMethodLinear ChromaMethod = C.GST_VIDEO_CHROMA_METHOD_LINEAR // (1) – Uses linear interpolation to reconstruct missing chroma and averaging to subsample
)
Type castings
type ChromaResample ¶
type ChromaResample struct {
// contains filtered or unexported fields
}
ChromaResample is a utility object for resampling chroma planes and converting between different chroma sampling sitings.
func NewChromaResample ¶
func NewChromaResample(method ChromaMethod, site ChromaSite, flags ChromaFlags, format Format, hFactor, vFactor int) *ChromaResample
NewChromaResample creates a new resampler object for the given parameters. When hFactor or vFactor is greater than 0, upsampling will be used, otherwise subsampling is performed.
func (*ChromaResample) GetInfo ¶
func (c *ChromaResample) GetInfo() (nLines uint, offset int)
GetInfo returns the info about the resampler. The resampler must be fed nLines lines at a time. The first line should be at offset.
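A hedged sketch of constructing a resampler and asking how it wants to be fed; the negative factors request subsampling as described for NewChromaResample, and the exact values returned depend on the method and format.
func ExampleNewChromaResample() {
	// Negative factors request subsampling, per the NewChromaResample docs.
	resample := video.NewChromaResample(
		video.ChromaMethodLinear,
		video.ChromaSiteNone,
		video.ChromaFlagNone,
		video.FormatI420,
		-1, -1,
	)
	nLines, offset := resample.GetInfo()
	fmt.Printf("feed %d line(s) at a time, starting at offset %d\n", nLines, offset)
}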
type ChromaSite ¶
type ChromaSite int
ChromaSite represents various Chroma sitings.
const (
	ChromaSiteUnknown ChromaSite = C.GST_VIDEO_CHROMA_SITE_UNKNOWN // (0) – unknown cositing
	ChromaSiteNone ChromaSite = C.GST_VIDEO_CHROMA_SITE_NONE // (1) – no cositing
	ChromaSiteHCosited ChromaSite = C.GST_VIDEO_CHROMA_SITE_H_COSITED // (2) – chroma is horizontally cosited
	ChromaSiteVCosited ChromaSite = C.GST_VIDEO_CHROMA_SITE_V_COSITED // (4) – chroma is vertically cosited
	ChromaSiteAltLine ChromaSite = C.GST_VIDEO_CHROMA_SITE_ALT_LINE // (8) – chroma samples are sited on alternate lines
	ChromaSiteCosited ChromaSite = C.GST_VIDEO_CHROMA_SITE_COSITED // (6) – chroma samples cosited with luma samples
	ChromaSiteJpeg ChromaSite = C.GST_VIDEO_CHROMA_SITE_JPEG // (1) – jpeg style cositing, also for mpeg1 and mjpeg
	ChromaSiteMpeg2 ChromaSite = C.GST_VIDEO_CHROMA_SITE_MPEG2 // (2) – mpeg2 style cositing
	ChromaSiteDV ChromaSite = C.GST_VIDEO_CHROMA_SITE_DV // (14) – DV style cositing
)
Type castings
func (ChromaSite) String ¶
func (c ChromaSite) String() string
String implements a stringer on ChromaSite.
type ColorBalance ¶
type ColorBalance interface {
	// Get the ColorBalanceType of this implementation.
	GetBalanceType() ColorBalanceType
	// Retrieve the current value of the indicated channel, between MinValue and MaxValue.
	GetValue(*ColorBalanceChannel) int
	// Retrieve a list of the available channels.
	ListChannels() []*ColorBalanceChannel
	// Sets the current value of the channel to the passed value, which must be between MinValue
	// and MaxValue.
	SetValue(*ColorBalanceChannel, int)
}
ColorBalance is an interface implemented by elements which can perform some color balance operation on video frames they process. For example, modifying the brightness, contrast, hue or saturation.
Example elements are 'xvimagesink' and 'colorbalance'.
func ColorBalanceFromElement ¶
func ColorBalanceFromElement(element *gst.Element) ColorBalance
ColorBalanceFromElement checks if the given element implements the ColorBalance interface, and if so, returns a usable interface. This currently only supports elements created from the C runtime.
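A sketch of adjusting one channel on an element that implements ColorBalance. The helper name and the channel label are hypothetical; query ListChannels to see what your element actually exposes.
// setBalance sets the named color balance channel on elem, clamping the value
// to the channel's valid range. It returns false if elem does not implement
// ColorBalance or has no channel with that label.
func setBalance(elem *gst.Element, label string, value int) bool {
	balance := video.ColorBalanceFromElement(elem)
	if balance == nil {
		return false
	}
	for _, ch := range balance.ListChannels() {
		if ch.Label != label {
			continue
		}
		if value < ch.MinValue {
			value = ch.MinValue
		}
		if value > ch.MaxValue {
			value = ch.MaxValue
		}
		balance.SetValue(ch, value)
		return true
	}
	return false
}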
type ColorBalanceChannel ¶
type ColorBalanceChannel struct {
	// A string containing a descriptive name for this channel
	Label string
	// The minimum valid value for this channel.
	MinValue int
	// The maximum valid value for this channel.
	MaxValue int
}
ColorBalanceChannel represents parameters for modifying the color balance implemented by an element providing the GstColorBalance interface. For example, Hue or Saturation.
type ColorBalanceType ¶
type ColorBalanceType int
ColorBalanceType is an enumeration indicating whether an element implements color balancing operations in software or in dedicated hardware. In general, dedicated hardware implementations (such as those provided by xvimagesink) are preferred.
const (
	ColorBalanceHardware ColorBalanceType = C.GST_COLOR_BALANCE_HARDWARE // (0) – Color balance is implemented with dedicated hardware.
	ColorBalanceSoftware ColorBalanceType = C.GST_COLOR_BALANCE_SOFTWARE // (1) – Color balance is implemented via software processing.
)
Type castings
type ColorMatrix ¶
type ColorMatrix int
ColorMatrix is used to convert between Y'PbPr and non-linear RGB (R'G'B')
const ( ColorMatrixUnknown ColorMatrix = C.GST_VIDEO_COLOR_MATRIX_UNKNOWN // (0) – unknown matrix ColorMatrixRGB ColorMatrix = C.GST_VIDEO_COLOR_MATRIX_RGB // (1) – identity matrix. Order of coefficients is actually GBR, also IEC 61966-2-1 (sRGB) ColorMatrixFCC ColorMatrix = C.GST_VIDEO_COLOR_MATRIX_FCC // (2) – FCC Title 47 Code of Federal Regulations 73.682 (a)(20) ColorMatrixBT709 ColorMatrix = C.GST_VIDEO_COLOR_MATRIX_BT709 // (3) – ITU-R BT.709 color matrix, also ITU-R BT1361 / IEC 61966-2-4 xvYCC709 / SMPTE RP177 Annex B ColorMatrixBT601 ColorMatrix = C.GST_VIDEO_COLOR_MATRIX_BT601 // (4) – ITU-R BT.601 color matrix, also SMPTE170M / ITU-R BT1358 525 / ITU-R BT1700 NTSC ColorMatrixSMPTE240M ColorMatrix = C.GST_VIDEO_COLOR_MATRIX_SMPTE240M // (5) – SMPTE 240M color matrix ColorMatrixBT2020 ColorMatrix = C.GST_VIDEO_COLOR_MATRIX_BT2020 // (6) – ITU-R BT.2020 color matrix. Since: 1.6 )
Type castings
type ColorPrimaries ¶
type ColorPrimaries int
ColorPrimaries defines how to transform linear RGB values to and from the CIE XYZ colorspace.
const ( ColorPrimariesUnknown ColorPrimaries = C.GST_VIDEO_COLOR_PRIMARIES_UNKNOWN // (0) – unknown color primaries ColorPrimariesBT709 ColorPrimaries = C.GST_VIDEO_COLOR_PRIMARIES_BT709 // (1) – BT709 primaries, also ITU-R BT1361 / IEC 61966-2-4 / SMPTE RP177 Annex B ColorPrimariesBT470M ColorPrimaries = C.GST_VIDEO_COLOR_PRIMARIES_BT470M // (2) – BT470M primaries, also FCC Title 47 Code of Federal Regulations 73.682 (a)(20) ColorPrimariesBT470BG ColorPrimaries = C.GST_VIDEO_COLOR_PRIMARIES_BT470BG // (3) – BT470BG primaries, also ITU-R BT601-6 625 / ITU-R BT1358 625 / ITU-R BT1700 625 PAL & SECAM ColorPrimariesSMPTE170M ColorPrimaries = C.GST_VIDEO_COLOR_PRIMARIES_SMPTE170M // (4) – SMPTE170M primaries, also ITU-R BT601-6 525 / ITU-R BT1358 525 / ITU-R BT1700 NTSC ColorPrimariesSMPTE240M ColorPrimaries = C.GST_VIDEO_COLOR_PRIMARIES_SMPTE240M // (5) – SMPTE240M primaries ColorPrimariesFilm ColorPrimaries = C.GST_VIDEO_COLOR_PRIMARIES_FILM // (6) – Generic film (colour filters using Illuminant C) ColorPrimariesBT2020 ColorPrimaries = C.GST_VIDEO_COLOR_PRIMARIES_BT2020 // (7) – ITU-R BT2020 primaries. Since: 1.6 ColorPrimariesAdobeRGB ColorPrimaries = C.GST_VIDEO_COLOR_PRIMARIES_ADOBERGB // (8) – Adobe RGB primaries. Since: 1.8 ColorPrimariesSMPTEST428 ColorPrimaries = C.GST_VIDEO_COLOR_PRIMARIES_SMPTEST428 // (9) – SMPTE ST 428 primaries (CIE 1931 XYZ). Since: 1.16 ColorPrimariesSMPTERP431 ColorPrimaries = C.GST_VIDEO_COLOR_PRIMARIES_SMPTERP431 // (10) – SMPTE RP 431 primaries (ST 431-2 (2011) / DCI P3). Since: 1.16 ColorPrimariesSMPTEEG432 ColorPrimaries = C.GST_VIDEO_COLOR_PRIMARIES_SMPTEEG432 // (11) – SMPTE EG 432 primaries (ST 432-1 (2010) / P3 D65). Since: 1.16 ColorPrimariesEBU3213 ColorPrimaries = C.GST_VIDEO_COLOR_PRIMARIES_EBU3213 // (12) – EBU 3213 primaries (JEDEC P22 phosphors). Since: 1.16 )
Type castings
type ColorPrimariesInfo ¶
type ColorPrimariesInfo struct {
	Primaries ColorPrimaries
	Wx, Wy float64 // Reference white coordinates
	Rx, Ry float64 // Red coordinates
	Gx, Gy float64 // Green coordinates
	Bx, By float64 // Blue coordinates
}
ColorPrimariesInfo is a structure describing the chromaticity coordinates of an RGB system. These values can be used to construct a matrix to transform RGB to and from the XYZ colorspace.
type ColorRange ¶
type ColorRange int
ColorRange represents possible color range values. These constants are defined for 8 bit color values and can be scaled for other bit depths.
const (
	ColorRangeUnknown ColorRange = C.GST_VIDEO_COLOR_RANGE_UNKNOWN // (0) – unknown range
	ColorRange0255 ColorRange = C.GST_VIDEO_COLOR_RANGE_0_255 // (1) – [0..255] for 8 bit components
	ColorRange16235 ColorRange = C.GST_VIDEO_COLOR_RANGE_16_235 // (2) – [16..235] for 8 bit components. Chroma has [16..240] range.
)
Type castings
type Colorimetry ¶
type Colorimetry struct {
	// The color range. This is the valid range for the samples. It is used to convert the samples to Y'PbPr values.
	Range ColorRange
	// The color matrix. Used to convert between Y'PbPr and non-linear RGB (R'G'B').
	Matrix ColorMatrix
	// The transfer function. Used to convert between R'G'B' and RGB.
	Transfer TransferFunction
	// Color primaries. Used to convert between R'G'B' and CIE XYZ.
	Primaries ColorPrimaries
}
Colorimetry is a structure describing the color info.
type ConvertSampleCallback ¶
ConvertSampleCallback represents a callback from a video convert operation. It contains the converted sample or any error that occurred.
type CropMetaInfo ¶
type CropMetaInfo struct {
// contains filtered or unexported fields
}
CropMetaInfo contains extra buffer metadata describing image cropping.
func GetCropMetaInfo ¶
func GetCropMetaInfo() *CropMetaInfo
GetCropMetaInfo returns the default CropMetaInfo.
func (*CropMetaInfo) Height ¶
func (c *CropMetaInfo) Height() uint
Height returns the cropped height.
func (*CropMetaInfo) Instance ¶
func (c *CropMetaInfo) Instance() *C.GstVideoCropMeta
Instance returns the underlying C GstVideoCropMeta instance.
func (*CropMetaInfo) Meta ¶
func (c *CropMetaInfo) Meta() *gst.Meta
Meta returns the parent Meta instance.
type FieldOrder ¶
type FieldOrder int
FieldOrder is the field order of interlaced content. This is only valid for interlace-mode=interleaved and not interlace-mode=mixed. In the case of mixed or FieldOrderUnknown, the field order is signalled via buffer flags.
const (
	FieldOrderUnknown FieldOrder = C.GST_VIDEO_FIELD_ORDER_UNKNOWN // (0) – unknown field order for interlaced content. The actual field order is signalled via buffer flags.
	FieldOrderTopFieldFirst FieldOrder = C.GST_VIDEO_FIELD_ORDER_TOP_FIELD_FIRST // (1) – top field is first
	FieldOrderBottomFieldFirst FieldOrder = C.GST_VIDEO_FIELD_ORDER_BOTTOM_FIELD_FIRST // (2) – bottom field is first
)
Type castings
func (FieldOrder) String ¶
func (f FieldOrder) String() string
String implements a stringer on FieldOrder
type Flags ¶
type Flags int
Flags represents extra video flags
const (
	FlagNone Flags = C.GST_VIDEO_FLAG_NONE // (0) – no flags
	FlagVariableFPS Flags = C.GST_VIDEO_FLAG_VARIABLE_FPS // (1) – a variable fps is selected, fps_n and fps_d denote the maximum fps of the video
	FlagPremultipliedAlpha Flags = C.GST_VIDEO_FLAG_PREMULTIPLIED_ALPHA // (2) – Each color has been scaled by the alpha value.
)
Type castings
type Format ¶
type Format int
Format is an enum value describing the most common video formats.
const ( FormatUnknown Format = C.GST_VIDEO_FORMAT_UNKNOWN // (0) – Unknown or unset video format id FormatEncoded Format = C.GST_VIDEO_FORMAT_ENCODED // (1) – Encoded video format. Only ever use that in caps for special video formats in combination with non-system memory GstCapsFeatures where it does not make sense to specify a real video format. FormatI420 Format = C.GST_VIDEO_FORMAT_I420 // (2) – planar 4:2:0 YUV FormatYV12 Format = C.GST_VIDEO_FORMAT_YV12 // (3) – planar 4:2:0 YVU (like I420 but UV planes swapped) FormatYUY2 Format = C.GST_VIDEO_FORMAT_YUY2 // (4) – packed 4:2:2 YUV (Y0-U0-Y1-V0 Y2-U2-Y3-V2 Y4 ...) FormatUYVY Format = C.GST_VIDEO_FORMAT_UYVY // (5) – packed 4:2:2 YUV (U0-Y0-V0-Y1 U2-Y2-V2-Y3 U4 ...) FormatAYUV Format = C.GST_VIDEO_FORMAT_AYUV // (6) – packed 4:4:4 YUV with alpha channel (A0-Y0-U0-V0 ...) FormatRGBx Format = C.GST_VIDEO_FORMAT_RGBx // (7) – sparse rgb packed into 32 bit, space last FormatBGRx Format = C.GST_VIDEO_FORMAT_BGRx // (8) – sparse reverse rgb packed into 32 bit, space last FormatxRGB Format = C.GST_VIDEO_FORMAT_xRGB // (9) – sparse rgb packed into 32 bit, space first FormatxBGR Format = C.GST_VIDEO_FORMAT_xBGR // (10) – sparse reverse rgb packed into 32 bit, space first FormatRGBA Format = C.GST_VIDEO_FORMAT_RGBA // (11) – rgb with alpha channel last FormatBGRA Format = C.GST_VIDEO_FORMAT_BGRA // (12) – reverse rgb with alpha channel last FormatARGB Format = C.GST_VIDEO_FORMAT_ARGB // (13) – rgb with alpha channel first FormatABGR Format = C.GST_VIDEO_FORMAT_ABGR // (14) – reverse rgb with alpha channel first FormatRGB Format = C.GST_VIDEO_FORMAT_RGB // (15) – RGB packed into 24 bits without padding (R-G-B-R-G-B) FormatBGR Format = C.GST_VIDEO_FORMAT_BGR // (16) – reverse RGB packed into 24 bits without padding (B-G-R-B-G-R) FormatY41B Format = C.GST_VIDEO_FORMAT_Y41B // (17) – planar 4:1:1 YUV FormatY42B Format = C.GST_VIDEO_FORMAT_Y42B // (18) – planar 4:2:2 YUV FormatYVYU Format = C.GST_VIDEO_FORMAT_YVYU // (19) – packed 4:2:2 YUV (Y0-V0-Y1-U0 Y2-V2-Y3-U2 Y4 ...) FormatY444 Format = C.GST_VIDEO_FORMAT_Y444 // (20) – planar 4:4:4 YUV Formatv210 Format = C.GST_VIDEO_FORMAT_v210 // (21) – packed 4:2:2 10-bit YUV, complex format Formatv216 Format = C.GST_VIDEO_FORMAT_v216 // (22) – packed 4:2:2 16-bit YUV, Y0-U0-Y1-V1 order FormatNV12 Format = C.GST_VIDEO_FORMAT_NV12 // (23) – planar 4:2:0 YUV with interleaved UV plane FormatNV21 Format = C.GST_VIDEO_FORMAT_NV21 // (24) – planar 4:2:0 YUV with interleaved VU plane FormatGray8 Format = C.GST_VIDEO_FORMAT_GRAY8 // (25) – 8-bit grayscale FormatGray16BE Format = C.GST_VIDEO_FORMAT_GRAY16_BE // (26) – 16-bit grayscale, most significant byte first FormatGray16LE Format = C.GST_VIDEO_FORMAT_GRAY16_LE // (27) – 16-bit grayscale, least significant byte first Formatv308 Format = C.GST_VIDEO_FORMAT_v308 // (28) – packed 4:4:4 YUV (Y-U-V ...) FormatRGB16 Format = C.GST_VIDEO_FORMAT_RGB16 // (29) – rgb 5-6-5 bits per component FormatBGR16 Format = C.GST_VIDEO_FORMAT_BGR16 // (30) – reverse rgb 5-6-5 bits per component FormatRGB15 Format = C.GST_VIDEO_FORMAT_RGB15 // (31) – rgb 5-5-5 bits per component FormatBGR15 Format = C.GST_VIDEO_FORMAT_BGR15 // (32) – reverse rgb 5-5-5 bits per component FormatUYVP Format = C.GST_VIDEO_FORMAT_UYVP // (33) – packed 10-bit 4:2:2 YUV (U0-Y0-V0-Y1 U2-Y2-V2-Y3 U4 ...) 
FormatA420 Format = C.GST_VIDEO_FORMAT_A420 // (34) – planar 4:4:2:0 AYUV FormatRGB8P Format = C.GST_VIDEO_FORMAT_RGB8P // (35) – 8-bit paletted RGB FormatYUV9 Format = C.GST_VIDEO_FORMAT_YUV9 // (36) – planar 4:1:0 YUV FormatYVU9 Format = C.GST_VIDEO_FORMAT_YVU9 // (37) – planar 4:1:0 YUV (like YUV9 but UV planes swapped) FormatIYU1 Format = C.GST_VIDEO_FORMAT_IYU1 // (38) – packed 4:1:1 YUV (Cb-Y0-Y1-Cr-Y2-Y3 ...) FormatARGB64 Format = C.GST_VIDEO_FORMAT_ARGB64 // (39) – rgb with alpha channel first, 16 bits per channel FormatAYUV64 Format = C.GST_VIDEO_FORMAT_AYUV64 // (40) – packed 4:4:4 YUV with alpha channel, 16 bits per channel (A0-Y0-U0-V0 ...) Formatr210 Format = C.GST_VIDEO_FORMAT_r210 // (41) – packed 4:4:4 RGB, 10 bits per channel FormatI42010BE Format = C.GST_VIDEO_FORMAT_I420_10BE // (42) – planar 4:2:0 YUV, 10 bits per channel FormatI42010LE Format = C.GST_VIDEO_FORMAT_I420_10LE // (43) – planar 4:2:0 YUV, 10 bits per channel FormatI42210BE Format = C.GST_VIDEO_FORMAT_I422_10BE // (44) – planar 4:2:2 YUV, 10 bits per channel FormatI42210LE Format = C.GST_VIDEO_FORMAT_I422_10LE // (45) – planar 4:2:2 YUV, 10 bits per channel FormatY44410BE Format = C.GST_VIDEO_FORMAT_Y444_10BE // (46) – planar 4:4:4 YUV, 10 bits per channel (Since: 1.2) FormatY44410LE Format = C.GST_VIDEO_FORMAT_Y444_10LE // (47) – planar 4:4:4 YUV, 10 bits per channel (Since: 1.2) FormatGBR Format = C.GST_VIDEO_FORMAT_GBR // (48) – planar 4:4:4 RGB, 8 bits per channel (Since: 1.2) FormatGBR10BE Format = C.GST_VIDEO_FORMAT_GBR_10BE // (49) – planar 4:4:4 RGB, 10 bits per channel (Since: 1.2) FormatGBR10LE Format = C.GST_VIDEO_FORMAT_GBR_10LE // (50) – planar 4:4:4 RGB, 10 bits per channel (Since: 1.2) FormatNV16 Format = C.GST_VIDEO_FORMAT_NV16 // (51) – planar 4:2:2 YUV with interleaved UV plane (Since: 1.2) FormatNV24 Format = C.GST_VIDEO_FORMAT_NV24 // (52) – planar 4:4:4 YUV with interleaved UV plane (Since: 1.2) FormatNV1264Z32 Format = C.GST_VIDEO_FORMAT_NV12_64Z32 // (53) – NV12 with 64x32 tiling in zigzag pattern (Since: 1.4) FormatA42010BE Format = C.GST_VIDEO_FORMAT_A420_10BE // (54) – planar 4:4:2:0 YUV, 10 bits per channel (Since: 1.6) FormatA42010LE Format = C.GST_VIDEO_FORMAT_A420_10LE // (55) – planar 4:4:2:0 YUV, 10 bits per channel (Since: 1.6) FormatA42210BE Format = C.GST_VIDEO_FORMAT_A422_10BE // (56) – planar 4:4:2:2 YUV, 10 bits per channel (Since: 1.6) FormatA42210LE Format = C.GST_VIDEO_FORMAT_A422_10LE // (57) – planar 4:4:2:2 YUV, 10 bits per channel (Since: 1.6) FormatA44410BE Format = C.GST_VIDEO_FORMAT_A444_10BE // (58) – planar 4:4:4:4 YUV, 10 bits per channel (Since: 1.6) FormatA44410LE Format = C.GST_VIDEO_FORMAT_A444_10LE // (59) – planar 4:4:4:4 YUV, 10 bits per channel (Since: 1.6) FormatNV61 Format = C.GST_VIDEO_FORMAT_NV61 // (60) – planar 4:2:2 YUV with interleaved VU plane (Since: 1.6) FormatP01010BE Format = C.GST_VIDEO_FORMAT_P010_10BE // (61) – planar 4:2:0 YUV with interleaved UV plane, 10 bits per channel (Since: 1.10) FormatP01010LE Format = C.GST_VIDEO_FORMAT_P010_10LE // (62) – planar 4:2:0 YUV with interleaved UV plane, 10 bits per channel (Since: 1.10) FormatIYU2 Format = C.GST_VIDEO_FORMAT_IYU2 // (63) – packed 4:4:4 YUV (U-Y-V ...) (Since: 1.10) FormatVYUY Format = C.GST_VIDEO_FORMAT_VYUY // (64) – packed 4:2:2 YUV (V0-Y0-U0-Y1 V2-Y2-U2-Y3 V4 ...) 
FormatGBRA Format = C.GST_VIDEO_FORMAT_GBRA // (65) – planar 4:4:4:4 ARGB, 8 bits per channel (Since: 1.12) FormatGBRA10BE Format = C.GST_VIDEO_FORMAT_GBRA_10BE // (66) – planar 4:4:4:4 ARGB, 10 bits per channel (Since: 1.12) FormatGBRA10LE Format = C.GST_VIDEO_FORMAT_GBRA_10LE // (67) – planar 4:4:4:4 ARGB, 10 bits per channel (Since: 1.12) FormatGBR12BE Format = C.GST_VIDEO_FORMAT_GBR_12BE // (68) – planar 4:4:4 RGB, 12 bits per channel (Since: 1.12) FormatGBR12LE Format = C.GST_VIDEO_FORMAT_GBR_12LE // (69) – planar 4:4:4 RGB, 12 bits per channel (Since: 1.12) FormatGBRA12BE Format = C.GST_VIDEO_FORMAT_GBRA_12BE // (70) – planar 4:4:4:4 ARGB, 12 bits per channel (Since: 1.12) FormatGBRA12LE Format = C.GST_VIDEO_FORMAT_GBRA_12LE // (71) – planar 4:4:4:4 ARGB, 12 bits per channel (Since: 1.12) FormatI42012BE Format = C.GST_VIDEO_FORMAT_I420_12BE // (72) – planar 4:2:0 YUV, 12 bits per channel (Since: 1.12) FormatI42012LE Format = C.GST_VIDEO_FORMAT_I420_12LE // (73) – planar 4:2:0 YUV, 12 bits per channel (Since: 1.12) FormatI42212BE Format = C.GST_VIDEO_FORMAT_I422_12BE // (74) – planar 4:2:2 YUV, 12 bits per channel (Since: 1.12) FormatI42212LE Format = C.GST_VIDEO_FORMAT_I422_12LE // (75) – planar 4:2:2 YUV, 12 bits per channel (Since: 1.12) FormatY44412BE Format = C.GST_VIDEO_FORMAT_Y444_12BE // (76) – planar 4:4:4 YUV, 12 bits per channel (Since: 1.12) FormatY44412LE Format = C.GST_VIDEO_FORMAT_Y444_12LE // (77) – planar 4:4:4 YUV, 12 bits per channel (Since: 1.12) FormatGray10LE32 Format = C.GST_VIDEO_FORMAT_GRAY10_LE32 // (78) – 10-bit grayscale, packed into 32bit words (2 bits padding) (Since: 1.14) FormatNV1210LE32 Format = C.GST_VIDEO_FORMAT_NV12_10LE32 // (79) – 10-bit variant of GST_VIDEO_FORMAT_NV12, packed into 32bit words (MSB 2 bits padding) (Since: 1.14) FormatNV1610LE32 Format = C.GST_VIDEO_FORMAT_NV16_10LE32 // (80) – 10-bit variant of GST_VIDEO_FORMAT_NV16, packed into 32bit words (MSB 2 bits padding) (Since: 1.14) FormatNV1210LE40 Format = C.GST_VIDEO_FORMAT_NV12_10LE40 // (81) – Fully packed variant of NV12_10LE32 (Since: 1.16) FormatY210 Format = C.GST_VIDEO_FORMAT_Y210 // (82) – packed 4:2:2 YUV, 10 bits per channel (Since: 1.16) FormatY410 Format = C.GST_VIDEO_FORMAT_Y410 // (83) – packed 4:4:4 YUV, 10 bits per channel(A-V-Y-U...) (Since: 1.16) FormatVUYA Format = C.GST_VIDEO_FORMAT_VUYA // (84) – packed 4:4:4 YUV with alpha channel (V0-U0-Y0-A0...) 
(Since: 1.16) FormatBGR10A2LE Format = C.GST_VIDEO_FORMAT_BGR10A2_LE // (85) – packed 4:4:4 RGB with alpha channel(B-G-R-A), 10 bits for R/G/B channel and MSB 2 bits for alpha channel (Since: 1.16) FormatRGB10A2LE Format = C.GST_VIDEO_FORMAT_RGB10A2_LE // (86) – packed 4:4:4 RGB with alpha channel(R-G-B-A), 10 bits for R/G/B channel and MSB 2 bits for alpha channel (Since: 1.18) FormatY44416BE Format = C.GST_VIDEO_FORMAT_Y444_16BE // (87) – planar 4:4:4 YUV, 16 bits per channel (Since: 1.18) FormatY44416LE Format = C.GST_VIDEO_FORMAT_Y444_16LE // (88) – planar 4:4:4 YUV, 16 bits per channel (Since: 1.18) FormatP016BE Format = C.GST_VIDEO_FORMAT_P016_BE // (89) – planar 4:2:0 YUV with interleaved UV plane, 16 bits per channel (Since: 1.18) FormatP016LE Format = C.GST_VIDEO_FORMAT_P016_LE // (90) – planar 4:2:0 YUV with interleaved UV plane, 16 bits per channel (Since: 1.18) FormatP012BE Format = C.GST_VIDEO_FORMAT_P012_BE // (91) – planar 4:2:0 YUV with interleaved UV plane, 12 bits per channel (Since: 1.18) FormatP012LE Format = C.GST_VIDEO_FORMAT_P012_LE // (92) – planar 4:2:0 YUV with interleaved UV plane, 12 bits per channel (Since: 1.18) FormatY212BE Format = C.GST_VIDEO_FORMAT_Y212_BE // (93) – packed 4:2:2 YUV, 12 bits per channel (Y-U-Y-V) (Since: 1.18) FormatY212LE Format = C.GST_VIDEO_FORMAT_Y212_LE // (94) – packed 4:2:2 YUV, 12 bits per channel (Y-U-Y-V) (Since: 1.18) FormatY412BE Format = C.GST_VIDEO_FORMAT_Y412_BE // (95) – packed 4:4:4:4 YUV, 12 bits per channel(U-Y-V-A...) (Since: 1.18) FormatY412LE Format = C.GST_VIDEO_FORMAT_Y412_LE // (96) – packed 4:4:4:4 YUV, 12 bits per channel(U-Y-V-A...) (Since: 1.18) FormatNV124L4 Format = C.GST_VIDEO_FORMAT_NV12_4L4 // (97) – NV12 with 4x4 tiles in linear order. FormatNV1232L32 Format = C.GST_VIDEO_FORMAT_NV12_32L32 // (98) – NV12 with 32x32 tiles in linear order. )
Type castings
func AllFormats ¶
func AllFormats() []Format
AllFormats is a convenience function for retrieving all formats for inspection purposes. It is intended more for debugging than for use in an application.
func RawFormats ¶
func RawFormats() []Format
RawFormats returns a slice of all the raw video formats supported by GStreamer.
func (Format) FOURCC ¶
FOURCC converts this format value into the corresponding FOURCC. Only a few YUV formats have corresponding FOURCC values. If format has no corresponding FOURCC value, 0 is returned.
func (Format) Info ¶
func (f Format) Info() *FormatInfo
Info returns the FormatInfo for this video format.
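A small sketch that walks the supported raw formats and prints a few FormatInfo properties, using only the accessors documented above:
func ExampleFormat_Info() {
	for _, f := range video.RawFormats() {
		info := f.Info()
		fmt.Printf("%-12s planes=%d components=%d alpha=%v yuv=%v rgb=%v\n",
			info.Name(), info.NumPlanes(), info.NumComponents(),
			info.HasAlpha(), info.IsYUV(), info.IsRGB())
	}
}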
type FormatFlags ¶
type FormatFlags int
FormatFlags are different video flags that a format info can have.
const ( FormatFlagYUV FormatFlags = C.GST_VIDEO_FORMAT_FLAG_YUV // (1) – The video format is YUV, components are numbered 0=Y, 1=U, 2=V. FormatFlagRGB FormatFlags = C.GST_VIDEO_FORMAT_FLAG_RGB // (2) – The video format is RGB, components are numbered 0=R, 1=G, 2=B. FormatFlagGray FormatFlags = C.GST_VIDEO_FORMAT_FLAG_GRAY // (4) – The video is gray, there is one gray component with index 0. FormatFlagAlpha FormatFlags = C.GST_VIDEO_FORMAT_FLAG_ALPHA // (8) – The video format has an alpha components with the number 3. FormatFlagLE FormatFlags = C.GST_VIDEO_FORMAT_FLAG_LE // (16) – The video format has data stored in little endianness. FormatFlagPalette FormatFlags = C.GST_VIDEO_FORMAT_FLAG_PALETTE // (32) – The video format has a palette. The palette is stored in the second plane and indexes are stored in the first plane. FormatFlagComplex FormatFlags = C.GST_VIDEO_FORMAT_FLAG_COMPLEX // (64) – The video format has a complex layout that can't be described with the usual information in the GstVideoFormatInfo. FormatFlagUnpack FormatFlags = C.GST_VIDEO_FORMAT_FLAG_UNPACK // (128) – This format can be used in a GstVideoFormatUnpack and GstVideoFormatPack function. FormatFlagTiled FormatFlags = C.GST_VIDEO_FORMAT_FLAG_TILED // (256) – The format is tiled, there is tiling information in the last plane. )
Type castings
type FormatInfo ¶
type FormatInfo struct {
// contains filtered or unexported fields
}
FormatInfo contains information for a video format.
func (*FormatInfo) Bits ¶
func (f *FormatInfo) Bits() uint
Bits returns the number of bits used to pack data items. This can be less than 8 when multiple pixels are stored in a byte. For values > 8, multiple bytes should be read according to the endianness flag before applying the shift and mask.
func (*FormatInfo) ComponentDepth ¶
func (f *FormatInfo) ComponentDepth(component uint) uint
ComponentDepth returns the depth in bits for the given component.
func (*FormatInfo) ComponentHSub ¶
func (f *FormatInfo) ComponentHSub(component uint) uint
ComponentHSub returns the subsampling factor of the height for the component.
func (*FormatInfo) ComponentWSub ¶
func (f *FormatInfo) ComponentWSub(n uint) uint
ComponentWSub returns the subsampling factor of the width for the component.
func (*FormatInfo) Flags ¶
func (f *FormatInfo) Flags() FormatFlags
Flags returns the flags on this info.
func (*FormatInfo) Format ¶
func (f *FormatInfo) Format() Format
Format returns the format for this info.
func (*FormatInfo) HasAlpha ¶
func (f *FormatInfo) HasAlpha() bool
HasAlpha returns true if the alpha flag is set.
func (*FormatInfo) HasPalette ¶
func (f *FormatInfo) HasPalette() bool
HasPalette returns true if this info has a palette.
func (*FormatInfo) IsComplex ¶
func (f *FormatInfo) IsComplex() bool
IsComplex returns true if the complex flag is set.
func (*FormatInfo) IsGray ¶
func (f *FormatInfo) IsGray() bool
IsGray returns true if the gray flag is set.
func (*FormatInfo) IsRGB ¶
func (f *FormatInfo) IsRGB() bool
IsRGB returns true if the RGB flag is set.
func (*FormatInfo) IsTiled ¶
func (f *FormatInfo) IsTiled() bool
IsTiled returns true if the tiled flag is set.
func (*FormatInfo) IsYUV ¶
func (f *FormatInfo) IsYUV() bool
IsYUV returns true if the YUV flag is set.
func (*FormatInfo) Name ¶
func (f *FormatInfo) Name() string
Name returns a human readable name for this info.
func (*FormatInfo) NumComponents ¶
func (f *FormatInfo) NumComponents() uint
NumComponents returns the number of components in this info.
func (*FormatInfo) NumPlanes ¶
func (f *FormatInfo) NumPlanes() uint
NumPlanes returns the number of planes in this info.
func (*FormatInfo) Plane ¶
func (f *FormatInfo) Plane(n uint) uint
Plane returns the given plane index.
func (*FormatInfo) PlaneOffset ¶
func (f *FormatInfo) PlaneOffset(n uint) uint
PlaneOffset returns the offset for the given plane.
func (*FormatInfo) PlaneStride ¶
func (f *FormatInfo) PlaneStride(n uint) uint
PlaneStride returns the stride for the given plane.
func (*FormatInfo) TileHS ¶
func (f *FormatInfo) TileHS() uint
TileHS returns the height of a tile, in bytes, represented as a shift.
func (*FormatInfo) TileMode ¶
func (f *FormatInfo) TileMode() TileMode
TileMode returns the tiling mode.
func (*FormatInfo) TileWS ¶
func (f *FormatInfo) TileWS() uint
TileWS returns the width of a tile, in bytes, represented as a shift.
type Info ¶
type Info struct {
// contains filtered or unexported fields
}
Info describes image properties. This information can be filled in from GstCaps with InfoFromCaps. The information is also used to store the specific video info when mapping a video frame with FrameMap.
func NewInfo ¶
func NewInfo() *Info
NewInfo returns a new Info instance. You can populate it by chaining builders to this constructor.
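A sketch of the builder chain. It assumes gst.Fraction constructs a *gst.FractionValue in your go-gst version; check the core package if in doubt.
func ExampleNewInfo() {
	info := video.NewInfo().
		WithFormat(video.FormatI420, 1920, 1080).
		WithFPS(gst.Fraction(30, 1)) // assumption: gst.Fraction builds a *gst.FractionValue
	caps := info.ToCaps()
	fmt.Println(caps)                          // a video/x-raw caps describing the info
	fmt.Println(info.Size(), info.NumPlanes()) // frame size in bytes and plane count
}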
func (*Info) ChromaSite ¶
func (i *Info) ChromaSite() ChromaSite
ChromaSite returns the ChromaSite for this info.
func (*Info) Colorimetry ¶
func (i *Info) Colorimetry() *Colorimetry
Colorimetry returns the colorimetry for this info.
func (*Info) Convert ¶
func (i *Info) Convert(srcFormat, destFormat gst.Format, srcValue int64) (out int64, ok bool)
Convert converts among various gst.Format types. This function handles gst.FormatBytes, gst.FormatTime, and gst.FormatDefault. For raw video, gst.FormatDefault corresponds to video frames. This function can be used to handle pad queries of the type gst.QueryTypeConvert.
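A sketch converting a frame count into a byte count for an Info; gst.FormatDefault and gst.FormatBytes are the core library's format constants.
func ExampleInfo_Convert() {
	// An Info describing 1080p I420 video; WithFormat fills in size and strides.
	info := video.NewInfo().WithFormat(video.FormatI420, 1920, 1080)

	// For raw video, gst.FormatDefault counts frames; converting frames to
	// bytes only needs the frame size from the Info.
	const frames int64 = 30
	if bytes, ok := info.Convert(gst.FormatDefault, gst.FormatBytes, frames); ok {
		fmt.Printf("%d frames occupy %d bytes\n", frames, bytes)
	}
	// Converting frames to gst.FormatTime additionally requires the frame
	// rate to be set on the Info (e.g. via WithFPS).
}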
func (*Info) FPS ¶
func (i *Info) FPS() *gst.FractionValue
FPS returns the frames-per-second value for the info.
func (*Info) FieldHeight ¶
func (i *Info) FieldHeight() int
FieldHeight returns the field height for this info.
func (*Info) FieldOrder ¶
func (i *Info) FieldOrder() FieldOrder
FieldOrder returns the field order for this info.
func (*Info) FieldRateN ¶
func (i *Info) FieldRateN() int
FieldRateN returns the rate numerator depending on the interlace mode.
func (*Info) FlagSet ¶
func (i *Info) FlagSet(f Flags) *Info
FlagSet sets the given flag(s) on the info. The underlying info is returned for chaining builders.
func (*Info) FlagUnset ¶
func (i *Info) FlagUnset(f Flags) *Info
FlagUnset unsets the given flag(s) on the info. The underlying info is returned for chaining builders.
func (*Info) Format ¶
func (i *Info) Format() Format
Format returns the format for the info. You can call Info() on the return value to inspect the properties further.
func (*Info) InterlaceMode ¶
func (i *Info) InterlaceMode() InterlaceMode
InterlaceMode returns the interlace mode of this Info.
func (*Info) IsInterlaced ¶
func (i *Info) IsInterlaced() bool
IsInterlaced returns true if the interlace mode is not Progressive.
func (*Info) MultiviewFlags ¶
func (i *Info) MultiviewFlags() MultiviewFlags
MultiviewFlags returns the MultiviewFlags on the info.
func (*Info) MultiviewMode ¶
func (i *Info) MultiviewMode() MultiviewMode
MultiviewMode returns the MultiviewMode on the info.
func (*Info) NumComponents ¶
func (i *Info) NumComponents() uint
NumComponents returns the number of components in the info.
func (*Info) PAR ¶
func (i *Info) PAR() *gst.FractionValue
PAR returns the pixel-aspect-ratio value for the info.
func (*Info) WithAlign ¶
func (i *Info) WithAlign(align *Alignment) *Info
WithAlign adjusts the offset and stride fields in info so that the padding and stride alignment in align is respected.
Extra padding will be added to the right side when stride alignment padding is required and align will be updated with the new padding values.
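A sketch of applying alignment, with arbitrary padding values chosen for illustration:
func ExampleInfo_WithAlign() {
	// Arbitrary padding values; WithAlign may update the Alignment with the
	// padding that was actually applied.
	align := &video.Alignment{
		PaddingRight:  16,
		PaddingBottom: 8,
	}
	info := video.NewInfo().
		WithFormat(video.FormatNV12, 1280, 720).
		WithAlign(align)
	fmt.Println(info.Size()) // the size now accounts for the requested padding
}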
func (*Info) WithFPS ¶
func (i *Info) WithFPS(f *gst.FractionValue) *Info
WithFPS sets the FPS on this info.
func (*Info) WithFormat ¶
func (i *Info) WithFormat(format Format, width, height uint) *Info
WithFormat sets the format on this info.
Note: This initializes info first, no values are preserved. This function does not set the offsets correctly for interlaced vertically subsampled formats. If the format is invalid (e.g. because the size of a frame can't be represented as a 32 bit integer), nothing will happen. This is for convenience in chaining, but may be changed in the future.
func (*Info) WithInterlacedFormat ¶
func (i *Info) WithInterlacedFormat(format Format, interlaceMode InterlaceMode, width, height uint) *Info
WithInterlacedFormat is the same as WithFormat but also allows to set the interlaced mode.
type InterlaceMode ¶
type InterlaceMode int
InterlaceMode enumerates the possible values describing the interlace mode of the stream.
const ( InterlaceModeProgressive InterlaceMode = C.GST_VIDEO_INTERLACE_MODE_PROGRESSIVE // (0) – all frames are progressive InterlaceModeInterleaved InterlaceMode = C.GST_VIDEO_INTERLACE_MODE_INTERLEAVED // (1) – 2 fields are interleaved in one video frame. Extra buffer flags describe the field order. InterlaceModeMixed InterlaceMode = C.GST_VIDEO_INTERLACE_MODE_MIXED // (2) – frames contains both interlaced and progressive video, the buffer flags describe the frame and fields. InterlaceModeFields InterlaceMode = C.GST_VIDEO_INTERLACE_MODE_FIELDS // (3) – 2 fields are stored in one buffer, use the frame ID to get access to the required field. For multiview (the 'views' property > 1) the fields of view N can be found at frame ID (N * 2) and (N * 2) + 1. Each field has only half the amount of lines as noted in the height property. This mode requires multiple GstVideoMeta metadata to describe the fields. InterlaceModeAlternate InterlaceMode = C.GST_VIDEO_INTERLACE_MODE_ALTERNATE // (4) – 1 field is stored in one buffer, GST_VIDEO_BUFFER_FLAG_TF or GST_VIDEO_BUFFER_FLAG_BF indicates if the buffer is carrying the top or bottom field, respectively. The top and bottom buffers are expected to alternate in the pipeline, with this mode (Since: 1.16). )
Type castings
func (InterlaceMode) String ¶
func (i InterlaceMode) String() string
String implements a stringer on interlace mode
type MouseEvent ¶
type MouseEvent string
MouseEvent represents types of mouse events.
const (
	MouseButtonPress MouseEvent = "mouse-button-press"
	MouseButtonRelease MouseEvent = "mouse-button-release"
	MouseMove MouseEvent = "mouse-move"
)
Enums
type MultiviewFlags ¶
type MultiviewFlags int
MultiviewFlags are used to indicate extra properties of a stereo/multiview stream beyond the frame layout and buffer mapping that is conveyed in the MultiviewMode.
const ( MultiviewFlagsNone MultiviewFlags = C.GST_VIDEO_MULTIVIEW_FLAGS_NONE // (0) – No flags MultiviewFlagsRightViewFirst MultiviewFlags = C.GST_VIDEO_MULTIVIEW_FLAGS_RIGHT_VIEW_FIRST // (1) – For stereo streams, the normal arrangement of left and right views is reversed. MultiviewFlagsLeftFlipped MultiviewFlags = C.GST_VIDEO_MULTIVIEW_FLAGS_LEFT_FLIPPED // (2) – The left view is vertically mirrored. MultiviewFlagsLeftFlopped MultiviewFlags = C.GST_VIDEO_MULTIVIEW_FLAGS_LEFT_FLOPPED // (4) – The left view is horizontally mirrored. MultiviewFlagsRightFlipped MultiviewFlags = C.GST_VIDEO_MULTIVIEW_FLAGS_RIGHT_FLIPPED // (8) – The right view is vertically mirrored. MultiviewFlagsRightFlopped MultiviewFlags = C.GST_VIDEO_MULTIVIEW_FLAGS_RIGHT_FLOPPED // (16) – The right view is horizontally mirrored. MultiviewFlagsHalfAspect MultiviewFlags = C.GST_VIDEO_MULTIVIEW_FLAGS_HALF_ASPECT // (16384) – For frame-packed multiview modes, indicates that the individual views have been encoded with half the true width or height and should be scaled back up for display. This flag is used for overriding input layout interpretation by adjusting pixel-aspect-ratio. For side-by-side, column interleaved or checkerboard packings, the pixel width will be doubled. For row interleaved and top-bottom encodings, pixel height will be doubled. MultiviewFlagsMixedMono MultiviewFlags = C.GST_VIDEO_MULTIVIEW_FLAGS_MIXED_MONO // (32768) – The video stream contains both mono and multiview portions, signalled on each buffer by the absence or presence of the GST_VIDEO_BUFFER_FLAG_MULTIPLE_VIEW buffer flag. )
Type castings
type MultiviewFramePacking ¶
type MultiviewFramePacking int
MultiviewFramePacking represents the subset of MultiviewMode values that can be applied to any video frame without needing extra metadata. It can be used by elements that provide a property to override the multiview interpretation of a video stream when the video doesn't contain any markers.
This enum is used (for example) on playbin, to re-interpret a played video stream as a stereoscopic video. The individual enum values are equivalent to and have the same value as the matching MultiviewMode.
const ( MultiviewFramePackingNone MultiviewFramePacking = C.GST_VIDEO_MULTIVIEW_FRAME_PACKING_NONE // (-1) – A special value indicating no frame packing info. MultiviewFramePackingMono MultiviewFramePacking = C.GST_VIDEO_MULTIVIEW_FRAME_PACKING_MONO // (0) – All frames are monoscopic. MultiviewFramePackingLeft MultiviewFramePacking = C.GST_VIDEO_MULTIVIEW_FRAME_PACKING_LEFT // (1) – All frames represent a left-eye view. MultiviewFramePackingRight MultiviewFramePacking = C.GST_VIDEO_MULTIVIEW_FRAME_PACKING_RIGHT // (2) – All frames represent a right-eye view. MultiviewFramePackingSideBySide MultiviewFramePacking = C.GST_VIDEO_MULTIVIEW_FRAME_PACKING_SIDE_BY_SIDE // (3) – Left and right eye views are provided in the left and right half of the frame respectively. MultiviewFramePackingSideBySideQuincunx MultiviewFramePacking = C.GST_VIDEO_MULTIVIEW_FRAME_PACKING_SIDE_BY_SIDE_QUINCUNX // (4) – Left and right eye views are provided in the left and right half of the frame, but have been sampled using quincunx method, with half-pixel offset between the 2 views. MultiviewFramePackingColumnInterleaved MultiviewFramePacking = C.GST_VIDEO_MULTIVIEW_FRAME_PACKING_COLUMN_INTERLEAVED // (5) – Alternating vertical columns of pixels represent the left and right eye view respectively. MultiviewFramePackingRowInterleaved MultiviewFramePacking = C.GST_VIDEO_MULTIVIEW_FRAME_PACKING_ROW_INTERLEAVED // (6) – Alternating horizontal rows of pixels represent the left and right eye view respectively. MultiviewFramePackingTopBottom MultiviewFramePacking = C.GST_VIDEO_MULTIVIEW_FRAME_PACKING_TOP_BOTTOM // (7) – The top half of the frame contains the left eye, and the bottom half the right eye. MultiviewFramePackingCheckerboard MultiviewFramePacking = C.GST_VIDEO_MULTIVIEW_FRAME_PACKING_CHECKERBOARD // (8) – Pixels are arranged with alternating pixels representing left and right eye views in a checkerboard fashion. )
Type castings
type MultiviewMode ¶
type MultiviewMode int
MultiviewMode represents all possible stereoscopic 3D and multiview representations. In conjunction with MultiviewFlags, describes how multiview content is being transported in the stream.
const ( MultiviewModeNone MultiviewMode = C.GST_VIDEO_MULTIVIEW_MODE_NONE // (-1) – A special value indicating no multiview information. Used in GstVideoInfo and other places to indicate that no specific multiview handling has been requested or provided. This value is never carried on caps. MultiviewModeMono MultiviewMode = C.GST_VIDEO_MULTIVIEW_MODE_MONO // (0) – All frames are monoscopic. MultiviewModeLeft MultiviewMode = C.GST_VIDEO_MULTIVIEW_MODE_LEFT // (1) – All frames represent a left-eye view. MultiviewModeRight MultiviewMode = C.GST_VIDEO_MULTIVIEW_MODE_RIGHT // (2) – All frames represent a right-eye view. MultiviewModeSideBySide MultiviewMode = C.GST_VIDEO_MULTIVIEW_MODE_SIDE_BY_SIDE // (3) – Left and right eye views are provided in the left and right half of the frame respectively. MultiviewModeSideBySideQuincunx MultiviewMode = C.GST_VIDEO_MULTIVIEW_MODE_SIDE_BY_SIDE_QUINCUNX // (4) – Left and right eye views are provided in the left and right half of the frame, but have been sampled using quincunx method, with half-pixel offset between the 2 views. MultiviewModeColumnInterleaved MultiviewMode = C.GST_VIDEO_MULTIVIEW_MODE_COLUMN_INTERLEAVED // (5) – Alternating vertical columns of pixels represent the left and right eye view respectively. MultiviewModeRowInterleaved MultiviewMode = C.GST_VIDEO_MULTIVIEW_MODE_ROW_INTERLEAVED // (6) – Alternating horizontal rows of pixels represent the left and right eye view respectively. MultiviewModeTopBottom MultiviewMode = C.GST_VIDEO_MULTIVIEW_MODE_TOP_BOTTOM // (7) – The top half of the frame contains the left eye, and the bottom half the right eye. MultiviewModeCheckerboard MultiviewMode = C.GST_VIDEO_MULTIVIEW_MODE_CHECKERBOARD // (8) – Pixels are arranged with alternating pixels representing left and right eye views in a checkerboard fashion. MultiviewModeFrameByFrame MultiviewMode = C.GST_VIDEO_MULTIVIEW_MODE_FRAME_BY_FRAME // (32) – Left and right eye views are provided in separate frames alternately. MultiviewModeMultiviewFrameByFrame MultiviewMode = C.GST_VIDEO_MULTIVIEW_MODE_MULTIVIEW_FRAME_BY_FRAME // (33) – Multiple independent views are provided in separate frames in sequence. This method only applies to raw video buffers at the moment. Specific view identification is via the GstVideoMultiviewMeta and GstVideoMeta(s) on raw video buffers. MultiviewModeSeparated MultiviewMode = C.GST_VIDEO_MULTIVIEW_MODE_SEPARATED // (34) – Multiple views are provided as separate GstMemory framebuffers attached to each GstBuffer, described by the GstVideoMultiviewMeta and GstVideoMeta(s) )
Type castings
type Navigation ¶
type Navigation interface {
	// Sends the given command to the navigation interface.
	SendCommand(NavigationCommand)
	// Sends an event with the given structure to the navigation interface.
	SendEvent(*gst.Structure)
	// Sends the given key event. Recognized values for the event are "key-press"
	// and "key-release". The key is the character representation of the key. This is typically
	// as produced by XKeysymToString.
	SendKeyEvent(event KeyEvent, key string)
	// Sends the given mouse event. The mouse coordinates are relative
	// to the display space of the related output area. This is usually the size in pixels of the
	// window associated with the element implementing the Navigation interface. Use 0 for the
	// button when doing mouse move events.
	SendMouseEvent(event MouseEvent, button int, x, y float64)
	// Sends the given mouse scroll event. The mouse coordinates are
	// relative to the display space of the related output area. This is usually the size in pixels
	// of the window associated with the element implementing the Navigation interface.
	SendMouseScrollEvent(x, y, dX, dY float64)
}
Navigation interface is used for creating and injecting navigation related events such as mouse button presses, cursor motion and key presses. The associated library also provides methods for parsing received events, and for sending and receiving navigation related bus events. One main use-case is DVD menu navigation.
The main parts of the API are:
- The Navigation interface, implemented by elements which provide an application with the ability to create and inject navigation events into the pipeline.
- Navigation event handling API. Navigation events are created in response to calls on a Navigation interface implementation, and sent in the pipeline. Upstream elements can use the navigation event API functions to parse the contents of received messages.
- Navigation message handling API. Navigation messages may be sent on the message bus to inform applications of navigation related changes in the pipeline, such as the mouse moving over a clickable region, or the set of available angles changing.
The Navigation message functions provide functions for creating and parsing custom bus messages for signaling GstNavigation changes.
func NavigationFromElement ¶
func NavigationFromElement(element *gst.Element) Navigation
NavigationFromElement checks if the given element implements the Navigation interface. If it does, a usable interface is returned. Otherwise, it returns nil.
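A sketch of injecting a mouse click through a sink that implements Navigation (for example a windowed video sink). The helper name is hypothetical; coordinates are in the sink's display space and button 1 is the left button.
// clickAt simulates a left-button click at (x, y) on the given sink element.
// It returns false if the element does not implement Navigation.
func clickAt(sink *gst.Element, x, y float64) bool {
	nav := video.NavigationFromElement(sink)
	if nav == nil {
		return false
	}
	nav.SendMouseEvent(video.MouseMove, 0, x, y) // button 0 for move events
	nav.SendMouseEvent(video.MouseButtonPress, 1, x, y)
	nav.SendMouseEvent(video.MouseButtonRelease, 1, x, y)
	return true
}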
type NavigationCommand ¶
type NavigationCommand int
NavigationCommand is a set of commands that may be issued to an element providing the Navigation interface. The available commands can be queried via the QueryNewCommands query.
const ()
Type castings
const ()
Extra aliases for convenience in handling DVD navigation.
type NavigationEvent ¶
NavigationEvent extends the Event from the core library and is used by elements implementing the Navigation interface. You can wrap an event in this struct yourself, but it is safer to use the ToNavigationEvent method first to check validity.
func ToNavigationEvent ¶
func ToNavigationEvent(event *gst.Event) *NavigationEvent
ToNavigationEvent checks if the given event is a NavigationEvent, and if so, returns a NavigationEvent instance wrapping the event. If the event is not a NavigationEvent this function returns nil.
func (*NavigationEvent) GetType ¶
func (e *NavigationEvent) GetType() NavigationEventType
GetType returns the type of this event.
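A sketch of inspecting an upstream event (for example inside a pad probe); the helper name is hypothetical, and since ToNavigationEvent returns nil for anything that is not a navigation event the check doubles as a filter.
// logNavigation prints the navigation event type if ev is a navigation event.
func logNavigation(ev *gst.Event) {
	navEv := video.ToNavigationEvent(ev)
	if navEv == nil {
		return // not a navigation event
	}
	fmt.Println("navigation event type:", navEv.GetType())
}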
type NavigationEventType ¶
type NavigationEventType int
NavigationEventType are enum values for the various events that an element implementing the Navigation interface might send up the pipeline.
const ()
Type castings
type NavigationMessage ¶
NavigationMessage extends the Message from the core library and is used by elements implementing the Navigation interface. You can wrap a message in this struct yourself, but it is safer to use the ToNavigationMessage method first to check validity.
func ToNavigationMessage ¶
func ToNavigationMessage(msg *gst.Message) *NavigationMessage
ToNavigationMessage checks if the given message is a NavigationMessage, and if so, returns a NavigationMessage instance wrapping the message. If the message is not a NavigationMessage, this function returns nil.
func (*NavigationMessage) GetType ¶
func (m *NavigationMessage) GetType() NavigationMessageType
GetType returns the type of this message.
type NavigationMessageType ¶
type NavigationMessageType int
NavigationMessageType is a set of notifications that may be received on the bus when navigation related status changes.
const ()
Type castings
type NavigationQuery ¶
NavigationQuery extends the Query from the core library and is used by elements implementing the Navigation interface. You can wrap a query in this struct yourself, but it is safer to use the ToNavigationQuery method first to check validity.
func ToNavigationQuery ¶
func ToNavigationQuery(query *gst.Query) *NavigationQuery
ToNavigationQuery checks if the given query is a NavigationQuery, and if so, returns a NavigationQuery instance wrapping the query. If the query is not a NavigationQuery, this function returns nil.
func (*NavigationQuery) GetType ¶
func (q *NavigationQuery) GetType() NavigationQueryType
GetType returns the type of this query.
type NavigationQueryType ¶
type NavigationQueryType int
NavigationQueryType represents types of navigation interface queries.
const ()
Type castings
type OrientationMethod ¶
type OrientationMethod int
OrientationMethod represents the different video orientation methods.
const ( OrientationMethodIdentity OrientationMethod = C.GST_VIDEO_ORIENTATION_IDENTITY // (0) – Identity (no rotation) OrientationMethod90R OrientationMethod = C.GST_VIDEO_ORIENTATION_90R // (1) – Rotate clockwise 90 degrees OrientationMethod180 OrientationMethod = C.GST_VIDEO_ORIENTATION_180 // (2) – Rotate 180 degrees OrientationMethod90L OrientationMethod = C.GST_VIDEO_ORIENTATION_90L // (3) – Rotate counter-clockwise 90 degrees OrientationMethodHoriz OrientationMethod = C.GST_VIDEO_ORIENTATION_HORIZ // (4) – Flip horizontally OrientationMethodVert OrientationMethod = C.GST_VIDEO_ORIENTATION_VERT // (5) – Flip vertically OrientationMethodULLR OrientationMethod = C.GST_VIDEO_ORIENTATION_UL_LR // (6) – Flip across upper left/lower right diagonal OrientationMethodURLL OrientationMethod = C.GST_VIDEO_ORIENTATION_UR_LL // (7) – Flip across upper right/lower left diagonal OrientationMethodAuto OrientationMethod = C.GST_VIDEO_ORIENTATION_AUTO // (8) – Select flip method based on image-orientation tag OrientationMethodCustom OrientationMethod = C.GST_VIDEO_ORIENTATION_CUSTOM // (9) – Current status depends on plugin internal setup )
Type castings
type PackFlags ¶
type PackFlags int
PackFlags are different flags that can be used when packing and unpacking.
const ( PackFlagNone PackFlags = C.GST_VIDEO_PACK_FLAG_NONE // (0) – No flag PackFlagTruncateRange PackFlags = C.GST_VIDEO_PACK_FLAG_TRUNCATE_RANGE // (1) – When the source has a smaller depth than the target format, set the least significant bits of the target to 0. This is likely slightly faster but less accurate. When this flag is not specified, the most significant bits of the source are duplicated in the least significant bits of the destination. PackFlagInterlaced PackFlags = C.GST_VIDEO_PACK_FLAG_INTERLACED // (2) – The source is interlaced. The unpacked format will be interlaced as well with each line containing information from alternating fields. (Since: 1.2) )
Type castings
type TileMode ¶
type TileMode int
TileMode is an enum value describing the available tiling modes.
const ( TileModeUnknown TileMode = C.GST_VIDEO_TILE_MODE_UNKNOWN // (0) – Unknown or unset tile mode TileModeZFlipZ2X2 TileMode = C.GST_VIDEO_TILE_MODE_ZFLIPZ_2X2 // (65536) – Every four adjacent blocks - two horizontally and two vertically are grouped together and are located in memory in Z or flipped Z order. In case of odd rows, the last row of blocks is arranged in linear order. TileModeLinear TileMode = C.GST_VIDEO_TILE_MODE_LINEAR // (131072) – Tiles are in row order. )
Type castings
type TileType ¶
type TileType int
TileType is an enum value describing the most common tiling types.
const (
TileTypeIndexed TileType = C.GST_VIDEO_TILE_TYPE_INDEXED // (0) – Tiles are indexed. Use gst_video_tile_get_index () to retrieve the tile at the requested coordinates.
)
Type castings
type TransferFunction ¶
type TransferFunction int
TransferFunction defines the formula for converting between non-linear RGB (R'G'B') and linear RGB
const ( TransferUnknown TransferFunction = C.GST_VIDEO_TRANSFER_UNKNOWN // (0) – unknown transfer function TransferGamma10 TransferFunction = C.GST_VIDEO_TRANSFER_GAMMA10 // (1) – linear RGB, gamma 1.0 curve TransferGamma18 TransferFunction = C.GST_VIDEO_TRANSFER_GAMMA18 // (2) – Gamma 1.8 curve TransferGamma20 TransferFunction = C.GST_VIDEO_TRANSFER_GAMMA20 // (3) – Gamma 2.0 curve TransferGamma22 TransferFunction = C.GST_VIDEO_TRANSFER_GAMMA22 // (4) – Gamma 2.2 curve TransferBT709 TransferFunction = C.GST_VIDEO_TRANSFER_BT709 // (5) – Gamma 2.2 curve with a linear segment in the lower range, also ITU-R BT470M / ITU-R BT1700 625 PAL & SECAM / ITU-R BT1361 TransferSMPTE240M TransferFunction = C.GST_VIDEO_TRANSFER_SMPTE240M // (6) – Gamma 2.2 curve with a linear segment in the lower range TransferSRGB TransferFunction = C.GST_VIDEO_TRANSFER_SRGB // (7) – Gamma 2.4 curve with a linear segment in the lower range. IEC 61966-2-1 (sRGB or sYCC) TransferGamma28 TransferFunction = C.GST_VIDEO_TRANSFER_GAMMA28 // (8) – Gamma 2.8 curve, also ITU-R BT470BG TransferLog100 TransferFunction = C.GST_VIDEO_TRANSFER_LOG100 // (9) – Logarithmic transfer characteristic 100:1 range TransferLog316 TransferFunction = C.GST_VIDEO_TRANSFER_LOG316 // (10) – Logarithmic transfer characteristic 316.22777:1 range (100 * sqrt(10) : 1) TransferBT202012 TransferFunction = C.GST_VIDEO_TRANSFER_BT2020_12 // (11) – Gamma 2.2 curve with a linear segment in the lower range. Used for BT.2020 with 12 bits per component. Since: 1.6 TransferAdobeRGB TransferFunction = C.GST_VIDEO_TRANSFER_ADOBERGB // (12) – Gamma 2.19921875. Since: 1.8 TransferBT202010 TransferFunction = C.GST_VIDEO_TRANSFER_BT2020_10 // (13) – Rec. ITU-R BT.2020-2 with 10 bits per component. (functionally the same as the values GST_VIDEO_TRANSFER_BT709 and GST_VIDEO_TRANSFER_BT601). Since: 1.18 TransferSMPTE2084 TransferFunction = C.GST_VIDEO_TRANSFER_SMPTE2084 // (14) – SMPTE ST 2084 for 10, 12, 14, and 16-bit systems. Known as perceptual quantization (PQ) Since: 1.18 TransferARIBSTDB67 TransferFunction = C.GST_VIDEO_TRANSFER_ARIB_STD_B67 // (15) – Association of Radio Industries and Businesses (ARIB) STD-B67 and Rec. ITU-R BT.2100-1 hybrid loggamma (HLG) system Since: 1.18 TransferBT601 TransferFunction = C.GST_VIDEO_TRANSFER_BT601 // (16) – also known as SMPTE170M / ITU-R BT1358 525 or 625 / ITU-R BT1700 NTSC )
Type castings