#import <RTCVideoFrame.h>
◆ initWithPixelBuffer:rotation:timeStampNs:
- (instancetype) initWithPixelBuffer:(CVPixelBufferRef)pixelBuffer
                            rotation:(RTCVideoRotation)rotation
                         timeStampNs:(int64_t)timeStampNs
Initialize an RTCVideoFrame from a pixel buffer, rotation, and timestamp.
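A minimal usage sketch, assuming the pixel buffer and a CMTime presentation timestamp come from a capture callback; the helper name and the framework-style import path are illustrative, not part of the documented API:

  #import <CoreMedia/CoreMedia.h>
  #import <CoreVideo/CoreVideo.h>
  #import <Foundation/Foundation.h>
  #import <WebRTC/RTCVideoFrame.h>

  // Wrap a captured pixel buffer in an RTCVideoFrame with no rotation.
  static RTCVideoFrame *FrameFromPixelBuffer(CVPixelBufferRef pixelBuffer,
                                             CMTime presentationTime) {
    // The initializer expects the timestamp in nanoseconds.
    int64_t timeStampNs =
        (int64_t)(CMTimeGetSeconds(presentationTime) * NSEC_PER_SEC);
    return [[RTCVideoFrame alloc] initWithPixelBuffer:pixelBuffer
                                             rotation:RTCVideoRotation_0
                                          timeStampNs:timeStampNs];
  }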
◆ initWithPixelBuffer:scaledWidth:scaledHeight:cropWidth:cropHeight:cropX:cropY:rotation:timeStampNs:
- (instancetype) initWithPixelBuffer:(CVPixelBufferRef)pixelBuffer
                         scaledWidth:(int)scaledWidth
                        scaledHeight:(int)scaledHeight
                           cropWidth:(int)cropWidth
                          cropHeight:(int)cropHeight
                               cropX:(int)cropX
                               cropY:(int)cropY
                            rotation:(RTCVideoRotation)rotation
                         timeStampNs:(int64_t)timeStampNs
Initialize an RTCVideoFrame from a pixel buffer combined with cropping and scaling. Cropping will be applied first on the pixel buffer, followed by scaling to the final resolution of scaledWidth x scaledHeight.
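For example, to produce a square, downscaled frame one can compute a centered crop from the buffer dimensions; the helper and the 360x360 output size below are illustrative assumptions, not part of the documented API:

  #import <CoreVideo/CoreVideo.h>
  #import <Foundation/Foundation.h>
  #import <WebRTC/RTCVideoFrame.h>

  // Center-crop the largest square out of the buffer, then scale it to 360x360.
  static RTCVideoFrame *SquareFrameFromPixelBuffer(CVPixelBufferRef pixelBuffer,
                                                   int64_t timeStampNs) {
    int bufferWidth = (int)CVPixelBufferGetWidth(pixelBuffer);
    int bufferHeight = (int)CVPixelBufferGetHeight(pixelBuffer);
    int cropSide = MIN(bufferWidth, bufferHeight);
    return [[RTCVideoFrame alloc] initWithPixelBuffer:pixelBuffer
                                          scaledWidth:360
                                         scaledHeight:360
                                            cropWidth:cropSide
                                           cropHeight:cropSide
                                                cropX:(bufferWidth - cropSide) / 2
                                                cropY:(bufferHeight - cropSide) / 2
                                             rotation:RTCVideoRotation_0
                                          timeStampNs:timeStampNs];
  }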
◆ initWithVideoBuffer:rotation:timeStampNs:
◆ newI420VideoFrame
Return a frame that is guaranteed to be I420, i.e. it is possible to access the YUV data on it.
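A sketch of the intended pattern, assuming the YUV accessors behave as documented below (the averaging helper is illustrative only):

  #import <Foundation/Foundation.h>
  #import <WebRTC/RTCVideoFrame.h>

  // Compute the mean luma value of a frame, converting to I420 first when the
  // frame is backed by a native pixel buffer (nativeHandle != NULL).
  static double AverageLuma(RTCVideoFrame *frame) {
    RTCVideoFrame *i420Frame =
        frame.nativeHandle ? [frame newI420VideoFrame] : frame;
    const uint8_t *dataY = i420Frame.dataY;
    int strideY = (int)i420Frame.strideY;
    int width = (int)i420Frame.width;
    int height = (int)i420Frame.height;
    uint64_t sum = 0;
    for (int y = 0; y < height; ++y) {
      const uint8_t *row = dataY + y * strideY;
      for (int x = 0; x < width; ++x) {
        sum += row[x];
      }
    }
    return (width > 0 && height > 0) ? (double)sum / ((double)width * height)
                                     : 0.0;
  }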
◆ NS_UNAVAILABLE
- (instancetype) NS_UNAVAILABLE
Initializer declared NS_UNAVAILABLE; create frames with one of the documented initializers above instead.
◆ dataU
◆ dataV
◆ dataY
Accessing YUV data should only be done for I420 frames, i.e. if nativeHandle is null. It is always possible to get such a frame by calling newI420VideoFrame.
◆ height
Height without rotation applied.
◆ nativeHandle
The native handle should be a pixel buffer on iOS.
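A sketch of reading the buffer back out; it assumes the property is typed as CVPixelBufferRef, as the iOS header suggests (add a bridge cast if your copy of the header exposes it as a generic handle):

  #import <CoreVideo/CoreVideo.h>
  #import <Foundation/Foundation.h>
  #import <WebRTC/RTCVideoFrame.h>

  // Log the dimensions of the underlying pixel buffer, if the frame has one.
  static void LogNativeBufferSize(RTCVideoFrame *frame) {
    CVPixelBufferRef pixelBuffer = frame.nativeHandle;
    if (pixelBuffer == NULL) {
      // Software (I420) frame: use dataY/dataU/dataV instead.
      return;
    }
    NSLog(@"pixel buffer: %zu x %zu",
          CVPixelBufferGetWidth(pixelBuffer),
          CVPixelBufferGetHeight(pixelBuffer));
  }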
◆ rotation
- (RTCVideoRotation) rotation
(readonly, nonatomic, assign)
◆ strideU
◆ strideV
◆ strideY
◆ timeStampNs
Timestamp in nanoseconds.
◆ width
Width without rotation applied.
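Because width and height describe the unrotated buffer, a renderer that honors the rotation property swaps them for 90- and 270-degree frames; a minimal sketch (helper name is illustrative):

  #import <CoreGraphics/CoreGraphics.h>
  #import <WebRTC/RTCVideoFrame.h>

  // Size of the frame as it should appear once rotation is applied.
  static CGSize DisplaySizeOfFrame(RTCVideoFrame *frame) {
    BOOL swapsDimensions = (frame.rotation == RTCVideoRotation_90 ||
                            frame.rotation == RTCVideoRotation_270);
    CGFloat width = (CGFloat)frame.width;
    CGFloat height = (CGFloat)frame.height;
    return swapsDimensions ? CGSizeMake(height, width)
                           : CGSizeMake(width, height);
  }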
The documentation for this class was generated from the following files:
- DerivedData/WebKit/Build/Products/Debug/usr/local/include/webrtc/sdk/objc/Framework/Headers/WebRTC/RTCVideoFrame.h
- Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/Framework/Classes/RTCVideoFrame.mm