Custom Capturing and Rendering

Last updated: 2021-10-22 09:50:16

    Sample Code

    To address frequently asked questions from developers, Tencent Cloud offers an easy-to-understand API example project, which you can use to quickly learn how to use the different APIs.

    Platform  | GitHub Address
    iOS       | GitHub
    Android   | GitHub

    Customizing Video to Publish

    iOS

    If you need to process video by yourself (for example, add subtitles) but want to leave the rest of the process to LiteAVSDK, follow the steps below.

    1. Call enableCustomVideoProcess of V2TXLivePusher to enable custom video processing. You will then receive callbacks carrying the video data.
    2. There are two cases for image processing.
      • The beauty filter component generates new textures.
      If the beauty filter component you use generates a new texture frame (for the processed image) during image processing, please set dstFrame.textureId to a new texture ID in the callback API.
        - (void)onProcessVideoFrame:(V2TXLiveVideoFrame *_Nonnull)srcFrame dstFrame:(V2TXLiveVideoFrame *_Nonnull)dstFrame
        {
            // Render the processed image into a new texture and hand its ID to dstFrame
            GLuint dstTextureId = renderItemWithTexture(srcFrame.textureId, srcFrame.width, srcFrame.height);
            dstFrame.textureId = dstTextureId;
        }
      
      • The beauty filter component does not generate new textures.
        If the third-party beauty filter component you use does not generate new textures and you need to manually set an input texture and an output texture for the component, consider the following scheme:
          - (void)onProcessVideoFrame:(V2TXLiveVideoFrame *_Nonnull)srcFrame dstFrame:(V2TXLiveVideoFrame *_Nonnull)dstFrame
          {
              // Feed the input texture and the SDK-provided output texture to the component
              thirdparty_process(srcFrame.textureId, srcFrame.width, srcFrame.height, dstFrame.textureId);
          }
        
    3. Processing texture data requires some basic knowledge of OpenGL. Avoid time-consuming operations: onProcessVideoFrame is called once per video frame, at the same frequency as the frame rate, so sophisticated processing can overload the GPU and cause overheating.

    Android

    If you need to process video by yourself (for example, add subtitles) but want to leave the rest of the process to LiteAVSDK, follow the steps below.

    1. Call enableCustomVideoProcess of V2TXLivePusher to enable custom video processing. You will then receive callbacks carrying the video data.
    2. There are two cases for image processing.
      • The beauty filter component generates new textures.
        If the beauty filter component you use generates a new texture frame (for the processed image) during image processing, set dstFrame.texture.textureId to the new texture ID in the callback API.
        private class MyPusherObserver extends V2TXLivePusherObserver {
            @Override
            public void onGLContextCreated() {
                // Initialize the beauty filter renderer on the SDK's OpenGL thread
                mFURenderer.onSurfaceCreated();
                mFURenderer.setUseTexAsync(true);
            }

            @Override
            public int onProcessVideoFrame(V2TXLiveVideoFrame srcFrame, V2TXLiveVideoFrame dstFrame) {
                // Write the ID of the newly generated texture back into dstFrame
                dstFrame.texture.textureId = mFURenderer.onDrawFrameSingleInput(
                        srcFrame.texture.textureId, srcFrame.width, srcFrame.height);
                return 0;
            }

            @Override
            public void onGLContextDestroyed() {
                // Release the renderer's resources when the OpenGL context goes away
                mFURenderer.onSurfaceDestroyed();
            }
        }
        
      • The beauty filter component does not generate new textures.
        If the third-party beauty filter component you use does not generate new textures and you need to manually set an input texture and an output texture for the component, consider the following scheme:
        @Override
        public int onProcessVideoFrame(V2TXLiveVideoFrame srcFrame, V2TXLiveVideoFrame dstFrame) {
            // Pass both the input texture and the SDK-provided output texture to the component
            thirdparty_process(srcFrame.texture.textureId, srcFrame.width, srcFrame.height, dstFrame.texture.textureId);
            return 0;
        }
        
    3. Processing texture data requires some basic knowledge of OpenGL. Avoid time-consuming operations: onProcessVideoFrame is called once per video frame, at the same frequency as the frame rate, so sophisticated processing can overload the GPU and cause overheating.
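      To make the frame-rate warning in step 3 concrete: the time budget for each onProcessVideoFrame call is simply 1000 ms divided by the capture frame rate. The small helper below is an illustrative sketch, not part of the LiteAVSDK API:

      ```java
      public class FrameBudget {
          // Milliseconds available to process one frame at a given frame rate.
          // All texture work inside onProcessVideoFrame must finish within this
          // window, or frames back up and GPU load climbs.
          static double budgetMs(int fps) {
              return 1000.0 / fps;
          }

          public static void main(String[] args) {
              System.out.printf("30 fps -> %.1f ms per frame%n", budgetMs(30));
              System.out.printf("60 fps -> %.1f ms per frame%n", budgetMs(60));
          }
      }
      ```

      At 30 fps the budget is roughly 33 ms, and at 60 fps only about 17 ms, which is why heavy per-frame processing is discouraged.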

    Customizing Video for Playback

    iOS

    1. Set V2TXLivePlayerObserver to listen for events of V2TXLivePlayer.
      @interface V2TXLivePlayer : NSObject
      /**
      * @brief Set callbacks for the player.
      * After setting callbacks, you can listen for events of V2TXLivePlayer,
      * including the player status, playback volume, first audio/video frame, statistics, and warning and error messages.
      *
      * @param observer Target object for the player’s callbacks. For more information, see {@link V2TXLivePlayerObserver}
      */

      - (void)setObserver:(id<V2TXLivePlayerObserver>)observer;
    2. Get the player’s video data from the onRenderVideoFrame callback.
      /**
      * @brief Video frame information
      * V2TXLiveVideoFrame is the raw data that describes a video frame before encoding or after decoding
      * @note It is used during custom video capturing to package the video frames to be sent, and during custom video rendering to get the packaged video frames
      */

      @interface V2TXLiveVideoFrame : NSObject

      ///**Field meaning:** video pixel format
      ///**Recommended value:** V2TXLivePixelFormatNV12
      @property(nonatomic, assign) V2TXLivePixelFormat pixelFormat;

      ///**Field meaning:** video data container format
      ///**Recommended value:** V2TXLiveBufferTypePixelBuffer
      @property(nonatomic, assign) V2TXLiveBufferType bufferType;

      ///**Field meaning:** video data when bufferType is V2TXLiveBufferTypeNSData
      @property(nonatomic, strong, nullable) NSData *data;

      ///**Field meaning:** video data when bufferType is V2TXLiveBufferTypePixelBuffer
      @property(nonatomic, assign, nullable) CVPixelBufferRef pixelBuffer;

      ///**Field meaning:** video width
      @property(nonatomic, assign) NSUInteger width;

      ///**Field meaning:** video height
      @property(nonatomic, assign) NSUInteger height;

      ///**Field meaning:** clockwise rotation of video
      @property(nonatomic, assign) V2TXLiveRotation rotation;

      ///**Field meaning:** video texture ID
      @property(nonatomic, assign) GLuint textureId;

      @end


      @protocol V2TXLivePlayerObserver <NSObject>
      @optional
      /**
      * @brief Custom video rendering callback
      *
      * @note You will receive this callback after calling [enableCustomRendering](@ref V2TXLivePlayer#enableCustomRendering:pixelFormat:bufferType:) to enable custom video rendering
      *
      * @param videoFrame Video frame data {@link V2TXLiveVideoFrame}
      */

      - (void)onRenderVideoFrame:(id<V2TXLivePlayer>)player
                           frame:(V2TXLiveVideoFrame *)videoFrame;

      @end

    Android

    1. Set V2TXLivePlayerObserver to listen for callbacks of V2TXLivePlayer.
      public abstract void setObserver(V2TXLivePlayerObserver observer)
    2. Get the player’s video data from the onRenderVideoFrame callback.
      public final static class V2TXLiveVideoFrame {
          /// Video pixel format
          public V2TXLivePixelFormat pixelFormat = V2TXLivePixelFormat.V2TXLivePixelFormatUnknown;
          /// Video data container format
          public V2TXLiveBufferType bufferType = V2TXLiveBufferType.V2TXLiveBufferTypeUnknown;
          /// Video texture pack
          public V2TXLiveTexture texture;
          /// Video data when bufferType is V2TXLiveBufferTypeByteArray
          public byte[] data;
          /// Video data when bufferType is V2TXLiveBufferTypeByteBuffer
          public ByteBuffer buffer;
          /// Video width
          public int width;
          /// Video height
          public int height;
          /// Clockwise rotation of video, in degrees
          public int rotation;
      }
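      As a sanity check on the data/buffer fields above: for the NV12 pixel format recommended earlier (a full-resolution Y plane followed by an interleaved UV plane half the Y plane's size), one frame occupies width × height × 3/2 bytes. The helper below is an illustrative sketch, not part of the SDK:

      ```java
      import java.nio.ByteBuffer;

      public class Nv12Size {
          // NV12 stores a full-size luma (Y) plane plus an interleaved chroma
          // (UV) plane half the size of the Y plane, i.e. width * height * 3 / 2
          // bytes in total.
          static int nv12Bytes(int width, int height) {
              return width * height * 3 / 2;
          }

          public static void main(String[] args) {
              // Size a direct buffer for one 1280x720 NV12 frame.
              ByteBuffer frame = ByteBuffer.allocateDirect(nv12Bytes(1280, 720));
              System.out.println(frame.capacity()); // 1382400
          }
      }
      ```

      Checking the received frame's byte count against this formula is a quick way to confirm the pixel format you configured is the one you are actually getting.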

      public abstract class V2TXLivePlayerObserver {
          /**
           * Callback for custom video rendering
           *
           * @param player The player object sending this callback
           * @param videoFrame Video frame data {@link V2TXLiveVideoFrame}
           */
          void onRenderVideoFrame(V2TXLivePlayer player, V2TXLiveVideoFrame videoFrame);
      }
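      One practical detail when rendering these frames yourself: rotation is in clockwise degrees, and a 90° or 270° rotation swaps the dimensions you should use for display. A minimal sketch (the class and method names are illustrative, not SDK API):

      ```java
      public class DisplaySize {
          // Returns {displayWidth, displayHeight} after applying a clockwise
          // rotation in degrees: 90 and 270 swap the two dimensions.
          static int[] displaySize(int width, int height, int rotation) {
              int r = ((rotation % 360) + 360) % 360; // normalize to 0..359
              return (r == 90 || r == 270) ? new int[] { height, width }
                                           : new int[] { width, height };
          }

          public static void main(String[] args) {
              int[] s = displaySize(1280, 720, 90); // a landscape frame rotated 90°
              System.out.println(s[0] + "x" + s[1]); // 720x1280
          }
      }
      ```

      Sizing your render target this way avoids stretched output when the capture device reports a rotated frame.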