
FFmpegV4L2VideoSource captures data from sensors through the V4L2 interface. It supports configuring the output data format and can fan one capture stream out to multiple consumers: a V4L2 device only supports one data output at a time, but FFmpegV4L2VideoSource can provide multiple outputs from it.

API Instructions

namespace com { namespace sunplus { namespace media {

class FFmpegV4L2VideoSource : public FFmpegVideoSource {
public:
    FFmpegV4L2VideoSource(std::string device);
    ~FFmpegV4L2VideoSource();

public: 
    int open(AVDictionary* options);
    void close();
    
    AVStream* getAVStream();

    std::shared_ptr<FFmpegVideoProvider> creatVideoProvider(int maxQueueSize = 4);
    void destroyVideoProvider(std::shared_ptr<FFmpegVideoProvider> provider);

    void setVideoDataCallback(VideoPacketCallback callback);
};
}}}

Constructors

/**
 * @param device  V4L2 device path, such as "/dev/video0".
 */
FFmpegV4L2VideoSource(std::string device);

open

Open the V4L2 device and retrieve stream information.

The output format (width, height, pixel format, frame rate, etc.) is set through options. If options is NULL, the device defaults are used.

/**
 * Open this V4L2 device and retrieve stream information.
 * The device must be closed with close().
 *
 * @param options  v4l2 demuxer-private options.
 *                 May be NULL.
 *
 * @return 0 on success, < 0 on error.
 *
 */
int open(AVDictionary* options);

You can list the formats the device supports with the following command:

ffmpeg -hide_banner -f v4l2 -list_formats all -i /dev/video0
(screenshot: sample -list_formats output)

Sample

auto videoSource = make_shared<FFmpegV4L2VideoSource>("/dev/video0");
AVDictionary *options = nullptr;
av_dict_set(&options, "video_size", "1280x720", 0);
av_dict_set(&options, "framerate", "30", 0);
av_dict_set(&options, "pixel_format", "uyvy422", 0);
int ret = videoSource->open(options);

close

Close the V4L2 device and free all of its contents. Every device opened with open() must be closed with close().

/**
 * Close this V4L2 device. Free all its contents.
 */
void close();

getAVStream

Returns the stream information, such as time_base, start_time, codec, etc. It is only valid after the device has been successfully opened.

/**
 * Get the AVStream of the underlying AVFormatContext. It can be used to
 * read stream information such as time_base, start_time, codec, etc.
 */
AVStream* getAVStream();

Sample

//open the device
......
// get stream info
auto videoStream = videoSource->getAVStream();
printf("videoStream, time_base: %d/%d\n", videoStream->time_base.num, videoStream->time_base.den);
printf("videoStream, start_time: %lld\n", videoStream->start_time);
printf("videoStream, start_time to ms: %lld\n", av_ts_make_time_ms(videoStream->start_time, &videoStream->time_base));
printf("videoStream, codec: %s\n", avcodec_get_name(videoStream->codecpar->codec_id));
printf("videoStream, width: %d, height: %d\n", videoStream->codecpar->width, videoStream->codecpar->height);
printf("videoStream, fps: %d/%d\n", videoStream->r_frame_rate.num, videoStream->r_frame_rate.den);
printf("videoStream, pix_fmt: %s\n", av_get_pix_fmt_name((enum AVPixelFormat)videoStream->codecpar->format));

creatVideoProvider

Create a video provider that caches data; you can limit the size of its queue. Captured packets are pushed into the FFmpegVideoProvider's queue, and you retrieve them through the provider.

For the FFmpegVideoProvider API instructions, please refer to the FFmpegVideoProvider page.

/**
 * Create a video provider to cache data.
 *
 * @param maxQueueSize  max size of the provider's queue; default is 4.
 *
 */
std::shared_ptr<FFmpegVideoProvider> creatVideoProvider(int maxQueueSize = 4);

destroyVideoProvider

Free the FFmpegVideoProvider and all its cache data.

/**
 * Free the FFmpegVideoProvider and all its cache data.
 *
 * @param provider provider to free.
 *
 */
void destroyVideoProvider(std::shared_ptr<FFmpegVideoProvider> provider);

setVideoDataCallback

If you only need one output and do not want to use FFmpegVideoProvider, you can receive the data by setting a VideoPacketCallback.

/**
 * Set a callback to receive video packets directly.
 *
 * @param callback  VideoPacketCallback invoked for each packet.
 *
 */
void setVideoDataCallback(VideoPacketCallback callback);

Sample Code

This sample creates two YUV video providers with different frame rates from a single FFmpegV4L2VideoSource.

The flow of creating video providers:

create video source --> open video source --> create video provider --> provider prepare --> create the thread of get frame

void FFmpegV4L2VideoSource_Test() {
    /* init output format */
    auto videoSource = make_shared<FFmpegV4L2VideoSource>("/dev/video0");
    AVDictionary *options = nullptr;
    av_dict_set(&options, "video_size", "1280x720", 0);
    av_dict_set(&options, "framerate", "30", 0);
    av_dict_set(&options, "pixel_format", "uyvy422", 0);

    /* open the device */
    int ret = videoSource->open(options);
    if (ret < 0) {
        printf("open device failed: %d\n", ret);
        return;
    }

    /* get video stream info */
    auto videoStream = videoSource->getAVStream();
    printf("videoStream, time_base: %d/%d\n", videoStream->time_base.num, videoStream->time_base.den);
    printf("videoStream, start_time: %lld\n", videoStream->start_time);
    printf("videoStream, start_time to ms: %lld\n", av_ts_make_time_ms(videoStream->start_time, &videoStream->time_base));
    printf("videoStream, codec: %s\n", avcodec_get_name(videoStream->codecpar->codec_id));
    printf("videoStream, width: %d, height: %d\n", videoStream->codecpar->width, videoStream->codecpar->height);
    printf("videoStream, fps: %d/%d\n", videoStream->r_frame_rate.num, videoStream->r_frame_rate.den);
    printf("videoStream, pix_fmt: %s\n", av_get_pix_fmt_name((enum AVPixelFormat)videoStream->codecpar->format));

    /* create the first thread to get yuv frame */
    auto getYUVFrameThread_1 = [&](){
        /* 1. create video provider by FFmpegV4L2VideoSource */
        auto provider = videoSource->creatVideoProvider();

        /* 2. videoSource will push the frame in the queue after the provider is prepared */
        provider->prepare();
        int index = 0;
        while(!is_exit) {
            AVPacket* packet = nullptr;
            /* 3. get frame by video provider, pop frame from the queue */
            auto ret = provider->getFrame(packet);
            if (packet != nullptr) {
                printf("getYUVFrameThread_1, yuv frame[%d] pts: %lld, size: %d\n", index, packet->pts, packet->size);
                /* 4. process the yuv data, copy or write to file */ 

                /* ....... */
                
                /* 5. free packet (av_packet_free() also unrefs it) */
                av_packet_free(&packet);
                index++;
            }
            
        }
        /* 6. provider destroy */
        provider->destroy();
        /* 7. free  provider */
        videoSource->destroyVideoProvider(provider);
    };
    auto thread1 = make_shared<thread>(getYUVFrameThread_1);

    /* create the second thread to get yuv frame and change the frame rate of the stream */
    auto getYUVFrameThread_2 = [&](){
        auto provider = videoSource->creatVideoProvider();
        provider->prepare();
        /* set the put frame interval.
         * This way, you can change the frame rate of the stream. 
         * For example, if the frame rate of the Video Source is 30 
         * and you set the interval of the put frame to 3, 
         * the frame rate of this provider will become 10.
         * */ 
        provider->setPutFrameInterval(3);
        int index = 0;
        while(!is_exit) {
            AVPacket* packet = nullptr;
            auto ret = provider->getFrame(packet);
            if (packet != nullptr) {
                printf("getYUVFrameThread_2, yuv frame[%d] pts: %lld, size: %d\n", index, packet->pts, packet->size);
                av_packet_free(&packet);
                index++;
            }
        }
        provider->destroy();
        videoSource->destroyVideoProvider(provider);
    };
    auto thread2 = make_shared<thread>(getYUVFrameThread_2);

    _wait_exit("FFmpegV4L2VideoSource_Test");
    
    if (thread1->joinable()) {
        thread1->join();
    }

    if (thread2->joinable()) {
        thread2->join();
    }

    /* close the device */ 
    videoSource->close();

}

Test Result

./ffmpeg_sample v4l2src
(screenshot: FFmpegV4L2VideoSource test result)
