various codecs support (e.g. H.264, HEVC, VP8, VP9, MJPEG, ...) (#20)
- extends hve_config with encoder field
- by default (NULL, "") set to "h264_vaapi"
- adds HEVC 10 bit per channel encoding example
- documentation update (codecs, profiles)
- readme update (codecs, examples)

closes #4
implements p010le example for #18
bmegli authored Dec 30, 2019
1 parent 25d747b commit ac3a4c1
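
For illustration, an editor's sketch (not part of this commit) of how a caller might use the new encoder field: leaving it NULL or "" keeps the previous h264_vaapi behaviour, while any other VAAPI encoder name supported by FFmpeg and the hardware selects that codec. Resolution, framerate and pixel format below are arbitrary placeholders.

```C
#include <stdio.h>

#include "hve.h"

int main(void)
{
	//field order follows struct hve_config as extended in this commit:
	//width, height, framerate, device, encoder, pixel_format, profile, max_b_frames, bit_rate
	struct hve_config cfg = {1280, 720, 30,
	                         NULL,         //device: NULL / "" -> select automatically
	                         "hevc_vaapi", //encoder: NULL / "" falls back to "h264_vaapi"
	                         "nv12",       //pixel_format: NULL / "" -> nv12
	                         0,            //profile: 0 -> guess from input
	                         0,            //max_b_frames: 0 to minimize latency
	                         0};           //bit_rate: average bitrate in VBR mode

	struct hve *encoder = hve_init(&cfg);

	if(encoder == NULL)
	{
		fprintf(stderr, "hve_init failed, try specifying a device\n");
		return -1;
	}

	//... hve_send_frame / hve_receive_packet loop as in the examples ...

	hve_close(encoder);
	return 0;
}
```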
Showing 6 changed files with 212 additions and 17 deletions.
2 changes: 2 additions & 0 deletions CMakeLists.txt
@@ -13,3 +13,5 @@ install(FILES hve.h DESTINATION include)
add_executable(hve-encode-raw-h264 examples/hve_encode_raw_h264.c)
target_link_libraries(hve-encode-raw-h264 hve)

add_executable(hve-encode-raw-hevc10 examples/hve_encode_raw_hevc10.c)
target_link_libraries(hve-encode-raw-hevc10 hve)
25 changes: 16 additions & 9 deletions README.md
@@ -3,7 +3,7 @@
This library wraps hardware video encoding in a simple interface.
There are no performance losses (at the cost of library flexibility).

Currently it supports VAAPI and H.264 standard.
Currently it supports VAAPI and various codecs (H.264, HEVC, ...).

See library [documentation](https://bmegli.github.io/hardware-video-encoder/group__interface.html).

@@ -13,10 +13,10 @@ See [hardware-video-streaming](https://github.com/bmegli/hardware-video-streamin

## Intended Use

Raw H.264 encoding:
Raw encoding (H.264, HEVC, ...):
- custom network streaming protocols
- low latency streaming
- raw H.264 dumping
- raw dumping (H.264, HEVC, ...)
- ...

Complex pipelines (muxing, scaling, color conversions, filtering) are beyond the scope of this library.
@@ -63,13 +63,18 @@ cmake ..
make
```

## Running Example
## Running Examples

``` bash
# ./hve-encode-raw-h264 <number-of-seconds> [device]
./hve-encode-raw-h264 10
```

``` bash
# ./hve-encode-raw-hevc10 <number-of-seconds> [device]
./hve-encode-raw-hevc10 10
```

### Troubleshooting

If you have multiple VAAPI devices you may have to specify Intel directly.
@@ -81,19 +86,21 @@ sudo apt-get install vainfo
vainfo --display drm --device /dev/dri/renderD128
```

Once you identify your Intel device run the example, e.g.
Once you identify your Intel device run the examples, e.g.

```bash
./hve-encode-raw-h264 10 /dev/dri/renderD128
./hve-encode-raw-hevc10 10 /dev/dri/renderD128
```

## Testing

Play result raw H.264 file with FFmpeg:
Play the resulting raw H.264/HEVC file with FFmpeg:

``` bash
# output goes to output.h264 file
# output goes to output.h264/output.hevc file
ffplay output.h264
ffplay output.hevc
```

You should see procedurally generated video (moving through greyscale).
@@ -110,11 +117,11 @@ There are just 4 functions and 3 user-visible data types:

```C
struct hve_config hardware_config = {WIDTH, HEIGHT, FRAMERATE, DEVICE,
PIXEL_FORMAT, PROFILE, BFRAMES, BITRATE};
ENCODER, PIXEL_FORMAT, PROFILE, BFRAMES, BITRATE};
struct hve *hardware_encoder=hve_init(&hardware_config);
struct hve_frame frame = { 0 };

//later assuming PIXEL_FORMAT is "nv12" (you can use something else)
//later assuming PIXEL_FORMAT is "nv12" (you may use something else)

//fill with your stride (width including padding if any)
frame.linesize[0] = frame.linesize[1] = WIDTH;
7 changes: 4 additions & 3 deletions examples/hve_encode_raw_h264.c
@@ -19,8 +19,9 @@ const int HEIGHT=720;
const int FRAMERATE=30;
int SECONDS=10;
const char *DEVICE=NULL; //NULL for default or device e.g. "/dev/dri/renderD128"
const char *ENCODER=NULL;//NULL for default (h264_vaapi) or FFmpeg encoder e.g. "hevc_vaapi", ...
const char *PIXEL_FORMAT="nv12"; //NULL for default (NV12) or pixel format e.g. "rgb0"
const int PROFILE=FF_PROFILE_H264_HIGH; //or FF_PROFILE_H264_MAIN, FF_PROFILE_H264_CONSTRAINED_BASELINE, ...
const int PROFILE=FF_PROFILE_H264_HIGH; //or FF_PROFILE_HEVC_MAIN, FF_PROFILE_H264_CONSTRAINED_BASELINE, ...
const int BFRAMES=0; //max_b_frames, set to 0 to minimize latency, non-zero to minimize size
const int BITRATE=0; //average bitrate in VBR

@@ -36,7 +37,7 @@ int main(int argc, char* argv[])
return -1;

//prepare library data
struct hve_config hardware_config = {WIDTH, HEIGHT, FRAMERATE, DEVICE, PIXEL_FORMAT, PROFILE, BFRAMES, BITRATE};
struct hve_config hardware_config = {WIDTH, HEIGHT, FRAMERATE, DEVICE, ENCODER, PIXEL_FORMAT, PROFILE, BFRAMES, BITRATE};
struct hve *hardware_encoder;

//prepare file for raw H.264 output
@@ -82,7 +83,7 @@ int encoding_loop(struct hve *hardware_encoder, FILE *output_file)

for(f=0;f<frames;++f)
{
//prepare dummy image date, normally you would take it from camera or other source
//prepare dummy image data, normally you would take it from camera or other source
memset(Y, f % 255, WIDTH*HEIGHT); //NV12 luminance (ride through greyscale)
memset(color, 128, WIDTH*HEIGHT/2); //NV12 UV (no color really)

154 changes: 154 additions & 0 deletions examples/hve_encode_raw_hevc10.c
@@ -0,0 +1,154 @@
/*
* HVE Hardware Video Encoder library example of encoding through VAAPI to HEVC 10 bits per channel
*
* Copyright 2019 (C) Bartosz Meglicki <[email protected]>
*
* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at http://mozilla.org/MPL/2.0/.
*
*/

#include <stdio.h> //printf, fprintf
#include <inttypes.h> //uint8_t, uint16_t

#include "../hve.h"

const int WIDTH=1280;
const int HEIGHT=720;
const int FRAMERATE=30;
int SECONDS=10;
const char *DEVICE=NULL; //NULL for default or device e.g. "/dev/dri/renderD128"
const char *ENCODER="hevc_vaapi";//NULL for default (h264_vaapi) or FFmpeg encoder e.g. "hevc_vaapi", ...
const char *PIXEL_FORMAT="p010le"; //NULL for default (nv12) or pixel format e.g. "rgb0", ...
const int PROFILE=FF_PROFILE_HEVC_MAIN_10; //or FF_PROFILE_HEVC_MAIN, ...
const int BFRAMES=0; //max_b_frames, set to 0 to minimize latency, non-zero to minimize size
const int BITRATE=0; //average bitrate in VBR

int encoding_loop(struct hve *hardware_encoder, FILE *output_file);
int process_user_input(int argc, char* argv[]);
int hint_user_on_failure(char *argv[]);
void hint_user_on_success();

int main(int argc, char* argv[])
{
//get SECONDS and DEVICE from the command line
if( process_user_input(argc, argv) < 0 )
return -1;

//prepare library data
struct hve_config hardware_config = {WIDTH, HEIGHT, FRAMERATE, DEVICE, ENCODER, PIXEL_FORMAT, PROFILE, BFRAMES, BITRATE};
struct hve *hardware_encoder;

//prepare file for raw HEVC output
FILE *output_file = fopen("output.hevc", "w+b");
if(output_file == NULL)
return fprintf(stderr, "unable to open file for output\n");

//initialize library with hve_init
if( (hardware_encoder = hve_init(&hardware_config)) == NULL )
{
fclose(output_file);
return hint_user_on_failure(argv);
}

//do the actual encoding
int status = encoding_loop(hardware_encoder, output_file);

hve_close(hardware_encoder);
fclose(output_file);

if(status == 0)
hint_user_on_success();

return 0;
}

int encoding_loop(struct hve *hardware_encoder, FILE *output_file)
{
struct hve_frame frame = { 0 };
int frames=SECONDS*FRAMERATE, f, failed, i;

//we are working with P010LE because we specified p010le pixel format
//when calling hve_init, in principle we could use other format
//if hardware supported it (e.g. RGB0 is supported on my Intel)
uint16_t Y[WIDTH*HEIGHT]; //dummy p010le luminance data (or p016le)
uint16_t color[WIDTH*HEIGHT/2]; //dummy p010le color data (or p016le)

//fill with your stride (width including padding if any)
frame.linesize[0] = frame.linesize[1] = WIDTH*2;

//encoded data is returned in FFmpeg packet
AVPacket *packet;

for(f=0;f<frames;++f)
{
//prepare dummy image data, normally you would take it from camera or other source
for(int i=0;i<WIDTH*HEIGHT;++i)
Y[i] = UINT16_MAX * f / frames; //linear interpolation between 0 and UINT16_MAX
for(int i=0;i<WIDTH*HEIGHT/2;++i)
color[i] = UINT16_MAX / 2; //dummy middle value for U/V (32767, roughly 128 << 8)
//fill hve_frame with pointers to your data in P010LE pixel format
//note that we have actually prepared P016LE data but it is binary compatible with P010LE
frame.data[0]=(uint8_t*)Y;
frame.data[1]=(uint8_t*)color;

//encode this frame
if( hve_send_frame(hardware_encoder, &frame) != HVE_OK)
break; //break on error

while( (packet=hve_receive_packet(hardware_encoder, &failed)) )
{
//packet.data is HEVC encoded frame of packet.size length
//here we are dumping it to raw HEVC file as example
//yes, we ignore the return value of fwrite for simplicity
//it could also fail in the harsh real world...
fwrite(packet->data, packet->size, 1, output_file);
}

//NULL packet and non-zero failed indicates failure during encoding
if(failed)
break; //break on error
}

//flush the encoder by sending NULL frame, encode some last frames returned from hardware
hve_send_frame(hardware_encoder, NULL);
while( (packet=hve_receive_packet(hardware_encoder, &failed)) )
fwrite(packet->data, packet->size, 1, output_file);

//did we encode everything we wanted?
//convention 0 on success, negative on failure
return f == frames ? 0 : -1;
}

int process_user_input(int argc, char* argv[])
{
if(argc < 2)
{
fprintf(stderr, "Usage: %s <seconds> [device]\n", argv[0]);
fprintf(stderr, "\nexamples:\n");
fprintf(stderr, "%s 10\n", argv[0]);
fprintf(stderr, "%s 10 /dev/dri/renderD128\n", argv[0]);
return -1;
}

SECONDS = atoi(argv[1]);
DEVICE=argv[2]; //NULL as last argv argument, or device path

return 0;
}

int hint_user_on_failure(char *argv[])
{
fprintf(stderr, "unable to initalize encoder, try to specify device e.g:\n\n");
fprintf(stderr, "%s 10 /dev/dri/renderD128\n", argv[0]);
return -1;
}

void hint_user_on_success()
{
printf("finished successfully\n");
printf("output written to \"out.hevc\" file\n");
printf("test with:\n\n");
printf("ffplay output.hevc\n");
}
4 changes: 3 additions & 1 deletion hve.c
@@ -59,7 +59,9 @@ struct hve *hve_init(const struct hve_config *config)
return hve_close_and_return_null(h);
}

if(!(codec = avcodec_find_encoder_by_name("h264_vaapi")))
const char *encoder = (config->encoder != NULL && config->encoder[0] != '\0') ? config->encoder : "h264_vaapi";

if(!(codec = avcodec_find_encoder_by_name(encoder)))
{
fprintf(stderr, "hve: could not find encoder\n");
return hve_close_and_return_null(h);
37 changes: 33 additions & 4 deletions hve.h
@@ -47,13 +47,30 @@ struct hve;
* @brief Encoder configuration
*
* The device can be:
* - NULL (select automatically)
* - NULL or empty string (select automatically)
* - point to valid device e.g. "/dev/dri/renderD128" for vaapi
*
* If you have multiple VAAPI devices (e.g. NVidia GPU + Intel) you may have
* to specify Intel directly. NVidia will not work through VAAPI for encoding
* (it works through VAAPI-VDPAU bridge and VDPAU is only for decoding).
*
* The encoder can be:
* - NULL or empty string for "h264_vaapi"
* - valid ffmpeg encoder
*
* You may check encoders supported by your hardware with ffmpeg:
* @code
* ffmpeg -encoders | grep vaapi
* @endcode
*
* Encoders typically can be:
* - h264_vaapi
* - hevc_vaapi
* - mjpeg_vaapi
* - mpeg2_vaapi
* - vp8_vaapi
* - vp9_vaapi
*
* The pixel_format (format of what you upload) typically can be:
* - nv12 (this is generally safe choice)
* - yuv420p
@@ -62,18 +79,29 @@ struct hve;
* - yuv422p
* - rgb0
* - bgr0
* - p010le
*
* There are no software color conversions in this library.
*
* For pixel format explanation see:
* <a href="https://ffmpeg.org/doxygen/3.4/pixfmt_8h.html#a9a8e335cf3be472042bc9f0cf80cd4c5">FFmpeg pixel formats</a>
*
* The profile (H.264 profile) can typically be:
* The available profiles depend on the encoder used. Use 0 to guess from input.
*
* For possible profiles see:
* <a href="https://ffmpeg.org/doxygen/3.4/avcodec_8h.html#ab424d258655424e4b1690e2ab6fcfc66">FFmpeg profiles</a>
*
* For H.264 the profile can typically be:
* - FF_PROFILE_H264_CONSTRAINED_BASELINE
* - FF_PROFILE_H264_MAIN
* - FF_PROFILE_H264_HIGH
* - ...
*
* For HEVC the profile can typically be:
* - FF_PROFILE_HEVC_MAIN
* - FF_PROFILE_HEVC_MAIN_10 (10 bit channel precision)
* - ...
*
* You may check profiles supported by your hardware with vainfo:
* @code
* vainfo --display drm --device /dev/dri/renderDXYZ
@@ -93,8 +121,9 @@ struct hve_config
int height; //!< height of the encoded frames
int framerate; //!< framerate of the encoded video
const char *device; //!< NULL / "" or device, e.g. "/dev/dri/renderD128"
const char *pixel_format; //!< NULL / "" for NV12 or format, e.g. "rgb0", "bgr0", "nv12", "yuv420p"
int profile; //!< 0 to guess from input or profile e.g. FF_PROFILE_H264_MAIN, FF_PROFILE_H264_HIGH
const char *encoder; //!< NULL / "" or encoder, e.g. "h264_vaapi"
const char *pixel_format; //!< NULL / "" for NV12 or format, e.g. "rgb0", "bgr0", "nv12", "yuv420p", "p010le"
int profile; //!< 0 to guess from input or profile e.g. FF_PROFILE_H264_MAIN, FF_PROFILE_H264_HIGH, FF_PROFILE_HEVC_MAIN, ...
int max_b_frames; //!< maximum number of B-frames between non-B-frames (disable if you need low latency)
int bit_rate; //!< the average bitrate in VBR mode
};
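
As a usage note (an editor's sketch, not text from the header itself), the same configuration for the HEVC 10-bit case of the new example can also be written with C99 designated initializers; fields left out default to 0/NULL, which the documentation above maps to the automatic choices:

```C
#include "hve.h" //also provides FFmpeg's FF_PROFILE_* constants, as used by the examples

//sketch only: resolution and framerate are arbitrary placeholders
static const struct hve_config hevc10_config = {
	.width = 1280,
	.height = 720,
	.framerate = 30,
	.device = NULL,                     //NULL / "" -> select automatically
	.encoder = "hevc_vaapi",            //NULL / "" -> "h264_vaapi"
	.pixel_format = "p010le",           //10 bit per channel, semi-planar 4:2:0
	.profile = FF_PROFILE_HEVC_MAIN_10, //or 0 to guess from input
	.max_b_frames = 0,                  //0 to minimize latency
	.bit_rate = 0                       //average bitrate in VBR mode
};
```

Passing a pointer to such a struct to hve_init mirrors what examples/hve_encode_raw_hevc10.c above does with a positional initializer.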
