op->initial_display_delay doesn't have the correct default value of BUFFER_POOL_MAX_SIZE = 10
The AV1 spec says, on page 114:
initial_display_delay_minus_1[ i ] plus 1 specifies, for operating point i, the number of decoded frames that should be present in the buffer pool before the first presentable frame is displayed. This will ensure that all presentable frames in the sequence can be decoded at or before the time that they are scheduled for display. If not signaled then initial_display_delay_minus_1[ i ] = BUFFER_POOL_MAX_SIZE - 1.
So the default value of initial_display_delay_minus_1[ i ] is BUFFER_POOL_MAX_SIZE - 1 = 10 - 1 = 9, which means the plus-1 value, op->initial_display_delay, should default to 10.
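In code terms, the spec's rule reduces to something like this (a minimal sketch using the spec's own variable names; `signaled` and `read_bits` are placeholders, not dav1d APIs):

    #define BUFFER_POOL_MAX_SIZE 10  /* constant fixed by the AV1 spec */

    /* When the syntax element is not coded, initial_display_delay_minus_1[ i ]
     * defaults to BUFFER_POOL_MAX_SIZE - 1, so the plus-1 value defaults to
     * BUFFER_POOL_MAX_SIZE = 10. */
    int initial_display_delay_minus_1 =
        signaled ? read_bits(4) : BUFFER_POOL_MAX_SIZE - 1;
    int initial_display_delay = initial_display_delay_minus_1 + 1;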
I searched for "initial_display_delay" in the dav1d source tree and found:
$ find . -name "*" -a -type f | xargs grep initial_display_delay
./include/dav1d/headers.h: int initial_display_delay;
./src/obu.c: op->initial_display_delay = dav1d_get_bits(gb, 4) + 1;
That line in src/obu.c is the only place op->initial_display_delay is ever assigned, and the assignment only runs when the syntax element is actually coded in the bitstream. So when initial_display_delay_minus_1[ i ] is not signaled, the field keeps whatever value the header struct was initialized with, most likely 0, rather than the spec default of BUFFER_POOL_MAX_SIZE = 10.
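A fix would be to apply the spec default on the path where the syntax element is not coded. The guarding flag below is an assumption (I haven't checked what the surrounding condition in src/obu.c is actually called); this only sketches the shape of the change:

    if (op->display_model_param_present) {  /* assumed guard; real name may differ */
        op->initial_display_delay = dav1d_get_bits(gb, 4) + 1;
    } else {
        /* spec default: initial_display_delay_minus_1[ i ] = BUFFER_POOL_MAX_SIZE - 1,
         * i.e. initial_display_delay = 10 */
        op->initial_display_delay = 10;
    }

Alternatively, the default could be written into the operating-point structs up front, before the sequence header is parsed, so the parsing code only has to handle the signaled case.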