1. 08 Oct, 2019 36 commits
  2. 07 Oct, 2019 4 commits
    • 1d54309c
      avcodec/flac_parser: Don't modify size of the input buffer · 87b30f8a
      Andreas Rheinhardt authored
      When flushing, MAX_FRAME_HEADER_SIZE bytes (always zero) are supposed
      to be written to the fifo buffer so that the rest of the buffer can be
      checked for frame headers. The intention was to write these bytes from
      a small dedicated buffer of size MAX_FRAME_HEADER_SIZE. But the way it
      was actually done ensured that this never happened:

      First, it was checked whether the size of the input buffer was zero;
      if so, buf_size was set to MAX_FRAME_HEADER_SIZE and read_end was set
      to indicate that MAX_FRAME_HEADER_SIZE bytes need to be written. Then
      it was made sure that there is enough space in the fifo for the data,
      and afterwards the data was written. The write is guarded by a check
      of whether buf_size is zero or not. But if buf_size was zero
      initially, it is MAX_FRAME_HEADER_SIZE by now, so the designated
      buffer is not the one that gets written; instead the padded input
      buffer (from the stack of av_parser_parse2()) is used. This happens to
      work because AV_INPUT_BUFFER_PADDING_SIZE >= MAX_FRAME_HEADER_SIZE.
      Later on, buf_size is set to zero again.
      
      Given that, since commit 7edbd536, the actual amount of data read is
      no longer automatically equal to buf_size, it is completely
      unnecessary to modify buf_size at all. Moreover, modifying it is
      dangerous: some allocations can fail, and because buf_size is never
      reset to zero in this codepath, the parser might return a value > 0
      when flushing.
      Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@gmail.com>
      avcodec/flac_parser: Remove superfluous checks · a1701e75
      Andreas Rheinhardt authored
      For a parser, the input buffer is always != NULL: when flushing, the
      indicated size of the input buffer is zero and the input buffer
      points to a zeroed buffer of size 0 + AV_INPUT_BUFFER_PADDING_SIZE.
      Therefore there is no need to check whether said buffer is NULL.
      Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@gmail.com>