1. 28 May, 2016 1 commit
  2. 27 May, 2016 5 commits
  3. 26 May, 2016 12 commits
  4. 25 May, 2016 4 commits
  5. 24 May, 2016 2 commits
  6. 23 May, 2016 5 commits
    • ac84e618
    • lavc: document that avcodec_close() should not be used · 2ef6dab0
      Anton Khirnov authored
      We cannot deprecate it until the new parser API is in place, because of
      the way libavformat works. But most users can already simply replace it
      with avcodec_free_context(), which will simplify the transition once it
      is finally deprecated.
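      The replacement the commit message points at can be sketched in a few lines. Note this is an illustrative snippet, and close_decoder is a hypothetical helper name, not lavc API:

      ```c
      #include <libavcodec/avcodec.h>

      /* Hypothetical teardown helper: avcodec_free_context() closes the
       * context if it is open and then frees it, so the old
       * avcodec_close(avctx); av_free(avctx); pair is no longer needed. */
      static void close_decoder(AVCodecContext **avctx)
      {
          avcodec_free_context(avctx); /* *avctx is set to NULL afterwards */
      }
      ```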
    • lavc: deprecate avcodec_get_context_defaults3() · 04fc8e24
      Anton Khirnov authored
      This function is supposed to "reset" a codec context to a clean state so
      that it can be opened again. The only reason it exists is to allow using
      AVStream.codec as a decoding context (after it was already
      opened/used/closed by avformat_find_stream_info()). Since that behaviour
      is now deprecated, there is no reason for this function to exist
      anymore.
    • lavc: deprecate avcodec_copy_context() · 5f30ac27
      Anton Khirnov authored
      Since AVCodecContext contains a lot of complex state, copying a codec
      context is not a well-defined operation. The purpose for which it is
      typically used (which is well-defined) is copying the stream parameters
      from one codec context to another. That is now possible through the
      AVCodecParameters API. Therefore, there is no reason for
      avcodec_copy_context() to exist.
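      The well-defined operation the message describes can be sketched with the AVCodecParameters API; copy_stream_params is a hypothetical helper name, and the error-handling style is an assumption:

      ```c
      #include <libavcodec/avcodec.h>

      /* Hypothetical helper: copy the stream parameters (and only those)
       * from src to dst via AVCodecParameters, covering the use case that
       * avcodec_copy_context() was typically used for. */
      static int copy_stream_params(AVCodecContext *dst, const AVCodecContext *src)
      {
          AVCodecParameters *par = avcodec_parameters_alloc();
          int ret;

          if (!par)
              return AVERROR(ENOMEM);

          ret = avcodec_parameters_from_context(par, src);
          if (ret >= 0)
              ret = avcodec_parameters_to_context(dst, par);

          avcodec_parameters_free(&par);
          return ret;
      }
      ```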
    • lavf: update muxing doxy · 14634429
      Anton Khirnov authored
      Describe the new AVCodecParameters API.
  7. 22 May, 2016 5 commits
  8. 19 May, 2016 6 commits
    • nvenc: allow setting the number of slices · 3399a26d
      Anton Khirnov authored
      Based on a patch by Agatha Hu <ahu@nvidia.com>
    • nvenc: De-compensate aspect ratio compensation of DVD-like content. · 10545f84
      Philip Langdale authored
      For reasons we are not privy to, nvidia decided that the nvenc encoder
      should apply aspect ratio compensation to 'DVD like' content, assuming
      that the content is not BT.601 compliant but needs to be made so. In
      this context, that means they make the following questionable
      assumptions:
      
      1) If the input dimensions are 720x480 or 720x576, assume the content has
      an active area of 704x480 or 704x576.
      
      2) Assume that whatever the input sample aspect ratio is, it does not account
      for the difference between 'physical' and 'active' dimensions.
      
      From these assumptions, they then conclude that they can 'help' by
      adjusting the sample aspect ratio by a factor of 45/44. And indeed, if
      you wanted to display only the 704-wide active area with the same aspect
      ratio as the full 720-wide image, this would be the correct adjustment
      factor. But what if you don't? And more importantly, what if you're used
      to lavc not making this kind of adjustment at encode time - because none
      of the other encoders do this!
      
      And, what if you had already accounted for BT.601 and your input had the
      correct attributes? Well, it's going to apply the compensation anyway!
      So, if you take some content, and feed it through nvenc repeatedly, it
      will keep scaling the aspect ratio every time, stretching your video out
      more and more and more.
      
      So, clearly, regardless of whether you want to apply BT.601 aspect ratio
      adjustments or not, this is not the way to do it. With any other lavc
      encoder, you would do it as part of defining your input parameters or do
      the adjustment at playback time, and there's no reason why nvenc should
      be any different.
      
      This change adds some logic to undo the compensation that nvenc would
      otherwise do.
      
      nvidia engineers have told us that they will work to make this
      compensation mechanism optional in a future release of the nvenc
      SDK. At that point, we can adapt accordingly.
      Signed-off-by: Philip Langdale <philipl@overt.org>
      Reviewed-by: Timo Rothenpieler <timo@rothenpieler.org>
      Signed-off-by: Anton Khirnov <anton@khirnov.net>
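      The cancellation the change relies on can be verified with a little rational arithmetic. This is a standalone illustrative sketch, not FFmpeg code:

      ```c
      #include <assert.h>

      /* Greatest common divisor, for reducing rationals. */
      static int gcd(int a, int b)
      {
          return b ? gcd(b, a % b) : a;
      }

      /* Scale a sample aspect ratio num:den by fn/fd and reduce the result. */
      static void scale_sar(int *num, int *den, int fn, int fd)
      {
          int g;
          *num *= fn;
          *den *= fd;
          g = gcd(*num, *den);
          *num /= g;
          *den /= g;
      }
      ```

      Pre-scaling by 44/45, as the workaround does, makes nvenc's subsequent 45/44 scaling a no-op: starting from the common 720x480 SAR of 10:11, scaling by 44/45 gives 8:9, and scaling that by 45/44 restores 10:11.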
    • configure: Don't require nonfree for nvenc · 09522a30
      Timo Rothenpieler authored
      As the nvEncodeApi.h header is now MIT licensed, this can be dropped.
      The loaded CUDA and NVENC libraries are part of the nvidia driver, and
      thus count as system libraries.
      Signed-off-by: Anton Khirnov <anton@khirnov.net>
    • nvenc: drop the hard dependency on CUDA · 6f58b4dc
      Anton Khirnov authored
      The code needs only a few definitions from cuda.h, so define them
      directly when CUDA is not enabled. CUDA is still required for accepting
      HW frames as input.
      
      Based on the code by Timo Rothenpieler <timo@rothenpieler.org>.
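      The approach can be sketched as a guarded set of stand-in typedefs. The exact subset of definitions shown here is an illustrative assumption, not the set the commit actually introduces:

      ```c
      /* Sketch: when CUDA is not enabled at configure time, declare just the
       * CUDA driver-API types the nvenc code touches instead of pulling in
       * cuda.h. The selection below is illustrative only. */
      #if CONFIG_CUDA
      #include <cuda.h>
      #else
      typedef void              *CUcontext;
      typedef int                CUdevice;
      typedef unsigned long long CUdeviceptr;
      typedef int                CUresult;
      #endif
      ```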
    • nvenc: only support HW frames when CUDA is enabled · f11ec8ce
      Anton Khirnov authored
      hwcontext_cuda.h includes cuda.h, so this will allow building nvenc
      without depending on cuda.h.