author    Oneric <oneric@oneric.stub>  2022-09-05 00:46:04 +0200
committer Oneric <oneric@oneric.stub>  2022-09-15 18:35:43 +0200
commit    1db406a0fea1fee5b3e1a5a5544d7e968a3e0591 (patch)
tree      505639da14ed438a57709053dca269c7fc6153ad /libass/ass_render.h
parent    d8f056158abe9d671c53f430ecd21022cc983d47 (diff)
Fix legacy effect's delay scaling and precision
Usually the delay parameter of legacy effects is scaled to be relative to the PlayRes canvas. This happens explicitly in VSFilter and automatically in libass. However, this scaling in VSFilter happens _before_ applying max(delay, 1), which means that e.g. delay=0 ends up as 1 ms per _storage pixel_. To get the same effect in libass, we must explicitly "unscale" the fallback for small or negative delays.

VSFilter also casts the scaled delay value to int afterwards, which can lead to noticeable differences whenever the scaled value isn't an integer. To emulate this in libass, we do not want delay itself to be an int (which would also ruin the unscaling for delay=0); instead we convert our already PlayRes-relative value to a storage-relative one, cast that to int, and finally convert back to a PlayRes-relative value. This rounding error can already be observed after just one second for PlayResX=StorageX/8 and delay=25.
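A minimal sketch of the delay handling described above, in C. This is an illustrative model only, assuming a PlayRes-relative input delay: the function emulate_vsfilter_delay, its parameter names, and the main() driver are hypothetical and do not appear in the actual libass or VSFilter sources.

    #include <stdio.h>

    /* Illustrative sketch only: names are hypothetical and the real
     * parser/renderer context is omitted. `delay` is PlayRes-relative
     * (ms per PlayRes pixel). */
    static double emulate_vsfilter_delay(double delay, double playres_x,
                                         double storage_x)
    {
        /* Factor converting PlayRes-relative delays to storage-relative:
         * ms per storage px = ms per PlayRes px * PlayResX / StorageX. */
        double scale = playres_x / storage_x;
        double storage_delay = delay * scale;

        /* VSFilter effectively computes (int) max(storage_delay, 1): the
         * clamp runs on the already-scaled value, so delay <= 0 becomes
         * 1 ms per *storage* pixel, i.e. the fallback must be "unscaled"
         * when expressed PlayRes-relative. */
        if (storage_delay < 1)
            storage_delay = 1;

        /* Truncate to int like VSFilter, then convert back so the rest
         * of the pipeline keeps a PlayRes-relative double. */
        return (int) storage_delay / scale;
    }

    int main(void)
    {
        /* Example from the commit message: PlayResX = StorageX / 8,
         * delay = 25: 25 * 1/8 = 3.125 truncates to 3, giving 24 ms
         * per PlayRes pixel. */
        printf("%g\n", emulate_vsfilter_delay(25, 160, 1280)); /* 24 */
        /* delay = 0 falls back to 1 ms per storage pixel, which is
         * 8 ms per PlayRes pixel here. */
        printf("%g\n", emulate_vsfilter_delay(0, 160, 1280));  /* 8 */
        return 0;
    }

For the PlayResX=StorageX/8, delay=25 example: the storage-relative delay is 25/8 = 3.125, truncation yields 3, and converting back gives 24 ms per PlayRes pixel. After one second the banner has thus advanced 1000/24 ≈ 41.7 PlayRes pixels instead of the unrounded 1000/25 = 40, a visible difference of more than a pixel.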
Diffstat (limited to 'libass/ass_render.h')
0 files changed, 0 insertions, 0 deletions