
Seems like we have totally different understandings of what quality means.

For me (and the way ffmpeg understands it), the quality of a reencode is defined by the difference between the uncompressed source and the uncompressed/decoded reencode. So higher quality means less difference. Two things follow from this:

a. the quality of your reencode can not be higher than the quality of your input.
b. quality is not necessarily bound to bit rate (just because a file has a bit rate of X, it doesn't have to be better than a file with a bit rate Y < X; what counts is the min/max distance to the source).

As an example, you start with a smooth sine-curve signal you want to compress. After the first recompression, your reencoded signal is basically still the sine curve, but it now contains some small additional bumps (the higher the quality, the smaller the bumps/errors). The aq/vbr setting now simply specifies the maximal size these new bumps are allowed to have.

> you created a new signal which has lower quality but requires more bit rate when uncompressed

If you now decode the reencoded material, it will require more bit rate than the source you started with, since the additional bumps also require bit rate.

The same happens with images. If you have a beautiful shot of an apple and you encode it with a lossy codec, and the resulting (decoded) image looks like a rotten apple instead, then when you re-encode, you are encoding a new lossy version of the rotten-looking apple, not the pristine, beautiful apple. This will make the apple look even more rotten. When thinking about quality, one must remember that every re-encode requires a full decode first (not counting DCT-based transcoding shortcuts), and that decode never gets you back to 100% (otherwise, it would be a LOSSLESS encode instead of a LOSSY one).

=> Welcome to the world of lossy compression.
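The whole argument can be sketched numerically. In the toy code below, a lossy codec is stood in for by simple per-sample quantization (a real encoder quantizes transform coefficients instead, but the generation-loss effect is of the same kind), the quantization step plays the role of the quality setting, and RMS distance to the pristine source stands in for quality; the function names and step sizes are made up for illustration.

```python
import numpy as np

def encode_decode(signal, step):
    """Toy 'lossy codec': snap every sample to a grid of the given step.
    The decoded output differs from the input by at most step/2 per
    sample -- these per-sample errors are the 'bumps'."""
    return np.round(signal / step) * step

def rms_distance(a, b):
    """Quality measured as difference: distance between the uncompressed
    source and the uncompressed/decoded reencode (smaller = higher quality)."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

t = np.linspace(0.0, 2.0 * np.pi, 1000)
source = np.sin(t)                       # the smooth sine-curve signal

# Generation 1: encode the pristine source (step = "quality setting").
gen1 = encode_decode(source, step=0.10)

# Generation 2: all we can feed the second encoder is the decoded,
# already-bumpy gen1 signal -- the pristine source is gone.
gen2 = encode_decode(gen1, step=0.07)

print("distance gen1 -> source:", rms_distance(gen1, source))
print("distance gen2 -> source:", rms_distance(gen2, source))
print("largest bump in gen1:", np.max(np.abs(gen1 - source)))  # <= 0.05
```

Because the second encoder only ever sees the decoded, already-bumpy first generation, its new bumps land on top of the old ones, and the distance to the pristine source typically grows with every generation: the apple gets more rotten.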
