[Digikam-devel] [digikam] [Bug 337691] Select 16 Bit Depth as Default for highres jpg: Standard Option or Alternative
buitk14 at A1.net
Wed Jul 23 00:04:55 BST 2014
--- Comment #5 from Christian <buitk14 at A1.net> ---
(In reply to Gilles Caulier from comment #4)
> Converting 8 bits image to 16 bits will introduce noise to fill histogram
> holes :
> Gilles Caulier
I am not an expert in digital imaging. But as far as I can see from the results, the
artificial noise is a minor problem compared to the artefacts caused by 8-bit
editing.
Is the noise added to the 8-bit version before the conversion takes place,
or in 16-bit mode? Noise in 8-bit mode would be a drawback for JPEGs with less
than 12 MP.
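For illustration only (this is not digiKam's code, and the names are my own): a small numpy sketch of a plain 8-to-16-bit expansion, which leaves "holes" in the histogram because only 256 of the 65536 levels get used, next to a dithered expansion that adds sub-8-bit noise in the 16-bit domain to fill the gaps:

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical 8-bit channel.
img8 = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

# Plain expansion: 0..255 maps to 0..65535 (255 * 257 = 65535),
# but only 256 of the 65536 possible levels are ever used.
plain16 = img8.astype(np.uint16) * 257

# Dithered expansion: add noise smaller than one 8-bit step
# (257 sixteen-bit units) before rounding, filling the holes
# between the 257-spaced levels.
noise = rng.uniform(-128.0, 128.0, size=img8.shape)
dither16 = np.clip(img8.astype(np.float64) * 257 + noise, 0, 65535)
dither16 = np.round(dither16).astype(np.uint16)
```

Note that in this sketch the noise never touches the 8-bit data; it only spreads each 8-bit level across neighbouring 16-bit levels.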
Well-distributed noise in 16-bit mode might be cleaned up when the colors are
converted back to 8-bit mode; I can't tell. My results were fine in print
and on a high-resolution / wide-gamut monitor using an Adobe RGB profile for 20 MP
images. But this is no proof in any way.
I will continue using the 16-bit conversion before I apply gradation curves and the
like; for me, banding artefacts in gradients are worse than noise.
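As a rough illustration of why I do the curves in 16 bit (again just a numpy sketch with a made-up gamma curve, not digiKam's actual pipeline): the same adjustment rounded back to 8 bits merges many dark levels onto the same output value, which is what shows up as banding in smooth gradients:

```python
import numpy as np

# All 256 input levels of an 8-bit gray ramp.
levels8 = np.arange(256, dtype=np.float64)

# A gamma-style tone curve applied and rounded in 8 bits:
# many dark input levels collapse onto the same output value.
curved8 = np.round(255.0 * (levels8 / 255.0) ** 2.2).astype(np.uint8)

# The same curve applied after expanding to 16 bits (x * 257):
# nearly all input levels remain distinct after rounding.
levels16 = levels8 * 257.0
curved16 = np.round(65535.0 * (levels16 / 65535.0) ** 2.2).astype(np.uint16)
```

Counting the distinct values in `curved8` versus `curved16` shows the 8-bit version loses a large fraction of its levels, while the 16-bit version keeps almost all of them.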