noise generators (forked from: Krita user community?)

Boudewijn Rempt boud at valdyas.org
Tue Feb 26 10:30:37 CET 2008


On Tuesday 26 February 2008, Matthew Woehlke wrote:

> Where did colormapping come into the plugin? The idea was that the
> channel-generation is totally decoupled from the color mapping. So, my
> plasma generator (as an example) spits out, say, an FP64 per pixel, with
> no knowledge or concern for how that gets translated into color data.

Well, if there is only one way of going from the generated noise to actual 
colors, then a plugin isn't necessary. I just expected that there would be 
many ways to actually use the generated noise :-). It's also easy to 
decouple, in a couple of ways:

* if the generated data is in a particular colorspace, the color conversion 
system can handle the mapping; it's possible to have different colorspaces 
with different conversion characteristics
* or the mapping can be in separate plugins (sketched below)
* or it can be hard-coded in the node type (which I doubt to be the best 
choice).
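
For instance, a rough sketch of the second option, with a mapping step that 
only ever sees the generated channel value and a user-chosen gradient ramp 
(the names here are illustrative, not existing Krita classes):

// Sketch only: a colormap kept separate from the channel generator, so the
// same generated noise can be mapped to colors in different ways.
#include <algorithm>
#include <cstdint>
#include <vector>

struct Rgba8 { uint8_t r, g, b, a; };

// Map one generated channel value in [0, 1] onto a user-chosen gradient ramp.
Rgba8 mapChannelToColor(double value, const std::vector<Rgba8> &ramp)
{
    if (ramp.empty())
        return Rgba8{0, 0, 0, 0};
    value = std::min(std::max(value, 0.0), 1.0);
    const double pos = value * (ramp.size() - 1);
    const std::size_t i = static_cast<std::size_t>(pos);
    const std::size_t j = std::min(i + 1, ramp.size() - 1);
    const double t = pos - i;
    auto lerp = [t](uint8_t a, uint8_t b) {
        return static_cast<uint8_t>(a + t * (b - a) + 0.5);
    };
    return Rgba8{ lerp(ramp[i].r, ramp[j].r), lerp(ramp[i].g, ramp[j].g),
                  lerp(ramp[i].b, ramp[j].b), lerp(ramp[i].a, ramp[j].a) };
}

Swapping the ramp -- or the whole mapping function -- then changes the look 
of the noise without touching the generator at all.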

>
> > * add a node type that contains that plugin, a source paint device in the
> > 2 channel color model and a projection in a regular color model
>
> I don't know enough krita guts, but that sounds about right

We'll teach you :-). Then you can help implement it!

> I'm envisioning one "layer" (node?) looking like this:
>
> <generator> (plugin)
> <filter> (maybe plugin but probably core; optional)
> <colormap> (core)
>
> Note that 'colormap' is basically the gradient map we talked about
> earlier, of course operating on the higher-bit-depth information from
> the previous stage. This would only be applied to the first channel.
>
> The filter (one per channel, so one or two) would be somewhat like the
> 'curves' adjustment tool.

If we create a colorspace for the output of the generator and implement the 
various colorspace operations, the filter can be the curves adjustment 
filter -- or any other filter. You could filter the output of the generator 
dynamically using a blur, an oilpaint or the raindrops filter -- or any new 
filter we create that is properly cs-independent. Filters can already work on 
any subset of channels in the paint device, and it is already possible to 
have more than one filter mask on a layer.
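
The point is that such a filter never needs to know what a channel means. A 
rough sketch of a curves-style adjustment applied to just one channel of 
interleaved float data (plain buffers here instead of the real paint device 
API):

// Sketch only: apply a 256-entry lookup curve to channel `channel` of a
// pixel buffer with `channelCount` float channels per pixel, leaving the
// other channels untouched. Values are assumed to lie in [0, 1].
#include <cstddef>
#include <vector>

void applyCurveToChannel(std::vector<float> &pixels,
                         std::size_t channelCount,
                         std::size_t channel,
                         const float (&curve)[256])
{
    for (std::size_t i = channel; i < pixels.size(); i += channelCount) {
        float v = pixels[i];
        v = v < 0.0f ? 0.0f : (v > 1.0f ? 1.0f : v);
        pixels[i] = curve[static_cast<std::size_t>(v * 255.0f + 0.5f)];
    }
}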

Actually, if we use the already-existing filter masks, you can have different 
filter settings for different areas, or even filter some areas of your node 
twice. It's also possible to postpone the mapping to colors until the 
recomposition kicks in, which does automatic colorspace-to-colorspace 
conversion -- krita is the only raster app that can handle a heterogeneous 
layer stack.

The colormapping would probably be best implemented in the colorspace 
conversion system; that way we can go from the original data to any 
colorspace and generate, for instance, Lab pixels instead of RGBA.
	
So, we'd need

* a new colorspace class
* a generator plugin framework: the generator should have a method that 
suggests the preferred colorspace for the plugin and a generate method (see 
the sketch after this list)
* a node type similar to the adjustment layer, but with a generator instead of 
a filter.
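
In code, the generator framework from the second item could look roughly like 
this; the interface is only a sketch and the names are placeholders, not a 
committed API:

// Sketch of a possible generator plugin interface; all names are
// placeholders, not a committed Krita API.
#include <QRect>
#include <QString>

class KoColorSpace;              // colorspace class (forward declared)
class KisPaintDevice;            // paint device holding the generated channels
class KisGeneratorConfiguration; // per-generator parameters (seed, scale, ...)

class KisGeneratorPlugin
{
public:
    virtual ~KisGeneratorPlugin() {}

    // Human-readable name shown in the layer properties dialog.
    virtual QString name() const = 0;

    // The colorspace this generator prefers to write its channels in,
    // e.g. a two-channel float colorspace (value + alpha).
    virtual const KoColorSpace *preferredColorSpace() const = 0;

    // Fill `rect` of `device` from the configuration; called whenever the
    // parameters change, so no explicit "regenerate" button is needed.
    virtual void generate(KisPaintDevice *device,
                          const QRect &rect,
                          const KisGeneratorConfiguration *config) const = 0;
};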

The node would update its data whenever the parameters change; on projection, 
the filter masks associated with the node would be applied, the result would 
be converted to the projection colorspace using the new colorspace, and then 
composited using any of the available composite ops.
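
As a sketch (stand-in classes and method names, not the real implementation), 
the projection step would run roughly like this:

// Sketch of the projection flow described above; every class and method here
// is a stand-in for the real machinery.
#include <QRect>

struct PaintDevice { /* pixel data in some colorspace */ };

struct GeneratorNode
{
    bool parametersDirty;
    PaintDevice source;      // generator output in its own 2-channel colorspace
    PaintDevice projection;  // what the rest of the layer stack sees

    GeneratorNode() : parametersDirty(true) {}

    void updateProjection(const QRect &rect)
    {
        // 1. Regenerate the source data if a parameter changed, so the node
        //    stays in sync without an explicit "regenerate" button.
        if (parametersDirty) {
            regenerate(rect);
            parametersDirty = false;
        }
        // 2. Apply the filter masks attached to the node (curves, blur, ...),
        //    each possibly limited to its own area of the node.
        PaintDevice filtered = applyFilterMasks(source, rect);
        // 3. Convert to the projection colorspace; the colormapping lives in
        //    the conversion, so this is where channels become colors.
        PaintDevice converted = convertToProjectionColorSpace(filtered);
        // 4. Composite onto the projection with the node's composite op.
        composite(projection, converted, rect);
    }

    // Stubs standing in for the real machinery.
    void regenerate(const QRect &) {}
    PaintDevice applyFilterMasks(const PaintDevice &d, const QRect &) { return d; }
    PaintDevice convertToProjectionColorSpace(const PaintDevice &d) { return d; }
    void composite(PaintDevice &, const PaintDevice &, const QRect &) {}
};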

> There may even be some way to combine this with, or at least make it
> resemble (i.e. similar UI), HDRI tone mapping.
>
> > * add a layer properties dialog that allows setting the data for both
> > functions in the plugin and has a regenerate button
>
> Is there a technical problem making regenerate happen when the
> generator's parameters change? I'd rather not need a button; it should
> be automagic :-).

Nope, no problem at all.

>
> > Come to think of it, it would be nice if we could use such generators as
> > input for masks, too.
>
> Hmm... yes, that is harder... or is it? I'd mentioned the second channel
> being alpha, I don't suppose we have a DestinationOver composition mode?

Not yet -- but it shouldn't be too hard to add.
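
For reference, destination-over is just the Porter-Duff over operator with 
source and destination swapped; a minimal per-pixel sketch on premultiplied 
float RGBA (not the actual composite op interface):

// Minimal per-pixel sketch of a destination-over composite op on
// premultiplied float RGBA; not the actual composite op interface.
struct PremulRgbaF { float r, g, b, a; };

// Porter-Duff "destination over": the destination shows through and the
// source only fills in where the destination is transparent.
PremulRgbaF destinationOver(const PremulRgbaF &src, const PremulRgbaF &dst)
{
    const float w = 1.0f - dst.a;   // how much of the source shows through
    return PremulRgbaF{ dst.r + src.r * w,
                        dst.g + src.g * w,
                        dst.b + src.b * w,
                        dst.a + src.a * w };
}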

> I.e. some way to have Layer B a child of Layer A, where the blend of the
> two results in the alpha being either A*B or just B? It's not quite as
> elegant, but achieves the same effect with less complicated UI.

Given the right composite ops, that should be doable.
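
As a sketch of the two alpha behaviours from the quoted question (illustrative 
names, not existing composite ops), per pixel:

// Colors come from the parent A; the child B only affects the alpha.
struct RgbaF { float r, g, b, a; };

// Child B masks parent A: resulting alpha is the product A*B.
RgbaF alphaMultiply(const RgbaF &a, const RgbaF &b)
{
    return RgbaF{ a.r, a.g, a.b, a.a * b.a };
}

// Child B replaces the parent's alpha outright.
RgbaF alphaReplace(const RgbaF &a, const RgbaF &b)
{
    return RgbaF{ a.r, a.g, a.b, b.a };
}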

> Btw, is this getting on a wiki anywhere, or is there a good place I
> should dump it?

http://wiki.koffice.org is also our krita scratchpad.

-- 
Boudewijn Rempt 
http://www.valdyas.org/fading/index.cgi

