Filters "dialog" in 2.0
Moritz Moeller
mnm at dneg.com
Wed Jun 20 21:28:33 CEST 2007
Boudewijn,
>> I'd like the opportunity to place a plea for abandoning filters
>> altogether and going for filter layers instead (basically allowing to
>> put any filter on an adjustment layer and change it dynamically
>> afterwards).
>
> We have that :-). At least, as soon as I've finished the ui for manipulating
> them.
that's awesome! :)))
> Still, I also want to keep the ability to actually hard-change the pixels in a
> certain layer -- it's a way of working that's very common.
Only because most 2D still imaging apps suck balls the size of planets.
That particularly includes Photoshop. None of the 2D compositing apps we
use when creating visual effects is that destructive.
And the other thing is that many people have never experienced the very
pleasure of working in a totally non-destructive imaging app...
> How did it do that? I'm very interested. I mean, Krita does a lot of heavy
> caching of pretty much everything (adj. layers keep a rendered cache of the
> stack of layers beneath it, layers that have effect masks keep a rendered
> cache of the result of applying all the masks to the layer's pixel data), but
> that that's heavy on memory. On the other hand, not doing that makes Krita
> really slow.
Simply by only ever rendering what was on screen. If you think about it
-- at the time, the high-res 19" Sony screens used on the SGI
workstations this app ran on maxed out at 1280x960. Mostly it was
1024x768. Now subtract the screen space taken by the palettes.
So all you ever have to render (and keep in memory), in the worst case,
is that much data (maybe 800x600x24 bits).
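To make the "worst case" concrete, here's the arithmetic for the figures above (assuming "24 bits" means bits per pixel):

```python
# Worst-case memory for viewport-only rendering, using the figures
# from the mail (800x600 visible pixels at 24 bits per pixel).
width, height, bits_per_pixel = 800, 600, 24

bytes_needed = width * height * bits_per_pixel // 8
print(bytes_needed)           # 1440000 bytes
print(bytes_needed / 2**20)   # roughly 1.4 MiB per composited view
```

So even with a handful of layers composited on screen, you're talking single-digit megabytes -- trivial even for 2007 hardware.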
If you cache data out to disk in a tiled fashion (think of TIFF using
tiles instead of bloody scanlines) and you keep a tile cache in RAM that
you populate with data around the area you're working in...
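A minimal sketch of that idea (not Eclipse's actual code; the class, tile size, and file layout are all my own illustration): pixel data lives on disk in fixed-size tiles, and a small RAM cache holds the tiles around the area currently being edited.

```python
import os
import tempfile

TILE = 64  # tile edge length in pixels; an arbitrary choice here


class TiledStore:
    """Toy tiled image store: disk-backed tiles plus a RAM cache."""

    def __init__(self, path):
        self.path = path
        os.makedirs(path, exist_ok=True)
        self.cache = {}  # (tx, ty) -> bytearray of TILE*TILE pixels

    def _file(self, tx, ty):
        return os.path.join(self.path, f"{tx}_{ty}.tile")

    def tile(self, tx, ty):
        """Fetch a tile, touching the disk only on a cache miss."""
        key = (tx, ty)
        if key not in self.cache:
            f = self._file(tx, ty)
            if os.path.exists(f):
                with open(f, "rb") as fh:
                    self.cache[key] = bytearray(fh.read())
            else:
                self.cache[key] = bytearray(TILE * TILE)  # blank tile
        return self.cache[key]

    def prefetch_around(self, px, py, radius=1):
        """Populate the RAM cache with tiles around the working area."""
        tx, ty = px // TILE, py // TILE
        for y in range(ty - radius, ty + radius + 1):
            for x in range(tx - radius, tx + radius + 1):
                if x >= 0 and y >= 0:
                    self.tile(x, y)


store = TiledStore(tempfile.mkdtemp())
store.prefetch_around(100, 100)  # pulls in the 3x3 block of tiles
print(len(store.cache))          # 9 tiles now resident in RAM
```

The point is simply that the working set in RAM is a small window over an arbitrarily large image on disk.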
Furthermore, one can create mip-map levels (what you call pyramids, I
think) when creating the disk tile cache. I think Eclipse didn't do
that. It means that when you zoom out, you only stream in the already
downsampled tiles for all the layers and combine them/apply the filters
according to the rules set up by the user through the layer palette.
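A hedged sketch of building those mip-map levels (grayscale, 2x2 box averaging -- the simplest possible downsampler, not necessarily what any real app uses): each level halves the previous one, so zooming out can stream already-downsampled tiles instead of recomputing from full resolution.

```python
def downsample(img):
    """Halve a grayscale image (list of rows) by averaging 2x2 blocks."""
    h, w = len(img), len(img[0])
    return [
        [(img[2 * y][2 * x] + img[2 * y][2 * x + 1]
          + img[2 * y + 1][2 * x] + img[2 * y + 1][2 * x + 1]) // 4
         for x in range(w // 2)]
        for y in range(h // 2)
    ]


def build_pyramid(img, levels):
    """Base image plus `levels` successively halved mip levels."""
    pyramid = [img]
    for _ in range(levels):
        pyramid.append(downsample(pyramid[-1]))
    return pyramid


base = [[(x + y) % 256 for x in range(8)] for y in range(8)]
pyr = build_pyramid(base, 3)
print([len(level) for level in pyr])  # [8, 4, 2, 1]
```

Done at disk-cache creation time, the extra levels cost about a third more storage but make zoomed-out compositing nearly free.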
Eclipse benefited from SGI's revolutionary crossbar switch, which allowed
unbelievable speeds (still by today's standards) between RAM, disk,
graphics adapter and network (the latter meaning you could connect
several machines and their hard disks would be seen as one big disk,
their graphics adapters as one graphics adapter, etc.). But even on a
single machine the crossbar gave good enough performance for realtime
stuff. Today's hard disks and the SATA stuff are fast enough to give
similar performance. And of course one can use hacks like making the
layer tile size a multiple of the cluster size on the respective
partition used for swapping, etc.
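That alignment hack is easy to sketch (the function name and figures are mine, purely illustrative): pick a tile byte size that is an exact multiple of the filesystem's cluster (allocation unit) size, so every tile read or write maps onto whole clusters.

```python
def aligned_tile_bytes(tile_px, bytes_per_pixel, cluster_size):
    """Round a tile's raw byte size up to a whole number of clusters."""
    raw = tile_px * tile_px * bytes_per_pixel
    # round up to the next multiple of the cluster size
    return ((raw + cluster_size - 1) // cluster_size) * cluster_size


# 64x64 RGB tiles are already cluster-aligned on a 4 KiB filesystem:
print(aligned_tile_bytes(64, 3, 4096))  # 12288 == 3 * 4096
# an awkward 50x50 tile would be padded from 7500 up to 8192 bytes:
print(aligned_tile_bytes(50, 3, 4096))  # 8192
```

Power-of-two tile dimensions are popular for exactly this reason: they tend to land on cluster boundaries without any padding.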
All brush strokes were cubic Bezier splines. This also meant one could
change a stroke hours after it had been placed, and it could be rendered
at any resolution. I'd rip code from Inkscape. Their path reconstruction
while freehand drawing is one of the best I've seen, and that includes
commercial packages.
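Why spline strokes are resolution-independent, in a few lines (a textbook cubic Bezier evaluation, not code from any of the apps mentioned): the stroke is stored as four control points and sampled at whatever density the current zoom level needs.

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]."""
    mt = 1 - t
    x = mt**3 * p0[0] + 3 * mt**2 * t * p1[0] + 3 * mt * t**2 * p2[0] + t**3 * p3[0]
    y = mt**3 * p0[1] + 3 * mt**2 * t * p1[1] + 3 * mt * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)


def sample_stroke(ctrl, samples):
    """Render the same stored stroke at any resolution via `samples`."""
    p0, p1, p2, p3 = ctrl
    return [cubic_bezier(p0, p1, p2, p3, i / (samples - 1))
            for i in range(samples)]


stroke = [(0, 0), (10, 40), (30, 40), (40, 0)]  # four control points
print(sample_stroke(stroke, 3))        # coarse screen preview
print(len(sample_stroke(stroke, 101))) # fine print-res rendering: 101 points
```

The stored data never changes; only the sampling density does -- which is also what makes editing a stroke hours later cheap.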
When one was done and wanted the final image, there was a 'render'
button. You'd go have a cup of tea or several (it could take an hour or
longer with high-res print files) and you'd have your image. However,
while working, everything was realtime.
High-end 2D compositing applications like Apple's Shake or D2's Nuke work
in the same way. The advantages totally outweigh the waiting time at the end.
Today, one could also allow the user to select a renderer (an engine
that combines the pixels), and one of the available renderers could be
OpenGL, i.e., use the GPU for accelerated compositing.
And also, instead of the disk tile cache that Eclipse used, I'd use
something that keeps tiles in RAM and only swaps the least-accessed
tiles out to disk.
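That RAM-first variant is essentially an LRU cache with a spill-to-disk callback. A minimal sketch (my own illustration, not any app's actual design):

```python
from collections import OrderedDict


class LRUTileCache:
    """Keep tiles in RAM; evict the least recently used ones to disk."""

    def __init__(self, max_tiles, swap_out):
        self.max_tiles = max_tiles
        self.swap_out = swap_out    # callback: persist an evicted tile
        self.tiles = OrderedDict()  # key -> tile data, in LRU order

    def put(self, key, data):
        if key in self.tiles:
            self.tiles.move_to_end(key)
        self.tiles[key] = data
        while len(self.tiles) > self.max_tiles:
            victim, tile = self.tiles.popitem(last=False)  # least recent
            self.swap_out(victim, tile)

    def get(self, key):
        self.tiles.move_to_end(key)  # mark as most recently used
        return self.tiles[key]


swapped = []
cache = LRUTileCache(2, lambda key, tile: swapped.append(key))
cache.put("a", b"...")
cache.put("b", b"...")
cache.get("a")           # "a" is now the most recently used tile
cache.put("c", b"...")   # over budget: evicts "b", the least recent
print(swapped)           # ['b']
```

In practice the budget would be a byte count rather than a tile count, and evicted tiles would land in the tiled disk cache described earlier.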
Lastly, there's another app which works this way but has an interface
similarly horrible to Eclipse's: Satori Paint on Windows.
Cheers,
Moritz
More information about the kimageshop
mailing list