[Kstars-devel] UI For Memory Management

Akarsh Simha akarshsimha at gmail.com
Sat Jul 19 22:47:25 CEST 2008


Hi James

> I agree that a slider is preferable to a spinner.  If you can create
> a slider that goes from 0 to 100 (or whatever) then it would be trivial
> to convert it to zoomed out magnitudes that go from 0.0 to 10.0 by 
> dividing by 10.0.

That's an easy solution, but it will not really make the slider linear
in star density.

To make it linear in star density, I worked out the following math. I
find it very fishy, because I ended up with only one constant to fix
via "boundary" conditions. So I thought I should explain it to you:

Looking through your math for the maglim formula, I find that:

10^(0.45 * maglim + 0.95) / (Zoom Factor)^2

is directly proportional to what we should be calling the star
density, which would be represented in arbitrary units by our
slider. Let me call the value of the slider 'SD'. Then I have:

SD * const = 10^(0.45 maglim + 0.95) / (Zoom Factor)^2

Let me abbreviate Zoom Factor to 'ZF'. Taking logarithms (to base 10),
and absorbing log(const) into a new constant:

log SD + const = 0.45 maglim + 0.95 - 2 log ZF

Divide by 0.45 and rearrange the terms (folding the 0.95/0.45 into the
constant as well):

maglim = 4.444 log ZF + ( 2.222 log SD + const )

So, instead of maglim = 4.444 log ZF + Options::StarDensity(), I think
the above formula (with SD replaced by Options::StarDensity()) makes
the slider linear in star density.

We want to fix const. So let's assume that the lowest maglim a user
would want is 3.5 (really pathetic city skies; this would also give a
really poor star density, I think). So when SD = 1, maglim = 3.5, and
thus const is 3.5. (We might want to change this; let's discuss.)

maglim = 4.444 log ZF + 2.222 log SD + 3.5
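
To make this concrete, here is a minimal C++ sketch of what I have in
mind. It is purely illustrative: the function name and parameters are
my own, and only Options::StarDensity() and the zoom factor come from
the existing code.

    #include <cmath>

    // Illustrative sketch only: compute the faint-magnitude limit from
    // the slider value SD (assumed to run from 1 to 100) and the zoom
    // factor ZF, i.e. maglim = 4.444 log ZF + 2.222 log SD + 3.5.
    double magLimitFromDensity( double sd, double zf )
    {
        return 4.444 * std::log10( zf ) + 2.222 * std::log10( sd ) + 3.5;
    }

Ignoring the zoom term (ZF = 1), this gives maglim = 3.5 at SD = 1 and
about 7.9 at SD = 100.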

If I let SD range from 1 to 100, I get values for maglim from 3.5 to
about 7.9 (really good skies, really rich star density. Again, we
could discuss this limit as well. You suggested 9.0, but IMO that
would be a memory hog). What I find very odd is that this calculation
nowhere needed a second 'slope' constant to be fixed. It doesn't make
sense to me, because I could always "scale" the range of my slider.
Can the condition of linear variation really be so rigid as to
eliminate a parameter? I seem to be missing something, but I'm happy
to have just one parameter, all the same.
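
Actually, writing out the scaling explicitly might answer it. If I
rescale the slider by a factor k, the formula above gives:

maglim = 2.222 log( k SD ) + 4.444 log ZF + const
       = 2.222 log SD + 4.444 log ZF + ( const + 2.222 log k )

so any rescaling of SD just gets absorbed into the constant, and the
slope really is pinned by the 0.45 in the exponent of your original
formula.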

> It is possible that the star density "safety" control I was suggesting
> earlier will end up being an easier solution than predicting the memory
> usage.  I put a (very) arbitrary limit of 9.0 on the star density 
> setting available via the mouse-wheel.  If we don't put some kind of
> a limit on the density obtainable via the mouse-wheel then it would
> be all too easy for a user to just spin the mouse wheel and eat
> up gobs of memory (assuming a very large catalog).  This problem
> is exacerbated by the fact that it can take a very long time (seconds)
> to draw the sky when it has a lot of stars.

Agreed.

> One possible solution would be to *rely* on the mouse-wheel (and - and =
> keys) to adjust the density and make the slider/spinner in the config
> menu control the "maximum star density".

Yes, that would be a nice idea. When I'm zoomed in sufficiently, I
could carefully hike up the star density if I wished to, although I
don't know why anyone would want more than what the formula gives at
a maglim of something like 7.5.
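
Just to make sure we mean the same scheme, here is a rough sketch of
the clamping I imagine. It is only a sketch; the names here are
hypothetical placeholders, not actual KStars API:

    #include <algorithm>

    // Illustrative only: the mouse wheel (and the - and = keys) may
    // request any density, but the config-dialog slider supplies a
    // hard ceiling, and 1 is the lower bound of the slider's range.
    double clampStarDensity( double requested, double ceilingFromSlider )
    {
        return std::max( 1.0, std::min( requested, ceilingFromSlider ) );
    }

The clamped value would then be stored back through something like an
Options::setStarDensity() setter (again, a hypothetical name).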

> I wish I could have given you an elegant solution that wrapped it all
> up neatly but as I said above, this is a nasty little problem.
>
> Remember, your estimates of memory usage are based on the totally 
> arbitrary limit of 10.0 on the star density control.  Perhaps the 
> simplest thing for us to do now is to simply impose such a hard limit 
> (perhaps 9.0) and then just leave the rest of this problem for another
> day.
>
> I propose we just hard code a limit of 9.0 to the star density and then
> add a warning about memory usage.  We can revisit this problem at a 
> later date if we want to allow power users to go to much higher star 
> densities.  

I didn't realise that there would be such vast differences between
the best case and the worst case. I had also forgotten that the number
of trixels thought to be visible is larger than the number of trixels
that are actually visible. That's hard to account for, so maybe we
should leave out the memory estimate, as you say, or make "eyeball
average" estimates, as Jason suggests.

If we set the maglim when zoomed out to something like 5.5, I notice
that we can hardly get past 120 MB, which is a small increase from the
earlier ~70 MB usage that we had. This should drop further when we
implement the AuxInfo hash. I guess hard-coding limits on the star
density slider should do the job, at least for now. As I said earlier,
I don't know why anyone would want a star density that pushes memory
usage past 160 MB. I know what I'm doing here is wrong - I'm
justifying something as useless just because it's hard to achieve. Let
me know if this doesn't make sense.

Regards
Akarsh