How do I remove the "new activity" item from the desktop?

Duncan 1i5t5.duncan at cox.net
Tue May 31 13:49:59 BST 2011


John Woodhouse posted on Tue, 31 May 2011 03:49:49 -0700 as excerpted:

> I'm on a number of forums that top post normally but I can try and
> remember to switch on here.

Back in the MSIE beta days (which that long ramble of mine mentioned), 
the Microsoft newsgroups didn't really care too much, as long as people 
didn't post such a humongous unbroken block of reply at the top that 
everyone had to scroll too far to get past it.

The long quote block is bad in any case, tho (except where 
whole-quote-at-bottom is the accepted convention).  In general, if 
someone's posting more than a reasonably-sized page's worth of quoted 
text in a single block and replying below it, they're doing it wrong.  
The idea is to trim the quote down to the bit being replied to and reply 
to that, interleaving quote/reply/quote/reply if there are several 
points to address.  With some exceptions for quoted technical output and 
the like, if within a reasonably-sized page one can't see some of the 
reply at either the top or the bottom, the poster is quoting too much 
and needs to either trim or summarize (with appropriate [] demarcation, 
so it's known to be a summary and not a direct quote) the point being 
replied to.

But I've not seen a *ix related list where top posting is really 
considered great, tho it's often tolerated.  And personally, I don't 
really object to it either, certainly not anything even CLOSE to the 
extent to which I object to HTML posts.  My only frustration with top 
posting comes when trying to reply while keeping appropriate context.  
Sometimes I reformat it for top quoting instead of top posting.  
Sometimes (as here) I reply to another post instead.  Sometimes I reply 
to only one select-quoted point instead of the more thorough reply I'd 
have otherwise posted.  And sometimes I just blow it off and don't reply 
at all, since by the point in a thread where top-posting becomes a real 
problem, any further contribution I might make isn't as important as it 
would have been were I the first reply (which by definition can't have 
that problem).

So I'm not really going to object to top posting, as long as you're not 
using HTML.  I might gripe a bit in replies, etc., but that's different 
from objecting strenuously...

> I've avoided dual monitors so far. Many use one as a sort of launch
> point and another for actually working within but it can be useful to
> mix that a little in the interests of real estate. Task bars are likely
> to always fit in nicely for people who use just one monitor.

Make that /most/ people and I'm fine with it. =:^)  By the time I left 
MS I didn't really need a taskbar so much any more, and I believe I 
dumped it shortly after switching to Linux, as Linux had so much else of 
interest to stick on the panels. =:^)  All that other stuff kind of 
crowded out the taskbar, which I didn't use enough to be worth the space 
anyway.

OTOH, the new taskbar/launcher combos, where many app buttons remain in 
the same place whether the app is running or not, and clicking one 
either starts the app or switches to it, are a very interesting 
innovation.  It follows intuitively from the way things were going, but 
it's still rather creative.  I hope it doesn't end up being one of those 
infamous patent problems...

Anyway, had I not switched to primarily hotkey launching (which, if you 
think about it, is similar positional memory at work, only on the 
keyboard rather than the taskbar), I expect I'd find that rather 
useful.  And I expect it'll keep the taskbar around a little longer for 
some people than it might have stayed otherwise, were it just a 
traditional running-apps taskbar.  But there's nothing wrong with that.  
Customize it to the way you work and be happy! =:^)
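
(For anyone curious about hotkey launching: I won't claim any One True 
Way, and KDE's own custom-shortcuts module does the job too, but a 
minimal sketch using xbindkeys, with made-up example bindings, looks 
roughly like this:)

  # ~/.xbindkeysrc -- each entry is a command in quotes, then the
  # key combo on the next line (Mod4 is usually the "win" key)
  "konsole"
    Mod4 + Return
  "firefox"
    Mod4 + w

  $ xbindkeys    # start the daemon; add it to your session autostart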

> I run a 22ins monitor at 1680x1050 that's been with me for many years.
> About 9 I think. Now HD TVs are available I am wondering about moving
> in that direction but there really isn't much of a gain in resolution so
> it's a question of just how big one can go before the dot size/spacing
> becomes a problem. I run the monitor at 100dpi which produces nice tight
> clear text. That is a little small for ageing eyes without some aid. At
> some point I intend to play with this and x scaling to see just how
> large I can go before things get objectionable.

This is a rather practical consideration I too am facing.  My current 
setup is 96 dpi standard.  Neither the 1080p monitors nor X appeared to 
have a built-in half-size mode (half the pixels in each dimension, so a 
quarter of the total), so I created one, 960x540, and for some time I 
regularly used that 960x540 modeline.  (I've since switched to kwin's 
openGL-based zoom effect, which I use most of the time now, tho the 
xrandr scripts I used before are still in my hotkey config.)
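
For anyone wanting to replicate the experiment, the xrandr dance for 
adding such a mode goes roughly like this (a sketch; DVI-0 is just an 
example output name, so run plain xrandr to list yours):

  $ cvt 960 540 60                    # print CVT timings for 960x540@60
  $ xrandr --newmode "960x540" <timings as printed by cvt>
  $ xrandr --addmode DVI-0 960x540    # attach the mode to an output
  $ xrandr --output DVI-0 --mode 960x540      # switch to it
  $ xrandr --output DVI-0 --mode 1920x1080    # and back to native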

Somewhat to my surprise, I didn't really have a problem with that, at 
least at the comfortable distance of about four feet at which I like to 
sit from the monitors.  (You'd not want to be closer than about three 
feet to dual 42-inch anyway; you'd have to move your head too much to 
see the whole thing!)  Being half the normal resolution on a 96 dpi 
setup, it amounted to 48 dpi.  I've decided I don't have the problems 
with aliasing that some people have -- my mind must compensate 
automatically, since at that size I can see the pixel blocks if I 
deliberately look for them, but I don't see them otherwise and they 
don't normally bother me if I'm not looking for them.
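
(The arithmetic: dpi is just horizontal pixels divided by the panel's 
width in inches, so halving the pixel count on the same glass halves the 
dpi.  X will tell you what it thinks it's driving:)

  $ xdpyinfo | grep resolution     # what X believes the density is
    resolution:    96x96 dots per inch
  # same physical panel, 1920 -> 960 pixels across: 96/2 = 48 dpi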

Part of it may also be that I *STRONGLY* prefer the traditional light 
text and foregrounds on a dark background, reversing the more common 
dark-on-light scheme whose background is WAY too bright for me.  
(Strongly enough that I can start feeling physically sick if stuck too 
long with a standard scheme; if you doubt it, check the archives here or 
on the Linux list for my complaints about color handling back with kde 
4.2, when I first tried to switch from 3.5.)  I remember back in the CRT 
era, when I used to run my monitors close to their upper resolution 
tolerances, and thus close to their lower vertical-refresh-rate 
tolerances.  I experimented quite a bit trying to eke out every last bit 
of resolution I could, and realized in the process that the darker the 
picture, the more tolerant I was of low refresh rates.  I eventually 
decided that part of my "reverse color-scheme" preference came from 
tending to run high resolutions and thus low refresh rates: long before 
I understood why, I instinctively disliked the harsh flicker and glare 
of the brighter schemes at the refresh rates my monitors could deliver 
at the resolutions I ran them at.

Anyway, based on the half-resolution experiments I ran recently -- on 
higher-quality LED-backlit monitors, not fluorescent-backlit LCDs, so it 
wasn't simply blur masking the problem -- I don't anticipate problems 
with monitors up to 42 inches, almost double the size of what I have 
now.  Were I to go higher than that, 47 inches would be borderline and 
50-inch-plus would likely start to pixelate for me, but 42-inch should 
be fine as long as it's full 1080p or better resolution.
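
The back-of-envelope numbers bear that out.  For a 16:9 panel, width = 
diagonal x 16/sqrt(16^2 + 9^2), so at 1920 pixels across (a sketch; 
plug in your own sizes):

  $ awk 'BEGIN { px=1920;
         for (d=42; d<=50; d+=4)          # 42, 46, 50 inch diagonals
           printf "%2d\" -> %.0f dpi\n", d, px/(d*16/sqrt(337)) }'
  42" -> 52 dpi
  46" -> 48 dpi
  50" -> 44 dpi

My tested comfort floor was the 48 dpi of the 960x540 experiment, which 
is why 42 inches (52 dpi) looks safe, 47 is borderline, and 50-plus 
(44 dpi) probably isn't.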

Meanwhile, one of the things TVs do (or at least used to do) to look 
good at larger sizes but lower resolutions, even down to 
standard-definition, is increase the blur -- where monitors 
traditionally do just the OPPOSITE.  At the full-HD 1080p level, TVs and 
monitors run comparable or often exactly the same resolutions and 
refresh rates, but monitors are at least in theory designed for sharp 
text display even up into the 30-plus-inch range, while TVs, 
particularly the larger ones, may blur a bit to avoid the pixelation 
effect.

But, due to mass-production economics, TVs tend to be as cheap as or 
cheaper than monitors, even including the tuner that monitors can leave 
out, so they're tempting.  Additionally, I strongly suspect that at 
least in the 19-to-37-inch range where the two commonly overlap, despite 
the theory above, the mass-production economics are strong enough that 
the exact same panels are being used for both monitors and TVs.

Meanwhile, one thing I'd STRONGLY recommend to anyone purchasing for 
monitor use is to spend the few extra bucks for LED backlighting.  I 
didn't realize how big a difference it made originally, and purchased 
the cheaper standard fluorescent-backlit LCDs.  But an accident cracked 
one of them, so I sold the good one to a friend and threw in the cracked 
one (which was still in good enough shape that he used it as a second 
monitor for a while himself), and upgraded to slightly smaller but 
*ASTOUNDINGLY* better color-quality LED-backlit models.  Had I not 
actually run the fluorescent LCDs for a few months I'd not have noticed 
the difference, but having done so, and having actually run the kgamma 
adjustment sequence on both...  put it this way: if I were still 
satisfied with the fluorescent LCDs, they're enough cheaper that I'd 
already have my dual 37-to-42-inch monitors/TVs!  But having seen the 
difference, I'm holding out until I scrape together the money for the 
LED-backlit ones.

Basically, with the fluorescent LCDs I couldn't get the kgamma tests to 
distinguish between all the color blocks no matter what I did, while 
with the LED-backlit ones it was easy to tell the difference.  And it 
was me in both cases, so the difference isn't just my eyes versus 
someone else's.  That's the difference in color-rendering quality!
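
(For anyone wanting to run the same comparison: kgamma ships as a 
systemsettings module in the kdegraphics packages, so assuming it's 
installed, something like this should bring it up:)

  $ kcmshell4 kgamma    # KDE 4's standalone module runner
  # or: System Settings -> Display -> Gamma, then compare the
  # grey/red/green/blue test strips on each panel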

Plus the LED-backlit models are MUCH lighter and thinner, and run MUCH 
cooler and greener.  Here in Phoenix, AZ especially, where it's not 
uncommon to run the AC for an hour or so midday even in January, we pay 
for every bit of electricity a device uses twice: once powering the 
device (ultimately converting the energy to waste heat), then again 
pumping that heat outside!  So the electricity savings alone should pay 
for the difference over the lifetime of the device, several times over.
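
Back-of-envelope, with made-up but plausible numbers (a 40W backlight 
difference, 12 hours/day, 12 cents/kWh -- substitute your own):

  $ awk 'BEGIN { w=40; hrs=12; rate=0.12
         kwh = w*hrs*365/1000           # yearly energy difference
         printf "%.0f kWh/yr, ~$%.0f/yr -- before doubling for the AC\n",
                kwh, kwh*rate }'
  175 kWh/yr, ~$21/yr -- before doubling for the AC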

And being lighter and easier to handle, there's less chance of 
"accidents" like the one that cracked the LCD and triggered my buying 
the LED-backlit ones, too.

> There is even another
> possible gain from using a TV. My graphics card has an HDMI output which
> means it's possible to gain a decent sound system as well, all in the
> same unit. Just how well it will work from a distance of under a couple
> of feet though is questionable.

Meh.  I already have my workstation hooked up to my 5.1 stereo using the 
sound system's digital out.  The TV's sound might well be worse, even if 
used for just the front channels.

But my card doesn't have HDMI out anyway, only DVI.  Of course that'll 
probably change when I next upgrade, but I'm shooting for a decade on 
this dual-socket Opteron system from 2003, so that's a couple years off 
yet...  And this economy is really hurting ATM; at this rate it could 
be that long before I even get the big monitors, pushing the full 
computer upgrade a couple years beyond that!  This Tyan's been a good 
board tho, and could well make it a dozen years...

> Back to the topic - nearly. There is one person kicking about that
> refers to the thing in the top right corner as a hemroid.

LOL!  I hadn't seen /that/ label before! =:^)

> It's interesting to note that for all I know a cashew might be something
> entirely different and not the hemroid which just goes to show how
> useful a name it really is.

Yeah...

-- 
Duncan - List replies preferred.   No HTML msgs.
"Every nonfree program has a lord, a master --
and if you use the program, he is your master."  Richard Stallman
