Massive Konqueror Regression

Luke-Jr luke-jr at utopios.org
Thu Aug 18 18:47:49 CEST 2005


On Thursday 18 August 2005 16:35, James Richard Tyrer wrote:
> Luke-Jr wrote:
> > On Wednesday 17 August 2005 16:39, James Richard Tyrer wrote:
> >> Luke-Jr wrote:
> >>> On Tuesday 16 August 2005 19:22, James Richard Tyrer wrote:
> >>>> Frank Osterfeld wrote:
> >>>>> On Monday 15 August 2005 14:31, David van Hoose wrote:
> >>>>>> I haven't yet, but I do see that Konqueror has at least one
> >>>>>> regression, so I know your regression tests are inadequate. 
> >>>>> I am sure you can point me to a testing method with 100%
> >>>>> coverage.
> >>>> They do exist -- at least they are supposed to exist.
> >>>>
> >>>> The validation suite for a compiler is supposed to test
> >>>> everything.
> >>> I'm pretty sure such tests only test standard-compliant
> >>> situations-- which work quite fine in KHTML.
> >> Which has nothing to do with what I said.  The suites test what the
> >> people who write them write them to test.  The point is that they
> >> are intended to test everything -- that writing a test suite is not
> >> only not an unrealistic goal, it is something which is done.
> > But such suites for standard situations are already a reality. The
> > "unrealistic goal" is writing a test suite that includes all the
> > broken webpages on the net.
> In rhetoric, this is called a "Straw Man".  Yes, you have defeated the
> Straw Man.  But I am not suggesting that a test suite be written to test
> all "broken webpages on the net".

You must be, since all non-broken webpages are, as far as I have seen, working 
great -- better than in any other browser, at least.

> I am suggesting that we need to write a test suite that includes
> unauthorized MicroSoft extensions.

I am saying webpage authors should be chastised for using non-standard stuff 
without a fallback.

> >>> GCC, now that it has become the #1 compiler, has begun to be
> >>> stricter and refuses to compile bad code.
> >> Again, this has nothing to do with what I said.  A compiler
> >> validation suite is only to test that correct code is compiled
> >> correctly.  A compiler can support extensions and still run the
> >> validation suite correctly.
> > The problem isn't extensions, but invalid or bad code.
> Taking one word out of a sentence and saying something different about
> it means little.  A validation suite does not test a compiler for
> detecting incorrect code; if it did, compilers which recognize
> extensions might fail the test.

No, because extensions can still be valid code. Code should never depend on 
extensions, though...

> >>> If you want something compiler-like, then KHTML should refuse to
> >>> show any erroneous webpage...
> >> Even if that were the case, which it isn't, you would have to
> >> define what constituted an "erroneous" webpage.
> > Anything not following the established standards, obviously.
>
> This limited definition only causes problems.  Mozilla and Firefox are
> able to handle unauthorized MicroSoft extensions.  Are such unauthorized
> extensions "established standards"?

Of course not. Are they part of the XHTML 1.1 (or CSS 2.1, etc) standard?

> > Then the real question which it would need to address are the
> > unauthorized extensions to HTML and its derivatives.

And the answer is for websites using extensions not to depend on them and to 
fall back to standard code when extensions are unavailable -- just like all 
good C code does.
-- 
Luke-Jr
Developer, Utopios
http://utopios.org/


More information about the kde-quality mailing list