Crash organising files

Maximilian Kossick maximilian.kossick at googlemail.com
Sun Sep 7 18:08:25 CEST 2008


On Sun, Sep 7, 2008 at 5:19 PM, Jeff Mitchell <kde-dev at emailgoeshere.com> wrote:
> Maximilian Kossick wrote:
>> On Sun, Sep 7, 2008 at 7:01 AM, Seb Ruiz <ruiz at kde.org> wrote:
>>> Reproducible every time. Happens when organising from a local or a
>>> non-local collection.
>>>
>>> Max, what do you think of this?
>>>
>>
>> This is the first time that I've seen a pointer to a QHash :)
>
> :-)
>
>> That shouldn't cause a crash though, but everything else seems to work.
>
> No, it shouldn't cause one.  That hash, although spread across threads,
> should not be getting accessed my multiple threads at once (I was pretty
> certain of this when I dug through the code).  Not that I think that's
> the problem (or that I know what the problem is).
>
>> Please check if it works without those two lines of AFT code. If it
>> does, you might try to use a QHash that's not shared across multiple
>> threads (i.e. use QHash instead of QHash* inside ScanResultProcessor)
>
> The reason that hash exists is that we can't update Meta::Track objects
> with new URLs when the new ones are found, because this happens in a
> non-GUI thread, and multiple parts of Amarok perform GUI functions when
> the objects are updated.  Big borkage/breakage occurs.
>
> Instead of using a shared hash, I could try emitting the information
> and having the ScanManager collect those emissions into a local hash
> object (sketched below).  I feel like this might be slower, though (as
> would mutexing the hash, even though I don't think that's necessary,
> nor do I know whether it actually has anything to do with the issue).
>
> Thoughts?
>
> --Jeff
>
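
For reference, a rough sketch of that emit-and-collect idea, with
made-up names and moc boilerplate omitted; the point is that the slot
runs in the manager's thread via a queued connection, so the hash is
only ever touched from one thread:

#include <QCoreApplication>
#include <QHash>
#include <QObject>
#include <QString>
#include <QThread>

// Runs in the scanner thread; emits results instead of writing to
// shared state.
class ScanWorker : public QObject
{
    Q_OBJECT
signals:
    void urlChanged( const QString &oldUrl, const QString &newUrl );
public slots:
    void process()
    {
        // ... actual scanning work; emit one pair per relocated file ...
        emit urlChanged( "file:///old/track.mp3", "file:///new/track.mp3" );
    }
};

// Lives in the GUI/manager thread and owns the hash, so no mutex is
// needed.
class ScanManager : public QObject
{
    Q_OBJECT
public slots:
    void slotUrlChanged( const QString &oldUrl, const QString &newUrl )
    {
        m_changedUrls.insert( oldUrl, newUrl );
    }
private:
    QHash<QString, QString> m_changedUrls;
};

int main( int argc, char **argv )
{
    QCoreApplication app( argc, argv );

    QThread workerThread;
    ScanWorker worker;
    ScanManager manager;
    worker.moveToThread( &workerThread );

    // Queued connection: the QString arguments are copied into the
    // event, and the slot executes in the manager's thread.
    QObject::connect( &worker, SIGNAL(urlChanged(QString,QString)),
                      &manager, SLOT(slotUrlChanged(QString,QString)),
                      Qt::QueuedConnection );

    workerThread.start();
    QMetaObject::invokeMethod( &worker, "process", Qt::QueuedConnection );
    return app.exec();
}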

First we'd have to make sure that the scan otherwise works correctly
without that insert.
Then I'd check whether it works if we store the QHash directly instead
of a pointer to it, and provide a getter/setter for it.
Mutexing should not be necessary in that case (and it wouldn't make a
noticeable difference anyway; this isn't high performance computing,
after all).
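
A rough sketch of that by-value approach (illustrative names, not
ScanResultProcessor's real interface):

#include <QHash>
#include <QString>

class ScanResultProcessor
{
public:
    // Hand out copies instead of a pointer. QHash is implicitly
    // shared (atomically ref-counted), so the copy stays cheap until
    // one side actually writes to it.
    QHash<QString, QString> changedUrls() const { return m_changedUrls; }
    void setChangedUrls( const QHash<QString, QString> &urls )
    {
        m_changedUrls = urls;
    }

private:
    // Owned by value: nothing dangles if whoever created the hash
    // goes away first.
    QHash<QString, QString> m_changedUrls;
};

Each thread then works on its own copy, which is safe with Qt's atomic
reference counting as long as no one mutates a copy another thread is
reading.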

Cheers, Max

