Parallel solvers not terminating
Wolfgang Reissenberger
sterne-jaeger at openfuture.de
Wed Nov 18 20:43:59 GMT 2020
No, they do not terminate at all. I waited for several minutes without any change.
> On 18.11.2020 at 21:41, Robert Lancaster <rlancaste at gmail.com> wrote:
>
> One more question. You said they don’t terminate: does that mean that they never do, or just that it takes a few moments? They would not terminate immediately; it normally takes a short time for the abort to occur.
>
> Sent from my iPad
>
>> On Nov 18, 2020, at 3:09 PM, Robert Lancaster <rlancaste at gmail.com> wrote:
>>
>> Is this using the latest StellarSolver? And can you try the same image in the StellarSolverTester?
>>
>> The numbers don’t quite add up here. I assume you have 4 threads running, so there would be 4 solvers. If the first one solved, that leaves 3, which is correct. But counting your files, I see 5 cancel files in one case and 2 in the other. May I assume that more than one solve took place with the external solver? Or are some of these leftover files? Did one of the internal threads shut down properly, leaving just 2?
>>
>>> On Nov 18, 2020, at 1:08 PM, Wolfgang Reissenberger <sterne-jaeger at openfuture.de> wrote:
>>>
>>> Hi Robert,
>>> It happens both with the internal solver and the external astrometry.net solver. After a successful solve, cancel files appear and remain, in both cases, in the /tmp directory:
>>>
>>> externalSextractorSolver_1_101.axy
>>> externalSextractorSolver_1_101.cancel
>>> externalSextractorSolver_1_101.cfg
>>> externalSextractorSolver_1_101.xyls
>>> externalSextractorSolver_1_151.axy
>>> externalSextractorSolver_1_151.cancel
>>> externalSextractorSolver_1_151.cfg
>>> externalSextractorSolver_1_151.xyls
>>> internalSextractorSolver_10.cancel
>>> internalSextractorSolver_12.cancel
>>> internalSextractorSolver_13.cancel
>>> internalSextractorSolver_14.cancel
>>> internalSextractorSolver_15.cancel
>>>
>>>
>>>
>>>> On 17.11.2020 at 21:51, Robert Lancaster <rlancaste at gmail.com> wrote:
>>>>
>>>> Yeah, this could be something with his system. It works fine on all my systems. I was wondering if we could see the reason it doesn’t work for him. I’m not sure what it could be yet, but maybe the cancel files are not getting made for some reason. Maybe there is something up with his temp directory?
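>>>>
>>>> One quick way to rule out a temp directory problem would be a generic Qt probe like the one below (nothing StellarSolver-specific, and the function name is made up), just to confirm files can actually be created there:
>>>>
>>>>     #include <QDebug>
>>>>     #include <QDir>
>>>>     #include <QTemporaryFile>
>>>>
>>>>     // Verify the system temp directory is writable by actually creating
>>>>     // a throwaway file in it (auto-deleted when `probe` goes out of scope).
>>>>     bool tempDirWritable()
>>>>     {
>>>>         QTemporaryFile probe(QDir::tempPath() + "/solverProbe_XXXXXX");
>>>>         const bool ok = probe.open();
>>>>         qDebug() << "Temp dir:" << QDir::tempPath() << "writable:" << ok;
>>>>         return ok;
>>>>     }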
>>>>
>>>>> On Nov 17, 2020, at 3:43 PM, Jasem Mutlaq <mutlaqja at ikarustech.com> wrote:
>>>>>
>>>>> I tried this with Load & Slew. While it failed to solve the image, it terminated correctly.
>>>>>
>>>>> --
>>>>> Best Regards,
>>>>> Jasem Mutlaq
>>>>>
>>>>>
>>>>>
>>>>> On Tue, Nov 17, 2020 at 11:39 PM Robert Lancaster <rlancaste at gmail.com> wrote:
>>>>> Hi Wolfgang,
>>>>>
>>>>> Does this happen when you try to solve the image in the StellarSolver tester, or just in KStars? Does it happen when using the internal solvers and the local astrometry.net, or just the internal ones? Does it happen on more than one computer, or just this one?
>>>>>
>>>>> Can you look in the temp directory and watch what happens? The way astrometry.net works, to abort the internal or external solvers, a "cancel" file must be created; once a solver notices that file, it aborts the solve. It is basically an empty file. What you should see is that one solver solves the image, and then StellarSolver tells the other solvers to abort by creating a cancel file for each one. Then, as each one shuts down, it deletes its cancel file.
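>>>>>
>>>>> To make that concrete, the idea is roughly the following; this is just a sketch with made-up function names, not the actual StellarSolver code:
>>>>>
>>>>>     #include <QFile>
>>>>>
>>>>>     // Parent side: ask a running solver to abort by creating its
>>>>>     // (empty) cancel file.
>>>>>     bool requestAbort(const QString &cancelPath)
>>>>>     {
>>>>>         QFile cancelFile(cancelPath);
>>>>>         const bool ok = cancelFile.open(QIODevice::WriteOnly);
>>>>>         cancelFile.close();
>>>>>         return ok;
>>>>>     }
>>>>>
>>>>>     // Solver side: check for the cancel file between iterations of the
>>>>>     // solve; on shutdown the solver deletes its own cancel file again.
>>>>>     bool cancelRequested(const QString &cancelPath)
>>>>>     {
>>>>>         return QFile::exists(cancelPath);
>>>>>     }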
>>>>>
>>>>> From your log I am concerned that on your system the cancel file is possibly not getting the correct name, so it is perhaps not being created at all. But I need more information to be sure.
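>>>>>
>>>>> If the name really were empty, the shutdown code would be handing an empty string to QFile::remove, which matches the "Empty or null file name" warnings in your log. A defensive guard would look roughly like this (again a sketch, not the actual fix):
>>>>>
>>>>>     #include <QDebug>
>>>>>     #include <QFile>
>>>>>
>>>>>     // Hypothetical guard: skip the remove when the cancel path was never
>>>>>     // set, instead of passing an empty name to QFile::remove.
>>>>>     void removeCancelFile(const QString &cancelPath)
>>>>>     {
>>>>>         if (cancelPath.isEmpty())
>>>>>         {
>>>>>             qWarning() << "Cancel file path is empty; nothing to remove";
>>>>>             return;
>>>>>         }
>>>>>         QFile::remove(cancelPath);
>>>>>     }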
>>>>>
>>>>> The current naming scheme is that each solver gets a number, and any temp files it creates have this number appended. The internal solvers create very few of these temp files (just the solved and cancel files), but the external ones create multiple files. A cancel file should look something like this: "internalSextractorSolver_20.cancel"
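>>>>>
>>>>> In code, the path construction is roughly like this (from memory, the exact format string may differ):
>>>>>
>>>>>     #include <QDir>
>>>>>     #include <QString>
>>>>>
>>>>>     // Build the cancel-file path for solver number `num` in the system
>>>>>     // temp directory, e.g. /tmp/internalSextractorSolver_20.cancel
>>>>>     QString cancelFilePath(int num)
>>>>>     {
>>>>>         return QDir::tempPath() +
>>>>>                QString("/internalSextractorSolver_%1.cancel").arg(num);
>>>>>     }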
>>>>>
>>>>> Can you check whether this happens in your case?
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Rob
>>>>>> On Nov 17, 2020, at 2:01 PM, Wolfgang Reissenberger <sterne-jaeger at openfuture.de> wrote:
>>>>>>
>>>>>> Hi Robert,
>>>>>> I am running StellarSolver at commit 2cac32ffb3523dbcb4b3d43afd73e96959b52d8d, but the problem when solving NGC6888 remains. It seems that after one solver terminates successfully, the others keep running.
>>>>>>
>>>>>> A warning appears around the time of the first successful solve:
>>>>>>
>>>>>> [2020-11-17T19:47:27.267 CET INFO ][ org.kde.kstars.ekos.align] - "Successfully solved with child solver: 1"
>>>>>> [2020-11-17T19:47:27.269 CET INFO ][ org.kde.kstars.ekos.align] - "Shutting down other child solvers"
>>>>>> [2020-11-17T19:47:27.270 CET WARN ][ default] - Empty filename passed to function
>>>>>> [2020-11-17T19:47:27.270 CET WARN ][ default] - QFile::remove: Empty or null file name
>>>>>> [2020-11-17T19:47:27.271 CET WARN ][ default] - QFile::remove: Empty or null file name
>>>>>> [2020-11-17T19:47:27.271 CET INFO ][ org.kde.kstars.ekos.align] - "Solver completed after 1.76 seconds."
>>>>>>
>>>>>> The entire log file is here:
>>>>>> https://drive.google.com/open?id=1UTy8HCBSg2GILYvuzHlrIWKqrTGw7rYL
>>>>>>
>>>>>> Wolfgang
>>>>>>
>>>>>
>>>>
>>>
>>