<div dir="ltr">Hi all,<div><br></div><div>Three reports are now published by my BASH scripts located in project/reports:</div><div><div>All of them use the same configuration to ignore directories while scanning: the ".krazy" file in the root directory.</div><span>This mailing list is automatically CC'd when a new report is online, with the URL to review it.</span></div><div><br>- clang: this analyzer does not have an option to ignore directories, so I parse everything and filter the output HTML files before publishing.<br></div><div>This task is hard to complete, especially updating the analysis statistics accordingly. Currently that is not done, and the statistics still include the dropped items. More BASH code is needed to achieve complete filtering.</div><div>- krazy: I only scan with the "extra" checks, which are not published to EBN (<a href="http://ebn.kde.org/krazy/reports/extragear/graphics/digikam/">http://ebn.kde.org/krazy/reports/extragear/graphics/digikam/</a>)</div><div>For this last one, I spent 3 weeks and 300 commits fixing all the reports. The extra checks are still under development and can generate false positives.</div><div>So take care...</div><div>- cppcheck: very verbose, probably due to the 'style' checks in the analysis. 
That is just an option to tune if the style checks are not suitable. In any case, some reports are worth investigating.</div><div><br></div><div>To conclude: we now have suitable reports to detect coding mistakes introduced by contributors, such as students, or through patches.</div><div><br></div><div>Best</div><div><br></div><div>Gilles</div></div><div class="gmail_extra"><br><div class="gmail_quote">2018-05-06 15:39 GMT+02:00 Gilles Caulier <span dir="ltr"><<a href="mailto:caulier.gilles@gmail.com" target="_blank">caulier.gilles@gmail.com</a>></span>:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">The URL has changed slightly:<div><br></div><div><a href="https://www.digikam.org/reports/" target="_blank">https://www.digikam.org/reports/</a><br></div><div><br></div><div>We now have clang and cppcheck reports posted to digiKam.org...</div><span class="HOEnZb"><font color="#888888"><div><br></div><div>Gilles</div></font></span></div><div class="HOEnZb"><div class="h5"><div class="gmail_extra"><br><div class="gmail_quote">2018-05-05 14:19 GMT+02:00 Gilles Caulier <span dir="ltr"><<a href="mailto:caulier.gilles@gmail.com" target="_blank">caulier.gilles@gmail.com</a>></span>:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">Hi all,<div><br></div><div>My Clang static analyzer script is working well now. It automatically publishes the report to the digiKam.org static area.</div><div><br></div><div>url: <a href="https://www.digikam.org/report/" target="_blank">https://www.digikam.org/report/</a></div><div><br></div><div>The content is currently an older one. 
I will run the script again soon to update the content.</div><div><br></div><div>Best</div><span class="m_6358007066034371602HOEnZb"><font color="#888888"><div><br></div><div>Gilles</div></font></span></div><div class="m_6358007066034371602HOEnZb"><div class="m_6358007066034371602h5"><div class="gmail_extra"><br><div class="gmail_quote">2018-05-04 17:48 GMT+02:00 Gilles Caulier <span dir="ltr"><<a href="mailto:caulier.gilles@gmail.com" target="_blank">caulier.gilles@gmail.com</a>></span>:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">Hi all,<div><br></div><div>As you may know, we normally parse all the source code with the Coverity Scan service and fix, step by step, the issues detected by the static analyzer.</div><div><br></div><div>Since January, git/master cannot be processed by Coverity. The build completes, but the report is never committed and ends up somewhere in /dev/null (:=)))...</div><div><br></div><div>The Coverity Scan service was acquired by a new company in 2018, and I suspect a side effect on committing reports to the remote server. I contacted the Coverity team, who responded that an investigation is in progress and asked us to wait.</div><div><br></div><div>So I finally looked for a new solution to parse all the source code, week by week, with another static analyzer. I tried the Clang one, and the reports are really excellent. I wrote a script in project/reports/, but it's not yet perfect.</div><div><br></div><div>The first report that clang generated is really interesting. 
I shared the files (web pages) in this archive:</div><div><br></div><div><a href="https://drive.google.com/open?id=1EKr9vAMZFZ8-UDOXXIrzKdlt5G8ClVD1" target="_blank">https://drive.google.com/open?id=1EKr9vAMZFZ8-UDOXXIrzKdlt5G8ClVD1</a><br></div><div><br></div><div>Please take a look and feel free to apply patches if necessary.</div><div><br></div><div>I will try to finalize the script over this weekend so that the analyzer can be run locally.</div><div><br></div><div>Best</div><span class="m_6358007066034371602m_-1622835431493155084HOEnZb"><font color="#888888"><div><br></div><div>Gilles Caulier</div></font></span></div>
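[Editor's note] Since clang has no ignore-directory option, the newest message in this thread describes post-filtering the generated HTML report with BASH. Here is a minimal, self-contained sketch of that idea; it is not the actual project/reports script, the directory names are placeholders, and it assumes scan-build emits one finding per single-line &lt;tr&gt; row (which may differ between Clang versions):

```shell
#!/bin/bash
# Minimal sketch of post-filtering a clang scan-build HTML report
# (NOT the actual project/reports script; names are hypothetical):
# drop report rows whose source path lives in an ignored directory,
# mirroring the ".krazy" exclusions used by the other analyzers.

IGNORED_DIRS="opencv3 libraw"                 # hypothetical ignore list
PATTERN=$(echo "$IGNORED_DIRS" | tr ' ' '|')  # -> 'opencv3|libraw'

# Tiny stand-in for a scan-build index.html (one <tr> per finding).
cat > index.html <<'EOF'
<tr><td>core/libs/widgets/foo.cpp</td><td>Dead store</td></tr>
<tr><td>core/libs/rawengine/libraw/bar.cpp</td><td>Leak</td></tr>
<tr><td>core/libs/facesengine/opencv3/baz.cpp</td><td>Leak</td></tr>
EOF

# Keep every row except those referencing an ignored directory.
grep -Ev "<tr>.*(${PATTERN})/" index.html > index.filtered.html

wc -l < index.filtered.html    # prints 1: only foo.cpp survives
```

As the email notes, this only removes the rows; the summary statistics on the report's front page would still need to be recomputed separately, which is the hard part.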
</blockquote></div><br></div>
</div></div></blockquote></div><br></div>
</div></div></blockquote></div><br></div>
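[Editor's note] For reference, the ".krazy" file mentioned at the top of the thread is a plain-text control file read by Krazy from the repository root. A rough sketch of its shape is shown below; the directive names follow what commonly appears in KDE repositories' .krazy files, but the directory names are illustrative only, and the Krazy README should be consulted for the exact syntax:

```text
# Illustrative .krazy control file (directory names are examples only).
# SKIP takes a regular expression of paths to exclude from scanning.
SKIP /opencv3/|/libraw/|/dng/
# Additional directives (e.g. to enable "extra" checks) also go here;
# see the Krazy documentation for the supported keywords.
```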