[Kstars-devel] Downloadable star catalogs
Akarsh Simha
akarshsimha at gmail.com
Mon Mar 1 19:05:04 CET 2010
On Fri, Feb 26, 2010 at 11:01:08PM +0300, Alexey Khudyakov wrote:
> On 26 February 2010 at 21:53:19, Lukas Middendorf wrote:
> > On Monday 22 February 2010, Khudyakov Alexey wrote:
> > > It seems that knewstuff doesn't play well with big files. I cannot
> > > download USNO catalog because it won't fit into my 512M /tmp.
> >
> > It indeed seems to be a problem with free disk space.
> >
> > The USNO catalog needs a total of 5.4GiB of disk space while being
> > installed:
> >   1.4GiB for the original tar.bz2 (in /tmp)
> >   2.0GiB for the tar after uncompressing (in /tmp)
> >   2.0GiB for the real catalog (in /home)
> >
> > If you run out of disk space in the last step (I have /tmp and /home on the
> > same partition) you get only part of the file.
> >
> > I guess it could be optimized by either deleting the original download
> > right after uncompressing and before unpacking or by doing the
> > uncompressing and unpacking in one step to skip the *.tar. What is the
> > right component to report this against?
> >
> Actually, the file could be downloaded and uncompressed without using /tmp at all.
> A simple shell example:
>
> wget "$URL" -O - | bunzip2 | tar xf -
>
> I think this is the right approach. Most likely knewstuff wasn't designed with
> really big files in mind.
>
> The right component is knewstuff in kdelibs. And please post the bug number here
> if you report this issue.
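Since the catalog ships as tar.bz2, the decompression stage is bunzip2, and adding
`set -o pipefail` makes the pipeline fail loudly instead of silently leaving the
truncated catalog Lukas describes. A minimal local sketch (a tiny stand-in archive
replaces the real wget stage; names are illustrative):

```shell
set -o pipefail  # a failure in any stage (not just the last) fails the pipe

# Stand-in for the 1.4 GiB download; in practice this stage would be
# wget "$URL" -O - against the real catalog mirror.
mkdir -p catalog && echo "USNO entry" > catalog/stars.txt
tar cjf USNO.tar.bz2 catalog
rm -r catalog

# Decompress and unpack in one streaming step: nothing lands in /tmp
# and no intermediate 2 GiB .tar is written.
if cat USNO.tar.bz2 | bunzip2 | tar xf -; then
    echo "catalog installed"
else
    echo "pipeline failed; catalog may be incomplete" >&2
fi
```

With this arrangement the peak disk use is roughly the size of the unpacked
catalog alone, rather than the 5.4GiB total Lukas measured.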
Would it be wise to split the USNO catalog into smaller parts?
Regards
Akarsh
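On the splitting idea: if knewstuff can fetch several smaller files, the
server-side split and client-side reassembly could be done with standard tools.
A hedged sketch with illustrative file names and sizes (a tiny archive stands in
for the real tarball):

```shell
# Stand-in for the real catalog tarball (names are illustrative).
mkdir -p catalog && echo "USNO entry" > catalog/stars.txt
tar cjf USNO.tar.bz2 catalog
rm -r catalog

# Server side: cut the compressed catalog into fixed-size pieces so no
# single download strains a small /tmp (1k here; much larger in practice).
split -b 1k USNO.tar.bz2 USNO.tar.bz2.part-

# Client side: concatenate the pieces back and unpack in one stream,
# with no intermediate .tar on disk.
cat USNO.tar.bz2.part-* | tar xjf -
```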