Karim BELABAS on Mon, 29 May 2000 19:50:19 +0200 (MET DST)



Re: qfminim bug 3


[Paul van Wamelen:]
> In qfminim with the flag equal to 2 (that is for dealing with
> non-integral coefficients), if you want more vectors returned than can
> fit on the stack, you are more than likely to get a segmentation
> fault:
[...]
> The reason for this seems to be that in line 2656 of basemath/bibli1.c:
>          for (i=s+1; i<=stockmax; i++) S[i]=zero;
> the S array gets filled up with zeros instead of zero vectors.

The latter behaviour is correct [I added a comment at this point in the
code]: S is still an empty placeholder at this point, so one needs to
initialize all of its components with a valid GEN before the call to
gerepile; otherwise it will crash when inspecting this uninitialized data.

A meaningful initialization would indeed be vectors of 0s. But "zero" takes
less space than a vector of zeros (saving roughly a factor of N, where N is
the dimension...). Of course S is not a valid object until it is truncated
to stockmax elements (which is dutifully done before the function returns),
but this is of no concern to the functions that handle it in smallvectors.


The "real" bug was that "norms" was being filled even though it wasn't
needed at all, hence wasn't saved when garbage collection occurred. After
one garbage collection, norms points to some random place; after a few
assignments to norms[s], things crash...

> While trying to track this down I came across something else that seems
> to be a bug. After doing a garbage collection in smallvectors, the stack
> might in fact now be fuller than it was before the garbage collection.
> This is because as we collect vectors lots of components of these vectors
> point to the same actual data. (if two saved vectors differ only in the
> last few entries, the unchanged entries point to the same integers in
> memory). While doing a gerepilemany these vectors get gcopy'ed, which
> means that all components of all vectors now point to unique data. (In
> fact a forcecopy gets done, which also expands all the zeros with which S
> gets filled up). This can take up more space than the garbage that was
> collected.

A simple way out is to store these huge vectors of vectors out of
the stack (in clones). That's exactly what gerepilemany does except it
copies them back on the cleaned up stack then destroys the clones. It's
a bit more cumbersome to do it by hand (one has to make sure the clones are
freed before the function returns), but it prevents the undesirable
behaviour described above.

Note: a useful setting to detect this kind of situation is
default(debugmem, 2) [or \gm2] (I have it on by default). Whenever
significant garbage collection occurs (which ought to be rare if the stack
size is adequately chosen), a Warning is printed. [Also, if the GP binary
was compiled with DEBUGMEM defined (on by default for debugging versions),
it regularly reports on memory usage.]


I corrected the above two problems and incorporated your patches 1 and 3 in
the CVS server. Can you check that everything is fine on your test data?

  Karim.

P.S: I also tried to correct a number of interface bugs in the underlying
private functions (fincke_pohst() and smallvectors()): one of their
arguments is an oracle able to tell whether a given vector is to be kept or
not [useful for polredabs and a few other functions], and this was done in
a very awkward way (various hardcoded behaviours specific to the current
callers' expectations, global variables, etc.).
__
Karim Belabas                    email: Karim.Belabas@math.u-psud.fr
Dep. de Mathematiques, Bat. 425
Universite Paris-Sud             Tel: (00 33) 1 69 15 57 48
F-91405 Orsay (France)           Fax: (00 33) 1 69 15 60 19
--
PARI/GP Home Page: http://www.parigp-home.de/