Karim BELABAS on Wed, 17 Jul 2002 22:09:28 +0200 (MEST)



Re: precision-performance tradeoff


On Mon, 1 Jul 2002, Igor Schein wrote:
> I noticed this:
>
> ? forstep(p=80,160,10,setrand(1);default(realprecision,p);gettime;thueinit(x^7-401);print(p" "gettime))
> 80 16780
> 90 16810
> 100 12750
> 110 12840
> 120 15140
> 130 10230
> 140 3410
> 150 3510
> 160 3590
>
> If you start with a high enough precision, you can get a 5x
> speedup. Is this expected behavior?

Not really. This is due to the (insane) way buchall (aka bnfinit) handles a
precision increase: forget everything and restart from scratch at the
"correct" precision. [Insane since the relations are 1) tough to find,
2) algebraic (they don't depend on the precision), and 3) can be stored
algebraically at a very modest memory cost.]

This is already suboptimal (Guillaume Hanrot was working on a comprehensive
patch for this), but it becomes idiotic once one takes into account that the
process is random-seed dependent: each restart explores a different random
search path, so the "correct" precision might not be correct in the second
(and subsequent) run(s) after all.

While waiting for Guillaume's patch, I have restored the random seed when
doubling the precision in buchall. I now get:

(22:08) gp > forstep(p=80,160,10,setrand(1);default(realprecision,p);gettime;thueinit(x^7-401);print(p" "gettime))
80 6180
90 6170
100 6470
110 6390
120 7600
130 3330
140 3400
150 3470
160 3510

which is more sensible.
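For the curious, the seed-restoring retry pattern can be sketched in a few
lines. This is an illustrative Python toy, not PARI's actual C code:
find_relations and buchall_like are hypothetical stand-ins for the relation
search in buchall, and the threshold precision is made up.

```python
import random

def find_relations(precision, rng):
    # Stand-in for buchall's relation search: it succeeds only once the
    # working precision is high enough, and the path it takes depends on
    # the state of the random number generator.
    draws = [rng.random() for _ in range(5)]
    if precision >= 128:
        return draws      # "relations" found
    return None           # precision loss detected: caller must restart

def buchall_like(start_precision, seed=1):
    # Retry loop: on failure, double the precision but FIRST restore the
    # seed, so the rerun retraces the same random choices instead of
    # gambling on a fresh search path at the new precision.
    precision = start_precision
    while True:
        rng = random.Random(seed)   # restore the seed before each run
        result = find_relations(precision, rng)
        if result is not None:
            return precision, result
        precision *= 2

# With the seed restored, every run draws identical random numbers, so
# the higher-precision rerun repeats the earlier (known-good) search path.
p, rels = buchall_like(64)
```

Without the `rng = random.Random(seed)` line inside the loop, the second run
would draw different random numbers, which is exactly the behavior that made
the "correct" precision unreliable across restarts.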

    Karim.
-- 
Karim Belabas                    Tel: (+33) (0)1 69 15 57 48
Dép. de Mathematiques, Bat. 425  Fax: (+33) (0)1 69 15 60 19
Université Paris-Sud             Email: Karim.Belabas@math.u-psud.fr
F-91405 Orsay (France)           http://www.math.u-psud.fr/~belabas/
--
PARI/GP Home Page: http://www.parigp-home.de/