15 Mar 2013 21:07

## Optimization flag changing result of code execution

I was trying to solve a computational problem from James P. Sethna's book Statistical Mechanics: Entropy, Order Parameters, and Complexity[1]. The problem is on page 19 of the linked PDF and is titled "Six degrees of separation". For it I came up with this code: http://hpaste.org/84114

It runs fine when compiled with -O0 and consistently yields an answer around 10, but with -O1 and -O2 it consistently gives an answer around 25. Can somebody explain what is happening here?

[1] http://pages.physics.cornell.edu/~sethna/StatMech/EntropyOrderParametersComplexity.pdf

Azeem

15 Mar 2013 22:09

### Re: Optimization flag changing result of code execution

Hey Azeem,
have you tried running the same calculation using rationals? There are some subtleties to writing numerically stable code using floats and doubles, where simple optimizations change the order of operations in ways that *significantly* change the result. In this case it looks like you're averaging the averages, which I *believe* can get pretty nasty in terms of numerical precision. Rationals would be a bit slower, but you could then sort out which number is more correct.
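Carter's point about reordering is easy to demonstrate in isolation (a standalone sketch, not Azeem's code): `Float` addition is not associative, so a transformation that reorders a sum can legitimately change the answer.

```haskell
import Data.List (foldl')

xs :: [Float]
xs = 1.0e8 : replicate 10000 1.0  -- one huge value followed by many tiny ones

sumForward, sumBackward :: Float
sumForward  = foldl' (+) 0 xs            -- big value first: each 1.0 is absorbed
sumBackward = foldl' (+) 0 (reverse xs)  -- tiny values first: they accumulate

main :: IO ()
main = do
  print sumForward   -- 1.0e8
  print sumBackward  -- 1.0001e8
```

Exact `Rational` arithmetic (via `toRational` on the inputs) would give 1.0001e8 regardless of order, which is Carter's way of deciding which floating-point answer to trust.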


16 Mar 2013 09:46

### Re: Optimization flag changing result of code execution

Hi Carter,

Thank you for your help, but I can confirm that this is not due to floating-point errors. My own hunch is that it is due to the way I am using random number generation from System.Random.MWC. To check this I wrote a version of the mkNetwork function using random number generation from System.Random, and it works fine with optimizations turned on. So, any ideas why optimizations are messing with System.Random.MWC?

Azeem

16 Mar 2013 09:58

### Re: Optimization flag changing result of code execution

Perhaps the problem is in withSystemRandom, which uses unsafePerformIO?

Does the problem persist if you seed your program with some predefined seed?

Roman
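Roman's suggestion can be sketched as follows (a standalone example, not Azeem's program, and assuming mwc-random's usual API): replace `withSystemRandom` with a fixed seed so every run is reproducible.

```haskell
import Data.Word (Word32)
import qualified Data.Vector.Unboxed as U
import qualified System.Random.MWC as R

main :: IO ()
main = do
  -- create uses mwc-random's built-in fixed seed
  gen <- R.create
  x <- R.uniformR (0, 99 :: Int) gen
  -- initialize takes an explicit seed vector of Word32s instead
  gen' <- R.initialize (U.fromList ([1, 2, 3] :: [Word32]))
  y <- R.uniformR (0, 99 :: Int) gen'
  -- with fixed seeds, (x, y) is identical on every run,
  -- so -O0 and -O2 builds can be compared directly
  print (x, y)
```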
16 Mar 2013 10:31

### Re: Optimization flag changing result of code execution

Nope, that isn't the case either. Even if I make use of defaultSeed through create, the problem still remains. The problem seems to be in the generation of a vector of (a,a), i.e. in this part on line 16:

```haskell
V.generateM ((round $ p * (fromIntegral $ l * z)) `div` 2) (\i -> R.uniformR ((0,0), (l-1,l-1)) gen)
```

Thanks again.

Azeem
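The suspect expression can be isolated in a small harness (standalone, with made-up values for `p`, `l`, and `z`; not Azeem's actual program), which makes it easy to diff the generated pairs between optimization levels:

```haskell
import qualified Data.Vector as V
import qualified System.Random.MWC as R

main :: IO ()
main = do
  gen <- R.create  -- fixed seed, so runs are directly comparable
  let l = 100 :: Int
      z = 4
      p = 0.5 :: Double
      n = round (p * fromIntegral (l * z)) `div` 2
  -- generate n random index pairs, as on line 16 of the paste;
  -- mwc-random provides a Variate instance for pairs
  pairs <- V.generateM n (\_ -> R.uniformR ((0, 0), (l - 1, l - 1)) gen)
  print (V.length pairs, V.take 3 (pairs :: V.Vector (Int, Int)))
```

If builds with -O0 and -O2 print different pairs here, the problem is indeed in this part and not in the averaging.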

17 Mar 2013 10:21

### Re: Optimization flag changing result of code execution

I've tried to run your program and I've got approximately the same results regardless of optimization level. Which versions of GHC, mwc-random, vector and primitive do you use?
17 Mar 2013 18:49

### Re: Optimization flag changing result of code execution

By "approximately the same" do you mean you are getting Monte Carlo noise or floating-point noise? If the latter, that's reasonable; if the former, that's worrying.

Dominic.
18 Mar 2013 09:37

### Re: Optimization flag changing result of code execution

Difficult to say. I got values around 10 with and without optimizations, so most likely it's MC noise.

I was using GHC 7.6.2 and the latest vector/primitive/mwc-random. I didn't try to reproduce the bug with the versions Azeem ul Hasan uses.
