Daryoush Mehrtash | 13 Feb 23:06 2013

How to quantify the overhead?

Before I write the code I'd like to be able to quantify my expected result. But I am having a hard time quantifying my expected results for the alternative approaches in Haskell. I would appreciate any comments from experienced Haskellers on this problem:


Suppose I have a big list of integers and I'd like to find the first two that add up to a number, say 10.

One approach is to put the numbers in a map as I read them, and at each step look for 10 - x of the current number before going on to the next value in the list.
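A minimal sketch of this first idea, using Data.Set to remember the values seen so far (the function name `findPairSumming` is mine, not from the thread; a Set suffices here since only membership is needed):

```haskell
import qualified Data.Set as Set

-- Return the first pair (x, y) with x + y == target, where x appears
-- earlier in the list than y; Nothing if no such pair exists.
findPairSumming :: Int -> [Int] -> Maybe (Int, Int)
findPairSumming target = go Set.empty
  where
    go _ [] = Nothing
    go seen (y:ys)
      | (target - y) `Set.member` seen = Just (target - y, y)
      | otherwise                      = go (Set.insert y seen) ys
```

Each element costs one lookup and one insert, both O(log n) on a balanced-tree Set, which is where the O(n log n) worst case mentioned later in the thread comes from.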

Alternatively, I am trying to see if it makes sense to convert the list to a series of computations, each looking for 10 - x in the rest of the list (running them in breadth-first search). I'd like to find out whether it is sane to use the Applicative/Alternative class's (<|>) operator to set up the computation. So each element of the list (say x) is recursively converted to a computation SearchFor (10 - x) that is (<|>)-ed with all previous computations, until one of the computations returns the pair that adds up to 10 or all computations reach the end of the list.
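One way this second idea could be sketched, using Maybe's Alternative instance (the name `findPairAlt` is mine; note this runs the per-element searches sequentially, left-biased, rather than truly breadth-first):

```haskell
import Control.Applicative ((<|>))
import Data.List (find)

-- Each element x spawns a search for (target - x) in the remainder of
-- the list; the alternatives are combined left-to-right with (<|>).
findPairAlt :: Int -> [Int] -> Maybe (Int, Int)
findPairAlt _ [] = Nothing
findPairAlt target (x:xs) =
      ((,) x <$> find (== target - x) xs)  -- search for target - x ahead of x
  <|> findPairAlt target xs                -- otherwise try pairs starting later
```

Since each element scans the rest of the list, this does O(n) work per element, O(n^2) in total when no match exists.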

Does the approach make any sense?

What kind of overhead should I expect if I convert the numbers to a computation on the list? In one case I would have a number in a map, and in the other a number in a computation over a list. How do I quantify the overheads?


Thanks,
 
Daryoush


_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe
Simon Marechal | 15 Feb 13:15 2013

Re: How to quantify the overhead?

On 13/02/2013 23:06, Daryoush Mehrtash wrote:
> Suppose I have a big list of integers and I'd like to find the first
> two that add up to a number, say 10.
> 
> One approach is to put the numbers in a map as I read them, and at
> each step look for 10 - x of the current number before going on to
> the next value in the list.
> 
> Alternatively, I am trying to see if it makes sense to convert the
> list to a series of computations, each looking for 10 - x in the rest
> of the list (running them in breadth-first search). I'd like to find
> out whether it is sane to use the Applicative/Alternative class's
> (<|>) operator to set up the computation. So each element of the list
> (say x) is recursively converted to a computation SearchFor (10 - x)
> that is (<|>)-ed with all previous computations, until one of the
> computations returns the pair that adds up to 10 or all computations
> reach the end of the list.
> 
> Does the approach make any sense?
> 
> What kind of overhead should I expect if I convert the numbers to a
> computation on the list? In one case I would have a number in a map,
> and in the other a number in a computation over a list. How do I
> quantify the overheads?

I am not sure why nobody has answered this, so I will try, even though I am not at all knowledgeable about algorithm complexity analysis, and my whole mail might be wrong.

The worst-case running time of both algorithms (no match is found) is O(n log n) for the first one and O(n^2) for the second. The actual asymptotics are a (probably complicated) function of the distribution of values in your list and of the target value.

This can't be quantified as "overhead", because the two algorithms don't do the same thing at all and will behave in vastly different ways.
