23 Apr 11:44 2013

## Automated Differentiation Type Question

Dominic Steinitz <dominic <at> steinitz.org>

2013-04-23 09:44:09 GMT


Can anyone tell me why I get a type error with testGrad2? What are my options? Clearly I would like to be able to find the gradient of my cost function for different sets of observations.

Thanks, Dominic.

```haskell
{-# LANGUAGE NoMonomorphismRestriction #-}

import Numeric.AD

default ()

costFn :: Floating a => [a] -> [[a]] -> [a] -> a
costFn ys xss thetas = (/ (2 * m)) $ sum $ map (^ (2 :: Int)) $
                       zipWith (\y xs -> costFnAux y xs thetas) ys xss
  where
    m = fromIntegral $ length xss

costFnAux :: Floating a => a -> [a] -> [a] -> a
costFnAux y xs thetas = y - head thetas - sum (zipWith (*) xs (tail thetas))

ys :: Floating a => [a]
ys = [1.0, 2.0, 3.0]

xss :: Floating a => [[a]]
xss = [[1.0], [2.0], [3.0]]

thetas :: Floating a => [a]
thetas = [0.0, 1.0]

test :: Floating a => a
test = costFn ys xss thetas
```
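The definition of testGrad2 is cut off in the archive, but the type error described is the one that typically arises when observations are handed to `grad` at a concrete type such as `[Double]`: `grad` requires its function argument to be polymorphic over the AD mode (a rank-2 type), so monomorphic constants have to be lifted with `auto`. A sketch of both workarounds, continuing from the definitions above and assuming the usual `ad`-package API (`grad` and `auto` from `Numeric.AD`); the names `testGrad` and `testGrad2'` are mine, not from the original post:

```haskell
-- Option 1: keep the observations polymorphic, as ys and xss are above,
-- and partially apply costFn before passing it to grad. Everything stays
-- overloaded, so it instantiates at the AD type without trouble.
testGrad :: [Double]
testGrad = grad (costFn ys xss) thetas
-- At thetas = [0.0, 1.0] every residual y - theta0 - x*theta1 is zero,
-- so this gradient is [0.0, 0.0].

-- Option 2: accept concrete Double observations and lift them into the
-- AD mode with auto inside the function given to grad.
testGrad2' :: [Double] -> [[Double]] -> [Double] -> [Double]
testGrad2' ys' xss' =
  grad (\ts -> costFn (map auto ys') (map (map auto) xss') ts)
```

Option 2 is the one that lets you compute the gradient for different sets of observations supplied at runtime, since the lifting with `auto` happens inside the rank-2 function argument.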