backprop-learn-0.1.0.0: Combinators and useful tools for ANNs using the backprop library

Safe Haskell: None
Language: Haskell2010

Backprop.Learn.Model.Parameter


Documentation

deParam

Arguments

    :: (Backprop p, Backprop q)
    => (pq -> (p, q))                                       split
    -> (p -> q -> pq)                                       join
    -> q                                                    fixed param
    -> (forall m. PrimMonad m => Gen (PrimState m) -> m q)  fixed stoch param
    -> Model (Just pq) s a b
    -> Model (Just p) s a b

Fix part of a parameter as a constant, preventing backpropagation through it so that it is not trained.

Treats a pq parameter as essentially a (p, q), witnessed by the split and join functions.

Takes the fixed value of q, as well as a stochastic-mode version with a fixed distribution.
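
For illustration, here is a minimal sketch of fixing the second half of a tuple-shaped parameter. The wrapped model and the assumption that its parameter is literally a pair (p, q) are hypothetical, as is the import location of the Model type; the stochastic-mode argument simply returns the same constant.

{-# LANGUAGE DataKinds #-}

import Backprop.Learn.Model            (Model)    -- assumed home of the Model type
import Backprop.Learn.Model.Parameter  (deParam)
import Numeric.Backprop                (Backprop)

-- Freeze the q component of a (p, q)-shaped parameter at a constant.
-- Since the parameter really is a pair here, split and join are just
-- id and (,); the sampler for stochastic mode ignores its generator.
fixSecond
    :: (Backprop p, Backprop q)
    => q                            -- value to freeze the q component at
    -> Model ('Just (p, q)) s a b
    -> Model ('Just p) s a b
fixSecond q0 = deParam id (,) q0 (\_g -> pure q0)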

deParamD

Arguments

    :: (Backprop p, Backprop q)
    => (pq -> (p, q))   split
    -> (p -> q -> pq)   join
    -> q                fixed param
    -> Model (Just pq) s a b
    -> Model (Just p) s a b

deParam, but with no special stochastic-mode version: the same fixed value is used in both deterministic and stochastic modes.
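
Continuing the sketch above (same imports and assumptions), the deterministic-only variant just drops the sampler:

fixSecondD
    :: (Backprop p, Backprop q)
    => q
    -> Model ('Just (p, q)) s a b
    -> Model ('Just p) s a b
fixSecondD q0 = deParamD id (,) q0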

reParam
    :: (forall z. Reifies z W => PMaybe (BVar z) q -> PMaybe (BVar z) p)
    -> (forall m z. (PrimMonad m, Reifies z W) => Gen (PrimState m) -> PMaybe (BVar z) q -> m (PMaybe (BVar z) p))
    -> Model p s a b
    -> Model q s a b

Pre-applies a function to a parameter before a model sees it. Essentially something like lmap for parameters.

Takes a deterministic function, as well as a stochastic function used in stochastic mode.
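
As a hedged sketch of a possible use: train a scalar parameter on a log scale, and jitter it slightly in stochastic mode. The PJust constructor of PMaybe (as defined in the functor-products package) and the import locations are assumptions here; everything else continues the imports from the first sketch.

import Data.Type.Functor.Product (PMaybe(..))  -- assumed source of PMaybe/PJust
import Numeric.Backprop          (constVar)
import System.Random.MWC         (uniformR)

-- The model always sees exp of the stored value, so gradients flow back
-- through exp; in stochastic mode the effective parameter is additionally
-- scaled by a small random factor drawn from the generator.
logScale
    :: Model ('Just Double) s a b
    -> Model ('Just Double) s a b
logScale = reParam
    (\(PJust logP) -> PJust (exp logP))
    (\g (PJust logP) -> do
        eps <- uniformR (0.9, 1.1) g
        pure (PJust (exp logP * constVar eps)))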

reParamD
    :: (forall z. Reifies z W => PMaybe (BVar z) q -> PMaybe (BVar z) p)
    -> Model p s a b
    -> Model q s a b

reParam, but with no special stochastic-mode function: the deterministic re-parameterization is used in both modes.
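
The deterministic-only counterpart of the sketch above:

logScaleD
    :: Model ('Just Double) s a b
    -> Model ('Just Double) s a b
logScaleD = reParamD (\(PJust logP) -> PJust (exp logP))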

dummyParam :: Model Nothing s a b -> Model p s a b

Give an unparameterized model a "dummy" parameter. Useful with combinators, like (.), that require all input models to share a common parameterization.
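
As a hedged sketch of the type-level effect (the combinator that would consume the pair is left out, and matchParams is just an illustrative name): an unparameterized model can be lifted so that its parameter index unifies with that of whatever parameterized model it is combined with.

matchParams
    :: Model 'Nothing   s a b   -- unparameterized model
    -> Model ('Just p)  s b c   -- parameterized model it will be combined with
    -> (Model ('Just p) s a b, Model ('Just p) s b c)
matchParams m1 m2 = (dummyParam m1, m2)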