breeze.optimize.AdaptiveGradientDescent

L1Regularization

class L1Regularization[T] extends StochasticGradientDescent[T]

Implements the L1-regularized adaptive gradient descent update: each coordinate uses its own learning rate, scaled by the accumulated squared gradients, followed by soft thresholding.

Each step is:

x_{t+1,i} = sign(x_{t,i} - (eta / s_{t,i}) * g_{t,i}) * max(|x_{t,i} - (eta / s_{t,i}) * g_{t,i}| - lambda * eta / s_{t,i}, 0)

where g_{t,i} is the i-th component of the gradient at iteration t and s_{t,i} = sqrt(sum_{t' <= t} g_{t',i}^2).
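
As a rough illustration, the following self-contained Scala sketch performs this per-coordinate update on plain arrays rather than the Breeze vector types the class actually uses. The function name, the in-place update of sumSqGrad (which plays the role of History.sumOfSquaredGradients), and the exact placement of delta in the denominator are illustrative assumptions, not the library's implementation:

    // Soft-thresholded adaptive-gradient step, one coordinate at a time.
    def l1AdaGradStep(
        x: Array[Double],          // current iterate x_t
        grad: Array[Double],       // gradient g_t
        sumSqGrad: Array[Double],  // running sum of squared gradients, updated in place
        eta: Double,
        lambda: Double,
        delta: Double): Array[Double] =
      Array.tabulate(x.length) { i =>
        sumSqGrad(i) += grad(i) * grad(i)          // accumulate g_{t,i}^2
        val s = math.sqrt(sumSqGrad(i)) + delta    // s_{t,i}, smoothed to avoid division by zero
        val shifted = x(i) - (eta / s) * grad(i)   // per-coordinate gradient step
        // soft-threshold by lambda * eta / s_{t,i}: the (...)_+ truncation in the formula above
        math.signum(shifted) * math.max(math.abs(shifted) - lambda * eta / s, 0.0)
      }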

Linear Supertypes

StochasticGradientDescent[T], FirstOrderMinimizer[T, StochasticDiffFunction[T]], Logging, Minimizer[T, StochasticDiffFunction[T]], AnyRef, Any

Instance Constructors

  1. new L1Regularization(lambda: Double = 1.0, delta: Double = 1.0E-5, eta: Double = 4, maxIter: Int = 100)(implicit vspace: MutableCoordinateSpace[T, Double])
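
    A hedged usage sketch (not taken from the Breeze documentation): it minimizes a simple quadratic plus an L1 penalty. The objective f is a made-up example, and it is assumed that the implicit MutableCoordinateSpace for DenseVector[Double] is available in implicit scope; otherwise it has to be supplied explicitly.

        import breeze.linalg.DenseVector
        import breeze.optimize.{AdaptiveGradientDescent, StochasticDiffFunction}

        // f(x) = ||x - 3||^2, with gradient 2 * (x - 3)
        val target = DenseVector.fill(5)(3.0)
        val f = new StochasticDiffFunction[DenseVector[Double]] {
          def calculate(x: DenseVector[Double]): (Double, DenseVector[Double]) = {
            val resid = x - target
            (resid dot resid, resid * 2.0)
          }
        }

        val optimizer =
          new AdaptiveGradientDescent.L1Regularization[DenseVector[Double]](
            lambda = 0.5, eta = 4.0, maxIter = 200)

        // Approximate minimizer of f(x) + 0.5 * ||x||_1
        val xmin = optimizer.minimize(f, DenseVector.zeros[Double](5))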

Type Members

  1. case class History(sumOfSquaredGradients: T) extends Product with Serializable

    Definition Classes
    L1Regularization → FirstOrderMinimizer
  2. case class State(x: T, value: Double, grad: T, adjustedValue: Double, adjustedGradient: T, iter: Int, initialAdjVal: Double, history: History, fVals: IndexedSeq[Double] = ..., numImprovementFailures: Int = 0, searchFailed: Boolean = false) extends Product with Serializable

    Definition Classes
    FirstOrderMinimizer

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def adjust(newX: T, newGrad: T, newVal: Double): (Double, T)

    Attributes
    protected
    Definition Classes
    L1Regularization → FirstOrderMinimizer
  7. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  8. def calculateObjective(f: StochasticDiffFunction[T], x: T, history: History): (Double, T)

    Attributes
    protected
    Definition Classes
    FirstOrderMinimizer
  9. def chooseDescentDirection(state: State, fn: StochasticDiffFunction[T]): T

    Attributes
    protected
    Definition Classes
    StochasticGradientDescent → FirstOrderMinimizer
  10. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  11. val defaultStepSize: Double

    Definition Classes
    StochasticGradientDescent
  12. def determineStepSize(state: State, f: StochasticDiffFunction[T], dir: T): Double

    Choose a step size scale for this iteration.

    Default is eta / math.pow(state.iter + 1, 2.0 / 3.0).

    Definition Classes
    L1Regularization → StochasticGradientDescent → FirstOrderMinimizer
  13. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  14. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  15. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  16. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  17. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  18. def initialHistory(f: StochasticDiffFunction[T], init: T): History

    Definition Classes
    L1Regularization → FirstOrderMinimizer
  19. def initialState(f: StochasticDiffFunction[T], init: T): State

    Attributes
    protected
    Definition Classes
    FirstOrderMinimizer
  20. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  21. def iteratingShouldStop(state: State): Boolean

    Definition Classes
    FirstOrderMinimizer
  22. def iterations(f: StochasticDiffFunction[T], init: T): Iterator[State]

    Definition Classes
    FirstOrderMinimizer
  23. val lambda: Double

  24. lazy val logger: Logger

    Attributes
    protected
    Definition Classes
    Logging
  25. val minImprovementWindow: Int

    Definition Classes
    FirstOrderMinimizer
  26. def minimize(f: StochasticDiffFunction[T], init: T): T

    Definition Classes
    FirstOrderMinimizer → Minimizer
  27. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  28. final def notify(): Unit

    Definition Classes
    AnyRef
  29. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  30. val numberOfImprovementFailures: Int

    Definition Classes
    FirstOrderMinimizer
  31. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  32. def takeStep(state: State, dir: T, stepSize: Double): T

    Projects the vector x onto whatever ball is needed; it can also incorporate regularization.

    The default implementation simply takes a step of the given size in the descent direction.

    Attributes
    protected
    Definition Classes
    L1Regularization → StochasticGradientDescent → FirstOrderMinimizer
  33. def toString(): String

    Definition Classes
    AnyRef → Any
  34. def updateFValWindow(oldState: State, newAdjVal: Double): IndexedSeq[Double]

    Attributes
    protected
    Definition Classes
    StochasticGradientDescent → FirstOrderMinimizer
  35. def updateHistory(newX: T, newGrad: T, newValue: Double, f: StochasticDiffFunction[T], oldState: State): History

    Definition Classes
    L1Regularization → FirstOrderMinimizer
  36. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  37. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  38. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from StochasticGradientDescent[T]

Inherited from FirstOrderMinimizer[T, StochasticDiffFunction[T]]

Inherited from Logging

Inherited from Minimizer[T, StochasticDiffFunction[T]]

Inherited from AnyRef

Inherited from Any
