Minimization::NewtonRaphson

Classic Newton-Raphson minimization method. Requires the first and second derivatives of the function.

Usage

  f   = lambda {|x| x**2}
  fd  = lambda {|x| 2*x}
  fdd = lambda {|x| 2}
  min = Minimization::NewtonRaphson.new(-1000,1000, f,fd,fdd)
  min.iterate
  min.x_minimum
  min.f_minimum
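Each Newton-Raphson step moves x to x - f'(x)/f''(x). As a minimal self-contained sketch (independent of this gem, using the same f, fd and fdd as above), the update rule alone is enough to find the minimum of x**2:

```ruby
# Newton-Raphson update rule, sketched without the gem:
# x <- x - f'(x) / f''(x)
fd  = lambda { |x| 2 * x }   # f(x) = x**2, so f'(x) = 2x
fdd = lambda { |x| 2 }       # f''(x) = 2

x = 5.0
10.times { x -= fd.call(x) / fdd.call(x) }
puts x   # => 0.0 (for a quadratic, one step lands exactly on the minimum)
```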

Public Class Methods

minimize(*args) click to toggle source

Raises an error; this class must be built with #new and run with #iterate instead.

# File lib/minimization.rb, line 112
    def self.minimize(*args)
      raise "You should use #new and #iterate"
    end
new(lower, upper, proc, proc_1d, proc_2d) click to toggle source

Parameters:

  • lower: Lower possible value
  • upper: Upper possible value
  • proc: Original function
  • proc_1d: First derivative
  • proc_2d: Second derivative
# File lib/minimization.rb, line 106
    def initialize(lower, upper, proc, proc_1d, proc_2d)
      super(lower,upper,proc)
      @proc_1d=proc_1d
      @proc_2d=proc_2d
    end

Public Instance Methods

iterate() click to toggle source

Runs Newton-Raphson iterations until the change in x falls below epsilon, recording each step in the log. Raises FailedIteration if max_iteration is reached without converging.

# File lib/minimization.rb, line 115
    def iterate
      # Start from the lower bound; @expected holds the initial guess
      x_prev = @lower
      x = @expected
      k = 0
      while (x - x_prev).abs > @epsilon and k < @max_iteration
        k += 1
        x_prev = x
        # Newton-Raphson update: x <- x - f'(x) / f''(x)
        x = x - (@proc_1d.call(x).quo(@proc_2d.call(x)))
        f_prev = f(x_prev)
        f = f(x)
        x_min, x_max = [x, x_prev].min, [x, x_prev].max
        f_min, f_max = [f, f_prev].min, [f, f_prev].max
        @log << [k, x_min, x_max, f_min, f_max, (x_prev - x).abs, (f - f_prev).abs]
      end
      raise FailedIteration, "Not converged" if k >= @max_iteration
      @x_minimum = x
      @f_minimum = f(x)
    end
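The loop above terminates when two successive iterates differ by less than epsilon, or fails after max_iteration steps. A hedged standalone sketch of that same convergence logic, applied to f(x) = x**2 + e**x (the epsilon and max_iteration values here are illustrative, not the gem's defaults):

```ruby
# Standalone sketch of the convergence loop in #iterate.
# epsilon and max_iteration are assumed illustrative values.
fd  = lambda { |x| 2 * x + Math.exp(x) }   # f'(x) for f(x) = x**2 + e**x
fdd = lambda { |x| 2 + Math.exp(x) }       # f''(x)

epsilon, max_iteration = 1e-9, 100
x_prev, x, k = Float::INFINITY, 0.0, 0
while (x - x_prev).abs > epsilon && k < max_iteration
  k += 1
  x_prev = x
  x = x - fd.call(x) / fdd.call(x)
end
puts x.round(4)   # => -0.3517, the root of f'(x) = 2x + e**x
```

Unlike the quadratic case, this needs a few iterations, since f' is not linear near the minimum.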


Generated with the Darkfish Rdoc Generator 1.1.6.