Statsample::Bivariate::Polychoric

Polychoric correlation.

The polychoric correlation is a measure of bivariate association that arises when both observed variates are ordered categorical variables resulting from polychotomizing two underlying continuous variables (Drasgow, 2006).

According to Drasgow (2006), there are three methods to estimate the polychoric correlation: joint ML estimation, two-step ML estimation and the polychoric series estimate. You can select the estimation method with the method attribute.

ML Joint Estimation

Requires the gsl library and gsl gem. Joint estimation uses a derivative-based algorithm by default, based on Olsson (1979). A derivative-free algorithm, compute_one_step_mle_without_derivatives(), is also available, based loosely on the algorithm of J. Fox's R package ‘polycor’.

Two-step Estimation

The default method. Uses a derivative-free approach, based on J. Fox's R package ‘polycor’.

Polychoric Series Estimate

Warning: results diverge considerably from the joint and two-step calculations.

Requires the gsl library and gsl gem. Based on the algorithm of Martinson and Hamdan (1975).

Use

You should enter a Matrix with ordered data. For example:

        -------------------
        | y=0 | y=1 | y=2 | 
        -------------------
  x = 0 |  1  |  10 | 20  |
        -------------------
  x = 1 |  20 |  20 | 50  |
        -------------------

The code would be:

  matrix=Matrix[[1,10,20],[20,20,50]]
  poly=Statsample::Bivariate::Polychoric.new(matrix, :method=>:joint)
  puts poly.r

See Uebersax (2002) and Drasgow (2006) for extensive documentation.

Constants

METHOD

Default method

MAX_ITERATIONS

Maximum number of iterations

EPSILON

Default absolute error for iteration

MINIMIZER_TYPE_TWO_STEP

GSL unidimensional minimizer

MINIMIZER_TYPE_JOINT_DERIVATIVE

GSL multidimensional minimizer, derivative based

MINIMIZER_TYPE_JOINT_NO_DERIVATIVE

GSL multidimensional minimizer, non derivative based

Attributes

name[RW]

Name of the analysis

max_iterations[RW]

Maximum number of iterations used by iterative methods. Defaults to MAX_ITERATIONS.

debug[RW]

If true, print debug information for the algorithm (iterations, for example).

minimizer_type_two_step[RW]

Minimizer type for two-step estimation. Default: “brent”. See rb-gsl.rubyforge.org/min.html for reference.

minimizer_type_joint_no_derivative[RW]

Minimizer type for joint estimate, no derivative. Default “nmsimplex”. See rb-gsl.rubyforge.org/min.html for reference.

minimizer_type_joint_derivative[RW]

Minimizer type for joint estimate, using derivative. Default: “conjugate_pr”. See rb-gsl.rubyforge.org/min.html for reference.

method[RW]

Method used to compute the polychoric correlation. :two_step is used by default.

:two_step

two-step ML, based on code by J.Fox

:polychoric_series

polychoric series estimate, using algorithm AS87 by Martinson and Hamdan (1975).

:joint

one-step ML, using derivatives, based on Olsson (1979)

epsilon[RW]

Absolute error for iteration.

iteration[R]

Number of iterations

log[R]

Log of algorithm

loglike_model[R]

Log likelihood of the model

r[R]

Returns the polychoric correlation

alpha[R]

Returns the row thresholds

beta[R]

Returns the column thresholds

Public Class Methods

new(matrix, opts=Hash.new)

Params:

  • matrix: Contingency table

  • opts: Hash with options; any accessible attribute of the object can be set

     # File lib/statsample/bivariate/polychoric.rb, line 166
      def initialize(matrix, opts=Hash.new)
        @matrix=matrix
        @n=matrix.column_size
        @m=matrix.row_size
        raise "row size <=1" if @m<=1
        raise "column size <=1" if @n<=1

        @method=METHOD
        @name=_("Polychoric correlation")
        @max_iterations=MAX_ITERATIONS
        @epsilon=EPSILON
        @minimizer_type_two_step=MINIMIZER_TYPE_TWO_STEP
        @minimizer_type_joint_no_derivative=MINIMIZER_TYPE_JOINT_NO_DERIVATIVE
        @minimizer_type_joint_derivative=MINIMIZER_TYPE_JOINT_DERIVATIVE

        @debug=false
        @iteration=nil
        opts.each{|k,v|
          self.send("#{k}=",v) if self.respond_to? k
        }
        @r=nil
        @pd=nil
        compute_basic_parameters
      end
new_with_vectors(v1,v2)

Create a Polychoric object, based on two vectors

     # File lib/statsample/bivariate/polychoric.rb, line 159
      def self.new_with_vectors(v1,v2)
        Polychoric.new(Crosstab.new(v1,v2).to_matrix)
      end
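Conceptually, the crosstabulation step counts co-occurrences of each pair of ordered categories. A plain-Ruby sketch of the kind of matrix Crosstab#to_matrix produces (crosstab_matrix is a hypothetical helper for illustration, not the library's implementation):

```ruby
require 'matrix'

# Build a contingency table from two parallel vectors of ordered categories.
# Hypothetical helper for illustration; the library uses Statsample::Crosstab.
def crosstab_matrix(v1, v2)
  rows = v1.uniq.sort
  cols = v2.uniq.sort
  Matrix.build(rows.size, cols.size) do |i, j|
    # Count observations where the pair (v1, v2) falls in cell (i, j)
    v1.zip(v2).count { |a, b| a == rows[i] && b == cols[j] }
  end
end

v1 = [0, 0, 1, 1, 1, 0]
v2 = [0, 1, 1, 2, 2, 2]
m = crosstab_matrix(v1, v2)
# m == Matrix[[1, 1, 1], [0, 1, 2]]
```

The resulting matrix can be passed directly to Polychoric.new, which is what new_with_vectors does internally.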

Public Instance Methods

chi_square()

Chi Square of model

     # File lib/statsample/bivariate/polychoric.rb, line 227
      def chi_square
        if @loglike_model.nil?
          compute
        end
        2*(@loglike_model-loglike_data)
      end
chi_square_df()
     # File lib/statsample/bivariate/polychoric.rb, line 234
      def chi_square_df
        (@nr*@nc)-@nc-@nr
      end
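For the 2×3 contingency table from the Use section, the degrees-of-freedom formula works out as:

```ruby
# Degrees of freedom for the polychoric model: (nr*nc) - nc - nr,
# shown here for a 2-row, 3-column contingency table.
nr, nc = 2, 3
df = (nr * nc) - nc - nr
# df == 1
```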
compute()

Start the computation of polychoric correlation based on attribute method.

     # File lib/statsample/bivariate/polychoric.rb, line 198
      def compute
        if @method==:two_step
          compute_two_step_mle
        elsif @method==:joint
          compute_one_step_mle
        elsif @method==:polychoric_series
          compute_polychoric_series
        else
          raise "Not implemented"
        end
      end
compute_basic_parameters()
     # File lib/statsample/bivariate/polychoric.rb, line 241
      def compute_basic_parameters
        @nr=@matrix.row_size
        @nc=@matrix.column_size
        @sumr=[0]*@matrix.row_size
        @sumrac=[0]*@matrix.row_size
        @sumc=[0]*@matrix.column_size
        @sumcac=[0]*@matrix.column_size
        @alpha=[0]*(@nr-1)
        @beta=[0]*(@nc-1)
        @total=0
        @nr.times do |i|
          @nc.times do |j|
            @sumr[i]+=@matrix[i,j]
            @sumc[j]+=@matrix[i,j]
            @total+=@matrix[i,j]
          end
        end
        ac=0
        (@nr-1).times do |i|
          @sumrac[i]=@sumr[i]+ac
          @alpha[i]=Distribution::Normal.p_value(@sumrac[i] / @total.to_f)
          ac=@sumrac[i]
        end
        ac=0
        (@nc-1).times do |i|
          @sumcac[i]=@sumc[i]+ac
          @beta[i]=Distribution::Normal.p_value(@sumcac[i] / @total.to_f)
          ac=@sumcac[i]
        end
      end
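compute_basic_parameters derives the initial thresholds from the marginal frequencies: each alpha[i] is the inverse normal CDF of the cumulative row proportion, and each beta[j] likewise for columns. A self-contained sketch of the cumulative-proportion step for the example matrix from the Use section (Distribution::Normal.p_value would then be applied to each proportion to obtain the thresholds):

```ruby
require 'matrix'

matrix = Matrix[[1, 10, 20], [20, 20, 50]]
total  = matrix.to_a.flatten.sum.to_f                      # 121.0

row_sums = matrix.row_vectors.map { |r| r.to_a.sum }       # [31, 90]
col_sums = matrix.column_vectors.map { |c| c.to_a.sum }    # [21, 30, 70]

# Cumulative proportions for all but the last category; these are the
# points at which the inverse normal CDF gives the thresholds.
cum = 0
row_props = col = nil
row_props = row_sums[0..-2].map { |s| (cum += s) / total } # [31/121]
cum = 0
col_props = col_sums[0..-2].map { |s| (cum += s) / total } # [21/121, 51/121]
```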
compute_one_step_mle()

Compute joint ML estimation. Uses compute_one_step_mle_with_derivatives() by default.

     # File lib/statsample/bivariate/polychoric.rb, line 378
      def compute_one_step_mle
        compute_one_step_mle_with_derivatives
      end
compute_one_step_mle_with_derivatives()

Compute the polychoric correlation with joint estimation, using a derivative-based minimization method.

Much faster than the derivative-free method.

     # File lib/statsample/bivariate/polychoric.rb, line 386
      def compute_one_step_mle_with_derivatives
        # Get initial values with the two-step approach
        compute_two_step_mle
        # Start iteration with past values
        rho=@r
        cut_alpha=@alpha
        cut_beta=@beta
        parameters=[rho]+cut_alpha+cut_beta
        np=@nc-1+@nr

        loglike_f = Proc.new { |v, params|
          new_rho=v[0]
          new_alpha=v[1, @nr-1]
          new_beta=v[@nr, @nc-1]
          pr=Processor.new(new_alpha,new_beta,new_rho,@matrix)
          pr.loglike
        }

        loglike_df = Proc.new {|v, params, df|
          compute_derivatives_vector(v,df)
        }

        my_func = GSL::MultiMin::Function_fdf.alloc(loglike_f, loglike_df, np)
        my_func.set_params(parameters)      # parameters

        x = GSL::Vector.alloc(parameters.dup)
        minimizer = GSL::MultiMin::FdfMinimizer.alloc(minimizer_type_joint_derivative,np)
        minimizer.set(my_func, x, 1, 1e-3)

        iter = 0
        message=""
        begin_time=Time.new
        begin
          iter += 1
          status = minimizer.iterate()
          status = minimizer.test_gradient(1e-3)
          if status == GSL::SUCCESS
            total_time=Time.new-begin_time
            message+="Joint MLE converged to minimum on %0.3f seconds at\n" % total_time
          end
          x = minimizer.x
          message+= sprintf("%5d iterations", iter)+"\n"
          message+= "args="
          for i in 0...np do
            message+=sprintf("%10.3e ", x[i])
          end
          message+=sprintf("f() = %7.3f\n", minimizer.f)+"\n"
        end while status == GSL::CONTINUE and iter < @max_iterations

        @iteration=iter
        @log+=message
        @r=minimizer.x[0]
        @alpha=minimizer.x[1,@nr-1].to_a
        @beta=minimizer.x[@nr,@nc-1].to_a
        @loglike_model= -minimizer.minimum

        pr=Processor.new(@alpha,@beta,@r,@matrix)
      end
compute_one_step_mle_without_derivatives()

Compute the polychoric correlation with joint estimation, using a derivative-free minimization method.

Rho and thresholds are estimated at the same time. Code based on the R package “polycor”, by J. Fox.

     # File lib/statsample/bivariate/polychoric.rb, line 457
      def compute_one_step_mle_without_derivatives
        # Get initial values with the two-step approach
        compute_two_step_mle
        # Start iteration with past values
        rho=@r
        cut_alpha=@alpha
        cut_beta=@beta
        parameters=[rho]+cut_alpha+cut_beta
        np=@nc-1+@nr

        minimization = Proc.new { |v, params|
          new_rho=v[0]
          new_alpha=v[1, @nr-1]
          new_beta=v[@nr, @nc-1]
          pr=Processor.new(new_alpha,new_beta,new_rho,@matrix)
          pr.loglike
        }
        my_func = GSL::MultiMin::Function.alloc(minimization, np)
        my_func.set_params(parameters)      # parameters

        x = GSL::Vector.alloc(parameters.dup)

        ss = GSL::Vector.alloc(np)
        ss.set_all(1.0)

        minimizer = GSL::MultiMin::FMinimizer.alloc(minimizer_type_joint_no_derivative,np)
        minimizer.set(my_func, x, ss)

        iter = 0
        message=""
        begin_time=Time.new
        begin
          iter += 1
          status = minimizer.iterate()
          status = minimizer.test_size(@epsilon)
          if status == GSL::SUCCESS
            total_time=Time.new-begin_time
            message="Joint MLE converged to minimum on %0.3f seconds at\n" % total_time
          end
          x = minimizer.x
          message+= sprintf("%5d iterations", iter)+"\n"
          for i in 0...np do
            message+=sprintf("%10.3e ", x[i])
          end
          message+=sprintf("f() = %7.3f size = %.3f\n", minimizer.fval, minimizer.size)+"\n"
        end while status == GSL::CONTINUE and iter < @max_iterations
        @iteration=iter
        @log+=message
        @r=minimizer.x[0]
        @alpha=minimizer.x[1,@nr-1].to_a
        @beta=minimizer.x[@nr,@nc-1].to_a
        @loglike_model= -minimizer.minimum
      end
compute_polychoric_series()

Compute the polychoric correlation using the polychoric series. Algorithm: AS87, by Martinson and Hamdan (1975).

Warning: according to Drasgow (2006), this computation diverges greatly from the joint and two-step methods.

     # File lib/statsample/bivariate/polychoric.rb, line 571
      def compute_polychoric_series
        @nn=@n-1
        @mm=@m-1
        @nn7=7*@nn
        @mm7=7*@mm
        @mn=@n*@m
        @cont=[nil]
        @n.times {|j|
          @m.times {|i|
            @cont.push(@matrix[i,j])
          }
        }

        pcorl=0
        cont=@cont
        xmean=0.0
        sum=0.0
        row=[]
        colmn=[]
        (1..@m).each do |i|
          row[i]=0.0
          l=i
          (1..@n).each do |j|
            row[i]=row[i]+cont[l]
            l+=@m
          end
          raise "Should not be empty rows" if(row[i]==0.0)
          xmean=xmean+row[i]*i.to_f
          sum+=row[i]
        end
        xmean=xmean/sum.to_f
        ymean=0.0
        (1..@n).each do |j|
          colmn[j]=0.0
          l=(j-1)*@m
          (1..@m).each do |i|
            l=l+1
            colmn[j]=colmn[j]+cont[l]
          end
          raise "Should not be empty cols" if colmn[j]==0
          ymean=ymean+colmn[j]*j.to_f
        end
        ymean=ymean/sum.to_f
        covxy=0.0
        (1..@m).each do |i|
          l=i
          (1..@n).each do |j|
            # Accumulate the covariance between row and column scores
            covxy=covxy+cont[l]*(i.to_f-xmean)*(j.to_f-ymean)
            l=l+@m
          end
        end

        chisq=0.0
        (1..@m).each do |i|
          l=i
          (1..@n).each do |j|
            chisq=chisq+((cont[l]**2).quo(row[i]*colmn[j]))
            l=l+@m
          end
        end

        phisq=chisq-1.0-(@mm*@nn).to_f / sum.to_f
        phisq=0 if(phisq<0.0)
        # Compute cumulative sum of columns and rows
        sumc=[]
        sumr=[]
        sumc[1]=colmn[1]
        sumr[1]=row[1]
        cum=0
        (1..@nn).each do |i|
          cum=cum+colmn[i]
          sumc[i]=cum
        end
        cum=0
        (1..@mm).each do |i|
          cum=cum+row[i]
          sumr[i]=cum
        end
        alpha=[]
        beta=[]
        # Compute points of polytomy
        (1..@mm).each do |i|
          alpha[i]=Distribution::Normal.p_value(sumr[i] / sum.to_f)
        end
        (1..@nn).each do |i|
          beta[i]=Distribution::Normal.p_value(sumc[i] / sum.to_f)
        end
        @alpha=alpha[1,alpha.size]
        @beta=beta[1,beta.size]
        @sumr=row[1,row.size]
        @sumc=colmn[1,colmn.size]
        @total=sum

        # Compute Fourier coefficients a and b
        h=hermit(alpha,@mm)
        hh=hermit(beta,@nn)
        a=[]
        b=[]
        if @m!=2
          mmm=@m-2
          (1..mmm).each do |i|
            a1=sum.quo(row[i+1] * sumr[i] * sumr[i+1])
            a2=sumr[i]   * xnorm(alpha[i+1])
            a3=sumr[i+1] * xnorm(alpha[i])
            l=i
            (1..7).each do |j|
              a[l]=Math::sqrt(a1.quo(j))*(h[l+1] * a2 - h[l] * a3)
              l=l+@mm
            end
          end
        end

        if @n!=2
          nnn=@n-2
          (1..nnn).each do |i|
            a1=sum.quo(colmn[i+1] * sumc[i] * sumc[i+1])
            a2=sumc[i] * xnorm(beta[i+1])
            a3=sumc[i+1] * xnorm(beta[i])
            l=i
            (1..7).each do |j|
              b[l]=Math::sqrt(a1.quo(j))*(a2 * hh[l+1] - a3*hh[l])
              l=l+@nn
            end
          end
        end

        l = @mm
        a1 = -sum * xnorm(alpha[@mm])
        a2 = row[@m] * sumr[@mm]
        (1..7).each do |j|
          a[l]=a1 * h[l].quo(Math::sqrt(j*a2))
          l=l+@mm
        end

        l = @nn
        a1 = -sum * xnorm(beta[@nn])
        a2 = colmn[@n] * sumc[@nn]

        (1..7).each do |j|
          b[l]=a1 * hh[l].quo(Math::sqrt(j*a2))
          l = l + @nn
        end
        rcof=[]
        # Compute coefficients rcof of polynomial of order 8
        rcof[1]=-phisq
        (2..9).each do |i|
          rcof[i]=0.0
        end
        m1=@mm
        (1..@mm).each do |i|
          m1=m1+1
          m2=m1+@mm
          m3=m2+@mm
          m4=m3+@mm
          m5=m4+@mm
          m6=m5+@mm
          n1=@nn
          (1..@nn).each do |j|
            n1=n1+1
            n2=n1+@nn
            n3=n2+@nn
            n4=n3+@nn
            n5=n4+@nn
            n6=n5+@nn

            rcof[3] = rcof[3] + a[i]**2 * b[j]**2

            rcof[4] = rcof[4] + 2.0 * a[i] * a[m1] * b[j] * b[n1]

            rcof[5] = rcof[5] + a[m1]**2 * b[n1]**2 +
              2.0 * a[i] * a[m2] * b[j] * b[n2]

            rcof[6] = rcof[6] + 2.0 * (a[i] * a[m3] * b[j] *
              b[n3] + a[m1] * a[m2] * b[n1] * b[n2])

            rcof[7] = rcof[7] + a[m2]**2 * b[n2]**2 +
              2.0 * (a[i] * a[m4] * b[j] * b[n4] + a[m1] * a[m3] *
                b[n1] * b[n3])

            rcof[8] = rcof[8] + 2.0 * (a[i] * a[m5] * b[j] * b[n5] +
              a[m1] * a[m4] * b[n1] * b[n4] + a[m2] * a[m3] * b[n2] * b[n3])

            rcof[9] = rcof[9] + a[m3]**2 * b[n3]**2 +
              2.0 * (a[i] * a[m6] * b[j] * b[n6] + a[m1] * a[m5] * b[n1] *
              b[n5] + (a[m2] * a[m4] * b[n2] * b[n4]))
          end
        end

        rcof=rcof[1,rcof.size]
        poly = GSL::Poly.alloc(rcof)
        roots=poly.solve
        rootr=[nil]
        rooti=[nil]
        roots.each {|c|
          rootr.push(c.real)
          rooti.push(c.im)
        }
        @rootr=rootr
        @rooti=rooti

        norts=0
        (1..7).each do |i|
          next if rooti[i]!=0.0
          if (covxy>=0.0)
            next if(rootr[i]<0.0 or rootr[i]>1.0)
            pcorl=rootr[i]
            norts=norts+1
          else
            # Accept real roots in [-1, 0) when the covariance is negative
            if (rootr[i]>=-1.0 and rootr[i]<0.0)
              pcorl=rootr[i]
              norts=norts+1
            end
          end
        end
        raise "Error" if norts==0
        @r=pcorl
        pr=Processor.new(@alpha,@beta,@r,@matrix)
        @loglike_model=-pr.loglike
      end
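The final step above selects the admissible polychoric estimate among the polynomial's roots: the root must be real, and must match the sign of the observed covariance with magnitude at most 1. A plain-Ruby sketch of that selection logic (select_root is a hypothetical helper; root pairs are illustrative [real, imaginary] values, not output of GSL::Poly, and for simplicity it returns the first admissible root rather than the last):

```ruby
# Pick an admissible polychoric estimate among polynomial roots:
# the root must be real, and must lie in [0, 1] when the observed
# covariance is non-negative, or in [-1, 0) when it is negative.
def select_root(roots, covxy)
  roots.each do |re, im|
    next unless im == 0.0          # skip complex roots
    if covxy >= 0.0
      return re if re >= 0.0 && re <= 1.0
    else
      return re if re >= -1.0 && re < 0.0
    end
  end
  raise "No admissible real root"
end

roots = [[1.7, 0.0], [0.42, 0.0], [-0.3, 0.5]]
r = select_root(roots, 10.0)   # => 0.42 (1.7 is out of range, -0.3 is complex)
```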
compute_two_step_mle()

Computation of the polychoric correlation using two-step ML estimation.

Two-step ML estimation “first estimates the thresholds from the one-way marginal frequencies, then estimates rho, conditional on these thresholds, via maximum likelihood” (Uebersax, 2006).

The algorithm is based on code by Gegenfurtner(1992).

References:

  • Gegenfurtner, K. (1992). PRAXIS: Brent’s algorithm for function minimization. Behavior Research Methods, Instruments & Computers, 24(4), 560-564. Available on www.allpsych.uni-giessen.de/karl/pdf/03.praxis.pdf

  • Uebersax, J.S. (2006). The tetrachoric and polychoric correlation coefficients. Statistical Methods for Rater Agreement web site. Available at: john-uebersax.com/stat/tetra.htm . Accessed February 11, 2010.

     # File lib/statsample/bivariate/polychoric.rb, line 284
      def compute_two_step_mle
        if Statsample.has_gsl?
          compute_two_step_mle_gsl
        else
          compute_two_step_mle_ruby
        end
      end
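The two-step approach reduces the problem to a one-dimensional minimization of the negative log likelihood over rho, with the thresholds held fixed at their marginal estimates. A toy golden-section search illustrates this kind of bracketed 1-D minimization (the quadratic objective is a stand-in for the real likelihood, and golden_min is an illustrative helper; the library itself uses GSL's Brent minimizer or a pure-Ruby fallback):

```ruby
# Golden-section search: minimize f on [a, b] to tolerance eps.
# Illustrative stand-in for the Brent minimization in compute_two_step_mle_gsl.
def golden_min(a, b, eps = 1e-7)
  phi = (Math.sqrt(5) - 1) / 2
  c = b - phi * (b - a)
  d = a + phi * (b - a)
  while (b - a).abs > eps
    # Keep the subinterval that brackets the smaller function value
    if yield(c) < yield(d)
      b = d
    else
      a = c
    end
    c = b - phi * (b - a)
    d = a + phi * (b - a)
  end
  (a + b) / 2
end

# Toy objective with minimum at rho = 0.3, on the admissible interval for rho.
rho = golden_min(-0.9999, 0.9999) { |x| (x - 0.3)**2 }
```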
compute_two_step_mle_gsl()

Compute two step ML estimation using gsl.

     # File lib/statsample/bivariate/polychoric.rb, line 315
      def compute_two_step_mle_gsl
        fn1=GSL::Function.alloc {|rho|
          pr=Processor.new(@alpha,@beta, rho, @matrix)
          pr.loglike
        }
        @iteration = 0
        max_iter = @max_iterations
        m = 0              # initial guess
        m_expected = 0
        a=-0.9999          # search interval for rho: (-0.9999, 0.9999)
        b=0.9999
        gmf = GSL::Min::FMinimizer.alloc(@minimizer_type_two_step)
        gmf.set(fn1, m, a, b)
        header=_("Two step minimization using %s method\n") % gmf.name
        header+=sprintf("%5s [%9s, %9s] %9s %10s %9s\n", "iter", "lower", "upper", "min",
          "err", "err(est)")
        header+=sprintf("%5d [%.7f, %.7f] %.7f %+.7f %.7f\n", @iteration, a, b, m, m - m_expected, b - a)
        @log=header
        puts header if @debug
        begin
          @iteration += 1
          status = gmf.iterate
          status = gmf.test_interval(@epsilon, 0.0)

          if status == GSL::SUCCESS
            @log+="converged:"
            puts "converged:" if @debug
          end
          a = gmf.x_lower
          b = gmf.x_upper
          m = gmf.x_minimum
          message=sprintf("%5d [%.7f, %.7f] %.7f %+.7f %.7f\n",
            @iteration, a, b, m, m - m_expected, b - a)
          @log+=message
          puts message if @debug
        end while status == GSL::CONTINUE and @iteration < @max_iterations
        @r=gmf.x_minimum
        @loglike_model=-gmf.f_minimum
      end
loglike_data()
     # File lib/statsample/bivariate/polychoric.rb, line 212
      def loglike_data
        loglike=0
        @nr.times do |i|
          @nc.times do |j|
            res=@matrix[i,j].quo(@total)
            if (res==0)
              res=1e-16
            end
            loglike+= @matrix[i,j] * Math::log(res)
          end
        end
        loglike
      end
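loglike_data computes the saturated-model log likelihood, the sum of n_ij * log(n_ij / N) over all cells. For the example table from the Use section it can be reproduced in plain Ruby:

```ruby
require 'matrix'

matrix = Matrix[[1, 10, 20], [20, 20, 50]]
total  = matrix.to_a.flatten.sum.to_f    # 121.0

loglike = 0.0
matrix.each do |n|
  p = n / total
  p = 1e-16 if p == 0                    # guard against log(0), as in loglike_data
  loglike += n * Math.log(p)
end
# loglike is approximately -181.92
```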
