@stdlib/stats-base-dists-lognormal-entropy
v0.2.2
Entropy
Lognormal distribution differential entropy.
The differential entropy (in nats) for a lognormal random variable is

h(X) = μ + 1/2 + ln( σ √(2π) )

where μ is the location parameter and σ > 0 is the scale parameter. By definition, the natural logarithm of a lognormal random variable follows a normal distribution.
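As a quick check of the closed-form expression above, the entropy can be computed directly with built-in Math functions. The sketch below is illustrative only: lognormalEntropy is a hypothetical helper name, not part of this package, and the package itself may compute the value differently.

// Hypothetical helper implementing the closed-form expression above;
// not part of this package.
function lognormalEntropy( mu, sigma ) {
    if ( isNaN( mu ) || isNaN( sigma ) || sigma <= 0.0 ) {
        return NaN;
    }
    return mu + 0.5 + Math.log( sigma * Math.sqrt( 2.0 * Math.PI ) );
}

var y = lognormalEntropy( 2.0, 1.0 );
// => ~3.419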
Installation
npm install @stdlib/stats-base-dists-lognormal-entropy
Usage
var entropy = require( '@stdlib/stats-base-dists-lognormal-entropy' );
entropy( mu, sigma )
Returns the differential entropy (in nats) for a lognormal distribution with location parameter mu and scale parameter sigma.
var y = entropy( 2.0, 1.0 );
// returns ~3.419
y = entropy( 0.0, 1.0 );
// returns ~1.419
y = entropy( -1.0, 2.0 );
// returns ~1.112
If provided NaN as any argument, the function returns NaN.
var y = entropy( NaN, 1.0 );
// returns NaN
y = entropy( 0.0, NaN );
// returns NaN
If provided sigma <= 0, the function returns NaN.
var y = entropy( 0.0, 0.0 );
// returns NaN
y = entropy( 0.0, -1.0 );
// returns NaN
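Because the natural logarithm of a lognormal random variable is normally distributed, the lognormal entropy equals mu plus the entropy of the corresponding normal distribution, 0.5·ln(2πeσ²), which does not depend on mu. The sketch below illustrates this relationship; it assumes the companion package @stdlib/stats-base-dists-normal-entropy exposes the same ( mu, sigma ) calling convention.

var normalEntropy = require( '@stdlib/stats-base-dists-normal-entropy' );
var lognormalEntropy = require( '@stdlib/stats-base-dists-lognormal-entropy' );

var h1 = lognormalEntropy( 2.0, 1.0 );
// returns ~3.419

// Shifting the normal entropy by the location parameter gives the same value:
var h2 = 2.0 + normalEntropy( 2.0, 1.0 );
// => ~3.419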
Examples
var randu = require( '@stdlib/random-base-randu' );
var entropy = require( '@stdlib/stats-base-dists-lognormal-entropy' );
var sigma;
var mu;
var y;
var i;
for ( i = 0; i < 10; i++ ) {
    mu = ( randu()*10.0 ) - 5.0;
    sigma = randu() * 20.0;
    y = entropy( mu, sigma );
    console.log( 'µ: %d, σ: %d, h(X;µ,σ): %d', mu.toFixed( 4 ), sigma.toFixed( 4 ), y.toFixed( 4 ) );
}
Notice
This package is part of stdlib, a standard library for JavaScript and Node.js, with an emphasis on numerical and scientific computing. The library provides a collection of robust, high-performance libraries for mathematics, statistics, streams, utilities, and more.
For more information on the project, filing bug reports and feature requests, and guidance on how to develop stdlib, see the main project repository.
License
See LICENSE.
Copyright
Copyright © 2016-2024. The Stdlib Authors.