NAME

Data::Entropy::Source - encapsulated source of entropy

SYNOPSIS

use Data::Entropy::Source;

$source = Data::Entropy::Source->new($handle, "sysread");

$c = $source->get_octet;
$str = $source->get_bits(17);
$i = $source->get_int(12345);
$i = $source->get_int(Math::BigInt->new("1000000000000"));
$j = $source->get_prob(1, 2);

DESCRIPTION

An object of this class encapsulates a source of entropy (randomness). Methods allow entropy to be dispensed in any quantity required, even fractional bits. An entropy source object should not normally be used directly. Rather, it should be used to support higher-level entropy-consuming algorithms, such as those in Data::Entropy::Algorithms.
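
For example, a source can be installed as the ambient source used by those higher-level algorithms. The with_entropy_source and rand_int functions below belong to Data::Entropy and Data::Entropy::Algorithms respectively; this is only a sketch of the usage pattern, not part of this class's interface.

    use Data::Entropy qw(with_entropy_source);
    use Data::Entropy::Algorithms qw(rand_int);
    use Data::Entropy::Source;
    use IO::File;

    my $source = Data::Entropy::Source->new(
            IO::File->new("/dev/random", "r") || die("no entropy device: $!"),
            "sysread");

    # roll a die using this source as the ambient entropy source
    my $roll = with_entropy_source($source, sub { rand_int(6) + 1 });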

This type of object is constructed as a layer over a raw entropy source which does not supply methods to extract arbitrary amounts of entropy. The raw entropy source is expected to dispense only entire octets at a time. The /dev/random devices on some versions of Unix constitute such a source, for example. The raw entropy source is accessed via the IO::Handle interface. This interface may be supplied by classes other than IO::Handle itself, as is done for example by Data::Entropy::RawSource::CryptCounter.
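
For example, a source might be layered over the local random device like this (the IO::File usage and the device path are only an illustrative sketch):

    use Data::Entropy::Source;
    use IO::File;

    my $raw = IO::File->new("/dev/random", "r")
            or die "can't open random device: $!";
    my $source = Data::Entropy::Source->new($raw, "sysread");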

If two entropy sources of this class are given exactly the same raw entropy data, for example by reading from the same file, and exactly the same sequence of get_ method calls is made to them, then they will return exactly the same values from those calls. (Calls with numerical arguments that have the same numerical value but are of different types count as the same for this purpose.) This means that a run of an entropy-using algorithm can be made completely deterministic if desired.
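
For example, two sources reading the same recorded entropy file will track each other exactly (the file name is hypothetical):

    use Data::Entropy::Source;
    use IO::File;

    my $src_a = Data::Entropy::Source->new(
            IO::File->new("entropy.dat", "r"), "getc");
    my $src_b = Data::Entropy::Source->new(
            IO::File->new("entropy.dat", "r"), "getc");

    # identical call sequences on identical raw data give identical results
    print "match\n" if $src_a->get_int(1000) == $src_b->get_int(1000);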

CONSTRUCTOR

Data::Entropy::Source->new(RAW_SOURCE, READ_STYLE)

Constructs and returns an entropy source object based on the given raw source. RAW_SOURCE must be an I/O handle referring to a source of entropy that can be read one octet at a time. Specifically, it must support either the getc or sysread method described in IO::Handle. READ_STYLE must be a string, either "getc" or "sysread", indicating which method should be used to read from the raw source. No methods other than the one specified will ever be called on the raw source handle, so a full implementation of IO::Handle is not required.

The sysread method should be used with /dev/random and its ilk, because buffering would be very wasteful of entropy and might consequently block other processes that require entropy. The getc method should be preferred when reading entropy from a regular file, and it is the more convenient interface to implement when a non-I/O object is being used for the handle.
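
Because only the nominated method will ever be called, a non-I/O handle can be very small. The following is a minimal sketch: the My::OctetString class is hypothetical, and a fixed string is of course not a source of real entropy.

    # A hypothetical non-I/O handle implementing only getc, serving
    # octets from a pre-recorded string (illustration only).
    package My::OctetString;
    sub new  { my ($class, $data) = @_; bless { data => $data, pos => 0 }, $class }
    sub getc {
        my ($self) = @_;
        return undef if $self->{pos} >= length $self->{data};
        return substr($self->{data}, $self->{pos}++, 1);
    }
    package main;

    my $source = Data::Entropy::Source->new(
            My::OctetString->new("\x6b\x01\xd3\x9f\x42\xe8\x77\x0a"), "getc");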

METHODS

$source->get_octet

Returns an octet of entropy, as a string of length one. This provides direct access to the raw entropy source.
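
For example (treating the octet as a number is just one possible use; $source is constructed as in the SYNOPSIS):

    my $c    = $source->get_octet;   # a string of length one
    my $byte = ord($c);              # the same octet as a number, 0 .. 255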

$source->get_bits(NBITS)

Returns NBITS bits of entropy, as a string of octets. If NBITS is not a multiple of eight then the last octet in the string has its most significant bits set to zero.
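
For example, a 17-bit request yields three octets; the unpack below merely displays the octet values and is not part of the interface:

    my $str = $source->get_bits(17);    # three octets: 8 + 8 + 1 significant bits
    my @octets = unpack "C*", $str;     # three numbers, 0 .. 255; the octet
                                        # holding the single spare bit is 0 or 1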

$source->get_int(LIMIT)

LIMIT must be a positive integer. Returns a uniformly-distributed random number between zero inclusive and LIMIT exclusive. LIMIT may be either a native integer, a Math::BigInt object, or an integer-valued Math::BigRat object; the returned number is of the same type.

This method can dispense a non-integral amount of entropy. For example, if LIMIT is 10 then the result contains approximately 3.32 bits of entropy. The minimum non-zero amount of entropy that can be obtained is 1 bit, with LIMIT = 2.
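
For example (the die-roll interpretation is illustrative; $source is constructed as in the SYNOPSIS):

    my $roll = $source->get_int(6) + 1;      # uniform on 1 .. 6

    use Math::BigInt;
    my $big = $source->get_int(Math::BigInt->new("1000000000000"));
    # $big is a Math::BigInt, uniform on 0 .. 999999999999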

$source->get_prob(PROB0, PROB1)

PROB0 and PROB1 must be non-negative integers, not both zero. They may each be either a native integer, a Math::BigInt object, or an integer-valued Math::BigRat object; types may be mixed. Returns either 0 or 1, with relative probabilities PROB0 and PROB1. That is, the probability of returning 0 is PROB0/(PROB0+PROB1), and the probability of returning 1 is PROB1/(PROB0+PROB1).

This method can dispense a fraction of a bit of entropy. The maximum amount of entropy that can be obtained is 1 bit, with PROB0 = PROB1. The more different the probabilities are, the less entropy is obtained. For example, if PROB0 = 1 and PROB1 = 2 then the result contains approximately 0.918 bits of entropy.
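
For example, a biased bit with odds of 1:2 ($source is constructed as in the SYNOPSIS):

    # returns 1 with probability 2/3 and 0 with probability 1/3
    my $bit = $source->get_prob(1, 2);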

SEE ALSO

Data::Entropy, Data::Entropy::Algorithms, Data::Entropy::RawSource::CryptCounter, Data::Entropy::RawSource::Local, Data::Entropy::RawSource::RandomOrg, IO::Handle

AUTHOR

Andrew Main (Zefram) <zefram@fysh.org>

COPYRIGHT

Copyright (C) 2006, 2007, 2009, 2011 Andrew Main (Zefram) <zefram@fysh.org>

LICENSE

This module is free software; you can redistribute it and/or modify it under the same terms as Perl itself.