Entropy and Expected Acceptance Counts for Finite Automata
If a sequence of independent unbiased random bits is fed into a finite automaton, it is straightforward to calculate the expected number of acceptances among the first n prefixes of the sequence. This paper deals with the situation in which the random bits are neither independent nor unbiased, but are nearly so. We show that, under suitable assumptions concerning the automaton, if the difference between the entropy of the first n bits and n converges to a constant exponentially fast, then the change in the expected number of acceptances also converges to a constant exponentially fast. We illustrate this result with a variety of examples in which numbers following the reciprocal distribution, which governs the significands of floating-point numbers, are recoded in the execution of various multiplication algorithms.
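The baseline calculation the abstract refers to — the expected number of acceptances among the first n prefixes when the input bits are independent and unbiased — can be carried out by propagating a probability distribution over the automaton's states. The sketch below is illustrative only: the parity automaton and the function name are assumptions, not taken from the paper.

```python
def expected_acceptances(delta, start, accepting, n):
    """Expected number of accepting prefixes among prefixes of length 1..n
    of a DFA driven by i.i.d. unbiased random bits.

    delta: dict mapping (state, bit) -> next state.
    """
    states = {s for (s, _) in delta} | set(delta.values())
    dist = {s: 0.0 for s in states}
    dist[start] = 1.0  # start deterministically in the initial state
    expected = 0.0
    for _ in range(n):
        new = {s: 0.0 for s in states}
        for s, p in dist.items():
            for bit in (0, 1):
                # each bit is 0 or 1 with probability 1/2 (unbiased case)
                new[delta[(s, bit)]] += 0.5 * p
        dist = new
        # linearity of expectation: add the probability that this
        # prefix lands in an accepting state
        expected += sum(dist[s] for s in accepting)
    return expected

# Hypothetical example automaton: accept strings with an even number of 1s.
delta = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
print(expected_acceptances(delta, start=0, accepting={0}, n=10))  # -> 5.0
```

For this parity automaton each prefix of length at least 1 is accepting with probability exactly 1/2 under unbiased bits, so the expectation is n/2; the paper's contribution concerns how this quantity shifts when the bits are only approximately independent and unbiased.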
© 2004 IEEE
N. Pippenger, "Entropy and expected acceptance counts for finite automata," IEEE Transactions on Information Theory, vol. 50, no. 1, pp. 78–88, Jan. 2004.