Information Theory and the Complexity of Boolean Functions
Document Type
Article
Department
Mathematics (HMC)
Publication Date
1976
Abstract
This paper explores the connections between two areas pioneered by Shannon: the transmission of information with a fidelity criterion, and the realization of Boolean functions by networks and formulae. We study three phenomena:
- The effect of the relative number of 0's and 1's in a function's table on its complexity.
- The effect of the number of unspecified entries in a partially specified function's table on its complexity.
- The effect of the number of errors allowed in the realization of a function on its complexity.
Our main result is a precise version of the following statement:
The complexity of approximately realizing a partially specified Boolean function, in whose table a fraction d of the entries are unspecified and a fraction p of the specified entries are 1's, with errors allowed in a fraction not more than e of the specified entries, is less by the factor (1 − d)[H(p) − H(e)] (where H(z) = −z log₂ z − (1 − z) log₂(1 − z) is the binary entropy function) than the complexity of exactly realizing an arbitrary fully specified Boolean function.
We also give an intuitively appealing information-theoretic interpretation of the result.
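As a small numerical illustration of the factor stated above, here is a minimal Python sketch that computes the binary entropy function H and the reduction factor (1 − d)[H(p) − H(e)]. The parameter values d = 0.2, p = 0.5, e = 0.05 are illustrative assumptions, not values taken from the paper.

```python
import math

def binary_entropy(z: float) -> float:
    """Binary entropy H(z) = -z*log2(z) - (1-z)*log2(1-z), with H(0) = H(1) = 0."""
    if z in (0.0, 1.0):
        return 0.0
    return -z * math.log2(z) - (1 - z) * math.log2(1 - z)

def reduction_factor(d: float, p: float, e: float) -> float:
    """Factor (1 - d) * [H(p) - H(e)] by which, per the abstract's statement,
    the complexity of approximately realizing a partially specified function
    is reduced relative to exactly realizing an arbitrary fully specified one."""
    return (1 - d) * (binary_entropy(p) - binary_entropy(e))

# Illustrative values (not from the paper): 20% of the table unspecified,
# half of the specified entries are 1's, errors allowed on 5% of them.
print(reduction_factor(0.2, 0.5, 0.05))  # 0.8 * (1.0 - 0.286...) ≈ 0.571
```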
Rights Information
© 1976 Springer-Verlag
Terms of Use & License Information
DOI
10.1007/BF01683269
Recommended Citation
Pippenger, N. "Information Theory and the Complexity of Boolean Functions", Math. Sys. Theory, 10 (1977), 129-167. doi: 10.1007/BF01683269