irreducible randomness
Oct. 24th, 2008 11:27 am

Any natural stream of random numbers will have occasional patterns appear by chance within it. These patterns can be encoded by compression software rules -- they are then, to some degree, reducible. However, at some point you might come across an unpatterned stream. Is that irreducible?
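Here's a quick Python sketch of that intuition (mine, with made-up streams): zlib stands in for the compression software, a stream full of runs compresses hugely, and a uniformly random one barely budges.

import random
import zlib

random.seed(1)
patterned = ("1" * 50 + "7" * 50) * 100   # obvious long runs
uniform = "".join(random.choice("0123456789") for _ in range(10_000))

for name, stream in (("patterned", patterned), ("uniform", uniform)):
    packed = zlib.compress(stream.encode(), 9)
    print(name, len(stream), "->", len(packed))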
Imagine some way were found of producing completely patternless random numbers. Such numbers would use as many digit combinations as possible -- a "1" followed by each of the other nine digits, and so on -- so they couldn't be compressed by finding runs of duplicated digits. However, there are other kinds of patterns beyond simple runs of digits, and those lend themselves to compression schemes too. Half of all random numbers are divisible by 2, and the factor might open itself to further compression because of patterns in its digit stream. Even looking only at prime numbers there may still be patterns: for instance 1231 is prime, and so is 2131, and the pattern relating those numbers is obvious... and that is just in base 10. Using other number bases could turn up further patterns.
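To make the 1231/2131 example concrete: they are rearrangements of the same digits, and both happen to be prime. A little sketch to count how many rearrangements of those digits are prime:

from itertools import permutations

def is_prime(n):
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return False
        f += 2
    return True

candidates = {int("".join(p)) for p in permutations("1123")}
print(sorted(n for n in candidates if is_prime(n)))
# the list printed includes both 1231 and 2131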
Only a small proportion of all thousand-digit numbers would be incompressibly random. Might there be some simple way to characterise them? If we simply listed all such numbers, then an index into that list would need fewer than a thousand digits to pick out any number on it, thus reducing the irreducibly random still further.
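Putting rough numbers on the listing argument (the one-in-a-million fraction here is made up purely for illustration):

total = 9 * 10**999        # count of all thousand-digit numbers
listed = total // 10**6    # assumed count of incompressible ones
print(len(str(listed)))    # 994 -- six digits shorter than the numbers themselves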
Is there any limit to this? Surely there would be. Perhaps it comes when the information required to describe the compression rules equals the size of the numbers being examined.
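As a rule of thumb, that limit might look like this sketch, where rules_size is a stand-in for however many bytes it takes to describe the decompression rules (say, the size of a decompressor program):

import zlib

def actually_reduces(data: bytes, rules_size: int) -> bool:
    # a scheme only genuinely reduces the data if the compressed output
    # *plus* a description of how to undo it stays shorter than the data
    return len(zlib.compress(data, 9)) + rules_size < len(data)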
In fact, is the lack of pattern a pattern itself? Gah!!!
Interesting stuff.