Friday, 24 October 2008

Any natural stream of random numbers will have occasional patterns appear by chance within it. Those patterns can be encoded by compression software's rules -- the stream is then, to some degree, reducible. However, at some point you might have an unpatterned stream. Is that irreducible?
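
As a rough illustration (just a sketch, using Python's zlib as a stand-in for "compression software rules"), a stream with obvious repetition shrinks a great deal, while bytes from a decent randomness source barely shrink at all:

    import os
    import zlib

    # A stream with an obvious pattern compresses well...
    patterned = b"0123456789" * 100      # 1000 bytes of repeating digits
    # ...while bytes from the system's randomness source barely shrink.
    random_bytes = os.urandom(1000)

    for label, data in [("patterned", patterned), ("random", random_bytes)]:
        packed = zlib.compress(data, 9)
        print(f"{label}: {len(data)} bytes -> {len(packed)} bytes")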

Imagine some way of producing completely patternless random numbers was found. Such numbers would use as many different digit combinations as possible, each digit followed by any of the other nine, so they couldn't be compressed by finding runs of duplicated digits. However, there are other kinds of patterns beyond simple runs of digits, and those other patterns lend themselves to compression schemes too. Half of all random numbers will be divisible by 2, and factoring out the 2 might open them to further compression if the remaining digit stream has patterns of its own. Even looking only at prime numbers there may still be patterns: 1231 is prime, so is 2131, and the patterns in those digits are obvious... and that is just in base 10. Using other number bases could turn up further patterns.
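
Here is a small sketch (plain Python, with helper functions written just for this post) that checks the two primes mentioned above and prints the same values in a couple of other bases, to show how the visible pattern depends on the representation:

    def is_prime(n: int) -> bool:
        """Trial division; fine for four-digit numbers."""
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True

    def to_base(n: int, base: int) -> str:
        """Render n in the given base (2 to 16)."""
        digits = "0123456789abcdef"
        out = ""
        while n:
            out = digits[n % base] + out
            n //= base
        return out or "0"

    for n in (1231, 2131):
        print(n, "prime?", is_prime(n),
              "| base 2:", to_base(n, 2),
              "| base 16:", to_base(n, 16))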

Suppose only a small proportion of all thousand-digit numbers turned out to be incompressibly random. Might there be some simple way to characterise them? If we simply listed all such numbers, then an index into that list would need fewer than a thousand digits to pick out any number on it, reducing the irreducibly random still further.
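
To put very rough numbers on that (a back-of-the-envelope sketch, with n = 1000 standing in for "thousand-digit"), a simple pigeonhole count bounds how many numbers could possibly be given shorter descriptions, however clever the rules:

    n = 1000
    for d in (1, 10, 50):
        # Crude upper bound: every string of at most n - d digits is a
        # potential "short name", and each short name can stand for at most
        # one number.
        short_names = sum(10**k for k in range(n - d + 1))
        fraction = short_names / 10**n
        print(f"shortened by {d} digits or more: "
              f"at most {fraction:.2e} of all {n}-digit numbers")

Read that way round, the numbers that can be shortened by any worthwhile amount look like the rare ones, so a list of the patternless numbers would be nearly as long as the list of all thousand-digit numbers, and an index into it would save very little -- which already hints at a limit.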

Is there any limit to this? Surely there would be. Perhaps the limit comes when the information required to describe the compression rules is as long as the numbers being examined.
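
One way to picture that limit (a toy accounting with made-up figures, nothing more) is to charge each scheme for the description of its own rules as well as for its output:

    # A description only pays off if the compressed output PLUS the rules
    # needed to decode it comes in under the original.  All figures invented.
    original_digits = 1000

    schemes = [
        ("run-length trick",  900,  30),   # (name, compressed size, size of rules)
        ("clever factoring",  600, 350),
        ("huge lookup table", 200, 950),
    ]

    for name, compressed, rules in schemes:
        total = compressed + rules
        verdict = "saves" if total < original_digits else "costs more than it saves"
        print(f"{name}: {compressed} + {rules} = {total} digits ({verdict})")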

In fact, is the lack of pattern a pattern itself? Gah!!!

Interesting stuff.
