I recently came across this article on a breakthrough in the computer generation of random numbers, which seems very exciting, with descriptors such as “masterpiece” and “light years ahead.” Revision 2 of the draft paper is uploaded here: Explicit Two-Source Extractors and Resilient Functions, although I freely admit it is far too technical for my understanding.
The same author has another paper, self-described as written for the layperson, uploaded here: D. Zuckerman, Can Random Coin Flips Speed Up a Computer? I found this very interesting, and would like to highlight the following points:
- The law of large numbers: as the number of events grows, random fluctuations average out, and any true bias becomes more apparent
- The stock market is a random walk with a slight upward bias
- “Investing differs from our earlier examples of cooking and polling, in that the randomness is modeling reality, and is not something we control. In cooking and polling, we choose to add randomness to achieve our aims.”
- Two applications of randomness in computing: distributing loads and computing the area of a complex shape or the volume of a complex object
- These are examples of “Randomized approximation algorithms”
- Two types of error occur with randomized approximation algorithms: error in the quality of the approximation itself (these are approximations, after all), and error in the estimate of how far off the approximation is; by repeating the computation, the chance that the error estimate is wrong can be driven below 1 in 1,000,000,000
- Reiterating: randomness can be used to model a problem, and to solve a problem
- Randomness is necessary for computer security, e.g. cryptography
- Cryptography works by encrypting messages sent between two hosts, such that any message intercepted by a nefarious third party would take too long to decrypt without the key
- RSA (named after its inventors Ron Rivest, Adi Shamir, and Leonard Adleman) is a cryptosystem created in 1977 that relies on the difficulty of factoring the product of two large prime numbers. Details here.
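The law-of-large-numbers point is easy to see in a quick simulation. This sketch (my own illustration, not from the paper) flips a coin with a slight bias toward heads: in a small sample the bias is lost in the noise, but over many flips the observed fraction settles near the true probability.

```python
import random

def coin_flip_proportion(n, p=0.5, seed=0):
    """Flip a coin with heads-probability p a total of n times
    and return the observed fraction of heads."""
    rng = random.Random(seed)
    heads = sum(1 for _ in range(n) if rng.random() < p)
    return heads / n

# A coin with a true bias of 0.55: barely visible at n=100,
# unmistakable at n=100,000.
print(coin_flip_proportion(100, p=0.55, seed=1))
print(coin_flip_proportion(100_000, p=0.55, seed=1))
```

The same logic applies to the stock-market point: each day is noisy, but the slight upward bias dominates over long horizons.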
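The area-of-a-complex-shape application is the classic Monte Carlo method. Here is a minimal sketch of my own (the paper describes the idea in prose, not code): throw random darts at a bounding square, and the fraction that land inside the shape estimates its area. Using the unit circle as the shape, the estimate should approach π, and more samples shrink the approximation error.

```python
import random

def estimate_circle_area(n_samples, seed=0):
    """Estimate the area of the unit circle by sampling points
    uniformly from the bounding square [-1, 1] x [-1, 1] and
    counting the fraction that fall inside the circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
        if x * x + y * y <= 1:
            inside += 1
    # The square has area 4, so scale the inside fraction by 4.
    return 4 * inside / n_samples

print(estimate_circle_area(200_000))  # should be close to pi (~3.14159)
```

This is a randomized approximation algorithm in exactly the paper's sense: the answer carries some error, but repeating with more samples makes a large error astronomically unlikely.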
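To make the RSA point concrete, here is a toy walkthrough with deliberately tiny primes (my own illustration; real RSA uses primes hundreds of digits long, which is what makes factoring the modulus infeasible):

```python
def toy_rsa():
    """Demonstrate RSA key setup, encryption, and decryption
    with tiny primes. For illustration only -- trivially breakable."""
    p, q = 61, 53              # two small primes (kept secret)
    n = p * q                  # public modulus, 3233
    phi = (p - 1) * (q - 1)    # 3120
    e = 17                     # public exponent, coprime with phi
    d = pow(e, -1, phi)        # private exponent (modular inverse of e)

    message = 65
    ciphertext = pow(message, e, n)   # encrypt with the public key
    decrypted = pow(ciphertext, d, n) # decrypt with the private key
    return ciphertext, decrypted

c, m = toy_rsa()
print(c, m)  # decrypted message m equals the original 65
```

An eavesdropper who sees only n = 3233 and e = 17 must factor n to recover d; with a 600-digit n, that is where the "would take too long" guarantee comes from. (Note that `pow(e, -1, phi)` for the modular inverse requires Python 3.8+.)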
More on the new paper in a blog post here.