top | item 11672686


dakami | 9 years ago

I'm not convinced Von Neumann debiasing is vulnerable in the way you think. You're dropping pairs (00's and 11's), so you don't care whether the source is 90% 0 or 90% 1; you care whether a run has an even or an odd number of bits.

In other words:

0000000000000001 000000001 001

...are all interchangeable. So biases can change all day.

There are _other_ issues I've seen in real world data, but not this one.
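For a source that is i.i.d. with a fixed bias, this claim checks out: P(01) = P(10) = p(1-p), so the kept pairs are exactly balanced. A minimal sketch (the helper names `biased_bit` and `von_neumann` and the 90% figure are mine, for illustration):

```python
import random

def biased_bit(p_one=0.9):
    # Each flip is independent with the same fixed P(1) = p_one.
    return random.random() < p_one

def von_neumann(bit_source, n_out):
    # Take bits in non-overlapping pairs; keep the first bit of each
    # unequal pair, and discard 00 and 11 pairs entirely.
    out = []
    while len(out) < n_out:
        a, b = bit_source(), bit_source()
        if a != b:
            out.append(int(a))
    return out

random.seed(0)
bits = von_neumann(biased_bit, 10000)
print(sum(bits) / len(bits))  # close to 0.5 despite the 90% input bias
```

The key assumption doing the work here is independence between flips, which is exactly what the alternating-bias attack below breaks.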


infogulch|9 years ago

So I have the ability to arbitrarily change the bias on every flip, and you don't think debiasing is affected?

How about this: for the next string of generated bits, for every even bit the bias is 90% towards 1, and every odd bit the bias is 90% towards 0.

raw: 0101010101010111010101010001010101010101

'debiased': 1111111111111111

dakami|9 years ago

Ah, there you go. Repro code:

  #!/usr/bin/env python
  from random import random

  count = 0

  def get_bit():
      # Alternating bias: every even-numbered flip is 1 with 90%
      # probability, every odd-numbered flip is always 0.
      global count
      count += 1
      if count % 2 == 0:
          return random() < 0.9
      else:
          return False

  def get_fair_bit():
      # Von Neumann: keep the first bit of each unequal pair.
      while 1:
          a = get_bit()
          b = get_bit()
          if a != b:
              return a

  # Prints False forever: every pair is (constant 0, 90%-biased 1),
  # so every kept pair returns its first bit, which is always 0.
  while 1:
      print(get_fair_bit())
I'm wondering how often this one shows up in the field -- my interest here comes from the ~1% failure rate of RSA keys in the field, because manufacturers actually will not install hardware RNGs or externally inject key material. Field vulnerability trumps religion. It looks like you need a really stateful vuln -- something akin to "a 1 is always followed by a 0 due to power drain" wouldn't be enough, because the number of 0's is still acceptably unbounded.

I know there are debiasers that look at statistics across a group. Wonder what else is out there.
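One family is cryptographic conditioning (e.g. the conditioning components in NIST SP 800-90B): compress a whole block of raw bits through a hash, so security rests on the total entropy of the block rather than on any per-bit independence assumption. A sketch, reusing an alternating-bias source like the one in the repro above purely for illustration (the helper names are mine):

```python
import hashlib
import random

def raw_bits(n, seed=0):
    # Alternating bias: even positions 90% ones, odd positions 90% zeros.
    rng = random.Random(seed)
    return [int(rng.random() < (0.9 if i % 2 == 0 else 0.1)) for i in range(n)]

def condition(bits):
    # Pack 512 raw bits into 64 bytes and hash down to 256 output bits.
    # The per-bit bias pattern doesn't matter; only the total entropy
    # of the block does.
    assert len(bits) == 512
    data = bytes(
        sum(bit << (7 - j) for j, bit in enumerate(bits[i:i + 8]))
        for i in range(0, 512, 8)
    )
    return hashlib.sha256(data).digest()

out = condition(raw_bits(512))
print(len(out) * 8)  # 256 conditioned bits
```

Unlike Von Neumann, this gives no information-theoretic fairness guarantee; it's a computational one, and it silently fails if the raw block has less entropy than you assumed.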