Zero Knowledge Machine Learning

Worldcoin Use-Case

Deep learning networks

               layer |    output shape |     #parameters |            #ops 
-------------------- | --------------- | --------------- | --------------- 
       conv 32x5x5x3 |   (116, 76, 32) |            2400 |        21158400 
            max-pool |    (58, 38, 32) |               0 |          282112 
                relu |    (58, 38, 32) |               0 |           70528 
      conv 32x5x5x32 |    (54, 34, 32) |           25600 |        47001600 
            max-pool |    (27, 17, 32) |               0 |           58752 
                relu |    (27, 17, 32) |               0 |           14688 
             flatten |        (14688,) |               0 |               0 
     full 1000x14688 |         (1000,) |        14689000 |        14688000 
                relu |         (1000,) |               0 |            1000 
         full 5x1000 |            (5,) |            5005 |            5000 
           normalize |            (5,) |               0 |               6 
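The shape, parameter, and op counts in the table can be reproduced with a short script. This is a sketch, not the actual model code: it assumes a 120×80×3 input (inferred from the first conv output), "valid" (unpadded) convolutions, and 2×2 max-pooling; the table's op count for fully-connected layers appears to tally multiplies only.

```python
# Reproduce the layer table: each helper returns (output shape, #parameters, #ops).
# Assumptions: 120x80x3 input, valid convolutions, 2x2 max-pool windows.

def conv(shape, c_out, kh, kw):
    h, w, c_in = shape
    out = (h - kh + 1, w - kw + 1, c_out)
    params = c_out * kh * kw * c_in
    ops = out[0] * out[1] * params  # one multiply-add per weight per output pixel
    return out, params, ops

def max_pool(shape):
    h, w, c = shape
    out = (h // 2, w // 2, c)
    return out, 0, 4 * out[0] * out[1] * c  # 4 ops per 2x2 window

def relu(shape):
    n = 1
    for d in shape:
        n *= d
    return shape, 0, n  # one comparison per element

def full(shape, n_out):
    n_in = shape[0]
    # parameters include biases; the op count tallies only the multiplies
    return (n_out,), n_out * n_in + n_out, n_out * n_in

shape = (120, 80, 3)
shape, p, o = conv(shape, 32, 5, 5)
assert (shape, p, o) == ((116, 76, 32), 2400, 21158400)
shape, _, o = max_pool(shape)
assert (shape, o) == ((58, 38, 32), 282112)
shape, p, o = conv(shape, 32, 5, 5)
assert (shape, p, o) == ((54, 34, 32), 25600, 47001600)
shape, _, o = max_pool(shape)
assert (shape, o) == ((27, 17, 32), 58752)
shape = (shape[0] * shape[1] * shape[2],)  # flatten
shape, p, o = full(shape, 1000)
assert (shape, p, o) == ((1000,), 14689000, 14688000)
shape, p, o = full(shape, 5)
assert (shape, p, o) == ((5,), 5005, 5000)
```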

Relevant ML evaluation work

Generally: lots of similarities with embedded implementations: fixed-point arithmetic avoids floating-point math, the model is held constant, and precomputation is possible.

Differences: fixed-point numbers in ZK can be large (very large in some proof systems), simple comparisons are expensive, and inverses are trivial — the usual ZK development trade-offs we are all familiar with.
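Why inversion is trivial in ZK: the prover computes the inverse out-of-circuit as a witness, and the circuit only needs a single multiplication constraint to check it. A minimal Python sketch of the idea over the BN254 scalar field (the field choice and function names are illustrative, not from any particular library):

```python
# BN254 scalar field modulus, a common choice in SNARK circuits
P = 21888242871839275222246405745257275088548364400416034343698204186575808495617

def witness_inverse(x: int) -> int:
    # Prover computes x^-1 out-of-circuit via Fermat's little theorem.
    return pow(x, P - 2, P)

def check_inverse(x: int, y: int) -> bool:
    # In-circuit this is one multiplication constraint: x * y == 1.
    return (x * y) % P == 1

y = witness_inverse(12345)
assert check_inverse(12345, y)
```

By contrast, a comparison like a < b has no such one-constraint certificate and typically requires a bit decomposition of the operands.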

Prior ZK-ML work

Fixed point

\mathtt{a} = \left\lfloor a \cdot 2^{32} \right\rfloor

\mathtt{a} \cdot \mathtt{b} = \left\lfloor \frac{\mathtt{a}\,\mathtt{b}}{2^{32}} \right\rfloor
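A minimal sketch of this scheme in Python, assuming 32 fractional bits: real values are scaled up by 2^32 on encoding, and each product must be shifted back down to stay at the same scale (the helper names are illustrative).

```python
SCALE_BITS = 32
SCALE = 1 << SCALE_BITS  # 32 fractional bits

def encode(x: float) -> int:
    # Scale up and truncate to an integer: a = floor(x * 2^32)
    return int(x * SCALE)

def decode(a: int) -> float:
    return a / SCALE

def fxp_mul(a: int, b: int) -> int:
    # The raw product carries 64 fractional bits; shift back to 32.
    return (a * b) >> SCALE_BITS

a, b = encode(1.5), encode(2.25)
assert decode(fxp_mul(a, b)) == 3.375
```

In a circuit, the floor division is where the cost lives: the quotient and remainder are supplied as witnesses and the remainder must be range-checked.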

Strategy ideas

Remco Bloemen
Math & Engineering
https://2π.com