Linux kernel and Entropy

In recent kernel versions, entropy generation is very slow on Linux (mainly because the entropy-gathering algorithm was changed for security reasons). This can be really painful when generating certificates, for example for Prelude: reading /dev/random only gives a few bytes per second!
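You can see how starved the machine is by reading the kernel's entropy counter (the path below is standard on Linux):

```shell
# Number of bits of entropy the kernel estimates it has in its pool;
# a value near zero means reads from /dev/random will block
cat /proc/sys/kernel/random/entropy_avail
```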

If you're lucky, you will have a hardware random number generator (RNG). However, this is generally not the case, so how can you speed up random generation?
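To check whether the kernel detected a hardware RNG, you can look under /sys (the file is simply absent when no hardware RNG driver is loaded):

```shell
# Prints the active hardware RNG driver, or a message if there is none
cat /sys/class/misc/hw_random/rng_current 2>/dev/null \
    || echo "no hardware RNG detected"
```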

One solution is to take random data from another source and inject it into the kernel random pool. You have to be careful about the source, though: if you read data from /dev/null it will be fast, but you will have problems!
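For illustration, anyone can write into /dev/random: the kernel mixes the data into the pool, but it does not credit it as entropy. Crediting entropy requires the RNDADDENTROPY ioctl as root, which is exactly what rngd does for you:

```shell
# Mix 64 bytes into the pool; the kernel accepts the write,
# but gives no entropy credit for data fed in this way
head -c 64 /dev/urandom > /dev/random

# The entropy estimate is not raised by the plain write above
cat /proc/sys/kernel/random/entropy_avail
```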

Install rng-tools, whose rngd daemon reads data from a configurable input source (a hardware RNG or, in our case, /dev/urandom). Before injecting data into the kernel pool, rngd checks it with the FIPS 140-2 randomness tests, and only then feeds it in.

Warning: if you want to generate, for example, certificates for Prelude, you must fully install and configure rng-tools before installing Prelude: package managers install one package at a time, and Prelude generates its certificates during its own installation (at least on Debian).

apt-get install rng-tools
vi /etc/default/rng-tools

Change the values, for example:

HRNGDEVICE=/dev/urandom
RNGDOPTIONS="-W 80% -t 20"

The RNGDOPTIONS line tells rngd to fill the kernel random pool up to 80% (the default is 50%) and to feed it at most every 20 seconds (the default is 60). You may want to tune these values so rngd is not too aggressive, or your "tainted" random data will completely crowd out real random data. See man rngd for more information.

/etc/init.d/rng-tools start
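To verify that rngd is doing its job, sample the pool level for a few seconds; with the daemon running it should stay close to the watermark instead of draining:

```shell
# Sample the kernel pool level three times, one second apart
for i in 1 2 3; do
    cat /proc/sys/kernel/random/entropy_avail
    sleep 1
done
```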

Warning: if you want truly high-quality random data, stick to the default kernel random generator, or use other sources of randomness.

This tip works on all Linux distributions where rng-tools is packaged (which means most of them).