I have long been interested in randomness, and am an advocate of its use in AI, especially for video games. For example, I made this bee simulation to demonstrate to students how the idea can work in practice. In that simulation, all entities have a very simple state machine with decisions made almost entirely out of random choices.
Recently I have been reviewing the source code to my game Player Manager, which dates back to 1989. The game was written in 68000 assembler and in fixed point, but even so the techniques I developed back then are just as relevant to crafting great gameplay now.
I am not the only one to advocate the use of random numbers in the craft of gameplay development, but even so, sometimes the idea meets resistance. Surely, you can’t create great looking behavior using random numbers? Would it not be better to have actual algorithms that have no random element? Well, frankly, no. If you want to know a secret, randomness is a FUNdamental tool.
The use of dice in games goes way back of course, but video games allow much more use of random numbers without having to actually roll dice. Sure, their use in RPGs is quite clear, being essentially derived from that seminal work “Dungeons and Dragons”. However, they can be applied to more than just RPGs. I’d like to see more randomness in first person shooters for instance… it is becoming more common, thankfully.
Anyway, because of my early experiments with Dungeons and Dragons, the game code of Player Manager actually uses the equivalent of standard dice notation (e.g. 3d6, which means the sum of three six-sided dice). The only difference is that I number the sides from 0 to make life easier. These days, I tend to use a floating point normalised roll (from -1 to 1, or 0 to 1). But when you don’t have floating point, 3d6 works great.
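The original was 68000 assembler, but the idea is tiny; here is a minimal Python sketch of that zero-indexed dice roll (the function name `roll` is mine, not from the game code):

```python
import random

def roll(n_dice, sides):
    """Sum n_dice dice, each side numbered 0 .. sides-1.

    So roll(3, 6) is the zero-indexed 3d6: an integer from 0 to 15.
    """
    return sum(random.randrange(sides) for _ in range(n_dice))

# The classic 3d6, zero-indexed:
total = roll(3, 6)  # somewhere between 0 and 15, clustered around 7-8
```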
It is a very interesting characteristic of summed random numbers that the more you sum them, the closer you get to the normal distribution (this is the central limit theorem at work). The following were generated by Wolfram Alpha: just type 3d6 or 7d12 or whatever into Wolfram Alpha and it draws a nice graph!
The plots show the probability of any total of the dice… let’s go further…
That is the classic 3d6 dice roll for player stats in Dungeons and Dragons. It is sort of bell shaped, but still has a reasonable probability of extremes. But if you were using this to create a realistic shooting error for a footballer, it would not work well (even an average player would not, for instance, have a one in 36 chance of, say, a 45-degree error). If you need such extremes to be rarer, you can just add more dice:
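You can see how fast the extremes vanish by computing the exact distribution rather than rolling. This little brute-force sketch (my own illustration, not from the game) enumerates every combination of zero-indexed dice:

```python
from collections import Counter
from itertools import product

def distribution(n_dice, sides):
    """Exact probability of each total for n_dice dice with sides 0 .. sides-1."""
    counts = Counter(sum(faces) for faces in product(range(sides), repeat=n_dice))
    total_outcomes = sides ** n_dice
    return {t: c / total_outcomes for t, c in sorted(counts.items())}

# The minimum (all zeros) of 3d6 has probability 1/216;
# add two more dice and the minimum of 5d6 drops to 1/7776.
```

Each extra die divides the probability of the extreme totals by the number of sides, which is exactly why stacking dice tames the tails.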
For comparison, here’s a plot of some actual bell curves:
These days, I create a function that adds random numbers between 0 and 1, then divides the total by the number of rolls (i.e. takes the average) to get a result between 0 and 1. This, of course, also tends towards a true bell curve the more random numbers are added. I could, you might say, just use a real bell curve, but I find that tuning is much easier with the 10d(float from 0-1) method. Often you don’t want a flat distribution (like 1d6) or an actual bell curve, but something in between. Provided that such rolls don’t get crazy (such as one hundred rolls), the loss in performance is not usually an issue.
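The floating point version is equally small; a sketch of the averaged roll described above (again, the name `soft_roll` is mine):

```python
import random

def soft_roll(n=10):
    """Average of n uniform rolls in [0, 1).

    n=1 gives a flat distribution; as n grows the result bunches up
    around 0.5, tending towards a bell curve. Tune n to pick how much
    "bell" you want between flat and fully normal.
    """
    return sum(random.random() for _ in range(n)) / n

error = soft_roll(10)  # in [0, 1), clustered near 0.5
```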