Is there a formal way of deciding what fraction of a game is chance?

Dynamics researcher here. When most game theorists (as in mathematicians who study game theory) talk about "games of chance," they mean any game with an element of randomness, such as a card shuffle, a coin toss, or a dice roll. (One can in principle become proficient at tossing a coin or a die in a biased way, but this is very hard to do, and the chaotic motion tends to be well modeled probabilistically anyway.)

So, purely skill-based games like chess would indeed not be considered games of chance, although we can of course compute interesting meta-statistics about the game and the players. And there might be 'outside' aspects of the game which are chance-based, for example, deciding who plays white or black. Then it becomes a semantic question whether we consider those 'outside' decisions to be part of the game (usually not, or else we make a distinction like "a deterministic game to which we have introduced an element of chance").

Researchers use different definitions of 'chanciness' depending on what they want to study. For example, depending on your definition of "chanciness", a slot machine is as random as a coin flip, in the sense that you (supposedly) can't make any moves that change the underlying probability distribution. According to other definitions, a coin flip is 'more random' than a slot machine because it's (supposedly) unbiased.

Thus, games which are 'in between' skill-based and chance-based tend to involve the capacity to make moves that bias the game in your favor, but only probabilistically. For example, watch any poker tournament, and you will see the percentages change as the game progresses, reflecting changes in conditional probability.
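As a toy illustration of those shifting percentages (the scenario is my own made-up example, not from any particular broadcast): a hold'em player with four cards to a flush after the flop has 9 "outs" among the 47 unseen cards, and the conditional probability of completing the flush by the river works out to roughly 35%:

```python
from fractions import Fraction

# Hypothetical spot: four cards to a flush after the flop,
# so 9 of the 47 cards we haven't seen complete the hand ("outs").
outs, unseen = 9, 47

# P(hit) = 1 - P(miss the turn) * P(miss the river)
p_hit = 1 - Fraction(unseen - outs, unseen) * Fraction(unseen - outs - 1, unseen - 1)
print(float(p_hit))  # ≈ 0.35
```

As soon as the turn card is revealed, the same calculation is redone with 46 unseen cards, which is exactly why the on-screen percentages jump after every card.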

To answer your question in a long-winded way: I'm unaware of any standardized way to do this; it really depends on your definition and what you want to study. Maybe chanciness is the largest possible change in your winning chance over all possible moves? (In which case a coin toss would score 0, because no move changes your winning chance.) Maybe it's the probability that a new/average/expert player wins? (Here a slot machine player has a lower 'rating' than a coin toss player, because the slot machine is biased towards the casino.) Or maybe it's the ratio of the win percentage of an expert versus that of a novice? (In which case a coin toss would score 1, as there is no distinction between experts and novices in a purely chance-based game.)
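The expert-versus-novice ratio idea can be sketched with a quick Monte Carlo comparison of two made-up toy games (the skill bonus of 0.3 in `skill_game` is an arbitrary illustration, not a measured quantity):

```python
import random

def coin_toss(skill):
    # Pure chance: the player's skill has no effect on the outcome.
    return random.random() < 0.5

def skill_game(skill):
    # Hypothetical game where skill shifts the win probability
    # from 0.5 (novice) up toward 0.8 (expert).
    return random.random() < 0.5 + 0.3 * skill

def win_rate(game, skill, trials=100_000):
    return sum(game(skill) for _ in range(trials)) / trials

random.seed(0)
for name, game in [("coin toss", coin_toss), ("skill game", skill_game)]:
    novice = win_rate(game, skill=0.0)
    expert = win_rate(game, skill=1.0)
    # Ratio near 1.0 means skill makes no difference (pure chance).
    print(f"{name}: novice={novice:.3f} expert={expert:.3f} "
          f"ratio={expert / novice:.2f}")
```

Under this definition, the coin toss comes out at a ratio of about 1, while the toy skill game comes out around 1.6, which matches the intuition that the second game sits somewhere between pure chance and pure skill.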

/r/askscience Thread