Popular torrenting software µTorrent has included an automatic cryptocoin miner in its latest update.

This is really, really simplified:

Bitcoins (or any other cryptocurrency) are basically just unique blocks of data that people decide have value (much in the same way there is no intrinsic value to paper money).

To mine a bitcoin is to find an input that, when run through a hash function, produces a specific kind of output. (In bitcoin, that set of data is called the blockchain, and it's shared across the entire bitcoin network so it's always the same for everyone.)

Finding this hash key is entirely a matter of guess-and-check. To really, really simplify it... let's say our "hash function" was taking a number, subtracting 7, and then using the result to pick the nth letter of the alphabet. We're told that the letter we want to find is "j". So we try the number "22": subtract 7 to get 15, and find that the resulting letter is "o", which is wrong. So we try again with "8": subtract 7 to get 1, and get the letter "a", also wrong. This goes on until we eventually try the number 17, subtract 7, get 10, and voila... it's "j". At that point you tell everyone else in the network: "Hey, I found the number that hashes to 'j', it's 17!". They all recognize that you found it first, and now you've successfully mined a bitcoin!
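If it helps, here's that toy example written out as a little Python sketch. The function name and the guessing range are just made up for illustration:

```python
import random
import string

def toy_hash(number):
    """Our made-up 'hash function': subtract 7, then map the result
    to the nth letter of the alphabet (1 -> 'a', 2 -> 'b', ...)."""
    index = number - 7
    if 1 <= index <= 26:
        return string.ascii_lowercase[index - 1]
    return None  # out of range, not a valid letter

target = "j"
attempts = 0

# Pretend we can't invert the function: just keep guessing numbers
# until one of them "hashes" to the target letter.
while True:
    guess = random.randint(1, 40)
    attempts += 1
    if toy_hash(guess) == target:
        print(f"Found it! {guess} hashes to '{target}' after {attempts} guesses.")
        break
```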

The reason it comes down to guessing is that the actual hash function is so complex (and chaotic) that there's no algebra to solve it backwards. You can't say: "Well, I know 'j' is the 10th letter of the alphabet, so 10 = X - 7, which means X = 10 + 7 = 17." You literally have to guess a number, apply the function, and then see if the result is what you wanted... and then repeat that forever until you're randomly successful.
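For a slightly more realistic picture, here's a sketch of that same guess-and-check loop using a real hash function (SHA-256). Real Bitcoin mining hashes a block header twice and compares against a network-wide difficulty target, so treat the `block_data` string and the "leading zeros" rule here as stand-ins:

```python
import hashlib

block_data = "some block data"  # stand-in for the real block header
difficulty = 4                  # toy difficulty: require this many leading zeros

nonce = 0
while True:
    # Guess: combine the data with a counter and hash it.
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    # Check: does the output happen to start with enough zeros?
    if digest.startswith("0" * difficulty):
        print(f"nonce={nonce} gives hash {digest}")
        break
    nonce += 1
```

There's no way to compute the winning nonce directly; the only option is to churn through guesses until one works.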

So in a computer, it's just your processor doing this over and over and over and over and over.... but because each individual guess is relatively simple, you don't need a particularly powerful processor to perform any single one; you just want to do as many as you can. This is why a graphics card is better at it than a CPU. GPUs have hundreds of smaller, less powerful processors that can do a ton of small things in parallel, while a CPU is designed to do a few things really fast individually. If you needed to add "1+1=2" 1000 times, it'd be faster to hand it to 100 five-year-old kids working at the same time than to, say, a couple of PhDs in math. The parallel version of the guessing loop is sketched below.
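To picture why many weak workers beat a few strong ones here, this rough Python multiprocessing sketch splits the guesses so each worker scans its own slice. A real GPU miner does the same idea in hardware with thousands of tiny cores; the worker count and chunk sizes here are arbitrary:

```python
import hashlib
from multiprocessing import Pool

BLOCK_DATA = "some block data"  # stand-in, as in the earlier sketch
DIFFICULTY = 4                  # toy difficulty: leading zeros required

def search_range(bounds):
    """Each worker scans its own slice of the guess space."""
    start, end = bounds
    for nonce in range(start, end):
        digest = hashlib.sha256(f"{BLOCK_DATA}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * DIFFICULTY):
            return nonce, digest
    return None  # nothing found in this slice

if __name__ == "__main__":
    chunk = 200_000
    ranges = [(i * chunk, (i + 1) * chunk) for i in range(8)]
    # Eight "kids" each working on their own pile of guesses at once.
    with Pool(processes=8) as pool:
        for result in pool.imap_unordered(search_range, ranges):
            if result:
                nonce, digest = result
                print(f"nonce={nonce} -> {digest}")
                break
```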

So anyway, doing these calculations takes power, which costs money. Processors also have a finite lifetime, so doing a ton of calculations shortens that lifetime.
