The largest known prime number has just been discovered: it is 23,249,425 digits long.

Seems like it wouldn't be difficult, just computationally expensive.

It's not even very expensive; it's just that it only works for small primes. Computers can't hold large lists (or products) of large primes: the amount of storage required grows really, really fast.

As a rough approximation, there are about n/log(n) primes less than n. So for the sake of round numbers, consider all the primes less than 10^10 (10,000,000,000). There are around 10^9 of them (1,000,000,000).

If we roughly estimate that each one requires 4 bytes, then we're going to use 4 gigabytes to store the list -- and around the same if they were all multiplied together.
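To make that arithmetic concrete, here's a back-of-the-envelope sketch in Python (the numbers and the 4-bytes-per-prime figure are just the rough values used above, not exact):

    import math

    n = 10**10                       # all primes below 10^10, as above

    # Prime-counting approximation: pi(n) ~ n / ln(n)
    approx_count = n / math.log(n)   # ~4.3e8; rounded up to ~10^9 above
    print(f"n/ln(n) estimate: {approx_count:.2e}")

    # Storage at roughly 4 bytes per prime, using the rounded 10^9 figure
    rounded_count = 10**9
    print(f"list storage: {rounded_count * 4 / 1e9:.0f} GB")   # 4 GB

    # The product of all primes below n is roughly e^n (Chebyshev),
    # so writing it out takes about n / ln(2) bits -- also gigabytes
    product_bits = n / math.log(2)
    print(f"product storage: {product_bits / 8 / 1e9:.1f} GB")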

That's a lot of RAM, and even increasing it to 1000 gigabytes doesn't change the maximum prime in the list that much compared to the purpose at hand: for a number with ten million digits, a factor with 10 digits or even 100 digits is still a "small" factor.

Nonetheless, your approach isn't useless by any means; sieves of various sorts (there are fancier ones than Eratosthenes) are in fact often used to check for small prime factors.
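For illustration, a minimal sketch of that use (not anyone's production code; the 10^6 sieve limit is an arbitrary choice): sieve the small primes once, then use them to strip small factors out of a larger number.

    def small_primes(limit):
        """Sieve of Eratosthenes: all primes below `limit`."""
        sieve = bytearray([1]) * limit
        sieve[0:2] = b"\x00\x00"
        for p in range(2, int(limit ** 0.5) + 1):
            if sieve[p]:
                sieve[p * p::p] = bytearray(len(sieve[p * p::p]))
        return [i for i, is_prime in enumerate(sieve) if is_prime]

    def strip_small_factors(n, primes):
        """Divide out every small prime factor; return (factors found, remaining cofactor)."""
        factors = []
        for p in primes:
            while n % p == 0:
                factors.append(p)
                n //= p
        return factors, n

    primes = small_primes(10**6)              # sieve once, reuse many times
    factors, cofactor = strip_small_factors(600851475143, primes)
    print(factors, cofactor)                  # [71, 839, 1471, 6857] 1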

Then you have to switch algorithms to find large prime factors.
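The comment doesn't name those algorithms; one common choice for digging out factors that trial division can't reach is Pollard's rho, sketched roughly below (an illustrative example, not the method the author had in mind):

    import math
    import random

    def pollard_rho(n):
        """Pollard's rho: return a nontrivial factor of composite n (may retry internally)."""
        if n % 2 == 0:
            return 2
        while True:
            x = random.randrange(2, n)
            y, c, d = x, random.randrange(1, n), 1
            while d == 1:
                x = (x * x + c) % n          # "tortoise" takes one step
                y = (y * y + c) % n          # "hare" takes two steps
                y = (y * y + c) % n
                d = math.gcd(abs(x - y), n)
            if d != n:                       # d == n means this cycle failed; retry
                return d

    # A product of two 10-digit primes that small-prime trial division would never split
    print(pollard_rho(1000000007 * 1000000009))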

So the bottom line is that it isn't a useless approach; it just doesn't come anywhere near solving the problem when dealing with large numbers.

Variations on it can be fine if you only want to factor very small numbers (or pre-screen for very small factors). It can be quite a bit better than just doing naive trial division.
