My journey to finding a faster algorithm for integer factorization. (Long post)

When I was your age, I had a similar idea: that if you could efficiently compute n! mod (p*q), then you could use binary search to find the smallest factor, since gcd(n! mod N, N) > 1 exactly when n is at least the smallest prime factor of N, so the predicate is monotone in n. I thought about it for a while, but eventually discarded the approach (mostly just because I learned about other factorization algorithms that were even more interesting).
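
Here's a minimal sketch of that binary search in Python. The factorial is computed naively, which is exactly the step that would have to be made fast for this to beat trial division; the point is only to show the monotone predicate you'd search on.

```python
from math import gcd

def factorial_mod(k, N):
    # Naive k! mod N -- O(k) multiplications, so this is the bottleneck.
    f = 1
    for i in range(2, k + 1):
        f = (f * i) % N
    return f

def smallest_factor(N):
    # gcd(k! mod N, N) > 1 iff k >= smallest prime factor of N,
    # so binary search for the first k where the gcd is nontrivial.
    lo, hi = 2, N
    while lo < hi:
        mid = (lo + hi) // 2
        if gcd(factorial_mod(mid, N), N) > 1:
            hi = mid
        else:
            lo = mid + 1
    return lo

print(smallest_factor(91))  # 7, since 91 = 7 * 13
```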

Here's my best argument for why the n! mod (p*q) approach may not be as compelling as the congruence-of-squares algorithms (continued fraction, quadratic sieve, GNFS, etc.): if you can factor N efficiently, then you can compute sqrt(k) mod N efficiently, using Shanks-Tonelli and the Chinese Remainder Theorem. Conversely, if you can compute sqrt(k) mod N efficiently (for arbitrary k), then you can factor N efficiently: pick some random a, compute b = sqrt(a^2 mod N), and you have a congruence of squares a^2 ≡ b^2 (mod N); with probability at least 1/2 the square root you get back satisfies b ≢ ±a (mod N), and then gcd(a - b, N) is a nontrivial factor. Therefore, computing sqrt(k) mod N is exactly as hard as factoring N. You can solve one if and only if you can solve the other.
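
Sketched in Python, assuming a hypothetical oracle mod_sqrt(k, N) (not a real library function) that returns some x with x*x % N == k, or None if no square root exists:

```python
import random
from math import gcd

def factor_with_sqrt_oracle(N, mod_sqrt, attempts=50):
    # mod_sqrt(k, N) is an assumed oracle returning some x with x*x % N == k.
    for _ in range(attempts):
        a = random.randrange(2, N)
        g = gcd(a, N)
        if g > 1:
            return g                      # lucky: a already shares a factor with N
        b = mod_sqrt(a * a % N, N)
        if b is None or (b - a) % N == 0 or (b + a) % N == 0:
            continue                      # oracle handed back ±a; try another a
        return gcd(a - b, N)              # a^2 ≡ b^2 with a ≢ ±b, so this is nontrivial
    return None
```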

Now replace sqrt(k) mod N with k! mod N and ask the same question. Obviously if you can compute k! mod N efficiently, then you can factor efficiently (via the binary search above). However, even if you could factor N efficiently, it's not clear how you could compute k! mod N efficiently. For this reason, it's possible that computing k! mod N is harder than factoring N. In other words, the intermediate result might be more difficult to compute than the final result. So it might be more useful to pursue an approach where you already know the intermediate result is no harder to compute than the factors.

Also, I encourage you to learn about the quadratic sieve and Pollard's p-1 algorithm. I think you would find them both quite interesting.
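
For what it's worth, a simple variant of Pollard's p-1 is a close cousin of the factorial idea: the factorial shows up in the exponent, where it's cheap, because you can compute a^(B!) mod N by repeated modular exponentiation and then pull out a factor with a gcd. A rough sketch:

```python
from math import gcd

def pollard_p_minus_1(N, B=100000):
    # Compute 2^(B!) mod N incrementally: after step j, a == 2^(j!) mod N.
    a = 2
    for j in range(2, B + 1):
        a = pow(a, j, N)
    g = gcd(a - 1, N)
    # Succeeds (1 < g < N) when some prime p | N has p - 1 dividing B!.
    # If g == N, every factor's p - 1 was that smooth; retry with smaller B.
    return g if 1 < g < N else None

print(pollard_p_minus_1(299, B=10))  # 13, since 13 - 1 = 12 divides 10! while 23 - 1 = 22 does not
```

It only works when some prime factor p of N has p - 1 built from small prime powers, which is why it's a special-purpose method rather than a general one.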
