Divergence of a harmonic-ish sequence

Iteratively repeat the following process. Set a_m = b_m for some m. For brevity, write c = b_{m+1}. As in your method, define g(x) as the linear interpolation between (m, b_m) and (m + k, c). Since b_m >= c, we have g(x) >= f(x) = c on the interval [m, m + k], and therefore the integral of g(x)/x is at least the integral of f(x)/x. The latter integral (from m to m + k) has the value c log((m + k)/m). Because log((m + k)/m) grows without bound in k, we can always select a k for which this integral is greater than 1, and then the same holds for the integral of g(x)/x. Set a_n to the values of g(x) on this interval and replace m by m + k.
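One step of this construction can be sketched numerically. The sequence b(n) = 1/log(n + 2) below is just a hypothetical example of a decreasing positive sequence, and choose_k is one way to find a k with c log((m + k)/m) > 1:

```python
import math

def b(n):
    # hypothetical decreasing positive sequence; any such sequence works
    return 1.0 / math.log(n + 2)

def choose_k(m, c):
    # smallest k with c * log((m + k) / m) > 1; exists since log is unbounded
    k = 1
    while c * math.log((m + k) / m) <= 1.0:
        k += 1
    return k

def integral_g_over_x(m, k, bm, c):
    # exact integral of g(x)/x over [m, m + k], where g(x) = A + B*x is the
    # linear interpolation between (m, bm) and (m + k, c)
    B = (c - bm) / k
    A = bm - B * m
    return A * math.log((m + k) / m) + B * k

m = 10
c = b(m + 1)
k = choose_k(m, c)
lower = c * math.log((m + k) / m)         # integral of f(x)/x with f = c
upper = integral_g_over_x(m, k, b(m), c)  # integral of g(x)/x
print(k, lower, upper)  # lower > 1 and upper >= lower, as claimed
```

Since each block contributes more than 1 to the integral of g(x)/x, repeating the step makes the total integral (and hence the sum of a_n/n) diverge.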

TL;DR: We work with a piecewise "almost flat" function (each block drops only from b_m to b_{m+1} over a long interval of length k) rather than a linear interpolation whose endpoint is attached to b_n, as in your method.
