What if progress is going to be less exponential than we think?

I'm not talking about GPT-4; I'm talking about your claims regarding GPT-5.

I believe there is a strong possibility that GPT-5 will approach AGI (Artificial General Intelligence) levels.

How do you know?

Realistically, GPT-5 could perform on par with an average human across a broad range of tasks.

How do you know?

I don't expect it to surpass experts in any given field. For instance, while GPT-5 might assist lawyers, it is unlikely to replace the best lawyers in the industry. This principle will likely apply to numerous fields.

How do you know?

One area where I strongly doubt GPT-5 can supplant humans is advanced AI programming.

How do you know?

I genuinely do not anticipate GPT-5 contributing significantly to the development of GPT-6.

How do you know? Now you're making claims about GPT-6 before we even know anything about GPT-5.

However, I think we might see a slowdown in improvements when transitioning from GPT-5 to GPT-6, as it will gradually become harder to find massive optimizations.

Again, how could you even begin to speculate about GPT-6 when we still don't know anything about GPT-5?

Again, I don't necessarily disagree with your broader point that a singularity isn't coming soon, but I find all these claims about GPT-5 ridiculous, since nobody outside of OpenAI knows anything about GPT-5.
