Yes, that is how it works.
Humans are really bad at nonlinear thinking.
Speak for yourself.
A given amount of computing power costs roughly half as much two years later. Moore's law describes the cost and density of transistors on a chip, which more or less correlates with consumer device performance and cost. It's not exact, but it tracks closely over time. We have 70+ years of computing history showing this.
If you wanted to pay in 2005 for the same amount of CPU processing power and RAM that a $1000 flagship phone has in 2023, you're looking at somewhere between $500k and $1M. Likely a lot more, once you account for power and the cost of the space needed to host that much hardware in 2005.
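The back-of-the-envelope math behind that figure can be sketched like this (assuming an idealized clean 2-year cost halving, which is a simplification of Moore's law, not a precise benchmark):

```python
def cost_in_past(price_today, years_ago, doubling_period=2):
    """Estimate what today's compute would have cost `years_ago` years
    earlier, assuming cost doubles going backward every `doubling_period`
    years (idealized Moore's law)."""
    doublings = years_ago / doubling_period
    return price_today * 2 ** doublings

# 2023 flagship phone projected back to 2005: 18 years = 9 doublings
print(cost_in_past(1000, 2023 - 2005))  # 1000 * 2**9 = 512000.0
```

That lands right around the $500k lower bound, before even counting power and hosting costs.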