Computers that function like the human brain could soon become a reality thanks to new research using optical fibres made of speciality glass. The research, published in Advanced Optical Materials, has the potential to allow faster and smarter optical computers capable of learning and evolving.

Ok, I'm going to disregard everything else you've said and focus on this one bit:

It'd be like taking a modern supercomputer back to 1965 and asking computer scientists from that era to create a program that can read handwriting in a digital image. Those scientists, in short, would not be able to do it without substantial effort, largely because the ML algorithms used today did not exist back then.

You lost me. I'm not sure what you're arguing or how it applies to what we're doing.

That tells me what our real issue is: we're focusing on two totally unrelated fields of study.

From what I understand, you're focused on "real" performance increases: reducing energy consumption and run time by cutting the number of CPU and I/O operations with dedicated hardware. That's great, but it's ultimately unrelated to the field of AI, and to computer science as a whole.

To produce a truly breakthrough piece of hardware, you need to find a way to reduce the number of steps in a given solution. We don't need to perform the same set of steps faster; we need fewer steps. It's like comparing the amount of time you spend playing chess to the number of moves you make.

For example, it takes five steps to verify ((1+2)/3) * (2+5) = 7:

1) (1+2) = 3
2) (3/3) = 1
3) (2+5) = 7
4) (1*7) = 7
5) 7 = 7
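
To make that concrete, here's a minimal sketch of my own (not from the linked article) that counts those steps by walking the expression's syntax tree. Notice that the count falls out of the expression's structure alone; no hardware detail appears anywhere:

```python
import ast

# Parse the expression and count its arithmetic and comparison steps.
# The total depends only on the structure of the expression,
# not on the machine that evaluates it.
expr = "((1 + 2) / 3) * (2 + 5) == 7"
tree = ast.parse(expr, mode="eval")

steps = sum(isinstance(node, (ast.BinOp, ast.Compare))
            for node in ast.walk(tree))
print(steps)  # 5 -- two additions, a division, a multiplication, one comparison
```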

The number of steps the problem requires is mathematically fixed. Sure, dedicated hardware can run each of those steps faster than a typical machine can, but you still haven't solved the bigger problem.
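
For contrast, here's the kind of reduction that actually matters, as a toy example of my own: summing the integers 1 through n with a loop takes n additions no matter how fast each one runs, while Gauss's closed form takes three operations total. Fewer moves, not faster moves:

```python
def sum_by_looping(n):
    # n additions: better hardware runs each step faster,
    # but the step count still grows with n.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_by_formula(n):
    # Gauss's identity n*(n+1)/2: one multiply, one add, one divide,
    # no matter how large n gets. The win comes from the math, not the silicon.
    return n * (n + 1) // 2

print(sum_by_looping(1000), sum_by_formula(1000))  # 500500 500500
```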

Our biggest problem is that we have so far been unable to establish any model that describes the brain's logic. That has nothing to do with processing power; we flat-out lack the math to solve the problem with our conventional toolkits.

New hardware does not solve this problem for us. I can mathematically reduce a vision-recognition algorithm with pen and paper; I cannot do anything remotely close to that with the human brain.
