Is there a theoretical distance at which objects in the universe would be unobservable because the likelihood of a photon from it hitting us is so small?

There is no hard theoretical limit beyond which it becomes impossible to detect the light coming from a given object; however, as you suggest, detection does become increasingly difficult the farther and farther away the object is.

As with any detection system, for our eye or a telescope to detect the light from a celestial body we need two things: a sufficiently strong signal and decent contrast. For example, for our eyes to see, say, the North Star, the integrated flux of the light hitting our eye must be sufficient to elicit a response from our visual system. An interesting series of experiments by Hecht and coworkers revealed that as few as ~100 (visible) photons hitting a person's eye can be perceived as a flash, so our eyes are actually quite sensitive. With electronic sensors we can even detect single-photon events with fairly high fidelity.
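To turn that flash threshold into a rate we can compare against later, here's a quick back-of-envelope sketch in Python (the ~0.1 s integration window of the eye is my rounded assumption, not a figure from the original experiments):

```python
# Quick sketch: turning the ~100-photon flash threshold into a rough
# minimum detectable photon rate. The integration window is an assumed
# round number, not a value from the Hecht experiments.
THRESHOLD_PHOTONS = 100   # photons at the eye perceived as a flash
EYE_INTEGRATION_S = 0.1   # assumed integration time of human vision, s

min_rate = THRESHOLD_PHOTONS / EYE_INTEGRATION_S
print(f"minimum detectable rate: ~{min_rate:.0f} photons/s at the eye")
```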

The question then becomes roughly how far away we can be from a star of a given luminosity and still be confident that we can see the light it emits. First, let's get a sense for just how many photons a star actually emits. A sun-like star has a luminosity on the order of 4*10^26 W. Let's say that roughly half of this output falls within the visible range, with an average wavelength of 500 nm or so (thus with a photon energy of about 4*10^-19 J). This corresponds to a photon flux of (2*10^26 W) / (4*10^-19 J/photon) = 5*10^44 photons/second, which is a huge, huge number. To put it in context, a candle with a luminous flux of about one lumen emits something on the order of 4*10^15 photons/second, so the star emits a whopping 29 orders of magnitude more light.
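If you want to check the arithmetic, here's the same estimate as a short Python sketch (all the inputs are the rough round numbers used above, not precise solar data):

```python
import math

# Back-of-envelope photon budget for a sun-like star.
L_STAR = 4e26            # luminosity, W (rough round number)
VISIBLE_FRACTION = 0.5   # assume ~half the output is in the visible band
H = 6.626e-34            # Planck constant, J*s
C = 3.0e8                # speed of light, m/s
WAVELENGTH = 500e-9      # representative visible wavelength, m

e_photon = H * C / WAVELENGTH                     # ~4e-19 J per photon
star_rate = L_STAR * VISIBLE_FRACTION / e_photon  # visible photons/s
candle_rate = 4e15                                # candle, photons/s (rough)

print(f"photon energy: {e_photon:.1e} J")
print(f"star:          {star_rate:.1e} photons/s")
print(f"candle:        {candle_rate:.1e} photons/s")
print(f"ratio:         ~10^{math.log10(star_rate / candle_rate):.0f}")
```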

Now, as you move away from an object, the intensity you perceive decreases as 1/r^2. In the case of a candle, on a dark night we can still see its light from, say, about a km away. The distance at which a sun-like star appears as bright as a candle 1 km away is then 1 km * sqrt(5*10^44 / 4*10^15) ≈ 3.5*10^14 km, or roughly 40 light years. This means that with the naked eye we can fairly clearly make out a sun-like star when it is on the order of tens of light years away; we might still see something at hundreds of light years, but we would almost certainly not be able to detect it if it were thousands of light years away. Of course, the numbers would change if the star were brighter or if we used a different imaging system rather than our eyes.
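Here's the same inverse-square estimate in Python, plus a sanity check of the photon rate actually entering the eye at that distance (the 7 mm dark-adapted pupil diameter is an assumed round number):

```python
import math

# Minimal sketch of the inverse-square estimate above; the candle numbers
# are the same rough assumptions used in the text.
STAR_RATE = 5e44       # visible photons/s from a sun-like star (from above)
CANDLE_RATE = 4e15     # visible photons/s from a candle
CANDLE_DIST_KM = 1.0   # distance at which a candle is still visible at night

# Equal apparent brightness: rate / r^2 must match, so r scales as sqrt(rate).
star_dist_km = CANDLE_DIST_KM * math.sqrt(STAR_RATE / CANDLE_RATE)
KM_PER_LY = 9.46e12
print(f"matching distance: {star_dist_km:.1e} km "
      f"(~{star_dist_km / KM_PER_LY:.0f} light years)")

# Sanity check against the ~1000 photons/s eye threshold sketched earlier.
r_m = star_dist_km * 1e3
pupil_area = math.pi * (7e-3 / 2) ** 2          # 7 mm dark-adapted pupil
rate_at_eye = STAR_RATE / (4 * math.pi * r_m**2) * pupil_area
print(f"photons into the eye: ~{rate_at_eye:.0f}/s")
```

The ~10^4 photons/s entering the eye at that distance is comfortably above the ~10^3 photons/s threshold from the earlier sketch, consistent with such a star still being clearly visible at tens of light years.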

Finally, the last caveat is that detection requires not just a signal but also contrast. The best example of this is the fact that you can't see any stars during the day, even though their light is still present, just as much as it is at night. The problem is simply that the far more intense sunlight scattered by the atmosphere completely washes out the weaker light emanating from the stars. As we try to see weaker and weaker signals, this effective noise from stray light and false readings (the effective dark counts of our imaging system) makes detecting the light from the stars increasingly difficult.
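To make that concrete, here's a toy shot-noise model: for photon counting with Poisson statistics the signal-to-noise ratio goes as S/sqrt(S + B + D), where B is the background and D the dark counts. Every rate below is a made-up illustrative number, not real instrument data:

```python
import math

# Toy shot-noise model: the same star signal against different backgrounds.
def snr(signal_rate, background_rate, dark_rate, t):
    """Photon-counting SNR after integrating for t seconds (Poisson noise)."""
    signal = signal_rate * t
    noise = math.sqrt((signal_rate + background_rate + dark_rate) * t)
    return signal / noise

STAR = 100.0  # photons/s from the star reaching the detector (assumed)
DARK = 10.0   # detector dark counts per second (assumed)

for sky, background in [("dark night sky", 1e1),
                        ("moonlit sky", 1e3),
                        ("daytime sky", 1e7)]:
    print(f"{sky:>15}: SNR = {snr(STAR, background, DARK, t=1.0):6.2f}")
```

The star delivers the same 100 photons/s in every case; only the background changes, and the SNR drops from ~9 against a dark sky to essentially zero in daylight. That's the contrast problem in a nutshell.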
