Would transmitted information from next to a black hole need to be processed differently if the receiver were 2 light-years away than if it were 500 miles away?

Yes, there can be a significant difference. If the black hole is massive enough, it will have a very strong gravitational field even outside of the event horizon. When light climbs out of a gravitational potential, it experiences a so-called gravitational redshift, meaning that the light will be "stretched" to a longer wavelength than the wavelength at which it was emitted.
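For reference, the redshift z is conventionally defined by the ratio of received to emitted wavelength (a standard definition, not something stated in the thread):

```latex
% Standard definition of redshift:
% lambda_obs is the wavelength the receiver measures,
% lambda_emit is the wavelength at the source.
\[
  1 + z = \frac{\lambda_{\mathrm{obs}}}{\lambda_{\mathrm{emit}}},
  \qquad z > 0 \ \text{means the light arrives stretched (redshifted).}
\]
```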

More quantitatively, the redshift of light emitted close to a gravitating body and received by an observer at infinity depends on the ratio R/r_s, where R is the distance of the source from the center of gravity and r_s is the Schwarzschild radius (which for a black hole marks the event horizon). In the limit that R approaches r_s, the redshift becomes infinite (consistent with the fact that if R < r_s, the light would be inside the event horizon and would never make it out), while in the limit that R approaches infinity the redshift goes to zero (since the gravitational potential falls off as 1/R).

Therefore, if the source sits next to the event horizon of a massive black hole, the total redshift can be substantial, potentially drastically changing the type of detector you could use to pick up the signal. However, placing the detector closer to the source reduces the depth of the potential well the light has to climb out of, and hence the total redshift. In other words, the closer the detector is to the source, the smaller the redshift, until in the limit that the detector is right next to the source, the redshift vanishes.
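To make this concrete, here is a minimal sketch (not from the thread) that evaluates the standard redshift formula between two static observers outside a Schwarzschild black hole. The 4-million-solar-mass figure and the hovering altitude of 1.05 r_s are illustrative assumptions, chosen only to compare the two receiver distances from the question:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg

def schwarzschild_radius(mass_kg):
    """r_s = 2GM/c^2 for a non-rotating (Schwarzschild) black hole."""
    return 2.0 * G * mass_kg / C**2

def redshift(r_emit, r_recv, r_s):
    """Redshift z between two static observers at radii r_emit and
    r_recv (both > r_s) outside a Schwarzschild black hole:
        1 + z = sqrt((1 - r_s/r_recv) / (1 - r_s/r_emit))
    For r_recv -> infinity this reduces to 1 + z = 1/sqrt(1 - r_s/r_emit).
    """
    return math.sqrt((1.0 - r_s / r_recv) / (1.0 - r_s / r_emit)) - 1.0

# Illustrative assumptions: a 4-million-solar-mass black hole, with the
# transmitter hovering at 1.05 Schwarzschild radii.
r_s = schwarzschild_radius(4e6 * M_SUN)
r_emit = 1.05 * r_s

MILE = 1609.344          # meters per mile
LIGHT_YEAR = 9.461e15    # meters per light-year

for label, dist in [("500 miles", 500 * MILE),
                    ("2 light-years", 2 * LIGHT_YEAR)]:
    z = redshift(r_emit, r_emit + dist, r_s)
    print(f"receiver {label} away: z = {z:.4f}")
```

With these numbers the nearby detector sees only z of roughly 0.0006, while the one 2 light-years away sees z of roughly 3.6: almost all of the stretch accrues in the steep part of the potential well near the hole, which is exactly why the two receivers would need very different processing.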

/r/askscience Thread