Here's your first problem: we've only been listening since the 1960s. That means we've only listened out to roughly 40-50 light-years from ourselves.
I'm not sure I follow this part. Wouldn't how far out we can detect signals depend on the strength of the source and how faint a signal we can pick up, not on how long we've been looking?
Well, basically: if we've been listening for 50 years (to set a number here), then any EM signal (which travels at the speed of light) sent toward us since we started listening could have reached us from at most 50 light-years away.
Say, for example, a signal was sent here from 60 light-years away at the moment we first started scanning the skies. That signal still has 10 years to go before it reaches us.
On the flip side, say a signal had been sent here from only 10 light-years away, but 100 years ago. It arrived 90 years ago, which is 40 years before we started listening, so we missed it.
It's a question of timing.
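The timing argument above can be sketched in a few lines. This is just a toy calculation (the 50-year window and the helper function are my own framing, not anything from SETI):

```python
# Toy model of the timing argument: a signal is caught only if its arrival
# falls inside our listening window. Distances in light-years, times in
# years before the present; the numbers are the two examples above.

LISTENING_YEARS = 50  # we've been listening for the last 50 years

def signal_status(distance_ly, sent_years_ago):
    """Describe whether a signal is en route, caught, or already missed."""
    # Light covers 1 light-year per year, so travel time equals distance.
    arrival_years_ago = sent_years_ago - distance_ly
    if arrival_years_ago < 0:
        return f"still {-arrival_years_ago} years away"
    if arrival_years_ago <= LISTENING_YEARS:
        return "caught: arrived while we were listening"
    return f"missed by {arrival_years_ago - LISTENING_YEARS} years"

print(signal_status(60, 50))   # sent from 60 ly away when we started scanning
print(signal_status(10, 100))  # sent from 10 ly away, 100 years ago
```

The first call reports the signal is still 10 years out; the second reports we missed it by 40 years, matching the two examples.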
Also, when you think about it, if 50 light-years is as far as we can hear, we've only listened to a very small volume around us: the Milky Way is about 100,000 light-years across and 1,000 thick, according to Wikipedia. That's tiny. And that's assuming we've been watching the entire sky over us, which we haven't. There are a lot of holes in our tiny net.
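To put a number on "tiny": modeling the galaxy as a simple disk with the figures above and our listening range as a 50 light-year sphere (a crude back-of-envelope, ignoring the galaxy's actual shape and star distribution):

```python
import math

# Back-of-envelope volume comparison: a 50 light-year listening sphere
# vs. the Milky Way modeled as a flat disk 100,000 ly across, 1,000 ly thick.

listening_sphere = (4 / 3) * math.pi * 50**3      # ~5.2e5 cubic light-years
galaxy_disk = math.pi * 50_000**2 * 1_000         # ~7.9e12 cubic light-years

fraction = listening_sphere / galaxy_disk
print(f"fraction of the galaxy sampled: {fraction:.1e}")
```

That works out to roughly one part in fifteen million of the galaxy's volume, before even accounting for the sky-coverage holes.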
You're right, but you're also assuming signals transmitted in isolation. The working assumption is that any technologically advanced society would be putting out fair amounts of radio constantly, the same way we are. This may not be the case, of course; they could have found a different technology, or simply used a different part of the EM band for communication.
But if they are putting out radio constantly, then we should be detecting them even outside a 50 LY radius, unless they only reached the radio age recently. Whether we can hear them would then be a combined function of distance, transmission power, and how long ago their radio age began.
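That combined condition can be sketched as a toy check: the signal front must have had time to reach us, and the received flux, which falls off with the square of distance, must be above our sensitivity. The power and threshold numbers here are made up for illustration, not real SETI figures:

```python
import math

# Toy detectability model (illustrative units and thresholds only):
# a civilization at distance_ly whose radio age began radio_age_years ago
# is detectable now if (1) its earliest broadcasts have reached us, and
# (2) the received flux, diluted over a sphere of radius d, clears our floor.

def detectable(distance_ly, radio_age_years, power, threshold):
    signal_has_arrived = radio_age_years >= distance_ly   # light: 1 ly/year
    received_flux = power / (4 * math.pi * distance_ly**2)  # inverse-square law
    return signal_has_arrived and received_flux >= threshold

# A civilization 80 ly away that started broadcasting 100 years ago:
print(detectable(80, 100, power=1e6, threshold=1e-3))  # True
# The same civilization, but its radio age began only 30 years ago:
print(detectable(80, 30, power=1e6, threshold=1e-3))   # False
```

Note the second case fails purely on timing, which is the original poster's point, while a weak transmitter or a high detection threshold would fail the flux test instead.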