I have to second what TwinTurboZX says. The problem with wireless networking is that it is inherently half-duplex, and what's worse is that you're sharing the crowded 2.4 GHz band with Bluetooth controllers and headsets, most cordless phones, all the other wireless network devices in your house, and all the wireless network devices in your neighbors' houses.
In a half-duplex network, once you reach about 30-35% utilization, packet collisions become frequent enough that offering more data makes the network slower instead of faster. The wired standards adopted full-duplex and switching to overcome this problem, but wireless on a single channel fundamentally can't do either of those. The best wireless can do is keep increasing the bitrates. (And moving to higher frequencies tends to mean shorter effective range, since higher frequencies penetrate walls more poorly, but that's another issue.)
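That "more load makes it slower" effect shows up even in the simplest textbook model of a shared half-duplex medium, slotted ALOHA, where throughput is S = G·e^(-G) for offered load G. This is a deliberate toy model (real 802.11 CSMA/CA backs off more gracefully), but the shape of the curve is the point: it peaks around 37% utilization and then falls as collisions dominate.

```python
import math

def slotted_aloha_throughput(offered_load: float) -> float:
    """Fraction of slots carrying a successful frame: S = G * e^(-G).

    A toy model of a shared half-duplex medium. Past the peak,
    additional offered load just produces more collisions, so
    useful throughput goes DOWN as demand goes up.
    """
    return offered_load * math.exp(-offered_load)

# Throughput rises, peaks near G = 1 (~36.8% utilization), then falls.
for g in (0.5, 1.0, 2.0, 4.0):
    print(f"offered load {g:.1f} -> throughput {slotted_aloha_throughput(g):.3f}")
```

The exact numbers differ for 802.11's MAC, but the non-monotonic curve is why a congested wireless segment degrades instead of saturating gracefully.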
When you consider that your 802.11g network runs at 54 Mbps only under ideal conditions, any interference you have chips away at that number. It's not hard at all in the real world to whittle it down below the roughly 19.4 Mbps of an ATSC HD broadcast transport stream, let alone high-bitrate 1080p24 content.
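Some back-of-envelope arithmetic makes the margin look even thinner. The 50% MAC-efficiency figure below is a ballpark assumption, not a measurement, but even a clean 802.11g link delivers only roughly half its 54 Mbps signaling rate as actual payload, and interference eats into what's left fast:

```python
PHY_RATE_MBPS = 54.0    # 802.11g nominal signaling rate
MAC_EFFICIENCY = 0.5    # rough protocol/half-duplex overhead (assumption)
HD_STREAM_MBPS = 19.4   # approximate ATSC HD transport stream rate

clean_throughput = PHY_RATE_MBPS * MAC_EFFICIENCY  # ~27 Mbps on a good day

# Model interference-driven rate fallback as a simple loss fraction.
for interference_loss in (0.0, 0.2, 0.4):
    effective = clean_throughput * (1.0 - interference_loss)
    verdict = "enough" if effective >= HD_STREAM_MBPS else "NOT enough"
    print(f"{interference_loss:.0%} loss -> {effective:.1f} Mbps ({verdict})")
```

So even before the neighbors' networks and the cordless phone get involved, you're starting maybe 8 Mbps above the stream rate; a 30-40% hit from interference puts you underwater.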
Your only real hope is eliminating the interference: have no other wireless devices active at the time, and hope there isn't too much attenuation from the walls in between. (And it's not like you can easily use the PS3 without a Bluetooth controller being active at the same time!)
IMO, this whole streaming thing is the line in the sand where wired will cut it and wireless won't. Or at best it will always be a little bit troublesome.