Electrical Engineering Asked on February 20, 2021
If I replace an old, roof-mounted TV antenna with a 2.4 GHz WiFi (IEEE 802.11) antenna, can I use the existing coax? Or will I need to run all new cable?
I was not satisfied with the answers, so I'll point to a blog post from a cable seller here. He says that 75 ohm cable should work for most types of transmission because it has the lowest attenuation. This Wikipedia article notes that attenuation also depends on the length of the cable. The answers here say it's probably OK and in some cases better, and here it says it's OK for short distances.
So I suggest using 75 ohm coax and keeping the run as short as possible.
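To see why keeping the run short matters, here is a minimal sketch of how cable length drives loss. The ~10 dB per 100 ft figure for RG6 at 2.4 GHz is an assumed ballpark, not a datasheet value; check your specific cable.

```python
# Sketch: how cable length drives signal loss at 2.4 GHz.
# The 10 dB per 100 ft figure is an assumed ballpark, not a
# datasheet value; consult your cable's attenuation table.

def cable_loss_db(length_ft: float, loss_db_per_100ft: float = 10.0) -> float:
    """Total attenuation of a coax run, in dB."""
    return length_ft / 100.0 * loss_db_per_100ft

def power_fraction_remaining(loss_db: float) -> float:
    """Fraction of input power that survives the run."""
    return 10 ** (-loss_db / 10.0)

for length in (10, 50, 100):
    loss = cable_loss_db(length)
    print(f"{length:>3} ft: {loss:4.1f} dB loss, "
          f"{power_fraction_remaining(loss) * 100:.1f}% of power remains")
```

At these assumed figures, a 10 ft run keeps about 79% of the power, but a 100 ft run keeps only 10%, which is why a short run can work where a long one fails.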
Answered by RIJIK on February 20, 2021
Yes. Coaxifi (coaxifi.com) is an example of what you're describing. You could do it with an RP-SMA antenna, or use a kit with an F-connector antenna. You would need an impedance-matching device (a balun-style transformer), but that is certainly doable. As for impedance conversion, it simply transforms the line impedance, with a slight signal loss for the conversion. Ham radio enthusiasts deal with this frequently, for instance with BNC connectors.
Just to be clear, antennas don't speak protocols, so there is no such thing as an "802.11 antenna." (There are omnidirectional or directional antennas, ones that cover just 2.4 or 5 GHz and those that cover both bands, etc.)
Answered by wifivar on February 20, 2021
You definitely can use RG6 coaxial cable with WiFi frequencies, provided that you convert the impedance. The fact that RG6 cable is marketed as being "tested to 1 GHz," "tested to 3 GHz," etc. doesn't preclude its use with higher frequencies. Take a look at the 50Ω LMR coaxial cable running between the sector antennas and the base station at virtually any cell site - in the US, those cables support a mix of frequencies that include 1.9 GHz, 2.5 GHz, and with 5G, 5.8 GHz or higher. As for 75Ω RG6 cables, cablecos offering DOCSIS 3.1 plan to reach 1.794 GHz in the near future.
To run WiFi over RG6 cabling, the main issue is attenuation over distance and connector/assembly loss. RG6 can evidently support 2.4 GHz frequencies out to 210 feet, while LMR-900-DB can support 2.4 GHz out to 1,130 feet. All you need is two impedance converters per run, one between the WiFi radio/router and the cable run in your wiring closet, and another between the wall plate and WiFi antenna in another room. You can find kits that support this from coaxifi.com or dual-comm.com.
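The distance figures above can be reproduced from a simple loss budget. This is a sketch under stated assumptions: roughly 10 dB/100 ft for RG6 and 1.9 dB/100 ft for LMR-900-DB at 2.4 GHz, a 22 dB budget, and 1 dB of connector/assembly loss — illustrative numbers, not datasheet values.

```python
# Sketch: derive a maximum run length from a loss budget.
# Per-100 ft attenuation figures below are assumed ballparks at
# 2.4 GHz; consult the manufacturer's datasheets for real numbers.

CABLES_DB_PER_100FT = {
    "RG6": 10.0,        # assumed
    "LMR-900-DB": 1.9,  # assumed
}

def max_run_ft(loss_budget_db: float, loss_db_per_100ft: float,
               connector_loss_db: float = 1.0) -> float:
    """Longest run that stays within the budget after connector loss."""
    return (loss_budget_db - connector_loss_db) / loss_db_per_100ft * 100.0

for name, loss in CABLES_DB_PER_100FT.items():
    print(f"{name}: about {max_run_ft(22.0, loss):.0f} ft within a 22 dB budget")
```

With these assumptions RG6 comes out at about 210 ft and LMR-900-DB at roughly 1,100 ft, in line with the figures quoted above.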
The other factor is the output power on the router's radio chain. More output power is better, especially if you plan to split the WiFi signal several times, so a 1 watt router would be ideal. But for passing a signal to just one other room over RG6, most routers with RP-SMA connectors should be fine, so long as the cable has no shorts and the distance isn't excessive (consult the coaxial cable calculator at timesmicrowave.com to see which distances have run efficiencies of 0.1% or higher).
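A quick dBm sketch shows why a higher-power router helps when splitting. The 3.5 dB per 2-way split and the 1 W starting power are illustrative assumptions, not measurements of any particular splitter or router.

```python
import math

# Sketch: why more transmit power helps when splitting the signal.
# The 3.5 dB cost per 2-way split and the 1 W router are assumed
# illustrative values, not measurements of specific hardware.

def watts_to_dbm(watts: float) -> float:
    return 10.0 * math.log10(watts * 1000.0)

def dbm_to_watts(dbm: float) -> float:
    return 10 ** (dbm / 10.0) / 1000.0

tx_dbm = watts_to_dbm(1.0)            # 1 W router -> 30 dBm
for splits in range(4):
    level = tx_dbm - splits * 3.5     # each 2-way split costs ~3.5 dB
    print(f"{splits} splits: {level:.1f} dBm "
          f"({dbm_to_watts(level) * 1000:.0f} mW)")
```

Three splits already cut a 1 W signal down to under 100 mW before any cable loss, so headroom at the radio directly buys you more splits per run.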
If you do have an opportunity to run 50Ω cable natively in your home or office, go for it. It's a great way to connect outdoor panel antennas or ceiling antennas where you don't need to fumble with wall plates. I'd recommend LMR-600 cable if you can afford it (around $1 per foot wholesale) and have room for a 0.59 inch jacket diameter, but if not, LMR-240 performs better than RG6 at WiFi frequencies and also is slightly smaller in jacket diameter than RG6.
One answer to this question suggests that 1 GHz is some sort of cut-off frequency on RG6. Clearly, it isn't, otherwise DOCSIS 3.1 wouldn't work. "RF people" should know that the only coaxial cables with built-in stopbands are radiating-mode leaky feeder cables, and unless you're in a train tunnel, you aren't using those. Nor are the components involved exotic - F-SMA impedance converters wholesale for under 50 cents. People who install panel antennas for in-building WiFi DAS deal with this all day (there's even a pretty picture in L-Com's latest catalog showing a WiFi-over-coax deployment at a hospital).
Answered by Eric Johnson on February 20, 2021
What you can do instead is to bring the router to the antenna on the roof and use a pair of MoCA boxes to run Ethernet over your coax.
Answered by chx on February 20, 2021
Nearly all coax is quite lossy at those frequencies, for a run more than a few feet / a meter. If you can get it to work at all, performance will be quite poor.
A better solution is to get the transceiver as close to the antenna as possible, then do a long cable run from that.
A similar thing is done for satellite antennas - ever heard of an LNB? They amplify and downshift the signal right at the antenna, to mitigate the losses of a cable run.
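The LNB trick pays off because coax attenuation is dominated by skin effect, which grows roughly with the square root of frequency. A minimal sketch of that scaling, with the √f model as a stated simplification (real cables also have dielectric loss):

```python
import math

# Sketch: why downshifting the signal at the antenna reduces cable
# loss. Skin-effect attenuation grows roughly as sqrt(frequency);
# this simple model ignores dielectric loss, which adds on top.

def relative_loss(f_ghz: float, ref_ghz: float = 1.0) -> float:
    """Per-foot loss at f relative to the loss at ref (sqrt-f model)."""
    return math.sqrt(f_ghz / ref_ghz)

# Sending 2.4 GHz down the coax vs. downshifting to 0.5 GHz first:
print(f"2.4 GHz costs {relative_loss(2.4):.2f}x the per-foot loss of 1 GHz")
print(f"0.5 GHz costs {relative_loss(0.5):.2f}x the per-foot loss of 1 GHz")
```

So shifting the signal to a lower intermediate frequency right at the antenna, the way an LNB does, roughly halves the per-foot loss in this example.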
The "LNB" is just an analogy - you need to put the access point outside, then run Ethernet cable from that. Power over Ethernet would be perfect for an application like this. Look up "outdoor wireless access point".
If you absolutely cannot run a new cable, here's a wild idea - use the existing coax cable just to provide DC voltage to the access point. Set up the access point to cross-band repeat, then use another access point inside to get the data onto the rest of your network.
Answered by Tom S on February 20, 2021
Cable-TV coax is designed for 75 ohm systems, while WiFi gear expects 50 ohms; that mismatch causes a return loss at every transition.
Also, cable-TV coax loss gets pretty bad in the 1-5 GHz range. Satellite-dish coax is rated for higher frequencies, but again, it's the wrong impedance.
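The return loss from a 75-to-50 ohm transition can be quantified with the standard transmission-line formulas, a short sketch:

```python
import math

# Sketch: quantify the 75-to-50 ohm mismatch with standard
# transmission-line formulas (no hardware-specific assumptions).

def reflection_coefficient(z_load: float, z0: float) -> float:
    return abs(z_load - z0) / (z_load + z0)

def return_loss_db(gamma: float) -> float:
    return -20.0 * math.log10(gamma)

def mismatch_loss_db(gamma: float) -> float:
    """Power lost to the reflection at the interface, in dB."""
    return -10.0 * math.log10(1.0 - gamma ** 2)

gamma = reflection_coefficient(75.0, 50.0)
print(f"reflection coefficient: {gamma:.2f}")      # 0.20
print(f"return loss: {return_loss_db(gamma):.1f} dB")
print(f"mismatch loss: {mismatch_loss_db(gamma):.2f} dB")
```

For 75 into 50 ohms the reflection coefficient is 0.2, i.e. about 14 dB return loss and only ~0.18 dB of mismatch loss per transition; the bigger practical problem at WiFi frequencies is the cable attenuation itself.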
I would choose 50 ohm semi-rigid coax and an antenna that gives gain in the intended direction. Review the flex coax loss per unit length and the connector losses, and choose the best combination.
When I was in New Zealand about 10 years ago, residents of some small beachside towns had networked all their routers to give wide-area coverage to the beach, using the RIP protocol (an option in many old routers) with designated router MAC-address sharing. They used small Yagi antennas, pointed toward the beach area, to ensure optimal gain.
Answered by Tony Stewart Sunnyskyguy EE75 on February 20, 2021
So you want to transport that 2.4 GHz (or even 5 GHz?) WiFi signal over TV coax cable?
To non-RF people it might seem like that would just work. And it does, BUT almost no signal will come through that cable.
The WiFi signal will be attenuated so much in that coax that it defeats the whole purpose of having an antenna on the roof. The same antenna mounted directly on the router might even give better coverage.
Why is that?
TV coax cables are not designed for 2.4 GHz signals; TV signals go up to about 1 GHz, and even at that frequency you can expect a lot of attenuation.
TV coax usually has a characteristic impedance of 75 ohms, while WiFi antennas, routers, etc. all use 50 ohms. There are no exceptions to that.
So no, in practice this will not work at all.
Answered by Bimpelrekkie on February 20, 2021
You must use coaxial cable of the proper impedance. The most common impedances for coax cable are 50 ohms and 75 ohms. If the cable you want to use matches the impedance of the interface AND the antenna, then go for it. But if you use cable of the wrong impedance, you will get significant attenuation of the signal, to the point where it may not work at all. In high-power equipment it may even damage the transmitter, but that is unlikely with average WiFi gear.
Answered by Richard Crowley on February 20, 2021