The Mysterious 50 Ohm Impedance: Where It Came from and Why We Use It (2021)

🔥 222 · 💬 78 · 11 months ago · resources.altium.com · amelius
When we talk about S-parameters, impedance matching, transmission lines, and other fundamental concepts in RF/high-speed PCB design, the concept of 50 Ohm impedance comes up over and over. So where did the 50 Ohm impedance standard come from, and why is it important? Taken in isolation, selecting 50 Ohms would seem totally arbitrary: why not 10 Ohms or 100 Ohms? If you're ready for a history lesson on the 50 Ohm impedance value, keep reading.

For air-dielectric coax, roughly 77 Ohms minimizes attenuation and roughly 30 Ohms maximizes power handling. 50 Ohms is pretty close to the mean of those two values (the arithmetic mean is 53.5 Ohms, the geometric mean about 48 Ohms), and it's close to 60 Ohms, so it seems natural to assume this compromise is the reason for the 50 Ohm impedance standard.

I prefer to think of reference impedance in terms of your desired termination impedance: you're shooting for 75 or 50 Ohm impedance at each port, and S-parameter measurements show you how far your design has deviated from this goal. By using the term "reference media", we're making a comparison between our DUT/interconnect and an idealized 50/75 Ohm cable, 50/75 Ohm port, or another component with 50/75 Ohm input impedance.

Whether you need to design to 50 Ohm impedance or some other value, the PCB layout features in Altium Designer® include the tools you need for high-speed design and RF design.
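The two numerical claims above are easy to check: the compromise between the low-loss (~77 Ohm) and max-power (~30 Ohm) coax impedances, and the reflection coefficient that quantifies how far a load deviates from the 50 Ohm reference. A minimal sketch (the constants and helper names here are illustrative, not from the article):

```python
import math

# Classic air-dielectric coax trade-off (values as cited in the article):
Z_LOW_LOSS = 77.0    # impedance that roughly minimizes attenuation
Z_MAX_POWER = 30.0   # impedance that roughly maximizes power handling

arith_mean = (Z_LOW_LOSS + Z_MAX_POWER) / 2           # 53.5 Ohm
geo_mean = math.sqrt(Z_LOW_LOSS * Z_MAX_POWER)        # ~48.06 Ohm

def reflection_coefficient(z_load: float, z_ref: float = 50.0) -> float:
    """Gamma = (Z_L - Z0) / (Z_L + Z0): deviation from the reference impedance.

    Gamma = 0 means a perfect match to the reference; |Gamma| -> 1 means
    total reflection (open or short).
    """
    return (z_load - z_ref) / (z_load + z_ref)

def return_loss_db(z_load: float, z_ref: float = 50.0) -> float:
    """Return loss in dB; larger is better (less reflected power)."""
    gamma = abs(reflection_coefficient(z_load, z_ref))
    return float("inf") if gamma == 0.0 else -20.0 * math.log10(gamma)

if __name__ == "__main__":
    print(f"arithmetic mean: {arith_mean:.1f} Ohm")     # 53.5 Ohm
    print(f"geometric mean:  {geo_mean:.2f} Ohm")       # 48.06 Ohm
    # A 75 Ohm load measured against a 50 Ohm reference:
    print(f"Gamma(75 -> 50): {reflection_coefficient(75.0):.3f}")  # 0.200
```

For example, a 75 Ohm component on a 50 Ohm port reflects with Gamma = 0.2, i.e. about 14 dB return loss, which is one concrete way S-parameters "show how you've deviated" from the reference.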