Francesco Poderico

The importance of designing the return current path properly, and why it is so critical!



It was my pleasure to assist a startup company in passing EMC last month. The company now has the CE mark on its product, and I am extremely pleased to hear that it has started making sales.

There were quite a few mistakes made in this project. To name a few: the common-mode choke was poorly designed, and the X and Y capacitors were placed in the wrong locations and were underrated.

Despite this, the most important error I have seen is the poor design of the return current path.

If you have been following my posts, by now you know how important it is to get the return current path right. If you fail to do this, you will create a differential-mode current, which will then be converted into both radiated and (with the latest standard) conducted emissions.

One of my previous posts also explained how to simulate the radiated emissions of a DC-DC converter, so I encourage you to take a look at that article if you would like to know how the E field is estimated at 3 meters and 10 meters.

What I haven't mentioned yet is that if you fail to design the return current path, you will increase your susceptibility tremendously. So you will have a greater chance of failing either the conducted immunity or radiated immunity tests.

Let's do a simple analysis at low frequency.

Let's assume we have a ribbon cable. Pin number 1 on this cable terminates at a high-impedance node, such as the positive input of an operational amplifier.

Let's say pin 20 is the return signal (I hate calling the return signal "GND" because it's misleading).

I chose pin 20 to indicate a pin very far away from pin 1.

There are two kinds of issues with a flawed design like this.

Wire number 1 may be susceptible to both capacitive and magnetic noise. In this post I intend to focus on the capacitive effect. If you are interested in learning more about the magnetic immunity of ribbon cables, I will discuss it in the next post.

On a single wire (pin 1 in this case), the noise voltage Vn at low frequencies is approximately

Vn ≈ 2π · f · C12 · R · Vs (i.e. 6.28 f C12 R Vs)

where Vs is the noise source voltage, f is the frequency, C12 is the parasitic capacitance between the noise source and wire 1 of our cable, and R is the load resistance.

Now, if you have connected this signal to the non-inverting input of an operational amplifier, then you can assume that R is at least 10 megaohms; let's take R = 10 MΩ and assume C12 is 10 pF.

We can see that with these values a 1 Vrms noise source induces Vn ≈ 0.0314 V (about 31 mV) on wire 1 at just 50 Hz.
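As a quick sanity check, here is a minimal Python sketch of this low-frequency estimate (my own illustration, using the assumed values above: Vs = 1 Vrms, C12 = 10 pF, R = 10 MΩ, f = 50 Hz):

```python
import math

# Assumed values from the example above
f = 50.0        # noise frequency, Hz
C12 = 10e-12    # parasitic capacitance, noise source -> wire 1, farads
R = 10e6        # load resistance at the op-amp input, ohms
Vs = 1.0        # noise source amplitude, Vrms

# Low-frequency approximation of the capacitively coupled noise:
# Vn ≈ 2*pi*f*C12*R*Vs (valid well below the corner frequency)
Vn = 2 * math.pi * f * C12 * R * Vs
print(f"Vn at {f:.0f} Hz ≈ {Vn * 1e3:.1f} mV")   # ≈ 31.4 mV
```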

The noise voltage Vn increases linearly with frequency up to a corner frequency, above which it levels off (see below).

When the frequency rises well above 1/(2π R (C12 + C23)), where C23 is the parasitic capacitance between wire 1 and the return, the coupling flattens out and the maximum noise voltage becomes Vn = Vs · C12/(C12 + C23).

Assuming C23 = C12 (just to keep the calculation simple), at relatively high frequencies a 1 Vrms noise source can induce 0.5 Vrms on wire 1.
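To see both regions at once, here is a small Python sketch (my own illustration, not from the original post) of the full coupling magnitude Vn = Vs · 2πf·C12·R / sqrt(1 + (2πf·R·(C12 + C23))²), which rises linearly at low frequency and flattens at Vs · C12/(C12 + C23) above the corner frequency:

```python
import math

# Assumed values, as in the example above
R = 10e6        # load resistance, ohms
C12 = 10e-12    # source -> wire 1 parasitic capacitance, farads
C23 = 10e-12    # wire 1 -> return parasitic capacitance, farads (C23 = C12)
Vs = 1.0        # noise source, Vrms

# Corner frequency where the response changes from rising to flat
f_corner = 1.0 / (2 * math.pi * R * (C12 + C23))   # ≈ 796 Hz here
print(f"Corner frequency ≈ {f_corner:.0f} Hz")

for f in (50, 500, f_corner, 5e3, 50e3):
    w = 2 * math.pi * f
    # Full capacitive-coupling magnitude: linear in f below the corner,
    # flat at Vs * C12 / (C12 + C23) (= 0.5 Vrms here) well above it
    Vn = Vs * w * C12 * R / math.sqrt(1 + (w * R * (C12 + C23)) ** 2)
    print(f"f = {f:8.0f} Hz  ->  Vn ≈ {Vn:.3f} Vrms")
```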

Imagine what the result would be at the input of an ADC: an error of 31 mV even at 50 Hz.

And I have not even begun to discuss the other problems with this arrangement.

Anyway, I hope I've convinced you how important it is to get right something that seems as simple as choosing the pinout of a ribbon cable. In the next post I'll show you how to estimate the noise induced by a magnetic source, and then we can talk about choosing pinouts so that the noise is minimized.

I hope you have enjoyed this post. Bye for now!

