Using a capacitor for voltage smoothing

Nazo

My grandmother bought a lighted magnifier at a thrift store or something and it didn't come with its power supply. Luckily it had the info on the plug: 10V @ 50mA (well, it's an LED, but that is still pretty low, lol.) So I tried to order an adapter for it, but of course there aren't a lot of 10V ones out there. I wanted to keep things simple and cheap, though, and preferred not to have to make a 12V work or something. I thought I found a good one on Mouser -- a bit overkill (500mA) but perfectly sufficient really. Unfortunately, when it shipped I discovered that apparently I had ordered an AC adapter! I've already cut the end off to attach the somewhat unusually sized connector this thing uses, so returning it isn't really an option.

Rather than make things complicated though, since this is a pretty low budget project, I decided just to convert AC to DC with diodes. Apparently this is called a "diode bridge" (which I'd had no idea about before, but managed to find out via searching for something completely unrelated.) I ran across this article on Wikipedia once I knew what to look for: http://en.wikipedia.org/wiki/Diode_bridge They mention using a capacitor in parallel to smooth the output. From what it sounds like they are saying though, I'd need one rated for the output voltage I want? Eg 10V? Or do I misunderstand? I've never really quite gotten the hang of capacitors I must admit. Someone else told me differently on the matter (would you believe someone at RadioShack who actually understood what a capacitor even IS? Lol, lately all they seem to be good at there is stuff like cellphones...)

I've managed to salvage a few things so hopefully I won't have to buy anything. In fact, right after I went to all the trouble to actually connect four diodes together, I found an actual diode bridge in an old broken PSU, so took that. In looking for a smaller capacitor, I did find a 16V, 100μF electrolytic capacitor. Would something like that work if it doesn't actually have to be 10V? Should it be more or less capacitance?

BTW, should I even bother to run a resistor across it to discharge it? It's pretty low voltage (even the 16V isn't much really) and I would be happier if power draw when it's not in use was kept at a minimum. Of course, the whole thing is pretty low power to begin with. However, I don't think it's likely to ever get discharged by touch or anything anyway.
 
Yes, you need a capacitor here, it needs to be rated for greater than the peak voltage you will encounter. If you have a 120V -> 10V transformer, 10V is actually the rms value of the waveform - when you rectify the 10Vrms, the actual DC peak voltage will be 14.1 - 2*(0.65) = 12.8V, which is the minimum voltage the capacitor must be rated for (margin is always left with cap voltages, 16V will work fine here). Of course, depending on how the transformer is rated, the no-load voltage could be well above 10Vrms, so it is wise to measure the transformer/check the datasheet, or to just use a 25V cap.
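(For anyone following along, the arithmetic above can be checked in a few lines of Python; the 0.65V per-diode drop is an assumed typical value, not a measured one:)

```python
import math

# Peak DC voltage after a full-wave bridge, from the figures above.
# Assumes an ideal sine input and ~0.65 V drop per diode
# (two diodes conduct at any instant in a bridge).
V_RMS = 10.0      # transformer secondary, rated rms
V_DIODE = 0.65    # per-diode forward drop (assumption)

v_peak = V_RMS * math.sqrt(2)          # ~14.1 V sine peak
v_dc_peak = v_peak - 2 * V_DIODE       # peak seen by the cap
print(round(v_dc_peak, 1))             # 12.8
```

That 12.8V is why a 10V-rated cap would be marginal and a 16V one is comfortable.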

Now there are two approaches to reducing the voltage: add series diodes, or use a resistor in series with the load (regulation is an option as well but slightly more complicated and frankly unnecessary). Diodes are the simplest option, just add as many in series as needed to bring the peak voltage down to 10V: 2.8V at ~0.7V each = 4 diodes - this will be bulky, so the resistor seems to be the better option here (a single zener diode could be used but will be harder to find). Using a resistor, calculate the resistance to drop 2.8V as R = 2.8/0.05 = 56Ω; anything slightly above or below this will work. Then calculate the peak power in the resistor - P = VI = 2.8*0.05 = 0.14W, so a 1/4W resistor will work here (this isn't entirely accurate, as a resistance other than 56Ω will change the dissipation, but you get the idea). You could actually choose the resistor power rating based on the rms power, but I'm being lazy here since the rms power will be fairly close to the peak power.

Also worth noting here that even though we want to reduce the voltage, the resistor is a better option for reducing voltages fed to current sinks (LEDs). Driving a current sink with a voltage source directly is quite a bad idea in fact - you always need to be able to add some resistance between them, which means having a higher supply voltage to account for the resistor's voltage drop. In this case though, if the light is designed for a standard AC/DC adapter, it will already have some series resistance added.

The bleeder resistor (discharges the cap) isn't necessary here - only useful if you want high-voltage caps to be discharged after removing power.

The capacitor will determine the ripple voltage seen by the lamp; here, however, the current waveform is more important with LEDs (intensity is proportional to current). If a simulation is performed with the load + load resistor modeled as 50Ω with a 10V zener (470u smoothing cap): the peak current is 37mA and the minimum is 28mA, with an average around ~33mA. Since the 50Ω resistor is limiting current too much, we decrease it to 33Ω and get an average of ~46mA while reaching 52mA peak, which is acceptable. Decreasing the cap to 100u results in an average of 38mA, which may not be acceptable for sufficient light. Moving to 220u results in ~43mA average, which isn't much different from the 470u cap - this is probably the optimal value.
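A crude sanity check on those cap values is the standard full-wave ripple approximation, dV ≈ I / (2·f·C). This assumes 60 Hz mains and treats the lamp as a constant ~50 mA draw (both assumptions, and it ignores the zener-like LED behavior the simulation above captures), so expect it to overstate ripple somewhat:

```python
# Rough full-wave rectifier ripple estimate: dV = I / (2 * f * C).
# Assumes 60 Hz mains and a constant 50 mA load (both assumptions).
F_MAINS = 60.0   # Hz
I_LOAD = 0.05    # A

for cap_uF in (100, 220, 470):
    c = cap_uF * 1e-6
    dv = I_LOAD / (2 * F_MAINS * c)
    print(cap_uF, "uF ->", round(dv, 2), "V ripple")
```

Even this rough estimate shows why 100µF sags noticeably while 220µF and 470µF are in the same ballpark as each other.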
 
It seems to me that all this effort is a bit unnecessary--why not simply purchase a wall-wart with the necessary specs?
 
Because of the reasons I said above..... BTW, what effort? The point here was a low effort solution. I grabbed a diode bridge from an old and broken PSU and a capacitor from another broken thing. No running to the store, no buying things online and waiting for them to arrive, and so on. It's not any more effort at this point than it would have been to order a new plug and make it work even had that been possible (which, as I said is not. It would have to get more complicated since I would then need voltage regulation presumably.)

Yes, you need a capacitor here, it needs to be rated for greater than the peak voltage you will encounter. If you have a 120V -> 10V transformer, 10V is actually the rms value of the waveform - when you rectify the 10Vrms, the actual DC peak voltage will be 14.1 - 2*(0.65) = 12.8V, which is the minimum voltage the capacitor must be rated for (margin is always left with cap voltages, 16V will work fine here). Of course, depending on how the transformer is rated, the no-load voltage could be well above 10Vrms, so it is wise to measure the transformer/check the datasheet, or to just use a 25V cap.
In this case it goes the opposite direction. In fact, it may be that it is overrated. When I measure the no-load voltage after going through the bridge it is actually BELOW 10V. Even given that multi-testers like this apparently do RMS it's still obviously going to be lower rather than higher. The LED is also coming out fairly dark and dull. I'm thinking now that it may actually be a 12V LED and they use a low voltage on purpose, but I'm not really sure. (It would make sense since it's kind of hard to believe that there are a lot of 10V LEDs out there and I seriously doubt they'd do a 9V or something and run it higher -- especially given that they didn't exactly make it user-replaceable.)

Also worth noting here that even though we want to reduce the voltage, the resistor is a better option for reducing voltages fed to current sinks (LEDs). Driving a current sink with a voltage source directly is quite a bad idea in fact - you always need to be able to add some resistance between them, which means having a higher supply voltage to account for the resistor's voltage drop. In this case though, if the light is designed for a standard AC/DC adapter, it will already have some series resistance added.
I think I'm going to have to do without then. For one, it's dark enough already. Lowering the voltage any further could get pretty messy. For another, it has no battery capabilities, so must be designed for external power sources right off. If I wanted to add any more resistance I'd surely have to get another power supply. She has already spent quite enough for something that's supposed to be cheap as it is though and I'm not even sure she's going to be very happy with how dark it comes out already.

Worst case scenario: if it does work out for her, but the LED tears up, I'll replace it myself and maybe do something better (such as a 9V LED combined with some resistance to lower the supply's voltage. I have the diode bridge and capacitor in a small enclosure and I couldn't really get heat shrink tubing to work on the output connectors due to the way things are connected, so I could easily enough make changes later on if need be. She could then have a much brighter light on the thing even.) I guess the most important thing is to first see if it does any good as it is.

The bleeder resistor (discharges the cap) isn't necessary here - only useful if you want high-voltage caps to be discharged after removing power.
Yeah, I kind of figured it would be no big deal with such a low power supply. At 120V it seems like a pretty good idea, but surely the worst a 16V, 100μF capacitor could do is hurt a little? Certainly it's nowhere near the lethality of some they put on full mains power coming in.
 
In this case it goes the opposite direction. In fact, it may be that it is overrated. When I measure the no-load voltage after going through the bridge it is actually BELOW 10V. Even given that multi-testers like this apparently do RMS it's still obviously going to be lower rather than higher. The LED is also coming out fairly dark and dull. I'm thinking now that it may actually be a 12V LED and they use a low voltage on purpose, but I'm not really sure. (It would make sense since it's kind of hard to believe that there are a lot of 10V LEDs out there and I seriously doubt they'd do a 9V or something and run it higher -- especially given that they didn't exactly make it user-replaceable.)
How are you measuring the AC voltage? Technically, a true-rms meter should read the same voltage before and after the bridge-rectifier (no smoothing cap), but you should measure it directly at the transformer winding to be sure. I'm not entirely sure what's happening here, even if it's below 10Vrms you'll still have margin since the rectified voltage will be 41% higher.

LEDs can't really be driven with too low of a voltage (diode I-V curve) - it doesn't begin to produce much light until you reach around 0.3V under the rated forward voltage. This sharp conduction is the reason why you need a current-limiting resistor for driving LEDs from a voltage source - increase the voltage slightly above the rated forward voltage and the LED could die very quickly.
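The sharp conduction described above falls out of the Shockley diode equation, where current grows exponentially with voltage. This sketch uses illustrative values for the saturation current and ideality factor (not any particular LED), just to show how hard the knee is:

```python
import math

# Shockley diode equation sketch: I = I_S * (exp(V / (n * V_T)) - 1).
# I_S and N below are illustrative assumptions, not real LED parameters.
I_S = 1e-12   # saturation current, amps (assumed)
N = 2.0       # ideality factor (assumed)
VT = 0.02585  # thermal voltage at ~300 K, volts

def diode_current(v):
    return I_S * (math.exp(v / (N * VT)) - 1)

# How much does current collapse 0.3 V below a nominal 3.2 V forward voltage?
ratio = diode_current(3.2) / diode_current(3.2 - 0.3)
print(round(ratio))  # a drop of hundreds-fold for these assumed parameters
```

With real parts the exact factor varies, but the point stands: a few tenths of a volt is the difference between bright and barely glowing, or between fine and burned out.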
I think I'm going to have to do without then. For one, it's dark enough already. Lowering the voltage any further could get pretty messy. For another, it has no battery capabilities, so must be designed for external power sources right off. If I wanted to add any more resistance I'd surely have to get another power supply. She has already spent quite enough for something that's supposed to be cheap as it is though and I'm not even sure she's going to be very happy with how dark it comes out already.
Depends on how the light's configured internally. Any single white LED that isn't rated at around 3.2V either has a dropper resistor or has a small upconverter integrated. A 10V input either means multiple LEDs are in series or it already has a dropper resistor installed. If your transformer is capable of close to 10Vrms, then the rectified voltage will still be large enough.
 
How are you measuring the AC voltage? Technically, a true-rms meter should read the same voltage before and after the bridge-rectifier (no smoothing cap), but you should measure it directly at the transformer winding to be sure. I'm not entirely sure what's happening here, even if it's below 10Vrms you'll still have margin since the rectified voltage will be 41% higher.
I have a decent quality digital multi-meter and simply put it in voltage test mode. It has a button that switches it to testing AC voltage, so I initially tested from the adapter with that and now have tested the voltage coming out in normal DC testing mode (the default). It hasn't dropped a LOT, just a bit. You did mention using diodes to lower voltage, so I'm wondering if all I'm seeing is just that. Mostly I'm just wondering if they built it to be closer to what it says it does than usual really. In particular, I noticed that the voltage REALLY didn't change much with or without load, which makes me wonder if it has some regulation or something already anyway.

If you'd like to try to look up more info, here is the one I got: http://www.mouser.com/Search/ProductDetail.aspx?R=412-210054 Here's a direct link to the data sheet: http://www.mouser.com/catalog/specsheets/XC-600088.pdf Based on that it seems like it should be more. I don't know what really to tell you about that though.

I can say that the diode bridge is almost certainly seriously overspecced for the mere 10V @ 50mA this magnifier is rated at though. It came out of a PSU after all. I'm not sure how much it actually had to handle -- it may not have actually been directly on the AC-in after all -- but it surely has to handle a LOT more current if nothing else.

LEDs can't really be driven with too low of a voltage (diode I-V curve) - it doesn't begin to produce much light until you reach around 0.3V under the rated forward voltage. This sharp conduction is the reason why you need a current-limiting resistor for driving LEDs from a voltage source - increase the voltage slightly above the rated forward voltage and the LED could die very quickly.
Well, I know they can be pretty limited, but it seems like the range is more than that on some I've used in the past. In particular, I've used some flashlights and would swear that the total voltage from all the batteries dropped more than that amount and still managed to power it. I've noticed that they do still do a bit of the incandescent effect though where they start to dim very quickly as the voltage gets close to their lowest limits.
 
I have a decent quality digital multi-meter and simply put it in voltage test mode. It has a button that switches it to testing AC voltage, so I initially tested from the adapter with that and now have tested the voltage coming out in normal DC testing mode (the default). It hasn't dropped a LOT, just a bit. You did mention using diodes to lower voltage, so I'm wondering if all I'm seeing is just that. Mostly I'm just wondering if they built it to be closer to what it says it does than usual really. In particular, I noticed that the voltage REALLY didn't change much with or without load, which makes me wonder if it has some regulation or something already anyway.
Unless your measurements are deviating from the transformer specs significantly (11.6V +/-5% open-circuit), there's no problem. 'Loading' is a relative term - a 50mA load on a transformer rated for 500mA isn't going to affect output much, it will still be close to the no-load rating. What are you using for a load?
Well, I know they can be pretty limited, but it seems like the range is more than that on some I've used in the past. In particular, I've used some flashlights and would swear that the total voltage from all the batteries dropped more than that amount and still managed to power it. I've noticed that they do still do a bit of the incandescent effect though where they start to dim very quickly as the voltage gets close to their lowest limits.
Depending on the flashlight, it may have a boost converter to increase the voltage - other than that, I can't say much about them in general. I may have exaggerated the useful range of LEDs in terms of actually working, but you generally want to stay within a 0.3V-wide window to approach a decent intensity level.
 
If you've hooked things up properly you should be seeing ~12.5-15VDC at the output of the rectifier with a suitable capacitor across it. If the voltage is less, something is wrong. Make sure you're measuring AC/DC as appropriate.

I suspect it's probably safe to run this device at whatever comes out the rectifier, the adapters shipped with these types of gadgets typically have no regulation either (the transformer based ones anyway), and if you had measured it, you'd likely see it running significantly higher than spec'd as well. A 9V one would work too, and they're easy to find. Can probably get one for a buck or two at your local thrift shop.
 
Unless your measurements are deviating from the transformer specs significantly (11.6V +/-5% open-circuit), there's no problem. 'Loading' is a relative term - a 50mA load on a transformer rated for 500mA isn't going to affect output much, it will still be close to the no-load rating. What are you using for a load?
Yeah, it's below the official specs for "load" (15% according to that data sheet.) I just used the LED magnifier itself as the "load test." After all, that's all it is ever going to be powering, so it doesn't matter what it might be under different circumstances.

If you've hooked things up properly you should be seeing ~12.5-15VDC at the output of the rectifier with a suitable capacitor across it. If the voltage is less, something is wrong. Make sure you're measuring AC/DC as appropriate.
I'm measuring just fine. I don't claim to be any expert, but trust me when I say that I can do that much just fine. (Heck, I've managed to use it to calibrate a SCPH-1001 PSX system's laser which of course has to be measured correctly or the laser can burn out in a hurry or not work on anything.) But you really think a 10VAC power supply should come out as 12.5-15V out of a diode bridge? That sounds more than a bit excessive to me at least.


Anyway, I've given the thing to her now, so can't really do any more with it unless it should break down later.
 
But you really think a 10VAC power supply should come out as 12.5-15V out of a diode bridge? That sounds more than a bit excessive to me at least.

Unloaded (which it basically is with 50mA - or very likely much less - on it) that transformer is specified for 11.5VAC; multiply by sqrt(2) to get the peak voltage, 16.3Vpk. Drop ~1.4V in the bridge for very close to 15VDC. Loaded it should deliver 10VAC; same math = ~12.75VDC. The actual voltage will be a bit less depending on the smoothing cap used, but with the load you're using the cap needed to keep the voltage up isn't that big - though if you're using 100uF as mentioned, that's probably why you're seeing a low DC output.
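The unloaded-vs-loaded numbers above in a few lines (assuming ~0.7V per diode, two diodes conducting in the bridge):

```python
import math

# Expected DC peak out of the bridge for the transformer's two rated points.
# Assumes ~0.7 V drop per diode, two conducting at a time in the bridge.
V_DIODE = 0.7

for label, v_rms in (("unloaded", 11.5), ("loaded", 10.0)):
    v_peak = v_rms * math.sqrt(2)        # sine peak from the rms rating
    v_dc = v_peak - 2 * V_DIODE         # after the bridge drops
    print(label, round(v_peak, 1), "Vpk ->", round(v_dc, 2), "VDC")
```

That gives roughly 14.9VDC unloaded and 12.74VDC loaded, which is where the ~12.5-15VDC range quoted earlier comes from.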

If it's working for you, not much else to say, I wasn't sure if you still had questions or if you got the gadget working.
 