Electricity question? From 115V to 220V and amps/watts

LucasG

Gawd
Joined
Jul 2, 2004
Messages
675
So I moved from the US to Argentina and I took quite a few things. I need some help understanding the conversion from one to the other. In the US we use 115V - 125V and in Argentina it's 220V. The thing is, I want to understand exactly how many watts the appliances I brought will use on 220V. I brought a Z-5500 5.1 speaker set, an Xbox 360 and a Netgear router, and all of them require transformers for 220V. I'm not sure where to get this information directly from the products, as they don't really specify it clearly. Some say output, but not input. Which one should I look at? Also, when doing the conversion from 220V to 115V and so on, does the wattage get divided by 2, since we use half the volts in America?

Help is appreciated.
 
1 watt = 1 volt * 1 amp

But seriously:

"Argentine electricity is officially 220V, 50Hz. Adapters and transformers for North American equipment are readily available.

The best way to use imported electrical equipment in Argentina is to purchase an adapter once there. These are available in the Florida shopping area in Buenos Aires for around US$2, or less in hardware stores outside the city center. Buildings use a mix of European and Australian plug fittings. However, the live and neutral pins in the Australian fittings are reversed so as to prevent cheap imports into Australia, so an Australian adapter may be incompatible. The local standard is the IRAM-2073, which is physically identical to the Australian AS-3112 standard (two blades in a V-shape, with or without a third blade for ground).

European standard CEE-7/7 "Schukostecker" or "Schuko" outlets and the non-grounded, but compatible, European CEE-7/16 "Europlug" outlets may still be found in some older buildings. U.S. and Canadian travelers may want to pack adapters for these outlets as well.

Many sockets have no earth pin. Laptop adapters should have little problem with this for short term use.

Some Argentine sockets accept North American plugs, particularly ones on power strips. Beware - this does not mean that these sockets deliver 110 volts. Make sure that your equipment can handle 220 volts! Simply changing the shape of the plug with a US$2 adapter will not allow 110 volt equipment to operate on 220 volt Argentinian current; unless the device is specifically designed to work on both 110 and 220 volts, irreparable damage and even fire can result. Most laptop power adapters and many portable electronics chargers are designed to work on dual voltage; check the specifications for your equipment to be sure. If your equipment cannot accept 220 volt current, you can purchase a '220 to 110' volt transformer for approximately US$6 in most Argentinian electronics shops. This is much heavier and bulkier than a small adapter.

From: wikitravel.org
 
Everyone is on the 220V boat except America.

Come on America... get with the times. 220V and metric!
 
They won't use any more power at 220 V through the adapter than they would at 120 V without it, aside from a little parasitic loss in the adapter. I think you will find that the adapter is just a step-down transformer. In that case power in equals power out. The current on the 220 V side is approximately half what it is on the 120 V side. It's actually about 0.55 times if you do the math on a calculator.
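
To make that concrete, here's a quick Python sketch (the appliance wattage is just an example, and the transformer is treated as ideal):

# An ideal step-down transformer conserves power; only the
# voltage/current split changes between the two sides.
power_watts = 120.0            # example appliance draw
v_us, v_ar = 120.0, 220.0      # US and Argentine mains voltages

i_us = power_watts / v_us      # current on the 120 V side
i_ar = power_watts / v_ar      # current on the 220 V side

print(i_us)                    # 1.0 A
print(i_ar)                    # ~0.545 A
print(i_ar / i_us)             # ~0.545 -- the "0.55 times" above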
 
I know the plugs and where to find everything; I'm just looking for the exact wattage usage of the electronics, so that I can buy voltage transformers with the closest matching watt rating instead of big transformers with way more watts than I need.

Everyone is on the 220V boat except America.

Come on America... get with the times. 220V and metric!
Agreed.

They won't use any more power at 220 V through the adapter than they would at 120 V without it, aside from a little parasitic loss in the adapter. I think you will find that the adapter is just a step-down transformer. In that case power in equals power out. The current on the 220 V side is approximately half what it is on the 120 V side. It's actually about 0.55 times if you do the math on a calculator.

So for 120V appliances running on 220V through a transformer, the amount of power they use at 120V is around half of what it would be at 220V?
 
So for 120V appliances running on 220V through a transformer, the amount of power they use at 120V is around half of what it would be at 220V?

No, it would be the same, assuming the appliance is compatible with the higher voltage. The current changes inversely with voltage, but power remains the same.
 
Some devices do not need a step-down transformer, as they handle it themselves.

What I'd do is check the power brick of each device and see if it accepts 220 V @ 50 Hz.

For example, my LCD says 110/220 V 50/60 Hz, which means it can use both. A lot of newer PSUs (the ones that don't have the red 110/220 V switch) can take universal input.
 
Everyone is on the 220V boat except America.

Come on America... get with times. 220V and metric!
While metric is futuristic, 120V vs 220V will be less of an issue in the future because things are going to be made to use less power, and things already are made to operate at 90VAC, due to the long-ass range afforded by automatically switching universal input voltage. People just need to get better circuit breakers, and PS Audio receptacles; if they have any aluminum house wiring, they need to tear all of it out and replace all of it with copper wiring. Or they could get the PS Audio Power Plant Premier.

FYI, Japan uses less than 120VAC and they like it because they're the ones that best know how to design thermal/power characteristics. Plus, even though the 120VAC standard is more likely to cause sags, 220VAC is closer to a surge. 120VAC is also better because it could eventually force full-scale appliances to be made to use less power. "Could" because the government is going to force superior private companies to define power specs for products. And when the govt does that, they will just either cause the power problem to get worse, or cause a new problem, while the original problem that was intended to be solved remains.

When the PCI-Express 2.0 slot standard increased the amount of wattage the slots would deliver, that was kind of dumb, because it just allowed nvidia and ati to ignore thermal specs.

Same thing with these 1.2kW PSUs. If the PSU companies had just capped the wattage of their products at like 800W, then not only would video cards consume less power, but they'd also run cooler.

ADD: Didn't read the post of user duke3d (the user name, not the DOS command to launch the '96 classic) before writing this. Sorry.
 

You have no idea what you're talking about.
 
I'm going to fix the ambiguities and make the changes/additions bold; if there is anything that still makes no sense, let me know, because if you tell me "you have no idea what you're talking about" then you should elaborate. I'm taking my time to elaborate, and I'm not the one who was talking shit.

While metric measurements are futuristic, and better b/c they're based on powers of 10, and b/c medicine and the rest of science are based upon it, 120V vs 220V will be less of an issue in the future because things are going to be made to use less power, and things already are made to operate at 90VAC [like PSUs; my Antec Signature 650 operates at 90VAC, due to the long-ass range afforded by automatically switching universal input voltage]. People just need to get better circuit breakers, and PS Audio receptacles if their circuit breaker sucks and if they've had their receptacles for 13 years or so, because those regulate the purity of Alternating Current [although that won't automatically solve the brownout issue of 120VAC]; if they have any aluminum house wiring, they need to tear all of it out and replace all of it with copper wiring. Or they could get the PS Audio Power Plant Premier, which purifies power completely, as indicated by PS Audio's website, and adds 10V to the input voltage.

FYI, Japan uses less than 120VAC (they use 110VAC) and they like it because they're the ones that best know how to design power characteristics, and they're more likely to make powerful machines that consume less power. 120VAC is better than 220VAC because it could eventually lead to full-scale appliances being made to use less power. "Could" because the Authoritarian government is going to force superior private companies to define power specs for products. And when the govt does that, they will just either cause the power problem to get worse, or cause a new problem, while the original problem that was intended to be solved remains.

When the PCI-Express 2.0 slot standard increased the amount of wattage the slots would deliver, that was kind of dumb; that just allowed nvidia and ati to boost their products' power specs which affect their thermal specs.

Same thing with these 1.2kW PSUs. If the PSU companies had just capped the wattage of their products at like 800W, then not only would video cards consume less power, but they'd also run cooler.

I wasn't thinking right when I was saying that 240VAC was more vulnerable to surges, because they're rated to run at a higher voltage, after I thought more about it. So I had it backwards. Any input VAC > ~120VAC is a surge for the 120VAC standard, because 120VAC is the standard at which it's supposed to operate and anything above maximum accepted input voltage results in a surge. But then, the 240VAC spec is worse at repelling brown-outs, because anything below 240VAC is a brown-out for devices rated at 240VAC.

Now all of that doesn't matter if the device has an automatic input switching range of 90-264VAC.


Basically, it would be better overall if Europe followed Japan's lead, so the whole world can just use 120VAC, which would cause things to consume less energy. The technology will be available soon to make washers, dryers and refrigerators, even house air conditioners and heaters, consume as little power as necessary so that 15 or 20A 110/120VAC receptacles can provide enough wattage for powerful appliances. If you keep the 220VAC standard, then that could encourage things to use more wattage.

Also, the Euro standard might be older than the American one in this case.
 

Lucas, if anything you brought to Argentina says 110/220 - 50/60Hz on it then you only have to use a plug adapter, no transformer needed and you are good to go. Most computer stuff is set like this.

If you have American stuff that is 110-only, you are pretty much screwed. I have bought step-up/step-down converters for American stuff before (I am an American living in Europe). It will work, but it is not a long-term or daily-use option. You will need to buy those appliances in 220 there.
 

You're still not making a bit of sense.

BTW, I believe Japan operates on 100v 50hz...

EDIT: Let me help you out some. Voltage in Volts * Current in Amps = Power in Watts. If you increase voltage, current automatically decreases so that the same amount of power is maintained. If you look at the specs of most older PSUs, it'll say something like 115/230VAC 8/4A. That means the PSU will draw 8 amps at 115 volts, but only 4 amps at 230 volts.

Even further, inefficiencies are generally caused by the loss of energy through heat dissipation. Heat is created by resistance to current. Since higher voltages use less current, there's less heat. So, it's actually more efficient to run at a higher voltage.
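
A rough sketch of that in Python, using the 115/230VAC 8/4A label as the example (the wire resistance is an arbitrary value, just to show the scaling):

# Both label ratings describe the same power: P = V * I.
print(115 * 8)           # 920 (VA drawn at 115 V)
print(230 * 4)           # 920 (VA drawn at 230 V)

# Heat dissipated in the supply wiring goes as I^2 * R, so
# halving the current quarters the loss for the same wire.
r_wire = 0.5             # ohms -- arbitrary illustrative value
print(8 ** 2 * r_wire)   # 32.0 W lost at 115 V
print(4 ** 2 * r_wire)   # 8.0 W lost at 230 V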
 
To add to ryan's post:

Typical power system losses are heat losses due to I^2 * R (another formula for power). So loss increases by the square of current times the resistance of the path.

This is why voltage is stepped up to the megavolt range for long distance power transmission - to reduce I^2*R losses as much as possible.

Multiplying voltage by 1000 will reduce current by a factor of 1000 to drive the same loads in the end (because P = I * E). This will reduce heat losses during transmission by 1000 * 1000, or 1 million times. Furthermore, the limiting factor in wiring is the amount of current which can be carried based on I^2*R heat buildup, and voltage plays no part in that. More power can be carried by a wire operating at higher voltage.
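
The scaling is easy to see with made-up numbers (the load and line resistance below are purely illustrative):

# I^2 * R loss for the same delivered power at different voltages.
p_load = 100e6                       # 100 MW to deliver
r_line = 10.0                        # ohms of line resistance (made up)

for volts in (120.0, 120e3, 1.2e6):
    current = p_load / volts         # P = I * E, so I = P / E
    loss = current ** 2 * r_line
    print(volts, current, loss)

# Each 1000x increase in voltage cuts current 1000x and
# I^2 * R loss 1,000,000x -- hence megavolt transmission lines.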
 
So for 120V appliances running on 220V through a transformer, the amount of power they use at 120V is around half of what it would be at 220V?

No, it would be the same. If we assume your 120V appliance uses 120 watts of power, then it draws 1 amp. On the 120V side of the transformer you would have 120W / 120V = 1A (120V @ 1 amp). On the 220V side of the transformer you still have the same 120 watts, but the current is now 120W / 220V = 0.55A (220V @ 0.55A).
220V x 0.55A = 120V x 1A
The voltage goes down but the current goes up.
 
You have no idea what you're talking about.
Glad you said it first.
120V vs 220V will be less of an issue in the future because things are going to be made to use less power, and things already are made to operate at 90VAC
Input voltage does not affect the amount of power an electric device uses because its current requirements will increase linearly with any decrease in voltage, as ryan_975 pointed out.
like PSUs; my Antec Signature 650 operates at 90VAC
No it doesn't.
People just need to get better circuit breakers, and PS Audio receptacles if their circuit breaker sucks and if they've had their receptacles for 13 years or so, because those regulate the purity of Alternating Current [although that won't automatically solve the brownout issue of 120VAC]
The quality of the circuit breakers has absolutely nothing to do with any voltage concerns. Also, circuit breakers do not affect power regulation.
they like it because they're the ones that best know how to design power characteristics, and they're more likely to make powerful machines that consume less power
Again, voltage does not affect power consumption.
120VAC is better than 220VAC because it could eventually lead to full-scale appliances being made to use less power.
This makes no sense.
"Could" because the Authoritarian government is going to force superior private companies to define power specs for products. And when the govt does that, they will just either cause the power problem to get worse, or causes a new problem, while the original problem that was intended to be solved remains.
Better put on your tinfoil hat then.
When the PCI-Express 2.0 slot standard increased the amount of wattage the slots would deliver, that was kind of dumb; that just allowed nvidia and ati to boost their products' power specs which affect their thermal specs.
I have a bit of news for you: when you do more work, you consume more power. There is no way to get around increased power usage for things that do more, aside from attempting to increase the efficiency which is what companies try to do with smaller processes and new transistor designs with less leakage. However, there are limitations as to how far that can go.
Same thing with these 1.2kW PSUs. If the PSU companies had just capped the wattage of their products at like 800W, then not only would video cards consume less power, but they'd also run cooler.
That would just prevent video card makers from producing high-end cards. It wouldn't make them magically run on less power.
I wasn't thinking right when I was saying that 240VAC was more vulnerable to surges, because they're rated to run at a higher voltage, after I thought more about it. So I had it backwards. Any input VAC > ~120VAC is a surge for the 120VAC standard, because 120VAC is the standard at which it's supposed to operate and anything above maximum accepted input voltage results in a surge. But then, the 240VAC spec is worse at repelling brown-outs, because anything below 240VAC is a brown-out for devices rated at 240VAC.
This also makes no sense. For any device rated at a certain input range, anything above it is a surge and anything below is a brownout, technically speaking. The specific voltage range has no effect on how susceptible a given device is to power surges or brownouts.
Basically, it would be better overall if Europe followed Japan's lead, so the whole world can just use 120VAC, which would cause things to consume less energy.
Nonsense.
If you keep the 220VAC standard, then that could encourage things to use more wattage
Higher input voltage does not affect power consumption. Man, I'm starting to sound like a broken record here.
 

Japan is weird - half of the country uses 50Hz and the other half uses 60Hz, and they're linked in the middle by DC.
 
Input voltage does not affect the amount of power an electric device uses because its current requirements will increase linearly with any decrease in voltage, as ryan_975 pointed out.

Not true! The power a device draws is defined by its design. In simple terms, it will draw what it needs to get the job done. It will be designed for a certain input voltage and current range. If you have a fixed load "resistance" and you increase the voltage across that load, the current will increase, not decrease. I = E X R. The power will also go up: P = I X E. For example, if you plugged a 120V device into a 220V outlet it would likely draw twice the current it was designed for, but only for a brief instant just before it goes up in a ball of smoke and fire. Assuming you don't blow a fuse or trip a circuit breaker.

In the case of the step-down transformer, the current draw on the input side is dependent on the current drawn by the load connected to it. The ratio depends on the way the transformer is wound: how many turns of wire are in the primary compared to the secondary. That's the basic premise for all those plug-in-the-wall DC adapters. The coils are wound to get the voltage they want on the secondary. Then they just rectify it to DC and filter it with a capacitor. Now that switching power supplies are cheap, the adapters don't need the transformer. Some can also auto-sense the input voltage, bypassing the need for an external step-down transformer in a different country.
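
Here's a sketch of that turns-ratio arithmetic for an ideal transformer (the winding counts are invented for illustration; a real transformer also has magnetizing and copper losses):

# Ideal transformer: Vs/Vp = Ns/Np, and primary current scales
# the other way, Ip = Is * Ns/Np, so power in = power out.
n_primary, n_secondary = 1000, 545   # roughly a 220:120 ratio

v_primary = 220.0
v_secondary = v_primary * n_secondary / n_primary    # ~119.9 V

i_secondary = 1.0                                    # 1 A load on the low side
i_primary = i_secondary * n_secondary / n_primary    # ~0.545 A from the wall

print(v_secondary, i_primary)
print(v_primary * i_primary)       # ~119.9 W in
print(v_secondary * i_secondary)   # ~119.9 W out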
 
I was talking in terms of an entire appliance designed to operate at different voltages, since we're discussing computers here. Obviously it isn't quite so simple as I was making it out to be, but I didn't want to start explaining theory here. And yes, I know that for fixed resistors what I said does not apply ;).
 


If E = I * R, then I = E / R, not E * R. Anyhow, we're not talking about fixed designs, we're talking about designs in general. A 200-watt circuit designed for 120V input is less efficient than a 200-watt circuit designed for 240V input.
 
I know the plugs and where to find everything; I'm just looking for the exact wattage usage of the electronics, so that I can buy voltage transformers with the closest matching watt rating instead of big transformers with way more watts than I need.

So for 120V appliances running on 220V through a transformer, the amount of power they use at 120V is around half of what it would be at 220V?

Since everyone seemed to go off on a tangent and not answer the OP's questions... here's a recap.

Power (watts) usage is going to be the same. P (in Watts) = I (in Amps) * E (in Volts). You aren't differentiating between power (watts) and current (amps).

Increased voltage = lowered current = same power draw.

Also, an increased-capacity transformer just means you have extra headroom; it doesn't mean it consumes more power than is being drawn. Transformers are usually 95-99% efficient, with only a little power lost as heat.

In fact, it's usually a good idea to have at least 25% more capability than your power requirement, because it'll lower heat and increase your efficiency.
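
So, to answer the sizing question directly: add up the input wattage from each device's label (if a power brick only lists DC output, divide by a guessed ~80% efficiency to estimate input draw), then add the headroom. A rough sketch in Python, with placeholder wattages; check your actual labels:

# Rough step-down transformer sizing: sum input watts, add ~25% headroom.
devices = {
    "Z-5500 speakers": 200,   # placeholder wattages --
    "Xbox 360":        180,   # read the real labels
    "Netgear router":   15,
}
total = sum(devices.values())
recommended = total * 1.25    # 25% headroom for efficiency and inrush
print(total, recommended)     # buy the next standard size up from this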
 
I was talking in terms of an entire appliance designed to operate at different voltages, since we're discussing computers here. Obviously it isn't quite so simple as I was making it out to be, but I didn't want to start explaining theory here. And yes, I know that for fixed resistors what I said does not apply ;).

OK, that's cool, I might have taken what you said a little out of context. There were so many quotes it was hard to follow some of what was said. I wasn't sure what smiley to put at the end so I wouldn't come off as a know-it-all. ;) I'm a retired electronics technician and don't get much of a chance to talk shop these days, so I tend to get carried away when I do get an opportunity. :)
 
Not a problem. I do the same thing sometimes :D.
 
BTW, I believe Japan operates on 100v 50hz...
Well, I knew it was at least 10VAC less than 120VAC; I was partially correct, like you were about the input frequency=]

Glad you said it first.

Input voltage does not affect the amount of power an electric device uses because its current requirements will increase linearly with any decrease in voltage, as ryan_975 pointed out.

OK, you're right, that makes sense; BUT... why does the U.S. have to have 220VAC for things like washers and dryers? Even microwaves are always less than 1.5kW, so I don't see why 220VAC would be necessary; there's more than enough power for a microwave on a 15A 120VAC circuit.

No it doesn't.

newegg's website said that was the minimum input VAC it could run at, and so did the manual; only the regular input VAC range is listed on the unit, but that's because I don't believe the extended range is allowed by the tyrannical govt(s) to be listed on the unit

The quality of the circuit breakers has absolutely nothing to do with any voltage concerns. Also, circuit breakers do not affect power regulation.

I never said they did either one of those two things; a bad circuit breaker could die on you or shut off under less load than it's rated for; also, circuit breakers have different capacities, and 20A is plenty for receptacles; finally, a 120VAC/20A circuit will eventually be plenty for an excellent quality AC meant to cover a 3k ft^2 house.

Again, voltage does not affect power consumption.

True; thanks for correcting me=]

This makes no sense.

Well, if you keep the limits per circuit at 20A (assuming 220V affords much higher than that), then that will require manufacturers of appliances to make their appliances consume less power.

Better put on your tinfoil hat then.

I have a bit of news for you: when you do more work, you consume more power. There is no way to get around increased power usage for things that do more, aside from attempting to increase the efficiency which is what companies try to do with smaller processes and new transistor designs with less leakage. However, there are limitations as to how far that can go.
I'd have to say nvidia is just relatively recently beginning to make a sincere effort to increase efficiency and ATI is nowhere near it.

That would just prevent video card makers from producing high-end cards. It wouldn't make them magically run on less power.

it wouldn't "magically" make them run on less power, but nvidia would have to increase their efficiency, through clock speeds, gddr3/5 voltage and ns rating, manufacturing process, and pcb quality/components.

This also makes no sense. For any device rated at a certain input range, anything above it is a surge and anything below is a brownout, technically speaking. The specific voltage range has no effect on how susceptible a given device is to power surges or brownouts.
That was what I said, but you're not understanding me.

The surge range for the 120VAC input standard (assuming there is no auto-switching transformer) is (>120VAC, <infinite). The surge range for the 240VAC standard is (>240VAC, <infinite). The surge range is 120VAC wider for the 120VAC standard than it is for the 240VAC standard.

Nonsense.

Higher input voltage does not affect power consumption. Man, I'm starting to sound like a broken record here.
reply to last quote: I never said it did; I was saying, the 20A max receptacle rating for the 120VAC stand is less than the max receptacle; basically, if nvidia and ati put some sincere effort into their PCB designs and die manufacturing process, as well as their GDDR RAM, and they became traditionally faster, then it would be possible for power consumption to no longer be inflated when performance and transistor counts go up.
I'm starting to sound like a broken record.


I just hope that one day the world will just settle on one final standard that's better than any existing one, although that's not really necessary because of how advanced power supplies have become, with auto-switching extended universal VAC and frequency input ranges.
 

The more transistors you have, the more power you use, and because they tend to use more power when switching states, they also use more power when they're run faster and faster.

And what the hell does this mean?
I was saying, the 20A max receptacle rating for the 120VAC stand is less than the max receptacle;
 
why does the U.S. have to have 220VAC for things like washers and dryers?
Because they use a lot of power and it makes more sense to increase the voltage to 220V rather than having to use a ridiculously high-capacity breaker. Plus, higher current and lower voltage will also cause more heat loss, which would be more significant due to the much larger amount of power being delivered.
newegg's website said that was the minimum input VAC it could run at, and so did the manual; only the regular input VAC range is listed on the unit
I'm looking at a picture of the PSU label right now and the input range is clearly stated as 100-240V.
I don't believe the extended range is allowed by the tyrannical govt(s) to be listed on the unit
What the fuck are you talking about?
I never said they did either one of those two things
Yes you did, right here:
if their circuit breaker sucks and if they've had their receptacles for 13 years or so, because those regulate the purity of Alternating Current
a 120VAC/20A circuit will eventually be plenty for an excellent quality AC meant to cover a 3k ft^2 house.
If you think a single 20A 120V circuit will ever be enough for an entire house, you're dreaming.
Well, if you keep the limits per circuit at 20A (assuming 220V affords much higher than that), then that will require manufacturers of appliances to make their appliances consume less power.
...
I'd have to say nvidia is just relatively recently beginning to make a sincere effort to increase efficiency and ATI is nowhere near it.
...
it wouldn't "magically" make them run on less power, but nvidia would have to increase their efficiency, through clock speeds, gddr3/5 voltage and ns rating, manufacturing process, and pcb quality/components.
It doesn't seem like you understand just how difficult it is to improve the power efficiency of a GPU. It is an incredibly daunting task to engineer a smaller manufacturing process, especially when you start getting into sizes as small as manufacturers are using nowadays. In most cases the advancements in the design of GPUs occur at a much faster rate than new processes become available. And once they start bumping into the 10nm barrier (which is where quantum physics starts to become a factor), it'll take a whole lot longer to come up with smaller processes, whereas GPUs will still continue to get faster and faster. What you're proposing is wishful thinking at best.
The surge range for the 120VAC input standard (assuming there is no auto-switching transformer) is (>120VAC, <infinite). The surge range for the 240VAC standard is (>240VAC, <infinite). The surge range is 120VAC wider for the 120VAC standard than it is for the 240VAC standard.
And that makes a difference how? If the power grid is designed to output within a certain voltage range, the likelihood of a power surge doesn't change regardless of what that specific range is. You're no more likely to have a power surge on a 240V circuit than you are on a 120V circuit.
then it would be possible for power consumption to no longer be inflated when performance and transistor counts go up.
http://en.wikipedia.org/wiki/Conservation_of_energy
It is impossible to increase performance without increasing power usage. You cannot do more work without using more energy. You can mitigate it with smaller manufacturing processes and more efficient components, but there are fundamental barriers that you will run into eventually. That is why the trend has been for higher power consumption with faster GPUs, even though the average process size has decreased significantly over the last several years, and the power efficiency of GPUs has also increased tremendously.

And if you know of some way to get around that pesky conservation issue, please let me know. I could make a lot of money with that kind of knowledge.
 

Processing "power" isn't directly tied down by conservation of energy. Considering there's 35W mobile CPUs that can outperform 140W prescott cpus, I think it's pretty clear that it IS possible to increase performance without increasing power usage.

Conservation of Energy just states that energy in = energy out. That's it. So if you fundamentally change the way the energy is used, and have a larger percentage go towards function and less towards low level heat loss, you do increase performance without increasing energy consumed.
 
Keep in mind that in Canada and the US the 240V is 2-phase; it's two 120V circuits. Putting appliances that draw a lot of power on 240 lets you split the load across the two circuits. In a dryer, for instance, I believe the motor that turns the drum is a 120V motor connected to one of the two 120V feeds. The heater element is on the other 120V circuit. In the stove, some of the heater elements are on one circuit and some on the other. I think the oven elements are 240V but I'm not 100% sure. By splitting the load you split your current draw across two wires instead of one. The wire diameter can be smaller and the breakers or fuses can also be a smaller rating than if you had just run one 240V circuit. I believe the idea behind it is that it's safer and cheaper to do it that way. Just my 2c worth. :D
 
Processing "power" isn't directly tied down by conservation of energy. Considering there's 35W mobile CPUs that can outperform 140W prescott cpus, I think it's pretty clear that it IS possible to increase performance without increasing power usage.
I didn't say that processing power is limited by energy, I said that work is. Prescott CPUs do more work for a given amount of processing power, which is why they require more energy. However, like I said, there is a limit as to how far you can increase the efficiency of a processor, and you will run into the fundamental barriers eventually. That is unavoidable. It is foolish to think that you can continue to increase efficiency indefinitely so that you will always be able to increase performance without affecting power consumption.
 
Keep in mind that in Canada and the US the 240V is 2-phase; it's two 120V circuits. Putting appliances that draw a lot of power on 240 lets you split the load across the two circuits. In a dryer, for instance, I believe the motor that turns the drum is a 120V motor connected to one of the two 120V feeds. The heater element is on the other 120V circuit. In the stove, some of the heater elements are on one circuit and some on the other. I think the oven elements are 240V but I'm not 100% sure. By splitting the load you split your current draw across two wires instead of one. The wire diameter can be smaller and the breakers or fuses can also be a smaller rating than if you had just run one 240V circuit. I believe the idea behind it is that it's safer and cheaper to do it that way. Just my 2c worth. :D

Wrong, it's a single phase. We get 120v by splitting it with a center tap at the transformer. In an electric dryer, the motor does connect to one of the incoming hots and a neutral/common for 120v, but the heating element connects to both incoming hots for the full 240v.
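
A toy sketch of that split-phase arrangement (values are nominal; the two hot legs are the same 120V waveform, 180 degrees apart):

import math

def leg_voltage(v_rms, t, phase=0.0, freq=60.0):
    # Instantaneous voltage of one hot leg relative to neutral.
    return v_rms * math.sqrt(2) * math.sin(2 * math.pi * freq * t + phase)

t = 0.002                               # an arbitrary instant in the cycle
hot_a = leg_voltage(120, t)             # leg A to neutral: 120 V RMS
hot_b = leg_voltage(120, t, math.pi)    # leg B is the inverted waveform

print(hot_a)             # ~116 V at this instant
print(hot_a - hot_b)     # ~232 V -- across the two hots you see 240 V RMS
# Dryer motor: hot A to neutral (120 V).
# Heating element: hot A to hot B (240 V).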
 
I didn't say that processing power is limited by energy, I said that work is. Prescott CPUs do more work for a given amount of processing power, which is why they require more energy. However, like I said, there is a limit as to how far you can increase the efficiency of a processor, and you will run into the fundamental barriers eventually. That is unavoidable. It is foolish to think that you can continue to increase efficiency indefinitely so that you will always be able to increase performance without affecting power consumption.

Read about quantum computing. http://en.wikipedia.org/wiki/Quantum_computer

And while you did say "work" the second time, you said "It is impossible to increase performance without increasing power usage." and this is what I had an issue with.

I also didn't say anything about indefinitely... just that it's very achievable to increase processing power while lowering power consumption. That's what die shrinks and many other manufacturing process changes have accomplished.
 
And while you did say "work" the second time, you said "It is impossible to increase performance without increasing power usage."
And right after that, I said this:
You cannot do more work without using more energy.
You took that one part out of context.
That's what die shrinks and many other manufacturing process changes have accomplished.
Except even with all of that, power consumption is still going up, because those aren't really solutions, they're only ways to avoid the problem.
 
You took that one part out of context.

Except even with all of that, power consumption is still going up, because those aren't really solutions, they're only ways to avoid the problem.

I understand what you meant, but if you're still defending how you said it... you mixed physics references with functional references. As far as physics goes, you're right... functionally, you're not. You can increase efficiency, and get more *computational* work done with less energy. This is a functional truth.
 