I have an old monitor that I use as a secondary screen with my laptop when I'm off grid at the cabin. The rating on the sticker is 100-240V 1.5A. Is that drawing a lot of power from my solar system?
I have another, newer monitor in the city that is rated 19V 0.8A, which I'm guessing is more efficient. Could someone weigh in on how much more efficient it is? Is my old monitor an energy hog or just so-so? I'm thinking I should switch them out.
Ok, watts are volts times amps, so that is the direct comparison to make. For the first monitor, it's 120V x 1.5A = 180 watts. The second monitor is 19V x 0.8A = 15.2 watts. So, superficially, it appears that the new monitor is far more efficient.
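If you want to check the arithmetic yourself, here's a quick Python sketch using just the nameplate numbers from this thread (these are ratings, i.e. worst case, not measured draw):

```python
# W = V x A, using the sticker ratings (maximums, not measured draw)
def watts(volts, amps):
    return volts * amps

old_monitor = watts(120, 1.5)   # 100-240V sticker, assuming 120V North American mains
new_monitor = watts(19, 0.8)    # DC figures printed on the newer monitor

print(f"old monitor: {old_monitor:.1f} W rated")   # 180.0 W
print(f"new monitor: {new_monitor:.1f} W rated")   # 15.2 W
```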
But that 19V x 0.8A is powered via a little black cube that you plug into the wall, right? Those numbers are what the monitor itself is pulling, but what is the total being pulled by the black cube? I'd guess it's a lot closer to monitor #1 than your numbers lead you to believe.
Looking at the cube powering my own monitor here in front of me as I type this, the numbers are Input: 120V x 1.5A, Output: 19V x 2.6A. So my numbers are 180W in vs 49.4W out. I suspect your numbers are close to the same.
So, in reality, your two monitors are likely to be almost the same. I never leave my computer on 24/7. Yes, shut off both monitors if you are not actively using them; they are a big drain over a 24-hour period. Walk around the house and look at whatever other black cubes you have stuck on the walls. They are all sucking power 24/7. Even if you unplug whatever they power, they are still sucking watts in the background. The only way to shut them off is to unplug them from the wall.
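To put rough numbers on that 24/7 drain, here's a little sketch; the standby draw below is an assumed example figure, not a measurement of any particular cube:

```python
# Energy = power x time. Standby draw is an assumed example (measure your own gear!).
standby_watts = 0.5            # assumed phantom draw of one wall cube left plugged in
hours_per_day = 24

wh_per_day = standby_watts * hours_per_day        # 12.0 Wh/day
kwh_per_year = wh_per_day * 365 / 1000            # ~4.4 kWh/year

print(f"{wh_per_day:.1f} Wh/day, {kwh_per_year:.1f} kWh/year per cube")
```

Small per cube, but it adds up across a whole house, and on a small off-grid battery bank every watt-hour counts.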
Generally speaking, any new monitor you purchase today will use far less energy than something that is 10+ years old. The technology has just gotten so much better. Here is a link to Energy Star rated monitors.
Notice the most efficient monitor uses a mere 3.64 watts when it's on and 0.05 watts in sleep mode, which is mind-blowing compared to the way they used to be.
Fascinating! Thank you everyone for the intel. I'll have to wait until I'm at the cabin again to see how much the power adapter draws so that I can add it to the monitor usage.
I'll post again when I have that number. I really appreciate your comments!
My friend went to the cabin for me and reported back that there is no adapter on the monitor. Here are my figurings... do I have this right?
City monitor:
19V x 0.8A = 15.2 watts
plus adapter:
120V x 0.6A = 72 watts
Add them together for total watts used: 72 + 15.2 = 87.2 watts
Cabin off grid monitor:
120V x 1.5A = 180 watts (no adapter)
So this means the old clunky cabin monitor is LESS energy efficient, which is what I would have guessed.
The city adapter says: input 100-240V 0.6A and output 19V 0.84A. I've used the input numbers in my calc above. Let me know if I have that wrong.
Another question I have is why do you translate "100-240v" to 120v in the formula? What does the range 100-240v mean?
City monitor:
The adapter can take in anywhere from 100-240 volts AC (that range just means it works on wall power anywhere in the world; you use 120 volts in the math because that's the nominal wall voltage in North America) and puts out 19 volts DC at a maximum of 0.84 amps (= roughly 16 watts).
The monitor's rated draw is 15.2 watts, and there will be some energy loss as the adapter does the conversion, but you don't add the input and output ratings together. I would guess it pulls a combined total of 20 watts maximum from the wall.
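If you want a rough way to estimate the wall draw, you can divide the DC load by an assumed adapter efficiency. The 85% below is a typical guess for a small switch-mode brick, not a measured value for this particular adapter:

```python
# Wall draw ~= DC load / conversion efficiency. Efficiency is assumed, not measured.
monitor_dc_watts = 15.2
assumed_efficiency = 0.85      # rough guess for a small switch-mode adapter

wall_watts = monitor_dc_watts / assumed_efficiency
print(f"estimated draw at the outlet: {wall_watts:.1f} W")   # ~17.9 W
```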
The best way to know for sure is to measure it. Devices like this Kill A Watt meter are simple to use: plug it into the wall outlet, plug your devices into it, turn them on, then read the wattage. I'm not recommending this specific product; it's just the cheapest one that came up in a quick search, to use as an example.
Mike Barkley wrote: The best way to know for sure is to measure it. Devices like this Kill A Watt meter are simple to use: plug it into the wall outlet, plug your devices into it, turn them on, then read the wattage. I'm not recommending this specific product; it's just the cheapest one that came up in a quick search, to use as an example.
I second that!
For instance, just because a power adapter is rated to provide 2 amps doesn't mean the device is actually drawing that much. Also, when comparing AC to DC you have to be careful which voltage you're using: the DC-equivalent of an AC voltage is its "RMS voltage," which for a sine wave is the peak voltage times roughly 0.707 (the 120V on your outlet is already an RMS figure). And those are just two things that occur off the top of my head. I'm sure there are other factors that could come into play in a given situation.
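To illustrate the RMS point, assuming a clean sine wave:

```python
import math

# For a sine wave, RMS = peak / sqrt(2), i.e. peak x ~0.707.
# The "120 V" on a North American outlet is already the RMS value.
rms_volts = 120.0
peak_volts = rms_volts * math.sqrt(2)

print(f"peak: {peak_volts:.0f} V")                 # ~170 V
print(f"back to RMS: {peak_volts * 0.707:.0f} V")  # ~120 V
```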
Agreed -- the only way to know for sure is to measure. I don't know how well a Kill A Watt would work on an off-grid system though. Does it report real-time consumption as well as averages over time?
Anyway, the ratings on an electrical device indicate its maximum consumption. The actual usage is often lower, and depends on many factors -- some of which are under the user's control.
City monitor:
The adapter can take in anywhere from 100-240 volts AC (that range just means it works on wall power anywhere in the world; you use 120 volts in the math because that's the nominal wall voltage in North America) and puts out 19 volts DC at a maximum of 0.84 amps (= roughly 16 watts).
The monitor's rated draw is 15.2 watts, and there will be some energy loss as the adapter does the conversion, but you don't add the input and output ratings together. I would guess it pulls a combined total of 20 watts maximum from the wall.
Ok! So then the city monitor IS more efficient. I'll get a watt meter and double check. Thanks so much.
We only use high-quality power strips at all "wall outlets", since most standard wall outlets are way down there by the floor. Thus all our devices have an off switch, by virtue of the power strip, even power blocks without one. The strip is up high, on our desks, furniture, etc., so no more bending over to get down to wall outlet level, and there are more outlets, surge protection, etc.
With that off switch at the power strip, no power is being used anywhere ... confirmed by measuring devices like the Kill A Watt (agreed, that's the only way to know for sure).
Modern SMPS power blocks (aka wall warts) consume next to nothing if their downstream device is off. The problem is the huge variation in downstream devices truly being "off": if one has a display (time clock, status light, etc.) or other modern "convenience" features, then it's in some state of "on" and consuming power. Power strips and on/off switches in between plug and outlet all help out in this particular game of whack-a-mole.
You can't measure enough, as "convenience" means different things among manufacturers and consumers.