Power consumption

Do PCs really use a lot of electricity in a house? Every time the bill comes in, the blame falls on me for using the PC too much. It's a Dell OptiPlex with a 3rd gen i7 and a GTX 960, and I use it 6-8 hours a day, with about 3 hours of gaming; the rest is just me dicking around on the Internet and working. Anyway, which appliances actually use a ton of electricity and are usually the real culprit?

  1. 2 years ago
    Anonymous

    Water heaters, kettles, and lighting usually consume the most, as far as i know.

    • 2 years ago
      Anonymous

      the biggest power consumer in most houses is gonna be air conditioning this time of year, hands down. your PC probably uses like 50 watts when you're puttering around on the internet and maybe close to 250 when you're playing a game (assuming the game gives your PC a real workout; playing Terraria or something will hardly kick it off idle).

      things like electric kettles have a very high power draw but they generally aren't on for very long, just a few minutes so you can have your tea or instant ramen or whatever.
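
      rough numbers if anyone wants to plug in their own; the 2000 W kettle, 3-minute boil and the 50 W / 250 W PC figures are assumptions, not measurements:

        # burst load (kettle) vs. steady load (PC), daily energy in watt-hours
        kettle_watts, kettle_minutes = 2000, 3      # high draw, short duration
        pc_idle_watts, pc_idle_hours = 50, 5        # low draw, long duration
        pc_game_watts, pc_game_hours = 250, 3

        kettle_wh = kettle_watts * kettle_minutes / 60                          # ~100 Wh per boil
        pc_wh = pc_idle_watts * pc_idle_hours + pc_game_watts * pc_game_hours   # ~1000 Wh per day
        print(f"kettle boil: {kettle_wh:.0f} Wh, PC for the day: {pc_wh:.0f} Wh")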

      • 2 years ago
        Anonymous

        >electric kettles
        my uncle likes to leave them plugged in 24/7; the kettle boils for a few minutes and then a "keep warm" function kicks in. does keeping it warm draw the same amount of power as boiling it?

        • 2 years ago
          Anonymous

          obviously not, or the water would start to boil again

        • 2 years ago
          Anonymous

          >does it draw the same amount of power as boiling it?
          Pretty basic physics: keeping water that's already at like 95°C at that level only has to make up for heat loss, which is nowhere near as energy-intensive as taking water from around 20°C to 100°C very fast. Especially if it's a well-insulated kettle, which it hopefully is if it's designed to keep the water warm for a long time.
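
          a quick sanity check in Python; the 1.5 L fill, 3-minute boil and ~30 W standing loss are assumed numbers, not specs:

            # power needed to boil vs. power needed to keep warm (all inputs assumed)
            mass_kg = 1.5                # 1.5 litres of water is roughly 1.5 kg
            c_water = 4186               # specific heat of water, J/(kg*K)
            delta_t = 100 - 20           # from tap temperature to boiling

            boil_energy_j = mass_kg * c_water * delta_t    # ~502 kJ for one boil
            boil_power_w = boil_energy_j / (3 * 60)        # ~2.8 kW averaged over a 3-minute boil
            keep_warm_w = 30                               # assumed heat loss at ~95°C, well insulated

            print(f"boiling: ~{boil_power_w:.0f} W for 3 min, keep-warm: ~{keep_warm_w} W continuously")

          with those made-up numbers a single boil is ~0.14 kWh, while 30 W of keep-warm left on around the clock is ~0.7 kWh a day, so it draws far less power but still adds up if it's never unplugged.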

        • 2 years ago
          Anonymous

          that's stupid, you can just put it in a vacuum thermos. it would keep the water hot for about 3 days unless you bought a busted one.

          >Your PC probably idles at about 200 W and uses 400 under load

          nvidia doesn't have efficiency in mind; it only has like 3 memory+GPU clock states (low, adaptive, high) and probably an OC state you can set with their tool, so at worst your PC's PSU is near full load just from dragging windows around or scrolling websites in Chrome.
          keep in mind efficiency factors too; one of the biggest culprits is an AVR/UPS, which can have a terrible efficiency of around 0.5, so the real draw might be 150% of that.

          >my UPS says that everything connected to it draws 82 W while idle, and 180+ while playing Elden Ring

          have you checked your UPS's actual efficiency and power factor? it could be another culprit; something like a 0.5 PF can nearly double the draw from the wall. what it shows is only what it delivers to the load, not what it consumes itself.

    • 2 years ago
      Anonymous

      My kettle uses like what, 1600W for a minute, maybe two? Meanwhile my gaming PC uses 90W when it's just idling. So basically the kettle will use about 30 to 60 Wh of electricity for a boil, and having the PC on for an hour, doing nothing, uses 90 Wh of electricity.

  2. 2 years ago
    Anonymous

    >incel processor
    housefire
    >goyvidya gpu
    housefire

    it's entirely your fault

  3. 2 years ago
    Anonymous

    Your PC probably idles at about 200 W and uses 400 under load, with the numbers you gave that's 200*5 + 400*3 = 2.2 kWh. What do you pay per kWh? 2.2 kWh would cost me about £0.50.

    Heating water is usually the main culprit. I once calculated the energy use of everything in my house and the boiler was by far the biggest consumer.
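
    if you want to redo that with your own numbers it's just watts times hours; the 200 W / 400 W split is the guess above and the £0.23/kWh rate is an assumption, swap in your own tariff:

      # daily PC energy and cost from the guessed figures above
      idle_w, idle_h = 200, 5
      load_w, load_h = 400, 3
      price_per_kwh = 0.23          # assumed tariff in GBP per kWh; use your own

      daily_kwh = (idle_w * idle_h + load_w * load_h) / 1000       # 2.2 kWh per day
      print(f"{daily_kwh:.1f} kWh/day, about {daily_kwh * price_per_kwh:.2f} per day, "
            f"{daily_kwh * price_per_kwh * 30:.2f} per month")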

    • 2 years ago
      Anonymous

      how much does a toaster consume? i use normal steel kettles on a gas stove so that's not an issue for me. but if the toaster is a problem, i might start frying bread in a frying pan as well

      • 2 years ago
        Anonymous

        Quite a bit, over a thousand watts, but again it's a matter of high draw for low duration. It only draws that much when you're making toast. Which takes like two or three minutes and you do a few times a day at most.

        If you have AC running at all in the house I'd be shocked if that's not the biggest thing; any other large electric appliances (clothes dryers, refrigerators, etc.) are probably going to be next. Buy a kill-a-watt and you can get some hard numbers on this stuff.

        >my dad would say that when i still lived with him, then i did the math next to him and he never mentioned the subject again
        has the right idea.

        • 2 years ago
          Anonymous

          >you do a few times a day at most.
          that's a shitload
          okay, into the trash the toaster goes

    • 2 years ago
      Anonymous

      >Your PC probably idles at about 200 W
      not him but my UPS says that everything connected to it draws 82 W while idle, and 180+ while playing Elden Ring
      of course, i don't know how accurate it is
      if i plug in my phone for charging, the consumption jumps 20 W

      • 2 years ago
        Anonymous

        >the consumption jumps 20 W
        well, about 20, probably less
        just checked the charger, it says it has a 10W throughput

    • 2 years ago
      Anonymous

      >Your PC probably idles at about 200 W and uses 400 under load
      That sounds like too much unless you have very power hungry components. Especially idle.

      I've got an i5-10600KF, an RTX 3060, 2x8 GB DDR4 and a 600 W power supply (80 PLUS Platinum). I measured about 90 W idle, about 120 W browsing the Internet with YouTube on, 200 W for Rocket League and 300 W for Cyberpunk 2077. I think the highest peak I measured with Cyberpunk was 320 W.

  4. 2 years ago
    Anonymous

    my dad would say that when i still lived with him, then i did the math next to him and he never mentioned the subject again.

  5. 2 years ago
    Anonymous

    A 3rd gen i7 and a 960 consume less power than the average 4K 23984729384-nit TV, so tell your parents to frick off and stop watching bullshit if they want to cut their costs.

  6. 2 years ago
    Anonymous

    offgrid solar gay here. I had to make every watt count in the beginning, so I have a pretty decent understanding now, and yes, like some people have already said ITT, it's never your computer, the lights, the router or your phone charger. It's always always always the AC, water heater, heating, oven, kettles. Anything that involves resistive heating is a fricking power leech. Unless you're mining crypto, I doubt you're drawing over 400 W, if that.

    • 2 years ago
      Anonymous

      Even if you're mining you'd almost have to be moronic for a single computer with a single gpu to draw 400W+

    • 2 years ago
      Anonymous

      >like some people have already said itt, it's never your computer
      I actually cut like 60 to 70 kWh from my monthly usage by replacing my old desktop. I got the new one at the end of March: from January to March I averaged 274 kWh per month, and in April and May, with the new computer, I averaged 208 kWh. The change was noticeable almost immediately in the daily statistics.

      Granted, part of this is that the old computer would crash if I tried to sleep it, so I kept it on 24/7. The new computer can actually sleep, so when I'm in bed I put it to sleep. So it's closer to 17/7.
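
      the duty cycle alone accounts for a lot of that; a rough sketch where the 130 W and 55 W average draws are assumptions picked just to illustrate the 24/7 vs 17/7 difference:

        # monthly energy of an always-on old desktop vs. a newer one that sleeps at night
        old_avg_w, old_hours = 130, 24        # assumed average draw, on 24/7
        new_avg_w, new_hours = 55, 17         # assumed average draw, asleep ~7 h a day

        old_kwh_month = old_avg_w * old_hours * 30 / 1000     # ~94 kWh/month
        new_kwh_month = new_avg_w * new_hours * 30 / 1000     # ~28 kWh/month
        print(f"old: {old_kwh_month:.0f} kWh/mo, new: {new_kwh_month:.0f} kWh/mo, "
              f"saved: {old_kwh_month - new_kwh_month:.0f} kWh/mo")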

      • 2 years ago
        Anonymous

        of course, things add up over time. even 50 kWh a day is a lot if what you're looking at is the power bill. I had to think in terms of "how many kWh can I get away with during the day" and "how many kWh can my batteries handle during the night".
        If you're after a slimmer power bill, you have to skim off every fricking watt you can. Get power strips with on/off switches; you'd be amazed how much energy you can waste just by having something plugged in even when the device is off. Try other sources of energy for cooking and/or heating, like butane or propane; electricity is not very cost-efficient for those things. Make it a religion to never leave a light on in a room you're not in. Things like that add up over time.

        • 2 years ago
          Anonymous

          >Make it a religion to never leave a light on on a room you're not in.
          Lights aren't really that bad these days if you use LED bulbs. What used to take 60 W can now be done with like 8 W.

          Of course, it doesn't make sense to keep lights on that you aren't using, but I don't start panicking if I notice I left the kitchen light on when I went out for a walk.
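
          that 60 W vs 8 W swap in numbers; the 4 hours a day and $0.21/kWh here are assumptions:

            # monthly cost of one bulb, incandescent vs. LED (hours and tariff assumed)
            hours_per_day = 4
            price_per_kwh = 0.21

            def monthly_cost(watts):
                return watts * hours_per_day * 30 / 1000 * price_per_kwh

            print(f"60 W incandescent: ${monthly_cost(60):.2f}/month, 8 W LED: ${monthly_cost(8):.2f}/month")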

  7. 2 years ago
    Anonymous

    >960
    housefire
    120 W lowest average only! not to mention "boost" and "overclock" can push this higher
    >optiplex
    doubt it even has 80 Plus Bronze; it also depends on the power supply's efficiency. a good "business" desktop machine runs at 90-94% efficiency (94 at low load, 90 at max). those are labeled 80 Plus Titanium, which is rare to find on prebuilt business machines but easy to find with Xeon HPs

    a standard-efficiency PSU probably has a PF of 0.7, so if that machine has a 300W PSU it consumes about 300W + 90W wasted as heat, or 390W overall. that 90W is wasted CONSTANTLY, keep that in mind (it may depend on load modes, but since you have a housefire GPU it is probably at 50-100% load all the time).

    don't forget about your UPS/AVR, which also probably has a PF of about 0.5-0.8. 0.8 is really good, but we don't really know if these machines can safely take away the overhead once the power is stable (the 0.5 PF overhead is there to ensure it's feeding a corrected 110-240V when the incoming power is lacking or has poor frequency/current). although that should mean the consumed wattage is the same as intended/designed (since the power is corrected), this is sometimes bugged and your UPS/AVR will instead just draw more power than it should; that's why it gets hot, the excess is converted and wasted away as heat.
    a BAD UPS usually has about a 0.5 power factor (which they try to hide) so it can supply its promised 1000W with its cheap components or bad EE design, and I doubt they have efficiency load modes (20-50-100% loads), so they're just literally wasting away the other 500W for no good reason (or 20-50% of that if they do have load modes).
    they also sap away power while turned off (pic related).

    if you're going to go anal about it, just make sure you have a 94%-efficiency desktop PSU in 80 Plus Titanium, use low-power modes/governors and a "T" series core, and don't use a GPU. or maybe just use a fricking laptop with a 60W adapter that is Level VI efficiency + Energy Star.

    • 2 years ago
      Anonymous

      >94 at low load, 90 at max
      Aren't PSUs most efficient at medium loads, not low loads?

    • 2 years ago
      Anonymous

      >90W is wasted CONSTANTLY keep that in mind (may depend on load modes but since you have housefire gpu it is probably at 50-100% all the time).
      Pic related. Great job writing an essay just to show us you have no idea what you are talking about. A 970 idled at about ~10W.

      >94 at low load, 90 at max
      >Aren't PSUs most efficient at medium loads, not low loads?

      The efficiency curve is more like a plateau now, instead of the PSU only really being efficient at half its rated load. I think it's still least efficient at low loads though, like you said.
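
      for reference, the usual way those efficiency numbers are applied is wall draw = DC load / efficiency, so the waste tracks what the components actually pull, not the PSU's rated wattage; the loads and efficiency points below are made up for illustration:

        # wall-side draw for a given DC load at a given efficiency (illustrative numbers only)
        def wall_draw_w(dc_load_w, efficiency):
            return dc_load_w / efficiency

        for dc_load, eff in [(60, 0.82), (150, 0.90), (300, 0.92)]:   # assumed points on a curve
            wall = wall_draw_w(dc_load, eff)
            print(f"{dc_load:>3} W DC at {eff:.0%} efficiency -> {wall:.0f} W from the wall "
                  f"({wall - dc_load:.0f} W lost as heat)")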

      • 2 years ago
        Anonymous

        >Pic related. Great job writing an essay just to show us you have no idea what you are talking about. A 970 idled at about ~10W.
        >anime pic
        lol. no you.
        where are you getting your ~10W? lol, did you plug your PSU directly into a meter to get that number? kinda sus, but maybe you're right; if your GPU was sitting in the BIOS menu it's probably at that number.
        I'm referring to PSU loads when I said 90W wasted, and also the fact that nvidia only has like 3 clock rates, low/adaptive/high, and maybe OC if you fell for the OC meme; going full OC or locking the rates makes it NEVER want to idle.
        having a shitty ACPI device like an optiplex can accidentally make the PSU run at 100% load, especially if you turned off the PCIe and other power management settings by accident (or they were bugged): the GPU will still get supplied power, while the housefire technology with its 3 clock rates goes to 100% clock rate when simply dragging windows (if the display does not have the sync meme), and optiplexes are less than 80 Plus or barely get 80 Plus certification at all, so yup, it's not very great and the 90W number is very real.
        if your PSU is running at 100% load due to its shitty power management it will waste 90W without a doubt, unless it is indeed 80 Plus Titanium, then maybe 18W, which is a true number calculated from the EFFICIENCY POWER FACTOR, not a "~10W" you made up.
        it would without a doubt waste 30% of the current load if it's not 80 Plus certified (non-vPro optiplexes usually aren't using the best PSU, or chances are it's a refurb and the good Titanium PSU was sent off to recycling)

        • 2 years ago
          Anonymous

          300W - 300W * 0.7 PF (non-80plus PSU) = 90W as wasted heat (100% load). I'm being generous here, as standard-efficiency power supplies go as low as 60% efficiency (0.6 PF) or lower.
          300W - 300W * 0.90 PF (80plus t) = 30W as wasted heat (100% load)

        • 2 years ago
          Anonymous

          >doubling down with another schizo rant
          >anime pic
          anime website.

          • 2 years ago
            Anonymous

            guess I should've added some details: I have a GTX 960 2GB Gamer OC https://www.techpowerup.com/gpu-specs/galax-gtx-960-gamer-oc-mini-black.b5639 and a powerock PSU https://www.gigabyte.com/Power-Supply/GE-N400A-C2#kf that I pulled off another PC.

            >kill a watt
            should probably buy one of these then, in the meantime is hwmonitor reliable with its watt counter?

            >with the numbers you gave that's 200*5 + 400*3 = 2.2 kWh. What do you pay per kWh?

            last i checked the rate is $0.21/kWh, so 2.2 kWh would cost me about $0.46?

  8. 2 years ago
    Anonymous

    I have an ancient server (ProLiant DL370 G6) that's loaded with drives, add-in cards and dual power supplies, and it rarely exceeds 250W even with some low-intensity VMs running on it constantly. Like others said, your PC isn't the big consumer here.

  9. 2 years ago
    Anonymous

    Unless you spend all day mining bitcoin, it's barely consuming anything. It's just the A/C more than likely.

    • 2 years ago
      Anonymous

      Would an A/C running 2 hours every night still use more power than a PC?

      • 2 years ago
        Anonymous

        >A/C running at 2 hours every night
        >in the middle of summer
        Are you in the southern hemisphere or something

      • 2 years ago
        Anonymous

        energy use per hour/day is a huge selling point for window units; they plaster it all over the box. look that up for a comparison

        running central A/C for any appreciable amount of time is definitely going to cost way more than a computer

      • 2 years ago
        Anonymous

        >The average central air conditioner uses between 3000 and 3500 watts per hour during the warm month.
        Your PC would have to average over 250W (i.e. rendering/mining/gaming), running all day and night to use more than your AC.
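
        rough comparison with those figures; the 3 hours of actual compressor runtime per day is an assumption, real duty cycles vary a lot:

          # central AC vs. a PC running around the clock, daily energy
          ac_watts, ac_hours = 3500, 3        # compressor draw while running, assumed ~3 h of runtime a day
          pc_watts, pc_hours = 250, 24        # PC averaging 250 W, never turned off

          ac_kwh_day = ac_watts * ac_hours / 1000     # 10.5 kWh/day
          pc_kwh_day = pc_watts * pc_hours / 1000     # 6.0 kWh/day
          print(f"AC: {ac_kwh_day:.1f} kWh/day, PC: {pc_kwh_day:.1f} kWh/day")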

        • 2 years ago
          Anonymous

          >3000w
          jesas, ill just stick to fans thanks

          • 2 years ago
            Anonymous

            3000W for a single family home in one month is insane. That number has to factor in office buildings or something.
            >ill just stick to fans
            What climate do you live in anon? Summers easily hit 40+ here and are humid as hell. There's no way I'd want to ride that out with just a fricking fan.

          • 2 years ago
            Anonymous

            nta but I'd settle for a window AC easy.

  10. 2 years ago
    Anonymous

    >Everytime the bill comes in the blame falls on me
    move out of your mom's basement then

  11. 2 years ago
    Anonymous

    Buy a wattmeter that plugs into the wall for ~$13, plug your computer's surge protector into it, and then show them your true usage.

  12. 2 years ago
    Anonymous

    The biggest draws are typically water heaters, fridges/freezers, AC/heat, dishwashers, washers and dryers (especially dryers), ovens and microwaves, and appliances like air fryers.
    Lighting is lowest these days if you've switched over to LED. Anything that runs all day, like a fan or ceiling fan, will add up, but not as much as the above.

    The biggest change in electrical usage over time has been the death of CRTs. They drew tons of power, and newer TV and monitor tech has made a huge difference. Plasmas used a lot too, especially when they're on 12 hours a day like in most homes.

    I could only see having to reduce computer usage if you're poor and a $5 difference on your bill matters.

    But a lot of waste happens on things most people don't even think about, like turning a fan on when the temp goes up, or using the oven/microwave a lot.
