I wonder if my system is good or bad. My server needs 0.1kWh.
I use Unraid with a 5950X, and it wouldn't stop crashing until I disabled C-states.
So that plus 18 HDDs and 2 SSDs, it sits at 200 watts 24/7.
50W-ish idle? Ryzen 1700, 2 HDDs, and a GTX 750ti. My next upgrade will hopefully cut this in half.
About 700 watts, it makes for a decent space heater in the winter.
I’m right around the same level, and it actually keeps my server room / workshop at comfortable temperature during the winter. I also have my gaming PC mounted in my server rack; when that’s running, there are times where my AC will still kick in even when it’s 40 degrees outside.
80-100 watts at idle which is most of the time. Two OS drives, two fast drives, two spinners, lots of networking and always syncing with the rest of the cluster.
My server with 8 hard drives uses about 60 watts and goes up to around 80 under heavy load. The firewall, switch, access points and modem use another 50-60 watts.
I really need to upgrade my server and firewall to something about 10 years newer; it would reduce my power consumption quite a bit, and I would get a lot more runtime on UPS.
45 to 55 watts.
But I make use of it for backup and firewall. No cloud shit.
Mine runs at about 120 watts per hour.
Please. Watt is an SI unit of power, equivalent to joules per second. Watt-hour is a non-SI unit of energy (1 Wh = 3600 J). Learn the difference and use it correctly.
I’m idling at 120W with eight drives, but I’m currently looking into how to lower it.
9 spinning disks and a couple of SSDs - right around 190 watts, but that also includes my router and 3 PoE WiFi APs. PoE consumption is reported as 20 watts, and the router should use about 10 watts, so I think the server is about 160 watts.
Electricity here is pretty expensive, about $.33 per kWh, so by my math I’m spending $38/month on this stuff. If I didn’t have lots of digital media it’d be worth it to get a VPS probably. $38/month is still cheaper than Netflix, HBO, and all the other junk I’d have to subscribe to.
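For anyone checking that math, here's a quick sketch (assuming a constant ~160 W draw, a 30-day month, and the $0.33/kWh rate mentioned above):

```python
# Estimate the monthly electricity cost of a constant load.
def monthly_cost(watts: float, price_per_kwh: float, days: int = 30) -> float:
    kwh = watts / 1000 * 24 * days  # energy used over the month, in kWh
    return kwh * price_per_kwh

print(round(monthly_cost(160, 0.33), 2))  # ~38.02, matching the $38/month figure
```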
That’s true. And the children in my family see no ads, which is priceless. Still, I’m looking into ways to cut costs in half by adding a lower-powered mini PC that’s always on, with the main computer only running in the evening. Maybe.
Same here. 300 W with 12 disks, switches, and router. But electricity only costs $0.12/kWh here. I wouldn’t trust having terabytes of data in the cloud.
AiBot post. Fuck this shit.
Can you please explain?
80-110W
0.1kWh per hour? Day? Month?
What’s in your system?
Computer with a GPU and 50 TB of drives. I will measure the computer on its own in the next couple of days to see where the power consumption comes from.
You are misunderstanding the confusion, Kw/h is an absolute measurement of an amount of power, not a rate of power usage. It’s like being asked how fast your car can go and answering it can go 500 miles. 500 miles per hour? Per day? Per tank? It doesn’t make sense as an answer.
Does your computer use 100 watt hours per hour? Translating to an average of 100 watts power usage?
You’re adding to the confusion.
kWh (as in kW*h) and not kW/h is for measurement of energy.
Watt is for measurement of power. They said kilowatt-hours per hour, not kilowatts per hour.
kWh/h = kW
The h can be cancelled, resulting in kW. They’re technically right, but kWh/h shouldn’t ever be used haha.
Lol thank you, I knew that; I don’t know why I wrote it that way. In my defense, it was like 4 in the morning.
Yeah, but tbh it’s understandable that OP got confused. I think he just means 100 W.
He might, but he also might mean that he has a power meter displaying kWh since last reset: he plugged it in, checked it again after an arbitrary time once everything was set up, and it was either showing the lowest non-zero value it can display or showing a number accumulated over several hours.
17W for an N100 system with 4 HDDs
That’s pretty low with 4 HDDs. One of my servers uses 30 watts. Half of that is from the 2 HDDs in it.
@meldrik @qaz I’ve got a bunch of older, smaller drives, and as they fail I’m slowly transitioning to much more efficient (and larger) HGST helium drives. I don’t have measurements, but anecdotally a dual-drive USB dock with crappy 1.5A power adapter (so 18W) couldn’t handle spinning up two older drives but could handle two HGST drives.
Which HDDs? That’s really good.
Seagate Ironwolf “ST4000VN006”
I do have some issues with read speeds but that’s probably networking related or due to using RAID5.
Idle: 30 Watts
Starting all docker containers after reboot: 140 Watts
It needs around 28 kWh per month.
kWh is a unit of energy, not power
Wasn’t it stated as the usage during November? 60 kWh for November. Seems logical to me.
Edit: forget it, he’s saying his server needs 0.1 kWh, which is bonkers ofc
Only one person here has posted their usage for November. The OP hasn’t mentioned November or any timeframe.
I was really confused by that, and by the fact that the chosen units weren’t just in W (even 0.1 kW would be pretty weird).
Wh shouldn’t even exist tbh, we should use Joules, less confusing
Watt hours makes sense to me. A watt hour is just a watt draw that runs for an hour, it’s right in the name.
Maybe you’ve just whooooshed me or something, I’ve never looked into Joules or why they’re better/worse.
Joules (J) are the official unit of energy. 1W=1J/s. That means 1Wh=3600J or that 1J is kinda like “1 Watt second”. You’re right that Wh is easier since everything is rated in Watts and it would be insane to measure energy consumption by seconds. Imagine getting your electric bill and it says you’ve used 3,157,200,000J.
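The conversion behind that bill figure, sketched out (1 W = 1 J/s, so 1 Wh = 3600 J):

```python
# 1 Wh = 3600 J, so 1 kWh = 3.6 MJ.
JOULES_PER_WH = 3600

def kwh_to_joules(kwh: float) -> float:
    return kwh * 1000 * JOULES_PER_WH

print(int(kwh_to_joules(877)))  # 877 kWh -> 3,157,200,000 J, the figure above
```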
3,157,200,000J
Or just 3.1572GJ.
Which apparently is how this Canadian natural gas company bills its customers: https://www.fortisbc.com/about-us/facilities-operations-and-energy-information/how-gas-is-measured
Thanks for the explainer, that makes a lot of sense.
At least in the US, the electric company charges in kWh, computer parts are advertised in terms of watts, and batteries tend to be in amp hours, which is easy to convert to watt hours.
Joules just overcomplicates things.
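The amp-hour conversion mentioned above only needs the battery’s nominal voltage (the 100 Ah / 12 V figures here are just an example):

```python
# Watt-hours = amp-hours x nominal voltage.
def ah_to_wh(amp_hours: float, voltage: float) -> float:
    return amp_hours * voltage

print(ah_to_wh(100, 12))  # a 100 Ah 12 V battery stores ~1200 Wh
```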
Wow, the US education system must be improved. 1J is 3600Wh. That’s literally the same thing, but the name is less confusing because people tend to confuse W and Wh
Do you regularly divide/multiply by 3600? That’s not something I typically do in my head, and there’s no reason to do it when everything is denominated in watts. What exactly is the benefit?
Wow, the US education system must be improved.
I pay my electric bill by the kWh too, and I don’t live in the US. When it comes to household and EV energy consumption, kWh is the unit of choice.
1J is 3600Wh.
No, if you’re going to lecture people on this, at least be right about facts. 1W is 1J/s. So multiply by an hour and you get 1Wh = 3600J
That’s literally the same thing,
It’s not literally the same thing. The two units are linearly proportional to each other, but they’re not the same. If they were the same, then this discussion would be rather silly.
but the name is less confusing because people tend to confuse W and Wh
Finally, something I can agree with. But that’s only because physics is so undervalued in most educational systems.
I did a physics degree and am comfortable with Joules, but in the context of electricity bills, kWh makes more sense.
All appliances are advertised in terms of their Watt power draw, so estimating their daily impact on my bill is as simple as multiplying their kW draw by the number of hours in a day I expect to run the thing (multiplied by the cost per kWh by the utility company of course).
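That estimate as a one-liner (the 100 W appliance, 24 hours, and $0.15/kWh rate are example numbers, not from the thread):

```python
# Daily cost = kW draw x hours per day x price per kWh.
def daily_cost(watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    return watts / 1000 * hours_per_day * price_per_kwh

print(round(daily_cost(100, 24, 0.15), 2))  # 0.36, i.e. 36 cents a day
```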