
All these power hungry beasts coming out of Intel and NVIDIA feel quite out of sync with the zeitgeist in a world that's worried about the power bill - especially when the M1/M2 is there to provide contrast. I'm getting Pentium 4 vs Core architecture vibes.


> All these power hungry beasts coming out of Intel and NVIDIA feel quite out of sync with the zeitgeist in a world that's worried about the power bill -

These CPUs aren't consuming 250W all the time. Those are peak numbers.

Both Intel and AMD are providing huge efficiency gains, too. Rumors show the new i7 13700T Raptor Lake part can have a 35W TDP and still outperform a Ryzen 7 5800X: https://www.tomshardware.com/news/intel-13700t-raptor-lake-a...

Speed scales nonlinearly with power. These high TDP parts are halo parts meant for enthusiast builds where it doesn't matter that the machine draws a lot of power for an hour or two of gaming.

It's also trivially easy to turn down the maximum power limit in the BIOS if that's what someone wants. The power consumption isn't a fixed feature of the CPU. It's a performance/power tradeoff that can be adjusted at use time.
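
For instance, on Linux you can read and (as root) lower the package power limit at runtime through the powercap/RAPL sysfs interface, no BIOS trip needed. A minimal sketch, assuming an Intel CPU that exposes /sys/class/powercap/intel-rapl:0 (paths and supported constraints vary by platform):

    # Sketch: read and cap the CPU package power limit via Linux powercap/RAPL.
    # Assumes an Intel CPU exposing /sys/class/powercap/intel-rapl:0; run as root to write.
    RAPL = "/sys/class/powercap/intel-rapl:0"

    def read_uw(name):
        with open(f"{RAPL}/{name}") as f:
            return int(f.read())

    print("current long-term limit:", read_uw("constraint_0_power_limit_uw") / 1e6, "W")

    def set_limit(watts):
        # constraint_0 is the long-term (PL1) limit on most platforms
        with open(f"{RAPL}/constraint_0_power_limit_uw", "w") as f:
            f.write(str(int(watts * 1e6)))

    # set_limit(65)  # e.g. cap the package at 65 W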


Just adding to what you said, a 24-core CPU won't get anywhere near peak power usage during gaming. Most games only use a handful of cores. The only way you'll approach it is with parallelizable productivity work like video encoding or compiling code.


My nephew, B, got his 16+8 i9 to peak at 250W during Path of Exile, using all 24 cores. He is running it at 5.2 GHz on air cooling. We are not sure at all how it uses the e- (efficiency) cores when it has 16 p-cores with hyperthreading, but it all showed up in the new dark-mode Task Manager.


PoE is one of the few games that actually makes use of lots and lots of cores/threads.


Any idea what for? I feel like PoE doesn't involve that much compute other than what would be offloaded to the GPU. Maps are static, and I would have assumed that mobs are primarily computed server-side based on some sort of loosely synchronized state.

I guess I could imagine a few threads for managing different 'panes', a thread for chat, a thread for audio maybe? It's hard to think of 24 independent units of work.

I'm not a game dev, just used to play PoE and curious.


The trick used in AAA engines is to treat each frame as an aggregation of independent jobs that can be queued onto worker threads, and then to buffer several frames ahead. So you aren't working on just "frame A"; you're also finishing "frame B" and "frame C", and issuing the finished frames at the desired pace, which effectively gives you more time for single-threaded tasks.

The trade-off is that a few frames of latency are now baked in by default, but if it means your game went from 30 Hz to 60 Hz with a frame of delay, it is about as responsive as it was before, yet feels smoother.
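
As a toy illustration of that structure (nothing from any real engine), here's a sketch of a pipelined frame loop: each frame fans out into independent jobs on a worker pool, and the oldest buffered frame is "presented" once the pipeline is full:

    # Toy sketch of a pipelined, job-based frame loop (illustration only, not real engine code).
    from concurrent.futures import ThreadPoolExecutor

    PIPELINE_DEPTH = 3        # frames in flight -> roughly that many frames of added latency
    JOBS_PER_FRAME = 24       # physics, AI, particles, audio mixing, culling, ...

    def run_job(frame, job):
        return (frame, job)   # stand-in for real per-frame work

    with ThreadPoolExecutor(max_workers=8) as pool:
        in_flight = []
        for frame in range(120):
            # fan the new frame out into independent jobs any worker can pick up
            in_flight.append([pool.submit(run_job, frame, j) for j in range(JOBS_PER_FRAME)])
            if len(in_flight) >= PIPELINE_DEPTH:
                oldest = in_flight.pop(0)
                done = [f.result() for f in oldest]   # "present" the oldest completed frame
        for leftover in in_flight:                    # drain the tail of the pipeline
            [f.result() for f in leftover]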


Sure that explains the parallelization, but not why it takes 250 watts worth of compute to run the game. What's it computing?


The next frame.


if it's anything like gta5 it's going to be calling strlen a billionty times


Can you provide some more info about this?



Could it be the GPU driver/framework? I thought DX12 and Vulkan were meant to be CPU-optimised and able to use heaps of cores.


I guess, but like... how? Like I said, I can't really think of 24 things to do lol. I'm reminded of Dolphin, the GC/Wii emulator - people would ask for more cores to be used and they'd basically be like "for what???", they started moving stuff like audio out, eventually they made some breakthroughs where they could split more things out.

Maybe with these frameworks threads are less dedicated and instead are more cooperative, idk. Really not my area!


I'm reminded of this PoE build that can crash the server with too many spell effects: https://m.youtube.com/watch?v=MWyV0kIp5n4


Or simply put, there's too much going on. I remember they had to rewrite some parts of the engine ASAP right after the release of Blight, due to FPS dropping to 1/inf in the end-endgame versions of the encounter, as well as server crashes.


Sort of a funny story: the concept of this build (spell loop) is currently meta; sadly the servers have improved to the point that they don't crash anymore.


Maybe all it does is produce crazy high, pointless FPS.


I've seen the NVIDIA driver eat up all the CPU on multiple cores without really doing anything substantial to the framerate.

This was back in the Windows XP days when I was working on OpenGL and DirectX. It would do this while rendering like a couple of triangles. One core I could understand, but not all. I'm pretty sure the driver had some spinlocks in there.

I also managed to find out the NVIDIA driver assumed user buffers passed to OpenGL (VBOs) would be 16-byte aligned, using aligned SIMD operations on them directly, even though there's no mention of alignment in the OpenGL spec.

It just so happened that Microsoft's C++ runtime did 16-byte-aligned allocations, while the language I was using only guaranteed 4-byte alignment.
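
For what it's worth, that kind of mismatch is easy to detect or work around on the application side. A hypothetical sketch, unrelated to that old driver specifically: over-allocate and offset so the data handed to the API starts on a 16-byte boundary:

    # Sketch: verify and force 16-byte alignment of a raw buffer before handing it off.
    # Hypothetical illustration; allocator guarantees differ per runtime and language.
    import ctypes

    def aligned_buffer(nbytes, alignment=16):
        raw = (ctypes.c_char * (nbytes + alignment))()      # over-allocate
        addr = ctypes.addressof(raw)
        offset = (-addr) % alignment                        # bytes to the next boundary
        aligned_addr = addr + offset
        assert aligned_addr % alignment == 0
        return raw, aligned_addr                            # keep 'raw' alive while in use

    raw, addr = aligned_buffer(1024)
    print(hex(addr), addr % 16)   # -> 0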

All is fair in love and performance wars I suppose...


What’s a new dark mode task manager?


The latest Windows 11 preview finally has Task Manager read the system default theme, allowing "dark mode": rendering the UI with a dark background and light foreground.


So like in win 95 when you use a "dark theme". What an achievement. Wait, you can also set background colour. /s


What a time to be alive!!


I think you'll find that modern games use many more cores than they used to since mainstream consoles have all moved to being octa-core for the last two generations and you have things like Vulkan better allowing multi-threaded graphics code.


Many more cores yes, but 100% CPU usage should still be rare. If your game uses 100% of a 24C/32T processor, it will run poorly on a "mere" 8-core CPU, and most of your target audience won't be able to play it. You're right though, these aren't your grandma's single-threaded games anymore.


I don't really share this perspective.

CPUs and GPUs keep getting hungrier, and that is just not where we should be heading. I wish the performance increases didn't keep coming along with consumption increases each generation.


You can clock down a 7950X to 105W and it will still be 37% faster than a 5950X.


I hardly care, I don't want that heat in my room anyway.


> Both Intel and AMD are providing huge efficiency gains, too. Rumors show the new i7 13700T Raptor Lake part can have a 35W TDP and still outperform a Ryzen 7 5800X: https://www.tomshardware.com/news/intel-13700t-raptor-lake-a...

Don't let the TDP of T-models fool you. Power consumption to reach boost clocks can peak at up to 100W for T-models of the previous generation, and the 13700T probably needs to run close to that to outperform a 5800X.


> for an hour or two of gaming

U gotta pump those numbers up, those are rookie numbers.


> These CPUs aren't consuming 250W all the time. Those are peak numbers.

But they require heatsinks and cooling designed for that peak. And it is insane. Try to keep a microwave oven under 100°C :-)


Your toaster uses more than 250W; microwave ovens are far above that, at 1-2 kW.


It's still pretty terrible from a performance-per-watt viewpoint. I can undervolt my 3070 Ti by ~100mV, dropping performance by ~6% but dropping temperature by about 10°C, or 13%, and dropping fan speed from PS3-levels to inaudible for anything under 90% load.
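
A related knob, if you'd rather not fiddle with voltage curves: nvidia-smi exposes a software board power limit. A rough sketch, assuming the standard nvidia-smi query flags and a card/driver that allows changing the limit (needs root, and supported ranges vary by card):

    # Sketch: query and cap an NVIDIA GPU's board power limit via nvidia-smi.
    # Not undervolting per se, but a simple way to trade a few % of performance for heat and noise.
    import subprocess

    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw,power.limit",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True).stdout
    print("draw_W, limit_W:", out.strip())

    # subprocess.run(["nvidia-smi", "-pl", "220"], check=True)  # set the limit to 220 W (needs root)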


If you consider the mainstream products of Intel and Nvidia, they have way more moderate power consumption. These products with massive power draw are ultra-enthusiast products. They are an outlier. You could build a great PC now with an RTX 3060 and a mainstream CPU that would be fine with a ~500 watt PSU.

As technologists, we should support manufacturers pushing the limit in power and performance. It helps drive overall efficiency and move technology forward.


Power consumption has at least doubled even on mainstream parts and keeps increasing gen over gen.

1070 vs 3070 is +52% average (145 vs 220W) and +66% (154 vs 250W) sustained.

2070 vs 3070 is +10% (195W) or +24% (203W) sustained.

Even the 3060 you defend draws as much power as older flagships, and 500W isn't enough even for mainstream gaming.

And it keeps getting worse on both the GPU and CPU side.

We aren't technologists but consumers, and the reality is that x86 and GPUs are near-duopolies, so the three companies involved have little reason to do a better job. It's clear that Apple SoCs, and more and more of the cloud moving to ARM, have not been enough of a wake-up call.


And yet energy efficiency continues to improve: https://tpucdn.com/review/nvidia-geforce-rtx-3080-ti-founder...

In fact, the 3070 is significantly more power-efficient than the 1070: while, yes, power is up 50%, performance is up 100%. So performance per watt has continued to improve even as power consumption has also increased.

The reality of power consumption is that it's the main lever to pull right now to deliver generational gains. It's the same lever Apple pulled for the M2 even.
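
Plugging in the rough figures quoted upthread (assumed, not measured): power up ~50% with performance up ~100% still works out to roughly a third better performance per watt:

    # Rough perf-per-watt arithmetic using the figures quoted above (assumed, not measured).
    p1070, perf1070 = 145.0, 1.0       # watts, relative performance
    p3070, perf3070 = 220.0, 2.0       # ~+52% power, ~2x performance

    ppw_gain = (perf3070 / p3070) / (perf1070 / p1070) - 1
    print(f"perf/W improvement: {ppw_gain:.0%}")   # ~ +32%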

> more and more cloud moving into arm have not been enough of wake up calls.

You mean the ARM enterprise SoCs that use just as much power as x86 does to deliver on average worse performance?


Yep, I'm using a 500W PSU to power my gaming PC with an RTX 3060 Ti and a 12700K CPU.


Where is the AMD/Nvidia/Intel product that offers comparable performance at a power draw that is anywhere near the M1?


I believe AMD Ryzen 6000 mobile CPUs can hold their own against the Apple M1. They have comparable performance and can be set by the manufacturer to a TDP comparable to the M1's (and still perform well). Except for M1-optimized apps, GPU performance should be pretty comparable too; Ryzen integrated graphics perform better in gaming.


Power consumption is an issue, but worrying about the CPUs of a gaming machine is like worrying about straws in the context of pollution.

Electricity needs of an average household in the western world are going to increase a lot in the coming decades, with the transition to more electric heating, cooking, cars, etc. Gaming machine power usage is minuscule compared to those.


Energy needs increasing isn't a good reason to be okay with them increasing.

While most household electronics keep pushing to consume less, computers go in the opposite direction.

This also adds tons of heat to my laptop and to the room.

Working or gaming in a small room during hot days is painful.

Even consoles making turbojet noises are considered normal nowadays. It's a disaster.


The major difference between a CPU and, say, an oven, is that the former runs 24/7, whereas the latter runs for a short period of time.

Back of the envelope calculation here:

Assuming an average oven consumes 2kWh, and a CPU 0.1kWh:

Oven for four hours (average weekly usage) would be 2 * 4 = 8kWh weekly.

CPU for 24 hours, 7 days a week = 0.1 * 24 * 7 = 16.8kWh weekly.


The flaws in your calculations become apparent when you realize that modern CPUs clock down when they are not busy. You would think that would be common knowledge on a site called Hacker News.
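
Redoing the back-of-the-envelope with an idle/load split makes the point. With hypothetical figures (30W idle most of the day, 150W for a few busy hours), the weekly total lands at about half the 16.8 kWh claimed above:

    # Back-of-the-envelope with a duty cycle (hypothetical figures, not measurements).
    idle_w, load_w = 30, 150          # typical desktop idle vs. sustained load, in watts
    load_hours_per_day = 4
    idle_hours_per_day = 24 - load_hours_per_day

    weekly_kwh = 7 * (idle_w * idle_hours_per_day + load_w * load_hours_per_day) / 1000
    print(weekly_kwh)                 # ~ 8.4 kWh/week, not 16.8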


Interesting choice of units, kWh per hour?


I would say that people who regularly invest in top-end hardware don't care as much about power bills. Otherwise, power-efficient chips are the norm (laptops, phones, etc.).


This is slowly changing IMO - I'm seeing concern over energy use even on forums discussing high-end hardware builds as the cost of energy mounts in Europe. Previously no one really mentioned this other than to laugh at poor thermals.

If some of Nvidia's next-generation 4xxx-series GPUs are close to 1000W draw as many rumors suggest, a high-spec Intel/Nvidia system is probably going to have running costs similar to an electric space heater when playing demanding games. The existing 3090 Ti is already a 500W part, which not so long ago was enough to power a whole system in addition to the GPU.


The power cost of running my enthusiast build is on the order of a few dollars a month.

Now I am all for being green but there are things in my household that are much more of a concern than this.

Huge datacenters full of these chips is one thing. A personal computer for hacking & gaming probably not such a big deal.


Yeah, that's not true, at least here in the UK it isn't. My normal build will use 500W when gaming, so every couple of hours that's £0.40. Every 10 hours is £4. That's just a few days of gaming for me, not including all the other computer use; it definitely adds up over a month, especially since my bills used to be £100/month and now they are £300 a month.


Using your own numbers, if you're spending 200/month more on gaming then that's 500 hours/month or approximately every waking hour. Are you really gaming that much? And even if you are, that's a lot cheaper than practically any other hobby you could spend that much time on.


I never said I spend £200 on gaming? I just said that my bills have increased from 100 a month to 300, but that's due to rising energy costs in the UK, not my gaming habits. It's more that in addition to my bill literally tripling, the costs of gaming aren't insignificant for me. It doesn't matter that it's still cheap for the type of entertainment - it adds up. Every pound spent this way is not a pound spent on something else.


> It's more that in addition to my bill literally tripling, the costs of gaming aren't insignificant for me. It doesn't matter that it's still cheap for the type of entertainment - it adds up. Every pound spent this way is not a pound spent on something else.

Sounds like a route to being penny-wise and pound-foolish. If you cut cheap entertainment you may well end up spending more (because in practice it's very hard to just sit in a room doing nothing), and if your gaming costs are a lot smaller than your energy bills then the cost should be relatively insignificant, almost by definition.


Well the discussion started with saying "why would enthusiasts care about their energy consumption" - so my point isn't that I'm going to cut out entertainment altogether, but for my next GPU I will definitely look at the energy consumption and there is zero chance I'm buying a 500W monster, even if I can afford the energy for it. It's just stupid and wasteful. I might go with a xx60 series instead, just because the TDP will be more reasonable. Or alternatively I might play the same games on my Xbox Series S which provides me with the same entertainment yet uses 1/6th of the energy of my PC when gaming.


Also, considering that the power used ends up as heat, it is not such a waste if you already have inefficient electric heating.


Conversely, it makes summer much worse.


Perhaps. You can easily equalize indoor temperature with outdoors, so if you're not cooling, it makes no difference.

Sucks if you run an AC though :)


I don't use AC; I think it's terrible to waste energy like that. I can understand an office or hospital, but a home?

I'm from southern Italy. It's hot, you sweat, but you don't need to burn gas and coal or build infrastructure so people can waste energy cooling their rooms. It's so entitled, and no wonder we're in a full climate crisis.

People can't give up anything, really. It gets worse every day, and people's remedy is to make it even worse. Nonsense.

So yes, a hot PC in the room makes things much worse.


24h x 30 days x .5kW x 0.25 CHF/kWh gives me 90 CHF a month to run my PC, assuming it never sleeps.

Have you run the calculation? It's worthwhile configuring suspend for PCs these days. My 3090 never seems to go below 120W, for one thing.


> 24h x 30 days x .5kW

No modern PC should be pulling 500W all the time.

Idle power can be as low as 20-30W depending on the build.

You should also allow it to sleep, of course.

> My 3090 never seems to go below 120W, for one thing.

Something is wrong. A 3090 should only pull about 20 Watts at idle: https://www.servethehome.com/nvidia-geforce-rtx-3090-review-... . You might have some process forcing it into full-speed 3D mode for some reason.


Windows indexing service.


> 500W

500W is a very high average power consumption. And my electricity is 0.13 USD/kWh, which is about half of 0.25 CHF/kWh.

True average power is probably below 100W, for a total cost in the realm of 10 CHF or USD per month.


In the USA, for a significant part of the year, chances are you have to add the electricity cost of running your air conditioning to get rid of that heat.

If you're living in a colder state, you may get to subtract the savings from lower heating costs.


Sure, although it's not a ton of heat either way and doesn't make a large impact on the net cost.


The lowest contract you can get in Italy is 40 cents.

Also, consuming more energy is bad, and this rush to excuse the lack of innovation in GPUs and CPUs we've seen in the last decade is ridiculous.

Where does it end? Am I okay with a 5000cc truck because airplanes and cruise ships are much worse?


The last 2-3 generations of CPUs and GPUs have seen the most innovation, efficiency, and performance gains in a long time.

If you don't want to drive a 5L truck, don't drive one, but 500W is not an average load, and if you have high electricity costs, that's a you problem, not an everyone-else problem.

The largest parts of my electrical bill are distribution and overhead charges that don't change whether I use power or not. The marginal utility of the power vs the cost is quite reasonable.


This is nonsense. The last 2-3 generations of GPUs have seen nothing of the sort in terms of efficiency, the same is also true of many Intel desktop CPUs. The latest Alder Lake desktop parts from Intel have been universally criticized for power draw too.

Each of the last 4 generations of mid- to high-end cards from Nvidia has required more electricity than the last. By Nvidia's own admission, their future chips will get larger and hotter, to some extent due to the slowdown in Moore's law and future process nodes being harder to reach. The die size of the parts has also grown, which does not help.

It's not a small trajectory either; 4 years ago the most power-hungry consumer part from Nvidia was a 1080 Ti, which would easily draw 250W under load. Today that number is 500W for the current 3090 Ti, and rumored to be 800-1000W for the 4xxx parts launching at the end of this year. A GPU will often sit at peak draw for hours at a time in games, as they try to push as many FPS as possible.

A current gaming machine with recent components can easily exceed 500W constant load during gameplay, and this figure will rise again with the 4xxx parts.

The ONLY real exception to this is Apple devices, and even then it's not clear we can compare an M1/M2 GPU to the fastest parts Nvidia offers.


You and I are seeing very different news if you don't see midrange GPUs consuming 250W under load and CPUs getting over 100W.

The architectural gains are nearly nonexistent; performance is largely driven by bigger and bigger chips on smaller nodes.

There has been no innovation in the GPU and x86 CPU space for a long time; that happens only on ARM nowadays.


>5000cc truck because airplanes and cruise ships are much worse

Oh boy, you should visit the USA sometime. A 5-liter truck is the small one.


The fuck are those calculations? Are you trying to mislead people on purpose?

Who is running their computer at 500W 24/7?


500W 24/7 consumption? What do you do? Train ML non-stop?

Your example is in no way representative of reality.


I use suspend on my PC and I definitely do not run it 24h, or anywhere close to that. Also power is $0.08/kWh where I live.


That's insanely cheap power, use it while you can. I'm paying £0.40/kWh, so about 46 cents per kWh.


Really?

My power is some of the cheapest in the country and we pay ~13 cents/kWh. It's a little misleading though, since my bill breaks out generation and distribution costs into separate line items. They are both billed per kWh though, and add up to 13 cents.


Yes. We have a fixed basic connection charge of $20. So it's really close to your $0.13/kWh when that is taken into account.

I wasn't trying to be misleading though because the point is the basic charge does not increase with usage. So for each additional kWh we add to that it's only $0.089/kWh.


No, that's fair. I have an additional basic charge too. Congrats on the cheap power.


And double that number if you're in the UK.


It's worse than that. Gamers Nexus had a video a few months ago about power transients becoming a bigger problem. Power spikes can double the amount of power needed. It doesn't really impact average power usage, but it can cause a PSU's over-current protection to shut down the machine. https://www.youtube.com/watch?v=wnRyyCsuHFQ


> If some of Nvidia's next generation 4xxx series GPUs are close to 1000w draw as many rumors suggest

Those rumors are for millisecond-long transient spikes, not an average of anything. So basically the rumor is a 500W peak load, just like how the current 350-400W GPUs have transient spikes to upwards of 800W. It's not a problem in terms of power consumption (although obviously the increase from 400W to 500W would be); rather it's an issue with over-current protection in power supplies tripping. It's a "need bigger capacitors" type of problem.


Yeah, exactly this. I have a 3080 with a 5900X, would consider myself an enthusiast, and after the recent price hike to my tariff here in the UK, electricity usage is definitely something that's on my mind. Like, it hasn't stopped me gaming yet, but I'm very acutely aware that I'm using £1 worth of electricity every few hours of play - it adds up.


> £1 worth of electricity every few hours of play

I hope you make a lot more per hour of work. Stop worrying about that.


I mean, thank you for the thoughtful advice about my finances, but it doesn't help in the slightest. Life is getting a LOT more expensive lately, with everything going up in price - I'm seeing my grocery bills double, energy bills triple, spending lots more money on petrol, on eating out, on taking my family out for trips, and yes - on gaming too. Is that £1 every few hours making me destitute? No, absolutely not, and I'm extremely privileged to be able to afford it. But at the same time, every £1 taken for this isn't a pound saved, or spent on my kid, or on literally anything else more productive.

So yes, I can "easily" afford it, but it doesn't mean that the energy consumption of my gaming rig hasn't affected how I think about it. Any future hardware upgrades will also be impacted by this - there is no way I'm buying a GPU with 500W TDP, even if again, I can afford the energy bills.


Whether he does or not is none of your business, and it doesn't change the fact that those are high prices and a source of environmental issues.

This power draw is getting out of hand on desktop, consoles and x86 laptops and is largely a symptom of lack of competition and lack of technological advances.


> those are high prices

By any reasonable measure they're not. £1 for "a few hours" of fun is a very cheap hobby.


And probably the heat output of a space heater as well. I had to move my tower into another room because it kept the whole room way too hot.


The pilot light on my furnace went out years ago. I only noticed because when I opened the door to the room with my computer a light but noticeable heat blast hit me. It took a second, but I turned around and checked my furnace, etc instead of going in the room. It really was a revelation about how much heat those things produce.


> that people who regularly invest in top end hardware don't care as much about power bills

It adds up, especially in data centers where you end up needing even more megawatts of power and cooling capacity.


> don't care as much about power bills

Not yet!


I wonder what % of the overall power bill a PC actually consumes. My gut would say it doesn't compare, really, to the water heater or air conditioner, but it would be good to see numbers.


I would love some PSU metering ability, to see actual data about how much juice my PC is pulling down. Other than getting a Kill A Watt meter, how could one go about this?


They make "digital PSUs" like the Corsair AXi series that can talk to your PC over a comm port.


Some UPSes can show how much power is being drawn by everything connected to it, in this case presumably your computer.

You probably want a UPS anyway if you've got a power guzzling (and thus presumably expensive) machine.
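
If the UPS is supported by Network UPS Tools, the readout can usually be scripted too. A sketch assuming a working NUT setup with the UPS registered as "myups" (which variables are exposed, e.g. ups.load or ups.realpower.nominal, differs per model):

    # Sketch: pull load/power figures from a UPS via NUT's upsc.
    # Assumes NUT is configured and the UPS is registered as "myups"; exposed variables vary by model.
    import subprocess

    out = subprocess.run(["upsc", "myups"], capture_output=True, text=True, check=True).stdout
    vals = dict(line.split(": ", 1) for line in out.splitlines() if ": " in line)

    load_pct = float(vals.get("ups.load", "nan"))            # % of the UPS's rating
    nominal_w = float(vals.get("ups.realpower.nominal", "nan"))
    print("approx draw:", load_pct / 100 * nominal_w, "W")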


Many server PSUs and motherboards have a SMBus or similar interface for monitoring. Quite rare on consumer parts, sadly.


Knockoff meters are like $10 on Amazon. Not a bad investment.


I have what is probably a close spiritual cousin of one of those, and while it even touts a power factor display, it also loves to show ridiculously high values during idle consumption for anything involving some sort of power electronics (not just for my computer, but for example for my washing machine, too – it shows sensible values while the heating element runs, or when the motor actually turns, but in-between it shows nonsensically high values).

It is a few years old, though, so maybe by now quality standards have improved even for those kinds of cheapo-meters…


Calculate it. You'll know your per-kWh cost for electricity, which depends on where you are, and you can ballpark power based on the fraction of the time the machine is running and the components in it.

My standard dev machine is ~1kW flat out, ~500W most of the time, probably 100W idle. It runs for about eight hours a day. Say 500W is the average; that suggests 4 kWh a day, which is about $2 a day in the UK.

(those power numbers are relatively high - it's an elderly threadripper with two GPUs)
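
Same ballpark as a tiny reusable helper; every input here is an assumption to swap for your own numbers:

    # Ballpark running-cost calculator; all inputs are assumptions to replace with your own.
    def monthly_cost(avg_watts, hours_per_day, price_per_kwh, days=30):
        return avg_watts / 1000 * hours_per_day * days * price_per_kwh

    print(monthly_cost(500, 8, 0.40))   # ~ 48 GBP/month for the workstation above at UK prices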


I think that 500W average is the tricky bit. When web browsing for example, my laptop (linux+intel) seems to spend 99% of the time in the C1 halt state, according to i7z.


I can say that when my son left for college last fall, our electric bill dropped about $30 a month compared to the months he was here (after adjusting for seasonal heating/cooling costs). He has an i9-12xxx gaming rig with two monitors. A Prusa 3D printer that gets a lot of use and a few other gadgets and such.


They have been in development for years, not just the last 6 months.


Arguably it was the same thirst for electricity that was the killing stroke for most of the POWER architecture. That, and IBM contract fees.


NetBurst was the first thing that came to mind.


I don't know, we found Helium-3 on the moon this week, so I think it might be fine.


In Intel's case, they need to push these insane TDPs in order to even dream of performance parity with AMD and Apple. All those years spinning their wheels on 14nm+++++++++++ are biting them in the ass.



