*an
I wouldn't even want to play Starfield at 1fps
Pretty sure this factors in the 4090, no?
So the last result is probably 150w for the cpu
Yeah it's from full system, but it's over 170W.
Can go to over 280W in non-vidya.
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
5800X3D KINGS
Best purchase I ever made
3D V-Cache is a performance hack. AMD cannot design actual good cores so they glue on cache out of desperation.
you intjeets make me want to go AMD for my next build
Keep coping intel shill. The results speak for themselves.
>performance bad!
>don't innovate, just keep iterating on your old shit because risk scary
>nooooo you can't use a clever way to get better performance in vydia you are only allowed to pump more watts
>IntelAviv seething
then why don't intel just do it?
they are
What was that?
All I hear are POORgay NOISES
I had to check my settings to see what CPU I have. An i5 11400 apparently. Funny how I still remember my first three computers' specs like my own name
>no one
>literally no one
>not even thanos
>some random Gankeredditor at 12AM on a anime waifu sharing website: WAOOOW COMPUTERS CONSUME ELECTRICITY!!!?!!!!
You're a homosexual bro.
PCs are supposed to give more power with less electricity as tech progresses. AMD gets that. Intel is competing with microwave ovens.
Ideally, yes.
But nobody cares enough so they can get away with sticking to 10nm and blasting the power usage.
It's the same with GPUs, and why they get away with using 350-600W.
AMD cannot do this because they are already at 5nm and would thermal throttle well before pumping more power into it did anything.
People go on about AMD winning etc. Sure, they might be slightly ahead currently, but they need some serious innovation to keep up / stay ahead.
more power yes, less electricity no
>shitty vidya optimisation by the devs
>"This is intels fault"
Not only are you a homosexual, this is also a misleading post. These numbers only apply if you use a 4090, which is like 0.5% of people.
Intel gets that as well, they're not dumb. But what the frick are they supposed to do if they have an inferior product?
>They set a reasonable TDP, they lose at everything, no one buys their CPUs.
or
>They set an unreasonable TDP, they win at SOMETHING here and there, people who don't care about power consumption buy their CPUs.
They can't win. They can't even cut prices because Ryzen is cheaper to produce: AMD would also cut prices and Intel would be even more fricked.
stop trying to appease shareholders, stop releasing worthless generations like this, and commit to R&D instead
Shit like this makes me not want to build a new rig.
You can still go for it anon, just be intelligent with your selections otherwise you'll end up an autistic moronic homosexual like me who bought all the shinies and now undervolts everything so my room isn't an inferno.
>tfw 10700k only draws 95 watts
makes sense since you get shit performance on that
>high power consumption
It's okay when AMD does it.
But amd chips consume 3x less power....
I'm talking about the gpus
Sir this is a cpu thread
But it IS okay when AMD does it right? Besides, why are you obsessed with pajeets homosexual?
I don't buy AMD GPUs because I don't trust the drivers, but they look alright
AMD GPU drivers are unironically better than Nvidia nowadays.
>released a driver last week that got you banned in a bunch of games
>better
>experimental driver has issues
go to bed jensen
bait or moronic ?
Show idle power
ok
Oh is it fixed? Then good for them. I remember months ago people here talking about the driver problem of its idle power consumption. If it's fixed, then so be it.
>I remember some random irrelevant shit from months ago so I will continue to spread disinformation
the struggle of AMD
do you morons realize this industry has zero competition? literally 2 relevant companies in both the CPU and GPU space and you have the audacity to constantly spread moronic shit about the only one that is even close to being pro consumer
>>I remember some random irrelevant shit from months ago
Literally everyone is talking about it. It's more concerning that you didn't know about the xtx's power draw problem. How about you go outside for once?
>zero competition?
Cpu side has competition. Gpu is a mess.
From the way you talk, you sound like a total shill. I shouldn't have bothered replying. Go frick yourself.
It's not fixed when using a 4K display.
God I love my 4080s power draw
werks 4 me
my 6700XT never uses more than 250W at 100%
According to ARK its maximum is 253 Watts sustained. Though they note that that can be exceeded in short bursts.
the i7 is a workstation cpu, only beaten by 1-4 consumer cpus depending on whether the task prefers amd
you don't buy it for gaming, you buy it to do work on your pc
No one cares about 1s in that case either when it's double the power usage
>av1 encoding on a cpu
yes, very common thing to do with a cpu stupid
Full system usage for a cpu test is some insanely moronic shit and just makes slower cpus bottlenecking the 4090 look like they use less power than they do.
I feel so out of date for still using a 9700k, but then not really cause I can still play most games at max with decent frames.
Yeah I'm using a 9700k and a 4080 and I'm reading I could get 30-40% more fps with a new CPU, so Call of Duty goes up to 210 fps? Yippee
I don't care. With all the horror Israel has suffered recently I'm still upgrading to a 14700k. It's the least I could do to show my support. If you're not a Nazi you will do the same.
Trying too hard
Incel bros... I don't feel too good...
should i go for the 7800X3D or wait?
>[1080p]
todd's crack pipe must be hot to the touch
Funny how AMD is now giving Intel a run for their money after suffering for so many years. The performance and efficiency gap just keeps on getting wider and there is no end in sight.
407w
>407w
407w
>407w
407w
>407w
407w
>407w
>Safety limits removed
>buying a K series chip to not OC it to the max
Don't even need K versions anymore.
>caring about how much electricity your pc uses
I know that's not the point of the thread, but I would still be happy with my rig even if it used the same amount of power as an entire household.
4080 with a 5800X3D low power draw with a 1600w overkill platinum power supply master race reporting in
Is there a benefit to such a massive power supply?
Power supplies are most efficient around half their available wattage
isn't it more like 80%?
not that anon but for me it's because it runs silently
Jesus Christ
That's like a small heater, great in the winter I guess.
My 13700k at 5.7Ghz uses 70-120W in games.
Why would I care about 30W extra on the CPU when the GPU pulls 400W?
What the frick is Intel doing? They are supposed to be making smaller more efficient cpus. It's like they have gone back in time to 2010.
Considering they can keep up with AMD on 10nm vs AMD's 5nm, that's pretty good. They still have tonnes of headroom while AMD needs to go even smaller to compete.
>they have tons of headroom
that's literally the thing about smaller nodes though
they don't
the smaller the node, the denser the heat, and the harder it is to cool
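To be fair, the real point here is power density rather than total heat: the same watts in a smaller die are harder to pull out. A toy sketch, with made-up die areas purely for illustration:

```python
# Toy arithmetic: the same package power on a smaller die means higher
# power density, which is what makes the heat harder to remove.
# Die areas are illustrative guesses, not measurements of any real chip.
watts = 150
for node, die_mm2 in [("10nm-class", 200.0), ("5nm-class", 120.0)]:
    print(f"{node}: {watts / die_mm2:.2f} W/mm^2")
# 10nm-class: 0.75 W/mm^2
# 5nm-class: 1.25 W/mm^2
```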
ONE
POINT
TWENTY ONE
JIGGAWATTS?!
>AMD actually winning all the time in the CPU wars
What a time to be alive
>cpu's over 600 watts
>gpu's peaking over 450 watts
>still struggling to play games at 60fps 1080p and instead fake better performance with upscaling to play anything higher resolution
Remember when people were complaining about games being all graphics-sims like a decade ago?
I don't care since I only run my trusty i5 6600k with GTX1080!
>paying double in energy bill to play goyslop instead of indies and emulating good old times
FX-9590 intel edition.
Just set a wattage cap of 150W and retain 95% of uncapped performance
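On Linux you can experiment with a cap like that without touching the BIOS, through the powercap sysfs interface. A minimal sketch, assuming an Intel chip that exposes intel-rapl:0 as the package domain and root privileges; the board or firmware may clamp or reset whatever gets written here:

```python
# Minimal sketch: set the long-term package power limit (PL1) via Linux
# powercap. Assumes intel-rapl:0 is the package domain and root privileges;
# treat it as an experiment, since firmware can override these values.

RAPL = "/sys/class/powercap/intel-rapl:0"

def set_pl1(watts: int) -> None:
    # constraint_0 is normally the long-term limit; the file takes microwatts
    with open(f"{RAPL}/constraint_0_power_limit_uw", "w") as f:
        f.write(str(watts * 1_000_000))

def get_pl1() -> float:
    with open(f"{RAPL}/constraint_0_power_limit_uw") as f:
        return int(f.read()) / 1_000_000

set_pl1(150)  # the 150W cap discussed above
print(f"PL1 is now {get_pl1():.0f}W")
```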
>AMD sponsored badly coded slop game
>Intel is intentionally bricked by code
>AMD totem does better in biased Hardware Unboxed test
>AMD cultist OP: AAAAH *faps to Lisa Su AI porn*
>it's a conspiracy that amd is winning by technical proficiency over intel brute forcing power draw!
>am dead-set on a 14700K and a 4090
>have a 1300w platinum PSU sitting in my newegg cart in anticipation
>see these numbers
...Should I just get something even bigger?
Don't forget extra ventilation for your room. That's a serious heat source anon.
it's not really a hassle to turn my room into an ice box or a sauna on a dime. Though I do wonder if the cooling I've got in mind would be enough
>Fractal Torrent
>Maybe aftermarket noctua case fans
>NH-D15
My gut says "I need water cooling for this mess" but everything I look at says that air cooling and water cooling are practically neck and neck.
>1300w
look at its data sheet and you'll find that it operates at peak efficiency somewhere around 80% of 1300W, which is 1040W. If your average power consumption during gaming isn't near that range then there will be losses due to inefficiency.
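Whether the sweet spot sits nearer 50% or 80% load depends on the unit, so read the actual curve from a review. A rough sketch of what the losses look like, using a hypothetical Platinum-style curve (the efficiency points are placeholders, not data for any real 1300W PSU):

```python
# Rough sketch of PSU losses vs. load with a hypothetical Platinum-style
# efficiency curve. The curve points are placeholders; take the real ones
# from your unit's datasheet or an independent review.

CAPACITY_W = 1300
# (fraction of rated load, efficiency) - illustrative, peaking near mid-load
CURVE = [(0.10, 0.88), (0.20, 0.90), (0.50, 0.92), (0.80, 0.91), (1.00, 0.89)]

def efficiency(dc_load_w: float) -> float:
    """Linearly interpolate efficiency at a given DC load."""
    frac = min(max(dc_load_w / CAPACITY_W, CURVE[0][0]), CURVE[-1][0])
    for (f0, e0), (f1, e1) in zip(CURVE, CURVE[1:]):
        if f0 <= frac <= f1:
            return e0 + (e1 - e0) * (frac - f0) / (f1 - f0)
    return CURVE[-1][1]

for dc_load in (300, 600, 1040):  # light gaming, heavy gaming, 80% of rating
    eff = efficiency(dc_load)
    wall = dc_load / eff
    print(f"{dc_load:4d}W DC -> {wall:5.0f}W at the wall "
          f"({wall - dc_load:3.0f}W lost, {eff:.0%} efficient)")
```

Note how a few percent of efficiency difference amounts to tens of watts of extra heat, not hundreds, so missing the sweet spot is not catastrophic.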
But that's the kicker, isn't it? Will my average power consumption fly all over the place with this thing? The last thing I'd want to do is fork out a bunch of money for a beefy PSU only to find out that it still manages to not be enough.
there are some power consumption calculators online if you aren't able to trust your estimate. take +10% as an error margin if you are really paranoid.
You can set power limits, you can park eCores with a single keyboard hotkey.
I have a 13700k and it draws 60-120W depending on the game. It's overclocked to 5.7Ghz.
I have a 150W power limit so under full load (ie Cinebench or Premiere Pro) it will downclock to 5.3Ghz to keep under the power limit.
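For anyone who wants the eCore-parking trick per-game rather than globally, process affinity approximates it. A sketch using psutil, assuming a 13700K-style layout where logical CPUs 0-15 are the P-core threads and 16-23 the E-cores; that numbering is an assumption, so verify your own topology first:

```python
# Sketch: restrict a game process to P-cores only, approximating "parking"
# the E-cores for that game. Assumes logical CPUs 0-15 are P-core threads
# and 16-23 are E-cores (13700K-style); verify your topology before use.
import psutil

P_CORE_CPUS = list(range(16))  # assumed P-core logical CPUs

def pin_to_pcores(process_name: str) -> None:
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            proc.cpu_affinity(P_CORE_CPUS)  # keep the scheduler off E-cores
            print(f"pinned PID {proc.pid} to {P_CORE_CPUS}")

pin_to_pcores("game.exe")  # hypothetical process name
```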
They literally have the full machine pulling 630w from the wall with a 14900k and 4090 above running Starfield, why would you ever need 1300w?
Imagine paying for a 14900 and only playing at 1080p. Inb4
>but but CPU bottleneck at 1080p
It's still an artificial benchmark. Nobody with that CPU will be running at postage stamp resolutions.
Here's Borderlands at 4K next to an average of a wide variety of games at 1080p. Notice how the difference between the 14900k and the 10900k is the exact same, because what matters isn't the resolution, it's not bottlenecking the GPU. Testing at 1080p just gives you more options for games to use.
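The underlying logic is just a min() of two rates: measured fps is roughly the lower of what the CPU can feed and what the GPU can render in that scenario. A toy sketch with made-up numbers showing why the CPU gap shows up whenever the GPU's cap sits above both CPUs, regardless of resolution:

```python
# Toy model of bottlenecking: measured fps ~= min(CPU feed rate, GPU render
# rate for the scenario). All numbers are made up for illustration.

CPU_FEED = {"10900k": 140, "14900k": 190}  # fps each CPU can prepare
GPU_CAP = {"1080p heavy game": 400, "4k heavy game": 110, "4k light game": 250}

for scenario, gpu_fps in GPU_CAP.items():
    readings = {cpu: min(cpu_fps, gpu_fps) for cpu, cpu_fps in CPU_FEED.items()}
    print(f"{scenario}: {readings}")
# "4k heavy game" reads 110 fps on both CPUs (GPU-bound); in the other two
# the same CPU gap appears, 4k or not.
```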
I have a 13700k and play at 1080p.
24" monitors are perfect. I don't want larger and there's no 1440p or 4K 24" monitors worthwhile.
>You will soon need a non-standard power cable with higher wire gauge and an exclusive circuit and breaker just to play games
What a time to be alive
>paying over a dollar an hour in power costs just to experience Starfield's extremely sub-par graphics
That's just embarrassing.
Intelcucks do be like that
what shithole do you live in that electricity costs that much?
16 cents a kWh is the average in America, moron. But can I really blame someone living in their parent's basement at 34 years old for not knowing how much power costs?
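The arithmetic is trivial to check: even the 630W whole-system wall figure quoted earlier comes out around ten cents an hour at that rate, nowhere near a dollar. A quick sketch using the thread's own numbers:

```python
# Sanity check on the "over a dollar an hour" claim, using numbers from
# this thread: 630W at the wall (14900K + 4090) and 16 cents per kWh.
SYSTEM_WATTS = 630
USD_PER_KWH = 0.16

per_hour = SYSTEM_WATTS / 1000 * USD_PER_KWH
print(f"${per_hour:.3f} per hour")                     # ~$0.101
print(f"${per_hour * 4 * 30:.2f} per month @ 4h/day")  # ~$12.10
```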
Cuck cores and half the threads with more power usage; that's the power of israelites
How do reviewers like Goymers Nexus and Hardware Unboxed hit such high power draw? Even my old 9900k never saw above 150W in P95, yet I recall everyone saying it was a housefire because Hardware Unboxed said it drew 300W.
>Even on my old 9900k
>everyone saying PS5 was a housefire
I don't remember this ever happening. The console has a 300W PSU, it doesn't mean the SoC consumes 300W.
Are you moronic or only pretending?
Motherboard manufacturers just don't give a frick; almost all Intel CPUs just run without any limits on higher-end boards. But even with limits in place, the 14900K is 253W without VRM inefficiency taken into account. If you don't have high-end water cooling you will throttle that CPU in a matter of seconds in P95 or other stress tests.
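If you'd rather measure than argue about spec sheets, the package power is readable on Linux through the powercap interface. A sketch assuming intel-rapl:0 is the package domain; reading energy_uj may require root on recent kernels:

```python
# Sketch: estimate average CPU package power by sampling the RAPL energy
# counter twice around a load. Assumes Linux with intel-rapl:0 as the
# package domain; energy_uj is often root-only on recent kernels.
import time

ENERGY_FILE = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_energy_uj() -> int:
    with open(ENERGY_FILE) as f:
        return int(f.read())

e0, t0 = read_energy_uj(), time.time()
time.sleep(5)                        # run your stress test in this window
e1, t1 = read_energy_uj(), time.time()

# The counter wraps at max_energy_range_uj; ignored here for a short window.
print(f"avg package power: {(e1 - e0) / 1e6 / (t1 - t0):.1f}W")
```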
It's absolute peak power. It's used only to show how much they can peak; that's why the xtx 7900 can peak at fricking 700w for 1ms, but that is enough to explode bad PSUs.
>xtx 7900 can peak at fricking 700w
we really should be using these gpus for better things than AAA slop
There are people who have kept buying Intel ever since Ryzen came out
How?
I wish israelitetel just gave us gayming CPUs without estrogen cores.
This is not sustainable, gamers must be punished for accelerating climate change, or at the very least videogames should be taxed accordingly to reduce the gamer footprint.
Has nothing to do with climate change outside of them sitting in a 90F+ room sweating their balls off as they seethe about AMD being inferior. All while their California AC gets turned off automatically for using too much power.
Lmao amd has been shitting all over intel for like 10 years now
>Zen3 is 10 years old
How the time flies.
Tell that to gays and their private jets first. Then I might consider undervolting my PC.
Don't care, still living at home, still have mommy pay for electricity, stay mad, stay jealous.
Tell your mother you love her.
wtf happened to Intel? why can they only increase performance by massively increasing the power?
>reviewers and motherboard makers let Intel run wild for best performance / sales
>did the same with AMD until the 7800X3D started blowing up
>easy sales for a few years without needing to innovate, just bump up the power consumption (GPU's do this as well)
It's all profit related.
>Suddenly
They haven't changed process node in a while.
15th gen should finally use a smaller one.
>tfw comfy as frick heater for the winter
So are the chiplet Intel chips coming next year?