3900x available on Newegg

jhego

[H]ard|Gawd
Joined: Jul 23, 2004
Messages: 1,448
I was able to snatch one up. Looks like there are still some available.
 
Go to your local Microcenter. Both of the locations in my state have it in stock.
 
Yeah, my MC has several in stock as well.

That said, got mine through Best Buy for $449.99 with an automatic 10% coupon for people with rewards profiles.
 
Yeah, my MC has several in stock as well.

That said, got mine through Best Buy for $449.99 with an automatic 10% coupon for people with rewards profiles.


Yeah, Best Buy, Amazon, and Newegg will be perpetually sold out for the first three months. This is what happened with the 9900K as well.

They will be sold out because those are the mainstream destinations for tech. You either jump quickly while they're in stock at the big 3, or you make alternate arrangements.
 
Amazon had a pre-order earlier today for $490 (sold by Amazon) and said ships by Sept 1st. Gone now though, so searching for 3900x returns garbage per usual with their crappy search. I did not partake.
 
still waiting to see if threadripper is dogshit in games again.

 
how exactly is threadripper dog shit? i have a threadripper it's great, how dare you, TRIGGERED

Dog shit is an exaggeration, but the reality is that if gaming is the primary use of a PC, then Threadripper is a poor platform choice. It is a good choice for a productivity rig where gaming is a secondary consideration. The high-end Ryzen chips deliver more FPS than their bigger TR siblings and do it for much less money.
 
Threadripper 3000-series will be a better all-around performer thanks to the lack of NUMA, and I imagine the core turbos will be more aggressive as well.

The first generation was quite messy, and could vary massively from one load to the next.

They could pull the same trick with a jump to 64 cores, but probably not on release day. Maybe a 48-core at release, like they did with the 3900X, and a 64-core later?
 
Microcenter Kansas City has several in stock right now. Asked the guys; was told the supply issue was solved and they expect to get more in weekly now.

A 9900K @ 5 GHz is better for gaming. But a lot of guys are apparently no longer gamers and are focused entirely on productivity apps and benchmarks.
 
Microcenter Kansas City has several in stock right now. Asked the guys; was told the supply issue was solved and they expect to get more in weekly now.

A 9900K @ 5 GHz is better for gaming. But a lot of guys are apparently no longer gamers and are focused entirely on productivity apps and benchmarks.
Yeah, it's amazing how many gamers turned into one-man-band game devs or pro video editors suddenly...
 
Ever since I moved my rig upstairs, the race for the absolute fastest is no longer my main goal. I run a 3600X undervolted and an RTX 2080 so my room doesn't heat up to 80 degrees while gaming. If I were to run a 9900K @ 5 GHz and a 2080 Ti, forget it, it gets too uncomfortable here.

AMD CPUs now run efficient as hell and the price is unbeatable. Paired with a decent Nvidia GPU, you've got a nice, efficient rig. The 3900X... no thanks... maybe for winter lol.
 
Ever since I moved my rig upstairs, the race for the absolute fastest is no longer my main goal. I run a 3600X undervolted and an RTX 2080 so my room doesn't heat up to 80 degrees while gaming. If I were to run a 9900K @ 5 GHz and a 2080 Ti, forget it, it gets too uncomfortable here.

AMD CPUs now run efficient as hell and the price is unbeatable. Paired with a decent Nvidia GPU, you've got a nice, efficient rig. The 3900X... no thanks... maybe for winter lol.

Yeah, I barely have to use a fan on me since I dropped my 6700k for a 3700x.
 
The internal case temp changed. I mean, my 1080 Ti dropped 6 degrees, so there is a noticeable difference. Go ahead and keep making an ass out of yourself.
And go ahead and keep being overly dramatic about a 30-40 watt difference at best.
 
And go ahead and keep being overly dramatic about a 30-40 watt difference at best.

I'm talking about real world temp differences and you just can't comprehend that. I am sure somewhere in that tiny brain there is a place that knows 42° F is a noticeable change.
 
I'm talking about real world temp differences and you just can't comprehend that. I am sure somewhere in that tiny brain there is a place that knows 42° F is a noticeable change.
And like many people on here, you are making ridiculous exaggerations when it comes to heat. Saying "I barely have to use a fan on me since I dropped my 6700k for a 3700x" is just laughably fucking stupid. Again, you are talking about a 30-40 watt difference at best, so that is not going to make that big of a goddamn temperature difference in your room. Hell, I just turned on my room light, which has two 60-watt bulbs, so by your logic I'd better grab several fans and crank up the AC.
 
And like many people on here, you are making ridiculous exaggerations when it comes to heat. Saying "I barely have to use a fan on me since I dropped my 6700k for a 3700x" is just laughably fucking stupid. Again, you are talking about a 30-40 watt difference at best, so that is not going to make that big of a goddamn temperature difference in your room. Hell, I just turned on my room light, which has two 60-watt bulbs, so by your logic I'd better grab several fans and crank up the AC.

You are nothing but a troll. Sorry you don't understand thermal energy.
 
Yeah, it's amazing how many gamers turned into one-man-band game devs or pro video editors suddenly...
Yeah, it's amazing how many gamers forgo almost imperceptible FPS differences in most games to choose a CPU that does more than just 240p 1000 Hz gaming, with better value and higher efficiency, while also not continuing to support the assholes that sold everyone quad cores for half an eon.
 
Lol, this coming from the clown that thinks his room is so much cooler because he has a CPU that uses 30 fewer watts. :rolleyes:
The double standards here on power consumption are quite funny. When it comes to AMD GPUs, 30-40 watts suddenly becomes a literal nuclear meltdown that will cost you an extra $1000 in power when gaming 48/7/699 days a year. But when it's team green or blue doing it, it doesn't matter. Especially if there is a 5% performance differential.

The real answer is somewhere in the middle. Depending on your case, setup, ventilation, etc., it can make a difference. When someone shows at least some evidence of that in their use case, that's not a reason to attack them. Going from a CPU that dissipates more heat to one that's more efficient with less heat will lower case temperatures a bit in most scenarios, unless you have a very capable cooling system. How much depends on the above.

Datacentre types seem to know quite a bit about this, as it massively impacts their bottom line with cooling and electrical bills. And AMD is the captain now in that scenario, in practically every damn use case and benchmark. If you think that won't translate in similar scenarios to the desktop, I have Hitler's secret base under the ice in Antarctica to sell you.
 
The double standards here on power consumption are quite funny. When it comes to AMD GPUs, 30-40 watts suddenly becomes a literal nuclear meltdown that will cost you an extra $1000 in power when gaming 48/7/699 days a year. But when it's team green or blue doing it, it doesn't matter. Especially if there is a 5% performance differential.

The real answer is somewhere in the middle. Depending on your case, setup, ventilation, etc., it can make a difference. When someone shows at least some evidence of that in their use case, that's not a reason to attack them. Going from a CPU that dissipates more heat to one that's more efficient with less heat will lower case temperatures a bit in most scenarios, unless you have a very capable cooling system. How much depends on the above.

Datacentre types seem to know quite a bit about this, as it massively impacts their bottom line with cooling and electrical bills. And AMD is the captain now in that scenario, in practically every damn use case and benchmark. If you think that won't translate in similar scenarios to the desktop, I have Hitler's secret base under the ice in Antarctica to sell you.
The guy is acting like it made a huge impact on temps in his room as he says "I barely have to use a fan on me since I dropped my 6700k for a 3700x". Only a fool would think that 30 watts would make any real measurable difference in your room temp. And what does this have to do with AMD or a data center with a room full of comps?
 
The guy is acting like it made a huge impact on temps in his room as he says "I barely have to use a fan on me since I dropped my 6700k for a 3700x". Only a fool would think that 30 watts would make any real measurable difference in your room temp. And what does this have to do with AMD or a data center with a room full of comps?
Who fucking cares?!?! Does his opinion impact you? No? Then shut the fuck up and let's get back on topic. Jeezus Christ the millennial vibe is strong here!
 
Dumb of me to fan the flames, but I want to share this for other fence-sitters.

Moving from a 4790K to a 3700X made an obvious difference in exhaust fan temps. The 4790K (OCed to 4.6 GHz, I forget the voltage) exhausted warm-to-hot air from the rad; the FX-8320 always blew out extremely hot air.

In a small room with the windows shut and the door closed, a 9900K could possibly heat up the room over a couple of hours. I know for a fact a GPU consuming 275 watts does (see the next paragraph before comparing that to the 9900K's TDP); I learned that lesson trying to fold in my bedroom while sleeping.

And of course, the difference is far more than 30 watts in actual consumption. I'm not sure how TDP wattage is derived, but I assume it's some function of how much energy per (time period) it takes to heat the silicon by 1 degree Celsius. In actual fact, I've seen a 9900K after boost pulling over 200 watts, far above its TDP. A 3700X is running around half of that.



See the system power consumption tables at around the 5:15 mark. The 3900X is consuming 100 watts less at system load running Blender. Assuming that's a stock 9900K too...
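
If anyone wants to sanity-check what a gap like that means in heating terms, here's a rough Python sketch. The wattages are just the ballpark figures from this post (an assumed ~200 W boosted 9900K vs ~100 W 3700X), not measurements:

Code:
# Rough heat-output comparison for two CPUs, using the ballpark package-power
# numbers discussed above (assumptions, not measurements). Electrical draw
# ends up as heat in the room either way.

WATT_TO_BTU_PER_HR = 3.412  # 1 W is roughly 3.41 BTU/hr

def heat_btu_per_hr(watts: float) -> float:
    """Heat dumped into the room, in BTU/hr, for a given electrical draw."""
    return watts * WATT_TO_BTU_PER_HR

hot_chip_w = 200.0   # assumed boosted 9900K package power
cool_chip_w = 100.0  # assumed 3700X package power

delta_w = hot_chip_w - cool_chip_w
print(f"Extra heat: {delta_w:.0f} W ~= {heat_btu_per_hr(delta_w):.0f} BTU/hr")
# -> Extra heat: 100 W ~= 341 BTU/hr

Whether ~340 BTU/hr is noticeable obviously depends on the room and how well it sheds heat.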
 
Got mine at Microcenter on July 7th. I snatched the last one they had that day.

It's a wonderful slab o' Si.
 
I want one, but honestly my main rig is just for games and the 2700X is fine. My video rig could definitely benefit, but the R7 1700 on a cheap/shitty AB350 seems to do fine. Saving 20-30 mins on an encode doesn't justify the cost for me.

Now when winter gets here, maybe I can find a deal on a used TR CPU+mobo for that box. And extra heat in my home office would be welcome after October. Or I could just turn on the old FX-8350 box and let it heat the room up doing nothing :)
 
I think 30-40 watts of additional heat output could make a tangible comfort difference in a tiny, super-insulated dorm room.
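
For a rough sense of scale, here's a back-of-envelope Python sketch of the absolute worst case: a sealed, perfectly insulated room where none of the extra heat escapes. The room size and wattage are assumptions, and real rooms leak heat constantly, so treat the result as a ceiling rather than a prediction:

Code:
# Upper-bound estimate of room-air warm-up from an extra heat source,
# ignoring ALL losses through walls, windows, and ventilation. Real rooms
# shed heat continuously, so this is a ceiling, not a prediction.

AIR_DENSITY = 1.2           # kg/m^3, roughly, near sea level
AIR_SPECIFIC_HEAT = 1005.0  # J/(kg*K)

def no_loss_temp_rise_c(extra_watts: float, hours: float, room_volume_m3: float) -> float:
    """Temperature rise (deg C) of the air in a sealed, lossless room."""
    energy_j = extra_watts * hours * 3600.0
    air_mass_kg = room_volume_m3 * AIR_DENSITY
    return energy_j / (air_mass_kg * AIR_SPECIFIC_HEAT)

# 35 W extra for two hours in an assumed 3 m x 3 m x 2.5 m dorm room:
print(round(no_loss_temp_rise_c(35.0, 2.0, 3 * 3 * 2.5), 1))  # ~9.3 deg C, no-loss ceiling

How close a real room gets to that ceiling depends entirely on how quickly it sheds heat, which is why both sides of this argument can be right for different rooms.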
 
Microcenter Kansas City has several in stock right now. Asked the guys; was told the supply issue was solved and they expect to get more in weekly now.

A 9900K @ 5 GHz is better for gaming. But a lot of guys are apparently no longer gamers and are focused entirely on productivity apps and benchmarks.
Yup, they had two on Monday and said more were expected later.
 
Kind of funny. The difference between Vega and Pascal/Turing was often less than 30-40 W (for comparable tiers), and everyone jumped all over AMD for those cards being hot and unmanageable.

But here, AMD is 30W less, and we are saying it's just a drop in the bucket and doesn't matter.
 
I'm not sure how TDP wattage is derived

TDP is however the manufacturer chooses to define it, unfortunately. Since it's Design Power, it's a spec number and they can adjust the definition to fit whatever purpose they want. Ideally, Thermal Design Power ~should~ be the maximum power draw (or heat rejection, since they should be nearly 1:1) that the silicon requires while operating at 100% design load under stock factory conditions.

Now, actual Thermal Power is going to be directly proportional to (and nearly 1:1 with) electrical use. That's part of the reason it's measured in terms of Watts in microelectronics.

Power In = Power Out, because energy is always conserved. It's just being converted from electric to heat inside the silicon.

There are ways to measure power in terms that you are talking about (heat / temp rise):

1 BTU = amount of energy required to raise the temperature of 1 pound of water by 1 degree F.
1 Watt = 3.41 BTU / hr

or for your metric folks:

1 calorie = amount of energy required to raise the temperature of 1 gram of water by 1 degree C.
1 calorie = 4.19 joules, so 1 Watt ≈ 0.24 cal / sec (a Calorie in food is actually a kilocalorie, or capital-C Calorie)
(although usually energy is measured in terms of joules for applications like this, and 1 W = 1 J / sec)

TL;DR

You could measure TDP by how much silicon (or some other material) heats up over time, but usually they just do it with an ammeter and a known voltage, as it's just a matter of converting units.
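
To make those conversions concrete, here's a small Python sketch that just encodes the unit relationships described above; the example wattages are made up and not tied to any particular CPU:

Code:
# Unit conversions from the post above: electrical watts to heat-rate units.
# 1 W = 1 J/s, 1 BTU ~= 1055 J, 1 cal ~= 4.184 J.

BTU_IN_JOULES = 1055.06
CAL_IN_JOULES = 4.184

def watts_to_btu_per_hr(watts: float) -> float:
    """1 W ~= 3.41 BTU/hr."""
    return watts * 3600.0 / BTU_IN_JOULES

def watts_to_cal_per_s(watts: float) -> float:
    """1 W ~= 0.24 (small) calories per second."""
    return watts / CAL_IN_JOULES

for w in (65, 105, 200):  # made-up example package powers in watts
    print(f"{w:>3} W -> {watts_to_btu_per_hr(w):6.1f} BTU/hr, {watts_to_cal_per_s(w):5.1f} cal/s")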
 