AMD/ATI Beats Nvidia to the 1 GHz GPU Milestone

beowulf7

[H]F Junkie
Congrats to AMD! :cool:

AMD/ATI Beats Nvidia to the 1 GHz GPU Milestone

Marcus Yam said:
AMD has hit another megahertz milestone today.

[Image: Radeon HD 4890]


AMD boasted today that it has delivered the world’s first 1 GHz GPU. Is this a new product? Not exactly.

What AMD has done is take an ATI Radeon HD 4890 graphics card, which normally runs at 850 MHz, and overclock it to 1 GHz at the factory, air cooled, and voila: you have yourself the world's first shipping 1 GHz GPU part.

In fact, if you had your own Radeon HD 4890, you might be able to reach 1 GHz too. Of course, AMD does have the advantage of binning parts to make sure that those with the most headroom get separated for this new SKU.

The flip side of that equation is that any GPUs unable to hit the 1 GHz mark on air cooling alone will be relegated to the regular pile, capping the speeds at which those Radeon HD 4890 cards will run.

Look out for these juiced up video cards from Asus, Club 3D, Diamond Multimedia, Force3D, Gecube, Gigabyte, HIS (Hightech Information Systems), ITC, Jetway, MSI, Palit Multimedia, PowerColor, Sapphire Technology and XFX.

Stay tuned for our hands-on with one of these cards, where we'll put it up against a 'vanilla' reference board.

Interestingly enough, this is the second time that AMD has beaten an arch rival to the 1 GHz milestone. Back in 2000, AMD beat Intel to the 1 GHz punch with its Athlon. Remember that? Take a trip with us down memory lane as we look back on Chris Angelini’s review of the AMD Athlon 1 GHz during his more innocent and much younger days.
 
That's cool, and I applaud ATi for that, although I don't think NV cares or is attempting to go for high clock speeds on its parts. AMD also beat Intel to a 1 GHz CPU by two days, but that didn't mean much in the grand scheme of things.
 
...and Intel beat AMD to the 3 GHz CPU milestone...

Tell me that AMD beats NV 2:1 clock for clock, then we'll talk.
 
Woke up on the wrong side of the bed this morning? :confused:

Heh, not at all. I really didn't mean it to sound like it looks. Just not a very big deal as the cards have been out for well over a week now.
 
Good for them. Can't wait to see if they will release a 4890 x2 or not.
 
Gratz, ATI/AMD fanbois, but let's face it: in the massively parallel world of video rendering, this is an utterly meaningless "milestone". In case you haven't noticed, CPU clock speeds haven't increased by any substantial amount in the last, what, 3-4 years? Technically they've gone down since the P4 era, but who cares? The 1 GHz CPU race was meaningful, since the primary way to increase computing power back in the day was faster clock speeds. These days raw clock speed doesn't matter, and if ANYBODY should know that, it would be an AMD fanboi (please see the Athlon vs. P4 debate from 6 years ago).

So to all the fanbois out there, enjoy patting yourselves on the back for something:
a) you had nothing to do with
b) doesn't matter in the slightest

(And technically the nVidia shader units are double clocked off the core frequency, so they've been running well over 1 GHz for quite some time now. But again, at the end of the day, who really cares? I'd buy a 1 Hz GPU from Texas Instruments if it beat both ATI and nVidia in real-world performance.)
 
So, is the 1 GHz 4890 faster than a GTX 285? If it isn't, then how much is that milestone worth?
 
Wow, so it's not just board partners releasing their overclocked versions, but this is an actual part from AMD? Awesome in my book, good work AMD.

So, is the 1 GHz 4890 faster than a GTX 285? If it isn't, then how much is that milestone worth?
http://www.anandtech.com/video/showdoc.aspx?i=3555
Is the card just an overclocked/binned 4890 with no other changes? Extrapolating from that article, it looks like it trades blows with the GTX 285 at stock memory speeds (975 MHz), and at higher speeds (1.2 GHz) takes a ~5-10% lead. It could be very interesting to see how this plays out.
 
It was clearly done for the attention. It's one of those "Hey, look what I can do" situations. Something AMD has been doing a lot of these days.
 
Test it against the highest-overclocked GTX 275/GTX 285 from Nvidia partners. Whether it's done "officially" by AMD or unofficially by a licensed partner doesn't matter to me. I want to see benchmarks in an apples-to-apples situation: the highest-clocked card for sale from each GPU family.
 
Also first to 64-bit processors and first with the memory controller onboard (talking mainstream, of course), both of which Intel followed later. Even their HyperTransport bus might be a first (once again, mainstream), but I don't recall. They've also "technically" beaten Intel in the area of enthusiast software with AOD and Fusion, though how important that is remains arguable to some. AMD deserves some serious kudos for innovation and straight-up ballsy decisions. Sure, they've had their issues (management), but you have to agree that they've kicked ass in many ways over the years with nowhere near as much money or resources as Intel. Give credit where credit is due.
 
Given that the GT300 is rumored to be 512 SP, 512-bit, and GDDR5, ATI had better be packing some serious heat with the 5870.

Imagine what a 4870 or 4890 would be like on GDDR3?

GDDR5 and Nvidia :)
 
Given that the GT300 is rumored to be 512 SP, 512-bit, and GDDR5, ATI had better be packing some serious heat with the 5870.

Imagine what a 4870 or 4890 would be like on GDDR3?

GDDR5 and Nvidia :)

That rumor's bogus. To run 512-bit GDDR5, you're talking about paying a grand, easy, for a GTX 300 series card. Also, since 512-bit GDDR5 doesn't exist anyway, there's no possible way Nvidia's using it. Remember, AMD developed GDDR5, and they have yet to develop 512-bit GDDR5. So enjoy your rumors, Nvidia fanboys, because sadly it's going to slap you all in the face.
God, I hate useless rumors, and the people who believe them like it's the Bible. <-- And that goes for you ATI fanboys as well.
 
since 512-bit GDDR5 doesn't exist anyway, there's no possible way Nvidia's using it. Remember, AMD developed GDDR5, and they have yet to develop 512-bit GDDR5

Hmmm, you're really aggressive for someone who doesn't have a clue what he's talking about. That's not how memory interfaces work. There's no such thing as "256-bit GDDR5" either: all GDDR5 modules are 32 bits wide, and you gang multiple modules together to build whatever interface width you want. So you can use 16 x 32-bit, 512Mb GDDR5 modules to build a 512-bit interface to 1GB of memory.
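The arithmetic in that post can be sketched in a few lines. This is just an illustrative calculation (the function name and constants are mine, not from any real tool), assuming the fixed 32-bit module width and 512Mb module density mentioned above:

```python
# Illustrative sketch: a GPU memory bus is built from GDDR5 modules,
# each with a fixed 32-bit I/O width, so total bus width = modules x 32.

MODULE_WIDTH_BITS = 32        # fixed for GDDR5 modules
MODULE_DENSITY_MBIT = 512     # 512 Mb per module, as in the example above

def interface(num_modules):
    """Return (bus width in bits, total capacity in MB)."""
    bus_width = num_modules * MODULE_WIDTH_BITS
    capacity_mb = num_modules * MODULE_DENSITY_MBIT // 8  # megabits -> megabytes
    return bus_width, capacity_mb

# 16 modules -> 512-bit bus, 1024 MB (1 GB), matching the post above
print(interface(16))   # (512, 1024)
# 8 modules -> the familiar 256-bit / 512 MB configuration
print(interface(8))    # (256, 512)
```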
 
Grats to AMD/ATI on the milestone, for sure... but also to the engineering and technical geniuses who somehow keep raising the bar, generation after generation, to bring us ever faster and more powerful hardware that helps us play our games better. :)

I did get my 4890 to 1 GHz on the core, but at 100% fan speed and with serious artifacting. Still, the possibility was there! The only thing that surprises me is why this wasn't announced much sooner.
 
1. I find extreme irony in how much AMD fanbois bashed Intel for bragging about clock speeds over performance per clock. It makes me giggle.

2. However, hitting 1 GHz with the topology and transistor count of a GPU die is freaking cool. GPUs are different animals than CPUs, so actually doing that was no small effort.
 
P4 CPUs got to 4 GHz first! Take that AMD!

Oh wait...
 
You know what this says to me?

"We can't compete on clock for clock efficiency, so we'll just push the clockspeed through the roof and hope for the best"

Tell me, how much power does this card draw compared to its closest Nvidia equivalent?
 
You know what this says to me?

"We can't compete on clock for clock efficiency, so we'll just push the clockspeed through the roof and hope for the best"

Tell me, how much power does this card draw compared to its closest Nvidia equivalent?

The 4890 is in the same ballpark as its nVidia competitor. We're talking a difference of 10-30 watts at idle. When the rough figure hovers around 200 watts, that's pretty close. It's no K7 vs. P4 or anything.
 
Congrats, AMD, for passing a meaningless milestone. But who will be the first to pass the 1 MHz mark in base 23?
 
The CPU comparison is a whole other ballgame, as the architectures are so similar. But how do you compare a GPU architecture built for one clock across the chip with another that has two different clocks, where the second one has been over 1 GHz for over 2 years now? Or does shader clock not count in this megahurtz battle?
 
The CPU comparison is a whole other ballgame, as the architectures are so similar. But how do you compare a GPU architecture built for one clock across the chip with another that has two different clocks, where the second one has been over 1 GHz for over 2 years now? Or does shader clock not count in this megahurtz battle?

Nothing the other side does matters when in a Fanboi battle.
 
I find it amusing how, because people post this or comment on it, they're suddenly fanbois. Get over yourselves. I'm sure if Nvidia did this, it would be the same over there. Let people congratulate a company, whether for PR or whatever reason. Man, some of you people really need to get laid, or get better sleep. So damn cranky.
 
Just to clarify, I'm the one who posted this article. If any "fanboi" references were toward me, you may want to think otherwise. I own 2 computers: 1 AMD CPU w/ an NVidia GPU (my primary PC) and 1 Intel CPU w/ an ATI GPU (my HTPC). As you can see, I have "balance" in both systems and within each system. Therefore, I'm not a fanboi of either. ;)
 
Yeah, this is an Nvidia fanboy vs. AMD fanboy war.
A 1 GHz GPU, no matter how useless, is pretty cool.
And for years people have been wondering who would be the first.
All in all I say kudos, AMD. Yeah, it's pointless, but kudos.
 
Given that the GT300 is rumored to be 512 SP, 512-bit, and GDDR5, ATI had better be packing some serious heat with the 5870.

Imagine what a 4870 or 4890 would be like on GDDR3?

GDDR5 and Nvidia :)

Yes, because we really need a 512-bit bus width when we use GDDR5. :rolleyes: Get better rumors before you believe them.

It'd be easier to believe that when you buy a GT300, Michael Jackson jumps out from behind you and continues to molest you until you give nVidia some more cash. :p Jokes aside, seriously: the 4870 runs its memory at 900 MHz (3.6 Gbps effective) on a 256-bit bus, for a maximum memory bandwidth of 115.2 GB/s. If you keep the speed the same and widen the bus to 512-bit, that's 230.4 GB/s. If you increase the speed, the bandwidth obviously goes up further, but what is going to use over 230 GB/s?
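For anyone checking the math above: GDDR5 transfers data four times per base clock cycle, so peak bandwidth is base clock x 4 x bus width / 8. A quick sketch (the function name is mine, purely for illustration):

```python
# Illustrative GDDR5 bandwidth math from the post above.
# GDDR5 is quad data rate: effective transfer rate = base clock x 4.

def bandwidth_gbs(base_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s for GDDR5."""
    effective_mts = base_clock_mhz * 4               # mega-transfers per second
    return effective_mts * bus_width_bits / 8 / 1000 # bits -> bytes, MB/s -> GB/s

# HD 4870: 900 MHz GDDR5 on a 256-bit bus
print(bandwidth_gbs(900, 256))   # 115.2
# Same memory on a hypothetical 512-bit bus
print(bandwidth_gbs(900, 512))   # 230.4
```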
 
With the G300, we're more likely to see either a 256-bit bus with high-speed GDDR5 and 1GB of RAM, or a 384-bit bus with a base of 1.5GB of RAM.
 