AMD Forms Radeon Technologies Group to Enhance Focus on Graphics and Immersive Computing

HardOCP News

AMD today announced the promotion of Raja Koduri (47) to senior vice president and chief architect, Radeon Technologies Group, reporting to president and CEO Dr. Lisa Su. In his expanded role, Koduri is responsible for overseeing all aspects of graphics technologies used in AMD’s APU, discrete GPU, semi-custom, and GPU compute products.

“We are entering an age of immersive computing where we will be surrounded by billions of brilliant pixels that enhance our daily lives in ways we have yet to fully comprehend,” said Dr. Su. “AMD is well positioned to lead this transition with graphics IP that powers the best gaming and visual computing experiences today. With the creation of the Radeon Technologies Group we are putting in place a more agile, vertically-integrated graphics organization focused on solidifying our position as the graphics industry leader, recapturing profitable share across traditional graphics markets, and staking leadership positions in new markets such as virtual and augmented reality.”

“AMD is one of the few companies with the engineering talent and IP to make emerging immersive computing opportunities a reality,” said Koduri. “Now, with the Radeon Technologies Group, we have a dedicated team focused on growing our business as we create a unique environment for the best and brightest minds in graphics to be a part of the team re-defining the industry.”
 
So you can think of him as a type of.... Card Czar.

Thank you. Thank you. I'll be here all week.
 
Good news. With Raja Koduri and Jim Keller, AMD is set on the tech side. Now AMD needs to find somebody of this caliber to head their marketing department.
 
I hope he has some sense and reintroduces powerful RAMDACs to AMD's cards. With some R&D they could do 1GHz with zero noticeable degradation when paired with a high-quality VGA cable, or up to 1.5GHz over BNC. People need cards with the ability to drive CRTs at extremely high resolutions, such as 4096x3072i@85Hz. Current nVidia cards give dirty signals at pretty much any bandwidth, failing above 400MHz. You can get 2560x1920i@120Hz out of a 980, but the signal will be significantly degraded and blurred. Top-end AMD cards don't have RAMDACs at all, which makes it impossible to use them with almost all proper monitors above 1400x1050@80 or 90Hz, and even getting that measly resolution and refresh rate requires buying an expensive external RAMDAC.
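Rough back-of-the-envelope math on those pixel clocks, for anyone curious. This is just a sketch: the ~35% blanking overhead is my assumption (in the ballpark of GTF timings; real modelines vary), and interlaced modes are treated as scanning half the lines per field.

Code:
# Sketch: approximate RAMDAC/pixel-clock requirement for a CRT mode.
# Assumes ~35% blanking overhead (roughly GTF-like); interlaced modes scan
# half the lines per field, so the required clock is about halved.
def required_pixel_clock_mhz(width, height, refresh_hz, interlaced=False,
                             blanking_factor=1.35):
    effective_rate = refresh_hz / 2 if interlaced else refresh_hz
    return width * height * effective_rate * blanking_factor / 1e6

print(required_pixel_clock_mhz(2560, 1920, 120, interlaced=True))  # ~398 MHz
print(required_pixel_clock_mhz(4096, 3072, 85, interlaced=True))   # ~722 MHz

So 2560x1920i@120 sits right around the ~400MHz mark where current cards get dirty, and 4096x3072i@85 would need roughly 700MHz or more, which is why a ~1GHz RAMDAC would have headroom.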

Honestly, though... this is really all just wishful thinking. Most consumers don't even know what picture quality is anymore and use garbage displays, like the LCD contraptions that are pushed as desktop monitors.
 
...... Most consumers don't even know what picture quality is anymore and use garbage displays, like the LCD contraptions that are pushed as desktop monitors.

lol. Get over it, nobody wants a 75 pound monstrosity on their desk, with pathetically small screen real estate. The market has spoken!
 
I'm not seeing any 4K analogue CRTs for sale...
 

So much clear bias with so little actual evidence to support it. LCD tech passed CRT a long time ago. Posts like these made sense years and years ago, but they've been false for ages now. Hardware color calibration devices debunked this ages ago. The only concern now is LCD latency, and if you get a good LCD it's so close to perfect that it's a non-issue.

As someone who did professional gaming, I missed CRTs as well, especially as good ones became harder to find. But the world has moved on, and in the last few years the tech has arrived to completely replace CRTs. Let's drop the subject now; CRT is long dead and isn't coming back, nor should it.
 
Hopefully, he will figuratively knock some heads together over there at AMD to get this going. Also, perhaps he can finally help them get a clue with their pricing and availability because right now, they are not going to make money on the Fury's.
 
So much clear bias with so little actual evidence to support it. LCD tech passed CRT a long time ago. Posts like these made sense years and years ago, but they've been false for ages now. Hardware color calibration devices debunked this ages ago. The only concern now is LCD latency, and if you get a good LCD it's so close to perfect that it's a non-issue.

As someone who did professional gaming, I missed CRTs as well, especially as good ones became harder to find. But the world has moved on, and in the last few years the tech has arrived to completely replace CRTs. Let's drop the subject now; CRT is long dead and isn't coming back, nor should it.
They haven't COMPLETELY passed CRTs in terms of technology, but they've caught up on the most important parts for many years now. I held onto my CRT long after LCDs went mainstream just because I hated TN panels passionately. I was an early adopter of the first e-IPS panel, though, since it was good enough for gaming and had comparable color.

Besides input lag, the other big advantage of CRTs is support for non-native resolutions. I get sick of the upscale blur when running an older game at a lower resolution, but for me the benefits of a good IPS panel far outweigh the cons versus a CRT.
 
I am all for CRTs, but it's obvious they are just inferior technology-wise. Now quantum dot displays are where it's at.
 
Honestly, though... this is really all just wishful thinking. Most consumers don't even know what picture quality is anymore and use garbage displays, like the LCD contraptions that are pushed as desktop monitors.

I used to have some pretty nice CRT monitors. Once they started having problems I got rid of them.

Not only do CRTs weigh a ton, they also make a horrible high-pitched sound that I cannot stand to be around anymore.

CRTs also consume a massive amount of power compared to LCDs.

It is a good thing that they are gone.

Sure, there have been some super crappy LCD monitors, and I am sure that there still are, but they have come a really long way since they were first introduced.

When I can get a good 40" 120Hz 1080p dumb TV/monitor for a little over $200 on sale, there is absolutely no reason for CRTs to even exist... except as boat anchors.

Decent large 4k monitors are getting to be below the $1k mark.

Now tell me again why you want a CRT?
 
They haven't COMPLETELY passed CRTs in terms of technology, but they've caught up on the most important parts for many years now. I held onto my CRT long after LCDs went mainstream just because I hated TN panels passionately. I was an early adopter of the first e-IPS panel, though, since it was good enough for gaming and had comparable color.

Besides input lag, the other big advantage of CRTs is support for non-native resolutions. I get sick of the upscale blur when running an older game at a lower resolution, but for me the benefits of a good IPS panel far outweigh the cons versus a CRT.

There are some monitors that do upscaling really well. The older Dell 19" 4:3 monitor is one of them. I kept one just because of that.

You should also be able to set your video card not to scale lower resolutions to full screen, in order to avoid the blur.
 
I used to have some pretty nice CRT monitors. Once they started having problems I got rid of them.

Not only do CRTs weigh a ton, they also make a horrible high-pitched sound that I cannot stand to be around anymore.

CRTs also consume a massive amount of power compared to LCDs.

It is a good thing that they are gone.

Sure, there have been some super crappy LCD monitors, and I am sure that there still are, but they have come a really long way since they were first introduced.

When I can get a good 40" 120Hz 1080p dumb TV/monitor for a little over $200 on sale, there is absolutely no reason for CRTs to even exist... except as boat anchors.

Decent large 4k monitors are getting to be below the $1k mark.

Now tell me again why you want a CRT?
Everything was better back in the day, now get off my lawn
 
Not sure what to make of this. My first thought is they're preparing to sell off the graphics group. My second thought is they're actually trying to improve.
 
Glad I stumbled upon yet another CRT VS LCD discussion, we don't have enough of them going on.
 
There are some monitors that do upscaling really well. The older Dell 19" 4:3 monitor is one of them. I kept one just because of that.

You should also be able to set your video card not to scale lower resolutions to full screen, in order to avoid the blur.
Yeah, then it will just be smaller than a postcard, not a great solution.
 
With some R&D they could do 1GHz with zero noticeable degradation when paired with a high-quality VGA cable, or up to 1.5GHz over BNC. People need cards with the ability to drive CRTs at extremely high resolutions, such as 4096x3072i@85Hz.
Clean signal bandwidth is only one problem, and it's not going to happen over a VGA port. More closely packed phosphors and groupings, tighter aperture grilles, electron beam focus radius and switching speed, and the fact that it's a waste of time are other inhibitors. :p

Such a CRT would be prohibitively expensive, if it's possible at all, making the market for it essentially zero.
 
Clean signal bandwidth is only one problem, and it's not going to happen over a VGA port. More closely packed phosphors and groupings, tighter aperture grilles, electron beam focus radius and switching speed, and the fact that it's a waste of time are other inhibitors. :p

Such a CRT would be prohibitively expensive, if it's possible at all, making the market for it essentially zero.

Back in the day.. in the back of the Computer Shopper magazine, you could order 40" and possibly even bigger CRT monitors. Of course you would need your own personal nuclear power plant to run them.....
 
Clean signal bandwidth is only one problem, and it's not going to happen over a VGA port. More closely packed phosphors and groupings, tighter aperture grilles, electron beam focus radius and switching speed, and the fact that it's a waste of time are other inhibitors. :p

Such a CRT would be prohibitively expensive, if it's possible at all, making the market for it essentially zero.

I've sent 2560x1920i@120 to my CRT. The signal was a bit dirty, but it displayed fine. 3840x2880i@80Hz would be within range on my screen, but I have no way to test it due to RAMDAC bandwidth limits. 2560x1920 also looked better than 2048x1536, though it was a bit more blurry due to grille pitch and beam focus limits. To perfectly resolve 4096x3072 on a 26" visible tube would need a .13 dot pitch. There were .19 units available at smaller sizes. Tightening the pitch and beam focus a bit more would be totally possible. It is also possible to go for alternatives to aperture grilles that provide comparable results, such as the rarely-used slot masks.
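The .13 pitch number is just geometry, by the way. Here's a quick sketch of the arithmetic, assuming a 4:3 visible area and 25.4mm per inch (the function name is just illustrative):

Code:
import math

# Horizontal pitch needed for roughly one pixel per phosphor triad on a 4:3 tube.
def horizontal_pitch_mm(visible_diagonal_in, h_pixels, aspect_w=4, aspect_h=3):
    # Horizontal width of the visible area, derived from the diagonal via the aspect ratio
    width_in = visible_diagonal_in * aspect_w / math.hypot(aspect_w, aspect_h)
    return width_in * 25.4 / h_pixels

print(round(horizontal_pitch_mm(26, 4096), 2))  # ~0.13 mm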
 
Back in the day.. in the back of the Computer Shopper magazine, you could order 40" and possibly even bigger CRT monitors. Of course you would need your own personal nuclear power plant to run them.....

A 40" CRT will take around 300W. This is comparable to a very heavily-overclocked top end GPU or two flagship cards at stock clocks. You totally need huge amounts of power to run an SLI or overclocked gaming rig!/s
 
I don't know what RAMDACs are, but I are teh needs RAMDACS! RAMDACS 4 every1! Gimme RAMDACS!

On topic, why create a new division with a new VP? That's just more money that is being spent on business/corp inefficiencies....unless AMD is getting ready for a spinoff sale...
 
Should have spent the money on driver development and CrossFire support.....
 
lol. Get over it, nobody wants a 75 pound monstrosity on their desk, with pathetically small screen real estate. The market has spoken!

Some do... I would take a 21" CRT with zero lag and color accuracy over an LCD today.
 
I wouldn't.
I have an Iiyama 22" 510 Diamondtron capable of 2048x1536 @ 80Hz, and at high res it's not clear enough.
Even 1680 res is too high.

And it's way too small.
It does weigh a ton, too.
 
The only good thing about CRTs is that when I was playing Ultima Online, I would sometimes get pissed and punch the monitor (damn people robbing mah corpse). Can't do that with an LCD. lol
 
They haven't COMPLETELY passed CRTs in terms of technology, but they've caught up on the most important parts for many years now. I held onto my CRT long after LCDs went mainstream just because I hated TN panels passionately. I was an early adopter of the first e-IPS panel, though, since it was good enough for gaming and had comparable color.

Besides input lag, the other big advantage of CRTs is support for non-native resolutions. I get sick of the upscale blur when running an older game at a lower resolution, but for me the benefits of a good IPS panel far outweigh the cons versus a CRT.

Well, input lag, as I said, is no longer an issue and has not been for ages. The non-native resolution thing, though, yeah. But honestly, it's not something I think anyone really uses much, or ever did. Just being objective about that one.
 