Trinity in the Flesh...or Silicon

Observations:
1. This is an interesting development.
2. AnandTech has the right idea: "There are no bad products, only bad prices."
3. It is important to separate facts from opinions.
4. AMD is right to introduce this in the laptop segment first.
4.1 Laptops are differentiated by power consumption, performance, graphics, screen resolution, display quality, market segment (barebone/normal/entertainment/gaming/business/workstation), heat dissipation, add-on features, battery runtime, and pricing (probably more).
4.2 Some will argue that 4.1 applies to desktops as well. That is only partially true; see 4.3 below.
4.3 A laptop ships as one piece, and most of the parts (the factors in 4.1) are fixed at the factory. The only components you can easily change are the RAM and hard drive; the rest are not user-serviceable on generic mass-market laptops. (I understand some will argue for replaceable screens, GPUs, or even CPU upgrades themselves, but we are talking about common mass-market laptops shipping in millions of units.)
4.4 As a result of 4.1 and 4.3, there are many possible configurations. On the desktop, end users can rebalance these factors easily, and some of them are of minimal concern; in laptops they are all relevant, and the overall combination cannot easily be changed once delivered. Because of these variations, and because no single configuration can cover every practical combination, there are always segments where an opening exists. This explains point 4.
5. Combining everything from 4 through 4.4 explains point 2.

6. Misc
6.1 On single-threaded performance, yes, it is still behind SB/IB at the same clock.
6.2 But the point here is that we are not talking about the same clock. It is about performance within a laptop's power budget, relative to runtime and pricing.
6.3 Top SB/IB laptop models obviously have great performance, especially those with mid- or high-end NVIDIA/AMD discrete GPUs, but the pricing reflects that, and so, indirectly, does power consumption (indirect because most advanced users know that a quad-core, top-speed laptop processor with a high-end discrete GPU will draw somewhat more power and run somewhat shorter on battery).
6.4 Beyond that, the field is wide open, with features and pricing as the key differentiators.

7. One interesting development: after the introduction of the Retina display on the iPad, some users have become more interested in higher-resolution LCD screens on PC laptops.

8. In the end, refer back to point 2: "There are no bad products, only bad prices."

Cheers.
 
Waiting for a 1080p Trinity laptop with dual graphics in a slim profile. I think that will be their sweet spot in price/performance.
 
Do you guys think IPC could improve even further with Vishera, since they won't have to concentrate on energy efficiency and battery life with the desktop chips? Kinda like how Ford can put more power into their truck engines because fuel efficiency won't be as big a concern as it would be in their small econo-class cars? Or would that even affect IPC?

Although some folks probably expected a quick-fix for the Bulldozer architecture that would yield some sizeable performance gains, that doesn't appear to be what's happened. Instead, Piledriver incorporates a fairly broad range of improvements, none of which contributes much more than 1% to overall per-clock instruction throughput. (I believe the cumulative total is somewhere around a 6% IPC improvement, generally, but my notes are fuzzy on that one.)

http://techreport.com/articles.x/22932/2
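(As a rough sanity check on that quoted figure: several small, independent per-clock improvements compound multiplicatively, so a handful of ~1% tweaks lands right around the ~6% ballpark. The count and size of the individual gains in this sketch are assumptions for illustration; the article only gives the cumulative number.)

[CODE]
# Rough sanity check: several small per-clock improvements compound
# multiplicatively. Six hypothetical tweaks of ~1% each is an assumption
# for illustration; the article only gives the ~6% cumulative ballpark.
gains = [0.01] * 6

cumulative = 1.0
for g in gains:
    cumulative *= 1.0 + g

print(f"cumulative IPC gain: {(cumulative - 1) * 100:.1f}%")  # ~6.2%
[/CODE]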

No, I don't think so. I think the addition of L3 cache will definitely help, but I wouldn't expect AMD to do anything drastic to bump that ~6% figure up other than the L3. The clock speeds are another matter. Considering how efficient Trinity is, as efficient as SB at idle and load, it should translate to some pretty drastic decreases in power consumption under that 4GHz mark. The unknown variable is that clock mesh tech, which is supposedly helping Trinity more than it will Vishera because Trinity doesn't scale past 4GHz in clock speeds. It's difficult to derive much from Trinity because the reviews have been pretty shitty and because it's an APU that sacrifices 50% of its die space for the on-die GPU, which hurts both TDP and clock speeds. I'd guess they finally reach the clock speed goals they had for Bulldozer as well as decrease power consumption significantly at load. Until we get some decent reviews where IPC is thoroughly examined and compared with Llano, SB/IB, and BD, we just won't know much else.
 
Well, I'm at work on my phone so it's hard to peruse all the reviews, but it looks like Trinity is 15-20% faster than Llano but 15-20% slower than Sandy Bridge, which would likely translate to 20-25% slower than Ivy Bridge. Is that about right? Kinda disappointing if so. I was hoping to see AMD a little closer to 10% behind Intel.

It would be impossible and unreasonable to expect Trinity to beat (or even come close to) SB/IB in IPC... Intel is just too far ahead.

With Trinity, AMD is asking, "Do you want a faster PC or better graphics?"
Also, they might be cheaper (which is a huge deal) if they charge less for the chipset.

Trinity will be better for "almost everyone," as most users do not really need "more CPU," while "more GPU" can always be useful.

Yes, Intel + a discrete GPU will be a lot faster, but also a LOT more expensive (and +$75 on a $500 laptop is huge).
Trinity will give you "guaranteed console performance" on a "budget laptop," something Intel cannot do with the HD 2500 and can barely do with the HD 4000.
 
With Trinity, AMD is asking, "Do you want a faster PC or better graphics?"

The big question is how many people out there, in their daily use, could even tell whether one laptop off the shelf was faster than another.

It's only when you start transcoding/converting video that most people would notice. As for gaming, in my experience the only game that appears on most folks' laptops (usually teenagers') is The Sims. Haven't really found anything else. It's still a very small sliver of the market that seriously games to any degree on a laptop.

Strange that The Sims never appears in any of the tech-site benchmarks?! :D
 
I haven't browsed computers/laptops in a real store in forever, so I'm not sure how they advertise. However, I think AMD will be able to check all the right boxes with these to satisfy your average customer: quad core, great battery life, something about its graphics, and they are definitely gonna nail the price. I think this is a great strategy for them; they just can't compete with Intel in the premium space anymore. I personally would like one, although I thought they were going to game better vs. Llano and Ivy.
 
Strange that The Sims never appears in any of the tech-site benchmarks?! :D

The Sims, World of Warcraft, BF3, Diablo III, Madden, NBA Live, Left 4 Dead: these are the gamers AMD is aiming at.
Games that you can see "on the news" and "advertised at bus/train stops."

Get a $500 laptop from Mom/Grandma for your birthday/high school/college and game on it.

There is a reason consoles sell much more easily than PCs... they are cheaper...
A $500 "console-like laptop" isn't much more expensive than a TV + 360, and with Netflix/Hulu...
You see the potential market for "one $500 laptop per person" :)
 
http://techreport.com/articles.x/22932/2

No, I don't think so. I think the addition of L3 cache will definitely help, but I wouldn't expect AMD to do anything drastic to bump that ~6% figure up other than the L3. The clock speeds are another matter. Considering how efficient Trinity is, as efficient as SB at idle and load, it should translate to some pretty drastic decreases in power consumption under that 4GHz mark. The unknown variable is that clock mesh tech, which is supposedly helping Trinity more than it will Vishera because Trinity doesn't scale past 4GHz in clock speeds. It's difficult to derive much from Trinity because the reviews have been pretty shitty and because it's an APU that sacrifices 50% of its die space for the on-die GPU, which hurts both TDP and clock speeds. I'd guess they finally reach the clock speed goals they had for Bulldozer as well as decrease power consumption significantly at load. Until we get some decent reviews where IPC is thoroughly examined and compared with Llano, SB/IB, and BD, we just won't know much else.

6%, huh? Well, that is disappointing. Like I said, I wasn't expecting miracles, but I was hoping for a 15% IPC increase. Maybe the addition of some L3 can bump it up to 10%. Still a LONG way behind Intel, and maybe too far behind to be competitive even with some nice pricing (talking Vishera here). I'm not jumping ship quite yet and will wait until we actually see some desktop CPUs, but it does look a little worse than I'd hoped for.
 
The Sims, World of Warcraft, BF3, Diablo III, Madden, NBA Live, Left 4 Dead: these are the gamers AMD is aiming at.
Games that you can see "on the news" and "advertised at bus/train stops."

Yes, quite probable. Just a shame that AMD will never advertise the fact themselves.

Folks will just buy Intel Inside instead, as that's what they are told to on TV five times a day by the adverts.

AMD? Who are they? Some '80s synthpop band?

I still think AMD are their own worst enemies when it comes to pushing brand recognition. It's not even as though they have 20 other competitors to fight against.

Just the one.
 
6%, huh? Well, that is disappointing. Like I said, I wasn't expecting miracles, but I was hoping for a 15% IPC increase. Maybe the addition of some L3 can bump it up to 10%. Still a LONG way behind Intel, and maybe too far behind to be competitive even with some nice pricing (talking Vishera here). I'm not jumping ship quite yet and will wait until we actually see some desktop CPUs, but it does look a little worse than I'd hoped for.

Competitive with or for whom?

It's only a very small group that buys processors faster than the average. Most folks are happy to settle on the best bundle of 'stuff' they can get for $500.

Give it another year or two, and more people will be using (less competitive in terms of IPC) ARM CPUs for their day-to-day computing than Intel's.
 
AMD is dying. Was really looking forward to Trinity.

lol? Trinity is 20% faster on the CPU side of things than Llano and like 50% more powerful on the graphics side, and it's more power efficient. They did this without a die shrink.

Trinity is an awesome mobile chip; it just depends on price.
 
lol? Trinity is 20% faster on the CPU side of things than Llano and like 50% more powerful on the graphics side, and it's more power efficient. They did this without a die shrink.

Trinity is an awesome mobile chip; it just depends on price.

It's not awesome, it's pretty blah. All the reviews out now pretty much echo what I just said. The HD 4000 gets too close in many of the benchmarks. Let's face it, the BD arch is complete poo poo, and AMD should have put in a better-performing GPU.

And remember, it's only the top-of-the-line Trinity that looks even marginally acceptable: 4 cores, 384 "Radeon cores," etc. Imagine the dog-poop performance from the severely neutered dual-core versions. Every IB comes with the HD 4000, so... yeah. Sorry, Trinity is a letdown. But keep drinkin' that Kool-Aid. It's not gonna make the engineering any better.
 
Priced right, Trinity laptops will still sell like hotcakes.

Ivy Bridge didn't make any significant gains except in graphics,
so we'll let Trinity slide for this round in terms of quad-core, full-load performance.
 
I think this shows that AMD desperately needs a node shrink. On the other hand, this looks like a great processor for an HTPC or for upgrading old computers here at home...
 
http://www.legitreviews.com/article/1928/2/

(Diablo III benchmark chart)


Umm, yeah... enough said.
 
And remember, it's only the top-of-the-line Trinity that looks even marginally acceptable: 4 cores, 384 "Radeon cores," etc. Imagine the dog-poop performance from the severely neutered dual-core versions. Every IB comes with the HD 4000, so... yeah. Sorry, Trinity is a letdown. But keep drinkin' that Kool-Aid. It's not gonna make the engineering any better.
That is my concern as well. If AMD's top-of-the-line A10 is sometimes better and sometimes worse than the Ivy Bridge HD 4000, how much worse are the A8 and A6 silicon compared against a lower-end Ivy Bridge i5, which still has the same HD 4000 GPU?

Edit: I was also trying to find some Ivy Bridge laptops for a price comparison, but I have failed. Anyone know how much an i5 Ivy Bridge laptop may go for?
 
That is my concern as well. If AMD's top-of-the-line A10 is sometimes better and sometimes worse than the Ivy Bridge HD 4000, how much worse are the A8 and A6 silicon compared against a lower-end Ivy Bridge i5, which still has the same HD 4000 GPU?

Edit: I was also trying to find some Ivy Bridge laptops for a price comparison, but I have failed. Anyone know how much an i5 Ivy Bridge laptop may go for?

No idea what the i5s will go for. Currently the i3s aren't even scheduled for shipment/release until mid/late summer? Intel haven't said anything at all about the i3s, so we may not see any competitively priced IBs for a while.

The lower-end IB chips would also drop in performance due to decreased cache and clock speeds, both of which affect how the HD 4000 performs, since the cache is shared between GPU and CPU on Intel's designs. I'm not sure by how much, but my guess is the A8 Trinity chips will still outperform the IB on-die GPU regardless of the SKU.

Best bet is to wait. Trinity is shipping en masse already (over 1 million shipped?), while the IB i7s and one i5 SKU are shipping as well, the i7-3720QM being the only IB chip I've seen available for purchase. They should be on the shelves within a month, but which chips and at what prices :confused: The A10 Trinity is supposed to be priced at $600+, which is fine at the moment, but you'd figure it would be too expensive if those IB i3s come equipped with the HD 4000, because they'd go for roughly the same.

I hate waiting games and paper launches.
 
Yeah, what scares me about this is that Intel is starting to catch up in GPUs, the one area AMD should have a huge advantage in (since they bought ATI, a dedicated graphics company). Now even that seems to be slipping away.

And the Piledriver improvements look weak too. Even a 15% IPC improvement won't help desktop BD that much where it counts: in gaming. You see some gaming benches where BD is doing 50 FPS and Ivy is doing 95. Well, even if you add 15%, BD only gets up to 57.5. Big deal, they're still getting thrashed.

When is desktop Piledriver due, anyway? Any better estimates than the second half of 2012?

Edit: OTOH, if AMD can raise the desktop Piledriver clock speeds too, that would help. The positive for AMD is that Ivy Bridge was kind of a flop.

Really, being on 22nm, I think Intel completely toys with AMD right now. Their chips are literally half the size/cost of competing AMD chips; Intel could slash prices anytime they want without even feeling much pain, they just choose not to. I don't think Intel wants AMD dead (antitrust issues), so AMD will continue to exist, IMO.
 
Their chips are literally half the size/cost of competing AMD chips; Intel could slash prices anytime they want without even feeling much pain, they just choose not to.
Counting the R&D required to even ship 22nm, I think you'll find they cannot, in fact, slash prices anytime they want.

The die-shrink train is NOT something that runs on the cheap.
 
Sure they can; they're highly profitable. Do you not think AMD is required to sink the same R&D into 22nm? What then?

Besides, the tough part of 22nm for Intel is probably over (low yields, etc.); the gravy part of low costs is likely just beginning as yields increase for years to come.
 
And the Piledriver improvements look weak too. Even a 15% IPC improvement won't help desktop BD that much where it counts: in gaming. You see some gaming benches where BD is doing 50 FPS and Ivy is doing 95. Well, even if you add 15%, BD only gets up to 57.5. Big deal, they're still getting thrashed.

That's not normal, though. The only time BD gets beaten 2:1 was in the [H] multi-GPU review, and that was using THREE GTX 580s. If you've got $1,500 in video cards, then you're better off going Intel. But in a more realistic system, with $500 or less in your PCIe slot, a Bulldozer will perform a lot closer to an Intel. It still gets whooped in Civilization V and StarCraft II for some reason, but all the others will be a lot closer, and you likely won't see a real-world difference.

The reason I want a 15-20% improvement is just to have a more competitive CPU market. I don't like Intel not having any competition. If Piledriver could muster 15-20% improvements, use less energy, and overclock as well as Bulldozer, then come in under $200, it would be a viable alternative to Intel's $250 i5-3570K. Intel would still be top dawg, but at least AMD would be giving them a little bit of heat.
 
It's not awesome, it's pretty blah. All the reviews out now pretty much echo what I just said. The HD 4000 gets too close in many of the benchmarks. Let's face it, the BD arch is complete poo poo, and AMD should have put in a better-performing GPU.

And remember, it's only the top-of-the-line Trinity that looks even marginally acceptable: 4 cores, 384 "Radeon cores," etc. Imagine the dog-poop performance from the severely neutered dual-core versions. Every IB comes with the HD 4000, so... yeah. Sorry, Trinity is a letdown. But keep drinkin' that Kool-Aid. It's not gonna make the engineering any better.

But is it anything that a driver update can't fix? AMD updates drivers every month; Intel, much less so.
 
Any word on street date/pricing for Trinity desktop stuff? Haven't built a PC in a while, and am only waiting on a top-spec A10 and a fitting mobo.
 
That's not normal, though. The only time BD gets beaten 2:1 was in the [H] multi-GPU review, and that was using THREE GTX 580s. If you've got $1,500 in video cards, then you're better off going Intel. But in a more realistic system, with $500 or less in your PCIe slot, a Bulldozer will perform a lot closer to an Intel. It still gets whooped in Civilization V and StarCraft II for some reason, but all the others will be a lot closer, and you likely won't see a real-world difference.


I have seen gaming benchmark numbers showing that a stock FX-8150 probably is not much better than my overclocked Q6600 @ 3.0GHz, which I assume to be close to the Q9650 in these charts, where you can see the Q9650 competing well with the 8150: http://www.hardware.fr/articles/842-20/jeux-3d-crysis-2-arma-ii-oa.html

Sure, you can OC an 8150, but probably not all that much, and then suddenly it sucks 400 watts from the wall. F that.

When I asked on [H] a few months ago whether it made sense to pair a 7970 with my Q6600, I was roundly ridiculed >> http://hardforum.com/showthread.php?t=1667210 So it shouldn't be any different for the 8150.

But I mean, thinking of it that way really drives home the sad reality: an 8150 would be a pointless gaming upgrade for me from a Q6600.
 
Sure they can; they're highly profitable. Do you not think AMD is required to sink the same R&D into 22nm? What then?

Besides, the tough part of 22nm for Intel is probably over (low yields, etc.); the gravy part of low costs is likely just beginning as yields increase for years to come.

They're not, though. AMD doesn't own GloFo, and GloFo shares its R&D costs with other fabs, including IBM and Samsung. Since they completely split just this year, AMD only pays for the wafers (though I'm not sure of the exact terms of the WSA, or wafer supply agreement); they're not funding anything GloFo-related unless they buy the chips from them. This was done purposely by the collective fabs so they can share the exponentially increasing R&D costs that come with shrinking node sizes, allowing them all to be more competitive. It's also the reason Intel is opening up its fabs to third parties: to increase profits and provide for some of that future R&D.
 
I have seen gaming benchmark numbers showing that a stock FX-8150 probably is not much better than my overclocked Q6600 @ 3.0GHz, which I assume to be close to the Q9650 in these charts, where you can see the Q9650 competing well with the 8150: http://www.hardware.fr/articles/842-20/jeux-3d-crysis-2-arma-ii-oa.html

Sure, you can OC an 8150, but probably not all that much, and then suddenly it sucks 400 watts from the wall. F that.

When I asked on [H] a few months ago whether it made sense to pair a 7970 with my Q6600, I was roundly ridiculed >> http://hardforum.com/showthread.php?t=1667210 So it shouldn't be any different for the 8150.

But I mean, thinking of it that way really drives home the sad reality: an 8150 would be a pointless gaming upgrade for me from a Q6600.

I wouldn't ridicule you for using a 7970 with an overclocked Q6600, though I do think the 8150 is a big upgrade for multitasking... I use my 7970 with a Llano... lol
 
I was expecting the difference between Trinity and the HD 4000 to be about the same as the difference between Llano and the HD 3000. It generally isn't, and that's the biggest letdown for me. Intel is almost caught up on the graphics side already. AMD's APUs get a nice boost from faster memory, and I believe DDR4 will be coming out around the end of the year. I wonder if AMD will implement DDR4 in their next APU refresh, and if it will make a big difference. They need to do something, or Intel will catch them on their only remaining advantage by next gen.
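(For context on why the APUs scale with memory speed: the integrated GPU shares the CPU's memory bus, so shader throughput is often bandwidth-bound. A quick sketch of peak dual-channel DDR3 bandwidth; real sustained bandwidth is lower than these theoretical peaks.)

[CODE]
# Peak bandwidth = transfer rate (MT/s) * 8 bytes/transfer * channels.
# Illustrates why an APU's iGPU gains so much from faster DDR3 grades.
for mt_s in (1333, 1600, 1866, 2133):
    gb_s = mt_s * 8 * 2 / 1000  # dual-channel peak, GB/s
    print(f"DDR3-{mt_s}: {gb_s:5.1f} GB/s peak")
[/CODE]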
 
I can't stand the touchpad on the X220; otherwise it's a great ultraportable. I'd probably go for something else, like a DV4. If you plan on using an external mouse, then I guess it's not a big deal... but in my opinion the touchpad on the X220 is a huge turn-off.
 
I'm debating whether to wait for Kaveri/Haswell and keep my i3 SB lappy, or just buy Trinity/IB or a decent i3/i5 with a GT 540M/640M now. I wouldn't mind dumping my rig altogether, minus one monitor, and just going laptop-only. I suppose, like everyone else, it's going to be about the implementation and price on both newest-gen chips. If a Trinity chip with CrossFire can be had for ~$600, then that would convince me to get rid of both of these :p I'll have to wait at least a month.
 
I was expecting the difference between Trinity and the HD 4000 to be about the same as the difference between Llano and the HD 3000. It generally isn't, and that's the biggest letdown for me. Intel is almost caught up on the graphics side already. AMD's APUs get a nice boost from faster memory, and I believe DDR4 will be coming out around the end of the year. I wonder if AMD will implement DDR4 in their next APU refresh, and if it will make a big difference. They need to do something, or Intel will catch them on their only remaining advantage by next gen.

Intel dedicated a lot more die space to the GPU with Ivy Bridge than they did with Sandy. That isn't something they can continue to do indefinitely, especially if they expect to remain within their set TDP. As is, it looks like Trinity takes less power to run than mobile Ivy Bridge; how bad would it get if they doubled the size of the GPU again? If I'm not mistaken, that's exactly what they did with Ivy Bridge.

Meanwhile, AMD is only using their Northern Islands architecture in Trinity; they have Southern Islands waiting in the wings, and it is not only more powerful but much more efficient as well. And this isn't even counting the fact that we're talking about launch drivers with Trinity; AMD GPUs typically get a nice boost from later drivers.
 
They both still suck at real resolutions?

This test is kinda biased; High for texture/physics/clutter? Seriously.
Still, it proves my point: a $500 gaming laptop is viable, and Trinity will do it.

At 1366x768 you'll probably get 40-50ish FPS with Trinity in D3. Instant sale!

It's only a very small group that buys processors faster than the average. Most folks are happy to settle on the best bundle of 'stuff' they can get for $500.

Exactly my point: a $500 Trinity will probably beat a $500 Ivy Bridge for gaming.

Yes, quite probable. Just a shame that AMD will never advertise the fact themselves.
Folks will just buy Intel Inside instead, as that's what they are told to on TV five times a day by the adverts.

Honestly, all they need to do is put a sticker on every single AMD laptop they sell: "Play every game with the same quality as on a console!"
And work a deal with EA/Blizzard/etc. to put stickers on the games: "Play this game on this laptop!"
 
That is my concern as well. If AMD's top-of-the-line A10 is sometimes better and sometimes worse than the Ivy Bridge HD 4000, how much worse are the A8 and A6 silicon compared against a lower-end Ivy Bridge i5, which still has the same HD 4000 GPU?

Edit: I was also trying to find some Ivy Bridge laptops for a price comparison, but I have failed. Anyone know how much an i5 Ivy Bridge laptop may go for?

The LEAST expensive Ivy Bridge mobile part is currently $378 by itself, and that is NOT the one used in many reviews. At least one review used the top-of-the-line Ivy Bridge, which is currently priced at around $1,100. Yes, that's right, they used a chip which is more expensive than most Trinity laptops are likely to be.

The areas where the HD 4000 is competitive are largely CPU-limited titles, and the lower-spec Ivy Bridges won't look as good there. Also of note: the HD 4000 is NOT the same GPU across SKUs. Again taking the example of the $1,100 3920XM, its GPU turbo is higher than on the already quite pricey ($378) 3720QM, and the 3720QM likely won't be able to hit its turbo speeds as often due to its lower TDP.

That is another area where the comparison is frankly stupid: they're comparing the 35W A10-4600M to 45W and 55W Intel chips. Trinity isn't aiming at the top end; it's aiming for volume, and when you compare it against the current i5 and i3 Sandy Bridge parts that are priced closer to it, you get a much more competitive picture on the CPU side and a complete massacre on the GPU side.
 
The LEAST expensive Ivy Bridge mobile part is currently $378 by itself, and that is NOT the one used in many reviews. At least one review used the top-of-the-line Ivy Bridge, which is currently priced at around $1,100. Yes, that's right, they used a chip which is more expensive than most Trinity laptops are likely to be.

The areas where the HD 4000 is competitive are largely CPU-limited titles, and the lower-spec Ivy Bridges won't look as good there. Also of note: the HD 4000 is NOT the same GPU across SKUs. Again taking the example of the $1,100 3920XM, its GPU turbo is higher than on the already quite pricey ($378) 3720QM, and the 3720QM likely won't be able to hit its turbo speeds as often due to its lower TDP.

That is another area where the comparison is frankly stupid: they're comparing the 35W A10-4600M to 45W and 55W Intel chips. Trinity isn't aiming at the top end; it's aiming for volume, and when you compare it against the current i5 and i3 Sandy Bridge parts that are priced closer to it, you get a much more competitive picture on the CPU side and a complete massacre on the GPU side.

Good post, mate. I noticed that as well. Of course Trinity looks bad vs. a top-of-the-line Intel mobile processor; Trinity is a value processor. You also have to consider that most of the reviews I have seen compare a 35-watt Trinity to an Intel processor with more TDP headroom. It would be interesting to see if AMD plans to introduce higher-TDP mobile chips for "desktop replacement" laptops.
 
I suspect Trinity is getting better web-browsing battery life than Llano because they designed it to run at ultra-low voltage in the lowest P-states, where it sits most of the time.

I am currently testing my Llano's undervolting headroom, tediously testing with Prime95, and I have been able to lower the P6 voltage from 0.9375V to 0.7875V. It looks like it can go even lower, but I haven't gotten there yet.
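(For a ballpark on what that undervolt is worth: within one P-state the clock is fixed, and dynamic CPU power scales roughly with the square of voltage, per the standard CMOS approximation P ≈ C·V²·f. It's an idealization that ignores leakage, so the real saving at the wall will be smaller.)

[CODE]
# Estimate the dynamic-power saving from the P6 undervolt using the
# standard CMOS approximation P_dyn ~ C * V^2 * f. Within one P-state
# the clock and switched capacitance are fixed, so power scales with V^2.
v_stock = 0.9375  # stock P6 voltage (V)
v_under = 0.7875  # undervolted P6 voltage (V)

ratio = (v_under / v_stock) ** 2
print(f"dynamic power at P6: {ratio * 100:.0f}% of stock "
      f"({(1 - ratio) * 100:.0f}% reduction)")  # ~71% of stock, ~29% off
[/CODE]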
 
I wonder if the 32nm process has improved some, because that A10-5800K is really tempting.

I hope the evolution from 'Bulldozer' to 'Piledriver' brings Trinity to 10-15% better than Llano per clock.

This is very unlikely. Llano has 4-7% better IPC than Phenom II, and Phenom II has much better IPC than Bulldozer. I think we will see 5-10% better IPC than Bulldozer, but I don't expect it to surpass Phenom II's IPC. It makes up for this with higher clock speeds, though.
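(To see why, chain the per-clock ratios. The Bulldozer-vs-Phenom II deficit below is an assumed illustrative figure, since "much better" isn't quantified; the 4-7% and 5-10% ranges are from the post, taken at their upper ends.)

[CODE]
# Chain relative IPC figures to see where Piledriver could land vs. Llano.
# Phenom II IPC is normalized to 1.0. The ~10% Bulldozer deficit is an
# assumption for illustration only.
phenom_ii = 1.00
llano = phenom_ii * 1.07        # Llano: up to 7% above Phenom II
bulldozer = phenom_ii * 0.90    # assumption: BD ~10% below Phenom II
piledriver = bulldozer * 1.10   # Piledriver: up to 10% above Bulldozer

print(f"Piledriver vs. Llano per clock: {piledriver / llano * 100:.0f}%")
# ~93% -- still below Llano, so 10-15% above Llano looks out of reach
[/CODE]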
 
This is very unlikely. Llano has 4-7% better IPC than Phenom II, and Phenom II has much better IPC than Bulldozer. I think we will see 5-10% better IPC than Bulldozer, but I don't expect it to surpass Phenom II's IPC. It makes up for this with higher clock speeds, though.

You do know that post of mine was a month ago, right?

Looking at how the A10-4600M performs at 35W, a six-core @ 3.0-3.6GHz with about 384-512 shaders could fit into a 95-125W TDP nicely. I know it wouldn't set the world on fire, but it would make for a pretty good entry-level desktop.

If GF had the capacity at 28nm, I think AMD should try shrinking Trinity first.
 
I suspect Trinity is getting better web-browsing battery life than Llano because they designed it to run at ultra-low voltage in the lowest P-states, where it sits most of the time.

I am currently testing my Llano's undervolting headroom, tediously testing with Prime95, and I have been able to lower the P6 voltage from 0.9375V to 0.7875V. It looks like it can go even lower, but I haven't gotten there yet.

Llanos were notoriously overvolted at stock clocks and P-states. You could potentially be correct in saying that's why Trinity looks to have much better perf-per-watt, but we also don't know how well Trinity can underclock and overclock, because the BIOS and drivers are immature and there is no K10stat-like utility for the architecture. If you consider the massive turbo boosts and the low power consumption at full load -- it's actually less than SB -- you've got to figure Trinity can probably overclock as well as Llano, and they definitely improved overall perf-per-watt, because Llano couldn't get close to SB in that respect.

If GF had the capacity at 28nm, I think AMD should try shrinking Trinity first.

They probably do, but Kaveri is probably near completion and they should be pretty close to making the masks for the chips as well. Kaveri, with GCN, should outperform VLIW4 in every single way imaginable, particularly in GPU compute, which is where AMD seems to be focusing the most. This makes it a difficult choice for me. Should I buy Trinity? Wait for an IB i3? Or will GloFo get 28nm right the first time, and will Kaveri come out on time? It's always tempting to wait it out when you know what's coming just around the corner, but if I can find a Trinity laptop with a 1080p screen for ~$700, then I'm buying =P
 
... They probably do, but Kaveri is probably near completion and they should be pretty close to making the masks for the chips as well. Kaveri, with GCN, should outperform VLIW4 in every single way imaginable, particularly in GPU compute, which is where AMD seems to be focusing the most.

On that note, I wonder if AMD will be able to throw 512 GCN shaders into Kaveri. Take the 7750, which has ~1.5B transistors (512 shaders / 16 ROPs / 32 TMUs) at a die size of 123mm². If AMD cut the ROPs to 4 and the TMUs to 8, and took out the memory controllers, think it could fit alongside two Steamroller modules?
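(Back-of-the-envelope, scaling from the 7750's density. Everything beyond the 1.5B-transistor / 123mm² starting point is a guess for illustration; AMD doesn't publish per-block transistor breakdowns.)

[CODE]
# Rough area estimate for a cut-down 512-shader GCN block on 28nm,
# scaled from Cape Verde (HD 7750): ~1.5B transistors in ~123 mm^2.
# The per-block die shares below are pure guesses for illustration.
total_transistors = 1.5e9
die_area_mm2 = 123.0
density = total_transistors / die_area_mm2  # ~12.2M transistors per mm^2

cut_rops   = 0.75 * 0.05  # drop 12 of 16 ROPs; ROPs assumed ~5% of die
cut_tmus   = 0.75 * 0.08  # drop 24 of 32 TMUs; TMUs assumed ~8% of die
cut_memctl = 0.10         # GDDR5 PHY/controllers assumed ~10% of die

remaining = total_transistors * (1 - cut_rops - cut_tmus - cut_memctl)
print(f"estimated GPU block: {remaining / density:.0f} mm^2 on 28nm")
# ~99 mm^2 -- tight but conceivable next to two Steamroller modules
[/CODE]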
 