AMD R9 "Fury X2" Specifications

Megalith

WCCF Tech has provided the presumed specifications for what should be AMD’s fastest graphics card yet.

The R9 Fury X2 will be the fourth and last Fiji-based graphics card from AMD this year. The card will feature two full-fledged Fiji XT GPUs with 4GB of stacked High Bandwidth Memory each, for a total of 8GB of memory and a combined memory bandwidth of one terabyte per second.
 
Oh cool. Now, when will the green camp unveil their dual GPU card?
 
The 4GB per GPU can kindly screw off. I'd want at least 8GB per. Especially with something that has this kind of processing power.
 
Big no thanks from me. I prefer AMD, but if I were to spend the kind of money they will be asking, I would just purchase a Titan X.
 
IMHO any video card that relies on Crossfire in order to reach its full potential is a non-starter.

If you have separate cards in Crossfire and you are unhappy, at least you can split them up and sell them individually. With a single card, there are too many compromises.

Same goes for SLI, just to a lesser extent, as SLI sucks slightly less than Crossfire.
 
Zarathustra[H];1041942929 said:
IMHO any video card that relies on Crossfire in order to reach its full potential is a non-starter.

If you have separate cards in Crossfire and you are unhappy, at least you can split them up and sell them individually. With a single card, there are too many compromises.

Same goes for SLI, just to a lesser extent, as SLI sucks slightly less than Crossfire.

I wholeheartedly agree with Zarathustra. That's been my experience.
 
The 4GB per GPU can kindly screw off. I'd want at least 8GB per. Especially with something that has this kind of processing power.

Well, they just slapped two Fijis on a board, so that's why it's 4GB. I reckon the GTX 990 will be 2 x 6GB since it'll be two Maxwells doing the same.
 
What empty promises will AMD make with this card? THE ETHICAL CHOICE!111!!!1!!!!
 
Zarathustra[H];1041942929 said:
IMHO any video card that relies on Crossfire in order to reach its full potential is a non-starter.

If you have separate cards in Crossfire and you are unhappy, at least you can split them up and sell them individually. With a single card, there are too many compromises.

Same goes for SLI, just to a lesser extent, as SLI sucks slightly less than Crossfire.

Meh, it's a limited production niche product. Saving a slot may be worth it for some people. Crossfire scaling also seems to be less of an issue with this generation.
 
Doesn't matter how fast the memory is if there isn't enough of it. A card designed for 4k should have more.
 
Meh, it's a limited production niche product. Saving a slot may be worth it for some people. Crossfire scaling also seems to be less of an issue with this generation.

Scaling is not my biggest concern.

Lack of or delayed game profiles, instability, bugs, crashes, plain old failures to run, stutter, etc. combine to make Crossfire of limited value to most users.
 
Zarathustra[H];1041942964 said:
Scaling is not my biggest concern.

Lack of or delayed game profiles, instability, bugs, crashes, plain old failures to run, stutter, etc. combine to make Crossfire of limited value to most users.

Seems to work just fine with my 290x's... I honestly haven't had any issues at all. Maybe I just play the wrong games or something.

Given that multi-gpu configurations are a small, high-margin niche it's not surprising that there's a lot of FUD coming from both sides. I tune it out just like I ignore the general pissing matches about who has better drivers.
 
Card is looking darn good, ...have .. to resist .. until .. next .. gen...

Hmm, really not sure if I'll be able to hold out for this one...
 
Seems to work just fine with my 290x's... I honestly haven't had any issues at all. Maybe I just play the wrong games or something.

Given that multi-gpu configurations are a small, high-margin niche it's not surprising that there's a lot of FUD coming from both sides. I tune it out just like I ignore the general pissing matches about who has better drivers.

Well, to be honest, I haven't had anything crossfire for a while.

Back when the 6970's were new, I had two of them to try to keep up with my then brand-new 1440p monitor, and it was an absolute disaster. It took almost two years after launch for Red Orchestra 2 to get Crossfire profiles, and even today it doesn't work right (people have told me), with frequent valleys in framerate, stutter, and random issues.

And it's not just this title. I guess you don't play a lot of newer titles, as AMD is notoriously bad at getting profiles out on any kind of timeline. You search around these forums and you'll find plenty of people complaining that big 2015 titles still don't have profiles several months after launch.

Personally I am running dual 980ti's right now. The experience has been MUCH better compared to my dual 6970 experience of a few years ago, but it's still a worse experience than running a single more powerful GPU. The moment a single GPU comes out that can handle the games I am interested in playing at 4k without ever dropping below 60fps, I'll buy it on launch day and say goodbye to multi-GPU forever (or at least until 8k comes along :p )

To me, provided you have enough power to handle the resolution and settings, anything single GPU >> anything SLI > anything crossfire.

Going multi-GPU usually only happens once every 5 or so years for me, when I upgrade my monitor, and nothing single GPU on the market can handle the new resolution, and then I only do it very reluctantly, as the experience just sucks compared to single GPU.
 
Zarathustra[H];1041942929 said:
If you have separate cards in Crossfire and you are unhappy, at least you can split them up and sell them individually. With a single card, there are too many compromises.
Two video cards on one compact board has a painfully obvious advantage to anyone pretending to be objective: it's a hell of a lot more compact.

The Fury in particular is an unusually tiny card for its performance envelope, so if they can make the Fury X2 a compact dual-GPU card with a single AIO water-cooling line to it, you could fit it in a Micro-ATX ultimate gaming HTPC build.

Multiple cards, by contrast, mean more cables, more slots occupied, and the need for a bigger case and motherboard.
 
Two video cards on one compact board has a painfully obvious advantage to anyone pretending to be objective: it's a hell of a lot more compact.

The Fury in particular is an unusually tiny card for its performance envelope, so if they can make the Fury X2 a compact dual-GPU card with a single AIO water-cooling line to it, you could fit it in a Micro-ATX ultimate gaming HTPC build.

Multiple cards, by contrast, mean more cables, more slots occupied, and the need for a bigger case and motherboard.

True, this is an advantage for the niche group of SFF builders, but they make up - what - 0.1% of the PC enthusiast market, which in turn makes up 0.1% of PC users? :p
 
Zarathustra[H];1041943148 said:
True, this is an advantage for the niche group of SFF builders, but they make up - what - 0.1% of the PC enthusiast market, which in turn makes up 0.1% of PC users? :p
Most people I talk to these days are building mATX and mini-ITX systems... in fact, I think ATX fans are in the minority now.

In fact, looking at Amazon best sellers for cases and motherboards, I'd say ATX are less than 50%.
 
Most people I talk to these days are building mATX and mini-ITX systems... in fact, I think ATX fans are in the minority now.

In fact, looking at Amazon best sellers for cases and motherboards, I'd say ATX are less than 50%.

I am a big fan of larger cases... more room, more available slots, and more room for the fans and cooler.

That being said, for my next build I could probably get away with a smaller board that only has 2 PCIe slots, as long as the primary slot allows enough space for a 2-slot video card and has enough RAM slots to, say, run quad channel, if that is even a thing by the time I do my next build.
 
I'm glad it doesn't support 32-bit OSes. Hopefully this will force people who are still using 32-bit to start switching over to 64-bit, and then when I go to download all my chipset drivers I won't have to choose.
 
I am a big fan of larger cases... more room, more available slots, and more room for the fans and cooler.

That being said, for my next build I could probably get away with a smaller board that only has 2 PCIe slots, as long as the primary slot allows enough space for a 2-slot video card and has enough RAM slots to, say, run quad channel, if that is even a thing by the time I do my next build.
But here's the question... why do you need more fans and more slots if you can get the same performance in a small form factor?

For example, if the Fury X2 can be cooled by a single 120mm radiator, and there's no reason to believe it couldn't, then you could cool even a very aggressive modern i7 with another 120mm radiator and fit both in a mini-ITX case that has a 240mm radiator mount. There's something inherently cool about getting tons of performance out of something not much bigger than a PS4, and it's ideal for an HTPC.

Now, my old excuse for a big case was that I would have sound cards and TV tuners and tons of hard drives, but who the heck doesn't use network-attached storage in 2015? And an SSD? Heck, they're so small that many fit right on the motherboard now.

Big cases are going the way of the dodo IMO, as miniaturization is rendering them obsolete even for gamers. And for non-gamers/light gamers, setups like my tiny Alienware Alpha are great miniature options.
 
Hmm, my last real full-tower build must date back over 15 years or so now. No incentive to make one of those behemoths again; I like the small builds now.
 
They're not just going to go all in and call it the R9 Inferno? All kidding aside, this is going to be a neat product: not a good value for money, but neat.
 
But here's the question... why do you need more fans and more slots if you can get the same performance in a small form factor?

For example, if the Fury X2 can be cooled by a single 120mm radiator, and there's no reason to believe it couldn't, then you could cool even a very aggressive modern i7 with another 120mm radiator and fit both in a mini-ITX case that has a 240mm radiator mount. There's something inherently cool about getting tons of performance out of something not much bigger than a PS4, and it's ideal for an HTPC.

Now, my old excuse for a big case was that I would have sound cards and TV tuners and tons of hard drives, but who the heck doesn't use network-attached storage in 2015? And an SSD? Heck, they're so small that many fit right on the motherboard now.

Big cases are going the way of the dodo IMO, as miniaturization is rendering them obsolete even for gamers. And for non-gamers/light gamers, setups like my tiny Alienware Alpha are great miniature options.

I disagree.

I think it is a phase. People are building SFF systems, and then a year or two from now are going to need to stick just one more expansion card in there, and not be able to, and then wish they had just gone with an ATX (or bigger) system.

I learned this lesson in 2010. I was an early adopter of the SFF mindset when I built my i7-920 around one of those Shuttle custom form factor SX58H7 case/motherboard combos.

I had a very overclockable stepping of the 920, which I could never really overclock due to thermal limitations. When I tried to upgrade to a GTX580, the custom form factor PSU couldn't handle it. When I wanted to add a second GPU, or other expansion cards I couldn't.

Now I know many of these issues have been addressed in more modern small form factor systems (like how they now take regular form factor PSUs), but still, after two years of using that thing I was so pissed off at it that I got a real ATX system and unloaded the Shuttle case here on the forums.

My original thought process had been that it would be sleek and compact, and take up less space, but instead I found the opposite to be true. The SFF case had to go on my desk, whereas I've always had my towers under my desk, so now I had less desk space, and the fans were much closer to my ears and thus more of an annoyance than having them hidden away under the desk. That, and all the same parts were much more expensive in smaller form.

(The price premium of SFF has since dropped, but back in 2009 when I got the Shuttle case, the barebones system, including an integrated case, motherboard, and PSU but nothing else, was $650!) An ATX motherboard and tower case would have been a fraction of that.

Simply not worth it IMHO.

To me, these days, bigger is better. ATX is too small. I demand EATX or CEB and as many PCIe lanes and slots as possible!

Right now I have a CEB board (ASUS P9X79 WS) and I have filled every single slot I can.

980ti -> Intel SSD 750 PCIe -> Sound Blaster X-Fi Titanium -> 980ti -> (slot covered by second GPU) -> Brocade 10gig ethernet, for dedicated line to my NAS server.

I have this GTX460 768MB I am not using that I'd like to throw in as a dedicated PhysX card, but I can't because I don't have enough slots. Wish there were more! I'd use all 8 slots in the back of my full tower if I could! And I'm lucky with this board as I have onboard Intel NICs. If those were Realteks I'd be tempted to try to get a discrete NIC in there too!

So, I guess my feeling is that this Mini-ITX craze is a temporary fad. People think it is awesome (and it can be) until you really want to do something, but lack the expansion, and then it is not.

I tried many workarounds to make the SFF case work for me (including one of those god-awful drive bay PSUs that famously failed the [H] test in spectacular fashion) before finally throwing in the towel and concluding that I'd rather have all the expansion capability I can get than ever have to deal with feeling limited by my form factor choice again.

God I hated my SFF experience. It's making me frustrated just typing about it...
 
Zarathustra[H];1041945815 said:
I learned this lesson in 2010. I was an early adopter of the SFF mindset when I built my i7-920 around one of those Shuttle custom form factor SX58H7 case/motherboard combos.

Correction:

Order confirmation email tells me it was June 2009.
 
Who's even clueless enough to buy two graphics cards now that DX12 pushes responsibility for implementing multi-GPU support onto game developers? There's a very small number of people who own desktops, and an even smaller number of them put money into more than one graphics card, so a game developer that can't even manage to release their product on time, under budget, and bug-free isn't gonna spend the extra money to build multi-GPU support into a modern game for the very few sales it'll land them.

Furthermore, even one of any high end GPU is a total waste when you can just buy a lower resolution screen and get a very good native resolution gaming experience while spending like less than $100 on a graphics card that doesn't need an overpriced PSU to feed it while it generates tons of heat. With even one being an utter waste, two are the domain of the terminally clueless so yeah, there's no reason at all for this video card to exist, much less generate sales. AMD's people are like crazy-cakes for not throwing absolutely everything at developing iGPUs and making HBM work for the CPU first before wasting it on such a dumb, high-wattage halo product that nobody would buy.
 
Who's even clueless enough to buy two graphics cards now that DX12 pushes responsibility for implementing multi-GPU support onto game developers? There's a very small number of people who own desktops, and an even smaller number of them put money into more than one graphics card, so a game developer that can't even manage to release their product on time, under budget, and bug-free isn't gonna spend the extra money to build multi-GPU support into a modern game for the very few sales it'll land them.

Partially agree. I think this ignores, however, that very few developers are writing their games from the ground up these days. Most use an established game engine. It seems to me that this type of multi-GPU logic would likely fall more on the engine side (I understand Unreal Engine 4 has a really cool DX12 multi-adapter implementation, and as such everyone who uses it will get it too).

Time will tell though. Once we have some DX12 titles out there I think we will know better.

If I had to guess, my theory would be that DX12 multi-GPU will be highly biased towards single discrete GPU + IGP type loads, as most builds these days have an IGP, and less so towards multiple powerful GPUs. As I said before though, time will tell. I'd imagine we'll see some titles that take it very seriously and optimize for it, and it turns out better than current SLI/Crossfire, and many titles with broken or no support at all.

As we know, new DX revisions take some time to work their way into new titles, so even if you are right and DX12 multi-GPU winds up being a huge clusterfuck, the market won't have moved entirely to DX12 for some time, and they will still work fine in pre-DX12 titles.

I don't know about you, but most of my game time is still spent in titles released in 2011. I haven't even bought a game in the last 2 years. I got Batman Arkham Knight for free with GPU purchase, but I haven't played it (less because of bugs, and more because it's not my type of game). Actually, I take the two year thing back. I DID buy Verdun a few months back. Only played it twice thus far though. The movement feels odd.

That's not to say that SLI/Crossfire is perfect today (it certainly isn't)...
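For anyone wondering what "explicit multi-adapter" actually means in practice, here's a rough sketch (purely illustrative on my part, not anything from the article, and using nothing beyond standard DXGI/D3D12 calls): under DX12 the application or engine enumerates every GPU itself and decides how to split the work, instead of the driver pretending there's one GPU behind a Crossfire/SLI profile.

Code:
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    // Create a DXGI factory and walk every adapter in the system.
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    std::vector<ComPtr<IDXGIAdapter1>> gpus;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer
        wprintf(L"GPU %u: %s (%zu MB dedicated VRAM)\n",
                i, desc.Description, desc.DedicatedVideoMemory >> 20);
        gpus.push_back(adapter);
    }

    // An explicit multi-adapter engine would now create one ID3D12Device per
    // adapter and decide for itself how to distribute frames or render passes,
    // e.g. AFR across two Fiji GPUs, or offloading post-processing to the IGP.
    std::vector<ComPtr<ID3D12Device>> devices(gpus.size());
    for (size_t i = 0; i < gpus.size(); ++i)
        D3D12CreateDevice(gpus[i].Get(), D3D_FEATURE_LEVEL_11_0,
                          IID_PPV_ARGS(&devices[i]));
    return 0;
}

(Link against dxgi.lib and d3d12.lib.) The point being: none of that load balancing comes from the driver anymore, so it's on the engine - UE4, Frostbite, whoever - to make a dual card like the Fury X2 worth anything in a DX12 title.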

Furthermore, even one of any high end GPU is a total waste when you can just buy a lower resolution screen and get a very good native resolution gaming experience while spending like less than $100 on a graphics card that doesn't need an overpriced PSU to feed it while it generates tons of heat. With even one being an utter waste, two are the domain of the terminally clueless so yeah, there's no reason at all for this video card to exist, much less generate sales. AMD's people are like crazy-cakes for not throwing absolutely everything at developing iGPUs and making HBM work for the CPU first before wasting it on such a dumb, high-wattage halo product that nobody would buy.

Well, now we are getting more towards individual preference.

I for one find 4k resolution to be a HUGE improvement, and wouldn't want to go back.

I understand what you are saying about resolution and GPU demands, though. My last 4 monitor upgrades have all caused my GPU to feel anemic:

2001 - Iiyama Visionmaster Pro 510, 22" CRT (I ran it at 1600x1200)
2005 - Dell 2405FPW (1920x1200)
2010 - Dell U3011 (2560x1600)
2015 - Samsung JS9000 (3840x2160)

In 2001 I saved up for a GeForce 3 to solve that problem.

In 2005 my GeForce 6800GT just couldn't cope with 1920x1200, and scaling was ugly. I actually just kind of lost interest in games - in part because of this, in part because of other things going on in my life - until 2009 when I started to get into it again.

In 2010 this sparked a manic GPU upgrade frenzy. I had a highly overclocked GTX470 at the time, but it was in a SFF case, so I couldn't add another in SLI. I tried to go GTX580 when it launched, but the teeny power supply just couldn't keep up, and I sold it. Then I got rid of the SFF case, built a new machine end of 2011 around my current 3930k, which I equipped with dual Radeon 6970's in Crossfire, to try to keep up with the 1600p resolution. This was a nightmare. Nothing worked right in crossfire, at least not then, with the titles I was playing. As soon as the Radeon 7970 launched, I bought one just so I could ditch crossfire (god was it awful).

In an attempt to ghetto-mount a CPU AIO cooler to be able to overclock it more and get to good framerates at 1600p, I slipped with a screwdriver and cut a trace on that video card, wrecking it. I saved up for its replacement, a 2GB GTX680, which overclocked was OK, but disappointing at 1600p. It wasn't until early 2013, when I got my Titan, that I FINALLY got framerates I was happy with in most titles at 1600p.

And then I did it again in June of this year. Upped the resolution. I swore I'd never do multi-GPU again, but here I am with two 980ti's. Sure, the SLI experience is much better than my crossfire experience, but I am still hoping I can get away with just a single top end card next generation, and get rid of these two.
 
Zarathustra[H];1041947999 said:
Partially agree. I think this ignores, however, that very few developers are writing their games from the ground up these days. Most use an established game engine. It seems to me that this type of multi-GPU logic would likely fall more on the engine side (I understand Unreal Engine 4 has a really cool DX12 multi-adapter implementation, and as such everyone who uses it will get it too).

Time will tell though. Once we have some DX12 titles out there I think we will know better.

If I had to guess, my theory would be that DX12 multi-GPU will be highly biased towards single discrete GPU + IGP type loads, as most builds these days have an IGP, and less so towards multiple powerful GPUs. As I said before though, time will tell. I'd imagine we'll see some titles that take it very seriously and optimize for it, and it turns out better than current SLI/Crossfire, and many titles with broken or no support at all.

As we know, new DX revisions take some time to work their way into new titles, so even if you are right and DX12 multi-GPU winds up being a huge clusterfuck, the market won't have moved entirely to DX12 for some time, and they will still work fine in pre-DX12 titles.

I don't know about you, but most of my game time is still spent in titles released in 2011. I haven't even bought a game in the last 2 years. I got Batman Arkham Knight for free with GPU purchase, but I haven't played it (less because of bugs, and more because it's not my type of game). Actually, I take the two year thing back. I DID buy Verdun a few months back. Only played it twice thus far though. The movement feels odd.

That's not to say that SLI/Crossfire is perfect today (it certainly isn't)...



Well, now we are getting more towards individual preference.

I for one find 4k resolution to be a HUGE improvement, and wouldn't want to go back.

I understand what you are saying about resolution and GPU demands, though. My last 4 monitor upgrades have all caused my GPU to feel anemic:

2001 - Iiyama Visionmaster Pro 510, 22" CRT (I ran it at 1600x1200)
2005 - Dell 2405FPW (1920x1200)
2010 - Dell U3011 (2560x1600)
2015 - Samsung JS9000 (3840x2160)

In 2001 I saved up for a GeForce 3 to solve that problem.

In 2005 my GeForce 6800GT just couldn't cope with 1920x1200, and scaling was ugly. I actually just kind of lost interest in games - in part because of this, in part because of other things going on in my life - until 2009 when I started to get into it again.

In 2010 this sparked a manic GPU upgrade frenzy. I had a highly overclocked GTX470 at the time, but it was in a SFF case, so I couldn't add another in SLI. I tried to go GTX580 when it launched, but the teeny power supply just couldn't keep up, and I sold it. Then I got rid of the SFF case, built a new machine end of 2011 around my current 3930k, which I equipped with dual Radeon 6970's in Crossfire, to try to keep up with the 1600p resolution. This was a nightmare. Nothing worked right in crossfire, at least not then, with the titles I was playing. As soon as the Radeon 7970 launched, I bought one just so I could ditch crossfire (god was it awful).

In an attempt to ghetto-mount a CPU AIO cooler to be able to overclock it more and get to good framerates at 1600p, I slipped with a screwdriver and cut a trace on that video card, wrecking it. I saved up for its replacement, a 2GB GTX680, which overclocked was OK, but disappointing at 1600p. It wasn't until early 2013, when I got my Titan, that I FINALLY got framerates I was happy with in most titles at 1600p.

And then I did it again in June of this year. Upped the resolution. I swore I'd never do multi-GPU again, but here I am with two 980ti's. Sure, the SLI experience is much better than my crossfire experience, but I am still hoping I can get away with just a single top end card next generation, and get rid of these two.

Well, I totally see the use case for combining Intel (or AMD) integrated graphics and a discrete GPU under DX12 if there's some sorta actual performance benefit to doing so. It makes sense, and a lot of people that are otherwise miffed with the idea of CPU die space being spent on graphics that they'll never use might feel a little better about the world if it works. It's a common enough scenario, I guess.

As for gaming at high resolutions, I personally don't see the value. My preferred resolution is 1366x768, and even a really slow and inexpensive GPU can push very high settings like that without chasing after SLI or even mid-range graphics and a power supply to support them. I guess there might be a small advantage to having a higher resolution screen, but as long as you're also not putting down a lot of unnecessary cash on something larger than 15 inches, you pretty much don't have to worry about the resolution taking away from the experience. Your whole gaming computer can be a used small form factor business desktop from a couple of years ago with a RAM upgrade and a cheap low-profile GPU like a GeForce GT 730, which frees up a lot of money for far more important things like nice clothing, toys for your cat, and a high end camera for artistic photography projects.
 
Well, I totally see the use case for combining Intel (or AMD) integrated graphics and a discrete GPU under DX12 if there's some sorta actual performance benefit to doing so. It makes sense, and a lot of people that are otherwise miffed with the idea of CPU die space being spent on graphics that they'll never use might feel a little better about the world if it works. It's a common enough scenario, I guess.

As for gaming at high resolutions, I personally don't see the value. My preferred resolution is 1366x768, and even a really slow and inexpensive GPU can push very high settings like that without chasing after SLI or even mid-range graphics and a power supply to support them. I guess there might be a small advantage to having a higher resolution screen, but as long as you're also not putting down a lot of unnecessary cash on something larger than 15 inches, you pretty much don't have to worry about the resolution taking away from the experience. Your whole gaming computer can be a used small form factor business desktop from a couple of years ago with a RAM upgrade and a cheap low-profile GPU like a GeForce GT 730, which frees up a lot of money for far more important things like nice clothing, toys for your cat, and a high end camera for artistic photography projects.

True, but then you miss out on this:



And it's not just for fun and games. I can have 8 full pages on screen legibly at the same time in Word, and work on HUGE spreadsheets without feeling like I am constantly scrolling around and losing my place.
 
Zarathustra[H];1041948041 said:
True, but then you miss out on this:

And it's not just for fun and games. I can have 8 full pages on screen legibly at the same time in Word, and work on HUGE spreadsheets without feeling like I am constantly scrolling around and losing my place.

ZOMG that's not a netbook at all! I'd be mortified to have something like that around. :eek: Screens are really, I think, too big at over 12 inches. I'm kinda not comfortable when I'm writing on something other than my little old netbook, but I don't really do the spreadsheet thing at home. Most of my computing is just typing into a word processor to poke out a novel-type thing, reading something, or troll-errrrrr conversing with other people constructively online.
 
ZOMG that's not a netbook at all! I'd be mortified to have something like that around. :eek: Screens are really, I think, too big at over 12 inches. I'm kinda not comfortable when I'm writing on something other than my little old netbook, but I don't really do the spreadsheet thing at home. Most of my computing is just typing into a word processor to poke out a novel-type thing, reading something, or troll-errrrrr conversing with other people constructively online.

Yeah, it's 20" 1600x1200 (portrait) - 48" 3840x2160 (landscape) - 20" 1600x1200 (landscape).

I have to admit, the 48" took some time to get used to, but then again, so did my 22" CRT back in 2001, my 24" 1920x1200 in 2005, and my 30" 2560x1600 in 2010 (which I am currently using at work, again for spreadsheets. It makes FMEAs so much more bearable).

I know the term "immersive" gets thrown around way too much in game hardware marketing, but sitting 2.5ft from a 48" 4k screen and getting everything on the edges as peripheral vision, REALLY has a way of sucking you in.

IMHO, the 48" is slightly too large for this purpose though. If I had my druthers, a slightly smaller screen (42" maybe?) with the same resolution and slightly higher pixel density would have been perfect, but Samsung doesn't make the JS9000 in that size :p (48" is the smallest JS9000)
 
When [H] directly links to WCCF bullshit themselves, I know everything's fucked.
 
My wallet is ready. At least I know this is going to last longer than my old Titan Z.
 
Zarathustra[H];1041948041 said:
True, but then you miss out on this:



And it's not just for fun and games. I can have 8 full pages on screen legibly at the same time in Word, and work on HUGE spreadsheets without feeling like I am constantly scrolling around and losing my place.
DYammm dude... >_>
 