See first post for imgur link.
And for reference, the keyboard still works great in 2014. I've been using it 8+ hours per day at work. The only other modification I did was to superglue the spacebar to the Cherry switches. It kept popping out.
I'll try and remember to fix them tonight.
I should still have them lying around.
If you only added wires then you are correct, you have to cut some of the traces. I'm at work and don't have the pictures. But I'll re-up them tonight.
Played 5 mins this morning before work.
Consisted of character taking a poop then being able to inventory the poop for later.
I think I'll enjoy this game.
There was no "no, I don't use start menus" option, so I voted "no opinion".
I do everything with the Run command or the Start menu quick search (which works exactly the same in Windows 8 as in Windows 7: press the Start key, type, hit Enter).
Yeah, all low all the time. That's what I've been running the last 15 years lol.
And Mantle has almost doubled my FPS because of that. It's very impressive overall.
Yesterday I had 2-3 crashes, but today all was fine.
I do get 1 major hitch once in a while. I'd estimate 2x per round...
BF4 @ 640x352:
DX11.1: ~190 FPS
Mantle: ~380 FPS
I'm never going to play at this resolution, but it's exactly what Mantle is for, I guess: removing bottlenecks created by DX API calls.
GPU frame time was down to 1.1ms (room for roughly 900 FPS on the GPU side), so the CPU was holding it back.
Looks to be perfect for my config/setup. I run minimum graphics all the time, and from the quick test I ran, FPS is a lot more stable overall. Fewer weird dips.
I'm an oddball. The first thing I do in any game is lower the graphics as much as possible, something I can't do on a console. What makes a game enjoyable to me is a high refresh rate, minimal lag, and as few of the annoying graphical effects that make the game hard to play as possible.
I'm hoping an LCD comes out that supports the 177Hz some articles are quoting as the maximum G-Sync boards can handle.
I'd be happy even if it's 720p@177Hz. Hell, I'd be happy with 800x600@300Hz.
I'll try and find what I used. From what I remember, it was just editing an XML file that CCC uses to set clocks.
Edit: This seems like it should work.
Edit2: This should give you a good idea of what people were/are setting as 2D clocks.
It's all about memory clocks.
1440p and 1080p@120Hz LCDs running at idle memory clocks can cause issues on the desktop for some AMD cards.
Just bump up the idle memory clocks using one of the various methods (XML edit, Afterburner, etc.).
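For anyone curious what the XML method actually involves, here's a rough sketch from memory. It assumes CCC keeps its clock targets in Profiles.xml under %LOCALAPPDATA%\ATI\ACE\, with MemoryClockTarget features whose Want_* properties are stored in 10 kHz units; the exact feature/property names can differ by driver version, so check your own file (and back it up) before running anything like this. The idea is simply to set the idle/2D memory state to the same value as 3D so the card never downclocks on the desktop.

# Sketch only: assumed file location and element names, verify against your own Profiles.xml.
import os
import shutil
import xml.etree.ElementTree as ET

PROFILES = os.path.expandvars(r"%LOCALAPPDATA%\ATI\ACE\Profiles.xml")
NEW_MEM_CLOCK_MHZ = 1375                   # hypothetical target, e.g. the card's stock 3D memory clock
NEW_VALUE = str(NEW_MEM_CLOCK_MHZ * 100)   # assumption: CCC stores clocks in units of 10 kHz

shutil.copy(PROFILES, PROFILES + ".bak")   # keep a backup in case the edit misbehaves

tree = ET.parse(PROFILES)
for feature in tree.iter("Feature"):
    # Assumption: memory clock states live under Feature name="MemoryClockTarget_<n>".
    if feature.get("name", "").startswith("MemoryClockTarget"):
        for prop in feature.iter("Property"):
            # Want_0 is the idle/2D state; setting all Want_* states equal stops the downclock.
            if prop.get("name", "").startswith("Want"):
                prop.set("value", NEW_VALUE)

tree.write(PROFILES)
print("Memory clock targets set to", NEW_MEM_CLOCK_MHZ, "MHz; restart CCC to apply.")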
It's impossible to read on my 1440p overclocked IPS.
Impossible to read on 144Hz TN.
Fully readable on 120Hz LB.
So G-Sync, which includes variable-frequency LightBoost, should be pretty good at this test.
I'm on the fence about getting this kit.
Let's say I get it: does this turn the LCD into an Nvidia-only LCD, or can it still be used with AMD cards?
From the videos I've seen, it basically becomes an "Nvidia display".
(To clarify, I use an Nvidia card once in a while to test things.)
Pretty much what I do now:
Relaxing puzzle/RPG games I play on the IPS@100Hz. Insanely blurry image, but they're not fast-paced games.
Anything racing/FPS ends up on the 144Hz LCD.
I've lent 120Hz LCDs to a couple of people. After 1-2 hours at 120Hz I'd get them to go back to 60Hz. Their reactions were pretty much "this isn't 60Hz, what did you do to my 60Hz?" lol.
Then they proceeded to purchase 120Hz+ LCDs. (Well, one of them just never gave me back my LCD.)
I want more Hz. The higher the better. The rest is gravy that I'll take if it's free, but I won't go paying extra for LB/G-Sync/FreeSync.
But if a FreeSync/G-Sync monitor ends up having a higher refresh rate than I currently have, I'll probably buy it.
All AMD did was demo it on hardware that already implemented the optional parts of the DisplayPort standard. Then they went and said they hope this gets manufacturers interested, so that in the future all displays will implement this optional part of the standard.
I don't think Nvidia are idiots...
I've read that 10 times, and nowhere does it say a controller is required.
(I've broken it down for you in a previous post.)
Edit for clarity: Nowhere does it say "an additional controller" is required.
You should show us your source again, the one that states a special control board is required.
Because all I've seen points to an optional part of the VESA standard, so it's already in the standard, just optional. (You skipped over my previous post asking which control board.)
What you're quoting in no way says you need an additional board.
All it says is that it would require monitor manufacturers to implement DP1.3 + "Optional Specs" (which they seem to imply are part of the DP1.3 spec, but I can't confirm that).
It just says the controller on the display (which...
I don't know who everyone is, but the first article I read was Anand's, and it made it clear to me that AMD was not marketing this anytime soon. It was also very clear that only supported hardware could do this. Read the article linked in the OP.
I was going to post this in the other thread in response to Elledan's post, but that thread was closed and now points here.
He said:
From what I've seen, G-Sync doesn't work with existing hardware. The "upgrade kit" basically consists of gutting the entire controls out of the Asus LCD. G-Sync...
G-Sync had me interested when it was announced, i.e. if I could have purchased their module when it was announced, then I would have gotten it. As it stands now, I don't see any reason to get G-Sync until things settle down.
I'm still not convinced someone running at 120/144Hz needs...
I usually go back and forth between 120Hz LB and 144Hz. Overall I like the smoothness of 144Hz over the clarity of 120Hz LB.
But it's definitely a per-user thing.