Trying to adapt to a higher DPI, should I be using a lower sensitivity in game?

cyberslag5k

So I've been using simple Logitech 3-button optical mice forever, and they're set to a pretty low DPI; I think it's something like 400. My last one failed, and I decided it was time to upgrade, so I went with the Razer DeathAdder. I absolutely love the thing. It's super lightweight, and it's like there's zero friction at all with my Vespula. It's almost too slick: clicking one of the side buttons usually moves the mouse, which is a little problematic.

Anyway, I know that a lot of people suggest a higher DPI for competitive gaming (FPS's and RTS's are my main concern, though I play just about everything). It seems like 1,800 DPI is about where most of the suggestions lie, though I've seen others using 4,000, which is crazy.

1,800 is too much for me, at least to start, so I've decided to go with 1,000. But I'm finding that making precise adjustments (e.g. aiming a tiny bit more to the left) is really, really difficult, since the tiniest of movements covers some serious ground. So it got me thinking: maybe I need to reduce the mouse sensitivity in the game. Would that be an effective measure? Or is doing so just canceling out the higher DPI? Should I instead just get used to my normal sensitivities at 1,000 DPI until I can adapt? Will I eventually be able to make fine adjustments at 1,000 DPI and up?

It's been a few days, and my accuracy out of games (just using Windows) is getting better, but it's really hurting me in game. I can be patient, but I want to make sure that I'm doing it right.
 
I would suggest using whatever feels right for you.

There are many competitive FPS players that still use 400-800 CPI settings because it's what they're most comfortable with; from what I understand, most optical mice using the ADNS-3090 sensor are also at their most accurate running those settings at 1000 Hz.
 
Drop your in-game sens. That is not negating your mouse's DPI settings in any way.
 
If you want mouse movement that's as accurate as possible, set the in-game sensitivity as low as it will go without the game dropping movements. Usually this value is 1, but it is completely dependent on the game. Then adjust your mouse DPI to whatever feels comfortable.

For 2D mouse movement (desktop and RTS) I prefer my DPI to be about the same as my vertical resolution. I use a 1920x1080 monitor and I set my DPI to 1200. So if I move my mouse up or down an inch it will traverse the entire screen.
For first person shooters I set it so it takes 6 or 7 inches to turn 360 degrees which happens to be half the length of my mouse mat.
For me that is the perfect balance between being able to make minuscule per pixel movements while still being able to quickly do a 180. Find whatever is comfortable for you.
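As a quick sanity check on the arithmetic above, here's a small Python sketch. The desktop figure is just pixels divided by DPI (assuming Windows at 6/11, i.e. a 1:1 count-to-pixel ratio); the 360-distance figure assumes a Source-style engine where each count turns you m_yaw = 0.022 degrees times the in-game sensitivity, which is an assumption about the engine, not a universal constant.

```python
# Rough mouse-feel calculator. Assumptions: 1:1 Windows scaling at 6/11,
# and a Source-style engine constant m_yaw = 0.022 deg per count.

M_YAW = 0.022  # degrees turned per count at in-game sensitivity 1 (Source engines)

def inches_to_cross_screen(pixels, dpi):
    """Inches of mouse travel to move the cursor across `pixels` pixels."""
    return pixels / dpi

def inches_per_360(dpi, sens, m_yaw=M_YAW):
    """Inches of mouse travel for a full 360-degree turn in-game."""
    return 360 / (dpi * sens * m_yaw)

# Desktop example from above: 1080 vertical pixels at 1200 DPI
print(inches_to_cross_screen(1080, 1200))        # 0.9 - roughly an inch

# e.g. 1200 DPI with an in-game sens of 2 lands in the 6-7 inch window
print(round(inches_per_360(1200, 2), 2))         # 6.82
```

Plugging in your own DPI and sensitivity gives a comparable "inches per 360" number you can try to match across games.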
 
CPI, DPI, and polling rate all have different effects. Less sens in-game might as well be lowering the DPI on the mouse in question (you capture X dots per inch but then negate part of that movement by reducing the sens, so why bother with the high DPI at that point? It's counterproductive).

I generally go around 1500 and set the in-game sens up to 1/2 for most games; this way fast twitch movement can still be tracked but you still have precision. I have a CM Inferno mouse and just love it, wickedly adjustable.
 
I agree with Skull_Angel. The main thing you should worry about is finding what's most comfortable for you. The one thing I recommend is removing any form of mouse acceleration, both in-game and in Windows (disable "enhance pointer precision"), and keeping the Windows mouse speed at the center (6/11). Anything other than those settings and you'll be reducing the accuracy by adding prediction.

As for the games, it really depends. Most games I've played have the in-game mouse controls set at a neutral level, so they don't need adjusting ('cept for Skyrim... had to get into the .ini to remove accel on that one). Anytime I find it too fast or slow, I just increase or decrease my DPI.

To the people using 4k DPI... Either they're playing at very high resolutions, where you need the DPI, or they're lowering the sens a ton. 4k at 1920x1080 is pretty much unplayable. The only time I see anyone use that sens without a massive resolution is when they're doing it for "e-peen" bragging rights. I personally use 1800@1000hz for most games (1920x1080 resolution), but drop to 800 or even 400 if it feels too fast (most old games with resolution locked at a lower setting).

EDIT: seems like getting up in the middle of a post, only to return an hour and a half later, makes you miss other people's posts hahaha. I wanted to add that I also use sharknice's method of adjusting sens for games that feel a bit off (non-neutral).
 
I did a few calculations on how many degrees a sensitivity of 1 turns you for each dot measurement from the mouse.

Team Fortress 2 - 0.138 degrees
Counter-Strike: Global Offensive - 0.138 degrees
Chivalry: Medieval Warfare - 0.05 degrees

All Source Engine games are probably the same. I am not sure about Unreal Engine 3 games. Chivalry allowed me to set it using the console, but Unreal Tournament 3 only allows values of 11-99999 and Sanctum 2 doesn't allow you to set it to any numerical values or allow you to change it in the console or configuration files.

These measurements aren't super accurate. I measured by looking at one spot then turning as straight as I could 360 degrees so that I looked at the exact same spot. I adjusted my mouse DPI so it would take exactly 6.25" to do this.
For the calculations I took my in-game sensitivity (which was 1) multiplied by 360 degrees, then divided it by my mouse DPI setting.

In TF2 and CS:GO I have an in-game sensitivity of 1.0 and a DPI of 2600.
In Chivalry I have an in-game sensitivity of 1.0 and a DPI of 7200.

I searched the internet for degrees per sensitivity charts but I couldn't find anything. Something like that would be extremely helpful for getting consistent settings across all games. Especially if developers gave the actual values instead of people having to do estimations.
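The calculation described above (in-game sensitivity times 360 degrees, divided by mouse DPI) can be sketched as a tiny script; the numbers below are just the DPI and sensitivity values quoted in this post:

```python
def degrees_per_dot(sens, dpi):
    """Degrees per dot, per the calculation described above:
    in-game sensitivity * 360 degrees / mouse DPI."""
    return sens * 360 / dpi

# Values quoted above (in-game sensitivity 1.0 in every case)
print(round(degrees_per_dot(1.0, 2600), 3))  # 0.138 - TF2 / CS:GO at 2600 DPI
print(round(degrees_per_dot(1.0, 7200), 3))  # 0.05  - Chivalry at 7200 DPI
```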
 
They also rely on software, as you are pointing out with your tests, AND they rely on the tracking software as well. Most of the new mice track well, but many of the older ones (even laser and such) could not track if DPI was set too high or too low; they just had bad tracking software. Either way, we can go through the theory and such all day (which is cool, btw), but comfort and such matters most. I switch DPI on the fly depending on the game I play. Generally in most games I use high DPI with higher sens, but when it comes to, say, flying, I turn it down a notch or 2 to make sure I don't crash lol.

Being able to adjust as you need to is a real nice thing. I have 1920x1080 and 1680x1050 screens, and generally ~1500-1680 is perfectly fine for me, though I do have to turn up the sens in nearly all the games I play, especially the BF series for vehicles, or it's just far too sluggish.
 
and keeping the windows mouse movement speed in the center (6/11). Anything other than those settings and you'll be reducing the accuracy by adding prediction.

Is this an accurate statement? I have mine set pretty low (2/11), that adds prediction?
 
Is this an accurate statement? I have mine set pretty low (2/11), that adds prediction?

From what I understand it adds a layer of interpolation, which basically means (extra) calculations are needed to determine where to move.
 
Is this an accurate statement? I have mine set pretty low (2/11), that adds prediction?

I should clarify, prediction wasn't the correct word to use there, so I'm sorry about that. Adding scaling would have been a better thing to say.

On setting 2, the cursor only moves 1 pixel for every 16 sent by the mouse. Personally, I feel that's a reduction in accuracy, as you could move your mouse 14 units, and your cursor wouldn't move. That said, there are games that completely ignore your windows cursor speed, and just use the raw input anyways. 6/11 is a 1 to 1 ratio, which is why it's preferred.

Summing it up, below 6/11 and you're discarding input. Above 6/11 and you're generating artificial input. I personally feel the latter is more detrimental. If you want to see the rest of the ratios, check out http://www.overclock.net/t/173255/cs-s-mouse-optimization-guide It's a bit dated, but most of the information is still relevant, and it contains much more info about the subject than I would want to type out here lol. The images in particular are very telling.
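To make the discard-vs-artificial-input point concrete, here is a deliberately simplified Python model of pure cursor scaling (it ignores Windows' actual sub-pixel handling and the velocity curves "enhance pointer precision" adds; the multiplier values are illustrative parameters, not the real slider table):

```python
def pixels_moved(counts, multiplier):
    """Simulate a cursor under pure scaling: each mouse count contributes
    `multiplier` pixels, and the cursor only moves by whole pixels."""
    acc = 0.0
    pixels = 0
    for _ in range(counts):
        acc += multiplier
        whole = int(acc)   # only whole pixels actually move the cursor
        pixels += whole
        acc -= whole       # leftover fraction carries to the next count
    return pixels

print(pixels_moved(10, 1.0))  # 10 - 1:1, like 6/11: nothing lost, nothing invented
print(pixels_moved(10, 0.5))  # 5  - below 1:1, half your input never shows up
print(pixels_moved(10, 2.0))  # 20 - above 1:1, pixels you never physically moved
print(pixels_moved(1, 0.5))   # 0  - a tiny flick may not move the cursor at all
```

The last line is the "discarding input" case in miniature: a one-count movement produces zero cursor movement at a sub-1:1 multiplier.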
 
1,800 is too much for me, at least to start, so I've decided to go with 1,000. But I'm finding that making precise adjustments (e.g. aiming a tiny bit more to the left) is really, really difficult, since the tiniest of movements covers some serious ground.

Dude... drink less caffeine. :)

I think you just need time to adjust to higher DPI settings. Eventually you should be able to adjust. I basically used a Logitech wireless Performance MX mouse to play games for a long time. I think it had 1600 DPI. Unfortunately, I dropped it and it became a bit twitchy... :(

I bought a wired Logitech G500 and in a rather short period of time I was able to adjust to the higher DPI setting for the mouse. It can store 5 settings which can be switched by pressing buttons on the mouse. I am either using a 2800 or 3200 DPI setting in and out of games.

Perhaps you should start low and ratchet up the DPI setting as time goes by and simply settle on whatever feels comfortable to you. If 1000 DPI is too high, then drop it down to 600 DPI. Use that setting until you feel comfortable with it, then increase to 800 DPI. Rinse and repeat until you find an ideal setting for yourself.


Eventually my Performance MX mouse recovered from its twitches. I now use it with my laptop, and I bought a spare one several months ago when I saw it for $50.
 
I skimmed this thread...

Which version of the DeathAdder do you have? If you have a 3, 3.5, or B.E., 1,800 CPI should be your best setting, but if you're using raw input you should lower the in-game sensitivity to your liking. The DeathAdders before the 2013 variant work best at 1,800 CPI and 500 Hz, if I remember correctly...
If you're not using raw input, the lower, interpolated settings might give the best tracking at the cost of added latency. Someone correct me if I am wrong!

If you have a DeathAdder 2013, this is much simpler because you can choose the CPI setting.
 