SetPoint Sensitivity versus DPI?

sharknice

2[H]4U
Joined
Nov 12, 2012
Messages
3,752
Does anyone know how exactly the sensitivity and DPI settings in Logitech SetPoint work?

I searched all over the web and all I've been able to find are posts from people who say they heard it works a certain way from someone else. I haven't found any actual firsthand facts, just people guessing.

I have always assumed you want the mouse outputting the highest DPI possible, so I have my Logitech G700 set to 5700 DPI and I decrease the speed setting until it reaches the ideal speed. My understanding is that a speed setting of 5 applies no modification to the raw DPI the mouse outputs. Setting it above 5 would be bad because you are actually multiplying the input and skipping pixels, and setting it below 5 somehow reduces the effective DPI; 0 seems to cut it down to about 1/6th of the normal DPI.
What I want to know is how it actually cuts down the DPI.
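Just to illustrate what I mean by "skipping pixels" with a multiplier above 1, here's a toy example (not claiming this is SetPoint's actual code, just the arithmetic I'm picturing):

```python
# Hypothetical illustration: scaling integer mouse deltas by a
# multiplier above 1.0 makes some pixel positions unreachable,
# i.e. the cursor "skips" pixels.
def scale_delta(raw_delta: int, multiplier: float) -> int:
    return round(raw_delta * multiplier)

# With a 2x multiplier, a 1-count movement jumps the cursor 2 pixels,
# so odd pixel offsets can never be hit by single-count movements.
for raw in range(1, 5):
    print(raw, "->", scale_delta(raw, 2.0))
# 1 -> 2, 2 -> 4, 3 -> 6, 4 -> 8
```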

My guess at how it's handled, based on how I think it would work best, is this:
Grab the raw data from the mouse, which is the change in X and Y position. Multiply each of those numbers by the speed setting as a floating-point value, then add them to X and Y positions stored in memory as floating-point numbers.
Give Windows only the whole number (because you can't have the cursor between pixels), but keep tracking the position at higher precision than is displayed.
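Here's a minimal sketch of that scheme, purely as an assumption of how it could work, not a claim about what SetPoint actually does:

```python
# Assumed scheme: multiply each raw delta by a fractional speed factor,
# accumulate in floating point, hand only whole pixels to the OS, and
# carry the sub-pixel remainder forward so no movement is lost.
class SubPixelScaler:
    def __init__(self, speed: float):
        self.speed = speed        # e.g. 0.5 for a below-default setting
        self.remainder_x = 0.0
        self.remainder_y = 0.0

    def apply(self, dx: int, dy: int) -> tuple[int, int]:
        # Accumulate scaled movement, including leftovers from
        # earlier reports.
        self.remainder_x += dx * self.speed
        self.remainder_y += dy * self.speed
        out_x = int(self.remainder_x)   # whole pixels to report
        out_y = int(self.remainder_y)
        self.remainder_x -= out_x       # keep the sub-pixel leftovers
        self.remainder_y -= out_y
        return out_x, out_y

scaler = SubPixelScaler(0.5)
# Four 1-count reports at 0.5x still add up to 2 pixels of movement.
print([scaler.apply(1, 0)[0] for _ in range(4)])   # [0, 1, 0, 1]
```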

But I have heard from random posts on the internet that setting it below 5 just skips reads, so it can end up dropping movement entirely, plus all sorts of other guesses. I have never seen any hard sources showing how it actually works. Does anyone know?
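For contrast, here's what that "lossy" behavior would look like if each report were truncated on its own with no carried remainder (again, just a hypothetical sketch of what those posts describe):

```python
# Hypothetical lossy approach: truncate each report independently,
# throwing the fraction away every time instead of accumulating it.
def lossy_scale(dx: int, speed: float) -> int:
    return int(dx * speed)   # fraction discarded per report

# Four 1-count reports at 0.5x produce zero total movement,
# unlike the accumulating version above.
print([lossy_scale(1, 0.5) for _ in range(4)])   # [0, 0, 0, 0]
```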

TL;DR Give me research and facts.