# USB polling precision



## HAGGARD

It's an old topic really, but from what I've gathered it has been mostly misunderstood and never been thoroughly examined.

While mice themselves are only rarely to blame for polling imprecision, I still think the topic deserves a dedicated thread here as it greatly impacts mousing and is an improvable issue.

The whole thing ended up longer than I wanted it to, mostly because I went to probably unnecessary lengths with my explanations, but whatever - I told some people I would post this here. If anything is hard to understand or inaccurate, please don't hesitate to point it out.

*What is polling imprecision and how does it affect mousing?*

Movement and other input data created within your mouse is grabbed by your PC periodically and made available for the general system and use in applications.
The rate at which data is transferred from your mouse's internal buffer to your PC is called polling rate. The higher the polling rate, i. e. the lower the polling interval, the less potential delay between movement or other input activity (buttons, wheel) being registered on your mouse and that input being translated into events on your desktop or in an application.

A higher polling rate reduces input latency, with the set poll interval being the maximum added latency. Maximum, because regardless of the poll interval, data can enter the mouse buffer at any point in time according to your physical actions, and *polling happens at a set rate* regardless of the endpoint's buffer activity. That means you can physically actuate the mouse, it will fill its buffer with the corresponding data at some point X after that action, and the next poll from the host will grab that data anywhere between X and X plus one polling interval.
The timing relation between your physical action and the subsequent poll determines the latency added by polling. That can be mere microseconds or up to one whole polling interval. Optimally you would want to set your mouse to be polled at a rate of 1000Hz - once every millisecond. That way you ensure not only that polling latency is 1 millisecond at most, but you also increase the likelihood of polling latency being only fractions of that, since more polls per unit of time increase the likelihood of a poll happening just after your mouse buffer is filled.

Note that this only goes for "single" events where the buffer is written to at an arbitrary time. If the buffer filling rate of your mouse exceeds the polling rate, i. e. if you move your mouse reasonably fast, the maximum polling latency will be met throughout that movement.
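The latency arithmetic above can be checked with a minimal simulation. The function and the 1ms interval here are illustrative assumptions of mine (an idealized host that polls exactly on millisecond boundaries), not anything a real controller exposes:

```python
import random

def poll_latency_ms(event_time_ms, interval_ms=1.0):
    """Delay until the next poll grabs an event written at event_time_ms,
    assuming an idealized host that polls exactly at t = 0, 1, 2, ... ms."""
    return (interval_ms - (event_time_ms % interval_ms)) % interval_ms

# Single events land at arbitrary times, so the added latency is uniform
# over [0, interval): half a poll interval on average, one interval at worst.
random.seed(42)
samples = [poll_latency_ms(random.uniform(0.0, 1000.0)) for _ in range(100_000)]
average = sum(samples) / len(samples)
print(round(average, 3))  # ≈ 0.5 ms, i.e. half the 1 ms poll interval
```

This matches the picture in the text: worst case one full interval, best case near zero, on average half an interval.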



On the left side you see single events entering the buffer at arbitrary points in time. Looking at the individual input events, you can see that the first enters the buffer just after the last USB poll, so that one will experience nearly the maximum poll latency of 1ms before the next poll grabs it. The second event enters the buffer towards the very end of the polling interval, just before the next poll is initiated; for that second input, polling latency would be a few microseconds. The third input falls in between two polls, so roughly 500 microseconds of polling-induced latency for that one.
On the right side you are moving the mouse around or scrolling the wheel fast, and data enters the buffer faster than once per millisecond. So you will get a sum of those inputs delivered to the PC constantly every 1ms, with an average latency of 500us.

Using the same depiction of poll processes, this is what imprecise poll times would look like:



The time in between polls is not consistent; polls are not evenly distributed through time. Normally, when the CPU cannot address a poll process in a timely manner and you thus get an off-timed poll, the system will try to compensate and handle the next poll process as much earlier as the previous one was late, or vice-versa.

By the way, that also means that with higher polling rates (and lower CPI, respectively) you additionally increase the speed at which you have to move the mouse before average polling latency occurs on every poll. For example at 400cpi @ 500Hz you hit 1ms average latency on your inputs at 500 / 400 = 1.25inch/s, whereas at 1kHz the average of 500us latency on your inputs is met only at 1000 / 400 = 2.5inch/s.
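That threshold is just the polling rate divided by the CPI; a tiny helper (the function name is mine) makes the two examples concrete:

```python
def buffer_fill_speed_threshold(polling_hz, cpi):
    """Speed in inches per second beyond which the sensor produces counts
    faster than the host polls, so every poll carries data and the average
    added latency settles at half the poll interval."""
    return polling_hz / cpi

# The two examples from the text:
print(buffer_fill_speed_threshold(500, 400))   # 1.25 inch/s at 500 Hz
print(buffer_fill_speed_threshold(1000, 400))  # 2.5 inch/s at 1 kHz
```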

Polling precision does not affect input latency - or at least that is not the primary reason why you would want to improve polling precision. *Polling precision describes how quickly the CPU can address an interrupt routine triggered by the USB host controller*. You can think of it like framerate vs. frametime. You get 100 frames per second, but not every frame will be finished exactly 10 milliseconds after the last; some take longer to be rendered and others are rendered more quickly to compensate. Analogously you get 1000 polls per second, but some poll processes take longer and are compensated for with more quickly triggered or addressed consecutive polls.

Polling variance is in the range of a couple hundred microseconds even on unoptimized systems, up to a few thousand microseconds on flawed ones.
These are latencies you will not perceive as such, especially because they are compensated for, and more severely mistimed polls are rare and happen amidst thousands of other polls.
What I argue you will perceive, however, is "stuttering". Going back to the frametime analogy: you can have framerates as high as 500fps, and there will still be noticeable microstuttering if frametimes are not consistent.

I personally think the threshold is somewhere in the ~500us range, from my time playing around with this. I.e. *if you regularly get polls off-timed by as much as half a millisecond, stutter will be apparent in cursor behaviour and more significantly in in-game rotation*. Any improvements beyond that still serve a purpose though: for the PC, even 100us is a very significant time span, and getting mouse events processed as quickly as possible will make everything regarding input run more precisely. Especially considering that games themselves add a lot of strain on the system, any improvement to polling precision in a desktop environment may be necessary to get in-game polling precision to acceptable levels.
Polling precision can also be used as an indicator of how crisp your system is in general.




As you can see, I have optimized my system to be precise up to a maximum polling variance of 5us, with the vast majority of polls being processed correctly timed in the nanosecond range - even hitting a maximum possible precision that may be determined by the host controller's ability to trigger ISRs, the maximum hardware interrupt frequency of line-based interrupts, the TSC frequency my host controller operates at, or simply the maximum precision of the measurement program itself.
These measurements are taken with MouseTester by microe. Log Start -> circle your mouse fast enough to hit the USB polling rate, but not fast enough for malfunction to occur -> Log Stop. Look at the Interval vs. Time graph. Dismiss noise at the beginning and end of the motion with Data Point Start and Data Point End. You really don't have to move the mouse faster than 1m/s, if even that. Even at 400cpi you hit buffer filling rates of <1ms pretty quickly (polling rate / CPI = speed (inch/s) beyond which you fill the buffer more quickly than it is read).

*Where does polling imprecision come from?*

Starting briefly with the fundamentals, how does USB polling work?
Most importantly, *the endpoint device is passive*. Upon registration on a USB host controller it requests, among other things, a certain protocol to be serviced with, including the polling interval as a bInterval specification. But that's it as far as the device's role in poll timing is concerned. While there is reason to believe the USB protocol allows for mid-operation changes of the bInterval specification for power saving purposes, I have yet to see any mouse utilize it. And when there are changes in the polling interval (when a mouse set to request 500Hz regularly shows 1000Hz readings, for example) it is quite obviously a flaw, since the readings occur amidst constant tracking, where power saving features would be misplaced. I have seen this happen with Zowie mice.

Other possibilities of mice affecting polling that I have seen are the internal buffer being fed at a rate below the set polling rate (e.g. the MX518, which caps out at around 700Hz) or mice simply not accepting certain polling specifications (seen most often in office-grade mice). Apparently firmware can mess with poll behaviour as well, or so I've recently been made aware by a user on here with a DA3G 1800cpi. Most likely related to buffer writing behaviour as well.

It's important to *note that issues on the mouse side* like buffer filling inconsistencies or limits, and flaws in the USB communication settings *are all things that would show in polling variance as whole milliseconds*. The device is still polled at a set millisecond interval, so when one poll returns no data because of the mouse messing up, you get data returns on a consecutive poll. Thus any poll delay with roots on the mouse side will always be in multiples of milliseconds.
Mouse-sided effects on reported poll times are very obvious in that they are not compensated for (you get extreme values offset only in one direction on the Y axis).
*Poll variance on the scale of microseconds and below on the other hand always has its roots on the PC side of the process*.

The active role in the polling process is occupied by the USB host controller. Depending on the polling interval specification, it issues timed interrupt schedules to the CPU to handle an endpoint's I/O tasks. The host controller itself is not the source of polling imprecision; it operates according to a clock of its own and is precisely timed to fulfill the USB microframe standard of 125us.

Interrupt service routine spawned by the host controller followed by 6 scheduled deferred procedure calls executed on core 2:


In practice, the endpoint is checked (polled) for state every X ms depending on the bInterval specification. When the mouse buffer contains data, its state is flagged accordingly to let the host know to grab that data within that same poll process. After a successful transfer the host hands the device an ACK (acknowledgement of successful transfer), upon which the endpoint buffer is flushed.
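That handshake can be mimicked with a toy model. The class and function here are mine, purely to illustrate the passive-endpoint/active-host split; a real host controller does this in hardware:

```python
class ToyEndpoint:
    """Passive device-side interrupt IN endpoint with a report buffer."""
    def __init__(self):
        self.buffer = []

    def write(self, report):
        # Firmware side: motion/button data lands whenever it happens.
        self.buffer.append(report)


def host_poll(endpoint):
    """One poll: return buffered data and ACK (flush) it, or None if the
    endpoint has no state change to report."""
    if not endpoint.buffer:
        return None                  # device NAKs; nothing to transfer
    data = list(endpoint.buffer)     # host reads the buffered report(s)
    endpoint.buffer.clear()          # ACK -> endpoint flushes its buffer
    return data


ep = ToyEndpoint()
assert host_poll(ep) is None         # idle poll: no state change
ep.write({"dx": 3, "dy": -1})
assert host_poll(ep) == [{"dx": 3, "dy": -1}]
assert host_poll(ep) is None         # buffer was flushed after the ACK
```

Note the host drives every step; the endpoint only ever answers.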



The important bit here is that *the host issues polls according to the set interval independent of what the mouse is doing*. The USB protocol may allow for more relaxed timing schedules or interrupt priorities when the endpoint has been flagged inactive for a certain amount of time, i. e. returned no state change after a certain amount of polls, but there's no detailed information about that out there that I could find, nor is it really important here.

So, there's this precisely timed interrupt schedule spawned by the host controller that the CPU has to satisfy upon being interrupted with an ISR. Any imprecision in the polling behaviour will therefore be rooted in the CPU's inability to do so (or, rarely, the controller's inability to promptly trigger ISRs). *Enabling the CPU to satisfy strict periodic demands is the main leverage point to get rid of polling imprecision*. We achieve this by either decreasing the amount of tasks the processor has to address (DPCs that compete with those scheduled by the USB host; ISRs prompted by other hardware, which are addressed prior to the DPCs), increasing the rate and efficiency with which the processor addresses tasks, or switching task orders and priorities around.

*How to optimize the system:*

First, consider a few more general tips for a cleaner base system:

Uninstall proprietary HID drivers and software.

Install the latest chipset and hardware drivers. In regards to PCI Bus, USB filter, USB host controller and sound drivers you will have to see for yourself whether generic drivers supplied by Windows provide better polling and DPC results. Generally you should go the "the more recent the better" route for these.

Flash your motherboard with the most recent stable BIOS.

Disconnect any devices or internal hardware components you don't use. This includes things like the CD drive and leaving motherboard headers like the front panel support empty unless you want to use them. Preferably the only device hosted on USB is the mouse.
Same goes for programs; everything installed you don't use is by definition bloat.

Clean your startups with TechNet's autoruns.

Disable any integrated motherboard components you don't use from the BIOS. USB 3.0 chip, sound chip, video chip, unused serial ports, ...

Use the system variable "DEVMGR_SHOW_NONPRESENT_DEVICES 1" to reveal driver corpses of disconnected devices and uninstall everything that's not used.

Disable from the device manager any USB host controllers except for the one the mouse is hosted on and uninstall third-party USB 3.0 drivers. Uninstall the left-over hubs as well after you disable host controllers.

With Windows services, I have always kept it simple:

Essential services are set to automatic:

Schedule
SENS
ProfSvc
DcomLaunch
gpsvc
IKEEXT
PlugPlay
RpcSs
RpcEptMapper
Power

All other services you set to manual. This way Windows and applications can start services according to their dependencies and needs. For instance, you will boot without internet access and a connection will be established once attempted by other tasks or applications (if you want it to be established upon boot, set WinHttpAutoProxySvc to Automatic). I have yet to run into problems with any calls for services being ignored or dependencies not being co-launched, but there's a variety of software and associated interactions out there, so if you run into problems you will have to check which services you might have to manually start from the services.msc console for the piece of software to function properly.
The audio services are the only ones that I found do not respond to software calls. You can either set them to automatic or manually start them when you need them:

AudioSrv
AudioEndpointBuilder

Some services you might want to specifically disable:

Themes
UxSms
SysMain
TabletInputService
TapiSrv
WinDefend
WcesComm
RapiMgr
WSearch
MpsSvc
CertPropSvc
hidserv

I won't address further basic Windows settings as this is not supposed to be a guide on how to properly set up Windows. One additional thing to consider for NVIDIA users is that the NVIDIA Display Driver Service (nvsvc) can also be set to manual. Normally NVIDIA launches two instances of the NVIDIA Display Driver Helper Service (nvvsvc.exe) and one instance of the NVIDIA User Experience Driver Component (nvxdsync.exe) regardless of whether or not you have NVIDIA Experience installed (which I recommend you do not). With the parent service nvsvc set to manual, those will not automatically start with Windows. They won't start when you launch a game either. There are no problems without them; game profile settings are applied properly and so is everything else. **Have to correct myself here: The profile settings are applied properly (confirmed with the NVIDIA Inspector frame limiter), but the game is less responsive without the processes running. Would be interesting to know what they are doing exactly. I recommend you either set nvsvc to automatically start with Windows or refer to the next sentence.**
However, if you do want to use your NVIDIA Control Panel, you just have to right-click your desktop. The NVIDIA shell extension will then start all three processes to provide desktop-level support via the context menu.
Going into more specific settings for the problem we are dealing with, the first thing is to configure the CPU. From the BIOS, disable any dynamic clocking, power saving or sleep features that apply to your processor.
Overclocking is optional and does not always benefit handling of time-sensitive or multi-threaded tasks. Look online for stable clocks for your processor and according voltages as well as RAM timings. If your motherboard supports automated voltage and RAM controls I have found no reason not to use them. This might depend on the processor and motherboard type, so do your research there.

*HPET or High Precision Event Timer*. Let's keep it simple here and say it is a hardware timer and as such enables hardware to interrupt the CPU more frequently than older timers (reflected in QueryPerformanceFrequency). If your motherboard-CPU combination supports HPET, you should leave it enabled in the BIOS. Windows by default uses other timers and can synchronize and selectively utilize timers for hardware that requests it to do so.
Since Windows can call the HPET functions if needed or requested by hardware, there is no reason to force HPET onto Windows as the default timer with bcdedit /set useplatformclock true. For hardware that doesn't support HPET, Windows would in that case always have to synchronize HPET with the timer the hardware supports.
You should also not specifically disable it with bcdedit /set useplatformclock false, because if hardware does support high precision interrupt frequencies it might function better (as seen for example in hardware that supports Message Signaled Interrupt mode). Just use bcdedit /deletevalue useplatformclock and let your OS decide on hardware timers. Further reading: http://www.windowstimestamp.com/description

*ISRs and DPCs:* Hardware interrupts to the CPU and their more or less delegated tasks.
Quote:


> The OS has no control over their execution. ISRs are triggered by physical hardware signals from devices to the CPU. When a device signals the CPU that it needs attention, the CPU immediately jumps to the driver's interrupt service routine. DPCs are scheduled by interrupt handlers and run at a priority only exceeded by hardware interrupt service routines.
> 
> ISR and DPC activity usually increases with system activity. As a system becomes more active, interrupts and DPCs will generally become more frequent, taking up more CPU time. This can get to the point where they visibly (or audibly) affect system performance. In these cases, no single ISR or DPC routine is the problem - it is their cumulative usage of CPU time.


See: https://msdn.microsoft.com/en-us/library/windows/hardware/ff554373%28v=vs.85%29.aspx

Both DPC latency checkers like LatencyMon and thread activity checkers like DispatchMon are useful here.
You don't want to swamp your CPU with tasks if you care about swift execution times. DPC latency checkers schedule low-priority tasks (DPCs) and count the time the CPU needs to address them. The more tasks the CPU is otherwise occupied with, the longer this takes and the more "latency" - as in delay in addressing general tasks - is present on the system.

*There are three approaches to helping the processor do things more swiftly:*


*Reducing the work load:*
Disconnect and disable hardware or software devices, uninstall drivers and software, clean your autostart, minimize the amount of background or scheduled tasks. All of this means less ISRs and DPCs for the CPU to have to dedicate cycles to and faster dedicating of CPU time to tasks you really want to be addressed.

*Increasing the work power:*
Overclock the CPU, disable power management features like power saving, thermal control, dynamic clocking, sleep states, etc. and unpark your CPU cores with the CPUUnparkApp.
Use the high performance power plan. Make sure to go to regedit.exe -> HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Power\PowerSettings and go through all entries, setting "Attributes" DWORD entries you find to "2". This will reveal power settings in the advanced power plan settings otherwise unavailable.
For the high performance power plan you want to disable every power saving feature available, especially PCI power management and USB selective suspend. Reading will help. CPU power management will have a ton of entries. Ignore those; parking is disabled already and minimal CPU frequency will already be at 100%.

*Increasing the work efficiency:*
The timer resolution of Windows: to my knowledge this is basically the tick rate of Windows and determines the scheduling precision of thread activity. That doesn't mean it affects the rate at which actions are performed on a hardware level, but how often Windows itself interrupts the hardware to check the status of scheduled or periodic operations and either add new tasks or time out others. Setting the timer resolution lower than the default 15.6ms can help applications (not hardware) perform more accurately, in that they are able to more frequently access drivers to have Windows create new or kill old tasks (any task the system may have delegated to the hardware that would otherwise run for an unnecessary amount of time with a longer timer duration). TimerTool allows you to decrease the tick interval to 500 microseconds instead of 15600 microseconds, helping software execution but also keeping the CPU more efficient by killing tasks in a more timely fashion should they happen to be scheduled beyond their needs.
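The effect of timer granularity on timed waits can be probed from user space. A rough, platform-dependent sketch (the function is mine; only the lower bound is guaranteed - the overshoot you observe reflects your system's tick and scheduling overhead):

```python
import time

def average_sleep_ms(requests=50, request_ms=1.0):
    """Request many short sleeps and measure how long they actually take.
    A sleep never returns early, so the overshoot beyond the requested
    time reflects timer/tick granularity plus scheduling overhead."""
    total = 0.0
    for _ in range(requests):
        t0 = time.perf_counter()
        time.sleep(request_ms / 1000.0)
        total += (time.perf_counter() - t0) * 1000.0
    return total / requests

observed = average_sleep_ms()
# On a system with a coarse tick, `observed` can be several times the
# requested 1 ms; with a fine tick it stays close to 1 ms.
print(observed)
```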

Task priorities. If you want certain tasks to be addressed more time-sensitively, setting IRQ priorities is a way to tell the CPU which interrupts are more important than others.
msinfo32.exe will tell you which IRQ# is assigned to which hardware component. For hardware that you want the CPU to prioritize, create an IRQ#Priority DWORD entry in your registry under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\PriorityControl, where # corresponds to the actual number attached to the desired component. Set the value of that entry to 1.
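As a concrete sketch of that registry entry, assuming msinfo32 showed the relevant controller on IRQ 16 (a made-up number - substitute the one from your own system), the resulting .reg fragment would look like this:

```
Windows Registry Editor Version 5.00

; Hypothetical example: prioritize interrupts on IRQ 16.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\PriorityControl]
"IRQ16Priority"=dword:00000001
```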

Another potentially viable way to improve handling of interrupts is to resolve IRQ conflicts. In msinfo32.exe, look for components that share an IRQ# under Conflicts and see whether you can disable any, change IRQs from your BIOS, get your mouse registered on a host controller that doesn't share an IRQ#, or try to see if any components support MSI mode. For the latter, head to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\PCI and, for every entry that possesses an "Interrupt Management" key within its Device Parameters, create a new key called "MessageSignaledInterruptProperties". Inside it, create a DWORD entry called MSISupported and set it to 1. After you have done that with all devices under PCI, reboot and open your device manager. Hit View -> Resources by connection. Expand the interrupt requests and scroll to the bottom. Every component that supports MSI mode will be at the very bottom with negative values. Take note of their hardware IDs and set the MSISupported entry back to 0 for every component that is not among those that support it.
Or use a little application called "MSI_util" that will spare you from having to adventure through your registry yourself.

Note that unstable overclocks or mismatched RAM timings affect work efficiency, as do multi- and hyperthreading techniques. People have seen improvements from disabling Intel's hyperthreading feature. Disabling cores completely from the BIOS could be experimented with.
Another thing to think about is core affinity. The handling of resource-heavy components that are generally not known for multithreaded performance can be assigned to a single core - preferably, for this topic's purposes, one that does not deal with your USB activity (look at DispatchMon).
For the audio handling specifically, go to your task manager's service tab and locate AudioSrv and AudioEndpointBuilder. Right-click, "Go to Process(es)". It will guide you to the corresponding svchost.exe process containing activity of the audio service, maintaining threads and scheduling tasks. Right-click on that process, "Set Affinity...". You can do this with processes from other drivers, services, or software as well.
Managing process priorities can in theory help (https://msdn.microsoft.com/en-us/library/windows/desktop/ms685100%28v=vs.85%29.aspx). These determine the scheduling pattern for your running software. Software should only compete with essential hardware-sided threads such as mouse input when set to real-time priority, so heed Microsoft's warning not to ever do that. You can experiment with setting other hardware-sided processes to lower priority levels though (such as the audio svchost.exe mentioned above).

*Some unsorted stuff:*

You can now see why disconnecting and disabling as much hardware and uninstalling as much software as possible is beneficial: fewer interrupts and tasks for the CPU to deal with, and more cycles dedicated earlier to stuff you actually use. So apply this logic as far as you deem it worth it. I'd argue, if you have a dedicated gaming PC, why not go all the way? And in the rare case you do need things like your USB 3.0 ports, printers or CD drive, Internet Explorer or Windows features, etc., you can just enable them. Even stuff that requires reboots shouldn't be bothersome to enable with the fast boot times that come with SSDs. Obviously, if you do everything on a single main PC, it's for you to decide just how far you want to go.

You will also have to go through drivers and test for yourself what effect they have on DPC latencies and polling precision. As I mentioned, for chipsets the latest should be the best, because manufacturers have no real obligation to release those drivers regularly - they only ever do when they see major possible improvements or fixes.
With graphic, network and sound cards and other addon hardware that's a different story. LatencyMon will give you a pretty good idea if drivers are bad in regards to DPC latency and how healthy your USB activity is (look at driver execution times, especially USBPORT.SYS).

Sound components are complex and resource-heavy. I use an onboard chip with generic Windows drivers, but you will have to check for yourself what the least heavy implementation is. Maybe external (TOSLINK, coax, USB) or internal (PCI) alternatives perform better than onboard solutions. USB sound cards likely wreck polling precision, but if you use one (or any USB device other than the mouse, for that matter), try to get it hosted on a different controller than the mouse. If you use your keyboard on USB, consider reducing the keyboard polling rate with hidusbf.sys. Although preferably, plug your keyboard into PS/2, which as an asynchronous interface requires no periodic polling and thus leaves the CPU alone at times a USB-interfaced keyboard wouldn't.
Other external sound solutions (TOSLINK, coax) still need an active playback device in your PC. I use my onboard chip to optically feed an external amplifier. But since the onboard chip doesn't have to apply digital-to-analog conversion or amplify the signal, I can imagine it operates less demandingly than when using the line-out.
Disable any playback or recording devices you are not using, including any High Definition Audio Controllers in the device manager that come with your GPU.

Maybe PCI add-on USB cards perform even better than those implemented onboard. I haven't tested that. Doubt it, but something you can look at if you have such an extension card.

Disable your antivirus while you are gaming. Or consider not using one to begin with.

Remember that running a game is very resource-heavy and will affect polling precision significantly - another reason why frame caps and low-quality/high-performance settings can be useful in games whose experience you are looking to improve.

Switch to 500Hz if you can't at all get 1000Hz stable on your system.

TimerTool has some interesting effects on polling behaviour. This is what different settings look like after manually applying them. I won't make any assumptions as to why this happens.

15.6ms/10ms/5ms/2.5ms/1.25ms:


As you can see, basically anything above 0.5/1ms leads to my specific setup jumping between two discrete points of poll address timings - +30/-30us to be exact. I always set the timer resolution to 0.5ms when I'm gaming.

Here is what totally messed up polling behaviour looks like (my laptop):


It feels horrible. Naturally, weaker base component performance, stricter thermal and power saving features, the built-in screen and so on play a role as well, but mousing instantly feels way better and more controlled when switching to 500Hz instead of 1000Hz. The improvements just from reducing the polling rate on that laptop:



*I would like to see results from your setups and possible findings you have playing with TimerTool and tweaking other things.*


----------



## acid_reptile

Thanks. Interesting read.


----------



## CorruptBE

Good post, should be stickied to avoid a lot of confusion (too many people think tweak X affects mouse performance whereas in reality they're just reducing the overall workload on the PC).


----------



## Trull

Nice. This is what I've been waiting for.


----------



## Trull

Quote:


> Originally Posted by *HAGGARD*
> 
> For that, head to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\PCI and create for every entry that possesses an "Interrupt Management" folder within the Device Parameters a new folder called "MessageSignaledInterruptProperties". In that context, create a DWORD entry called MSISupported and set that to 1.


*Don't do this*. It will lock your Windows down and it won't be able to start up, even in safe mode.


----------



## Melan

I wouldn't do any of this either way. It's a great experiment (and read - no really, good job man) but not worth it in the long run, at least for me. I'm not even bothered that my DPC latency is hovering around 49us with occasional jumps to 200. Except that time a bad network driver made it 5000.


----------



## HAGGARD

Thanks for the kind words.
Quote:


> Originally Posted by *Trull*
> 
> *Don't do this*. It will lock your Windows down and it won't be able to start up, even in safe mode.


Didn't read of that happening to anyone. If it does, use "Last Known Good Configuration" in the advanced boot options. It will reset your registry to the last point you were able to boot into Windows properly.

@Melan: Like I mentioned there, "optimizing" is always a compromise and it's a fair enough decision to say functionality outweighs the concerns raised here. I'm not saying to do anything possible to get a few microseconds less in dpclat times either. For polling variance I would say ~100us is a nice enough range that is achievable on most modern systems without extensive tweaking. If you are interested in "flawless" tracking though, I'd however say anything beyond that and especially obvious flaws (couple of hundred up to thousands of microseconds variance) are something you should try and iron out.
You can for example leave your CPU power saving features enabled and just use your power plan to prevent CPU down-clocking or sleeping. Using your keyboard on PS/2 or reducing the polling rate, trying to get things out of competing with each other for CPU attention etc. - those are all things anyone could reasonably apply without having to really compromise.


----------



## CorruptBE

Quote:


> Originally Posted by *Trull*
> 
> *Don't do this*. It will lock your Windows down and it won't be able to start up, even in safe mode.


Works fine, but do it device per device or otherwise "last known safe..." might indeed not work anymore. The hardware in question has to support it.

As for power saving in the BIOS, I tend to turn everything off except for C3/C6. A lot fewer stuttering issues etc. in some "badly" coded games. Also, you can disable the Multimedia Class Scheduler service after removing the dependency from the Audio Service.


----------



## uaokkkkkkkk

Pentium 4 users appreciate your effort good sir.


----------



## aleexkrysel

Good guide, I'll work through the optimization tips when I have some time in the coming days. However, I just checked my USB Controller section in Device Manager. This is what I see: 
Could someone please take a noob by the hand, explain what is what, and tell me how to proceed? I would be eternally grateful.


----------



## HAGGARD

Quote:


> Originally Posted by *CorruptBE*
> 
> Works fine, but do it device by device, otherwise "Last Known Good..." might indeed not work anymore. The hardware in question has to support it.
> 
> As for power saving in the BIOS, I tend to turn everything off except for C3/C6. A lot fewer stuttering issues etc. in some "badly" coded games. Also you can disable the Multimedia Class Scheduler service after removing the dependency from the Audio Service.


Might depend on the motherboard, for me if a component doesn't support MSI it just stays in line-based mode regardless of the entry.

Why would you disable MMCS though? It prioritizes scheduling tasks of multimedia applications over general tasks. You can tweak it to prioritize game performance and set audio to a lower priority.
https://msdn.microsoft.com/en-us/library/windows/desktop/ms684247%28v=vs.85%29.aspx

@ uaokkkkkkkk: They're out there. What a peculiar face though... Dude should get his liver checked also.


----------



## HAGGARD

Quote:


> Originally Posted by *aleexkrysel*
> 
> Good guide, I'll work through the optimization tips when I have some time in the coming days. However, I just checked my USB Controller section in Device Manager. This is what I see:
> Could someone please take a noob by the hand, explain what is what and tell me how to proceed. I would be eternally grateful.


You have two EHCI in there. Those are USB 2.0 or Enhanced Host Controller Interfaces. They also provide backward compatibility for USB 1.1 devices. Root Hubs are onboard chips that handle the physical USB ports, generic Hubs are used to include additional USB ports that don't have to be directly on the board. 1 Root hub per EHCI and 1 for the 3.0 host controller (xHCI - eXtensible Host Controller Interface).
You can look at the host controller properties or view "devices by connection" to see which controller your mouse is hosted on. From there you can uncheck "PC can turn off this device to save power" in the power management tab. If you plan to disable host controllers that are not used, you can then use "show hidden devices" and uninstall the faded-out hubs associated with the now disabled controllers.


----------



## banjogood

What's up with this? http://i.imgur.com/u9xvTqk.png


----------



## HAGGARD

You have an outlier at the beginning there messing up your scale. Adjust Data Point Start some more.


----------



## banjogood

Quote:


> Originally Posted by *HAGGARD*
> 
> You have an outlier at the beginning there messing up your scale. Adjust Data Point Start some more.


Right. Here's my results. http://i.imgur.com/O2F4BIm.png That looks pretty terrible doesn't it?


----------



## deepor

Quote:


> Originally Posted by *HAGGARD*
> 
> [...]
> 
> The active role in the polling process is occupied by the host controller. Depending on the polling interval specification, it issues timed interrupt schedules to the CPU to handle an endpoint's I/O tasks.
> Interrupt service routine spawned by the host controller followed by 6 scheduled deferred procedure calls executed on core 2:
> 
> [...]
> 
> In practice, the endpoint is checked (polled) for states each Xms depending on the bInterval specification. When the mouse buffer contains data, its state is flagged accordingly to let the host know to grab that data within that same poll process. After a successful transfer the host hands the device an ACK or acknowledgement of successful transfer upon which the endpoint buffer is flushed.
> The important bit here is that *the host issues polls according to the set interval independent of what the mouse is doing*. The USB protocol may allow for more relaxed timing schedules or interrupt priorities when the endpoint has been flagged inactive for a certain amount of time, i. e. returned no state change after a certain amount of polls, but there's no detailed information about that out there that I could find, nor is it really important here.
> 
> So, there's this set interrupt schedule spawned by the host controller that the CPU has to satisfy upon being interrupted with an ISR. Any imprecision in the polling behaviour will therefore be rooted in the CPU's inability to do so (or the controller's inability to timely prompt ISRs). *Enabling the CPU to satisfy strict periodic demands is the main leverage point to get rid of polling imprecision*. We achieve this by either decreasing the amount of tasks the processor has to address (DPCs that are in competition with those scheduled by the USB host, ISRs prompted by other hardware that are addressed prior to the DPCs), increasing the rate and efficiency with which the processor addresses tasks, or by switching task orders and priorities around.
> 
> [...]


Are you sure you've understood that right with the host controller and the polling and the CPU? I was under the impression that the CPU doesn't work on details like that and that the host controller manages everything about the polling and getting data from the device. The host controller then triggers the interrupt after it has the data from the device to present to the CPU.

Did I misunderstand that? I tried checking the Linux kernel source code once for this, and couldn't see anything in the HID driver that deals with the polling being done. The code I looked at was just working with actual data, didn't have to do any work to get it. It was very simple and short. Is there some more work for the actual USB protocol that has to be done on the CPU? It could have maybe been hidden in some other driver so I might have made a mistake reading that code.


----------



## HAGGARD

Not terrible at all. That's roughly a second of measuring, and maximum variance is 100 microseconds, with average variance in the 20-50 microsecond range. It's obviously improvable, especially considering it's 500Hz, which is less demanding than 1kHz, but it does mean there is no severe problem in your system.
EDIT: Just now noticed that I didn't look at your graph properly, or you edited it. That's 10 microseconds at max and 2-5 microseconds average variance. Doesn't get much better.
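If you want to run the same numbers on your own MouseTester export, the calculation is just the deltas between consecutive event timestamps and their deviation from the nominal interval. A minimal Python sketch - the sample log below is invented for illustration, not real mouse data:

```python
# Compute polling-interval variance from event timestamps (in milliseconds).
# The sample log below is invented for illustration, not a real capture.

def interval_stats(timestamps_ms, nominal_ms):
    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    deviations = [abs(i - nominal_ms) for i in intervals]
    return {
        "max_dev_us": max(deviations) * 1000,                    # worst-case variance
        "avg_dev_us": sum(deviations) / len(deviations) * 1000,  # average variance
    }

# A fake 500Hz log: nominal interval 2 ms, with a few microseconds of jitter.
log = [0.0, 2.001, 4.003, 5.998, 8.000, 10.004]
stats = interval_stats(log, nominal_ms=2.0)
print(stats)
```

The same two figures (max and average deviation in microseconds) are what the posts in this thread are comparing.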

@ deepor: Not saying I know exactly which part of the polling process is handled internally on the host chip and which tasks the CPU occupies. Might even be that the CPU solely grabs data to process into messages the OS can work with.
That said, the only possible conclusion from all these settings affecting the poll precision is that the CPU ultimately is responsible for how consistently and timely mouse reports are made available for the OS and thus the cursor and applications. MouseTester looks at raw input activity at that, so it's not like the measured poll precision represents the time needed to convert mouse data into cursor movement or other events.
* Sources are hard to find right now:
http://vr-zone.com/articles/blast-from-the-past-usb-polling-induced-performance-slowdowns/16620.html
http://www.baslerweb.com/media/documents/BAS1302_White_Paper_USB3+USB3_Vision-Standard_EN.pdf
In both of these you can read that USB polling is directly tied to the CPU. So USB interrupts are more than just ready data that the CPU only has to work with further from there. They are not describing the process accurately though.


----------



## banjogood

Quote:


> Originally Posted by *HAGGARD*
> 
> Not terrible at all. That's roughly a second of measuring and maximum variance is 100 microseconds, average variance in the 20-50 microsecond range. It's obviously improvable, especially considering it's 500Hz which is less demanding than 1kHz, but it does mean there is no severe problem in your system.


It's a 500Hz IMO 1.1. I have HPET disabled in my BIOS. I'll try playing with that. Thank you!


----------



## HAGGARD

Quote:


> Originally Posted by *arsn*
> 
> It's a 500Hz IMO 1.1. I have HPET disabled in my BIOS. I'll try playing with that. Thank you!


Sure thing.

Added some pictures to break some of the lengthy paragraphs.


----------



## pinobot

Maybe this is kinda relevant:

http://en.wikipedia.org/wiki/Nyquist_frequency

Of course with mice we're not talking about sine waves.

Another example is moiré patterns when scanning books or newspapers on a flatbed scanner. You need twice the resolution to scan without moiré patterns.
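For the curious, the frequency folding behind this is easy to demonstrate. A small Python sketch - the frequencies are chosen arbitrarily for illustration, not taken from any mouse:

```python
# Nyquist in miniature: sampling a 400 Hz sine at 500 Hz (below the
# 800 Hz Nyquist rate) makes it indistinguishable from a 100 Hz signal.
import math

signal_hz = 400
sample_hz = 500  # below 2 * 400 = 800 Hz, so aliasing occurs

# The alias folds down to |signal - nearest multiple of the sample rate|.
alias_hz = abs(signal_hz - round(signal_hz / sample_hz) * sample_hz)
print(alias_hz)  # 100

# The sampled points of the 400 Hz sine coincide exactly with those of a
# (negative-phase) 100 Hz sine taken at the same instants:
for n in range(5):
    t = n / sample_hz
    a = math.sin(2 * math.pi * signal_hz * t)
    b = math.sin(2 * math.pi * -alias_hz * t)  # aliased counterpart
    assert abs(a - b) < 1e-9
```

As noted, mouse motion isn't a sine wave, but the same idea applies: motion detail faster than half the polling rate can't be represented in the reports.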


----------



## Crymore13

Why does my Deathadder 2013 have these peaks ?



Movement made:



My Latencymon:



Windows 8.1


----------



## Derp

Disabling C6/C7 made an enormous improvement, but I'm not okay with the processor sitting at full voltage and consuming five times the power. The polling on my system is quite terrible with C6/C7 enabled.


----------



## deepor

Quote:


> Originally Posted by *Derp*
> 
> Disabling C6/C7 made an enormous improvement, but I'm not okay with the processor sitting at full voltage and consuming five times the power. The polling on my system is quite terrible with C6/C7 enabled.


Did you only experiment on the desktop? You should check what's happening with regards to C6/C7 while a game is running. It might keep the CPU from falling completely asleep and you might be fine there.

There are two different collections of C-states with the same names: ones for the individual cores and ones for the whole package. The package ones can only activate if all cores are asleep at the same time. Those are the ones that can reduce speed and lower Vcore (if using offset overclocking). If at least one core is kept running by a game, the package states might never activate, and then individual cores in C6 might wake up to full voltage and run at full speed from the start.


----------



## HAGGARD

Quote:


> Originally Posted by *Crymore13*
> 
> Why does my Deathadder 2013 have these peaks ?


I would say it's because you are slowing down at some point in your circle, but then all of those readings jump exactly 1 poll interval.
Whatever it is, it is coming from your mouse. The host would try to compensate for deviation from the set interval, so you would see counter-peaks as well.
Take some more logs and try to stay consistently fast throughout the circle movement.
Quote:


> Originally Posted by *Derp*
> 
> Disabling C6/C7 made an enormous improvement, but I'm not okay with the processor sitting at full voltage and consuming five times the power. The polling on my system is quite terrible with C6/C7 enabled.


Even with the high performance power plan set? Not too familiar with the C6/C7 power states, but like all C-states they are micro-sleeps for selective core idling. Setting minimum processor state to 100% may or may not prevent cores from entering advanced idle states.
Alternatively, use SpeedStep to save energy in a balanced power plan. Clocking features are effectively disabled with minimum set to 100% when you switch to your performance power plan. Or calculate just how much more you would be spending on power consumption per year without C-states.
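That last calculation is back-of-the-envelope arithmetic. All the numbers below are assumptions for illustration - measure your own idle draw with a wall meter and plug in your local electricity price:

```python
# Rough yearly cost of running without C-states.
# Every figure here is an assumption for illustration, not a measurement.

extra_watts = 30       # assumed extra idle draw with C-states disabled
hours_per_day = 8      # assumed hours/day the PC sits idle or lightly loaded
price_per_kwh = 0.25   # assumed electricity price in your currency

extra_kwh_per_year = extra_watts * hours_per_day * 365 / 1000
cost_per_year = extra_kwh_per_year * price_per_kwh
print(round(extra_kwh_per_year, 1), "kWh extra per year")
print(round(cost_per_year, 2), "currency units per year")
```

With these made-up numbers it comes out to under ~90 kWh a year, which is the kind of figure to weigh against the polling improvement.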


----------



## Melan

Mine.



Another one.


----------



## Derp

Quote:


> Originally Posted by *HAGGARD*
> 
> Even with the high performance power plan set? Not too familiar with the C6/C7 power states, but like all C-states they are micro-sleeps for selective core idling. Setting minimum processor state to 100% may or may not prevent cores from entering advanced idle states.
> Alternatively, use SpeedStep to save energy in a balanced power plan. Clocking features are effectively disabled with minimum set to 100% when you switch to your performance power plan. Or calculate just how much more you would be spending on power consumption per year without C-states.


Yes. With all power saving in the BIOS disabled except C6, and the Windows power plan set to 100% so the clock speed never drops, I get scattered polls from 1.4-2.6ms on a 500Hz G100s. If I disable C6 and change nothing else, the polling changes to 1.97-2.02ms.


----------



## L4dd

Thank you!
I would like to have an explanation for my Lachesis 3G's SPI speed possibly being a bottleneck when using 1,000 Hz polling, too. I remember possibly noticing smoother mouse movement with 500 Hz versus 1,000 Hz via that mouse, so I suspect that it has a less than 1,000 Hz or so SPI speed and would show what you have listed here...


----------



## Crymore13

Quote:


> Originally Posted by *HAGGARD*
> 
> I would say because you are slowing down at some point in your circle, but then all of those readings exactly jump 1 poll interval.
> Whatever it is, it is coming from your mouse. The host would try to compensate for deviation from the set interval and so you would see counter-peaks as well.
> Take some more logs and try to stay consistently fast throughout the circle movement.


With my Roccat Taito(Razer Scarab before) now, 1 circle only, constant velocity.


----------



## HAGGARD

Very interesting. Try a swipe instead of a circle first. If it's still there, then we can start wondering what else it could be.
Using any Razer/ROCCAT software?
Quote:


> Originally Posted by *L4dd*
> 
> Thank you!
> I would like to have an explanation for my Lachesis 3G's SPI speed possibly being a bottleneck when using 1,000 Hz polling, too. I remember possibly noticing smoother mouse movement with 500 Hz versus 1,000 Hz via that mouse, so I suspect that it has a less than 1,000 Hz or so SPI speed and would show what you have listed here...


500Hz often also seems smoother because it "filters" inputs. Refer to www.overclock.net/t/1251156/an-overview-of-mouse-technology, "Polling Misnomer" section.
Quote:


> Originally Posted by *Derp*
> 
> Yes. With all power saving in the BIOS disabled except C6, and the Windows power plan set to 100% so the clock speed never drops, I get scattered polls from 1.4-2.6ms on a 500Hz G100s. If I disable C6 and change nothing else, the polling changes to 1.97-2.02ms.


That is indeed significant. Either you calculate how much more you'd be spending and make your decision from there, you disable C6/7 every time you want to play seriously, or you try to reduce power consumption with SpeedStep enabled. When you want to game you can just set the performance plan and SpeedStep will effectively be disabled.


----------



## Crymore13

Quote:


> Originally Posted by *HAGGARD*
> 
> Very interesting. Try a swipe instead of a circle first. If it's still there, then we can start wondering what else it could be.
> Using any Razer/ROCCAT software?
> 500Hz often also seems smoother because it "filters" inputs. Refer to www.overclock.net/t/1251156/an-overview-of-mouse-technology, "Polling Misnomer" section.


I am using software from Razer; the Roccat Taito is a mousepad...
I tried with my Kana v2 and it gave a different result. I didn't install its software because it was plugged in only for testing, hence the 500Hz.

Kana v2(500hz):



Deathadder 2013(500hz):



Apparently the problem is with the mouse itself.


----------



## HAGGARD

Excuse my obliviousness regarding the Taito.

As I wrote in the OP, with earlier DeathAdder versions firmware did cause polling issues. http://www.overclock.net/t/1499037/got-me-a-deathadder-3g-v2-need-advice-on-firmware-flash

Someone contacted me and his mouse showed similar behaviour:


Try a different firmware version and uninstall your drivers or try other driver versions as well afterwards.


----------



## Crymore13

Quote:


> Originally Posted by *HAGGARD*
> 
> Excuse my obliviousness regarding the Taito.
> 
> As I wrote in the OP, with earlier DeathAdder versions firmware did cause polling issues. http://www.overclock.net/t/1499037/got-me-a-deathadder-3g-v2-need-advice-on-firmware-flash
> 
> Someone contacted me and his mouse showed similar behaviour:
> 
> 
> Try a different firmware version and uninstall your drivers or try other driver versions as well afterwards.


I looked for the firmware but found nothing, only Synapse.
This is the version of my DeathAdder: http://www.razerzone.com/store/razer-deathadder
4G is 6400DPI.
I only found firmware up to the 3.5G.

I'll keep at this until the problem is solved; I always feel there's something wrong when I play.


----------



## HAGGARD

AFAIK firmware flashing on the newer models happens exclusively from the synapse software itself. Did you search in there already? Not sure they offer different versions in there though, might even be that for the 4G version there never was a firmware update. If you can't find anything, at least your firmware version should be somewhere in there.
A lot of people have the DA13, if they have the same problems with the same firmware we have the culprit. From there we'd have to contact Razer.

But try searching in synapse first. For now you can also try uninstalling synapse and the razer driver should you be using it.


----------



## cryptos9099

Edit:



503 My brain is dumb.


----------



## HAGGARD

yeah ur hand


----------



## L4dd

Quote:


> Originally Posted by *HAGGARD*
> 
> 500Hz often also seems smoother because it "filters" inputs. Refer to www.overclock.net/t/1251156/an-overview-of-mouse-technology "Pollling Misnomer" section.


I do not mean path correction. I simply mean polling affected the mouse movement much like what you are explaining here; it was smoother at 500 Hz because there was less polling variance, since the SPI could not fully handle 1,000 Hz, AFAIK... *Bullveyr* was the one to claim that the Lachesis 3G's SPI had a speed of 700 Hz.


----------



## HAGGARD

Quote:


> Originally Posted by *L4dd*
> 
> I do not mean path correction; I simply mean polling affected the mouse movement much like what you are explaining here; it was smoother at 500 Hz because it got less polling variance, AFAIK...


It's not really path correction; you still end up on the same pixel. It's just that small deviations are not always as apparent as with 1000Hz, because more inputs are combined.
That effect makes mousing feel less jittery; low poll variance would make it less stuttery. But it is very possible that you indeed felt the latter.


----------



## L4dd

Quote:


> Originally Posted by *HAGGARD*
> 
> It's not really path correction; you still end up on the same pixel. It's just that small deviations are not always as apparent as with 1000Hz, because more inputs are combined.
> That effect makes mousing feel less jittery; low poll variance would make it less stuttery. But it is very possible that you indeed felt the latter.


Yeah, it certainly affected "jitter/ripple", because the PTE sensor is very sensitive to fine movements such as vibrations. I understand what you're saying regarding combining inputs, but I believe the polling variance was part of it too.


----------



## Conditioned

Interesting read. I would advise disabling HPET for a more accurate mouse feel and lower DPC latency (slightly dependent on which mobo I used). I would also advise against setting TSC through your registry trick for NVIDIA cards. A couple of others who are good at measuring and I feel it brings some stutter/lag that doesn't happen all the time, but maybe once every 10-30 secs.


----------



## HAGGARD

Again, very possible that variance had to do with it. As I said in the OP, with my laptop the difference was night and day just from going to 500Hz, which the system could maintain more stably.

The "smoothing" anti-jitter effect of lower polling rates, in regard to the totaling of inputs, is very apparent in Paint: way less noise in the lines you draw.
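That totaling effect is easy to simulate: a 500Hz report carries roughly the sum of two 1000Hz reports, so independent per-report noise partially cancels relative to the doubled real movement. A rough Python sketch with invented noise figures, not a model of any real sensor:

```python
# Why halving the polling rate "filters" jitter: summing pairs of reports
# shrinks the noise relative to the (doubled) real counts per report.
# Noise model and numbers are invented purely for illustration.
import random

random.seed(1)
true_counts = 5                       # real movement per 1 ms report
reports_1k = [true_counts + random.choice([-1, 0, 1]) for _ in range(1000)]

# Combine consecutive pairs of 1 kHz reports into 500 Hz reports.
reports_500 = [a + b for a, b in zip(reports_1k[0::2], reports_1k[1::2])]

def rel_noise(reports, true_per_report):
    """Average deviation from the true movement, relative to its size."""
    devs = [abs(r - true_per_report) for r in reports]
    return sum(devs) / len(devs) / true_per_report

print(rel_noise(reports_1k, 5) > rel_noise(reports_500, 10))  # True
```

The path still ends on the same pixel either way; only the per-report wobble shrinks, which is the less-jittery feel described above.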


----------



## Melan

Wait. Wasn't it called ripple? Jitter was cursor movement while the mouse itself was static.


----------



## HAGGARD

I wouldn't accept that even if those were the commonly used terms. Jitter makes way more sense for the effects of unsteady motion/tracking.


----------



## Melan

It just gets very confusing from time to time.


----------



## HAGGARD

Not being serious either. But I guess it was easy to understand which of the two was meant there.
Are there even mice that exhibit cursor movement while they are static? Certainly not these days.


----------



## Crymore13

Quote:


> Originally Posted by *HAGGARD*
> 
> AFAIK firmware flashing on the newer models happens exclusively from the synapse software itself. Did you search in there already? Not sure they offer different versions in there though, might even be that for the 4G version there never was a firmware update. If you can't find anything, at least your firmware version should be somewhere in there.
> A lot of people have the DA13, if they have the same problems with the same firmware we have the culprit. From there we'd have to contact Razer.
> 
> But try searching in synapse first. For now you can also try uninstalling synapse and the razer driver should you be using it.


Uninstalled Synapse and driver, the problem remains...
I will contact the support and see if they solve my problem...

Thanks anyway;


----------



## HAGGARD

Pretty obviously a firmware problem then. Well that or a weird defect with your copy. Let's hope more people with the mouse can be arsed to check theirs and report in if they get the same problem.
So no firmwares available in synapse for that model?


----------



## prd555

My Deathadder @ ... Hz, all C-states disabled and without Synapse:


----------



## HAGGARD

Time to give Razer some heat. It's not that old a mouse, maybe you can get them to ship a firmware update. With the 3G version there was a similar problem that was introduced with a newer FW, so it should be adjustable here as well.


----------



## Crymore13

Quote:


> Originally Posted by *HAGGARD*
> 
> Pretty obviously a firmware problem then. Well that or a weird defect with your copy. Let's hope more people with the mouse can be arsed to check theirs and report in if they get the same problem.
> So no firmwares available in synapse for that model?


Nothing.
I tried an old version of Synapse, but it automatically updates to the newer one when logged in.


----------



## trism

Quote:


> Originally Posted by *Trull*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HAGGARD*
> 
> For that, head to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\PCI and create for every entry that possesses an "Interrupt Management" folder within the Device Parameters a new folder called "MessageSignaledInterruptProperties". In that context, create a DWORD entry called MSISupported and set that to 1.
> 
> 
> 
> *Don't do this*. It will lock your Windows down and it won't be able to start up, even in safe mode.
Click to expand...

As far as I know, the only thing that could lock Windows is setting the ATAPI controller to MSI mode when the controller does not support it. But as said already, you should be able to get back to the previous working settings if this happens.

If you are going to set every device to MSI mode, leave HPET on. HPET off bugged my network adapter (much less bandwidth) and my audio went stuttery and played slower.


----------



## HAGGARD

Quote:


> Originally Posted by *trism*
> 
> HPET off bugged my network adapter (much less bandwidth) and my audio went stuttery and played slower.


Did you at all play around with the useplatformclock parameter in the boot configuration?


----------



## trism

Quote:


> Originally Posted by *HAGGARD*
> 
> Did you at all play around with the useplatformclock parameter in the boot configuration?


No. It's not set at all. I've tried using HPET on and useplatformclock set to true before in previous Windows installations but that made the input feel very delayed.

You also quoted me a bit wrong, as it is out of context. Using the network adapter and the sound card in MSI mode *while* HPET was set to off in the BIOS was bad. HPET on and MSI mode works fine. HPET off and basic line-based interrupt mode works fine.


----------



## HAGGARD

I didn't mean to imply something was wrong with your settings, I wanted to confirm that Windows is able to utilize HPET timings selectively for hardware that supports it while operating at non-platformclock.
As I said in the OP, considering that the OS can selectively utilize HPET, there is no reason to force platformclock as standard timer in bcdedit.


----------



## dlano

Finalmouse 2015, HPET on:










I'd like to think I've done some reasonable optimisation, but it's a fairly old setup now, with a USB controller that doesn't always power up my keyboard (and flat out hated some Razer peripherals in general): an MSI X58 Pro with an i7 920, still going strong.









Reading your post makes me wonder if poor polling is why the FM2015 is limited to 500Hz: they can't control their customers' systems, so capping it gives the best possible user experience. Being designed as an "e-sport" mouse, it's meant to be plugged into a machine provided for you, set up who-knows-how.

The idea of them claiming it wasn't suitable for 1000Hz always seemed odd on their part, until now, if my tinfoil-hat thoughts are correct.


----------



## VolsAndJezuz

Very good thread. There were several good tips in the OP that I hadn't remembered to do on my new Win7 x64 install with my new hardware. I also decided to try HPET on + MSI mode for the first time (I always felt the risk to an otherwise stable OS wasn't worth the rewards), and so far I'm very pleased. Aim does feel quite different compared to HPET off in BIOS + IRQ setup. It almost feels a little slower with HPET + MSI, but somehow not in a laggy way: it feels more solid and accurate, but perhaps not quite as instantly responsive for flicking, fast 180s, etc. I'm still getting used to it and will go back and forth with HPET off + IRQ a few more times over the next several days to decide which I like better, but currently I'm leaning towards HPET + MSI. Here is a quick comparison of 1000Hz polling rate for both situations (data exported from MouseTester so I could do additional analysis and more custom graphs). You can see the deviation has tightened nicely, and with this new configuration 1000Hz feels acceptably steady to use over 500Hz. I also included a MouseMovementRecorder screencap to show what my typical 1000Hz polling rate data looks like now. It's noteworthy because of how much more tightly the reported rates now stick right around 1000Hz during fast mouse movement, where previously with HPET off + IRQ it was common if not frequent to see dips and peaks of 20, 30, or 40Hz+ away from the desired 1000Hz.





When I say 'MSI mode', here's how my PCI resources are. My HPET off + IRQ setup only had the Intel 82579V Gigabit Network adapter adopting MSI mode on its own. Note, I did try to get the SATA RAID controller to MSI mode, even installing Intel RST software in hopes it would do this or at least allow it, but I ended up only with boot BSODs that required System Restore to fully alleviate (Last Good Configuration gave BSOD-less reboot cycles). I'm thinking this is because I'm using the somewhat old v11.x Intel RAID ROM and RST drivers that are vastly superior to newer alternatives for my Z77 chipset, and I suspect upgrading RAID ROM + driver would allow MSI mode. It wouldn't be worth the tradeoff for me, especially because the SATA RAID controller doesn't have anything sharing its IRQ channel.



What might be hard to tell is the behavior when you change directions quickly and the mouse is momentarily at rest. This data was from several fast side-to-side swipes with about 10 direction changes in all, at which you would expect the update time to be a multiple of the idealized 1ms update time while the mouse was at or near zero velocity. Those data points are cut off to focus on the behavior at the full-speed update interval. Previously, I had stuck with 500Hz polling rate because the polling consistency/update interval wasn't satisfactorily consistent, and this could be felt at the end of fast swipes, where 500Hz felt precise and easy to stop on a dime while 1000Hz felt unwieldy and floaty. This was reflected in the data I collected for both with my previous HPET off + IRQ setup, where 500Hz would come to a nice sudden and linear stop in this situation (_e.g._: 500, 500, 500, 500, 250, -) while 1000Hz would flutter to a stop (_e.g._: 1000, 1000, 1000, 500, 1000, 1000, 500, 250, 500, 125, -). Much to my surprise, with HPET + MSI the data from changes in direction and full stops looks much more like the old 500Hz data, with only a couple of update times that were multiples of the 1ms idealized value for each instance; and this feeling translates to in-game aim as well, where 1000Hz now feels appropriately steady and reliable at all speeds.
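The multiples-of-the-nominal-interval pattern described above is easy to pick out of exported data. A small Python sketch with invented intervals (not taken from the actual logs in this thread):

```python
# During direction changes the mouse sends nothing for a poll or two, so
# update intervals cluster at multiples of the nominal 1 ms. Classifying
# each interval by its nearest multiple separates those stop events from
# genuine polling jitter. The sample intervals below are invented.

def nearest_multiple(interval_ms, nominal_ms=1.0):
    return max(1, round(interval_ms / nominal_ms))

intervals = [1.0, 1.0, 2.0, 1.0, 4.0, 1.0, 1.0, 8.0, 1.0]  # fake 1 kHz log
stops = [i for i in intervals if nearest_multiple(i) > 1]
jitter = [abs(i - nearest_multiple(i)) for i in intervals]

print(stops)        # [2.0, 4.0, 8.0] - the direction-change gaps
print(max(jitter))  # 0.0 - idealized data, no jitter at all
```

On real data the jitter list would show the residual deviation once the legitimate stop multiples are accounted for.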

A few points in regards to the optimization suggestions in the OP:

-Intel Chipset Software/"Drivers" 9.x and 10.x almost literally change nothing besides the device name in Device Manager, as they are merely INF files and not drivers, hence if you already had a 9.x version or higher, upgrading to any newer version doesn't require a reboot. This applies to Intel SMBus Host Controllers and USB Host Controllers, so there is no need or point in trying to do any A/B comparisons to the default Microsoft Windows drivers versus the Intel "drivers".

-For Windows services, in addition to the services you listed I found the Dhcp (DHCP Client) and Dnscache (DNS Client) services to be necessary to have automatically started in order for my LAN to connect to the internet. And because I use Windows Firewall, also needed the MpsSvc (Windows Firewall) and BFE (Base Filtering Engine) services started automatically.

-CPUUnparkApp, and for that matter any CPU unparking utility I could find, were about as useful as counting your feet. ParkControl at least could accurately tell whether CPU cores were actually unparked, though it couldn't make the changes itself even with unblocking and running as admin. CPUUnparkApp and CoreParkingManager both would make inadequate changes that didn't actually unpark the cores, and then would variously report, for seemingly different reasons, that the CPU was unparked. These were all bollocks, so I think you should instead recommend the registry edits as you did, then follow that up with simply setting the high performance power plan and changing the following to 100% in the advanced power settings > Processor power management: Processor performance core parking min cores, Minimum processor state, Maximum processor state, and Processor performance core parking max cores; finally, set Allow Throttle States to Off. I wonder how many people unknowingly still have parked cores after being misled by the seemingly universally incompetent unparking programs out there, like I had been for a long time. Complicating the matter, there didn't seem to be an easy-to-find, straightforward resource that laid out how to manually unpark your CPU cores.

-Why use TimerTool to adjust the timer resolution? Since the setting doesn't stick if you close the program, you'd have to keep it open and minimized to keep timer resolution at 0.5ms. Why not use the SetTimerResolution service instead? http://forums.guru3d.com/showthread.php?t=376458

-Trying to set core affinity for a service is futile because it is wiped after every restart, and I couldn't find a convenient program or method for retaining it. Furthermore, it made absolutely no measurable difference in performance in my short time experimenting with it, and I only see core-affinity tweaks being useful as a low-end system optimization. So I would personally just toss out that recommendation.


----------



## mandrake88

TL;DR needed, 500hz or 1000hz?


----------



## popups

Quote:


> Originally Posted by *mandrake88*
> 
> TL;DR needed, 500hz or 1000hz?


We don't do that kind of stuff.


----------



## VolsAndJezuz

Also, you may want to mention that a static GPU [over]clock/voltage (using NVIDIA Inspector, for instance) is similarly very preferable for polling precision/minimal input lag.


----------



## VolsAndJezuz

Quote:


> Originally Posted by *mandrake88*
> 
> TL;DR needed, 500hz or 1000hz?


Even though I had what I considered to be a highly optimized gaming rig, I previously didn't feel comfortable using 1000Hz and would have recommended 500Hz to anyone not using Motion Blur Reduction or Lightboost.

In light of the further optimizations possible from tips in the OP, mainly revolving around running with HPET on and putting devices in MSI mode, I now think you can achieve satisfactory polling stability at 1000Hz, BUT only with heavy-handed optimization and an appreciably hefty gaming rig. And even then I imagine 1000Hz won't work well for every or even most systems out there due to unique configuration/firmware/peripheral combinations and nuances, not to mention conflicting user preferences.

I would still probably recommend 500Hz for most situations, especially to the lay user. Unless you are willing to spend hours and hours testing, tweaking, and optimizing to get update intervals and polling rates close to the picture in my previous post @ 1000Hz, you are simply giving up a lot more in consistency and accuracy than you are gaining in responsiveness IMO


----------



## HAGGARD

Thanks for the elaborate comments VolsAndJezuz. Glad this could convince you to try and go 1kHz. Regarding some of your concerns:

Quote:


> Note, I did try to get the SATA RAID controller to MSI mode, even installing Intel RST software in hopes it would do this or at least allow it, but I ended up only with boot BSODs that required System Restore to fully alleviate (Last Good Configuration gave BSOD-less reboot cycles)


Any SATA controller boot problems that are for some reason not gone after the registry reset should be resolvable by changing the SATA mode in the BIOS back to IDE. From there you can edit the registry and set it to AHCI/RAID again afterwards.
Really strange how some of you get boot problems with MSI mode. Again, mine just stay in line if they don't support MSI, regardless of what the registry suggests. Probably your hardware itself supports MSI, but the associated drivers or motherboard don't support it for that component, and that leads to conflicts.
Quote:


> -Intel Chipset Software/"Drivers" 9.x and 10.x almost literally change nothing besides the device name in Device Manager, as they are merely INF files and not drivers, hence if you already had a 9.x version or higher, upgrading to any newer version doesn't require a reboot. This applies to Intel SMBus Host Controllers and USB Host Controllers, so there is no need or point in trying to do any A/B comparisons to the default Microsoft Windows drivers versus the Intel "drivers".


That depends on whether the chipset packages come with their own associated .sys tree or refer to the standard Windows library. SMBus requires no actual driver, and host controllers will mostly use Windows' own usbhub/usbehci/usbohci/usbport .sys files, but for USB filters they often ship proprietary drivers. Beyond that, the INF does more than give the device a name: it can also install services, edit the registry and specify other things (such as power management capabilities or interrupt mode) that are theoretically a better fit for the component than generic Windows defaults would give. But I'll agree with you that this is not likely a root cause of polling problems on any system.
Quote:


> -For Windows services, in addition to the services you listed I found the Dhcp (DHCP Client) and Dnscache (DNS Client) services to be necessary to have automatically started in order for my LAN to connect to the internet. And because I use Windows Firewall, also needed the MpsSvc (Windows Firewall) and BFE (Base Filtering Engine) services started automatically.


Good additions. To establish an internet connection after booting I just run Windows Update; it launches all the needed services. Firefox unfortunately does not request the launch of network services; a foobar2000 stream, for example, does.
Quote:


> -CPUUnparkApp, and for that matter any CPU unparking utility I could find, was about as useful as counting your feet. ParkControl at least could accurately tell whether CPU cores were actually unparked, though it couldn't make the changes itself even with unblocking and running as admin. CPUUnparkApp and CoreParkingManager both made inadequate changes to actually unpark the cores, and then would variously report, for seemingly different reasons, that the CPU was unparked. These were all bollocks, so I think you should instead recommend the registry edits as you did, then follow that up with simply setting the high performance power plan and changing the following to 100% in the advanced power settings > Processor power management: Processor performance core parking min cores, Minimum processor state, Maximum processor state, and Processor performance core parking max cores; and, finally, setting Allow Throttle States to Off. I wonder how many people unknowingly still have parked cores after being misled by the seemingly universally incompetent unparking programs out there, like I had been for a long time. Complicating the matter, there didn't seem to be an easy-to-find, straightforward resource that laid out how to manually unpark your CPU cores.


Interesting. I've always gone the simple route of using resmon.exe to look for parked cores. The unpark app edits the registry values associated with the parking power settings, though, so it is essentially the same as doing it from the advanced power plan options, but maybe you are on to something. It likely depends on the CPU as well: my AMD Phenom processor is not parked at all in Windows to begin with; my Intel i5 I had to manually force not to park.
Quote:


> -Why use TimerTool to adjust the timer resolution? Since the setting doesn't stick if you close the program, you'd have to keep it open and minimized to keep timer resolution at 0.5ms. Why not use the SetTimerResolution service instead? http://forums.guru3d.com/showthread.php?t=376458


Will read into that. Although TimerTool is as comfy as it gets, really. The program itself requires next to no resources, you can adjust and read the timer easily and quickly, and closing the program also resets the timer, which is important so you don't forget to set it back and have your PC waste power needlessly.
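For reference, what the SetTimerResolution service (and TimerTool) does boils down to calling the undocumented NtSetTimerResolution export in ntdll; the request is per-process and released when the process exits, which is exactly why the setting doesn't stick once the program closes. A minimal Python/ctypes sketch, with function names of my own choosing and Windows-only behaviour guarded:

```python
import sys

def ms_to_100ns(ms):
    """Convert milliseconds to the 100-nanosecond units NtSetTimerResolution expects."""
    return int(ms * 10_000)

def request_timer_resolution(ms=0.5):
    """Ask the kernel for a finer global timer resolution (Windows only).

    The request is released when this process exits, which is why TimerTool
    has to stay open and why closing it resets the timer.
    """
    if sys.platform != "win32":
        raise OSError("NtSetTimerResolution is only available on Windows")
    import ctypes
    actual = ctypes.c_ulong()
    # NtSetTimerResolution(DesiredResolution, SetResolution, &ActualResolution)
    ctypes.windll.ntdll.NtSetTimerResolution(
        ms_to_100ns(ms), True, ctypes.byref(actual)
    )
    return actual.value / 10_000  # resolution actually granted, in ms
```

The unit conversion is the only portable part: 0.5ms corresponds to 5000 of the 100-ns units.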
Quote:


> -Trying to set core affinity for a service is futile because it is wiped away after every restart, and I couldn't find a convenient program or method for retaining it. Furthermore, it made absolutely no measurable difference in performance in my short time experimenting with it, and I only see core affinity tweaks being useful as a low-end system optimization. So I would personally just toss out that recommendation.


You can set the affinity from the registry to have it persist. And while I see benefits in moving the vast number of interrupts and tasks associated with, for example, audio processing away from the core that handles USB events, we don't know which tasks they will then conflict with on another core. So yeah, affinity is not the most obvious tweak and would require more research into things like the core behaviour of the CPU (sharing, splitting, multithreading of tasks) and the behaviour of software running across multiple cores/threads, while ultimately not being the most significant factor really.
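For what it's worth, the mechanics of pinning a process are simple: Windows' SetProcessAffinityMask takes a bitmask in which bit N set means core N is allowed. A hypothetical sketch (function names are mine; as noted above, this does not persist for services across reboots):

```python
import sys
from functools import reduce

def affinity_mask(cores):
    """Build the bitmask SetProcessAffinityMask expects: bit N set = core N allowed."""
    return reduce(lambda mask, core: mask | (1 << core), cores, 0)

def pin_current_process(cores):
    """Restrict the current process to the given cores (Windows only).

    Moving work onto a core also means competing with whatever already
    runs there, which is the caveat discussed above.
    """
    if sys.platform != "win32":
        raise OSError("SetProcessAffinityMask is Windows-specific")
    import ctypes
    handle = ctypes.windll.kernel32.GetCurrentProcess()
    ctypes.windll.kernel32.SetProcessAffinityMask(handle, affinity_mask(cores))
```

For example, `affinity_mask([2])` yields `0b100`, restricting execution to the third core.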


----------



## VolsAndJezuz

Thanks for your detailed reply.
Quote:


> Interesting. I've always gone the simple route of using resmon.exe to look for parked cores. The unpark app edits the registry associated with the parking power settings though, so it essentially is the same as doing that from the advanced power plan features, but maybe you are on to something. It likely depends on the CPU as well. My AMD Phenom processor is not parked at all in Windows to begin with, my Intel i5 I had to manually force not to park.


At least on my setup (Win7 x64, i7-3770k, Z77), CPUUnparkApp would change some of the CPU percentages, but for instance didn't touch the Processor performance core parking min cores setting, which is perhaps the most important one to change to 100%, as the max cores should already be 100% and the min/max processor states are more standard options. CoreParkingManager didn't even do that much; it literally only changed the min and max processor states, which is pretty worthless for an "unparking" program. Maybe they are more successful in other hardware/OS scenarios, but given that they are not effective in a case as common as mine, it would probably be a better generalization to recommend the full gamut of manual settings I outlined previously.
Quote:


> Will read into that. Although, TimerTool is as comfy as it gets really. The program itself does not require resources, you can adjust and read the timer easily and quickly. And closing the program will also reset the timer. Which is important so you don't forget to set it back and have your PC waste power needlessly.


I guess it just annoys me to have something that's open and running, but only minimized to the taskbar and not the tray...picky picky. But also I felt it was somewhat resource inefficient for what it was doing, being that it took up 6MB of RAM. But really not that big of a deal. SetTimerResolution as a service is convenient for me because it's completely out of my hair and in the background. I have it set to Manual, then I have a 'high performance' script that runs on startup and on demand that sets my static GPU overclock, sets the High Performance power plan, and starts the STR service. Then I have an 'energy saver' script that sets a much lower static GPU clock, changes to the Balanced power plan, and stops the STR service, to accommodate everyday computing. Either way ultimately accomplishes the same end goal, it seems.


----------



## HAGGARD

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> Thanks for your detailed reply.
> At least on my setup (Win7 x64, i7-3770k, Z77), CPUUnparkApp would change some of the CPU percentages, but for instance didn't touch the Processor performance core parking min cores setting, which is perhaps the most important one to change to 100%, as the max cores should already be 100% and the min/max processor states are more standard options. CoreParkingManager didn't even do that much; it literally only changed the min and max processor states, which is pretty worthless for an "unparking" program. Maybe they are more successful in other hardware/OS scenarios, but given that they are not effective in a case as common as mine, it would probably be a better generalization to recommend the full gamut of manual settings I outlined previously.


CPU throttling should already be disabled in the high performance power plan, and minimum/maximum processor state should also already be at 100% there. Were those different for you?
The "min. not-parked cores" setting always goes back to 10% for me; how did you change that?
The CPU Unpark App changes ValueMax to 0, which changes max. not-parked cores to 100%, essentially saying maximum amount of parked cores = 0%.
Where do you specifically see whether cores are parked or not? I just assumed if resmon doesn't specify any core as parked there is no parked core.
Quote:


> I guess it just annoys me to have something that's open and running, but only minimized to the taskbar and not the tray...picky picky. But also I felt it was somewhat resource inefficient for what it was doing, being that it took up 6MB of RAM. But really not that big of a deal. SetTimerResolution as a service is convenient for me because it's completely out of my hair and in the background. I have it set to Manual, then I have a 'high performance' script that runs on startup and on demand that sets my static GPU overclock, sets the High Performance power plan, and starts the STR service. Then I have an 'energy saver' script that sets a much lower static GPU clock, changes to the Balanced power plan, and stops the STR service, to accommodate everyday computing. Either way ultimately accomplishes the same end goal, it seems.


I meant no resources as in CPU time, RAM usage is not really significant as you said. But automated scripts do sound tasty, I still have to do the steps manually. Then again, I don't game too often so it's bearable.
I can also understand the wish not to have the tool open in the background.


----------



## banjogood

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> I'm thinking this is because I'm using the somewhat old v11.x Intel RAID ROM and RST drivers that are vastly superior to newer alternatives for my Z77 chipset


More info on this? I have a Z77 motherboard from Asrock.


----------



## VolsAndJezuz

Quote:


> Originally Posted by *arsn*
> 
> More info on this? I have a Z77 motherboard from Asrock.


http://www.win-raid.com/t362f23-Performance-of-the-Intel-RST-RSTe-AHCI-RAID-Drivers.html
"The "classical" Intel RST driver/ROM combo v11.2.x.xxxx is by far the best performant Intel RAID driver for my Z77 system. Especially the WRITE scores are much higher than with all other tested combos."

http://www.win-raid.com/t25f23-Which-are-the-quot-best-quot-Intel-AHCI-RAID-drivers.html
>Intel RST drivers v11.2.0.1006 WHQL
>best suitable Intel RAID ROM: v11.2.0.1527
Quote:


> Originally Posted by *HAGGARD*
> 
> CPU throttling should already be disabled in the high performance power plan, and minimum/maximum processor state should also already be at 100% there. Were those different for you?
> The "min. not-parked cores" setting always goes back to 10% for me; how did you change that?
> The CPU Unpark App changes ValueMax to 0, which changes max. not-parked cores to 100%, essentially saying maximum amount of parked cores = 0%.
> Where do you specifically see whether cores are parked or not? I just assumed if resmon doesn't specify any core as parked there is no parked core.


For some reason none of the processor states showed up on my power plans until I performed the registry changes from the OP. Processor performance core parking min cores stays at 100% for me. I had to edit the 'ValueMax' REG_DWORD in HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Power\PowerSettings\54533251-82be-4824-96c1-47b60b740d00\0cc5b647-c1df-4637-891a-dec35c318583 to 64 in hexadecimal (100 decimal) to get that setting to stay. This is yet another complication I forgot; it should be included with the previous values I stated for unparking.

The CPU Unpark App incorrectly detects Parked/Unparked status and makes incorrect changes to the registry when using the Unpark All option, at least on my system. I'm guessing this is why the person released and linked the CPU Core Parking Manager V2, which still doesn't set all the needed values but at least doesn't set incorrect values. I don't think the ValueMax 0 works the way you said, and that's at least partly where these utilities are falling down. The values are confusing because 'Processor performance core parking min cores' means the minimum allowed % of UNparked cores (and similarly 'Processor performance core parking max cores' means the maximum allowed % of unparked cores, thus you want both of these at 100%).

Since this has stretched across multiple posts and is probably horrifically confusing, here's the way I would recommend going about unparking your CPU manually:
-Enable high performance power plan from Windows control panel > Power options
-Hit Windows key + R and type regedit, hit enter, and navigate to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Power\PowerSettings
-Hit Ctrl + F, type Attributes, and check only Values under 'Look at', and hit Enter to search
-Hit Enter, type in 2 for the Value data, and hit Enter again, then F3 to find the next occurrence of Attributes
-Repeat the previous step until you've searched through all the subkeys in the left pane contained under PowerSettings
-Navigate to HKLM\SYSTEM\CurrentControlSet\Control\Power\PowerSettings\54533251-82be-4824-96c1-47b60b740d00\0cc5b647-c1df-4637-891a-dec35c318583 and change the ValueMax to 64 in hexadecimal
-Go to Windows control panel > Power options > Change plan settings (High performance) > Change advanced power settings > Processor power management
-Set the following four Settings to 100%: Processor performance core parking min cores, Minimum processor state, Maximum processor state, and Processor performance core parking max cores; and, finally, set Allow Throttle States to Off
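The registry half of the steps above can also be scripted. Below is a hedged sketch using Python's winreg module, with the GUID paths taken directly from the steps above; the function names are mine, it must be run as admin on Windows, and you should back up the key before editing:

```python
import sys

# GUIDs from the steps above: Processor power management subgroup and the
# "core parking min cores" setting whose ValueMax must allow 100%.
SUBGROUP = "54533251-82be-4824-96c1-47b60b740d00"
SETTING = "0cc5b647-c1df-4637-891a-dec35c318583"

def power_setting_key(subgroup, setting):
    """Registry path of a power setting definition, relative to HKLM."""
    return (r"SYSTEM\CurrentControlSet\Control\Power\PowerSettings"
            + "\\" + subgroup + "\\" + setting)

def raise_value_max():
    """Set ValueMax to 0x64 (100 decimal) so 'min cores' can be held at 100%.

    Windows only; requires admin rights. Back up the key before editing.
    """
    if sys.platform != "win32":
        raise OSError("winreg is Windows-only")
    import winreg
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                        power_setting_key(SUBGROUP, SETTING),
                        0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "ValueMax", 0, winreg.REG_DWORD, 0x64)
```

After running this, the percentage settings still need to be applied in the advanced power plan options as described above.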


----------



## HAGGARD

Quote:


> For some reason none of the processor states showed up on my power plans until I performed the registry changes from the OP.


Oh, that is normal. Just that those in the performance plan should already be set to 100%, or "Off" for throttling. With the Attributes entry you just make them visible in the power plan settings, the associated values are set in the registry.
Quote:


> I don't think the ValueMax 0 works the way you said, and that's at least partly where these utilities are falling down. The values are confusing because 'Processor performance core parking min cores' means the minimum allowed % of UNparked cores (and similarly 'Processor performance core parking max cores' means the maximum allowed % of unparked cores, thus you want both of these at 100%).


That would indeed be interesting if it turns out true. The Unpark App sets that value to 0 to unpark, and to 100 to park cores, the exact opposite of what you are suggesting. The description is somewhat confusing... I mean, min. not-parked, max. not-parked; why couldn't they just have made those min. parked and max. parked? And maybe they even did, but the user-end description is screwed. Will try setting ValueMax to 100 later and min. not-parked to 100% in the panel and see whether resmon reports any parking.
Again, how are you detecting parked cores if not in the resource monitor? Mine doesn't show parked cores, and that is with ValueMax set to 0 as per the CPU Unpark App.
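To make the disputed semantics concrete, here is VolsAndJezuz's reading (min/max as percentages of UNparked cores) expressed as a small sketch; the exact rounding Windows applies is my assumption:

```python
import math

def unparked_bounds(total_cores, min_pct, max_pct):
    """Under the reading that 'min/max cores' are percentages of UNparked
    cores, return the (lowest, highest) number of cores Windows may leave
    unparked at any moment."""
    lo = max(1, math.ceil(total_cores * min_pct / 100))
    hi = max(1, math.ceil(total_cores * max_pct / 100))
    return lo, hi
```

With the 10%/100% defaults on an 8-core machine this gives (1, 8), i.e. Windows is free to park down to a single active core; with both settings at 100% it gives (8, 8), i.e. no parking at all, which is why both are recommended at 100% under that reading.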

Regarding affinity settings, I've dug up this: http://www.overclock.net/t/1536110/windows-audio-stack-sucks-anything-higher-than-stereo-16-bit-44-1k-has-higher-cursor-lag/100#post_23426443

There I performed tests in an in-game environment with and without audio activity. That's something you have to keep in mind: at idle, some settings may not have the largest impact on poll precision, but as soon as more activity is present they may. That's with the old MouseTester that didn't have a log option. Limiting the audio process to one core had positive effects there, but I didn't make sure to assign it to a core not dealing with USB events at the time. Will repeat those tests some time.


----------



## MaximilianKohler

Quote:


> Originally Posted by *Derp*
> 
> Disabling c6/c7 made an enormous improvement but I'm not ok with the processor sitting at full voltage and consuming fives times the power. The polling on my system is quite terrible with c6/c7 enabled.


Yeah, I'm in the same boat. I've been using the power saving options because I haven't found a need to overclock yet.

My asrock z87 motherboard has profiles I can save settings to, so I guess I'll resort to restarting and switching profiles often.
Quote:


> Originally Posted by *VolsAndJezuz*
> 
> then I have a 'high performance' script that runs on startup and on demand that sets my static GPU overclock, sets the High Performance power plan, and starts the STR service. Then I have an 'energy saver' script that sets a much lower static GPU clock, changes to the Balanced power plan, and stops the STR service, to accommodate everyday computing. Either way ultimately accomplishes the same end goal, it seems.


Could you upload that? I'm sure a number of us would be interested ^_^


----------



## Kitzstyle

good guide. too bad it comes only now. I've been working on optimizing my system like a maniac for the past 4 years, and I've done most of what's in this guide except for the autostart things. I only have CS:GO installed on my PC, and I never have more than one game installed at once. I'm a real latency nazi when it comes to my PC. Name anything and I've tried it when it comes to timings and so on. My system is now in a perfect state and I couldn't be happier.

And I don't see why people complain about power usage. I max everything I can. I would use a mini sun to power my PC if I could. It's not like we live in a third world country and can't afford electricity, which is nearly free of charge. :s


----------



## MaximilianKohler

Well this is very interesting...


Spoiler: Pre-tweaks



bcdedit /set useplatformclock true
HPET in BIOS ON

FinalMouse:


IE 3.0:


xvelocity comparisons: http://www.overclock.net/t/1531877/finalmouse-2015/1000_50#post_23817804





Spoiler: Post-tweaks



FM after changing windows power plan from "balanced" to "high performance", bcdedit /deletevalue useplatformclock, closing programs, *pre-restart*:


FM after restarting and disabling BIOS power saving/underclocking options:


HPET off in BIOS caused more and higher fluctuations, but I could still zoom in to certain areas where it was more or less the same as before.

IE 3.0 after tweaks, HPET in BIOS ON:


Post-tweaks *xvelocity* results:
FM:
Mostly all results look like this one now:


I got a couple in a row that looked similar to this, but I haven't been able to get any since then. I'm thinking I may have been lifting the mouse accidentally or something.


My 3.0 xvelocity graphs look essentially the same, except when I go over 1.5m/s now they look different:
Before:


After:



I already knew that the power saving features had a negative effect on DPC latency, but here are the post-tweaks results:


Windows 8 adds 1000us, so whenever I would mention my DPC latency before, I would say "2-30" instead of "1002-1030". But as you can see, that's no longer doable since there are sub-1000 drops.

So before the tweaks I would get 1002-1030us. Now it's pretty stable around 1000, with occasional drops to the 800s, 900s, and even 2-5. HPET seems to have no effect on this computer, but it did lower DPC latency on my previous PC.



I'm definitely going to do some ingame testing to see how this has affected mouse performance.

EDIT:


Spoiler: added USB 3.0 comparisons



Plugged into a USB 3.0 (native http://www.asrock.com/mb/Intel/Fatal1ty%20Z87%20Killer/ ) slot with 3.0 enabled ("smart auto") in BIOS, we can see it's very slightly worse by about 0.03ms:


usb 2.0 port with 3.0 enabled in BIOS:


usb 2.0 port with 3.0 disabled in BIOS:


usb 2.0 port with 3.0 disabled in BIOS -very zoomed in to get rid of outliers


usb 3.0 port with 3.0 disabled in BIOS


I opted to leave 3.0 enabled ("smart auto") in BIOS _(because oddly enough there seem to be more high deviations when 3.0 is disabled, and I have a usb 3.0 thumb drive)_, and have the mouse plugged into a 2.0 port. And I'll test 3.0 vs 2.0 ingame performance on this setting later on.



EDIT: A friend showed he got a significant difference when changing ports due to IRQ assignment. It's possible the difference I got on the 3.0 port was solely due to IRQ assignment rather than 3.0 performing differently from 2.0.


----------



## Melan

DPC Latency Checker doesn't work with Windows 8. Use LatencyMon instead.


----------



## VolsAndJezuz

Quote:


> Originally Posted by *MaximilianKohler*
> 
> Could you upload that? I'm sure a number of us would be interested ^_^


The problem is that it depends on the location of nvidiaInspector.exe, and the specific settings will vary wildly depending on your video card. I can give basic example scripts and people can try to customize them for themselves if they're so inclined. This is an AutoHotkey .ahk script that you would convert to an .exe using the converter included with the AutoHotkey download. Then you can create a shortcut and put it in the Startup folder if you want it to start at Windows startup, and pin it to the taskbar or copy it to the desktop for convenient switching.


Spoiler: High Performance



#NoEnv
#SingleInstance ignore
SetBatchLines -1
SetWorkingDir %A_ScriptDir%

OnExit, EOF

ErrorLevel := 0

Process, Exist, Energy Saving.exe
if ErrorLevel
{
MsgBox % "Energy Saving.exe is running, so High Performance.exe will now exit"
Goto, EOF
}

Sleep, 500

Run, %comspec% /c "C:\Windows\System32\powercfg.exe /setactive 8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c & "C:\Program Files (x86)\NVIDIA Inspector\nvidiaInspector.exe" -setBaseClockOffset:0`,0`,510 -setMemoryClockOffset:0`,0`,201 -setVoltageOffset:0`,0`,0 -setGpuClock:0`,2`,1216 -setMemoryClock:0`,2`,3305 -forcepstate:0`,2 & net start STR & exit", , Hide

Goto, EOF

EOF:
ExitApp



The High Performance script checks that the Energy Saving executable isn't already running, then sets the High performance power plan, sets the GPU and memory clocks for my GTX 680 (as I said, this part will vary according to your video card(s)... google how to overclock your specific GPU with nvidiaInspector), locks it into the highest GPU performance state, and starts the STR service (SetTimerResolution, downloaded from here: http://forums.guru3d.com/showthread.php?t=376458). This is just an example of what you might want to do with a 'High Performance' shortcut, but there are a lot more possibilities if you look into AutoHotkey scripting.


Spoiler: Energy Saving



#NoEnv
#SingleInstance ignore
SetBatchLines -1
SetWorkingDir %A_ScriptDir%

OnExit, EOF

ErrorLevel := 0

Process, Exist, High Performance.exe
if ErrorLevel
{
MsgBox % "High Performance.exe is running, so Energy Saving.exe will now exit"
Goto, EOF
}

Run, %comspec% /c "net stop STR & C:\Windows\System32\powercfg.exe /setactive 381b4222-f694-41f0-9685-ff5bb260df2e & "C:\Program Files (x86)\NVIDIA Inspector\nvidiaInspector.exe" -setBaseClockOffset:0`,0`,-164 -setMemoryClockOffset:0`,0`,-2294 -setVoltageOffset:0`,0`,0 -setGpuClock:0`,2`,543 -setMemoryClock:0`,2`,810 -forcepstate:0`,2 & exit", , Hide

Sleep, 500

Goto, EOF

EOF:
ExitApp



Similarly, the Energy Saving script checks that High Performance isn't already running, then stops the STR service, sets the Balanced power plan, and sets lower GPU and memory clocks for my GTX 680 so it's not being thrashed as much during everyday activities like web browsing, watching videos, and editing documents.


----------



## VolsAndJezuz

Also I want to say that I've tested more extensively having all compatible hardware in MSI-mode w/ HPET on versus legacy interrupt w/ HPET off, and I've settled back on legacy interrupt w/ HPET off in BIOS.

Even though MSI-mode w/ HPET on did give slightly more consistent USB polling precision, I found that the mouse movement was different in a negative way; it almost felt like it had smoothing or something. This was most evident when I did aim-training map tests and bunnyhopping in an offline server (I find the latter to actually be one of the best methods to test mouse responsiveness and input lag because of the precision required in timing the mouse-wheel jumps and synchronizing strafes). Furthermore, FPS tests with Fraps showed that a legacy interrupt w/ HPET off setup gave more consistent overall FPS, as with MSI-mode w/ HPET on there would be dips for several measurements that were considerably lower.
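Comparisons of "more consistent overall FPS" like this can be made less hand-wavy by summarizing a Fraps frametime log numerically; a small sketch, where the function name and the 1% cutoff are my own choices:

```python
def fps_stats(frame_times_ms):
    """Summarize a Fraps-style frametime capture: average FPS and the
    '1% low' FPS (the rate implied by the slowest 1% of frames).
    A setup with deep dips shows a much lower 1% low despite a similar average."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    worst = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps
```

For a perfectly steady 10ms-per-frame capture both numbers are 100 FPS; a single 20ms dip in a hundred frames barely moves the average but halves the 1% low.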

My theory is that MSI-mode w/ HPET on smooths out USB polling in a subtle manner, giving the consistently lower polling intervals but also adding a small amount of input lag and/or taking away some of the "rawness" in the aiming. Also, I imagine which of these two options is best depends heavily on your hardware, and to a lesser but still important extent on your Windows/mouse/game configuration, so it's something you really have to test out for yourself.

Regardless, after applying some of the other tweaks from the OP that I had previously overlooked, I now find 1000Hz polling consistency to be acceptable in both the MSI and legacy interrupt setups. Also of note is that locking the Windows timer to the lowest resolution is definitely advantageous with HPET on, but in my measurements it had a detrimental effect to the USB polling consistency in legacy interrupt w/ HPET off. So I am no longer forcing the sub-0.5ms timer resolution with the STR service or TimerTool.


----------



## qsxcv

wmo overclocked to 1000hz

something interesting happens between 3000 and 4000 

idk how many things i've tweaked, but i think i have most/all the power-saving stuff in bios disabled out of habit


----------



## banjogood

Quote:


> Originally Posted by *qsxcv*
> 
> 
> wmo overclocked to 1000hz
> 
> something interesting happens between 3000 and 4000
> 
> idk how many things i've tweaked, but i think i have most/all the power-saving stuff in bios disabled out of habit


that's weird, but except for that your graph looks very good. what motherboard do you have?


----------



## MaximilianKohler

Quote:


> Originally Posted by *qsxcv*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> wmo overclocked to 1000hz
> 
> something interesting happens between 3000 and 4000
> 
> idk how many things i've tweaked, but i think i have most/all the power-saving stuff in bios disabled out of habit


Someone else was getting odd anomalies like that due to something running in the background.


----------



## VolsAndJezuz

When you get something like that in MouseTester, be sure to take several measurements repeating the same motion. You want to find out how anomalous that incident was, or, worst case, how recurrent such fits are. Still, the update times are what I would consider in the realm of acceptable in that 'interesting' range. My general criterion for finding a given polling rate acceptable is whether the vast majority of the polling times are within +/-0.003ms of nominal, with a few strays at ~+/-0.005ms and the rare loners not much farther than +/-0.01ms. If a rate fails this criterion, drop to the next lower polling rate (assuming it passes).
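That criterion can be expressed as a quick check over a MouseTester interval log. The exact fractions below for "vast majority", "a few" and "rare" are my own quantification of the wording above, not anything from MouseTester itself:

```python
def polling_acceptable(intervals_ms, nominal_ms=1.0):
    """Check poll intervals against the rough criterion above: the vast
    majority within +/-0.003ms of nominal, a few strays up to ~+/-0.005ms,
    and only rare loners beyond +/-0.01ms."""
    devs = [abs(t - nominal_ms) for t in intervals_ms]
    n = len(devs)
    tight = sum(d <= 0.003 for d in devs) / n        # "vast majority"
    strays = sum(0.003 < d <= 0.005 for d in devs) / n  # "a few strays"
    loners = sum(d > 0.01 for d in devs)             # "rare loners"
    return tight >= 0.95 and strays <= 0.04 and loners <= max(1, n // 1000)
```

A capture of steady 1.000ms intervals passes; one where half the polls arrive at 1.5ms (i.e. a missed poll every other cycle at 1000Hz) fails, suggesting 500Hz instead.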


----------



## softskiller

Quote:


> Originally Posted by *HAGGARD*
> 
> Sound components are complex and resource-heavy.


Does it make a difference whether audio runs at 24-bit or 16-bit, 44/48 kHz?
For CS:GO, for example, people often suggest using 16-bit,
while 24-bit is the "standard" setting of my Realtek onboard chip.
Quote:


> Originally Posted by *VolsAndJezuz*
> 
> "The "classical" Intel RST driver/ROM combo v11.2.x.xxxx is by far the best performant Intel RAID driver for my Z77 system. Especially the WRITE scores are much higher than with all other tested combos."


Do you also suggest installing RST when no RAID is used, just to prefer the Intel SATA driver over the Microsoft one?
There are different discussions regarding performance and extra load.

Slightly better performance without the extra Intel driver layer,
but more "features" like Link Power Management for package C-state support.

EDIT: found an interesting user post http://www.win-raid.com/t362f23-Performance-of-the-Intel-RST-RSTe-AHCI-RAID-Drivers.html


----------



## VolsAndJezuz

I follow this advice from http://www.win-raid.com/t2f23-Intel-R-RST-RSTe-Drivers-newest-v-v-WHQL.html

"For RAID users I recommend to temporarily install the complete IRST/IRST(e) Drivers & Software Set just to get benefit of the "Write-Back-Caching" and the related write performance boost. After having enabled this feature, the "Intel(R) Rapid Storage Technology" software can be uninstalled from within the Control Panel."

And you definitely want to uninstall the RST software after that, because it eats up system resources and could therefore easily disrupt USB polling and/or add input lag. I've used RAID ever since I started tweaking/optimizing for gaming, so I'm not sure about the Intel SATA driver versus Microsoft's. I would tend towards the Intel SATA driver if I had to guess, but it may be something you just have to benchmark and test in gaming for your setup.


----------



## VolsAndJezuz

And you don't need any of the "extra" features from versions newer than v11.2.0.1006 if you're going for performance gaming optimization. They are mainly related to power saving and read/write speed for large files, neither of which is important for gaming.


----------



## deepor

Do you really need to install the full Intel software if you want to enable that write-back-cache setting? Does the normal Windows setting in the device manager for the disk controller not work with the Intel driver?

In any case... you should be aware that everything that's in the cache at the moment of a crash will be lost, so it might not be a good idea for most people. I use it because it makes things like compiling large projects faster, but I have a battery for the PC and always have a current backup, so I'd lose at most a handful of hours of work if something terrible happens. It's still a risky idea, because if something ever happens, the few split-seconds saved here and there could never add up to the time lost to dealing with a broken Windows or lost work.


----------



## MaximilianKohler

I don't have RAID, but I just uninstalled the "Intel(R) Rapid Storage Technology" software and it didn't positively affect my update time graphs.

EDIT: also, good news is that I can enable stepping in BIOS and keep Windows' power options on "high performance" and get the same results as if stepping were disabled in BIOS. So I can easily switch Windows' power options between performance and savings depending on whether I'm gaming or not. I'll check C-states in a sec.

Yep, confirmed Derp's experience: having C-states enabled in BIOS has negative impacts even with Windows on high performance, which keeps clock speed and voltage at exactly the same point as when C-states are disabled.

Fortunately, C-states don't seem that important, as stepping by itself underclocks the CPU and lowers voltage.


----------



## Oh wow Secret Cow

I love all the info on USB polling and how it relates to mouse movement, but I wish there was a simple step-by-step guide on how to maximize mouse responsiveness and reduce input lag. Seems like there's a lot of info spread out between here, r0ach's thread, random comments, etc., and some of it seems to be conflicting (on HPET especially).


----------



## qsxcv

1. don't visit this forum
2. don't think about it
3. ???
4. profit


----------



## MaximilianKohler

Well a lot of things vary from PC to PC, so you have to test them for yourself.


----------



## Oh wow Secret Cow

Quote:


> Originally Posted by *qsxcv*
> 
> 1. don't visit this forum
> 2. don't think about it
> 3. ???
> 4. profit


Too late, I already have nightmares about DPC latency and clown cursor


----------



## CorruptBE

Scary nightmares about clown cursor you say...



... sry, couldn't resist


----------



## cryptos9099

Quote:


> Originally Posted by *CorruptBE*
> 
> Scary nightmares about clown cursor you say...
> 
> 
> 
> ... sry, couldn't resist


He looks like my nephew... Nice guy... brings the steaks and really sharp knives for them...


----------



## MasterBash

Why is my first count like 250ms most of the time? It's like some kind of sleep mode, yet I am using the G302 atm so it shouldn't be a problem... Afterward, the rest are mostly fine. Out of 2000 counts, I get like 2-3 @ 5ms interval.

I guess it could be some BIOS power saving features.


----------



## VolsAndJezuz

Quote:


> Originally Posted by *Oh wow Secret Cow*
> 
> I love all the info on USB polling and how it relates to mouse movement, but I wish there was a simple step-by-step guide on how to maximize mouse responsiveness and reduce input lag. Seems like there's a lot of info spread out between here, r0ach's thread, random comments, etc., and some of it seems to be conflicting (on HPET especially).


I am going to do this eventually, including both sides of controversial settings like HPET, and citing the relevant sources or providing personally collected data to back up each recommendation. But it may be part of a new website. I'm in the process of evaluating options for starting it. If that doesn't happen then I will at least make it a thread on this forum.


----------



## VolsAndJezuz

Quote:


> Originally Posted by *MasterBash*
> 
> Why is my first count like 250ms most of the time? It's like some kind of sleep mode, yet I am using the G302 atm so it shouldn't be a problem... Afterward, the rest are mostly fine. Out of 2000 counts, I get like 2-3 @ 5ms interval.
> 
> I guess it could be some BIOS power saving features.


Because you are starting from 0 velocity. If you retest and include multiple times coming to a complete stop, you'll see the polling interval skyrocket each time. You'll also see high interval(s) if you change direction completely as there will be polling interval measurement(s) close to the point you change direction, at which time you have 0 horizontal velocity. This is because there are very few counts when the velocity is approaching 0, and counts and interval are inversely proportional. It's normal, expected behavior and nothing to worry about.
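The inverse relation between counts and reported interval that he describes can be sketched with a toy simulation (hypothetical numbers; it assumes the mouse only answers a poll once at least one full count has accumulated):

```python
# Toy model: a mouse polled every 1 ms only reports when at least one
# full count has accumulated, so near zero velocity the measured
# interval between reports grows to a multiple of the poll interval.
def report_intervals(velocities_counts_per_ms, poll_ms=1.0):
    """Time between reports for a stream of per-poll velocities."""
    intervals = []
    accumulated = 0.0   # fractional counts built up so far
    since_last = 0.0    # time since the last report
    for v in velocities_counts_per_ms:
        accumulated += v * poll_ms
        since_last += poll_ms
        if accumulated >= 1.0:             # >= 1 count -> report this poll
            intervals.append(since_last)
            accumulated -= int(accumulated)
            since_last = 0.0
    return intervals

print(report_intervals([5.0] * 10))   # fast swipe: [1.0, 1.0, ..., 1.0]
print(report_intervals([0.25] * 12))  # near-stop: [4.0, 4.0, 4.0]
```

At 5 counts/ms every poll has data and the measured rate is a clean 1000Hz; at 0.25 counts/ms a count only accumulates every fourth poll, so MouseTester would show 4 ms intervals even though nothing is wrong.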


----------



## MaximilianKohler

After uninstalling the Rapid Storage Technology software, I've been getting huge delays when accessing non-C drives... I'll probably just reinstall it since it didn't seem to have any negative effects when installed.

EDIT: ugh, I'm still getting the delays after reinstalling RST.

I think it might have been due to me resetting the "balanced" power plan to default and selecting it. I put the "turn off hard disk after" setting to never. Yep, that fixed it.


----------



## trism

Sorry for bumping the thread, but after I flashed a new BIOS version, I can't use HPET off at all. The timers are completely messed up; www.aimbooster.com for example is impossible to play. Timing with an Xperia Z2 and iPhone 4G, at one minute aimbooster already reports the time to be around 1:05-1:10, _even when I actually started the measurement with the phones before I started the game._ After a few moments of idle time, the Windows clock had advanced about three to four minutes compared to the Internet time sync. I tried re-flashing the older BIOS but it gives the same result. Feels like the invariant TSC is not working at all. Odd? I didn't change anything in Windows.

EDIT: Works fine in safe-mode.

EDIT2: looks like adding bcdedit /set useplatformclock false fixed the issue... odd, I never had it set (bcdedit /deletevalue) before I upgraded the BIOS and it was fine.


----------



## softskiller

@trism

I am also using aimbooster (every day) to test all kinds of combinations like energy schemes, polling rate, DPI, BIOS, and USB 2.0 and 3.0 settings and so on. Also, different Intel Management Engine drivers seem to influence USB mouse behaviour.

One instantly notices whether one gets like 80 or 160 hits in that flash game.

Very good to test mouse precision and latency.


----------



## CorruptBE

Quote:


> Originally Posted by *softskiller*
> 
> Very good to test mouse precision and latency.


Really isn't imo.

What it truly is imo is a test of focus.


----------



## softskiller

Well, I did this aimbooster thing several thousand times, and for me it's a good way to "feel" the difference between various settings and to warm up for matches to see if I have the right feel for my mouse.

Of course Win Aero should be disabled, otherwise you have vsync.


----------



## trism

Quote:


> Originally Posted by *softskiller*
> 
> @trism
> 
> I am also using aimbooster (every day) to test all kinds of combinations like energy schemes, polling rate, DPI, BIOS, and USB 2.0 and 3.0 settings and so on. Also, different Intel Management Engine drivers seem to influence USB mouse behaviour.
> 
> One instantly notices whether one gets like 80 or 160 hits in that flash game.
> 
> Very good to test mouse precision and latency.


Wasn't my point really, but yeah, I keep playing it for fun as well. I notice differences between different mice pretty easily when I get to the ~2 minute mark. Mostly issues with the mouse shape not fitting me, though. My highest is 2:39 with the KPM and I've never been able to get over 1:42 with the original FK.

@HAGGARD do you know anything about the timer issue I had with useplatformclock as default mode (not set) without HPET enabled on BIOS?


----------



## thrillhaus

@HAGGARD Are you aware of a service that if disabled, results in the message "This copy of windows is not genuine" appearing in the bottom right corner of Windows 7? My copy is indeed genuine and I started getting this after going through my services.


----------



## VolsAndJezuz

I did too. Not sure which service causes this, but if you simply right-click on Computer in the start menu and go to Properties (also accessible via Control Panel\All Control Panel Items\System), Windows realizes it's activated and the message disappears. It seems to randomly reappear every few days, but this always fixes it immediately. Strange.


----------



## cryptos9099

Windows Activation Technologies Service and Windows Process Activation Service (dependencies list: http://www.blackviper.com/windows-services/windows-process-activation-service/) need to be set to Manual. The Windows Process Activation Service should always be "Started" on boot.


----------



## thrillhaus

I found that starting "Software Protection" service fixes it.


----------



## Oh wow Secret Cow

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> I did too. Not sure which service causes this, but if you simply right click on Computer in the start menu and go to Properties (also can be accessed by Control Panel\All Control Panel Items\System), Windows realizes it's activated and the message disappears. It seems to randomly reappear every few days, but this always fixes it immediately. Strange


Any updates on the mouse optimization guide? Still would love to see it.

If you have time, you could do a short, sourceless guide just for people like me who'd like the cliffs on what tweaks to make.


----------



## Melan

I ran the G303 through MouseTester on USB 2 and 3 and got this.

USB 2


[graph image]

USB 3


[graph image]

Edit: No, this isn't one of those USB 2 vs 3 poop things. Just need someone with more of a clue than me to read those graphs. Preferably @HAGGARD.


----------



## qsxcv

do it a few more times; do they all look like that?


----------



## Melan

Yes.

Here are the graphs I did 5 mins ago.

USB 2


[graph image]

USB 3


[graph image]

It's like that on all polling rates available in LGS (125/250/500/1kHz)

Edit: Btw that "arrow" pattern is only on G303. My FK1 doesn't have it. I'll try booting into safe mode and see what graphs will be there.


----------



## Melan

Back from safe mode. It's VERY weird man.

USB 2


[graph image]

USB 3


[graph image]

Same USB 3 graph zoomed in.


[graph image]

Edit: Thought C1E could be the problem. Nope, it does the same even with all C-states off.
This weird pattern with USB 2 is what bothers me the most. I mean come on, what the hell.


----------



## qsxcv

do you have other usb devices also plugged in?


----------



## Melan

No.

Edit: From what I see with the safe mode graphs, it's probably some driver misbehaving. I'll just stick with USB 3 for now, and when I reinstall Windows I'll try testing with a fresh OS, if I don't forget. I have a lot of stuff installed, some of which is pretty aggressive when it comes to being disabled (like Autodesk license services) or tampered with.

The graphs I posted before are what MouseTester usually pumps out. Once it gave me a straight line with mere 5us differences across the whole test (compared to the usual 100us and 200us), and then it went back to utter mess.

Either way, to hell with this. If anyone has any clue what this could be, it would be nice to know.


----------



## VolsAndJezuz

Quote:


> Originally Posted by *Oh wow Secret Cow*
> 
> Any updates on the mouse optimization guide? Still would love to see it.
> 
> If you have time, you could do a short, sourceless guide just for people like me who'd like the cliffs on what tweaks to make.


I would just recommend using this thread in combination with r0ach's optimization thread to start off, testing conflicting recommendations for yourself to see which is better.

I don't really feel like I can even write a short version myself, because if I start out trying to do that, I know it will just end up being the full guide as I feel obligated to add explanations, etc. I'm hoping in the next few weeks I might find enough free time to do this.


----------



## softskiller

@Melan
What does "USB 3" mean?
That you just enabled USB 3.0 in the BIOS (enabled/smart auto/auto) and the mouse is plugged into a USB 2.0 port, or that the mouse is plugged into a USB 3.0 port with USB 3.0 enabled?

AND does "USB 2" mean USB 3.0 disabled via BIOS?

I also did lots of tests and plots today with various USB3.0 settings.


----------



## Melan

Nothing is disabled. USB 2 means Intel's USB 2 port, USB 3 means Intel's USB 3 port. I just plug my mouse there. That's it.


----------



## Wizerino

So you have a better time using 3.0 now?
I can't read these graphs; someone please elaborate so I know what I'm looking at.


----------



## Melan

I'll be using 3.0 until I figure out what is wrong with 2.0. All I know for sure is that it's not supposed to give out weird stuff like that. I also tried (properly now) doing the same test with the MX500 and FK1, and both mice gave the same result on USB 2.


----------



## deepor

Quote:


> Originally Posted by *Wizerino*
> 
> So you have a better time using 3.0 now?
> I can't read these graphs, someone please elaborate so i know what i'm looking at.


Open the pictures at original size so they are larger and not shrunk and blurry. Read the numbers on the vertical axis (look at the left side). You'll see that the scale of the graphs changes in each screenshot, so just looking at the picture the dots make and how that feels to you is not good.

It seems things are closer to the 1.0 millisecond value for each dot in his USB 3.0 graphs. The mouse hits the exact 1000 Hz rate more closely when plugged into those ports.
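If eyeballing the dots feels unreliable, MouseTester's data can also be summarized numerically; a minimal sketch (the exact export format is an assumption, here just a list of report timestamps in milliseconds):

```python
# Summarize polling consistency from report timestamps in milliseconds.
from statistics import mean, stdev

def poll_stats(timestamps_ms):
    """Return (mean, stdev, max) of the intervals between reports."""
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return mean(deltas), stdev(deltas), max(deltas)

steady = [0.0, 1.0, 2.0, 3.0, 4.0]    # clean 1000 Hz trace
jittery = [0.0, 1.0, 3.0, 4.0, 5.0]   # one 2 ms gap (missed report)
print(poll_stats(steady))   # (1.0, 0.0, 1.0)
print(poll_stats(jittery))  # (1.25, 0.5, 2.0)
```

A mean near 1.0 ms with a small stdev is what a "clean" 1000Hz graph looks like as numbers, and comparisons between ports stop depending on axis scaling.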


----------



## Melan

Just reset BIOS. The weird arrow pattern is gone, but USB 3 still produces a cleaner graph.


----------



## Wizerino

Hmmm, thank you for the explanation. This is the kind of thing worth investigating.


----------



## MaximilianKohler

You're not the first person who did a CMOS reset that fixed strange graphs. That's pretty interesting. I wonder if some of the stuff r0ach talks about is affected by the same mechanism.


----------



## MaximilianKohler

Something to note is that having other programs open, including games, affects the polling precision.

I know that before matches I used to restart my computer and only have open what I needed for the match, as that seemed to give the best performance. But even if you do that, the game itself is enough to significantly impact the polling precision. I did some testing on this yesterday with CS.


----------



## VolsAndJezuz

If games are noticeably affecting the polling precision, then I would say you were just borderline adequately configured for good 1000Hz polling, and the extra stress on the GPU, video drivers, and networking is enough to throw it into inconsistency.

If you can't maintain similar polling precision @ 1000Hz while in a game, then I would say either drop down to 500Hz and you will again get laser-beam precise polling, or go back through guides and people's suggestions in this thread and r0ach's optimization thread and see if there's anything you missed or haven't tested that could yield that last bit of consistency for in-game 1000Hz polling. Because I see little to no difference in my 1000Hz polling in or out of a game.

And you probably know this, but keeping anything as resource-intensive as a browser or unneeded voice program open during games is the easiest way to mess up 1000Hz polling. Maybe try fine-tuning process priorities with a utility like Prio or the new feature that will be in the upcoming v2.00b of my sourceGL software.
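The 500Hz fallback suggested above has a quantifiable cost: the latency added by polling is at most one full interval, so dropping from 1000Hz to 500Hz doubles the worst case from 1 ms to 2 ms (a toy calculation, not a measurement):

```python
# Latency added by USB polling at a given rate: worst case is one full
# interval (the input lands just after a poll), average is half of one
# (input timing uniformly distributed within the interval).
def polling_latency_ms(rate_hz):
    interval = 1000.0 / rate_hz
    return {"interval_ms": interval,
            "worst_case_ms": interval,
            "average_ms": interval / 2}

for hz in (125, 500, 1000):
    print(hz, polling_latency_ms(hz))
# 125 -> 8 ms worst case, 500 -> 2 ms, 1000 -> 1 ms
```

Whether a consistent 2 ms beats a jittery 0-1 ms is exactly the tradeoff being debated in this thread.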


----------



## banjogood

Quote:


> Originally Posted by *Melan*
> 
> Just reset BIOS. Weird arrow pattern is gone but USB 3 still produces cleaner graph.


I also had that exact same weird arrow pattern before. It wasn't always there, along with other weird stuff, and none of it showed up in safe mode. After a fresh Windows install it _seems_ to be gone.


----------



## x7007

Has anyone found a pattern to fix the issue with their computer? Something everyone can follow that will fix the issue, instead of testing 1000 things.

It should be easier if we knew the things that really matter and then followed up with computer-specific programs and settings. I didn't have many jumps before, but now I randomly get jumps to 1070 and such. I disabled XHCI and the ASMedia ports; still didn't help. I changed the mouse and keyboard to USB IRQ 23 instead of 16; still didn't help. I have NOD32 v8 and Malwarebytes Anti-Malware + Malwarebytes Anti-Exploit; I tried disabling them but nothing changed.

I can't find what's causing it.

Windows 7 x64 SP1, non-UEFI, transferred from HDD to SSD with Samsung Migration; it seems the partition is aligned, so no issue there.

I am overclocking my i7-3770K to 4.3GHz. I disabled ALL power saving in BIOS and use High Performance in Windows. I need LLC on Ultra High because the voltage goes too low and the CPU fails otherwise.

I tried HPET on vs. off; it didn't change anything, but in here you say On, so I'll keep testing with On.

Logitech G502 mouse with a SteelSeries SX pad.

Anyone know what could be the problem?

EDIT: Could the Asus Suite cause any issue? It seems like for the first 5 seconds after Windows loads it runs fine, then I feel the mouse movement change. It could be the Logitech Gaming Software or the Asus Suite, but I seem to need both of them in some way, and some people don't have the issues, or at least they don't know or didn't check.


----------



## banjogood

I haven't. My mouse polling is bad no matter what I've tried. Only safe mode gives me something nice.


----------



## x7007

Quote:


> Originally Posted by *arsn*
> 
> I haven't. My mouse polling is bad no matter what I've tried. Only safe mode gives me something nice.


As soon as I set the USB selective suspend setting to Enabled, instead of Disabled as recommended, I didn't get these jumps to high numbers anymore; the max was randomly 1040 as far as I could see, but it was random, and no more insane numbers like before.


----------



## banjogood

Quote:


> Originally Posted by *x7007*
> 
> As soon as I set the USB selective suspend setting to Enabled, instead of Disabled as recommended, I didn't get these jumps to high numbers anymore; the max was randomly 1040 as far as I could see, but it was random, and no more insane numbers like before.


that would be weird but I'll try.


----------



## Huzzaa

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> Very good thread. There were several good tips in the OP that I hadn't remembered to do on my new Win7 x64 install with my new hardware. And I decided to try HPET on + MSI mode for the first time (always felt the risk to an otherwise stable OS wasn't worth the rewards), and so far I'm very pleased. Aim does feel quite different though compared to the HPET off in BIOS + IRQ setup. It almost feels a little slower with HPET + MSI, but somehow not in a laggy way. It feels more solid and accurate but perhaps not quite as instantly responsive for flicking and fast-180s, etc. I'm still getting used to it and will go back and forth with HPET off + IRQ a few more times the next several days to decide on which I like better. But currently leaning towards HPET + MSI. Here is a quick comparison of 1000Hz polling rate for both situations (data exported from MouseTester so I could do additional analysis and more custom graphs). You can see the deviation has tightened nicely and 1000Hz feels acceptably steady enough to use over 500Hz to me with this new configuration. Also, I included a MouseMovementRecorder screencap to show what my typical 1000Hz polling rate data looks like now. It's noteworthy because of how much more tightly the reported rates stick right around 1000Hz now during fast mouse movement, where previously with HPET off + IRQ it was common if not frequent to see dips and peaks of 20, 30, or 40Hz+ away from the desired 1000Hz.
> 
> 
> 
> 
> 
> When I say 'MSI mode', here's how my PCI resources are. My HPET off + IRQ setup only had the Intel 82579V Gigabit Network adapter adopting MSI mode on its own. Note, I did try to get the SATA RAID controller to MSI mode, even installing Intel RST software in hopes it would do this or at least allow it, but I ended up only with boot BSODs that required System Restore to fully alleviate (Last Good Configuration gave BSOD-less reboot cycles). I'm thinking this is because I'm using the somewhat old v11.x Intel RAID ROM and RST drivers that are vastly superior to newer alternatives for my Z77 chipset, and I suspect upgrading RAID ROM + driver would allow MSI mode. It wouldn't be worth the tradeoff for me, especially because the SATA RAID controller doesn't have anything sharing its IRQ channel.
> 
> 
> 
> What might be hard to tell is the behavior when you change directions quickly, and the mouse momentarily is at rest. This data was from several fast side to side swipes with about 10 direction changes in all, at which you would expect the update time to be a multiple of the idealized 1ms update time when the mouse was at or near zero velocity momentarily. So those data points are cut off to focus on the behavior at the full-speed update interval. Previously, I had stuck with 500Hz polling rate because the polling consistency/update interval wasn't satisfactorily consistent, and this could be felt at the end of fast swipes where 500Hz felt precise and easy to stop on a dime where 1000Hz felt unwieldy and floaty. This was reflected in the data I collected for polling rate for both with my previous HPET off + IRQ setup, where 500Hz would come to a nice sudden and linear stop in this situation (_e.g_.: 500, 500, 500, 500, 250, -) where 1000Hz would flutter to a stop (_e.g_.: 1000, 1000, 1000, 500, 1000, 1000, 500, 250, 500, 125, -). Much to my surprise, with HPET + MSI the data from changes in direction and full stops look much more like the old 500Hz data, with only a couple of update times that were multiples of the 1ms idealized value for each instance; and, this feeling translates to in-game aim as well, where 1000Hz now feels appropriately steady and reliable at all speeds.
> 
> A few points in regards to the optimization suggestions in the OP:
> 
> -Intel Chipset Software/"Drivers" 9.x and 10.x almost literally change nothing besides the device name in Device Manager, as they are merely INF files and not drivers, hence if you already had a 9.x version or higher, upgrading to any newer version doesn't require a reboot. This applies to Intel SMBus Host Controllers and USB Host Controllers, so there is no need or point in trying to do any A/B comparisons to the default Microsoft Windows drivers versus the Intel "drivers".
> 
> -For Windows services, in addition to the services you listed I found the Dhcp (DHCP Client) and Dnscache (DNS Client) services to be necessary to have automatically started in order for my LAN to connect to the internet. And because I use Windows Firewall, also needed the MpsSvc (Windows Firewall) and BFE (Base Filtering Engine) services started automatically.
> 
> -CPUUnparkApp and for that matter any CPU unparking utility I could find were about as useful as counting your feet. ParkControl at least could accurately tell whether CPU cores were actually unparked or not, though it couldn't make the changes itself even with unblocking and running as admin. CPUUnparkApp and CoreParkingManager both would make inadequate changes to actually unpark the cores, and then would variously report for seemingly different reasons that the CPU was unparked. These were all bollocks so I think you should instead recommend the registry edits as you did, then follow that up with simply setting high performance power settings and changing the following to 100% in the advanced power settings > Processor power management: Processor performance core parking min cores, Minimum processor state, Maximum processor state, and Processor performance core parking max cores; and, finally, setting Allow Throttle States to Off. I wonder how many people unknowingly still have parked cores due to being misled by the seemingly universally incompetent unparking programs out there, like I had been for a long time. Complicating the matter is there didn't seem to be an easy-to-find, straightforward resource that laid out how to manually unpark your CPU cores.
> 
> -Why use TimerTool to adjust the timer resolution? Since the setting doesn't stick if you close the program, you'd have to keep it open and minimized to keep timer resolution at 0.5ms. Why not use the SetTimerResolution service instead? http://forums.guru3d.com/showthread.php?t=376458
> 
> -Trying to set core affinity for a service is futile because it is wiped away after every restart, and I couldn't find a convenient program or method for retaining it. Furthermore it made absolutely no measurable difference in performance from my short time experimenting with it, and I only see core affinity tweaks being useful as a low-end system optimization. So I would personally just toss out that recommendation.


What are the 2 PCI-to-PCI Bridges that you got in MSI?

I have my GPU, NIC and USB 3.0 Controller in MSI only.


----------



## VolsAndJezuz

Quote:


> Originally Posted by *Huzzaa*
> 
> What are the 2 PCI-to-PCI Bridges that you got in MSI?
> 
> I have my GPU, NIC and USB 3.0 Controller in MSI only.


I now have everything in legacy IRQ mode (see image below). At that time I had systematically gone through devices and attempted to set them all to MSI mode, and that picture shows which ones were able to be in MSI mode. So I'm not sure exactly what the PCI-to-PCI Bridges were, but I would guess they were for the second PCI-E slot and maybe a PCI slot. After more testing with HPET on + MSI mode versus HPET off + legacy IRQ, I found I preferred the latter in terms of mouse movement and more stable FPS.


----------



## Huzzaa

I suppose that is where you and I differ then since the moment I turn HPET off on my Mobo, my framerate starts dancing around the set limit.

For example:

HPET on in Bios and no tweaks in bcd. It's a solid whatever @ 119 main menu and 199 in-game, I'm limiting at 200.

HPET off in Bios, no tweaks in bcd. It starts fluctuating at 116-117 main menu and bounces all around in-game 170-199.

I also play with lightboost, and HPET off introduces a lot of stutter in general; with lightboost I can actually see how the picture tears and stutters with the frames.

I'll get home in about 8 hours, will take some pictures of my mousemovement performance as well and post them here. One noteworthy thing is that my polling picture is super solid.


----------



## Huzzaa

Actually, did the NIC operate properly after going back to legacy IRQ?

I did notice that going HPET off in BIOS, the NIC on MSI seemed to be a possible reason for a lot of frame stutter issues but I haven't tested it.

I'm deducing it from my gameplay, where I can see right now how the frames sometimes stutter between 197-199 and at times even 203-200-199 in momentary bursts, coinciding with audio streams that are heard for the first time (possibly first load) and bandwidth-heavy events.


----------



## VolsAndJezuz

If memory serves, I did get slightly higher average FPS with HPET on + MSI mode, but I would also get framerate dips much more often, and they would be much more severe in terms of how low the FPS would momentarily drop. So occasional, quite noticeable single stutters. With HPET off + legacy IRQ, I have no noticeable stuttering and very mild FPS dips. I no longer use Motion Blur Reduction (the lightboost equivalent for BenQ), so maybe with MBR/lightboost users could feel some microstuttering with my setup.

And yes, the NIC works well in both MSI mode and legacy IRQ for me. I used TCP Optimizer to dial in network settings that help with heavy bandwidth events and are ideal for a gaming network experience. The sacrifice is that my top-end speeds for downloads and HQ video streaming are slightly reduced, but well worth the tradeoff imo.


----------



## Karac

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> And yes NIC works well in MSI mode and legacy IRQ for me. I used TCP Optimizer to dial in network settings that helped with heavy bandwidth events and is ideal for gaming network experience. The sacrifice is my top end speeds for downloads and HQ video streaming is slightly reduced, but well worth the tradeoff imo.


I've read conflicting comments about TCP Optimizer. Some say it is useless at best, others say it could improve latency and connection quality in general. Is it useful even for a standard onboard LAN and a standard DSL connection?
Quote:


> Originally Posted by *HAGGARD*
> 
> Another potentially viable way to improve handling of interrupts is to resolve IRQ conflicts. In msinfo32.exe, look for components that share IRQ# under conflicts and see whether you can disable any, *change IRQs from your BIOS*, get your mouse registered on a host controller that doesn't share an IRQ# or try to see if any components support MSI mode (changes IRQ# as well).


Is this available on every BIOS/motherboard? I didn't find any such option or setting whatsoever.


----------



## VolsAndJezuz

If you have an Intel NIC, then the optimizations are probably more noticeable. Other NICs have worse performance hits from the driver implementations that make a bigger negative difference than can be gained from TCP Optimizer, in my opinion of course.

And no, I would say it's an option most BIOS/mobos do NOT have.


----------



## x7007

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> If you have an Intel NIC, then the optimizations are probably more noticeable. Other NICs have worse performance hits from the driver implementations that make a bigger negative difference than can be gained from TCP Optimizer, in my opinion of course.
> 
> And no, I would say it's an option most BIOS/mobos do NOT have.


Intel network cards, especially the built-in ones, are really bad. I've bought a server network card and now I never see a DPC interrupt complaint from LatencyMon about NDIS.SYS. The card is a Solarflare SFN5122F SFP+ 10Gb card, but I'm connected using a normal 1Gb SFP port; I don't have a 10Gb switch.
I wonder if we are able to fix the issue with the nvlddmkm.sys file; it always reaches the highest reported DPC routine execution time, which NDIS.sys was fighting with at the same numbers, over 1450.444653.

What should the number for CPU Power be when fixed? Currently it's on Auto I think, and if it's on Manual then it gives the number 300. I have a 3770K @ 4.3; I need 1.260 manual voltage + LLC Ultra High to keep it steady.

How can I use the CPU offset voltage? I never know what number to pick so it will be higher, and by how much +


----------



## scorpinot

http://rog.asus.com/312772014/labels/guides/tried-and-tested-why-intel-ethernet-is-still-better-for-gaming/


----------



## VolsAndJezuz

Quote:


> Originally Posted by *x7007*
> 
> *Intel network cards, especially the built-in ones, are really bad*. I've bought a server network card and now I never see a DPC interrupt complaint from LatencyMon about NDIS.SYS. The card is a Solarflare SFN5122F SFP+ 10Gb card, but I'm connected using a normal 1Gb SFP port; I don't have a 10Gb switch.
> I wonder if we are able to fix the issue with the nvlddmkm.sys file; it always reaches the highest reported DPC routine execution time, which NDIS.sys was fighting with at the same numbers, over 1450.444653.


Incorrect. The Intel NICs are easily the best available, but they have really bad *drivers* past a certain version. I have Intel Network Connections Version 19.0c, and as you can see, I have wonderful DPC numbers:

[screenshot]

However, any drivers I've tried newer than 19.0c had a catastrophic effect on DPC latency and highest reported DPC execution time. Massive periodic spikes on the scale you cited when concluding that "Intel network cards ... are really bad."

So may I humbly suggest that you uninstall all existing Intel Network drivers and try 19.0c before making such sweeping claims.








Quote:


> Originally Posted by *scorpinot*
> 
> http://rog.asus.com/312772014/labels/guides/tried-and-tested-why-intel-ethernet-is-still-better-for-gaming/


Excellent link, thanks for that.


----------



## x7007

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> Incorrect. The Intel NICs are easily the best available, but they have really bad *drivers* past a certain version. I have Intel Network Connections Version 19.0c, and as you can see, I have wonderful DPC numbers:
> 
> 
> 
> However, any drivers I've tried newer than 19.0c had a catastrophic effect on DPC latency and highest reported DPC execution time. Massive intervaled spikes on the scale you cited in determining "Intel network cards ... are really bad."
> 
> So may I humbly suggest that you uninstall all existing Intel Network drivers and try 19.0c before making such sweeping claims
> 
> 
> 
> 
> 
> 
> 
> 
> Excellent link, thanks for that.


Please look up Solarflare network cards.
They are among the best, and comparing them to Intel's onboard NICs is like comparing an onboard sound card to an external one or an amplifier. The card has real offloading. If you want more answers, tell me and I'll send you information about it. But the post you linked doesn't say anything about downloading at 500 Mb and 1 Gig.

Cloudflare is a leading cloud service provider, and you can guess what they recommend.
https://blog.cloudflare.com/a-tour-inside-cloudflares-latest-generation-servers/

Running a DPC test for 1 minute doesn't say anything. Run it for 10 hours at least.


----------



## cryptos9099

Quote:


> Originally Posted by *x7007*
> 
> Please look up Solarflare network cards.
> They are among the best, and comparing them to Intel's onboard NICs is like comparing an onboard sound card to an external one or an amplifier. The card has real offloading. If you want more answers, tell me and I'll send you information about it. But the post you linked doesn't say anything about downloading at 500 Mb and 1 Gig.
> 
> Cloudflare is a leading cloud service provider, and you can guess what they recommend.
> https://blog.cloudflare.com/a-tour-inside-cloudflares-latest-generation-servers/
> 
> Running a DPC test for 1 minute doesn't say anything. Run it for 10 hours at least.


http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100006519%2050014521%2040000027&IsNodeId=1&SubCategory=27&Manufactory=14521&SpeTabStoreType=1

Not going to pay that much for a NIC when the onboard Intel NIC satisfies most consumer-grade applications (hence why they "are the best").


----------



## deepor

Quote:


> Originally Posted by *x7007*
> 
> Please look up Solarflare network cards.
> They are among the best, and comparing them to Intel's onboard NICs is like comparing an onboard sound card to an external one or an amplifier. The card has real offloading. If you want more answers, tell me and I'll send you information about it. But the post you linked doesn't say anything about downloading at 500 Mb and 1 Gig.
> 
> Cloudflare is a leading cloud service provider, and you can guess what they recommend.
> https://blog.cloudflare.com/a-tour-inside-cloudflares-latest-generation-servers/
> 
> Running a DPC test for 1 minute doesn't say anything. Run it for 10 hours at least.


I fear you are making a mistake by looking at servers. The latency on servers seems to be terrible. On workstation and server motherboards, it seems it's not unusual to see DPC latency of 300 µs for example. You will also see things like a board needing over 60 seconds to boot.

The goals on servers seem to be very different than what you want on a desktop. You should be careful if benchmarks are researching throughput. Configuring the Linux kernel for low latency will reduce throughput, so this is not what's done. The reverse is done, and to increase throughput, you intentionally sabotage latency.

To get best throughput, the idea is to try to let the program processes keep the CPU and not interrupt them until they themselves are giving up CPU time. This is the reverse of what you want on a desktop PC. For a snappy feel, you want that new events immediately get time on a CPU so that for example mouse updates get processed with little delay. You want processes to get interrupted, and that reduces their throughput.


----------



## VolsAndJezuz

Quote:


> Originally Posted by *x7007*
> 
> It's like comparing an onboard sound card to an external one or an amplifier.
> 
> ...
> 
> But the post you linked doesn't say anything about downloading at 500 Mb and 1 Gig.
> 
> ...
> 
> Running a DPC test for 1 minute doesn't say anything. Run it for 10 hours at least.


Okay, no offense, but I can really tell you don't know what you're talking about with the sound card comparison. Take it from someone who has spent many years optimizing computers to work with Cubase using a wide array of different interfaces. I just... can't.

...

I could not care less about DPC performance while downloading large files or at high speed on my gaming setup, as that would not happen during gaming. But I would bet it's not far off.

...

That test was for 5 minutes not one minute. Clearly you read it very carefully. And I'll do no such thing to waste the precious lifetime of my hardware, in order to learn not much about something I don't even want or need to know.


----------



## softskiller

Hello,

is it normal that USB keyboards and mice that support up to a 1000Hz report rate only show up in tools like HWinfo64 with

*USB Device Speed: USB1.1 Full-speed*

instead of USB2.0 High-speed?

Someone wrote that HID devices only work at USB 1.1 speed.
But I don't know if this is true, even for 1000Hz gear.


----------



## Huzzaa

Heya, I was supposed to post my USB rates.

The 1st picture is with audio playback enabled. Do note there was a 1.2kHz spike in there as well, but it's usually stable; the spike at the start is the audio initialization.

2nd pic with nothing:



Respectively for bigger images:

http://i.imgur.com/OEltUoT.png

http://i.imgur.com/G8nbdcz.png


----------



## povohat

Quote:


> Originally Posted by *softskiller*
> 
> Hello,
> 
> is this normal that USB keyboard and mouse that support up to 1000Hz report rate only show up in tools like HWinfo64 with
> 
> *USB Device Speed: USB1.1 Full-speed*
> 
> instead of USB2.0 High-speed?
> 
> Someone wrote that HID devices only work at USB 1.1 speed.
> But I don't know if this is true - also for 1000Hz gear.


Some HID devices are USB 2.0, but USB 2.0 is not required to achieve 1000Hz polling


----------



## softskiller

Quote:


> Originally Posted by *povohat*
> 
> Some HID devices are USB 2.0, but USB 2.0 is not required to achieve 1000Hz polling


I want to add that even for my Logitech G502 - a pretty modern, higher-end mouse where one would assume USB 2.0 is fully supported and used - it still says USB 1.1 speed is in use.

It also says that USB 2.0 is supported. So I would like to know if this is normal and whether such gaming mice show up as running at USB 1.1 speed for everyone.

I want to be sure, because I am suffering from mouse lag and have tried everything so far.


----------



## deepor

It's normal. Mice and keyboards are USB 1.1 devices.


----------



## Scrimstar

is this ok

http://gyazo.com/c8519a75e6074f87c6ee812d46b68f4f


----------



## Wizerino

yes its ok.
if you move mouse super fast it will go over 1000hz..
as long as you dont have some interference as in "mouse shuts down" or something similar...those response rates are just fine.


----------



## qsxcv

Quote:


> Originally Posted by *Scrimstar*
> 
> is this ok
> 
> http://gyazo.com/c8519a75e6074f87c6ee812d46b68f4f


yea
Quote:


> Originally Posted by *Wizerino*
> 
> yes its ok.
> if you move mouse super fast it will go over 1000hz..
> as long as you dont have some interference as in "mouse shuts down" or something similar...those response rates are just fine.


no, if the mouse supports 1000hz report, you just need to make sure you move more than 1 count in every 1ms; no need for super fast.

the reason it goes over 1000hz in that program is because there is a bit of jitter


----------



## Wizerino

Quote:


> Originally Posted by *qsxcv*
> 
> yea
> no, if the mouse supports 1000hz report, you just need to make sure you move more than 1 count in every 1ms; no need for super fast.
> 
> the reason it goes over 1000hz in that program is because there is a bit of jitter


what do you mean "1 count in every ms" ?
can u post your results?
i dont feel any jitter on my mouse if you're saying there is a bit of a jittering. i must say i dont feel it. and i'm quite satisfied with how mouse performs.
can u cap it at 1000hz?


----------



## deepor

Quote:


> Originally Posted by *Wizerino*
> 
> what do you mean "1 count in every ms" ?
> can u post your results?
> i dont feel any jitter on my mouse if you're saying there is a bit of a jittering. i must say i dont feel it. and i'm quite satisfied with how mouse performs.
> can u cap it at 1000hz?


He meant, when you look at the screenshot the question was about, you see that there are a bunch of lines with 500 Hz, 333 Hz, etc. That happens when you move the mouse slow enough that there simply is no update to send when the next 1000 Hz tick is happening. You can see that there's just a +1 or -1 change for the X or Y coordinate sent when that happens.
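In other words, those lines are just the reciprocal of the gap between two consecutive reports. A tiny sketch of that arithmetic (the gap values here are made up for illustration):

```python
# Made-up gaps (ms) between reports from a 1000 Hz mouse. When the mouse
# moves too slowly to have a new count ready at every 1 ms tick, a report
# is simply skipped, so the measured gap doubles or triples.
gaps_ms = [1.0, 1.0, 2.0, 1.0, 3.0]

for gap in gaps_ms:
    print(f"{gap:.0f} ms gap -> {1000.0 / gap:.0f} Hz line in the plot")
```

So a 2 ms gap prints as a 500 Hz point and a 3 ms gap as a 333 Hz point, even though the mouse is still being polled at 1000 Hz the whole time.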


----------



## Wizerino

oooooh i see
thanks man


----------



## pinobot

Came across this:
https://en.wikipedia.org/wiki/USB
Quote:


> Latency
> For USB1 low-speed (1.5 Mbit/s) and full-speed (12 Mbit/s) devices the shortest time for a transaction in one direction is 1 ms.[129] USB2 high-speed (480 Mbit/s) uses transactions within each micro frame (125 µs)[130] where using 1-byte interrupt packet results in a minimal response time of 940 ns. 4-byte interrupt packet results in 984 ns.[131]


----------



## qsxcv

don't think that's worded correctly. should be something like "the shortest time between transactions in one direction is 1 ms" i.e. minimum polling period = 1ms


----------



## Melan

That makes me wonder if the ASUS Gladius (or whatever it's called) is a USB 2.0 device then? Since, you know, 2kHz and stuff.


----------



## qsxcv

probably

can someone with a gladius and a stable computer (i.e. less than +-10us jitter in mousetester when using a mouse that does 1000 fine) plot a mousetester interval plot of the gladius at 2000hz?


----------



## HAGGARD

Quote:


> Originally Posted by *Melan*
> 
> That makes me think if ASUS Gladius (or whatever it's called) is an USB 2.0 device then? Since you know 2khz and stuff.


Yes, here: https://msdn.microsoft.com/en-us/library/windows/hardware/ff539317%28v=vs.85%29.aspx
High-Speed, i. e. USB 2.0, can utilize microframe timings. Theoretically it should be possible to write USB filter drivers to force the host to service any device as a USB 2.0 device. That's what SweetLow does for Low-Speed devices - "masks" them as Full-Speed. As outlined here: http://www.overclock.net/t/1549979/differences-between-all-the-ime-3-0-variants/30#post_23770865
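For reference, the interval the host polls an interrupt endpoint at comes from the endpoint's bInterval descriptor field, and the USB 2.0 spec interprets it differently per speed. A rough sketch of that arithmetic (the example values below are just illustrations, not any particular mouse's descriptor):

```python
def poll_interval_ms(b_interval: int, speed: str) -> float:
    """Interrupt endpoint polling period per the USB 2.0 spec."""
    if speed == "full":
        # Full-Speed: bInterval counts 1 ms frames directly.
        return float(b_interval)
    if speed == "high":
        # High-Speed: 2**(bInterval - 1) microframes of 125 us each.
        return (2 ** (b_interval - 1)) * 0.125
    raise ValueError(f"unhandled speed: {speed}")

print(poll_interval_ms(1, "full"))   # one 1 ms frame: 1000 Hz is the cap
print(poll_interval_ms(1, "high"))   # one 125 us microframe
print(poll_interval_ms(4, "high"))   # 8 microframes, back to 1 ms
```

So a mouse serviced as a Full-Speed device bottoms out at a 1 ms interval (the 1000Hz ceiling), while a High-Speed device can in principle be polled every 125 µs microframe.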


----------



## Kitzstyle

Quote:


> Originally Posted by *Karac*
> 
> I've read conflicting comments about TCP Optimizer. Some say is useless at best, others say it could improve latency and connection quality in general. Is it useful even for a standard on board LAN and a standard DSL connection?
> Is this allowed for every BIOS/Motherboard? I didn't find any option nor setting whatsoever.


I improved my ping by ~3ms with TCP Optimizer. I don't know if I can go much lower, as I have 5ms ping in CS:GO now. My settings are purely for gaming; downloads accelerate a bit more slowly but peak at 120Mbit. I have an onboard Intel NIC and 100/100 Mbit fiber in Sweden.


----------



## dmbr

Quote:


> Originally Posted by *Kitzstyle*
> 
> I improved my ping with ~3ms with TCP optimizer. I don't know if I can go much lower as I have 5ms ping in CSGO now. My settings are purely for gaming and downloading are a bit slower in acceleration but peak at 120mbit. I have an onboard intel Nic and 100/100 Mbit fiber optic in Sweden.


CS:GO uses UDP not TCP.


----------



## qsxcv

keep in mind that ping in csgo isn't always consistent (i think due to the client and server ticks not perfectly lining up). so if you want to compare before and after doing whatever optimizations, make sure to measure several times


----------



## softskiller

Those with Win 10 and enabled Intel USB3.0: which driver is better:

the new one that comes with Win10 USBXHCI.sys from 07/2015

or the older official Intel one from 04/2015?

Intel(R)_USB_3.0_eXtensible_Host_Controller_Driver_3.0.5.69

I think I'll keep the one Windows uses.


----------



## CookieBook

Could someone do me a favour and upload their advanced power settings window with processor settings? Mine is Dutch and I don't understand.


----------



## softskiller

I would, but my Haswell has far fewer entries under CPU:

Only minimum processor state, system cooling policy and maximum processor state.

You can also find screenshots if you google for advanced energy settings.


----------



## baskinghobo

how does it affect ur ping in csgo then if csgo uses udp?


----------



## banjogood

Quote:


> Originally Posted by *baskinghobo*
> 
> how does it affect ur ping in csgo then if csgo uses udp?


hint: it doesnt


----------



## treach

should polling be stable in safe mode?

cause mine isn't... Where should I look for optimization now? BIOS settings don't seem to change anything either...


----------



## Melan

What OS?


----------



## treach

Quote:


> Originally Posted by *Melan*
> 
> What OS?


win 10, same before 8.1


----------



## Melan

Polling should always be stable regardless of mode. I reckon the problem with W10 is the USB driver, which makes the mouse go nuts, at least on my P8Z77-V Pro motherboard.

When I tested the polling rate on W10 it spiked as high as 5kHz (with a 1kHz mouse), so yeah. Windows 10 is a no-go for gaming until the Threshold 2 release.


----------



## treach

Quote:


> Originally Posted by *Melan*
> 
> Polling should be always stable regardless of mode. Problem with W10 is the USB driver I recon, which makes mouse go nuts at least on P8Z77-V Pro motherboard.
> 
> When I tested polling rate on W10 it went as far as 5khz (that's with 1khz mouse), so yeah. Windows 10 is a no go for gaming until threshold 2 release.


it has nothing to do with OS in my case...


----------



## trism

Quote:


> Originally Posted by *Melan*
> 
> Polling should be always stable regardless of mode. Problem with W10 is the USB driver I recon, which makes mouse go nuts at least on P8Z77-V Pro motherboard.
> 
> When I tested polling rate on W10 it went as far as 5khz (that's with 1khz mouse), so yeah. Windows 10 is a no go for gaming until threshold 2 release.


Not happening for me. Most likely an issue on your system.


----------



## Melan

Quote:


> Originally Posted by *treach*
> 
> it has nothing to do with OS in my case...


How badly "unstable" is it? Make some interval graphs in MouseTester. My mouse polling in Windows 8.1 is "good enough": it's not a constant 1 ms but has small deviations of +/- 0.020 ms. Some people get it down to 0.005 ms, but that's just overkill that involves disabling a lot of stuff.


----------



## treach

well i got it to mostly +-10us now, C3 state was the main problem


----------



## Huzzaa

Quote:


> Originally Posted by *Melan*
> 
> How badly "unstable" is it? Make some interval graphs in mouse tester. My mouse polling in Windows 8.1 is "good enough", that's not consistent 1 but with small deviations +/- 0.020. Some people get it down to 0.005 but that's just overkill which involves disabling a lot of stuff.


Not really overkill in my opinion.

I have all the essentials running and have never hit a situation where I can't run something because element X is disabled.

1-2 microseconds of deviation at idle is to be expected, and 5 is about the maximum on my rig.

EDIT: And Themes are on as well; I'm not running my Win7 in 95-mode.


----------



## qsxcv

overkill is maybe the wrong word, but still, 20us is already far, far below what's detectable. but hey, lower is better in the same sense that 4.71ghz is better than 4.7ghz.

i think it's more hardware dependent or something... my z97 computer gets <5us with minimal screwing around, but my old x58 computer is always around 10-20us even with as much stuff as possible disabled


----------



## HAGGARD

Overkill for your desktop experience - definitely. But like I say somewhere in the OP, when you run a game things get busier and any improvement that may have appeared negligible in idle can potentially be noticeable there.


----------



## qsxcv

hm yea that's definitely a possibility. maybe that's related to the m_rawinput 1 paranoia


----------



## treach

update guys, I'm close to getting my system rock stable.

the last problem is my network: I have a Gigabyte Z97X Gaming 3 and it has the Killer E2201, whose driver is simply bad...

does somebody know a good driver version or another workaround?


----------



## Thraxx

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> I now have everything in legacy IRQ mode (see image below). At that time I had systematically gone through devices and attempted to set them all to MSI mode, and that picture shows which ones were able to be in MSI mode. So I'm not sure exactly what the PCI-to-PCI Bridges were, but I would guess they were for the second PCI-E slot and maybe a PCI slot. After more testing with HPET on + MSI mode versus HPET off + legacy IRQ, I found I preferred the latter in terms of mouse movement and more stable FPS.


How about disabling unused PCI lanes?

Could that improve anything?


----------



## coldc0ffee

Can someone hit me up with a TL;DR summary? I'm at work and I can't read this for 3 more hours.


----------



## VolsAndJezuz

Quote:


> Originally Posted by *Thraxx*
> 
> How about disabling not used PCI lanes?
> 
> Could improve anything


I disable several things in Device Manager. Also my IRQ list looks slightly different now because I leave the Intel network adapter and SATA AHCI controller in MSI-mode (how they defaulted from install). Here's both: http://i.imgur.com/him1DaA.png

Edit: now I have my GPU, PCI Express Root Ports, Intel MEI, and PCI bridge in MSI-mode as well. Previously I got weird lag spikes/stutters with the GPU in MSI-mode, but it's behaving well now. My FPS in test demos and benchmarks went up ~2-3% in CS:GO from these changes. And USB polling precision has improved slightly, mainly in that the spikes are fewer and smaller.


----------



## iceskeleton

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> I disable several things in device manager. Also my IRQ looks slightly different now because I leave the Intel network adapter and SATA ACHI controller in MSI-mode (how they defaulted from install). Here's both: http://i.imgur.com/JeA0tNM.png


So when you say legacy IRQ, does this mean it isn't handled virtually by the Windows ACPI?


----------



## VolsAndJezuz

Not sure what you mean. The MSI-mode ones are those with negative numbers in the parentheses (Intel Network Connection and SATA AHCI Controller), and the legacy IRQ channels are non-negative (GTX 680, Xonar DG, etc.)


----------



## iceskeleton

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> Not sure what you mean. MSI-mode are the ones with negative numbers in the parentheses (Intel Network Connection and SATA ACHI Controller) and the legacy IRQ channels are non-negative (GTX 680, Xonar DG, etc.)


Read around about IRQs and found this
https://www.gearslutz.com/board/4785589-post9.html

So I just thought you turned off ACPI in Windows


----------



## Mahanat

Thx, very useful topic




----------



## Huzzaa

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> Not sure what you mean. MSI-mode are the ones with negative numbers in the parentheses (Intel Network Connection and SATA ACHI Controller) and the legacy IRQ channels are non-negative (GTX 680, Xonar DG, etc.)


Are those the entire list of IRQ devices you have?

I have a whole 200-ish ACPI or whatever MS entries up there, and seeing yours, I'm starting to think that maybe I should disable them?

I'm primarily curious how you got so few.


----------



## VolsAndJezuz

I cut out the ACPI stuff so the screenshot wouldn't be a mile long


----------



## Bucake

i'm getting a lot of these fluctuations:


any (easy) way to find out what's causing these jumps? because they happen often :/
a long list of ~1000hz and then suddenly one, or a few deviations of up to ~150hz (while all i have running is MMR).
the behavior was there with all 5 mice i've tried: 3 mice at 1000hz, and 2 mice at 500hz.

any tests i could do to find out what the issue might be?


----------



## x7007

Quote:


> Originally Posted by *Bucake*
> 
> i'm getting a lot of these fluctuations:
> 
> 
> any (easy) way to find out what's causing these jumps? because they happen often :/
> a long list of ~1000hz and then suddenly one, or a few deviations of up to ~150hz (while all i have running is MMR).
> the behavior was there with all 5 mice i've tried: 3 mice at 1000hz, and 2 mice at 500hz.
> 
> any tests i could do to find out what the issue might be?


Doesn't it go above 1000 Hz for you? Windows 7 is my guess? If not, what did you set in the BIOS or Windows that keeps it from going above 1000?

I have no idea why it didn't happen with Windows 7 x64, but it happens with 8.1 and Windows 10 on both my laptop and desktop, with a G502 on both of them.

Can't figure this thing out: is it a setting? A driver? A service? What's causing it?


----------



## Conditioned

mkb1969 has created a very nice tool to change the system timer. It can run as a service and, via its config, change the timer resolution only when a certain program starts. http://forums.guru3d.com/showthread.php?t=376458


----------



## Huzzaa

Quote:


> Originally Posted by *x7007*
> 
> Doesn't it go above 1000 hz for you ? windows 7 my guess ? if not what did you set in bios or windows that fixed it for you not to go above 1000
> 
> I have no idea why it didn't happen with windows 7 x64, but it happens with 8.1 and windows 10 on both my laptop and desktop, G502 on both of them.
> 
> Can't figure this thing, is it a setting ? is it a driver ? is it a service ? what causing it.


Yeah, compared to the last picture I posted, mine has seemingly gotten far worse.

I recently upgraded to Win10.

Thing is, I think it's the graphics acceleration. On Win7 it is buttery smooth, especially on the MouseTester graphs. But once you started up Chrome on Win7, you got those controlled waveforms of deviation that start high, come back to a pinpoint 1ms, and then drift again from both sides, from roughly 0.7 to 1.3 ms per update, slowly settling back to 1ms (1000Hz). I suspect that's the cause at least, since I was always able to replicate it, and it's a graphics driver thing. You can't really get rid of it: on Win10 it's basically on from startup, while on Win7 it was good until you booted up Chrome or ran the test after some games - then that's what it looked like.
Personally, I think as long as you don't get a lot of velocity offsets on some axis, you're fine, as it seems to be a normal thing. Once the PC is under load, my intervals simply stick around a ~200ms offset at pretty even levels throughout, so it should be fine.

But I'm not an expert, take it with a grain of salt.


----------



## HAGGARD

"Going over 1000Hz" is perfectly normal and healthy. If a poll is not processed in time, the next will obviously follow closer to that.
Report #1 is 0ms. Report #2 comes in at 1ms. Say report #2 takes 100 microseconds to get addressed/processed by the CPU. That's 1.1ms after #1, i. e. ~909Hz is what MMR or a similar program would print for that. Then #3 comes in at 2ms (host controller issues polls consistently each 1ms) but only takes 10 microseconds in the CPU stage - that's at 2.01ms; only 910 microseconds after report #2, i. e. MMR prints ~1098Hz.
The reason why we see mostly equally as many reports in both +/- Xus is because the CPU likely compensates for any imprecision. So if it sees it could not fulfil the timing demands that the host asked for in the interrupt schedule, it will address the next one as fast as it possible can. So most off-timed polls should be instantly met with a more precisely timed (and then reported as below 1ms/above 1000Hz) consecutive poll.
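The arithmetic in that example can be checked in a few lines (the 100 µs and 10 µs processing delays are the hypothetical figures from the example, not measurements):

```python
# Reports leave the mouse at exact 1 ms ticks; the host "sees" each one
# only after a fluctuating processing delay (all times in ms).
send_ms = [0.0, 1.0, 2.0]
processing_ms = [0.0, 0.1, 0.01]  # hypothetical delays from the example
seen_ms = [s + p for s, p in zip(send_ms, processing_ms)]

rates_hz = []
for a, b in zip(seen_ms, seen_ms[1:]):
    rates_hz.append(1000.0 / (b - a))  # one slow interval, one fast one

print([f"{r:.0f} Hz" for r in rates_hz])
```

One late report makes the first interval read below 1000Hz and the following interval read above it, without the mouse itself ever deviating from 1ms.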


----------



## qsxcv

it's simply adding jitter to precisely timed 1000hz data

the cpu doesnt need to compensate for anything


----------



## HAGGARD

It's not simply (i. e. random) jitter because an off-timed event is practically always followed by a more correctly timed one. This must mean the CPU compensates for itself. Otherwise you'd expect under load the reports to come in mostly late, but as far as I have seen there's always a balance there.
Actually you are probably right as always.







Would be good if it did compensate, but it probably doesn't.


----------



## qsxcv

i think you understood me, but for anyone who didn't:

the mouse sends data at regular intervals
|....|....|....|....|....|....|

the amount of time it takes for the cpu to process the data fluctuates because the os does a lot of things at once.
|......|..|....|.....|...|....|

so in mousetester you see intervals of
1.4ms, 0.6ms, 1ms,1.2ms,0.8ms,1ms

it's clear that the intervals must always be centered around 1ms, because e.g. if it takes 0.4ms more time for 2nd data packet to go through, the first interval increases by 0.4ms whereas the second decreases by the same amount
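a throwaway simulation of that (delay range made up) shows the same thing: host-side jitter spreads the individual intervals, but the mean stays pinned at the mouse's true 1 ms period.

```python
import random

random.seed(42)

# Mouse sends at exact 1 ms ticks; the host adds 0-0.4 ms of fluctuating
# processing delay (made-up range) before an application sees each report.
sends = [float(i) for i in range(10_001)]                 # ms
seen = [t + random.uniform(0.0, 0.4) for t in sends]

intervals = [b - a for a, b in zip(seen, seen[1:])]
mean = sum(intervals) / len(intervals)
print(f"mean interval {mean:.3f} ms, "
      f"min {min(intervals):.2f} ms, max {max(intervals):.2f} ms")
```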


----------



## Bucake

i love you guys


----------



## softskiller

Can large desktop speakers with 2 pounds of magnets influence the USB data stream of the mouse cable?


----------



## Melan

No.


----------



## SweetLow

Quote:


> Originally Posted by *Melan*
> 
> No.


If you drop one of those speakers onto the cable, I would not be so sure...


----------



## qsxcv

if it's a logitech cable, it will probably still be okay


----------



## Huzzaa

Can confirm.


----------



## Fylzka

Quote:


> Originally Posted by *Bucake*
> 
> i'm getting a lot of these fluctuations:
> 
> 
> any (easy) way to find out what's causing these jumps? because they happen often :/
> a long list of ~1000hz and then suddenly one, or a few deviations of up to ~150hz (while all i have running is MMR).
> the behavior was there with all 5 mive i've tried: 3 mice at 1000hz, and 2 mice at 500hz.
> 
> any tests i could do to find out what the issue might be?


I also had a lot of these ~30us fluctuations. In my case they occurred every ~2000ms:



I'm still not 100% convinced that the VRM switching frequency was the offender, but that setting was all I changed before I got those results:


----------



## Huzzaa

Really?

I have the same thing going on with my results...

Man, that would be worth testing.

Are you sure you tested both with the OS being in the same state? As you hadn't launched anything that requires graphics acceleration prior to the 2nd testing?

On win7 my results were perfect as well, until I booted up chrome. And then the results were like your 1st 2. But before I booted up Chrome, it was good as well.

And on win10, since DWM seems to simply incorporate acceleration, it's always like your first 2 pics.


----------



## Fylzka

Quote:


> Originally Posted by *Huzzaa*
> 
> Really?


Definitely not!
I changed the VRM freqs again and there were absolutely no changes in polling.

I tried a lot to reproduce that behaviour, but I can't make it happen.
I'm on Win7 and I always tested without opening anything.

Another thing I changed was the Intel NIC drivers, but that was a few restarts earlier.
I'll still try to reproduce those issues and let you know if I find the reason.
You could disable your NIC to rule that out, if you haven't tried already.


----------



## neetzenden

(New PC) My polling rate keeps jumping from 500Hz to 1000Hz, even 1700Hz, every half-second or second. I've tried most of the things in this guide and r0ach's guide as well, and I have a big online tournament coming soon. Could someone please help me out?


----------



## Huzzaa

You can't really do anything about those spikes; they are GPU related as far as I can tell.

The last place I got a perfect 1ms response was on Win7 with nothing launched.

As soon as I started Chrome on Win7, those spikes were there - very symmetrical, always correcting themselves within one known interval that didn't change.

They are still there on Win10 from boot-up; basically it's something you'll never get rid of. And frankly, with them there, my input response is consistent enough that I actually don't mind.
It's better to have a consistent response anyway, rather than a variable one.

http://i.imgur.com/nWKpuK0.png - Windows 10, fresh boot-up, running NVIDIA driver 364.91, and it feels fine frankly. My personal uneducated guess is that it's related to DWM and its GPU acceleration, as with Win7 and Chrome's acceleration.


----------



## Bucake

i even have spikes in win7 with nothing running :-( should probably re-install windows, but that's such an effin' hassle..


----------



## VolsAndJezuz

For NVIDIA cards, you can try to enable the KBoost feature of EVGA PrecisionX, which as far as I know is the same as using NVIDIA PowerMizer Manager with Enable PowerMizer Feature checked and set at Fixed Performance Level: Max. Perf / Min. PowerSave. This drastically reduces DPC latency spikes caused periodically by nvlddmkm.sys, the NVIDIA Windows Kernel Mode Driver, as much as an order of magnitude depending on the GPU architecture (my GTX 680 had much higher latency spikes before this and still had occasional high spikes after... my GTX 780 was much lower to start off with and now practically never has nvlddmkm.sys DPC spikes over ~40us). This improved occasional erratic behavior in the USB polling precision plots as well.

Ofc this will make your video card unable to downclock/downvolt to lower power states. I consider this extremely desirable personally but if you are worried about power saving or idle temps then you will not want to do this.


----------



## PurpleChef

F.M.L...cant get it stable


----------



## Melan

It's not supposed to be a hard 1000. Your polling is fine, besides the last 5 polls, which are probably just a result of slow movement.


----------



## PurpleChef

Quote:


> Originally Posted by *Melan*
> 
> It's not supposed to be hard 1000. Your polling is fine, beside last 5 polls which are probably just a result of slow movement.


Ok. Thought constant 1000hz was the goal here








Then i guess win10 is fine. Was about to install w7 again


----------



## Melan

It's just a proof of concept. You can, technically, get a 999-1000 polling, that's it.


----------



## Bucake

to get it ultra stable you need some proper optimization. qsxcv and haggard are probably among the very few here who actually went that far.
as for me, it gets a lot less stable whenever i do some stuff. for example, having a Twitch stream open in Chrome significantly lowers stability. i also seem to have peaks every ~1.5s, for whatever reason.


----------



## qsxcv

well i didn't really do much beyond common sense stuff


----------



## Maximillion

Quote:


> Originally Posted by *qsxcv*
> 
> well i didn't really do much beyond common sense stuff


Did you disable print spooler?


----------



## qsxcv

idk/don't remember
but i don't print from my desktop

common sense says that it generally doesn't hurt to turn off unused stuff


----------



## PurpleChef

Quote:


> Originally Posted by *qsxcv*
> 
> idk/don't remember
> but i don't print from my desktop
> 
> common sense says that it generally doesn't hurt to turn off unused stuff


Ye disabled every single thing i could think of that i don't use.
(Everything is smooth as fck, but always lookin for stuff that can be turned off, for gaming.)

Qsxcv whats your opinion on win7 vs 10? pro/cons?

We need qsxcv optimization guide ;-)


----------



## qsxcv

windows xp is best obviously


----------



## VolsAndJezuz

Obviously you have never felt mouse movement on Windows 3.1

If you did, you would never go back


----------



## AluminumHaste

This is when running the program at the Windows Desktop:



Here it is when playing a game of Quake Live against bots:



This seems a bit insane, in game. There's latency of almost 1 second? Is it that I'm moving the mouse too fast while in the game?
Using a Logitech G303, plugged into the back of the computer case.
Mouse set to 400 CPI, and a sense of 5 in game.


----------



## Melan

No it's because you don't move mouse constantly at sufficient speed.


----------



## VolsAndJezuz

Quote:


> Originally Posted by *Melan*
> 
> No it's because you don't move mouse constantly at sufficient speed.


That's what's going on in the Quake readings most likely. But the desktop data shows fairly poor polling precision and it appears he was moving sufficiently fast to maintain 1ms update time. There is just a lot of variance because of a poor USB implementation, USB drivers, or something going on in the background. To get a good baseline/background for polling precision, you need to have all programs closed, even Steam, browser, etc. If I was getting that bad of a baseline/background with everything closed and after performing good optimizations like in the OP, I would abandon 1000Hz polling rate and change to 500Hz and see if the variance improves.
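To put a number on "variance" instead of eyeballing the plot, the interval statistics behind a MouseTester-style graph can be summarized in a few lines. This is only an illustrative sketch, not MouseTester's actual source; `PollStats` and `summarize` are made-up names:

```cpp
#include <cassert>
#include <cmath>
#include <vector>
#include <algorithm>

// Summarize a series of poll timestamps (milliseconds, as logged by a
// MouseTester-like tool) into the numbers people eyeball on the plots:
// mean interval, worst deviation from nominal, and standard deviation.
struct PollStats { double mean, maxDev, stdev; };

PollStats summarize(const std::vector<double>& t_ms, double nominal_ms) {
    std::vector<double> dt;  // successive intervals between reports
    for (size_t i = 1; i < t_ms.size(); ++i) dt.push_back(t_ms[i] - t_ms[i - 1]);
    double sum = 0, maxDev = 0;
    for (double d : dt) {
        sum += d;
        maxDev = std::max(maxDev, std::fabs(d - nominal_ms));
    }
    double mean = sum / dt.size();
    double var = 0;
    for (double d : dt) var += (d - mean) * (d - mean);
    return { mean, maxDev, std::sqrt(var / dt.size()) };
}
```

Fed 1000Hz timestamps, a tight setup shows a max deviation in the tens of microseconds, while the spiky plots discussed here show 0.5ms or more.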


----------



## Melan

Mine isn't exactly great either, though I don't really care about these optimizations enough to disable CPU idle.


Spoiler: Warning: Spoiler!







These 2 are when running doom on vulkan in full ultra. Second is the part where I move mouse in circles.


Spoiler: Warning: Spoiler!


----------



## VolsAndJezuz

Yeah, seeing an increase by a factor of ~10 in USB polling variance while playing a game is normal in my experience. I'm not sure if this is due to limitations in how accurately MouseTester can collect data while it is having to compete with a resource hog like a game, or if the USB polling is actually influenced that negatively.

For the sake of science, change your mouse to 500Hz USB polling and report those measurements, if you don't mind.


----------



## TranquilTempest

I posted this in another thread a while back, but you can use a microcontroller with native USB to measure the polling interval.

I have an arduino micro (this code should work with any arduino that has native USB, like a Leonardo or Due). With an arduino, you have to make a couple changes to the library code in order to get the information you want. I'm going to detail the changes I made instead of just posting the modified files because there have been updates to the arduino libraries since I did this, and people may read this after there are even more changes.

In USBAPI.h I added two lines defining the variables we need (I added them after the "// USB" comment):

Code:


extern volatile unsigned long lastTX;
extern volatile unsigned long lastTXInterval;

In USBCore.cpp I added 4 lines, two declaring the above variables:

Code:


volatile unsigned long lastTX=0;
volatile unsigned long lastTXInterval=0;

And two lines to gather the data we want from the USB ISR:

Code:


#ifdef CDC_ENABLED
                USB_Flush(CDC_TX);                              // Send a tx frame if found
                lastTXInterval = micros() - lastTX;
                lastTX = micros();
#endif

I then saved those library files and created a new sketch:

Code:


void setup() {
  Serial.begin(9600);
  Mouse.begin();
}

void loop() {
  Mouse.move(0,-1,0);
  Mouse.move(0,1,0);//generate usb mouse traffic so the ISR has something to send.
  Serial.println(lastTXInterval); //see modifications in USBCore.cpp and USBAPI.h
  delay(50);//wait for serial traffic to clear
}

Got the following results while launching and playing KSP (a CPU-heavy game):




It looks like the arduino's clock is about 0.5% slower than the PC's clock, or maybe it just takes some time between those two lines I added to the ISR. The resolution of micros() is 4µs.
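That 4µs figure comes from Timer0 on a 16 MHz AVR being clocked at F_CPU/64, so micros() advances in 4µs steps and every reading is truncated to that grid. A host-side toy sketch (made-up function names, not the real Arduino core, which also folds in overflow counts) of what that truncation does to a measured interval:

```cpp
#include <cassert>

// On a 16 MHz AVR, Arduino's micros() counts Timer0 ticks at clk/64,
// i.e. one tick every 4 us, so a reading is truncated to a multiple of 4.
unsigned long micros_quantized(unsigned long true_us) {
    return true_us - (true_us % 4);  // snap down to the 4 us grid
}

// A measured interval is the difference of two quantized readings, so it
// can be off by up to +/-4 us from the true interval, independent of any
// real clock skew between the arduino and the PC.
long measured_interval(unsigned long start_us, unsigned long end_us) {
    return (long)(micros_quantized(end_us) - micros_quantized(start_us));
}
```

So even with a perfect USB host, intervals logged this way jitter within a few microseconds purely from timestamp quantization.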


----------



## qsxcv

you're limited there by arduino's default timers resolution

i've checked with an oscilloscope and jitter is <5ns

iirc usb spec specifies that polling jitter <50ns

the reason mousetester and whatever shows more jitter is because of the time it takes for the usb data to get through whatever the OS needs to do with it


----------



## TranquilTempest

Quote:


> Originally Posted by *qsxcv*
> 
> you're limited there by arduino's default timers resolution
> 
> i've checked with an oscilloscope and jitter is <5ns
> 
> iirc usb spec specifies that polling jitter <50ns
> 
> the reason mousetester and whatever shows more jitter is because of the time it takes for the usb data to get through whatever the OS needs to do with it


Yeah, I took a look at the micros() function, and figured that if I take the time to understand the underlying timers well enough to mess with it, I might as well just use the underlying timers directly instead. I can probably get better precision by modifying prescalers, but it would also probably break a bunch of other stuff that relies on the timers.


----------



## qsxcv

or just skip the arduino stack completely... i know i got sick of it pretty quickly








the timer registers are fairly straightforward to use; just read through the relevant sections of the datasheet and ignore the parts about pwm generation and stuff.
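As an example of what skipping micros() buys you: running 16-bit Timer1 free with the /8 prescaler on a 16 MHz part gives one tick every 0.5µs, 8x finer than micros(). Converting a tick delta back to time is plain integer arithmetic; this is a host-side sketch with hypothetical helper names, not datasheet register code:

```cpp
#include <cassert>
#include <cstdint>

// Assumed setup: 16 MHz AVR, Timer1 free-running in normal mode with the
// /8 prescaler, so the counter advances at 2 MHz (2 ticks per microsecond).
const uint32_t F_CPU_HZ  = 16000000;
const uint32_t PRESCALER = 8;

// Convert a raw tick delta to tenths of a microsecond, using integer math
// as you would on the AVR itself (ticks_per_us is 2 here).
uint32_t ticks_to_tenths_us(uint32_t ticks) {
    uint32_t ticks_per_us = F_CPU_HZ / PRESCALER / 1000000;
    return ticks * 10 / ticks_per_us;
}
```

Note the 16-bit counter wraps every 65536 ticks (~32.8ms at this rate), so intervals longer than that also need the overflow interrupt counted.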


----------



## freddycatking

Couple things I don't understand after reading the whole thread. First of all, in my advanced power management, "Minimum processor state" and "Maximum processor state" don't seem to exist. All the other parameters are set to 100%, but how do I make certain that my processor is un-parked? This is after checking three times over with the find tool in regedit for "Attributes" in power management.

Second of all, with HPET-off, how do I determine how many devices/what devices should be set to MSI mode?

And lastly, should I be using a timer resolution tool or not? It does not seem to change my DPC latency, but it does seem to make a big difference in game. Problem is mine is now stuck at somewhere between 0.5 and 1.9, even if I click reset or even if I manually type 15 and click set. Anyone know why that could be?


----------



## c0dy

Quote:


> Originally Posted by *freddycatking*
> 
> Couple things I don't understand after reading the whole thread. First of all, in my advanced power management, "Minimum processor state" and "Maximum processor state" don't seem to exist. All the other parameters are set to 100%, but how do I make certain that my processor is un-parked? This is after checking three times over with the find tool in regedit for "Attributes" in power management.
> 
> Second of all, with HPET-off, how do I determine how many devices/what devices should be set to MSI mode?
> 
> And lastly, should I be using a timer resolution tool or not? It does not seem to change my DPC latency, but it does seem to make a big difference in game. Problem is mine is now stuck at somewhere between 0.5 and 1.9, even if I click reset or even if I manually type 15 and click set. Anyone know why that could be?


For the core parking you can just choose the High Performance power profile. Otherwise there are a lot of tools, and IIRC it was only needed for Windows 7 and older.

For MSI-Mode you should probably head over to guru3d here http://forums.guru3d.com/showthread.php?t=378044


----------



## pstN

Quote:


> Originally Posted by *freddycatking*
> 
> Couple things I don't understand after reading the whole thread. First of all, in my advanced power management, "Minimum processor state" and "Maximum processor state" don't seem to exist. All the other parameters are set to 100%, but how do I make certain that my processor is un-parked? This is after checking three times over with the find tool in regedit for "Attributes" in power management.
> 
> Second of all, with HPET-off, how do I determine how many devices/what devices should be set to MSI mode?
> 
> And lastly, should I be using a timer resolution tool or not? It does not seem to change my DPC latency, but it does seem to make a big difference in game. Problem is mine is now stuck at somewhere between 0.5 and 1.9, even if I click reset or even if I manually type 15 and click set. Anyone know why that could be?


Regarding the timer tools. I remember r0ach saying they could cause conflicts with other apps trying to change it as well so he leaves it alone.


----------



## freddycatking

Quote:


> Originally Posted by *pstN*
> 
> Regarding the timer tools. I remember r0ach saying they could cause conflicts with other apps trying to change it as well so he leaves it alone.


That's interesting, I really want to get mine back to default now. But like I said it seems to be stuck at some lower clock and I'm not sure how to get it back. http://puu.sh/qglUt.png


----------



## pstN

Quote:


> Originally Posted by *freddycatking*
> 
> That's interesting, I really want to get mine back to default now. But like I said it seems to be stuck at some lower clock and I'm not sure how to get it back. http://puu.sh/qglUt.png


I personally used (I don't anymore for the above stated reasons) http://www.lucashale.com/timer-resolution/ which has a "default" button.

*EDIT* I just realised the free version is said to only support XP. I used it on W10 and it seemed to "change", at least according to it, so I don't know.


----------



## Gonzalez07

i wouldnt bother with the timer tool if you are using windows 10.


----------



## whiteweazel21

I have an older X58 motherboard, and in latency mon I was averaging ~150-250 delay. I followed a long list of procedures to lower this without much luck, but perhaps a reduction down to ~150. I think unparking cores helped the most, along with disabling power saving options.

What actually fixed my latency, was buying a new soundcard (got a Powercolor HDX), and disabling the sound in bios. My latency average 2-3ms now.

If anyone has struggles, try disabling on-board sound in bios. Run latency mon, and if you have low DPC, it means you need to buy an add-on soundcard.


----------



## Melan

Quote:


> Originally Posted by *Gonzalez07*
> 
> i wouldnt bother with the timer tool if you are using windows 10.


Yup. Windows 10 handles timers well, like Windows 7. Windows 8.1 on the other hand is bugged.


----------



## freddycatking

Well mine still seems to be stuck way under the default of 15, by clicking the default of either tool. Is there a reg hack to make it default again?

EDIT: I also want to add something about PowerMizer. According to Volz you can turn it off using EVGA Precision X, but the option in there says it doesn't support analog ports on your graphics card with "VGA" in parentheses. I don't know what they mean by not supported, but I use an analog display, so I decided to find out how to do it manually, and noticed that my DPC latency dropped from a constant 20-30 on my "optimized" system to sub 4 constantly... (something to do with the nvidia driver)

https://www.native-instruments.com/forum/threads/solved-dropouts-cracks-pops-on-windows-7-and-nvidia-gfx-card.126080/

http://forums.guru3d.com/showthread.php?t=406671
Quote:


> [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\{yourid}\000*]
> "RmRCPrevDriverLoadCount"=hex:02,00,00,00
> "PowerMizerEnable"=dword:00000000
> "PerfLevelSrc"=dword:00002222
> "PowerMizerLevel"=dword:00000000
> "PowerMizerLevelAC"=dword:00000000


Basically you go to
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video

in that folder is a bunch of keys with lots of numbers and letters. Each key has other keys in them that look like this:

http://puu.sh/qh3DN.png

Find the "0000" key within all those keys that has the most entries in it. Mine was the second one down, and the parent key also included "0001", "0002," and "0003", which the other folders don't have.

Then just create four new DWORD entries like so (without quotes):

"PerfLevelSrc" set to "3322"
"PowerMizerEnable" set to "0"
"PowermizerLevel" set to "1"
"PowermizerLevelAC" set to "1"

And reboot. It sounds complicated but it's rather simple and made an obvious change to my mousefeel in game plus a visible difference in my DPC latency.
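If you'd rather not click through regedit, the same four entries can be merged from a .reg file. This is only a convenience transcription of the steps above (assuming the values are meant as hex, which is regedit's DWORD default); the GUID in the path is a placeholder you must replace with the key you identified:

```reg
Windows Registry Editor Version 5.00

; Replace {YOUR-GUID-HERE} with the key under ...\Control\Video that
; contains your adapter's fully populated "0000" subkey.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\{YOUR-GUID-HERE}\0000]
"PerfLevelSrc"=dword:00003322
"PowerMizerEnable"=dword:00000000
"PowermizerLevel"=dword:00000001
"PowermizerLevelAC"=dword:00000001
```

Double-clicking the file and rebooting has the same effect as creating the entries by hand.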


----------



## x7007

Quote:


> Originally Posted by *freddycatking*
> 
> Well mine still seems to be stuck way under the default of 15, by clicking the default of either tool. Is there a reg hack to make it default again?
> 
> EDIT: I also want to add something about PowerMizer. According to Volz you can turn it off using EVGA precision X but the option in there says it doesn't support analog ports on your graphics card with "VGA" in parenthesis. I don't know what they mean by not supported but I use an analog display so I decided to find out how to do it manually, and noticed that my DPC latency dropped from a constant 20-30 on my "optimized" system to sub 4 constantly... (something to do with the nvidia driver)
> 
> https://www.native-instruments.com/forum/threads/solved-dropouts-cracks-pops-on-windows-7-and-nvidia-gfx-card.126080/
> 
> http://forums.guru3d.com/showthread.php?t=406671
> Basically you go to
> [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video
> 
> in that folder is a bunch of keys with lots of numbers and letters. Each key has other keys in them that look like this:
> 
> http://puu.sh/qh3DN.png
> 
> Find the "0000" key within all those keys that has the most entries in it. Mine was the second one down, and the parent key also included "0001", "0002," and "0003", which the other folders don't have.
> 
> Then just create four new DWORD entries like so (without quotes):
> 
> "PerfLevelSrc" set to "3322"
> "PowerMizerEnable" set to "0"
> "PowermizerLevel" set to "1"
> "PowermizerLevelAC" set to "1"
> 
> And reboot. It sounds complicated but it's rather simple and made an obvious change to my mousefeel in game plus a visible difference in my DPC latency.


what does it do actually?


----------



## Gonzalez07

im surprised that reduced ur latency.....thought that stuff was just for laptops


----------



## Alya

nvlddmkm.sys is getting spikes of ~500us for me and I'm not sure why, tried enabling MSI-Mode but that didn't work. (GTX 760) guess I gotta try K-Boost next, this is without using any GPU accelerated programs of course.


----------



## Melan

Nvlddmkm was making 150us spikes for me on slightly older driver. I've updated to recent one and spikes are gone.

Edit: Spikes were present even long before this though. They never went beyond 150us.


----------



## c0dy

Quote:


> Originally Posted by *x7007*
> 
> what does it do actually?


It shouldn't do anything.

His settings do this in the form of a UI



As you can clearly see, the PowerMizer feature is not enabled, so it should not change anything. And afaik it's disabled by default for desktop PCs








Unless for some reason these changes still have an impact although the whole feature is disabled. Which I doubt.


----------



## freddycatking

I can't seem to download the powermizer ui so I can't just toggle it on and off to show you. But I'm fairly certain it was enabled on my system by default. Considering that everyone else seems to get easy sub-5 numbers in latencymon, it's possible that it's just me, or I accidentally turned it on with an overclocking tool or something.

EDIT: I should mention that before this tweak, my HWmonitor showed my gfx card downclocking to 135mhz. Now it still seems to downclock but only down to 850mhz. I'll admit I'm not sure it's related and now I'm very confused. Hmmm

EDIT EDIT: I'm being dumb. I think if I start to "do anything" on my PC like open chrome, even if I close it again, my latmon will show higher #us. If I reboot and don't open or touch anything but latmon, it will show lower numbers. Because I couldn't really test powermizer back and forth, I can't tell if it helped anything. So my PC still shows nvidia drivers causing DPC latency after some time using the PC.


----------



## Gonzalez07

well I don't think it does anything, but if you want me to upload powermizer let me know... it took a long time to find it last time I tried, don't remember where I got it either. nvm, looks like someone else uploaded it, i swear i couldn't find it last time i was looking for it.
http://www.signalwarrant.com/2016/02/15/dell-precision-m6500-nvidia-quadro-fx-2800m-screen-turns-off-solved/


----------



## c0dy

If you use K-Boost for example it's basically the same thing. K-Boost = Nvidia PowerMizer-Settings

http://forums.guru3d.com/showpost.php?p=5290116&postcount=2

Edit: More info about the "history" and how it's "EVGA's" feature from the Afterburner dev.
http://forums.guru3d.com/showpost.php?p=4984266&postcount=93


----------



## Yahar

Quote:


> Originally Posted by *Alya*
> 
> nvlddmkm.sys is getting spikes of ~500us for me and I'm not sure why, tried enabling MSI-Mode but that didn't work. (GTX 760) guess I gotta try K-Boost next, this is without using any GPU accelerated programs of course.


My GTX 780 (EVGA) is spiking to ~ 630 us, even with MSI-mode. Which brand is your card? I've been wondering lately if the spikes are related to specific brands.


----------



## Alya

Quote:


> Originally Posted by *Yahar*
> 
> My GTX 780 (EVGA ) is spiking to ~ 630 us, even with MSI-mode. Which brand is your card? I've been wondering lately if the spikes are related to specific brands.


Running an EVGA as well, non-SC.


----------



## altf4

Quote:


> My GTX 780 (EVGA) is spiking to ~ 630 us, even with MSI-mode. Which brand is your card? I've been wondering lately if the spikes are related to specific brands.


I have MSI 780 lightning and i have spikes too.


----------



## RevanCorana

Hi, question: I can't find my mouse in the IRQs in msinfo32, but there are a lot of "microsoft acpi compliant system" entries. Is the mouse one of them, or should I just give higher priority to the USB controller of the chipset?
Also, is there any benefit to giving priority 1 to the graphics card and soundcard (i have an asus phoebus pcie x1)?

Screen

Oh and what is a numeric data processor? Can I safely disable it in device manager?


----------



## RevanCorana

Also how do you know if the RAM has "unmatching timings"?
I run my DDR3 at 6-6-6-16, are those ok?


----------



## PurpleChef

Use device manager to check which controller the mouse is connected to, then check in msinfo. Disable the controllers you don't use, and make sure the mouse is on its own IRQ. gg


----------



## whiteweazel21

Another thing I did notice, using Precision X to overvolt/overclock the videocard results in lower latency. Using an EVGA 750ti SC


----------



## RevanCorana

Quote:


> Originally Posted by *PurpleChef*
> 
> Use device manager to check what controller mouse is connected to, then check in msinfo. Disable the controllers you dont use, and make sure mouse is on its own IRQ. gg


Thanks, this is what my system looks like now, what d'you think?


----------



## VolsAndJezuz

Looks pretty good


----------



## RevanCorana

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> Looks pretty good


Ok thanks this was with a g502 btw


----------



## HAGGARD

Quote:


> Originally Posted by *RevanCorana*
> 
> Ok thanks this was with a g502 btw


Mouse doesn't really matter save some defects/bad firmware.

Your readings are indeed pretty good. That's 60 microseconds variation over a ~5 second span (i. e. a span long enough to account for regular processor idle activity), the bulk of the reports being handled within the 20 microseconds range. There's always room for improvement, but as I mentioned somewhere else, as long as you are not getting a few hundreds of microseconds variation mousing won't suffer.


----------



## RevanCorana

Quote:


> Originally Posted by *HAGGARD*
> 
> Mouse doesn't really matter save some defects/bad firmware.
> 
> Your readings are indeed pretty good. That's 60 microseconds variation over a ~5 second span (i. e. a span long enough to account for regular processor idle activity), the bulk of the reports being handled within the 20 microseconds range. There's always room for improvement, but as I mentioned somewhere else, as long as you are not getting a few hundreds of microseconds variation mousing won't suffer.


Ok I already did see some mice that didn't quite reach 1000Hz (stuck at around 800), but those were mostly wireless ones.

How do you know if a RAM timing is matching, and what does it have to match exactly? CPU speed I assume?

Also do you think automatic drivers updaters are trustworthy?

Cool thread btw


----------



## Gonzalez07

@HAGGARD, what do you mean by "game is less responsive" when NVIDIA Display Driver Service is disabled?


----------



## ncck

Not sure if off-topic or on topic here, but there's zero difference between usb 3.0 and usb 2.0 for a mouse right? From what I understand the 3.0 port will just act as a 2.0 port if you plug a 2.0 device (like a mouse) into it. Been using it without problem but curious if anyone researched/looked into it. I couldn't use 2.0 anyway as those ports are defective/dead on my MB


----------



## Alya

Quote:


> Originally Posted by *ncck*
> 
> Not sure if off-topic or on topic here, but there's zero difference between usb 3.0 and usb 2.0 for a mouse right? From what I understand the 3.0 port will just act as a 2.0 port if you plug a 2.0 device (like a mouse) into it. Been using it without problem but curious if anyone researched/looked into it. I couldn't use 2.0 anyway as those ports are defective/dead on my MB


If you want to use your USB 3.0 as USB 2.0 ports, you can just disable the xHCI controller in your BIOS which is what I did.


----------



## VolsAndJezuz

Quote:


> Originally Posted by *RevanCorana*
> 
> Thanks this is what my system looks like now what d'you think?


I just realized I'm an idiot and misread the decimal place. I like to see it a bit tighter personally (assuming your data was taken with nothing else running). I'll include data for my polling as an example. Still, I would say overall your polling is OK because your worst deviations are really not bad. It is very common to see people get frequent spikes to 0.5ms and 1.5ms, which is like 10x worse max deviation than yours. For the sake of science, redo the same type of measurement @ 500Hz for me. I'm interested to see if your polling precision tightens considerably or not.



I purposely included an example that is showing a strange phenomenon I've seen uncommonly for as long as I can remember using MouseTester: spikes occurring in an increasing pattern at 100us interval over ~1 second. It is extremely unlikely this would even be perceptible, but curious if anyone has any idea as to what might explain this.


----------



## ncck

Quote:


> Originally Posted by *Alya*
> 
> If you want to use your USB 3.0 as USB 2.0 ports, you can just disable the xHCI controller in your BIOS which is what I did.


I see but is there any reason to do that? I'm using the rear panel 3.0 ports which I believe use an intel chipset, seems to be exactly the same as the rear panel 2.0 ports. If anyone has any info let us know.. but from actual usage I personally see no difference!


----------



## Alya

Quote:


> Originally Posted by *ncck*
> 
> I see but is there any reason to do that? I'm using the rear panel 3.0 ports which I believe use an intel chipset, seems to be exactly the same as the rear panel 2.0 ports. If anyone has any info let us know.. but from actual usage I personally see no difference!


Not really a difference no, I only did it because USB3.0 wasn't supported natively in Win7, the biggest difference I've noticed in mouse feeling is using your mouse and keyboard on different hubs.


----------



## whiteweazel21

I'm really curious, is there any way or possibility, to lock the usb polling rate? Like, when I move the mouse very slowly, the polling rate goes down. I understand it's because of moving slowly, but wouldn't it be more accurate running a higher polling rate even when moving slowly?

Like, If my mouse has a max polling rate of 500hz, could I lock it to be 500hz at all speeds? Is that possible, even to just experiment?


----------



## woll3

Quote:


> Originally Posted by *whiteweazel21*
> 
> I'm really curious, is there any way or possibility, to lock the usb polling rate? Like, when I move the mouse very slowly, the polling rate goes down. I understand it's because of moving slowly, but wouldn't it be more accurate running a higher polling rate even when moving slowly?
> 
> Like, If my mouse has a max polling rate of 500hz, could I lock it to be 500hz at all speeds? Is that possible, even to just experiment?


The polling rate itself doesn't drop, the mouse just simply reports less at lower speeds.


----------



## HAGGARD

Quote:


> Originally Posted by *RevanCorana*
> 
> Ok I already did see some mice that didn't quite reach 1000Hz (stuck at around 800), but those were mostly wireless ones.
> 
> How do you know if a RAM timing is matching, and what does it have to match exactly? CPU speed I assume?
> 
> Also do you think automatic drivers updaters are trustworthy?
> 
> Cool thread btw


If they can't reach 1kHz, it's to do with their internal clock/buffer filling rate. More commonly observed in wireless mice because they try to save power wherever, I suppose, and thus clock down internal processes. But I know the MX518 caps out at around ~700Hz too. Though again, that's not related to the "preciseness" of the polling. Generally, you can assume the USB communication is nanosecond-precise and consistent. It's the PC-side (most importantly CPU-side) processing of that input data where things get iffy and can be improved.

We are on overclock dot net here - I'm sure you'll find better resources than myself on RAM/CPU interactions. Just generally, there's RAM frequency and timings, CPU, HTL, NB and SB frequencies and their respective voltages, and while they have "native" settings they are supposed to operate at, performance/stability improvements can be achieved by tinkering around with them. You should do research on your specific components... Or just go the empirical route: play around with your settings for those and observe how it affects stress tests, game performance, polling measurements...

Depends on the specific updater software used. I always felt they are quite sketchy. Cramming together all the drivers for a system that hasn't been updated in a while can be annoying (and at times near impossible), but not only is it safest for most system components to stick to the drivers officially distributed by Windows (via Update and from the device manager) or available on your mainboard's official vendor page, ultimately there aren't too many things that need to be kept up-to-date so frequently that it wouldn't be feasible to do so manually: GPU, chipset, ethernet controller, possibly audio controller (I've found Windows default audio drivers to be perfectly fine, more so than proprietary ones in terms of DPC, but feature functionality of fancy soundcards/chips might be compromised). That's about all you need to check for updates every other month or so in a standard system.
Quote:


> Originally Posted by *Gonzalez07*
> 
> @HAGGARD, what do you mean by "game is less responsive" when NVIDIA Display Driver Service is disabled?


I haven't looked into that beyond feeling slight VSync-like delay in-game when disabling the service. As I said there, the profile is being loaded properly without the service running, but it screws something up. The driver service (now "NVIDIA Driver Helper Service") isn't even too bad a culprit in terms of CPU noise or anything, at all. It's the kernel driver that puts the strain on the system, so you might as well keep the service enabled. With some driver version they introduced the "User Experience Driver Component" whether you have Experience installed or not (which I, again, recommend you do not). It's not really clear what it does, but you can disable it (although it still regularly gets co-launched with other NVIDIA functions).
Quote:


> Originally Posted by *whiteweazel21*
> 
> I'm really curious, is there any way or possibility, to lock the usb polling rate? Like, when I move the mouse very slowly, the polling rate goes down. I understand it's because of moving slowly, but wouldn't it be more accurate running a higher polling rate even when moving slowly?
> 
> Like, If my mouse has a max polling rate of 500hz, could I lock it to be 500hz at all speeds? Is that possible, even to just experiment?


The polling rate actually is locked at whatever it is set to. I. e. the controller checks for data at 500Hz at all times. It's just that your mouse is not producing data at that rate if you are moving it slowly enough that causes polling rate programs to report lower rates. So nothing to worry about.
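woll3's and my point can be made concrete with a toy model (illustrative only, not how any real firmware is written): the host keeps polling at the set rate no matter what, but a poll that finds an empty buffer is NAKed and yields no report, so rate meters effectively display min(data rate, polling rate):

```cpp
#include <cassert>
#include <algorithm>

// Toy model: the USB host polls at a fixed polling_hz regardless of motion.
// A poll that finds no new counts or button data produces no report, so
// tools that count reports per second see the *report* rate, which with
// slow, steady motion is capped by how often the sensor produces an update.
int observed_report_rate(int polling_hz, int updates_per_second) {
    return std::min(updates_per_second, polling_hz);
}
```

So a mouse set to 500Hz that only generates ~120 updates per second during a slow swipe will read as "120Hz" in a rate tool, even though every one of the 500 polls per second still happened on schedule.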


----------



## VolsAndJezuz

It's true that the NVIDIA Kernel Mode Driver is what puts strain on the system, but having the NVIDIA Display Driver Service running somehow causes my max DPC routine execution times from the NVIDIA Kernel Mode Driver to go from ~30us to ~300us, which has very real and measurable effects on USB polling spikes and FPS dips manifesting as occasional microstutters in-game.

Personally, I do not feel the "VSync-like delay in-game" when the service isn't running. But I think the service is needed to properly use the NVIDIA control panel.

So instead of setting it to disabled, I've found the best solution is to set it to manual. However, if you ever right-click the desktop, this will invoke the service to start because of the NVIDIA control panel link it adds to the right-click menu, so you should also use Autoruns to uncheck the NvCplDesktopContext entry.

Then you can make an AutoHotkey script that launches the NVIDIA control panel when you need to use it, then automatically shuts down the service after you close the control panel. Then just add a shortcut to the script on the desktop, start menu, or wherever is convenient. The end product is that the NVIDIA Display Driver Service will be available whenever you need to change NVIDIA settings, but will only run when you have the control panel open and shut down automatically afterwards. Below is my AHK script for this:



Spoiler: AHK script for NVIDIA control panel



Code:


#NoEnv
#SingleInstance ignore
#NoTrayIcon

OnExit, EOF

Run, nvcplui.exe, C:\Program Files\NVIDIA Corporation\Control Panel Client

Loop {
                Sleep, 10000
                Process, Exist, nvcplui.exe
                if !ErrorLevel
                        break
}
Run, %comspec% /c "net stop nvsvc & exit",, Hide

return

EOF:
ExitApp


----------



## Alya

I've gone ahead and disabled "NVIDIA Display Driver Service" as well as the "NVIDIA GeForce Experience Service" (I don't even have this installed so idk why it's there.) and the "NVIDIA Shield Streaming Service" (along with the network one.)

EDIT: I should probably just clean install my NVIDIA drivers.


----------



## VolsAndJezuz

Yeah I don't have those last two services listed whatsoever. That was with a clean install of 347.09 Beta with only the display driver selected during installation.

Again, I don't recommend straight up disabling the Display Driver Service because then I don't think NVIDIA control panel functions properly.


----------



## HAGGARD

Yeah, I suggested something similar in the OP:
Quote:


> One additional thing to consider for NVIDIA users is the NVIDIA Display Driver Service nvsvc can also be set to manual. Normally NVIDIA launches 2 instances of the NVIDIA Display Driver Helper Service (nvvsvc.exe) and one instance of the NVIDIA User Experience Driver Component (nvxdsync.exe) regardless of whether or not you have NVIDIA Experience installed (which I recommend you do not). With the parent service nvsvc set to manual, those will not automatically start with Windows.


That way the services are only launched when you open the NCP. But again, the problem I noticed with this (and still notice now) is that the game (CS:GO, in my case) feels less responsive.

I guess we'd need actual latency tests to confirm that. Just shutting down nvsvc from the service console will close the NVIDIA services too, so if anyone can be bothered to throw tests (or just feelings) at it, it's a pretty simple process.


----------



## VolsAndJezuz

Yes, I was adding onto that suggestion: keeping the service from starting if you happen to right-click the desktop, plus the AHK script that automatically shuts down the service after you close the control panel, because otherwise it would stay running unless you manually stopped it. I've done plenty of testing with the NVIDIA service, and with it running in CS:GO there are periodic microstutters, or something that feels like mini lag spikes. Without it running, these periodic anomalies disappear for me and I feel no "VSync-like delay in-game" or lack of responsiveness. The only way I can think of to really figure out whether not having the service running adds some kind of delay would be @qsxcv's photodiode measurements. Other metrics like average FPS from a timedemo or the FPS benchmark map show no appreciable difference because the effects of the NVIDIA Display Driver Service are so momentary.

Also, another important thing to eliminate the microstutter/mini-lag-spike phenomenon is to have the NVIDIA PowerMizer settings on Fixed Performance Level: Max. Perf / Min. PowerSave to enforce the P0 state (using Nvidia PowerMizer Manager, or the EVGA PrecisionX KBOOST option, which I think does the same thing). Beware that this will disable the lower-power idle states and increase idle temperature, if you care about such things.


----------



## HAGGARD

That script is certainly handy. I, on the other hand, never noticed the periodic microstuttering you describe with the services running. Would be cool to have some verifiable data both on that and on the responsiveness thing. If the kernel driver is a bigger DPC hog with the services running, that's one more-or-less quantifiable aspect. Guess I will have to install LatencyMon again and check whether that's the case for me too.

As for PowerMizer... Isn't that exclusively for laptops? I thought those settings aren't used on desktop configs at all.


----------



## VolsAndJezuz

They absolutely affect desktops, even if they were intended for laptops. Without the settings I mentioned, it is impossible to force my card into the P0 state, and the Kernel Mode Driver gives large spikes like the Display Driver Service, even when the latter isn't running.

I can collect and post some LatencyMon data demonstrating the differences later.

Two other variables that I've remembered since my last post. One, different GPU architectures: my 780 and 780 Ti had all their microstuttering and DPC spikes fixed by following my suggestions, while my 680 and 570 still had the bigger spikes even with said fixes, though they were less common. Two, driver version may play a factor. I can't be bothered to upgrade to the newest NVIDIA drivers to see if they behave the same with my suggestions, but I can say that 344.11, 347.09, and 350.12 all benefited the same way from them.


----------



## c0dy

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> EVGA PrecisionX KBOOST option does the same thing I think)


Yes
Quote:


> Originally Posted by *HAGGARD*
> 
> As for PowerMizer... Isn't that exclusively for laptops? I thought those settings aren't used on desktop configs at all.


By default they aren't.

But I've linked to the "history" about K-Boost/PowerMizer here http://www.overclock.net/t/1550666/usb-polling-precision/230#post_25387976

EDIT: Or I'm just gonna quote myself
Quote:


> Originally Posted by *c0dy*
> 
> If you use K-Boost for example it's basically the same thing. K-Boost = Nvidia PowerMizer-Settings
> 
> http://forums.guru3d.com/showpost.php?p=5290116&postcount=2
> 
> Edit: More info about the "history" and how it's "EVGA's" feature from the Afterburner dev.
> http://forums.guru3d.com/showpost.php?p=4984266&postcount=93


----------



## Gonzalez07

I always get bsod if I try to turn on KBOOST or mess with the PowerMizer regedit settings on my 760


----------



## VolsAndJezuz

Immediate BSOD? Have you tried apply and reboot instead of instant apply? My 780 Ti goes to a black screen with blinking cursor if I try instant apply or anything else that restarts the graphics driver, like CRU. Even reinstalling graphics driver is a PITA that requires me to memorize the keyboard combination to restart my computer from the Windows menu because of this.


----------



## pstN

Has anyone here tested polling on the new 1607 Windows 10? I've heard a few reports saying it messed up polling stability or whatever.


----------



## Huzzaa

I know most systems have their default rear-IO USB2 ports on IRQ 16 but mine has the GPU primary slot on it as well and I cannot change it within Windows.

The front-IO is IRQ 23 going through the MB lower-area connector.

Any discernible difference? Or does it not matter? I'm just curious.


----------



## neetzenden

Quote:


> Originally Posted by *pstN*
> 
> Has anyone here tested polling on the new 1607 Windows 10? I've heard a few reports saying it messed up polling stability or whatever.


Had 1000Hz stable (50-100us variance at most); after updating I get 2000+us variances. This is ridiculous.


----------



## spinFX

Quote:


> Originally Posted by *Melan*
> 
> I wouldn't do any of this either way. It's a great experiment (and read. No really, good job man.) but not worth in the long shot, at least for me. I'm not even bothered that my DPC latency is hovering around 49us with occasional jump to 200. Except that time when bad network driver made it 5000.


Very well written article and interesting information, +rep. Is this more for older CPUs?
Just remotely connected to my rig, started two video files playing, two audio files playing, and ran Cinebench R15 to max out the CPU as a test. DPC latency didn't exceed 60us. (I think that covers most of the things that would affect DPC latency.)

I guess the part of the guide that described turning off everything and removing all USB devices besides the mouse (i.e. gimping the computer so it's *almost* useless) threw me off a bit. I can't see why this would be beneficial (cutting a fully working rig down to a partially working one to gain 1ms of response on your mouse), but then I also cannot reproduce the mouse lag issue on my system.


----------



## deepor

Quote:


> Originally Posted by *Huzzaa*
> 
> I know most systems have their default rear-IO USB2 ports on IRQ 16 but mine has the GPU primary slot on it as well and I cannot change it within Windows.
> 
> The front-IO is IRQ 23 going through the MB lower-area connector.
> 
> Any discernible difference? Or does it not matter? I'm just curious.


There's an alternative to IRQ named "MSI". With MSI, each device can have its own interrupt. For some reason, the NVIDIA drivers don't try to use MSI by default but you can force this manually. That way, you can make it so the USB2 controller gets the IRQ to itself.

Here's a forum post that explains it:

http://forums.guru3d.com/showthread.php?t=378044

There's also a paragraph describing this in the first post of this thread here.

This does not work with the USB2 controller device. It can only use IRQ and refuses to use MSI, and that's why you have to make the change for the graphics card device.

There could be issues with this as NVIDIA might have a reason why MSI is not default. You should be on the lookout for problems like strange stutter or whatever. That said, things always worked fine for the cards I had. I also noticed that this MSI stuff is the default on Linux.
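For reference, the manual MSI switch described in that guru3d thread comes down to a single registry value under the device's instance key. The sketch below only shows the shape of the change: the `<device-instance-path>` part varies per system (find it via the device's Details tab in Device Manager), so don't paste this verbatim, and back up the registry first. Reboot afterwards and verify with Resources by connection in Device Manager.

```
Windows Registry Editor Version 5.00

; <device-instance-path> is a placeholder for your card's instance path,
; e.g. something under Enum\PCI\VEN_10DE&DEV_...\...
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\PCI\<device-instance-path>\Device Parameters\Interrupt Management\MessageSignaledInterruptProperties]
"MSISupported"=dword:00000001
```

Setting the value back to 0 (or deleting it) reverts the device to line-based interrupts.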


----------



## Huzzaa

EDIT: Could use a delete function but I cannot find it.


----------



## pstN

Quote:


> Originally Posted by *neetzenden*
> 
> Had 1000hz stable (50-100 us variance at most), updated and I get +2000us variances, this is ridiculous.


wow.. So the most stable W10 would be 1511?


----------



## ncck

Quote:


> Originally Posted by *deepor*
> 
> There's an alternative to IRQ named "MSI". With MSI, each device can have its own interrupt. For some reason, the NVIDIA drivers don't try to use MSI by default but you can force this manually. That way, you can make it so the USB2 controller gets the IRQ to itself.
> 
> Here's a forum post that explains it:
> 
> http://forums.guru3d.com/showthread.php?t=378044
> 
> There's also a paragraph describing this in the first post of this thread here.
> 
> This does not work with the USB2 controller device. It can only use IRQ and refuses to use MSI, and that's why you have to make the change for the graphics card device.
> 
> There could be issues with this as NVIDIA might have a reason why MSI is not default. You should be on the lookout for problems like strange stutter or whatever. That said, things always worked fine for the cards I had. I also noticed that this MSI stuff is the default on Linux.


Is there any benefit to setting other things to use MSI? I set only my NVIDIA card to do it a while ago because I felt it made things the slightest amount snappier (AMD does this by default!). Also note that you shouldn't set certain sound cards to MSI, as they will stop functioning until you revert it. I only have my mouse plugged into a USB slot, so I don't think I even have a reason to make any more changes. My keyboard is on PS/2 and my audio devices are on the sound card.. it's pretty clean on my rear panel 8)


----------



## c0dy

Quote:


> Originally Posted by *ncck*
> 
> Also note that you shouldn't set sound cards to MSI as they will stop functioning until you revert it.


Simply not true. It depends on the drivers. If they support MSI, then it'll work.

I have no issues with either of mine in MSI-Mode.

ASUS Xonar Essence STX and Creative X-Fi



Even works together with VoiceMeeter Banana.


----------



## ncck

Ah, I'll edit my post. It doesn't work with my Sound Blaster Z - causes it to stop functioning, sorry!

I also recall other users with the same issue.


----------



## c0dy

It's "Creative", I guess. They're kinda known for having "meh" drivers.

Two people on the guru thread (here on the last page http://forums.guru3d.com/showthread.php?t=378044&page=11) say that their ZxR works, while other Sound Blasters don't, and the very last post says that his Zx works partially but it's buggy. That's one reason why I left the Creative boat :/ Wonky drivers. Asus isn't much better, I suppose, but there we have the UNi drivers. I remember there were "custom" drivers for the Creative cards as well, but I don't know if they can fix this MSI issue.

Not sure if it would be possible to mod the inf file (like people do for NVIDIA) of the ZxR driver to use it on the Z. But that's kinda OT, I guess


----------



## HAGGARD

Quote:


> Originally Posted by *spinFX*
> 
> I guess the part of the guide that described turning off everything and removing all usb devices besides mouse (ie. gimp the computer so it's *almost* useless) threw me off a bit; I can't see why this would be a beneficial thing to do (cut down a fully working rig to a partially working rig to gain 1ms response on your mouse), but then I cannot reproduce the mouse lag issue on my system.


Improvements beyond a certain "threshold" come with the caveat that you only go there if you want.







It's a decision; there are pros and cons, and it isn't purely about the "mousing" either (ironing out some of these more extreme things can, for example, improve video and audio playback too) - I'm not preaching.

By the way, I'm still interested in whether any of our programmers here consider it feasible to bring "MouseTester" or something similar to Linux. It would be interesting to get more insight into how the OS plays into polling behaviour (CPU interactions differ - if not fundamentally, then at least significantly).


----------



## NeoReaper

I can't seem to find what IRQ my mouse is on. I think it's connected to the only controller on my motherboard that is using MSI mode, since the mouse is the only thing plugged into a USB 3 port.
How does this result look to you guys? Probably not great (Corsair KATAR)


----------



## HAGGARD

@NeoReaper:

That's 100us variance at most over 9 seconds. It's not "great", but it's not bad either. At all. Nothing to worry about, but obviously there's room for improvement if you can be arsed.


----------



## RevanCorana

HAGGARD, does having all this stuff disabled benefit CPU fluidity?


----------



## NeoReaper

Quote:


> Originally Posted by *HAGGARD*
> 
> @NeoReaper:
> 
> That's 100us variance at most over 9 seconds. It's not "great", but it's not bad either. At all. Nothing to worry about, but obviously room for improvement if you can be arsed.


I've tried most tweaks I can find on the mouse threads, my onboard audio is disabled as I use a Corsair VOID 7.1 for sound and the only things connected are headset, mouse and keyboard.


----------



## c0dy

Quote:


> Originally Posted by *NeoReaper*
> 
> I can't seem to find what IRQ my mouse is on, I think its connected to the only controller on my motherboard that is using the MSI Mode since the mouse is only thing plugged into a USB 3 port.
> How does this result look to you guys? Probably not great (Corsair KATAR)


You could download USB Device Tree Viewer from here http://www.uwe-sieber.de/usbtreeview_e.html (297KB)

Which would give you basically this.
Here with my Mionix Avior 7000 for example.



It's possible to do it with the Windows Device Manager, but I find this a lot easier and it's portable and small anyway.

Once you've identified the Controller, click on it. Then you have different ways to identify it in the "MSI mode utility" for example.

First one: "Driver Inf" - shows you the inf used by that controller; the tool will also show it right at the beginning.
Second one: "Device Path/ID" - if you click any device in the MSI utility, it'll show you the Device ID at the bottom. Just compare it with the one from Device Manager/USB Device Tree Viewer.

Behind "Device Path" it'll also show you the GUID, which you'll see if you browse through regedit to make the changes manually (IIRC)

EDIT: Forgot the second picture, duh!


----------



## NeoReaper

Yup, my mouse is already on the controller that's in MSI mode - thanks for helping me identify that, c0dy. My keyboard and headset are also on that same USB controller, and changing which port they are plugged into makes no difference. (Apart from the front USB ports of the case, but I don't want the cable dangling around from the front) xP


----------



## VolsAndJezuz

Quote:


> Originally Posted by *c0dy*
> 
> Simply not true. It depends on the drivers. If they support MSI, then it'll work.
> 
> I have no issues with either of mine in MSI-Mode.
> 
> ASUS Xonar Essence STX and Creative X-Fi
> 
> 
> 
> Even works together with VoiceMeeter Banana.


I can guarantee you that your STX is not in MSI mode, as using the MSI mode utility like you show, or setting the registry settings for MSI, has no effect on Xonar cards. Check Resources by connection in Device Manager if you don't believe me and you will find that your STX is still in legacy IRQ mode, not MSI. I suspect the same for your Creative, but I don't have a Creative card to confirm this.


----------



## c0dy

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> I can guarantee you that your STX is not in MSI mode, as using the MSI mode utility like you show, or setting the registry settings for MSI, has no effect on Xonar cards. Check Resources by connection in Device Manager if you don't believe me and you will find that your STX is still in legacy IRQ mode, not MSI. I suspect the same for your Creative, but I don't have a Creative card to confirm this.


Interesting. You're right about the STX. Guess I shouldn't have trusted that utility 100%.

The X-Fi on the other hand actually does.



For everyone wondering how to see that in that screen;
MSI = negative numbers in the brackets.


----------



## NeoReaper

Here is everything of mine that's in MSI mode:


----------



## VolsAndJezuz

Quote:


> Originally Posted by *c0dy*
> 
> Interesting. You're right about the STX. Guess I shouldn't have trusted that utility 100%.
> 
> The X-Fi on the other hand actually does.
> 
> 
> 
> For everyone wondering how to see that in that screen;
> MSI = negative numbers in the brackets.


Better question is why do you have 2 sound cards lol


----------



## c0dy

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> Better question is why do you have 2 sound cards lol


Pretty easy to explain.

As I did not want to spend any money on a capture card, but still wanted to see and hear my PS3 and PS4 on my display and headphones, I used the optical output on the PS3/PS4 and the optical in on the X-Fi for sound (the STX doesn't have one), and connected the PS3/PS4 to the HDMI input on my display. It's easier to play like that than to turn my head to the right all the time to play on my TV. And it would be "bad quality" anyway, since I'd be so close to my 47" TV, as I'm usually at my PC when I'd play on my console. If I'm sitting at my desk, the TV is roughly 50-75cm away.

Also, it was more like a test, since I'm not really playing any console games - apart from MGS <5.
I could remove it again, but meh. I'm too lazy, and just in case I want to play again, I'd have to plug it back in.

EDIT: I could use the X-Fi as the main and only card. Thing is, it has an annoying buzzing sound on the mic of my MMX300, and on the STX I can just swap out op-amps, for example. But as the X-Fi doesn't seem to cause any trouble (it doesn't show up in LatencyMon at all, for example), I guess I'll just leave it in there.


----------



## NeoReaper

Just to create a bit more activity here...
Well, I have managed this with my mouse so far with even more tweaking. (I must've skimmed over the power-saving registry settings in the first post, since for some reason USB 3 link power management was set to 'moderate power saving', and changing that made a huge difference.)


----------



## agsz

Quote:


> Originally Posted by *NeoReaper*
> 
> Just to create a bit more activity here...
> Well I have managed this with my mouse so far with even more tweaking (I must've skimmed over the power saving registry settings in the first post since for some reason USB3's link power management was set to 'moderate power saving' and that made a huge difference)


The 'selective suspend' setting in Windows, or in BIOS?


----------



## NeoReaper

Quote:


> Originally Posted by *agsz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NeoReaper*
> 
> Just to create a bit more activity here...
> Well I have managed this with my mouse so far with even more tweaking (I must've skimmed over the power saving registry settings in the first post since for some reason USB3's link power management was set to 'moderate power saving' and that made a huge difference)
> 
> 
> 
> 
> The 'selective suspend' setting in Windows, or in BIOS?
Click to expand...

In Windows, my bios does not have an option for that setting.


----------



## x7007

Quote:


> Originally Posted by *NeoReaper*
> 
> In Windows, my bios does not have an option for that setting.


But if you disable it, your external devices (an external hard disk, for example) can't go to sleep or turn off; they will be on all the time. That's bad for every device except the mouse and keyboard.


----------



## c0dy

Interesting.

As a workaround you could create a separate power profile with it deactivated and use Process Lasso to attach that profile to the game's process, for example. That way it's only deactivated while the game process is running. There's probably a way with batch files as well, but with PL it's probably easier.


----------



## NeoReaper

Quote:


> Originally Posted by *x7007*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NeoReaper*
> 
> In Windows, my bios does not have an option for that setting.
> 
> 
> 
> But if you disable it, your external devices (an external hard disk, for example) can't go to sleep or turn off; they will be on all the time. That's bad for every device except the mouse and keyboard.
Click to expand...

Does not bother me; I always unplug things I'm not using. Only my headset, keyboard and mouse are plugged in most of the time.


----------



## pstN

how do you guys manage to get your results looking like in the OP? I always get something that looks like this, even if I try zooming in or out..


----------



## softskiller

Cut off the extreme points with "Data point start" and "Data point end" at the bottom of that window, so that it only shows the part between, say, 2000 and 4500.


----------



## pstN

I tried, but it still doesn't look anything like this 

It doesn't spread at all, even if I zoom in realllllly close.


----------



## VolsAndJezuz

You have to collect data and zoom in on a range where all the points fall between, say, 0.5 and 1.5ms, to be generous.


----------



## pstN

Got it! It's cos I was using Mousetester 1.2, 1.1 worked on the first try.

How would you guys consider this to be?



I'm still unable to adjust the zoom for "Update Time" - is this normal?


----------



## Bucake

Turn off as many applications as you can before using MouseTester, including processes running in the background.
(That graph looks like you have some stuff running.)


----------



## VolsAndJezuz

Quote:


> Originally Posted by *pstN*
> 
> Got it! It's cos I was using Mousetester 1.2, 1.1 worked on the first try.
> 
> How would you guys consider this to be?
> 
> I'm still unable to adjust the zoom for "Update Time" is this normal?


If you don't have anything running but MouseTester, then you definitely have some issues going on. Like the previous poster said, you should have as little as possible running (no browser, Steam, messaging apps, etc.).

The graphs auto-adjust the Update Time scale based on the data in the Time range you specify. If you want to adjust it manually, you'll have to: save the LogFile as a .csv file, open it in a spreadsheet program, delete the X and Y count columns, delete all the data except the Time points you want to graph (i.e. the times you would choose for a MouseTester plot, to keep everything between 0.5-1.5ms), add a column that calculates each update time by subtracting the previous time from the current one, graph that column, and adjust the X and Y axes' scales to your liking.
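Those spreadsheet steps can also be scripted. Here's a minimal Python sketch of the same idea - it assumes a MouseTester-style CSV with a `Time` column in milliseconds (the column names here are an assumption; adjust them to match your actual export), optionally trims to a time window, and computes the deltas between consecutive timestamps:

```python
import csv

def update_times(path, t_min=None, t_max=None):
    """Read Time stamps (ms) from a MouseTester-style CSV and return the
    deltas between consecutive stamps, i.e. the per-poll update times."""
    times = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            t = float(row["Time"])
            # keep only the window you would otherwise trim in a spreadsheet
            if (t_min is None or t >= t_min) and (t_max is None or t <= t_max):
                times.append(t)
    # subtract each timestamp from the next one
    return [b - a for a, b in zip(times, times[1:])]
```

From there you can graph the returned list with whatever plotting tool you like, with full control over the axis scales.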


----------



## cpaqf1

Hey guys, I'm currently picking out parts for a new rig for fps gaming. I'd like to make usb polling precision a top priority. Price not necessarily being an issue, how would I go about choosing a Motherboard/cpu/ram for something like this? Should I be looking at workstation/server Motherboards or something like that? Or are there things in particular to look for?


----------



## VolsAndJezuz

OS and software setup is more important than hardware IMO, but I can give you some recommendations from my experience.

For CPU, any Intel since Sandy Bridge is good enough, and there will be little difference between them for something like USB polling. Basically, the newer the architecture you choose, the better the energy efficiency, but you can get a better deal on older chips. Ivy/Sandy Bridge are the most straightforward to overclock, but my preference is Haswell/Haswell Refresh for the best overclocking and best single-threaded performance. I never do anything that would really benefit from having more than 4 cores, so I have always turned off Hyper-Threading on my i7s. You could save some money and go for an i5, but I like to spring for the i7s because they have the top-of-the-line clock speeds and more L3 cache.

For GPU, I strongly prefer the Kepler architecture--specifically, the GK110 GPU because it overclocks like a boss with water cooling and has significantly less DPC latency than its predecessors. Maxwell is okay but NVIDIA made it much more difficult/impossible to do away with all the energy saving features. I would stay away from Pascal for the time being because its latency performance really hasn't been sorted out yet with NVIDIA's drivers. If you're going to overclock, I'd strongly recommend sticking with EVGA Classified models, because they have superior VRMs (the reference VRMs usually limit the OCing potential on water) and they also have special software that gives you more in-depth control of GPU voltages. There are some other manufacturers that are starting to include upgraded VRMs in their top of the line models, like Gigabyte for instance. Just stay away from models based on the NVIDIA reference card if you plan on doing any significant overclocking. I don't have any experience with AMD cards, so I can't comment on them.

For RAM, I recommend a hearty 2400MHz dual channel Samsung-based kit, like G.SKILL TridentX. Samsung IMCs give the best performance with tight timings IMO, but they aren't as overclockable as other IMCs. Still for Haswell at least, 2400MHz is about the most it can handle stably. Early generation CPUs might not even be able to handle 2400MHz. I went with a 16GB dual rank kit and was able to get some really tight timings and incredible performance by feeding it a healthy dose of voltage. Some people might prefer 8GB kits though because they will be single rank modules and be able to take tighter timings in general.

For motherboard, I can personally vouch for Z77 and Z97. I think Z87 and Z170 are probably alright too even though I haven't had motherboards with them. No need to go for a workstation/server motherboard. More importantly, I highly recommend sticking with either Gigabyte or ASUS boards. Again, if you are overclocking, you need to make sure to get a board with a hefty VRM. Sin's Hardware has a good list of the actual phases (versus the advertised number of phases) of various VRMs and the quality of its components. It hasn't been updated past Z97, however you can generally get a feel for the Z170 VRMs based on the VRMs of the same models of the older chipsets. For example, Gigabyte UD5H has always been a fantastic model across the various chipsets. It's probably a good idea to choose the motherboard after you have decided on CPU, and match the chipset to the architecture it was originally built for (Z87->Haswell, Z97->Haswell Refresh/Broadwell, Z170->Skylake, etc).

Unless you are going to go with a server-grade network card like Solarflare, I would make sure the motherboard has Intel LAN. You can use onboard sound if you want to save some money, but in my experience you get the best combination of sound quality and performance with an ASUS Xonar sound card (there are ones as cheap as $30 up to much more expensive ones with outstanding sound quality). This is because you can utilize the UNi modded drivers for them, which have much better performance than the stock ASUS ones, or Creative drivers for that matter. Another suggestion is to go with an SSD; otherwise you'll really be gimping your system if everything else is high end. The gold standard(s) is the Samsung 850 EVO or PRO, and they are worth the few extra dollars.

Finally, don't skimp on the PSU. It's difficult to recommend wattage without knowing what all is going into the system and how much overclocking you plan to do, but you should really only consider "Tier 1" PSUs. JonnyGURU.com is a great place to check for reviews of specific models. Also take the time to look up user reviews on Newegg/Amazon before deciding on a specific model, because some of the ones that are considered Tier 1 and got sparkling reviews from JonnyGURU have high failure rates and a lot of the companies have very slow/frustrating RMA processes.


----------



## NeoReaper

Agreed with the points above ^
And if budget is not an issue and you want to keep the system clean (and even use a smaller case), grab a Z97 board with a reasonable-capacity M.2 SSD... One thing I want to try/see is whether going full PCIe/M.2 storage and disabling the SATA controller makes any real performance difference...


----------



## VolsAndJezuz

I avoided M.2 because I am using 3 PCIe cards, and M.2 would also be soaking up some of those PCIe lanes. But from everything I've seen, there's no real practical improvement from M.2 over SATA 3.0 SSDs. I think some synthetic benchmarks have shown a minor difference.


----------



## Huzzaa

Yeah, I can agree on the motherboard department.

I'm on two MSI boards and they're quite lacking in the number of things you can control.


----------



## cpaqf1

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> OS and software setup is more important than hardware IMO, but I can give you some recommendations from my experience.
> 
> For CPU, any Intel since Sandy Bridge is good enough and there will be little difference between them for something like USB polling. Basically, the newer architecture you choose, the better the energy efficiency, but you can get a better deal on older chips. Ivy/Sandy Bridge are the most straightforward to overclock, but my preference is Haswell/Haswell Refresh for the best overclocking and best single-threaded performance. I never do anything that would really benefit from having more than 4 cores, so I have always turned off Hyper-threading on my i7's. You could save some money and go for an i5, but I like to spring for the i7s because they will have the top of the line clock speeds and more L3 cache.
> 
> For GPU, I strongly prefer the Kepler architecture--specifically, the GK110 GPU because it overclocks like a boss with water cooling and has significantly less DPC latency than its predecessors. Maxwell is okay but NVIDIA made it much more difficult/impossible to do away with all the energy saving features. I would stay away from Pascal for the time being because its latency performance really hasn't been sorted out yet with NVIDIA's drivers. If you're going to overclock, I'd strongly recommend sticking with EVGA Classified models, because they have superior VRMs (the reference VRMs usually limit the OCing potential on water) and they also have special software that gives you more in-depth control of GPU voltages. There are some other manufacturers that are starting to include upgraded VRMs in their top of the line models, like Gigabyte for instance. Just stay away from models based on the NVIDIA reference card if you plan on doing any significant overclocking. I don't have any experience with AMD cards, so I can't comment on them.
> 
> For RAM, I recommend a hearty 2400MHz dual channel Samsung-based kit, like G.SKILL TridentX. Samsung IMCs give the best performance with tight timings IMO, but they aren't as overclockable as other IMCs. Still for Haswell at least, 2400MHz is about the most it can handle stably. Early generation CPUs might not even be able to handle 2400MHz. I went with a 16GB dual rank kit and was able to get some really tight timings and incredible performance by feeding it a healthy dose of voltage. Some people might prefer 8GB kits though because they will be single rank modules and be able to take tighter timings in general.
> 
> For motherboard, I can personally vouch for Z77 and Z97. I think Z87 and Z170 are probably alright too, even though I haven't had motherboards with them. No need to go for a workstation/server motherboard. More importantly, I highly recommend sticking with either Gigabyte or ASUS boards. Again, if you are overclocking, you need to make sure to get a board with a hefty VRM. Sin's Hardware has a good list of the actual phases (versus the advertised number of phases) of various VRMs and the quality of their components. It hasn't been updated past Z97; however, you can generally get a feel for the Z170 VRMs based on the VRMs of the same models of the older chipsets. For example, the Gigabyte UD5H has always been a fantastic model across the various chipsets. It's probably a good idea to choose the motherboard after you have decided on the CPU, and match the chipset to the architecture it was originally built for (Z87->Haswell, Z97->Haswell Refresh/Broadwell, Z170->Skylake, etc).
> 
> Unless you are going to go with a server-grade network card like Solarflare, I would make sure the motherboard has Intel LAN. You can use onboard sound if you want to save some money, but in my experience you get the best combination of sound quality and performance with an ASUS Xonar sound card (there are models as cheap as $30, up to much more expensive ones with outstanding sound quality). This is because you can utilize UNi modded drivers for them, which have much better performance than the stock ASUS ones, or Creative drivers for that matter. Another suggestion is to go with an SSD, otherwise you'll really be gimping your system if everything else is high end. The gold standards are the Samsung 850 EVO and PRO, and they are worth the few extra dollars.
> 
> Finally, don't skimp on the PSU. It's difficult to recommend wattage without knowing what all is going into the system and how much overclocking you plan to do, but you should really only consider "Tier 1" PSUs. JonnyGURU.com is a great place to check for reviews of specific models.


Thanks for all that great info, lots to think about.

With regards to the sound card, what about disabling onboard and just using HDMI output through the GPU? Wouldn't that be even better?


----------



## VolsAndJezuz

The NVIDIA audio controller has been a notorious source of issues, so I would not go that route. I have always disabled it in Device Manager so I can't verify that personally.


----------



## RevanCorana

My current result:

Pretty good, though this was a good segment; there is still some CPU "noise", sometimes +/- 30 us, like here


setup:
Windows 10 Pro 64-bit - build 1501
Intel Management Interface off
Intel Rapid Storage off
HPET off
Tested both; those things didn't do anything, might be case by case.
Because I use Windows 10, I had to turn off as many useless services as possible.

And here's how it looks with TimerTool at 0.5 ms:

It's basically the same, with a bit more "noise", and the cursor feels a bit more jerky at low speed.


----------



## HAGGARD

Quote:


> Originally Posted by *RevanCorana*
> 
> Pretty good tho this was a good segment there is still some cpu "noise" sometimes +/- 30us like here


Obviously not "perfect", but pretty negligible indeed. Look at it like this: that's 1.8 seconds' worth of 500Hz reports, i.e. 900 reports, of which only 9 are outside the +10 us range (the negatively offset reports on the Update Time axis are just the natural result of the positively offset ones - the latter are the only ones you have to worry about).

This goes for any polling measurement: the vast majority of reports are processed within nanoseconds of the nominal interval. What we see here as tens and hundreds (and sometimes thousands) of microseconds of offset are (on most solid systems) single-digit percentages of all reports, if that. There's always room for improvement and it has merits, but this is something to keep in mind.
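To put numbers on that, here's a quick sketch of the arithmetic (the data is made up to mirror the example above - 900 reports at 500 Hz with 9 of them offset by +30 us - not a real log):

```python
# Share of polling reports whose interval deviates from the nominal
# interval by more than a given tolerance.
def outlier_share(intervals_ms, nominal_ms, tol_ms):
    outliers = [t for t in intervals_ms if abs(t - nominal_ms) > tol_ms]
    return len(outliers) / len(intervals_ms)

# 1.8 s worth of 500 Hz reports (nominal 2 ms), 9 offset by +30 us
intervals = [2.0] * 891 + [2.03] * 9
share = outlier_share(intervals, nominal_ms=2.0, tol_ms=0.01)
print(f"{share:.0%} of reports outside +/-10 us")  # 1% of reports
```

The same function works on a real MouseTester CSV export if you feed it the interval column.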
Quote:


> And heres how it looks with timertool 0.5ms:
> 
> It's basically the same with a bit more "noise" with it and, the cursor feels a bit more jerky at low speed.


I'm not sure TimerTool even has any use on Windows 10 (or beyond Windows 7 at all, for that matter). I haven't looked into it myself as I am not tweaking WIN8+ systems, but last I heard, Microsoft revamped the timers on those, both hardware and clock resolution (QPC to "true" TSC; dynamic ticks; even something about task scheduling changed, to where DPClat shows 1000 us by default). Other people will probably be able to tell you more about that.
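As an aside, you can eyeball your system's usable high-resolution timer granularity from user space; Python's `perf_counter` is documented as being backed by QPC on Windows, so the smallest nonzero step between back-to-back readings is a rough proxy for the clock resolution being discussed here:

```python
import time

# Take back-to-back high-resolution timestamps and report the smallest
# nonzero step between them - a rough lower bound on timer granularity.
def min_tick_ns(samples=100_000):
    smallest = None
    prev = time.perf_counter_ns()
    for _ in range(samples):
        now = time.perf_counter_ns()
        if now != prev:
            step = now - prev
            if smallest is None or step < smallest:
                smallest = step
            prev = now
    return smallest

print(f"smallest observed tick: {min_tick_ns()} ns")
```

This only measures what the OS exposes to applications, not the hardware timer itself, so treat it as a sanity check rather than a benchmark.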


----------



## RevanCorana

Quote:


> Originally Posted by *HAGGARD*
> 
> I'm not sure TimerTool even has use on Windows 10 (or beyond Windows 7 at all, for that matter). I haven't looked into it myself as I am not tweaking WIN8+ systems, but last I heard is that Microsoft revamped the timers on those, both hardware and clock res (QPC to "true" TSC; dynamic ticks, even something about task scheduling changed to where DPclat shows 1000us per default). Other people will probably be able to tell you more about that.


Well it does _something_

what I don't know

Might be a bug of some sort


----------



## Th3Awak3n1ng

Hey guys, tell me something about this please.











----------



## Bucake

which mouse? what's the rate(hz) set to?


----------



## James N

Logitech G Pro, 1000Hz, logged for several seconds. Is this good or bad? HPET enabled in BIOS (and I used bcdedit /deletevalue useplatformclock, but it gave me an error), selective suspend disabled, cores unparked and power plan on High Performance. Windows 10.
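Edit: from what I gather, that bcdedit error usually just means the value was never set in the first place, so there was nothing to delete. You can check from an elevated command prompt before deleting (this is my understanding, happy to be corrected):

```shell
:: List the current boot entry and look for useplatformclock;
:: if it isn't listed, /deletevalue has nothing to remove and errors out.
bcdedit /enum {current}

:: Only needed if useplatformclock actually shows up as "Yes":
bcdedit /deletevalue useplatformclock
```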


----------



## Demi9OD

That's pretty bad James, you don't really want to see values above 1.1 ms or below 0.9 ms.

In the original post, there is some information on USB hubs and IRQs. Try to find a USB hub that doesn't share an IRQ with anything else, and run only the mouse on it. That should make the biggest difference.


----------



## James N

Quote:


> Originally Posted by *Demi9OD*
> 
> That's pretty bad James, you don't really want to see values above 1.1 or below .9.
> 
> In the original post, there is some information on USB hubs and IRQs. Try to find a USB HUB that doesn't share an IRQ with anything else, and run only the mouse on it. That should make the biggest difference.


Hmm, I already did that. I unplugged everything, uninstalled all the USB drivers and let Windows reinstall the needed ones (Gigabyte doesn't provide any official USB drivers for Windows 10).

Slightly better result, but that is about it.










So I guess it's time to reinstall.


----------



## Demi9OD

It's not great there, but it's not terrible anymore. Are you running the Anniversary Update?

This is as good as I seem to be able to get mine.


----------



## James N

Quote:


> Originally Posted by *Demi9OD*
> 
> It's not great there, but its not terrible any more. Are you running the anniversary update?
> 
> This is as good as I seem to be able to get mine.


Yea, I am running the Anniversary Update. I just went into my BIOS and disabled all C-states. I also disabled all Intel USB enhanced host controllers. And enabled HPET in the BIOS. These tests were with the mouse and my USB AT2020 mic sharing the USB hub. Now it looks like this.





Is it normal that I have to disable all C-states to achieve results like that? I assume those are ok now?

I also had to focus on moving the mouse fast enough in a circular motion, or it would not have been as good.


----------



## Demi9OD

Looks great now. And yeah, C-states have a large impact. You can try enabling them one at a time to see which is causing it.


----------



## James N

Quote:


> Originally Posted by *Demi9OD*
> 
> Looks great now. And yeah, cstates have a large impact. You can try enabling them one at a time to see which is causing it.


C1, C3 and C6 had the biggest impact. EIST made it worse, but not as bad as the others. I disabled all of them now. I wonder if that is ok, as it runs everything at full power permanently now.


----------



## Alya

Quote:


> Originally Posted by *James N*
> 
> c1 , c3 and c6 had the biggest impact. eist made it worse but not as bad as the others. I disabled all of them now. I wonder if that is ok, as it runs everything at full power permanently now.


Depends on if you're okay with paying a higher power bill. For the hardware it should be fine though.


----------



## James N

Quote:


> Originally Posted by *Alya*
> 
> Depends on if you're okay with paying a higher power bill. For the hardware it should be fine though.


Yea, I don't mind. Otherwise I would not have built a gaming PC with an i7 and GTX 1080. Whenever I am on my PC I play games anyways, so it shouldn't change much.

Phew, I am happy that I got that out of the way. This also explains why my mouse felt better at 500Hz than at 1000Hz. Now 1000Hz feels way better.

My DPC latency is also much more stable now; slightly higher, but it doesn't spike anymore every 2 minutes or so. Now it is at a stable 20-40 us. Before it was jumping from 10 to 300.

I am interested if anyone has managed to get similar or even better values without disabling the C-states?


----------



## Alya

Quote:


> Originally Posted by *James N*
> 
> Yea, i don't mind. Otherwise i would not have build a gaming pc with an i7 and gtx 1080. Whenever i am on my pc i play games anyways, so it shouldn't change much.
> 
> Phew i am happy that i got that out of the way. I am interested if anyone has managed to get similar or even better values without disabling the c states?


My Z97 board just has awful polling, I've done tons of stuff on the software/OS side and did quite a bit in the BIOS (disabled all the onboard trash like the secondary LAN controller, ASMedia & Intel USB controllers, C-States and Wake States, manually configured RAM timings, disabled every USB port I don't use, etc) and it STILL looks like this. I zoomed in super far because I always get one really high report at like 25ms that makes the graph compressed.


----------



## James N

Quote:


> Originally Posted by *Alya*
> 
> My Z97 board just has awful polling, I've done tons of stuff on the software/OS side and did quite a bit in the BIOS (disabled all the onboard trash like the secondary LAN controller, ASMedia & Intel USB controllers, C-States and Wake States, manually configured RAM timings, disabled every USB port I don't use, etc) and it STILL looks like this. I zoomed in super far because I always get one really high report at like 25ms that makes the graph compressed.


Did you move the mouse fast enough at a consistent speed? My plots also have a few dots above 1.1 and below 0.9 when I don't move the mouse fast enough. When I do move fast enough for 4 seconds or so, and then cut the beginning and end off, they look like the good ones I posted earlier.

Also, if you use Collect, where you hold the left mouse button, instead of "Log Start", it seems more reliable.

Your plot still looks way better than my beginning ones though.

Is that plot 500hz, btw?


----------



## Alya

Quote:


> Originally Posted by *James N*
> 
> Did you move the mouse at a consistent speed fast enough? My plots also have a few dots above 1.1 and below 0.9 when i don't move the mouse fast enough. When i do move fast enough and do that for 4 seconds or so, and then cut the beginning and end off. Then they look like the good ones i posted earlier.
> 
> Also, if you use collect where you hold the left mouse button instead of "log start" , it seems more reliable.
> 
> Your plot still looks way better than my beginning ones though.
> 
> Is that plot 500hz, btw?


1000Hz, I'm using "log start" yeah. I am moving the mouse fast enough to stay at a consistent 1kHz


----------



## Th3Awak3n1ng

Quote:


> Originally Posted by *Bucake*
> 
> which mouse? what's the rate(hz) set to?


Razer DeathAdder 2013, 500 hz.


----------



## Bucake

Well, it's certainly not right that you're seeing 1ms and 3ms.
Anyone have any ideas? I can't recall reports of Synapse causing this:
Quote:


> Originally Posted by *Th3Awak3n1ng*


----------



## HAGGARD

Razers do that. Multiple people have reported the same on DeathAdders. Some firmware quirk where they just skip a report at regular intervals.


----------



## James N

Ok, final update. I just finished reinstalling and optimizing Windows 10 (since my Windows installation was over a year old).

It still seems that disabling all C-states in the BIOS made the biggest difference. I use an AT2020 in the same hub as the mouse (I am forced to, since VIA doesn't provide Windows 10 drivers for my mainboard, so I have only the two Intel ones working).

Reinstalling Windows definitely made it more consistent.

Logitech G Pro 1000hz



I also noticed that you need to move very steadily and at a consistent rate, otherwise you get false readings. So it also depends on how consistent your speed is while performing this test. I think it has something to do with the buffer not filling fast enough with information at 1000Hz and 400dpi (so you need to maintain a certain speed). So my results are much better if done at 800dpi. At 400dpi I need to circle the mouse much faster.
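A back-of-the-envelope check on that (assuming the mouse only sends a report when the sensor has at least one count to deliver):

```python
# Minimum sweep speed so the sensor yields at least one count per poll;
# below this, polls come back empty and the logged interval stretches.
def min_speed_cm_per_s(dpi, rate_hz):
    inches_per_s = rate_hz / dpi   # one count per poll interval
    return inches_per_s * 2.54     # inches -> centimeters

print(min_speed_cm_per_s(400, 1000))   # 6.35 cm/s at 400 dpi
print(min_speed_cm_per_s(800, 1000))   # 3.175 cm/s at 800 dpi
```

So halving the DPI doubles the speed you need to hold just to keep a report coming every millisecond, which matches the "circle faster at 400dpi" observation.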

Edit: what made it even more consistent and also lowered my DPC latency by a lot (below 10 us, only spiking every now and then to 40 us due to NVIDIA), was to disable the NVIDIA Shield streaming services and the "Connected User Experiences and Telemetry" service - which is Windows 10's built-in spyware.


----------



## TriviumKM

I have all non-used USB controllers disabled, HPET on, all C-states off, power on High Performance with USB suspend disabled, and a bunch of other tweaks, yet I still can not get my polling to look clean. Thinking it's probably just my mobo and there's nothing I can do about it.

800dpi / 1000Hz Logitech Pro (polling looks essentially the same regardless of the mouse I use)


----------



## James N

Quote:


> Originally Posted by *TriviumKM*
> 
> I have all non used USB controllers disabled, hpet on, all c-states off, power on high performance with usb suspend disabled and a bunch of other tweaks yet i still can not get my polling to look clean, thinking it's probably just my mobo and theres nothing i can do about it.
> 
> 800 dpi / 1000hz Logitech Pro (polling essentially looks the same irregardless of the mouse i use)


Did you disable all redundant services? A lot of them also have an impact on it, in addition to increasing your DPC latency.


----------



## TriviumKM

Quote:


> Originally Posted by *James N*
> 
> Did you disable all redundant services? A lot of them also have an impact on it in addition to increasing your dpc latency.


Yeah, and i just went through and disabled some more that were of no use to me, but it pretty much looks exactly the same still:


----------



## dlano

Quote:


> Originally Posted by *TriviumKM*
> 
> I have all non used USB controllers disabled, hpet on, all c-states off, power on high performance with usb suspend disabled and a bunch of other tweaks yet i still can not get my polling to look clean, thinking it's probably just my mobo and theres nothing i can do about it.
> 
> 800 dpi / 1000hz Logitech Pro (polling essentially looks the same irregardless of the mouse i use)


I'm sure you have, but I'll mention it anyway because I made this mistake once: are you sure the mouse is in a USB port using an Intel controller? My mobo has Intel, VIA and ASMedia controllers, and I see similar variations using non-Intel ports.


----------



## Demi9OD

See what it looks like in safe mode. If it doesn't change, you may be right about hardware. If it's better, there is probably still something you can do in software. This is the difference between regular Windows and Safe Mode for me, but I haven't taken the time to figure out why. My PC is a Plex media server and newsgroup "DVR", so I am not going to be disabling all the services I need to make that work. It's good enough for me right now.


----------



## James N

This is my safe mode result. It is still far off from HAGGARD's results, but I guess it is fine and shouldn't be much of a problem, right?

Safe mode



Without safe mode


----------



## Demi9OD

James you're perfectly fine without safe mode and I doubt you could notice any difference in game between the two.

TriviumKM still needs some help and the safe mode response was directed towards that end.


----------



## James N

Quote:


> Originally Posted by *Demi9OD*
> 
> James you're perfectly fine without safe mode and I doubt you could notice any difference in game between the two.
> 
> TriviumKM still needs some help and the safe mode response was directed towards that end.


Thank you.
Let's hope that TriviumKM gets it fixed somehow. If his safe mode results look way better, maybe reinstalling his OS could be helpful.


----------



## TriviumKM

Quote:


> Originally Posted by *dlano*
> 
> I'm sure you have but I'll mention it anyway because I made this mistake once, but are you sure the mouse is in a USB port using an intel controller? My mobo has intel, VIA and asmedia controllers and I see similar variations using non intel ports.


Yes, I'm using an Intel port

Quote:


> Originally Posted by *Demi9OD*
> 
> See what it looks like in safe mode. If it doesn't change, you may be right about hardware. If it's better, there is probably still something you can do in software. This is the difference between regular Windows and Safe Mode for me, but I haven't taken the time to figure out why. My PC is a Plex media server and newsgroup "DVR", so I am not going to be disabling all the services I need to make that work. It's good enough for me right now.


I'll do this once I'm home and have free time, thanks

Quote:


> Originally Posted by *James N*
> 
> Thank you.
> Let's hope that TriviumKM gets it fixed somehow. If his safemode results look way better, maybe reinstalling his OS could be helpful.


Appreciate the sentiment and help, good people on ocn


----------



## TriviumKM

So got home and ran the utility while in safe mode and voila, great polling:



Now to find what's the culprit


----------



## Derp

Quote:


> Originally Posted by *TriviumKM*
> 
> Now to find what's the culprit


Please keep us updated with your findings.


----------



## qsxcv

wooooooooooo

we clown cursor now

old x58 pc, new installation of windows 10 with almost nothing on it, high performance power setting


----------



## Elrick

Quote:


> Originally Posted by *qsxcv*
> 
> wooooooooooo
> 
> we clown cursor now
> 
> old x58 pc, new installation of windows 10 with almost nothing on it, high performance power setting


Instead of installing Trojan-10 onto all things Intel-based, why not install it on an AMD setup?

First eliminate anything to do with either manufacturers of CPUs then see what happens next. Seen it time and again, always using Intel hardware and getting clowned upon, hence why continue to hit your head against the Intel Wall of ignorance?


----------



## Melan

Quote:


> Originally Posted by *qsxcv*
> 
> wooooooooooo
> 
> we clown cursor now
> 
> old x58 pc, new installation of windows 10 with almost nothing on it, high performance power setting


Define "nothing on it". No usual Home/Pro/Enterprise bloat, or you just didn't install anything yet besides drivers?


----------



## James N

Quote:


> Originally Posted by *TriviumKM*
> 
> So got home and ran the utility while in safe mode and voila, great polling:
> 
> 
> 
> Now to find what's the culprit


Nice, if only I could achieve those results. I guess I am hardware limited.

But that is great news (you can definitely lower it; I hope you find out what the issue is). I disabled my audio drivers, anti-virus, NVIDIA drivers and SuperFetch. It was almost exactly the same as in safe mode then. So maybe you can try that as well. I mean, of course you need your audio drivers, anti-virus and NVIDIA drivers. But I somewhat lowered it again by disabling my anti-virus while gaming. I also reinstalled my audio drivers. Also disabled the NVIDIA Shield streaming service. So maybe that will help you as well.

Quote:


> Originally Posted by *qsxcv*
> 
> wooooooooooo
> 
> we clown cursor now
> 
> old x58 pc, new installation of windows 10 with almost nothing on it, high performance power setting


I noticed the same, right after reinstalling Windows without having anything installed on it. Windows 10 is just poop. You need to remove so much and optimize it so much. In Windows 7, all I did was disable selective USB suspend, unpark my cores and disable some services. On Windows 10 there is so much more to do.

I wish they would offer a barebones performance version of Windows 10 without all the junk.


----------



## James N

Quote:


> Originally Posted by *Elrick*
> 
> Instead of installing Trojan-10 onto all things Intel-based, why not install it on an AMD setup?
> 
> First eliminate anything to do with either manufacturers of CPUs then see what happens next. Seen it time and again, always using Intel hardware and getting clowned upon, hence why continue to hit your head against the Intel Wall of ignorance?


I tested Windows 10 on my old AMD 965 BE desktop. Looked about the same, even with Cool'n'Quiet and all the C-states disabled. A fresh, untouched Trojan-10 with a G303 set to 1000Hz.


----------



## NeoReaper

It would be interesting to see which Antivirus programs provide the smallest impact to latency.


----------



## Melan

The ones which you do not install and run.
Even then, I'm running free avast and don't see that latency which needs to be gone at all costs.


----------



## Demi9OD

Don't run AV, upload dicey downloads to https://www.virustotal.com/ for scanning.


----------



## TriviumKM

Quote:


> Originally Posted by *James N*
> 
> Nice, if only i could achieve those results. I guess i am hardware limited.
> 
> But that is great news (you can definitely lower it, i hope you find out what the issue is). I disabled my audio drivers, anti-virus, nvidea drivers and superfetch. It was almost exactly the same as in safemode then . So maybe you can try that as well. I mean, of course you need your audio drivers, anti-virus and nvidea drivers. But i somewhat lowered it again by disabling my anti-virus while gaming. I also reinstalled my audio drivers. Also disabled the nvidea shield streaming service. So maybe that will help you as well.
> I noticed the same, right after reinstalling windows without having anything installed on it. Windows 10 is just poop. You need to remove so much and optimize it so much. In Windows 7 , all i did was disable selective usb suspend, unpark my cores and disabled some services. On Windows 10 there is so much more to do.
> 
> I wish they would offer a barebone performance version of Windows 10 without all the junk.


Been going through everything one by one these past two days, and although disabling certain features and services did improve my polling slightly, the biggest offender seemed to be my onboard audio drivers [Realtek].

This is my polling now:

Can probably tweak a bit further, but all in all I'm content with the result; just need to go buy an external DAC/amp or sound card now.

Edit: Now getting jumps to 1.1 when I test (still better than before), so some sort of service triggered during normal PC usage has had a negative effect on my polling. So now I have to dig further to track down what's causing it.


----------



## daniel0731ex

Shower thought: how does pointer ballistics work properly, without any smoothing, when you have an unstable polling rate?


----------



## Argowashi

Are you guys using a different version of MouseTester than I am because this is what my Interval vs Time looks like and it doesn't seem like it's the same as yours.


----------



## pstN

Quote:


> Originally Posted by *Argowashi*
> 
> Are you guys using a different version of MouseTester than I am because this is what my Interval vs Time looks like and it doesn't seem like it's the same as yours.


Are you using 1.1 or 1.2?


----------



## James N

Quote:


> Originally Posted by *TriviumKM*
> 
> Been going through everything one by one these past two days and although disabling certain features and services did improve my polling slightly, the biggest offender seemed to be my onboard audio drivers [Realtek].
> 
> This is my polling now:
> 
> Can probably tweak a bit further, but all in all i'm content with the result, just need to go buy an external dac / amp or soundcard now.
> 
> Edit: Now getting jumps to 1.1 when i test (still better than before), so some sort of service was triggered during normal pc usage that has had a negative effect on my polling. So now i have to dig further to track down what's causing it


Glad to hear that you managed to make it better. That looks so much better - a similar improvement to my old and new plots. Congrats. So you just disabled some services? The worst part is that after the next big Windows 10 update, they will reinstall all the apps and services and you will need to redo everything. I had that once after the Anniversary Update. I want my Win7 back









After disabling SuperFetch and my sound card, using my onboard sound card, I get this

so I guess I can lower it further. But that's not possible for me, since I need my Creative sound card. But yea, I guess that is good enough.


----------



## James N

Quote:


> Originally Posted by *Argowashi*
> 
> Are you guys using a different version of MouseTester than I am because this is what my Interval vs Time looks like and it doesn't seem like it's the same as yours.


I use 1.5.3

http://www.overclock.net/t/1590569/mousetester-software-reloaded

But i don't think it matters much.

Maybe your mouse on 2000dpi is adding something or your mouse drivers? Try testing it at 800dpi


----------



## agsz

Quote:


> Originally Posted by *James N*
> 
> Glad to hear that you managed to make it better. That looks so much better. Similar improvement to my old and new plots. Congrats. So you just disabled some services? The worst part is after the next big windows 10 update, they will reinstall all the apps and services and you need to redo everything. I had it once after the anniversary update. I want my win7 back
> 
> 
> 
> 
> 
> 
> 
> 
> 
> After disabling superfetch and my soundcard, using my onboard soundcard, i get this
> 
> so i guess i can lower it further. But its not possible for me, since i need my creative soundcard. But yea i guess that is good enough.


What sound card?

SuperFetch usually disables itself automatically with SSDs, after a certain Windows 7 update.


----------



## James N

Quote:


> Originally Posted by *agsz*
> 
> What sound card?
> 
> SuperFetch automatically disables w/SSD's after a certain Windows 7 update, usually.


My SuperFetch was still active, don't know why.

Creative Titanium X-Fi


----------



## agsz

Quote:


> Originally Posted by *James N*
> 
> My SuperFetch was still active, don't know why.
> 
> Creative Titanium X-Fi


Did you have DPC Latency issues with that? If not, I don't think it's going to affect your USB polling much.

I'm curious if others in this thread found any benefit to disabling 'USB Selective Suspend' and/or their vacant USB ports in BIOS.


----------



## James N

Quote:


> Originally Posted by *agsz*
> 
> Did you have DPC Latency issues with that? If not, I don't think it's going to affect your USB polling much.
> 
> I'm curious if others in this thread found any benefit to disabling 'USB Selective Suspend' and/or their vacant USB ports in BIOS.


I have no idea how it affects it; maybe it's the driver, I don't know. But using the onboard card gives me better results. Also, the more services I disable, the better the results become.


----------



## Melan

Extra hardware, extra driver, extra software.

Tried a Sound Blaster once, returned it on day 2 and got my DAC/amp with TOSLINK. No drivers, no software, no problem.


----------



## Argowashi

Quote:


> Originally Posted by *pstN*
> 
> Are you using 1.1 or 1.2?


I was using the latest version. Is it better to use the earlier ones?

Quote:


> Originally Posted by *James N*
> 
> I use 1.5.3
> 
> http://www.overclock.net/t/1590569/mousetester-software-reloaded
> 
> But i don't think it matters much.
> 
> Maybe your mouse on 2000dpi is adding something or your mouse drivers? Try testing it at 800dpi


It's a G403 and Logitech Gaming Software is turned off so that shouldn't be it. The mouse feel is insanely good though so it's not like there's much wrong with it. Just wanted to test for fun.


----------



## TriviumKM

Quote:


> Originally Posted by *Argowashi*
> 
> Are you guys using a different version of MouseTester than I am because this is what my Interval vs Time looks like and it doesn't seem like it's the same as yours.


Are you using the wireless g403 by any chance? Only reason i ask is because your plot reminds me of this: http://www.overclock.net/t/1595573/logitech-g900-chaos-spectrum-announced/500_100#post_25038566

If so i wouldn't worry about it.


----------



## Argowashi

Quote:


> Originally Posted by *TriviumKM*
> 
> Are you using the wireless g403 by any chance? Only reason i ask is because your plot reminds me of this: http://www.overclock.net/t/1595573/logitech-g900-chaos-spectrum-announced/500_100#post_25038566
> 
> If so i wouldn't worry about it.


I'm actually using the wired version lol. Does it look bad? I guess I could try with Safe Mode first and see if it's because of Windows 7.


----------



## TriviumKM

Quote:


> Originally Posted by *Argowashi*
> 
> I'm actually using the wired version lol. Does it look bad? I guess I could try with Safe Mode first and see if it's because of Windows 7.


Oh







Sounds like a plan


----------



## Argowashi

Quote:


> Originally Posted by *TriviumKM*
> 
> Oh
> 
> 
> 
> 
> 
> 
> 
> Sounds like a plan


Tried Safe Mode and got exactly the same results, except with a little less variance. So I guess it's just something related to my hardware.

I've already disabled everything I can in the UEFI/BIOS so I'm not sure what else I can do. I think the mouse is in an Intel port but I'm not sure.


----------



## TriviumKM

Quote:


> Originally Posted by *Argowashi*
> 
> Tried Safe Mode and got exactly the same results, except with a little less variance. So I guess it's just something related to my hardware.
> 
> I've already disabled everything I can in the UEFI/BIOS so I'm not sure what else I can do. I think the mouse is in an Intel port but I'm not sure.


Yeah, might be hardware related then.

Hopefully someone will chime in to help you get it sorted out if you want to go down that road.


----------



## pstN

Quote:


> Originally Posted by *Argowashi*
> 
> I was using the latest version. Is it better to use the earlier ones?
> It's a G403 and Logitech Gaming Software is turned off so that shouldn't be it. The mouse feel is insanely good though so it's not like there's much wrong with it. Just wanted to test for fun.


Try 1.1


----------



## Argowashi

Quote:


> Originally Posted by *TriviumKM*
> 
> Yeah, might be hardware related then.
> 
> Hopefully someone will chime in to help you get it sorted out if you want to go down that road.


Quote:


> Originally Posted by *pstN*
> 
> Try 1.1


This is what I got with MouseTester 1.1.


----------



## pstN

Quote:


> Originally Posted by *pstN*
> 
> Try 1.1


Quote:


> Originally Posted by *Argowashi*
> 
> This is what I got with MouseTester 1.1.


Weird, for me that's what fixed the problems with the graphs... sorry


----------



## Argowashi

Quote:


> Originally Posted by *pstN*
> 
> weird, for me thats what fixed the problems with the graphs.. sorry


No worries. : ) I might get a new motherboard soon (Asus Rampage V Extreme 10 Edition) so I can overclock my CPU and hopefully that'll improve the graphs lol.


----------



## pstN

Quote:


> Originally Posted by *pstN*
> 
> weird, for me thats what fixed the problems with the graphs.. sorry


Quote:


> Originally Posted by *Argowashi*
> 
> No worries. : ) I might get a new motherboard soon (Asus Rampage V Extreme 10 Edition) so I can overclock my CPU and hopefully that'll improve the graphs lol.


Nice mobo, that's for sure. However, fast RAM with low timings is probably one of the most important factors once you've done all the optimizing.


----------



## justzeNn

So should I put 500Hz instead of 1000Hz on my Mionix Avior 7000?


----------



## dlano

Quote:


> Originally Posted by *Argowashi*
> 
> No worries. : ) I might get a new motherboard soon (Asus Rampage V Extreme 10 Edition) so I can overclock my CPU and hopefully that'll improve the graphs lol.


It's a nice mobo, but it's very feature-laden; if you're aiming for great polling and optimisation in general, there are going to be a lot of features that you'll end up disabling or leaving unused.

Unless you really want those features, I'd say save some money and go for something more mid-range, most boards perform around the same nowadays anyway.


----------



## Argowashi

Quote:


> Originally Posted by *pstN*
> 
> nice mobo that's for sure, however fast ram with low timings is probably one of the most important factor once you've done all the optimizing.


Quote:


> Originally Posted by *dlano*
> 
> It's a nice mobo but it's very feature laden, if you're aiming for great polling and optimisation in general there's gonna be a lot of features that you'll end up disabling or going unused.
> 
> Unless you really want those features, I'd say save some money and go for something more mid-range, most boards perform around the same nowadays anyway.


I was thinking of getting that specific motherboard because it looks good, has one PS/2 port, two USB 2.0 ports, two ethernet ports and good WiFi. I know I'll probably end up disabling a lot of stuff but that's fine by me. Also my RAM is only 2133MHz so yeah I want something faster.


----------



## pstN

Quote:


> Originally Posted by *Argowashi*
> 
> I was thinking of getting that specific motherboard because it looks good, has one PS/2 port, two USB 2.0 ports, two ethernet ports and good WiFi. I know I'll probably end up disabling a lot of stuff but that's fine by me. Also my RAM is only 2133MHz so yeah I want something faster.


Take a look at some low-timing G.Skill Trident 3400MHz or something.

I actually considered that mobo in the past. The dealbreaker for me is the USB 3.0 layout: the Intel xHCI controller only feeds the mid-board headers, meaning all the USB 3.0 ports at the back are actually on the ASMedia controller. So if you disable that, you only end up with 2 USB 2.0 ports at the back plus the front-panel USB 3.0 ports, and that's it.


----------



## Argowashi

Quote:


> Originally Posted by *pstN*
> 
> take a look at some low timing GSkill Trident 3400Mhz or something
> 
> I actually considered that mobo in the past, the dealbreaker for me is the USB3.0 since the intel xHCI controller is mid-board meaning that all usb 3 ports at the back are actually the Asmedia controller, so if you disable it you only end up with 2 usb 2.0 ports at the back + front panel USB 3.0 ports and that's it.


Interesting and thanks for the information. I thought Haswell-E and X99 only really supported up to 3200MHz?


----------



## pstN

Quote:


> Originally Posted by *Argowashi*
> 
> Interesting and thanks for the information. I thought Haswell-E and X99 only really supported up to 3200MHz?


I think you're right actually. Either way, I just looked on Newegg and, regardless of supported speeds, the G.Skill F4-3000C14D-16GTZ is the one I would go with, and it's 3000MHz.


----------



## Argowashi

Quote:


> Originally Posted by *pstN*
> 
> I think you're right actually, either way, I just looked on newegg and regardless of supported speeds, the G.Skill F4-3000C14D-16GTZ is the one I would go with and its 3000mhz.


There's also a similar model but 3200Mhz. I was planning on getting a Samsung 950 Pro but I think faster RAM will benefit me more.


----------



## pstN

Yeah, but the 3200 kit isn't worth it imo with the timings it has.

I would personally stick to a Samsung 850 Pro; I don't think you'd see any benefits from turning the SATA controller off.


----------



## Argowashi

Quote:


> Originally Posted by *pstN*
> 
> yeah but the 3200 kit isn't worth it imo with the timings it has,
> 
> I would personally stick to a Samsung 850 Pro, I don't think you'd see any benefits from turning the SATA controller off.


But the 3200MHz memory I found from G.Skill is CL14 so that's not too bad, right? Also I might be getting the parts earlier than I thought because I get the feeling my GPU is dying on me. I keep getting Display Driver errors and the monitor flickers when it happens and the GPU gets stuck in 2D clocks so I have to reboot the computer.


----------



## pstN

Quote:


> Originally Posted by *Argowashi*
> 
> But the 3200MHz memory I found from G.Skill is CL14 so that's not too bad, right? Also I might be getting the parts earlier than I thought because I get the feeling my GPU is dying on me. I keep getting Display Driver errors and the monitor flickers when it happens and the GPU gets stuck in 2D clocks so I have to reboot the computer.


Oh alright, the one I saw was cas16 iirc. Didn't see that one

Does it happen with older drivers as well? I know many people don't like the latest drivers.


----------



## Argowashi

Quote:


> Originally Posted by *pstN*
> 
> Oh alright, the one I saw was cas16 iirc. Didn't see that one
> 
> Does it happen with older drivers as well? I know many people don't like the latest drivers.


Yeah, I rolled back to 368.81 but it still happened. I tried disabling my overclock and it seems to have helped, but it's so weird because this only started happening yesterday. It's also really annoying because it happens mid-game: my FPS suddenly drops from 300 to 100, my input lag gets crazy and I have to reboot. Sometimes I wish I'd gotten a PS4/Wii U/Xbox One instead.


----------



## James N

Quote:


> Originally Posted by *justzeNn*
> 
> So should i put 500hz instead of 1k hz on my mionix avior 7k?


If you can't get 1000Hz stable then yeah, 500Hz would be better. It definitely makes a difference. On my old installation, when my results were bad, 500Hz felt much better and more consistent than 1000Hz.

That's probably one of the reasons most Counter-Strike pros prefer the 500Hz feel: I don't think they optimize their systems like this, and the PCs they have to use at LAN definitely won't be optimized. Of course this is only an assumption, but given what I've read about pros preferring 500Hz over 1000Hz, I think it makes sense.


----------



## justzeNn

Quote:


> Originally Posted by *James N*
> 
> If you can't get 1000Hz stable then yeah, 500Hz would be better. It definitely makes a difference. On my old installation, when my results were bad, 500Hz felt much better and more consistent than 1000Hz.
> 
> That's probably one of the reasons most Counter-Strike pros prefer the 500Hz feel: I don't think they optimize their systems like this, and the PCs they have to use at LAN definitely won't be optimized. Of course this is only an assumption, but given what I've read about pros preferring 500Hz over 1000Hz, I think it makes sense.


Ty for your answer


----------



## Thunderbringer

My Revel's polling rate jumps to 500Hz occasionally; the beta firmware changed nothing:


[graph]

In safe mode the situation *sometimes* changes, but I am not able to see a pattern (both graphs made in safe mode):


[graphs]


My Sys67A D3 B3/ i5 2500k @4.0ghz/ 12GB 1066 RAM/ AMD HD5000/ Win7Home64bit - fresh installed OS

Here are some graphs with other mice for comparison:


old Abyssus: [graph]

DA 3G: [graph]

What could cause this?

Edit:

Sometimes I get good results too (NOT in safe mode):


[graph]

It seems so random! Will dig further into this thread, let's see!

Btw thx HAGGARD! <3


----------



## qsxcv

Quote:


> Originally Posted by *Melan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *qsxcv*
> 
> wooooooooooo
> 
> we clown cursor now
> 
> old x58 pc, new installation of windows 10 with almost nothing on it, high performance power setting
> 
> 
> 
> Define "nothing on it". No usual home/pro/enterprise bloat or you just didn't install anything yet beside drivers?

as in i spent minimal effort disabling crap. cortana is disabled. only installed things are nvidia, chrome, and steam/csgo/dependencies


----------



## Melan

You can try disabling xbox junk or, well, removing it altogether.


----------



## James N

Do some of you use the TimerTool while gaming?


----------



## m0uz

Quote:


> Originally Posted by *James N*
> 
> Do some of you use the TimerTool while gaming?


What is this tool timing you speak of? I demand to know more!


----------



## iceskeleton

Quote:


> Originally Posted by *James N*
> 
> Do some of you use the TimerTool while gaming?


yea but I don't feel any different tbh


----------



## Alya

Quote:


> Originally Posted by *James N*
> 
> Do some of you use the TimerTool while gaming?


Useless on anything past Windows Vista as far as I'm aware. Chrome uses NtQueryTimerResolution and NtSetTimerResolution to set the resolution super low, a lot of other everyday programs do the same, and after Vista it's automatically set relatively low anyway.


----------



## Melan

Useful only on Windows 8 and Windows 8.1 due to the timer being bugged. Fixed on Windows 10.


----------



## HAGGARD

CS:GO sets the clock resolution to 1ms itself, but TimerTool and the like allow you to set 0.5ms. As I've said in the OP, polling behaviour is similar with both, but as a general concept: since there are improvements from going 15.6ms -> 1ms, chances are going even lower will further improve things. There are benchmarks that show drastic improvements in app performance - again, as touched on in the OP, improving processor task handling efficiency in turn also shows improvements in more basic tasks such as poll processing. The more tasks, the more improvement can be expected from improving task handling efficiency, obviously.

I still set 0.5ms if I want to have a more "serious" gaming session. Try to benchmark your raw FPS performance at 1ms vs. 0.5ms - if there's no difference there, try frametimes between the two. And if that's still not significant, log your polls while playing or running some programs; at the latest there should be a noticeable stability improvement there.

Oh yeah, this goes for Windows 7. For 10 I have no idea, and no reason to believe it benefits from this at all (the contrary, if what people have been reporting here holds true universally for Win10 systems).
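To illustrate the tick-granularity argument (just a sketch, not from the thread): work that becomes ready at a random moment has to wait for the next timer tick, so the mean added delay is about half the tick interval - which is why 15.6ms -> 1ms -> 0.5ms keeps helping. The function name and sample counts here are my own.

```python
import random

def mean_tick_delay(tick_ms: float, n_events: int = 100_000, seed: int = 1) -> float:
    """Mean delay (ms) between an event and the next timer tick.

    Events arrive uniformly at random; the scheduler only acts on tick
    boundaries, so each event waits until the next multiple of tick_ms.
    Expected wait is tick_ms / 2.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_events):
        t = rng.uniform(0.0, 1000.0)              # event time within one second
        next_tick = ((t // tick_ms) + 1) * tick_ms  # next scheduling opportunity
        total += next_tick - t
    return total / n_events

for tick in (15.6, 1.0, 0.5):
    print(f"{tick:>5}ms tick -> ~{mean_tick_delay(tick):.2f}ms mean wait")
```

This only models scheduling granularity, not the task-handling-efficiency effects HAGGARD mentions, but it shows why the gains shrink as the tick gets smaller.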


----------



## agsz

Quote:


> Originally Posted by *HAGGARD*
> 
> CS:GO sets clock res to 1ms itself, but TimerTool and the like allow you to set .5ms. As I've said in the OP, polling behaviour is similar with both, but as a general concept, since there's improvements from going 15.6ms -> 1ms, chances are going even lower will further improve things. There's benchmarks that show drastic improvements in app performance - again, as touched in the OP, improving processor task handling efficiency and in turn also showing improvements in more basic tasks such as poll processing. The more tasks, the more improvement can be expected from improving task handling efficiency, obviously.
> 
> I still set .5ms if I want to have a more "serious" gaming session. Try to benchmark your raw FPS performance with 1ms vs. .5ms - if there's no difference there, try for frametimes between the two. And if that's still not significant, log your polls while playing or running some programs and at latest there should be a noticeable stability improvement there.
> 
> Oh yeah, this goes for Windows 7. 10 I have no idea and no reason to believe benefits at all from this (the contrary if what people have been reporting here holds true universally in WIN10 systems).


0.5ms always has worse results when FPS benchmarking via TimeDemo method, kind of odd.


----------



## HAGGARD

Quote:


> Originally Posted by *agsz*
> 
> 0.5ms always has worse results when FPS benchmarking via TimeDemo method, kind of odd.


...On Windows 7? Odd indeed.


----------



## agsz

Quote:


> Originally Posted by *HAGGARD*
> 
> ...On Windows 7? Odd indeed.


Yeah, I didn't think it would happen. I would restart my PC between testing 0.5 and 1.0ms, and 1.0ms had better performance / less FPS variance every time. Sort of like how the launch option '-high' gives worse FPS too, even though most would think it would help.

I'm a bit rusty on USB stuff, but on Windows 7, is uninstalling USB 3.0 Driver the same as disabling Intel xHCI Controller in BIOS?


----------



## pstN

Quote:


> Originally Posted by *agsz*
> 
> Yeah I didn't think it would happen, I would then restart my PC in between testing 0.5 and 1.0ms, and 1.0ms had better performance / less fps variance everytime. Sort of like how the launch option '-high' gives worse FPS too, but most would think it would help.
> 
> I'm a bit rusty on USB stuff, but on Windows 7, is uninstalling USB 3.0 Driver the same as disabling Intel xHCI Controller in BIOS?


Not exactly, no. I suggest you disable it in the BIOS; that should simply convert the ports to USB 2.0 rather than disable them completely.


----------



## agsz

Quote:


> Originally Posted by *pstN*
> 
> not exactly no, I suggest you disable it in the bios. it should simply convert them to usb 2.0 ports and not completely disable them.


I thought it achieved the same thing, since Windows 7 doesn't natively support USB 3.0, so without drivers the ports would be controlled by EHCI, not xHCI - at least I thought so. I'm desperately trying anything to make my Zowie EC1-A feel more responsive.


----------



## HAGGARD

Quote:


> Originally Posted by *agsz*
> 
> Yeah I didn't think it would happen, I would then restart my PC in between testing 0.5 and 1.0ms, and 1.0ms had better performance / less fps variance everytime. Sort of like how the launch option '-high' gives worse FPS too, but most would think it would help.
> 
> I'm a bit rusty on USB stuff, but on Windows 7, is uninstalling USB 3.0 Driver the same as disabling Intel xHCI Controller in BIOS?


More frames _and_ less frame variance? Kind of have to think that's setup-dependent, because I get more stable frametimes at .5ms and no appreciable difference in average framerate. Similar to how the -high launch option ("high" process priority) indeed does not increase raw performance (and can even reduce it by a handful of frames on average, likely depending on the CPU's multicore/thread performance) but does improve stability.

I guess you could use other benchmarks as the method might be affecting stuff. There's that FPS map, or just good old fraps. And look at polls in-game too. I suppose if the poll processing improvements are marginal or even worse as well, clockres really is not for you(r setup).

Uninstalling the driver means Windows 7 (which does indeed not offer Microsoft xHCI drivers) will always be looking for the drivers. Well, at least at boot or when the Update service is started or whatever. Either way, I'd always make sure to disable a component altogether, at BIOS level if possible. Depending on the USB 3.0 implementation on your board, that might even affect the entire USB stack's behaviour. Wouldn't be the first time I've seen improvements just by disabling 3.0. That said, the soft method of having the driver installed and just disabling the 3.0 controllers in the device manager sounds good enough if you do need those sometimes.


----------



## pstN

Quote:


> Originally Posted by *agsz*
> 
> I thought it achieved the same thing, since Windows 7 doesn't natively support USB 3.0, so without drivers they would be controlled by EHCI, not xHCI, at least I thought so. I'm desperately trying anything to make my Zowie EC1-A feel more responsive


is there a reason why you don't want to disable xHCI?


----------



## agsz

Quote:


> Originally Posted by *HAGGARD*
> 
> More frames _and_ less frame variance? Kind of have to think that's setup-dependent, because I get more stable frametimes at .5ms and no appreciable difference in average framerate. Similar to how the -high launch option ("high" process priority) indeed does not increase raw performance (and can even reduce it by a handful of frames on average, likely depending on the CPU's multicore/thread performance) but does improve stability.
> 
> I guess you could use other benchmarks as the method might be affecting stuff. There's that FPS map, or just good old fraps. And look at polls in-game too. I suppose if the poll processing improvements are marginal or even worse as well, clockres really is not for you(r setup).
> 
> Uninstalling the driver means Windows 7 (which does indeed not offer Microsoft xHCI drivers) will always be looking for the drivers. Well, at least at boot or when the Update service is started or whatever. Either way, I'd always make sure to disable a component altogether, at BIOS level if possible. Depending on the USB 3.0 implementation on your board, that might even affect the entire USB stack's behaviour. Wouldn't be the first time I've seen improvements just by disabling 3.0. That said, the soft method of having the driver installed and just disabling the 3.0 controllers in the device manager sounds good enough if you do need those sometimes.


I have the ASUS Z97-AR, if that matters, but yeah I don't need the drivers for anything. I realized that certain USB 3.0 drivers made my mouse feel more/less sensitive, so the only time I'd install drivers again is to go back and test to confirm. My method for testing was going to the same spot on de_mirage and lining up my crosshair, and swiping from edge to edge across my mousepad, and doing this a few times, and screenshotting the start/end points to confirm I did it correctly, then compare afterwards.

The FPS Benchmark map is good, but has its flaws. Sitting @ 999 fps or so for most of the map until you hit the lineup of ~20 T/CT models, and the overpowered smoke puffs, is what I feel is the issue. Timedemo method has been a pretty good benchmark for me at least, I use it to test NVIDIA Drivers, settings changes, etc.
Quote:


> Originally Posted by *pstN*
> 
> is there a reason why you don't want to disable xHCI?


Not at all, was just curious if removing the driver achieved the same thing.


----------



## Th3Awak3n1ng

Quote:


> Originally Posted by *agsz*
> 
> Timedemo method has been a pretty good benchmark for me at least, I use it to test NVIDIA Drivers, settings changes, etc.


Yeah, same here.


----------



## iceskeleton

So speaking of timedemo with csgo, I did the demo taken here http://www.hltv.org/blog/7971-one-of-the-best-method-to-check-your-fps-in-csgo-old

win7-pro 64bit/i5-3570/gtx 760 @ 353.06 driver
And this is what I got, did a game restart when changing settings

with 0.5ms timertool and -high

7487 frames 27.448 seconds 272.77 fps ( 3.67 ms/f) 22.599 fps variability
7487 frames 27.902 seconds 268.33 fps ( 3.73 ms/f) 22.136 fps variability
7487 frames 27.933 seconds 268.03 fps ( 3.73 ms/f) 21.836 fps variability
without timertool

7487 frames 28.514 seconds 262.58 fps ( 3.81 ms/f) 22.150 fps variability
7487 frames 28.630 seconds 261.51 fps ( 3.82 ms/f) 21.114 fps variability
7487 frames 28.851 seconds 259.50 fps ( 3.85 ms/f) 21.147 fps variability
without timertool without -high

7487 frames 28.402 seconds 263.61 fps ( 3.79 ms/f) 22.086 fps variability
7487 frames 28.628 seconds 261.53 fps ( 3.82 ms/f) 21.207 fps variability
7487 frames 28.647 seconds 261.35 fps ( 3.83 ms/f) 21.326 fps variability
For me timertool 0.5ms does make a difference in fps it seems. High doesn't seem to change much at all.
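For reference, the fps and ms/f columns in those timedemo lines are just frames divided by elapsed seconds (the variability column is computed internally by the engine and can't be rederived from the aggregates). A quick sketch; the function name is my own:

```python
def timedemo_stats(frames: int, seconds: float) -> tuple[float, float]:
    """Average fps and milliseconds per frame for a timedemo run."""
    fps = frames / seconds
    ms_per_frame = 1000.0 * seconds / frames
    return fps, ms_per_frame

# First run above: 7487 frames in 27.448 seconds
fps, mspf = timedemo_stats(7487, 27.448)
print(f"{fps:.2f} fps ({mspf:.2f} ms/f)")  # → 272.77 fps (3.67 ms/f)
```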

Also unrelated but I had been using Nvidia driver 368.81 since I played overwatch and thought I would get better fps with the new driver at that time, but started getting into csgo again and found that 353.06 gave ~20 fps more for csgo with timertool


didn't benchmark overwatch though


----------



## agsz

Quote:


> Originally Posted by *iceskeleton*
> 
> So speaking of timedemo with csgo, I did the demo taken here http://www.hltv.org/blog/7971-one-of-the-best-method-to-check-your-fps-in-csgo-old
> 
> win7-pro 64bit/i5-3570/gtx 760 @ 353.06 driver
> And this is what I got, did a game restart when changing settings
> 
> with 0.5ms timertool and -high
> 
> 7487 frames 27.448 seconds 272.77 fps ( 3.67 ms/f) 22.599 fps variability
> 7487 frames 27.902 seconds 268.33 fps ( 3.73 ms/f) 22.136 fps variability
> 7487 frames 27.933 seconds 268.03 fps ( 3.73 ms/f) 21.836 fps variability
> without timertool
> 
> 7487 frames 28.514 seconds 262.58 fps ( 3.81 ms/f) 22.150 fps variability
> 7487 frames 28.630 seconds 261.51 fps ( 3.82 ms/f) 21.114 fps variability
> 7487 frames 28.851 seconds 259.50 fps ( 3.85 ms/f) 21.147 fps variability
> without timertool without -high
> 
> 7487 frames 28.402 seconds 263.61 fps ( 3.79 ms/f) 22.086 fps variability
> 7487 frames 28.628 seconds 261.53 fps ( 3.82 ms/f) 21.207 fps variability
> 7487 frames 28.647 seconds 261.35 fps ( 3.83 ms/f) 21.326 fps variability
> For me timertool 0.5ms does make a difference in fps it seems. High doesn't seem to change much at all.
> 
> Also unrelated but I had been using Nvidia driver 368.81 since I played overwatch and thought I would get better fps with the new driver at that time, but started getting into csgo again and found that 353.06 gave ~20 fps more for csgo with timertool
> 
> didn't benchmark overwatch though


I have a 770, and found 344.11 & 350.12 to be the best. I'll try timertool again, I did it months ago, and every update seems to change CS:GO


----------



## Demi9OD

Why would you ever unset the timer? Why not just leave it set at .5ms all the time?


----------



## HAGGARD

Quote:


> Originally Posted by *Demi9OD*
> 
> Why would you ever unset the timer? Why not just leave it set at .5ms all the time?


Energy consumption. The primary reason these little programs and services for manually adjusting the clock resolution exist in the first place is that there used to be problems with software setting low values and the OS not resetting them, which apparently led to significantly increased battery draw on laptops.

On desktop it's pretty negligible, and most browsers set 1ms anyway. But there's no reason to keep it forced low either; not like you need the slightly increased performance all the time.


----------



## agsz

For Windows 7 users: have any of you installed the '*June 2015 Intel CPU microcode update for Windows*' (KB3064209), and if so, have you noticed any differences with your mouse? I'm not exactly sure if it affected polling or something else; I need to go back and test it a bit more, but my mouse felt wacky as fack after an update that I didn't think could possibly cause that.


----------



## VolsAndJezuz

Doubt it would make any perceptible difference in mouse feel or measurable difference in USB polling, but it will make the same overclock require more Vcore if you update to a CPU microcode newer than Rev. 18/28. I like how you keep searching tirelessly for the most arbitrary and minute things while "desperately trying anything to make my Zowie EC1-A feel more responsive", yet you refuse to change the single thing that I've told you countless times would make the biggest difference: disabling xHCI, xHCI hand-off, and EHCI hand-off. I'm beginning to think you're just trolling me at this point.


----------



## Argowashi

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> Doubt it would make any perceptible difference in mouse feel or measureable difference in USB polling, but it will make the same overclock require more Vcore if you update to CPU microcode newer than Rev. 18/28. I like how you keep searching tirelessly for the most arbitrary and minute things for "desperately trying anything to make my Zowie EC1-A feel more responsive", yet you refuse to change the single thing that I've told you countless times would make the biggest difference: disabling xHCI, xHCI hand-off, and eHCI hand-off. I'm beginning to think you're just trolling me at this point


How would I go about disabling hand-off on a new Asus motherboard?


----------



## agsz

Quote:


> Originally Posted by *Argowashi*
> 
> How would I go about disabling hand-off on a new Asus motherboard?


Advanced Tab > USB Settings


----------



## Argowashi

Quote:


> Originally Posted by *agsz*
> 
> Advanced Tab > USB Settings


Thanks. : )


----------



## Argowashi

Found this neat feature on the Gigabyte Z170 Gaming G1 called DAC-UP and I know it says it's for DACs but could this also help the mouse polling? They're also USB 2.0 instead of 3.0.

If you scroll down a little on this site you'll find what I'm talking about: gigabyte.com/products/product-page.aspx?pid=5478#ov


----------



## pstN

Quote:


> Originally Posted by *Argowashi*
> 
> Found this neat feature on the Gigabyte Z170 Gaming G1 called DAC-UP and I know it says it's for DACs but could this also help the mouse polling? They're also USB 2.0 instead of 3.0.
> 
> If you scroll down a little on this site you'll find what I'm talking about: gigabyte.com/products/product-page.aspx?pid=5478#ov


I always wondered about this but never tested it since I disabled just about everything in the bios. I suggest you test it and compare with MouseTester.


----------



## Argowashi

Quote:


> Originally Posted by *pstN*
> 
> I always wondered about this but never tested it since I disabled just about everything in the bios. I suggest you test it and compare with MouseTester.


No I don't have the motherboard myself, I was wondering if anyone knew if the USB ports being separate would help with polling precision. I think they would but I don't really know much about this stuff.


----------



## x7007

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> Doubt it would make any perceptible difference in mouse feel or measureable difference in USB polling, but it will make the same overclock require more Vcore if you update to CPU microcode newer than Rev. 18/28. I like how you keep searching tirelessly for the most arbitrary and minute things for "desperately trying anything to make my Zowie EC1-A feel more responsive", yet you refuse to change the single thing that I've told you countless times would make the biggest difference: disabling xHCI, xHCI hand-off, and eHCI hand-off. I'm beginning to think you're just trolling me at this point


But disabling xHCI means I can't use my USB 3.0 SanDisk dongle, which can reach 200 MB/s read and 160 MB/s write, or a Seagate USB drive which can reach 200 MB/s read and 100 MB/s write.


----------



## VolsAndJezuz

Using USB drive in current year LUL


----------



## deepor

Quote:


> Originally Posted by *Argowashi*
> 
> Found this neat feature on the Gigabyte Z170 Gaming G1 called DAC-UP and I know it says it's for DACs but could this also help the mouse polling? They're also USB 2.0 instead of 3.0.
> 
> If you scroll down a little on this site you'll find what I'm talking about: gigabyte.com/products/product-page.aspx?pid=5478#ov


What this reminds me of is, I built in a tiny ITX case recently, the kind that needs an external power supply brick instead of a normal PSU, and I bought the cheapest ITX board with WiFi I could find.

On that PC, I used speakers that get power through USB, but get their audio signal normally through a 3.5mm jack. That absolutely wasn't usable because of the cheap ITX board. If you connect the speakers to the board's USB, you can hear noise when moving the mouse pointer around for example. It sounds a bit like coil whine. It depends on what the CPU is working on. The "solution" was to get power for the speakers through a phone charger. The board's USB was just completely unusable for supplying power to the speakers.

This "DAC-UP" marketing thing you found would be basically them promising that there's no problem like that for their board.


----------



## Argowashi

Quote:


> Originally Posted by *deepor*
> 
> What this reminds me of is, I built in a tiny ITX case recently, the kind that needs an external power supply brick instead of a normal PSU, and I bought the cheapest ITX board with WiFi I could find.
> 
> On that PC, I used speakers that get power through USB, but get their audio signal normally through a 3.5mm jack. That absolutely wasn't usable because of the cheap ITX board. If you connect the speakers to the board's USB, you can hear noise when moving the mouse pointer around for example. It sounds a bit like coil whine. It depends on what the CPU is working on. The "solution" was to get power for the speakers through a phone charger. The board's USB was just completely unusable for supplying power to the speakers.
> 
> This "DAC-UP" marketing thing you found would be basically them promising that there's no problem like that for their board.


Yeah you're probably right about that. Now to the question of whether or not it helps with mice lol.


----------



## Demi9OD

Thought I would share my gaming/desktop toggle batch in case anyone is interested.

It disables DWM.
Sets my monitor to 144hz for faster alt tabbing.
Enables Nvidia Digital Vibrance with a setting around 65 (default 50).
Sets the Power Config to High Power. (core parking min cores 100%, throttle states off, min proc state 100%, etc.)
Starts the Set Timer Resolution Service.

A second launch will reverse all settings and set the power profile back to balanced. I prefer my monitor at 60hz when not playing games for the increased contrast, even though I lose the smoothness of 144hz.

Code:

@echo off
rem "sc interrogate" prints error 1062 when the service is not running,
rem so finding "1062" means DWM (uxsms) is already stopped and we
rem should switch back to desktop mode.
sc interrogate uxsms | find "1062"
if %errorlevel%==0 goto :sc_start

rem -- Gaming mode: DWM off, 144Hz, vibrance up, High Performance plan, timer service on --
sc stop uxsms
"C:\Program Files (x86)\NirCmd\nircmdc.exe" setdisplay monitor:"LG 24GM77(DVI)" 1920 1080 32 144 -updatereg
"C:\Program Files (x86)\Digital Vibrance\DVChange.exe" 20
powercfg -s 8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c
net start STR
exit

:sc_start
rem -- Desktop mode: DWM on, 60Hz, vibrance default, Balanced plan, timer service off --
sc start uxsms
"C:\Program Files (x86)\NirCmd\nircmdc.exe" setdisplay monitor:"LG 24GM77(DVI)" 1920 1080 32 60 -updatereg
"C:\Program Files (x86)\Digital Vibrance\DVChange.exe" 0
powercfg -s 381b4222-f694-41f0-9685-ff5bb260df2e
net stop STR
exit


----------



## Fylzka

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> Using USB drive in current year LUL


not even golden LUL nor post #420 in 2016 LUL


----------



## Argowashi

Is there a noticeable difference in latency/polling issues between W7 and W10? Might need to upgrade soon so I can try out DX12.


----------



## James N

Quote:


> Originally Posted by *Argowashi*
> 
> Is there a noticeable difference in latency/polling issues between W7 and W10? Might need to upgrade soon so I can try out DX12.


Windows 10 is arse. Try to stick with win7 as long as you possibly can.


----------



## Demi9OD

Yep, just dual boot for DX12.


----------



## Th3Awak3n1ng

[graph]

Ok guys, tell me please: is it good enough?

Zowie ZA13 @ 1000Hz.


----------



## m0uz

Quote:


> Originally Posted by *Th3Awak3n1ng*
> 
> 
> 
> [graph]
> 
> Ok guys, tell me please: is it good enough?
> 
> Zowie ZA13 @ 1000Hz.


Seems pretty decent to me. Much better than mine, anyway.


----------



## Argowashi

Logitech G Pro on a new Windows 10 installation.


----------



## NeoReaper

Trying everything possible to get this to go any lower and I just can't seem to make any improvement from this:

EDIT: I would just like to add, this is with my antivirus Bitdefender 2017 running... In fact every result I posted had Bitdefender running as it's never made a performance hit on anything.


----------



## Argowashi

Quote:


> Originally Posted by *NeoReaper*
> 
> Trying everything possible to get this to go any lower and I just can't seem to make any improvement from this:


Looks pretty good to me.


----------



## James N

Quote:


> Originally Posted by *NeoReaper*
> 
> Trying everything possible to get this to go any lower and I just can't seem to make any improvement from this:
> 
> EDIT: I would just like to add, this is with my antivirus Bitdefender 2017 running... In fact every result I posted had Bitdefender running as it's never made a performance hit on anything.


I had similar results and couldn't get it lower without disabling almost everything, audio, network adapters and gpu drivers included.

I reinstalled Windows on a different hard drive and optimized everything the same way I did with my normal installation. Then I installed everything I needed one by one, retesting every time. I was unable to get the same results I had achieved in safe mode. And among the things you need are definitely the Nvidia drivers (which caused higher values and also higher DPC latency) and audio drivers. I'm not sure, but I have tested 4 different systems with the same mouse and Windows 10 on them and they all sort of got down to the range you got it to, but not below. Maybe Windows 10 is part of the issue.

After reinstalling windows 7 and optimizing it, i got almost the same results that i have gotten with windows 10 in safemode.

One more thing to note is that core temp running in the background also increased my results.

Your results look pretty good.


----------



## plyr

Circular movements on my Revel, not sure I did it right tho...


----------



## daniel0731ex

Exclude the first few points.


----------



## plyr

Oh, right...


----------



## pierow

Does anyone know how I can troubleshoot these spikes every 100ms?


----------



## pox02

Good Bye Gpu Processing lag on amd









144hz no scaling









found an AMD driver bug, let me know if anyone wants it


----------



## iceskeleton

For NVIDIA on Windows 10 with my BenQ XL2411T I used this method to get display scaling (Windows 7 worked just fine with display scaling being present)

http://steamcommunity.com/sharedfiles/filedetails/?id=592511209


----------



## Oh wow Secret Cow

Quote:


> Originally Posted by *pox02*
> 
> Good Bye Gpu Processing lag on amd
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 144hz no scaling
> 
> 
> 
> 
> 
> 
> 
> 
> 
> found bug amd driver bug let me know if anyone want it


I'll bite since I'm using an AMD GPU, what did you find


----------



## kyotkyotkyot

Why wouldn't you just post the bug...


----------



## pox02

Quote:


> Originally Posted by *Oh wow Secret Cow*
> 
> I'll bite since I'm using an AMD GPU, what did you find


It removes a lot of crosstalk; it forces 8-bit locked output and it feels a lot better


----------



## PurpleChef

Quote:


> Originally Posted by *pox02*
> 
> Good Bye Gpu Processing lag on amd
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 144hz no scaling
> 
> 
> 
> 
> 
> 
> 
> 
> 
> found bug amd driver bug let me know if anyone want it


What processing lag are you talking about? Nvidia is the one with issues when using GPU scaling.

So you're saying that you turned off GPU scaling?
Quote:


> Originally Posted by *pox02*
> 
> Its remove a lot on crosstalk its force to use 8 bit locked its feel a lot beeter


What did you do and how is it better?


----------



## m0uz

Just thought I'd share my before and after plots












Edit: Mouse is Nixeus Revel


----------



## James N

Quote:


> Originally Posted by *m0uz*
> 
> Just thought I'd share my before and after plots
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Mouse is Nixeus Revel


Before doing what? Just general optimization? Grats on the results.


----------



## IaVoR

I'll post my tests here but I have no clue how to even interpret them.

First is USB2


Second is USB3


USB2 Second try


----------



## m0uz

Quote:


> Originally Posted by *James N*
> 
> Before doing what? Just general optimization? Grats on the results.


Just some general optimization stuff like disbling crap in bios etc. However, now I have a strange "Activate Windows" thing on my screen after doing it. Might have disabled something in the bios that is linked to the hardware activation of Windows 10.


----------



## Argowashi

Quote:


> Originally Posted by *m0uz*
> 
> Just some general optimization stuff like disbling crap in bios etc. However, now I have a strange "Activate Windows" thing on my screen after doing it. Might have disabled something in the bios that is linked to the hardware activation of Windows 10.


That might be the Secure Boot setting.


----------



## m0uz

Quote:


> Originally Posted by *Argowashi*
> 
> That might be the Secure Boot setting.


Cheers! I'll have a look

Edit: lol "dis bling"


----------



## x7007

Why are the LGS drivers so crap?

When I had them installed the mouse moved like crap.

G502 + Windows 10

After uninstalling the drivers the mouse moves so much better.. I only gave them a try because I wanted to use the mouse pad calibration with the Tiger mouse feet, thinking it would improve things. Just don't use the Logitech drivers. You can't realise how crap they make the mouse feel, just unbelievable compared to no drivers. I wanted to stop using my computer or throw the mouse out the window; after uninstalling I became relaxed again and had the fun I had before .. it is so LoL ,,, cause you need those drivers for other crap, and what will people do without the drivers ????


----------



## Derp

Quote:


> Originally Posted by *x7007*
> 
> Why LGS drivers so crap ?
> 
> When I had it installed the mouse moved like crap.
> 
> G502 + windows 10
> 
> After uninstalling the drivers the mouse move so much.. I gave it a test though it will improve thing because I wanted to try the mouse pad calibration with the Tiger mouse feet. Just don't use the Logitech Drivers. Can't realise how crap it makes the mouse, just unbelievable compare to no drivers. I wanted to stop using my computer or throw the mouse out the window, after uninstalling I became relaxed again and having the fun I had before .. it is so LoL ,,, cause you need those drivers for other crap, and what people will do without the drivers ????


I can't say that I noticed Logitech's drivers changing the tracking at all but their recent mice save everything to the on-board memory so this isn't really a problem. Just install the drivers once to configure everything and get rid of them if you feel something strange with them installed.


----------



## m0uz

Secure boot seems to be disabled. I booted up again and Windows was activated and then it decided it wasn't after about 2 minutes. Help me fam!

Edit: It says in msinfo32 that;

BIOS Mode: Legacy

Secure Boot State: Unsupported (because legacy maybe?)


----------



## deepor

There should be something that looks like a link on a web-page that you can click. It will open a window with instructions to follow.

In the past that involved calling a phone number where a machine would respond. You were shown codes on the PC's screen that you were supposed to type into the phone's number pad, and the machine on the other end then told you a code that you'd type at the PC and it would activate. I don't know if that's still how things work with Windows 10.


----------



## Demi9OD

Quote:


> Originally Posted by *m0uz*
> 
> Secure boot seems to be disabled. I booted up again and Windows was activated and then it decided it wasn't after about 2 minutes. Help me fam!
> 
> Edit: It says in msinfo32 that;
> 
> BIOS Mode: Legacy
> 
> Secure Boot State: Unsupported (because legacy maybe?)


Pretty sure something you changed in BIOS has just made Windows believe you are now running it on new hardware. Should be no big deal to re-activate with "new" hardware if it's a legit Win 10 copy.


----------



## m0uz

Quote:


> Originally Posted by *Demi9OD*
> 
> Pretty sure something you changed in BIOS has just made Windows believe you are now running it on new hardware. Should be no big deal to re-activate with "new" hardware if it's a legit Win 10 copy.


It is legit but it's a "free update" version of 10 rather than a key

Also, I've tried resetting the bios to defaults and it still says Secure Boot state is disabled

Edit: Needed to enable "Windows 8" mode to get the option of turning on or off Secure Boot. It still doesn't work with 10 and I'm still not activated


----------



## deepor

Quote:


> Originally Posted by *m0uz*
> 
> It is legit but it's a "free update" version of 10 rather than a key
> 
> Also, I've tried resetting the bios to defaults and it still says Secure Boot state is disabled


Shortly before the free upgrade offer ran out, I installed a Windows 10 on an empty machine that had never gone through the upgrade normally. I could activate it after the installation by typing in a Windows 8 key through that "change product key" button in the "System" window. Perhaps that still works? You could try typing in your 7 or 8 key in there and see if something happens.


----------



## m0uz

Apparently I have 22 registry issues under HKEY_LOCAL_MACHINE\Software\Wow6432Node\Microsoft\Windows\CurrentVersion\SharedDLLs\C:\Windows\system32

I'll just reinstall the bloody OS!


----------



## Demi9OD

What told you you have registry issues?


----------



## m0uz

Quote:


> Originally Posted by *Demi9OD*
> 
> What told you you have registry issues?


Piece of software. Can't remember what it was called. It was legit but it might have been a false positive. I've reinstalled but the issue still persists.


----------



## Demi9OD

Quote:


> Originally Posted by *m0uz*
> 
> Piece of software. Can't remember what it was called. It was legit but it might have been a false positive. I've reinstalled but the issue still persists.


I really wouldn't worry about what a third party scanner says about the registry. If this doesn't work I'm not sure what else to suggest; I've avoided Win 10 so far.

https://support.microsoft.com/en-us/help/20530/windows-10-reactivating-after-hardware-change


----------



## m0uz

"We can't activate Windows on this device because we can't connect to your organisation's activation server. Make sure that you're connected to your organisation's network and try again"

DAFUQ?!


----------



## Demi9OD

If this is your home machine and has never been connected to a domain, maybe you really do need to reinstall.


----------



## m0uz

Whatever has happened it's really dun goofed either on my end or MS's


----------



## Alya

Boy you done did it.


----------



## m0uz

Quote:


> Originally Posted by *Alya*
> 
> Boy you done did it.


I'm sorry sen3.14


----------



## patoux01

While this thread is mainly about Windows, similar reasoning could (should?) apply to Linux, would anyone know if such a guide exists for Linux? (debian)


----------



## m0uz

Here's a clearer plot of my polling. It's already miles better than it was before but has anyone got any ideas as to how I could reduce it further?



Edit: The spikes seem to be happening on a regular basis. Anyone know why?


----------



## Th3Awak3n1ng

Did some tests with different HPET parameters.


Spoiler: HPET BIOS OFF OS OFF









Spoiler: HPET BIOS ON OS OFF









Spoiler: HPET BIOS ON OS ON







Which one is better?


----------



## baskinghobo

Looks like HPET makes things slightly more stable at the price of precision


----------



## Melan

Leave it at default. On in BIOS, off in OS.
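For reference, the OS-side switch being debated here is a boot-configuration flag, not a driver setting; HPET itself is enabled or disabled in BIOS setup. A sketch of the relevant commands as a config fragment (run from an elevated prompt, reboot to apply; these are standard `bcdedit` options, not something posted earlier in the thread):

```shell
:: Force the platform clock (HPET, when available) as the OS timer source
bcdedit /set useplatformclock true

:: Remove the override and let Windows choose the timer itself (the default)
bcdedit /deletevalue useplatformclock

:: Inspect the current boot entry to see which overrides are active
bcdedit /enum {current}
```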


----------



## SweetLow

Having read this thread (and other similar threads):
People, why do you think that enabling or disabling HPET, or changing "UsePlatformClock" or "/usepmtimer" (in this case you are actually just selecting a different real hardware source for the abstract OS timer), makes POLLING more precise?


----------



## PLUSPUNKT

Some drops, but looks good so far
Logitech G303 + W10


----------



## Melan

Those are xCounts not Update time.


----------



## VolsAndJezuz

Quote:


> Originally Posted by *SweetLow*
> 
> As i read this thread (and other same threads) .
> People, why do you think, that enabling or disabling HPET, changing "UsePlatformClock" or "/usepmtimer" (actually in this case you simply select different real hardware sources for abstract OS timer) make POLLING more precise?


http://www.overclock.net/t/1609933/terminology-pet-peeves-thread-misnomers-that-annoy-you/0_20#post_25472818

Referring to this stuff as 'USB polling' is just much easier than saying something like 'Operating System induced jitter relative to USB report rate'. Just think of it as colloquial use of the term.


----------



## SweetLow

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> http://www.overclock.net/t/1609933/terminology-pet-peeves-thread-misnomers-that-annoy-you/0_20#post_25472818
> 
> Referring to this stuff as 'USB polling' is just much easier than saying something like 'Operating System induced jitter relative to USB report rate'. Just think of it as colloquial use of the term.


That's all true, but I was talking about something else. In your terms I'd say "playing with those parameters *doesn't* change the (real) OS-introduced jitter at all"


----------



## HAGGARD

Quote:


> Originally Posted by *SweetLow*
> 
> It's all true, but i told about some other stuff. In your terms i say "playing with some parameters *don't* change (real) OS introduced jitter at all"


Well, most people assume those parameters make a difference because the measurements change (see for example most recently post #469).

I'm not really sure myself whether, and how "enabling or disabling HPET, changing "UsePlatformClock" or "/usepmtimer"" significantly affects polling precision - differences in measurements between those are within negligible range on my setup. If you however have that knowledge, please do share.


----------



## SweetLow

Quote:


> Originally Posted by *HAGGARD*
> 
> Well, most people assume those parameters make a difference because the measurements change (see for example most recently post #469).
> 
> I'm not really sure myself whether, and how "enabling or disabling HPET, changing "UsePlatformClock" or "/usepmtimer"" significantly affects polling precision - differences in measurements between those are within negligible range on my setup. If you however have that knowledge, please do share.


It's obvious







You use the keyword "measurement" two times but ignore it.
When we *profile* "jitter", there isn't only one process involved (getting messages from hardware to software). The second process, running on the same PC rather than on standalone hardware, is the measurement itself. And it has its own precision, its own "jitter" and its own affecting settings (which I mentioned in my first post).
P.S. AFAIK this thread might be more precisely named "Optimization of reactivity of a soft real-time task running on a non-real-time OS"


----------



## HAGGARD

Quote:


> Originally Posted by *SweetLow*
> 
> It's obvious
> 
> 
> 
> 
> 
> 
> 
> You use keyword "measurement" two times but ignore it.
> When we *profile* "jitter" we don't have only one process - getting messages from hardware to software. Second process (running on the same PC, not standalone hardware) - it's measurement itself. And it have its own precision, "jitter" and its own affecting settings (which i mentioned in my first post).


Very true. Better results are still better results, are they not? Less jitter is still less jitter. How are you to know where that jitter has its root - OS handling of USB reports or handling of the software used to make measurements? Again, if you do have that knowledge, I'd be glad to hear all about it. And besides, it would still be an optimization either way, even if not strictly related to input handling.
Quote:


> P.S. AFAIK, this thread may be precisely named "Optimization of reactivity of soft real time task running on non real time OS"


True. Not quite as catchy though! ;D


----------



## VolsAndJezuz

I think that optimizations that improve USB polling graphs (or whatever you want to call it) in MouseTester are a good way to measure very small changes to how the OS (or user mode programs in the OS, if you prefer) can handle very rapid and particularly timed function calls. A single optimization that shows a small but statistically significant change in MouseTester might not really be reflected by a difference in a game's FPS. But when you compound the effect of many good optimizations, there is a staggering difference in performance. With my overclocking and optimizations, I get higher FPS in the CS:GO FPS benchmark and fps_test3 runs than I've ever seen claimed by anyone else, including friends with GTX 1080s and much more expensive CPUs than mine who I asked to run those tests. And it is smooth as butter, like my brain is directly controlling my aim in game and my arm and the mouse aren't even being used.
Quote:


> Originally Posted by *SweetLow*
> 
> It's obvious
> 
> 
> 
> 
> 
> 
> 
> You use keyword "measurement" two times but ignore it.
> When we *profile* "jitter" we don't have only one process - getting messages from hardware to software. Second process (running on the same PC, not standalone hardware) - it's measurement itself. And it have its own precision, "jitter" and its own affecting settings (which i mentioned in my first post).
> P.S. AFAIK, this thread may be precisely named "Optimization of reactivity of soft real time task running on non real time OS"


To answer your concern about what exactly we are measuring with these polling precision graphs, I would ask the question: what is a game? A 'second process' that has its own precision, jitter, and settings, and collects USB mouse data much the same way a program like MouseTester would go about collecting mouse data. So the assumption here is that observing changes in MouseTester plots represents changes in in-game performance.

I really don't see how you can argue that point. The only real (and perfectly reasonable) argument is on what scale we as humans can detect the changes we are observing. Can we detect a consistent 500us jitter, like you see with 1000Hz polling on horribly optimized and outdated systems? Almost certainly. Can we detect repeating 1us jitter, like the difference in 0.999ms and 1.000ms update interval? Almost certainly not. Maybe the most sensitive of us can detect something like repeating 10us jitters. Maybe for most people it is something like 100us. I don't know but the point is that we can detect polling precision on some scale, so optimizing it is probably a worthwhile endeavor.
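The jitter scales being discussed can be pulled straight out of a MouseTester-style interval log. A minimal sketch, assuming a list of update intervals in milliseconds (the sample log below is made up for illustration, not taken from any plot in this thread):

```python
import statistics

def jitter_stats_us(intervals_ms):
    """Summarize polling jitter in microseconds from an interval log."""
    mean = statistics.fmean(intervals_ms)
    return {
        "mean_us": mean * 1000.0,                                        # average poll interval
        "stdev_us": statistics.stdev(intervals_ms) * 1000.0,             # typical jitter
        "max_dev_us": max(abs(dt - mean) for dt in intervals_ms) * 1000.0,  # worst single outlier
    }

# Made-up example of a well-behaved 1000Hz log (intervals in ms)
log = [1.000, 1.010, 0.990, 1.005, 0.995, 1.002, 0.998]
print(jitter_stats_us(log))
```

With numbers like these in hand, the "can humans feel 10us vs 100us" question at least refers to a measured quantity rather than an eyeballed plot.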


----------



## VolsAndJezuz

Polling data I just collected for reference:



Edit: just used new MouseTester for the first time... looks like its overhead is even lower than v1.1 I was using.


Spoiler: old MouseTester


----------



## HAGGARD

Good post (and sick measurements! ;p).

When I argued those same points in the past, I put the "humanly-detectable" range more in the hundreds of microseconds variance, the "perfectionist over-optimized" range being in the single-digit microseconds. But that's 1. entirely anecdotal, 2. represents precision in a very bare environment that obviousy changes in an in-game scenario where a lot more stress is put on the system and 3. this sub-forum's and entire forum's topics are pretty "perfectionistic" endeavours to begin with - doesn't mean they are not worthwhile. I had fun tinkering around with my system, seeing the results in-game (actual, measurable, cognisable results and placebo-induced results alike) and have fun applying that knowledge to other systems.

And as has been mentioned, even if the measurements captured only the software-handling jitter and not input-handling jitter of the OS (which I know for a fact it doesn't: you can look at thread activity with DispatchMon and see varying CPU routine execution times for USB interrupts), that would still come with the same benefit for the overall system performance.

But I don't think SweetLow was arguing this point on that level, or generally questioning the benefit of optimizations like these; it seems it was more about HPET and OS timer in particular?


----------



## VolsAndJezuz

I think the bare environment is sort of necessary for making MouseTester data useful though. Because otherwise (say if you try to take MouseTester data while playing CS:GO) there is too much noise and too many variables to be able to draw many valid conclusions about all but the most dramatic optimizations. So the bare environment is sort of a scientific control for our polling precision experiments.

But I agree, the tinkering can be enjoyable to a point. And if I find something worthwhile, I like sharing it on here. Some people seem to get really offended by this though and go out of their way to voice their disapproval, which is a strange phenomenon (not talking about SweetLow here).

I really get the impression that he was talking in a more general sense, but I'm sure he will clarify. On my system, disabling HPET in BIOS is undoubtedly worse in MouseTester plots, as is forcing it one way or the other in the OS. I guess I could use the log file and do the math if you really wanted, but I'm 99% sure it would qualify as a statistically significant difference with any reasonable significance level. As far as I'm aware, all Intel motherboards since at least 7-series have had decent enough HPET implementations that the ole 'disable HPET' advice is ill-advised. I would expect leaving HPET available and letting the OS choose the appropriate timer as needed would be the most sensible, and my experiments support that on the Z75, Z77, and Z97 platforms I've owned.


----------



## NeoReaper

Just got a Sabre RGB because my KATAR was faulty... Can someone explain this?


----------



## m0uz

Quote:


> Originally Posted by *NeoReaper*
> 
> Just got an Sabre RGB because my KATAR was faulty... Can someone explain this?


Dropped reports. Very common if the firmware is garbage. Just for funsies, can you zoom in on a small area such as between counts 2000 and 2500?
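The dropped reports described here are easy to flag programmatically: at a nominal 1000Hz, a missed report shows up as an update interval near a whole multiple of 1ms, well beyond ordinary OS jitter. A rough sketch, with a made-up interval log for illustration:

```python
def count_dropped(intervals_ms, nominal_ms=1.0, tolerance=0.25):
    """Count likely dropped reports in a MouseTester-style interval log.

    An interval close to n * nominal (n >= 2) suggests n-1 consecutive
    reports were skipped by the mouse; normal host-side jitter stays
    well under half the nominal interval.
    """
    dropped = 0
    for dt in intervals_ms:
        n = round(dt / nominal_ms)
        if n >= 2 and abs(dt - n * nominal_ms) <= tolerance:
            dropped += n - 1
    return dropped

# Illustrative data: mostly ~1ms polls, one single and one double skip
log = [1.02, 0.98, 1.00, 2.01, 1.01, 0.99, 3.02, 1.00]
print(count_dropped(log))  # -> 3
```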


----------



## NeoReaper

Quote:


> Originally Posted by *m0uz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NeoReaper*
> 
> Just got an Sabre RGB because my KATAR was faulty... Can someone explain this?
> 
> 
> 
> 
> Dropped reports. Very common if the firmware is garbage.

Sooo.... Does this actually mean this mouse is worse than the KATAR or just shows up badly in mousetester?


----------



## m0uz

Quote:


> Originally Posted by *NeoReaper*
> 
> Sooo.... Does this actually mean this mouse is worse than the KATAR or just shows up badly in mousetester?


Depends how the Katar scored in mousetester


----------



## NeoReaper

Quote:


> Originally Posted by *m0uz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NeoReaper*
> 
> Sooo.... Does this actually mean this mouse is worse than the KATAR or just shows up badly in mousetester?
> 
> 
> 
> Depends how the Katar scored in mousetester

Quote:


> Originally Posted by *NeoReaper*
> 
> Trying everything possible to get this to go any lower and I just can't seem to make any improvement from this:
> 
> EDIT: I would just like to add, this is with my antivirus Bitdefender 2017 running... In fact every result I posted had Bitdefender running as it's never made a performance hit on anything.


----------



## m0uz

That's very similar to mine and actually not incredibly bad. Report rate-wise the Katar seems fine. The Sabre on the other hand isn't good due to the very frequent drops.


----------



## SweetLow

Quote:


> Originally Posted by *HAGGARD*
> 
> Very true. Better results are still better results, are they not? Less jitter is still less jitter. How are you to know where that jitter has its root - OS handling of USB reports or handling of the software used to make measurements? Again, if you do have that knowledge, I'd be glad to hear all about it. And besides, it would still be an optimization either way, even if not strictly related to input handling.
> True. Not quite as catchy though! ;D


>Better results are still better results, are they not? Less jitter is still less jitter.
Not always.




Do you want an example?


----------



## rakzbr

is my ninox aurora 1600dpi good?

windows 10, trying to turn everything off
cpu overclocked 24/7 at 4.6ghz
hpet off

swiped harder, totally different result


made some changes and still



and hpet on is the same


damn, I'm posting a lot of images
but have I found the problem? is it my mouse?
500hz


----------



## VolsAndJezuz

Quote:


> Originally Posted by *SweetLow*
> 
> >Better results are still better results, are they not? Less jitter is still less jitter.
> Not always
> 
> 
> 
> 
> 
> 
> 
> Are you want example?


Go on...


----------



## rakzbr

Too bad I have to use 500Hz.. might be something with the software/firmware, Bucking Fst

I tried to set it to 500Hz and overclock it via the SweetLow driver, but I'm on Windows 10 and can't do that anymore


----------



## VolsAndJezuz

Quote:


> Originally Posted by *rakzbr*
> 
> damn im posting alot of images
> but i found the problem is my MICE ?
> 500hz


You are definitely dropping reports @1000Hz like the person a few posts above, so the firmware on the mouse would be the problem and is something only a firmware update from Ninox could fix. 500Hz might be alright because you only have two 4ms update times, so that could potentially just be from not moving the mouse fast enough for the test. Redo 500Hz with faster swipes and repost.


----------



## rakzbr

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> You are definitely dropping reports @1000Hz like the person a few posts above, so the firmware on the mouse would be the problem and is something only a firmware update from Ninox could fix. 500Hz might be alright because you only have two 4ms update times, so that could potentially just be from not moving the mouse fast enough for the test. Redo 500Hz with faster swipes and repost.



i bought gpro, i hope its better


----------



## VolsAndJezuz

Not the greatest polling precision, but at least it's not dropping reports @500Hz. 1000Hz definitely was. GPro won't drop reports @1000Hz but idk if your polling precision will be any better, that's mainly related to your OS environment.


----------



## HAGGARD

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> I think the bare environment is sort of necessary for making MouseTester data useful though. Because otherwise (say if you try to take MouseTester data while playing CS:GO) there is too much noise and too many variables to be able draw many valid conclusions about all but the most dramatic optimizations. So the bare environment is sort of a scientific control for our polling precision experiments.


Yeah, definitely. While taking measurements during a game-scenario can be useful for other reasons, I was just saying the microsecond improvements that may seem "overly perfectionistic" might actually make a real, tangible difference in that less bare environment; i. e. the difference between, say, 20us variance and 5us in the bare environment could actually be 200us vs. 50us while in-game - the exponentiality here is obviously only a crude example, but it's a point I've stressed multiple times: any and all improvements are more or less worthwhile, in-game certainly more so.
Quote:


> On my system, disabling HPET in BIOS is undoubtedly worse in MouseTester plots, as is forcing it one way or the other in the OS. I guess I could use the log file and do the math if you really wanted, but I'm 99% sure it would qualify as a statistically significant difference with any reasonable significance level. As far as I'm aware, all Intel motherboards since at least 7-series have had decent enough HPET implementations that the ole 'disable HPET' advice is ill-advised. I would expect leaving HPET available and letting the OS choose the appropriate timer as needed would be the most sensible, and my experiments support that on the Z75, Z77, and Z97 platforms I've owned.


I didn't do too many iterations on that, but when I played around with all the variations I found the same to be true. There's oddities and inconsistency in the results between the settings, but a general trend is visible and would surely be statistically significant if recorded over longer periods and more repetitions.
And I think I recommended the same thing in the OP: leaving HPET enabled in the BIOS and letting Windows decide on timer-utilization itself. That yielded the best results on pretty much any system I've laid my fingers on 'til now.
Quote:


> Originally Posted by *SweetLow*
> 
> >Better results are still better results, are they not? Less jitter is still less jitter.
> Not always
> 
> 
> 
> 
> 
> 
> 
> Are you want example?


Duh. Give!
Quote:


> Originally Posted by *VolsAndJezuz*
> 
> Not the greatest polling precision, but at least it's not dropping reports @500Hz. 1000Hz definitely was. GPro won't drop reports @1000Hz but idk if your polling precision will be any better, that's mainly related to your OS environment.


Yeah, microsecond-variance is 100% OS/CPU-induced. Maybe if you have a crappy USB host controller that could introduce some as well, but I'm not sure that's even possible (pretty sure they use an internal clock timer and sync operation, and USB communication operates at 125us intervals, so... you'd expect any problems there would show up in multiples of 125us, but that's neither here nor there and probably completely useless to worry about).


----------



## rakzbr

i think i got a problem. it's not the mice.
dm1pros

rival100

salmosa


my system is clean..


tested with hpet on and off
i5 4690k 4.6ghz 1.3v, 4x4gb 2400mhz cl10 1.65v, gtx 980, 2 ssds and no HDD
fresh win10 with tweaks, bloatware removed

i'll now remove my overclock to check if it has any influence
i might roll back to win7 if nothing works

edit: oh and i have a very nice MB.. asus vii hero
and tested both usb 2 and 3


----------



## HAGGARD

@rakzbr:

Well, what you can see is that those whole-millisecond outliers are only on the positive side of the Y axis, not on the negative. This means it is not your system, but dropped reports. Dropped reports can be a firmware problem, a tracking problem, and... dunno, maybe even a USB problem. Or Windows 10 shenanigans - I haven't played around with Win10 yet so I don't know much about that. Since that problem doesn't seem to show up in that 500Hz graph, you could first check whether the mice exhibit the same problem at 500Hz. Then tracking surface, lenses, swipe speed are the next things. Then BIOS, host controller drivers.

It's probably just the firmware. You must have the worst luck in picking mice: no idea about the DM1, but the Salmosa is Razer (who regularly have this exact problem in their reporting behaviour) and I seem to remember that the Rival has this problem too. Wait for your GPRO, that one definitely does not have it. Then you can confirm that the mice are the problem.
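The positive-side-only argument can be turned into a mechanical check: when the host polls late (OS jitter), the next interval comes up correspondingly short because the queued report is picked up immediately, so the pair averages back to the nominal rate; a genuinely dropped report leaves a long interval followed by a normal one. A sketch under those assumptions, with hypothetical interval data:

```python
def classify_outliers(intervals_ms, nominal_ms=1.0, threshold=0.5):
    """Label each long interval as host-side jitter or a likely drop.

    A long interval immediately followed by a short one means the host
    polled late and then caught up (jitter). If the following interval
    is normal, the report was never sent at all (a drop).
    """
    labels = []
    for i, dt in enumerate(intervals_ms):
        if dt - nominal_ms <= threshold:
            continue  # not an outlier
        nxt = intervals_ms[i + 1] if i + 1 < len(intervals_ms) else nominal_ms
        kind = "host jitter" if nxt < nominal_ms - threshold * 0.5 else "dropped report"
        labels.append((i, round(dt, 2), kind))
    return labels

# Hypothetical log: index 2 is a late poll (compensated by index 3),
# index 5 is a drop (the interval after it is normal)
log = [1.0, 1.0, 1.6, 0.4, 1.0, 2.0, 1.0]
print(classify_outliers(log))
```

On real logs the threshold would need tuning per polling rate, but the asymmetry HAGGARD describes is exactly what this distinguishes.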


----------



## NeoReaper

This is my SABRE at 500hz/2msec, reckon I should just leave it at that? (I mean my monitor is rated at 4ms GTG anyways)


----------



## rakzbr

@HAGGARD I'd appreciate it a lot if you helped me; I've been a Windows "tweaker" forever.. since 2002 when I started gaming I've always removed ****s and useless things
I just did a fresh Windows 7 install, installed the latest Nvidia driver + Intel ethernet 19.0c + .NET Framework ONLY
and these are the results

dm1pros + qckheavy

dm1pros + allsop xl


g100s + qck


g100s + allsop


i already did a lot of searching in my bios.. btw i updated my bios. i have never in my life installed RST/SATA/chipset drivers (for me anything besides the network driver = useless)
i'll try making more changes in my bios, and if i see a change i'll write here; please help me with advice


----------



## rakzbr

Sorry to make another reply, but it doesn't make sense to update the post above ^

That was fast: it's my BIOS settings, mostly the overclock.


Almost perfect, huh? That's after removing my CPU overclock.

There are still some variations (I did the test about 5 times: circle swipes, fast but not ultra-fast).

That's normal, right?

I'll improve things on my own now if you tell me the above is normal. Thanks for the thread.


----------



## HAGGARD

Quote:


> Originally Posted by *NeoReaper*
> 
> This is my SABRE at 500hz/2msec, reckon I should just leave it at that? (I mean my monitor is rated at 4ms GTG anyways)


Monitor GTG doesn't really play into it, but you can leave it at that, sure: that's 30µs variance at most, and stable - I'd say sub-100µs is a sufficiently "optimized" range for most everyone. The real problem strictly to do with polling precision is more the extreme kind of stuff you see with rakzbr anyway; everything else is general system optimization.

@rakzbr: Very curious indeed. Again, since the extreme offset only shows up on the above-average side of it, it's not input handling issues on your system (because in that case a consecutively polled report would register below average, correlatively). Well, for the DM1 graphs it could be, but then the timing of the consecutively-offset reports seems too consistent.
Quote:


> I'll improve things on my own now if you tell me the above is normal. Thanks for the thread.


Yes, that graph looks very healthy, to say the least. So you are saying the problems show up without the overclock? What exactly did you overclock - only the CPU, or also RAM, NB freq./HT link freq.?


----------



## rakzbr

I made some small, normal overclock changes in my BIOS,
mostly setting 4.6GHz at 1.3V. I tested it in some stress tools and everything was OK... I didn't waste days testing,
but I think the graph got better because of removing it.

Also, if I have Chrome or other stuff open, my graph goes crazy too.
I think my system is very sensitive - is that possible?

Also, my RAM is on X.M.P., 1.65V 2400MHz, but that's the same frequency and latency it's rated for.
100:100, BCLK 100, internal PLL overvoltage enabled (now disabled), C-states all OFF, power-saving features all off, everything useless off.

I have some ideas: there's onboard HD audio and AC97, so I'll swap and see which is better.
Also, I really need this overclock... I'll try overclocking again too. I really need 300fps in Overwatch instead of 200-250 XD


----------



## HAGGARD

Having stuff open obviously introduces more stress, and browsers are often surprisingly work-intensive... Depends on how crazy the graphs get, I suppose, but your graphs really are way too crazy as is. I still don't think what you have there is regular OS/CPU jitter. It's only offset in one direction (entirely untypical/unexpected behaviour for system-side jitter; most of the time that has to do with report behaviour, i.e. is mouse-side), and then for up to 8 milliseconds. That's pretty hefty - 8 milliseconds in CPU time is a literal eternity. I can't see what the system could be doing in that time that would keep it from handling a single USB event. Are you sure you are not seeing microfreezes or something in LatMon?

Dunno, either you have some seriously weird hardware component issue or... your program has issues or whatever. Since this is on a clean install, I assume you don't have any programs or third-party services running in the background and stuff? Maybe actually do try and get chipset drivers (host controllers, USB filter primarily). You can also disable your audio device(s) and check whether that causes those jumps. Same for network adapter. If you are on Windows 7 again, try TimerTool and setting 0.5ms and see whether that keeps your reports from hopping. Those 2-8ms jumps are not normal.


----------



## altf4

Is that good enough?


----------



## m0uz

Quote:


> Originally Posted by *NeoReaper*
> 
> This is my SABRE at 500hz/2msec, reckon I should just leave it at that? (I mean my monitor is rated at 4ms GTG anyways)


That's much better. I, personally, would stick with 500Hz due to it being stable


----------



## NeoReaper

Quote:


> Originally Posted by *m0uz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NeoReaper*
> 
> This is my SABRE at 500hz/2msec, reckon I should just leave it at that? (I mean my monitor is rated at 4ms GTG anyways)
> 
> 
> 
> 
> That's much better. I, personally, would stick with 500Hz due to it being stable
Click to expand...

Yeah, I did a few more runs of MouseTester to see if I could get any better, and I got it between 2.02 and 1.98 in half of the tests, so for the moment I will use 500Hz. I might bring this up on Corsair's forum at some point, as they are normally quite good at fixing firmware issues.


----------



## Th3Awak3n1ng

Someone (can't remember in which thread) said that the last Windows update did something with USB polling.

Windows 10 (160715-1616) w/o updates:

Spoiler: Warning: Spoiler!

Windows 10 (161004-2338) with latest updates:

Spoiler: Warning: Spoiler!

These two systems are configured almost identically.


----------



## deepor

That's the same result, isn't it? It's just confusing because of a different scale.


----------



## SweetLow

Quote:


> Originally Posted by *HAGGARD*
> 
> Duh. Give!


Quote:


> Originally Posted by *VolsAndJezuz*
> 
> Go on...


Ok. Let's go.
Suppose there are 3 different timers - A. LC oscillator, B. quartz, C. caesium - and an independent periodic process P.
We measure the process P with the help of our timers.
What do we get from the measurements?
All as expected: the large jitter is A, the middle one B, the little one C. All right? Not so fast, my friends.

Now, instead of the independent process P, take a process P2 that is correlated with (derived from) the timer of variant A.
What do we get from the measurements?
The big jitter is B, the intermediate one C, the small one A. Not what you expected, is it?
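This thought experiment can be simulated in a few lines. The sketch below uses invented jitter figures (per-tick Gaussian error on a 1 kHz process); the clock names and the measurement model are my own simplification, not SweetLow's. Since P2 fires exactly on timer A's ticks, reading it against A shows essentially zero jitter, while B and C expose A's instability - and B reads slightly worse than C because its own error is added on top, reproducing the counterintuitive ordering described above:

```python
import random

random.seed(42)

NOMINAL = 1000.0   # nominal period in microseconds (1 kHz)
N = 20000          # number of periods to simulate
SIGMA = {"A_lc": 20.0, "B_quartz": 5.0, "C_cesium": 0.01}  # per-tick jitter, us

def clock_ticks(sigma):
    """True times of a clock's ticks; each tick adds fresh Gaussian error."""
    t, out = 0.0, []
    for _ in range(N):
        t += NOMINAL + random.gauss(0.0, sigma)
        out.append(t)
    return out

ticks = {name: clock_ticks(s) for name, s in SIGMA.items()}
p2 = ticks["A_lc"]  # process P2 fires exactly on timer A's ticks

def measured_jitter(process, ref):
    """Std dev of the process period as read off the reference clock.

    The reference clock displays (i+1)*NOMINAL at its own tick i, so its
    reading of a true time t is roughly t minus its accumulated error
    (index-aligned here, which is fine when the clocks stay close)."""
    err = [t - (i + 1) * NOMINAL for i, t in enumerate(ref)]
    readings = [p - e for p, e in zip(process, err)]
    periods = [b - a for a, b in zip(readings, readings[1:])]
    mean = sum(periods) / len(periods)
    return (sum((x - mean) ** 2 for x in periods) / len(periods)) ** 0.5

# A reads ~0 jitter for its own derived process; C reads ~A's true jitter;
# B reads slightly more than C (A's jitter plus its own).
for name in SIGMA:
    print(name, "reads", round(measured_jitter(p2, ticks[name]), 2), "us")
```

The relevance to this thread: a mouse whose report timing is derived from the same clock chain that timestamps the reports can look cleaner than it really is.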


----------



## Th3Awak3n1ng

Quote:


> Originally Posted by *deepor*
> 
> That's the same result, isn't it? It's just confusing because of a different scale.


Dunno, you tell me.









This isn't the first time I've heard that USB polling was "broken" after a Windows update, so I decided to test it myself. I personally don't see a big difference here.


----------



## SweetLow

Quote:


> Originally Posted by *rakzbr*
> 
> I tried to set it 500hz and overclock via sweetlow driver, but im on windows10 and i cant do it anymore


What is the problem with Windows 10???


----------



## rakzbr

@HAGGARD I'm lost

Upgraded drivers: Intel_USB3_Win7_V30141, Intel_Chipset_Win7-8-81-10_VER10117.
Tested with TimerTool
Clean Windows 7
No overclock or X.M.P.
Tried many BIOS settings
HPET on and off, USB 2 and 3
Tried other USB ports

My system is good... Asus VII Hero motherboard... I really don't understand why I get those spikes


My DPC latency is fine - I've liked keeping DPC great since r0ach (lul)

@SweetLow
I tried the overclock yesterday on Win10 and got "filter driver is probably useless for this device", and it didn't work, so I'm not using it on Windows 10 anymore... I was trying to overclock a badly developed mouse by BST (the Ninox Aurora)

Edit: I tested unparking cores too, and I got a spike in DPC latency for the first time... lol, that never happened before either. I might change C-states and all the power settings in the BIOS for testing now


----------



## notzi

Quote:


> Originally Posted by *Th3Awak3n1ng*
> 
> Someone (can't remember in which thread) said that the last Windows update did something with USB polling.
> 
> Windows 10 (160715-1616) w/o updates:
> 
> Spoiler: Warning: Spoiler!
> 
> Windows 10 (161004-2338) with latest updates:
> 
> Spoiler: Warning: Spoiler!
> 
> These two systems are configured almost identically.


I made a thread about it, but there's no fix that I know of. I've "given up" myself, because self-forced OCD isn't healthy for me.


----------



## rakzbr

Damn, I hate being like r0ach.
I just did graphs with a fresh Win10 install, and with a super-ultra-tweaked, everything-removed Windows 10.
It's clearly nothing in the OS, but I'm sad I couldn't even improve the graph.

Btw, I learned my DPC latency can easily get into the 1-20 range with tweaks on Win10.
I'll test Windows 10 LTSB from an ISO from gen2, since I just installed Home.

Anyway, the graphs are below, just as an update... the same thing happens no matter what I do. Oh, and I also changed everything in the BIOS:
changed the OC, removed the OC, let the BIOS do the OC, OC'd it myself, changed settings everywhere - and I normally know how to do this stuff.
fresh win10

tweaked win10

r0ach`ed win10


Now I plan to install this LTSB and older drivers... mostly old Nvidia drivers. I've already changed everything everywhere,
so I'll have to take this bad graph with me.


----------



## pstN

If you're using W10, I would stick to 1511 at least until 1607 can be considered stable, which still isn't the case.

LTSB 2016 is 1607, and the 2015 version is the initial W10 build, which is unstable as well. So if your goal is USB polling + Windows 10, your only option is one of the 1511 builds, which LTSB isn't a part of.

It might be possible to convert 2015 LTSB to 1511, but I don't think it's worth the trouble over stripping a 1511 ISO.


----------



## rakzbr

Thanks for the advice, but I wanted to test the LTSB torrent I found.
To my surprise...

g100s circle movement

g100s fast swipe movement zigzag

g100s crazy random movements


@HAGGARD am I safe now? I think it was driver influence? Because on this version of Windows I didn't have to install an ethernet driver, USB, nothing.


----------



## Melan

Redstone LTSB is pretty stable tbh. Works well on my system and polling is fine.


----------



## pstN

Quote:


> Originally Posted by *Melan*
> 
> Redstone LTSB is pretty stable tbh. Works well on my system and polling is fine.


did you compare with 1511?


----------



## Melan

I didn't bother with non-enterprise editions. Pointless to clean Home/Pro if I can just get a clean version of Windows from the start.

Non-LTSB Enterprise did get the 1511 update, but it was also stable for me.


----------



## Th3Awak3n1ng

Quote:


> Originally Posted by *pstN*
> 
> If you're using W10, I would stick to 1511 at least until 1607 can be considered stable, which still isn't the case.
> 
> LTSB 2016 is 1607 and the 2015 version is the initial W10 build which is unstable as well so if your goal is usb polling + windows 10, your only option is one of the 1511 builds, which LTSB isn't a part of..


I use Windows 10 LTSB 1607 and I think it's fine. Or are my results bad?


----------



## PLUSPUNKT

Are these the correct settings?


----------



## Neilthran

Hi guys, I'm new to all this. Just got a Tt Azurues Mini for like 12 bucks. I did the graph thing but had to cut the first and last dots from the graph because they messed with the scale (update intervals in the 50ms range, but the middle was fine). Is this OK? Where can I learn to interpret the graphs better?

(I'm using Windows 10 LTSB)

first try


second try


Quote:


> Originally Posted by *PLUSPUNKT*
> 
> Are these the correct settings?


I think you should exclude the first points from the graph; the scale of the graph is too large to get an idea.


----------



## VolsAndJezuz

Quote:


> Originally Posted by *Neilthran*
> 
> where can i learn more to interpret better the graphs?


It's just looking at the spread and frequency of variations from the polling interval. So using your second graph, since the data was better, your worst deviations are ~1.86ms and ~2.14ms, or ~140μs off. Your main group of deviation is more like a 1.96-2.04ms range, or 40μs off.

Now I'll compare that to my data, not to pat myself on the back too hard.



So in my graph, the worst deviations are just past 0.997ms and 1.003ms, or 3μs off. My main group of deviation is within 0.999-1.001ms, or 1μs off.

Keep in mind these are only 5-10 second clips of data. I, for instance, will certainly have higher deviations than in my graph at some point. But it is a fairly representative 5-10 second clip. So when you are collecting data for analysis, do several runs to get a feel for what your typical deviations are. While you don't want to completely cherry-pick 1.5 seconds of amazing-looking data, you also don't want the rare bigger deviations that make the scale of the rest of the data harder to interpret, like those little buggers around 4600ms on your first graph. You can see in your first graph how those outliers squash the data in the 1.9-2.1ms range, whereas in the second graph you can see that the great majority of the data is actually in more like the 1.94-2.04ms range.
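The eyeball analysis above can be made mechanical. A small sketch (the `deviation_summary` helper and the sample series are my own inventions, merely shaped like the 500Hz graph under discussion) that reports the worst deviation plus a percentile-based stand-in for the visually dominant "main group":

```python
def deviation_summary(intervals_ms, nominal_ms):
    """Worst absolute deviation from the nominal polling interval, plus the
    80th-percentile deviation as a rough stand-in for the 'main group'
    of points one would pick out visually on a MouseTester-style plot."""
    devs = sorted(abs(x - nominal_ms) for x in intervals_ms)
    worst = devs[-1]
    bulk = devs[int(0.8 * (len(devs) - 1))]
    return worst, bulk

# Made-up 500 Hz (2 ms) sample shaped like the second graph discussed above:
# a tight cluster near 2 ms with a pair of ~140 us outliers.
samples = [2.00, 1.99, 2.01, 1.97, 2.03, 2.00, 1.86, 2.14, 2.02, 1.98]
worst, bulk = deviation_summary(samples, 2.0)
print(f"worst ~{worst * 1000:.0f}us off, main group ~{bulk * 1000:.0f}us off")
# -> worst ~140us off, main group ~30us off
```

Using a percentile instead of the raw min/max keeps the rare big outliers from dominating the summary, mirroring the advice to not let them squash the scale.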


----------



## rakzbr

But Vols, what did you do to optimize your system that much?
I have everything tweaked... everything minimal and clean, still horrible, and my hardware is good too


----------



## twohands

The number 1 pro in Quake, rapha, likes to use 125Hz because he says it's more stable. Could you test this too?


----------



## uaokkkkkkkk

Man, that would get old fast. The cursor literally chugs slowly across the screen at 125Hz.


----------



## PurpleChef

Quote:


> Originally Posted by *twohands*
> 
> The number 1 pro in Quake, rapha, likes to use 125Hz because he says it's more stable. Could you test this too?


Interesting, but 125, hmm... source?

why 125hz and not 120 or 144?


----------



## Th3Awak3n1ng

Because he's talking about USB polling, not display refresh rate.


----------



## PurpleChef

Quote:


> Originally Posted by *uaokkkkkkkk*
> 
> Man, that would get old fast. The cursor literally chugs slowly across the screen at 125Hz.


Quote:


> Originally Posted by *Th3Awak3n1ng*
> 
> Because he's talking about USB polling, not display refresh rate.


Of course. But he might as well do 125/125. But yeah, thanks for the unnecessary comment


----------



## Th3Awak3n1ng

Don't ask stupid questions then.


----------



## Alya

Quote:


> Originally Posted by *PurpleChef*
> 
> Of course. But he might as well do 125/125. But yeah, thanks for the unnecessary comment


Why would he do 125/125? The comment wasn't unnecessary. If your mouse is polling at the same timing as your screen is refreshing, there's no assurance that the poll will occur at the same time as the screen refresh, and this will make the screen tear.


----------



## HAGGARD

Quote:


> Originally Posted by *rakzbr*
> 
> But Vols, what u did to optimize that much of your system?
> I have everything tweaked.. everything minimal and clean, still horrible, my hardware are good too


Your results here were quite good:
Quote:


> Originally Posted by *rakzbr*


Quote:


> Originally Posted by *rakzbr*
> 
> @HAGGARD am I safe now? I think it was driver influence? Because on this version of Windows I didn't have to install an ethernet driver, USB, nothing.


Yes, those are normal measurements. Could have been the drivers, but I still can't really explain those jumps. I have never seen that behaviour on any system; "one-directional" offset should be a report problem, not an input addressing/processing one. Maybe if you had DPC freezes, but 6-8ms freezes would definitely be noticeable.

As for further optimization... You are on Win10 again, so that's not my territory.


----------



## TranquilTempest

Quote:


> Originally Posted by *Alya*
> 
> Why would he do 125/125? The comment wasn't unnecessary, if your mouse is polling at the same timing as your screen is refreshing then there's no assurance that the poll will occur at the same time as the screen will refresh, this will make the screen tear.


Nope. Framerate being unsynchronized with refresh rate is what causes tearing, and framerate doesn't depend on mouse polling rate. It might cause some stutter of animation that depends on mouse input, but not tearing.


----------



## Alya

Quote:


> Originally Posted by *TranquilTempest*
> 
> Nope. Framerate being unsynchronized with refresh rate is what causes tearing, and framerate doesn't depend on mouse polling rate. It might cause some stutter of animation that depends on mouse input, but not tearing.


You're right, it doesn't tear - that's not what I meant. I meant stuttering, but I couldn't think of that word when I made that post for some reason. I know frame rate is independent of polling rate.

Tearing occurs when the back buffer is swapped with the front buffer before the front buffer has been read/scanned fully, obviously this is a super simplistic and not at all technical explanation of it, but you get it.


----------



## PurpleChef

Quote:


> Originally Posted by *Th3Awak3n1ng*
> 
> Don't ask stupid questions then.


How is a 125Hz screen stupid, if that was the case? If the Quake player preferred a 125Hz mouse, he might as well run a 125Hz screen, what do I know.
Salty. Dry them tears


----------



## Demi9OD

With CRT monitors at least, synced mouse and screen refresh was godly. I used to run CS 1.1-1.6 with a 125Hz mouse and a 125Hz CRT, and it was incredibly smooth compared to any desynced refresh rate.


----------



## HAGGARD

Quote:


> Originally Posted by *Demi9OD*
> 
> With CRT monitors at least synced mouse and screen refresh was godly. I used to run CS 1.1-1.6 with 125hz mouse and 125hz CRT and it was incredibly smooth compared to any desynced refresh rate.


It's not "synced" though. Your inputs don't arrive in synchronization with your frames, and your frames not in synchronization with your monitor's refresh. 125Hz polling rate is just choppier and laggier, that's it.

Doesn't mean it is unplayable. At all. In the glory days of the MLT04, a lot of players used 125Hz from what I know.


----------



## Alya

Quote:


> Originally Posted by *HAGGARD*
> 
> It's not "synced" though. Your inputs don't arrive in synchronization with your frames, and your frames not in synchronization with your monitor's refresh. 125Hz polling rate is just choppier and laggier, that's it.
> 
> Doesn't mean it is unplayable. At all. In the glory days of the MLT04, a lot of players used 125Hz from what I know.


Honestly, if you're going through all the work of making your polling precision better, why on Earth would you use 125Hz anyway? It feels horrible. I plugged in my FK, which I had set to 125Hz back when I was playing CrossFire, and I instantly thought to myself, "This thing feels horrid" - and then, to my surprise, it turned out that was because it was set to 125Hz. Set it to 500Hz and it felt fine again...


----------



## Demi9OD

It was horrid, but back when the only options were 125hz USB or 200hz overclocked PS/2, you had to make the most of what was available if you wanted to use a USB mouse.


----------



## HAGGARD

Quote:


> Originally Posted by *Alya*
> 
> Honestly if you're going through all the work to make your polling precision better, why on Earth would you use 125Hz anyway? It feels horrible, I plugged in my FK which I had set to 125Hz when I was playing CrossFire and I instantly thought to myself "This thing feels horrid." and then to my surprise, it was because it was set to 125Hz. Set it to 500Hz and it felt fine again...


Yeah, the benefits of going beyond 125Hz are vastly more appreciable than the benefits gained in improving polling precision (well, purely in terms of input that is; the general system performance improvements that correlate with that are obviously something very beneficial too).
Quote:


> Originally Posted by *Demi9OD*
> 
> It was horrid, but back when the only options were 125hz USB or 200hz overclocked PS/2, you had to make the most of what was available if you wanted to use a USB mouse.


Also a LAN thing: the setting-up of the "overclock", using testmode and stuff - not really feasible on most LANs. Easier nowadays with the signed filter driver, but there's only a handful of people left playing the MLT04 professionally.

But back then, if you'd wanted, you could use input interpolation along the lines of m_filter (ew!).


----------



## HAGGARD

Did you already try getting your mouse hosted on another host controller? Look in your Device Manager to see which host controller it is currently registered to, disable that one, and reconnect the mouse.


----------



## Th3Awak3n1ng

Quote:


> Originally Posted by *HAGGARD*
> 
> Did you try getting your mouse hosted on another host controller already? Look in your device manager which host controller it is currently registered to, disable that one, reconnect mouse.


Doesn't work for me in Windows 8.1/10. My mouse is hosted on the USB 3.0 xHCI controller and I want it hosted on the EHCI controller. I disabled the USB 3.0 xHCI controller and reconnected the mouse, but nothing happened... The mouse just doesn't work, like it's not even connected.


----------



## HAGGARD

Have you tried a different port?


----------



## Th3Awak3n1ng

Sure. I also tried turning on USB EHCI hand-off in the BIOS -- no result.

It's like Windows 10 just doesn't "understand" that the EHCI USB 2.0 host should be used, and it still continues to use the USB 3.0 xHCI even though it's disabled...


----------



## Gonzalez07

Quote:


> Originally Posted by *Th3Awak3n1ng*
> 
> Sure. I also tried to turn on USB EHCI hand-off in BIOS -- no result.
> 
> Like Windows 10 just doesn't "understand" that EHCI USB 2.0 host should be used. And still continue use USB 3.0 xHCI although it's disabled...


Skylake system? I thought they discarded EHCI on it.


----------



## Melan

I thought Intel discarded VGA on Skylake too... until I found a mobo which has a VGA port on it and uses CPU graphics. "Discarded" doesn't seem to mean what it used to.


----------



## Th3Awak3n1ng

Quote:


> Originally Posted by *Gonzalez07*
> 
> skylake system? thought they discarded ehci on it


Haswell.


----------



## James N

Quote:


> Originally Posted by *Th3Awak3n1ng*
> 
> Doesn't work for me in Windows 8.1/10. My mouse is hosted on USB 3.0 xHCI and I want it to be hosted on eHCI controller. I disabled USB 3.0 xHCI controller, reconnected mouse, but nothing happened... Mouse just doesn't work, like it's not even connected.


Same issue on Windows 10 with a Gigabyte mobo and a 3770K, and I made the same tweaks to the BIOS. Although my results are OK, it still annoys me that Windows handles it like that.

Maybe it has something to do with the new mobos coming with USB 3.0 ports only (no native USB 2.0 ports), and Windows 10 doesn't like that or something.

On Windows 7 I had no issues with it.

But I don't get the spikes your system has; mine looks like this now, despite the host controller doing its own thing.



So maybe it is some driver getting in the way?


----------



## Th3Awak3n1ng

Windows 7 uses the EHCI controller by default, so I got this:


Spoiler: Warning: Spoiler!

Spoiler: Warning: Spoiler!


----------



## James N

Quote:


> Originally Posted by *Th3Awak3n1ng*
> 
> Windows 7 uses EHCI controller by default, so I got this:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> Spoiler: Warning: Spoiler!


My results on win 7 are also much better than on win 10.

I might just go back to Windows 7 till i absolutely have to switch over. FeelsMicrosoftman


----------



## Elrick

Quote:


> Originally Posted by *James N*
> 
> My results on win 7 are also much better than on win 10.
> 
> I might just go back to Windows 7 till i absolutely have to switch over. FeelsMicrosoftman


Everything is far BETTER within 7 when it comes to Gaming; the hardcore gamers all eventually succumb to this simple reality.


----------



## Th3Awak3n1ng

Quote:


> Originally Posted by *Elrick*
> 
> Everything is far BETTER within 7 when it comes to Gaming


Quote:


> Originally Posted by *James N*
> 
> I might just go back to Windows 7 till i absolutely have to switch over. FeelsMicrosoftman


Yea, that's exactly what I did recently.
Although I still use Windows 10 for work, but for game I use only Windows 7.


----------



## c0dy

Quote:


> Originally Posted by *Elrick*
> 
> the hardcore gamers all eventually succumb to this simple reality



Especially when it comes to DX12.

Eventually everyone will switch over to Windows 10. It'll be the same as with "but muh XP is so superior, I'll never switch to 7".


----------



## Th3Awak3n1ng

I personally was glad to switch from XP to 7. Also I like Windows 10.

But Windows 7 is muuuuuuch better for CS:GO on my system.

And I think you chose the wrong topic to talk about DX12 _singleplayer_ games.


----------



## c0dy

Comparing your 10 and 7 plots (your HPET test with both on, since that's the default afaik), leaving out the single extreme points, which could be caused by anything:

Windows 7
Worst: 1.006ms
Best: 0.994ms

Windows 10
Worst: 1.014ms
Best: 0.986ms

I don't see where the "muuuuuuch better" comes from. Could be that the variance is different, but that's not easy to compare since the scaling/range is different. At least for me.
Also, a 0.008ms difference... this is some roach-level stuff right there. If you believe this makes a difference while gaming, that's placebo at its best.

DX12 multiplayer will be a thing, believe it or not. Also I'm sure the "Vulkan" point will be mentioned as well. Might be, but you'll still have to hope that developers use it over DX12.
And in the end people will switch, unless you want to stick to old hardware/software.


----------



## altf4

Quote:


> Originally Posted by *c0dy*
> 
> Comparing your 10 (your HPET test with both on since that's the default afaik) and 7 plots, left out the single extreme points which could be caused by anything.
> 
> Windows 7
> Worst: 1.006ms
> Best: 0.994ms
> 
> Windows 10
> Worst: 1.014ms
> Best: 0.986ms
> 
> I don't see where the "muuuuuuch better" comes from. Could be that the variance is different but that's not easy to compare since the scaling/range is different. At least for me.
> Also, a 0.008ms difference... this is some roach-level stuff right there. If you believe this makes a difference while gaming, that's placebo at its best.
> 
> DX12 multiplayer will be a thing, believe it or not. Also I'm sure the "Vulkan" point will be mentioned aswell. Might be, still you'll have to hope that developers use it over DX12.
> And in the end people will switch. Unless you want to stick to old hardware/software.


+1, I don't really feel a difference between Windows 7 and 10; idk why some people here say it's much better.


----------



## VolsAndJezuz

Using the range: what a truly awful way to analyze that data.

It would seem that you are trying to find a biased way to make a point. If you want an actually worthwhile eyeball analysis...

With his Windows 10 data, the 'meat' of the data (~+/-2 standard deviations) is ~+/-10us, whereas the 'meat' of the Windows 7 data is ~+/-2us. So the polling precision is around 5x worse in his Windows 10 data. That's in line with most Windows 10 data I've seen: well-optimized Windows 10 builds seem to run right around that 5x worse in my experience with others' data, oftentimes closer to 10x for middling W10 optimizations.
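The ±2 standard deviations "meat" comparison can be sketched numerically. The `meat_us` helper is my own, and the two sample series are invented, merely shaped like the tight-W7 / looser-W10 numbers under discussion:

```python
import statistics

def meat_us(intervals_ms):
    """~95% spread of polling intervals: 2x the population std dev,
    converted to microseconds. Unlike the min/max range, a single
    outlier barely moves this figure."""
    return 2 * statistics.pstdev(intervals_ms) * 1000

# Invented 1000 Hz samples shaped like the discussion above.
win7 = [1.000, 1.001, 0.999, 1.002, 0.998, 1.000]
win10 = [1.000, 1.008, 0.992, 1.010, 0.990, 1.000]
print(f"W7 meat ~±{meat_us(win7):.1f}us, W10 meat ~±{meat_us(win10):.1f}us")
# -> W7 meat ~±2.6us, W10 meat ~±14.8us (roughly the 5x gap described)
```

The design point is the one made above: the full range is dominated by whichever single worst sample you happened to catch, while a standard-deviation-based spread describes where the bulk of the reports actually land.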


----------



## c0dy

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> Using the range: what a truly awful way to analyze that data.
> 
> It would seem that you are trying to find a biased way to make a point. If you want an actually worthwhile eyeball analysis...
> 
> With his Windows 10 data, the 'meat' of the data (~+/-2 standard deviations) is ~+/-10us. Whereas the 'meat' of the Windows 7 data is ~+/-2us. So the polling precision is around ~5x worse in his Windows 10 data. That fits in line with most Windows 10 data I've seen. Well optimized Windows 10 builds seem to run right around that 5x worse from my experience with others' data, often times closer to 10x for middling W10 optimizations.


Not trying to find anything lol. That was just what I thought would be the point that makes one look worse than the other. You know more about that than I do.

Still you're talking about µs...


----------



## VolsAndJezuz

And I'm not trying to talk about what is perceptible and what isn't. The fact remains that MouseTester data for polling jitter is typically 5-10x worse on Windows 10.

The real question is whether the discrepancy actually exists outside of MouseTester data or if it is just a result of MouseTester/.NET Framework not running as well on W10.


----------



## Th3Awak3n1ng

Quote:


> Originally Posted by *c0dy*
> 
> I don't see where the "muuuuuuch better" comes from.


Of course you don't see. To see the difference you must come to my home and try playing CS:GO on both Windows 10 and 7. Only then will you "see".

Quote:


> Originally Posted by *altf4*
> 
> +1, I don't really feel a difference between Windows 7 and 10; idk why some people here say it's much better.


Because even if you don't feel the difference on *your* PC, that doesn't mean there is no difference on *someone else's* PC.

Quote:


> Originally Posted by *c0dy*
> 
> Also 0.008ms difference... this is some roach-level stuff right there. If you believe this makes a difference while gaming, that's placebo at it's best.


I don't need to believe; for me it's easy to compare, since I have two OSes installed and configured the same way. So I play CS:GO on Windows 7, then reboot (it takes ~15 secs) and play CS:GO on Windows 10. Believe me or not, but I feel a dramatic difference between mouse movement on Windows 7 and 10.

Quote:


> Originally Posted by *c0dy*
> 
> DX12 multiplayer will be a thing


Yeah, maybe in 5 years or so.

Quote:


> Originally Posted by *c0dy*
> 
> Unless you want to stick to old hardware/software.


I'll probably stick to my hardware (Intel Core i7 4770, 16GB RAM, Radeon R7 270X) at least for the next 3-4 years, because it's still enough for my tasks, so I don't see any problems here.

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> It would seem that you are trying to find a biased way to make a point.


I use Windows 7 only to play CS:GO. For the rest I use Windows 10 and I like it. So I don't understand what he wants to prove.


----------



## c0dy

Quote:


> Of course you don't see. To see the difference you must come to my home and try playing CS:GO on both Windows 10 and 7. Only then will you "see".


I've played CS:GO on 7, 8, 8.1 and 10. No problems whatsoever.

According to Vols the variance is in the µs range. Under load it would MAYBE reach the very low ms range (I'd assume that's what the result would be).
So no. Unless you're another r0ach who can feel µs/ms of delay/variance and experience it as a "clown/swamp cursor", I highly doubt it's "muuuuuuch better" because of anything related to polling.

Must be hard for you if you're as sensitive as r0ach and are even affected by ping differences of 5ms - which is more of a difference than the Win7 vs 10 polling difference would be for you under load.

~ 1:04
https://youtu.be/4bP5Zr2KIzo


----------



## VolsAndJezuz

You are making bad assumptions.

1) Jitter, aka variance in input processing time, is a completely different phenomenon from input lag, so you can't go trying to relate input lag perception magnitude to polling jitter magnitude. Kinda like with internet latency, how you probably wouldn't immediately notice the difference between 30ms and 50ms ping in a game with well-implemented lag compensation and prediction. But if you have just 4-5ms of network jitter, the game becomes borderline unplayable (unless the game has interpolation you can jack up to fight it, at the cost of adding artificial latency).

2) Even if the differences in polling precision between W7 and W10 were somehow proven to be imperceptible to humans, polling precision is a good general indicator of how consistently your system can handle the most time-critical processing at baseline. This is assuming the W10 data doesn't turn out to be artificially noisy, like I said in my last post. So if you're getting a lot of variance at baseline with USB polling, you can damn sure bet that you're going to be getting variance in frametime and other processing under heavy load. When I first got a 144Hz monitor, my USB polling precision wasn't great and CS:GO seemed to microstutter a lot on me, presumably from frame processing jitter. Through the long process of optimizing and tweaking, my polling precision improved dramatically and CS:GO is smooth like butter.

3) You seem to me to just be throwing out numbers arbitrarily in regards to variance. Variance in the low ms range, when almost everyone is using 1ms polling interval? So we're getting negative polling intervals sometimes under load? Doesn't inspire a lot of confidence in me that you're an expert on this.
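The arithmetic behind that objection, as a back-of-the-envelope sketch (the numbers are hypothetical, chosen only to illustrate the point): if intervals average 1 ms at 1000 Hz, a spread "in the low ms range" would require negative time between consecutive polls.

```python
# Hypothetical numbers for the sake of the argument, not measurements.
nominal_ms = 1.0     # average interval at 1000 Hz polling
claimed_sd_ms = 2.0  # a spread "in the low ms range"

# One standard deviation below the mean would already be a negative
# time between consecutive polls -- physically impossible. So any real
# spread at 1 kHz has to be far smaller than the claim on the low side.
one_sd_low = nominal_ms - claimed_sd_ms
assert one_sd_low < 0  # the claimed spread contradicts a 1 ms interval
```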

The interesting, recurring phenomenon around here is that some people seem offended by discussions like polling precision, to the point that they seem obliged to repeatedly take the time to remind us of how much we are wasting our time. What I think is going on in your case, for example, is that you have basically committed yourself to W10, at least to the point where it would take an unacceptable amount of time for you to revert back to W7. So you want your W10 setup to be as good or better than any other available options. I think it's a confirmation bias related phenomenon.


----------



## c0dy

1) The ping was probably a bad analogy for this. It was meant more like: take a game as it is now and try to differentiate between 25 and 30ms ping; likewise, take the worst polling on 7 as "100% stable" vs. 10. That overall difference. Again, it was probably not clear.

2) Yeah, these graphs might be an indicator. In the real world it'd still be similar to the "phenomenon" of the old Zowie click latency, for example. If you don't know about it, it doesn't affect you and you wouldn't bother at all. If you DO know about it, any enthusiast (like most people here) would be worried/paranoid about it. Same with this W7 vs. W10 polling discussion. The difference is so marginal that it shouldn't affect you in any way. Not in the real world, at least.

3) First of all, I never said that I'm an expert lol. Where did you get that from... It's pretty obvious that the plots you create when your system is idling won't be the same once you put load on the system. So I basically meant that the idle results/plots don't really matter when you're playing. The USB reports will be "delayed" or whatever you want to call it. That's why I wrote *I'd assume*...

I previously already said that you know more about that than me.

For your last point - no. I do have backups of all my fully installed Windows versions: 7, 8, 8.1 and 10. It wouldn't be a problem for me to switch back to any of them.
I'm not biased at all lol. Looks to me like YOU are the one throwing out points arbitrarily. It's as simple as this: I've used every single version of Windows without any issues while playing games. No input lag, no stuttering or anything else.

So either I'm just so extremely insensitive to these variations/click latencies etc. that I don't notice them, or my system has been working perfectly fine for me for a few hardware generations.









Oh or I'm just that insane at playing games that I can deal with it no matter what


----------



## Th3Awak3n1ng

Quote:


> Originally Posted by *c0dy*
> 
> I've played cs go on 7, 8, 8.1 and 10. No problems whatsoever.


I didn't say I have any problems with CS:GO on Windows 10.

Quote:


> Originally Posted by *c0dy*
> 
> According to Vols the variance is in the µs range. Under load it would MAYBE reach the very low ms range (I'd assume that's what the result would be).
> So no. Unless you're another r0ach that can feel µs/ms of delays/variance and feel them like "clown/swamp cursor" I highly doubt it's "muuuuuuch better" because of anything related to the polling.
> 
> Must be hard for you if you seem to be as sensitive as r0ach and will even be affected by ping differences of 5ms. Which is even more of a difference than the difference of Win7 vs 10 polling for you under load.


Sorry, but I just have to laugh at how confidently you talk about the difference between Windows 7 and 10 on *my PC* when you've never even touched it.


----------



## VolsAndJezuz

Quote:


> Originally Posted by *c0dy*
> 
> For your last point - no. I do have backups of all my fully installed Windows versions. 7, 8, 8.1 and 10. Wouldn't be a problem for me to switch back to any of these versions.
> I'm not biased at all lol. Looks to me like YOU are throwing out arbitrarily points. It's as simple as I've used every single version of Windows without any issues while playing games. No input lag, no stuttering or anything else.
> 
> So either I'm just extremely insensitive to these variations/click latencies etc that I don't realize it, or my system has been working perfectly fine for me for a few hardware-generations.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh or I'm just that insane at playing games that I can deal with it no matter what


I am not throwing out points arbitrarily. I am just trying to evaluate presented data and optimizations in an effort to sort the wheat from the chaff. As for W7 versus W10 versus Wx, until compelled otherwise, I tend to side with the data that shows a clear pattern of being undoubtedly better. Then people like you come in and say how utterly wrong I am based solely on your subjective experience and your opinions. The difference between us is that I'm entirely open to being persuaded by reasonable _evidence_, while you are dismissive of anything contrary to your personal belief.

If you feel like your gaming experience has been equivalent across all those Windows versions, they were either all equally unoptimized or you are indeed just extremely insensitive to such things as these. I can reliably pass sub-15ms input lag A/B tests and detect degradation in gameplay fluidity for Source Engine games when they drop below ~300 FPS, and my subjective experience is that optimizing systems against latency and jitter makes a profound difference in my gameplay experience.

My point is basically this: I don't know why you're in this thread if you are self-admittedly aloof of what is being discussed herein.


----------



## c0dy

Unless I missed something where I basically said that you're "utterly wrong", stop reading between the lines or reading things into my posts.
I've basically just stated my opinion, which essentially boils down to this: such low values affecting anyone's gaming performance is more placebo than anything else.

I'm not even denying your facts with variances or any data you provided.

Also your "point".. Oh boy... Have a good one.


----------



## Th3Awak3n1ng

Of course everything is placebo if *you* can't see/feel the difference. Doesn't matter what many other people say. LUL.

Quote:


> Originally Posted by *VolsAndJezuz*
> 
> I don't know why you're in this thread if you are self-admittedly aloof of what is being discussed herein.


He's probably a troll...


----------



## c0dy

Quote:


> *LUL*


Quote:


> He's probably troll...


Just stop














Get back to twitch chat with that


----------



## Th3Awak3n1ng

Better tell me: do you feel any difference between a mouse hosted on an xHCI (Windows 10 default) vs. an EHCI (Windows 7 default) controller?


----------



## VolsAndJezuz

I'm disengaging and you should as well, because at this point we are just gratuitously diluting the thread and burying its useful content.


----------



## NeoReaper

Corsair fixed the problem with the sabre I was having, firmware update popped today and bam:


----------



## Melan

Went ahead and finally installed my barebones W10 LTSB. It has nothing in it by default: no browsers (IE or Edge), no Defender, Cortana, Store and whatever else. Strangely enough you can't remove Xbox Game DVR, but you can disable it in the registry. What's installed are my usual programs, with a lot of services taken down/set to manual. Only the Nvidia driver, no GFE.
Polling is kinda disappointing so far for a Z77 system. All done with a wired G403.

Desktop with minimum processor state 5% and the default timer resolution of 1.001ms








Desktop with minimum processor state 100% and timer resolution of 0.500ms








BF4








QL








CSGO









Better than before but still meh.

Edit: Extra images.
Seems the old issues with 4kHz polling are still there.











Now at least it's not the permanent 4-12kHz polling problem like with the first release, but such hiccups are still present. They can vary from 1300 up to 4000Hz. This happens only with minimum processor state 5%; I haven't seen it at 100% yet.


----------



## PurpleChef

How do you get the Update Time (ms) graph so zoomed in?

Clicked "Log Start" -> started CS:GO -> joined an aim map server -> played for 1-2 mins -> quit -> "Log Stop".

Is this good or bad? CS:GO looks hella wild









Razer DeathAdder Chroma, Corsair Strafe keyboard, external HDD on USB 2.0 ports. Mouse on its own IRQ. All USB 3.0 disabled.
HPET on in BIOS.

Did all the Win10 tweaks I could find.


----------



## Melan

Change the data point start and end values. The CS:GO plot shows xCounts, not update time.

Also, you don't have to set the timer in W10 for games. It sets itself to the maximum value automatically. In fact, the system runs on 1.001ms all the time from what I've seen; games are 0.500ms.


----------



## PurpleChef

Win7 CS:GO


----------



## Melan

This is xcount vs time. You need interval vs time.


----------



## NeoReaper

Another mouse firmware update/CUE update later, on 1000Hz...


----------



## Watery Chemical

FK2 red logo, 5% minimum processor state. Is this really bad :/ ?


----------



## Melan

What OS? What hardware?


----------



## Watery Chemical

Win 7 64bit
AMD A10 5800k - Overclocked to 4.5Ghz from 3.8
GTX 750ti
Cheap MSI mobo
8GB RAM
SSD + HDD


----------



## Melan

How about minimum processor state 100%? Is it as bad as on 5%?


----------



## Watery Chemical

Processor at 100% with a stress test?


----------



## Melan

No. Go to power options > plan settings > change advanced power settings > processor power management > set minimum processor state 100%.


----------



## Watery Chemical

Sorry can't find it.


----------



## Watery Chemical

Also, is this the reason why my sensitivity goes down when I go into a smoke in CS:GO or watch Twitch on another monitor and my GPU usage goes to 100%?


----------



## Melan

Weird. High performance should already have CPU power at 100% though.


----------



## Watery Chemical

beat you to posting... unlucky


----------



## Melan

Do you have raw input on?

I had polling mess with sensitivity on early versions of Windows 10, but it actually made it faster.


----------



## Bucake

how do i block avatars of SPECIFIC PEOPLE


----------



## Alya

Quote:


> Originally Posted by *Bucake*
> 
> how do i block avatars of SPECIFIC PEOPLE


Why isn't there a way to mirror mp4s yet, or rather, why can't we use the video button for it? Anyway.


----------



## Bucake

operation disable identity_melan successful


----------



## Melan

Damn it adblock.


----------



## RevanCorana

Does anyone have experience mousetesting on a Linux system? Is it better/worse?


----------



## HAGGARD

Quote:


> Originally Posted by *RevanCorana*
> 
> Anyone has experience mousetesting from a linux system? is it better/worst?


Hard to do mousetesting without MouseTester.









I contacted the creator(s) a while ago about this, but it doesn't appear feasible for them to bring it to Linux. I'm sure someone code-savvy could relatively easily port the existing program, or even write a similar one for Linux, though.

And yes, it would be very interesting to play around with these measurements on Linux. See how much OS-induced noise there is in the report processing on a pristine Linux install, and from there there's obviously way more freedom to play around with more fundamental OS functioning to see how it affects things.


----------



## JackCY

IME 3.0, 1000Hz USB, Z97; it should be the 9000fps model.

Zoomed in:



It has always been this way. I see no problem when using the mouse at 500 or 1000Hz, whether on an old laptop with WinXP or a new PC with Z97 and Win8.1 or Win10.
The mouse has several periods where it stands still; that's why there are no updates. Power settings: 5% minimum processor state, maximum C-states are C6.


----------



## HAGGARD

All MLT04 models are 9kfps! :>

The second graph is useless. The first shows ~200µs max variance, which is not terrible. Definitely improvable, and it's only a half-second cutout of readings, so not necessarily representative, but the noise looks fairly tame.


----------



## Nawafwabs

t1.png 80k .png file


t2.png 34k .png file


Bad or good?


----------



## James N

I just convinced myself to ditch Windows 10 for now. It is a pain to reapply all the tweaks after a big update or reinstallation (I usually just reinstall instead of updating).

After tweaking Win 10 for an hour or so, this is the best I could do:



And this is Windows 7 after 10 minutes with basic tweaks applied, such as disabling unused audio devices and services, customizing the Windows power plan, unparking my cores and disabling selective suspend. Of course I also disabled Windows Defender, which was a major culprit in increasing DPC latency and worsening polling precision.



Same system, same tweaks, same mouse (G403). The only difference was going back from Windows 10 to Windows 7.


----------



## Nawafwabs

Quote:


> Originally Posted by *James N*
> 
> I just convinced myself to ditch windows 10 for now. It is a pain to reapply all tweaks after a big update or reinstallation (i usually just reinstall instead of updating).
> 
> After tweaking win 10 for an hour or so this is the best i could do,
> 
> 
> 
> And this is windows 7 after 10 minutes with applied basic tweaks, such as, disabling unused audio devices, services, customizing the windows power plan, unparking my cores and disabling selective suspend. Of course i also disabled windows defender which was a major culprit in increasing the dpc latency and polling precision.
> 
> 
> 
> Same system, same tweaks, same mouse (G403). Only difference was going back from Windows 10 to Windows 7.


How did you get a result like this?

I can't get a result like yours.


----------



## NeoReaper

Quote:


> Originally Posted by *Nawafwabs*
> 
> I can't got result like you


"Data point start": set that to 200.
"Data point end": set that to 500 less than whatever number is in the box.


----------



## James N

Quote:


> Originally Posted by *Nawafwabs*
> 
> how you got result like this
> 
> I can't got result like you


Sorry, I totally forgot to respond to you, but NeoReaper already answered your question. Anything above 1.1ms or below 0.9ms is non-optimal and anything within those values is considered OK; ideally you want as little deviation as possible. When you do these tests, make sure that no other programs are open in the background.

If you can't achieve satisfying results, then lowering the mouse polling rate from 1000Hz down to 500Hz definitely helps, as 500Hz is usually much more stable than 1000Hz.

Before I did these tweaks 500Hz felt much better than 1000Hz; now that I've optimized my system for good polling precision and low DPC latency, 1000Hz feels perfect.

The Windows 10 one I posted is totally fine and this is just me nitpicking, btw.
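The 0.9-1.1 rule of thumb above can be turned into a quick sanity check over an interval log. A minimal sketch (the band and the sample data are just the examples from this post, nothing built into MouseTester):

```python
def classify_intervals(intervals_ms, lo=0.9, hi=1.1):
    """Split 1000 Hz update intervals into in-spec values and outliers
    using the rough 0.9-1.1 ms band discussed above."""
    ok = [t for t in intervals_ms if lo <= t <= hi]
    outliers = [t for t in intervals_ms if t < lo or t > hi]
    return ok, outliers

# Mostly clean polling with two hiccups.
sample = [1.0, 0.99, 1.01, 1.6, 1.0, 0.4, 1.02]
ok, outliers = classify_intervals(sample)
print(f"{len(outliers)} of {len(sample)} intervals out of spec")  # 2 of 7
```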


----------



## Nawafwabs

Quote:


> Originally Posted by *James N*
> 
> Sorry , i totally forgot to respond to you. But NeoReaper already answered your question. Anything above 1.1 or below 0.9 is non optimal and anything within these values is considered ok, ideally you want as little deviation as possible. When you do these tests, make sure that no other programs are open in the background.
> 
> If you can't achieve satisfying results then lowering the mouse refresh rate from 1000hz down to 500hz definitely helps, as 500hz is usually much more stable than 1000hz.
> 
> Before I did these tweaks 500hz felt much better than 1000hz, now that i optimized my system towards a good polling precision and low dpc latency 1000hz feels perfect.
> 
> The Windows 10 one I posted is totally fine and this is just me nitpicking btw.


I think I'm just bad with software.

I tried to set "Data point end" = -500 but it didn't accept a negative value.

I tried a lot of tips, like:

1- Set the mouse to 500Hz

2- Disable power saving in BIOS and Windows

3- Unpark cores

4- Fix some IRQ conflicts







5- Disable any unused integrated motherboard components

6- Update all drivers

7- Update the BIOS

I don't know what I am missing.


----------



## NeoReaper

What I meant by -500 is: subtract 500 from the number that is already in the max box.


----------



## Nawafwabs

Quote:


> Originally Posted by *NeoReaper*
> 
> What I mean't by -500 is negate 500 from the number that is already in the max box.


----------



## James N

Quote:


> Originally Posted by *Nawafwabs*


Click "Log Start".

Make sure to steadily move the mouse in a circular motion. It needs to be fast enough that you reach the USB polling rate, but not so fast that you hit the malfunction speed of your mouse (which shouldn't happen with a decent gaming mouse).



Then press "Log Stop".

Then use the start and end points to exclude the parts where you moved your mouse inconsistently. Keep cutting until it looks clean enough.



Then this should be the final product. (Of course, results will differ.)
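The same trimming can also be done outside the GUI if you have the raw timestamps. A small sketch of the idea (the log values and the `trim` helper are made up for illustration; MouseTester's actual export format may differ):

```python
def intervals_from_timestamps(timestamps_ms, trim=200):
    """Turn a log of event timestamps (ms) into update intervals and
    drop `trim` points from each end, mirroring the 'data point start'
    / 'data point end' cropping described above."""
    diffs = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    if len(diffs) > 2 * trim:
        return diffs[trim:len(diffs) - trim]
    return diffs  # too short to trim safely

# Toy log: ragged motion at both ends, steady 1 ms polling in between.
log = [0, 3, 4, 5, 6, 7, 8, 11]
clean = intervals_from_timestamps(log, trim=2)  # [1, 1, 1]
```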


----------



## Nawafwabs

Quote:


> Originally Posted by *James N*
> 
> Click start log
> 
> Make sure to steadily move the mouse in a circle motion. It needs to be fast enough to where you hit USB polling rate, but not too fast, so you don't hit the malfunction speed of your mouse (which shouldn't happen with a decent gaming mouse.)
> 
> 
> 
> Then press log stop.
> 
> And then use the start and endpoint to exclude the parts where you moved your mouse inconsistent. Keep cutting it till it looks clean enough.
> 
> 
> 
> Then this should be the final product. (Of course, results will differ)




Thanks a lot for helping me.
I appreciate it.


----------



## Bucake

looks like the mouse you're using isn't stable at 1khz.

what mouse is that? something old? krait? mx300 / mx500? mx510? mx518? lol


----------



## James N

Quote:


> Originally Posted by *Bucake*
> 
> looks like the mouse you're using isn't stable at 1khz.
> 
> what mouse is that? something old? krait? mx300 / mx500? mx510? mx518? lol


I think he mentioned somewhere that this is a Rival 700.

While my old Rival also had terrible polling, it wasn't nearly as bad as this one. Something is wrong there.


----------



## uaokkkkkkkk

That's Scimitar release day firmware type bad.


----------



## Bucake

Quote:


> Originally Posted by *James N*
> 
> Rival 700


heh. what??
dunno why i thought that they know what they are doing..


----------



## James N

Quote:


> Originally Posted by *Bucake*
> 
> heh. what??
> dunno why i thought that they know what they are doing..


Yea, and the worst part is that this test was already done at 500Hz. I'm interested in what his results would be with a different mouse in comparison, to see if it really is the mouse or something with his system.

He did mention that he optimized his system already, though.


----------



## uaokkkkkkkk

Yeah. I hope he has more mice to try.


----------



## Bucake

Quote:


> Originally Posted by *James N*
> 
> this test is already done with 500hz


but it's showing 1ms reports? unless the software or firmware has a mistake some place..
can't imagine his system being the problem, that would be really weird and really bad

for my own sake i'll just presume he's trolling or doing something wrong


----------



## Nawafwabs

Quote:


> Originally Posted by *Bucake*
> 
> looks like the mouse you're using isn't stable at 1khz.
> 
> what mouse is that? something old? krait? mx300 / mx500? mx510? mx518? lol


Mouse: Rival 700 (2016)

I set the mouse to 500Hz but it jumps to 1000Hz.


----------



## Nawafwabs

Quote:


> Originally Posted by *James N*
> 
> Yea, also the worst part is that this test is already done with 500hz. I am interested what his results would be with a different mouse in comparison, to see if it really is the mouse or something with his system.
> 
> He did mention that he optimized his system already, though.


I have a Razer DeathAdder & a G502.

I will test them and show you the results.


----------



## Nawafwabs

Quote:


> Originally Posted by *Bucake*
> 
> but it's showing 1ms reports? unless the software or firmware has a mistake some place..
> can't imagine his system being the problem, that would be really weird and really bad
> 
> for my own sake i'll just presume he's trolling or doing something wrong


trolling?


----------



## Nawafwabs

Razer DeathAdder:

1000Hz




500Hz




Logitech G502:

1000Hz




500Hz




SteelSeries Rival 700:

1000Hz




500Hz


----------



## Nawafwabs

Quote:


> Originally Posted by *uaokkkkkkkk*
> 
> Yeah. I hope he has more mice to try.


I did it.


----------



## James N

Quote:


> Originally Posted by *Nawafwabs*
> 
> Razer DeathAdder:
> 
> 1000Hz
> 
> 
> 
> 
> 500Hz
> 
> 
> 
> 
> Logitech G502:
> 
> 1000Hz
> 
> 
> 
> 
> 500Hz
> 
> 
> 
> 
> SteelSeries Rival 700:
> 
> 1000Hz
> 
> 
> 
> 
> 500Hz


It really does seem like the Rival is the worst out of all your mice, although the first plot you provided was much worse than the newer ones.

Weird how 500Hz does not seem as stable as 1000Hz on your system.

The G502 1000Hz plots are the best results.


----------



## Nawafwabs

Quote:


> Originally Posted by *James N*
> 
> Really does seem like the Rival is the worst out of all your mice. Although the first plot you provided was much worse than the newer ones.
> 
> Weird how 500hz does not seem as stable as 1000hz on your system.
> 
> The G502 1000hz plots are the best results.


I have a high-end PC, but I think the problem is the motherboard.

What's your motherboard?


----------



## James N

I am currently using a Gigabyte Z77X-UD3H in conjunction with an overclocked i7 3770K @ 4.5GHz.

Do you have the option to test the Rival 700 on a different system?


----------



## Nawafwabs

Quote:


> Originally Posted by *James N*
> 
> I am currently using a Gigabyte Z77X-UD3H in conjuction with an overclocked I7 3770K @ 4.5ghz.
> 
> Do you have the option to test the Rival 700 on a different system?


I can test it on my laptop


----------



## Nawafwabs

mouserate shows a good result


----------



## James N

Quote:


> Originally Posted by *Nawafwabs*
> 
> 
> 
> mouserate show good result


Try it with the MouseTester software.


----------



## Nawafwabs

G403 wireless (new mouse)


----------



## Avalar

Osu! (the video game) has one of these lol


----------



## Avalar

Quote:


> Originally Posted by *Nawafwabs*
> 
> Razer DeathAdder:
> 
> 1000Hz
> 
> 
> 
> 
> 500Hz
> 
> 
> 
> 
> Logitech G502:
> 
> 1000Hz
> 
> 
> 
> 
> 500Hz
> 
> 
> 
> 
> SteelSeries Rival 700:
> 
> 1000Hz
> 
> 
> 
> 
> 500Hz


Would you mind telling me how you get these graphs? I've tried doing tests with my mice using MouseTester before, but it never looked right. Either something's wrong with my PC, or idk how to use the software, because I can't even interpret the data that I get.


----------



## justzeNn

Should I keep the Logitech software for the G403 or uninstall it? Can it add input lag in CS:GO or in general?


----------



## cdcd

Shouldn't add any lag. You can set your CPI steps, save them to the on-board memory, and uninstall it just as well.


----------



## Nawafwabs

Quote:


> Originally Posted by *Avalar*
> 
> Would you mind telling me how you get these graphs? I've tried doing tests with my mice using MouseTester before, but it never looked right. Either something's wrong with my PC, or idk how to use the software, because I can't even interpret the data that I get.


I have the latest version of MouseTester, 1.5.

Just press Log Start and move the mouse in circles.


----------



## Straszy

For problems with Nvidia Pascal cards you need to disable the Nvidia mixer. There is a tool somewhere and it really works.


----------



## Avalar

Quote:


> Originally Posted by *Nawafwabs*
> 
> I have the least ver of mouse taster 1.5
> 
> just press Log Start and move the mouse in circles


Thanks! I actually just had an outdated version


----------



## Nawafwabs

Quote:


> Originally Posted by *Straszy*
> 
> For problems with nvidia pascal cards u need to disable nvidia mixer. There is a tool somewhere and its really working.


nvidia mixer ?? Where??


----------



## cdcd

If I recall correctly NVmixer used to be part of nForce back in the day, i.e. it should not be relevant at all these days. That suggestion sounds pretty r0achy to me tbh


----------



## Nawafwabs

After disabling the ASMedia 106x SATA controller


----------



## Straszy

Quote:


> Originally Posted by *cdcd*
> 
> If I recall correctly NVmixer used to be part of nForce back in the day, i.e. it should not be relevant at all these days. That suggestion sounds pretty r0achy to me tbh


http://www.overclock.net/t/683583/how-to-adjust-nvidias-powermizer

Sorry, it was PowerMizer.


----------



## NovaGOD

Is this acceptable for win10-ltsb? I'm using intel driver on a usb 3.0 port/logitech g403 atm.


----------



## James N

Quote:


> Originally Posted by *NovaGOD*
> 
> Is this acceptable for win10-ltsb? I'm using intel driver on a usb 3.0 port/logitech g403 atm.


That is about as good as it gets on Windows 10. And overall this is a more than fine result.


----------



## Gonzalez07

Quote:


> Originally Posted by *NovaGOD*
> 
> Is this acceptable for win10-ltsb? I'm using intel driver on a usb 3.0 port/logitech g403 atm.


seems really solid, what driver version are you using?


----------



## NovaGOD

Thanks a lot. I guess if I want better, W7 is the only option.








@Gonzalez07 I'm using the default Intel xHCI drivers, specifically "Intel(R) USB 3.0 eXtensible Host Controller - 1.0 (Microsoft)". I didn't install any driver; when I switched xHCI back to "enabled" on my mobo, W10 automatically used this driver.


----------



## James N

Quote:


> Originally Posted by *NovaGOD*
> 
> Thanks a lot, i guess if i want better w7 is the only option.
> 
> 
> 
> 
> 
> 
> 
> 
> @Gonzalez07 I'm using default intel xHCI drivers, specifically "Intel(R) USB 3.0 eXtensible Host Controller - 1.0 (Microsoft)" i didn't install any driver when i switched xHCI back to "enabled" on my mobo w10 automatically uses this driver.


Yea, I also tweaked about everything I could and my best results were about the same as yours. I am also using the "Intel(R) USB 3.0 eXtensible Host Controller - 1.0 (Microsoft)" drivers.

Then the only change I made was switching back to Windows 7, and it was way better. That being said, the results you achieved seem more than sufficient for it not to have an impact on how your mouse feels in games.









Here are my results comparing win 10 to win 7.

http://www.overclock.net/t/1550666/usb-polling-precision/600

So, at least for me the only way to achieve better results was to switch back to win 7, which overall feels a lot more solid to me.


----------



## NovaGOD

Yeah, I saw your post with the excellent polling; that's why I was debating whether I want to switch to W7 again.







I guess I'll stick with W10 for now because I'm too lazy to switch.


----------



## Nawafwabs

Quote:


> Originally Posted by *NovaGOD*
> 
> Is this acceptable for win10-ltsb? I'm using intel driver on a usb 3.0 port/logitech g403 atm.


Same mouse.


I think my result is bad.


----------



## NovaGOD

Try these:

1. High performance power plan with USB selective suspend disabled
2. Put your GPU in MSI mode if it isn't already (there is a utility that does this automatically, I think)
3. Use KBoost=on if you have Nvidia
4. Disable processor C-states
5. Remove USB corpses with USBDeview
6. Disable any USB device you don't need
7. Again, if you have Nvidia, disable telemetry with Autoruns
8. Disable any other services you don't need (optional, except the W10 telemetry I think)

Basically the idea is that you want your GPU/CPU to run at full clock speeds (not full load) when you are testing polling; for example, on my system if I turn KBoost off my results are much worse.

These worked for me at least, but I'm not sure which has the most impact.


----------



## Conditioned

Quote:


> Originally Posted by *NovaGOD*
> 
> Try these:
> 
> 1. High performance power plan with usb selective disabled
> 2. Put your gpu on msi mode if it isn't already (there is utility that does this automatically i think)
> 3. use kboost=on if you have nvidia
> 4. disable processor c-states
> 5. remove usb corpses with usbdeview
> 6. disable any usb device you don't want
> 7. again if you have nvidia then disable telemetry with autoruns
> 8. disable any other service you don't need (optional except the w10 telemetry i think)
> 
> Basically the idea is that you want your gpu/cpu to run at full clocks speeds(not full load) when you are testing for polling, for example in my system if i turn kboost off my results are much worse.
> 
> These worked for me at least but i'm not sure which has the most impact.


MSI mode can cause stuttering in some instances, YMMV.

KBoost = EVGA only. (Actually, now that I think about it, there is a version of MSI Afterburner where you can put an EVGA skin on it and enable it on non-EVGA cards.)


----------



## NovaGOD

Quote:


> Originally Posted by *Conditioned*
> 
> Msi can give stuttering in some instances ymmw.
> 
> Kboost=evga only. (Actually I think there is a version of msi ab where you can put an evga skin on it and enable it on non evga cards).


I don't have an EVGA card and I can enable it through version 5.3.11 of Precision X (I think they removed it or made it EVGA-exclusive in newer versions). Never heard about stuttering problems, but you're right, YMMV.


----------



## Gonzalez07

The PowerMizer tool should do what KBoost does; it adds registry keys to force the P0 state at all times, like KBoost, I believe.


----------



## James N

Quote:


> Originally Posted by *Nawafwabs*
> 
> Same mouse
> 
> 
> I think my result bad


It is not super terrible, but it could indeed be better.
Make sure that you don't have anything open in the background that you don't need. For example, monitoring software like Core Temp can have a big impact on your results, as can Chrome and the like. Just to make sure it isn't some program's fault.


----------



## Nawafwabs

Quote:


> Originally Posted by *NovaGOD*
> 
> Try these:
> 
> 1. High performance power plan with usb selective disabled
> 2. Put your gpu on msi mode if it isn't already (there is utility that does this automatically i think)
> 3. use kboost=on if you have nvidia
> 4. disable processor c-states
> 5. remove usb corpses with usbdeview
> 6. disable any usb device you don't want
> 7. again if you have nvidia then disable telemetry with autoruns
> 8. disable any other service you don't need (optional except the w10 telemetry i think)
> 
> Basically the idea is that you want your gpu/cpu to run at full clocks speeds(not full load) when you are testing for polling, for example in my system if i turn kboost off my results are much worse.
> 
> These worked for me at least but i'm not sure which has the most impact.




I don't know if there is any improvement.

Thank you for the tips.


----------



## Nawafwabs

Quote:


> Originally Posted by *James N*
> 
> It is not super terrible but i could indeed be better.
> Make sure that you don't have anything open in background that you don't need. For example monitoring software like core temp can have a big impact on your results as well as Chrome and such. Just so you know that it isn't programs fault.


I spent a lot of time optimizing my system.

I think the problem is with the motherboard or a driver.


----------



## James N

Quote:


> Originally Posted by *Nawafwabs*
> 
> I spent a lot of time for optimizing my system
> 
> I think the problem with motherboard or driver


Have you tested it in safe mode yet? If safe mode shows the same, then it seems to be hardware-related.


----------



## Avalar

I was curious about what you guys were doing, so I tried it myself.



This is good, right?









Logitech G502 btw


----------



## daniel0731ex




----------



## Bucake

@Avalar you want to shrink it so it shows only ~60ms to <5760
I mean, the current graph clearly tells you that the mouse is stable at 1kHz, but not how jittery the polling is.

edit: when you have the plot in front of you, you can just input the numbers @ data point start / end, and press enter
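To make "how jittery" concrete: the zoomed-in plot is really showing the spread of the intervals between successive polls. Here is a minimal sketch of the same idea, assuming you have the event timestamps MouseTester logs (the timestamp lists below are made-up numbers for illustration):

```python
# Sketch: quantify polling jitter from event timestamps in milliseconds,
# the same data MouseTester plots as "Interval vs. Time".
# The timestamp lists below are made up for illustration.

def polling_stats(timestamps_ms):
    """Return (mean, min, max, spread) of the intervals between polls."""
    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    mean = sum(intervals) / len(intervals)
    return mean, min(intervals), max(intervals), max(intervals) - min(intervals)

steady = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]    # clean 1000 Hz polling
jittery = [0.0, 0.6, 2.1, 2.9, 4.4, 5.0]   # same average rate, unstable

print(polling_stats(steady))   # spread of 0.0 ms
print(polling_stats(jittery))  # same ~1 ms mean, much larger spread
```

Both traces average 1kHz, which is why a zoomed-out plot looks fine for either; only the min/max spread of the intervals separates them.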


----------



## Avalar

Quote:


> Originally Posted by *Bucake*
> 
> @Avalar you want to shrink it so it shows only ~60ms to <5760
> I mean, the current graph clearly tells you that the mouse is stable at 1kHz, but not how jittery the polling is.
> 
> edit: when you have the plot in front of you, you can just input the numbers @ data point start / end, and press enter


How about now?


----------



## Bucake

yep, exactly.

fine plot, sir. your polling is in healthy condition


----------



## Avalar

Quote:


> Originally Posted by *Bucake*
> 
> yep, exactly.
> 
> fine plot, sir. your polling is in healthy condition


Yay ;3


----------



## James N

Quote:


> Originally Posted by *Avalar*
> 
> Yay ;3


Welcome to the club! You will only get more paranoid from here on out. Soon you will be checking your polling every other day or so, to make sure nothing changed.

Just let us know once the disease has fully spread and you are diagnosed with mousetism.


----------



## Avalar

Quote:


> Originally Posted by *James N*
> 
> Welcome to the club! You will only get more paranoid from here on out. Soon you will be checking your polling every other day or so, to make sure nothing changed.
> 
> Just let us know once the disease has fully spread and you are diagnosed with mousetism.


Lol, I think I already suffer from this.

Noticing the subtle CPI variations among different mouse pads with the same mouse, as well as differences in responsiveness. Using 3+ different mouse pad and mouse combos for different games, using the Windows 7 Classic theme, checking your DPC latency when you think something's up...

drawing figure 8's on your desktop for a solid 3 minutes to see how it "feels"...


----------



## Nawafwabs

Quote:


> Originally Posted by *James N*
> 
> Have you tested it in safemode yet? If safemode shows the same then it seems to be hardware related.


in safe mode


----------



## x7007

Quote:


> Originally Posted by *Nawafwabs*
> 
> in safe mode


Does anyone have a simple guide on how to check this, what I need to look for, and what I need to post in the forums? I still didn't get the CPI thing. I mean, I just want to check it for the moment and I'll understand it as I go.


----------



## Nawafwabs

Finally, I fixed it :"(

I got this result in safe mode after some changes:



1- changed RAM timings to manual

2- set RAM voltage to 1.2V

3- disabled Rank Interleave

4- enabled Channel Interleave


----------



## James N

Quote:


> Originally Posted by *Nawafwabs*
> 
> Finally, I fixed it :"(
> 
> I got this result in safe mode after some changes:
> 
> 
> 
> 1- changed RAM timings to manual
> 
> 2- set RAM voltage to 1.2V
> 
> 3- disabled Rank Interleave
> 
> 4- enabled Channel Interleave


This looks really good, so you know you are not hardware limited. Good luck tweaking so you get similar results outside of safe mode.

Quote:


> Originally Posted by *x7007*
> 
> Does anyone have a simple guide on how to check this, what I need to look for, and what I need to post in the forums? I still didn't get the CPI thing. I mean, I just want to check it for the moment and I'll understand it as I go.


http://www.overclock.net/t/1550666/usb-polling-precision/600#post_26033838

In the lower left, use the dropdown menu and select "Interval vs. Time". Then just post your plot here.


----------



## Gonzalez07

Have you fellas used the program WhySoSlow? It's from the same company that made LatencyMon. I'd be interested in seeing some results.


----------



## nidzakv

Very interesting.. Thanks..

Sent from my LG-D802 using Tapatalk


----------



## NeoReaper

I have tried what I can to improve the stability of my results. I am running nearly the minimum services needed to keep W10 running with internet, and literally the only thing running in the background at the time was Bitdefender, but it wasn't doing anything and was set to lowest priority/game mode while MouseTester was running. The funny thing is my safe mode result:


Spoiler: Safe Mode Result









Spoiler: Normal Mode Result


----------



## James N

Quote:


> Originally Posted by *NeoReaper*
> 
> I have tried what I can to improve the stability of my results. I am running nearly the minimum services needed to keep W10 running with internet, and literally the only thing running in the background at the time was Bitdefender, but it wasn't doing anything and was set to lowest priority/game mode while MouseTester was running. The funny thing is my safe mode result:
> 
> 
> Spoiler: Safe Mode Result
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Normal Mode Result


That is really odd. But since your results outside of safemode are totally fine, there is nothing to be concerned about. Maybe someone who has a better understanding of the technical aspects can help you with your safemode result being way worse. I am interested to see what is causing this.


----------



## NeoReaper

Yeah, I don't get it; I really don't have much ever running in the background. I have 55 processes altogether (from Task Manager > Performance) and I know what every one of them is responsible for (and for the record, my PC is on the Creators Update):


Spoiler: List of processes



system idle process
system
smss.exe
csrss.exe
wininit.exe
csrss.exe
services.exe
lsass.exe
winlogon.exe
svchost.exe
svchost.exe
fontdrvhost.exe
svchost.exe
svchost.exe
dwm.exe
svchost.exe
svchost.exe
svchost.exe
vsserv.exe
wudfhost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
updatesrv.exe
vsservp.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
svchost.exe
sihost.exe
taskhostw.exe
explorer.exe
svchost.exe
shellexperiencehost.exe
cue.exe
audiodg.exe
svchost.exe
bdagent.exe


EDIT: Hopefully someone can explain this so I can get it sorted.


----------



## Axaion

55? I've got 107.

Man, I miss nLite; you could disable so much stuff from the get-go there.

What did you use to trim it down?


----------



## Melan

DISM if you're good with cmd.

MSMG Toolkit if you're not.

NTLite if you have $$$.


----------



## NeoReaper

The actual stripped-down version I installed was from the tool in this thread: http://www.overclock.net/t/1627258/tools-for-editing-windows-10-iso/0_100#post_26002316
Then I used this website (it flags up in some anti-viruses, but I can assure you it doesn't have a virus): http://fdossena.com/?p=w10debotnet/index_1703.frag
Black Viper came next, but I found that website to be... too safe; there is a lot more you can disable that isn't needed by Windows.
I have Windows Update disabled and re-enable it once a month when I do a defrag of my 2TB drive using MyDefrag, run CCleaner, and let Windows Update grab any updates it wants. (It might sound like a chore, but it's not that bad; these four services are the ones Windows Update needs to do its work and can be disabled safely, then re-enabled every time you want to do an update):
Background Intelligent Transfer Service
Update Orchestrator Service
Windows Firewall
Windows Update

Note: If you don't have an antivirus/firewall solution installed, leave Windows Firewall enabled.
I can make a post if wanted and list every service I have disabled.
EDIT: I also go and find as many of the privacy tools as possible, not because I am concerned about W10's privacy crap, but because I don't want it running in the background eating up RAM and CPU when it's not needed to keep W10 functioning.
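For anyone wanting to script that monthly toggle instead of clicking through services.msc, here is a rough sketch that only builds the `sc` command lines (it does not run them). The internal service names (wuauserv, BITS, UsoSvc, MpsSvc) are my assumed mappings for the four display names listed above; double-check them on your own system before running anything.

```python
# Hedged sketch: build (not execute) the sc.exe commands for toggling the
# four Windows Update-related services listed above. Internal names are
# assumed mappings: BITS = Background Intelligent Transfer Service,
# UsoSvc = Update Orchestrator Service, MpsSvc = Windows Firewall,
# wuauserv = Windows Update.
UPDATE_SERVICES = ["BITS", "UsoSvc", "MpsSvc", "wuauserv"]

def sc_commands(services, enable):
    """Return sc.exe invocations that disable or re-enable the services."""
    start_mode = "demand" if enable else "disabled"
    cmds = []
    for name in services:
        cmds.append(f"sc config {name} start= {start_mode}")
        if not enable:
            cmds.append(f"sc stop {name}")
    return cmds

# Print the "disable" batch; generate with enable=True before updating.
for cmd in sc_commands(UPDATE_SERVICES, enable=False):
    print(cmd)
```

Paste the printed lines into an elevated command prompt, or wire them into a scheduled task; the point is just that the toggle is four `sc config` calls each way.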


----------



## Axaion

Thanks. I've used Black Viper/nLite since XP; it was hilarious to get an XP install down under 80MB RAM usage, and still fully working back then, hah.

Switched to Windows Toolkit for Win7 though, since nLite didn't work and RT7Lite was so buggy.


----------



## NeoReaper

Quote:


> Originally Posted by *Axaion*
> 
> Thanks. I've used Black Viper/nLite since XP; it was hilarious to get an XP install down under 80MB RAM usage, and still fully working back then, hah.
> 
> Switched to Windows Toolkit for Win7 though, since nLite didn't work and RT7Lite was so buggy.


The days of low RAM usage... Ahhh...
I can get Windows 10 down to 700MB on startup (on 16GB of RAM with a 4GB pagefile, so some of it might be going to the pagefile to be cached?) when Bitdefender/CUE aren't installed, which is pretty cool: you can still enjoy Windows 10 with low amounts of RAM xD
EDIT: Oh no, I just learned there is *ANOTHER FEATURE UPDATE ON THE WAY IN SEPTEMBER, KILL ME NOW PLS*


----------



## Axaion

Quote:


> Originally Posted by *NeoReaper*
> 
> The days of low RAM usage... Ahhh...
> I can get Windows 10 down to 700MB on startup (on 16GB of RAM with a 4GB pagefile, so some of it might be going to the pagefile to be cached?) when Bitdefender/CUE aren't installed, which is pretty cool: you can still enjoy Windows 10 with low amounts of RAM xD
> EDIT: Oh no, I just learned there is *ANOTHER FEATURE UPDATE ON THE WAY IN SEPTEMBER, KILL ME NOW PLS*


Yay, now with improved .. wasted white space!


----------



## Melan

September should also bring new W10 LTSB version. Hooray indeed.


----------



## NeoReaper

I would love to go to LTSB, but there aren't many ways of getting hold of it unless you have Enterprise, which sucks. I kinda wish they gave the option to us W10 Pro users too...


----------



## Axaion

Yeah, it would have been neat, but at least Pro was free, which is the only reason I'm not running 7 right now.


----------



## NeoReaper

Same here for my laptops. The way my desktop got Pro was that I participated in the very early Insider builds, so that's how I got my W10 Pro, and luckily they don't force us Insiders to always have preview builds enabled to keep our license! (Even though it would not have mattered much, since I have Windows Update disabled >> )


----------



## Melan

I'm not planning on using W10 until I upgrade/sidegrade to Ryzen. W8 gives way better results on the Z77 platform for me.


----------



## Avalar

Quote:


> Originally Posted by *Axaion*
> 
> Yeah, it would have been neat, but at least Pro was free, which is the only reason I'm not running 7 right now.


Windows 10 is still free if you have a valid copy of Win 7 installed.

...that is, if you know where to look.









I get Win 10 for free anyway cuz I'm a dual-enrolled college student, so iz cool.


----------



## Axaion

Quote:


> Originally Posted by *Avalar*
> 
> Windows 10 is still free if you have a valid copy of Win 7 installed.
> 
> ...that is, if you know where to look.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I get Win 10 for free anyway cuz I'm a dual-enrolled college student, so iz cool.


Literally doesn't matter to me since I got mine for free.


----------



## Avalar

Quote:


> Originally Posted by *Axaion*
> 
> Literally doesn't matter to me since I got mine for free.


Nah man, I mean, you coulda had both


----------



## Axaion

So apparently MSMG Toolkit doesn't work with ISO files gotten from the MediaCreationTool; I had to download a separate ISO because the one from the tool didn't have install.wim in it.


----------



## NeoReaper

Quote:


> Originally Posted by *Axaion*
> 
> So apparently MSMG Toolkit doesn't work with ISO files gotten from the MediaCreationTool; I had to download a separate ISO because the one from the tool didn't have install.wim in it.


You can modify those ISOs: mount them, convert the install.esd to .wim, and then yer done!


----------



## Axaion

Yeah, I noticed that after my post. Had to take ownership of everything, else it bugged out like mad.


----------



## justzeNn

Does the Realtek audio driver add input lag to the mouse?


----------



## freddy4fingrar

imo this looks very bad?

http://imgur.com/a/ILVaW


----------



## James N

Quote:


> Originally Posted by *freddy4fingrar*
> 
> imo this looks very bad?
> 
> http://imgur.com/a/ILVaW


Do it again and move the mouse at a steady pace, then exclude the starting and endpoints. If your speed is inconsistent, then it will affect your results.


----------



## Synoxia

What is this? 20kHz with a G203 mouse? ahahahah


----------



## Nawafwabs

I think I did a great job!


----------



## x7007

Quote:


> Originally Posted by *Nawafwabs*
> 
> 
> 
> 
> I think I did a great job!


What does this program do? USB mouse adjustment?


----------



## James N

Quote:


> Originally Posted by *Nawafwabs*
> 
> 
> 
> 
> I think I did a great job!


Nice, you finally did it. Good job, it should feel much better now compared to before.


----------



## Nawafwabs

Quote:


> Originally Posted by *James N*
> 
> Nice, you finally did it. Good job, it should feel much better now compared to before.


Thanks for your support!

It feels smoother and more accurate now.

Today I took a look at software called UsbTreeView.

I see my USB has high latency.


Can you test it?

http://www.uwe-sieber.de/usbtreeview_e.html


----------



## James N




----------



## Nawafwabs

Quote:


> Originally Posted by *James N*


Do you use Windows 10?

How did you get that driver?

Windows 10's default controller is the "Intel USB 3.0 eXtensible Host Controller".

You have a different controller.


----------



## James N

Quote:


> Originally Posted by *Nawafwabs*
> 
> Do you use Windows 10?
> 
> How did you get that driver?
> 
> Windows 10's default controller is the "Intel USB 3.0 eXtensible Host Controller".
> 
> You have a different controller.


I went back to Windows 7 and disabled everything I don't need. Also, correct me if I am wrong, but the USB enumeration time shouldn't matter unless you have issues with your USB port, or if it stops working randomly every now and then.


----------



## NovaGOD

DAE, w10


I can try a different mouse and disable USB 3.0 to see if there is any difference. Not sure what this "USB: 641ms" means though, or what the impact is in-game. When I use LatencyMon, USB DPC latency is lower with this driver.


----------



## James N

Quote:


> Originally Posted by *NovaGOD*
> 
> DAE, w10
> 
> 
> I can try a different mouse and disable USB 3.0 to see if there is any difference. Not sure what this "USB: 641ms" means though, or what the impact is in-game. When I use LatencyMon, USB DPC latency is lower with this driver.


As far as I know, this is the time it takes for your PC to detect a device when it is connected or disconnected. Sometimes it will take a while for your system to recognize the new device and you will get the hourglass icon, since your PC is enumerating it (configuring the device, assigning its address, and reading its descriptors).

This can take as long as 16 seconds. As long as your USB port and your device work and don't keep timing out, the number is not relevant and has no significant impact on the performance of your device.


----------



## Infection11

DA Elite, Windows 7 64-bit.
Why is the latency so high? How can I fix it?








BTW, how can I do the USB polling precision test? The first post is so long and my English is very bad; can you help me?


----------



## James N

Quote:


> Originally Posted by *Infection11*
> 
> 
> DA Elite, Windows 7 64-bit.
> Why is the latency so high? How can I fix it?
> 
> 
> 
> 
> 
> 
> 
> 
> BTW, how can I do the USB polling precision test? The first post is so long and my English is very bad; can you help me?


Don't worry, the enumeration time is not important unless you experience issues.

You should read through the very first post in this thread; it covers everything you need to know. If you need additional help, let us know.

You should also download this:

http://www.overclock.net/t/1590569/mousetester-software-reloaded


----------



## Nawafwabs

Quote:


> Originally Posted by *James N*
> 
> I went back to Windows 7 And disabled everything i don't need. Also correct me if i am wrong. But the USB enumeration time shouldn't matter unless you have issues with your usb port and or if it stops working randomly every now and then.


I don't know how to describe it, but I will try.

Five months ago I had an Asus motherboard.

I changed it to a Gigabyte motherboard, OK?

With the Asus, 400 DPI felt slow, really slow.

When I changed to the Gigabyte, 400 DPI felt like 1000 DPI.

I lowered it to 200 DPI, but it still feels fast, like an acceleration feel.


----------



## x7007

Quote:


> Originally Posted by *Nawafwabs*
> 
> I don't know how to describe it, but I will try.
> 
> Five months ago I had an Asus motherboard.
> 
> I changed it to a Gigabyte motherboard, OK?
> 
> With the Asus, 400 DPI felt slow, really slow.
> 
> When I changed to the Gigabyte, 400 DPI felt like 1000 DPI.
> 
> I lowered it to 200 DPI, but it still feels fast, like an acceleration feel.


So Gigabyte USB is bad?


----------



## Nawafwabs

Quote:


> Originally Posted by *x7007*
> 
> So Gigabyte USB is bad?


maybe


----------



## James N

Quote:


> Originally Posted by *Nawafwabs*
> 
> I don't know how to describe it, but I will try.
> 
> Five months ago I had an Asus motherboard.
> 
> I changed it to a Gigabyte motherboard, OK?
> 
> With the Asus, 400 DPI felt slow, really slow.
> 
> When I changed to the Gigabyte, 400 DPI felt like 1000 DPI.
> 
> I lowered it to 200 DPI, but it still feels fast, like an acceleration feel.


I am using a Gigabyte board and mine is OK.
The VIA ones for whatever reason have major issues on Windows 10, and also don't work correctly on Windows 7 if you have anything plugged into them prior to installing drivers. But that seems to be more of an issue with VIA than with Gigabyte. So make sure that you only use the Intel USB ports for your mouse.


----------



## Nawafwabs

Quote:


> Originally Posted by *James N*
> 
> I am using a Gigabyte board and mine is OK.
> The VIA ones for whatever reason have major issues on Windows 10, and also don't work correctly on Windows 7 if you have anything plugged into them prior to installing drivers. But that seems to be more of an issue with VIA than with Gigabyte. So make sure that you only use the Intel USB ports for your mouse.


Win7 is smooth and good,

but I can't go back; I think some new games don't work well on Win7.

I use "Intel(R) USB 3.0 eXtensible Host Controller - 1.0 (Microsoft)".

I don't understand the VIA you talk about.


----------



## James N

Quote:


> Originally Posted by *Nawafwabs*
> 
> Win7 is smooth and good,
> 
> but I can't go back; I think some new games don't work well on Win7.
> 
> I use "Intel(R) USB 3.0 eXtensible Host Controller - 1.0 (Microsoft)".
> 
> I don't understand the VIA you talk about.


It is common for some mainboards to have 2 different USB chipsets. Mine, for example, has Intel and VIA, so I need to use specific USB ports in order to use the Intel chipset or the VIA one. I always use the Intel ones for my mouse, since the VIA ones have some driver incompatibility issues on Windows 10 and even on Windows 7.

I am not sure if that is the reason for your problem though. But it is worth a try to test out different ones.


----------



## Infection11

I'm so bad at English, dude, I don't understand a thing. Is there no simple way to just install it and set it up to 1000Hz?
Too many things to remove, but I don't see a download anywhere.


----------



## emka

So wait, the Intel USB 3.0 ports should be better than my native USB 2.0 ports on my Gigabyte Z97?


----------



## Nawafwabs

Quote:


> Originally Posted by *James N*
> 
> It is common for some mainboards to have 2 different USB chipsets. Mine, for example, has Intel and VIA, so I need to use specific USB ports in order to use the Intel chipset or the VIA one. I always use the Intel ones for my mouse, since the VIA ones have some driver incompatibility issues on Windows 10 and even on Windows 7.
> 
> I am not sure if that is the reason for your problem though. But it is worth a try to test out different ones.


I know what the problem is.

When I started the program "Mouse Movement Recorder", it showed a red line while I was playing.

So I changed the pointer speed to 3/6.

Then I played again and it felt accurate, like an aimbot.

I don't know why this happens to me.


----------



## James N

Quote:


> Originally Posted by *emka*
> 
> So wait, the Intel USB 3.0 ports should be better than my native USB 2.0 ports on my Gigabyte Z97?


Use whatever works for you. Just in my particular case, the VIA ones don't work correctly and the Intel ones give me better results/are more stable. My Intel ones are the main ports and support USB 3.0 and 2.0. I am currently using them as 2.0 on Windows 7.


----------



## NovaGOD

Suddenly I have bad polling without changing anything on my setup; I tried the G403 (main driver at the moment) and the KPO, always the same result. Tried Intel USB 3.0 drivers, removed corpses, etc. Honestly, I don't know what happened; I used to have good polling results...


----------



## Avalar

Yeah, what happened? My polling didn't differ by more than 0.01 before. Now look at it...

Before: http://cdn.overclock.net/7/7d/7d63ded9_8OKsOc.png

After: https://vgy.me/CPm9hD.png

;-;

I mean, who _really_ cares? But still..

;c


----------



## James N

Windows 10 creators update?


----------



## NovaGOD

Windows 10 LTSB-N for me; didn't change a thing, and suddenly my polling rate is terrible lol.


----------



## x7007

When connecting my mouse to the Intel USB 3.0 controller, the mouse moves sooooo slowly compared to USB 2.0... what will I do when I only have a motherboard with USB 3.1/3.0 support? The MSI motherboard my friend has features special orange ports that actually run USB 3.0 ports as USB 2.0, so it doesn't affect devices. I also have the Xbox Elite Controller with the wireless dongle, and on USB 3.0 it works terribly; it sometimes registers double movement if I release the left stick when it's at the maximum of one side. That doesn't seem to happen with USB 2.0.

So I can't figure out why this is happening. I have many devices I need to connect using USB 3.0, and I can't go into the BIOS to change it all the time. External hard disks and other USB devices all support USB 3.0, and I don't have enough USB ports.

I need the mouse and keyboard on a USB port with IRQ 23, not IRQ 16 shared with the GPU, so two ports are taken 24/7.
Then I need my GSX 1000 external USB sound card, again not on IRQ 16; it must be USB 2.0.
The hard disk is USB 3.0, connected to the Intel XHCI 3.0 controller, because that's the only way to get UASP support.

I have:
Case:
2 front USB 2.0 (IRQ 23) - mouse and GSX 1000 sound card connected
2 front USB 3.0

Back:
2 ASMedia 3.0
2 USB 2.0 (IRQ 16)
3 Intel XHCI 3.0 - keyboard is connected to one of these ports

What should I do?
Connect the keyboard to the USB 2.0 IRQ 16 ports? I seem to have issues when the keyboard is connected to USB 3.0, like some kind of delay, as I had with the mouse.


----------



## 508859

Quote:


> Originally Posted by *x7007*
> 
> When connecting my mouse to the Intel USB 3.0 controller, the mouse moves sooooo slowly compared to USB 2.0... what will I do when I only have a motherboard with USB 3.1/3.0 support? The MSI motherboard my friend has features special orange ports that actually run USB 3.0 ports as USB 2.0, so it doesn't affect devices. I also have the Xbox Elite Controller with the wireless dongle, and on USB 3.0 it works terribly; it sometimes registers double movement if I release the left stick when it's at the maximum of one side. That doesn't seem to happen with USB 2.0.
> 
> So I can't figure out why this is happening. I have many devices I need to connect using USB 3.0, and I can't go into the BIOS to change it all the time. External hard disks and other USB devices all support USB 3.0, and I don't have enough USB ports.
> 
> I need the mouse and keyboard on a USB port with IRQ 23, not IRQ 16 shared with the GPU, so two ports are taken 24/7.
> Then I need my GSX 1000 external USB sound card, again not on IRQ 16; it must be USB 2.0.
> The hard disk is USB 3.0, connected to the Intel XHCI 3.0 controller, because that's the only way to get UASP support.
> 
> I have:
> Case:
> 2 front USB 2.0 (IRQ 23) - mouse and GSX 1000 sound card connected
> 2 front USB 3.0
> 
> Back:
> 2 ASMedia 3.0
> 2 USB 2.0 (IRQ 16)
> 3 Intel XHCI 3.0 - keyboard is connected to one of these ports
> 
> What should I do?
> Connect the keyboard to the USB 2.0 IRQ 16 ports? I seem to have issues when the keyboard is connected to USB 3.0, like some kind of delay, as I had with the mouse.


Enable MSI mode for your GPU, use only USB 2.0 for your mouse.

Also check if your sound card is MSI-capable.
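For reference, utilities that "enable MSI mode" generally just set a single registry value under the device's instance key. A sketch of where that value lives (the device instance string below is a made-up placeholder, not a real GPU; back up the registry and look up your actual device's key in Device Manager before touching anything):

```python
# Sketch: build the registry key path where MSI mode is switched on.
# Setting the DWORD "MSISupported" to 1 under this key is what the
# common MSI-mode utilities do. DEVICE_INSTANCE is a fake placeholder.
DEVICE_INSTANCE = "VEN_XXXX&DEV_XXXX&SUBSYS_XXXXXXXX\\1&2&3"  # placeholder

def msi_mode_key(device_instance):
    """Registry key holding the MSISupported DWORD (1 = MSI mode on)."""
    return ("HKLM\\SYSTEM\\CurrentControlSet\\Enum\\PCI\\" + device_instance +
            "\\Device Parameters\\Interrupt Management"
            "\\MessageSignaledInterruptProperties")

print(msi_mode_key(DEVICE_INSTANCE))
```

This only constructs the path as a string; actually flipping the value requires regedit or an elevated tool, and a wrong edit can prevent the device from starting.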


----------



## x7007

Enabling MSI mode on the GPU causes issues.
What about the keyboard?

The USB 2.0 sound card can't enable MSI mode.


----------



## NeoReaper

It does? Never heard that one before.


----------



## x7007

Quote:


> Originally Posted by *NeoReaper*
> 
> It does? Never heard that one before.


Has anyone else used it? Did anyone see issues or have issues?


----------



## NeoReaper

I have enabled it on every system I own; it seems to actually make them a bit more stable. In the case of one of the old netbooks I have (it was an experiment to see if I could get a 10-year-old netbook to run W10... "smoothly"), after enabling MSI mode the long black screens before the lock screen picture just disappeared for some reason. I have never seen a case of it actually making stability/polling worse.


----------



## x7007

Quote:


> Originally Posted by *NeoReaper*
> 
> I have enabled it on every system I own; it seems to actually make them a bit more stable. In the case of one of the old netbooks I have (it was an experiment to see if I could get a 10-year-old netbook to run W10... "smoothly"), after enabling MSI mode the long black screens before the lock screen picture just disappeared for some reason. I have never seen a case of it actually making stability/polling worse.


I don't understand why MSI mode is auto-enabled for laptop GPUs but not for desktop cards.

OMG... connecting a mouse (to be accurate, a G502) and a Corsair K70 Red Rapid keyboard to USB 3.0 was causing random stuttering and FPS drops. In the game BladeStorm Nightmare I had drops to 45-55 while the GPU was sitting at barely 45% and the CPU at barely 25%; nothing happens when the FPS drops, it just happens. After I changed the mouse to USB 2.0 the drops almost stopped entirely, though there was still a random drop to 58-59. After I changed the keyboard to USB 2.0 as well, with MSI mode on the graphics card and both connected to USB 2.0 on IRQ 16, the FPS drops stopped and it's 60 FPS all the time.


----------



## pindle

Quote:


> Originally Posted by *NovaGOD*
> 
> Suddenly I have bad polling without changing anything on my setup; I tried the G403 (main driver at the moment) and the KPO, always the same result. Tried Intel USB 3.0 drivers, removed corpses, etc. Honestly, I don't know what happened; I used to have good polling results...


I think I'm having the same problem you're having. It's been like this for a few weeks now and I've been pulling my hair out. I can clearly see polling dips every 1000ms, and can't for the love of God figure out what changed or caused it. I've revisited all BIOS settings, checked updates, tried different USB ports, different mice, disconnected most stuff (internal/external) that isn't mandatory... to no avail. I'm starting to think it's time for a full reinstall (since my W8.1 install is already 2-3 years old). *shiver* :'(

Edit: are you using an Nvidia card? There was a driver update some time ago; I haven't ruled that out yet.


----------



## Melan

Quote:


> Originally Posted by *x7007*
> 
> Anyone else used it , did anyone else see issues , had issues ?


I've been using MSI mode on my GPU for quite some time, on systems from W7 up to W10, without any problems. At least on the Z77 platform.


----------



## Axaion

I'm using it right now on Windows 10 x64 with a GTX 970 @ 3.5GB VRAM.


----------



## NovaGOD

Quote:


> Originally Posted by *pindle*
> 
> I think I'm having the same problem you're having. It's been like this for a few weeks now and I've been pulling my hair out. I can clearly see polling dips every 1000ms, and can't for the love of God figure out what changed or caused it. I've revisited all BIOS settings, checked updates, tried different USB ports, different mice, disconnected most stuff (internal/external) that isn't mandatory... to no avail. I'm starting to think it's time for a full reinstall (since my W8.1 install is already 2-3 years old). *shiver* :'(
> 
> Edit: are you using an Nvidia card? There was a driver update some time ago; I haven't ruled that out yet.


Yes, Nvidia 970 in MSI mode using KBoost, and I didn't change the driver.

I stopped thinking about it, tbh. I changed nothing; no new updates/drivers/hardware configuration, everything is as before when I had good polling, so I can't do anything about it. I'm too lazy to reinstall, with all the effort I made optimizing the system.


----------



## pindle

Quote:


> Originally Posted by *NovaGOD*
> 
> Yes, Nvidia 970 in MSI mode using KBoost, and I didn't change the driver.
> 
> I stopped thinking about it, tbh. I changed nothing; no new updates/drivers/hardware configuration, everything is as before when I had good polling, so I can't do anything about it. I'm too lazy to reinstall, with all the effort I made optimizing the system.


I have a 970 too and installed new drivers a couple of weeks back; that could've been the issue (but you didn't install the new driver, I guess?). Not using KBoost, but I never had this kind of latency before, so I'm not expecting that to do anything (when I choose max performance for game profiles it doesn't change the clocks anyway, so why would I even need KBoost?).

My thinking exactly. Mouse performance is pretty good; it's not that my aim is totally off, it just keeps eating at me.







Probably will reinstall if I get a day off somewhere soon; otherwise it'll have to wait until my vacation (almost 6 weeks to go...).


----------



## James N

Quote:


> Originally Posted by *Melan*
> 
> I've been using MSI mode on my GPU for quite some time, on systems from W7 up to W10, without any problems. At least on the Z77 platform.


Same, no issues whatsoever.


----------



## NovaGOD

Quote:


> Originally Posted by *pindle*
> 
> I have a 970 too and installed new drivers a couple of weeks back; that could've been the issue (but you didn't install the new driver, I guess?). Not using KBoost, but I never had this kind of latency before, so I'm not expecting that to do anything (when I choose max performance for game profiles it doesn't change the clocks anyway, so why would I even need KBoost?).
> 
> My thinking exactly. Mouse performance is pretty good; it's not that my aim is totally off, it just keeps eating at me.
> 
> 
> 
> 
> 
> 
> 
> Probably will reinstall if I get a day off somewhere soon, else it'll have to wait until my vacation (almost 6 weeks to go...).


I'm still using 381.78; I try not to change drivers often, as I always use DDU and I lose all my control panel settings etc. I'll try the latest driver out of curiosity and see if it fixes anything.

For me, MSI mode and KBoost had the biggest impact while testing polling rate. KBoost means your GPU operates at its maximum clock value, the factory "boost" clock speed. In my case it was 1405MHz, versus the base clock, which is around 1180MHz if I used "prefer maximum performance", if I recall correctly.

You can give it a try; maybe it will help you.


----------



## x7007

Quote:


> Originally Posted by *x7007*
> 
> I don't understand why MSI mode is auto-enabled for laptop GPUs but not for desktop cards.
> 
> OMG... connecting a mouse (to be accurate, a G502) and a Corsair K70 Red Rapid keyboard to USB 3.0 was causing random stuttering and FPS drops. In the game BladeStorm Nightmare I had drops to 45-55 while the GPU was sitting at barely 45% and the CPU at barely 25%; nothing happens when the FPS drops, it just happens. After I changed the mouse to USB 2.0 the drops almost stopped entirely, though there was still a random drop to 58-59. After I changed the keyboard to USB 2.0 as well, with MSI mode on the graphics card and both connected to USB 2.0 on IRQ 16, the FPS drops stopped and it's 60 FPS all the time.


Did this happen to no one else? I think we should all post our motherboard and which USB port we are connecting to, with a picture. There is a lot to it; one USB 2.0 port can behave differently from another.

Still, no one has answered what we are going to do when we have a motherboard with only USB 3.1 and no USB 2.0. All the new motherboards are like that, no? For now MSI mode on the Nvidia card works fine. I don't know why Nvidia only enables it officially on laptops while AMD/ATI do it on both desktop and laptop; I can't understand this crap.

But when the mouse / keyboard / GSX 1000 USB sound card are all connected to USB 2.0, I get jumps in LatencyMon on the USB .sys file. It doesn't seem like much of an issue, but it is a bit annoying. It doesn't affect games or sound, but a sound card like the Asus Strix RAID DLX shows up behind an ASMedia USB 3 chip. It's all bloody fishy! Why does one thing work while another doesn't, why isn't it automatic on one when it is on the other, and why does USB 3.0 cause issues with mice / keyboards / USB sound cards while a PCIe sound card works just fine??

We need to investigate this. Who knows what BIOS configuration or Windows drivers/software can cause other issues... it's like problems within problems; there is no way out of this thing.


----------



## pindle

Quote:


> Originally Posted by *NovaGOD*
> 
> I'm still using 381.78, i try not to change drivers often as i always use DDU and i lose all my CP settings etc. I'll try the latest driver out of curiosity and see if it fixes anything.
> 
> For me msi mode and kboost had the biggest impact while testing for polling rate, kboost means your gpu operates at it's maximum clock value, the factory "boost" clock speed. In my case it was 1405mhz vs base clock which is around 1180 if i used "prefer maximum performance" if i recall correctly.
> 
> You can give it a try maybe it will help you.


Will try it when I have time... probably not before the weekend :'(
Thanks!
Quote:


> Originally Posted by *x7007*
> 
> Didn't this happen to no one ? I think we should say what is our motherboard and say which usb we are connecting to with a picture. there is too much to it, usb 2.0 from one port can be different from the other or such.
> 
> Still, no one answered what are we going to do when we have a motherboard with full USB 3.1 only ? and no usb 2.0 . all the new motherboards are like that no ? for now MSI-mode on the nvidia works fine. I don't know why they officially do that while AMD/ATI do it on desktop and laptop and Nvidia on Laptop.. can't understand this craoop.
> 
> But when the mouse/keyboard/GSX1000 USB sound card are all connected to USB 2.0 I have jumps on Latencymon with the USB .sys it doesn't seem much of an issue, but it's a bit annoying that this happens. anyhow it doesn't affect in game or sound . but overall SoundCard like Asus Strix RAID DLX is working as Asmedia USB 3 chip . It's ALL BLOODY fishy ! , why one thing can't work while the other can, why it doesn't do this automatically while the other do , why USB 3.0 causing issues with Mouse/Keyboard/ USB SoundCards while PCI-E sound card works just fine ??
> 
> We need to investigate this. who knows what Bios configuration or windows drivers/software can cause other issues........ it's like problems within problems.. there is no out from this thing.


Nope, no specific USB3 related issues here. My polling issues are systemwide, USB 2 or 3 doesn't influence that.


----------



## x7007

Quote:


> Originally Posted by *pindle*
> 
> Will try it when I have time... probably not before the weekend :'(
> Thanks!
> Nope, no specific USB3 related issues here. My polling issues are systemwide, USB 2 or 3 doesn't influence that.


But you didn't say what motherboard you have. Are you connected to USB 2.0 or 3.0? Did you check LatencyMon for more than 1-2 hours?


----------



## pindle

Quote:


> Originally Posted by *x7007*
> 
> but you don't say what motherboard you have . are you connected USB 2.0 or 3.0 ? did you check latencymon for more than 1-2 hours ?


I have an H97 chipset (Asus H97M Plus) and have had my mouse connected to a USB 3 port on and off with no noticeable difference. I'm using Win8.1 and am fairly sure LatencyMon doesn't work properly on it (at least, last time I tested), but as said I've felt zero difference, and when I checked some time ago with another latency tool it showed no issues either, though I haven't monitored it for hours. Anyway, not saying this will be the same for everyone, just reporting in: for me, on this computer, USB 2 or 3 is all the same.
Quote:


> Originally Posted by *NovaGOD*
> 
> I'm still using 381.78, i try not to change drivers often as i always use DDU and i lose all my CP settings etc. I'll try the latest driver out of curiosity and see if it fixes anything.
> 
> For me msi mode and kboost had the biggest impact while testing for polling rate, kboost means your gpu operates at it's maximum clock value, the factory "boost" clock speed. In my case it was 1405mhz vs base clock which is around 1180 if i used "prefer maximum performance" if i recall correctly.
> 
> You can give it a try maybe it will help you.


Checked MSI mode; it's turned on now but it doesn't really seem to make a difference, still stuck with these minor latency hiccups. Checked for KBoost but it seems I need to install yet another suite besides Afterburner (that Precision stuff), not really interested in that atm :/ Is there a place I can download that skin for Afterburner? Googled it but couldn't find any links (guess I'd have to download the suite and extract it... make an account first... blah, the hassle)


----------



## Oh wow Secret Cow

What are some things people have done to get their regular boot results in line with their safe mode results?


----------



## x7007

Quote:


> Originally Posted by *pindle*
> 
> I have a H97 chipset (Asus H97M plus) and have had my mouse connected on a USB3 port on and off with no noticeable difference. I'm using Win8.1 and am fairly sure latencymon doesn't work properly on that (at least, last time I tested it) but as said have felt zero difference and actually checked some time ago with another latency tool and it showed no issues either, though I haven't monitored it for hours. Anyways not saying this will be the same for everyone, just reporting in, for me on this comp using USB 2 or 3 is all the same.
> Checked MSI mode, it's turned on now but it doesn't really seem to make a difference, still stuck with these minor latency hickups. Checked for KBoost but seems I need to install yet another suite besides Afterburner (that Precision stuff) not really interested in that atm :/ Is there a place I can download that skin for Afterburner? Googled it but couldn't find any links (guess I'd have to dl the suite and extract it... make an account first... blah the hassle)


Again mate, you're missing information. Which USB 3 controller did you connect to: ASMedia, Intel, Etron? Were the vendor drivers installed or the Microsoft ones? LatencyMon works; it's DPC Latency Checker that doesn't work on Windows 8 and above.


----------



## pindle

Quote:


> Originally Posted by *x7007*
> 
> again mate you missing information. which usb 3 did you connect. asmedia, intel, etron? were the drivers installed or microsoft ones? latencymon works. dpc latency is not working windows 8 abovr.


I've only got Intel USB controllers (at least I don't see any others popping up in Device Manager). I think I never installed MS drivers for the chipset, but since this was 2-3 years back I'm not 100% sure. Looking at the driver makes me suspect it's the default one (it originates from 2006, so...). Tried LatencyMon and it seems to be working all right indeed. Not sure what you need to know, attached a screen of it:


This is just with some apps and browsers open, and a Twitch stream but no gaming. Not even sure if these are good values (I do have some polling issues as mentioned before, but not USB2/3 related). Can't do more atm but if you want me to, I can test something else for you later. Just let me know what you'd like to know


----------



## x7007

Quote:


> Originally Posted by *pindle*
> 
> I've only got Intel USB controllers (at least don't see any others popping up in device manager). I think I never installed MS drivers for the chipset, but since this was 2-3 years back I'm not 100%. Looking at the driver makes me suspect this is the default one (originates 2006 so...). Tried LatencyMon and it seems to be working allright indeed. Not sure what you need to know, attached a screen of it:
> 
> 
> This is just with some apps and browsers open, and a Twitch stream but no gaming. Not even sure if these are good values (I do have some polling issues as mentioned before, but not USB2/3 related). Can't do more atm but if you want me to, I can test something else for you later. Just let me know what you'd like to know


Can you show me how high your USBPORT.SYS gets after an hour or so? Do you have any USB sound card / DAC?

Does anyone with a USB 2.0 connected keyboard and mouse at 1000Hz polling have very high DPC latency on USBPORT.SYS? I also have the GSX 1000 on USB, so that is affected too.


----------



## NovaGOD

My keyboard is using a USB to PS/2 adapter, but I have a 1kHz mouse connected.

Switch to Intel xHCI in your BIOS and repeat the test; I had lower USBPORT.SYS latency with the Intel driver so I decided to keep it. Not a big difference in my case though.
Quote:


> Originally Posted by *pindle*
> 
> Checked MSI mode, it's turned on now but it doesn't really seem to make a difference, still stuck with these minor latency hickups. *Checked for KBoost but seems I need to install yet another suite besides Afterburner (that Precision stuff) not really interested in that atm :/ Is there a place I can download that skin for Afterburner?* Googled it but couldn't find any links (guess I'd have to dl the suite and extract it... make an account first... blah the hassle)


If you want my opinion, I would remove Afterburner temporarily (it caused me some problems in the past when I used it alongside Precision X) unless you absolutely need it, and install Precision X 5.3.11 (I think that's the correct version but I'm not at home to verify it; google it for a download link). That version lets you enable kboost for any Nvidia GPU, not only EVGA's lineup as in the latest versions. Try it, test polling/latency or whatever you want, and if it does nothing switch back to Afterburner. YMMV in these situations.


----------



## pindle

Quote:


> Originally Posted by *x7007*
> 
> If you can show me how much your USBPORT.SYS reaching after 1 hr or something, do you have any USB Sound Card DAC ?
> 
> Does anyone with USB 2.0 connected keyboard and mouse at 1000 Hz usb polling rate , have Very High DPC Latency on USBPORT.SYS ? I also have the GSX 1000 on the USB so this have affected too.


Sure, can run some tests. 1 hour of idle time or does it need to be stressed? Not using a USB DAC atm.
My KB is USB 2.0, mouse 1kHz, and I'm not using a GSX 1000 (not sure what that is).
Quote:


> Originally Posted by *NovaGOD*
> 
> If you want my opinion i would remove afterburner temporarily (caused me some problems in the past when i used it with precision x) unless you absolutely need it and install precision x 5.3.11 (i think that's the correct version but i'm not at home to verify it, google it for download link), this will allow you to enable kboost for any nvidia gpu not only evga's lineup as in the latest versions. Try it, test polling/latency w/e you want and if it does nothing switch to afterburner again, ymmv in these situations.


If it replaces all Afterburner functionality then I'm fine with it of course. It's just annoying because I set FPS limiters in Afterburner that I'll have to redo, but I'll give that software a try.


----------



## NovaGOD

Quote:


> Originally Posted by *pindle*
> 
> If it replaces all Afterburner functionality then I'm fine with it ofcourse. Just annoying cause I set FPS limiters in Afterburner that I will have to redo, but will give that software a try.


I'm not sure about the functionality, i only use the program for kboost tbh.


----------



## x7007

Quote:


> Originally Posted by *pindle*
> 
> Sure can run some tests, 1 hour of idle time or does it need to be stressed? Not using a USB DAC atm.
> My KB is USB 2.0, mouse 1KHz, am not using a GSX 1000 (not sure what that is).
> If it replaces all Afterburner functionality then I'm fine with it ofcourse. Just annoying cause I set FPS limiters in Afterburner that I will have to redo, but will give that software a try.


No need for it to be stressed, just make sure you play or use the keyboard + mouse a lot. Gaming or something.


----------



## pindle

Quote:


> Originally Posted by *x7007*
> 
> not need to be stressed , just make sure you play or use the keyboard + mouse a lot. gaming or something


OK, will run it while playing Overwatch or something, not sure when I have time but somewhere tonight or this weekend probably.
Quote:


> Originally Posted by *NovaGOD*
> 
> I'm not sure about the functionality, i only use the program for kboost tbh.


Lol OK, where do you set FPS limits then? I found Afterburner the most reliable place. Will test it out anyhow; I'll keep you posted!


----------



## NovaGOD

Quote:


> Originally Posted by *pindle*
> 
> Lol OK where do you set FPS limits then? Found Afterburner the most reliable place. Will test it out anyhow, keep you posted!


I don't use fps limiters at all. I prefer an unlocked frame rate, but I think Precision has a "frame rate target" option or something like that.

Kboost is not essential and it might do nothing for your system; it improved my polling rate but, as I said, YMMV.


----------



## x7007

Quote:


> Originally Posted by *NovaGOD*
> 
> My keyboard is using a usb to ps/2 adapter but i have a 1khz mouse connected.
> 
> Switch to Intel xHCI from your bios and repeat the test, i had lower USBPORT.SYS latency with the Intel driver so i decided to keep it. Not a big difference in my case thought.
> If you want my opinion i would remove afterburner temporarily (caused me some problems in the past when i used it with precision x) unless you absolutely need it and install precision x 5.3.11 (i think that's the correct version but i'm not at home to verify it, google it for download link), this will allow you to enable kboost for any nvidia gpu not only evga's lineup as in the latest versions. Try it, test polling/latency w/e you want and if it does nothing switch to afterburner again, ymmv in these situations.


You mean connect to the Intel USB 3.0 but disable xHCI? Then my external USB hard disk will be slow... and the Intel USB 3.0 controller is the only one that uses UASP for external hard disks and USB 3.0 devices.

Anyhow, it's already connected to Intel USB 2.0. The GSX 1000 sound card DAC is also connected to USB 2.0; they both take a lot of processing from the USB 2.0 port.


----------



## NovaGOD

Quote:


> Originally Posted by *x7007*
> 
> You mean connect to the Intel USB 3.0 but to disable the XHCI ? then my External USB Harddisk will be slow ... and the Intel USB 3.0 is the only one which uses UASP for external harddisk and USB's 3.0
> 
> Anyhow it's already connected to Intel USB 2.0 , Also the GSX 1000 Sound Card DAC is also connected to USB 2.0 , they both takes a lot of processing from the USB 2.0 port.


No, enable Intel xHCI through the BIOS, run the test, disable xHCI again through the BIOS, run the test again and compare results. Every motherboard is different though; mine only has Intel ports, so make sure you are using an Intel one and disable the ones you don't use via Device Manager. Also, you have a lot of devices connected; maybe for testing purposes disconnect them and use only the mouse.

That's what I did and got different results in LatencyMon (lower USBPORT.SYS latency with xHCI enabled), though I'm not sure if it's the best testing method or whether the results are meaningful.


----------



## pindle

Quote:


> Originally Posted by *x7007*
> 
> not need to be stressed , just make sure you play or use the keyboard + mouse a lot. gaming or something


Here's my result after running it for 2 hours, of which ~1.5 hours were gaming I think. During the full time I had a couple of Chrome tabs open, one of which was a Twitch stream on my other monitor. I have my keyboard connected on USB (2.0) and, specially for you, connected my mouse to USB 3.0.

Not really sure what's normal exactly; what are you looking at specifically, the DPC count? Note though that I'm not using any USB DAC or anything, just my onboard sound card.


----------



## x7007

Quote:


> Originally Posted by *pindle*
> 
> Here's my result after running it for 2 hours, of which ~1.5 hours of gaming I think. During the full time I had a couple of Chrome tabs open, one of which was a Twitch stream on my other monitor. I have my keyboard connected on USB (2.0) and specially for you connected my mouse to USB3.0
> 
> Not really sure what's normal exactly, what are you looking at specifically, DPC count? Note though that I'm not using any USB DAC or something, just my onboard soundcard.


I wanted you to use USB 2.0 for the mouse.


----------



## x7007

Do any of you have USB Selective Suspend enabled or disabled?

Because that seems to be my issue..

With it disabled I don't have jumps anymore when a sound starts over USB 2.0. I think it's something with power saving on the USB: the device shuts down after 1-2 seconds and when it's used again it has to wake back up, causing a big spike.

Now USBPORT.SYS is not going crazy. Pfff, I will just try to disable all the power saving on the controller it's connected to. I don't want all my USB devices to keep running though; external USB drives that never turn off heat up, and can even die from the heat here: more than 38C ambient in my room and the hard disk can reach more than 54C, and that's idle, not copying anything. Even my keyboard is working properly now.

When connecting the mouse / GSX USB sound card to the front panel (IRQ 23), I get the big jumps in USBPORT.SYS in LatencyMon no matter what.

When connecting to the back panel (IRQ 16), shared by the GPU / 1000Hz keyboard / GSX USB sound card, I don't have any jumps in the .sys file either, just a random high interrupt-to-process latency that only happens once when sound starts playing.

*But when connecting to USB 3.0 there are no big jumps or anything else; it just works fine. I'm not sure though if I should connect the USB sound card, which only supports 2.0, to a USB 3 port. What can I do??
A mouse is not good to connect to a USB 3.0 port, but what about keyboards and USB DAC sound cards?*

Which port should I connect the Xbox Wireless Adapter for my Xbox Elite controller to?


----------



## James N

Yes, USB Selective Suspend should be disabled.
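For anyone who prefers flipping this from a script rather than the Power Options UI, here is a minimal sketch of the `powercfg` invocations involved. The two GUIDs are the standard Windows identifiers for the USB settings subgroup and the selective suspend setting, but verify them on your own machine with `powercfg /query` before relying on them:

```python
# Sketch: build the powercfg commands that disable USB selective suspend
# on the active power scheme (Windows only; run from an elevated prompt).

# Standard Windows power-setting GUIDs; confirm with `powercfg /query`:
USB_SETTINGS_SUBGROUP = "2a737441-1930-4402-8d77-b2bebba308a3"
SELECTIVE_SUSPEND_SETTING = "48e6b7a6-50f5-4782-a5d4-53bb8f07e226"

def selective_suspend_commands(disable=True):
    """Command lines setting selective suspend for AC and DC power.
    Index 0 = Disabled, 1 = Enabled."""
    value = "0" if disable else "1"
    cmds = [["powercfg", "/" + verb, "SCHEME_CURRENT",
             USB_SETTINGS_SUBGROUP, SELECTIVE_SUSPEND_SETTING, value]
            for verb in ("setacvalueindex", "setdcvalueindex")]
    cmds.append(["powercfg", "/setactive", "SCHEME_CURRENT"])  # apply changes
    return cmds

if __name__ == "__main__":
    for cmd in selective_suspend_commands():
        print(" ".join(cmd))
        # import subprocess; subprocess.run(cmd, check=True)  # on Windows
```

Running the printed commands in an admin prompt has the same effect as setting "USB selective suspend setting" to Disabled in the advanced power plan options.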


----------



## x7007

Quote:


> Originally Posted by *James N*
> 
> Yes, USB Selective Suspend should be disabled.


But what about an external hard disk? How can I still have it turn off? With APM or something? Or won't that work because it depends on USB power management?


----------



## James N

Quote:


> Originally Posted by *x7007*
> 
> But what about External Hard Disk ? how can I still have it turn off ? APM and such ? or it won't work because it depends on the USB Power management ?


Of course; don't disable features that you need. I personally just unplug my external hard drive after I've done whatever I had to do.


----------



## pindle

Quote:


> Originally Posted by *x7007*
> 
> I wanted you to use usb 2.0 for the mouse.


Ran the test again; of the total time ~1 hour was gaming, same settings as before but on USB 2.0. I also have a 2nd mouse connected btw, but I always make sure its sensor is on a surface it can scan, so it shouldn't be much of an influence I hope.

Yes, I have the high performance profile on with selective suspend disabled.

(I do the same as James above me: connect them only when I need them. Never really bothered fiddling with power options for them.)


----------



## x7007

Quote:


> Originally Posted by *pindle*
> 
> Ran test again, of the total time it was ~1 hour of gaming, same settings as before but on USB2.0. Also have a 2nd mouse connected btw, but I always make sure its sensor is on a surface it can scan, shouldn't be that much of an influence I hope.
> 
> 
> Yes I have high performance profile on with selective suspend disabled.


So yeah, USBPORT is taking a lot. For me, I need to connect the GSX 1000 sound card to USB 3 or else it screws up my latency completely, with random FPS drops and some stuttering. A mouse is OK on USB 2 and so is a keyboard, but not USB sound cards. Maybe newer high-end motherboards won't have this issue, like the Gigabyte or MSI ones; I would like to see the AMD ones.

So the external hard disk will never shut down if USB Selective Suspend is disabled?

Is there no way to disable USB power management for just a single device? It seems to have no effect when you do it manually from Device Manager, so why wouldn't we do that instead?


----------



## pindle

Yeah, I don't have the USB sound card problem, so I guess that makes a ton of difference. Not sure on your HD question, I rarely use them nowadays.


----------



## cimi

Hi guys,

first, sorry for my bad English. I have a *BIG* problem with my mouse. I mostly play CS:GO. A year ago I had an AMD setup, then I changed to Intel, and after the switch the problem started. My mouse feels like it has acceleration. I used to run 6/11, in-game sens 2.12, 400 DPI, m_yaw 0.022. Now I have to use 6/11, in-game 1, 400 DPI and m_yaw 0.12 just to play normally. If I put the old settings back my mouse flies all over the screen, I can't even aim.

I tried everything from this forum: 5-6 BIOS versions, changed every setting in the BIOS, tried Win7/8/10, tried 2 mice, 2 CPUs, 2 motherboards, 2 GPUs, 2 RAM kits, other cables, other drivers, other Nvidia settings. Mouse acceleration is off in Win10.

The problem happens in every game I try, Steam or not. I posted this topic on 10+ forums with no success. I've spent a lot of time searching Google and a lot of money buying new PC parts.

Please help if you can. If someone can spare some time to help me, I can give you all the info and screenshots. *If someone can help me I will pay him (just send me your PayPal account), I will donate you the money.*

This problem has possessed me for the last year; I've spent a ton of my time and big money buying every PC part twice.

I tried PC repair services in my country; no one wants to spend time finding the problem. If the mouse works, they don't see a problem. Nobody wants to test in-game; for a repair service, if it works on the desktop then it's fine :thumb:

If you can help me it would mean a lot to me.

6700k 4.7GHZ 1.360V+Nepton 240m
Gigabyte GA-Z170X-Gaming 7
CORSAIR 16GB Vengeance LPX DDR4 3000MHz
Palit GTX1060 6 GB dual
Benq XL2411Z
CRUCIAL 256GB SSD MX100...WD 2TB WD20EFRX...320GB WD3200AAJS
CaseEdit Value
FRACTAL DESIGN Define R4,EVGA G2 750W
On-board / GENIUS Cavimanus HS-G700V
Win 10 x64 Pro
ZOWIE by BenQ FK2.
Genius GX Manticore..

Ty in advance guys.



Here are my results...


----------



## bleabbraxhm

Quote:


> Originally Posted by *cimi*
> 
> Hi guys,


What is your mousepad? If you are using something new it may have oil buildup or humidity problems. If that's the problem, cleaning it weekly and putting it in a clothes dryer for 5-10 minutes to remove humidity before you play may help. Your mousepad may also have had a surface treatment that wore off over time.

Also consider purchasing a USB 2.0 PCIe add-in card.


----------



## pindle

Quote:


> Originally Posted by *cimi*
> 
> Hi guys,
> 
> Here are my results...


USB selective suspend enabled? Did you disable Hyper-Threading and C-states, and did you do all those other BIOS tweaks mentioned by r0ach etc.?

Btw, the result pic looks weird to me. It's been a month since I used MouseTester, but your measurements on the x-axis run to 12.5k iterations while your cutoff point is set to 6.3k? It should "zoom" the graph so the end point of the x-axis becomes your manually set endpoint, at least iirc.


----------



## cimi

@bleabbraxhm I have a QcK+ Fnatic and a QcK Heavy. I clean my PC every week. The problem started when I changed CPU/MB/RAM from AMD to Intel; then I bought a CPU/MB/RAM again and the problem is still here.

I will buy a PCIe USB 2.0 card to try that.

@pindle USB selective suspend is disabled (but the results are the same with it enabled). All other power-saving options in the BIOS are disabled. No matter what I do in the system or BIOS, the mouse feels floaty, like it's on ice instead of a mousepad. I had to cut my sensitivity in half just to be able to play.

I think maybe IRQ is the problem, but I don't know how to check that, nor how to assign each part to a different IRQ.


----------



## pindle

Quote:


> Originally Posted by *cimi*
> 
> @bleabbraxhm i have qck + fnatic and qck heavy..I clean my pc evary week..Problem started when i changed cpu/mb/ram from amd to intel..Then i buy again cpu/mb/ram problem still here.
> 
> I will buy pci/usb 2.0 card so to try that..
> 
> @pindle USB selective suspend is disable (but same results are with enable)..All other saving options in bios disabled..No meter what i do in sys or bios mouse feels floaty like its on ice instead mouse pad..I had to cut my sensitivity by half so i can even play.
> 
> I think maybe IRQ is problem,but i don`t know how to even check nor how to set eavy part to different irq..


Do you have an nvidia card? If so you can try this.


----------



## cimi

Yup, I have an Nvidia card. Thanks for the link, but I'm too much of a noob to do that alone; only if someone can spare the time to lead me step by step on TS or Discord, or if some member who knows how to do it connects via TeamViewer.
Here is my IRQ list. I don't know what else the problem could be, because I've tried everything and nothing changed the mouse feel by more than 5-10%.

If someone can fix this issue I will send you money through PayPal. I'm hopeless and don't know what else to try. If you want, add me on Steam and I'll show you what it looks like in-game when I put my settings back to the old values.


----------



## RealSteelH6

Check out my post:

http://www.overclock.net/t/1634253/dpc-quite-high-after-bios-update#post_26222412


----------



## softskiller

Quote:


> Originally Posted by *cimi*
> 
> Here is mine irq...


Why do you have that Synaptics SMBus driver?

Are you using a touchpad? Or some mouse software?

Too many of your relevant hardware devices are using IRQ 16. I've never seen that before.

I hope you did not use "PCI Lock" in msconfig, which would prevent Windows from assigning IRQs?

You can also type "msinfo32" in the Windows search box and click Hardware Resources -> Conflicts/Sharing


----------



## cimi

I don't know about the Synaptics SMBus driver. I just uninstalled it but it comes back, so I disabled it in Device Manager.
I don't use a touchpad, nor any mouse software (I have a Zowie FK2, no driver needed).

I would put every device on a different IRQ, but I'm afraid I'll make the situation even worse.

I also saw that many devices are using IRQ 16.


----------



## bigluchs

Hi cimi,

You've tried a lot, but have you done this:

Try to put as many PCI devices as possible into MSI mode for IRQ assignment. This uses a "new" method of assigning IRQ numbers; when it succeeds, those devices get a negative IRQ number in Device Manager.

If it works, this will remove a lot of the IRQ conflicts you have. But it depends on the individual device driver.

To do so, follow these instructions (or you might use the MSI mode utility you posted a screenshot of above, but I don't know that tool):

http://forums.guru3d.com/showthread.php?t=378044
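For the curious, the per-device switch those instructions edit is a registry DWORD called `MSISupported` (1 = MSI mode, 0 = line-based interrupt). A minimal sketch of where it lives and how you might read it; the device instance path in the example is hypothetical, and this only does anything on Windows:

```python
# Sketch: locate and read a device's MSISupported registry value.
# MSI mode lives under the device's "Device Parameters\Interrupt Management"
# key in HKEY_LOCAL_MACHINE.

MSI_SUBKEY = r"Device Parameters\Interrupt Management\MessageSignaledInterruptProperties"

def msi_key_path(device_instance_path):
    """Build the registry path (relative to HKLM) for a device's MSI key."""
    return r"SYSTEM\CurrentControlSet\Enum" + "\\" + device_instance_path + "\\" + MSI_SUBKEY

def read_msi_supported(device_instance_path):
    """Return True/False for the MSISupported DWORD, or None when not on
    Windows or when the key/value does not exist."""
    try:
        import winreg  # Windows-only stdlib module
    except ImportError:
        return None
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                            msi_key_path(device_instance_path)) as key:
            value, _ = winreg.QueryValueEx(key, "MSISupported")
            return bool(value)
    except OSError:
        return None

# Hypothetical device instance path (yours is shown in Device Manager ->
# device Properties -> Details -> "Device instance path"):
print(msi_key_path(r"PCI\VEN_8086&DEV_8C31\3&11583659&0&A0"))
```

Flipping the value (or creating it) and rebooting is exactly what the linked guru3d guide walks through; the MSI mode utility in the screenshot automates the same edit.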


----------



## cimi

Thank you man for the advice.

Here are my devices. What should I put in MSI mode?

I would give everything to fix this problem. I've easily spent more than a month in total (if I sum all the time) trying to resolve it.


----------



## bigluchs

Try the last 3 that sit on IRQ 16 (USB, Intel Management Engine, High Definition Audio).

You have to reboot to see the results.


----------



## x7007

Quote:


> Originally Posted by *cimi*
> 
> Thank you man for advice..
> 
> Here are my devices.What to put in msi mod?
> 
> 
> 
> I would give everythin to fix this problem...I spent 101% more then 1 month (if i sum all the time i spent) trying to resolve this problem.


Why do some of your devices have MSI off by default? They should be installed in MSI mode.


----------



## x58haze

Hello everyone, this is me, haze, or Yuriy, a guy from Venezuela. My English is not that good, but I was wondering if someone here can guide me a little with my PC :')
The thing is, I have a Logitech G15 keyboard whose F4 key seems to be damaged internally (not the plastic keycap): the key keeps pressing itself, even in the BIOS or while installing an OS, which is very annoying.

But I have found the problem. When Windows boots, in Device Manager under Human Interface Devices there is a device called Usb Input Device Port_#0004.HUB_#002. When I uninstall that device the F4 key stops pressing itself, but when the PC restarts Windows installs the hub back. So my question is: how can I permanently uninstall a USB input device that I don't need? Let me put the pictures here.

These are my PC specs:

OS: Windows 7 64 Ultimate (on a 30GB SSD partition just for games like CS 1.6 on Steam; that's why I disabled the pagefile etc. here)
CPU: Ryzen 1600, 3.7GHz OC, good voltages
RAM: 3200MHz 2x4GB Patriot, timings configured with Thaiphoon Burner's enhancer tool and set through the BIOS: 16-18-18-36 etc.
Mobo: ASRock AB350 Fatal1ty K4 Gaming
PSU: 2011 Thermaltake ToughPower 775W 80+ Bronze
GPU: EVGA GTX 1060 6GB
(I'm from Venezuela, yes, but I'm not rich. This was a one-in-a-million opportunity: I met a friend from the USA and worked for him as a freelancer playing games. I'm very proud of the work; it took many, many hours over 2 months to finally get this PC, and the EVGA card was a gift, free of charge. I just want to be humble and honest.)

Also, I was testing my old Logitech G9 mouse from around 2008 and it seems to perform better at 1000 DPI and 200Hz: in the MarkC Fix Mouse Movement Recorder it holds a steady 200 without dropping or going crazy, whereas at 500Hz it drops to 450-470 and at 333Hz to 320-335. I don't know, it just feels better at 1000 DPI, 200Hz and Windows sensitivity 5. If someone can suggest better settings I would appreciate it ♥

And finally, here are my latency results on Windows 7 64-bit, after following Haggard's guide: no CPU core parking, services optimized, no pagefile, all hidden and disconnected devices removed in Device Manager, Windows power settings configured properly, core parking disabled via regedit, and PCI components switched to MSI mode, except AHCI and the AMD drivers because that doesn't work on this Ryzen.

PS: I hope someone can help me. I'm facing something like visual lag; games don't feel like they should. Even CS 1.6 at 800x600 and 100fps on a 75Hz, 2ms monitor doesn't feel like 100fps at all, and I want a better mouse feeling. Thanks for reading.


----------



## Nawafwabs

When I move the mouse slowly it skips pixels a lot.

How do I fix that?

Fast movement:

Slow movement:


----------



## jayfkay

Quote:


> Originally Posted by *Nawafwabs*
> 
> When I move mouse slowly its skip pixel a lot
> 
> how I fix that ?
> 
> Fast Movement
> 
> 
> Slow Movement


That's normal, isn't it? It's sensor/firmware/DPI dependent.

Up your DPI or lower your polling rate, or swap the mouse.
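The low-speed behaviour falls out of simple arithmetic: the sensor reports whole counts, so at slow hand speeds many polls carry zero motion and the cursor advances in visible steps. A back-of-the-envelope sketch (the 400 DPI / 1000Hz figures are assumptions for illustration, not Nawafwabs' confirmed settings):

```python
def counts_per_report(dpi, polling_hz, speed_in_per_s):
    """Average sensor counts delivered per USB report at a given hand speed
    (speed in inches per second)."""
    return dpi * speed_in_per_s / polling_hz

def min_speed_for_motion_every_report(dpi, polling_hz):
    """Slowest speed (inches/s) at which every report still carries >= 1 count."""
    return polling_hz / dpi

# At an assumed 400 DPI / 1000 Hz, moving below 2.5 in/s averages less than
# one count per report, so many reports are empty and the cursor moves in
# discrete jumps: the "pixel skipping" seen at low speed.
print(counts_per_report(400, 1000, 1.0))              # 0.4 counts per report
print(min_speed_for_motion_every_report(400, 1000))   # 2.5 in/s
```

Raising DPI or lowering the polling rate both push that threshold down, which is why those are the usual suggestions.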


----------



## Nawafwabs

Quote:


> Originally Posted by *jayfkay*
> 
> That's normal, isn't it? It's sensor/firmware/DPI dependent.
> 
> Up your DPI or lower your polling rate - or swap mice.


I connect the mouse to USB 3.0,

and it feels a little better than USB 2.0.

My mouse is a Zowie FK1+.


----------



## vf-

I thought pixel skipping was an on-screen thing?


----------



## cdcd

https://www.reddit.com/r/Competitiveoverwatch/comments/6pe220/to_those_of_you_who_still_believe_in_the/


----------



## vf-

Haha!
Quote:


> Originally Posted by *Natethegrate1999*
> DAMNIT I spent like 4 hours changing sens in a bunch of games to combat pixel skipping just for you to tell me it was baloney.


----------



## Syntractrix

First time doing these kinds of tests on my mouse; I'll let you guys decide if this is good or not.
(These photos are probably taken incorrectly lol)




my dpc:


----------



## x58haze

I was wondering if someone here can guide me or teach me how to properly uninstall the HID devices, because every time I boot the PC they install themselves again.
Also, I have a problem with my Logitech G15 keyboard: it's like the F4 key is damaged. It keeps pressing itself - not the plastic F4 keycap; it's like the key is internally stuck pressed. Very annoying, and it happens even in the BIOS or before installing any OS.

I have notice that in
Quote:


> Originally Posted by *Syntractrix*
> 
> First time doing these kind of tests on my mouse, I'll let you guys decide if this is good or not
> (these photos are probably taken incorrectly lol)
> 
> 
> 
> 
> my dpc:


Lol nice latency and mouse graph, by the way what are your PC specs? and WIndows?


----------



## Syntractrix

Quote:


> Originally Posted by *x58haze*
> 
> I was wondering if someone here can guide me or teach me how to properly uninstall the HID devices, because every time I boot the PC they install themselves again.
> Also, I have a problem with my Logitech G15 keyboard: it's like the F4 key is damaged. It keeps pressing itself - not the plastic F4 keycap; it's like the key is internally stuck pressed. Very annoying, and it happens even in the BIOS or before installing any OS.
> 
> I have notice that in
> Lol nice latency and mouse graph, by the way what are your PC specs? and WIndows?




Specs:
gtx 1060
i5 6600K @ 4.6 GHz, 1.32 V - though I can get it up to 5.0 GHz, it's just not worth it, since the instability issues at 1.42 V etc. raise your DPC latency so much
evga g3 550
850evo
z170

edit: you don't "uninstall" HID drivers, you disable them lol


----------



## nidzakv

Hey guys, I need help. It's not the first time I've done all the tweaks from the first post, but now I have this problem.

When I check DPC latency I get:



As I remember, earlier it was ~30 µs.

It feels a bit laggy in-game. What did I do wrong? The mouse is a G Pro at 500 Hz, drivers installed but not running while taking the screenshot.

When I turn on TimerTool at 0.5 ms, it cuts the µs from 1000 to 500.

Thanks in advance.


----------



## EastCoast

Do you have an Nvidia card?


----------



## Melan

Don't use DPC latency checker on windows 8 or 10. It will report incorrect values.


----------



## Crymore13

Quote:


> Originally Posted by *nidzakv*
> 
> Hey guys, I need help. It's not the first time I've done all the tweaks from the first post, but now I have this problem.
> 
> When I check DPC latency I get:
> 
> 
> 
> As I remember, earlier it was ~30 µs.
> 
> It feels a bit laggy in-game. What did I do wrong? The mouse is a G Pro at 500 Hz, drivers installed but not running while taking the screenshot.
> 
> When I turn on TimerTool at 0.5 ms, it cuts the µs from 1000 to 500.
> 
> Thanks in advance.


DPC Latency Checker is incompatible with Windows versions newer than 7. Use LatencyMon.


----------



## nidzakv

Thanks everyone, LatencyMon shows everything is OK. It still fluctuates from 5-60 µs, but I cannot solve that.

Somebody asked: yes, I have an Nvidia card.

Sent from my LG-D802 using Tapatalk


----------



## x7007

Quote:


> Originally Posted by *nidzakv*
> 
> Hey guys, I need help. It's not the first time I've done all the tweaks from the first post, but now I have this problem.
> 
> When I check DPC latency I get:
> 
> 
> 
> As I remember, earlier it was ~30 µs.
> 
> It feels a bit laggy in-game. What did I do wrong? The mouse is a G Pro at 500 Hz, drivers installed but not running while taking the screenshot.
> 
> When I turn on TimerTool at 0.5 ms, it cuts the µs from 1000 to 500.
> 
> Thanks in advance.


Are you using Windows 7 or 10? You need LatencyMon to display the right information.


----------



## Avalar

Was anyone else able to get less than 30μs with all their usual programs running? I'm quite proud of it tbh.. ^-^


----------



## jayfkay

who cares about ur 30 nanoseconds when ur gold nova in csgo?


----------



## Lolcarrots

Quote:


> Originally Posted by *jayfkay*
> 
> who cares about ur 30 nanoseconds when ur gold nova in csgo?


----------



## Syntractrix

This is probably the lowest that you will ever get; I think there's a little bit of room left to improve even at these levels... lol


----------



## senileoldman

I'm using an old G400 with prediction at 125hz, because I'm too lazy to change it on Linux, and can rekt you all in any fps.


----------



## Nawafwabs

Quote:


> Originally Posted by *Syntractrix*
> 
> 
> This is probably the lowest that you will ever get; I think there's a little bit of room left to improve even at these levels... lol


How do you get that?


----------



## JackCY

Quote:


> Originally Posted by *senileoldman*
> 
> I'm using an old G400 with prediction at 125hz, because I'm too lazy to change it on Linux, and can rekt you all in any fps.


gold

At least that's how I read it the first time.


----------



## Syntractrix

Quote:


> Originally Posted by *Nawafwabs*
> 
> How do you get that?


You have to do insane amounts of tinkering to the point where you can reach a stable ~9 µs (or less) latency with Nvidia drivers enabled. After that, you disable the Nvidia drivers during the mouse benchmark and terminate explorer.exe from the task manager (although I'm not quite sure if it affects anything, I just do it for the sake of it). I also cut off a little bit of the graph, since I usually get a random spike of 1020.

Actually, now that I think of it, I could also have unplugged my keyboard during testing and completely disabled internet access by stopping the tcpip.sys driver to get even more stable results.


----------



## Th3Awak3n1ng

So what's the point then if you will get higher DPC latency while doing something anyway?


----------



## Axaion

There is no point in that case; it's basically an unusable PC unless it's run as a server, in which case he wouldn't use that specific OS anyway.


----------



## vf-

Quote:


> Originally Posted by *Th3Awak3n1ng*
> 
> So what's the point then if you will get higher DPC latency while doing something anyway?


Seems a daft test. The whole purpose of getting it stable is running it with your peripherals and software. Isolating them makes the test flawed.


----------



## x7007

Quote:


> Originally Posted by *Syntractrix*
> 
> You have to do insane amounts of tinkering to the point where you can reach a stable ~9 µs (or less) latency with Nvidia drivers enabled. After that, you disable the Nvidia drivers during the mouse benchmark and terminate explorer.exe from the task manager (although I'm not quite sure if it affects anything, I just do it for the sake of it). I also cut off a little bit of the graph, since I usually get a random spike of 1020.
> 
> Actually, now that I think of it, I could also have unplugged my keyboard during testing and completely disabled internet access by stopping the tcpip.sys driver to get even more stable results.


If you get the perfect number but have no internet or keyboard, how will you play games without explorer.exe or anything else you closed?


----------



## Syntractrix



Here is the test with all of the good stuff enabled


----------



## Axaion

Define 'all the good stuff'.

Also, lmao, 3.2 s and 1.3 s? You're trolling, right?


----------



## Nawafwabs

If anyone has a XIM4, can you try it on PC with a mouse

and test it with MouseTester?

I want to see what the results look like.


----------



## 477909

Quote:


> Originally Posted by *Syntractrix*
> 
> You have to do insane amounts of tinkering to the point where you can reach a stable ~9 µs (or less) latency with Nvidia drivers enabled. After that, you disable the Nvidia drivers during the mouse benchmark and terminate explorer.exe from the task manager (although I'm not quite sure if it affects anything, I just do it for the sake of it). I also cut off a little bit of the graph, since I usually get a random spike of 1020.
> 
> Actually, now that I think of it, I could also have unplugged my keyboard during testing and completely disabled internet access by stopping the tcpip.sys driver to get even more stable results.


How does it look without anything disabled?


----------



## jormakka

Hi, I've been wondering what is wrong with my computer. Basically it feels like I get slight input lag all the time, but after playing any game for 10-15 minutes my mouse/keyboard feels even more delayed.

I ran MouseTester and these are the results:

https://imgur.com/a/OYRtC

does this look right?





Latest Win10, G403 @ 1000hz, Gigabyte Z97X-SLI, 4690k, GTX 670


----------



## cdcd

Which mouse?


----------



## jormakka

Quote:


> Originally Posted by *cdcd*
> 
> Which mouse?


G403 @ 1000hz, gigabyte z97x-sli as mobo


----------



## cdcd

The plot does look bad. Does it get more stable @500Hz?


----------



## Straszy

OS
Windows 10 Professional 64bit
Motherboard
Asus Prime Z270-A
CPU
Intel Core i7-7700K
Memory
Corsair Vengeance LPX 16GB (2x8GB) DDR4 DRAM 3200MHz C16
SSD
Transcend SSD370 128GB
Video Card
Palit GeForce GTX 1060 Dual 6GB
Mouse
Logitech G Pro


----------



## Avalar

Quote:


> Originally Posted by *Straszy*
> 
> 
> 
> 
> 
> OS
> Windows 10 Professional 64bit
> Motherboard
> Asus Prime Z270-A
> CPU
> Intel Core i7-7700K
> Memory
> Corsair Vengeance LPX 16GB (2x8GB) DDR4 DRAM 3200MHz C16
> SSD
> Transcend SSD370 128GB
> Video Card
> Palit GeForce GTX 1060 Dual 6GB
> Mouse
> Logitech G Pro


You HAVE to tell me what steps you took to do this. I just started using my Windows 10 partition after sadly discovering that my new PC won't be supported on Win 7. I was going to start with the basic UEFI optimizations, then disable automatic driver updates and the driver updates included in Windows Update, so that when I uninstall NVIDIA's display drivers completely, they STAY uninstalled. Win 7 lets you do this without all the tinkering; just uninstall and it's gone. Am I missing anything?

Specs: https://pcpartpicker.com/list/fv4Y4C


----------



## jormakka

Quote:


> Originally Posted by *cdcd*
> 
> The plot does look bad. Does it get more stable @500Hz?


500hz = https://imgur.com/a/CuCCo


----------



## cdcd

Quote:


> Originally Posted by *jormakka*
> 
> 500hz = https://imgur.com/a/CuCCo


Looks better but still not good. Try doing the steps outlined in the OP first.


----------



## cdcd

Out of curiosity I just checked my polling rate (using an EVGA Torq X5 atm). Mousetester plot looks pretty bad @1000Hz. Mouserate checker and MouseMovementRecorder both show numbers that are spot on (<1% deviation with the odd 1050Hz sprinkled in) though. Which readings are more accurate?
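One reason the two kinds of tools can disagree (a sketch with made-up intervals, not a claim about any specific tool's internals): an averaged rate checker can read a clean 1000 Hz while a per-interval plot looks jittery, because averaging hides alternating short/long intervals.

```python
# Hypothetical report stream: intervals alternate between 0.5 ms and
# 1.5 ms instead of a steady 1 ms.
intervals_ms = [0.5, 1.5] * 500

# An averaging checker divides total time by report count...
avg_rate_hz = 1000.0 / (sum(intervals_ms) / len(intervals_ms))

# ...while a per-interval plot exposes what each bad interval implies.
worst_rate_hz = 1000.0 / max(intervals_ms)

print(avg_rate_hz)    # 1000.0 - looks spot on despite the jitter
print(worst_rate_hz)  # ~666.7 - the dips the plot reveals
```

So neither reading is "wrong"; they just summarize the same timestamps differently, and the plot carries more information.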


----------



## vf-

Quote:


> Originally Posted by *Straszy*
> 
> 
> 
> 
> 
> OS
> Windows 10 Professional 64bit
> Motherboard
> Asus Prime Z270-A
> CPU
> Intel Core i7-7700K
> Memory
> Corsair Vengeance LPX 16GB (2x8GB) DDR4 DRAM 3200MHz C16
> SSD
> Transcend SSD370 128GB
> Video Card
> Palit GeForce GTX 1060 Dual 6GB
> Mouse
> Logitech G Pro


22 seconds on Latency Mon? You're taking the biscuit. Do stuff for 30 minutes.


----------



## emka

Any ideas as to why my mouse is spiking to 1000 Hz? Is it just the reading that is wrong?

Z97, W10 FCU, XHCI off, USB 2.0 native ports; I've tried everything in the first post.

It happens with the G Pro, EC2-A, Sensei Raw... all of them.


----------



## Huzzaa

For the love of god, just read, people...

The dude with completely broken English - one might wonder why he gets no responses at all.

Same for the Venezuelan guy.

And no, not to diss you or anything, but after re-reading through the whole shebang, there's like 10 pages of clutter and c$ap that nobody cares about.

emka: your system is compensating for the previous report, which arrived at a 333 Hz interval - that's why it shows a 1 ms interval right after it. The reason it happens can be a dozen things, and I mean that literally.
But the fact is, the processor was doing something else at the moment the data was ready from the USB controller.

Try turning more stuff off, disabling unnecessary devices, etc. You should know the drill if you read the OP.
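The compensation effect described above can be sketched with made-up timestamps: when one ~3 ms (≈333 Hz) report is serviced late, the next one still arrives on its original schedule, so the tester sees one long interval followed by a short "catch-up" interval that reads as ~1000 Hz.

```python
# Hypothetical timestamps for a ~333 Hz mouse: the third report is
# serviced 2 ms late, but the fourth arrives on schedule.
timestamps_ms = [0.0, 3.0, 8.0, 9.0, 12.0]

# Interval between consecutive reports, and the rate each one implies.
intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
rates_hz = [round(1000.0 / i, 1) for i in intervals]

print(intervals)  # [3.0, 5.0, 1.0, 3.0] - one long gap, one short catch-up
print(rates_hz)   # [333.3, 200.0, 1000.0, 333.3] - the 1000 Hz "spike"
```

So a 1000 Hz spike on a 333/500 Hz mouse usually points at a delayed poll just before it, not at the mouse suddenly polling faster.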


----------



## KittyChampion

Any more suggestions for people who are going to follow the OP's services guide?


----------



## he4th

I'm getting really weird results with my overclocked 8000 Hz WMO 1.1a (black). Is my motherboard playing up, or is there a problem with the software in the latest build of Win 10?


----------



## Straszy

Quote:


> Originally Posted by *vf-*
> 
> 22 seconds on Latency Mon? You're taking the biscuit. Do stuff for 30 minutes.




15 minutes


----------



## vf-

Quote:


> Originally Posted by *Straszy*
> 
> 
> 
> 15 minutes


While opening tasks? Browsing, video viewing and all the normal stuff? Or just idling on the desktop?


----------



## pyrexshorts

Can anyone tell me what these spikes could be? They seem to happen every 500 ms. I didn't have this on Windows 10; I just installed Windows 7.



My LatencyMon also shows lots of hard page faults, but I'm not sure if that's relevant.


----------



## vf-

Quote:


> Originally Posted by *he4th*
> 
> 
> 
> I'm getting really weird results with my overclocked 8000hz wmo1.1a black mouse. is my motherboard playing up or is there a problem with the software atm in latest build of win 10.


I must be blind but where in MouseTester v1.2 do you select Hz? I don't have that option.


----------



## cdcd

Quote:


> Originally Posted by *vf-*
> 
> I must be blind but where in MouseTester v1.2 do you select Hz? I don't have that option.


You running the latest version (v.1.5.3 if I'm not mistaken)?


----------



## vf-

Quote:


> Originally Posted by *cdcd*
> 
> You running the latest version (v.1.5.3 if I'm not mistaken)?


Hmm, that would be why. No idea how I ended up downloading v1.2.


----------



## pyrexshorts

I noticed that if I connect my PC through a power bar, all of my spikes in MouseTester disappear, but my audio becomes a lot worse.

Can anyone explain this?


----------



## vf-

Quote:


> Originally Posted by *pyrexshorts*
> 
> I noticed that if I connect my pc through a power bar, all of my spikes in mousetester disappear but my audio becomes a lot worse.
> 
> Can anyone explain this?


The same effect as ferrite cores.


----------



## pyrexshorts

Quote:


> Originally Posted by *vf-*
> 
> The same effect as ferrite cores.


Can you elaborate? I am using a paracord cable for my mouse without a ferrite core. Also, the spikes happen every 500 ms, and only in Windows 7, if that helps.


----------



## silikone

Probably irrelevant to polling precision, but I've noticed that pretty much all low-speed mice specify a 10 ms polling interval in their endpoint descriptors, yet this becomes 8 ms in use, luckily enough. Is this a quirk of Windows or of the host controller itself?


----------



## SweetLow

silikone said:


> Probably irrelevant to polling precision, but I've noticed that pretty much all low-speed mice specify a 10 ms polling interval in their endpoint descriptors, yet this becomes 8 ms in use, luckily enough. Is this a quirk of Windows or of the host controller itself?


Hardware, probably. The cyclical frame buffer of the UHC holds 1024 frames (1 ms each), so if you want periodic polling of a device, the interval must be a power of 2, and Windows rounds down to the shorter period.
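That rounding rule can be sketched as follows (a simplification: low/full-speed endpoints are scheduled in 1 ms frames, and the host picks a power-of-two period no longer than the requested bInterval):

```python
def effective_interval_ms(b_interval: int) -> int:
    """Round a requested low/full-speed bInterval (in 1 ms frames)
    down to the nearest power of two, as frame-list scheduling needs."""
    period = 1
    while period * 2 <= b_interval:
        period *= 2
    return period

# A mouse endpoint asking for 10 ms ends up polled every 8 ms:
print(effective_interval_ms(10))  # 8
```

Which matches the 10 ms → 8 ms behavior silikone observed: 8 is the largest power of two that does not exceed 10.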


----------



## Nawafwabs

pyrexshorts said:


> I noticed that if I connect my pc through a power bar, all of my spikes in mousetester disappear but my audio becomes a lot worse.
> 
> Can anyone explain this?


power bar ?


----------



## Avalar

Nawafwabs said:


> power bar ?


Power strip.


----------



## 123atomsk

Hi all, I ran into an issue during measurements that I wanted to clarify:

During testing, my results vary widely depending on whether or not I open MouseTester_v1.5.3 in admin mode.

Admin mode enabled:


Opened regularly:

For reference, I'm using a G703 @ 800 DPI / 1000 Hz, Windows 10, mouse sens 6/11, 6700K.

Also, assuming admin mode is the correct measurement, do these results look alright?


----------



## Marctraider

My results with G305

System: LTSB 2016, dpc latency 5-10~us average
Minimalistic but gaming functional


----------



## MuntyYy

quack


----------



## Syntractrix

Forget about DPC latency, ISRs, mouse polling rate and so on. Just reinstall Windows and don't apply any optimizations you don't understand.


----------



## empl

Btw, for low latency:

Intel chipset driver / Intel Management Engine (or whatever it's called): it's recommended not to install it - though some users have had FPS drops without it; it depends, as with everything. It can cause input lag, and the same goes for the Intel SATA drivers. But if you need fast copying, you have to install them. USB drivers you should install from the motherboard website, or probably Intel's site; Windows has usually already installed the latest, but not necessarily. As usual, it is not recommended to use driver-updater programs or to download drivers from shady sites.
Set everything you can into MSI mode, except the GPU. For Nvidia it's said the driver programming specifies the MSI-X message count, but with MSI mode on I have lower latency, which is strange. The MSI=1 lines in nvdispi.inf are for old devices which would stop working otherwise.

I tested in Windows; I have a Razer DeathAdder Elite at 500 Hz, and I move the mouse in a straight line across the monitor. I got 80 µs under interval vs. time, and LatencyMon reports an execution time under USB of 70-100 µs.

I have optimized interrupts: everything in MSI-X mode except USB 2, which unfortunately doesn't work on my motherboard (Intel 7-series chipset).

Btw, LatencyMon support said that affinity settings for drivers can be ignored at the hardware level - if you wanted to assign one core only to USB, the hardware could ignore it. Also, today USB should support MSI, even USB 2.

About PriorityControl: it is supposedly an old residue from Windows NT and it's not clear it works, but I am sensitive to input lag and the mouse feels snappier. I have yet to test it with LatencyMon and MouseTester... There is also a DevicePriority DWORD value under HKLM\CurrentControlSet\Enum\<device>, next to the driver affinity settings in regedit, which should be the same thing as PriorityControl. My NIC set it to High after installing the latest drivers.

You can turn DWM off in Windows 10: https://www.reddit.com/r/Windows10/..._worth_disabling_dwm_desktop_composition_for/ 
But you may get a black screen in fullscreen mode, and the problem is that windowed mode may cause frame drops. Go into Win+R > gpedit.msc > Windows Components and disable everything under DWM; it reduces latency slightly.

In Windows 10, appearance settings may not save if you disable the Themes service; there is a way to set it in the registry too. Currently it is working for me, though.

Windows updates are also a problem; there is the WUMT wrapper script which disables auto-updates, but checking every update for issues manually takes a lot of time.

You can delete Flash Player from Windows 10; it seems to reduce input lag.

Deleting a scheduled task, even something like Google Chrome's, or changing any setting there, caused weird lag for me which disappeared only after a reinstall, so I am afraid to change scheduled tasks in the Microsoft Task Scheduler.

Btw, you should probably check the motherboard before buying a PC; even an expensive Asus motherboard suffered from high DPC latency despite every feature it has.

*EDIT:*
I tested this tweak again and take it back: it doesn't feel snappier. Though I had previously also changed the CMOS and GPU priority, which I read can help in one book, so it could be that; I still need to test the order of these. I put the IRQ number of my 1E2D USB EHCI controller under PriorityControl and tested it in LatencyMon (while maxing the polling rate); with this tweak I got much higher latency, ~108 µs, and the mouse feels ****ty, versus ~88 without.


----------



## x7007

empl said:


> I tested in Windows; I have a Razer DeathAdder Elite at 500 Hz, and I move the mouse in a straight line across the monitor. I got 80 µs under interval vs. time, and LatencyMon reports an execution time under USB of 70-100 µs.
> 
> I have optimized interrupts: everything in MSI-X mode except USB 2, which unfortunately doesn't work on my motherboard (Intel 7-series chipset).
> 
> Btw, LatencyMon support said that affinity settings for drivers can be ignored at the hardware level - if you wanted to assign one core only to USB, the hardware could ignore it. Also, today USB should support MSI, even USB 2.
> 
> About PriorityControl: it is supposedly an old residue from Windows NT and it's not clear it works, but I am sensitive to input lag and the mouse feels snappier. I have yet to test it with LatencyMon and MouseTester... There is also a DevicePriority DWORD value under HKLM\CurrentControlSet\Enum\<device>, next to the driver affinity settings in regedit, which should be the same thing as PriorityControl. My NIC set it to High after installing the latest drivers.
> 
> You can turn DWM off in Windows 10: https://www.reddit.com/r/Windows10/..._worth_disabling_dwm_desktop_composition_for/
> But you may get a black screen in fullscreen mode, and the problem is that windowed mode may cause frame drops. Go into Win+R > gpedit.msc > Windows Components and disable everything under DWM; it reduces latency slightly.
> 
> In Windows 10, appearance settings may not save if you disable the Themes service; there is a way to set it in the registry too. Currently it is working for me, though.
> 
> Windows updates are also a problem; there is the WUMT wrapper script which disables auto-updates, but checking every update for issues manually takes a lot of time.
> 
> You can delete Flash Player from Windows 10; it seems to reduce input lag.
> 
> Deleting a scheduled task, even something like Google Chrome's, or changing any setting there, caused weird lag for me which disappeared only after a reinstall, so I am afraid to change scheduled tasks in the Microsoft Task Scheduler.
> 
> Btw, you should probably check the motherboard before buying a PC; even an expensive Asus motherboard suffered from high DPC latency despite every feature it has.


Did you see the issue with Windows 1809 and above?

Is there a way we can fix this?







Because I can't manage to fix this; it repeats the same way with a 1000 Hz mouse. I tested on 2 computers, and it's the same on both. I also tested a generic Microsoft mouse at 125 Hz, and it is fine on both computers, getting a perfect 125 dots on the line with 1903.

Taken from this guide if you missed it

https://docs.google.com/document/d/1nrcQ2EU5512TpuspPF4u5PgZ43p7hoV1cYBMi2C3XSQ/edit#


----------



## Melan

Man, I swear these people will look for any sort of excuse these days.


----------



## 508859

x7007 said:


> Did you see the issue with Windows 1809 and above?
> 
> Is there a way we can fix this?
> 
> https://www.youtube.com/watch?v=EG4g9XlKw5w
> 
> 
> Because I can't manage to fix this; it repeats the same way with a 1000 Hz mouse. I tested on 2 computers, and it's the same on both. I also tested a generic Microsoft mouse at 125 Hz, and it is fine on both computers, getting a perfect 125 dots on the line with 1903.
> 
> Taken from this guide if you missed it
> 
> https://docs.google.com/document/d/1nrcQ2EU5512TpuspPF4u5PgZ43p7hoV1cYBMi2C3XSQ/edit#


80% of this is meaningless, the rest is debatable.


----------



## empl

x7007 said:


> Did you see the issue with Windows 1809 and above?
> 
> Is there a way we can fix this?
> 
> https://www.youtube.com/watch?v=EG4g9XlKw5w
> 
> 
> Because I can't manage to fix this, it seems repeatedly the same with 1000hz mouse, I tested on 2 computers, it's the same on both. I tested 125 Generic Microsoft mice and it is fine on both computers getting perfect 125 dots on the line with 1903
> 
> Taken from this guide if you missed it
> 
> https://docs.google.com/document/d/1nrcQ2EU5512TpuspPF4u5PgZ43p7hoV1cYBMi2C3XSQ/edit#


I didn't see the complete video yet... What do you mean by 125 dots? Yeah, I heard there was some bug increasing DPC latency on builds older than 1903. Updating to 1903 was supposed to fix it, but for some users it didn't! What issue do you even mean? There was something in Win 8 allowing only 250 or 125 Hz polling rate, but I haven't heard of it in Win 10.

About what he discusses in that video: you shouldn't force useplatformclock on, but let Windows decide. This is based on correspondence with an expert on HPET, which can be found at: https://www.tweakhound.com/2014/01/30/timer-tweaks-benchmarked/
But of course you can test it; maybe it could help some people. Wait - he talks about useplatformtick, not useplatformclock!!

Correct me if I am wrong, but:
Here is Microsoft's page on the counter, which as I understand it should only be a system counter, not the actual timer! https://docs.microsoft.com/en-us/windows/win32/perfctrs/performance-counters-portal
Basically, when the system needs to execute events at the right times with high precision, and the computer keeps time using ticks, this counter is used to time them out precisely. Even Microsoft says this option should only be used for debugging.

I didn't find a similar page on useplatformtick, but as I understand it, useplatformtick actually affects the timer resolution used by drivers/system/applications to update code to the CPU. I didn't find any page from MS or experts on it, only this command pasted on Reddit along with this video, which may or may not be good.

When I applied this tweak (useplatformtick), my mouse felt heavier but more precise, I think, and DPC latency increased by 90 µs (in LatencyMon) for the Nvidia driver while playing that video - but it could be a coincidence; I don't know if you would even get the same value just from idling in the system. I'm not sure how to tell if it helped for all drivers; it can be tricky to test, e.g. how do you tell if the difference is too small? Hmm, for USB it decreased from 88 to 78, and I usually have something around 80 across multiple tests... No wait, now it jumped to 200 for some reason; it may be a one-off. Maybe I should test total DPC latency for about a minute, but it's hard to keep a 500 Hz polling rate going that long - again, it is hard to test, and how am I supposed to stop after exactly one minute? And even with the NIC and audio disabled, I don't think the GPU shows the same values while idling.

I would really like people's input on this. With useplatformtick enabled, I have the feeling my mouse is much more accurate - I wouldn't say there's input lag, but it feels much heavier, and I wouldn't say it is something you can get used to. I will definitely test it on my new PC.

Btw, if you want to lower DPC latency and input lag overall, I can give you a lot of tips; I must have read like 10k tweaking forum posts and sites. No BS!!!


----------



## x7007

empl said:


> I didn't see the complete video yet... What do you mean by 125 dots? Yeah, I heard there was some bug increasing DPC latency on builds older than 1903. Updating to 1903 was supposed to fix it, but for some users it didn't!
> 
> About what he discusses in that video: you shouldn't force useplatformclock on, but let Windows decide. This is based on correspondence with an expert on HPET, which can be found at: https://www.tweakhound.com/2014/01/30/timer-tweaks-benchmarked/
> But of course you can test it; maybe it could help some people. Wait - he talks about useplatformtick, not useplatformclock!!
> 
> Correct me if I am wrong, but:
> Here is Microsoft's page on the counter, which as I understand it should only be a system counter, not the actual timer! https://docs.microsoft.com/en-us/windows/win32/perfctrs/performance-counters-portal
> Basically, when the system needs to execute events at the right times with high precision, and the computer keeps time using ticks, this counter is used to time them out precisely. Even Microsoft says this option should only be used for debugging.
> 
> I didn't find a similar page on useplatformtick, but as I understand it, useplatformtick actually affects the timer resolution used by drivers/system/applications to update code to the CPU. I didn't find any page from MS or experts on it, only this command pasted on Reddit along with this video, which may or may not be good.
> 
> When I applied this tweak (useplatformtick), my mouse felt heavier but more precise, I think, and DPC latency increased by 90 µs (in LatencyMon) for the Nvidia driver while playing that video - but it could be a coincidence; I don't know if you would even get the same value just from idling in the system. I'm not sure how to tell if it helped for all drivers; it can be tricky to test, e.g. how do you tell if the difference is too small? Hmm, for USB it decreased from 88 to 78, and I usually have something around 80 across multiple tests... No wait, now it jumped to 200 for some reason; it may be a one-off. Maybe I should test total DPC latency for about a minute, but it's hard to keep a 500 Hz polling rate going that long - again, it is hard to test, and how am I supposed to stop after exactly one minute? And even with the NIC and audio disabled, I don't think the GPU shows the same values while idling.
> 
> I would really like people's input on this. With useplatformtick enabled, I have the feeling my mouse is much more accurate - I wouldn't say there's input lag, but it feels much heavier, and I wouldn't say it is something you can get used to. I will definitely test it on my new PC.
> 
> Btw, if you want to lower DPC latency and input lag overall, I can give you a lot of tips; I must have read like 10k tweaking forum posts and sites. No BS!!!




For me, setting DisableDynamicTick makes the mouse feel super fast. Did you try both, or just useplatformtick?
Also, 2 of my friends said they feel the mouse is faster and they can control it better. Both on laptops.


I've found some more details for my issues.

What fixed my mouse issues is as follows, from most to least important. This will depend on your spec, I think - Intel vs. AMD; some things won't have an effect like on mine.

Disable Spectre protection using InSpectre.exe

Command Prompt:
bcdedit /set disabledynamictick yes
bcdedit /set useplatformtick yes

Put the GPU in MSI mode using MSI_util_v2.exe

Change the drivers' interrupt affinity off Core 0 - only GPU/USB - using interrupt_affinity_policy_tool.msi

DO NOT install Windows Sandbox


Obvious things to do:

Power management: High Performance or Ultimate


BIOS:

CPU phase power mode should be High Performance or Extreme
DRAM also Extreme
Legacy USB doesn't seem to have a big effect / still checking
Secure Boot does not have any effect; it also isn't really enabled if you do not enable it via gpedit.msc
https://www.tenforums.com/tutorials/...dows-10-a.html
HPET enabled in BIOS

Installing a monitor driver for my LG E6 OLED TV and locking/buffering the EDID using MonInfo. The EDID will try to communicate every second, sending HDCP information, and every time it fails for any reason you will lose signal, get bad precision, or just other stalls of information.


----------



## empl

I have intel..
I tried only platformtick and restarted pc, platformclock should be used only for debugging MS says and i tried that in past and it sucked. My cpu has spectre protection, but not a meltdown. Btw i didn't know about this util, even i know grc, thank you glad to see something new. I will test it again on my new pc. But for now, i agree mouse is more accurate, but sucks, it feels heavier or something. Even i don't have much dpc latency, it says my system can handle audio and video without dropouts, during load. 

About gpu, i talked to nvidia dev and he said: nvidia specifies msi-x support and number of supported interrupts (it is that number like 1, 2, 4, 8 or up to 2048) to allocate in their driver coding. And that setting in registry creates only for devices, which would not handle higher number. But it is strange, i definitely feel less input lag if i switch msi mode on. And msi util doesn't even show msi-x support under my gpu, only msi-x.

I talked to LatencyMon support and they said the interrupt affinity setting from the registry can be ignored at the driver/hardware level. But if it isn't, it could theoretically work, I guess; I just don't know how to check how many cores a driver uses. LatencyMon shows only the cores with the highest execution times.

Btw, how did you mean it exactly? Having GPU and USB on core 0 and the rest on other cores? I'm pretty sure both devices should use more cores, as they support MSI (USB only on newer-gen mobos). Also, if you played something like CS:GO on one core, if that's even possible nowadays, you could theoretically clear one core of interrupts if you have enough cores, like with AMD. But then you have fewer cores left for interrupts, and I don't know if there would be any benefit, especially in other games that need more cores.

Windows Sandbox: I don't even know what that is.

Power management on Ultimate with unparked cores, USB power saving disabled, and the important thing: the idle saver disabled so the CPU always works in the C0 state. Process Lasso should do that automatically if you designate a high-performance process, but then it doesn't switch back to Balanced after you quit the game.

Some of these I currently don't use or don't have; I have an old cheap mobo.

Legacy USB off lowers input lag, but makes the mouse crappy on some mobos.

I have hpet on too

Btw, google r0ach, or roach, on these forums for input lag tips.
And check the input lag tweaks I've found over the years, at least all I can remember. Steam has good input lag guides for CS:GO, which are usable everywhere, and sound production forums have interesting info too.

https://www.tenforums.com/gaming/11...y-comprehensive-list-will-blow-your-mind.html


----------



## x7007

empl said:


> ...



With the affinity program you need to deselect Core 0-1 and leave the others; this way you offload the GPU and USB processing to other cores. Every piece of junk and every driver runs on Core 0-1, so keeping GPU and USB processing off those cores makes it better.
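
The "deselect Core 0-1" step boils down to building a CPU bitmask (bit N set = core N allowed), which is what affinity tools write into the Affinity Policy registry value as a hex DWORD. A minimal sketch of that arithmetic (the helper name is my own, not part of any tool):

```python
def affinity_mask(cores):
    """Build a CPU affinity bitmask: bit N set means core N is allowed."""
    mask = 0
    for core in cores:
        mask |= 1 << core
    return mask

# Deselect cores 0-1 on an 8-core machine: allow only cores 2..7.
print(hex(affinity_mask(range(2, 8))))  # 0xfc
```

So "deselect Core 0-1" on an 8-core CPU corresponds to the mask 0xFC.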

EDIT: I read everything he wrote in this thread; it seems I have more things to test. But we are on the right track: once and for all, we know fixes for half of our issues. We are not lost anymore.


----------



## Nawafwabs

So many kids in this forum talk about how these tweaks are useless when they haven't even tried them; they just talk blah blah blah and do nothing with their lives.


----------



## 508859

Nawafwabs said:


> So many kids in this forum talk about how these tweaks are useless when they haven't even tried them; they just talk blah blah blah and do nothing with their lives.


Most of them are technically useless until someone (like you or the author) can prove that they impact anything. It also doesn't mean the skeptics didn't try them.


----------



## NDUS

I set aside a couple of hours with MouseTester and tried everything in the OP. There was no difference in the polling behavior of any mouse I tried. One thing I did notice, though: while in a game, polling consistency seems to be improved relative to just swiping on the desktop.
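
Polling consistency of the kind MouseTester graphs can also be summarized as a single number: the fraction of inter-report intervals that stay close to the nominal 1 ms. A small illustrative sketch (not MouseTester's actual code):

```python
def polling_consistency(timestamps_ms, nominal_ms=1.0, tolerance_ms=0.25):
    """Fraction of inter-report intervals within tolerance of the nominal interval."""
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    within = sum(1 for d in deltas if abs(d - nominal_ms) <= tolerance_ms)
    return within / len(deltas)

# Four clean 1 ms intervals plus one 2 ms hiccup -> 4 of 5 within tolerance.
print(polling_consistency([0.0, 1.0, 2.0, 3.0, 4.0, 6.0]))  # 0.8
```

Comparing this figure between a desktop swipe and an in-game capture would put a number on the "improved while in a game" impression.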


----------



## gunit2004

For me personally there are 2 things that make basically a night-and-day difference in aim feel... one I came to the conclusion of before I had even seen this thread months ago, and one after reading the IRQ priority stuff listed here.

1. Giving "Intel Management Engine Interface" an IRQ# entry in the Registry Editor made the mouse feel a lot more responsive. I don't know the science behind it... but it just does. When I remove the entry, the mouse feels okay but not as good as with the entry. It's super easy to A-B test in game because it seemingly doesn't even require a restart to take effect; I can feel the difference as soon as I delete the entry and re-enter it. I tried doing similar things to other listed devices as well, such as the GPU and USB controllers, but giving those entries actually made aim feel much worse to me. By default my computer already had all devices EXCEPT the GPU in MSI mode. When I put the GPU in MSI mode, the mouse felt like crap, so I keep my GPU as it was by default (line-based mode). The only device I felt benefited (and my oh my does it benefit a lot on my system) is the IMEI.

EDIT: forgot to mention, this is in conjunction with an older IMEI driver (version 11.7.0.1014, dated 4/4/2017), which feels A LOT better than the newer one that got installed when I updated to W10 1903 (version 1914.12.0.1256, dated 4/4/2019). In fact, I make sure most of my drivers are generic (SATA controller, power management, and so on). Updating to 1903 updated a lot of these drivers and I rolled them all back because aim felt weird after upgrading. Installing the newest Samsung NVMe drivers for my M.2 SSD seemed fine though (better than the old generic NVMe controller drivers).

2. Moving all my devices to a separate USB controller from the mouse. In my case I do have a 2nd controller (ASMedia) on my motherboard, but when I have it enabled it makes mouse aim feel like crap, so I keep it disabled in the BIOS. Same with the Realtek HD Audio controller... it makes mouse aim feel super sluggish, so I have that disabled as well. I disabled pretty much anything else I don't use, such as the Wifi/Bluetooth controller (which, funnily enough, runs on the Intel USB controller alongside the gaming mouse and other devices?!). I do use Bluetooth for a pair of headphones when watching movies in bed, so I use a USB Bluetooth dongle instead of the built-in Bluetooth controller on my motherboard. So the only thing plugged into the Intel USB controller is my gaming mouse, and everything else runs off a PCI-E expansion USB card (with a Renesas controller) with a powered 10-port generic Amazon hub. With everything running alongside the mouse on the same controller, the mouse feels floaty as hell in comparison. With everything on a separate controller, mouse response feels excellent.


----------



## NDUS

gunit2004 said:


> ...


Have you tested any of this, or is it all based on "feeling"? It's not as if it's difficult to test mouse behavior. MouseTester is free.


----------



## Melan

If Intel ME bothers you so much, you could just remove it from the BIOS. All you need is a second PC running Linux (a live CD will work), a flash programmer, and the me_cleaner script.


----------



## gunit2004

NDUS said:


> Have you tested any of this, or is it all based on "feeling"? It's not as if it is difficult to test mouse behavior. Mousetester is free.


I've tried MouseTester and all that stuff. It's never helped me in any way except for identifying super obvious problems with a mouse itself (for example, the Endgame Gear XM1 @1000Hz having constant drops down to 500Hz on certain firmwares).

My testing consists of actually playing games (mainly Overwatch) with aim-intensive heroes like McCree, and the game going from essentially unplayable (not being able to hit shots to save my life, around 30-35% accuracy, mostly body shots on big fat tanks) to aim feeling great in game and consistently getting good accuracy (50-60% accuracy, 30+ crit shots).


----------



## NDUS

gunit2004 said:


> I've tried MouseTester and all that stuff. It's never helped me in any way ...


But all of this is heavily susceptible to the placebo effect. If you're seeing no difference in objective measurements, 99.9% of the time this kind of thing is just placebo.


----------



## gunit2004

NDUS said:


> But all of this is heavily susceptible to the placebo effect. If you're seeing no difference in terms of objective measurements, 99.9% of the time this kind of thing is just placebo.


I'm just going to copy and paste my last post.

"My testing consists of actually playing games (mainly Overwatch) with aim-intensive heroes like McCree, and the game going from essentially unplayable (not being able to hit shots to save my life, around 30-35% accuracy, mostly body shots on big fat tanks) to aim feeling great in game and consistently getting good accuracy (50-60% accuracy, 30+ crit shots)."

I understand where you are coming from, but I don't need some dots on a graph to tell me the above. Actually PLAYING a game is the best test to me, and I will always do things that way.


----------



## NDUS

gunit2004 said:


> I understand where you are coming from, but I don't need some dots on a graph to tell me the above. Actually PLAYING a game is the best test to me and I will always do things that way.


What you're saying is just a video-game form of astrology. You have no evidence to support your belief besides a vague feeling you would like to believe. It would be trivially easy to test your theory, but you refuse (probably because on some level you realize it would turn out to be wrong?).

For the record: I also play Overwatch, and I also play aim-intensive heroes. I would generally consider myself sensitive to input delay and anomalous input/display behavior. I notice when my FPS drops from 240 to 220. I noticed when the strobing crosstalk area of my monitor was reset in a power loss.

I noticed *zero* change in mousefeel from anything posted in this thread, except that it worsened when I tried some of the Windows timer tweaks. I was also unable to measure any difference in polling at any stage of my tests, except for the aforementioned timer tweaks. So, both experimentally and informally, this thread appears to be techno-superstition.


----------



## Straszy

https://m.imgur.com/a/xo0QxSU

Mouse polling rate tests with different timers forced through bcdedit. Mobo: MSI MAG Z390 Tomahawk.


----------



## BroadPwns

Ugh, I'm getting tons of readings in the 2000-8000 Hz range, and I've filtered out 6 readings that were above 50000 Hz (each delta between individual reports converted to a frequency). How do I make this right?
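
The "delta converted to frequency" arithmetic is simply frequency = 1000 / delta_ms, so a pair of reports timestamped only 0.02 ms apart reads as 50,000 Hz even though the mouse never polled that fast. A sketch of the conversion plus a plausibility filter (my own illustration, not MouseTester code):

```python
def deltas_to_hz(timestamps_ms):
    """Convert inter-report deltas (in ms) to instantaneous frequencies (in Hz)."""
    return [1000.0 / (b - a) for a, b in zip(timestamps_ms, timestamps_ms[1:])]

# A 1 ms delta reads as 1000 Hz; a 0.02 ms timestamp glitch reads as 50000 Hz.
freqs = deltas_to_hz([0.00, 1.00, 1.02, 2.02])
print([round(f) for f in freqs])  # [1000, 50000, 1000]

# Dropping physically implausible readings leaves the real polling rate.
print([round(f) for f in freqs if f <= 8000])  # [1000, 1000]
```

The takeaway is that absurd high readings usually point at timestamping resolution, not at the mouse itself.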


----------



## Straszy

Check it in safe mode. If it's still not good, check the hard drive with HD Tune. If that's fine, do a clean install of the system.


----------



## 508859

gunit2004 said:


> ...


1) Did you try to just delete the Intel ME drivers and disable the device in Device Manager?

2) Another PCI-E controller should normally introduce additional interrupts for the CPU; the benefit is not obvious.


----------



## empl

How-To Geek says you can't disable Intel ME: https://www.howtogeek.com/334013/in...-explained-the-tiny-computer-inside-your-cpu/
But the CS:GO input lag guides on Steam usually say not to install the Intel ME drivers, the reason being that they cause input lag! I can agree with that, but it makes the mouse more accurate at the same time, at least for me.

Someone was saying enabling MSI-X for the GPU made the mouse feel crappy; for me it actually reduced input lag. I tested input lag myself (with an A/B input lag test) and I could tell differences down to 6 ms. It can also differ from system to system. Btw, the Nvidia dev said the registry setting is only for devices that couldn't handle more interrupt channels, and Nvidia puts the value 1 there for those devices; Nvidia specifies this number in its own driver programming. Then again, I can feel a clear difference, and it's the more apparent kind. Who knows why, since the registry setting for the GPU should be irrelevant according to the Nvidia dev, so that's strange. I also saw a Reddit post about it where more people felt the same.

Btw, so many people have posted since last time, and I don't even understand what these people are currently arguing about... I know there are usually 2 kinds of people: the kind trying everything they can to improve their system, and the kind saying every tweak is pointless and doing nothing, which is not true. However, there is only so much you can do. Usually BIOS settings matter most, and which drivers your mobo uses influences input lag and DPC latency.

I got an ASRock Z390 Phantom Gaming ITX/ac for my new PC, which is supposed to have very low DPC latency, as tested on AnandTech. These tests are usually not performed, since results vary with hardware and software configurations, so it's kind of a shot in the dark when picking a new PC. Maybe I'll find something new; a lot has changed since the last time I bought a PC.

But yeah, you can check whether the mobo uses an Intel NIC and has no WiFi or other useless crap, or what sound card it uses, and if you're lucky you'll find some DPC latency test. These are relative, so it's still tough to pick, but you can at least see whether a mobo suffers from extreme DPC latency issues; if multiple people report it about some mobo, e.g. an ASUS ROG, you can see that mobo is probably not good. Some people say DPC latency doesn't matter, but you can tell easily if you are sensitive to it, or if it's very high...


----------



## x7007

Just got the best server network card I could get:

Mellanox MCX4121A-ACAT.

It has everything to do with interrupts and latency; you can customize every setting through Device Manager or the registry.

https://docs.mellanox.com/display/winof2/Configuring the Driver Registry Keys
https://docs.mellanox.com/display/w...ringtheDriverRegistryKeys-OffloadRegistryKeys


----------



## Spieler4

This is maybe the best result I can get in win 7


----------



## empl

x7007 said:


> Just got the best server network card I can get
> 
> Mellanox MCX4121A-ACAT.
> 
> It has everything to do with interrupt and latency. you can customize every setting through device manager or registry.
> 
> https://docs.mellanox.com/display/winof2/Configuring the Driver Registry Keys
> https://docs.mellanox.com/display/w...ringtheDriverRegistryKeys-OffloadRegistryKeys


Is it a PCI-E card? An onboard NIC is always better than any PCI-E card, because PCI-E cards generate a high amount of DPC latency. Unless there's an exception and your card is godlike; I haven't yet seen a PCI-E card that beats an onboard one.



Spieler4 said:


> This is maybe the best result I can get in win 7


Test that setting; I think it was called interval vs time. It shows the times at which your polls are handled by the system. Yours looks like it just shows polling rate stability.
Another problem is how to test this objectively: the load on your system is never the same, even at idle there can be small variance, and who knows what can affect it. I get around 88 µs, but sometimes a spike to 200 µs as the highest execution time in LatencyMon. You could still tell if there were a huge difference.
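
One way to make such comparisons less noisy, given that load is never identical between runs: collect many execution-time samples and report the median alongside the max, so a single 200 µs spike doesn't hide the typical ~88 µs behavior. An illustrative sketch (not how LatencyMon computes its figures):

```python
def latency_summary(samples_us):
    """Separate typical behavior (median) from the worst spike (max)."""
    ordered = sorted(samples_us)
    median = ordered[len(ordered) // 2]
    return median, ordered[-1]

# 99 ordinary 88 us samples and a single 200 us spike.
print(latency_summary([88] * 99 + [200]))  # (88, 200)
```

Two systems with the same max can then still be distinguished by their medians.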

Btw, I'm currently wondering whether it would be possible to stream audio to a second PC and render it there, to avoid the DPC latency from the sound card. I was looking at a couple of programs, but I'm not sure they do what I want. The sound card creates a lot of DPC latency.


----------



## BroadPwns

Alright, I found v1.5 of MouseTester. I thought I was making some miscalculation, but apparently my previous attempts were correct. I have no idea what's making it behave like that; the cursor is butter smooth. This score is such BS I can smell it from my chair.


----------



## Straszy

BroadPwns said:


> Alright, I found v1.5 of MouseTester. I thought I was making some miscalculation, but apparently my previous attempts were correct. I have no idea what's making it behave like that; the cursor is butter smooth. This score is such BS I can smell it from my chair.


bcdedit /deletevalue useplatformtick
bcdedit /deletevalue disabledynamictick
bcdedit /deletevalue useplatformclock

and check again. If it's still the same, check DPC latency; maybe something is running in the background?


----------



## BroadPwns

LatencyMon shows a perfect 62-108 µs with all my daily software running in the background, plus music playing. I don't bother with the readings, as I'm sure MouseTester can't cope with HPET, since none of that madness is actually happening on my screen. Otherwise I'd have a choppy experience with all these dips to below 500 Hz polling.


----------



## James N

Spieler4 said:


> This is maybe the best result I can get in win 7


Yep, Windows 7 is way better than 10 in that regard, but it doesn't matter, since sooner or later everyone has to make the switch to 10. And on 10 you won't ever get a result as good as this.


----------



## BroadPwns

James N said:


> Yep, Windows 7 is way better than 10 in that regard, but it doesn't matter, since sooner or later everyone has to make the switch to 10. And on 10 you won't ever get a result as good as this.



That is, IF these scores on Win 10 are reported correctly at all...


----------



## empl

BroadPwns said:


> LatencyMon shows a perfect 62-108 µs with all my daily software running in the background, plus music playing. I don't bother with the readings, as I'm sure MouseTester can't cope with HPET, since none of that madness is actually happening on my screen. Otherwise I'd have a choppy experience with all these dips to below 500 Hz polling.


What madness? What is this test even supposed to test; I don't see a title. How can you have a frequency of 10,000? If it's supposed to test polling rate stability, shouldn't it read 1000 instead of 10000? And then the second test, I think it was interval vs time, is supposed to test how fast the interrupts from polling are handled by the system.

If your motherboard supports it, you can set USB to MSI-X.
Use the top USB 2 ports; USB 3+ is worse for a mouse.

Otherwise use PS/2 if you can; it has lower input lag, and probably lower DPC latency too, though I don't know. Even an expert on Linus Tech Tips recommended it. https://linustechtips.com/main/topi...g-mobo-for-low-dpc-latency-and-low-input-lag/
You can find on the internet that it is better than USB for mouse and keyboard; mechanical keyboards even support full n-key rollover over PS/2. Now, I'm not sure how much better it actually is, and the difference supposedly narrows with higher polling rates, but if you use 500 Hz polling it could be better. I don't know currently; it's been a long time since I looked into it, and unfortunately I can't use PS/2 anyway.
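
On the 500 Hz vs 1000 Hz question, the model from the opening post gives a quick estimate: input lands at a uniformly random point within a polling interval, so the added latency is at most one interval and averages half of it. A small sketch of that arithmetic:

```python
def polling_latency_ms(rate_hz):
    """Worst-case and average latency added by polling at a given rate,
    assuming input arrives uniformly at random within an interval."""
    interval_ms = 1000.0 / rate_hz
    return interval_ms, interval_ms / 2

for rate in (125, 500, 1000):
    worst, avg = polling_latency_ms(rate)
    print(f"{rate} Hz: worst {worst:.1f} ms, average {avg:.2f} ms")
# 125 Hz: worst 8.0 ms, average 4.00 ms
# 500 Hz: worst 2.0 ms, average 1.00 ms
# 1000 Hz: worst 1.0 ms, average 0.50 ms
```

So moving from 500 Hz to 1000 Hz buys about half a millisecond on average, which fits the claim that any PS/2-vs-USB gap narrows at higher polling rates.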


----------



## BroadPwns

I use a USB 2.0 port, 800 DPI + 1000 Hz polling rate, which is read correctly with HPET off. I bet my ass this software goes monkey **** when HPET is on. I see no reason to tamper with anything, as these scores have literally no reflection in reality.


----------



## x7007

BroadPwns said:


> I use a USB 2.0 port, 800 DPI + 1000 Hz polling rate, which is read correctly with HPET off. I bet my ass this software goes monkey **** when HPET is on. I see no reason to tamper with anything, as these scores have literally no reflection in reality.



HPET off in the BIOS, and in Windows via bcdedit useplatformclock?

Please specify.


----------



## BroadPwns

There's no HPET switch in my BIOS (it's most likely ON by default), so only after deleting useplatformclock does the software work properly (reporting 1000 Hz ±50 Hz). Now the wild part: after deleting useplatformclock through cmd, NieR: Automata started to utilize my GPU at a constant 99% instead of jumping between 65-90%, which keeps a nice 57-80 FPS instead of dipping below 40, lol. No changes in Deus Ex: Mankind Divided; I can't say whether there are other changes, as I didn't test more titles. So it's just an engine-specific issue.


----------



## 508859

BroadPwns said:


> so only after deleting useplatformclock does the software work properly.


No value = default value, so you basically have all the timers enabled and the OS chooses which one to use at any given moment.



BroadPwns said:


> (reporting 1000Hz +-50Hz)



This is garbage to be honest; it is not "properly".


----------



## BroadPwns

The end accuracy of mine is perfect, and I see no reason to dig into polling voodoo further. I also meant that the software reads properly, not that the score itself is proper.


----------



## Spieler4

empl said:


> ...


Maybe you don't have the best drivers, or Windows was set up wrong.
I manage not to get spikes when I don't want them; my LatencyMon is attached.


----------



## 508859

BroadPwns said:


> The end accuracy of mine is perfect and I see no reason to dip into polling voodoo further. I also meant that the software reads properly, not that the score itself is proper.


Dude, you did not dig anywhere; you are using default settings (not that there's anything wrong with that).


----------



## empl

x7007 said:


> With the affinity program you need to deselect Core 0-1 and leave the others; this way you offload the GPU and USB processing to other cores. Every piece of junk and every driver runs on Core 0-1, so keeping GPU and USB processing off those cores makes it better.


Yeah, if you have 16 or 32 cores, that's theoretically a good thing to do. The problem is that the expert from LatencyMon said interrupt affinity can be ignored at the driver or hardware level, so it doesn't always work. Also, most drivers should use MSI or MSI-X, so they should use all or multiple cores and spread evenly... Still, if you have 16 or 32 cores and could reserve some for interrupts, that would be good. I can't otherwise see how many cores each driver is using; LatencyMon shows only the one driver with maximum latency per core. Maybe it is possible somehow; I don't know of such a feature in LatencyMon.



Spieler4 said:


> Maybe you got not the best drivers or windows was setup wrong
> I try not to get spikes when I dont want them. my latencymon attached


The highest I get is around 700 on nvlddmkm.sys or whatever it's called, the Nvidia driver; on USB I usually have 80 when maxing out the polling rate. One run I got a spike to 200 for some reason; in both MouseTester and LatencyMon I usually see 80 when moving the mouse a lot. Overall latency is in the green and quite low; it increases slightly when I switch into a game. The thing I get the most DPC latency from is USB, because it won't work in MSI-X mode on my motherboard. Still, I have everything tweaked and very low DPC latency.
I got the ASRock Phantom Gaming ITX/ac, which is supposed to have very low DPC latency; it's hard to test because it differs across hardware and software configurations. Unfortunately I got a bad socket, so I decided to wait for Intel 10th-gen CPUs. I would buy AMD, but I need single-core performance, and it will still max out my GPU, so I don't care.



BroadPwns said:


> No HPET switch in bios (most likely ON by default) so only by deleting the useplatformclock the software works properly (reporting 1000Hz +-50Hz). Now a wild part - after deleting the useplatformclock through cmd NieR Automata started to utilize my GPU in constant 99%, instead of jumping 65-90, which results in keeping a nice 57-80FPS instead of getting dips below 40 lol. No changes in Deus Ex Mankind Divided, can't say if there are other changes as I did not test more titles. So just an engine specific issue.


Yeah, platformclock should only be used for debugging, and you shouldn't force HPET on but let Windows decide, based on an email conversation with an HPET expert; it can be found on tweakhound.com, search for platformclock.

What I did not know about is useplatformtick (same syntax), which disables synthetic timers and uses the platform timer for the tick. Platformclock is a counter for drivers and applications to optimize performance and time events, "says Microsoft". I switched platformtick on and the mouse feels more accurate; you can try it. To undo it I think it's: bcdedit /deletevalue useplatformtick. Again, it's on TweakHound.



James N said:


> Yep Windows 7 is way better than 10 in that regard, but it doesn't matter since sooner or later everyone has to make the switch to 10. And on 10 you won't ever get a result as good as this


Why are you measuring polling stability? Are you afraid your mouse polling isn't stable? Every good mouse today should hold a stable 1k unless it is faulty, or there's a software bug like in Win 8.1. Rather, measure interval vs time, which tests when your polls are handled by the OS; make sure you max out the polling rate, i.e. move the mouse quickly in circles, then zoom in and check the biggest curve's maximum time.

Btw, this site contains interesting info about latency etc.: https://www.resplendence.com/latencymon The main thing is to disable all throttling features in the BIOS and keep the CPU in the C0 state, i.e. disable the idle saver; google it if you don't know what that is. Process Lasso can turn it on per app, but then it can't restore Balanced mode, which is stupid.

Btw, manufacturers are experimenting with 480 Hz monitors, and DP already has the bandwidth for 1000 Hz gaming, though you barely get 60 fps in some games with a 2080. Still, in games like CS:GO it would be epic if the devs upgraded the engine. Unfortunately, some of the most competitive games, like SC2, still run at only 60 fps.


----------



## x7007

empl said:


> ...


I literally just gave you the program from Microsoft to do the affinity change specific to the drivers, and you should do it even with 4 cores, because cores 0-1 have too much on them.

And you can see the different load in LatencyMon on the CPU tab. Clearly, core 0 at 20,000 while the others are at 5,000, instead of core 0 at 100,000 and the other cores at 20,000, is a big improvement.


----------



## empl

x7007 said:


> I literally just gave you the program from Microsoft to do the affinity change specific to the drivers, and you should do it even with 4 cores, because cores 0-1 have too much on them.
>
> And you can see the different load in LatencyMon on the CPU tab. Clearly, core 0 at 20,000 while the others are at 5,000, instead of core 0 at 100,000 and the other cores at 20,000, is a big improvement.


That's interesting, but it doesn't mean the system will always abide by these settings; as the LatencyMon expert said, they can be ignored at the driver/hardware level. I know what you mean. I already tried that in the past, but I didn't see a change - I tried putting the GPU on all cores. Still, it may be a good tweak; what if it works in some cases? People were saying certain things don't work when they did seem to work, like putting the GPU into MSI-X mode: an NVIDIA dev said it shouldn't have any effect, but I can clearly tell a difference in input lag. Yeah, I too have 90% of interrupts on core 0 from a short test, which is strange, given I have everything in MSI-X except USB, so the drivers should utilize multiple cores. I will test it once again and see if it helps.

The problem is I don't see how many DPC calls come from a specific driver on each core. I can see only the overall count of DPC calls on each core, and which driver had the highest execution time on a specific core. So the only way to see if it helped, I guess, is to check whether the overall highest execution time drops. How do you measure whether it helps? After I set the GPU's DevicePolicy in regedit to IrqPolicyAllProcessorsInMachine (3 in hexadecimal), I didn't see many more interrupts on the other cores.

I have contradictory results: overall latency is great, only 400 max after 20 minutes of testing for nvlddmkm.sys, which is usually at least 700. Again, these results can vary; it is hard to test. But the other cores are still barely touched: core 0 - 353k DPC calls, core 1 - 23k, the others 700-5,000. And I set it in Enum, under Affinity Policy for my GPU's hardware ID, so I don't think I set it wrong. I checked it in Performance Monitor too and got the same results - everything on core 0... so I don't think the results are wrong.

I would like to have interrupts spread evenly across all cores. Despite having everything in MSI/MSI-X mode, I still have 90% of interrupts on core 0, and setting core affinity in the registry didn't help much, if at all; in the LatencyMon results, the DPC count on the other cores didn't change!

I don't think it is even possible to change; interrupt handling is done by the APIC in the CPU. It depends on the hardware and the drivers, and these registry settings don't do anything: I tried both that utility and a manual entry like DevicePolicy = 3 under the device ID. It also depends on how the drivers are written. It is strange, though, because the hardware definitely should have the capability, same as the drivers; I have up-to-date drivers and everything in MSI-X, so it doesn't make sense...

What are your loads on each core?

EDIT: okay, now interrupt affinity works, after I switched the GPU to MSI mode. I hadn't done that because an NVIDIA dev said MSI support is specified in the drivers' code, and the settings NVIDIA drivers put in the registry are only for old devices that couldn't handle more. But after I switched the GPU to MSI mode in the registry I get less input lag, so it is strange...
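For reference, the registry location this exchange revolves around is Microsoft's documented interrupt affinity policy under the device's Enum key. A hypothetical .reg fragment - the device instance path here is a placeholder you must replace with your GPU's own path from Device Manager, and, per the posts above, the driver or hardware may ignore the setting entirely:

```reg
Windows Registry Editor Version 5.00

; Placeholder path - substitute your GPU's device instance path
; (Device Manager -> device -> Details -> "Device instance path").
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\PCI\VEN_XXXX&DEV_XXXX\<instance-id>\Device Parameters\Interrupt Management\Affinity Policy]
; 3 = IrqPolicyAllProcessorsInMachine (the value discussed above)
"DevicePolicy"=dword:00000003
```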


----------



## x7007

I've found out what was causing my mouse and keyboard input lag: it was the motherboard fan controller. And it's not a case unique to my motherboard - it happened on my old motherboard, and for many other people with any motherboard. You need to set the fan controller to a specific fan mode, DC or PWM depending on your fan; do not leave it on Auto.

I am still investigating, but I think I am finally done. I can say I fixed it; I just need to know what exactly did it on the technical level.


----------



## 508859

x7007 said:


> I've found out what was causing my mouse and keyboard input lag: it was the motherboard fan controller. And it's not a case unique to my motherboard - it happened on my old motherboard, and for many other people with any motherboard. You need to set the fan controller to a specific fan mode, DC or PWM depending on your fan; do not leave it on Auto.
>
> I am still investigating, but I think I am finally done. I can say I fixed it; I just need to know what exactly did it on the technical level.


You need to be able to reproduce the issue by switching back and forth between settings, and the issue must be measurable by something.


----------



## r0ach

James N said:


> Yep Windows 7 is way better than 10 in that regard, but it doesn't matter since sooner or later everyone has to make the switch to 10. And on 10 you won't ever get a result as good as this


Every Maxwell and higher + AMD Vega and higher generation GPU uses tiled rendering, which seems to render with higher latency in non-hardware accelerated desktop mode (Windows 7), which is one of the many reasons I'm a 5%'r that refuses to budge off Windows 8.1. Probably doesn't affect full-screen mode to clarify, but I don't like having a crappy desktop mode either. If you wanted to stay on Windows 7 and don't want desktop cursor movement to be crappy, the highest GPU you can use is an AMD 580.

The only problem there is - from the old days of Bitcoin mining - I've had an entire room full of GPUs at one time and noticed cursor movement was always worse on non-reference R9 290's, and it's probably difficult to find any reference AMDs from that generation nowadays. I'm not sure how they do it, but 3rd party vendors seem to screw up their BIOS implementations somehow. It could be numerous things from random UEFI crap, to timing issues like in Bitcoin mining on AMD cards how you had to set both the core and memory clocks to an exact number to get proper performance and adding say +200 mhz might give you worse performance than the lower clocked card.

I don't really know or care what the cause of the issue is, all I know is that many non-reference cards have issues. The problem also seems to be worse for AMD 3rd party vendors - might be due to architectural differences requiring some sort of specific timing as I was talking about. I would also prefer a reference Nvidia over a 3rd party, but reference Nvidia vs 3rd party hasn't been as big of an issue for me where you notice some 3rd party AMD having extreme floaty cursor feel compared to reference. Maybe some type of issue inherent to GCN timings, maybe not.


----------



## Kommando Kodiak

r0ach said:


> Every Maxwell and higher + AMD Vega and higher generation GPU uses tiled rendering, which seems to render with higher latency in non-hardware accelerated desktop mode (Windows 7), which is one of the many reasons I'm a 5%'r that refuses to budge off Windows 8.1. Probably doesn't affect full-screen mode to clarify, but I don't like having a crappy desktop mode either. If you wanted to stay on Windows 7 and don't want desktop cursor movement to be crappy, the highest GPU you can use is an AMD 580.
> 
> The only problem there is - from the old days of Bitcoin mining - I've had an entire room full of GPUs at one time and noticed cursor movement was always worse on non-reference R9 290's, and it's probably difficult to find any reference AMDs from that generation nowadays. I'm not sure how they do it, but 3rd party vendors seem to screw up their BIOS implementations somehow. It could be numerous things from random UEFI crap, to timing issues like in Bitcoin mining on AMD cards how you had to set both the core and memory clocks to an exact number to get proper performance and adding say +200 mhz might give you worse performance than the lower clocked card.
> 
> I don't really know or care what the cause of the issue is, all I know is that many non-reference cards have issues. The problem also seems to be worse for AMD 3rd party vendors - might be due to architectural differences requiring some sort of specific timing as I was talking about. I would also prefer a reference Nvidia over a 3rd party, but reference Nvidia vs 3rd party hasn't been as big of an issue for me where you notice some 3rd party AMD having extreme floaty cursor feel compared to reference. Maybe some type of issue inherent to GCN timings, maybe not.


Holy balls, it's really him! I thought he was just a myth around here! I thought he'd abandoned these forums.


----------



## x7007

r0ach said:


> Every Maxwell and higher + AMD Vega and higher generation GPU uses tiled rendering, which seems to render with higher latency in non-hardware accelerated desktop mode (Windows 7), which is one of the many reasons I'm a 5%'r that refuses to budge off Windows 8.1. Probably doesn't affect full-screen mode to clarify, but I don't like having a crappy desktop mode either. If you wanted to stay on Windows 7 and don't want desktop cursor movement to be crappy, the highest GPU you can use is an AMD 580.
> 
> The only problem there is - from the old days of Bitcoin mining - I've had an entire room full of GPUs at one time and noticed cursor movement was always worse on non-reference R9 290's, and it's probably difficult to find any reference AMDs from that generation nowadays. I'm not sure how they do it, but 3rd party vendors seem to screw up their BIOS implementations somehow. It could be numerous things from random UEFI crap, to timing issues like in Bitcoin mining on AMD cards how you had to set both the core and memory clocks to an exact number to get proper performance and adding say +200 mhz might give you worse performance than the lower clocked card.
> 
> I don't really know or care what the cause of the issue is, all I know is that many non-reference cards have issues. The problem also seems to be worse for AMD 3rd party vendors - might be due to architectural differences requiring some sort of specific timing as I was talking about. I would also prefer a reference Nvidia over a 3rd party, but reference Nvidia vs 3rd party hasn't been as big of an issue for me where you notice some 3rd party AMD having extreme floaty cursor feel compared to reference. Maybe some type of issue inherent to GCN timings, maybe not.


So if I am going to buy an AMD GPU, for example a 5700 XT or the new 2020 RX with ray tracing support, or an NVIDIA GPU - which one should I buy? And what test could we do for a review so that they might fix this crap for every released card, the way frame time was a long-standing issue until it was fixed?


----------



## r0ach

x7007 said:


> So if I am going to buy an AMD GPU, for example a 5700 XT or the new 2020 RX with ray tracing support, or an NVIDIA GPU - which one should I buy? And what test could we do for a review so that they might fix this crap for every released card, the way frame time was a long-standing issue until it was fixed?


There is no fix. Windows XP was a hardware accelerated desktop, then Win 7 wasn't, then Win 8.1 was again. This is why desktop cursor movement vs exclusive full-screen 3d mode feels the same in Windows 8.1 but not Windows 7. What's bizarre is Nvidia doesn't support Windows 8.1 on 2060-2080 series but does on 1600 series. I refuse to use Windows 10 New World Order edition myself. It's Microsoft attempting to transition to a fully locked down, Apple-style OS, and I would use it solely as a game box and nothing else due to that, but cursor movement is worse than Win 8.1, so I have no use for it at all.


----------



## x7007

r0ach said:


> There is no fix. Windows XP was a hardware accelerated desktop, then Win 7 wasn't, then Win 8.1 was again. This is why desktop cursor movement vs exclusive full-screen 3d mode feels the same in Windows 8.1 but not Windows 7. What's bizarre is Nvidia doesn't support Windows 8.1 on 2060-2080 series but does on 1600 series. I refuse to use Windows 10 New World Order edition myself. It's Microsoft attempting to transition to a fully locked down, Apple-style OS, and I would use it solely as a game box and nothing else due to that, but cursor movement is worse than Win 8.1, so I have no use for it at all.


I need HDR though... I wouldn't mind Windows 10 so much; it has all the needed support and drivers, and new features. It's too bad technology always goes backwards... all the time, when it's money-related, even though mind and spirit should be all about the final product we get.


----------



## r0ach

I don't see it that way. DX12 looks like a flop with the worst adoption of any DX version so far, which means the industry as a whole might easily go towards something like Vulkan or something else instead. Microsoft is trying to release DX12 only games such as Age of Empires II: Definitive edition that doesn't even need DX12 to try and scam people into moving to Win 10, but it's not going to work.


----------



## TranquilTempest

x7007 said:


> What test we would be able to do for a review that they might fix this crap for every released card like FrameTime was a long issue till it was fixed.


You need end-to-end latency testing: either a mouse modified to light an LED whenever you press the button, used with a high-speed camera, or a microcontroller and a photosensor that sends a mouse input and measures the delay from the input event to the brightness change caused by that input event. If you already have the high-speed camera, the first can be done with a couple bucks of parts and a soldering iron. The second is more like $50 of parts, but the data is much easier to gather and analyze.

Relevant posts on microcontroller method: 
https://forums.blurbusters.com/viewtopic.php?t=1381
https://forums.blurbusters.com/viewtopic.php?p=22634#p22634

High speed camera method:
https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/3/
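Either rig ultimately produces pairs of timestamps - input event, then brightness change - and the analysis step is simple. A minimal sketch, with invented sample numbers:

```python
# Sketch: summarize click-to-photon latency samples.
# Each sample is (input_event_time, brightness_change_time) in milliseconds;
# the numbers below are invented for illustration.

def latencies(samples):
    """Delay between each input event and the resulting brightness change."""
    return [photon - click for click, photon in samples]

def summarize(samples):
    lats = sorted(latencies(samples))
    mean = sum(lats) / len(lats)
    # Nearest-rank 95th percentile: small sample sets from a
    # microcontroller rig make fancier interpolation pointless.
    p95 = lats[min(len(lats) - 1, int(0.95 * len(lats)))]
    return mean, lats[0], lats[-1], p95

samples = [(0, 28), (100, 131), (200, 226), (300, 342), (400, 429)]
mean, lo, hi, p95 = summarize(samples)
print(f"mean={mean:.1f} ms  min={lo} ms  max={hi} ms  p95={p95} ms")
```

Comparing the whole distribution (not just the mean) between two settings is what makes the before/after claim concrete.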


PS: Anyone giving advice should either be explaining their test methodology and providing data, or should be able to explain exactly how a setting works and how changing it changes the pipeline. Disregard any advice based solely on feel. Mouse feel is only a reliable indicator for really huge differences in latency, on the order of 20 ms and greater. Below 10 ms it's very hard to distinguish from placebo; you need a blind A/B test and statistical analysis for that.
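The blind A/B test mentioned above can be scored with a plain one-sided binomial test using only the standard library. A minimal sketch - the trial counts are invented:

```python
from math import comb

def binomial_p_one_sided(correct, trials, p=0.5):
    """P(X >= correct) if the tester were guessing blindly with success
    chance p. A small p-value means they can genuinely tell the two
    setups apart."""
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(correct, trials + 1))

# Example: a blind tester picks the lower-latency setup
# in 15 of 20 randomized trials.
p_val = binomial_p_one_sided(15, 20)
print(f"p = {p_val:.4f}")  # ≈ 0.0207: unlikely to be pure guessing
```

With 11 or 12 correct out of 20 the p-value stays far above 0.05, which is exactly why casual "it feels snappier" reports prove nothing.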


----------



## x7007

r0ach said:


> I don't see it that way. DX12 looks like a flop with the worst adoption of any DX version so far, which means the industry as a whole might easily go towards something like Vulkan or something else instead. Microsoft is trying to release DX12 only games such as Age of Empires II: Definitive edition that doesn't even need DX12 to try and scam people into moving to Win 10, but it's not going to work.


Yes, I put my trust in Vulkan, but for some reason in Wolfenstein: Youngblood I only get 25 fps no matter what I do, and I have no idea what the problem is; some people have also had this issue. So Vulkan is not free of bugs - or it's just something else. At least we are closer to answers than ever before; we know what does what.

Also, I use ReShade for every game I play, so as to play in 3D using the SuperDepth shader. And Vulkan and DX12 are still not perfect.

I also have a weird issue since I disabled SMT: when trying to use the Microsoft interrupt affinity program, it says I don't have enough resources... it worked just fine before. I tried it on USB and on my GTX 1080; they both get disabled. I couldn't find a fix yet, and I don't know if I will. I need it to offload some of the load from core 0 - there's too much on it.


----------



## SuperMumrik

What do you mean? My RTX 2080 Ti performs just as well in 8.1 as in Win 10. No DX12 or RTX, of course. Win 8.1 FTW 🙂


----------



## r0ach

SuperMumrik said:


> What do you mean? My RTX 2080 Ti performs just as well in 8.1 as in Win 10. No DX12 or RTX, of course. Win 8.1 FTW 🙂


I'm on a Maxwell, but I don't see any Windows 8.1 option for the RTX 2000 series. If either of those options installs on Win 8.1, it probably means Microsoft is pressuring Nvidia to deprecate compatibility in the future to scam people into using Win 10.


----------



## SuperMumrik

You have to use the win7 driver, but it installs correctly without issues


----------



## r0ach

TranquilTempest said:


> Mouse feel is only a reliable indicator for really huge differences in latency, on the order of 20 ms and greater. Below 10 ms it's very hard to distinguish from placebo; you need a blind A/B test and statistical analysis for that.


It's not even all latency-orientated. In the words of that one Asian guy whose name I forget: "all code is garbage". You can get some screwed-up, floaty mouse cursor movement, probably without even adding additional latency, just by setting some software setting to a non-optimal value - like MSI mode vs. line-based interrupt mode, for instance.

Discussion of that variable on this forum was non-existent when I talked about it years ago and said manually forcing it on for Nvidia GPUs gives worse mouse movement - sort of an inaccurate, floaty cursor feel. It didn't really affect latency much, if at all; it's just a setting that's not doing something right in software. I believe AMD forces MSI mode on in their drivers, and I haven't really messed with forcing it off to see if there's any difference there. The last clown cursor setting I recall trying was the Nvidia one for selecting the OpenGL render mode. Manually forcing it seemed worse than just leaving it on auto-select and gave that floaty cursor feeling (you need to reboot between setting changes just to be sure, as well).


----------



## Falkentyne

x7007 said:


> Yes, I put my trust in Vulkan, but for some reason in Wolfenstein: Youngblood I only get 25 fps no matter what I do, and I have no idea what the problem is; some people have also had this issue. So Vulkan is not free of bugs - or it's just something else. At least we are closer to answers than ever before; we know what does what.
>
> Also, I use ReShade for every game I play, so as to play in 3D using the SuperDepth shader. And Vulkan and DX12 are still not perfect.
>
> I also have a weird issue since I disabled SMT: when trying to use the Microsoft interrupt affinity program, it says I don't have enough resources... it worked just fine before. I tried it on USB and on my GTX 1080; they both get disabled. I couldn't find a fix yet, and I don't know if I will. I need it to offload some of the load from core 0 - there's too much on it.


Hey Roach, I missed you!
I'm still using your Roach approved mouse sensor.
Have you considered this for windows 10?

Windows 10 Enterprise 2019 LTSC ?

There's also something here.
http://forum.notebookreview.com/threads/windows-10-nobloatware-edition.810557/


----------



## r0ach

I haven't really messed with Win 10 since it was first released and noticed it wasn't good mouse cursor-wise vs Win 8.1 even after removing all the stuff I knew to cause problems. I really doubt the LTSC is going to change any of that, so I haven't bothered with it. Do people on this forum actually use that OS now? With Intel Management Engine and everything else installed too? MS is obviously moving towards a locked down, Apple app store style of OS, so you're going to get screwed somewhere down the line messing with Win 10 regardless.

You have factions like the government - which MS basically is the government - wanting to spy on the population and enforce Marxism, plus the corporations wanting to minimize things like piracy while an app store model also helps facilitate vendor lock-in where they can just eliminate any competition putting out non-SJW software that people actually want to buy by not allowing them to operate on the platform. So, in short, allowing Microsoft, aka the government, 'rolling release' potential is obviously going to be compounding, Marxist incrementalism.


----------



## r0ach

As a side note, why do you like chess? The number of variables in the game is so small, and the game is not played in real time, so every degree of freedom can be comprehended by the human brain to the point where it's basically a deterministic system. That's why I've never really understood people giving any prestige to chess over, say, more modern, computer-orientated games with more degrees of freedom.

One system is more like a puzzle - deterministic - while the other has enough degrees of freedom that it is difficult to exactly replicate the inputs you used in a previous game, allowing a randomization/entropy effect to create emergent gameplay phenomena, which chess does not have. Some would claim that can't be true if stages are not procedurally generated, but like I said, there's the simple fact that you can stand on or interact with a much larger array of pixels, with other scripts designed to interact with those random, impossible-to-replicate movements. You would need much better terrain deformation, AI, and physics for this to fully play out, though.


----------



## x7007

Can everyone do the in-depth latency tests and say what CPU you have and, if you have SMT, whether it gives better results or fixes it?

Also, Roach, are you saying MSI selected for the GPU is bad then? I will test.


EDIT: I've found that not using MSI mode for the GPU causes ISRs from dxgkrnl.sys, so using MSI on an NVIDIA card actually does work, and works fine.

You can clearly see that in MSI mode there are no ISRs from dxgkrnl.sys.

Btw Roach, do you have any idea why, after disabling SMT (to fix my issue with the SMIs shown in the in-depth latency tests, which were causing the mouse and everything else to go crazy clown mode), I am not able to change the interrupt affinity priority using the Microsoft program? Every time I change something, it disables the device and says there are not enough resources, and that I need to disable another device to free some up. With SMT enabled it works fine, but I have a crazy clown mouse. I want to use the program to reduce the load on core 0 for the GPU, USB, and sound card; it should give better mouse movement.

EDIT: I find it hard to detect any performance increase or mouse movement improvement when using bcdedit /set tscsyncpolicy Legacy and disabling HPET from the BIOS, Windows, and Device Manager.


----------



## r0ach

x7007 said:


> Also, Roach, are you saying MSI selected for the GPU is bad then? I will test.


Well, Nvidia sets it off by default and AMD on by default, and it's always been worse for me forcing it on for Nvidia cards. The ONLY Nvidia card I know of that has MSI on by default is the Alienware Alpha embedded 860m (w.t.f.?). On that particular unit, having it on doesn't seem to be detrimental to cursor movement or as detrimental at least as attempting to force it on for a regular desktop card. So there's probably other secondary settings and random driver stuff going on that need to be adjusted as well if using MSI. 

I don't recall offhand if I even tested the difference between on and off for AMD cards. TLDR: don't attempt to force MSI mode on Nvidia cards.


----------



## x7007

r0ach said:


> Well, Nvidia sets it off by default and AMD on by default, and it's always been worse for me forcing it on for Nvidia cards. The ONLY Nvidia card I know of that has MSI on by default is the Alienware Alpha embedded 860m (w.t.f.?). On that particular unit, having it on doesn't seem to be detrimental to cursor movement or as detrimental at least as attempting to force it on for a regular desktop card. So there's probably other secondary settings and random driver stuff going on that need to be adjusted as well if using MSI.
> 
> I don't recall offhand if I even tested the difference between on and off for AMD cards. TLDR: don't attempt to force MSI mode on Nvidia cards.


Do you have something I can test with MSI on/off, so I'll see if it's worse?

I'm looking for things I can see and feel, so I'll know when it's best.


A good thread with MSI discussion:

https://forum.rme-audio.de/viewtopic.php?id=25778

Issues when using MSI mode for NVIDIA:

https://forum.rme-audio.de/viewtopic.php?pid=130009#p130009


----------



## gunit2004

r0ach said:


> Well, Nvidia sets it off by default and AMD on by default, and it's always been worse for me forcing it on for Nvidia cards. The ONLY Nvidia card I know of that has MSI on by default is the Alienware Alpha embedded 860m (w.t.f.?). On that particular unit, having it on doesn't seem to be detrimental to cursor movement or as detrimental at least as attempting to force it on for a regular desktop card. So there's probably other secondary settings and random driver stuff going on that need to be adjusted as well if using MSI.
> 
> I don't recall offhand if I even tested the difference between on and off for AMD cards. TLDR: don't attempt to force MSI mode on Nvidia cards.


I don't look at the numbers and all that stuff like you do, but I will say that I tried putting my 1080 Ti in MSI mode a while back and it felt horrible (it made Overwatch pretty much unplayable in terms of aim, because I couldn't hit the broad side of a barn). So I can say I wholeheartedly agree with your findings. When I first tried MSI_util v2, everything on my computer was already in MSI mode by default except for the NVIDIA GPU and the NVIDIA audio controller.

What are your thoughts on setting IRQ priority in the registry? I found it made things worse on anything EXCEPT the Intel Management Engine Interface. Giving that an IRQ priority entry in the registry makes the mouse feel MUCH, MUCH more responsive, and I have been using that setting for about a year now.


----------



## empl

Sup


> roach


I found out many useful tweaks from you, which gave me low input lag. I used to be Supreme Master class in CS:GO and top-1 Masters in StarCraft 2; I can tell even a 6 ms difference in input lag. Thanks dude! Yeah, Windows 10 is crap; unfortunately some games don't work on Windows 7, and DX12 is only on Win 10. You can't even disable Nagling or turn off DWM. It used to be possible to disable DWM, but not even this works anymore: https://www.reddit.com/r/Windows10/..._worth_disabling_dwm_desktop_composition_for/ and you can play only in windowed mode, which can lower fps; even fullscreen optimizations should supposedly improve this, though I don't know by how much.

Luckily Linux is getting support for more games, but it is still nowhere near Windows in terms of performance, and I don't know about its input lag.

About MSI-X on NVIDIA cards: Sora said on the NVIDIA forums that MSI support and MSISupported (channels) are specified in the NVIDIA drivers' programming, and that the drivers put the registry value there only for old devices, which would crap out otherwise. There is some line in the NVIDIA .inf which should put it in the registry, but by default my GPU is in line-based interrupt mode. Yet when I put my GeForce 780 into MSI mode in regedit I get lower input lag, so it clearly wasn't in MSI/MSI-X. That's strange - what to make of it? Either she is lying, or NVIDIA has it wrong. MSI-X is pretty important for low input lag, and the NVIDIA drivers have high DPC latency.
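For context, the MSISupported value being argued about lives under the device's Interrupt Management key. A hypothetical .reg fragment - the device instance path is a placeholder (VEN_10DE is NVIDIA's PCI vendor ID), and note that forcing this on NVIDIA cards is exactly what's contested in this thread:

```reg
Windows Registry Editor Version 5.00

; Placeholder path - substitute your GPU's device instance path from Device Manager.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\PCI\VEN_10DE&DEV_XXXX\<instance-id>\Device Parameters\Interrupt Management\MessageSignaledInterruptProperties]
; 1 = request MSI mode, 0 = line-based interrupts
"MSISupported"=dword:00000001
```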



> x7007 - msi


I tested a replay from StarCraft 2, so there should be the same load: a 2-minute test in LatencyMon, and with MSI on I had a much lower DPC count and highest execution time. Even though system load is never the same, there was a huge difference. I didn't test it yet with the ADK (Windows Assessment and Deployment Kit), because some experienced poster on Guru3D said LatencyMon doesn't measure accurately - but I don't know, it is a professional tool. Again, even with the same things open and similar GPU load (because it is a replay from a game, not a different game), DPC latency won't ever be the same, so it is hard to test, but you can see huge differences and feel them.

Btw I have a weird mouse lag when I edit anything in Task Scheduler, for some reason; it's like the mouse feels heavy. It happened to me on both Win 7 and Win 10. Strange, and only a reinstall or repair fixes it. You should disable scheduled tasks when gaming; otherwise I wouldn't even touch it. Does anyone know anything about this? I found nothing, probably because it is obscure and most people won't notice.


----------



## x7007

I need to know how the Logitech G502 behaves using onboard memory vs. software, with LGS/G HUB open vs. closed, and with the virtual mouse and mouse pad calibration disabled.

What should give perfect movement?

There are too many variables.

Has anyone tested how it is with LGS/G HUB installed and open/closed from the tray icon, and the Logitech virtual mouse disabled/enabled in Device Manager?
Mouse calibration at default vs. calibrated, with LGS/G HUB closed/open, and onboard vs. software?


----------



## r0ach

It's pretty damn simple! For any "feature" someone tries to add to your computer, whether it's Nvidia post processing blah blah sharpening or memory resident mouse malware, assume ALL of the code is terrible and lag inducing in every single one of them, because it usually is on both counts. There's no reason to have mouse software installed. 

In the old days, the Razer software turned native steps into interpolated, and plenty of them still do that in the year 2019. And what is the size of Logitech's software now? 1 Gigabyte? Do you think it takes that much space to operate a mouse? Of course it's not going to do anything useful for you unless you absolutely need to change default DPI, then just uninstall it if you do.


----------



## x7007

r0ach said:


> It's pretty damn simple! For any "feature" someone tries to add to your computer, whether it's Nvidia post processing blah blah sharpening or memory resident mouse malware, assume ALL of the code is terrible and lag inducing in every single one of them, because it usually is on both counts. There's no reason to have mouse software installed.
> 
> In the old days, the Razer software turned native steps into interpolated, and plenty of them still do that in the year 2019. And what is the size of Logitech's software now? 1 Gigabyte? Do you think it takes that much space to operate a mouse? Of course it's not going to do anything useful for you unless you absolutely need to change default DPI, then just uninstall it if you do.


Indeed, that's my thinking too. But sometimes you need to change something, and it's annoying to reinstall the software every time; it would be better if we could keep it and just disable its functions. Also, does the mouse calibration really do anything? I don't feel much difference between default and calibrated, and does the calibration keep working without the software installed?

EDIT: It seems my G502 works better with the mouse calibration set to my mouse pad. I have the SteelSeries I-2 Glass, and I also use Lexit ceramic mouse feet, so I think I need the calibration, and I also need to keep the software open, or else the mouse becomes funky.
That's what I meant by variables. I don't have a single standard thing: a glass mouse pad, which is kind of "special", and nonstandard mouse feet.
Also, the G502 has a failure point (malfunction speed, or whatever it's called) when moving too fast. In Paint it looks like broken lightning. But I mean really fast, like Speedy Gonzales, which I'll never move like in games.


----------



## Kommando Kodiak

How to give r0ach a heart attack


----------



## NDUS

In ninety pages of discussion there has never been a concrete before-and-after proof of any technique for optimizing USB polling precision. Just a bunch of people vaguely throwing around "feelings".


----------



## Avalar

NDUS said:


> In ninety pages of discussion there has never been a concrete before-and-after proof of any technique for optimizing USB polling precision. Just a bunch of people vaguely throwing around "feelings".


Make it real. Be the change.


----------



## r0ach

I'm not reading through 90 pages but every USB 2.0 controller I've used has pretty noticeable degradation of mouse movement by plugging in a 1000hz keyboard instead of a 125hz one, so these things can barely even support a single 1000hz device in the first place in the manner they're implemented, let alone two of them. Mice perform worse on USB3 so no point testing those before anyone brings that up. Therefore it's probably completely pointless running a mouse past 1000hz.

I'm not on an ultra-overclocked-to-the-moon box at the moment, just a Haswell quad core, but on Windows 8.1 this XM1 mouse (PMW3389) goes from 4% CPU while idle in Chromium with five tabs open to 13-15% CPU when I move the mouse fast, along with a pretty significant DPC rise. Why would anyone want to devote even more system resources to IO when USB is already implemented in a manner that puts a pretty enormous strain on the system trying to poll a single device at 1000hz? It's probably more detrimental than beneficial.

/thread
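For a sense of scale, a reported CPU share can be turned into a per-poll cost. This is a rough sketch with hypothetical numbers (the 10% figure is an example for illustration, not a measurement from the thread):

```python
# Back-of-envelope: what per-poll handling cost would explain an observed
# CPU-usage jump while moving a 1000 Hz mouse? Illustrative numbers only.

def per_poll_cost_us(cpu_share: float, polling_rate_hz: int) -> float:
    """Microseconds of CPU time spent handling each poll, given a CPU share (0..1)."""
    return cpu_share / polling_rate_hz * 1_000_000

# A hypothetical 10% single-core load at 1000 polls/s implies ~100 us of work
# per poll (interrupt handling, HID parsing, DPC, input delivery all included).
cost = per_poll_cost_us(0.10, 1000)
print(f"{cost:.0f} us per poll")  # 100 us per poll
```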


----------



## x7007

r0ach said:


> I'm not reading through 90 pages but every USB 2.0 controller I've used has pretty noticeable degradation of mouse movement by plugging in a 1000hz keyboard instead of a 125hz one, so these things can barely even support a single 1000hz device in the first place in the manner they're implemented, let alone two of them. Mice perform worse on USB3 so no point testing those before anyone brings that up. Therefore it's probably completely pointless running a mouse past 1000hz.
> 
> I'm not on a ultra overclocked to the moon box at the moment, just a Haswell quad core, but on Windows 8.1, this XM1 mouse (PWM3389) goes from 4% cpu while idle in Chromium with five tabs open to 13-15% CPU when I move the mouse fast along with a pretty significant DPC rise. Why would anyone want to devote even more system resources to IO when USB is already implemented in a manner to put a pretty enormous strain on the system trying to poll a single device at 1000hz? It's probably more detrimental than beneficial.
> 
> /thread


Mine doesn't go up much in CPU usage, barely 2-5%; I have a 1950X. But should I put the keyboard and mouse on 500Hz? I can't disable USB 3 on my X399 Gigabyte board, only hand-off.

Which is the best USB 2.0 controller I can buy, a PCIe card or the motherboard ports? Should I connect the mouse and keyboard to the front panel? I have two USB 2.0 ports on the case wired to the internal USB 2.0 header (forgot its name). And should I put both on 500Hz?


----------



## jayfkay

r0ach said:


> I'm not reading through 90 pages but every USB 2.0 controller I've used has pretty noticeable degradation of mouse movement by plugging in a 1000hz keyboard instead of a 125hz one, so these things can barely even support a single 1000hz device in the first place in the manner they're implemented, let alone two of them. Mice perform worse on USB3 so no point testing those before anyone brings that up. Therefore it's probably completely pointless running a mouse past 1000hz.
> 
> I'm not on a ultra overclocked to the moon box at the moment, just a Haswell quad core, but on Windows 8.1, this XM1 mouse (PWM3389) goes from 4% cpu while idle in Chromium with five tabs open to 13-15% CPU when I move the mouse fast along with a pretty significant DPC rise. Why would anyone want to devote even more system resources to IO when USB is already implemented in a manner to put a pretty enormous strain on the system trying to poll a single device at 1000hz? It's probably more detrimental than beneficial.
> 
> /thread


I'm on Ivy Bridge and my CPU goes from 1% idle to like 10% at best with 80 firefox tabs open when I try to really force it with abnormally fast mouse movement for a few seconds.


----------



## x7007

jayfkay said:


> I'm on Ivy Bridge and my CPU goes from 1% idle to like 10% at best with 80 firefox tabs open when I try to really force it with abnormally fast mouse movement for a few seconds.



Yeah, CPU usage is not really an issue nowadays. We have extreme performance, and we can even pin a USB device to a specific CPU so it never causes issues, using the interrupt affinity tool.


----------



## x7007

Ha, I remember why the CPU usage is low. r0ach is using USB 2.0, so polling goes through the usbehci.sys driver and the CPU does the work directly, the way it should. We are using USB 3 with the usbxhci.sys driver, so it doesn't show up in the ISR the same way.

r0ach, is there any way I can use USB 2.0 with my motherboard? Is it any good to use a USB header from the motherboard, or is it the same thing? Is it any good to get a PCIe USB 2.0 controller? What's my best way to get USB 2.0 with the best latency in Windows 10? What's the best chipset for USB 2.0? So many variables. I have USB 2.0 wired from the internal connectors to my front case; would that be better, worse, or no different?

Can you show your picture from USB Device Tree Viewer?


----------



## jayfkay

Nope, I am using USB 2; my Z77-G43 has 4 or 6 USB 2.0 ports.
It also has 3.0 ports though. Can I just delete that usbehci.sys driver? Will it help, or just cause all my USB devices to fail?


Also, how do I zoom in on the Y-axis but not the X-axis in a MouseTester plot?


----------



## x7007

jayfkay said:


> Nope, I am using USB 2, my Z77-g43 has 4 or 6 USB2.0 ports.
> It also has 3.0 ports tho. Can I just delete that USBEHCI.sys driver? Will it help or just cause all my usb devices to fail


fail


----------



## x7007

Not sure, but it seems SR-IOV or Secure Boot with keys was causing serious input lag. I had Secure Boot enabled with keys. I removed all the keys, disabled Secure Boot, and disabled SR-IOV, which shouldn't be enabled even on Auto because it needs IOMMU, and that was also disabled.


----------



## J Doe

How bad is this?


----------



## r0ach

x7007 said:


> Not sure, but it seems SR-IOV or Secure boot with keys, causing serious lag input


Why are you not using legacy BIOS mode? It's better.


----------



## x7007

r0ach said:


> Why are you not using legacy BIOS mode? It's better.


Windows is already installed in UEFI mode. Would legacy even matter if Windows is installed for UEFI?


----------



## x7007

Something weird I've found out just now. 

Lowering the GPU core and memory clocks fixed my mouse input instantly. I lowered it to match the non-boost clock: for example, my card without boost is 1759MHz; with boost it's around 1879MHz and can even reach 2025-2050MHz. So lowering it by at least -259MHz keeps it near the 1759MHz non-boost clock, and the input lag improved by 200%; I mean it was pinpoint accurate. Boost is a piece of crap; it's exactly like CPU boost and power saving/clock reduction, which cause terrible stuttering and input lag. So we found the cause once and for all. No one in a single thread has suggested checking this. Using Nvidia debug mode is different from lowering the GPU clock, that I'm sure of.


----------



## RamenRider

x7007 said:


> windows already installed on uefi. would matter if I use legacy if windows is on Uefi mode.


I still don't understand how UEFI and legacy mode work in relation to Windows 10's boot records. Say I have multiple drives, each with a Windows 10 install. On bootup, would it still ask me to select which drive/partition to use through the boot menu?


----------



## Timecard

The difference should be negligible once UEFI hands control off to Windows; UEFI just has more functionality and added features. Otherwise they serve a very similar purpose: initializing hardware and connecting it to the operating system (bootstrapping).


----------



## RamenRider

So why do people always suggest it? Does it really make a huge difference? If so, I should be disabling CSM completely?


----------



## empl

r0ach said:


> It's pretty damn simple! For any "feature" someone tries to add to your computer, whether it's Nvidia post processing blah blah sharpening or memory resident mouse malware, assume ALL of the code is terrible and lag inducing in every single one of them, because it usually is on both counts. There's no reason to have mouse software installed.
> 
> In the old days, the Razer software turned native steps into interpolated, and plenty of them still do that in the year 2019. And what is the size of Logitech's software now? 1 Gigabyte? Do you think it takes that much space to operate a mouse? Of course it's not going to do anything useful for you unless you absolutely need to change default DPI, then just uninstall it if you do.


Exactly, and Razer support will tell you otherwise. Razer Synapse is known to cause extreme input lag. You can't even use some of their mice without it, and it persists even after uninstall; there is a way to remove it fully, but it's complicated. I had a Logitech G502 and a G900, and their drivers are very good; I didn't notice anything. Logitech has good drivers. Just don't use angle snapping or smoothing. But I don't like the shape of their mice.


> x7007


The G300 has very low click latency, by the way, and the G900 has very good sensor tracking and a fast response time. Unfortunately, its front skates sit higher than most, causing friction, and the shape is like a knee.
I have extremely large hands, and I tried the G403 in a showroom and it fit surprisingly well. You may want to look at the Logitech G403 Hero; it's currently popular. Or Cooler Master has a 50g mouse, which is also currently popular.


----------



## gunit2004

empl said:


> Exactly and razer support will tell you otherwise. Razer synapse is known to cause extreme input lag. And you can't even use some of their mice without it and it persist even after uninstall, there is way to remove it, but it is complicated. But i had logitech g502 and g900 and their drivers are very good, i didn't noticed anything. Logitech has good drivers. Just don't use angle-snapping = smoothing. But i don't like shape of their mice.


Yeah, people who actively use Synapse 2 or 3 on their system have no idea how hard they are gimping themselves lol. I was running BOTH for the longest time until I began troubleshooting why I was having input lag issues and Synapse was one of many culprits. Literally a night and day difference between having it installed or not.

HOWEVER, I feel like some Razer devices can benefit from having their official drivers installed. For example, I use a Razer Orbweaver keypad and at this point my hand is basically molded to the damn thing, I can't go back to using a regular keyboard for gaming and there are no other options on the market that I like. But anyway, the Orbweaver feels noticeably slower in response time when it is using whatever default drivers that Windows 10 gives it. So what I had to do was install Synapse, get the official drivers installed for it and then get rid of Synapse and it was fine.

I don't think Razer drivers themselves will cause any noticeable input lag but the Synapse software is absolute garbage when it comes to that.


----------



## CrucialNUG

x7007 said:


> Something weird I've found out just now.
> 
> Lowering the gpu clock and ram fixed my mouse input in instant. Lowering it to match without boost, for example my card without boost is 1759. With boost is like 1879 and can reach also to 2050-2025mhz max. so lowering it by at least -259 makes it to stay near 1759 without boost and the mouse lag input was improved by 200%. I mean it was accurate like pinpoint. Boost is piece of crap, it's exactly same like CPU boost and power saving/clock reducing which causes terrible stuttering and including lag input. so we found out the cause once and for all. No one in a single thread has been told to check this thing. Using nvidia debug mode is different than lowering the gpu, that I'm sure.


Is your fan speed lowering when you do this? I've noticed similar things regarding lower fan speed and more consistent mouse behavior.


----------



## NDUS

The orbit of Neptune was slightly closer to earth recently and my mouse movement is way smoother, guys! My mouse feels totally smoother when there are rainbows nearby also. I think we need to investigate the effects of astrology, dowsing, leprechauns, and psychic powers on mouse movement. 

What, empirical proof with MouseTester? Why would I provide that?


----------



## r0ach

CrucialNUG said:


> Is your fan speed lowering when you do this? Ive noticed similar things regarding lower fan speed and more consistent mouse behaivor.


....

It has ZERO to do with fan speed. It's because at idle, your motherboard and GPU drop the PCI-E port to 16x v1.1 instead of 16x 3.0 as some sort of power saving feature while also downclocking. Mouse movement is objectively, noticeably different between idle mode and full power mode. Whether it's entirely due to the PCI-E switch or the GPU downclock, or both, I don't know. A lot of people can notice a difference in forcing PCI-E 2.0 vs PCI-E 3.0 in BIOS, so the PCI-E mode probably does play a part.

I swear every single one of your posts is some sort of disinformation that I already know the answer to. You cannot accept the fact that people like me exist who have already worked through all of these variables, so you claim everything is caused by electromagnetism or dirty power and call everyone insane who doesn't agree with you.
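For reference, the idle downshift described above is large in raw bandwidth terms. A quick sketch of theoretical one-direction x16 throughput per PCIe generation (transfer rates and encoding overheads per the PCIe specs; whether link speed actually explains the reported feel difference is exactly what is disputed in this thread):

```python
# Theoretical per-direction bandwidth of a x16 link by PCIe generation,
# showing how big the idle drop to Gen 1 is in raw numbers.

GENS = {
    # generation: (gigatransfers/s per lane, encoding efficiency)
    "1.1": (2.5, 8 / 10),     # 8b/10b encoding
    "2.0": (5.0, 8 / 10),     # 8b/10b encoding
    "3.0": (8.0, 128 / 130),  # 128b/130b encoding
}

def x16_bandwidth_gbs(gen: str) -> float:
    gts, eff = GENS[gen]
    return gts * eff * 16 / 8  # GB/s: GT/s * efficiency * 16 lanes / 8 bits per byte

for gen in GENS:
    print(f"PCIe {gen} x16: {x16_bandwidth_gbs(gen):.2f} GB/s")
```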


----------



## x7007

r0ach said:


> ....
> 
> It has ZERO to do with fan speed. It's because at idle, your motherboard and GPU drop the PCI-E port to 16x v1.1 instead of 16x 3.0 as some sort of power saving feature while also downclocking. Mouse movement is objectively, noticeably different between idle mode and full power mode. Whether it's entirely due to the PCI-E switch or the GPU downclock, or both, I don't know. A lot of people can notice a difference in forcing PCI-E 2.0 vs PCI-E 3.0 in BIOS, so the PCI-E mode probably does play a part.
> 
> I swear like every single one of your posts is some sort of disinformation that I already know the answer to. You cannot accept the fact people like me exist that have already worked through all of these variables then claim everything is caused by electromagnetism or dirty power and call everyone insane that doesn't agree with you.



Yes, but I forced it to PCIe 3.0 and it still drops to 1.1 at idle. How can I force it through the registry or something?


----------



## SweetLow

NDUS said:


> The orbit of Neptune was slightly closer to earth recently and my mouse movement is way smoother, guys! My mouse feels totally smoother when there are rainbows nearby also. I think we need to investigate the effects of astrology, dowsing, leprechauns, and psychic powers on mouse movement.
> 
> What, empirical proof with MouseTester? Why would I provide that?


Relax, man. You're asking clowns not to clown.


----------



## empl

x7007 said:


> Something weird I've found out just now.
> 
> Lowering the gpu clock and ram fixed my mouse input in instant. Lowering it to match without boost, for example my card without boost is 1759. With boost is like 1879 and can reach also to 2050-2025mhz max. so lowering it by at least -259 makes it to stay near 1759 without boost and the mouse lag input was improved by 200%. I mean it was accurate like pinpoint. Boost is piece of crap, it's exactly same like CPU boost and power saving/clock reducing which causes terrible stuttering and including lag input. so we found out the cause once and for all. No one in a single thread has been told to check this thing. Using nvidia debug mode is different than lowering the gpu, that I'm sure.


You can force PCIe 3.0 in the Windows registry; I don't know off the top of my head what you have to edit. It works: you can check with GPU-Z that the GPU runs at PCIe 3.0 where it would normally drop to 1.1 or so. That's if you don't have the option to force it in the BIOS. I never had a problem with mouse lag when my GPU ran in a lower PCIe mode, and never heard of it, but everything is possible. By the way, I would be pretty mad if boost caused input lag for me, since you can't disable boost (Nvidia locked the BIOS) and you can't force a constant clock, which sucks!!! Now I wonder, isn't it possible your PSU is weak? And lowering the RAM clock lowers absolute timings and can reduce input lag. RAM latency is actually important too; you can tell the difference!


----------



## x7007

empl said:


> You can force pcie 3.0 in windows regedit, don't know from head what you have to edit. It works you can check with gpu-z, that gpu runs in pcie 3.0, while normally would drop to 1.1 or something. If you don't have option to force it in bios. But i never had problem with mouse lag - when my gpu ran in lower pcie mode and never heard of it, but everything is possible. Btw i would be pretty mad, if overclock caused input lag for me and since you can't disable boost, since nvidia locked bios, or you can't force constant clock, which sucks !!! Now i think isn't possible your psu is weak ? And yet lowering ram clock, lowers timmings and reduce input lag. Ram lag is actually important thing too, you can tell difference !



https://forums.evga.com/How-to-force-PCIE-Gen-3-in-Windows-10-m2372915.aspx
I will try it.

When I have three videos open using hardware acceleration through PotPlayer, meaning D3D11 + madVR, the mouse feels smooth and precise like on Windows XP. So yes, PCIe runs at x16 3.0 while the videos are open, and the mouse movement is improved by 100%!!

So I think PCIe really does have something to do with mouse movement. Please, everyone, test this.


----------



## r0ach

x7007 said:


> https://forums.evga.com/How-to-force-PCIE-Gen-3-in-Windows-10-m2372915.aspx
> When I have 3 videos opened which uses hardware acceleration through Potplayer means D3D11 + madvr I feel the mouse smooth and precise like windows xp. so yes PCIE is using X16 3.0 when the video is opened. and the mouse movement is improved by 100%!!


That's mostly Windows apps that play back video requesting a finer timer resolution, not the PCI-E switchover itself. There are tons of 3rd-party programs to force that lower timer resolution nowadays.


----------



## x7007

r0ach said:


> That's mostly Windows apps that playback video requesting a finer timer resolution and not the PCI-E switchover itself. Tons of 3rd party programs to force that lower timer resolution nowadays.


I tried that too, but it didn't change much for the mouse, because even at 15.6ms or whatever it should work the same.

Timer resolution is different from PCIe speed.

I found out it might be related to some Nvidia settings; some must not be touched, as in the screenshot you posted. I think High Quality screws some things up.
Still testing. Last time I tested, the mouse felt perfect: it stopped and moved instantly. I'll test again today when I'm home. If it stays like this, I'll note what everything looks like, restart, and see whether it persists. If something changes after a restart, then I know it must be registry-related, something set as Windows boots that keeps changing depending on what I'm doing while something else changes it back. It can't be that random.


----------



## CiselS

I fixed unstable 1000Hz:

just install Win 8.1.

https://www.overclock.net/forum/375...n-8-1-win-7-old-motherboard.html#post28268720


----------



## empl

CiselS said:


> I fix 1000hz unstable
> 
> just install win 8.1
> 
> https://www.overclock.net/forum/375...n-8-1-win-7-old-motherboard.html#post28268720


You can test that in MouseTester. But he says it changes when video is rendering and PCIe goes to 3.0. If it were just instability, he shouldn't feel a difference between that and no GPU load, yet he feels it every time.

As r0ach said, it can be timer resolution. Download Intelligent Standby List Cleaner, set the timer resolution to 0.5ms in that program, and compare how the mouse feels against setting nothing and just playing a movie. You also want to disable dynamic tick, as it causes acceleration so your mouse movement isn't 1:1.
To do that, open an elevated cmd: bcdedit /set disabledynamictick Yes
To undo it: bcdedit /deletevalue disabledynamictick

It is strange, because I notice no difference between PCIe 1.1 and 3.0.

By the way, can anyone find any more info on platformtick? I read it disables synthetic timers. It is not the same as useplatformclock, which selects the counter used by drivers and the OS to time precision-sensitive tasks; Microsoft says useplatformclock should only be used for debugging. There is something else: people said platformtick improved their mouse accuracy and changed the timer resolution to a more level value, e.g. from 0.499 to 0.5ms. When I turn it on, my mouse is more accurate, but it feels heavier, not necessarily laggy (hard to describe).

EDIT: It's probably because I am used to not using it, but the mouse is more accurate with it on. It may depend on the hardware configuration. For now I am pretty satisfied with platformtick on. I got 27 squares on mouseaccuracy.com on hard, tiny, 15 seconds, and I usually don't score that high. I also did a Windows repair, since I had some other issues with the mouse, to make sure everything is working fine...


I don't know which timers Windows uses normally; it uses several clocks. There is the RTC, for example, an old method of measuring time, and there is also the legacy timer, but I had never heard of synthetic timers before.

As for useplatformclock, I read on TweakHound a correspondence with an expert on timers, and he said you should never force useplatformclock on and should let Windows decide; it may be the same with platformtick. My mouse definitely feels better with platformtick off. On second thought, I don't know. Hard to say...


----------



## CiselS

empl said:


> You can test that in mouse tester. But he says, it changes upon video rendering and pcie going to 3. If it was unstable, he shouldn't feel difference between this and no gpu load. But feel that all times.
> 
> As roach said, it can be timer resolution. Download intelligent standby list cleaner. Than set timer resolution 0.5 in this program and compare how mouse feels against when you set nothing and you play movie. You also want disable dynamic tick as it cause acceleration and your mouse movement isn't 1:1.
> To do that open elevated cmd: bcdedit /set disabledynamictick Yes.
> To delete: bcdedit /deletevalue disabledynamictick
> 
> It is strange, because i have no difference when my pcie is in 1.1 and 3.0


I always test in an FPS game, so PCIe is always 3.0 (you play the game, not the desktop).

When I play Apex Legends it is really, really different on Windows 10.

Many games set the timer resolution to 0.5 themselves (e.g. CS:GO, Apex Legends).

I use EmptyStandbyList, same thing as Intelligent Standby List Cleaner.

bcdedit /set disabledynamictick Yes (I feel no difference)

No HPET:
https://i.imgur.com/Zpsyd3Y.png

Sorry for my bad English.


----------



## empl

I can feel a massive difference: the mouse accelerates and decelerates each time you move it, but I am very sensitive to it. It's possible you had it disabled before, or you have high DPI, or you're just not sensitive to it, so you can't tell. I used to play with Windows acceleration, aka Enhance Pointer Precision, before I had internet, and I didn't notice that it causes acceleration... That was before I played competitively, so I didn't care that much. Dynamic tick means Windows can change the timer resolution dynamically to save power when you're not doing anything demanding. The timer resolution is the window, the delay, before Windows can schedule pending work on the CPU. The highest is 15.6ms, so there is high latency; the lowest is 0.5ms, and you can tell a massive difference.

And as x7007 said, the thing is PCIe won't always run at 3.0 even in games, if the game doesn't utilize the GPU properly, e.g. DirectX 9 games. In SC2 I get like PCIe 2.0 or something...
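The 15.6ms vs. 0.5ms comparison above can be put in numbers. Assuming an event arrives at a uniformly random point within a tick period, its average wait for the next tick is half the period; a minimal sketch:

```python
# Average extra delay a timer-driven event picks up while waiting for the
# next tick: with uniformly distributed arrivals, the mean wait is half
# the tick period.

def mean_tick_wait_ms(resolution_ms: float) -> float:
    """Average wait (ms) until the next timer tick fires."""
    return resolution_ms / 2

print(mean_tick_wait_ms(15.6))  # 7.8 ms average at the coarse default resolution
print(mean_tick_wait_ms(0.5))   # 0.25 ms average at the finest setting
```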


----------



## CiselS

empl said:


> I can feel massive difference, mouse accelerates and decelerates, each time you move it, but i am very sensitive to it. It is possible you had it disabled from before. Or if you have high dpi, or else you are not sensitive to it, so you can't tell. I used to play with windows acceleration aka enhance mouse precision, before i had internet and i didn't noticed, that it cause acceleration... It was before i played competitively, so i didn't care that much. Dynamic tick means, windows can change resolution timer dynamically to save power, when you are not doing anything demanding. Timer resolution is window, delay before windows can update code to cpu. Highest is 15.6ms, so there is high latency. Lowest is 0.5, you can tell massive difference.
> 
> And what x7007 said, thing is pci-e not always will run in 3.0 even in games. If game doesn't utilize gpu properly e.g. Like direct 9 games. I have in sc2 like pcie 2 or something...


Excuse me, do you still change IRQ priority?

CMOS 1, EHCI 2, GPU 3? (IRQ8Priority or IRQ08Priority? IRQ23Priority, IRQ16Priority)

My Win32PrioritySeparation is set to 2.

If I set EHCI to 2, is that OK? (Don't kill my PC.)

I tried EHCI 1 but felt no difference.

I test with only one USB device (the mouse),

and play CS:GO aim_botz, 360 degrees, 100 kills.

https://i.imgur.com/hw3RlsX.jpg

I feel no difference (CMOS 1, EHCI 3). I will try system timer 1, CMOS 3, EHCI 4.

https://i.imgur.com/RxfAZAm.png

Should I set the GPU too? (More fps?)

By the way: I used MSI_util_v2 to set EHCI to high priority (not MSI mode); no difference.

Timer resolution is always 0.5.

https://www.overclock.net/forum/132...drivers-should-i-prioritize.html#post28127444

https://www.overclock.net/forum/132...drivers-should-i-prioritize.html#post28112298



r0ach said:


> Why are you not using legacy BIOS mode? It's better.


Excuse me, do you change IRQ priority?


----------



## empl

I don't remember exactly; I repaired the PC for another issue, not because of a bad tweak, and didn't set it again. I think I set 1. GPU, 2. CMOS, 3. USB. I felt less input lag, which may differ per hardware configuration. The "PriorityControl" key is supposedly a forgotten residue from the Windows NT days, but it may still work. I found that drivers themselves set IRQ priority in the registry, but in a different location: Computer\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\<hardware id of the device>\Affinity Policy - DevicePriority. MSI_util_v2 can do this, since you have it.
Win32PrioritySeparation is the foreground vs. background apps setting; it should be set to programs. It is under Control Panel/System/Advanced.
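As a sketch of the legacy tweak discussed above, here is a small generator for a .reg fragment targeting the classic PriorityControl key. The IRQ numbers are examples, and whether modern Windows still honors these values is precisely what is in question in this thread, so treat this as illustration only:

```python
# Sketch: emit a .reg fragment for the legacy IRQxPriority tweak.
# Key path is the classic PriorityControl location from the NT days;
# IRQ numbers below are examples, not recommendations.

def irq_priority_reg(priorities: dict[int, int]) -> str:
    """Build .reg text assigning each IRQ number a priority (1 = highest)."""
    lines = [
        "Windows Registry Editor Version 5.00",
        "",
        r"[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\PriorityControl]",
    ]
    for irq, prio in sorted(priorities.items()):
        lines.append(f'"IRQ{irq}Priority"=dword:{prio:08x}')
    return "\n".join(lines)

# e.g. CMOS/RTC clock on IRQ 8 first, a hypothetical EHCI controller on IRQ 23 second
print(irq_priority_reg({8: 1, 23: 2}))
```

Always check the actual IRQ assignments in Device Manager first, and back up the registry before importing anything like this.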

It depends; results can vary. It is hard to measure: sometimes the highest DPC execution time spikes, and the system is never under the same load even if you test in relatively similar conditions. You can still discern large differences, though. Run LatencyMon, look under drivers for usbport.sys, and check the highest DPC execution time. Make sure you max out the polling rate by doing fast circles with the mouse.

Yes, you can't damage the PC by setting this. Nevertheless, always back up the whole system to be sure! It's possible the setting is even ignored at the driver/hardware level, but it can work. LatencyMon support told me interrupt affinity can be ignored like that, but setting it in the registry works for me.

It won't give you more fps, but it eliminates micro-latencies, which still matters. About CMOS: its name in Device Manager is "System CMOS/real time clock" or something like that. It is an old device for measuring time; today Windows uses the HPET. I have the HPET in Device Manager, but strangely it has no IRQ. I was googling around, but this requires advanced hardware expertise, and I wasn't able to find anything concrete about why it has no IRQ or whether setting a priority for the CMOS would do anything.

You can still test DPC latency for the corresponding device's drivers after you change PriorityControl in regedit, or in that second location. Or go with what feels best. Change only one thing at a time.

EHCI should be in MSI mode, but it's possible it won't work if you have an old motherboard; mine doesn't. MSI, or better MSI-X if supported, reduces input lag a lot!!! Careful: if you set this for the wrong device, your PC won't boot, particularly for the High Definition Audio device on the graphics card. So always check the hardware ID in Device Manager and make a backup.

Check this: I made a *tweakguide* where I tried to list all the tweaks I know that have a noticeable impact, and for people to share their own. By the way, did you know that in CS:GO, the more fps you have, the lower the input lag? It's something about how the CS:GO engine works. Also, if you use Nvidia, you may want to set Ultra Low Latency Mode to On instead of Ultra; if your GPU isn't at 99% utilization, there is higher input lag with Ultra than with On. Someone tested that with a 1000fps camera. Also, although more fps reduces input lag, multicore rendering adds a couple of ms as well, so it's hard to tell which is better; maybe multicore wins if you have 300+ fps, or 500-600, but you probably won't. And I doubt the game is playable on a single core anymore, since they broke it. I used to play CS:GO at 137 fps minimum on a single core, before they broke performance with updates and badly designed maps. I was Supreme Master class, and I couldn't play on multicore; there was too much lag, I would have been gold...

And raw input adds smoothing in this game; no need to use it if you have disabled dynamic tick in bcdedit. Use 6/11 in Windows, disable Enhance Pointer Precision, and in-game set m_mousescale 0, m_mouse_exponent 1, m_mousespeed 0. And in regedit, under current_user/Control Panel/Mouse, delete everything from the SmoothMouseXCurve and SmoothMouseYCurve values (= smoothing); you can find these values backed up under local machine in case you want them back, or save them first...


----------



## Timecard

The reason you get lower latency with higher fps in CS:GO is that it samples an updated mouse position every frame.
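That point can be quantified: if motion is sampled once per rendered frame, the average extra wait before an update is about half a frame time, so higher fps shrinks it. A minimal sketch:

```python
# If a game samples the mouse once per rendered frame, a motion update waits
# on average ~half a frame time before being picked up; higher fps shrinks it.

def avg_sampling_delay_ms(fps: float) -> float:
    """Average wait (ms) between mouse motion and the next frame's sample."""
    frame_ms = 1000 / fps
    return frame_ms / 2

for fps in (60, 144, 300):
    print(f"{fps} fps -> ~{avg_sampling_delay_ms(fps):.2f} ms average sampling delay")
```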


----------



## jayfkay

Timecard said:


> The reason you would get lower latency with higher fps in csgo is because it gets an updated mouse position every frame.


Most games ******* work like that.


----------



## x7007

It seems the GPU FAN is related to any input lag on the computer. It's like power saving tied to a sensor that kicks in when the fan is on auto.

My mouse is extremely better with the fan at 100% instead of auto. And the weirder thing is that it's not the actual fan speed, because my GPU sits at 0% fan speed when idle and doesn't reach a certain temp.

That is why it's so random.




EDIT: ***. I've tested this at least 50 times now. Removing the US keyboard layout makes the mouse move properly. I tried using English (India) and it just works fine???? I have two languages installed, English (US) and Hebrew. There is also Hebrew (Standard); I don't know what the difference is. Someone here said installing different languages fixed his issue! I didn't believe it until I tested. You can do it live in real time and watch how the mouse behaves. Make sure after you install the language to go back to the main Time & Language screen.

Maybe the fix is just the act of removing and reinstalling, but test it.


----------






## CiselS

empl said:


> And raw input adds smoothing in this game, no need to use it, if you have disabled dynamic tick in bcdedit. And use 6/11 in windows and disable enhance point precision and in game M_mousescale 0, m_mouse_exponent 1, m_mousespeed 0. And in regedit current_user/control panel/mouse delete everything from MouseCurve(x,y) = smoothing, you can find these values backed under local machine, in case you want them back, or save it...


Sorry for my bad English.

Wow, so much information.

I need time to take it all in.

I tried setting EHCI to MSI mode, but it just died.

I know that in CPU-bound games, setting Ultra will increase input lag (I use On globally/for CSGO, and Ultra for GPU-bound games).

https://youtu.be/7CKnJ5ujL_Q?t=502

I watched this about CSGO's m_rawinput a long time ago:
https://www.mouse-sensitivity.com/forums/topic/342-csgo-m_rawinput-vs-rinput/

"" "m_rawinput 1" provides by far the most accurate and consistent interpretation of the signals from the mouse driver. ""

This is the first time I've heard that m_rawinput 1 adds smoothing,

but most pros use m_rawinput 1; the only one using 0 is coldzera.

I will try it.

There are no m_mousescale 0 or m_mouse_exponent 1 commands, by the way.

Is this smoothing?

https://youtu.be/cAMjZE7xY2Y?t=31

I know these gaming tweaks, thank you for sharing.

Thank you for replying.
------------------------------------------
What is mouse smoothing?
https://prosettings.net/library/what-is-mouse-smoothing/



x7007 said:


> EDIT: :::::: ***. I tested this at least 50 times now. Removing the USA Keyboard makes the mouse move properly. I tried to use USA India and it just works fine???? I have 2 languages installed. USA and Hebrew. There is also Hebrew (Standard). I don't know what is the difference. There was someone who said installing different Languages fixed his issue! I didn't believe until I tested. You can do it live in real-time and see how the mouse behaves. Make sure after you install the language to go back to the main Time&Language screen.
> 
> Maybe it's one thing, to remove and install. but test it.


I remember a post saying English (Philippines) made the mouse feel better.

But even when I remove English (USA), English (USA) always comes back.


----------



## empl

CiselS said:


> s


Check GPU usage using GPU-Z; I doubt CS:GO can max the GPU at 99% at all times (although it depends on the hardware you're using). It is a DirectX 9 game, so it doesn't utilize multiple cores properly.
Yeah, it is recommended to use raw input, but I don't use it in games. It makes the mouse feel heavy and I don't like it. I don't have acceleration, so I don't need it. I think raw input doesn't disable acceleration anyway: if I turn on Enhance Pointer Precision together with raw input, I can still feel it. In the case of CS:GO it turns on smoothing as well, though maybe they've changed that. You can't disable the smoothing individually; it is tied to m_rawinput 1. I don't play CS:GO anymore. Otherwise there should be lower input lag with raw input, because the game gets data straight from the mouse and doesn't have to go through Windows.

Btw, pro players don't always use ideal settings. I don't know if it's true, but according to config sites pros scale mouse sensitivity in game: 400 DPI and e.g. 2.5 in game. I tried this and my mouse was skipping so many pixels I couldn't kill easy AI, on a Razer DA Chroma. So copying everything pro players do isn't always best. You should always set DPI on the mouse and keep in-game sensitivity at 1.0 to disable software interpolation. But I doubt these online config sites are accurate. Though even in StarCraft 2 I saw a pro on stream adjust sensitivity in game, which is a mistake; you should never do that. Or some still use acceleration. They are probably used to it from the old 400 DPI days and too used to it to change it in game. Still, it is very strange: I was Supreme Master class, which is not that far from pro, and I was sometimes beating even Global Elite ranked players (or whatever that was called). And if I used in-game sensitivity I would be Bronze 3, no joke... I don't really know how they can play like this, if it's true that they use 400 DPI and e.g. 2 in game.

m_mouse_exponent is the speed to which the mouse is accelerated before scaling; you should turn that off, but now it only goes down to 0.010, so you should leave it at 1.0 = 1x = same sensitivity.
m_mousespeed 1 causes acceleration.
m_mousescale is by how much (default 0.04); it should be 0 for muscle memory.

But some people still use acceleration. Even with acceleration you can develop muscle memory (though it will probably be harder); people use what feels best to them.
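The pixel-skipping point above can be put into numbers. In Source games the angle turned per mouse count is m_yaw × sensitivity (m_yaw defaults to 0.022 degrees), so 400 DPI with a 2.5 multiplier turns in angular steps 2.5 times coarser than an equivalent setup with the multiplier at 1.0, even when the physical distance for a full turn is identical. A quick sketch (the DPI/sensitivity pairs are just illustrative examples):

```python
M_YAW = 0.022  # Source engine default: degrees turned per mouse count at sensitivity 1.0

def degrees_per_count(sensitivity):
    # Smallest turn the game can register for a single mouse count.
    return M_YAW * sensitivity

def cm_per_360(dpi, sensitivity):
    # Physical mousepad distance for a full turn:
    # counts needed for 360 degrees, divided by counts per cm.
    counts = 360.0 / degrees_per_count(sensitivity)
    return counts / (dpi / 2.54)

# 400 dpi @ 2.5 and 1000 dpi @ 1.0 give the same cm/360,
# but the first turns in angular steps 2.5x as coarse.
for dpi, sens in [(400, 2.5), (1000, 1.0)]:
    print(dpi, sens, degrees_per_count(sens), round(cm_per_360(dpi, sens), 1))
```

Same overall sensitivity, but the minimum turn you can make differs, and that granularity difference is what reads as "skipping".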


----------



## 508859

empl said:


> Check gpu usage using gpu-z, doubt cs go can max gpu on 99% at all times (although it depends on hardware you using). It is direct 9 game - doesn't utilize multicore properly.
> Yeah it is recommended to use raw_input, but i don't use it in games. It makes mouse feel heavy and i don't like it. I don't have acceleration, so i don't need it. I think raw input doesn't disable acceleration anyways, if i turn enhance mouse precision and raw input, i still feel it. In case of cs go, it turns on smoothing as well, but maybe they changed it. You can't disable smoothing individually, it is tied to raw_input 1. I don't play cs go anymore. Otherwise there should be lower input lag with raw_input, because game gets data straight from a mouse and doesn't have to go through windows.
> 
> Btw pro players they always don't use ideal setting. I don't know if it is true, from config sites: pros scale mouse sensitivity in game and use 400 dpi and in game 2.5 e.g. I tried this and my mouse was skipping that many pixels i couldn't kill easy ai. On razer da chroma. So copying everything pro players do, isn't always best. You should always set dpi on mouse and in game have 1.0 to disable software interpolation. But i doubt these online sites with config are accurate. Tho even in starcraft 2, i saw from stream pro adjust sensitivity in game, which is mistake. You should never do that. Or some still use acceleration. They are probably used from old times of 400 dpi and are to use to it to change it in-game. Still is it very strange, i was supreme master class that's not that far from pro, i was beating even global elite ranked players sometimes, or how that was called. And if i used in game sens i would be bronze 3 no joke... Don't really know how they can play like this, if it is true that they use 400 dpi and 2 in game e.g.
> 
> m_exponent is speed to which mouse will be accelerated before scalling, you should turn that off, but now it works only to 0.010, so you should leave it on 1.0 = 1x = same sens
> m_mousespeed 1 cause acceleration
> m_scale by how much = default 0.04 = should be on 0 for muscle memory
> 
> But some people still use acceleration, even with acceleration you can develop muscle memory, even it will be harder probably, but people use what feels best to them.


Most pro players (CS:GO ones at least) don't have a clue about the technical part. They just stick to whatever they know and don't waste time experimenting; same with mouse sensors, they mostly don't know and don't care.


----------



## Timecard

For language packs, each comes with different features (text-to-speech, speech recognition, handwriting); the ones you mention have fewer added features, which could be related to the discrepancy you're seeing. Maybe make a new forum post about it so as not to dilute the point of this thread.


----------



## gunit2004

I'm curious: when people here ran MSI_utilv2 for the first time, how many and which of their devices showed up as "High" priority (without them even touching any settings)?

On a fresh install of Windows 10 on an ASUS Z370 Maximus Hero X motherboard, my SATA controller & NVMe controller were both already set to High priority. Everything else was the usual "Undefined" setting.


----------



## James N

gunit2004 said:


> I'm curious.. when people here ran MSI_utilv2 for the first time.. how many and which of their devices showed up as "High" priority? (without them even touching any settings)
> 
> On a fresh install of Windows 10 on an ASUS Z370 Maximus Hero X motherboard, my SATA controller & NVME controller were both already set to High priority. Everything else was the usual "undefined" setting.


That is normal, usually those are the ones at high priority and everything else is undefined.


----------



## gunit2004

James N said:


> That is normal, usually those are the ones at high priority and everything else is undefined.


I've got to test it more, but aim feel in game (just tried it in Overwatch) seems a lot better when I set the SATA & NVMe controllers to Undefined like the rest of the devices. It doesn't seem to have any noticeable negative effects as long as you have virtual memory turned off in Windows 10 (by default Windows sets a system-managed amount of virtual memory on the drive your OS is installed on, so you would have to turn that off).


----------



## RamenRider

CiselS said:


> I forget a post say English(Philippines ) lets him feel mouse better.
> 
> but even I remove English(USA) English(USA) always back.



It was my post, which I found from here: https://forums.tomshardware.com/threads/input-lag-in-windows-10-and-games.2923337/#post-19973646

You just have to mess with your language bar settings. I installed English (Philippines), then uninstalled the other ones. You can right-click the language bar or just search for the setting.



x7007 said:


> It seems GPU FAN is related to any LAG INPUT on the computer. it's like a POWER SAVING within a SENSOR that causing it when AUTO.
> 
> My mouse is extremely better with FAN 100% instead of AUTO. and the weirder thing it's not the actual fan speed, because my GPU is 0% fan speed when idle and doesn't reach a certain temp.
> 
> EDIT: :::::: ***. I tested this at least 50 times now. Removing the USA Keyboard makes the mouse move properly. I tried to use USA India and it just works fine???? I have 2 languages installed. USA and Hebrew. There is also Hebrew (Standard). I don't know what is the difference. There was someone who said installing different Languages fixed his issue! I didn't believe until I tested. You can do it live in real-time and see how the mouse behaves. Make sure after you install the language to go back to the main Time&Language screen.


So you are saying max fan speed gives less input lag, along with English (India)? I'm still using a US keyboard layout, but just English (Philippines).


=============================================

I found another guide, guys; how good is it? Gonna test it out using a bootable Windows 10 DVD: https://www.reddit.com/r/Amd/comments/7yfji8/my_advice_for_a_modern_os_and_uefi_configuration/

===========

I've been a systems designer and analyst, software QA, and all-around IT specialist for 17 years. This is my job. I felt like being nice because it was on my mind and I happened to be configuring another system at the time. I immediately manually configure BIOS settings on every system I work on, and it's so automatic for me I don't even think about it anymore - but I realized that most people probably go with defaults and call it a day, and MAYBE set XMP. I consistently get better benchmarks on AMD hardware than most reviewers get online, and I felt I should share some of the reasons why instead of keeping it all to myself. It's important to understand that hardware manufacturers are often ahead of the curve, and OS and driver integration of new chip tech tends to lag behind. WDDM 2.0 was ready in 2007, for example.

I'm well known in my community and in general when it comes to this stuff as a systems integrator. https://pbs.twimg.com/media/C6mJabQWkAI7NYt.jpg:large That's me with the glasses. You probably recognize the guy in the sweater. 

It leaves a few things out that can improve performance of USB devices and general overall system latency, however, and doesn't mention that modern OSes should be installed on a pure EFI system to make use of all the fantastic features and performance benefits of the AMD architecture, going as far back as Bulldozer. A lot of the performance issues people experienced with that chipset and architecture had to do with poorly configured motherboards. Intel did a better job of forcing certain settings to be default by moving them into the chip back then. You could, however, get great performance out of an AMD system even back then by doing something similar to what I'm about to tell you.

I develop in VR as a hobby and have several headsets, so ultra low latency is really important to me. I also game and really enjoy under 4ms click to response rate.

Here's a quick guide for the FX series chipsets and ZEN and above etc. This is what I do when I reinstall or build a system.

Essentially, I recommend a fresh OS install of windows 10 FCU. This makes sure you don't have to do a bunch of conversion after the fact.

If you have a normal SSD and not NVME m.2 - splurge a little and get the upgrade, that's my recommendation - but any disk will work.

Once you have a disk you can do a fresh windows 10 install on, go download the windows 10 media for FCU.

However, we won't be making a USB, we'll be using an ISO and creating an EFI bootable disk using RUFUS.

https://www.microsoft.com/en-ca/software-download/windows10

https://rufus.akeo.ie/

Unplug all other drives except the one for the new install.

Go into your "bios"/EFI and load optimized defaults. Save and exit, and turn your computer off. Then, go back in. This is done because sometimes these settings require a full motherboard cycle and not a soft boot.

Next, make sure you have and install the latest bios for your board. For my gaming 5, this is F22b.

Next, enable XMP on your ram. Save and exit as before. You should actually see a "double reboot" in most cases - again this is due to changes to the cpu.

Next - here's all the big ones.

Under devices - disable serial port entirely. You're not using it and it's occupying your cpu as an addressed device. The change is tiny but we're talking about absolute maximum resources here right?

Disable the network adapter you're not using if you have two (and not using both). You can always turn it on later but there's no reason to include it if you never use it. Again - it's using address space you don't need. The benefit might not even be noticeable but we're talking about absolutely ideal settings.

Disable Legacy USB. Disable XHCI handoff. Disable HPET. Disable CSM. Disable EHCI handoff. You -may- want to enable "above 4g limit"

This is a change that allows 64-bit addressing of GPU memory. It doesn't affect performance most of the time unless you have a multi-GPU system - but it's possible that some modern WDDM 2.3 drivers combined with things like DX12 and Vulkan might actually see some performance benefit.

Set an aggressive fan curve so your system opts to cool rather than throttle 99% of the time.

Install windows 10 FCU. Marvel at the speed at which it installs, I promise it will shock you.

Allow windows to install ALL the drivers from windows update before manually installing drivers.

Go to AMD's website and install the chipset drivers and reboot. Then, install the GPU drivers and reboot.

Install your gpu's fan management tool (or use wattman) and be sure to allow the fan to be more aggressive. We don't want latency from card throttling. We all wear headphones. I'm not personally allergic to a little fan noise, and I value the lifespan of my hardware.

Go into device manager and find the SM bus controller. Right click, properties, update driver. point it at c:\amd\ and tell it to look in sub directories. You'll be surprised to find it finds an updated driver.

Install the manufacturer's tool for your SSD. This makes a huge difference people don't realize. NVMe SSDs have their own controller, and the Samsung software allows for major performance benefits. Non-NVMe SSDs can benefit from a caching scheme that allows for up to 2000mbps. Modern Windows 10 can make use of this.

You'll want to do the same with any other driver that has the word (microsoft) next to it that is a hardware device. Point it at the AMD chipset directory and see if it finds anything.

Your IDE/ATA should say "AMD SATA Controller". You should have AMD GPIO drivers.

Your USB should say "AMD USB 3" and "AMD USB 3.1" root hubs and controllers. You should see "AMD xHCI".

Sometimes the drivers from Microsoft Update ARE up to date from AMD, but this isn't always the case. Most of the time the drivers included in the chipset package but not installed by default are newer and remain that way for a while. This is required stuff on server 2016.

Disable hibernation in windows by command prompt as admin so your system reboots properly when it needs to:

powercfg /hibernate off

Run the following in command prompt as admin:

bcdedit /deletevalue useplatformclock

bcdedit /set disabledynamictick yes

Go into device manager and find "high precision event timer" Right click, disable. I'm actually re-evaluating the device manager change. Windows may have released something in the latest OS that is helping this, but maybe not.

Why? Deferred procedure calls (DPC latency) allow programs to queue actions to be done quickly in the processor scheduler, and they hang the whole computer until they get processed.

(And HPET is usually in the BIOS... you disabled CSM, right? We have EFI now and a modern OS. We don't need BIOS stuff.)

Disabling HPET allows an unrestricted input output to occur and results in a very raw and extremely responsive connection between you and your machine. It also removes a ton of micro-stuttering and screen tearing.

1000 us = 1 ms. HPET ON: between 100-150 us delay. HPET OFF: between 5-15 us delay.

3-4 frames per second lost, single card with HPET off; chance of stuttering decreased accordingly.

Even though it doesn't seem like much, imagine that every action and each PC component is affected by that same 100-150 us delay. It can add up to many milliseconds and significant performance loss. The end result, on higher-end systems, is another step forward: crisper and snappier inputs and actions across the board.
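The per-call delay figures quoted above are the sort of thing you can sanity-check from user space: time a large batch of back-to-back timer queries and divide. A minimal sketch in Python (`time.perf_counter()` maps to QueryPerformanceCounter on Windows, whose backing clock source is exactly what the HPET setting changes; absolute numbers will vary per system and include Python call overhead):

```python
import time

def timer_call_overhead_us(n=200_000):
    # Average cost of one timer query, in microseconds,
    # measured over n back-to-back calls.
    start = time.perf_counter()
    for _ in range(n):
        time.perf_counter()
    elapsed = time.perf_counter() - start
    return elapsed / n * 1e6

print(f"~{timer_call_overhead_us():.3f} us per timer call")
```

Run it twice, once with HPET forced on and once with it off, to see whether your system actually shows the difference claimed here.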

All the above changes provide a baseline for the best possible stock performance out of your system. You WILL notice a pretty big difference in lots of little ways - I promise.


----------



## CiselS

empl said:


> Check gpu usage using gpu-z, doubt cs go can max gpu on 99% at all times (although it depends on hardware you


Thank you for the information.

I checked GPU load in MSI Afterburner; CS:GO is a CPU-heavy game.

I didn't play SC2, but I play LoL, and that game uses the Windows sensitivity.

At first I didn't know,

so I was using 400 DPI with max in-game sensitivity to develop muscle memory (= 800 DPI with acceleration),

up until my competitive level, Platinum.
-
On my old motherboard everything is Undefined.


----------



## empl

RamenRider said:


> Disabling HPET allows an unrestricted input output to occur and results in a very raw and extremely responsive connection between you and your machine. It also removes a ton of micro-stuttering and screen tearing.


If you disable HPET, what timer exactly is used? Besides HPET, I have 2 more timer devices in Device Manager (at least those I found; maybe there are more, or some aren't visible).
They are: System Timer and System CMOS/Real Time Clock, the latter being an old device for measuring time, so I doubt it is used. I wasn't able to find any concrete info about System Timer. I suspect it is the legacy timer; it has an I/O address and an IRQ number. I don't have a gaming mobo (currently a Gigabyte GA-B75M-D2V) and I am using HPET, but I still have very low ISR and DPC latency. Disabling it doesn't improve my DPC latency; maybe it even makes it worse, and the mouse feels worse. But it varies per system.

Btw, 5-15 us DPC latency off-load? What is your DPC latency under load? There are some good mobos which get <100 us under load on some systems, but drivers can usually spike to a couple of hundred; my Nvidia drivers usually hit around 600 us. For example, for the ASRock Z390 Phantom Gaming-ITX/ac, AnandTech measured less than 100 us. But these tests are very relative, as each hardware configuration is different, and it may not be clear whether the reported values are from before or after tweaking.

Btw, why doesn't HPET have an IRQ? I read in an old book, "Windows 7 Annoyances", that users had good experience giving the highest interrupt priority to "CMOS/Real Time Clock", or to the GPU, under the "PriorityControl" key located at HKLM\SYSTEM\CurrentControlSet\Control. I can feel the difference if I prioritize the GPU and USB. Even measuring usbport.sys, the highest execution time got lower while I maxed the polling rate, though this value may fluctuate. It is hard to take these values seriously, as you can't exactly replicate system load at one point in time. But I am very sensitive to input lag, so I can usually feel the difference and go by that. Drivers also adjust IRQ priority themselves, under HKLM\SYSTEM\CurrentControlSet\Enum\<hardware ID of device>\Device Parameters\Interrupt Management\Affinity Policy (DevicePolicy/DevicePriority). E.g. SATA automatically gives itself high priority, and when you set it back to Undefined, input lag improves. One person here was saying the same thing.

I was even told by LatencyMon support that interrupt affinity can be ignored at the hardware/driver level. I was skeptical about setting this in regedit at first, partly because it never worked for me; I had to set the GPU to MSI mode for some reason in order for it to work. Now I don't have everything scheduled on core 0. It's the same with interrupt priority: setting it in the registry works. Some drivers even put themselves at high priority, which is probably a mistake and doesn't allow the system to schedule interrupts properly, as putting them at Undefined reduces input lag. It wouldn't surprise me, as badly coded drivers are a source of high DPC latency. Still, drivers/hardware supersede values from the registry; otherwise having it improperly configured would slow the system down significantly. But it works, as we can see, at least sometimes: until it is overridden by drivers/hardware etc. It still helps to configure it manually.


----------



## CiselS

empl said:


> .


https://youtu.be/lFn1eFOtzpA?t=377

On my PC, enabling HPET makes the FPS really bad; every PC is different.

You can download WinTimerTester to test:

https://www.overclock.net/forum/78-pc-gaming/1487105-want-higher-fps-what-timer-do-you-use.html

I'm using bcdedit /set disabledynamictick yes / bcdedit /deletevalue tscsyncpolicy / bcdedit /deletevalue useplatformclock

I feel no difference on TSC.
--
Now I am testing xHCI mode (enabled) + xHCI legacy support (enabled):

https://i.imgur.com/WDs8Hpf.png

Setting interrupt priority to High feels no different (I will test more).

xHCI mode (enabled) + xHCI legacy support (enabled) makes the mouse feel good to control.

Maybe xHCI legacy support means USB 3.0 support (my BIOS translates it as USB 3.0 support; the English label is xHCI legacy support).

Don't change your EHCI controller to MSI mode, because USB 1.0/2.0 does not support MSI mode.

Your mouse will feel really, really bad if you do it. (If you did this: go into the BIOS and enable xHCI mode, back on the desktop remove EHCI in Device Manager, then go back to the BIOS and disable xHCI mode.)

I will test EHCI vs xHCI more.

Btw: you can safely switch your GPU and High Definition Audio Controller to MSI mode.

https://forums.guru3d.com/threads/w...terrupts-msi-tool.378044/page-57#post-5747303

--
PriorityControl

When I first tested changing EHCI priority I felt no difference (EHCI 1 or CMOS 1).

I will test more.

--
I get stutter only in CS:GO (like this gif: https://streamable.com/xy546).

I always set timer resolution to 0.5 (CS:GO and Apex Legends on Win10 default to 0.5).

Changing it to 1 fixes all the stutter (CS:GO and Apex Legends on Win8.1 default to 1).

This happens on Win10 too.

Maybe it's my PC's problem.


----------



## Athrutep

Use TimerBench and test every combination on your system yourself. The results differ from system to system, and other people can't tell you what's best for your system; it has to be tested.

https://www.overclockers.at/articles/the-hpet-bug-what-it-is-and-what-it-isnt


----------



## empl

> CiselS


Supposedly, from correspondence with a timer expert, you shouldn't force platformclock on; even the Microsoft website says it is for debugging only. https://www.tweakhound.com/2014/01/30/timer-tweaks-benchmarked/
But you can use useplatformtick; that actually makes the mouse more accurate, and its purpose is to disable synthetic timers. EDIT: While this is true, I also found in the Microsoft docs that platformtick should likewise be used only for debugging, and the timer expert says to let Windows choose what to use (he said that about platformclock; I don't know if the same applies to platformtick). https://docs.microsoft.com/en-us/windows-hardware/drivers/devtest/bcdedit--set
Another expert said he uses it (on this page). But I find myself turning it off every time: after using it a while, when I turn it off, it feels like there is a lot less input lag. Same with HPET off... On my system, HPET off/on makes no difference performance-wise in TimerBench. So I think platformtick should be off, unless it works better on your hardware.
tscsyncpolicy is for debugging only as well.

Oh, and by the way: yes, USB 2 supports MSI, but only on newer chipsets. Yeah, xHCI is trash; you should use USB 2. r0ach said that with USB, the higher the throughput, the higher the latency. Also, PS/2 is better, but not many mice and keyboards support it, and there are no active adapters, only mega-expensive ones for like $1000. Also, the higher the polling rate, the smaller the difference between PS/2 and USB, so if you use 500 Hz+ there is less difference in latency.
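That polling-rate point follows directly from the math at the top of the thread: the poll interval is the worst-case added latency and, for uniformly timed events, about half of it is the average, so going from 125 Hz to 1000 Hz shrinks the fixed cost that PS/2's interrupt-driven input avoids. A quick illustration:

```python
def poll_latency_ms(rate_hz):
    # Worst-case and average latency added by fixed-rate polling:
    # an event can land anywhere in the interval, so the average wait is half of it.
    interval = 1000.0 / rate_hz
    return interval, interval / 2

for rate in (125, 500, 1000):
    worst, avg = poll_latency_ms(rate)
    print(f"{rate:5d} Hz: worst {worst:.1f} ms, avg {avg:.2f} ms")
```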

Other things I know: I've never had an issue with timer resolution at 1 or 0.5, no matter what I set.



Athrutep said:


> Use timerbench and then test every combination on your system yourself. The results differ from system to system. And other people can't comment on whats best for your system, it has to be tested.
> 
> https://www.overclockers.at/articles/the-hpet-bug-what-it-is-and-what-it-isnt


Thanks, I knew a similar program but didn't know what it was for. So it showed that without HPET my system is a tiny bit better, but on some systems HPET was even 7 times slower. The more timer calls, the better. I don't know about the DPC latency of other drivers when disabling HPET, though. HPET is said to be more accurate! But DPC latency is hard to measure; it is never the same from one moment to the next. I had a feeling that without HPET I had more DPC latency, but now I tested it again and it was about the same; it could have been something else...

EDIT: Now I've tested it a couple more times and TimerBench reports different results each time, e.g. 540k timer calls, or 510k. Slightly different frame times and FPS, and the interval is sometimes off by about 1 us, with nothing running and HPET off. So I don't know how accurate it is. It is understandable; it will never be exactly the same. I wanted to test interrupt priority for the System Timer and compare results, but it fluctuates quite a bit, about 30k, so I doubt I will get anything from this.

Also, I have 2 timers in Device Manager: System CMOS/Real Time Clock and System Timer. I don't know which of these is the TSC, or whether there is a hidden device not showing (I have checked "show hidden devices"). But I usually go with what feels best anyway. Also interesting: HPET has no IRQ number, and when I disable it, it disappears.

This TimerBench is strange; each time I run it I get fewer timer calls: previously 490k and now 488k, while before that I got 510k. Seems like it is broken. Okay, never mind; I got a gradually lower score about 5 times in a row, so it just seemed that way. Now I got around 520k again with nothing running. Still, it fluctuates, so I don't think I will be able to find out.
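The run-to-run scatter described above can be quantified rather than eyeballed: if repeated runs swing by tens of thousands of calls, any tweak whose effect is smaller than that spread is indistinguishable from noise. A tiny helper (the sample values are hypothetical, chosen to match the magnitudes mentioned in the post):

```python
from statistics import mean, stdev

def spread(samples):
    # Mean, standard deviation, and relative spread of repeated benchmark runs.
    m = mean(samples)
    s = stdev(samples)
    return m, s, s / m

# Hypothetical TimerBench "timer calls" results, roughly matching
# the 488k-540k range reported above.
runs = [540_000, 510_000, 490_000, 488_000, 520_000]
m, s, rel = spread(runs)
print(f"mean {m:.0f}, stdev {s:.0f}, relative spread {rel:.1%}")
```

A single-run difference well inside a couple of standard deviations of this spread can't be attributed to whatever tweak is being tested.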


----------



## RamenRider

empl said:


> If you disable HPET, what timer is exactly used ? Besides HPET, i have 2 more devices in device manager (at least those i found, maybe there are more of them, or some aren't visible maybe).
> They are: system timer and cmos/ real time clock, which is old device for measuring time, doubt it is used. I wasn't able to find any concrete info about system timer. I suspect it is that legacy timer, it has i/o address and irq number. I don't have gaming mobo, i have currently gigabyte ga-b75m-d2v and i am using hpet. But i have still very low isr and dpc latency. Disabling it doesn't improve my dpc latency, maybe it even makes it worse and mouse feels worse. But it vary on each system.
> 
> Btw 5-15 dpc latency offload ? What is your dpc latency in-load ? There are some good mobos, which have <100us in load on some systems, but usually drivers can spike even to couple hundred. My nvidia drivers usually have like 600us. For example asrock z390 phantom gaming-itx/ac, anandtech measured least than 100us. But these test are very relative, as each hardware configuration is different. And it may not be clear from tests, if these values are before, or after tweaking.
> 
> Btw why hpet doesn't have irq ? I read in old book: "Windows 7 annoyances" that says - users had good experience to give highest interrupt priority to "cmos/ real time clock", or to gpu. Under "PriorityControl" key, which is located at HKLM\SYSTEM\CurrentControlSet. I can feel the difference, if i prioritize gpu and usb. Even measuring usbport.sys, highest execution time got lower, while i maxed polling rate. But this value may fluctuate. It is hard to take these values seriously, as you can't exactly replicate system load at one point in time. But i am very sensitive to input lag, so i can usually feel the difference and go by that. Even drivers adjusts irq priority under HKLM/system/current control set/enum/"hardware id of device"/affinity policy: device policy. E.g. sata automatically adjust it and give itself high priority, when you put them to undefined input lag improves. Same thing was saying 1 person here.
> 
> Even i was told interrupt affinity can be ignored on hardware/driver level by latencymon support. I was skeptical setting this in regedit at first, partly because it never worked for me. I had to set gpu to msi for some reason in order for it to work. Now i don't have everything scheduled on core 0. Same thing is with interrupt priority, setting it in registry works. Even some drivers put themselves to high priority, which is mistake probably and doesn't allow system to schedule interrupts properly, as putting it to undefined reduces input lag. It wouldn't surprise me, as badly coded drivers are source of high dpc latency. Still drivers/hardware supersede values from registry. Otherwise having it improperly configured would slow down system significantly. But it works, as we can see. Minimally at least sometimes: until it is overridden by drivers/hardware etc. But it still helps configure it manually.


I'm sorry bro I just don't have the answers. But I found this thread that may be of use to you. 

This one talks about how it breaks FPS counters. https://www.reddit.com/r/Amd/comments/eoutjw/a_reminder_disabling_hpet_is_snake_oil/

This one is just a list I made. https://www.reddit.com/r/Amd/comments/epl1j3/ramds_best_threads_bftbugs_fixes_and_tips_v1/


----------



## 508859

RamenRider said:


> I'm sorry bro I just don't have the answers. But I found this thread that may be of use to you.
> 
> This one talks about how it breaks FPS counters. https://www.reddit.com/r/Amd/comments/eoutjw/a_reminder_disabling_hpet_is_snake_oil/
> 
> This one is just a list I made. https://www.reddit.com/r/Amd/comments/epl1j3/ramds_best_threads_bftbugs_fixes_and_tips_v1/


https://i.imgur.com/0Synuy5.png

Difference between HPET on/off in Win10 on an 8700K with Z370.
Mind the frametimes; the user experience is just night and day.


----------



## r0ach

RamenRider said:


> I'm sorry bro I just don't have the answers. But I found this thread that may be of use to you.
> 
> This one talks about how it breaks FPS counters. https://www.reddit.com/r/Amd/comments/eoutjw/a_reminder_disabling_hpet_is_snake_oil/
> 
> This one is just a list I made. https://www.reddit.com/r/Amd/comments/epl1j3/ramds_best_threads_bftbugs_fixes_and_tips_v1/


Lots of pro-HPET comments in there completely ignore the fact that HPET is a higher-overhead timer, so it is going to have worse performance by default than invariant TSC (ITSC). Whether framerate counting is accurate or not with it disabled is an entirely separate issue.


----------



## RamenRider

Oh yeah I forgot about this thread.



Bearybear said:


> Overall, regarding the effect on FPS, the differences I saw were so small that they can only be put down to margin of error, but this is the conclusion I've come to:
> 
> 
> *Timers*
> 
> _High Precision Event Timer is enabled or disabled in the BIOS
> bcdedit is configured via an elevated command prompt_
> 
> *TSC+LAPIC - Seems to cause no latency or stutter, input and output are smooth* (normally the default)
> _High Precision Event Timer: Disabled
> bcdedit /deletevalue useplatformclock_
> 
> *LAPIC - Seems to cause stutter but no latency*
> _High Precision Event Timer: Disabled
> bcdedit /set useplatformclock true_
> 
> *TSC+HPET - Seems to cause latency and stutter*
> _High Precision Event Timer: Enabled
> bcdedit /deletevalue useplatformclock_
> 
> *HPET - Seems to cause latency but no stutter, input and output are very, very smooth*
> _High Precision Event Timer: Enabled
> bcdedit /set useplatformclock true_
> 
> The following I've probably just made up:
> 
> I think the TSC timers are relative to each processor, the LAPIC timer is relative to the system bus, and HPET is external. TSC timers can't be used alone because they don't stay in sync, hence why HPET and LAPIC are either used instead of TSC, or used as well so that the TSC timer's very low latency can be taken advantage of without the issue of them going out of sync. I think that TSC and LAPIC derive their times from the same clock/crystal, so they fit quite nicely together, but HPET doesn't, and this is probably part of the reason why using HPET causes problems with latency and stutter, since it's not going to be in sync with everything. The purpose of the really high frequency might even be just an attempt to mitigate the sync and latency issues with HPET, rather than any sort of performance improvement, but the high frequency actually causes issues of its own, especially when HPET is used on its own, which is probably why it's not used by default.
> 
> 
> *Time stamp counter synchronization policy*
> 
> Setting this to Enhanced either had no effect or caused stutter, from what I now understand Windows already picks the best setting for this so it should be left alone.
> 
> To force setting to enhanced:
> _bcdedit /set tscsyncpolicy Enhanced_
> 
> To remove the forced setting:
> _bcdedit /deletevalue tscsyncpolicy_
> 
> 
> *Dynamic timer tick*
> 
> Neither setting had any noticeable effect, but it's documented to have caused issues before; it also offers nothing of any real benefit and isn't used or needed in Windows 7 or earlier, so I've chosen to disable it.
> 
> To disable:
> _bcdedit /set disabledynamictick yes_
> 
> To enable:
> _bcdedit /deletevalue disabledynamictick_
> 
> 
> So in the end it would seem (at least in my case) that messing with these settings offers nothing of benefit other than the chance to correct the often overlooked and ignorantly misconfigured HPET BIOS setting. ^.^


----------



## x7007

RamenRider said:


> Oh yeah I forgot about this thread.


I am still stuck on whether to keep HPET enabled in the BIOS. The only way I see it change is when I use bcdedit /set useplatformtick yes; then I see the timer at 0.4888 instead of 0.5, or 0.9888 or so instead of 1.0, when HPET is disabled.
But if I don't use this command then it is always the same, HPET enabled or disabled: 1.0 or 0.5 or 15.625 like it should be.
That's on my AMD 1950X, Gigabyte X399 Aorus Gaming 7. Any idea?

And something weird on my Intel 3770K with an Asus mobo: even with HPET enabled, no matter what, when using an emulator like BlueStacks the timer is 0.9765 with the emulator running,

and it's 1.0 when not running the emulator. Can't understand that thing...


----------



## ylpkm

x7007 said:


> I am still stuck on whether to keep HPET enabled in the BIOS. The only way I see it change is when I use bcdedit /set useplatformtick yes; then I see the timer at 0.4888 instead of 0.5, or 0.9888 or so instead of 1.0, when HPET is disabled.
> But if I don't use this command then it is always the same, HPET enabled or disabled: 1.0 or 0.5 or 15.625 like it should be.
> That's on my AMD 1950X, Gigabyte X399 Aorus Gaming 7. Any idea?
> 
> And something weird on my Intel 3770K with an Asus mobo: even with HPET enabled, no matter what, when using an emulator like BlueStacks the timer is 0.9765 with the emulator running,
> 
> and it's 1.0 when not running the emulator. Can't understand that thing...


You can also check with TimerBench 1.4: see how many calls per second can be made, and compare on vs. off.
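For a rough sense of what TimerBench's "calls per second" number measures, the same idea can be sketched in a few lines of Python using `time.perf_counter` (which on Windows is backed by QPC). This is an illustration of the metric under those assumptions, not a substitute for TimerBench:

```python
import time

def timer_calls_per_second(duration_s=1.0):
    """Spin on time.perf_counter for `duration_s` seconds; return the call
    rate and the smallest nonzero interval observed between two calls."""
    calls = 0
    min_delta = float("inf")
    start = prev = time.perf_counter()
    now = start
    while now - start < duration_s:
        now = time.perf_counter()
        calls += 1
        delta = now - prev
        if 0 < delta < min_delta:
            min_delta = delta
        prev = now
    return calls / (now - start), min_delta

rate, tick = timer_calls_per_second(0.25)
print(f"{rate:,.0f} calls/s, smallest observed step {tick * 1e9:.0f} ns")
```

A slower or higher-overhead underlying timer (e.g. forced HPET) should show up here as fewer calls per second and a longer smallest step, which matches what posters report from TimerBench.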


----------



## x7007

ylpkm said:


> Can also check with TimerBench 1.4, see how many calls per second can be made, and compare on vs off.



Ye, I will try.

So for me, the main 2 fixes for mouse lag are GPU clock on the desktop (less important in games because it would run at max) and Windows languages: changing from English/US to English (Philippines) fixed it.

I played Rage 2, Call of Duty MW 2019 and Heroes of the Storm, and I can feel the difference with no lag now; it's night and day.

I don't see much difference between on/off,

but it seems On is better because of more calls.


----------



## ylpkm

x7007 said:


> ye I will try.
> 
> So for me, the main 2 fixes for mouse lag are GPU Clock on desktop, less important on games because it would run on max and Windows Languages, changing from English/US to English Philipines fixed it.
> 
> I played Rage 2 and Call Of Duty MW 2019 and Heroes of the Storm and I feel the differences between the no lag now, it's day and night.
> 
> I don't see much difference between on/off
> 
> But it seems On is better because more Calls


HPET isn't fully on; hit the enable HPET button in TimerBench and reboot, then it will actually be on. Use the same button to disable it afterwards.

When I enabled HPET with the button, my calls/s dropped a huge amount, and the call interval was way longer.


----------



## x7007

ylpkm said:


> Hpet isnt fully on, hit the enable hpet button in TimerBench and reboot, then it will actually be on. Use the same button to disable afterwards.
> 
> When I enabled hpet with the button, my calls/s dropped a huge amount, and call interval was way longer.


Did you read the last roach post here, and the many other threads? It was said never to force HPET. I didn't want to press the HPET button, because what I want to know is whether enabling or disabling HPET in the BIOS has any effect on Windows, and how, without ever forcing it. Enabling the command means you abandon invariant TSC, which is the fastest and newest timer there is now. I did test forcing it a long time ago and I got the same results as you: fewer calls and lower fps.


----------



## Athrutep

x7007 said:


> Windows languages: changing from English/US to English (Philippines) fixed it.



If you change the year to 1990 (the earliest year available under Windows 10) it makes it even better.


----------



## x7007

Athrutep said:


> If you change the year to 1990 (earliest year available under windows 10) it makes it even better.


I am not sure if you're serious or trolling, but it's not something valid, because you need to have the real date. So it's not something to be tested anyway.


----------



## Athrutep

x7007 said:


> I am not sure if you serious or trolling but it's not something validate because you need to have real date. so it's not something to be tested anyway.


It's the same level of seriousness as the language thing.


----------



## x7007

Athrutep said:


> its the same level of seriousness as the language thing.


But how would this work out? Without the real date nothing will work properly... we can't permanently use that.


----------



## RamenRider

x7007 said:


> I am still stuck on whether to keep HPET enabled in the BIOS. The only way I see it change is when I use bcdedit /set useplatformtick yes; then I see the timer at 0.4888 instead of 0.5, or 0.9888 or so instead of 1.0, when HPET is disabled.
> But if I don't use this command then it is always the same, HPET enabled or disabled: 1.0 or 0.5 or 15.625 like it should be.
> That's on my AMD 1950X, Gigabyte X399 Aorus Gaming 7. Any idea?
> 
> And something weird on my Intel 3770K with an Asus mobo: even with HPET enabled, no matter what, when using an emulator like BlueStacks the timer is 0.9765 with the emulator running,
> 
> and it's 1.0 when not running the emulator. Can't understand that thing...





ylpkm said:


> Can also check with TimerBench 1.4, see how many calls per second can be made, and compare on vs off.





x7007 said:


> ye I will try.
> 
> So for me, the main 2 fixes for mouse lag are GPU clock on the desktop (less important in games because it would run at max) and Windows languages: changing from English/US to English (Philippines) fixed it.
> 
> I played Rage 2 and Call Of Duty MW 2019 and Heroes of the Storm and I feel the differences between the no lag now, it's day and night.
> 
> I don't see much difference between on/off
> 
> But it seems On is better because more Calls


According to this guy, it's better to have HPET on in the BIOS and off in the OS, to keep a clean timer resolution of exactly 0.5 instead of 0.48 or some other uneven number that causes desync problems, which contribute to stuttering.




What do you think about this, roach?


r0ach said:


> Lots of pro-HPET comments in there that completely ignore the fact HPET is a higher overhead timer so is going to have worse performance by default than ITSC. Whether framerate counting is accurate or not with it disabled is entirely a separate issue.





Athrutep said:


> If you change the year to 1990 (earliest year available under windows 10) it makes it even better. Its the same level of seriousness as the language thing.


But I'm serious.

Also, here is my TimerBench with HPET on in the BIOS and just bcdedit /set useplatformtick yes. Feels amazing in game. Currently testing on the newest Win 10.


----------



## 508859

RamenRider said:


> According to this guy.


This guy is like roach: everything he mentions in his videos HAS DRAMATIC CRITICAL NOTICEABLE OBVIOUS IMPACT ON INPUT LAG.


----------



## Axaion

numberfive said:


> this guys is like roach. everything that he mentions in his videos HAS DRAMATIC CRITICAL NOTICEABLE OBVIOUS IMPACT ON INPUT LAG


Well, at least "this guy" provides some evidence that whatever tweaks he does do in fact affect system latency, or mouse polling at the very least.

That's a pretty big step up from calling people cavemen.


----------



## x7007

I even found a better English setup for me, without anything extra included, just the keyboards for English and Hebrew. I also removed all language packs using the command lpksetup /u, which removes only the language pack selectable as the Windows display language, which Windows kept installing automatically.

So there is no handwriting, speech or any other crap installed, only the keyboard and the US language. I chose English (Israel) and the US keyboard instead of English (Philippines). It took like 1 min to remove the Hebrew language pack on an SSD... so it's not needed.

Also, I think I should disable this TYPING.


----------



## jayfkay

I followed this suggestion, makes sense to me.
What kinda value is 0.48???? 0.5 makes more sense.


----------



## Timecard

x7007 said:


> I also removed any language Packs using this command lpksetup /u which will remove only the Language pack to choose as Windows Display Language which Windows kept installing automatically.


Good find


----------



## Athrutep

x7007 said:


> I even found a better English for me without anything included, just the keyboard for English and Hebrew. I also removed any language Packs using this command lpksetup /u which will remove only the Language pack to choose as Windows Display Language which Windows kept installing automatically.
> 
> So there is no handwriting, speeching or anything crap installed. Only Keyboard and US language. I chose English ( Israel ) and US Keyboard instead of English Philipines. It took like 1 min to remove the Hebrew Language pack on SSD..... so no needed.
> 
> Also, I think I should disable this TYPING.


So are you gonna provide a plot showing the difference proving it actually does something or?


Here are mine; it makes no difference at all in terms of polling precision. And I have also tested different combinations.

https://imgur.com/a/2YpDOBP


----------



## 508859

Athrutep said:


> So are you gonna provide a plot showing the difference proving it actually does something or?
> 
> 
> Here are mine, does no difference at all in terms of polling precision. And i have also tested different combinations.
> 
> https://imgur.com/a/2YpDOBP


Even in theory, it cannot have any impact on polling precision.

The idea here is that some under-the-hood services might be permanently listening for input from an HID device, and some languages do not support this functionality. This could (very theoretically) cause some input lag or inconsistency in cursor behavior, but it has nothing to do with how the USB controller polls the device.


----------



## Athrutep

numberfive said:


> even in theory, it cannot have any impact on the polling precision
> 
> the idea here is that some under the hood services might be permanently listening for an input from an HID device, and some languages do not support this functionality. this could cause (very theoretically) some input lag or an inconsistency of cursor behavior, but it has nothing to do with how USB controller is polling the device.


Why is he posting that in this thread then? And why has no one done some testing on this yet, so we have actual results to look at?

At least you provided an explanation on this matter, that makes sense. I still have to see any conclusive evidence that it is causing issues though.


----------



## x7007

Athrutep said:


> Why is he posting that in this thread then? And why has no one done some testing on this yet, so we have actual results to look at?
> 
> At least you provided an explanation on this matter, that makes sense. I still have to see any conclusive evidence that it is causing issues though.


Just post whatever findings you have. I wouldn't mind redoing everything now, after I've found that languages affect the mouse, even a laptop touchpad; the mouse felt like it had some muscle disease, and after installing the same languages I felt an instant improvement. This thread is for anything mouse related, polling or not. Also, some people only view this thread and not the other one, the roach BIOS settings guide.


----------



## Athrutep

x7007 said:


> Just post whatever finding you have. I wouldn't mind doing everything now after I've found that languages effect the mouse even on laptop touch pad. the mouse felt like it had some muscle disease. after installing the same languages I felt instant improvement. this thread is for anything mouse related. polling or not also some people only view this thread and not the other. the one roach bios settings guide.


Nah, this thread is for USB polling precision; it is not a general thread. Stuff like that only clutters the thread and makes it harder for people to find important information related to the thread title. If anything, the roach BIOS guide is the place to post stuff like that. If people are interested in other things, they are more than likely looking up other threads.

So posting this here instead of in your own thread only makes it harder for people to find this information, if that is what they are looking for.


----------



## 508859

x7007 said:


> Just post whatever finding you have. I wouldn't mind doing everything now after I've found that languages effect the mouse even on laptop touch pad. the mouse felt like it had some muscle disease. after installing the same languages I felt instant improvement. this thread is for anything mouse related. polling or not also some people only view this thread and not the other. the one roach bios settings guide.


You can post whatever findings you have on another forum then; this has nothing to do with USB, or USB polling.
Also, there has been zero evidence so far.


----------



## CiselS

Sorry for my bad English.

Yesterday I tried Win7 and spent 1 hour optimizing the OS.

The polling rate is still like Windows 8.1: https://i.imgur.com/Qd0mzhG.png

But I feel mouse control is better on Win8.1 than Win7.

I tried to find the reason: I forgot I hadn't changed to MSI mode in Win7.

https://i.imgur.com/bzyw7FE.png

(Don't change EHCI or SATA AHCI; one guy changed SATA AHCI to MSI mode and killed his OS: https://forums.guru3d.com/threads/w...terrupts-msi-tool.378044/page-44#post-5671039)

https://i.imgur.com/ZLv19Cv.png (after I changed to MSI mode) [I forgot to take a picture before changing to MSI mode in Win7]

But if you open an application like Discord or a game,
the polling rate goes from perfect to bad.

Now mouse control: Win7 > Win8.1.

I have two OSes, Win7 and Win8.1,

so I did the same thing in Win8.1 but did not get a perfect polling rate,

maybe because I can't disable DWM in Win8.1.

Edit 1: the PCI standard PCI-to-PCI bridge and the GPU it connects to are both on IRQ 16.

If you change the PCI standard PCI-to-PCI bridge to MSI mode you will get a perfect polling rate on the desktop.

It is safe to change the GPU and audio to MSI mode (though r0ach says that when he changed his sound card to MSI mode the mouse felt bad).

Edit 2: if you put EHCI into MSI mode you will get a floaty mouse (go to Device Manager, reinstall EHCI and restart).

EHCI can't do MSI mode; XHCI works (XHCI uses MSI mode automatically).

-------------------------------------------------------
Btw, input lag / mouse control: EHCI > XHCI + USB 3.0 > XHCI without USB 3.0.

My USB ports: the blue port's mouse response > the white port's [very obvious when you try tracking in an FPS game (fast-movement FPS games, e.g. Apex Legends, Overwatch)].
-------------------------------------------------------
Test PC: E3 1231v3, GTX 970 G1, Sniper B6.

This is not a good PC;

maybe a high-end PC has no problem (on new high-end PCs many devices automatically use MSI mode).

Edit 2: if your NEW high-end PC can get a perfect polling rate in-game, please tell me.
-------------------------------------------------------
Edit 3:
Every test is a human test.

If you believe it, just try it.

Really: less input lag and a more responsive mouse.

Changing to MSI mode is the first time I really felt a big difference (No. 1).

(No. 2 is "Nvidia seems to allocate some type of enormous buffering to anisotropic filtering, probably as a ghetto performance hack, so by setting it to off in Nvidia control panel, it seems to get rid of it"

https://www.overclock.net/forum/375-mice/1739266-r0ach-optimal-computer-hardware-mice-q-2020-a.html)

https://i.imgur.com/ke7CLev.png


----------



## jayfkay

CiselS said:


> Sorry for my bad English.
> 
> Yesterday I tried Win7 and spent 1 hour optimizing the OS.
> 
> The polling rate is still like Windows 8.1: https://i.imgur.com/Qd0mzhG.png
> 
> But I feel mouse control is better on Win8.1 than Win7.
> 
> I tried to find the reason: I forgot I hadn't changed to MSI mode in Win7.
> 
> https://i.imgur.com/bzyw7FE.png
> 
> (Don't change EHCI or SATA AHCI; one guy changed SATA AHCI to MSI mode and killed his OS: https://forums.guru3d.com/threads/w...terrupts-msi-tool.378044/page-44#post-5671039)
> 
> https://i.imgur.com/ZLv19Cv.png (after I changed to MSI mode) [I forgot to take a picture before changing to MSI mode in Win7]
> 
> But if you open an application like Discord or a game,
> the polling rate goes from perfect to bad.
> 
> Now mouse control: Win7 > Win8.1.
> 
> I have two OSes, Win7 and Win8.1,
> 
> so I did the same thing in Win8.1 but did not get a perfect polling rate,
> 
> maybe because I can't disable DWM in Win8.1.
> -------------------------------------------------------
> Btw, input lag / mouse control: EHCI > XHCI + USB 3.0 > XHCI without USB 3.0.
> 
> My USB ports: the blue port's mouse response > the white port's [very obvious when you try tracking in an FPS game (fast-movement FPS games, e.g. Apex Legends, Overwatch)].
> -------------------------------------------------------
> Test PC: E3 1231v3, GTX 970 G1, Sniper B6.
> 
> This is not a good PC;
> 
> maybe a high-end PC has no problem (on new high-end PCs many devices use MSI mode).


Nice, I never had pure 1000Hz, always 990-1020 or so. Now say how you did it.


----------



## x7007

CiselS said:


> Sorry for my bad English.
> 
> Yesterday I tried Win7 and spent 1 hour optimizing the OS.
> 
> The polling rate is still like Windows 8.1: https://i.imgur.com/Qd0mzhG.png
> 
> But I feel mouse control is better on Win8.1 than Win7.
> 
> I tried to find the reason: I forgot I hadn't changed to MSI mode in Win7.
> 
> https://i.imgur.com/bzyw7FE.png
> 
> (Don't change EHCI or SATA AHCI; one guy changed SATA AHCI to MSI mode and killed his OS: https://forums.guru3d.com/threads/w...terrupts-msi-tool.378044/page-44#post-5671039)
> 
> https://i.imgur.com/ZLv19Cv.png (after I changed to MSI mode) [I forgot to take a picture before changing to MSI mode in Win7]
> 
> But if you open an application like Discord or a game,
> the polling rate goes from perfect to bad.
> 
> Now mouse control: Win7 > Win8.1.
> 
> I have two OSes, Win7 and Win8.1,
> 
> so I did the same thing in Win8.1 but did not get a perfect polling rate,
> 
> maybe because I can't disable DWM in Win8.1.
> 
> Edit 1: the PCI standard PCI-to-PCI bridge and the GPU it connects to are both on IRQ 16.
> 
> If you change the PCI standard PCI-to-PCI bridge to MSI mode you will get a perfect polling rate on the desktop.
> 
> It is safe to change the GPU and audio to MSI mode (though r0ach says that when he changed his sound card to MSI mode the mouse felt bad).
> 
> Edit 2: if you put EHCI into MSI mode you will get a floaty mouse (go to Device Manager, reinstall EHCI and restart).
> 
> EHCI can't do MSI mode; XHCI works (XHCI uses MSI mode automatically).
> 
> -------------------------------------------------------
> Btw, input lag / mouse control: EHCI > XHCI + USB 3.0 > XHCI without USB 3.0.
> 
> My USB ports: the blue port's mouse response > the white port's [very obvious when you try tracking in an FPS game (fast-movement FPS games, e.g. Apex Legends, Overwatch)].
> -------------------------------------------------------
> Test PC: E3 1231v3, GTX 970 G1, Sniper B6.
> 
> This is not a good PC;
> 
> maybe a high-end PC has no problem (on new high-end PCs many devices automatically use MSI mode).
> 
> Edit 2: if your NEW high-end PC can get a perfect polling rate in-game, please tell me.
> -------------------------------------------------------
> Edit 3:
> Every test is a human test.
> 
> If you believe it, just try it.
> 
> Really: less input lag and a more responsive mouse.
> 
> Changing to MSI mode is the first time I really felt a big difference (No. 1).
> 
> (No. 2 is "Nvidia seems to allocate some type of enormous buffering to anisotropic filtering, probably as a ghetto performance hack, so by setting it to off in Nvidia control panel, it seems to get rid of it"
> 
> https://www.overclock.net/forum/375-mice/1739266-r0ach-optimal-computer-hardware-mice-q-2020-a.html)
> 
> https://i.imgur.com/ke7CLev.png



So you mean that by disabling anisotropic filtering in the Nvidia control panel you reduce input lag?


----------



## jayfkay

518L on Windows 7, with optimizations.
I was fiddling with a RAM overclock, so SpeedStep or w/e it's called is currently enabled (core speed varies depending on usage).

This is considered bad, I guess?


----------



## Timecard

There are two features, SpeedStep and Speed Shift. SpeedStep is software (OS) controlled, whereas Speed Shift is controlled directly on the CPU without the OS, so it should have much lower latency for core power state transitions.

https://www.anandtech.com/show/9751/examining-intel-skylake-speed-shift-more-responsive-processors


----------



## jayfkay

Timecard said:


> There are two features, speedstep and speedshift. Speedstep is software (os) controlled where as speedshift is controlled directly on the cpu without the os, so it should have much lower latency for core power state transitions.
> 
> https://www.anandtech.com/show/9751/examining-intel-skylake-speed-shift-more-responsive-processors


My power plan is somehow locked on "balanced"; should I just set minimum CPU performance to 100%?
I used to have max core speed permanently; I think I disabled it a while back cuz my RAM overclock wasn't working otherwise. (The Z77-G43 is really limited in terms of RAM OC.)


----------



## empl

jayfkay said:


> 518L on windows 7, with optimizations.
> I was fiddling with ram overclock so speedstep or w/e its called is currently enabled (core speed varies depending on usage).
> 
> this is considered bad I guess?


What version of MouseTester are you using? This looks completely different for me. I don't remember the name of the graph which measures polling rate stability, and the name I found in the MouseTester thread is missing in my program. I am using 1.5.3. I get an update (ms) graph when I select interval vs. time. I have 1 ms the whole time, except for a couple of spikes in two places: one to 2 ms and one up to 11 (the second spike is at the end, and the program author says to ignore spikes at the start and at the end). That could be because I stopped moving my mouse quickly, so the polling rate dropped. Also, I can't zoom as much as you; I get a memory exception. I also can't find the section where polling handling by the OS is measured, but from LatencyMon I usually see 80 µs max.

Btw, you know that if you set the processor state to 100%, it is still not running at maximum utilization and in a state of maximum readiness? You have to google "power attrib disable idle saver". But be careful: you need good cooling, as the CPU is then utilized at 100% at all times. Also disable USB suspending in the same place and use the Ultimate Performance profile. Also disable waking the PC by mouse and keyboard in Device Manager, and for all HID devices. You can also check my guide; it is a pretty comprehensive list of tweaks which reduce input lag.

*Btw, does anyone here have an idea whether it is possible to render audio on a second PC, like streamers stream on a second PC? Because the audio card is a source of major input lag!*
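The "interval vs. time" statistics discussed above can be reproduced from raw event timestamps. A minimal sketch of that calculation, with the start/end spikes trimmed as the MouseTester author advises (the trim length of 5 intervals is my own arbitrary choice, not MouseTester's):

```python
def interval_stats(timestamps, trim=5):
    """Mean and worst update interval (seconds) between successive events,
    ignoring `trim` intervals at each end where spikes are expected."""
    deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
    core = deltas[trim:-trim] if len(deltas) > 2 * trim else deltas
    return sum(core) / len(core), max(core)

# A perfectly stable 1000 Hz mouse reports one event every 1 ms:
mean_s, worst_s = interval_stats([i / 1000.0 for i in range(200)])
print(f"mean {mean_s * 1000:.3f} ms, worst {worst_s * 1000:.3f} ms")
```

A real log polled at 1000 Hz should show a mean near 1 ms; the worst-case interval is the number that reveals polling jitter.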


----------



## jayfkay

This guy has me seriously wondering whether I should install Win 8.1 again or stay on Win 7 :thinking:


----------



## Marctraider

The best I can get on 1809 so far. It must be said it is a wireless G305; I have no comparison as to whether a corded Logitech would yield better results.
Basically ±5 µs.

I'm still testing to see if it can get better.


----------



## nofearek9

How would you rate my results below?

(Finalmouse UL2, Win10 1809)


----------



## Axaion

Impossible to tell really; set it to count from 100 on the time axis and upwards.


----------



## ylpkm

jayfkay said:


> This guy has me seriously wondering whether I should install Win 8.1 or stay on Win 7 again :thinking:


lol, I just spent the last day getting my 3900X build running on Win 7. Haven't gotten to tweaking yet, just finished Windows updates. But already, lower latency numbers in LatencyMon, even in ATA mode; gonna try to switch to AHCI. And my 1080 Ti is benching about 5% higher fps at the same GPU overclock settings. So Win 7 right now is giving me 5% higher fps than my fully tweaked Win 10 install, while untweaked, still using bad drivers, and with CPU clocks ~200 MHz lower on each core than on the Win 10 install.


----------



## jayfkay

ylpkm said:


> lol, I just spent the last day getting my 3900x build running on Win 7. Haven't gotten to tweaking yet, just finished windows updates. But already, lower latency numbers in latencymon, even in ata mode, gonna try to switch to ahci. And my 1080ti is benching about 5% higher fps at same gpu overclock settings. So Win 7 right now, is giving me 5% higher fps, untweaked, still using bad drivers, and cpu clocks lower than win 10 install by ~200mhz on each core, than my fully tweaked Win 10 install.


Yeah, that's why I wasn't even considering Win10. But from what I gathered, Win8.1 improves on Win7 in every aspect: gaming performance and general OS performance.


----------



## nofearek9

Axaion said:


> Impossible to tell really, set it to count from 100 on time and upwards


like this? :


----------



## Axaion

nofearek9 said:


> like this? :


Yup, looks good.


----------



## nofearek9

thanks.


----------



## Athrutep

I have just recently upgraded my hardware to the newest Intel gen. After reinstalling Win 10 1909 (which feels horrible), I started tweaking. The last tweaks I did were bcdedit /set disabledynamictick yes

and bcdedit /set useplatformtick yes.

After disabling the synthetic timers I restarted, and all of a sudden my mouse felt horrible. It took like 2 minutes till it was somewhat back to normal; it felt like it only picked up 20% of my hand movement. Even the MouseTester graphs looked horrible.

I used bcdedit /deletevalue useplatformtick to revert the changes.

On my old Intel gen 4 system (4770K) it felt better than the default.

Did anyone else experience the same, or know what's going on?


----------



## Timecard

You might have this, enjoy the ride with us!
https://www.overclock.net/forum/375...untered-mouse-keyboard-input-lag-related.html


----------



## Athrutep

Timecard said:


> You might have this, enjoy the ride with us!
> https://www.overclock.net/forum/375...untered-mouse-keyboard-input-lag-related.html


Hell no.

Everything is fine; I am simply talking about disabling synthetic timers, which before felt better than having them enabled, and now it's the opposite. There is no power mumbo jumbo going on and my house isn't cursed by a Linux ghost who hates Windows. I am from Europe, and unless something is seriously wrong with the power routing in your house, these issues are way less of a problem here, since we have a 230V system compared to the 120V in the US or Japan.

I haven't moved since, and everything is fine in that regard. I just want to figure out why on 1909 disabling synthetic timers is worse than on my previous hardware's Win 1807 install, where disabling the synthetic timers felt much better and showed lower latencies / improved polling behaviour.

https://youtu.be/FuDOIsVgwWE?t=124


----------



## Timecard

Are the timers operating at different frequencies than before?

TimerBench, CPU-Z > Timers


----------



## CrucialNUG

Athrutep said:


> Hell no.
> 
> Everything is fine, i am simply talking about disabling synthetic timers, which before felt better than with them enabled. And now its the opposite. There is no, power mumbo jumbo going on and my house isn't cursed by linux ghost who hates windows. I am from Europe and these issues unless something is seriously wrong with the power routing in your house, is way less of an issue, since here we have 230V system here compared to the 120V in the US or Japan.
> 
> I haven't moved since and everything is fine in that regard. I just want to figure out why in 1909 disabling synthetic timers is worse than on my previous hardware win 1807 install, where disabling the synthetic timers felt much better and showed lower latencies / improved polling behaviour.
> 
> https://youtu.be/FuDOIsVgwWE?t=124


Sounds like you certainly do not have the same issue. I have heard of more folks in the EU having it than in NA, though, so I do not think 230V has any inherent advantage over 120V when it comes to this problem, which we do not even know the direct source of. It is hard to speculate at this time what type of power infrastructure fault makes this possible. Furthermore, my problem does not even show up in a mouse polling log.


----------



## Athrutep

Timecard said:


> Are the timers operating at different frequencies than before?
> 
> TimerBench, CPU-Z > Timers


I have no reference, since I can't remember the timers on the old hardware and OS.

Current timers on the new hardware and Win 10 1909 with synthetic timers enabled are:

ACPI 3.580 MHz, QPC 10.000 MHz, RTC 1.000 kHz


Is the new Windows build maybe the cause of this, as on 1807 I had no issues?
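Those frequencies can be cross-checked from software. As a hedged sketch, Python's `time.get_clock_info` reports what each of its clocks is built on; on Windows, `perf_counter` sits on QueryPerformanceCounter, so a 10 MHz QPC surfaces as a 100 ns resolution here:

```python
import time

# Print the implementation and resolution of the clocks Python exposes.
# On Windows, perf_counter maps to QueryPerformanceCounter (QPC); on
# Linux it is typically clock_gettime(CLOCK_MONOTONIC).
for name in ("perf_counter", "monotonic", "time"):
    info = time.get_clock_info(name)
    print(f"{name:12s} impl={info.implementation:32s} "
          f"resolution={info.resolution:.9f} s")
```

This only shows what the OS hands to applications, not the BIOS-level timer selection, but it is a quick way to see whether a bcdedit change actually altered the exposed clock.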


----------



## x7007

Athrutep said:


> I have no reference , since i cant remember the timers on the old hardware and OS.
> 
> Current timers on new hardware and win 10 1909 with synthetic timers enabled are
> 
> ACPI 3.580Mhz QPC 10.000Mhz RTC 1.000Khz
> 
> 
> is the new windows build maybe the cause for this, as on 1807 i had no issues?


Yes, because Microsoft changed the timers to add Meltdown and Spectre protection.

Also check your timer resolution. If you have it at 1.0 ms you don't need to add any bcdedit options except disabledynamictick yes.


----------



## r0ach

Athrutep said:


> after disabling the synthetic timers, i restarted and all of a sudden my mouse felt horrible. it took like 2 minutes till it was somewhat back to normal, it felt like it only picked up 20% of my handmovement. And even the mousetester graphs looked horrible.


On Windows 8.1, each time I load up command prompt to turn on and off dynamicticks then reboot it typically gives me a different result each time in terms of cursor movement. Same thing with turning MSI mode for GPU on and off. Feels like a lot of toggleable changes in Windows shuffle many more variables around in the background in a non-deterministic manner. I'm sure many people have gotten that feeling before when dealing with IRQ auto-allocation.


----------



## Athrutep

x7007 said:


> yes because Microsoft changed the time to have meltdown and specter protection.
> 
> check also your resolution timers. if you have it on 1.0ms you don't need to add bcdedit expect disabledynamictick yes


My timers on the newest build are off: 0.496 instead of 0.5. But as soon as I disable synthetic timers it screws up the mouse input, even though the timers then read even numbers like 0.5 and 1.0. My DPC latency is also lower and more consistent with synthetic timers enabled.

I have reinstalled a 1807 version on another SSD. There everything works perfectly, even on the new hardware, and disabling synthetic timers gives me even numbers like 0.5 and 1.0, but the mouse input actually works normally with or without synthetic timers.

Why can't Microsoft stop screwing things up? Makes me want to go back to Win 7 and forget about all this, but the new hardware gimps performance on Win 7.

It also turns out that I had my mouse in a USB 3.1 Gen 2 port. Luckily my board comes with its own USB 2.0 hub with 4 ports. Since I swapped the mouse to a USB 2.0 port with nothing else connected to it, it works better on 1807, but on 1909 it doesn't seem to change anything.





r0ach said:


> On Windows 8.1, each time I load up command prompt to turn on and off dynamicticks then reboot it typically gives me a different result each time in terms of cursor movement. Same thing with turning MSI mode for GPU on and off. Feels like a lot of toggleable changes in Windows shuffle many more variables around in the background in a non-deterministic manner. I'm sure many people have gotten that feeling before when dealing with IRQ auto-allocation.


Yeah, for me on the new hardware, setting the GPU into MSI mode seems to make my DPC latency worse along with decreasing polling precision. I can also tell, when I look around in FPS games, that it is not nearly as smooth during fast movements in MSI mode (240 Hz monitor). So I don't use it on the new hardware and 1909.


----------



## ylpkm

jayfkay said:


> Yeah thats why I wasnt even considering win10. But from what I gathered win8.1 is improved on win7 in every aspect, gaming performance and general OS performance


I went back to my Win 10 build due to frequent random GPU memory crashes (or at least the memory goes first and then the entire card is lost).



Athrutep said:


> Yea for me on the new hardware, setting the gpu into msi mode seems to make my dpc latency worse along with decreasing polling precision. I can also tell when i look around in fps games it is not nearly as smooth during fast movements on msi mode (240hz monitor). So i don't use it on the new hardware and 1909.


Really? Hmm, I may need to retest this. Previously (Win 10), switching my 1080 Ti to MSI mode (priority undefined <- important, setting it to high made things feel off) usually made my inputs feel better.

Question: are you using display or GPU scaling in the Nvidia control panel, and what is your monitor? I'm using a newer 4K TV, and recently I've been testing scaling between GPU and display with the TV set to 2560x1440 at 120 Hz with black frame insertion on (to reduce blur) and the in-game resolution set to 2560x1440 at 120 Hz. ("Override scaling mode set by games and programs" is checked.)

GPU scaling (aspect or fullscreen): more pronounced strobe from light sources in game.
GPU scaling (no scaling): feels like less input lag; can't really see any differences compared to display.
Display scaling (no scaling): inputs feel a little laggy.


----------



## Athrutep

ylpkm said:


> I went back to my Win 10 build due to frequency of random gpu memory crashes (or at least memory goes first and then entire card is lost).
> 
> 
> 
> Really? Hmm. I may need to retest this. Previously (Win 10) switching my 1080ti to MSI mode (priority undefined <- important, setting it to high made things feel off) usually made my inputs feel better.
> 
> Question, are you using display or gpu scaling in the nvidia control panel, and what is your monitor? Im using a newer 4ktv, and recently I've been testing scaling between gpu and display when tv is set to 2560x1440p 120hz with black frame insertion on (to reduce blur) and in game resolution set the 2560x1440p 120hz. (Override scaling mode set by games and programs is checked)
> 
> Gpu scaling (aspect or fullscreen): More pronounced strobe from light sources in game.
> Gpu scaling (no scaling): Feels like less input lag, can't really see any differences compared to display
> Display scaling (no scaling): Inputs feel a little laggy


1080p 240 Hz, no scaling. Scaling performed on the monitor, with "override scaling set by games and programs" enabled.


----------



## kurtextrem

Athrutep said:


> 1080p 240hz no scaling. Scaling on monitor override scaling set by games and programs


Regarding DPC latency and MSI mode, and disabling the synthetic timers - what increases or decreases are we speaking of?


----------



## Athrutep

kurtextrem said:


> Regarding DPC latency and MSI mode, and disabling the synthetic timers - what increases or decreases are we speaking of?


DPC latency for me without MSI mode is around 5-12 µs on Windows 10 1909 after tweaking, occasionally jumping to no higher than 30 µs. With MSI mode it goes from 9-25 µs, jumping to as high as 600 µs (which still isn't bad, but it's a difference).

Polling precision is usually no higher than a 60 µs variance, with the majority of polls being around a 10 µs variance. In MSI mode that increases to 190 µs, with the majority of polls around 100 µs. (Unacceptable? No. Is it a difference? Yes.)
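For anyone who wants to put numbers on "polling variance" themselves: a minimal sketch (Python, standard library only; the timestamp list below is synthetic, not a real MouseTester log) of the interval statistics a polling logger computes:

```python
from statistics import mean

def poll_jitter_us(timestamps_us, nominal_interval_us=1000.0):
    """Given USB poll timestamps in microseconds (as a tool like
    MouseTester would log), return the deviation of each poll-to-poll
    interval from the nominal interval (1000 us at 1000 Hz)."""
    deltas = [b - a for a, b in zip(timestamps_us, timestamps_us[1:])]
    deviations = [abs(d - nominal_interval_us) for d in deltas]
    return {
        "mean_interval_us": mean(deltas),
        "max_deviation_us": max(deviations),
        "mean_deviation_us": mean(deviations),
    }

# Synthetic example: polls spaced 1000, 1010, 990, 1000 us apart.
stamps = [0, 1000, 2010, 3000, 4000]
print(poll_jitter_us(stamps))
```

Feeding a real exported log into something like this makes the before/after comparison less about feel and more about the max and mean deviation figures quoted above.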

So for me, on my system personally, after the switch from an earlier Windows build to 1909, MSI mode is somehow worse now. Worse as in not problematic by any stretch, but worse. In addition, in games like CS:GO and Quake, whenever I do 90 or 180 degree turns or fluid motions like tracking someone close quarters with a lot of strafing and mouse movement involved, it becomes visually less clear and harder to aim accurately. Not a huge difference, but one that I do notice.

It maybe has an impact on frametime consistency; that is something I would need to test. And that is why I am asking if anyone else noticed something like this on the newest 1909 build, or if it's just my install and I should reinstall (Win 10 feels like poop anyways compared to Win 7).

In games where I don't get over 240 fps it doesn't matter, since it feels inconsistent anyways and much worse than 240+ fps in MSI mode. That is how I perceive it, anyways.

And disabling synthetic timers feels like it halves my CPI, and the CPI overall fluctuates, which is reflected by MouseTester actually reading it like that as well. But it comes and goes and is inconsistent to the point where I don't want to mess with it, since it's not usable for me like that, and with synthetic timers on it feels fine.


----------



## r0ach

I'm seeing lots of bad ideas in this thread:

1) Forcing MSI mode on is worse for every Nvidia card I've ever used. And using that tweak program to force other random devices into MSI mode for no reason gives me worse mouse movement as well.

2) Under no circumstances should you ever use BCDEDIT to enable useplatformclock. You shouldn't even be buying a motherboard that doesn't let you toggle off HPET in the first place. The awful performance hit from HPET on newer Intel cores is also hilarious: "Then X299 showed up and the query for an HPET timestamp suddenly takes 7 times longer! The number of possible HPET timer calls per second went from 1.4 million on Broadwell-E to merely 200,000 on Skylake-X. Let's remember that this is a high-end platform also used for scientific purposes where accuracy and performance are both very relevant."

3) bcdedit "disabledynamictick" will feel better on some people's systems and worse on others. One thing I have noticed with this setting is that if you let your monitor go to sleep at all and then wake it again, your cursor movement can be a little more dulled down until you reboot. This doesn't seem to happen with it disabled. Might be some sort of clock skew/drift issue with Windows rest modes, or an issue with suspending and restarting this dynamic tick mechanism.
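The "timestamp calls per second" figure from the quote in 2) can be roughly approximated from user space with a trivial loop. This Python sketch times `time.perf_counter` rather than raw HPET reads, so it only illustrates the kind of measurement; the absolute numbers will differ wildly depending on which clock source backs the counter:

```python
import time

def timestamp_calls_per_second(duration=0.2):
    """Roughly measure how many high-resolution timestamp queries the
    system sustains per second. A cheap TSC-backed counter yields far
    higher numbers than one routed through HPET or ACPI."""
    end = time.perf_counter() + duration
    calls = 0
    while time.perf_counter() < end:
        calls += 1
    return calls / duration

print(f"~{timestamp_calls_per_second():,.0f} perf_counter calls/sec")
```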


----------



## the1freeMan

^ 1) Yes forcing MSI mode is not a good idea and has literally no point.

2) Forcing useplatformclock is also the dumbest thing anyone ever suggested. M$ stated it should only be used for debugging purposes, and yeah, if you've ever read how HPET works, forcing the OS to use only that is just mind-numbingly idiotic.
However, disabling HPET in the BIOS is also nonsensical on any Windows OS after Windows XP. If you read the link in the OP under the HPET section you can see how "The TSC frequency is calibrated against HPET periods to finally get proper timekeeping." on Win 7 and later.

Those guys tested things properly, but even without going that far you could see how, with HPET disabled in the BIOS, you'd get weird issues on Win 7 like CS:GO clamping fps to multiples of 128 (or submultiples of 1024?).
As for XP, HPET is not supported; I don't know if having it on in the BIOS could give bugs, but yeah... that was Win XP lol.
Anyway, if you don't want programs to use it you can easily disable it in Device Manager. That is the best solution, as the OS is still allowed to calibrate but programs are not allowed to use it.

3) Doesn't really seem to change anything on my system, with C-states disabled and minimum processor state at 100%. Maybe on configurations with more power saving; it's mainly a feature made for laptops anyway.
Checked for differences after sleep, restart etc. with pcclocktiming.exe, and polling rate stability with MouseTester. While there are slight deviations with every reboot, disabledynamictick or sleep mode don't impact that behavior.


----------



## Athrutep

the1freeMan said:


> ^ 1) Yes forcing MSI mode is not a good idea and has literally no point.
> 
> 2) Forcing useplatformclock is also the dumbest thing anyone ever suggested. M$ stated it should be only used for debugging purposes and yeah if you've ever read how hpet works, forcing the OS to use only that is just mindnumbingly idiotic.
> However disabling hpet in the bios is also nonsensical in any windows OS after windows XP. If you read the link in the OP under the hpet section you can see how "The TSC frequency is calibrated against HPET periods to finally get proper timekeeping." on win7 and later.
> 
> Those guys tested things properly, but even without going so far you could see how disabling hpet in the bios you'd get weird issues on win7 like CSGO clamping fps to multiples of 128 (or submultiples of 1024?)
> As for xp, hpet is not supported, don't know if having it on in the bios could give bugs, but yeah.. that was win xp lol.
> Anyway if you don't want programs to use it you can easily disable it in the device manager. That is the best solution as the OS is allowed to calibrate but programs are not allowed to use it.
> 
> 3) Doesn't really seem to change anything on my system, with c-states disabled and minimum processor state to 100%, maybe on configurations with more powersavings, mainly a feature made for laptops anyway.
> Checked for differences after sleep, restart etc.. with pcclocktiming.exe and polling rate stability with mousetester. While there are slight deviations with every reboot, disabledynamictick or sleep mode don't impact that behavior.



That pretty much sums up my findings, based on DPC latency, MouseTester graphs and my personal feelings (which don't count at all, but matter to me at least).


----------



## r0ach

the1freeMan said:


> However disabling hpet in the bios is also nonsensical in any windows OS after windows XP. If you read the link in the OP under the hpet section you can see how "The TSC frequency is calibrated against HPET periods to finally get proper timekeeping." on win7 and later.


I'd rather deal with potential clock skew/drift issues (if such a thing is even a problem with ITSC) by just shutting down and restarting, rather than having worse mouse movement with HPET turned on. Dynamic ticks are also a far bigger potential problem source than running without HPET, IMO.

I don't think such a problem with ITSC even exists, except potentially for people who run with SMT, which should not be used on a gaming machine in the first place. So all I see for HPET is complete negatives and no positives. The end-user experience is definitely worse with HPET on, so any technobabble concerning the subject is irrelevant to me. What good is technobabble if it has no basis in observed reality?

It's kind of like the old days when you had these geriatric IT people on forums like this who told everyone that you should always keep vsync on at all times due to the benefits it provides to prevent tearing, while they're completely unable to comprehend it makes cursor movement crappy or why that would matter. Hell, I haven't even gotten noticeable tearing in years, which is why I find gsync hype ridiculous. It's a cure in search of a non-existent problem.


----------



## the1freeMan

r0ach said:


> I'd rather deal with potential clock skew/drift issues (if such a thing is even a problem with ITSC) by just shutting down and restarting rather than having worse mouse movement with HPET turned on. Dynamicticks is a far bigger potential problem source than running without HPET also IMO.
> 
> I don't think such a problem with ITSC even exists except potentially for people who run with SMT, which should not be used on a gaming machine in the first place. So all I see for HPET is complete negatives and no postitives. The end-user experience is definitely worse with HPET on, so any technobabble concerning the subject is irrelevant to me. What good is technobabble if it has no basis in observed reality?
> 
> It's kind of like the old days when you had these geriatric IT people on forums like this who told everyone that you should always keep vsync on at all times due to the benefits it provides to prevent tearing, while they're completely unable to comprehend it makes cursor movement crappy or why that would matter. Hell, I haven't even gotten noticeable tearing in years, which is why I find gsync hype ridiculous. It's a cure in search of a non-existent problem.


The OS will use the invariant TSC regardless of HPET being on in the BIOS. Having HPET off in the BIOS will mess up the calibration, giving you more deviation every reboot. I strongly suggest you thoroughly read http://www.windowstimestamp.com/description which is posted in the OP under the HPET section.
If those tests are somehow not enough for you, I suggest investing 50 bucks in an arduino / photodiode setup like qsxcv's, copy-pasting his code from GitHub and testing for yourself.
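For illustration, the calibration idea ("TSC frequency is calibrated against HPET periods") boils down to timing one counter against a trusted reference. A loose sketch, using Python's clocks as stand-ins (nothing here is Windows-specific; `perf_counter_ns` plays the uncalibrated fast counter, the wall clock plays the reference):

```python
import time

def calibrate(fast_clock, reference_clock, interval_s=0.1):
    """Estimate the frequency of an uncalibrated fast counter by
    comparing its tick delta against a trusted reference clock over a
    fixed interval, loosely mirroring how the OS calibrates the TSC
    frequency against HPET periods at boot."""
    t0_ref = reference_clock()
    t0_fast = fast_clock()
    time.sleep(interval_s)
    elapsed_ref = reference_clock() - t0_ref
    ticks = fast_clock() - t0_fast
    return ticks / elapsed_ref  # estimated ticks per second

# perf_counter_ns ticks in nanoseconds, so the estimate lands near 1e9.
est_hz = calibrate(time.perf_counter_ns, time.time)
print(f"estimated fast-counter frequency: {est_hz:.3e} Hz")
```

The point of the sketch: without a reference to calibrate against, any frequency error in the fast counter turns directly into timekeeping drift, which is the failure mode being described here.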

Regarding dynamictick, I will add that I don't use SMT/HT as it simply gets you less fps in all games I benchmarked.
Also, there is a bug with Nvidia drivers and HT on, where you get slightly over 100 µs of additional interrupt latency when the GPU is under its base clock.
Not relevant in games since, if you use "prefer maximum performance" power mode, the GPU will not downclock (and HT should be off on a gaming system anyway). Still not a nice thing to see.
You can test that with IDLT, part of LatencyMon.

Regarding G-Sync/FreeSync: while I agree that tearing is not noticeable at high fps / 1000 Hz mouse polling and that is the ideal scenario, it's still a nice technological step forward for games that run at lower fps, or stuff like Jazzpunk where even minimal tearing ruins the cutscenes.
Variable refresh rate is part of the DisplayPort standard, and it's quite embarrassing that you can't simply turn it on on any DP screen regardless of GPU vendor or it being "FreeSync enabled / G-Sync compatible".
Then you have Nvidia drivers adding extra input lag on G-Sync compatible screens when it's turned off, so yeah, maybe we're better off this way... *facepalm*


----------



## kurtextrem

the1freeMan said:


> 3) Doesn't really seem to change anything on my system, with c-states disabled and minimum processor state to 100%, maybe on configurations with more powersavings, mainly a feature made for laptops anyway.
> Checked for differences after sleep, restart etc.. with pcclocktiming.exe and polling rate stability with mousetester. While there are slight deviations with every reboot, disabledynamictick or sleep mode don't impact that behavior.


Try "useplatformtick" for a while and report back too, please.


----------



## x7007

So can we agree that disabledynamictick yes is always good,
and useplatformtick yes only when necessary?


----------



## cdcd

the1freeMan said:


> Then you have nvidia drivers getting extra input lag on gsync compatible screens when it's turned off so yeah maybe we're better off this way.. *facepalm*


Never heard of this before, could you elaborate (maybe with a source)?


----------



## the1freeMan

cdcd said:


> Never heard of this before, could you elaborate (maybe with a source)?


https://youtu.be/L42nx6ubpfg?t=680

Don't know if they fixed it from then but it's nvidia drivers.. I discovered the HT/downclock thing many years ago and it's still there..


----------



## cdcd

Very interesting stuff, thanks a lot.


----------



## Athrutep

x7007 said:


> so we can agree that disabledynamictick Yes is always good?
> and disable platform tick Yes, is when necessary?


I never had any negative effects when I disabled dynamic tick, which is meant to pause the timers when there is nothing going on. So it seems logical that you'd want to disable that.

Disabling synthetic timers via useplatformtick should potentially be done to correct offset clock timers, i.e. when they are uneven instead of 1.0, 0.5 and so on. On Windows 10, with synthetic timers enabled they seem to always be off by default (at least that was the case for all my Win 10 installs). So you would think that disabling synthetic timers to correct them to flat even numbers would be better. But on my latest 1909 install it made things worse compared to previous installs.

I have no idea why that is; all I know is it's worse. So I am not gonna fiddle with the timers except to disable dynamic tick. Everyone needs to test this on their own system, so there is no general guideline for it.

The best test is one that doesn't rely on whether it "feels" better or worse. This is the test:

https://www.overclockers.at/articles/the-hpet-bug-what-it-is-and-what-it-isnt

Run TimerBench and a DPC latency test as well as MouseTester with default settings, change whatever you want in regards to the timers, then run your tests again. And stick with what gives you better results.
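A sketch of that "run the test, change a setting, run it again" comparison, with hypothetical before/after interval logs (Python, standard library; a real workflow would load the exported MouseTester data instead of these made-up lists):

```python
from statistics import mean, stdev

def summarize_intervals(intervals_ms):
    """Summarize one run of poll intervals (milliseconds)."""
    return {"mean": mean(intervals_ms),
            "stdev": stdev(intervals_ms),
            "max": max(intervals_ms)}

def compare_runs(before_ms, after_ms):
    """Compare interval logs captured before and after a timer tweak;
    the run with the lower spread (stdev) is the more stable one."""
    a, b = summarize_intervals(before_ms), summarize_intervals(after_ms)
    return {"before": a, "after": b,
            "tweak_helped": b["stdev"] < a["stdev"]}

# Hypothetical data: jittery run vs tight run around 1 ms.
before = [1.00, 1.06, 0.94, 1.00, 1.19, 0.81]
after = [1.00, 1.01, 0.99, 1.00, 1.02, 0.98]
print(compare_runs(before, after))
```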

Disabling synthetic timers, at least on my system with a 9700K on a Z390 motherboard with 1909, makes it way worse. As in, it completely screws the mouse input (only registers half the polls, dropouts, CPI decreases at random and so on) and increases DPC latency.


----------



## Marctraider

My findings:

DisableDynamicTick: Yes. Slightly drops throughput (and max fps in benchmarks by a percentage on average), but greatly reduces systemic load spikes, also observable in mouse polling tests. So well worth it imho.
Assuming this works similarly to Linux kernels, the same story sort of applies: instead of doing a chunk of work on a specific timer tick, the kernel ticks 1000 times per second and does a tiny bit of work spread over the span of a second.

MSI Mode: On. For Nvidia this significantly reduced mouse polling issues here (2080 Super), but it only became apparent after I optimized my IRQ routing across devices (those that adhere to IRQ affinity to begin with).

UsePlatformTick: Yes. Changes the timers to read 0.500/1.000, and it seems to reduce frametime fluctuations slightly, especially when using in-game fps limiters, which are affected by timer behavior. Other than that I see little difference.
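A crude way to see the effective timer resolution without extra tools is to time short sleeps: with a 0.5-1.0 ms timer, `sleep(0.001)` wakes close to 1 ms, while a coarse 15.625 ms timer oversleeps heavily. A rough Python probe (numbers vary with system, Python version and load, so treat it as indicative only):

```python
import time

def measure_sleep_granularity(request_s=0.001, samples=50):
    """Probe the effective system timer resolution by timing short
    sleeps and recording how far past the requested duration each
    sleep actually runs (the overshoot)."""
    overshoots = []
    for _ in range(samples):
        t0 = time.perf_counter()
        time.sleep(request_s)
        overshoots.append(time.perf_counter() - t0 - request_s)
    return min(overshoots), sum(overshoots) / samples

best, avg = measure_sleep_granularity()
print(f"best overshoot {best*1000:.3f} ms, average {avg*1000:.3f} ms")
```

An average overshoot of many milliseconds on an idle desktop suggests the coarse tick is in effect; sub-millisecond overshoots suggest a 0.5-1.0 ms timer.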


----------



## r0ach

the1freeMan said:


> The OS will use invariant TSC regardless of HPET being on in the bios. Having HPET off in the bios will mess up the calibration giving you more deviation every reboot.


Even if we assume this is correct, that the TSC uses HPET to calibrate itself (which doesn't sound correct to me at all, or the system would experience major problems with HPET disabled, and it doesn't), it does not change the fact that mouse movement is better with HPET off. So whatever *supposed* negatives occur from TSC deviation are outweighed by the overhead reduction of completely removing HPET. Like I said, I only care about real-world results, not theoretical technobabble, and the real-world results have NEVER been better with HPET on.


----------



## Axaion

Exactly what real-world results? I don't see any in this thread.


----------



## Marctraider

Ask any recent Intel platform user: HPET is slow as a crawl.

Also, isn't HPET on the ISA bus? When I disable the msisadrv service, the entry in Device Manager also disappears!


----------



## x7007

For me at the moment, using only disabledynamictick yes, my timer resolution is 2 ms at idle. With my eClaro sound card enabled it changes the timer to that instead of 15.625 ms, but usually it's supposed to be 1 ms. When I run a video, for example, the timer goes to 1.0 ms; when playing a game it goes to 0.5 ms. It must be something with tscsyncpolicy changed, because I remember it only happened last time when using enhanced with HPET enabled in the BIOS. Does anyone have the same finding?


----------



## the1freeMan

r0ach said:


> Even if we assume this is correct, that TSC uses HPET to calibrate itself (doesn't sound correct to me at all or the system would be experiencing major problems with HPET disabled and it doesn't) it does not change the fact that mouse movement is better with HPET off. So whatever *supposed* negatives occur from TSC deviation are outweighed by the overhead reduction of completely removing HPET. Like I said, I only care about the real world results, not the theoretical technobabble, and the real world results have NEVER been better with HPET on.


It does in fact experience major problems; the guys in the link got improper timekeeping, for example. In my case, as I already said, it gave timing problems in CS:GO with fps clamping to submultiples of 1024 or multiples of 128. I tested this under Windows 7, and aim training would lock to 512 fps; with HPET on in the BIOS it would do ~700.
Just a couple of examples; I'm sure there are more issues if people test around. That said, every Windows version behaves differently.
If you care about real-world results, as I said, get a microsecond-precise testing rig like q's.



Marctraider said:


> DisableDynamicTick: Yes, Slightly drops throughput (And max fps in benchmarks by a percentage on average, but greatly reduces systemic load spikes, also observable in mouse polling tests) So well worth it imho.
> Assuming this works similar to linux kernels, the same story sort of applies. Instead of doing a chunk of work on a specific timer tick, it just ticks 1000hz/sec and does a tiny bit of work spread over the span of a second.


Can't reproduce this on my system. Do you have HT, C-states, or any CPU power-saving features enabled by any chance? Either that or just different CPU generations / Windows versions. Or Nvidia power management set to maximum performance on the desktop.
I set it per program, as keeping GPU clocks high on the desktop is a waste of component life.
I'll see tomorrow if that changes anything, but there's little point, as I wouldn't use it anyway.



x7007 said:


> for me at the moment using only disabledynamictick Yes and my resolution timer is 2ms idle. with my eclaro sound enabled it changes the timer instead 15.625. but usually it suppose to be 1ms. when for example I run video the time goes to 1.0ms. when playing a game the time goes 0.5. It must be something with the tscsyncpolicy changed because I remember it only happened last time when using enhanced with hpet enabled in bios. anyone have the same finding?


I get 15.625 ms at idle even with disabledynamictick.

As usual the answer to these things is: highly system dependent, test for yourself


----------



## x7007

> Can't reproduce this on my system. Do you have HT, or c-states or any cpu powersaving features enabled by any chance? Either that or just different cpu generations / windows versions. Or nvidia power management set to maximum performance in the desktop.
> I set it per program as keeping gpu clocks high in the desktop is a waste of component life.
> Will see tomorrow if that changes anything, but little point as I wouldn't use it anyway.
> 
> 
> I get 15.625 idle even with disabledynamictick
> 
> As usual the answer to these things is: highly system dependent, test for yourself



r0ach: BIOS HPET enabled or disabled?

What I meant is that it is 15.625 when the eClaro sound card is disabled, because its driver wants to use a low timer resolution.
What I am saying is that instead of the usual 1 ms with the sound card enabled, I am getting 2 ms. When watching movies I get 1 ms,
and when playing a game I am getting 0.5 ms,
so it is using a different timer resolution for every job.
I can't understand why it's 2 ms at idle with the sound card enabled, though. Is that bad?

I do have Global C-state enabled, SMT on, and minimum CPU set to 100%; Cool'n'Quiet is disabled.


----------



## andreeeeee

@the1freeMan @r0ach

If I force useplatformclock to "No" via bcdedit and disable "High Precision Event Timer" in Device Manager, do you think I could be messing up my timers even though HPET is on in the BIOS?

I'm asking this because of what @r0ach said here:



r0ach said:


> I'd rather deal with potential clock skew/drift issues (if such a thing is even a problem with ITSC) by just shutting down and restarting rather than having worse mouse movement with HPET turned on. Dynamicticks is a far bigger potential problem source than running without HPET also IMO.



I have the feeling that, with those settings, when my computer (Win7) hibernates/sleeps, wakes, and I then try to play, the hitreg and the desync are worse than when I just restart it and start playing.

It feels like a clock drift issue, but I'm not 100% sure.

Do you have any thoughts on this?


----------



## r0ach

andreeeeee said:


> I have the feeling that, with those settings, when my computer (Win7) hibernates/sleeps, awakes and then I try to play, the hitreg and the desync are worse than when I just restart it and start playing.


I don't think I've ever experienced that issue on Windows 7, just on Windows 8.1 with dynamicticks enabled after letting the monitor go to sleep.


----------



## the1freeMan

andreeeeee said:


> I have the feeling that, with those settings, when my computer (Win7) hibernates/sleeps, awakes and then I try to play, the hitreg and the desync are worse than when I just restart it and start playing.
> 
> It feels like a clock drift issue, but I'm not 100% sure.


Check your system with these tools:

PC clock timing: https://www.softpedia.com/get/System/System-Info/PC-Clock-Timing.shtml

TimerTiming: https://www.dropbox.com/s/us2fjmdsy56bric/TimerTiming.zip?dl=0
It was used in this benchmark: https://www.tweakhound.com/2014/01/30/timer-tweaks-benchmarked/
but the link is dead so I re-uploaded it.
Use it from the command line, otherwise it won't work.


----------



## CiselS

Windows 8.1 keeps auto-deleting the thumbnail cache.

I can't fix it.

https://answers.microsoft.com/en-us...humbnail/5d09ef73-7777-4d59-8e47-965379fb91f0


----------



## Athrutep

x7007 said:


> r0ach. bios hpet enabled or disabled?
> 
> what I meant is it is 15.625 when the eClaro sound card is disabled because it's driver want to use low timer resolution.
> what I am saying is instead getting 1ms usually with the sound card enabled I am getting 2ms. when watching movies I get 1ms
> when playing a game I am getting 0.5ms
> so it is using different resolution timer for every job
> I can't understand why 2ms though on idle when sound card enabled . is it bad?
> 
> I do have Global State enabled, SMT, minimum cpu is set to 100%. cool & quiet is disabled


You can't say in general what is better or worse; you have to test it for yourself, as it differs from system to system. Even if it's better for 80% of people to turn it off, there are still some systems where it makes things worse.

https://www.overclockers.at/articles/the-hpet-bug-what-it-is-and-what-it-isnt


----------



## Marctraider

the1freeMan said:


> It does in fact experience major problems, they guys in link got improper time keeping for example. In my case, as I already said, it gave timing problems in cs go with fps clamping to submultiples of 1024 or multiples of 128. I tested this under windows 7 and playing aim training would lock to 512 fps, with hpet on in the bios it would do ~700.
> Just a couple of examples, I'm sure there are more issues if people test around. That said, every windows version behaves differently.
> If you care about real world results, as I said get a microsecond precise testing rig like q.
> 
> 
> 
> Can't reproduce this on my system. Do you have HT, or c-states or any cpu powersaving features enabled by any chance? Either that or just different cpu generations / windows versions. Or nvidia power management set to maximum performance in the desktop.
> I set it per program as keeping gpu clocks high in the desktop is a waste of component life.
> Will see tomorrow if that changes anything, but little point as I wouldn't use it anyway.
> 
> 
> 
> I get 15.625 idle even with disabledynamictick
> 
> As usual the answer to these things is: highly system dependent, test for yourself


I test everything at max core clock on both GPU and CPU; even Nvidia profiles don't force max core clock with that setting. I use my own tool to force max clocks by switching MSI Afterburner profiles instead.

DisableDynamicTick isn't supposed to change this value; useplatformtick does.


----------



## empl

Athrutep said:


> I have just recently upgraded my hardware to the newest intel gen. After reinstalling win 10 1909 (which feels horrible). I started tweaking. The last tweak i did was bcdedit /set disabledynamictick yes
> 
> and bcdedit /set useplatformtick yes.
> 
> after disabling the synthetic timers, i restarted and all of a sudden my mouse felt horrible. it took like 2 minutes till it was somewhat back to normal, it felt like it only picked up 20% of my handmovement. And even the mousetester graphs looked horrible.
> 
> I used bcdedit /deletevalue useplatformtick to revert the changes.
> 
> On my old intel gen 4 system 4770k it felt better than the default.
> 
> Did anyone else experience the same or knows whats going on?



Yep, platformtick feels terrible; the mouse is slower and somehow heavier (hard to describe). It may depend on hardware configuration, but I had the same experience as you... It is supposed to be used only for debugging, per Microsoft. I found information about platformclock only on TweakHound, in correspondence with a timer expert, and same thing: it should be used only for debugging. Try instead to get BCLK as close to 100 MHz as possible. Btw, you can dual boot with 7 for gaming like CS:GO, for lowest latency; a clean boot also helps a lot. And you may want to make a custom install to remove everything you can. I am thinking about it, since I am getting a new PC in 1-2 months.

Btw, I don't think MouseTester can measure how long the OS takes to handle polling, only polling stability, skipped pixels etc. I just read the official thread and didn't find anything like that there. But it is still useful.

Btw, how do you measure DPC latency under load? LatencyMon simulates a load itself, and their support says you shouldn't game during the test. Windows ADK seems to use the same measuring method, and the other modes in LatencyMon either don't work or aren't reliable. I also don't know how accurately LatencyMon can simulate interrupts from e.g. the GPU; their support says the results are as accurate as the hardware allows, or as accurately as Microsoft reports them. They didn't explain why the interrupt-to-DPC latency mode is bad, only that interrupt-to-process latency is the most useful; that mode is set by default, and you shouldn't game with it on. I still wanted to test under load, but since no other mode works, I guess I can't. Strange, because I get much lower DPC latency, fewer calls and lower total execution times with MSI on for the GPU than with MSI off, even though Nvidia claims this is specified in the driver programming and setting it in regedit shouldn't matter. I also get much lower input lag with the GPU in MSI mode.


----------



## jayfkay

How can I see what my platformtick is set to?
Anyway, here is a 10 s LatMon "bench" on Windows 8.1 (this is with Firefox, Steam and Discord running):

Highest measured interrupt to process latency (µs): 61,317295
Average measured interrupt to process latency (µs): 1,631274
Highest measured interrupt to DPC latency (µs): 60,295340
Average measured interrupt to DPC latency (µs): 0,490335

It feels pretty snappy though, and my mouse rate has relatively little variance (990-1010 with some rare exceptions). There is still some BIOS stuff I could change, and MSI mode for some devices like the GPU, but I haven't touched that yet. Fortunately the USB controller was set to MSI by default.

DPC Latency Checker shows 1000 µs constantly, so I guess that program is useless.
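A spread like that 990-1010 Hz can be computed directly from event timestamps; here is a minimal sketch of the MouseTester-style math, with synthetic timestamps rather than a real capture:

```python
# Sketch: MouseTester-style polling statistics computed from a list of
# input-event timestamps (in seconds). The timestamps below are synthetic,
# just to illustrate the math; a real run would log them from raw input.

def polling_stats(timestamps):
    """Mean/min/max polling rate (Hz) from consecutive event intervals."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    rates = [1.0 / dt for dt in intervals if dt > 0]
    return sum(rates) / len(rates), min(rates), max(rates)

# A nominally 1000 Hz mouse with a little jitter in its poll intervals:
intervals_ms = [1.00, 1.01, 0.99, 1.00, 1.02]
timestamps, t = [0.0], 0.0
for dt in intervals_ms:
    t += dt / 1000.0
    timestamps.append(t)

mean_hz, min_hz, max_hz = polling_stats(timestamps)
print(f"mean {mean_hz:.0f} Hz, range {min_hz:.0f}-{max_hz:.0f} Hz")
```

Note that this only shows interval stability, not how late the OS delivered each poll, which matches the point above about what MouseTester can and cannot measure.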


----------



## x7007

Can someone give some insight on bcdedit /set tscsyncpolicy Default/Legacy/Enhanced?

Btw, just to give you some testing ideas: don't reinstall Windows if you want to track down a problem, especially if it doesn't happen in games; use Windows To Go instead. If the problem happens in that OS it will keep happening there, without you losing all your data, and you can carry on with everything instead of just seeing the same issue again right after. Also, you can always do an in-place install, which does the same as a fresh install without removing anything.


----------



## r0ach

I just assembled my new build. I had been using a board that didn't let you disable HPET for a while and the new one allows me to disable it (Asus P8Z77-I Deluxe ITX) so I can do a good comparison here. The1freeman claims having HPET on is good for TSC to sync with, but whether good in technobabble terms or not, the real world effect of having HPET off is that with all other mouse settings being equal, it makes you more likely to undershoot than overshoot. 

In other words, it gives you cursor movement more similar to the infamous MLT04 where you flick it and the cursor just kinda hits up against a wall at the end and stops due to having a low max speed. Even if this effect was detrimental from a technobabble perspective, in the REAL WORLD, it's beneficial to my aim and is almost like having a cheat engine enabled because you just can't overshoot no matter how hard you try. And it's much easier to hit targets with flicks if you have MLT04-style cursor movement and can't overshoot. It also feels like it adds more granularity to fine movements from removing the higher overhead HPET noise floor from the system.

Also:



r0ach said:


> update on new build:
> 
> Get more responsive mouse movement using 1333 mhz and lower timings than 1600 mhz. If you have Samsung 'miracle' ram and can do 7-8-7-24 at 1600mhz then you probably don't need to though. This is one reason I haven't bothered screwing with any DDR4 boards. Scared doubling the CAS latency will feel terrible.


----------



## x7007

r0ach said:


> I just assembled my new build. I had been using a board that didn't let you disable HPET for a while and the new one allows me to disable it (Asus P8Z77-I Deluxe ITX) so I can do a good comparison here. The1freeman claims having HPET on is good for TSC to sync with, but whether good in technobabble terms or not, the real world effect of having HPET off is that with all other mouse settings being equal, it makes you more likely to undershoot than overshoot.
> 
> In other words, it gives you cursor movement more similar to the infamous MLT04 where you flick it and the cursor just kinda hits up against a wall at the end and stops due to having a low max speed. Even if this effect was detrimental from a technobabble perspective, in the REAL WORLD, it's beneficial to my aim and is almost like having a cheat engine enabled because you just can't overshoot no matter how hard you try. And it's much easier to hit targets with flicks if you have MLT04-style cursor movement and can't overshoot. It also feels like it adds more granularity to fine movements from removing the higher overhead HPET noise floor from the system.
> 
> Also:




So HPET off in the BIOS gives the better mouse stop?
f33tp (or whatever his YouTube channel name is) and someone else made tests; look here about HPET off in the BIOS:

https://github.com/CHEF-KOCH/GamingTweaks/issues/18#issuecomment-586493606

In his last words he says not to change anything in Windows.


----------



## Axaion

Please waste 2 minutes of your life to do a mousetester graph and latencymon test, just for kicks


----------



## empl

x7007 said:


> Can someone give insight on Bcdedit /set Tscsyncpolicy Default/Legacy/Enhanced?
> 
> Btw, just to give you some testing ideas. don't reinstall windows if you want to see a problem, especially if it doesn't happen in games, use the Windows To Go. if the problem happens in the OS it will keep happening without you losing all data and you could continue everything instead of seeing the issue the same right after. Also, you can always do In-Place install that will do the same as fresh install without removing anything.


Again, probably not the best idea to tweak these. From correspondence with a timer expert (https://www.tweakhound.com/2014/01/30/timer-tweaks-benchmarked/): they should only be used for debugging! I had bad experiences with all of these: platformclock, platformtick and tscsyncpolicy. But test it out.

I think disabledynamictick is safe; it was made for notebooks in the first place to save power, but it also gives you mouse acceleration, so it is really important to disable!

HPET on/off in the BIOS differs per hardware configuration, particularly the CPU. TimerBench (more calls the better) is for testing that, but there is concern about whether its results are accurate. I definitely have better mouse feel with HPET on, much snappier, and TimerBench reports minimal difference between HPET on and off; each test varied by about ±40k and both results were in that range. When I'm tweaking something, I test how it feels first...



jayfkay said:


> DPC latency checker shows 1000µs constantly, so I guess that program is useless.


Yep, use LatencyMon; DPC Latency Checker hasn't worked since Win7. Don't game while you test DPC latency, says LatencyMon support: it already simulates a load. The default mode LatencyMon uses is "interrupt to user process latency". Windows ADK appears to use the same mode, as when I start gaming during a test it gives me the same spikes for the GPU driver, up to 600 µs.


----------



## r0ach

Axaion said:


> Please waste 2 minutes of your life to do a mousetester graph and latencymon test, just for kicks


I've already posted DPC tests of z77 boards before. Got 2 on Gigabytes and 3-4 on Asus ones.


----------



## w1tch

*Results*

Hello, I would like to share my results:
Mouse: Glorious Model O @1000hz
CPU: Ryzen 1600
GPU: RX 580 8GB
RAM: [email protected]
MSIutilv2: all pci devices use msi mode
Drivers/Bios: latest revision

Latencymon (stats/drivers): https://imgur.com/a/qDnzUeq
MouseTester (two attempts): https://imgur.com/a/OpN0eVy
and a picture of default timers.

Problem: cannot flick in csgo, really hard to play kz maps - bad strafe sync.

Goal: how should I go about achieving the "optimized system" graph from the original poster?


----------



## r0ach

w1tch said:


> Hello, I would like to share my results:
> Mouse: Glorious Model O @1000hz
> CPU: Ryzen 1600
> GPU: RX 580 8GB
> RAM: [email protected]
> MSIutilv2: all pci devices use msi mode
> Drivers/Bios: latest revision
> 
> Problem: cannot flick in csgo, really hard to play kz maps - bad strafe sync.
> 
> Goal: How should I go about achieving "optimized system" graph as from original poster?


Here's your problems:

1) Stop reading ANY posts from people who claim you should use that stupid utility to switch all devices to MSI mode that aren't already that way by default like ethernet controller. I get garbage mouse movement by changing ANYTHING to MSI mode. Changing the GPU to MSI mode is a noob trap setting because while it does make it feel like the cursor has less 'friction' on a lot of setups, your accuracy will be significantly worse. It makes cursor movement objectively worse. Period. Nvidia GPUs ship with MSI mode off by default, and I see you have an AMD GPU and those ship with MSI mode on. The context of my post was discussing MSI mode for Nvidia GPUs, so not sure what happens by turning it off for AMD cards.

2) I have both the Model O and Model O- and while they have some of the fastest-responding 3360s I've ever seen, the overall sensor implementation is going to be less accurate than something like the 3366 in the Logitech G403. People can probably compete at a high level with the Model O, but Logitech's sensor implementation is better, and so is SteelSeries' 3360 in the Sensei 310 (but that one is too wide, with rubber grips too slippery, to even use).

3) RAM latency makes a HUGE difference in cursor movement on my DDR3 builds. It sounds like it wouldn't, but it does. The difference between CAS 9 and CAS 7 is night and day. If you're using DDR4, I would not buy anything less than 3200 MHz CAS 14.

4) When I had a room full of GPUs mining digital ****coins, I noticed all the 3rd-party AMD cards had terrible mouse movement compared to the reference R9 290. A lot of Nvidia 3rd-party cards have worse mouse movement than reference as well, but NOWHERE NEAR as bad as I saw on 3rd-party AMD cards. I'm on an Asus 1660 Super Strix right now with dual BIOS; the "quiet mode" BIOS switch isn't very good, but the default "performance mode" BIOS switch is good if you were looking for a cheap card without a gimped BIOS implementation ($250). If you wanted to play it safe, I would get an Nvidia reference card, though. You should also be using only the 441.66 Nvidia driver (other newer ones aren't nearly as good latency-wise) and the following settings:

(screenshots: Nvidia control panel settings)
Important: You also need to change this setting under "Desktop and Color Settings". I don't think the setting comes up if you connect by DVI, but it does for HDMI and possibly others:

(screenshot: Desktop and Color Settings)


----------



## w1tch

Here's your problems:

*1) Stop reading ANY posts from people who claim you should use that stupid utility to switch all devices to MSI mode that aren't already that way by default like ethernet controller.*

By default, all devices except two "High Definition Audio Controllers" are set to MSI mode on my ASRock B450 Pro4. I only switched those two to MSI mode to stop HdAudBus from generating IRQs in LatencyMon.

*2) I have both the Model O and Model O- and while they have some of the fastest response 3360's I've ever seen, the overall sensor implementation is going to be less accurate than something like the 3366 in the Logitech G403.* 

I have had access to a G402, G203, G305, G502, Harpoon RGB and Naga Trinity. Sadly it is not a mouse-dependent problem.

*3) Ram latency makes a HUGE difference on my DDR3 builds for cursor movement. It sounds like it would not, but it does. The difference in using CAS9 and CAS 7 is night and day. If you're using DDR4, I would not buy anything less than 3200 mhz CAS 14.*

It is Patriot Viper Elite 2800 MHz CL16-16-16-36. I do not have access to another pair of RAM sticks.

*4) When I had a room full of GPUs mining digital ****coins, I noticed all the 3rd party AMD cards had terrible mouse movement compared to the R9 290 reference. A lot of Nvidia 3rd party cards have worse mouse movement than reference as well, but NOWHERE NEAR as bad as I saw for 3rd party AMD cards.*

I partially agree with this; the mouse was somewhat easier to "tame" while an ASUS DUAL GTX 1060 3GB was in use. However, I cannot just go and buy parts. If this is unsolvable without changing hardware, then I'm stuck with this lagging mouse movement.

For the monitor, I got 144 Hz via a VESA-certified DisplayPort cable.


----------



## r0ach

I forget the reason, but audio devices aren't supposed to use MSI mode in the first place. Seriously, don't use that utility to force anything to MSI mode.


----------



## Athrutep

w1tch said:


> Hello, I would like to share my results:
> Mouse: Glorious Model O @1000hz
> CPU: Ryzen 1600
> GPU: RX 580 8GB
> RAM: [email protected]
> MSIutilv2: all pci devices use msi mode
> Drivers/Bios: latest revision
> 
> Latencymon (stats/drivers): https://imgur.com/a/qDnzUeq
> MouseTester (two attempts): https://imgur.com/a/OpN0eVy
> and a picture of default timers.
> 
> Problem: cannot flick in csgo, really hard to play kz maps - bad strafe sync.
> 
> Goal: How should I go about achieving "optimized system" graph as from original poster?


The biggest difference you will see is from disabling all C-states and power-throttling options in your BIOS, and from not using anything other than the Windows default antivirus (it's good enough). That alone will lower your DPC latency and increase your polling precision by a lot. The rest of the tweaks do very little, and may even make things worse. And I agree: if you don't have any IRQ conflicts (which shouldn't happen on a fresh install unless you screw things up), don't put devices into MSI mode that aren't already.


----------



## x7007

Could our weird issues come from the computer's clock?
I don't know why, but this made my mouse feel way better and more accurate.

It might take some time for your clock to drift out of sync again, so keep refreshing and it eventually will. With the software, it always keeps it auto-corrected.

The problem
https://blog.codinghorror.com/keeping-time-on-the-pc/

To see the problem
https://time.is/


To fix the issue
https://www.meinbergglobal.com/english/sw/ntp.htm


----------



## Straszy

w1tch said:


> Hello, I would like to share my results:
> Mouse: Glorious Model O @1000hz
> CPU: Ryzen 1600
> GPU: RX 580 8GB
> RAM: [email protected]
> MSIutilv2: all pci devices use msi mode
> Drivers/Bios: latest revision
> 
> Latencymon (stats/drivers): https://imgur.com/a/qDnzUeq
> MouseTester (two attempts): https://imgur.com/a/OpN0eVy
> and a picture of default timers.
> 
> Problem: cannot flick in csgo, really hard to play kz maps - bad strafe sync.
> 
> Goal: How should I go about achieving "optimized system" graph as from original poster?


Just stop messing with your OS... Do a fresh Windows install (download the ISO with the Media Creation Tool) without an internet connection, then disable all the crap like Windows Update and background apps, install your drivers, connect to the internet and voilà. And disable the power-saving features in the BIOS as well...


----------



## empl

x7007 said:


> Could our weird issues be from Computer time?
> I don't know what but it made my mouse feel way better and accurate.
> 
> it might take some time for your time to go out of correction, so you need to keep refresh it will do so eventually. with the software, it always keeps autocorrect it
> 
> The problem
> https://blog.codinghorror.com/keeping-time-on-the-pc/
> 
> To see the problem
> https://time.is/
> 
> 
> To fix the issue
> https://www.meinbergglobal.com/english/sw/ntp.htm


Interesting, but installing this on my system made the mouse feel worse, even though it synchronized the time perfectly (it was previously off by 0.7 seconds). Who knows everything it does; it even installed a service. Even after uninstalling it the problem persisted, and after synchronizing manually to the Windows time server the lag was gone LOL...

Does system time have any effect on mouse movement??? I don't know which timer is used for keeping time while the computer is running, but when it is off it's the RTC, right? I don't think having the system time off matters for mouse movement, and there was nothing in that article to suggest it does. If anything, the mouse feels laggier since I uninstalled this program. Do you have any other article that would suggest a connection? Otherwise I have QPC, RTC and ACPI all the same after running it and gaming. Surprisingly, CPU-Z doesn't show HPET even though I have it on. But, weirdly, after synchronizing back to the Windows time server the lag is gone. I honestly have no idea about this...


----------



## x7007

empl said:


> Interesting, but installing this on my system - even in synchronized time perfectly, which was previously off 0.7 second. Made mouse feel worse. Even after uninstallation problem persisted and after synchronizing manually to windows time server, lag is gone LOL...
> 
> Does system time have any effect on mouse movement ??? When computer is running i don't know what timer is used for keeping time. But when it is off it is RTC right ? I don't think having system time off matters on mouse movement and it that article: there was nothing that would suggest it does. After i uninstalled this program mouse feels laggier if anything. Do you have any other article that would suggest so. Otherwise i have qpc, rtc and acpi all same, after running it and gaming. Surprisingly cpu-z doesn't show hpet, while i have it on.


If you read everything, it said this was used to cheat in Counter-Strike: before anyone could do anything, he would kill them all.
So changing the timing affects things, especially in multiplayer.
For me the mouse movement is better; you might have something else going on.


----------



## empl

I missed that part; that's pretty canny, but there is nothing to suggest that it affects mouse movement. Strange as it is, though, there was a difference before and after. I'm just not sure the time is what caused it.


----------



## Versus2190

r0ach said:


> I just assembled my new build. I had been using a board that didn't let you disable HPET for a while and the new one allows me to disable it (Asus P8Z77-I Deluxe ITX) so I can do a good comparison here.


Would be interesting to hear how you feel about hpet on vs off in a direct comparison.


----------



## jayfkay

I clocked my RAM from the default 2400 16-16-16-38 to 2400 12-12-12-28 and, while LatMon and the polling rate have hardly changed, the cursor on the desktop instantly felt more responsive.
I also went from 480 fps to 530 fps in the CS:GO benchmark.
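A quick sanity check on timing numbers like these: absolute (first-word) CAS latency in nanoseconds is CL cycles times the clock period, i.e. CL × 2000 / (data rate in MT/s), since the DDR clock runs at half the data rate. A small sketch of that rule of thumb (a calculation, not a measurement):

```python
# First-word CAS latency in nanoseconds from CL and data rate (MT/s):
# the DDR clock runs at half the data rate, so one cycle is 2000/MT/s ns.

def cas_ns(cl, data_rate_mts):
    return cl * 2000.0 / data_rate_mts

print(cas_ns(16, 2400))  # ~13.33 ns: 2400 CL16 (the default timings above)
print(cas_ns(12, 2400))  # 10.0 ns: 2400 CL12 (the tightened timings)
print(cas_ns(7, 1600))   # 8.75 ns: DDR3-1600 CL7
print(cas_ns(14, 3200))  # 8.75 ns: DDR4-3200 CL14
```

This is also why DDR4-3200 CL14 keeps coming up in this thread: in absolute nanoseconds it matches a tight DDR3-1600 CL7 kit.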


----------



## andreeeeee

x7007 said:


> Could our weird issues be from Computer time?
> I don't know what but it made my mouse feel way better and accurate.
> 
> it might take some time for your time to go out of correction, so you need to keep refresh it will do so eventually. with the software, it always keeps autocorrect it
> 
> The problem
> https://blog.codinghorror.com/keeping-time-on-the-pc/
> 
> To see the problem
> https://time.is/
> 
> 
> To fix the issue
> https://www.meinbergglobal.com/english/sw/ntp.htm


(screenshot: time.is showing the clock offset)
This can't be good, right?

ps: HPET on at the moment (pc running after sleep/hibernation)

edit 1: I have a Dell - from https://blog.codinghorror.com/keeping-time-on-the-pc/:

(quoted excerpt from the article)
As I posted here, for years I've been feeling a huge desync/hitreg problem with HPET off on Windows 7 (it's forced ON on my BIOS): https://www.overclock.net/forum/28325906-post1023.html

When I forced HPET on, suddenly my hitreg got so much better. However, I feel like the game (CS:GO) is indeed "slower"

edit 2: If I try to manually update my time via the Windows time server (like @empl did), the offset remains the same. However, if I change my time server from the default to Google's, then it gets right


(screenshot: the offset after switching time servers)
Haven't tested on game yet if it affects something, but should it? (like it did for @empl and @x7007?)

Could it affect the UDP packets somehow? Timestamps? Or affect the way apps/games sync it?


----------



## empl

I don't know how game engines work exactly, but for modern games... I don't think it would affect anything. I've never heard of the clock affecting a PC game, except in that article about CS 1.6, which doesn't exist anymore. Modern games have to account for that, right? For example, sending a packet to a server, which sends a packet back, and recording when it arrived can measure the difference in time; although I don't know how accurate that is, I'm no netcode expert.

Even if the time is off, it is still passing at the same rate (unless something causes the rate to change, but that would mostly be due to some bug and happen at one moment; and even if it were slowly drifting, would it matter?). How long you were walking, for example, would depend only on the difference between when you started and when you stopped. And I don't even know whether games use the same time-keeping method as the Windows clock, or a more precise source like HPET or the TSC.
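The packet round trip described above is essentially how NTP itself estimates a client's clock error. A minimal sketch of that math (RFC 5905 style, with made-up timestamps):

```python
# Sketch of NTP's clock-offset math (RFC 5905): four timestamps per exchange.
# t0 = client send, t1 = server receive, t2 = server send, t3 = client
# receive, where t0/t3 are on the client's clock and t1/t2 on the server's.

def ntp_offset_delay(t0, t1, t2, t3):
    offset = ((t1 - t0) + (t2 - t3)) / 2.0  # estimated client clock error
    delay = (t3 - t0) - (t2 - t1)           # round-trip network delay
    return offset, delay

# Client clock 0.7 s behind the server, with 20 ms network delay each way:
off, d = ntp_offset_delay(10.000, 10.720, 10.721, 10.041)
print(off, d)
```

The estimate assumes the network delay is symmetric; asymmetric routes bias the offset, which is one reason different time servers can report slightly different corrections.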

But I can tell you that installing that NTP program only increased lag on my system; after uninstalling it and synchronizing time with the Windows server the lag was gone, but that could be just me. I don't know what that program does exactly, plus it installs a service, and services can be a source of lag!!! You can also try CPU-Z (Tools/Timers), start the test and a game, and then check whether all the timers have the same values. And get your BCLK as close to 100 MHz as possible.

You can check out my tweaking guide if you are bored.

PS: yeah, RAM timings make a huge difference in lag as well; the problem is that RAM affects fps a lot today. There is a 20 fps difference between 2133 and 3600 MHz. I got G.Skill Trident Z 3200 MHz CL16 on Black Friday for about $100, while CL14 would cost about 50% more now. So I'm happy with my RAM, I got it super cheap. But the thing is, the maximum-performance RAM won't always have the lowest latency. I found some insane list of benchmarks where a 2x2 GB DDR3 kit had the lowest latency of all, beating the major brands. So you have to choose between throughput and latency. And games like CS:GO are single-core, so RAM frequency and timings and CPU frequency also matter a lot. Ideally you want the lowest timings while still achieving maximum throughput.


----------



## Timecard

Pretty sure the Half-Life engine family (CS 1.6, Source, CS:GO) takes this into account in all versions. However, it also has a maximum time discrepancy in the netcode, so if the client time is beyond a certain threshold it converts (slams) the time on the messages to the server time instead (the received time). So you could set your clock out by 1-2 minutes and the engine would compensate by using the server's time; the engine itself isn't looking for a discrepancy of minutes, since it's a near-real-time game.
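That slamming behavior can be sketched in a few lines. Everything here is illustrative: the threshold value and function names are made up, not taken from any engine's source.

```python
# Hedged sketch of the behavior described above: if a client message's
# timestamp deviates from the server clock by more than some threshold,
# the server substitutes its own receive time. The 1.0 s threshold below
# is invented for illustration; a real engine picks its own value.

MAX_CLOCK_SKEW = 1.0  # seconds (hypothetical)

def effective_time(client_ts, server_recv_ts, max_skew=MAX_CLOCK_SKEW):
    """Return the timestamp the server will actually use for the message."""
    if abs(client_ts - server_recv_ts) > max_skew:
        return server_recv_ts  # "slam" to server time
    return client_ts

print(effective_time(100.2, 100.0))  # small skew: client time kept
print(effective_time(220.0, 100.0))  # clock minutes off: slammed to server
```

The upshot matches Timecard's point: a clock that is off by minutes gets corrected by the server, so a small persistent offset is unlikely to be what breaks hit registration.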


----------



## andreeeeee

empl said:


> I don't how how gaming engine works exactly. But modern games... i don't think it would affect anything. I never heard about time affect pc game, only in that article about cs 1.6, which doesn't exists anymore. Modern games have to count with that right ? For example sending packet to a server, which send you packet back and recording when it arrived can measure difference in time, although i don't know how accurate this is, i am not netcode expert so i have no idea.
> 
> Even if time is off, if it still passing at the same rate (unless something cause to change it, but that mostly due to some bug right and happens at one moment. And even if it was slowly getting off, would it matter ?). And how long you were walking e.g. it would depend on difference when you started walking and ended. And i don't even know if games use same time measuring method as windows time. If it doesn't use more precise method like hpet, tsc or something.
> 
> But i can tell you that installing that program NTP, only increased lag on my system, after uninstalling it and synchronizing time with windows server lag was gone, but that can be just me. I don't know what that program does exactly, plus it install service. Services can be source of lag !!! You can also try cpuz, tools/timers and start test and game and than check if all timers have same values. And get bclk closest to frequency 100mhz.
> 
> You can check my tweaking guide if you are bored guide
> 
> PS: yea ram timmings cause huge difference in lag as well, problem is today ram affect fps a lot. There is 20 fps difference between 2133 and 3600 mhz. I got gskill trident z 3200 mhz cl16, on black friday like for 100$, while cl 14 would cost about 50% more now, previously dunno. So i am happy with my ram, got them super cheap. But thing is - maximum performance ram won't be always with lowest latency. I found some insane list of benchmarks and some ddr3 2x2 gb ram had lowest latency of all, than major brands. So you have to choose between performance and latency. And games like cs go are single core so ram freq. and timings and cpuf freq. also matters a lot. Ye but ideally you want lowest timings, while achieving maximum throughput.


I had your tweaking guide in my bookmarks, but I didn't know it was yours. Good stuff empl

I'm still running DDR3 1600 MHz and I feel like this is one of my bottlenecks today. Thanks for the tips



Timecard said:


> Pretty sure Half Life Engine (CS1.6, Source, CSGO) takes this into account for all versions, however.... it also has a max time discrepancy in the netcode so if client time is beyond a certain threshold it converts (slams) the time on the messages to the server time instead (received time). So you could change your time out by 1-2 minutes and the engines will compensate for your clients time and use the servers time instead, the engine itself isn't looking for a discrepancy in minutes since it's a near real time game.


Interesting. So maybe this clock offset is the result of another problem, and not itself the cause of my hitreg problems?


----------



## r0ach

jayfkay said:


> I clocked my ram from default 2400 16-16-16-38 to 2400 12-12-12-28 and while LatMon and polling rate have hardly changed, cursor on desktop instantly felt more responsive.
> I also went from a 480 fps to a 530fps benchmark in csgo.


On every Z77 Ivy Bridge DDR3 build I've tried, the difference in mouse response from CAS 9 to CAS 7 is huge. Strangely, the Steambox I have is Haswell, and CAS 9 on that feels about the same as CAS 7 on Ivy Bridge. My theory is that FIVR probably makes the system feel more responsive... since that was FIVR's entire purpose besides power reduction: to reduce latency.


----------



## Marctraider

Best I can get it on my concocted LTSC build.

@r0ach is right in leaving MSI on Nvidia alone.

Bottom one is before any driver installation. Top two after Nvidia install.

So before the GPU driver is installed, this 1809 build is basically the same as Windows 7, and after the install it is still well within an acceptable range.


----------



## Athrutep

Marctraider said:


> Best I can get it on my concocted LTSC build.
> 
> @r0ach is right in leaving MSI on Nvidia alone.
> 
> Bottom one is before any driver installation. Top two after Nvidia install.
> 
> So before gpu driver installed, this 1809 build is basically the same as in Windows 7, and after install still well within acceptable range.


Can you post something longer, like 12 seconds?


----------



## x7007

It's funny how Nvidia is the only one that doesn't have MSI enabled! Even AMD and Intel APUs have MSI enabled. Cheap company, expensive hardware.

Another interesting thing you might look on. this is about stability only and not performance.

One thing I'm sure of is that Windows randomly changes the timer, so if it worked well before, it may not afterwards. I say that because I have a 3770K on the newest Windows 10 x64 RTM with HT enabled, and HPET enabled in the BIOS.

I'm using the BlueStacks emulator for an Android game, and after updating to the latest version I started having issues after long periods of time; it could be a Windows update or the newest BlueStacks.
While the Windows timer is normally 1.0 ms (or 15.625 ms idle), running BlueStacks makes the timer 0.9765 ms; no matter what I do it's always like that.
Now I also need to use bcdedit /set useplatformtick yes so it won't be 0.4988 ms or some other weird value, because the CPU doesn't have TSC or invariant TSC. The other one is of course bcdedit /set disabledynamictick yes.

The game is Raid.
The issue was that after the program and the game had been running for a long time, entering certain parts of the game menus (for example places that need to load browser-style data, updated lists from other users, or events that keep refreshing), the game would not load the data, the menu would freeze, or it would just instantly crash. The only thing I changed was adding bcdedit /set tscsyncpolicy Enhanced, and the problem has not happened since. I would even get a weird connection issue every second, saying it could not connect, which was clearly not a server issue because it worked fine for others in the clan, even while my ping to everywhere on the internet was fine.

So clearly, whatever Windows is doing, it doesn't always pick the best timer for the system, and things will crash or won't work and we won't even understand why.

So this setting is about stability more than performance; that's why you won't see any performance gains.

Also this

https://github.com/CHEF-KOCH/GamingTweaks/issues/18

Final conclusion
Keep HPET in BIOS enabled (newer BIOS/mobo combinations do not even have the option to disable it anymore since it's enabled all the time).

Disable Hpet
bcdedit /set useplatformclock no
Restart your OS

Disable synthetic timers
bcdedit /set useplatformtick yes
Restart your OS

The above stuff only has to be done until 1909 and is NOT needed with 2004 or higher!
You can ignore all other mentioned flags mentioned above.
Best practice:
Install Windows 2004 now, no need to set other options. If you install another OS (upgrade) from within 2004 you do NOT NEED to change anything anymore. The OS detects it and automatically sets above mentioned "flags".

Reported "mouse lags" with the HPET BIOS option enabled are caused by the mentioned flags not being set, or by the mouse driver itself causing some "lag" (inaccuracy). In a lot of cases it's also directly related to the mouse software that "communicates" with the driver (e.g. to change setting X), which means the software may cause additional problems or "lags". The best practice here is to set up your mouse, save the settings into the mouse's internal memory, and remove the mouse software.

Other reported "OS lags" are caused by some problematic KB's or because wrongly configured Windows Defender settings.

Other considerable "problems" are not reproducible.


----------



## Timecard

- As for your timer resolution: it's the applications on your system that alter or request different values, and the lowest requested value always wins/stays. Depending on your operating system you'll notice that the system grants different resolutions for the same requested value (0.5 vs ~0.48, HPET/dynamictick etc.). Perhaps the differences are just bad floating-point math on Microsoft's end, or they changed the foundation of the timer system in later versions.

- Your timer resolution indicates the time difference between each quantum (the time allotted to any given thread/process for CPU resources); depending on the Win32PrioritySeparation value, the foreground boost and long/short/variable definitions may change it. That is to say, if you're seeing a value below 0.5, like 0.486, it would suggest the quantums are smaller (faster, quicker transitions) than at 0.5, which could be a good thing. I haven't seen any real research with data showing performance differences between values like 0.5 and 0.486, so don't just assume 0.5 is best versus the other values you see on newer Windows without comparing results yourself.
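The "lowest request wins" rule, and the odd granted values like 0.9765 ms reported earlier in the thread, can be modeled in a few lines. To be clear about the assumptions: this is a toy model, not kernel code, and the quantization-to-divisors-of-15.625 ms rule is a plausible explanation for the observed numbers, not documented Windows behavior.

```python
# Toy model of the two behaviors described above (not actual kernel code).
# First: among all outstanding timer-resolution requests, the smallest wins.
# Second: one plausible explanation for seeing 0.9765 ms instead of 1.0 ms
# is that granted resolutions are quantized to integer divisors of the
# 15.625 ms (64 Hz) base interval; that rule is an assumption of this sketch.

BASE_MS = 15.625  # default 64 Hz system tick

def quantize(request_ms):
    """Largest interval of the form BASE_MS / n (integer n) <= the request."""
    n = 1
    while BASE_MS / n > request_ms:
        n += 1
    return BASE_MS / n

def effective_resolution(requests_ms):
    """Lowest outstanding request wins; with no requests, the idle default."""
    if not requests_ms:
        return BASE_MS
    return quantize(min(requests_ms))

print(effective_resolution([]))           # idle: 15.625 ms
print(effective_resolution([1.0, 10.0]))  # 1 ms request -> 15.625/16 ms
```

Under this model, a 1 ms request comes back as 15.625/16 = 0.9765625 ms, which is exactly the value x7007 reported with BlueStacks running.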


----------



## andreeeeee

Marctraider said:


> Best I can get it on my concocted LTSC build.
> 
> @r0ach is right in leaving MSI on Nvidia alone.
> 
> Bottom one is before any driver installation. Top two after Nvidia install.
> 
> So before gpu driver installed, this 1809 build is basically the same as in Windows 7, and after install still well within acceptable range.



Of these top two (after nvidia install), the better one is with MSI off and the worse with MSI on?


----------



## Marctraider

Thing is, in an uncapped game you're never going to notice useplatformtick yes/no.

It starts to matter once you use (in-game) fps limiters, because of their accuracy. Unless you don't believe they give the best consistency/input-lag balance, you'll generally want to use them.

Try it in CSGO or any CryEngine-based game with the in-engine limiter and compare results. Without the synthetic timer the cap is much more stable, with less drift (basically like back in Win 7).

As for RTSS etc., these are already very accurate regardless, so I doubt this will make much of a difference.

For mouse polling it doesn't fix the outliers/spikes either. The best thing to try is screwing around with interrupt affinities.
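For context, in-engine caps of this kind are usually a coarse sleep (only as precise as the system timer) plus a busy-wait on the high-resolution counter, which is roughly why the platform timer choice shows up in cap stability. A minimal sketch, assuming a hypothetical 2 ms sleep slack (not any particular engine's actual code):

```python
import time

def capped_frame(deadline, period):
    """Wait out one frame ending at `deadline` (perf_counter seconds).

    time.sleep() is only as precise as the current OS timer resolution,
    which is why a coarse/forced platform tick makes caps drift; the
    busy-wait on the fine counter absorbs the remaining error."""
    remaining = deadline - time.perf_counter()
    if remaining > 0.002:              # leave ~2 ms slack for sleep jitter
        time.sleep(remaining - 0.002)
    while time.perf_counter() < deadline:
        pass                           # spin on the high-resolution counter
    return deadline + period           # next frame's deadline

# cap a dummy render loop at 250 fps
period = 1.0 / 250
deadline = time.perf_counter() + period
for _ in range(5):
    deadline = capped_frame(deadline, period)
```

The coarser the tick, the more work falls on the spin loop (or, if the limiter has no spin loop, the more the cap drifts).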


----------



## Marctraider

andreeeeee said:


> Of these top two (after nvidia install), the better one is with MSI off and the worse with MSI on?


This is all with the default (MSI off), aka emulated legacy interrupt.


----------



## deepor

andreeeeee said:


> Of these top two (after nvidia install), the better one is with MSI off and the worse with MSI on?



If you look closely at those top two, they seem to be the same. They use a different scale on the vertical axis. If you keep that different scale in mind, you'll see they show a similar picture.


----------



## r0ach

Marctraider said:


> Thing is, in an uncapped game you're never going to notice useplatformtick yes/no


Stop trolling. Even terrible gamers can notice the difference between HPET on and off for mouse movement. And there's no reason to ever have useplatformclock forced on. If someone can't notice any difference at all with HPET on vs off, there is something seriously wrong with their sensory perception.


----------



## RamenRider

empl said:


> If you disable HPET, what timer is exactly used ? Besides HPET, i have 2 more devices in device manager (at least those i found, maybe there are more of them, or some aren't visible maybe).
> They are: system timer and cmos/ real time clock, which is old device for measuring time, doubt it is used. I wasn't able to find any concrete info about system timer. I suspect it is that legacy timer, it has i/o address and irq number. I don't have gaming mobo, i have currently gigabyte ga-b75m-d2v and i am using hpet. But i have still very low isr and dpc latency. Disabling it doesn't improve my dpc latency, maybe it even makes it worse and mouse feels worse. But it vary on each system.
> 
> Btw 5-15 dpc latency offload ? What is your dpc latency in-load ? There are some good mobos, which have <100us in load on some systems, but usually drivers can spike even to couple hundred. My nvidia drivers usually have like 600us. For example asrock z390 phantom gaming-itx/ac, anandtech measured least than 100us. But these test are very relative, as each hardware configuration is different. And it may not be clear from tests, if these values are before, or after tweaking.
> 
> Btw why hpet doesn't have irq ? I read in old book: "Windows 7 annoyances" that says - users had good experience to give highest interrupt priority to "cmos/ real time clock", or to gpu. Under "PriorityControl" key, which is located at HKLM\SYSTEM\CurrentControlSet. I can feel the difference, if i prioritize gpu and usb. Even measuring usbport.sys, highest execution time got lower, while i maxed polling rate. But this value may fluctuate. It is hard to take these values seriously, as you can't exactly replicate system load at one point in time. But i am very sensitive to input lag, so i can usually feel the difference and go by that. Even drivers adjusts irq priority under HKLM/system/current control set/enum/"hardware id of device"/affinity policy: device policy. E.g. sata automatically adjust it and give itself high priority, when you put them to undefined input lag improves. Same thing was saying 1 person here.
> 
> Even i was told interrupt affinity can be ignored on hardware/driver level by latencymon support. I was skeptical setting this in regedit at first, partly because it never worked for me. I had to set gpu to msi for some reason in order for it to work. Now i don't have everything scheduled on core 0. Same thing is with interrupt priority, setting it in registry works. Even some drivers put themselves to high priority, which is mistake probably and doesn't allow system to schedule interrupts properly, as putting it to undefined reduces input lag. It wouldn't surprise me, as badly coded drivers are source of high dpc latency. Still drivers/hardware supersede values from registry. Otherwise having it improperly configured would slow down system significantly. But it works, as we can see. Minimally at least sometimes: until it is overridden by drivers/hardware etc. But it still helps configure it manually.


Check out my new and improved Input Lag guide:
https://community.amd.com/message/2...TMHNPpJvcUJ73n8KSD-Q951-MStxi6YZLWotL_8b8hvA8

"[Currently in Research] Windows 10 QueryPerformanceFrequency and timer resolutions. Youtuber FR33THY made a video about how it's better to have HPET on in BIOS and off in OS to keep a perfect low timer resolution of 0.5, instead of 0.48 or any other unbalanced number that causes desyncing problems which contributes to stuttering and desyncing.
To learn more about this check out this thread explaining timer resolution impact on performance. https://www.reddit.com/r/Amd/comments/epl1j3/ramds_best_threads_bftbugs_fixes_and_tips_v1/fekr7td/"
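If you want to check what your high-resolution counter (the thing QueryPerformanceFrequency describes) actually resolves to, as opposed to the 0.5 ms scheduler tick the guide talks about, here is a quick probe; Python's `time.perf_counter_ns` wraps QueryPerformanceCounter on Windows, and the sample count is arbitrary:

```python
import time

def counter_step_ns(samples=100_000):
    """Smallest observed increment of the high-resolution counter, in ns.

    On Windows this reflects QueryPerformanceFrequency (e.g. a 10 MHz
    counter steps in 100 ns); it is a separate timer from the 0.5 ms
    scheduler tick, so don't confuse the two numbers."""
    best = None
    prev = time.perf_counter_ns()
    for _ in range(samples):
        now = time.perf_counter_ns()
        if now != prev:
            step = now - prev
            best = step if best is None else min(best, step)
            prev = now
    return best

print(counter_step_ns())  # counter granularity in nanoseconds
```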


----------



## x7007

Also, I've found that for some reason using the Intelligent Standby List Cleaner (ISLC) v1.0.2.2, or even 1.0.2.1, causes some games not to start while it's set to Start. For example Star Wars Battlefront II: the game would just show its little window and do nothing. I tried more than 10 times. As soon as I change ISLC to Stop, the game starts immediately.


----------



## SuperMumrik

RamenRider said:


> Check out my new and improved Input Lag guide:
> https://community.amd.com/message/2...TMHNPpJvcUJ73n8KSD-Q951-MStxi6YZLWotL_8b8hvA8
> There is no fix. Windows XP was a hardware accelerated desktop, then Win 7 wasn't, then Win 8.1 was again. This is why desktop cursor movement vs exclusive full-screen 3d mode feels the same in Windows 8.1 but not Windows 7. What's bizarre is Nvidia doesn't support Windows 8.1 on 2060-2080 series but does on 1600 series. I refuse to use Windows 10 New World Order edition myself. It's Microsoft attempting to transition to a fully locked down, Apple-style OS, and I would use it solely as a game box and nothing else due to that, but cursor movement is worse than Win 8.1, so I have no use for it at all."



There is 20-series support in Win 8.1 via the Win 7 driver.
Other than that, it's a very good guide IMO.


----------



## x7007

RamenRider said:


> Check out my new and improved Input Lag guide:
> https://community.amd.com/message/2...TMHNPpJvcUJ73n8KSD-Q951-MStxi6YZLWotL_8b8hvA8
> 
> "[Currently in Research] Windows 10 QueryPerformanceFrequency and timer resolutions. Youtuber FR33THY made a video about how it's better to have HPET on in BIOS and off in OS to keep a perfect low timer resolution of 0.5, instead of 0.48 or any other unbalanced number that causes desyncing problems which contributes to stuttering and desyncing.
> To learn more about this check out this thread explaining timer resolution impact on performance. https://www.reddit.com/r/Amd/comments/epl1j3/ramds_best_threads_bftbugs_fixes_and_tips_v1/fekr7td/"



Nice guide, mate. You can also add the things I discovered about TscSyncPolicy; it should help with stability on Windows 1903-1909 up until build 2004, where it was said to be fixed. Windows detects the timer every boot, and sometimes it fails to do it properly, which is why some things change.

I tested disabling smooth scrolling in IE11, which seems to also help.

About the languages: it is better to use English (Philippines), because English (Israel) has UK symbols and you don't get a normal keyboard layout, so it's preferable to use the former. We need to find a way to stop Windows from installing language packs; I use Hebrew and it keeps installing the pack. It can't affect input lag much, but it does take some storage and I'll never use it.

Overall a good guide; it should help many people if they follow it. Just delete the MSI section for Nvidia, it is the worst idea for it. Nvidia doesn't want us to have the best performance for the price of a GeForce, only if you buy a Quadro.

All the things I said above fixed my lag in a miraculous way, and it is noticeable.

But finally, we are so close to fixing the input lag. Remember that Windows 7 was simply different: fewer things in the OS and fewer technologies. So when they add many new things and change many others, we will have issues.


----------



## empl

Great guide from RamenRider!!! I didn't have time to go over all of it yet.

*LOLOLOL I just found an EPIC INPUT LAG TWEAK*: https://superuser.com/questions/101...if-the-sound-card-in-the-pc-has-been-disabled
The sound card is a major source of input lag; test it on/off and you will see!!! You can use a USB headset with the sound card off, as it has a sound card integrated into its controller. Though I don't know how much DPC latency a USB sound card would generate, the motherboard sound card has huge latency!

*Also, AnandTech does DPC latency tests for mobos, which are relative, but better than nothing!*


What did you find with TscSyncPolicy? There are too many posts...

-> Disabling smooth scrolling helped a lot, GJ! EN Philippines, maybe, dunno.

Also, if you could somehow disable hardware acceleration: you can still use Nvidia Inspector to force single display mode under the "acceleration" setting for the lowest lag. But still, before installing Nvidia drivers, if I disable HW acceleration the mouse feels better than with the Nvidia drivers installed.



x7007 said:


> Just delete the MSI section for Nvidia, it is the worst idea for it. Nvidia doesn't want us to have the best performance for the price of Geforce, only if you buy Quadro.


-> What do you mean exactly by that? Nvidia is disabling MSI mode for their cards??? It wouldn't surprise me though; they locked their BIOS on newer cards! They say it's to prevent selling fake GPUs, but I don't think that was their first priority!

Also, disabling DWM would be the holy grail. Even though there are ways to do that, you are stuck with windowed mode, which can reduce performance and slightly increase input lag. Also, you can freeze your system and damage it; it is risky. And it is too drastic to do on a daily basis.

Also, do you know about the program Autoruns? There are various hidden processes and services running on your PC (see the Steam guide). In an msconfig clean boot, if you disable all non-MS services, it reduces lag a lot! The problem is you have to restart the PC each time and tick the fields back on (because you can't get by without some of them), which is too tedious. That makes me want to multi-boot! Also, scheduled tasks are a problem, but I had a terrible experience touching anything in Task Scheduler; it gave me permanent lag until a Windows repair via USB/reinstall, on multiple systems!

PS: Intel is a weasel and a chicken; yes, they tried to avoid comparisons, and MS ****s on AMD and doesn't release updates for like a year, while Intel has them ASAP. Of course Intel is scared now and tries to boycott AMD... And Nvidia isn't spectacular either; about AMD GPUs, still dunno. They had a bug where users couldn't use more than a 60Hz monitor, and I have a feeling it wasn't fixed any time soon.


----------



## r0ach

empl said:


> *LOLOLOL i just found EPIC INPUT LAG TWEAK* : https://superuser.com/questions/101...if-the-sound-card-in-the-pc-has-been-disabled
> Sound card is major source of input lag, test on/off and you will see !!!


You don't need to use headphones. I run with no soundcard and just output sound from Nvidia GPU to monitor, then from monitor to two stereo speakers via the 3.5mm jack. Most monitors have 3.5mm jacks nowadays.


----------



## x7007

I am using a Creative SXFI for headphones and an HT Omega eClaro PCIe sound card for 5.1.

I don't have a single DPC issue; at least I didn't, before or after tweaks. But yes, Realtek is crap.


----------



## Athrutep

r0ach said:


> You don't need to use headphones. I run with no soundcard and just output sound from Nvidia GPU to monitor, then from monitor to two stereo speakers via the 3.5mm jack. Most monitors have 3.5mm jacks nowadays.


But what if the Nvidia sound introduces more input lag than a specific mobo's built-in sound card?


----------



## Timecard

DWM "usually" doesn't enforce vsync or use any resources when you run a game in fullscreen exclusive mode. If you aren't sure whether your game is impacted, use GPUView to confirm. Lots of great examples of diagnosing real-world scenarios and game delays in the links below. Happy hunting.

https://graphics.stanford.edu/~mdfisher/GPUView.html
https://forums.blurbusters.com/viewtopic.php?f=10&t=6214


----------



## r0ach

Athrutep said:


> But what if the Nvidia sound introduces more input lag than a specific mobo's built-in sound card?


It doesn't, because you don't install the Nvidia sound driver for it, so it defaults to the Microsoft Windows sound driver for the GPU's audio output. When I used a PCI-E 1x Soundblaster Z, I did the same thing: just put the card in, installed no driver for it, and let it use Windows' default audio driver. I've tested on-board audio solutions before, and they are god awful input-lag-wise compared to the two above alternatives.


----------



## Athrutep

r0ach said:


> It doesn't because you don't install the Nvidia sound driver for it so it defaults to the Microsoft Windows sound driver for the GPU's audio output. When I used a PCI-E 1x Soundblaster Z, I did the same thing and just put the card in, installed no driver for it, and let it use Window's default audio driver. I've tested on-board audio solutions before and they are god awful input lag-wise compared to the two above alternatives.


Have you tested that?

I can't exactly recall if I read it in your thread, but weren't you saying that you should avoid using any PCIe cards at all if possible?


----------



## r0ach

Using a Soundblaster Z PCI-E sound card without any driver installed, with the default Microsoft Windows sound driver, is FAR better than using on-board Realtek sound and installing its necessary drivers. But just outputting sound from the Nvidia card's HDMI/DisplayPort to the monitor, then plugging speakers into the monitor's 3.5mm jack, is going to be better.

You can't turn off the Nvidia card's audio out anyway (the OS detects it's there and automatically installs the default Microsoft driver), so putting in a PCI-E sound card just means you have... two sound cards for no reason, basically. Having multiple audio devices, or devices you don't need, always has a large chance of screwing up your mouse movement. It was worse in the old days of the 580 GTX, when I think every single port on the card had its own audio driver instead of just one for the whole card, so you would look in your Device Manager and it would be flooded with audio devices.


----------



## Athrutep

r0ach said:


> Using a Soundblaster Z PCI-E sound card without any driver installed and using the default Microsoft Windows sound driver for it is FAR better than using on-board Realtek sound and installing it's necessary drivers. But just outputting sound from the Nvidia card's HDMI/Displayport to the monitor then plugging in speakers to the monitor's 3.5mm jack is going to be better.
> 
> You can't turn off the Nvidia card's audio out anyway - the OS detects it's there and automatically installs the default Microsoft driver - so putting in a PCI-E sound card just means you have...two sound cards for no reason basically. Having multiple audio devices or devices you don't need always has a large chance of screwing up your mouse movement. It was worse in the old days of the 580 GTX when I think every single port on the card had it's own audio driver instead of just one for the whole card so you would look in your device manager and it would be flooded with audio devices.


Too bad that the sound quality is just ass then, unless there is a way of running an equalizer to make this actually useful in gaming scenarios where you have to rely on directional sound.


----------



## x7007

Athrutep said:


> Too bad that the sound quality is just ass then, unless there is a way of running an equalizer to make this actually useful in gaming scenarios where you have to rely on directional sound.


Why not buy a Creative SXFI Amp or X3? You will have the best surround that exists for the price. Also, it uses the best USB controller compared to the others. No drivers, just plug and play. The X3 requires the control panel though, but it won't cause any issues; the SXFI Amp just has a small Windows framework control panel, and you don't have to run it all the time: set it and close it.


----------



## Athrutep

x7007 said:


> why not buy creative sxfi amp or x3. u will have the best surround exist for the price. also it uses the best USB controller compare to others. no drivers, just plug and play. the x3 requires the control panel though, but it won't cause any issues. the sxfi amp just have small windows framework code control panel. but u don't have to run it all the time set and close it


Found a Creative Sound BlasterX G6 for just 80€ and the X3 for 95€, so I guess I will get the G6.


----------



## empl

r0ach said:


> You don't need to use headphones. I run with no soundcard and just output sound from Nvidia GPU to monitor, then from monitor to two stereo speakers via the 3.5mm jack. Most monitors have 3.5mm jacks nowadays.


Doesn't Nvidia HD Audio cause lag? I literally disable it in Device Manager, and I think it lags. I found on the internet that Nvidia High Definition Audio can cause mouse lag. Also, how does it work without a sound card? I've never heard of a monitor with a sound card or DAC. Also, how many gaming monitors have a 3.5mm jack?

Using a USB headset might not be the best solution either; I've heard they cause a lot of DPC latency. How about streaming the sound to a second PC and playing it there?



> Athrutep


All PCIe cards cause high DPC latency!!!

I did some research, and there is probably no need for a dedicated sound card today, as onboard sound has gotten better and has EMI shielding, unless you use some high-end equipment or hear static. You can buy an external DAC, as I've heard the Realtek DAC is ****. But even that wouldn't make much difference, probably, unless you have some expensive headphones. But sound cards have special effects, and maybe if you have certain headphones, like ones from Sound Blaster, you can benefit from a Sound Blaster card enabling some technology. I don't know what Sound Blaster uses.


----------



## x7007

Athrutep said:


> Found a Creative Sound BlasterX G6 for just 80€ and the X3 for 95€. So i guess i will get the G6


It's not the same thing. The G6 is old as hell and doesn't come with the SXFI technology; you will get crappy virtual surround. Get the X3 or the SXFI Amp only.


----------



## Athrutep

x7007 said:


> it's not the same thing. g6 is old as hell and doesn't come with the Sxfi technology. you will get crappy virtual surround. get the x3 or sxfi amp. only


alright


----------



## Athrutep

r0ach said:


> Using a Soundblaster Z PCI-E sound card without any driver installed and using the default Microsoft Windows sound driver for it is FAR better than using on-board Realtek sound and installing it's necessary drivers. But just outputting sound from the Nvidia card's HDMI/Displayport to the monitor then plugging in speakers to the monitor's 3.5mm jack is going to be better.
> 
> You can't turn off the Nvidia card's audio out anyway - the OS detects it's there and automatically installs the default Microsoft driver - so putting in a PCI-E sound card just means you have...two sound cards for no reason basically. Having multiple audio devices or devices you don't need always has a large chance of screwing up your mouse movement. It was worse in the old days of the 580 GTX when I think every single port on the card had it's own audio driver instead of just one for the whole card so you would look in your device manager and it would be flooded with audio devices.


I have to give credit where credit is due. 

Polling precision with my onboard Realtek audio, compared to routing audio through my second monitor via HDMI and my Nvidia card. I uninstalled my Realtek drivers and cleaned them with DDU in safe mode before rebooting. Is it a huge difference? No. Is it a difference? Yes. It all adds up.

And here is some empirical evidence. And mind you, this is on the trashy Win 10 1909.

onboard audio

(screenshot)

onboard bios off / audio through gpu to monitor

(screenshot)
It sounds like garbage though, and doesn't do my Beyerdynamic DT 1990s any justice. So I will get the Creative X3 USB audio card, I guess (which should be way better than the Realtek audio chip on my Z390 Aorus).


----------



## x7007

Athrutep said:


> I have to give credit where credit is due.
> 
> Polling precision with my onboard realtek audio compared to routing it through my second monitor via hdmi and my nvidea card. I deinstalled my realtek drivers before and cleaned it with DDU in safemode before rebooting. Is it a huge difference? No. Is it a difference, yes. It all adds up.
> 
> And here is some empirical evidence. And mind you this is on the trashy win 10 1909
> 
> onboard audio
> 
> onboard bios off / audio through gpu to monitor
> 
> 
> sounds like garbage though and doesn't do my beyer dynamics DT1990s any justice. So i will get the Creative X3 usb audio card, i guess(which should be way better than the realtek audio chip on my z390 aorus).


For the best surround, make sure to install the control panel to get the most up-to-date firmware. You will also need to create a headphone profile so the surround is fitted to your own ears; you will need to use your phone to take pictures, just follow the guide when you get it. If you can hear surround at all (some people physically can't, because their brain can't interpret the delays), then you will say it's the best surround you've ever experienced. Make sure you take the pictures as well as you can, because they will affect the sound. And with this device you don't need to worry about DPC; it has no effect on it whatsoever.

Just so you know about its quality: I have the SXFI Amp with Sennheiser HD 800 S headphones, and the sound is great for the price. There is no surround or sound quality that comes close at this price. You will also have specific headphone profiles to use; select the DT 1990, it will change the sound a lot.


----------



## Athrutep

x7007 said:


> for the best surround make sure to install the control panel to get the most updated firmware. also you will need to do headphones profile so you will get the best surround fit to you only by your own ears. you will need to use your phone for pictures. just follow the guide when you get it. if you can hear surround because some people physically can't because their brain can't intercept delays, then you will say it's the best surround u ever experienced. make sure you take the pictures the best you can because they will affect the sound. and with this device u don't need to worry about dpc. no affect on it whatsoever.
> 
> just so you know about its quality. I have the sxfi amp with Sennheiser hd800s and the sound is great for it price. there is no surround getting close or sound quality for this price. also you will have specific headphones profiles to use. select the dt 1990. it will change the sound a lot.


Thanks for letting me know, that is helpful.


----------



## Timecard

Some more fun things for you guys to test: I recently did a little test with a custom GUI tool I'm building and found you can micro-adjust the timer resolution.

https://www.overclock.net/forum/375-mice/1743018-micro-adjusting-timer-resolution-experiment.html


----------



## r0ach

Athrutep said:


> So i will get the Creative X3 usb audio card, i guess


Terrible idea. Running a USB audio card is about the worst option you can choose for screwing up mouse movement. If you insist on running a sound card instead of outputting sound from the Nvidia card to the monitor and out of the monitor via 3.5mm jack, the best option is getting a Soundblaster Z PCI-E card and using it with no driver installed with the default, Microsoft Windows driver. 

You said outputting sound from the monitor via 3.5mm jack to two desktop reference speakers sounds bad? It sounds virtually the same as outputting sound from a Soundblaster for me on this Samsung PLS monitor. However, when I tried doing the same thing on an Asus VG27AQ 165hz panel over HDMI 2.0, the sound was all muffled and bad. So it's likely your monitor's fault and not the actual procedure. Maybe test doing it via displayport vs HDMI too.


----------



## x7007

r0ach said:


> Terrible idea. Running a USB audio card is about the worst option you can choose for screwing up mouse movement. If you insist on running a sound card instead of outputting sound from the Nvidia card to the monitor and out of the monitor via 3.5mm jack, the best option is getting a Soundblaster Z PCI-E card and using it with no driver installed with the default, Microsoft Windows driver.
> 
> You said outputting sound from the monitor via 3.5mm jack to two desktop reference speakers sounds bad? It sounds virtually the same as outputting sound from a Soundblaster for me on this Samsung PLS monitor. However, when I tried doing the same thing on an Asus VG27AQ 165hz panel over HDMI 2.0, the sound was all muffled and bad. So it's likely your monitor's fault and not the actual procedure. Maybe test doing it via displayport vs HDMI too.


r0ach, with that one you are wrong. I have the SXFI Amp, which is almost the same, and I don't have any issue with mouse movement or DPC. When I had mouse issues, I had them with a PCIe sound card too, and I don't have mouse issues now with either PCIe or USB. It's not the standard USB sound card you're thinking of. Creative won all the prizes for 2018 or 2019 with this device. Also, my friend has this X3 device, and he is all about input lag and mouse movement.


----------



## r0ach

Having anything plugged into the USB ports besides the mouse itself affects mouse movement negatively, so of course running a USB sound card is a horrible idea. Running audio from the Nvidia HDMI/DisplayPort to the monitor's 3.5mm jack to two desktop speakers is the best option for mouse movement, followed by running a PCI-E Soundblaster Z with no drivers installed, using the default Windows audio driver for it. USB audio is dead last, the worst option.


----------



## x7007

r0ach said:


> Having anything plugged in to the USB ports at all besides the mouse itself affects mouse movement negatively, so of course running a USB sound card is a horrible idea.


The only devices made are USB. Otherwise you'd want to buy a very expensive DAC, which doesn't have the same surround you get from this. Sometimes the input lag is fine when you get the highest sound quality.


----------



## r0ach

You can buy expensive, audiophile, 'reference' speakers and hook them up via the monitor's 3.5mm headphone jack.


----------



## x7007

r0ach said:


> You can buy expensive, audiophile, 'reference' speakers and hook them up via the monitor's 3.5mm headphone jack.



I have a TV, an LG E6 OLED, for using 3D in games.
It won't be the same quality used like that; there won't be true surround like the SXFI. Please read what this tech brings to the table.

Also, I have the HD 800 S headphones;
I don't want to use speakers.


----------



## Athrutep

r0ach said:


> Terrible idea. Running a USB audio card is about the worst option you can choose for screwing up mouse movement. If you insist on running a sound card instead of outputting sound from the Nvidia card to the monitor and out of the monitor via 3.5mm jack, the best option is getting a Soundblaster Z PCI-E card and using it with no driver installed with the default, Microsoft Windows driver.
> 
> You said outputting sound from the monitor via 3.5mm jack to two desktop reference speakers sounds bad? It sounds virtually the same as outputting sound from a Soundblaster for me on this Samsung PLS monitor. However, when I tried doing the same thing on an Asus VG27AQ 165hz panel over HDMI 2.0, the sound was all muffled and bad. So it's likely your monitor's fault and not the actual procedure. Maybe test doing it via displayport vs HDMI too.


I have a PCIe Creative X-Fi Titanium sound card. But using it without the drivers sounds as bad as my monitor output, since it's then just a passthrough device with no equalizer. A DAC/amp would solve that, but then it's also connected through USB.

I will try setting up the X3 on my second PC, saving some profiles, then connecting it to a USB hub on my main PC that doesn't have any other devices connected to it. But if the only way of getting marginally lower latency is by gimping myself with bad directional audio, I'll choose the increased latency, since gimping myself with bad sound is much more of a drawback.


----------



## PMB

A USB sound card is completely fine for mouse movement. If you are really paranoid, just make sure not to use the same USB hub on your mobo: most mobos have one dedicated USB 2.0 hub and one for USB 3.0, so single out the former for your mouse.

Getting a crappy PCIe sound card, on the other hand, can lead to terrible issues, as Creative just doesn't give a single f about their drivers. I had the AE-5, and it didn't receive updates for about a year after Win 1803 or so; it ruined my DPC latency when Microsoft changed something in the sound department.

Now I'm using a Sharkoon USB DAC Pro. It works plug and play (no driver issues), DPC latency is great, and the sound is better than before, as there is no interference with the graphics card.


----------



## r0ach

PMB said:


> A USB soundcard is completely fine for mouse movement, if you are really paranoid just make sure to not use the same usb hub on your mobo. Mostly mobo have 1 dedicated usb2.0 hub and one for usb 3.0, just single out the former for your mouse usage.


Not a viable solution. I have a Haswell PC lying around running Win 8.1 (Steambox/Alienware Alpha) that you can't disable USB3 on, so Device Manager shows both a USB2 and a USB3 controller. Even if you plug a mouse into the USB2 port, it acts in passthrough mode, and the mouse is really controlled by the USB3 controller if you dig under the hood. I'm not sure if it's Windows 8.1 and higher facilitating this, or the hardware itself, or both, but the last chipset where you can disable USB3 completely, or run things off two different native USB controllers at the same time, seems to be Z77.

So, TL;DR: anything you plug into the USB ports is going to negatively affect your mouse movement, which is why USB audio is the worst possible option of all the options I stated earlier. It might be better and less laggy than Realtek on-board audio, though. But having Realtek audio enabled with drivers installed is so bad for mouse response that there's probably not a worse option, besides pissing on your computer to cool it instead of using fans.

I'm also amazed there are people like Athrutep on this forum who run useplatformclock forced on + Realtek on-board audio and god knows what else. I'd be surprised if you people can hit a flickshot to save your life with all that crap. Athrutep was the guy who told me the Glorious Model O had a good 3360 implementation. It's responsive, but not really good tracking-wise. The 3366 in the G403 blows it out of the water, and so does the G Pro Hero 16K wired. I haven't seen a single no-name mouse company come close to any of Logitech's or SteelSeries' sensor implementations.


----------



## Athrutep

r0ach said:


> I'm also amazed there are people like Athrutep on this forum who run useplatformclock forced on + Realtek on-board audio and god knows what else. I'd be surprised if you people can hit a flickshot to save your life with all that crap. Athrutep was the guy who told me the Glorious Model O had a good 3360 implementation. It's responsive, but not really good tracking-wise. The 3366 in the G403 blows it out of the water, and so does the G Pro Hero 16k wired. Haven't seen a single no-name mouse company come close to any of Logitech or Steelseries sensor implementations before.













I stated multiple times what I use. I never forced HPET. For someone who is pretty anal about some things, you don't really pay much attention.

Once again, I use:

bcdedit /deletevalue useplatformclock
bcdedit /deletevalue useplatformtick
bcdedit /set disabledynamictick yes

Realtek onboard audio disabled in BIOS and drivers uninstalled via DDU in safe mode.

It's hard to beat Logitech. But of the other brands, the Model O personally feels best to me - harder for tracking but easier for flicks. SteelSeries, which you praise, were bad for me. I had a Rival with awful polling consistency that felt really bad (maybe I was just unlucky, but I never bothered with them after that).

I have hit flicks multiple times and saved my life successfully.

You always talk about your amazing skills, throwing judgemental assumptions around. How about you showcase your amazing flicks and show everyone how it's done on a flawless, mouse-input-tweaked system?


----------



## x7007

The SXFI Amp uses a USB 3.1 Type-C connector, and so does the X3. It doesn't use a normal USB 2.0 port.


----------



## 508859

Athrutep said:


> I stated multiple times what I use. I never forced HPET. For someone who is pretty anal about some things, you don't really pay much attention.
> 
> Once again, I use:
> 
> bcdedit /deletevalue useplatformclock
> bcdedit /deletevalue useplatformtick
> bcdedit /set disabledynamictick yes
> 
> Realtek onboard audio disabled in BIOS and drivers uninstalled via DDU in safe mode.
> 
> It's hard to beat Logitech. But of the other brands, the Model O personally feels best to me - harder for tracking but easier for flicks. SteelSeries, which you praise, were bad for me. I had a Rival with awful polling consistency that felt really bad (maybe I was just unlucky, but I never bothered with them after that).
> 
> I have hit flicks multiple times and saved my life successfully.
> 
> You always talk about your amazing skills, throwing judgemental assumptions around. How about you showcase your amazing flicks and show everyone how it's done on a flawless, mouse-input-tweaked system?


his brain cannot comprehend words longer than 11 characters; both start with the same letters, thus the confusion after this buffer overflow


----------



## PMB

r0ach said:


> Not a viable solution. I have a Haswell PC laying around running Win 8.1 (Steambox/Alienware Alpha) that you can't disable USB3 on so device manager has both a USB2 and USB3 controller in it. Even if you plug a mouse into the USB2 port, it acts in passthrough mode and the mouse is really controlled by the USB3 controller if you dig under the hood. Not sure if it's Windows 8.1 and higher facilitating this or the hardware itself, or both, but the last chipset you can disable USB3 completely or run things off two different, native USB controllers at the same time seems to be Z77.
> 
> So, TLDR: Anything you plug into the USB port is going to negatively affect your mouse movement. Which is why using USB audio is the worst possible option from all the options I stated earlier. It might be better and less laggy than Realtek on-board audio though. But having Realtek audio enabled with drivers installed is so bad for mouse response there's probably not a worse option besides pissing on your computer to cool it instead of using fans.



Well, naturally I disabled the Realtek onboard audio in BIOS - that's the whole point of the operation (besides getting better audio, of course). I do not see how it could alter mouse movement when all parameters behave exactly the same as in "only mouse in USB" mode. When I have the same DPC latency, the same-ish Human Benchmark results, the same in-game performance - why would I assume I need some voodoo USB magic? Also, I'm pretty certain that on my X470 board there are two separate signal chains for USB 2.0 and PS/2, as well as for the USB 3.0 controller. If this is not the case on yours, then I can kind of see where you're going with that.

Since I directly compared my new USB audio solution to the PCIe solution (Soundblaster AE-5), I am 100% sure that it is at least equal, but cheaper, and the sound comes out less distorted. To each their own, but my recommendation in regards to system responsiveness and mouse feel is a good driverless USB card (much to my own surprise, as I tend to keep my USB hosts clean!)


----------



## 508859

PMB said:


> Well, naturally I disabled the Realtek onboard audio in BIOS - that's the whole point of the operation (besides getting better audio, of course). I do not see how it could alter mouse movement when all parameters behave exactly the same as in "only mouse in USB" mode. When I have the same DPC latency, the same-ish Human Benchmark results, the same in-game performance - why would I assume I need some voodoo USB magic? Also, I'm pretty certain that on my X470 board there are two separate signal chains for USB 2.0 and PS/2, as well as for the USB 3.0 controller. If this is not the case on yours, then I can kind of see where you're going with that.
> 
> Since I directly compared my new USB audio solution to the PCIe solution (Soundblaster AE-5), I am 100% sure that it is at least equal, but cheaper, and the sound comes out less distorted. To each their own, but my recommendation in regards to system responsiveness and mouse feel is a good driverless USB card (much to my own surprise, as I tend to keep my USB hosts clean!)


you seem to be new here. 

I will give you a hint


----------



## w1tch

x7007 said:


> the sxfi amp is using a USB 3.1 C type also the X3. it doesn't use a normal USB 2.0 port.


ALC892 vs FiiO K3 DAC/amp gives the same mouse latency, so there is nothing to gain from disabling onboard sound. What I would like to know is whether there is a difference between DisplayPort and HDMI.


----------



## r0ach

w1tch said:


> ALC892 vs FiiO K3 DAC/amp gives the same mouse latency, so there is nothing to gain from disabling onboard sound. What I would like to know is whether there is a difference between DisplayPort and HDMI.


Pretty sure all the on-board sound solutions have non-negligible CPU overhead that you just don't have on real cards or by using audio passthrough from an Nvidia card to the monitor's 3.5mm jack. As for DisplayPort, the Asus VG27AQ I recently purchased would NOT work in DisplayPort mode with a brand new Turing video card unless you disable legacy BIOS mode completely and force UEFI mode only.

Enabling Secure Boot gives you a sluggish cursor, and running UEFI mode in my experience seems to give a slightly more floaty-feeling mouse cursor than legacy mode (not sure if it's that way on all boards or not), so DisplayPort is kind of problematic for me. I'd just make sure to buy a monitor that also has HDMI 2.0, just in case.


----------



## Veii

Would you guys mind if I hijack this thread a bit? Otherwise please tell me to move along and I'll start my own one.
Speaking of "Logitech makes the best mice":
I can't do anything but cry about my G502 Proteus Spectrum's performance.

I'm not sure what exactly is missing, but even in raw tracking mode (using osu! as a random experiment - it's not the only game with the issue, although it's the most stable one tracking-wise)
the mouse does drift away, be it precisely surface calibrated or on stock settings. I had to drop the polling rate down to 500 as 1000 was just unstable on it.

It's hard - likely I forgot something, and it can't be dust.
About my current setup:
An old, messed-up, half-BIOS/half-UEFI mixed bag of a notebook on a 3612QM chip (7-series platform, changed the CPU).
Intel ME was fully wiped by myself, along with any traces of it - I lost Intel Quick Sync that way, and any kind of hardware acceleration, by disabling Intel's backdoors (ty Intel).
Bluetooth is not used; I can't disable the internal ALC sound card yet as the BIOS still needs an unlock (working on it).

The early use case was Hackintosh, although it always had issues with switching to UEFI mode, so I use bootloader chaining just to boot Win10 in pure UEFI/GPT mode.
But background information aside:
- Many of the services are disabled (please let me know what I missed)
- Mousepad is the 2012 Razer Goliathus Control XXL (the wide one; I've used it with other mice and it's fine to me for low DPI)
- I'm on the 1809 Enterprise branch, one because I can't stand the work that needs to be done on Pro, and two because I get the ability to use it, so guess why not
- Spectre mitigation nonsense is also disabled (on the Intel side of things I should be as clean as possible)
- Only has an HD4000; the 2nd display is via VGA cable / main focus was competitive rhythm gaming
- Sound card: yes, a Focusrite 2i2 USB, but with forced global ASIO drivers at low latency (any kind of stutter and audio lag messes you up on keyboard input)

Attached is the nonsense I get out of it.
I've thought about firmware downgrading or anything similar, but to this day it has always had drift-away issues on faster movement.
And as far as I can tell (barely into this topic still), the charts look extremely different from you guys' results.


----------



## Synoxia

Veii said:


> Would you guys mind if I hijack this thread a bit? Otherwise please tell me to move along and I'll start my own one.
> Speaking of "Logitech makes the best mice":
> I can't do anything but cry about my G502 Proteus Spectrum's performance.
> 
> I'm not sure what exactly is missing, but even in raw tracking mode (using osu! as a random experiment - it's not the only game with the issue, although it's the most stable one tracking-wise)
> the mouse does drift away, be it precisely surface calibrated or on stock settings. I had to drop the polling rate down to 500 as 1000 was just unstable on it.
> 
> It's hard - likely I forgot something, and it can't be dust.
> About my current setup:
> An old, messed-up, half-BIOS/half-UEFI mixed bag of a notebook on a 3612QM chip (7-series platform, changed the CPU).
> Intel ME was fully wiped by myself, along with any traces of it - I lost Intel Quick Sync that way, and any kind of hardware acceleration, by disabling Intel's backdoors (ty Intel).
> Bluetooth is not used; I can't disable the internal ALC sound card yet as the BIOS still needs an unlock (working on it).
> 
> The early use case was Hackintosh, although it always had issues with switching to UEFI mode, so I use bootloader chaining just to boot Win10 in pure UEFI/GPT mode.
> But background information aside:
> - Many of the services are disabled (please let me know what I missed)
> - Mousepad is the 2012 Razer Goliathus Control XXL (the wide one; I've used it with other mice and it's fine to me for low DPI)
> - I'm on the 1809 Enterprise branch, one because I can't stand the work that needs to be done on Pro, and two because I get the ability to use it, so guess why not
> - Spectre mitigation nonsense is also disabled (on the Intel side of things I should be as clean as possible)
> - Only has an HD4000; the 2nd display is via VGA cable / main focus was competitive rhythm gaming
> - Sound card: yes, a Focusrite 2i2 USB, but with forced global ASIO drivers at low latency (any kind of stutter and audio lag messes you up on keyboard input)
> 
> Attached is the nonsense I get out of it.
> I've thought about firmware downgrading or anything similar, but to this day it has always had drift-away issues on faster movement.
> And as far as I can tell (barely into this topic still), the charts look extremely different from you guys' results.


Hi Veii mate! Mouse USB precision is more a hassle than DRAM oc, be prepared!
Your graphs are different because you are not selecting "Interval vs Frequency" and initial point 500 to 2500.
That said, good job in stripping out devices you dont need/know. You might want to use msi_util and device manager to solve any IRQ conflicts if you have any.


----------



## Veii

Synoxia said:


> Hi Veii mate! Mouse USB precision is more a hassle than DRAM oc, be prepared!
> Your graphs are different because you are not selecting "Interval vs Frequency" and initial point 500 to 2500.
> That said, good job in stripping out devices you dont need/know. You might want to use msi_util and device manager to solve any IRQ conflicts if you have any.


Haha, sounds like a challenge.
This time it seems to be "fine"; I still have to look into what causes the mouse drift, as interval-to-time looks a bit better right now.
Do you have any information on where/how to track down IRQ conflicts?
I've played with msi_util a bit, but it was a long time ago and only very vaguely.
It might be the mousepad causing the drift - although since it only happens on faster movement, I think it's sensor or settings related.

About global timer resolution:
I read you guys were able to micro-finetune it?
Is there any way I can globally enforce a 0.5ms timer?
I'm kind of bound to it, as the low-latency audio setup starts to crackle at a 1.0ms timer.
~After all, the less bloatware runs on this bad notebook, the fewer lag spikes I'll get mid-game~
Oh, forgot to mention: I'm using custom HD4000 drivers too, as Intel's were a mess to begin with.


----------



## Timecard

Here's a new option to try which I haven't seen mentioned anywhere or in any guides. I came across it last night and we did a few benchmarks which showed a small improvement in average latency.

What this does is disable the Hyper-V hypervisor stack at boot (different from uninstalling the Hyper-V add-on features), which is known to interfere with VMware products. It should do no harm and can be considered a safe tweak with potential performance gains if you don't use Hyper-V or Microsoft's added virtualization features like Device Guard. It definitely affected my keyboard and mouse input, so give it a shot and do your own benchmarks.

bcdedit /set hypervisorlaunchtype off
- https://blogs.technet.microsoft.com/gmarchetti/2008/12/07/turning-hyper-v-on-and-off/


----------



## 508859

Timecard said:


> Here's a new option to try which I haven't seen mentioned anywhere or in any guides. I came across it last night and we did a few benchmarks which showed a small improvement in average latency.
> 
> What this does is disable the Hyper-V hypervisor stack at boot (different from uninstalling the Hyper-V add-on features), which is known to interfere with VMware products. It should do no harm and can be considered a safe tweak with potential performance gains if you don't use Hyper-V or Microsoft's added virtualization features like Device Guard. It definitely affected my keyboard and mouse input, so give it a shot and do your own benchmarks.
> 
> bcdedit /set hypervisorlaunchtype off
> - https://blogs.technet.microsoft.com/gmarchetti/2008/12/07/turning-hyper-v-on-and-off/


if everything is affecting your keyboard and mouse input, how do you know what is the target state?


----------



## Timecard

Love the question. In this case, if the change has a consistent, measurable influence (positive or not) between on and off, then I'd say it's worth trying. At least here there could be a logical explanation, since it's disabling core functionality that impacts multiple added Windows features.


----------



## 508859

Timecard said:


> Love the question, in this case if the change has consistent influence (positive or not) on vs off and it's measurable then i'd say it's worth trying.


measurable point is very interesting, what was the measured difference? 430 gigafeelz?


----------



## Timecard

Approximately 600 gigafeels using LatencyMon. Anyway, to my point: I'm looking for others to do benchmarks and compare.


----------



## CrucialNUG

numberfive said:


> measurable point is very interesting, what was the measured difference? 430 gigafeelz?


You 


numberfive said:


> I blame the inconsistency of WDDM/DWM, unrelated to hardware. you can reboot windows and have a different mouse feeling


Also you


numberfive said:


> on my z370 STRIX F-GAMING, I have the opposite experience, ASMedia with the proper drivers is crisp (opposite to intel) and I can tell on a blind test 10 out of 10 which one is which, also in kovaks my results are perfectly consistent between them (which is a good way for an objective testing).


You can play this game but you might want to hold yourself to the same standards.


----------



## 508859

CrucialNUG said:


> You
> 
> 
> Also you
> 
> 
> You can play this game but you might want to hold yourself to the same standards.


agreed, but doing KovaaK's 20 times, switching the setting back and forth, is objective testing which will either show a consistent and measurable difference or the lack of one (margin of error)
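A minimal sketch of what judging such an on/off comparison could look like (the scores below are hypothetical, and Welch's t-test is just one reasonable way to check whether the difference exceeds run-to-run noise):

```python
import statistics
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / sqrt(va / len(a) + vb / len(b))

# hypothetical KovaaK's scores: 10 runs with the tweak on, 10 with it off
on  = [812, 799, 825, 810, 805, 818, 801, 809, 815, 807]
off = [808, 795, 821, 812, 800, 816, 803, 806, 811, 809]

t = welch_t(on, off)
print(f"mean on={statistics.mean(on):.1f}  mean off={statistics.mean(off):.1f}  t={t:.2f}")
```

With |t| well under roughly 2, a difference like this is indistinguishable from noise; only a consistent shift across many alternating runs would justify calling the tweak measurable.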


----------



## CrucialNUG

numberfive said:


> agreed, but doing kovaks 20 times switching the setting back and forth is an objective testing which will either show a consistent and measurable difference, or lack of it (margin of error)


I suppose KovaaK's would be fair if you posted the results comparing the two. Otherwise it's just more hearsay, which it appears we are all guilty of participating in.

It's alright to feel that something is different, but insisting that it is the undeniable truth as a result of only those feelings is the main issue here. If you are gonna be so critical of others, you may want to turn that same laser onto yourself. Obviously I agree with you here that more proof is better, and that we should get as far away from r0ach-type thinking as possible.


----------



## Athrutep

r0ach said:


> .......


What is your main mouse out of curiosity?

Also what are your main competitive games?


----------



## 508859

Athrutep said:


> What is your main mouse out of curiosity?
> 
> Also what are your main competitive games?


he doesn't play competitive games; as he stated a few weeks ago, he only plays games that are unknown to the general public, which also happen not to have a ranking/competitive part.
he hates mainstream FPS shooters as there is not enough entropy.


----------



## r0ach

Athrutep said:


> What is your main mouse out of curiosity?
> 
> Also what are your main competitive games?


I use G Pro Hero 16k Wired and G302 at the moment. Would probably prefer a more Kana v2-like shape, but those two mice seem to be the closest thing I can get to that shape while having a sensor that doesn't make me throw up. Haven't played any competitive FPS games lately because I never found a monitor I actually liked and returned them all. 

Was Platinum in League of Legends last time I played (I'm better at FPS than MOBA but kind of prefer MOBA right now anyway). I'm leveling up again in it right now. It's still early in the season, so there are still lots of former Diamonds and Platinums in Gold trying to get higher, and that's where I currently am, trying to beat Diamond players to rank up.

I might try competitive FPS again after my new monitor gets here. Random nooblords on the forum keep saying "this r0ach claims 60Hz monitors are better". No, every high-Hz panel I've ever bought has been terrible and I returned them all! Like the Asus VG27AQ, where at a brightness setting of zero it's 140 cd/m2, burning out your eyes, with a terrible gamma curve of 2.4. Here's the new monitor I ordered:



r0ach said:


> I just ordered the brand spanking new 27" 240Hz LG IPS (27GN750-B) so I'll let you know how that goes for coating. There's virtually zero information on this monitor anywhere and no reviews at all, but I managed to find some Chinese forums where people tested it at a whopping 1.23 average color delta, so it's like the best-calibrated gaming monitor ever released? Gamma also tracked at a perfect 2.2 curve as well, which is one thing I put a lot of priority on after trying the Asus VG27AQ and its terrible 2.4 gamma.
> 
> I think it has some of the same backlight bleed problems as the LG 27GL83A, but the picture quality and colors seem like they might be even better, albeit at less pixel density. Some pictures from a Chinese forum. Those colors look pretty amazing:


----------



## Athrutep

r0ach said:


> I use G Pro Hero 16k Wired and G302 at the moment. Would probably prefer a more Kana v2-like shape, but those two mice seem to be the closest thing I can get to that shape while having a sensor that doesn't make me throw up. Haven't played any competitive FPS games lately because I never found a monitor I actually liked and returned them all.
> 
> Was platinum in League of Legends last time I played (I'm better at FPS than MOBA but kinda prefer MOBA right now anyways). Currently I'm leveling up again in it right now. It's still early in the season so there's still lots of former diamonds and platinums in gold trying to get higher and that's where I currently am trying to beat diamond players to rank up.
> 
> I might try competitive FPS again after my new monitor gets here. Random nooblords on the forum keep saying "this r0ach claims 60hz monitors are better". No, every high hz panel I've ever bought has been terrible and I returned them all! Like the Asus VG27AQ where at a brightness setting of zero it's 140cdm2 brightness burning out your eyes and a terrible gamma curve of 2.4. Here's the new monitor I ordered:


Ah alright. I am quite happy with the LG 27GL850.


----------



## r0ach

Athrutep said:


> Ah alright. I am quite happy with the LG 27GL850.


I read some weird Reddit posts of people claiming the 27GL850 caused them enormous eye strain but the 27GL83A doesn't. Did you experience any of that? I don't think either monitor has PWM, so the only thing I can think of is that the 27GL850 uses some type of dithering to achieve its extra color gamut, and the issue doesn't exist on the other since it's standard sRGB.

I have a Samsung PLS panel that I think is 6 bit + dithering and while the monitor has nowhere near the eyestrain of a PWM panel (those make my eyes hurt and give me a headache after looking at them for literally 5 minutes), the panel probably has higher eye strain than a regular 8 bit panel with no dithering like you'd see on an ipad or something.


----------



## Athrutep

r0ach said:


> I read some weird Reddit posts of people claiming the 27GL850 caused them enormous eye strain but the 27GL83A doesn't. Did you experience any of that? I don't think either monitor has PWM, so the only thing I can think of is that the 27GL850 uses some type of dithering to achieve its extra color gamut, and the issue doesn't exist on the other since it's standard sRGB.
> 
> I have a Samsung PLS panel that I think is 6 bit + dithering and while the monitor has nowhere near the eyestrain of a PWM panel (those make my eyes hurt and give me a headache after looking at them for literally 5 minutes), the panel probably has higher eye strain than a regular 8 bit panel with no dithering like you'd see on an ipad or something.


No problems so far. I have set it up like this https://www.tftcentral.co.uk/reviews/lg_27gl850.htm#brightness


----------



## x7007

A weird thing happened. I was free of input lag until I connected the Xbox Elite controller; then I started having weird input lag, but only when moving down and up - left and right seemed fine. Even when the device was not plugged in!! I tried many, many things, but the only thing that fixed it was uninstalling the hidden driver/device related to the Xbox Elite controller in Device Manager. Does anyone have insight into this??? Because then we know that even when a device is not connected, or not even working/turned on, it can wreak havoc on the system. Right now I use the front case panel USB 2.0 port - could this maybe conflict with anything? Should I just use the back of the mobo?

Because if something like this can happen:

https://www.thurrott.com/windows/windows-10/66409/case-laggy-xbox-controller-windows-10
Also this
https://forums.tomshardware.com/thr...-slowing-down-games-when-using-on-pc.2927294/

Why the hell does it happen??? How can I fix this?


Could disabling NX fix the issue? Could it be something with the Nvidia drivers?

I used DDU to uninstall the Nvidia drivers and moved from DCH to Standard drivers. I didn't touch any global settings except changing Optimal Power to Adaptive, and I changed NX to AlwaysOff in bcdedit. I did all that over remote control, but now that I'm home from work the mouse seems perfect.

It seems like all the issues I've had since Windows 7 started because of NX - I mean, they added it back in XP too, but could this crap cause mouse issues?? I've had these mouse issues for 5 years, but it keeps changing. I succeeded in making it better using the language fix, which even made a laptop touchpad on a FRESH Windows 10 1903 or 1809 way better.


----------



## x7007

What do I need to look for if I want to buy a USB controller card to use instead of the motherboard USB, like Etron or ASMedia and such? Some devices seem to cause issues. Is it possible to get that kind of device? It's not a hub, because a hub still connects to some part of the motherboard. But I don't know which to buy - some kind of PCIe card or other, and which one.
Edit: after much reading I saw that Renesas and Fresco are the best ones, if that helps anyone. Etron and ASMedia are the worst.

Could it be that our motherboards' USB controllers are crap?
Some Fresco chips have many options for their USB controllers.

http://www.frescologic.com/product/single/fl6000

Hub class commands are executed instantly (no USB flow control due to microprocessor processing)


Did anyone test a really good USB controller compared to the mobo controller? We haven't tested those things. Could the USB polling be better? Could the input lag be better? I think motherboard USB just doesn't cut it for a 1000Hz mouse and many other devices.

Because the HD Fury DrHDMI 4K device couldn't work properly on any of the back mobo USB ports (USB 3.1 and 3.0) - when I tried to use their EDID program it kept freezing and saying "device not connected" in the app. But when I used the front port it worked fine with the app, and the input lag was also way better compared to using the back ports. HD Fury recommends using the power adapter with all their devices when finished using the EDID program over the USB cable. So from testing, it causes input lag. Now, could a better USB controller be even better than using the power adapter and fix both issues - detection in the program and input lag? Also better USB polling and all the other things?

EDIT: to fix input lag I also removed any unneeded drivers with DriverStoreExplorer v0.10.58 and Autoruns64.

EDIT: Not sure if these are the best ones: the NEC/Renesas 720202 (newest chipset) and the Fresco FL1100.
https://eu.ptgrey.com/products/usb-3.1-host-controller-card/?model=ACC-01-1201
https://eu.ptgrey.com/products/usb-3.1-host-controller-card/?model=ACC-01-1202

The same card with another chipset can be bought from two places:
https://www.amazon.com/Inateck-Comp...ional-KT4005/dp/B00JFR2I2C/ref=dp_ob_title_ce
https://www.inateck.com/inateck-kt4005-cord-free-4-port-usb-3-0-pci-express-card.html


----------



## MIETAS

Hello,

do you guys have any idea what those spikes are?

Win 8.1
Zowie S2 Divina


----------



## deepor

MIETAS said:


> Hello,
> 
> do you guys have any idea what those spikes are?
> 
> Win 8.1
> Zowie S2 Divina



You perhaps didn't move the mouse fast enough. The mouse only sends data if there's something new to report. If it measures less than one pixel of movement, it will send nothing instead of a zero. The report rate is then less than 1000Hz.
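This effect can be simulated. A rough sketch (assumed 1000 Hz polling, with movement expressed in hypothetical sensor counts per millisecond) of a mouse that only reports when at least one full count has accumulated:

```python
# simulate a 1000 Hz mouse that only reports when >= 1 count has accumulated
def report_intervals(speed_counts_per_ms, n_polls=20):
    intervals, acc, last = [], 0.0, 0
    for t in range(1, n_polls + 1):   # one poll opportunity every 1 ms
        acc += speed_counts_per_ms    # movement accumulated since last poll
        if acc >= 1.0:                # at least one whole count to report
            intervals.append(t - last)
            last = t
            acc -= int(acc)           # keep the sub-count remainder
    return intervals

print(report_intervals(2.0))   # fast movement: a report every poll (all 1 ms)
print(report_intervals(0.4))   # slow movement: gaps of 2-3 ms between reports
```

The slow case produces 2-3 ms gaps that a tool plotting 1/interval will show as 500 Hz or 333 Hz dips, even though nothing is wrong on the USB side.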


----------



## w1tch

If you have an AMD GPU, navigate to Radeon panel -> Display -> Override and adjust voltage swing and pre-emphasis. I have yet to find out whether this can damage the monitor or GPU in prolonged use; however, setting both values to 2 greatly improved mouse tracking in CS:GO.


----------



## BlackSilv3r

Hello everyone, does anybody here have an idea how to improve my mouse polling rate graphs? Below are my polling graph, bcdedit settings, and latency. I think I have every setting right in the BIOS (disabled C-states, all that energy saving stuff); the bcdedit settings I think are also good (with these settings I have an exact 0.500ms timer in TimerResolution, otherwise it felt terrible); latency is also very good.

Specs: Maximus X Hero, 8700K (5.1GHz on cores, 4.9GHz on cache, HT disabled), 16GB G.Skill Ripjaws V 3200MHz RAM with timings tweaked to 15-16-16-28, command rate T1, tRFC 470, GTX 1080 +170MHz on core and +500MHz on memory. System installed on a Samsung 850 Evo 500GB SSD, games on an ADATA XPG SX8200 Pro 1TB NVMe.

I tried disabling onboard audio and uninstalling the audio drivers - no difference in mouse polling whatsoever. Disabling unused USB ports - no difference. Mouse plugged into a different USB port - no difference (I have an MM711, a G102 and a G305, and the graphs all look the same). Power plan is set to "Ultimate Performance". Nvidia settings are the same as Roach showed a few pages earlier; MSI mode on the GPU on/off makes no difference in terms of polling. BIOS updated to the newest, Windows 10 1909 build. Any ideas?
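One way to compare runs like these beyond eyeballing the graphs is to reduce each capture to a few numbers. A rough sketch, assuming you can export the event timestamps (in milliseconds) from MouseTester or a similar tool; the timestamp list below is hypothetical:

```python
import statistics

def polling_stats(timestamps_ms):
    """Summarize polling consistency from a list of event timestamps (ms)."""
    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return {
        "mean_ms": statistics.mean(intervals),
        "stdev_ms": statistics.pstdev(intervals),
        "max_ms": max(intervals),
        # share of intervals noticeably above the nominal 1 ms
        "pct_off_nominal": 100 * sum(i > 1.5 for i in intervals) / len(intervals),
    }

# hypothetical capture: mostly 1 ms intervals with one 2 ms hiccup
ts = [0, 1, 2, 3, 5, 6, 7, 8, 9, 10]
print(polling_stats(ts))
```

Comparing `stdev_ms` and `pct_off_nominal` before and after a tweak gives you something concrete to post instead of "looks a bit better".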


----------



## Straszy

Boot Windows in safe mode and check there. If it's the same, something is still set wrong in the BIOS.


----------



## BlackSilv3r

It looks a bit better in safe mode, but still not that good? Am I missing some critical service in Windows that I need to disable to improve polling?


----------



## Straszy

BlackSilv3r said:


> Looks a bit better in the safe mode but still not that good? Am i missing some critical service in Windows that i need to disable to improve polling?


On an Asus motherboard it should be a straight line; you must have missed something in the BIOS.


----------



## Athrutep

BlackSilv3r said:


> Looks a bit better in the safe mode but still not that good? Am i missing some critical service in Windows that i need to disable to improve polling?


This is good enough for windows 10 1909.


----------



## Conditioned

I have a lot of green when I move my mouse with mousemovementrecorder.exe (part of the markc fix). 

This is from the readme.txt: "NOTE: While a game is running, Mouse Movement Recorder may show many red and green lines, if the game continually re-positions the pointer to the middle of the screen. Those red and green lines do NOT mean that you have acceleration, they only mean that the re-positioned pointer has confused Mouse Movement Recorder."

But this happens even on a desktop, can anyone explain this?


----------



## deepor

Conditioned said:


> I have a lot of green when I move my mouse with mousemovementrecorder.exe (part of the markc fix).
> 
> This is from the readme.txt: "NOTE: While a game is running, Mouse Movement Recorder may show many red and green lines, if the game continually re-positions the pointer to the middle of the screen. Those red and green lines do NOT mean that you have acceleration, they only mean that the re-positioned pointer has confused Mouse Movement Recorder."
> 
> But this happens even on a desktop, can anyone explain this?



Maybe the Windows display scaling setting can cause this? If you use something like 150%, 125% etc., perhaps try to set it to 100% and see what happens.


----------



## Conditioned

deepor said:


> Maybe the Windows display scaling setting can cause this? If you use something like 150%, 125% etc., perhaps try to set it to 100% and see what happens.


Thanks for your reply! I do have it set to 100% (the default), but I have MacType installed and will try uninstalling that! Thanks for the idea! Just to be clear, I think I have followed pretty much all recommendations for best polling precision and latency - that's why I'm asking for input on this!


----------



## PurpleChef

Conditioned said:


> I have a lot of green when I move my mouse with mousemovementrecorder.exe (part of the markc fix).
> 
> This is from the readme.txt: "NOTE: While a game is running, Mouse Movement Recorder may show many red and green lines, if the game continually re-positions the pointer to the middle of the screen. Those red and green lines do NOT mean that you have acceleration, they only mean that the re-positioned pointer has confused Mouse Movement Recorder."
> 
> But this happens even on a desktop, can anyone explain this?


I had to run Windows Update to fix it. And the MarkC fix does nothing. It's a meme unless you're playing super old games...


----------



## t0niX

Hey guys,

I need some help to understand what's going on with my Corsair Glaive RGB Pro. I had the feeling my aim in CS:GO has been a little sluggish lately, so I ran MouseTester 1.5.3.

- Polling Rate: 1000 Hz
- CPI: 800

I wonder what I'm able to do about those jumps from 1000 Hz to 500 Hz over and over again.

Have you ever seen this behavior before and might be able to help me out?
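For context, here's how I understand those readings (my own sketch, with made-up timestamps): MouseTester derives the rate from the time between consecutive reports, so a single report arriving 1 ms late shows up as a 500 Hz dip rather than the mouse actually switching modes.

```python
# How a polling-rate plot reads a late report: rate = 1 / inter-report delta.
# Timestamps are made up; a real log would come from the tool's capture.
timestamps_ms = [0.0, 1.0, 2.0, 3.0, 5.0, 6.0, 7.0]  # one report arrived 1 ms late

rates_hz = [
    1000.0 / (t1 - t0)  # delta in ms -> instantaneous rate in Hz
    for t0, t1 in zip(timestamps_ms, timestamps_ms[1:])
]
print(rates_hz)  # → [1000.0, 1000.0, 1000.0, 500.0, 1000.0, 1000.0]
```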

Thanks a lot!


----------



## cdcd

Most likely related to the RGB lighting: https://www.techpowerup.com/review/corsair-glaive-rgb-pro/5.html

See if the issue persists after disabling it.


----------



## t0niX

cdcd said:


> Most likely related to the RGB lighting: https://www.techpowerup.com/review/corsair-glaive-rgb-pro/5.html
> 
> See if the issue persists after disabling it.


You're the man - this fixed the issue. Thanks!

But: I feel it's pretty dumb having to disable the RGB in order to reach maximum sensor accuracy. Why does this happen? Nothing else to do about it? I miss the lighting already.


----------



## cdcd

t0niX said:


> You're the man - this fixed the issue. Thanks!
> 
> But: I feel it's pretty dumb having to disable the RGB in order to reach maximum sensor accuracy. Why does this happen? Nothing else to do about it? I miss the lighting already.


Not sure why it happens, probably just shoddy firmware. Apparently not trivial to fix though, otherwise Corsair would have addressed it by now. Might be worth trying to contact Corsair about it.


----------



## ylpkm

t0niX said:


> You're the man - this fixed the issue. Thanks!
> 
> But: I feel it's pretty dumb having to disable the RGB in order to reach maximum sensor accuracy. Why does this happen? Nothing else to do about it? I miss the lighting already.


Some sources suggest controlling the LEDs is a large enough task to create a delay for the MCU when communicating back to the PC. Since PWM dimming is often used with LEDs (as in your phone's brightness and some TVs), the MCU has to adjust and maintain pulses whenever the LEDs are powered. It may seem like a trivial task, but depending on the capabilities of the microcontroller, issues can arise. It also depends on whether the MCU has multiple cores; if it's a single-core/single-threaded MCU... yikes. That means a lot of interrupts on top of getting sensor data to the PC. Also, depending on how much power the LEDs draw, the MCU will heat up as electronics normally do, and microcontrollers can have quite a bit of clock variance at increased temperatures. I had to deal with some of these issues and concepts in a microcontrollers course.
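Here's a toy model of that contention (all numbers made up, not real firmware): a frequent LED-service interrupt stretches the wall-clock time the MCU needs to get a report ready.

```python
# Toy model: a software-PWM LED interrupt stealing MCU time from USB report prep.
# All numbers are hypothetical; real firmware and timings differ.

PWM_PERIOD_US = 50    # hypothetical LED ISR fires after every 50 us of work
PWM_ISR_COST_US = 5   # hypothetical 5 us spent servicing the LEDs each time
REPORT_WORK_US = 200  # hypothetical CPU time to assemble one USB report

def report_prep_time(work_us, isr_period_us, isr_cost_us):
    """Wall-clock time to finish `work_us` of CPU work when an ISR preempts
    it after every `isr_period_us` of executed work (a simplification),
    costing `isr_cost_us` per preemption."""
    elapsed = 0.0
    done = 0.0
    while done < work_us:
        step = min(isr_period_us, work_us - done)
        done += step
        elapsed += step
        if done < work_us:
            elapsed += isr_cost_us  # preempted by the LED service routine
    return elapsed

print(report_prep_time(REPORT_WORK_US, PWM_PERIOD_US, 0))               # → 200.0 (LEDs off)
print(report_prep_time(REPORT_WORK_US, PWM_PERIOD_US, PWM_ISR_COST_US)) # → 215.0 (LEDs on)
```

So even a cheap ISR adds a steady tax, and a slow or single-threaded MCU pays it on every report.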

Does your keyboard have LEDs? Might wanna invest in an ambient light source to see your keys. I've seen a few sites discuss that this issue occurs with both mice and keyboards, and your mouse only has a few LEDs... keyboards have many. I remember there was an SSD that had so many LEDs, and they drew so much power, that it was hitting (idk) like 50+C. It basically overheated during the boot process and would freeze shortly after starting. But I bet it looked pretty.


----------



## empl

Maybe I found a good tweak, and I've included some other impactful tweaks:

Since we all know you should turn off *HW acceleration* where you can, I asked myself what else uses HW acceleration in Windows. In the browser and in Internet Options I have already disabled it. The question is what else in Windows uses it; there is a lot of bloatware. For some users there was/is an option to turn off hardware-accelerated GPU scheduling in Windows 10 - link. I personally don't have the option to turn it off, but I turned it off in the registry and the mouse feels snappier: I can make small adjustments more easily, whereas previously it felt like the mouse was lagging behind! I also read this feature may come back in the upcoming *Windows 20H1 update*.

On the other hand, some may have noticed that before you install graphics card drivers, the mouse feels snappier, and it is not only because of the lower resolution. You can also turn off acceleration in the display settings. And after you install NVIDIA drivers, for instance, there is much more lag! But once you install GPU drivers, you can't turn HW acceleration off, because it would probably affect gaming performance greatly; yet you don't want to use it for Windows apps!

I also found a link on how to disable *virtual desktops*. I don't know if it works, because many things may affect you even when they're not running.

Also, deleting *Flash Player* reduces input lag, though it may affect security, if you are okay with that.

Also, with the InSpectre tool (e.g. via Guru3D) you can disable the *Spectre and Meltdown* mitigations, which again reduces input lag at the cost of security. Some motherboards have a BIOS version that applies those mitigations at the BIOS level; I wonder if that introduces lag as well! But today even firmware viruses exist, which can root themselves inside anything, even a flash drive. For the average Joe that doesn't represent much of a risk, but still: if malware that can do that already exists, why wouldn't someone use it? So you should probably still care about security; you don't want nasty Bitcoin miners and viruses lagging your PC.

I also found a *clean boot* reduces input lag greatly, but then you won't have some programs or services you need, and you would have to restart the PC after each gaming session. Ticking each service back on every time is too exhausting; maybe it could be automated with a bat file. Also, you can disable Cortana by going to system32/systemapps, appending .bak to its folder, and killing the process right before you press delete. But you won't have the Start panel; it's already suspended and shouldn't use any CPU cycles. Still, what about HW acceleration? I have it disabled in Internet Options, but for other UWP apps, which would be suspended, dunno. Also, if you disable the wrong process, it can lead to system corruption. Maybe create a bat file to suspend some processes, I thought.

You can also disable *DWM* on Windows 10, but you can damage the system if it freezes!!! And you have to play in windowed fullscreen mode, which reduces fps moderately and introduces maybe 1-3 ms of latency, which kind of defeats the purpose of disabling DWM.

Test turning off GPU scheduling and tell me your opinion, or if you know some app which uses hardware acceleration!!!


----------



## vf-

cdcd said:


> Most likely related to the RGB lighting: https://www.techpowerup.com/review/corsair-glaive-rgb-pro/5.html
> 
> See if the issue persists after disabling it.





cdcd said:


> Not sure why it happens, probably just shoddy firmware. Apparently not trivial to fix though, otherwise Corsair would have addressed it by now. Might be worth trying to contact Corsair about it.


Is this just a Corsair issue? Or all mice...



ylpkm said:


> Some sources suggest controlling the LEDs is a large enough task to create a delay for the MCU when communicating back to the PC. Since PWM dimming is often used with LEDs (as in your phone's brightness and some TVs), the MCU has to adjust and maintain pulses whenever the LEDs are powered. It may seem like a trivial task, but depending on the capabilities of the microcontroller, issues can arise. It also depends on whether the MCU has multiple cores; if it's a single-core/single-threaded MCU... yikes. That means a lot of interrupts on top of getting sensor data to the PC. Also, depending on how much power the LEDs draw, the MCU will heat up as electronics normally do, and microcontrollers can have quite a bit of clock variance at increased temperatures. I had to deal with some of these issues and concepts in a microcontrollers course.
> 
> Does your keyboard have LEDs? Might wanna invest in an ambient light source to see your keys. I've seen a few sites discuss that this issue occurs with both mice and keyboards, and your mouse only has a few LEDs... keyboards have many. I remember there was an SSD that had so many LEDs, and they drew so much power, that it was hitting (idk) like 50+C. It basically overheated during the boot process and would freeze shortly after starting. But I bet it looked pretty.


I used to always wonder about that for RGB in peripherals. Emitting interference.


----------



## cdcd

vf- said:


> Is this just a Corsair issue? Or all mice...


Just this particular Corsair mouse.


----------



## r0ach

Probably the root cause of 99% of people's USB issues if you're using a USB 3 controller (Z87 and higher):

https://www.overclock.net/forum/375...mode-driver-1-destroyer-aim-ever-created.html


----------



## Timecard

Posted this yesterday about ps/2 vs usb latency for keyboards.

https://www.overclock.net/forum/373...yboard-dpc-interrupt-latency-ps-2-vs-usb.html


----------



## Th3Awak3n1ng

Timecard said:


> about ps/2 vs usb latency for keyboards


As far as I remember, new platforms (Skylake, Kaby Lake, Coffee Lake, Ryzen) have no native PS/2 support, and even if a motherboard has PS/2 port(s), it still works through some kind of "emulation" (or something like that, I don't remember the exact scheme). So I'd say there is no reason to use PS/2 on current platforms, but on old platforms (Ivy Bridge, and maybe Haswell) it could make some difference.



r0ach said:


> 1070


 Which NVIDIA driver do you prefer?


----------



## empl

Timecard said:


> I personally enjoy finding ways to demonstrate the differences in perceived delay and investigating causes, but I agree 1000fps camera might not display all subtleties that we experience. If you look at my other thread related to electrical interference and impacts to perceived input, some may find it difficult to see the differences between videos (240fps) that I shared but to me and others who report similar issues, the difference is night and day.
> 
> When my interference is extremely low which is most notably influenced by physical changes vs configuration changes such as cable positioning, extra ferrites perhaps looped on data lines, network, monitor and usb cables then it's as if input (mouse/keyboard) to display latency on monitor is practically non-existent. e.g. zero mouse drag, not floaty (loss of accuracy) regardless of sensitivity, and I'm only using 500hz (2ms) mouse and 120hz monitor.
> 
> If it's consistent then you probably wouldn't think twice about there being an issue in any sense, because that is your perceived 'norm'. Your computer and its function solely drive your perception of the digital world: it feels fast, it feels slow, that looks fast, that looks slow.
> 
> There's a ton of variables but that is what makes all this interesting.


Interesting thread, I read that. Luckily I have normal grounding here and never had that problem. That actually sucks!



> 1) everything can be proven, if it exists, of course.
> 2) if you are talking about microseconds here, you are delusional
> https://www.humanbenchmark.com/tests/reactiontime
> do this test five times, come back with your 175-190ms, that would give you 175000-190000 microseconds. it is physically (not technically, physically) impossible for microseconds to pile up in your case to have even 0.1% impact on your input delay.
> I understand that you hate math, physics and studying the subject, but at least don't be such a fool here.


If that were true, then timer resolution would make no difference (1 ms vs 500 µs), but it does. You can test it yourself; there is an insane difference in input lag! What about if the mouse position renders in the next frame because you had extra DPC latency before the input was processed? You still don't understand the whole thing with microseconds. If you have a frame every 16.7 ms (60 fps) and 1000 Hz polling (1 ms), what if the frame starts rendering just before the poll occurred and was processed? Then it displays in the next frame. But if you had low DPC latency, it would render in the previous frame, and therefore you would have lower input lag. And over 1 s at 120 fps, what if the mouse position renders in the next frame x times? Then it scales over time and the input lag is more than microseconds. Btw, you call the 2nd best rank in CS:GO slightly above average??? LOL. I still think it is pointless to explain this to you, you don't even listen, so IDC...

Besides, even AnandTech tests DPC latency. Usually no one bothers, because it differs per hardware configuration and drivers, so it is relative, but it is still useful. Why would they bother to test it, given what you said? Why would they bother if it didn't matter at all?
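To put rough numbers on this (a toy model; the frame sample points and delay values are made up): if extra processing delay pushes an input past a frame's sample point, it shows one frame later.

```python
import math

FRAME_MS = 1000.0 / 60.0  # ~16.7 ms per frame at 60 fps

def frame_that_shows(input_time_ms, processing_delay_ms):
    """Index of the first frame whose sample point comes at or after the
    input is processed (toy model: frame i samples input at i * FRAME_MS)."""
    ready = input_time_ms + processing_delay_ms
    return math.ceil(ready / FRAME_MS)

# a report lands just before frame 1's sample point at ~16.67 ms
t = 16.5
print(frame_that_shows(t, 0.1))  # → 1  (low delay: still makes frame 1)
print(frame_that_shows(t, 0.5))  # → 2  (extra DPC-style delay: slips to frame 2)
```

In this model only inputs that land within the extra delay of a sample point slip a frame; the rest are unaffected, which is what the two sides here are arguing about.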


----------



## 508859

empl said:


> If that were true, then timer resolution would make no difference (1 ms vs 500 µs), but it does. You can test it yourself; there is an insane difference in input lag!


not really. "insane" differences should be easily measured though. It's not that I didn't try this particular snake oil before I started calling it snake oil.



empl said:


> What about if the mouse position renders in the next frame, because you had extra DPC latency before the input was processed?


What? Nothing, for real. NOT A SINGLE NEGATIVE EFFECT will occur if the mouse position is rendered in the next frame.



empl said:


> You still don't understand the whole thing with microseconds. If you have a frame every 16.7 ms (60 fps) and 1000 Hz polling (1 ms), what if the frame starts rendering just before the poll occurred and was processed? Then it displays in the next frame


again, nothing. You can also apply the same exact thinking process to data processing ALL THE TIME, not only when some specific part of the chain processes data with some specific delay.
Not to mention that polling is "up to 1 ms", not exactly 1 ms, so you will have those reports not getting into their respective frames anyway.
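To quantify the "up to 1 ms" point (a quick simulation with random event times against a fixed 1000 Hz poll clock): the latency a poll adds is spread between 0 and one interval, averaging about half of it.

```python
import math
import random

POLL_INTERVAL_MS = 1.0  # 1000 Hz polling

random.seed(0)
latencies = []
for _ in range(100_000):
    t = random.uniform(0.0, 1000.0)  # input event at a random time
    # the event waits for the next poll tick to be picked up
    next_poll = math.ceil(t / POLL_INTERVAL_MS) * POLL_INTERVAL_MS
    latencies.append(next_poll - t)

print(max(latencies))                   # bounded by one poll interval (1 ms)
print(sum(latencies) / len(latencies))  # averages about half an interval
```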



empl said:


> But if you had low DPC latency, it would render in the previous frame, therefore you would have lower input lag.


thanks for this in-depth explanation. I wonder what it has to do with 1) polling consistency, 2) GIGANTIC differences between USB 2.0 & 3.0, 3) the inability to measure anything


empl said:


> And after 1 s of having 120 fps, what if the mouse position renders in the next frame x times? Then it scales over time and the input lag is more than microseconds.


again, NOTHING...



empl said:


> Btw, you call the 2nd best rank in CS:GO slightly above average??? LOL. I still think it is pointless to explain this to you, you don't even listen, so IDC...


well, yeah. For some reason you can't reach the 1st one, so it is not like input lag mitigation can magically help you. It is not because people who outrank you have their PCs tuned for lower DPC and faster polling.



empl said:


> Besides, even AnandTech tests DPC latency. Usually no one bothers, because it differs per hardware configuration and drivers. So it is relative, but still useful. Why would they bother to test it, if it doesn't matter at all?


it's not useful, it's just interesting at most. Why would they bother to test it? Because they can, obviously not because it is game-changing for players.


----------



## Th3Awak3n1ng

deepor said:


> The person is super confident and kind of rude about his 2.0 vs. 3.0 opinion while in reality it's very likely just placebo.


There is a difference between 2.0 and 3.0 if the motherboard has a proper EHCI controller and not only xHCI for both (2.0 & 3.0), as most (if not all) new motherboards do.


----------



## empl

numberfive said:


> What? nothing, for real. NOT A SINGLE NEGATIVE EFFECT will occur if mouse position will be rendered in the next frame.


Are you really that stupid? If the mouse position renders in the next frame, that means you will have higher input lag. How is that not a negative effect?



numberfive said:


> not to mention that polling is "up to 1ms", not exactly 1ms. so you will have those reports not getting into their respective frames anyway


Haha, you just proved me right unknowingly. Even from an unstable polling rate you can get additional latency. Imagine if you have high DPC latency and polls are processed late: then the mouse renders in the next frame and you will have higher input lag.



numberfive said:


> not really. "insane" differences should be easily measured though. It's not that I didn't try this particular snake oil before I started calling it snake oil


This also implies you agree there is a difference between a 500 µs and a 1000 µs timer. How is DPC latency any different, when the CPU will process that input later? You just contradicted yourself: you say DPC latency doesn't matter as it is a number of microseconds, but then you claim this can be measured as a difference in a matter of microseconds.

And yet you act like a jerk while not having a single idea what you are talking about. I also don't understand why you care, if you claim it doesn't affect you. You can't even argue with people like you; I just couldn't help myself this time, since you contradict yourself.


----------



## CrucialNUG

The whole "USB 2.0 is better" thing is patently absurd. There are plenty of pros who have vastly better aim and reactions than these folks and use USB 3.0 exclusively at this point. If there is input inconsistency between your ports, it is due to something else (external), not USB 2.0 being superior by design.


----------



## CiselS

empl said:


> Maybe I found a good tweak, and I've included some other impactful tweaks:
> 
> Since we all know you should turn off *HW acceleration* where you can, I asked myself what else uses HW acceleration in Windows. In the browser and in Internet Options I have already disabled it. The question is what else in Windows uses it; there is a lot of bloatware. For some users there was/is an option to turn off hardware-accelerated GPU scheduling in Windows 10 - link. I personally don't have the option to turn it off, but I turned it off in the registry and the mouse feels snappier: I can make small adjustments more easily, whereas previously it felt like the mouse was lagging behind! I also read this feature may come back in the upcoming *Windows 20H1 update*.


If the graphics card and its WDDM 2.7 driver support it, on Windows 10 build 19041 and higher you can turn on hardware-accelerated GPU scheduling to reduce latency and improve video output performance.

It only works on 2004.

https://www.tenforums.com/tutorials...-accelerated-gpu-scheduling-windows-10-a.html


----------



## 508859

CiselS said:


> If the graphics card and its WDDM 2.7 driver support it, on Windows 10 build 19041 and higher you can turn on hardware-accelerated GPU scheduling to reduce latency and improve video output performance.
> 
> It only works on 2004.
> 
> https://www.tenforums.com/tutorials...-accelerated-gpu-scheduling-windows-10-a.html


this has nothing to do with the HW acceleration that has been put here and there for many years.
this setting lets the GPU manage its own memory, instead of putting a Windows layer in between.
it was supposed to improve latency by utilizing existing resources better with less overhead, while the previous "HW acceleration" was there to allow offloading of certain CPU tasks to other hardware (mostly the GPU).


> but I turned it off in the registry and the mouse feels snappier; I can make small adjustments more easily, whereas previously it felt like the mouse was lagging behind


and this here is pure placebo in action, since this setting is only compatible with the (yet unreleased) W10 2004 AND NVIDIA drivers that explicitly support it.
If you changed those settings through the registry on an unsupported platform and it "FEELS SNAPPIER", you can put one more star on your tinfoil.


----------



## Asmodian

CrucialNUG said:


> Well its the truth so take it how you want, I could care less what some reductionist thinks.


Using reductionist as an insult triggers alarms for me; hopefully I am reading too much into it. You do believe Maxwell's equations, and that humans understand electricity pretty well today?

How did you go about determining this "truth"? Experiencing something odd and coming up with a reasonable-sounding explanation is not a good way to determine truth. You need to come up with a falsifiable hypothesis and then actually test it. What experiment could you do that would prove your idea(s) false? Perhaps you could run some blind or double-blind tests? Do you have anyone who could make a change you think is important, so you could test it without knowing what configuration you were using?

This is an area that has a shocking amount of mystical/magical thinking in the community. I have seen so very many mouse related myths and superstitions over the years. Placebo is too strong of an effect with mouse input, human perceptions are simply not reliable at these timescales, and people quickly suck themselves into believing what turns out to be random superstitions by running lots of very unscientific tests. Bad ground can definitely cause computer issues, but reading your post on the subject I think you jumped to an explanation way too soon and in the face of conflicting evidence.

We are all susceptible to this, I cannot count the number of times I have been sure what an issue was caused by only to have more methodical testing prove the variable I was tweaking had nothing to do with it.


----------



## CrucialNUG

Asmodian said:


> Using reductionist as an insult triggers alarms for me, hopefully I am reading too much into it. You do believe Maxwell's equations and that humans understand electricity pretty well today?
> 
> How did you go about determining this "truth"? Experiencing something odd and coming up with a reasonable-sounding explanation is not a good way to determine truth. You need to come up with a falsifiable hypothesis and then actually test it. What experiment could you do that would prove your idea(s) false? Perhaps you could run some blind or double-blind tests? Do you have anyone who could make a change you think is important, so you could test it without knowing what configuration you were using?
> 
> This is an area that has a shocking amount of mystical/magical thinking in the community. I have seen so very many mouse related myths and superstitions over the years. Placebo is too strong of an effect with mouse input, human perceptions are simply not reliable at these timescales, and people quickly suck themselves into believing what turns out to be random superstitions by running lots of very unscientific tests. Bad ground can definitely cause computer issues, but reading your post on the subject I think you jumped to an explanation way too soon and in the face of conflicting evidence.
> 
> We are all susceptible to this, I cannot count the number of times I have been sure what an issue was caused by only to have more methodical testing prove the variable I was tweaking had nothing to do with it.


Fully aware that humans understand electricity pretty well today. And part of that understanding includes the fact that electromagnetic interference can affect electronic devices and how they function. Now, I have confirmed that the EMF in my home is quite high due to imbalanced current over my water lines. I did not take pictures while my electrician was here, and I do not currently own an ammeter, so you can either take my word for that or choose to disregard what else I have to say on it. However, @Timecard, who has the exact same technical issues as me, has now confirmed (with photos) that he has the same situation regarding net currents on his water lines.

I am not dealing in placebos here: placing foil or any EM shielding around my cables, particularly data cables, will change my mouse/keyboard function, display quality and sound quality. It is well beyond placebo, and I am aware that you may disagree with this. However, I am 100% confident that if you were to sit down at my computer in my home you would see exactly what I am talking about. I have taken my PCs (yes, multiple desktops and a laptop) to other locations where this problem is not present, and my computers function properly: night and day of a difference. I then purchased power conditioners, an online UPS, iso transformers and a number of other devices that affect the electrical connections for my system. When I saw that these made large differences in performance but did not offer a solution, I began to explore whether my home had power quality or electrical wiring issues. Then my electrician found an issue: Objectionable Current on a Grounding Conductor (NEC 250-21). Based on his recommendations we had our meter base and mast replaced and had our neighbor tighten their neutral connection. The current on the water pipe went down (from 6-8 amps to 3-4). My problems improved across the board, however only marginally. Therefore I believe the strange behavior of my devices is linked to the EMF that the imbalanced current is putting off.

If you think I am lying about this, then so be it; carry on thinking I am spreading nonsense. I have certainly been wrong about my theories here and will admit to it; however, I currently believe that the elevated EMF in my home is interfering with the data signals for my peripherals, as well as causing other devices/appliances to act strange. The thing about this problem is that it is always present and varies in severity as the net currents change with respect to the electrical loads on my home and those around me. Nothing is consistent, and so a double-blind test would be relatively hard in this situation.

As far as an experiment I could do that would prove all this: divert the current from the water pipe and see if that fixes it. I am not going to buy a 1000fps camera to prove things to people who already believe I am schizophrenic and think I am conjuring this up in my mind. I have used my computer at other locations for extended periods of time and it functions exactly as I know a properly working PC to function. There are multiple people with this issue out there, and it is already hard enough to find the cause and a "solution", let alone waste time attempting to prove to some naysayers online that it is real. It's fine if you do not believe me; I totally get how unlikely it all seems (something that I grappled with myself even when the problem was right in front of me). But I will continue to talk about this issue because I know I am not the only one with it. And with respect to USB 3.0 performing differently from USB 2.0, that is absolutely the case at my current residence. When I took my PC to my friend's, I tested both ports and could discern zero difference in tracking between the two: completely different from my home.


----------



## Asmodian

CrucialNUG said:


> Fully aware that humans understand electricity pretty well today. And part of that understanding includes the fact that electromagnetic interference can affect electronic devices and how they function. Now, I have confirmed that the EMF in my home is quite high due to imbalanced current over my water lines.


I am really surprised your computer could boot with EMF so bad that foil around a mouse cable changes how it works. It messes up USB 2.0 but not DDR4? It also seems really strange that the signal corruption was enough to cause packet loss on the USB, but not enough that it stops working entirely. EMF on USB tends to entirely stop its function in my experience (custom automation systems), it is using a complex digital protocol so once it goes bad it just doesn't work at all. In what way is the video quality affected? Putting foil around the video cable actually helped? This is digital video, right?

If a power conditioner helped, but didn't fix it, it doesn't make sense either. How would that improve the EMF environment if that is caused by something else (using a power conditioner does not change the imbalanced current, right?).

Your data points so far do not paint a clear picture, at least to me.


----------



## CrucialNUG

Asmodian said:


> I am really surprised your computer could boot with EMF so bad that foil around a mouse cable changes how it works. It messes up USB 2.0 but not DDR4? It also seems really strange that the signal corruption was enough to cause packet loss on the USB, but not enough that it stops working entirely. EMF on USB tends to entirely stop its function in my experience (custom automation systems), it is using a complex digital protocol so once it goes bad it just doesn't work at all. In what way is the video quality affected? Putting foil around the video cable actually helped? This is digital video, right?
> 
> If a power conditioner helped, but didn't fix it, it doesn't make sense either. How would that improve the EMF environment if that is caused by something else (using a power conditioner does not change the imbalanced current, right?).
> 
> Your data points so far do not paint a clear picture, at least to me.


Well, that is the truth. Covering my cables in foil/shielding material or using ferrites/toroids changes the performance, which would clearly indicate that the cable is acting as an antenna for EMI. The ways in which video quality is affected: screen tearing, severe motion blur, and hitches. At rare times there are destructive-interference-like artifacts on the monitor (horizontal lines). I have confirmed this is not the fault of any hardware, as this happens across multiple entirely different builds. I have tested HDMI, DP and DVI cables, and each one suffers these problems. I found the thicker and shorter the cable, the better it works.

I agree it does not make sense to me either why a power conditioner helped initially and is now just a glorified paperweight. The double-conversion online UPS is the really confusing thing, as it does have some permanent effect; however, it really only helps with the display hitching and stutters. If this was a power quality issue (modern PSUs are plenty good enough already), then the double-conversion UPS should fix it entirely. If it was the product of a "noisy" ground, then lifting the ground connection should fix it. I have had to rule things out one by one in an effort to narrow down the possible culprits, which leaves me with the one problem my electrician was able to find; and it so happens Timecard, who suffers the same input issues etc., has this exact situation in his home.

Fully aware my data points do not paint a super clear picture. It has been absurdly hard to diagnose and triangulate the cause of the interference, but I am confident it is from the net currents producing the EMF on my water lines. If you are curious and want to know more, you are welcome to come ask us in @Timecard's thread.


----------



## MaximilianKohler

If anyone cares to contribute on-topic feedback to what this thread was originally about, I created a troubleshooting thread here: https://www.overclock.net/forum/375-mice/1746838-troubleshooting-unstable-mouse-hz-polling-rate.html


----------



## empl

CiselS said:


> If the graphics card and its WDDM 2.7 driver for your Windows 10 build 19041 and higher computer supports hardware acceleration, you can turn on hardware-accelerated GPU scheduling to reduce latency and improve video output performance.
> 
> only work for 2004
> 
> https://www.tenforums.com/tutorials...-accelerated-gpu-scheduling-windows-10-a.html


Hardware acceleration should only be on in 3D apps like games, to improve performance. The problem is that other applications (browsers, Spotify, the Battle.net launcher, Edge) are using it too. If you turn it off, you can notice a major improvement. As for 3D applications themselves, I don't think you can turn it off, as it has been handled by GPU drivers for a long time now, unless those apps have an option to do so. So I am unsure what this setting in Windows is supposed to do. Maybe if you have an integrated GPU without drivers, this will let you turn it on/off. I doubt that turning this off will turn off hardware acceleration for 3D apps like games, as that is handled by GPU drivers and it would reduce performance too much. Or it turns hardware acceleration on/off for Windows apps only, or when you don't have GPU drivers installed.

Even ramenraider said to turn off HW acceleration, e.g. in Control Panel > Internet Options > Advanced > "Use software rendering". I always turn it off in Chrome and everywhere I can. Before you install GPU drivers you can turn it off in display settings and notice a huge difference. I am not exactly sure why it causes input lag, but I can feel the difference, and I read everywhere to turn it off. E.g. https://community.stadia.com/t5/Sta...ary-of-stuttering-and-input-lag-My/td-p/15054
You can find it in input-lag guides on the internet; other people can tell the difference too.


----------



## 508859

empl said:


> Hardware acceleration should only be on in 3D apps like games, to improve performance. The problem is that other applications - browsers, Spotify, the Battle.net launcher, Edge, etc. - use it as well. If you turn it off there, you can notice a major improvement. As for 3D applications themselves, I don't think you can turn it off, as it has been handled by GPU drivers for a long time now, unless those apps have an option to do so. So I am unsure what this setting in Windows is supposed to do. Maybe if you have an integrated GPU without drivers, it will let you turn acceleration on/off. I doubt that disabling it turns off hardware acceleration for 3D apps like games, as that is handled by GPU drivers and it would reduce performance too much. Perhaps it toggles hardware acceleration for Windows apps only, or applies when you don't have GPU drivers installed.
> 
> Even ramenraider said to turn off HW acceleration, e.g. in Control Panel > Internet Options > Advanced > "Use software rendering". I always turn it off in Chrome and everywhere I can. Before you install GPU drivers you can turn it off in display settings and notice a huge difference. I am not exactly sure why it causes input lag, but I can feel the difference, and I read everywhere to turn it off. E.g. https://community.stadia.com/t5/Sta...ary-of-stuttering-and-input-lag-My/td-p/15054
> You can find it in input-lag guides on the internet; other people can tell the difference too.


Except you have zero idea what "Hardware Accelerated GPU Scheduling" is in Windows, nor did you ever try it yourself. It has nothing to do with rendering anywhere, nothing to do with HW acceleration in browsers, and not a single guide on the internet even mentions it, since it is NOT YET implemented.


----------



## empl

numberfive said:


> Except you have zero idea what "Hardware Accelerated GPU Scheduling" is in Windows, nor did you ever try it yourself. It has nothing to do with rendering anywhere, nothing to do with HW acceleration in browsers, and not a single guide on the internet even mentions it, since it is NOT YET implemented.


Yeah, that's what I said... I have no idea what it is for, because hardware acceleration has been handled by GPU drivers for a long time now. Only before GPU driver installation is there an option to tweak it somewhere under display settings. There is no way to disable it in GPU drivers for 3D apps, as that would probably decrease performance too much. Some applications, like browsers, let you turn it off in their own settings. You generally don't want HW acceleration on for any Windows application except media players or PC games, as having it on adds input lag. I don't know what this new setting is for exactly. Btw, even though it is yet to be implemented, it was already visible for some users. I would guess it's maybe for Windows UWP apps, dunno.

Haa https://www.windowslatest.com/2020/02/16/windows-10-v2004-hardware-accelerated-gpu-scheduling/
https://www.reddit.com/r/windows/comments/g99afg/hardwareaccelerated_gpu_scheduling/
"WDDM 2.7 introduces another new feature called 'Hardware-accelerated GPU scheduling' for integrated and discrete graphics cards. Unfortunately, AMD Radeon is currently not supported due to the lack of insider drivers."

Yeah, that's what I thought: it is only for integrated or discrete cards. For desktop GPUs it is handled by drivers. I first thought disabling it maybe helped a bit, but this doesn't do anything for a dedicated GPU.



> x7007


Btw, English (Philippines) decreases input lag a lot. I have a new monitor since my gaming monitor died; I was on a VA panel for a long time, which is laggy. There is a huge difference after installing both the display language and switching the keyboard language from English (US) to English (Philippines).


----------



## x7007

empl said:


> x7007
> 
> 
> Btw, English (Philippines) decreases input lag a lot. I have a new monitor since my gaming monitor died; I was on a VA panel for a long time, which is laggy. There is a huge difference after installing both the display language and switching the keyboard language from English (US) to English (Philippines).


Thanks I'll test it again!


----------



## empl

x7007 said:


> Thanks I'll test it again!


Also, I always read that installing Intel Management Engine drivers causes input lag, which was true on my old rig. But on my new mobo, the Asus Z390 Gaming-i, which is known for low DPC latency, it rocks. Also, mouse accuracy is better after installing the Intel Management Engine firmware; about the drivers I'm not sure.

Also, as you install more programs, there are services in msconfig which can't be disabled in Task Manager > Services - only by doing a clean boot, which reduces input lag considerably. The problem is that nothing works then, and you have to restart the system and click the services you want enabled back one by one; it is too annoying. Even if a .bat file for doing that worked, having to restart the PC each time is annoying. The only thing I can think of is dual boot, but I don't want to waste precious space on my NVMe. Should have bought 2 TB; I didn't know there are 200 GB games xD

Btw, how did you find out that Philippines has lower input lag? Did you test every language package against each other?


----------



## 508859

empl said:


> Yeah, that's what I said... I have no idea what it is for, because hardware acceleration has been handled by GPU drivers for a long time now. Only before GPU driver installation is there an option to tweak it somewhere under display settings. There is no way to disable it in GPU drivers for 3D apps, as that would probably decrease performance too much. Some applications, like browsers, let you turn it off in their own settings. You generally don't want HW acceleration on for any Windows application except media players or PC games, as having it on adds input lag. I don't know what this new setting is for exactly. Btw, even though it is yet to be implemented, it was already visible for some users. I would guess it's maybe for Windows UWP apps, dunno.
> 
> Haa https://www.windowslatest.com/2020/02/16/windows-10-v2004-hardware-accelerated-gpu-scheduling/
> https://www.reddit.com/r/windows/comments/g99afg/hardwareaccelerated_gpu_scheduling/
> "WDDM 2.7 introduces another new feature called 'Hardware-accelerated GPU scheduling' for integrated and discrete graphics cards. Unfortunately, AMD Radeon is currently not supported due to the lack of insider drivers."
> 
> Yeah, that's what I thought: it is only for integrated or discrete cards. For desktop GPUs it is handled by drivers. I first thought disabling it maybe helped a bit, but this doesn't do anything for a dedicated GPU.


Please stop spreading your incompetence.
Your first paragraph, again, has nothing to do with the subject.

Discrete means usual desktop GPUs. And previously scheduling was handled by the Windows API; now there will be a setting to give control over GPU resources to the GPU directly. Save yourself from the embarrassment, do not comment here.


----------



## empl

numberfive said:


> Please stop spreading your incompetence.
> Your first paragraph, again, has nothing to do with the subject.
> 
> Discrete means usual desktop GPUs. And previously scheduling was handled by the Windows API; now there will be a setting to give control over GPU resources to the GPU directly. Save yourself from the embarrassment, do not comment here.


Lol, I thought a discrete graphics card was the thing that allowed you to connect the output to the integrated graphics card and use it in combination with your dedicated GPU. I didn't know what it was, so? It is a new feature... I think you are still angry because people disagreed with you on DPC latency...


Btw, I don't know why the OP is zoomed in to a couple of microseconds. I get auto-zoomed to the scope where the interrupt with the highest latency occurred. I read from the original post that Interval vs Time is supposed to measure polling rate stability, i.e. whether the mouse captures frames at the correct interval - e.g. some crappy mice have a problem with this and report unstable polling rates. But the OP also talks about polling rate imprecision and explains it as how fast the CPU can service the interrupt routine raised by the EHCI controller, i.e. the time the OS takes to handle each poll. But those values are way higher than a couple of microseconds: on my old PC it was around 80 us measured with LatencyMon, and the OP says it can go up to 500 us on some badly optimized systems. Also, there are multiple rows of dots stacked above each other in his picture; I don't understand what that means, and I have nothing like that in my program. So I am unsure whether Interval vs Time tests only polling rate stability, or the OS's poll-handling time as well. It looks like both, since I get the same results in LatencyMon...
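(As an illustration, not from the original post:) MouseTester's Interval vs Time plot is essentially just the difference between consecutive event timestamps. A minimal sketch of that computation - the `intervals` helper and the timestamp values are hypothetical, in milliseconds:

```python
# Sketch: reproduce "Interval vs Time" data from raw event timestamps.
# A perfectly stable 1000 Hz mouse would show a flat line at 1.0 ms.

def intervals(timestamps_ms):
    """Return (time, interval) pairs: each event's timestamp and the gap
    since the previous event."""
    return [(t, t - prev) for prev, t in zip(timestamps_ms, timestamps_ms[1:])]

# Hypothetical capture: mostly 1 ms polls, one delayed poll at t=4.3
# (e.g. a late interrupt/DPC), followed by a short catch-up interval.
stamps = [0.0, 1.0, 2.0, 3.0, 4.3, 5.0, 6.0]
for t, dt in intervals(stamps):
    print(f"t={t:5.1f} ms  interval={dt:4.2f} ms")
```

A delayed poll shows up as one long interval followed by a short one, which is one plausible explanation for dots appearing in bands above and below the nominal interval.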

*Also, can anyone clarify what the ripples in the graph are, besides the dots? It is not explained in the original post. Or the multiple rows of dots above each other? And what are the dots under 2 ms?*

I have a new PC, so I want to show MouseTester results on a low-DPC-latency mobo, the Asus Z390 Gaming-i: https://imgur.com/a/YijTlHr - feels great! You can find mobo DPC latency tests on AnandTech; even if they are relative, they help when picking a new mobo. Seems like I picked right. I didn't even change many things in the BIOS, just disabled the CPU power-saving features so far. The only thing I don't like about it is that you can't disable HPET, but it already has low DPC latency, so it isn't a big deal. I also noticed I get a lot fewer DPC calls on my new PC, maybe because everything works in MSI mode.

Btw, I also wonder about interrupt priority. Checking in MSI_util_v2, SATA and NVMe are set to high by default and the others are undefined.

Picture from LatencyMon: it shows up under Wdf01000 because I have USB 3 drivers installed. But I am using USB 2 ports, and xHCI is in MSI mode by default. And I disabled all other ports in the BIOS. Still, I only have a USB 3 root hub in Device Manager, so I don't want to uninstall it and lose the mouse.



x7007 said:


> s


This decreased input lag a lot for me! https://answers.microsoft.com/en-us...ndows-10/13bdf8f4-1c44-4133-8e7c-3214bb674742



r0ach said:


> abc


I tested using the Nvidia HD audio output with onboard audio disabled in the BIOS, and it gave me insane input lag. I read in multiple articles on the internet that Nvidia HD audio causes input lag! Also, I have the sampling rate greyed out and set to 48 kHz - though even if I had 44.1, I don't think that's it; 48 vs 44.1 kHz doesn't feel that bad. Could be just my setup, but I don't think so; I read it in multiple articles. So I don't know, it definitely feels much, much worse, at least for me. And the thing is, while an onboard sound card can cause huge input lag, I have a ridiculously good mobo, the Asus Z390 Gaming-i; on my previous computer the onboard audio definitely caused more input lag, whereas now it is not that different set on or off.

*EDIT, MouseTester*: after cutting the start and end, I got auto-zoomed and the graph shows no more than 40 us of poll handling by the OS. Still, in LatencyMon I sometimes get spikes to 100 us. I think LatencyMon is the better measure, since it simulates load on all drivers - in real-world scenarios, more than just the mouse will be in use. However, when it measures briefly, if I max the polling rate, LatencyMon shows the same 40 us. So it seems correct.


----------



## the1freeMan

Can anybody test if there's any difference in polling precision between usb 2 and 3 on a coffee lake board?


----------



## 508859

the1freeMan said:


> Can anybody test if there's any difference in polling precision between usb 2 and 3 on a coffee lake board?


it's not defined by the chipset, and also there are multiple other factors that will have much bigger impact on precision than a specific chipset


----------



## the1freeMan

numberfive said:


> it's not defined by the chipset, and also there are multiple other factors that will have much bigger impact on precision than a specific chipset


Of course; I was curious whether there was any difference, since both are handled by the USB 3 controller on those boards, with many not even having USB 2 ports.
I unpacked one that I was about to send back to the store, taking an extra day to test it myself, and the polling stability is equal on both ports.


----------



## NDUS

I wonder about USB polling stability on cheap vs expensive boards.
For example, if we take some bargain-bin Z470 board and compare it against a Z490 DARK, how will the stability differ? Are more PCB layers an advantage or a disadvantage? Is the trace layout potentially superior on the expensive board?

If anyone has a high-end board like a DARK, feel free to post your polling results here; I'm curious about them.


----------



## SweetLow

NDUS said:


> I wonder about the USB polling stability in cheap vs expensive boards.


Zero difference.


----------



## MIETAS

Gotta ask you boys, is this acceptable?


----------



## cdcd

Yes


----------



## empl

NDUS said:


> I wonder about USB polling stability on cheap vs expensive boards.
> For example, if we take some bargain-bin Z470 board and compare it against a Z490 DARK, how will the stability differ? Are more PCB layers an advantage or a disadvantage? Is the trace layout potentially superior on the expensive board?
> 
> If anyone has a high-end board like a DARK, feel free to post your polling results here; I'm curious about them.





SweetLow said:


> Zero difference.


How is that zero difference? You have these motherboards - even $500 Asus ROGs - which have literally 1 ms DPC latency. If a motherboard is hogged with DPC latency, other devices suffer too. Besides, on a cheap motherboard you may not be able to disable individual USB ports. I found out on my Asus Z390 Gaming-i that when I disable all USB ports except USB 2, I have plain USB Root Hubs. But if I enable even one USB 3.0 port, I get USB Root Hubs (USB 3.0) and mouse movement feels like crap, even while I am using USB 2 ports for mouse and keyboard! Also, if you aren't able to set BCLK as close as possible to 100, it is not optimal. Or if you are unable to disable HPET, which is not possible on Asus motherboards (not sure if this is true for all). Also, I don't know if all modern motherboards use MSI, which is a huge deal for USB polling stability and latency. So there will be a difference for sure. There will probably be more difference in input lag (depending on what features your motherboard can tweak) and DPC latency, which depends on what mobo you choose and what devices/drivers it is using...

I don't think polling stability is even the main issue unless you suffer from high DPC latency; I think input lag is more important! Today, if you choose a decent motherboard and disable useless services etc., polling stability will probably be very consistent. The bigger problem is choosing the right mobo with low DPC latency and features you can tweak. The majority of input lag comes from an untweaked BIOS. As for stability, I'm not sure; I don't test BIOS features for stability, I already know what should be disabled etc., and mine is very decent. Not sure if that's the reason, but on my new motherboard, which supports MSI for USB ports, mouse polling stability is much better and DPC latency for the USB driver is a lot lower than on my old one - by a margin of around 100 us, depending.

ANANDTECH HAS A LOT OF TESTS FOR MOBOS, INCLUDING DPC LATENCY! It makes all the difference in the world!!! I was Supreme Master First Class in CS:GO, and when I tried USB 3 I couldn't hit anything; I would be bronze. Some of these settings have a gigantic impact on the mouse, like disabling CSM, which helps a ton!!!


----------



## Timecard

empl said:


> Bigger problem is choosing right mobo with low dpc.


Isn't DPC the software layer abstraction for queued interrupt requests?


----------



## empl

Timecard said:


> Isn't DPC the software layer abstraction for queued interrupt requests?


And what of it? All I know is that the CPU gets interrupts from hardware and then schedules a DPC call; the higher the latency, the longer it takes the processor to handle interrupts. You get a frame every 16.67 ms, or 6.9 ms, etc. But what if the interrupt is handled right after a frame is finished? Then that frame doesn't render the last mouse position, and you have to wait for the next frame for the position to update. Say you have 500 Hz polling, so the mouse captures a position every 2 ms - e.g. 2 ms before the GPU finishes rendering a frame (I doubt the mouse position can be updated once the frame is sent to the monitor from the GPU). On top of that, with high DPC latency it can take another ~100 us, even on good motherboards, to handle the interrupt from USB. (USB doesn't really send interrupts - the host polls it 500/1000 times per second - but the CPU still schedules DPC calls for the USB driver.) So if the CPU handles the interrupt 100 us after a frame has finished rendering and been sent to the monitor, the GPU has rendered a mouse position from up to 4 ms ago, because the new position wasn't updated yet!!! The CPU prepares each frame for the GPU before the GPU starts rendering it; I don't know whether the CPU can update the mouse position once the GPU has the frame, but after the GPU sends it to the monitor it is doubtful. So over one second, the mouse position can lag like this up to 144 times if you use a 144 Hz monitor. And small input lag adds up!

It doesn't affect you that much, because you have a 200 ms reaction time on average, and if 2.6 ms is added on top of the overall input lag, it won't make a major difference. But you can feel it in the overall mouse movement, especially when you make small adjustments: movement is cleaner and more stable than with high DPC latency. The alternative is called micro-stuttering!

That's why if you set the timer resolution to 0.5 ms, there is a gigantic difference versus 1 ms. It lets work reach the CPU faster. Yes, it is only a 500 us difference, but it means updates can happen every 500 us instead of every 1 ms or at a higher interval, which scales over a second of time at, say, 144 fps. You can try it yourself - there is a gigantic difference! The same goes for using 500 Hz instead of 1000 Hz: if the rendered mouse position comes from a stale poll (4 ms ago), the mouse will feel laggier than if it used the latest position (2 ms ago). So even with 100 us DPC latency at 144 fps, there is a chance, 144 times per second, that the cursor is rendered from a stale position, depending on how polling lines up and on the DPC latency before the interrupt is handled by the CPU.

I am not a hardware expert, but if there weren't any difference, I couldn't feel anything. You can try timer resolution 500 us vs 1 ms; the difference is so gigantic that anyone can probably tell. A lot of people use it. You can also try 15.6 ms vs 0.5 ms - and you have to disable the dynamic tick so the timer resolution stays constant! It also increases CPU load slightly. It was tested in Crysis to add fps.

Also, about stability: 0.5 ms feels more unstable than 1 ms. It is hard to describe, but 0.5 ms is more inconsistent than 1 ms, especially with 500 Hz polling; with 1000 Hz polling it is better. That's the reason I always used a 1 ms timer resolution - it feels best! And 500 Hz is also better than 1000 Hz, because 1000 Hz captures even the smallest movements... So 500 Hz is more accurate, but less consistent!
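The arithmetic in the post above can be made concrete. Under the simplifying assumption that each delay source adds its maximum independently (real pipelines overlap, so this is only a rough upper bound), the worst-case "age" of the mouse position a frame displays is roughly one poll interval plus interrupt/DPC handling plus one frame interval. The function name and the numbers are hypothetical illustrations:

```python
# Sketch: rough worst-case age of the mouse position a displayed frame
# contains, assuming each stage simply adds its maximum delay.

def worst_case_age_ms(poll_hz, dpc_us, fps):
    """Poll interval + interrupt/DPC handling + one frame interval, in ms."""
    return 1000.0 / poll_hz + dpc_us / 1000.0 + 1000.0 / fps

# 500 Hz mouse, 100 us DPC latency, 144 fps display:
print(worst_case_age_ms(500, 100, 144))   # ~9.04 ms
# Raising polling to 1000 Hz shaves ~1 ms off this worst case:
print(worst_case_age_ms(1000, 100, 144))  # ~8.04 ms
```

Note how, in this model, the frame interval dominates: at 144 fps the 100 us DPC term is about 1% of the total, which is why the post's claim that sub-millisecond DPC differences are perceptible is contested in the replies.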


----------



## x7007

empl said:


> Timecard said:
> 
> > Isn't DPC the software layer abstraction for queued interrupt requests?
> 
> And what of it? All I know is that the CPU gets interrupts from hardware and then schedules a DPC call; the higher the latency, the longer it takes the processor to handle interrupts. [...]
> 
> Also, about stability: 0.5 ms feels more unstable than 1 ms, especially with 500 Hz polling. That's the reason I always used a 1 ms timer resolution - it feels best!

Disabling the dynamic tick - yes, it is bad. It changes the mouse accelerating or decelerating. I tested it, and so did someone else on Tom's Hardware; I told him and he noticed it. It shouldn't be used.


----------



## empl

x7007 said:


> Disabling the dynamic tick - yes, it is bad. It changes the mouse accelerating or decelerating. I tested it, and so did someone else on Tom's Hardware; I told him and he noticed it. It shouldn't be used.


Yeah, I can't wait, btw, until some random people start jumping in and telling me I am crazy and imagining it - the people who probably think the eye can't see more than 24 fps. Scientists discovered the eye can see even a single photon of light. Why is it so hard for people to believe that people can tell input lag? And why the heck does someone care if someone else thinks otherwise?! If you think it doesn't help, why do you care, and why do you have to flame other people if they have a different opinion?

Btw you may take interest in this:
https://csgotweaks.com/bios.html

Also, did you try deleting WAN adapters in Device Manager, and do you disable USB power saving there, on top of the power plan setting? I don't know why, but even when a device is not used, some drivers introduce input lag - like if you use Virtual Mechanic but have nothing mounted. Even though I don't use Nvidia HD audio, disabling it reduced input lag! I also thought of lowering the priority of some useless system services in Process Lasso - the ones which cannot be safely disabled - or reducing their core affinity, because they sometimes consume CPU cycles, and CPU scheduling is pretty good. I still play competitive games sometimes and I have VR, so I enjoy low input lag. Btw, the Valve Index hardware has very low input lag; I was surprised!

This may amuse you: https://forums.blurbusters.com/viewtopic.php?f=10&t=6378&p=55454#p55454 - a monitor expert saying color profiles can reduce input lag, or using no color profile. Right after someone flamed that it doesn't matter, haHAHHAHA...


----------



## 508859

empl said:


> How is that zero difference? You have these motherboards - even $500 Asus ROGs - which have literally 1 ms DPC latency. If a motherboard is hogged with DPC latency, other devices suffer too. Besides, on a cheap motherboard you may not be able to disable individual USB ports. I found out on my Asus Z390 Gaming-i that when I disable all USB ports except USB 2, I have plain USB Root Hubs. But if I enable even one USB 3.0 port, I get USB Root Hubs (USB 3.0) and mouse movement feels like crap, even while I am using USB 2 ports for mouse and keyboard! Also, if you aren't able to set BCLK as close as possible to 100, it is not optimal. Or if you are unable to disable HPET, which is not possible on Asus motherboards (not sure if this is true for all). Also, I don't know if all modern motherboards use MSI, which is a huge deal for USB polling stability and latency. So there will be a difference for sure. There will probably be more difference in input lag (depending on what features your motherboard can tweak) and DPC latency, which depends on what mobo you choose and what devices/drivers it is using...
> 
> I don't think polling stability is even the main issue unless you suffer from high DPC latency; I think input lag is more important! Today, if you choose a decent motherboard and disable useless services etc., polling stability will probably be very consistent. The bigger problem is choosing the right mobo with low DPC latency and features you can tweak. The majority of input lag comes from an untweaked BIOS. As for stability, I'm not sure; I don't test BIOS features for stability, I already know what should be disabled etc., and mine is very decent. Not sure if that's the reason, but on my new motherboard, which supports MSI for USB ports, mouse polling stability is much better and DPC latency for the USB driver is a lot lower than on my old one - by a margin of around 100 us, depending.
> 
> ANANDTECH HAS A LOT OF TESTS FOR MOBOS, INCLUDING DPC LATENCY! It makes all the difference in the world!!! I was Supreme Master First Class in CS:GO, and when I tried USB 3 I couldn't hit anything; I would be bronze. Some of these settings have a gigantic impact on the mouse, like disabling CSM, which helps a ton!!!


None of this matters. The price of the motherboard has nothing to do with the quality of the hardware and firmware; many mobo components (like USB controllers) are the same on budget and luxury boards, and it's not unusual for an expensive mobo to have a ****ty component embedded for no reason as well.
I mean, you need to have at least a hint of justification for the DPC latency tantrum. I don't see it; I've never seen a single test where 10 us was practically better than 100 us for gaming, not a single one.



empl said:


> Btw you may take interest in this:
> https://csgotweaks.com/bios.html


Garbage.

Like, from the beginning: "100:100. 100:100 should obviously provide better results since it matches BCLK. Memory with lower frequency and lower timing can reduce latency."
What is obvious about it? If the number is the same, it doesn't matter that they are in sync, and even if they were in sync, it wouldn't make anything better for some reason.
"The closer you can get this to 100.00, the better." No? Nothing better or worse about this.
Etc.


----------



## ucode

x7007 said:


> disable dynamic tick yes is bad. it changes the mouse accelerating or decelerating. I tested it and someone else in Tom'shardware. I told him and he noticed it. shouldn't be used.


Last time I looked at this, if you disable idle states then the dynamic tick is disabled too, as there is no opportunity to turn it off. However, IIRC, disabling it via BCD sees the tick on only core 0 (on my hardware, anyway), which could cause conflicts of interest.

Timer resolution affects quanta (thread time), so making it very low may affect performance in some situations depending on software, etc. Then again, being low means W10 will hopefully be messing about with it a lot less.

As for the more expensive mainboards, they might provide better overclocking, and the faster your CPU runs, the quicker it can get things done, generally speaking.
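The timer-resolution effect discussed above can be probed empirically: an OS sleep always lasts at least as long as requested, and the overshoot reflects the scheduler's timer granularity (on Windows this is what `timeBeginPeriod` changes; other OSes differ). A minimal sketch - the helper name `sleep_overshoot_ms` is made up for illustration, and the numbers you get will vary by OS and load:

```python
# Sketch: measure how much an OS sleep overshoots the requested duration.
# The overshoot is a rough proxy for the system timer granularity.
import time

def sleep_overshoot_ms(requested_ms, samples=20):
    """Average overshoot (actual - requested) of time.sleep, in ms."""
    total = 0.0
    for _ in range(samples):
        start = time.perf_counter()
        time.sleep(requested_ms / 1000.0)
        total += (time.perf_counter() - start) * 1000.0 - requested_ms
    return total / samples

print(f"avg overshoot for 1 ms sleep: {sleep_overshoot_ms(1.0):.3f} ms")
```

On a Windows box with the default 15.6 ms timer the overshoot for short sleeps is typically far larger than with a 1 ms (or 0.5 ms) resolution, which is the difference the posts above are arguing about.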


----------



## empl

numberfive said:


> Garbage.
> 
> Like, from the beginning: "100:100. 100:100 should obviously provide better results since it matches BCLK. Memory with lower frequency and lower timing can reduce latency."
> What is obvious about it? If the number is the same, it doesn't matter that they are in sync, and even if they were in sync, it wouldn't make anything better for some reason.
> "The closer you can get this to 100.00, the better." No? Nothing better or worse about this.
> Etc.


I never said it does; I was listing it as an example - that even ROGs for $500 suffer from high DPC latency (even 1 ms). I mentioned this... It doesn't have to do with the quality of the hardware, but rather with poorly written drivers! Hardware plays a role too: e.g. motherboard RGB was found to cause latency, or WiFi. So you need hardware/BIOS which allows you to tweak and disable options. Someone claims even VRM switching matters for input lag. But for DPC latency solely, it is drivers that cause the delay! So picking the right hardware (the motherboard) makes all the difference in the world! Again, you don't even read, you just berate and flame... I am not sure if all new motherboards support MSI for USB, or if you can turn off individual USB ports on them. You are just cherry-picking facts to flame more... If you don't care about it, why do you care that some people can tell the difference? Even between 10 us and 100 us, sometimes the latest mouse position won't render, and then you will experience less stability and cursor smoothness - though even in this case it probably wouldn't matter much. But if it goes to extremes like 1 ms of DPC latency, you will have micro-stuttering. And I also said, if you read, that it probably won't matter as much as input lag (which depends on hardware/BIOS again, so what hardware you pick matters) if you already have a decent motherboard with 100 us DPC latency for the USB driver. So you just don't read and then claim I said things I didn't say. So you just troll... And yes, if you play competitive games at a high level, every distraction matters and it helps to have less input lag etc. Previously I was Gold 3 in CS:GO, and after tweaking my system I instantly ranked up; when I tried my previous settings I couldn't hit anything...

IT IS MEANINGLESS TO TALK WITH YOU. YOU JUST FLAME INSTEAD OF DISCUSSING... I am just calling you out...


----------



## Timecard

empl said:


> It doesn't have to do with quality of hardware, but rather from poorly written drivers!


This was my point about DPCs and your saying to buy motherboards with low DPC latency; I just wanted to make sure readers understood. Sooo... I think what you were trying to say is that some motherboards operate better with the available drivers, assuming AnandTech's tests (your reference) aren't unknowingly skewed.

numberfive is just stating that many components operate on the same clock, so as long as the clock is stable they are equally synchronized, which seems logical. I assume you'd get some performance benefit from having it higher, e.g. a constant 99.9 vs 100, but it shouldn't be significant. If anyone has evidence or research to the contrary, please post.


----------



## 508859

empl said:


> I never said it has. I was listing it as an example: even ROGs for $500 suffer from high DPC latency (even 1 ms). I mentioned this... It doesn't come down to the quality of the hardware, but to poorly written drivers! Hardware plays a role too, e.g. motherboard RGB was found to cause latency, or WiFi. So you need hardware/BIOS which allows you to tweak and disable options. Someone claims even VRM switching matters for input lag. But for DPC latency alone, it is the drivers which cause the delay! So picking the right hardware (the motherboard) makes all the difference in the world! Again, you don't even read and just berate and flame... I am not sure if all new motherboards support MSI for USB, or whether you can turn off individual USB ports on them. You are just cherry-picking facts to flame more... If you don't care about it, why do you care that some people can tell the difference? Even between 10 us and 100 us, sometimes the latest mouse position won't render, and then you will experience less stability and cursor smoothness. Even in this case it probably wouldn't matter much, but if it goes to extremes like 1 ms of DPC latency, then you will get micro-stuttering. And I also said, if you had read, that it probably won't matter as much as input lag (which depends on hardware/BIOS again, so what hardware you pick matters) if you already have a decent motherboard with 100 us of DPC latency for the USB driver. So you just don't read and then claim I said things I didn't say. So you just troll... And yes, if you play competitive games at a high level, every distraction matters and it helps to have less input lag etc. Previously I was Gold 3 in CS:GO, and after tweaking my system I instantly ranked up. I tried my previous settings and couldn't hit anything...
> 
> IT IS MEANINGLESS TO TALK WITH YOU; YOU JUST FLAME INSTEAD OF DISCUSSING... I am just calling you out...


- they are not suffering from high DPC latency, because the only people who care about high DPC latency are those who run 3DMark 2040 and MAYBE sound engineers in some exotic cases in 2006. 
- the quality of drivers has nothing to do with the cost of the hardware (which for some reason was your point two comments above)
- "Someone claims, even..." someone is even roach. ignore and forget, or prove it and shut me up.
- "motherboard - makes all difference in the world" no? there is no right one; there are very few that are garbage, and the rest will suit you perfectly for gaming.
- "Again you don't even read and just berate and flame..." for some reason you are spreading this bs and giving a link to a tweaking resource that refers back to roach's thread on OCN. like, for real? how can you not be a clown here? 
- "If you don't care about it, why you care some people can tell difference ?" don't make me laugh. you are continuously clowning here as an assistant manager to roach. 

I actually decided not to read your post further, because you repeat your bs claims, which have zero justification, zero practical impact on (competitive) gaming, and zero tests that would show even the slightest difference. 
It is not a matter of what kind of difference I can tell or not; it is not about me at all. 
Y O U 
A R E
S P R E A D I N G
B S


on a side note, you are simply bad at games (according to you); you can improve with more practice and less tin-foil tuning. A stock Windows 10 setup is not providing a disadvantageous experience to professional players, and if it does to you, well, it is a you problem.


----------



## ucode

Timecard said:


> numberfive is just stating that many components operate on the same clock so as long as the clock is stable they are equally synchronized, seems logical.


Looks at own board and sees lots of clocks that are not the same/synchronized; which ones did you mean? As for price not affecting firmware, that in my experience is just not true. It's a bit like Windows, maybe: you pay more to unlock more features, such as the amount of RAM supported; it's just OS software settings.


----------



## MIETAS

I've run out of tweak ideas; that's all I can get on Win 10 (not safe mode).
Can't go any further with Ryzen...


----------



## Athrutep

MIETAS said:


> I've run out of tweak ideas; that's all I can get on Win 10 (not safe mode).
> Can't go any further with Ryzen...


That is good. The only way to get it lower is to install Win 7 and uninstall/remove loads of components (and it also depends on your mouse). But what you reached is pretty much the Win 10 limit; at least I haven't seen anyone post anything lower. I also doubt that lower results would be a big improvement.


----------



## MIETAS

Safe mode graphs look pretty much like Win 7, maybe a little worse. Normal boot is as it is. It took me about a year to get this kind of polling, which is kinda sad, but hey, at least the ******** journey has ended.

Thanks for reply @Athrutep


----------



## NDUS

I don't think these graphs showing your polling stability on the desktop with nothing running are useful data. When you play a game, you will start getting tons of interrupts, as the same thread that handles polling will also start handling game logic. If you play a game with MouseTester recording active and then look at the result afterwards, it's not going to be as tranquil as your desktop graphs.

The real way to maximize polling stability in games should be:

1) identify the thread which controls USB polling by moving your mouse and observing thread activity. For me it's thread #2, or "thread #1" if you count from 0 like HWiNFO64 does
2) move all possible computation off that thread, i.e. forbid the game from using it

It should also obviously help to have stronger per-core performance, since the CPU will handle interrupts that much faster.


----------



## MIETAS

NDUS said:


> I don't think these graphs showing your polling stability on the desktop with nothing running are useful data. When you play a game, you will start getting tons of interrupts, as the same thread that handles polling will also start handling game logic. If you play a game with MouseTester recording active and then look at the result afterwards, it's not going to be as tranquil as your desktop graphs.
> 
> The real way to maximize polling stability in games should be:
> 
> 1) identify the thread which controls USB polling by moving your mouse and observing thread activity. For me it's thread #2, or "thread #1" if you count from 0 like HWiNFO64 does
> 2) move all possible computation off that thread, i.e. forbid the game from using it
> 
> It should also obviously help to have stronger per-core performance, since the CPU will handle interrupts that much faster.



The main thing is: I had zero stability at idle, so I take these graphs as a huge step forward.


----------



## 508859

MIETAS said:


> The main thing is: I had zero stability at idle, so I take these graphs as a huge step forward.


you won in graphs, but do you play games?


----------



## ucode

NDUS said:


> I don't think these graphs showing your polling stability on the desktop with nothing running are useful data.


It's all relative, so still useful IMO. IRQs should take precedence, so if you're seeing poor results in game it might just be MouseTester itself getting beat up by those other threads. 




MIETAS said:


> Safe mode graphs look pretty much like Win 7, maybe a little worse.


W10 does seem very sad vs W7 in its IRQ handling, so you've done well. I had tried some serial mouse integration (not a real mouse, just a connection to the RS-232 port pretending to be a mouse), and when looking at 10,000 Hz packets W10 drops 40-50% of them while W7 carries on well above that (same HW). What I found surprising though is that above 8000 Hz the CPU (Haswell) starts throttling internally, reaching its base ratio at somewhere around 15k packets. Enough of that though, as this is a USB thread. I will say W7 does seem to set XHCI interrupt moderation (a way of throttling interrupts) high, or perhaps it doesn't set it at all.


----------



## klunka

Noob here: why do my MouseTester graphs have like 5 dots when I zoom in, compared to the hundreds of dots I see posted in this thread?


----------



## Axaion

klunka said:


> Noob here: why do my MouseTester graphs have like 5 dots when I zoom in, compared to the hundreds of dots I see posted in this thread?


Because you zoomed the graph to 10ms?

I took the bait/10


----------



## Timecard

If you don't cut off the head and tail, it'll look almost perfect zoomed out too (far fewer dots).


----------



## klunka

xD I am actually serious... how do you zoom in on update time without zooming the timeline?


----------



## klunka

NDUS said:


> I don't think these graphs showing your polling stability on desktop with nothing running are useful data. When you play a game, you will start getting tons of interrupts as the same thread that handles polling will also start handling game logic. If you play a game with mousetester recording active and then observe afterwards, it's not going to look as tranquil as your desktop graphs.
> 
> The real way to maximize polling stability in games should be:
> 
> 1) identify the thread which controls USB polling by moving your mouse and observing thread activity. For me it's thread #2, or "thread #1" if you start from 0 like HWINFO64 does
> 2) move all possible computation off of that thread ie. forbid the game from using it
> 
> Also it should obviously help to have a stronger per-core performance, since the CPU will handle interrupts that much faster.


I tried to do this but couldn't identify the core for USB polling; can you help me? First I looked in Resource Monitor: all cores go up when moving the mouse. This was due to DWM, which apparently runs on all cores. So I tried in-game, where DWM is basically idle. I used MSI Afterburner to see the usage of all cores, and then I used Process Lasso to make every process use only 3 of my 4 cores. Then I moved my mouse in game and watched whether the non-busy core was doing anything. I did this for every core, but the result was almost the same each time (spiking up a few percent when I start moving the mouse, but nothing dramatic or lasting). 
Am I doing something wrong, or did I misunderstand your post? Running out of ideas to test this...


----------



## x7007

Anyone who does these tests, please write down your system configuration: any bcdedit tweaks, BIOS settings like Global C-State, and any MSI-enabled/disabled devices. We will never know what is going on unless people actively post their configurations and update them after any changes.


----------



## ucode

@klunka, it's not a given that only one core is used. You could try Windows Performance Monitor and select Interrupts/sec under Processor, or even LatencyMon, although when I tried it its results seemed lower than expected. You may even be able to read the interrupt affinity in the registry, or not.



x7007 said:


> Anyone who does these tests, please write down your system configuration: any bcdedit tweaks, BIOS settings like Global C-State, and any MSI-enabled/disabled devices. We will never know what is going on unless people actively post their configurations and update them after any changes.


I think you're asking a lot and would need to be more specific. Maybe if you created a form and a new thread with updates to the first post it might help, or not. Also, the results can be variable, so you might just see clipped results that look good and miss the bad bits. Well, I guess 'bad' needs to be defined too.

With W10-1903, CPU 14 core 3GHz max, XHCI MSI disabled, EHCI1&2 disabled, no idle states, mousehook and some other stuff...

Here's a 23 second run









And same run cropped to 7 seconds









See what I mean?


----------



## NDUS

klunka said:


> I tried to do this but couldn't identify the core for USB polling; can you help me? First I looked in Resource Monitor: all cores go up when moving the mouse. This was due to DWM, which apparently runs on all cores. So I tried in-game, where DWM is basically idle. I used MSI Afterburner to see the usage of all cores, and then I used Process Lasso to make every process use only 3 of my 4 cores. Then I moved my mouse in game and watched whether the non-busy core was doing anything. I did this for every core, but the result was almost the same each time (spiking up a few percent when I start moving the mouse, but nothing dramatic or lasting).
> Am I doing something wrong, or did I misunderstand your post? Running out of ideas to test this...


For me, moving my M1K when it's set to 8000 Hz sends core 1 to 25-30% above baseline: https://streamable.com/8dycrz
In Task Manager / Resource Monitor, Windows simply reports that the program I'm mousing within is consuming those CPU cycles. If I'm mousing on the Windows desktop, it says DWM is taking the core; if in a game, the game is taking the core.

If your mouse is running at 1000 Hz it's probably not as easy to tell using this method, but I'm sure it can be done. Maybe try safe mode to eliminate anything else eating CPU cycles.
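For anyone who wants to mechanize the "watch which core lights up" step, the comparison itself is trivial once you have two per-core interrupt-time samples (from LatencyMon, perfmon, or similar) taken at rest and while wiggling the mouse. A minimal Python sketch of just that comparison; the function name and the sample numbers are made up for illustration, not taken from any tool:

```python
def busiest_interrupt_core(before, after):
    """Index of the core whose interrupt time grew the most between two samples."""
    deltas = [b - a for a, b in zip(before, after)]
    return max(range(len(deltas)), key=deltas.__getitem__)

# Hypothetical per-core interrupt-time samples (seconds), the second one taken
# while moving the mouse; core index 1 shows the largest increase.
print(busiest_interrupt_core([0.10, 0.42, 0.11, 0.09],
                             [0.11, 0.88, 0.12, 0.10]))  # -> 1
```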


----------



## klunka

NDUS said:


> For me, moving my M1K when it's set to 8000 Hz sends core 1 to 25-30% above baseline: https://streamable.com/8dycrz
> In Task Manager / Resource Monitor, Windows simply reports that the program I'm mousing within is consuming those CPU cycles. If I'm mousing on the Windows desktop, it says DWM is taking the core; if in a game, the game is taking the core.
> 
> If your mouse is running at 1000 Hz it's probably not as easy to tell using this method, but I'm sure it can be done. Maybe try safe mode to eliminate anything else eating CPU cycles.


I think I have done it, but maybe you can confirm whether this is right.
I recorded with LatencyMon and then checked the Drivers and CPUs tabs for ISR counts. All ISR counts came from USBPORT and HDAudBus. Core 1 (or "0") did all the ISR counts, so core 1 is used for polling, yeah?
I did the same in-game; again USBPORT and HDAudBus were the only drivers with ISR counts. This time, however, the core with the most ISR counts was core 2 (or 1). Tested another game, same results.
I conclude that core 2 is handling polling in-game.

So now I'm trying to compare normal mode vs. setting the core affinity of all programs to cores 1, 3 and 4. If only I could figure out how to zoom the x axis of MouseTester plots without also zooming the y axis and see more than one poll at a time  why am I so stupid?


----------



## Jonagold

You realize that you are testing polling stability while the PC is idle; results should be quite different during gaming or similar CPU loads.


----------



## senileoldman

Can someone honestly tell me if they feel any difference going above 500 Hz? I've always used 500 Hz and can't tell any difference between it and 1000 Hz, other than higher CPU load, of course.


----------



## NDUS

klunka said:


> ...


I get the same results as you - polling is on thread #2 while a game is active.


----------



## klunka

Thanks @NDUS good to know!

I played around with this today
https://download.microsoft.com/down...ef1dae939e/interrupt_affinity_policy_tool.msi

You can set the core affinity for interrupts for some of the hardware. For me the second core handles USB polls, but also interrupts from the sound card. 
So first I tried setting affinity for the sound card, but it didn't work. What worked was changing affinity for the USB controller.
I set the USB controller my mouse is plugged into to core 3 (the fourth core) and my other USB controller, which has the keyboard, to core 2 (the third core).
Now all interrupts on core 3 are only from my mouse. No interference from the sound card or keyboard.

Then I used PowerShell in admin mode to set every process to use only the first 3 cores, like this:

$instances = Get-Process
foreach ($i in $instances) { $i.ProcessorAffinity=7 }

...to reduce DPC and general workload on core 3.
I haven't run MouseTester on it yet, but thought I'd post anyway so people can try it if they want, at their own risk of course.


----------



## NDUS

klunka said:


> Thanks @NDUS good to know!
> 
> I played around with this today
> https://download.microsoft.com/down...ef1dae939e/interrupt_affinity_policy_tool.msi


Appreciate the info about this tool. Had no idea such a thing exists


----------



## NDUS

I did some testing re: core affinity and USB polling.
Using the tool and PowerShell command klunka posted above, I confined USB polling to core 7 and all other programs to cores 0-6.

Here is polling in MouseTester while doing slow circles on the pad using my M1K. (The frequent up/down clocking is normal at this speed, per the firmware's author qsxcv: the PMW3360 doesn't produce enough frames for 8 kHz polling at low speeds.)

Graph with core affinity set favorably (USB polling on core 7, other programs on cores 0-6 - or, affinity 127.)









Graph with core affinity set unfavorably (USB polling on core 7, other programs on cores 6 & 7 - or, affinity 192.) Both cores were under about 70% load:









Same graph as above, but with the anomalous spike cropped out:









As you can see, de-crowding the core responsible for USB polling does indeed make a difference for polling stability.

To do this yourself, and semi-automate it:

1) Download & install https://download.microsoft.com/down...ef1dae939e/interrupt_affinity_policy_tool.msi
2) After running it as administrator, select the relevant USB entry for your mouse's USB port (for me it's USB xHCI Compliant Host Controller - could be different for you) and "Set Mask", giving it access to only the last core on your CPU (core 7 on my CPU)
3) Create a file named "affinity.ps1" and paste the following:


> $instances = Get-Process
> foreach ($i in $instances) { $i.ProcessorAffinity=127 }


4) Using https://www.mathsisfun.com/binary-decimal-hexadecimal-converter.html, convert the correct binary mask for your processor into decimal. For my 8-core 9700K, "01111111" (decimal: 127) is the correct binary to forbid programs from running on the last core (core #8). If I wanted to forbid core #2 instead, I would use "11111101", for example. Yes, it's backwards: the rightmost (least significant) bit is core #1.
Paste the correct decimal number over 127 in the affinity.ps1 file.
5) Create a file called affinity.bat and paste the following:


> PowerShell -NoProfile -ExecutionPolicy Unrestricted -Command "& {Start-Process PowerShell -ArgumentList '-NoProfile -ExecutionPolicy Unrestricted -File ""D:\Affinity Script\affinity.ps1""' -Verb RunAs}";


6) In the affinity.bat file, replace D:\Affinity Script\affinity.ps1 with the correct path to the affinity.ps1 file you created in step #3.

From here you can set up a Task Scheduler task that runs affinity.bat on startup, but this won't apply it to games (or anything else you run afterwards). Unfortunately I can't find any way to automatically run affinity.bat when game processes are started.

Thus you have to either manually run affinity.bat after you launch a game (not that bad if you make a shortcut on your hotbar) or use Process Lasso (which consumes ~2% of my 9700K).

You can skip steps 1 & 2, but you will then have to figure out which core your system uses for polling in games by default (in my case it's core #2) and adjust the affinity accordingly
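The mask arithmetic in step 4 can also be done in a few lines of code instead of the online converter. A small Python sketch (my own illustration, not part of the tooling above) that builds the decimal affinity value from the cores you want to keep programs off of, using the same bit-per-core convention as step 4:

```python
def affinity_mask(n_cores, forbidden_cores):
    """Affinity bitmask: one bit per core, bit 0 = core #1; forbidden bits cleared."""
    mask = 0
    for core in range(1, n_cores + 1):
        if core not in forbidden_cores:
            mask |= 1 << (core - 1)
    return mask

print(affinity_mask(8, {8}))  # -> 127, i.e. 01111111: everything but the last core
print(affinity_mask(8, {2}))  # -> 253, i.e. 11111101: everything but core #2
```

The same value drops straight into the affinity.ps1 script; klunka's earlier 4-core value of 7 is `affinity_mask(4, {4})`.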


----------



## klunka

Good stuff!


NDUS said:


> Unfortunately I can't identify any way to automatically run affinity.bat when game processes are started.


When I enter the command manually in PowerShell, every program I start afterwards inherits the affinity automatically. For my weak-ass 4-core that is actually annoying, because I then need to set the game back to using all cores, like this:
(Get-Process -name r5apex).ProcessorAffinity = 15

You need to make sure the PowerShell command is run in admin mode; if you run it normally, the system processes will not be touched.


----------



## J Doe

Your script is forcing all processes to run on a single core. What is the point?


----------



## NDUS

J Doe said:


> Your script is forcing all processes to run on a single core. What is the point?


No, it's forcing all processes to *avoid* a particular core, and making that particular core responsible for USB polling.


----------



## ucode

senileoldman said:


> Can someone honestly tell me if they feel any difference going above 500hz? I've always used 500hz and can't tell any difference between it and 1000hz, of course, other than higher cpu load.


Do you have any proof you can't tell the difference? Otherwise it might be construed as placebo! Only kidding  Maybe it depends on personal perception and the hardware used. FWIW my perception these days is not so good, nor is my HW the greatest. I can feel a difference if polling at 10 Hz or lower, otherwise not so much.









What do you see right away: the young woman, the old woman, or both?




NDUS said:


> As you can see, de-crowding the core responsible for USB polling does indeed make a difference for polling stability.


Not so much. Likely the USB polling itself is very good; it's what happens after the polling stage, once the interrupt is generated, that matters. At that point it becomes a question of system response, or the lack of it, and not USB polling precision. 

For example, the results I posted earlier were with idle states disabled, which means all 14 cores running at 100% for 100% of the time. Not good for power efficiency, but it does remove core C-state exit latency, which can be tens of microseconds and would otherwise make the results much noisier. Even MouseTester itself can add to the noise of the results; such can be the price of using the HW to monitor itself. 

Does your mouse lose any data when running at 8000 Hz polling? I ask because I tested with an emulated serial mouse on W10 and it lost a lot with idle states enabled, though that was at ~18000 Hz and generated 4 IRQs per packet. On top of that, the large number of interrupts appears to trigger an internal throttling mechanism, forcing the ratio down to the HFM (2 GHz in this case). Not something I've seen mentioned by Intel.

I tried the affinity program you linked but could not read USB XHCI; it reported an unknown error or something. The registry method works fine though, as I mentioned earlier when wanting to know the interrupt affinity.









An example of my laptop's XHCI using a dual core with HTT on W8.1. The policy is at default, and Windows in this case has chosen affinity on all threads, as shown by the temporal target of 0xF. Policies can be added as described here:
https://docs.microsoft.com/en-us/windows-hardware/drivers/kernel/interrupt-affinity-and-priority

If you want launched programs to all have affinity set to the same threads, change explorer.exe to that setting. Programs launched from it will then inherit that affinity, unless they set their own.


----------



## ucode

Did some testing with XHCI; on my system W10 loses a lot of mouse data when idle states are enabled. Not so much with W7, but the interrupt moderation needed changing, as the default value was way too high for 8k and effectively produced wonky 2k results.

W7 with default 1ms interrupt moderation









W7 with interrupt moderation disabled









W10 with core C-state 3 and package c-state 2 enabled








There should be 100,000 results and zero X/Y sums if nothing is missed.

The interrupt moderation affects the actual polling; the other results are more to do with in-system response after the polling itself.

Some interesting results with EHCI as well as other effects, but maybe I've cluttered the thread too much already, so I will not pollute it further. Once basic USB polling is out of the way I hope to do some loaded tests, perhaps starting with a benchmark such as Heaven while testing mouse response at high speed. However, since that will likely not be related to USB polling precision but to mouse post-processing, I don't think it should be posted in this particular thread.


----------



## klunka

@ucode
How did you disable interrupt moderation?



ucode said:


> Tried the affinity program you linked but could not read USB XHCI, reported an error of unknown or something.


The "unexpected type" error is normal, it still works.


----------



## MaxTendency

senileoldman said:


> Can someone honestly tell me if they feel any difference going above 500hz? I've always used 500hz and can't tell any difference between it and 1000hz, of course, other than higher cpu load.


Depends. Back when I was on a bloated, un-optimized W10 I honestly couldn't tell the difference. But now that I have my BIOS cleaned up and a W7 with proper optimizations applied, the difference between 500 Hz and 1000 Hz is night and day; it's very easily perceivable. If you cannot notice the difference, you most likely have a latency bottleneck somewhere else in your PC.


----------



## Timecard

ucode said:


> Did some testing with XHCI; on my system W10 loses a lot of mouse data when idle states are enabled. Not so much with W7, but the interrupt moderation needed changing, as the default value was way too high for 8k and effectively produced wonky 2k results.


Nice observations. Have you seen lost data on Win 10 or Win 7 with C-states off, i.e. is there any likelihood of it happening even under an ideal configuration? 

Could you clarify what you mean by changing XHCI interrupt moderation? I'm not aware of any similar settings, at least on the OS side, that affect the controller in that way, other than binding affinity like the previous poster. Thanks



ucode said:


> W10 with core C-state 3 and package c-state 2 enabled
> Should be 100000 results and zero X Y sums if nothing missed


When I first saw this I was a bit confused: is the reason you didn't show 100,000 data points for the Win 10 example that they weren't there? I assume this is implied, but just confirming.


----------



## ucode

@klunka Thanks for the info regarding the affinity SW. IMOD is set by HW and W7 to 4000 (0xFA0), which is 1 ms (250 ns * 4000); for W8.1 and W10 it is set to 200, which is 50 us, at least on my HW. To disable it, set it to 0. To locate the register: use XHCI PCI offset 0x10 to find the memory base address, then take the value at memory base address offset 0x18 and add it to the memory base address to get the runtime registers, then add offset 0x24 to that for the IMOD register. The lower 16 bits are the IMOD interval. My systems only use the first interrupter; there is more than one.

TL;DR: for the details see 
https://www.intel.com/content/dam/w...ensible-host-controler-interface-usb-xhci.pdf

FWIW Intel EHCI has a different mechanism (Interrupt Threshold Control) which also appears to allow disabling interrupt throttling, again by setting it to zero, even though that value is documented as "Reserved"; it seems to work though. Although the HW default is 1 ms, W7, W8.1 and W10 all use 125 us on my HW, so it's probably not a problem unless you're using a high polling rate such as 8000 Hz. There are two EHCI controllers: on one piece of HW I have, one EHCI controller runs at 8k while the other is throttled to 5333 Hz (2/3), and my laptop sees both throttled to 5333 Hz unless the threshold is disabled.

Results may differ on other HW; I can only report with what I have, which is quite old now (Haswell).
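The register walk described for XHCI is just base-address arithmetic, so here it is spelled out as a Python sketch. This is my own illustration of the description above, with made-up example values: `bar0` stands for the value read at PCI offset 0x10 and `rtsoff` for the value read at capability offset 0x18. Per the xHCI spec, the low BAR bits and the low 5 bits of RTSOFF are flag/reserved bits, and the IMOD interval is counted in 250 ns units:

```python
IMOD_TICK_NS = 250  # xHCI IMOD interval field counts in 250 ns units

def imod_register_address(bar0, rtsoff):
    """Walk from the PCI BAR to interrupter 0's IMOD register."""
    mmio_base = bar0 & ~0xF                      # low BAR bits are flags, not address
    runtime_base = mmio_base + (rtsoff & ~0x1F)  # low 5 bits of RTSOFF are reserved
    return runtime_base + 0x24                   # interrupter 0: IMAN +0x20, IMOD +0x24

def imod_interval_us(imod_value):
    """Moderation interval in microseconds from the IMOD register's low 16 bits."""
    return (imod_value & 0xFFFF) * IMOD_TICK_NS / 1000

print(hex(imod_register_address(0xF7A00004, 0x8000)))  # -> 0xf7a08024
print(imod_interval_us(4000))  # -> 1000.0 us, the 1 ms W7 default from the post
print(imod_interval_us(200))   # -> 50.0 us, the W8.1/W10 default from the post
```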

@Timecard Still need to do more testing, but I think data can be lost at any time depending on the load. With idle states enabled, a busy system will reduce that idleness and the core exit latency, so it may actually be better in some cases. When I say lost data I don't mean lost polls, but data lost in the processing.

For interrupt moderation see above.

I have a Cypress FX2LP mini board for testing. Someone already wrote some mouse code for it, so I modified it a little to send 250 single mouse movements to the right, then 250 up, 250 left and 250 down, ending up at the same place. This is repeated 100 times to give 100,000 mouse movements / poll responses. The board runs at USB2 high speed, so 125 us polling. Cost less than $5 including shipping  A few strange things to see, but probably not of interest to most given the high-speed operation. So to answer your question: 100,000 responses were sent, so 100,000 should have been received, resulting in 100,000 events, 100,000 path counts and X/Y sums of 0, unless data is corrupted or lost.
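That test pattern is easy to reproduce on the host side to check a received log against. A short Python sketch (mine, not the FX2LP firmware code) generating the same 100,000-report sequence and the two invariants it should satisfy:

```python
def movement_pattern(step_count=250, repeats=100):
    """250 counts right, then up, left, down, repeated; the path nets to zero."""
    one_lap = ([(1, 0)] * step_count + [(0, 1)] * step_count +
               [(-1, 0)] * step_count + [(0, -1)] * step_count)
    return one_lap * repeats

moves = movement_pattern()
print(len(moves))  # -> 100000 reports expected
print(sum(x for x, _ in moves), sum(y for _, y in moves))  # -> 0 0 if none lost
```

Any shortfall in the count, or a nonzero X/Y sum, means reports were dropped or corrupted somewhere between the endpoint and the application.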


----------



## klunka

ucode said:


> @klunka To disable it set it to 0. To locate the register use XHCI PCI offset 0x10 to locate memory base address then take memory base address offset at 0x18 and add it to the memory base address to get the runtime registers then add offset 0x24 to that for IMOD register. Lower 16bits are IMOD interval. My systems only use the first interrupter, there are more than one.


Thanks, but I don't understand any of what you just said there. Could you give me a dumbed-down step-by-step, something a grandma could follow? I want to disable interrupt moderation for EHCI on Win 10.


----------



## ucode

I'll try with an example, but as you've realized I'm terrible at communicating, so apologies for being confusing. 

First check pertinent datasheet for southbridge, I will use my Intel X99 PCH in this example. 
https://www.intel.com/content/www/us/en/products/docs/chipsets/x99-chipset-pch-datasheet.html

Look up the EHCI register for 'Interrupt Threshold Control' in the datasheet.









Example of modifying using RWEverything. Note that it is easy to BSOD with this tool, be careful. Software can be downloaded from
http://rweverything.com/download/









Select the 'PCI Devices' button, then one of the EHCI controllers.









Double-click the memory address space used by the controller at offset 0x10; if it's above 4 GB you'll also need the upper 32 bits. In this example the upper 32 bits (offset 0x14) are zero, so no need.









The datasheet shows the memory base offset at 0x20, so double-click there.









As per the datasheet, bits 23:16 select the 'Interrupt Threshold Control' value, which I set to all zeros to disable EHCI interrupt moderation, despite that value being marked 'Reserved'. IOW, not endorsed for use by Intel.

Do the same for the second EHCI controller.

RWEverything also has a command-line option which might be useful for automation, but I haven't used it myself beyond a quick test.


----------



## klunka

@ucode
Thank you so much, excellent instructions!









I think I did it... changed the number under 16 from 1 to 0.



> if it's above 4GB you'll need to also use the upper 32-bits. In this example the upper 32-bits (offset 0x14) are zero so no need.


That's the only part I didn't understand, but I guess I didn't need it either.

Will try MouseTester and some games without the interrupt moderation and see how it goes. Is there any potential downside to this, anything I should watch out for?
Thanks again


----------



## klunka

Doesn't seem to make much of a difference at 1000 Hz in W10.

Left is default, right is without interrupt moderation.


----------



## Timecard

Posting an Intel XHCI example with ucode's guidance; my Windows 10 also has C8 (200) set.


----------



## Timecard

ucode said:


> RWEverything also has a commandline option which might be useful for automation but I haven't used it myself other than a quick test.


Thanks for sharing all this with the community; it means there's more to explore, not just the IMOD register on the USB controllers but potentially other devices as well. I haven't seen any similar posts at this level about this type of change. I also tried changing Win 10 to use Win 7's value, and it does make a difference in 'feel'; based on your note, Win 10 is far more aggressive (50 us) whereas Win 7 uses 1 ms.

What does the EHCI controller use as its interrupt moderation value compared to XHCI, on both Win 7 and Win 10? Lots of people say the two feel very different, so maybe there's a simple explanation for it other than driver overhead.


----------



## Timecard

I'm noticing that if you use 500 Hz polling with 1 ms interrupt moderation (the Win 7 value) on Windows 10, you get notable gaps/skips on the desktop and in games; I've seen similar on video capture as well (240 fps). The mouse cursor trail updates less often on screen, so it's more x -> x (Win 7 value) vs x, x, x (Win 10 value). Hoping someone else can confirm with a better PC, refresh rate, or camera.


----------



## Timecard

One more thing to note: I set my mouse to 1000Hz with 1ms interrupt moderation and Win10 1909 loses its ****. I'm thinking it's a combination of the 10MHz timer and dynamic tick. You see polling reaching 20-25k and many ZEROs, which suggests it's accumulating the results over those brief periods.


----------



## ucode

klunka said:


> Doesn't seem to make much of a difference at 1000hz in w10


Thanks for the pics. Probably more important for 8K polling, where the 125us setting seems right on the point of activating. Example of 8K being reported as 5333Hz but effectively being closer to 2.5K due to the spacing, or lack of it.
Two points are very close to each other








That interrupt label should really be 'event'.
As for the downsides, I'm afraid that's a question for Intel since they marked the function as reserved. It might cause problems or might just be for energy efficiency, idk.
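A toy model of that coalescing, assuming packets arrive at a fixed rate and the controller only raises an interrupt once the moderation timer has expired (the `poll_deltas` helper is hypothetical, just to illustrate the timestamp pattern):

```python
# Toy model: packets arrive every `packet_us`, but an interrupt fires at
# most once per `imod_us`; packets arriving while the moderation timer
# is still running get delivered in a batch on the next interrupt.
# Returns the deltas between host-side packet timestamps.

def poll_deltas(packet_us, imod_us, n_packets):
    seen, pending = [], []
    last_irq = None
    for i in range(n_packets):
        t = i * packet_us
        pending.append(t)
        if last_irq is None or t - last_irq >= imod_us:
            seen += [t] * len(pending)  # whole batch seen at once
            pending.clear()
            last_irq = t
    return [b - a for a, b in zip(seen, seen[1:])]

# 8 kHz packets with a moderation timer slightly longer than the packet
# interval: half the deltas collapse to zero, the rest double up.
print(poll_deltas(125, 200, 6))  # [250, 0, 250, 0]

# Moderation timer no longer than the packet interval: no coalescing.
print(poll_deltas(125, 125, 4))  # [125, 125, 125]
```

This is why an average-based rate readout can look fine while pairs of points sit almost on top of each other, with the effective update rate roughly halved.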




Timecard said:


> What does the EHCI controller use as an interrupt moderation value compared to XHCI, on both Win7 and Win10? Lots of people say these two feel very different, so maybe there's a simple explanation for it other than driver overhead.


125us on both here.


----------



## NDUS

ucode said:


> ...


I couldn't find anything about "Interrupt Threshold Control" in the datasheets for the Z3xx PCH. Ctrl-F for "micro-frame" didn't turn up any similar option either.

https://www.intel.com/content/www/u...ipset Platform Controller Hub (PCH) Datasheet
https://www.intel.com/content/www/u...ipset Platform Controller Hub (PCH) Datasheet


----------



## ucode

NDUS said:


> I couldn't find anything about "Interrupt Threshold Control" in the datasheets for the Z3xx PCH. Ctrl-F for "micro-frame" didn't turn up any similar option either.


It's an EHCI feature. I could be wrong but IIRC Intel stopped using EHCI since Skylake.


----------



## empl

numberfive said:


> - they are not suffering from high dpc latency, because the only people who care about high dpc latency are those who play 3dmark 2040 and MAYBE sound engineers in some exotic cases in 2006.
> - quality of drivers has nothing to do with the cost of the hardware (which for some reason was your point two comments above)
> - "Someone claims, even..." someone is even roach. ignore and forget, or prove and shut me up.
> - "motherboard - makes all difference in the world" no? there is no right one, there are very few that are garbage and there are the rest, that will suit you perfectly for gaming.
> - "Again you don't even read and just berate and flame..." for some reason you are spreading this bs and giving a link to a tweaking resource that is referring back to roach's thread on OCN. like for real? how can you not be a clown here?
> - "If you don't care about it, why you care some people can tell difference ?" don't make me laugh. you are continuously clowning here as an assistance manager to roach.
> 
> I actually decided to not read your post further, because you repeat your bs claims which have zero justification, zero practical impact on (competitive) gaming and zero tests that would show even a slightest difference.
> it is not a matter of what kind of difference I can tell or not, it is not about me at all.
> Y O U
> A R E
> S P R E A D I N G
> B S
> 
> on a side note, you are simply bad in games (according to you), you can improve with more practice and less tin foil tuning. stock windows 10 setup is not providing some disadvantageous experience to professional players, if it does to you, well, it is a you problem.


So you're saying OP spreads bs too. He said you can notice even 500us of DPC latency in polling precision. Some $500 Asus motherboards literally have 1ms of DPC latency. You are spreading lies. It is not my fault that you can't tell the difference. Some people say you can't tell the difference between a 60Hz and a 144Hz monitor... You are one of them... I would like to know what you would tell ucode



ucode said:


> Tutorial for RW Everything


Could you please tell me how to proceed on an ASUS ROG STRIX Z390-I Gaming to disable interrupt moderation? PLEASE, if you are not 100% sure, don't tell me!!! I would rather not change it than break my PC. If on Win10 EHCI/XHCI creates interrupts every 50us, then it probably wouldn't help much to disable it. I already have around 40-100us DPC latency for XHCI.

The strange thing is I have 2 XHCI controllers, one Intel and one Nvidia. I have USB 3.1 Type-C on my motherboard, so maybe that's the Nvidia controller. I don't have USB on my GPU. I don't know what else it could be...

From RW Everything:
The first picture is a list of my PCI devices.
The second picture is what I get when I double click the first value at address 10.
I also noticed EHCI has lower input lag than XHCI and an overall better feeling! My motherboard runs only XHCI drivers the whole time, even if I disable all USB 3 ports individually in the BIOS and use only USB 2 ports. In that case I have only "USB Root Hub" in Device Manager under USB Root Hubs. But as soon as I enable even one USB 3 port in the BIOS, I get "USB Root Hub (USB 3.0)" in Device Manager instead, and there is no plain "USB Root Hub" anymore. And I notice my mouse lags and feels terrible! But I need USB 3 for VR. Is there a way to have the normal USB Root Hub while not using the USB 3 ports? I have nothing connected to USB 3.0, only a mouse and keyboard on USB 2.0, yet I have only USB 3.0 Root Hubs...

Btw, I don't know if you are a USB engineer or what, but good stuff!!! More devs should be like you. But I am sure some big corporations don't care about their customers... I hate the fact that you can't have normal EHCI drivers and are forced to use XHCI, which feels much worse! Btw, in that Intel datasheet it says the USB interrupt rate should not exceed 8000, but isn't that 125us? Someone was saying Win10 is 50us, so maybe Win10 goes over that value...

...Also, *DOES ANYONE KNOW?* Can I uninstall my Nvidia USB controller? If you put a hardware ID somewhere in gpedit, under the item that goes something like "prevent driver installation by matching device hardware ID", it will prevent driver installation for that device. I noticed some drivers cause input lag even when they are not used. But I have no idea what this driver is... And I can't disable it in Device Manager. I am like 99% sure this is just another USB controller and disabling it wouldn't break anything, as it is not the Intel USB controller and has a different hardware ID than my GPU or its controller. But I am careful with these things. *Someone was also talking about changing CPU affinity here, for some device. If you do this for your disks, for example, you can break your PC; not sure if it's the hardware or only Windows!*


----------



## ucode

@empl probably okay as you are. The biggest difference would be for W7 users, where this value affects even 1ms polling, let alone higher rates. IINM XHCI came after W7, so it seems MS set it to 1ms and only updated it to 50us from W8 or W8.1. IIRC Fedora 32 uses 64us.

I'm not a USB eng' or developer, just learning a little about USB and sharing some info.


Some info for EHCI










https://www.intel.com/content/dam/s...-removal-6th-gen-core-pch-technical-paper.pdf


----------



## Melan

I wonder how different polling is on Ryzen, since the USB controller is on the CPU IO die instead of on the chipset as with Intel. I doubt there's a meaningful difference, but nonetheless.


----------



## SweetLow

ucode said:


> Double click the memory address space used by the controller at offset 0x10, if it's above 4GB you'll need to also use the upper 32-bits. In this example the upper 32-bits (offset 0x14) are zero so no need.


And you can get this memory address from Device Manager too (in Properties -> "Resources" tab).
P.S. JFYI, I have a value of 1 for the EHCI controllers on my Intel B75 chipset in Windows 7 on the regular Microsoft drivers.



ucode said:


> As per datasheet bits 23:16 select 'Interrupt Threshold Control' value which I set to all zero to disable EHCI interrupt moderation despite that value being marked 'Reserved'.


It is marked as reserved in the pure EHCI specification too. Why do you think that zero disables interrupt moderation? There is no "as fast as possible" mode of interrupt generation in the description of host controller functioning.


----------



## Timecard

SweetLow said:


> Why do you think that zero disables interrupt moderation? There is no "as fast as possible" mode of interrupt generation in the description of host controller functioning.


It's mentioned at least in the XHCI documentation that i've seen.


----------



## SweetLow

Timecard said:


> It's mentioned at least in the XHCI documentation that i've seen.


Maybe, but I was specifically answering about EHCI.


----------



## kromtomas

When I start RW Everything I get a message that says "Driver cannot be loaded, re-install the program may fix the issue", and reinstalling doesn't work...


----------



## Timecard

Did you run as Admin?


----------



## kromtomas

Timecard said:


> Did you run as Admin?


Yeah I did, but it's the same.

Maybe it's because of Windows 10 v2004?


----------



## Timecard

Did you extract all the files too? Portable or installed version?


----------



## SmashTV

Currently running through the Zowie mouse fitting kit, and decided to have a go at measuring polling with the GitHub release of MouseTester Reloaded. I seem to get good results if I click and hold from an empty bit of the taskbar when logging the inputs, but bad results if it's in the MouseTester window itself or anywhere else.

So my polling is either really good when clicked and held from the taskbar, or more variable from anywhere else, it seems. Doesn't necessarily bother me, but maybe someone will find the information useful.


----------



## kromtomas

Timecard said:


> Did you extract all the files too? Portable or installed version?


I tried both the installed and portable versions... no luck.
Also tried an older version, without luck.


----------



## ucode

@SmashTV, using the keyboard F2 key works best for me. Holding down the mouse button can cause problems with the mouse cursor interacting with the desktop. 
@kromtomas I've heard some people have had issues with Memory Integrity enabled under the Core Isolation settings. My system doesn't seem to support it so I cannot check. If that's the problem, you'll need to decide whether you wish to disable it or not. If it's not that, then check whether the driver is already loaded: admin cmd -> sc query rwdrv.sys


----------



## SmashTV

Hmm F2 gave me about the same as using a blank spot in the window. 

Wonder if there's anything to it. The only thing I can think of is that in the case of F2 and using a blank portion of the tester window, it keeps the program in focus.


----------



## kromtomas

ucode said:


> @SmashTV, using the keyboard F2 key works best for me. Holding down the mouse button can cause problems with the mouse cursor interacting with the desktop.
> 
> @kromtomas I've heard some people have had issues with memory integrity checked under core isolation settings. My system doesn't seem to support it so cannot check. If it's the problem you'll need to decide whether you wish to disable it or not. If it's not that then check if the driver is already loaded, admin cmd -> sc query rwdrv.sys


Hi µcode, no, there is no service running with rwdrv. I also couldn't find where I could get it.


----------



## HappyAlive

Melan said:


> I wonder how different polling is on ryzen since USB controller is on CPU IO die instead of a chipset with intel. I doubt there's a meaningful difference but nonetheless.


Ryzen has dogshit polling compared to intel.


----------



## SmashTV

More oddities from me: I had some strange input yesterday (clearly obvious, to where a non-gamer like the GF notices, not the meme feelz type). Decided to check some plots with the mice I have, wired & wireless. On both, the first half or so of some movements would look like a jittery plateau when swiping around at medium-ish speed.

Turns out it was Defender. Not sure what or why it was messing with things, but once I canned it, everything went smooth again. Probably MSD being overly aggressive, but it exhibited no signs of abnormal CPU usage. Three other AVs I tested were more or less fine, with only slightly more interference from Kaspersky compared to the other two (Avira, ESET) and compared to having no AV at all.

Never had this issue before, so maybe an update on Defender's end. A heads up for anyone using Defender to check it out on your end, in case you feel something odd.


----------



## HappyAlive

SmashTV said:


> More oddities from me, I had some strange input yesterday (like clearly obvious to where a non gamer in the GF notices, not the meme feelz type). Decided to check some plots with mice I have wired & wireless. On both, the first half or so of some movement would look like a jittery plateau swiping around in a medium-ish speed.
> 
> Turns out, it was Defender. Not sure what or why it was messing it, but once I canned it, it went smooth again. Probably MSD being overly aggressive, but it exhibited no signs of abnormal CPU usage. Three other AVs I tested were more or less fine, with only slightly more interference with Kaspersky compared to the other two (Avira, ESET) and against having no AV at all.
> 
> Never had this issue before, so maybe an update on Defender's end. A heads up for anyone using Defender, to check it out on your end just in case you feel something odd.





> using an antivirus


----------



## fenriquez

I'm trying to disable the IMOD value, but when I go to the 0x10 offset there is no data. Any tips? I'm on Z490; how else could I find the IMOD values?


----------



## fenriquez

This is what I see in the Z490 datasheet. Any tips or a guide to lowering the interrupt rate would be appreciated, as I'm a noob at this. Thanks


----------



## Timecard

Looks like you're already on Win10, so the interrupt rate is actually already really low compared to Win7. As I understand it, the main benefit would be if you have a mouse polling above 1000Hz; then turning moderation off has an impact.


----------



## NotThat

Has anyone tried using a logic analyzer to measure a mouse's polling precision? Seems to me that if the difference between mice comes from the actual mice themselves, then a logic analyzer would be a superior tool to a PC running an OS. They can be found fairly cheap too, compared to other external HW benchmark solutions.

A random one from Aliexpress for $7








10.51US $ | Debug Microcontroller Mini FPGA 24M 8CH Professional Portable Measuring Data Upload USB Powered Tool Logic Analyzer | www.aliexpress.com










Edit: Buy a USB extension cable as well. Cut the extension cable open and hook it up to the logic analyzer. Now testing different mice should be as easy as plug and play. Unless the mice have to be connected to an OS, in which case you can still branch out from the extension cable and record the signals being sent back and forth between the mice and PC.


----------



## fenriquez

Timecard said:


> Looks like you're already on win10 so the interrupt rate is actually already really low compared to win7, as I understand it the main benefit would be if you have a mouse with above 1000hz polling rate then moderation off has impact.


@Timecard I currently have a mouse capable of 8000Hz polling, but using SweetLow's custom drivers it's throttling to 5333Hz, so I suspect the interrupt rate is affecting my polling, which is why I'm trying to figure this out and see if I can stabilize 8000Hz.


----------



## Timecard

I can probably help you test it out if the previous instructions and examples (screenshots etc.) we shared weren't clear, or if they need more detail.


----------



## fenriquez

Yeah, that would be great. My issue is finding the offsets for my controller. According to the Z490 datasheet in the screenshot I posted, the offsets are different from your earlier screenshots, it being a different chipset on Win10. Just wondering how I would go about finding the IMOD memory address, given the 2024h, 2044h, 2064h offsets, etc.


----------



## Timecard

Just msg me and link to the official intel doc you sent a screenshot of.


----------



## fenriquez

ucode said:


> Double click the memory address space used by the controller at offset 0x10, if it's above 4GB you'll need to also use the upper 32-bits. In this example the upper 32-bits (offset 0x14) are zero so no need.


@ucode How would I go about accessing the memory-mapped registers of the xHCI controller when given two BAR addresses?
BAR1 0x12100004
BAR2 0x00000040

Confused about how to access the memory-mapped registers which hold the IMOD register in RWEverything.exe

Thanks any help would be appreciated

EDIT

I was able to figure it out, thanks anyways for showing the possibility of changing the interrupt behaviour in the usb driver


----------



## PastaFinder

Have you guys ever experienced this kind of graphs ? Each peak is compensated by an equally opposite peak


----------



## Mach1ne1111

fenriquez said:


> @ucode How would I go about accessing the memory mapped register of the xHCI driver when given two BAR addresses
> BAR1 0x12100004
> BAR2 0x00000040
> 
> Confused on how to access the memory mapped registers which hold the IMOD register in RWEverything.exe
> 
> Thanks any help would be appreciated
> 
> EDIT
> 
> I was able to figure it out, thanks anyways for showing the possibility of changing the interrupt behaviour in the usb driver


Could you please explain how you figured it out. I've been messing with this for a couple days and just don't get it. I'm also on z490


----------



## fenriquez

Mach1ne1111 said:


> Could you please explain how you figured it out. I've been messing with this for a couple days and just don't get it. I'm also on z490


Yeah, I was meaning to post the solution, sorry about the delay. To access the memory-mapped address you must combine the BAR0 and BAR1 values; in this case that gives 4012100000 when using 0x12100004 and 0x00000040, since with an address over 4GB you must combine both 32-bit halves to get a 64-bit address. Then take whichever register you're looking for, in this case the IMOD register, which per the datasheet is at offset 2024h, and add 2024 to 4012100000, which gives you 4012102024 as the IMOD memory-mapped address. From there you can see the hex value 00C8, which is 200.
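The arithmetic can be written out explicitly, using the BAR values from this example. Per the PCI spec the low 4 bits of a memory BAR are type/flag bits (the trailing 0x4 here marks it as a 64-bit BAR), so they get masked off before combining (quick sketch):

```python
# Combine a 64-bit PCI memory BAR pair and add the register offset.
# The low 4 bits of the lower BAR are flag bits (bits 2:1 = 10b marks a
# 64-bit BAR), so they are masked off before combining.

bar_lo = 0x12100004   # lower 32 bits (the 0x4 flags a 64-bit BAR)
bar_hi = 0x00000040   # upper 32 bits
IMOD_OFFSET = 0x2024  # IMOD register offset from this chipset's datasheet

base = (bar_hi << 32) | (bar_lo & ~0xF)
imod_addr = base + IMOD_OFFSET

print(hex(base))       # 0x4012100000
print(hex(imod_addr))  # 0x4012102024
```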


----------



## Mach1ne1111

fenriquez said:


> Yeah, I was meaning to post the solution, sorry about the delay. To access the memory-mapped address you must combine the BAR0 and BAR1 values; in this case that gives 4012100000 when using 0x12100004 and 0x00000040, since with an address over 4GB you must combine both 32-bit halves to get a 64-bit address. Then take whichever register you're looking for, in this case the IMOD register, which per the datasheet is at offset 2024h, and add 2024 to 4012100000, which gives you 4012102024 as the IMOD memory-mapped address. From there you can see the hex value 00C8, which is 200.


I don't have a bar0 ? It's only bar1-6


----------



## Mach1ne1111

OK, I found the C8 by manually typing the address in, but what do I do with it once I'm there? Set it to zero?


----------



## Timecard

Yep, 0 would disable


----------



## r0ach

empl said:


> I also noticed: EHCI has lower input lag than XHCI


It's difficult to describe the difference. To me, using a mouse on XHCI feels like there's more of a dead zone before it registers a response, but once the cursor actually does get moving, ironically each pixel feels like it has less friction to traverse, like how 500Hz feels like it skips over pixels faster than 1000Hz. A lot of this phenomenon can be explained entirely by EHCI using line-based interrupt mode and XHCI using MSI mode, IMO.

Using EHCI in line-based mode pulls more overhead from the system than XHCI, but that overhead is not going to waste. I would say you're essentially being throttled using XHCI in MSI mode. This is why it feels more effortless, with less friction, to sling the cursor across the screen (it's similar in practice to being throttled down from 1000Hz to 500Hz) and also why it feels less responsive getting moving from a standstill.

The specific code monkey/engineer who was responsible for pushing this change to USB 3.0 (getting rid of line-based interrupts) thought he was making an optimization, when in reality the overhead was not actually being wasted, and he instead gave you a downgrade in performance. I haven't used a mouse over 1000Hz yet, but I guess it's possible that running a mouse at 2000Hz on XHCI might reclaim some performance by pegging the system harder, and feel more similar to running a 1000Hz mouse on EHCI (or maybe it will feel like crap with MSI mode still throttling you, and EHCI with line-based interrupts will just always be superior, hell if I know).


----------



## Timecard

As r0ach said, XHCI can feel worse with MSI enabled; try disabling it.


----------



## r0ach

Noooooo. When I first talked about it, I said that disabling MSI mode for XHCI removes the higher dead zone, but makes the cursor more wild and uncontrollable. It doesn't fix XHCI and make it more like EHCI. The default Microsoft Windows XHCI driver doesn't seem to work well with MSI disabled, for some reason. It's a lose/lose situation with it both on and off. I've never found any combination of hardware or settings where I liked an XHCI-only box more than an older motherboard with XHCI disabled and only EHCI running.

It seems like the only path forward is mice getting their own interface, like keyboards had with PS/2. I'm sure USB 4.0 will probably be equally as bad, if not worse. And USB 4.0 requires a USB-C connector, making every mouse ever made so far useless and obsolete? Might as well give mice their own proprietary interface if they're forcing a connector change anyway.


----------



## empl

r0ach said:


> It's difficult to describe the difference. To me, using a mouse on XHCI feels like there's more of a dead zone for registering response...


There is one more thing which makes your mouse input suck on XHCI. Normally, if you disable every USB 3 port in the BIOS, there is a "USB Root Hub" driver in Device Manager. But if you enable even one, your mouse movement turns to ****! And in Device Manager it shows "USB Root Hub (USB 3.0)"!!! I thought I would force the USB 2 driver, so I tried clicking update driver -> let me pick from a list of available drivers on my computer, and it shows only the USB Root Hub (USB 3.0) driver! Using "search for drivers in this location" and pointing it to the system32/drivers folder doesn't work either; it says Windows has already determined the best driver for your device.

The problem is I need USB 3 ports for VR, but then I want them off when playing 2D games! And disabling and re-enabling them in the BIOS is annoying; even using profiles, you still have to restart and go into the BIOS each time... Maybe if I could delete the USB Root Hub (USB 3.0) driver, it would use the USB Root Hub driver instead. But the problem is Windows will redownload drivers automatically. I tried every regedit/gpedit entry I could find to disable automatic driver updates and nothing works!

I disconnected the PC from the internet, scheduled a restart, then uninstalled the USB Root Hub (USB 3.0) driver, and after the restart the same driver was installed again. I would have to delete this driver in system32/drivers. The problem is I don't know if the plain USB Root Hub driver would work when USB 3 ports are enabled, even unused. And if it didn't, my mouse wouldn't work, and I have no PS/2 or a spare setup currently, so I can't test that...


----------



## nizolol

Guys, why is my polling rate so unstable? Doesn't matter if it's 500Hz or 1000Hz, every time it's the same sh**.
Do you have any ideas or tweaks for how I could fix this?
The polling rate in the picture is 500Hz


----------



## empl

fenriquez said:


> @Timecard I currently have a mouse capable of 8000hz polling but using SweetLow's custom drivers its throttling to 5333hz so I'm suspecting the case is the interrupt rate is affecting my polling  which is why I'm trying to figure this out to see if I can stabilize 8000hz





I have the new Razer 8000 Hz prototype gaming mouse on my desk. - Page 16 - Blur Busters Forums



HAHAHAHAHAHAHHAHAH, low IQ people and haters gave me ****, because I was talking about how DPC latency matters. Now they won't be able to run their 8k mouse. HAHAHAHAHHAHAHAHAHAHAHAHHAHAHAHH Now where are you, little haters??? xDDDD

*EDIT: So btw, did editing the IMODI register solve your problem? I hadn't read your other posts previously. If not, read on...*

Check a couple of pages back; someone was talking about editing a value in the USB controller's register related to the polling rate. Intel has 50us, but AMD has a higher interval! Also, you will need < 76us max DPC latency to utilize your 8k mouse. Good luck finding a mobo like that; there are some, e.g. a B550 from AMD, which have 55us. Though these tests are relative and differ per HW configuration, and the main source of DPC latency is badly coded drivers! I wouldn't recommend Asus; their mobos can have even 1ms DPC latency, ones which cost $500. And ASRock has a ****ty BIOS, yuck. I didn't see many low DPC latency mobos from GB either; even now, dunno, I didn't check the new Z490 chipset. My mobo has like 100us DPC latency on average, including the USB controller. The lowest I saw was around 30, but it can jump to 100 or higher.
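The basic interval math behind this: at 8 kHz the host only has 125 us between polls, so any DPC work that runs longer starts eating into the next poll (quick sketch; the exact 76 us headroom figure above is an estimate from testing, not derived here):

```python
# Time budget per poll at a given polling rate: a DPC whose execution
# time exceeds this interval guarantees the next poll is delayed.

def polling_interval_us(rate_hz: int) -> float:
    return 1_000_000 / rate_hz

print(polling_interval_us(1000))  # 1000.0 us of headroom at 1 kHz
print(polling_interval_us(8000))  # 125.0 us of headroom at 8 kHz
```

So a DPC spike that is harmless at 1000Hz (say 500us) is already four polls' worth of delay at 8kHz.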

You will probably need to test all BIOS versions and all permutations of driver versions. Also, don't run anything else while you perform LatencyMon tests, just FYI in case you didn't know; except for the test of the USB 3 driver (wdf01000), where you need to max out the polling rate to see the highest latency! GL!

Also HID causes huge input lag, and you might want to force-disable DWM, and every service you can, even the firewall, to get to 8k, if it is even possible on some motherboards and on Windows 10... There is a debloated version of Windows 10 btw, which supports only DX11 and is less secure, but that may be better! Also don't forget to use the Ultimate Performance plan with idle saver disabled and USB power saving disabled, disable power saving under every keyboard/mouse/HID/USB device in Device Manager and for the NIC, and tweak the NIC for lowest DPC latency! And disable idle saver and dynamic tick and possibly HPET, or you can try disabling platformtick. And IRQ affinities and priorities using MSI_util 2... Also there are sites where they will modify your BIOS to unlock additional tweaking... And you can also tweak RAM timings... You will need every advantage you can get...

*EDIT*: I tried to set the XHCI controller to high priority and set its affinity to a core other than CPU 0, and the mouse felt terrible; later I found out it was because I had set the interrupt priority to high. So I tried only affinity, and same thing, the mouse felt terrible. I also had higher DPC latency, 122us in game, while without these changes it was 98us max execution time. So it seems pointless to change these. I also have high set by default (by the system) for the NVMe and SATA controllers, and undefined for the rest. I tried prioritizing the GPU and USB too, or leaving everything on undefined, and always got worse results.

Also, the IRQ policy "one close processor" did nothing; it feels similar (after switching back I would say worse!) to having it on core 0, with the same DPC latency. There is also a registry key, around since Windows NT or so, which allows you to set IRQ priority: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\PriorityControl
You have to find the IRQ number of your device and create a DWORD entry there named "IRQ#Priority" (without the quotes), where # is the IRQ number of the device (found in Device Manager, sorted by type, IRQ). Then give it a value from 1 (highest) downwards, which will prioritize the IRQs of the devices listed in this key in that order.

But there is another key, which MSI_Util2 uses (google it on 3D Guru), which can give devices interrupt priority. If you set it in both places, I don't know which would take precedence... I mention this because it is possible to give IRQ priority even to the system timer or the RTC clock. Somewhere I read a recommendation that users had good experience prioritizing the system timer, or the GPU, in some order. This location definitely works though; I experimented with it in the past and you can feel it in the mouse movement.

I tested prioritizing the system timer and it feels great; mouse movement is smoother! I know it helped because I am using a 500Hz polling rate, and sometimes one frame is skipped (because the DPC is handled after the timer resolution window), so the mouse feels more inconsistent. I was using a 1ms resolution timer in the past (currently 0.5ms), because there is less variation. But after setting the system timer priority, I was even thinking, wow, this is strange: the mouse feels rounder and smoother now, and that's because it is updating more consistently! The timer resolution still affects input lag; the lowest possible interval is 0.5ms, and that's the time window in which new work can be scheduled on the CPU. You can test it: 1ms vs 0.5ms greatly affects input lag! If someone has a 1000fps camera you can try testing this!

Yet it feels less responsive, especially when doing small adjustments. I tried many permutations of interrupt priority and it was never better than the default!!! The system probably knows what it is doing. But some device drivers, I guess, set this on their own in the registry, e.g. the NVMe and SATA controllers to high; leaving everything on undefined feels worse. *So with prioritizing the system timer there is a tradeoff: the mouse feels more consistent, e.g. when doing circles, but less responsive.*

It is hard to tell sometimes, when input lag is already very low. I use a setting for a while and then change it back to see which is better. E.g. some tweaks feel good initially, say in terms of consistency, but then I notice my mouse lags after playing a while and trying different movements, as happened with the system timer.

*Also:* when I put the XHCI on high priority using MSI_util 2, it doesn't change my DPC latency for XHCI at all: the same 56us max when doing circles furiously in Windows. But when I change it in this registry location, I get 28us max after a minute. Previously I had tested that many times and got the same 56us execution time every time, so that doesn't seem like a coincidence. In game, only 85us max. LatencyMon already simulates load; it is not supposed to be used alongside a game, so it is possible it could be under 85us. I'm not sure it can test the DPC latency of XHCI properly, especially now that we have higher polling rates! Though these changes could cause lag somewhere else, and it isn't an absolute measure of input lag.





Resplendence Software - LatencyMon: Interrupt to process latencies | www.resplendence.com





Btw, MouseTester gives me consistently like 20us with this change enabled, but it doesn't change the time axis from ms to us no matter how much I zoom in... But it is very similar.

There are also 2 other modes, which aren't explained anywhere, and support said they are less accurate. Also, the Windows ADK might be better for testing this.

*SO WAIT* (after doing the changes from the first paragraph after the *EDIT:*; I added more observations, so this paragraph became confusing). I get 52us for the USB 3 driver when maxing out the polling rate over 1 minute and furiously moving the mouse. But when I launch Prime95, I get a highest execution time of 236us, oof!!! Prime95 is probably more demanding; I should still test it in a game. But remember your USB 3 driver needs to stay < 76us. And my motherboard is a very low DPC latency one, yet 100us on average. And I have tweaked almost everything I've ever heard of. I am also using Bitdefender tbh, which has the lowest impact, but still...

So okay, I get like 97us in BF1 (max execution time) when my CPU maxes out at 80-100%. That's 21us higher than 76, which is still decent, but not for 8k, and there will be some inconsistencies. Also the timer resolution window is only 0.5ms, and even the Intel USB controller will add a small delay sometimes.

So you probably still need the lowest-possible-DPC-latency mobo, like that 55us one, but that's AMD, and there you would need to edit the USB interrupt controller register. I don't even know what maximum polling rate AMD can handle! Not to mention there isn't much choice, and these mobos are often expensive, so this probably won't be affordable for many people.

Btw, can you uninstall the drivers after you save a profile? Razer Synapse drivers are laggy as hell; even though they improved latency in Synapse 3, it's still garbage. Also, you may notice Razer installs a HID driver, which can be reverted in Device Manager: click Update, "Let me pick...", and select the Microsoft HID driver. Not sure which is best! You may also notice, under Control Panel > Mouse > Events, some rz drivers instead of mouhid etc., even after you uninstall Synapse! Those need to be deleted from the windows/system32/drivers folder (just search for "rz" and delete every entry, carefully) and from every folder in AppData, Program Files and ProgramData, so you can get the MS drivers back. Also, USB 2 has lower latency, but you probably know that! And if you activate only one USB 3 port in BIOS, you get a USB Root Hub (USB 3.0) and huge lag!

Also, under 0x10 and then 0x20 I have all zeroes, so it should be disabled for me automatically. But my chipset, Z390, doesn't have the IMODI register address listed in the docs: Intel® Z390 Chipset

Intel® 300 Series Chipset Family PCH Datasheet, Vol. 1 (www.intel.com)

So supposedly it is disabled, and you can take my benchmarks as having interrupt moderation disabled. I am on a Z390-I Gaming, BIOS 2808, tweaked.


nizolol said:


> Guys, why is my polling rate so unstable? It doesn't matter if it's 500Hz or 1000Hz, every time it's the same sh**.
> Do you have any ideas or tweaks for how I could fix this?
> The polling rate in the picture is 500Hz


I was thinking the same thing at first; I thought it just polls 500 per second constantly, but it depends on the number of packets sent from the mouse, I guess. This is normal: if you don't max out your polling rate, there is nothing to report. Try moving the mouse quickly in circles and look. Or try the Zowie online mouse polling rate tester, or MouseTester (google it on Overclock.net). Maybe the program is reporting it wrong...


----------



## Timecard

empl said:


> you will need < 76us DPC latency max.to utilize your 8k mouse


Feel free to elaborate on your calcs.


----------



## empl

Timecard said:


> Feel free to elaborate on your calcs.


Well, I was talking about this on another site, generally for all processors, so I took that number into account, given that Intel CPUs have the lowest polling interval, 50us (so I heard). Let me ask you a question: I thought the USB controller was located on the motherboard, but then I heard someone talking about the CPU. Since you can find its specifications in Intel's datasheets per chipset (as ucode said), that would imply it is on the motherboard. Just want to clear this up so I refer to it correctly!

Yes, you can disable interrupt moderation on Intel motherboards, as far as I know. Not sure about some chipsets though! I have a Z390 and I already have all zeroes at offset 0x20, after double-clicking 0x10 in 32-bit mode. But I found no specification of a "Maximum Interrupt Interval" for the Z390 chipset in the Intel PCH datasheet, which was the only doc listed under this chipset! So I should have it disabled. But I heard that AMD can handle a much higher polling rate, so can this be edited on AMD too? And most people probably won't be comfortable editing this, or don't know about it!

So, to my value: when you move your mouse, the data is sent over USB. The USB interrupt controller polls the mouse every 50us, constantly, but it doesn't know when you will move your mouse! Say I need to make a quick turn...

This is the xHCI datasheet: https://www.intel.com/content/dam/w...ensible-host-controler-interface-usb-xhci.pdf

I heard from *ucode*, or someone, that the Intel USB controller is set to a 50us polling interval by default. And interrupt moderation can be disabled; the question is, what is the top limit? From the xHCI datasheet I quote:
"Interrupts/sec = (250×10^-9 sec × IMODI)^-1. For example, if the IMODI is programmed to 512, the host controller guarantees the host will not be interrupted by the xHC for at least 128 microseconds from the last interrupt. The maximum observable interrupt rate from the xHC should not exceed 8000 interrupts/sec."

Again, I don't know if they mean that when IMODI is set to 512 (polling every 128us) the interrupt rate should not exceed 8000 interrupts/sec, or that the interrupt controller simply cannot deliver more than 8000 interrupts/sec, period. I read someone saying 8k is the maximum possible interrupt rate of the USB controller too. We need to clear this up! Also, what's strange: 1/8000 is 125us, but they talk about a 128us interval, which doesn't match exactly. That's confusing! It does seem to mean 8k is the top limit...
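
To make the quoted formula concrete, here is a small sketch of the arithmetic in plain Python. It only restates the datasheet formula with the values quoted above; nothing here is measured:

```python
# xHCI interrupt moderation, per the datasheet quote:
#   Interrupts/sec = (250e-9 sec * IMODI)^-1

def imodi_to_interval_us(imodi):
    """Minimum gap between xHC interrupts, in microseconds."""
    return 250e-9 * imodi * 1e6

def imodi_to_rate(imodi):
    """Maximum interrupt rate (interrupts/sec) for a given IMODI value."""
    return 1.0 / (250e-9 * imodi)

# Datasheet example: IMODI = 512 gives a 128 us gap, i.e. ~7812 interrupts/sec
print(imodi_to_interval_us(512), imodi_to_rate(512))

# The separately stated 8000 interrupts/sec cap corresponds to a 125 us gap,
# i.e. IMODI = 500, which is why 128 us and "8000/sec" don't line up exactly.
print(imodi_to_interval_us(500), imodi_to_rate(500))
```

So the 8000/sec figure corresponds to 125us, not 128us, which supports reading it as the controller's overall top limit rather than a property of IMODI = 512.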

So now I can explain: you need interrupts to be handled by the OS in <125us to utilize your 8k polling rate. I took into account the 50us that I heard is the default polling interval of the Intel USB controller. Polling runs constantly; it doesn't know when you will move your mouse in real life! So what if the movement occurs 1us after a 50us poll? That adds up to 49us of waiting until the next poll. So 125 - 49 = 76. My understanding is that the USB controller then sends an interrupt to the CPU, the OS schedules a DPC, and that has to be handled in less than 76us for the data to make it into your game. Not to even mention the timer resolution window: updates reach the CPU only every 0.5ms, so some interrupts will effectively be missed, handled only after the timer resolution window fires! These systems are out of sync, so there will be random variation even at this polling rate; much less significant than at 500/1000Hz, but still there. Of course this will still be a huge help over 500/1000Hz for 360Hz monitors. But let's say your motherboard has something like 300us DPC latency; then this gets a lot worse! I don't know what the average is, it differs greatly. Some $500 ASUS motherboards have DPC latency problems and go up to 1ms...
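
The 76us figure above is just this subtraction; a tiny sketch, where every input is an assumption from this thread rather than a measured value:

```python
# Worst-case budget for one 8 kHz mouse report, per the reasoning above.
# All inputs are assumptions from the thread, not measured values.

REPORT_INTERVAL_US = 125               # 8000 Hz mouse: one report every 125 us
HOST_POLL_US = 50                      # assumed default Intel xHCI poll interval
WORST_POLL_WAIT_US = HOST_POLL_US - 1  # movement lands 1 us after a poll

# Whatever the poll wait doesn't eat is left for the interrupt/DPC handling
dpc_budget_us = REPORT_INTERVAL_US - WORST_POLL_WAIT_US
print(dpc_budget_us)  # 76
```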

So the question is... your 360Hz monitor will render a frame roughly every 2.8ms instead of every 6.9ms (144Hz), and if you have something like 300us DPC latency and the timer resolution window is up to 0.5ms, then it's possible it will skip a couple of packets. That could make your mouse movement jittery, because it is updated more frequently on the monitor: sometimes you render close to the latest position, but sometimes you get a position from a couple of mouse frames back. Dunno, I am a 144Hz pleb
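
As a rough illustration of that staleness (again, using my assumed numbers; real pipelines are messier than this):

```python
# How many 8 kHz mouse reports fit in a display frame, and how many reports
# behind a given DPC delay puts you. Figures are assumptions from this thread.

def frame_time_ms(refresh_hz):
    """Display frame time in milliseconds."""
    return 1000.0 / refresh_hz

def reports_behind(dpc_latency_us, report_interval_us=125.0):
    """How many mouse reports old the data can be by the time it's handled."""
    return dpc_latency_us / report_interval_us

print(frame_time_ms(360))   # ~2.78 ms per frame
print(frame_time_ms(144))   # ~6.94 ms per frame
print(reports_behind(300))  # 2.4 reports stale with 300 us DPC latency
```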

So we know interrupt moderation on the Intel USB controller can be completely turned off. The question is whether it is even capable of more than an 8k polling rate (125us intervals). If you move the mouse right after a poll occurs, you will have to wait up to almost 125us before the next poll sends the latest mouse position on to the CPU!

Correct me if I am wrong. I am not a USB engineer; I got my information from Intel datasheets and from users on forums and the internet...


----------



## empl

NDUS said:


> I couldn't find anything about "Interrupt Threshold Control" in the data sheets for Z3xx PCH. Ctrl-f "micro-frame" yielded no similar-kind option either
> 
> https://www.intel.com/content/www/us/en/products/docs/chipsets/300-series-chipset-pch-datasheet-vol-1.html?wapkw=Chipset Platform Controller Hub (PCH) Datasheet
> https://www.intel.com/content/www/us/en/products/docs/chipsets/300-series-chipset-pch-datasheet-vol-2.html?wapkw=Chipset Platform Controller Hub (PCH) Datasheet


Yep, I couldn't find these anywhere either; ucode listed this for the X99 chipset, or whatever it's called. But I found it in the xHCI controller specification from 2019, linked in my previous post.

EDIT: sorry for the double post, I thought it was the last post; it showed up next to my previous one. *BTW I shared my findings about interrupt affinity and priority two posts up.*


----------



## Tsubakii

NotThat said:


> Has anyone tried using a logic analyzer to measure a mouse's polling precision? Seem to me like if the difference between mice comes from the actual mice themselves then a logic analyzer would be a superior tool to a PC running an OS. They can be found fairly cheap too compared to other external HW benchmark solutions.
> 
> A random one from Aliexpress for $7
> 
> 10.51US $ | Debug Microcontroller Mini FPGA 24M 8CH Professional Portable USB Logic Analyzer - AliExpress (www.aliexpress.com)
> 
> Edit: Buy a USB extension cable as well. Cut the extension cable open and hook it up to the logic analyzer. Now testing different mice should be as easy as plug and play. Unless the mice have to be connected to an OS, in which case you can still branch out from the extension cable and record the signals being sent back and forth between the mice and PC.


Maybe you would find this interesting; it doesn't seem useful for my polling rate though, unless I wanted to cheat or something.

products:usb_sniffer [Wiki] (blog.lambdaconcept.com)

----------



## sumandroid12

Hello guys, I need help understanding the mouse polling results on my laptop. My mouse feels rather laggy, so I have been searching for something like this. The mouse is a Logitech G203. Booting into safe mode produces vastly different results and the mouse feels snappier as well. Here are the results with safe mode on/off. Is this supposed to be normal? The laptop is a Lenovo Legion 5 (Ryzen 5 4600H + GTX 1650 + 16GB RAM)


----------



## Avalar

sumandroid12 said:


> Hello guys, I need help understanding the mouse polling results on my Laptop. My mouse feels rather laggy so I have been searching for something like this. The mouse is Logitech G203. Booting into safe mode produces vastly different results and the mouse feels snappier as well. Here are the results with safe mode on/off. Is this supposed to be normal? Laptop is Lenovo Legion 5(Ryzen 5 4600h+gtx1650+16gb Ram)
> 
> View attachment 2527064
> 
> View attachment 2527065


Could just be because it's a laptop; idk the specifics, but it has something to do with it being battery powered. Anyhow, I've never seen a laptop with less latency/better peripheral response than a decent PC. What's your power plan at the moment, and the settings within that plan? Make sure your CPU is set to 100% minimum, for one thing.

Also, the second graph is looking at a smaller window than the first.


----------



## sumandroid12

I have tried all the usual stuff to turn off any power limits. I am using a custom high performance plan (everything set to max).



Avalar said:


> Also, the second graph is looking at a smaller window than the first.


Of course; this is not deception on my part, but if I were to increase the window size it would be difficult to tell the data points apart, since they have very low variance. The rest of the graph looks more or less the same.
Another member on this forum suggested turning off some services with a script of his, services that are not there in safe boot. So I went ahead and turned off AMDppm.sys (the AMD processor driver) and was able to get a better-looking graph. I have checked Cinebench and my results stay the same. Only Task Manager seems to misreport the processor as stuck at 3.0GHz.


----------



## Avalar

sumandroid12 said:


> I have tried all the usual stuff to turn off any power limits. I am using a custom high performance plan(everything set to max).
> 
> 
> 
> Of course, this is not a deception on my part but If I were to increase the window size it would be difficult to see the data points apart. since they are of very low variance. The rest of the graph looks more or less same.
> Another member in this forum suggested turning off some services with a script of his, services that are not there in safeboot. So I went ahead and turned off the AMDppm.sys (AMD processor driver) and I was able to get a better looking graph. I have checked cinebench and my results stay the same. Only the task manager seems to misreport the processor being stuck at 3.0Ghz.
> View attachment 2527310


What's that one thing called, HPET? Have you tried turning that off and seeing how it feels then? The option is in both the BIOS and Device Manager.


----------



## sumandroid12

Avalar said:


> What's that one thing called, HPET? Have you tried turning that off and seeing how it feels then? The option is both in BIOS and Device Manager.


Didn't make any difference tbh. I learnt that AMDppm is actually processor power management or something... Maybe it's forcing the CPU into low power states and that's messing with the polling, idk...
I get this weird microstutter when using the mouse in Battlefield 5 and Battlefield 1. The FPS is completely stable btw. Could this be related to polling issues? I've found it in other games to some extent as well.
Even after trying that disabling tweak, my graph is far from stable. About every 5s or so I get a huge jitter in the polling rate; otherwise it's 1000 ± 50Hz.


----------



## Kingofinputlag

For the life of me I cannot figure out the problem I am having with input lag. I'm using a PS4 controller (a Scuf) with the GitHub overclock, and no matter what I do, the input lag comes back. It's good for a day or maybe two after a clean install of Windows. SweetLow sent me the link to this thread, and this stuff is way over my head. I'm playing on a pretty high-end PC with pretty much nothing on it; the computer is used pretty much only for competitive gaming. If anyone is willing to help, or take a look and see what I've got going on, I'd have no problem paying someone for their time and/or help. Thanks!


----------



## Layead ttv

PastaFinder said:


> Have you guys ever experienced this kind of graphs ? Each peak is compensated by an equally opposite peak


Yes, I've gotten this type of graph. Have you installed Corsair Link? That RAM-management software messes up everything about mouse polling rate. It's hard to uninstall; you need Autoruns and your own eyes to delete all the bad stuff it installs on your OS.


----------



## vf-

Layead ttv said:


> Yes i already get this type of graph, have you install Corsair Link? This software for RAM management is ***** everything on mouse polling-rate. Hard to uninstall, you need Autoruns and your eyes to delete all the bad stuff they install on your OS.


You mean iCUE? iCUE hooks into the mouse and keyboard, it seems, as it installs some virtual mouse and keyboard drivers.


----------



## Layead ttv

vf- said:


> You mean, iCUE? iCUE hooks into mouse and keyboard it seems... As it installs some virtual mouse and keyboard drivers.


Yes, that's it! Exactly. I found this when I was looking to disable RGB on a friend's Corsair RAM.


----------



## Marctraider

Is it possible to apply the USB register change through GRUB? Or even before that, through an EFI script?


----------



## Layead ttv

Marctraider said:


> Is it possible to apply USB register through GRUB? Or even before that, through EFI script?


In my personal experience and opinion, yes it is: hard to do, but possible.
You need a BIOS modder and have to ask them. Ukraine has the largest BIOS-modder population, but for now it's a difficult time for them..
Why do you need this register through GRUB? An EFI script is not enough; this is a Windows feature, not BIOS..


----------



## Rei

http://imgur.com/a/iDFV1gT

I did the steps in the images.

The modification reverts when restarting the system. Is it possible to make it automatic with some command?

I noticed an improvement in the quality of my mouse using a USB PCIe card with a Fresco Logic FL1100 chip. My motherboard is an ASRock B450M Pro4.


----------



## winmast

Crymore13 said:


> Why does my Deathadder 2013 have these peaks ?
> 
> 
> 
> Movement made:
> 
> 
> 
> My Latencymon:
> 
> 
> 
> Windows 8.1


As far as I know, the DA2013 freezes for about 1ms every 400ms, losing 1ms of movement data each time.
I have a DA2013, like you


----------



## howiec

So it seems others, including me, are getting the error "*Driver cannot be loaded*..." for *RWEverything* after upgrading to Win11 *22H2*.

Has anyone been able to find a solution?

Regards,
Howie


----------



## frabb

I've been having an issue with a floaty mouse feeling the last few months and I'm wondering if it's an EMI/electrical issue. I'm getting these weird spikes up to 100kHz, which seems impossible. Anyone else ever seen this?


----------



## 0ne0ay

howiec said:


> So it seems others including me are getting the error "*Driver cannot be loaded*..." for *RWEverything *after upgrading to Win11 *22H2*.
> 
> Has anyone been able to find a solution?
> 
> Regards,
> Howie


try this GitHub - Faintsnow/HE: HE - Hardware Read & Write utility is a powerful utility for hardware engineers, BIOS engineers, driver developers, QA engineers, performance test engineers, diagnostic engineers… etc.


----------

