Second monitor (Firefox) laggy when in a fullscreen game

Whenever I’m in a fullscreen game, my second monitor becomes unusably laggy as soon as anything on it is moving. It’s most noticeable with a video open: it draws maybe one frame per second and the mouse becomes unusable. I also notice it when trying to scroll through a wiki page or the like while in game. I’m thinking it’s probably some kind of Nvidia issue, but I’m not sure what.

System:
  Kernel: 5.10.7-3-MANJARO x86_64 bits: 64 compiler: gcc v: 10.2.1 
  parameters: BOOT_IMAGE=/boot/vmlinuz-5.10-x86_64 
  root=UUID=b07453a6-cfde-4f37-8e50-6f90b53b72b6 rw quiet apparmor=1 
  security=apparmor udev.log_priority=3 
  Desktop: KDE Plasma 5.20.5 tk: Qt 5.15.2 info: latte-dock wm: kwin_x11 
  dm: SDDM Distro: Manjaro Linux 
Machine:
  Type: Desktop Mobo: Micro-Star model: Z370M MORTAR (MS-7B54) v: 1.0 
  serial: <filter> UEFI: American Megatrends v: 1.20 date: 03/08/2018 
CPU:
  Info: 6-Core model: Intel Core i5-8600K bits: 64 type: MCP arch: Kaby Lake 
  note: check family: 6 model-id: 9E (158) stepping: A (10) microcode: DE 
  L2 cache: 9 MiB 
  flags: avx avx2 lm nx pae sse sse2 sse3 sse4_1 sse4_2 ssse3 vmx 
  bogomips: 43212 
  Speed: 800 MHz min/max: 800/4400 MHz Core speeds (MHz): 1: 800 2: 800 3: 800 
  4: 800 5: 800 6: 800 
  Vulnerabilities: Type: itlb_multihit status: KVM: VMX disabled 
  Type: l1tf 
  mitigation: PTE Inversion; VMX: conditional cache flushes, SMT disabled 
  Type: mds mitigation: Clear CPU buffers; SMT disabled 
  Type: meltdown mitigation: PTI 
  Type: spec_store_bypass 
  mitigation: Speculative Store Bypass disabled via prctl and seccomp 
  Type: spectre_v1 
  mitigation: usercopy/swapgs barriers and __user pointer sanitization 
  Type: spectre_v2 mitigation: Full generic retpoline, IBPB: conditional, 
  IBRS_FW, STIBP: disabled, RSB filling 
  Type: srbds mitigation: Microcode 
  Type: tsx_async_abort mitigation: Clear CPU buffers; SMT disabled 
Graphics:
  Device-1: NVIDIA GP106 [GeForce GTX 1060 6GB] vendor: Micro-Star MSI 
  driver: nvidia v: 460.32.03 alternate: nouveau,nvidia_drm bus ID: 01:00.0 
  chip ID: 10de:1c03 
  Display: x11 server: X.Org 1.20.10 compositor: kwin_x11 driver: 
  loaded: nvidia display ID: :0 screens: 1 
  Screen-1: 0 s-res: 3840x1080 s-dpi: 101 s-size: 967x272mm (38.1x10.7") 
  s-diag: 1005mm (39.5") 
  Monitor-1: DVI-D-0 res: 1920x1080 hz: 60 dpi: 102 
  size: 477x268mm (18.8x10.6") diag: 547mm (21.5") 
  Monitor-2: DP-0 res: 1920x1080 hz: 144 dpi: 94 size: 519x293mm (20.4x11.5") 
  diag: 596mm (23.5") 
  OpenGL: renderer: GeForce GTX 1060 6GB/PCIe/SSE2 v: 4.6.0 NVIDIA 460.32.03 
  direct render: Yes 
Audio:
  Device-1: Intel 200 Series PCH HD Audio vendor: Micro-Star MSI 
  driver: snd_hda_intel v: kernel bus ID: 00:1f.3 chip ID: 8086:a2f0 
  Device-2: NVIDIA GP106 High Definition Audio vendor: Micro-Star MSI 
  driver: snd_hda_intel v: kernel bus ID: 01:00.1 chip ID: 10de:10f1 
  Sound Server: ALSA v: k5.10.7-3-MANJARO 
Network:
  Device-1: Intel Ethernet I219-V vendor: Micro-Star MSI driver: e1000e 
  v: kernel port: f000 bus ID: 00:1f.6 chip ID: 8086:15b8 
  IF: enp0s31f6 state: down mac: <filter> 
  Device-2: ASIX AX88179 Gigabit Ethernet type: USB driver: ax88179_178a 
  bus ID: 2-3:2 chip ID: 0b95:1790 serial: <filter> 
  IF: enp0s20f0u3 state: up speed: 1000 Mbps duplex: full mac: <filter> 
Drives:
  Local Storage: total: 2.05 TiB used: 508.26 GiB (24.2%) 
  SMART Message: Unable to run smartctl. Root privileges required. 
  ID-1: /dev/sda maj-min: 8:0 vendor: Seagate model: ST2000DM006-2DM164 
  size: 1.82 TiB block size: physical: 4096 B logical: 512 B speed: 6.0 Gb/s 
  serial: <filter> rev: CC26 
  ID-2: /dev/sdb maj-min: 8:16 model: SATA SSD size: 238.47 GiB block size: 
  physical: 512 B logical: 512 B speed: 6.0 Gb/s serial: <filter> rev: 61.2 
Partition:
  ID-1: / raw size: 43.95 GiB size: 43 GiB (97.86%) used: 33.86 GiB (78.7%) 
  fs: ext4 dev: /dev/sdb5 maj-min: 8:21 
  ID-2: /boot/efi raw size: 489 MiB size: 488 MiB (99.80%) 
  used: 3.6 MiB (0.7%) fs: vfat dev: /dev/sdb8 maj-min: 8:24 
  ID-3: /home raw size: 735.62 GiB size: 723.08 GiB (98.29%) 
  used: 474.32 GiB (65.6%) fs: ext4 dev: /dev/sda3 maj-min: 8:3 
Swap:
  Kernel: swappiness: 60 (default) cache pressure: 100 (default) 
  ID-1: swap-1 type: partition size: 3.84 GiB used: 73.5 MiB (1.9%) 
  priority: -2 dev: /dev/sdb6 maj-min: 8:22 
Sensors:
  System Temperatures: cpu: 29.8 C mobo: 27.8 C gpu: nvidia temp: 47 C 
  Fan Speeds (RPM): N/A gpu: nvidia fan: 47% 
Info:
  Processes: 251 Uptime: 4h 36m wakeups: 0 Memory: 15.59 GiB 
  used: 5.44 GiB (34.9%) Init: systemd v: 247 target: graphical.target 
  Compilers: gcc: 10.2.0 clang: 11.0.1 Packages: 1723 pacman: 1695 lib: 437 
  flatpak: 22 snap: 6 Shell: Zsh v: 5.8 running in: yakuake inxi: 3.2.02 

There is no “good” software solution, only a “bad” and a “very bad” one, so try them out one by one and see which is least bad for your use case and your GeForce GTX 1060 hardware:

Do the following:

  • Go to System Settings
  • Type “composit” in the search field
  • Click Compositor
  • Turn off “Allow applications to block compositing” (the same change from a terminal is sketched below)
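
For the terminal-inclined, the same checkbox can be flipped from a shell. A minimal sketch, assuming the usual Plasma 5 kwinrc layout (WindowsBlockCompositing is the key this checkbox maps to on my 5.20 install, so double-check yours in ~/.config/kwinrc):

  # Untick "Allow applications to block compositing" in kwinrc
  # (assumes the [Compositing] group / WindowsBlockCompositing key):
  kwriteconfig5 --file kwinrc --group Compositing --key WindowsBlockCompositing false

  # Ask the running KWin to reload its config so the change applies
  # without logging out:
  qdbus org.kde.KWin /KWin reconfigure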

There! Done! Now KWin keeps compositing even while the game is fullscreen, so your second monitor can still make proper use of your GeForce GTX 1060, but your game no longer gets fully exclusive use of the GPU!
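
To check that the change stuck, you can ask the running KWin whether compositing stays on while the game is fullscreen (a quick sketch, assuming the stock qdbus tool that ships with Plasma):

  # Run this in a terminal on the second monitor while the game is
  # fullscreen; after the change above it should print "true"
  # (if your qdbus wants the full name: org.kde.kwin.Compositing.active):
  qdbus org.kde.KWin /Compositor active

Alt+Shift+F12 (KWin’s default shortcut to suspend/resume compositing) is also handy for comparing the two states on the fly.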

Depending on your use case that might be your preferred option (i.e. “bad, but not very bad”), but if it’s not, the only remaining solution is a hardware one: one GPU per monitor…

:scream: