Unreasonably bad performance on RTX 2060 (mobile)

I just went from a desktop with a 980-Ti to a laptop with an RTX 2060. These cards are supposed to have similar performance numbers, with the RTX 2060 coming out on top (and though I am on a mobile version of it, I checked benchmarks showing that this is still supposed to be the case between a desktop 980-Ti and a laptop’s 2060).

On my desktop I had a 4K 60hz display and it ran smoothly. I copied the mpv configuration in its entirety over to the laptop, which has a 1080p screen at 144hz (which should still be easier to render than a 4K display), and it lags like crazy: hundreds of delayed frames within seconds. I changed the screen’s refresh rate to 60hz but it still lags a lot, even with slightly lowered settings.

Note: I originally thought that the interpolation setting was causing the lag, but even without interpolation it’s a stuttery mess (even at 60hz)

I tested it both with optimus-manager and bumblebee just to see if one or the other was the issue, but I got roughly identical performance.

(I’ve also noticed less than ideal performance in games, if I run anything with DXVK at 144hz it stutters like there’s no tomorrow, haven’t tried it at 60hz but I’m pretty sure that this is not normal at all)

Edit: Oh yeah, also: if I use optimus-manager to start the computer on nvidia while running on battery, it poops out on me after I log in (the screen blinks on and off but never displays an image; it’s just black when it’s on), but if it’s plugged in, starting out on nvidia works fine. A really bizarre issue, although it does point to the possibility of this being a power-management-related issue.

What could be causing this and how do I fix it?

So there is no problem with manjaro?

there was another thread yesterday regarding RTX cards having really poor performance, and afaik it was not resolved. you can look through the arch wiki and some man pages and play with settings to see if anything improves.


Can you find that thread again? (I looked but can’t seem to dig it up myself.) I’d like to see whether it’s the same kind of problem and what kind of troubleshooting was done.

@Elloquin It’s too early to rule that out; there was a problem yesterday where the installer didn’t properly install the non-free drivers even if I selected them. I’m hoping it’s just a driver issue that will get fixed by nvidia soon, but it’s not impossible that this is a Manjaro-related configuration issue (even if the odds of that seem slim). Also, at the end of the day I’m on Manjaro; where else should I look for help? I could (and maybe will) try the Nvidia forums, I guess, but this is the first place I go for help when I have problems on Manjaro.

i did a brief search on rtx 2060 on linux but didnt come up with much. the thread i was referring to was

i only speak english, so at first i was using a translator extension to translate but after a few posts its all english anyway. (i was translating the wrong language :sweat_smile: , luckily the OP spoke english as well).

bumblebee does not support vulkan (dxvk), so if that’s your intended use then bumblebee might not be the best fit for you. as for optimus-manager, it should work with vulkan AFAIK since it uses the proprietary nvidia drivers; i can’t confirm, as i was never able to make it work properly. you can try PRIME or optimus-switch, which can switch between prime and intel-only modes.

if you’re not using bumblebee, and the option you’re using now is working, then stick with it. i think this is more of an rtx+driver issue that may be fixable with the right options/configuration. here’s some resources for you to look through, all very good.

you should probably check nvidia forums/dev discussions related to your gpu. i’m not sure what version stable is on right now, but on the testing branch my nvidia drivers are 418.43.

another thing you may want to check out is specifying your monitors edid info, especially if you want the best performance out of the display. you can find guides on how to do it on arch wiki and other places.

what I meant was I tested it with mpv; mpv just uses GL (although I think it supports vulkan now, guess I need to update my config huh?) so I was testing that with bumblebee and optimus-manager.

I like optimus-manager, I had to use nvidia-xrun last time I tried to use DXVK, not very fun. I also found that it’s very simple to switch between optimus-manager and bumblebee, in fact I made a script for it just to make it even easier.



#!/bin/bash
if pgrep -x "bumblebeed" > /dev/null; then
    echo "Switching to Optimus Manager"
    sudo systemctl stop bumblebeed && sudo systemctl start optimus-manager
elif pgrep -f "optimus-manager-daemon" > /dev/null; then
    echo "Switching to Bumblebee"
    sudo systemctl stop optimus-manager && sudo systemctl start bumblebeed
else
    echo "Error: Bumblebee & Optimus Manager are both disabled"
fi

As long as I switch back to intel before I do this, it enables me to disable optimus-manager and enable bumblebee with just one command. It allows me to use primusrun if I don’t feel like restarting X just to run something that doesn’t need vulkan support, and then disable that, re-enabling optimus-manager when it’s actually necessary. (Although if I’m on power I usually just use optimus-manager, it’s simpler that way really.)

The thread you linked to could be the same problem; I coincidentally also tried minecraft and also had stuttering in it (wonder how tf he got 250+ fps in minecraft though, that game never runs that well for me; I had well over 100fps, but with stutters). I should probably note though that CS:GO ran OK-ish when I tested it; I got around 200 fps. There were stutters, in fact a lot of stutters, but not so many that it was unplayable (just microstutters really).

I’d need to know what it’s like for him to use mpv on the gpu-hq profile though to be sure if it’s the same problem.

I did notice however that we both have MSI mainboards (although his is a desktop one and mine is a laptop). I’ve googled around for this issue but found practically nothing; I wonder if it’s because the 2060 was only supported on linux in January and not enough people have it on linux yet? Although that wouldn’t make sense, because on Phoronix the cards seemed to be working fine for benchmarks…

I tried a couple things from the archwiki; nothing had any impact. I did notice however that when I set powermizer to prefer max performance, opening mpv set it to the 2nd fastest mode instead of the fastest (which it was on before I opened mpv). While that is bizarre, I don’t think it would have resulted in anything this extreme.
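For anyone wanting to reproduce the powermizer change from a terminal: this is a sketch assuming the proprietary driver’s nvidia-settings tool and a running X session; it just prints a message when the tool is absent.

```shell
# Sketch: set PowerMizer to "prefer maximum performance" (mode 1)
# on GPU 0. Assumes the proprietary NVIDIA driver; skips otherwise.
if command -v nvidia-settings >/dev/null 2>&1; then
    nvidia-settings -a '[gpu:0]/GPUPowerMizerMode=1'
    status="applied"
else
    status="skipped (nvidia-settings not found)"
fi
echo "PowerMizer: $status"
```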

Lastly… specifying the EDID manually has some kind of impact on performance? First I’ve heard of this; care to elaborate?

using the edid that is specific to your monitor will work better, for your screen anyway, since it’s not trying to make use of a “one size fits all” configuration. i had random flickers every few minutes, along with a few other random issues, that were all remedied by specifying an edid.bin at boot. i only suggested it since you have a 144hz monitor, which is less common on a laptop, and also because it possibly syncs better to the gpu? (that last part is merely a guess)

Your screen is supposed to contain the EDID; as in, literally, the EDID is stored in the screen somewhere, and whenever it is connected to a PC it forwards the EDID data to the PC. This is how a computer knows whether it’s a 60hz or 144hz screen, or whether it’s 10-bit or not, among other things.

In other words, by default I should already be using an EDID specific to my screen, and the only way I will lose it is if my screen, or the cable connecting it to the PC, is damaged somehow (this has happened to me like 3 times before, and then I had to dig up an EDID that was close enough to whatever screen I was using in order for it to work again… I suppose backing up the EDID data just in case wouldn’t be too dumb, all things considered, but I don’t think setting it manually would change anything…)
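On that note, the kernel already exposes the EDID blobs it has read under sysfs, so backing them up is a short loop. A minimal sketch (the `edid-<output>.bin` naming is just my choice):

```shell
# Sketch: back up the EDID blob of each connected output.
# The /sys/class/drm paths are standard; output names vary by machine.
found=0
for f in /sys/class/drm/*/edid; do
    [ -s "$f" ] || continue          # skip disconnected outputs (empty files)
    out="edid-$(basename "$(dirname "$f")").bin"
    cp "$f" "$out"
    echo "saved $out"
    found=1
done
[ "$found" -eq 1 ] || echo "no EDID blobs found"
```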

Linux is already reading my screen’s EDID perfectly fine.

yes, it does.

if it’s able to be read, yes. if not, then a compatible edid is used. from what i was able to pick up while reading about edid, the i2c-dev module is used to do this, and it’s not active by default (why? i have no idea).

so even tools like get-edid can’t pick up the monitor’s actual edid info unless the i2c-dev module is modprobed first. it’s easy enough to confirm and compare to the edid info being used by default.

sudo pacman -S read-edid
sudo get-edid      #was an edid detected?? no? now try

sudo modprobe i2c-dev
sudo get-edid | parse-edid      #now one can be found right?

compare the modeline info to what your monitor is currently using; it’s probably not the same.

i think by default, generic modelines are used that are seen as compatible with your display. i think these are generated the same way tools like cvt generate modelines. here’s mine for example: i disabled early kms so my edid is not specified, then modprobed i2c-dev so the actual edid could be read. look at the difference.

~ >>> xrandr                                                                                                                       
Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 16384 x 16384
eDP-1-1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 500mm x 281mm
   1920x1080     60.00*+  59.93  
   1680x1050     59.95    59.88  
   1400x1050     59.98  
   1600x900      59.95    59.82  
   1280x1024     60.02  
   1400x900      59.96    59.88  
   1280x960      60.00  
   1440x810      59.97  
   1368x768      59.88    59.85  
   1280x800      59.97    59.81    59.91  
   1280x720      60.00    59.99    59.86    59.74  
   1024x768      60.04    60.00  
   960x720       60.00  
   928x696       60.05  
   896x672       60.01  
   1024x576      59.95    59.96    59.90    59.82  
   960x600       59.93    60.00  
   960x540       59.96    59.99    59.63    59.82  
   800x600       60.00    60.32    56.25  
   840x525       60.01    59.88  
   864x486       59.92    59.57  
   700x525       59.98  
   800x450       59.95    59.82  
   640x512       60.02  
   700x450       59.96    59.88  
   640x480       60.00    59.94  
   720x405       59.51    58.99  
   684x384       59.88    59.85  
   640x400       59.88    59.98  
   640x360       59.86    59.83    59.84    59.32  
   512x384       60.00  
   512x288       60.00    59.92  
   480x270       59.63    59.82  
   400x300       60.32    56.34  
   432x243       59.92    59.57  
   320x240       60.05  
   360x202       59.51    59.13  
   320x180       59.84    59.32  
HDMI-1-1 disconnected (normal left inverted right x axis y axis)
~ >>> cvt 1920 1080 60                                                                                                             
# 1920x1080 59.96 Hz (CVT 2.07M9) hsync: 67.16 kHz; pclk: 173.00 MHz
Modeline "1920x1080_60.00"  173.00  1920 2048 2248 2576  1080 1083 1088 1120 -hsync +vsync
~ >>> sudo get-edid | parse-edid                                                                                                   
[sudo] password for dglt: 
This is read-edid version 3.0.2. Prepare for some fun.
Attempting to use i2c interface
No EDID on bus 0
No EDID on bus 1
No EDID on bus 2
No EDID on bus 3
No EDID on bus 4
1 potential busses found: 5
256-byte EDID successfully retrieved from i2c bus 5
Looks like i2c was successful. Have a good day.
Checksum Correct

Section "Monitor"
	Identifier "V�"
	ModelName "V�"
	VendorName "BOE"
	# Monitor Manufactured week 1 of 2015
	# EDID version 1.4
	# Digital Display
	DisplaySize 340 190
	Gamma 2.20
	Option "DPMS" "false"
	Modeline 	"Mode 0" 141.40 1920 1968 2000 2142 1080 1083 1089 1100 +hsync -vsync 
	Modeline 	"Mode 1" 113.12 1920 1968 2000 2142 1080 1083 1089 1100 +hsync -vsync 
~ >>>                                                                                                                              

when i use early kms and specify the monitor’s actual edid, the data matches, and the screen no longer has random flashes/flickers every so often. when i run something like glxgears without early kms/edid i can see it skipping frames; with the edid specified via early kms the skipped frames are not present. so from that i can say that it does actually make a difference, for me at least. i know if i had a nice 144hz screen i would be sure to make it work its best, but that’s up to you, it was just a suggestion.


Huh, I see, that’s new to me, guess I’ll need to dig up that EDID somewhere… This is useful info, thanks for pointing this out.


I sent my laptop manufacturer a request to mail me the EDID for my display, still waiting for the file.

As for the original issue of this thread, the mystery has been solved. Disabling compositing in XFCE completely resolved all of my issues :smiley:


nice, did disabling the compositor options even fix in-game performance? works perfectly now?

Yep, mpv motion smoothing works. I still get a few frame delays, but it’s within reason (I just need to tweak the settings a little bit); as for the games, everything works.

After doing some further testing I figured out that the issue was related to VSync. I asked around on the nvidia forums (https://devtalk.nvidia.com/default/topic/1048209/linux/lots-of-stuttering-and-frame-delays-on-rtx-2060-mobile-/) and after he gave me a method to forcefully disable vsync (which alleviated but did not solve the issue), he suggested disabling the compositor. To be honest I’m a little shocked I never thought of that; I guess in all the years I’ve used linux, disabling/enabling compositing has NEVER had any serious effect on things like gaming for me, so I never even considered it. (I should have though, because disabling aero once solved problems for me on windows 7.)
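The exact method from the nvidia forums isn’t quoted above, so this is an assumption on my part: one common way to force vsync off per-application with the proprietary driver is the `__GL_SYNC_TO_VBLANK` environment variable, e.g.:

```shell
# Sketch: launch a GL application with driver-level vsync disabled.
# __GL_SYNC_TO_VBLANK is an NVIDIA proprietary-driver variable; that
# it matches the method given on the forum is an assumption.
export __GL_SYNC_TO_VBLANK=0
echo "__GL_SYNC_TO_VBLANK=$__GL_SYNC_TO_VBLANK"
# then, for example:
#   __GL_SYNC_TO_VBLANK=0 mpv --profile=gpu-hq somevideo.mkv
```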

But this completely resolved everything; even the microstuttering is gone, so my 144hz display now actually feels like a 144hz display and not… whatever it was before (honestly it felt like a 60hz display even at 144fps gaming :open_mouth: I hardly believed the difference after I disabled compositing). Not sure about the issue with booting the system with the nvidia card enabled on battery; I guess I’ll test that.

But it’s safe to say that my laptop is now performing within expectations as long as the compositor is off.

This does create a new problem though: it really sucks to lose the transparency that compositing provides for terminals/the taskbar/conky… I’d really like some way to disable compositing only when in fullscreen applications or something.
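Until a per-fullscreen solution turns up, the xfwm4 compositing switch lives in xfconf, so a toggle you could bind to a hotkey before launching a game might look like this (a sketch, assuming a running XFCE session; it just prints a message anywhere else):

```shell
# Sketch: toggle xfwm4's built-in compositor on/off via xfconf.
prop="/general/use_compositing"
if command -v xfconf-query >/dev/null 2>&1; then
    cur=$(xfconf-query -c xfwm4 -p "$prop")
    if [ "$cur" = "true" ]; then new="false"; else new="true"; fi
    xfconf-query -c xfwm4 -p "$prop" -s "$new"
    echo "compositing -> $new"
else
    echo "xfconf-query not found (not an XFCE session?)"
fi
```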

what compositor was it using by default? try others, compton, compiz

compton seems to be the solution for many people using xfce, the xfce default compositor is garbage. here is how to use compton instead


the default ~/.config/compton.conf that gets installed is a little plain, so here is mine; you can either copy/paste the whole thing or cherry-pick options here and there.
this setup works perfectly for me on both nvidia and intel gpus, and i use openbox/xfce. no stutters or tearing.



shadow = true;
no-dnd-shadow = true;
no-dock-shadow = true;
clear-shadow = true;
shadow-radius = 7;
shadow-offset-x = -7;
shadow-offset-y = -7;
shadow-opacity = 0.60;
shadow-ignore-shaped = true;


menu-opacity = 0.92;
inactive-opacity = 0.85;
active-opacity = 1.0;
inactive-opacity-override = false;


fading = true;
fade-delta = 2;
fade-in-step = 0.01;
fade-out-step = 0.01;
alpha-step = 0.08;
no-fading-openclose = false;
no-fading-destroyed-argb = false;


backend = "glx";
vsync = "opengl";
mark-wmwin-focused = true;
mark-ovredir-focused = true;
detect-rounded-corners = true;
detect-client-opacity = true;
refresh-rate = 0;
paint-on-overlay = true;
unredir-if-possible = true;
detect-transient = true;
detect-client-leader = true;

# GLX backend

glx-no-stencil = true;
glx-no-rebind-pixmap = true;

opacity-rule = [
"85:class_g *?= 'xterm'",
"92:class_g *?= 'thunar'"
];

shadow-exclude = [
"name *?= 'Notification'",
"class_g *?= 'synapse'",
"class_g *?= 'jgmenu'",
"class_g *?= 'VirtualBox'",
"class_g *?= 'Conky'",
"class_g *?= 'Notify-osd'",
"class_g *?= 'cairo-dock'",
"class_g *?= 'trayer'",
"class_g *?= 'i3-frame'",
"class_g *?= 'firefox'",
"class_g *?= 'navigator'",
"class_g *?= 'Cairo-clock'",
"class_g *?= 'Cairo-dock'",
"class_g *?= 'plank'",
"class_g *?= 'Docky'"
];

focus-exclude = [
"class_g *?= 'Cairo-clock'",
"class_g *?= 'Virtualbox'",
"class_g *?= 'trayer'",
"name *?= 'Authy'"
];

tooltip = { fade = true; shadow = true; focus = true; };
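For reference, the usual recipe to switch over (a sketch; verify the package name and flags on your install) is to turn off xfwm4’s own compositor and then start compton with the config above:

```shell
# Sketch: replace xfwm4 compositing with compton.
# -b daemonizes compton; --config points at the config above.
if command -v compton >/dev/null 2>&1; then
    xfconf-query -c xfwm4 -p /general/use_compositing -s false
    compton -b --config "$HOME/.config/compton.conf"
    started="yes"
else
    started="no (compton not installed)"
fi
echo "compton started: $started"
```

Add the compton line to your XFCE session autostart so it survives logins.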


Yeah, I’ve used compton before, back when I used openbox. Thanks for that link :smiley:

The default is XFWM4; I think the compositor is part of the WM.

Edit: Wow, I even had my old compton.conf file backed up somewhere, it’s a good habit, storing backups of things.

i run a timeshift sync every few days, or before testing something i know i don’t want to keep. i just ran one a few minutes ago because i was gonna try the anbox snap, and i know i don’t really want it other than out of curiosity. i don’t use snaps because they suck up a lot of storage and can be a pain in the arse sometimes.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.