AMD or Nvidia for future upgrade?

The reason I’m asking is that I’m long overdue to replace my GeForce GTX 970 with a new dGPU. Both AMD and Nvidia have their pros and cons:

Nvidia: Lower power consumption with good performance. Drivers tend to be well written. However, the company tends to drop support for older cards. Even worse, they only offer proprietary drivers. Getting hardware details out of them is like pulling teeth for FOSS developers.

AMD: Open-source drivers that are built into the kernel. Unlike with Nvidia, if your GPU dies you can remove the card, power the system back on, and fall back to the iGPU without having to fight with leftover drivers.
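A minimal sketch, assuming a standard Linux sysfs layout, of how one can check which kernel driver (amdgpu, nvidia, nouveau, or none) is currently bound to each display device; the script itself is just for illustration:

```python
#!/usr/bin/env python3
"""List display-class PCI devices and the kernel driver bound to each,
by walking the standard sysfs paths under /sys/bus/pci/devices."""

import glob
import os

for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
    try:
        with open(os.path.join(dev, "class")) as f:
            pci_class = f.read().strip()
    except OSError:
        continue
    # 0x03xxxx = display controller (VGA, 3D controller, etc.)
    if not pci_class.startswith("0x03"):
        continue
    driver_link = os.path.join(dev, "driver")
    driver = (os.path.basename(os.readlink(driver_link))
              if os.path.islink(driver_link) else "none (unbound)")
    print(f"{os.path.basename(dev)}  class={pci_class}  driver={driver}")
```

On an AMD box you would expect to see amdgpu listed there, nvidia with the proprietary stack, and nouveau with the FOSS driver.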

Because this is the second time I’ve had to replace the fans on the 970, I’m considering going with an AMD Radeon for my next upgrade.

Those of you who have used AMD GPUs from after the current CEO took over: have you experienced any major issues with them? Like I said, I’m overdue, so I will be replacing my video card sooner or later.

Sorry for the long post.

0-2 score if you ask me…

I did a complete system upgrade in 2020 and ditched my really old GTX 660 Ti in favour of a used Radeon RX 570. It’s been much smoother, and I get to use Wayland with GNOME. For the most part, everything just works.

I’ve recently started doing more gaming on Linux too (I don’t buy the very latest games).

Unless there is some specific Nvidia tech you need, such as CUDA for Blender GPU rendering, you’ll likely have a far nicer experience with AMD.


I can play a few games as it stands now: Homeworld Classic instead of Remastered. I can still watch videos with no issue.

At the moment, you’re (kind of) forced to use AMD with Linux.

Reasons:

Desktop environment I am using: standard XFCE, with no fancy 3D effects at all and no compositing enabled.

Sys 1 with an NV GF 1060 + driver blob:
Power consumption:
Idle power consumption has risen from 7 W to 9 W over the last 2~3 years. When I move around simple windows like gnome-calculator, the engine and memory clocks rise, so power consumption also rises to around 30 W. I wonder why, because the 2D desktop performance is bad: if I move an empty pluma window around while a GPU-accelerated video (1080p 25 fps x265, no fancy stuff) is playing, it stutters.
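A rough sketch of how board-power numbers like these can be sampled on the proprietary stack; it assumes nvidia-smi is on the PATH and the card reports power.draw, and it reads the GPU’s own sensor rather than wall power:

```python
#!/usr/bin/env python3
"""Sample the board power draw reported via nvidia-smi a few times
and print min/avg/max in watts."""

import subprocess
import time

samples = []
for _ in range(10):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    samples.append(float(out.splitlines()[0]))  # first GPU only
    time.sleep(1)

print(f"min {min(samples):.1f} W  "
      f"avg {sum(samples)/len(samples):.1f} W  "
      f"max {max(samples):.1f} W")
```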

Drivers well written:
Every 2 or 3 new driver versions (minor ones), Kerbal Space Program crashes for no reason or runs into (new) memory leaks. Those two issues are clearly the driver blob’s fault.
Another issue at the moment: the driver blob spams D-Bus and generates a lot of log noise.

I can remember another bug where downclocking after moving a window took at least 30 seconds; it wasn’t fixed for 6~12 months.

What they’re doing to the devs working on nouveau, I cannot describe without insults.

Sys 2 with a Vega 8 APU (first-gen mobile Ryzen) + amdgpu:
Power consumption:
It started off badly at around 20 W for the whole system at idle, but that has decreased to 8 W with one of the new firmwares in recent months. Slow but sure improvement here.
2D desktop performance is okay: running 2 monitors on that little buddy and moving windows around while playing 4(!) videos (also 1080p 25 fps x265) is fine. Of course game performance cannot compete, but it’s an APU.
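The equivalent reading on the amdgpu side comes from the hwmon interface the kernel driver exposes. A minimal sketch, assuming card0 is the amdgpu device and the firmware provides the power1_average sensor (reported in microwatts); again this is the GPU sensor only, not the whole-system wall figures quoted above:

```python
#!/usr/bin/env python3
"""Read the power sensor amdgpu exposes through hwmon and print it
in watts. Some APUs or older firmwares may not expose this sensor."""

import glob

paths = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/power1_average")
if not paths:
    print("no power1_average sensor found under card0")
else:
    with open(paths[0]) as f:
        microwatts = int(f.read().strip())
    print(f"GPU power draw: {microwatts / 1_000_000:.1f} W")
```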

I can remember 2 critical bugs that hung the whole system when playing HOI4 or videos, but that was in the early days of amdgpu, and things have clearly improved over the years.
There was a regression last spring, but they fixed it within 3 months.

System hangs that take months to be fixed are not what I want when I pay money (too much these days) for hardware. But AMD puts effort into the open-source driver and is improving, while NV just puts effort into their blob and is not improving at all.
Also: if you want to support FOSS, you have to wallet-vote for AMD. IMO that’s the only way NV will learn that lesson. :roll_eyes:


Just asking: what do you think of the new Intel video cards (Alchemist)? Could they be a good choice for Linux, and for Manjaro in particular?

People will be able to answer the day they actually exist in consumers’ hands.

My experience with two Nvidia cards has been great so far. I would say there are issues on both sides between Nvidia and AMD, but they are too specific to paint a general picture of the situation.

Currently, Nvidia card prices are really inflated. You might find some 16-series cards at less inflated prices, but they are still abusive (and probably not a great jump from your current card).

I’m not sure about AMD prices, but they are probably inflated too.

So it is difficult right now to recommend either route, as in both directions you’ll get :point_right: :ok_hand:

If you can wait, I would recommend not buying anything right now, as the market is just horrible. Intel Arc cards should also arrive this year, so maybe we’ll be surprised by price and performance and you may get a better deal there (on paper, their high-end card should be competitive with an RTX 3070 performance-wise, I think). Or get in line in some vendor’s online queue to get one at a more reasonable price, after an undefined, long waiting time.

If you have money to waste, then get yourself a high-end Nvidia card for double to triple the price it should be sold for. Or get an overpriced AMD card; at the end of the day you should have a great experience with either brand’s latest-generation cards.

Maybe you can give more details about your specific use case; as people said, things like CUDA, Wayland, and so on can tip the balance toward one or the other.


The whole GPU price increase comes down to crazy for-profit Ethereum mining eating up too many GPUs, and the mining has been going on for a long time.
We don’t know when ETH mining will give up; once it does, GPU prices will come down.


Mining could break the gaming industry, because many gamers can’t buy expensive new GPUs for games.

Attention: many miners will be selling off lots of used GPUs, and those are probably defective.

I just replaced the fans on my 970 yesterday afternoon, so I can play games now. So AMD GPU drivers have improved on power consumption then; glad to hear that.

I’d forgotten about Intel’s upcoming Arc GPUs. Not holding my breath. I hope it won’t be another i740… That one, along with the S3 ViRGE, was quite simply bad… a complete market failure…