Not so much asking for help but offering an Experience.
Running Wayland on a 32" Samsung monitor via an Nvidia gtx1050ti graphics card. Wayland would blank out after login and on X11 I was getting screen glitches and temporary blackouts.
Then it occurred to me that I was connected via DP 1.2, so I switched to HDMI and BINGO! Wayland worked fine with no degradation in image quality.
Whether the problem was down to the card or the monitor, I don't know.
More than likely the quality of the cable played a part.
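For anyone who wants to see which output the kernel is actually driving before swapping cables, here's a minimal sketch that reads the standard sysfs connector attributes (it assumes the GPU shows up as card0, which may differ on your machine):

```python
# List DRM connectors and whether the kernel sees them as connected/enabled.
# Paths assume the GPU is card0; adjust if your system enumerates differently.
from pathlib import Path

for conn in sorted(Path("/sys/class/drm").glob("card0-*")):
    status = (conn / "status").read_text().strip()    # connected / disconnected
    enabled = (conn / "enabled").read_text().strip()  # enabled / disabled
    modes = (conn / "modes").read_text().split()
    first_mode = modes[0] if modes else "none"
    print(f"{conn.name}: {status}, {enabled}, first advertised mode {first_mode}")
```

A connector that reports connected but disabled can be a hint that the compositor has given up on that output.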
Regards.
Edit:- Renamed title to be more befitting the content.
Interesting - though I am always skeptical about cable quality…
I bought a stupidly cheap 10 metre HDMI cable, and it just works…
HDMI 2.1 offers up to 48 Gbps; DisplayPort 1.2 supports up to 4K at 60 Hz at 21.6 Gbps… but I'd have expected DisplayPort to be the more advanced option (depending on the version, for advanced features).
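As a back-of-the-envelope check (assuming a typical reduced-blanking 4K60 mode and 8-bit RGB, not the OP's exact timings), both links should have headroom for a single 4K monitor:

```python
# Rough bandwidth sanity check for 4K@60Hz over DP 1.2 vs HDMI 2.0.
# Timing figures are approximate (CVT-R2 reduced blanking assumed);
# the exact numbers depend on the mode the monitor's EDID advertises.

PIXEL_CLOCK_4K60_MHZ = 533.25   # common reduced-blanking pixel clock for 3840x2160@60
BITS_PER_PIXEL = 24             # 8 bits per channel, RGB

needed_gbps = PIXEL_CLOCK_4K60_MHZ * 1e6 * BITS_PER_PIXEL / 1e9

links = {
    # raw line rate, payload after 8b/10b encoding (both links use 8b/10b)
    "DP 1.2 (HBR2 x4)": (21.6, 21.6 * 8 / 10),
    "HDMI 2.0":         (18.0, 18.0 * 8 / 10),
}

print(f"4K@60Hz 8-bit RGB needs roughly {needed_gbps:.1f} Gbps of video payload")
for name, (raw, payload) in links.items():
    verdict = "enough" if payload >= needed_gbps else "NOT enough"
    print(f"{name}: {raw:.1f} Gbps raw, ~{payload:.1f} Gbps payload -> {verdict}")
```

So on paper the bandwidth isn't the bottleneck here, which points back at the cable, the connector, or the driver's handling of that link.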
TL;DR - maybe your connection or cable is slightly dodgy.
It's easy to get stuck if you pick a cable that doesn't match the display, and there is a lot to match (a rough way to check what the display actually advertises is sketched after this list):
all the HDMI "standards" are a pain in the ass:
HDMI 1.0
HDMI 1.1
HDMI 1.2
HDMI 1.2a
HDMI 1.3
HDMI 1.3a/b/c
HDMI 1.4
HDMI 1.4a
HDMI 1.4b
HDMI 2.0
HDMI 2.0a
HDMI 2.0b
HDMI 2.1
HDMI 2.1a
HDMI 2.1b
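If you want to see what the display itself advertises over HDMI, here is a rough sketch; the connector name card0-HDMI-A-1 is an assumption, and a naive byte search is only a heuristic, not a real EDID parser:

```python
# Heuristic check of the monitor's EDID over HDMI: scan the raw blob for the
# HDMI-LLC and HDMI Forum vendor-specific blocks. A plain byte search can in
# principle false-positive, so treat this as a rough indicator only.
from pathlib import Path

edid = Path("/sys/class/drm/card0-HDMI-A-1/edid").read_bytes()

has_hdmi_llc   = b"\x03\x0c\x00" in edid  # HDMI 1.x vendor block (OUI 00-0C-03)
has_hdmi_forum = b"\xd8\x5d\xc4" in edid  # HDMI Forum block (OUI C4-5D-D8), i.e. HDMI 2.x features

print(f"EDID length: {len(edid)} bytes")
print(f"HDMI vendor block present:      {has_hdmi_llc}")
print(f"HDMI Forum (2.x) block present: {has_hdmi_forum}")
```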
Nvidia GeForce GTX 1050 Ti 4GB | GPUSpecs.com
Connectivity:
Max HDMI resolution: 3840x2160 @ 60 Hz
Max DP resolution: 7680x4320 @ 60 Hz
DisplayPort 1.4 x1
HDMI 2.0b x1
https://www.pcmag.com/how-to/hdmi-vs-displayport-which-should-i-use-for-my-pc-monitor
If you have the choice between DisplayPort 1.4 (or 1.4a) and HDMI 2.0, DisplayPort would be the better option. In other cases, if a monitor only gives you the choice between, say, HDMI 2.0 and DisplayPort 1.2, HDMI could be the way to go for the HDR support, as long as all your devices support the HDMI version in question.