Since you have a hybrid card, I assume you are using the Bumblebee setup?
If I remember correctly, on hybrid cards the nvidia chip is hard-wired to the external output, while the intel chip drives the laptop screen. So to use the second screen, the nvidia card has to act as a sink for the primary (intel) card, and the signal is looped through it.
Bumblebee is a big hack in that case, since Nvidia does not officially support this setup. It uses VirtualGL:
In short:
"VirtualGL redirects an application’s OpenGL/GLX commands to a separate X server (that has access to a 3D graphics card), captures the rendered images, and then streams them to the X server that actually handles the application."
(VirtualGL - ArchWiki)
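This is also why Bumblebee programs have to be started through a wrapper. A rough sketch, assuming the bumblebee package is installed (primusrun comes from the separate primus backend, and glxgears/glxinfo are example programs from mesa-utils):

```shell
# Run a single program on the nvidia card; VirtualGL captures the frames
# and streams them back to the X server that drives your screens.
optirun glxgears

# primus is an alternative transport that is usually faster than VirtualGL:
primusrun glxgears

# Anything started without the wrapper stays on the intel card:
glxinfo | grep "OpenGL renderer"
```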
So the stutter comes from frame drops or added delay when the second screen's content is rendered and streamed through VirtualGL.
Use, for example, optimus to run everything on the nvidia card only. I would say this will stop the stuttering.
Or remove the proprietary nvidia driver and stay on the open source drivers (nouveau). With those you can use a real kernel-based sink (PRIME) instead of VirtualGL.
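With the open source drivers, that sink is configured with a couple of xrandr commands. A sketch only — the provider and output names below (nouveau, Intel, HDMI-1, eDP-1) are examples and differ per system, so check what `xrandr --listproviders` actually prints first:

```shell
# List the render/output providers X knows about; names depend on the drivers.
xrandr --listproviders

# Reverse PRIME: render on the intel GPU, use the nvidia card (here called
# "nouveau") only as an output sink for the hard-wired external port.
xrandr --setprovideroutputsource nouveau Intel

# After that, the external outputs appear and can be enabled as usual:
xrandr --output HDMI-1 --auto --right-of eDP-1
```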