Chromium, Signal: GPU process isn't usable. Goodbye

Hi!

In recent months I’ve had trouble running Chromium. I used to work around it with sudo downgrade chromium, but that no longer works due to dependencies. It was no big deal because I had Firefox, but now Signal has stopped working with the same error. Interestingly, VSCodium still works (also Electron, afaik).

$ mhwd -li -l
> Installed PCI configs:
--------------------------------------------------------------------------------
                  NAME               VERSION          FREEDRIVER           TYPE
--------------------------------------------------------------------------------
           video-linux            2018.05.04                true            PCI
     video-modesetting            2020.01.13                true            PCI
video-hybrid-intel-nvidia-prime            2023.03.23               false            PCI


Warning: No installed USB configs!
> 0000:01:00.0 (0300:10de:249c) Display controller nVidia Corporation:
--------------------------------------------------------------------------------
                  NAME               VERSION          FREEDRIVER           TYPE
--------------------------------------------------------------------------------
video-hybrid-intel-nvidia-prime            2023.03.23               false            PCI
video-hybrid-intel-nvidia-470xx-prime            2023.03.23               false            PCI
          video-nvidia            2023.03.23               false            PCI
    video-nvidia-470xx            2023.03.23               false            PCI
           video-linux            2018.05.04                true            PCI
     video-modesetting            2020.01.13                true            PCI
            video-vesa            2017.03.12                true            PCI


> 0000:00:02.0 (0300:8086:9a60) Display controller Intel Corporation:
--------------------------------------------------------------------------------
                  NAME               VERSION          FREEDRIVER           TYPE
--------------------------------------------------------------------------------
video-hybrid-intel-nvidia-prime            2023.03.23               false            PCI
video-hybrid-intel-nvidia-470xx-prime            2023.03.23               false            PCI
           video-linux            2018.05.04                true            PCI
     video-modesetting            2020.01.13                true            PCI
            video-vesa            2017.03.12                true            PCI
$ glxinfo | grep 'renderer string'
OpenGL renderer string: NVIDIA GeForce RTX 3080 Laptop GPU/PCIe/SSE2
$ inxi -Fazy
System:
  Kernel: 6.1.69-1-MANJARO arch: x86_64 bits: 64 compiler: gcc v: 13.2.1
    clocksource: tsc available: acpi_pm
    parameters: BOOT_IMAGE=/@/boot/vmlinuz-6.1-x86_64
    root=UUID=d7fa9c05-d68f-4aaf-94a9-??????? rw rootflags=subvol=@ quiet
    cryptdevice=UUID=b0a35949-1ae3-43bb-9d72-????????:luks-b0a35949-1ae3-43bb-9d72-????????
    root=/dev/mapper/luks-b0a35949-1ae3-43bb-9d72-???????? apparmor=1
    security=apparmor udev.log_priority=3 nvidia-drm.modeset=1
  Desktop: i3 v: 4.23 info: i3bar vt: 7 dm: 1: GDM v: 45.0.1 note: stopped
    2: LightDM v: 1.32.0 Distro: Manjaro Linux base: Arch Linux
Machine:
  Type: Desktop Mobo: ZOTAC model: ZBOX-EN173080C/EN173070C/EN153060C/QTG7A4500
    v: XX serial: <superuser required> UEFI: American Megatrends LLC. v: B453P112
    date: 04/19/2022
CPU:
  Info: model: 11th Gen Intel Core i7-11800H bits: 64 type: MT MCP
    arch: Tiger Lake gen: core 11 level: v4 note: check built: 2020
    process: Intel 10nm family: 6 model-id: 0x8D (141) stepping: 1
    microcode: 0x4E
  Topology: cpus: 1x cores: 8 tpc: 2 threads: 16 smt: enabled cache:
    L1: 640 KiB desc: d-8x48 KiB; i-8x32 KiB L2: 10 MiB desc: 8x1.2 MiB
    L3: 24 MiB desc: 1x24 MiB
  Speed (MHz): avg: 833 high: 1067 min/max: 800/4600 scaling:
    driver: intel_pstate governor: powersave cores: 1: 1066 2: 800 3: 800 4: 1067
    5: 800 6: 800 7: 800 8: 800 9: 800 10: 800 11: 800 12: 800 13: 800 14: 800
    15: 800 16: 800 bogomips: 73744
  Flags: avx avx2 ht lm nx pae sse sse2 sse3 sse4_1 sse4_2 ssse3 vmx
  Vulnerabilities:
  Type: gather_data_sampling mitigation: Microcode
  Type: itlb_multihit status: Not affected
  Type: l1tf status: Not affected
  Type: mds status: Not affected
  Type: meltdown status: Not affected
  Type: mmio_stale_data status: Not affected
  Type: retbleed status: Not affected
  Type: spec_rstack_overflow status: Not affected
  Type: spec_store_bypass mitigation: Speculative Store Bypass disabled via
    prctl
  Type: spectre_v1 mitigation: usercopy/swapgs barriers and __user pointer
    sanitization
  Type: spectre_v2 mitigation: Enhanced IBRS, IBPB: conditional, RSB
    filling, PBRSB-eIBRS: SW sequence
  Type: srbds status: Not affected
  Type: tsx_async_abort status: Not affected
Graphics:
  Device-1: Intel TigerLake-H GT1 [UHD Graphics] vendor: ZOTAC driver: i915
    v: kernel arch: Gen-12.1 process: Intel 10nm built: 2020-21 ports:
    active: none empty: HDMI-A-3 bus-ID: 00:02.0 chip-ID: 8086:9a60
    class-ID: 0300
  Device-2: NVIDIA GA104M [GeForce RTX 3080 Mobile / Max-Q 8GB/16GB]
    vendor: ZOTAC driver: nvidia v: 545.29.06 alternate: nouveau,nvidia_drm
    non-free: 545.xx+ status: current (as of 2023-11; EOL~2026-12-xx)
    arch: Ampere code: GAxxx process: TSMC n7 (7nm) built: 2020-2023 pcie:
    gen: 1 speed: 2.5 GT/s lanes: 16 link-max: gen: 4 speed: 16 GT/s ports:
    active: none off: DP-1 empty: DP-2,HDMI-A-1,HDMI-A-2 bus-ID: 01:00.0
    chip-ID: 10de:249c class-ID: 0300
  Display: x11 server: X.Org v: 21.1.10 with: Xwayland v: 23.2.3
    compositor: Picom v: git-b700a driver: X: loaded: nvidia
    gpu: nvidia,nvidia-nvswitch display-ID: :0 screens: 1
  Screen-1: 0 s-res: 3840x2160 s-dpi: 139 s-size: 702x392mm (27.64x15.43")
    s-diag: 804mm (31.65")
  Monitor-1: DP-1 mapped: DP-0 note: disabled model: ViewSonic VP3256-4K
    serial: <filter> built: 2021 res: 3840x2160 hz: 60 dpi: 140 gamma: 1.2
    size: 697x392mm (27.44x15.43") diag: 800mm (31.5") ratio: 16:9 modes:
    max: 3840x2160 min: 640x480
  API: EGL v: 1.5 hw: drv: intel iris drv: nvidia platforms: device: 0
    drv: nvidia device: 1 drv: iris device: 3 drv: swrast surfaceless:
    drv: nvidia x11: drv: nvidia inactive: gbm,wayland,device-2
  API: OpenGL v: 4.6.0 compat-v: 4.5 vendor: nvidia mesa v: 545.29.06
    glx-v: 1.4 direct-render: yes renderer: NVIDIA GeForce RTX 3080 Laptop
    GPU/PCIe/SSE2 memory: 15.62 GiB
Audio:
  Device-1: Intel Tiger Lake-H HD Audio vendor: ZOTAC driver: snd_hda_intel
    v: kernel alternate: snd_sof_pci_intel_tgl bus-ID: 00:1f.3 chip-ID: 8086:43c8
    class-ID: 0403
  Device-2: NVIDIA GA104 High Definition Audio vendor: ZOTAC
    driver: snd_hda_intel v: kernel pcie: gen: 4 speed: 16 GT/s lanes: 16
    bus-ID: 01:00.1 chip-ID: 10de:228b class-ID: 0403
  Device-3: ESS USB DAC driver: hid-generic,snd-usb-audio,usbhid type: USB
    rev: 1.0 speed: 12 Mb/s lanes: 1 mode: 1.1 bus-ID: 3-11:4 chip-ID: 0495:3011
    class-ID: 0300
  API: ALSA v: k6.1.69-1-MANJARO status: kernel-api with: aoss
    type: oss-emulator tools: alsactl,alsamixer,amixer
  Server-1: JACK v: 1.9.22 status: off tools: jack_control,qjackctl
  Server-2: PipeWire v: 1.0.0 status: active with: 1: pipewire-pulse
    status: active 2: pipewire-media-session status: active 3: pipewire-alsa
    type: plugin tools: pactl,pw-cat,pw-cli
Network:
  Device-1: Intel Wi-Fi 6 AX200 vendor: Rivet Networks Killer driver: iwlwifi
    v: kernel pcie: gen: 2 speed: 5 GT/s lanes: 1 bus-ID: af:00.0
    chip-ID: 8086:2723 class-ID: 0280
  IF: wlp175s0 state: up mac: <filter>
  Device-2: Intel Ethernet I225-V driver: igc v: kernel pcie: gen: 2
    speed: 5 GT/s lanes: 1 port: N/A bus-ID: b0:00.0 chip-ID: 8086:15f3
    class-ID: 0200
  IF: enp176s0 state: down mac: <filter>
  Device-3: Qualcomm Atheros Killer E2500 Gigabit Ethernet vendor: Dell
    driver: alx v: kernel pcie: gen: 1 speed: 2.5 GT/s lanes: 1 port: 3000
    bus-ID: b2:00.0 chip-ID: 1969:e0b1 class-ID: 0200
  IF: enp178s0 state: down mac: <filter>
  IF-ID-1: tun0 state: unknown speed: 10 Mbps duplex: full mac: N/A
Bluetooth:
  Device-1: Intel AX200 Bluetooth driver: btusb v: 0.8 type: USB rev: 2.0
    speed: 12 Mb/s lanes: 1 mode: 1.1 bus-ID: 3-14:6 chip-ID: 8087:0029
    class-ID: e001
  Report: btmgmt ID: hci0 rfk-id: 2 state: down bt-service: enabled,running
    rfk-block: hardware: no software: yes address: <filter> bt-v: 5.2 lmp-v: 11
    status: discoverable: no pairing: no
Drives:
  Local Storage: total: 1.82 TiB used: 963.32 GiB (51.7%)
  SMART Message: Unable to run smartctl. Root privileges required.
  ID-1: /dev/nvme0n1 maj-min: 259:0 vendor: Samsung model: SSD 980 PRO 2TB
    size: 1.82 TiB block-size: physical: 512 B logical: 512 B speed: 63.2 Gb/s
    lanes: 4 tech: SSD serial: <filter> fw-rev: 5B2QGXA7 temp: 41.9 C
    scheme: GPT
Partition:
  ID-1: / raw-size: 1.82 TiB size: 1.82 TiB (100.00%) used: 963.32 GiB (51.7%)
    fs: btrfs dev: /dev/dm-0 maj-min: 254:0
    mapped: luks-b0a35949-1ae3-43bb-9d72-????????
  ID-2: /boot/efi raw-size: 300 MiB size: 299.4 MiB (99.80%)
    used: 752 KiB (0.2%) fs: vfat dev: /dev/nvme0n1p1 maj-min: 259:1
  ID-3: /home raw-size: 1.82 TiB size: 1.82 TiB (100.00%)
    used: 963.32 GiB (51.7%) fs: btrfs dev: /dev/dm-0 maj-min: 254:0
    mapped: luks-b0a35949-1ae3-43bb-9d72-????????
  ID-4: /var/log raw-size: 1.82 TiB size: 1.82 TiB (100.00%)
    used: 963.32 GiB (51.7%) fs: btrfs dev: /dev/dm-0 maj-min: 254:0
    mapped: luks-b0a35949-1ae3-43bb-9d72-????????
Swap:
  Kernel: swappiness: 60 (default) cache-pressure: 100 (default) zswap: yes
    compressor: zstd max-pool: 20%
  ID-1: swap-1 type: file size: 512 MiB used: 0 KiB (0.0%) priority: -2
    file: /swap/swapfile
Sensors:
  System Temperatures: cpu: 35.0 C mobo: N/A gpu: nvidia temp: 46 C
  Fan Speeds (rpm): N/A
Info:
  Processes: 352 Uptime: 57m wakeups: 0 Memory: total: 32 GiB note: est.
  available: 31.1 GiB used: 2.23 GiB (7.2%) Init: systemd v: 254
  default: graphical tool: systemctl Compilers: gcc: 13.2.1 alt: 11/12
  clang: 16.0.6 Packages: pm: pacman pkgs: 2116 libs: 472 tools: pamac,yay
  Shell: fish v: 3.6.4 running-in: xterm inxi: 3.3.31

nvidia-settings shows NVIDIA Driver version: 545.29.06.

I tried

$ DRI_PRIME=0 chromium
$ DRI_PRIME=1 chromium
$ chromium --disable-gpu --disable-software-rasterizer
$ chromium --gpu-vendor-id=0x8086 --gpu-device-id=0x3E92
$ chromium --gpu-active-vendor-id=0x8086 --gpu-active-device-id=0x3E92
$ chromium --gpu-testing-vendor-id=0x8086 --gpu-testing-device-id=0x1912

and many others. Any suggestions? Thank you and happy holidays if you have them :slight_smile:
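In case it helps with diagnosing, Chromium’s own logging switch should print whatever the GPU process reports before it dies. Just a sketch, I’m not pasting that output here:

$ chromium --enable-logging=stderr --v=1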

P.S. Posting the actual error here:

$ chromium
[9591:9591:1227/161657.202258:ERROR:policy_logger.cc(156)] :components/enterprise/browser/controller/chrome_browser_cloud_management_controller.cc(161) Cloud management controller initialization aborted as CBCM is not enabled. Please use the `--enable-chrome-browser-cloud-management` command line flag to enable it if you are not using the official Google Chrome build.
[9591:9591:1227/161657.542585:ERROR:gpu_process_host.cc(992)] GPU process exited unexpectedly: exit_code=139
[9591:9591:1227/161657.885299:ERROR:gpu_process_host.cc(992)] GPU process exited unexpectedly: exit_code=139
[9591:9591:1227/161658.222390:ERROR:gpu_process_host.cc(992)] GPU process exited unexpectedly: exit_code=139
[9591:9591:1227/161658.602644:ERROR:gpu_process_host.cc(992)] GPU process exited unexpectedly: exit_code=139
[9591:9591:1227/161658.969727:ERROR:gpu_process_host.cc(992)] GPU process exited unexpectedly: exit_code=139
[9591:9591:1227/161659.355007:ERROR:gpu_process_host.cc(992)] GPU process exited unexpectedly: exit_code=139
[9591:9591:1227/161659.535900:ERROR:gpu_process_host.cc(992)] GPU process exited unexpectedly: exit_code=139
[9591:9591:1227/161659.748491:ERROR:gpu_process_host.cc(992)] GPU process exited unexpectedly: exit_code=139
[9591:9591:1227/161659.934367:ERROR:gpu_process_host.cc(992)] GPU process exited unexpectedly: exit_code=139
[9591:9591:1227/161659.934379:FATAL:gpu_data_manager_impl_private.cc(448)] GPU process isn't usable. Goodbye.
[1227/161659.938015:ERROR:elf_dynamic_array_reader.h(64)] tag not found
[1227/161659.938322:ERROR:elf_dynamic_array_reader.h(64)] tag not found
fish: Job 1, 'chromium' terminated by signal SIGTRAP (Trace or breakpoint trap)

Chromium version: 120.0.6099.129

Hi @43e,

Try launching it with the --no-sandbox flag:

chromium --no-sandbox

Thank you @Mirdarthos, same error.

In that case:

:man_shrugging:

Sorry.

:sob:

Edit:

According to this page:

The workaround described there, for another Electron-based app, is to use --in-process-gpu.
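Since Signal is also Electron-based, the same Chromium switch can usually be passed straight to its launcher. A sketch, assuming the binary is called signal-desktop as in the Arch/Manjaro package:

signal-desktop --in-process-gpu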

And now, I’ve gotta run! C ya!

Thank you again :slight_smile:

$ chromium --in-process-gpu
[11135:11135:1227/163416.821483:ERROR:policy_logger.cc(156)] :components/enterprise/browser/controller/chrome_browser_cloud_management_controller.cc(161) Cloud management controller initialization aborted as CBCM is not enabled. Please use the `--enable-chrome-browser-cloud-management` command line flag to enable it if you are not using the official Google Chrome build.
[1227/163417.061360:ERROR:elf_dynamic_array_reader.h(64)] tag not found
[1227/163417.061608:ERROR:elf_dynamic_array_reader.h(64)] tag not found
fish: Job 1, 'chromium --in-process-gpu' terminated by signal SIGSEGV (Address boundary error)

Maybe I should try without i3wm to see if that’s related.

Interesting that everything else works. For example, I can use Blender with CUDA or develop OpenGL programs.

I don’t suppose you have VDPAU_DRIVER=nvidia set? If so, remove it.

Thanks @Yochanan, this is the only nvidia-related env variable I see:

$ printenv | grep nvidia
LIBVA_DRIVER_NAME=nvidia
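
A quick way to test whether that variable is involved, without changing any config, would be to unset it for a single launch. Just a sketch:

$ env -u LIBVA_DRIVER_NAME chromium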

I figured it out, and the env var was indeed related. I searched for where LIBVA_DRIVER_NAME gets defined (search sketched below) and found it in

/etc/profile.d/libva.sh
/etc/profile.d/libva.csh
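
A recursive grep is one way to track that kind of thing down. A sketch; the exact files will differ per system:

$ grep -rl LIBVA_DRIVER_NAME /etc 2>/dev/null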

Then I ran pacman -Ss libva and noticed that libva-nvidia-driver was not installed. I installed it, which automatically removed libva-vdpau-driver. According to the Arch Wiki:

The code for libva-vdpau-driver has not been touched for years. It causes crash when running VLC or OBS with recent versions of the NVIDIA driver. [2] If you need a translation layer for NVIDIA drivers, use libva-nvidia-driver instead.
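
For anyone hitting the same thing, the fix boiled down to roughly these commands (pacman offers to remove the conflicting libva-vdpau-driver during the install; vainfo comes from libva-utils and is only there to verify the backend loads):

$ sudo pacman -S libva-nvidia-driver
$ vainfo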

Got my programs back :slight_smile:

