HDMI output not working (solved) #855

Closed
bammitzb opened this issue Mar 21, 2017 · 12 comments

Comments

@bammitzb

bammitzb commented Mar 21, 2017

EDIT: A better solution now seems to be nouveau+modesetting - see the comments below.

(This comes from #764 (comment) - I thought I'd open a separate issue for my problem, which has since been solved.)

I have a Tuxedo XC1706 (aka Schenker XMG P706 aka Clevo P671RG) (i7-6700HQ/GTX 970M) where the external outputs (HDMI and 2 DisplayPorts) are wired to the NVIDIA card; the Intel HD graphics only drives the laptop screen. My goal has been a laptop with good battery life, so the NVIDIA card should be off when not needed - and I found Bumblebee to be the near-perfect solution. It worked more or less, except: 1) the NVIDIA card did not turn off automatically when the optirun command had finished, and 2) the external outputs were not working.

I can live with 1) since I am able to turn the card off manually using a script. But I need 2) badly.

I finally got the HDMI output working - the trick was to add intel as a dummy inactive device in the Bumblebee xorg.conf.nvidia (the mirror image of what I do in /etc/X11/xorg.conf.d/xorg.conf, where nvidia is the inactive device):

tux_xc1706 ~ # cat /etc/bumblebee/xorg.conf.nvidia 
Section "ServerLayout"
    Identifier  "Layout0"
    Option      "AutoAddDevices" "false"
    Option      "AutoAddGPU" "false"
    Screen      0  "nvidia"
    Inactive       "intel"
EndSection

Section "Device"
    Identifier  "nvidia"
    Driver      "nvidia"
    VendorName  "NVIDIA Corporation"

#   If the X server does not automatically detect your VGA device,
#   you can manually set it here.
#   To get the BusID prop, run `lspci | egrep 'VGA|3D'` and input the data
#   as you see in the commented example.
#   This Setting may be needed in some platforms with more than one
#   nvidia card, which may confuse the proprietary driver (e.g.,
#   trying to take ownership of the wrong device). Also needed on Ubuntu 13.04.
#   BusID "PCI:01:00:0"

#   Setting ProbeAllGpus to false prevents the new proprietary driver
#   instance spawned to try to control the integrated graphics card,
#   which is already being managed outside bumblebee.
#   This option doesn't hurt and it is required on platforms running
#   more than one nvidia graphics card with the proprietary driver.
#   (E.g. Macbook Pro pre-2010 with nVidia 9400M + 9600M GT).
#   If this option is not set, the new Xorg may blacken the screen and
#   render it unusable (unless you have some way to run killall Xorg).
    Option "ProbeAllGpus" "false"

    BusID  "PCI:1:0:0"
    Option "AllowEmptyInitialConfiguration"
    Option "NoLogo" "true"
    Option "UseEDID" "true"
EndSection

Section "Device"
    Identifier     "intel"
    Driver         "dummy"
    BusID          "PCI:0:2:0"
EndSection

Section "Screen"
    Identifier     "nvidia"
    Device         "nvidia"
EndSection

This is my /etc/X11/xorg.conf.d/xorg.conf:

tux_xc1706 ~ # cat /etc/X11/xorg.conf.d/xorg.conf 
Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "intel"
    Inactive       "nvidia"
EndSection
 
Section "Monitor"
    Identifier     "Monitor0"
    Option         "DPMS"
EndSection
 
Section "Device"
    Identifier     "nvidia"
#    Driver         "nvidia"
#    Driver         "nouveau"
    Driver         "dummy"
    BusID          "PCI:1:0:0"
EndSection
 
Section "Device"
    Identifier     "intel"

#    Driver         "modesetting"

    Driver         "intel"
#    Option         "AccelMethod" "SNA"
#    Option         "AccelMethod" "UXA"
    Option         "TearFree"    "true"
    Option         "DRI" "3"

    BusID          "PCI:0:2:0"
EndSection
 
Section "Screen"
    Identifier     "nvidia"
    Device         "nvidia"
#    Monitor        "Monitor0"
#    DefaultDepth    24
#    SubSection     "Display"
#        Depth       24
#        Modes      "nvidia-auto-select"
#        Virtual     1920 1080
#    EndSubSection
EndSection
 
Section "Screen"
    Identifier     "intel"
    Device         "intel"
    Monitor        "Monitor0"
EndSection

I did edit my bumblebee.conf a bit, but I'm not sure that has anything to do with it:

tux_xc1706 ~ # cat /etc/bumblebee/bumblebee.conf 
# Configuration file for Bumblebee. Values should **not** be put between quotes

## Server options. Any change made in this section will need a server restart
# to take effect.
[bumblebeed]
# The secondary Xorg server DISPLAY number
VirtualDisplay=:8
# Should the unused Xorg server be kept running? Set this to true if waiting
# for X to be ready is too long and you don't need power management at all.
KeepUnusedXServer=false
# The name of the Bumblebee server group (GID name)
ServerGroup=bumblebee
# Card power state at exit. Set to false if the card should be ON when Bumblebee
# server exits.
TurnCardOffAtExit=false
# The default behavior of '-f' option on optirun. If set to "true", '-f' will
# be ignored.
NoEcoModeOverride=false
# The Driver used by Bumblebee server. If this value is not set (or empty),
# auto-detection is performed. The available drivers are nvidia and nouveau
# (See also the driver-specific sections below)
Driver=nvidia
# Directory with a dummy config file to pass as a -configdir to secondary X
XorgConfDir=/etc/bumblebee/xorg.conf.d

## Client options. Will take effect on the next optirun executed.
[optirun]
# Acceleration/ rendering bridge, possible values are auto, virtualgl and
# primus.
Bridge=auto
# The method used for VirtualGL to transport frames between X servers.
# Possible values are proxy, jpeg, rgb, xv and yuv.
VGLTransport=proxy
# List of paths which are searched for the primus libGL.so.1 when using
# the primus bridge
PrimusLibraryPath=/usr/lib/primus:/usr/lib32/primus
# Should the program run under optirun even if Bumblebee server or nvidia card
# is not available?
AllowFallbackToIGC=false


# Driver-specific settings are grouped under [driver-NAME]. The sections are
# parsed if the Driver setting in [bumblebeed] is set to NAME (or if auto-
# detection resolves to NAME).
# PMMethod: method to use for saving power by disabling the nvidia card, valid
# values are: auto - automatically detect which PM method to use
#         bbswitch - new in BB 3, recommended if available
#       switcheroo - vga_switcheroo method, use at your own risk
#             none - disable PM completely
# https://github.com/Bumblebee-Project/Bumblebee/wiki/Comparison-of-PM-methods

## Section with nvidia driver specific options, only parsed if Driver=nvidia
[driver-nvidia]
# Module name to load, defaults to Driver if empty or unset
KernelDriver=nvidia
PMMethod=bbswitch
# colon-separated path to the nvidia libraries
LibraryPath=/usr/lib64/opengl/nvidia/lib:/usr/lib32/opengl/nvidia/lib:/usr/lib/opengl/nvidia/lib
# comma-separated path of the directory containing nvidia_drv.so and the
# default Xorg modules path
#XorgModulePath=/usr/lib64/opengl/nvidia/lib,/usr/lib64/opengl/nvidia/extensions,/usr/lib64/xorg/modules/drivers,/usr/lib64/xorg/modules
XorgModulePath=/usr/lib64/opengl/nvidia,/usr/lib64/xorg/modules
XorgConfFile=/etc/bumblebee/xorg.conf.nvidia

## Section with nouveau driver specific options, only parsed if Driver=nouveau
[driver-nouveau]
KernelDriver=nouveau
PMMethod=auto
XorgConfFile=/etc/bumblebee/xorg.conf.nouveau

I can now turn the nvidia card on (and enable external outputs on the nvidia card) by using this script:

tux_xc1706 ~ # cat nvidia_on.sh 
#!/bin/bash
optirun true
intel-virtual-output -f

Ctrl-C the script when no longer needed and run this script to turn off the nvidia card again:

tux_xc1706 ~ # cat nvidia_off.sh 
#!/bin/bash
rmmod nvidia_drm; rmmod nvidia_modeset; rmmod nvidia; /etc/init.d/bumblebee restart ; cat /proc/acpi/bbswitch 
@Brainiarc7

Hmm, I'm in a similar situation with an MSI GS43VR 6RE Phantom Pro with the following specs:

Processor: Intel Core i7 6700HQ
RAM: 32 GB DDR4-2400 SODIMMs.
GPU: Nvidia GeForce GTX 1060 6GB
iGPU: Intel HD Graphics 530 (GT2)

Issue: When I connect the MSI laptop to an external monitor via HDMI, the screen flickers nonstop until the cable is unplugged.

With the mDP port on the rear, the external display only works IF it is set to duplicate (mirror) mode, which can greatly limit the resolution used (a common culprit is a projector I use frequently).

I'll try this workaround and give feedback soon.

FYI: I'm on Ubuntu 16.04 LTS.

@Brainiarc7

And it works, voila!

@Lekensteyn
Member

I have a similar laptop, the Clevo P651RA, and have the same two requirements: (1) turn off the dGPU when unused, (2) enable use of external monitors. For this I don't use Bumblebee but nouveau. Older notes are at #808 (comment), but with the latest versions of the kernel/Xorg stack it gets better (I currently use nouveau/intel as Xorg drivers instead of modesetting).

@bammitzb
Author

@Lekensteyn - I did try nouveau (see here), but xrandr --listproviders only showed the intel card - and the power use was higher even though the nvidia card was turned "off" (I could only get vga_switcheroo to show DynPwr). Is there anything specific to the nouveau setup that I might have missed?

@Lekensteyn
Member

@bammitzb "DynPwr" means that the GPU is turned ON, "DynOff" means that the power is turned off (both using runtime PM). I have not tested it with the nvidia blob driver, for listproviders to show the nouveau/modesetting device you need respective xorg modules (xf86-video-nouveau or modesetting as included with recent xorg-server).

If any program keeps /dev/dri/card1 open (Xorg, an application run through DRI_PRIME=1, etc.), then the runtime PM timeout of 5 seconds will not kick in and your GPU will not enter suspend.
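
A quick way to check this (a sketch; I'm assuming the dGPU sits at PCI 0000:01:00.0 and shows up as /dev/dri/card1, as in the configs above):

# "active" means the card is powered, "suspended" means runtime PM kicked in
cat /sys/bus/pci/devices/0000:01:00.0/power/runtime_status
# any process listed here keeps the card awake
lsof /dev/dri/card1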

@bammitzb
Author

bammitzb commented May 4, 2017

@Lekensteyn I believe I did use xf86-video-nouveau as well as the modesetting driver, but maybe I should give it another try...

@bammitzb
Author

bammitzb commented Jun 12, 2017

@Lekensteyn Finally got time to try nouveau (+modesetting) again and it now works, i.e. xrandr --listproviders shows both the intel and nvidia cards, and I can activate the nvidia HDMI and DP outputs as you suggested (reverse PRIME?) by doing this:

read intel nv <<<"$(xrandr --listproviders |  awk '/outputs:/{print $4}' | tr '\n' ' ')"
xrandr --setprovideroutputsource $nv $intel

If I don't do the above, I just see the line below in the log when plugging a monitor into the DP port - so maybe if this event were handled, the outputs could be switched on/off automatically? (A possible handler is sketched after the log line.)

Jun 12 13:00:27 tux_xc1706 root: ACPI event unhandled: video/switchmode VMOD 00000080 00000000
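
One way this might be automated (just an untested sketch on my side): let acpid react to that event and run the two xrandr lines above. The file name and the event= pattern below are guesses based on the logged string, and the called script would also need DISPLAY/XAUTHORITY exported so it can talk to the running X server:

# /etc/acpi/events/video-switchmode  (hypothetical acpid rule, untested)
# enable-nvidia-outputs.sh would contain the two xrandr commands above,
# plus exported DISPLAY/XAUTHORITY for the running X session.
event=video/switchmode VMOD.*
action=/usr/local/bin/enable-nvidia-outputs.sh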

With respect to vga_switcheroo I also get "DynOff" for the nvidia card when on battery, and "DynPwr" when on AC (EDIT: when on AC it is also possible to get DynOff - use powertop to toggle the nvidia PCI tunable from "Bad" to "Good"; see the sysfs sketch below). Initially I saw 16-18 W when on battery, but trying now gives approx. 26 W?? (EDIT: I get 16-18 W when I have pcie_port_pm=off as a kernel boot parameter.)

Nevertheless, my external outputs work better with nouveau, so I think I'll stick with nouveau and hope support just gets better over time... (EDIT: I still have a lot of issues with the external outputs - sometimes I just can't get my external display where I want it, and sometimes I end up with a black screen with just a mouse pointer on it. This was also the case when I used bumblebee, though.)
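
For reference, the powertop toggle just enables PCI runtime PM through sysfs, so the same thing can be scripted (a sketch, assuming the dGPU is at 0000:01:00.0 as in my configs):

# Same effect as flipping the powertop "Runtime PM for PCI Device" tunable to "Good"
echo auto > /sys/bus/pci/devices/0000:01:00.0/power/control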

I tried to run without any xorg.conf, but that did not work - I had to have the following xorg.conf to get nouveau+modesetting to work:

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "modesetting"
    Inactive       "nouveau"
EndSection
 
Section "Monitor"
    Identifier     "Monitor0"
    Option         "DPMS"
EndSection
 
Section "Device"
    Identifier     "nouveau"
    Driver         "nouveau"
    BusID          "PCI:1:0:0"
EndSection
 
Section "Device"
    Identifier     "modesetting"
    Driver         "modesetting"
    BusID          "PCI:0:2:0"
EndSection
 
Section "Screen"
    Identifier     "nouveau"
    Device         "nouveau"
EndSection
 
Section "Screen"
    Identifier     "modesetting"
    Device         "modesetting"
    Monitor        "Monitor0"
EndSection

@jtreminio

jtreminio commented Jun 6, 2019

@bammitzb Yours are the only instructions I have found that get my external USB-C/DisplayPort monitor working on my Lenovo P72.

Unfortunately, the response is extremely slow/laggy on the external display. I don't believe the Nvidia GPU is actually powering it, but at least I've gotten much further today than in the past weeks.

[edit] A little more googling brought me to https://bugs.freedesktop.org/show_bug.cgi?id=96820

Per the comments there, replacing Option "DRI" "3" with Option "DRI" "2" in /etc/X11/xorg.conf.d/xorg.conf (line 29 of the file quoted above) and rebooting completely removes the lag.

If you don't want to change the DRI setting, simply mirroring the displays between the laptop and the external monitor also eliminates the lag.
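
To be explicit, the intel Device section from the xorg.conf above then ends up looking roughly like this (only the DRI option changes; the commented-out lines are omitted here):

Section "Device"
    Identifier     "intel"
    Driver         "intel"
    Option         "TearFree"    "true"
    Option         "DRI" "2"
    BusID          "PCI:0:2:0"
EndSection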

For reference, my Lenovo P72 is running

  • Fedora 30
  • Kernel 5.1.6-300.fc30.x86_64
  • Cinnamon 4.0.10
  • Nvidia Quadro P5200 Mobile

@bammitzb
Author

bammitzb commented Jun 7, 2019

@jtreminio Glad to help - it also took me a long time to get mine working. These days nouveau just works when attaching an external display, and powering down also works fine now when the external screen is unplugged.

@szpak

szpak commented Jun 14, 2019

Guys, what Desktop Environment do you use?

As described in this bug report, the external monitor works only with Wayland (which conflicts with some tools I use). With LXQt and Gnome 3 (X.org) the external monitor is enabled, but no window is rendered there (only a mouse cursor).

I have a Hyperbook NH5/Clevo NH55RCQ (with a GeForce GTX 1660 Ti), which seems to have HDMI wired to the NVidia card, as mentioned in this thread. I use nouveau.

@bammitzb
Author

I am running Gentoo with KDE and the xorg.conf above (nouveau + modesetting). I use this small script to turn on the external monitor output when I need it:

tux_xc1706 ~ # cat nouveau_on.sh
#!/bin/bash
read intel nv <<<"$(xrandr --listproviders | awk '/outputs:/{print $4}' | tr '\n' ' ')"
xrandr --setprovideroutputsource $nv $intel
sleep 1
#Use this to configure external display (KDE system settings does not work very well):
#xrandr --output DP-1-1 --auto --right-of eDP-1
xrandr --output DP-1-1 --mode 1920x1080 --right-of eDP-1
xrandr --output HDMI-1-1 --mode 1920x1080 --right-of eDP-1

These days that just works! I used to have a script to turn off external monitor support as well, mainly to power the card down when on battery - but that also seems to work nicely now if I just unplug the external monitor (17-18 W when on battery after unplugging the monitor). With the monitor plugged in I'm around 26-30 W.

@szpak

szpak commented Jun 16, 2019

Thanks @bammitzb. Yes, the NVidia card seems to be turned on automatically after the monitor is connected and is detected by Gnome 3 (with LXQt I need to use xrandr --setprovideroutputsource), and it is turned off a few seconds after disconnection. However, the external screen stays black (with the mouse cursor rendered properly). I cannot use nouveau in Xorg directly, as my graphics card is not (yet) supported; at the kernel level I can override its id to use modesetting.
I will check it out with KDE. If that does not help, then since my graphics card will (probably) only get better nouveau support some day, I will probably stick with the NVidia binary driver or Wayland, which seems to work around the issue using its own compositor.
