
Investigate functionality without login to separate tty #4

Open
Witko opened this issue May 26, 2015 · 14 comments · May be fixed by #24

@Witko
Owner

Witko commented May 26, 2015

Logging in to a separate tty is a bit troublesome. Investigate possibilities to spawn a new X server directly from the active session.

@Fincer

Fincer commented Nov 3, 2015

Yeah, I find this script a bit troublesome as well. Or, at the least, add support for Xephyr, a nested X server. In its current state, the nvidia-xrun script is not very practical for daily use because it requires a separate tty that is totally separated from your current X session. However, nvidia-xrun has great potential, and the performance boost compared to a pure Bumblebee solution is tremendous.

For basic info about Xephyr, see:

https://en.wikipedia.org/wiki/Xephyr
https://wiki.archlinux.org/index.php/Xephyr

@Witko
Owner Author

Witko commented Nov 3, 2015

Thanks, mate! I will definitely have a look into Xephyr. But from a brief look I see that hardware acceleration is only available in a forked version. It also might not be possible to use a different video card than the one driving the "hosting" X server. But I will definitely take a deeper look. Thanks!

@Fincer

Fincer commented Nov 3, 2015

No problem!

Hmm... unfortunate to hear that. So it might be a bit challenging to get nvidia-xrun to work that way. At least I hope you give Xephyr a shot and can figure this issue out sooner or later. It would significantly boost the usability of your script for daily use.

The lack of hardware acceleration support in the original Xephyr is something that shouldn't be there. Seriously. It can be a major obstacle to getting rid of the separate tty session that nvidia-xrun (and Optimus setups overall) currently requires.

However...

Not sure if this is any help for you, but anyway: I've used Xephyr + optirun with the following script:

DISPLAY=:0 Xephyr +extension GLX -br -ac -screen 1280x960 :1 &
export DISPLAY=:1
optirun "$HOME/.nwn/nwn"

The script launches a new Xephyr window (display :1) at 1280x960 resolution and runs the Neverwinter Nights executable (nwn) inside that window via the optirun command.

@Fincer

Fincer commented Nov 3, 2015

All right, I have a crude workaround for this issue. I'm using KDE (Qt4). Simply switching to another tty session and executing

nvidia-xrun startkde

gives me a secondary KDE session (desktop) that uses the discrete Nvidia card for any program I run there. When I don't need the Nvidia card anymore, I simply log out of that KDE session and switch back to the primary tty session, which uses the Intel card instead.

If you use a different desktop environment, use the corresponding command to launch it in your secondary tty session.

XFCE: nvidia-xrun startxfce4
LXDE: nvidia-xrun startlxde
Gnome: nvidia-xrun gnome-session

etc.

This is still not a great or user-friendly solution, but at least it's better than nothing at all.
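The per-desktop commands above can be wrapped in a small helper so the right session command is picked automatically. This is only a sketch; the `session_cmd` helper and the DE names it accepts are my own, not part of nvidia-xrun:

```shell
#!/bin/sh
# Hypothetical helper: map a desktop environment name to the session
# command listed above, ready to hand to nvidia-xrun from a spare tty.
session_cmd() {
    case "$1" in
        kde)   echo "startkde" ;;
        xfce)  echo "startxfce4" ;;
        lxde)  echo "startlxde" ;;
        gnome) echo "gnome-session" ;;
        *)     echo "unknown desktop environment: $1" >&2; return 1 ;;
    esac
}

# From a spare tty you could then run, for example:
#   nvidia-xrun "$(session_cmd xfce)"
```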


And if anyone reads this post, stay tuned! Valuable information ahead.

The nvidia-xrun script gives a significant boost to any program using the discrete Nvidia card on an Optimus laptop.

For example, glxgears performs as follows:

with optirun/primusrun:

6798 frames in 5.0 seconds = 1359.468 FPS
6740 frames in 5.0 seconds = 1347.828 FPS
6885 frames in 5.0 seconds = 1376.830 FPS
6841 frames in 5.0 seconds = 1368.175 FPS
6813 frames in 5.0 seconds = 1362.424 FPS
6887 frames in 5.0 seconds = 1377.235 FPS
6814 frames in 5.0 seconds = 1362.665 FPS

and with nvidia-xrun:

71415 frames in 5.0 seconds = 14282.940 FPS
77207 frames in 5.0 seconds = 15441.392 FPS
77668 frames in 5.0 seconds = 15533.493 FPS
77224 frames in 5.0 seconds = 15444.704 FPS
76825 frames in 5.0 seconds = 15364.848 FPS
75690 frames in 5.0 seconds = 15137.942 FPS
76683 frames in 5.0 seconds = 15336.475 FPS

As can be seen, the average frame rate with the nvidia-xrun script is roughly 11 times higher than with optirun/primusrun, an increase of about 1015%.
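For the curious, the averages and the ratio can be checked with a few lines of shell (the FPS values are copied from the glxgears output above):

```shell
#!/bin/sh
# Average the glxgears FPS figures quoted above and compute the ratio.
optirun_avg=$(printf '%s\n' 1359.468 1347.828 1376.830 1368.175 1362.424 1377.235 1362.665 \
  | awk '{s+=$1} END {printf "%.1f", s/NR}')
xrun_avg=$(printf '%s\n' 14282.940 15441.392 15533.493 15444.704 15364.848 15137.942 15336.475 \
  | awk '{s+=$1} END {printf "%.1f", s/NR}')
ratio=$(awk -v a="$xrun_avg" -v b="$optirun_avg" 'BEGIN {printf "%.2f", a/b}')
echo "optirun avg: $optirun_avg FPS, nvidia-xrun avg: $xrun_avg FPS, ratio: ${ratio}x"
```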

I've also tested your script with Wine + Warhammer 40,000: Dawn of War II, and the performance difference is enormous. With primusrun/optirun I used the lowest settings I could and got around 10-20 FPS plus warnings about multiplayer lag. With nvidia-xrun I can play the game on high/ultra settings and still get 35-50 FPS.

I'm using an Asus N56JR laptop with an Nvidia GeForce GTX 760M card.

Before using the script, make sure you have taken care of the laptop's cooling, because heat can be a significant problem in the long run.

Thanks for the script, Witko! I hope you keep developing it because it truly has great potential.

@Witko
Owner Author

Witko commented Nov 3, 2015

Hi Fincer!
thanks for your enthusiasm! I had the same issue: I have a GTX 980M and was pretty excited to see games run fluently on ultra, yet I saw worse performance than on the integrated graphics. So I had to do something to get it working, as Steam already brings many games to Linux and I don't want to switch to Windows for them.
As for your workaround: I usually run Openbox, because I had issues running Steam directly and didn't want to spend days fixing them. This works OK (as it does for you), but it's still not as convenient as I would like. For me it was good enough, but as I can see there are more of us now :) So hopefully we will find some way to make it more user-friendly.
BTW, do you use Arch Linux?

@Fincer

Fincer commented Nov 3, 2015

No problem. Your script targets real performance issues and handles them very well. Its usability is not perfect, as discussed above, but I'm quite happy with its current state given the performance it offers for games and other graphics-intensive programs.

To be honest, your script is the only one I've found so far that targets the Optimus issue, excluding bumblebee/ironhide (both pretty much dead now, with no updates for 3-4 years) and Nvidia's official support (which is bad because, as far as I understand, it forces the user to log in/out in order to switch between cards).

I still wonder why the bumblebee devs didn't focus more on the performance issue, because it still exists and affects thousands of users at the very least. Too many users are out there without a real answer to the performance problems they face in daily computer use.

To summarise: I'm not satisfied with Nvidia's current official solution, nor with bumblebee because of its very poor performance. Your script takes care of the performance issue and, at the same time, eliminates the need to fully log in/out, as apparently required by nvidia-prime. In other words, I can use both the Nvidia and Intel cards efficiently, not in the same session, but I can still switch between the two sessions. If you can find a user-friendly solution to this usability issue (the tty session business), the script would be perfect.

As for Optimus technology overall, I've read that Wayland should put an end to the use of these hacky scripts:

https://blogs.gnome.org/uraeus/2015/08/19/fedora-workstation-next-steps-wayland-and-graphics/

However, I think that's still a few years ahead, because Wayland is still under development. And waiting a few years... nah.

As for your question: how did you know? :D Yeah, I use Arch Linux. Pretty happy with it, though setting it up is a story of its own.

@Witko
Owner Author

Witko commented Nov 3, 2015

Yep, you are right about the current state. optirun/primusrun is so slow because it starts another X server and then copies each frame back to the primary X server, as far as I know (or something similar). So it is unlikely to ever work well. I'm also looking forward to Wayland, but it seems it's going to take some more time.
As for Arch Linux: it's the only distro for which I created a package; it's in the AUR. And it seems to me that this script is hard to use without it: you need to know where to put the files, etc.

@Fincer

Fincer commented Nov 3, 2015

Yeah, that's the case. The way optirun/primusrun draws graphics to the screen has, unfortunately, major performance drawbacks, as we all know.

I installed this script from the AUR with success (yes, it works). I thought it was someone else who created that AUR package. Anyway, thumbs up for Arch Linux! One happy user here.

I'm pretty sure someone could make, for example, a Debian package as well. Still, as you are the developer, you know better than me, so I'm not going to argue about that with you... An automated, easy installation would make this script more attractive in the eyes of many users, though.

@Witko
Owner Author

Witko commented Nov 5, 2015

Hi Andennn,
please create an issue for this, and please also note there whether you use the AUR package or not.
Thanks!

@Witko Witko added this to the 0.3 milestone Dec 21, 2015
@Witko Witko self-assigned this Dec 21, 2015
@Witko Witko modified the milestones: Version 0.3, Version 0.2 Dec 21, 2015
@Witko Witko removed this from the Version 0.3 milestone Jul 19, 2016
@Witko
Owner Author

Witko commented Jul 19, 2016

It seems that Xephyr runs on top of the underlying X server in terms of graphics, so I believe it's not going to work.
Another option might be Xnest, but this also seems to run on top of the host X server.

@Witko
Owner Author

Witko commented Oct 20, 2016

I found another option:
setsid sh -c 'exec nvidia-xrun openbox-session <> /dev/tty3 >&0 2>&1'
This looks promising, but there are quite a few issues to solve:

  • if it's executed like this, it fails with no write permission on /dev/tty3
  • if sudo is used, nvidia-xrun complains about running as root
  • if these obstacles are solved, sudo complains there is no tty to read the password from

@Witko Witko added this to the Version 0.4.0 milestone Nov 21, 2016
@hlechner

hlechner commented Dec 19, 2016

Hey @Witko, thanks for looking into this issue earlier.

I've made some tests to run it directly on an already running X session through a terminal emulator.

First, to be able to run it outside a tty, you need to create or change the file /etc/X11/Xwrapper.config, adding the following lines:

allowed_users = anybody
needs_root_rights = no

and, just to test it, you need to fake a virtual console in the nvidia-xrun file (it must run on a different console than the already running X).

So you can change from:

LVT=`fgconsole`

to

LVT="8" 

However, xinit crashes when the argument -config nvidia-xorg.conf is used (only when running outside of a tty).


log:

(==) Log file: "/var/log/Xorg.1.log", Time: Mon Dec 19 01:21:36 2016
(++) Using config file: "/etc/X11/nvidia-xorg.conf"
(==) Using system config directory "/usr/share/X11/xorg.conf.d"
Xorg: privates.c:385: dixRegisterPrivateKey: Assertion `!global_keys[type].created' failed.
xinit: giving up
xinit: unable to connect to X server: Connection refused
xinit: server error

@hlechner hlechner linked a pull request Dec 20, 2016 that will close this issue
@hlechner

hlechner commented Dec 20, 2016

I have found a solution using openvt, and I've also sent you a pull request.

I hope you like it.
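The openvt approach could look roughly like this (a sketch under assumptions: openvt comes from the kbd package, -s switches to the newly allocated virtual terminal and -w waits for the command to finish; the exact invocation in the pull request may differ). Because openvt allocates a free VT itself, it sidesteps the /dev/ttyN write-permission problem described earlier:

```shell
#!/bin/sh
# Sketch only: let openvt pick a free virtual terminal, switch to it,
# and run nvidia-xrun there instead of logging in on a spare tty by hand.
# This needs root, so here we only print the command for illustration.
cmd="openvt -s -w -- nvidia-xrun openbox-session"
echo "would run (as root): $cmd"
```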

@Witko Witko modified the milestones: Version 0.4.0, Version 0.5.0 Dec 8, 2017
Witko pushed a commit that referenced this issue May 3, 2019
Update RPM to use GitHub source
@nvidiaswitch

I have made a script similar to nvidia-xrun, but without needing to change TTY. I believe my alternative fixes this problem.
