Is this really about Transformers?
When I bought this laptop, I had no idea it had this technology on board. To be honest, I didn’t know a thing about Optimus. When it arrived, I discovered that not only did I have no means of using the discrete NVIDIA graphics card, but I’d better find a way to turn it off, or the whole PC would be on fire in a matter of minutes.
Basically, I’m talking about this:
[dario@Abyss ~]$ lspci | grep -i vga
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
01:00.0 VGA compatible controller: NVIDIA Corporation GF108 [GeForce GT 540M] (rev ff)
And Wikipedia calls it “an optimization technology”. Sure, tell me about it!!
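Just to make the situation concrete, a hybrid setup like mine can be spotted mechanically from the lspci output above. Here is a small self-contained sketch with that sample output embedded; on a real machine you’d of course feed it the live lspci output instead:

```shell
# Detect an Intel+NVIDIA hybrid (Optimus-style) setup from lspci output.
# The sample below is the output shown above; on a real machine use:
#   lspci_out=$(lspci)
lspci_out=$(cat <<'EOF'
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
01:00.0 VGA compatible controller: NVIDIA Corporation GF108 [GeForce GT 540M] (rev ff)
EOF
)
if echo "$lspci_out" | grep -qi 'VGA.*Intel' && \
   echo "$lspci_out" | grep -qi 'VGA.*NVIDIA'; then
    echo "hybrid graphics detected"
else
    echo "single GPU"
fi
```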
Well, as usual, I started digging to see what I could do, and found the Bumblebee project, which looked really promising from the beginning. Unfortunately, it turned out not to be the case, at least not at the time: it was 2010, the project was still in its very early days, and it supported only Ubuntu, while I was running Fedora 16. I managed to set it up by hand, to the point where it was working, but it never became really usable.
First of all, we now have a really nice page on the official Fedora wiki about how to install and get Optimus running. I tried that way first and got almost there, but couldn’t claim 100% success. My card (which, BTW, is a NVIDIA Corporation GF108 [GeForce GT 540M]) does not seem to be all that well supported by Linux’s nouveau driver, or at least by nouveau running in an “Optimus configuration”: I couldn’t get any application to actually use the discrete card and, most importantly, I couldn’t dynamically turn the discrete card on and off, which, well, is the whole point!
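For the record, when running nouveau without Bumblebee, the stock kernel mechanism for this kind of on/off switching is vga_switcheroo. A minimal, read-only sketch of how to poke at it (it assumes debugfs is mounted in the standard place, and just reports unavailability otherwise):

```shell
# Query the kernel's vga_switcheroo interface (nouveau/i915, no Bumblebee).
# Read-only sketch: the actual power-off command is left in a comment.
SWITCH=/sys/kernel/debug/vgaswitcheroo/switch
if [ -r "$SWITCH" ]; then
    cat "$SWITCH"                      # IGD = integrated, DIS = discrete
    # To power down the inactive discrete card (as root):
    #   echo OFF > "$SWITCH"
else
    echo "vga_switcheroo not available (debugfs not mounted, or no kernel support)"
fi
```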
The solution is to use the proprietary NVIDIA drivers, and a really, really nice step-by-step tutorial on how to set them up for an Optimus environment can be found at this link. Actually, Viger from http://forums.if-not-true-then-false.com did much more than that: he has (or has found and links to) some nice RPM packages that make things unbelievably comfortable to set up. So, here’s what I did:
# ln -s /etc/modprobe.d/blacklist.conf /usr/lib/modprobe.d/
# echo "blacklist nouveau" >> /etc/modprobe.d/blacklist.conf
# dracut -f /boot/initramfs-$(uname -r).img $(uname -r)
# yum -y install http://install.linux.ncsu.edu/pub/yum/itecs/public/bumblebee/fedora18/x86_64/acpi-handle-hack-0.0.2-1.fc18.x86_64.rpm
# yum -y install http://install.linux.ncsu.edu/pub/yum/itecs/public/bumblebee/fedora18/x86_64/bbswitch-0.5.0-1.fc18.x86_64.rpm
# yum -y install http://install.linux.ncsu.edu/pub/yum/itecs/public/bumblebee/fedora18/x86_64/bumblebee-3.0.1-2.fc18.x86_64.rpm
# yum -y install http://install.linux.ncsu.edu/pub/yum/itecs/public/bumblebee/fedora18/x86_64/primus-0.0.12112012-8.fc18.x86_64.rpm
# yum install VirtualGL
# yum -y --nogpgcheck install http://install.linux.ncsu.edu/pub/yum/itecs/public/bumblebee-nonfree/fedora18/noarch/bumblebee-nonfree-release-1.0-1.noarch.rpm
# yum install bumblebee-nvidia
# reboot
And then you are done. Finished. Everything works! The super cool thing about the bumblebee-nonfree package is that it takes care entirely of installing the proprietary NVIDIA Xorg drivers in a non-standard location, as required by Optimus, which is really painful to do by hand.
[dario@Abyss ~]$ rpm -ql bumblebee-nvidia
/etc/bumblebee/bumblebee-nvidia.conf
/etc/sysconfig/nvidia
/etc/sysconfig/nvidia/NVIDIA-Linux-x86_64-310.32.run
/etc/systemd/system/bumblebee-nvidia.service
/usr/lib/nvidia-bumblebee
/usr/lib/systemd/system/bumblebee-nvidia.service
/usr/lib64/nvidia-bumblebee
/usr/lib64/nvidia-bumblebee/xorg/modules/extensions
/usr/sbin/bumblebee-nvidia
/usr/share/selinux/devel/bumblebee-nvidia.te
(where the job of extracting/compiling the driver and then moving the files into the proper directories is carried out by the bumblebee-nvidia service listed above)
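One thing worth knowing: which driver Bumblebee actually uses is controlled by /etc/bumblebee/bumblebee.conf. The exact contents vary by version, but the relevant knobs look roughly like this (excerpt from memory, so treat it as a sketch, not gospel):

```ini
# /etc/bumblebee/bumblebee.conf (excerpt, version-dependent)
[bumblebeed]
Driver=nvidia

[driver-nvidia]
KernelDriver=nvidia
PMMethod=bbswitch
```

PMMethod=bbswitch is what ties the daemon to the bbswitch module used below for powering the card off.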
Some nice info about how to use the discrete card via optirun is available, for instance, on the Arch Linux Wiki. To make sure that the card is disabled when not in use, have a look here:
[dario@Abyss ~]$ cat /proc/acpi/bbswitch
0000:01:00.0 OFF
(although, believe me, you can also tell from the fact that the fan is silent and the laptop isn’t burning like hell!)
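bbswitch also lets you flip the card by hand, by writing ON/OFF to that same /proc file. Normally bumblebeed does this for you, so this is just for poking around; the sketch below is a harmless no-op if bbswitch isn’t loaded:

```shell
# Manually toggle the discrete card through bbswitch (run as root).
BB=/proc/acpi/bbswitch
if [ -w "$BB" ]; then
    echo ON  > "$BB"; cat "$BB"    # power the card up
    echo OFF > "$BB"; cat "$BB"    # and back down
else
    echo "bbswitch not loaded (or not running as root)"
fi
```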
The last touch is this GNOME3 Extension: Bumblebee indicator.
And does it really work?
It sure does, but please, judge for yourself.
Without Optimus (i.e., running on the integrated Intel card)
[dario@Abyss ~]$ glxspheres
Polygons in scene: 62464
Visual ID of window: 0x9f
Context is Direct
OpenGL Renderer: Mesa DRI Intel(R) Sandybridge Mobile
55.670883 frames/sec - 54.525288 Mpixels/sec
39.869175 frames/sec - 39.048747 Mpixels/sec
40.019250 frames/sec - 39.195734 Mpixels/sec
...
With Optimus (i.e., running on the discrete NVIDIA card)
[dario@Abyss ~]$ optirun glxspheres
Polygons in scene: 62464
Visual ID of window: 0x21
Context is Direct
OpenGL Renderer: GeForce GT 540M/PCIe/SSE2
99.215315 frames/sec - 97.173662 Mpixels/sec
109.804884 frames/sec - 107.545319 Mpixels/sec
109.287161 frames/sec - 107.038250 Mpixels/sec
...

[dario@Abyss ~]$ optirun -c yuv glxspheres
Polygons in scene: 62464
Visual ID of window: 0x21
Context is Direct
OpenGL Renderer: GeForce GT 540M/PCIe/SSE2
115.854143 frames/sec - 113.470097 Mpixels/sec
118.528492 frames/sec - 116.089413 Mpixels/sec
119.482522 frames/sec - 117.023811 Mpixels/sec
...

[dario@Abyss ~]$ vblank_mode=0 primusrun glxspheres
Polygons in scene: 62464
ATTENTION: default value of option vblank_mode overridden by environment.
ATTENTION: default value of option vblank_mode overridden by environment.
Visual ID of window: 0x9f
ATTENTION: default value of option vblank_mode overridden by environment.
ATTENTION: default value of option vblank_mode overridden by environment.
Context is Direct
OpenGL Renderer: GeForce GT 540M/PCIe/SSE2
primus: sorry, not implemented: glXUseXFont
270.721424 frames/sec - 265.150518 Mpixels/sec
275.764986 frames/sec - 270.090294 Mpixels/sec
276.292697 frames/sec - 270.607146 Mpixels/sec
...
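A quick back-of-the-envelope on the steady-state numbers above (roughly 40 fps on the Intel card, ~109 fps with plain optirun, ~276 fps with primusrun and vblank disabled):

```shell
# Speedup of the discrete card vs. the integrated one, from the runs above.
intel=40.0; opti=109.3; primus=276.3
awk -v a="$intel" -v b="$opti" -v c="$primus" \
    'BEGIN { printf "optirun: %.1fx  primusrun: %.1fx\n", b/a, c/a }'
```

Not a rigorous benchmark, of course; it is just the same numbers restated, but a 2.7x to almost 7x jump speaks for itself.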
Wasn’t it about overheating too?
Finally, here is what I get on a mostly idle system:
[dario@Abyss ~]$ sudo sensors
acpitz-virtual-0
Adapter: Virtual device
temp1:         +54.0°C  (crit = +100.0°C)
temp2:         +54.0°C  (crit = +100.0°C)

coretemp-isa-0000
Adapter: ISA adapter
Physical id 0: +54.0°C  (high = +86.0°C, crit = +100.0°C)
Core 0:        +54.0°C  (high = +86.0°C, crit = +100.0°C)
Core 1:        +53.0°C  (high = +86.0°C, crit = +100.0°C)
Core 2:        +53.0°C  (high = +86.0°C, crit = +100.0°C)
Core 3:        +50.0°C  (high = +86.0°C, crit = +100.0°C)
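If you want to keep an eye on this over time, the interesting number is the hottest core. A small parsing sketch, with sample lines embedded from the output above so it runs anywhere; on a real box you’d pipe sensors into the awk part instead:

```shell
# Extract the hottest core temperature from sensors-style output.
# Sample embedded for reproducibility; live version: sensors | awk '...'
sample='Core 0:        +54.0°C  (high = +86.0°C, crit = +100.0°C)
Core 1:        +53.0°C  (high = +86.0°C, crit = +100.0°C)
Core 2:        +53.0°C  (high = +86.0°C, crit = +100.0°C)
Core 3:        +50.0°C  (high = +86.0°C, crit = +100.0°C)'
echo "$sample" | awk -F'+' '/^Core/ { t = $2 + 0; if (t > max) max = t }
                            END     { printf "hottest core: %.1f C\n", max }'
```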
And, again, believe me, this is something I could never achieve before, back when I wasn’t able to turn the discrete card off properly.

UPDATE (Feb 20, 2013):
Gary (see the comments below) pointed me to the wiki page where all this effort to package Bumblebee and friends nicely and get them into Fedora lives. Check it out: it is full of much more useful and detailed information than this humble post!😛
- Bumblebee (wiki.archlinux.org)
- David Airlie Announces “Reverse Optimus” Multi-GPU (phoronix.com)
- Fedora 18 Spherical Cow brings Linux fans a taste of Cinnamon, new installer (engadget.com)