HOW TO: Install experimental X.Org and the latest graphics drivers

This topic is closed.

  • oshunluvr
    replied
    Latest update (kernel 3.5.0-10) breaks NFS. Booting back to -9 solved it for now.

    UPDATE: It appears the problem I was having was that my server uses NFSv3, and the latest update moved fstab mounting to NFSv4 or somehow changed the NFS setup. Adding "vers=3" to the fstab entry fixed it.
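    For illustration, a hypothetical fstab entry that forces NFSv3 (the server name and paths below are placeholders, not my actual share):

    Code:
    # /etc/fstab -- force NFSv3 for this share (example hostname and paths)
    server:/export/data   /mnt/data   nfs   vers=3   0   0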
    Last edited by oshunluvr; Aug 20, 2012, 06:00 PM.



  • SteveRiley
    replied
    Ubuntu's environment variable help: https://help.ubuntu.com/community/EnvironmentVariables
    KDE's environment variable help: http://userbase.kde.org/Session_Envi...t_Variables/en
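    A minimal sketch of the system-wide approach those pages describe, assuming LIBGL_ALWAYS_SOFTWARE is the variable you want set at login (log out and back in for it to take effect):

    Code:
    # /etc/environment -- read at login for every session; not a shell script, so no "export"
    LIBGL_ALWAYS_SOFTWARE=1

    # or, per user, add this line to ~/.profile instead:
    export LIBGL_ALWAYS_SOFTWARE=1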



  • vinnywright
    replied
    Originally posted by SteveRiley

    If you configure that environment variable at boot, I'd imagine that all OpenGL would use LLVM.
    And how would one go about that?
    Currently I put a shell script in startup and that does not seem to do it... nor does adding it to the grub boot line.

    VINNY



  • SteveRiley
    replied
    You could "trick" glxinfo into spitting out some details while avoiding long lists of extensions and visuals:

    Code:
    steve@x1:~$ glxinfo | grep ': '
    name of display: :0
    display: :0  screen: 0
    direct rendering: Yes
    server glx vendor string: SGI
    server glx version string: 1.4
    client glx vendor string: Mesa Project and SGI
    client glx version string: 1.4
    GLX version: 1.4
    OpenGL vendor string: Intel Open Source Technology Center
    OpenGL renderer string: Mesa DRI Intel(R) Sandybridge Mobile 
    OpenGL version string: 3.0 Mesa 8.1-devel
    OpenGL shading language version string: 1.30
    
    steve@x1:~$ LIBGL_ALWAYS_SOFTWARE=1 glxinfo | grep ': '
    name of display: :0
    display: :0  screen: 0
    direct rendering: Yes
    server glx vendor string: SGI
    server glx version string: 1.4
    client glx vendor string: Mesa Project and SGI
    client glx version string: 1.4
    GLX version: 1.4
    OpenGL vendor string: VMware, Inc.
    OpenGL renderer string: Gallium 0.4 on llvmpipe (LLVM 0x301)
    OpenGL version string: 2.1 Mesa 8.1-devel
    OpenGL shading language version string: 1.20
    If you configure that environment variable at boot, I'd imagine that all OpenGL would use LLVM.



  • 67GTA
    replied
    Originally posted by vinnywright
    but how does one tell whether gallium llvmpipe is being used as the default or not?
    as (and I quote) "* gallium llvmpipe as the default software renderer rather than classic swrast"... it says it does??
    I'm curious about this too. I wanted to run some tests to see whether one performs better with OpenGL.



  • schnelle
    replied
    I found one more PPA with the latest stable releases of the drivers (not from git, only bugfix releases): https://launchpad.net/~glasen/+archive/intel-driver
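    If anyone wants to try it, the usual PPA steps should apply; the ppa: name below is inferred from the Launchpad URL, so double-check it on the PPA page first:

    Code:
    sudo add-apt-repository ppa:glasen/intel-driver
    sudo apt-get update
    sudo apt-get dist-upgrade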



  • sixonetonoffun
    replied
    Originally posted by pnunn
    Sadly, I just had to rebuild my entire system after a crash this morning, I think due to the kernel installed as part of this update not playing at all well with the nvidia-current package.
    If you decide to try again, follow the directions in Dibbles' Nvidia Proprietary Video Driver -- HOW TO regarding blacklisting and creating a new init image.

    This happened to me on my old nvidia machine. Adding nomodeset to the grub boot line allowed me to boot and make it to X and the KDE desktop. Once there, creating a new init image fixed the next boot. The directions are really detailed in Dibbles' HOW TO, so I won't try to add anything. If nothing else, try installing to a USB stick (not a persistent image) for testing.
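    For reference, the blacklist-and-new-init-image step usually amounts to something like this (a rough sketch only; the file name and options are examples, and Dibbles' HOW TO has the authoritative version):

    Code:
    # /etc/modprobe.d/blacklist-nouveau.conf  (example file name)
    blacklist nouveau
    options nouveau modeset=0

    # then rebuild the init image for the running kernel
    sudo update-initramfs -u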



  • vinnywright
    replied
    Well, this PPA https://launchpad.net/~oibaf/+archive/graphics-drivers did not crash my system like the Xorg-Edgers one did,

    but how does one tell whether gallium llvmpipe is being used as the default or not?
    as (and I quote) "* gallium llvmpipe as the default software renderer rather than classic swrast"... it says it does??

    I can't really tell whether anything is faster/better than usual, which was OK to start with on my Core i3's Core Processor Integrated Graphics Controller (rev 02).
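    A quick way to see which renderer is actually in use (the exact string varies by machine; compare with the glxinfo output quoted elsewhere in this thread):

    Code:
    glxinfo | grep "OpenGL renderer"
    # a hardware driver reports something like "Mesa DRI Intel(R) ...";
    # llvmpipe reports "Gallium 0.4 on llvmpipe (LLVM ...)"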

    VINNY



  • SteveRiley
    replied
    Originally posted by HalationEffect
    It's been some years since I got curious about the FPS issue and did a lot of web searching for info, but from what I recall, film and TV frames incorporate motion blur (due to the exposure time for each frame) which makes a low frame rate seem smoother than it really is. That effect, combined with persistence of vision, gets the job done at low FPS rates. However, crisp computer generated frames don't have motion blur.
    Ah, I figured it was something related to how the brain perceives motion. Thanks for the quick bit of education.



  • SteveRiley
    replied
    Originally posted by capt-zero
    Just tried out your suggestion of this PPA. It seems to have solved the problems I've been having, so far, trying to use OpenGL on this new (to me) system. Works great.
    Cool, glad it's working for you. While Xorg-Edgers appears not to work for everyone (note some of the reports in this thread reflecting that), I've found that it works reliably for me. This is definitely one of those YMMV things, I guess.



  • HalationEffect
    replied
    The big FPS numbers barely matter at all, because glxgears is not a benchmark. Don't expect FPS numbers anywhere near that high in a real OpenGL application, such as a game... expect an FPS rate one or two orders of magnitude lower.

    It's been some years since I got curious about the FPS issue and did a lot of web searching for info, but from what I recall, film and TV frames incorporate motion blur (due to the exposure time for each frame) which makes a low frame rate seem smoother than it really is. That effect, combined with persistence of vision, gets the job done at low FPS rates. However, crisp computer generated frames don't have motion blur.



  • capt-zero
    replied
    Steve,

    Just tried out your suggestion of this PPA. It seems to have solved the problems I've been having, so far, trying to use OpenGL on this new (to me) system. Works great.

    Thanx,
    capt-zero



  • SteveRiley
    replied
    Yeah, I'm seeing similar results when using native DRI without v-sync.

    Code:
    steve@t520:~$ vblank_mode=0 glxgears -info
    ATTENTION: default value of option vblank_mode overridden by environment.
    ATTENTION: default value of option vblank_mode overridden by environment.
    GL_RENDERER   = Mesa DRI Intel(R) Sandybridge Mobile 
    GL_VERSION    = 3.0 Mesa 8.0.2
    GL_VENDOR     = Tungsten Graphics, Inc
    GL_EXTENSIONS = <...snip...>
    17633 frames in 5.0 seconds = 3526.499 FPS
    I'm still trying to wrap my brain around whether these big FPS numbers matter. If 24 frames per second is good enough for film, and 60 is good enough for high-def, what do we need 3500 frames per second for?



  • pnunn
    replied
    Sadly, I just had to rebuild my entire system after a crash this morning, I think due to the kernel installed as part of this update not playing at all well with the nvidia-current package.

    The machine would start to boot to X and just hang with the splash screen drawing dots across the screen. I was able to log in using a text terminal, but no matter what I tried (backing out the PPA, removing the 3.5 kernel, reinstalling nvidia), I could not get X to run.

    I ended up, after about 3 hours, blowing the system away and starting again (a day to fix). So... buyer beware. Won't be doing this again in a hurry.
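    For completeness: ppa-purge is the usual way to back a PPA out cleanly. A sketch, assuming Xorg-Edgers was the PPA that had been added (no guarantee it would have rescued this particular setup):

    Code:
    sudo apt-get install ppa-purge
    # downgrades the PPA's packages back to the stock archive versions
    sudo ppa-purge ppa:xorg-edgers/ppa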

    Peter.



  • HalationEffect
    replied
    Originally posted by sixonetonoffun
    I'm beginning to think Intel Graphics don't suck!
    Agreed! My SandyBridge HD graphics... that's right, not HD2000 or 3000 or 4000, just plain HD (the lowest spec version of SandyBridge graphics) gives me this:

    Code:
    export vblank_mode=0
    glxgears -info
    ATTENTION: default value of option vblank_mode overridden by environment.
    ATTENTION: option value of option vblank_mode ignored.
    GL_RENDERER   = Mesa DRI Intel(R) Sandybridge Desktop 
    GL_VERSION    = 3.0 Mesa 8.1-devel
    GL_VENDOR     = Intel Open Source Technology Center
    GL_EXTENSIONS = <snip>
    25768 frames in 5.0 seconds = 5153.522 FPS
    26773 frames in 5.0 seconds = 5354.439 FPS
    26835 frames in 5.0 seconds = 5366.963 FPS
    (Edit) Here's what I get with Gallium / LLVMpipe:

    Code:
    LIBGL_ALWAYS_SOFTWARE=1 glxgears -info
    GL_RENDERER   = Gallium 0.4 on llvmpipe (LLVM 0x301)
    GL_VERSION    = 2.1 Mesa 8.1-devel
    GL_VENDOR     = VMware, Inc.
    GL_EXTENSIONS = <snip>
    5454 frames in 5.0 seconds = 1090.742 FPS
    5687 frames in 5.0 seconds = 1137.258 FPS
    5688 frames in 5.0 seconds = 1137.393 FPS
    With Gallium / LLVMpipe giving ~20% of the performance and only OpenGL 2.1 support... I'll stick with the non-Gallium driver for now.
    Last edited by HalationEffect; Jul 26, 2012, 06:28 AM. Reason: Added Gallium / LLVMpipe results

