Doesn't Not Compute

My log of experiences with GNU/Linux and computers in general.

Tag Archives: Arch

Aspirin for the GMA500 Headache

While I haven’t completely figured out the problems I was talking about a couple of days ago, I’ve made some progress and feel I should share the details with you and my future self — who will undoubtedly have forgotten this and had to look it up. :mrgreen:

While I still can’t get VA-API acceleration working, I did track down the reason I couldn’t get ordinary XVideo acceleration: that’s handled through the blob, which REQUIRES Xorg-server 1.6.

So, I installed Xorg 1.6, following the directions in the Arch Linux Wiki’s “Poulsbo” article. That includes installing openssl-compatibility from the AUR, as Xorg 1.6 requires a slightly older version of openssl. Afterwards, I removed, rebuilt, and reinstalled the PSB driver packages (just in case) and started up the Xserver. I then opened a terminal and used MPlayer to try playing the H.264 720p encoding of Big Buck Bunny with XVideo acceleration. (-vo xv, if you didn’t know.)
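The test itself boils down to a single command; the file name here is just a guess at what your copy of the Big Buck Bunny encode might be called, so substitute your own:

```
mplayer -vo xv big_buck_bunny_720p_h264.avi
```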

It worked perfectly.

I have no clue if XVideo offloads the decoding to the SGX 535 chip the GMA500 hardware incorporates, like VA-API supposedly does with the right driver, but the playback is certainly a lot smoother than plain X11 rendering.

As for why I can’t get the VA-API acceleration to work, even after copying the blob to where it’s expected to be, it seems to be because the modern libva used in Arch Linux is not compatible with the older libdrm-poulsbo software. I’ve tried recompiling libva, but that doesn’t help at all — and it requires the standard, non-Poulsbo-driver-compatible libdrm just to build. I’ve tried changing the PKGBUILD to not conflict with the newer, standard libdrm, then compiling and force-installing libdrm-poulsbo, and then compiling and installing libva; when I tried to start Xorg up, though, the conflicting libdrm versions crashed Xorg. As expected.

So, the headaches continue, but are at least partially dealt with. Now, not only do I have backlight control, native resolution, and proper font DPI, but I can even play HD movies comfortably. 😀 Now I just have to sort out that pesky, pesky libva issue — and perhaps figure out why XvMC still doesn’t work.


GMA500 Headaches on Arch

For the past several days, my free time has been occupied with only a few things: restructuring the Arch Linux wiki’s Poulsbo article, enjoying rather delicious vegan “Sloppy Joes”, and wrestling with the ancient “PSB” binary blob driver. Today I’m just dumping a few things I’ve figured out about it.

The first thing is why I can’t get accelerated video playback, via VA-API, to work. libva, which is what lets players like mplayer-vaapi reach the GMA500’s H.264 decoding hardware through the PSB driver, looks for something called “”. With the current PKGBUILDs in the Arch User Repository, this file is installed under


but libva is configured to look for it in


So, after copying that over to where it’s expected to be found, you’d expect everything to work, right? Nope.

libva: libva version 0.31.1
libva: va_getDriverName() returns 0
libva: Trying to open /usr/lib/dri/
libva: error: /usr/lib/dri/ has no function __vaDriverInit_0_31
libva: va_openDriver() returns -1
vaInitialize failed with error code -1 (unknown libva error),exit

Looks like Arch’s current version of libva is looking for something that didn’t need to exist back when the binaries were made. And, although the open-source parts of the driver have been patched to work with Xserver 1.7 and newer, this can’t be done here, because it’s a BLOB. 😡 So, no mplayer -vo vaapi -va vaapi.
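If you want to verify this yourself, nm can list the symbols a driver blob actually exports, so you can see which __vaDriverInit version it was built for. The file name below is only a placeholder (the real one is whatever libva complained about in the log above):

```
nm -D /usr/lib/dri/<driver>.so | grep __vaDriverInit
```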

Second issue is similar: I couldn’t get XV accelerated video to work. Mplayer either complains that it can’t find an XVideo-capable card or some-such, or a window is drawn for Mplayer — and the Xserver completely locks up, except for the mouse. Progress? :mrgreen:

The reason, according to the GMA500 wiki, is that that capability is handled by another blob. So, no mplayer -vo xv. 😡

The third issue is due to both blobs depending on a modified libdrm 2.3.0 — while Arch Linux’s Mesa OpenGL implementation is compiled against libdrm 2.4.something. I’ve tried compiling a similarly-patched libdrm 2.3.4 from the tarball in the Ubuntu gma500 PPA, but that didn’t create drm-psb.ko — which is rather necessary for everything to work.

So, the pros and cons of using the PSB blob with Arch Linux:

Pros:

  • Can use the current kernel (2.6.35-ARCH)
  • Can use the current Xorg-server (1.8.1.something)
  • Full, native resolution
  • Backlight control (see my how-to on this)

Cons:

  • No VA-API acceleration
  • No XVideo acceleration (and it will quite likely lock up your Xserver — you won’t even be able to switch to a terminal)
  • No OpenGL/Mesa ANYTHING.

At least with the FBDEV driver I can try some software raster whatchamacallit for OpenGL. (Yes, that is indeed a word — I misspelled it and Firefox corrected me. :P) The backlight control, however, is more important to me. And I can’t run Xorg with the FBDEV or VESA driver over a PSB-module-set framebuffer, either — it crashes with a static underline cursor with the former and distorts and repeats everything with the latter.

My next attempt will be to use the procedure normally reserved for IEGD driver experimentation to downgrade to Xorg-server 1.6, and try again with the PSB driver. 1.6 was the latest server version that worked completely, I think . . . if the video works then, I’ll try building whatever version of Mesa was current back when libdrm 2.3 was. I think that was 7.0.4, but I could be wrong, and that’s not even in the Mesa repositories anymore. I’d have to go to SourceForge for it, and would have to heavily modify the PKGBUILD. *sigh*

Or I could just use Jolicloud — but then I’d have a LOT of hacking-away to do.

Or I could use Ubuntu 10.04 — but then I’d still have a lot of hacking-away to do.

Thus the headaches.

Thank you so much, Intel, for licensing Imagination Technologies’ PowerVR SGX 535 to use in your System Controller Hub series. >_< At least Nvidia keeps their blobs current.

Uvesafb, and Possibly PCI, Problem with Linux kernel 2.6.34-ARCH

As part of a project to make my netbook what it’s designed to be — a device for internet browsing only, instead of a laptop — I’ve dd’d the latest Arch Linux ISO to a flashdrive and wiped out Ubuntu Lucid.

The post-installation process went normally — set vga=791 so I have a 1024×768 screen, configure my wireless network connection (WPA necessitates some typing 😛 ), update the package lists, upgrade pacman, then upgrade the whole system.

But when I tried to follow the procedure for getting the full 1366×768 resolution, I discovered upon reboot that, even though I remembered to remove the “vga=xxx” kernel parameter (which would otherwise let the VESA framebuffer override the new, “fancier” uvesafb), I got a message reporting “uvesafb could not reserve memory at” something-or-other.

Not knowing what could be wrong, I started reading the kernel’s dmesg log (/var/log/dmesg.log) and noticed this message:

pci_root PNP0A08:00: address space collision: host bridge window [mem 0x000c0000-0x000dffff] conflicts with reserved [mem 0x000d0000-0x000fffff]

A few lines above, I saw

PCI: Using host bridge windows from ACPI; if necessary, use “pci=nocrs” and report a bug.

I wasn’t sure if the two had anything to do with each other, but I rebooted, edited the kernel line in GRUB and added the “pci=nocrs” parameter, and continued booting. Everything worked perfectly.
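For reference, the kernel line in GRUB legacy’s menu.lst ended up looking something like the following; the kernel image and root device here are just illustrative, yours will differ:

```
kernel /boot/vmlinuz26 root=/dev/sda1 ro pci=nocrs
```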

There is an open bug about this problem in the Arch Linux system.

Acer Aspire One A0751h: Part Six

The last post in this series about my netbook was back in October of last year, and discussed how to set the font DPI, and thus the basis of the font size, in Ubuntu’s Xorg. At the time, I didn’t have Arch Linux installed on the netbook, so I didn’t realize that the instructions wouldn’t work there. Only now have I gotten back to it. 😆

It seems that since Arch doesn’t use the same startup script system (BSD style instead of the SysV version most GNU/Linux distros use), there isn’t even an /etc/init.d directory, so the file doesn’t exist. So here’s a method that goes straight to the source, and tells the Xserver the exact screen size, from which it determines the font size.

➡ You will need to be root to edit the following mentioned file.

For this machine, which has a 1366×768 native screen resolution, add this line to /etc/X11/xorg.conf.

DisplaySize 294 165
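Strictly speaking, DisplaySize belongs inside a Monitor section, so if your xorg.conf doesn’t have one yet, a minimal one would look something like this (the Identifier name is arbitrary):

```
Section "Monitor"
        Identifier "Monitor0"
        DisplaySize 294 165
EndSection
```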

Details on what these numbers mean and how to calculate them can be found in the Arch Linux Wiki in the section linked to. Don’t ask me to explain it without a copy-and-paste job, I dislike math. :mrgreen:
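For the math-averse (myself included), the relationship is simple: DPI = pixels ÷ (millimetres ÷ 25.4). A throwaway shell one-liner confirms that the 294 mm width above works out to roughly the 118 DPI this 1366-pixel-wide panel actually has:

```shell
# DPI = horizontal pixels / (physical width in mm / 25.4 mm per inch)
# 1366 px across 294 mm comes out to about 118 DPI
awk -v px=1366 -v mm=294 'BEGIN { printf "%.0f\n", px / (mm / 25.4) }'
```

Swap in your own panel’s pixel count and physical width to sanity-check your DisplaySize values before restarting X.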

The good news is, this should be a distro-agnostic way to determine the font DPI, meaning that it *should* work on Debian, Ubuntu, Fedora, Mandriva, Moblin, and pretty much any other distro. For all I know, it will probably work on ANY operating system that uses Xorg, such as the *BSDs and Solaris. But I can’t test that, so if you use a different distro or operating system, and you test it, let me know whether it works in the comments below. 🙂

A Series of Unfortunate Opportunities

As if being unable to get GNUstep, and programs based on it, to compile on my Arch Linux install wasn’t bad enough, now I cannot start most GTK+-based programs on my netbook — I get segmentation faults. Two separate attempts to post on this blog were thwarted by Firefox crashing. 😡 Anything Qt- or KDE-based works just fine though, so I’ve been stuck with those.

Additionally, FTP transfers from the netbook to a softmodded Xbox resulted in file corruption on the Xbox, requiring me to repeatedly reload everything on the Xbox from another machine. At least, until I tried doing all of the FTPing from my Debian machine. Then it worked perfectly. 😕

This isn’t necessarily a bad thing, as it’s actually forcing me to ignore my excessively-minimalistic tendencies and actually examine the functionality of the programs. And I must say, I’m more impressed than I had been before. The look of Qt/KDE apps may be more “cartoonish”, as I’ve seen some people write, but it’s more consistent and attractive than default GTK+, or many of GTK+’s themes, and with far less work involved.

I’m actually running KDE 3.5 on my Debian desktop now, with mostly all KDE and Qt apps, and enjoying it immensely. 😀

But now I must fix my netbook… Arch Linux is great, and generally very stable, but I need two machines on which I don’t need to manually configure everything. That’s usually a lot of fun, but sometimes “just working” is a necessity. 😦

So now, even though it’s (by default) a GNOME-based distribution, I’m installing Linux Mint 8 on the netbook. After all, I need SOMETHING to use up those 2GB of memory. 😉

Acer Aspire One A0751h: Part Four

Sorry for the cliffhanger-esque silence for the past week; school projects are piling up. 😦

But I promised a way to get full resolution on this machine without the Poulsbo chipset’s drivers in Arch, so here it is at last. 😀

First, you must have a kernel with uvesafb support, otherwise nothing else here will be of any use. The stock Arch Linux kernel has this support.

Second, you’ll have to install the 915resolution-static package from the AUR as described in the ArchWiki.

In /lib/initcpio/hooks/915resolution, replace the contents (as root) with this:

run_hook ()
{
    msg -n ":: Patching the VBIOS..."
    /usr/sbin/915resolution 5c 1366 768
    msg "done."
}

and save.

Third, as root, install uvesafb’s helper daemon with pacman.

# pacman -S v86d

Edit /etc/modprobe.d/uvesafb so that the “option” line reads like so:

options uvesafb mode_option=1366x768-32 scroll=ywrap

and save.

Then add 915resolution and v86d to the HOOKS line in mkinitcpio.conf, and regenerate your initcpio (e.g., mkinitcpio -p kernel26).
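My HOOKS line ended up looking roughly like the following. The surrounding hooks are just illustrative (keep whatever your install already has); the important part is that 915resolution comes before v86d, so the VBIOS is patched before uvesafb probes it:

```
HOOKS="base udev autodetect 915resolution v86d pata scsi sata filesystems"
```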

Remove any “vga=” line or similar from your bootloader’s configuration, as this overrides the uvesafb with the standard vesa framebuffer.

Reboot and enjoy the 1366×768 framebuffer, if you wish — I’m going on to the last step: getting the X server to access that hi-res glory! :mrgreen:

You will need to install xf86-video-fbdev first, then configure Xorg to use it — if you aren’t already using it. I’ll use the deprecated xorg.conf method here, as I haven’t had time to learn the new HAL-based method.

This is the relevant section of my xorg.conf:

Section "Device"
### Available Driver options are:-
### Values: <i>: integer, <f>: float, <bool>: "True"/"False",
### <string>: "String", <freq>: "<f> Hz/kHz/MHz"
### [arg]: arg optional
#Option "ShadowFB"
#Option "Rotate"
#Option "fbdev"
#Option "debug"
Identifier "Card0"
Driver "fbdev"
VendorName "Intel Corporation"
BoardName "System Controller Hub (SCH Poulsbo) Graphics Controller"
BusID "PCI:0:2:0"
Option "AccelMethod" "EXA"
EndSection

(I have no idea why specifying an acceleration method on the framebuffer would have an effect, but it does according to both glxgears and my own eyes scrolling in Firefox and watching videos.)

And that should do the job — 1366×768 Xorg graphics! 🙂 No backlight control, so you’ll have to set it with the brightness keys while the GRUB menu is still displayed, but it’s better than nothing. 😐