Appendix B. FAQ

Table of Contents

B.1. Compiling
B.2. Running
B.3. Miscellaneous

B.1. Compiling

Q: How do I build a 32-bit version of bugle on an AMD64 system?
Q: Why is a recent (>3.0) version of GCC required?
Q: Why am I getting an error like Invalid option -fdump-translation-unit?
Q: I get an error trying to move gl.c.tu, saying that the file does not exist.
Q: When I compile with GCC 4.2, compilation seems to hang on src/lib.c.
Q: I am getting the error could not find GL/glext.h.
Q: I am using the NVIDIA header files, and I am getting errors about conflicting definitions.
Q: I am using Gentoo, and during compilation of bugle I get errors similar to this: /usr/lib/opengl/nvidia/lib/libGLcore.so.1: undefined reference to `_nv000016gl'.
Q: I have an x86-64 (Athlon 64/Opteron/EM64T) machine with 64-bit OS, and compilation fails with the message /usr/lib/libGL.so: could not read symbols: File in wrong format.
Q: I am using Fedora 10, and when I run glxgears under bugle it exits with the message Error: couldn't get an RGB, Double-buffered visual.
Q: Why do I get the error *** %n in writable segment detected ***?

Q:

How do I build a 32-bit version of bugle on an AMD64 system?

A:

The exact details will depend on your OS. Here is what I do on Gentoo:

$ ./configure --x-libraries=/usr/lib32 CC="gcc -m32" CXX="g++ -m32" \
    --libdir=/usr/local/lib32 --bindir=/usr/local/bin32

Gentoo also tries to be clever by putting in a libGL.la to make linking against OpenGL work better, but in some cases it causes libtool to pick up the 64-bit version of OpenGL. I work around this by copying /usr/lib/libGL.la into the compilation directory, and replacing all references to /usr/lib with /usr/lib32.
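The copy-and-edit step might look like this (paths are examples for a Gentoo system; adjust them for yours):

```shell
# Copy the system libtool archive into the bugle build directory, then
# rewrite every /usr/lib reference to the 32-bit library directory.
cp /usr/lib/libGL.la .
sed -i 's#/usr/lib#/usr/lib32#g' libGL.la
```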

Q:

Why is a recent (>3.0) version of GCC required?

A:

Bugle is driven by a number of tables that list all the information about the various types and functions in OpenGL. To capture this information, it is necessary to parse the OpenGL header files. It would be very difficult to write a parser to do this robustly, but GCC already knows how to do this. Unfortunately, the ability to extract a parse tree from GCC (the -fdump-translation-unit option) was only added sometime after 3.0, and is also broken in 4.0 (fixed in 4.1).

Q:

Why am I getting an error like Invalid option -fdump-translation-unit?

A:

You need to use a version of GCC that supports this flag. See the previous question for details.

Q:

I get an error trying to move gl.c.tu, saying that the file does not exist.

A:

You are probably using GCC 4.0, which has broken support for -fdump-translation-unit (GCC bug 18279). Use GCC 3.x or 4.1 instead.

Q:

When I compile with GCC 4.2, compilation seems to hang on src/lib.c.

A:

There is a performance bug in GCC 4.2.2, possibly bug #30052 but possibly not. Either downgrade to 4.1, upgrade to a version in which the bug is fixed, or compile without optimisation (-O0).

Q:

I am getting the error could not find GL/glext.h.

A:

You need to have this file, as it defines extensions to OpenGL that an application might use. You can obtain the latest copy from http://www.opengl.org/registry. If you don't want to put it in your system include directories, you can create a GL subdirectory of the bugle source directory and put it there.

Q:

I am using the NVIDIA header files, and I am getting errors about conflicting definitions.

A:

Try upgrading to the latest NVIDIA drivers. The 60-series drivers seem to fix many of the problems in older header files.

Alternatively, use the headers from Mesa. You can place them into a GL subdirectory of the source if you prefer not to overwrite the system headers.

Q:

I am using Gentoo, and during compilation of bugle I get errors similar to this: /usr/lib/opengl/nvidia/lib/libGLcore.so.1: undefined reference to `_nv000016gl'.

A:

This seems to be a problem with Gentoo's system for switching between OpenGL drivers, which causes part of the build process to use the X11 software drivers. A workaround is to switch to the X11 drivers while compiling and installing bugle:

  1. Run eselect opengl set xorg-x11 (or opengl-update xorg-x11 on older systems) as root.

  2. Ensure that your environment is up to date, e.g. by running source /etc/profile in the shell you will use to compile bugle.

  3. Unpack a fresh copy of bugle (do not use a previously configured copy), compile and install it.

  4. Switch back to the NVIDIA OpenGL driver (eselect opengl set nvidia).

Q:

I have an x86-64 (Athlon 64/Opteron/EM64T) machine with 64-bit OS, and compilation fails with the message /usr/lib/libGL.so: could not read symbols: File in wrong format.

A:

This has been reported on Fedora Core 5, where libtool incorrectly tries to link against the 32-bit version of libGL.so. The workaround is to run (in the build directory)

sed -i 's#sys_lib_search_path_spec="#sys_lib_search_path_spec="/usr/lib64#' libtool

and re-run make. If this still doesn't work, see if your 64-bit libGL.so is in a different directory and use that in place of /usr/lib64 above.
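To see what the substitution does, here it is applied to a sample sys_lib_search_path_spec line (the sample value is illustrative, not taken from a real libtool script):

```shell
# Prepend /usr/lib64 to libtool's library search path.
printf 'sys_lib_search_path_spec=" /lib /usr/lib"\n' |
  sed 's#sys_lib_search_path_spec="#sys_lib_search_path_spec="/usr/lib64#'
# → sys_lib_search_path_spec="/usr/lib64 /lib /usr/lib"
```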

Q:

I am using Fedora 10, and when I run glxgears under bugle it exits with the message Error: couldn't get an RGB, Double-buffered visual.

A:

This seems to be caused by having two different versions of libGL.so on your system, with bugle picking up symbols from the wrong one. Find out which one is correct (running ldd on your program should help), then move the others out of the way.

Q:

Why do I get the error *** %n in writable segment detected ***?

A:

Most likely your distribution has crippled GCC by switching on options that improve security but are not C compliant. Try adding -U_FORTIFY_SOURCE to your CFLAGS.
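A rebuild with the workaround might look like this (the -g -O2 flags are just typical defaults, not a requirement):

```shell
# Reconfigure with _FORTIFY_SOURCE undefined, then rebuild.
./configure CFLAGS="-g -O2 -U_FORTIFY_SOURCE"
make
```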

B.2. Running

Q: What is the performance penalty?
Q: I wrote LD_PRELOAD=libbugle.so program and it failed, what's up?
Q: How do I get a stack trace when my program crashes inside the OpenGL driver?
Q: Do I need to always set BUGLE_CHAIN?
Q: Can I avoid using LD_PRELOAD?
Q: How do you get SDL-based apps to use BuGLe?
Q: How do you get Quake3 or Doom3 to use BuGLe?
Q: How do you get Qt-based apps to use BuGLe?
Q: I am using GL_EXT_framebuffer_object to render to texture, but the results do not appear in the texture viewer of gldb-gui.
Q: I'm running 64-bit bugle on a 64-bit OS with a 64-bit application, but when I try to run it, it tries to open the 32-bit version of libGL.so.
Q: I am using ATI drivers and when I run configure, it does not find glBegin in -lGL.

Q:

What is the performance penalty?

A:

That depends on what you're doing with the library. If you only use filter-sets that intercept a few functions (like showstats), then all other functions suffer only a small penalty. Using filter-sets that check for errors after every function (including showerror and the debugger) adds a much bigger penalty.

To give you an example, here are some rough frame rates for Quake3 1.32c on demo four:

  Configuration    FPS
  without bugle    683.7
  none             668.4
  logfps           661.8
  showfps          625.6
  trace            181.6
  gldb-gui         131.3

Tracing is particularly slow because it does file I/O, as well as a lot of text formatting. The overhead will also depend on the CPU/GPU balance. GPU-bound programs will be less affected by overheads.

Q:

I wrote LD_PRELOAD=libbugle.so program and it failed, what's up?

A:

You either need to specify a full path to libbugle.so, or else place it somewhere that the linker will find. On GNU systems, you can add the appropriate path to /etc/ld.so.conf.

Q:

How do I get a stack trace when my program crashes inside the OpenGL driver?

A:

Use the bugle-unwindstack(7) filter-set. See the manual page for more information.

A:

Use the bugle-log(7) filter-set, and turn on the flush option (see the example in doc/examples/filters). Then after your program crashes, the last entry in the log file will most likely be the one just before the crash.

Q:

Do I need to always set BUGLE_CHAIN?

A:

No, the first chain in the configuration file is the default.
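If you do want a chain other than the first, BUGLE_CHAIN selects it by name. The chain name and paths below are examples:

```shell
# Select the "trace" chain instead of the default (first) chain.
BUGLE_CHAIN=trace LD_PRELOAD=/path/to/libbugle.so program
```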

Q:

Can I avoid using LD_PRELOAD?

A:

Yes, you can specify -lbugle in place of -lGL when you link your program. If it is not installed in a standard directory, you may need extra compiler flags, such as -L/path/to/bugle -Wl,-rpath -Wl,/path/to/bugle.
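Putting those flags together, a link line might look like this (myapp.c and /path/to/bugle are placeholders for your own source file and install directory):

```shell
# Link directly against bugle instead of libGL, embedding an rpath so the
# runtime linker finds it without LD_PRELOAD.
cc -o myapp myapp.c -L/path/to/bugle -Wl,-rpath -Wl,/path/to/bugle -lbugle
```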

Q:

How do you get SDL-based apps to use BuGLe?

A:

On GNU systems, they should work as is. On some systems, it may be necessary to run them as

LD_PRELOAD=/path/to/libbugle.so SDL_VIDEO_GL_DRIVER=/path/to/libbugle.so program

Q:

How do you get Quake3 or Doom3 to use BuGLe?

A:

On GNU systems, they should work as is. On some systems, run them as

LD_PRELOAD=/path/to/libbugle.so quake3 +set r_glDriver /path/to/libbugle.so

Note that Quake3 and Doom3 are 32-bit binaries, so if you have an AMD64 system you will need to compile a 32-bit bugle.

Q:

How do you get Qt-based apps to use BuGLe?

A:

On GNU systems, they should work as is. There is currently no known workaround for systems that don't support dlsym with the RTLD_NEXT flag.

Q:

I am using GL_EXT_framebuffer_object to render to texture, but the results do not appear in the texture viewer of gldb-gui.

A:

This is a known bug in NVIDIA's 76.76 driver. It is fixed in the 80 series drivers. The results should appear in the framebuffer viewer.

Q:

I'm running 64-bit bugle on a 64-bit OS with a 64-bit application, but when I try to run it, it tries to open the 32-bit version of libGL.so.

A:

I haven't heard whether this actually works, but try setting LTDL_LIBRARY_PATH to the path containing your 64-bit OpenGL library (probably /usr/lib64). If you run into this problem, please let me know whether this workaround is effective.

A:

Another potential workaround is to replace libGL.so in gl.bc.in with the full path before running configure. Again, please let me know whether this is effective.

Q:

I am using ATI drivers and when I run configure, it does not find glBegin in -lGL.

A:

This was reported by one Debian user, but without further details on hardware or driver version. The driver in question appears to depend on libpthread without linking to it. While it is possible to hack around this in configure.ac, the same user reported other problems with this driver, so it is probably best to upgrade to a newer driver.

B.3. Miscellaneous

Q: Why is there a mix of C and C++ code?

Q:

Why is there a mix of C and C++ code?

A:

The library itself is written entirely in C. This is mainly because a C++ library will drag libstdc++ in with it, and this has been found to create some problems when forcing linking to an application that already depends on a conflicting version of libstdc++. There is still a dependence on libc, but libc is less of a moving target. The code generator is written in C++, since it makes the coding easier and does not introduce the dependency issues above.