
Linear subsampling with GuiBitmapCtrl ?

by Orion Elenzil · in Torque Game Engine · 11/09/2007 (11:03 am) · 7 replies

Howdy.

GuiBitmapCtrls are cool because they're okay with non-power-of-two images,
but i notice that even when using a power-of-two image, they don't seem to be mipMapping.

or perhaps more precisely, when the bitmap is shrunk, it seems to be using nearest-neighbor resampling instead of something nicer like bilinear.

eg:
elenzil.com/gg/images/minify.jpg

does anyone know how to get better quality on these ?

i've tried adding this into dglDrawBitmapStretchSR(), but no change:
glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );

tia,
orion

#1
11/09/2007 (11:19 am)
Actually there is a very, very slight difference between using GL_LINEAR and GL_NEAREST,
but you only see it if you subtract the images from each other.

the various mipmap options just hose everything.
#2
11/09/2007 (12:06 pm)
Never mind.
i just realized that when going from 1024 down to something as small as 64, linear pretty much is the same as nearest neighbor. ie, linear combines the four closest pixels, and even when reducing by only a factor of 4 per axis you're smashing 16 pixels into one, so sampling just 4 of them isn't really all that great.

what it really wants is proper mip-mapping, which seems to be a larger job w/ these bitmaps.
#3
11/09/2007 (12:57 pm)
Is the GBitmap mipping code causing the issues? The minification/magnification filtering should only affect what happens between mip-levels. It's a bit odd that, if there are mip-levels, no mipping is taking place. I haven't touched anything non-GFX-based in a very, very, very, very long time, so I don't remember exactly how the texture stuff works, but I do know that the GFX texture profile used to create GUI images specifies no mipmapping, so no mip levels are generated either by GBitmap or any auto-mipping features of the underlying API.
#4
11/09/2007 (2:29 pm)
Right, i think there are no mipmaps being generated for these textures,
which makes sense given the relaxing of the usual power-of-two requirement.

.. i'm pretty unfamiliar with that stuff as well,
so enabling it for just certain GuiBitmapCtrls daunts me.
#5
11/09/2007 (2:43 pm)
Ok, I didn't actually test/do what I am suggesting here, but I took a look at it.
mTextureHandle = TextureHandle(mBitmapName, BitmapTexture, true);
The TextureHandleType is defined in gTexManager.h; BitmapTexture has only 1 mip level. The code which controls this is in gTexManager.cc, in the TextureManager::registerTexture method.

You want to either add another enum value (warning: I don't know if this will screw stuff up) and modify the texture manager, or you can manually extrude mip levels on an image with GBitmap::extrudeMipLevels, and then you could create the texture handle with the GBitmap.

When it creates a texture of type BitmapTexture it pads the texture, if it isn't PoT size, so you can still generate mips for a non-PoT texture. The dglDrawBitmap* code takes the padding into account when it assigns texture coords, so no worries there either.
#6
11/09/2007 (2:54 pm)
Cool, thanks. i'll give it a shot.

re padding textures up to pow2, i think this is an ugly thought i've avoided thinking for some time.
so we're eating texture memory for all the padded guiBitmapCtrl images ?
#7
11/09/2007 (3:10 pm)
Well not exactly, the 'BitmapTexture' type does not keep the memory around after it uploads it to OpenGL. 'BitmapKeepTexture' does keep the GBitmap used to create the TextureObject around, and the pointer to the GBitmap data is stored in the 'bitmap' member variable of TextureObject, accessible via TextureHandle::getBitmap().

I think that it is reasonable to assume that any time you use a non-PoT texture, while it will only take up w * h * bpp in system memory, it's still probably getNextPow2(w) * getNextPow2(h) * bpp in video memory. I don't know this for sure, but it seems reasonable to assume that the API/hardware would favor size over speed, and it wouldn't want to adjust to non-PoT images for mip-level calculations (since these can be done with crazy bit shift stuff).

Also, random side note: It's also reasonable to assume that, on the majority of hardware, a 24-bpp texture actually takes up 32 bpp in video memory. 24-bit color formats aren't really supported much, and the format of choice is R8G8B8X8, with the 8 bits of 'X' (alpha) simply ignored.