Render to texture glitches?
by Brian Richardson · in Torque Game Engine Advanced · 11/23/2005 (2:45 pm) · 2 replies
Hey All,
I'm working on a little project in TSE and we're doing a post-process effect. I hacked up GuiTSCtrl to render the scene to a texture, then I use that texture with a shader and a full-screen quad to get it on the screen.
It's working great on my NVIDIA FX 5200 and ATI Mobility 9600, but running it on an earlier ATI card (a 9200SE, I think?) the post-process shader fails.
We narrowed it down to the size of the texture we're rendering to. Currently, the render-to-texture target is the same size as the screen. If we change it to a power-of-two texture, it begins to work better. I guess older cards don't like to sample from textures that aren't a power of two.
We first changed it to 512x512, which is smaller than the screen size. This means part of the scene gets chopped off, and it generally sucks.
So then I switched it to 1024x1024 (hardcoded for testing) and computed new texture coordinates to display only the part of the texture that actually gets rendered to (800/1024, 600/1024).
It works exactly like I had planned, except I keep getting very weird glitches: quads that look funny, or just random pixel noise. It's almost like the z-buffer isn't getting cleared or something? I played around with the clear bits with no success.
Any clues on what I'm doing wrong? Keep in mind the artifacts only seem to appear when I'm rendering to a texture larger than the screen (but my actual rendering happens within the screen bounds).
Sorry this is ramblin' ;)
bzztbomb
knowhere.net/
#2
11/24/2005 (8:48 am)
Thanks for the pointer, I'll check that out soon! I'm pretty sure it's not a hardware/renderer bug; I've tried it on two different machines/cards with the same result. It "feels" like I'm leaving garbage in the z-buffer, which is then compared against the pixels of the quad I'm putting up on the screen. The annoying thing is that I'm disabling the z-buffer at that point, so it shouldn't be an issue.
I'll check out Pix and hope it triggers a solution!