Texture binding in OpenGL acting like a color filter on every other pixel

Hi,

I was wondering if somebody has ever had the same problem I have with texture binding. I am loading a BMP file and binding it as an OpenGL texture. It worked fine with a first BMP, but now that I am trying to bind another one, the whole screen gets darkened, almost as if it were filtered through part of the texture, and I do not understand why.

For example, with an all-black BMP I get almost nothing, everything very dark. With an all-red one, everything is drawn in shades of red. With an all-white BMP, everything looks almost OK.
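My (possibly wrong) understanding of why the colors behave like that: with the fixed-function pipeline, the default texture environment mode is GL_MODULATE, so the sampled texel multiplies the current color. If texturing stayed enabled for the rest of the scene, that would match exactly what I am seeing:

	// Assumption on my side: GL_MODULATE is the default GL_TEXTURE_ENV_MODE,
	// so fragments come out roughly as C_out = C_current * C_texel.
	// A black texel would darken everything, a red one would tint it red,
	// and a white one would leave the colors unchanged.
	glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); // already the default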

It is probably related to the code that displays this second texture, or to a wrongly formatted BMP file, but I cannot find any problem (I check most BMP parameters such as bits per pixel, and the texture itself displays correctly).

Does anybody have an idea what the problem could be here? I suspect I am misusing glEnable/glDisable somewhere, but I cannot find where.

BTW, this is for displaying a font in OpenGL ES.

Here is the code that seems to cause the problem:
	static const GLfloat tfVertices[] = {
		-.5f, -.5f, .1f, // Scaling and translating is handled before
		-.5f,  .5f, .1f,
		 .5f, -.5f, .1f,
		 .5f,  .5f, .1f
	};
	GLfloat tfTextureCoords[] = {
	    // calculating position to take only the desired font in the BMP
	    // this part works fine.
	};
	glColor4ub(255, 255, 255, 255); // not sure this is necessary? It should not be a problem anyway.
	glEnableClientState(GL_VERTEX_ARRAY);
	glVertexPointer(3, GL_FLOAT, 0, tfVertices);
	glBindTexture(GL_TEXTURE_2D, m_uiTextureGLNb); // this unsigned int is the texture index in OpenGL, this works fine.
	glEnableClientState(GL_TEXTURE_COORD_ARRAY);
	glTexCoordPointer(2, GL_FLOAT, 0, tfTextureCoords);
	glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
	glDisableClientState(GL_TEXTURE_COORD_ARRAY);
	glDisableClientState(GL_VERTEX_ARRAY);
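For what it is worth, here is roughly the bracketing I would expect to need around that draw call; this is just a sketch based on my guess that leftover texture state is the issue, not something I have confirmed:

	glEnable(GL_TEXTURE_2D);                        // texturing on only for the glyph quad
	glBindTexture(GL_TEXTURE_2D, m_uiTextureGLNb);
	glColor4ub(255, 255, 255, 255);                 // white so GL_MODULATE leaves the texel color untouched
	glEnableClientState(GL_VERTEX_ARRAY);
	glEnableClientState(GL_TEXTURE_COORD_ARRAY);
	glVertexPointer(3, GL_FLOAT, 0, tfVertices);
	glTexCoordPointer(2, GL_FLOAT, 0, tfTextureCoords);
	glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
	glDisableClientState(GL_TEXTURE_COORD_ARRAY);
	glDisableClientState(GL_VERTEX_ARRAY);
	glDisable(GL_TEXTURE_2D);                       // so later draws are not modulated by this texture

If GL_TEXTURE_2D were left enabled here, everything drawn afterwards would keep sampling this texture and get modulated by it, which is what the filtering effect looks like to me.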
No clue, anybody?