Shaders running slower than Fixed Pipeline

While testing some simple examples with glDrawArrays, I noticed that switching to shaders can cut my frame rate in half (from 600 to 300). I know I am using a bunch of deprecated GLSL right now, but I don't expect that to cause such a large FPS drop. I would just blame my Intel graphics chip, but if the fixed-function pipeline (FFP) can handle it, I see no reason shaders can't.

--Windows 7 Premium 64-bit
--Intel Core i3 540
--Intel HD Graphics

// Drawing 10,000 quads as GLfloats: 4 vertices x 2 floats each, so the
// Vertices vector has 80,000 elements (no indexing yet).
// Vertices is allocated earlier in the code, before the main game loop.

glBindBuffer(GL_ARRAY_BUFFER, vbo);
glEnableClientState(GL_VERTEX_ARRAY);

glVertexPointer(2, GL_FLOAT, 0, 0);             // 2 floats (x, y) per vertex
glDrawArrays(GL_QUADS, 0, Vertices.size() / 2); // count = number of vertices

glBindBuffer(GL_ARRAY_BUFFER, 0);
glDisableClientState(GL_VERTEX_ARRAY);
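
For the shader runs the draw code is unchanged; the program is just bound beforehand (a rough sketch; shaderProgram stands in for whatever my setup code returns):

// Before the main loop; the compile/link calls are omitted here.
glUseProgram(shaderProgram);

// Per frame: identical to the fixed-pipeline version above.
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(2, GL_FLOAT, 0, 0);
glDrawArrays(GL_QUADS, 0, Vertices.size() / 2);
glDisableClientState(GL_VERTEX_ARRAY);
glBindBuffer(GL_ARRAY_BUFFER, 0);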


// Basic vertex shader
#version 120

void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}


// Basic fragment shader
#version 120

void main()
{
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}


The highest OpenGL version I can run is 2.1, hence #version 120 in the shaders.
I know this is a fairly pointless test right now, but it is still surprising to see. Is there anything obvious I am missing?
It's probably because your GPU is weak. It can probably support 2.1, but isn't really optimized for it.
But it runs perfectly fine through the FFP, which has to be doing more work than what I have there.

Just seems strange; I knew my GPU was bad, but I didn't know it was "that" bad.
I know. Your GPU may be optimized for the FFP, but at the time it was released shaders weren't widely used, so they probably didn't put as much time into them as they should have.
----
Or maybe it's something else. Can I see your full program loop?
This isn't every single function I use but it should give a good enough idea.
http://pastebin.com/mNrTFssz
Hm... why aren't you calling glLoadIdentity()?

Most likely not the cause, but just curious. I still have a strong feeling that it's the GPU. Have you tried to test the program on a different computer?
Oops, I must have removed glLoadIdentity() when cleaning up my main loop for posting. No change when adding it back.

I guess the GPU is a pretty good guess (I was hoping it wasn't, because I'm not in a position to get another one; my fault, nobody should be using the first-generation Intel HD that comes with the i3 these days). I don't currently have any other computers to test this on that would perform any better.

If you want, I can upload the .exe with/without shaders for you to confirm your theory, but I don't want to waste any more of your time (plus, no one trusts files online these days). Thanks for your advice, though.
closed account (3hM2Nwbp)
Check which OpenGL driver you're using. Microsoft's "fallback driver" (GDI) is notorious for odd performance issues. You might have to install a driver from your graphics card manufacturer to see the true numbers.

glGetString(GL_VENDOR);   // vendor string, e.g. "Microsoft Corporation" for the fallback
glGetString(GL_RENDERER); // renderer string; "GDI Generic" means the GDI fallback driver
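
To actually see them, print the returned strings, e.g. like this (a minimal sketch; it assumes a GL context is already current):

#include <cstdio>

// The strings are owned by the driver; just read and print them.
std::printf("Vendor:   %s\n", (const char*)glGetString(GL_VENDOR));
std::printf("Renderer: %s\n", (const char*)glGetString(GL_RENDERER));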


If you find that you're using Microsoft's GDI driver, you can find Intel's drivers here:

https://downloadcenter.intel.com/Detail_Desc.aspx?agr=Y&ProdId=3319&DwnldID=23377&ProductFamily=Graphics&ProductLine=Laptop+graphics+drivers&ProductProduct=2nd+Generation+Intel%C2%AE+Core%E2%84%A2+Processors+with+Intel%C2%AE+HD+Graphics+3000%2f2000&DownloadType=Drivers&OSFullname=Windows+7+%2864-bit%29*&lang=eng


...but double-check that it's correct for your system before installing it!


On second thought, use their utility to find the correct driver. It'd be much safer.
http://www.intel.com/p/en_US/support/detect?iid=dc_iduu
I actually read that somewhere before I posted here. My drivers weren't fully up to date, so I used the link you sent and now they are.

That didn't solve the problem though. Here is what OpenGL reports (which is what I expected):
http://puu.sh/6Jejf.png
Edit - Oops I typed vendor there, oh well!

So I think we are back to: the Intel HD that ships with the i3 (one of the first HD chips they made) is bad and should be avoided.
closed account (3hM2Nwbp)
If you're using Intel's driver, then my next suggestion would be to set up smart indexing. That can make a huge difference, especially if a lot of the vertex data is redundant.
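
Something along these lines (a rough sketch; ibo and the indices vector are placeholders for however you build them):

// One-time setup: store each shared corner once and upload the indices.
// 'indices' is a std::vector<GLuint> built alongside the vertices;
// 'ibo' comes from glGenBuffers, just like the vbo.
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
glBufferData(GL_ELEMENT_ARRAY_BUFFER,
             indices.size() * sizeof(GLuint), &indices[0], GL_STATIC_DRAW);

// Per frame: same vertex setup, but draw through the index buffer.
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(2, GL_FLOAT, 0, 0);
glDrawElements(GL_QUADS, (GLsizei)indices.size(), GL_UNSIGNED_INT, 0);
glDisableClientState(GL_VERTEX_ARRAY);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
glBindBuffer(GL_ARRAY_BUFFER, 0);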

* Also, are you sure you aren't re-compiling your GLSL program each frame? I've seen that happen as well.
Yeah, I figured I would have to do indexing sooner or later. I have to figure out a quick way to find which vertices share position/texture coords/etc.
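
Probably a map keyed on position would do it; a rough sketch (the key type and helper name are made up, and it assumes the GL headers are already included):

#include <map>
#include <utility>
#include <vector>

// Collapse duplicate 2D positions into shared indices. Exact float
// comparison is OK here because shared corners come from the same math.
std::map<std::pair<GLfloat, GLfloat>, GLuint> seen;
std::vector<GLfloat> uniqueVerts; // one x,y pair per unique corner
std::vector<GLuint>  indices;     // four entries per quad

GLuint indexOf(GLfloat x, GLfloat y)
{
    std::pair<GLfloat, GLfloat> key(x, y);
    std::map<std::pair<GLfloat, GLfloat>, GLuint>::iterator it = seen.find(key);
    if (it != seen.end())
        return it->second; // corner seen before: reuse its index
    GLuint idx = (GLuint)(uniqueVerts.size() / 2);
    uniqueVerts.push_back(x);
    uniqueVerts.push_back(y);
    seen[key] = idx;
    return idx;
}

// Per quad: push indexOf(...) for each of the four corners into 'indices'.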

Also, no: I set up all of the shader/glLinkProgram/glUseProgram stuff before the main loop.
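
Roughly this, before the loop (a trimmed sketch; error checking is omitted, and vsSource/fsSource are const GLchar* pointers holding the sources shown above):

GLuint vs = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(vs, 1, &vsSource, 0);
glCompileShader(vs);

GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(fs, 1, &fsSource, 0);
glCompileShader(fs);

GLuint prog = glCreateProgram();
glAttachShader(prog, vs);
glAttachShader(prog, fs);
glLinkProgram(prog);

glUseProgram(prog); // nothing shader-related runs inside the loop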
closed account (3hM2Nwbp)
The "Bad" situations like the i3 are the right ones to aim for a "good enough" threshold with. Bring the bottom up to snuff and the newer cards won't even break a sweat (unless they're using the GDI driver!).
Yeah, I realize it's good to target the lower-end graphics cards as early as possible, but with shaders taking such a performance hit it might just be a better idea to code everything with the FFP from the start.

Kind of depends on whether I need per-pixel lighting and such, I suppose.
closed account (3hM2Nwbp)
Heresy!

Try indexing it and re-checking your benchmark. I've never met anyone who would willingly go back to the FFP after working with shaders. In any case, best of luck on your project.
Thanks for all the help.

I will report back here in a day or so with indexed results. If it is still slower, I am going to come knocking on your door! (OK, maybe not.)
Try changing from
glDrawArrays(GL_QUADS, 0, Vertices.size() / 2);
into
glDrawArrays(GL_QUADS, 0, Vertices.size() / 4);
Just did a 200x200 quad test with glDrawElements. (40,000 quads, 80,000 vertices -- because of indexing, 158,404 indices).

Shaders on - 80fps
Shaders off - 200fps

I must be doing something seriously wrong if this isn't GPU related.

EssGeEich, why divide by 4? I am currently working in 2D and I only fill the buffer with x/y coordinates.
glVertexPointer should tell how many floats are in a vertex, and glDrawArrays how many vertices are in a model...
I think.

AHEM, I MEANT QUADS

Besides, Intel is known for being bad at graphics.
No OpenGL 3/4 drivers in 2014?
OK, it's my GPU by the look of it.

I just had a friend run it with a fairly old Radeon HD card, and he got roughly 900 FPS without shaders and 1000 with.

Yes, Intel has OpenGL 3.0/4.0 drivers; I just have a very bad Intel HD chip. So in general I am screwed :(.

Thanks again for all of the advice. Someone want to buy me a super-powered GPU for Christmas?~
Maybe something like the AMD Radeon HD 7870?