Switch the OpenGL code to use VBOs instead of direct mode #17
Conversation
This obviously isn't ready yet; it will need some work (and extreme amounts of testing, you can't EVER be sure that it works). My goal for now:
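The core of the change can be sketched like this: instead of issuing one immediate-mode call per bar corner, the bar geometry is built into a plain float array on the CPU and handed to the GPU in a single buffer upload. This is a hypothetical sketch, not XAVA's actual code; the function name, vertex layout, and gap handling are all my own:

```c
#include <stddef.h>

/* Writes 6 vertices (2 triangles) per bar into `out` as (x, y) pairs.
 * Bars share the full NDC width [-1, 1]; heights are normalized 0..1.
 * Returns the number of floats written. */
size_t build_bar_vertices(float *out, const float *heights,
                          int bars, float gap) {
    float slot = 2.0f / (float)bars;   /* NDC width of one bar slot */
    size_t n = 0;
    for (int i = 0; i < bars; i++) {
        float x0 = -1.0f + slot * (float)i + gap * 0.5f;
        float x1 = x0 + slot - gap;
        float y0 = -1.0f;
        float y1 = -1.0f + 2.0f * heights[i];
        /* first triangle */
        out[n++] = x0; out[n++] = y0;
        out[n++] = x1; out[n++] = y0;
        out[n++] = x1; out[n++] = y1;
        /* second triangle */
        out[n++] = x0; out[n++] = y0;
        out[n++] = x1; out[n++] = y1;
        out[n++] = x0; out[n++] = y1;
    }
    return n;
}
```

Each frame the array would then be uploaded once with `glBufferData(GL_ARRAY_BUFFER, n * sizeof(float), out, GL_STREAM_DRAW)` and drawn with `glDrawArrays(GL_TRIANGLES, 0, n / 2)`, replacing the per-vertex immediate-mode calls entirely.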
Also, doesn't my project basically become GLava after these mods (except it has Windows support)?
Yes, defining the shader files in the config would be pretty good. In the GLSL code we need to access X.A.V.A. variables, so it would be good if you could pass:
If you have another idea of variables to pass, go ahead.
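On the shader side, the passed values would land as uniforms. A sketch of what that could look like; every uniform name here is a suggestion, not an existing XAVA interface, and indexing a uniform array with a computed index may require a newer GLSL version on some drivers:

```glsl
#version 110
// Illustrative names only, not XAVA's actual API
uniform float u_time;         // seconds since startup
uniform vec2  u_resolution;   // window size in pixels
uniform int   u_bars;         // number of active bars
uniform float u_heights[64];  // normalized bar heights, 0..1

void main() {
    vec2 uv = gl_FragCoord.xy / u_resolution;
    int bar = int(uv.x * float(u_bars));
    // paint the pixel white if it lies below the bar's height
    gl_FragColor = vec4(vec3(step(uv.y, u_heights[bar])), 1.0);
}
```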
Also, while the code works, it is less efficient in terms of video card usage.
Didn't build it properly; I'm good now. No shadows though...
Did some work on it, and I figured out (somewhat) what to do. But there are so many annoying bugs to fix (e.g. shader compilation failures, since OpenGL is asynchronous as hell). Also, the way I deal with variables is backstabbing me. I'm thinking of just making all major variables global, since having 3 different variables with the same name is driving me insane (messy code, I know)... Will update as I go through this.
But I think the first order of business is to update the branch to mainline, since the removal of bar limits is causing segfaults all over the place. Thank GDB for giving me insight into my variable issue.
Also, your assumption is correct: shaders are about as expensive (in cycles) as code gets. A slight change to the shader code can take the GPU from 40 to 80 °C. But I'm thinking of doing the geometry inside a shader; that'll give more flexibility to the developer/end-user (and possibly reduce the load). Only catch: it's not EGL-friendly.
Also, those shaders (that you demonstrated) draw every single pixel, unlike typical fragment shaders, which run only on the pixels covered by the 2D raster of the final polygon.
Can I work around this by using two polygons that just cover the entire screen and use a fragment shader for the final image? (Kinda hacky, but it may work.)
Yeah, I think having 2 polygons that cover the screen is an idea worth exploring, but shader authors need an easy way to map the fragment shader to the screen.
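The two-polygon trick is just a pair of triangles spanning normalized device coordinates; once they are drawn, the rasterizer invokes the fragment shader once for every pixel on screen. A minimal sketch (the identifier name is mine):

```c
/* Two triangles covering the full screen in normalized device
 * coordinates (NDC runs from -1 to 1 on both axes). Drawing them
 * with glDrawArrays(GL_TRIANGLES, 0, 6) runs the fragment shader
 * over every screen pixel. */
static const float fullscreen_quad[12] = {
    -1.0f, -1.0f,   1.0f, -1.0f,   1.0f,  1.0f,  /* lower-right triangle */
    -1.0f, -1.0f,   1.0f,  1.0f,  -1.0f,  1.0f,  /* upper-left triangle  */
};
```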
Update: the two-polygon thing is working. I've managed to cram the entire renderer into a fragment shader that runs about 30 lines of code per pixel (not sure if that's bad or not). Compared with the legacy OpenGL code it uses 20% more GPU, but that's not so bad, as you get the complete flexibility of GLSL. However, one problem: GLSL had no way to pass a dynamically sized array as a uniform variable until OpenGL 4.0.
Also, the current shader I wrote is about 100 LoC, implements most features (80%), and works with a minimum version tag of 110, i.e. OpenGL 2.0 minimum, which is OK compatibility IMO.
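Since a uniform array must have a compile-time size in older GLSL, one common workaround is to declare the array at a fixed maximum in the shader (e.g. `uniform float u_heights[MAX_BARS];`) and upload the live element count alongside it. A CPU-side sketch under that assumption; `MAX_BARS` and all identifiers are mine, not XAVA's:

```c
#include <string.h>

#define MAX_BARS 256  /* must match the array size declared in the shader */

static float uniform_heights[MAX_BARS];

/* Clamp the bar count to the fixed array size and stage the data.
 * The actual GL upload would then be:
 *   glUniform1fv(loc_heights, MAX_BARS, uniform_heights);
 *   glUniform1i(loc_bars, bars);
 * Returns the number of bars actually staged. */
int stage_heights(const float *heights, int bars) {
    if (bars > MAX_BARS)
        bars = MAX_BARS;
    memcpy(uniform_heights, heights, (size_t)bars * sizeof(float));
    return bars;
}
```

The shader then loops or indexes only up to the uploaded count, so the fixed-size declaration never has to change with the config.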
Yup, sounds great, +20% GPU should be fine for most people. |
Review this and try to find bugs and other issues (if you're willing to do so). I just need to update the Windows code as well and do a bit of testing there before merging.
Anyway, it wasn't 20% more, it was more like triple; my benchmarking was flawed. Besides, it's only 10% of my GPU (when I disable compositing, of course). Also, the conversion between ints and floats is expensive, so maybe it's that.
Ok, I will test it ASAP, thanks!
Also, shaders do not get copied over by the install, and it has to regenerate them on Ubuntu 20.04...
Error is in
Both should be non-NULL at that point... I have no idea what might be going on.
I mean, look at the lines just before... it's just two mallocs!?
Also, it doesn't matter if the package is installed in
Gonna have to rebuild it on another PC to find out what's going on; in the meantime, do you have a Windows build?
I have to work on it, since I've changed the renderer code.
It works, but there is no way to use another shader; changing the shader variable in the config does nothing.
This would not only improve performance but would actually allow me to use GLSL shaders, among other things.