Write your own Einstein@home screensaver

Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6588
Credit: 317437463
RAC: 371175

Thanks guys

Message 78218 in response to message 78215

Thanks guys :-)

Quote:
OK, I tried the windows build and a window flashed by and disappeared.
Does it have to be run from a specific folder or do I just unpack it to its own folder and run?


Yes, just unzip it somewhere convenient and run the executable. And yes, the window just flashes by - it's precisely that behaviour which I've been unravelling. My code's call to SDL becomes an OS call to create a window AND an OpenGL context to render upon. So if the request for a context fails - the OGL/MS-Windows version issues as discussed - then it all flops during that SDL call. My debug code is executed before that window-creation request, hence these outputs. Good to see GL_ARB_compatibility here, and we're clearly getting the 32-bit behaviour on 64-bit machines ... :-)
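For anyone curious about where exactly the flop happens, here's a minimal SDL v1.2-style sketch of the call sequence involved - an illustration of the principle rather than the actual screensaver code :
[pre]#include <SDL/SDL.h>
#include <SDL/SDL_opengl.h>
#include <iostream>

int main(int argc, char* argv[]) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        std::cerr << "SDL_Init failed : " << SDL_GetError() << std::endl;
        return 1;
    }

    // The window AND its OpenGL context are both requested inside this one
    // call. If the OS/driver refuses the context, the call fails, nothing
    // is ever rendered, and the window just 'flashes by'.
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    SDL_Surface* screen = SDL_SetVideoMode(800, 600, 0, SDL_OPENGL);
    if (screen == NULL) {
        std::cerr << "SDL_SetVideoMode failed : " << SDL_GetError() << std::endl;
        SDL_Quit();
        return 1;
    }

    // Only now is there a context to interrogate.
    std::cout << "OpenGL version : "
              << reinterpret_cast<const char*>(glGetString(GL_VERSION)) << std::endl;

    SDL_Quit();
    return 0;
}[/pre]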

I have a 3-year-old Dell Inspiron with an Intel chipset, thus Intel drivers, and it only gives OpenGL v1.4 too .... perhaps they are only expecting/allowing Windows and thus DirectX alone. Hmmm ... do people have Linux on laptops ???

[ glGetStringi is the only OpenGL v3.2+ method of querying extension capability, i.e. it doesn't exist in earlier libraries and drivers. Again we meet not just deprecation but elimination of interfaces. One wonders what their post-v3.2 concept of a 'forward compatible profile' will ultimately mean in years to come .... while I'm no industry insider, I can certainly imagine that a lot of faith in the ARB has been lost in recent years. ]
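[ To show the difference in query styles, a small sketch of my own - it assumes a current context and that GLEW ( or similar ) has resolved glGetStringi : ]
[pre]#include <GL/glew.h>
#include <cstring>

// Report whether a named extension is present, using whichever
// query style the running context supports.
bool hasExtension(const char* name) {
    GLint major = 0;
    // GL_MAJOR_VERSION is only recognised by v3.0+ contexts; on older ones
    // the call errors quietly and 'major' stays 0, which is what we want here.
    glGetIntegerv(GL_MAJOR_VERSION, &major);

    if (major >= 3) {
        // Modern route : one string per extension, retrieved by index.
        GLint count = 0;
        glGetIntegerv(GL_NUM_EXTENSIONS, &count);
        for (GLint i = 0; i < count; ++i) {
            const char* ext =
                reinterpret_cast<const char*>(glGetStringi(GL_EXTENSIONS, i));
            if (ext != 0 && std::strcmp(ext, name) == 0) {
                return true;
            }
        }
        return false;
    }

    // Legacy route : one big space-separated string ( removed in later core profiles ).
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return all != 0 && std::strstr(all, name) != 0;
}[/pre]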

Cheers, Mike.

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

robertmiles
Joined: 8 Oct 09
Posts: 127
Credit: 29370881
RAC: 20156

RE: I have a 3 year old

Message 78219 in response to message 78218

Quote:

I have a 3-year-old Dell Inspiron with an Intel chipset, thus Intel drivers, and it only gives OpenGL v1.4 too .... perhaps they are only expecting/allowing Windows and thus DirectX alone.

Cheers, Mike.

I've found that some of the graphics drivers Microsoft distributes are less capable than the similar versions available from the graphics chip company, at least for Nvidia drivers.

You might check whether Intel has updated its drivers for this chipset but Microsoft or Dell has not passed the full update along.

Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6588
Credit: 317437463
RAC: 371175

RE: RE: I have a 3 year

Message 78220 in response to message 78219

Quote:
Quote:

I have a 3-year-old Dell Inspiron with an Intel chipset, thus Intel drivers, and it only gives OpenGL v1.4 too .... perhaps they are only expecting/allowing Windows and thus DirectX alone.

Cheers, Mike.

I've found that some of the graphics drivers Microsoft distributes are less capable than the similar versions available from the graphics chip company, at least for Nvidia drivers.

You might check whether Intel has updated its drivers for this chipset but Microsoft or Dell has not passed the full update along.


Thanks, I'll do that. As you suggest, the 'Dell update' procedure doesn't provide that.

Aside : the Linux build. My Ubuntu 10.4 will run the SolarSystem Linux version and yet has OpenGL v3.3 - by my reading ( of the defined standards ) this should not happen. I ought to have to ask the v3.3 context to give me a pre-v3.2 context and go from there. In particular, the glMapBuffer routine has allegedly been eliminated in that version, with glMapBufferRange the preferred future route for server-side buffer access. In the absence of another explanation ( including my own ignorance ) I'd say some 'secret non-deprecation' has been performed by the driver writers.
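Just to make the two routes concrete, here's a little sketch ( mine, not project code ) of the difference - it assumes a live context, GLEW already initialised, and an existing buffer object :
[pre]#include <GL/glew.h>
#include <cstring>

// Fill an existing buffer object of 'size' bytes with 'data', contrasting
// the legacy mapping call with the one the newer core profiles prefer.
void fillBuffer(GLuint vbo, const void* data, GLsizeiptr size, bool useLegacyMap) {
    glBindBuffer(GL_ARRAY_BUFFER, vbo);

    void* target = 0;
    if (useLegacyMap) {
        // Legacy route : removed from the v3.2+ core profile on paper,
        // yet still honoured by many drivers ( the 'secret non-deprecation' ).
        target = glMapBuffer(GL_ARRAY_BUFFER, GL_WRITE_ONLY);
    } else {
        // Preferred route : map only the range to be touched and declare intent.
        target = glMapBufferRange(GL_ARRAY_BUFFER, 0, size,
                                  GL_MAP_WRITE_BIT | GL_MAP_INVALIDATE_BUFFER_BIT);
    }

    if (target != 0) {
        std::memcpy(target, data, size);
        glUnmapBuffer(GL_ARRAY_BUFFER);
    }
}[/pre]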

[ I think I'll look into how 'state-ful' SDL v1.2/3 is, i.e. can I get away with requesting a context change - to obtain backward-compatibility profiles - outside of an SDL call without throwing its internal state into inconsistency? Either that or some 'pre-release' SDL v2.0 perhaps. ]

Cheers, Mike.

( edit ) The squirrel/skunk in all this is that you first have to have a context in order to get another one. But it is the OS that grants contexts : roughly, think of a context as a granted area of memory to which OpenGL semantics can be applied. A sort of uber-instance of a really complex uber-class. A lot of the time the memory is actually on the video card, or is mapped to it without your direct knowledge, and this is broadly one aspect of 'hardware acceleration'. On the Vista-type 'aero' desktop it's a buffer in general memory that is mixed with the contents of other similar ones - from other applications - by a final MS driver which then writes the result to screen memory. This is conceptually like a sound mixer combining different audio streams. So for such systems my attempts at on-card memory allocation ( the various buffer calls ) will be honoured in their semantics but probably won't lead to any efficiency or frame-rate gain. The best I can hope for then is that they won't make things worse ....


Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6588
Credit: 317437463
RAC: 371175

Investigations

Message 78221 in response to message 78220

Investigations :

Quote:
I think I'll look into how 'state-ful' SDL v1.2/3 is, i.e. can I get away with requesting a context change - to obtain backward-compatibility profiles - outside of an SDL call without throwing its internal state into inconsistency?


After some experimentation I'd say ... NO.

I can, and have, successfully coded the switch of an OpenGL context from some v3.2+ version to, say, v2.1 ( Windows calls mentioned here, but it's a similar idea for glx; a rough code sketch appears below ):

- check that OpenGL version is 3.2+ [ glGetString ]
- check that all required extension functions are available [ glGetStringi plus wglGetProcAddress ]
- check that an OpenGL context has actually been acquired [ WindowManager::initialize ]
- obtain the current device context ( HDC ) [ wglGetCurrentDC ]
- obtain the current OpenGL rendering context ( HGLRC ) [ wglGetCurrentContext ]
- create/initialise an array of attributes for wglCreateContextAttribsARB use
- create a new OpenGL rendering context from the current one using the HDC and HGLRC [ wglCreateContextAttribsARB ]
- make the newly created OpenGL rendering context the current one [ wglMakeCurrent ]
- delete the previous OpenGL rendering context [ wglDeleteContext ]

but when I use glMapBuffer I get a NULL return. I think the most likely reason for this is internal to SDL, i.e. whether a software or a hardware buffer was obtained by the original call to SDL_SetVideoMode ....
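For the record, those steps boil down to roughly the following - heavily trimmed, with barely any error handling, and assuming GLEW/WGLEW has already been initialised so that wglCreateContextAttribsARB is resolved :
[pre]#include <windows.h>
#include <GL/glew.h>
#include <GL/wglew.h>

// Swap the current ( v3.2+ ) rendering context for a v2.1 one.
// Returns the new context, or NULL if the driver refuses.
HGLRC downgradeToLegacyContext(void) {
    HDC hdc = wglGetCurrentDC();                // current device context
    HGLRC oldContext = wglGetCurrentContext();  // current rendering context

    // Ask for an OpenGL v2.1 context, sharing nothing with the old one.
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 2,
        WGL_CONTEXT_MINOR_VERSION_ARB, 1,
        0                                       // attribute list terminator
    };

    HGLRC newContext = wglCreateContextAttribsARB(hdc, 0, attribs);
    if (newContext == NULL) {
        return NULL;                            // refused : keep the old context
    }

    wglMakeCurrent(hdc, newContext);            // switch over ...
    wglDeleteContext(oldContext);               // ... and drop the old context
    return newContext;
}[/pre]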

... and the more I read, the more I see others pretty well dodging the issue of acquiring legacy OpenGL contexts on post-v3.2 installations, and it seems many/most have given up using OpenGL on Windows altogether - probably due to the sort of convolutions mentioned here.

Cheers, Mike.

( edit ) The commit which reproduces the above is 3923772be2


Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6588
Credit: 317437463
RAC: 371175

Very interesting development

Very interesting development from the Khronos Group !! :-) :-)

Last week they released an update of a C++ wrapper for use with OpenGL v3.0+ called OGLPlus. At a glance I think it may help with the majority of problems encountered with this backwards/forwards/sideways compatibility stuff. It doesn't get you a context/framebuffer, but if you have one then it encapsulates pretty much everything I'd need and have already 'rolled my own' for so far. In particular it has Buffer and Texture classes whose interfaces seem to do all you'd ever want, and which enact the Resource Acquisition Is Initialisation paradigm ( as I have ). Plus some other cool stuff like Vector, Angle, Matrix ..... and runtime diagnostics and error handling via exceptions.
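By 'rolled my own' I mean something along these lines - a minimal sketch of the RAII idea applied to a GL buffer object, with purely illustrative names :
[pre]#include <GL/glew.h>

// The buffer object lives exactly as long as this wrapper does.
class VertexBuffer {
public:
    VertexBuffer(const void* data, GLsizeiptr size) : m_id(0) {
        glGenBuffers(1, &m_id);                    // acquire on construction
        glBindBuffer(GL_ARRAY_BUFFER, m_id);
        glBufferData(GL_ARRAY_BUFFER, size, data, GL_STATIC_DRAW);
    }

    ~VertexBuffer() {
        glDeleteBuffers(1, &m_id);                 // release on destruction
    }

    void bind() const {
        glBindBuffer(GL_ARRAY_BUFFER, m_id);
    }

private:
    // Non-copyable : a copy would lead to a double delete of the GL name.
    VertexBuffer(const VertexBuffer&);
    VertexBuffer& operator=(const VertexBuffer&);

    GLuint m_id;
};[/pre]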

It can be retrieved via git and uses CMake to build headers which basically hide the OpenGL names and calls. The question immediately coming to mind is whether/how this can be integrated into our current build system.

[ As I've structured my code around a Renderable class - meaning that all the methodology that acquires/enacts/releases rendering functionality is hidden within it - it is quite possible to readily 'fork' this lower-level behaviour based upon runtime disclosure of the OpenGL version. Which is why I came up with the Renderable interface in the first place, of course! One option is thus : a pre-v3.0 code set and a post-v3.0 code set .... ]
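Roughly speaking, the shape of that interface is as follows - the method names here are illustrative only :
[pre]// Purely illustrative outline of the idea, not the actual class.
class Renderable {
public:
    virtual ~Renderable() {}
    virtual void acquire() = 0;   // grab buffers, textures, shaders ...
    virtual void render() = 0;    // emit the draw calls for one frame
    virtual void release() = 0;   // hand the resources back
};

// The 'fork' is then just a choice of concrete subclass at runtime :
// a pre-v3.0 implementation using legacy calls, or a post-v3.0 one
// built on OGLPlus, selected once the context version is known.[/pre]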

Cheers, Mike.

( edit ) Further detail : the first OGLPlus version came out about 10 weeks ago. It's a header-only library ( *.hpp ). It needs g++ 4.5 or higher, Doxygen for the documentation, and either gl3.h or glew.h included before including OGLPlus ... interesting. So that suggests I proceed with GLEW incorporation into our build system regardless. This still leaves the question of where SDL lies in all this ( only in regard to its video functions, though ).

( edit ) I've never met a header-only library before, but a glance inside the *.hpp's shows how it's done. I guess you don't actually have to separate interface from implementation after all. Also there's a pretty simple configure script with which to invoke CMake, and it is of itself an interesting demonstration of how to go about using CMake.


Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6588
Credit: 317437463
RAC: 371175

So ...... thinking out loud

So ...... thinking out loud ....... for Windows builds, one could use this OGLPlus stuff ( to the extent that the OpenGL functionality it covers is what you want ) for the runtime case of v3.0+, without the need to obtain or transfer to some backwards-compatible context ( e.g. v2.1 ) - something I have already found to be problematic ( with my implementations at least ). The requirements for this approach would be :

(i) include ( and later statically link ) GLEW, then

(ii) include the OGLPlus headers, then

(iii) runtime sensing of the OpenGL version ( i.e. within the application code portions that are already compile-time switched as Windows-relevant; sketched below ), then

(iv) transfer execution to either a pre-v3.0 or a post-v3.0 code path
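Step (iii) can be as crude as parsing the version string - an illustrative sketch, assuming a current context already exists :
[pre]#include <GL/glew.h>
#include <cstdio>

// Crude runtime sensing of the context's OpenGL major version.
int openGLMajorVersion(void) {
    // GL_VERSION strings begin "major.minor", followed by vendor text.
    const char* version = reinterpret_cast<const char*>(glGetString(GL_VERSION));
    if (version == 0) {
        return 0;                 // no current context, or the call failed
    }
    int major = 0;
    int minor = 0;
    std::sscanf(version, "%d.%d", &major, &minor);
    return major;
}

// ... then, when constructing the rendering objects ( step (iv) ) :
//
//     if (openGLMajorVersion() >= 3) {
//         // post-v3.0 path ( OGLPlus / shader based )
//     } else {
//         // pre-v3.0 path ( legacy fixed-function calls )
//     }[/pre]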

For Linux one might ask why it is not the problem that Windows is ( see here ) :

Quote:
Graphics on Linux is almost exclusively implemented using the X windows system. Supporting OpenGL on Linux involves using GLX extensions to the X Server. There is a standard Application Binary Interface defined for OpenGL on Linux that gives application compatibility for OpenGL for a range of drivers. In addition the Direct Rendering Infrastructure (DRI) is a driver framework that allows drivers to be written and interoperate within a standard framework to easily support hardware acceleration; the DRI is included in XFree86 4.0 but may need a card-specific driver to be configured after installation. These days, XFree86 has been rejected in favor of XOrg due to the change in the license of XFree86, so many developers left XFree86 and joined the XOrg group. Popular Linux distros come with XOrg now. Vendors have different approaches to drivers on Linux; some support Open Source efforts using the DRI, and others support closed source frameworks, but all methods support the standard ABI that will allow correctly written OpenGL applications to run on Linux.


.... which is precisely why one can be oblivious to version compatibility issues until one gets to Windows builds.

Cheers, Mike.


Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6588
Credit: 317437463
RAC: 371175

Progress : have rewritten the

Progress : I have rewritten the framework code using a combination of GLFW/GLEW/OGLPlus, which ought to make pretty well all the currently discussed problems evaporate. Now testing that assertion .... :-) :-)

Thus SDL is tossed, as GLFW gives virtually all that it did. GLFW doesn't have an associated event union type like SDL -> so I wrote one. GLFW doesn't have an event queue like SDL's ( formed by backend polling/callbacks ) to be read by an event-loop handler, which makes event generation asynchronous with the responses to those events -> so I wrote one. Most of the WindowManager class has thus been rewritten - the implementation, that is; I have preserved the interface.

Three notable issues :

- GLFW doesn't do window icons.

- GLFW doesn't have equivalent image loaders.

- Mac OS 10.7+ currently only offers forward compatible 3.2+ OpenGL contexts, but will probably ( this isn't clear to me ) allow Carbon/Cocoa ( whoever they are ).

Cheers, Mike.

( edit ) So it seems that 'forward compatible' means :

(a) if it's not mentioned in v3.2 or a subsequent evolution then it isn't provided. Ever.

(b) the ARB promises never to remove anything covered by (a).

( edit ) For image loading one could keep/use the SDL subset that does that, ignoring the window/event features ....

( edit ) The event-queue idea is simple in principle. Things the app should be aware of and respond to in some way - keyboard and mouse input, for instance - can occur at any old time, and you don't want to miss any of them. But the app should only consider such events at a certain point in its rendering cycle - typically to change the state of the simulation - prior to the rendering effort for a single animation frame. So events are continuously herded into a queue - via callbacks, say - but are read from the queue and dealt with until the queue is emptied. Then render. Rinse, repeat etc ... indeed the only way to escape this loop from within is to receive an event implying exit from the loop. One uses a data structure with 'double-ended' semantics.
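A bare-bones sketch of that loop - std::deque supplies the 'double-ended' semantics, and the AppEvent type here is just a stand-in for the event union mentioned above :
[pre]#include <deque>

// Stand-in for the event union type : just enough to show the flow.
struct AppEvent {
    enum Type { KEY_PRESS, MOUSE_MOVE, QUIT } type;
    int data;
};

// Events are pushed onto the back as the callbacks fire ...
static std::deque<AppEvent> eventQueue;

void onEvent(const AppEvent& ev) {
    eventQueue.push_back(ev);
}

// ... and drained from the front once per animation frame.
void runMainLoop() {
    bool running = true;
    while (running) {
        // Deal with everything that has accumulated since the last frame.
        while (!eventQueue.empty()) {
            AppEvent ev = eventQueue.front();
            eventQueue.pop_front();
            if (ev.type == AppEvent::QUIT) {
                running = false;   // the only way out of the loop
            }
            // ... otherwise update the simulation state from 'ev' ...
        }
        // renderFrame();          // then draw one frame, and repeat
    }
}[/pre]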


Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6588
Credit: 317437463
RAC: 371175

Progress : I have produced a

Progress : I have produced a functioning Linux executable with the GLEW/GLFW combination and the rewritten WindowManager class etc. There's some trimming to do, but the principles are sound, and just to increase the challenge I've written a leak-proof Singleton class for managing an event queue. It's a learning curve, and by way of example I'll post a recent code comment I thought valuable to insert as a cautionary tale :
[pre]// Note carefully : I am a moron. The GLFW manual clearly instructs
// setting the event callback routines AFTER one has created the
// relevant window whose own OS message queue will be invoking said callbacks.
// Or it won't work. This is absolutely true. So this current code
// ordering is the correct case. I wasted six hours discovering this
// during testing. All hail RTFM !! :-) :-)[/pre]
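Spelt out as a trimmed GLFW v2.x-style sketch ( illustrative, not the actual framework code ) - window first, callbacks second :
[pre]#include <GL/glfw.h>

void GLFWCALL onKey(int key, int action) {
    // push a key event onto the application's event queue ...
}

bool openWindow(void) {
    if (glfwInit() != GL_TRUE) {
        return false;
    }
    // The window ( and its OS message queue ) must exist FIRST ...
    if (glfwOpenWindow(800, 600, 8, 8, 8, 8, 24, 8, GLFW_WINDOW) != GL_TRUE) {
        glfwTerminate();
        return false;
    }
    // ... and only THEN may the callbacks be registered, or they never fire.
    glfwSetKeyCallback(onKey);
    return true;
}[/pre]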
Cheers, Mike.


DanNeely
Joined: 4 Sep 05
Posts: 1364
Credit: 3562358667
RAC: 0

That reads more like a

Message 78226 in response to message 78225

That reads more like a check-in comment ( or a bug-tracker entry, if you're using one ) than a code comment to me.

Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6588
Credit: 317437463
RAC: 371175

Progress : Quite good! I have

Progress : Quite good! I have Windows executables, functioning as per design, on about half of the dozen Windows machines tested using my new framework etc. I was about to claim complete victory. Sad Panda. :-)

I have run into the following error on some machines, with the detail appearing in stderrgfx. The window gets created, hangs around for about 5 - 10 seconds, and then closes without any graphics output in the client area :
[pre]Unhandled Exception Detected ...

- Unhandled Exception Record -
Reason: Access Violation (0xc0000005) at address 0x00406805 write attempt to address 0x00000000

Engaging BOINC Windows Runtime Debugger...[/pre]
followed by lots of stuff.

This would appear to be due to a naughty write attempt on a null pointer target. I have ensured that each and every pointer dereference that I have written has a conditional guard against nullity, and none of those guards ever fires. So I deduce it's not my thread ( or at least the part visible to me ) firing this error.

So upon inquiry of the BOINC FAQ there are two obviously relevant pages ( here and here ). The error is consistently reproduced each time on a given machine and not otherwise, i.e. it is per machine : you either get it every time or never. I am going to follow the suggestions given in the FAQ, and also collate detail on the drivers and other machine features to see what pattern may or may not emerge. If anything, I should probably raise this issue over in BOINC-Land and/or with Ageless once I have a more comprehensive characterisation.

Cheers, Mike.

( edit ) Only one of the Windows machines is doing any BOINC work at all. The others don't even have BOINC installed. So if my analysis is correct so far, it can only be related to the BOINC code included via the screensaver builds.

