
"strange" frame rate with same win32 export and different laptops

piepie Member
edited November 2016 in Bugs and issues
Hi, as the title says, I am experiencing a strange issue with frame rate: my project is set to 30 fps.

I am displaying the fps in an enterFrame listener using:
        local fps = 1 / e.deltaTime  -- instantaneous fps from the event's deltaTime

On my laptop (i5, Win7 Home) I get the correct 29/30 fps, matching the project settings.
On a friend's laptop (i7, Win7 Home) I get 1014.9867.. fps (and the game runs accordingly; it is unplayable :) )

I tried forcing 30 and 60 fps with the win32 "1" and "2" options from the command line, with no luck (on my laptop this works).

Just to add a bit more: I tried the Qt export, and on both laptops it runs at 60 fps.

Is there something I can do to help spot this bug?

Thank you

Comments

  • SinisterSoft Maintainer
    edited November 2016
    Try putting a "1" as the program parameter; that will make it use vsync rather than a timer. IMHO this should be automatic if the display appears to be running at 60 Hz.

    (To add a program parameter you may have to make a shortcut, right-click it, and add the parameter there.)
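
    For example, assuming the export lives at a path like the one below (the path is hypothetical), the shortcut's Target field would read:

        "C:\Games\MyGame\MyGame.exe" 1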
    Coder, video game industry veteran (since the '80s, ❤'s assembler), arrested - never convicted hacker (in the '90s), dad of five, he/him (if that even matters!).
    https://deluxepixel.com
  • piepie Member
    Thank you @SinisterSoft; as I said earlier, I had no luck using additional parameters. :(

    I tried again today after rebooting my friend's laptop; these are the values I get:
    win32 with no additional parameters: around 1244 fps
    win32 with "1": around 759 fps
    win32 with "2": around 1091 fps

    the project setting is still 30 fps.
    On my laptop all these settings work fine (29/30 fps, as expected).
  • hgy29 Maintainer
    @john26 will know better, but it may be that the OpenGL driver on that machine doesn't support VSYNC!
  • piepie Member
    edited November 2016
    What feels really strange is that the project is the same, with the same OS and the same video card (Nvidia GeForce GT 540M).
    I even updated the Nvidia drivers to the latest version, but nothing changed.

    The only big difference seems to be the CPU (i5 vs i7).
  • SinisterSoft Maintainer
    Have you tried updating the drivers on that machine?
  • piepie Member
    Do you mean the chipset drivers and such? Not yet; I have only updated the Nvidia drivers so far. Unfortunately I don't have the laptop with me, so I can't access it freely.
  • SinisterSoft Maintainer
    I meant the video drivers; the chipset might matter if it's a timer issue.
  • > What feels really strange is that the project is the same, with the same OS and the same video card (Nvidia GeForce GT 540M).
    > I even updated the Nvidia drivers to the latest version, but nothing changed.
    > The only big difference seems to be the CPU (i5 vs i7).

    Maybe you are launching the game or the player with the Intel integrated graphics instead of the Nvidia card?

  • piepie Member
    Good point, Jerome! I didn't think of it, but unfortunately it's not that.
    Using the Nvidia card on my laptop works fine, and I managed to tell her how to force the app to use the Nvidia card, but it is still hyper-fast.
    Unless I can put my hands on her laptop I can't update its Intel drivers; I will do that asap.
    What else could it be? The .NET Framework or something like that? I am blindly guessing. :)

    Thank you
  • SinisterSoft Maintainer
    It won't be .NET...
  • SinisterSoft Maintainer
    Could there be a switch in the Nvidia control panel that overrides the app's frame rate?
  • piepie Member
    Not directly; there is just an option to enable or disable vsync, or to use the app settings (which is the default).
  • john26 Maintainer
    edited June 2017 Accepted Answer
    Sorry, I haven't seen this before. The command-line argument you can use is "timer", as you can see here in the source code:

    https://github.com/gideros/gideros/blob/master/win32_example/win32.cpp#L717

    This will make the program use Windows timer events for synchronisation. Otherwise it will attempt to use VSYNC, which is part of OpenGL but does not, it seems, work with all graphics cards. See the export box when you export for win32; it gives some advice!
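
    As a rough sketch of what that argument does (my illustration, not the actual Gideros source; the names here are hypothetical):

        #include <windows.h>
        #include <string.h>

        enum SyncMode { SYNC_VSYNC, SYNC_TIMER };

        // Passing "timer" as the program argument forces WM_TIMER pacing;
        // otherwise the program tries VSYNC first.
        static SyncMode pickSyncMode(LPSTR lpCmdLine)
        {
            if (strstr(lpCmdLine, "timer") != NULL)
                return SYNC_TIMER;
            return SYNC_VSYNC;
        }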

    Likes: SinisterSoft, pie

  • piepie Member
    Thank you @john26 that's it!
    I didn't notice the timer parameter, or I would have tried it before. :)

    Do you think that using the Windows timer would affect the app in some way? If the Windows timer works everywhere (on any Windows PC) without issues or downsides, I would use it as the default for the Windows export, and leave vsync as the optional parameter.

    Thank you again! :)
  • john26 Maintainer
    edited June 2017
    Yes, I think you are right: timer events should be the default behaviour, given your reports of problems with VSYNC. According to documentation I have read (and some people's opinions), VSYNC is the "correct" way to get smooth animation on Windows, but in practice it seems to be implemented differently by different graphics cards, so it is not really reliable or standardised.

    The problem is that OpenGL contains no standard function to refresh the screen, so that is left to the operating system, and it is implemented differently on different operating systems. But while Android and iOS have standard methods of refreshing the screen in a timed way, Windows doesn't seem to (other than dispatching WM_TIMER events, which are not really geared towards games and have nothing to do with OpenGL per se). This has always been a problem, I think: Windows, going back to 1.0, has never had a standard way of restricting fps, and I suspect modern (and old) Windows games do it in many different ways.

    Most people seem to frown on using WM_TIMER events for game animation (the typical use case seems to be simple clock apps etc.), but in my experience it works well and is simple to implement. However, I've noticed the timer events are quite inaccurate, so I actually set the interval to 10 ms to get 60 fps (it really should be 1000/60 = 16 ms -- you have to set an integer -- but that turns out to be too slow!); see here:

    https://github.com/gideros/gideros/blob/master/win32_example/win32.cpp#L771

    Also, WM_TIMER cannot run faster than 60 fps even if you wanted that, so if you set

    SetTimer(hwnd, ID_TIMER, 0, NULL);  // Windows clamps a 0 ms interval up to its minimum timer resolution

    it would still give 60 fps.
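
    To make the timer-driven loop concrete, here is a minimal sketch (my illustration, not the actual Gideros code; renderFrame is a hypothetical placeholder for the real draw-and-swap call):

        #include <windows.h>

        #define ID_TIMER 1

        static void renderFrame(HWND hwnd) { /* draw the frame and SwapBuffers here */ }

        static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
        {
            switch (msg) {
            case WM_CREATE:
                // Ask for 10 ms: WM_TIMER is coarse, so requesting the
                // "correct" 16 ms tends to come back slower than 60 fps.
                SetTimer(hwnd, ID_TIMER, 10, NULL);
                return 0;
            case WM_TIMER:
                renderFrame(hwnd);  // one frame per timer tick
                return 0;
            case WM_DESTROY:
                KillTimer(hwnd, ID_TIMER);
                PostQuitMessage(0);
                return 0;
            }
            return DefWindowProc(hwnd, msg, wParam, lParam);
        }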

    In win32, the program attempts to detect whether VSYNC is implemented and falls back to WM_TIMER if not; see here:

    https://github.com/gideros/gideros/blob/master/win32_example/win32.cpp#L220

    The problem seems to be that most graphics cards claim it is implemented but then ignore the standard behaviour, refreshing the screen either too quickly or too slowly.
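
    For reference, here is a sketch of how such a VSYNC check can look (my illustration, not the exact Gideros code; it assumes an OpenGL context is already current, and wglSwapIntervalEXT comes from the WGL_EXT_swap_control extension):

        #include <windows.h>

        typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

        static bool tryEnableVSync(void)
        {
            PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
                (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
            if (wglSwapIntervalEXT == NULL)
                return false;  // extension missing: fall back to WM_TIMER
            // Request one vblank per SwapBuffers. Note that a driver can
            // report success here and still not throttle, which is exactly
            // the failure described above.
            return wglSwapIntervalEXT(1) == TRUE;
        }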

    With the hints I've given, it should be quite simple to understand the main program, win32.cpp. It's not very complex, and despite the above issues the win32 platform is way simpler than any of the mobile ones. If anyone knows of a better way to get smooth animation in vanilla Windows, please let me know or send a Pull Request!

    Likes: pie

  • piepie Member
    @john26 Actually, I am not sure the issue is in the graphics card: we have the same Nvidia GeForce GT 540M with the same latest drivers on different laptop models.
    If anyone can think of some test to do to spot the issue please let me know. :)
    Thank you
  • hgy29 Maintainer
    I searched about it on Google; it seems that enabling vsync may cause various effects, such as:
    - not working at all (if vsync was globally disabled in the video card control panel)
    - consuming CPU, because some drivers will just spin the CPU waiting for VSYNC
    - a sync period other than 0 (disabled) or 1 (enabled) might not be supported

    The commonly accepted way of doing it seems to be to rely on a timer while trying to VSYNC at the same time, i.e. computing the elapsed time between frames, waiting a little less than the expected frame period (to save CPU), and then calling SwapBuffers with VSYNC enabled, as in the sketch below.
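
    A sketch of that hybrid approach, assuming a 60 fps target (paceFrame is a hypothetical name; the idea is to call it once per frame after drawing):

        #include <windows.h>

        static void paceFrame(HDC hdc, double targetMs /* e.g. 1000.0 / 60 */)
        {
            static LARGE_INTEGER freq, last;
            static bool started = false;
            if (!started) {
                QueryPerformanceFrequency(&freq);
                QueryPerformanceCounter(&last);
                started = true;
            }

            LARGE_INTEGER now;
            QueryPerformanceCounter(&now);
            double elapsedMs = (now.QuadPart - last.QuadPart) * 1000.0 / freq.QuadPart;

            // Sleep to within ~2 ms of the frame period to save CPU...
            if (elapsedMs < targetMs - 2.0)
                Sleep((DWORD)(targetMs - 2.0 - elapsedMs));

            // ...then swap: if the driver honours VSYNC this blocks until
            // vblank; if it doesn't, the timing above still caps the rate.
            SwapBuffers(hdc);
            QueryPerformanceCounter(&last);
        }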

    Likes: john26
