I'm not 100% sure, but medium/low graphics settings feel like 30 fps to me. The highest quality makes the game much smoother. My graphics card overheats, and medium/low settings lower my temperatures, but unfortunately I can see the difference between 60 and 30 fps. It's huge, so I'll take the higher temperatures :)
Neither medium nor low graphics settings cap the fps at 30. With medium/low settings you get the same 60 fps, just with much worse picture quality. The difference you see is NOT related to fps. You will NOT feel any difference between 60 and 30 fps if the picture quality stays the same, simply because the human eye (you're not a reptiloid, right?) can NOT distinguish more than 24 frames per second. That's why I don't want my graphics card to work harder for an additional 30 fps that no one can perceive.
My friend, you are wrong about not seeing a difference in framerate if you have a 60 Hz monitor/TV (24 frames per second for the human eye is a myth). Anyway, I thought medium/low settings had a 30 fps limit because they weren't as smooth as the high settings. Can't help you then.
Yep, I have a 60 Hz monitor, and my thoughts are based on my experience playing at 60 and 30 fps. I can tell when a game turns into a slideshow (hi, Unity) at something like 18 fps, and it is awful. But I'm absolutely fine with something like 22 (even less than the mythical 24). I really did try playing Skyrim at different framerates (without additional load I get 60, and with other games running in the tray roughly half that) and couldn't recognize the difference. Maybe it's my maximalism to think that if I can't, no one can; I'm sorry if so. But the fact that you thought the game ran at 30 fps on low settings while actually seeing the same 60 kind of proves my point of view. I would like to find someone who thinks like you and give him a test: play one game at the same graphics settings but different framerates and tell me which version is 60 and which is 30. With a big enough sample it could settle the question. Anyway, thank you for trying to help.
I'm certainly not an expert on this matter, but based on my personal experience, I usually feel a large difference between 30 and 60 fps (at least when compared directly). I even noticed a large difference in the mouse cursor movement on my friend's desktop (120 Hz) compared to how mine usually feels (60 Hz).
30 fps may be acceptable for a card game, but if I can get it, I'd take 60 (or better, optionally unlimited) fps here as well.
My understanding is that it takes at least around 24 pictures per second for the human brain to fuse them into one flowing motion, but the claim that it cannot make out a difference at higher rates does not match my personal experience.
That said, I don't have anything against an optional 30 fps setting, but don't take away higher fps. Don't make things worse, make them better.
"I even noticed a large difference in the mouse cursor movement on my friend's desktop (120 Hz) compared to how mine usually feels (60 Hz)."
I believe that if the box of your monitor said 120 Hz and the box of your friend's said 60 Hz (while the real frequencies were 60 and 120 respectively), you would feel just the opposite.
Of course I don't want Bethesda to make 30 fps the only option. But I would be glad to see it among the options. Those who want fps not to be limited at all would be glad as well.
If you have an nVidia GPU, you can use nVidia Inspector to force a frame limit of your choice. That being said, I feel like the game gets very choppy even at ~45 FPS with nVidia's limiter, so I ended up turning it back off.
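For anyone curious what a frame limiter actually does: in rough terms, it sleeps off whatever time is left in each frame's budget so the loop can't run faster than the target rate. Here's a minimal sketch in Python (this is NOT how nVidia Inspector's driver-level limiter works internally; `run_frames` and `render` are made-up names for illustration):

```python
import time

def run_frames(target_fps, n_frames, render=lambda: None):
    """Run n_frames iterations, pacing them so the loop never exceeds target_fps."""
    frame_budget = 1.0 / target_fps  # seconds each frame is allowed to take
    start = time.perf_counter()
    for _ in range(n_frames):
        frame_start = time.perf_counter()
        render()  # the actual per-frame work would go here
        elapsed = time.perf_counter() - frame_start
        if elapsed < frame_budget:
            # sleep off the remainder of this frame's time budget
            time.sleep(frame_budget - elapsed)
    return time.perf_counter() - start

# Capped at 100 fps, 20 frames should take at least roughly 0.2 s in total.
total = run_frames(100, 20)
```

A sleep-based limiter like this is coarse (OS sleep granularity can be several milliseconds), which can produce uneven frame pacing; that kind of jitter is one plausible reason a capped framerate can feel choppier than an uncapped one.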
""I even noticed a large difference in the mouse cursor movement on my friend's desktop (120 Hz) compared to how mine usually feels (60 Hz)."
I believe that if the box of your monitor said 120 Hz and the box of your friend's said 60 Hz (while the real frequencies were 60 and 120 respectively), you would feel just the opposite."
Well, if the numbers on the boxes had been swapped, the experience might have caused some confusion, and I might have dismissed the "120 Hz"-labeled monitor feeling worse as my subjective perception not being up to form that day, or explained it away in some other fashion.
But to dispute your assumption that my impression is a self-fulfilling prophecy: I believe it is not. I went into it with the same mindset you currently have, believing that this whole framerate/refresh-rate thing was a marketing ploy and that the human eye could not perceive a difference. Well, I did perceive a difference, and my next monitor will be a 144 Hz one.
I can imagine, however, that there are people who do not perceive differences here. After all, we are not all built the same way.