Frames Per Second, AA, Tearing etc...


The Mighty Ash

Castlevania is 24 fps?!!? That's pretty fucking low. We're talking N64 territory.

The backgrounds in Rock Band run at 24fps. I can see every frame jerking into the next one. At least RB3 is going up to 30. Of course it doesn't really matter, because the game's happening on the high-fps fret board. But 24 for an action game? Oh Castlevania..... LOL.


I mean, the fact that you can press LT-RT-LT-RT-LT-RT-LT-RT-LT-RT really fast and ten people in front of you fall down dead :lol:

Try it on the airport scene, it's hilarious.

This feature can easily be turned off; it's present in every current-gen COD game, as an option in the main menu.

It's completely off in multiplayer.

MGS2 on the PS2 tears a lot of frames; at least the NTSC versions do.


No, it's not machine-based. If it were machine-based everything would tear, and it doesn't (excepting the C64, which would tear but could be programmed to do 'racing the beam', or whatever it was called).

It's just graphics engine efficiency.

Saints Row (360) can't render its full world at 60fps, so it gives you the option to turn vsync off: you get tearing, but it runs quicker (in frames) and without the minor input lag. It looks pretty shite either way though.


I know it's not; I was wondering why some posts were saying the PS2 doesn't do it.

Worst I've ever seen is BioShock on 360 with vsync off. Makes me want to vomit. The whole screen rips in two.


Ok - I can see now. It's not a massive difference, and I doubt most people would notice at all while playing.

It's much more noticeable when you're moving forwards looking into the distance, as the detail doesn't smoothly blend away but rather sharply drops off at a specific point. I don't actually know how consoles use it; I have to imagine almost all console games have it turned up to max anyway, as it doesn't really impact performance.

Anyway, you can get screen tearing from anything in theory:

-------

The first frame is generated by the machine and sent to your TV

The TV accepts this frame and is busy repainting the screen with this image

While the TV is repainting the image, along comes the next frame, even before the TV has had a chance to update the first frame

What happens now is that the second frame starts drawing over the top of the first frame, so you have part of an old frame and part of a new frame on your TV at the same time

The horizontal split between the old frame and the new frame looks like a tear, as the image is visibly discontinuous at that point.

When the console outputs more frames than the TV can handle, this effect can become quite pronounced and be very distracting and unpleasant to watch.

--------
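
To put some rough numbers on that, here's a toy sketch in Python (the display figures and frame times are illustrative assumptions, not measurements from any real console or TV): a 60Hz screen scans out 1080 lines per refresh while the unsynced game presents frames whenever it finishes them, and whichever scanline the screen has reached when a new frame arrives is where you'd see the tear.

# Toy model of tearing: a 60Hz display scanning out 1080 lines per refresh,
# while an unsynced game presents frames at an uneven ~75fps.
# Whichever scanline the display has reached when a frame is presented
# is where the old/new image boundary (the tear) appears.
REFRESH_HZ = 60.0
LINES = 1080
SCAN_TIME = 1.0 / REFRESH_HZ                  # ~16.7ms to paint one full screen

# Hypothetical per-frame render times in seconds (roughly 75fps, no vsync)
frame_times = [0.013, 0.012, 0.014, 0.013, 0.015, 0.012, 0.013, 0.014]

t = 0.0
for i, ft in enumerate(frame_times, start=1):
    t += ft                                   # moment this frame is presented
    phase = (t % SCAN_TIME) / SCAN_TIME       # how far through the current scan we are
    line = int(phase * LINES)
    print(f"frame {i}: presented at {t * 1000:6.1f}ms -> tear at scanline {line}")

With vsync on, the swap would simply be held back until the scan finishes (phase 0), so the tear never appears; the cost is that the game has to wait, which is where the 30/60 locking below comes from.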

The bit that confuses me is why all console games don't just have a 60Hz vsync switched on. There seems to be some issue where it has to be locked at either 30 or 60, and in open/busy sequences where the machine chugs, they take the limit off entirely so it can be jumping all over the place, chugging like fuck one minute and tearing like a bitch the next (Bayonetta PS3 is a good example). I'm sure there's a reason, but I don't understand why they don't just lock it to 60. Maybe it hardly ever gets there, maybe it runs at 30fps almost the whole time, but why don't they just vsync it at 60 anyway? On the PC side, you just put a tick in a box and no more tearing, ever.
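
As I understand it (a sketch assuming plain double-buffered vsync, which is what most games of this era seem to use, rather than a statement about any specific title), the reason is that a vsynced frame that misses the 16.7ms deadline has to sit and wait for the next vblank, so the displayed rate snaps to 60 divided by a whole number: 60, 30, 20, 15. A game that internally manages 45-55fps therefore shows up as a juddery 30 with vsync on, which is why developers either lock to 30 for consistency or drop the sync entirely and accept tearing:

import math

# Effective displayed framerate under plain double-buffered vsync on a 60Hz
# display: each finished frame is held until the next vblank, so a frame that
# takes part of N refresh intervals to render is shown for ceil(N) intervals.
REFRESH_MS = 1000.0 / 60.0                    # ~16.7ms between vblanks

for render_ms in (10, 15, 17, 25, 33, 40):
    intervals = math.ceil(render_ms / REFRESH_MS)   # vblanks spent per frame
    vsynced_fps = 60.0 / intervals
    unsynced_fps = 1000.0 / render_ms
    print(f"{render_ms:>2}ms per frame -> {vsynced_fps:>4.0f}fps vsynced, "
          f"{unsynced_fps:>4.0f}fps unsynced (with tearing)")

Triple buffering sidesteps most of that by giving the game a spare buffer to keep rendering into instead of stalling, and forcing vsync plus triple buffering is, as far as I know, exactly what d3doverrider does on the PC side, hence the 'tick a box and no more tearing' situation.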


You may have it configured wrongly? What games are you having problems with? I've had it running for about five years, played over 100 games on Steam with it, never had it not work and never had it cause a crash, using both ATI and Nvidia cards, including a two-card setup. I have it set to play a little 'ding' whenever it kicks in :) It's saved my skin in games such as Dead Space, which bizarrely locks to either 30fps or completely uncapped (i.e. 400fps!), and in games like The Witcher and Crysis, where the VSYNC option doesn't work properly in Vista/7.

Ladies and gentlemen, PC gaming!


Eh? That's not true, it doesn't even make sense. Why would rendering more frames than your monitor can keep up with, thus doing nothing but causing tearing, be advantageous?

Go ask about it in any PC game forum, you'll see. I'm not saying it's right, but it's just what people do and it seems to be why in 2010 this is all still such a mess - Because only I give a damn about it.

What about adding the Amnesia executable to the list of programs? It picks up the use of any Direct3D game though, so unless you're playing Quake 2 in software mode it should pick it up. I just tested it, works fine. The built-in vsync in Amnesia works anyway though; I just turned off d3doverrider and the in-game vsync still did the job. There must be something going on.

NVidia I assume?

Also, AoC used to crash on me every time I quit. I never figured out what it was, but it wasn't d3doverrider - probably just the fact that it was programmed terribly.

Well obviously I tested it with and without to come to that conclusion. The most annoying thing with AoC was that they HAD a v-sync option in the menus which worked fine, then one patch it vanished and d3doverrider was the only thing that worked for ATI users, but it took an extra 10 clicks to quit the game.


Ladies and gentlemen, PC gaming!

Yeah, downloading a one meg app and then never, ever touching it ever again. What a chore, I'd rather have the abundant screen tearing and vastly inferior performance and visual quality of a console. *Tears hair out in frustration at the memory of downloading an app two years ago*

Go ask about it in any PC game forum, you'll see. I'm not saying it's right, but it's just what people do and it seems to be why in 2010 this is all still such a mess - Because only I give a damn about it.

No, that's simply not true. Never in my PC gaming life, since the days of QuakeWorld on a modem, have I ever come across a single friend, forum poster or LAN party attendant who turned their vsync off on purpose. 'More frames' doesn't mean anything; so long as you can reach the required 60Hz (or 100, or whatever your monitor is), you're seeing all the frames you're ever going to see. Going over this does nothing but add screen tearing, which in itself is a handicap. I've had a few friends who didn't know how to switch/force it on, but that's it. I did have a friend who irrationally thought that more frames equals better performance, and left it uncapped after he bought a new video card because he blindly wanted the highest framerate possible (whilst also wondering what all the screen tearing was, and ringing me up to ask how to get rid of it or if he should send his card back :lol:)

NVidia I assume?

No, not Nvidia, I have a 5870.

Well obviously I tested it with and without to come to that conclusion. The most annoying thing with AoC was that they HAD a v-sync option in the menus which worked fine, then one patch it vanished and d3doverrider was the only thing that worked for ATI users, but it took an extra 10 clicks to quit the game.

This wasn't with Vista, by any chance, was it? I think d3doverrider and Vista had some issues way back when. I only ran it for a couple of weeks though, so not sure. I also didn't play AoC for very long, for obvious reasons.

In short: screen tearing must be horrible for the guys who can't switch it off with a click. If a console game has screen tearing, I either don't buy it or take it back; it makes my eyes hurt.


I hate screen tearing in console games, but this thread makes me confused about why it happens in the first place; surely getting a console to output more frames than a TV can physically show is counter-productive. Does it take more effort/processing power to tell a console to render exactly 30/60fps rather than spew out as many as possible? And in a console game that's meant to run at 60fps - the maximum refresh rate of most TVs - what on earth is the benefit of trying to produce more than 60?


No, that's simply not true. Never in my PC gaming life, since the days of QuakeWorld on a modem, have I ever come across a single friend, forum poster or LAN party attendant who turned their vsync off on purpose. 'More frames' doesn't mean anything; so long as you can reach the required 60Hz (or 100, or whatever your monitor is), you're seeing all the frames you're ever going to see. Going over this does nothing but add screen tearing, which in itself is a handicap.

I seem to recall most serious Quake 3 players set their framerate to 125fps, as by some quirk of the Quake engine it allows for better movement/strafe jumping.


I know it's not; I was wondering why some posts were saying the PS2 doesn't do it.

Worst I've ever seen is BioShock on 360 with vsync off. Makes me want to vomit. The whole screen rips in two.

IIRC, it's not so noticeable on a CRT, which quite a few people played it on. Under interlacing (at 50Hz) a game only has to run at 25fps, and then AFAIK you only get flicker.


I seem to recall most serious Quake 3 players set their framerate to 125fps, as by some quirk of the Quake engine it allows for better movement/strafe jumping.

I heard that one a few times during the Q3 era. I tried a few different configs for 125 fps/max packets but it didn't make any difference to the feel of gameplay. I think it was one of those odd rumours concerning how 'pro' players played, along with crazy FOVs etc. Certainly when you watch vids of the best players playing from the period, there's no sign of it. Apparently some guns in COD4 have a tiny chance of increased fire rate over 125fps though :ph34r:

Also... Quake 3 parkour

http://www.youtube.com/watch?v=VCy7Doq_Odk

I bet he's a real hit with the ladies!


I heard that one a few times during the Q3 era. I tried a few different configs for 125 fps/max packets but it didn't make any difference to the feel of gameplay. I think it was one of those odd rumours concerning how 'pro' players played, along with crazy FOVs etc. Certainly when you watch vids of the best players playing from the period, there's no sign of it.

I wasn't good enough for it to make any difference either, but it definitely did have an effect. It's the reason Doom 3 was forcibly capped at 60fps: Link
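
For the curious: the usual explanation is that Quake 3 steps its player physics in whole milliseconds of frame time, so a 125fps cap means exactly 8ms steps, and simple per-frame integration of a jump reaches a slightly different apex depending on the step size. Below is only a toy sketch of that idea (semi-implicit Euler with the commonly quoted Q3 numbers of 270 ups jump velocity and 800 ups^2 gravity; the real engine's code differs in the details), but it shows the direction of the effect: bigger caps, higher jumps.

# Toy illustration of framerate-dependent jump height. Physics is integrated
# once per frame, and the frame time is a whole number of milliseconds, so the
# cap you choose changes the step size and, with it, the apex of the jump.
JUMP_VELOCITY = 270.0    # units per second (commonly quoted Q3 value)
GRAVITY = 800.0          # units per second^2 (default g_gravity)

def jump_apex(frame_ms):
    """Apex height of a jump integrated in fixed steps of frame_ms milliseconds."""
    dt = frame_ms / 1000.0
    z, vz, apex = 0.0, JUMP_VELOCITY, 0.0
    while True:
        vz -= GRAVITY * dt            # update velocity first...
        z += vz * dt                  # ...then position (semi-implicit Euler)
        apex = max(apex, z)
        if z <= 0.0:                  # back on the ground
            return apex

for fps, ms in ((60, 17), (100, 10), (125, 8), (333, 3)):
    print(f"com_maxfps {fps:>3} ({ms:>2}ms steps): apex ~{jump_apex(ms):.2f} units")

The differences are only a unit or two, but that's apparently enough to make certain jumps possible at 125 or 333fps that aren't at lower caps.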


Never in my PC gaming life, since the days of QuakeWorld on a modem, have I ever come across a single friend, forum poster or LAN party attendant who turned their vsync off on purpose.

I can't begin to describe how frustrating this is to read. This is the total opposite of all my experience.


But why on earth would you? Except to see a bigger number on your framerate counter? It's the same with Counter-Strike: people constantly talking about '300fps' as if it makes a blind bit of difference to the gameplay. It's nothing but nerd e-peen, like a benchmark score. I'm sure some probably have the impression that it gives you an advantage, maybe seeing people coming around the corner before they see you or whatever, but that doesn't make any sense, even if you're playing on a LAN with <1ms ping. And even if it were true, the window of opportunity would be a 60th of a second vs a 300th of a second, in which case you should probably sell yourself to science for military application.

Reminds me of that StarCraft 2 bug recently that made the menus run at about 9000 frames a second and overheat MacBook Pros :lol:


I once honestly thought that by now we would be playing pure 60fps gamez. How naive I was. I will never understand or give any credence to the '30fps is fiiiiiiine' argument. ALL games, regardless of genre or type, benefit from 60fps. I don't give a shit about 'feel' or what the creator intended or similar bullshit.

All motion should be smooth and fluid. One day (hopefully) consoles/gaming apparatus/cyberspace interfaces (yes, it might take that long), will present us with silky smooth animation and movement. And all folks large and small, old or young, will shake their heads in disbelief at the mere notion that anyone could even THINK that 30fps is fiiiiine. And they will laugh. And point fingers. At us. And think of us as simple peasants from a long gone age.


I always turn Vsync off. I hate having my FPS limited by it; I'd prefer a bit of tearing, which oddly never really bothered me much, to getting stuck with 30fps.

Vsync syncs to whatever refresh rate your monitor is set to, or 60 at the lowest.

The only exception I can think of, though I'm sure there are more, is Dead Space on the PC, which vsyncs to 30fps if you use the in-game vsync, for some reason.

Try this, Brer :)

Install it, find the 'd3doverrider' bit in the install folder, and run it at startup. It will vsync all your games to 60fps. No more tearing, and a constant 60fps. PM me if you get stuck, though I'm sure you'll be fine. In something like Stalker, a love we both have in common, I find tearing really distracting and atmosphere-killing.

I once honestly thought that by now we would be playing pure 60fps gamez. How naive I was. I will never understand or give any credence to the '30fps is fiiiiiiine' argument. ALL games, regardless of genre or type, benefit from 60fps. I don't give a shit about 'feel' or what the creator intended or similar bullshit.

All motion should be smooth and fluid. One day (hopefully) consoles/gaming apparatus/cyberspace interfaces (yes, it might take that long), will present us with silky smooth animation and movement. And all folks large and small, old or young, will shake their heads in disbelief at the mere notion that anyone could even THINK that 30fps is fiiiiine. And they will laugh. And point fingers. At us. And think of us as simple peasants from a long gone age.

Completely agreed, but I think 30 is fine for a lot of games, just not action-heavy titles. Anything below about 40 in an action title (say a shooter, or a brawler) gives me a headache. 60 is always preferable though; anything below a solid 60fps sets off alarm bells on PC. On console I just put up with it, and it doesn't spoil my enjoyment of the games, but I can't help thinking that something like Red Dead Redemption would benefit massively from 60fps. I remember installing Gears of War on PC (after loving it on console), booting it up and thinking "Ohhhh, that's what it's meant to look like". Similarly, something like Left 4 Dead benefits hugely from it, as you can really enjoy the fluidity of the zombie animation in a way that just isn't possible at 30fps.

Though of course, some of us already live in the era of purely 60fps games :eyebrows:

It's very hard to demonstrate 60fps to people, as there are no sites that let you upload it, and converting a raw video stream into just about any format will cap the framerate. I'd love to be able to take some 60fps footage, convert it to 60fps avi and then upload it to a 60fps capable video hosting site. If anyone has any ideas how to do this, I'm all ears. People watching comparison videos need to understand that they're probably watching 24fps vs 30fps at most. There's a massive and very very noticeable difference between 30fps and 60. You'd have to be legally blind not to see it.

Edit - Sigh. Look people, more frames are good. I'm sorry your console can't do them, and I'm happy that this doesn't bother you. It doesn't mean you need to neg people who go the extra mile for their own enjoyment. Stop being so defensive of the little box under your telly, it's fine. It does the job. A 1.0ltr Nissan Micra does the job as well, but you don't feel the need to neg someone every time they buy a faster car, do you? Honestly. Get over your divisive little mentality and polarising stereotypes, and try to understand that some people can own and enjoy more than one platform, whilst being able to recognise the qualities inherent to each and the relative strengths and weaknesses therein. Unless you've actually got a decent gaming PC and have the ability to compare it to the same titles running on the console, we have nothing to debate about. You have no basis for opinion besides preconception and defensiveness. You can't simply neg someone because they find better performance better than worse performance. That's either jealousy or idiocy.


Vsync syncs to whatever refresh rate your monitor is set to, or 60 at the lowest.

The only exception I can think of, though I'm sure there are more, is Dead Space on the PC, which vsyncs to 30fps if you use the in-game vsync, for some reason.

Try this, Brer :)

Install it, find the 'd3doverrider' bit in the install folder, and run it at startup. It will vsync all your games to 60fps. No more tearing, and a constant 60fps. PM me if you get stuck, though I'm sure you'll be fine. In something like Stalker, a love we both have in common, I find tearing really distracting and atmosphere-killing.

Cheers :D I used rivatuner back in the day, but only for overclocking; didn't know you could do this with it. Have some pos rep to cancel out the neg ;)

It's odd though, I genuinely never noticed tearing at all in games until it was pointed out to me :lol:

