Frames Per Second, AA, Tearing etc...


The Mighty Ash
Recommended Posts

Yes, tearing is by far the worst. Now I may be wrong, but isn't this a problem that never existed in the pre-HD console era? Stuttering/bad framerates are something we've seen for ages (Perfect Dark for example, GTA3 in places), but tearing seems to be a relatively new phenomenon.

It's not new, it's just that when you've got multiple resolutions to design for, it's more likely to show up.

Tearing in games that only run at 480i is pretty inexcusable. But it did happen!

Link to comment
Share on other sites

Jesus, why are people negging Morrius for just posting info on how to tweak PC stuff? I hate "wasting" my plus votes to put right what once went wrong.

I think they think I'm one of those guys who thinks consoles are bad, just because I have a PC and think that it's better at some things.

FYI someone just negged you but I saved the day.


60fps Defence Force!! JOIN NOW!!

I can't imagine I'll be getting a proper gaming PC even though it would be nice. It's a dangerously expensive hobby. And because I'm such an anal nutter I would just end up in tweaking hell. Nope, I'll live with these here rubbish ole underpowered consoles.

I know there's loads of good arguments for why the current consoles are the way they are when it comes to the graphics bits (Sony's last-minute ditching of their own graphics chip and then buying from nVidia didn't help) and that the Xbox 360 graphics chip is actually pretty good all things considered... but I think they should have, like, gone all out and just put a MASSIVE ENORMOUS graphics chip in the consoles. Never mind the cost. Just make that shit fly. Perhaps they will do it next time round.


Both the 360 and the PS3 are incredibly efficient machines, I can't believe half the stuff they pump out, for what is essentially 'old tech' in the fast moving world of computers. The 360 runs on essentially a Radeon X1900 and a three core CPU. I love the fact that this generation has become about efficiency, waggle aside.


Both the 360 and the PS3 are incredibly efficient machines, I can't believe half the stuff they pump out, for what is essentially 'old tech' in the fast moving world of computers. The 360 runs on essentially a Radeon X1900 and a three core CPU. I love the fact that this generation has become about efficiency, waggle aside.

Yeah, yeah agreed. But I want moar!!! :lol:


We won't ever get constant 60fps in console games unless the marketing bods can sell it.

Or when/if we reach a point where graphics can no longer improve (true photorealism?) - then any processing power improvements can be used to give better frame rates. Then again, it'll probably be used for ever greater draw distances, crazily large levels, millions of AIs scooting around, that sort of thing.


60fps Defence Force!! JOIN NOW!!

:lol:

I was playing Metro 2033 on PC last night - did the library section up to the D6 section, and the part where you and 5 other guys are stalking through the tunnels being ambushed by mutants. Man, running at 60fps just allowed you to step back for a minute and admire just how fucking amazing the animations look at that framerate. I was blown away. I haven't felt so immersed in a game in a long time. I loved it on 360, but on PC it's in a league of its own.


Or when/if we reach a point where graphics can no longer improve (true photorealism?) - then any processing power improvements can be used to give better frame rates. Then again, it'll probably be used for ever greater draw distances, crazily large levels, millions of AIs scooting around, that sort of thing.

I think games will be limited by prohibitive cost of development long before we reach that point. We have already seen in this console generation that the cost of employing scores of people to generate art assets has pushed the price of making a game through the roof. I think graphics will plateau (some might say they already have to a degree) and maybe new hardware, when it eventually arrives, will see things like 60fps, decent AA and no tearing become more common, but again nobody is in any rush to make a new generation because of the cost of development on this one. What studio in their right mind would want to have to learn to code for a new machine right now?


We'll be seeing this generation for a while yet. Years, I think. They'll rack up a couple of years with waggle alone.

Meanwhile, as the console hare larks about with cameras and waggle and has a quick nap, the tortoise PC with his robust and successful digital delivery system trundles on, wearing his upstart developer-friendly trainers and alternative business model friendly shorts.

The next big thing will surface on PC, as it almost always does, and it will catch everyone unawares. The PC might be complicated as a platform to use, but it also cultures new ideas and new developers in a way the console simply can't. Every day something fun and new and interesting bubbles up to the surface of the PC, just look at Minecraft. The great designers of tomorrow aren't tinkering away with console dev kits in their bedrooms or sitting in class in University videogame design lessons, they're knocking stuff together for the PC just for the love of it.


Eh? That's not true, it doesn't even make sense. Why would rendering more frames than your monitor can keep up with, thus doing nothing but causing tearing, be advantageous?

Input lag resulting from Vsync-induced slowdown; the competitive Q3 scene seems to favour 120fps+ as the optimal update rate for that game, and someone was talking about superior "air control" if your hardware could run the game at 333fps, as the physics change.

Ok, this then leads me onto the next major flame: the frame limit. We all know that capping at 120fps whilst connected to a dedicated server does wonders for your physics. Any competitive Quake 3 player knows that all they need is ~140fps in demo four and they can play well. The framerate issue is the same in any game - the better the PC, the better the player - although in Quake 3 there is a slight physics advantage. The main purpose of this rule, however, was not to remind people of the 120fps cap but rather to eliminate a serious bug in Quake 3 and OSP: the 333fps cap error. If a person caps their framerate at 333 they literally get super physics and air control. I've seen this demonstrated and it really is insane. Players can get approximately 10u more height and it feels as though you are floating above the ground when you walk. Point release 1.32 was a performance hit on everyone's PC and it almost did away with the problem, but some high-end, extremely tweaked machines can still hit 333 and get those physics.
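The "super physics" described above is a real class of bug: when movement is integrated once per rendered frame, the integration error depends on the timestep. A minimal sketch of the mechanism - this is illustrative semi-implicit Euler with made-up constants, not Quake 3's actual pmove code:

```python
def jump_height(fps, v0=270.0, g=800.0):
    """Apex height of a jump integrated once per frame. v0 and g are
    hypothetical values chosen for illustration, not Quake 3's real
    constants; units are arbitrary."""
    dt = 1.0 / fps
    height, velocity = 0.0, v0
    while velocity > 0.0:
        velocity -= g * dt           # gravity applied once per frame
        if velocity > 0.0:
            height += velocity * dt  # move using the updated velocity
    return height

# The same jump reaches a slightly different apex at different framerates:
for fps in (125, 333):
    print(f"{fps}fps: apex {jump_height(fps):.2f}u")
```

With this update order a higher framerate lands closer to the analytic apex (v0²/2g), so capping at a higher fps literally jumps higher. The real bug's magnitude comes from Q3's own timing code, but the shape of the effect is the same.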

Vsync syncs to whatever your monitor hz is set to, or 60 at the lowest.

The only exception I can think of, though I'm sure there are more, is Dead Space on the PC, which vsyncs to 30fps if you use the ingame vsync, for some reason.

Try this, Brer :)

Install it, find the 'd3doverrider' bit in the install folder, run it at startup. It will vsync all your games to 60fps. No more tearing and a constant 60fps. PM me if you get stuck, though I'm sure you'll be fine. In something like Stalker, a love we both have in common, I find tearing really distracting and atmosphere-killing.

Hey dude, I've just tried this as I'm currently stomping through the original Painkiller (ace game!) and it tears like a motherfucker. When I tried using my ATI Catalyst 'force vsync' option it felt noticeably input-laggy, so I hoped this would solve it - but it's the same! Well, the lag is a tiny bit less, but it's definitely there - something 'feels' wrong when it's enabled... Any ideas?


Input lag resulting from Vsync-induced slowdown; the competitive Q3 scene seems to favour 120fps+ as the optimal update rate for that game, and someone was talking about superior "air control" if your hardware could run the game at 333fps, as the physics change.

To be fair, if your in-game physics are dependent on frames per second, you're doing your physics wrong. (Well, cheating them to make them work in a game, more accurately.)
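The standard way to avoid that is to decouple simulation from rendering: run physics in fixed-size steps and feed it real frame time through an accumulator. A minimal sketch of the pattern (function and variable names are my own, not from any particular engine):

```python
def count_physics_ticks(frame_times, step=1.0 / 120.0):
    """Advance a fixed-timestep simulation through a list of rendered
    frame durations (in seconds). Returns how many physics steps ran,
    which depends on elapsed time, not on the render framerate."""
    accumulator, ticks = 0.0, 0
    for frame_dt in frame_times:
        accumulator += frame_dt      # bank the real time this frame took
        while accumulator >= step:   # run as many whole steps as fit
            ticks += 1               # a real loop would call update(step)
            accumulator -= step
    return ticks

# A second of gameplay at 30fps and at 60fps runs the same physics:
print(count_physics_ticks([1 / 30] * 30), count_physics_ticks([1 / 60] * 60))
```

Because every step uses the same `step`, jump heights, air control and the rest come out identical whether the renderer manages 30fps or 300fps.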


Input lag resulting from Vsync-induced slowdown; the competitive Q3 scene seems to favour 120fps+ as the optimal update rate for that game, and someone was talking about superior "air control" if your hardware could run the game at 333fps, as the physics change.

What about outside of a Q3 bug?

C'mon now.


Hey dude, I've just tried this as I'm currently stomping through the original Painkiller (ace game!) and it tears like a motherfucker. When I tried using my ATI Catalyst 'force vsync' option it felt noticeably input-laggy, so I hoped this would solve it - but it's the same! Well, the lag is a tiny bit less, but it's definitely there - something 'feels' wrong when it's enabled... Any ideas?

I'll post up my settings later and see if that helps.

Edit - Oh, try forcing triple buffering :) That's the other thing D3DOverrider does. Or turn it off if it's on. The triple buffering option in the ATI/nVidia control panels only works with OpenGL (i.e. nothing), but the one in D3DOverrider offers true TB for Direct3D games.

Anandtech:

So there you have it. Triple buffering gives you all the benefits of double buffering with no vsync enabled in addition to all the benefits of enabling vsync. We get smooth full frames with no tearing. These frames are swapped to the front buffer only on refresh, but they have just as little input lag as double buffering with no vsync at the start of output to the monitor. Even though "performance" doesn't always get reported right with triple buffering, the graphics hardware is working just as hard as it does with double buffering and no vsync and the end user gets all the benefit without the potential downside. Triple buffering does take up a handful of extra memory on the graphics hardware, but on modern hardware this is not a significant issue.
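The practical upshot is easiest to see as arithmetic: with double buffering plus vsync, a frame that misses a refresh tick has to wait for the next one, so the framerate snaps to an integer fraction of the refresh rate, while triple buffering lets the GPU keep rendering into the spare buffer. An idealised sketch - it ignores driver queueing and assumes a perfectly steady render time:

```python
import math

def effective_fps(render_ms, refresh_hz=60, triple_buffered=False):
    """Idealised displayed framerate for a constant per-frame render
    time under vsync. Double buffering waits whole refresh intervals;
    triple buffering is limited only by render time and the refresh cap."""
    refresh_ms = 1000.0 / refresh_hz
    if triple_buffered:
        return min(1000.0 / render_ms, refresh_hz)
    return refresh_hz / math.ceil(render_ms / refresh_ms)

# A steady 20ms frame on a 60Hz display:
print(effective_fps(20))                        # 30.0 - snapped to 60/2
print(effective_fps(20, triple_buffered=True))  # 50.0 - render-time limited
```

That snap from 50fps down to 30fps is the vsync "slowdown" people complain about, and it's exactly what D3DOverrider's triple buffering avoids.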

What about outside of a Q3 bug?

C'mon now.

Read the Digital Foundry articles on input lag - it's a real thing, and for competitive play, the less input lag the better. Why do gaming mice run at much higher update rates than bog-standard mice? After all, average human reaction times aren't measured in 1000ths of a second.
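For what it's worth, the update-rate numbers are easy to put in perspective: polling at a given rate adds, on average, half a poll interval of delay before a movement is reported. A quick sketch of that arithmetic (125Hz is the typical standard USB rate; the higher rates are typical gaming-mouse figures, not a spec):

```python
def polling_delay_ms(poll_hz):
    """Worst-case and average added delay (ms) before a movement is
    reported, assuming events land uniformly between polls."""
    worst = 1000.0 / poll_hz
    return worst, worst / 2.0

for hz in (125, 500, 1000):
    worst, avg = polling_delay_ms(hz)
    print(f"{hz}Hz: worst {worst:.1f}ms, average {avg:.1f}ms")
```

So the move from 125Hz to 1000Hz buys roughly 3.5ms on average - whether that matters next to human reaction time is exactly the argument in this thread.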


You win, whatever will end this fucking inane argument. The entire thing is the exclusive domain of nerdy pro wannabes obsessively trying to tweak imaginary performance boosts out of their systems. A Q3 bug all but ignored by actual 'pros' does not a pattern make. Go watch some genuine pro tournament Q3 games - they're all playing on default settings with Intellimice :lol:

High DPI mice exist to sell expensive mice to nerds. All the best players all throughout the nineties played on ball mice and low DPI laser mice.


High DPI mice exist to sell expensive mice to nerds.

All the best players all throughout the nineties played on ball mice and low DPI laser mice.

Because the technology didn't exist back then, much like how super high fps wasn't possible back in those days, what does that prove?

Now if you could pit two equally skilled players against each other, and one ran at 30fps with a low-DPI, low-refresh-rate mouse while the other guy ran with the latest and greatest, are you sure the first guy would win - that there is no advantage to super-high update rates in either the game or the input device for competitive play at the top level?


I'll post up my settings later and see if that helps.

Edit - Oh, try forcing triple buffering :) That's the other thing d3doverrider does. Or turn it off if it's on.

Yeah, I tried triple buffering on and off and there's still a weird 'feeling' to it. Though that description you've posted basically says that there is a still tiny bit of input lag anyway! Ho hum! :)


Because the technology didn't exist back then, much like how super high fps wasn't possible back in those days, what does that prove?

Now if you could pit two equally skilled players against each other, and one ran at 30fps with a low-DPI, low-refresh-rate mouse while the other guy ran with the latest and greatest, are you sure the first guy would win - that there is no advantage to super-high update rates in either the game or the input device for competitive play at the top level?

I said there was no advantage in running over the Hz rate of your monitor, not 30fps. There's a very noticeable difference in running at 24 or 30fps vs running at 60 or over. In fact, I've been extolling the virtue of the maximum framerate possible without causing tearing for the entirety of this thread. It sounds to me as if you've fallen into the trap of believing that 3000dpi vs 6000dpi, or 100fps vs 300fps, or an LCD screen embedded in your keyboard, or God knows what else makes a blind bit of difference. Probably due to having too many conversations like this one. It's the very mentality that entrepreneurial hardware manufacturers use to sell pointless nonsense to pallid basement dwellers.

Just as with all PC gaming, there's a gigantic screeching halt at a point where diminishing returns loom like a brick wall, the other side of which is populated entirely by 'pro' wannabe lunatics who like to clog up forums debating exceptionally pointless things like this, giving all the sensible PC Gamers a bad name in the process.

The two important things to maintain are a decent connection and a solid, consistent frame rate. Everything else is so far removed from reality as to be almost imaginary. But, as we know, PC gamers like to argue about this (and of course, every single one of them is in the right) so it takes on an importance that far exceeds that which it deserves.

Now, using blutack as a crosshair for the AWP in CS, that's a tangible tweak -_-

dood - got any other games you can test? Might just be that engine.


Interesting article about input lag on Digital Foundry. Most interesting is that, based on the stats, Modern Warfare reacts about twice as snappily as Halo 3, and I personally can't tell the difference. Maybe if I played them next to each other, but never have I played either game and wondered which had the higher input lag - it just seems to, you know, work.


Interesting article about input lag on Digital Foundry. Most interesting is that, based on the stats, Modern Warfare reacts about twice as snappily as Halo 3, and I personally can't tell the difference. Maybe if I played them next to each other, but never have I played either game and wondered which had the higher input lag - it just seems to, you know, work.

:blink::wacko:

One runs @60fps and the other @30fps for a start. That right there is why it reacts twice as fast.
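That halving falls straight out of the frame interval: if an engine's input-to-display pipeline is some fixed number of frames deep, each frame of that pipeline costs 33.3ms at 30fps but only 16.7ms at 60fps. A sketch - the three-frame depth here is an assumption for illustration, not Digital Foundry's measured figure:

```python
def input_latency_ms(fps, pipeline_frames=3):
    """Minimum input-to-display latency for a pipeline that is
    pipeline_frames deep (an illustrative depth, not a measurement)."""
    return pipeline_frames * 1000.0 / fps

print(input_latency_ms(60))  # 50.0ms  - the same pipeline at 60fps
print(input_latency_ms(30))  # 100.0ms - and at 30fps
```

Whatever the true pipeline depth, the ratio between the two games is set by the framerates, which is why the stats come out at "about twice as snappy".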


Going straight from one to the other it's reallllllllllly noticeable, as I said a few pages ago and got hellaciously negged for. As for the input lag difference between the two, I don't think it matters so long as you're all playing from the same position. Do wireless pads present any more input lag than wired ones? Do pro Halo 3 players have special golden braided cables on their pads to reclaim those vital few milliseconds? :lol:

Out of interest, I loaded up MW2, turned some settings down, turned Vsync off and had it running about 350fps. Not only did it feel exactly the same as when capped at 60fps, but it made firing a gun almost impossible because the entire screen looked like a zebra crossing that someone had vomited on. It's unplayable with that much tearing :lol:

