
GeForce RTX 3080

My 1660ti on my Asus ROG laptop will do fine for years.. helps that I don't play first-person shooters :)

But yea, people who recently bought a 2080 need their money back imo.. real low blow from Nvidia. Then again, the 2080 was way overpriced imo, as are most graphics cards. Graphics cards have become the "E-penis" of the gaming world. Let's look at the most popular games out there by Twitch viewers at 16:40 GMT.

1) Among Us.. LOL, yea that can run on a machine from 2008 easily.
2) Just Chatting... any graphics card going back to the 1990s can handle that.
3) Counter-Strike: Global Offensive.. that game is so old that your microwave can run it. The recommended spec is a graphics card from 2012.
4) Call of Duty: Modern Warfare.. recommended graphics card from 2014...
5) League of Legends... a potato can run that. A GeForce 8800 is recommended.. that is older than most players playing Fortnite... speaking of which.
6) Fortnite.. something made in the last half-decade, and yet the recommended card is the same as for CSGO...
7) Minecraft... lol, this game's graphics are so basic that it's nuts they recommend a card from 2014-15.
8) Warcraft... came out over 15 years ago and runs on an old engine. Anything from 2012-14 onward can run this at near-ultra settings.
9) Grand Theft Auto V... again a rather old game, and again a 2012-14 graphics card recommendation.
10) DOTA2.. again.. no need for mega graphics cards.
11) Fall Guys.. seriously, no need for an RTX 3080 there.

Point is, you would only really see any benefit in games that almost no one plays. Love when they do benchmarks with Tomb Raider.. cool, yea, err, who plays that? At least they have the new flight simulator to test new graphics cards, but again, who the hell plays that other than the few fanboys out there? Yea, let's do DOOM in ultra settings.. who the **** plays DOOM still? Right now DOOM Eternal has 211 viewers on Twitch.. DOOM has 117.
 
As suspected, this card is NOT obtainable. This is a game Nvidia is notorious for playing. They drive price and demand up through planned shortages. Already I've seen Asus cards retailing for $799, and even those are back-ordered on Newegg.
 
I always get a laugh at gamers who claim they can perceive the difference between 40, 60, 120, and 200 FPS.

The limits of human perception for 98% of us fall between 23 and 28 FPS, and some exceptional people can theoretically distinguish into the low 30s. Beyond this, you're deluding yourself.

But... there they are. The VODs where DeludedGamer97 is proudly boasting how his rig puts up 150 FPS, and his audio is sampled at 192kHz (which he swears also makes a difference despite the human ear maxing out at 20kHz), and his gaming mouse is sensitive to movement down to 2 microns, and...

My reality is that I play a lot of games, and while I like to run most at or above their middle-of-the-road settings for some additional polish, in 99% of cases the additional teraflops increase the gameplay enjoyability by precisely zero. Even then, I wonder how much of that is a placebo effect.

It is laughably false to claim our visual limit is 23-28fps.

Do your own blind test. Get a friend to queue up a 30fps video and a 60fps video. The difference is obvious.
 
I understand the general principle of a high frame rate as improving the precision of motion interpolation.

My argument is that if controlled scientific experiments were run on a randomly selected cohort of gamers, the results would show that most couldn't reliably distinguish between 40 FPS and 60 FPS. Of those who could, their performance at 60 FPS would be negligibly better. And comparing 60 FPS to 120 FPS, etc., would reveal no statistically significant difference whatsoever.
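
To make "reliably distinguish" concrete, here's a rough sketch of how I'd score a single participant in a blind two-alternative forced-choice test against pure guessing. The trial counts and the cutoff are made-up illustrations, not results from any study:

```python
# Illustrative sketch only: scoring a blind "which clip ran at the higher
# frame rate?" test against chance with an exact binomial calculation.
# The numbers below (20 trials, 15 correct) are hypothetical.
from math import comb

def p_value_at_least(correct: int, trials: int, chance: float = 0.5) -> float:
    """Probability of getting `correct` or more answers right by guessing alone."""
    return sum(
        comb(trials, k) * chance**k * (1 - chance) ** (trials - k)
        for k in range(correct, trials + 1)
    )

trials, correct = 20, 15          # hypothetical blind 40 FPS vs 60 FPS trials
p = p_value_at_least(correct, trials)
print(f"{correct}/{trials} correct -> p = {p:.4f}")  # ~0.021: unlikely to be pure guessing
# Around 10/20 correct, p is ~0.59: indistinguishable from a coin flip.
```

Clearing that bar consistently across a cohort is what I'd count as a statistically significant difference.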

I think where a rig that can sustain 120 FPS is genuinely differentiable from one that sustains 60 FPS or 40 FPS is that situations often arise when the GPU is briefly overburdened and the frame rate dips to as little as 25% of the sustained rate. A sustained 40 FPS occasionally dips to 10 FPS, which is admittedly brutal, while a sustained 120 FPS only dips to 30 FPS, which would barely be noticeable. On that basis, I can see the justification for more power.

It's interesting you mention game streaming services, because there you've got anywhere from 150 to 300 ms of round-trip lag on your inputs baked in. Not quite the same thing as frames, I know, but for the gamer who claims he needs 60 rather than 30 FPS (a difference of ~16 ms per frame) to time his inputs right, what's the streaming going to do to his performance?
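
Purely to put those numbers side by side (the 25%-dip scenario and the ~16 ms frame-time gap versus a 150-300 ms streaming round trip), here's a quick back-of-the-envelope calculation using only the figures already quoted above:

```python
# Back-of-the-envelope only: per-frame time budgets, the "dips to 25% of
# sustained" scenario from the previous paragraph, and the 30-vs-60 FPS
# frame-time gap next to a 150-300 ms streaming round trip.
def frame_time_ms(fps: float) -> float:
    """Time budget per frame, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 40, 60, 120, 200):
    dip = fps * 0.25  # brief GPU overload: frame rate falls to 25% of sustained
    print(f"{fps:>3} FPS sustained: {frame_time_ms(fps):5.1f} ms/frame, "
          f"dips to {dip:.1f} FPS ({frame_time_ms(dip):5.1f} ms/frame)")

gap = frame_time_ms(30) - frame_time_ms(60)   # ~16.7 ms per frame
print(f"Going from 30 to 60 FPS saves {gap:.1f} ms per frame; "
      f"a streaming round trip adds 150-300 ms on top of every input.")
```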

I think you are interpreting "no statistical increase in your gaming performance" as "people can't tell the difference."

It's true: a 120fps screen isn't going to make me better at Call of Duty, because I'm terrible at Call of Duty and no amount of hardware is going to overcome my lack of skill. You have to get to the extreme upper limits of competitive play before monitor framerate becomes a noticeable impediment to a player's performance, a level of play where milliseconds really do count.

But there absolutely is a visible difference between a monitor running at 60fps and a monitor running at 120fps.
 
It is laughably false to claim our visual limit is 23-28fps.

Do your own blind test. Get a friend to queue up a 30fps video and a 60fps video. The difference is obvious.
I just did. I genuinely can't tell the difference between the 30 FPS and the 60 FPS footage. I've been a gamer for 35 years. Maybe my brain is exceptionally good at motion interpolation, but I doubt it.

Individual tests and anecdotal evidence are meaningless. I'm talking about an actual series of controlled trials that take the placebo effect out of play.
But there absolutely is a visible difference between a monitor running at 60fps and a monitor running at 120fps.
Apparently the issue is quite complicated and the jury is still out among the experts. But I'd definitely take the bet that you couldn't reliably distinguish between 60 and 120 FPS in a series of controlled trials.
 
I just did. I genuinely can't tell the difference between the 30 FPS and the 60 FPS footage. I've been a gamer for 35 years. Maybe my brain is exceptionally good at motion interpolation, but I doubt it.

Individual tests and anecdotal evidence are meaningless. I'm talking about an actual series of controlled trials that take the placebo effect out of play.

Apparently the issue is quite complicated and the jury is still out among the experts. But I'd definitely take the bet that you couldn't reliably distinguish between 60 and 120 FPS in a series of controlled trials.

Not only is there a noticeable, quantifiable, and specific difference between 30FPS and 60FPS, there's even a difference between 24FPS and 48FPS.
It's the reason why that big fat expensive Hobbit movie got panned by so many critics.
For all the budget and all the hard work, the damn thing looks like VIDEO instead of film because it was shot at 48FPS.

People are buying high-dollar HDTV sets that mostly default to 120FPS with extra motion smoothing, and the result is EVERYTHING looks like video until you disable all that nonsense.

There is also a psychological and emotional connection to specific frame rates.
24FPS was decided upon not just for economic considerations but also because the mind does interesting things at 24 that it does not do at 29.97 or 30.
It also does other interesting things at 15/16FPS that are different from what it does at 24FPS.

By the way @COTO, you have a very fundamental misunderstanding about what is even MEANT BY "192kHz".
That is not a measure of audio frequency range; it's the digital sampling rate, which is not the same thing as quantization (bit resolution).

A 24-bit 192kHz recording has a bit depth of 24 bits and is sampled AT 192kHz.
 
PS: The only reason TV was originally tied to 30 FPS (and later 29.97 FPS, when color came around) is that our electrical mains power comes to us at 60Hz and it was relatively easy to sync 60 interlaced fields per second to the mains frequency; in Europe and most of the rest of the world, they tied FIFTY fields per second to their 50Hz mains frequency.

Above, I am referring to the old analog NTSC and European PAL systems, neither of which exists anymore.
But what does still exist is the 59.94/60 FPS legacy standard tied to our mains frequency, although it no longer needs to be.

Television engineers desperately wanted TV to be 24 FPS, but they could not keep everything stable enough due to the technical limitations of 1930s electronics. They had the same aspect ratio as most motion picture film but not the right frequency or frame rate, thus necessitating 3:2 pulldown (spreading 24 film frames across 60 video fields each second) when movie film was transmitted over television.
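
For anyone curious, here's a small sketch of that pulldown cadence; the frame labels are made up, but the alternating 2-field/3-field pattern is the standard one:

```python
# Illustrative sketch of 3:2 pulldown: 24 film frames per second are spread
# over 60 interlaced fields per second by giving alternate film frames
# 2 fields and 3 fields.
def pulldown_32(film_frames):
    """Yield (film_frame, field_count) pairs in the repeating 2-3 cadence."""
    for i, frame in enumerate(film_frames):
        yield frame, 2 if i % 2 == 0 else 3

one_second_of_film = [f"F{i:02d}" for i in range(24)]   # 24 film frames
total_fields = sum(fields for _, fields in pulldown_32(one_second_of_film))
print(total_fields)  # 60 fields -> 30 interlaced video frames per second
```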
 
Not only is there a noticeable, quantifiable, and specific difference between 30FPS and 60FPS, there's even a difference between 24FPS and 48FPS.
It's the reason why that big fat expensive Hobbit movie got panned by so many critics.
For all the budget and all the hard work, the damn thing looks like VIDEO instead of film because it was shot at 48FPS.
24 FPS is below the perception threshold, and you yourself point out that there's a specific "movie quality" we've come to expect.

Post #21 talks about 40, 60, 120, 200 FPS.

I'd bet you that maybe 40% of "high performance" gamers could reliably distinguish between 40/60, 5-10% could reliably distinguish between 60/120, and the number who could distinguish between 120/200 FPS is statistically insignificant.

People are buying high-dollar HDTV sets that mostly default to 120FPS with extra motion smoothing, and the result is EVERYTHING looks like video until you disable all that nonsense.
This much is true. Although this is a jump from 24 FPS to 120, not 60 to 120.

By the way @COTO, you have a very fundamental misunderstanding about what is even MEANT BY "192kHz".
That is not a measure of audio frequency range; it's the digital sampling rate, which is not the same thing as quantization (bit resolution).

A 24-bit 192kHz recording has a bit depth of 24 bits and is sampled AT 192kHz.
I know the difference between sampling rate and bandwidth, which is why I said "audio sampled at 192kHz" in #21. I'm an electrical engineer.

My point in #21 is that 192kHz / 2 = 96kHz, which exceeds 20kHz by a pretty substantial factor.

I don't mention bit depth anywhere in the thread, but FWIW, I'd bet you dollars to dimes that a controlled series of trials to distinguish between 16-bit and 24-bit audio would find the vast majority of gamers unable to reliably distinguish between them either.
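
Since we're trading formulas, here's the arithmetic spelled out: half the sampling rate gives the Nyquist bandwidth, and the usual rule of thumb for linear PCM dynamic range is roughly 6 dB per bit. Textbook figures only, nothing measured on anyone's rig:

```python
# Textbook arithmetic only: Nyquist bandwidth from a sampling rate, and the
# approximate dynamic range of linear PCM (~6.02 dB per bit, full-scale sine).
def nyquist_khz(sample_rate_khz: float) -> float:
    """Highest frequency a given sampling rate can represent."""
    return sample_rate_khz / 2

def dynamic_range_db(bits: int) -> float:
    """Approximate dynamic range of linear PCM at a given bit depth."""
    return 6.02 * bits + 1.76

print(nyquist_khz(192), "kHz of bandwidth vs ~20 kHz of human hearing")  # 96.0
print(nyquist_khz(44.1), "kHz, already past the audible range")          # 22.05
print(f"16-bit: ~{dynamic_range_db(16):.0f} dB, 24-bit: ~{dynamic_range_db(24):.0f} dB")
```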
 
I know the difference between sampling rate and bandwidth, which is why I said "audio sampled at 192kHz" in #21. I'm an electrical engineer.

My point in #21 is that 192kHz / 2 = 96kHz, which exceeds 20kHz by a pretty substantial factor.

I don't mention bit depth anywhere in the thread, but FWIW, I'd bet you dollars to dimes that a controlled series of trials to distinguish between 16-bit and 24-bit audio would find the vast majority of gamers unable to reliably distinguish between them either.

Remember, 192 is labeled and referred to as "audiophile" quality, one way or another.
It is described as ultra, super, super-duper, audiophile, etc....and that IS "the one to five percent who CAN tell the difference."

It is not a surprise when you point out human nature! 🤣
Of course the majority fancy themselves audiophiles and can't really tell the difference. Of course the majority can't tell the difference between 2K and 4K.
Hell, most of them don't even realize that the difference between 2K and 1080 (1920X1080) is.... 80 !!

At age sixty-four and with a misspent youth as a rock and roll keyboard player in an age when hearing protection was unheard of, I now have frying pans for ears and can't discern much in the region above 5000~8000 Hz.
But I can still hear bass frequencies. The train yard six miles away, just as one example.
A car door slamming in the street when our bedroom is at the back of the house, a LARGE house at the end of a 600-foot driveway. Another example.
But when my wife speaks to me in her characteristic sotto voce, I have to lean over and say "What?"
My eyesight isn't what it used to be either, and neither is my wife's.
She wants to get another 70+ inch 4K TV set and I asked her why...she and I wouldn't know the difference anymore unless we sat two feet from the screen.

I can't speak for gamers, because I'm not one, but as a film editor I demand and try to work with those specs as much as possible because I don't want my work to take a quality hit when it gets compressed for delivery, which it does on almost ANY platform you can pick.
And I get what I want, too.
Ask anyone in the industry what it takes to take ancient analog NTSC videotape footage from 1972 and make it good enough to look delicious on a sixty-foot theater screen.

[Attached images: Triple comparison DVD.jpg, FullSizeRender (2019_02_27 17_41_55 UTC).jpg]
 