Since I do zero game creation or modding I have no idea what all that meant, but it sure sounded interesting! Basically, my understanding is that with a higher-powered graphics card I get faster rendering and more "stuff" to see. With that understanding, and my personal interest in how better graphics capabilities improve my gaming experience in 3D space, I've been really impressed with the attention to detail in landscape renderings. I started to pay more attention to that kind of thing with Morrowind, but more recent games such as Red Dead Redemption 2 and The Witcher 3 have gorgeous scenery. In all these games, however, the NPCs still move unnaturally. I've seen great improvement in skin and clothing textures, but all the NPCs seem to walk at the same speed, keep their heads pointed in the same direction, adopt the same dozen or so poses, and so on. They also never move ANYTHING in their villages: if there's a barrel or a bale of hay next to a barn one day, it will be there every time you pass that spot, even though the folks in town are supposed to be working.
I guess that isn't really a graphics issue as much as an AI issue, but if we could see as much attention put into creating a more dynamic environment as we've seen put into textures and landscapes, I figure that would improve the gaming experience in my eyes.
All I know is that it's too expensive to try and keep up.
I try to dump enough money into a machine that I'll get 5-7 years out of it.
I don't play games enough to make it worth it.
I went pretty high-end in 2014 when I had my previous machine built. It was, and still is, fine for most of what's on the market. The only thing that was messing me up was lack of storage, but that was easy to resolve with external hard drives. Anyway, it had been 6 years and I kind of wanted a new machine anyway, so I dropped $3k on my current machine. It should serve well for another 5-7 years or more.
The problem is we are nearing the brick wall on hardware, so either coders need to be highly efficient to push such advances on current hardware, or the next generation of hardware (quantum, ternary, etc.) needs to be introduced, as current 64-bit binary is nearing its limits with not much room left to improve.
I'm surprised nobody has mentioned ray tracing yet. That's the future of photorealistic graphics.
The OP is right that all these advances will cost a lot of computational power.
With the release of the Nvidia 3080 and 3090, we have, in theory, reached the upper limit of graphics output. The 3090, for example, can run a game at 8K @ 60 fps.
There is a weird gray area in human vision vs. reaction, where humans can react faster than our eyes can theoretically see (there is evidence that skilled gamers benefit from frame rates up to 240 even though the eye technically doesn't process images that fast). That, however, has more to do with the precision of the mouse and the rate at which hitboxes update.
So if 8K is, at normal viewing distance, higher resolution than the eye can see, and anything over 60 fps is really beyond human perception, what is the next big leap for graphics?
I think the easy answer is pushing polygons, but that hits its theoretical ceiling at about 33 million, at which point each pixel on an 8K display represents a polygon. It could be argued that the practical cap is closer to 11 million polygons, since a true polygon would need at least 3 pixels.
Complex game models today have polygon counts in the hundreds of thousands, so I don't think the polygon cap is really worth worrying about: we'll hit the theoretically useful ceiling soon, if the 3090 isn't capable of it already.
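The arithmetic behind those ceilings is easy to check; here's a quick back-of-the-envelope sketch (the 3-pixels-per-polygon minimum is the rough assumption from above, not a hard rule):

```python
# Back-of-the-envelope check of the polygon ceiling for an 8K display.
# 8K UHD resolution is 7680 x 4320 pixels.
width, height = 7680, 4320
pixels = width * height
print(pixels)  # 33177600 -> the ~33 million figure

# If a "true" polygon needs at least 3 pixels to be distinguishable,
# the useful polygon cap drops to roughly a third of that:
min_pixels_per_polygon = 3
polygon_cap = pixels // min_pixels_per_polygon
print(polygon_cap)  # 11059200 -> the ~11 million figure
```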
I think the next big move in graphics will be in two old technology pathways that were until fairly recently limited by a processing bottleneck: Voxels and Physics.
Voxel graphics have been around forever and have been very useful for creating moldable terrain. A voxel engine turns the virtual world into a set of discrete 3D pixels (usually cubes); Minecraft is a rudimentary voxel engine, for example. As graphics cards push their FLOP counts toward supercomputer levels, we are approaching the point where, in a few years, pixel counts will no longer matter (much as 2D render speeds eventually became a pointless comparison), and we will be measuring graphics in voxels per second (VPS).
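To make the "discrete 3D pixels" idea concrete, here is a minimal sparse-voxel sketch in Python. All the names are invented for illustration; a real engine would use chunked arrays on the GPU, not a dictionary:

```python
# Minimal sketch of a sparse voxel world: a mapping from discrete
# (x, y, z) cells to a material ID, much like Minecraft's blocks.
world = {}  # (x, y, z) -> material string

def set_voxel(pos, material):
    world[pos] = material

def remove_voxel(pos):
    # "Moldable terrain": carving the world is just deleting cells.
    world.pop(pos, None)

# Build a 4x1x4 patch of dirt terrain.
for x in range(4):
    for z in range(4):
        set_voxel((x, 0, z), "dirt")

remove_voxel((1, 0, 1))  # dig a hole
print(len(world))        # 15 voxels remain out of the original 16
```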
On top of that, I would not be surprised to see a return to discrete physics cards, which would have their own drivers and perform a whole host of calculations for virtualized material physics. For instance, an artist could build a 3D world and assign a voxel-based block a material identifier like "concrete"; when that object was struck by a small, fast-moving voxel-based entity with the identifier "lead", the physics card would calculate the impacts, fractures, and relative speeds of particles, then hand off to the graphics card/engine where and how to render the resulting voxels.
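The material-identifier idea above boils down to a lookup keyed on what hit what, and how fast. A toy sketch, with made-up material names and thresholds:

```python
# Toy sketch of material-interaction dispatch: the target material's
# fracture threshold decides how a struck voxel reacts on impact.
# Materials and speeds are invented purely for illustration.
FRACTURE_SPEED = {"concrete": 300.0, "wood": 120.0}  # m/s, made up

def resolve_impact(projectile_material, speed, target_material):
    """Return what the renderer should draw for the struck voxel."""
    threshold = FRACTURE_SPEED.get(target_material, float("inf"))
    if projectile_material == "lead" and speed >= threshold:
        return "fracture"  # hand fragment geometry off to the GPU
    return "intact"

print(resolve_impact("lead", 400.0, "concrete"))  # fracture
print(resolve_impact("lead", 50.0, "concrete"))   # intact
```

A real physics accelerator would of course resolve momentum transfer per particle rather than a single threshold, but the division of labor (physics card decides, graphics card draws) is the same.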
One outcome of this would be a temporary return to solo gaming for the wow-factor experiences, until network speeds are fast enough to share that much change data... OR we would see the birth of dumb-terminal gaming, where all users connect remotely to a central data center, all user interactions are managed server-side, and only control inputs and screen draws are exchanged between the end user and the server: the ultimate version of game streaming.
So I guess that is three things to look for in the next 10 years:
1) Voxel graphics accelerators
2) Discrete physics accelerators
3) Growth in server-side gaming
The limiting factor on computational power is becoming the machine's ability to dump heat. Many servers are liquid-cooled; it wouldn't surprise me to see home computers in the near future adopt the same technology. Like all nascent technologies, that would be expensive at first but would gradually become cheaper.
The limiting factor is actually not the ability to dump heat. Nearly two decades ago we had Pentium 4 processors (as well as Celerons) out-clocking most modern computers, yet they actually performed slower. The issue is bandwidth and how it is utilized: multi-core processors worked better than highly clocked, heat-pumping single cores, when software allowed it, because bandwidth and processing were distributed among multiple cores.
Naww. Just think how fast those GPUs could mine bitcoin!