VR - experimental support


Re: Oculus Rift - experimental support

#1921 Post by Lombra » 27 Apr 2022 21:05

750 W is plenty.

Re: Oculus Rift - experimental support

#1922 Post by Donric » 02 May 2022 17:40

I tried to run ETS in OpenXR with OpenComposite. It's working, but I get some odd flickering around the navigation display and most of the road signs are empty. This only happens when I run with OpenXR; with OpenVR all is good. This is with a Varjo Aero - any ideas what's going on?

Re: Oculus Rift - experimental support

#1923 Post by Some newbie driver » 02 May 2022 19:25

KingOfTheRoad77 wrote: 27 Apr 2022 17:46: But a new power supply would be an additional investment of 200-250 €, depending on whether to go "only" for 850 W or directly for 1000 W in order to have enough reserve for an NVidia RTX 4080 sometime later on, hopefully...
First and foremost: next-gen high-tier GPUs will use a completely new power connector that supersedes the current 6/8-pin GPU connectors and the stupidly proprietary thing NVidia put on their 3000-series Founders Edition models. And as far as I know, no decent PSU compatible with it has been released yet (although some good modular models may eventually get a reliable adapter of some kind; I don't know that field well enough to say). So it's a bad idea to buy a PSU now with "future-proof" expectations, aiming at next-gen high-tier GPUs. Best case, you'll have to spend a few extra tens of bucks on an adapter. Worst case, you'll only be able to use it to upgrade your mom's PC.

Second, about those predictions of what will be "plenty" of wattage for the future: the strongest current leaks point towards the upcoming 4090 reaching 600 W (and that would be NVidia's reference model, not the more tuned versions other AIBs may attempt). To put that in perspective, the almost-vaporware 3090 Ti is specced at 450 W, so they expect a 33% increase in power draw in a single generation. And of course I'm talking about the 4090, not the 4080 you mentioned; but if 600 W is likely for that model, take for granted that a 4080 will sit around 450 W like the current 3090 Ti. Go check what kind of PSUs the (serious) reviewers on the internet are using to tame those beasts and you'll get a pretty good idea of how wild things are getting.
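As a rough back-of-the-envelope check of those figures (a minimal sketch in Python; the 600 W number is a leak, not a confirmed spec):

    # Generational power-draw comparison (the 4090 figure is a rumour, not a spec)
    tdp_3090ti = 450       # W, published board power of the 3090 Ti
    tdp_4090_rumour = 600  # W, strongest current leak for the reference 4090
    increase = (tdp_4090_rumour - tdp_3090ti) / tdp_3090ti
    print(f"{increase:.0%} more board power in one generation")  # -> 33%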

And that's just the GPU; you'll need a heck of a CPU to squeeze out all the juice such a GPU can provide (otherwise, why waste the money?). No problem, Intel's new high-end gaming CPUs reach up to 240 W. Yeeehaaa! I wonder if those companies are buying stakes in electrical utilities... 8-)

Third and most important: the real problem is NOT the "nominal" wattage we see in the specs and marketing material. The problem is the spikes. A PSU has to be able to sustain spikes above its rating for brief periods before over-current protection triggers, and that has already been a problem with current-gen GPUs (even before the most insane CPUs appeared later). The 3090 is rated at 350 W, yet several reviewers explained how they needed a 1000 W PSU to get stable benchmark runs. Current high-end components produce very high current spikes; I mentioned Intel's and NVidia's power draw so far (they are the worst offenders), but AMD has its own examples too. And this race in wattage makes me foresee the problem only getting worse. Next-gen AMD cards are expected to be far more efficient than NVidia's, but efficiency is the ratio of power draw to performance; nothing suggests their power draw isn't going to be high as well (just not as absurdly high as NVidia's).
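To make the spike argument concrete, here is a hypothetical sizing sketch; the 2x transient factor and the "rest of system" wattage are illustrative assumptions, not measurements:

    # Hypothetical PSU sizing with transient headroom (all numbers illustrative)
    gpu_board_power = 350    # W, e.g. a 3090's nominal rating
    cpu_package_power = 240  # W, e.g. a high-end gaming CPU under full load
    rest_of_system = 100     # W, board, RAM, drives, fans (rough guess)
    transient_factor = 2.0   # assumed worst-case GPU spike multiplier

    steady_state = gpu_board_power + cpu_package_power + rest_of_system
    spike_load = gpu_board_power * transient_factor + cpu_package_power + rest_of_system
    print(f"steady state ~{steady_state} W, brief spikes ~{spike_load:.0f} W")
    # A PSU sized only for the ~690 W steady state would keep bumping into its
    # over-current protection during ~1040 W spikes; that is why reviewers end
    # up recommending 1000 W units for these cards.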

There are people out there bragging about how their random 500 W PSU handles a high-tier GPU and how happy they are with it. Well, those examples mean those crap PSUs spend a lot of time, and for long stretches, in their over-current zone, and over-current protection is exactly where cheap manufacturers cut corners to save money. I hope those people never have to learn the lesson the hard way: the day their computer (and the house holding it) catches fire because some component (like the cables) ends up melting from overheating.

The new PSU specification that includes the new GPU connector I mentioned also adds a lot of constraints to make the over-current protection (and the whole PSU) better. But that only applies to products that comply with the specification honestly, and the PSU market is bloated with pirates: from companies that send hand-picked units to the test labs while the regular retail units never comply, to companies that outright falsify their compliance. Not to mention that the cables for those new connectors will have different ratings (yes, the cable connector is supposed to carry a label with the maximum power it's rated for). The amount of confusion and trouble regular customers will face when all of this hits the mass market is going to be terrific.

So, sorry for the wall of text, but this is for the safety of your wallet and your skin. Really, the last year has been the worst time ever to plan a gaming PC purchase.

Regards

PS: For anybody who thinks those wattages are "not a problem", I suggest a test. Put a space heater in your room and run it at half power (that would be around 750 W in the US, for example). Then notice how long you can stay in that room before you have to crank up whatever ventilation you have to avoid heat stroke. Maybe in winter you'd save some money on central heating, but I bet less than the money you'll burn on AC in summer.

PS2: And to add more salt to the absurdity of this trend: a regular US wall outlet can only deliver about 1500 W (and it's not a good idea to draw that for long periods, just in case somebody cheaped out on your electrical installation). So if things keep going this way, people will end up installing special heavy-duty outlets and wiring just to plug in a personal computer. Live long enough and you'll see strange times. :roll:
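For reference, that 1500 W figure follows roughly from a standard US 15 A / 120 V circuit and the usual 80% rule for continuous loads:

    # Standard US residential outlet limits
    volts, amps = 120, 15            # typical 15 A breaker on a 120 V circuit
    peak_w = volts * amps            # 1800 W absolute circuit limit
    continuous_w = peak_w * 0.8      # ~1440 W recommended for sustained loads
    print(peak_w, continuous_w)      # 1800 1440.0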

Re: Oculus Rift - experimental support

#1924 Post by KingOfTheRoad77 » 03 May 2022 06:16

@Some newbie driver
I totally agree with you.
But especially when it comes to reliability, overheating and over-current protection, I'd never go for a cheap budget PSU... For more than ten years now, I think, I've been using be quiet! PSUs and have already recommended at least four of them to my father, my brother and two friends of mine. All still working perfectly (okay, my first one (500 W) has been sold, so I can't say for that one ;) - but my second (750 W) is almost seven years old now).
So: definitely no risky "games" with PSUs 👍

Finally, I think that for VR gaming, especially with SCS's truck simulators, the investment in the 5800X3D would probably pay off much more than a 4080... First, it seems to provide an FPS increase of up to 40% within the current generation - which sometimes isn't even the case from one generation to the next... Second, it will probably cost less than half of what a 4080 will cost. Third, it only draws up to 40 W more than my current 5600X - not the 100-200 W more a 4080 would draw compared to the 3080.
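A very rough value check of that reasoning (a sketch only; the price and the 40% figure are estimates from this thread, and the 4080 price is only implied by "less than half"):

    # Rough cost-per-uplift check (all inputs are estimates from the posts above)
    x3d_price_eur = 530     # street price mentioned in this thread
    x3d_fps_gain_pct = 40   # best-case uplift reported for the truck sims
    print(f"~{x3d_price_eur / x3d_fps_gain_pct:.0f} EUR per percent of FPS, best case")
    # "Less than half of what a 4080 will cost" implies the 4080 lands above
    # roughly 2 * 530 = 1060 EUR, with a +100-200 W power-draw penalty on top.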

So I'll probably go for the 5800X3D soon.
I'm just a bit unsure whether to grab it now for 520-530 € (= +30) - or whether to wait until it drops below 500 € again, or perhaps even towards 450. But it could also be that I miss the last units now, have to wait months until it becomes available again - and it costs more than 600 € by then 😂

Really no easy times for PC hardware decisions and purchases 👍

Re: Oculus Rift - experimental support

#1925 Post by Some newbie driver » 03 May 2022 06:42

Beware with that CPU, because it's a very unique and special beast. Due to how it's manufactured (and it being the first of its kind), AMD had to reduce the base clock compared to the 5800X and also block the usual "safe" ways of overclocking it (some very skilled people have managed to overclock it by other means, but they are not "normal" users).

The result is a CPU whose advantages depend VERY heavily on how the software you run works internally. All the reviews I've seen reached the same conclusion: big, nice gains in some programs, and little gain or even losses in others. So the best thing would be to find a review of that CPU covering SCS games (or whatever else you like to play) to be sure before spending that amount of money.

Regards

Re: Oculus Rift - experimental support

#1926 Post by KingOfTheRoad77 » 03 May 2022 06:53

Yes, I know.
But CPU overclocking isn't my scene - too many parameters, usually less than 5% performance gain, and exponentially increasing heat... no, thanks ;)
So no problem there from my side.

Concerning SCS's truck sims, @grantalope seems to be having very good experiences with it (viewtopic.php?p=1686983#p1686983).
Elite Dangerous would be a second candidate where I'd like to see how it performs with the 5800X3D. Project Cars (1+2). Dirt Rally 2.0. Perhaps Subnautica in VR as well, although its atmosphere is even more nightmarish in VR... ;)

Apart from that (VR), I mostly play a couple of older games in "flat 2D" on a 47" TV in FullHD @60 Hz - those weren't a problem at all even for my former 2600X ;)

Re: Oculus Rift - experimental support

#1927 Post by Some newbie driver » 03 May 2022 08:16

Well, "OC" covers a very broad range of changes. Hard, heavily tuned overclocking tends to be of no interest to most users, but in recent years the companies have added lots of tools that make a certain "basic" level of OC as easy as clicking a few buttons in a desktop program. Specifically for AMD, matching the clocks of the RAM, the Infinity Fabric and the memory controller 1:1:1 brings a huge benefit without needing very high clock values on the CPU or RAM (in fact, it gives far more smoothness, which matters for VR, than pushing higher clocks on individual components while leaving them out of sync). As far as I know, that is off the table with the 5800X3D too.
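For anyone unfamiliar with the 1:1:1 idea, a quick illustration (a minimal sketch; DDR4-3600 is just a common example kit, not a recommendation):

    # Ryzen "1:1:1": memory clock (MCLK), Infinity Fabric clock (FCLK) and
    # memory controller clock (UCLK) all running at the same frequency.
    ddr4_rate = 3600              # MT/s, e.g. a DDR4-3600 kit
    mclk = ddr4_rate / 2          # DDR -> 1800 MHz real memory clock
    fclk = uclk = mclk            # match Infinity Fabric and controller to it
    print(mclk, fclk, uclk)       # 1800.0 1800.0 1800.0 -> "1:1:1"
    # Letting the clocks fall out of sync (2:1 mode) adds latency, which hurts
    # frame pacing more than a small raw clock bump helps.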

Compared with the 2600X you mention, of course, it would be a huge upgrade. But it's possible that for your use the regular 5800X would be an even better choice, with good savings on top (I really doubt the 3D drops below those 500+ prices before the next gen appears). And even if it weren't, the upgrade from the 2600X would still be huge while keeping the savings. That's why I warned you, in case you weren't aware.

Regards

Re: Oculus Rift - experimental support

#1928 Post by KingOfTheRoad77 » 03 May 2022 08:25

Some newbie driver wrote: 03 May 2022 08:16 Well, "OC" covers a very broad range of changes. Hard, heavily tuned overclocking tends to be of no interest to most users, but in recent years the companies have added lots of tools that make a certain "basic" level of OC as easy as clicking a few buttons in a desktop program. Specifically for AMD, matching the clocks of the RAM, the Infinity Fabric and the memory controller 1:1:1 brings a huge benefit without needing very high clock values on the CPU or RAM (in fact, it gives far more smoothness, which matters for VR, than pushing higher clocks on individual components while leaving them out of sync). As far as I know, that is off the table with the 5800X3D too.

Compared with the 2600X you mention, of course, it would be a huge upgrade. But it's possible that for your use the regular 5800X would be an even better choice, with good savings on top (I really doubt the 3D drops below those 500+ prices before the next gen appears). And even if it weren't, the upgrade from the 2600X would still be huge while keeping the savings.
No, the 2600X was before Dec '20.
Since Dec '20 I've owned a 5600X, which already gave me a boost of ~15% in VR.
But it seems the 5800X3D might give another 20-40% of performance on top of the 5600X, especially in the truck sims, if grantalope is right - and why shouldn't he be, as he has already installed it ;)

For the moment I'll just keep watching the prices. Perhaps some kind of opportunity will come up soon - if not, the 5600X is still doing its job well enough :)

Re: Oculus Rift - experimental support

#1929 Post by Some newbie driver » 03 May 2022 08:52

Ah, OK, I got lost among the different CPUs mentioned. So in your case the best option, as you say, is to wait for a good price opportunity, because otherwise I don't see the normal price of the 3D one being good value just to upgrade from a 5600X. It would be a noticeable upgrade, of course, but the cost is hard to justify. In the long run, if you want to keep that computer for quite a while, then once the next gen appears and prices drop, the 3D would be a good way to give an AM4 build the longest possible life for a reasonable price.

Next-gen GPUs are going to be PCIe 5.0, but current ones don't even saturate PCIe 4.0, so that won't be an issue at all for a normal GPU with a full x16 bus (it is a problem for cards like the AMD 6500 with its cut-down x4 PCIe connection). Next-gen CPUs (and the current Intel ones) will also support DDR5, but DDR5 prices are absurdly high, the best DDR4 modules still beat DDR5, and the 3D cache is specifically aimed at reducing dependence on RAM performance. So the 5800X3D looks like a good long-term option for people who want to keep an AM4 platform and skip the first AM5 generation.
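To put rough numbers on the PCIe point (approximate usable bandwidth per direction, a sketch rather than exact figures):

    # Approximate usable PCIe bandwidth per lane, per direction (GB/s)
    per_lane_gbs = {"3.0": 0.985, "4.0": 1.969, "5.0": 3.938}
    for gen, bw in per_lane_gbs.items():
        print(f"PCIe {gen}: x16 ~{bw * 16:.0f} GB/s, x4 ~{bw * 4:.1f} GB/s")
    # A full x16 Gen4 link (~32 GB/s) is nowhere near saturated by current GPUs,
    # but a cut-down x4 card like the 6500 drops to ~3.9 GB/s in a Gen3 slot,
    # which is where the real-world penalty shows up.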

Regards

Re: Oculus Rift - experimental support

#1930 Post by KingOfTheRoad77 » 09 May 2022 16:36

grantalope wrote: 24 Apr 2022 08:06 OK, got my 5800X3D installed and jumped in the game for about an hour. The hype seems to be real with this chip. It feels like I'm completely unbottlenecked on an RTX 3080 now. I was cruising around at night in ATS with a rock-solid 90 fps and zero stuttering, with consistent ~95% GPU usage (I never see it that high). My CPU times in fpsVR dropped from ~15-17 ms to ~5-7 ms.
Hi,

well, I got my 5800X3D on Saturday and installed it yesterday.
Played ATS for an hour with my Pimax 8K-X, first in Oculus mode, then half an hour in OpenVR/SteamVR mode.

Result:
Disillusioning and sobering: no FPS gain at all - or perhaps just a couple of frames. Far from the values you mentioned.

New 5800X 3D:
[ external image ]

Former 5600X:
[ external image ]

So it seems that at the 8K-X's resolution I am already fully bottlenecked by the GPU...
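A quick frame-budget check of why a faster CPU doesn't show up in the FPS number (a rough sketch, assuming a 90 Hz target and an illustrative GPU time at the 8K-X's resolution):

    # Frame-budget check at a 90 Hz target (illustrative numbers)
    budget_ms = 1000 / 90            # ~11.1 ms per frame at 90 Hz
    cpu_old_ms, cpu_new_ms = 16, 6   # ms, roughly the drop grantalope reported
    gpu_ms = 12                      # ms, assumed GPU time at 8K-X resolution
    frame_ms = max(cpu_new_ms, gpu_ms)
    print(f"budget {budget_ms:.1f} ms, frame ~{frame_ms} ms -> still GPU-bound")
    # If the GPU alone already needs more than ~11.1 ms, cutting CPU time from
    # ~16 ms to ~6 ms can smooth out stutter but cannot raise the FPS.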

CPU frametimes have dropped a bit (>30 ms => 23.2%; <13.3 ms => 56.9% instead of only 30.3%) - but in the end this is hardly noticeable.
The only really positive thing is that those little load-into-RAM stutters have become much, much better now - also hardly noticeable, no more 0.5 s freezes and judder.
But... well...
...is this worth 530 €?

So right now I'm a bit unsure what to do:
=> Keep it, wait, and hope it helps to fully unleash an RTX 4080 later on right from the start...? 500 € now for a "maybe" later 🤔
=> ...or go back to the 5600X?