FPS change at 720p?

We

I have an Xbox One X and would like to know whether I would get a more consistent 60 FPS in games if I went to

General > TV and Display Options > Display

and changed the resolution from 1080p to 720p. I would also like to know whether I would get a more consistent 60 FPS if I went to

> Advanced > Video Quality & Overscan

and changed the color depth (24 bits per pixel, 30 bits per pixel, 36 bits per pixel).

Thank you

ma

No, you're locked to 60 FPS.

We

I mean, you often get FPS drops in games, so I thought those would no longer occur in most situations if I set the resolution to 720p. Or am I misunderstanding something?

El

No, your TV has nothing to do with Xbox performance.

We

Ah okay

ma

Yes, you're misunderstanding something. It's not a PC; you can't just change a few settings or do a bit of overclocking and be done.

The problem is that both the PS4 and the Xbox One got the worst processor available at the time. (Shout-out to people with the FX 8350.) The rest of the hardware (GPU, RAM) is also rather weak. There's simply no more performance to be had; you could play at 480p and it wouldn't get any better.

We

Okay thank you

We

So with the video quality settings I can set the color depth to the highest value, i.e. 36 bits per pixel, because that only depends on the television?

ma

Yes.
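
To make that "yes" concrete: the color-depth setting only changes how many bits per pixel are sent over the HDMI link to the TV; it does not change how many pixels the console has to render. A rough back-of-the-envelope sketch in Python (the 148.5 MHz pixel clock is the standard figure for 1080p60 including blanking; TMDS encoding overhead is ignored):

```python
# Back-of-the-envelope HDMI bandwidth for 1080p60 at the three
# color-depth options the Xbox offers. Illustrative arithmetic only.
PIXEL_CLOCK_HZ = 148_500_000  # standard 1080p60 pixel clock, incl. blanking

for bits_per_pixel in (24, 30, 36):
    gbit_s = PIXEL_CLOCK_HZ * bits_per_pixel / 1e9
    print(f"{bits_per_pixel} bpp -> {gbit_s:.2f} Gbit/s over the HDMI link")
```

All three rates fit comfortably within HDMI's limits at 1080p, which is why the choice comes down to what the TV can accept, not to console performance.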

We

OK

Re

As far as I know, on the One X only premium games declared for this console are natively rendered in FHD by the graphics processor, while "normal" Xbox One games are rendered natively at 720p and can then be upscaled to 1080p for high-resolution screens.

Since the One X has a considerably more powerful graphics unit than the One "Classic", while the CPU hasn't changed that much between the two variants, you can only try out where the bottleneck lies in each game.

Your plan is most likely to bear fruit in some "standard" games with an internal 720p render path, since that relieves the graphics unit. With the One X exclusive premium titles, I don't know whether the native rendering resolution can be reduced from 1080p to 720p, or whether the output is merely scaled down afterwards.
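
A toy model of the "find the bottleneck" point above, with made-up illustrative timings rather than measurements: the frame rate is set by the slower of the CPU and GPU stages, so a 720p render path only pays off when the GPU is the limiting stage.

```python
# Toy model: with pipelined CPU and GPU stages, the frame time is
# governed by whichever stage is slower. All timings below are
# invented for illustration, not measured on real hardware.

def frame_rate(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when CPU and GPU work per frame is pipelined."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# 720p has ~44% of the pixels of 1080p (921,600 vs 2,073,600), so we
# crudely scale the GPU cost by 0.44 when dropping the resolution.

# GPU-bound game: lowering the resolution helps a lot.
print(frame_rate(cpu_ms=10.0, gpu_ms=20.0))         # 1080p: 50 fps
print(frame_rate(cpu_ms=10.0, gpu_ms=20.0 * 0.44))  # 720p: ~114 fps (capped at 60)

# CPU-bound game: the weak Jaguar cores set the pace,
# so the resolution drop changes nothing.
print(frame_rate(cpu_ms=22.0, gpu_ms=10.0))         # 1080p: ~45 fps
print(frame_rate(cpu_ms=22.0, gpu_ms=10.0 * 0.44))  # 720p: still ~45 fps
```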

Re

Shout-out to people with the FX 8350

Technically, "Bulldozer technology" was not even used in the XBox One and PS4, but 2 × 4 cores of the newly revised "K 10.x technology" were used as in the Athlon II / Phenom II.

The whole thing was called "Puma" and came onto the desktop market in the form of the Athlon 5350 with 4 cores / 4 threads on the AM1 socket. The consoles had two of these processors with slightly different wiring for shared memory, plus a separate graphics chip (modified Radeon HD 7800 series) instead of an iGPU.

ma

Yeah, I just wanted to give a shout-out to the Bulldozer bois.

Unfortunately, you couldn't buy a Jaguar, and the Athlon II and Phenom II ran relatively poorly. Bulldozer ran better, but it was also a horror (especially the TDP for so little performance).

Re

Unfortunately you couldn't buy a Jaguar

Yes, you could, as part of AMD's AM1 platform with the associated Sempron 3xxx to Athlon 5xxx SoCs, though only with up to 4 cores / 4 threads. They're still in use today:

https://www.ebay.de/...id=7364532

(At the time it was just under 200 euros new with this SoC.)

The two consoles used "Puma" in a way roughly similar to old Deneb dual-socket boards with 2 × 4-core Opterons based on K10.x, although the memory management on the consoles was completely different from an Opteron dual-socket setup. One of the two consoles even used DDR3 + GDDR5 in combination.

For a short time, AMD's 890GX (Socket AM3) even used a third DDR3 channel as SidePort cache for the top iGPU (Radeon HD 4290 or similar) that still sat in the chipset back then.

To summarize roughly: AMD combined the best of K10.x (on Socket AM3 that was "Thuban") with an Opteron dual-socket layout and a SidePort memory channel.

"Puma" was then more or less a Thuban technology for low energy with dual socket and potent iGPU on shared memory, optimized again to the extreme (for the technology at the time).

Unfortunately, it was never available for purchase in this form because there was no demand for it.

It's just a shame that to this day AMD has never taken up this SidePort principle from the 890GX chipset on AM3 and ported it to AM4 for the Ryzen G models.

AM1 only went on open sale at the end of 2013 / beginning of 2014 as a low-cost offensive against Intel's Atom on the Nvidia ION platform. Indirectly, this has at least partially been taken up again with the Athlon G(E) on the shared AM4 platform.

Like the Athlon 5350 back then, the Athlon G(E) as an SoC offers only 4 PCIe lanes wired directly to the board's PCIe slots without going through the chipset.

ma

Learned something again.

Re

Let's see whether AMD picks up this old hodgepodge of ideas again with Zen 3 & RDNA, at least with regard to the cache side branch to the iGPU. I see it as a very big opportunity for AMD… because with the Ryzen 4x00G, memory technology is already becoming the bottleneck again in the notebook segment. (The A10-7850K sends its regards.)

Just imagine something like GTX 1050 Ti class graphics that device buyers could retrofit inexpensively via such additional memory branches in AMD notebooks and desktops, with an "Nvidia Optimus"-like technology. (AMD had something like that itself, most recently "Enduro", but it never really went anywhere.)

In the semi-professional segment, AMD even offers a Ryzen Pro hexacore with an integrated graphics unit.

Why doesn't AMD finally use the know-how from its own groundwork? Intel has been doing something similar for a long time (Intel Iris Pro with an integrated L4 cache of 64, 128, or even 256 MB for the iGPU).

AMD could reactivate SidePort with fast DDR4 via branched RAM channels.