UnknownSouljer
[H]F Junkie
- Joined: Sep 24, 2001
- Messages: 9,041
You didn't read what I wrote, or you didn't understand it. I think you're confusing something here. If the GPU is just displaying a static 2D image, then of course the resolution doesn't come into play, but start actually using the GPU and it will absolutely make a difference. HardOCP's tests are only idle vs. maximum load, and they very specifically test at the maximum settings they could put to the GPU. They don't need to say it's for XYZ resolution because the implication is that they're trying to make the GPU draw as much as it can (although after the release of power viruses like FurMark, it's "as much GPU draw as is realistic").
Are you thinking in terms of no vsync, perhaps? That's the only way I can see someone concluding that lower resolutions don't use less power. You're literally asking the GPU to do less work.
An A13 is an A13. You can't make an A13 use less power than an A13, or more.
Utilization is utilization. If one A13 is at 80% utilization at 4K and another A13 is at 80% utilization at 1080p, they are drawing the same amount of power; resolution doesn't factor into it at all. Feel free to swap 80% for any number you like.
-
Edit, for clarity: You can make one A13 run at lower utilization than another A13, but at the same utilization you can't make it use less power. Power draw is power draw; there are no other variables. And even on the point of utilization, the SoC will always use as many resources as it can to complete its tasks. I highly doubt there is any difference in utilization between the iPhone 11, 11 Pro, and 11 Pro Max when doing the same tasks, despite their three different resolutions, because the SoC will always use all the resources available to it. Certain phones may get more or fewer frames due to resolution, but not more or less utilization, except literally at idle with zero tasks running and the screen on. If the screen is off, then again there would be no difference.
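The argument above can be sketched as a toy model (my own illustration with made-up numbers, not measured data): for a fixed SoC, power tracks utilization, not resolution, and resolution shows up instead in how many frames that utilization buys you.

```python
# Toy model of the argument: for a fixed SoC, power tracks utilization,
# not resolution. All constants are illustrative, not measurements.

A13_MAX_POWER_W = 6.0  # hypothetical full-load SoC power in watts

def soc_power(utilization: float) -> float:
    """Power drawn by the SoC at a given utilization (0.0-1.0)."""
    return A13_MAX_POWER_W * utilization

# Same 80% utilization at two different resolutions -> same power.
assert soc_power(0.80) == soc_power(0.80)

# Resolution shows up as frame rate instead: the same work budget
# renders more of the cheaper (lower-resolution) frames.
def frames_per_second(utilization: float, pixels_per_frame: int,
                      pixels_per_second_at_full_load: float = 2.5e9) -> float:
    return utilization * pixels_per_second_at_full_load / pixels_per_frame

fps_4k = frames_per_second(0.80, 3840 * 2160)
fps_1080p = frames_per_second(0.80, 1920 * 1080)
assert fps_1080p == 4 * fps_4k  # 1080p has exactly 1/4 the pixels of 4K
```

The model ignores real-world effects like memory-bandwidth limits and thermal throttling; it only formalizes the "same utilization, same power" point.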
The real question is the power used by the display, which is where the difference lies: screen size, resolution, display type.
You didn't read what I said. Do you have power-usage numbers for any of their SoCs? It's impossible to know whether the A11 or A12 uses more or less power than the A13. As for the battery life, if your expectation is "the same," then that's precisely what's wrong. The A13 is already known to be more efficient than the previous A12 and A11. There's not really any scenario where the iPhone SE 2 will have worse battery life unless Apple packs in additional hardware.
The A13 is more efficient per core than A12, but we don't know how that relates to A11 and we also don't know how that relates to number of transistors or die size in terms of power consumption.
-
Edit, for clarity: Additionally, the A11 has 6 CPU cores and 3 GPU cores, while the A13 has 6 CPU cores and 4 GPU cores. The architectures differ too: the M11 motion co-processor was integrated into the A11, and other co-processors have been added or changed since, so it's very hard to know what those consume even where power draw per core has dropped. Efficiency has definitely gone up, but so has the number of components.
On Wikipedia you can see that although the A13 is 7nm+ and the A12 is 7nm the A13 has a larger die size. https://en.wikipedia.org/wiki/Apple_A13
There are 6.9 billion transistors in the A12 and 8.5 billion in the A13. The A11 has a paltry 4.3 billion transistors on a die larger than the A12's but smaller than the A13's. How all of these factors relate to total power consumption is incredibly hard to say, despite the efficiency gains from generation to generation.
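A quick back-of-the-envelope check, using only the transistor counts quoted above, shows just how much the chips diverge, which is why per-core efficiency alone doesn't pin down total power:

```python
# Transistor counts from the Wikipedia figures quoted above (in units).
transistors = {"A11": 4.3e9, "A12": 6.9e9, "A13": 8.5e9}

# The A13 carries ~23% more transistors than the A12 and roughly
# double the A11's count, so even a more efficient process node
# doesn't guarantee lower total power draw.
a13_vs_a12 = transistors["A13"] / transistors["A12"]
a13_vs_a11 = transistors["A13"] / transistors["A11"]
print(f"A13 vs A12: {a13_vs_a12:.2f}x")  # ~1.23x
print(f"A13 vs A11: {a13_vs_a11:.2f}x")  # ~1.98x
```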
I kind of feel like responding to you is pointless. Because you're not responding directly to what I'm saying.
EDIT: Apple themselves state that the iPhone SE has the same battery life as the iPhone 8.
Quote: "Lasts about the same as iPhone 8"
https://www.apple.com/iphone-se/specs/