leucineleprec0n's comments

leucineleprec0n | 4 years ago | on: Samsung and AMD will reportedly take on Apple’s M1 SoC later this year

Indeed. The reported Exynos GPU utilizing AMD IP likely runs a bit hot for phones, but the sustained leaks for 3D games look promising. If they are accurate in having been recorded under sustained activity, then they set a rough upper bound on the power draw, given it's in a phone.

Peak:

https://twitter.com/fronttron/status/1430192491450363913?s=2...

Sustained:

https://twitter.com/fronttron/status/1434492507283279884?s=2...

leucineleprec0n | 4 years ago | on: Samsung and AMD will reportedly take on Apple’s M1 SoC later this year

Edited for harsh tone.

TDP is a non-standardized marketing term. You know what is fairly standardized? Watts.

Please read: the TDP is not the wattage consumed during peak load when these chips (or their constituent cores, I should say) actually hit these single- or multithreaded benchmark scores. It's merely a "hey, this is roughly what this chip will consume at the base clock rate, or what the cooling system is capable of dissipating after turbo boosting," and even then, with the cTDP of recent years, it's not invariably worth that much.

It's not even funny how far ahead Apple and ARM's reference cores on high-density/low-power libraries are in this regard.

leucineleprec0n | 4 years ago | on: Samsung and AMD will reportedly take on Apple’s M1 SoC later this year

Lol, they use off-the-shelf ARM cores again, and the major issues from the custom cores of years past have dissipated. Sure, Qualcomm adds a slight bit of low-level code and has a different power policy on the 888 vs. the 2100, but it's not actually that different, and both are on Samsung's 5nm process (by nomenclature, anyway), which is what actually draws the criticism of "Samsung" in the context of SoCs as of 2021. The reason is simple: the density of the process is solid (EUV and all come in handy), but leakage is disproportionately poor toward the upper range of the voltage-frequency curve, e.g. 2.7-3GHz, where mobile "big" or "huge" cores such as the Cortex-X1 and Cortex-A78 tend to run; both are found in the aforementioned SoCs from Samsung and Qualcomm.

In other words: Samsung and Qualcomm are both behind Cupertino in CPU single-threaded performance, but it's worth noting why this is especially bad. ARM still optimizes its reference cores for performance per area/efficiency, and furthermore the Samsung and Qualcomm implementations don't use remotely as much cache as ARM recommends for the Cortex-X1, nor the typical L3 (4MB, but they could go to 16!). The 888 actually reduces the maximum frequency of the X1 (2.84GHz, as opposed to topping out at 3.09/3.1GHz per ARM's spec or their last SoC, the 865), presumably because they deemed the power trade-off not worthwhile for their mainstream flagship chip. Throw in Samsung's fab inferiority (to TSMC) and the engine lights here just stack up.

Microsoft is reportedly using an X1 & A710 on TSMC's 5nm process, which is odd given they ought to use an X2, but whatever. Qualcomm's and Samsung's X1s score 1000-1100 on Geekbench 5's single-thread test, which is a decent indicator of general performance. With TSMC 5nm and more cache, this chip is going to have phenomenal performance per watt, and accounting for the Windows Geekbench penalty it'll probably have X1s hitting 1200 in single-threaded tests, which is about 500 off an M1, but also at less power: ARM quotes the X1 on TSMC 5nm (which Microsoft will be using instead of Samsung 5nm) as a 3.2-3.6W core at peak. The A710s? We're talking about a ~1W profile (and 800-1000 GB5 performance if the A78s are anything to go by, plus an absurdly high performance-to-area ratio).
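Back-of-envelope on those figures. The X1 score and wattage are from the paragraph above; the M1's ~5W peak single-core draw is my own ballpark assumption from published measurements, not a quoted spec, so treat the ratios as rough:

```python
# Rough single-thread perf-per-watt comparison using the figures above.
# M1 peak single-core power (~5 W) is an assumed ballpark, not a quoted spec.
chips = {
    "Cortex-X1 (TSMC 5nm)": {"gb5_st": 1200, "watts": 3.6},  # upper end of ARM's 3.2-3.6 W quote
    "Apple M1 (Firestorm)": {"gb5_st": 1700, "watts": 5.0},  # ~500 ahead of the projected X1 score
}

for name, c in chips.items():
    print(f"{name}: {c['gb5_st'] / c['watts']:.0f} GB5 points per watt")
```

Which lands both in the same rough ballpark on points per watt, even though the M1 is far ahead on absolute single-thread performance.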

And really, this is just about providing a decently performant ARM option that retains the ever-obvious performance-per-watt advantage. The Qualcomm 8cx is just not powerful enough, because it uses old Cortex-A76 cores, as does Microsoft's dogshit SQ1/2. Battery life is reportedly solid, though.

This is one big step in actually hitting “good enough” for ARM on Windows. It’s about competing with X86 and keeping the Windows ecosystem competitive with Apple’s offerings, not necessarily beating them overnight.

leucineleprec0n | 4 years ago | on: AMD’s Lisa Su

It flies and stops halfway in the face. Apple's developer toolchains (at least for iOS and the world of mobile apps and games, Metal notwithstanding) are decent, and their codebase is at least a cleaner implementation than what Google has with the combined GPL-and-refactoring nightmare of the Linux kernel and Android (to be solved by my beloved Fuchsia in time, surely). But it's not as if Apple is really a shining light on software overall. To be honest, they absolutely suck. C+ tier stuff. iOS background management; iOS and MacOS UI since iOS 7 and Big Chu—Sur respectively; woefully inadequate security defense on all systems; an AWFUL file manager on MacOS dubbed "Finder"; a garbage implementation of Bluetooth and WiFi antenna status in the control panel; no iOS option for reasonable DPI scaling (zoomed doesn't count); excessive font smoothing only modifiable via Terminal as of MacOS Big Sur; excessive use of advanced animation and sickening transparency leading to poor performance on hardware that makes Android or Windows fly. Oh, and no option to delete all messages at any given time via a simple "select all" button in the edit bar for Messages, nor any ability to export all messages to a fricking plaintext or HTML file.

The list continues, and it's a long one, worthy of a short documentary. Apple excels at a few interface consistencies and at selling premium hardware with high margins, in part due to an admittedly deserved proficiency with supply-chain consistency and scale, and in part by convincing customers to pay with their kidneys for marginal upgrades, e.g. 400% upcharges on RAM and storage.

The M1 is impressive, certainly agreed, but it won't be too long before others catch up. Still, let's not get ahead of ourselves about Apple and software prowess. It's been a long time since I preferred Cupertino's code to Redmond's or Mountain View's.

leucineleprec0n | 4 years ago | on: M1 Icestorm cores can still perform well

Indeed, thank you. It's plainly pathetic that this has to be said so often, especially here, FFS. Every time this topic comes up it's "AMD can do the same, 15W," as if the "15W" figure were a definitive value from a set of industry-standard "TDP" figures. It is not, and the listed "TDP" at best offers a clue as to the power draw, not much more. Hell, it can even differ by motherboard config. Intel encourages it.

The next emotional-preservation tactic usually cites the old GF IO die, but that was only on the H-series/desktop chips anyway, and furthermore those chips still lose to Apple in sheer performance per watt.

It's August 2021 and we still have to have this conversation. Sigh

leucineleprec0n | 4 years ago | on: Google Pay team reportedly in major upheaval after botched app revamp

The moment I downloaded the new version, without even having used GPay for years, I knew. I knew they'd used some web-based or write-once bullshit just from the latency in the tab swipes alone; I can discern it distinctly and am (unfortunately) sensitive to these performance deficits.

Then I thought "ah, shit," searched "Flutter GPay," and sure enough.

Sigh. When will they learn.

leucineleprec0n | 4 years ago | on: A Guide to RCS, and Why It Makes Texting So Much Better

It's not ideal, and it pisses me off that it took this long, but the technical implementation and UX are not at all bad now that we're here. The big three American carriers are in on this standard now, and Ma Bell's RCS-capable messaging app preinstalled on Android phones is basically indistinguishable from Google Messages, right down to the logo. What I mean is that the UX isn't that "carrier-y"; it's more dumb-utility-esque. Lastly, Google may not get their APK installed as the default on OEM phones, but the binary is readily available on the Play Store, and Android has myriad default-app settings, including for messaging.

Look, Google fucked up for years and it's actually my biggest gripe with them (ceding messaging to the loonies in Cupertino in the US) but at least we're now at the point where all American Android users will have an interoperable standard, and since iMessage is a cancer in America particularly, well, it's okay if RCS takes longer to reach critical mass elsewhere.

leucineleprec0n | 4 years ago | on: Apple's plan to “think different” about encryption opens a backdoor to your life

It's not just you. It's fucking enraging at this point. I feel like I woke up one day, got a good look at Finder and various iCloud/background-service junk, and realized it is to me what the fucking bloatware of 2010 PCs (and presumably today's) was/is.

I just want general-purpose computation equipment at reasonably modern specifications, albeit largely devoid of root-privileged advertisement stacks (bundled libraries, etc.).

I mean, what the fuck, is that so fucking hard? This is hellworld, given the obviously plausible counterfactual where we just… don't… do this.

leucineleprec0n | 4 years ago | on: My Fanless OpenBSD Desktop

Yeah, seriously, I for one am going to have to throw in my hat for scaling. Big Pixel isn't paying me off; I can just tell there's an absurd discrepancy in the realism of graphical menus and fonts (lol, cartoonish in reality). Granted, you can go too far.

leucineleprec0n | 4 years ago | on: The Framework Laptop is now shipping

In this thread: "do you think you all will/can ship [an even more niche feature, fully several orders of magnitude more difficult to make profitable than the product's existing value offering, right as it is shipping, from a startup to boot]?"

Seriously, guys [some commentators], what the hell. They did a good job; I mean, I'd consider it independent of the modularity benefit, which I don't care overly much about, if only because it's bloat-free, sports a very pragmatic design without being obtuse, and frankly it's high time PC OEMs had more pressure. With this being their only product, there's a chance QC ends up proving superior to premium laptops from HP, Dell, etc. Certainly these guys' track record so far doesn't instill any doubt!

Great job, @ the Framework founders. I love to see any innovative angles or iterations possible in the PC space these days.

leucineleprec0n | 4 years ago | on: Where’s the Apple M2?

You can compare Apple's 7NM chips, too, as I just said if you'd read. And guess what? The power consumption figures bode poorly for AMD.

The IO die adds only ~15W, btw; yes, I'm aware it's on a GlobalFoundries node.

Lastly, AMD's mobile chips throttle down to fairly low clocks (and, in AMD's case, low-to-modest performance) when not plugged in; that's why you keep hearing tales of the great battery life on Zen 3 laptops.
