fellerts | 18 days ago
I’m surprised that they chose to add a bunch of components to feed the AC line frequency to the microcontroller instead of just using a 32.768 kHz crystal. A single crystal oscillator seems like both the cheaper and the more accurate option.
The power line frequency is carefully monitored and adjusted to ensure that deviations from the ideal (60 Hz in OP's case) are smoothed out [0]. Even a single ppm deviation equates to 2.6 seconds per month, and your cheap 32.768 kHz crystal is going to be orders of magnitude worse than that.

[0] https://en.wikipedia.org/wiki/Utility_frequency#Stability
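A back-of-envelope check of the ppm figure above (my own sketch, not from the thread — the 20 ppm figure is a typical spec for a cheap watch crystal, assumed for illustration):

```python
SECONDS_PER_MONTH = 30 * 24 * 3600  # 2,592,000 s in a 30-day month

def drift_seconds(ppm: float, interval_s: float = SECONDS_PER_MONTH) -> float:
    """Clock error accumulated over interval_s at a constant frequency offset in ppm."""
    return ppm * 1e-6 * interval_s

print(drift_seconds(1))   # 1 ppm -> roughly 2.6 s per month, as stated above
print(drift_seconds(20))  # a typical 20 ppm crystal -> roughly 52 s per month
```

The grid, by contrast, is steered so the *cumulative* cycle count stays near nominal, so a line-synced clock has effectively zero long-term drift even when the instantaneous frequency wanders.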
jonathanlydall | 17 days ago
However, out of interest, I just pulled yesterday's stats from my inverter on Sunsynk's website. It records the grid frequency at 5-minute intervals, and the average over the whole day was 49.975 Hz, which doesn't strike me as particularly bad, so I have to wonder if the microwave itself has an issue. It's a Samsung which is now 13 years old.
Joker_vD | 17 days ago
A day, having 86_400 seconds in it, is equivalent to 4_320_000 pulses at 50 Hz. At 49.975 Hz, a day produces only 4_317_840 pulses, which is 2_160 pulses too few. A clock that assumes 50 Hz therefore runs 43.2 seconds slow over that one day.
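The arithmetic above can be redone in a few lines (a sketch of the same calculation, assuming the clock simply counts line cycles and divides by the nominal 50 Hz):

```python
NOMINAL_HZ = 50.0
SECONDS_PER_DAY = 86_400

actual_hz = 49.975  # the day-long average reported from the inverter above

expected_pulses = NOMINAL_HZ * SECONDS_PER_DAY  # 4_320_000 pulses the clock expects
actual_pulses = actual_hz * SECONDS_PER_DAY     # 4_317_840 pulses actually delivered
missing = expected_pulses - actual_pulses       # 2_160 pulses short

# A line-synced clock shows counted_pulses / NOMINAL_HZ seconds, so it lags by:
lag_seconds = missing / NOMINAL_HZ              # 43.2 s lost in this one day
print(missing, lag_seconds)
```

Note this assumes the frequency stays low all day; in practice grid operators deliberately run slightly fast later to cancel the accumulated deficit, which is the correction fellerts describes above.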
So, no, it's a pretty big discrepancy actually, over here anything over 0.2 Hz is legally declared to be "degraded quality", and it's been debated for years that this is actually a way too wide margin but the electricity providers/grid operators managed to successfully argue that they can't afford upgrades.
Moral of the story: don't get cute when designing electronics. Just use an AC/DC power supply and put in a damn crystal oscillator like every other reasonable person.