pxdm|10 months ago
I can speak for the GB case. Low Frequency Demand Disconnection (LFDD) occurs automatically and in stages as the frequency drops, until it stabilises. The substations or feeders that are tripped are not currently determined by real-time metering; instead, they are pre-allocated based on their typical demand. This means the system operator does not really know how much demand will actually be disconnected at any given time. If it's sunny, you could easily trip off a lot of solar generation connected to the low-voltage network, causing the frequency to drop further. It is far from optimal!
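To make that concrete, here's a rough sketch of staged LFDD tripping (Python purely for illustration; the thresholds, block sizes, and feeder ID are invented placeholders, not the actual Grid Code values):

    # Illustrative sketch of staged LFDD, not the real GB scheme.
    # Each stage is a frequency threshold paired with a block of
    # feeders pre-allocated by their *typical* demand in MW.
    STAGES = [
        (48.8, 1500.0),
        (48.6, 1500.0),
        (48.4, 2000.0),
        (48.2, 2000.0),
        (48.0, 2500.0),
    ]

    def blocks_tripped(frequency_hz, already_tripped):
        """Pre-allocated blocks whose relays would have fired by now."""
        return [
            (threshold, mw)
            for threshold, mw in STAGES
            if frequency_hz <= threshold and threshold not in already_tripped
        ]

    def net_relief_mw(block_mw, embedded_generation_mw):
        # The operator's books assume the block sheds its typical demand,
        # but the actual relief depends on whatever embedded (e.g. solar)
        # generation happens to be on those feeders at the moment of the
        # trip -- it can even be negative on a sunny low-demand day.
        return block_mw - embedded_generation_mw

The last function is the whole problem in miniature: the relief the scheme counts on is a static estimate, while the relief it delivers varies with conditions on the tripped feeders.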
pbmonster|10 months ago
This is wild. From an amateur technical perspective, it would only take a cheap Hall-effect sensor inside the transformer to get a pretty good estimate of how much current has been flowing to the load.
Hell, put the Hall sensor on a board with a microcontroller and a LoRa transmitter and stick it to the outside of the feed line. Seems like an incredibly cheap upgrade to get real-time load data from every substation.
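Roughly what that firmware loop could look like (Python for readability; a real board would run C on the MCU, and both the sensor read and the radio call here are simulated stand-ins, not a real driver API):

    import math, random, time

    SAMPLE_RATE_HZ = 1000              # hypothetical ADC sample rate
    MAINS_HZ = 50
    SAMPLES_PER_CYCLE = SAMPLE_RATE_HZ // MAINS_HZ

    def read_hall_sensor():
        # Stand-in for an ADC read of the Hall sensor output; here we
        # fake a 50 Hz current waveform with a little noise on top.
        t = time.time()
        return 100.0 * math.sin(2 * math.pi * MAINS_HZ * t) + random.gauss(0, 2)

    def rms_current(n_cycles=10):
        # RMS over whole mains cycles, so the sine averages out cleanly.
        samples = []
        for _ in range(n_cycles * SAMPLES_PER_CYCLE):
            samples.append(read_hall_sensor())
            time.sleep(1 / SAMPLE_RATE_HZ)
        return math.sqrt(sum(s * s for s in samples) / len(samples))

    def send_lora(payload: bytes):
        # Stand-in for the radio driver; a real device would hand this
        # to its LoRa module (ISM-band duty-cycle limits apply).
        print("TX:", payload)

    while True:
        amps = rms_current()
        send_lora(f"feeder-042,{amps:.1f}A".encode())
        time.sleep(60)   # LoRa bandwidth is tiny; one reading a minute is plenty

The design choice that makes this cheap is doing the RMS maths on the MCU and only radioing a few bytes per minute, which is exactly the kind of traffic LoRa is built for.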
remus|10 months ago
If you're monitoring real-time power consumption, you then need a whole extra layer of infrastructure to communicate this info back and forth. And of course you then have to consider how you're going to keep that extra infra online in the event of power issues.
pjc50|10 months ago
I also wonder what the real-time requirement is. Data from a minute ago is fine... except in exactly this kind of situation, when things are changing very quickly.
pyrale|10 months ago
The estimates we get from seasonal studies are usually close enough, especially since load shedding isn't a finesse exercise.
The situations that require load shedding usually give operators only a few minutes to react, and analysing the issue and determining a course of action takes the lion's share of that time. Once you're there, you want the actual action to be as simple as possible, not one that factors in many details.