One thing that I found counter-intuitive is that building a device that periodically receives data wirelessly is generally more expensive on the battery than a device which periodically transmits.
Naively, I assumed that it must take more power to transmit than to receive, which is true on an instantaneous basis, but false on an average basis.
A device that wakes up every so often, transmits, then goes to sleep can use very little average power as compared to a device that must constantly have the receiver powered up to listen.
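The arithmetic behind that, as a sketch with illustrative round numbers (the currents, timings, and sleep power here are assumptions, not measurements from any particular radio):

```python
# Average power of a duty-cycled transmitter vs. an always-on receiver.
def avg_power_mw(active_mw: float, active_s: float, period_s: float,
                 sleep_mw: float) -> float:
    """Time-weighted average power over one wake/sleep period."""
    duty = active_s / period_s
    return active_mw * duty + sleep_mw * (1 - duty)

# TX node: 30 mW burst for 5 ms once a minute, 0.005 mW asleep otherwise.
tx = avg_power_mw(30.0, 0.005, 60.0, 0.005)   # ~0.0075 mW average

# RX node: 20 mW with the receiver powered up continuously.
rx = 20.0

# The duty-cycled transmitter averages thousands of times less power.
```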
>compared to a device that must constantly have the receiver powered up to listen
But there's no need for the receiver to constantly stay awake to listen or poll the transmitter like in wired network systems.
Low-power wireless protocols have used time slots forever: receivers wake up only in their dedicated time slots to check whether any messages are addressed to them. If so, they wake the entire CPU block, process the payload, and reply; if not, they put the receiver back to sleep until their next time slot. Simple and very energy efficient.
So the receivers end up more efficient than the transmitter: the transmitter is constantly operating as a beacon for every time slot, which is what you want when the base station can be powered from AC, while the IoT receivers are usually battery powered and need to last for years.
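A minimal sketch of that slot logic (the address, slot count, and period are made-up illustrative values, not from any particular protocol):

```python
# Time-slotted receiver: wake only in our slot, peek at the header,
# and go back to sleep unless the message is addressed to us.
MY_ADDRESS = 0x2A          # hypothetical node address
SLOT_PERIOD_S = 1.0        # length of one full superframe
NUM_SLOTS = 100
MY_SLOT = 17               # our dedicated slot index

def slot_offset_s(slot: int) -> float:
    """Offset of a slot from the start of the superframe."""
    return (slot / NUM_SLOTS) * SLOT_PERIOD_S

def handle_wakeup(header: dict) -> str:
    """Decide what to do when the radio wakes in our slot."""
    if header.get("dest") == MY_ADDRESS:
        return "wake CPU, process payload, reply"
    return "back to sleep until next slot"
```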
The only tricky part is building a self-compensation mechanism in firmware for receiver wake-up timing jitter: every receiver inevitably drifts in time with its oscillator, and the transmitter drifts too, especially with low-cost oscillators that have horrible drift.
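One common shape for that compensation, sketched with assumed crystal tolerances (the 50 ppm figures are placeholders, not a specific part's spec): widen the listen window in proportion to the worst-case combined drift since the last resync.

```python
# Guard-time compensation: the longer since we last synced, the wider
# we must open the receive window to be sure of catching the beacon.
RX_PPM = 50      # assumed receiver crystal tolerance
TX_PPM = 50      # assumed transmitter crystal tolerance

def guard_time_s(seconds_since_sync: float) -> float:
    """Worst-case timing uncertainty accumulated since the last resync."""
    total_ppm = RX_PPM + TX_PPM
    return seconds_since_sync * total_ppm * 1e-6

def listen_window_s(nominal_slot_s: float, seconds_since_sync: float) -> float:
    """Open the receiver early and keep it on late by the guard time."""
    return nominal_slot_s + 2 * guard_time_s(seconds_since_sync)
```

Each successful reception resynchronizes the clocks, so the window snaps back to its minimum width and the average receive time stays small.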
Think in terms of energy per symbol and it makes sense. When waiting in rcv mode you're paying energy for zero symbols.
The other point is that transmitting short packets at high power and a high data rate is a win versus long packets at low power and a low data rate, because your energy per symbol is lower with the former. And people who should know better seem to make that mistake a lot.
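Energy per bit is just radio power divided by data rate, which makes the comparison easy to sketch (round illustrative numbers, not measurements):

```python
# Compare energy per bit for two radio configurations.
def energy_per_bit_j(tx_power_w: float, bitrate_bps: float) -> float:
    return tx_power_w / bitrate_bps

fast = energy_per_bit_j(0.100, 1_000_000)   # 100 mW at 1 Mbit/s
slow = energy_per_bit_j(0.010, 10_000)      # 10 mW at 10 kbit/s

# The "high power, high rate" radio spends 100 nJ/bit; the
# "low power, low rate" radio spends 1000 nJ/bit -- 10x worse,
# despite drawing a tenth of the instantaneous power.
```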
> Naively, I assumed that it must take more power to transmit than to receive, which is true on an instantaneous basis, but false on an average basis.
Even that is not necessarily always the case in low-power systems, since often receive amplifiers need to be run at a fairly high current to achieve a low noise floor, and power saving tricks like envelope tracking power supplies are harder to implement on the receive side.
For example, I've seen several Bluetooth LE radios where the instantaneous supply current is higher during receive than during transmit.
I'd assume the problem with receiving data on a periodic basis is that you still have to establish the link with the towers, such that you are always "polling" from the perspective of the device.
That is, treat the times that you wake up to receive information the same way as the ones where you wake up to send, and I'd expect them to be roughly the same? Is that not the case?
I’ve been working with IoT for about 8-ish years, and the one thing that has rung true across platforms, designs, customers, and use-cases is that you can only squeeze so much performance from a setup that wasn’t properly optimized for low power consumption.
I’ve had customers approach me in desperation, trying to make an IoT device survive just one night on a small LiPo battery, enough so the morning sun would charge it up again, but their solution was a cobbled-together mess: an ESP32 searching for their administration network to connect to, and a uBlox modem powering up and sending off a 4MB packet every 5 minutes. It turns out it would have been more power efficient to just leave the modem powered on and connected to the cell network, since you need to exchange something like 25-50 packets per handshake but only 2 packets per minute if you’re just idling.
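A back-of-envelope version of that comparison, using the packet counts from above and assuming each packet costs roughly the same radio energy (a simplification; handshake packets are often larger):

```python
# Reconnecting every 5 minutes vs. staying attached and idling,
# measured in packets per hour as a proxy for radio energy.
HANDSHAKE_PACKETS = 50        # upper figure quoted above
IDLE_PACKETS_PER_MIN = 2

def reconnect_packets_per_hour(interval_min: float) -> float:
    return (60 / interval_min) * HANDSHAKE_PACKETS

def idle_packets_per_hour() -> float:
    return 60 * IDLE_PACKETS_PER_MIN

# Reconnecting every 5 min: 12 * 50 = 600 packets/hour.
# Staying connected: 120 packets/hour -- 5x fewer.
```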
I’ve had the curse of being the guy who just fixes everything, because I have a background in both hardware and software, on top of knowing cell networks at a low level as well as the TCP/IP stack (in this case, usually DTLS). When I optimize something, I attack it from all directions. For example, it costs more power to receive messages, so for anything non-essential (such as a periodic data packet) I use non-confirmable UDP packets, i.e. fire and forget. I try to avoid heavy RTOSes on my devices and opt for a simple messaging library to properly format data for optimal transfer over the cell network. The devices I build have low-power MCUs with a restart timer to wake up periodically. I managed to make a solar-powered environment sensor with only a supercapacitor as reserve power.
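The fire-and-forget part can be as small as a single UDP datagram, with no ACK and no retransmit, so the radio never has to stay up waiting to receive. This is a generic sketch, not the actual stack; the host and port are placeholders, and a real deployment would wrap the socket in DTLS:

```python
# Fire-and-forget telemetry: send one datagram and move on.
# No delivery confirmation is requested or awaited.
import socket

def send_reading(payload: bytes, host: str = "127.0.0.1",
                 port: int = 9999) -> int:
    """Send a single UDP datagram; returns the number of bytes sent."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        return s.sendto(payload, (host, port))
```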
This went on a bit, but I think my point is that for well-architected, low-power devices, you need to start from the ground up, and sometimes that means ditching your IoT platform and spinning your own hardware and firmware. My last observation is that many hardware engineers are not the ones who install or test the solutions they design and are unaware of the power consumption outside of the specs in the data sheet.
Effective capacity also drops with load for many batteries, and there can be subtleties. Read all the data sheets and applications manuals from your suppliers.
Even very good firmware engineers sometimes need reminding that everything you do with a battery-operated device drains the battery.
Keysight, R&S, and Tektronix/Keithley all have nice battery test devices and battery test simulators. You can rent one if buying one takes your breath away.
Also, IoT devices can require very fast ammeters or source meters to correctly measure net current or power. The RMS reading on your multimeter might not even register fast spin-up and spin-down on a BLE device. That's another use case for the Qoitech tool. Again, the big instrument makers make even nicer stuff. Call an FAE.
Use a CR2032 battery or be prepared for a life of misery. :)
To a first, second, and third approximation: CR2032 is the largest coin cell that exists. Anything else has terribly weird quirks and may not actually be better than a CR2032.
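For a rough sense of what a CR2032 buys you, assuming a nominal 225 mAh capacity and ignoring self-discharge and the pulse-load capacity loss warned about elsewhere in the thread (both matter in practice):

```python
# Battery-life estimate for a duty-cycled device on a CR2032.
CR2032_MAH = 225.0   # nominal capacity; real usable capacity is lower

def average_current_ma(sleep_ma: float, active_ma: float,
                       duty: float) -> float:
    """Weighted average current; duty is the fraction of time active."""
    return active_ma * duty + sleep_ma * (1 - duty)

def lifetime_days(avg_ma: float, capacity_mah: float = CR2032_MAH) -> float:
    return capacity_mah / avg_ma / 24

# e.g. 2 uA sleep, 5 mA active, awake 0.1% of the time:
avg = average_current_ma(0.002, 5.0, 0.001)   # ~7 uA average
# -> on the order of 3-4 years, before real-world derating.
```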
We're building a bicycle product involving load cells, and one of the issues with them is that strain gauges typically have pretty low resistance. As we need readings at a relatively high sample rate (measuring pedalling dynamics), it's a lot of fun waking up the load cells, getting the sigma-delta ADCs to produce nicely-settled 24-bit results across multiple channels, and synchronising the whole thing across 4 separate sensor nodes. Basically we have to pedal and chew minimal joules at the same time.
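To see why the low bridge resistance hurts, here's the excitation-power arithmetic with typical assumed values (a 350 Ω bridge at 3.3 V; not the product's actual figures):

```python
# A continuously excited strain-gauge bridge burns tens of milliwatts,
# so the excitation must be duty-cycled around each conversion.
BRIDGE_OHMS = 350.0      # typical strain-gauge bridge resistance
EXCITATION_V = 3.3       # assumed excitation voltage

def bridge_power_w(v: float = EXCITATION_V, r: float = BRIDGE_OHMS) -> float:
    """Power dissipated in the bridge while excited: V^2 / R (~31 mW)."""
    return v * v / r

def avg_power_w(sample_hz: float, settle_s: float) -> float:
    """Average power if the bridge is excited only settle_s per sample."""
    return bridge_power_w() * min(sample_hz * settle_s, 1.0)
```

At 100 samples/s with a 1 ms settle window the duty cycle is 10%, cutting the bridge's average draw by the same factor; the catch is that the sigma-delta conversion has to finish settling inside that window.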
Latest hardware has coulomb counters to keep track of charge state; you probably already know it but the nPM1100 has some good press.
* https://www.nordicsemi.com/products/npm1100
I've always been curious how the Ring video camera can go "Live" 5 seconds after you click a button on a web browser, but still last for months on a small battery.
A sometimes overlooked resource is your MCU vendor. It may have a power monitoring daughterboard w/ supporting software to help you optimize your battery usage against a development board. The last one I used was Nordic's and it was stellar, free (thanks to the FAE), allowing us to ship a BLE device that would run for at least 1 year on two alkaline AA cells.
Basically, there are only three battery choices:
1) Alkaline
2) CR2032
3) Full-blown LiPo rechargeable
Any divergence is pain. Lots and lots of pain.
With a 100Wh laptop battery that's 10 days of battery life!