How to Maximize Battery Life on Dual-Mode Wi-Fi / Bluetooth IoT Projects
While Wi-Fi and Bluetooth include standard protocols to reduce power consumption, the architecture connecting the radio subsystems can provide more direct support by offloading network processing tasks to a low-power microcontroller.
Designers of battery-powered Internet of Things (IoT) devices and other connected products must meet conflicting requirements for continuous wireless connectivity and extended battery life. The growing demand for Bluetooth 5 and Wi-Fi connectivity on the same device stretches these already tight power budgets even further. While Wi-Fi and Bluetooth include standard protocols to help reduce power consumption, an architecture that connects the radio subsystems can provide more direct support by offloading network processing tasks to a low-power microcontroller.
This article outlines why dual-mode Wi-Fi / Bluetooth connectivity matters and how it complicates IoT designs. It then shows how a development board and associated software from Cypress Semiconductor can be used to create dual-mode Wi-Fi / Bluetooth IoT devices that combine continuous connectivity with longer battery life.
Growing demand for continuous connectivity of Dual-Mode Wi-Fi / Bluetooth devices
Bluetooth connectivity is considered a standard requirement for many IoT devices designed to interact with users via smartphones and other portable devices with Bluetooth functionality. However, for many IoT applications, these devices need a Wi-Fi connection to access a wireless local area network (WLAN) and then to connect directly to the Internet, or to interact with other peer devices and host systems on the same network.
Extending battery life would be much simpler if these devices only needed to connect to the WLAN or a Bluetooth host when they had data or other messages to send. Since the active duty cycle of many IoT devices is typically low, such devices can extend battery life by spending most of their time in a low-power sleep mode, waking only long enough to take sensor measurements, complete the associated processing tasks, and transmit the results before returning to sleep. However, most IoT devices must also respond quickly to asynchronous commands and data arriving from peers, host systems, and end users.
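The duty-cycling idea above can be sketched with some back-of-the-envelope arithmetic. The following Python snippet estimates battery life from a simple sleep/wake cycle; all current and timing values are illustrative assumptions, not figures from any specific device datasheet.

```python
# Back-of-the-envelope battery-life estimate for a duty-cycled IoT sensor.
# All currents and timings below are illustrative assumptions, not values
# taken from any particular datasheet.

def battery_life_hours(capacity_mah, sleep_ua, active_ma, active_s, period_s):
    """Estimate battery life (hours) from a simple sleep/wake duty cycle."""
    duty = active_s / period_s
    # Average current in mA: weighted mix of sleep and active currents
    avg_ma = (sleep_ua / 1000.0) * (1 - duty) + active_ma * duty
    return capacity_mah / avg_ma

# Example: 220 mAh coin cell, 5 uA sleep, 15 mA active for 2 s every 60 s
life = battery_life_hours(220, sleep_ua=5, active_ma=15, active_s=2, period_s=60)
print(f"{life:.0f} hours (~{life / 24:.0f} days)")
```

Even with this optimistic model, note that the active current dominates the average: shortening the wake time or lengthening the period is far more effective than shaving a few microamps off the sleep current.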
IoT devices must maintain continuous connectivity to remain responsive to incoming data and react within an acceptable time. If developers try to meet this requirement by repeatedly waking the device to check for incoming data, the battery will drain quickly. In fact, the radio receiver in a battery-powered Wi-Fi device often consumes more total energy over time than the transmitter, even though a single transmit operation draws more instantaneous power. On top of that, the power consumed by the device's main processor during each receive operation adds its own significant load to the overall power budget. Fortunately, wireless connectivity standards define protocols that allow developers to reduce power consumption while maintaining the illusion of continuous connectivity.
How Wireless Standards Help Reduce Power Consumption
During normal operation, Wi-Fi stations (STAs) save power by turning off most of their Wi-Fi subsystem. Because the access point (AP) buffers frames for sleeping STAs, no messages are lost. As part of normal network management, APs regularly transmit beacon frames containing a bitmap called the traffic indication map (TIM), which indicates whether the AP has traffic waiting for a dormant STA. APs also periodically transmit a beacon frame that includes a delivery traffic indication map (DTIM), indicating that buffered broadcast and multicast frames are available. STAs are expected to wake up at least once per DTIM interval, which is some multiple of the normal beacon interval. A network configured with a high DTIM period therefore lets its IoT devices reduce power consumption: they can stay dormant longer before waking the receiver to catch a beacon frame indicating that the AP is buffering frames for them. This is the fundamental approach underpinning 802.11 power save polling, discussed below.
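The relationship between the beacon interval and the DTIM period can be made concrete with a short sketch. The snippet below assumes the common beacon interval of 100 time units (TU), where 1 TU = 1024 µs; these are illustrative values, not a specific AP configuration.

```python
# Sketch of how the DTIM period stretches a station's sleep interval.
# A beacon interval of 100 time units (TU) is a common default; 1 TU = 1024 us.
# Values are illustrative, not taken from a particular AP configuration.

TU_US = 1024  # one 802.11 time unit in microseconds

def dtim_sleep_ms(beacon_interval_tu, dtim_period):
    """Longest a STA can sleep while still catching every DTIM beacon (ms)."""
    return beacon_interval_tu * dtim_period * TU_US / 1000.0

print(dtim_sleep_ms(100, 1))   # wake for every beacon: 102.4 ms
print(dtim_sleep_ms(100, 10))  # wake only every 10th beacon: 1024.0 ms
```

A DTIM period of 10 lets the receiver stay off roughly ten times longer between wakeups, at the cost of up to one second of added latency for buffered broadcast traffic.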
Bluetooth Low Energy (BLE) technology lets devices reduce energy consumption by tuning the advertising interval and the advertising payload size. By increasing the advertising interval, IoT devices can run the transmitter less often, and by reducing the payload, they can shorten the duration of each advertising event. Of course, not every application tolerates long advertising intervals or minimal payloads. In an audio or other real-time application, for example, long advertising intervals add connection latency that can adversely affect the behavior of the application as a whole.
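A rough average-current model shows why stretching the advertising interval pays off so directly. The event duration and current figures below are illustrative assumptions only.

```python
# Rough average-current model for BLE advertising: lengthening the advertising
# interval scales down the radio's share of the power budget almost linearly.
# The event duration and currents below are illustrative assumptions.

def avg_current_ua(adv_event_ms, adv_interval_ms, active_ma, sleep_ua):
    """Average current (uA) for a device that advertises, then sleeps."""
    duty = adv_event_ms / adv_interval_ms
    return active_ma * 1000.0 * duty + sleep_ua * (1 - duty)

# Advertising every 100 ms vs. every 1 s (5 ms event, 8 mA active, 2 uA sleep)
print(avg_current_ua(5, 100, 8.0, 2.0))    # ~402 uA
print(avg_current_ua(5, 1000, 8.0, 2.0))   # ~42 uA
```

Shrinking the payload shortens `adv_event_ms` and has the same kind of effect, though with less headroom than the interval itself.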
Peripherals can use another BLE feature called slave latency, which allows a peripheral device to skip connection events. Like Wi-Fi's DTIM mechanism, BLE slave latency lets devices stay in low-power mode for extended periods of time. Rather than simply increasing the connection interval, this mode allows the peripheral to skip connection events with the central, yet still wake up and send data whenever needed, without additional delay.
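The arithmetic behind slave latency is simple: the peripheral's effective wake interval is the connection interval multiplied by one more than the number of events it may skip. The values below are illustrative.

```python
# Sketch of BLE slave latency arithmetic: a peripheral may skip up to
# `slave_latency` connection events, so its effective wake interval grows
# without changing the connection interval itself. Illustrative values only.

def effective_wake_interval_ms(conn_interval_ms, slave_latency):
    """Longest the peripheral can sleep while still honoring the connection."""
    return conn_interval_ms * (slave_latency + 1)

# 30 ms connection interval, peripheral allowed to skip 4 events
print(effective_wake_interval_ms(30, 4))  # 150 ms
```

The key point is asymmetry: the central still schedules events every 30 ms, so when the peripheral does have data it can respond at the short interval rather than waiting out the long one.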
The full article can be found here: https://tek.info.pl