Reducing the amount of wireless data transmitted by battery-powered sensor nodes is an effective way of prolonging their lifetime. In this paper, we present a machine learning-based data transmission reduction scheme for application-specific IoT networks. Although many error thresholding-based data prediction schemes have been explored in the past, this is the first work to incorporate machine learning in constrained sensor nodes to reduce data transmissions. We also provide a generic overview and comparison of five traditional supervised machine learning algorithms in the context of offloading trained models to memory- and compute-constrained microcontrollers. The proposed data reduction scheme is validated on an occupancy estimation testbed deployed in our lab. Experimental results demonstrate a 99.91% overall reduction in data transmissions while maintaining comparable estimation performance, and 18 to 82 times fewer transmissions than the Shewhart change detection algorithm.
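For context, the Shewhart baseline referenced above suppresses transmissions by sending a reading only when it leaves the control limits of recent data. The following is a minimal, hypothetical sketch of such a filter (window size, threshold `k`, and the function name are illustrative assumptions, not the paper's implementation):

```python
# Hypothetical Shewhart change-detection baseline: a node transmits a
# reading only when it falls outside mean +/- k*sigma control limits
# computed over a sliding window of recent readings.
from collections import deque
import statistics

def shewhart_filter(readings, window=10, k=3.0):
    """Return indices of readings that would be transmitted."""
    history = deque(maxlen=window)
    transmitted = []
    for i, x in enumerate(readings):
        if len(history) < 2:
            transmitted.append(i)        # bootstrap phase: always send
        else:
            mean = statistics.mean(history)
            sigma = statistics.stdev(history) or 1e-9  # avoid zero sigma
            if abs(x - mean) > k * sigma:
                transmitted.append(i)    # change detected: send
        history.append(x)
    return transmitted
```

On a mostly constant signal, only the bootstrap samples and abrupt changes are transmitted, which is what makes such thresholding schemes a natural baseline for transmission reduction.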