Entropy profiling is a recently introduced approach that reduces parametric dependence in traditional Kolmogorov-Sinai (KS) entropy measurement algorithms. In traditional entropy computations, the choice of the threshold parameter r for vector distances is crucial in determining the accuracy of the signal irregularity information these methods retrieve. In addition to making parametric choices completely data-driven, entropy profiling generates a complete profile of entropy information rather than the single entropy estimate produced by traditional algorithms. The benefits of “profiling” over “estimation” are: (a) precursor methods such as approximate entropy and sample entropy, which have been limited in their ability to handle short-term signals (fewer than 1000 samples), can now do so; (b) the entropy measure can capture complexity information from both short-term and long-term signals without multi-scaling; and (c) the approach facilitates enhanced information retrieval from short-term heart rate variability (HRV) signals. The concept of entropy profiling has equipped traditional algorithms to overcome these limitations and has broadened their applicability in short-term signal analysis. In this work, we review KS-entropy methods and their limitations in the context of short-term HRV analysis and elucidate the benefits of using entropy profiling as an alternative.
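To make the role of the threshold parameter r concrete, the sketch below computes sample entropy of a short synthetic RR-interval series over a sweep of tolerance values r, producing a simple entropy-versus-r curve instead of a single estimate at one fixed r. This is only a minimal illustration of the idea: the function names (`sample_entropy`, `entropy_profile`), the choice of r grid, and the synthetic data are assumptions made for the example, and the data-driven construction used by the entropy profiling method reviewed here is not reproduced.

```python
import numpy as np

def sample_entropy(x, m, r):
    """Sample entropy SampEn(m, r) of a 1-D signal x.

    Counts template matches of length m (B) and m+1 (A) within
    tolerance r (Chebyshev distance), excluding self-matches,
    and returns -ln(A / B).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)

    def count_matches(dim):
        # Templates of length `dim`; use n - m of them so the m and
        # m+1 counts are taken over the same number of vectors.
        templates = np.array([x[i:i + dim] for i in range(n - m)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance from template i to every later template
            # (pairs counted once, self-matches excluded).
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    a, b = count_matches(m + 1), count_matches(m)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def entropy_profile(x, m=2, r_values=None):
    """Sample entropy evaluated over a sweep of tolerances r
    (a simple entropy-versus-r profile)."""
    x = np.asarray(x, dtype=float)
    if r_values is None:
        # Illustrative grid: 5% to 100% of the signal's standard deviation.
        r_values = np.linspace(0.05, 1.0, 20) * np.std(x)
    return r_values, np.array([sample_entropy(x, m, r) for r in r_values])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic short RR-interval series (300 samples, mean ~0.8 s).
    rr = 0.8 + 0.05 * rng.standard_normal(300)
    r_vals, profile = entropy_profile(rr, m=2)
    for r, e in zip(r_vals, profile):
        print(f"r = {r:.4f}  SampEn = {e:.3f}")
```

The sweep makes visible how strongly the single-r estimate depends on where r is fixed (commonly 0.2 times the standard deviation), which is the parametric dependence that profiling is intended to remove.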