Commit 814b8797 authored by Rafael J. Wysocki

cpuidle: menu: Avoid overflows when computing variance

The variance computation in get_typical_interval() may overflow if
the square of the value of diff exceeds the maximum value representable
by the int64_t data type, which is essentially the case when diff is
of the order of UINT_MAX.

However, data points that far out don't matter for idle state
selection anyway, so change the initial threshold value in
get_typical_interval() to INT_MAX, which will cause more "outlying"
data points to be discarded without affecting the selection result.
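
For illustration (not part of the commit), here is a minimal userspace C
sketch of the arithmetic involved: with the old UINT_MAX threshold, a
retained sample can deviate from the average by up to ~4.3e9, whose
square (~1.8e19) exceeds INT64_MAX (~9.2e18); with INT_MAX, the
worst-case square (~4.6e18) still fits in the int64_t accumulator.

	#include <stdint.h>
	#include <stdio.h>
	#include <limits.h>

	int main(void)
	{
		/* Old threshold: samples up to UINT_MAX are kept, so diff
		 * can be ~UINT_MAX and diff * diff (~1.8e19) exceeds
		 * INT64_MAX (~9.2e18), overflowing int64_t. Doubles are
		 * used here only to print the magnitudes safely. */
		int64_t diff = UINT_MAX;
		printf("UINT_MAX^2 = %.3e, INT64_MAX = %.3e\n",
		       (double)diff * (double)diff, (double)INT64_MAX);

		/* New threshold: samples are capped at INT_MAX (~2.1e9),
		 * so the worst-case square (~4.6e18) fits in int64_t. */
		diff = INT_MAX;
		printf("INT_MAX^2  = %.3e\n", (double)diff * (double)diff);
		return 0;
	}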
Reported-by: Randy Dunlap <rdunlap@infradead.org>
Signed-off-by: Rafael J. Wysocki <rafael.j.wysocki@intel.com>
parent ef800684
@@ -186,7 +186,7 @@ static unsigned int get_typical_interval(struct menu_device *data,
 	unsigned int min, max, thresh, avg;
 	uint64_t sum, variance;
-	thresh = UINT_MAX; /* Discard outliers above this value */
+	thresh = INT_MAX; /* Discard outliers above this value */
 again:
...