Tags: javascript
JavaScript is full of surprises and unexpected behaviors that often leave developers scratching their heads. One of those strange behaviors is the fact that Math.max() < Math.min() is true in JavaScript. In this blog, we'll delve into the underlying reasons for this seemingly paradoxical result.
The Math object in JavaScript provides a collection of properties and methods for performing mathematical operations. Two of the most commonly used methods are Math.max() and Math.min().
Math.max() returns the largest of zero or more numbers.
Math.min() returns the smallest of zero or more numbers.
At first glance, it would seem impossible for the maximum of a set of numbers to be less than the minimum of the same set. But, as we will see, there's a catch when we don't provide any arguments to these functions.
When you call Math.max() with no arguments, it returns -Infinity, which represents negative infinity in JavaScript. Similarly, when you call Math.min() with no arguments, it returns Infinity, which represents positive infinity.
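You can verify this directly in any JavaScript console:

```javascript
// Called with no arguments, the functions return their identity values
console.log(Math.max()); // -Infinity
console.log(Math.min()); // Infinity

// With arguments, they behave as expected
console.log(Math.max(1, 5, 3)); // 5
console.log(Math.min(1, 5, 3)); // 1
```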
These results might seem counterintuitive, but they're based on the behavior of these functions when dealing with an empty set of numbers.
The logic behind this behavior lies in how the functions are defined:
Math.max(): -Infinity is the identity element for the maximum operation, because every real number is greater than or equal to it. When given no arguments, there is nothing to compare, so the function returns this identity value, -Infinity.
Math.min(): Infinity is the identity element for the minimum operation, because every real number is less than or equal to it. When given no arguments, the function returns Infinity.
As a result, the following inequality is indeed true in JavaScript:
Math.max() < Math.min() // evaluates to true
This is because -Infinity < Infinity, which holds both in mathematics and under JavaScript's numeric comparison rules.