*Bits, ink, particles, and words.*

If you’re in secondary school, chances are you’ve had to discuss a bunch of properties of functions. These include finding the maxima and minima, the roots (zeros), the intervals where the function is positive or negative, the intervals where the function is increasing or decreasing, and the domain of the function. While slogging through page after page of this, you might be tempted to ask, “Why do I have to learn this? Not only is it boring, it’s completely useless!”

When you first learned about algebra, chances are you learned about something called a function, typically one that looks like this:
\begin{equation}
f(x) = ax + b.
\end{equation}
This is nothing more than the equation of a straight line. You probably also learned how it could be represented as a graph (which is why you know it’s a straight line). This was simple enough, and you soon learned how to deal with different kinds of functions, including quadratics (parabolas), exponentials, rationals, and a host of others. You learned what these looked like when graphed, and how to find their various properties: finding the roots of the equation (where $f(x)=0$), finding the domain and range, and determining where the graph is increasing or decreasing.
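As a small sketch of those properties in action (my own example, not one from the text), here is a crude numeric inspection of the quadratic $f(x) = x^2 - 4$ over a few integer sample points: its roots, where it is positive, and where it is increasing.

```python
# A toy quadratic: f(x) = x^2 - 4, with roots at x = -2 and x = 2.
def f(x):
    return x**2 - 4

# Roots: sample points where f(x) = 0.
roots = [x for x in range(-5, 6) if f(x) == 0]

# Sample points where the function is positive.
positive = [x for x in range(-5, 6) if f(x) > 0]

# A crude finite-difference check for "increasing":
# f is increasing at x when f(x + 1) > f(x).
increasing = [x for x in range(-5, 5) if f(x + 1) > f(x)]

print(roots)       # the zeros of f among the samples
print(positive)    # where the graph sits above the x-axis
print(increasing)  # where the graph climbs from one sample to the next
```

This is only a discrete approximation of what the graph shows at a glance, but it mirrors exactly the checklist from class: roots, sign, and monotonicity.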

In mathematics, notation is simultaneously everything and nothing. It isn’t difficult to imagine an alien species having the same notions of calculus as we do, but without our symbols for integration or differentiation. It might *seem* so natural now to see the expression $\partial x$, but that’s only because we’ve spent years working with these symbols, forging a connection between concepts and notation. Because of this, it can seem entirely natural to look at notation and instantly understand what it’s about as a *concept*, rather than as mere symbols. This is quite similar to our experience with foreign languages, where the words and characters look alien to us, yet our own languages seem so obvious.

When you’re trying to solve a simple algebraic equation like $ab = 5b$ for the variable $a$, it quickly becomes second nature to divide both sides of the equation by $b$, yielding $a = 5$. This makes complete sense, and it’s what most people would do right off, without even thinking. I mean, look at both sides of that equation! If there’s a $b$ on both sides, then the other value on each side of the equation should be equal, giving us $a = 5$.
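The manipulation above can be checked numerically. In this sketch (my own example), dividing $5b$ by $b$ recovers $a = 5$ for a handful of sample values of $b$; note that the check only runs over nonzero values, since dividing by $b = 0$ would fail outright.

```python
# Dividing both sides of a*b = 5*b by b gives a = 5*b / b.
# This only makes sense for nonzero b; b = 0 would raise
# a ZeroDivisionError, hinting at the hidden assumption.
for b in [-3, 0.5, 2, 7]:
    a = (5 * b) / b
    assert a == 5
```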