Tuesday, January 19, 2021

Object-Oriented Programming Is the Biggest Mistake of Computer Science

From Determinism to Nondeterminism


Let’s take a look at an addition function:
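Something like the following will do (a minimal sketch in TypeScript; the name add and the exact signature are only illustrative):

```typescript
// A plain, pure addition function: its result depends only on its inputs.
function add(a: number, b: number): number {
  return a + b;
}

console.log(add(2, 2)); // 4
```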

We can always be sure that, given the input (2, 2), the result will always be 4. How can we be so sure? In most programming languages, the addition operation is implemented in hardware; in other words, the CPU is responsible for ensuring that the result of the computation always remains the same (unless we’re dealing with the comparison of floating-point numbers, but that is a different story, unrelated to the problem of nondeterminism). For now, let’s focus on integers. The hardware is extremely reliable, and it is safe to assume that the result of addition will always be correct.

Now, let’s box the value of 2:
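One way to do that (again just a sketch, assuming a wrapper object with a single value field):

```typescript
// The number 2 is now boxed inside an object.
const two = { value: 2 };

function add(a: { value: number }, b: { value: number }): number {
  return a.value + b.value;
}

console.log(add(two, two)); // 4
```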

So far so good: the function is still deterministic!

Let’s now make a small change to the body of the function:
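For instance, suppose the function starts writing back into its first argument instead of leaving it untouched (a sketch of one possible change, not necessarily the only one):

```typescript
const two = { value: 2 };

function add(a: { value: number }, b: { value: number }): number {
  // The result is now stored back into the argument itself.
  a.value += b.value;
  return a.value;
}
```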

What happened? Suddenly the result of the function is no longer predictable! It returned the correct value the first time, but every subsequent call produced a different, ever-growing result. In other words, the function is no longer deterministic.
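Calling the sketched function above a few times with the same boxed value makes this visible:

```typescript
console.log(add(two, two)); // 4  -- looks fine, but two.value is now 4
console.log(add(two, two)); // 8  -- two.value is now 8
console.log(add(two, two)); // 16 -- and the value keeps growing
```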

Why did it suddenly become nondeterministic? Because the function now has a side effect: it modifies a value outside of its own scope.

Let’s recap

A deterministic program guarantees that 2 + 2 == 4. In other words, given an input of (2, 2), the function add should always produce the output 4, no matter how many times you call the function, no matter whether or not you call it in parallel, and no matter what the world outside of the function looks like.

Nondeterministic programs are the exact opposite. In most cases, the call to add(2, 2) will return 4, but once in a while the function might return 3, 5, or even 1004. Nondeterminism is highly undesirable in programs, and I hope you can now see why.

What are the consequences of nondeterministic code? Software defects, more commonly referred to as “bugs”. Bugs force developers to waste precious time debugging, and they significantly degrade the customer experience if they make their way into production.

To make our programs more reliable, we should first and foremost address the issues of nondeterminism.



