
Programming Decoded: Bridging the Gap Between Human and Machine

by Lauren Alworth

February 21, 2017

The world of coding and programming can be an intimidating place for the uninitiated. It's difficult to find a clear explanation of how it all works that isn't laden with industry jargon that itself requires explanation. Consider this post the first of several in our Programming, Deconstructed series: an attempt at unpacking the topic and explaining the fundamentals of programming in a way that is accessible to everyone, regardless of their background.

As with most complex topics, knowledge about programming is cumulative. So before we dig into a discussion of basic programming concepts or compare different languages, we need to answer the most fundamental question: what is programming at the most basic level? Let's start off by talking about a computer we all love to hate—the human brain.

It's a wildly versatile organ, allowing us to do everything from catching a football based on its initial trajectory to guessing how someone else feels based on nothing but their body language. But one of the most impressive functions of the brain is how it processes language.

When trying to understand a sentence, the brain breaks it up into different parts: semantics and syntax (note: context is also pretty important, but that's best left for a more advanced discussion of programming). Semantics refers to the meaning of words, while syntax refers to the rules we have for combining words into phrases and sentences, and for understanding the relationships between words. Using a combination of semantics and syntax, the human brain is able to assign meaning to words and phrases that isn't explicitly stated.

Computer processors, on the other hand, don't have the same ability to interpret syntax (and context), and that's where programming comes in. It's important to keep in mind that both the processor in your computer and the human brain serve a similar function: they produce an output based on an input. But they process information in fundamentally different ways. To better understand programming, we first have to understand how humans and computers interpret the world differently.

Contrary to popular belief, programming is, at its core, just creative problem solving according to a predefined set of rules. Whether it's fixing an existing tech-related headache or inventing a solution to a problem that hadn't even been defined yet, programming isn't so much about solving a computer problem as it is about using a computer to solve a real-life problem.

If you wish to make a PB&J sandwich from scratch, you must first invent the universe 

The key to any type of problem solving is taking things step-by-step, and with programming it’s more like baby-step-by-baby-step. Because computers process information differently than the human brain, we have to explain things in different terms.

Let's consider the task of making a peanut butter and jelly sandwich. First, you need to define your list of ingredients: a loaf of bread, a jar of peanut butter (chunky, you monster), a jar of jelly—raspberry is the only option, as we all know—one plate, and two knives (thou shalt not double dip). After defining the ingredients, the next step is to provide a set of instructions for making the sandwich. If you're not a programmer, your instructions might look a bit like this:

1. Remove two slices of bread

2. Put the peanut butter on one slice

3. Put the jelly on the other slice

4. Put them together

5. Enjoy

A computer, however, wouldn't interpret those instructions correctly. The difference comes down to inference: a person is able to infer that "put the peanut butter on a slice of bread" is really a series of many steps that are quite complex, whereas a computer is frustratingly literal in the way it interprets instructions. If we were to imagine the conversation between a human and a computer, it might go something like this:

Human: Open the jar of peanut butter, please.

Computer: How do I do that?

Human: Twist the cap

Computer: What does 'twist' mean?

Human: Rotate. Rotate the cap.

Computer: How much should I rotate the cap?

Human: I don't know. Three, maybe 4 times?

Computer: 3 or 4 radians. Got it.

Human: No. Full revolutions. Rotate the cap 1440°.

Computer: Ok. Got it. Rotating the cap 1440°. Which direction?

Human: ( ╯°□°)╯︵ ┻━┻ 

And that's just getting the jar of peanut butter open. That's how programming works: it's about thinking a few levels below the surface and breaking actions down into the simplest possible steps. The entire process is methodical, and it requires a very explicit, step-by-step breakdown to reach your exact desired outcome.
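To make that concrete, here is a minimal sketch, in Python, of what "put the peanut butter on one slice" might look like once the hidden steps are spelled out. Nothing here controls a real kitchen (every step simply prints what it would do, and all of the function names are invented for illustration), but it shows how literal the instructions have to become:

    # A toy sketch: each "step" just prints the instruction it represents.
    def step(description):
        print(description)

    def open_jar(contents, rotations, direction):
        step(f"Rotate the {contents} jar lid {rotations} full turns {direction}")
        step(f"Lift the lid off the {contents} jar")

    def spread(contents, knife, target):
        step(f"Pick up {knife}")
        step(f"Scoop one tablespoon of {contents} with {knife}")
        step(f"Spread the {contents} over {target}, edge to edge")
        step(f"Put {knife} down on the plate")

    open_jar("peanut butter", rotations=4, direction="counterclockwise")
    spread("peanut butter", knife="knife #1", target="the first slice of bread")

Even this leaves plenty of detail implicit, which is exactly the point: a real program would keep breaking things down until there is nothing left for the computer to infer.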


What’s a programming language?

Much of the frustration with human-to-computer communication can be minimized by leveraging programming languages. In the same way that an English speaker would recognize an assembly of letters in the Roman alphabet as words that form a sentence, computers recognize a series of 1s and 0s, known as 'binary code', that are assembled in a way that eventually leads to an output. While both computer processors and the human brain produce an output based on an input, they excel at completely different things.

For example, consider the phrase Hello World! in English: it's 12 characters (10 letters, a space, and an exclamation point). Simple enough. But in binary, the same phrase is 01001000 01100101 01101100 01101100 01101111 00100000 01010111 01101111 01110010 01101100 01100100 00100001. That's 96 binary digits versus 12 characters. If you're the type that's looking for patterns, you probably noticed that each group of 8 digits represents one character.
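If you'd like to check that yourself, a few lines of Python (included here purely as an illustration) will convert each character to its 8-bit code and count the digits:

    phrase = "Hello World!"
    # ord() gives each character's numeric code; format(..., "08b") writes it as 8 binary digits.
    binary = " ".join(format(ord(character), "08b") for character in phrase)

    print(binary)                            # 01001000 01100101 ... 00100001
    print(len(phrase), "characters")         # 12
    print(8 * len(phrase), "binary digits")  # 96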

If it's not already obvious, binary code would be mind-bendingly difficult for a human brain to understand and translate quickly, but computers are great at it because, at the hardware level, their work boils down to trillions of simple on/off operations every second (in binary, 1 = on and 0 = off). For humans to tap into the full potential of computers, it quickly becomes necessary to communicate commands in a way that is intelligible to both. Enter: programming languages.

While programming languages started out relatively simple (from the point of view of the computer), they now operate at significantly higher levels of abstraction. The easiest way to visualize this is to think of a sliding scale with human language on one end and binary code on the other. In between the two extremes lie various levels of programming languages. Languages closer to binary are considered "low-level," whereas languages that are closer in syntax to human language are considered "high-level."

Assembly, for example, is a very low-level language whose instructions are short mnemonics, typically just three or four letters each. Those mnemonics are run through a pre-built translator called an assembler, which converts them into binary so that the computer can understand the input.
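For a rough feel of what those lower-level instructions look like, Python's built-in dis module will print the instructions hiding behind a single line of high-level code. (Python bytecode isn't assembly, so treat this as an analogy rather than the real thing, but the short, mnemonic-style instructions have a similar flavor.)

    import dis

    def greet():
        return "Hello World!"

    # Prints the lower-level instructions (e.g. LOAD_CONST, RETURN_VALUE)
    # that the Python interpreter actually executes for this one-line function.
    dis.dis(greet)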

Programmers now have access to languages that are much closer to human language, which therefore go through many more layers of translation before they boil back down to binary code. Although sadly there is still no mainstream computer program that can make you a peanut butter and jelly sandwich (if someone has written this program, please reveal yourself), programming languages have become very advanced. The programming community continues to build on the existing base of code and technology to make things easier and more automated.

There are many different programming languages out there, and the language you use depends on the problem you are working to solve. What's important to remember is that programming builds on itself, and once you've learned one language, the next one comes pretty easily.

We hope this breakdown brought you a clearer, more comprehensive understanding of both programming and programming languages. Keep an eye out for more posts in our Programming, Deconstructed series, where we'll dive much deeper into the specifics of programming languages, frameworks, and the different functions and roles of developers.
