Algorithmic Thinking and How Machines Think
Lesson Objective
In this lesson, you will understand what thinking algorithmically actually means when you’re dealing with computers, and how to think in terms of algorithms to make computing hardware perform the tasks we require. It may also answer the question of why skilled engineers are needed for software engineering. Happy Learning!
Why is it important to understand how computers think (or do they 🤔)?
It is important to understand how computers actually “think” (i.e. perform the calculations that make them capable of defeating even world champions at chess) because you can make computers perform various tasks only when you know how they work on those tasks at a lower level. You will be able to create better computer programs if you understand how computers actually interpret the instructions specified in those programs.
Introduction to the Lesson
In this lesson we are going to discover whether computers have “thinking” power or not and how a software programmer can make computers perform tasks that make them seem like they are capable of thinking. We will also understand what algorithms are and why thinking in terms of algorithms is important to make computers perform the task that you want them to perform.
We will discuss the following major points in this lesson
How did computers become so clever?
Definitions and examples of algorithms
How to think algorithmically as a Programmer?
Flowcharts & Pseudocode
How did computers become so clever?
Before discussing how computers are able to do such seemingly miraculous things as beating the world chess champion or even holding a conversation with a human, we have to discuss whether computers can think on their own or not. 💭 Can computers actually think?
Thinking in the human sense involves imagination: gathering all the information you have, analysing the situation, and comparing possibilities over and over again in less than a second, then making a choice of your own free will. These choices might not even be based on logic or rules.
Computers simply cannot think. If applications like voice recognition and face recognition led you to believe otherwise, you have been deceived by their complexity. At a deeper level, these applications too consist only of basic programming instructions, which they merely follow to perform their tasks.
How do computers seem to be able to think?
How do these devices always do PRECISELY what you tell them? A computer program is essentially a list of very specific rules and instructions - perhaps like a cooking recipe. It follows that recipe to the letter - never straying from it - never having an original thought of any kind.
The actual instructions for the computer have to be written very precisely in special languages (called programming languages which you will learn more about in upcoming modules).
With that said, we are now in the age of “Artificial Intelligence” - where we write a set of rules for the mindless machine that make it seem like it’s thinking. Let’s look at a program to play Tic-Tac-Toe.
Artificial Intelligence
Let's say we write a program to play Tic-Tac-Toe. AI programs create relations between decisions and results: if the program loses (which it will do a lot initially), it will never play the same way again. So eventually, after enough games against real human players, it will learn a set of moves that allows it to win (or at least draw) every single time. You will learn more about AI in more advanced courses in the future.
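The "never play a losing move again" idea can be sketched in a few lines of Python. This is only a toy model with invented game states and move names, not a real Tic-Tac-Toe engine:

```python
import random

# Toy model (invented states and moves, not a real game engine):
# for each game situation, the program keeps a list of moves it is
# still willing to try.
move_options = {"start": ["corner", "edge", "centre"]}

def pick_move(state):
    # Choose any move that has not yet led to a loss.
    return random.choice(move_options[state])

def learn_from_loss(state, move):
    # After losing a game, cross the losing move off the list so it
    # is never played in that situation again.
    if move in move_options[state]:
        move_options[state].remove(move)

# Suppose opening with "edge" led to a loss:
learn_from_loss("start", "edge")
```

After the call to `learn_from_loss`, the program will only ever pick `"corner"` or `"centre"` in the `"start"` situation: it has "learned" without any real thinking, just by following its instructions.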
The Recipe Analogy
Let us continue with the cooking analogy and consider an imaginary person who has never been to a kitchen, nor seen anyone cook anything before. If you wanted this person to cook a dish for you, what would you do?
You would have to give them a recipe: a detailed, step-by-step guide that also covers every contingency, such as what to do if some ingredient X is not available, and so on.
Computers work the same way: we give them a step-by-step guide covering all possible cases so that they can perform the desired task, because as we have come to realise, computers are dumb machines that do exactly as they are instructed (no more and no less). They are basically a group of circuits with 0s and 1s (representations of low and high voltage) flowing through them. Thus, we need to provide clear, step-by-step instructions to the computer. These step-by-step instructions are what we call algorithms, which we'll discuss in detail shortly.
Let's get familiar with the term "Algorithmic Thinking". (Duration: till 1:14)
Check-point Assessment
Q. An algorithm is…
A. Blueprint of the steps to complete a task.
B. Actual stepwise execution of a task.
C. Both of them.
D. None of the above.
Check the Solution!
Answer: A) Blueprint of the steps to complete a task.
Definition of Algorithmic Thinking
Algorithmic thinking is a systematic way of thinking through problems and solutions, similar to how a computer would work through them. This approach automates the problem-solving process by creating a series of systematic, logical steps for the computer (or even a human) to follow.
Algorithmic thinking is not about solving for a specific answer; instead, it is about building a sequential, complete, and replicable process that has an end point.
Algorithmic thinking is a component of computational thinking, which you have already studied.
What are Algorithms?
Algorithms are a series of instructions that are followed, step by step, to do something useful or solve a problem. For example, you could consider a cake recipe an algorithm for making a cake.
In computer science, algorithms provide computers with a successive guide to completing actions. They consist of a precise list of instructions that outline exactly how to complete a task.
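For instance, here is a tiny algorithm written out in Python that finds the largest number in a list by following a precise, step-by-step procedure (the function name and example list are just for illustration; you will learn programming languages in upcoming modules):

```python
def largest(numbers):
    # Step 1: assume the first number is the largest seen so far.
    biggest = numbers[0]
    # Step 2: look at every remaining number, one after the other.
    for n in numbers[1:]:
        # Step 3: if this number is bigger, remember it instead.
        if n > biggest:
            biggest = n
    # Step 4: after checking every number, report the result.
    return biggest

largest([3, 41, 12, 9])  # → 41
```

Notice that every step is completely explicit; the computer never has to guess what "find the largest" means.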
In higher-level applications, algorithms combine in complex patterns, with larger tasks broken into smaller and smaller sub-steps that together build up the program as a whole.
Let's get familiar with the term "Algorithms". (Duration: from 1:14 till 3:45)
Check-point Assessment
Q. Which of the following is false?
A. Algorithm is a set of steps to carry out a task.
B. Algorithms follow a sequence.
C. Algorithms are always written in C language.
D. Algorithms are used as blueprints for actual computer programs.
Check the Solution!
Answer: C) Algorithms are always written in C language.
Essential Properties of Algorithms
Input
An algorithm should have 0 or more well-defined inputs.
Output
An algorithm should have 1 or more well-defined outputs, and should match the desired output.
Definiteness
The sequence of events and the details of each step must be clearly defined, along with the mechanism to handle errors.
Unambiguity
Algorithms should be clear and unambiguous. Each of its steps (or phases), and their inputs/outputs should be clear and must lead to only one meaning.
Effectiveness
The operations must be doable and the algorithm must be feasible in terms of computing resources.
Finiteness
Algorithms must terminate after a finite number of steps.
Notations to represent algorithms
Natural Languages: languages used by humans, like English, can be used to write out algorithms, just as a cooking recipe is written out.
Flowchart: a diagrammatic representation of the algorithm.
Pseudocode: an informal, code-like notation (literally, "fake code").
We will learn about flowcharts and pseudocodes in detail shortly.
Flowcharts
A flowchart is a diagrammatic notation that uses various graphical symbols to represent an algorithm, and it produces one of the most unambiguous and easy-to-understand representations of an algorithm.
The symbols are connected to one another to indicate the flow of information and processing.
Check out the short video on flowcharts.
An example of a flowchart for a real-life situation is shown below.
Symbols used in Flowcharts
The basic symbols used in flowcharts are shown below along with their names. You can infer the use case of each of these from its name, and you will understand them more clearly once you start drawing flowcharts.
Why are Flowcharts important?
Effective Communication
Flowcharts are a better way of communicating the logic of the system.
Effective Analysis
Using flowcharts, problems can be analysed more effectively.
Easy Debugging and Efficient Testing
A flowchart makes it easier to locate errors and to test each path through the logic.
Efficient Coding
The flowcharts are very useful during the program development phase.
Proper Documentation
Flowcharts serve as good program documentation, which is needed for various purposes.
Pseudocode
Pseudocode literally means 'fake code'. It is an informal way of writing programs in which you represent the sequence of actions and instructions (i.e. the algorithm) in a form that humans can easily understand.
The language of a computer is very rigid: you are not allowed to make any mistakes or deviate from the rules. Pseudocode works as a method for writing algorithms similar to a programming language without the rigidity of a programming language.
In pseudocode, you don't have to think about semicolons, curly braces, the syntax for arrow functions, how to define promises, DOM methods and other core language principles. You just have to be able to explain what you're thinking and doing.
Main Constructs of Pseudocode
The core of pseudocode is the ability to represent 6 programming constructs (always written in uppercase): SEQUENCE, CASE, WHILE, REPEAT-UNTIL, FOR, and IF-THEN-ELSE. These constructs (also called keywords) are used to describe the control flow of the algorithm.
SEQUENCE: linear tasks performed sequentially, one after the other.
WHILE: a loop with the condition at its beginning.
REPEAT-UNTIL: a loop with the condition at the bottom.
FOR: another way of looping.
IF-THEN-ELSE: a conditional statement changing the flow of the algorithm.
CASE: the generalised form of IF-THEN-ELSE.
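As an illustration, here is a small invented fragment (not from any particular program) that combines a few of these constructs:

```
INPUT temperature
IF temperature > 30 THEN
    DISPLAY "It is hot"
ELSE
    DISPLAY "It is not hot"
END IF

FOR day = 1 TO 7
    DISPLAY day
END FOR
```

Notice that the keywords are in uppercase while the details (variable names, messages) are written however is clearest to a human reader.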
Why is pseudocode important?
When you're writing code in a programming language, you’ll have to battle with strict syntax and rigid coding patterns. Since pseudocode is an informal method of program design, you don’t have to obey any set-out rules.
Pseudocode allows you to plan instructions which follow a logical pattern, without including all of the technical details.
Pseudocode helps in starting with software programming as a beginner as you won't have to overwhelm your brain with coding syntax.
In fact, many companies organise programming tests for their interviewees in pseudocode. This is because the importance of problem solving supersedes the ability to ‘hack’ computer code.
Planning computer algorithms with pseudocode makes you meticulous. It helps you explain exactly what each line in a software program should do. This is possible because you are in full control of everything, which is one of the great features of pseudocode.
How to think Algorithmically?
Thinking algorithmically is a shift in mindset from how we, as humans, usually think. We have all unconsciously built up shortcuts, assumptions, and rules of thumb that help us solve everyday problems without thinking about them. Algorithmic thinking, by contrast, is based on how computers would perform a particular task.
We’re not used to breaking our thought process down into its individual steps and translating that to what computers can do. For instance, computers can’t jump to general spots in a dictionary to find a word based on its spelling. A computer has to have very specific instructions.
Like all skills, thinking algorithmically takes practice.
Algorithmic thinking is all about
Defining the problem clearly
Breaking the problem down into small, simple parts
Defining the solution for each part of the problem
Implementing the solution
Making it efficient (eventually, for complex algorithms)
Problem Statement: Check whether the given number is prime or not.
The first step is to understand and define the problem clearly
We know that a prime number is a number that has exactly two factors i.e. it is only divisible by 1 AND itself. So, we have to check for numbers with precisely 2 factors with our algorithm.
The next step is to actually tackle the problem by dividing it into smaller parts
Here, we can divide the problem into checking for factors and then counting the number of factors for the given number.
Solve each smaller part of the problem
To check for factors, we can divide the numbers and if the remainder is zero then the divisor is a factor of the number. For example, 4 / 2 gives remainder 0, so 2 is a factor of 4.
Then, we divide the given number n by every number from 2 to n-1. If we find a number that is a factor of n, then n is not prime; if we don't find such a number, n is prime.
Improve the efficiency of the algorithm
We can reduce the number of factor checks in the given problem by only checking from 2 to n/2 (try figuring out why).
Let us now represent the algorithm we developed above using a flowchart and pseudocode.
Flowchart
Pseudocode
INPUT n
isPrime = TRUE
IF n <= 1
    isPrime = FALSE
END IF
i = 2
WHILE i <= n / 2 AND isPrime is TRUE
    remainder = n % i
    IF remainder is equal to 0
        isPrime = FALSE
    ELSE
        i = i + 1
    END IF
END WHILE
IF isPrime is TRUE
    DISPLAY "A Prime Number"
ELSE
    DISPLAY "Not a Prime Number"
END IF
STOP
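For comparison, the prime-checking algorithm above can also be written in a real programming language. Here is one possible Python translation (the function name is just for illustration):

```python
def is_prime(n):
    # Numbers less than or equal to 1 are not prime by definition.
    if n <= 1:
        return False
    # Try every candidate factor from 2 up to n / 2.
    i = 2
    while i <= n // 2:
        if n % i == 0:  # i divides n exactly, so n has a third factor
            return False
        i = i + 1
    return True

is_prime(13)  # → True
is_prime(12)  # → False
```

Notice how closely the Python version mirrors the step-by-step plan: the planning work was done in pseudocode, and translating it into a real language is then mostly mechanical.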
*We will be learning how to create flowcharts and writing pseudocode from the very basic problems to more intricate problems in the next lesson.*