Hello, folks! I hope these articles increase your interest in programming. First of all, let's talk about the concept of concurrency. We will also talk about deadlock and race conditions.
Concurrency control mechanisms help increase program flexibility. Concurrency is used when more than one task makes progress at the same time. Web browsers, for example, send data to and receive data from web servers concurrently. Concurrency is often confused with parallelism. The difference is that concurrent tasks run in overlapping periods of time, while parallel tasks literally run at the same instant, e.g., on a multicore processor.
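As a minimal sketch of the browser example, Python's `threading` module lets several simulated downloads overlap in time (the function and URL names here are hypothetical, and `time.sleep` stands in for real network I/O):

```python
import threading
import time

results = {}

def fetch_page(url):
    time.sleep(0.1)          # simulate waiting on a web server
    results[url] = f"<html for {url}>"

urls = ["a.example", "b.example", "c.example"]
start = time.time()
threads = [threading.Thread(target=fetch_page, args=(u,)) for u in urls]
for t in threads:
    t.start()                # all three waits now overlap
for t in threads:
    t.join()
elapsed = time.time() - start
# Because the three 0.1 s waits overlap, the total time is close to
# 0.1 s rather than the 0.3 s a strictly sequential version would take.
print(len(results), round(elapsed, 2))
```

Note that on a single core this is still only concurrency, not parallelism: the threads take turns, but their waiting periods overlap.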
Concurrency occurs at four levels:
- Instruction level — executing two or more machine instructions simultaneously
- Statement level — executing two or more high-level language statements simultaneously
- Unit level — executing two or more subprogram units simultaneously
- Program level — executing two or more programs simultaneously
Concurrency is called scalable if the speed of execution increases when more processors are available.
Two categories of computer architecture use multiple data streams: Single Instruction, Multiple Data (SIMD) and Multiple Instruction, Multiple Data (MIMD). In an SIMD computer there is a single instruction decoder, so every processor executes the same instruction at the same time on its own data (each processor may have its own local memory). In an MIMD computer there are multiple decoders, so each processor runs its own instruction stream. SIMD machines require less memory and execute synchronously. MIMD machines appear in two configurations: distributed-memory and shared-memory systems.
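A loose software analogy (not real hardware) may help: SIMD-style execution applies one operation to every element of a data stream, while MIMD-style execution runs independent instruction streams that can each do different work.

```python
import threading

data = [1, 2, 3, 4]

# SIMD-style: a single "instruction" (doubling) applied to all elements.
simd_result = [x * 2 for x in data]

# MIMD-style: two independent instruction streams doing different work.
mimd_results = {}

def doubler():
    mimd_results["doubled"] = [x * 2 for x in data]

def summer():
    mimd_results["sum"] = sum(data)

t1 = threading.Thread(target=doubler)
t2 = threading.Thread(target=summer)
t1.start(); t2.start()
t1.join(); t2.join()

print(simd_result)    # [2, 4, 6, 8]
print(mimd_results)
```

Again, this is only an analogy: real SIMD and MIMD are hardware properties, not a choice made in Python code.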
There are two categories of concurrent unit control:
Physical concurrency — multiple program units from the same program literally execute simultaneously, each on its own processor.
Logical concurrency — multiple program units appear to the programmer to execute simultaneously, when in fact their execution is interleaved on a single processor.
Concurrency gives rise to deadlock and race conditions. Deadlock happens when processes cannot continue because each is waiting for a resource that another has locked. Suppose Process 1 holds resource A and requests resource B, while Process 2 holds resource B and requests resource A; since A and B stay locked while the processes are running, neither request can ever be granted. The point is that no process can continue until the other finishes, so they wait for each other forever. This situation is called deadlock.
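The classic cure for this cycle is to make every process acquire locks in the same global order, so no process can ever hold B while waiting for A. A minimal sketch with two locks (the lock and function names are illustrative, not from any particular library):

```python
import threading

resource_a = threading.Lock()
resource_b = threading.Lock()
log = []

# Deadlock would arise if process_1 took A then B while process_2
# took B then A. Acquiring the locks in one fixed order (always A
# before B) removes the circular wait, so both threads finish.
def process_1():
    with resource_a:
        with resource_b:
            log.append("process 1 used A and B")

def process_2():
    with resource_a:          # same order as process_1, never B first
        with resource_b:
            log.append("process 2 used A and B")

t1 = threading.Thread(target=process_1)
t2 = threading.Thread(target=process_2)
t1.start(); t2.start()
t1.join(); t2.join()
print(log)
```

If you swap the two `with` statements in one of the functions, the program can hang forever, which is exactly the waiting cycle described above.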
A race condition, or race hazard, is an unexpected or undesirable situation that happens when the system tries to perform operations on shared data at the same time. Suppose the variable TOTAL has the value 3, task A executes TOTAL = TOTAL + 1, and task B executes TOTAL = 2 * TOTAL. If both tasks fetch the value 3 before either writes back, TOTAL will be 6 if B writes last, or 4 if A writes last, because one update is lost. If B completes its operation before task A begins, TOTAL will be 7; if A completes before B begins, TOTAL will be 8. This is called a race condition because two or more tasks are racing to use the shared resource, and the outcome depends on which task arrives first, or wins the race.
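The TOTAL example can be replayed deterministically by splitting each task into an explicit fetch step and a store step. This is a simulation of the interleavings, not real threads; it assumes task A executes TOTAL = TOTAL + 1 and task B executes TOTAL = 2 * TOTAL, as in Sebesta's version of this example.

```python
# Each task is split into "fetch" (read TOTAL into a private register)
# and "store" (write the computed value back), so any interleaving of
# the two tasks can be replayed step by step.
def run(interleaving, total=3):
    regs = {}                                  # each task's fetched value
    ops = {"A": lambda v: v + 1, "B": lambda v: 2 * v}
    for task, step in interleaving:
        if step == "fetch":
            regs[task] = total
        else:                                  # "store"
            total = ops[task](regs[task])
    return total

# A finishes before B begins: (3 + 1) * 2 = 8
print(run([("A", "fetch"), ("A", "store"), ("B", "fetch"), ("B", "store")]))
# B finishes before A begins: 2 * 3 + 1 = 7
print(run([("B", "fetch"), ("B", "store"), ("A", "fetch"), ("A", "store")]))
# Both fetch 3 first; B stores last, so A's update is lost: 6
print(run([("A", "fetch"), ("B", "fetch"), ("A", "store"), ("B", "store")]))
# Both fetch 3 first; A stores last, so B's update is lost: 4
print(run([("A", "fetch"), ("B", "fetch"), ("B", "store"), ("A", "store")]))
```

Four different final values from the same two tasks — which is exactly why unsynchronized access to shared variables is dangerous.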
Processes, also known as tasks, fall into two categories:
Heavyweight task — executes in its own address space.
Lightweight task, also known as a thread — runs inside the address space of a normal (heavyweight) process, so all the lightweight tasks of that process share the same address space and may share variables.
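Because threads share their process's variables, they also need synchronization when updating them. A short sketch: four threads incrementing one shared counter, with a lock guarding each update.

```python
import threading

counter = 0                      # shared variable in the process's address space
lock = threading.Lock()

def worker():
    global counter
    for _ in range(10_000):
        with lock:               # without the lock, this would be a race condition
            counter += 1

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)                   # 40000: every increment was preserved
```

With the lock removed, lost updates like the TOTAL example become possible and the final count can fall short of 40000.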
Thank you very much for your time. Don’t forget to read more articles. Take care. Peace ✌🏼