Concurrency

Ilyas Karimov
3 min read · Dec 13, 2020

Hello, folks! I hope these articles increase your interest in programming. First of all, let’s talk about the concept of concurrency. We will also cover deadlocks and race conditions.

Concurrency control mechanisms help increase program flexibility. Concurrency is used when more than one task is in progress at the same time. Web browsers are a good example: they send requests to and receive data from web servers while still responding to the user. Concurrency is often confused with parallelism. The difference is that concurrent tasks run in overlapping, possibly interleaved, periods of time, while parallel tasks literally run at the same instant, e.g., on a multicore processor.
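To make the distinction concrete, here is a small sketch in Go (the language and the task names are my own choice for illustration, not something the article prescribes). Pinned to a single core, the two tasks interleave their steps, which is concurrency without parallelism; with more cores available they could also run in parallel.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

// worker prints a few steps, yielding after each one so another task can run.
func worker(name string, wg *sync.WaitGroup) {
	defer wg.Done()
	for i := 1; i <= 3; i++ {
		fmt.Printf("%s: step %d\n", name, i)
		runtime.Gosched() // give the scheduler a chance to interleave the other task
	}
}

func main() {
	runtime.GOMAXPROCS(1) // one core: the tasks interleave (concurrency) but never run in parallel

	var wg sync.WaitGroup
	wg.Add(2)
	go worker("send", &wg)    // e.g., sending a request to a web server
	go worker("receive", &wg) // e.g., handling the data coming back
	wg.Wait()
}
```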

Concurrency occurs at four levels:

  1. Instruction level — executing two or more machine instructions simultaneously
  2. Statement level — executing two or more high-level language statements simultaneously
  3. Unit level — executing two or more subprogram units simultaneously
  4. Program level — executing two or more programs simultaneously

A concurrent program is called scalable if its speed of execution increases when more processors are available.

There are two categories of computer architectures that use multiple data streams: Single Instruction, Multiple Data (SIMD) and Multiple Instruction, Multiple Data (MIMD). In an SIMD computer, there is a single instruction decoder, and every processor executes the same instruction synchronously on its own local memory. In an MIMD computer, there are multiple decoders, so each processor runs its own instruction stream. SIMD requires less memory, since only one copy of the program is needed, and its execution is synchronous. MIMD computers appear in two configurations: distributed-memory and shared-memory systems.

There are two categories of concurrent unit control:

Physical concurrency — Multiple program units from the same program that literally execute simultaneously.

Logical concurrency — Multiple program units from the same program appear to execute simultaneously, when in fact their execution is interleaved on a single processor

Concurrency can lead to deadlocks and race conditions. Deadlock happens when processes cannot make progress because the resources they need are locked by each other. Suppose Process 1 holds resource A and requests resource B, while Process 2 holds resource B and requests resource A; both resources stay locked while the two processes are running. Neither process can continue until the other finishes, so they wait for each other forever. This situation is called a deadlock.
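Here is a minimal Go sketch of that scenario (the resource and goroutine names are illustrative): each goroutine locks one resource and then waits forever for the resource held by the other.

```go
package main

import (
	"sync"
	"time"
)

func main() {
	var resourceA, resourceB sync.Mutex
	var wg sync.WaitGroup
	wg.Add(2)

	go func() { // "Process 1": holds A, then waits for B
		defer wg.Done()
		resourceA.Lock()
		defer resourceA.Unlock()
		time.Sleep(100 * time.Millisecond) // let the other goroutine grab B first
		resourceB.Lock()                   // blocks forever: B is held by "Process 2"
		defer resourceB.Unlock()
	}()

	go func() { // "Process 2": holds B, then waits for A
		defer wg.Done()
		resourceB.Lock()
		defer resourceB.Unlock()
		time.Sleep(100 * time.Millisecond) // let the other goroutine grab A first
		resourceA.Lock()                   // blocks forever: A is held by "Process 1"
		defer resourceA.Unlock()
	}()

	wg.Wait() // never returns; the Go runtime typically aborts with "all goroutines are asleep - deadlock!"
}
```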

A race condition, or race hazard, is an unexpected or undesirable situation that arises when the system performs operations on shared data at the same time. Suppose the variable TOTAL has the value 3 before any modification, task A adds 1 to it, and task B multiplies it by 2; each task fetches the value, modifies it, and stores it back. If both tasks fetch 3 and A stores its result (4) before B stores its result (6), TOTAL will be 6. If both fetch 3 and B stores its result before A does, TOTAL will be 4. If B completes its operation before task A begins, TOTAL will be 7. This is called a race condition because two or more tasks are racing to use the shared resource, and the behavior depends on which task arrives first, that is, which one wins the race.
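The TOTAL example as a small Go sketch (the names and structure are illustrative): task A adds 1 and task B multiplies by 2 without any synchronization, so the final value depends on which task wins the race. Running it with `go run -race` would flag the conflicting accesses.

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	total := 3 // shared variable, not protected by any lock
	var wg sync.WaitGroup
	wg.Add(2)

	go func() { // task A: fetch, add 1, store back
		defer wg.Done()
		v := total
		total = v + 1
	}()

	go func() { // task B: fetch, multiply by 2, store back
		defer wg.Done()
		v := total
		total = v * 2
	}()

	wg.Wait()
	fmt.Println(total) // 4, 6, 7, or 8 depending on how the fetches and stores interleave
}
```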

Processes, also known as tasks, fall into two categories:

Heavyweight task — It executes in its own address space

Lightweight tasks, also known as threads — They all run in the same address space. A lightweight task runs within the address space of a normal (heavyweight) process, and the lightweight tasks under the same process may share variables, as the sketch below shows.
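As a rough illustration, here is a Go sketch (goroutines standing in for lightweight tasks; the shared counter is an assumed example): several tasks run in the same address space and share one variable, with a mutex keeping the updates consistent.

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	var (
		counter int        // shared variable, visible to every goroutine in the process
		mu      sync.Mutex // protects counter from concurrent updates
		wg      sync.WaitGroup
	)

	for i := 0; i < 4; i++ { // four lightweight tasks within one heavyweight process
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := 0; j < 1000; j++ {
				mu.Lock()
				counter++
				mu.Unlock()
			}
		}()
	}

	wg.Wait()
	fmt.Println(counter) // always 4000, because the shared variable is only touched under the lock
}
```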

Thank you very much for your time. Don’t forget to read more articles. Take care. Peace ✌🏼
