Parallel Computing MCQ Questions and Answers
Parallel Computing MCQ Questions with Answers assess your understanding of multi-threading, distributed systems, and concurrency. This topic plays a vital role in modern computing, especially in data-intensive applications. These questions are frequently asked in interviews at Infosys and Capgemini, and in GATE computer science papers. Practicing these MCQs helps you understand process synchronization, scalability, and system performance optimization. Download the parallel computing MCQ PDF and take online practice tests to enhance your technical knowledge and exam readiness.
Parallel Computing MCQ
1. It is the simultaneous use of multiple compute resources to solve a computational problem
- Parallel Computing
- Single processing
- Sequential computing
- None of these
2. Parallel Execution
- A sequential execution of a program one statement at a time
- Execution of a program by more than one task, with each task being able to execute the same or different statement at the same moment in time
- A program or set of instructions that is executed by a processor
- None of these
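To make the correct definition above concrete, here is a minimal Python sketch contrasting serial execution (one statement stream, one task after another) with parallel execution by several tasks at the same moment in time. The task size and worker count are arbitrary choices for illustration, not part of the question.

```python
import time
from concurrent.futures import ProcessPoolExecutor

def busy_work(n):
    # CPU-bound loop standing in for a unit of computational work
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [2_000_000] * 4

    # Serial execution: one statement at a time, one task after another
    start = time.perf_counter()
    serial_results = [busy_work(n) for n in jobs]
    t_serial = time.perf_counter() - start

    # Parallel execution: several tasks executing at the same moment in time
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=4) as pool:
        parallel_results = list(pool.map(busy_work, jobs))
    t_parallel = time.perf_counter() - start

    print(f"serial: {t_serial:.2f}s  parallel: {t_parallel:.2f}s")
```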
3. Parallel computing can include
- Single computer with multiple processors
- Arbitrary number of computers connected by a network
- A combination of both of the above
- None of these
4. Serial Execution
- A sequential execution of a program one statement at a time
- Execution of a program by more than one task, with each task being able to execute the same or different statement at the same moment in time
- A program or set of instructions that is executed by a processor
- None of these
5. Shared Memory is
- A computer architecture where all processors have direct access to common physical memory
- It refers to network-based memory access for physical memory that is not common
- Parallel tasks typically need to exchange data. There are several ways this can be accomplished, such as through a shared memory bus or over a network; however, the actual event of data exchange is commonly referred to as communications, regardless of the method employed
- None of these
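The first option above is the correct one. As a rough illustration of the shared-memory model, the Python sketch below gives four worker processes direct access to a single shared counter. The choice of multiprocessing.Value and the iteration counts are assumptions made for this example.

```python
from multiprocessing import Process, Value, Lock

def increment(counter, lock, times):
    for _ in range(times):
        with lock:                # synchronize access to the shared location
            counter.value += 1

if __name__ == "__main__":
    counter = Value("i", 0)       # one memory location all workers can access
    lock = Lock()
    workers = [Process(target=increment, args=(counter, lock, 10_000))
               for _ in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(counter.value)          # 40000: every worker updated the same memory
```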
6. Parallel Overhead is
- Observed speedup of a code which has been parallelized, defined as the ratio of the wall-clock time of serial execution to the wall-clock time of parallel execution
- The amount of time required to coordinate parallel tasks. It includes factors such as task start-up time, synchronizations, and data communications
- Refers to the hardware that comprises a given parallel system - having many processors
- None of these
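The second option is the correct one, and the speedup in the first option is computed as serial time divided by parallel time. The sketch below makes the overhead visible; the tasks are chosen deliberately tiny (an assumption for illustration), so start-up and communication costs dominate and the "parallel" version can come out slower.

```python
import time
from concurrent.futures import ProcessPoolExecutor

def tiny(x):
    return x + 1                  # almost no computation per task

if __name__ == "__main__":
    data = list(range(10_000))

    start = time.perf_counter()
    serial = [tiny(x) for x in data]
    t_serial = time.perf_counter() - start

    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=4) as pool:
        parallel = list(pool.map(tiny, data))  # pays start-up + IPC per task
    t_parallel = time.perf_counter() - start

    # Speedup = wall-clock serial time / wall-clock parallel time;
    # here it is typically below 1 because coordination outweighs the work.
    print(f"speedup: {t_serial / t_parallel:.2f}x")
```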
7. Massively Parallel
- Observed speedup of a code which has been parallelized, defined as the ratio of the wall-clock time of serial execution to the wall-clock time of parallel execution
- The amount of time required to coordinate parallel tasks. It includes factors such as task start-up time, synchronizations, and data communications
- Refers to the hardware that comprises a given parallel system - having many processors
- None of these
8. Fine-grain Parallelism is
- In parallel computing, it is a qualitative measure of the ratio of computation to communication
- Here relatively small amounts of computational work are done between communication events
- Relatively large amounts of computational work are done between communication / synchronization events
- None of these
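The second option describes fine-grain parallelism (the third describes coarse-grain). As a rough sketch, the chunksize argument of pool.map below is used as the granularity knob; the specific sizes are arbitrary assumptions for the example.

```python
import time
from concurrent.futures import ProcessPoolExecutor

def work(_):
    # a fixed, small amount of computation per item
    return sum(i * i for i in range(1_000))

if __name__ == "__main__":
    data = range(20_000)
    for chunksize in (1, 500):    # fine-grained vs coarse-grained batching
        start = time.perf_counter()
        with ProcessPoolExecutor(max_workers=4) as pool:
            list(pool.map(work, data, chunksize=chunksize))
        # small chunks mean many communication events between bits of work;
        # large chunks amortize communication over more computation
        print(f"chunksize={chunksize}: {time.perf_counter() - start:.2f}s")
```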
9. In Shared Memory
- Changes in a memory location effected by one processor do not affect all other processors.
- Changes in a memory location effected by one processor are visible to all other processors.
- Changes in a memory location effected by one processor are randomly visible to all other processors.
- None of these
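The second option is correct. A minimal way to observe this, assuming Python threads as stand-ins for processors sharing one address space (names are illustrative):

```python
import threading
import time

flag = {"done": False}            # a location in the common address space

def writer():
    time.sleep(0.1)
    flag["done"] = True           # a change effected by one thread...

def reader():
    while not flag["done"]:       # ...is visible to the other
        time.sleep(0.01)
    print("reader observed the write")

t_writer = threading.Thread(target=writer)
t_reader = threading.Thread(target=reader)
t_reader.start()
t_writer.start()
t_writer.join()
t_reader.join()
```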
10. In Shared Memory:
- Here all processors access all memory as a global address space
- Here all processors have individual memory
- Here some processors access all memory as a global address space and some do not
- None of these