
In computer programming, Amdahl's law states that, in a program with parallel processing, a relatively few instructions that must be performed in sequence place a limit on program speedup, such that adding more processors may not make the program run faster. This is generally an argument against parallel processing for certain applications and, more broadly, against overstated claims for parallel computing. Others argue that the kinds of applications for which parallel processing is best suited tend to be larger problems, in which scaling up the number of processors does indeed bring a corresponding improvement in throughput and performance.
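The limit described above can be made concrete with the usual formulation of Amdahl's law: if a fraction p of a program's work can be parallelized across n processors, the theoretical speedup is 1 / ((1 - p) + p/n). A minimal sketch in Python (the function name and the sample values of p and n are illustrative, not from the text):

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Theoretical speedup when a fraction p of the work is
    parallelizable and runs on n processors (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with 95% of the work parallelizable, the sequential 5%
# caps the speedup at 1 / 0.05 = 20x, no matter how many
# processors are added.
for n in (1, 10, 100, 1000):
    print(f"{n:>5} processors: {amdahl_speedup(0.95, n):.2f}x")
```

Running the loop shows the diminishing returns the article describes: going from 100 to 1000 processors barely moves the speedup, because the sequential fraction dominates.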
