User:LiviuE/sandbox

Amdahl's law

Amdahl's law states that, for any system, if an improvement is made to a part of it, the overall speedup (benefit) that the improvement brings to the whole system is given by the formula:

    SF = 1 / ((1 - Is) + Is / GP)

where:

SF = the overall speedup factor; for a computer program it is the old execution time divided by the new, improved one.

Is = the interval of speedup, meaning the fraction of the total time in which the improvement actually takes place. For a computer program, if a section takes 20% of the total time the program needs to complete, any improvement to that section will have Is = 20% = 0.2.

GP = the gained performance, meaning the old execution time of the improved portion divided by the new execution time of that same portion. It is a raw measure of the improvement itself.

What you should be looking for when optimizing a program

There are many types of improvements that can be made to a computer program. I will divide them into two categories: cheap ones and expensive ones. Take for example the Java language.

Cheap optimizations are things like the following (a consolidated sketch appears after this list):

• using the shift operators whenever possible (x << 1 instead of x * 2, and x >> 1 instead of x / 2)

• reading the loop bound into a constant once, before the loop (for (..; i < size; i++) instead of for (..; i < list.size(); i++))

• eliminating redundant calculations in a common expression:

    double depth = d * (lim / max);
    double x = depth * sx;
    double y = depth * sy;

  instead of:

    double x = d * (lim / max) * sx;
    double y = d * (lim / max) * sy;
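To make these three cheap optimizations concrete, here is a small, self-contained Java sketch; the class and method names (CheapOptimizations, sum, project) are hypothetical, chosen only for illustration:

    import java.util.List;

    public class CheapOptimizations {

        // Shift instead of multiply/divide by two (for non-negative ints).
        static int doubled(int x) { return x << 1; } // same result as x * 2
        static int halved(int x)  { return x >> 1; } // same result as x / 2 for x >= 0

        // Loop bound read once instead of being re-evaluated on every iteration.
        static long sum(List<Integer> values) {
            long total = 0;
            final int size = values.size(); // hoisted out of the loop condition
            for (int i = 0; i < size; i++) {
                total += values.get(i);
            }
            return total;
        }

        // Common subexpression d * (lim / max) computed once and reused.
        static double[] project(double d, double lim, double max, double sx, double sy) {
            double depth = d * (lim / max);
            return new double[] { depth * sx, depth * sy };
        }

        public static void main(String[] args) {
            System.out.println(doubled(21));              // 42
            System.out.println(halved(42));               // 21
            System.out.println(sum(List.of(1, 2, 3, 4))); // 10
            double[] p = project(2.0, 3.0, 6.0, 0.5, 0.25);
            System.out.println(p[0] + ", " + p[1]);       // 0.5, 0.25
        }
    }

Each method keeps the behavior of the slower form shown in the list while avoiding the repeated work.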
Notice that these cheap optimizations are closely related to good Java practices.

Expensive, or costly, optimizations are the ones that require new code to be written and old code to be adapted; they change some in-depth aspects of the program. In the end it all translates into a lot of programmer hours spent on these optimizations. Here are some examples of costly optimizations (a sketch of the first one appears after this list):

• changing parts of the program to use parallel programming, meaning dividing the work of a section of code among several threads

• adding a cache plugin and using caching for certain functions, such as caching the function that displays a picture

• redesigning a function of complexity O(n²) to be of complexity O(n log n), which usually involves changing other parts of the program
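As an illustration of the first costly optimization, here is a minimal Java sketch that divides a summation among threads using the standard java.util.concurrent executor API; the class name ParallelSum and the chunking scheme are my own, chosen only for the example:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class ParallelSum {

        // Sum the array by splitting it into one chunk per available core.
        static long parallelSum(long[] data) throws Exception {
            int threads = Runtime.getRuntime().availableProcessors();
            ExecutorService pool = Executors.newFixedThreadPool(threads);
            int chunk = (data.length + threads - 1) / threads;

            List<Future<Long>> parts = new ArrayList<>();
            for (int t = 0; t < threads; t++) {
                final int from = Math.min(t * chunk, data.length);
                final int to = Math.min(from + chunk, data.length);
                parts.add(pool.submit(() -> {
                    long s = 0;
                    for (int i = from; i < to; i++) s += data[i];
                    return s;
                }));
            }

            long total = 0;
            for (Future<Long> part : parts) total += part.get(); // wait for each chunk
            pool.shutdown();
            return total;
        }

        public static void main(String[] args) throws Exception {
            long[] data = new long[1_000_000];
            for (int i = 0; i < data.length; i++) data[i] = i;
            System.out.println(parallelSum(data)); // 499999500000
        }
    }

In Amdahl's terms, the GP of such a change is at best the number of cores, and it applies only to the portion of the program that was actually parallelized.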
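The next section plugs concrete numbers into Amdahl's formula, so as a small aid here is a minimal Java helper implementing the formula from the first section (the name amdahlSpeedup is hypothetical, mine):

    public class Amdahl {

        // SF = 1 / ((1 - Is) + Is / GP), exactly as defined above.
        static double amdahlSpeedup(double is, double gp) {
            return 1.0 / ((1.0 - is) + is / gp);
        }

        public static void main(String[] args) {
            System.out.printf("cheap:  %.2f%n", amdahlSpeedup(0.80, 1.5)); // ~1.36
            System.out.printf("costly: %.2f%n", amdahlSpeedup(0.15, 4.0)); // ~1.13
        }
    }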
How to be efficient

Given a certain amount of time (hours), a developer must choose (and it is usually a tough choice) what type of optimization to add to the program. We will compute here the efficiency of each type of optimization (the cheap ones and the costly ones) by using Amdahl's law. The following computations are just approximations, but one can hopefully see that the conclusion extracted from them is a valid one.

Cheap optimizations apply to a large section of code; usually a very large part of the operations performed by a program can be improved this way. We shall estimate that Is = 80% and GP = 1.5. This means:

    SF = 1 / (0.2 + 0.8 / 1.5) = 1 / 0.733 ≈ 1.36

So for the cheap optimizations we estimate SFcheap ≈ 1.36.

Now for the costly optimizations: since there are fewer portions of code where they can be applied, if the program is not especially amenable to parallel execution or another major optimization, we estimate that Is = 15% and GP = 4. This means:

    SF = 1 / (0.85 + 0.15 / 4) = 1 / 0.8875 ≈ 1.13

So for the costly optimizations we estimate SFcostly ≈ 1.13.

Conclusions

Perhaps surprisingly, we estimated that SFcheap (≈ 1.36) is even slightly larger than SFcostly (≈ 1.13). Even though we made a lot of approximations, we can see that for the average program the optimizations that can be done cheaply (in a few hours) have an impact just as big as, and here even bigger than, the ones that require many hours of programming work.

(My name is LiviuE and I wrote this as part of my PPAM (PCSAM master) homework.)