
I. Introduction

Despite the fact that the selection of suitable simulation software is of considerable importance to any simulation practitioner, few papers have contributed methodologies for simulation software selection and evaluation. Some literature concentrates on information about specific packages and comparisons of some of them, but it is beyond the scope of this paper to cover this, especially as package updates render previous comparisons obsolete. A substantial amount of work on the selection and evaluation of simulation tools has been done by Dr. Kawaljeet Singh, Dr. Rajesh Verma and Ashu Gupta from India; details are available at www.simvehic.com.

A point usually discussed in related articles is the classification of simulation software. Currently, simulation software is classified into two major classes: languages and simulators. Languages are either general-purpose programming languages like FORTRAN, or general-purpose simulation languages like GPSS, introduced in 1960. A language is like a small foundry: you can, and have to, make any tool you need, but it takes time and needs expertise. A simulator, introduced in 1980 (Banks et al., 1991), is like a toolbox containing a limited number of different tools and perhaps some flexible ones. The main advantage of using a simulator is that you do not need to spend time and effort on making tools, but the flexibility is not as great as that of a language. Because of the growth in quality, features and flexibility of simulators, they are now used more often than languages for simulation modeling. A survey by Hlupic and Paul (1993) showed that less than 10% of users at universities and in industry use only simulation languages; the majority use either both or just simulators.

Programming-like commands and interfaces with programming languages are features that make a simulator flexible. In general, improvements in the facilities available in simulators make them increasingly powerful, flexible and user-friendly. According to Banks (1991), the distinction between simulators and simulation languages is blurring; they are moving toward each other by offering special features. Some experts introduce other classifications. Banks (1991) counted spreadsheets and rapid modeling tools as two other classes of simulation modeling tools. Carson (1990) distinguishes four classes: pure simulators, simulators with programming-like capability, simulation languages with simulator-like extensions, and simulation languages.

Whatever the classification, what matters in simulation software selection is the capabilities, features and suitability for the specific application area rather than the way they are classified.

II. Business Process Simulation Tools

Currently the market offers a variety of discrete-event simulation software packages. Some are less expensive than others. Some are generic and can be used in a wide variety of application areas while others are more specific. The following studies focus on evolution, types and categories of simulation tools in the marketplace.

Suri et al. (1990) described rapid modeling tools such as ManuPlan/SimStarter, marketed by Network Dynamics Inc. The purpose of these tools is to gain an idea about such measures of performance as throughput and bottlenecks. The system is modeled in very general terms, omitting many of the details in order to get an idea about the performance measures. In many instances, this level of output is sufficient as it answers the questions that are being asked in a timely manner.

Bhaskar et al. (1994) proposed a set of requirements that should be met by tools used for modeling and simulation of business processes. These requirements can be divided into five groups: process documentation, process redesign, performance measurement, communication, and institutional learning.

Greasley (1994) evaluated a number of tools for the redesign of processes through the use of two case studies. There is a particular emphasis on the use of Business Process Simulation in conjunction with Activity Based Costing and Activity Based Budgeting within the context of a Business Process Reengineering approach. The use of a balanced scorecard and marking guide can be used to identify suitable processes for redesign. A process map enables a study of the relationship between the activities that form the process. The process map relates to the conceptual map in a simulation study.

Aalst and Hee (1995) proposed high-level Petri nets as a tool for the modeling and analysis of business processes. Petri nets have proved to be useful in the context of logistics and production control. However, the application of these Petri nets is not restricted to logistics and manufacturing, they can also be used to support business process reengineering efforts. High-level Petri nets have a formal semantics. A Petri net model of a business process is a precise and unambiguous description of the behavior of the modeled process. The precise nature and the firm mathematical foundation of Petri nets have resulted in an abundance of analysis methods and tools.
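The token-game semantics that gives Petri nets their precision is simple enough to sketch in a few lines. The following is a minimal, illustrative Python sketch (the process, place and transition names are hypothetical, not taken from Aalst and Hee): a transition is enabled when every input place holds a token, and firing it consumes input tokens and produces output tokens.

```python
class PetriNet:
    """Minimal place/transition net with unweighted arcs."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        # A transition is enabled when every input place holds a token.
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        # Firing consumes one token per input place, produces one per output.
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Hypothetical simplified claim-handling process: received -> checked -> approved.
net = PetriNet({"received": 1, "checked": 0, "approved": 0})
net.add_transition("check", ["received"], ["checked"])
net.add_transition("approve", ["checked"], ["approved"])
net.fire("check")
net.fire("approve")
print(net.marking)  # {'received': 0, 'checked': 0, 'approved': 1}
```

Because the marking and the firing rule are fully explicit, properties such as reachability and deadlock can be checked mechanically, which is the source of the "abundance of analysis methods" noted above.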

Tumay (1995) gave an overview of business process simulation, described the modeling and analysis considerations, and listed typical model input, simulation and output requirements. He provided a classification of simulation software products to aid the user in understanding the business process simulation tools. He also presented a simulation exercise to illustrate the power and suitability of simulation for analyzing a business process. Wright and Burns (1996) said that many tools are available to stimulate, simulate and support the business process modeling approaches, based on different methodologies and frameworks. They tended to describe the business process analysis and modeling market as having three groups:

- drawing packages with templates;
- drawing packages with templates and some spreadsheet-like capabilities; and
- the heavyweight BPA/BPM tools proper, akin to discrete event simulators.

Bing and David (1997) examined the wide range of Business Process Analysis/Modeling (BPA/M) tools available, and compared the features of 12 specific tools. They presented two case studies with examples of software tool analysis results. The discussion addressed whether these tools met the needs of managers of change and Business Process Re-engineering (BPR) initiatives, and offered suggestions for future tool evolution. The focus is to highlight the differences between the often lofty claims of tool vendors, and both the real needs of BPR analysts and implementers, and the actual capabilities of the BPR tools.

Kettinger et al. (1997) reported about at least 72 techniques and 102 tools. No single technique or approach can capture the whole spectrum of requirements posed by different people and applications. The choice of a modeling technique for a particular project should be based on matching the virtues and limitations of various techniques with the objectives of the project.

Aguilar et al. (1999) stated that Business Process Simulation is a powerful tool supporting analysis and design of business processes. The added value of simulation is based on four factors:

- Process performance analysis is improved by simulation-triggered measurements.
- Simulation (especially animation) is an effective tool to communicate process thinking and process analysis results.
- Simulation enables the migration towards dynamic models for business processes.
- Simulation provides essential decision support by anticipating changes.

Stemberger et al. (2003) discussed the level of information system modeling and simulation modeling methods and tools integration in the conditions of dynamic e-business environment. They stressed the necessity for integrating simulation modeling and information system modeling. The integrated BPM tools combine formerly diverse areas of business process, IT, resource and financial modeling, enabling the companies to form a complete view of their operations and providing a framework for efficient development of robust and complete enterprise architecture.

Vreede et al. (2003) considered the suitability of Arena to simulate business processes. They stated that a weak point in simulating business processes is the time consuming and complicated process to create simulation models. They took advantage of the possibility to develop their own template with predefined building blocks, which they considered to be successful in several simulation studies they carried out.

III. Evaluation & Selection of Business Process Simulation Tools

Evaluation of simulation packages is not new. Many researchers have carried out surveys on available packages for different purposes. However, there are only a limited number of papers that describe methods to perform an evaluation and selection of simulation packages. Some of the studies are:

One of the best-known early simulation software evaluations and comparisons was carried out by Tocher (1965). The simulation languages analyzed were GPSS, SIMPAC, SIMSCRIPT, SIMULA, CSL, ESP, GSP, MONTECODE and SIMON. These languages were examined on the basis of the following groups of criteria: the organization of time and activities in a simulation programming system; naming and structure of entities and generalized activity specification; testing of conditions in activities; test formation facilities; and statistics collection procedures. It was estimated how well the languages under consideration satisfied the criteria within each group. Subsequently, each language was briefly described with an emphasis on its main qualities and weaknesses. The languages evaluated were not ranked, nor were particular ones recommended for use.

A comprehensive evaluation of fifteen simulation languages is provided by Cellier (1983). Languages examined are ACSL, DARE-P, SIMNON, DYMOLA, SYSMOD, FORSIM-IV, SIMULA 67, PROSM, SIMSCRIPT-II, GPSS_FORTRAN-II, GPSS_FORTRAN-III, SLAM-II, GASP-V, GASP-VI and COSY. The evaluation criteria are classified in six groups regarding expressiveness of the language, numerical behavior, structural features, status of implementation, portability, and documentation. Features within each group are assessed according to their availability and quality. Data presented is analyzed and software tools are compared and ranked on the basis of evaluation.

Haider and Banks (1986) addressed the issues relating to the choice of simulation software products for the analysis of manufacturing systems and established the following desirable features for simulation software: input flexibility; structural modularity; modeling flexibility; macro capability and hierarchical modeling; materials handling modules; standard statistics generation; data analysis; animation; interactive model debugging; micro/mainframe compatibility; the support provided by the supplier; and the cost of simulation software.

Grant and Weiner (1986) analyzed simulation software products such as BEAM, Cinema, PCModel, SEE WHY and SIMFACTORY II.5, Modelmaster, RTCS with the main emphasis on their graphical and animation features. The analysis is done on the basis of the information provided by the vendors. The features examined are grouped in three main groups. The simulation model building system group includes the main orientation of the software and flexibility. Animation graphics related features determine the type of graphics and animation. Criteria within the operational considerations include the cost of the software, platforms on which software can be run and determination of need for a specialized VDU. The authors do not comment on the provided features of software tools. They conclude with a specification of general trends regarding simulation software tools such as the implementation of software on microcomputers, manufacturing oriented preprocessors, lower priced systems and interactivity both for model building and model animation.

Deaver (1987) identified a need to thoroughly analyze system requirements before selecting simulation software, as simulation packages vary widely in capability. Some of the factors that should be considered prior to the selection of simulation software include: identification of potential simulation users, consideration of future training for employees, determination of the types of systems to be simulated, analysis of the resources currently available, and consideration of the amount of time that is to be dedicated to simulation. In addition, several criteria are presented that can be used for software evaluation. These criteria include features such as graphics, interaction, statistical data gathering and analysis, flexibility, support provided by the vendor, and the ability to model both discrete-event and continuous processes.

Szymankiewicz et al. (1988) provided a list of features which manufacturing simulators should possess. These include: an effective user interface; an implemented set of algorithms for sequencing production orders; interactivity; an interface to external data sources; a mechanism to store all input and output data in a database; fast execution of simulation; coded algorithms; standard and user-coded performance reports; storage of data used for model design in an external file; and orientation to design issues including randomness.

Bovone et al. (1989) proposed a simple three step method for the selection of simulation software. The purpose of using this method is to obtain the weights which can express the importance of software evaluation criteria with regard to the simulation objectives. The applicability of this method is illustrated using the following criteria: flexibility, learning and use, modeling speed, running speed, report features, debugging, stochastic capacity, ease of transport, service and reliability. Separate evaluation tables are constructed both for the conceptual design and for the detailed design which emphasizes the importance of flexibility. On the basis of evaluation of several simulation packages using this method, the authors conclude that no product is superior to the others with regard to both software purposes.

Ekere and Hannam (1989) presented an evaluation of the event, activity and process-based approaches for modeling as well as an evaluation of three software tools for simulation. They evaluate the simulation language SLAM, the program generator CAPS/ECS, and the data driven simulation package HOCUS. The criteria specified for the evaluation of software features are classified into four categories. The first group relates to model characterization and programming, the second group relates to model development features, third group relates to experimental and reporting features and the fourth group relates to commercial and technical features.

Law and Haider (1989) provided a simulation software survey and evaluation on the basis of information provided by vendors. Both simulation languages and simulators such as FACTOR, MAST, WITNESS, XCELL + and SIMFACTORY II.5 are included in this study. Rather than commenting on the information presented about the software, the authors conclude that there is no simulation package which is completely convenient and appropriate for all manufacturing applications.

Kochhar and Ma (1989) addressed the essential and desirable features of simulation software for its effective use in manufacturing environments, and provided criteria which should be used for the selection of manufacturing simulation software tools. These criteria relate to: modeling assistance; interactivity; graphics; data handling capability; the time scale for model development; the learning curve and the skills required for the use of the software; ease of model editing; portability; simulation speed; and interfacing the simulation package with external systems.

Holder (1990) proposed a structured approach to the selection of simulation software. This approach suggests that software selection should commence with a consideration of the available resources within the organization, and a determination of the simulation objectives. Subsequently, the essential features of the software are to be determined in order to eliminate software products that would certainly not be suitable. This should result in a short-list of products that are to be evaluated using the evaluation table provided. This table comprises evaluation features categorized in six groups: technical features, user needs (system development), user needs (end user), future development, functionality and commercial features. No weighting of the proposed criteria is established. These criteria are to be used to determine whether the products have the features required, and on the basis of this, a recommendation as to which software seems to be most suitable is to be derived.

Banks (1991) made a classification of simulation modeling tools and discussed a collection of features of simulation software. He provided guidance for selecting a simulation modeling tool and described a technique to reduce the vast number of simulation modeling tools to a manageable few. The selection of simulation software depends on the problem to be solved as much as on the characteristics of the various modeling tools.

Banks et al. (1991) evaluated SIMFACTORY II.5, XCELL +, WITNESS and ProModelPC by modeling two manufacturing systems. The criteria for the evaluation are classified in five groups. The first group relates to basic features such as routes, schedules, capacities, downtimes or transporters. The robust features (within the second group) include programming, conditional routing, part attributes, global variables and an interface to other software. The main results of the evaluation revealed that SIMFACTORY II.5, WITNESS and ProModelPC are similar in their basic features, whilst XCELL + does not model downtimes and requires the user to construct transporters and conveyors from available elements. Those simulators that were found to be similar differed in their operational procedures. Whilst in SIMFACTORY II.5 and ProModelPC the complete route is specified directly on the screen, in WITNESS the user builds the route one step at a time when specifying other characteristics. SIMFACTORY II.5 and XCELL + do not have robust features, whilst WITNESS and ProModelPC have most or all of them. Such conclusions were obtained on the basis of twenty-two criteria.

Law and Kelton (1991) evaluated AutoMod II, ProModel, SIMFACTORY II.5, WITNESS and XCELL +. The main strength of AutoMod II is considered to be its three-dimensional animation capability and a comprehensive set of material-handling modules. On the other hand, this package has very limited statistical capabilities. ProModel is regarded as one of the most flexible simulators currently available, due to its programming-like constructs and its ability to call C or Pascal routines to model complex decision logic. The greatest advantages of SIMFACTORY II.5 are its ease of use and good statistical capabilities, and the main shortcoming is its inadequate modeling flexibility for certain manufacturing applications. WITNESS is regarded as a very flexible manufacturing simulator due to its programming-like input/output rules and actions, but its main shortcoming is the lack of an easy mechanism for making multiple replications of a simulation. XCELL + is easy to learn and use, with menus being employed to place and connect predefined graphical representations of system components, but its statistical facilities are poor.

Pidd (1992) identified general principles for selecting discrete simulation software by dividing these principles into three main groups. The first one is focused on computer programming, covering the field of logical machines, machine code, assembly languages, compilers and interpreters. The second group of principles analyses different simulation executive approaches, model logic, distribution sampling, random number generation and report generation. The last group of principles examines a range of factors which should be considered when appraising DES software, such as: the type of application, the expectation for end-use, knowledge, computing policy and user support.

Davis and Williams (1993) illustrated the evaluation and selection of simulation software using the analytic hierarchy process method. They evaluate five simulation software systems using this method in order to recommend suitable simulation software for a U.K. company. The chosen criteria include: cost, comprehensiveness of the system, integration with other systems, documentation, training, ease of use, hardware and installation, and confidence related issues. An illustration of the main phases of software evaluation and comparison using the analytic hierarchy process method is provided. In the first stage, the criteria are ranked according to their relative importance when selecting a simulation package. Several other steps follow, finally producing an overall ranking for each package being evaluated. It is emphasized that it is not possible to derive absolute measures of how well any package performs against a given criterion. Only its relative performance compared to the other packages can be obtained.
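The ranking step of the analytic hierarchy process can be illustrated with a small sketch. The pairwise comparison matrix below is hypothetical, not the data used by Davis and Williams, and the priority weights are approximated by the row geometric mean method, a common stand-in for the principal eigenvector of the judgement matrix.

```python
from math import prod

def ahp_priorities(matrix):
    """Approximate AHP priority weights via the row geometric mean method."""
    n = len(matrix)
    # Geometric mean of each row of the pairwise comparison matrix.
    geo_means = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo_means)
    # Normalize so the weights sum to 1.
    return [g / total for g in geo_means]

# Hypothetical pairwise judgements for three criteria (cost, ease of use,
# vendor support); matrix[i][j] is the judged importance of i relative to j,
# with reciprocal entries below the diagonal.
criteria = [
    [1.0,     3.0,     5.0],
    [1.0 / 3, 1.0,     2.0],
    [1.0 / 5, 1.0 / 2, 1.0],
]
weights = ahp_priorities(criteria)
print([round(w, 3) for w in weights])  # weights sum to 1; cost dominates here
```

This also makes the paper's closing observation concrete: the weights are derived purely from relative judgements, so only the relative standing of packages against each other can be obtained, never an absolute measure of performance.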

Bradley et al. (1995) defined seven categories to evaluate business process simulation tools:

1. Tool capabilities, including a rough indication of modeling, simulation and analysis capabilities.
2. Tool hardware and software, including, e.g., the type of platform, languages, external links and system performance.
3. Tool documentation, covering the availability of several guides, online help and information about the learning curve of the tool.
4. User features: amongst others, user friendliness, level of expertise required, and existence of a graphical user interface.
5. Modeling capabilities, such as identification of different roles, model integrity analysis, model flexibility and level of detail.
6. Simulation capabilities, summarizing the nature of simulation (discrete vs. continuous), handling of time and cost aspects, and statistical distributions.
7. Output analysis capabilities, such as output analysis and BPR expertise.

Hlupic and Paul (1995) provided an evaluation of several manufacturing simulators. During the evaluation not every single criteria within each group was examined, because the aim was to generally perceive basic features of each simulator. Specific features are probably going to change and be added to with new releases of the simulators under consideration. A comparison of the evaluated simulators is provided. The general quality of each group of criteria was ranked for each simulator. This revealed that although all simulators belong to the same type of simulation software, there is a variety of differences between them. In addition, none of the simulators satisfies all criteria, and none is equally good for all purposes. Although some simulators are more comprehensive and flexible than others, a simulator that can suit any manufacturing problem does not exist. At the same time those simulators that are more robust and adaptable are usually more expensive and difficult to learn and use properly. The fact that the selection of a piece of simulation software is a matter of compromise between many factors is substantiated by this research. One of the most important factors that determined which software is more suitable than others is its intended purpose. Other factors to consider are financial constraints and subjective factors such as individual preferences and previous experience in using simulation software.

Hlupic and Mann (1995) developed a software tool (SimSelect) that selects simulation software given the required features.

Banks and Gibson (1997) suggested considerations to be made when purchasing simulation software, such as accuracy and detail, powerful capabilities, speed, a demo solution of the problem, the opinions of companies with similar applications, and attendance at user group meetings.

According to Oakshot (1997), a range of features desired from a simulation tool are modeling flexibility, ease of use, animation, general simulation functions (e.g. warm-up period, multiple runs), statistical functions, interface with other software, product help and support, price and expandability.

Kalnins et al. (1998) presented the comparison of main Business Process Reengineering (BPR) tools from the point of view of modeling languages supported by them. One of the tools considered was the GRADE tool developed by IMCS LU. The proposed comparison criteria are language support for the selected basic modeling activities common to most BPR methodologies. The main emphasis was on the semantic properties of modeling languages.

Nikoukaran et al. (1998) presented a comprehensive list of criteria structured in a hierarchical framework for evaluating and selecting simulation software. Issues related to criteria for simulation software evaluation and selections are categorized into seven main groups and several sub-groups. The hierarchy can be used for obtaining a better view of the features of simulation software and as a guide to test and analyze simulation modeling packages. With the help of a suitable evaluation technique, such as the Analytic Hierarchy Process (AHP), the hierarchy could be used to evaluate simulation software. The software, the vendor and the user are the important elements which form the elements of the highest level of the hierarchy. Considering the process of modeling a problem using a simulation package, they defined model and input, execution, animation, testing and efficiency and output.

Hlupic and Paul (1999) presented criteria for the evaluation of simulation packages in the manufacturing domain, together with their levels of importance for the particular purpose of use. They suggested general guidelines for software selection, pointing out that it is unrealistic to expect a particular package to satisfy all criteria. However, it is indicated which criteria are more important than others, according to the purpose of software use. These guidelines can be used both by users who are looking for a suitable simulator to buy, and by developers of such simulators who wish to improve existing versions, or who wish to try to develop a new and better manufacturing simulator.

Hommes and Reijswoud (2000) developed a framework for the evaluation and selection of business process modeling tools. They propose eight evaluation criteria, which can be divided into two groups: one related to conceptual modeling in general and another related to business process modeling in particular. These refer to the quality of the way of modeling and the way of working of a modeling tool, respectively. The criteria are:

- Expressiveness: the degree to which a given modeling tool is capable of denoting the models of any number and kinds of application domains;
- Arbitrariness: the degree of freedom one has when modeling one and the same domain;
- Suitability: the degree to which a given modeling technique is specifically tailored for a specific kind of application domain;
- Comprehensibility: the ease with which the way of working and way of modeling are understood by the participants;
- Coherence: the degree to which the individual sub-models of a way of modeling constitute a whole;
- Completeness: the degree to which all necessary concepts of the application domain are represented in the way of modeling;
- Efficiency: the degree to which the modeling process utilizes resources such as time and people;
- Effectiveness: the degree to which the modeling process achieves its goal.

Perera and Liyanage (2001) pointed out that a number of factors, such as inefficient data collection, lengthy model documentation and poorly planned experimentation, prevent frequent deployment of simulation models.

Tewoldeberhan et al. (2002) proposed a two-phase evaluation and selection methodology for simulation software selection. Phase one quickly reduces the long-list to a short-list of packages. Phase two matches the requirements of the company with the features of the simulation package in detail. Different methods are used for a detailed evaluation of each package. Simulation software vendors participate in both phases.

Becker et al. (2003), in order to manage the increasing complexity of business processes, formulated six main quality criteria for business process models:

- Correctness: the model needs to be syntactically and semantically correct.
- Relevance: the model should not contain irrelevant details.
- Economic efficiency: the model should serve a particular purpose that outweighs the cost of modeling.
- Clarity: the model should be (intuitively) understandable by the reader.
- Comparability: models should be based on the same modeling conventions, within and between models.
- Systematic design: the model should have well-defined interfaces to other types of models such as organizational charts and data models.

Melao and Pidd (2003) conducted a survey of practitioners asking how and why BPS is used in practice. The survey revealed that users want tools that are not only easy to use, but also flexible enough to tackle different application areas and complex human behavior. It also revealed a low usage of simulation in the design, modification and improvement of business processes, and confirmed that BPS projects are typically short, relatively non-technical, and rely on good project management for their success. Most BPS users employ general-purpose simulation software rather than purpose-designed business process simulators. There is no evidence of a skills gap, rather a feeling that there is no net gain from employing simulation methods when simpler methods will suffice. The results of the survey have implications for four groups of people: practitioners, educators, researchers and software developers.

Seila et al. (2003) presented a framework for choosing simulation software for discrete event simulation. Evaluating about 20 software tools, the proposed framework first tries to identify the project objective, since a common understanding of the objective will help frame discussions with internal company resources as well as vendors and service providers. It is also prudent to define long-term expectations. Other important questions deal with model dissemination across the organization for others to use, model builders and model users, the types of process (assembly lines, counter operations, material handling) on which the models will focus, the range of systems represented by the models, and so on.

Popovic et al. (2005) developed criteria that can help experts in the flexible selection of business process management tools. They classified the simulation tool selection criteria into seven categories: model development, simulation, animation, integration with other tools, analysis of results, optimization, and testing and efficiency. The importance (weight) of an individual criterion is influenced by the goal of the simulation project and its members (i.e., simulation model developers and model users).
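The weighted-criteria idea running through several of these studies (Bovone et al., Holder, Popovic et al.) reduces to a simple computation: score each tool against each criterion, weight the scores by their project-specific importance, and rank by the weighted sum. The category names, weights and scores below are hypothetical, chosen only to illustrate the mechanics.

```python
# Seven hypothetical criterion categories, loosely echoing Popovic et al.
CATEGORIES = ["model development", "simulation", "animation",
              "integration", "result analysis", "optimization", "efficiency"]

# Project-specific importance weights (sum to 1), set by the project goal.
weights = [0.25, 0.20, 0.05, 0.15, 0.15, 0.10, 0.10]

# Expert scores per tool on a 1-5 scale, one score per category.
tools = {
    "Tool A": [4, 5, 2, 3, 4, 2, 4],
    "Tool B": [3, 4, 5, 4, 3, 4, 3],
}

def weighted_score(scores):
    # Weighted sum of criterion scores.
    return sum(w * s for w, s in zip(weights, scores))

ranking = sorted(tools, key=lambda t: weighted_score(tools[t]), reverse=True)
for tool in ranking:
    print(f"{tool}: {weighted_score(tools[tool]):.2f}")
```

Changing the weight vector to reflect a different simulation goal can reverse the ranking, which is precisely the point made above: the importance of individual criteria depends on the project and its members.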

Pidd and Carvalho (2006) suggested that a typical simulation package should provide the following:

- Modeling tools: a graphical modeling environment, built-in simulation objects with defined properties and behavior, sampling routines, property sheets and visual controls;
- Tools to execute the simulation: a simulation executive to run a model, animated graphics, virtual reality representation and user interaction with the simulation as it runs;
- Tools to support experimentation: tools to define run lengths and parameters, analysis tools to enable optimization, results interpretation and presentation;
- Links to other software: links to spreadsheets, databases and ERP systems.

Vullers and Netjes (2006) discussed a number of simulation tools relevant to the Business Process Management (BPM) field and evaluated their applicability for Business Process Simulation (BPS). They evaluated the tools on their modeling capabilities, simulation capabilities and possibilities for output analysis.

IV. Summary and Conclusions

In this paper, the results of a search of the literature concerning Business Process Simulation (BPS) and its tools have been discussed. It is not beneficial for a user to spend too much time and money on the evaluation of packages. It seems reasonable for this part of the job to be done by a group of experts who keep a record of the results and update it as appropriate. Academic simulation departments would seem suitable for this job. The selection process also needs knowledge of the application area and of the user's needs. This can be provided by a joint group consisting of the academic simulation department that is active in the evaluation process and the user. What we need is a selection methodology and an evaluation technique flexible enough to be used for different application areas without too many changes, and capable of keeping an easily updateable record of evaluation results. Research in this direction is ongoing.
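The "easily updateable record of evaluation results" called for above could be as simple as a keyed store in which re-evaluation overwrites stale scores. The following sketch is an assumed structure, not a design from any of the surveyed papers; the tool and criterion names are hypothetical.

```python
# Minimal sketch of an updateable record of evaluation results:
# each entry stores a score and the date it was evaluated, keyed by
# (tool, criterion), so re-evaluating after a package update simply
# overwrites the stale entry.

from datetime import date

record = {}  # (tool, criterion) -> (score, date evaluated)

def update_score(tool, criterion, score, when=None):
    """Insert or overwrite an evaluation result."""
    record[(tool, criterion)] = (score, when or date.today())

def latest_score(tool, criterion):
    """Return the most recent score, or None if never evaluated."""
    entry = record.get((tool, criterion))
    return entry[0] if entry else None

update_score("ToolX", "flexibility", 6, date(2005, 1, 1))
update_score("ToolX", "flexibility", 8, date(2007, 6, 1))  # package updated
print(latest_score("ToolX", "flexibility"))  # 8
```

Keeping the evaluation date alongside each score addresses the problem noted in the introduction, that the updating of packages invalidates previous comparisons.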

The need for a new tool, or for replacing an existing one, does not arise unless we are facing a problem. Therefore the first question, before asking "which tool?", is "what is the problem?". Obviously, when the problem is known to us, we will go for a tool which has, or which we at least expect to have, the capabilities to solve our problem. The job of selection therefore needs expertise both in the tools and in the particular application area. Further knowledge that can be of great help in this matter is knowing which capabilities of the tool are applicable and most effective in solving our problem. We conclude that not only would simulation software selection benefit from research on the subject, but important areas for future development would also be identified. We finish this paper with the following saying: "The choice of software, therefore, is primarily a matter of taste" (Seila, 1995). We believe that the choice of software is a matter of convenience.

References

Aalst, W. M. P. V., & Hee, K. M. V. (1995). Framework for Business Process Redesign. In J. R. Callahan (Ed.), Proceedings of the Fourth Workshop on Enabling Technologies: Infrastructure for Collaborative Enterprises (pp. 36-47). Berkeley Springs: IEEE Computer Society Press.

Aguilar, M., Rautert, T., & Pater, A. J. G. (1999). Business Process Simulation: A Fundamental Step Supporting Process Centered Management. In P. A. Farrington, H. B. Nembhard, D. T. Sturrock, & G. W. Evans (Eds.), Proceedings of the 1999 Winter Simulation Conference (pp. 1383-1392). Phoenix: IEEE Computer Society Press.

Banks, J., Aviles, E., McLaughlin, J.R., & Yuan, R.C. (1991). The Simulator: New Member of the Simulation Family. Interfaces, 21(2): 76-86.

Banks, J. (1991). Selecting Simulation Software. Proceedings of the 1991 Winter Simulation Conference, Arizona, USA, 15-20.

Banks, J., & Gibson, R. R. (1997). Selecting Simulation Software. IIE Solutions, 29(5): 30-32.

Becker, J., Kugeler, M., & Rosemann, M. (2003). Process Management - A Guide for the Design of Business Processes. Berlin: Springer-Verlag.

Bhaskar, R., Lee, H. S., Levas, A., Petrakian, R., Tsai, F., & Tulskie, B. (1994). Analyzing and Reengineering Business Processes using Simulation. In M. S. Manivannan, & J. D. Tew (Eds.), Proceedings of the 26th Winter Simulation Conference (pp. 1206-1213). San Diego: Society for Computer Simulation International.

Bing, Y., & David, T. W. (1997). Software Tools Supporting Business Process Analysis and Modeling. Journal of Business Process Management, 3(2): 133-150.

Bovone, M., Ferrari, D. V. and Manuelli, R. (1989). How to Choose an Useful Simulation Software. In D. M. Smith, J. Stephenson, & R. N. Zobel (Eds.), Proceedings of the 1989 European Simulation Multiconference (pp. 39-43). SCS, San Diego: Society for Computer simulation International.

Bradley, P., Browne, J., Jackson, S., & Jagdev, H. (1995). Business Process Reengineering (BPR) - A Study of the Software Tools Currently Available. Computers in Industry, 25(3): 309-330.

Carson, J. (1990). Simulation Concepts in Manufacturing and Material Handling. Autofact'90 Conference Proceedings, Detroit, MI, 4.1-4.19.

Cellier, F. E. (1983). Simulation Software: Today and Tomorrow. In J. Burger, & Y. Jarny (Eds.), Proceedings of the IMACS International Symposium (pp. 3-19), Amsterdam: Elsevier Science Publishers.

Davis, L., & Williams, G. (1993). Evaluating and Selecting Simulation Software using the Analytic Hierarchy Process. Integrated Manufacturing Systems, 5(1): 23-32.

Deaver, R. A. (1987). Selecting a Manufacturing Simulation System. CIM Review, 3(3): 6-8.

Ekere, N. N., & Hannam, R. G. (1989). An Evaluation of Approaches to Modeling and Simulating Manufacturing Systems. International Journal of Production Research, 27(4): 599-611.

Grant, J. W., & Weiner, S. A. (1986). Factors to Consider in Choosing a Graphical Animated Simulation System. Industrial Engineer, 31: 65-68.

Greasley, A. (2000). Effective uses of Business Process Simulation. In J. A. Joines (Ed.), Proceedings of the 2000 Winter Simulation Conference (pp. 2004-2009). New York: Association for Computing Machinery.

Haider, S. W., & Banks, J. (1986). Simulation Software Products for Analyzing Manufacturing System. Industrial Engineer, 12: 98–103.

Hlupic, V., & Paul, R.J. (1993). Simulation Software in Manufacturing Environments: A Users' Survey. Journal of Computing and Information Technology, 1(3): 205-212.

Hlupic, V., & Paul, R. J. (1995). A Critical Evaluation of Four Manufacturing Simulators. International Journal of Production Research, 33(10): 2757 – 2766.

Hlupic, V., & Mann, A. S. (1995). SimSelect: A System for Simulation Software Selection. In C. Alexopoulos, K. Kang, D. Goldsman, & W. Lilegdon (Eds.), Proceedings of the 1995 Winter Simulation Conference (pp. 720-727). Piscataway, N.J.: Institute of Electrical and Electronics Engineers.

Hlupic, V., & Paul, R. J. (1999). Guidelines for Selection of Manufacturing Simulation Software. IIE Transactions, 31(1): 21-29.

Holder, K. (1990). Selecting Simulation Software. OR Insight, 3(4): 19-24.

Hommes, B., & Reijswoud, V. (2000). Assessing the Quality of Business Process Modeling Techniques. In R. H. Spraque (Ed.), Proceedings of the 33rd Hawaii International Conference on System Sciences (pp. 1-10). Washington, D.C.: IEEE Computer Society.

Kalnins, A., Kalnina, D., & Kalis, A. (1998). Comparison of Tools and Languages for Business Process Reengineering. In D. H. Withers, & R. N. Zobel (Eds.), Proceedings of the Third International Baltic Workshop on Databases and Information Systems (pp. 24-38). Washington, D.C.: IEEE Computer Society.

Kettinger, W. J., Teng, J. T. C., & Guha, S. (1997). Business Process Change: A Study of Methodologies, Techniques, and Tools. MIS Quarterly, 21(1): 55-80.

Kochhar, A. K., & Ma, X. (1989). Discrete Event Simulation Software Tools for the Simulation of Advanced Manufacturing Systems. In G. Lazeolla, A. Lehmann, & H. J. D. Herik (Eds.), Proceedings of the 1989 European Simulation Conference (pp. 13–18). San Diego: Society for Computer Simulation International.

Law, A. M., & Haider, S. W. (1989). Selecting Simulation Software for Manufacturing Applications: Practical Guidelines and Software Survey. Journal of Industrial Engineering, 21(5): 33-46.

Law, A. M., & Kelton, W. D. (1991). Simulation Modeling and Analysis. Singapore: McGraw-Hill.

Melao, N., & Pidd, M. (2003). Use of Business Process Simulation: A Survey of Practitioners. Journal of the Operational Research Society, 54(1): 2-10.

Nikoukaran, J., Hlupic, V., & Paul, R. J. (1998). Criteria for Simulation Software Evaluation. In D. J. Medeiros, E. F. Watson, J. S. Carson, & M. S. Manivannan (Eds.), Proceedings of the 30th Conference on Winter Simulation (pp. 399-406). Washington, D.C.: IEEE Computer Society.

Oakshott, L. (1997). Business Modeling and Simulation. London: Pitman Publishing.

Perera, T., & Liyanage, K. (2001). IDEF based Methodology for Rapid Data Collection. Integrated Manufacturing Systems, 12(3): 187-194.

Pidd, M. (1992). Computer Simulation in Management Science. New York: John Wiley & Sons.

Pidd, M., & Carvalho, A. (2006). Simulation Software: Not the same Yesterday, Today or Forever. Journal of Simulation, 1(1): 7-20.

Popovic, A., Jaklic, J., & Vuksic, V. B. (2005). Business Process Change and Simulation Modeling. System Integration Journal, 13(2): 29-37.

Seila, A. F., Ceric, V., & Tadikamalla, P. (2003). Applied Simulation Modeling. Australia: Thomson Learning.

Seila, A. F. (1995). Introduction to Simulation. Proceedings of the 1995 Winter Simulation Conference, Virginia, USA, 7-15.

Stemberger, M. I., Popovic, A., & Vuksic, V. B. (2003). Simulation and Information Systems Modeling: A Framework for Business Process Change. In A. Verbraeck, & V. Hlupic (Eds.), Proceedings of 15th European Simulation Symposium and Exhibition (pp. 75-81). North Holland: Elsevier Science Publishers.

Suri, R. D., & Tomsicek, M. (1990). Modeling Manufacturing Systems Using ManuPlan and SimStarter-A Tutorial. In O. Balci, R. P. Sadowski, & R. E. Nance (Eds.), Proceedings of the 1990 Winter Simulation Conference (pp. 168-176). New Orleans: IEEE.

Szymankiewicz, J., McDonald, J., & Turner, K. (1988). Solving Business Problems by Simulation. London: McGraw Hill.

Tewoldeberhan, T. W., Verbraeck, A., Valentin, E., & Bardonnet, G. (2002). An Evaluation and Selection Methodology for Discrete-Event Simulation Software. In E. Yücesan, J. L. Snowdon, J. M. Charnes, & J. Wayne (Eds.), Proceedings of the 2002 Winter Simulation Conference (pp. 67-75). Boston: Kluwer Academic Publishers.

Tocher, K. D. (1965). Review of Simulation Languages. Operational Research Quarterly, 16(2): 189-217.

Tumay, K. (1995). Business Process Simulation. In D. Goldsman, C. Alexopoulos, & K. Kang (Eds.), Proceedings of 1995 Winter Simulation Conference (pp. 55-60). Piscataway, N.J.: Institute of Electrical and Electronics Engineers.

Vreede, G. J. D., Verbraeck, A., & Eijck, D. T. T. V. (2003). Integrating the Conceptualization and Simulation of Business Processes: A Modeling Method and an Arena Template. Simulation, 79(1): 43–55.

Vullers, M. H. J., & Netjes, M. (2006). Business Process Simulation - A Tool Survey. In K. Jensen (Ed.), Proceedings of Seventh Workshop and Tutorial on the Practical Use of Coloured Petri Nets and the CPN Tools (pp. 77-96). Denmark: University of Aarhus.

Wright, D. T., & Burns, N. D. (1996). Guide to using the WWW to Survey BPR Research, Practitioners and Tools. IEE Engineering Management Journal, 6(5): 211-216.

Authors: 1. Ashu Gupta, Sr. Lecturer, Apeejay Institute of Management, Jalandhar, Punjab, India. e-mail: guptashu1@rediff.com.

2. Dr. Kawaljeet Singh, Director, University Computer Centre, Punjabi University, Patiala, Punjab, India. e-mail: singhkawaljeet@rediff.com.

3. Dr. Rajesh Verma, Assistant Professor, Lovely Professional University, Phagwara, Punjab, India. e-mail: rajesh.verma@rediff.com.