Research Article
Modeling Parallel Execution Policies of Web Services
@inproceedings{10.1007/978-3-319-38904-2_25,
  author    = {Mai Trang and Yohei Murakami and Toru Ishida},
  title     = {Modeling Parallel Execution Policies of Web Services},
  booktitle = {Cloud Computing. 6th International Conference, CloudComp 2015, Daejeon, South Korea, October 28-29, 2015, Proceedings},
  series    = {CLOUDCOMP},
  publisher = {Springer},
  year      = {2016},
  month     = {5},
  keywords  = {Parallel execution; Service policy; Performance analysis},
  doi       = {10.1007/978-3-319-38904-2_25}
}
Mai Trang
Yohei Murakami
Toru Ishida
Year: 2016
Modeling Parallel Execution Policies of Web Services
CLOUDCOMP
Springer
DOI: 10.1007/978-3-319-38904-2_25
Abstract
Cloud computing and high performance computing enable service providers to support parallel execution of provided services. Consider a client who invokes a web service to process a large dataset. The input data is split into independent partitions, and multiple partitions are sent to the service concurrently. A typical customer would expect the service speedup to be directly proportional to the number of concurrent requests (the degree of parallelism, or DOP). However, we observed that the achieved speedup is not always directly proportional to the DOP. This may be because service providers impose parallel execution policies on their services based on their own decisions. The goal of this paper is to analyse the performance improvement behavior of web services under parallel execution. We introduce a model of parallel execution policies of web services comprising three policies: the Slow-down, Restriction, and Penalty policies. We conduct analyses to evaluate our model. Interestingly, the results show that our model captures the parallel execution behavior of web services with good accuracy.
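The sublinear speedup described in the abstract can be pictured with a toy analytical model. The sketch below is purely illustrative and does not reproduce the paper's actual formulation: the linear slow-down factor, the `unit_cost` and `slowdown_rate` parameters, and the function names are all assumptions made for this example.

```python
def modeled_time(n_items, dop, unit_cost=0.001, slowdown_rate=0.5):
    """Modeled completion time under a hypothetical Slow-down policy.

    The input of n_items is split evenly across `dop` concurrent requests,
    but the provider inflates the per-item processing cost by a factor
    that grows with the DOP (an assumed, illustrative policy shape).
    """
    per_item = unit_cost * (1.0 + slowdown_rate * (dop - 1))
    return (n_items / dop) * per_item

def speedup(n_items, dop, **kwargs):
    """Speedup relative to a single sequential request (DOP = 1)."""
    return modeled_time(n_items, 1, **kwargs) / modeled_time(n_items, dop, **kwargs)

if __name__ == "__main__":
    # With slowdown_rate = 0.5, DOP = 4 yields a speedup of 4 / 2.5 = 1.6,
    # well below the ideal linear speedup of 4.
    print(speedup(1000, 4))
```

Under this toy policy the speedup is `dop / (1 + slowdown_rate * (dop - 1))`, which saturates as the DOP grows, illustrating why a client cannot assume speedup proportional to the number of concurrent requests.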