On-Demand Service in Cloud Computing

Release Date: 2010-12-20    Authors: Xiong Jinhua, Hu Songlin, Liu Hui

This work was funded by the National Basic Research Program of China ("973" Program) under Grant No. 2007CB310805, and the National High Technology Research and Development Program of China ("863" Program) under Grant No. 2007AA12Z309.

[Abstract] Cloud computing provides a new paradigm for hardware and software infrastructure design as well as the planning and use of information systems. It offers flexible, efficient, inexpensive, and high-quality services. This paper proposes an on-demand service system based on the cloud computing architecture and analyzes key issues such as the organization, management, and monitoring of distributed service resources; context-aware on-demand service modeling; on-demand automated service composition in large-scale networks; and service system analysis based on complex system theory. The Continuously Operating Reference Station (CORS) network of a geo-spatial information system is taken as an example: its architecture is analyzed from the perspective of cloud computing, and some fundamental questions about its services are raised.

[Keywords] cloud computing; service composition; situation modeling; on-demand service

 

    The cloud computing model aims to change the way traditional computing systems are used and owned. Computing and communication resources are organized and aggregated through networking, and users are provided with scalable computing resources through virtualization. Planning, purchasing, owning, and using a computing system therefore becomes more flexible for users[1-3]. From the users' perspective, the services received, rather than the underlying computing resources, are the main concern. Providing and using services can thus be regarded as a key issue in cloud computing.


    Cloud services are provided through a universal interface by managing, dispatching, and integrating resources distributed throughout the network. Applications can process information at terabyte or even petabyte scale in a very short time, providing supercomputer-level processing capability. Services are used on demand and charged accordingly, so that processing, storage, and software become a public utility. Two layers exist in cloud computing: the infrastructure (hardware, platform, and software) and the cloud applications (information services built on that infrastructure).


    Fulfilling service demands is critically important. Besides infrastructure and platform services such as Amazon EC2, Google's App Engine, and Windows Azure, distributed data storage and processing platforms such as the open-source Hadoop[4] provide horizontally scalable services for storing and processing mass data. Developers are deploying cloud services and applications in ever increasing numbers, and it is predictable that more and more services will become available on the Internet. Service issues encompass not only user demands but also the functions and performance offered by providers. In the eyes of consumers, cloud services should satisfy personalized demands; domain knowledge from the cloud and user state information can be exploited to provide context awareness that significantly improves user experience. For those who maintain the cloud platform, on the other hand, consideration must be given to how efficiently demand is satisfied: maintaining an optimal user experience, accommodating as many users as possible, and building energy-saving platforms must all be carefully weighed.


    Geo-spatial information systems are an important part of the information infrastructure. They integrate and apply information such as location, time, and geographic features of the Earth's surface obtained from space. Technologies used in geo-spatial systems include the Global Positioning System (GPS), Geographic Information Systems (GIS), and Remote Sensing (RS). A Continuously Operating Reference Station (CORS) network is a major part of a geo-spatial information system; it provides a positioning reference and timing, on which various GIS and RS application systems can be established. A CORS network typically comprises reference stations with fixed coordinates that collect satellite signals, a data center that performs the computation and provides positioning services to users, and user terminals connected over a wide area network for real-time data transmission. From the perspective of cloud computing, CORS already has an established resource management and dispatching platform and can provide services such as spatial positioning and timing to users. Herein, service issues in the cloud computing architecture are discussed, along with the application of cloud computing to CORS, the CORS architecture, and related key technologies.


1 Application Architecture of On-Demand Service in Cloud Computing 
    The application architecture of on-demand cloud service is shown in Figure 1. It consists of four layers:

 


    (1) Infrastructure Layer
    Support for multiple cloud centers covers not only controllable internal clouds but also access to third-party cloud resources governed by service level agreements. The cloud platform pools these resources to provide unified cloud services for upper-layer system modules and applications. The cloud infrastructure supports numerous resource types, including hardware, structured and unstructured data storage, and basic software.

 

 
    (2) Service Resource Management and Monitoring Layer
    Resources provided by the cloud infrastructure layer can be registered into the service resource management system, as can other service-oriented resources (including resources from third parties), so that all service resources are used in a unified way. In this layer, resource management attributes in the large-scale distributed environment are dynamically monitored and controlled, and resource management is provided for upper-layer resource scheduling and on-demand services. Depending on the monitoring and management demands, specific resource models, management attributes, and monitoring policies can be defined to keep resource management and monitoring scalable.
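
    As a rough illustration of what "resource models, management attributes, and monitoring policies" might look like in code, the following Python sketch defines a resource with pluggable management attributes and a polling policy with per-attribute thresholds. All names, attributes, and threshold values are assumptions made for illustration; they do not describe any specific platform.

import time
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class ResourceModel:
    """A registered service resource and its configurable management attributes."""
    resource_id: str
    resource_type: str                                  # e.g. "vm", "storage", "service"
    attributes: Dict[str, Callable[[], float]] = field(default_factory=dict)

    def sample(self) -> Dict[str, float]:
        """Read the current value of every managed attribute."""
        return {name: probe() for name, probe in self.attributes.items()}

@dataclass
class MonitoringPolicy:
    """How often a resource is polled and when an alert should be raised."""
    interval_s: float
    thresholds: Dict[str, float]                        # attribute name -> upper bound

    def violations(self, sample: Dict[str, float]) -> Dict[str, float]:
        return {k: v for k, v in sample.items()
                if k in self.thresholds and v > self.thresholds[k]}

def monitor(resource: ResourceModel, policy: MonitoringPolicy, rounds: int = 3) -> None:
    """Poll the resource under the given policy and report threshold violations."""
    for _ in range(rounds):
        for attr, value in policy.violations(resource.sample()).items():
            print(f"[alert] {resource.resource_id}: {attr}={value:.2f}")
        time.sleep(policy.interval_s)

if __name__ == "__main__":
    vm = ResourceModel("vm-01", "vm",
                       attributes={"cpu_load": lambda: 0.93, "mem_used": lambda: 0.40})
    monitor(vm, MonitoringPolicy(interval_s=0.1, thresholds={"cpu_load": 0.85}))

    Different resource types can register different attribute probes and be assigned different policies, which is the kind of configurability the layer description above calls for.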


    (3) Programming Framework and Supporting Engine Layer 
    This layer supports service programming on top of the basic platform and service resources, and provides engine support for service operation. The application of cloud computing in numerous industries is leading to a rapid increase in cloud-based Application Programming Interfaces (APIs) and services. An application demand usually draws on multiple service resources, so a suitable environment for composing different service resources and the support of an execution engine are required. In this layer, predefined service templates, visual manual service composition, and automatic on-demand service composition for dynamic, large-scale environments are implemented, as is the efficient and reliable execution of composed applications under large-scale concurrency.


    (4) Personalized On-Demand Service Layer
    This layer determines how personalized, on-demand cloud services are provided to users. Key issues include:

  • How to support users in describing their demands accurately so that resources can be accurately identified, recommended, and matched according to these demands;
  • How to achieve context awareness of information such as the state of available resources and user scenarios, so that services adapt as this information changes (see the sketch after this list).
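
    As a concrete illustration of these two points, the sketch below (in Python, with purely hypothetical names such as Demand, ServiceOffer, and bandwidth_mbps) matches a structured user demand against candidate services and adjusts the choice with a simple piece of context information. It is a minimal sketch of the idea, not an implementation of the layer described here.

from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class Demand:
    capability: str                  # what the user asks for, e.g. "positioning"
    preferences: Dict[str, float]    # soft constraints, e.g. {"max_latency_ms": 200}

@dataclass
class ServiceOffer:
    name: str
    capability: str
    qos: Dict[str, float]            # advertised QoS, e.g. {"latency_ms": 120}

def match(demand: Demand, offers: List[ServiceOffer],
          context: Dict[str, float]) -> Optional[ServiceOffer]:
    """Pick the offer with the requested capability and the best context-adjusted QoS."""
    candidates = [o for o in offers if o.capability == demand.capability]
    if not candidates:
        return None

    def score(offer: ServiceOffer) -> float:
        # Lower latency is better; a weak network in the user's context penalizes it further.
        bandwidth = max(context.get("bandwidth_mbps", 1.0), 1.0)
        return -offer.qos.get("latency_ms", float("inf")) / bandwidth

    best = max(candidates, key=score)
    limit = demand.preferences.get("max_latency_ms")
    if limit is not None and best.qos.get("latency_ms", float("inf")) > limit:
        return None
    return best

offers = [ServiceOffer("pos-a", "positioning", {"latency_ms": 120}),
          ServiceOffer("pos-b", "positioning", {"latency_ms": 60})]
print(match(Demand("positioning", {"max_latency_ms": 200}), offers,
            {"bandwidth_mbps": 4.0}))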

 

 

2 Key Technologies for Implementing On-Demand Services in Cloud Computing

2.1 Distributed Management and Resource Status Monitoring Technologies
    Cloud computing virtualizes not only hardware, storage, and network resources but also various software resources to provide Software as a Service (SaaS). Internet resources are registered and managed on a unified platform, and high-quality SaaS requires effective, dynamic monitoring, management, and control of these resources. In a large-scale distributed environment, important considerations for dynamically monitoring and managing heterogeneous resources include:

  • How to address different management and monitoring demands in order to dynamically configure different resource types and their management attributes;
  • How to support easily-scalable monitoring architecture;
  • How best to monitor resources and predict their status by establishing efficient and flexible monitoring policies while incurring as little resource overhead as possible;
  • How to monitor resource state efficiently and automatically using smart management technology, and how to diagnose faults so that basic support is provided for high-quality services.

    Reference [5] proposes a cloud computing architecture that allocates resources according to a market mechanism and Service Level Agreements (SLAs). A fundamental requirement of SLA-based resource allocation is effective monitoring of the resources of multiple providers. Reference [6] describes C3 Service Management, a large-scale distributed heterogeneous resource management and monitoring platform through which resource registration, management, and automatic monitoring can be realized.

 

 

2.2 Modeling Technology of On-Demand Services with Context Awareness 
    When resources become abundant on the cloud platform, services with context awareness play a key role in optimizing user experience.


    (1) User Demand Modeling Based on Domain Demand 
    Domain demand models are the basis for modeling users' personalized demands. Semantic interoperability technology uses domain specifications and demand models to create an interoperable semantic base for correctly describing and matching user demands; it also provides underlying support for user-centered demand modeling and automatic service composition. Reference [7] proposes a method for discovering conflicts among multiple inter-organizational service processes, together with a solution based on independent amending areas for resolving the conflicts. This method can be used to assist and check user demand modeling for a service process.
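
    A minimal sketch of the idea, assuming a toy domain vocabulary: a user demand is checked against the concepts and attributes that the domain demand model allows, which is the kind of semantic grounding the paragraph above refers to. All concept and attribute names are illustrative assumptions.

# A toy "domain demand model": a vocabulary of domain concepts and the
# attributes a well-formed user demand may use (names are illustrative).
DOMAIN_CONCEPTS = {
    "positioning": {"accuracy_m", "latency_ms", "region"},
    "timing":      {"accuracy_ns"},
}

def validate_demand(concept: str, demand: dict) -> list:
    """Check a user demand against the domain model; return the problems found."""
    problems = []
    if concept not in DOMAIN_CONCEPTS:
        return [f"unknown domain concept: {concept}"]
    unknown = set(demand) - DOMAIN_CONCEPTS[concept]
    problems += [f"attribute not in domain model: {a}" for a in sorted(unknown)]
    return problems

print(validate_demand("positioning", {"accuracy_m": 0.05, "colour": "red"}))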


    (2) A Demand Model that Supports Uncertainty
    Another challenge for accurate demand modeling is user uncertainty. When a user is unspecific, unsure, or provides incomplete information, accurately inferring the demand is a key concern of demand modeling. Uncertainty also arises in service modeling and evaluation, and must be accounted for when services are matched with demands. Therefore, uncertainty models and reasoning techniques such as probability logic, fuzzy logic, and the cloud model should be used together.
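
    How such uncertainty might be handled can be hinted at with a small fuzzy-logic sketch: the vague demand "I need a fast response" is turned into a membership function and used to rank services. This only illustrates the fuzzy-logic option mentioned above; the thresholds, service names, and response times are assumptions.

def fuzzy_fast(response_ms: float, full: float = 100.0, zero: float = 1000.0) -> float:
    """Membership of 'fast response': 1 below `full` ms, 0 above `zero` ms, linear in between."""
    if response_ms <= full:
        return 1.0
    if response_ms >= zero:
        return 0.0
    return (zero - response_ms) / (zero - full)

def match_uncertain(services: dict, min_membership: float = 0.5) -> list:
    """Rank services by how well they satisfy the vague demand 'I need it fast'."""
    ranked = sorted(services.items(), key=lambda kv: fuzzy_fast(kv[1]), reverse=True)
    return [(name, round(fuzzy_fast(ms), 2)) for name, ms in ranked
            if fuzzy_fast(ms) >= min_membership]

print(match_uncertain({"svc-a": 150.0, "svc-b": 700.0, "svc-c": 90.0}))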


    (3) User Scenario Modeling
    User context awareness plays a key role in providing on-demand services that adapt naturally to changes in state. To address modeling demands in uncertain scenarios, Reference [8] proposes a probabilistic-constrained fuzzy logic together with its reasoning method. In contrast to conventional context-aware computing, cloud applications must handle resource states, user contexts, and the associated reasoning at very large scale, so scenario identification and inference must work in a large-scale environment. Efficiency is the key: identifying and reasoning about a single scenario is not complicated, but highly efficient algorithms are needed to do so at scale.
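
    The following toy sketch shows the flavor of probabilistic scenario identification and why scale, rather than the single-scenario case, is the hard part: inferring one user's scenario is a small posterior computation, while a large user population simply multiplies the number of such computations. The priors, likelihoods, and scenario names are invented for illustration and are not taken from Reference [8].

from collections import Counter
from typing import List

# Illustrative priors P(scenario) and likelihoods P(observation | scenario).
PRIORS = {"driving": 0.3, "walking": 0.5, "stationary": 0.2}
LIKELIHOOD = {
    ("high_speed", "driving"): 0.80, ("high_speed", "walking"): 0.05, ("high_speed", "stationary"): 0.01,
    ("low_speed", "driving"): 0.15, ("low_speed", "walking"): 0.80, ("low_speed", "stationary"): 0.30,
    ("no_motion", "driving"): 0.05, ("no_motion", "walking"): 0.15, ("no_motion", "stationary"): 0.69,
}

def infer_scenario(observation: str) -> str:
    """Pick the scenario with the highest (unnormalized) posterior for one observation."""
    return max(PRIORS, key=lambda s: PRIORS[s] * LIKELIHOOD.get((observation, s), 1e-6))

def infer_many(observations: List[str]) -> Counter:
    """Scenario identification for a large user population is a map over many observations."""
    return Counter(infer_scenario(o) for o in observations)

print(infer_many(["high_speed", "no_motion", "low_speed"] * 1000))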

 

2.3 Automatic Service Composition Technology on Demand 
    Cloud file systems and parallel processing models such as the Google File System (GFS) and MapReduce are becoming more sophisticated, and large-scale database services are improving with the emergence of Bigtable, PNUTS, Dynamo, Cassandra, HBase, and Azure. These open-source or commercial systems lay a solid foundation for the development of cloud-based applications. Using the composition modeling and operation modes of current Service Oriented Architecture (SOA), flexible composition modeling can be implemented and operational support for composed applications can be provided by various workflow engines. However, SOA cannot adapt to mass service management and massive concurrent execution requests in the cloud computing environment. The conventional method of manually selecting interoperable services for composition modeling cannot be applied when developing cloud-based composite applications because of the massive number of service resources in clouds. Likewise, a centralized execution engine can neither process hundreds of thousands of concurrent tasks nor meet large-scale monitoring requirements. In the context of cloud computing, the on-demand construction and operation of service composition applications is therefore the most important issue to be solved.
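
    To make the notion of automatic composition concrete before the detailed discussion below, the following sketch chains services whose inputs are already satisfied, preferring the candidate with the best QoS. It is a deliberately simplified greedy planner under assumed service descriptions (the service names, inputs, outputs, and QoS values are illustrative), not the composition algorithms of References [9-12].

from dataclasses import dataclass
from typing import List, Set

@dataclass
class Service:
    name: str
    inputs: Set[str]
    outputs: Set[str]
    qos: float          # higher is better, e.g. an aggregate of latency and availability

def compose(available: Set[str], goal: Set[str], catalog: List[Service]) -> List[Service]:
    """Greedy forward chaining: repeatedly add the best-QoS invocable service
    until the goal outputs are available (or no progress can be made)."""
    plan: List[Service] = []
    have = set(available)
    while not goal <= have:
        candidates = [s for s in catalog
                      if s.inputs <= have and not s.outputs <= have]
        if not candidates:
            raise RuntimeError("no composition satisfies the goal")
        best = max(candidates, key=lambda s: s.qos)
        plan.append(best)
        have |= best.outputs
    return plan

catalog = [
    Service("raw-obs",    {"station_id"},             {"raw_data"},    qos=0.9),
    Service("solve-pos",  {"raw_data", "approx_pos"}, {"precise_pos"}, qos=0.8),
    Service("solve-pos2", {"raw_data", "approx_pos"}, {"precise_pos"}, qos=0.6),
]
plan = compose({"station_id", "approx_pos"}, {"precise_pos"}, catalog)
print([s.name for s in plan])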


    (1) Automatic On-Demand Construction of Service Composition Applications
    On-demand construction of service composition applications helps developers to:

  • Retrieve and organize resources automatically or semi-automatically according to coarse-grained user requirements and context;
  • Assist users in constructing intricately combined application programs by providing flexible and accessible navigation.

    On-demand construction relies on multimode rapid application modeling technology based on automatic service composition. Amazon and SAP have explored automatic service composition technology in modeling using WS-BPEL and various process modeling tools[9-10]. Further research is needed on the application granularity and usage modes of automatic service composition in cloud computing. This involves researching location- and time-based search combined with navigation, and hybrid modeling combined with on-the-fly or interactive recommendation. To help cloud application developers, algorithms for simplifying application configuration[11-12] should be studied. Moreover, reasonable concurrency patterns and distributed processing methods are required to improve processing ability and efficiency when configuring on-demand applications.


    (2) On-Demand Deployment and Operation of Service Composition Applications
    When considering on-demand deployment and operation of service composition applications, user Service Level Agreements (SLAs) as well as the available computing, storage, and network facilities must be taken into account to optimize task scheduling and make maximum use of distributed resources. This improves the operation of the entire cloud system.
    Key technologies include SLA-aware and context-aware dynamic selection of services and routes. When a service composition application is running, the service copy with the best Quality of Service (QoS) is selected for local and global optimization[13-14], with reference to the SLA between user and service provider or to the user's context. In deploying and operating service composition applications, resource usage and SLA constraints should be taken into account so that execution can shift flexibly between fully distributed and sharded structures; in this way, single-point performance bottlenecks can be avoided[15-16].

 

 

2.4 On-Demand Service Based on Complex Adaptive System
    The cloud computing ecosystem is a typical complex adaptive system, with interactions between different parts of the cloud computing system and between those parts and the networked cloud computing environment. Through self-organization, the cloud computing network achieves an orderly state with a specific space-time structure (a modal or community structure). The system therefore becomes self-organizing, self-adaptive, and capable of self-learning, and it can reproduce and develop in a complicated networked environment. If the cloud computing system or its components cannot keep pace with network development, they will be abandoned by consumers; that is why cloud computing must provide on-demand services. The cloud computing system must adjust its operating mechanisms to suit the needs of the environment and its users. With interactions occurring within the system and between the system and its environment, the cloud computing system as a whole is interconnected, interworked, and interoperable in order to provide on-demand services. Compared with traditional computing systems, the networked cloud computing system has more independent constituent units, looser coupling, and flexible scale, and is capable of on-demand evolution and on-demand service provision. Studying networked cloud computing systems from the perspective of complex adaptive system theory has significant implications for system management and for resource organization and scheduling[17].


3 Application of On-Demand Cloud Computing Framework in Continuously Operating Reference Station

 

3.1 Overview of Continuously Operating Reference Station
    A typical Continuously Operating Reference Station (CORS) network is a real-time communication and service system consisting of reference stations, a data center, a communication network, and users. Each reference station has consistent, precise three-dimensional coordinates. The system transmits real-time observation data back to the data center, which processes the data, manages the devices, and provides users with real-time, fast positioning services. Figure 2 shows the CORS architecture.

 


 
3.2 CORS Applications and Services
    In China, CORS is mostly used in surveying and mapping, city planning, transportation and logistics, and hydropower production. CORS services include space coordinates, spatial positioning, source data services, and time services. Among these, the source data service, space coordinates, and time service are basic CORS services. Spatial positioning, satellite trajectory, meteorological services, and geodynamic parameters are higher-level services requiring integration with other information system services. The satellite trajectory service is an aggregation of source data services; the meteorological service is interdisciplinary, involving source data services and meteorological technology; and atmospheric environment monitoring is a higher-level interdisciplinary service. Table 1 shows the response time and content of these services.

 

 

3.3 CORS Architecture Analysis and On-Demand Cloud Computing Structure
    When treated as a network, a CORS system can be divided into three layers: the physical layer, the data layer, and the application layer, which correspond respectively to physical interconnection, data organization and scheduling, and computing and service. From the perspective of on-demand cloud computing, the data layer of CORS can be further divided into a management and monitoring layer for resources (such as reference stations) and an organization and monitoring layer for reference-station data.


    CORS architecture in cloud computing is shown in Figure 3.

 

 

3.4 Key Factors Affecting the Application of the Cloud Computing Model in CORS
    Two basic problems must be solved when applying the cloud computing model to CORS: 


    (1) Organization and Monitoring of Large-Scale Resources
    Compared with general cloud computing infrastructure, CORS resources are relatively simple: the physical devices are reference stations, data center servers, and communication network resources; the data resources are the raw data from reference stations, coordinates, service data generated by the system, and registered user information. The major problems lie in building a unified platform for cloud computing data and services, developing intelligent resource status management technology, monitoring resources actively and efficiently, and detecting resource failures.


    (2) On-Demand Service Technology for Large-Scale Users
    Compared with common Internet-based information service systems, CORS provides simple services with definite semantics and grammar. It generally starts from a rough user location and carries out precise, iterative positioning. The difficulty lies in building a cloud computing platform compatible with existing systems, creating interfaces for system access and user login, and mining and redirecting services according to the needs of active or inactive users.
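
    The iterative use of a rough location can be sketched as follows. The correction function here is a toy stand-in for corrections that would actually be derived from reference-station observations, so the example only illustrates the control flow of iterative refinement, not any real positioning algorithm.

from typing import Callable, Tuple

Position = Tuple[float, float, float]   # x, y, z in an arbitrary local frame

def refine(rough: Position,
           correction: Callable[[Position], Position],
           tol: float = 1e-3, max_iter: int = 20) -> Position:
    """Iteratively refine a rough position: ask the data center for a correction
    computed around the current estimate, apply it, and stop when it is small."""
    pos = rough
    for _ in range(max_iter):
        dx, dy, dz = correction(pos)
        pos = (pos[0] + dx, pos[1] + dy, pos[2] + dz)
        if max(abs(dx), abs(dy), abs(dz)) < tol:
            break
    return pos

# Toy correction: pull the estimate toward a fixed "true" point (a stand-in for
# corrections derived from reference-station observations).
true_pos = (100.0, 200.0, 50.0)
toy_correction = lambda p: tuple(0.5 * (t - c) for t, c in zip(true_pos, p))
print(refine((90.0, 210.0, 45.0), toy_correction))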


4 Conclusions 
    A cloud computing system includes the underlying infrastructure that provides computing capability, the upper-layer service software, and the users. Services are one of the core problems in cloud computing research and development. This paper focuses on services and proposes an architecture for on-demand services in cloud computing. Modeling of cloud computing services, massive automated service composition, and management and monitoring technology for distributed service resources are discussed. The architecture and technologies described above will enable cloud developers to design the hardware and structure of the different layers; to deploy, manage, and dispatch service resources; and to provide users with a flexible and efficient service system. The paper uses the example of CORS in a geo-spatial information system to analyze service demands and to highlight two basic issues in providing effective services within the cloud computing architecture.

 

Acknowledgement: The authors thank Researcher Liu Zhiyong and Professor Lü Jinhu for their guidance and assistance in the research and writing of this paper.

 

 


References
[1] ARMBRUST M. Above the Clouds: A Berkeley View of Cloud Computing [R]. UCB/EECS-2009-28. Berkeley, CA,USA: Dept of EE&CS, University of California, Berkeley, 2009.
[2] BOSS G, MALLADI P, QUAN D, et al. Cloud Computing [R]. IBM Corporation, 2007.
[3] 云计算白皮书校订稿 [R]. 北京: 中国电子学会云计算专家委员会, 2009.
      Cloud Computing White Paper (Revised Version) [R]. Beijing: Cloud Computing Experts Association of Chinese Institute of Electronics, 2009.
[4] Hadoop [EB/OL]. [2009-07-29]. http://hadoop.apache.org/.
[5] BUYYA R, YEO C S, VENUGOPAL S. Market-Oriented Cloud Computing: Vision, Hype, and Reality for Delivering IT Services as Computing Utilities [C]//Proceedings of the 10th IEEE International Conference on High Performance Computing and Communications (HPCC'08), Sep 25-27, 2008, Dalian, China. Piscataway, NJ, USA: IEEE, 2008: 5-13.
[6] 服务资源管理与监控软件(C3 Service Management) [R]. 北京: 中国科学院计算技术研究所, 2010.
      Service Resource Management and Monitoring Software (C3 Service Management) [R]. Beijing: Institute of Computing Technology, Chinese Academy of Sciences, 2010.
[7] GONG Shuai, XIONG Jinhua, LIU Zhiyong, et al. Correcting Interaction Mismatches for Business Processes [C]//Proceedings of the 8th IEEE International Conference on Web Services (ICWS'10), 7th International Conference on Services Computing (SCC’10), 3rd International Conference on Cloud Computing(CLOUD’10), 6th World Congress on Services(SERVICES’10), Jul 5-10, 2010, Hyatt Regency, MI,USA. Piscataway, NJ,USA: IEEE, 2010.
[8] XIONG Jinhua, FAN Jianping. Probabilistic-Constrained Fuzzy Logic for Situation Modeling [C]//Proceedings of the 2009 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE’09), Aug 20-24, 2009, Jeju Island, Korea. Piscataway, NJ,USA: IEEE, 2009.
[9] MARCONI A, PISTORE M, POCCIANTI P, et al. Automated Web Service Composition at Work: The Amazon/MPS Case Study [C]//Proceedings of the International Conference on Web Services(ICWS '07), Jul 13-19,2007, Salt Lake City, UT, USA. Piscataway, NJ,USA: IEEE, 2007:767-774.
[10] RAO J, DIMITROV D, HOFMANN P, et al. A Mixed Initiative Approach to Semantic Web Service Discovery and Composition: Sap's Guided Procedures Framework [C]//Proceedings of the International Conference on Web Services (ICWS’06), Sep 18-22,2006, Chicago, IL,USA. Piscataway, NJ,USA: IEEE, 2006: 401-410.
[11] BERARDI D, CALVANESE D, GIACOMO G D, et al. Automatic Composition of e-Services that Export Their Behavior [C]//Proceedings of the 1st International Conference on Service Oriented Computing(ICSOC’03), Dec 15-18, 2003, Trento, Italy. Berlin, Germany: Springer-Verlag, 2003: 43-58.
[12] JIANG Wei, ZHANG C, HUANG Zhenqiu, et al. QSynth: A Tool for QoS-Aware Automatic Service Composition [C]//Proceedings of the 8th IEEE International Conference on Web Services (ICWS'10), 7th International Conference on Services Computing (SCC’10), 3rd International Conference on Cloud Computing(CLOUD’10), 6th World Congress on Services(SERVICES’10), Jul 5-10,2010, Hyatt Regency, MI,USA. Piscataway, NJ,USA: IEEE, 2010.
[13] ZENG Liangzhao, BENATALLAH B, NGU A H H, et al. QoS-Aware middleware for Web Services Composition [J]. IEEE Transactions on Software Engineering, 2004, 30(5):311-327.
[14] YU T, LIN K J. Service Selection Algorithms for Web Services With End-to-End QoS Constraints [J]. Journal of Information Systems and e-Business Management, 2005,3(2):103-126.
[15] BENATALLAH B, SHENG Q Z, DUMAS M. The Self-Serv Environment for Web Services Composition [J]. IEEE Internet Computing, 2003,7(1): 40-48.
[16] NANDA M, CHANDRA S, SARKAR V. Decentralizing Execution of Composite Web Services [C]//Proceedings of the 19th Annual ACM SIGPLAN Conference on Object-Oriented Programming, Systems, Languages, and Applications (OOPSLA'04), Oct 24-28, 2004, Vancouver, Canada. New York, NY, USA: ACM, 2004: 170-187.
[17] 李德毅. 需求工程——对复杂系统的软件工程的基础研究 [J]. 中国基础科学, 2009,11(2).
        LI Deyi. Research Progress on Requirement Engineering: Basic Research on Software Engineering for Complex Systems [J]. China Basic Science, 2009, 11(2).
[18] 刘晖. 地球空间信息网格及其在连续运行参考站网络中的应用研究 [D]. 武汉: 武汉大学, 2005.
        LIU Hui. Research on Geo-Spatial Information Grid and Application in Continuously Operating Reference Stations [D]. Wuhan: Wuhan University, 2005.

 

