Use Case—The Amazon EC2 Resource Pool

The Amazon EC2 Resource Pool in Aneka enables integration with Amazon EC2, a popular cloud resource provider. Because EC2 exposes a web-service-based interface, a simple web service client has been developed in Aneka to interact with it. Interacting with EC2 requires the following parameters (a minimal sketch of how they fit together follows the list):

User Identity: This is the account information used to authenticate with Amazon EC2. It consists of an access key and a secret key, both obtained from the Amazon Web Services portal after signing in. These keys are required for any operation that involves web service access.

Resource Identity: The resource identity is the identifier of a public or private Amazon Machine Image (AMI) that serves as a template for creating virtual machine instances.

Resource Capacity: This specifies the type of instance that will be deployed by EC2. Instance types vary based on the number of cores, amount of memory, and other performance-related settings. Common instance types include small, medium, and large, each with a predefined capacity and associated cost.
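The sketch below is illustrative only: Aneka's EC2 client is a .NET web service client, so this Python snippet (using the boto3 library) simply shows where the three parameters above appear in a provisioning call. The key values, AMI identifier, and instance type are placeholders, not values from the original text.

```python
# Illustrative sketch: the three EC2 parameters used by the pool.
import boto3

# User identity: access key and secret key from the AWS portal (placeholders).
ec2 = boto3.client(
    "ec2",
    region_name="us-east-1",
    aws_access_key_id="AKIA...EXAMPLE",
    aws_secret_access_key="SECRET...EXAMPLE",
)

# Resource identity (an AMI id) and resource capacity (an instance type).
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI identifier
    InstanceType="t2.small",           # placeholder instance type / capacity
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print("Provisioned", instance_id)
```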

The EC2ResourcePool in Aneka uses the web service client and the configuration information described above to forward requests from the pool manager to EC2. It also stores metadata about each active virtual instance for later use.
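As a hypothetical sketch (not Aneka's actual classes), the pool can be pictured as an object that forwards provisioning and release requests to EC2 and keeps a small metadata record per active instance. The class and field names here are assumptions for illustration.

```python
# Hypothetical sketch of a pool that forwards requests to EC2 and
# records metadata for each active virtual instance.
from datetime import datetime, timezone

class EC2ResourcePoolSketch:
    def __init__(self, ec2_client, image_id, instance_type):
        self.ec2 = ec2_client
        self.image_id = image_id
        self.instance_type = instance_type
        self.active = {}  # instance id -> metadata kept for later use

    def provision(self):
        # Forward the pool manager's request to EC2.
        resp = self.ec2.run_instances(
            ImageId=self.image_id,
            InstanceType=self.instance_type,
            MinCount=1,
            MaxCount=1,
        )
        instance = resp["Instances"][0]
        # Store metadata about the new active instance.
        self.active[instance["InstanceId"]] = {
            "type": self.instance_type,
            "launched_at": datetime.now(timezone.utc),
        }
        return instance["InstanceId"]

    def release(self, instance_id):
        self.ec2.terminate_instances(InstanceIds=[instance_id])
        self.active.pop(instance_id, None)
```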

To maximize the utilization of EC2 instances and minimize costs, the EC2ResourcePool implements a cost-effective optimization strategy. Amazon bills virtual machine instances in one-hour time blocks: if an instance is used for only 30 minutes, the customer is still charged for the full hour. To serve applications whose tasks run at a finer time granularity, the pool maintains a local cache. Released instances whose current time block has not yet expired are kept in the cache and reused, rather than provisioning new instances from Amazon.
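A minimal sketch of this caching strategy is shown below, assuming a one-hour billing block and a small safety margin before the block rolls over; the class name, margin, and callback parameters are illustrative assumptions, not details from the original text.

```python
# Hypothetical sketch of the billing-block cache: released instances whose
# paid hour has not expired are parked and reused before new ones are bought.
import time

BLOCK_SECONDS = 3600  # Amazon bills in one-hour time blocks


class InstanceCacheSketch:
    def __init__(self):
        self.idle = {}  # instance id -> launch timestamp (seconds)

    def release(self, instance_id, launched_at, terminate):
        # Seconds remaining in the current paid block for this instance.
        elapsed = time.time() - launched_at
        remaining = BLOCK_SECONDS - (elapsed % BLOCK_SECONDS)
        if remaining > 60:
            # Paid-for time is left: keep the instance for reuse.
            self.idle[instance_id] = launched_at
        else:
            # Block is nearly over: actually terminate the instance.
            terminate(instance_id)

    def acquire(self, provision):
        # Prefer a cached instance; only provision a new one if none is idle.
        if self.idle:
            instance_id, _ = self.idle.popitem()
            return instance_id
        return provision()
```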

By employing this cost-effective optimization strategy, the EC2ResourcePool in Aneka can minimize provisioning costs from the Amazon cloud while achieving high utilization of each provisioned resource.
