An Overview of Grid Computing

January 4, 2014

Grid computing has proven to be an important field focused on the sharing of computing resources. Championed by large corporations such as Oracle, Sun Microsystems, and IBM, it has become a significant development in information systems architecture. Grid computing has been hailed as a solution to performance and capacity problems for many applications. If you need a bigger, faster system, grid computing is well worth considering.

What is grid computing?

The general idea behind grid computing is not new; the advantages of computing clusters and parallel processing in scientific research are well documented. The main concept behind grid computing is to make numerous autonomous machines, which may be in different physical locations, act as a single virtual machine.
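As a minimal, single-machine sketch of this idea, the example below uses Python's standard thread pool to stand in for grid nodes: the problem is split into chunks, each simulated "node" computes a partial result, and the partial results are combined as if one virtual machine had done all the work. The function names and chunking scheme are illustrative, not part of any grid standard.

```python
from concurrent.futures import ThreadPoolExecutor

def node_task(chunk):
    """Work assigned to one simulated 'node': sum the squares of its chunk."""
    return sum(x * x for x in chunk)

def run_on_grid(data, num_nodes=4):
    """Split the input among the nodes and combine their partial results."""
    size = max(1, len(data) // num_nodes)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # Each chunk is handed to a worker, standing in for one machine in the grid.
    with ThreadPoolExecutor(max_workers=num_nodes) as pool:
        partials = pool.map(node_task, chunks)
    return sum(partials)

# The pool of workers produces the same answer a single machine would.
print(run_on_grid(list(range(1000))))  # prints 332833500
```

In a real grid, the chunks would be dispatched over the network to separate machines by middleware, but the divide, compute, and combine pattern is the same.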

An electrical grid analogy is often used to describe grid computing. In the electrical power grid, wall sockets link end users to a large infrastructure that not only generates but also distributes and bills for electricity; users draw on these resources with no concern for where or how the electricity is actually generated. Likewise, in grid computing, individual users obtain computing resources (applications, data, storage, processors, etc.) on demand with limited knowledge of where those resources are located.

Grid computing captures the basics of distributed computing and puts an enterprise-friendly face on it. Although the concept is not new, it has not been perfected: programmers, engineers, and computer scientists are still creating, establishing, and implementing protocols and standards. For now, most grid computing systems rely heavily on proprietary tools and software. The management features found in grid software enable computing resources to be linked together in a way that lets an individual use a single machine to access and leverage the collected power of all the machines in the grid.

How does Grid Computing Differ from Cloud Computing?

Although cloud computing and grid computing both make computing resources and data centers available over networks, the two are quite different. In grid computing, end users are given access to shared storage capacity and draw computing power from their own desktops and the shared computers in the grid. A cloud, by contrast, is designed to act as a whole and instead provides leased storage capacity and computing power.

The other major differences include:

Grid computing is ideal for applications that require large amounts of computational power, which is why it is mostly used by research collaborations referred to as 'virtual organizations'. Cloud computing users, on the other hand, tend to be small to medium enterprises with general IT needs.

Location is another differentiating factor: in grid computing, the computing centers are distributed across many locations, whereas in cloud computing the data centers are generally concentrated in a few centralized locations.

Since cloud computing is considered an evolution of grid computing, there are bound to be similarities between the two. The two major ones are:

Multitasking and multitenancy: The term multitasking is used when several processes (tasks) share processing resources, while multitenancy refers to situations where a single instance of software serves several clients (tenants). Both cloud and grid computing allow several users to perform different tasks and to access single or multiple application instances.

Scalability: This refers to a system's ability to handle growing amounts of work or to improve its performance. Both cloud and grid computing are scalable: their storage capacity and throughput can grow or shrink depending on the number of users and the amount of data being transferred at a given time.

The Benefits of Grid Computing

  1. It allows computing resources to be shared across networks, increasing the computational power available to programs and reducing the number of computers an organization needs.
  2. It enables cheaper computers to be linked together instead of spending heavily on a single machine or supercomputer with greater processing capability.
  3. It enables applications to scale easily, since additional computers can be added to the grid.
  4. Because many of the underlying technologies are open source, trust and transparency are encouraged.
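The scaling benefit can be made concrete with a small sketch. The helper below (a hypothetical function, not part of any grid toolkit) shows how a fixed pool of tasks is spread ever more thinly as nodes join the grid, so each machine's share of the work shrinks:

```python
def tasks_per_node(total_tasks, nodes):
    """Distribute tasks evenly; any remainder goes to the first nodes."""
    base, extra = divmod(total_tasks, nodes)
    return [base + (1 if i < extra else 0) for i in range(nodes)]

# Adding nodes to the grid reduces each node's workload.
print(tasks_per_node(100, 4))  # [25, 25, 25, 25]
print(tasks_per_node(100, 6))  # [17, 17, 17, 17, 16, 16]
```

Real grid schedulers also weigh node capacity, availability, and data locality, but even distribution is the simplest form of the idea.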

Drawbacks of Grid Computing

  1. Reliability: Even when the service itself is always available, grids rely heavily on distributed services that are maintained by distributed staff, which can result in inconsistent reliability.
  2. Complexity: Building and using a grid efficiently involves many complexities; at present, users need an advanced level of expertise.

By Team FileCloud