Grid computing, a concept that has been stirring excitement and skepticism in equal measure, promises to revolutionize the way we approach complex computational tasks. By harnessing the collective power of distributed computing resources, grid computing offers a vision of a virtual supercomputer that transcends geographical boundaries. But is this technology merely a buzzword, or does it hold tangible benefits for the future of computing?
Grid computing is an innovative approach that connects a multitude of computing resources across local or wide area networks, presenting them as a singular, formidable virtual computing system to users and applications. This concept is not new; it has been articulated by industry giants like IBM, Sun Microsystems, and the Globus Alliance, each highlighting the collaborative and utility-like nature of grid computing.
Oracle likens grid computing to a utility service, emphasizing the user's indifference to the physical location of data or the specific computers processing their requests. Clients can request and receive information or computation on-demand, similar to how one would use electricity or water without concern for the source or infrastructure behind the scenes.
The technology sector has been abuzz with the potential of grid computing, with many software companies positioning themselves to be at the forefront of this perceived "next big thing." However, amidst the excitement, there is a lack of consensus on the direction and definition of grid computing, leading to more debate than development.
To assess the practicality of grid computing, it's crucial to examine the types of applications businesses will run on future computing platforms. The balance between network usage, processing time, and disk storage required for a task is a key determinant of grid computing's viability.
For instance, the SETI@Home project, which has consumed over 1.6 million years of CPU time donated by volunteers worldwide, shows grid computing at its most cost-effective. Because the search for extraterrestrial intelligence breaks down into small, independent data packets, the workload could be spread across vast numbers of computers at very little cost. Such a favorable ratio of CPU cost to network cost, however, is not typical of business applications.
Business calculations often depend on access to vast amounts of proprietary data, and the cost of moving that data across networks can outweigh any savings from distributed processing. Moreover, simulations whose components are tightly interconnected cannot be partitioned as neatly as the SETI calculations.
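To make the trade-off concrete, here is a back-of-the-envelope sketch in Python. The prices are placeholder assumptions, not real tariffs, and the two example workloads are caricatures of a SETI-style job and a data-heavy business query; the point is only that the ratio of compute to network traffic decides the question.

```python
# Back-of-the-envelope test of whether a job is worth shipping to a grid.
# Both prices below are illustrative assumptions, not quotes from any provider.

CPU_COST_PER_HOUR = 0.10   # assumed cost of one CPU-hour, in dollars
WAN_COST_PER_GB = 1.00     # assumed cost of moving 1 GB across the WAN

def worth_distributing(cpu_hours: float, data_gb: float) -> bool:
    """A job is a grid candidate only if the compute it consumes costs
    more than the network traffic needed to ship its data around."""
    compute_cost = cpu_hours * CPU_COST_PER_HOUR
    network_cost = data_gb * WAN_COST_PER_GB
    return compute_cost > network_cost

# SETI@Home-style work unit: hours of number-crunching on a tiny download.
print(worth_distributing(cpu_hours=10, data_gb=0.0003))   # True

# Typical business query: seconds of CPU over gigabytes of proprietary data.
print(worth_distributing(cpu_hours=0.01, data_gb=50))     # False
```

On these assumed prices, the SETI-style job earns its network traffic many times over, while the business query spends fifty dollars of bandwidth to save a fraction of a cent of computing.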
Companies like Oracle and Sun Microsystems have different approaches to integrating grid computing into their offerings. Oracle's 10g database, with "g" denoting grid, is essentially a rebranding of existing clustering technology, leveraging the grid computing buzzword for marketing purposes.
Sun, conversely, advocates for intra-company grids, which are more feasible due to the lower network transmission costs within a company compared to the internet. Nonetheless, only a select few applications stand to benefit from such a setup.
Despite these limitations, certain industries could see substantial advantages from grid computing. Small architecture firms and 3D animation studios, for instance, could leverage grid computing to process complex calculations and renderings that require significant computational power but involve relatively small data inputs and outputs.
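As a rough illustration of why rendering suits a grid, the sketch below fans small, self-contained frame jobs out to parallel workers. This is only a stand-in: multiprocessing plays the role of nodes on an intra-company grid, and the arithmetic loop plays the role of a real renderer.

```python
# Minimal sketch of a render-farm style workload: each frame is described
# by a few bytes of parameters (small input), burns real CPU time, and
# returns a small result. Local processes stand in for grid nodes.
from multiprocessing import Pool

def render_frame(frame_number: int) -> tuple:
    # Stand-in for an expensive per-frame computation.
    acc = 0.0
    for i in range(1, 200_000):
        acc += (frame_number % 7 + 1) / i
    return frame_number, acc

if __name__ == "__main__":
    with Pool() as pool:                              # one worker per core
        results = pool.map(render_frame, range(48))   # 48 independent frames
    print(f"rendered {len(results)} frames")
```

The pattern matters more than the code: each frame needs only its own parameters, so the network carries kilobytes while the nodes do hours of work, exactly the ratio the SETI example showed to be economical.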
While grid computing may not yet be the panacea for all computational challenges, it holds immense promise for scientific endeavors such as cancer research, human genome mapping, and new material development. These high-end projects can greatly benefit from the vast pool of computing resources that grid computing offers.
In the commercial realm, however, grid computing's impact will likely be confined to larger enterprises until network connectivity becomes as affordable and ubiquitous as processing power. With Moore's Law still in effect, doubling processor speeds roughly every 18 months, computing power is getting cheaper faster than bandwidth is, so that day seems distant.
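The arithmetic behind that pessimism is simple compounding. In the sketch below, the 18-month doubling for processors comes from the paragraph above; the three-year doubling for bandwidth per dollar is an assumed figure, chosen only to illustrate how quickly the imbalance grows.

```python
# Illustrative only: compute-per-dollar doubling every 1.5 years versus an
# assumed bandwidth-per-dollar doubling every 3 years. The second figure is
# a placeholder, not a measured trend.
def growth(doubling_years: float, years: float) -> float:
    return 2 ** (years / doubling_years)

years = 6
cpu_gain = growth(1.5, years)   # 2**4 = 16x more compute per dollar
net_gain = growth(3.0, years)   # 2**2 =  4x more bandwidth per dollar
print(f"after {years} years the compute/network imbalance grows "
      f"{cpu_gain / net_gain:.0f}x")
```

Under these assumptions, six years is enough for the gap between cheap processing and expensive networking to widen fourfold rather than close.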
In summary, the dream of on-demand processing and storage, also known as utility computing, remains a work in progress, both technically and economically.
For further insights into grid computing and its implications, FWOSS (Fire Without Smoke Software Ltd) welcomes inquiries and discussions on the topic. Jim Gray's 2003 paper, "Distributed Computing Economics", provides a foundational treatment of the economics behind distributed computing systems.
Please note that this article is copyrighted by Fire Without Smoke Software Ltd (2003) and can be reproduced with proper attribution. For more information, visit FWOSS.