What is Green Computing?
Technology & Science / June 25, 2018

The term ‘Green Computing’ refers to the study and practices that cover the computing life cycle from cradle to grave: from design, through manufacturing and use of equipment, to the safe disposal of computers, peripherals, and networking and communications equipment, efficiently and effectively, with negligible or no impact on the environment. Today’s computing vision is utility based: consumers pay the provider only for what they access and when they access it, so they need not invest heavily or build complex, costly infrastructure. This model of computing is cloud computing.

Basic Overview of Green Computing

Green computing is receiving more and more attention with rising energy costs and growing environmental concerns. Green computing is a recent movement in the IT industry towards designing, building, and operating computer systems to be energy efficient. It means using computers and related resources in an environmentally sustainable, efficient, and effective way. Green computing, also called green technology, aims to diminish the use of hazardous materials, maximize energy efficiency over the lifetime of the product, and encourage the recyclability or biodegradability of obsolete products and factory waste. Green computing, or green IT, refers to environmentally sustainable computing or IT. Green…
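To make the energy-efficiency point concrete, here is a minimal sketch of how one might estimate the annual energy drawn by a desktop PC that stays idle versus one that sleeps. All wattages and usage hours below are assumed example figures, not measurements.

```python
def annual_kwh(active_watts, active_hours_per_day, idle_watts):
    """Energy drawn per year given daily active hours; the rest of each day is idle."""
    idle_hours_per_day = 24 - active_hours_per_day
    daily_wh = (active_watts * active_hours_per_day
                + idle_watts * idle_hours_per_day)
    return daily_wh * 365 / 1000  # Wh -> kWh

# Example machine: 120 W active for 8 h/day.
always_on = annual_kwh(120, 8, 60)  # idles at 60 W the rest of the day
sleeping = annual_kwh(120, 8, 3)    # sleeps at ~3 W instead

print(f"always on: {always_on:.0f} kWh/yr, sleeping: {sleeping:.0f} kWh/yr")
```

Under these example figures, simply letting the machine sleep when idle roughly halves its annual energy use, which is the kind of lifetime-efficiency gain green computing targets.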

What is Cluster Computing
Technology & Science / February 24, 2018

Very often, applications need more computing power than a sequential computer can provide. One way of overcoming this limitation is to improve the operating speed of processors and other components so that they can offer the power required by computationally intensive applications. Cluster computing is an increasingly popular high-performance computing solution. Cluster computing is not a new area of computing. It is, however, evident that there is growing interest in its use in all areas where applications have traditionally run on parallel or distributed computing platforms. The mounting interest has been fueled in part by the availability of powerful microprocessors and high-speed networks as off-the-shelf commodity components, and in part by the rapidly maturing software components available to support high-performance and high-availability applications.

Overview of Cluster Computing

Cluster computing is best characterized as the integration of a number of off-the-shelf commodity computers and resources, integrated through hardware, networks, and software, to behave as a single computer. Initially, the terms cluster computing and high-performance computing were viewed as one and the same. However, the technologies available today have redefined the term cluster computing to extend beyond parallel computing to incorporate load-balancing clusters (for example, web…
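The load-balancing clusters mentioned above can be sketched with a minimal round-robin dispatcher: incoming requests are rotated across a pool of commodity nodes so that, from the outside, the pool behaves like a single server. The node names and request strings here are hypothetical placeholders.

```python
from itertools import cycle

class RoundRobinBalancer:
    """Spreads requests evenly across a fixed pool of cluster nodes."""

    def __init__(self, nodes):
        self._nodes = cycle(nodes)  # endless rotation over the pool

    def dispatch(self, request):
        """Pick the next node in rotation and pair it with the request."""
        node = next(self._nodes)
        return node, request

balancer = RoundRobinBalancer(["node-a", "node-b", "node-c"])
for i in range(5):
    node, req = balancer.dispatch(f"request-{i}")
    print(node, req)
```

Real cluster balancers add health checks and weighting, but round-robin is the simplest instance of the idea: no single node is a bottleneck, and a failed node can be dropped from the pool without clients noticing.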

Cloud Computing Introduction
/ October 12, 2017

The Internet has been a driving force behind the various technologies that have been developed. Arguably, one of the most discussed among all of these is cloud computing. Over the last few years, the cloud computing paradigm has witnessed an enormous shift towards adoption and has become a trend in the information technology space, as it promises significant cost reductions and new business potential to its users and providers. This section provides a basic overview of cloud computing and of the different techniques used for data sharing over the cloud.

Cloud Computing: Introduction

Cloud computing is a computing paradigm in which a large pool of systems is connected in private or public networks to provide dynamically scalable infrastructure for application, data, and file storage. With the advent of this technology, the cost of computation, application hosting, and content storage and delivery is reduced significantly. Cloud computing is a practical approach to realizing direct cost benefits, and it has the potential to transform a data center from a capital-intensive setup to a variable-priced environment. The idea of cloud computing is based on a very fundamental principle: the reusability of IT capabilities. The difference that cloud computing brings compared to…
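The pay-per-use, variable-priced model described above can be illustrated with a toy metered-billing calculation: the monthly cost is a sum over the resources actually consumed rather than a fixed up-front investment. The rates below are made-up example prices, not any provider's actual tariff.

```python
def usage_cost(cpu_hours, gb_stored, gb_transferred,
               cpu_rate=0.05, storage_rate=0.02, transfer_rate=0.09):
    """Monthly bill as a sum of metered resources (example rates in $)."""
    return (cpu_hours * cpu_rate          # compute time consumed
            + gb_stored * storage_rate    # storage held for the month
            + gb_transferred * transfer_rate)  # data moved out

# A small workload: 200 CPU-hours, 50 GB stored, 10 GB transferred.
print(f"monthly bill: ${usage_cost(200, 50, 10):.2f}")
```

Because every term scales linearly with consumption, halving the workload roughly halves the bill, which is exactly the utility-style economics that distinguishes cloud computing from owning a fixed-capacity data center.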
