Introduction of Automatic Speech Recognition

Speech is a versatile means of communication. It conveys linguistic, speaker, and environmental information. Even though such information is encoded in a complex form, humans can decode most of it with relative ease. Among all speech tasks, automatic speech recognition (ASR) has been the focus of many researchers for several decades. In this task, the linguistic message is the information of interest. Speech recognition applications range from dictating text to generating subtitles in real time for a television broadcast. Despite the human ability, researchers have learned that extracting information from speech is not a straightforward process.

Definition of Automatic Speech Recognition

Speech recognition has over the years become a practical concept, and it is now being implemented in different languages around the world. Speech recognition has been used in real-world human language applications, such as information retrieval. Speech is the most common means of human communication because it plays the basic role in conversation. In speech recognition, conversation or speech captured by a microphone or a telephone is converted from an acoustic signal into a set of words. It can be defined: “Automatic speech recognition (ASR) can be defined as the independent, computer‐driven transcription of spoken language…
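To make the acoustic-signal-to-words idea concrete, here is a deliberately tiny sketch of the decoding step: each word is represented by a hypothetical feature vector (a stand-in for real acoustic features such as MFCCs), and an utterance is mapped to the nearest stored template. The templates and feature values are invented for illustration; real ASR systems use statistical or neural models, not nearest-template matching over three numbers.

```python
import math

# Toy nearest-template "recognizer". TEMPLATES maps each known word to a
# hypothetical acoustic feature vector; an input utterance is assigned to
# the word whose template lies closest in Euclidean distance.
TEMPLATES = {
    "yes": [0.9, 0.1, 0.4],
    "no":  [0.2, 0.8, 0.5],
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(features):
    """Return the word whose template is nearest to the input features."""
    return min(TEMPLATES, key=lambda w: distance(TEMPLATES[w], features))

print(recognize([0.85, 0.15, 0.45]))  # closest to the "yes" template
```

The hard part of real ASR, glossed over here, is turning a raw waveform into features that are stable across speakers and environments.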

Data Preprocessing in Data Mining

Data analysis is now integral to our working lives. It is the basis for investigations in many fields of knowledge, from science to engineering and from management to process control. Data on a particular topic are acquired in the form of symbolic and numeric attributes, and analysis of these data gives a better understanding of the phenomenon of interest. When development of a knowledge-based system is planned, data analysis involves the discovery and generation of new knowledge for building a reliable and comprehensive knowledge base.

What is Data Preprocessing

Exploratory data analysis and predictive analytics can be used to extract hidden patterns from data and are becoming increasingly important tools for transforming data into information. Real-world data is generally incomplete and noisy, and is likely to contain irrelevant and redundant information or errors. Data preprocessing, an important step in the data mining process, describes any type of processing performed on raw data to prepare it for another processing procedure. It transforms the data into a format that will be more easily and effectively processed for the purpose of the…
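The incompleteness, redundancy, and inconsistent scaling mentioned above can be sketched with pandas; the column names and values below are hypothetical, and mean imputation plus min-max scaling are just two of many possible preprocessing choices.

```python
import pandas as pd

# Hypothetical raw data exhibiting typical problems: missing values,
# a duplicate record, and attributes on very different scales.
raw = pd.DataFrame({
    "age":    [25, None, 47, 47, 31],
    "income": [48000, 52000, None, None, 61000],
})
raw = pd.concat([raw, raw.iloc[[0]]], ignore_index=True)  # inject a duplicate row

clean = raw.drop_duplicates()                        # remove redundant records
clean = clean.fillna(clean.mean(numeric_only=True))  # impute missing values with the mean

# Min-max normalisation so both attributes share a comparable [0, 1] scale.
normalised = (clean - clean.min()) / (clean.max() - clean.min())
print(normalised.round(2))
```

After these steps the data has no gaps, no duplicates, and commensurable attribute ranges, which is the "understandable format" the mining step expects.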

An Introduction of Data Visualization

A picture is worth a thousand words – especially when we are trying to understand and discover insights from data. Visuals are especially helpful when we’re trying to find relationships among hundreds or thousands of variables to determine their relative importance – or whether they are important at all. Regardless of how much data we have, one of the best ways to discern important relationships is through advanced analysis and high-performance data visualization. If sophisticated analyses can be performed quickly, even immediately, and results presented in ways that showcase patterns and allow querying and exploration, people across all levels of an organization can make faster, more effective decisions.

Data Visualization: Definition

Data visualizations are surprisingly common in our everyday life, and they often appear in the form of well-known charts and graphs. A combination of multiple visualizations and bits of information is often referred to as an infographic. Data visualizations can be used to discover unknown facts and trends. You may see visualizations in the form of line charts to display change over time. Bar and column charts are useful when observing relationships and making comparisons. Pie charts are a great way to show parts of a whole. And maps are the best way…
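As a minimal, dependency-free illustration of the comparison role that bar charts play, here is a text-based bar chart; the quarterly sales figures are invented, and real work would of course use a plotting library such as matplotlib.

```python
# Render a dict of label -> value as horizontal bars scaled to the
# largest value, so relative magnitudes can be compared at a glance.
sales = {"Q1": 12, "Q2": 17, "Q3": 9, "Q4": 14}  # hypothetical figures

def bar_chart(data, width=40):
    """Return a text bar chart; the largest value fills the full width."""
    peak = max(data.values())
    lines = []
    for label, value in data.items():
        bar = "#" * round(width * value / peak)
        lines.append(f"{label:>3} | {bar} {value}")
    return "\n".join(lines)

print(bar_chart(sales))
```

Even in this crude form, the bars make the Q2 peak and Q3 dip visible far faster than scanning the raw numbers would.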

Introduction of Bioinformatics

Quantitation and quantitative tools are indispensable in modern biology. Most biological research involves the application of some type of mathematical, statistical, or computational tool to help synthesize recorded data and integrate various types of information in the process of answering a particular biological question. Bioinformatics involves the use of computers to collect, organize and use biological information to answer questions in fields like evolutionary biology.

Definition of Bioinformatics

Bioinformatics is an interdisciplinary research area at the interface between computer science and biological science. A variety of definitions exist in the literature and on the World Wide Web; some are more inclusive than others. Bioinformatics involves the technology that uses computers for the storage, retrieval, manipulation, and distribution of information related to biological macromolecules such as DNA, RNA, and proteins. The emphasis here is on the use of computers, because most of the tasks in genomic data analysis are highly repetitive or mathematically complex. The use of computers is absolutely indispensable in mining genomes for information gathering and knowledge building. Bioinformatics is the science of developing computer databases and algorithms for the purpose of speeding up and enhancing biological research. It can be defined more specifically: “Bioinformatics combines the latest technology with biological…
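Two of the repetitive DNA-sequence tasks alluded to above can be sketched in a few lines: computing GC content (the fraction of G and C bases, a basic genomic statistic) and the reverse complement of a strand. The sequence used here is a made-up fragment.

```python
# Watson-Crick base pairing for DNA.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def gc_content(seq):
    """Fraction of bases that are G or C."""
    return (seq.count("G") + seq.count("C")) / len(seq)

def reverse_complement(seq):
    """Complement each base, then reverse, giving the opposite strand 5'->3'."""
    return "".join(COMPLEMENT[b] for b in reversed(seq))

dna = "ATGCGCATTA"  # hypothetical fragment
print(f"GC content: {gc_content(dna):.2f}")
print(f"Reverse complement: {reverse_complement(dna)}")
```

Exactly because such operations must be repeated over millions of bases, they are delegated to computers rather than done by hand.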

What is Knowledge Graph

Knowledge graphs are large networks of entities and their semantic relationships. They are a powerful tool that changes the way we do data integration, search, analytics, and context-sensitive recommendations. Knowledge graphs have been successfully utilized by the large Internet technology companies, with prominent examples such as the Google Knowledge Graph. Open knowledge graphs such as Wikidata make community-created knowledge freely accessible.

Overview of Knowledge Graphs

The World Wide Web is a vast repository of knowledge, with data present in multiple modalities such as text, videos, images, and structured tables. However, most of the data is present in unstructured form, and extracting information in a structured, machine-readable format is still a very difficult task. Knowledge graphs aim at constructing large repositories of structured knowledge that can be understood by machines. Such knowledge graphs are being used to improve the relevance and quality of search in search engines like Google and Bing. Knowledge graphs are also used by applications like Google Now, Microsoft Cortana and Apple Siri, which are capable of understanding natural language queries, answering questions, making recommendations, and so on. The construction of knowledge graphs is thus a major step towards making intelligent personalized machines. Web…
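The "network of entities and semantic relationships" can be modelled minimally as a set of (subject, predicate, object) triples with a pattern query, in the spirit of RDF triple stores; the entities and relations below are illustrative, not taken from any real knowledge graph.

```python
# A toy knowledge graph: facts as (subject, predicate, object) triples.
triples = {
    ("Google", "developed", "Knowledge Graph"),
    ("Knowledge Graph", "improves", "search"),
    ("Wikidata", "instanceOf", "open knowledge graph"),
}

def query(s=None, p=None, o=None):
    """Return all triples matching the pattern; None acts as a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

print(query(s="Google"))    # everything asserted about Google
print(query(p="improves"))  # every fact using the 'improves' relation
```

Question-answering assistants essentially translate a natural-language question into such a pattern query (at vastly larger scale, with richer query languages such as SPARQL).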

What is Fault Tolerance in Distributed System

The use of technology has increased vastly, and today computer systems are interconnected via different communication media. The use of distributed systems in our day-to-day activities has greatly improved with data distribution. This is because distributed systems enable nodes to organize and share their resources among the connected systems or devices, integrating people with geographically distributed computing facilities. However, distributed systems may suffer a lack of service availability due to failures at multiple points in the system.

Definition of Fault Tolerance

In a broad sense, fault tolerance is associated with reliability, with successful operation, and with the absence of breakdowns. A fault-tolerant system should be able to handle faults in individual hardware or software components, power failures, or other kinds of unexpected disasters and still meet its specification. Fault tolerance is the ability of a system to continue correct performance of its intended tasks after the occurrence of hardware and software faults. Fault-tolerant system research covers a wide spectrum of applications, namely embedded real-time systems, commercial transaction systems, transportation systems, military/space systems, and distribution and service systems. A fault-tolerance approach in any system results in improvement as far as…
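One of the simplest fault-tolerance patterns implied by this definition is retry-then-failover: tolerate transient faults by retrying, and persistent faults by switching to a replica. The sketch below simulates a primary service that fails twice before recovering; all names and the failure model are illustrative.

```python
# Minimal retry + failover: try the primary up to `attempts` times,
# then fall back to a replica so the system still meets its specification.
def call_with_retry(operation, fallback, attempts=3):
    for _ in range(attempts):
        try:
            return operation()
        except Exception:
            continue        # transient fault: retry
    return fallback()       # persistent fault: fail over to the replica

failures = {"left": 2}      # simulate two transient failures

def flaky_primary():
    if failures["left"] > 0:
        failures["left"] -= 1
        raise ConnectionError("primary unavailable")
    return "primary result"

def replica():
    return "replica result"

print(call_with_retry(flaky_primary, replica))  # succeeds on the 3rd try
```

Real systems add backoff between retries and health checks before failover, but the structure (mask the fault, then degrade gracefully) is the same.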

Introduction of Medical Image Processing or Medical Imaging

Present-day technological developments in imaging and vision have brought many changes to medical diagnosis, treatment planning and treatment verification procedures. Precision, speed of diagnosis, and non-invasive clinical procedures have improved drastically. In modern medicine, medical imaging has undergone major advancements. Today, this ability to obtain information about the human body has many useful clinical applications. Over the years, different sorts of medical imaging have been developed, each with its own advantages and disadvantages.

Overview of Medical Imaging

Aside from Superman with his x-ray vision, people generally can’t look at a sick person and instantly figure out the problem. Most medical issues occur inside the body, so making a diagnosis can be a challenge. Medical imaging has made that challenge far easier over the last century. Medical imaging is the technique of producing visual representations of areas inside the human body to diagnose medical problems and monitor treatment, and it has had a huge impact on public health. Medical imaging is the visualization of body parts, tissues, or organs for use in clinical diagnosis, treatment and disease monitoring. Imaging techniques encompass the fields of radiology, nuclear medicine and optical imaging and image-guided intervention. Medical imaging refers to several…
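A tiny, concrete example of the processing behind such visualizations is intensity windowing, used in radiology to map a range of raw scanner intensities onto display values so that a tissue range of interest becomes visible. The intensities and window settings below are hypothetical, and a real pipeline would operate on 2D/3D arrays (e.g. with NumPy) rather than a list.

```python
# Windowing: clip raw intensities to [center - width/2, center + width/2]
# and rescale that range linearly to 0..255 display values.
def apply_window(pixels, center, width):
    low, high = center - width / 2, center + width / 2
    out = []
    for p in pixels:
        p = min(max(p, low), high)            # clip to the window
        out.append(round(255 * (p - low) / (high - low)))
    return out

scan_line = [-1000, -200, 40, 80, 400, 3000]  # hypothetical raw intensities
print(apply_window(scan_line, center=40, width=400))
```

Narrowing the window increases contrast within the chosen tissue range at the cost of saturating everything outside it, which is why radiologists switch windows per tissue type.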

What is Deep Learning

Machine-learning technology powers many aspects of modern society, from web searches to content filtering on social networks to recommendations on e-commerce websites, and it is increasingly present in consumer products such as cameras and smartphones. Machine-learning systems are used to identify objects in images, transcribe speech into text, match news items, posts or products with users’ interests, and select relevant search results. Increasingly, these applications make use of a class of techniques called deep learning. Deep learning is a machine learning technique that teaches computers to do what comes naturally to humans: learn by example.

Deep Learning Overview

Deep learning is a subfield of machine learning concerned with algorithms inspired by the structure and function of the brain, called artificial neural networks. In deep learning, a computer model learns to perform classification tasks directly from images, text, or sound. Deep learning models can achieve state-of-the-art accuracy, sometimes exceeding human-level performance. Models are trained using a large set of labeled data and neural network architectures that contain many layers. Deep learning is a subset of machine learning. Usually, when people use the term deep learning, they are referring to deep artificial neural networks, and somewhat less frequently to deep reinforcement learning….
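The value of stacking layers can be shown with the classic XOR function, which no single linear layer can compute but a two-layer network can. The weights below are hand-picked for illustration; in real deep learning they would be learned from labeled data by gradient descent.

```python
# A two-layer ReLU network computing XOR -- the minimal demonstration of
# why depth (composing layers) adds expressive power over a linear model.
def relu(x):
    return max(0.0, x)

def layer(inputs, weights, biases, activation):
    """One dense layer: weighted sum plus bias, then the activation."""
    return [activation(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def xor_net(x1, x2):
    hidden = layer([x1, x2], [[1, 1], [1, 1]], [0, -1], relu)  # h1=relu(x1+x2), h2=relu(x1+x2-1)
    (out,) = layer(hidden, [[1, -2]], [0], relu)               # out=relu(h1-2*h2)
    return out

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", xor_net(a, b))
```

Deep networks are this same layer-composition idea repeated many times, with millions of learned weights instead of six hand-chosen ones.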

Introduction of Cyber Security
Technology & Science, Web Security / December 5, 2017

The term cyber security is often used interchangeably with the term information security. Cyber security is the activity of protecting information and information systems (networks, computers, databases, data centres and applications) with appropriate procedural and technological security measures. Cyber security has become a matter of global interest and importance. It refers to a set of techniques used to protect the integrity of networks, programs and data from attack, damage or unauthorized access.

Cyber Security Definition

Cyber security is the collection of tools, policies, security concepts, security safeguards, guidelines, risk management approaches, actions, training, best practices, assurance and technologies that can be used to protect the cyber environment and the organization's and users' assets. These assets include connected computing devices, personnel, infrastructure, applications, services, telecommunications systems, and the totality of transmitted and/or stored information in the cyber environment. Cyber security strives to ensure the attainment and maintenance of the security properties of the organization's and users' assets against relevant security risks in the cyber environment. More specifically, “Cyber security refers to the body of technologies, processes, and practices designed to protect networks, devices, programs, and data from attack, damage, or unauthorized access. Cyber security may also be referred…
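One concrete safeguard for the integrity property mentioned above is comparing cryptographic digests to detect unauthorized modification of data. The sketch below uses Python's standard `hashlib`; the "config" contents are invented for illustration.

```python
import hashlib

# Integrity check via SHA-256: any change to the data, however small,
# produces a completely different fingerprint.
def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"config: allow_admin=false"
baseline = digest(original)            # recorded when the data was trusted

tampered = b"config: allow_admin=true"
print("intact:  ", digest(original) == baseline)
print("tampered:", digest(tampered) == baseline)
```

In practice the baseline digest must itself be protected (e.g. signed or stored separately), since an attacker who can rewrite both the data and its fingerprint defeats the check.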

Cloud Storage and Data De-Duplication

Rendering efficient storage and security for data is very important for the cloud. With the rapidly increasing amount of data produced worldwide, networked and multi-user storage systems are becoming very popular. However, concerns over data security still prevent many users from migrating data to remote storage. Data deduplication refers to a technique for eliminating redundant data in a data set. In the process of deduplication, extra copies of the same data are deleted, leaving only one copy to be stored. Data is analysed to identify duplicate byte patterns and to ensure the single instance is indeed the single file. Then, duplicates are replaced with a reference that points to the stored chunk.

Data Deduplication

Data deduplication is a technique to reduce storage space by identifying redundant data using hash values to compare data chunks, storing only one copy, and creating logical pointers to the other copies instead of storing the redundant data itself. Deduplication reduces data volume, so disk space and network bandwidth requirements shrink, which lowers the cost and energy consumption of running storage systems.

Figure 1: Data de-duplication view

Data deduplication is a technique whose objective is to improve storage efficiency. With the aim to reduce storage space, in traditional…
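The chunk-hash-and-reference process described above can be sketched as follows; the 4-byte chunk size is absurdly small for illustration (real systems use kilobyte-sized fixed or content-defined chunks), and the input data is invented.

```python
import hashlib

CHUNK_SIZE = 4  # tiny for illustration; real systems use KB-sized chunks

def deduplicate(data: bytes):
    """Split data into chunks; store each unique chunk once, keep references."""
    store = {}   # fingerprint -> chunk (the single stored instance)
    refs = []    # logical pointers that reconstruct the original data
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        fp = hashlib.sha256(chunk).hexdigest()
        store.setdefault(fp, chunk)   # keep only the first copy
        refs.append(fp)               # duplicates become references
    return store, refs

def restore(store, refs):
    """Follow the references to rebuild the original byte stream."""
    return b"".join(store[fp] for fp in refs)

data = b"ABCDABCDABCDEFGH"  # three identical chunks plus one unique chunk
store, refs = deduplicate(data)
print(f"{len(refs)} chunks, {len(store)} stored -> saved "
      f"{(len(refs) - len(store)) * CHUNK_SIZE} bytes")
assert restore(store, refs) == data
```

The space saving is exactly (references minus stored chunks) times the chunk size, which is why deduplication pays off most on highly repetitive data such as backups.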