What is Plagiarism Detection
Big Data , Technology & Science , Web Security / February 15, 2018

Plagiarism is the reuse of someone else’s prior ideas, processes, results, or words without explicitly acknowledging the original author and source. In recent years, plagiarism has raised great concern over intellectual property protection. Plagiarists violate intellectual property rights either by copying source or binary code or by stealing and covertly implementing protected algorithms; the first case is also known as software plagiarism. Plagiarism involves reproducing existing information in a modified form, or sometimes the original document as it is. It is quite common among students, researchers and academics, and this has pushed the research community to raise awareness and prevent such misuse. This article explains Plagiarism Detection. Overview of Plagiarism Detection Because of the richness of natural languages, a word may have several possible meanings and senses, which makes detecting plagiarism a hard task, especially when dealing with semantic meaning rather than just searching for patterns of text that are illegally copied from others (text copied and pasted from digital resources without acknowledging the original source). Plagiarism occurs in various forms: submitting another’s work verbatim without proper citation, paraphrasing text, reordering sentences, using synonyms, changing grammar, code plagiarism, and so on. Plagiarism is…
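A minimal sketch of the pattern-matching side of plagiarism detection, assuming only the Python standard library: two documents are compared by the overlap of their word n-grams. A simple lexical measure like this catches copy-and-paste reuse but, as noted above, not paraphrasing or synonym substitution; the 3-gram size and the sample sentences are illustrative choices.

```python
def ngrams(text, n=3):
    """Return the set of word n-grams in a lower-cased text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}


def jaccard_similarity(doc_a, doc_b, n=3):
    """Share of n-grams common to both documents (0.0 to 1.0)."""
    a, b = ngrams(doc_a, n), ngrams(doc_b, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)


original = "Plagiarism is the reuse of someone else's ideas or words without acknowledging the source."
suspect = "Plagiarism is the reuse of someone else's ideas or words without crediting the source."

score = jaccard_similarity(original, suspect)
print(f"similarity = {score:.2f}")  # a high overlap suggests copied passages
```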

Happy Valentine’s Day
Culture , History / February 14, 2018

Valentine’s Day has roots in several different legends that have found their way to us through the ages. One of the earliest popular symbols of the day is Cupid, the Roman god of Love, who is represented by the image of a young cherub with bow and arrow. Valentine’s Day is an annual festival to celebrate romantic love, friendship and admiration. Every year on 14 February people celebrate this day by sending messages of love and affection to partners, family and friends. History of Valentine’s Day Valentine’s Day is named after a Roman martyr named Valentine. Actually, there are two Valentines in the history of Roman martyrs. One was a Christian priest, who lived around 300 AD. He had been thrown in prison for his teachings, and for refusing to worship the Roman gods. He also supposedly cured the jailer’s daughter of her blindness. On February 14, this Valentine was beheaded. As the story goes, the night before he was executed, he wrote the jailer’s daughter a farewell letter, signing it, “From Your Valentine.” February has long been a month of romance. It is the month associated with Valentine’s Day celebrations. We have, time and again, heard the name St….

What is Automatic Number Plate Recognition (ANPR)
Technology & Science / February 13, 2018

With growing urban populations and the transport services that support them, there is an urgent need to improve traffic management and secure transport systems. Automation in transport has been used successfully in signaling systems and has helped manage urban traffic to a great extent. Automatic recognition of vehicle license plate numbers has become very important in daily life because the relentless increase in cars and transportation systems makes it impossible for humans to fully manage and monitor them; examples include traffic monitoring, tracking stolen vehicles, managing parking tolls, red-light violation enforcement, and border and customs checkpoints. Automatic Number Plate Recognition: Overview Automatic Number Plate Recognition (ANPR) is an important technique used in Intelligent Transportation Systems. ANPR is an advanced machine vision technology used to identify vehicles by their number plates without direct human intervention. The development of Intelligent Transportation Systems provides vehicle number data that can be used for follow-up, analysis and monitoring. ANPR is important for traffic problems, highway toll collection, border and customs security, and premises where high security is needed. The complexity of automatic number plate recognition work varies throughout the world. For the standard number…
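A hedged sketch of one common ANPR pipeline (edge detection, contour filtering, then OCR), assuming OpenCV 4.x, pytesseract and a local Tesseract install; "car.jpg" is a hypothetical input image, and real systems typically replace the contour heuristic with trained detectors.

```python
import cv2
import pytesseract

img = cv2.imread("car.jpg")                              # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
gray = cv2.bilateralFilter(gray, 11, 17, 17)             # smooth while keeping edges
edges = cv2.Canny(gray, 30, 200)

# Keep the ten largest contours and look for a roughly rectangular candidate.
contours, _ = cv2.findContours(edges, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
contours = sorted(contours, key=cv2.contourArea, reverse=True)[:10]

for c in contours:
    approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
    if len(approx) == 4:                                 # four corners: plausible plate
        x, y, w, h = cv2.boundingRect(approx)
        plate = gray[y:y + h, x:x + w]
        text = pytesseract.image_to_string(plate, config="--psm 7")
        print("candidate plate:", text.strip())
        break
```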

What is Cognitive Computing
Technology & Science / February 11, 2018

Much of the excitement about cognitive computing is spurred by its enormous potential in learning, only a small fraction of which has so far been realized. The overarching goal here is to devise computational frameworks to help us learn better by exploiting data about our learning processes and activities. There are two important aspects of it—the mechanisms or insights about how we actually learn and the external manifestations of our learning activities. Cognitive computing is an emerging approach that builds upon a wealth of research and development work in Artificial Intelligence (AI). What is Cognitive Computing? Cognitive computing is the simulation of human thought processes in a computerized model. Cognitive computing involves self-learning systems that use data mining, pattern recognition and natural language processing to mimic the way the human brain works. The goal of cognitive computing is to create automated IT systems that are capable of solving problems without requiring human assistance. The origins of cognitive systems work lie in cognitive science — a discipline that brings together researchers from the fields of psychology, linguistics, philosophy, computer science, and more recently, neuro-computing. We perceive “cognitive computing” as an approach that has emerged from, and attempts to subsume, the work…
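As a small, hedged illustration of the "pattern recognition plus natural language processing" ingredients mentioned above, the sketch below trains a toy text classifier with scikit-learn (an assumed dependency); the support-ticket examples and labels are invented, and a real cognitive system would combine many such learned components with reasoning and feedback loops.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny, invented training set: short texts with a topic label.
texts = [
    "reset my password please",
    "how do I change my password",
    "the invoice amount looks wrong",
    "I was charged twice on my bill",
]
labels = ["account", "account", "billing", "billing"]

# Learn word-usage patterns from the examples, then classify unseen text.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(texts, labels)

print(model.predict(["why is my bill so high"]))   # expected: ['billing']
```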

What is Image Compression in Image Processing
Image Processing , Technology & Science / February 10, 2018

Marked progress has been made in the field of image compression and its application in various branches of engineering. Image compression is concerned with removing redundant information from image data. It addresses the storage and transmission problems posed by the huge amounts of data in digital images. Compressing an image is significantly different from compressing raw binary data. Of course, general-purpose compression programs can be used to compress images, but the result is less than optimal. This is because images have certain statistical properties which can be exploited by encoders specifically designed for them. Also, some of the finer details in the image can be sacrificed for the sake of saving a little more bandwidth or storage space. Overview of Image Compression Image compression is an application of data compression in which the original image is encoded with fewer bits. The objective of image compression is to reduce the redundancy of the image and to store or transmit data in an efficient form. To lower the irrelevance and the redundancy of image data is the…
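The trade-off described above (sacrificing fine detail for a smaller file) can be seen with a few lines of Pillow, an assumed dependency; "input.png" is a hypothetical source image, and the quality values are arbitrary illustrative settings.

```python
import os
from PIL import Image

img = Image.open("input.png").convert("RGB")           # hypothetical source image

# Re-encode the same picture as JPEG at two quality levels; lower quality
# discards more fine detail in exchange for a smaller file.
img.save("compressed_q85.jpg", format="JPEG", quality=85)
img.save("compressed_q40.jpg", format="JPEG", quality=40)

for path in ("input.png", "compressed_q85.jpg", "compressed_q40.jpg"):
    print(path, os.path.getsize(path), "bytes")
```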

What is Visual Analytics

We are living in a world that faces a rapidly increasing amount of data to be dealt with on a daily basis. In the last decade, the steady improvement of data storage devices and of the means to create and collect data has influenced our way of dealing with information: most of the time, data is stored without filtering or refinement for later use. Virtually every branch of industry or business, and any political or personal activity, nowadays generates vast amounts of data. Making matters worse, our possibilities to collect and store data are increasing faster than our ability to use them for making decisions. However, in most applications raw data has no value in itself; instead we want to extract the information contained in it. Overview Generally, large-scale organizations have large amounts of data and information to process. They need strong procedures and techniques to collect, analyze, process and visualize the data in order to obtain the required results, make the right decisions, and reach their long-term goals and objectives. Several software packages and tools for big data analytics and visual analytics are being used by companies in order to…
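A brief sketch of the "analyze, then visualize" step, assuming pandas and matplotlib are available; the order and return figures are made-up illustrative data, and real visual analytics tools layer interaction (filtering, drill-down) on top of static charts like this one.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Invented monthly figures standing in for an organization's raw data.
data = pd.DataFrame({
    "month": ["Jan", "Feb", "Mar", "Apr"],
    "orders": [120, 135, 160, 150],
    "returns": [8, 12, 9, 14],
})

data["return_rate"] = data["returns"] / data["orders"]   # derived metric

data.plot(x="month", y="return_rate", kind="bar", legend=False)
plt.ylabel("return rate")
plt.title("Return rate by month")
plt.tight_layout()
plt.savefig("return_rate.png")                           # chart for decision makers
```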

What is Content Delivery Network (CDN)
Networking , Technology & Science / February 8, 2018

With the proliferation of the Internet, popular Web services often suffer congestion and bottlenecks due to the large demands made on them. Such a scenario may cause unmanageable levels of traffic flow, resulting in many requests being lost. Replicating the same content or services over several mirrored Web servers strategically placed at various locations is a method commonly used by service providers to improve performance and scalability. The user is redirected to the nearest server, which helps reduce the network’s impact on the response time of user requests. What is it? Content Delivery Networks (CDNs) provide services that improve network performance by maximizing bandwidth, improving accessibility and maintaining correctness through content replication. They offer fast and reliable applications and services by distributing content to cache or edge servers located close to users. CDNs have evolved to overcome the inherent limitations of the Internet in terms of user-perceived Quality of Service (QoS) when accessing Web content. The typical functionalities of a CDN include: request redirection and content delivery services, to direct a request to the closest suitable CDN cache…
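A simplified sketch of the request-redirection idea: probe a set of candidate edge servers and send the client to the one with the lowest round-trip time. The hostnames are hypothetical, and production CDNs also weigh geography, server load and content availability, not latency alone.

```python
import socket
import time

EDGE_SERVERS = ["edge-eu.example.net", "edge-us.example.net", "edge-ap.example.net"]


def measure_rtt(host, port=80, timeout=1.0):
    """Rough RTT estimate: time taken to open a TCP connection (None on failure)."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return None


def pick_edge(servers):
    """Return the reachable edge server with the smallest measured RTT."""
    timings = {s: measure_rtt(s) for s in servers}
    reachable = {s: t for s, t in timings.items() if t is not None}
    return min(reachable, key=reachable.get) if reachable else None


print("redirecting request to:", pick_edge(EDGE_SERVERS))
```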

What is Ubiquitous (Pervasive) Computing
Technology & Science / February 7, 2018

The world is witnessing the birth of a revolutionary computing paradigm that promises to have a profound effect on the way we interact with computers, devices, physical spaces, and other people. This new technology, called ubiquitous computing, envisions a world where embedded processors, computers, sensors, and digital communications are inexpensive commodities that are available everywhere. Ubiquitous computing will surround users with a comfortable and convenient information environment that merges physical and computational infrastructures into an integrated habitat. Overview The dissemination and use of modern information and communication technologies (ICT) are considered preconditions today for dynamic economic growth and future viability in global competition. At the same time, the processes of change triggered, enabled and accelerated by ICT are enormous. Ubiquitous computing is viewed less as a discrete field of technology than as an emerging application of information and communications technology that is integrated into the everyday world more than ever before. The goal is to meet the claim of “everything, always, everywhere” for data processing and transmission through the ubiquity of ICT systems. Ubiquitous computing is a paradigm in which the processing of information is linked with each activity or object as encountered. It involves connecting…
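A toy sketch of "processing linked with each activity or object": everyday devices publish readings to a tiny in-process event bus and handlers react to them. The device names and thresholds are invented for illustration; real ubiquitous systems exchange such events over network protocols such as MQTT or CoAP.

```python
from collections import defaultdict


class EventBus:
    """Minimal publish/subscribe hub for device events."""

    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.handlers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self.handlers[topic]:
            handler(payload)


bus = EventBus()
bus.subscribe("home/thermostat", lambda t: print("heating", "on" if t < 20 else "off", f"(temp={t}C)"))
bus.subscribe("home/door", lambda state: print("door is", state))

# Embedded "sensors" report as objects are used in everyday activity.
bus.publish("home/thermostat", 18.5)
bus.publish("home/door", "open")
```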

What is Software Development Life Cycle Model in Software Engineering
Programming , Technology & Science / February 6, 2018

As the world becomes more and more dependent on technology with each passing day, software has naturally become an important driver of development. Since software is needed almost everywhere today, its development is a highly intelligent and precise process involving various steps. It’s pretty evident that technology is accelerating at a rapid pace and humans are becoming ever more dependent on it for every purpose, and with every new day software development becomes more crucial, since the demand for software is rising fast from every corner imaginable. Overview and Framework A software life cycle model (also called a process model) is a descriptive and diagrammatic representation of the software life cycle. A life cycle model represents all the activities required to make a software product transit through its life cycle phases. It also captures the order in which these activities are to be undertaken. The Software Development Life Cycle, SDLC for short, is a well-defined, structured sequence of stages in software engineering used to develop the intended software product. SDLC provides a series of steps to be followed to design and develop a software product efficiently. The SDLC framework includes the following steps (Figure 1: SDLC Steps). Communication This is the first step…
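A compact sketch of the SDLC as an ordered sequence of phases. The phase names below are a generic textbook set chosen for illustration (the article’s Figure 1 begins with Communication, but its full list is not shown in this excerpt), and "executing" a phase is just a placeholder print.

```python
from enum import Enum


class Phase(Enum):
    COMMUNICATION = "gather the customer's initial request"
    REQUIREMENTS = "collect and document detailed requirements"
    DESIGN = "plan the architecture and components"
    IMPLEMENTATION = "write and integrate the code"
    TESTING = "verify the product against the requirements"
    DEPLOYMENT = "release the product to users"
    MAINTENANCE = "fix defects and evolve the product"


def run_life_cycle(phases):
    """Walk the phases strictly in order, as a classic sequential model does."""
    for step, phase in enumerate(phases, start=1):
        print(f"{step}. {phase.name.title()}: {phase.value}")


run_life_cycle(Phase)
```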

What is Data Streaming in Data Mining

In today’s information society, computer users are used to gathering and sharing data anytime and anywhere. This concerns applications such as social networks, banking, telecommunications, health care, research, and entertainment, among others. As a result, a huge amount of data related to all human activity is gathered for storage and processing purposes. These data sets may contain interesting and useful knowledge represented by hidden patterns, but due to the volume of the gathered data it is impossible to extract that knowledge manually. Data streaming requires sufficient bandwidth and, for real-time human perception of the data, the ability to ensure that enough data is continuously received without any noticeable time lag. What is it? Streaming data is data that is generated continuously by thousands of data sources, which typically send the data records simultaneously and in small sizes (on the order of kilobytes). Streaming data includes a wide variety of data such as log files generated by customers using your mobile or web applications, e-commerce purchases, in-game player activity, information from social networks, financial trading floors, or geospatial services, and telemetry from connected devices or instrumentation in data centers. This data needs to be processed sequentially and incrementally…
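A minimal sketch of sequential, incremental stream processing in plain Python: records are consumed one at a time from a generator and a running aggregate is updated without ever holding the full data set in memory. The simulated telemetry source stands in for real inputs such as log files or device telemetry.

```python
import random


def telemetry_stream(n_records):
    """Simulated stream of small records arriving one by one."""
    for i in range(n_records):
        yield {"device": f"sensor-{i % 3}", "temp": round(random.uniform(18.0, 30.0), 1)}


count = 0
running_sum = 0.0
for record in telemetry_stream(1000):       # process sequentially, record by record
    count += 1
    running_sum += record["temp"]
    if count % 250 == 0:                    # periodic incremental output
        print(f"after {count} records, mean temp = {running_sum / count:.2f}")
```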
