Social Engineering Attacks

The rise of the 21st century marked a transition for most global businesses towards a paperless office environment, where the focus shifted from manual to computerized forms of work. At the same time, this change brought a number of threats, one of the biggest being social engineering: a set of cracking techniques used by hackers that rely more on human weaknesses than on technology itself. The aim of such attacks is to obtain passwords or other sensitive information by tricking people, in order to carry out illegal or criminal activities. The FBI and other security experts hold a firm view that the majority of threats originate from the internal working environment, that is, from employees who have been granted additional privileges or authority over a company’s information. People who have an urge for power and control over other individuals often exhibit social engineering skills. Computer hacking is the modern form of social engineering and the most hi-tech of all. The fundamental problem with online social networking services in particular is that there are no criteria or authentication for proof of an individual’s identity, which puts both our privacy and our information at stake.

Social engineering attacks are driven by financial motives: hackers try to obtain confidential information about users in order to access their accounts. Social engineering is the root of phishing and pretexting, in which hackers gain the confidence of people who are careless or blindly trust others, allowing the attackers to take undue advantage.…

What is an Expert System?

An expert system is an advanced computer application implemented to provide solutions to complex problems, or to clarify uncertainties that would normally require human expertise, through the use of non-algorithmic programs. Expert systems are most common in complex problem domains and are widely used alternatives when searching for solutions requires specific human expertise. An expert system is also able to justify the solutions it provides based on knowledge and data from past users. Expert systems are typically used for making strategic marketing decisions, analyzing the performance of real-time systems, configuring computers, and performing many other functions that would normally require human expertise.

What distinguishes an expert system from a conventional problem-solving system is that in the latter, problem-specific knowledge is encoded in both the program and its data structures, whereas in an expert system only the data structures hold that knowledge and no problem-specific information is encoded in the program structure. Instead, the knowledge of human experts is captured and codified in a process known as knowledge engineering. Hence, whenever a particular problem requires human expertise to produce a solution, the codified knowledge is retrieved and processed in order to provide a rational and logical answer. This knowledge-based design allows new knowledge to be added frequently, so the system can adapt to meet new requirements from an ever-changing and unpredictable environment.
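The separation of codified knowledge from program logic can be sketched with a toy forward-chaining rule engine. The facts and rules below are invented for illustration (a simple car-diagnosis domain), not taken from any particular expert system; the point is that the knowledge lives entirely in data structures, while the inference loop stays generic:

```python
# Toy knowledge base: facts and rules are pure data, not program logic.
# Each rule pairs a set of premises with a conclusion it justifies.
facts = {"engine_cranks", "fuel_empty"}

rules = [
    ({"engine_cranks", "fuel_empty"}, "refuel_needed"),
    ({"refuel_needed"}, "advise_refuel"),
]

def forward_chain(facts, rules):
    """Generic inference engine: fire rules until no new facts appear."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(sorted(forward_chain(facts, rules)))
```

Adding expertise here means appending rules to the list; the `forward_chain` function never changes, which mirrors the knowledge-engineering idea described above.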

Components of Expert System

An expert system needs several core components to function, and it interfaces with individuals in various roles.…

Case Study on Information Systems: Integrated Customer Ordering Service at Marks & Spencer

Marks and Spencer is one of the leading retail organizations in the UK, selling stylish, great-value clothing and home products, as well as quality food. It is one of the most popular brands not only in the UK but globally. It has more than 600 stores in the UK and is constantly opening more around the world. The company was founded in 1884, when Michael Marks opened a stall at Leeds Kirkgate Market. In 1901, its first registered store opened on Derby Street, Manchester. By 1924 the company had begun expanding, and the head office moved from Manchester to London. The implementation of new policies and the maintenance of service and value kept adding to the success of Marks and Spencer. In 1998, it became the first British retailer to earn a profit of £1 billion. The organisation, commonly known as M&S, has followed the principles of Quality, Value, Service, Innovation and Trust since it was founded. This is why it has remained successful, distinctive and popular.

The Need for the Information System: Integrated Customer Ordering Service (ICOS)

M&S was facing a growing backlog in order processing, and complaints were increasing day by day. The company had invested heavily to stay ahead in a competitive business environment but was unable to overcome this problem. The need of the hour was a customer ordering information system that could accurately and reliably eliminate these problems, which were an obstacle to the success of a great retail organisation.…

What is Agile Methodology?

Engineering methodologies required a great deal of documentation, which slowed the pace of development considerably. Agile methodologies evolved in the 1990s largely to eliminate this bureaucratic character of engineering methodology. They were part of a reaction by developers against “heavyweight” methods: a desire to move away from traditional structured, bureaucratic approaches to software development towards more flexible development styles. These came to be called ‘agile’ or ‘lightweight’ methods, an idea described as early as 1974 by Edmonds in a research paper.

Agile methodology is an approach to project management, typically used in software development. It refers to a group of software development methodologies based on iterative development, in which requirements and solutions evolve through cooperation between self-organizing, cross-functional teams, without concern for hierarchy or fixed team-member roles. It promotes teamwork, collaboration, and process adaptability throughout the project life-cycle, with increased face-to-face communication and a reduced amount of written documentation.

Agile methods break tasks into small increments with minimal direct long-term planning. Every aspect of development is continually revisited throughout the life of the project by way of iterations (also called sprints). Iterations are short time frames (“timeboxes”) that normally last one to four weeks. This “inspect-and-adapt” approach significantly reduces both development costs and time to market. Each iteration involves working through a complete software development cycle: planning, requirements analysis, design, coding, unit testing, and acceptance testing. This minimizes overall risk and lets the project adapt quickly. While a single iteration may not add enough functionality to warrant a market release, the aim is to have a release ready (with minimal bugs) at the end of each iteration.…
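As a rough illustration of the timeboxed iterations described above, the sketch below (all names and numbers are invented) models a few sprints and the average completion rate, often called velocity, that teams use during the "inspect-and-adapt" step to plan the next iteration:

```python
# Illustrative model of timeboxed iterations; the sprint data is made up.
from dataclasses import dataclass

@dataclass
class Sprint:
    number: int
    length_weeks: int        # the timebox, typically 1-4 weeks
    planned_points: int      # work committed at iteration planning
    completed_points: int    # work actually finished in the timebox

def velocity(history):
    """Average completed work per sprint; feeds the next plan."""
    return sum(s.completed_points for s in history) / len(history)

history = [
    Sprint(1, 2, planned_points=30, completed_points=24),
    Sprint(2, 2, planned_points=26, completed_points=26),
    Sprint(3, 2, planned_points=27, completed_points=25),
]
print(f"velocity: {velocity(history):.1f} points/sprint")
```

Rather than planning the whole project up front, each new iteration is scoped from what recent iterations actually delivered, which is the essence of the inspect-and-adapt loop.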

Data Mining Functionalities

Data mining has an important place in today’s world. It has become an important research area because huge amounts of data are available in most applications, and this data must be processed to extract the useful information and knowledge it contains, since these are not explicit. Data mining is the process of discovering interesting knowledge from large amounts of data.

The kinds of patterns that can be discovered depend upon the data mining tasks employed. By and large, there are two types of data mining tasks: descriptive data mining tasks that describe the general properties of the existing data, and predictive data mining tasks that attempt to make predictions based on inference from the available data. The data mining functionalities and the variety of knowledge they discover are briefly presented in the following list:

  1. Characterization: This is the summarization of the general features of objects in a target class, and it produces what are called characteristic rules. The data relevant to a user-specified class are normally retrieved by a database query and run through a summarization module to extract the essence of the data at different levels of abstraction. For example, one may wish to characterize the customers of a store who rent a large number of movies a year. With concept hierarchies on the attributes describing the target class, the attribute-oriented induction method can be used to carry out the summarization. With a data cube containing a summarization of the data, simple OLAP operations fit the purpose of data characterization.
  2. Discrimination: Data discrimination produces what are called discriminant rules and is essentially a comparison of the general features of objects between two classes, referred to as the target class and the contrasting class.
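A minimal sketch of characterization, with an invented customer data set: retrieve the user-specified target class (here via a simple in-memory “query”), then run it through a summarization step that extracts the essence of its features:

```python
# Illustrative only: the records, fields, and "frequent" class are made up.
from statistics import mean
from collections import Counter

customers = [
    {"segment": "frequent", "age": 34, "city": "Leeds",  "rentals": 52},
    {"segment": "frequent", "age": 29, "city": "Leeds",  "rentals": 47},
    {"segment": "frequent", "age": 41, "city": "London", "rentals": 60},
    {"segment": "casual",   "age": 55, "city": "Derby",  "rentals": 4},
]

# "Query" step: retrieve the data relevant to the target class.
target = [c for c in customers if c["segment"] == "frequent"]

# Summarization step: extract general features of the class.
summary = {
    "count": len(target),
    "avg_age": mean(c["age"] for c in target),
    "avg_rentals": mean(c["rentals"] for c in target),
    "top_city": Counter(c["city"] for c in target).most_common(1)[0][0],
}
print(summary)
```

Discrimination would follow the same shape, but compute the summary for both the target class and a contrasting class (e.g. `"casual"`) and compare the two.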

An Introduction to Data Mining

Data mining involves the use of sophisticated data analysis tools to discover previously unknown, valid patterns and relationships in large data sets. These tools can include statistical models, mathematical algorithms, and machine learning methods such as neural networks or decision trees. Consequently, data mining consists of more than collecting and managing data; it also includes analysis and prediction. The objective of data mining is to identify valid, novel, potentially useful, and understandable correlations and patterns in existing data. Finding useful patterns in data is known by different names (e.g., knowledge extraction, information discovery, information harvesting, data archaeology, and data pattern processing).

The term “data mining” is primarily used by statisticians, database researchers, and the business community. The term KDD (Knowledge Discovery in Databases) refers to the overall process of discovering useful knowledge from data, of which data mining is one particular step. The other steps in the KDD process, such as data preparation, data selection, data cleaning, and proper interpretation of the results of the data mining step, ensure that useful knowledge is derived from the data. Data mining is an extension of traditional data analysis and statistical approaches in that it incorporates analytical techniques drawn from various disciplines such as AI, machine learning, OLAP, and data visualization.
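The KDD steps surrounding the mining step can be sketched as a small pipeline. Everything here is invented for illustration, including the function names, the spend cap, and the records; the point is only the ordering of selection, cleaning, mining, and interpretation:

```python
# Hypothetical raw records; the fields and values are made up.
raw = [
    {"customer": "a", "spend": 120.0},
    {"customer": "b", "spend": None},     # missing value
    {"customer": "c", "spend": 80.0},
    {"customer": "d", "spend": 15000.0},  # implausible outlier
]

def select(records):
    """Data selection: keep only the fields relevant to the task."""
    return [{"spend": r["spend"]} for r in records]

def clean(records, cap=1000.0):
    """Data cleaning: drop missing values and implausible outliers."""
    return [r for r in records if r["spend"] is not None and r["spend"] <= cap]

def mine(records):
    """The mining step proper: here, a trivial descriptive pattern."""
    spends = [r["spend"] for r in records]
    return {"n": len(spends), "mean_spend": sum(spends) / len(spends)}

def interpret(pattern):
    """Interpretation: turn the discovered pattern into usable knowledge."""
    return f"{pattern['n']} valid customers, average spend {pattern['mean_spend']:.2f}"

print(interpret(mine(clean(select(raw)))))
```

Skipping the preparation steps would let the missing value and the outlier distort the mined pattern, which is why KDD treats them as first-class stages rather than afterthoughts.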

Data mining covers a variety of techniques for identifying nuggets of information or decision-making knowledge in bodies of data, and for extracting these in such a way that they can be put to use in areas such as decision support, prediction, forecasting, and estimation. The data is often voluminous but, as it stands, of low value, since no direct use can be made of it; it is the hidden information in the data that is really useful.…