Disaster Recovery Plan (DRP) in Business

Fire, flood, earthquake and accidental deletion are all events that can have disastrous consequences for data. Such disasters can prevent the network from operating normally, which in turn hampers the organisation’s business. Disasters can be classified as man-made or environmental. Man-made disasters are caused by humans, intentionally or unintentionally: for example, a user may accidentally delete data, or viruses and other malicious programs may damage it, leading to data loss and downtime. Environmental disasters cannot be prevented, but their impact can be reduced if appropriate precautions are taken; they include fire, flood, earthquake, tornado and hurricane.

Disaster recovery deals with the recovery of data damaged by destructive events. The time required to recover from a disaster depends on the disaster recovery plan the organisation has implemented; a good plan can protect an organisation against any type of disruption.

Disaster Recovery Plan/Business Continuity Plan

A Disaster Recovery Plan (DRP) helps to identify threats to an existing business, such as terrorism, fire, earthquake and flood, and provides guidance on how to respond when such events occur. Disasters are unpredictable; hence, planning for the worst is important for any business. A DRP is also called a Business Continuity Plan (BCP). The only difference between the two is the focus: a Business Continuity Plan focuses on providing continuity of operations in the organisation, whereas a Disaster Recovery Plan focuses on recovering and rebuilding the organisation after a disaster has occurred.


Case Study: Amazon.com Situation Analysis

Jeffrey Bezos started Amazon.com in 1994, after recognizing that internet usage was growing at a rate of 2,300 percent a year. Operating from a 400-square-foot office in Seattle, he launched Amazon.com on the internet in July 1995. Amazon.com’s mission is to use the internet to transform book buying into the fastest, easiest, and most enjoyable shopping experience possible. By the end of 1996, the firm was one of the most successful web retailers, and almost overnight Amazon.com had become the world’s largest online bookstore. Amazon has continued to expand its customer base, and sales revenues have increased every year, from $15.7 million in 1996 to $2.76 billion in 2000. Today, Amazon.com is the place to find and discover anything you want to buy online, offering the Earth’s Biggest Selection of products to 29 million people in more than 160 countries and making it the leading online shopping site on the World Wide Web.

Over the past several years Amazon.com has grown and developed very rapidly. The key core processes that have led to Amazon’s success are convenience, selection, service, and price. Convenience is best captured by Bill Gates’s remark: “I buy all my books at Amazon.com because I’m busy and it’s convenient. They have a big selection, and they’ve been reliable.” With over 106 million adults purchasing books every quarter, Amazon has capitalized on the convenience of online ordering. The next key process for Amazon is selection.

Case Study: eBay’s Business Model

Founded in 1995 by Pierre Omidyar, eBay was a pioneer in the online auction industry, bringing people together on a local, national and international basis to create a person-to-person trading community in which every individual has equal access through the same medium, the internet. eBay offers a wide variety of products and services for bargain hunters, hobbyists, collectors and sellers, changing the way people engage in trading; in this sense eBay has changed the face of e-commerce since its inception. Today, eBay remains the brand of preference, with a presence in over 39 markets and $60 billion in total value of items sold on the site’s trading platform.

At the business level, eBay introduced several crucial innovations tailor-made for the internet, a strategy that was conceived as an improvisation. In its online auction business model, eBay serves as a value-added facilitator of trade between a buyer and a seller in a highly individualistic manner. The online auction model developed by eBay marked an important extension of e-commerce, offering millions of individuals a low-cost opportunity to engage in a new type of economic activity.

Meg Whitman arrived in 1998 as eBay’s second president and CEO, and eBay became a publicly listed company in September of that year; building brand recognition at eBay was made a priority. By then, eBay’s registered users had grown six-fold, to over 2 million. Under Whitman’s leadership the company grew to over 200 million users globally and over $7 billion in revenue.

What is 4G?

Fourth generation (4G) wireless was originally conceived by the Defense Advanced Research Projects Agency (DARPA), the same organization that developed the wired Internet. It is not surprising, then, that DARPA chose the same distributed architecture for the wireless Internet that had proven so successful in the wired Internet. Although experts and policymakers have yet to agree on all the aspects of 4G wireless, two characteristics have emerged as all but certain components of 4G: end-to-end Internet Protocol (IP) and peer-to-peer networking. An all-IP network makes sense because consumers will want to use the same data applications they are used to in wired networks. A peer-to-peer network, in which every device is both a transceiver and a router/repeater for other devices, eliminates the spoke-and-hub weakness of cellular architectures, because the loss of a single node does not disable the network. The final definition of “4G” will have to include something as simple as this: if a consumer can do it at home or in the office while wired to the Internet, that consumer must be able to do it wirelessly in a fully mobile environment.

Let’s define “4G” as “wireless ad hoc peer-to-peer networking.” 4G technology is significant because users joining the network add mobile routers to the network infrastructure. Because users carry much of the network with them, network capacity and coverage are dynamically shifted to accommodate changing user patterns. As people congregate and create pockets of high demand, they also create additional routes for each other, thus enabling additional access to network capacity.
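The resilience claim above can be illustrated with a tiny graph sketch: model each device as a node that forwards traffic for its neighbours, then check that removing any single node still leaves the rest of the mesh connected. The topology and node names are purely illustrative.

```python
def reachable(links, start):
    # Simple graph search over the ad hoc mesh: every node relays
    # for its neighbours, so reachability = multi-hop connectivity.
    seen, frontier = {start}, [start]
    while frontier:
        node = frontier.pop()
        for nbr in links.get(node, ()):
            if nbr not in seen:
                seen.add(nbr)
                frontier.append(nbr)
    return seen

# Illustrative mesh: each device is both endpoint and router.
mesh = {
    "A": {"B", "C"},
    "B": {"A", "D"},
    "C": {"A", "D"},
    "D": {"B", "C"},
}

# Remove one node (say B); A can still reach D via the route through C.
without_b = {n: nbrs - {"B"} for n, nbrs in mesh.items() if n != "B"}
```

This is the opposite of a spoke-and-hub cellular layout, where removing the single hub would partition every spoke at once.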

What is a Proxy Server?

Proxy servers have been around for quite a while now; most likely, their history dates back to the beginnings of networking and the internet itself. Proxy servers were originally developed as a tool for caching frequently accessed web pages. A proxy server is a server that acts as an intermediary for requests from clients seeking resources from other servers. A client connects to the proxy server, requesting some service, such as a file, connection, web page, or other resource available from a different server. The proxy server evaluates the request according to its filtering rules; it may filter traffic by Internet Protocol (IP) address or protocol. If the request passes the filter, the proxy provides the resource by connecting to the relevant server and requesting the service on behalf of the client. A proxy server may optionally alter the client’s request or the server’s response, and sometimes it may serve the request without contacting the specified server at all: it ‘caches’ responses from the remote server and serves subsequent requests for the same content directly.
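The caching behaviour described above can be sketched in a few lines: the proxy checks its cache before contacting the remote server, and repeated requests for the same content are answered directly. This is a minimal illustration, not a real HTTP proxy; the class and function names are invented for the example.

```python
class CachingProxy:
    """Toy forward proxy that remembers responses per URL."""

    def __init__(self, fetch):
        self._fetch = fetch   # function that contacts the remote server
        self._cache = {}      # url -> cached response body

    def request(self, url):
        # Serve from cache when possible; otherwise fetch on behalf
        # of the client and remember the response for next time.
        if url not in self._cache:
            self._cache[url] = self._fetch(url)
        return self._cache[url]

# A stand-in for the remote server, recording how often it is contacted.
calls = []
def fetch_from_origin(url):
    calls.append(url)
    return f"<body of {url}>"

proxy = CachingProxy(fetch_from_origin)
first = proxy.request("http://example.com/")
second = proxy.request("http://example.com/")   # served from the cache
```

After both requests, the origin has been contacted only once, which is exactly the traffic reduction that motivated the earliest proxies.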

How Proxy Servers Work

A proxy server functions as a middleman between the public internet and an internal network. For example, when an internal host makes a request to access a website, the request goes to the proxy server, which examines the packet, including its header and data, against rules pre-configured by the administrator. The proxy server then recreates the packet with a different source IP address and sends it on to its destination, so the address the receiver sees is that of the proxy server rather than the internal host.
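The inspect-then-rewrite step can be sketched as a function over a packet represented as a dictionary. The rule set, field names and addresses here are illustrative assumptions, not a real proxy API.

```python
def forward(packet, blocked_hosts, proxy_ip):
    # Examine the packet against administrator-configured rules.
    if packet["dst_host"] in blocked_hosts:
        return None  # request rejected by the filter
    # Recreate the packet with the proxy's own source address, so the
    # destination sees the proxy, not the internal host.
    return dict(packet, src_ip=proxy_ip)

# An internal host's request, before it passes through the proxy.
request = {"src_ip": "10.0.0.5", "dst_host": "example.com", "data": "GET /"}
out = forward(request, blocked_hosts={"bad.example"}, proxy_ip="203.0.113.7")
```

The same function doubles as the filter: had `example.com` appeared in `blocked_hosts`, the request would simply never leave the internal network.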


Proxy and Reverse Proxy Servers

Proxy Server

A proxy server is a computer system or application that works as an intermediary for clients seeking resources that exist on other servers. It allows multiple users to access the internet at the same time over a single internet connection, and it aims to improve browsing speed and reduce network traffic. It receives a request from a web user, such as for a web page or a file available on a different server, then evaluates and responds to that request, simplifying and controlling the process. A proxy server can also be part of a firewall, helping to prevent hackers from gaining access to computers on a private network.

The main features of a proxy server are:

  1. Caching: When a user requests a file, the proxy first checks its cache and serves the file if it is present; otherwise it forwards the request to the web server.
  2. Connection sharing: Proxies let users share a single internet connection by configuring them to access the web through the proxy instead of giving each user a direct link.
  3. Filtering: Since the proxy server handles all user requests, it can be used to restrict access to certain URLs.
  4. Security: The proxy server assists in security by hiding the IP addresses of users.
  5. Scanning traffic: Proxies sometimes integrate with open-source anti-virus software to scan network traffic for viruses and worms.
  6. Bandwidth control: Proxies can use delay pools to control bandwidth by allocating a specific amount of bandwidth to each class of internet traffic.
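The bandwidth-control feature in the last item can be sketched as a token bucket, which is the principle behind Squid-style delay pools: each pool refills at a fixed rate, and a transfer proceeds only while enough budget remains. The class name and parameters below are illustrative, not any proxy's actual API.

```python
class DelayPool:
    """Token-bucket sketch of proxy bandwidth control (illustrative)."""

    def __init__(self, rate_bytes_per_sec, capacity):
        self.rate = rate_bytes_per_sec   # refill rate of the pool
        self.capacity = capacity         # maximum burst size in bytes
        self.tokens = capacity           # pool starts full

    def refill(self, elapsed_seconds):
        # The pool regains budget over time, up to its capacity.
        self.tokens = min(self.capacity,
                          self.tokens + self.rate * elapsed_seconds)

    def allow(self, nbytes):
        # A transfer of nbytes proceeds only if the pool can cover it.
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False

pool = DelayPool(rate_bytes_per_sec=1000, capacity=1000)
```

A client that has exhausted the pool is held back until refills catch up, which is how a proxy caps the share of the link any one traffic class can consume.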