March 15, 2009

Contemporary Hardware Platform Trends




  • Grid Computing

- Connects geographically remote computers into a single network to create a virtual supercomputer by combining the computational power of all computers on the grid.

- Most CPUs are used only about 25% of the time on average, sitting idle for the rest.

- Grid computing has become practical thanks to economical, high-speed Internet connections.

- Grid computing depends on software to divide and apportion pieces of a program among several computers, sometimes up to many thousands.

- Client software communicates with a server software application.

- The server software breaks data into chunks that are parceled out to the grid machines.

Client machines can perform their own tasks while running the grid application in the background; a minimal sketch of this division of work follows below.
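
To make the idea concrete, here is a minimal Python sketch of grid-style work division. It is only an illustration, not any particular grid product: the function names (split_into_chunks, process_chunk) and the use of a local process pool to stand in for remote grid machines are assumptions made for the example.

```python
# Illustrative sketch of grid-style work division (not any specific grid product).
# A "server" splits a dataset into chunks; "client" machines process the chunks
# in the background and return partial results that are combined at the end.
from concurrent.futures import ProcessPoolExecutor

def split_into_chunks(data, n_chunks):
    """Break the dataset into roughly equal chunks, one per grid machine."""
    size = max(1, len(data) // n_chunks)
    return [data[i:i + size] for i in range(0, len(data), size)]

def process_chunk(chunk):
    """Work done on a single grid machine; a stand-in computation."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    dataset = list(range(10_000))
    chunks = split_into_chunks(dataset, n_chunks=8)
    # The local process pool stands in for idle grid machines picking up chunks.
    with ProcessPoolExecutor(max_workers=8) as pool:
        partial_results = list(pool.map(process_chunk, chunks))
    print("combined result:", sum(partial_results))
```

In a real grid, the chunks would be shipped over the network to idle machines rather than to local processes, but the split/compute/combine pattern is the same.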

Example:

Royal Dutch/Shell Group

1,024 servers running Linux – the largest Linux supercomputer

The grid adjusts to accommodate the fluctuating data volumes that are typical in a seasonal business.

  • On-Demand Computing

- Refers to firms offloading peak demand for computing power to remote, large-scale data processing centers.

- Firms can reduce their investment in IT infrastructure.

- “Utility Computing” suggests that firms purchase computing power from central computing utilities and pay only for the amount of computing power they use, much as they pay for electricity.

- Useful for firms that face annual traffic surges on seasonal occasions.

  • Autonomic Computing

- An industry-wide effort to develop systems that can configure themselves, optimize and tune themselves, heal themselves when broken, and protect themselves from outside intruders and self-destruction.

  • Edge Computing

- A multitier, load-balancing scheme for Web-based applications in which a significant part of the website content, logic, and processing is performed by smaller, less expensive servers located near users.

- There are three tiers in edge computing: the local client, the nearby edge computing platform, and the enterprise computers located in the firm’s data center.

- Requests from the user’s client computer are initially processed by the edge computers, as sketched below.
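
A minimal sketch of the three-tier request flow, with a plain Python dictionary standing in for the edge platform's cache and a hypothetical fetch_from_origin call standing in for the enterprise data center:

```python
# Illustrative three-tier edge sketch: a request is handled by a nearby edge node
# when possible; only requests the edge cannot satisfy reach the enterprise tier.
EDGE_CACHE = {"/index.html": "<html>cached home page</html>"}  # content held near users

def fetch_from_origin(path):
    """Stand-in for a call back to enterprise servers in the firm's data center."""
    return f"<html>generated by origin for {path}</html>"

def serve_request(path):
    """Edge-node logic: answer locally when possible, otherwise ask the origin."""
    if path in EDGE_CACHE:
        return EDGE_CACHE[path], "served by edge tier"
    content = fetch_from_origin(path)
    EDGE_CACHE[path] = content  # keep a copy near the user for next time
    return content, "served by enterprise tier"

print(serve_request("/index.html"))   # answered at the edge
print(serve_request("/reports/q3"))   # falls through to the data center
```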

Business benefits of edge computing:

1. Technology costs are lowered: there is no need to purchase all of the infrastructure for the firm's own data center.

2. Service levels are enhanced for users: response times are shorter.

3. The firm's flexibility is enhanced because it can respond to business opportunities quickly.


  • Cloud Computing

Cloud computing is Internet ("cloud") based development and use of computer technology ("computing"). It is a style of computing in which typically real-time scalable resources are provided “as a service” over the Internet to users who need not have knowledge of, expertise in, or control over the technology infrastructure ("in the cloud") that supports them.

It is a general concept that incorporates software as a service (SaaS), Web 2.0 and other recent, well-known technology trends, in which the common theme is reliance on the Internet for satisfying the computing needs of the users. An often-quoted example is Google Apps, which provides common business applications online that are accessed from a web browser, while the software and data are stored on Google servers.

The cloud is a metaphor for the Internet, based on how it is depicted in computer network diagrams, and is an abstraction for the complex infrastructure it conceals.

January 13, 2009

TCP/IP





Introduction to TCP/IP

Many people may not know what TCP/IP is, nor what its effect is on the Internet. The fact is, without TCP/IP there would be no Internet, and it is because of the American military that the Internet exists. During the days of the Cold War, the Defense Department was interested in developing a means of electronic communication that could survive an attack by re-routing itself around any failed section of the network. It began a research project designed to connect many different networks, and many different types of hardware from various vendors. Thus was the birth of the Internet (sort of). In reality, they were forced to connect different types of hardware from various vendors because the different branches of the military used different hardware. Some used IBM, while others used Unisys or DEC.

IP is responsible for moving data from computer to computer. IP forwards each packet based on a four-byte destination address (the IP number). IP uses gateways to help move data from point “a” to point “b”. Early gateways were responsible for finding routes for IP to follow.

TCP is responsible for ensuring correct delivery of data from computer to computer. Because data can be lost in the network, TCP adds support to detect errors or lost data and to trigger retransmission until the data is correctly and completely received.

How TCP/IP works

Computers are first connected to their Local Area Network (LAN). TCP/IP shares the LAN with other systems such as file servers, web servers and so on. The hardware connects via a network interface that has its own hard-coded unique address, called a MAC (Media Access Control) address. The client is either assigned an address or requests one from a server. Once the client has an address, it can communicate, via IP, with the other clients on the network. As mentioned above, IP is used to send the data, while TCP verifies that it is sent correctly.
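
As a small illustration of this division of labour, the sketch below opens an ordinary TCP connection from Python; the application only reads and writes bytes, while the operating system's TCP/IP stack takes care of addressing, routing and reliable delivery. The host and port are placeholders.

```python
# Minimal TCP client sketch: the application just reads and writes bytes, while
# the operating system's TCP/IP stack handles addressing, routing and retransmission.
import socket

HOST, PORT = "example.com", 80  # placeholder destination

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    sock.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    reply = sock.recv(4096)  # TCP ensures these bytes arrive intact and in order
    print(reply.decode(errors="replace"))
```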

When a client wishes to connect to another computer outside the LAN, they generally go through a computer called a Gateway (mentioned above). The gateway’s job is to find and store routes to destinations. It does this through a series of broadcast messages sent to other gateways and servers nearest to it. They in turn could broadcast for a route. This procedure continues until a computer somewhere says “Oh yeah, I know how to get there.” This information is then relayed to the first gateway that now has a route the client can use.

How does the system know the data is correct?

As mentioned above, IP is responsible for getting the data there. TCP then takes over to verify it.

Encoded in the data packets is other data that is used to verify the packet. This data (a checksum, or mathematical representation of the packet) is confirmed by TCP and a confirmation is sent back to the sender.

This process of sending, receiving and acknowledging happens for each individual packet sent over the Internet.

When the data is verified, it is reassembled on the receiving computer. If a packet is not verified, the sending computer will re-send it and wait for confirmation. This way both computers – sending and receiving – know which data is correct and which isn’t.
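
A toy sketch of the verify-and-acknowledge idea is shown below. It uses zlib's CRC-32 purely for illustration; real TCP uses a 16-bit Internet checksum plus sequence numbers and timers, so this is a simplified stand-in, not the actual protocol.

```python
# Toy send/verify/acknowledge loop. Real TCP uses a 16-bit Internet checksum plus
# sequence numbers; zlib.crc32 is used here only to illustrate verifying a packet
# and either acknowledging it or asking for a resend.
import zlib

def make_packet(payload: bytes) -> dict:
    return {"payload": payload, "checksum": zlib.crc32(payload)}

def verify(packet: dict) -> bool:
    """Receiver recomputes the checksum; True means 'send an ACK'."""
    return zlib.crc32(packet["payload"]) == packet["checksum"]

packet = make_packet(b"hello, internet")
print("ACK" if verify(packet) else "resend")   # ACK

packet["payload"] = b"hellO, internet"         # simulate corruption in transit
print("ACK" if verify(packet) else "resend")   # resend
```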

One nice thing about this protocol is that it doesn’t need to stick to just one route. Generally, when you are sending or receiving data, packets may take multiple routes to reach their destination. This improves reliability: if one route fails, the data can still get through on another.

Just the facts:

TCP/IP addresses are based on 4 octets of 8 bits each. Each octet represents a number between 0 and 255, so an IP address looks like 192.168.1.100 (no octet can exceed 255).

There are 3 classes of IP addresses:

Ranges starting with 1 and ending with 126 (i.e. 1.1.1.1 to 126.255.255.254) are Class A.

Ranges starting with 128 and ending with 191 (i.e. 128.1.1.1 to 191.255.255.254) are Class B.

Ranges starting with 192 and ending with 223 (i.e. 192.1.1.1 to 223.255.255.254) are Class C. (You will notice that there are no addresses starting with 127; these are reserved for loopback. Addresses from 224 upward are reserved for Class D multicast and Class E experimental use.)
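
To make the octet boundaries concrete, here is a small, purely illustrative Python sketch that classifies an address under this classful scheme:

```python
# Classify an IPv4 address by its first octet under the classful scheme above.
def ip_class(address: str) -> str:
    first = int(address.split(".")[0])
    if first == 127:
        return "reserved (loopback)"
    if 1 <= first <= 126:
        return "Class A"
    if 128 <= first <= 191:
        return "Class B"
    if 192 <= first <= 223:
        return "Class C"
    if 224 <= first <= 239:
        return "Class D (multicast)"
    return "Class E / reserved"

for addr in ("10.0.0.1", "172.16.5.4", "192.168.1.100", "127.0.0.1"):
    print(addr, "->", ip_class(addr))
```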

October 28, 2008

Ten strategic technologies for 2009 - By Gartner

Gartner Inc analysts have highlighted the top 10 technologies and trends that will be strategic for most organisations. The analysts presented their findings during Gartner Symposium/ITxpo held recently.

“Strategic technologies affect, run, grow and transform the business initiatives of an organisation,” said David Cearley, vice president and distinguished analyst at Gartner. “Companies should look at these 10 opportunities and evaluate where these technologies can add value to their business services and solutions, as well as develop a process for detecting and evaluating the business value of new technologies as they enter the market.”

Virtualisation

Virtualisation that eliminates duplicate copies of data on the real storage devices, while maintaining the illusion to the accessing systems that the files are as originally stored (data deduplication), can significantly decrease the cost of the storage devices and media needed to hold information.
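
A minimal sketch of the deduplication idea: each unique block is stored once under a content hash, while files keep only references. The block size and in-memory "store" are illustrative assumptions, not any particular product.

```python
# Content-addressed deduplication sketch: identical blocks are stored only once,
# and each file is recorded as a list of block hashes pointing at those blocks.
import hashlib

BLOCK_SIZE = 4096
block_store = {}   # hash -> block bytes (the single physical copy)
file_index = {}    # file name -> list of block hashes

def store_file(name: str, data: bytes) -> None:
    hashes = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        block_store.setdefault(digest, block)  # duplicate blocks are not stored again
        hashes.append(digest)
    file_index[name] = hashes

def read_file(name: str) -> bytes:
    """The accessing system still sees the file exactly as originally stored."""
    return b"".join(block_store[h] for h in file_index[name])

store_file("a.txt", b"same content" * 1000)
store_file("b.txt", b"same content" * 1000)   # second copy adds no new blocks
print(len(block_store), "unique blocks stored")
print(read_file("b.txt") == b"same content" * 1000)
```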

Hosted virtual images deliver a near-identical result to blade-based PCs. But, instead of the motherboard function being located in the data center as hardware, it is located there as a virtual machine bubble.

However, despite ambitious deployment plans from many organisations, deployments of hosted virtual desktop capabilities will be adopted by fewer than 40 per cent of target users by 2010.

Cloud computing

Cloud computing is a style of computing that characterizes a model in which providers deliver a variety of IT-enabled capabilities to consumers. The key characteristics of cloud computing are delivery of capabilities “as a service,” delivery of services in a highly scalable and elastic fashion, use of Internet technologies and techniques to develop and deliver the services, and design for delivery to external customers.

Although cost is a potential benefit for small companies, the biggest benefits are the built-in elasticity and scalability, which not only reduce barriers to entry, but also enable these companies to grow quickly.

As certain IT functions are industrialising and becoming less customised, there are more possibilities for larger organisations to benefit from cloud computing.

Servers -- beyond blades

Servers are evolving beyond the blade server stage that exists today. This evolution will simplify the provisioning of capacity to meet growing needs. The organisation tracks the various resource types, for example memory, separately and replenishes only the type that is in short supply. This eliminates the need to pay for all the resource types just to upgrade one.

It also simplifies the inventory of systems, eliminating the need to track and purchase various sizes and configurations. The result will be higher utilisation because of lessened “waste” of resources that are in the wrong configuration or that come along with the needed processors and memory in a fixed bundle.

Web-oriented architectures


The use of Web-centric models to build global-class solutions cannot address the full breadth of enterprise computing needs. However, Gartner expects that continued evolution of the Web-centric approach will enable its use in an ever-broadening set of enterprise solutions during the next five years.

Enterprise mashups


Enterprises are now investigating taking mashups from cool Web hobby to enterprise-class systems to augment their models for delivering and managing applications. Through 2010, the enterprise mashup product environment will experience significant flux and consolidation, and application architects and IT leaders should investigate this growing space for the significant and transformational potential it may offer their enterprises.

Specialised systems


Appliances have been used to accomplish IT purposes, but only with a few classes of function have appliances prevailed. Heterogeneous systems are an emerging trend in high-performance computing to address the requirements of the most demanding workloads, and this approach will eventually reach the general-purpose computing market.

Heterogeneous systems are also specialised systems with the same single-purpose limitations as appliances, but a heterogeneous system is a server system into which the owner installs software to accomplish its function.

Social software and social networking


Social software includes a broad range of technologies, such as social networking, social collaboration, social media and social validation.

Organisations should consider adding a social dimension to a conventional website or application and should adopt a social platform sooner rather than later, because the greatest risk lies in failure to engage and thereby being left mute in a dialogue where your voice must be heard.

Unified communications


During the next five years, the number of different communications vendors with which a typical organisation works will be reduced by at least 50 per cent. This change is driven by increases in the capability of application servers and the general shift of communications applications to common off-the-shelf servers and operating systems.

As this occurs, formerly distinct markets, each with distinct vendors, converge, resulting in massive consolidation in the communications industry. Organisations must build careful, detailed plans for when each category of communications function is replaced or converged, coupling this step with the prior completion of appropriate administrative team convergence.

Business Intelligence


Business Intelligence (BI), the top technology priority in Gartner’s 2008 CIO survey, can have a direct positive impact on a company’s business performance, dramatically improving its ability to accomplish its mission by making smarter decisions at every level of the business from corporate strategy to operational processes.

BI is particularly strategic because it is directed toward business managers and knowledge workers who make up the pool of thinkers and decision makers that are tasked with running, growing and transforming the business. Tools that let these users make faster, better and more-informed decisions are particularly valuable in a difficult business environment.

Green IT


Shifting to more efficient products and approaches can allow for more equipment to fit within an energy footprint, or to fit into a previously filled center. Regulations are multiplying and have the potential to seriously constrain companies in building data centers, as the effect of power grids, carbon emissions from increased use and other environmental impacts are under scrutiny. Organisations should consider regulations and have alternative plans for data center and capacity growth.

“A strategic technology may be an existing technology that has matured and/or become suitable for a wider range of uses,” said Carl Claunch, vice president and distinguished analyst at Gartner. “It may also be an emerging technology that offers an opportunity for strategic business advantage for early adopters or with potential for significant market disruption in the next five years. Companies should evaluate these technologies and adjust based on their industry need, unique business needs, technology adoption model and other factors.”

September 12, 2008

Gartner's estimate of the Indian IT market


Here's looking into Gartner's estimate of the Indian IT market
  • According to IT research firm Gartner, IT end-user spending in India is expected to grow at a compound annual growth rate (CAGR) of 14.8 per cent from 2007 through 2012 to generate US $110 billion in 2012.
  • By 2012, Gartner predicts that the Indian market will ship nearly 24 million PCs, of which more than 50 per cent will be mobile PCs; in other words, by 2012 the Indian market will ship more mobile than desk-based PCs.
  • The Indian server revenue market is poised for 6.3 per cent CAGR from 2007 through 2012. This growth will be primarily fuelled by the growth in the x86 and the Itanium platform.
  • Non-x86 based platforms will continue to dominate the high-end and midrange server segments, especially in large enterprises such as the banking, telecommunications and government sectors. Feature-rich processors in x86 servers will account for the majority of server spending, followed by RISC/Itanium-based servers.
  • The total software market in India will reach $3.4 billion by 2012. India is one of the fastest-growing software markets, and it is expected to register the third highest CAGR of 16.3 per cent for the forecast period of 2007 through 2012. Software spending and plans for critical IT initiatives to attain business agility will remain unaltered.
  • Demand from vertical markets that see IT as a core business enabler, such as banking, finance and insurance, will drive the software market in India.
  • India's domestic IT services market is the fastest-growing market in Asia/Pacific with a CAGR of 20.2 per cent from 2007 through 2012, reaching $11.8 billion in 2012. The Indian IT services market is fragmented, and competition will intensify with new players entering into the market, forcing providers to find their market niche and differentiation to win deals. Top technology priorities include security, business intelligence and enterprise application projects.
  • By 2012, the telecommunications services market in India is expected to reach $52 billion growing at a CAGR of 26.8 per cent from 2007 through 2012. India will remain a wireless and voice-dominated market until at least 2011, and wireline technology will take a back seat. Broadband penetration will reach above 7 per cent by 2012, and Broadband Wireless Access based solutions will be the growth drivers.

August 31, 2008

Insight on Open Source




Freeware is software that someone creates and distributes unconditionally. The owner takes no responsibility for it, even if the software does something unwarranted to your system.

Free software / games give you the liberty to run, modify, distribute or improve the software. They also give you access to the program’s source code.

The following are the two major forces working toward open source.

(1) www.opensource.org (Open Source Initiative)

It is an independent group involved in open source community building and education.

(2) www.fsf.org (Free Software Foundation)

The FSF and OSI are committed to achieving common objectives:

  1. Give everyone the right to run, modify, distribute or improve software.
  2. Both organizations help software developers create licences that are compatible with their definitions.

Neither open source nor free software puts any restriction on commercialization.

Linux Distros:

· Various communities made use of the freely available Linux kernel and brought out their own operating system versions; these are called Linux distros.

· e.g. RedHat, Suse, Debian, CentOS

· Some are freely available and some are commercial.

· For example, Red Hat’s community version is “Fedora” and its commercial version is Red Hat Enterprise Linux.

· Likewise, for SUSE, the community version is openSUSE and the commercial one is SUSE Linux Enterprise.


Linux is Open Source but Open Source doesn’t necessarily mean Linux.

Some examples of Open Source Applications for both Linux and Windows:

Apache, Perl, MySQL

Dual Licensing and Open Source

The practice of distributing identical software under two different sets of terms and conditions, with or without monetary benefit.

Besides license compatibility and market segmentation, the biggest motivation for software developers to offer their software under a “dual licensing” model is to make money by monetizing their intellectual property.

A software developer may offer the same software at different prices to a single user for personal use and to groups of users or corporations. However, the software continues to remain open.

A dual licensing policy sustains innovation as well as growth.

e.g. MySQL

Killer Open Source ideas

Apache Web Server (Web Server)

The popularity of Apache is closely linked with the growth of the World Wide Web (WWW).

Functionality:

WebDAV support

URL redirection

CAPTCHA image processing

Firefox (Web Browser)

It grew out of the source code of the dying Netscape web browser and has made it really big.

Its popularity is due to

  • The feature constraints and security flaws of IE6.
  • Offered tabbed browsing and restricted pop-ups and ActiveX.
  • De facto browser for Linux applications.
  • Runs on Windows and Mac.
  • Most websites and portals have become Firefox compatible, besides IE.
  • More than 30% market share in the web browser arena.

High Performance Clustering (HPC)

  • Only big educational institutes and government research centers could afford such HPC systems in the ’80s and until the mid ’90s.
  • To build a common HPC setup, two things were required:

1. Commodity hardware

2. HPC middleware (which has been addressed by open source developers)

e.g. OSCAR, OpenMosix, ROCKS, Globus etc

  • In the early eighties, the only way to deploy a supercomputer or an HPC system was to go to highly specialized companies such as Cray.

Linux Kernel

Open Source OS – Linux was developed by Linus Torvalds.

Live distribution through live distros – one of the coolest things currently available in Linux.

Advantages:

You can use them to boot almost any machine into Linux without installing the OS or disturbing any content already existing on the hard disks of the machine.

Can be used for:

  • Learning Linux without sacrificing a Windows machine
  • Troubleshooting
  • Disaster Recovery
  • Network Management

MySQL

  • Web 2.0 is undoubtedly powered by MySQL.
  • Facebook, Flickr, YouTube and Wikipedia use it.
  • Runs on Windows, Linux, Mac and Unix.
  • Offered under a dual licensing scheme.
  • Native MySQL drivers (besides ODBC) exist for .NET, Java, C/C++, Ruby, Perl and PHP; a minimal connection sketch follows below.
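
As a hedged illustration of the driver point above, here is a minimal sketch using the MySQL Connector/Python driver (assumed installed as mysql-connector-python); the host, credentials, database and the "posts" table are placeholders, not part of any real setup.

```python
# Minimal MySQL access sketch using the MySQL Connector/Python driver
# (install as mysql-connector-python). Host, credentials, database and the
# "posts" table are placeholders for illustration only.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="app_user", password="secret", database="demo"
)
cursor = conn.cursor()
cursor.execute("SELECT id, title FROM posts ORDER BY id LIMIT 5")
for post_id, title in cursor.fetchall():
    print(post_id, title)
cursor.close()
conn.close()
```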

PHP (Hypertext Preprocessor)

· It grew from a niche programming platform to a worldwide popular method of programming for dynamic sites.

· Zend is the company behind PHP’s core engine.

· It now also supports IIS 7 along with Apache.

Samba

  • One of the first open source projects to successfully bridge the gap between Windows and Linux.
  • Allows easy access to files on a Windows network share.
  • Samba-based servers provide basic file-sharing services.

SendMail

  • The most widely used Mail Transfer Agent (MTA) on the Internet.
  • With the popularity of email, it went on to become the most popular program for mail routing and delivery.
  • In 1996, 80% of public email servers were running Sendmail.
  • Since then its usage has declined.
  • More recently, Sendmail 8.14.2 was in use at 29% of public mail servers.

Current Open Source usage:

  • Email – 43%
  • Database – 30%
  • Internet Gateway / Proxy – 27%
  • Linux on Desktop – 25%
  • Network Monitoring & Management – 21%
  • Firewall – 21%
  • AntiSpam – 19%
  • AntiVirus – 18%
  • Collaboration – 11%
  • ERP / CRM – 11%
  • Unified Communication – 8%
  • Virtualization – 7%
  • High Performance Clustering – 7%

August 19, 2008

Know Windows Vista



  • Virtually every Windows Vista edition (with the exception of Starter) will ship with both 32-bit (x86) and 64-bit (x64) versions

Parental Control

Parental controls, not present in Windows XP, will allow parents to control their children's PC activities without a fuss. In fact, parents will be delighted with Vista's ability to block objectionable content. Parents can also curtail access to specific applications and view where other users have been on the computer and on the Web.

User Account Control (UAC)

UAC is a new security feature that allows users to operate Windows Vista more as a standard user than as a true administrator, where one has complete access to everything.

Windows Firewall

Vista features two different firewalls: the standard firewall that was available in Windows XP and Windows Firewall with advanced features. The latter offers true firewall protection, including bi-directional filters, meaning that both incoming and outgoing data are scanned before being let through.

Windows Defender

Formerly known as Microsoft AntiSpyware, Windows Defender is the built-in spyware blocker bundled with Windows Vista. Windows Defender (already available for many months as a free standalone application for genuine Windows users) is an improved version of Microsoft AntiSpyware, and sports a simpler interface and very rapid access to scanning.

Booting process

The NT Boot Loader present in XP has been replaced by the Windows Boot Manager. For security reasons, Vista restricts all users, including the administrator, from storing their own files as well as application files directly on the Windows boot drive (such as the C: drive).

Features

Windows Aero : Aero is a new feature that is enabled if the system contains a graphics card that supports DirectX 9.0 or higher. This feature, again, is not present in Windows XP.

Windows Sidebar : Vista brings a sidebar similar to the sidebar in MS-Office 97. Programmes can be quickly accessed through customizable buttons provided for the purpose. One need not navigate through Start > Programmes > Programme Group > Programme Name to run a programme.

Windows Search

One of the most helpful new features in Windows Vista is the new, vastly improved search for files or applications from almost anywhere. Unlike in Windows XP, with Windows Search you can simply type a few letters of your search request and the results appear on the fly, a very helpful feature when looking for a file or application from the Start menu. Google, watch your desktop!

Enhanced User Experience

Start menu : Microsoft has redesigned the desktop items, such as the Start menu. The Start button on the taskbar is similar in look to the Windows XP Start button. However, the default colour of the taskbar has been changed: instead of classic blue, it is now coffee black.

The Start menu displays everything within the context of a single menu.

Live Icons : When Windows Aero starts, users can hover their mouse over open windows in the taskbar and see a live representation of what's running in them.

The Vista disc is a DVD, not a CD-ROM anymore!