29 November 2011

About Us

An astute professional with 6 years of proven success in Networking & System Administration, with technical expertise in the implementation, operations and support of mission-critical business solutions using IT as a tool. Adept in handling networking operations pertaining to LAN, WAN, network security, servers, storage area networks, network operating systems, and networking device administration and maintenance in multi-platform environments. Configured Microsoft clustering and network load balancing. Acquired certifications and training from major IT vendors such as Cisco and Microsoft. Proficient in managing the installation, configuration, maintenance, migration and monitoring of networks, with a focus on systems/network administration (computers, peripherals, operating systems) in multi-platform environments, ensuring maximum uptime.

Expertise in mapping the requirements of clients as well as other stakeholders, suppliers, service providers and vendors; customizing asset/hardware upgrades and processes; and offering comprehensive, innovative solutions in line with guidelines specified by the client. Deft in working on Windows 2012/2008 Active Directory configuration and maintenance; Exchange 2010 Server installation, configuration and maintenance; DNS and DHCP server installation, configuration and maintenance; and Windows 2012/2008 server backup and recovery solutions.

Skilled in enhancing network systems & provisioning support for system engineering activities entailing mapping business processes, studying workflow to design technical solutions, ensuring business functionality & enhancing competitive advantage. An effective communicator with exceptional relationship management skills. Demonstrated ability to relate comfortably to people at any level of business and management; significant experience working with steering committees and other project managers.

29 May 2010

What is FTP?


File Transfer Protocol (FTP) is a standard network protocol used to copy a file from one host to another over a TCP/IP-based network, such as the Internet. FTP is built on a client-server architecture and utilizes separate control and data connections between the client and server applications, which solves the problem of different end-host configurations (i.e., operating system, file names). FTP is used with user-based password authentication or with anonymous user access.
FTP applications were originally interactive command-line tools with a standardized command syntax, but graphical user interfaces have since been developed for all desktop operating systems in use today. Ports: 20 (FTP data) and 21 (FTP control).
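As a concrete sketch of the basics above, Python's standard ftplib speaks FTP over the control connection on port 21; the host, credentials and paths below are placeholders, not real servers:

```python
from ftplib import FTP

def download_file(host: str, remote_path: str, local_path: str,
                  user: str = "anonymous",
                  password: str = "guest@example.com") -> None:
    """Fetch a single file over FTP (host and paths are placeholders)."""
    with FTP(host) as ftp:           # opens the control connection on port 21
        ftp.login(user, password)    # user/password or anonymous authentication
        with open(local_path, "wb") as fh:
            # RETR causes a separate data connection to carry the file bytes
            ftp.retrbinary(f"RETR {remote_path}", fh.write)
```

The library hides the second (data) connection entirely; the caller only ever sees the control-channel session object.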

History

The original specification for the File Transfer Protocol was published as RFC 114 on 16 April 1971 and later replaced by RFC 765 (June 1980) and RFC 959 (October 1985), the current specification. Several proposed standards amend RFC 959, for example RFC 2228 (June 1997) proposes security extensions and RFC 2428 (September 1998) adds support for IPv6 and defines a new type of passive mode.

Protocol overview

The protocol is specified in RFC 959, which is summarized below.
A client makes a connection to the server on TCP port 21. This connection, called the control connection, remains open for the duration of the session, with a second connection, called the data connection, opened on port 20 as required to transfer file data. The control connection is used to send administrative data (i.e., commands, identification, passwords). Commands are sent by the client over the control connection in ASCII and terminated by a carriage return and line feed. For example, "RETR filename" would transfer the specified file from the server to the client. Due to this two-port structure, FTP is considered an out-of-band protocol, as opposed to an in-band protocol such as HTTP.
The server responds on the control connection with three-digit status codes in ASCII, with an optional text message; for example, "200" (or "200 OK") means that the last command was successful. The number is the status code, and the optional text provides a human-readable explanation or any needed parameters. A file transfer in progress over the data connection can be aborted using an interrupt message sent over the control connection.
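As a small illustration of the reply format, here is a hypothetical parser for a single-line control-channel reply (multi-line replies, which the protocol also allows, are ignored for simplicity):

```python
def parse_reply(line: str) -> tuple[int, str]:
    """Split a one-line FTP reply such as '200 Command okay.' into its
    three-digit status code and optional human-readable text."""
    code, _, text = line.partition(" ")
    if len(code) != 3 or not code.isdigit():
        raise ValueError(f"malformed FTP reply: {line!r}")
    return int(code), text

# A 2xx code means the previous command succeeded.
assert parse_reply("200 Command okay.") == (200, "Command okay.")
```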
FTP can be run in active mode or passive mode, which control how the second connection is opened. In active mode the client sends the server the IP address and port number that the client will use for the data connection, and the server opens the connection. Passive mode was devised for use where the client is behind a firewall and unable to accept incoming TCP connections: the server sends the client an IP address and port number, and the client opens the connection to the server. Both modes were updated in September 1998 to add support for IPv6, along with some other changes to passive mode, making it extended passive mode.
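In active mode, the client's address and port travel inside the PORT command as six decimal bytes. A sketch of that encoding (IPv4 only; the EPRT/EPSV commands from the 1998 update handle IPv6):

```python
def port_argument(ip: str, port: int) -> str:
    """Encode an IPv4 address and TCP port as the FTP PORT command
    argument: h1,h2,h3,h4,p1,p2 where port = p1*256 + p2."""
    h1, h2, h3, h4 = ip.split(".")
    return f"{h1},{h2},{h3},{h4},{port // 256},{port % 256}"

# 50000 = 195*256 + 80
assert port_argument("192.168.0.10", 50000) == "192,168,0,10,195,80"
```

This textual, variable-length encoding is also the reason NAT devices struggle with FTP, as discussed later in this post.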
While transferring data over the network, four data representations can be used:
  • ASCII mode: used for text. Data is converted, if needed, from the sending host's character representation to "8-bit ASCII" before transmission, and (again, if necessary) to the receiving host's character representation. As a consequence, this mode is inappropriate for files that contain numeric data in binary, floating point or binary coded decimal form.
  • Image mode (commonly called Binary mode): the sending machine sends each file byte for byte and as such the recipient stores the bytestream as it receives it. (Image mode support has been recommended for all implementations of FTP).
  • EBCDIC mode: used for plain text between hosts using the EBCDIC character set. This mode is otherwise like ASCII mode.
  • Local mode: allows two computers with identical setups to send data in a proprietary format without the need to convert it to ASCII.
For text files, different format control and record structure options are provided. These features were designed to facilitate files containing Telnet or ASA formatting.
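As a rough illustration of why ASCII mode is unsafe for binary files: text travels on the wire with CRLF line endings, and each side rewrites line endings to its local convention. A simplified sketch of that conversion (real clients also perform character-set translation):

```python
def to_wire_ascii(text: str) -> bytes:
    """Rewrite local '\n' line endings to the CRLF form used on the wire."""
    return text.replace("\n", "\r\n").encode("ascii")

def from_wire_ascii(data: bytes) -> str:
    """Rewrite wire-format CRLF line endings back to local '\n'."""
    return data.decode("ascii").replace("\r\n", "\n")

# Any byte sequence in a binary file that happens to look like a line
# ending would be silently rewritten, corrupting the file.
assert to_wire_ascii("line 1\nline 2\n") == b"line 1\r\nline 2\r\n"
assert from_wire_ascii(b"line 1\r\nline 2\r\n") == "line 1\nline 2\n"
```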
Data transfer can be done in any of three modes:
  • Stream mode: Data is sent as a continuous stream, relieving FTP from doing any processing. Rather, all processing is left up to TCP. No End-of-file indicator is needed, unless the data is divided into records.
  • Block mode: FTP breaks the data into several blocks (block header, byte count, and data field) and then passes it on to TCP.
  • Compressed mode: Data is compressed using a single algorithm (usually Run-length encoding).

Security

The original FTP specification has many security concerns. In May 1999, the following flaws were addressed:
  • Bounce Attacks
  • Spoof Attacks
  • Brute Force Attacks
  • Sniffing
  • Username Protection
  • Port Stealing
FTP provides no encryption, meaning all transmissions are in clear text; user names, passwords, FTP commands and transferred files can be read by anyone sniffing the network. This problem is common to many Internet protocol specifications written prior to the creation of SSL, such as HTTP, SMTP and Telnet. The common solution is to use either SFTP (SSH File Transfer Protocol) or FTPS (FTP over SSL/TLS), which adds SSL or TLS encryption to FTP as specified in RFC 4217.
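With Python's ftplib, the FTPS variant (explicit TLS per RFC 4217) might look like the following sketch; the host and credentials are placeholders:

```python
from ftplib import FTP_TLS

def secure_listing(host: str, user: str, password: str) -> list[str]:
    """List the current directory over explicit FTPS (RFC 4217)."""
    with FTP_TLS(host) as ftps:
        ftps.login(user, password)  # login() upgrades the control channel to TLS first
        ftps.prot_p()               # switch the data channel to TLS as well
        return ftps.nlst()
```

Note that without the prot_p() call only the control channel is protected; file contents would still cross the network in the clear.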

Anonymous FTP

A host that provides an FTP service may additionally provide anonymous FTP access. Users typically log into the service with an 'anonymous' account when prompted for a user name. Although users are commonly asked to send their email address in lieu of a password, no verification is actually performed on the supplied data.
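With ftplib, anonymous access needs no explicit credentials: calling login() with no arguments sends the user name 'anonymous' and a conventional placeholder password. The host below is a placeholder:

```python
from ftplib import FTP

def anonymous_listing(host: str) -> list[str]:
    """List the login directory of an anonymous FTP server."""
    with FTP(host) as ftp:
        ftp.login()       # defaults to user 'anonymous', password 'anonymous@'
        return ftp.nlst()
```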

Remote FTP or FTPmail

Where FTP access is restricted, a remote FTP (or FTPmail) service can be used to circumvent the problem. An e-mail containing the FTP commands to be performed is sent to a remote FTP server, which is a mail server that parses the incoming e-mail, executes the FTP commands, and sends back an e-mail with any downloaded files as an attachment. Obviously this is less flexible than an FTP client, as it is not possible to view directories interactively or to modify commands, and there can also be problems with large file attachments in the response not getting through mail servers. As most internet users these days have ready access to FTP, this procedure is no longer in everyday use.

Web browser support

Most recent web browsers can retrieve files hosted on FTP servers, although they may not support protocol extensions such as FTPS. When an FTP URL, rather than an HTTP URL, is supplied, the accessible contents of the remote server are presented in a manner similar to that used for other Web content.
FTP URL syntax is described in RFC 1738, taking the form:
ftp://[user[:password]@]host[:port]/path
(The bracketed parts are optional.)
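Python's standard urllib.parse understands this URL form; the URL below is purely illustrative:

```python
from urllib.parse import urlparse

# Pull the optional user, password and port out of an FTP URL.
parts = urlparse("ftp://reader:secret@ftp.example.com:2121/pub/readme.txt")
assert parts.scheme == "ftp"
assert parts.username == "reader" and parts.password == "secret"
assert parts.hostname == "ftp.example.com" and parts.port == 2121
assert parts.path == "/pub/readme.txt"
```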
By default, most web browsers use passive (PASV) mode, which more easily traverses end-user firewalls.

NAT traversal

The representation of the IP addresses and port numbers in the PORT command and PASV reply poses a challenge to FTP in traversing Network address translators (NAT). The NAT device must alter these values, so that they contain the IP address of the NAT-ed client, and a port chosen by the NAT device for the data connection. The new address and port will probably differ in length in their decimal representation from the original address and port. Such translation is not usually performed in most NAT devices, but special application layer gateways exist for this purpose.

FTP over SSH (not SFTP)

FTP over SSH (not SFTP) refers to the practice of tunneling a normal FTP session over an SSH connection.
Because FTP uses multiple TCP connections (unusual for a TCP/IP protocol that is still in use), it is particularly difficult to tunnel over SSH. With many SSH clients, attempting to set up a tunnel for the control channel (the initial client-to-server connection on port 21) will protect only that channel; when data is transferred, the FTP software at either end will set up new TCP connections (data channels), which bypass the SSH connection, and thus have no confidentiality, integrity protection, etc.
Otherwise, it is necessary for the SSH client software to have specific knowledge of the FTP protocol, so that it can monitor and rewrite FTP control-channel messages and autonomously open new forwardings for FTP data channels. Version 3 of SSH Communications Security's software suite, the GPL-licensed FONC, and Co:Z FTPSSH Proxy are three software packages that support this mode.
FTP over SSH is sometimes referred to as secure FTP; this should not be confused with other methods of securing FTP, such as with SSL/TLS (FTPS). Other methods of transferring files using SSH that are not related to FTP include SFTP and SCP; in each of these, the entire conversation (credentials and data) is always protected by the SSH protocol.

10 May 2010

Top 15 Highest Paying Certifications in Technology Industry...!



1. PMI Project Management Professional (PMP)

With an average annual salary of $101,695, the PMP certification from the Project Management Institute (PMI) organization tops the list of highest paying certifications.
2. PMI Certified Associate in Project Management (CAPM)

Next highest on the list of highest paying certifications is PMI's Certified Associate in Project Management (CAPM). The average annual salary for CAPM holders that were surveyed is $101,103.
3. ITIL v2 - Foundations

With an annual average salary of $95,415 the ITIL v2 Foundations certification came up third on the list of highest paying certifications. ITIL stands for the IT Infrastructure Library. The ITIL certification is designed to show expertise in ITIL service support and service delivery.
4. Certified Information Systems Security Professional (CISSP)

Coming in at a close 4th on the list of highest paying certifications is the Certified Information Systems Security Professional or CISSP certification from (ISC)². The average annual reported salary was $94,018.
5. Cisco CCIE Routing and Switching

At $93,500 per year average annual salary, the Cisco CCIE Routing and Switching certification came in 5th on the list of highest paying certifications in the technology industry.

6. Cisco CCVP - Certified Voice Professional
Number six on the list of the highest paying certifications is the Cisco CCVP or Cisco Certified Voice Professional. The average annual salary of CCVP respondents was $88,824.

7. ITIL v3 - ITIL Master
The ITIL v3 certification - the ITIL Master - came in 7th on the list of the highest paying technical certifications. The average annual salary for ITIL Master certification holders was $86,600.

8. MCSD - Microsoft Certified Solution Developer
The MCSD or Microsoft Certified Solution Developer certification pays an average of $84,522. This puts the MCSD certification at number 8 on the list of highest paying certifications in technology.

9. Cisco CCNP - Cisco Certified Network Professional
Cisco Certified Network professional or CCNP certification is number 9 on the list of highest paying technical certifications. The average annual salary reported by CCNP holders is $84,161.

10. Red Hat Certified Engineer
The Red Hat Certified Engineer (RHCE) came in at number 10 on the list of highest paying certifications. The average annual salary reported by Red Hat Certified Engineers is $83,692.

11. MCITP - Microsoft Certified IT Professional (Enterprise)
The MCITP certification (Enterprise), or Microsoft Certified IT Professional - Enterprise Support, comes in at number 11 on the list of highest paying technical certifications. (The MCITP Database is number 14, see below). The average MCITP Enterprise salary reported was $82,941.

12. Cisco CCSP - Cisco Certified Security Professional
Coming in at number 12 on the list of the highest paying technical certifications is the Cisco CCSP or Cisco Certified Security Professional. The average annual salary reported by CCSP holders is $80,000.

13. MCAD - Microsoft Certified Applications Developer
With an average annual salary of $79,444, the MCAD certification, or Microsoft Certified Application Developer certification, is number 13 on the list of highest paying certifications in technology.

14. MCITP - Microsoft Certified IT Professional (Database)
The MCITP certification (Database), or Microsoft Certified IT Professional - Database, comes in at number 14 on the list of highest paying technical certifications. (The MCITP Enterprise Support is number 11, above). The average MCITP Database salary reported was $77,000.

15. MCDBA - Microsoft Certified Database Administrator
The Microsoft Certified Database Administrator, or MCDBA, comes in at number 15 on the list of highest paying technical certifications. The average annual salary reported by MCDBA respondents is $76,960.

20 March 2010

HISTORY OF NETWORK..!


A lot of the West Coast hackers belonged to the Homebrew Computer Club, founded by Lee Felsenstein. Lee had actually begun networking computers before the development of the PC, with his Community Memory project in the late 1970s. This system had dumb terminals (like computer screens with keyboards connected to one large computer that did the processing). These were placed in laundromats, the Whole Earth Access store, and community centres in San Francisco. This network used permanent links over a small geographical area rather than telephone lines and modems.

The first public bulletin board using personal computers and modems was written by Ward Christensen and Randy Seuss in Chicago in 1978 for the early amateur computers. It was about 1984 that the first bulletin boards using the IBM (Bill Gates/Microsoft) operating system and Apple operating systems began to be used. The most popular of these was FidoNet.

At that time the Internet technologies were only available on the UNIX computer operating system, which wasn't available on PCs. A piece of software called ufgate, developed by Tim Pozar, was one of the first bridges to connect the Fidonet world to the Internet world. An alternative approach undertaken by Scott Weikart and Steve Fram for the Association for Progressive Communications saw UNIX being made available on special low cost PCs in a distributed network.

In the community networking field, early systems included PEN (Public Electronic Network) in Santa Monica, the WELL (Whole Earth 'Lectronic Link) in the Bay Area of San Francisco, Big Sky Telegraph, and a host of small businesses with online universities, community bulletin boards, artists' networks, seniors' clubs, women's networks, etc.

Gradually, as the 1980s came to a close, these networks also began joining the Internet for connectivity and adopted the TCP/IP standard. Now the PC networks and the academic networks were joined, and a platform was available for rapid global development.

By 1989 many of the new community networks had joined the Electronic Networkers Association, which preceded the Internet Society as the association for network builders. When they met in San Francisco in 1989, there was a lot of activity, plus some key words emerging: connectivity and interoperability. Not surprisingly in the California hippy culture of the time, the visions for these new networks included peace, love, joy, Marshall McLuhan's global village, the paperless office, electronic democracy, and probably Timothy Leary's Home Page. However, new large players such as America Online (AOL) were also starting to make their presence felt, and a more commercial future was becoming obvious. Flower power gave way to communications protocols, and Silicon Valley just grew and grew.

PEN (The Public Electronic Network) in Santa Monica may be able to claim the mantle of being the first local-government-based network of any size. Run by the local council, and conceived as a means for citizens to keep in touch with local government, its services included forms, access to the library catalogue, city and council information, and free email.

PEN started in February 1989, and by July 1991 had 3,500 users. One of the stories PEN told about the advantages of its system was the consultations it held with the homeless people of Santa Monica. The local council decided that it would be good to consult the homeless to find out what the city government could do for them. The homeless came back via email with simple needs: showers, washing facilities, and lockers. Santa Monica, a city of 96,000 people at the time, was able to take this on board and provide some basic dignity for the homeless, and at a pretty low cost. This is probably the first example of electronic democracy in action.

Meanwhile, back in the academic and research world, there were many others who wanted to use the growing network but could not because of military control of ARPANET. Computer scientists at universities without defence contracts obtained funding from the National Science Foundation to form CSNet (the Computer Science Network). Other academics who weren't computer scientists also began to show interest, so this soon became known as the "Computer and Science Network". In the early days, however, only a few academics used the Internet at most universities. It was not until the 1990s that Internet penetration in academic circles became at all significant.

Because of fears of hackers, the Department of Defense created a new separate network, MILNet, in 1982. By the mid-1980s, ARPANET was being phased out. The role of connecting university and research networks was taken over by CSNet, later to become the NSF (National Science Foundation) Network.

The NSFNet was to become the U.S. backbone for the global network known as the Internet, and a driving force in its early establishment. By 1989 ARPANET had disappeared, but the Information Superhighway was just around the corner.