The modern term “information technology” was coined by Harold J. Leavitt and Thomas L. Whisler. In a 1958 Harvard Business Review article they wrote, “The new technology does not yet have a single established name. We shall call it information technology.”[2]
Information technology encompasses every process in the flow of information, including data collection, processing, storage, search, transmission, and reception. In the information society, IT is one of the most essential industries; as technology advances, it has become an indispensable part of daily life and a foundation for emerging industries. The development of the internet made it possible for the world to be connected, and IT continues to create a growing number of promising occupations and advances in science and technology.[3]
The term is now used to refer to many other aspects of technology, and it covers far more fields of study than it did in the past.
Four basic periods
The history of information technology can be divided into four basic periods, each characterized by a principal technology used to solve the input, processing, output, and communication problems of the time: premechanical, mechanical, electromechanical, and electronic.
A. The Premechanical Age: 3000 B.C. – 1450 A.D.
Writing and Alphabets: communication.
The Greeks later adopted the Phoenician alphabet and added vowels; the Romans gave the letters Latin names to create the alphabet we use today.
Paper and Pens—input technologies.
Sumerians' input technology was a stylus that could scratch marks in wet clay.
About 2600 B.C., the Egyptians wrote on papyrus, made from the papyrus plant.
Around 100 A.D., the Chinese made paper from rags, on which modern-day papermaking is based.
Books and Libraries: Permanent Storage Devices.
Religious leaders in Mesopotamia kept the earliest "books"
The Egyptians kept scrolls
Around 600 B.C., the Greeks began to fold sheets of papyrus vertically into leaves and bind them together.
The First Numbering Systems.
Egyptian system: the numbers 1–9 as vertical lines, the number 10 as a U or circle, the number 100 as a coiled rope, and the number 1,000 as a lotus blossom.
The first place-value numbering systems similar to those in use today were invented between 100 and 200 A.D. in India, whose mathematicians created a nine-digit numbering system.
Around 875 A.D., the concept of zero was developed. (A short sketch contrasting the additive and place-value systems follows below.)
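The contrast between the two systems is easiest to see side by side. The following Python sketch is an illustration added here, not part of any historical source; the symbol names are ASCII stand-ins for the hieroglyphs. The Egyptian system simply repeats symbols additively, while the place-value system gives each digit a meaning that depends on its position, with zero acting as a placeholder.

```python
# Egyptian additive numerals vs. place-value notation (illustrative sketch).

EGYPTIAN_SYMBOLS = [
    (1000, "lotus"),   # lotus blossom
    (100, "rope"),     # coiled rope
    (10, "arch"),      # U shape or circle
    (1, "stroke"),     # vertical line
]

def to_egyptian(n: int) -> str:
    """Additive system: repeat each symbol as many times as needed."""
    parts = []
    for value, name in EGYPTIAN_SYMBOLS:
        count, n = divmod(n, value)
        parts.extend([name] * count)
    return " ".join(parts)

def place_values(n: int) -> list:
    """Place-value system: each digit's meaning depends on its position."""
    digits = [int(d) for d in str(n)]
    return [(d, 10 ** (len(digits) - i - 1)) for i, d in enumerate(digits)]

print(to_egyptian(2304))
# lotus lotus rope rope rope stroke stroke stroke stroke
print(place_values(2304))
# [(2, 1000), (3, 100), (0, 10), (4, 1)] -- the 0 is a pure placeholder
```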
B. The Mechanical Age: 1450 – 1840.
Johann Gutenberg, of Mainz, Germany, invented the movable metal-type printing process in 1450.
The development of book indexes and the widespread use of page numbers.
The first general-purpose "computers" were actually people who held the job title "computer": one who works with numbers.
Slide Rules, the Pascaline and Leibniz's Machine.
Early 1600s, William Oughtred, an English clergyman, invented the slide rule.
C. The Electromechanical Age: 1840 – 1940.
The discovery of ways to harness electricity was the key advance made during this period. Knowledge and information could now be converted into electrical impulses.
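The telegraph is the canonical example of this conversion: text becomes a pattern of short and long electrical pulses. Below is a minimal Python sketch of the idea using a small excerpt of the International Morse Code table (the function name and formatting choices are illustrative only).

```python
# Excerpt of the International Morse Code table (not the full alphabet).
MORSE = {
    "A": ".-", "E": ".", "H": "....", "I": "..",
    "N": "-.", "O": "---", "S": "...", "T": "-",
}

def to_pulses(message: str) -> str:
    """Encode a message as dots (short pulses) and dashes (long pulses)."""
    return " / ".join(                       # "/" separates words
        " ".join(MORSE[ch] for ch in word)   # spaces separate letters
        for word in message.upper().split()
    )

print(to_pulses("SOS"))      # ... --- ...
print(to_pulses("AT NOON"))  # .- - / -. --- --- -.
```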
D. The Electronic Age: 1940 – Present.
British scientists used von Neumann's 1945 report describing the stored-program EDVAC design and outpaced the Americans.
Max Newman headed up the effort at Manchester University, where the prototype Manchester Mark I ran its first program in June 1948, becoming the first stored-program computer.
Maurice Wilkes, a British scientist at Cambridge University, completed the EDSAC (Electronic Delay Storage Automatic Calculator) in 1949—two years before EDVAC was finished.
Thus, EDSAC became the first stored-program computer in general use (i.e., not a prototype).
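"Stored-program" means that instructions live in the same rewritable memory as the data they operate on, so a machine can run a new program by loading it rather than by being rewired. The toy Python sketch below illustrates the concept with an invented four-instruction set; nothing here reflects the actual instruction formats of the Mark I, EDSAC, or EDVAC.

```python
# Toy stored-program machine: instructions and data share one memory.
# Invented instruction set: ("LOAD", addr), ("ADD", addr),
# ("STORE", addr), ("HALT",).

def run(memory: list) -> list:
    acc = 0  # accumulator
    pc = 0   # program counter, starts at address 0
    while True:
        op, *args = memory[pc]  # fetch and decode
        pc += 1
        if op == "LOAD":
            acc = memory[args[0]]
        elif op == "ADD":
            acc += memory[args[0]]
        elif op == "STORE":
            memory[args[0]] = acc
        elif op == "HALT":
            return memory

# Addresses 0-3 hold the program; addresses 4-6 hold the data.
memory = [
    ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT",),
    2, 3, 0,
]
print(run(memory)[6])  # 5 -- the sum landed back in memory
```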
The First General-Purpose Computer for Commercial Use: Universal Automatic Computer (UNIVAC).
Late 1940s, Eckert and Mauchly began the development of a computer called UNIVAC (Universal Automatic Computer).
Their company was acquired by Remington Rand.
The first UNIVAC was delivered to the Census Bureau in 1951.
However, a machine called LEO (Lyons Electronic Office) went into action a few months before UNIVAC, making it the world's first commercial computer.
The Second Generation (1959–1963).
Vacuum tubes were replaced by transistors as the main logic element.
AT&T's Bell Laboratories, in the 1940s
Crystalline mineral materials called semiconductors could be used in the design of a device called a transistor
Magnetic tape and disks began to replace punched cards as external storage devices.
Magnetic cores (very small donut-shaped magnets that could be polarized in one of two directions to represent data) strung on wire within the computer became the primary internal storage technology.
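Because each core can be polarized in only one of two directions, a single core holds exactly one binary digit, and a character occupies a short string of cores. Below is a rough Python sketch of the mapping; it uses a modern 8-bit character code for simplicity, whereas real core machines used a variety of word and character sizes.

```python
# Each core's polarity (one of two magnetization directions) stands for one bit.
def char_to_cores(ch: str) -> list:
    bits = format(ord(ch), "08b")  # 8 bits per character (modern simplification)
    return ["clockwise" if b == "1" else "counterclockwise" for b in bits]

print(format(ord("A"), "08b"))  # 01000001 -> one character needs 8 cores
print(char_to_cores("A")[:2])   # ['counterclockwise', 'clockwise']
```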
High-level programming languages were developed, e.g., FORTRAN and COBOL.
The Third Generation (1964–1979).
Individual transistors were replaced by integrated circuits.
Magnetic tape and disks completely replaced punched cards as external storage devices.
Magnetic core internal memories began to give way to a new form, metal oxide semiconductor (MOS) memory, which, like integrated circuits, was built on silicon chips.
Operating systems emerged, and advanced programming languages like BASIC were developed.
The Fourth Generation (1979–Present).
Large-scale and very-large-scale integrated circuits (LSI and VLSI).
Microprocessors that contained memory, logic, and control circuits (an entire CPU, or central processing unit) on a single chip allowed for home-use personal computers, or PCs, like the Apple (II and Mac) and the IBM PC.
The Apple II, created by Steve Wozniak and Steve Jobs, was released to the public in 1977; it initially sold for $1,195 (without a monitor) and had 16K of RAM.
First Apple Mac released in 1984.
The IBM PC, introduced in 1981, debuted with MS-DOS (Microsoft Disk Operating System).
Fourth-generation language software products appeared, e.g., VisiCalc, Lotus 1-2-3, dBase, Microsoft Word, and many others.
Graphical user interfaces (GUIs) for PCs arrived in the early 1980s, though Windows would not take off until version 3 was released in 1990.
A Bachelor of Information Technology (abbreviated BIT, BInfTech, B.Tech(IT), or BE(IT)) is an undergraduate academic degree that generally requires three to five years of study.[6] While the degree has a major focus on computers and technology, it differs from a computer science degree in that students are also expected to study management and information science, and there are reduced requirements for mathematics. Graduates often go on to pursue an MBA in IT, typically a two-year degree,[7] to attain managerial roles and advance their careers. A degree in computer science can be expected to concentrate on the scientific aspects of computing, while a degree in information technology can be expected to concentrate on the business and communication applications of computing; the latter two areas receive more emphasis in electronic commerce, e-business, and business information technology undergraduate courses. Specific names for the degrees vary across countries, and even across universities within a country.
This is in contrast to a Bachelor of Science in Information Technology, a bachelor's degree typically conferred after three to four years of undergraduate study in information technology (IT).[8] The degree itself is a Bachelor of Science, conferred by institutions offering programs in information technology and related fields.
Many employers require software developers or programmers to have a Bachelor of Science in Computer Science; however, those hiring for positions such as network administrator or database manager often require a Bachelor of Science in Information Technology or an equivalent degree.[9] Graduates with an information technology background are able to perform technology tasks relating to the processing, storing, and communication of information between computers, mobile phones, and other electronic devices. Information technology as a field emphasizes the secure management of large amounts of diverse information and its accessibility via a wide variety of systems, both local and worldwide.[10]
"Network and Computer Systems Administrators". Occupational Outlook Handbook. United States Bureau of Labor Statistics. 2012-03-29. Retrieved 2013-12-01.