Computing

Computing includes the study of and experimentation with algorithmic processes, and the development of both hardware and software.[1]

The earliest known tool for use in computation is the abacus, which is thought to have been invented in Babylon between 2700 and 2300 BC.

The first recorded proposal for using digital electronics in computing was the 1931 paper "The Use of Thyratrons for High Speed Automatic Counting of Physical Phenomena" by C. E. Wynn-Williams.[4]

Claude Shannon's 1938 paper "A Symbolic Analysis of Relay and Switching Circuits" then introduced the idea of using electronics for Boolean algebraic operations.
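As an illustrative sketch of that idea (not Shannon's own notation), the correspondence between switching circuits and Boolean algebra can be written out in a few lines of Python, with switches as Boolean variables, series wiring as AND, parallel wiring as OR, and a normally-closed contact as NOT:

    def series(a: bool, b: bool) -> bool:
        """Two switches in series conduct only if both are closed (AND)."""
        return a and b

    def parallel(a: bool, b: bool) -> bool:
        """Two switches in parallel conduct if either is closed (OR)."""
        return a or b

    def inverted(a: bool) -> bool:
        """A normally-closed contact conducts when the coil is off (NOT)."""
        return not a

    # Exhaustively check one Boolean identity such circuits obey:
    # De Morgan's law, NOT(a AND b) == (NOT a) OR (NOT b).
    for a in (False, True):
        for b in (False, True):
            assert inverted(series(a, b)) == parallel(inverted(a), inverted(b))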

However, early junction transistors were relatively bulky devices that were difficult to mass-produce, which limited them to a number of specialised applications.[7]

The same program in its human-readable source code form enables a programmer to study and develop a sequence of steps known as an algorithm.
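For example, Euclid's algorithm for the greatest common divisor (a standard textbook example, not specific to this article) can be written as source code whose steps a programmer can read and study directly:

    def gcd(a: int, b: int) -> int:
        """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
        until the remainder is zero; the last nonzero value is the GCD."""
        while b != 0:
            a, b = b, a % b
        return a

    print(gcd(48, 18))  # -> 6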

Software is a set of programs, procedures, and algorithms, together with their documentation, concerned with the operation of a data processing system.

Frequently used development tools such as compilers, linkers, and debuggers are classified as system software.
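To make the role of a compiler concrete, the following sketch uses Python's own built-in compile() function and bundled dis module as the development tools; the source string is an invented example:

    import dis

    source = "total = sum(n * n for n in range(4))"
    code = compile(source, "<example>", "exec")  # source text -> bytecode

    dis.dis(code)        # a debugger-like view of the compiled instructions
    namespace = {}
    exec(code, namespace)
    print(namespace["total"])  # -> 14

The same pattern, translation of human-readable source into a lower-level executable form, is what standalone compilers and linkers perform for languages such as C.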

Some users are satisfied with the bundled apps and need never install additional applications.

Networks may be classified according to a wide variety of characteristics, such as the medium used to transport the data, the communications protocol used, scale, topology, and organizational scope.

One well-known communications protocol is Ethernet, a hardware and link layer standard that is ubiquitous in local area networks.
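As a concrete illustration (the frame bytes below are hand-constructed example values, not captured traffic), the fixed 14-byte Ethernet II header can be decoded with Python's standard struct module:

    import struct

    frame = bytes.fromhex(
        "ffffffffffff"   # destination MAC (broadcast)
        "005056c00008"   # source MAC (example value)
        "0800"           # EtherType 0x0800 = IPv4
    ) + b"...payload..."

    # The link-layer header: 6-byte destination, 6-byte source, 2-byte type.
    dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])

    def mac(addr: bytes) -> str:
        return ":".join(f"{b:02x}" for b in addr)

    print("dst:", mac(dst))                 # ff:ff:ff:ff:ff:ff
    print("src:", mac(src))                 # 00:50:56:c0:00:08
    print(f"ethertype: 0x{ethertype:04x}")  # 0x0800 (IPv4)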

The Internet includes millions of private, public, academic, business, and government networks, ranging in scope from local to global.

The Internet carries an extensive range of information resources and services, such as the inter-linked hypertext documents of the World Wide Web and the infrastructure to support email.
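A minimal sketch of retrieving one such hypertext document over the Internet, using Python's standard urllib (it assumes network access, and the reserved example domain example.com is not taken from this article):

    from urllib.request import urlopen

    with urlopen("https://example.com/") as response:
        print(response.status)                  # 200 on success
        html = response.read().decode("utf-8")

    print(html[:60], "...")  # the start of the HTML hypertext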

However, the term programmer may apply to a wide range of skill and program quality, from hacker to open source contributor to professional.[31]

It is also possible for a single programmer to do most or all of the computer programming needed to produce a proof of concept and launch a new "killer application".

A programmer's primary computer language (C, C++, Java, Lisp, Python, etc.) is often prefixed to these titles, and those who work in a web environment often prefix their titles with web.

However, members of these professions typically possess other software engineering skills beyond programming.

Software development, a widely used and more generic term, does not necessarily subsume the engineering paradigm.[39][40][41]

The subfields of computer science can be divided into practical techniques for implementation and application in computer systems, and purely theoretical areas.[43]

All information systems (IS) degrees combine business and computing topics, but the balance between technical and organizational issues varies among programs.

The field of Computer Information Systems (CIS) studies computers and algorithmic processes, including their principles, their software and hardware designs, their applications, and their impact on society,[55][56] while IS emphasizes functionality over design.[52][53][54]

Information technology (IT) is the application of computers and telecommunications equipment to store, retrieve, transmit, and manipulate data,[58] often in the context of a business or other enterprise.[57]
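The activities in this definition can be sketched with Python's bundled sqlite3 and json modules (the inventory table and its values are invented for illustration):

    import json
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE inventory (item TEXT, quantity INTEGER)")

    # Store data.
    conn.executemany(
        "INSERT INTO inventory VALUES (?, ?)",
        [("widget", 10), ("gadget", 4)],
    )

    # Manipulate data.
    conn.execute(
        "UPDATE inventory SET quantity = quantity + 5 WHERE item = 'widget'"
    )

    # Retrieve data.
    rows = conn.execute("SELECT item, quantity FROM inventory").fetchall()

    # Serialize into a form suitable for transmission over a network.
    payload = json.dumps(rows)
    print(payload)  # [["widget", 15], ["gadget", 4]]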

Potential infrastructure for future technologies includes DNA origami on photolithography[62] and quantum antennae for transferring information between ion traps.

Fast digital circuits, including those based on Josephson junctions and rapid single flux quantum technology, are becoming more feasible with the discovery of nanoscale superconductors.[64][65]

IBM has created an integrated circuit with both electronic and optical information processing in one chip.[67]

One benefit of optical interconnects is that motherboards, which formerly required a particular kind of system on a chip (SoC), can now move once-dedicated memory and network controllers off the board, spreading them out across the rack.[68]

Cloud computing allows individual users or small businesses to benefit from economies of scale.[74]

This suggests potential for further legislative regulation of cloud computing and tech companies.

[Image caption] Computer simulation, one of the main cross-computing methodologies.
[Image caption] ENIAC, an early vacuum-tube, Turing-complete machine and the first programmable general-purpose electronic digital computer.
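The first caption above names computer simulation as a main cross-computing methodology. As a minimal illustrative sketch (a textbook Monte Carlo estimate, not drawn from this article), a simulation can approximate pi by random sampling:

    import random

    # Sample random points in the unit square and count how many fall
    # inside the quarter circle of radius 1; the ratio approaches pi/4.
    random.seed(42)  # fixed seed so the run is reproducible
    samples = 100_000
    inside = sum(
        1 for _ in range(samples)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )

    print(4 * inside / samples)  # close to 3.14159...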