Difference Between 4th And 5th Generation Computer

The Fifth Generation Computer Systems (FGCS) was an initiative by Japan's Ministry of International Trade and Industry (MITI), begun in 1982, to create computers using massively parallel computing and logic programming. It was to be the result of a massive government/industry research project in Japan during the 1980s. It aimed to create an 'epoch-making computer' with supercomputer-like performance and to provide a platform for future developments in artificial intelligence. There was also an unrelated Russian project likewise named a fifth-generation computer (see Kronos (computer)).

Prof. Ehud Shapiro, in his 'Trip Report' paper[1] (which focused the FGCS project on concurrent logic programming as its software foundation), captured the rationale and motivations driving this huge project:

'As part of Japan's effort to become a leader in the computer industry, the Institute for New Generation Computer Technology has launched a revolutionary ten-year plan for the development of large computer systems which will be applicable to knowledge information processing systems. These Fifth Generation computers will be built around the concepts of logic programming. In order to refute the accusation that Japan exploits knowledge from abroad without contributing any of its own, this project will stimulate original research and will make its results available to the international research community.'

Difference Between 4th And 5th Generation Computer

The difference between a fourth-generation and a fifth-generation computer can lie in size, in a new operating system, in new storage capacity, or simply in a new look. The fourth generation is usually characterized as follows:

  1. Fourth generation computers are microprocessor-based systems.
  2. They are the cheapest of all the computer generations.
  3. The speed, accuracy and reliability of computers improved in the fourth generation.
  4. Many high-level languages were developed in the fourth generation, such as COBOL, FORTRAN, BASIC, PASCAL and C.

Fourth generation programming languages are designed for a specific application domain, while fifth generation programming languages are designed to allow computers to solve problems by themselves: 4GL programmers need to specify an algorithm in order to solve a problem, whereas 5GL programmers only need to define the problem and its constraints. In computer terminology, a 'generation' marks a change in the technology with which computers are built. Initially the term was used to distinguish between varying hardware technologies; nowadays a generation encompasses both the hardware and the software that together make up a computer system.

The term 'fifth generation' was intended to convey the system as being a leap beyond existing machines. In the history of computing hardware, computers using vacuum tubes were called the first generation; transistors and diodes, the second; integrated circuits, the third; and those using microprocessors, the fourth. Whereas previous computer generations had focused on increasing the number of logic elements in a single CPU, the fifth generation, it was widely believed at the time, would instead turn to massive numbers of CPUs for added performance.

The project was to create the computer over a ten-year period, after which it was considered ended and investment in a new 'sixth generation' project would begin. Opinions about its outcome are divided: either it was a failure, or it was ahead of its time.

Information

In the late 1960s and early 1970s, there was much talk about 'generations' of computer hardware, usually 'three generations'.

  1. First generation: Thermionic vacuum tubes. Mid-1940s. IBM pioneered the arrangement of vacuum tubes in pluggable modules. The IBM 650 was a first-generation computer.
  2. Second generation: Transistors. 1956. The era of miniaturization begins. Transistors are much smaller than vacuum tubes, draw less power, and generate less heat. Discrete transistors are soldered to circuit boards, with interconnections accomplished by stencil-screened conductive patterns on the reverse side. The IBM 7090 was a second-generation computer.
  3. Third generation: Integrated circuits (silicon chips containing multiple transistors). 1964. A pioneering example is the ACPX module used in the IBM 360/91, which, by stacking layers of silicon over a ceramic substrate, accommodated over 20 transistors per chip; the chips could be packed together onto a circuit board to achieve unheard-of logic densities. The IBM 360/91 was a hybrid second- and third-generation computer.

Omitted from this taxonomy is the 'zeroth-generation' computer based on metal gears (such as the IBM 407) or mechanical relays (such as the Mark I), and the post-third-generation computers based on Very Large Scale Integrated (VLSI) circuits.

There was also a parallel set of generations for software:

  1. First generation: Machine language.
  2. Second generation: Low-level programming languages such as Assembly language.
  3. Third generation: Structured high-level programming languages such as C, COBOL and FORTRAN.
  4. Fourth generation: 'Non-procedural' high-level programming languages (such as object-oriented languages).[2]

Throughout these multiple generations up to the 1970s, Japan had largely been a follower in the computing arena, building computers following U.S. and British leads. The Ministry of International Trade and Industry decided to attempt to break out of this follow-the-leader pattern, and in the mid-1970s started looking, on a small scale, into the future of computing. They asked the Japan Information Processing Development Center (JIPDEC) to indicate a number of future directions, and in 1979 offered a three-year contract to carry out more in-depth studies along with industry and academia. It was during this period that the term 'fifth-generation computer' started to be used.

Prior to the 1970s, MITI guidance had successes such as an improved steel industry, the creation of the oil supertanker, the automotive industry, consumer electronics, and computer memory. MITI decided that the future was going to be information technology. However, the Japanese language, in both written and spoken form, presented, and still presents, major obstacles for computers. These hurdles could not be taken lightly, so MITI held a conference and invited people from around the world to help address them.

The primary fields for investigation from this initial project were:

  • Inference computer technologies for knowledge processing
  • Computer technologies to process large-scale databases and knowledge bases
  • High performance workstations
  • Distributed functional computer technologies
  • Super-computers for scientific calculation

The project imagined an 'epoch-making computer' with supercomputer-like performance using massively parallel computing/processing. The aim was to build parallel computers for artificial intelligence applications using concurrent logic programming. The FGCS project and its vast findings contributed greatly to the development of the concurrent logic programming field.

The target defined by the FGCS project was to develop 'Knowledge Information Processing systems' (roughly meaning applied Artificial Intelligence). The chosen tool to implement this goal was logic programming. The logic programming approach was characterized by Maarten Van Emden, one of its founders, as:[3]

  • The use of logic to express information in a computer.
  • The use of logic to present problems to a computer.
  • The use of logical inference to solve these problems.

More technically, it can be summed up in two equations:

  • Program = Set of axioms.
  • Computation = Proof of a statement from axioms.

The axioms typically used are universal axioms of a restricted form, called Horn clauses or definite clauses. The statement proved in a computation is an existential statement. The proof is constructive and provides values for the existentially quantified variables: these values constitute the output of the computation.
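
As a minimal sketch of this view (assuming a standard Prolog system such as SWI-Prolog; the predicates are invented for illustration), the program below is a set of Horn-clause axioms, and a query is an existential statement whose constructive proof binds a variable to the output:

    % A program is a set of axioms (Horn clauses).
    parent(tom, bob).             % fact: tom is a parent of bob
    parent(bob, ann).             % fact: bob is a parent of ann
    grandparent(X, Z) :-          % rule: X is a grandparent of Z if
        parent(X, Y),             %   X is a parent of some Y, and
        parent(Y, Z).             %   Y is a parent of Z

    % A computation proves the existential statement
    % 'there exists Who such that grandparent(tom, Who)':
    % ?- grandparent(tom, Who).
    % Who = ann.                  % the binding is the output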

Logic programming was seen as something that unified various strands of computer science (software engineering, databases, computer architecture and artificial intelligence). It seemed that logic programming was the 'missing link' between knowledge engineering and parallel computer architectures.

The project imagined a parallel processing computer running on top of massive databases (as opposed to a traditional filesystem) using a logic programming language to define and access the data. They envisioned building a prototype machine with performance between 100M and 1G LIPS, where a LIPS is a Logical Inference Per Second. At the time, typical workstation machines were capable of about 100k LIPS. They proposed to build this machine over a ten-year period: 3 years for initial R&D, 4 years for building various subsystems, and a final 3 years to complete a working prototype system. In 1982 the government decided to go ahead with the project, and established the Institute for New Generation Computer Technology (ICOT) through joint investment with various Japanese computer companies.
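
Put differently, the stated target amounted to a speedup of three to four orders of magnitude over the workstations of the day (a back-of-envelope reading of the figures above):

    \[
      \frac{10^{8}\ \text{to}\ 10^{9}\ \text{LIPS (target)}}{10^{5}\ \text{LIPS (typical workstation)}} = 10^{3}\ \text{to}\ 10^{4}
    \]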

In the same year, during a visit to ICOT, Prof. Ehud Shapiro invented Concurrent Prolog, a novel programming language that integrated logic programming and concurrent programming. Concurrent Prolog is a logic programming language designed for concurrent programming and parallel execution. It is a process-oriented language, which embodies dataflow synchronization and guarded-command indeterminacy as its basic control mechanisms. Shapiro described the language in ICOT Technical Report 003,[4] which presented a Concurrent Prolog interpreter written in Prolog. Shapiro's work on Concurrent Prolog inspired a change in the direction of the FGCS from a focus on parallel implementation of Prolog to a focus on concurrent logic programming as the software foundation for the project. It also inspired the concurrent logic programming language Guarded Horn Clauses (GHC) by Ueda, which was the basis of KL1, the programming language that was finally designed and implemented by the FGCS project as its core programming language.
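
To give a flavor of guarded-command indeterminacy, here is the classic stream-merge predicate in GHC-style syntax (a sketch; details vary between Concurrent Prolog, GHC and KL1). Each clause has a guard before the commitment operator '|'; when a guard succeeds, the clause commits and the alternatives are discarded:

    % merge(Xs, Ys, Zs): interleave two streams into one.
    merge([X|Xs], Ys, Zs) :- true | Zs = [X|Zs1], merge(Xs, Ys, Zs1).
    merge(Xs, [Y|Ys], Zs) :- true | Zs = [Y|Zs1], merge(Xs, Ys, Zs1).
    merge([],     Ys, Zs) :- true | Zs = Ys.
    merge(Xs,     [], Zs) :- true | Zs = Xs.

    % If both input streams have an element available, either of the
    % first two clauses may commit: the choice is indeterminate, and
    % dataflow synchronization suspends the call until input arrives.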

Implementation

So ingrained was the belief that parallel computing was the future of all performance gains that the Fifth-Generation project generated a great deal of apprehension in the computer field. After having seen the Japanese take over the consumer electronics field during the 1970s and apparently doing the same in the automotive world during the 1980s, the Japanese in the 1980s had a reputation for invincibility. Soon parallel projects were set up in the US as the Strategic Computing Initiative and the Microelectronics and Computer Technology Corporation (MCC), in the UK as Alvey, and in Europe as the European Strategic Program on Research in Information Technology (ESPRIT), as well as the European Computer‐Industry Research Centre (ECRC) in Munich, a collaboration between ICL in Britain, Bull in France, and Siemens in Germany.

Five running Parallel Inference Machines (PIM) were eventually produced: PIM/m, PIM/p, PIM/i, PIM/k, PIM/c. The project also produced applications to run on these systems, such as the parallel database management system Kappa, the legal reasoning system HELIC-II, and the automated theorem prover MGTP, as well as applications to bioinformatics.

Failure

The FGCS Project did not meet with commercial success, for reasons similar to those of the Lisp machine companies and Thinking Machines. The highly parallel computer architecture was eventually surpassed in speed by less specialized hardware (for example, Sun workstations and Intel x86 machines). The project did produce a new generation of promising Japanese researchers, but after the FGCS Project, MITI stopped funding large-scale computer research projects, and the research momentum developed by the FGCS Project dissipated. However, MITI/ICOT embarked on a Sixth Generation Project in the 1990s.

A primary problem was the choice of concurrent logic programming as the bridge between the parallel computer architecture and the use of logic as a knowledge representation and problem solving language for AI applications. This never happened cleanly; a number of languages were developed, all with their own limitations. In particular, the committed choice feature of concurrent constraint logic programming interfered with the logical semantics of the languages.[5]
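
A small illustration of the semantic clash (illustrative plain Prolog, not any specific FGCS language): committed choice discards alternatives that the logical reading of the program would still admit:

    % Read as logic, this program entails both p(a) and p(b);
    % standard Prolog enumerates both answers on backtracking:
    % ?- p(X).
    % X = a ;
    % X = b.
    p(a).
    p(b).

    % In a committed-choice language, a call to p(X) commits to a
    % single clause and never backtracks, so at most one answer is
    % produced; the program no longer denotes its full set of
    % logical consequences.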

Another problem was that existing CPU performance quickly pushed through the 'obvious' barriers that experts perceived in the 1980s, and the value of parallel computing quickly dropped to the point where it was for some time used only in niche situations. Although a number of workstations of increasing capacity were designed and built over the project's lifespan, they generally found themselves soon outperformed by 'off the shelf' units available commercially.

The project also suffered from being on the wrong side of the technology curve. During its lifespan, GUIs became mainstream in computers; the internet enabled locally stored databases to become distributed; and even simple research projects provided better real-world results in data mining. Moreover, the project found that the promises of logic programming were largely negated by the use of committed choice.

At the end of the ten-year period, the project had spent over ¥50 billion (about US$400 million at 1992 exchange rates) and was terminated without having met its goals. The workstations had no appeal in a market where general purpose systems could now take over their job and even outrun them. This is parallel to the Lisp machine market, where rule-based systems such as CLIPS could run on general-purpose computers, making expensive Lisp machines unnecessary.[6]
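
For scale, the two quoted figures are mutually consistent, implying roughly the exchange rate that prevailed in 1992 (a back-of-envelope check, not a sourced figure):

    \[
      \frac{\text{¥}50 \times 10^{9}}{\text{US\$}400 \times 10^{6}} = 125\ \text{¥ per US\$}
    \]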

Ahead of its time

Although the project could well be considered a failure, many of the approaches envisioned in the Fifth-Generation project, such as logic programming distributed over massive knowledge-bases, are now being re-interpreted in current technologies. For example, the Web Ontology Language (OWL) employs several layers of logic-based knowledge representation systems. It appears, however, that these new technologies reinvented rather than leveraged approaches investigated under the Fifth-Generation initiative.

In the early 21st century, many flavors of parallel computing began to proliferate, including multi-core architectures at the low-end and massively parallel processing at the high end. When clock speeds of CPUs began to move into the 3–5 GHz range, CPU power dissipation and other problems became more important. The ability of industry to produce ever-faster single CPU systems (linked to Moore's Law about the periodic doubling of transistor counts) began to be threatened. Ordinary consumer machines and game consoles began to have parallel processors like the Intel Core, AMD K10, and Cell. Graphics card companies like Nvidia and AMD began introducing large parallel systems like CUDA and OpenCL. Again, however, it is not clear that these developments were facilitated in any significant way by the Fifth-Generation project.

In summary, a strong case can be made that the Fifth-Generation project was ahead of its time, but it is debatable whether this counters or justifies claims that it was a failure.

References

  1. Shapiro, Ehud Y. 'The fifth generation project—a trip report.' Communications of the ACM 26.9 (1983): 637–641.
  2. http://www.rogerclarke.com/SOS/SwareGenns.html
  3. Van Emden, Maarten H., and Robert A. Kowalski. 'The semantics of predicate logic as a programming language.' Journal of the ACM 23.4 (1976): 733–742.
  4. Shapiro, E. 'A subset of Concurrent Prolog and its interpreter.' ICOT Technical Report TR-003, Institute for New Generation Computer Technology, Tokyo, 1983. Also in Concurrent Prolog: Collected Papers, E. Shapiro (ed.), MIT Press, 1987, Chapter 2.
  5. Hewitt, Carl. 'Inconsistency Robustness in Logic Programming.' ArXiv, 2009.
  6. Hendler, James (1 March 2008). 'Avoiding Another AI Winter' (PDF). IEEE Intelligent Systems 23 (2): 2–4. doi:10.1109/MIS.2008.20. Archived from the original (PDF) on 12 February 2012.

Fourth Generation vs Fifth Generation Programming Languages (4GL vs 5GL)

A programming language is an artificial language used to express the computations that a machine can perform. The very first programming languages (often called 1st generation languages, or 1GL) were mere machine code consisting of 1s and 0s. Programming languages have evolved tremendously over the past few decades. Programming languages are classified (or grouped) together, from 1st generation to 5th generation programming languages, depending on common characteristics or attributes of the languages. This evolution made programming languages friendlier to humans than to machines. Fourth generation programming languages (4GL) are languages developed with a specific goal in mind, such as developing commercial business applications. 4GL followed 3GL (3rd generation programming languages, which were the first high-level languages); they are closer to human-readable form and are more abstract. Fifth generation programming languages (which followed 4GL) are programming languages that allow programmers to solve problems by defining certain constraints as opposed to writing a specific algorithm.

What are Fourth Generation Programming Languages?

Fourth generation programming languages are designed to achieve a specific goal (such as developing commercial business applications). 4GL followed 3GL (3rd generation programming languages, which were already very user friendly), surpassing them in user-friendliness and level of abstraction. This is achieved through the use of words (or phrases) very close to the English language, and sometimes through graphical constructs such as icons, interfaces and symbols. Because the languages are designed according to the needs of their application domains, programming in 4GL is very efficient. Furthermore, 4GL rapidly expanded the number of professionals engaged in application development. Many fourth generation programming languages are targeted towards processing data and handling databases, and are based on SQL.

What are Fifth Generation Programming Languages?

Fifth generation programming languages (which followed 4GL) are programming languages that allow programmers to solve problems by defining certain constraints as opposed to writing an algorithm. This means that 5GL can be used to solve problems without a programmer writing an explicit algorithm. For this reason, 5GL are used in AI (Artificial Intelligence) research. Many constraint-based languages, logic programming languages and some declarative languages are identified as 5GL. Prolog and Lisp are the most widely used 5GL for AI applications. In the early 1990s, when 5GL came out, it was believed that they would become the future of programming. However, after it was realized that the most crucial step (defining the constraints) still needs human intervention, the initial high expectations were lowered.
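
As a small sketch of this constraint-defining style (using SWI-Prolog's clpfd constraint library; the predicate solve/2 is invented for the example), the programmer states only the constraints a solution must satisfy and lets the solver do the searching:

    :- use_module(library(clpfd)).   % finite-domain constraint solver

    % Find two digits whose sum is 10 and whose product is 21,
    % stating constraints rather than a search algorithm.
    solve(X, Y) :-
        [X, Y] ins 0..9,             % the unknowns are digits
        X + Y #= 10,                 % constraint: sum is 10
        X * Y #= 21,                 % constraint: product is 21
        X #=< Y,                     % break the symmetry
        label([X, Y]).               % ask the solver for values

    % ?- solve(X, Y).
    % X = 3, Y = 7.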

What is the difference between Fourth Generation and Fifth Generation Programming Languages (4GL and 5GL)?

Fourth generation programming languages are designed for a specific application domain, while fifth generation programming languages are designed to allow computers to solve problems by themselves. 4GL programmers need to specify an algorithm in order to solve a problem, whereas 5GL programmers only need to define the problem and the constraints that need to be satisfied. 4GL are mainly used in data processing and database handling applications, while 5GL are mostly used for problem solving in the AI field.
