In these modern times, when the use of the computer in the analytical laboratory is taken for granted, it is perhaps difficult to realize that, less than one generation ago, computers were little more than an idea on an engineer's desk. It is interesting to note the sequence in which the automation of data collection and data processing developed. As would be expected, this sequence closely followed the developments in computer hardware and peripherals.

An important factor in the development of most commercial automated systems was the “20%” rule, which required that the total cost of any computer package should not exceed 20% of the sale price of the final automated product. Rex's “Numerical Control Powder Diffractometer” was described at the 1966 Denver Conference, and this machine was to be the forerunner of a whole host of automated diffractometers that appeared in the early 1970s. Typical systems used either a 4K minicomputer or a time-sharing link to a large main-frame computer. It is interesting to observe that, as we come into the 1990s, the argument over whether the main-frame will survive as a viable alternative to the rapidly developing PC still goes on.