The way in which the size of a particle is defined is always somewhat arbitrary. Most of the mineral particles encountered are irregular in shape - some more irregular than others. Materials such as limestone, clay, gypsum and clinker are quite isotropic in their breakage characteristics, and form ground particles that approximate spheres at least to the extent that, when placed on a three-dimensional grid, their maximum dimensions in the x-, y-, and z-directions are roughly the same. A few other materials, such as micas, produce plate-like particles, and coal has a slight tendency to be plate-like too.
The intuitive way of summarizing the size of roughly spherical particles is to test their ability to pass through, or be retained on, a real or imaginary sieve. The idealized sieve has identical flat square apertures, and the length of the side of the square is specified. If the particles tested on the sieve are perfect spheres, then the diameter of a sphere that just passes the sieve is assumed to be equal to the size of the sieve opening. All these assumptions are questionable, but the sieve concept has the advantage of simplicity and is easily reproducible.
When materials are subjected to a size reduction process, even if the unground particles are completely uniform in size, the final product will always have a range of sizes, because of the random nature of the breakage process. The range of sizes can be expressed graphically as a size distribution curve, in which the mass proportion of each size is plotted against the size.
In the cement industry the shape of such particle size distributions is commonly modelled using the Rosin Rammler Distribution function:

R = 100 × exp(−(x/xc)^S)

where R is the % retained at size x, xc is a characteristic size, and S is the "slope".
The Rosin Rammler Distribution (RRD) is applicable to materials that have been ground by a process that breaks uniform brittle particles in a random manner, without any agglomeration. Real distributions depart from the RRD, for two main reasons:
- Grinding any material containing two distinct minerals, such as clinker and gypsum, or limestone and shale, necessarily results in a bimodal distribution.
- Almost all grinding processes involve some form of size classification, which truncates the otherwise asymptotic distribution.
Despite these reservations, the RRD is an effective method of characterising the grinding processes used in cement manufacture. The RRD summarises a set of particle size data in terms of just two parameters:
- the "characteristic size" xc, which is the size exceeded by 36.8% (i.e. 100/e) of the mass, and is an expression of the overall fineness
- the "slope" S, which is an expression of the "narrowness" or "tightness" of the distribution: a low value of S indicates a broad range of particle sizes.
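The behaviour of the two parameters can be sketched numerically. The function below evaluates the Rosin Rammler form R(x) = 100 × exp(−(x/xc)^S); the function name and example values are illustrative only.

```python
import math

def rrd_retained(x, xc, S):
    """Percent of mass retained at size x (um), per the
    Rosin Rammler function R(x) = 100 * exp(-(x/xc)**S)."""
    return 100.0 * math.exp(-(x / xc) ** S)

# At x = xc the retained fraction is 100/e, about 36.8%, whatever the slope:
print(round(rrd_retained(20.0, 20.0, 1.1), 1))   # → 36.8

# A lower slope S means a broader distribution: for the same xc,
# more of the mass survives as very coarse material.
print(round(rrd_retained(60.0, 20.0, 0.9), 1))   # → 6.8
print(round(rrd_retained(60.0, 20.0, 1.5), 1))   # → 0.6
```

The first result illustrates why xc is defined as the size exceeded by 36.8% of the mass; the last two illustrate the "narrowness" interpretation of S.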
Sieves were used to separate coarse and fine particles in milled materials such as flour from earliest times, so it was natural that the measurement of particle size began with the use of sieves with standardised mesh sizes. Sieves, however, have a number of disadvantages:
- The manufacture of sieves with precisely uniform mesh size is difficult and expensive, and in use, the uniformity and size of the mesh can change.
- Undersize material does not simply fall through a sieve: a certain amount of energy has to be supplied, and the material on the sieve has to be agitated. These processes can have a grinding action on the material, and can distort the mesh.
- Sieves below about 30 μm are extremely fragile, and yet finer separations require more energy, so sieves below this size are impractical.
However, at least for early cements, the powders were sufficiently coarse that sieve analysis was considered adequate to characterise their fineness, and some statement of the proportion retained on some standard sieve size was always considered necessary when certifying the quality of early cements. But a given value was no guarantee of quality, because all early cements were sieved after grinding, and the quoted fineness value was more an indication of the thoroughness of sieving than of the thoroughness of grinding.
For those familiar with modern cements, it is hard to conceive just how coarse the early cements were. Using sieve analysis data from nineteenth-century writers, it is possible to describe the evolution in fineness of British cements.
As can be seen, the 1890s saw a critical transition. In Britain's case, this was only to a limited extent due to the availability of new technology: the change to improved technology that already existed was driven by user pressure. Continental cement, generally much finer and more effectively quality controlled, began to be imported, and had already largely robbed Britain of its export markets.
The first British Standard for Portland cement, published in 1904, specified fineness in terms of maximum amounts retained on 76 mesh (223 μm) and 180 mesh (96 μm) sieves. The dangers of manufacturing cement merely to meet such a specification were emphasised by Butler (1899, pp 121-122), referring to stone grinding in 1898:
The use of sieves in the manufacture of cement is found to be very economical, as they help the stones considerably, but care must be taken that the proper amount of flouring is done by the stones. It is possible, by excessive sieving, to have a cement which may all pass through a certain sieve, and yet contain little or no flour, it being simply cracked till it is just fine enough to pass that sieve. A practical illustration of this fact—familiar no doubt to many who have tried to make cement in a small experimental way—may be obtained by pounding some clinker in a pestle and mortar with frequent sifting, and comparing the powder thus produced with that of the same clinker emanating from millstones. It will be found that the mill-stone-ground material contains a much larger proportion of flour than that ground in the mortar, and gives infinitely better results when tested for strength in the ordinary way. There is no doubt that this is just where the many kinds of edge-runner mills now largely used for grinding cement, fail to come up to the old-fashioned millstones, viz. in the flouring of the cement. They depend too much on their sieves: the clinker is passed under the runners and all cracked slightly, and then passed up to the sieves; any that is cracked fine enough to pass the sieves is conveyed to the warehouse, the remainder being returned to the mills to be cracked again; so the process continues; the material often passing under the rollers ten to fifteen times altogether before the reduction is completed. So long as the cement will pass the sieves, the promoters of these mills fondly imagine they have done their duty, and lose sight of the fact that it is the impalpable powder or flour which is the essential part of the cement.
Suffice it to say that sieve data were insufficient to describe the fineness of cement as it relates to cement strength. In the case of rawmix or fuel grinding, the production of excessively fine material is detrimental, and sieve analyses are adequate for control of their fineness (Note 1). The concept of "flour" in cement (i.e. material below about 20 μm: see Note 2) understood by Butler was quantified after 1900 by elutriators, which separated out the fines by flushing a sample with an upward current of air or a liquid, the size range extracted being related to the sweeping velocity. However, elutriators were only ever research instruments and unsuitable for routine use, and no universally-recognised standard was ever developed.
The specification of maximum levels of sieve oversize in Standards occurred during the period when grinding technology was developing rapidly, and the levels specified reflected more the current technology than any desirable fineness. Successive British standards included the following maxima:
|Maximum % retained on 96 μm sieve*||22.5||18||14||10|
|*The standard sieves were redefined in 1931. From 1958, with the adoption of specific surface, sieve data were no longer relevant.|
Cement specifications clearly needed a way of prescribing the fineness of cement, and the sieve method was written into standards in the absence of any more useful measure, despite the fact that it provided little guidance as to the likely quality of the cement, particularly as the number of different grinding technologies multiplied. The need for a fineness measure relatable to cement properties such as strength and setting time became ever more urgent until finally a breakthrough occurred in 1939, with the development of the measurement of specific surface by air permeability.
Specific surface is a measure of the total surface area of the particles in a given mass of cement: SI unit m²/kg. Assuming spherical particles of cement, the mass of a particle of diameter D μm is M = 3150 × πD³/6 × 10⁻¹⁸ kg (for particle density 3150 kg/m³), and the surface area is A = πD² × 10⁻¹² m². Thus the specific surface of the particle is A/M, or 1905/D m²/kg, and the smaller the particle, the higher the specific surface. The specific surface of a cement sample is therefore largely influenced by the mass of small particles present, in contrast to sieve analysis, which focuses only on the largest particles. The specific surface concept also has the advantage that, because reaction with water occurs at the particle surfaces, it correlates well with the rate of reaction of the cement.
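The arithmetic above can be checked from first principles. The short sketch below (function name and sample diameter are illustrative) computes mass and area for a sphere and confirms that the ratio reduces to approximately 1905/D for D in μm and a particle density of 3150 kg/m³.

```python
import math

def sphere_specific_surface(d_um, density=3150.0):
    """Specific surface (m2/kg) of a single sphere of diameter d_um (in um)."""
    d_m = d_um * 1e-6                       # diameter in metres
    mass = density * math.pi * d_m**3 / 6   # kg
    area = math.pi * d_m**2                 # m2
    return area / mass

# The closed form 1905/D agrees with the first-principles figure:
print(round(sphere_specific_surface(10.0), 1))   # → 190.5
print(round(1905 / 10.0, 1))                     # → 190.5
```

Note that A/M simplifies algebraically to 6/(density × d), which is why the density and the π terms cancel out of the comparison between particle sizes.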
The air permeability method was first published by Lea and Nurse in 1939. The method consists of measuring the equilibrium rate of flow of air through a bed of cement compacted to a standard porosity, when a measured pressure differential is maintained across it. The mathematical treatment was based on the Carman Equation (published 1938), from which it followed that, for a constant particle density, air viscosity and bed porosity, the specific surface is proportional to √(P/V), where P is the pressure differential and V is the air flow-rate. There are no empirical constants in the equation, so no "standard sample" is required for calibration.
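One practical consequence of the proportionality S ∝ √(P/V) is that two measurements made on the same apparatus, with the same bed porosity and air viscosity, can be compared without knowing the apparatus constant at all, since it cancels in the ratio. The helper below is a hypothetical illustration of that cancellation, not the Lea and Nurse working equation itself.

```python
import math

def relative_surface(p1, v1, p2, v2):
    """Ratio S1/S2 of specific surfaces of two samples measured on the
    same apparatus, assuming S is proportional to sqrt(P/V): the
    (unknown) proportionality constant cancels out."""
    return math.sqrt((p1 / v1) / (p2 / v2))

# A sample needing four times the pressure differential to sustain the
# same flow-rate has twice the specific surface, i.e. is twice as fine:
print(relative_surface(4.0, 1.0, 1.0, 1.0))   # → 2.0
```

This is also why the later non-equilibrium variants (Rigden, Blaine) lost the calibration-free property: their changing pressure and flow conditions reintroduced constants that had to be fixed against a standard sample.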
During the following ten years, a number of variants of the Lea and Nurse apparatus were developed, with a view to simplifying operation for rapid use in hour-to-hour quality control, the original Lea and Nurse method being somewhat laborious. These included the Rigden and Blaine methods, and rather than employing equilibrium flow conditions, they compared a constantly-changing flowrate with a constantly-changing applied pressure. These methods were fairly rapid for the coarser cements of fifty years ago, but had the distinct disadvantage that a primary standard cement was needed for calibration, with an assigned value traceable to the Lea and Nurse method. An automated Lea and Nurse apparatus remains the best, providing virtually instantaneous results.
Adoption of specific surface for control of cement grinding began during WWII, and a specific surface minimum of 225 m2/kg was introduced into the British Standard in 1947, and remained unchanged until 1989, when it was raised to 275 m2/kg.
Crushers are used to reduce material in lump form to a size sufficiently small to be fed to a mill for fine grinding. The cement industry processes requiring fine grinding are rawmix preparation, fuel preparation and grinding of clinker to make cement. All three may require a preliminary crushing stage.
In the case of raw materials, attempts have always been made to minimise the amount of crushing needed by extracting the rock in the finest possible state. In the chalk areas, chalk was until well into the twentieth century quarried by hand, and by a process termed "milling", chalk was scratched from the quarry face producing material that was mostly 30 mm or smaller. With the use of explosives in hard rock quarries, the amount and disposition of the charges is designed to maximise the amount of small material produced. However, crushing is invariably employed to further reduce the rock. Crushers are usually more energy-efficient than the finer-grinding equipment.
Coal, as used by the cement industry, rarely needs crushing. Clinker, as produced by early static kilns, emerged as large lumps, and crushing was always required before grinding could take place. Rotary kilns by contrast produce clinker that is typically below 50 mm and does not need crushing. However, every rotary kiln occasionally produces larger clinker, or may contain large slabs of kiln coating material, so some facility for occasional crushing is required.
There are a number of different crusher types in use in the industry:
Mills are used to reduce materials of maximum size around 30 mm to fine powder. Different types of mill used in the industry include:
In the early history of fine grinding, most of the technology was developed for milling cereals, which are soft. Powered machines for flour milling developed directly from the techniques for hand grinding and so the flat quern gave rise to the flat stone mill, while roller mills developed from the hand roller and the pestle and mortar. The relatively minor processes for grinding harder materials, such as the production of paints and inks, tended to use the same sort of equipment, often using wet grinding.
The limitations of what was still essentially a Stone Age process for fine grinding of hard materials led to the development of the continuous tumbling mill, and the cement industry led the way with its development, although today the method is used in many other heavy processing industries. As with the development of kilns, the main thrust of development 1875-1975 consisted in finding ways of applying ever-increasing amounts of cheap energy to the process, and tumbling mills, although extremely inefficient in terms of energy conversion, allowed unlimited scale-up. As increasingly fine cement grinding was required, it became more difficult to dissipate the enormous amount of waste heat generated during grinding in tumbling mills.
The somewhat earlier development of new crushing and grinding techniques in Germany and the USA consisted of moving from the use of "natural" abrasives such as millstones to custom designed steel alloys, and the development of these alloys - particularly those with manganese and chromium - only began with the introduction of the Bessemer process in the 1870s, producing a cheap pure mild steel base that could be modified at will.
The ancient observation that hard materials were easier to grind in a liquid medium led to the use of "wet process" grinding of raw materials in the cement industry, and grinding by this technique was always much more energy-efficient than dry grinding, although the energy required to remove the water from rawmix was always vastly greater than that saved during grinding. Flat stones were easily modified for wet grinding, but roller mills were almost exclusively used for dry grinding. Tumbling mills, when developed, were also easily adapted for wet grinding.
From 1975, concern to improve energy efficiency led to a reversion to the use of roller mills, which are considerably more efficient. Roller mills are now almost exclusively used for dry process raw milling on new installations, and the latest finish grinding systems also employ roller mills, in combination with high efficiency separators.
Evolution of types of mill used for finish milling