User:FuzzyMagma/History of metal casting

The history of metal casting is rich and complex, dating back thousands of years. The Chalcolithic period (c. 5000–3000 BC) saw the first experiments in smelting copper and in melting metals for castings. Casting reached Egypt by 2800 BC, and mastery of the process contributed greatly to Egypt's power during the Bronze Age.

Around 1300 BC, the Shang Dynasty in China was the first to use sand casting when melting metals, while around 500 BC the Zhou Dynasty introduced cast iron to the world. Iron had been discovered around 2000 BC, but it was not until around 700 BC that the first production of cast iron was developed in China. During the 19th century, many important advances were made in the field of metal casting. In 1809, A. G. Eckhardt of Soho, England developed centrifugal casting. The cupola was introduced in the United States in 1815 in Baltimore, MD, and in 1818 the first cast steel was produced by the crucible process in the U.S. at the Valley Forge Foundry. Aluminium, the most common metal in the earth's crust, was isolated in 1825.

Today, casting affords great flexibility in terms of design and readily accommodates a wide range of shapes, dimensional requirements, and configuration complexities. Metal casting has been employed by civilizations through the ages to produce goods, structures and materials used in the world all around us, from bridges to medical equipment and even car parts. The history of metal casting is marked by discoveries, advancements, and influential events, which have led to its widespread use and application in various fields of industry.

Copper Age
The first metal used by humans was native copper, which occurs naturally. It was first used by hammering it into sheets from which various simple tools and ornaments were made. Eventually, humans discovered that metallic copper could be separated and extracted from mineral rocks through smelting, which marks the beginning of metallurgy. The consensus is that this happened around 8200 BC in the Near East, particularly in eastern Anatolia and northern Iraq. According to Simpson, more recently quoted by Ravi, the oldest surviving casting is a copper frog from 3200 BC discovered in Mesopotamia. Yet a cast lead figurine discovered at Luxor in Egypt predates the copper frog by a significant 600 years. Furthermore, more recent archaeological work has demonstrated that by 4500 BC Old Europe (6200 to 3300 BC) was among the most sophisticated and technologically advanced places in the world. Some Old European villages grew to city-like sizes, larger than the earliest cities of Mesopotamia, and metal casting appears to have started there. Bruce Liston Simpson, author of the History of the Metalcasting Industry, places the birthplace of metals in the area north of the Black Sea, in the Carpathian Mountains of today's Romania. The smelting of mineral copper-bearing ores to extract copper was the decisive step in the invention of metallurgy. It must have made an enormous impression on Neolithic craftworkers: depending on the process (archaeologists have still not reached consensus on this issue), in one case a metallic "stone" would turn into a liquid and harden back into a metal, while in another a more rocklike "stone" would be transformed into a metal with different properties.

Copper battle axes were cast in Old Europe as early as 4600–3900 BC in Gumelniţa, Romania, 4400–4200 BC in today's Varna, Bulgaria, and 4000–3500 BC in Sfârnaș and Cucuteni, Romania. Moreover, hoards of cast copper axes and chisels discovered at Pločnik, Serbia, have been dated to 5500–4700 BC, which would move the beginning of metal casting even earlier. This development was made possible by the discovery of smelting and the abundance of copper-rich malachite and azurite ores in the region.

Bronze Age
As we advance from the Stone Age to the Bronze Age, we find remarkable artwork produced by other ancient populations. For example, the Ghassulian culture, located in the eastern Jordan Valley near the northern edge of the Dead Sea in the Southern Levant (ca. 4400–3500 BC), mastered the casting of copper containing a high percentage of arsenic (4–12%) using the lost-wax process, the earliest known use of this technology, as exemplified by the sceptre unearthed at Nahal Mishmar in the Judaean Desert. In Egypt, arsenical copper was used from Predynastic times (c. 5000 BC). The Ancient Egyptian stone known as the "Palermo Stone" records the making of a copper statue of Khasekhemwy of the Second Dynasty of Egypt (c. 2890–2649 BCE), but this was not a cast statue. The Ancient Egyptians also manufactured cast bronze products with tin contents of 0.1% to 10% or more. A cylinder bearing the name of Pepi I (2289–2255 BCE) indicates that the moulding of bronze castings dates to earlier than 2200 BCE.

The first recorded depiction of a smelting process was found on the wall of an Egyptian tomb dating to about 1500 BC. The process utilized a 'bowl furnace'. Additional air to raise the temperature of the fire was supplied by foot bellows. Copper ore was the source of the metal, while charcoal served as the reducing agent to separate oxygen from copper. Iron ore was used as the 'flux' to separate impurities from the melt.
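The smelting described above rests on carbothermic reduction. For a copper ore such as malachite, a simplified reaction scheme (actual ore chemistry varied) is:

```latex
\begin{align}
\mathrm{Cu_2(CO_3)(OH)_2} &\xrightarrow{\ \Delta\ } \mathrm{2\,CuO + CO_2 + H_2O} \\
\mathrm{C + CO_2} &\rightarrow \mathrm{2\,CO} \\
\mathrm{CuO + CO} &\rightarrow \mathrm{Cu + CO_2}
\end{align}
```

The charcoal thus supplies both heat and the carbon monoxide that strips oxygen from the ore.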

The striding horned demon cast in arsenical copper (ca. 3000 BC) in the region of Mesopotamia and present-day Iran, and the Dancing Girl lost-wax bronze figurine produced by the Indus Valley Civilization (2300–1750 BC), further demonstrate the connection between art and metal casting. The latter statuette is 10.5 cm tall and depicts a nude young woman with stylised ornaments in a naturalistic pose. The Iron Age started around 1200 BC, but for many years it did not diminish the importance of bronze. In China, during the Shu kingdom (1600–316 BC), the use of bronze for art, weapons, and currency flourished. Spade money made of an 80Cu-15Pb-5Sn alloy was used as early as 1200 BC. By 221–206 BC, copper currency was in use. Ancient people believed that heaven was round and the earth square; thus, the copper coin had a round shape with a square hole in the centre. Ceremonial vessels and statues were made through the lost-wax process. In Korea, the host of the 74th World Foundry Congress (WFC), bronzeware fabrication diffused into the central region of the Korean Peninsula from the northeast. Raised-band decoration pottery was cast around the 13th century BC. Weaponry such as Liaoning bronze daggers was made in the 8th and 7th centuries BC. Korean-type bronze daggers have been dated to 400 BC, and other castings, including spiritual, art, and warfare pieces, were produced well into the CE.

Throughout Europe, weapons and agricultural implements were also manufactured by casting after the beginning of the Iron Age. In Britain, hundreds of bronze axe heads cast before the Iron Age and into the first millennium BC have been found. They were cast in moulds made of clay, stone or bronze. In Europe, the Etruscans produced bronze goods as early as 1100–750 BC. For example, the Chimera of Arezzo, a fire-breathing monster from Greek mythology, was cast in bronze using the lost-wax technique. The Capitoline Wolf is a bronze sculpture depicting a scene from the legend of the founding of Rome: a she-wolf suckling the mythical twin founders of Rome, Romulus and Remus.

While bronze was used for artwork and military applications well into Roman times, industrial applications (e.g., marine propellers) and works of art in bronze continue to the present day. The status of bronze as the material of choice began to fade as a new material, iron, replaced it, particularly for military use, because of the higher strength and hardness of carburized iron (steel). It was the dawn of the Iron Age.

Iron Age
The accepted definition of the beginning of the Iron Age is the point at which ferrous metallurgy became the dominant metalworking technology in a region's culture. The earliest known iron artefacts are nine small beads dated to circa 3200 BC from Gerzeh, Egypt. They were made from meteoritic iron and shaped by carefully hammering the metal into thin sheets before rolling them into tubes. This confirms that in the 4th millennium BC metalworkers had already mastered the smithing of meteoritic iron, an iron-nickel alloy much harder and more brittle than the more commonly worked copper. In addition, the necklace included lazulite, gold and carnelian, revealing the status of meteoritic iron as a special material on par with precious metals and gemstones. Other early iron artefacts include the Umm el-Marra pendant (Syria; 2300 BC), Tutankhamun's dagger, bracelet and headrest (Egypt; 1350 BC), the Shang Dynasty axes (China; 1400 BC), and the Alaca Höyük dagger (Turkey; 2500 BC). Yet whether Bronze Age iron artefacts were derived from meteoritic iron (extraterrestrial) or smelted ores (terrestrial) is often still unclear. In principle, nickel-rich iron (6–20% Ni) is meteoritic, while nickel-poor iron is smelted. A section of a text from the province of Hatay, Turkey, mentions 400 Sukur weapons (spearheads) made of iron (AN.BAR). This indicates that iron was beginning to be used in weapons from the 18th century BC.

According to Jambon, most iron artefacts from the Bronze Age are derived from meteoritic iron until a transition period around 1200 BC, considered by most scientists to be the beginning of the Iron Age.

The meteoritic origin of iron in the early Bronze Age is also supported by the terminology used in documents from that time. The Sumerian word AN.BAR, the oldest word designating iron, comprises the pictograms 'sky' and 'fire'. Similar language is found in Egypt, where at ca. 1300 BC the term 'biA-n-pt', which reads 'iron from the sky', came into use and from that point onwards was applied to all types of iron. The Ancient Egyptian name for iron was 'bja', a word mentioned repeatedly in the Pyramid Texts of Unas found in the Saqqara complex (ca. 2500 BC) in connection with the 'bones' of the star kings: "The King's bones are iron ('bja') and his limbs are the imperishable stars". The close association between iron and other costly materials found in the meteoric iron-and-gold dagger in the tomb of Tutankhamun strongly suggests that iron was highly valued. Old Assyrian texts found at Kültepe in Anatolia (1950–1750 BC) support this interpretation, recording that, by weight, iron was as much as 40 times more valuable than silver and ten times more valuable than gold. The same texts imply that the Assyrians used iron ore of various purities, including hematite, and that smelted iron was only available in small quantities. Blooms of iron were used to produce rings and pins.

Other historians attribute the first instances of ironworking to the Hittites in ca. 1380 BCE. The Hittites, a people speaking the oldest historically attested Indo-European language, migrated into northern Asia Minor (today's Turkey). In 1274 BC, they kept the Egyptian army in check at the Battle of Kadesh, where each side claimed victory. They then extended their empire until they were a superpower on the level of Egypt and Assyria. Their military success is sometimes considered to originate from their iron weapons. The Hittites used the word AN.BAR, borrowed from the Sumerians, for smelted iron, AN.BAR.GE for meteoric iron, and AN.BAR ŠA GUNNI (iron straight from the furnace), which suggests bloom iron and, therefore, smelting. Iron was produced by the Near Eastern bloomery process, in which the iron ore was heated in furnaces with bellows forcing air through the charge; the charcoal was both a fuel and a reducing agent, as the carbon monoxide it produced reduced the iron oxide in the ore to metallic iron. The resulting bloom was a sponge-like material with slag trapped in its crevices. Hammering the bloom while still hot and mushy removed most of the slag. The surface of the iron was then heated again within a bed of red-hot charcoal to carburize it.
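The reduction at the heart of the bloomery process can be summarized in simplified form (taking hematite as the ore; the carbon monoxide is generated by the charcoal burning in the restricted air supply):

```latex
\mathrm{Fe_2O_3 + 3\,CO \rightarrow 2\,Fe + 3\,CO_2}
```

Because bloomery temperatures stayed below the melting point of iron, the product was the solid, slag-riddled sponge described above rather than a liquid metal that could be cast.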

However, one of the earliest known smelted iron artefacts is a dagger with an iron blade found in a Hattic tomb in Anatolia, dating from 2500 BC. It is therefore possible that the Hittites learned the technology from the Hattians, a people who spoke a distinctive language, neither Semitic nor Indo-European, and who were assimilated into the Hittite empire between 2000 and 1700 BC.

In most ancient cultures, this sky connection led to the belief that the metallurgist had a direct link to the divine, and that the art of producing iron objects (the word 'metallurgy' did not exist) had a magic origin. Early metal workers often rose to high status in the tribal hierarchy, and even in later history the metal worker ranked highly in the social order. In Ireland, foundrymen ranked with the nobility from early times. The founder of the Mongol empire, Genghis Khan (1162–1227 AD), is said to have been a simple smith before becoming emperor.

As Waldbaum stated, the technique of producing carburized steel by quenching and tempering was first attested in Cyprus in the 12th or 11th century BC. The Hittites may not have been the only owners of the secret of iron weapons. Indeed, the number of iron objects from the Bronze Age found in Anatolia is comparable to that of iron objects found in Egypt and other places from the same period. The Bronze Age thus saw the anomaly of an iron-making capability coexisting with limited demand for the metal before the Iron Age began.

The principle of the Renn kiln (German Rennofen, a type of bloomery furnace) involves the reduction of the iron ore with charcoal to obtain sponge iron (loupe), a mixture of slag, charcoal, pure iron and unreduced iron ore. The sponge iron is then forged and cleaned of residuals to produce a soft iron.

It appears that in the 8th century BC, the ancient Greeks were aware of iron and the quenching process, as Homer (c. 750 BC) distinctly mentioned the use of iron in the Iliad [xxiii, 261], describing how the red hot metal hisses when it is submerged in water.

The exact end of the Iron Age is not a uniquely accepted date. For the Ancient Near East, the establishment of the Achaemenid Empire (ca. 550 BC), the period later chronicled by Herodotus, "The Father of History," in The Histories, is traditionally taken as the cut-off date. In Central and Western Europe, the Roman conquests of the 1st century BC are taken as the end of the Iron Age, while in Scandinavia an even later date, 800 AD (the beginning of the Viking Age), is considered. With this uncertainty, it is difficult to place cast iron artefacts as being produced before or after the end of the Iron Age. Thus, we will consider the Iron Age to end with the discovery of plastic materials, i.e., Bakelite, by the Belgian-born chemist Leo Baekeland in 1909 AD. The Chinese became the first people to produce iron castings successfully and regularly, as early as 800–700 BC, with the earliest sand casting being traced to 645 BC. Historians believe they benefited from earlier work, probably passed along to them by migrating Mesopotamian craftsmen. One ancient document (513 BC) refers to a requisition for 600 pounds of iron to cast a tripod on which the criminal code was inscribed. Cast iron ploughshares were in use as early as 233 BC. The oldest cast iron objects found to date were cast during the Han dynasty (206 BC – 220 AD) and include a stove, an ink palette, a vase, a pan, and various fittings.

Cast iron became so popular in China that it was used not only for home implements but also for art, for worship objects such as incense burners and statues, for pagoda roof tiles, and even for true cast-iron pagodas, such as the iron pagoda of Yuquan Temple, and the iron lion of Cangzhou, which was cast in a single mould.

The progress in cast iron technology that occurred in China is attributed to two factors: the development of melting equipment capable of producing greater air draft (blast furnaces that convert raw iron ore into pig iron were operational in China by 722–481 BC), and the abundance of phosphorus-rich iron ore, which yielded iron (6–7% P) that could be poured at 980 °C, i.e., about 100 °C below the melting point of copper.

As for Europe, although the Greeks and Romans had some understanding of the art of casting iron, their early applications did not match the broad development of cast iron in China. In Britain, a small cast statuette found in Sussex shows that the art was known there by 170 AD. During the Dark Ages, knowledge was kept secret and transferred by word of mouth. In 1122 AD, Theophilus Presbyter wrote the famous book On Divers Arts, which provides some insight into the technical knowledge of metalworking and metal casting at that time. The book's title indicates that metalworking, including metal casting, was considered an art at that time, and it sometimes describes unconventional procedures, such as: "Tools are also made harder by hardening them in the urine of a small red-headed boy rather than in plain water". However, the large-scale introduction of cast iron in Europe did not occur until about 1200–1450 AD.

While the beginning of the progressive development of blast furnaces in Europe can be traced to the charcoal-fuelled Catalan forge developed by the Moors in the 8th century AD, the beginning of the modern iron foundry can be considered to coincide with the introduction of the water-driven bellows in Germany and Sweden in 1325.

From art to technology
The transition of metal casting from art to technology in Europe can be considered to occur around 1540, when De la Pirotechnia, a handbook on metalworking by the Italian metallurgist Vannoccio Biringuccio, was published posthumously. It gives details of the extraction and refining of metals and alloys such as brass, and of the art of casting metals, with detailed descriptions for bells and cannons. It defines steel as "nothing other than iron, purified by means of art and given a perfect quality by the decoction of fire". It is credited with starting the tradition of scientific and technical literature, and it preceded the printing of De Re Metallica by Georgius Agricola by 16 years. Biringuccio, who is considered the father of the foundry industry, recommended using the dregs of beer vats, and also human urine, as binders for moulding sand; both were in use well into the 20th century. Biringuccio is also credited with the development of a standard bell scale, one of the earliest instances on record of the interaction between the metal caster and the engineer in perfecting castings. For more than 400 years, foundry processes and materials often relied on the methods described in this book, and the rather confused art/technology terminology extended into the 18th century with René Antoine Ferchault de Réaumur (L'art de convertir le fer forgé en acier, 1722) and Gabriel Cramer (Elements of the Art of Assaying, 1739). Even some magic was still in the mix, as according to Cramer, "steel is nothing but pure iron impregnated with phlogiston".

The technology of cast iron made significant progress with the introduction of cast iron water pipes in the Dillenburg Castle in Germany in 1455 and in the palace of Versailles in 1664. A significant French contribution to cast iron during this period was the development of white-heart malleable iron by Réaumur (1720), which dispelled the belief that cast iron was an inferior material, an inherently brittle "corrupt metal".

The first iron production facility in North America was in operation in 1619. The first surviving cast iron artefact produced in America is the Saugus pot (1642), made in Massachusetts. The development of the cast iron industry in North America ran into environmental problems: because of the forest degradation caused by charcoal-fired blast furnaces, the English Parliament prohibited pig iron production in the Colonies unless the iron was shipped to England. This did not sit well with the industrialists, which helps explain why six owners of iron foundries were signatories of the Declaration of Independence that resulted from the soon-to-follow American Revolution (1775). George Washington's father also owned a foundry. Around 1760, the First Industrial Revolution started in England with new chemical manufacturing and iron production processes, the development of machine tools, and the rise of the factory system. For example, cast iron tram-road rails produced in Coalbrookdale in 1756 replaced wooden rails, and the still-standing Iron Bridge was built in 1779.

Aluminium
Aluminium, first isolated in 1825 by Hans Christian Ørsted in Denmark, was to become the main competitor of cast iron. Ørsted reported "a lump of metal which in colour and lustre somewhat resembles tin." He produced aluminium by reducing aluminium chloride using a potassium-mercury amalgam; the mercury was then removed by heating to leave aluminium. Just like iron when first discovered, aluminium was more expensive than gold. In 1852, aluminium sold at US$34 per ounce, while gold was $19 per ounce. The emperor Napoleon III of France perceived the new metal as a breakthrough material for his army. He would show off his "lavish" aluminium dinnerware to the most honoured visiting leaders, while "lesser" gold dinnerware was reserved for the others. Aluminium currency was coined. Yet it was not until 1886 that aluminium became an industrial material, with the discovery of electrolytic refining of aluminium independently by C. Hall and P. Héroult. It became widely used for architectural artefacts, buildings and statues, and extensively in the automotive and aeronautic industries. The high strength-to-weight ratio of cast aluminium meant a substantial reduction in energy consumption. Consequently, the aluminium industry became pivotal for ecological sustainability and strategic for technological development.
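The two production routes mentioned above can be written as net reactions in simplified form: Ørsted's chemical reduction of aluminium chloride by the potassium in the amalgam, and the later Hall-Héroult electrolysis of alumina dissolved in molten cryolite with consumable carbon anodes:

```latex
\begin{align}
\mathrm{AlCl_3 + 3\,K} &\rightarrow \mathrm{Al + 3\,KCl} \\
\mathrm{2\,Al_2O_3 + 3\,C} &\rightarrow \mathrm{4\,Al + 3\,CO_2}
\end{align}
```

The electrolytic route is what finally made aluminium cheap enough to cast on an industrial scale.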

New methods of casting
The 19th century witnessed significant developments in the equipment used in metal casting. The first die-casting equipment, invented in 1838 and patented in 1849, was a small, hand-operated machine in which a tin-lead alloy was poured into a steel mould to produce movable type for the printing industry. Then, in 1879, the first industrial arc furnace became operational in Germany.

While the first microscope dates back to Galileo Galilei in 1609, it was not until 1846 that Carl Zeiss developed the modern microscope. Then Sorby, in 1863, used a microscope to study polished metal samples in order to understand the metallographic constituents of alloys.

Modern Materials Age
In the Modern Materials Age, other casting alloys, such as spheroidal and compacted graphite cast iron, stainless steel, titanium alloys and many others, were developed. New processes, such as single-crystal growth, invented by the Polish scientist Jan Czochralski in 1915, opened the field for improved parts for the aerospace industry. While castings were used mostly for household implements and weapons in previous ages, new fields such as automotive, aviation, and medical devices became major users of castings. Cast iron remains, however, the main casting material by tonnage.

Before the use of the microscope in metallurgy, only two types of cast iron were identified, based on the aspect of their fracture: white and grey iron. They had a very low strength of 80–100 MPa (~12–15 ksi), and knowledge of their properties was limited: "The physical properties of cast iron are shrinkage, strength, deflection, set, chill, grain and hardness." In the Materials Age, significant progress was made. In 1908 came the first attempts at liquid treatment of cast iron, with FeSi, Ca, and V, by Geilenkirchen, and in 1928 the first specification for cast iron was issued (DIN 1691, classes 140–280 MPa (20–41 ksi)), both in Germany. In 1942, Piwowarsky in Germany published his famous work "High quality cast iron: its characteristics and the physical metallurgy of its manufacture" (Hochwertiges Gußeisen (Grauguß): seine Eigenschaften und die Physikalische Metallurgie seiner Herstellung), the first book on cast iron to include physical metallurgy, and therefore science, in its content. The tensile strength of grey iron continued to rise, as attested by the patent for a process consisting of inoculation of the iron with calcium silicide, granted in the US to Augustus Meehan in 1931, which claimed tensile strengths up to 500 MPa (72 ksi). In 1938, Adey obtained a patent for a process producing wholly or partly spheroidal graphite cast iron through the annealing of white iron. On 12 April 1943, one of the most significant metal casting discoveries was made. A base iron of 3.64% C, 2% Si, 0.75% Mn, 0.06% S and 2% Ni was treated with an 80% Ni-20% Mg alloy, followed by a FeSi inoculation. The samples, polished and examined under the microscope, revealed that the graphite was entirely spheroidal. Ductile iron, whose expansion in the industry in the following years was explosive, had been invented. Cast iron, the first man-made composite material, thus became the first engineered composite material. A patent was applied for in 1947 and issued in 1949.

In England, Morrogh used cerium to spheroidise the graphite. The major discoveries related to graphite shape control ended with the recognition of compacted (vermicular) graphite iron (CGI) as a grade in its own right, through the discovery of magnesium-titanium treated CGI by Schelleng in 1965 (patent granted in 1969) and the patent granted in 1968 to Thury et al. for cerium-treated vermicular graphite cast iron. Finally, with the commercialization of austempered ductile iron in 1970, the strength of cast iron rivalled that of many steel alloys.

In 1931, German physicist Ernst Ruska and his doctoral advisor Max Knoll presented the prototype electron microscope, a seminal invention of paramount importance in understanding the microstructure of metals. The first scanning electron microscope (SEM) image was obtained in 1935. In the last decade, the technique has been widely used in conjunction with interrupted solidification and deep etching to understand the solidification sequence in cast alloys. Examples include mapping to identify the chemical composition of nuclei in spheroidal graphite (SG) iron, and deep etching with SEM analysis to reveal the growth mechanisms of graphite. Then, through transmission electron microscopy (TEM), obtaining information on the inner structure of materials, such as crystal structure, became possible. For example, it could be demonstrated that spheroidal graphite has an amorphous-like central region, an intermediate region of layered annular rings, and an outer region of polygonal graphite platelets, and that both hexagonal and rhombohedral structures exist in spheroidal graphite particles.

Solidification science was transformed from a pure physics discipline into an engineering science by Chalmers in 1956, when the constitutional undercooling criterion was formulated, giving an insight into the processing-microstructure correlation. A decade later, Jackson and Hunt developed the mathematical framework for the analytical treatment of regular lamellar eutectic growth. The same year, the era of virtual cast iron was launched, as Oldfield developed a computer model to calculate grey iron cooling curves. Progress in the field was driven by some well-known academics such as Merton Flemings of the Massachusetts Institute of Technology, who in 1974 published his epochal treatise Solidification Processing. The introduction of the Niyama criterion, a simulation output variable used to detect solidification shrinkage defects in steel castings, was another milestone in the drive to use computer modelling in casting processing. By 1985, solidification modelling of cast iron had become an area of intensive research, including the simulation of microstructure and properties of many casting alloys, particularly steel, cast iron and aluminium alloys. Peter Sahm's intensive research and development work at the Foundry Institute of RWTH Aachen University resulted in the establishment in 1988 of MAGMA, currently Magmasoft, one of the leading metal casting simulation software packages in an industry that includes other remarkable software developers such as ProCast and Flow3D.
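As an illustration, the Niyama criterion is the local thermal gradient divided by the square root of the local cooling rate, both evaluated near the end of solidification; regions where the value falls below an alloy-dependent threshold are flagged as prone to shrinkage porosity. A minimal sketch (the threshold value here is purely illustrative, not a value from any particular simulation package):

```python
import math

def niyama(thermal_gradient_k_per_mm: float, cooling_rate_k_per_s: float) -> float:
    """Niyama criterion Ny = G / sqrt(dT/dt).

    G is the local thermal gradient (K/mm) and dT/dt the local cooling
    rate (K/s), both taken near the end of solidification. Low values
    indicate poor feeding and a risk of shrinkage porosity.
    """
    if cooling_rate_k_per_s <= 0:
        raise ValueError("cooling rate must be positive")
    return thermal_gradient_k_per_mm / math.sqrt(cooling_rate_k_per_s)

# Illustrative threshold; real thresholds depend on alloy and units.
NY_THRESHOLD = 1.0

for g, dtdt in [(2.0, 1.0), (0.5, 4.0)]:
    ny = niyama(g, dtdt)
    risk = "shrinkage risk" if ny < NY_THRESHOLD else "sound"
    print(f"G={g} K/mm, dT/dt={dtdt} K/s -> Ny={ny:.2f} ({risk})")
```

In commercial simulation software the same quantity is evaluated at every mesh cell of the casting, producing a map of likely shrinkage locations.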

Today computer models are regularly used in foundries for process simulation of mould filling, solidification and cooling for sand casting, shell mould casting, and die casting (gravity, high pressure and low pressure). Moreover, they can predict the final structures and properties of castings of many different alloys. The acceptance of computational modelling of solidification by the industry is a direct result of the gigantic strides made by solidification science in the last two decades.

Metal casting applications have also expanded in step with this scientific progress, and the medical field is one of the beneficiaries. The earliest records of the use of metallic implants in surgery go back to the 16th century. Today, the selection of materials for medical applications is based on considerations of biocompatibility, which spurred the development of a new class of materials: biomaterials. Commonly used metallic biomaterials belong to one of three corrosion-resistant alloy systems: iron-chromium-nickel alloys (austenitic stainless steels), cobalt-chromium-based alloys, and titanium and its alloys. The first 18-8 stainless steel surgical implant, in 1926, was a hip implant. However, stainless steel is the least corrosion-resistant of the three, and today it is used for temporary implants only. In 1953, G. McKee performed the first hip replacement with a Co-Cr prosthesis, as Co-Cr alloys do not corrode in the body. Yet metal ions slowly diffuse through the oxide layer and accumulate in the surrounding fibrous tissue, whose thickness is proportional to the amount and toxicity of the dissolution products. Pure titanium, by contrast, produces minimal fibrous encapsulation.

The number of casting alloys developed and used by the industry continues to grow. The 2018 AFS census of world casting production states, "Worldwide casting production growth slowed but still reported a 2.6% increase in 2018", with cast iron amounting to 60% of the total. Currently, castings are used in architecture, cooking and many industrial applications, some of which have been discussed in an earlier publication.

Additive manufacturing
More recently, a new technology that involves melting and shaping, but is not metal casting, has captured the enthusiasm of innovators: additive manufacturing (AM). The term can be regarded as the opposite of subtractive manufacturing, a retronym for the old machining process. One of the most remarkable AM processes uses a laser beam to melt and fuse metallic powders under a protective gas atmosphere. A part previously designed digitally in CAD is printed onto a build platform layer by layer. The ASTM standard term is powder bed fusion, also known as selective laser melting. Good mechanical characteristics are achieved, and all common post-processing options are possible. While originally used mostly for 3D printing of prototypes, it is now used in the transportation industry (aeroplanes, automotive), the health sector (e.g., joint replacement, orthopaedic implants, craniomaxillofacial reconstruction) and even in architecture.