Thursday, October 27, 2022

History, definition, terminology in nanoscience and importance of Moore’s law


History

The history of nanotechnology traces the development of the concepts and experimental work falling under the broad category of nanotechnology. Although nanotechnology is a relatively recent development in scientific research, the development of its central concepts happened over a longer period of time. The emergence of nanotechnology in the 1980s was driven by the convergence of experimental advances, such as the invention of the scanning tunneling microscope in 1981 and the discovery of fullerenes in 1985, with the elucidation and popularization of a conceptual framework for the goals of nanotechnology, beginning with the 1986 publication of the book Engines of Creation. The field attracted growing public awareness and controversy in the early 2000s, with prominent debates about both its potential implications and the feasibility of the applications envisioned by advocates of molecular nanotechnology, and with governments moving to promote and fund research into nanotechnology. The early 2000s also saw the beginnings of commercial applications of nanotechnology, although these were limited to bulk applications of nanomaterials rather than the transformative applications envisioned by the field.
On December 29, 1959, the American physicist Richard Feynman gave the lecture "There's Plenty of Room at the Bottom" at an American Physical Society meeting at Caltech, which is often held to have provided inspiration for the field of nanotechnology. Feynman described a process by which the ability to manipulate individual atoms and molecules might be developed, using one set of precise tools to build and operate another proportionally smaller set, and so on down to the needed scale. In the course of this, he noted, scaling issues would arise from the changing magnitude of various physical phenomena: gravity would become less important, while surface tension and van der Waals attraction would become more important.
After Feynman's death, scholars studying the historical development of nanotechnology have concluded that his actual role in catalyzing nanotechnology research was limited, based on recollections from many of the people active in the nascent field in the 1980s and 1990s. Chris Toumey, a cultural anthropologist at the University of South Carolina, found that the published versions of Feynman’s talk had a negligible influence in the twenty years after it was first published, as measured by citations in the scientific literature, and not much more influence in the decade after the Scanning Tunneling Microscope was invented in 1981. Subsequently, interest in “Plenty of Room” in the scientific literature greatly increased in the early 1990s. This is probably because the term “nanotechnology” gained serious attention just before that time, following its use by K. Eric Drexler in his 1986 book, Engines of Creation: The Coming Era of Nanotechnology, which took the Feynman concept of a billion tiny factories and added the idea that they could make more copies of themselves via computer control instead of control by a human operator; and in a cover article headlined "Nanotechnology", published later that year in a mass-circulation science-oriented magazine, OMNI. Toumey’s analysis also includes comments from distinguished scientists in nanotechnology who say that “Plenty of Room” did not influence their early work, and in fact most of them had not read it until a later date.
These and other developments hint that the retroactive rediscovery of Feynman’s “Plenty of Room” gave nanotechnology a packaged history that provided an early date of December 1959, plus a connection to the charisma and genius of Richard Feynman. Feynman's stature as a Nobel laureate and as an iconic figure in 20th century science surely helped advocates of nanotechnology and provided a valuable intellectual link to the past.

The Japanese scientist Norio Taniguchi of the Tokyo University of Science was the first to use the term "nano-technology" in a 1974 conference, to describe semiconductor processes such as thin film deposition and ion beam milling exhibiting characteristic control on the order of a nanometer. His definition was, "'Nano-technology' mainly consists of the processing of, separation, consolidation, and deformation of materials by one atom or one molecule."
In the 1980s the idea of nanotechnology as a deterministic, rather than stochastic, handling of individual atoms and molecules was conceptually explored in depth by K. Eric Drexler, who promoted the technological significance of nano-scale phenomena and devices through speeches and two influential books.
In 1979, Drexler encountered Feynman's provocative 1959 talk "There's Plenty of Room at the Bottom". The term "nanotechnology", which had been coined by Taniguchi in 1974, was unknowingly appropriated by Drexler in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which proposed the idea of a nanoscale "assembler" that would be able to build a copy of itself and of other items of arbitrary complexity. He also first published the term "grey goo" to describe what might happen if a hypothetical self-replicating molecular nanotechnology went out of control. Drexler's vision of nanotechnology is often called "molecular nanotechnology" (MNT) or "molecular manufacturing", and Drexler at one point proposed the term "zettatech", which never became popular.
Drexler's 1991 Ph.D. at the MIT Media Lab was the first doctoral degree awarded on the topic of molecular nanotechnology, and (after some editing) his thesis, "Molecular Machinery and Manufacturing with Applications to Computation," was published as Nanosystems: Molecular Machinery, Manufacturing, and Computation, which received the Association of American Publishers award for Best Computer Science Book of 1992. Drexler founded the Foresight Institute in 1986 with the mission of "Preparing for nanotechnology." He is no longer a member of the Foresight Institute.

Experimental advances

Nanotechnology and nanoscience got a boost in the early 1980s with two major developments: the birth of cluster science and the invention of the scanning tunneling microscope (STM). These developments led to the discovery of fullerenes in 1985 and the structural assignment of carbon nanotubes a few years later.

Invention of scanning probe microscopy

The scanning tunneling microscope, an instrument for imaging surfaces at the atomic level, was developed in 1981 by Gerd Binnig and Heinrich Rohrer at the IBM Zurich Research Laboratory, for which they were awarded the Nobel Prize in Physics in 1986. Binnig, Calvin Quate and Christoph Gerber invented the first atomic force microscope in 1986. The first commercially available atomic force microscope was introduced in 1989.
IBM researcher Don Eigler was the first to manipulate atoms using a scanning tunneling microscope, in 1989. He used 35 xenon atoms to spell out the IBM logo. He shared the 2010 Kavli Prize in Nanoscience for this work.

Advances in interface and colloid science

Interface and colloid science had existed for nearly a century before they became associated with nanotechnology. The first observations and size measurements of nanoparticles had been made during the first decade of the 20th century by Richard Adolf Zsigmondy, winner of the 1925 Nobel Prize in Chemistry, who made a detailed study of gold sols and other nanomaterials with sizes down to 10 nm using an ultramicroscope which was capable of visualizing particles much smaller than the light wavelength. Zsigmondy was also the first to use the term "nanometer" explicitly for characterizing particle size. In the 1920s, Irving Langmuir, winner of the 1932 Nobel Prize in Chemistry, and Katharine B. Blodgett introduced the concept of a monolayer, a layer of material one molecule thick. In the early 1950s, Derjaguin and Abrikosova conducted the first measurement of surface forces.
In 1974 the process of atomic layer deposition for depositing uniform thin films one atomic layer at a time was developed and patented by Tuomo Suntola and co-workers in Finland.
In another development, the synthesis and properties of semiconductor nanocrystals were studied. This led to a rapidly increasing number of studies of semiconductor nanoparticles known as quantum dots.

Discovery of fullerenes

Fullerenes were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry. Smalley's research in physical chemistry investigated the formation of inorganic and semiconductor clusters using pulsed molecular beams and time-of-flight mass spectrometry. As a consequence of this expertise, Curl introduced him to Kroto in order to investigate a question about the constituents of astronomical dust: carbon-rich grains expelled by old stars such as R Coronae Borealis. The result of this collaboration was the discovery of C60 and the fullerenes as the third allotropic form of carbon. Subsequent discoveries included the endohedral fullerenes, and the larger family of fullerenes the following year.
The discovery of carbon nanotubes is largely attributed to Sumio Iijima of NEC in 1991, although carbon nanotubes had been produced and observed under a variety of conditions prior to 1991. Iijima's discovery of multi-walled carbon nanotubes in the insoluble material of arc-burned graphite rods in 1991, together with Mintmire, Dunlap, and White's independent prediction that if single-walled carbon nanotubes could be made they would exhibit remarkable conducting properties, helped create the initial buzz that is now associated with carbon nanotubes. Nanotube research accelerated greatly following the independent discoveries by Bethune at IBM and Iijima at NEC of single-walled carbon nanotubes, and of methods to produce them specifically by adding transition-metal catalysts to the carbon in an arc discharge.
In the early 1990s Huffman and Kraetschmer, of the University of Arizona, discovered how to synthesize and purify large quantities of fullerenes. This opened the door to their characterization and functionalization by hundreds of investigators in government and industrial laboratories. Shortly after, rubidium-doped C60 was found to be a mid-temperature (Tc = 32 K) superconductor. At a meeting of the Materials Research Society in 1992, Dr. T. Ebbesen (NEC) described to a spellbound audience his discovery and characterization of carbon nanotubes. This event sent those in attendance, and others downwind of his presentation, into their laboratories to reproduce and push those discoveries forward. Using the same or similar tools as those used by Huffman and Kraetschmer, hundreds of researchers further developed the field of nanotube-based nanotechnology.
The National Nanotechnology Initiative is a United States federal nanotechnology research and development program. Its goals are to advance a world-class nanotechnology research and development (R&D) program; foster the transfer of new technologies into products for commercial and public benefit; develop and sustain educational resources, a skilled workforce, and the supporting infrastructure and tools to advance nanotechnology; and support responsible development of nanotechnology.
Nanoscience is an emerging area of science concerned with the study of materials that have very small dimensions, in the nanoscale range. The word itself is a combination of "nano", from the Greek "nanos" (or Latin "nanus"), meaning "dwarf", and the word "science", meaning knowledge. It is an interdisciplinary field that seeks to bring about mature nanotechnology, focusing on the nanoscale intersection of fields such as physics, biology, engineering, chemistry, computer science and more.

Nanoscience is the study of phenomena on a nanometer scale. Atoms are a few tenths of a nanometer in diameter, and molecules are typically a few nanometers in size. The nanometer is a special point on the length scale, for this is where the smallest man-made devices meet the atoms and molecules of the natural world. The prefix nano means 10⁻⁹, so a nanometer is one billionth of a meter and is the unit of length generally most appropriate for describing the size of a single molecule. Nanometer-scale objects are too small to be seen with the naked eye. In fact, to see a 10 nm marble held in your hand, your eye would have to be smaller than the width of a human hair. A rough working definition of nanoscience is the study of anything that has at least one dimension smaller than 100 nanometers.
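
To make these scale comparisons concrete, here is a minimal Python sketch; the example sizes are rounded, illustrative values (not measurements), and the 100 nm cutoff is the rough criterion quoted above.

```python
# Illustrative sketch: expressing a few familiar length scales in nanometers.
# The example sizes are rounded, approximate values for comparison only.

NM_PER_M = 1e9  # 1 m = 10^9 nm, i.e. 1 nm = 10^-9 m

sizes_in_meters = {
    "hydrogen atom (diameter)": 1e-10,      # a few tenths of a nanometer
    "typical small molecule": 1e-9,         # around a nanometer
    "gold sol particle (Zsigmondy)": 1e-8,  # roughly 10 nm
    "width of a human hair": 1e-4,          # roughly 100 micrometers
}

for name, meters in sizes_in_meters.items():
    nanometers = meters * NM_PER_M
    # Rough criterion from the text: at least one dimension below 100 nm.
    is_nanoscale = nanometers < 100
    print(f"{name}: {nanometers:g} nm -> nanoscale? {is_nanoscale}")
```
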
Terminology
Nanoarray: an ultra-sensitive, ultra-miniaturized array for biomolecular analysis. BioForce Nanosciences' Nanoarrays utilize approximately 1/10,000th of the surface area occupied by a conventional microarray, and over 1,500 nanoarray spots can be placed in the area occupied by a single microarray domain.
Nanoassembler: the Holy Grail of nanotechnology; once a perfected nanoassembler is available, building anything becomes possible, with physics and the imagination the only limitations (of course, each item would have to be designed first, which is another small hurdle).
Nanobeads: polymer beads with diameters between 0.1 and 10 micrometers. Also called nanodots, nanocrystals and quantum beads.
Nanobiotechnology: applying the tools and processes of MNT to build devices for studying biosystems, in order to learn from biology how to create better nanoscale devices.
Nanochips: smaller microchips. They are also a next-generation device for mass storage, offering significantly higher density, greater speed, and much lower cost.
Nanocomputer: a computer made from components (mechanical, electronic, or otherwise) built at the nanometer scale. These computers could be many orders of magnitude faster than today's, which would enable software to take proportional leaps.
Nanocontainers: also called "micellar nanocontainers" or "micelles," these are nanoscale polymeric containers that could be used to selectively deliver hydrophobic drugs to specific sites within individual cells.
Nanocrystals: nanoscale semiconductor crystals. Nanocrystals might be used to make super-strong and long-lasting metal parts. The crystals might also be added to plastics and other metals to make new types of composite structures for everything from cars to electronics.
NEMS - nanoelectromechanical systems: a generic term for nanoscale electrical/mechanical devices; essentially nanoscale MEMS.


Nanofilters: one opportunity for nanoscale filters is the separation of molecules, such as proteins or DNA, for research in genomics.

Nanofluidics: controlling nanoscale amounts of fluids.
Nanogate: a device that precisely meters the flow of tiny amounts of fluid. Precise control of the flow restriction is accomplished by deflecting a highly polished cantilevered plate. The opening is adjustable on a sub-nanometer scale, limited by the roughness of the polished plates. Thus, the Nanogate is an Ultra Surface Finish Effect Mechanism (USFEM). The Nanogate can be fabricated on a macro-, meso- or micro- (MEMS) scale.
Nanomanipulation: The process of manipulating items at an atomic or molecular scale in order to produce precise structures.
Nanomaterials: can be subdivided into nanoparticles, nanofilms and nanocomposites. The focus of nanomaterials is a bottom-up approach to structures and functional effects, whereby the building blocks of materials are designed and assembled in controlled ways.
Nanomedicine: the application of nanotechnology to medicine, including nanoscale materials and devices for diagnosis, drug delivery, and treatment.
Nanopharmaceuticals: nanoscale particles used to modulate drug transport for drug uptake and delivery applications.
Nanopores: essentially extremely small holes. In nanopore DNA analysis, a DNA strand is driven through an extremely small channel separating two oppositely charged fluid reservoirs. Nanoscopic pores are also found in purpose-built filters, sensors, and diffraction gratings, where they improve function; in activated carbon, they may also be used as an alternative fuel-storage medium because of their massive internal surface area.
Nanoprobe: Nanoscale machines used to diagnose, image, report on, and treat disease within the body.
Nanorods: also called carbon nanorods; formed from multi-walled carbon nanotubes. Another nanoscale material with unique and promising physical properties that may yield improvements in high-density data storage and allow for cheaper flexible solar cells.


Nanoscale: 1 - 100 nanometer range.
Nanotube: A one dimensional fullerene (a convex cage of atoms with only hexagonal and/or pentagonal faces) with a cylindrical shape. Strictly speaking, any tube with nanoscale dimensions, but generally used to refer to carbon nanotubes (a commonly mentioned non-carbon variety is made of boron nitride), which are sheets of graphite rolled up to make a tube. The dimensions are variable (down to 0.4 nm in diameter) and you can also get nanotubes within nanotubes, leading to a distinction between multi-walled and single-walled nanotubes. Apart from remarkable tensile strength, nanotubes exhibit varying electrical properties (depending on the way the graphite structure spirals around the tube, and other factors), and can be insulating, semiconducting or conducting (metallic).


nm: Abbreviation for Nanometer.

NRAM: nanotube-based nonvolatile RAM, developed by Nantero using proprietary concepts and methods derived from leading-edge research in nanotechnology.

NBIC: Nanotechnology, Biotechnology, Information Technology and Cognitive Science. 

Bottom-up and top-down approaches: Bottom-up manufacturing would provide components made of single molecules, which are held together by covalent forces that are far stronger than the forces that hold together macro-scale components. AFM-based manipulation, liquid-phase techniques based on inverse micelles, sol-gel processing, chemical vapor deposition (CVD), laser pyrolysis and molecular self-assembly all use a bottom-up approach for nanoscale material manufacturing.

The top-down method of manufacturing involves the construction of parts through methods such as cutting, carving and molding. Using these methods, we have been able to fabricate a remarkable variety of machinery and electronic devices. Milling, nanolithography, hydrothermal techniques (for some materials), laser ablation, physical vapor deposition and electrochemical methods (electroplating) use a top-down approach for nanoscale material manufacturing.
Angstrom: A unit of length equal to 0.1 nanometer = 10⁻¹⁰ meters.
Atomic force microscope (AFM): An instrument that uses a sharp tip to probe a surface. By moving the tip around, an image of the surface can be made. Height differences as small as 10⁻¹¹ m can be measured this way.
C60: A molecule consisting of 60 carbon atoms arranged in the pattern found on a soccer ball. C60 molecules are also known as buckminsterfullerene and as buckyballs.
Carbon nanotube: Carbon nanotubes are cylindrical structures made only from carbon atoms that are about 1 nm in diameter and 1-100 microns in length. Carbon nanotubes are very strong; they are about 5 times as strong as steel for the same weight. The electrical properties of carbon nanotubes depend on their diameter and their chirality. Some tubes are metallic and some are semiconductors.
Electron-beam lithography: Electron beam lithography is a process that can be used to write fine patterns with an electron beam. An electron beam can be generated by extracting electrons from a sharp needle with an electric field. If the electrons are then accelerated towards a metal plate with a small hole in it, a narrow beam of electrons emerges from this hole. The electrons are typically accelerated through a potential of several kilovolts and travel at a good fraction of the speed of light. An electron beam can be deflected by electric and magnetic fields, which makes it possible to write with the beam. Such electron beams are used in televisions and computer monitors. It is possible to focus an electron beam to a very small diameter of about 1 nanometer. Such narrowly focused beams are used in scanning electron microscopes, transmission electron microscopes, and electron-beam pattern generators. Typically, in an electron-beam pattern generator, the electron beam writes a pattern in a thin organic film that is coated over a wafer. Usually the wafer is a single crystal of silicon that is about 0.5 mm thick and 10-30 cm in diameter. The thin (~100 nm) organic film contains long molecules that wrap around each other like cooked spaghetti. The energetic electrons in the electron beam cut these molecules into small pieces. The film is then dipped in developer, which dissolves the short pieces but leaves the long unexposed sections unaffected. A pattern is thereby defined in the resist. The pattern can be transferred to the substrate by putting the wafer in a gas or liquid that dissolves the substrate material. Patterns with features of a few tens of nanometers can be made this way.
Micelles: Micelles are small, spherical structures composed of molecules that attract one another to reduce surface tension. The head of the molecule is hydrophilic, meaning it likes water, while the interior portion is hydrophobic, meaning it avoids water.
Moore's law: The observation by Intel co-founder Gordon Moore that the number of transistors on a computer chip doubles roughly every 1.5 to 2 years. He also observed that the number of instructions per second performed by a chip doubles on a similar timescale. The computing power of microprocessors has been growing exponentially for the last 40 years, but this cannot continue indefinitely.
NEMS: nanoelectromechanical systems. Devices based on the movement of nanometer-scale components.
Quantum dots: Quantum dots are regions of semiconductors that can be occupied by a few electrons. These dots have many properties that are similar to atoms.
Semiconductor: A pure semiconductor is a poor electrical conductor, but when certain impurities are added, the conductivity can increase by orders of magnitude. Common semiconductors are silicon, germanium, and gallium arsenide. Computer chips are usually made from thin wafers cut from large single crystals of silicon. The silicon is a poorly conducting matrix in which better-conducting regions are defined by adding impurity atoms (often called dopants).
Moore's law
Moore's law describes a long-term trend in the history of computing hardware: the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years. This trend has continued for more than half a century. Sources in 2005 expected it to continue until at least 2015 or 2020. However, the 2010 update to the International Technology Roadmap for Semiconductors has growth slowing at the end of 2013, after which transistor counts and densities are expected to double only every three years.
The capabilities of many digital electronic devices are strongly linked to Moore's law: processing speed, memory capacity, sensors and even the number and size of pixels in digital cameras. All of these are improving at (roughly) exponential rates as well. This exponential improvement has dramatically enhanced the impact of digital electronics in nearly every segment of the world economy. Moore's law describes a driving force of technological and social change in the late 20th and early 21st centuries.
The law is named after Intel co-founder Gordon E. Moore, who described the trend in his 1965 paper. The paper noted that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965, and predicted that the trend would continue "for at least ten years". His prediction has proved to be uncannily accurate, in part because the law is now used in the semiconductor industry to guide long-term planning and to set targets for research and development.

Computer industry technology "roadmaps" predicted (as of 2001) that Moore's law would continue for several chip generations. Depending on the doubling time used in the calculations, this could mean up to a hundredfold increase in transistor count per chip within a decade. The semiconductor industry technology roadmap uses a three-year doubling time for microprocessors, leading to a tenfold increase over the next decade. Intel was reported in 2005 as stating that the downsizing of silicon chips with good economics could continue during the next decade, and in 2008 as predicting the trend through 2029.
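
The projections above reduce to simple doubling arithmetic. The following Python sketch shows how the choice of doubling time separates the hundredfold, thirty-twofold, and tenfold per-decade figures; the baseline transistor count and the doubling times are illustrative assumptions, not industry data.

```python
# Minimal sketch of the doubling arithmetic behind Moore's-law projections.
# The starting transistor count and the doubling times are illustrative
# assumptions, not actual industry figures.

def projected_count(initial_count: float, years: float, doubling_time: float) -> float:
    """Transistor count after `years`, doubling every `doubling_time` years."""
    return initial_count * 2 ** (years / doubling_time)

initial = 1e9  # assume a chip with one billion transistors as the baseline
for doubling_time in (1.5, 2.0, 3.0):
    growth = projected_count(initial, years=10, doubling_time=doubling_time) / initial
    print(f"doubling every {doubling_time:g} years -> about {growth:.0f}x per decade")

# Roughly 100x for a 1.5-year doubling time (the 'hundredfold' upper bound quoted
# above), about 32x for 2 years, and about 10x for the 3-year roadmap figure.
```
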
