Chip

A chip, also known as an integrated circuit, is a miniaturized electronic circuit, consisting mainly of semiconductor devices and passive components, that is manufactured on the surface of a thin substrate of semiconductor material. The chip goes by many names, including microcircuit, microchip, and silicon chip. Integrated circuits are used in virtually all present-day electronic equipment and have transformed the world of electronics: computers, cellular phones, and other devices built around them have become essential to the structure of modern societies.

A hybrid integrated circuit is a miniaturized electronic circuit built from individual semiconductor devices and passive components bonded to a substrate or circuit board. A monolithic integrated circuit is made of devices manufactured by diffusion of trace elements into a single piece of semiconductor substrate: the chip.

Early experimental discoveries showed that semiconductor devices could perform the functions of vacuum tubes, which made integrated circuits possible and advanced semiconductor device fabrication. Integrating large numbers of tiny transistors onto a small chip was a massive improvement over the manual assembly of circuits from discrete electronic components, and it made integrated circuits suitable for mass production. Early development of the integrated circuit dates back to 1949, when Werner Jacobi filed a patent for an integrated-circuit-like amplifying device showing five transistors on a common substrate in a two-stage amplifier arrangement. Jacobi cited small and inexpensive hearing aids as typical industrial applications of his patent. Although Jacobi made a significant contribution, the idea of the integrated circuit is credited to the radar scientist Geoffrey Dummer, who, despite his best efforts, was never able to build a working device himself. Another early idea was to make small ceramic squares, each containing a single miniaturized component, which could then be integrated and wired into compact two- or three-dimensional grids. Jack Kilby proposed this idea to the US Army; however, his later, revolutionary design was for a true integrated circuit, and he successfully demonstrated the first working integrated circuit on September 12, 1958. Robert Noyce developed his own integrated circuit design soon after, and Noyce's chip solved many practical problems that Kilby's had not.

The first integrated circuits contained only a few transistors, a level of complexity known as small-scale integration (SSI). SSI circuits were crucial to early aerospace projects, and those projects in turn drove the development of SSI. Next came medium-scale integration (MSI), with devices containing hundreds of transistors per chip. Engineers were drawn to them because they allowed more complex systems to be built using smaller circuit boards and less assembly work. Further development produced large-scale integration (LSI), with tens of thousands of transistors per chip, and then very-large-scale integration (VLSI), with hundreds of thousands.

As technology advanced, ultra-large-scale integration (ULSI) was proposed for complex chips containing more than one million transistors. Wafer-scale integration (WSI) is an approach to building very large integrated circuits that uses an entire silicon wafer to produce a single super-chip. Through its large size and reduced packaging, WSI has the potential to significantly decrease system costs.

A system-on-a-chip (SoC) is an integrated circuit that contains all the components needed for a computer or other system on a single chip. Designing an SoC can be complex and costly, and building different components on a single piece of silicon may compromise the efficiency of some elements; however, these disadvantages are offset by lower manufacturing costs and reduced power consumption. A three-dimensional integrated circuit has two or more layers of active electronic components integrated both vertically and horizontally into a single circuit. Communication between the layers uses on-die signaling, which makes power consumption much lower than in equivalent separate circuits.

The process of manufacturing an integrated circuit can be quite complex. Pure silicon is the most common base for the entire chip and is chemically doped to create the N and P regions that make up the integrated circuit's components. Aluminum is commonly used to interconnect the various IC components. The thin wires that lead from the integrated circuit chip to its mounting package may be aluminum or gold, and the mounting package itself may be ceramic or plastic. Hundreds of integrated circuits are made at the same time on a single thin slice of silicon and are then cut apart into individual chips. First, the silicon is purified: a silicon rod is held vertically inside a vacuum chamber while a heating coil is moved slowly along it toward the bottom, and the bottom end is then cut off, leaving a cylindrical ingot of purified silicon. The ingot is sliced into wafers, and the wafer surfaces are coated with a layer of silicon dioxide for insulation. To form the mask pattern, a drop of photoresist material is placed at the center of the silicon wafer, distributed evenly over the surface, and baked. The coated wafer is then placed under the first layer mask and exposed to light. Once exposure is complete, the mask is removed and parts of the photoresist are dissolved away. This process is repeated for each successive layer until the integrated circuit chips are complete.
