Mec Assignment
2/25/2019
[ ME – 475 ]
Submitted by
Name : Md. Shihab Rahman
Student ID : 1410148
Department : Mechanical Engineering
(1) Machine Learning through Artificial Intelligence
Thinking Humanly:
“The exciting new effort to make computers think . . . machines with minds, in the full and literal sense.” (Haugeland, 1985)
“[The automation of] activities that we associate with human thinking, activities such as decision-making, problem solving, learning . . .” (Bellman, 1978)
Thinking Rationally:
“The study of mental faculties through the use of computational models.” (Charniak and McDermott, 1985)
“The study of the computations that make it possible to perceive, reason, and act.” (Winston, 1992)
Acting Humanly:
“The art of creating machines that perform functions that require intelligence when performed by people.” (Kurzweil, 1990)
“The study of how to make computers do things at which, at the moment, people are better.” (Rich and Knight, 1991)
Acting Rationally:
“Computational Intelligence is the study of the design of intelligent agents.” (Poole et al., 1998)
“AI . . . is concerned with intelligent behavior in artifacts.” (Nilsson, 1998)
Figure 1.1 Some definitions of artificial intelligence, organized into four categories.
The first work that is now generally recognized as AI was done by Warren McCulloch and
Walter Pitts (1943). They drew on three sources: knowledge of the basic physiology and function
of neurons in the brain; a formal analysis of propositional logic due to Russell and
Whitehead; and Turing’s theory of computation. They proposed a model of artificial neurons in
which each neuron is characterized as being “on” or “off,” with a switch to “on” occurring in
response to stimulation by a sufficient number of neighboring neurons. The state of a neuron was
conceived of as “factually equivalent to a proposition which proposed its adequate stimulus.”
They showed, for example, that any computable function could be computed by some network of
connected neurons, and that all the logical connectives (and, or, not, etc.) could be implemented
by simple net structures. McCulloch and Pitts also suggested that suitably defined networks
could learn. Donald Hebb (1949) demonstrated a simple updating rule for modifying the
connection strengths between neurons. His rule, now called Hebbian
learning, remains an influential model to this day.
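The McCulloch-Pitts unit and Hebb's rule can be illustrated with a short sketch (a modern paraphrase for illustration, not the authors' original notation): a unit switches “on” when its weighted stimulation reaches a threshold, and simple nets realize the logical connectives.

```python
# A minimal McCulloch-Pitts threshold unit (illustrative paraphrase):
# the neuron fires ("on") when the weighted sum of its inputs reaches
# a threshold, otherwise it stays "off".

def mp_neuron(inputs, weights, threshold):
    """Return 1 ('on') if stimulation reaches the threshold, else 0 ('off')."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

# The basic logical connectives as simple net structures:
AND = lambda a, b: mp_neuron([a, b], [1, 1], 2)
OR  = lambda a, b: mp_neuron([a, b], [1, 1], 1)
NOT = lambda a:    mp_neuron([a],    [-1],   0)

# Hebb's rule, in its simplest form: strengthen a connection when the
# input and the output are active at the same time.
def hebbian_update(weights, inputs, output, rate=0.1):
    return [w + rate * x * output for w, x in zip(weights, inputs)]
```

Chaining such units gives the networks the text describes: since AND, OR, and NOT suffice to express any Boolean function, layered nets of these neurons can compute any such function.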
Two undergraduate students at Harvard, Marvin Minsky and Dean Edmonds, built the first
neural network computer in 1950. The SNARC, as it was called, used 3000 vacuum tubes and a
surplus automatic pilot mechanism from a B-24 bomber to simulate a network of
40 neurons. Later, at Princeton, Minsky studied universal computation in neural networks.
His Ph.D. committee was skeptical about whether this kind of work should be considered
mathematics, but von Neumann reportedly said, “If it isn’t now, it will be someday.” Minsky
was later to prove influential theorems showing the limitations of neural network research.
There were a number of early examples of work that can be characterized as AI, but
Alan Turing’s vision was perhaps the most influential. He gave lectures on the topic as early as
1947 at the London Mathematical Society and articulated a persuasive agenda in his 1950 article
“Computing Machinery and Intelligence.” Therein, he introduced the Turing Test, machine
learning, genetic algorithms, and reinforcement learning. He proposed the Child
Programme idea, explaining “Instead of trying to produce a programme to simulate the adult
mind, why not rather try to produce one which simulated the child’s?”
The first successful commercial expert system, R1, began operation at the Digital Equipment
Corporation (McDermott, 1982). The program helped configure orders for new computer
systems; by 1986, it was saving the company an estimated $40 million a year. By 1988,
DEC’s AI group had 40 expert systems deployed, with more on the way. DuPont had 100 in use
and 500 in development, saving an estimated $10 million a year. Nearly every major U.S.
corporation had its own AI group and was either using or investigating expert systems.
In 1981, the Japanese announced the “Fifth Generation” project, a 10-year plan to build
intelligent computers running Prolog. In response, the United States formed the Microelectronics
and Computer Technology Corporation (MCC) as a research consortium designed to assure
national competitiveness. In both cases, AI was part of a broad effort, including chip design and
human-interface research. In Britain, the Alvey report reinstated the funding that was cut by the
Lighthill report. In all three countries, however, the projects never met their ambitious goals.
Overall, the AI industry boomed from a few million dollars in 1980 to billions of dollars in 1988,
including hundreds of companies building expert systems, vision systems, robots, and software
and hardware specialized for these purposes. Soon after that came a period called the
“AI Winter,” in which many companies fell by the wayside as they failed to deliver on
extravagant promises.
Throughout the 60-year history of computer science, the emphasis has been on the algorithm as
the main subject of study. But some recent work in AI suggests that for many problems, it makes
more sense to worry about the data and be less picky about what algorithm to apply.
This is true because of the increasing availability of very large data sources: for example,
trillions of words of English and billions of images from the Web (Kilgarriff and Grefenstette,
2006); or billions of base pairs of genomic sequences (Collins et al., 2003).
One influential paper in this line was Yarowsky’s (1995) work on word-sense disambiguation:
given the use of the word “plant” in a sentence, does that refer to flora or factory?
Previous approaches to the problem had relied on human-labeled examples combined with
machine learning algorithms. Yarowsky showed that the task can be done, with accuracy above
96%, with no labeled examples at all. Instead, given a very large corpus of unannotated text and
just the dictionary definitions of the two senses—“works, industrial plant” and “flora, plant
life”—one can label examples in the corpus, and from there bootstrap to learn
new patterns that help label new examples. Banko and Brill (2001)
show that techniques like this perform even better as the amount of available text goes from a
million words to a billion and that the increase in performance from using more data exceeds any
difference in algorithm choice; a mediocre algorithm with 100 million words of unlabeled
training data outperforms the best known algorithm with 1 million words.
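The flavor of Yarowsky's bootstrapping can be conveyed with a toy sketch. This is illustrative only: the real algorithm learns ranked decision lists over a very large corpus, and the tiny corpus and seed words below are invented for the example.

```python
# Toy Yarowsky-style bootstrapping (illustrative only). Seed collocations
# come from the dictionary definitions of the two senses; each pass labels
# sentences containing a known collocation, then adopts the other words in
# those sentences as new collocations for the same sense.

corpus = [
    "the industrial plant produced steel",
    "workers at the plant produced cars",
    "the plant grew green leaves",
    "green plant life needs sunlight",
]

seeds = {"factory": {"industrial"}, "flora": {"life"}}

def bootstrap(corpus, seeds, passes=3):
    colloc = {sense: set(words) for sense, words in seeds.items()}
    labels = {}
    for _ in range(passes):
        for i, sentence in enumerate(corpus):
            words = set(sentence.split())
            for sense in colloc:
                if i not in labels and words & colloc[sense]:
                    labels[i] = sense
                    # adopt co-occurring words as new collocations
                    colloc[sense] |= words - {"plant", "the"}
    return labels
```

Here sentence 2 contains no seed word, yet it is labeled on a later pass once “green” has been learned as a flora collocation, which is the bootstrapping effect the text describes.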
As another example, Hays and Efros (2007) discuss the problem of filling in holes in a
photograph. Suppose you use Photoshop to mask out an ex-friend from a group photo, but now
you need to fill in the masked area with something that matches the background. Hays and Efros
defined an algorithm that searches through a collection of photos to find something that will
match. They found the performance of their algorithm was poor when they used a collection of
only ten thousand photos, but crossed a threshold into excellent performance when they grew the
collection to two million photos.
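The data-driven idea behind scene completion can be sketched as a simple nearest-neighbor search. This is a deliberate simplification: Hays and Efros match scene descriptors and then blend the retrieved region, and the descriptors below are made up for illustration.

```python
# Nearest-neighbor retrieval sketch (illustrative): each photo is reduced
# to a small numeric descriptor, and the best candidate for filling the
# masked area is simply the photo closest to the query in descriptor space.
# With a larger collection, the nearest neighbor tends to be a better match,
# which is the effect Hays and Efros observed.

def distance(a, b):
    """Squared Euclidean distance between two descriptors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def best_match(query_descriptor, collection):
    """collection: list of (name, descriptor); return name of the closest photo."""
    return min(collection, key=lambda item: distance(query_descriptor, item[1]))[0]

# Hypothetical photo collection with 3-number descriptors:
photos = [("beach",  (0.9, 0.8, 0.5)),
          ("forest", (0.2, 0.6, 0.2)),
          ("city",   (0.5, 0.5, 0.5))]
```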
Work like this suggests that the “knowledge bottleneck” in AI—the problem of how to express
all the knowledge that a system needs—may be solved in many applications by learning methods
rather than hand-coded knowledge engineering, provided the learning algorithms have enough
data to go on (Halevy et al., 2009). Reporters have noticed the surge of new applications and
have written that “AI Winter” may be yielding to a new spring (Havenstein,
2005). As Kurzweil (2005) writes, “today, many thousands of AI applications are deeply
embedded in the infrastructure of every industry.”
(2) Robotics
Following World War II, America experienced a strong industrial push, reinvigorating the
economy. Rapid advancement in technology drove this industrial wave—servos, digital
logic, solid state electronics, etc. The merger of this technology and the world of science
fiction came in the form of the vision of Joseph Engelberger, the ingenuity of George
Devol, and their chance meeting in 1956. Joseph F. Engelberger was born on July 26, 1925,
in New York City. Growing up, Engelberger developed a fascination for science fiction,
especially that written by Isaac Asimov. Of particular interest in the science fiction world
was the robot, which led him to pursue physics at Columbia University, where he earned
both his bachelor’s and master’s degrees. Engelberger served in the U.S. Navy and later
worked as a nuclear physicist in the aerospace industry. In 1946, a creative inventor by
the name of George C. Devol, Jr., patented a playback device used for controlling
machines. The device used a magnetic process recorder to accomplish the control. Devol’s
drive toward automation led him to another invention in 1954, for which he applied for a
patent, writing, “The present invention makes available for the first time a more or less
general purpose machine that has universal application to a vast diversity of applications
where cyclic control is desired.” Devol had dubbed his invention universal automation, or
unimation for short. Whether it was fate, chance, or just good luck, Devol and Engelberger
met at a cocktail party in 1956. Their conversation revolved around robotics, automation,
Asimov, and Devol’s patent application, “A Programmed Article Transfer,” which
Engelberger’s imagination translated into “robot.” Following this chance meeting,
Engelberger and Devol formed a partnership that led to the birth of the industrial robot.
Engelberger took out a license under Devol’s patent and bought out his employer,
renaming the new company Consolidated Controls Corporation, based out of his garage.
His team of engineers that had been working on aerospace and nuclear applications
refocused their efforts on the development of the first industrial robot, named the
Unimate, after Devol’s “unimation.” The first Unimate was born in 1961 and was delivered
to General Motors in Trenton, New Jersey, where it unloaded high temperature parts
from a die casting machine — a very unpopular job for manual labor. Also in 1961, patent
number 2,998,237 was granted to Devol —the first U.S. robot patent. In 1962 with the
backing of Consolidated Diesel Electric Company (Condec) and Pullman Corporation,
Engelberger formed Unimation, Inc., which eventually blossomed into a prosperous
business —GM alone had ordered 66 Unimates. Although it took until 1975 to turn a
profit, Unimation became the world leader in robotics, with 1983 annual sales of $70
million and 25 percent of the world market share. For his visionary pursuit and
entrepreneurship, Joseph Engelberger is widely considered the “Father of Robotics.”
Since 1977, the Robotic Industries Association has presented the annual Engelberger
Robotics Awards to world leaders in both application and leadership in the field of
robotics.
The post-World War II technology boom brought a host of developments. In 1946 the world’s first
electronic digital computer emerged at the University of Pennsylvania at the hands of American
scientists J. Presper Eckert and John Mauchly. Their computer, called ENIAC (electronic numerical
integrator and computer), weighed over 30 tons. Just on the heels of ENIAC, Whirlwind was
introduced by Jay Forrester and his research team at the Massachusetts Institute of Technology
(MIT) as the first general-purpose digital computer, originally commissioned by the U.S. Navy to
develop a flight simulator to train its pilots. Although the simulator never materialized, a computer
that shaped the path of business computers was born. Whirlwind was the first computer to
perform real-time computations and to use a video display as an output device. At the same time
as ENIAC and Whirlwind were making their appearance on the East Coast of the United States, a
critical research center was formed on the West Coast. In 1946, the Stanford Research Institute
(SRI) was founded by a small group of business executives in conjunction with Stanford University.
Located in Menlo Park, California, SRI’s purpose was to serve as a center for technological
innovation to support regional economic development. In 1966 the Artificial Intelligence Center
(AIC) was founded at SRI, pioneering the field of artificial intelligence (AI), which gives computers
the heuristics and algorithms to make decisions in complex situations. From 1966 to 1972 Shakey,
the Robot, was developed at the AIC by Dr. Charles Rosen (1917–2002) and his team. Shakey was
the first mobile robot to reason about its surroundings, and it had a far-reaching influence
on AI and robotics. Shakey was equipped with a television camera, a triangulating range finder,
and bump sensors. It was connected by radio and video links to DEC PDP-10 and PDP-15
computers. Shakey was equipped with three levels of programming for perceiving, modeling, and
interacting with its environment. The lowest level routines were designed for basic locomotion —
movement, turning, and route planning. The intermediate level combined the low-level routines
together to accomplish more difficult tasks. The highest-level routines were designed to generate
and execute plans to accomplish tasks presented by a user. Although Shakey had been likened to
a small unstable box on wheels—thus the name— it represented a significant milestone in AI and
in developing a robot’s ability to interact with its environment.
1.1.3 Robotics in Industry
Running in parallel with the developments in research laboratories, the use of robotics in
industry blossomed beyond the time of Engelberger and Devol’s historic meeting. In 1959,
Planet Corporation developed the first commercially available robot, which was
controlled by limit switches and cams. The next year, Harry Johnson and Veljko Milenkovic
of American Machine and Foundry, later known as AMF Corporation, developed a robot
called Versatran, from the words versatile transfer; the Versatran became commercially
available in 1963. In Norway, a 1964 labor shortage led a wheelbarrow manufacturer to
install the first Trallfa robot, which was used to paint the wheelbarrows. Trallfa robots,
produced by Trallfa Nils Underhaug of Norway, were hydraulic robots with five or six
degrees of freedom and were the first industrial robots to use the revolute coordinate
system and continuous-path motion. In 1966, Trallfa introduced a spray-painting robot
into factories in Bryne, Norway. This spray-painting robot was modified in 1976 by
Ransomes, Sims & Jefferies, a British producer of agricultural machinery, for use in arc
welding applications. Painting and welding developed into the most common applications
of robots in industry. Seeing success with their Unimates in New Jersey, General Motors
used 26 Unimate robots to assemble the Chevrolet Vega automobile bodies in Lordstown,
Ohio, beginning in 1969. GM became the first company to use machine vision in an
industrial setting, installing the Consight system at their foundry in St. Catharines,
Ontario, Canada, in 1970. At the same time, Japanese manufacturers were making
quantum leaps in manufacturing: cutting costs, reducing variation, and improving
efficiency. One of the major factors contributing to this transformation was the
incorporation of robots in the manufacturing process. Japan imported its first industrial
robot in 1967, a Versatran from AMF. In 1971 the Japanese Industrial Robot Association
(JIRA) was formed, providing encouragement from the government to incorporate
robotics. This move helped to move the Japanese to the forefront in total number of
robots used in the world. In 1972 Kawasaki installed a robot assembly line, composed of
Unimation robots, at a Nissan plant in Japan. After purchasing the Unimate design
from Unimation, Kawasaki improved the robot to create an arc-welding robot in 1974,
used to fabricate their motorcycle frames. Also in 1974, Hitachi developed touch and
force-sensing capabilities in their Hi-T-Hand robot, which enabled the robot to guide pins
into holes at a rate of one second per pin. At Cincinnati Milacron Corporation, Richard
Hohn developed the robot called The Tomorrow Tool, or T3. Released in 1973, the T3 was
the first commercially available industrial robot controlled by a microcomputer, as well
as the first U.S. robot to use the revolute configuration. Hydraulically actuated, the T3
was used in applications such as welding automobile bodies, transferring automobile
bumpers, and loading machine tools. In 1975, the T3 was introduced for drilling
applications, and in the same year, the T3 became the first robot to be used in the
aerospace industry. In 1970, Victor Scheinman, of Stanford Arm fame, left his position as
professor at Stanford University to take his robot arm to industry. Four years later,
Scheinman had developed a minicomputer-controlled robotic arm, known as the Vicarm,
thus founding Vicarm, Inc. This arm design later came to be known as the “standard arm.”
Unimation purchased Vicarm in 1977, and later, relying on support from GM, used the
technology from Vicarm to develop the PUMA (Programmable Universal Machine for
Assembly), a relatively small electric robot that ran on an LSI-11 computer. The ASEA
Group of Västerås, Sweden, made significant advances in electric robots in the
1970s. To handle automated grinding operations, ASEA introduced its IRb 6 and IRb 60
all-electric robots in 1973. Two years later, ASEA became the first to install a robot in an
iron foundry, tackling yet more industrial jobs that are not favored by manual labor. In
1977 ASEA introduced two more electric-powered industrial robots, both of which used
microcomputers for programming and operation. Later, in 1988, ASEA merged with BBC
Brown Boveri Ltd of Baden, Switzerland, to form ABB (ASEA, Brown, and Boveri), one of
the world leaders in power and automation technology. At Yamanashi University in Japan,
IBM and Sankyo joined forces to develop the Selective Compliance Assembly Robot Arm
(SCARA) in 1979. The SCARA was designed with revolute joints that had vertical axes, thus
providing stiffness in the vertical direction. The gripper was controlled in compliant mode,
or using force control, while the other joints were operated in position control mode.
These robots were used and continue to be used in many applications where the robot is
acting vertically on a workpiece oriented horizontally, such as polishing and insertion
operations. Based on the SCARA geometry, Adept Technology was founded in 1983.
Adept continues to supply direct drive robots that service industries, such as
telecommunications, electronics, automotive, and pharmaceuticals. These industrial
developments in robotics, coupled with the advancements in the research laboratories,
have profoundly affected robotics in different sectors of the technical world.
Just as space programs have used robots to accomplish tasks that would not even be considered
as a manned mission, military and law enforcement agencies have employed the use of robots to
remove humans from harm’s way. Police are able to send a microphone or camera into a
dangerous area that is not accessible to law enforcement personnel, or is too perilous to enter.
Military applications have grown and continue to do so. Rather than send a soldier into the field
to sweep for landmines, it is possible to send a robot to do the same. Research is presently
underway to mimic the method used by humans to identify landmines. Another approach uses
swarm intelligence, which is research being developed at a company named Icosystems, under
funding from DARPA. The general approach is similar to that of a colony of ants finding the most
efficient path through trial and error, finding success based on sheer numbers. Icosystems is using
120 robots built by iRobot, a company co-founded by robotics pioneer Rodney Brooks, who is
also the director of the Computer Science and Artificial Intelligence Laboratory at MIT. One of
Brooks’ research interests is developing intelligent robots that can operate in unstructured
environments, an application quite different from that in a highly structured manufacturing
environment.
In addition to their extensive application in manufacturing, space exploration, the military, and
medicine, robotics can be found in a host of other fields, such as the ever-present entertainment
market— toys, movies, etc. In 1998 two popular robotic toys came to market. Tiger Electronics
introduced “Furby” which rapidly became the toy of choice in the 1998 Christmas toy market.
Furby used a variety of sensors to react to its environment, and its speech included over 800
English phrases, as well as many in its own language, “Furbish.” In the same year
Lego released its Lego MINDSTORMS robotic toys. These reconfigurable toys rapidly found their
way into educational programs for their value in engaging students, while teaching them about
the use of multiple sensors and actuators to respond to the robot’s surroundings. Sony released
a robotic pet named AIBO in 1999, followed by the third generation AIBO ERS-7 in 2003. Honda
began a research effort in 1986 to build a robot that would interact peacefully with humans,
yielding their humanoid robots P3 in 1996 and ASIMO in 2000 (ASIMO even rang the opening bell
to the New York Stock Exchange in 2002 to celebrate Honda’s 25 years on the NYSE). Hollywood
has maintained a steady supply of robots over the years, and there appears to be no shortage of
robots on the big screen in the near future.
Looking forward, there are many frontiers in robotics. Many of the applications presented here
are in their infancy and will see considerable growth. Other mature areas will see sustained
development, as has been the case since the technological boom following the Second World
War. Many theoretical areas hold endless possibilities for expansion —nonlinear control,
computational algebra, computational geometry, intelligence in unstructured environments, and
many more. The possibilities seem even more expansive when one considers the creativity
generated by the cross-pollination of playwrights, science fiction writers, inventors,
entrepreneurs, and engineers.
(3) Rapid prototyping with 3-D printing
1.1 INTRODUCTION
Rapid Prototyping Technology is a group of manufacturing processes that enable the direct
physical realization of 3D computer models. This technology converts the 3D computer data
provided by a dedicated file format directly to a physical model, layer by layer with a high degree
of accuracy. This technology is developing fast and is more than competitive with traditional
model-building techniques in terms of time and degree of detail. This section gives an overview
of existing major RP techniques and their applications in engineering fields.
The past decade has witnessed the emergence of new manufacturing technologies that
build parts on a layer-by-layer basis. Using these technologies, manufacturing time for parts of
virtually any complexity is reduced considerably. In other words, it is rapid.
Rapid Prototyping Technologies and Rapid Manufacturing offer great potential for producing
models and unique parts for manufacturing industry. Thus, the reliability of products can be
increased; investment of time and money is less risky. Not everything that is conceivable today is
workable or available at a reasonable price, but the technology is evolving fast, and each new
challenge spurs further development.
1.2 Overview
The term Rapid prototyping (RP) refers to a class of technologies that can automatically construct
physical models from Computer-Aided Design (CAD) data.
It is a free form fabrication technique by which a total object of prescribed shape, dimension and finish
can be directly generated from the CAD based geometrical model stored in a computer, with little human
intervention. Rapid prototyping is an "additive" process, combining layers of paper, wax, or plastic to
create a solid object. In contrast, most machining processes (milling, drilling, grinding, etc.) are
"subtractive" processes that remove material from a solid block. RP’s additive nature allows it to create
objects with complicated internal features that cannot be manufactured by other means.
In addition to prototypes, RP techniques can also be used to make tooling (referred to as rapid tooling)
and even production-quality parts (rapid manufacturing). For small production runs and complicated
objects, rapid prototyping is often the best manufacturing process available. Of course, "rapid" is a relative
term. Most prototypes require from three to seventy-two hours to build, depending on the size and
complexity of the object. This may seem slow, but it is much faster than the weeks or months required to
make a prototype by traditional means such as machining. These dramatic time savings allow
manufacturers to bring products to market faster and more cheaply.
Although several rapid prototyping techniques exist, all employ the same basic five-step
process. The steps are:
1. Create a CAD model of the design.
2. Convert the CAD model to STL format.
3. Slice the STL file into thin cross-sectional layers.
4. Construct the model one layer atop another.
5. Clean and finish the model.
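The slicing stage shared by these processes, cutting the CAD model into thin cross-sections, can be sketched as follows. This is a simplified illustration: production slicers must also order the segments into closed contours and handle degenerate triangles.

```python
# Simplified mesh slicing sketch (illustrative only). Each triangle of the
# CAD mesh is intersected with a horizontal plane z = h; the resulting line
# segments form the outline of one layer.

def segment_at(tri, h):
    """Intersect one triangle (three (x, y, z) points) with the plane z = h.
    Returns the two intersection points, or None if the plane misses it."""
    pts = []
    for p, q in [(tri[0], tri[1]), (tri[1], tri[2]), (tri[2], tri[0])]:
        if (p[2] - h) * (q[2] - h) < 0:           # edge crosses the plane
            t = (h - p[2]) / (q[2] - p[2])        # interpolation parameter
            pts.append((p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])))
    return tuple(pts) if len(pts) == 2 else None

def slice_mesh(triangles, z_min, z_max, layer=0.1):
    """Return one outline (list of segments) per layer, bottom to top."""
    layers = []
    h = z_min + layer / 2                          # slice at mid-layer height
    while h < z_max:
        outline = [s for s in (segment_at(t, h) for t in triangles) if s]
        layers.append(outline)
        h += layer
    return layers
```

The machine then reproduces each outline in physical material, which is why build time grows with part height and layer count.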
Most commercially available rapid prototyping machines use one of six techniques. At
present, trade restrictions severely limit the import/export of rapid prototyping machines, so this
overview covers only systems available in the U.S.
1.4.1 Ink-Jet Printing
Ink-Jet Printing refers to an entire class of machines that employ ink-jet technology. The first was 3D
Printing (3DP), developed at MIT and licensed to Soligen Corporation, Extrude Hone, and others. The
ZCorp 3D printer, produced by Z Corporation of Burlington, MA, is an example of this technology. As shown
in Figure 6a, parts are built upon a platform situated in a bin full of powder material. An ink-jet printing
head selectively deposits or "prints" a binder fluid to fuse the powder together in the desired areas.
Unbound powder remains to support the part. The platform is lowered, more powder added and leveled,
and the process repeated. When finished, the green part is then removed from the unbound powder, and
excess unbound powder is blown off. Finished parts can be infiltrated with wax, CA glue, or other sealants
to improve durability and surface finish. Typical layer thicknesses are on the order of 0.1 mm. This process
is very fast and produces parts with a slightly grainy surface. ZCorp uses two different materials:
a starch-based powder (not as strong, but it can be burned out, for investment casting applications)
and a ceramic powder. Machines with four-color printing capability are available. Ballistic particle
manufacturing, depicted
in Figure 6b, was developed by BPM Inc., which has since gone out of business.
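The powder-bed cycle just described (print binder into the cross-section, lower the platform, spread fresh powder, repeat) can be sketched schematically. This is a toy voxel model, not any vendor's actual control code.

```python
# Schematic simulation of the powder-bed ink-jet process (illustrative):
# each pass spreads and levels a fresh powder layer, "prints" binder into
# the cells belonging to the part's cross-section, and lowers the platform
# by one layer thickness. Unbound powder implicitly supports the part.

def build_part(cross_sections, layer_thickness=0.1):
    """cross_sections: list of sets of (x, y) cells to bind, bottom to top.
    Returns the set of bound voxels (x, y, layer) and the total drop of the
    platform in the same units as layer_thickness."""
    bound = set()
    platform_drop = 0.0
    for layer, section in enumerate(cross_sections):
        for x, y in section:          # print binder into this cross-section
            bound.add((x, y, layer))
        platform_drop += layer_thickness   # lower platform, spread new powder
    return bound, platform_drop
```

With the typical 0.1 mm layers mentioned above, a 50 mm tall part needs on the order of 500 such cycles, which is why this process's speed matters.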
Rapid prototyping is widely used in the automotive, aerospace, medical, and consumer products
industries.
Engineering
The aerospace industry imposes stringent quality demands. Rigorous testing and certification are necessary
before it is possible to use materials and processes for the manufacture of aerospace components. Yet,
Boeing's Rocketdyne has successfully used RP technology to manufacture hundreds of parts for the
International Space Station and the space shuttle fleet. The company also uses RP to manufacture parts
for the military’s F-18 fighter jet in glass-filled nylon. Another idea, not yet mature, is to have an RP
machine on board the International Space Station (ISS) to produce spare parts for repairs. Models
are widely used in the automotive industry for design studies, physical experiments, etc. Functional
parts, such as titanium castings produced from RP patterns, have been made for F1 racing cars.
Architecture
The Department of Architecture at the University of Hong Kong is applying Rapid Prototyping Technology
to teach students about the new possibilities in testing their drafts, e.g., for lighting conditions and
mechanical details. One example is the Sydney Opera House.
Medical Applications
RPT has created a new market in the world of orthodontics. Appearance conscious adults can now have
straighter teeth without the embarrassment of a mouth full of metal. Using stereolithography technology,
custom-fit, clear plastic aligners can be produced in a mass-customization process. The RP world has made its
entry into the hearing instrument world too. The result is instrument shells that are stronger, fit better, and
are biocompatible to a very high degree. The ear impression is scanned and digitized with an extremely
accurate 3-D scanner. Software specially developed for this purpose converts the digital image into a virtual
hearing instrument shell. Thanks to the accuracy of the process, instrument shells are produced with high
precision and reproducibility. This means the hearing instruments fit better and the need for remakes is reduced. In
the case of repairs, damage to or loss of the ITE instrument, an absolutely identical shell can be
manufactured quickly, since the digital data are stored in the system.
1.6 Future developments
Rapid prototyping is starting to change the way companies design and build products. On the
horizon, though, are several developments that will help to revolutionize manufacturing as we
know it.
One such improvement is increased speed. "Rapid" prototyping machines are still slow by some
standards. By using faster computers, more complex control systems, and improved materials,
RP manufacturers are dramatically reducing build time.
Another future development is improved accuracy and surface finish. Today’s commercially
available machines are accurate to ~0.08 millimeters in the x-y plane, but less accurate in the z
(vertical) direction. Improvements in laser optics and motor control should increase accuracy in
all three directions. In addition, RP companies are developing new polymers that will be less
prone to curing- and temperature-induced warpage.
Another important development is increased size capacity. Currently most RP machines are
limited to objects 0.125 cubic meters or less. Larger parts must be built in sections and joined by
hand. To remedy this situation, several "large prototype" techniques are in the works.
1.7 Conclusion
Finally, the rise of rapid prototyping has spurred progress in traditional subtractive methods as well.
Advances in computerized path planning, numeric control, and machine dynamics are increasing the
speed and accuracy of machining. Modern CNC machining centers can have spindle speeds of up to
100,000 RPM, with correspondingly fast feed rates. Such high material removal rates translate into short
build times. For certain applications, particularly metals, machining will continue to be a useful
manufacturing process. Rapid prototyping will not make machining obsolete, but rather complement it.
(4) Precise manufacturing with CNC machines
1.1 INTRODUCTION
The design and construction of Computer Numerically Controlled (CNC) machines differs
greatly from that of conventional machine tools. This difference arises from the requirements of
higher performance levels. The CNC machines often employ the various mechatronics elements
that have been developed over the years. However, the quality and reliability of these machines
depends on the various machine elements and subsystems of the machines. Some of the
important constituent parts and aspects of CNC machines to be considered in their design
include: machine structure, guideways, feed drives, spindle and spindle bearings,
measuring systems, controls, software and operator interface, gauging, and tool monitoring.
The control of a machine tool by means of stored information processed by a computer is
known as Computer Numerical Control. The information stored in the computer can be read
by automatic means and converted into electrical signals, which operate electrically
controlled servo systems. These servo systems permit the slides of a machine
tool to be driven simultaneously and at the appropriate feeds and directions, so that complex
shapes can be cut, often in a single operation and without the need to reorient the workpiece.
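The simultaneous, coordinated slide motion described above can be sketched as a simple linear-interpolation routine: the control breaks a straight move into per-tick targets for each axis so that the tool follows the programmed line. This is a hypothetical Python illustration; the function name, tick interval, and units are assumptions, not any real control's firmware.

```python
import math

def linear_interpolate(start, end, feed_rate, tick=0.01):
    """Break a straight move into per-tick target points for each axis.

    start, end : (x, y) positions in mm
    feed_rate  : commanded path speed in mm/s
    tick       : control-loop interval in seconds
    """
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy)
    steps = max(1, round(length / (feed_rate * tick)))
    # Both axes receive a command every tick, so the tool tracks the line.
    return [(start[0] + dx * i / steps, start[1] + dy * i / steps)
            for i in range(1, steps + 1)]

# A 50 mm move at 100 mm/s with a 10 ms tick yields 50 intermediate points.
points = linear_interpolate((0.0, 0.0), (30.0, 40.0), feed_rate=100.0)
```

Because each axis is commanded a proportional share of the move every tick, the resulting path is a straight line regardless of its angle to the axes.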
Computer Numerical Control can be applied to milling machines, lathes,
grinding machines, boring machines, flame cutters, drilling machines, etc.
1.2 HISTORICAL DEVELOPMENTS IN CNC MACHINES
A relatively short time ago, machines were operated by craftsmen who determined the many
variables, such as speeds, feeds, and depth of cut. Now much of this work is assigned to
computers and numerically controlled machines.
Numerical control, as applied to machine tools, had its beginning in 1949, when the United
States Air Force contracted with the Parsons Corporation to produce contoured surfaces from
instructions in the form of punched tape.
Numerical control equipment was substituted for the tracer controls on a three-axis Cincinnati
Hydrotel vertical milling machine.
A successful feasibility demonstration of continuous contour milling was held in March 1952
and was followed by a final report; both the Air Force and private industry then began further
work to extend the numerical control technology developed at the Massachusetts Institute of
Technology (MIT), U.S.A.
Between 1955 and 1960, only 500 numerically controlled machines were installed in the U.S.A.;
during 1960–1964, approximately 4,000 more were added.
Until the mid-1960s, Russia, West Germany, Japan, and the U.K. were far behind the U.S.A. in
the production of numerically controlled machine tools.
Japan entered the commercial numerical control scene during the mid-1960s and by 1971
had surpassed the U.S.A. in the proportion of machine tools under numerical control.
The U.S.S.R. also became one of the largest producers of NC machine tools, mainly for industrial use.
1.3 FEATURES OF CNC MACHINES
A rigid machine structure is provided in order to bear the static load, dynamic load,
thermal load, and vibration.
Guideways are used in machine tools to control the direction or line of action of the
carriage or the table on which a tool or workpiece is held, and to absorb the static and
dynamic forces.
On a CNC machine, the function of the feed drive is to provide motion to the slide as per
the motion commands. Since the accuracy requirements are high, the feed
drive should have high efficiency and fast response.
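The feed drive's need for controlled, responsive motion can be illustrated with a trapezoidal velocity profile, the usual way a drive ramps a slide up to the commanded feed and back down. This is a minimal sketch with assumed units and parameter names, not any real drive's algorithm.

```python
def trapezoid_profile(distance, v_max, accel, dt=0.001):
    """Velocity samples for a move: ramp up, cruise, ramp down.

    distance in mm, v_max in mm/s, accel in mm/s^2, dt in s.
    """
    t_ramp = v_max / accel
    d_ramp = 0.5 * accel * t_ramp ** 2
    if 2 * d_ramp > distance:            # short move: triangular profile
        t_ramp = (distance / accel) ** 0.5
        v_peak = accel * t_ramp
        t_cruise = 0.0
    else:
        v_peak = v_max
        t_cruise = (distance - 2 * d_ramp) / v_max
    total = 2 * t_ramp + t_cruise
    profile, t = [], 0.0
    while t < total:
        if t < t_ramp:                   # acceleration phase
            v = accel * t
        elif t < t_ramp + t_cruise:      # constant-feed phase
            v = v_peak
        else:                            # deceleration phase
            v = max(0.0, v_peak - accel * (t - t_ramp - t_cruise))
        profile.append(v)
        t += dt
    return profile

# A 10 mm move at 50 mm/s with 500 mm/s^2 acceleration.
profile = trapezoid_profile(10.0, 50.0, 500.0)
```

Integrating the sampled velocities over time recovers the commanded distance, which is how a position loop stays consistent with the feed command.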
Hydrodynamic, hydrostatic, or antifriction bearings support the spindle of a
CNC machine. The accuracy and the quality of the work produced
depend directly on the geometrical accuracy, running accuracy, and stiffness of the
spindle assembly.
On all CNC machines, an electronic measuring system is employed on each controlled
axis to monitor the movement and to compare the actual position of the slide or spindle
with the desired position.
CNC controls are the heart of CNC machines. The early CNC controls were developed
for simple applications in turning, machining centres, and grinding. But with the increased
capabilities of modern machine tools, such as higher spindle speeds, higher rapid
traverses, and a larger number of axes, CNC systems have been developed to meet these
needs.
Better workpiece quality is one of the most important advantages of using a high-end CNC
machine. To maintain quality, the effects of parameters such as tool wear and thermal growth
can be compensated by an automatic gauging system.
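The gauging compensation just described can be sketched as a small offset-update rule: measure the finished dimension and correct the tool offset by a fraction of the observed error. This is a hypothetical illustration; the function name and gain value are assumptions.

```python
def wear_offset(nominal, measured, current_offset, gain=0.5):
    """Update the tool offset from an in-process gauge reading.

    nominal, measured : part dimension in mm
    current_offset    : active tool offset in mm
    Applying only a fraction (gain) of the observed error damps the
    response to measurement noise and gradual tool wear.
    """
    error = measured - nominal   # oversize part -> positive error
    return current_offset - gain * error

# A part gauged 0.020 mm oversize pulls the offset in by 0.010 mm.
new_offset = wear_offset(25.000, 25.020, 0.0)
```

Repeating this after each gauged part lets the offset track tool wear and thermal growth instead of letting dimensions drift out of tolerance.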
Presently, established tool monitoring sensors and systems are available commercially
for integration with CNC machines. Tool monitoring systems enable the introduction of
adaptive control on machines for optimizing the cutting parameters.
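Adaptive control of the kind mentioned here can be sketched as a feed-override rule that scales the programmed feed so the measured spindle load tracks a target. This is a minimal hypothetical sketch; the target load and clamp limits are assumptions.

```python
def adaptive_feed(commanded_feed, spindle_load, target_load=0.8):
    """Scale the programmed feed so spindle load tracks a target fraction.

    commanded_feed : programmed feed in mm/min
    spindle_load   : measured load as a fraction of rated power
    """
    if spindle_load <= 0:
        return commanded_feed
    scale = target_load / spindle_load
    # Clamp the override to a sensible range, as real controls do.
    return commanded_feed * max(0.5, min(1.2, scale))

# Overloaded cut: feed is reduced. Light cut: feed is raised, but capped.
slow = adaptive_feed(200.0, spindle_load=1.0)   # reduced feed
fast = adaptive_feed(200.0, spindle_load=0.4)   # raised, clamped feed
```

Keeping the load near its target protects the tool in heavy cuts while recovering cycle time in light ones, which is the essence of adaptive control.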
Feedback is taken from various devices, e.g., encoders and transducers.
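As an illustration of such feedback, a rotary encoder mounted on the ballscrew can be converted to slide position as follows (the counts-per-revolution and screw-lead values are hypothetical):

```python
def slide_position_mm(encoder_counts, counts_per_rev=2000, screw_lead_mm=5.0):
    """Convert rotary encoder counts on the ballscrew to slide travel.

    With the encoder on the screw this is an indirect measurement:
    backlash and screw pitch error are not seen by the control.
    """
    return encoder_counts / counts_per_rev * screw_lead_mm

# Two full screw revolutions move the slide by two leads.
travel = slide_position_mm(4000)
```

A linear scale on the slide itself would give a direct measurement that includes these mechanical errors, which is why the measuring-system list later distinguishes direct from indirect systems.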
1.4 MACHINE STRUCTURE
The machine structure is the load-carrying and supporting member of the machine
tool. All the motors, drive mechanisms, and other functional assemblies of the machine tool are
aligned to each other and rigidly fixed to the machine structure. The machine structure is
subjected to static and dynamic forces, and it is therefore essential that the structure does
not deform or vibrate beyond the permissible limits under the action of these forces. All
components of the machine must remain in their correct relative positions to maintain geometric
accuracy, regardless of the magnitude and direction of these forces. The machine structure's
configuration is also influenced by considerations of manufacture, assembly, and operation.
The basic factors to be considered in the design of a CNC machine are as follows:
1. Static load
2. Dynamic load
3. Thermal load
4. Guideways
5. Feed drive: - 1) Servo motor, 2) Mechanical Transmission System
6. Spindle / spindle bearings 1) Hydrodynamic 2) Hydrostatic 3) Antifriction
7. Measuring Systems: - 1) Direct 2) Indirect
8. Controls, Software and user interface
9. Gauging
10. Tool monitoring systems: - 1) Direct 2) Indirect
1.5 CONFIGURATION OF CNC SYSTEM
The figure shows a schematic diagram of the working principle of an NC axis of a CNC machine
and the interface of a CNC control.
A CNC system basically consists of the following:-
In a closed-loop system, the control unit continues to adjust the position of the slide until it
arrives at its destination; this system has feedback. Although more costly and complex than an
open-loop system, closed-loop systems give more accurate positioning. For this type of system,
servomotors are used.
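The closed-loop behaviour described above can be sketched as a proportional position loop: on every tick, the velocity command is proportional to the remaining position error. This is a simplified simulation with assumed gain and limits, not a real servo drive.

```python
def servo_step(target, position, kp=4.0, dt=0.01, v_max=50.0):
    """One control-loop tick: command a velocity proportional to the error.

    target, position in mm; dt in s; v_max clamps the velocity command.
    """
    error = target - position
    velocity = max(-v_max, min(v_max, kp * error))
    return position + velocity * dt   # slide moves during this tick

# Drive the slide from 0 mm toward a 10 mm target for 2 simulated seconds.
pos = 0.0
for _ in range(200):
    pos = servo_step(10.0, pos)
```

The loop converges without overshoot because the velocity command shrinks as the slide approaches the target; a real CNC axis adds velocity and current loops beneath this position loop.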
ADVANTAGES
Higher flexibility.
Increased productivity.
Consistent quality.
Reduced scrap rate.
Reliable operation.
Reduced non-productive time.
Reduced manpower.
Shorter cycle time.
Just in time manufacture.
Automatic material handling.
Less floor space.
Increased operational safety.
Machining of advanced materials.
DISADVANTAGES
1.10 CONCLUSION
Development of Computer Numerically Controlled machines (CNC) is an outstanding contribution
to the manufacturing industries. It has made possible the automation of the machining processes
with flexibility to handle small to medium batch quantities in part production.
At present, Indian industries, which face competition in the global market, need CNC machines
because of day-to-day uncertainties in customer demand in terms of a huge variety of products
with better quality and shorter lead times. CNC machines are best suited for achieving better
accuracy with less manpower.