CSC4221 Lecture 1 - Introduction
Datti 2018
Introduction
Applications of Computer Graphics
Graphics System
OpenGL Overview
Output Primitives
2D Transformations & Viewing
3D Transformations & Viewing
Computer graphics is the creation and manipulation of geometric objects (models) and images using a computer.
The applications of computer graphics are many and varied; we can,
however, divide them into four major areas:
• Scientific visualization: geographic information systems, charts and graphs, medical imaging
• Statistics: charts and graphs
• Infographic posters
• CAD/CAM: mechanical engineering, architecture, fashion
• Simulations: scientific simulations, training simulations
• Entertainment: art, animations, movies, video games
• Graphical user interfaces
Graphics System
• Hardware
• Software
• There are six major hardware elements in a computer graphics system:
  • Input devices
  • Processing
    • Central Processing Unit (CPU)
    • Graphics Processing Unit (GPU)
  • Memory
    • Main memory
    • Frame buffer
  • Output devices
• There are two ways images are represented on digital output devices: raster and vector.
• Virtually all modern graphics systems are raster based. The image we see on the output device is an array (the raster) of picture elements, or pixels, produced by the graphics system.
• Vector systems, on the other hand, represent an image as a collection of lines.
• Each pixel corresponds to a location, or small area, in the image.
• The number of bits used for each pixel, the depth of the frame buffer, determines how many colors can be displayed. For example, a 1-bit-deep frame buffer allows only two colors, whereas an 8-bit-deep frame buffer allows 256 colors.
• In full-color systems, there are 24 (or more) bits per pixel. Such systems can display sufficient colors to represent most images realistically. They are also called true-color systems, or RGB-color systems, because individual groups of bits in each pixel are assigned to each of the three primary colors (red, green, and blue) used in most displays.
• High dynamic range (HDR) systems use 12 or more bits for each color component.
• Depth versus number of colors: 1 bit (2 colors), 2 bits (4 colors), 4 bits (16 colors).
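A minimal sketch of the two ideas above, not tied to any particular graphics library (the type and variable names are illustrative only): a raster image is a 2D grid of pixels stored in a frame buffer, and the depth in bits fixes the number of representable colors (2 raised to the depth).

#include <cstdint>
#include <vector>
#include <iostream>

// Illustration only: a 24-bit "true color" pixel, 8 bits per channel.
struct Pixel { std::uint8_t r, g, b; };

int main() {
    const int width = 640, height = 480;
    // The frame buffer as a row-major array of pixels, initially black.
    std::vector<Pixel> frameBuffer(width * height, Pixel{0, 0, 0});

    // Assign the pixel at (x, y) = (10, 20) the color red.
    frameBuffer[20 * width + 10] = Pixel{255, 0, 0};

    // Number of representable colors for a given frame-buffer depth: 2^depth.
    for (int depth : {1, 2, 4, 8, 24})
        std::cout << depth << "-bit depth -> " << (1ull << depth) << " colors\n";
}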
• Graphics Processing Units (GPUs), on the other hand, are dedicated processing units on modern systems that are specialized for graphics functions.
Hardcopy
• Dot Matrix Printers
• Ink-Jet Printers
• Laser Printers
• 3D Printers
Softcopy
• Cathode Ray Tube (CRT)
• Liquid Crystal Display (LCD)
Miscellaneous
• Hologram
• Dot Matrix - uses a head with 7 to 24 pins to strike a ribbon
(single or multiple color)
• Examples
• Cathode Ray Tube (CRT)
• Liquid Crystal Display (LCD)
• Light Emitting Diode (LED) Display
When electrons strike the phosphor coating on the tube, light is emitted. Light appears on the surface of the CRT when a sufficiently intense beam of electrons is directed at the phosphor. The screen is coated with phosphor (three colors for a color monitor). For a color monitor, three guns light up the red, green, or blue phosphors.
Liquid crystal displays use small flat chips which change their transparency properties when a voltage is applied. LCD elements do not emit light, but use backlights behind the LCD matrix.
Special Purpose
• Word, Excel, etc.
• AutoCAD
• Animation and simulation packages, e.g. Maya
• Visualization packages, e.g. GraphViz
• Painting packages, e.g. MS Paint
General Purpose
• Programming APIs (Application Program Interfaces)
  • OpenGL
  • DirectX
  • Java2D and Java3D
The programmer sees the graphics system through an interface: the Application Programmer Interface (API). The application program accesses the graphics hardware only through functions provided by the API.
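Since OpenGL is one of the APIs listed above, here is a minimal sketch of what talking to the graphics system through such an API looks like. It is only an illustration: it assumes the legacy fixed-function OpenGL pipeline and the GLUT/freeglut windowing toolkit, neither of which has been introduced yet.

#include <GL/glut.h>   // assumes freeglut (or classic GLUT) is installed

// Drawing callback: clear the frame buffer and draw one triangle.
void display() {
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);          // send three vertices as one primitive
        glVertex2f(-0.5f, -0.5f);
        glVertex2f( 0.5f, -0.5f);
        glVertex2f( 0.0f,  0.5f);
    glEnd();
    glFlush();                      // force the image to the screen
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);          // the API hides window-system details
    glutCreateWindow("Hello, OpenGL");
    glutDisplayFunc(display);       // register the drawing callback
    glutMainLoop();                 // hand control to the event loop
    return 0;
}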
For more complex objects, there may be multiple ways of defining the
object from a set of vertices. A circle, for example, can be defined by three
points on its circumference, or by its centre and one point on the
circumference.
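As an illustration of the second case, here is a sketch that recovers a circle's centre and radius from three points on its circumference using the standard circumcircle formula; the function and type names are ours, not part of any API.

#include <cmath>
#include <cstdio>

struct Point  { double x, y; };
struct Circle { Point centre; double radius; };

// Circle through three non-collinear points on its circumference.
Circle circleFromThreePoints(Point a, Point b, Point c) {
    double d  = 2.0 * (a.x * (b.y - c.y) + b.x * (c.y - a.y) + c.x * (a.y - b.y));
    double a2 = a.x * a.x + a.y * a.y;
    double b2 = b.x * b.x + b.y * b.y;
    double c2 = c.x * c.x + c.y * c.y;
    Point centre{
        (a2 * (b.y - c.y) + b2 * (c.y - a.y) + c2 * (a.y - b.y)) / d,
        (a2 * (c.x - b.x) + b2 * (a.x - c.x) + c2 * (b.x - a.x)) / d
    };
    double radius = std::hypot(a.x - centre.x, a.y - centre.y);
    return {centre, radius};
}

int main() {
    // Three points on the unit circle: expect centre (0, 0) and radius 1.
    Circle c = circleFromThreePoints({1, 0}, {0, 1}, {-1, 0});
    std::printf("centre = (%g, %g), radius = %g\n", c.centre.x, c.centre.y, c.radius);
}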
Viewer/Camera: can be defined using four types of necessary specifications:
• Position: the camera location is usually given by the position of the center of the lens, which is the center of projection (COP).
• Focal length: the focal length of the lens determines the size of the image on the film plane or, equivalently, the portion of the world the camera sees.
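A minimal sketch of how such a specification might be held in a program; the types and field names are illustrative, not part of any API, and only the two specifications listed above are represented.

#include <cstdio>

struct Vec3 { double x, y, z; };

// Illustrative camera specification: position of the COP and focal length.
struct Camera {
    Vec3   cop;          // position: center of projection (center of the lens)
    double focalLength;  // determines how much of the world the camera sees
};

int main() {
    Camera cam{ {0.0, 0.0, 10.0}, 35.0 };   // a camera 10 units back along z
    std::printf("COP = (%g, %g, %g), focal length = %g\n",
                cam.cop.x, cam.cop.y, cam.cop.z, cam.focalLength);
}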
Much of the work in the pipeline is in converting object representations from one coordinate system to another:
• World coordinates
• Camera coordinates
• Screen coordinates
Every change of coordinates is equivalent to a matrix transformation.
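As a concrete illustration (a sketch using a hand-written 4x4 matrix rather than any particular library), a change of coordinates is carried out by multiplying a point in homogeneous coordinates by a matrix. Here a translation re-expresses a world-space point relative to a camera placed at (0, 0, 10).

#include <cstdio>

// Multiply a 4x4 matrix by a homogeneous point (x, y, z, w).
void transform(const double m[4][4], const double in[4], double out[4]) {
    for (int r = 0; r < 4; ++r) {
        out[r] = 0.0;
        for (int c = 0; c < 4; ++c)
            out[r] += m[r][c] * in[c];
    }
}

int main() {
    // A change of coordinates as a matrix: translate by (0, 0, -10), i.e.
    // express points relative to a camera located at (0, 0, 10) in world space.
    const double worldToCamera[4][4] = {
        {1, 0, 0,   0},
        {0, 1, 0,   0},
        {0, 0, 1, -10},
        {0, 0, 0,   1},
    };
    const double pWorld[4] = {2.0, 3.0, 4.0, 1.0};   // a world-space point
    double pCamera[4];
    transform(worldToCamera, pWorld, pCamera);
    std::printf("camera-space point = (%g, %g, %g)\n",
                pCamera[0], pCamera[1], pCamera[2]);  // expect (2, 3, -6)
}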
Just as a real camera cannot "see" the whole world, the virtual camera can only see part of the world space (its view volume). Objects that are not within this volume are said to be clipped out of the scene.
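A sketch of the idea, assuming for illustration the canonical view volume where x, y, and z each lie between -1 and 1; real clippers operate on whole primitives, not just points.

#include <cstdio>

// A point is kept if it lies inside the canonical view volume
// -1 <= x, y, z <= 1; otherwise it is clipped out of the scene.
bool insideViewVolume(double x, double y, double z) {
    return -1.0 <= x && x <= 1.0 &&
           -1.0 <= y && y <= 1.0 &&
           -1.0 <= z && z <= 1.0;
}

int main() {
    std::printf("%d\n", insideViewVolume(0.2, -0.5, 0.9));  // 1: visible
    std::printf("%d\n", insideViewVolume(0.2, -0.5, 3.0));  // 0: clipped
}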
• Must carry out the process that combines the 3D viewer with the 3D objects to produce the 2D image.
• If an object is visible in the image, the appropriate pixels in the frame buffer must be assigned colors:
  • Vertices are assembled into objects.
  • Effects of lights and materials must be determined.
  • Polygons are filled with interior colors/shades.
  • Must also determine which objects are in front (hidden-surface removal); see the sketch below.
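As a sketch of the last point, one common hidden-surface technique (the z-buffer, assumed here for illustration) keeps a depth value per pixel and only overwrites a pixel when the incoming fragment is closer to the camera than what is already stored.

#include <cstdint>
#include <cstdio>
#include <limits>
#include <vector>

struct Pixel { std::uint8_t r, g, b; };

const int W = 4, H = 4;
std::vector<Pixel>  color(W * H, Pixel{0, 0, 0});
std::vector<double> depth(W * H, std::numeric_limits<double>::infinity());

// Write a fragment at pixel (x, y) with depth z only if it is the closest so far.
void writeFragment(int x, int y, double z, Pixel c) {
    int i = y * W + x;
    if (z < depth[i]) {          // closer than anything drawn here before?
        depth[i] = z;
        color[i] = c;
    }
}

int main() {
    writeFragment(1, 1, 5.0, Pixel{0, 0, 255});   // blue surface, farther away
    writeFragment(1, 1, 2.0, Pixel{255, 0, 0});   // red surface in front: wins
    writeFragment(1, 1, 9.0, Pixel{0, 255, 0});   // green behind both: discarded
    std::printf("pixel (1,1) = (%d, %d, %d)\n",
                color[1 * W + 1].r, color[1 * W + 1].g, color[1 * W + 1].b);  // red
}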