
I. Productivity

Operations management (OM) is the administration of business practices intended to create the highest level of efficiency possible within an organization. It is mainly concerned with planning, organizing, directing, and controlling all the activities that convert raw materials and human effort into valuable goods and services that satisfy customer needs. Operations management is the area of management concerned with designing and controlling the production process and with redesigning business operations in the production of goods or services.
The importance of OM lies in the operations manager's responsibility to ensure timely delivery of products and to successfully turn raw materials into finished products (input into output). Operations management plays a vital role in running any project successfully; managing operations is a complicated process, so in-depth knowledge is required to take on the position of operations manager. It was once widely believed that operations management was not especially important to the organization, but it was later recognized as essential to its functioning: converting raw materials into goods, selling them, and managing those sales can only be done efficiently by managing the operations.
Productivity is a measure of the efficiency of a person, machine, factory, or system in converting inputs into useful outputs. It is computed by dividing average output per period by the total costs incurred or resources (capital, energy, material, personnel) consumed in that period. Physical productivity is the quantity of output produced by one unit of production input in a unit of time; for example, a piece of equipment may produce 10 tons of output per hour. Economic productivity is the value of output obtained with one unit of input. Measured productivity is the ratio of a measure of total outputs to a measure of the inputs used in the production of goods and services. Productivity growth is estimated by subtracting the growth in inputs from the growth in output; it is the residual.
You can measure employee productivity with the labor productivity equation: total output / total input. Suppose your company generated $80,000 worth of goods or services (output) using 1,500 labor hours (input). To calculate the company's labor productivity, divide 80,000 by 1,500, which equals roughly 53.
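A minimal Python sketch of this calculation (the $80,000 output and 1,500 labor-hour figures come from the example above; the helper function name is illustrative):

# Labor productivity = total output / total input (here, dollars of output per labor hour).

def labor_productivity(total_output, total_input):
    """Return output produced per unit of input."""
    return total_output / total_input

output_value = 80_000   # value of goods or services produced
labor_hours = 1_500     # labor hours consumed in the same period

print(round(labor_productivity(output_value, labor_hours), 2))  # -> 53.33, i.e. about 53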
Productivity describes various measures of the efficiency of production. A productivity measure is expressed as the ratio of output to the inputs used in a production process, i.e., output per unit of input. Productivity is a crucial factor in the production performance of firms and nations. The project planning phase is the second phase in the project life cycle. It involves creating a set of plans to help guide your team through the execution and closure phases of the project. The plans created during this phase will help you manage time, cost, quality, change, risk, and issues.
II. Forecasting

Forecasting is a decision-making tool used by many businesses to help in budgeting, planning, and estimating future growth. In the narrow sense, the objective of forecasting is to produce better forecasts. In the broader sense, the objective is to improve organizational performance: more revenue, more profit, and increased customer satisfaction. Why is forecasting so important? A business should balance its approach to marketing. Forecasting is essential to making marketing plans, but so are concerns with budget, overhead, and cash flow. Of all of those concerns, forecasting may be the most important because it tells you the future environment in which you may operate. In the simplest terms, forecasting is the attempt to predict future outcomes based on past events and management insight. There are four main types of forecasting methods that financial analysts use. While there is a wide range of frequently used quantitative budget forecasting tools, the four most common methods are straight-line, moving average, simple linear regression, and multiple linear regression.
Examples of qualitative forecasting methods are informed opinion and judgment, the
Delphi method, market research, and historical life-cycle analogy. Quantitative
forecasting models are used to forecast future data as a function of past data.
Forecasting starts with certain assumptions based on the management's experience,
knowledge, and judgment.
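As an illustration of one of the quantitative methods named above (the moving average), here is a minimal Python sketch; the demand figures and window length are made up for the example:

# Simple moving average: forecast for the next period = mean of the last n actual values.

def moving_average_forecast(history, n):
    """Forecast the next period as the average of the last n observations."""
    if len(history) < n:
        raise ValueError("need at least n observations")
    return sum(history[-n:]) / n

demand = [120, 130, 125, 140, 135]         # hypothetical past demand per period
print(moving_average_forecast(demand, 3))  # (125 + 140 + 135) / 3 = 133.33...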
Qualitative forecasting is an estimation methodology that uses expert judgment rather than numerical analysis. This approach is substantially different from quantitative forecasting, where historical data are compiled and analyzed to discern future trends. The primary advantage of forecasting is that it provides the business with valuable information that it can use to make decisions about the future of the organization.
Time series forecasting is the use of a model to predict future values based on
previously observed values.
Time series analysis comprises methods for analyzing time series data in order
to extract meaningful statistics and other characteristics of the data. Time series
forecasting uses information regarding historical values and associated patterns to
predict future activity. Most often, this relates to trend analysis, cyclical fluctuation
analysis and issues of seasonality. As with all forecasting methods, success is not
guaranteed. Exponential smoothing is a rule-of-thumb technique for time series data that uses the exponential window function. Whereas in the simple moving average the past observations are weighted equally, exponential functions are used to assign exponentially decreasing weights over time, from the newest to the oldest observations. Exponential smoothing is usually used to make short-term forecasts, as longer-term forecasts using this technique can be quite unreliable. Forecast accuracy measures are based on the forecast error, the difference between the forecast and the actual value for a given period. However, the error for one time period does not tell us very much. Two of the most commonly used error measures are the mean absolute deviation (MAD) and the mean squared error (MSE).
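The following Python sketch, with made-up demand data and a smoothing constant of 0.3, shows simple exponential smoothing together with the two error measures just mentioned, MAD and MSE:

# Simple exponential smoothing: F(t+1) = alpha * A(t) + (1 - alpha) * F(t),
# followed by the mean absolute deviation (MAD) and mean squared error (MSE).

def exponential_smoothing(actuals, alpha, initial_forecast):
    forecasts = [initial_forecast]
    for actual in actuals[:-1]:
        forecasts.append(alpha * actual + (1 - alpha) * forecasts[-1])
    return forecasts

actuals = [100, 110, 105, 115, 120]          # hypothetical actual demand
forecasts = exponential_smoothing(actuals, alpha=0.3, initial_forecast=100)

errors = [a - f for a, f in zip(actuals, forecasts)]
mad = sum(abs(e) for e in errors) / len(errors)
mse = sum(e ** 2 for e in errors) / len(errors)
print(forecasts, round(mad, 2), round(mse, 2))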
III. Managing Quality

A quality management strategy defines the required quality level for the project and the approach to be used to ensure that this level is achieved. A more detailed quality management plan is developed in the planning stage. Organizational excellence is defined as the ongoing effort to establish an internal framework of standards and processes intended to engage and motivate employees to deliver products and services that fulfill customer requirements within business expectations. The development of a strategic quality plan is the key to determining the right quality initiatives for your organization. To get started, create a team of quality professionals who are responsible for ensuring the delivery of quality products and services to the organization's customers. Relying on real-time data to measure and reduce the total cost of quality improves customer satisfaction, and making decisions faster on redesigning or eliminating the least profitable production processes helps to keep manufacturing costs down and product quality up.
Garvin proposes eight critical dimensions or categories of quality that can serve as a framework for strategic analysis: performance, features, reliability, conformance, durability, serviceability, aesthetics, and perceived quality. Employee performance therefore has a major impact on customers' perceptions of service quality. Researchers have identified ten overlapping dimensions of service quality: tangibles, reliability, responsiveness, communication, credibility, security, competence, courtesy, understanding/knowing the customer, and accessibility. Quality parameters also apply to products themselves: quality attributes of fruits and vegetables such as appearance, color, aroma, and taste drive consumers' decisions to buy a product. In the past, high quality standards were mainly imposed by retailers (e.g., long shelf life, color, texture, shape) rather than driven by consumers.
Total quality management (TQM) is achieved and becomes part of the overall organizational culture when its five principles are practiced: produce quality work the first time, focus on the customer, have a strategic approach to improvement, improve continuously, and encourage mutual respect and teamwork. The TQM approach is focused on exceeding customers' expectations, identifying problems, building commitment, and promoting open decision-making among workers. To successfully implement TQM, an organization must concentrate on eight key elements:

 Ethics.
 Integrity.
 Trust.
 Training.
 Teamwork.
 Leadership.
 Recognition.
 Communication
Six Sigma is simply a process for solving a problem. It consists of five basic phases: Define, Measure, Analyze, Improve, and Control. Six Sigma can improve new or existing processes using its defined methodologies. DMAIC improves existing processes, while DMADV (define, measure, analyze, design, and verify) is used to develop a new product or service or to redesign a process that has reached its limits. Six Sigma successes are built on key principles such as focusing on customer requirements, using extensive measurement and statistical analysis to understand how work gets done and to identify the root cause of problems (variations), and being proactive in eliminating variation and continually improving the process.
Employee empowerment is giving employees a certain degree of autonomy and responsibility for decision-making regarding their specific organizational tasks. The organization has the responsibility to create a work environment that helps foster the ability and desire of employees to act in empowered ways; employees become more responsible and accountable when self-direction is the norm. Employee involvement and participative management are often used to mean empowerment. When organizations adopt strategies that promote employee empowerment, they benefit through cost savings, improved employee relations, and increased customer satisfaction.
A benchmark is a point of reference by which something can be measured. In
surveying, a "bench mark" (two words) is a post or other permanent mark established at
a known elevation that is used as the basis for measuring the elevation of other
topographical points. There are four primary types of benchmarking: internal,
competitive, functional, and generic. Internal benchmarking is a comparison of a
business process to a similar process inside the organization. Competitive
benchmarking is a direct competitor-to-competitor comparison of a product,
service, process, or method. Benchmarking is a way of discovering what is the best
performance being achieved – whether in a particular company, by a competitor or by
an entirely different industry. This information can then be used to identify gaps in an
organization's processes in order to achieve a competitive advantage.
Just-in-time (JIT) manufacturing, also known as just-in-time production or the Toyota Production System (TPS), is a methodology aimed primarily at reducing times within the production system as well as response times from suppliers and to customers. A just-in-time inventory system is a management strategy that aligns raw-material orders from suppliers directly with production schedules.
There are seven basic quality tools that can assist an organization in problem solving and process improvement. Known around the world as the seven quality control (7-QC) tools, they are:
 Cause-and-effect diagram (also called Ishikawa or fishbone chart)
 Check sheet.
 Control chart.
 Histogram.
 Pareto chart.
 Scatter diagram analysis.
 Stratification
IV. Statistical Process Control

Statistical Process Control (SPC) is an industry-standard methodology for measuring and controlling quality during the manufacturing process. Quality data in the form of product or process measurements are obtained in real time during manufacturing. The control chart is a graph used to study how a process changes over time. Data are plotted in time order. A control chart always has a central line for the average, an upper line for the upper control limit, and a lower line for the lower control limit.
What is a variables control chart? Variables control charts plot continuous measurement process data, such as length or pressure, in a time-ordered sequence. In contrast, attribute control charts plot count data, such as the number of defects or defective units. Attribute charts are a set of control charts specifically designed for attributes (i.e., count) data; they monitor the process location and variation over time in a single chart. In statistical quality control, the x-bar and R chart is a type of control chart used to monitor variables data when samples are collected at regular intervals from a business or industrial process.
The Central Limit Theorem (CLT) is a statistical theory that states that, given a sufficiently large sample size from a population with a finite level of variance, the mean of all samples from the same population will be approximately equal to the mean of the population.
How do you use the central limit theorem? For a "less than" problem:
1. Subtract the population mean μ from the "less than" value x, and set this number aside for a moment.
2. Divide the population standard deviation σ by the square root of the sample size n.
3. Divide the result of step 1 by the result of step 2, giving the z-score z = (x − μ) / (σ / √n).
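A small Python sketch of those three steps, using the standard normal distribution from Python's statistics module (the population mean, standard deviation, sample size, and cut-off value are invented for the example):

# P(sample mean < x) under the CLT: z = (x - mu) / (sigma / sqrt(n)).

from math import sqrt
from statistics import NormalDist

mu, sigma = 65.0, 9.0   # hypothetical population mean and standard deviation
n = 36                  # sample size
x = 62.0                # "less than" value for the sample mean

z = (x - mu) / (sigma / sqrt(n))        # steps 1-3 combined
probability = NormalDist().cdf(z)       # area to the left of z under the standard normal curve
print(round(z, 2), round(probability, 4))  # z = -2.0, probability about 0.0228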
The central limit theorem states that if you have a population with mean μ and standard deviation σ and take sufficiently large random samples from the population with replacement, then the distribution of the sample means will be approximately normally distributed. It is an important result in statistics, most specifically in probability theory. This theorem enables you to measure how much the means of various samples vary without having to use other sample means as a comparison. The central limit theorem states that given a distribution with mean μ and variance σ², the sampling distribution of the mean approaches a normal distribution with mean μ and variance σ²/N as N, the sample size, increases. Keep in mind that N is the sample size for each mean and not the number of samples. The central limit theorem says that this sampling distribution is approximately normal, commonly known as a bell curve. This approximation improves as we increase the size of the simple random samples used to produce the sampling distribution. The central limit theorem tells us exactly what the shape of the distribution of means will be when we draw repeated samples from a given population: as the sample sizes get larger, the distribution of means calculated from repeated sampling will approach normality.
If the subgroup size is between 7 and 10, select the appropriate constant, called D3, and multiply it by R-bar to determine the lower control limit for the range chart: LCL(R) = D3 × R-bar. There is no lower control limit for the range chart if the subgroup size is 6 or less. Plot the lower control limit on the R chart. Control charts are used to routinely monitor quality. In general, the chart contains a center line that represents the mean value for the in-control process. Two other horizontal lines, called the upper control limit (UCL) and the lower control limit (LCL), are also shown on the chart.
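A minimal sketch of these control-limit calculations in Python. The constants A2, D3, and D4 below are the standard control-chart factors for a subgroup size of 5, and the grand average and R-bar values are invented for the example:

# X-bar chart: UCL/LCL = X-double-bar +/- A2 * R-bar
# R chart:     UCL = D4 * R-bar, LCL = D3 * R-bar (D3 = 0 when the subgroup size is 6 or less)

A2, D3, D4 = 0.577, 0.0, 2.114   # factors for subgroup size n = 5

x_double_bar = 50.0   # grand average of the subgroup means
r_bar = 4.0           # average subgroup range

ucl_x = x_double_bar + A2 * r_bar
lcl_x = x_double_bar - A2 * r_bar
ucl_r = D4 * r_bar
lcl_r = D3 * r_bar

print(f"X-bar chart: LCL={lcl_x:.3f}, UCL={ucl_x:.3f}")
print(f"R chart:     LCL={lcl_r:.3f}, UCL={ucl_r:.3f}")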
A p-chart is a type of control chart used to monitor the proportion of nonconforming units when measuring subgroups at regular intervals from a process. A c-chart is a type of control chart used to monitor the total number of nonconformities when measuring subgroups at regular intervals from a process. An attribute chart is a type of control chart for measuring attribute data (vs. continuous data). There are four types of attribute charts: the p chart, np chart, c chart, and u chart. The choice of chart depends on whether you have a problem with defects or defectives, and whether you have a fixed or varying sample size.
In statistical quality control, the c-chart is a type of control chart used to monitor "count"-
type data, typically total number of nonconformities per unit. It is also occasionally used
to monitor the total number of events occurring in a given unit of time.
Quality characteristic type: Attributes data
Measurement type: Number of nonconformities
Rational subgroup size: n > 1
Size of shift to detect: ≥ 1.5σ
What is C chart used for? A c-chart is an attributes control chart used with data
collected in subgroups that are the same size. C-charts show how the process,
measured by the number of nonconformities per item or group of items, changes over
time. Nonconformities are defects or occurrences found in the sampled subgroup.
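A short Python sketch of the usual three-sigma limits for a p-chart and a c-chart (the sample size, average proportion nonconforming, and average count of nonconformities are illustrative values):

# p-chart: p-bar +/- 3 * sqrt(p-bar * (1 - p-bar) / n)
# c-chart: c-bar +/- 3 * sqrt(c-bar); a negative lower limit is set to 0.

from math import sqrt

def p_chart_limits(p_bar, n):
    sigma = sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - 3 * sigma), p_bar + 3 * sigma

def c_chart_limits(c_bar):
    sigma = sqrt(c_bar)
    return max(0.0, c_bar - 3 * sigma), c_bar + 3 * sigma

print(p_chart_limits(p_bar=0.04, n=100))  # average proportion nonconforming, samples of 100
print(c_chart_limits(c_bar=6.0))          # average nonconformities per inspection unit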
A run test is used to help spot abnormalities in a control chart process. It is used when points are not individually out of control but form a pattern above or below the nominal centerline.

V. Process Strategy

Process design refers to the activity of originating and developing a plan for a product, service, or process. A process is any part of an organization that takes a set of input resources and uses them to transform something into outputs of products or services.
Process Analysis and Design is a systematic approach to improve our
understanding of the business processes of an organization to assist in the
realization of tangible benefits such as cost reduction, process efficiency, and
effective human resource allocation. The Process Analysis and Design team in
Organizational Excellence supports campus clients on their journey toward high
performance by combining talent, experience and campus knowledge with a
comprehensive array of methodologies and frameworks to deliver results. The major objective of any process in the business is to support the business's overall objectives. Therefore, process design must reflect the relative priority of the normal performance objectives: quality, speed, dependability, flexibility, and cost.
Analysis emphasizes an investigation of the problem and requirements, rather than
a solution. Design emphasizes a conceptual solution (in software and hardware) that
fulfills the requirements, rather than its implementation. For example, a description of
a database schema and software objects.
A flow chart is a diagram that visualizes a process or workflow. Typically, you use
boxes or shapes to represent different steps in a process, and then you connect
those steps with lines or arrows.

Five Tips for Better Flowcharts


1. Use Consistent Design Elements. Shapes, lines and texts within a flowchart
diagram should be consistent. ...
2. Keep Everything on One Page. ...
3. Flow Data from Left to Right. ...
4. Use a Split Path Instead of a Traditional Decision Symbol. ...
5. Place Return Lines under the Flow Diagram.

A flowchart, on the other hand, is a diagram containing the different steps through which a problem can be explained. In short, a flowchart is the pictorial representation of a process, while an algorithm describes the same process step by step; a flowchart is a tool that is often used alongside algorithms. Flowcharts use special shapes to represent different types of actions or steps in a process. Lines and arrows show the sequence of the steps and the relationships among them.
Time-function mapping is a flow diagram with time added on the horizontal axis. This tool is also called process mapping: the nodes indicate the activities and the arrows indicate the flow direction, with time on the horizontal axis. Business process mapping, a part of Business Process Management (BPM), is a framework used to create visual representations of work processes.
Process mapping allows you to visually communicate the important details of a process rather than writing extensive directions. Flowcharts and process maps are used to show others how a process is done and to improve communication between individuals engaged in the same process. Process mapping is the technique of using flowcharts to illustrate the flow of a process, proceeding from the most macro perspective down to the level of detail required to identify opportunities for improvement. Process mapping focuses on the work rather than on job titles or hierarchy.
A flowchart is a graphical representation of the sequence of steps or tasks (workflow) constituting a process, from raw materials through to the finished product. It serves as a tool for examining the process in detail to identify areas of possible improvement; it is also called a process map. A flow process chart is a symbolic representation that illustrates the sequence of actions within a process, recording the steps of a process along a vertical line. The flow process chart uses different symbols to indicate the type of activity being undertaken and text to give details of those activities.
A service blueprint is an operational planning tool that provides guidance on how a service will be provided, specifying the physical evidence, staff actions, and support systems/infrastructure needed to deliver the service across its different channels. (A blueprint, in the original sense, is a reproduction of a technical drawing using a contact print process on light-sensitive sheets; it was widely used for over a century for the reproduction of specification drawings in construction and industry.)

VI. Capacity Planning

Capacity planning is the process of determining the production capacity needed by an organization to meet changing demands for its products. In the context of capacity planning, design capacity is the maximum amount of work that an organization is capable of completing in a given period.
Capacity planning can apply to a company's computer network, storage, workforce maintenance, and product manufacturing. Planning for capacity breaks down into three steps: determining service level requirements, analyzing current capacity, and planning for the future. Capacity planning is the process of developing and implementing a capacity strategy to scale to business volumes. Capacity includes things like labor, land, facilities, infrastructure, machines, and resources that are required to deliver products, services, and processes.
Break-even analysis is a technique widely used by production management and
management accountants. Total variable and fixed costs are compared with sales
revenue in order to determine the level of sales volume, sales value or production at
which the business makes neither a profit nor a loss. A break-even analysis is a
useful tool for determining at what point your company, or a new product or
service, will be profitable. Said another way, it's a financial calculation used to
determine the number of products or services you need to sell to at least cover your
costs. Break-even analysis is widely used to determine the number of units the
business needs to sell in order to avoid losses. This calculation requires the
business to determine selling price, variable costs and fixed costs.
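A minimal sketch of the break-even calculation in Python (the price, variable cost, and fixed cost figures are invented for the example):

# Break-even point (units) = fixed costs / (selling price per unit - variable cost per unit)

def break_even_units(fixed_costs, price_per_unit, variable_cost_per_unit):
    contribution_margin = price_per_unit - variable_cost_per_unit
    return fixed_costs / contribution_margin

units = break_even_units(fixed_costs=50_000, price_per_unit=25.0, variable_cost_per_unit=15.0)
print(units)  # 5000.0 units must be sold to make neither a profit nor a loss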
Expected monetary value (EMV) is a statistical technique in risk management that is used to quantify risks, which, in turn, assists the project manager in calculating the contingency reserve. EMV is a risk management technique that helps quantify and compare risks in many aspects of the project. It is a quantitative risk analysis technique, since it relies on specific numbers and quantities to perform the calculations rather than high-level approximations like high, medium, and low.
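A small Python sketch of an expected monetary value calculation for a set of project risks (the risk names, probabilities, and impacts are illustrative):

# EMV of a risk = probability of the risk occurring * monetary impact if it occurs.
# Summing the EMVs of all identified risks gives a basis for the contingency reserve.

risks = [
    {"name": "supplier delay",    "probability": 0.30, "impact": -20_000},  # threat (cost)
    {"name": "equipment failure", "probability": 0.10, "impact": -50_000},  # threat (cost)
    {"name": "early completion",  "probability": 0.20, "impact": 15_000},   # opportunity (gain)
]

emv_total = sum(r["probability"] * r["impact"] for r in risks)
print(emv_total)  # -6000.0 + -5000.0 + 3000.0 = -8000.0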
Net present value is the value in the present of a sum of money, in contrast to some
future value it will have when it has been invested at compound interest. Net present
value (NPV) is determined by calculating the costs (negative cash flows) and benefits
(positive cash flows) for each period of an investment. NPV is the sum of all the
discounted future cash flows.
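A minimal NPV sketch in Python (the 10 percent discount rate and the cash flows, with the initial investment as a negative flow in period 0, are invented for the example):

# NPV = sum of cash_flow_t / (1 + rate)^t over all periods t.

def npv(rate, cash_flows):
    """cash_flows[0] is the period-0 flow (e.g. the initial investment, negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

cash_flows = [-10_000, 4_500, 4_500, 4_500]   # initial outlay, then three annual inflows
print(round(npv(0.10, cash_flows), 2))        # -> 1190.83; positive, so inflows exceed the outlay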

VII. Inventory Management

Inventory management is the supervision of non-capitalized assets and stock items. As a component of supply chain management, inventory management supervises the flow of goods from manufacturers to warehouses and from these facilities to the point of sale.
Generally, inventory types can be grouped into four classifications: raw materials, work-in-process, finished goods, and MRO goods. A more detailed breakdown distinguishes:

 RAW MATERIALS
 WORK-IN-PROCESS
 FINISHED GOODS
 TRANSIT INVENTORY
 BUFFER INVENTORY
 ANTICIPATION INVENTORY
 DECOUPLING INVENTORY
 CYCLE INVENTORY

The cost of inventory includes all costs associated with holding or storing inventory for sale. These costs include the opportunity cost of the money used to purchase the inventory, the space in which the inventory is stored, the cost of transportation or handling, and the cost of deterioration and obsolescence. Inventory is generally categorized as raw materials, work-in-progress, and finished goods. Retailers typically refer to this inventory as merchandise; common examples of merchandise include electronics, clothes, and cars held by retailers.
The four basic types of inventory are raw materials (purchased but unprocessed inventory), work-in-process (partially processed inventory), finished goods (inventory that has been completely processed or assembled and is ready for sale or shipment to customers), and MRO goods (maintenance, repair, and operating supplies).
ABC analysis is a method of analysis that divides the subject up into three categories: A, B, and C. Category A represents the most valuable products or customers that you have; these are the products that contribute heavily to your overall profit without eating up too much of your resources. In materials management, ABC analysis (or selective inventory control) is an inventory categorization technique. ABC analysis suggests that the inventories of an organization are not of equal value; thus, the inventory is grouped into three categories (A, B, and C) in order of their estimated importance.
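A simple sketch of an ABC classification in Python, using a common rule of thumb of roughly 80 percent of cumulative annual usage value for class A and the next 15 percent for class B (the items, values, and cut-offs are made up; real thresholds vary by organization):

# Rank items by annual usage value, then assign A/B/C by cumulative share of total value.

items = {"P1": 60_000, "P2": 25_000, "P3": 8_000, "P4": 4_000, "P5": 3_000}  # annual value per item

total = sum(items.values())
cumulative = 0.0
classes = {}
for name, value in sorted(items.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += value
    share = cumulative / total
    classes[name] = "A" if share <= 0.80 else ("B" if share <= 0.95 else "C")

print(classes)  # e.g. {'P1': 'A', 'P2': 'B', 'P3': 'B', 'P4': 'C', 'P5': 'C'}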
A cycle count is an inventory auditing procedure, which falls under inventory
management, where a small subset of inventory, in a specific location, is counted on a
specified day. Cycle counting involves counting a small amount of inventory each day,
with the intent of cycling through the entire inventory on an ongoing basis. Any errors
found during these small incremental counts should result in an adjustment to the
inventory accounting records.
Independent demand is demand for a finished product, such as a computer, a bicycle,
or a pizza. Dependent demand, on the other hand, is demand for component parts or
subassemblies. For example, this would be the microchips in the computer, the wheels
on the bicycle, or the cheese on the pizza.
Production order quantity is a model that answers how much to produce and
when to order. Economic order quantity is an equation for inventory that determines the
ideal order quantity a company should purchase for its inventory given a set cost
of production, demand rate and other variables.
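A minimal economic order quantity sketch in Python using the standard EOQ formula (the annual demand, ordering cost, and holding cost figures are invented):

# EOQ = sqrt(2 * D * S / H), where D = annual demand, S = cost per order,
# and H = annual holding cost per unit.

from math import sqrt

def economic_order_quantity(annual_demand, order_cost, holding_cost_per_unit):
    return sqrt(2 * annual_demand * order_cost / holding_cost_per_unit)

eoq = economic_order_quantity(annual_demand=12_000, order_cost=50.0, holding_cost_per_unit=2.4)
print(round(eoq))   # ideal order size (about 707) that balances ordering and holding costs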
A quantity discount is an incentive offered to a buyer that results in a decreased cost
per unit of goods or materials when purchased in greater numbers. A quantity
discount is often offered by sellers to entice buyers to purchase in larger quantities.
To calculate a quantity discount, multiply the number of widgets purchased by the discount rate associated with purchasing that number of widgets, and then multiply by the price of each widget. For example, if 2,998 widgets are purchased at $10 each with a 20 percent quantity discount, the discount is 2,998 × 20% × $10 = $5,996.
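A small Python sketch of the discount calculation described above, using the same illustrative figures (2,998 widgets, a 20 percent discount, and a $10 unit price):

# Discount amount = quantity * discount rate * unit price; net cost = gross cost - discount.

quantity = 2_998
unit_price = 10.0
discount_rate = 0.20      # 20 percent discount for ordering at this quantity

gross_cost = quantity * unit_price
discount = quantity * discount_rate * unit_price
net_cost = gross_cost - discount

print(discount, net_cost)   # 5996.0 discount on a 29980.0 gross cost -> 23984.0 net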

VIII. Aggregate Planning S&OP

In aggregation, the relation between two entities is treated as a single entity; the relationship and its corresponding entities are aggregated into a higher-level entity. More generally, an aggregation is a collection or gathering of things together; your baseball card collection might represent the aggregation of lots of different types of cards. Aggregation comes from the Latin aggregare, built on grex, meaning herd or flock, so the word was first used literally to mean to herd or to flock. In object-oriented design, aggregation is a way of composing different abstractions together when defining a class. For example, a car class can be defined to contain other classes such as an engine class, a seat class, and a wheels class; the car class can define an engine class as one of its attributes.
Aggregate planning is the process of developing, analyzing, and maintaining a preliminary, approximate schedule of the overall operations of an organization. The aggregate plan generally contains targeted sales forecasts, production levels, inventory levels, and customer backlogs. Sales and operations planning (S&OP) is an important process that aims to ensure that customer demand can be met by production, distribution, and purchasing. With this foundation, demand and supply balancing, as well as operations and executive review, can be conducted with speed and efficiency.
Aggregate planning is an operational activity that produces an aggregate plan for the production process, 6 to 18 months in advance, to give management an idea of what quantity of materials and other resources are to be procured and when, so that the total cost of operations of the organization is kept to a minimum over that period. Aggregate planning plays an important part in achieving the long-term objectives of the organization. It helps in achieving financial goals by reducing overall variable cost and improving the bottom line, and in maximizing utilization of the available production facility.
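As a simple illustration of aggregate-planning trade-offs, here is a hedged Python sketch of a level production strategy, where a constant monthly production rate absorbs demand swings through inventory; all demand, cost, and quantity numbers are invented, and shortages are not costed in this sketch:

# Level strategy: produce the same amount every period and let inventory rise and fall,
# then cost the plan as holding cost on end-of-period inventory.

monthly_demand = [900, 950, 1_000, 1_050, 1_100, 1_000]  # hypothetical 6-month forecast
holding_cost_per_unit = 2.0                              # cost to carry one unit for one month

level_rate = sum(monthly_demand) / len(monthly_demand)   # constant production rate = 1000
inventory, holding_cost = 0.0, 0.0
for demand in monthly_demand:
    inventory += level_rate - demand                 # production minus demand accumulates as stock
    holding_cost += holding_cost_per_unit * max(inventory, 0.0)

print(level_rate, round(holding_cost, 2))   # 1000.0 units/month, 1000.0 total holding cost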

IX. Short-Term Scheduling

Short-term scheduling concerns the allocation of limited resources to tasks over time. The objective of short-term scheduling is to allocate and prioritize demand, matching daily and hourly requirements to specific personnel and equipment. (In computing, the short-term scheduler decides which of the ready, in-memory processes is to be executed after a clock interrupt, an I/O interrupt, an operating system call, or another form of signal. The short-term scheduler selects processes from the ready queue, while the long-term scheduler, also called the job scheduler, controls the degree of multiprogramming; the short-term scheduler has less control over the degree of multiprogramming.)
Input/output control is a technique for capacity control in which the planned and actual inputs and the planned and actual outputs of a work center are monitored. Planned inputs and outputs for each work center are developed by capacity requirements planning and approved by manufacturing management. Input-output control allows operations to manage facility workflow; it is used to control the size of the queues in front of work centers, thereby helping to control manufacturing lead times.
A Gantt chart is a type of bar chart that illustrates a project schedule. This chart lists the
tasks to be performed on the vertical axis, and time intervals on the horizontal axis. The
width of the horizontal bars in the graph shows the duration of each activity.
A Gantt chart is a horizontal bar chart developed as a production control tool in
1917 by Henry L. Gantt, an American engineer and social scientist. Frequently used
in project management, a Gantt chart provides a graphical illustration of a schedule that
helps to plan, coordinate, and track specific tasks in a project. Gantt charts are useful
for planning and scheduling projects. They help you assess how long a project should
take, determine the resources needed, and plan the order in which you'll complete
tasks. They're also helpful for managing the dependencies between tasks.
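To make the scheduling idea behind a Gantt chart concrete, here is a small Python sketch that computes start and finish times for tasks with durations and predecessor dependencies (the task names, durations, and dependencies are invented); the resulting start/finish pairs are exactly what the horizontal bars of a Gantt chart would show:

# Forward scheduling: a task starts when all of its predecessors have finished.

tasks = {                      # task: (duration in days, list of predecessor tasks)
    "design":   (3, []),
    "purchase": (2, ["design"]),
    "build":    (5, ["design", "purchase"]),
    "test":     (2, ["build"]),
}

finish = {}
def finish_time(name):
    if name not in finish:
        duration, predecessors = tasks[name]
        start = max((finish_time(p) for p in predecessors), default=0)
        finish[name] = start + duration
    return finish[name]

for name in tasks:
    duration, _ = tasks[name]
    print(f"{name}: start day {finish_time(name) - duration}, finish day {finish_time(name)}")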
Assignment method is a way of allocating organizational resources where a resource is
assigned to a particular task. The model is a special case of the transportation method.
In order to generate an assignment problem, it is necessary to provide the number of jobs and machines and to indicate whether the problem is a minimization or maximization problem. The numbers of jobs and machines do not have to be equal, but usually they are.
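The assignment method can be carried out with the Hungarian algorithm. The sketch below assumes SciPy and NumPy are available and uses scipy.optimize.linear_sum_assignment on a hypothetical cost matrix of three jobs and three machines (a minimization problem; to maximize, negate the matrix):

# Assign each job to exactly one machine so that total cost is minimized.

import numpy as np
from scipy.optimize import linear_sum_assignment

# cost[i][j] = cost of running job i on machine j (illustrative figures)
cost = np.array([
    [9, 2, 7],
    [6, 4, 3],
    [5, 8, 1],
])

job_idx, machine_idx = linear_sum_assignment(cost)   # Hungarian algorithm
for j, m in zip(job_idx, machine_idx):
    print(f"job {j} -> machine {m} (cost {cost[j, m]})")
print("total cost:", cost[job_idx, machine_idx].sum())   # 2 + 6 + 1 = 9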
X. Orlando Utilities Commission

The Orlando Utilities Commission (OUC: "The Reliable One") is a municipally owned public utility providing water and electric service to the citizens of Orlando, Florida, and portions of adjacent unincorporated areas of Orange County, as well as St. Cloud, Florida, in Osceola County. Each generating unit is taken off-line every three years for a complete overhaul and turbine generator inspection. Overhauls are scheduled for spring (March to May) and fall (September to December), when the weather is mildest and demand for power is low. Scheduling overhauls is not easy; each one involves 1,800 distinct tasks and requires 72,000 labor hours. Established in 1923 by a special act of the Florida Legislature, OUC is the second largest municipal utility in Florida and the 14th largest in the country. OUC provides electric, water, chilled water, and/or lighting services to more than 240,000 customers. OUC owns and operates the Curtis H. Stanton Energy Center in east Orange County, the most diverse generating site in the state: natural gas, landfill methane gas, coal, and solar are on the 3,280-acre property, which can generate more than 1,800 megawatts of electricity.
RAMS is an acronym for Reliability, Availability, Maintainability, and Safety. Reliability is
a product's or system's ability to perform a specific function and may be given as
design reliability or operational reliability. Availability is the ability of a system to be kept
in a functioning state. People often confuse reliability and availability. Simply put, availability is a measure of the percentage of time the equipment is in an operable state, while reliability is a measure of how long the item performs its intended function. Availability can be measured as uptime / total time (uptime + downtime). Availability is the probability that a system will work as required, when required, during the period of a mission.
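A tiny Python sketch of the availability calculation just described (the uptime and downtime hours are invented):

# Availability = uptime / (uptime + downtime)

uptime_hours = 950.0
downtime_hours = 50.0

availability = uptime_hours / (uptime_hours + downtime_hours)
print(f"{availability:.1%}")   # 95.0% of the total time the equipment was operable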
Parallel Redundancy Protocol (PRP) is a network protocol standard for Ethernet that
provides seamless failover against failure of any network component. This redundancy
is invisible to the application.
PRP nodes have two ports and are attached to two separated networks of similar
topology. PRP can be implemented entirely in software, i.e. integrated in the network
driver. Nodes with single attachment can be attached to one network only. 

XI. Decision-Making Tools

Decision-Making Fundamentals. Every decision that we make shapes our future.


A structured decision-making process ensures that important decisions are made on
time and are based on facts, research, and analysis. Identify the decision. The first step
in making the right decision is recognizing the problem or opportunity and deciding to
address it. Determine why this decision will make a difference to your customers or
fellow employees.
Operations strategy is revealed in the total pattern of decisions that a business takes in developing its operations in the long term. In management, the manager or decision-makers always have to plan how to increase the company's or establishment's profit and have to take the right decisions.
Conditions that Influence Decision Making. Managers make problem‐solving decisions
under three different conditions: certainty, risk, and uncertainty.
Certainty is perfect knowledge that has total security from error, or the mental state of
being without doubt. Objectively defined, certainty is total continuity and validity of all
foundational inquiry, to the highest degree of precision.
Decision making is the process of making choices by identifying a decision,
gathering information, and assessing alternative resolutions. Using a step-by-step
decision-making process can help you make more deliberate, thoughtful decisions by
organizing relevant information and defining alternatives.
A decision can be defined as a course of action purposely chosen from a set of alternatives to achieve organizational or managerial objectives or goals. The decision-making process is a continuous and indispensable component of managing any organization or business activity. Decision making is important for achieving organizational goals and objectives within a given time and budget.
At the highest level we have chosen to categorize decisions into three major types: decision making under uncertainty, decision making under certainty, and decision making under risk.
Decision-making is needed whenever an individual or an organization (private or public) is faced with a situation of selecting an optimal (or best, in view of certain objectives) course of action from among several available alternatives. A decision problem in which the decision-maker is aware of the various possible states of nature but has insufficient information to assign any probabilities of occurrence to them is termed decision-making under uncertainty: the probabilities of occurrence of the various states of nature are not known. Risk, by contrast, implies a degree of uncertainty and an inability to fully control the outcomes or consequences of an action.
Decision trees are a type of supervised machine learning (where you specify what the input is and what the corresponding output is in the training data) in which the data is continuously split according to a certain parameter. A simple decision tree can be represented as a binary tree.
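To tie the decision environments above together, here is a hedged Python sketch of decision-making under risk using a small payoff table: each alternative's expected value is the probability-weighted sum of its payoffs across the states of nature (the alternatives, states, probabilities, and payoffs are all invented for the example):

# Decision making under risk: choose the alternative with the highest expected payoff.

states = {"low demand": 0.3, "medium demand": 0.5, "high demand": 0.2}   # probabilities sum to 1

payoffs = {                 # payoff of each alternative under each state of nature
    "small facility": {"low demand": 40, "medium demand": 50, "high demand": 55},
    "large facility": {"low demand": -20, "medium demand": 60, "high demand": 130},
}

expected = {
    alt: sum(prob * payoffs[alt][state] for state, prob in states.items())
    for alt in payoffs
}
best = max(expected, key=expected.get)
print(expected, "->", best)   # small: 48.0, large: 50.0 -> large facility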
XII. Linear Programming

Linear programming is used for obtaining an optimal solution to a problem with given constraints. In linear programming, we formulate a real-life problem as a mathematical model that involves an objective function and linear inequalities subject to constraints. Linear programming is often used in business to find maximum profit or minimum cost. The first step in solving linear programming problems is to set up a function that represents cost, profit, or some other quantity to be maximized or minimized subject to the constraints of the problem. Linear programming is used to obtain optimal solutions in operations research: it allows researchers to find the best, most economical solution to a problem within all of its limitations, or constraints. Many fields use linear programming techniques to make their processes more efficient.
Linear programming (LP) is a mathematical modeling technique useful for allocating limited resources, such as material and machines, to several competing activities, such as projects or services. In linear programming, a linear function is maximized or minimized subject to various constraints. Operations research is an approach to decision-making that involves a set of methods to operate a system; a delivery model is one example of such a system. Some of the advantages of linear programming are that it can be used to analyze numerous economic, social, military, and industrial problems, that it is well suited to solving complex problems, and that it supports the simple and productive management of an organization, which gives better outcomes.

All linear programming problems must have the following five characteristics:

 (a) Objective function - There must be a clearly defined objective which can be stated in a quantitative way. In business problems the objective is generally profit maximization or cost minimization.
 (b) Constraints - All constraints (limitations) regarding resources should be fully spelled out in mathematical form.
 (c) Optimization - All linear programming problems are problems of optimization. This means that the true purpose behind solving a linear programming problem is to either maximize or minimize some value. Thus, linear programming problems are often found in economics, business, advertising, and many other fields that value efficiency and resource conservation. Examples of items that can be optimized are profit, resource acquisition, free time, and utility.
 (d) Non-negativity - The value of the variables must be zero or positive, not negative. For example, in the case of production, the manager can decide on any product quantity that is positive or at minimum zero, but not negative.
 (e) Linearity - The relationships between variables must be linear. Linear means a proportional relationship between two or more variables, i.e., the degree of the variables should be at most one.

To solve a linear programming problem graphically, follow these steps: graph the region corresponding to the solution of the system of constraints; find the coordinates of the vertices of the region formed; and evaluate the objective function at each vertex to determine which x- and y-values, if any, maximize or minimize the function.
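The same kind of problem can also be solved with a solver instead of graphing. Below is a hedged Python sketch that assumes SciPy is available and uses scipy.optimize.linprog on a small hypothetical product-mix problem: maximize profit 3x + 5y subject to 2x + 4y ≤ 80 (machine hours), x + y ≤ 30 (labor hours), and x, y ≥ 0. Because linprog minimizes, the objective coefficients are negated:

# Maximize 3x + 5y subject to 2x + 4y <= 80, x + y <= 30, x >= 0, y >= 0.

from scipy.optimize import linprog

c = [-3, -5]                    # negate the profits because linprog minimizes
A_ub = [[2, 4], [1, 1]]         # left-hand sides of the <= constraints
b_ub = [80, 30]                 # right-hand sides (available machine and labor hours)
bounds = [(0, None), (0, None)] # non-negativity constraints

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
x, y = result.x
print(round(x, 2), round(y, 2), round(-result.fun, 2))  # optimal mix (20, 10) and maximum profit 110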
Linear programming is a mathematical modeling technique in which a linear function is maximized or minimized when subjected to various constraints. This technique has been useful for guiding quantitative decisions in business planning, in industrial engineering, and, to a lesser extent, in the social and physical sciences.
Linear programming is a method that is used to find a minimum or maximum value for a function. That value must satisfy a known set of conditions called constraints. Constraints are the inequalities in the linear programming problem; their solution is graphed as a feasible region, which is a set of points. The constraints may be equalities or inequalities. Requirements that the variables be zero or positive are called non-negativity constraints and are often found in linear programming problems; the other constraints are then called the main constraints. The function to be maximized (or minimized) is called the objective function.
Linear programming provides a method to optimize operations within certain constraints. It is used to make processes more efficient and cost-effective. Some areas of application for linear programming include food and agriculture, engineering, transportation, manufacturing, and energy. An LP model consists of basic components such as decision variables, which represent the quantities to be determined; an objective function, which represents how the decision variables affect the cost or value to be optimized (minimized or maximized); and constraints, which represent how the decision variables are limited by the available resources.
Linear programming is a mathematical method for determining the optimal scenario. The theory of linear programming is also an important part of operational research. It is frequently used in business, but it can be used to resolve certain technical problems as well.
Submitted by: Carla Joyce M. Chan
Submitted to: Sir Isaiah Jacob Depusoy
