CBET Assessment

ASSESSMENT IN COMPETENCY-BASED EDUCATION

NCTVET, Jamaica
November 2006
CONTENTS

Introduction
I. Assessment in Competency-Based Education
II. Modular Assessment
III. Authentic vs. Traditional Assessment
IV. Methods of Assessment Used by NCTVET in Competency-Based Education
V. Quality Assurance and Record-Keeping
VI. The Process Used by NCTVET for Modular Assessment and Certification
Appendices
INTRODUCTION
Most people involved in the training and educational process are interested in knowing
how effective the training has been, whether or not learning has taken place, how the
courses can be improved, or how well the trainees are progressing. Assessing the
students can generate information which will suggest what should be changed, what is
working and therefore should be continued, what needs a little fine-tuning, and what an
individual has learned.
I. ASSESSMENT IN COMPETENCY-BASED EDUCATION

It is important that teachers involved in the instructional delivery of the programme fully understand and are aware of the requirements of Competency-Based Education (CBE). This section provides an overview of CBE and should be read by all teachers and assessors involved in the implementation of the programme in secondary schools.
Through the establishment of industry lead groups, the National Training Agencies guide the development of the occupational standards and curriculum materials, which are presented either in a modular format or in units. Both formats facilitate competency-based training and assessment in the TVET system, as both approaches seek to systematically identify and develop the essential skills, knowledge and attitudes for the job.
Competency-Based Education (CBE) is built on the philosophy that "almost all learners can learn equally well if they receive the kind of instruction they need". To make this philosophy work, CBE requires significant changes in the development and administration of the modularized/unit-based programmes. Although technical vocational education has always been concerned with the practical demonstration of the skill, CBE places a new and systematic emphasis on this principle. In this approach, the systematic development and delivery of the training is guided by five essential elements:
(i) The tasks to be taught are identified by the experts in the occupation.
(ii) The programme allows each learner the opportunity to develop and to be evaluated on the competencies achieved.
CBE also changes the role of the teacher, from the conventional information-giver to that of a resource person. Hence, the students/trainees will have more responsibility for their own learning and progress. This kind of student/trainee involvement is critical to CBE. Therefore, at the start of the training programme students/trainees should be made aware of the key elements of CBE, that is:
(iv) Students/trainees are informed about the criteria and attitudes important to the occupation.
Other Elements of a Competency-based Education Programme
6. Skill mastery records are maintained for each learner, and learners have
access to and make use of these records to monitor their skill acquisition.
B. Purposes of Assessment
Assessment serves several purposes. It gives information about the knowledge, skills and attitudes students have acquired. Assessment helps to determine the level of competence the students have reached and whether they can apply that knowledge, and it can help in providing high-quality instruction for the students enrolled in your training programme. You and other people can put this information to several important uses.
C. Definitions and Terminologies
Terms used in the testing process are often misunderstood or used inaccurately. Terms used in measurement, such as evaluation, assessment, achievement and testing, refer to specific processes. They are often used in the same breath, or interchangeably, but although closely related they do not mean the same thing.
Assessment
As far as possible, the term “assessment” should be reserved for application to people.
It covers activities included in grading, be it formal or non-formal, examining, certifying
and so on. Student achievement on a particular course may be assessed. A trainer, an
instructor, or a student's competence may be assessed; an applicant's aptitude for a
particular job may also be assessed. Assessment is therefore the gathering of
information or evidence about an individual’s ability to perform to clearly stated
standards.
Evaluation
Evaluation refers to the interpretation of the data to determine how well the student has grown towards the goals and instructional objectives - how well he or she has performed. Analysis of data collected from various instruments can be used to determine the level of competency of the student in a skill area. When decisions are made based on the interpretation of the data that is collected, that is the point at which an evaluation has taken place.
its own sake. It may be included in an assessment or evaluation procedure, but it is
more to be regarded as a basic research procedure.
Tests
Individual performance can be measured by using different instruments often referred
to as tests. A test is a set of items designed to measure the performance of a student or
trainee. The three (3) areas of student performance most often tested are those relating
to:
(a) Achievement - The extent of the learner’s knowledge in the
skill area
D. Assessment Concepts
II. MODULAR ASSESSMENT
What is Modular Assessment?
Modular assessment strategies are usually adopted wherever individualized
competency-based education and training is implemented. Typically, modular
assessment is integrated into the instructional package to ensure that mastery of the
outcomes is based on the demonstration of the competencies defined rather than “test
taking skills”.
A. Principles of Assessment
1. Validity
An assessment is valid in so far as it actually assesses what it sets out to
measure.
2. Reliability
The assessment produces the same results on different occasions and with
different assessors.
3. Authenticity
Assessment can be shown to relate to the student’s own individual work.
4. Accessibility
An assessment is accessible in so far as it is available as frequently, and covers as wide a range of tasks, as conditions allow.
5. Efficiency
The assessment methods used avoid unnecessary length and duplication.
6. Adequacy of Feedback
The results are recorded and are available to the users speedily and in sufficient
detail to be of positive use.
7. Cost-Effectiveness
In meeting all the requirements above, and as far as possible, the assessment procedures adopted should be cost-effective.
C. Key Differences between Competency-based Assessment and
Traditional Modes of Assessment
Modes of Assessment
- How do we assess?
- What do we assess?
- What do we do with the assessment?
The choice of mode used will be determined by the purpose of assessment. The purpose may in turn be determined by the nature of the subject/course/skill being assessed, the aims and objectives of the curriculum/course, the intended use of the assessment information (placement, selection) and differences in teaching style.
MODES AND EXAMPLES/JUSTIFICATION

Formal: public examinations (CEE, CXC, GCE, NCTVET), school exams. Contrived situations to provide a final judgement at one sitting.

Informal: recorded observations, two-way interaction between teacher and pupil; provides feedback.

Final/Summative: at the end of the course; concerned with the final summing up of the student's performance. Often used as a means of ranking and for selection purposes.
D. Integrated Assessment

Modular assessment is integrated in the module format. Assessment should be an on-going (continuous) process which provides the opportunity for more than one attempt (if necessary) at successful completion of the module. The following compares modular instruction with modular assessment.
E. Criterion-referenced Tests

Norm-referenced interpretation matches the individual's score against the scores of the group or of other individuals.
1. How does it compare with the average score of some group of people?
III. AUTHENTIC VS. TRADITIONAL ASSESSMENT

Authentic assessments present the student with the full array of tasks that mirror the priorities and challenges found in the best instructional activities. Conventional tests are usually limited to paper-and-pencil, one-answer questions.

Authentic assessments attend to whether the student can craft polished, thorough and justifiable answers, performances or products. Conventional tests typically only ask the student to select or write correct responses, irrespective of reasons.

Authentic tasks involve "ill-structured" challenges and roles that help students rehearse for the complex ambiguities of the "game" of adult and professional life. Traditional tests are more like drills, assessing static and too-often arbitrarily discrete or simplistic elements of those activities.
Authentic assessments are enabling and forward-looking, not just reflective of prior
teaching. In many training and teaching settings the essential challenges are known in
advance.
Traditional tests, by requiring complete secrecy for their validity, make it difficult for
instructors and learners to rehearse and gain the confidence that comes from knowing
their performance obligations.
AUTHENTIC ASSESSMENT
IV. METHODS OF ASSESSMENT USED BY ASSESSORS IN CBE
There are many methods of gathering evidence that can be used to enable an assessor
to make a judgement about the learner’s competence. The methods chosen should be
the most direct and relevant to the competencies or learning outcomes being assessed.
(Over-reliance on one particular method should be avoided.) Workplace tasks may be simulated. The assessor makes a judgement as to which combination of methods provides simplicity and flexibility and is best suited to the competency that is being assessed. The assessment methods may include:
Skills Demonstration
Methods: work sample; skill sample; practical project; structured problems and tasks
Types: checklists; rating scales; research tasks; assignments

Direct Observation
Methods: product and/or processes on the job
Types: checklists; rating scales; research tasks; log books; skills books; work experience; interaction analysis; peer assessments; group assessment

Indirect Observation
Methods: product and/or processes on the job
Types: evidence from supervisors, colleagues and clients; portfolios

Evidence of Prior Learning
Methods: examination of evidence
Types: portfolios; logbooks; qualifications; referees; supervisor reports
Assembling Performance-Based Tests

Process vs. Product

Process measurement consists of the evaluation of the steps the learner goes through in order to perform the task.
Characteristics

There are two key areas of interest in process measurement:
1. The quality of the performance
2. The efficiency, in terms of rate, speed, and approach
Product Measurement consists of the evaluation of the final outcome (or product)
of performing the task.
Characteristics
Process Measurement
This is the assessment of the steps the learner goes through in order to perform the
task. The facilitator/examiner/instructor observes each step to determine how the task is
performed. A prepared checklist is useful in maintaining objectivity.
Process measurement answers questions such as:
Product Measurement
This is the assessment of the final outcome on completion of the task. It answers
questions such as:
(a) Does the finished product meet the design specifications?
(b) Is the finished product neat in appearance?
(c) Does the finished product meet safety standards?
Process measurement is used when the steps to complete a task are critical, when safety is a major concern, and when the efficiency of the operation must be observed in order to determine competency.
In some cases, the judgement of the final outcome is all that may be required to
determine competency. Visual inspection or testing of the product may be objectively
done and the competency determined without having seen the process.
For example, inspecting a weld and submitting it to a stress test (product measure) may be of greater value in measuring competency than watching the steps taken in the welding process (process measure).
In a situation where the student is setting up the oxygen and acetylene tanks for the first
time without assistance, the instructor may want to observe each step (process) in
addition to ensuring that the unit is welded properly when gases are turned on
(product).
Product Measurement: easy to develop and administer this type of assessment; does not detect errors in performance that may have affected the outcome (finished product).

Process Measurement: more time is required to develop and administer this type of assessment; can pinpoint exactly where procedural errors occurred.
• The item deals with essential aspects of the content area and not with trivial
aspects
• The length of the item is directly related to the level of abilities and skills being
assessed
Good performance tests can be prepared by following these steps:
Operationalise the task, define the concept, determine the context of the
assessment – “simulated” or “naturally occurring”.
Determine the Performance Objectives
Describe exactly what you wish to test:
What skills do you expect the candidate to have and how do you expect
him/her to use/apply it in a given situation?
- What does the industry/certifying body expect of the
candidate?
- Do you wish to measure accuracy, speed, ability to plan, use tools, manipulate/handle materials?
- Will product or process or both be tested?
- Prepare the scoring system which will be used to measure the student's performance. A checklist reduces error.
- Establish the minimum acceptable level of performance for mastery.
Example
Step 4 Prepare for sampling and verification
Advantages
Disadvantages
ORAL QUESTIONING TECHNIQUES
Questioning is a valuable part of the process of learning because it helps the learner and
facilitator establish the extent of what is known and to develop new ideas. Questions
can be used to help learners to reflect on their understanding of a topic and make
improvements in learning and thinking.
For assessment purposes oral questions require some amount of thought and planning
to elicit underpinning knowledge from the learner. Spontaneous questions that emerge
during a practical assessment are expected but in general, the questioning strategy and
techniques are planned.
• Clarify understanding
• Gain feedback on learning
• Create links between ideas
• Promote complex thinking skills
• Prevent ambiguity
• Avoids cluttering of ideas caused by multiple questions or unclear questions
• Allows for wider coverage of the area and prevents too great a focus on the immediate circumstances of the assessment
• Avoid phrasing questions that are closed
• Ask probing and evaluative questions that call for higher cognitive thinking such
as analysis, synthesis and evaluation
• Encourage the exploration of various possibilities
• Design questions to help students see things from a broader perspective
Present the question clearly to ensure that the learner hears it and understands what is
required of him/her.
Ask questions that are within the language and literacy range of the learner and the
requirements of the competence.
Wait Time
Due to the nature of this kind of test, the problem of extraneous variables presents itself.
The anxiety level of candidates will possibly be higher than in a written test. Outside
interference, such as noise, can pose a distraction to candidates. The examiner’s non-
verbal cues also impact on candidates’ performance, whether in a negative or positive
way.
The oral testing environment must reflect the following:
PROJECTS
What is a Project?
A project is any exercise or investigation in which the time constraints have been relaxed. Candidates are actively involved in making an item to be showcased or presented as evidence of competence.
Projects:
§ are practical
§ are more comprehensive than other assignments
§ may be tackled by an individual or a group
§ usually involve a significant part of the work being carried out without
close supervision, although the assessor may provide guidance and
support
• Build a model
• Collect, analyse and evaluate data
• Organise ideas, create visuals and make an integrated oral presentation
1. Select a task that requires the use of complex cognitive skills and important learning outcomes
2. Specify the range of content and resources that students can use in performing the task
4. Ensure that students have prior knowledge essential for the task and are familiar
with the materials they need to use
6. Clearly communicate performance expectations in terms of the criteria by which
the performance can be judged
CHECKLISTS AND RATING SCALES
Rating scales and checklists are flexible tools that may serve a variety of assessment purposes, which include:
• rating familiarity or competence for a skill
• rating small-group observable actions
• evaluating learning.
Checklists and rating scales are completed by the observer while (or after) observing the
learner.
Rating Scales

Rating scales provide lists of specific observable actions or skills and a space to give a rating for each observable action or skill. These scales may be used to rate such things as the learner's ability to use equipment. They can also be used to document observations of actions from which you make inferences about the learner's written work. Rating scales allow for recording the qualities or frequencies of the actions observed.
Observational Checklists
Checklists provide the observer with a list of observable actions or skills that can be
marked as present or absent (yes or no; observed or not observed). Observational
checklists can be used for the same assessment purposes as rating scales except that
the observable action or skill should be absolute. Either the observable
action/characteristic is there or it is not there. (For example: a trainee does or does not
prepare a material and tool listing before beginning work).
Checklists used over a period of time can be combined into frequency ratings. For
example, you might use checklists periodically to document the observable actions
characteristic of desirable workplace traits such as punctuality, cooperation, safety,
correct use of lab/workshop equipment. (For example: a trainee does or does not wear
protective gear when operating machinery).
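Combining periodic checklists into frequency ratings, as described above, can be sketched in code. This is an illustrative sketch only; the trait names and the observation data are invented for the example.

```python
# Each periodic checklist records, per trait, whether the behaviour
# was observed (True) or not (False).
observations = [
    {"punctuality": True,  "wears_protective_gear": True},
    {"punctuality": False, "wears_protective_gear": True},
    {"punctuality": True,  "wears_protective_gear": True},
    {"punctuality": True,  "wears_protective_gear": False},
]

def frequency_ratings(checklists):
    """Combine repeated yes/no checklist entries into the proportion of
    observations in which each trait was present."""
    traits = checklists[0].keys()
    return {t: sum(c[t] for c in checklists) / len(checklists)
            for t in traits}

# Punctuality was observed in 3 of the 4 checklists, giving 0.75.
```

A proportion of this kind is easier to compare across trainees and terms than the raw sequence of yes/no marks.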
Before developing a checklist or rating scale, you must decide what it is you want to focus on: skill in completing a process, attitudes, or dispositions.
Skills:
List of observable actions that show a skill you are teaching or a skill that is a
prerequisite to what you are planning to teach (e.g. measuring an angle, applying
fertilizer, following directions, sharpening a cutting tool).
Thinking:
Observable actions that show understanding of how to use the thinking or reasoning strategies characteristic of the discipline (e.g. drawing conclusions, generating hypotheses, making predictions, supporting claims with evidence, asking open-ended questions).
Conceptual:
Observable actions (or comments) that show understanding of the major concepts of the
discipline (e.g. character development, plot, theme, setting in literature; ratio, proportion,
percent, fractions in mathematics; using vocabulary in oral communication)
• The directions at the top of the checklist or rating scale tell how to complete it.
• Specific observable actions are given from which inferences can be made (e.g. offers ideas, listens to others' ideas) rather than the inferences themselves (e.g. "cooperates" or "participates").
ATTITUDINAL SCALES
Rating Scale
V. QUALITY ASSURANCE AND RECORD-KEEPING
o for promotion
o to determine equivalencies
INSTITUTING EMPLOYABILITY SKILLS
PURPOSE
One of the outcomes expected of CBE is the provision of world-class workers who have the competitive edge, in terms of their knowledge, skills and positive attitudes, in finding their place in the job market for the utilization and provision of labour. Employers expect the holders of vocational qualifications to exhibit attitudinal qualities that best fit the professional work environment. Above all, persons with the appropriate work attitudes are those who have been supported in the developmental process, especially where there are limitations in areas of skill and knowledge. A process must be defined whereby a student's work potential and characteristics/attitudes are rated, and such records are maintained as part of the training records of the institution.
POLICY
Each trainee shall be informed that he or she will be monitored during training for
scholastic and positive attitudinal competencies. A system shall be in place whereby
trainees are monitored in a transparent, fair and systematic manner throughout their
period of training.
Positive attitudes shall be encouraged and promoted. Where negative attitudes are
detected, the trainee shall be the beneficiary of counselling and continuous feedback
with a view to producing a modified behaviour at the end of training. The evidence
should be easily retrievable and manageable such that it can be represented in a
summarized format on a record-keeping or data-capturing form.
PROCEDURE
• The attitude monitoring form shall be discussed with all students, both collectively and individually. The attitudinal factors on the form shall be explained to each trainee, with opportunity for clarification.
• Each teacher/senior teacher with responsibility for a class shall assign a score on a scale of 1 to 5 to each attitudinal factor for each student. The score shall be based on a fair assessment of what appears to be the consistent characteristics of the student.
• Each teacher/coordinator is expected to complete the form at least once per quarter
and submit it to the counselling department. This should be discussed with the
student on an individual basis.
• This form should be completed one month before each training term is
completed.
• The total score should be inserted in the column provided; it represents a score out of 50 points, since there are ten (10) attitudinal factors.
• The average score arrived at from the three (3) terms' assessments should be used as the final score.
• It is suggested that the details of the last attitudinal factors be represented on the
trainee’s file.
• It is expected that there will be sufficient feedback during the first six (6) months of the trainee's tenure. Hence, during the last three (3) months of training, it is expected that there will be behavioural changes that sharpen the trainee's job-readiness skills.
Ø Team teaching may be employed in order that job-readiness infusion can be planned for by both the skill and the support-subject instructors (e.g. skill instructors planning with the language and computation instructors; instructors of the same skill coming together to plan lessons with an emphasis on job-readiness objectives).
Ø Job-readiness attitudinal factors should be represented on lesson plans; the realization of the job-readiness objectives can then inform the assessment process, with the attached form used to assign each trainee a score based on the competencies displayed.
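The scoring arithmetic in the procedure above (ten factors scored 1 to 5, a term total out of 50, and a final score averaged over the three terms) can be sketched as follows. This is an illustrative sketch only; the function names and factor scores are invented for the example.

```python
def term_total(factor_scores):
    """Total one term's attitudinal assessment: ten factors,
    each scored 1-5, giving a total out of 50 points."""
    if len(factor_scores) != 10:
        raise ValueError("expected scores for ten attitudinal factors")
    if not all(1 <= s <= 5 for s in factor_scores):
        raise ValueError("each factor score must be between 1 and 5")
    return sum(factor_scores)

def final_attitudinal_score(term_totals):
    """Average the three terms' totals to obtain the final score."""
    if len(term_totals) != 3:
        raise ValueError("expected totals for three terms")
    return sum(term_totals) / 3

# Illustrative data: one trainee's factor scores over three terms.
terms = [
    [4, 3, 5, 4, 4, 3, 5, 4, 4, 4],  # term 1 total: 40
    [4, 4, 5, 4, 4, 4, 5, 4, 4, 4],  # term 2 total: 42
    [5, 4, 5, 4, 5, 4, 5, 4, 5, 4],  # term 3 total: 45
]
totals = [term_total(t) for t in terms]
final_score = final_attitudinal_score(totals)  # (40 + 42 + 45) / 3
```

Keeping the per-term totals alongside the final average preserves the evidence trail the policy calls for.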
VI. THE PROCESS USED BY NCTVET FOR MODULAR ASSESSMENT
AND CERTIFICATION
Internal/Institution-Based Assessment
The Institution-Based Assessment (IBA) is a very important part of the certification programme. Facilitators are required to evaluate the knowledge, skills and attitudes of the learners during the training programme. Both theory and practical assessments of performance must be administered for each candidate.
Using a variety of methods, each task listed in a module must be assessed by the instructor on an on-going basis. An average of the learner's ratings at the end of each module should be recorded on the IBA Summary Sheet.
Accurate records must be kept by the facilitator/institution as these ratings will be used
as part of the certification and recorded on the Record of Achievement given to each
candidate.
The rating scale, with scores ranging from 1 to 5, where 5 is the highest and 1 the lowest, is used for both theory and practical assessment. All scores must be presented as a rating when reporting to NCTVET's Registrar. A learner's performance, which is calculated as a percentage, must be converted using the established rating scale.
RATING CONVERSION SCALE – LEVEL 1

SCORE (%)   RATING
75 – 100    5
60 – 74     4
45 – 59     3
30 – 44     2
Below 30    1
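The conversion can be sketched as a small function. This is an illustrative sketch only (the function name is ours, not NCTVET's); the cut-off points are taken directly from the Level 1 scale.

```python
def percentage_to_rating(score: float) -> int:
    """Convert a percentage score (0-100) to a Level 1 rating (1-5)
    using the rating conversion scale."""
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score >= 75:
        return 5
    if score >= 60:
        return 4
    if score >= 45:
        return 3
    if score >= 30:
        return 2
    return 1

# Example: a learner scoring 62% is reported with a rating of 4.
```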
A learner must achieve a rating of at least 3 in theory and practical in each completed
module to be eligible for certification. Therefore, if a learner is not achieving an average
rating of 3, he or she should NOT be submitted for final external assessment in that
module, but should be given further instruction/training in order to acquire mastery of the
tasks in the module. Once mastery has been achieved, the learner can be submitted for final assessment in that module.
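The eligibility rule can be expressed as a short check. The 3-or-better threshold in both theory and practical is the rule stated above; the function name is ours, invented for illustration.

```python
def eligible_for_final_assessment(theory_rating: float,
                                  practical_rating: float) -> bool:
    """A learner must average a rating of at least 3 in both theory and
    practical for a module before being submitted for final external
    assessment in that module."""
    return theory_rating >= 3 and practical_rating >= 3

# A learner averaging 3.4 in theory but 2.8 in practical is not yet
# eligible and should receive further training in the module.
```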
The Internal Verifier will be responsible for the monitoring of the internal assessment
process.
Each completed IBA summary sheet must bear the signature of the institution’s
Manager, the Internal Verifier and the External Verifier/Assessor. If these signatures are
missing, the scores will not be accepted by NCTVET.
All completed IBA summary sheets must be submitted to the Registrar's Office no later than four (4) working days after the administration of the examination. Institutions will be sent a letter acknowledging receipt of the IBA summary sheets and should make contact with the Registrar's Office to ensure that these records have been received.
Practical Assessment
Each area of assessment (skill and support, theory and practical) will be monitored by an External Verifier/Assessor. The External Verifier/Assessor will visit the institution to examine the assessment procedures used for the administration of the practical and theory components and report on the validity and fairness of the internal assessment.
The External Verifier will also examine learner assessment records to ensure that they
are correctly produced and maintained.
The External Verifier will validate practical and theory tests used for internal assessment
to determine the competency and knowledge level of the learners. Feedback will be
given to the institution by the External Verifier/Assessor.
The practical assessment will be administered internally, under the guidance of the Internal Verifier, on a continuous basis. Standardised Practical Assessment test papers, provided by NCTVET, will be used by facilitators to assess candidates for the External Practical Assessment (EPA) scores.
External Practical Assessment (EPA) forms are provided for the recording and reporting
of practical ratings. EPA reporting forms must be completed by the facilitators and
submitted to the Registrar four (4) working days after the grades have been validated by
the External Verifier/Assessor.
The examinations are divided into two sets of multiple-choice papers:

Paper I
Paper I will test the major skill area and General Technical Studies (10–20 items per module).
Paper II
Paper II will test the support subjects.
Section A: Calculations and Computations (10–15 multiple-choice items)
Section B: Language and Communication (10–20 multiple-choice items)
CERTIFICATION
Level One (1) certification implies that the holder of such a certificate is:
Equipped with basic trade skills, knowledge and attitude
Expected to be given routine task assignments and be closely supervised
The Council reserves the right to withhold/cancel its certificate if it is proven at any time
that there was any irregularity during the administration of the examinations.
The Certificate shall bear the signatures of the Registrar and the Chairman of the Council.
APPENDICES
Sample Objective Questions
A. elasticity
B. flexibility
C. length
D. absorbency
A. Outside micrometer
B. Inside micrometer
C. Steel rule
D. Dial indicator
A. Carburisation
B. Oxidation
C. Combustion
D. Polarisation
A. square stock
B. rip stock
C. cross-cut stock
D. joint stock
5. Which of the following sizes of binding wire is MOST suitable for a steelfixing job?
A. #12
B. #14
C. #16
D. #18
Sample Practical Instrument
Criteria 1 2 3 4 5
Rating Scale
5. Can perform the task with initiative and adaptability to problem situations.
4. Can perform the task satisfactorily without assistance and/or supervision.
3. Can perform the task but requires periodic assistance and/or supervision.
2. Can perform limited parts of the tasks satisfactorily, requires considerable
assistance.
1. Has not demonstrated sufficient evidence on which judgment can be made.
Practical Assessment
METALWORK ENGINEERING
Materials Listing
Sample Rating Scale
Using the scale below, rate the candidate's performance by writing the number that best describes the candidate's competence in the space provided to the right of each question.

5 – Candidate displays extremely good ability to listen, interpret, evaluate, and communicate ideas and knowledge as they relate to the topic/skill area; sustains conversation very well.
Sample Practical Instrument
TASK:
Criteria 1 2 3 4 5
Module Name
1.
2.
3.
4.
5.
6.
7.
8.
9.
10.
11.
12.
TOTAL
Rating Scale
5. Can perform the task with initiative and adaptability to problem situations.
4. Can perform the task satisfactorily without assistance and/or supervision.
3. Can perform the task but requires periodic assistance and/or supervision.
2. Can perform limited parts of the tasks satisfactorily, requires considerable
assistance.
1. Has not demonstrated sufficient evidence on which judgment can be made.
Example of a Certification Plan from NCTVET Modular Examination
Environmental Studies
Hours 20
General Knowledge
Hours 10
SAMPLE PAGE FROM PERFORMANCE LOGBOOK
SAMPLE PRACTICAL TEST
Using the saved information on your diskettes from the previous class:
2. Edit text/data:
3. Format text/document
Total 50 Marks
SAMPLE RECORD KEEPING FORM
M#……. Interview
Written Test
Oral Test
Project
Case Study
Attitudinal Scale
Portfolio
M#……. Interview
Written Test
Oral Test
Project
Case Study
Attitudinal Scale
Portfolio
M#……. Interview
Written Test
Oral Test
Project
Case Study
Attitudinal Scale
Portfolio
Verifier’s Comment:
Recommendation:
Verifier’s Signature:
SAMPLE RECORD KEEPING FORM
PROGRESS FORM
Assessor’s Comments:
Assessor: Signature:
STUDENT EMPLOYABILITY SKILLS INVENTORY FORM

EMPLOYABILITY SKILLS    COMPETENCIES    Demonstrated / Not Demonstrated
Interpersonal
Relationships
Accepts constructive criticism
Works as a team member
Displays a friendly and cooperative spirit
Accepts assignments pleasantly
Demonstrates tactfulness in difficult situations
Becomes aware of and accepting of cultural differences
Respects the rights and property of others
Displays leadership qualities
Identifies varying management styles
Understands self and accepts the value systems of others
Health and Safety Maintains a good work pace and production rate
Habits
Practises good personal hygiene
Dresses in a well-groomed, appropriate manner
Recognizes stress-related situations and deals with them effectively
Develops physical stamina and tolerance for the kind of work done
Maintains good personal health