
Accreditation and Quality Assurance

Surendra Prasad
Chairman, NBA

Some Questions

Do you (really) need accreditation?
It is now almost universal!
A tool for quality assurance of engineering education.

Q. Who Needs the Assurance?

All Stakeholders:
Prospective Students
Their Parents
Faculty
Management
Government

What is Quality and How Do We Measure It?
Quality may be taken for granted, when there is
a brand name: IITs, IISc.
Reputation earned over the years becomes an
assurance of some kind.
This is rather subjective.
Accreditation: a system which attempts to assess quality a little more
objectively, for assurance.

What Are the Possible Benefits?

Let us see.

Students
Four years of valuable time, starting at age 18.
Will I become a good engineering professional?
Will I survive the competition?
Admission to an accredited institution is a safe bet!

Parents

Will our son/daughter have a good life?
Is the investment in the education worthwhile?

Faculty

Working at an accredited institution leads to a better career!
So, better faculty induction!

Management

Institution has better recognition.
Greater contribution to society.

Government
A better-educated engineering community leads to better economic
growth.
What can be more important to the
government?

Accreditation for Quality Assurance


Q. Is the accreditation process universal?
Ans. Very unlikely.
Every system of education has its own angularities.
The Washington Accord makes many systems jointly acceptable.
India is now a permanent signatory.

Accreditation
What accreditation is NOT:
1. Not a ranking procedure (Gold, Silver, 1, 2, 3, ...).
2. Not an investigation (there has been no complaint).
3. (Should not be) a regulatory process.
Accreditation is not a ranking system: neither NBA nor ABET ranks
programs in any way.

What is it then?
It is a process by which the institution being looked at is given a
SEAL of approval by the stakeholders in its activity, as meeting their
expectations.
(Stakeholders: students, government, industry, faculty, ...)

Beyond Certification
However, its aim is not mere certification.
It is a process which encourages a continuous look at how you teach,
what its impact is, and how to achieve the set objectives better.
It should lead to a Continuous Improvement System.

Relevance to the Indian Scenario

The Indian engineering education scene is large, multilayered, and
complex.
Common criticism: sub-standard education delivery.
Quality is the new mantra, if we wish to do justice to the aspirations
of the young people.

NBA
NBA hopes to create a movement towards quality through accreditation
with credibility.
Accreditation is only a tool.
Real goal: improved teaching and learning at our engineering
institutions.
This is the real benefit of the Washington Accord!

Assessment & Evaluation + Closing the Loop:
Continuous Improvement of Program Outcomes

Definitions
Program Educational Objectives: Broad statements that
describe what graduates are expected to attain within a
few years after graduation.
Program Outcomes / Student Outcomes / Graduate Attributes: What
students are expected to know and be able to do by the time of
graduation.
These relate to the knowledge, skills, attitudes and behaviours that
students acquire as they progress through the program.

Definitions
Outputs AND Outcomes
Outputs: Graduation rates, Employment Statistics etc.
Outcomes: Student Learning, Skills, Behavior.

Focus on Program Outcomes


Criterion 9 is concerned with continuous improvement in Student (or
Program) Outcomes.
Note: Assessment of PEOs is not easy.
However, PEOs are important and relevant to the vision and mission of
the program and the institution. They are not discussed further here.
Program Outcomes are central to short-term and long-term improvements
in curriculum and pedagogy.

Curriculum
Curriculum Design exercise in many Indian institutions:
Examine the practices in some good institutions (IITs),
and adopt suitably.
Workable, but not necessarily the best approach for your
students and your objectives.

Backward Design Approach


A Desirable Order of Goals
Improved student learning: four dimensions

Knowledge
Skills
Attitudes
Behaviour

Improved learning environment.
Faculty efficiency and efficacy.
Accountability to stakeholders.
Accreditation.

Goals as Seen by Many Institutions

Goals: the same, but in a different order.
Accreditation.
Faculty Efficiency and Efficacy.
Improved student learning:

Knowledge
Skills
Attitudes
Behaviour

Improved learning environment.
Accountability to stakeholders.

Faculty Engagement in Continuous Improvement
Successful achievement of Goals requires faculty
participation and engagement.
Is their focus right?
Is the Question:
What do we have to do to get NBA accreditation?
OR
How can we effect improved student learning?

Definition
Assessment:
One or more processes that identify, collect, and
prepare data to evaluate attainment of student
outcomes.

Transformational Assessment
Goals of Assessment:
Information-based decision-making.
Transformational Assessment:
When assessment is used to help enhance student learning:
Change Pedagogy
Modify courses or facilities
Redesign curriculum

Information basis
Smart data collection is the key:
Collect data that will be useful.
Sufficient to provide a reasonable basis for action.
Data → Information.
Sufficient granularity.

Criterion 9: Continuous Improvement


The statement of the NBA criterion is not so crisp, although the
essence is well captured when Criteria 2 and 9 are seen together.
It is more meaningful to look at the ABET statement:
The program must regularly use appropriate,
documented processes for assessing and evaluating the
extent to which the student outcomes are being attained.
The results of these evaluations must be systematically
utilized as input for the continuous improvement of the
program. Other available information may also be used
to assist in the continuous improvement of the program.

Closing the Loop


This concept, although the essence of quality assurance,
is also seen to be the weakest link in most institutions
seeking accreditation.
Most institutions fail to spend enough effort in closing the
loop.
Coming from an IIT, I would say, this is true even for the
IITs to a large extent.

Why This Weakness?

After all, we all swear by continuous improvement! Then why this
weakness? Several reasons:
Lack of Leadership/Ownership. Sometimes inadequate
appreciation of this aspect.
Non-Compliant Faculty: Too Busy.
Ineffective Tools.
Discomfort with Assessment.
Inconclusive Results.

Other Reasons
Trying to do too much: unsustainable.
Inadequate resources.
Many more.

Where Do We Start? (If We Wish To)

Management Plan → Processes → Results:
Processes: collecting/consolidating data; analyzing and reporting;
evaluation.
Results: Data → Information → Knowledge → Decisions →
Curriculum/Pedagogy Improvements.
Improvement actions: curriculum changes, pedagogy, others.

Mapping of Program Outcomes with Courses

Outcome \ Course          EE 101  HU 203  EE 306  EE 347    EE 450
                                                  (Design   (Capstone)
                                                  Project)
Engineering Knowledge     I,P
Problem Analysis                          P,E
Design/Development                                P,E       P,E
Investigation of
Complex Problems
Engineer & Society
Communication                     P,E             I         I

Benefits of Mapping
Shows what desired outcomes are already taught, and at
what level.
Reveals gaps (Outcome xx).
Potential for cross-course collaboration (Faculty buy-in).
Makes course- and program-level outcomes clearer to students.
Shows where interventions may be required.
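
As a sketch of how such a mapping might be held and checked in
practice, here is a minimal Python example. The course codes come from
the sample table above; reading "I", "P", "E" as Introduced, Practiced,
Evaluated is an assumption, and this is not an NBA-prescribed format.

```python
# Hypothetical outcome-to-course mapping, following the sample table.
mapping = {
    "Engineering Knowledge": {"EE 101": "I,P"},
    "Problem Analysis":      {"EE 306": "P,E"},
    "Design/Development":    {"EE 347": "P,E", "EE 450": "P,E"},
    "Communication":         {"HU 203": "P,E", "EE 347": "I", "EE 450": "I"},
    "Investigation of Complex Problems": {},   # empty row: a gap
    "Engineer & Society":    {},               # empty row: a gap
}

# Reveal gaps: outcomes that no course currently addresses.
for outcome, courses in mapping.items():
    if not courses:
        print(f"Gap: no course currently addresses '{outcome}'")
```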

Collecting Evidence
Evidence is the key to making decisions about
improvements and closing the loop.
Direct Evidence usually primary for program outcomes.
Indirect Evidence: Acceptable and valuable as
secondary evidence.
Should not be all you use.

Both qualitative and quantitative forms of (direct and indirect)
evidence are useful.

Sampling (Needed for Larger Programs)

Using part of the whole to represent all students or all student work.
Sampling acceptable and even desirable for assessment
of larger programs.
Samples must be representative of student population.
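
A minimal Python sketch of the sampling rule quoted later in this deck
(a minimum 20% sample for a population of 30 or above; otherwise the
whole class). The roll numbers are hypothetical, and checking that the
sample is representative is left out.

```python
import random

def choose_sample(students, fraction=0.20, threshold=30, seed=None):
    """Apply the 20%-of-30-or-more rule; otherwise assess everyone."""
    if len(students) < threshold:
        return list(students)                  # whole class
    k = max(1, round(len(students) * fraction))
    return random.Random(seed).sample(students, k)

roll = [f"S{i:03d}" for i in range(1, 61)]     # hypothetical 60 students
print(len(choose_sample(roll, seed=42)))       # -> 12 (20% of 60)
```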

Assessment is the Key

Program Outcomes are broadly defined by NBA.
It is up to institutions to assess them as per their defined
objectives: they retain the flexibility to assess them in any way that
is appropriate to measure the attainment of desired outcomes.
Many innovations possible here.
Main Players: Faculty.

The Assessment Challenge

Faculty are subject experts.
They may not be assessment experts.
They can and do bring their own intuition and insight into what
constitutes acceptable performance in their classes.
Challenge: to involve them more deeply in the assessment conversation,
to measure the effectiveness of what they do individually or
collectively. The additional effort should seem worthwhile to faculty.

Definitions
Evaluation: One or more processes for interpreting the
data and evidence accumulated through assessment
processes.
Evaluation determines the extent to which student
outcomes are being attained. Evaluation leads to
decisions and actions regarding program improvement.
Performance Criteria: Specific, measurable statements identifying the
performance required to meet the outcome, confirmable through
evidence.

Definitions
Rubric: a method for explicitly stating the expectations
for student performance
Generally has three components:
Performance Indicators (Dimensions of Learning)
Level of Performance (Scales)
Descriptors
Level of Performance: Discrete vs. Continuous

Rubrics
A method for explicitly stating the expectations for
student performance.
Describe characteristics for each level of performance.
Provide clear information about how well the students performed.

Sample Assessment Rubric for an NBA Outcome
Example Outcome: Ability to Apply knowledge of
mathematics, science and engineering.
Possible Rubrics:
Standard: The student shows clear evidence of being able to
apply knowledge, techniques, skills and modern tools learnt in
the discipline and use knowledge of these disciplines to
identify, analyze and solve technical problems.
Exceeding the Standard: The student shows superior
evidence of being able to apply knowledge, techniques,
skills..

Sample Assessment Rubric for an NBA Outcome (contd.)
Approaching the Standard: The student shows marginal
evidence of being able to apply knowledge, techniques,
skills...
Not Meeting the Standard: The student shows no
evidence of being able to apply knowledge, techniques,
skills

Assessment Types
The Outcome Assessment Process may use multiple methods of assessment
(allowing for triangulation): direct and indirect, qualitative and
quantitative.
Direct: Provide direct examination or observation of
student performance.

Student assignments.
Exams, Quizzes (Course Embedded assessment)
Student Portfolios.
Locally developed exams.
National exams (University Exams, GATE etc.)

Indirect Assessments
Ascertain opinion or self-report: extent of attainment of
outcomes.
Terminal (at graduation)
Surveys (Employer Surveys?)
Focus Groups (Alumni).
Advisory Board Recommendations.
Exit Interviews.

Course-Embedded Assessment (Direct Assessment)

One direct form of assessment: through the course itself.
Easy for engineering, computer science, applied science and technology
faculty to do.
Robust.
Students' motivation (or lack of it) for assessment is not a factor.
Requires careful design of assignments, tests, quizzes etc., to
ascertain different performance criteria.

Assessment Process
Must include thresholds (or targets) for acceptable
performance.
Must be measurable.
Exams and grading ≠ assessment.
However, both may be done together via suitable embedding strategies.

Setting Targets or Thresholds


Critical part of closing the loop.
Not advisable to set targets once and for all; be very flexible here.
No hard and fast rules: pilot, then alter based on experience.
Be realistic about the program's context in setting targets.
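
A minimal Python sketch of such a target check, assuming rubric scores
on a 0-10 scale and the illustrative target from the next slide (70%
of students at or above 7/10); the scores shown are invented.

```python
def attainment_met(scores, standard=7.0, target_fraction=0.70):
    """Return (fraction meeting the standard, whether the target was met)."""
    meeting = sum(1 for s in scores if s >= standard)
    fraction = meeting / len(scores)
    return fraction, fraction >= target_fraction

scores = [8, 6, 7, 9, 5, 7, 8, 7, 6, 9]       # illustrative rubric scores
fraction, met = attainment_met(scores)
print(f"{fraction:.0%} met the standard; target met: {met}")
```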

Sample Outcomes Assessment Processes

Assessment Method         Description             Target                       Frequency
Direct assessment of      Specific course         70% must meet expectations   Yearly (see
course material           material used to        (standard): average rubric   schedule)
                          attain POs              score 7/10
Exit survey of final-     Specific questions      75%: 7.5/10                  Yearly
year students             mapped to POs
End-of-year survey        Supplemental questions  80% agree or strongly        Yearly
(student feedback?)       mapped to POs           agree
Co-opt employer survey    Specific questions      70% agree or strongly        Bi-annually
                          mapped to POs           agree

Triangulation

Ultimate target: assess the truth about the attainment of the various
desired outcomes through multiple assessments:
Student work, surveys and focus groups all pointing at the truth.

Triangulation: More Realistic

Student work, focus groups and surveys each give a partial view of the
truth.
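
One hedged way to read triangulation computationally: combine the
attainment estimates from the different methods, for example as a
weighted average. The figures and weights below are purely
illustrative, not an NBA formula.

```python
# Attainment fraction per assessment method (invented numbers).
estimates = {
    "student work": 0.74,
    "exit survey":  0.81,
    "focus group":  0.68,
}
# Illustrative weights reflecting trust in each method.
weights = {"student work": 0.5, "exit survey": 0.3, "focus group": 0.2}

combined = sum(estimates[m] * weights[m] for m in estimates)
print(f"Triangulated attainment estimate: {combined:.0%}")
```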

Closing the Loop


When targets are not met and improvement is needed:
Action is required to analyse and improve the curriculum (and
pedagogy?).
Mapping against outcomes may help identify the changes needed.

Program-Level Outcome Assessed at Course Level

Course: CE xxx
Ethics Matrix: Direct Assessment

Course outcome      % exceeding  % meeting  % approaching  % not meeting  Comment
objective           standard     standard   standard       standard
(by semester)       (>80)        (70-80)    (60-70)        (<60)
Semester 2, 2014    20           40         20             20             Test #2
Semester 1, 2013    30           40         14             16             Term paper
Semester 2, 2013    24           36         26             14             Term paper, Test #3
Semester 1, 2012    26           42         22             10
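
A minimal Python sketch of how the four bands in the matrix above
(>80 exceeding, 70-80 meeting, 60-70 approaching, <60 not meeting)
might be tallied from per-student marks. The marks are invented; the
band edges follow the table.

```python
from collections import Counter

def band(mark):
    """Classify a mark into the four rubric bands from the matrix."""
    if mark > 80:
        return "exceeding"
    if mark >= 70:
        return "meeting"
    if mark >= 60:
        return "approaching"
    return "not meeting"

marks = [85, 72, 64, 55, 91, 78, 66, 59, 74, 88]   # illustrative
counts = Counter(band(m) for m in marks)
for level in ("exceeding", "meeting", "approaching", "not meeting"):
    print(f"{level:>12}: {100 * counts[level] / len(marks):.0f}%")
```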

How Often Do We Measure?

Every semester? Every year?
Regularly?
In the NBA year?
Closing the loop: a regularly scheduled process. Not every semester or
every year for all outcomes: lighter on faculty.

Student Outcomes: Assessment Schedule

Outcome                                                        Year
Understanding of professional and ethical responsibilities     14-15
Understanding the relevance of science, mathematics and
engineering to contemporary issues                             15-16
Ability to communicate effectively, orally, in written and
visual forms                                                   16-17
Ability to work in teams                                       17-18
Understanding of diverse cultural traditions in the country    18-19

Summary
Establish Program and Course Outcomes
Select where you want to assess (and in which course(s)).
Select components of outcome.
Select performance level.
Select criteria of success.
Use information to Close the loop.

Questions for Closing the Loop


What do the findings tell us?
Which improvements are critical for greater effectiveness:
In curriculum, pedagogy, infrastructure, delivery systems?

Did the changes made improve learning?


How good is our assessment process? How can it be
improved further?
Express learning as a year-wise chart or Table, for a better
appreciation.

Need for Change of Attitudes

From                           To
Grading                        Assessment
Scoring right/wrong answers    Considering the whole reasoning process
Comparison of students         Comparison to measurable outcomes
Secretive, exclusive           Public, open, participative
Add-on                         Embedded
Challenging faculty            Helping students

Thanks for Your Attention

Program Assessment Report: Cover Sheet
Name of the Program : B.Tech (xxx Engg).
Report Submitted by: Faculty name

Date:

Intended Outcome and Associated College Goals
Outcome No. 1: Students will be able to apply current
knowledge, techniques and modern tools learned in the
discipline and by adapting emerging applications of
mathematics, science and engineering to identify,
analyze and solve technical problems.
(NBA Criterion: yy)

Program Assessment Report: Cover Sheet (contd.)
College Goals Supported: Goals Nos. aa, bb, etc.
aa: To foster teaching and learning in a supportive
environment.
bb: To involve students in solving problems of interest to
industry, Government and Community organizations.
Outcome No. 2: Students will be able to conduct, analyze and
interpret experiments and apply experimental results to
improve processes related to the field.
Indicate connection with College Level goals (or PEOs) etc.

Program-Level Assessment Report: Outcome-wise
Outcome No. 1: Students will be able to apply current
knowledge, techniques and modern tools learned in the
discipline and by adapting emerging applications of
mathematics, science and engineering to identify,
analyze and solve technical problems.
1. Means of Assessment for Outcome No.1:
Course-level assessment of a list of courses relevant to Outcome
No. 1. Give list: CE xxx, CE yyy, CE zzz, etc.
These assessments are compiled and evaluated by a group of assessors,
to analyse the attainment of Outcome No. 1.

Program-Level Assessment Report: Outcome-wise (contd.)
Assessment Rubrics: As discussed before in terms of
students exceeding the standard, meeting the standard,
approaching the standard and not meeting the standard.
Criteria for success: Say: 70% of the students meet or
exceed the standard.
2. Description of Population to be Sampled:
All students in these courses required to participate.

Program-Level Assessment Report: Outcome-wise (contd.)
3. Methods Used to Choose the Sample (Minimum sample size
20 % for a population of 30 or above; otherwise whole class).
No sampling used in this case.
4. Summary of Major Findings:
72% students exceeded or met the standard: Outcome
successfully achieved.
5. Action needed to address assessment results: No action
required in this case.

Performance Indicators
Key to measuring the outcomes.
Articulate key characteristics of an outcome.

Performance Indicators
Two essential characteristics:
1. Should be based on the focus of instruction.
(Steps of problem solving, ethical analysis, organising
presentations, etc.).

2. Should contain appropriate action verbs
(with a clear message for both faculty and student: measurement of a
specific outcome for the faculty, and a specific performance for the
student).

Performance Indicators: Example 1
Outcome: Students will demonstrate an ability to identify,
formulate and solve problems.
Students are able to:

Identify key issues and variables.
Recognize the need for multiple solutions.
Analyse alternative solutions to a problem.
Justify a solution to the problem.

Performance Indicators: Example 2
Students will have the ability to perform experiments and
interpret data.
Students are able to:

Observe good safety practices.
Design an experimental plan.
Acquire data on appropriate variables.
Compare experimental data and results to appropriate
theoretical models.
Explain observed differences between model and experiment.

How Do We Assess These Traits?

Students will:
Write experiment pre-reports which include the experiment plan.
Perform the experiment.
Submit data sheets from the lab experiment.
Write a final lab report that includes the analysis and comparison.

Faculty will:
Assess the quality of the experimental plan.
Observe safety practices.
Check if proper data for the needed variables are measured.
Check whether comparisons have been made to theory with understanding.


Analysis and Decision Making


Rubric Data: Useful information for making decisions
about improvements.
For the outcome on Experimental Ability, a collective
look at samples of data for all courses requiring
experimentation would be useful.
Will bring out areas of strength and weakness for
suitable actions.
Actions: should be collective, to make an impact.
Check for Results on Actions in succeeding periods.
No NBA penalty if some actions or strategies do not
work!

What Do We Measure?
NBA Criteria do not tell us how to measure outcomes.
What should we look for in student performance to
determine the level to which an outcome is being
attained?
We measure performance indicators.
The degree of attainment is based on students' level of performance on
these indicators.

Performance Indicators: Skills or Behaviours That Can Be Observed

An outcome is decomposed into several observable performance
indicators (Performance Indicator 1, 2, 3, 4, ...).

Performance Indicators

Program Educational Objective: Graduates will work successfully as a
member of a team to solve real-world problems.
Student Outcomes: Students will be able to demonstrate:
Ability to identify and solve real-world complex problems.
Ability to conduct experiments and analyze data.

Performance Indicators (contd.)

Program Educational Objective: Graduates will work successfully as a
member of a team to solve real-world problems.
Student Outcome: Ability to conduct experiments and analyze data.
Performance Indicators:
Observes safe lab practices.
Prepares a plan for the experiment.
Gathers data on suitable variables.
Compares results to a model or a theory.
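
As a sketch, per-indicator rubric scores could be rolled up into an
outcome-level figure along these lines. The indicator names follow the
slide above, while the scores and the 7/10 standard are assumptions.

```python
# Average rubric score per indicator, 0-10 scale (invented numbers).
indicator_scores = {
    "observes safe lab practices": 8.2,
    "prepares an experiment plan": 6.9,
    "gathers data on variables":   7.4,
    "compares results to theory":  5.8,
}

# Simple unweighted roll-up into one outcome-level average.
outcome_score = sum(indicator_scores.values()) / len(indicator_scores)
print(f"Outcome-level average: {outcome_score:.1f}/10")
for name, score in indicator_scores.items():
    if score < 7.0:          # assumed standard; flag weak indicators
        print(f"Needs attention: {name} ({score})")
```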

Developing Performance Indicators

Articulate the key characteristics of an outcome.
Faculty need to learn to "know it when they see it": experience
counts.
Aim for 3-5 such characteristics per outcome.
They should be observable and measurable, or demonstrated in student
work.
Grading ≠ assessment; but you can do both together.

When Do We Measure?
For most outcomes we need multiple measurement points (embedded,
perhaps, in multiple courses relevant to an outcome).
At what level? (Lower-level courses? Upper-level courses?)
Terminal courses (Statics → Strength of Materials → Structures I →
Structures II, etc.).
Capstone course or project.
