Designing and Teaching Courses to Satisfy the ABET Engineering Criteria

RICHARD M. FELDER

Department of Chemical Engineering, North Carolina State University

REBECCA BRENT

College of Engineering, North Carolina State University

ABSTRACT

Since the new ABET accreditation system was first introduced to American engineering education in the middle 1990s as Engineering Criteria 2000, most discussion in the literature has focused on how to assess Outcomes 3a–3k and relatively little has concerned how to equip students with the skills and attitudes specified in those outcomes. This paper seeks to fill this gap. Its goals are to (1) overview the accreditation process and clarify the confusing array of terms associated with it (objectives, outcomes, outcome indicators, etc.); (2) provide guidance on the formulation of course learning objectives and assessment methods that address Outcomes 3a–3k; (3) identify and describe instructional techniques that should effectively prepare students to achieve those outcomes by the time they graduate; and (4) propose a strategy for integrating program-level and course-level activities when designing an instructional program to meet the requirements of the ABET engineering criteria.

I. INTRODUCTION

The accreditation criteria used to evaluate all American engineering programs since the beginning of 2001 have been discussed extensively since they were first introduced in 1996. The intense nationwide curricular revamping that they have catalyzed could lead to dramatic changes in engineering education; however, the potential of the new system to improve instruction depends strongly on how well engineering faculty understand it and appreciate the extent to which their full involvement in it is crucial.

Under the old system, the burden of preparing for an ABET visit resided almost exclusively with the accreditation coordinator, who did most of the work in putting together the self-study report and preparing a display for the visitor. Not any more! In the words of Jack Lohmann [47], “Preparing for an ABET visit is no longer the academic equivalent of El Niño—something to be weathered every six years until things go back to normal.” Since the work of equipping students with the attributes specified in program outcomes must be done at the individual course level, all faculty members involved in teaching required courses must now understand and be involved in the accreditation process on a continuing basis, not just in the months preceding each visit.

Understanding the engineering criteria is no trivial goal, however; the jargon they contain (objectives, outcomes, outcome indicators, performance targets, etc.) is dense and confusing, and universally agreed-upon operational definitions of the terms do not yet exist. Moreover, while much has been written in the past few years about the assessment of program outcomes (more specifically, of Outcomes 3a–3k), relatively little attention has been paid so far to the central role of the individual faculty member in attaining those outcomes. The primary purpose of this paper is to examine that role.

Many of the programmatic requirements of the new system are similar to those of the old one and are laid out reasonably well in the documentation on the ABET Web site [1], with most of the departures from prior practice occurring in Criteria 2 (program objectives) and 3 (program outcomes and continuous program improvement). Our focus in the paper will therefore be on those two criteria. In Section II, we overview the engineering criteria, attempt to clarify the terms that regularly appear in the literature related to accreditation, and briefly review the procedure for formulating program educational objectives set forth in Criterion 2. In Sections III–VI we assume that a program has formulated its objectives and compatible outcomes that encompass Outcomes a–k of Criterion 3, and address the following questions:

  1. How can learning objectives, assessment methods, and instructional techniques for individual courses be formulated to address each of the Criterion 3 outcomes? Such a formulation is a necessary condition for addressing the program outcomes.
  2. What steps might be taken at the program and individual course levels to raise the level of achievement of the outcomes? Taking such steps would address the requirement for continuous program improvement mandated by Criterion 3.

The planning, teaching, and assessment methods we will present have all been used extensively and are well supported by educational research. The paper briefly surveys the methods and cites sources of information about them and the research that supports them; the focus of the paper is the linkage between the methods and the Criterion 3 outcomes.

II. ELEMENTS OF THE ENGINEERING CRITERIA

A. Overview and Terminology

To comply with the ABET engineering criteria, a program must first formulate program educational objectives (broad goals) that address institutional and program mission statements and are responsive to the expressed interests of various groups of program stakeholders. The program must then formulate a set of program outcomes (knowledge, skills, and attitudes the program graduates should have) that directly address the educational objectives and encompass certain specified outcomes (Outcomes 3a–3k, shown in Table 1). In some required courses in the program curriculum, outcome-related course learning objectives (statements of things students who complete the course should be able to do—explain, calculate, derive, design,…) must be written. The program educational objectives and outcomes must be set forth in a self-study report, which must also include statements of where the outcomes are addressed in the program curriculum, how their level of attainment is to be assessed, and how the assessment results will be used to improve the program. Beginning with the second accreditation visit, the program will also presumably have to demonstrate that it has implemented the improvement plan formulated in the prior visit.

When first confronted with the new accreditation criteria, faculty members have an understandable inclination to formulate their program objectives and outcomes to fit their existing curricula. This approach is invariably frustrating and possibly self-defeating. Many existing curricula have never before been scrutinized in the light of desired learning outcomes and are consequently little more than collections of content-driven courses that have only the loosest of connections to one another. This disjointedness is reflected in the blank stares of incomprehension familiar to all engineering faculty members who have ever asked their students about material from a prerequisite or co-requisite course. Tailoring the accreditation process to perpetuate the status quo will clearly not improve this situation.

The engineering criteria constitute an antidote to curricular chaos. The exercise of constructing a clear program mission, broad goals that address the mission (program educational objectives), and desired attributes of the program graduates (program outcomes) requires the faculty to consider seriously—possibly for the first time—what their program is and what they would like it to be.

The product of this exercise constitutes a unifying framework for course and curriculum development. If faculty members then structure their course syllabi, learning objectives, and teaching and assessment methods to address the program outcomes, the result is a coherent curriculum in which all courses have well-defined and interconnected roles in achieving the program mission. The course learning objectives—explicit statements of what students in a course should be able to do to demonstrate their mastery of course material—are crucial to the process; among other things, they enable the program to demonstrate precisely how specific program outcomes are addressed in the curriculum. If the outcomes are then assessed continuously and the results are used to improve instruction in the courses that address them, the degree to which the program meets its self-selected goals must inevitably improve.

When a program approaches accreditation in this logical manner, the burden of preparing the self-study may actually be less than what it was under the old system. Lohmann [47] reports that the self-study for the B.S. program in Mechanical Engineering at Georgia Tech occupied 576 pages in 1990 under the old system and only 180 pages in 1997 under the new one, of which 130 pages comprised the required faculty resumes and course outlines.

Creating a course to achieve specified outcomes requires effort in three domains (see Figure 1): planning (identifying course content and defining measurable learning objectives for it); instruction (selecting and implementing the methods that will be used to deliver the specified content and facilitate student achievement of the objectives); and assessment and evaluation (selecting and implementing the methods that will be used to determine whether and how well the objectives have been achieved and interpreting the results). As Figure 1 shows, the three stages are not purely

Table 1. Criterion 3 outcomes and related references.

educational objectives and documenting steps taken to address Criterion 2 are offered by Carter et al. [17] and McGourty et al. [56].

C. Criterion 3 and Outcomes 3a–3k

Criterion 3 requires programs seeking accreditation to formulate (1) a set of program outcomes that specify the knowledge, skills, and attitudes program graduates should have if the program educational objectives are achieved, (2) an assessment process for the program outcomes, (3) results from the implementation of the assessment process, and (4) “evidence that the results are applied to the further development and improvement of the program [1].” Designing the assessment process involves defining outcome indicators—instruments or methods to be used to assess level of attainment of the outcomes—and performance targets—target criteria for the outcome indicators. Appendix A provides examples of program outcomes, outcome indicators, and performance targets. As noted previously, the program outcomes must encompass the eleven outcomes (a–k) specified in Criterion 3 and listed in Table 1, but would normally go beyond them to address the complete set of program educational objectives.

An important corollary of the fourth stated Criterion 3 requirement (evidence that the assessment results are being used to improve the program) is that programs do not have to meet all of their outcome performance targets to be accredited, at least at the first accreditation visit under the new system. They must only demonstrate that they have in place a sound plan for outcomes assessment and continuous program improvement and are making a serious effort to implement it. When ABET returns to re-evaluate the accredited program, however, the program will presumably have to show that it has made substantial progress with the implementation.

At this point we are ready to proceed with the main topic of this paper—how engineering courses might be designed, taught, and assessed to equip students with the skills specified in Outcomes 3a–3k. Outcomes assessment has been discussed at great length in the literature and so we will spend relatively little time on it, concentrating most of our attention on the less thoroughly examined topics of course planning (specifically, formulation of learning objectives) and instruction.
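The outcome indicators and performance targets just defined reduce to concrete numerical checks. The following minimal Python sketch illustrates two target patterns of the kind shown in Appendix A; the indicator, the scores, and the thresholds are hypothetical examples, not values prescribed by ABET or by this paper.

```python
# Minimal sketch: checking two hypothetical performance targets for one
# outcome indicator (scores of program graduates on a standardized test).
from statistics import mean

fe_exam_scores = [82, 71, 90, 77, 68, 85]  # hypothetical scores, 0-100

def fraction_meeting(scores, threshold):
    """Fraction of graduates whose score is at least `threshold`."""
    return sum(s >= threshold for s in scores) / len(scores)

# Target pattern 1: "the average score must be at least 75/100"
print("Average >= 75:", mean(fe_exam_scores) >= 75)                       # True

# Target pattern 2: "at least 80% of graduates must score 70 or higher"
print(">= 80% score 70+:", fraction_meeting(fe_exam_scores, 70) >= 0.8)   # True
```

Entries that fail such checks are exactly the unmet targets that the continuous-improvement requirement of Criterion 3 asks programs to act on.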

III. DESIGNING COURSES TO SATISFY CRITERION 3

A. A Matrix-Based Structure for Course and Program Assessment

Suppose that educational objectives have been formulated for an engineering program following the specifications of Criterion 2, and program outcomes that encompass Outcomes 3a–3k have in turn been formulated to address the educational objectives. The next step is to identify the program core—a set of courses in the program curriculum designated to address the knowledge, skills, and attitudes specified in the outcomes. Required courses under the control of the program (for example, chemical engineering courses taken by all chemical engineering majors) are obvious candidates for the core. Required courses given by other programs (e.g., mathematics and humanities courses) may also be included, provided that they address program outcomes in a consistent manner. Elective courses and courses whose content varies substantially from one offering to another should not be included in the core.

For each core course, a set of one or more outcome-related course learning objectives should be defined. A course learning objective is a statement of an observable student action that serves as evidence of knowledge, skills, and/or attitudes acquired in a course [28]. The statement must include an observable action verb (explain, calculate, derive, design, critique,…) to qualify as a learning objective; statements of non-observable actions such as learn, know, understand, and appreciate might qualify as outcomes but not learning objectives. Understanding, for instance, cannot be directly observed; the student must do something observable to demonstrate his or her understanding. For examples of acceptable learning objectives, see Appendix A.

Outcome-related learning objectives are the learning objectives for a core course that specifically address one or more program outcomes and are guaranteed to be in place in all offerings of the course, regardless of who happens to be teaching. Additional learning objectives might be (and normally would be) defined by individual instructors to reflect specific program requirements and their own personal goals, but the outcome-related objectives should be invariant. The program can then reasonably claim that if one or more of these course learning objectives address a particular program outcome, the course addresses the outcome, which is precisely the sort of information that ABET evaluators look for in the self-study. If the course is taught outside the program, having the presenting department sign off on the outcome-related objectives can strengthen the claim.

To keep track of how and where program outcomes are addressed in the curriculum, a course assessment matrix might be constructed for each core course, with a column for each program outcome and a row for each outcome-related learning objective. Entries of 1, 2, and 3 inserted in a cell of the matrix respectively indicate that an objective addresses an outcome slightly, moderately, or substantively. Table 2 shows such a matrix. Once the course assessment matrices have been prepared, a program outcome assessment matrix can be constructed as shown in Table 3, with columns for program outcomes and rows for program outcome indicators and core courses. Entries of 1, 2, and 3 in the matrix respectively denote slight, moderate, and substantive relevance of the outcome indicators and core courses to the program outcomes. This matrix provides a concise summary of how the program outcomes are assessed and the courses to concentrate on when attempting to raise the attainment level of a particular outcome. The entries for a course should be based on a review of course materials (syllabi, learning objectives, tests and other assessment measures, and the course assessment matrix) conducted by a committee that includes all faculty members who teach the course.

A common concern of ABET evaluators has to do with outcomes addressed in only one or two courses (especially if the courses are taught outside the program), such as communication skills addressed only in one or two general education courses and safety or ethics addressed only in the capstone design course. Programs are advised to distribute their coverage of each outcome throughout the program, not only for appearance’s sake but to provide repeated practice and feedback in the skills the students will need to meet the outcome performance target.

Once the program assessment has been carried out, asterisks may be placed next to matrix entries for outcome indicators in a copy of the program assessment matrix to indicate that the relevant performance targets have been met. (This information would not necessarily be included in the self-study but could serve for internal program use only.) Entries without asterisks would identify possible focal points for the continuous improvement effort mandated by Criterion 3.
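To make the matrix bookkeeping concrete, here is a minimal Python sketch of a course assessment matrix and its contribution to the program outcome assessment matrix. The course objectives, outcomes, and entries are hypothetical, and collapsing a course's entries by taking the strongest rating per outcome is one convenient convention, not a rule stated in the paper.

```python
# Hypothetical course assessment matrix: rows are outcome-related
# learning objectives, columns are program outcomes, and entries of
# 1, 2, or 3 mean the objective addresses the outcome slightly,
# moderately, or substantively.
course_matrix = {
    "Explain specific gravity, vapor pressure, and dew point": {"3a": 3},
    "Design and carry out a tensile-strength experiment":      {"3b": 3, "3k": 2},
    "Critique a team's written lab report":                    {"3d": 2, "3g": 3},
}

def course_row(matrix):
    """Collapse a course assessment matrix into the course's row of the
    program outcome assessment matrix (strongest rating per outcome)."""
    row = {}
    for ratings in matrix.values():
        for outcome, strength in ratings.items():
            row[outcome] = max(row.get(outcome, 0), strength)
    return row

print(course_row(course_matrix))
# -> {'3a': 3, '3b': 3, '3k': 2, '3d': 2, '3g': 3}
```

Rows like this, built for every core course, stack into the program outcome assessment matrix of Table 3; outcomes whose column is empty or weak across all courses stand out immediately as coverage gaps.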

B. Formulating Outcome-Related Course Learning Objectives

Consider the following illustrative program outcome: The program graduates will be able to analyze important social and environmental problems and identify and discuss ways that engineers might contribute to solutions, including technological, economic, and ethical considerations in their analysis.

It might reasonably be claimed that Engineering Criteria Outcomes 3e (identifying engineering problems), 3f (understanding professional and ethical responsibility), 3h (understanding global and societal implications of engineering solutions), and 3j (knowledge of contemporary issues) all map onto this hypothetical program outcome. For a program to meet the requirements for accreditation, each of Outcomes 3a–3k must map onto one or more program outcomes in this manner, from which it follows that the core courses in the program curriculum must collectively include learning objectives that address each of Outcomes 3a–3k. (There is no need for any individual course to address all 11 outcomes, however, and it would be a rare course indeed that does so.)

It takes little skill to write learning objectives for some of the Criterion 3 outcomes. Almost any objective that could be written for an engineering course, for example, could be plausibly claimed to address Outcome 3a (apply knowledge of mathematics, science, and engineering), and it would also not be difficult to write objectives that address Outcome 3k (use modern engineering techniques, skills, and tools). Other outcomes pose somewhat greater challenges. For example, Outcome 3e (identify, formulate, and solve engineering problems) requires some thought. Solving engineering problems is not a problem—most traditional course objectives involve problem-solving of one type or another—but objectives that clearly define the skills involved in problem identification and formulation are not obvious.

Having to write objectives for some of the other outcomes throws most engineering professors into completely unfamiliar territory. Little in their background or experience provides a basis for knowing how students might show an ability to work effectively in multidisciplinary teams (3d) or to engage in lifelong learning (3i), or how they might demonstrate an understanding of professional or ethical responsibility (3f) or of the impact of engineering solutions in a global and societal context (3h).


Table 2. Course assessment matrix: CHE 205.

Illustrative learning objectives for each of Outcomes 3a–3k are given in Appendix B, and an exhaustive list of attributes for each element of each outcome is provided in Reference 9. Instructors seeking to formulate outcome-related learning objectives for their courses may begin by adapting items from either of these sources. If the attributes in the second list are not directly expressed in measurable terms (e.g., if they begin with words like “know” or “understand” or “appreciate”), the instructors may recast them using appropriate action words, many of which are also suggested in Reference 9.

IV. ASSESSING LEARNING

Once program outcomes have been formulated and outcome indicators and performance targets specified, outcome-related learning objectives should be drafted for all core courses and a plan should be made for assessing the degree to which the objectives are being met. The assessment plan should also specify who is responsible for each part of the assessment, when the assessment will be performed, and who will receive the results [74].

Triangulation (using multiple methods to obtain and verify a result) is an important feature of effective assessment [10]. The more tools used to assess a specific program outcome or course learning objective, the greater the likelihood that the assessment will be both valid (meaning that what the chosen method is actually assessing matches what is supposedly being assessed) and reliable (the conclusion would be the same if the assessment were conducted by other assessors or again by the same assessor). Following are some possible program-level (P) and course-level (C) assessment tools:

  1. Exit surveys, exit interviews (P)
  2. Alumni surveys and interviews (P)
  3. Employer surveys and interviews (P)
  4. Job offers, starting salaries (relative to national benchmarks) (P)
  5. Admissions to graduate school (P)
  6. Performance in co-op and internship assignments and in problem-based learning situations (P,C)
  7. Assignments, reports, and tests in the capstone design course (P,C)
  8. Standardized tests—e.g., the FE Examination (the value of which is discussed by Watson [87]), the GRE, and the Force Concept Inventory in physics (P,C)
  9. Student surveys, individual and focus group interviews (P,C)
  10. Self-analyses, learning logs, journals (P,C)
  11. Peer evaluations, self-evaluations (P,C)
  12. Student portfolios (P,C)
  13. Behavioral observation, ethnographic and verbal protocol analysis (analyzing transcripts of student interviews or working sessions to extract patterns of problem-solving, thinking, or communication) (P,C)
  14. Written tests or test items clearly linked to learning objectives (C)
  15. Written project reports (C)
  16. Oral presentations (live or on videotape) (C)
  17. Research proposals, student-formulated problems (C)
  18. Abstracts, executive summaries, papers (C)
  19. Letters, memos (C)
  20. Written critiques of documents or oral presentations (C)
  21. Classroom assessment techniques [5, 60, 77] (C)

Nichols [61] and Prus and Johnson [69] summarize the strengths and weaknesses of many of these assessment tools.

The reliability of objective tests such as standardized multiple-choice and (setting aside questions related to partial credit) quantitative problem-solving tests may be demonstrated with relatively little difficulty. On the other hand, ratings of such student products as portfolios and project reports are necessarily matters of opinion, and designing reliable rating methods can be difficult. An effective approach is to identify aspects of the product or presentation to be rated (e.g., for grading project or laboratory reports, the aspects might be technical soundness, organization, thoroughness of discussion, and quality of writing), select a weighting factor for each aspect, and construct a rubric—a form on which the evaluator assigns numerical ratings to each specified aspect and then uses the specified weighting factors to compute an overall rating. Trevisan et al. [85] offer suggestions regarding the effective design and use of rubrics, including a recommendation that the characteristics of the highest and lowest ratings and the midpoint rating for each feature be spelled out fairly explicitly. If several raters complete forms independently and then reconcile their ratings, the result should be very reliable, and the reliability can be increased even further by giving raters preliminary training on sample products or videotaped presentations.

Considerable expertise is required to design valid questionnaires and interviews [20, 33, 82], so unless an already validated instrument is available, assistance from a knowledgeable consultant should be sought when using these assessment tools. Similarly, assembling and evaluating student portfolios is a complex and time-consuming task that can become completely unmanageable without careful planning. Several resources are available to assist in this planning for portfolios in general [7, 18, 62, 64] and for electronic portfolios [6, 73, 89].

More detailed discussions of assessment in the context of engineering program accreditation are given by Besterfield-Sacre et al. [9, 10], McGourty et al. [55], Olds and Miller [62], Rogers [71], Rogers and Sando [72], and Scales et al. [75]. Deek et al. [19] discuss the assessment of problem-solving skills.
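As a concrete illustration of the rubric computation described above, here is a minimal Python sketch. The four aspects are the ones the paper suggests for grading project or laboratory reports; the weighting factors, the 0–10 rating scale, and the sample ratings are hypothetical.

```python
# Hypothetical rubric for a project report: the evaluator rates each
# aspect on a 0-10 scale; the overall rating is the weighted average.
RUBRIC_WEIGHTS = {
    "technical soundness":        0.40,
    "organization":               0.20,
    "thoroughness of discussion": 0.20,
    "quality of writing":         0.20,
}

def overall_rating(ratings, weights=RUBRIC_WEIGHTS):
    """Weighted average of aspect ratings (weights must sum to 1)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[a] * ratings[a] for a in weights)

one_evaluator = {            # one rater's hypothetical ratings
    "technical soundness": 8,
    "organization": 6,
    "thoroughness of discussion": 7,
    "quality of writing": 9,
}
print(round(overall_rating(one_evaluator), 2))  # -> 7.6
```

In line with the reliability suggestions above, several raters could complete such forms independently and reconcile their aspect ratings before the overall score is computed.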

V. TEACHING TO ADDRESS OUTCOMES 3A–3K

The ABET engineering criteria have been discussed extensively in articles and presentations since they were first announced, but most of the discussion has focused on assessing Outcomes 3a–3k, with relatively little being said about what must be done to achieve those outcomes. The tacit assumption seems to be that determining whether or not students have specific skills is much harder than equipping them with those skills. In fact, the opposite is closer to the truth. We know a great deal about how to assess communication skills, for example, but judging from the common complaints that most engineering graduates cannot write a coherent report or give a comprehensible talk, we clearly have not yet worked out how to raise those skills to satisfactory levels.

In Section A we outline instructional methods that address each of the Criterion 3 outcomes and cite references on how to implement them and on the research that supports their effectiveness. Sections B and C discuss cooperative learning and problem-based learning, two instructional approaches that have the potential to address all eleven Criterion 3 outcomes effectively. For additional descriptive details and research findings about the methods to be described, see Bransford et al. [13], McKeachie [57], and Wankat [86].

A. General Instructional Methods

The more explicitly students know what they are expected to do and the more practice they get in doing it, the greater the likelihood that they will acquire the desired skills [28, 34, 49]. An effective approach to achieving any desired learning outcome is therefore to show the students the course learning objectives that address that outcome, either on the first day of the course or (better) in study guides for the course tests. Other instructional techniques that address specific Criterion 3 outcomes are suggested in Appendix C. While a full discussion of all of the outcomes is beyond the scope of this paper, we briefly discuss two of them as illustrations.

Let us first examine Outcome 3b (ability to design and conduct experiments, analyze and interpret data). In the traditional engineering laboratory course, the students work through a series of fairly rigidly prescribed experiments in which they follow instructions on equipment operation, collect the prescribed data, shut down, do the prescribed data analysis, and write and submit a report. In terms of the four elements of Outcome 3b, they can certainly be said to have conducted experiments, but whether they can claim to have done anything meaningful by way of analyzing and interpreting data is a matter of opinion, and experimental design has clearly not entered the picture.

Alternative ways to conduct the laboratory course offer much better prospects of satisfying Outcome 3b and greatly improving the learning experience. Perhaps the most promising approach would be to run fewer but more open-ended experiments. For a given experiment, the students would be given an objective (determine a physical property, establish an empirical correlation, validate or refute a theoretical prediction,…), provided with enough training to keep them from destroying the equipment or injuring themselves, and turned loose. It would be up to them to design the experiment (choose experimental conditions, specify how many runs to carry out at each condition and the data to be collected, plan the data analysis to be carried out), run it and collect the data, perform the data analysis and interpretation, draw conclusions, and prepare and submit the report. A lab course conducted in this manner could legitimately claim to be addressing all of the elements of Outcome 3b.

Addressing an outcome and satisfying it are not synonymous, however. If students are to perform well on the Outcome 3b indicators built into the program assessment plan, they must be helped to develop the skills in question. A general educational principle is, don’t assess skills that have not been taught. Starting in the first week of the course, instruction should be provided in experimental design and statistical data analysis and any other topics that have not been solidly addressed earlier in the curriculum. The instruction may take the form of mini-lectures, supplementary readings, or best of all, interactive multimedia tutorials if good ones can be found or prepared. Another effective technique is to provide real or hypothetical reports that illustrate both good and bad approaches to experimental design, data analysis and representation, interpretation and discussion, and report preparation; have the students critique the reports in teams; and then give them feedback on their critiques.

Another well-known educational principle is that the assessment drives the learning. If students know they are going to be held individually accountable for course material, most will make a serious attempt to learn it; without the individual accountability, many overburdened engineering students will choose to spend their time in more productive ways. For example, if an experiment is carried out by a team and team member A is responsible for the statistical data analysis, members B, C, and D may not try to understand or even read that section of the final report. However, if individual tests that cover statistical analysis are built into the course and study guides containing related objectives are handed out beforehand, the chances that all four students will make the effort to learn the material increase dramatically. Another instructional strategy is to randomly select team members to report on different sections, making the designation a short time before the reports are due, and have the team grade depend in part on the quality of the reports. The students will then be forced to learn the entire report content and not just the parts for which they were responsible. Most of the learning that takes place may occur in the tutoring sessions that precede the reports. Since the expert in, say, statistical analysis knows that her grade depends in part on how well her teammates can explain what she did, she will make it her business to see that they understand it.

Laboratories are not the only places experimental skills can be acquired, and no outcome should be addressed in only a single course in the curriculum. In lecture courses, real or simulated experimental data may be provided in classroom exercises or in homework problems and the students can be asked to perform the appropriate data analysis and interpretation. Particular care should be taken to build experimental error or discrepant results into the data to emphasize the idea that in the laboratory (as opposed to textbooks) things don’t always go the way they’re supposed to. In other class activities and assignments, the students can be asked to design an experiment to measure a variable or property or validate a theory or empirical correlation being discussed in the lectures. As before, if this part of the course content is to be taken seriously, it should be included in study guides and on tests.

Let us move now to Outcome 3i (recognize the need for and be able to engage in lifelong learning). Candy [16] defines lifelong learning skill as the ability to “continue one’s own self education beyond the end of formal schooling.” Drawing on work of McCombs [53], Marra et al. [51] suggest that if students are to be motivated and equipped to continue teaching themselves, their formal education must go beyond presentation of subject content to address four objectives: (1) helping them to understand their own learning processes, (2) requiring them to take responsibility for their own learning, (3) creating an atmosphere that promotes confidence in their ability to succeed, and (4) helping them see schooling and education as personally relevant to their interests and goals.

The instructional methods suggested in Appendix C for addressing Outcome 3i are consistent with these goals (as are the learning objectives suggested for this outcome in Appendix B). Acquainting students with their learning styles is a direct and effective way to help them understand their learning process [25]. Assignments that require independent literature and Web searches promote a sense of individual responsibility for learning and also help develop the skill to find and organize information in the absence of texts and course notes. Presenting realistic and interesting technological and socially relevant problems and asking the students to contemplate approaches to solving them (problem-based learning, discussed in greater detail in the next section)—and assuring them that they’re going to have to do that all the time as engineers—may be the best way to impress on them the need for lifelong learning. Giving them repeated practice in formulating solution approaches


  1. [Course level] For every course in the core, define observable outcome-related learning objectives that are guaranteed to be in place regardless of who happens to teach the course (Section III-B and Appendix B) and define assessment methods for each core objective (Section IV and Besterfield-Sacre et al. [9]). Each of these learning objectives should map onto one or more program outcomes, and all program outcomes should be addressed by objectives in several core courses—the more, the better. Some programs have found it helpful to formulate course outcomes for each required course that include some program outcomes or outcome elements, and then formulate the course learning objectives to address the course outcomes.
  2. [Course level] Prepare a course assessment matrix with columns for program outcomes and rows for outcome-related course learning objectives (Table 2). Place a 1, 2, or 3 in the matrix to indicate that an objective addresses an outcome marginally, moderately, or substantively. The entries should reflect a consensus of all faculty members who are likely to teach the course before the next accreditation visit.
  3. [Program level] Prepare a program outcome assessment matrix with columns for program outcomes and rows for outcome indicators and core courses (Table 3). Place a 1, 2, or 3 in the matrix to indicate that an outcome indicator or core course addresses an outcome marginally, moderately, or substantively, basing the entries for each course on an examination of course materials and the course assessment matrix by a faculty review committee.
  4. [Course level] Teach each course in a manner that addresses all of the targeted program outcomes (Appendices C–E). Implement the assessment methods selected in Step 2 and place asterisks next to the 1’s, 2’s, and 3’s in the course assessment matrix when a learning objective is judged to have been met.
  5. [Program level] Implement the program outcome assessment methods selected in Step 1 and evaluate the performance targets. Insert asterisks next to the 1’s, 2’s, and 3’s for an outcome indicator to indicate that the corresponding performance target has been met. If the assessment for a particular outcome indicates shortcomings or room for improvement, initiate appropriate actions to improve instruction in the relevant courses. The program outcome assessment matrix should indicate which courses might be modified, and the course assessment matrix for each of those courses should suggest areas that need strengthening. Possible instructional modifications may be found in Section V and Appendices C–E.

We make no claim that this procedure is the only way or the optimal way to prepare for an ABET visit; we simply suggest that it is rational and consistent with both the letter and spirit of the engineering criteria, and we propose that engineering programs consider adapting it to their own needs and resources.

Regardless of the programmatic approach adopted, however, individual faculty members must take responsibility for assuring that the program outcomes are met and that program outcome assessment results are used for continuous program improvement. Fulfilling this responsibility entails defining outcome-related course learning objectives, selecting and implementing assessment methods that address all the objectives, and teaching the courses in a way that promotes positive assessment results. Our hope is that the suggestions and examples presented in the body and Appendices B–E of this paper will provide a useful resource to professors engaged in this process.

ACKNOWLEDGMENTS

The preparation of this work was supported by the SUCCEED Engineering Education Coalition (NSF Cooperative Agreement EEC-9727411). The authors are indebted to Lisa Bullard of North Carolina State University, Gary Huvard of Virginia Commonwealth University, and George Peterson of ABET for their insightful critiques of an early draft.

REFERENCES

[1] ABET (Accreditation Board for Engineering and Technology). Criteria for accrediting engineering programs: Effective for evaluations during the 2002–2003 accreditation cycle. http://www.abet.org/images/Criteria/2002-03EACCriteria.pdf, accessed September 28, 2002.
[2] Adams, J.L. 1991. Flying Buttresses, Entropy, and O-Rings: The World of an Engineer. Cambridge: Harvard University Press.
[3] Adamy, D. 1987. Preparing and Delivering Effective Technical Presentations. Norwood, MA: Artech House, Inc.
[4] Aldridge, M.D., and L.D. Benefield. 1998. A model assessment plan. ASEE Prism. May–June 1998: 22–28.
[5] Angelo, T.A., and K.P. Cross. 1993. Classroom Assessment Techniques: A Handbook for College Teachers. 2nd ed. San Francisco: Jossey-Bass.
[6] Barrett, H.C. 1998. Strategic questions: What to consider when planning for electronic portfolios. Learning and Leading with Technology. 26(2): 6–13.
[7] Barton, J., and A. Collins, eds. 1997. Portfolio Assessment: A Handbook for Educators. New York: Addison-Wesley.
[8] Beer, D., and D. McMurrey. 1997. A Guide to Writing as an Engineer. New York: John Wiley & Sons.
[9] http://civeng1.civ.pitt.edu/~ec2000, accessed September 28, 2002. See also Besterfield-Sacre, M.E., et al. 2000. Defining the outcomes: A framework for EC 2000. IEEE Transactions on Engineering Education. 43(2): 100–110.
[10] Besterfield-Sacre, M.E., et al. 2000. Triangulating assessments. Proceedings, 2000 ASEE Annual Meeting. American Society for Engineering Education.
[11] Bloom, B.S., and D.R. Krathwohl. 1984. Taxonomy of Educational Objectives. Handbook 1: Cognitive Domain. New York: Addison-Wesley.
[12] Branscomb, H.E. 1997. Casting Your Net: A Student’s Guide to Research on the Internet. Needham Heights, MA: Allyn & Bacon.
[13] Bransford, J.D., A.L. Brown, and R.R. Cocking, eds. 2000. How People Learn: Brain, Mind, Experience, and School. Expanded edition. Washington, D.C.: National Academy Press. Available on-line at http://www.nap.edu/books/0309070368/html/, accessed September 28, 2002.
[14] Brent, R., and R.M. Felder. 1992. Writing assignments—Pathways to connections, clarity, creativity. College Teaching. 40(2): 43–47.
[15] Bucciarelli, L.L. 1994. Designing Engineers. Cambridge, MA: MIT Press.
[16] Candy, P. 1991. Self-direction for Lifelong Learning: A Comprehensive Guide to Theory and Practice. San Francisco: Jossey-Bass.
[17] Carter, M., R. Brent, and S. Rajala. 2001. EC 2000 Criterion 2: A procedure for creating, assessing, and documenting program educational objectives. Proceedings, 2001 ASEE Annual Conference. American Society for Engineering Education.
[18] Christy, A.D., and M.B. Lima. 1998. The use of student portfolios in engineering instruction. Journal of Engineering Education. 87(2): 143–148.
[19] Deek, F.P., S.R. Hiltz, H. Kimmel, and N. Rotter. 1999. Cognitive assessment of students’ problem solving and program development skills. Journal of Engineering Education. 88(3): 317–326.
[20] Dobson, A. 1996. Conducting Effective Interviews: How to Find Out What You Need to Know and Achieve the Right Results. Philadelphia: Trans-Atlantic Publications, Inc.
[21] Edens, K.M. 2000. Preparing problem solvers for the 21st century through problem-based learning. College Teaching. 48(2): 55–60.
[22] Evers, T., J. Rush, and I. Berdrow. 1998. The Bases of Competence: Skills for Lifelong Learning and Employability. San Francisco: Jossey-Bass.
[23] Felder, R.M. 1987. On creating creative engineers. Engineering Education. 77(4): 222–227.
[24] Felder, R.M. 1993. An engineering student survival guide. Chapter One. 7(3): 42–44. Available on-line at http://www.ncsu.edu/felder-public/Papers/survivalguide.htm, accessed September 28, 2002.
[25] Felder, R.M. 1993. Reaching the second tier: Learning and teaching styles in college science education. J. College Science Teaching. 23(5): 286–290. Available on-line at http://www.ncsu.edu/felder-public/Papers/Secondtier.html, accessed September 28, 2002.
[26] Felder, R.M., and R. Brent. 1994. Cooperative learning in technical courses: Procedures, pitfalls, and payoffs. ERIC Document Reproduction Service, ED 377038. Available on-line at http://www.ncsu.edu/felder-public/Papers/Coopreport.html, accessed September 28, 2002.
[27] Felder, R.M., and R. Brent. 1996. Navigating the bumpy road to student-centered instruction. College Teaching. 44(2): 43–47. Available on-line at http://www.ncsu.edu/felder-public/Papers/Resist.html, accessed September 28, 2002.
[28] Felder, R.M., and R. Brent. 1997. Objectively speaking. Chemical Engineering Education. 31(3): 178–179. Available on-line at http://www.ncsu.edu/felder-public/Columns/Objectives.html, accessed September 28, 2002.
[29] Felder, R.M., and R. Brent. 2001. Effective strategies for cooperative learning. Journal of Cooperation and Collaboration in College Teaching. 10(2): 63–69. Available on-line at http://www.ncsu.edu/felder-public/Papers/CLStrategies(JCCCT).pdf, accessed September 28, 2002.
[30] Felder, R.M., D.R. Woods, J.E. Stice, and A. Rugarcia. 2000. The future of engineering education. 2. Teaching methods that work. Chem. Engr. Education. 34(1): 26–39. Available on-line at http://www.ncsu.edu/felder-public/Papers/Quartet2.pdf, accessed September 28, 2002.
[31] Florman, S. 1996. The Introspective Engineer. New York: St. Martin’s Press.
[32] Fogler, H.S., and S.E. LeBlanc. 1994. Strategies for Creative Problem Solving. Englewood Cliffs, NJ: Prentice-Hall.
[33] Fowler, F.J. 1993. Survey Research Methods. 2nd ed. Newbury Park, CA: Sage.
[34] Gronlund, N.E. 1999. How to Write and Use Instructional Objectives. 6th ed. Englewood Cliffs, NJ: Prentice-Hall.
[35] Harris, Jr., C.E., M.S. Pritchard, and M.J. Rabins. 1995. Engineering Ethics: Concepts and Cases. Belmont, CA: Wadsworth.
[36] Haws, D.R. 2001. Ethics instruction in engineering education: A (mini) meta-analysis. Journal of Engineering Education. 90(2): 223–229.
[37] Hendley, V. 1996. Let problems drive the learning. ASEE Prism. Oct. 1996: 30–36.
[38] Hicks, C.R. 1982. Fundamental Concepts in the Design of Experiments. 3rd ed. New York: Holt, Rinehart & Winston.
[39] Hult, C.A. 1996. Researching and Writing Across the Curriculum. Boston: Allyn & Bacon.
[40] Huvard, G.S., et al. 2001. ChemEngine: Realizing entrepreneurship in undergraduate engineering education. Proceedings, 2001 ASEE Annual Conference. American Society for Engineering Education. More information about ChemEngine may be obtained from Dr. Gary Huvard, gshuvard@vcu.edu.
[41] Johnson, D.W., and R.T. Johnson. 1995. Creative Controversy: Intellectual Challenge in the Classroom. 3rd ed. Edina, MN: Interaction Book Company. See also http://www.clcrc.com/pages/academic.html, accessed September 28, 2002.
[42] Johnson, D.W., R.T. Johnson, and K.A. Smith. 1998. Active Learning: Cooperation in the College Classroom. 2nd ed. Edina, MN: Interaction Book Co.
[43] Johnson, D.W., R.T. Johnson, and M.B. Stanne. 2000. Cooperative Learning Methods: A Meta-Analysis. http://www.clcrc.com/pages/cl-methods.html, accessed September 28, 2002.
[44] Kaufman, D.B., R.M. Felder, and H. Fuller. 2000. Accounting for individual effort in cooperative learning teams. Journal of Engineering Education. 89(2): 133–140.
[45] Krathwohl, D.R., B.S. Bloom, and B.B. Masia. 1984. Taxonomy of Educational Objectives. Handbook 2: Affective Domain. New York: Addison-Wesley.
[46] Leifer, L. 1997. A collaborative experience in global product-based learning. NTU Faculty Forum. National Technological University, www.ntu.edu, accessed September 28, 2002.
[47] Lohmann, J.R. 1999. EC 2000: The Georgia Tech experience. Journal of Engineering Education. 88(3): 305–310.
[48] Longworth, N., and W.K. Davies. 1996. Lifelong Learning. London: Kogan Page.
[49] Mager, R.F. 1997. Preparing Instructional Objectives: A Critical Tool in the Development of Effective Instruction. 3rd ed. Atlanta: Center for Effective Performance.
[50] Maricopa Center for Learning and Instruction. Problem-Based Learning. http://www.mcli.dist.maricopa.edu/pbl/problem.html, accessed September 28, 2002.
[51] Marra, R.M., K.Z. Camplese, and T.A. Litzinger. 1999. Lifelong learning: A preliminary look at the literature in view of EC 2000. Proceedings, 1999 Frontiers in Education Conference. Institute of Electrical and Electronics Engineers.
[52] Maskell, D. 1999. Student-based assessment in a multi-disciplinary problem-based learning environment. Journal of Engineering Education. 88(2): 237–241.
[53] McCombs, B.L. 1991. Motivation and lifelong learning. Educational Psychologist. 26: 117–127.
[54] McGourty, J., and K. De Meuse. 2000. The Team Developer: An Assessment and Skill Building Program. New York: John Wiley & Sons.
[55] McGourty, J., M. Besterfield-Sacre, and L. Shuman. 1999. ABET’s eleven student learning outcomes (a–k): Have we considered the implications? Proceedings, 1999 Annual ASEE Conference. American Society for Engineering Education.
[56] McGourty, J., C. Sebastian, and W. Swart. 1998. Developing a comprehensive assessment program for engineering education. Journal of Engineering Education. 87(4): 355–361.
[57] McKeachie, W.J. 1999. Teaching Tips: Strategies, Research, and Theory for College and University Teachers. 10th ed. Boston: Houghton Mifflin.
[58] McMaster University. Problem-Based Learning. http://www.chemeng.mcmaster.ca/pbl/pbl.htm, accessed September 28, 2002.
[59] Millis, B.J., and P.G. Cottell, Jr. 1998. Cooperative Learning for Higher Education Faculty. Phoenix: American Council on Education/Oryx Press.
[60] National Institute for Science Education. http://www.wcer.wisc.edu/nise/CL1/, accessed September 28, 2002.


APPENDIX A

Glossary of Accreditation Terminology

  1. Program educational objectives—“broad, general statements that communicate how an engineering program intends to fulfill its educational mission and meet its constituencies’ needs [4].” Example: Provide students with a solid grounding in the basic sciences and mathematics, an understanding and appreciation of the arts, humanities, and social sciences, and proficiency in both engineering science and design.
  2. Program outcomes—more specific statements of program graduates’ knowledge, skills, and attitudes that serve as evidence of achievement of the program’s educational objectives. Example: The program graduates will be able to analyze important social and environmental problems and identify and discuss ways that engineers might contribute to solutions, including technological, economic, and ethical considerations in their analysis. In Criterion 3, ABET specifies eleven outcomes (Outcomes 3a–3k, listed in Table 1). Program outcomes must encompass Outcomes 3a–3k but should not be verbatim copies of them. To meet the requirements of the engineering criteria, the program outcomes should clearly have been formulated to address all of the program educational objectives.
  3. Outcome indicators—the instruments and methods that will be used to assess the students’ attainment of the program outcomes [75]. Examples: Alumni, employer, and industrial advisory board surveys, exit interviews with graduating seniors, student portfolios, capstone design course performance ratings, performance on standardized tests like the FE Examination and the GRE, and job placement data of graduates.
  4. Performance targets—the target criteria for the outcome indicators. Examples:
  • The [average score, score earned by at least 80%] of the program graduates on the [standardized test, standardized test item, capstone design report, portfolio evaluation] must be at least 75/100.
  • The [median rating for, rating earned by at least 80% of] the program graduates on the [self-rating sheet, peer rating sheet, senior survey, alumni survey, employer survey, final oral presentation] must be at least [75/100, 4.0 on a 1–5 Likert scale, “Very good”].
  5. Outcome elements —different abilities specified in a single out- come that would generally require different assessment measures. Besterfield-Sacre et al. [9] break each of Outcomes 3a–3k into separate elements. For some outcomes, such as Outcome 3b, the elements are literally extracted from the outcome statement: Outcome 3b—ability to design and conduct experiments, as well as analyze and interpret data ⇒ designing experiments, conducting experiments, analyzing data, interpreting data. For others, such as Outcome 3e, the elements are derived from an analysis of the specified abilities: Outcome 3e—ability to identify, formulate, and solve engineer- ing problems ⇒ problem identification, problem statement construction and system definition, problem formulation and ab- straction, information and data collection, model translation, validation, experimental design, solution development or experi- mentation, interpretation of results, implementation, documenta- tion, feedback and improvement.
6. Outcome attributes—actions that explicitly demonstrate mastery of the abilities specified in an outcome or outcome element. The main thrust of the work of Besterfield-Sacre et al. [9] is to define attributes at the six levels of Bloom's taxonomy of cognitive objectives [11] and at the valuation level of Krathwohl's taxonomy of affective objectives [45] for each of Outcomes 3a–3k.
Examples: Attributes proposed by Besterfield-Sacre et al. [9] for the element "Problem statement construction and system definition" of Outcome 3e include:
• describes the engineering problem to be solved,
• visualizes the problem through sketch or diagram,
• outlines problem variables, constraints, resources, and information given to construct a problem statement, and
• appraises the problem statement for objectivity, completeness, relevance, and validity.
7. Program core—a set of courses designated to address some or all of the program outcomes. Required courses in the major field of study would be obvious candidates for the core. Required courses given in other programs—such as mathematics, physics, chemistry, and English—might be included as long as they consistently address outcomes. Elective courses or courses whose content varies from one offering to another (so that the outcomes might not be addressed in a particular offering) would not be included.
8. Course outcomes—knowledge, skills, and attitudes that the students who complete a course are expected to acquire. Some of the outcomes in program core courses should map onto or be identical with one or more program outcomes.
9. Course learning objectives (aka instructional objectives)—statements of observable student actions that serve as evidence of the knowledge, skills, and attitudes acquired in a course.
Examples: The students will be able to
• explain in terms a high school student could understand the concepts of specific gravity, vapor pressure, and dew point
• solve a second-order ordinary differential equation with specified initial conditions using Matlab
• design and carry out an experiment to measure a tensile strength and determine a 95% confidence interval for its true value
• define the four stages of team functioning and outline the responsibilities of a team coordinator, recorder, checker, and process monitor
Learning objectives should begin with observable action words (such as explain, outline, calculate, model, design, and evaluate) and should be as specific as possible, so that an observer would have no trouble determining whether and how well students have accomplished the specified task. Words like "know," "learn," "understand," and "appreciate" may be suitable for use in educational objectives or program or course outcomes but not learning objectives. To know whether or not students understand, say, the impact of engineering solutions in a global/societal context (Outcome 3h), one must ask them to do something to demonstrate that understanding, such as identify an important problem and discuss ways engineers might help solve it.
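To illustrate how directly such objectives translate into assessable tasks, here is a minimal sketch of a solution to the second objective above, written in Python with SciPy rather than Matlab; the specific equation y'' + 2y' + 5y = 0 and its initial conditions are hypothetical stand-ins:

    # Minimal sketch: solving a second-order ODE with specified initial
    # conditions. The equation y'' + 2y' + 5y = 0 with y(0) = 1, y'(0) = 0
    # is an invented example; Python/SciPy stands in for Matlab here.
    import numpy as np
    from scipy.integrate import solve_ivp

    def system(t, state):
        # Rewrite the second-order equation as two first-order equations:
        # y1' = y2 and y2' = -2*y2 - 5*y1
        y1, y2 = state
        return [y2, -2.0 * y2 - 5.0 * y1]

    solution = solve_ivp(system, t_span=(0.0, 5.0), y0=[1.0, 0.0],
                         t_eval=np.linspace(0.0, 5.0, 101))

    print(f"y(5) is approximately {solution.y[0][-1]:.4f}")

An observer can check at a glance whether the student's code runs and whether the reported value is correct, which is exactly the observability the objective is written to provide.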
10. Outcome-related course learning objectives—learning objectives for a core course that specifically address one or more program outcomes. These objectives would normally be cited in the self-study to establish where and how the program is addressing the outcomes in its curriculum, and they must be guaranteed to be in place whenever the course is given. Core courses would also generally include other learning objectives unrelated to program outcomes.

We have defined these terms because they or variants of them appear in the accreditation literature and in published self-study reports, but there is no requirement that any individual self-study make use of all of them. The only ones mentioned by ABET are the first two (program educational objectives and program outcomes) and the ninth one (course learning objectives); the other terms might or might not be included in a self-study, depending on how the program chooses to approach the engineering criteria.

APPENDIX B

Illustrative Learning Objectives for Outcomes 3a–3k

Outcome 3a (apply knowledge of mathematics, science, and engineering) and Outcome 3k (use modern engineering techniques, skills, and tools)
The student will be able to (insert the usual engineering course objectives).

Outcome 3b (design and conduct experiments, analyze and interpret data)
The student will be able to
• design an experiment to (insert one or more goals or functions) and report the results (insert specifications regarding the required scope and structure of the report). Variants of this objective could be used in traditional lecture courses as well as laboratory courses.
• conduct (or simulate) an experiment to (insert specifications about the goals of the experiment) and report the results (insert specifications regarding the scope and structure of the report).
• develop a mathematical model or computer simulation to correlate or interpret experimental results (insert specifications regarding the experiment and the data). The results may be real data from a laboratory experiment or simulated data given to students in a lecture course. (A sketch of such a model fit appears after this list.)
• list and discuss several possible reasons for deviations between predicted and measured results from an experiment, choose the most likely reason and justify the choice, and formulate a method to validate the explanation.
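As an illustration of the model-development objective, the following Python sketch fits a simple exponential-decay model to invented "experimental" data by least squares; the model form, data values, and initial parameter guesses are all hypothetical:

    # Minimal sketch: correlating hypothetical experimental data with an
    # exponential-decay model, y = a * exp(-b * t), by least squares.
    import numpy as np
    from scipy.optimize import curve_fit

    def model(t, a, b):
        # Proposed two-parameter model for the measured quantity
        return a * np.exp(-b * t)

    # Invented measurements of some quantity y at several times t
    t_data = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
    y_data = np.array([10.1, 7.2, 5.4, 3.9, 2.9, 2.1])

    # Fit the model parameters to the data, starting from rough guesses
    (a_fit, b_fit), _ = curve_fit(model, t_data, y_data, p0=(10.0, 0.3))

    print(f"Fitted parameters: a = {a_fit:.2f}, b = {b_fit:.3f}")

A student report built around such a fit could then be assessed against whatever scope and structure specifications the objective inserts.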

Outcome 3c (design a system, component, or process)
The student will be able to
• design a system (or component or process) to (insert one or more goals or functions) and report the results (insert specifications regarding the required scope and structure of the report). Variants of this objective could be included in traditional lecture courses (including the freshman engineering course) as well as the capstone design course.
• use engineering laboratory data to design or scale up a system (or component or process).
• build a prototype of a design and demonstrate that it meets performance specifications.
• list and discuss several possible reasons for deviations between predicted and measured results from an experiment or design, choose the most likely reason and justify the choice, and formulate a method to validate the explanation.

Outcome 3d (function on multi-disciplinary teams)
The student will be able to
• identify the stages of team development and give examples of team behaviors that are characteristic of each stage.
• summarize effective strategies for dealing with a variety of interpersonal and communication problems that commonly arise in teamwork, choose the best of several given strategies for a specified problem, and justify the choice.
• function effectively on a team, with effectiveness being determined by instructor observation, peer ratings, and self-assessment.
• explain aspects of a project, process, or product related to specified engineering and non-engineering disciplines.

Outcome 3e (identify, formulate, and solve engineering problems)
The student will be able to
• troubleshoot a faulty process or product (insert specifications regarding the nature of the process or product) and identify the most likely sources of the faults.
• create and solve problems and identify their levels on Bloom's Taxonomy.
• examine a description of a problematic technology-related situation and identify ways that engineers might contribute to a solution.

Outcome 3f (understand professional and ethical responsibility)
Given a job-related scenario that requires a decision with ethical implications, the student will be able to
• identify possible courses of action and discuss the pros and cons of each one.
• decide on the best course of action and justify the decision.

Outcome 3g (communicate effectively)
The student will be able to
• critique writing samples and identify both strong points and points that could be improved in grammar, clarity, and organization.
• critique oral presentations and identify both strengths and areas for improvement.
• write an effective memo (or letter, abstract, executive summary, project report) or give an effective oral presentation… (insert specifications regarding the length and purpose of the communication and the intended audience).

Outcome 3h (understand the global/societal impact of engineering solutions)
The student will be able to
• discuss historical situations in which technology had a major impact on society, either positively or negatively or both, and speculate on ways that negative results might have been avoided.
• propose a solution or critique a proposed solution to an engineering problem, identifying possible negative global or societal consequences of the solution.


installed and failed to achieve the specified delivery rate and ask for possible reasons. (Responses to such questions might include computational errors, measurement errors, instrument calibration errors, violations of assumptions or inappropriate approximations or failure to account for important factors in the design calculation, flaws in the purchased equipment, incorrect choice of model, algorithm, or formula, equipment failure of one type or another, sabotage, etc.)
• As part of a homework assignment, ask students to make up a problem having to do with the material taught in class that week. Tell them that they will get a minimal passing grade for a completely straightforward formula-substitution problem and that to get a higher grade their problem must call for deep understanding or critical or creative thinking on the part of the problem solver. Provide constructive feedback and examples of good responses. In a subsequent assignment, ask them to make up and solve a problem having to do with that week's material, and later ask them to make up and solve a problem having to do with what they covered that week in this class and in some other class in the curriculum (multidisciplinary thinking), or a problem that involves an ethical dilemma (Outcome 3f) or a contemporary issue (Outcome 3h or 3j). Make copies of some or all student-generated problems for assessment purposes and consider including good ones on course tests. (Announce your intention of doing so when the assignment is given.) [23]

Outcome 3f (understand professional and ethical responsibility)
• Include elements of ethical and professional responsibility in course learning objectives and on tests in at least one core engineering course in each year of the curriculum, including the capstone design course. Provide instruction in engineering ethics in the form of lectures or supplementary handouts. (A less effective alternative is to offer an elective course on professional and ethical responsibility.)
• Include several course-related professional/ethical dilemmas in each engineering course that has professional and ethical issues in its learning objectives. Have students formulate responses and justifications individually, then reach consensus in pairs or teams of three. Provide constructive feedback and several alternative models of good responses, being sure to convey the idea that there is not one "correct" response and that what matters is the clarity and logical consistency of the justification [67]. Have the students reformulate their initial responses to the dilemmas in light of the feedback.

Outcome 3g (communicate effectively)
• Incorporate "writing across the curriculum" or "writing to learn" methods into engineering courses [14, 39].
• Include some qualitative descriptive problems ("Explain in terms a high school senior could understand the concept of ___") in course learning objectives, in-class exercises and homework, and study guides and tests. Grade both technical correctness and clarity of expression.
• In courses that require technical report writing or oral presentation, provide preliminary instruction. Offer bad examples for students to critique and good and bad examples for them to compare and contrast.
• Have students (or student teams) critique first drafts or presentations of other students' (teams') reports, considering both technical accuracy and presentation quality in the critiques. For written reports, collect but do not grade the first drafts; for written and oral reports, grade both the critiques and the revised draft or final presentation.

Outcome 3h (understand impact of engineering solutions in a global/societal context) and Outcome 3j (know contemporary issues)
Incorporate some in-class exercises, homework problems, and/or case studies that involve current global/societal issues in several engineering courses, including freshman engineering and capstone design courses. (Recent newspaper articles and science and society texts are good sources of topics.) Include such issues as environmental/economic tradeoffs, health and safety/economic tradeoffs, problems related to globalization such as movement of production facilities to other countries, total quality management, and pros and cons of government regulation of private industry. Ask students to generate potential solutions and evaluate them. Require such discussions as part of all major design projects. (A less effective approach is to include a "Science and Society" course in the curriculum.)

Outcome 3i (recognize need for and be able to engage in lifelong learning)
• Teach students about learning styles, help them identify the strengths and weaknesses of their style, and give them strategies to improve their study and learning skills [25].
• Require library and Web searches and documentation of references. Grade on the thoroughness of the searches and the quality of the documentation.
• Occasionally introduce case studies of realistic industrial problems and have the students identify what they would need to know to solve them and how they would go about obtaining the needed information. (In other words, use problem-based learning.)
• Use active and cooperative learning (see Section V-C and Appendix E), both of which move students away from relying on professors as the sole source of information and accustom them to relying on themselves and one another.
• In general, anything done to meet Criteria 3e (identify and formulate engineering problems), 3f (understand professional and ethical responsibility), and 3h (understanding of global/societal context of engineering solutions) automatically addresses Criterion 3i.

Outcome 3k (use modern engineering techniques, skills, and tools)
• Have students use state-of-the-art technology for engineering system design, control, and analysis, mathematical analysis, Web-based research, writing, and communication.
• Use computer simulations to conduct extensive parametric studies, process optimization, and "what-if" explorations. (A sketch of such a study appears after this list.)
• Use modern equipment and instrumentation in undergraduate laboratories.
• Include plant visits and presentations by practicing engineers in required engineering courses to make students aware of modern engineering tools and practices.
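To illustrate the computer-simulation suggestion, here is a minimal Python sketch of a "what-if" sweep over a single design parameter; the model (Torricelli's law for an ideal draining tank) and the parameter values are chosen only for illustration:

    # Minimal sketch: a "what-if" parametric study on a simple model.
    # The model is the ideal exit velocity of fluid draining from a tank,
    # v = sqrt(2*g*h), evaluated over a range of liquid heights h.
    import math

    g = 9.81  # gravitational acceleration, m/s^2

    def exit_velocity(height_m):
        # Torricelli's law for an ideal (frictionless) draining tank
        return math.sqrt(2.0 * g * height_m)

    # Sweep the design parameter (liquid height) and tabulate the result
    for h in [0.5, 1.0, 2.0, 4.0, 8.0]:
        print(f"h = {h:4.1f} m  ->  v = {exit_velocity(h):5.2f} m/s")

Real course assignments would of course use discipline-appropriate simulators, but the pattern of systematically varying one parameter and examining the response is the same.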


APPENDIX D

Problem-Based Learning Methods that Address Outcomes 3a–3k

Outcome 3a (apply knowledge of mathematics, science, and engineering)
The traditional instructional approach in science, mathematics, engineering, and technology that presents "fundamentals" and then (as much as three years later) presents the applications that make use of the fundamentals has repeatedly been associated with low motivation, poor learning, negative attitudes toward the subject, and high student attrition [84]. Virtually all modern research-based references on effective teaching and learning agree that students have greater motivation to learn and learn more effectively when they perceive a need to know the material being taught [30, 70]. Establishing a need to know material before teaching it is almost by definition what problem-based learning does.

Outcome 3b (design and conduct experiments, analyze and interpret data)
Rather than having student teams work through a large number of pre-designed experiments in the engineering laboratory course, assign a small number of problems that require experimentation to solve (choosing problems that can be solved with existing or readily obtainable resources) and have the student teams devise and implement experiments to solve them. Provide instruction or resources for self-study in experimental design, statistical data analysis, instrument calibration, equipment operation, etc., only after the teams have encountered a need to know the material.
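For instance, the statistical-data-analysis instruction might culminate in students computing a confidence interval for a quantity they have measured, echoing the tensile-strength objective in Appendix A. A minimal Python/SciPy sketch with invented data follows:

    # Minimal sketch: a 95% confidence interval for the mean of repeated
    # measurements. The tensile-strength values below are invented.
    import numpy as np
    from scipy import stats

    measurements = np.array([412.0, 405.5, 418.2, 409.7, 414.3, 407.9])  # MPa

    mean = measurements.mean()
    sem = stats.sem(measurements)      # standard error of the mean
    dof = len(measurements) - 1        # degrees of freedom for the t statistic
    low, high = stats.t.interval(0.95, dof, loc=mean, scale=sem)

    print(f"Mean tensile strength: {mean:.1f} MPa")
    print(f"95% confidence interval: ({low:.1f}, {high:.1f}) MPa")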

Outcome 3c (design a system, component, or process)
In the capstone design course, do not provide instruction or resources for self-study in the elements of the design process—conceptual design, cost and profitability analysis, CAD, optimization, etc.—until the student teams encounter a need for instruction in those topics in the course of developing their designs.

Outcome 3d (function on multidisciplinary teams)
Assign problems whose solutions require material from several disciplines. (It would be difficult to find problems with the complexity and open-endedness needed to be suitable for problem-based learning that fail to satisfy this condition.) Assign different team members to take primary responsibility for each discipline, making sure to hold all team members accountable for the work done by each of them. (For suggestions about how to achieve this individual accountability, see Appendix E.)

Outcome 3e (identify, formulate, and solve engineering problems)
Problem-based learning is an ideal instructional approach for helping students develop skills in problem identification, formulation, and solution, in that it explicitly requires students to do all three in the course of analyzing complex problems. Simply using PBL is therefore a major step toward addressing this outcome. To further facilitate development of problem formulation skills, have students formulate their own focus problems once they have acquired some experience with instructor-formulated problems.

Outcome 3f (understand professional and ethical responsibility)
Incorporate professional and ethical dilemmas in focus problems. To impart a unique understanding of professional responsibilities, use a variant of the Virginia Commonwealth University student consulting team experience [40].

Outcome 3g (communicate effectively)
Development of communication skills occurs automatically in problem-based learning as long as written or oral reporting is part of the implementation, especially if students work on the problems in structured teams. The greatest benefit is obtained if the implementation adheres to the principles of cooperative learning delineated in Appendix E.

Outcome 3h (understand impact of engineering solutions in a global/societal context)
Choosing PBL focus problems that have global or societal implications may be the most effective way of addressing this outcome. For example, assign the students to design a small, inexpensive, easily portable solar-powered water purification system for use in rural areas in developing countries and to explore its potential technical and economic benefits.

Outcome 3i (recognize need for and be able to engage in lifelong learning)
Any instructional method that transfers some of the burden of learning from the instructor to the students gives students an awareness of the need to assume this burden and helps them develop their skills at doing so. Problem-based learning is a quintessentially student-centered instructional approach, and the complex open-ended problems that provide the basis of the approach are exactly the types of problems the curriculum should be preparing the students to address throughout their careers.

Outcome 3j (know contemporary issues)
If focus problems involve contemporary issues, the students will end up knowing the issues to an extent that no other educational experience could provide.

Outcome 3k (use modern engineering techniques, skills, and tools)
As stated previously, focus problems can be chosen to address any technique, skill, or tool that the instructor wishes to address.

APPENDIX E

Cooperative Learning Methods that Address Outcomes 3a–3k

To use cooperative learning, the instructor should have some or all course assignments (problem sets, laboratory experiments, design projects) done by teams of students that remain together for at least one month and as much as the entire semester. Roles should be defined for team members that rotate from one problem set, lab experiment, or phase of the project to the next. Possible roles are listed below:
• (All settings) Coordinator (schedules meetings, makes sure all team members know what they are supposed to be doing and the deadlines for doing it), recorder (coordinates preparation of the


work out an agreement that addresses everyone’s issues and feelings.

Outcome 3h (understand impact of engineering solutions in a global/societal context)
Use structured controversy [41] to analyze case studies of controversial engineering solutions that have had a global or societal impact. Give each team member or pair of team members a position or possible alternative solution to advocate and material to help them develop arguments for their position (or have them do their own research, which will also address Outcome 3i), and then have them argue their positions in an intra-team debate. After each side has made its case, have them work as a team to formulate and justify a consensus position.

Outcome 3i (recognize need for and be able to engage in lifelong learning)
Using cooperative learning in any way at all moves students away from depending on teachers as resources and toward relying on themselves and their peers, the principal resources for lifelong learning. Having to work in CL teams promotes recognition of the need for independent and interdependent work, and the experience of doing so promotes the ability to do so successfully.

Outcome 3j (know contemporary issues)
Require teams to make up problems that place course content in the context of contemporary issues (which also addresses Outcome 3e). The issues may relate to professional or ethical dilemmas (Outcome 3f) and/or global or societal issues (Outcome 3h). In subsequent assignments, have teams solve other teams' problems.

Outcome 3k (use modern engineering techniques, skills, and tools)
In any group, some students are likely to have greater computing skills than their teammates have. If computer applications are included in course assignments done by cooperative learning teams, the novices will benefit from one-on-one tutoring from their more experienced colleagues, and the latter students will receive the depth of learning that results from teaching others. The same argument can be made for any engineering technique, skill, or tool.
