FAO FAQ

Designed for FAOs, this page provides answers to commonly asked questions, as well as links to related resources and information. For more information about program assessment, please contact your school assessment specialist and/or review the academic assessment pages on this website. For questions related to the assessment of graduate programs, please contact the Graduate Assessment Coordinator.

1. What is an "FAO"? What do they do?

2. What is "evidence" of student learning?

3. Where can I access sources of data, aside from direct evidence of student learning?

4. What are "benchmarks"?  How are they determined?

5. What is the difference between course evaluations, program learning outcomes assessment, and periodic program review?

6. What is the purpose of the annual PLO assessment report? Who is the audience?

7. What is the process for submitting the annual PLO assessment report?

8. What is the purpose of PLO report feedback and when can we expect it?

9. What resources are available to support FAOs and program assessment more generally?

10. What advice do experienced FAOs have for new FAOs?

11. What are Core Competencies and how are they related to annual PLO assessment?

12. What are some benefits of annual PLO assessment?


1. What is an "FAO"? What do they do?

Faculty Assessment Organizers (FAOs) coordinate the annual and periodic assessment activities of degree programs, undergraduate and graduate, and of standalone minors. A full description of FAO responsibilities is available on the Faculty Assessment Organizers page under Academic/Annual Program Assessment. The same page provides the current roster of FAOs by degree program.

Information describing the purposes of program assessment is available here. The campus's Principles of Assessment are available here.

2. What is "evidence" of student learning?

Programs are encouraged to gather both direct and indirect evidence of student learning as a means of assessing the extent to which students are achieving the intended learning outcomes of a major or standalone minor.

Direct evidence is the actual work that students complete and through which they demonstrate the knowledge and skills described by the program's learning outcomes. Examples include presentations, papers, questions on final exams, lab reports, senior theses, final performances, design experiences, etc. The forms of direct evidence used to assess student learning will vary with the discipline.

Indirect evidence is information describing factors that affect student learning. This kind of information helps faculty understand "why" and "how" students are learning to the extent they are (as demonstrated via direct evidence).

Evidence that can help programs answer "why" and "how" questions includes student perceptions of their own learning and the learning environment, as well as information about the factors that affect student learning (e.g., the number of papers of a given length that students have been assigned). Example methods for gathering indirect evidence of student learning include surveys, reflective writing assignments, student focus groups, etc.

School assessment specialists can help programs identify and develop both direct and indirect forms of evidence appropriate for the program's learning outcomes and the discipline.

3. Where can I access sources of data, aside from direct evidence of student learning? 

Faculty should consult with their school's assessment specialist, who can provide the data or liaise with Institutional Research and Decision Support (IRDS) or the Registrar for more complex requests (e.g., historical enrollments, degrees awarded, demographics, etc.).

In general, the assessment specialists can provide information on course enrollments and enrollment trends, such as identifying seniors in a given major who are enrolled in specific upper division courses, or data on course sequencing. 

The assessment specialist can also provide school- and program-specific results for campus surveys.

4. What are "benchmarks"?  How are they determined?

Performance benchmarks are the levels of performance that program faculty expect of students at given stages in the degree program; they establish a point of reference for evaluating student learning achievement. For example, a program might establish the expectation that 90% of graduating seniors will be assessed as at or above proficiency in their oral communication skills, as scored on the program's rubric for oral communication.

Benchmarks can be established with reference to an absolute goal set by the program's faculty (as in the example above) or with reference to prior student performance on the task. The results of prior assessments can serve as a minimum standard, and program faculty can set short-term benchmarks for each year that work toward the program's ultimate expectations. For example, a program may have had 70% of students perform at or above proficiency in a prior assessment while its ultimate goal is for 90% to reach proficiency; in the current year, the program might set a benchmark of 80% of students performing at or above proficiency.
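For programs that assess rubric-scored work, the benchmark comparison itself is simple arithmetic. The following is a minimal illustrative sketch in Python; the rubric scale, scores, and 80% benchmark are hypothetical examples, not campus requirements. It computes the share of students scoring at or above proficiency and checks it against a benchmark:

    # Hypothetical rubric scores on a 1-4 scale, where 3 = proficient.
    scores = [4, 3, 2, 3, 4, 3, 1, 3, 4, 2]  # one score per student

    PROFICIENT = 3
    BENCHMARK = 0.80  # example target: 80% at or above proficiency

    # Share of students at or above proficiency.
    share = sum(s >= PROFICIENT for s in scores) / len(scores)
    print(f"{share:.0%} of students scored at or above proficiency")
    print("Benchmark met" if share >= BENCHMARK else "Benchmark not met")

With these hypothetical scores, 70% of students meet proficiency, so the 80% benchmark is not met; the program would then document this result and its planned response in the annual PLO report.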

If programs use a national disciplinary exam for assessment purposes, student performance in peer or aspirational peer programs might be used as a benchmark.  Programs might also look to disciplinary societies or other comparable programs for expectations about student performance. 

When setting a performance benchmark, programs are encouraged to evaluate potential alternatives and develop a rationale for the choice. For instance, establishing the benchmark that 80% of graduating seniors will perform at proficient or better on a particular program learning outcome implies that the program is comfortable with 1 of every 5 graduates not having achieved proficiency at the time they graduate from UC Merced. Conversely, a goal that 100% of graduating seniors will be proficient or better may not be realistic either; it seems reasonable to assume that at least a small fraction of students will perform poorly on a task for reasons unrelated to their actual abilities.

Benchmarking Performance Criteria

Benchmarking rubric criteria is another way to make assessment results more meaningful and informative. Programs may wish to use or adapt rubrics developed by professional organizations (e.g., the AAC&U VALUE Rubrics), nationally recognized programs, or disciplinary societies; a major benefit of doing so is that the criteria reflect the expectations of a large number of faculty experts. For example, when 90% of students score as proficient or higher on such a rubric, a large number of faculty would likely agree that those students were indeed proficient. Using or adapting these types of rubrics also communicates that program graduates possess skills and knowledge that are broadly valued.

For similar reasons, programs might also consider using or adapting rubrics published in the teaching and learning literature of their disciplines. 

5. What is the difference between course evaluations, program learning outcomes assessment, and periodic program review?

Academic program review occurs once every seven years according to the schedule approved by the relevant Academic Senate committee.  A brief summary of periodic program review is available here. Schedules are available here. The program review process is overseen by the Periodic Review Oversight Committee (PROC). 

Annual assessment takes place each academic year, with each year focusing on a different program learning outcome or an assessment question meaningful to the program. As per program review policy, annual assessment results are an important foundation of the program review process; they are reviewed and summarized as part of the program's self-study for program review.

Course evaluations are student evaluations of a specific offering of a course. They are collected for the purpose of providing the instructor with feedback to inform subsequent versions of the course. Course evaluation data typically are not used in annual assessment, but evaluation data aggregated at the level of the program may be part of the data examined in program review.

6. What is the purpose of the annual PLO assessment report? Who is the audience?

PLO reports are intended to document your program’s assessment activities, findings, and related planning, including the use of findings to improve student learning as appropriate. This information is intended to inform periodic program review, the re-assessment of this PLO at a later date, and school and institutional planning processes.

As such, the audience for this information includes present and future colleagues, including lecturing faculty and TAs; campus leadership including deans and PROC; accreditors (WSCUC, ABET); campus staff who support the program (advisors and instructional staff); and potentially, other stakeholders (as relevant to the program) such as students, alumni, industry leaders, and other institutions. 

More specifically, PLO reports document the following:

  • Program assessment methods, including the source(s) of the student work examined, the sample size, the year of the students whose work was examined (junior, senior, etc.), complementary indirect evidence addressing student perceptions of their learning or of the learning environment that supports it (surveys, focus groups, etc.), rubrics, group interview scripts, and so on.
  • The program’s results for both direct and indirect sources of evidence.
  • A clear summary of the program’s conclusions regarding student learning achievements stemming from an evaluation of its results.
  • A clear summary of the program’s conclusions regarding the effectiveness of its assessment methods for generating actionable and trustworthy (i.e. valid and reliable) results describing student learning.
  • As appropriate, revisions to the program’s curriculum, pedagogy, and/or related co-curricular learning opportunities or support services, as indicated by the evidence of student learning or learning support needs.
  • Examples of student work as a reference against which to calibrate future assessment activities and for accreditors as needed.
  • The faculty and staff involved in the program’s assessment activities.

Guidelines for developing the PLO report are available here.

7. What is the process for submitting the annual PLO assessment report?

Annual PLO Assessment Reports are authored by the program FAO for submission to the Periodic Review Oversight Committee (PROC) by the submission date chosen by the program's faculty. With support from the school assessment specialist, reports are first submitted to the relevant dean (prior to the PROC submission date). The dean reviews the reports and authors a memo summarizing the contents of all reports for that submission date, with particular attention to any resource requests. The dean then submits the summary memo and reports to PROC. This process can be seen here in flowchart format. A more complete description of the process is available here.

8. What is the purpose of PLO report feedback and when can we expect it?

The Committee for the Review of PLO Reports provides feedback to programs on their assessment practices, as summarized in the annual PLO Report. This feedback is intended to foster efficient, effective assessment practices that generate meaningful, actionable insights into student learning and success. Toward this end, the committee's feedback is organized as a structured peer review process grounded in campus expectations for effective program assessment practices.

Feedback is provided during the summer for March 1 reports, and in the spring for October 1 reports.

Additional information about the committee's activities, including the reports it generates, is available here.

9. What resources are available to support FAOs and program assessment more generally?

Each school has an assessment specialist available to support the program's annual and periodic assessment efforts. Graduate-level assessment support is provided by the Graduate Assessment Coordinator.

The assessment specialist is the first line of support; each specialist works with the program to identify tools and resources that match its assessment needs. A complete description of the services provided by the school assessment specialists is available here.

Additional assessment-related resources are available under Academic/Annual Program Assessment/Guidelines and Templates on this website. Check back often as we continue to develop the resources portion of this site. Please contact us with ideas for additional resources. 

10. What advice do experienced FAOs have for new FAOs?

Experienced FAOs advised their new FAO colleagues to:

  • seek assistance from school assessment specialists and faculty colleagues.
  • familiarize themselves with the FAO role and responsibilities, and with the campus's assessment processes and practices.
  • start the process early, develop a plan for completing the work, engage with the work regularly, and be organized.

More specifically, they suggested that FAOs:

  1. Get support from your program faculty.
  2. Contact the school assessment specialist as soon as possible, and meet regularly; stay in contact and be responsive. 
  3. Direct questions to your school assessment specialist. 
  4. Familiarize yourself with assessment practices - consult the assessment specialist, attend workshops, and use provided resources. 
  5. Plan ahead, and start the process of assessment early - do not wait until the last minute. 
  6. Know what is expected of your role (FAO responsibilities). 
  7. Consider what you want to learn from assessment, and be flexible in assessment plans when necessary. 
  8. Consider creating and using timelines and checklists to stay on track. 
  9. Use available resources, including those of professional organizations, and documents available on the campus and school assessment websites. 
  10. Stay organized. 

11. What are Core Competencies and how are they related to annual PLO assessment?

Information and resources related to assessing the WSCUC Core Competencies are available here. Assessment of the Competencies is integrated into annual PLO assessment to maximize efficiency as well as relevance to individual programs.

12. What are some benefits of annual PLO assessment?

FAOs have identified a number of ways in which annual assessment adds value to an academic program. These include its positive impact on program planning (85%), faculty communication and group perspective (45%), pedagogy and instruction (15%), curriculum coherence (30%), administrative continuity (5%), student input (25%), and identifying data needs (5%).

When asked what they thought their colleagues found most valuable about assessment, FAOs identified additional benefits, such as the structure that assessment provides, the opportunity to focus on individual learning outcomes, the ability to identify what is working and what is not, and the opportunity for self-assessment and due diligence.