Planning content instructions

What to Write in Each Section of Campus Labs Planning

Planning (formerly Compliance Assist) is the online repository for all program assessment information. Please contact Cathy Barrette at c.barrette@wayne.edu if you would like access to Planning.

This document highlights key information to include in each section of Planning. Please see the Academic Program Assessment Handbook or the Student Services Program Assessment Handbook for more information about best practices and for good examples of each item.
 

NB: If you copy and paste any of the items from this document into Planning, please paste as plain text by right-clicking in the window where you want to paste and choosing "Paste as plain text."

1    MISSION STATEMENT

  • A brief description of the program (not the department), including its:
    • Purpose: Why the program exists, what the program does that distinguishes it from other units or programs
    • Offerings: Opportunities, experiences, and areas of study that students or clients gain from the program and that help them meet program goals
    • Target audience/Stakeholders: Types of individuals or groups that would benefit from the program
  • The mission statement should be:
    • Aligned with the University, division, and department missions
    • Realistic and achievable
    • Written for a general, not expert, audience

1.1    GUIDING QUESTIONS: 
To help you get started on your mission statement, you can discuss the following questions with your colleagues:

  • What need(s) does the program fulfill?
  • What will the students know, value, and be able to do as a result of your program?
  • What does this program uniquely offer that differentiates it from other programs?

1.2    PITFALLS TO AVOID: 
When writing your mission statement, avoid wording that is:

  • Too general to distinguish the program from other programs
  • Not clearly related to the program's learning outcomes
  • Focused on teaching rather than student learning
  • Written for a specialist/expert audience rather than a general audience

2    LEARNING OUTCOMES (PROGRAM-LEVEL)

Statements of the intended results of the program (not of a course or department). For Student Services programs, this item type can also include program goals.

  • Specific, measurable statements of what graduating students or program clients should know, be able to do, believe, or value
    • Derived from the mission statement
    • Focused on the results of learning or participating in your program's offerings, not on the learning process or on teaching
    • Defined with observable action verbs; see Bloom's taxonomy for examples

2.1    GUIDING QUESTIONS FOR LEARNING OUTCOMES IN ACADEMIC AND STUDENT SERVICES PROGRAMS:
To help you identify your program's learning outcomes, consider the following questions: After participating in your program, 

  • what can students do with the information and skills they have learned?
  • what do they value or care about?
  • what kinds of job skills do they take into the workforce and the community?

(Choose a few of your answers for your first round of learning outcomes and add others later.)

2.2    ADDITIONAL GUIDING QUESTION FOR PROGRAM GOALS IN STUDENT SERVICES UNITS ONLY:

  • What operational targets do your program's activities focus on (e.g., retention rates, service rates, satisfaction levels, outreach, accessibility)?

2.3    PITFALLS TO AVOID: 

  • Combining two or more ideas into one outcome
  • Describing an outcome that is not measurable
    • Too vague
    • Too broad or inclusive
  • Writing for a specialist audience rather than a general audience
  • Labeling the outcome without defining it in layman's terms in the description box

 
3    CURRICULUM MAP (OPTIONAL FOR STUDENT SERVICES PROGRAMS)

Identifies the relationship between the program's learning outcomes or goals and the courses students take or the activities stakeholders participate in

  • Which course(s) or activities contribute to each learning outcome?
  • How do courses/activities build on one another over time to support students' learning or success?
  • Identify where in your program each learning outcome is:
    • I-Introduced
    • D-Developed/Practiced/Reviewed
    • M-Mastered
  • Or where your program goals are:
    • In high focus and explicitly supported
    • In lower focus and explicitly supported
    • Implicitly or incidentally supported

3.1    GUIDING QUESTIONS

  • Which course(s) or activities intentionally develop each learning outcome?
  • What is the relationship between learning experiences? For example, does one course provide the foundations for another?

A template and an example of a curriculum map are available online at http://wayne.edu/assessment/document/. 

4    ASSESSMENTS

Data sources (information, evidence, metrics, performance indicators, proof) that demonstrate whether students are learning what your program intends at the desired level

  • Draw on information you already have to make data collection more practical and less time-consuming
    • One source might serve as data for multiple learning outcomes, but shouldn't overlap completely
  • Direct assessments (e.g., exam items, projects, presentations) are an important foundation; indirect assessments (e.g., surveys, interviews, institutional data) provide complementary information to make your data more robust.

4.1    GUIDING QUESTIONS
To help you select appropriate sources of evidence or data, consider the following questions:

  • What information does the assessment provide that helps identify how well students are meeting expectations for a particular learning outcome?
    • Does it include extraneous information that will bias the data? (Global scores like course grades conflate performance across multiple learning outcomes, for example, so typically don't provide useful information about any individual outcome.)
  • What criterion level of performance will you set (e.g., 85% pass rate, 75% score, 80% agree or strongly agree)?
  • Is it practical to gather this information (not too time-consuming or costly)?

4.2    PITFALLS TO AVOID: 

  • Using an overall exam or project score that is affected by performance beyond what's included in the learning outcome
    • You can use a score from an appropriate section or just from some items, however!
  • Scoring criteria that don't align with the target learning outcome(s) (e.g., deduction for a late submission)

4.3    ASSESSMENT ITEMS HAVE SIX PARTS:
4.3.1    In "Assessment Method", please identify:

  1. for which learning outcome(s) this method will provide data/evidence 
  2. what the data source is (scores from exam 3 or presentation 2 in course X,  a survey, etc.)
  3. who the data are collected from (e.g., all students in course X, majors in their final semester)
  4. how the data will be gathered and by whom
  5. how often/when data will be gathered
  6. who will evaluate/score it
  7. what criteria will be used to evaluate/score it
  8. what the evaluation scale is (percentage? proportion that strongly agree? 0-5? Pass/Fail?)
  9. the criteria for acceptable performance (e.g., 85% pass rate, an average 75% score, 80% agree or strongly agree) 
  10. who will review (analyze, interpret) the results and when they will be reviewed

4.3.2    In "Results", please provide:
An objective statement of the degree to which students met the program's performance criteria for each learning outcome and any contextual explanations

  • A summary of scores or responses for the group
    • Concrete, specific information (e.g., "63% of students met the criterion for assessment 1")
    • Context for interpretation (e.g., "Only 15% of students participated", "Canvas locked students out midway through the assignment", "This is our first use of the revised rubric and not all instructors completed the training for it.")
  • A statement of whether the results met, failed to meet, or exceeded the target or criterion level of performance.
  • A list or data file of individual scores, when possible, to support your summary
    • Omit student identifiers (e.g., names, ID numbers) 

4.3.3    "Results from Surveys Delivered through Baseline" is optional:

  • If you have conducted a survey in the online survey software named Baseline, you can directly connect your Baseline survey to your assessment plan in this section.

4.3.4    In "Program Action Plan", please:

  • Identify at least one area of the program or of the assessment plan that will be monitored, remediated, or enhanced.
  • State at least one logical step the program will take in response to that area to improve or monitor the program.
  • Identify a person or group responsible for carrying out each step of the action plan.

4.3.5    In "Timeline for Action Plan Implementation", please:

  • Specify the program's schedule for implementing the action plan and re-assessing the learning outcome.

4.3.6    In "Reporting to Stakeholders", please state:

  • The program's plan for communicating the results of the program assessment to students and other stakeholders (e.g., students' parents, administrators, community members and supporters)
  • Audience-appropriate information about what you did, why, what you found, and how and when you will use it.
    • Format and content may vary across groups
  • How, where, and when you will disseminate the report to your stakeholders
    • Program websites are a good venue for passive sharing of the report, but think about proactive engagement with stakeholders as well.

5    ASSESSMENT PLANNER

An "assessment planner" is any individual in your program who is actively engaged in designing, managing, or responding to the program assessment process. This work ranges from setting the mission statement, learning outcomes, or assessment methods to coordinating the curriculum map, analyzing or interpreting data, planning actions based on the results, and communicating with colleagues and other stakeholders about any phase of that process.
5.1    GUIDING QUESTIONS

  • Who are the key individuals in your program who actively design, manage, or respond to the program assessment process? (There are options for primary and secondary planners to help direct communications, support, and outreach effectively.)
  • Who leads your program's assessment efforts? (Select "primary contact" for these people.)
  • Who collaborates with or supports your primary planner(s)? (Select "secondary contact" for these people.)

Please submit one item for each person and update the item(s) as needed. The item is short and quick to complete, asking for first and last names, email, primary vs. secondary contact, and position/status (e.g., faculty, staff, student).