Successful Assessment Grant Proposal: Composition Learning Community

Focus: Integrating student demographics with revised course-based outcomes data

 

Project Information


Project title:  WSU Composition Learning Community: How LC participation during BC and ICN courses impacts students' academic success

Contact person: Nicole Varty

Role/Position: Principal Investigator; Senior Lecturer; Co-coordinator of the CLC

Department/Unit: English

Program name: Composition

Participating colleagues/collaborators:

Name | Department/Unit
Jule Thomas | English
Austin VanKirk | English
Sarah Primeau | English
Adrienne Jankens | English

Reason for proposal: 


1. How is program-level assessment currently done in your program?

The Rhetoric and Composition Studies Program at Wayne State University provides students at the undergraduate and graduate levels with theoretical and practical knowledge of written language. Faculty and students study the teaching of writing, professional and technical writing, writing assessment, computers and writing, research methodologies, and the history of rhetoric and composition. The Rhetoric and Composition Studies faculty have also run a successful Community Writing program for over ten years, and they oversee the operations of the Writing Center, a resource for WSU undergraduate and graduate students of all majors. The program has recently introduced a new minor in technical writing for undergraduate students, as well as student-services-based support in the form of the Composition Learning Community.

Within WSU's general education requirements, the Composition Program is responsible for ENG 1020 (Introduction to College Writing), which fulfills the university's Basic Composition (BC) requirement. It is also responsible for ENG 3010 (Intermediate Writing), ENG 3020 (Writing and Community), and ENG 3050 (Technical Communication I: Reports), all of which fulfill the Intermediate Composition (ICN) requirement.

Currently, the Composition Program conducts regular, robust assessment of its curriculum and learning outcomes across the course sequence. Though not a degree-granting program, Composition maintains its own reporting slot in Compliance Assist and has provided annual assessment data to the General Education Oversight Committee for Gen Ed program assessment. In collaboration with composition instructors, members of the Composition Assessment Committee collect portfolios of student writing, including reflective essays that guide assessment readers through the students' experiences with the course learning outcomes and curriculum. The learning outcomes for each general education course were developed specifically for that course and have been revised based on previous rounds of assessment. Assessment reading of (usually representative) samples of portfolios occurs annually. We use thin-slice assessment methods as well as whole-paper assessment to propel curricular revisions and professional development goals within the program. Originally adapted by scholars in the WSU Composition Program from quantitative methods in behavioral psychology, thin-slice methods select and assess relatively small representative "slices" of longer interactions for multiple raters to score with a common instrument (Pruchnic et al., 2018). Thin-slice methods have demonstrated good-to-excellent inter-rater reliability and can be effective in large-scale direct assessment of evaluative reflective essays, like those we work with in our regular assessment process.
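To illustrate the kind of reliability check that underpins thin-slice scoring, the sketch below computes one common agreement summary (mean pairwise weighted kappa) for multiple raters scoring the same slices. The rater names and rubric scores are invented for illustration and do not reflect our actual instrument.

```python
# A minimal sketch of an inter-rater reliability check for thin-slice
# scoring; rater names and 1-4 rubric scores are hypothetical.
from itertools import combinations

import numpy as np
from sklearn.metrics import cohen_kappa_score

# Each list holds one rater's scores for the same eight portfolio slices.
scores = {
    "rater_a": [3, 2, 4, 3, 1, 2, 4, 3],
    "rater_b": [3, 2, 4, 2, 1, 2, 4, 3],
    "rater_c": [3, 3, 4, 3, 1, 2, 3, 3],
}

# With more than two raters, the mean pairwise weighted kappa is one
# simple summary of agreement across the rating team.
kappas = [
    cohen_kappa_score(scores[a], scores[b], weights="quadratic")
    for a, b in combinations(scores, 2)
]
print(f"Mean pairwise weighted kappa: {np.mean(kappas):.2f}")
```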

2. What, specifically, needs improvement with respect to those assessment practices, processes, or instruments?

The process of assessing composition courses continues to work well for curricular development and revision. However, our current assessment practices do not capture the impact of the pedagogical and service-based supports we have put in place for BC and ICN students over the past several years, specifically the Composition Learning Community (CLC). Currently, the CLC conducts regular assessments (via surveys) under the direction of the WSU Learning Communities office. These assessments allow us to monitor our functions as a learning community, but so far they have not been able to tell us anything specific about the impact of CLC participation on student academic success and retention within the Composition Program. For example, while we regularly assess the BC and ICN learning outcomes across all sections and collect survey data on the function of the CLC, our current assessment protocols cannot tell us whether participation in the CLC leads to higher levels of success in BC/ICN courses. We have data on student interactions with peer mentors and CLC instructors, as well as some qualitative data on student participation in the CLC's Writing Showcase, but these data do not connect CLC participation to students' success in the Composition Program. This gap becomes increasingly problematic when coupled with our growing need for budget increases for peer mentors, as well as our desire to revise training materials and the community framework based on direct assessment of that success (or lack thereof).

Additionally, recent learning community research suggests that meaningful learning community work should be designed to support knowledge transfer (Camp & Bolstad, 2011). Research also suggests that traditional assessment methods for events like the Writing Showcase tell us far too little about the empirical validity of such events and should be broadened to include more mixed methods (Carter & Gallegos, 2017). Assessing student participation (and non-participation) in the CLC in connection with student academic success in the Composition Program will therefore allow us to either maintain or revise our current learning community practices and our training for instructors and peer mentors. Finally, we need a system of assessment that uses more mixed-method approaches, both to increase the validity of our current assessment practices and to maintain or revise the CLC-sponsored Writing Showcase to maximize its impact for students. We have much of the data needed to begin this process (i.e., STARS and current program assessment data), but no current support to facilitate the actual assessment and analysis steps. Thus, we require support for the development of systematic assessments that provide clear evidence of the CLC's impact.

3. What factors or conditions have contributed to the area(s) needing improvement?

The conditions contributing to this need for improvement are largely that our current assessment processes do not show us the comparative impact of CLC participation on metrics such as the productive grade rate and/or retention of students in LC sections of key general education courses like ENG 1020 and ENG 3010. To determine (a) the continuation and sustainability of the CLC and (b) the potential growth of the CLC as a support system for BC and ICN students, robust assessment is needed.

Proposed actions and expected impact: 


1. What steps will you take to improve your program's outcomes assessment practices, processes, or instruments if you receive funding? Be as specific as you can about the link between needed improvement(s) and your proposed actions.

We will pilot assessment methods that specifically aim to correlate data across program and student-service (namely, CLC) goals. We will do this by cross-referencing data we already have (via current program assessment and STARS) with data gathered through instruments specifically designed for our program's learning community.

First, we will generate surveys for students and peer mentors to understand the impact of interventions such as conferences with peer mentors and participation in CLC composition courses on student success in BC and ICN courses. We will distribute pre- and post-semester surveys to both establish a baseline and measure student engagement over the course of the semester. Additionally, we will develop a brief interview protocol to assess student participation in and engagement with the Writing Showcase as a CLC-sponsored event.
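As one possible analysis of the pre- and post-semester surveys, the sketch below compares paired engagement scores for the same students at the two time points. The scores and scale are hypothetical placeholders, not our actual survey items.

```python
# Hypothetical sketch of a pre-/post-semester survey comparison;
# the paired engagement scores below are invented for illustration.
import numpy as np
from scipy import stats

# Mean engagement score (e.g., across several Likert items) for the
# same eight students at the start and end of the semester.
pre = np.array([2.8, 3.0, 2.5, 3.2, 2.9, 3.1, 2.4, 3.0])
post = np.array([3.4, 3.1, 3.0, 3.6, 3.2, 3.5, 2.9, 3.3])

# Paired t-test on within-student change; a Wilcoxon signed-rank test
# (stats.wilcoxon) is the nonparametric alternative for ordinal data.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"Mean change: {np.mean(post - pre):+.2f}, "
      f"t = {t_stat:.2f}, p = {p_value:.3f}")
```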

Second, we will cross-reference this survey and interview data with STARS data using COGNOS. We plan to work with a COGNOS report writer, who will be able to customize assessment reports in which students' demographics, STARS records, coursework, and use of services can be analyzed for possible trends and correlations. These actions will help us ascertain where the learning community framework and events may need revision to support Composition Program goals, as well as where the learning community is already supporting them successfully.
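Because COGNOS reports can be exported to flat files, the cross-referencing step can also be prototyped outside COGNOS. The sketch below shows one way to join survey responses to exported report data; all file names, column names, and the anonymized identifier are hypothetical, not actual STARS or COGNOS field names.

```python
# A minimal sketch of cross-referencing survey data with an exported
# COGNOS/STARS report; every file and column name here is hypothetical.
import pandas as pd

surveys = pd.read_csv("clc_survey_responses.csv")  # survey instrument data
records = pd.read_csv("cognos_stars_report.csv")   # demographics and grades

# Join the two sources on an anonymized student identifier.
merged = surveys.merge(records, on="student_id", how="inner")

# Example trend check: productive grade rate by CLC participation,
# assuming productive_grade is coded 1 (A-C) or 0 (otherwise).
rate = merged.groupby("clc_participant")["productive_grade"].mean()
print(rate)
```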

Analyzing the combined data will also involve Chi-Squared Automatic Interaction Detection (CHAID) analysis to find important interactions among the data. To do so, we will need to purchase SPSS, a statistical analysis program, with the CHAID add-on.
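CHAID grows a decision tree by repeatedly testing each candidate predictor's chi-squared association with the outcome and splitting on the strongest one. The sketch below illustrates only that core test on an invented contingency table; SPSS's CHAID procedure automates this across all predictors with multiplicity adjustments.

```python
# Illustration of the chi-squared test at the heart of CHAID splitting;
# the contingency counts below are invented for this sketch.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: CLC participants vs. non-participants.
# Columns: productive grade (A-C) vs. non-productive grade.
table = np.array([
    [84, 16],  # CLC participants
    [70, 30],  # non-participants
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f} (dof = {dof})")
# CHAID would run this test for every candidate predictor and split on
# the one with the smallest (Bonferroni-adjusted) p-value.
```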

2. How will your assessment practices, processes, or instruments improve as a result of your project?

Our assessment practices will improve by giving us a window into not only direct curricular revisions that are needed, but also indirect student-service revisions that may be needed. The project will also provide an empirically sound assessment process for the CLC and its sponsored events, which will help us effectively revise training and structure and may also help us argue for funding. Finally, this pilot will demonstrate the effectiveness and potential sustainability of our instruments for continuing to assess the impact of the CLC in the future.

Assessment expertise:


1. What experience in assessment does your team bring to the project? (NB: The goal of this item is to understand the existing assessment expertise among the collaborators. Prior experience in assessment is not a requirement of the grant.)

Nicole Varty, Jule Thomas, and Adrienne Jankens each have eight years of experience designing and facilitating in-program assessment of curriculum (ENG 1010, ENG 1020, ENG 3010), leading to curricular revisions, development of learning outcomes and programmatic learning objectives, and optimized assessment methods. This work has included portfolio assessment focused on demonstration of particular learning outcomes, as well as thin-slice assessment methods. Dr. Varty, Dr. Thomas, and Dr. Jankens have conducted directed and content analysis of student interview transcripts and student writing during IRB-approved research projects. All three scholars have also used Dedoose, a cross-platform web application for mixed-method research. Dr. Thomas annually assesses the WRT Zone's student demographics and usage to evaluate student needs and to develop training and materials. The data collected have resulted in grants for tutoring hours, graduate tutors, an ESL tutor, and the creation of the WRT Zone, totaling $200,000.

2. What assistance, if any, do you need from experts in assessment or in other areas to carry out your project and improve your assessment practices or instruments? Examples might include survey, test, or activity design support, statistical analysis, etc.

With the exception of Dr. Thomas, no members of the team have experience with SPSS or with CHAID statistical analysis. We therefore request support in learning and using SPSS for our data mining and CHAID for analysis and assessment.

Deliverables, timeline, responsible parties: 


Deliverable | Responsible party | Completion date
Hire of Research Assistant | Nicole Varty | August 2019
Hire of Student Assistant | Nicole Varty | August 2019
Development of surveys and interview protocols | Nicole Varty, Jule Thomas, Adrienne Jankens, Sarah Primeau, Austin VanKirk | August 2019
Development and creation of COGNOS data report and pilot | Nicole Varty, Jule Thomas, Adrienne Jankens, Sarah Primeau, Austin VanKirk | September 2019
Online SPSS tutorials | Nicole Varty, Adrienne Jankens, Sarah Primeau, Austin VanKirk | September 2019
Collection and initial analysis of data | Nicole Varty, Jule Thomas, Adrienne Jankens, Sarah Primeau, Austin VanKirk | September-November 2019
Mid-year assessment of data and trends | Nicole Varty, Jule Thomas, Adrienne Jankens, Sarah Primeau, Austin VanKirk | December 2019
Report of progress | Nicole Varty | February 2020
Collection and further analysis of data | Nicole Varty, Jule Thomas, Adrienne Jankens, Sarah Primeau, Austin VanKirk | January-April 2020
End-year assessment of data and trends | Nicole Varty, Jule Thomas, Adrienne Jankens, Sarah Primeau, Austin VanKirk | April 2020
Use of findings for peer mentor and instructor training, material development, and strategies for student events and support | Nicole Varty, Jule Thomas, Adrienne Jankens, Sarah Primeau, Austin VanKirk | May-July 2020
Use of findings for publication in Composition Studies or an NCTE journal | Nicole Varty, Jule Thomas, Adrienne Jankens, Sarah Primeau, Austin VanKirk | May-July 2020
Completion of grant activities | Nicole Varty | August 2020
Coordination with Cathy Barrette for poster presentation | Nicole Varty | September 2020
Public presentation | Nicole Varty, Jule Thomas, Adrienne Jankens, Sarah Primeau, Austin VanKirk | January 2021

Funding Request: 


Item | Amount
SPSS with CHAID decision tree analysis | $270.00
SPSS Master Class: Online Tutorial | $20.00
SPSS For Research: Online Tutorial | $20.00
Research Assistant #1 for data collection and assessment of peer mentor work (3 hours per week at $12.00 per hour for 30 weeks) | $1,080.00
Research Assistant #2 for data collection and assessment of Writing Showcase event (3 hours per week at $12.00 per hour for 30 weeks) | $1,080.00
Food for lunch during assessment days | $300.00
Poster and printing needs | $230.00
Total | $3,000.00


