Standard 2: Assessment System and Unit Evaluation
The unit has an assessment system that collects and analyzes data on the applicant qualifications, the candidate and graduate performance, and unit operations to evaluate and improve the unit and its programs.
2.1 Assessment System and Unit Evaluation
How does the unit use its assessment system to improve candidate performance, program quality and unit operations?
Development of the Watson College of Education (WCE) Assessment System is guided by involvement of WCE and other University of North Carolina Wilmington (UNCW) faculty, and B-12 education partners. The System is described in the WCE Assessment System Handbook (2.3.a; 2.3.d) and includes candidate, program, and unit assessment with a focus on candidate learning outcomes across all levels. The System is aligned with the WCE Conceptual Framework, the UNCW Mission Statement and North Carolina Department of Public Instruction (NCDPI), NCATE, and national standards. It is facilitated through the use of information technologies including the WCE Database and Collaborative Portal, Taskstream, Digital Measures, and Select Survey. The integration of these technologies with WCE assessment processes and procedures results in a comprehensive and integrated assessment system for monitoring candidate performance and managing and improving WCE operations and programs.
The WCE Database and Collaborative Portal provides a system for collecting and managing data on candidates; faculty; B-12 partnership school districts, schools, administrators, and teachers; and WCE alumni. It allows for the collection, analysis, and use of data on applicant qualifications and the performance of teacher education and graduate program candidates (2.3.a, 2.3.b). It includes applicant information, candidate profiles and coursework, key assessment results, and field experience and clinical practice data. The database supports regular and systematic collection of candidate data for analysis and use to improve candidate performance and the quality of programs, field and clinical experiences, and unit operations. For example, the WCE Office of Professional Experiences uses the database to identify and monitor field experiences and clinical practice for teacher education programs. In 2012-2013, this included 3,781 field experience placements and 348 clinical practice placements. The WCE Professional Development System (PDS) uses the database to monitor training of teachers who supervise interns. In 2012-2013, this included 126 teachers. The PDS also uses the “portal” function of the database to communicate with school partners. The portal allows a level of database access to partners for documenting updates and regular communication. In fall and spring of 2013, the collaborative portal received 1,417 log-ins from B-12 school partners. The database also includes instructor profiles, courses, academic programs, and WCE engagement activities. The WCE can generate program and unit reports in response to internal monitoring and assessment needs, as well as external requests. In addition, faculty use the database as a resource for grant writing and program recruitment. A graphic of the database and a summary of database development are included as exhibits (2.3.a; 2.3.d).
Decisions about candidate performance are based on multiple assessments at admission into programs, at appropriate transition points, and at program completion. The WCE Assessment System includes identification and monitoring of transition points and key assessments for initial teacher preparation and advanced programs: program entry, during program, capstone, and program completion. Descriptions of transition points and key assessments are provided in the WCE Assessment System Handbook (2.3.b). Key assessments common to all programs are the WCE Performance Review Process and monitoring of professional dispositions by faculty and advisors (2.3.a; 2.3.d), capstone experiences, and candidate exit surveys. WCE Alumni surveys and Employer Feedback surveys are also administered as post-graduation measures of candidate performance and program/unit effectiveness. All surveys are administered through Select Survey, a secure, web-based survey tool available to UNCW faculty and staff.
The WCE uses Taskstream’s e-portfolio assessment system for the Program Evidences Folio required for initial teacher preparation programs, for the Master of School Administration (MSA) Portfolio required for the MSA program, and for other faculty-developed course projects and key assessments. Taskstream is a cloud-based system for managing assessment, accreditation, and e-portfolios that allows for secure collection of portfolio artifacts, rubric development, and monitoring and reporting. Candidates and faculty can access submissions and provide and check feedback.
WCE Program Assessment is guided by the UNCW Program Assessment Plan and Report template (2.3.a; 2.3.d), which includes assessment of candidate learning outcomes (aligned with state, UNCW, NCATE, and national standards) and program outcomes, with a focus on using assessment results for program improvement. In 2011-2012, the WCE developed and implemented a new process for collecting, summarizing, and using program assessment data (2.3.a; 2.3.d). As a direct measure of candidate learning outcomes, program faculty conduct a program level review of a sample of culminating/capstone projects to determine overall strengths and areas for improvement in relation to expected learning outcomes. Exit and other surveys are administered as indirect measures of candidate learning. In addition, programs identify 3-5 program outcomes they are striving to achieve. Program Coordinators work with the WCE Assessment Office to gather data related to targeted program outcomes. The program assessment cycle is as follows (2.3.a; 2.3.d):
Late Fall Semester: Finalize program assessment plans for the current academic year.
Spring Semester: Conduct program level reviews, collect program outcomes data; administer candidate exit surveys; analyze data; develop a preliminary program assessment report.
Early Fall Semester: Discuss preliminary program assessment reports in program meetings, identify action steps, and begin assessment planning for the current year; develop and finalize updated program assessment reports, including action steps; disseminate reports to program faculty and WCE administrators, and submit them to the UNCW Office of the Vice Provost and Senior Associate Vice Chancellor for Academic Affairs.
WCE unit assessment includes aggregate candidate and program assessment data, and data generated at the unit level. Common unit level data include Digital Measures data and faculty and staff Professional Development Plans/Reviews; National Survey of Student Engagement, Faculty Survey of Student Engagement, and Collegiate Learning Assessment results disseminated by the UNCW Office of Institutional Research (1.3.k); and teacher quality research reports disseminated as part of the University of North Carolina General Administration Teacher Quality Research Initiative (1.3.k). Digital Measures is an online information management system for collecting, organizing, and reporting on faculty teaching, research, and service activities. In addition to UNCW requirements for faculty to update their activities in Digital Measures annually, the WCE uses Digital Measures reports for annual faculty Professional Development Plans and Reviews, and for reporting on program-specific and unit-wide activity.
The WCE Assessment and Accreditation Committee is charged with facilitating the development of assessment and accreditation processes to allow the WCE to best support candidate learning and success. Membership of the committee includes the WCE Assessment Director (Chair), the WCE Administrative Team (Dean, Associate Deans, Department Chairs), the WCE Director of Technology, and one faculty representative from each department (3-year term). In 2012-2013, the committee focused on WCE Professional Dispositions and the newly adopted WCE Mission and Values Statements and Conceptual Framework. The committee developed a “Dispositions Cycle Template” to be used across WCE degree programs to document how and when candidate dispositions are addressed in each program; a “WCE Conceptual Framework and WCE Categories of Professional Dispositions Alignment” document (2.3.a, 2.3.d) showing alignment of the WCE Professional Dispositions with the WCE Mission and Values Statements; and a summary report from notes generated during a fall 2012 college-wide meeting discussion of WCE Mission and Values integration. The committee also provides feedback on assessment activities, as needed and appropriate. For example, a spring committee meeting was focused on development of the WCE Alumni and Employer Feedback surveys.
The WCE works to eliminate bias in assessments and to establish the fairness, accuracy, and consistency of its assessment procedures and unit operations. Assessment data are triangulated by the use of multiple assessments of applicants, candidates, programs, and the unit from internal and external sources. Specific procedures for key components of the WCE assessment system are described in the WCE Assessment System Handbook, pages 14-15 (2.3.a). The WCE follows UNCW policies, procedures, and practices for managing candidate complaints. UNCW publishes information in undergraduate and graduate student catalogs on equal opportunity, diversity, and unlawful harassment (2.3.e). The Code of Student Life is also disseminated to students, faculty, and staff and includes a Grievance Policy (2.3.e). The WCE encourages candidates to first seek resolution with the person directly involved (e.g., the faculty member). If the issue is not handled to the candidate’s satisfaction, they are encouraged to discuss it with the department chair or supervisor, and if necessary, the WCE Associate Dean for Academic Programs. The Associate Dean and department chairs maintain records of formal complaints and documentation of outcomes (2.3.f). Ultimate appeals for grievances are made to the Vice Chancellor of Student Affairs.
2.2.b Continuous Improvement
- Summarize activities and changes based on data that have led to continuous improvement of candidate performance and program quality.
- Discuss plans for sustaining and enhancing performance through continuous improvement as articulated in this standard.
The WCE Assessment System is guided by the systematic collection and use of meaningful data to support continuous improvement of programs, including courses and clinical experiences, with the ultimate goal of improving candidate learning outcomes and professional success. In particular, the WCE program assessment process is an ongoing assessment cycle that is integrated into WCE and program processes (2.3.a; 2.3.d). Summaries of program changes based on assessment results for 2010-2011 and 2011-2012, along with the individual program assessment reports for those years (which include “actions taken” in response to assessment results), are provided in exhibit 2.3.g. Given the program assessment cycle, program assessment reports for 2012-2013 will be completed in fall 2013 and so are not yet available.
For 2011-2012 program assessment reports, data were collected and analyzed during 2011-2012, actions to be taken were decided upon in fall 2012, and reports were finalized in late fall 2012. The majority of changes made in response to program assessment results were to courses or programs of study (32 changes; see “2011-2012 Program Assessment: Data-based Changes Made to Courses and Programs” in exhibit 2.3.g). For example, Secondary Education licensure candidates indicated on the Exit Survey a need for more time in schools to increase their comfort level in the classroom and provide them with more opportunities to interact with high school students. Therefore, additional field experience and tutoring opportunities were provided in SEC 320: Field Experience Block 2. For the Language and Literacy M.Ed. program, faculty review and discussion of key course and program projects revealed that candidates needed research knowledge and skills specific to language and literacy; thus, a “Research in Language and Literacy” course was developed and piloted. For the Secondary Education M.A.T. program, faculty discussion of program assessment results led to the decision to provide more options for content-specific coursework. EDN 595/EVS 592: Island Ecology for Science Educators was developed as one such option for Science M.A.T. candidates. Based on 2010-2011 program assessment reports, the majority of changes made in response to assessment results were also to courses or programs of study (32 changes; see “2010-2011 Program Assessment: Data-based Changes Made to Courses and Programs” in exhibit 2.3.g). Thus, WCE faculty regularly use program assessment results to identify needed changes in coursework and programs of study to improve programs and candidate learning outcomes.
Other common data-based changes identified in 2011-2012 program assessment reports were related to assessment (27 changes) and the culminating/capstone project (23 changes). Changes in both of these areas increased from the previous year: in 2010-2011, only 8 changes related to assessment and 1 change related to the culminating/capstone project were made based on assessment results. The increase in changes in these areas is likely due to full implementation of the Program Evidences Folio and MSA Portfolio (capstone projects) in 2011-2012, and to the revised program assessment process, which requires program-level review of capstone/culminating projects as direct measures of candidate learning, as well as program meetings specifically focused on discussion of assessment results and implications. For example, when discussing assessment results in their program meeting, Educational Leadership and Administration Ed.D. program faculty decided to include a post-coursework survey, in addition to the candidate exit survey, to better differentiate candidate outcomes related to coursework from those learned through the dissertation process. Further, when focusing on culminating/capstone projects as direct measures of candidate learning, faculty often discovered issues with the projects. For example, Elementary Education faculty decided to revise the directions for Evidence 5 in the Program Evidences Folio to ensure that candidates properly conduct and better understand authentic assessment. Thus, the program assessment process has resulted in changes to assessment practice, increasing the fairness, accuracy, and consistency of assessment data and, in turn, its usefulness for candidates, faculty, programs, and the unit.
As previously mentioned, the WCE developed and implemented a new program assessment process in the 2011-2012 academic year. The process has already resulted in meaningful data-based changes to programs designed to support the goal of continuous improvement. As the process continues and is evaluated and modified as needed, program-level assessment, including assessment of candidate learning outcomes and program outcomes, will become more integrated into the work of the WCE. Currently, degree programs, licensure-only programs, and program minors participate in the common program assessment process. The next phase of implementation will include development of a similarly focused program assessment/evaluation process for other WCE programs and areas such as WCE International Programs, Youth Programs, the PDS, and the Education Laboratory. Although these programs currently collect and use data on their effectiveness, their processes will be reviewed and revised as needed to strengthen program assessment and evaluation. For example, in 2012-2013, the Assessment Director worked with the PDS Director and a work group of key stakeholders to conduct an evaluability assessment of the PDS. The evaluability assessment will provide information needed to develop a comprehensive evaluation plan for the PDS, along with initial formative data that the PDS can use to measure current program initiatives.
In addition, development of the WCE Database and Collaborative Portal is ongoing and responsive to stakeholder needs, including internal and external requests and requirements for information and reporting. During the 2012-2013 academic year, 33 projects were completed to make modifications/additions to database functions (2.3.a; 2.3.d). Future projects include development of a grants database to collect needed WCE grant information; a faculty expertise database function to warehouse faculty expertise data that can be matched with graduate candidate needs for advising (e.g., thesis and dissertation committees); and a database function to create a report that shows initial teacher preparation program candidate progress in coursework and the Program Evidences Folio, which requires loading from Taskstream and the UNCW Banner database system. Other efforts will focus on WCE faculty access and use of the database. Levels of access and training will be provided to faculty so they can access database reports useful for monitoring and assessment, recruiting and advising, and research and grant proposals. Currently, the Assessment Office provides this information by faculty request.
Finally, the WCE adopted a new Mission Statement, Values Statements, and Conceptual Framework in 2012-2013. Work has begun to ensure that programs, courses, exit and other surveys align with the Statements and Framework. For example, a fall 2012 college-wide meeting was devoted to facilitated discussion of the Mission and Values statements, guided by two questions: What are we doing that supports the Mission and Values Statements? What should/could we be doing in support of the Mission and Values Statements? The results of that discussion were analyzed and summarized by the WCE Assessment and Accreditation Committee and will be used as a springboard for identifying action steps in a fall 2013 college-wide meeting. The Committee has also documented alignment of the WCE Conceptual Framework with the WCE Professional Dispositions (2.3.a; 2.3.d).