FFR Addendum

 

Standard 1: Content and Pedagogical Knowledge

Task 1A(1). CAEP Feedback: Provide examples of specialty dispositions results by programs and use for improvement.  

EPP Response: Disposition ratings are collected using the UAH Professional Disposition Rating Form. During the first semester in the education program (Block 1), this instrument is completed by each candidate’s field experience cooperating teacher and the UAH faculty members who teach the Block 1 courses (ED 301, EDC 301, ED 307, ED 308, EDC 311). Additionally, candidates complete a self-evaluation and submit the Dispositions Form. Dispositions data from the cooperating teachers can be found in “1.1.E Students Dispositions Scored by Cooperating Teachers in Field Experiences”. Data are disaggregated by program; however, many programs are not represented because no candidates were enrolled in those programs in a given semester. We have provided a new table to present the faculty’s disposition ratings as well as candidates’ individual ratings of their own dispositions. See FFR-Attachment 1a and FFR-Attachment 1b.

The EPP faculty meet each semester and review the dispositions ratings, as well as candidates’ academic performance, when determining admission decisions for the Teacher Education Program (TEP). During this process, any candidates with scores of 2 or lower in any dispositions area are discussed among the entire faculty. Candidates are either “fully admitted”, “admitted with a Professional Development Plan (PDP)”, or “not admitted”. The dispositions are used to determine both non-admittance and specific focus areas for the PDP. Anecdotally, the most common areas for improvement addressed in PDPs are “written communication”, “oral communication”, “intellectual curiosity”, “reliability/dependability”, and “attendance/punctuality”.

Candidates admitted with a PDP are assigned to individual faculty advisors in their licensure areas who develop a plan of action for the candidate and ensure the plan is actively monitored and its identified outcomes are achieved. The faculty advisor meets with other EPP faculty to discuss the candidate’s progress in meeting the PDP activities. Candidates may be required to attend mandatory meetings with their faculty advisor, visit the UAH Student Success Center for academic support, or complete additional activities to address identified areas for improvement. Holistically, the EPP faculty use the data to determine whether there are specific areas that should be addressed throughout the identified candidates’ progression through the program. For example, written expression is often an area of concern for candidates; therefore, we have increased the number of writing opportunities in our courses and expanded our feedback to provide substantive and instructive writing feedback. See FFR-Attachment 2.

When reviewing candidates’ self-evaluations of dispositions from fall 2017-spring 2019, mean scores aggregated across all licensure options were 3.0 or higher, indicating candidates consider themselves “proficient”. The lowest program-level mean was in the category of “Oral Communication” (2.96 for instrumental music). The highest means were in the categories of “Professional Appearance,” “Respect for All Learners,” and “Multicultural Sensitivity.” Overall, elementary education candidates rated themselves higher on all dispositions than secondary education candidates. See FFR-Attachment 1b.

 

Task 1A(2). CAEP Feedback: Provide further evidence to ensure that candidates have dispositions assessment throughout initial and advanced programs.  

EPP Response: Dispositions of all candidates enrolled in the initial programs are assessed each semester as the cooperating teachers complete and submit the dispositions form. Education faculty meet biweekly to monitor and review all candidates on PDPs, as well as those who might require a PDP. All candidates are assigned faculty advisors upon admission who consistently monitor dispositions and professional development. Candidates enrolled in the advanced program are monitored through database queries run at the end of each academic term to identify academic issues. If a candidate struggles with any disposition in a course, the instructor alerts the Department to discuss and determine if a PDP is needed. The advanced program is online, and candidates do not complete the five-week courses in a particular sequence. The candidates are certified teachers, and PDPs are created only on an “as needed” basis based upon faculty feedback (e.g., not participating in group work, written or oral communication issues). Effective Spring 2020, EPP faculty will complete a dispositions assessment on initial and advanced candidates upon completion of each course as they progress through the programs.

 

Task 2A(1). CAEP Feedback: Is this project assessed by the cooperating teacher, university supervisor, or both?  

EPP Response: The Candidate’s Impact on Student Learning Project is completed in the cooperating teacher’s classroom, but it is only assessed by the assigned university supervisor.  

 

Task 2A(2). CAEP Feedback: Fall data of 2018 (N-12) are provided only.  

EPP Response: The Candidate’s Impact on Student Learning Project was first implemented in Fall 2018. This assessment was created in response to the ALSDE CIEP (state program approval) process as a new internship assignment focused on each candidate’s impact on student learning in the second classroom placement. The second cycle of data (Spring 2019, n=10) is provided. Based upon the pilot-year implementation and feedback from university supervisors and cooperating teachers, the assignment and scoring rubric were refined prior to Fall 2019 implementation. The third cycle of data (n=18) will be collected by November 2019 and prepared for analysis upon completion of the Fall 2019 academic semester. See FFR-Attachment 3.

 

Task 3A(1). CAEP Feedback: What kind of assistant [sic] are candidates receiving to improve in these areas?  

EPP Response: This particular assignment is completed in EDC 311, a course in the first block of courses for all educator preparation programs. To further develop candidates’ understanding of the theoretical framework, they begin developing a theoretical framework notebook when enrolled in ED 308, Educational Psychology. Candidates build the notebooks throughout the program as they progress through the methods courses, so the notebooks become more specific and discipline-focused. Theory is also emphasized in the internship during edTPA seminars. Dr. Fran Hamilton, faculty instructor for EDC 311, is currently engaged in an IRB-approved study with interns focused on lesson planning to gather further data on how interns conceptually understand and utilize theoretical frameworks in developing and planning learning segments and instructional units. Upon further analysis of the lower scores related to the Student Work Sample requirement, faculty determined many candidates did not collect or maintain work samples, thus earning no points for this requirement. The other noted concern was that candidates who did provide work samples were in the beginning stages of learning to provide constructive feedback to students.

As candidates move through the professional education and subject-specific methods courses, and finally edTPA and internship, they are provided multiple opportunities to engage in child/student study, collect, comment on and analyze students’ work samples, and engage in tasks which require them to demonstrate the ability to provide substantive and instructional feedback to students. In fall 2019, Dr. Fran Hamilton provided an instructional segment in the intern orientation focused on instructional planning. She is actively engaged in an IRB-approved research study which will follow interns throughout the semester and seeks to assess their developing knowledge and understanding of instructional planning, with particular focus on differentiation.

 

Task 4A(1): CAEP Feedback: Provide evidence of candidate internship assessment over three cycles.  

EPP Response: The Danielson Framework was adopted by the faculty in 2017, a decision prompted by the process of preparing and submitting the initial ALSDE CIEP documents for the June 1, 2018 submission deadline. The Danielson Framework was first implemented in the internship in the Fall 2018 semester. Data are provided for both the Fall 2018 and Spring 2019 semesters. The third cycle of data will be available at the conclusion of the Fall 2019 semester. Prior to Fall 2018, the EPP utilized Form 103, an EPP-created document, to assess candidates in the internship semester. Form 103 data are stored electronically and are available for review during the onsite visit. See FFR-Attachment 4.

 

Task 4B(1): CAEP Feedback: On page 2 of SSR, the count of initial and advanced programs is needed for clarification.  

EPP Response: As requested, an updated table with the initial and advanced programs has been provided. This is an updated version of Table 2: Program Characteristics. To help the reviewers better understand our programs, we listed the programs with the majority of our candidates at the top, followed by our smaller programs. At the bottom, we have listed all of our programs that had no enrollments between Fall 2017 and Fall 2018. These programs are nonetheless “active” and will continue to be active moving forward. The only programs we have determined to make “inactive” are Foreign Language Russian and all of our “middle grade” programs. We closed the Russian program for lack of qualified faculty at UAH, and changes the ALSDE made to the “middle grade” certifications led us to close those programs. See FFR-Attachment 5.

 

Standard 1: Preliminary Recommendations for new AFIs including rationale for each.

1a. CAEP Feedback: Data are not disaggregated by program, so there is limited evidence that all candidates demonstrate proficiency as related to the INTASC standards (component 1.1). Rationale: In that data are not consistently disaggregated by program, only global judgements related to candidates’ proficiency relate to the four areas of the InTASC standards.

EPP Response: This particular data set is presented in “1.1.B edTPA to InTASC Correlation”. All programs that had enrolled candidates during the three cycles were included and presented as disaggregated data. Therefore, any programs with zero (0) enrollments were not included in the data table.

 

Standard A.1. Content and Pedagogical Knowledge

Task 1C(1): CAEP Feedback: What are the benchmarks outlining all key assessments for advanced programs?

EPP Response: Key assessments have been identified for the advanced programs. All advanced candidates complete three foundational courses (ED 530-Applied Multiculturalism, ED 535-Introduction to Applied Educational Research, and ED 565-Introduction to Differentiated Instruction). A key assessment is administered in each of these foundational/core courses. These include an annotated bibliography assignment (ED 530), a literature review assignment (ED 535), and a lesson plan critique (ED 565). In each of the advanced concentrations, one or two courses are identified which require candidates to complete a key assessment. A final key assessment for each advanced candidate is required in ED 690-Master’s Action Research Project. Additionally, ED 696-P-12 Internship is required for the ESOL and Reading Specialist concentrations only. See FFR-Attachment 6.

 

Task 2C(1): CAEP Feedback: How have three cycles of data been disaggregated by licensure areas in advanced programs? 

EPP Response: The Master of Education (M.Ed.) program for advanced programs was launched with the first cohort of candidates beginning in summer 2014. The first cohort of M.Ed. (advanced) candidates graduated in Fall 2015. Key assessments are closely monitored by faculty in the foundation/core courses which candidates complete before entering their selected concentration. Given the small number of candidates in each of the six concentrations (collaborative, elementary, ESOL, reading specialist, secondary, and visual impairments), key assessments are reviewed by all faculty in the foundational courses and by program faculty in each of the concentrations (typically only one faculty member per program). EPP faculty review the performance and progress of advanced candidates in a joint meeting at least once a semester. If any candidate’s performance is of concern, faculty discuss the concern, review performance on key assessments, and determine the appropriate next steps. EPP faculty have found that when candidates struggle with key assessments in foundation/core courses, it often signals the need for academic supports (often in writing) or the need to address dispositional issues.

All advanced candidates complete ED 690, Master’s Action Research Project, which requires candidates to design and implement an action-research project appropriate for their selected concentration (and supervised by a faculty member) in their own classrooms. Candidates identify a research question, complete a literature review, design and implement the intervention/methodology, analyze pre-post data, summarize findings, and identify limitations. The written documentation and evidence are submitted. This, along with a multimedia presentation of findings, constitutes the summative key assessment. ED 690 data disaggregated by licensure area are provided in Evidence A1.1.A. Additionally, key assessment data for the three foundation/core courses have been disaggregated by licensure area and are available for review during the on-site visit.

 

Task 2C(2): CAEP Feedback: What data are available for advanced program assessments?

EPP Response: Key assessments have been created and identified for the three foundational/core courses taken by candidates in all concentrations (ED 530-Multicultural Education; ED 535-Intro. to Applied Ed. Research; ED 565-Intro. to Differentiated Instruction). Additionally, one or two key assessments are identified in courses specific to each of the six concentrations. Finally, either ED 690 (Master’s Action Research Project) or ED 696 (Internship) serves as the capstone or summative key assessment. Candidates are provided detailed instructions and descriptions for each key assessment, along with a detailed scoring rubric. Key assessments are scored by the assigned faculty member and reviewed by program faculty within the concentrations. Data for advanced candidates are typically monitored via academic performance on assignments in required courses. Three cycles of candidate data for the three common foundational/core courses, disaggregated by licensure area, can be shared during the onsite visit. See FFR-Attachment 6.

 

Task 3C(1): CAEP Feedback: How are candidates’ dispositions assessed, and what is the remediation process if needed?

EPP Response: Candidates enrolled in the advanced program are monitored through database queries run at the end of each academic term to identify academic issues. If a candidate struggles with any disposition in a course, the instructor alerts the EPP faculty to discuss and determine if a PDP is needed. The advanced program is online, and candidates do not complete the five-week courses in a particular sequence. The candidates are certified teachers, and PDPs are created only on an “as needed” basis based upon faculty feedback (e.g., not participating in group work, lack of timely submissions, tact and judgment, written or oral communication issues). In addition to the once-a-semester review meeting, the EPP decided in September 2019 to begin intentionally reviewing and discussing any concerns related to dispositions of advanced candidates as a routine agenda item during faculty meetings. The University is also piloting implementation of Degree Works in the 2019-2020 academic year as a means of tracking and noting any advising concerns, such as dispositions, related to enrolled candidates. EPP faculty members serve as advisors to their assigned education candidates.

 

Standard 2: Clinical Partnerships and Practice

Task 1C(1): CAEP Feedback: Is there additional evidence to demonstrate that this co-construction occurs?  

EPP Response: The EPP faculty, administrative team, and Coordinator of Field and Clinical Experiences work with partners to co-construct mutually beneficial P-12 school and community arrangements. This is evidenced in the meeting agendas of the Education Advisory Council, where candidates’ assessment data are shared (e.g., edTPA) and assessments such as the Dispositions Assessment and Danielson Framework are reviewed and analyzed. The MOUs formalize partnerships with area P-12 school districts. Regular meetings with building-level and district-level administrators provide additional opportunities to consider approaches to co-construct mutually beneficial arrangements. Additional evidence can be shared during the on-site visit, or interviews can be scheduled to confirm co-construction of mutually beneficial P-12 school and community arrangements. See FFR-Attachment 14.

 

Task 1C(2): CAEP Feedback: What data are now available for new assessments?

EPP Response: The Danielson Framework was selected by the faculty as the state program documents (CIEP) were in preparation in 2017-2018. It was first utilized in Fall 2018. Data are available for the Fall 2018 and Spring 2019 semesters. Assessments are currently being collected for Fall 2019. Prior to Fall 2018, interns were assessed using Form 106. The EPP faculty selected the Danielson Framework because it provides more instructive feedback to candidates, is a valid and reliable instrument, and aligns to indicators across a wide variety of licensure options. The Candidate’s Impact on Student Learning Project was first implemented in Fall 2018. It is completed in the second classroom placement during the internship. Data are available for Fall 2018 and Spring 2019. See FFR-Attachment 4.

 

Task 2A(1): CAEP Feedback: Can you clarify how you address positive impact within Standard 2? Has data been obtained to support this positive impact? What further evidence is available? 

EPP Response: Alabama does not currently implement a statewide teacher observation instrument or teacher effectiveness model; further, Alabama anticipates implementing a new state assessment in Spring 2019. The EPP intentionally cultivates strong partnerships with its school partners to identify and support clinical educators who demonstrate a positive impact on candidates’ development and P-12 student learning and development. In Madison City Schools and Huntsville City Schools, the EPP interfaces directly with individual school leaders to identify cooperating teachers and place candidates in classroom settings. All cooperating teachers must have at least three years of teaching experience and a Class A (Advanced) teaching license. In Madison County Schools, all cooperating teachers and assigned classrooms are identified by school district personnel in the central office.

The Coordinator of Field and Clinical Experiences works in collaboration with key stakeholders to clearly articulate the needs for cooperating teachers, identify recommended effective teachers for candidates and interns, and use feedback from university supervisors, EPP faculty, and candidates/interns to inform future selection. Additionally, candidates and interns complete a survey to provide feedback and share insights regarding their assigned clinical educator. This allows the EPP to assess candidates’ perceptions concerning the ability of the clinical educator to positively impact their development. When concerns arise, a point of contact identified in each district and/or school is notified, and appropriate actions are taken to address any concerns.

Clinical partnerships and practice are designed to ensure candidates are provided opportunities to demonstrate their positive impact on all students’ learning and development. In the second classroom placement of internship, each candidate completes an Impact on Student Learning project. Candidates conduct a baseline assessment in their assigned classroom to analyze patterns of learning. This is used to inform planning of instructional segments or interventions aligned to identified student needs. Candidates write an instructional learning plan, implement the plan, conduct a post-assessment, and analyze their impact on student learning. A reflection paper of insights and lessons learned is also a required component. Prior to the internship, candidates complete similar projects as they progress through their methods courses, reading/literacy courses, and other field experience assignments.
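
At its core, the analysis step of this project is a baseline-to-post comparison. The sketch below illustrates one minimal way a candidate might summarize class-wide gains; the student scores are invented for illustration and are not EPP data.

```python
import statistics

# Invented pre/post scores for one class segment; illustrates the kind of
# baseline-to-post comparison the Impact on Student Learning project requires.
pre  = [55, 60, 48, 70, 62]
post = [72, 75, 66, 85, 70]

gains = [after - before for before, after in zip(pre, post)]
print("mean gain:", statistics.mean(gains))                      # average raw gain
print("students who improved:", sum(g > 0 for g in gains), "of", len(gains))
```

In practice, candidates pair this kind of summary with the instructional learning plan, post-assessment analysis, and written reflection described above.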

 

Standard 2: Preliminary Recommendations for new AFIs including rationale for each.

AFI 2.1: CAEP Feedback: Provide evidence of co-construction of instruments and evaluations for candidates and of the mutual benefits of relationships between the EPP and local schools (component 2.1). Rationale: There is no evidence presented to demonstrate co-construction of instruments and evaluations. There is minimal evidence of mutual benefits to the EPP and local schools.

EPP Response: All partnerships are created and nurtured with the goal of mutual benefit to the EPP faculty and candidates, the P-12 students, the cooperating teacher, the partnering administrator, and the school district. One exercise was to review the Strategic Improvement Plans of the primary partnering P-12 districts in collaboration with representatives from each to identify possible ways the EPP might support efforts in each local district. An online course was created by the EPP to provide a toolbox of professional development and informational resources, delivered through the EPP’s Learning Management System (LMS), to cooperating and mentor teachers hosting both field experience candidates and interns. Local school district personnel shared video resources on instructional coaching to support the effort. Further, several EPP faculty members are engaged in LETRS training (a statewide literacy professional development initiative) alongside in-service teachers in partner districts. Outstanding mentor teachers and partner school districts are recognized each semester at the intern recognition reception based on nominations submitted by university supervisors and interns.

The Teacher Education Advisory Council meets at least once each semester and provides another means to collaborate and develop meaningful partnerships with clinical educators and clinical experiences. In recent meetings, the Advisory Council has explored such topics as edTPA (required for both interns and teachers hired on provisional licenses in Alabama), expectations for novice teachers, and critical needs in the districts for teachers, educational leaders, and students and their families. Strategies have included carousel discussions, table talk, brainstorming sessions, exit tickets and more. One particularly useful discussion in Spring 2018 focused on how the EPP might address the expressed needs of the P-12 partners or provide meaningful support for successful initiatives underway in the districts. Members of the Advisory Council have also commented on the benefits they derive from interacting with other teachers, principals, administrators and instructional partners serving in surrounding area districts. 

The Advisory Council has engaged in learning about and reviewing the Dispositions Assessment used by the EPP, as well as the Danielson Framework. P-12 partners have participated in the process of establishing content validity via the Lawshe Method for the Dispositions Assessment, as well as reviewing the Danielson Framework. In the November 2019 meeting, the Advisory Council will provide input on and engage in efforts concerning validity and inter-rater reliability for the P-12 Impact on Student Learning Assignment and Rubric. Partnering districts actively participate each semester in the principals’ panel for interns. They also welcome faculty into classrooms for guest lectures or occasional consultations when needed under the “Teacher Warranty” program the EPP offers to support any of its completers. The continued engagement and active participation of many of our partnering educators indicate the meaningful and sustained nature of the partnerships which exist.

Candidates enrolled in ED 308, Educational Psychology, provide individual tutoring to an identified student in their assigned field experience classroom in a Huntsville City Schools Title I school. Regular meetings with the clinical faculty and the candidates allow candidates to discuss data collection, design interventions, and see their impact on student learning. The collaborative effort also provides insight into how theoretical constructs inform the design and implementation of learning experiences. Candidates gain meaningful experience with an individual student, the school benefits from additional instructional supports for identified students, and the candidate is able to engage with the assigned clinical faculty and EPP faculty to address a real and authentic challenge related to student learning.

Candidates enrolled in EDC 302, Introduction to Low Incidence Populations, provide a 4.5-week summer extended school year (ESY) experience in collaboration with public school teachers in Madison City Schools. Candidates provide approximately half of the instruction with the guidance and support of a cooperating teacher. The ESY program is provided by the district to serve elementary and secondary students enrolled in the district. EPP candidates provide whole group, small group, and individual instruction focused on a theme selected by the cooperating teachers. The students are typically those with moderate to significant disabilities. Candidates enrolled in EDC 351 participate in a self-contained classroom in a partner district. In consultation with the cooperating teacher, the candidates identify a target student demonstrating behavior of concern. The candidates conduct a functional behavior assessment for the student, collaborate with the cooperating teacher to develop and implement a behavior intervention plan, and then assess its effectiveness in prompting or supporting desired behavior. In many cases, candidates teach paraprofessionals or other individuals how to continue enacting the behavior plan after they have left the placement to support the student’s continued progress. Candidates present their projects in a formal poster session with peers, faculty, and the general public. Many of these posters are on display in the College of Education.

The College of Education faculty have developed an extensive resource library which allows candidates access to instructional resources they can use in field experiences and internships. Additionally, many cooperating teachers and completers often request to use these resources on loan to support classroom instruction. 

The M.Ed.-Visual Impairments (VI) concentration was created at the request of the Alabama Institute for the Deaf and Blind (AIDB). With the closing of the only VI program at the University of Alabama at Birmingham (UAB), the state recognized the need for another university to begin a VI program. During the summer, candidates in the VI program are required to complete their practicum in two parts in collaboration with AIDB and UAH. The first week is held at the Alabama School for the Blind and the Helen Keller School, both schools in the AIDB system. During this week, candidates are required to provide support to academic and extracurricular events held during the school’s extended school year program for students from throughout Alabama. The second week is completed at UAH as part of the AIDB and UAH Center for Cybersecurity Research and Education “GenCyber” camp for students with visual impairments. Again, candidates are required to provide support for the camp by providing basic orientation and mobility, behavior management, academic support, and technical assistance. This mutually beneficial partnership is one example of how the EPP has established statewide partnerships. 

During the site visit, the EPP welcomes the opportunity for the team to interact with cooperating and mentor teachers, administrators, instructional coaches, and central administration staff from our various P-12 partners. Also available are a number of email communications, comment cards from Advisory Council participants, Advisory Council meeting agendas, letters sent to cooperating teachers outlining candidate expectations, documentation of meetings with districts and individual partner schools, MOUs, etc.

 

AFI 2.2: CAEP Feedback: Present further information and data related to positive impact on P-12 candidates (component 2.3). Rationale: Though implied, there is no explicit data to demonstrate positive impact on P-12 candidates [sic]. 

EPP Response: Clinical experiences are designed with multiple performance-based assessments at key points within the program to demonstrate candidates’ development of the knowledge, skills, and professional dispositions associated with a positive impact on the learning and development of all P-12 students. In Block 1, candidates in EDC 311/511 plan and implement standards-based instruction. The lessons are videotaped; candidates gather work samples and then reflect on their effectiveness in designing and implementing the lessons, as well as on appropriate “next steps” for differentiation of instruction based on student assessment data. Candidates enrolled in ED 308 identify one student whom they tutor throughout a semester with targeted instruction to improve the student’s understanding and skill level in a focus area identified in collaboration with the cooperating teacher. All early childhood and elementary education candidates complete ED 375, which includes a child study project with components such as administration of reading assessments to determine baseline developmental levels and creation of instructional/intervention plans for parents and teachers to implement. All interns complete an impact on student learning project, which requires them to demonstrate their positive impact on P-12 students in the second classroom placement of the internship.

Each initial licensure program has identified key assessments and field experiences aligned with state indicators as documented in the CIEP state program approval submissions. The alignment between courses, field experiences, and key assessments for each program ensures candidates in each program demonstrate the knowledge, skills, and professional dispositions associated with a positive impact on the learning and development of all P-12 students. Data tables have been created for key assessments in each program to facilitate the monitoring and review of candidates by licensure area. The EPP is awaiting a response from the Alabama State Department of Education concerning approval of the proposed key assessments based on the comprehensive CIEP program submissions. The original CIEP submissions are uploaded as evidence, and the revised CIEP program documents were submitted on October 1, 2019.

 

Standard A.2. Clinical Partnerships and Practice

Task 1C(1): CAEP Feedback: What are the field requirements for each of the programs? Though it was noted that candidates may complete the work in their own classrooms, what they are actually doing is not clear.

EPP Response: Only the English Speakers of Other Languages (ESOL) and Reading Specialist concentrations require an internship course; however, each concentration requires field experiences with job-embedded assignments throughout the courses which require application and implementation of knowledge and skills in an advanced candidate’s classroom. If a candidate is not currently serving as a teacher of record in a classroom, a placement with a partnering school district is secured by the Coordinator of Field and Clinical Experiences.

Candidates in the Visual Impairments concentration complete field experiences with the Alabama Institute for Deaf and Blind (AIDB) in Talladega, Alabama as part of EDC 656. These candidates are also required to complete a second field experience in partnership with AIDB and the UAH Center for Cybersecurity Research and Education during their GenCyber camp for students with visual impairments as part of EDC 653. Candidates in the Collaborative (ASD) concentration complete an intensive field experience in EDC 660, a summer course which requires candidates to participate in a “camp” experience hosted by UAH for students with autism that focuses on transition and educational programming.

Candidates in the Reading Specialist concentration assess a secondary school reader in ED 608 and then design, implement, and assess the intervention which consists of at least five learning plans. A similar project is completed with an early grades reader when candidates are enrolled in ED 612. Instructional segments are recorded, reviewed, and assessed by the candidate. Additionally, candidates review the student data from the identified reader at baseline, during the intervention, and upon completion of the five learning plans. Student strengths and opportunities for growth are identified.

All candidates must complete ED 690, Master’s Action Research Project, in their final semester. This requires each candidate to design and implement an action research project in the selected concentration in his/her own classroom. This includes a literature review, identification of research questions, data collection and analysis, and presentation of findings. See FFR-Attachment 7.

Task 1C(2): CAEP Feedback: What data are now available for the advanced programs action research project? How do these data align with each of the components of the standard?

EPP Response: In the advanced programs, the majority of candidates complete the ED 690: Action Research Project within their own classroom. In the rare instance a candidate is not employed as a teacher of record, the EPP  secures a placement to allow the candidate to complete the field experience activities. The framework of the Action Research Project was mutually developed with the Madison City and Huntsville City school districts when the M.Ed. (advanced) program was established. Data related to the action research project can be found in A1.1.A Advanced Programs Action Research project DATA. See FFR-Attachment 11.

 

Task 1C(3): CAEP Feedback: The ESOL Internship and the Reading Specialist are clearly supervised internships. Are the field experiences for the other programs supervised by the university? 

EPP Response: The Alabama State Department of Education (ALSDE) does not require a supervised internship for advanced programs other than Reading Specialist and ESOL (since these are initial certifications that can only be earned at the graduate level). All other concentrations require field-based assignments and experiences, but the EPP does not provide on-site supervision. Many field-based assignments require the advanced candidates to record instruction and provide annotations, lesson plans or reflections on the instructional lessons or interventions. Field-based assignments are connected directly to the candidates’ classrooms. Special education programs (collaborative-ASD and visual impairments) are required to successfully complete a formal practicum course within the program. Advanced collaborative candidates must complete EDC 660, Practical Applications of Visual Strategies, which requires a minimum of seventy clock hours completed on-campus under the direct supervision of graduate faculty members. Candidates develop and implement a thematic unit utilizing best practices for students diagnosed with autism. They also plan, create, and implement all visual supports for the physical classroom. Advanced visual impairment candidates complete a required two-week practicum during the summer when enrolled in EDC 653 and EDC 656. These field experiences are completed at the Alabama Institute for Deaf and Blind (AIDB) in Talladega during a camp hosted by UAH for students with visual impairments. Both field experiences are supervised by a faculty member with expertise in visual impairments.     

 

Standard 3: Candidate Quality, Recruitment, and Selectivity

Task 1A(1): CAEP Feedback: Benchmarks and goals for the recruitment plan.

EPP Response: The faculty reviewed the recruitment and retention plan for the College and discussed existing partnerships, the expressed needs of partner districts, and high-needs teacher certification areas as identified by the state and Title II reports. Further, a review of historical data provided a baseline from which the EPP could establish benchmarks. It was determined the following goals would guide and inform recruitment efforts for educator preparation: (1) increase the number of male candidates admitted to initial educator preparation (benchmark for the 2019-20 academic year: 6 male candidates); (2) increase the number of non-Caucasian candidates admitted to initial educator preparation (benchmark for the 2019-20 academic year: 5 candidates). The benchmarks were established after a review of longitudinal admissions data. For example, in the 2018-19 academic year, the EPP admitted only 7 male candidates and 4 non-Caucasian candidates. For the 2020-2021 academic year, the EPP has established benchmarks of 10 male candidates and 6 non-Caucasian candidates admitted to educator preparation. If these benchmarks are realized, they will yield a 25% increase in non-Caucasian candidates in 2019-2020 as compared to the previous year, and a 45% increase in combined admissions in 2020-2021 as compared to 2018-2019 (the arithmetic is sketched below).
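
The percentages above can be verified with simple percent-change arithmetic; the following minimal sketch uses only the counts cited in this response and is illustrative only.

```python
# Minimal arithmetic check of the recruitment benchmarks cited above.
baseline   = {"male": 7,  "non_caucasian": 4}   # 2018-19 actual admissions
bench_2019 = {"male": 6,  "non_caucasian": 5}   # 2019-20 benchmarks
bench_2020 = {"male": 10, "non_caucasian": 6}   # 2020-21 benchmarks

def pct_increase(new: int, old: int) -> float:
    """Percent change from old to new."""
    return 100 * (new - old) / old

# Non-Caucasian benchmark for 2019-20 vs. 2018-19 actuals: (5 - 4) / 4 = 25%
print(pct_increase(bench_2019["non_caucasian"], baseline["non_caucasian"]))  # 25.0

# Combined 2020-21 benchmarks vs. combined 2018-19 actuals: (16 - 11) / 11 ≈ 45%
print(pct_increase(sum(bench_2020.values()), sum(baseline.values())))        # ~45.5
```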

 

Task 1A(2): CAEP Feedback: Verification that data has been disaggregated by specialty licensure areas.

EPP Response: FFR-Attachment 8 displays Praxis Core data, including means and medians disaggregated by specialty licensure areas. Only programs with enrolled candidates are included.  FFR-Attachment 9 provides information by specialty licensure areas for admitted candidates and completers.

 

Task 1A(3): CAEP Feedback: Documentation of how candidates with disposition issues have been addressed. Examples of PDP for candidates with disposition issues.

EPP Response: Examples of PDPs for candidates with disposition issues are provided.  See FFR-Attachment 2.

 

Standard A.3. Candidate Quality, Recruitment, and Selectivity

Task 1A(1): CAEP Feedback: Benchmarks and goals for the recruitment plans.

EPP Response: The EPP reviewed longitudinal data for advanced completers each semester since Fall 2015 when the first cohort graduated. Based on historical data, benchmarks and goals are established for the 2019-20 and 2020-21 academic years. While it might appear the overall numbers established are lower, the numbers in 2016-17 were inflated due to a large number of candidates pursuing ESOL certification with support from a Federal grant. See FFR-Attachment 10.

 

Task 1A(2): CAEP Feedback: Verification that data has been disaggregated by program area.

EPP Response: See FFR-Attachment 11 for examples of ED 690 data and FFR-Attachment 12 for data related to GPA requirements for advanced candidates.

 

Task 1C(1): CAEP Feedback: See examples of the capstone research project and the rubric used to evaluate the project.  

EPP Response: Examples of the ED 690: Action Research Project assignments with feedback and accompanying scored rubrics have been provided for Fall 2017, Spring 2018, Fall 2018, and Spring 2019. See FFR-Attachment 11.

 

Task 1C(2): CAEP Feedback: Three cycles of data verifying that the average scores for the group of candidates beginning during an academic year meet the CAEP minimum GPA of 3.0.

EPP Response: Four cycles of data have been provided in FFR-Attachment 12 for Fall 2017, Spring 2018, Summer 2018, and Fall 2018. Summer 2018 was included in this data as candidates in the Master of Education program are admitted in all three semesters. As per the attachment, it should be noted that all admitted cohorts had an average GPA of over 3.0 (with Fall 2018 being the lowest overall at 3.39). The data have been disaggregated by appropriate licensure area (“concentration”). In reviewing the disaggregated data, all licensure areas had average cohort scores above the minimum of 3.0. 
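
For illustration, the cohort-average check described above reduces to a grouped mean over admission records. The record layout and values in this sketch are hypothetical, not actual EPP data.

```python
import statistics
from collections import defaultdict

# Hypothetical admission records: (term, concentration, GPA).
admits = [
    ("Fall 2018", "ESOL", 3.6),
    ("Fall 2018", "ESOL", 3.2),
    ("Fall 2018", "Reading Specialist", 3.4),
]

by_cohort = defaultdict(list)
for term, concentration, gpa in admits:
    by_cohort[(term, concentration)].append(gpa)

# CAEP's criterion applies to the cohort *average*, which must be >= 3.0.
for key, gpas in sorted(by_cohort.items()):
    mean_gpa = statistics.mean(gpas)
    print(key, round(mean_gpa, 2), "meets 3.0 minimum:", mean_gpa >= 3.0)
```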

 

Standard 4: Program Impact

Task 1C(1): CAEP Feedback: How are programs to be represented in the sample selected?

EPP Response: The initial case study is being conducted in the 2019-2020 academic year. A purposive sample of program completers was identified to include representatives from elementary, secondary, and P-12 licensure options. After identifying where the completers were employed, faculty were engaged to help determine contact information or provide insight as to which candidates might be willing to participate. The cohort includes six candidates as described in the case study: two elementary educators, three secondary educators representing various disciplines (Biology, English Language Arts, and Spanish), and one P-12 educator (Visual Arts). Candidates were selected from those who successfully completed the educator preparation program and were recommended for licensure in the 2018-19 academic year. These candidates were all observed and assessed using the Danielson Framework during the internship semester. As described in the case study, the EPP plans to increase the sample size to 12 candidates in 2020-2021, and purposive sampling will be utilized to assure participants represent the breadth of programs offered by the EPP.

 

Task 1C(2): CAEP Feedback: How does the EPP ensure that all programs will eventually be represented in the case study samples?  

EPP Response: As the case study approach continues, efforts will be made by the EPP to assure a variety of participants are included to represent the range of licensure options. After the pilot case study approach is completed in 2019-2020, the plan will expand to include additional participants from those candidates who become completers in this current academic year. The purposive sample will intentionally target candidates in licensure program areas not included in the original cohort. A review of the roster of fall 2019 interns, along with applications of potential spring 2020 interns, suggests opportunities to engage candidates in the additional licensure areas of elementary education/collaborative, secondary (chemistry, math), and P-12 (music education or physical education). No candidates have yet completed the early childhood education/early childhood special education licensure program.

 

Task 2C(1): CAEP Feedback: How does the observation tool selected for the case study observations align with the EPP preparation experiences and expectations for completers?

EPP Response: The EPP uses the Danielson Framework Teaching Observation Form to assess the teaching practices of all pre-service candidates in the internship semester. The Danielson Framework was selected by faculty during the CIEP state program approval process and first implemented with interns in Fall 2018. It was selected because it provides a valid and reliable instrument to comprehensively assess all facets of classroom teaching and learning. The components of the Framework are applicable to a wide variety of licensure options from early childhood to secondary education, as well as P-12 options such as visual arts and music. The Framework addresses four broad domains: planning and preparation, the classroom environment, instruction, and professional responsibilities. For each licensure program option, the Framework has been crosswalked with the respective indicators to assure alignment.

The completers participating in the case study were all observed and assessed with the Danielson Framework as interns, which yielded baseline data. Use of the Danielson Framework in the case study allows the EPP to assess how completers’ practice in year one or later as a contract teacher compares to their developmental levels as an intern. While interns secure employment across a wide variety of school districts, the EPP seeks to determine if the effective practice indicators outlined in the Danielson Framework and addressed in their preparation programs are enacted in their classroom instructional practices. 

 

Task 3C(1): CAEP Feedback: What were the return rates for the surveys of employers and completers?

EPP Response: The Alabama State Department of Education (ALSDE) administers the surveys for employers and completers for all colleges and universities providing educator preparation programs. Through email correspondence, Dr. Patience Oranika, the ALSDE individual responsible for the survey administration and analysis, explained both the employee and employer surveys were “sent statewide to 2,068 first-year teachers and their employers respectively.  A total of 1017 first-year teachers completed the survey. The total number of respondents for the ALACTE Survey for Employers of New Teachers was 467.  The number of teacher-respondents that claimed affiliation to the University of Alabama in Huntsville was 11. The number of employers of those teachers who completed the survey did not meet the reporting threshold.”  A response rate for employees can be calculated as follows: The EPP recommended 33 completers for initial licensure in the 2017-2018 academic year. In the 2018-19 survey administration, 11 first-year teachers (employees) responded, yielding a response rate of approximately 33%. 

For the 2017-18 ALSDE survey, a response rate for employees can be calculated as follows:  The EPP recommended 44 completers for initial licensure in the 2016-2017 academic year. In the 2017-18 survey administration, 16 first-year teachers (employees) responded, yielding a response rate of approximately 36%.
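
Both response rates follow directly from the completer and respondent counts; a minimal arithmetic sketch, using only the figures cited above:

```python
# Response-rate arithmetic from the figures cited above (illustrative check).
completers_2017_18 = 33   # recommended for initial licensure, 2017-18
respondents_2018_19 = 11  # UAH-affiliated first-year teachers responding

completers_2016_17 = 44
respondents_2017_18 = 16

def response_rate(respondents: int, completers: int) -> float:
    return round(100 * respondents / completers, 1)

print(response_rate(respondents_2018_19, completers_2017_18))  # 33.3 -> ~33%
print(response_rate(respondents_2017_18, completers_2016_17))  # 36.4 -> ~36%
```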

 

Standard 4: Preliminary Recommendations for new AFIs including rationale for each.

AFI 4.1: CAEP Feedback: Satisfaction of employers with EPP particular program completers is unclear (component 4.3). Rationale: Data from employer surveys is not disaggregated by program, so employer satisfaction with completers of a particular program cannot be determined.

EPP Response: The ALSDE launched the employer survey for all colleges and universities with approved educator preparation programs. Responses are provided to the EPP but are not disaggregated by licensure areas due to the low number of responses and the low number of completers in individual licensure programs. The ALSDE established a reporting threshold of n=5, as communicated via email by Dr. Patience Oranika.

 

AFI 4.2: CAEP Feedback: Satisfaction of completers with EPP particular programs is unclear. (Component 4.4). Rationale: Data from completer surveys is not disaggregated by program, so completer satisfaction with particular programs cannot be determined.  

EPP Response: The ALSDE launched the completer survey for all colleges and universities with approved educator preparation programs. Responses were provided to the EPP but are not disaggregated by individual licensure areas due to the low number of responses and the low number of completers in licensure programs. The ALSDE established a reporting threshold of n=5, as communicated via email by Dr. Patience Oranika.

 

Standard A.4. Program Impact

Task 1C(1): CAEP Feedback: What are the specifics of plans to obtain evidence of employer satisfaction?

EPP Response: The EPP has created a Phase-In Plan to obtain evidence of employer satisfaction via distribution of a survey. The initial survey launch will occur in Fall 2019. Please see FFR-Attachment 13 for additional information. The phase-in plan was designed to satisfy CAEP’s sufficiency criteria.

 

Task 2C(1): CAEP Feedback: What are the specifics of plans to obtain evidence of graduates’ satisfaction?

EPP Response: The EPP has created a Phase-In Plan to obtain evidence of completers’ satisfaction via distribution of a survey. The initial survey launch will occur in Fall 2019. Please see FFR-Attachment 13 for additional information. The phase-in plan was designed to satisfy CAEP’s sufficiency criteria.

 

Standard A.4: Preliminary Recommendations for Stipulation: 

STIP 4.1: CAEP Feedback: Plans to generate evidence of employer satisfaction do not meet CAEP’s sufficiency criteria for plans (component 4.1). Rationale: No evidence is tagged for A4.1. Though some activities are described, the EPP has not included the components required to meet the sufficient level as noted in the CAEP advanced handbook. 

EPP Response: The EPP has created a Phase-In Plan to obtain evidence of employer satisfaction via distribution of a survey. The initial survey launch will occur in Fall 2019. Please see FFR-Attachment 13 for additional information. The phase-in plan was designed to satisfy CAEP’s sufficiency criteria.

 

STIP 4.2: CAEP Feedback: Plans to generate evidence of candidate satisfaction do not meet CAEP’s sufficiency criteria for plans (component 4.2). Rationale: No evidence is tagged for A4.2. Though some activities are described, the EPP has not included the components required to meet the sufficient level as noted in the CAEP advanced handbook.

EPP Response: The EPP has created a Phase-In Plan to obtain evidence of completers’ satisfaction via distribution of a survey. The initial survey launch will occur in Fall 2019. Please see FFR-Attachment 13 for additional information. The phase-in plan was designed to satisfy CAEP’s sufficiency criteria.

Standard 5: Provider Quality, Continuous Improvement, and Capacity

Task 1A(1): CAEP Feedback: 5.1 produces empirical evidence that interpretations of data are valid and consistent.  

EPP Response: The EPP faculty work collaboratively with P-12 partners, as well as faculty in the College of Science and the College of Arts, Humanities, and Social Sciences, to develop meaningful assessments and scoring rubrics and to determine the validity and reliability of instruments, as appropriate. The Dispositions Assessment was developed as an EPP-created assessment. During the spring 2019 Teacher Education Advisory Council (TEAC) meeting, it was reviewed by the attendees using the Lawshe method for establishing content validity. In the fall 2019 semester, the Danielson Framework was reviewed by content experts at the TEAC meeting. A meeting of university supervisors was convened in fall 2019 to calculate inter-rater reliability for the Danielson Framework, which was first piloted for intern assessment in fall 2018. Participants completed two rounds of scoring using the Danielson Framework. One round yielded an overall inter-rater reliability of 88.57%, with individual components ranging from 73.81% to 100%. The second round yielded an overall inter-rater reliability of 90%, with individual components ranging from 76.19% to 100%. Standards 2 and 4 achieved 100% inter-rater reliability in one round, and Standard 4 achieved 100% inter-rater reliability in the other round.

The component with the lowest inter-rater reliability in both rounds was Standard 1 (73.81% and 76.19%). It was noted that some participants scored elements as NA (Not Applicable), thus lowering the inter-rater reliability for elements 1a, 1b, 1c, and 1d. The variation in inter-rater reliability was attributed to the lack of a written lesson plan to accompany the instructional video, making it difficult for some elements to be accurately scored. Participants agreed the experience was helpful and informative, and another inter-rater reliability session will be conducted in Spring 2020. This session will include both video instructional segments and the accompanying lesson plans. It is anticipated that, with these changes, an inter-rater reliability of at least 80% on Standard 1 should be achieved.
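
The percentages above are consistent with simple percent agreement across rubric components. The sketch below shows that computation on invented rater scores (not EPP data); scoring an element NA while the other rater assigns a level counts as a disagreement, mirroring the Standard 1 issue noted above.

```python
# Percent-agreement inter-rater reliability over rubric components (illustrative).
rater_a = {"1a": 3, "1b": "NA", "1c": 2, "2a": 3, "2b": 3, "3a": 2, "4a": 3}
rater_b = {"1a": 3, "1b": 2,    "1c": 2, "2a": 3, "2b": 3, "3a": 3, "4a": 3}

agreements = sum(1 for c in rater_a if rater_a[c] == rater_b[c])
percent_agreement = 100 * agreements / len(rater_a)
print(f"{percent_agreement:.2f}%")  # 71.43% for these invented scores
```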

The Impact on P-12 Student Learning project is an EPP-created assessment that was piloted in 2018-19 during the internship semester. It was then revised based on feedback from interns and university supervisors. The revised assignment and rubric will be implemented in fall 2019 and will be reviewed by content experts who are members of the TEAC in November 2019 using the Lawshe validity process. An inter-rater reliability session will also be conducted in spring 2020 using samples generated by interns in the fall 2019 semester.
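
For reference, the Lawshe method cited here and above computes a content validity ratio (CVR) for each item from panelists’ “essential” ratings. A minimal sketch follows; the panel counts are invented for illustration.

```python
# Lawshe content validity ratio: CVR = (n_e - N/2) / (N/2),
# where n_e = panelists rating an item "essential" and N = panel size.
def cvr(n_essential: int, n_panelists: int) -> float:
    half = n_panelists / 2
    return (n_essential - half) / half

print(cvr(9, 10))  # 0.8  -> strong panel agreement that the item is essential
print(cvr(5, 10))  # 0.0  -> exactly half the panel rates it essential
```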

The Dispositions Assessment and the Impact on P-12 Student Learning Project are the only EPP-created assessments. All other assessments are proprietary (e.g., Praxis Core, EDUCATE Alabama, Danielson Framework, Praxis II exams, edTPA).

The EPP utilizes Tk20 as a database of reporting and assessment measures. Data reports, when applicable, are generated in Tk20 and shared with EPP faculty. Data inform decisions such as admission to the educator preparation program, admission to internship, development of professional development plans (PDPs), and programmatic and curricular changes. Banner Student Information System (SIS) queries are also conducted each semester to analyze candidates’ GPAs and final course grades of C or lower that might lead to academic warning or probation, as well as creation of a PDP or referral to the Academic Student Success Center. Banner SIS is a university database which houses candidates’ demographic information, GPAs, and other relevant educator preparation information (e.g., the semester a candidate is admitted to the educator preparation program).
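
As an illustration of the kind of end-of-term screen described above, the sketch below flags candidates for follow-up. The record layout and thresholds are hypothetical, not the actual Banner SIS schema or the EPP's criteria.

```python
# Hypothetical end-of-term screen for candidates needing follow-up
# (e.g., a PDP or Student Success Center referral). Not the actual Banner schema.
candidates = [
    {"name": "Candidate A", "gpa": 3.4, "grades": ["A", "B"]},
    {"name": "Candidate B", "gpa": 2.6, "grades": ["B", "C"]},
]

LOW_GPA = 2.75                   # illustrative warning threshold
LOW_GRADES = {"C", "D", "F"}     # final course grades of C or lower

flagged = [
    c for c in candidates
    if c["gpa"] < LOW_GPA or any(g in LOW_GRADES for g in c["grades"])
]
for c in flagged:
    print(c["name"], "-> review for academic warning / possible PDP")
```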

In addition to Tk20 and Banner SIS, the EPP utilizes Excel spreadsheets, Google folders and docs, and Qualtrics surveys to facilitate the use of data. The EPP shares data within Google Drives and Google Folders, including the “MasterList” which is a listing of all candidates by licensure program. Ongoing monitoring of candidates’ progress occurs during Departmental meetings and changes to programs (even at the course level) are made in cooperation and communication with EPP faculty, as well as those in other Colleges.

Some examples of recent changes prompted by review and analysis of data include:

  1. Creation of two methods courses for secondary education licensure options.
  2. Creation of ED 360, an intensive practicum course for early childhood and elementary education licensure options.
  3. Creation of separate course sections of Classroom Management and Instructional Strategies courses for early childhood/elementary vs. secondary education candidates.
  4. When data from EDC 311 scoring rubrics revealed candidates’ lower scores on Theoretical Framework, faculty engaged in curriculum mapping to intentionally infuse theoretical frameworks into courses progressing from Block 1 to internship.
  5. Upon review of edTPA data, faculty engaged in a total redesign of Assessment courses based upon candidates’ performance on Task 3 rubrics.
  6. Based on feedback from candidates and expectations in the state’s CIEP program approval process, faculty redesigned the ESOL and Reading Specialist internships.
  7. Based on feedback from candidates and faculty, advanced candidates enrolled in ED 690 are now required to design and implement the entire project rather than simply write a proposal.
  8. Key assessments were created for each licensure area (driven by the CIEP state program approval process). This required development of key assessments aligned with indicators, field experiences, and course objectives in each licensure program.

The EPP’s organizational structures, consisting of several groups of faculty and committees working together to improve candidates’ learning and experience, facilitate data collection, analysis of data, and data-based decision making. Examples of such organizational structures are:

  • Program faculty review program data and recommend curricular and process changes which affect candidates. These are approved after review by the EPP faculty, College of Education curriculum committee, dean, UAH undergraduate or graduate curriculum committee, and Academic Affairs.
  • Teacher Education Advisory Council (TEAC) engages P-12 stakeholders and alumni.
  • Creation of a Data Analyst position to support the work of EPP faculty, Coordinator of Field and Clinical Experiences, Department Chair, Certification Officer, Associate Dean and Dean.
  • Associate Dean and Dean engage in regular communication and interfacing with the University’s Office of Institutional Research and Assessment (OIRA), UAH Registrar and Records Office (e.g., implementation of DegreeWorks), University Office of Information Technology (e.g., implementation of Faculty 180), all of which play roles in the Quality Assurance System.
  • The Dean serves on the University Task Force to facilitate sharing of information with other academic colleges and units.

The Associate Dean leads efforts related to access, security, reliability, and validity of the data the EPP uses and also interfaces with University data systems. He ensures the security of sensitive candidate data and conducts queries for EPP faculty and staff, when appropriate, to access data for decision making by licensure programs to inform continuous improvement for CIEP (state program approval), SPAs, or CAEP. The Associate Dean and Data Analyst are team leads on the implementation and use of Tk20 and the data tools used by the University (e.g., Banner SIS, Qualtrics, ETS, Pearson, and various data systems that house data used for reporting). The Dean leads staff meetings every two weeks to facilitate communication and the collection and analysis of data among the Dean, Associate Dean, Budget Analyst, Certification Officer, Data Analyst, Academic Advisor, and Coordinator of Field and Clinical Experiences. The Associate Dean represents the College by serving on University committees related to Academic Scheduling, the University Catalog, and SACSCOC Accreditation and Outcome Measures. Both the Associate Dean and Dean attend webinars related to data management and quality assurance hosted by groups such as SkyFactor, AACTE, Title, and other assessment vendors. Additionally, both attend CAEPCon to remain informed of CAEP expectations and learn how to best support the quality assurance system.

 

Task 1B(1): CAEP Feedback: Quality Assurance Plan: the EPP is committed to fairness, accuracy, consistency and the avoidance of bias (pg. 11).  

EPP Response: The EPP employs numerous data sources to support decision making and continuous improvement. Both proprietary and EPP-developed instruments are utilized, with attention given to the quality of measures so that decisions are fair, accurate, valid, and reliable. Each assessment instrument is used to measure what it is intended to measure. For example, the EPP utilizes Praxis Core exam scores in admissions decisions; Praxis Core is a relevant metric of academic knowledge and skills in reading, writing, and mathematics. Praxis II content exams are utilized to assess content knowledge, general pedagogical knowledge, and content-specific pedagogical knowledge. https://www.ets.org/s/praxis/pdf/proper_use.pdf

The following strategies help ensure fairness, accuracy, consistency, and elimination of bias throughout the assessment system:

1) The unit ensures that assessments are aligned with licensure indicators and that relevant standards are reflected in syllabi, key assessments, and unit assessment measures.

2) Initial undergraduate and initial graduate candidates are informed of all requirements of the education program when they first meet with their academic advisors and before they submit their application for admission. Orientations are provided regarding the requirements, policies, and procedures for programs and field experiences, and individual and group advising sessions are held (e.g., ED 301/501). Advanced candidates are informed of requirements via online information, emails, and advising sessions designed to explain procedures for program matriculation. Initial undergraduate and initial graduate candidates receive a copy of the “Internship Handbook” at the beginning of student teaching or internship, and they also participate in an internship orientation.

3) Rubrics for the course-based key assessments are shared with the candidates before they are used. Thus, candidates know what they will be assessed on, what is expected of them, and the level of proficiency associated with each scoring decision.

4) All curriculum or program changes must be submitted for approval and follow the outlined approval process, which begins at the program level and continues through the department level, the college level, and finally the university level. One purpose of the process is to ensure that proposed changes are reviewed for fairness, accuracy, consistency, and freedom from bias.

5) The dispositions rubric used to assess candidates is discussed with candidates by advisors, instructors, university supervisors, and the internship coordinator. It is also shared with cooperating teachers, who assess candidates’ dispositions. Training in the use of standard rubrics and frameworks is provided to all full-time and part-time faculty, and program faculty participate in inter-rater reliability training to ensure consistent scoring. Rubrics used for program-specific assessments are discussed with candidates each semester by program faculty members.

Guidelines: 1) every faculty member and/or university supervisor should complete inter-rater reliability training; 2) faculty and university supervisors should participate in refresher workshops on the use of the rubric at regular intervals to assure scoring remains consistent; 3) each candidate should be assessed by multiple faculty members; and 4) an independent assessment of the scoring process should be implemented to review the reliability and validity of the instrument over time. (An illustrative formula for quantifying inter-rater agreement follows item 6 below.)

6) Data are triangulated wherever possible to enhance the reliability of findings. For example, many of the same questions are asked on the exit surveys and the completer surveys for both initial and advanced programs. Also, for initial programs, the intern, cooperating teacher, and university supervisor each independently complete surveys and assessments at the end of the semester.
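As a point of reference for the inter-rater reliability guidelines above (offered as a common convention, not a description of the EPP’s specific procedure), agreement between two raters scoring the same rubric can be quantified with Cohen’s kappa, which corrects observed agreement for agreement expected by chance:

\[ \kappa = \frac{p_o - p_e}{1 - p_e} \]

where \(p_o\) is the observed proportion of agreement and \(p_e\) is the proportion expected by chance. As a purely hypothetical illustration, if two raters agreed on 45 of 50 rubric scores (\(p_o = 0.90\)) and chance agreement were \(p_e = 0.60\), then \(\kappa = (0.90 - 0.60)/(1 - 0.60) = 0.75\), generally interpreted as substantial agreement.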

 

Task 1C(1): CAEP Feedback: The summary data sheets are missing key data (such as score ranges, instructions, narrative summaries, complete field names or copies of questions). Please provide complete information. 

EPP Response: Additional data, such as medians, modes, and ranges, have been added to summary data sheets where applicable. The updated summary sheets are posted on the UAH CAEP website at https://www.uah.edu/education/caep-2019.

 

Task 1C(2): CAEP Feedback: Please provide evidence that the data is shared with stakeholders, and how the feedback is collected and used for program improvement or other actions. 

EPP Response: Data are routinely shared with stakeholders through meetings and communications/resources shared via the Teacher Education Advisory Council (TEAC). Recent TEAC meetings have focused on sharing candidates’ performance on edTPA rubrics. TEAC members have assisted with establishing the Content Validity Index for the Dispositions Assessment utilizing the Lawshe Method, and they will soon engage in a similar process for the EPP-created instrument for the Impact on P-12 Student Learning Project. The EPP asked stakeholders to share, from their perspectives as active practitioners in P-12 settings, emerging issues or concerns from the field that should be infused into the educator preparation program. These were taken to the EPP faculty to discuss and to consider how they might be incorporated or addressed.

The Alabama Educator Preparation Institutional Report Card, created by the Alabama State Department of Education, is shared publicly with stakeholders each year. The EPP maintains a website for the TEAC where data and informational items can be shared, and email communication also allows the EPP to share data with stakeholders. A variety of strategies are implemented to collect feedback, including carousel/gallery walks, table talk, and exit tickets at TEAC meetings. See FFR-Attachment 14. The Alabama Educator Preparation Institutional Report Cards can be accessed at https://www.alsde.edu/ofc/otl/Pages/epirc-all.aspx.

Examples of program changes due to stakeholder feedback or analysis of data include:

  • the addition of a second methods course to all initial licensure undergraduate secondary education programs
  • the addition of a K/1 practicum (ED 360) to all initial licensure undergraduate early childhood and elementary education programs
  • refining of objectives, content, and key assessments for the assessment course (ED 315) based on candidates’ performance on edTPA Task 3 rubrics
  • increased focus on differentiation and instructional planning in EDC 311 and internship based on candidates’ performance on edTPA Task 1 in year one of consequential implementation (2018-19)
  • adjustment of the advanced capstone course (ED 690) to require candidates to implement, rather than simply propose, an action research project
  • continued and increased efforts to promote the Early Start program due to participants’ positive feedback on the impact of the experience
  • incorporation of emerging issues identified by stakeholders, where appropriate, into existing coursework (e.g., LETRS content, trauma-informed care, etc.)
  • the addition of a question to the Dispositions Assessment based on stakeholder feedback at the Fall 2019 meeting

 

Task 2A(1): CAEP Feedback: The EPP will establish content validity and inter-rater reliability.

EPP Response: The EPP utilizes two EPP-created instruments: the Dispositions Assessment and the Impact on P-12 Student Learning Project. All other assessments are proprietary. See FFR-Attachment 15.

 

Task 2B(1): CAEP Feedback: The Lawshe Method is used for the Danielson Teacher Observation, but insufficient evidence is provided for other assessments.  

EPP Response: The Lawshe Method was used to determine content validity for the EPP-created Dispositions Assessment, not the Danielson Framework. The “Impact on P-12 Student Learning Project” is the only other EPP-created assessment. It was piloted in 2018-19 and revised thereafter. The revised version is being implemented in fall 2019 and will be reviewed for content validity using the Lawshe Method in November 2019. After samples are gathered from the fall 2019 implementation, inter-rater reliability will be determined in spring 2020 by university supervisors and EPP faculty.

 

Task 2C(1): CAEP Feedback:  It is difficult to locate the overall response rate on some of the surveys. Please provide response rates.  
EPP Response: The Alabama State Department of Education (ALSDE) administers the surveys of employers and completers for all colleges and universities providing educator preparation programs. Through email correspondence, Dr. Patience Oranika, the ALSDE individual responsible for survey administration and analysis, explained that both the employee and employer surveys were “sent statewide to 2,068 first-year teachers and their employers respectively. A total of 1017 first-year teachers completed the survey. The total number of respondents for the ALACTE Survey for Employers of New Teachers was 467. The number of teacher-respondents that claimed affiliation to the University of Alabama in Huntsville was 11. The number of employers of those teachers who completed the survey did not meet the reporting threshold.” A response rate for employees can be calculated as follows: the EPP recommended 33 completers for initial licensure in the 2017-2018 academic year, and in the 2018-19 survey administration, 11 first-year teachers (employees) responded, yielding a response rate of approximately 33%.

For the 2017-18 ALSDE survey, a response rate for employees can be calculated as follows:  The EPP recommended 44 completers for initial licensure in the 2016-2017 academic year. In the 2017-18 survey administration, 16 first-year teachers (employees) responded, yielding a response rate of approximately 36%.
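Expressed as a simple formula, the response rates above are the number of first-year teacher respondents divided by the number of completers recommended in the corresponding year:

\[ \text{response rate} = \frac{\text{respondents}}{\text{completers}}: \qquad \frac{11}{33} \approx 33\% \ (2018\text{-}19), \qquad \frac{16}{44} \approx 36\% \ (2017\text{-}18) \]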

Employer responses did not reach the reporting threshold (n=5) established by ALSDE, as communicated via email by Dr. Patience Oranika. Responses are provided to the EPP but are not disaggregated by licensure area due to the low number of responses and the low number of completers in individual licensure programs.

 

Task 2C(2): CAEP Feedback: How are the assessments developed?  It is not clear if the key assessments are new and if little data is available, but the EPP has not provided evidence about the development of the key assessments based on research, best practices, conceptual frameworks, input from stakeholders and/or focus groups. Please provide information on the research used to develop the assessments or frameworks, and how the assessment is determined to be reliable and valid. 

EPP Response: The Dispositions Assessment created by the EPP has a long-standing history; it has been used consistently for more than ten years. It was originally created by EPP faculty in response to the need to identify and assess dispositions as reflected in the NCATE Standards. The identified dispositions are grounded in the EPP’s conceptual framework and supported by educational research. The content validity of the Dispositions Assessment is determined by content experts who are members of the Teacher Education Advisory Council using the Lawshe Method. The Content Validity Index (CVI) was most recently calculated in Spring 2019.
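For reference, the Lawshe Method computes a content validity ratio (CVR) for each item from the panel’s “essential” ratings, and the Content Validity Index is the mean CVR across retained items:

\[ \text{CVR} = \frac{n_e - N/2}{N/2} \]

where \(n_e\) is the number of panelists rating an item essential and \(N\) is the total number of panelists. As a purely hypothetical illustration (the actual TEAC panel size is not reported here), if 9 of 10 panelists rated an item essential, that item’s CVR would be \((9 - 5)/5 = 0.80\).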

The second EPP-created assessment is the Impact on P-12 Student Learning. It was created by EPP faculty in 2018 as the state program approval (CIEP) documents were being prepared for the comprehensive submission. The CIEP requires each licensure program to include a key assessment focused on candidates’ impact on P-12 student learning. The assignment and scoring rubric were piloted in the 2018-19 academic year, refined based on feedback from university supervisors, and the revised version is in use in the fall 2019 semester. Upon receipt of fall 2019 data, faculty and university supervisors will meet to review and analyze the data, as well as determine inter-rater reliability and calculate content validity using either the Lawshe Method or a Q-sort. The Teacher Education Advisory Council (TEAC) reviewed the revised instrument in Fall 2019 to determine content validity.

All other assessments are proprietary in nature. The Praxis Core exams are state-mandated and have been implemented since Fall 2016. The Praxis II exams are state-mandated and have been implemented for more than ten years. EDUCATE Alabama was created by the Alabama State Department of Education (ALSDE) and has been utilized for more than ten years. Finally, edTPA became consequential for initial licensure and was state-mandated as of fall 2018. See FFR-Attachment 15.

 

Task 2C(3): CAEP Feedback: How and when are the assessments reviewed? The QAS timeline indicates the reviews started in fall 2019. Have the assessments ever been reviewed for validity and reliability? 

EPP Response: Please see the response to the previous question. Only two EPP-created assessments are used. The Dispositions Assessment has been reviewed for validity (Lawshe Method), most recently in Spring 2019. The Impact on P-12 Student Learning was piloted in 2018-19 and revised. Its content validity was reviewed in Fall 2019, and after data are available from the Fall 2019 implementation of the revised instrument, inter-rater reliability will be established in Spring 2020. See FFR-Attachment 16.

 

Preliminary Recommendations for Stipulation: 

STIP 5.1: CAEP Feedback: The key assessments do not meet minimum criteria as defined by the CAEP Evaluation Framework and Tool for EPP-Created Assessments (component 5.2). Rationale: The EPP has provided insufficient evidence of content validity or inter-rater reliability. The QAS timeline indicates that the reliability and validity will not be addressed until after the site visit. There is little evidence (aside from the Danielson Observation Tool) that the EPP has attempted to investigate the reliability and validity of its assessments. 

EPP Response: The EPP has reviewed the CAEP Evaluation Framework for EPP-Created Assessments. Both the Dispositions Assessment and the Impact on P-12 Student Learning meet the sufficient level across the seven identified indicators. Content validity has been achieved for both using the Lawshe Method, and inter-rater reliability will be established in Spring 2020 after the implementation of the revised instrument concludes in December 2019. Data reliability for both instruments is achieved through training of university supervisors and cooperating teachers.

 

STIP 5.2: CAEP Feedback: The EPP does not provide sufficient evidence that 80% or more of changes and program modifications are linked back to evidence/data.

Rationale: The EPP provides summary spreadsheets with data but it does not provide explanations about how the data directly informs decisions, gateways and transitions, and program improvement.  

EPP Response: Data are intentionally gathered each semester, and the EPP routinely shares data with faculty in bi-weekly meetings. Overall, candidates are performing very well on all assessments, affirming the rigor and appropriateness of programs. Please note that many programs are relatively young, with launch dates within the last five years; in fact, the College of Education was established in 2014. Additionally, many licensure options have relatively low numbers of candidates, making it somewhat challenging to make substantial program improvements or revisions based on candidates’ performance (e.g., physical education and foreign language education each have only 3 completers). The most significant program improvements were prompted by the anticipated implementation of edTPA (consequential in Fall 2018) as well as preparation of the state program approval documents using the Continuous Improvement in Educator Preparation (CIEP) process, which was first piloted by the Alabama State Department of Education in Fall 2014. The EPP submitted its first comprehensive state program review under CIEP in 2018.

EPP faculty engaged in ongoing discussions and collaboration to backwards-map edTPA into the curriculum. The EPP chose to require edTPA submission by all candidates in 2017-18, a full year prior to the state mandate. Review of candidate data revealed a need to focus on academic language as defined by edTPA, as well as to strengthen candidates’ understanding and skills related to Task 3. Both ED 315 and ED 410 (assessment courses) were revised to strengthen content, key assessments, and experiences to better equip candidates for success on edTPA Task 3. Another example of how data are used to inform program improvement is the regular review of candidates’ performance on Praxis II content exams. The visual arts education program has three completers to date, but EPP faculty noted a pattern of lower scores in art history; this information was shared with faculty members in the Department of Art, Art History and Design, and collaboration resulted in a review of course content in this particular domain. Finally, candidates completing the internship consistently indicated the value of Early Start and field experiences, so faculty have worked to intentionally provide stronger field-based assignments and experiences as proposed in the 2018 CIEP submission. For example, all secondary education candidates will now be required to complete two subject-specific methods courses with embedded field experiences. Similarly, all early childhood and elementary candidates are now required to complete ED 360, Elementary Practicum.

Diversity:

CAEP Feedback 1: In what specific efforts has the EPP engaged to recruit and retain candidates from diverse language, cultural, and ethnic groups? 

EPP Response: The EPP has hosted high school days for students enrolled in the teaching and education career track cluster. Last year, these included students from James Clemens High School and Bob Jones High School. Additionally, faculty visit P-12 classrooms in partner districts and serve in roles such as science fair judges. The EPP has an INCLUDE Club, which is used to promote diversity. Its members participate in events such as the International Festival hosted by the University and cultivate partnerships with other University-recognized student organizations, such as the UAH Korean Student Group and the UAH Foreign Language Club. Candidates in STEM fields are also recruited and financially supported with mentors through the University’s LSAMP grant.

 

CAEP Feedback 2: What are the student learning outcomes for the electives listed? How does the EPP demonstrate that the electives all have a similar impact on candidates’ understanding of diversity?  

EPP Response: Most of the diversity learning objectives are addressed in ED 307: Multicultural Foundations of Education and EDC 301: Teaching the Exceptional Child. A list of learning objectives for each course identified as a “diversity elective” is provided in FFR-Attachment 16.

 

Technology:

CAEP Feedback 1: What are the specific issues that provide evidence for infusion of technology in the internship evaluations? 

EPP Response: Specific indicators of candidates’ technology proficiencies in the internship are assessed using the EDUCATE Alabama document, as well as the Danielson Framework. See Evidence 1.5a and 1.5b. Interns are expected to utilize available technologies in their assigned internship classroom to support P-12 student learning.

 

Standards Pages: