
dc.contributor Christ, Tanya
dc.contributor christ@oakland.edu
dc.creator Baxa, Julienne
dc.date 2019-10-23T21:09:41Z
dc.date.accessioned 2023-03-15T11:58:36Z
dc.date.available 2023-03-15T11:58:36Z
dc.identifier http://hdl.handle.net/10323/6867
dc.identifier.uri http://localhost:8080/xmlui/handle/CUHPOERS/284436
dc.description THE DIGILIT FRAMEWORK: Selecting and integrating digital texts or tools in literacy lessons are complex tasks. The DigiLit Framework provides a succinct model to guide planning, reflection, coaching, and formative evaluation of teachers' successful selection and integration of digital texts or tools for literacy lessons. For selection, teachers need to consider content accuracy, quality for supporting literacy development, intuitiveness, and user interactivity. For integration in instruction, modeling and guided practice should be provided both for literacy skills/strategies and for the use of digital text or tool affordances. Instruction should also capitalize on digital affordances to transform instruction beyond what is possible with paper-and-pencil texts or tools. Examples of using the DigiLit Framework to evaluate digital text and tool selections and their integration in literacy instruction are provided.

DEMYSTIFYING IRI COMPREHENSION DATA: HOW ARE CLASSROOM TEACHERS USING IT? This study examined the classroom practices of nine teachers as they collected and scored data from informal reading inventories (IRIs), identified comprehension objectives, and used those data to inform comprehension instruction with 23 students. Using open coding and constant comparative analysis (Corbin & Strauss, 2015), video-recorded IRI administrations, post-IRI interviews, follow-up reading lessons, final interviews, and 440 pages of artifacts were analyzed. Data were examined for patterns of collection, scoring, comprehension objective identification, and follow-up instruction, both within and across teachers. Findings revealed that teachers showed strengths in administering suggested prompts, gaining additional information by asking open-ended questions, scoring comprehension sections completely, and scoring many sections with complete accuracy. Teachers' needs were especially evident in accurately identifying comprehension objectives for upcoming instruction based on IRI data and in providing appropriate follow-up instruction based on those data. Implications include the need to explore individualized professional development, given that different teachers had differing strengths and needs as they used IRIs to collect and score data, inform objectives, and teach comprehension lessons.
dc.relation Reading
dc.subject Literacy education
dc.subject Educational technology
dc.subject Assessment based instruction
dc.subject Comprehension
dc.subject Informal reading inventories
dc.subject Qualitative
dc.title Teacher Knowledge Matters
dc.type Dissertation


Files in this item

Baxa GS Draft Approval.pdf (1.673 MB, application/pdf)
