Afternoon Workshops: 1 PM to 4 PM |
Workshop 1A: A Developmental and Adaptive Problem Based Learning (PBL) Model Across the Curriculum: From Theory to Practice in Integrating and Assessing PBL Experiences across the James Madison University Engineering Curriculum
Registration is complimentary thanks to support from James Madison University
Olga Pierrakos, James Madison University; Robin Anderson, James Madison University; Elise Barrella, James Madison University
In this collaborative and participant-centered workshop, faculty will be introduced to a novel and adaptive Problem-Based Learning (PBL) model developed and implemented in JMU's Engineering program over the past eight years with support from NSF awards. Participants will be provided with PBL theory, PBL examples, a PBL classification framework, assessment tools, and a PBL template for use across courses and curricula. Problem solving is generally regarded as the most important cognitive activity in everyday and professional practice. Problems in real-world practice have been described as messy, complex, and ill-structured, whereas typical engineering classroom problems have been described as well-structured with single correct solutions. How do we prepare our students for real-world problem solving? For researchers and educators alike, there is an interest in better understanding the nature of PBL experiences, because not all PBL experiences are created equal. Understanding how these problem characteristics vary is essential for demystifying the process of learning through PBL and through traditional pedagogical methods. Different PBL experiences lead to different learning outcomes. Educators should intentionally design authentic learning experiences that expose students to all types of problems - well-defined to ill-defined, simple to complex in terms of knowledge integration, individual to team-based - so that students learn to be adaptive problem solvers. |
Workshop 1B: NSF Programs that Support Engineering Education Research
Registration is complimentary thanks to support from the National Science Foundation
Ece Yaprak and Karen Crosby, National Science Foundation
The goal of this session is to inform the engineering and engineering education communities about various funding opportunities offered through the Engineering Education Centers (EEC) Division and the Division of Undergraduate Education (DUE) at the National Science Foundation (NSF). The intended audience for the session includes those eligible to submit and other project stakeholders, such as:
• 2-year and 4-year college and university faculty members in STEM and STEM education
• 2-year and 4-year college and university administrators
• STEM industry representatives
• Institutional, educational, discipline-based educational, and social/behavioral science researchers
The workshop will be interactive, and attendees will be free to ask questions throughout the presentation. NSF program directors will share details about current funding opportunities that support engineering education projects, including the Advanced Technological Education (ATE) program, which, with an emphasis on two-year colleges, focuses on the education of technicians for the high-technology fields that drive our Nation's economy. ATE involves partnerships between academic institutions and industry to promote improvement in the education of science and engineering technicians at the undergraduate and secondary school (grades 7 through 12) levels. NSF program officers will present an in-depth look at the Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) program, which seeks to increase the success of low-income, academically talented students with demonstrated financial need who are pursuing associate, baccalaureate, or graduate degrees in STEM disciplines. S-STEM provides awards to Institutions of Higher Education to fund scholarships and to enhance and study effective curricular and co-curricular activities that support recruitment, retention, student success, and graduation in STEM. Program officers will also discuss one of the newest DUE programs, Improving Undergraduate STEM Education (IUSE). Engineering education research programs in the Engineering Directorate of NSF address the Professional Formation of Engineers, which involves the formal and informal processes and value systems by which people become engineers. Hallmarks of the program include advancing holistic engineering formation; diversifying pathways to and through engineering; exploring citizen engineering, credentialing, and expertise; developing engineering-specific theories of how engineers are formed; and understanding how change in engineering formation processes travels, translates, diffuses, and/or scales. Elements of the programs under this umbrella include introductions to the profession at any age; acquisition of deep technical and professional skills, knowledge, and abilities in both formal and informal settings/domains; and development of identity as an engineer and its intersection with other identities. Additionally, the presenters will share important resources to consider when developing proposals to the NSF and discuss the importance of collaborations among 2-year and 4-year institutions, industry, and other partners to foster STEM workforce development. |
Workshop 1C: Peer Grading Development Cycle
Shawn Lupoli, University of Maryland, Baltimore County
The workshop will focus on several aspects of creating a successful peer grading experience for instructors and students. The development cycle includes building the right exam rubric, creating videos, creating a blind "coupon" cover sheet, using students and class time to grade, and storing the results. A test with a solid rubric is the best candidate for peer grading. The rubric is used to create videos that, along with a correct answer, describe each point breakdown in detail; the rubric can be rigid or allow partial credit. For the instructor, the class meeting is the only time to have everyone together, answer questions, make grading decisions that affect the whole class, show the answers, display the exam point breakdown, and, finally, grade. To get all of this done, a private set of videos is created to speed the delivery of the information needed for grading. The workshop will use a trial version of Camtasia to create the videos. When taking the test, students are given a test with the blind "coupon" cover sheet so that the student graders are later unable to determine which student actually took the test, protecting the privacy of the original test taker. After the tests have been taken, the next class meeting can be set aside for peer grading. The instructor plays the aforementioned videos, which are broken into segments so that students can focus either on the grading or on their own answer, since they may well have gotten it right themselves. Finally, attendees will be introduced to storing, organizing, and linking the videos using YouTube. The workshop is aimed at those interested in improving exam feedback and timeliness while empowering students with grading requirements and material that is essential for further learning within the course. The target audience ranges from new instructors to experienced, senior faculty who are interested in advancing the quality of their exams and in effectively correcting lingering mistakes that students might continue to make if tests were graded in the traditional, slow, and costly manner. This workshop provides an overview of relevant research literature and gives participants hands-on peer grading experience, video creation practice, and suggestions for data collection methods. |
Evening Workshops: 5 PM to 8 PM |
Workshop 2A: Tips for Turning Good Ideas into Competitive National Science Foundation Engineering Education Research CAREER Proposals: A Grant Writing Workshop
Registration is complimentary thanks to support from the National Science Foundation
Olga Pierrakos, National Science Foundation
The purpose of this participant-centered workshop is to support new faculty in developing strong engineering education research NSF CAREER proposals. Participants will learn about existing NSF programs that support engineering education research and will learn tips for developing the research plan and education plan, two critical elements of a successful NSF CAREER proposal. The workshop will be interactive in nature and will include activities designed to help participants identify areas in which their ideas and proposals can be enhanced. A pre-survey will be administered to capture participant expectations, and readings will be provided to support participants' preparation for the workshop. A post-survey will be administered to evaluate the effectiveness of the session. The authors acknowledge the National Science Foundation for supporting the time and funds used in the development of this workshop. The views expressed herein are those of the authors and do not necessarily represent those of NSF. |
Workshop 2B: Integrated Faculty Course Assessment Report (FCAR) Model with Traditional Rubric-Based (GR) Model to Enhance Automation of Student Outcomes Evaluation Process
Fong Mak, Gannon University, and Ramakrishnan Sundaram, Gannon University
The traditional rubric-based assessment model has been used widely by many universities in various formats. By and large, its major contribution to engineering accreditation is attributed to Dr. Gloria Rogers' work and workshops; this workshop will therefore refer to the traditional rubric-based assessment model as the GR Assessment Model. The essence of the GR model lies in classifying the courses in the curriculum into three levels: introductory, reinforced, and mastery. It is customary for the GR assessment model to include only mastery-level courses in the program outcomes assessment. The drawbacks of looking only at mastery-level courses are: (1) the lack of lower-level information needed to identify the root cause of a deficiency when the symptom occurs in higher-level courses; and (2) the lack of a mechanism to compute a clear indicator, such as a Student Outcome (SO) performance index based on the Performance Indicators (PIs) of that SO, to facilitate automation of the evaluation process. In this workshop, a brief summary of the essence of the GR methodology is discussed first, followed by a comparison with the essence of the FCAR methodology. A refined and tested implementation is presented to demonstrate how the GR approach can be integrated with the FCAR assessment approach to allow computation of the SO performance index from roll-up data based on the weighted average of the relevant PIs across the three levels of courses. Ultimately, each SO is assessed to determine whether its performance exceeds expectations, meets expectations, or is below expectations. Customarily, in the FCAR methodology, heuristic rules are used to gauge how the measured SO performance maps to these three expectation levels. Results showing how the SO performance index can be used to address the overall attainment of the SO expectation are presented. |
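The weighted-average roll-up described above can be sketched roughly as follows. This is an illustrative sketch only, not the workshop's actual implementation: the level weights, PI scores, and expectation thresholds are assumed values chosen for the example.

```python
# Illustrative sketch of a weighted-average SO performance index roll-up.
# The level weights and thresholds below are assumptions, not parameters
# from the FCAR/GR workshop.

LEVEL_WEIGHTS = {"introductory": 0.2, "reinforced": 0.3, "mastery": 0.5}

def so_performance_index(pi_scores):
    """pi_scores maps a course level to a list of PI scores in [0, 1].

    Each level's PI scores are averaged, then the level averages are
    combined with the level weights (renormalized over levels present).
    """
    weighted_sum = 0.0
    total_weight = 0.0
    for level, scores in pi_scores.items():
        if not scores:
            continue
        w = LEVEL_WEIGHTS[level]
        weighted_sum += w * (sum(scores) / len(scores))
        total_weight += w
    return weighted_sum / total_weight

def classify(index, meets=0.70, exceeds=0.85):
    """Map an SO index to one of the three expectation levels."""
    if index >= exceeds:
        return "exceeds expectations"
    if index >= meets:
        return "meets expectations"
    return "below expectations"

scores = {"introductory": [0.8, 0.9], "reinforced": [0.75], "mastery": [0.7, 0.8]}
idx = so_performance_index(scores)  # 0.2*0.85 + 0.3*0.75 + 0.5*0.75 = 0.77
label = classify(idx)               # "meets expectations"
```

In practice the heuristic rules the workshop mentions would replace the fixed thresholds in `classify`, and the weights would be set per program.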