Associate Professor of Computer Science & Cyber Security
About Me
Dr. Jerry Schnepp is the Robert Miner Endowed Chair of the Department of Computer and Cyber Security at Roosevelt University. Before joining Roosevelt, he led the Computer Science Department at Judson University and served as an Associate Professor of Visual Communication Technology at Bowling Green State University, where he also directed the Collab Lab.
Dr. Schnepp’s research focuses on advancing AI-supported learner experience design, assessing computational thinking, and developing interactive learning and assessment tools. His work also extends to enhancing computerized sign language synthesis. He is the recipient of the 2022 BGSU Faculty Mentor of the Year Award, the 2018 Elliott L. Blinn Award for his support of undergraduate research and creative work, and the 2018 Faculty Excellence Award from the Association of Technology Management and Applied Engineering. His scholarship has been published in journals including The International Journal of Teaching and Learning in Higher Education, The Journal of Information Technology Education, Computers in Education Journal, and Sign Language & Linguistics.
- Semi-Automated Learner Experience Design Using a Large Language Model (LLM) and Retrieval-Augmented Generation (RAG)
The goal of this project is to automatically produce plans for personalized learning experiences based on insights gleaned from transcripts of student interviews and selected documents on pedagogy. It uses a retrieval-augmented generation approach that harnesses the capabilities of a preexisting large language model. Prompts are adjusted dynamically with information drawn from the student interviews and the pedagogy documents, ensuring that the generated learning experiences are not only personalized but also grounded in established educational principles.
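The retrieval-and-prompting loop at the heart of this approach can be sketched as follows. This is a minimal illustration, not the project's actual implementation: the bag-of-words retriever, sample passages, and prompt template are hypothetical stand-ins (a production system would retrieve over embedded chunks of the pedagogy PDFs and send the assembled prompt to an LLM API):

```python
import math
from collections import Counter

def vectorize(text):
    """Crude bag-of-words vector; a real system would use embeddings."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-frequency Counters."""
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def retrieve(query, passages, k=2):
    """Return the k passages most similar to the query."""
    q = vectorize(query)
    ranked = sorted(passages, key=lambda p: cosine(q, vectorize(p)),
                    reverse=True)
    return ranked[:k]

def build_prompt(interview_insight, passages):
    """Assemble a prompt that grounds the insight in retrieved pedagogy."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "You are an instructional designer.\n"
        f"Student insight: {interview_insight}\n"
        f"Pedagogical grounding:\n{context}\n"
        "Draft a personalized learning experience plan."
    )
```

The dynamic adjustment described above happens in `build_prompt`: each student insight pulls in different pedagogical passages, so the same template yields a different, grounded prompt per student.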
- Mobile Technology to Support Experiential Learning
EASEL (Education through Application-Supported Experiential Learning) is a mobile platform that lets instructors deliver reflection prompts and content before, during, and/or after a learning experience, placing direct contact between student and instructor at the core of the learning environment. The platform leverages the inherent functionality of mobile devices, such as GPS and persistent network connectivity, to trigger reflection prompts based on a student's location or the time. EASEL supports multiple input modalities, including text, voice recording, photography, and video, offering students convenient choices for recording their reflections.
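The time- and location-based triggering can be sketched roughly like this. The prompt schema here (a `window` of datetimes and a `geofence` of latitude, longitude, and radius) is a hypothetical illustration for this page, not EASEL's actual data model:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    R = 6371000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def should_trigger(prompt, now, lat, lon):
    """Fire a reflection prompt only when all of its conditions hold.

    prompt: dict with an optional "window" (start, end datetimes) and an
    optional "geofence" (lat, lon, radius_m). A prompt with neither
    condition always fires.
    """
    if "window" in prompt:
        start, end = prompt["window"]
        if not (start <= now <= end):
            return False
    if "geofence" in prompt:
        glat, glon, radius_m = prompt["geofence"]
        if haversine_m(lat, lon, glat, glon) > radius_m:
            return False
    return True
```

On a device, a check like this would run when the OS reports a location update or a scheduled time fires, rather than by polling.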
- Computerized Sign Language Synthesis
I collaborate with an international team to develop software that translates spoken English into animations of American Sign Language (ASL), the primary language of the Deaf community in North America. The undertaking is both broad and challenging: ASL has its own grammar and is at least as different from English as any spoken language, and it is expressed in a visual/gestural modality rather than the aural/written modality of spoken languages. These challenges present opportunities to explore varied approaches to both software engineering and usability testing. The software is complex, combining an intricate animation system, a linguistic representation, and a unique user interface. Its implementation draws on the diverse fields of computer graphics, linguistics, natural language processing, and human-computer interaction, necessitating multidisciplinary collaboration.
- Ph.D. Computer Science — DePaul University
- M.S. Human-Computer Interaction — DePaul University
- B.A. Communication — University of Illinois at Chicago