Candy models used to make STEM accessible to visually impaired students – ScienceDaily
About 36 million people are blind, including 1 million children, and another 216 million have moderate to severe visual impairment. Yet STEM (science, technology, engineering and math) education continues to rely heavily on three-dimensional imagery, most of which is inaccessible to blind students. A study led by Bryan Shaw, Ph.D., professor of chemistry and biochemistry at Baylor University, aims to make science more accessible to people who are blind or visually impaired through small, candy-like models.
The Baylor-led study, published May 28 in the journal Science Advances, uses millimeter-scale gelatin models – similar to gummy bears – to visualize protein molecules through oral stereognosis, the perception of 3D shape with the tongue and lips. The aim of the study was to create smaller, more practical tactile 3D models of protein molecules. Proteins were selected because their structures are among the most numerous, complex and high-resolution 3D images presented throughout STEM education.
“Your tongue is your best touch sensor – about twice as sensitive as your fingertips – but it’s also a hydrostat, similar to an octopus arm. It can wiggle into grooves your fingers won’t touch, but nobody really uses the tongue or lips in tactile learning. We thought we would create very small, high-resolution 3D models and visualize them through the mouth,” said Shaw.
The study included 396 participants in total – 31 fourth- and fifth-graders and 365 college students. Participants were tested on their ability to identify specific structures by mouth, by hand and by sight. All were blindfolded during the oral and manual tactile tests.
Each participant had three minutes to assess, or visualize, the structure of a study protein with their fingertips, followed by one minute with a test protein. After the four minutes, they were asked whether the test protein was the same as or different from the study protein. The whole process was then repeated using the mouth instead of the fingers to discern the shapes.
Students recognized structures by mouth with 85.59% accuracy, similar to visual recognition using computer animation. The tests used both edible gelatin models and inedible 3D-printed models; the gelatin models were identified correctly at rates comparable to the inedible ones.
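For readers curious how a same/different recognition score like the 85.59% figure is computed, the sketch below shows one minimal way to tally percent-correct from trial responses. This is purely illustrative: the trial data and function name are invented here, not taken from the study.

```python
# Hypothetical scoring sketch for a same/different recognition task.
# The trials below are made-up examples, not the study's actual data.

def accuracy(trials):
    """Each trial is (ground_truth, response); both are 'same' or 'different'.
    Returns the fraction of trials where the response matched the truth."""
    correct = sum(1 for truth, response in trials if truth == response)
    return correct / len(trials)

trials = [
    ("same", "same"),            # correct
    ("different", "different"),  # correct
    ("different", "same"),       # error
    ("same", "same"),            # correct
]
print(f"{accuracy(trials):.0%}")  # prints "75%"
```

Averaging this per-participant score across all 396 participants would yield an overall accuracy of the kind reported above.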
“You can visualize the shapes of these tiny objects just as precisely by mouth as by sight. That was actually surprising,” said Shaw.
The models, which can be used for students with or without visual impairments, offer an inexpensive, portable, and convenient way to make 3D imaging more accessible. The study methods are not limited to molecular models of protein structures – oral visualization could be done with any 3D model, Shaw said.
Additionally, while gelatin models were the only edible models tested, Shaw’s team has created high-resolution models from other edible materials, including taffy and chocolate. Certain surface characteristics, such as regions of positive and negative surface charge on a protein, could be represented with different flavors on the model.
“This methodology could be applied to images and models of anything, like cells, organelles, 3D mathematical surfaces or 3D artwork – anything that can be rendered in 3D. It is not limited to STEM; it is also useful for the humanities,” said Katelyn Baumer, doctoral student and lead author of the study.
Shaw’s lab views oral visualization through tiny models as a beneficial addition to the multisensory learning tools available to students, especially those with extraordinary visual needs. Models like the ones in this study can make STEM more accessible to students who are blind or visually impaired.
“Blind students are routinely excluded from chemistry, and from much of STEM. Just look around our labs and you can see why – there’s braille on the elevator button to the lab and braille on the lab door, but that is about where accessibility ends. Baylor is the perfect place to start making STEM more accessible. Baylor could become an oasis for people with disabilities to learn STEM,” said Shaw.
Shaw is no stranger to high-profile research related to visual impairment. He was recognized for his work on the White Eye Detector app. Shaw and Greg Hamerly, Ph.D., associate professor of computer science at Baylor, created the mobile app, which serves as a tool for parents to screen for pediatric eye disease. Shaw’s inspiration for the app came after his son, Noah, was diagnosed with retinoblastoma at the age of four months.