Students learn about the similarities between the human brain and its engineering counterpart, the computer. Since students work with computers routinely, this comparison strengthens their understanding of how the brain works and how its functioning parallels that of a computer. Students are also introduced to the "stimulus-sensor-coordinator-effector-response" framework for understanding human and robot actions.
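The framework can be read as a four-step pipeline. As a purely illustrative sketch (in Python, with made-up names and a made-up threshold; not part of the curriculum materials), the chain looks like this:

```python
# Minimal model of the "stimulus-sensor-coordinator-effector-response" chain.
# All names and numbers are illustrative; a real robot replaces these stubs
# with hardware input/output.

def sensor(stimulus: float) -> float:
    """Sensor (eye, ear, touch sensor): turn a physical stimulus into a signal."""
    return stimulus

def coordinator(signal: float) -> str:
    """Coordinator (brain or NXT brick): decide on an action from the signal."""
    return "retreat" if signal > 50.0 else "carry on"

def effector(decision: str) -> str:
    """Effector (muscles or motors): carry out the decision as a response."""
    return f"motors: {decision}"

stimulus = 72.0  # e.g., a loud noise or a hard bump
print(effector(coordinator(sensor(stimulus))))  # -> motors: retreat
```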
Students learn about the anatomy of the ear and how the ears work as sound sensors. The parts and structures of the ear are explained in detail, as is how sound is transmitted through them, first mechanically and then electrically, to the brain. Students use LEGO® robots with sound sensors to measure sound intensities, learning how the NXT brick (the robot's computer) converts the intensity measured at the sensor input into a number that it displays on its screen. They build on their experiences from the previous activities and establish a rich understanding of the sound sensor and its relationship to the TaskBot's computer.
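The sensor-to-number idea at the heart of the activity can be sketched in a few lines of Python. The read_sound_sensor() helper is hypothetical, standing in for the platform's actual hardware call; NXT sound sensors report intensity as a percentage from 0 (quiet) to 100 (loud):

```python
import random
import time

def read_sound_sensor() -> int:
    """Hypothetical hardware call; a real program would query the sound sensor.
    Readings are a percentage: 0 (quiet) to 100 (loud)."""
    return random.randint(0, 100)  # simulated reading for this sketch

# Poll the sensor and show the number, as the NXT brick does on its screen.
for _ in range(5):
    print(f"sound level: {read_sound_sensor()}%")
    time.sleep(0.5)
```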
Students are provided with a rigorous background in human "sensors" (including information on the five main senses, sensor anatomies and nervous system processes) and their engineering equivalents, setting the stage for three associated activities involving sound sensors on LEGO® robots. As they learn how robots receive input from sensors, transmit signals and make decisions about how to move, students reinforce their understanding of the human body's sensory process.
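That input-signal-decision chain can be mirrored in a short control loop. Everything below is a stub (no real hardware is touched), but the structure, poll a sensor, decide, act, is the one the activities build toward:

```python
import random

def read_sensor() -> int:
    """Stub for any robot sensor; returns a reading from 0 to 100."""
    return random.randint(0, 100)

def choose_move(reading: int) -> str:
    """Decision logic: keep moving while the input stays below a trigger level."""
    return "forward" if reading < 70 else "stop"

for step in range(5):              # poll-decide-act loop
    reading = read_sensor()        # receive input from the sensor
    move = choose_move(reading)    # the computer decides how to move
    print(f"step {step}: reading={reading} -> {move}")
```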
Students learn about the human body's system components, specifically its sensory systems, nervous system and brain, while comparing them to robot system components, such as sensors and computers. The unit's life sciences-to-engineering comparison is accomplished through three lessons and five activities. The important "stimulus-sensor-coordinator-effector-response" framework is introduced to show how it improves our understanding of the cause-and-effect relationships in both systems. This framework reinforces the theme of the human body as a system, seen from the perspective of an engineer. This unit is the second in a series, intended to follow the Humans Are Like Robots unit.
Students observe and test their reflexes, including the (involuntary) pupillary response, and measure (voluntary) reaction times using their dominant and non-dominant hands, as a way to further explore how reflexes occur in humans. They gain insight into how our bodies react to stimuli, and how some reactions and body movements are controlled automatically, without conscious thought. Using information from the associated lesson about how robots react to situations, including the stimulus-to-response framework, students see how engineers use human reflexes as models when designing robot controls.
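A classic way to quantify hand reaction time, whether or not a class uses this exact variant, is the ruler-drop test: a partner releases a ruler between a student's fingers, and the distance it falls before being caught converts to a time via the free-fall relation d = ½gt², i.e., t = √(2d/g). A quick Python check with illustrative catch distances:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def reaction_time(catch_distance_m: float) -> float:
    """Ruler-drop test: distance fallen d = 0.5*g*t^2, so t = sqrt(2d/g)."""
    return math.sqrt(2.0 * catch_distance_m / G)

# Illustrative catches: dominant hand at 15 cm, non-dominant at 20 cm.
for hand, d in [("dominant", 0.15), ("non-dominant", 0.20)]:
    print(f"{hand}: caught at {d * 100:.0f} cm -> {reaction_time(d) * 1000:.0f} ms")
```

With these example distances, the dominant hand comes out around 175 ms and the non-dominant around 200 ms, typical of the asymmetry students observe.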
Students learn about human reflexes: how our bodies react to stimuli and how some reactions and movements are controlled automatically, without conscious thought about the movement or response. In the associated activity, students explore how reflexes work in the human body by observing an involuntary human reflex and testing their own reaction times with their dominant and non-dominant hands. Once students understand the components of the stimulus-to-response framework as a way to describe human reflexes and reactions, they connect this knowledge to how robots can be programmed to perform similar reactions.
Why do humans have two ears? How do the properties of sound help with directional hearing? Students learn about directional hearing and how our brains determine the direction of a sound from the difference in arrival times of its sound waves at our right and left ears. Student pairs use experimental set-ups that include the headset portions of stethoscopes to investigate directional hearing, testing each other's ability to identify the direction from which sounds originate.
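The underlying physics is the interaural time difference: a sound off to one side travels roughly an extra d·sin(θ) to reach the far ear, where d is the distance between the ears and θ the angle from straight ahead, so the delay is Δt = d·sin(θ)/c. A short Python sketch of the scale involved (the head width is approximate):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, in air at room temperature
EAR_SEPARATION = 0.20   # m, approximate distance between the two ears

def interaural_delay_s(angle_deg: float) -> float:
    """Extra travel time to the far ear for a source at angle_deg from straight ahead."""
    return EAR_SEPARATION * math.sin(math.radians(angle_deg)) / SPEED_OF_SOUND

for angle in (0, 30, 60, 90):
    print(f"source at {angle:2d} deg -> delay {interaural_delay_s(angle) * 1e6:5.0f} us")
```

Even a sound directly to one side produces a delay of only about 0.6 milliseconds, which is the tiny cue the brain resolves when it localizes sound.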
With the challenge to program computers to mimic the human reaction to touching a hot object, students program LEGO® robots to "react" and quickly move back when their touch sensors bump into something. By relating human senses to the electronic sensors used in robots, students see the similarities between the human brain and its engineering counterpart, the computer, and come to better understand how sensors function in both contexts. They apply the "stimulus-sensor-coordinator-effector-response" framework to logically analyze human and robot actions.
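In text form, the program's logic is tiny. The Python sketch below is only pseudocode for the NXT-G behavior; wait_for_touch() and drive() are hypothetical stand-ins for the real sensor and motor blocks:

```python
import time

def wait_for_touch() -> None:
    """Hypothetical stand-in for the touch-sensor wait block."""
    input("press Enter to simulate the touch sensor being bumped... ")

def drive(power_pct: int, seconds: float) -> None:
    """Hypothetical stand-in for the motor block; negative power reverses."""
    print(f"motors at {power_pct}% for {seconds} s")
    time.sleep(seconds)

# Bump (stimulus) -> touch sensor -> program (coordinator) -> motors (effectors):
wait_for_touch()     # block until the touch sensor is pressed
drive(-100, 1.0)     # "react": back up quickly for one second
drive(0, 0.0)        # stop
```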