For my experiment, I pair either a tone or a light with an airpuff and record the eyeblink response. Right now, my rig doesn't keep all of the elements well timed, and the brightness of the light and the loudness of the tone are hard to calibrate and adjust. It would be great to have a way to keep all the stimuli and the video collection integrated.

Professor 1
Eyeblink Operant Conditioning
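As a rough illustration of the timing problem in this pitch, here is a minimal sketch of a delay-conditioning trial schedule. Everything here is an assumption for illustration: the event names, the default durations, and the idea that the rig consumes a sorted list of timestamped events (a real rig would need hardware-level triggering to hold these latencies).

```python
# Hypothetical sketch: generate the event timeline for one delay
# eyeblink-conditioning trial, where the CS (tone or light) starts at
# t=0 and the airpuff US arrives after a fixed interstimulus interval.
# Event names and default timings are illustrative, not a real rig API.

def trial_events(cs_type, cs_duration_ms=500, isi_ms=250, puff_ms=100):
    """Return (time_ms, event) pairs, sorted by time, for one trial.

    cs_type is the conditioned stimulus channel, e.g. "tone" or "light".
    The airpuff turns on isi_ms after CS onset and lasts puff_ms.
    """
    events = [
        (0, f"{cs_type}_on"),
        (isi_ms, "airpuff_on"),
        (isi_ms + puff_ms, "airpuff_off"),
        (cs_duration_ms, f"{cs_type}_off"),
    ]
    return sorted(events, key=lambda e: e[0])
```

With the illustrative defaults, `trial_events("tone")` yields tone onset at 0 ms, airpuff from 250 to 350 ms, and tone offset at 500 ms; swapping `"light"` for `"tone"` reuses the same schedule for the other stimulus, which is one way to keep both CS types and the camera trigger on a single integrated clock.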

I'd want to do a hippocampal place cell experiment in a haunted house where the walls move behind you and rooms appear and disappear. We could even incorporate a fear conditioning task or an operant conditioning paradigm for a reward. All of the wall movements would have to be automated. How does place cell activity change when a room that was there before is suddenly different? Oh, and we'd need position tracking for the mouse too.

Professor 2
Haunted House for a Mouse
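The position tracking this pitch asks for can be sketched very simply: with an assumed overhead camera giving grayscale frames, a dark mouse on a light floor can be segmented by thresholding and located as the centroid of the dark pixels. The frame format (2D list of 0-255 intensities) and the threshold value are assumptions for illustration; a real system would use proper video capture and more robust segmentation.

```python
# Minimal sketch of overhead-camera position tracking, assuming frames
# arrive as 2D lists of grayscale pixel intensities (0 = black, 255 = white)
# and the mouse is the only dark object on a light floor.

def track_mouse(frame, threshold=50):
    """Return the (row, col) centroid of pixels darker than threshold,
    or None if no dark pixels are found in this frame."""
    row_sum, col_sum, n = 0.0, 0.0, 0
    for r, line in enumerate(frame):
        for c, px in enumerate(line):
            if px < threshold:
                row_sum += r
                col_sum += c
                n += 1
    if n == 0:
        return None  # mouse not visible, e.g. hidden by a moving wall
    return (row_sum / n, col_sum / n)
```

Feeding each frame's centroid to the wall-automation logic is what would let a room "know" the mouse has passed through, so walls only move behind the animal.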

Andre Fenton did a really cool experiment where a mouse was on a rotating platform and one segment of the circle was an active shock zone. Visual cues about location were available from the inner walls of the rotating platform and the stationary outer walls, so the mouse had to learn where the shock zone was using both the moving and the stationary cues. Instead of visual cues, I want a similar setup that integrates auditory localization tasks. The landmarks would be speakers, and the gerbil would have to continuously track where the sound was coming from to know where it was in the room and on the rotating platform. Then we could do optogenetic manipulations to see how the behavior changes!

Professor 3
Sound Localization in a Rotating World
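The core bookkeeping in this task is a reference-frame transform: a speaker fixed in the stationary room appears to rotate in the platform's frame as the platform turns. Here is a small sketch of that transform, with the coordinate convention (arena origin at the platform's axis, counterclockwise-positive angles) assumed for illustration.

```python
import math

# Sketch of the arena-to-platform coordinate transform for the rotating
# platform task: when the platform has turned by +angle (counterclockwise),
# a stationary landmark appears rotated by -angle in platform coordinates.
# Coordinate conventions here are illustrative assumptions.

def arena_to_platform(x, y, platform_angle_rad):
    """Map stationary arena coordinates (x, y) into the platform frame."""
    c = math.cos(-platform_angle_rad)
    s = math.sin(-platform_angle_rad)
    return (x * c - y * s, x * s + y * c)
```

For example, a speaker at arena position (1, 0) appears near platform position (0, -1) after a quarter turn counterclockwise. Running this transform on every tracked position is what would let the analysis score behavior in both the stationary (room/speaker) frame and the rotating (platform) frame, which is where any optogenetic effects on the two cue types could be separated.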

