Biological imaging of live cells and tissues using 3D microscopy can capture time-lapse image sequences in which multiple molecular markers simultaneously label different biological structures. Analyzing such complex multi-dimensional image sequence data requires automated quantitative algorithms as well as methods for visualizing and interacting with both the data and the analytical results. Traditional human input devices such as the keyboard and mouse are no longer adequate for complex tasks such as manipulating and navigating volumes in three or more dimensions. In this paper, we present a new interaction system for interfacing with large data sets that engages the human visual system together with touch, force, and audio feedback. The system combines real-time dynamic 3D visualization, haptic interaction via an exoskeletal glove, and tonal auditory cues to create a seamless, immersive environment for efficient qualitative analysis.