This movie was made by Celia Eddy and Josh Russell in Ben’s “Sonic and Visual Representation of Data” class, Spring 2017. It was made with Python and RTcmix, using data from two simulation methods, Instaseis and mineos. The explanation is contained within!
This is a recent movie we put together using simulations of the Tohoku earthquake, with sound from broadband seismometers (8 stations along a great circle, mixed to stereo). We have been experimenting with ways to render the SPECFEM3D simulations so as to visualize surface and body waves separately; these occupy the relatively low- and high-frequency parts of the spectrum, respectively, and correspond to the sounds, which are given low- and high-pass filters. So the two renderings shown here come from the same seismic data and the same simulation, just filtered differently. Matt has been developing methods in “yt” to render complex migrating wave fronts, which is an interesting visualization problem. This is our first presentation of these movies, with lots of improvement to come. For best results, view the movie at full screen and use big headphones.
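The low-pass/high-pass split described above can be sketched in a few lines of Python. This is a minimal illustration of the general technique, not the code used for the movie: the sampling rate, crossover frequency, and synthetic trace below are all assumptions chosen for the example, and the real work would filter actual seismometer or simulation output.

```python
# Hypothetical sketch: splitting one trace into a low-frequency band
# (emphasizing surface waves) and a high-frequency band (emphasizing
# body waves), as in the two renderings described above.
# The sampling rate and crossover frequency are illustrative assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 20.0        # sampling rate in Hz (assumed)
crossover = 0.05 # crossover frequency in Hz (assumed)

# Synthetic stand-in for a seismic trace: a slow oscillation plus a
# faster, smaller-amplitude one.
t = np.arange(0, 600, 1 / fs)
trace = np.sin(2 * np.pi * 0.02 * t) + 0.3 * np.sin(2 * np.pi * 0.5 * t)

# Zero-phase Butterworth filters on either side of the crossover.
low_sos = butter(4, crossover, btype="low", fs=fs, output="sos")
high_sos = butter(4, crossover, btype="high", fs=fs, output="sos")

low_band = sosfiltfilt(low_sos, trace)   # "surface wave" rendering/sound
high_band = sosfiltfilt(high_sos, trace) # "body wave" rendering/sound
```

The same two bands could then drive both the visualization and the audio, keeping the picture and the sound consistent with each other.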