This module contains two distinct methods for:
These methods both leverage three.js to render the visualisations in the browser. Visualisation options are controlled via parameters in the page URL.
To view local data, it must be served by a local web server. This can be done using the TexturingServer described elsewhere, or, if no texturing is needed, two lightweight servers are included that work under Python or JS:
Follow the instructions to compile the TexturingServer.
sudo apt-get install nodejs (tested with node.js v11.13.0)
npm install express cors glob
This should work under python3, with no additional dependencies, so as long as you have either Python or node.js installed on your system there is no need to install anything else.
Run ./Texturing/TexturingServer to start a web server which will be used to serve local files.
Open visualise/index.html in Chrome or Firefox. Supermedium is recommended if you want to use VR.
Visualisation options can be edited in index.html or set via the URL, for example:
fname: folder name where data is stored. Trailing slash is optional. Default is
anaglyph. Default is undefined (normal and default); one mode requires the TexturingServer to be running and works in any number of dimensions, while rotations2 can be computed in the browser, can be used with any server, and does the calculations on the GPU at render time, but only works for N = 3 or 4.
false to start time marching on load. Default is
float that sets the speed of time marching. Units are DEM time units/second. Default is
false to render shadows. Default is
int to set the quality of rendering. Higher is more computationally expensive. Default is
float to set how much to zoom in. Default is
pinky: only used if
int that sets which particle to render in pink. Default is
false to cache local data in the browser. Default is
false to disable tori in VR mode to make things harder for the user. Default is false. This is meant to be more 'fun' but YMMV.
no_axes: include this flag to disable drawing of axes.
no_walls: include this flag to disable drawing of walls.
false to view in quasicrystal mode. Default is
false to load MercuryDPM data instead of NDDEM data. Default is
colour_scheme: set to inverted to invert the global colour scheme to have a white background.
float to rotate the torus around the x1 axis, in degrees.
false to save frames when autoplay is ticked. Default is
initial_camera_location: three numbers (e.g. initial_camera_location=1,2,3) to set the initial camera location if so desired.
camera_target: three numbers (e.g. camera_target=1,2,3) to set what the camera is pointing at if so desired.
float to set the initial timestep to display. Default is
binary: if included, this flag loads binary data instead of CSV data. You can convert CSV to binary data using the included script
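Putting the options above together in practice just means building a query string. The sketch below uses parameter names from this list; the host, port, and values are placeholders for wherever visualise/index.html happens to be served:

```javascript
// Sketch: composing a visualisation URL from the options listed above.
// Parameter names come from this document; everything else is a placeholder.
function visualiseURL(base, opts) {
  const params = new URLSearchParams(
    Object.entries(opts).map(([k, v]) => [k, String(v)])
  );
  return `${base}?${params.toString()}`;
}

const url = visualiseURL('http://localhost:8000/visualise/index.html', {
  fname: 'Samples/D3/', // folder where the simulation data is stored
  quality: 5,           // rendering quality (higher is more expensive)
  zoom: 20,
  rate: 1,              // DEM time units per second
});
```

Flags without values (no_axes, no_walls, binary) would simply be appended to the query string by name.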
An example command that works for me, when the TexturingServer is running:
Below is the default output of the visualisation software for a 3D simulation of spheres flowing down an inclined plane. You can move through time manually, or enable auto playing in the top right. The playback speed is controlled with the "Rate" parameter.
But what about dimensions higher than 3? For this, we need some way to project particles from higher dimensions down to 3D. In this example, we will project a 3D sphere down to a 2D circle by slicing it. Move the pink plane on the left around with the slider and see the resulting size of the circle on the right. The circle is largest when the sphere is sliced through its centre.
Now, let's try the same thing, but slicing a 5D hypersphere with a 3D volume. There are now two coordinates for where we are slicing, which you can control with the sliders. The torus on the right-hand side represents where we are in these two dimensions. Notice how the small pink ball moves as you move around; it represents the location of the hypersphere relative to these two coordinates. The sphere on the left is again largest when we slice through its centre, when the ball is at the centre of the black cross.
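The geometry behind both demos is the same: slicing an N-dimensional sphere of radius R at distances d_i along the sliced-away axes leaves a lower-dimensional sphere of radius sqrt(R² − Σ d_i²). A small sketch (the function name is ours, not from the code):

```javascript
// Radius of the lower-dimensional slice of an N-sphere of radius R,
// where `offsets` holds the distance from the centre along each
// sliced-away axis (one offset for the 3D→2D demo, two for 5D→3D).
function sliceRadius(R, offsets) {
  const d2 = offsets.reduce((sum, d) => sum + d * d, 0);
  return d2 > R * R ? null : Math.sqrt(R * R - d2); // null: no intersection
}

sliceRadius(1, [0]);        // slice through the centre → radius 1, the maximum
sliceRadius(1, [1]);        // slice tangent to the surface → radius 0
sliceRadius(1, [0.5, 0.5]); // 5D sphere sliced off-centre → radius ≈ 0.707
```

This is why the circle (or sphere) you see is largest exactly when the slice passes through the centre: every offset term then vanishes.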
Let's now look at inclined plane flow in 5D.
To be able to see a particle rotate, we need to attach a texture to it. Here is how we visualise the earth rotating.
In 3D, there are three directions that an object can rotate in.
As we can't render the textures needed to do this without a local installation, here is a video explaining how rotations work.
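More generally, rotations act in planes rather than about axes, so in N dimensions there are N(N − 1)/2 independent rotation planes; in 3D this count happens to equal the number of axes, which is why the two pictures coincide. A one-liner to count them:

```javascript
// Number of independent rotation planes in N dimensions: C(N, 2).
const rotationPlanes = (N) => (N * (N - 1)) / 2;

rotationPlanes(3); // → 3, matching the three rotation directions in 3D
rotationPlanes(5); // → 10 rotation planes for a 5D simulation
```

This growth is one reason rendering rotations in higher dimensions needs the texturing machinery described above rather than simple rotation axes.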
Initialise the threejs scene, adding everything necessary, such as camera, controls, lighting etc.
Get the current orientation of the left hand controller and set world coordinates appropriately
Get the current orientation of the right hand controller and set world coordinates appropriately
Add the left oculus controller
Add the right oculus controller
Add the two vive controllers
Make the camera and position it
Add the renderer and associated VR warnings if necessary
Add the non-VR GUI and set all sliders
Add the non-VR and/or VR controllers and associated buttons
If the current example requires a specific direction for the camera to face at all times (e.g. a 1D or 2D simulation) then set that
Make the axes, including labels and arrows
Make the scene lighting
Make any necessary walls
Add the torus (or tori) as necessary
Remove everything from the scene. Very useful for presentation mode, when we don't want to kill the computer by loading multiple scenes simultaneously
Make the initial texturing if showing rotations
Load particles from MercuryDPM file format - NOTE: THIS IS NOT WORKING YET
Make the initial particles
Load textures from TexturingServer
Update textures from TexturingServer
Update sphere locations
Update camera and renderer if window size changes
In catch_mode, let the user know if they found the pink ball
Animation loop that runs every frame
Do the actual rendering
Remove loading screen