Leapfrog Verification
We stressed the need for verification with our central difference implementation. Now we repeat that analysis for the leapfrog technique, both to show that it yields comparable results and to understand why the results are so similar.
Apply the same process to the staggered-time, or leapfrog, approach that we used to assess the central difference approach; otherwise, we risk putting our thumb on the scale. As before, start with a visual comparison, which again shows no visible difference. We are off to a good start.
Now proceed to collect the root-mean-square errors for the real and imaginary parts of the wave function and for the probability density. The table shows error data for every 200th frame of the animation. With 501 FDTD steps between frames, this run covers over 600,000 steps. Over the course of the run, the probability density error quickly plateaus, while the component errors grow steadily. This growth is expected and is an example of numerical dispersion.
| Frame | Steps | RMSE Re ψ | RMSE Im ψ | RMSE \|ψ\|² |
|---|---|---|---|---|
| 0 | 0 | 0.0000 | 0.0000 | 2.4794e-18 |
| 200 | 100200 | 0.0021346 | 0.0021346 | 0.0021377 |
| 400 | 200400 | 0.0042689 | 0.0042691 | 0.0027197 |
| 600 | 300600 | 0.0064026 | 0.0064026 | 0.0026756 |
| 800 | 400800 | 0.0085362 | 0.0085362 | 0.0025045 |
| 1000 | 501000 | 0.010669 | 0.010668 | 0.0023283 |
| 1200 | 601200 | 0.012800 | 0.012800 | 0.0021721 |
This table is very close to the corresponding table for the central difference implementation:
| Frame | Steps | RMSE Re ψ | RMSE Im ψ | RMSE \|ψ\|² |
|---|---|---|---|---|
| 0 | 0 | 0.0000 | 0.0000 | 3.1811e-18 |
| 200 | 100200 | 0.0021347 | 0.0021347 | 0.0021383 |
| 400 | 200400 | 0.0042693 | 0.0042693 | 0.0027217 |
| 600 | 300600 | 0.0064031 | 0.0064031 | 0.0026787 |
| 800 | 400800 | 0.0085367 | 0.0085367 | 0.0025088 |
| 1000 | 501000 | 0.010669 | 0.010669 | 0.0023335 |
| 1200 | 601200 | 0.012800 | 0.012800 | 0.0021778 |
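RMS errors like those tabulated above come from comparing each frame against a reference solution. A minimal sketch of the computation (the function name is illustrative, not from the actual implementation):

```python
import math

def rms_error(approx, exact):
    """Root-mean-square difference between two equal-length
    samples, e.g. a simulated frame and its analytic reference."""
    n = len(approx)
    return math.sqrt(sum((a - e) ** 2 for a, e in zip(approx, exact)) / n)
```

In practice the samples would be the wave function values read back from the GPU at each tabulated frame.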
There is a deep similarity between the central difference and staggered-time approaches. Each starts with the wave function at a base time, computes derivatives using the wave function at an intermediate time, then combines these to compute the wave function at a more advanced time. It is this skipping over the intermediate-time wave function that gives rise to the term leapfrog.
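The structure described above can be sketched for a 1-D Schrödinger equation with ℏ = m = 1, keeping the real part at integer times and the imaginary part at half-integer times (a Visscher-style splitting; the names and details here are illustrative, not the article's actual shader code):

```python
def hamiltonian(psi, dx, v):
    """Apply H psi = -(1/2) psi'' + V psi with a central second
    difference; the untouched endpoints act as hard walls."""
    out = [0.0] * len(psi)
    for j in range(1, len(psi) - 1):
        lap = (psi[j - 1] - 2.0 * psi[j] + psi[j + 1]) / (dx * dx)
        out[j] = -0.5 * lap + v[j] * psi[j]
    return out

def leapfrog_step(psi_r, psi_i, dx, dt, v):
    """Advance one full step in place.

    psi_r holds the real part at the base time t; psi_i holds the
    imaginary part at the intermediate time t + dt/2.  Each half of
    the update uses derivatives of the *other* component, leaping
    over the intermediate-time wave function.
    """
    h_i = hamiltonian(psi_i, dx, v)
    for j in range(len(psi_r)):            # psi_r: t -> t + dt
        psi_r[j] += dt * h_i[j]
    h_r = hamiltonian(psi_r, dx, v)
    for j in range(len(psi_i)):            # psi_i: t + dt/2 -> t + 3dt/2
        psi_i[j] -= dt * h_r[j]
```

Each call to `hamiltonian` corresponds to one compute pass, which is why the staggered-time approach dispatches the compute shader twice per step.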
Task Manager
We also take a peek at the task manager to compare the staggered-time performance against the earlier approaches. Surprisingly, the profiles are very different.
Staggered Time Performance
The graphics engine does significantly more work for the staggered-time approach, while the memory copy engine has a much more leisurely time. This is likely because the staggered-time approach makes two compute shader invocations for every time step but does not shuffle buffers around. We have not even added boundary conditions to the staggered-time implementation yet.
Central Difference Performance
These differences present a choice encountered frequently in software engineering: both approaches yield similar overall performance, but one makes more demands on memory while the other makes more demands on compute resources.
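A hypothetical sketch of the two per-step loop structures makes the tradeoff concrete (the `dispatch*` callables stand in for compute shader dispatches and are not from the actual implementation):

```python
def central_difference_step(prev, curr, nxt, dispatch):
    """One step: a single compute pass, then rotate three buffers so
    the newest result becomes current.  The rotation is the memory
    traffic that keeps the copy engine busy."""
    dispatch(prev, curr, nxt)
    return curr, nxt, prev

def staggered_time_step(psi_r, psi_i, dispatch_r, dispatch_i):
    """One step: two compute passes updating each component in
    place, with no buffer rotation afterward."""
    dispatch_r(psi_r, psi_i)
    dispatch_i(psi_i, psi_r)
```

The central difference variant trades a second compute pass for buffer shuffling; the staggered-time variant does the opposite.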