Conversation
erikfrey
left a comment
Looks great, let's see what we can do about improving CPU test coverage.
d_arr = d_arr[: d.nefc.numpy()[0]]
_assert_eq(d_arr, getattr(mjd, arr), arr)

@absltest.skipIf(not wp.get_device().is_cuda, "Skipping test that requires GPU.")
LOVE seeing these tests, thank you for doing this.
Let's treat using ScopedCapture inside a test as a last resort... I'm definitely guilty of using it, but only where my imagination comes up short on how to test expected behavior with only a few steps. It would be nice to keep the property that our test coverage on CPU alone is still pretty good.
Some of these unit tests only step 10 times or so; you can check, but I think for 10 steps the test time is not too bad if we step via CPU.
For the tests that step hundreds of times, can you take a look and decide if there's any way to reduce the number of steps and still have a meaningful test?
If, after reviewing these two things, we still have some tests with absltest.skipIf at the top, that's OK.
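The split being suggested might look something like this minimal sketch. It uses plain `unittest` stand-ins: `step` and `HAS_GPU` here are placeholders for the real simulation step and for `wp.get_device().is_cuda`, not the actual test code in this PR.

```python
import unittest

HAS_GPU = False  # stand-in for `wp.get_device().is_cuda`


def step(state, n):
    """Placeholder for stepping the simulation n times."""
    return state + n


class ForwardTest(unittest.TestCase):
    def test_short_rollout(self):
        # ~10 steps is cheap enough to run everywhere, so no skipIf needed.
        self.assertEqual(step(0, 10), 10)

    @unittest.skipIf(not HAS_GPU, "Skipping test that requires GPU.")
    def test_long_rollout(self):
        # Hundreds of steps: GPU-only, kept as a last resort.
        self.assertEqual(step(0, 500), 500)
```

With `HAS_GPU` false, the long rollout is skipped while the short test still exercises the code path on CPU.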
Force-pushed from d1b74f6 to ba47330
@thowell is this PR still relevant?

Yes, the tests are still relevant. This PR isn't blocking release, so I haven't prioritized it. To enable many of these tests to run in a reasonable time on CPU, we probably need to generate keyframes that can be loaded so that most steps can be skipped. The GPU-only tests are still potentially useful for local debugging (with a GPU). We could add TODOs for now and then merge.
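The keyframe idea above could be sketched as follows: precompute an expensive state once, offline, then have tests load it and take only a few steps. This is a pure-Python illustration, not the PR's code; `step` and `make_keyframe` are hypothetical stand-ins. In MuJoCo itself this would presumably correspond to storing a keyframe in the model and restoring it (e.g. via `mj_resetDataKeyframe`).

```python
def step(state):
    """Stand-in for one simulation step."""
    return state + 1


def make_keyframe(n_steps):
    """Stand-in for an offline precomputation whose result is saved as a keyframe."""
    state = 0
    for _ in range(n_steps):
        state = step(state)
    return state


# Precomputed once, offline -- not inside every test run.
KEYFRAME_500 = make_keyframe(500)


def test_after_500_steps():
    # Load the keyframe instead of stepping 500 times on CPU,
    # then take just a few steps to exercise the code under test.
    state = KEYFRAME_500
    for _ in range(3):
        state = step(state)
    assert state == 503
```

The test then costs 3 steps instead of 503, which is what makes CPU runs feasible.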
Adds tests from MuJoCo's engine_forward_test.cc: https://github.com/google-deepmind/mujoco/blob/main/test/engine/engine_forward_test.cc
Fixes the _next_activation launch in _rk_perturb_state.