Conversation

@peterkasson
Contributor

This seems a bit kludgy, but it's only weird in one case (both gmx_mpi and mdrun_mpi exist). I've now found at least two major HPC systems (Stampede, ARCHER) where gmx exists and is thread-MPI, mdrun_mpi exists and is real MPI, and mdrun exists and is thread-MPI. This patch fixes the "no mpi executables" error in that case.
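The selection logic described above can be sketched roughly as follows. This is an illustrative sketch only, not Copernicus's actual implementation; the function name `pick_mdrun` and the flavor-mapping input are hypothetical, and a real version would have to probe each binary (e.g. via its `-version` output) to learn its MPI flavor.

```python
def pick_mdrun(flavors):
    """Pick an mdrun command given a mapping of candidate executable
    names to their MPI flavor ("mpi" or "thread_mpi").

    Preference: a real-MPI binary beats a thread-MPI gmx/mdrun, which
    covers the Stampede/ARCHER layout described in the comment above.
    """
    # Prefer a genuinely MPI-enabled binary for multi-node runs.
    for name in ("gmx_mpi", "mdrun_mpi"):
        if flavors.get(name) == "mpi":
            return name
    # Fall back to a thread-MPI binary for single-node runs.
    for name in ("gmx", "mdrun"):
        if name in flavors:
            return name
    raise RuntimeError("no mpi executables")

# The Stampede/ARCHER case: gmx and mdrun are thread-MPI,
# mdrun_mpi is real MPI, so mdrun_mpi is chosen.
available = {"gmx": "thread_mpi", "mdrun": "thread_mpi", "mdrun_mpi": "mpi"}
print(pick_mdrun(available))  # -> mdrun_mpi
```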

@peterkasson
Contributor Author

There are a few commits in here (I've tried to keep them separate). Two major topics and one minor:

  1. Fixing Gromacs command names so Copernicus can handle Gromacs 5.1
  2. First stage of refactoring cpcc to allow function calls from external Python as well as from the command line

Minor: fixes for some tutorial bugs.
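The refactoring pattern in point 2 can be sketched as below: the command logic lives in a plain function that external Python code can import and call, and the command-line entry point becomes a thin wrapper around it. The names here (`run_command`, `main`) are hypothetical, not cpcc's actual API.

```python
import argparse

def run_command(command, args=None):
    """Core entry point, callable directly from external Python code."""
    args = args or []
    # A real version would dispatch to the requested cpcc subcommand;
    # here we just echo the command that would be run.
    return " ".join(["cpcc", command] + list(args))

def main(argv=None):
    """Thin command-line wrapper around run_command()."""
    parser = argparse.ArgumentParser(prog="cpcc")
    parser.add_argument("command")
    parser.add_argument("args", nargs="*")
    ns = parser.parse_args(argv)
    print(run_command(ns.command, ns.args))

if __name__ == "__main__":
    main()
```

With this split, a script can call `run_command("status", ["myproject"])` directly instead of shelling out to the cpcc binary.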

@peterkasson
Contributor Author

Added more patches to run with Gromacs 5.1.

