The Great Lakes Slurm cluster is a campus-wide computing cluster that serves the broad needs of researchers across the university. The Great Lakes HPC Cluster has replaced Flux, the shared research computing cluster that previously served over 300 research projects and 2,500 active users.
The Great Lakes HPC Cluster is available to all researchers on campus for simulation, modeling, machine learning, data science, genomics, and more. The platform provides a balanced combination of computing power, I/O performance, storage capability, and accelerators.
Based on extensive input from faculty and other stakeholders across campus, the Great Lakes HPC Cluster is designed to deliver similar services and capabilities as Flux, including access to GPUs and large-memory nodes and improved support for emerging uses such as machine learning and genomics. The Great Lakes HPC Cluster consists of approximately 13,000 cores.
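For orientation, resources on a Slurm cluster such as Great Lakes are typically requested through a batch script. The sketch below uses standard Slurm directives; the account and partition names are illustrative assumptions, so consult the cluster documentation for the actual values on Great Lakes:

```bash
#!/bin/bash
# Minimal Slurm batch script (sketch; account and partition names are assumptions)
#SBATCH --job-name=example         # job name shown in the queue
#SBATCH --account=example_project  # Slurm account to charge (assumption)
#SBATCH --partition=standard       # partition name (assumption; check cluster docs)
#SBATCH --nodes=1                  # number of nodes
#SBATCH --ntasks-per-node=1        # tasks (processes) per node
#SBATCH --cpus-per-task=4          # CPU cores per task
#SBATCH --mem=8g                   # memory per node
#SBATCH --time=01:00:00            # wall-clock limit (HH:MM:SS)

srun hostname                      # replace with the actual workload
```

A script like this would be submitted with `sbatch script.sh`, and `squeue -u $USER` lists your queued and running jobs.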
See the Grant Resources page for information regarding grant proposals using the Great Lakes HPC Cluster.
See the LSA funding page for information on funding courses at the College of Literature, Science, and the Arts. LSA researchers who do not have access to any other account may be eligible to use the accounts provided centrally by LSA. The usage policy and restrictions on these accounts are described in detail on the LSA’s public Great Lakes accounts page.
Questions about access or use of these accounts should be sent to firstname.lastname@example.org.
See the Great Lakes Student Teams and Organizations page if your team requires HPC resources.
To establish a Slurm account for a class, please contact us at email@example.com with the following information:
- A list of students to be added to the account
- A list of individuals to administer the account
- Any limits to be placed on either the users or the account as a whole
Please note: all students will need a user login to use the account and can request one via this form: https://arc-ts.umich.edu/login-request/
For technical support, email firstname.lastname@example.org.