The language below can be used in grant submissions to government agencies or other funding entities.


Great Lakes Description

Computing

Great Lakes is a Linux-based high-performance computing (HPC) cluster intended to support parallel and other applications that are not suitable for departmental or individual computers. Each Great Lakes compute node comprises multiple CPU cores with at least 4 GB of RAM per core, and the cluster has approximately 14,000 cores in total. All compute nodes are interconnected with InfiniBand networking.

The large-memory Great Lakes hardware comprises three compute nodes, each configured with 1.5 TB of RAM.

Great Lakes contains 20 GPU nodes with a total of 40 NVIDIA Tesla V100 CUDA-capable GPUs. There are also four visualization nodes, each equipped with a single NVIDIA Tesla P40.

Computing jobs on Great Lakes are managed through the Slurm scheduler.
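For illustration, a job on Great Lakes might be submitted with a batch script like the following minimal sketch. The account name is a placeholder, and the partition, task, and memory settings are assumptions to adapt to your own allocation:

```shell
#!/bin/bash
# Minimal Slurm batch script sketch for Great Lakes.
# "example_account" is a placeholder; partition, task, and memory
# settings are assumptions -- adjust them for your own allocation.
#SBATCH --job-name=example
#SBATCH --account=example_account
#SBATCH --partition=standard
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=4
#SBATCH --mem-per-cpu=4g        # matches the 4 GB per core noted above
#SBATCH --time=01:00:00

# Run one copy of the command per allocated task.
srun hostname
```

Such a script would be submitted with `sbatch` and monitored with `squeue`.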

The Great Lakes Configuration page has a more detailed description of the Great Lakes cluster.

Storage

The high-speed scratch file system provides 2 petabytes of storage at approximately 80 GB/s aggregate performance (compared to 7 GB/s on Flux).

Intra-networking
All Great Lakes nodes are interconnected with InfiniBand HDR100 networking, capable of 100 Gb/s throughput. In addition to the InfiniBand networking, there is 25 Gb/s Ethernet for the login and transfer nodes, and a gigabit Ethernet network that connects the remaining nodes and is used for node management and NFS file system access.

Inter-networking
Great Lakes is connected to the University of Michigan’s campus backbone to provide access to student and researcher desktops as well as other campus computing and storage systems. The campus backbone provides 100 Gb/s connectivity to the commodity Internet and the research networks Internet2 and MiLR.

Software
The Great Lakes cluster includes a comprehensive suite of commercial and open-source research software, including major compilers and many common research-specific applications such as Mathematica, MATLAB, R, and Stata.

Data Center Facilities
Great Lakes is housed in the Michigan Academic Computing Center (MACC).

Hardware Grants
Lighthouse Operating Environment is a service that allows researchers to add their own compute hardware to the Great Lakes cluster, in order to take advantage of the data center, support, networking, storage, and basic software. For more information, visit the Lighthouse Operating Environment page.

Support
Great Lakes computing services are provided through a collaboration of University of Michigan units: Advanced Research Computing (in the Office of the VP of Research and the Provost’s Office), and computing groups in schools and colleges at the university.


The Following Steps Will Help You Include Great Lakes in a Grant Proposal

1. Determine the suitability of Great Lakes for your research by considering whether a large computing resource is required. Proposed funds should provide computing cycles in a way that lets the research team allocate them as needed. Great Lakes is an on-demand service billed based on usage; this billing structure is flexible to meet researcher needs and to make the best possible use of the awarded funds. Note that faculty-owned or faculty-provided hardware cannot be accepted into Great Lakes.

2. Determine whether the constraints on access to Great Lakes are suitable for your project. Access to Great Lakes and its software library is granted to all University of Michigan faculty, staff, and graduate and undergraduate students. Contractors and collaborators from other institutions may not use Great Lakes because of licensing limitations on third-party commercial software.

3. Determine an appropriate budget to include in the proposal; the cost per core-month is an approved rate and may be charged as a direct cost to federal grants. There is currently no cost, but rates for budgeting purposes will be posted on the Rates page. For questions or more information about estimating usage, contact hpc-support@umich.edu.

4. Use the appropriate parts of the Great Lakes Description above in your proposal. In NSF proposals, use the category “computer service” and the phrase “cluster compute allocation,” with quantities expressed in core-months or core-years, to describe Great Lakes time.
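As a back-of-the-envelope sketch of expressing a request in core-months, the quantity is simply average cores in use multiplied by months of use. The rate below is purely a hypothetical placeholder; actual figures must come from the Rates page:

```shell
# Budgeting sketch: quantities for a proposal expressed in core-months.
# The rate is a hypothetical placeholder -- use the official Rates page figure.
cores=100            # average number of cores in continuous use
months=6             # duration of that usage
rate_cents=1000      # placeholder rate per core-month, in cents

core_months=$((cores * months))
cost_cents=$((core_months * rate_cents))
echo "core-months: ${core_months}"
echo "estimated cost: \$$((cost_cents / 100))"
```

A request for 100 cores over six months would thus be written as 600 core-months (or 50 core-years).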

5. Plan for the end of the award period or the exhaustion of the funds; at that point, no further jobs associated with that Great Lakes project can run.