Flux is a collaboration between the AVP-RCI, ITS, the Center for Advanced Computing (CAC) at the College of Engineering, LSA, and the Medical School to provide a high-quality high-performance computing cluster environment for researchers at the University of Michigan's Ann Arbor campus and their collaborators.
Access to Flux requires the purchase of an allocation; the unit of allocation is one core for one month. The rate mentioned below is for an allocation of resources and time and is not based on actual usage. The charges will appear on your monthly Statement of Activity after every month in which you had an allocation. To inquire about Flux allocations, please email: firstname.lastname@example.org.
Flux rates are available here
Training is available for all users of Flux. Training inquiries and support requests should be emailed to: email@example.com.
Flux II allocations share the same software as the CAC cluster Nyx. Software is accessible via the module system. Some units also provide their own software not managed by the CAC; that unit's module must be loaded before the unit's software is available. An example is loading the lsa module, which adds, in addition to the stock CAC modules, any software that the LSA stewards of Flux have provided.
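For example, a minimal sketch using the standard environment-modules commands (the lsa module name is from the text above; the exact module names available will depend on your unit):

```shell
# Load the LSA unit module so LSA-maintained software becomes visible
module load lsa

# List the software now available, including any LSA additions
module avail

# Then load a specific package as usual, e.g.:
# module load some_lsa_package
```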
Flux II is accessed via the Nyx login nodes, which it shares with the CAC cluster Nyx. Flux II has its own queue, called flux, which should be used in your PBS file. All Flux users must also have an account with a valid allocation. For more details on queueing options, see the Nyx PBS documentation.
Once you get your Flux allocation (in this case, let's call it "example_flux"), there are three things you'll need to change in your PBS script to use Flux: the queue, the account, and the QOS.
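A minimal PBS script sketch showing those three settings (the queue and account names come from the text above; the qos directive syntax, job name, and resource request are illustrative assumptions and should be checked against the Nyx PBS documentation):

```shell
#!/bin/bash
#PBS -N example_job
#PBS -q flux                 # the Flux queue
#PBS -A example_flux         # your Flux allocation account
#PBS -l qos=flux             # the Flux quality of service
#PBS -l nodes=1:ppn=2,walltime=1:00:00

# Run from the directory the job was submitted from
cd $PBS_O_WORKDIR
echo "Running on $(hostname)"
```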
As we expand Flux to its self-sufficient state, we are planning to increase the rate over the course of 2-4 years to reflect the full costs of operating a high-performance computing cluster.
Flux III will also have its own software library, independent of that maintained by the College of Engineering on Nyx. We expect this to be quite similar to Nyx's software library, though perhaps with a broader selection of software.
Flux III will have its own storage, networking, login, and administrative hosts, and will not share these with the College of Engineering's cluster, Nyx. This will allow for a more transparent cost structure and more independence for Flux than is currently possible.