Author: Felzmann, C.U.
WED3O01: MASSIVE: an HPC Collaboration to Underpin Synchrotron Science
 
  • W.J. Goscinski
    Monash University, Faculty of Science, Clayton, Victoria, Australia
  • K. Bambery, C.J. Hall, A. Maksimenko, S. Panjikar, D. Paterson, C.G. Ryan, M. Tobin
    ASCo, Clayton, Victoria, Australia
  • C.U. Felzmann
    SLSA, Clayton, Australia
  • C. Hines, P. McIntosh
    Monash University, Clayton, Australia
  • D.A. Thompson
    CSIRO ATNF, Epping, Australia
 
  MASSIVE is Australia's specialised High Performance Computing (HPC) facility for imaging and visualisation. The project is a collaboration between Monash University, the Australian Synchrotron and CSIRO. MASSIVE underpins a range of advanced instruments, with a particular focus on Australian Synchrotron beamlines. This paper reports on the outcomes of the MASSIVE project since 2011, focusing in particular on instrument integration and interactive access. MASSIVE has developed a unique capability that supports an increasing number of researchers generating and processing instrument data. The facility runs an instrument integration program to help facilities move data into an HPC environment and to provide in-experiment data processing. This capability is best demonstrated at the Imaging and Medical Beamline, where fast CT reconstruction and visualisation are now essential to performing effective experiments. The MASSIVE Desktop provides an easy way for researchers to begin using HPC, and is now an essential tool for scientists working with large datasets, including large images and other types of instrument data.
Slides: WED3O01 [28.292 MB]
 
WEM303: Virtualisation within the Control System Environment at the Australian Synchrotron
 
  • C.U. Felzmann, N. Hobbs, A. C. Starritt
    SLSA, Clayton, Australia
 
  Virtualisation technologies significantly improve the efficiency and availability of computing services while reducing the total cost of ownership. Real-time computing environments used in distributed control systems require special consideration when it comes to server and application virtualisation. The EPICS environment at the Australian Synchrotron comprises more than 500 interconnected physical devices; their virtualisation holds great potential for reducing risk and maintenance effort. An overview of the approach taken by the Australian Synchrotron is presented, covering the hardware and software technologies involved and the configuration of the virtualisation ecosystem, including the challenges, experiences and lessons learnt.
Slides: WEM303 [1.235 MB]
Poster: WEM303 [0.958 MB]
 
THHC3O04: A Web-Based User Interface for MX1 and MX2 Beamline Data Collection at the Australian Synchrotron
 
  • L.M. Jong, D. Aragao, T. Caradoc-Davies, M. Clift, N. Cowieson, C.U. Felzmann, N. Mudie
    SLSA, Clayton, Australia
 
  The MX1 and MX2 beamlines at the Australian Synchrotron are single-crystal diffraction beamlines serving the needs of the protein and chemical crystallography communities. A web-based user interface for driving data collections, called YAIBEX (Yet Another Integrated Beamline Environment for Crystallography), has been developed. The system is designed to replace the collect tab of the SSRL BluICE system, which is written in Tcl and was forked from the original code at deployment, making it difficult to take advantage of upstream bug fixes and improvements. Our system utilises Flask, a minimalist Python web application framework, chosen to leverage the beamline's existing Python-based infrastructure, the language's widespread use in the scientific community (including its available libraries), and better support from the local Controls and Scientific Computing groups. Improvements over the existing system include integration with custom beamline libraries, user portal integration for pre-filling information, an easy tabular layout for viewing the history of data collections in the current session, and remote access directly in the user's web browser.
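  As an illustration of the architectural choice described above, the following is a minimal sketch of what a Flask-based collection service might look like. The route names, parameters and in-memory history are entirely hypothetical and are not the actual YAIBEX API; they only show the pattern of a small Python web layer driving collections and exposing a session history.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory stand-in for the session's collection history.
# (Illustrative only; the real system integrates with custom
# beamline libraries and the user portal.)
collections = []

@app.route("/collections", methods=["GET"])
def list_collections():
    # Tabular history of data collections for the current session.
    return jsonify(collections)

@app.route("/collections", methods=["POST"])
def start_collection():
    # Accept collection parameters from the browser and record them;
    # a real implementation would also trigger the beamline control layer.
    params = request.get_json()
    entry = {"id": len(collections) + 1, "params": params}
    collections.append(entry)
    return jsonify(entry), 201
```

  A minimalist framework like this keeps the web layer thin: the browser talks JSON to a handful of routes, while the beamline-specific logic stays in the existing Python libraries.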
Slides: THHC3O04 [3.414 MB]