SPSRC - Spanish Prototype of an SRC at the IAA-CSIC

Summary

Resources per team: one virtual machine per team, initially with 16 cores and 64 GB of memory, later increased up to 32 cores and 128 GB of memory.

Resource Access: SSH and remote desktop access through a dedicated IP address and port numbers.

Resource management: separate VMs isolate each team's environment; the VM flavor is fixed for the duration of the project.

Software management: pseudo-sudo access so teams can install their own software; Podman comes pre-installed.

Documentation: https://spsrc-user-docs.readthedocs.io

Support: ska-itsupport@iaa.csic.es (email and Slack)

Resource location: Instituto de Astrofísica de Andalucía (IAA-CSIC), Granada, Spain

Technical specifications: OpenStack cloud with 200 CPU cores, 2.5 TB of memory and more than 600 TB of SSD storage managed by Ceph.

Overview

At the Instituto de Astrofísica de Andalucía (IAA-CSIC) in Granada (Spain), we are leading the Spanish effort to host an SRC. We are currently developing an SRC prototype that aims both to support users working with SKA precursors and pathfinders and to constitute a transversal, wavelength-agnostic facility enabling knowledge exchange among a diverse community of users. We have deployed the first stage of the hardware, based on a cloud environment, in order to provide interoperable and flexible services. We are particularly engaged in the challenge of handling SKA data to extract scientific knowledge following Open Science values. For this reason, we are identifying and integrating into our platform tools and services that enhance knowledge sharing, collaboration and transparency, as key factors in achieving scientific reproducibility (e.g. a JupyterHub server, container engines and Virtual Observatory services).

For more details, see the SPSRC documentation at https://spsrc-user-docs.readthedocs.io.

Technical specifications

The OpenStack cloud provides 200 CPU cores and 2.5 TB of memory across five compute hypervisors, plus more than 600 TB of usable SSD storage capacity managed by Ceph. The servers are interconnected by a 100 Gbps network, and the cluster is connected to RedIRIS (the Spanish National Research Network) with a 10 Gbps link.

Per user resource

Each team will be provided with a VM, and the resources assigned to this VM will be within the ranges described below.

NB - Initially, teams will be provided with small virtual machines (16 cores and 64 GB of memory) during the first days/weeks so that they can deploy and test their software. Subsequently, we will increase the resources of their virtual machines up to 32 cores and 128 GB of memory.

Software installed

We provide the following base images for virtual machines:

Teams will have pseudo-sudo access, so users will be able to install their own software.
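As an illustration (assuming a Debian/Ubuntu-based image; the actual base images may differ), installing additional packages could look like the following:

    # Refresh the package index and install some common development tools (example packages only)
    sudo apt update
    sudo apt install -y git build-essential python3-venv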

Volume of resource

Since a VM is assigned to each team, there is no limit on the number of users per team. User accounts on the VM will be managed by the team itself.

After the SDC3 Challenge deadline, the VMs assigned to teams will remain active for two additional months so that teams can run final checks, collect their data and tools, etc. If more time is needed, please contact the support team at ska-itsupport@iaa.csic.es.
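Before a VM is decommissioned, data can be copied to a local machine over the assigned SSH port with standard tools; in this sketch the IP address, port, username and paths are placeholders:

    # Copy a results directory from the team VM to the local machine over the assigned SSH port
    rsync -avz -e "ssh -p <SSH_PORT>" <USER>@<VM_IP>:/home/<USER>/results/ ./sdc3_results/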

GPUs if any

No GPU resources are available.

User access

Logging in

Users will be provided with an IP address and two port numbers: one for SSH access and another for access with a remote desktop application.
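For example, once the IP address and ports have been communicated, an SSH session could be opened as follows (IP address, port and username are placeholders):

    # Log in to the team VM on the assigned SSH port
    ssh -p <SSH_PORT> <USER>@<VM_IP>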

How to run a workflow

Sudo access to the VM ensures that users can install and configure their own environment and deploy their preferred workflow management system.
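As a minimal sketch, one option is to install a workflow manager such as Snakemake in a Python virtual environment; Snakemake is only an illustrative choice, and any workflow management system can be deployed:

    # Create an isolated Python environment and install a workflow manager (illustrative choice)
    python3 -m venv ~/workflow-env
    source ~/workflow-env/bin/activate
    pip install snakemake
    # Execute a workflow described in a Snakefile, using up to 16 cores
    snakemake --cores 16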

Software management

Pseudo-sudo access to the VM allows end users to install their own software.

Containerisation

The VM will come with Podman pre-installed, which is a more secure replacement for Docker. Users will also be able to install Singularity if they prefer.
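For example, pulling and running a container image with Podman follows the same command-line pattern as Docker (the image used here is only illustrative):

    # Pull a public image and run it rootless with Podman
    podman pull docker.io/library/python:3.11
    podman run --rm -it docker.io/library/python:3.11 python3 --version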

Documentation

General documentation is available at https://spsrc-user-docs.readthedocs.io. Documentation specific to the teams participating in SDC3 will be available soon.

Resource management

Separate VMs per team will isolate the team environments. The VM flavor will be fixed for the duration of the project, which caps each team's resource usage.

Support

Users can request support via email at ska-itsupport@iaa.csic.es.

Participants can join our Slack workspace by emailing ska-itsupport@iaa.csic.es and requesting to be added.


Credits and acknowledgements 

We kindly request that you acknowledge the SPSRC, as indicated here, in any work or publication derived from its use. Thank you in advance.