
Commit

Merge pull request #404 from uvarc/staging
Staging
rsdmse authored Jul 18, 2023
2 parents f9d9dbf + 64c152a commit fe2b5da
Showing 12 changed files with 29 additions and 10 deletions.
8 changes: 7 additions & 1 deletion content/post/2023-july-maintenance.md
@@ -22,7 +22,13 @@ All systems are expected to return to service by **6 a.m. on Wednesday, July 19*

RC engineers will be installing a new `/scratch` storage filesystem that can be accessed at `/scratch/$USER` after the end of maintenance.

-**The current `/scratch` filesystem will be permanently retired on October 17, 2023 and all the data it contains will be deleted.** We have prepared a sample script for users who wish to transfer files to the new scratch system. Users should clean up their current `/scratch` directory in preparation, to minimize the load. The downloadable script will be posted here after maintenance.
+**Modified queue limits will be implemented to provide maximum read/write performance of the new /scratch filesystem.** Users are encouraged to consult our [updated documentation](https://www.rc.virginia.edu/userinfo/rivanna/queues/) and adjust their job scripts accordingly.
+
+**The current `/scratch` filesystem will be permanently retired on October 17, 2023 and all the data it contains will be deleted.** We have prepared a sample script for users who wish to transfer files to the new scratch system. Users should clean up their current `/scratch` directory in preparation, to minimize the load. A sample script is posted below.
+
+**Example script to copy files**
+
+{{< pull-code file="/static/scripts/demo-copy-scratch.slurm" lang="bash" >}}

The script will also be available through the Open OnDemand Job Composer:
7 changes: 7 additions & 0 deletions layouts/shortcodes/code-download.html
@@ -0,0 +1,7 @@
+{{ $file := .Get "file" }}
+{{ $lang := .Get "language" }}
+{{ $code := readFile $file }}
+{{ (print "```" $lang "\n" $code "\n```") | markdownify }}
+<a href="{{ $file }}" download>
+<i class="fa fa-download" aria-hidden="true"></i>
+</a>
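For reference, a post could invoke this new shortcode roughly as follows (a hypothetical call, not part of the commit; note this shortcode's parameter is `language`, whereas the maintenance post above uses the separate `pull-code` shortcode with `lang`):

```
{{< code-download file="/static/scripts/demo-copy-scratch.slurm" language="bash" >}}
```

This renders the file as a fenced code block and appends a download link for the same path.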
2 changes: 1 addition & 1 deletion static/scripts/abinit.slurm
@@ -1,7 +1,7 @@
#!/bin/bash
#SBATCH --job-name=abinit
#SBATCH -N 5
-#SBATCH --mem-per-cpu=6000
+#SBATCH --mem-per-cpu=9000
#SBATCH --ntasks-per-node=20
#SBATCH -t 10:00:00
#SBATCH -p parallel
6 changes: 6 additions & 0 deletions static/scripts/demo-copy-scratch.slurm
@@ -0,0 +1,6 @@
+#!/bin/bash
+#SBATCH -A your_allocation # to find your allocation, type "allocations"
+#SBATCH -t 12:00:00 # up to 7-00:00:00 (7 days)
+#SBATCH -p standard
+
+rsync -av /oldscratch/$USER/ /scratch/$USER
2 changes: 1 addition & 1 deletion static/scripts/gaussian_serial.slurm
@@ -2,7 +2,7 @@
#SBATCH --tasks=1
#SBATCH -t 160:00:00
#SBATCH -p standard
-#SBATCH --mem=6000
+#SBATCH --mem=9000
#SBATCH -A mygroup

module load gaussian/g16
2 changes: 1 addition & 1 deletion static/scripts/mpi_job.slurm
@@ -1,6 +1,6 @@
#!/bin/bash
#SBATCH --nodes=2
-#SBATCH --ntasks-per-node=16
+#SBATCH --ntasks-per-node=36
#SBATCH --time=12:00:00
#SBATCH --output=output_filename
#SBATCH --partition=parallel
2 changes: 1 addition & 1 deletion static/scripts/orca_multinode.slurm
@@ -2,7 +2,7 @@
#SBATCH -A mygroup # your allocation account
#SBATCH -p parallel # partition
#SBATCH -N 3 # number of nodes
-#SBATCH --ntasks-per-node=40 # number of tasks
+#SBATCH --ntasks-per-node=36 # number of tasks
#SBATCH -t 24:00:00 # time

module purge
2 changes: 1 addition & 1 deletion static/scripts/smrtlink_blasr.slurm
@@ -3,7 +3,7 @@
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=8
-#SBATCH --mem-per-cpu=6000
+#SBATCH --mem-per-cpu=9000
#SBATCH --time=06:00:00

module purge
2 changes: 1 addition & 1 deletion static/scripts/smrtlink_ngmlr.slurm
@@ -3,7 +3,7 @@
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=8
-#SBATCH --mem-per-cpu=6000
+#SBATCH --mem-per-cpu=9000
#SBATCH --time=06:00:00

module purge
2 changes: 1 addition & 1 deletion static/scripts/smrtlink_sawriter.slurm
@@ -3,7 +3,7 @@
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=1 # multi-threading not supported
-#SBATCH --mem-per-cpu=6000
+#SBATCH --mem-per-cpu=9000
#SBATCH --time=06:00:00

module purge
2 changes: 1 addition & 1 deletion static/scripts/spark_multinode.slurm
@@ -3,7 +3,7 @@
#SBATCH --exclusive # do not modify
#SBATCH -A myaccount # your allocation
#SBATCH -N 3 # number of nodes
-#SBATCH -c 40 # number of cores per node
+#SBATCH -c 36 # number of cores per node
#SBATCH -t 3:00:00 # time

module purge
2 changes: 1 addition & 1 deletion static/scripts/vasp.slurm
@@ -1,7 +1,7 @@
#!/bin/bash
#SBATCH --account my_acct
#SBATCH --nodes=8
-#SBATCH --ntasks-per-node=16
+#SBATCH --ntasks-per-node=36
#SBATCH --time=3-00:00:00
#SBATCH --output=thermo.out
#SBATCH --partition=parallel
