bsflynn/hw7

HPC check

Login

ssh netid@login.storrs.hpc.uconn.edu


Copy a file

On your computer

touch testcopy

scp testcopy netid@login.storrs.hpc.uconn.edu:/home/NETID

or

scp testcopy netid@login.storrs.hpc.uconn.edu:~


Editing at the Terminal

ssh netid@login.storrs.hpc.uconn.edu
nano testcopy

Dataset

  • Gorgolewski et al. (2013)
  • Test-retest dataset of various functional localizers
  • Stored in /scratch/psyc5171/dataset1
  • New BIDS level: session
    • sub-01/ses-test (a layout sketch follows this list)
  • Already in BIDS form
  • Metadata is at the study level
  • Your output and related files will go in /scratch/psyc5171/NETID
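
A rough sketch of the layout (the exact file and second-session names are assumptions based on BIDS naming, not verified against the dataset):

/scratch/psyc5171/dataset1/
    task-fingerfootlips_bold.json      (study-level functional metadata)
    task-fingerfootlips_events.tsv     (study-level event timing)
    sub-01/
        ses-test/
            anat/sub-01_ses-test_T1w.nii.gz
            func/sub-01_ses-test_task-fingerfootlips_bold.nii.gz
        ses-retest/
            ...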

Goal

  • Today: run afni_proc.py for a single subject on the cluster
  • Next week: run the entire dataset

Steps

  • Fork and clone hw7
  • Figure out slice timing (non-multiband)
  • Create event timing files
  • Write afni_proc.py command
  • Copy to cluster
  • As cluster jobs:
    • Run afni_proc.py to generate the proc script
    • Run the proc script

Slice timing

  • Look at dataset1/task-fingerfootlips_bold.json
  • How are the slices being acquired?
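
One way to peek at the acquisition times from the login node (SliceTiming is the standard BIDS field; the availability of python3 on the cluster is an assumption):

python3 -c "import json; print(json.load(open('/scratch/psyc5171/dataset1/task-fingerfootlips_bold.json'))['SliceTiming'])"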

Slice timing

  • Slice times start at 0 and increase
  • If we checked, the first slice acquired would be the most inferior one (= ascending)
  • But the times do not increase monotonically across slices (= interleaved)
  • So the slice pattern is AFNI's alt+z
  • No timing file needed: use -tpattern alt+z
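
Inside the afni_proc.py command this can be passed through to 3dTshift roughly like so (a sketch; it assumes the tshift block is in your -blocks list):

-tshift_opts_ds -tpattern alt+z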

Events

  • Block design motor localizer (fingerfootlips)
  • 15-second blocks
  • Timing for all subjects/sessions in dataset1/task-fingerfootlips_events.tsv
  • Create a timing file (think back to HW1 or last class)
  • Which convolution function should be used with 3dDeconvolve?

'BLOCK(15)'
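
In ap.sh this basis is handed to 3dDeconvolve via the regression options, roughly like this (the timing-file and label names here are placeholders, not the dataset's actual names):

-regress_stim_times finger_times.1D foot_times.1D lips_times.1D \
-regress_stim_labels finger foot lips \
-regress_basis 'BLOCK(15)'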


Creating a timing file

  • This can be done with a shell command, similar to HW1
  • Use awk 'BEGIN { ORS=" " }; {print $1}' as part of the command
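
A minimal sketch of one such command (the trial_type value Finger and the column layout of the events file are assumptions; check the .tsv header first):

grep Finger /scratch/psyc5171/dataset1/task-fingerfootlips_events.tsv | awk 'BEGIN { ORS=" " }; {print $1}' > finger_times.1D

Repeat with Foot and Lips to get one onset-time file per condition.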

Creating a timing file

  • Let's modify log2times.py from the last class

Contrasts

  • finger-foot
  • finger-lips
  • foot-lips
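
A sketch of how these could be requested in ap.sh (the stim labels must match whatever you pass to -regress_stim_labels; the names used here are assumptions):

-regress_opts_3dD \
    -gltsym 'SYM: finger -foot' -glt_label 1 finger-foot \
    -gltsym 'SYM: finger -lips' -glt_label 2 finger-lips \
    -gltsym 'SYM: foot -lips'   -glt_label 3 foot-lips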

Modify ap.sh

  • afni_proc.py generates the processing (proc) script
  • The afni_proc.py command in ap.sh was generated by uber_subject.py
  • Edit ap.sh to work with the current data
  • Paths for input data should start with /scratch/psyc5171/dataset1
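
For example, the input options in ap.sh might end up looking roughly like this (the subject/session file names are assumptions about the BIDS naming inside dataset1):

-copy_anat /scratch/psyc5171/dataset1/sub-01/ses-test/anat/sub-01_ses-test_T1w.nii.gz \
-dsets /scratch/psyc5171/dataset1/sub-01/ses-test/func/sub-01_ses-test_task-fingerfootlips_bold.nii.gz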

Edit the sbatch script

  • Customize sbatch_ap.sh and sbatch_proc.sh (a sketch follows this list)
  • Push your hw7 repository
  • Login to HPC
  • cd /scratch/psyc5171/NETID
  • Clone your hw7 repository on the cluster
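
A minimal sketch of what sbatch_ap.sh might contain (the partition, time limit, and module name are assumptions; match them to the Storrs HPC documentation and to your ap.sh):

#!/bin/bash
#SBATCH --partition=general
#SBATCH --ntasks=1
#SBATCH --time=01:00:00
#SBATCH --job-name=ap_sub-01

module load afni        # assumption: AFNI is provided as a module on the cluster
tcsh ap.sh              # ap.sh from uber_subject.py is typically a tcsh script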

Submit your job

To run afni_proc.py and generate the proc script:

sbatch sbatch_ap.sh

Submit your job

To run the proc script:

sbatch sbatch_proc.sh
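
Either way, you can check on a submitted job with standard SLURM commands, e.g.:

squeue -u $USER

By default, sbatch writes the job's output to slurm-<jobid>.out in the submission directory, unless --output is set in the script.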
