Data Processing at SACLA

Obtaining metadata like detector position

1) AgBeh (silver behenate): determine the detector distance and beam center --> update the SACLA-provided *.geom file (CrystFEL format) --> run the SACLA geom-to-JSON converter (mpccd_geom2json.py, documented below) on the *.geom file to get the equivalent geometry for DIALS processing.
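For example (a sketch using the converter documented below; the geom file name and distance value are placeholders):

 # placeholder file name and distance; substitute the AgBeh-refined values
 libtbx.python modules/cctbx_project/xfel/sacla/mpccd_geom2json.py agbeh_refined.geom distance=50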

2) h5_mpi_submit --> launches dials.stills_process with a process.phil and queueing options (see the submission script under "Submission of jobs" below).

Specify which runs (integers) to process, e.g. /work/jkern/2017B8085/xrd/r234567-0/*.h5
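A chunk can also be processed directly without the queue, roughly like this (the run directory and nproc value are taken from the examples elsewhere on this page; process.phil is whatever phil file you are using):

 # direct, un-queued processing of one run chunk
 dials.stills_process /work/jkern/2017B8085/xrd/r234567-0/*.h5 process.phil mp.nproc=28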

2.5) Data visualization.

3) Metrology refinement: dials.combine_experiments reference_from_experiment.detector=0

Takes 1000 images and puts them into one file. Output: combined_experiments.json + combined_reflections.pickle

1 Experiment = crystal + detector + beam
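As a sketch (the idx-* globs below are an assumption for the per-image output names written by dials.stills_process; adjust to the actual results directory):

 # combine per-image experiments/reflections into one pair of files,
 # forcing all of them to share experiment 0's detector model
 dials.combine_experiments idx-*_refined_experiments.json idx-*_indexed.pickle \
     reference_from_experiment.detector=0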

You must cherry-pick data if there are only scarce data out to the corners (not covered here; but note that the largest pickle files are the most highly diffracting).

dials.refine combined* hierarchy_level=[0|1]  # use 0 first (refine the detector as a single block), then 1 (refine each panel)

To keep the detector flat: refinement.parameterisation.detector.fix_list=Tau2,Tau3

Level 0: refine dist, shift1, shift2. Fix: tau

Level 1: refine shift1, shift2, tau1. Fix: dist, tau2, tau3
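Put together, the two refinement passes look roughly like this (the level-0 output names fed into the level-1 run are assumptions based on the default dials.refine output names):

 # level 0: refine the detector as a single block, with Tau2/Tau3 fixed to keep it flat
 dials.refine combined_experiments.json combined_reflections.pickle hierarchy_level=0 \
     refinement.parameterisation.detector.fix_list=Tau2,Tau3
 # level 1: refine each panel, starting from the level-0 output
 dials.refine refined_experiments.json refined_reflections.pickle hierarchy_level=1 \
     refinement.parameterisation.detector.fix_list=Tau2,Tau3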

Evaluation: how do you know if the refinement made a difference?

dev.cctbx.xfel.detector_residuals <refined .json> <refined .pickle>  # also specify hierarchy_level=1 residuals.plot_max=0.3

<program> -c -e 10 -a 2  # gets all config parameters for a program at expert level 10, giving all help strings
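For example, to browse every refinement parameter with its help string:

 dials.refine -c -e 10 -a 2 | less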

4) Redo the integration with the reference geometry:

reference_geometry=refined_experiments.json
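With dials.stills_process this would look something like the line below (input.reference_geometry is assumed to be the relevant parameter name; confirm with the -c/-e/-a trick above):

 # re-run processing using the refined metrology instead of the header geometry
 dials.stills_process /work/jkern/2017B8085/xrd/r234567-0/*.h5 process.phil \
     input.reference_geometry=refined_experiments.json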

5) Merge

Take the merging script from LQ79 verbatim and use cxi.merge.

 #!/bin/bash
 #PBS -q [smp|serial]

smp: lots of memory, up to 44 procs

serial: up to 14 procs, 1 node

bl2-occupancy: reserved for you
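A hedged sketch of what the PBS wrapper might look like (the resource request line and the merge-script name are assumptions; the environment script is the one used in the submission script below):

 #!/bin/bash
 #PBS -q smp
 #PBS -l nodes=1:ppn=44          # assumption: request a full smp node
 cd $PBS_O_WORKDIR               # run from the submission directory
 source /home/jkern/xfel_env/setup_env.sh
 ./merge_LQ79.sh                 # placeholder name for the merging script (which calls cxi.merge) taken from LQ79

Submit with qsub <script>.sh.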

DIALS workflow

dials.import file.h5 (the h5 will have 1000s of images in it) --> datablock.json, which has the experimental models as abstracted from the image headers

dials.find_spots datablock.json --> strong.pickle

dials.index strong.pickle datablock.json --> indexed.pickle experiments.json

dials.refine --> refined_experiments.json refined_reflections.pickle

dials.integrate
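Chained together for one file, the workflow looks roughly like this (run123.h5 is a placeholder; the arguments to dials.refine and dials.integrate are assumptions based on the default output names above):

 dials.import run123.h5                        # -> datablock.json
 dials.find_spots datablock.json               # -> strong.pickle
 dials.index strong.pickle datablock.json      # -> experiments.json, indexed.pickle
 dials.refine experiments.json indexed.pickle  # -> refined_experiments.json, refined_reflections.pickle
 dials.integrate refined_experiments.json refined_reflections.pickle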

Aggregate processing at XFELs

You need to submit a single job for each *.h5 file (manually, or write a script). Instead of running the individual steps above, use: dials.stills_process *.h5 process.phil

The phil file must have good parameters for data processing. Take one from previous users.
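For orientation only, a minimal process.phil might look like the sketch below (the parameter names are standard DIALS options; the space group and unit cell values are placeholders for the actual crystal):

 # process.phil (sketch)
 spotfinder {
   filter.min_spot_size = 2
 }
 indexing {
   known_symmetry {
     space_group = P212121              # placeholder
     unit_cell = 117 222 309 90 90 90   # placeholder
   }
 }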

Converting SACLA pipeline geometry file to DIALS

libtbx.python modules/cctbx_project/xfel/sacla/mpccd_geom2json.py <SACLA_GEOM> distance=<DETECTOR DISTANCE>

Modifications to detector distance

Distance can be specified by modifying line 55 of /home/jkern/xfel_env/conda_install/modules/cctbx_project/dxtbx/format/FormatHDF5SaclaMPCCD.py

`self.distance = 100.0`
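If editing the file by hand is inconvenient, an in-place substitution along these lines does the same thing (50.0 is a placeholder for the distance you actually want, in mm):

 # patch the hard-coded detector distance in the SACLA MPCCD format class
 sed -i 's/self.distance = 100.0/self.distance = 50.0/' \
     /home/jkern/xfel_env/conda_install/modules/cctbx_project/dxtbx/format/FormatHDF5SaclaMPCCD.py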

Submission of jobs

Use the file '/work/jkern/2017B8085/xrd/run_scripts/test_pierre.sh'

 echo -n "enter phil file to use: "
 read phil_inp
 echo $phil_inp
 for run in $(seq $1 $2); do
     for chunk in $(seq $3 $4); do
         cxi.mpi_submit input.data_dir=/work/jkern/2017B8085/xrd/data/${run}-${chunk} input.run_num=${run} \
             input.dispatcher=dials.stills_process output.output_dir=/work/jkern/2017B8085/xrd/results \
             input.target="/work/jkern/2017B8085/xrd/processing_phils/${phil_inp}" input.run_chunk=$chunk \
             mp.method=pbs mp.queue=bl2-occupancy mp.nnodes=1 mp.nproc_per_node=28 \
             mp.env_script=/home/jkern/xfel_env/setup_env.sh input.data_template="run%d-%d.h5"
     done
 done
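The script takes the first and last run number and the first and last chunk number as positional arguments, so an invocation might look like this (the numbers are placeholders):

 ./test_pierre.sh 234567 234570 0 3    # runs 234567 to 234570, chunks 0 to 3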