Data Processing at SACLA
Obtaining metadata like detector position
1) AgBeh (silver behenate). Determine detector distance and beam center
--> update SACLA-provided *.geom file (CrystFEL format)
--> run sacla geom to json on *.geom to get the equivalent geometry for DIALS processing
2) h5_mpi_submit --> launches dials.stills_process with process.phil and queueing options; specify which runs to process (integers). A sketch of both steps follows below.
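A minimal shell sketch of these two steps. Script names and arguments that the page does not spell out (the geom-to-JSON conversion command and the h5_mpi_submit options) are assumptions and depend on the local SACLA installation:

 # 1) after AgBeh calibration, edit the SACLA-provided CrystFEL geometry by hand
 #    so it carries the refined detector distance and beam center
 vi run0123.geom
 # convert the CrystFEL *.geom to the JSON geometry that DIALS reads
 # (script name and usage below are illustrative; use the geom-to-json tool provided on site)
 sacla_geom_to_json run0123.geom
 # 2) submit processing for a list of runs (integers); h5_mpi_submit wraps
 #    dials.stills_process with process.phil plus queueing options
 #    (the argument order shown here is an assumption)
 h5_mpi_submit process.phil 123 124 125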
DIALS workflow
dials.import file.h5 (each h5 file holds thousands of images) --> datablock.json, which holds the experimental models abstracted from the image headers (the full command sequence is collected below)
dials.find_spots datablock.json --> strong.pickle
dials.index strong.pickle datablock.json --> indexed.pickle experiments.json
dials.refine --> refined_experiments.json refined_reflections.pickle
dials.integrate
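Put together, the DIALS steps above can be chained as a straight command sequence. run0123.h5 is a placeholder file name; the input/output file names follow the defaults listed above:

 dials.import run0123.h5                       # --> datablock.json
 dials.find_spots datablock.json               # --> strong.pickle
 dials.index datablock.json strong.pickle      # --> experiments.json indexed.pickle
 dials.refine experiments.json indexed.pickle  # --> refined_experiments.json refined_reflections.pickle
 dials.integrate refined_experiments.json refined_reflections.pickle   # --> integrated reflections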
Aggregate processing at XFELs
Instead of running the individual steps above, run dials.stills_process *.h5 process.phil. A single job must be submitted for each *.h5 file, either manually or with a script (see the sketch below).
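One way to script the per-file submission. This is a sketch only; the scheduler command (qsub here), job naming, and environment setup are site-specific assumptions:

 # submit one dials.stills_process job per HDF5 file
 for f in run*.h5; do
   echo "dials.stills_process $f process.phil" | qsub -N job_${f%.h5}
 done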
The phil file must have good parameters for data processing; take one from previous users as a starting point (an illustrative fragment follows).
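For illustration, a minimal process.phil fragment. The parameter paths are standard DIALS indexing options; the space group and unit cell shown are placeholder values and should come from a previous user's working file for your sample:

 indexing {
   known_symmetry {
     space_group = P43212
     unit_cell = 79 79 38 90 90 90
   }
 }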