mmtbx.scaling package

Submodules

mmtbx.scaling.absence_likelihood module

mmtbx.scaling.absence_likelihood.halton_x(n=100)
mmtbx.scaling.absence_likelihood.log_p(z, sigz, centric_flag, n=40, lim=5.0)
mmtbx.scaling.absence_likelihood.test()

mmtbx.scaling.absences module

class mmtbx.scaling.absences.absences(mult=2.0, threshold=0.95)

Bases: object

check(abs_type, hkl, return_bool=False)
check_condition(hkl, condition)

magic

check_mask(hkl, mask)
class mmtbx.scaling.absences.absences_list(obs, was_filtered=None)

Bases: xtriage_analysis, systematic_absences_info

Container for lists of systematic absences. This subclass simply overrides the default output of the base class in cctbx.miller to be consistent with the rest of Xtriage.

show(*args, **kwds)

For each possible space group, show a list of possible systematically absent reflections and corresponding I/sigmaI.

class mmtbx.scaling.absences.analyze_absences(miller_array, isigi_cut=3, sigma_inflation=1.0)

Bases: xtriage_analysis

check_conditions(abs_lower_i_threshold=1e-06)
propose(ops, thres=1)
score_isigi(isig, absent=False, a=30.0)
class mmtbx.scaling.absences.conditions_for_operator(s)

Bases: object

absence_type()
condition()
mmtbx.scaling.absences.likelihood(z, sigz, absent_or_centric_or_acentric, sigma_inflation=1.0)
class mmtbx.scaling.absences.protein_space_group_choices(miller_array, threshold=3, protein=True, print_all=True, sigma_inflation=1.0, original_data=None)

Bases: xtriage_analysis

suggest_likely_candidates(acceptable_violations=1e+90)
class mmtbx.scaling.absences.sgi_iterator(chiral=True, crystal_system=None, intensity_symmetry=None)

Bases: object

comparator(sgi)
list()
mmtbx.scaling.absences.test()

mmtbx.scaling.absolute_scaling module

mmtbx.scaling.absolute_scaling.anisotropic_correction(cache_0, p_scale, u_star, b_add=None, must_be_greater_than=0.0)
class mmtbx.scaling.absolute_scaling.expected_intensity(scattering_info, d_star_sq_array, p_scale=0.0, b_wilson=0.0, magic_fudge_factor=2.0)

Bases: object

This class computes the expected intensity for a given d_star_sq_array given some basic info about ASU contents.
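
A rough sketch of how this class might be used together with scattering_information (documented further down in this module); the resolution grid and ASU contents are illustrative, and the explicit scat_data() call is an assumption (the class may also perform this step internally):

  from cctbx.array_family import flex
  from mmtbx.scaling import absolute_scaling

  # Illustrative d*^2 grid spanning roughly 50 A to 2 A resolution.
  d_star_sq = flex.double([(1.0 / d) ** 2 for d in (50.0, 10.0, 5.0, 3.0, 2.0)])

  scat_info = absolute_scaling.scattering_information(n_residues=250)
  scat_info.scat_data(d_star_sq=d_star_sq)   # assumed to be required up front

  expected = absolute_scaling.expected_intensity(
      scattering_info=scat_info,
      d_star_sq_array=d_star_sq)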

class mmtbx.scaling.absolute_scaling.gamma_nucleic(d_star_sq)

Bases: object

class mmtbx.scaling.absolute_scaling.gamma_protein(d_star_sq)

Bases: object

gamma
sigma_gamma
class mmtbx.scaling.absolute_scaling.kernel_normalisation(miller_array, kernel_width=None, n_bins=23, n_term=13, d_star_sq_low=None, d_star_sq_high=None, auto_kernel=False, number_of_sorted_reflections_for_auto_kernel=50)

Bases: object

class mmtbx.scaling.absolute_scaling.ml_aniso_absolute_scaling(miller_array, n_residues=None, n_bases=None, asu_contents=None, prot_frac=1.0, nuc_frac=0.0, ignore_errors=False)

Bases: xtriage_analysis

Maximum likelihood anisotropic Wilson scaling.

Parameters:
  • miller_array – experimental data (will be converted to amplitudes if necessary)

  • n_residues – number of protein residues in ASU

  • n_bases – number of nucleic acid bases in ASU

  • asu_contents – a dictionary specifying scattering types and numbers, e.g. {'Au': 1, 'C': 2.5, 'O': 1, 'H': 3}

  • prot_frac – fraction of scattering from protein

  • nuc_frac – fraction of scattering from nucleic acids

analyze_aniso_correction(n_check=2000, p_check=0.25, level=3, z_level=9)
aniso_ratio_p_value(rat)
compute_functional_and_gradients()
format_it(x, format='%3.2f')
pack(g)
summarize_issues()
unpack()
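
A minimal usage sketch, assuming a hypothetical reflection file data.mtz containing intensities and roughly 250 protein residues in the ASU:

  import sys
  from iotbx import reflection_file_reader
  from mmtbx.scaling import absolute_scaling

  # Hypothetical input file; take the first intensity array it contains.
  arrays = reflection_file_reader.any_reflection_file(
      file_name="data.mtz").as_miller_arrays()
  i_obs = [a for a in arrays if a.is_xray_intensity_array()][0]

  aniso_scale = absolute_scaling.ml_aniso_absolute_scaling(
      miller_array=i_obs,
      n_residues=250)   # assumed ASU content; adjust to your crystal
  aniso_scale.show(out=sys.stdout)   # show() from the xtriage_analysis base class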
class mmtbx.scaling.absolute_scaling.ml_iso_absolute_scaling(miller_array, n_residues=None, n_bases=None, asu_contents=None, prot_frac=1.0, nuc_frac=0.0, include_array_info=True)

Bases: xtriage_analysis

Maximum likelihood isotropic Wilson scaling.

Parameters:
  • miller_array – experimental data (will be converted to amplitudes if necessary)

  • n_residues – number of protein residues in ASU

  • n_bases – number of nucleic acid bases in ASU

  • asu_contents – a dictionary specifying scattering types and numbers, e.g. {'Au': 1, 'C': 2.5, 'O': 1, 'H': 3}

  • prot_frac – fraction of scattering from protein

  • nuc_frac – fraction of scattering from nucleic acids

compute_functional_and_gradients()
summarize_issues()
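
The isotropic variant takes the same inputs as the anisotropic one. A sketch, reusing the i_obs array from the previous example and assuming (worth verifying against the class source) that the fitted values are exposed as b_wilson and p_scale attributes:

  iso_scale = absolute_scaling.ml_iso_absolute_scaling(
      miller_array=i_obs,   # same intensity array as in the previous sketch
      n_residues=250)
  # Assumed attribute names for the fitted values; verify against the class source.
  print("isotropic Wilson B: %.2f" % iso_scale.b_wilson)
  print("scale (p_scale):    %.4g" % iso_scale.p_scale)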
class mmtbx.scaling.absolute_scaling.scattering_information(n_residues=None, n_bases=None, asu_contents=None, fraction_protein=None, fraction_nucleic=None)

Bases: object

scat_data(d_star_sq=None)

mmtbx.scaling.basic_analyses module

class mmtbx.scaling.basic_analyses.basic_analyses(miller_array, phil_object, out=None, out_plot=None, miller_calc=None, original_intensities=None, completeness_as_non_anomalous=None, verbose=0)

Bases: object

mmtbx.scaling.data_statistics module

Collect multiple analyses of experimental data quality, including signal-to-noise ratio, completeness, ice rings and other suspicious outliers, anomalous measurability, and Wilson plot.

class mmtbx.scaling.data_statistics.analyze_measurability(d_star_sq, smooth_approx, meas_data, miller_array=None, low_level_cut=0.03, high_level_cut=0.06)

Bases: xtriage_analysis

class mmtbx.scaling.data_statistics.analyze_resolution_limits(miller_set, d_min_max_delta=0.25)

Bases: xtriage_analysis

Check for elliptical truncation, which may be applied to the data by some processing software (or as a post-processing step). As a general rule this is not recommended since phenix.refine and related programs will handle anisotropy automatically, and users tend to apply it blindly (and even deposit the modified data).

is_elliptically_truncated(d_min_max_delta=None)
max_d_min_delta()

Return the maximum difference in d_min along any two axes.
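
A sketch of the elliptical truncation check, assuming i_obs is a miller array or set of the experimental data (loaded, for example, as in the ml_aniso_absolute_scaling sketch above):

  from mmtbx.scaling import data_statistics

  res_limits = data_statistics.analyze_resolution_limits(miller_set=i_obs)
  print("largest d_min difference between axes: %.3f" % res_limits.max_d_min_delta())
  if res_limits.is_elliptically_truncated():
      print("The data appear to be elliptically truncated.")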

class mmtbx.scaling.data_statistics.anomalous(miller_array, merging_stats=None, plan_sad_experiment_stats=None)

Bases: xtriage_analysis

summarize_issues()

Traffic light

class mmtbx.scaling.data_statistics.completeness_enforcement(miller_array, minimum_completeness=0.75, completeness_as_non_anomalous=None)

Bases: object

class mmtbx.scaling.data_statistics.data_strength_and_completeness(miller_array, isigi_cut=3.0, completeness_cut=0.85, completeness_as_non_anomalous=None)

Bases: xtriage_analysis

Collect basic info about overall completeness and signal-to-noise ratios, independent of scaling.

high_resolution_for_twin_tests()
i_over_sigma_outer_shell()
summarize_issues()
class mmtbx.scaling.data_statistics.i_sigi_completeness_stats(miller_array, n_bins=15, isigi_cut=3.0, completeness_cut=0.85, resolution_at_least=3.5, completeness_as_non_anomalous=None)

Bases: xtriage_analysis

Collects resolution-dependent statistics on I/sigma expressed as percentage of reflections above specified cutoffs.

class mmtbx.scaling.data_statistics.ice_ring_checker(bin_centers, completeness_data, z_scores_data, completeness_abnormality_level=4.0, intensity_level=0.1, z_score_limit=10)

Bases: xtriage_analysis

Check intensity and completeness statistics in specific resolution ranges known to have strong diffraction when crystalline ice is present.

summarize_issues()
class mmtbx.scaling.data_statistics.log_binned_completeness(miller_array, n_reflections_in_lowest_resolution_bin=100, max_number_of_bins=30, min_reflections_in_bin=50, completeness_as_non_anomalous=None)

Bases: xtriage_analysis

Table of completeness using log-scale resolution binning.

class mmtbx.scaling.data_statistics.possible_outliers(miller_array, prob_cut_ex=0.1, prob_cut_wil=1e-06)

Bases: xtriage_analysis

Flag specific reflections with suspicious intensities. Inspired by: Read, Acta Cryst. (1999). D55, 1759-1764

fraction_outliers()
n_outliers()
remove_outliers(miller_array)
summarize_issues()
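
A sketch of flagging and removing outliers, assuming i_obs is an intensity miller array (see earlier sketches) and that fraction_outliers() returns a fraction between 0 and 1:

  from mmtbx.scaling import data_statistics

  outliers = data_statistics.possible_outliers(miller_array=i_obs)
  print("%d possible outliers (%.2f%% of reflections)" % (
      outliers.n_outliers(), 100.0 * outliers.fraction_outliers()))
  i_obs_clean = outliers.remove_outliers(i_obs)   # copy with flagged reflections removed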
class mmtbx.scaling.data_statistics.wilson_scaling(miller_array, n_residues, remove_aniso_final_b='eigen_min', use_b_iso=None, n_copies_solc=1, n_bases=0, z_score_cut=4.5, completeness_as_non_anomalous=None)

Bases: xtriage_analysis

Calculates isotropic and anisotropic scale factors, Wilson plot, and various derived analyses such as ice rings and outliers.

show_worrisome_shells(out)
summarize_issues()
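
A minimal sketch, assuming i_obs is an intensity miller array (see the loading example under ml_aniso_absolute_scaling) and roughly 250 residues in the ASU:

  import sys
  from mmtbx.scaling import data_statistics

  wilson = data_statistics.wilson_scaling(
      miller_array=i_obs,   # observed intensities (assumed)
      n_residues=250)       # assumed ASU content
  wilson.show(out=sys.stdout)   # show() from the xtriage_analysis base class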

mmtbx.scaling.fa_estimation module

class mmtbx.scaling.fa_estimation.ano_scaling(miller_array_x1, options=None, out=None)

Bases: object

class mmtbx.scaling.fa_estimation.cns_fa_driver(lambdas)

Bases: object

average_all()
normalise_all()
class mmtbx.scaling.fa_estimation.combined_scaling(miller_array_x1, miller_array_x2, options=None, out=None)

Bases: object

perform_least_squares_scaling()
perform_local_scaling()
perform_outlier_rejection()
class mmtbx.scaling.fa_estimation.mum_dad(lambda1, lambda2, k1=1.0)

Bases: object

class mmtbx.scaling.fa_estimation.naive_fa_estimation(ano, iso, options, out=None)

Bases: object

class mmtbx.scaling.fa_estimation.singh_ramasheshan_fa_estimate(w1, w2, k1, k2)

Bases: object

compute_coefs()
compute_determinant()
compute_fa_values()
set_sigma_ratio()
class mmtbx.scaling.fa_estimation.twmad_fa_driver(lambda1, lambda2, k1, k2, options, out=None)

Bases: object

mmtbx.scaling.fest module

mmtbx.scaling.fest.print_banner(command_name)
mmtbx.scaling.fest.run(args, command_name='phenix.fest')

mmtbx.scaling.make_param module

class mmtbx.scaling.make_param.phil_lego

Bases: object

This class facilitates the construction of phil parameter files for the FA estimation program FATSO.

add_wavelength_info()
default_2wmad()
default_3wmad()
default_rip()
default_sad()
default_sir()
default_siras()
mmtbx.scaling.make_param.run(args)

mmtbx.scaling.massage_twin_detwin_data module

class mmtbx.scaling.massage_twin_detwin_data.massage_data(miller_array, parameters, out=None, n_residues=100, n_bases=0)

Bases: object

return_data()
write_data(file_name, output_type=<libtbx.AutoType object>, label_extension='massaged')

mmtbx.scaling.matthews module

class mmtbx.scaling.matthews.component(mw, rho_spec)

Bases: object

Macromolecule component

classmethod nucleic(nres)
classmethod protein(nres)
class mmtbx.scaling.matthews.density_calculator(crystal)

Bases: object

Calculate Matthews coefficient and solvent fraction

macromolecule_fraction(weight, rho_spec)
solvent_fraction(weight, rho_spec)
vm(weight)
mmtbx.scaling.matthews.exercise()
mmtbx.scaling.matthews.get_log_p_solc()
class mmtbx.scaling.matthews.matthews_rupp(crystal_symmetry, n_residues=None, n_bases=None, out=None)

Bases: xtriage_analysis

Probabilistic estimation of the number of copies in the ASU
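
A minimal sketch, using a hypothetical unit cell and space group:

  import sys
  from cctbx import crystal
  from mmtbx.scaling import matthews

  xs = crystal.symmetry(
      unit_cell=(60.0, 70.0, 80.0, 90.0, 90.0, 90.0),   # hypothetical cell
      space_group_symbol="P 21 21 21")
  result = matthews.matthews_rupp(crystal_symmetry=xs, n_residues=250)
  result.show(out=sys.stdout)   # show() from the xtriage_analysis base class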

mmtbx.scaling.matthews.number_table(components, density_calculator)
mmtbx.scaling.matthews.p_solc_calc(sc)

Calculate solvent fraction probability

class mmtbx.scaling.matthews.p_vm_calculator(crystal_symmetry, n_residues, n_bases=None, out=None, verbose=0)

Bases: object

Solvent content / Matthews probability calculator

guesstimate()
p_solc_calc(sc)
solc(vm)
vm(copies)
vm_prop_table()

mmtbx.scaling.outlier_plots module

mmtbx.scaling.outlier_plots.plotit(fobs, sigma, fcalc, alpha, beta, epsilon, centric, out, limit=5.0, steps=1000, plot_title='Outlier plot')
mmtbx.scaling.outlier_plots.run(args)

mmtbx.scaling.outlier_rejection module

class mmtbx.scaling.outlier_rejection.outlier_manager(miller_obs, r_free_flags, out=None)

Bases: object

apply_scale_to_original_data(scale_factor, d_min=None)
basic_wilson_outliers(p_basic_wilson=1e-06, return_data=False)
beamstop_shadow_outliers(level=0.01, d_min=10.0, return_data=False)
extreme_wilson_outliers(p_extreme_wilson=0.1, return_data=False)
make_log_beam_stop(log_message, flags)
make_log_model(log_message, flags, ll_gain, p_values, e_obs, e_calc, sigmaa, plot_out=None)
make_log_wilson(log_message, flags, p_values)

Produces a table of outliers and the reason each was flagged, using basic or extreme Wilson statistics.

model_based_outliers(f_model, level=0.01, return_data=False, plot_out=None)

mmtbx.scaling.pair_analyses module

class mmtbx.scaling.pair_analyses.delta_f_prime_f_double_prime_ratio(lambda1, lambda2, level=1.0)

Bases: object

class mmtbx.scaling.pair_analyses.delta_generator(nat, der, nsr_bias=1.0)

Bases: object

class mmtbx.scaling.pair_analyses.f_double_prime_ratio(lambda1, lambda2)

Bases: object

compute_functional()
compute_functional_and_gradients()
compute_gradient()
compute_gradient_fd()
show(out=None)
class mmtbx.scaling.pair_analyses.mum_dad(lambda1, lambda2, k1=1.0)

Bases: object

class mmtbx.scaling.pair_analyses.outlier_rejection(nat, der, cut_level_rms=3, cut_level_sigma=0, method={'rms': False, 'rms_and_sigma': True, 'solve': False}, out=None)

Bases: object

detect_outliers()
detect_outliers_rms()
detect_outliers_sigma()
detect_outliers_solve()

TT says: I toss everything > 3 sigma in the scaling, where sigma comes from the rms of everything being scaled:

sigma**2 = <delta**2> - <experimental-sigmas**2>

Then if a particular delta**2 > 3 sigma**2 + experimental-sigmas**2 then I toss it.

remove_outliers()
class mmtbx.scaling.pair_analyses.reindexing(set_a, set_b, out=None, relative_length_tolerance=0.05, absolute_angle_tolerance=3.0, lattice_symmetry_max_delta=3.0, file_name=None)

Bases: object

Reindexing matrices

analyse()
select_and_transform(matches_cut_off=0.75)
class mmtbx.scaling.pair_analyses.singh_ramasheshan_fa_estimate(w1, w2, k1, k2)

Bases: object

compute_coefs()
compute_determinant()
compute_fa_values()
set_sigma_ratio()

mmtbx.scaling.pre_scale module

class mmtbx.scaling.pre_scale.pre_scaler(miller_array, pre_scaling_protocol, basic_info, out=None)

Bases: object

mmtbx.scaling.random_omit module

The parameters should have this scope:

omit {
  perform_omit = True
  fraction = 0.15
  max_number = 1e5
  number_of_sets = 100
  root_name = 'omit_'
}

class mmtbx.scaling.random_omit.random_omit_data(miller_array, parameters)

Bases: object

write_datasets()

mmtbx.scaling.relative_scaling module

class mmtbx.scaling.relative_scaling.local_scaling_driver(miller_native, miller_derivative, local_scaling_dict, use_intensities=True, use_weights=False, max_depth=10, target_neighbours=1000, sphere=1, threshold=1.0, out=None)

Bases: object

local_lsq_scaling(out)
local_moment_scaling(out)
local_nikonov_scaling(out)
r_value(out)
class mmtbx.scaling.relative_scaling.ls_rel_scale_driver(miller_native, miller_derivative, use_intensities=True, scale_weight=True, use_weights=True)

Bases: object

show(out=None)
class mmtbx.scaling.relative_scaling.refinery(miller_native, miller_derivative, use_intensities=True, scale_weight=False, use_weights=False, mask=[1, 1], start_values=None)

Bases: object

functional(x)
gradients(x)
hessian(x, eps=1e-06)
hessian_transform(original_hessian, adp_constraints)
pack(grad_tensor)
unpack(x)

mmtbx.scaling.relative_wilson module

class mmtbx.scaling.relative_wilson.relative_wilson(miller_obs, miller_calc, min_d_star_sq=0.0, max_d_star_sq=2.0, n_points=2000, level=6.0)

Bases: xtriage_analysis

curve(d_star_sq)
get_z_scores(scale, b_value)
modify_weights(level=5)
show_summary(out)
std(d_star_sq)
summary()
target(vector)
class mmtbx.scaling.relative_wilson.summary(all_curves, level=6.0, all_bad_z_scores=False)

Bases: xtriage_analysis

data_as_flex_arrays()
n_outliers()

mmtbx.scaling.remove_outliers module

mmtbx.scaling.remove_outliers.print_help(command_name)
mmtbx.scaling.remove_outliers.run(args, command_name='phenix.remove_outliers')

mmtbx.scaling.rip_scale module

mmtbx.scaling.rip_scale.run(args)

mmtbx.scaling.sad_scale module

mmtbx.scaling.sad_scale.run(args)

mmtbx.scaling.sigmaa_estimation module

class mmtbx.scaling.sigmaa_estimation.sigmaa_estimator(miller_obs, miller_calc, r_free_flags, kernel_width_free_reflections=None, kernel_width_d_star_cubed=None, kernel_in_bin_centers=False, kernel_on_chebyshev_nodes=True, n_sampling_points=20, n_chebyshev_terms=10, use_sampling_sum_weights=False, make_checks_and_clean_up=True)

Bases: object

alpha_beta()
fom()
phase_errors()
show(out=None)
show_short(out=None, silent=False)
sigmaa()
sigmaa_model_error()
mmtbx.scaling.sigmaa_estimation.sigmaa_estimator_kernel_width_d_star_cubed(r_free_flags, kernel_width_free_reflections)
class mmtbx.scaling.sigmaa_estimation.sigmaa_point_estimator(target_functor, h)

Bases: object

compute_functional_and_gradients()

mmtbx.scaling.sir_scale module

mmtbx.scaling.sir_scale.run(args)

mmtbx.scaling.siras_scale module

mmtbx.scaling.siras_scale.run(args)

mmtbx.scaling.ta_alpha_beta_calc module

mmtbx.scaling.ta_alpha_beta_calc.sigmaa_estimator_kernel_width_d_star_cubed(r_free_flags, kernel_width_free_reflections)
class mmtbx.scaling.ta_alpha_beta_calc.sigmaa_point_estimator(target_functor, h)

Bases: object

compute_functional_and_gradients()
class mmtbx.scaling.ta_alpha_beta_calc.ta_alpha_beta_calc(miller_obs, miller_calc, r_free_flags, ta_d, kernel_width_free_reflections=None, kernel_width_d_star_cubed=None, kernel_in_bin_centers=False, kernel_on_chebyshev_nodes=True, n_sampling_points=20, n_chebyshev_terms=10, use_sampling_sum_weights=False, make_checks_and_clean_up=True)

Bases: object

alpha_beta()
eobs_and_ecalc_miller_array_normalizers()
fom()
phase_errors()
show(out=None)
show_short(out=None)
sigmaa()
sigmaa_model_error()

mmtbx.scaling.thorough_outlier_test module

mmtbx.scaling.thorough_outlier_test.exercise(d_min=3.5, k_sol=0.3, b_sol=60.0, b_cart=[0, 0, 0, 0, 0, 0], anomalous_flag=False, scattering_table='it1992', space_group_info=None)
mmtbx.scaling.thorough_outlier_test.run()
mmtbx.scaling.thorough_outlier_test.run_call_back(flags, space_group_info)

mmtbx.scaling.twin_analyses module

mmtbx.scaling.twin_analyses.analyze_intensity_statistics(self, d_min=2.5, completeness_as_non_anomalous=None, log=None)

Detect translational pseudosymmetry and twinning. Returns a twin_law_interpretation object.

class mmtbx.scaling.twin_analyses.britton_test(twin_law, miller_array, cc_cut_off=0.995, verbose=0)

Bases: xtriage_analysis

get_alpha(x, y)
property table
class mmtbx.scaling.twin_analyses.correlation_analyses(miller_obs, miller_calc, twin_law, d_weight=0.1)

Bases: xtriage_analysis

find_maximum()
class mmtbx.scaling.twin_analyses.detect_pseudo_translations(miller_array, low_limit=10.0, high_limit=5.0, max_sites=100, height_cut=0.0, distance_cut=15.0, p_value_cut=0.05, completeness_cut=0.75, cut_radius=3.5, min_cubicle_edge=5.0, completeness_as_non_anomalous=None, out=None, verbose=0)

Bases: xtriage_analysis

Analyze the Patterson map to identify off-origin peaks that are a significant fraction of the origin peak height.

closest_rational(fraction, eps=0.02, return_text=True)
guesstimate_mod_hkl()
p_value(peak_height)
suggest_new_space_groups(t_den=144, out=None)
mmtbx.scaling.twin_analyses.get_twin_laws(miller_array)

Convenience method for getting a list of twin law operators (as strings)
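
For example (with i_obs an intensity miller array, as assumed in earlier sketches):

  from mmtbx.scaling import twin_analyses

  # Returns a (possibly empty) list of twin law operators as strings,
  # e.g. something like 'h,-h-k,-l'.
  ops = twin_analyses.get_twin_laws(miller_array=i_obs)
  print("possible twin laws:", ops)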

class mmtbx.scaling.twin_analyses.h_test(twin_law, miller_array, fraction=0.5)

Bases: xtriage_analysis

property table
class mmtbx.scaling.twin_analyses.l_test(miller_array, parity_h=2, parity_k=2, parity_l=2)

Bases: xtriage_analysis

Implementation of:

J. Padilla & T. O. Yeates. A statistic for local intensity differences: robustness to anisotropy and pseudo-centering and utility for detecting twinning. Acta Crystallogr. D59, 1124-30, 2003.

This is complementary to the NZ test, but is insensitive to translational pseudo-symmetry.

property table
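
A sketch of running the L test on its own; note that Xtriage itself normally feeds this test Wilson-normalised acentric intensities, so passing a raw intensity array (i_obs, assumed as in earlier sketches) is purely illustrative:

  import sys
  from mmtbx.scaling import twin_analyses

  l_stats = twin_analyses.l_test(miller_array=i_obs)
  l_stats.show(out=sys.stdout)   # show() from the xtriage_analysis base class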
mmtbx.scaling.twin_analyses.merge_data_and_guess_space_groups(miller_array, txt, xs=None, out=None, sigma_inflation=1.0, check_absences=True)
class mmtbx.scaling.twin_analyses.ml_murray_rust(miller_array, twin_law, n_points=4)

Bases: xtriage_analysis

Maximum-likelihood twin fraction estimation (Zwart, Read, Grosse-Kunstleve & Adams, to be published).

property table
class mmtbx.scaling.twin_analyses.ml_murray_rust_with_ncs(miller_array, twin_law, out, n_bins=10, calc_data=None, start_alpha=None)

Bases: object

calc_correlation(obs, calc, out=<_io.TextIOWrapper name='<stdout>' mode='w' encoding='utf-8'>)
compute_functional_and_gradients()
string_it(x)
class mmtbx.scaling.twin_analyses.n_z_test(normalised_acentric, normalised_centric)

Bases: xtriage_analysis

property table
class mmtbx.scaling.twin_analyses.obliquity(reduced_cell, rot_mx, deg=True)

Bases: slots_getstate_setstate

delta
h
t
tau
type
u
class mmtbx.scaling.twin_analyses.r_values(miller_obs, twin_law, miller_calc=None, n_reflections=400)

Bases: xtriage_analysis

r_vs_r(input_obs, input_calc)
r_vs_r_classification()
resolution_dependent_r_values()
class mmtbx.scaling.twin_analyses.symmetry_issues(miller_array, max_delta=3.0, r_cut=0.05, sigma_inflation=1.25, out=None)

Bases: xtriage_analysis

get_r_value_total(start_pg, end_pg)
make_pg_r_table()
make_r_table()
return_point_groups()
class mmtbx.scaling.twin_analyses.twin_analyses(miller_array, d_star_sq_low_limit=None, d_star_sq_high_limit=None, d_hkl_for_l_test=None, normalise=True, out=None, out_plots=None, verbose=1, miller_calc=None, additional_parameters=None, original_data=None, completeness_as_non_anomalous=None)

Bases: xtriage_analysis

Perform various twin-related tests
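
A minimal sketch of running the full battery of twin tests, assuming i_obs is an intensity miller array (see the loading example under ml_aniso_absolute_scaling):

  import sys
  from mmtbx.scaling import twin_analyses as ta

  # Progress and results are written to the stream passed as out.
  twin_tests = ta.twin_analyses(miller_array=i_obs, out=sys.stdout)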

mmtbx.scaling.twin_analyses.twin_analyses_brief(miller_array, cut_off=2.5, completeness_as_non_anomalous=None, out=None, verbose=0)

Performs a very brief twin analysis and tries to answer the question of whether or not the data are twinned. Possible return values and their meaning:

  • False: the data are not twinned.

  • True: the data do not behave as expected; one possible explanation is twinning.

  • None: the data do not behave as expected, and this might or might not be due to twinning. None is also returned when the analysis fails.
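
A sketch of how the tri-state return value might be handled, assuming i_obs is an intensity miller array (loaded, for example, as in the ml_aniso_absolute_scaling sketch above):

  from mmtbx.scaling import twin_analyses

  verdict = twin_analyses.twin_analyses_brief(miller_array=i_obs)
  if verdict is None:
      print("Statistics are unusual; twinning can neither be confirmed nor excluded.")
  elif verdict:
      print("Data do not behave as expected; twinning is one possible explanation.")
  else:
      print("No indication of twinning.")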

class mmtbx.scaling.twin_analyses.twin_law(op, pseudo_merohedral_flag, axis_type, delta_santoro, delta_le_page, delta_lebedev)

Bases: slots_getstate_setstate

Basic container for information about a possible twin law, with scores for fit to crystal lattice.

axis_type
delta_le_page
delta_lebedev
delta_santoro
operator
twin_type
class mmtbx.scaling.twin_analyses.twin_law_dependent_twin_tests(twin_law, miller_array, out, verbose=0, miller_calc=None, normalized_intensities=None, ncs_test=None, n_ncs_bins=None)

Bases: xtriage_analysis

Twin law dependent test results

property britton_frac
property h_frac
property ml_frac
class mmtbx.scaling.twin_analyses.twin_law_quality(xs, twin_law)

Bases: object

Various scores for a potential twin law given the crystal lattice.

delta_le_page()
delta_lebedev()
delta_santoro()
strain_tensor()

Returns a tensor describing the deformation of the unit cell needed to obtain a perfect match. The sum of the diagonal elements describes the change in volume; the off-diagonal components measure the associated shear.

class mmtbx.scaling.twin_analyses.twin_laws(miller_array, lattice_symmetry_max_delta=3.0, out=None)

Bases: xtriage_analysis

Container for all possible twin laws given a crystal lattice and space group.

class mmtbx.scaling.twin_analyses.twin_results_interpretation(nz_test, wilson_ratios, l_test, translational_pseudo_symmetry=None, twin_law_related_test=None, symmetry_issues=None, maha_l_cut=3.5, patterson_p_cut=0.01, out=None)

Bases: xtriage_analysis

compute_maha_l()
has_abnormal_intensity_statistics()
has_higher_symmetry()
has_pseudo_translational_symmetry()
has_twinning()
make_sym_op_table()
max_twin_fraction()
patterson_verdict()
show_verdict(out)
summarize_issues()
mmtbx.scaling.twin_analyses.weighted_cc(x, y, w)

Utility function for correlation_analyses class.

class mmtbx.scaling.twin_analyses.wilson_moments(acentric_z, centric_z)

Bases: xtriage_analysis

acentric_e_sq_minus_one_library = [0.736, 0.541]
acentric_f_ratio_library = [0.785, 0.885]
acentric_i_ratio_library = [2.0, 1.5]
centric_e_sq_minus_one_library = [0.968, 0.736]
centric_f_ratio_library = [0.637, 0.785]
centric_i_ratio_library = [3.0, 2.0]
compute_ratios(ac, c)
class mmtbx.scaling.twin_analyses.wilson_normalised_intensities(miller_array, normalise=True, out=None, verbose=0)

Bases: xtriage_analysis

Splits (optionally Wilson-normalised) intensities into centric and acentric subsets.

mmtbx.scaling.twmad_scale module

mmtbx.scaling.twmad_scale.run(args)

mmtbx.scaling.xtriage module

Main program driver for Xtriage.

mmtbx.scaling.xtriage.change_symmetry(miller_array, space_group_symbol, file_name=None, log=<_io.TextIOWrapper name='<stdout>' mode='w' encoding='utf-8'>)

Encapsulates all operations required to convert the original data to a different symmetry as suggested by Xtriage.

mmtbx.scaling.xtriage.check_for_pathological_input_data(miller_array, completeness_as_non_anomalous=None)
class mmtbx.scaling.xtriage.data_summary(miller_array, was_merged=False)

Bases: xtriage_analysis

Basic info about the input data (somewhat redundant at the moment).

summarize_issues()
mmtbx.scaling.xtriage.finish_job(result)
class mmtbx.scaling.xtriage.launcher(args, file_name, output_dir=None, log_file=None, job_title=None)

Bases: target_with_save_result

run()
mmtbx.scaling.xtriage.make_big_header(text, out)
class mmtbx.scaling.xtriage.merging_statistics(i_obs, crystal_symmetry=None, d_min=None, d_max=None, anomalous=False, n_bins=10, reflections_per_bin=None, binning_method='volume', debug=False, file_name=None, model_arrays=None, sigma_filtering=<libtbx.AutoType object>, use_internal_variance=True, eliminate_sys_absent=True, d_min_tolerance=1e-06, extend_d_max_min=False, cc_one_half_significance_level=None, cc_one_half_method='half_dataset', assert_is_not_unique_set_under_symmetry=True, log=None)

Bases: xtriage_analysis, dataset_statistics

Subclass of iotbx merging statistics class to override the show() method and use the Xtriage output style.

property cc_one_half_outer
summarize_issues()
mmtbx.scaling.xtriage.print_banner(appl, out=None)
mmtbx.scaling.xtriage.print_help(appl)
mmtbx.scaling.xtriage.run(args, command_name='phenix.xtriage', return_result=False, out=None, data_file_name=None)
class mmtbx.scaling.xtriage.summary(issues, sort=True)

Bases: xtriage_analysis

property n_problems
mmtbx.scaling.xtriage.validate_params(params, callback=None)
class mmtbx.scaling.xtriage.xtriage_analyses(miller_obs, miller_calc=None, miller_ref=None, params=None, text_out=None, unmerged_obs=None, log_file_name=None)

Bases: xtriage_analysis

Run all Xtriage analyses for experimental data, with optional Fcalc or reference datasets.

Parameters:
  • miller_obs – array of observed data, should be intensity or amplitude

  • miller_calc – array of calculated data

  • miller_ref – array with ‘reference’ data, for instance a data set with an alternative indexing scheme

  • text_out – A filehandle or other object with a write method

  • params – An extracted PHIL parameter block, derived from master_params
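
A minimal sketch of running the full analysis programmatically, assuming a hypothetical file data.mtz and default parameters (params=None); is_twinned() and iso_b_wilson are documented members of this class:

  import sys
  from iotbx import reflection_file_reader
  from mmtbx.scaling import xtriage

  arrays = reflection_file_reader.any_reflection_file(
      file_name="data.mtz").as_miller_arrays()   # hypothetical input file
  i_obs = [a for a in arrays if a.is_xray_intensity_array()][0]

  result = xtriage.xtriage_analyses(miller_obs=i_obs, text_out=sys.stdout)
  print("likely twinned:", result.is_twinned())
  print("isotropic Wilson B:", result.iso_b_wilson)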

property aniso_b_min

Convenience method for retrieving the minimum anisotropic B_cart tensor. Used in AutoSol.

property aniso_b_ratio

Ratio of the maximum difference between anisotropic B_cart tensors to the mean of the tensors. Used in PDB validation server.

property aniso_range_of_b

Convenience method for retrieving the range of anisotropic B_cart tensors. Used in AutoSol.

estimate_d_min(**kwds)

Suggest resolution cutoffs based on selected statistics (if merging was included in analyses). See iotbx.merging_statistics.dataset_statistics for underlying function documentation.

property i_over_sigma_outer_shell
is_twinned()

Convenience method for indicating whether the data are likely twinned.

property iso_b_wilson

Convenience method for isotropic Wilson B-factor

property l_test_mean_l

<|L|> from the L test for abnormal intensity distributions. Used in PDB validation server.

property l_test_mean_l_squared

<L^2> from the L test for abnormal intensity distributions. Used in PDB validation server.

property low_d_cut

Shortcut to resolution_limit_of_anomalous_signal().

matthews_n_copies()

Convenience method for retrieving the number of copies.

property max_estimated_twin_fraction

Estimated twin fraction from the most worrisome twin law. Used by PDB validation server.

new_format = True
property number_of_wilson_outliers

Number of centric and acentric outliers flagged by Wilson plot analysis. Used in PDB validation server.

property overall_i_sig_i
property patterson_verdict

Plain-English explanation of Patterson analysis for TNCS detection. Used by PDB validation server.

resolution_cut()

Convenience method for retrieving a conservative resolution cutoff.

resolution_limit_of_anomalous_signal()

Convenience method for retrieving the recommended resolution cutoff for anomalous substructures search. Used in AutoSol.

summarize_issues()
class mmtbx.scaling.xtriage.xtriage_summary

Bases: object

Old result class, minus initialization. Provides backwards compatibility with pickle files from Phenix 1.9 and earlier.

get_completeness()
get_data_file()
get_merging_statistics()
get_relative_wilson()
is_centric()
new_format = False
original_intensities_flag()

Module contents

Base module for Xtriage and related scaling functionality; this imports the Boost.Python extensions into the local namespace, and provides core functions for displaying the results of Xtriage.

class mmtbx.scaling.data_analysis

Bases: slots_getstate_setstate

show(out=<_io.TextIOWrapper name='<stdout>' mode='w' encoding='utf-8'>, prefix='')
class mmtbx.scaling.loggraph_output(out)

Bases: xtriage_output

Output class for displaying ‘loggraph’ format (from ccp4i) as plain text.

gui_output = True
newline()

Print a newline and nothing else.

show_big_header(text)

Print a big header with the specified title.

show_header(text)

Start a new section with the specified title.

show_lines(text)

Show partially formatted text, preserving paragraph breaks.

show_paragraph_header(text)

Show a header/title for a paragraph or small block of text.

show_plot(table)

Display a plot, if supported by the given output class.

show_plots_row(tables)

Display a series of plots in a single row. Only used for the Phenix GUI.

show_preformatted_text(text)

Show text with spaces and line breaks preserved; in some contexts this will be done using a monospaced font.

show_sub_header(title)

Start a sub-section with the specified title.

show_table(*args, **kwds)

Display a formatted table.

show_text(text)

Show unformatted text.

show_text_columns(*args, **kwds)

Display a set of left-justified text columns. The number of columns is arbitrary but this will usually be key:value pairs.

warn(text)

Display a warning message.

write(text)

Support for generic filehandle methods.

class mmtbx.scaling.printed_output(out)

Bases: xtriage_output

Output class for displaying raw text with minimal formatting.

newline()

Print a newline and nothing else.

out
show_big_header(text)

Print a big header with the specified title.

show_header(text)

Start a new section with the specified title.

show_lines(text)

Show partially formatted text, preserving paragraph breaks.

show_paragraph_header(text)

Show a header/title for a paragraph or small block of text.

show_plot(table)

Display a plot, if supported by the given output class.

show_plots_row(tables)

Display a series of plots in a single row. Only used for the Phenix GUI.

show_preformatted_text(text)

Show text with spaces and line breaks preserved; in some contexts this will be done using a monospaced font.

show_sub_header(title)

Start a sub-section with the specified title.

show_table(table, indent=2, plot_button=None, equal_widths=True)

Display a formatted table.

show_text(text)

Show unformatted text.

show_text_columns(rows, indent=0)

Display a set of left-justified text columns. The number of columns is arbitrary but this will usually be key:value pairs.

warn(text)

Display a warning message.

write(text)

Support for generic filehandle methods.

class mmtbx.scaling.xtriage_analysis

Bases: object

Base class for analyses performed by Xtriage. This does not impose any restrictions on content or functionality, but simply provides a show() method suitable for either filehandle-like objects or objects derived from the xtriage_output class. Child classes should implement _show_impl.

show(out=None)
summarize_issues()
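
A toy sketch of how a child class can hook into this framework; the single-argument _show_impl(out) signature is an assumption to verify against mmtbx/scaling/__init__.py:

  import sys
  from mmtbx.scaling import xtriage_analysis

  class demo_analysis(xtriage_analysis):
      """Minimal analysis illustrating the show()/_show_impl() pattern."""
      def __init__(self, value):
          self.value = value
      def _show_impl(self, out):   # assumed hook signature
          out.show_sub_header("Demo analysis")
          out.show_text("value = %s" % self.value)

  # Per the base-class docstring, show() accepts either a filehandle-like
  # object or an object derived from xtriage_output.
  demo_analysis(42).show(out=sys.stdout)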
class mmtbx.scaling.xtriage_output

Bases: slots_getstate_setstate

Base class for generic output wrappers.

flush()

Support for generic filehandle methods.

gui_output = False
newline()

Print a newline and nothing else.

show(text)
show_big_header(title)

Print a big header with the specified title.

show_header(title)

Start a new section with the specified title.

show_lines(text)

Show partially formatted text, preserving paragraph breaks.

show_paragraph_header(text)

Show a header/title for a paragraph or small block of text.

show_plot(table)

Display a plot, if supported by the given output class.

show_plots_row(tables)

Display a series of plots in a single row. Only used for the Phenix GUI.

show_preformatted_text(text)

Show text with spaces and line breaks preserved; in some contexts this will be done using a monospaced font.

show_sub_header(title)

Start a sub-section with the specified title.

show_table(table, indent=0, plot_button=None, equal_widths=True)

Display a formatted table.

show_text(text)

Show unformatted text.

show_text_columns(rows, indent=0)

Display a set of left-justified text columns. The number of columns is arbitrary but this will usually be key:value pairs.

warn(text)

Display a warning message.

write(text)

Support for generic filehandle methods.