bw2data.weighting_normalization
===============================

.. py:module:: bw2data.weighting_normalization


Classes
-------

.. autoapisummary::

   bw2data.weighting_normalization.Normalization
   bw2data.weighting_normalization.Weighting


Module Contents
---------------

.. py:class:: Normalization(name)

   Bases: :py:obj:`bw2data.ia_data_store.ImpactAssessmentDataStore`

   LCIA normalization data - used to transform meaningful units, like mass
   or damage, into "person-equivalents" or similar reference units.

   The data schema for IA normalization is:

   .. code-block:: python

      Schema([
          [valid_tuple, maybe_uncertainty]
      ])

   where:

   * ``valid_tuple`` is a dataset identifier, like ``("biosphere", "CO2")``
   * ``maybe_uncertainty`` is either a number or an uncertainty dictionary


   .. py:method:: process_row(row)

      Given ``(flow key, amount)``, return a dictionary for array insertion.


   .. py:attribute:: _metadata


   .. py:attribute:: matrix
      :value: 'normalization_matrix'


   .. py:attribute:: validator


.. py:class:: Weighting(name)

   Bases: :py:obj:`bw2data.ia_data_store.ImpactAssessmentDataStore`

   LCIA weighting data - used to combine or compare different impact categories.

   The data schema for weighting is a one-element list:

   .. code-block:: python

      Schema(All(
          [uncertainty_dict],
          Length(min=1, max=1)
      ))


   .. py:method:: process_row(row)

      Return an empty tuple (as ``dtype_fields`` is empty) and the weighting
      uncertainty dictionary.


   .. py:method:: write(data)

      Because of ``DataStore`` assumptions, ``data`` must be a one-element list.


   .. py:attribute:: _metadata


   .. py:attribute:: matrix
      :value: 'weighting_matrix'


   .. py:attribute:: validator
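The two schemas above can be illustrated with sample data. The sketch below is not part of ``bw2data``: the checker functions and the specific flow keys and uncertainty-dictionary fields are hypothetical, shown only to make the expected shapes concrete.

```python
def looks_like_normalization(data):
    """Illustrative check mirroring the Normalization schema:
    a list of [valid_tuple, maybe_uncertainty] pairs."""
    return all(
        isinstance(row, (list, tuple))
        and len(row) == 2
        and isinstance(row[0], tuple)          # dataset identifier tuple
        and isinstance(row[1], (int, float, dict))  # number or uncertainty dict
        for row in data
    )


def looks_like_weighting(data):
    """Illustrative check mirroring the Weighting schema:
    exactly one uncertainty dictionary."""
    return len(data) == 1 and isinstance(data[0], dict)


# Hypothetical normalization data: one plain amount, one uncertainty dict.
normalization_data = [
    [("biosphere", "CO2"), 1.2e-4],
    [("biosphere", "CH4"), {"amount": 3.1e-3, "uncertainty type": 0}],
]

# Hypothetical weighting data: a one-element list, as Weighting.write() requires.
weighting_data = [{"amount": 42.0, "uncertainty type": 0}]

print(looks_like_normalization(normalization_data))  # True
print(looks_like_weighting(weighting_data))          # True
```

Note that these helpers only approximate the shapes; the real validation is performed by each class's ``validator`` attribute.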