bw2data.weighting_normalization#

Module Contents#

Classes#

Normalization

LCIA normalization data - used to transform meaningful units, like mass or damage, into "person-equivalents" or a similar reference unit.

Weighting

LCIA weighting data - used to combine or compare different impact categories.

class bw2data.weighting_normalization.Normalization(name)[source]#

Bases: bw2data.ia_data_store.ImpactAssessmentDataStore

Inheritance diagram of bw2data.weighting_normalization.Normalization

LCIA normalization data - used to transform meaningful units, like mass or damage, into "person-equivalents" or a similar reference unit.

The data schema for IA normalization is:

Schema([
    [valid_tuple, maybe_uncertainty]
])
where:
  • valid_tuple is a dataset identifier, like ("biosphere", "CO2")

  • maybe_uncertainty is either a number or an uncertainty dictionary
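A hypothetical dataset conforming to this schema might look as follows; the flow keys and factor values are illustrative, not taken from any real method:

```python
# Illustrative normalization data: a list of [valid_tuple, maybe_uncertainty]
# pairs. Keys and numbers are made up for demonstration.
normalization_data = [
    # A plain number as the normalization factor
    [("biosphere", "CO2"), 1.44e-4],
    # An uncertainty dictionary instead of a bare number
    [("biosphere", "CH4"), {"amount": 3.6e-3, "uncertainty type": 0}],
]

# Check each entry matches the documented shape
for key, value in normalization_data:
    assert isinstance(key, tuple) and len(key) == 2
    assert isinstance(value, (int, float, dict))
```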

_metadata[source]#
matrix = 'normalization_matrix'[source]#
validator[source]#
process_row(row)[source]#

Given (flow key, amount), return a dictionary for array insertion.
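A minimal sketch of what such a transformation could look like, assuming a key-to-row-index mapping; this is not the actual bw2data implementation, and the `mapping` dictionary and returned field names are illustrative only:

```python
# Hypothetical stand-in for bw2data's key-to-matrix-index mapping
mapping = {("biosphere", "CO2"): 0, ("biosphere", "CH4"): 1}

def process_row(row):
    """Sketch: turn (flow key, amount) into a dict for array insertion.

    The amount may be a plain number or an uncertainty dictionary
    carrying an "amount" field.
    """
    flow_key, amount = row
    if isinstance(amount, dict):
        amount = amount["amount"]
    return {
        "row": mapping[flow_key],  # matrix row index for this flow
        "amount": amount,          # normalization factor
    }
```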

class bw2data.weighting_normalization.Weighting(name)[source]#

Bases: bw2data.ia_data_store.ImpactAssessmentDataStore

Inheritance diagram of bw2data.weighting_normalization.Weighting

LCIA weighting data - used to combine or compare different impact categories.

The data schema for weighting is a one-element list:

Schema(All(
    [uncertainty_dict],
    Length(min=1, max=1)
))
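A dataset satisfying this schema is a list containing exactly one uncertainty dictionary; the field values below are illustrative:

```python
# Illustrative weighting data: one uncertainty dictionary in a
# one-element list, here a fixed (certain) value.
weighting_data = [{"amount": 42.0, "uncertainty type": 0}]

# The schema's Length(min=1, max=1) constraint means exactly one element
assert len(weighting_data) == 1
assert isinstance(weighting_data[0], dict)
```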
_metadata[source]#
matrix = 'weighting_matrix'[source]#
validator[source]#
process_row(row)[source]#

Return an empty tuple (as dtype_fields is empty), and the weighting uncertainty dictionary.

write(data)[source]#

Because of assumptions made in DataStore, data must be a one-element list.
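A sketch of that constraint, assuming a hypothetical convenience wrapper (this helper is not part of bw2data; it only illustrates the one-element-list requirement):

```python
def write_weighting(data):
    """Hypothetical helper enforcing the one-element-list assumption.

    A bare uncertainty dictionary is wrapped; anything other than a
    one-element list is rejected.
    """
    if isinstance(data, dict):
        data = [data]  # wrap a bare dict for convenience
    if not (isinstance(data, list) and len(data) == 1):
        raise ValueError("Weighting data must be a one-element list")
    return data
```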