algorithms.knowledge.nsga2_ktmm

Classes

NSGA2KTMM

NSGA2KTMM(**kwargs)

Bases: DNSGA2

Knowledge Transfer with Mixture Model.

References

Zou, J., Hou, Z., Jiang, S., Yang, S., Ruan, G., Xia, Y., and Liu, Y. (2025). Knowledge transfer with mixture model in dynamic multi-objective optimization. IEEE Transactions on Evolutionary Computation, in press. https://doi.org/10.1109/TEVC.2025.3566481

Source code in pydmoo/algorithms/knowledge/nsga2_ktmm.py
def __init__(self, **kwargs):

    super().__init__(**kwargs)

    self.size_pool = 14  # size of the knowledge pool
    self.denominator = 0.5

Functions

_response_mechanism
_response_mechanism() -> Population

Response mechanism.

Source code in pydmoo/algorithms/knowledge/nsga2_ktmm.py
def _response_mechanism(self) -> Population:
    """Response mechanism."""
    pop = self.pop
    X = pop.get("X")

    # recreate the current population without evaluating it
    pop = Population.new(X=X)

    # sample self.pop_size solutions in decision space
    samples_old = self.sampling_new_pop()

    # keep the first half: self.pop_size/2 individuals with better convergence and diversity
    samples = samples_old[:len(samples_old) // 2]

    # knowledge in decision space
    means_stds_ps, mean, std = self._in_decision_or_objective_space_1d(samples, "decision_space")
    mean_new, std_new = self._select_means_stds(means_stds_ps, mean, std)

    # sample self.pop_size new solutions from the selected univariate Gaussian model
    X = univariate_gaussian_sample(mean_new, std_new, self.pop_size, random_state=self.random_state)

    # bounds
    if self.problem.has_bounds():
        xl, xu = self.problem.bounds()
        X = np.clip(X, xl, xu)  # not provided in the original reference literature

    # merge
    pop = Population.merge(samples_old, Population.new(X=X))

    return pop
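The sampling and bound-handling steps above can be sketched in a self-contained form. This is a minimal approximation, not pydmoo's actual implementation: `univariate_gaussian_sample_sketch` and `clip_to_bounds` are hypothetical stand-ins for `univariate_gaussian_sample` and the `np.clip` call, drawing each decision variable independently from a univariate Gaussian and then box-clipping it.

```python
import random

def univariate_gaussian_sample_sketch(mean, std, n_samples, seed=0):
    """Draw n_samples points; variable j ~ N(mean[j], std[j]), independently per dimension."""
    rng = random.Random(seed)
    return [[rng.gauss(m, s) for m, s in zip(mean, std)] for _ in range(n_samples)]

def clip_to_bounds(X, xl, xu):
    """Box-clip each sample to [xl, xu], mirroring the np.clip step in the source."""
    return [[min(max(x, lo), hi) for x, lo, hi in zip(row, xl, xu)] for row in X]

# hypothetical 3-variable problem with bounds [0, 1]^3
mean, std = [0.5, 0.2, 0.8], [0.3, 0.3, 0.3]
X = univariate_gaussian_sample_sketch(mean, std, n_samples=10)
X = clip_to_bounds(X, xl=[0.0] * 3, xu=[1.0] * 3)
```

Clipping is one simple repair strategy for out-of-bounds samples; as the source comment notes, the reference paper does not specify how bound violations are handled.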
