Without additional subjective preference information, there may exist a (possibly infinite) number of Pareto optimal solutions, all of which are considered equally good.
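As an illustration of what the Pareto optimal set of a finite candidate set looks like, the following Python sketch filters a list of objective vectors down to its non-dominated subset; the candidate values are made-up numbers, and all objectives are assumed to be minimized.

```python
def pareto_front(points):
    """Return the non-dominated (Pareto optimal) points, assuming every
    objective is to be minimized."""
    front = []
    for p in points:
        dominated = any(
            all(q[i] <= p[i] for i in range(len(p))) and
            any(q[i] < p[i] for i in range(len(p)))
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical candidates evaluated on two objectives, e.g. (cost, emissions).
candidates = [(1.0, 9.0), (2.0, 7.0), (3.0, 8.0), (4.0, 1.0)]
print(pareto_front(candidates))  # (3.0, 8.0) is dominated by (2.0, 7.0)
```

The three remaining points are mutually incomparable: none can be improved in one objective without worsening the other, which is the sense in which they are considered equally good.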
Researchers study multi-objective optimization problems from different viewpoints and, thus, there exist different solution philosophies and goals when setting and solving them.
Another example involves the production possibilities frontier, which specifies what combinations of various types of goods can be produced by a society with certain amounts of various resources.
To do this, the central bank uses a model of the economy that quantitatively describes the various causal linkages in the economy; it simulates the model repeatedly under various possible stances of monetary policy, in order to obtain a menu of possible predicted outcomes for the various variables of interest.
For example, energy systems typically involve a trade-off between performance and cost;[5][6] one might want to adjust a rocket's fuel usage and orientation so that it arrives both at a specified place and at a specified time; or one might want to conduct open market operations so that both the inflation rate and the unemployment rate are as close as possible to their desired values.
If the design of a paper mill is defined by large storage volumes and paper quality is defined by quality parameters, then the problem of optimal design of a paper mill can include objectives such as i) minimization of expected variation of those quality parameters from their nominal values, ii) minimization of the expected time of breaks and iii) minimization of the investment cost of storage volumes.[21]
In 2013, Ganesan et al. carried out the multi-objective optimization of the combined carbon dioxide reforming and partial oxidation of methane.
They used the Normal Boundary Intersection (NBI) method in conjunction with two swarm-based techniques, the Gravitational Search Algorithm (GSA) and Particle Swarm Optimization (PSO), to tackle the problem.[22]
Applications involving chemical extraction[23] and bioethanol production processes[24] have posed similar multi-objective problems.
In 2013, Abakarov et al. proposed an alternative technique to solve multi-objective optimization problems arising in food engineering.[27]
The purpose of radio resource management is to satisfy the data rates that are requested by the users of a cellular network.
Each user has its own objective function that, for example, can represent some combination of the data rate, latency, and energy efficiency.
The network operator would like to provide both broad coverage and high data rates; thus, the operator would like to find a Pareto optimal solution that balances the total network data throughput and the user fairness in an appropriate subjective manner.
Radio resource management is often solved by scalarization; that is, selection of a network utility function that tries to balance throughput and user fairness.
The choice of utility function has a large impact on the computational complexity of the resulting single-objective optimization problem.[28]
For example, the commonly used weighted sum-rate utility gives an NP-hard problem whose complexity scales exponentially with the number of users, while the weighted max-min fairness utility results in a quasi-convex optimization problem whose complexity scales only polynomially with the number of users.
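As a toy illustration of how the chosen utility changes the selected operating point (it is not meant to demonstrate the complexity results above), the following Python sketch allocates a shared power budget between two users by brute-force grid search; the rate model, channel gains, and weights are all made-up assumptions.

```python
import math

# Two users share a unit power budget p1 + p2 = 1; user i achieves
# rate log2(1 + g_i * p_i).  The channel gains g are illustrative only.
g = (4.0, 1.0)

def rates(p1):
    p2 = 1.0 - p1
    return (math.log2(1.0 + g[0] * p1), math.log2(1.0 + g[1] * p2))

def weighted_sum_rate(r, w=(0.5, 0.5)):   # throughput-oriented utility
    return w[0] * r[0] + w[1] * r[1]

def max_min_fairness(r):                  # fairness-oriented utility
    return min(r)

grid = [i / 1000.0 for i in range(1001)]
p_sum = max(grid, key=lambda p1: weighted_sum_rate(rates(p1)))
p_fair = max(grid, key=lambda p1: max_min_fairness(rates(p1)))
print("weighted sum rate favours the strong user:", rates(p_sum))
print("max-min fairness equalises the rates:", rates(p_fair))
```

The weighted sum rate concentrates power on the user with the better channel, while max-min fairness equalizes the two rates; which operating point is preferable is exactly the subjective throughput/fairness trade-off the operator must resolve.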
Some authors have proposed Pareto-optimality-based approaches (including active power losses and reliability indices as objectives).
For this purpose, different artificial-intelligence-based methods have been used: microgenetic algorithms,[31] branch exchange,[32] particle swarm optimization,[33] and the non-dominated sorting genetic algorithm.
Typically, planning such missions has been viewed as a single-objective optimization problem, where one aims to minimize the energy or time spent in inspecting an entire target structure.
Once a utility function $u$ representing the decision maker's preferences is obtained, it suffices to solve the single-objective problem $\max_{x \in X} u\left(f_1(x), \ldots, f_k(x)\right)$; but in practice, it is very difficult to construct a utility function that would accurately represent the decision maker's preferences,[2] particularly since the Pareto front is unknown before the optimization begins.
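As a minimal sketch of scalarization under a simple surrogate utility, the following Python code minimizes a weighted sum of two made-up conflicting objectives for a sweep of weights; each weight yields one Pareto optimal point, so the sweep traces an approximation of the Pareto front. The toy objectives, the search grid, and the weights are assumptions for illustration only.

```python
# Two conflicting objectives of a single design variable x:
# f1 pulls x toward 0, f2 pulls x toward 2 (a made-up convex toy problem).
def f1(x):
    return x ** 2

def f2(x):
    return (x - 2.0) ** 2

# Coarse search grid on [-0.5, 2.5]; a real solver would be used in practice.
xs = [i / 200.0 - 0.5 for i in range(601)]

front = []
for k in range(11):
    w = k / 10.0
    # Scalarized single-objective problem: minimize w*f1 + (1-w)*f2.
    x_best = min(xs, key=lambda x: w * f1(x) + (1.0 - w) * f2(x))
    front.append((round(f1(x_best), 3), round(f2(x_best), 3)))

print(front)  # e.g. (4.0, 0.0) at w=0 and (0.0, 4.0) at w=1
```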
Evolutionary algorithms such as the Non-dominated Sorting Genetic Algorithm-II (NSGA-II),[51] its extended version NSGA-III,[52][53] the Strength Pareto Evolutionary Algorithm 2 (SPEA-2)[54] and multi-objective differential evolution variants have become standard approaches, although schemes based on particle swarm optimization and simulated annealing[55] are also significant.
The main advantage of evolutionary algorithms, when applied to solve multi-objective optimization problems, is the fact that they typically generate sets of solutions, allowing computation of an approximation of the entire Pareto front.
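For instance, a minimal sketch assuming the third-party pymoo library (0.6-style imports; its API has changed between versions) shows how a single NSGA-II run returns a whole non-dominated set on the built-in ZDT1 bi-objective benchmark rather than a single solution; none of these names come from this article.

```python
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize
from pymoo.problems import get_problem

problem = get_problem("zdt1")      # built-in bi-objective benchmark
algorithm = NSGA2(pop_size=100)    # population-based: evolves a set of solutions

result = minimize(problem, algorithm, ("n_gen", 200), seed=1, verbose=False)

# result.F holds the objective values of the final non-dominated set,
# i.e. an approximation of the entire Pareto front from one run.
print(result.F.shape)
print(result.F[:5])
```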
Recently, hybrid multi-objective optimization has become an important theme in several international conferences in the area of EMO and MCDM (see e.g.,[79][80]).
First, a number of points of the Pareto front can be provided in the form of a list (interesting discussion and references are given in[81]) or using heatmaps.
The decision maker takes this information into account while specifying the preferred Pareto optimal objective point.
The idea to approximate and visualize the Pareto front was introduced for linear bi-objective decision problems by S. Gass and T. Saaty.[84]
A review of methods for approximating the Pareto front for various decision problems with a small number of objectives (mainly two) is provided in the literature.
One of them, which is applicable when a relatively small number of objective points represents the Pareto front, is based on using the visualization techniques developed in statistics (various diagrams, etc.).
Figures that display a series of bi-objective slices of the Pareto front for three-objective problems are known as decision maps.[87]
More recently, N. Wesner[88] proposed using a combination of a Venn diagram and multiple scatterplots of the objective space to explore the Pareto frontier and select optimal solutions.
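As a small sketch of the simplest of these visualization ideas, the code below (assuming matplotlib and a handful of made-up three-objective points) draws pairwise scatter plots of the objective space; decision maps or the Venn-diagram approach would require more elaborate plotting.

```python
import itertools
import matplotlib.pyplot as plt

# Made-up objective points approximating a three-objective Pareto front.
points = [(0.1, 0.8, 0.5), (0.3, 0.5, 0.6), (0.5, 0.4, 0.3),
          (0.7, 0.2, 0.4), (0.9, 0.1, 0.2)]
labels = ["f1", "f2", "f3"]

# One scatter plot per pair of objectives (f1-f2, f1-f3, f2-f3).
fig, axes = plt.subplots(1, 3, figsize=(9, 3))
for ax, (i, j) in zip(axes, itertools.combinations(range(3), 2)):
    ax.scatter([p[i] for p in points], [p[j] for p in points])
    ax.set_xlabel(labels[i])
    ax.set_ylabel(labels[j])
fig.tight_layout()
plt.show()
```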