
resulting FTs, we do not proceed further. If both MCSs are distinct, we add C to the part of the split with which it shares the most BEs. For example, we add C to M_i^1 if |C ∩ BEs_1| ≥ |C ∩ BEs_2|. By this choice, we ensure that adding C to M_i^1 does not introduce too many new BEs into BEs_1, and we keep the number of BEs shared between BEs_1 and BEs_2 small (a sketch of this assignment rule is given after Example 6 below). Note that the split can still yield two parts which share a significant number of BEs. Composing the two resulting FTs can therefore yield an FT which is larger than the single FT inferred without the split. However, the composed FT captures the symmetric structure present in the given MCSs.

Example 5 (Split the Minimal Cut Sets). We continue with the symmetry ϑ_2 = (AE)(CD) and the MCSs M_1 = {{A, C}, {B, C}, {B, D}, {D, E}} from Example 4. We start the algorithm with the MCS {A, C}. The symmetric MCS is ϑ({A, C}) = {D, E}. The first split yields M_1^1 = {{A, C}} and M_1^2 = {{D, E}}. The next MCS {B, C} is added to M_1^1 because both share the BE C. The final split is:

M_1^1 = {{A, C}, {B, C}}, BEs_1 = {A, B, C},
M_1^2 = {{D, E}, {B, D}}, BEs_2 = {B, D, E}.

The split corresponds to the purple and dark blue sub-trees in Figure 3.1.

Step 5: Infer Fault Tree. If no further partitioning of the MCSs M_i w.r.t. Steps 2-4 is possible, we use existing techniques to infer an FT from the (reduced) MCSs. SymLearn is modular and supports the use of any learning approach in this step, for example, approaches based on genetic algorithms (Linard, Bucur, and Stoelinga, 2019) or Boolean logic (Lazarova-Molnar, Niloofar, and Barta, 2020). In our setting, we use the multi-objective evolutionary algorithm FT-MOEA (Jimenez-Roa, Heskes, Tinga, et al., 2023). By default, FT-MOEA starts in the first generation with two parent FTs: one FT consists of an AND-gate connected to all BEs, and the other uses an OR-gate. In each generation, several genetic operators are applied which randomly modify the FT structure. Each FT is evaluated according to the three metrics given in Section 3.2: the size of the FT |F|, the error based on the failure dataset (ω_d), and the error based on the set of MCSs (ω_c). The aim is to minimise the multi-objective function (|F|, ω_d, ω_c) by applying the Elitist Non-dominated Sorting Genetic Algorithm (NSGA-II) (Deb, Pratap, Agarwal, et al., 2002) and obtaining the Pareto sets. Only the best candidates according to the metrics are passed on to the next generation. The algorithm stops if no improvement was made within a given number of generations and returns the FTs ordered according to the multi-objective function.

Example 6 (FT-MOEA). Given the MCSs {{A, C}, {B, C}}, we use FT-MOEA to infer an FT. The resulting FT is the sub-tree indicated by the purple colour in Figure 3.1.
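The following Python sketch illustrates the assignment rule used to split the MCSs, applied to Example 5. It is a simplified illustration rather than the SymLearn implementation; in particular, the representation of MCSs as frozensets and the dictionary encoding of the symmetry ϑ are assumptions made for this example.

def split_mcss(mcss, symmetry):
    """Split a set of MCSs into two symmetric parts (sketch only).

    mcss: iterable of frozensets of basic events (BEs).
    symmetry: dict mapping each BE to its image under the symmetry,
              e.g. {'A': 'E', 'E': 'A', 'C': 'D', 'D': 'C', 'B': 'B'}
              for the permutation (AE)(CD).
    """
    def image(cut_set):
        # Apply the symmetry to every BE of the cut set.
        return frozenset(symmetry.get(be, be) for be in cut_set)

    part1, part2 = [], []      # M_1^1 and M_1^2
    bes1, bes2 = set(), set()  # BEs_1 and BEs_2
    assigned = set()

    for c in mcss:
        c_sym = image(c)
        if c in assigned or c_sym == c:
            # Skip if C was already placed via its symmetric partner,
            # or if C is symmetric to itself (the MCSs are not distinct).
            continue
        # Add C to the part with which it shares the most BEs;
        # its symmetric image goes to the other part.
        if len(c & bes1) >= len(c & bes2):
            part1.append(c)
            bes1 |= c
            part2.append(c_sym)
            bes2 |= c_sym
        else:
            part2.append(c)
            bes2 |= c
            part1.append(c_sym)
            bes1 |= c_sym
        assigned.update({c, c_sym})

    return part1, part2


# Example 5: symmetry (AE)(CD) and MCSs {{A,C}, {B,C}, {B,D}, {D,E}}.
sym = {'A': 'E', 'E': 'A', 'C': 'D', 'D': 'C', 'B': 'B'}
mcss = [frozenset('AC'), frozenset('BC'), frozenset('BD'), frozenset('DE')]
m1, m2 = split_mcss(mcss, sym)
print([sorted(c) for c in m1])  # [['A', 'C'], ['B', 'C']]
print([sorted(c) for c in m2])  # [['D', 'E'], ['B', 'D']]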

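The selection principle of Step 5 can likewise be illustrated with a small multi-objective loop. The sketch below is not FT-MOEA itself: the function evaluate (assumed to return the tuple (|F|, ω_d, ω_c)) and the list of genetic operators are placeholders, the two default parent FTs (AND-gate and OR-gate over all BEs) are supplied by the caller, and full NSGA-II selection with non-dominated sorting ranks and crowding distance is reduced here to keeping only the Pareto front as elite.

import random

def dominates(a, b):
    # Pareto dominance for minimisation of (|F|, omega_d, omega_c):
    # a dominates b if it is no worse in every objective and strictly
    # better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population, evaluate):
    # Keep only the non-dominated candidate fault trees (the elite).
    objectives = {id(ft): evaluate(ft) for ft in population}
    return [ft for ft in population
            if not any(dominates(objectives[id(other)], objectives[id(ft)])
                       for other in population)]

def evolve(parents, evaluate, operators, generations=500, patience=50):
    # Minimal multi-objective loop in the spirit of FT-MOEA (sketch only).
    population = list(parents)
    previous, stagnant = None, 0
    for _ in range(generations):
        # Randomly modify the FT structure with the genetic operators.
        offspring = [random.choice(operators)(ft) for ft in population]
        population = pareto_front(population + offspring, evaluate)
        # Stop once the set of objective values stops improving for a
        # given number of generations.
        current = {evaluate(ft) for ft in population}
        stagnant = stagnant + 1 if current == previous else 0
        if stagnant >= patience:
            break
        previous = current
    # Return the FTs ordered according to the multi-objective function.
    return sorted(population, key=evaluate)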