Volume 2022, Issue 1 2006574
Research Article
Open Access

Computing the Entropy Measures for the Line Graphs of Some Chemical Networks

Muhammad Farhan Hanif
Abdus Salam School of Mathematical Sciences, Government College University, Lahore, Pakistan

Hasan Mahmood
Abdus Salam School of Mathematical Sciences, Government College University, Lahore, Pakistan
Department of Mathematics, Government College University, Lahore, Pakistan

Shazia Manzoor
Department of Mathematics, COMSATS University Islamabad, Lahore Campus, Pakistan

Fikre Bogale Petros (Corresponding Author)
Department of Mathematics, Addis Ababa University, Addis Ababa, Ethiopia
First published: 06 October 2022
Citations: 1
Academic Editor: Gohar Ali

Abstract

Chemical graph entropy plays a significant role in measuring the complexity of chemical structures, and it has explicit uses in chemistry, biology, and information sciences. A molecular structure of a compound consists of many atoms; in particular, hydrocarbons are chemical compounds consisting of carbon and hydrogen atoms. In this article, we discuss the concept of subdivision of chemical graphs and their corresponding line chemical graphs. More precisely, we discuss the properties of chemical graph entropies and then construct the chemical structures, namely the triangular benzenoid, the hexagonal parallelogram, and the zigzag-edge coronoid fused with starphene. We also estimate the degree-based entropies with the help of the line graphs of the subdivisions of the above-mentioned chemical graphs.

1. Introduction

Mathematical chemistry is a field of theoretical chemistry that uses mathematical approaches to discuss molecular structure without necessarily referring to quantum mechanics [1]. Chemical graph theory is a branch of mathematical chemistry in which a chemical phenomenon is described theoretically using graph theory [2, 3]. The growth of the organic disciplines has been aided by chemical graph theory [4, 5]. In mathematical chemistry, graph invariants or topological indices are numeric quantities that describe various essential features of organic components and are produced from an analogous molecular graph [6, 7]. Degree-based indices are among the topological indices used to predict bioactivity, boiling point, draining energy, stability, and physico-chemical properties of certain chemical compounds [8, 9]. Due to their chemical applications, these indices have a significant role in theoretical chemistry. Zhang et al. [10–12] discussed the topological indices of generalized bridge molecular graphs, carbon nanotubes, and products of chemical graphs. Zhang et al. [13–15] provided the physical analysis of heat of formation and entropy of ceria oxide. For further study about indices, see [16, 17].

Shannon [18] originated the concept of information entropy in communication theory. However, it was later discovered to be a quantity applicable to all things with a set nature [19, 20], including molecular graphs [21–23]. In chemistry, information entropy is now used in two modes. Firstly, it is a structural descriptor for assessing the complexity of chemical structures [24]. Information entropy is useful in this regard for connecting structural and physico-chemical features [25], numerically distinguishing isomers of organic molecules [26], and classifying natural products and synthetic chemicals [27, 28]. The physico-chemical sounding of information entropy is a different mode of application.
As a result, Terenteva and Kobozev demonstrated its utility in analyzing physico-chemical processes that simulate information transmission [29]. Zhdanov [30] used entropy values to study the chemical processes of organic compounds. The information entropy is defined as:
(1)
ENT_Λ(G) = −∑_{lm ∈ E(G)} [Λ(lm) / ∑_{l′m′ ∈ E(G)} Λ(l′m′)] log[Λ(lm) / ∑_{l′m′ ∈ E(G)} Λ(l′m′)].
Here, the logarithm is taken with base e, while V(G), E(G), and Λ(lm) represent the vertex set, the edge set, and the weight of the edge lm in G, respectively. Many graph entropies have been calculated in the literature utilising characteristic polynomials, vertex degrees, and graph order [31–34]. Graph entropies based on independent sets, matchings, and the degrees of vertices [35] have been estimated in recent years. Dehmer and Mowshowitz proposed several graph complexity and Hosoya entropy relationships [23, 32, 36, 37]. For further study, see [19, 21, 38–42, 59, 60].

A graph G is an ordered pair of two sets, one referred to as the vertex set V(G) and the other as the edge set E(G), with the edges joining the vertices. Two vertices of G are said to be adjacent when they share an edge. The degree of a vertex l is represented by d_l, and the sum of the degrees of all vertices adjacent to l is denoted by A_l. By replacing each edge of G with a path of length two, the subdivision graph S(G) is formed. The line graph of G is denoted by L(G); its vertex set is E(G), and two vertices of L(G) are adjacent iff the corresponding edges share a common end point in G.
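Since the later sections only ever apply equation (1) through edge partitions, the definition can be sketched in a few lines of Python (an illustrative helper, not part of the original paper):

```python
import math

def graph_entropy(weights):
    """Equation (1): normalise the edge weights Lambda(lm) into a
    probability distribution and take the Shannon entropy (base e)."""
    total = sum(weights)
    return -sum(w / total * math.log(w / total) for w in weights)

# For m edges of equal weight the entropy collapses to log(m);
# e.g. 12 unit-weight edges give log 12.
print(graph_entropy([1.0] * 12))  # ≈ 2.4849
```

The algebraically equivalent form log(W) − (1/W)·∑ w·log w, where W is the total edge weight, is the one the entropy expressions in Sections 1.1–1.5 take.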

1.1. Randić Entropy [43, 44]

If Λ(lm) = (d_l × d_m)^α, then
(2)
R_α(G) = ∑_{lm ∈ E(G)} (d_l × d_m)^α.
Now (1) represents the Randić entropy:
(3)
ENT_{R_α}(G) = log(R_α(G)) − (1/R_α(G)) ∑_{lm ∈ E(G)} (d_l × d_m)^α log[(d_l × d_m)^α].

1.2. Atom Bond Connectivity Entropy [45]

If Λ(lm) = √[(d_l + d_m − 2)/(d_l × d_m)], then
(4)
ABC(G) = ∑_{lm ∈ E(G)} √[(d_l + d_m − 2)/(d_l × d_m)].
Thus (1) is converted into the following form:
(5)
ENT_{ABC}(G) = log(ABC(G)) − (1/ABC(G)) ∑_{lm ∈ E(G)} √[(d_l + d_m − 2)/(d_l × d_m)] log(√[(d_l + d_m − 2)/(d_l × d_m)]).

1.3. The Geometric Arithmetic Entropy [43, 44]

If Λ(lm) = 2√(d_l × d_m)/(d_l + d_m), then
(6)
GA(G) = ∑_{lm ∈ E(G)} 2√(d_l × d_m)/(d_l + d_m).
Now (1) takes the form given below:
(7)
ENT_{GA}(G) = log(GA(G)) − (1/GA(G)) ∑_{lm ∈ E(G)} [2√(d_l × d_m)/(d_l + d_m)] log[2√(d_l × d_m)/(d_l + d_m)].

1.4. The Fourth Atom Bond Connectivity Entropy [35]

If Λ(lm) = √[(A_l + A_m − 2)/(A_l × A_m)], then
(8)
ABC4(G) = ∑_{lm ∈ E(G)} √[(A_l + A_m − 2)/(A_l × A_m)].
Now (1) is converted into the following form:
(9)
ENT_{ABC4}(G) = log(ABC4(G)) − (1/ABC4(G)) ∑_{lm ∈ E(G)} √[(A_l + A_m − 2)/(A_l × A_m)] log(√[(A_l + A_m − 2)/(A_l × A_m)]).

1.5. The Fifth Geometric Arithmetic Entropy [35]

If Λ(lm) = 2√(A_l × A_m)/(A_l + A_m), then
(10)
GA5(G) = ∑_{lm ∈ E(G)} 2√(A_l × A_m)/(A_l + A_m).
Equation (1) now changes to the following form, known as the fifth geometric arithmetic entropy:
(11)
ENT_{GA5}(G) = log(GA5(G)) − (1/GA5(G)) ∑_{lm ∈ E(G)} [2√(A_l × A_m)/(A_l + A_m)] log[2√(A_l × A_m)/(A_l + A_m)].

See [35, 44] for further information on these entropy measures.
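The five edge weights above can be collected into small helpers (a Python sketch; the function names are mine, and al, am stand for the degree sums A_l, A_m of the text):

```python
import math

def randic(dl, dm, alpha):
    return (dl * dm) ** alpha                      # Section 1.1

def abc(dl, dm):
    return math.sqrt((dl + dm - 2) / (dl * dm))    # Section 1.2

def ga(dl, dm):
    return 2 * math.sqrt(dl * dm) / (dl + dm)      # Section 1.3

def abc4(al, am):
    return math.sqrt((al + am - 2) / (al * am))    # Section 1.4

def ga5(al, am):
    return 2 * math.sqrt(al * am) / (al + am)      # Section 1.5

# GA and GA5 equal 1 exactly on edges whose end degrees (or degree
# sums) coincide, which is why (2,2)- and (3,3)-type edges carry
# unit weight in the geometric arithmetic entropies.
print(ga(2, 3), ga5(5, 8))
```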

2. Formation of Triangular Benzenoid

Triangular benzenoids are a group of benzenoid molecular graphs and are denoted by Tx, where x characterizes the number of hexagons at the bottom of the graph and (1/2)x(x + 1) represents the total number of hexagons in Tx. Triangular benzenoids are a generalization of the benzene molecule C6H6, with benzene rings forming a triangular shape. The benzene molecule is a common molecule in physics, chemistry, and nanosciences, and it is quite fruitful for synthesizing aromatic chemicals [46]. Raut [47] calculated some topological indices for the triangular benzenoid system. Hussain et al. [48] discussed the irregularity determinants of some benzenoid systems. Kwun [49] calculated degree-based indices by using M-polynomials. For further details, see [50, 51]. The hexagons are placed in rows, with each row increasing by one hexagon. For T1, there is only one type of edge, e1 = (2, 2), with |e1| = 6; therefore, |V(T1)| = 6 and |E(T1)| = 6. There are three kinds of edges in T2, namely e1 = (2, 2), e2 = (2, 3), and e3 = (3, 3), with |e1| = 6, |e2| = 6, and |e3| = 3; therefore, |V(T2)| = 13 and |E(T2)| = 15. Continuing in this way, |V(Tx)| = x^2 + 4x + 1 and |E(Tx)| = (3/2)x(x + 3). The subdivision graph of Tx and its line graph are demonstrated in Figure 1. It is to be noted that |V(L(S(Tx)))| = 3x(x + 3) and |E(L(S(Tx)))| = (3/2)(3x^2 + 7x − 2).

Figure 1. (a) Triangular benzenoid T5; (b) subdivision of T5; (c) the line graph of the subdivision graph of T5.

Let G = L(S(Tx)), i.e., G is the line graph of the subdivision graph of the triangular benzenoid Tx. We will use the edge partition and vertex counting technique to compute the desired indices and entropies. The edge partition of G is based on the degrees of the end vertices of each edge. It is easy to see that there are only three types of edges, shown in Table 1.

Table 1. Edge partition of L(S(Tx)).
(dl, dm) Ni Set of edges
(2, 2) 3(x + 3) E1
(2, 3) 6(x − 1) E2
(3, 3) (3/2)(3x^2 + x − 4) E3
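As a quick sanity check (my own Python sketch), the three multiplicities of Table 1 must add up to |E(L(S(Tx)))| = (3/2)(3x^2 + 7x − 2); this forces the (2, 2) class to be 3(x + 3), not 2(x + 3):

```python
from fractions import Fraction

def table1(x):
    """Edge class sizes (2,2), (2,3), (3,3) of L(S(T_x))."""
    return [3 * (x + 3),                            # (2, 2)
            6 * (x - 1),                            # (2, 3)
            Fraction(3, 2) * (3 * x**2 + x - 4)]    # (3, 3)

# Exact arithmetic: the classes sum to |E(L(S(Tx)))| for every x.
for x in range(1, 20):
    assert sum(table1(x)) == Fraction(3, 2) * (3 * x**2 + 7 * x - 2)

# x = 1 gives the classes [12, 0, 0]: L(S(T_1)) is the 12-cycle
# obtained from benzene.
print(table1(1))
```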

2.1. Entropy Measure for L(S(Tx))

We will calculate the entropies of L(S(Tx)) in this section.

2.1.1. Randić Entropy of L(S(Tx))

The Randić index and entropy for α = 1, −1, 1/2, −1/2, with the help of Table 1 and equation (3), are:
(12)
By putting α = 1, −1, 1/2, −1/2 in (3), we get the Randić entropies as given below:
(13)
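The closed expressions above can be evaluated numerically straight from Table 1 (an illustrative Python sketch, not the paper's derivation; for x = 2, α = 1/2 it returns ≈ 3.5667, in line with the corresponding entry of Table 8):

```python
import math

def randic_entropy(x, alpha):
    """ENT_{R_alpha} of L(S(T_x)) via equation (3) and Table 1."""
    classes = [(4.0 ** alpha, 3 * (x + 3)),               # (2, 2) edges
               (6.0 ** alpha, 6 * (x - 1)),               # (2, 3) edges
               (9.0 ** alpha, 1.5 * (3 * x**2 + x - 4))]  # (3, 3) edges
    total = sum(w * n for w, n in classes)
    s = sum(n * w * math.log(w) for w, n in classes if n > 0)
    return math.log(total) - s / total

print(randic_entropy(2, 0.5))  # ≈ 3.5667
```

For x = 1 all surviving edges carry equal weight, so the entropy collapses to log 12 ≈ 2.4849 for every α.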

2.1.2. The ABC Entropy of L(S(Tx))

The ABC index and entropy measure, with the help of Table 1 and equation (5), are:
(14)

2.1.3. The Geometric Arithmetic Entropy of L(S(Tx))

The GA index and entropy measure, with the help of Table 1 and equation (7), are:
(15)

2.1.4. The ABC4 Entropy of L(S(Tx))

The edge partition of the graph L(S(Tx)), based on the degree sums of the end vertices of each edge, is shown in Table 2.

Table 2. Edge partition of L(S(Tx)) based on degree sums.
(Al, Am) Ni Set of edges
(4, 4) 9
(4, 5) 6
(5, 5) 3(x − 2)
(5, 8) 6(x − 1)
(8, 8) 3(x − 1)
(8, 9) 6(x − 1)
(9, 9) (3/2)(3x^2 − 5x + 2)
After simple calculations, by using Table 2 subject to the condition that x ≠ 1, we get
(16)
By using (9), the ABC4 entropy as follows:
(17)

If we consider x = 1, then L(S(T1)) is the 12-cycle C12, and the ABC4 index and entropy are computed from it directly.
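The ABC4 entropy can likewise be evaluated numerically from Table 2 (a sketch; the (5, 8) multiplicity is read as 6(x − 1), since the graph depends on x alone, and for x = 2 the result ≈ 3.575 agrees with the corresponding entry of Table 9):

```python
import math

def abc4_entropy_T(x):
    """ENT_{ABC4} of L(S(T_x)) via equation (9) and Table 2, x >= 2."""
    part = [((4, 4), 9), ((4, 5), 6),
            ((5, 5), 3 * (x - 2)), ((5, 8), 6 * (x - 1)),
            ((8, 8), 3 * (x - 1)), ((8, 9), 6 * (x - 1)),
            ((9, 9), 1.5 * (3 * x**2 - 5 * x + 2))]
    w = lambda a, b: math.sqrt((a + b - 2) / (a * b))  # ABC4 weight
    total = sum(n * w(a, b) for (a, b), n in part)
    s = sum(n * w(a, b) * math.log(w(a, b)) for (a, b), n in part if n > 0)
    return math.log(total) - s / total

print(abc4_entropy_T(2))  # ≈ 3.575
```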

2.1.5. The GA5 Entropy of L(S(Tx))

After some simple calculations, the GA5 index may be calculated using Table 2 under the constraint that x ≠ 1.
(18)
Therefore, (11), together with Table 2, is converted into the form:
(19)

3. Formation of Hexagonal Parallelogram Nanotubes H(x, y)

Hexagonal parallelogram nanotubes are formed by arranging hexagons in a parallelogram fashion. Baig et al. [52] computed counting polynomials of benzenoid carbon nanotubes; also, see [53]. We will denote this structure by H(x, y), in which x and y represent the number of hexagons in each row and column, respectively. Also, the order and size of H(x, y) are 2(x + y + xy) and 3xy + 2x + 2y − 1, respectively. The subdivision graph of H(x, y) and its line graph are shown in Figure 2; see [46]. Let G = L(S(H(x, y))); then |V(G)| = 2(3xy + 2x + 2y − 1) and |E(G)| = 9xy + 4x + 4y − 5. To compute our results, we will use the edge partition technique, which is based on the degrees of the end vertices of each edge. It is to be noted that there are only three types of edges; see Figure 2. The edge partition of the chemical graph L(S(H(x, y))) depending on the degrees of the end vertices is presented in Table 3.

Figure 2. (a) Hexagonal parallelogram H(x, y); (b) subdivision of H(x, y); (c) the line graph of the subdivision graph of H(x, y).
Table 3. Edge partition of L(S(H(x, y))).
(dl, dm) Ni Kinds of edges
(2, 2) 2(4 + y + x)
(2, 3) 4(−2 + y + x)
(3, 3) 9xy − 2x − 2y − 5
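The same kind of consistency check (my own Python sketch) works for Table 3: with the (3, 3) class written in x and y, the three classes sum to |E(L(S(H(x, y))))| = 9xy + 4x + 4y − 5:

```python
def table3(x, y):
    """Edge class sizes (2,2), (2,3), (3,3) of L(S(H(x, y)))."""
    return [2 * (x + y + 4),                # (2, 2)
            4 * (x + y - 2),                # (2, 3)
            9 * x * y - 2 * x - 2 * y - 5]  # (3, 3)

# The classes sum to the size of L(S(H(x, y))) for all x, y >= 1.
for x in range(1, 8):
    for y in range(1, 8):
        assert sum(table3(x, y)) == 9 * x * y + 4 * x + 4 * y - 5
```

For x = y = 1 the classes reduce to [12, 0, 0], again the 12-cycle coming from benzene.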

3.1. Entropy Measure for L(S(H(x, y)))

We will enumerate the entropies of L(S(H(x, y))) in this section.

3.1.1. Randić Entropy of L(S(H(x, y)))

The Randić index for α = 1, −1, 1/2, −1/2, by using Table 3, is:
(20)
So (3), together with Table 3, gives the Randić entropy, converted into the form:
(21)
Now, substituting α = 1, −1, 1/2, −1/2 in (20), we get the Randić entropies as given below:
(22)

3.1.2. The ABC Entropy of L(S(H(x, y)))

With the use of Table 3 and equation (5), we can calculate the ABC index and entropy measure as follows:
(23)
Therefore, equation (5), together with Table 3, becomes the following, which is called the atom bond connectivity entropy:
(24)

3.1.3. The Geometric Arithmetic Entropy of L(S(H(x, y)))

We can calculate the GA index and entropy measure using Table 3 and equation (7) as follows:
(25)

3.1.4. The ABC4 Entropy of L(S(H(x, y)))

Case 1. when x > 1, y ≠ 1

The edge partition of L(S(H(x, y))) is shown in Table 4.

Therefore, the ABC4 index and entropy measure, with the help of Table 4 and equation (9), yield:

(26)

Since L(S(H(x, y))) has seven kinds of edges, (9), by using Table 4, is converted into the form:

(27)

Table 4. Edge partition of L(S(H(x, y))).
(Al, Am) Ni Kinds of edges
(4, 4) 8
(4, 5) 8
(5, 5) 2(−4 + y + x)
(5, 8) 4(−2 + y + x)
(8, 8) 2(−2 + x + y)
(8, 9) 2(−2 + x + y)
(9, 9) 9xy − 8x − 8y + 7

Case 2. when x = 1, y ≠ 1

By using the same process, we get the closed expressions for the ABC4 index and ABC4 entropy as:

(28)

3.1.5. The Fifth Geometric Arithmetic Entropy of L(S(H(x, y)))

Case 3. When x > 1, y ≠ 1, the fifth geometric arithmetic entropy can be estimated by using (11) and Table 4 in the following manner:

(29)

So the (11), with Table 4 can be written as:

(30)

Case 4. When x = 1, y ≠ 1, by using Table 5 and (11), we get the closed expressions for the GA5 index and GA5 entropy as:

(31)

Table 5. Edge partition of L(S(H(x, y))), for x = 1.
(Al, Am) Ni Kinds of edges
(4, 4) 10
(4, 5) 4
(5, 5) 2(y − 2)
(5, 8) 4(y − 1)
(8, 8) 2(y − 1)
(8, 9) 2(y − 1)
(9, 9) y − 1

4. Formation from Fusion of Zigzag-Edge Coronoid with Starphene ZCS(x, y, z) Nanotubes

If a zigzag-edge coronoid ZC(x, y, z) is fused with a starphene St(x, y, z), then we obtain a composite benzenoid. It is to be noted that |V(ZCS(x, y, z))| = 12(x + y + z) − 54 and |E(ZCS(x, y, z))| = 15(x + y + z) − 63. The subdivision graph of ZCS(x, y, z) and its line graph are illustrated in Figure 3. We can see from the figures that the order and the size of the line graph of the subdivision graph of ZCS(x, y, z) are 30(x + y + z) − 126 and 39(x + y + z) − 153, respectively [46]. Let G = L(S(ZCS(x, y, z))) represent the line graph of the subdivision graph of ZCS(x, y, z). The edge partition is determined by the degrees of the end vertices of each edge; Table 6 illustrates this.

Figure 3. (a) ZCS(4, 4, 4); (b) subdivision of ZCS(4, 4, 4); (c) L(S(ZCS(4, 4, 4))).
Table 6. Edge partition of L(S(ZCS(x, y, z))).
(dl, dm) Ni Kinds of edges
(2,2) 6(−5 + z + y + x)
(2,3) 12(−7 + z + y + x)
(3,3) −39 + 21(z + y + x)
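For Table 6 the analogous consistency check (a Python sketch of my own) confirms that the classes sum to |E(L(S(ZCS(x, y, z))))| = 39(x + y + z) − 153:

```python
def table6(x, y, z):
    """Edge class sizes (2,2), (2,3), (3,3) of L(S(ZCS(x, y, z)))."""
    s = x + y + z
    return [6 * (s - 5),    # (2, 2)
            12 * (s - 7),   # (2, 3)
            21 * s - 39]    # (3, 3)

# The classes sum to the size of L(S(ZCS(x, y, z))).
for dims in [(4, 4, 4), (4, 5, 6), (7, 7, 7), (10, 10, 10)]:
    assert sum(table6(*dims)) == 39 * sum(dims) - 153
```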

4.1. Entropy Measure for L(S(ZCS(x, y, z)))

We will calculate the entropies of L(S(ZCS(x, y, z))) in this section.

4.1.1. Randić Entropy of L(S(ZCS(x, y, z)))

For α = 1, −1, 1/2, −1/2, the Randić index, with the help of Table 6, is:
(32)
Using (3), the Randić entropy is:
(33)
By putting α = 1, −1, 1/2, −1/2 in (32), we get the Randić entropies as given below:
(34)

4.1.2. The ABC Entropy of L(S(ZCS(x, y, z)))

The ABC index and entropy measure with the help of Table 6 and equation (5) are:
(35)

4.1.3. The Geometric Arithmetic Entropy of L(S(ZCS(x, y, z)))

The GA index and corresponding entropy with the help of Table 6 and equation (7) are:
(36)

4.1.4. The ABC4 Entropy of L(S(ZCS(x, y, z)))

Table 7 shows the edge partition of the graph L(S(ZCS(x, y, z))), which is based on the degree sums of the end vertices of each edge.

Table 7. Edge partition of L(S(ZCS(x, y, z))) based on the degree sums of the end vertices, for every x = y = z ≥ 4.
(Al, Am) Ni Kinds of Edges
(4, 4) 6
(4, 5) 12
(5, 5) 6(x + y + z − 8)
(5, 8) 12(x + y + z − 7)
(8, 8) 6(x + y + z − 9)
(8, 9) 12(x + y + z − 5)
(9, 9) 3(x + y + z + 25)
After simple calculations, the ABC4 index and entropy measure, with the help of Table 7 and equation (9), subject to the condition that x = y = z ≥ 4, are:
(37)
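Both degree-sum entropies of L(S(ZCS(x, y, z))) can be evaluated numerically from Table 7 (an illustrative sketch for x = y = z ≥ 4; the entropy of any weight distribution over the 39(x + y + z) − 153 edges is bounded above by the logarithm of that edge count):

```python
import math

abc4 = lambda a, b: math.sqrt((a + b - 2) / (a * b))  # Section 1.4 weight
ga5 = lambda a, b: 2 * math.sqrt(a * b) / (a + b)     # Section 1.5 weight

def zcs_entropy(x, y, z, weight):
    """Entropy via equation (1) and the Table 7 partition, x = y = z >= 4."""
    s = x + y + z
    part = [((4, 4), 6), ((4, 5), 12),
            ((5, 5), 6 * (s - 8)), ((5, 8), 12 * (s - 7)),
            ((8, 8), 6 * (s - 9)), ((8, 9), 12 * (s - 5)),
            ((9, 9), 3 * (s + 25))]
    total = sum(n * weight(a, b) for (a, b), n in part)
    s_log = sum(n * weight(a, b) * math.log(weight(a, b)) for (a, b), n in part)
    return math.log(total) - s_log / total

print(zcs_entropy(4, 4, 4, abc4), zcs_entropy(4, 4, 4, ga5))
```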

4.1.5. The GA5 Entropy of L(S(ZCS(x, y, z)))

After some simple calculations, the GA5 index and corresponding entropy measure, with the help of Table 7 and equation (11), subject to the condition that x = y = z ≥ 4, are:
(38)

5. Concluding Remarks for Computed Results

The applications of the information-theoretic framework in many disciplines of study, such as biology, physics, engineering, and the social sciences, have grown exponentially in the past two decades. This increase has been particularly impressive in the fields of soft computing, molecular biology, and information technology. As a result, scientists may find our numerical and graphical results useful [54, 55]. The entropy function is monotonic, which means that as the size of a chemical structure increases, so does the entropy measure; and as the entropy of a system increases, so does the uncertainty regarding its reaction.
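The monotonicity claim can be illustrated concretely (a Python sketch using the degree-based partition of L(S(Tx)) from Table 1): the GA entropy increases strictly with x, starting from log 12 ≈ 2.4849 at x = 1, in line with Table 9:

```python
import math

def ga_entropy_T(x):
    """ENT_GA of L(S(T_x)) via equation (7) and its edge partition."""
    ga = lambda a, b: 2 * math.sqrt(a * b) / (a + b)
    part = [((2, 2), 3 * (x + 3)), ((2, 3), 6 * (x - 1)),
            ((3, 3), 1.5 * (3 * x**2 + x - 4))]
    total = sum(n * ga(a, b) for (a, b), n in part)
    s = sum(n * ga(a, b) * math.log(ga(a, b)) for (a, b), n in part if n > 0)
    return math.log(total) - s / total

vals = [ga_entropy_T(x) for x in range(1, 11)]
assert all(a < b for a, b in zip(vals, vals[1:]))  # entropy grows with size
print(vals[0], vals[-1])
```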

For L(S(Tx)), the numerical and graphical results are shown in Tables 8 and 9 and Figures 4–7. In Table 9, the fifth geometric arithmetic entropy is zero for x = 1, which shows that the process is deterministic in that case. When the chemical structure L(S(Tx)) expands, the Randić entropy for α = 1/2 develops more quickly than the other entropy measurements of L(S(Tx)), whereas the Randić entropy for α = −1/2 develops more slowly. This demonstrates that different topologies have varied entropy characteristics. For L(S(H(x, y))), the numerical and graphical results are shown in Tables 10–13 and Figures 8–12. When the chemical structure L(S(H(x, y))) expands, the geometric arithmetic entropy develops more quickly than the other entropy measurements of L(S(H(x, y))), whereas the ABC4 entropy develops more slowly. Finally, for L(S(ZCS(x, y, z))), the numerical and graphical results are shown in Table 14 and Figures 13–16. When the chemical structure L(S(ZCS(x, y, z))) expands, the geometric arithmetic entropy develops more quickly than the other entropy measurements of L(S(ZCS(x, y, z))), whereas the Randić entropy for α = −1 develops more slowly.

Table 8. Comparison of Randić entropies for L(S(Tx)).
x ENT_{R_1} ENT_{R_{−1}} ENT_{R_{1/2}} ENT_{R_{−1/2}}
1 0.4055 2.5590 2.4849 2.6263
2 3.1863 3.0463 3.5667 3.5970
3 4.0316 3.6767 4.2203 4.2280
4 4.5797 4.2928 4.6981 4.6991
5 4.9945 4.8714 5.0779 5.0764
6 5.3312 5.4107 5.3942 5.3918
7 5.6159 5.9131 5.6658 5.6631
8 5.8632 6.3820 5.9041 5.9013
9 6.0820 6.8208 6.1164 6.1136
10 6.2785 7.2325 6.3080 6.3053
Table 9. Comparison of ENT_ABC, ENT_GA, ENT_ABC4, and ENT_GA5 for L(S(Tx)).
x ENT_ABC ENT_GA ENT_ABC4 ENT_GA5
1 2.3116 2.4849 2.1972 0
2 3.5239 3.5835 3.5749 3.5835
3 4.2025 4.2341 4.2263 4.2341
4 4.6897 4.7095 4.7028 4.7095
5 5.0739 5.0876 5.0817 5.0876
6 5.3926 5.4027 5.3975 5.4026
7 5.6655 5.6733 5.6687 5.6733
8 5.9046 5.91087 5.9066 5.9108
9 6.1174 6.1225 6.1187 6.1225
10 6.3093 6.3135 6.3100 6.3135
Figure 4. (a) R1 entropy; (b) R−1 entropy.
Figure 5. (a) R1/2 entropy; (b) R−1/2 entropy.
Figure 6. (a) The ABC entropy; (b) the GA entropy.
Figure 7. (a) The ABC4 entropy; (b) the GA5 entropy.
Table 10. Comparison of Randić entropies for L(S(H(x, y))).
[x, y] ENT_{R_1} ENT_{R_{−1}} ENT_{R_{1/2}} ENT_{R_{−1/2}}
[1,1] 2.4849 2.4849 2.4849 2.4849
[2,2] 3.7917 3.7830 3.8344 3.8332
[3,3] 4.5635 4.5428 4.5933 4.5906
[4,4] 5.1096 5.0872 5.1323 5.1294
[5,5] 5.5345 5.5129 5.5530 5.5502
[6,6] 5.8833 5.8630 5.8988 5.8962
[7,7] 6.1794 6.1615 6.1928 6.1904
[8,8] 6.4368 6.4194 6.4486 6.4464
[9,9] 6.6646 6.6483 6.6751 6.6731
[10,10] 6.8688 6.5370 6.8783 6.8822
Table 11. Comparison of ENT_ABC and ENT_GA entropies for L(S(H(x, y))).
[x, y] ENT_ABC ENT_GA
[1,1] 2.4849 2.4849
[2,2] 3.8497 3.8501
[3,3] 4.6048 4.6051
[4,4] 5.1413 5.1416
[5,5] 5.5604 5.5607
[6,6] 5.9051 5.9053
[7,7] 6.1982 6.1985
[8,8] 6.4534 6.4536
[9,9] 6.6794 6.6796
[10,10] 6.8822 6.8824
Table 12. Comparison of ENT_ABC4 and ENT_GA5 entropies for L(S(H(x, y))), x > 1 and y ≠ 1.
[x, y] ENT_ABC4 ENT_GA5
[2,2] 3.7879 3.4822
[3,3] 4.5387 2.2596
[4,4] 5.0783 4.8387
[5,5] 5.5018 5.2952
[6,6] 5.8509 5.6704
[7,7] 6.1481 5.9882
[8,8] 6.4068 6.2636
[9,9] 6.6360 6.5064
[10,10] 6.8417 6.7234
Table 13. Comparison of ENT_ABC4 and ENT_GA5 entropies for L(S(H(x, y))), x = 1 and y ≠ 1.
y ENT_ABC4 ENT_GA5
2 3.1846 3.2958
3 3.5933 3.6888
4 3.8884 3.9702
5 4.1184 4.1896
6 4.3064 4.3694
7 4.4653 4.5217
8 4.6027 4.6539
9 4.7238 4.7706
10 4.8319 4.8751
Figure 8. (a) R1 entropy; (b) R−1 entropy.
Figure 9. (a) R1/2 entropy; (b) R−1/2 entropy.
Figure 10. (a) The ABC entropy; (b) the GA entropy.
Figure 11. (a) The ABC4 entropy; (b) the GA5 entropy, x ≥ 1, y ≠ 1.
Figure 12. (a) The ABC4 entropy; (b) the GA5 entropy, x = 1, y ≠ 1.
Table 14. Comparison of Randić entropies for L(S(ZCS(x, y, z))).
[x, y, z] ENT_{R_1} ENT_{R_{−1}} ENT_{R_{1/2}} ENT_{R_{−1/2}}
[4,4,4] 5.7200 5.70060 5.7432 5.7407
[5,5,5] 6.0342 6.0165 6.0587 6.0565
[6,6,6] 6.2730 6.2564 6.2982 6.2961
[7,7,7] 6.4657 6.4497 6.4913 6.4893
[8,8,8] 6.6272 6.6117 6.6531 6.6511
[9,9,9] 6.7662 6.7511 6.7923 6.7904
[10,10,10] 6.8883 6.8734 6.9145 6.9126
Figure 13. (a) R1 entropy; (b) R−1 entropy.
Figure 14. (a) R1/2 entropy; (b) R−1/2 entropy.
Figure 15. (a) The ABC entropy; (b) the GA entropy.
Figure 16. (a) The ABC4 entropy; (b) the GA5 entropy.

The novelty of this article is that entropies are computed for three types of benzenoid systems. These entropy measures are useful in estimating the heat of formation and many physico-chemical properties. In the statistical analysis of benzene structures, entropy measures showed more significant results than topological indices. Therefore, we can say that the entropy measure is a newly introduced topological descriptor.

6. Conclusion

Using Shannon's entropy and the entropy definitions of Chen et al. [31], we generated graph entropies associated with a new information function in this research. A relationship is established between indices and information entropies. Using the line graphs of the subdivisions of these graphs, we estimated the entropies for triangular benzenoids Tx, hexagonal parallelogram H(x, y) nanotubes, and ZCS(x, y, z). Thermodynamic entropy of enzyme-substrate complexes [57, 58] and configurational entropy of glass-forming liquids [56] are two examples of thermodynamic entropy employed in molecular dynamics studies of complex chemical systems. Similarly, using information entropy as a crucial structural criterion could be a new step in this direction.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Authors’ Contributions

All authors contributed equally to this work.

Data Availability

The data used to support the findings of this study are cited at relevant places within the text as references.
