Article

Entropies in Electric Circuits

by Angel Cuadras *, Victoria J. Ovejas and Herminio Martínez-García
Electronics Engineering Department (DEEL), Energy, Power and Integrated Circuits (EPIC), Escola d’Enginyeria de Barcelona Est (EEBE), Universitat Politècnica de Catalunya—BarcelonaTech (UPC), Av. d’Eduard Maristany, 16, Edifici A, Campus Besòs, 08029 Barcelona, Spain
* Author to whom correspondence should be addressed.
Entropy 2025, 27(1), 73; https://doi.org/10.3390/e27010073
Submission received: 1 October 2024 / Revised: 10 January 2025 / Accepted: 14 January 2025 / Published: 15 January 2025
(This article belongs to the Section Multidisciplinary Applications)

Abstract

The present study examines the relationship between thermal and configurational entropy in two resistors in parallel and in series. The objective is to introduce entropy in electric circuit analysis by considering the impact of system geometry on energy conversion in the circuit. Thermal entropy is derived from thermodynamics, whereas configurational entropy is derived from network modelling. It is observed that the relationship between thermal entropy and configurational entropy varies depending on the configuration of the resistors. In parallel resistors, thermal entropy decreases with configurational entropy, while in series resistors, the opposite is true. The implications of the maximum power transfer theorem and constructal law are discussed. The entropy generation for resistors at different temperatures was evaluated, and it was found that the consideration of resistor configurational entropy change was necessary for consistency. Furthermore, for the sake of generalization, a similar behaviour was observed in time-dependent circuits, either for resistor–capacitor circuits or circuits involving degradation.

1. Introduction

Energy is inextricably linked to entropy, as its transformations are governed by the second law of thermodynamics. Although this statement applies to all forms of energy, the application of entropy is generally found in thermal energy conversion [1]. Its use in electrical circuit analysis is much less common [2] and a general agreement on its understanding has yet to be established. For instance, it is claimed that either entropy generation is maximal [3,4] or minimal [5,6]. Moreover, the state of the art on entropy in electrical circuits also describes the organization of the circuit network configuration in terms of entropy, i.e., how the electrical elements are interrelated.
Considering that the theory of circuit analysis is well established, based on the conservation of charge and energy and ultimately described by Kirchhoff’s circuit laws, we aim to evaluate whether entropy can clarify the intrinsic behaviour of circuit analysis in cases where the circuit elements can be at different temperatures, such as in the case of a battery. Thus, we briefly describe the most valuable contributions of entropy from both a network point of view and a thermodynamic point of view, in order to relate them in this contribution, taking into account that the total entropy of the system is the sum of all forms of entropy [7]. We also keep in mind that, as Jaynes explicitly stated, “We must warn at the outset that the major occupational disease of this field is a persistent failure to distinguish between the information entropy, which is a property of any probability distribution, and the experimental entropy of thermodynamics, which is instead a property of a thermodynamic state, […] But in case the problem happens to be one of thermodynamics, there is a relationship between them” [8].

1.1. Network Entropy

Electrical networks have been investigated in terms of the drunkard’s walk problem, a well-studied situation in the theory of probability [9,10,11,12], so that resistor networks can be interpreted in terms of probability. Nachmias developed a mathematical description of the physical network laws from voltage and current harmonic functions [12]. Perelson described thermodynamics in terms of networks and Kirchhoff’s law [13], which is similar to the approach of this contribution, in which we apply thermodynamics to networks. Since a network is a connected graph endowed with positively weighted edges, such as resistances, there is an extensive body of literature on entropy in graph theory [14,15], with different entropy-based measures based on network invariants, such as the number of vertices or the vertex degree sequence. Moreover, Anand and Bianconi studied entropy in networks [16], comparing Gibbs, Shannon, and von Neumann entropies in networks describing microcanonical and canonical examples. Their results have been applied in several fields, such as robotics [17], biophysics [18], and stock markets [19]. Finally, although several references can be found for resistor networks, the presence of capacitors or inductors is much less common, being limited to the modelling of power lines [20], DC–DC converters [21], or quantum effects [22], without playing an active role in the description of network entropy.

1.2. Thermal Entropy in One Resistor

In the framework of irreversible thermodynamics, the entropy associated with energy conversion (thermodynamic entropy) in a resistor R is described by Ohm’s law, V = I R, that is, by the relationship between current I (flow) and voltage V (gradient). It is well established that, when the current flows through the resistor, electrical energy is converted to heat and thermodynamic entropy Stherm is generated in the resistor at a rate of
$$\dot{S}_{therm} = \frac{dS_{therm}}{dt} = \frac{I V}{T_R} = \frac{P_{diss}}{T_R}$$
where TR is the resistor temperature [23,24] and Pdiss is the dissipated power. In this basic configuration, we have only one degree of freedom and Stherm describes how electrical energy is converted into thermal energy. It is worth noting that the entropy that reaches the environment at a temperature T (cold source) is given by Equation (1) plus the entropy generated from transferring the heat from the resistor (hot source) to the environment:
$$\dot{S}_{therm\_env} = \dot{S}_{therm} + P_{diss}\left(\frac{1}{T} - \frac{1}{T_R}\right) = \frac{P_{diss}}{T}$$
which is simply the Gouy–Stodola theorem [25]. In terms of entropy balance, the entropy change at the resistor, $\dot{S}_R$, is due to the entropy generation ${}_i\dot{S}$ and the entropy exchanged with the environment ${}_e\dot{S}$ [23]:
$$\dot{S}_R = {}_i\dot{S} + {}_e\dot{S}$$
The resistor R where $\dot{S}_R$ is generated must be understood as the system in which the energy conversion from electrical to thermal takes place, since no physical change in the resistor structure is considered here.
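As an illustration of Equation (1) and the Gouy–Stodola relation, the following Python sketch evaluates the entropy generation rate at the resistor and the entropy rate reaching the environment; the resistance, current, and temperatures are assumed values chosen only for the example.

```python
# Minimal sketch of Eq. (1) and the Gouy-Stodola relation for one resistor.
# All numerical values are assumed for illustration.
R = 100.0      # resistance [ohm]
I = 0.1        # current [A]
T_R = 350.0    # resistor temperature [K] (hot source)
T_env = 300.0  # environment temperature [K] (cold source)

V = I * R                    # Ohm's law
P_diss = I * V               # dissipated power [W]
S_dot_therm = P_diss / T_R   # Eq. (1): generation rate at the resistor [W/K]
# Eq. (1) plus the entropy generated while transferring heat to the environment:
S_dot_env = S_dot_therm + P_diss * (1.0 / T_env - 1.0 / T_R)

print(f"S_dot_therm = {S_dot_therm:.4e} W/K")
print(f"S_dot_env   = {S_dot_env:.4e} W/K (equals P_diss/T_env = {P_diss / T_env:.4e})")
```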

1.3. Literature Review of Thermal Entropy in Two Resistors

When two resistors are connected in parallel to a power source (Figure 1), a current divider is created.
The following derivation to find the electronic behaviour of the current divider from the minimisation of the entropy generation rate can be found in the literature [26,27]. The dissipated power Pdiss is given by the following:
$$P_{diss} = I_1 V + I_2 V$$
where V is the voltage drop across the two resistors. The parallel configuration introduces the following constraint:
$$I = I_1 + I_2$$
For a linear relationship between fluxes and gradients (Ohm’s law):
$$V = I_1 R_1 \quad \text{and} \quad V = I_2 R_2$$
And thus, the entropy generation rate is written as follows:
$$\dot{S}_{therm} = \frac{I_1 V + I_2 V}{T},$$
where T is assumed to be constant (an isothermal process with no temperature change at the resistors). For a steady state, it is claimed that the entropy generation rate is minimized, so its derivative with respect to I1 is set equal to zero:
$$\frac{d\dot{S}_{therm}}{dI_1} = \frac{1}{T}\frac{dP}{dI_1} = \frac{1}{T}\frac{d}{dI_1}\left(I_1 V + I_2 V\right) = \frac{1}{T}\frac{d}{dI_1}\left(I_1^2 R_1 + I_2^2 R_2\right) = \frac{1}{T}\frac{d}{dI_1}\left[I_1^2 R_1 + \left(I - I_1\right)^2 R_2\right] = \frac{1}{T}\left(2 I_1 R_1 + 2 I_1 R_2 - 2 I R_2\right) = 0$$
Hence,
$$I_1 R_1 + I_1 R_2 - I R_2 = 0$$
which leads to the expression of the current divider:
$$I_1 = \frac{R_2}{R_1 + R_2}\, I$$
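The minimisation in Equations (8)–(10) can be checked numerically. The sketch below scans I1 at constant temperature and compares the minimum of the entropy generation rate with the current divider of Equation (10); the resistor values, source current, and temperature are assumed.

```python
# Minimal sketch: minimising the isothermal entropy generation rate over I1
# reproduces the Kirchhoff current divider (Eqs. (8)-(10)). Values assumed.
R1, R2, I, T = 10.0, 40.0, 1.0, 300.0

def S_dot(I1):
    # Eq. (7) with I2 = I - I1 and Ohm's law
    return (I1**2 * R1 + (I - I1)**2 * R2) / T

I1_grid = [i / 10000.0 * I for i in range(10001)]   # brute-force scan of I1
I1_min = min(I1_grid, key=S_dot)
I1_divider = R2 / (R1 + R2) * I                     # Eq. (10)

print(f"I1 from entropy minimisation: {I1_min:.4f} A")
print(f"I1 from the current divider : {I1_divider:.4f} A")
```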
Another approach to entropy in parallel resistor circuits is based on variational analysis. A review shows that the maximum entropy production (MaxEP) and minimum entropy production (MinEP) principles can be applied [28] as long as the temperature is homogeneous [29]. Furthermore, Christen [27] pointed out the superiority of MaxEP, as it is applicable far from equilibrium, while MinEP is restricted to near equilibrium, although no direct application of MaxEP to the electrical network is presented. Yet, a dynamic approach based on the theorem of minimum entropy production in linear systems with inertial effects showed that entropy can be split into an excess entropy, related to dissipation, and a total entropy that includes the inertial effects [30].

1.4. Degradation

Entropy is also present in systems undergoing degradation. Degradation is the process in which the input energy in a system modifies the internal structure of the system. Basaran pioneered its introduction in solder joints [31,32] and developed the unified mechanics theory as an attempt to combine mechanics and thermodynamics into a single discipline. Naderi et al. [33] applied the theory to mechanical systems and, extending the application from mechanical to electrical systems, our group has focused on resistors, capacitors, LEDs, and batteries, along with energy degradation in terms of energy efficiency [34,35,36,37,38]. These studies only consider Stherm and describe degradation to failure as an increase in Stherm up to a threshold [39].

1.5. Literature Gap and Paper Structure

When we reviewed the literature, some concerns arose about the derivation of Equations (8)–(10), because the temperature is assumed to be the same in both resistors and, therefore, has no effect on the derivation. For the sake of illustration, we consider the counterexample in which two resistors R1 and R2 are at different temperatures, T1 and T2. The entropy generation rates at R1 and R2, respectively, are as follows:
$$\dot{S}_1 = \frac{I_1 V}{T_1} \quad \text{and} \quad \dot{S}_2 = \frac{I_2 V}{T_2}$$
Then, the total entropy generation rate is as follows:
$$\dot{S}_{therm} = \dot{S}_1 + \dot{S}_2 = \frac{I_1 V}{T_1} + \frac{I_2 V}{T_2}$$
and minimising the entropy generation rate:
$$\frac{d\dot{S}}{dI_1} = 0 = \frac{d}{dI_1}\left(\frac{I_1 V}{T_1} + \frac{I_2 V}{T_2}\right) = \frac{d}{dI_1}\left(\frac{I_1^2 R_1}{T_1} + \frac{\left(I - I_1\right)^2 R_2}{T_2}\right) = \frac{2 I_1 R_1}{T_1} + \frac{2 I_1 R_2 - 2 I R_2}{T_2}$$
Hence,
$$I_1 = I\,\frac{R_2/T_2}{R_1/T_1 + R_2/T_2}$$
The current depends on the temperature of each resistor. Therefore, only in the case that T1 = T2 do we recover the expression obtained from Kirchhoff’s law. The discrepancy between Equation (14) and Kirchhoff’s law was tested experimentally with two variable resistors adjusted to have the same resistance at different temperatures (i.e., 1 kΩ at 25 °C and 50 °C), finding that the current profile obeys Kirchhoff’s current law in Equation (10) and minimum energy dissipation [40], but not the entropy minimisation described by Equation (14).
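The discrepancy can be made explicit with a short sketch that contrasts the Kirchhoff divider of Equation (10) with the entropy-minimising split of Equation (14) for two equal resistances held at 25 °C and 50 °C; the resistance value and source current are assumed.

```python
# Minimal sketch: Kirchhoff's divider (Eq. (10)) vs. the entropy minimum
# (Eq. (14)) for equal resistances at different temperatures. Values assumed.
R1 = R2 = 1000.0                  # 1 kOhm at both temperatures
I, T1, T2 = 1.0, 298.15, 323.15   # source current [A]; 25 C and 50 C [K]

I1_kirchhoff = R2 / (R1 + R2) * I                   # Eq. (10)
I1_entropy = (R2 / T2) / (R1 / T1 + R2 / T2) * I    # Eq. (14)

print(f"Kirchhoff divider : I1 = {I1_kirchhoff:.4f} A")
print(f"Entropy minimum   : I1 = {I1_entropy:.4f} A")
# Experimentally the current follows Kirchhoff's law, not Eq. (14), which
# motivates accounting for the configurational entropy of the resistors.
```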
Thus, it seems clear from the literature that a deeper analysis of the entropy of electrical circuits is worthy of further investigation for a number of reasons. Firstly, electrical circuits are ubiquitous in modern technology. Secondly, constant temperature is a strong constraint; relaxing it could have powerful applications in, for example, battery management, where the role of entropy is usually ignored. Finally, the methodology can be straightforwardly generalised to other linear systems, as equivalent electrical circuits can be found in thermal management and biological applications [13], among others. Within this framework, we aim to fill the gap in the application of the fundamental second law of thermodynamics to electrical circuits. For this, we need to identify the behaviour of the circuit in terms of entropy. Thus, the objective of this manuscript is to evaluate the entropy of simple electronic circuits in order to elucidate the relevance of two different types of entropy: the network configuration, expressed in terms of Sconfig, and the energy conversion, expressed in terms of Stherm. The innovation of the manuscript is, thus, to investigate the entropy related to circuit design together with the entropy related to energy transformation in the same circuit and to find their correlations in different types of circuits. It will also be possible to infer the impact of circuits in non-equilibrium, that is, circuits in which different elements are at different temperatures.
The paper is organised as follows. First, we explain the methodology for calculating network entropy for circuits with resistors and capacitors; then, we present the results of the entropy correlation for parallel and series configurations with voltage and current sources. We continue with two tree-shaped networks and circuits with two sources to clarify the influence of element disposition on thermal dissipation. Finally, we evaluate the impact of the network configuration in time-dependent circuits, both for R-C configurations and for degradation.

2. Materials and Methods

As mentioned in the introduction, we are concerned with two types of entropy: Sconfig, related to the network configuration, and Stherm, related to energy conversion. Stherm is computed as described above in Equation (1) in terms of voltage, current, and temperature. As we focus only on steady-state circuits, we obtain $S_{therm} = \dot{S}_{therm}\,\Delta t$ and, unless stated otherwise, we take Δt = 1 s.
Sconfig is the new term we propose to characterize the circuit. We calculate it in terms of the probability assigned to each element of the network configuration in the analysis, which is described by Kirchhoff’s and Ohm’s laws. We describe the entropy calculation for two resistors in parallel, in the configuration shown in Figure 1. Applying Kirchhoff’s and Ohm’s laws, we can write the relationship between currents and voltages in terms of a network matrix:
$$\begin{pmatrix} V \\ V \end{pmatrix} = \begin{pmatrix} R_1 & 0 \\ 0 & R_2 \end{pmatrix} \begin{pmatrix} I_1 \\ I_2 \end{pmatrix}$$
The probabilistic interpretation of this expression in terms of conductance can be found in [9] but, for our purpose of relating it to thermal entropy, we retain the resistive form. We aim to calculate the entropy of this matrix. As the matrix is diagonal, we can use the Shannon entropy (see [41]):
$$S_{config} = -\sum_{i=1}^{n} p(x_i)\ln p(x_i)$$
Equation (16) thus provides information about the configuration of the system, which in this case has two degrees of freedom. Landauer proposed this approach, but he did not develop it [40]. In this expression, we consider that p(x) represents the probability of current flowing through one resistor. This is a slightly different interpretation of standard network entropy found in graph theory, which defines the probability of moving between nodes, whereas we define the probability in terms of energy distribution among the different network elements using Ohm’s law. Considering the two resistors of a current divider together, and assuming a constant temperature T, the result is as follows:
$$S_{config} = -\left(\frac{R_2}{R_1+R_2}\ln\frac{R_2}{R_1+R_2} + \frac{R_1}{R_1+R_2}\ln\frac{R_1}{R_1+R_2}\right)$$
It can be seen that the probability of current flowing through one resistor or the other is simply given by the current divider solution given in Equation (10). We consider this expression to describe the configurational entropy of a circuit.
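For reference, a minimal sketch of Equation (17) with arbitrary resistor values; the maximum, ln 2, occurs for equal resistors.

```python
# Minimal sketch of Eq. (17): configurational entropy of two parallel
# resistors, with probabilities taken from the current divider.
from math import log

def S_config_parallel(R1, R2):
    p1 = R2 / (R1 + R2)   # probability of current through R1
    p2 = R1 / (R1 + R2)   # probability of current through R2
    return -(p1 * log(p1) + p2 * log(p2))

print(S_config_parallel(1.0, 1.0))    # maximum, ln(2) ~ 0.69, for equal resistors
print(S_config_parallel(1.0, 10.0))   # decreases as the resistors differ
```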
Another typical resistor configuration is two resistors in series (see Figure 2). In this case, the circuit considered is a voltage divider instead of a current divider, defined by the following:
$$V_1 = \frac{R_1 V}{R_1+R_2}, \qquad V_2 = \frac{R_2 V}{R_1+R_2}$$
We want to determine Sconfig using the same procedure as that for parallel resistors. The matrix is again diagonal:
$$\begin{pmatrix} I \\ I \end{pmatrix} = \begin{pmatrix} R_1 & 0 \\ 0 & R_2 \end{pmatrix}^{-1} \begin{pmatrix} V_1 \\ V_2 \end{pmatrix}$$
Therefore,
$$S_{config} = -\left(\frac{R_1}{R_1+R_2}\ln\frac{R_1}{R_1+R_2} + \frac{R_2}{R_1+R_2}\ln\frac{R_2}{R_1+R_2}\right)$$
In this particular case, Sconfig is the same for both parallel and series resistors. We extend this analysis to R-C networks, which are less common. Following our considerations on Sconfig, we notice that this circuit (Figure 3) has only one degree of freedom in the DC analysis and the entropy is zero.
If E is time-dependent, e.g., a sinusoidal source in steady state, then the capacitor impedance is frequency-dependent, $Z_c = \frac{1}{i\omega C}$, which introduces a new degree of freedom. Accordingly, we can write Sconfig as follows:
$$S_{config} = -\left(\frac{R_1}{R_1+Z_c}\ln\frac{R_1}{R_1+Z_c} + \frac{Z_c}{R_1+Z_c}\ln\frac{Z_c}{R_1+Z_c}\right)$$
where $\ln(r\,e^{i\theta}) = \ln r + i\theta$ is the principal value of the logarithm. The use of complex Shannon entropy has already been studied [42,43].
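A minimal sketch of Equation (21) using the principal value of the complex logarithm (cmath.log); R1, C, and the frequencies are assumed values, and the modulus at the cut-off frequency reproduces the 1.132 value quoted in Section 3.5.

```python
# Minimal sketch of Eq. (21): complex configurational entropy of a series R-C
# branch, using the principal value ln(r e^{i*theta}) = ln r + i*theta.
import cmath

def S_config_rc(R1, C, omega):
    Zc = 1.0 / (1j * omega * C)     # capacitor impedance
    p1 = R1 / (R1 + Zc)             # complex "probabilities" from the divider
    p2 = Zc / (R1 + Zc)
    return -(p1 * cmath.log(p1) + p2 * cmath.log(p2))

R1, C = 1.0, 1.0                    # assumed values
for omega in (0.01, 1.0, 100.0):    # [rad/s]
    S = S_config_rc(R1, C, omega)
    print(f"omega = {omega:>6}: S_config = {S:.3f}, |S_config| = {abs(S):.3f}")
```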
To obtain Stherm for these circuits, we follow the description of the state of the art described in Equation (11). For capacitive networks, Stherm can be evaluated either in the time domain or in the frequency domain, taking advantage of the invariance of energy and entropy under time transformations. For simplicity, we consider a single sinusoidal excitation signal characterised by its root mean square voltage (Vrms).
For completeness, we also investigate the time dependence in systems undergoing degradation. In this case, we consider that the input electrical energy modifies one of the resistors, R2, due to either material fatigue or material breakdown. Thus, Sconfig is a function of time, and Stherm is obtained from the time integration of the input electrical power. We consider three different time-dependent functions for R2 degradation, described by: (i) a linear function,
$$R_{2deg1}(t) = R_2\, t \cdot \mathrm{Heaviside}\left(10 - t\right)$$
(ii) a quadratic function:
$$R_{2deg2}(t) = 0.1\, R_2\, t^2 \cdot \mathrm{Heaviside}\left(10 - t\right)$$
or (iii) an exponential function:
$$R_{2deg3}(t) = 0.001\, R_2\, e^{t} \cdot \mathrm{Heaviside}\left(10 - t\right)$$
which are commonly used in fatigue and breakdown studies. Each function describes material fatigue as a linear, quadratic, or exponential increase in Sconfig. In addition, we introduce an entropy threshold discontinuity with a Heaviside function, which describes structural breakdown in case the threshold is reached. The multiplying constants in R2deg2 and R2deg3 are chosen to keep the degradation in the same range for t = 10 s. We substitute R2deg for R2 in the entropy calculations in order to introduce time-dependent degradation in the resistor.
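A sketch of the three degradation profiles of Equations (22)–(24), with the Heaviside argument taken as (10 − t) so that the breakdown discontinuity appears at t = 10 s (an assumption consistent with Figure 18); R2 is normalized to 1 Ω.

```python
# Minimal sketch of Eqs. (22)-(24): linear, quadratic, and exponential
# degradation of R2, cut off by a Heaviside step at the breakdown time.
from math import exp

def heaviside(x):
    return 1.0 if x >= 0.0 else 0.0

def R2_deg(t, R2=1.0, mode="linear"):
    step = heaviside(10.0 - t)                 # breakdown threshold at t = 10 s
    if mode == "linear":
        return R2 * t * step                   # Eq. (22)
    if mode == "quadratic":
        return 0.1 * R2 * t**2 * step          # Eq. (23)
    if mode == "exponential":
        return 0.001 * R2 * exp(t) * step      # Eq. (24)
    raise ValueError(mode)

for mode in ("linear", "quadratic", "exponential"):
    print(mode, [round(R2_deg(t, mode=mode), 3) for t in (1.0, 5.0, 10.0, 11.0)])
```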

3. Results

We present the results for resistor networks, evaluating the relationship of Sconfig and Stherm and relating both types of entropy. We conclude the analysis with two types of time-dependent circuits, R-C circuits and systems with degradation.

3.1. Entropy in Two Parallel Resistors

We evaluate Sconfig for different ratios of resistor pairs (Figure 4) using Equation (17); the maximum entropy is found when the resistors are equal and decreases as their difference increases. Note that the graph is symmetrical with respect to the maximum found for R2/R1 = 1. Once Sconfig is characterised, we evaluate how energy is transferred through this system. The associated dissipation considering E as a current source is given by
$$P_{diss}^{I} = I_{R_1}^2 R_1 + I_{R_2}^2 R_2 = I^2\frac{R_1 R_2}{R_1+R_2} = I^2 R_{eq},$$
where Req = R1R2/(R1 + R2). For a voltage source
$$P_{diss}^{V} = \frac{V^2}{R_{eq}}$$
For both sources, the generated thermodynamic entropy rate is as follows:
$$\dot{S}_{therm} = \frac{P_{diss}}{T}$$
If we assume a constant temperature T and steady state, as in electronic circuit analysis, for a fixed time we obtain $S_{therm} \propto P_{diss}$. We plot this in Figure 4. From now on, we use a normalized current source of 1 A, a voltage source of 1 V, and an integration time of 1 s in all subsequent calculations (in steady state, $S_{therm} = \dot{S}_{therm}\cdot t$). We would like to point out that Stherm describes the conversion of electrical energy into thermal energy, whereas Sconfig describes the arrangement of the resistors and, thus, they are different.
It is worth plotting the relationship between the two types of entropy for this circuit (illustrated in Figure 5) for a normalized R1 = 1 Ω by sweeping R2 over R1 ≤ R2 ≤ ∞. We consider Sconfig dimensionless and Stherm with units of J K−1; using a common unit would merely scale one axis. Notice that the maximum Sconfig is the same as inferred from Equations (17) and (20). In the limit of R2 → ∞, according to Equations (25) and (26), Stherm converges to the same value for both sources. The interest of this result lies in the fact that Stherm increases with Sconfig for a voltage source but decreases for a current source, i.e., the gradient (V) dissipates more energy as the difference between resistors decreases, whereas the flow accommodates better as the difference between resistors increases. This relationship between the design of the circuit and the energy dissipation in the circuit seems to be useful in other fields: it could help to improve energy efficiency or adiabatic computing in electrical circuits or, in a more general approach, it might justify the constructal law [25,44]. We discuss these possibilities and further implications in Section 4.
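The sweeps behind Figures 4 and 5 can be reproduced with a few lines. The sketch below evaluates Equation (17) together with Equations (25) and (26) for the normalized sources; the temperature of 300 K and the integration time of 1 s are the values assumed throughout the text.

```python
# Minimal sketch of Figures 4-5: S_config (Eq. (17)) and S_therm for a 1 A
# current source (Eq. (25)) and a 1 V voltage source (Eq. (26)).
from math import log

R1, T, I_src, V_src, dt = 1.0, 300.0, 1.0, 1.0, 1.0   # normalized values

def entropies(R2):
    p1, p2 = R2 / (R1 + R2), R1 / (R1 + R2)
    S_config = -(p1 * log(p1) + p2 * log(p2))          # Eq. (17)
    R_eq = R1 * R2 / (R1 + R2)
    S_therm_I = I_src**2 * R_eq / T * dt               # from Eq. (25)
    S_therm_V = V_src**2 / R_eq / T * dt               # from Eq. (26)
    return S_config, S_therm_I, S_therm_V

for R2 in (0.1, 0.5, 1.0, 2.0, 10.0, 100.0):
    Sc, StI, StV = entropies(R2)
    print(f"R2 = {R2:>6}: S_config = {Sc:.3f}, "
          f"S_therm(I) = {StI:.4e}, S_therm(V) = {StV:.4e} J/K")
```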
Having introduced the relationship between configurational and thermal entropy, we can consider the case of two resistors at different temperatures, which was the main motivation for this study. The electrical properties of materials are temperature dependent. We consider standard resistors whose resistance depends on temperature as follows:
$$R_2^T(T) = R_2\left[1 + \alpha\left(T_2 - T_1\right)\right]$$
where α is the temperature coefficient of resistance; this linear approximation is valid for small temperature variations. The current divider becomes temperature-dependent at steady state:
$$I_1 = I\,\frac{R_2^T}{R_1 + R_2^T}$$
$$I_2 = I\,\frac{R_1}{R_1 + R_2^T}$$
The temperature coefficient can be either positive (for most metals) or negative (for germanium and carbon, for instance). In Figure 6, we plot the currents for the reference configuration (α = 0 and constant temperature), temperature-dependent resistors with α = 0.004 following the resistance–temperature relationship above, temperature-dependent resistors with α = −0.0005, and temperature-independent resistors (described by Equation (14)). All examples are calculated with T1 = 300 K and T2 = 400 K. It can be seen that the curve for positive α lies below the reference curve, as expected from the increase in resistance with temperature, whereas for negative α it lies above it. With regard to the case described by Equation (14), we note that it lies above the reference case, which is impossible, meaning that it is not possible to change the temperature of the material without changing its configurational properties. Therefore, in order to minimise the entropy generation, we have to consider the change in Sconfig due to the thermal variation. This would amount to a kind of Maxwell’s demon [45]: α = 0 means that we have a system (the resistors) that converts energy from electricity to heat without modifying the circuit, i.e., without any change in the entropy of the system, thereby violating the second law of thermodynamics.
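A sketch of the comparison plotted in Figure 6: the current through R2 for the reference case, for temperature-dependent resistances with positive and negative α, and for the temperature-independent entropy minimum of Equation (14); the numerical values follow the text.

```python
# Minimal sketch of Figure 6: current through R2 under different assumptions.
R1, I, T1, T2 = 1.0, 1.0, 300.0, 400.0

def I2_reference(R2):                     # alpha = 0, constant temperature
    return I * R1 / (R1 + R2)

def I2_temperature(R2, alpha):            # R2 follows the R(T) relation above
    R2T = R2 * (1.0 + alpha * (T2 - T1))
    return I * R1 / (R1 + R2T)

def I2_entropy_minimum(R2):               # Eq. (14), temperature-independent R
    I1 = (R2 / T2) / (R1 / T1 + R2 / T2) * I
    return I - I1

for R2 in (0.5, 1.0, 2.0):
    print(R2,
          round(I2_reference(R2), 4),
          round(I2_temperature(R2, 0.004), 4),
          round(I2_temperature(R2, -0.0005), 4),
          round(I2_entropy_minimum(R2), 4))
```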
As the resistance changes due to the temperature difference, Sconfig is modified accordingly:
$$S_{config} = -\left(\frac{R_2^T}{R_1+R_2^T}\ln\frac{R_2^T}{R_1+R_2^T} + \frac{R_1}{R_1+R_2^T}\ln\frac{R_1}{R_1+R_2^T}\right)$$
The difference between the two cases is depicted in Figure 7. The overall magnitude does not change because we are plotting the normalized R2/R1, but the peak is shifted as follows:
$$\frac{R_2}{R_1} = \frac{1}{1 + \alpha\left(T_2 - T_1\right)}$$
Similar behaviour is found for Stherm (Figure 8), where we considered either a normalized current source of 1 A or a normalized voltage source of 1 V to estimate Stherm.
Finally, we evaluate Stherm with respect to Sconfig for the reference case and the temperature-dependent case (see Figure 9). The temperature difference modifies the internal resistance, which implies that both Sconfig and Stherm are modified simultaneously. The difference between the curves is related to the entropy of the resistor, both configurational and thermal, due to the structural changes associated with the resistance variation and due to the different heat dissipations, respectively. As shown in Figure 6, it is not possible to modify only one type of entropy: if there is a change in the thermal dissipation, it must be due to a change in the material properties of the resistor. We also point out that this behaviour is the same for both positive and negative α and is symmetrical for a voltage source, as in Figure 5.
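The shift of the Sconfig peak can be verified numerically: the sketch below sweeps R2/R1 with the temperature-dependent resistance and compares the location of the maximum with the closed-form expression above (α, T1, and T2 as in the text).

```python
# Minimal sketch: numerical location of the S_config maximum when R2 depends
# on temperature, compared with R2/R1 = 1/(1 + alpha*(T2 - T1)).
from math import log

R1, alpha, T1, T2 = 1.0, 0.004, 300.0, 400.0

def S_config_T(R2):
    R2T = R2 * (1.0 + alpha * (T2 - T1))
    p1, p2 = R2T / (R1 + R2T), R1 / (R1 + R2T)
    return -(p1 * log(p1) + p2 * log(p2))

ratios = [i / 10000.0 for i in range(1, 30001)]     # sweep R2/R1 up to 3
peak = max(ratios, key=S_config_T)
print(f"numerical peak at R2/R1 = {peak:.4f}")
print(f"closed-form prediction  = {1.0 / (1.0 + alpha * (T2 - T1)):.4f}")
```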

3.2. Entropy in Series Resistors

We carry out the same procedures as for a parallel configuration in a series configuration (see circuit in Figure 2). To find $\dot{S}_{therm}$ for a voltage source, we can write the following:
$$\dot{S}_{therm} = \frac{P_{diss}}{T} = \frac{1}{T}\left[\frac{1}{R_1}\left(\frac{R_1 V}{R_1+R_2}\right)^2 + \frac{1}{R_2}\left(\frac{R_2 V}{R_1+R_2}\right)^2\right] = \frac{1}{T}\,\frac{V^2}{R_1+R_2}$$
And for a current source, as follows:
$$\dot{S}_{therm} = \frac{P_{diss}}{T} = \frac{1}{T}\,I^2\left(R_1+R_2\right)$$
The relationship between Sconfig and Stherm for series resistors is illustrated in Figure 10, as we did for parallel resistors in Figure 5. For a voltage source, we observe an increase of thermal dissipation with Sconfig. The point of maximum Sconfig corresponds to R1 = R2. It is worth noting the relationship with the principle of maximum power transfer in a voltage divider, which occurs for R1 = R2. Maximum power transfer occurs when both types of entropy are at their maximum, i.e., when the difference between resistors is null and the heat dissipation is also at its maximum.
Now, consider the maximum power in the case of a current source. We can transform the circuit in Figure 2 using the Norton equivalent to obtain a current source of magnitude I = V/R1. In this case,
$$P_{diss}^{I} = I^2\frac{R_1 R_2}{R_1+R_2} = \left(\frac{V}{R_1}\right)^2\frac{R_1 R_2}{R_1+R_2} = \frac{V^2 R_2}{R_1}\,\frac{1}{R_1+R_2}$$
which shows the validity of the maximum power transfer theorem for current sources. Note that Equations (34) and (25) are different and only equal in the case where R1 = R2, the limit case. Note, also, that this case is different from the case depicted in Figure 5, where the current is I and, here, it is V/R1.
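A sketch of the series-resistor case: Sconfig from Equation (20), the thermal entropy rate for the voltage source, and the dissipation of the Norton-transformed circuit; R1, V, and T are assumed (normalized) values.

```python
# Minimal sketch of Section 3.2: series resistors with a voltage source and
# the Norton-equivalent current source I = V/R1. Values assumed.
from math import log

R1, V, T = 1.0, 1.0, 300.0

def series_case(R2):
    p1, p2 = R1 / (R1 + R2), R2 / (R1 + R2)
    S_config = -(p1 * log(p1) + p2 * log(p2))        # Eq. (20)
    S_dot_V = V**2 / (R1 + R2) / T                   # voltage source
    I_norton = V / R1                                # Norton equivalent source
    S_dot_N = I_norton**2 * R1 * R2 / (R1 + R2) / T  # Norton-transformed circuit
    return S_config, S_dot_V, S_dot_N

for R2 in (0.2, 1.0, 5.0):
    Sc, SV, SN = series_case(R2)
    print(f"R2 = {R2}: S_config = {Sc:.3f}, "
          f"S_dot(V) = {SV:.4e}, S_dot(Norton) = {SN:.4e} W/K")
# S_config peaks at R1 = R2, where the power delivered to R2 is maximal
# (maximum power transfer theorem).
```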
Finally, the temperature dependence for a series network, as illustrated in Figure 11, produces a change in the resistor according to Equation (31); the curve is slightly shifted, similarly to the behaviour of the parallel network depicted in Figure 9.

3.3. Tree Shape Networks

Previous results, as shown in Figure 5, suggested a possible justification of the constructal law. To further investigate this possibility, we analyse typical structures proposed in the constructal law literature, such as river basins and capillary networks [25]. We consider two tree-shaped networks, with three and seven elements, respectively, and compute Sconfig and Stherm. The network structure and entropy results are given in Figure 12 for three elements and in Figure 13 for seven elements. We consider a voltage source of 1 V and compare R3 = 1 Ω and R3 = 10 Ω in both cases. The arrows shown in Figure 12d point to the maximum Sconfig. The physical interpretation is given in Section 4.

3.4. Circuits with More than One Source

Once we have evaluated circuits with different numbers of resistors, we now study circuits with more than one power source. We add a voltage source V2 in series with R1, with E as the main voltage source (see Figure 14a). For Stherm, we calculate the power dissipated in R1 and R2 by V1 and add the power dissipated due to V2.
$$P_2 = \frac{V_1^2}{R_2}$$
$$P_1 = \frac{\left(V_1 - V_2\right)^2}{R_1}$$
$$P_T = P_1 + P_2$$
The relevant question here is how to analyse Sconfig. When we studied the current and voltage dividers, Sconfig was given by the relationship between resistors. However, Shannon’s entropy relates to the probability of electrical-to-thermal dissipation in the circuit. Thus, it depends on the different sources and resistors, so we consider the probability as the ratio of the power dissipated at each resistor to the total power injected into the system. For the circuit depicted in Figure 14, substituting p(x) = Pi/PT in Equation (16), we can write the following:
$$S_{config} = -\sum_{i=1}^{n} p(x_i)\ln p(x_i) = -\left(\frac{P_1}{P_T}\ln\frac{P_1}{P_T} + \frac{P_2}{P_T}\ln\frac{P_2}{P_T}\right)$$
which is a generalization of Equations (17) and (20).
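A minimal sketch of this generalization, taking the power fraction dissipated in each resistor as the probability in Equation (16); V1 and V2 follow Figure 14, while the resistor values are assumed.

```python
# Minimal sketch: S_config for the two-source circuit of Figure 14, using
# p_i = P_i / P_T in Eq. (16). V1, V2 as in Figure 14; R1, R2 assumed.
from math import log

V1, V2 = 1.0, 2.0

def S_config_two_sources(R1, R2):
    P2 = V1**2 / R2             # power dissipated in R2
    P1 = (V1 - V2)**2 / R1      # power dissipated in R1
    PT = P1 + P2                # total injected power
    p1, p2 = P1 / PT, P2 / PT
    return -(p1 * log(p1) + p2 * log(p2))

for R1 in (1.0, 10.0):
    for R2 in (0.5, 1.0, 5.0):
        print(f"R1 = {R1}, R2 = {R2}: S_config = {S_config_two_sources(R1, R2):.3f}")
```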

3.5. Time Dependent Entropy in an R-C System

We describe the results for a resistor in series with a capacitor (see the circuit in Figure 3). Following the same methodology described for resistors, we evaluate the configurational and thermal entropy. Thus, we briefly describe the behaviour of the entropy in terms of the degrees of freedom of the circuit, R, C, and ω, illustrated in Figure 15 as a function of frequency and as a Nyquist plot (Figure 16). We can see that the maximum configurational entropy is found at the typical cut-off frequency, where $\omega = 1/(RC)$ (i.e., $f = 1/(2\pi RC)$), and is equal to 1.132. Also, in the Nyquist plot, we find that the maximum entropy corresponds to the only real point in the plot. Both results are obtained from Equation (21). Moreover, in the limits of low and high frequencies, Sconfig tends to 0, which is reasonable since the capacitor is either a short circuit or an open circuit and, thus, we have only one degree of freedom and the entropy is null. It should be remembered that, at the cut-off frequency, the impedance moduli of the resistor and the capacitor are equal, which is consistent with the previous result of finding the maximum power transfer at the maximum configurational entropy.
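The location and value of the maximum of |Sconfig| can be checked with a frequency sweep of Equation (21), with R = 1 Ω and C = 1 F as in Figures 15 and 16.

```python
# Minimal sketch of Figures 15-16: modulus of the complex S_config versus
# angular frequency for a series R-C branch; the maximum (about 1.13) occurs
# where |Zc| = R, i.e., at omega = 1/(R*C).
import cmath

R, C = 1.0, 1.0

def S_config(omega):
    Zc = 1.0 / (1j * omega * C)
    p1, p2 = R / (R + Zc), Zc / (R + Zc)
    return -(p1 * cmath.log(p1) + p2 * cmath.log(p2))

omegas = [10 ** (k / 100.0) for k in range(-300, 301)]    # 1e-3 ... 1e3 rad/s
best = max(omegas, key=lambda w: abs(S_config(w)))
print(f"maximum |S_config| = {abs(S_config(best)):.3f} at omega = {best:.3f} rad/s")
print(f"1/(R*C)            = {1.0 / (R * C):.3f} rad/s")
```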
Once we have presented the configurational entropy, we analyse the relationship between circuit configuration and energy conversion (Figure 17). First, we consider E to be a constant voltage source. The energy stored in the capacitor when fully charged is $E_c = \frac{1}{2}CV^2$. Integrating over the entire charging process, the energy lost at the resistor is $E_R = \frac{1}{2}CV^2$, which is independent of R and of the frequency and, thus, Sconfig is null.
Secondly, we consider E as a sinusoidal source. In this case, the power dissipated at the resistor depends on the root mean square voltage Vrms, where E is the amplitude of the sinusoid:
$$V_{rms\_R} = \frac{R}{R+Z_c}\,\frac{E}{\sqrt{2}}$$
And the generated entropy is as follows:
$$\dot{S}_{therm} = \frac{I_{rms}\,V_{rms\_R}}{T}$$
The relationship between Sconfig and Stherm (integration of $\dot{S}_{therm}$ over time), as depicted in Figure 17, illustrates a similar behaviour to that found for resistors, justifying the feasibility of applying the method to linear systems in general.
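A sketch of the thermal side of the sinusoidal R-C case; the modulus of the voltage divider is used to obtain the rms voltage across R (an assumption, since only the rms magnitude enters the dissipation), and E, R, C, and T are assumed values.

```python
# Minimal sketch: entropy generation rate at R for a sinusoidal source of
# amplitude E driving a series R-C branch. Values assumed.
import cmath
from math import sqrt

E, R, C, T = 1.0, 1.0, 1.0, 300.0

def S_dot_therm(omega):
    Zc = 1.0 / (1j * omega * C)
    V_rms_R = abs(R / (R + Zc)) * E / sqrt(2.0)   # rms voltage across R
    I_rms = V_rms_R / R                           # rms current through R
    return I_rms * V_rms_R / T                    # dissipated power over T

for omega in (0.1, 1.0, 10.0):
    print(f"omega = {omega}: S_dot_therm = {S_dot_therm(omega):.4e} W/K")
```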

3.6. Time-Dependent Entropy for Degradation

We consider the degradation functions proposed in Section 2 to study the effect of the degradation of a resistor in a parallel resistor network (Figure 18a) as a function of time, in order to obtain Sconfig(t) (Figure 18b), Stherm(t) (Figure 18c), and their relationship (Figure 18d). The abrupt change in behaviour at t = 10 s is due to the Heaviside function, representing the resistor breakdown. The different degradation mechanisms lead to different Sconfig–Stherm relationships, showing that the method is valuable for investigating systems that evolve with time. Moreover, it may be significant that the entropy generation rate $\dot{S}_{therm}$ is independent of the degradation mechanism, as long as only Sconfig and $\dot{S}_{therm}$ are involved in the degradation process due to R2, but this deserves further investigation.

4. Discussion

So far, we have obtained valuable results that are worth discussing: the entropy relation, the effect on constructal law and the theorem of maximum power transfer, and the effect of time on entropy in R-C circuits and degradation.
We can begin with entropy itself. In the literature, we found a discussion between Claude Shannon and John Von Neumann about entropy. C. Shannon said,
My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage’.
Whether the story is true or not, it is certain that the concept of entropy still generates some debate on its interpretation and, as a physical magnitude and a mathematical property, it is worth devoting more research effort to it. We use the quotation to point out that they were probably both right. Entropy is a mathematical function that can be applied in different disciplines, but it has a different physical meaning in each problem. As mentioned above, Jaynes warns us against confusing information entropy and thermodynamic entropy unless we are dealing with thermodynamic problems. So, what we have tried to investigate in this paper is not whether they are the same thing, but how they are related, as there are two types of entropy describing different characteristics of the system: this is the purpose of relating Sconfig to Stherm. This correlation is possible as a consequence of Ohm’s law, which relates a flow, i.e., the current, and a gradient, i.e., the voltage, to the entropy generation. This type of relationship is the basis of irreversible thermodynamics and, from this approach, the second law is introduced into circuit analysis. Once we have introduced entropy, the probabilistic approach to the circuit description is easier to interpret. Thus, Kirchhoff’s laws guarantee charge and energy conservation, whereas Ohm’s law allows for the introduction of the second law. Hence, Stherm is clearly related to energy conversion, i.e., when energy enters the circuit, it is converted into heat. Sconfig is related to the network structure, and is either decided by us, i.e., we decide whether to design a series or a parallel circuit, or it evolves naturally over time due to degradation induced by energy dissipation. It is important to note that, if the R2/R1 ratio changes due to energy dissipation, we have a causal relationship between Sconfig and Stherm, as illustrated in Figure 5. Conversely, if we change the ratio arbitrarily or manually, there is a correlation, but not a causal relationship, between Sconfig and Stherm. The same happens with temperature, and here we find the origin of the problem inferred from Equation (14): the change in the entropy of the resistor (the system) was not previously considered when its temperature changed. Taking this into account, the Sconfig–Stherm relationship clarifies that entropy is a state function, composed of Sconfig and Stherm, for the network system, but the entropy change of the universe must also be considered to account for the change in the network.
To our knowledge, the different entropies of a complex system are not usually correlated; thus, we investigated how energy is transformed in a linear system as a function of the network configuration. On the one hand, Sconfig was derived from a probability defined on the electrical circuit; on the other hand, Stherm is inferred from thermodynamic relations. A consequence of deriving entropy from a probability is that we can deal with both causal and non-causal systems. Obviously, Stherm is causal because it is governed by the laws of thermodynamics. But Sconfig is not causal; the network change is decided by the design of the electric circuit. In probability, if the system is not causal, the probability spaces are likely to be independent in many cases and, thus, the probabilities can be computed as independent. In our opinion, this approach allows an interdisciplinary treatment of complex problems, as it allows us to deal with different properties of the system using the same magnitude. For instance, the studied Sconfig and Stherm are independent as long as Sconfig is not modified by the same energy. This explanation is consistent with the constructal law, as we proposed in Section 3, which states that “for a finite size flow to persist in time (to survive) it must evolve in such a way that it provides easier and easier access to the currents that flow through it” [25]. Several examples of internal organization have been proposed, such as river basins, city streets, and railway maps [25]. To our knowledge, the law has not been illustrated using this approach, i.e., the correlation between a geometrical entropy and a flow entropy, which can help to clarify its understanding. For a better understanding, we considered the typical examples of this law, such as the tree-shaped structures found in rivers or blood capillaries. To gain deeper insight into this result, we obtained the entropy correlation for tree-shaped structures with three and seven elements (see Figure 12 for three elements and Figure 13 for seven elements). In these networks, the symmetry of the resistor configuration is broken and, thus, Sconfig(R2/R1) is no longer symmetric. It is important to note that the maximum of Sconfig corresponds to an extremum of Stherm. As indicated by Bejan [25], the design of the structure organizes itself to fit the flows. In other words, from our results, the organisation obeys a causal relationship between the energy involved in the process and the final network structure. From Bejan’s definition, we had understood that the design configured the energy, i.e., Sconfig–Stherm, but it seems more convenient to plot Stherm–Sconfig. For instance, replotting Figure 18d with the axes exchanged, it is easy to write a function Sconfig = f(Stherm), as illustrated in Figure 19, and find the maximum as follows:
$$\frac{dS_{config}}{dS_{therm}} = 0$$
The correlation between both types of entropy leads to a maximum in which the constructal law is satisfied. This approach therefore opens up the possibility of exploring a law for the dual (gradients), which has not been considered before, since gradients behave similarly to flows, as well as exploring causal and non-causal interdisciplinary problems.
Continuing with the constructal law applied to resistors, we can also interpret that Equation (17) depends on geometry, if we consider that the resistance is given by R = ρl/A, where ρ is the resistivity, l is the length, and A is the cross-sectional area of the resistor. This agrees with the relationship between geometric design and energy dissipation given in the law.
Another interesting result concerns the theorem of maximum power transfer. This theorem is traditionally derived from an extremisation of the power transferred. In Figure 10, we found that the extremal point corresponds to the theorem. It is not surprising that this theorem depends simultaneously on the network and thermal entropy, since its classical derivation extremises the power of the network. However, using our approach Sconfig = f(Stherm), we can gain a better understanding of the power transfer not only at the extremal point but as a function of the network configuration, allowing further predictions of circuit performance.
Regarding the increase in information with more resistors or power sources, we can point out that changing the power sources affects the energy conversion but not the network configuration and, thus, while Stherm changes, Sconfig remains the same. On the contrary, increasing the number of resistors in the network introduces an entropy term for each resistor, thus changing Sconfig. This is an interesting issue that we can discuss using the three-resistor circuit illustrated in Figure 12a. From an electrical point of view, we can combine the resistors in parallel and in series to find the equivalent resistor. The dissipated power will be the same whether we consider R1, R2, and R3 separately, R1 and R2//R3, or Req = R1 + R2//R3, where // means connected in parallel. However, the entropies associated with each case, assuming all resistors with R = 1 Ω, are 0.9634, 0.6365, and 0, respectively. Obviously, the entropy decreases as the circuit simplifies.
Time-dependent results are also interesting. On the one hand, there are the degradation studies; on the other hand, there is the complex entropy in R-C systems. Degradation is a causal constraint of network modification due to energy dissipation, which can be described by Stherm. We proposed three arbitrary time-dependent curves. Previously, we had already investigated the physical degradation of resistors [34] and capacitors [35] in terms of $\dot{S}_{therm}$. In degradation analysis, it is usually stated that breakdown takes place when an entropy threshold is reached; that is, the system cannot hold more thermal entropy [39]. In fact, during our degradation experiments in [34], we pointed out that the thermal entropy threshold could explain either fatigue wear or breakdown in the resistor. However, as these two degradation mechanisms modify the internal structure of the material, it seems reasonable to investigate them in terms of Sconfig. The results in Section 3.6 illustrate this approach, i.e., the material degradation leads to a resistance variation, which leads to a change in Sconfig or Stherm, as illustrated in Figure 18. An interesting result, which could be worthy of further research, is that the relationship between Sconfig and $\dot{S}_{therm}$ is invariant to the degradation mechanism, as long as these are the only types of entropy involved in the process. However, Sconfig and Stherm do depend on the degradation mechanism. Both results can be useful for degradation characterisation in order to complement the approaches given in [34,39].
The presence of imaginary entropy in an R-C configuration is related to the time-dependent behaviour of the capacitance. However, it is striking that, when the real part is maximum and the imaginary part is zero, the entropy is maximum, which also corresponds to equal impedance moduli of the resistor and the capacitor at the point of maximum power transfer. This behaviour could be further investigated in terms of active and reactive power.
To conclude this section, after discussing the theoretical implications of entropy correlations, we suggest possible practical applications that could benefit from these results. First, electrical circuit analysis can benefit from an additional criterion to improve energy efficiency, which can be of interest in the field of adiabatic computing. Moreover, for our interest, it may be useful in microgrid optimization or in battery performance analyses, which are usually described using lumped models in non-equilibrium conditions. The introduction of both types of entropy may help to better understand battery performance when degradation processes take place. Finally, we have correlated Sconfig and Stherm in electrical circuits, which are linear systems. We expect that this analysis can be useful in other linear systems, either in mechanics or in fluid applications.

5. Conclusions

In conclusion, the thermodynamic analysis of a simple electric circuit has yielded valuable conclusions, as we have clarified relationships between types of entropy. Firstly, we have related two different types of entropy that describe two different characteristics of the system. In particular, the relationship between configurational entropy and thermal entropy in different linear electric circuits provides a clear illustration of how energy transformation is related to the structure of the system upon which it is transformed. Secondly, the relationship between current flow and geometric design is highlighted as an explanation for the constructal law, which describes how a system must change to accommodate an entropic change. Thirdly, the change in configurational entropy due to a temperature difference is demonstrated to justify the thermal dissipation of the electric circuit. Fourthly, it is shown that the maximum power transfer theorem satisfies an entropy maximum, both for thermal and configurational entropy. Finally, the methodology was extended to R-C systems, resulting in a dependence similar to that observed in resistors. It is anticipated that these results, derived from simple examples, can be generalized to more complex systems.

Author Contributions

Conceptualization, A.C.; methodology, A.C.; software, A.C.; validation, A.C.; formal analysis, A.C. and V.J.O.; investigation, A.C.; data curation, A.C.; writing—original draft preparation, A.C.; writing—review and editing, V.J.O. and H.M.-G.; funding acquisition, H.M.-G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Spanish Ministerio de Ciencia, Innovación y Universidades (MICINN) and Agencia Estatal de Investigación (AEI) under projects PID2022-138631OB-I00 and PID2022-139479OB-C21 (MICIU/AEI/10.13039/501100011033 and “ERDF/EU”).

Data Availability Statement

Dataset available on request from the authors.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of this study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

1. Bejan, A. Thermodynamics today. Energy 2018, 160, 1208–1219.
2. Harsha, N.R.S.; Prakash, A.; Kothari, D.P. The Foundations of Electric Circuit Theory; IOP Publishing Ltd.: Bristol, UK, 2016; ISBN 9780750312660.
3. Županović, P.; Juretić, D.; Botrić, S. Kirchhoff’s loop law and the maximum entropy production principle. Phys. Rev. E Stat. Nonlinear Soft Matter Phys. 2004, 70, 056108.
4. Martyushev, L.M.; Seleznev, V.D. Maximum entropy production principle in physics, chemistry and biology. Phys. Rep. 2006, 426, 1–45.
5. Feynman, R.P.; Leighton, R.B.; Sands, M.L. The Feynman Lectures on Physics; Basic Books: New York, NY, USA, 2010; ISBN 9780465023820.
6. Miranda, E.N.; Nikolskaia, S. Producción de entropía en circuitos eléctricos sencillos (Entropy production by simple electrical circuits). arXiv 2004, arXiv:1210.0850.
7. Ng, S.K. Information and system modelling. Math. Comput. Model. 1996, 23, 1–15.
8. Jaynes, E.T. Probability Theory: The Logic of Science; Bretthorst, G.L., Ed.; Cambridge University Press: Cambridge, UK, 2003; ISBN 0521592712.
9. Doyle, P.G.; Snell, J.L. Random Walks and Electric Networks; The Mathematical Association of America, Inc.: Washington, DC, USA, 1984.
10. Roy, R. Ohm’s Law, Kirchoff’s Law and the Drunkard’s Walk 2. The Drunkard’s Walk. Resonance 1997, 2, 33–38.
11. Roy, R. Ohm’s law, Kirchoff’s law and the Drunkard’s Walk: 1. Related electrical networks. Resonance 1997, 2, 36–47.
12. Nachmias, A. Random Walks and Electric Networks. In Planar Maps, Random Walks and Circle Packing; Lecture Notes in Mathematics; Springer: Cham, Switzerland, 2020; Volume 2243, pp. 11–31.
13. Perelson, A.S. Network thermodynamics. An overview. Biophys. J. 1975, 15, 667–685.
14. Li, X.; Wei, M. Graph Entropy: Recent Results and Perspectives. Math. Found. Appl. Graph Entropy 2016, 6, 133–182.
15. Dehmer, M.; Mowshowitz, A. A history of graph entropy measures. Inf. Sci. 2011, 181, 57–78.
16. Anand, K.; Bianconi, G. Entropy measures for networks: Toward an information theory of complex topologies. Phys. Rev. E Stat. Nonlinear Soft Matter Phys. 2009, 80, 045102.
17. Caravelli, F. Trajectories entropy in dynamical graphs with memory. Front. Robot. AI 2016, 3, 179782.
18. Liu, C.; Ma, Y.; Zhao, J.; Nussinov, R.; Zhang, Y.C.; Cheng, F.; Zhang, Z.K. Computational network biology: Data, models, and applications. Phys. Rep. 2020, 846, 1–66.
19. Zhu, J.; Wei, D. Analysis of stock market based on visibility graph and structure entropy. Phys. A Stat. Mech. Its Appl. 2021, 576, 126036.
20. Gupta, S.; Yadav, V.K.; Singh, M. Optimal Allocation of Capacitors in Radial Distribution Networks Using Shannon’s Entropy. IEEE Trans. Power Deliv. 2022, 37, 2245–2255.
21. Cheng, L.; Bi, C.; Kang, Q.; He, J.; Ma, X.; Zhao, L. Study on Entropy Characteristics of Buck-Boost Converter with Switched Capacitor Network. In Proceedings of the 2021 IEEE/IAS Industrial and Commercial Power System Asia (I&CPS Asia), Chengdu, China, 18–21 July 2021; pp. 136–139.
22. Freitas, N.; Delvenne, J.C.; Esposito, M. Stochastic and Quantum Thermodynamics of Driven RLC Networks. Phys. Rev. X 2020, 10, 31005.
23. Lebon, G.; Jou, D.; Casas-Vázquez, J. Understanding Non-Equilibrium Thermodynamics: Foundations, Applications, Frontiers; Springer: Berlin/Heidelberg, Germany, 2008; ISBN 9783540742517.
24. Kondepudi, D.; Prigogine, I. Modern Thermodynamics: From Heat Engines to Dissipative Structures; John Wiley: Hoboken, NJ, USA, 1998.
25. Bejan, A. Advanced Engineering Thermodynamics; Wiley: Hoboken, NJ, USA, 2006; Volume 2.
26. Herrmann, F. Simple examples of the theorem of minimum entropy production. Eur. J. Phys. 1986, 7, 130–131.
27. Christen, T. Application of the maximum entropy production principle to electrical systems. J. Phys. D Appl. Phys. 2006, 39, 4497–4503.
28. Sree Harsha, N.R. A review of the variational methods for solving DC circuits. Eur. J. Phys. 2019, 40, 033001.
29. Bruers, S.; Maes, C.; Netočný, K. On the validity of entropy production principles for linear electrical circuits. J. Stat. Phys. 2007, 129, 725–740.
30. Rebhan, E. Generalizations of the theorem of minimum entropy production to linear systems involving inertia. Phys. Rev. A 1985, 32, 581–589.
31. Basaran, C.; Yan, C. Damage Mechanics of Solder Joints. J. Electron. Packag. 1998, 120, 379–384.
32. Basaran, C.; Nie, S. A thermodynamics based damage mechanics model for particulate composites. Int. J. Solids Struct. 2007, 44, 1099–1114.
33. Naderi, M.; Amiri, M.; Khonsari, M.M. On the thermodynamic entropy of fatigue fracture. Proc. R. Soc. A Math. Phys. Eng. Sci. 2010, 466, 423–438.
34. Cuadras, A.; Crisóstomo, J.; Ovejas, V.J.; Quilez, M. Irreversible entropy model for damage diagnosis in resistors. J. Appl. Phys. 2015, 118, 2016.
35. Cuadras, A.; Romero, R.; Ovejas, V.J. Entropy characterisation of overstressed capacitors for lifetime prediction. J. Power Sources 2016, 336, 272–278.
36. Cuadras, A.; Yao, J.; Quilez, M. Determination of LEDs degradation with entropy generation rate. J. Appl. Phys. 2017, 122, 145702.
37. Rico, A.; Ovejas, V.J.; Cuadras, A. Analysis of energy and entropy balance in a residential building. J. Clean. Prod. 2021, 333, 130145.
38. Cuadras, A.; Miró, P.; Ovejas, V.J.; Estrany, F. Entropy generation model to estimate battery ageing. J. Energy Storage 2020, 32, 101740.
39. Basaran, C. Introduction to Unified Mechanics Theory with Applications; Springer Nature: Cham, Switzerland, 2021.
40. Landauer, R. Stability and entropy production in electrical circuits. J. Stat. Phys. 1975, 13, 1–16.
41. Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423.
42. Nalewajski, R.F. Complex entropy and resultant information measures. J. Math. Chem. 2016, 54, 1777–1782.
43. Qian, G.; Iu, H.H.C.; Wang, S. Complex Shannon Entropy Based Learning Algorithm and Its Applications. IEEE Trans. Veh. Technol. 2021, 70, 9673–9684.
44. Reis, A.H. Constructal theory: From engineering to physics, and how flow systems develop shape and structure. Appl. Mech. Rev. 2006, 59, 269–282.
45. Zemansky, M.W.; Dittman, R.H. Heat and Thermodynamics; McGraw-Hill: New York, NY, USA, 1990.
Figure 1. Current divider with two resistors in parallel. E describes a power source, either of voltage or current.
Figure 2. Voltage divider with two resistors in series. E stands either for a voltage or current source.
Figure 3. Equivalent series resistance and capacitor with a power source E.
Figure 4. Configurational entropy (left) and thermal entropy (right) as a function of the normalized resistor ratio for a current source of 1 A and integration time of 1 s. The Sconfig maximum is 0.68 for equal resistors and evolves to 0 as the difference between resistors increases. Stherm increases with R2.
Figure 5. Thermodynamic entropy vs. configurational entropy for a normalized current source (1 A, black) and a normalized voltage source (1 V, dashed red line), integrated for 1 s.
Figure 6. Current through R2 for the reference case at constant temperature (black), temperature-dependent resistor with α = 0.0040 (red), temperature-dependent resistor with α = −0.0005 (green), and temperature-independent resistors (blue). All cases considered T1 = 300 K and T2 = 400 K. The difference between the blue curve and the reference curve indicates that it is not possible to modify the thermal dissipation without changing the structure of the material, i.e., the temperature coefficient term proportional to α.
Figure 7. Sconfig change due to resistance variation with temperature with α = 0.0040 (dashed red line) with respect to the reference configuration (black).
Figure 8. Stherm for the reference case (black) and for a temperature-dependent resistor with α = 0.0040 (red) for a current source of 1 A.
Figure 9. Stherm as a function of Sconfig for the reference configuration without temperature variation (black), a temperature-dependent resistor with α = 0.0040 (dashed red line), and a resistor with α = −0.0005 (dotted blue line) for a current source of 1 A. The difference between curves is related to the configurational entropy change of the resistor due to the heat injection with the consequent temperature variation.
Figure 10. Stherm as a function of Sconfig for a current source (black) and a voltage source (dashed red line) for series resistors. The red point of maximum Sconfig and maximum Stherm corresponds to R1 = R2, as described by the maximum power transfer theorem and pointed out with the arrow.
Figure 11. Stherm as a function of Sconfig for the reference configuration (black) and a temperature-dependent resistor (dashed red line) with α = 0.0040 for a voltage source. The difference between both curves is related to the entropy change of the resistor.
Figure 12. (a) Tree shape network with three elements. (b) Sconfig for the 3-resistor circuit. R1 = 1 Ω, R2 is the variable, and R3 is studied for two cases: R3 = 1 Ω (black line) and R3 = 10 Ω (dashed red line). (c) Stherm for the circuit with R3 = 1 Ω (black line) and R3 = 10 Ω (dashed red line), E = 1 V. (d) Sconfig and Stherm relationship with R3 = 1 Ω (black line) and R3 = 10 Ω (dashed red line), R2 as a variable and E = 1 V. R symmetry is lost when R3 and R1 are different. The arrows point at the maximum Sconfig.
Figure 13. (a) Tree shape network with seven elements. (b) Sconfig for the 7-resistor circuit. R2 is variable and R3 = 1 Ω (black line) or R3 = 10 Ω (dashed red line). All other resistors are fixed to 1 Ω. (c) Stherm for the circuit with R3 = 1 Ω (black line) and R3 = 10 Ω (dashed red line), E = 1 V. (d) Sconfig and Stherm relationship with R3 = 1 Ω (black line) and R3 = 10 Ω (dashed red line), R2 as a variable and E = 1 V. Symmetry is lost when R3 and R1 are different.
Figure 14. (a) Circuit with two voltage sources and two resistors. (b) Sconfig–Stherm relationship for V1 = 1 V, V2 = 2 V, and R1 = 1 Ω or 10 Ω. R2 is the swept variable.
Figure 15. Modulus of Sconfig as a function of frequency for C = 1 F, R = 1 Ω (black), and R = 10 Ω (dashed red line).
Figure 16. Nyquist plot of Sconfig for R = 1 Ω, C = 1 F, and 1 mHz < ω < 1 kHz.
Figure 17. Relationship between Sconfig and Stherm for an R-C system (R = 1 Ω, C = 1 F, and T = 300 K). A similar behaviour to resistor circuits is found, showing the generality of the method for linear systems.
Figure 18. Time-dependent profiles for degradation in a parallel R1//R2 circuit with R1 = 1 Ω, T = 300 K, and I = 1 A. (a) Time evolution of resistor degradation according to Equations (22)–(24). (b) Time-dependent evolution of Sconfig. (c) Time-dependent evolution of Stherm. (d) Relationship between Sconfig–Stherm. (e) Relationship between Sconfig and $\dot{S}_{therm}$.
Figure 19. Relationship between Stherm–Sconfig. It is the same data as in Figure 18d with the axes exchanged.
