Let's face it: Thermodynamics is not easy! It is not possible to learn it by just reading through this module.

However, if you have fought your way through thermodynamics proper at least once, and thus are able to look at it from a distance without getting totally confused by the "details" (which you don't have to know anymore, but must be able to understand when they come up), it's not so difficult either.

It gets even easier if we restrict ourselves to solids, which means that most of the time we don't have to worry about the pressure p anymore - it is simply constant.

In this primer we will review the most important issues necessary for understanding defects, including defects in semiconductors. In order to keep it simple, we must "cut corners". This means:

We will usually not show the functional relationships by listing the variables. We thus simply write G for the free enthalpy, and not G(T, S, ni), which would show that G is a function of the temperature T, the entropy S, and the particle numbers ni.

In the same spirit, we will omit the indices showing what stays constant for partial derivatives, i.e. for the chemical potential µi of the particle species i we write the simple form

$$\mu_i \;=\; \frac{\partial G}{\partial n_i}$$

In full splendor it should be

$$\mu_i(T, S, n_{j \neq i}) \;=\; \left( \frac{\partial G(T, S, n_i)}{\partial n_i} \right)_{S,\, T,\, n_{j \neq i}}$$

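The definition is easy to try out symbolically. A minimal sketch (the toy function G below is made up purely for illustration, not a real material model):

```python
import sympy as sp

T, S, n1, n2 = sp.symbols("T S n_1 n_2", positive=True)

# A made-up toy free enthalpy G(T, S, n_i) - purely illustrative.
G = n1 * sp.log(n1) + n2 * sp.log(n2) + T * S

# Chemical potential of species 1: the partial derivative dG/dn_1
# with T, S, and all other particle numbers held constant.
mu_1 = sp.diff(G, n1)
print(mu_1)  # prints: log(n_1) + 1
```
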
We are also sloppy about standards. ni may refer to particle numbers or concentrations; in the latter case particles/cm3 or mol/cm3 - you must know what is meant from the context. You are also supposed to know that if Boltzmann's constant k comes up in an equation, we are working with properties per particle, whereas the gas constant R signifies properties per mol.

This, admittedly, is dangerous. But multi-indexed quantities are confusing (and not easily written in HTML, anyway)! Let's stay simple and refer to complications whenever they come up.

If we restrict ourselves to crystals, it is rather easy to grasp the concepts behind the all-important thermodynamic quantities Internal Energy, Enthalpy, Entropy, Free Energy, and Free Enthalpy. We start with the internal energy U of a crystal.

Neglecting external energies (e.g. the gravitational potential) and internal energies that never change (e.g. the energy of the inner electrons), we are essentially left with the internal energy U being contained in the vibrations of the crystal atoms (or molecules), which express themselves in the temperature T of the system according to

$$U \;=\; \frac{f}{2}\, kT$$

With U = average energy per atom; f = number of degrees of freedom for "investing" energy in an atom (f = 6 for crystals: 3 for the kinetic energy in vx, vy, vz, and 3 for the potential energy at (x, y, z)); k = Boltzmann's constant; and T = (absolute) temperature.

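As a quick plausibility check (a minimal sketch; the temperature is just an example value), the formula reproduces the familiar order of magnitude of thermal energies:

```python
# Average vibrational energy per atom from equipartition: U = (f/2) k T.
k = 8.617e-5  # Boltzmann's constant in eV/K

def internal_energy_per_atom(T: float, f: int = 6) -> float:
    """Energy per atom in eV; f = 6 for an atom in a crystal."""
    return 0.5 * f * k * T

print(f"U(300 K) = {internal_energy_per_atom(300.0):.3f} eV")  # about 0.078 eV
```
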
The (macro)state of the system is thus given by the number of atoms N, the pressure p, and the temperature T. Knowing these numbers is all there is to know about the system on a macroscopic basis.

We can change the state of the system by adding or removing heat Q, putting mechanical work W into the system or taking it out, and by changing the number of atoms (or more generally, particles) by some ΔN.

Since at this point we keep the number of atoms in our crystal constant, we only have to consider Q and W if we change the state. The following basic equation (a formulation of the 1st law of thermodynamics) (german link) holds:

$$dU \;=\; dQ \,+\, dW$$

The changes are written in differential form. Note that the regular "d" is not the sign for (partial) derivatives (that would be ∂) but for "delta"; dU, e.g., stands for the total change of U. Note that sometimes the d indicates a total differential (e.g. in the case of dU), and sometimes it does not (e.g. in the case of dQ or dW).

Any mechanical work must change the volume (something must move); for the normal conditions encountered with crystals, where the pressure stays constant, it can always be expressed as

$$dW \;=\; -p\, dV$$

This term pdV is cumbersome as long as only situations involving crystals under constant pressure are considered. We thus introduce a new state function called enthalpy H and define it as

$$H \;=\; U \,+\, pV$$

If we again change the state of the system by adding or subtracting heat Q and mechanical work W, we now obtain for the total change in enthalpy dH

$$dH \;=\; dU \,+\, p\, dV \,+\, V\, dp$$

With Vdp = 0, because the pressure p is constant, and dU = dQ − pdV, we obtain

$$dH \;=\; dQ$$

This simple relation is always best suited for systems under constant pressure; it also clarifies why we tend to think of enthalpy as heat.

dH is a measure of the energy needed to form a substance in a given state; it is occasionally also called the heat of formation (always referring to the difference between two states).

Of course, not much happens if the substance is just heated a bit but does not change its chemical nature - let's say we look at a mixture of H2 and O2 which we heat up a bit. All the fun comes from chemical reactions (or phase changes) - in our example it would be the formation of H2O in a somewhat violent fashion.

It was thought that the sign of dH would indicate whether a reaction should or should not occur. A negative sign would mean that the reaction transfers energy to the surroundings and thus could easily happen, whereas a positive sign would tell us that energy would have to be pumped into the system - nothing would happen by itself.

It's not that simple! While this point of view was true enough for relatively large dH (let's say > 100 kcal/mol), the criterion often does not work for smaller changes of dH.

The reason, of course, is that we neglected the change of the entropy S of the system, dS, that occurs in parallel with dH.

Purely mechanical systems (consisting of non-interacting mass points) would be in equilibrium at the lowest possible internal energy, i.e. at a minimum of their potential energy and without movement - just lying still at the lowest possible point. But thermodynamic systems, consisting of many interacting particles under some externally fixed condition (e.g. a constant temperature), are in equilibrium if the best possible balance between a small energy and a large entropy is achieved!

We just take that as an article of faith (or law of nature) at this point.

Often, both quantities are opposed to each other: high entropies mean high energies and vice versa. The entropy part becomes more important at high temperatures, and the thermodynamic potential which has to be minimized for systems under constant pressure is the free enthalpy G (also called Gibbs energy). It is defined as

$$G \;=\; H \,-\, TS$$

With S = entropy; in classical thermodynamics dS = dQrev/T (the suffix "rev" refers to reversible processes).

If you have a system with constant volume (and variable pressure), the best-suited state function is the free energy F (also called Helmholtz energy). It is defined as

$$F \;=\; U \,-\, TS$$

Before turning to the entropy, a word about the choice of state functions. We now already have four: U, H, G, F - but for a given system, there is only one state. Two things are important in this context:

State functions, by definition, must describe the state of a system no matter how this state developed - they must, in other words, meet all the requirements for potentials and thus are thermodynamic potentials. We have not proved that this is the case for U, H, G, F - turn to the potential module for some input on this question - but they really are potentials.

Any state function or thermodynamic potential can be used to describe any system (always for equilibrium, of course), but for a given system some are more convenient than others. The most convenient (and thus most important) one for crystals (usually under constant pressure) is the free enthalpy.

The key question is:

What is entropy?

There is a classical answer, but here we only use the statistical definition, where entropy is a measure of the "probability" w of a given macrostate or, essentially the same thing, of the number P of microstates possible for the given macrostate.

Not too helpful: What is a microstate or a macrostate? Or the probability of a macrostate?

Well, any particular arrangement of atoms (or more generally, particles) where we look only at average quantities is a macrostate, while any individual arrangement defining the properties (e.g. location and momentary velocity) of all the particles for a given macrostate is a microstate.

In other words, and somewhat simplified: for a microstate it matters what individual particles do, for the macrostate it does not.

The difference between microstates and macrostates is best illustrated for a gas in a closed container: We can define many possible macrostates, e.g.
- 1. All molecules are in the left half of the container.
- 2. 70 % of the molecules are in the left half of the container, 30 % in the right half.
- 3. Equal (average) distribution of the molecules.
and all these macrostates (plus many more) could have exactly the same internal energy U (or H).

However, the probability of experimentally finding one or the other of those macrostates is very different. The probabilities of macrostates 1. and 2. are certainly much, much smaller than the probability of macrostate no. 3; the sketch below puts a number to this.

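A minimal sketch, assuming ideal non-interacting molecules so that each molecule sits in either half with probability 1/2:

```python
# Probability w that ALL N molecules are momentarily in the left half
# (macrostate no. 1): each molecule is independently left or right,
# so w = (1/2)**N.
for N in (10, 100, 1000):
    print(f"N = {N:5d}: w = {0.5**N:.3e}")

# Already for N = 1000 molecules - a ridiculously small amount of gas -
# w is below 10**-300; for real particle numbers it is zero for all
# practical purposes.
```
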
For all the possible macrostates, the state function tells us which one will be realized (= is most probable) in thermal equilibrium.

How do we calculate the probability of a macrostate? Let's see:

For every possible macrostate we can think of, there are many microstates that realize it. It's exactly like playing dice: let's assume you have 3 dice. A macrostate would be some possible total you may throw, e.g. 9. The corresponding microstates are the possible combinations of the individual dice. For throwing a 9 we have

Dice | 1. Poss. | 2. Poss. | 3. Poss. | ....
1    | 1        | 1        | 1        |
2    | 1        | 2        | 3        |
3    | 7        | 6        | 5        |

- and so on. You get the picture.

The probability for such a macrostate would be the number of its microstates divided by the number of all possible combinations of the dice (which is a constant). We can see off-hand that the macrostates "3" and "18" are the most unlikely ones, having only one microstate at their disposal, while 9, 10, or 12 are much more likely to occur; the sketch below counts them all.

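This counting is easy to verify by brute force. A minimal sketch that enumerates all 6³ = 216 microstates of three dice:

```python
from itertools import product
from collections import Counter

# Count how many microstates (ordered triples of faces) realize each
# macrostate (the thrown total).
counts = Counter(sum(dice) for dice in product(range(1, 7), repeat=3))

for total in sorted(counts):
    w = counts[total] / 6**3  # probability = microstates / all combinations
    print(f"total {total:2d}: {counts[total]:2d} microstates, w = {w:.4f}")

# Totals 3 and 18 have just one microstate each; 10 and 11 top the list
# with 27 microstates each.
```
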
Now we know what the number of possible ways to generate the same macrostate means, and why the "probability" w of a given macrostate is "almost" the same thing.

An example just as easy as playing dice comes from our friend, the vacancy. We simply ask: how many ways P (= microstates) are there to arrange n vacancies (= the macrostate) in a crystal of N atoms?

When we figure that out, we can use the equilibrium condition to select the most likely macrostate, and this gives us the number of vacancies in equilibrium; the sketch below shows the counting part.

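Counting the arrangements is a standard combinatorial exercise. A minimal sketch, assuming for simplicity that the n vacancies are distributed over N lattice sites:

```python
import math

k = 8.617e-5  # Boltzmann's constant in eV/K

# Number of microstates P for the macrostate "n vacancies among N sites":
# the number of ways to choose the vacant sites, P = N! / (n! * (N - n)!).
def microstates(N: int, n: int) -> int:
    return math.comb(N, n)

N, n = 1000, 10
P = microstates(N, n)
S_config = k * math.log(P)  # Boltzmann: configurational entropy S = k ln P
print(f"P = {P:.3e} microstates, S = {S_config:.3e} eV/K")
```
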
The fundamental point now is that just knowing the internal energy U of a system with constant volume and temperature is not good enough to tell us what the equilibrium configuration will be, because we could think of many macrostates with the same U (and mother nature, to be sure, can come up with lots more).

That's why just minimizing U (or H) is not good enough; we have to minimize F = U − TS or G = H − TS to find the equilibrium configuration of the system, and for that we have to know the entropy, because now we can interpret these formulas:

Of all the many macrostates possible for a given U (or H), the one with the largest entropy at the given temperature will be the one that the system will adopt.

Obviously, we need to be able to calculate the entropy of a certain macrostate, and this is done by employing the statistical definition of the entropy S, the famous Boltzmann entropy equation (german link):

$$S \;=\; k \cdot \ln w$$

or

$$S \;=\; k \cdot \ln P$$

With w = probability of a macrostate and P = number of microstates for a macrostate.

If you feel that the ambiguity with respect to taking w or P is a bit puzzling - that's because it is! You should consult the link to see that at least it is nothing to worry about. Whatever you choose to work with, the results you will get in the end will be the same.

Entropy S, by the way, is not an energy and thus not a thermodynamic potential by itself (TS, which has the dimension of an energy, is what appears in the potentials F and G).

We used the statistical definition of entropy and the minimization of the free enthalpy in chapter 2.1; an exercise module shows in detail how to apply it to derive the formula for the vacancy concentration. A closing sketch of that result follows below.

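A minimal numerical sketch of the standard outcome of minimizing G (the formation enthalpy and entropy values below are assumed, typical orders of magnitude only):

```python
import math

k = 8.617e-5  # Boltzmann's constant in eV/K

# Equilibrium vacancy concentration from minimizing G = H - TS:
# c_V = n/N = exp(-G_F/kT) = exp(S_F/k) * exp(-H_F/kT),
# with G_F = free enthalpy of formation of one vacancy.
def vacancy_concentration(H_F: float = 1.0, T: float = 300.0,
                          S_F_over_k: float = 1.0) -> float:
    # H_F in eV; S_F_over_k = S_F/k is dimensionless. Assumed values.
    return math.exp(S_F_over_k) * math.exp(-H_F / (k * T))

for T in (300.0, 600.0, 1200.0):
    print(f"T = {T:6.0f} K: c_V = {vacancy_concentration(T=T):.2e}")
```
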
© H. Föll