The second law of thermodynamics gets talked about quite often, so I'm sure you've heard of it.
It simply says that the entropy of an isolated system will always increase, or at least never decrease.
Sometimes the entropy is called the disorder of a system or how chaotic a system is.
But we won't have a particularly simple expression for the entropy.
So for example, if I showed you a system like the one we have here,
we have maybe one particle, and one part of the system is a lot hotter than the rest of the system.
We already know, just sort of from our own intuition, that the energy would spread out,
that the more heated particle would interact with the things around it
and transfer that energy to the different parts of the system,
and that heat would start spreading through the system, as you can see in the lower picture here.
This would be considered to be a sort of change in the entropy of the system
and an increase in the entropy of the system.
Because as this heat spreads through the system,
we would consider this to be a more disordered way for the energy to be arranged relative to the original state.
So if you look at the original state up top,
we would say that having the heat collected all in one place,
all that energy in one spot, is a very organized way for the system to be.
It's a very particular configuration for the system.
And systems usually will not want to stay in these kinds of configurations.
But for this particular phenomenon I just described, where the heat transfers through your system,
we don't have a particularly simple expression for the entropy that you'll be responsible for,
so I couldn't give you the particles' positions
and then ask you to tell me what the entropy of the system is in any practical sense.
Instead, we'll be responsible for knowing some properties about the entropy of a system,
the way the entropy changes, how we can describe certain properties of the entropy,
and that's what we'll do now.
One other property is that the entropy is a state variable,
which means it's not like the heat added to a system,
which can depend on the history and what you've done beforehand;
it's something that could in principle be measured immediately just by looking at the system itself.
We also have that the disorder, or entropy, of a system
is going to be greater for gases and for liquids than it is for solids.
And the entropy of a gas is also greater than the entropy of a liquid.
And fortunately, this follows our intuition quite well.
We would think of a gaseous state, where all these particles are bouncing around in the air,
as being a lot more disordered or chaotic
than a simple solid, which is very rigid relative to that great motion of particles in a gaseous system.
We would also say that the entropy, unlike something like the temperature, would increase with the size of the system.
So what do I mean by this? If I take a system which has a particular entropy
and suppose I double the size of that system,
and maybe have two versions of that same system,
the entropy of this new overall system where I have doubled it
would now be double the original entropy.
This sounds almost too intuitive, so you might again ask, 'Well, what system wouldn't follow this law?'
But again, temperature wouldn't.
Imagine you have a system that is at ten degrees.
You double it, so you now have two systems both at ten degrees.
You would not say the whole system is now at twenty degrees, because that's not how temperature adds.
Temperature is an intensive property of the system, rather than an extensive one like the entropy.
So again, for entropy we have this variable that increases as the size of the system increases,
more like something like energy.
If I double the size of a system just by duplicating the system,
I would say there is now twice as much energy present, and the entropy follows that same idea.
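The doubling argument above can be sketched in a few lines. This is a minimal illustration, not a physical simulation: the function name `combine` and all the numerical values are made up for the example, and the combination is only meaningful for two non-interacting systems at the same temperature.

```python
# Contrast extensive quantities (entropy S, energy U), which add when
# identical systems are combined, with an intensive one (temperature T),
# which stays the same. All numbers are hypothetical.

def combine(system_a, system_b):
    """Combine two non-interacting systems at the same temperature."""
    assert system_a["T"] == system_b["T"], "only valid at equal temperatures"
    return {
        "T": system_a["T"],                  # intensive: unchanged
        "U": system_a["U"] + system_b["U"],  # extensive: energies add
        "S": system_a["S"] + system_b["S"],  # extensive: entropies add
    }

box = {"T": 283.0, "U": 500.0, "S": 2.0}  # hypothetical values, SI units
doubled = combine(box, box)

print(doubled["T"])  # 283.0 -> still the same temperature
print(doubled["S"])  # 4.0   -> twice the entropy
```

Duplicating the system doubles the entropy and the energy but leaves the temperature alone, which is exactly the extensive/intensive distinction from the lecture.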
For dispersed liquids and gases, we also have the rule that their entropy
would be greater than the entropy of a more collected system.
And again, some of these principles follow a good intuition for what entropy should do
if we consider entropy as a disorder or an amount of chaos in a system.
For a given reaction, if we know what the entropy of the products are
and we know the entropy of the reactants are before they form those products,
the change in the entropy of the system through the reaction
or delta S, the change in entropy, is simply the difference.
It's the entropy of the product minus the entropy of the reactants.
If we reverse that reaction so it goes the other way,
we would simply put a minus sign on it, because the reactants and products swap, just from our definition.
And so we would say that the entropy of the reverse reaction or the change in entropy of your system
for the reversed reaction is simply the negative of the change in the entropy for the normal forward reaction.
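That rule can be written out directly. The following sketch uses hypothetical standard entropy values purely for illustration; the function name `delta_s` is made up, and a real calculation would pull tabulated entropies for the actual species in the reaction.

```python
# Entropy change of a reaction: delta S = S(products) - S(reactants),
# summing over every species on each side. Values are hypothetical,
# in J/(mol*K).

def delta_s(s_reactants, s_products):
    """Entropy change: total product entropy minus total reactant entropy."""
    return sum(s_products) - sum(s_reactants)

# forward reaction: two reactant species form one product species
forward = delta_s(s_reactants=[130.0, 205.0], s_products=[189.0])

# reverse reaction: reactants and products swap roles
reverse = delta_s(s_reactants=[189.0], s_products=[130.0, 205.0])

print(forward)             # -146.0
print(reverse == -forward) # True: reversing flips the sign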
Finally, we have what we sometimes call the third law,
which is sort of also under the heading of entropy because it's just the way to think about where we define the zero of entropy.
And this third law of thermodynamics, as it's sometimes called, just says that we define the zero of entropy
to coincide with the absolute zero of temperature.
So we would say that a system at zero kelvin in temperature, so absolute zero,
would just have zero entropy: no disorder in that system.
Which again follows our intuition, because if there is nothing moving, nothing can happen;
there's no place for chaos to really occur in that system.