# What is Enthalpy and Entropy?


Thanks in advance, I want to understand the concept and not just know the equations!


Chemical processes change the energy in two "pots": enthalpy and entropy. The enthalpy contribution does not depend on the temperature at which you run the reaction; the entropy contribution does. Enthalpy is how much energy your molecules store in their bonds. Some bonds between atoms are much more stable than others, so when they form, the difference is released as an enthalpy change, ultimately heat that you can measure.

This alone, however, doesn't tell you whether the reaction occurs spontaneously. For that, the sum of the two terms (the Gibbs free energy change, ΔG = ΔH − TΔS) needs to be negative. Entropy is the second term: it tells you how much order you lost. It's a bit more subtle than that, but in the end the system tends toward states that you cannot tell apart from each other when you swap the positions of two members, such as molecules flying around your room. The reaction occurs spontaneously only when, on balance, forming more stable molecules and increasing disorder win out.
Since the entropy term is weighted by temperature, there are some funny reactions that only occur at very low or very high temperatures.

I know you’re asking about the physics concepts, but you might also be interested to know that information theory exists, and has its own concept of entropy!

In that world, entropy is a measure of how evenly probability is spread across the possible symbols – and, correspondingly, of how disorganized the information is.

If you roll a fair die a bunch of times, you have to write down the entire sequence to identify it uniquely, and there's not really a great way to improve on that. If your die is loaded? There are schemes that do better on average, and only take more work when something unusual happens. Imagine you had a loaded die that favored 1, and so you rolled 20 1s, a 4, and then 10 more 1s. I just described that long sequence in a really compact way, right?

If your mind goes to compression, you're right on the money. This is the principle that lossless compression schemes like Zip (which uses DEFLATE) exploit – while there are lots of schemes to shrink data that's organized in some way, you don't benefit from those schemes if the entropy is high, and so some data are going to compress better than others!
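To make the die example concrete, here is a minimal Python sketch of Shannon entropy. The function name and the loaded-die probabilities are illustrative choices, not from any particular library:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits per symbol: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair = [1/6] * 6              # every face equally likely
loaded = [0.75] + [0.05] * 5  # heavily favors rolling a 1

print(shannon_entropy(fair))    # about 2.58 bits per roll: no shortcut exists
print(shannon_entropy(loaded))  # about 1.39 bits: sequences compress well
```

The loaded die's lower entropy is exactly why the "20 1s, a 4, then 10 more 1s" description works: typical sequences are predictable, so they can be written down more compactly.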

Since the question is tagged “Physics” I’ll tackle that side of the problem.
Suppose you have some fluid in a box. You can measure its temperature (T), pressure (P), and volume (V). From those you can determine its internal energy (U). As its name suggests, U is a measure of the energy of the fluid in its current state.
You can increase U by heating up the box (heat flow toward the fluid) and/or compressing the box (applying work to the fluid); and you can decrease U by cooling down the box (heat flow from the fluid) and/or letting the box expand (the fluid does work toward the environment).

But for many practical applications, your fluid is not in a box; it is flowing through a pipe or going through a machine (think of a pump, a turbine, a radiator, etc.). You can still measure T and P for the fluid, and if you consider a fixed length of pipe you can still measure its volume, though it is generally more useful to think in terms of density (d), as in d = mass / volume.
You can also still consider U, but once again it is not that useful. Since the fluid is in motion, it is constantly entering and exiting your volume, which introduces an additional energy term called pulsion work (also known as flow work).
I am not talking about kinetic energy, which depends on the velocity of the fluid, that is a separate term.
This pulsion work is calculated as P * V. So if you managed to determine U, you can also determine the pulsion work. Given that it depends on the same properties as U, and that it is more significant for flowing systems, the enthalpy (H) was defined as H = U + P * V.
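As a quick numeric sketch of that definition (the values below are made up for illustration, roughly a small parcel of fluid at atmospheric pressure):

```python
def enthalpy(U, P, V):
    """Enthalpy in joules: internal energy U (J) plus pulsion/flow work P*V,
    with pressure P in pascals and volume V in cubic meters."""
    return U + P * V

U = 5_000.0    # internal energy of the fluid parcel, J (illustrative)
P = 101_325.0  # pressure, Pa (about 1 atm)
V = 0.01       # volume of the parcel, m^3

print(enthalpy(U, P, V))  # roughly 6013 J: the P*V term adds about 1013 J
```

Note how the P * V term is not negligible next to U, which is why H, not U, is the natural bookkeeping quantity for flowing systems.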
I hope this helps.

We have a Chemist’s and a Physicist’s answers. I’ll try to give the Engineer’s answer:

Internal energy only refers to the heat energy content of a system. Enthalpy is the total energy content of the system which is the internal energy + the work the system could perform by expanding and pushing the surrounding environment out of the way due to pressure. It’s the internal heat energy + potential mechanical work.

For Entropy: A machine does a task. The task requires a transfer of a certain amount of energy to feed the machine. Depending on how well the machine is designed and maintained, it might do the task well or poorly; that is, do a lot of work or a little for that amount of energy. Entropy tells you how much more work or energy you could have gotten out of your machine if you were as efficient as possible. Entropy is that amount of energy divided by the temperature of the space around the machine. Now replace "machine" with "any volume of space" and "task" with "any interaction at all between the space and its surroundings."
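The "energy divided by temperature" idea above can be sketched numerically. The function name and the numbers here are illustrative assumptions, not from the answer itself:

```python
def entropy_generated(lost_work_J, ambient_T_K):
    """Entropy generated (J/K): work the machine failed to extract,
    divided by the temperature of the surroundings."""
    return lost_work_J / ambient_T_K

ideal_work = 1_000.0   # J, best-case work obtainable from the energy fed in
actual_work = 700.0    # J, what the real machine actually delivered
T_ambient = 300.0      # K, temperature of the space around the machine

print(entropy_generated(ideal_work - actual_work, T_ambient))  # 1.0 J/K
```

A perfectly efficient machine would lose no work and generate no entropy; every bit of inefficiency shows up as energy dumped into the surroundings, divided by their temperature.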