Calorimetry is the experimental technique used to measure enthalpy changes. It relies on the principle that the heat released by a reaction equals the heat absorbed by the surroundings (usually water).
Key Values
The specific heat capacity of water is 4.18 J g⁻¹ K⁻¹.
This means it takes 4.18 J to raise the temperature of 1 g of water by 1 K (or 1 °C).
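This relationship is just \( q = mc\Delta T \), and it can be checked numerically. Below is a minimal Python sketch (the function name heat_absorbed and its layout are our own, purely for illustration):

```python
# Minimal sketch of q = m * c * dT for water (illustrative only).
C_WATER = 4.18  # specific heat capacity of water, J g^-1 K^-1

def heat_absorbed(mass_g: float, delta_t_k: float) -> float:
    """Heat absorbed by a water sample in joules: q = m * c * dT."""
    return mass_g * C_WATER * delta_t_k

print(heat_absorbed(1.0, 1.0))     # 4.18 J: 1 g of water warmed by 1 K
print(heat_absorbed(100.0, 13.4))  # 5601.2 J, as in the worked example below
```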
Worked Example
Example: Combustion of Ethanol
0.500 g of ethanol (C₂H₅OH, M = 46.08 g mol⁻¹) is burned and heats 100.0 g of water from 22.0 °C to 35.4 °C. Calculate the enthalpy of combustion.
Step 1: \( q = mc\Delta T = 100.0 \times 4.18 \times (35.4 - 22.0) = 5601 \text{ J} = 5.601 \text{ kJ} \)
Step 2: \( n = \frac{m}{M} = \frac{0.500}{46.08} = 0.01085 \text{ mol} \)
Step 3: \( \Delta H = -\frac{q}{n} = -\frac{5.601}{0.01085} = -516 \text{ kJ mol}^{-1} \)
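The three steps translate directly into a short calculation. Here is a sketch using the numbers from the example (variable names are our own):

```python
# Reproduce the worked example: combustion of 0.500 g of ethanol.
mass_water = 100.0      # g
c_water = 4.18          # J g^-1 K^-1
delta_t = 35.4 - 22.0   # K (a change of 1 degC equals a change of 1 K)

q_joules = mass_water * c_water * delta_t  # Step 1: q = m c dT
q_kj = q_joules / 1000                     # 5.601 kJ

mass_ethanol = 0.500    # g
molar_mass = 46.08      # g mol^-1 for C2H5OH
n = mass_ethanol / molar_mass              # Step 2: 0.01085 mol

delta_h = -q_kj / n                        # Step 3: -516 kJ mol^-1
print(f"q = {q_kj:.3f} kJ, n = {n:.5f} mol, dH = {delta_h:.0f} kJ/mol")
```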
The answer is negative because combustion is exothermic. The literature value is −1367 kJ mol⁻¹; the large discrepancy is mostly due to heat lost to the surroundings (see the sources of error below).
Sources of Error in Calorimetry
Why Experimental Values Differ from Literature
- Heat loss to the surroundings (biggest source of error)
- Incomplete combustion — not all fuel is fully oxidised
- Evaporation of water or volatile reactants
- Assuming the solution has the same specific heat capacity as pure water
- Heat absorbed by the calorimeter itself, not just the water (see the sketch after this list)
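The last point is often handled with a calorimeter constant \( C_{cal} \), so that \( q = (mc + C_{cal})\Delta T \). A hedged sketch, where the value of C_CAL is purely hypothetical (a real value comes from calibrating the specific apparatus):

```python
# Sketch of a calorimeter-constant correction (C_CAL is hypothetical;
# real values must be found by calibrating the apparatus used).
C_WATER = 4.18  # J g^-1 K^-1
C_CAL = 45.0    # J K^-1, assumed calibration value for illustration only

def total_heat(mass_water_g: float, delta_t_k: float) -> float:
    """Heat absorbed by water plus calorimeter: q = (m*c + C_cal) * dT."""
    return (mass_water_g * C_WATER + C_CAL) * delta_t_k

# With the worked-example numbers, the correction adds 45 * 13.4 = 603 J:
print(total_heat(100.0, 13.4))  # 6204.2 J vs 5601.2 J for water alone
```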
Think About It
Why do experimental ΔH values for combustion almost always have a smaller magnitude than the literature value?
Because heat escapes to the surroundings, the water captures less heat than the reaction actually releases, so the measured temperature change (and therefore the calculated magnitude of ΔH) is smaller than it should be.
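A quick back-of-envelope check makes this concrete. Assuming all 0.500 g of ethanol burned completely and every joule went into the water, the literature value predicts a far larger temperature rise than the 13.4 °C actually observed:

```python
# How big should dT have been with no heat loss? (back-of-envelope sketch)
n = 0.500 / 46.08         # mol of ethanol, as in the worked example
q_expected = n * 1367e3   # J released, using literature dHc = -1367 kJ/mol
dt_expected = q_expected / (100.0 * 4.18)  # dT = q / (m c)
print(f"expected dT ~ {dt_expected:.1f} K vs 13.4 K observed")
# ~35.5 K expected; only about 38% of the heat reached the water
```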