Our collective approach to water has come a long way to bring us where we are today. Historically, clean water was not always available to the masses, and waterborne disease was rampant. It was only after the connection was made between those diseases and the quality of the water supply that real efforts were dedicated to providing clean drinking water for major populations.
It was realized very early that an outside water source was needed for drinking. The Romans built aqueducts dedicated to this purpose as early as 312 B.C. Later, in the early 17th century, London constructed the New River to carry clean water into the city. As populations grew, however, these measures would prove insufficient.
Before modern sewage and water treatment plants were in place, humans largely determined the quality of water based on taste. Unfortunately, we now know this is not an accurate test to determine the suitability of drinking water. Some harmful agents are tasteless, odorless, and colorless. Needless to say, humans soon realized the need for a more accurate way to test their drinking water.
Although there are mentions of boiling water and filtering it through gravel and sand dating back to prehistoric times, it wasn't until the early 19th century that an entire town was supplied with filtered water. Paisley, Scotland became the first city to supply a whole municipality with filtered water. Several English cities followed suit soon after, and some European cities adopted filtration as well. In the United States, an early attempt by Richmond, Virginia to install a filter failed in 1832.
With the spread of disease and the overwhelming of city cesspools, it became obvious that some kind of purification was necessary. Yet even as disease became widespread, many still did not believe in the link between unclean drinking water and contagion. Finally, in 1890, William Thompson Sedgwick used bacteriology to demonstrate a connection between contaminated water and cholera.
In the late 1800s, many cities in the United States began to adopt filtration for city drinking water. The early systems strained water through sand and gravel to remove sediment. By the beginning of the 1900s, cities began to realize that slow sand filters could also remove some germs, notably the bacterium that causes typhoid.
In addition, it was recognized that treatment of the water itself might also be necessary. In the early 1900s, many cities adopted chlorination, adding chlorine to the water to kill pathogens, though concerns about the safety of the practice persisted. With thousands of cases of typhoid fever and diarrhea still occurring, the need for effective water treatment remained an urgent matter.
The first federal regulations in the United States were enacted in 1914. Later, during the 1960s, the realization that industrial processes were contaminating the clean water supply led to stricter restrictions. The result was the 1972 amendments that are largely responsible for our current legislation.
Most of what now governs the quality of our nation's waters is set forth in the Clean Water Act (CWA), which took its current form with the 1972 amendments. According to the EPA, the major changes made by the CWA include:
- Established regulations for pollutant discharges into US waters
- Gave the EPA authority to implement pollution control programs
- Maintained existing requirements for contaminants in surface water
- Required a permit for discharging pollutants into navigable waters
- Funded the construction of sewage treatment plants
- Recognized the need for planning to address pollution problems
Although these policies have evolved significantly over time, the result is the promise of safe, clean drinking water we enjoy today. Want to learn more? Read our post on how SCADA Improves Water Plant Functionality.