Silver shows promise in early wound treatment
Expert explains healing benefits of topical nanosilver
Silver has been used for centuries in many facets of life ranging from food storage to wound care, according to Adam Friedman, M.D., F.A.A.D. But silver became more mainstream in the early 1900s, and especially after the 1920s, when the U.S. Food and Drug Administration approved it as a treatment.
Despite its longevity, the use of silver continues to remain a hot topic to this day, says Dr. Friedman, associate professor of dermatology, residency program director and director of translational research in the department of dermatology at The George Washington University School of Medicine & Health Sciences.
Some of that conversational staying power may stem from persistent misunderstanding of the topic.
According to Dr. Friedman, silver isn’t used as readily as it could be, largely due to a number of common misperceptions.1 The first is that silver dressings don’t improve healing rates; in fact, systematic reviews and meta-analyses have confirmed the positive effects of silver when dressings are used appropriately.
Another common myth is that silver dressings cause systemic toxic effects such as argyria, a condition that can turn the skin a bluish hue. Silver dressings, he explains, are unlikely to cause true argyria because they deliver only low levels of silver.
Another misperception is the idea that silver dressings delay healing and drive bacterial resistance to antibiotics.
Terminology can also contribute to the confusion, since some physicians use the terms colloidal silver and silver sulfadiazine almost interchangeably. “They are absolutely not the same,” Dr. Friedman says. “I think it is important to delineate between those very different wound medicines, as there is a growing body of evidence demonstrating that silver sulfadiazine can actually delay wound healing.”