Racing, sport, and pleasure horse foals are typically artificially weaned between four and six months of age. Whether it is abrupt or more gradual, artificial weaning always takes place while the foal is still closely bonded to its mother.

Along with maternal deprivation, foals generally experience other major life-changing stressors (e.g. changes in housing, feeding, and management, and increased human intervention). Weanlings commonly show increased distress behaviours, including decreased eating and sleeping, reduced play, increased aggression, weight loss, elevated stress hormone levels, increased heart rate, decline in growth, decreased bone density, compromised immune function, and an increased risk of respiratory and gastrointestinal infection (e.g. Erber et al., 2012). Weaning also leaves foals highly vulnerable to the development of stereotypies: chronic, invariant, seemingly purposeless behaviours such as cribbing, weaving, and self-mutilation (Waran et al., 2008).

Weaning has therefore been described as the most stressful event in a young horse’s life (Henry et al., 2020; McGreevy et al., 2018). Given its profound negative impact, it is worth asking whether weaning is necessary at all. As we will see, nature does a rather exemplary job all by itself.

Why is weaning stressful?

In natural conditions foals gradually transition to solid food as the mare’s milk declines in fat, protein and calories, encouraging foals to expand their diet. Foals stop suckling somewhere between nine months and over a year of age (interestingly, it is the foals who take the lead on this one), and by 10 months they spend approximately 60% of their time grazing, a proportion comparable to that of a mature horse (Henry et al., 2020).

Unlike artificial weaning, natural weaning does not sever the close bond between mare and foal at the cessation of suckling, or in some cases, ever. Research following foals over time finds that they gradually spend more time away from their mothers, but become fully independent only if and when they leave the herd at sexual maturity (at two or three years old). In a study of 16 mare-and-foal pairs of Icelandic horses living in natural conditions, Henry and colleagues found that even when foals were no longer suckling and the next year’s foal was on the ground, weanlings still spent most of their time within one horse length of their mothers. Their preference for their mothers over other herd members was as strong after weaning as it had been before, and although they found new herd buddies as they aged, the mother always remained the most preferred partner (Henry et al., 2020; see also Crowell-Davis & Weeks, 2005).

A Canadian study by equine researcher Katrina Merkies of the University of Guelph compared traditional abrupt weaning with a potentially less stressful two-stage weaning process (nutritional separation, followed by physical separation). Two-stage foals initially stayed with their mothers, who were outfitted with udder covers that prevented suckling; after four days the mares were removed, as in a traditional abrupt weaning. The two-stage foals showed no physiological or behavioural indicators of stress while they remained with their covered mothers. Once the mares were removed, however, the two-stage foals were just as stressed as the abruptly weaned foals, with increased vocalizations, running, foal-to-foal aggression, and higher fecal cortisol levels.

Merkies and colleagues concluded that their results did not support the hypothesized stress-reducing benefit of two-stage weaning. More tellingly, however, the results spoke clearly to the fact that it is the physical separation of mare and foal, not the nutritional separation, that is by far the greater stressor for both.

Early artificial weaning has been defended on the grounds that the mare’s milk is no longer nutritionally adequate for the foal. Merkies’ research points to the fallacy of using milk’s nutritional value as the reference point for the appropriate weaning time.

Merkies’ findings also mirror results from Harry Harlow’s work with rhesus monkeys in the early 1960s. In a series of barbaric experiments, Harlow removed infant monkeys from their mothers at birth and raised them with a “wire mother” that provided food (a wire-mesh figure outfitted with a suckling device) and a “cloth mother” made of soft terrycloth that provided no nourishment. He showed that it was not food that was key to the formation of the mother-infant bond, but what he called “contact comfort.” The infant monkeys spent almost all of their time on the terrycloth mother, sought her out when frightened, and used her as a secure base when exploring a new environment; they went to the wire mother only to feed and quickly returned to their cloth mother.

Harlow’s work provided empirical support for the Attachment Theory of John Bowlby (1980, 1982), who proposed that attachment evolved as a way of maintaining bonds to close others for protection, bonds critical to an infant’s survival and maintained throughout the lifespan. Attachment figures are not easily replaced, attachments do not weaken over time, and the grief upon their loss is immeasurable. Even when attachment relationships are problematic and attachment figures do not provide the secure base and safe haven that the child needs, the attachment bond is no less persistent.

Although horses’ attachment relationships may not be as complex as ours, there is ample evidence that horses form enduring relationships with particular others and suffer when those relationships are ruptured (Waran et al., 2008). It is therefore unlikely that, once a mare’s or her foal’s weaning distress behaviours have subsided, the pair have simply forgotten this integral attachment relationship, learned a necessary lesson in growing up, and suffered no ill consequences.

How might we make it better?

There have been a number of graduated approaches to weaning that may make the procedure somewhat less stressful for foals (see “Reducing Weaning Stress”). The strongest evidence comes from methods that leave the weanling with an unrelated but familiar adult mare or mares (e.g. Erber et al., 2012; Henry et al., 2012). Henry and colleagues found that although weaning was stressful for all foals (whether weaned alone, with other foals, or with other adults), foals were least stressed when weaned with one or more familiar adult mares. Interestingly, foals weaned with other foals (a ubiquitous industry practice) did not fare well, showing more peer aggression, more abnormal behaviours, and higher cortisol levels.

A paradigm shift: What about not weaning?

Some graduated weaning processes may be better than others, but there is little research following these horses over time, leaving us with a paucity of knowledge about the long-term impacts of artificial weaning. Furthermore, graduated artificial weaning (which decreases suckling and increases separation) does a poor job of mimicking nature, where suckling frequency does not decrease in the weeks prior to weaning and mare-to-foal proximity remains stable before and after weaning. Finally, unlike any graduated artificial process, natural weaning elicits no stress reactions in either foal or dam (Henry et al., 2020).

Today, there are various economic and practical reasons for early artificial weaning: marketing of foals, encouraging foals to focus on humans rather than their mothers, optimizing the mare’s reproductive efficiency, and controlling the foal’s nutritional intake. However, as Henry et al. (2020) note, many of “these reasons are based more on habits and tradition, perhaps even on false beliefs, and clearly not on the prospects for improving the welfare of domestic foals.”

Although natural weaning may not be feasible in large-scale operations readying young Thoroughbreds for the fall sales, the majority of breeding situations could accommodate natural weaning with minor management changes (installing selective feeders so foals have access to adequate nutrition, supplying free-choice forage so that mares do not lose condition, and so on).

The benefits of natural weaning may well outweigh any potential costs of this management shift. Foals readily learn from their dams, and much of a foal’s education (handling, trailering, desensitization to novel objects, etc.) can be accomplished with the aid of a well-seasoned and calm mother (Christensen, 2016). Natural weaning also puts foals at a greatly lowered risk of developing stereotypies. Research has shown that as many as 67% of individually housed foals developed a stereotypy after abrupt weaning (Visser, 2008). Even with more natural housing arrangements, the majority of all equine stereotypies develop within one month of weaning (Nicol, 1999), placing weanlings in the highest-risk age group for stereotypy development. This risk factor alone should be alarming enough to make breeders rethink weaning practices.

Stereotypies are typically lifelong and almost impossible to eliminate once established; they detract from the animal’s value and marketability, and in extreme cases may lead to behavioural disturbances so overwhelming (e.g. severe equine self-mutilation syndrome) that the only recourse is to have the horse destroyed.

Given the compelling evidence of the negative effects of artificial weaning and the clear benefits of natural weaning, I would suggest a paradigm shift to make natural weaning the default and artificial weaning the anomaly, rather than the other way around. At the very least, we should always start with the question, “Do I actually need to wean this foal right now?”