Animals should respond not only to the presence of predators but also to the pattern of their comings and goings


Animals live in environments where the availability of food and the risk of being killed by predators vary over time and space. The risk allocation hypothesis showed how animals should change their foraging effort when danger varies, and predicted that effort under high risk will depend on the proportion of time for which risk is high. However, animals are able to build up fat reserves that they can live off during dangerous periods. This means that the length of the dangerous periods matters, because longer periods require larger fat stores. And to store enough fat to survive long dangerous periods, the safe periods must be long enough to build up those reserves; otherwise animals are forced to forage during the dangerous periods. So the pattern of predation risk may matter in itself, rather than just the proportion of time for which foraging is dangerous.
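The fat-reserve logic above can be sketched with toy numbers. The costs, gains, and function name here are my own illustrative assumptions, not the paper's actual model:

```python
# Toy energy budget: an animal burns `cost` units per time step and
# gains `gain` units per step spent foraging. To sit out a dangerous
# period of length `danger_len` without foraging at all, it must first
# store danger_len * cost units of fat during the preceding safe period.

def safe_steps_needed(danger_len, cost=1.0, gain=3.0):
    """Steps of safe-period foraging needed to survive `danger_len`
    dangerous steps without foraging at all."""
    reserve_needed = danger_len * cost     # energy burned while hiding
    surplus_per_step = gain - cost         # fat stored per safe step
    return reserve_needed / surplus_per_step

for d in (2, 10, 50):
    print(d, safe_steps_needed(d))   # longer danger -> more safe foraging
```

If the safe periods are shorter than this, the animal cannot store enough fat in time and is forced to forage while predators are about.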


This model develops the risk allocation model by focussing on the durations of safe and dangerous periods, and by considering cases in which those durations differ greatly (e.g. long periods of safety interspersed with brief but very dangerous periods). The study also explored what happens when food availability, rather than predation risk, varies. The overall result is that the strongest influence on foraging effort should be the durations of good periods (low risk or plentiful food) and bad periods (high risk or little food), rather than the overall proportion of time spent in each. Foraging effort can either decrease or increase as the proportion of dangerous time increases, depending on how the durations have changed. The model also shows that when food availability varies, animals should forage more in good times if the periods are short, but more in bad times if the periods are long.
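A small sketch shows why duration can matter even when the overall proportion of dangerous time stays fixed. It adds one assumption of my own for illustration: fat reserves are capped, so animals cannot store unlimited fat. All numbers are purely illustrative, not taken from the paper:

```python
# With capped fat reserves, scaling up both safe and dangerous periods
# (keeping their proportion fixed) can force foraging under risk:
# stored fat hits the cap, but the energy needed to sit out the
# dangerous period keeps growing with its length.

def must_forage_in_danger(danger_len, safe_len, cost=1.0, gain=3.0, cap=10.0):
    """True if stored fat cannot cover the whole dangerous period."""
    reserves = min(cap, safe_len * (gain - cost))  # fat built while safe
    return danger_len * cost > reserves

# Same 1:5 proportion of danger to safety in both cases:
print(must_forage_in_danger(danger_len=2,  safe_len=10))   # -> False (can sit it out)
print(must_forage_in_danger(danger_len=20, safe_len=100))  # -> True (forced to forage)
```

So the same fraction of dangerous time produces opposite behaviour depending on how that time is clumped, which is the key departure from the original risk allocation prediction.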


Overall, the model highlights how the clumping of food or predators in time and space should have a huge influence on how and when animals forage for food. The original risk allocation hypothesis has often not been supported by experiments, in that animals did not respond to an increasing frequency of short high-risk periods, such as the number of times per day that a silhouette of a bird predator is passed over fish in a tank. This study shows that such a test is unlikely to produce big changes in foraging effort, because animals can afford to stop foraging completely during short high-risk periods. A better test would alter the duration of dangerous periods while keeping safe periods fairly long. The study also highlights that equally effective tests of risk allocation could instead vary food availability, which may be easier to manipulate, for example by changing the number of times animals must press a lever to get food.


Behavioural ecology

Subject Group

Zoology and Ecology







Posted by


on Wed May 27 2020

Details of original research article:

Higginson AD, Fawcett TW, Trimmer PC, McNamara JM, Houston AI. Generalized Optimal Risk Allocation: Foraging and Antipredator Behavior in a Fluctuating Environment. The American Naturalist. 2012;180:589-603.

Preceded by:

keywords: foraging , mating , food , temperature , survival ...

Managing fat to survive the night explains why birds sing at dawn

Posted by: AndrewDHigginson Posted Wed Jun 03 2020

keywords: predation , foraging , time budget , food , risk ...

Animals must carefully choose when to look for food when the predation risk varies

Posted by: AndrewDHigginson Posted Fri May 22 2020

Followed by:

keywords: fat , foraging , physiology , memory

Gut instinct makes animals appear clever

Posted by: AndrewDHigginson Posted Mon Apr 16 2018
