Measuring circulating levels of
stress hormones (cortisol, corticosterone, etc.) in wild animals can provide
valuable information for many types of studies. For example, conservation
biologists may want to determine if high levels of snowmobile traffic are
associated with increased cortisol levels in elk. Behavioral ecologists may
want to know if white-faced capuchins (a type of monkey from Central/South
America) have lower levels of cortisol if they come from a troop with a high
level of social support. And physiological ecologists may be interested in
whether Galapagos marine iguanas have a corticosterone rhythm driven by
photoperiod or tidal cycle (*see bottom of page for more details).
In order to measure levels of
corticosterone or cortisol (CORT), researchers need to obtain blood, feces,
urine, saliva, feathers, or hair from their animals. Here’s some information
about each method and its advantages and disadvantages:
Blood
This is the most common way to
measure CORT concentrations. Because hormones circulate through the blood, this
method provides researchers with the most functional measure of CORT. Since
CORT has a very high concentration in the blood (compared to other hormones),
blood samples do not need to be very big (for degus, 30 µL of whole blood is
usually enough to determine baseline CORT). The downside to using blood samples
is that within 3 minutes of encountering a stressor, an animal’s CORT levels
start to increase. So, if a researcher wants to determine baseline levels of
CORT, they must get a blood sample within 3 minutes of capture.
Another downside of using blood
samples is that each sample is only a snapshot of an animal’s stress response.
In order to get a more integrated picture of an animal’s stress profile,
researchers usually take a series of blood samples over the course of 1-2 hours.
That way, they can see the rise from baseline, the peak of the CORT response,
and then the gradual decrease caused by negative feedback.
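To make that concrete, here's a minimal sketch (Python, with entirely invented numbers for a hypothetical capture-stress series) of how such a series is often summarized: the baseline value, the peak, and an integrated response calculated as the area under the curve above baseline.

```python
import numpy as np

# Invented capture-stress series for one animal, purely for illustration:
# time since capture (min) and plasma CORT (ng/mL) at each blood sample.
time_min = np.array([2.0, 15.0, 30.0, 60.0, 120.0])
cort = np.array([8.0, 45.0, 60.0, 38.0, 15.0])

baseline = cort[0]        # the sample drawn within ~3 min of capture
peak = cort.max()         # height of the stress response
scope = peak - baseline   # how far CORT rose above baseline

# Integrated response: area under the curve above baseline (trapezoid rule),
# one common way to summarize total CORT exposure over the sampling period.
above = cort - baseline
auc = np.sum(np.diff(time_min) * (above[1:] + above[:-1]) / 2.0)

print(f"baseline {baseline:.1f} ng/mL, peak {peak:.1f} ng/mL, "
      f"AUC above baseline {auc:.0f} ng/mL x min")
```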
After collection, blood samples must be kept chilled until they can be spun in a centrifuge (ideally within 24 hours of collection). Once the samples are spun, the plasma
or serum (the clear layer on the top) needs to be drawn off and saved for
analysis. The plasma or serum must be frozen at -20 °C and can be stored for several
months. These requirements are fairly reasonable, but sometimes field sites are
far away from cities. Last year I worked in a national park that was pretty remote,
so I had to manually spin my samples with a hand-crank centrifuge and store them in a cooler full of ice. Luckily, steroid hormones like CORT are fairly
sturdy, so lengthy processing times aren’t a big deal.
Finally, to process the plasma
samples, researchers need to run a competitive binding assay called a radioimmunoassay
(RIA). Basically, known amounts of radiolabelled CORT and CORT-antibody are
added to the sample. The radiolabelled CORT and normal CORT from the sample
will compete to bind with the antibody. The unbound radiolabelled CORT is then
washed off, and the radioactivity is measured. The more CORT you have in your
sample, the more it will bind with the antibody and displace the radiolabelled
CORT. Therefore, the higher the CORT in the sample, the lower the radioactivity.
RIAs are fairly straightforward but require some special equipment, so they
usually have to be run in laboratories that frequently measure hormone
concentrations (like my lab!). The enzyme immunoassay (EIA) is another way to
measure hormone concentrations and requires less equipment, but is generally more
expensive to run than an RIA.
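For what the number-crunching actually looks like: the raw output of an RIA is a radioactivity count for each tube, and concentrations are read off a standard curve built from tubes with known amounts of CORT. Here's a rough sketch (Python; all standard and sample values are invented) of one common approach, fitting a four-parameter logistic curve to the standards and then inverting it for the unknowns.

```python
import numpy as np
from scipy.optimize import curve_fit

# Four-parameter logistic: counts fall as CORT concentration rises
# (competitive binding), so the curve runs from 'top' (no CORT) to 'bottom'.
def four_pl(conc, top, bottom, ec50, slope):
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** slope)

# Invented standard curve: known CORT concentrations (ng/mL) and the
# radioactivity counts measured for each standard tube.
std_conc = np.array([0.25, 0.5, 1, 2, 5, 10, 20, 50])
std_counts = np.array([9200, 8600, 7800, 6500, 4700, 3300, 2100, 1200])

params, _ = curve_fit(four_pl, std_conc, std_counts, p0=[9500, 1000, 5, 1])
top, bottom, ec50, slope = params

def counts_to_conc(counts):
    """Invert the fitted curve: counts -> estimated CORT concentration."""
    return ec50 * ((top - bottom) / (counts - bottom) - 1.0) ** (1.0 / slope)

# Unknown plasma samples: lower counts mean more CORT in the sample.
unknown_counts = np.array([5200, 2600])
print(counts_to_conc(unknown_counts))
```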
As an additional step, researchers
may also want to measure levels of corticosteroid binding globulins (CBG). CORT
binds to CBG in the bloodstream, and it’s thought that only unbound CORT is
“free” and able to bind to cell glucocorticoid receptors. Therefore, the
functional measure of CORT is the “free” CORT, so some researchers also measure
CBG levels to determine the relative amounts of unbound CORT. However, it's still debated whether only unbound CORT can bind to glucocorticoid receptors, and it has also been pointed out that without CBG, CORT would be unable to circulate throughout the bloodstream and reach all the cells in the body.
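If you do go down this road, the usual calculation treats CORT and CBG as a simple 1:1 binding equilibrium, which needs the total CORT concentration, the CBG binding capacity, and the CORT-CBG dissociation constant. Here's a hedged sketch, with invented numbers just to show the arithmetic.

```python
import numpy as np

def free_cort(total_nM, cbg_nM, kd_nM):
    """Free hormone concentration from simple 1:1 mass-action binding.

    Solves  free + CBG_free <-> bound  with  Kd = free * CBG_free / bound,
    which reduces to a quadratic in the free concentration.
    All inputs must be in the same molar units (here nM).
    """
    b = cbg_nM - total_nM + kd_nM
    return (-b + np.sqrt(b ** 2 + 4.0 * kd_nM * total_nM)) / 2.0

# Invented, order-of-magnitude numbers purely for illustration: a total plasma
# CORT, a CBG binding capacity above it, and a dissociation constant that
# would normally come from an equilibrium binding assay for your species.
total = 50.0   # nM total plasma CORT
cbg = 400.0    # nM CBG binding capacity
kd = 20.0      # nM CORT-CBG dissociation constant

free = free_cort(total, cbg, kd)
print(f"free CORT ~ {free:.1f} nM ({100 * free / total:.0f}% of total)")
```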
[Photo: Loading the centrifuge with my samples.]
[Photo: Pipetting the plasma into a plastic Eppendorf tube, which then goes into a -20 °C freezer.]
Saliva
CORT
can pass from the blood to the saliva, and salivary samples are actually a
common way to measure cortisol levels in humans. While captive animals can be
trained to lick, chew, or drool on something, obtaining sufficient salivary
samples from wild animals in a non-invasive manner is most likely impossible. However, it usually takes 20 minutes after a stressor for CORT levels to increase in the saliva, so taking salivary samples could give researchers more time to collect
a sample after capture.
Feces/Urine
Another way to assess CORT levels
is to measure the metabolic products of CORT in the feces. However, because
this method measures the metabolites of CORT, it doesn’t have quite the same
functional value that a blood sample does. There are several steps involved in the breakdown of CORT, and CORT metabolite formation can be affected by sex,
season, metabolic rate, and diet. Therefore, experimental power is lost when
using fecal samples for CORT analysis because animals cannot be compared
between different seasons, locations, or sexes.
The main advantage of using fecal
samples, however, is that they’re fairly non-invasive. Researchers can follow
individuals and collect feces, or in the case of small mammals, check traps
after a certain period of time and collect any feces from the bottom of the
cage. Using feces to measure CORT levels is also great for repeated sampling,
since frequent blood sampling and other more invasive measures can change an
animal’s stress response.
Using fecal samples to determine
stress hormone levels makes a lot of sense if you’re working with an animal
that’s endangered or difficult to capture (like an arboreal monkey). Another
advantage of using feces is that they represent an integrated measure of CORT
exposure; if you know how long it takes for an animal to form a fecal pellet, then
you can estimate their total CORT exposure over a period of several hours.
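As a toy example of that back-calculation (all numbers invented; real gut-passage and accumulation times are species-specific and come from validation work):

```python
# All numbers invented for illustration; real values are species-specific
# and have to come from validation work.
gut_passage_h = 8.0     # lag between CORT circulating and metabolites in feces
accumulation_h = 4.0    # window over which a single pellet accumulates FGMs

# A pellet collected at time t therefore reflects circulating CORT roughly
# over the interval [t - gut_passage_h - accumulation_h, t - gut_passage_h].
start = gut_passage_h + accumulation_h
end = gut_passage_h
print(f"pellet integrates CORT from ~{start:.0f} h to ~{end:.0f} h "
      f"before collection")
```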
Collecting feces is usually pretty
easy, but investigators should always try to obtain fresh samples from a known
individual. If this isn’t possible because the study animal is hard to find or
lives in an aquatic environment, then scat-detection dogs can be used. The Wasser Lab at the University of Washington (my alma mater!) uses canine trackers to find feces from whales, tigers, wolves, and other elusive endangered animals (http://conservationbiology.net/conservation-canines/#scat).
Storing fecal samples can be tricky; temperature, storage
liquid, autoclaving, and storage time can all affect fecal glucocorticoid
metabolite (FGM) concentrations. Sample mass can also affect FGM
concentrations, as very small samples have disproportionately higher FGM levels.
Getting an adequate sample mass can be difficult when studying small animals
like songbirds, mice, etc., so many studies have to combine several fecal
samples in their assays.
After homogenizing the fecal
samples, FGM concentrations can be measured via RIA or EIA. Another downside of
using fecal samples is that the CORT-antibody may not bind with all the
different CORT metabolites. Therefore, researchers usually need to do a
validation experiment to determine that increased CORT levels in the blood
correspond to increased FGM concentrations in the feces. One advantage of using
feces, though, is that by measuring levels of metabolites the researcher is
essentially measuring “free” CORT, so there’s no need to measure levels of CBG.
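The statistics behind such a validation can be as simple as checking that samples with higher plasma CORT also show higher FGM after the appropriate gut-passage lag. A toy sketch (Python, entirely invented numbers):

```python
import numpy as np
from scipy.stats import pearsonr

# Invented validation data: for each animal, plasma CORT (ng/mL) during a
# standardized stressor, and FGM (ng/g dry feces) in the sample collected
# roughly one gut-passage time later.
plasma_cort = np.array([12, 25, 30, 48, 55, 70, 82, 95])
fgm = np.array([150, 310, 280, 520, 600, 640, 880, 900])

r, p = pearsonr(plasma_cort, fgm)
print(f"r = {r:.2f}, p = {p:.3f}")
# A strong positive correlation is the usual evidence that the assay's
# antibody is tracking biologically meaningful changes in CORT secretion.
```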
Like feces, urine also contains
glucocorticoid metabolites (and some unmetabolized CORT). In the field,
however, it’s very difficult to collect urine samples in a non-invasive manner,
so urine collection for CORT analysis is rarely used outside of the laboratory.
Feathers/Hair
Believe
it or not, you can actually measure CORT in feathers. During molt (feather
growth), each individual feather is vascularized, and CORT from the blood can
be deposited in the feather. After the feather stops growing, the blood supply
is cut off and, supposedly, no further CORT can be deposited in the feather.
Therefore, feathers provide a good, integrative measure of CORT during the
period of molt.
Plucking
a feather is easy to do and there’s no time limit, unlike blood sampling. Using
feathers can also be non-invasive if the researcher obtains naturally molted
feathers. And another big advantage of using feathers to determine CORT levels
is that feathers from dead birds can also be assayed (like museum specimens!). Also,
feathers do not need to be frozen or stored in any specific manner.
The
big downside to using feathers for CORT analysis is that we’re still not too
sure what form of CORT we’re actually measuring. Different antibodies have been
found to have different levels of success during feather CORT analysis, which
has made researchers unsure whether they’re measuring CORT, CORT metabolites,
or other molecules similar in structure to CORT.
Another downside to using feathers
is that, like feces, there’s a sample mass bias where smaller feather samples
have disproportionately higher CORT. Getting a large enough sample can be
difficult because researchers don’t want to compromise a bird’s flying ability,
and there’s also the problem that different feathers are grown at different
times, so pooling certain feathers together may not be appropriate. Feather
mass requirements can also prevent researchers from examining portions of the
feather (sections near the tip of the feather are older, and thus represent a different time period than sections near the base), but a recent study found
that different feather sections didn’t correspond to the CORT concentrations in
the blood at the time of their growth, anyway. Feather CORT can be measured via
RIA, but the preparation is a real pain (you have to mince the feather into
little, tiny bits).
CORT
can also be deposited in hair during follicle growth. One advantage of hair
samples is that they can represent CORT exposure over a very long period of time.
The downside to hair samples is that it’s hard to collect a large enough sample
in a non-invasive manner. And, like feathers, there are still a lot of
questions about what the antibody is actually binding to in the assay. Another
disadvantage to using hair is that the sample could be contaminated with other CORT-containing secretions, like saliva or sweat.
So, I've tried to give a basic, if somewhat extensive, overview of the different ways to measure CORT in wild animals. After I was almost done writing this blog post, I came across a great review paper that covers everything I mentioned and more: Sheriff, M.J. et al. (2011) Measuring stress in wildlife: techniques for quantifying glucocorticoids. Oecologia 166: 869–887. Also, here's an excellent review on fecal CORT: Goymann, W. (2012) On the use of non-invasive hormone research in uncontrolled, natural environments: the problem with sex, diet, metabolic rate and the individual. Methods in Ecology and Evolution 3: 757–765. And finally, my favorite feather CORT paper: Lattin, C.R. et al. (2011) Elevated corticosterone in feathers correlates with corticosterone-induced decreased feather quality: a validation study. Journal of Avian Biology 42: 247–252.