Delaying vaccination with a modified-live virus (MLV) product in newly received beef calves improves gain and immune response, University of Arkansas researchers say.

Stress commonly associated with weaning, marketing and shipment of feeder cattle can temporarily compromise immune function, thus reducing the effective response to vaccination intended to control bovine respiratory disease (BRD).

Two vaccination timing treatments were used to evaluate the effects of timing of a multivalent MLV BRD vaccine on health, performance and infectious bovine rhinotracheitis (IBR) antibody titers of newly received stocker cattle.

Crossbred bull and steer calves (n = 528) were weighed (average 434 lbs.) and randomly assigned to MLV vaccination treatment 1) upon arrival (AMLV), or 2) delayed 14 days (DMLV).

All cattle were processed similarly according to routine procedures, with the exception of the initial MLV vaccination timing. Subsequently, bodyweight was recorded on days 14, 28 and 42. Blood samples were collected on days 0, 14, 28 and 42 to determine serum IBR titers; comparisons were made between treatments on a receiving-day basis and an equivalent postvaccination-day basis.

Average daily gain (ADG) was greater for DMLV calves from day 0 to 14 (2.56 vs. 1.94 lbs./day) and from day 0 to 42 (1.65 vs. 1.43 lbs./day). Days to first treatment, total treatment cost, percentage death loss and pasture ADG after the 42-day receiving period did not differ. Morbidity rates for BRD were high for both AMLV and DMLV calves (71.5% and 63.5%, respectively) but did not differ statistically.
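As a back-of-envelope illustration (not part of the study itself), the reported ADG figures can be converted into cumulative gain over the 42-day receiving period:

```python
# Illustrative arithmetic only: converts the day 0-42 ADG figures
# reported in the summary above into total gain per head.
RECEIVING_DAYS = 42

adg = {"DMLV": 1.65, "AMLV": 1.43}  # day 0-42 ADG, lbs./day

total_gain = {trt: rate * RECEIVING_DAYS for trt, rate in adg.items()}
advantage = total_gain["DMLV"] - total_gain["AMLV"]

print(total_gain)            # cumulative gain over 42 days, lbs./head
print(round(advantage, 1))   # DMLV advantage, lbs./head
```

On these figures, the delayed-vaccination calves gained roughly 9 lbs. more per head over the receiving period.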

Positive IBR titer seroconversion was greater for DMLV calves on day 42 of the study, and on the 28- and 42-day equivalent postvaccination basis.

Researchers conclude that delaying vaccination by 14 days may increase ADG during the receiving period compared with AMLV, and seroconversion to IBR was greater in DMLV calves, indicating a possible improvement in acquired immune response when MLV vaccination is delayed.
Richeson et al, 2008, Journal of Animal Science, 86:999.

Low-moisture blocks (LMB) are an effective attractant that can be used to lure cattle to graze high elevations away from water, New Mexico and Montana State University researchers say.

A study was conducted to compare the effects of strategically placed salt and LMB on grazing distribution and diurnal behavior patterns of individual cows grazing foothill rangeland in northern Montana during autumn.

The study was divided into two sets, each containing two consecutive 10-day periods. Cows (n = 32) were tracked with global positioning system collars for one set. Salt and LMB were available during one period, and only salt during the other. During both periods, all supplements were placed in approximately the same location (within a 12.4-acre area) on ridges away from water that historically received little use.

When LMB was available, cows used higher elevations (3,878 ft.) and were farther horizontally from water (1,742 ft.) than when only salt (3,842 ft. and 1,594 ft., respectively) was provided.

Cows traveled 2.7 miles/day when supplemented with LMB, and 2.45 miles/day when supplemented with salt. Observed differences between treatments in time spent near supplements were most apparent in the higher terrain between 32 ft. and 328 ft. from placement sites. Cows were more active (not resting) when LMB was available than when only salt was available, but much of the difference in activity between treatments appeared to be consumption of LMB at night. Over a 24-hour period, 47 of the 73 minutes that cows spent within 32 ft. of LMB (a visit) occurred at night.
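The treatment differences above are easier to see side by side. The following is an illustrative sketch using only the figures reported in this summary:

```python
# Illustrative comparison of the LMB and salt-only periods,
# using the figures reported in the summary above.
lmb  = {"elevation_ft": 3878, "dist_to_water_ft": 1742, "travel_mi_per_day": 2.70}
salt = {"elevation_ft": 3842, "dist_to_water_ft": 1594, "travel_mi_per_day": 2.45}

# Difference (LMB minus salt-only) for each measure
diff = {k: round(lmb[k] - salt[k], 2) for k in lmb}
print(diff)
```

On these numbers, cows grazed 36 ft. higher, ranged 148 ft. farther from water and traveled 0.25 mile/day more when LMB was available.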

Results from this study support previous research suggesting that LMB is an effective attractant that can be used to lure cattle to graze high elevations away from water, researchers conclude. Strategic placement of LMB in high terrain away from water may increase grazing use of typically underused terrain by changing where cattle choose to rest and spend the night.
Bailey et al, 2008, Journal of Animal Science, 86:1271.

When competition at the feeder is increased, variability in heifers' final body weights increases, but overall performance is not affected, researchers from Spain find.

Researchers sought to examine the effects of increasing the number of animals per concentrate feeding place on performance, behavior, welfare indicators and ruminal fermentation of feedlot heifers.

Seventy-two Friesian heifers were used in a factorial arrangement, with three treatments and three blocks of similar bodyweight. Treatments consisted of 2 (T2), 4 (T4) or 8 (T8) heifers per feeding place at the concentrate feeder (8 heifers/pen). Concentrates and straw were fed at 8:30 a.m. in individual feeders that allowed ad libitum consumption. During six periods of 28 days each, dry matter intake and average daily gain (ADG) were measured, and blood and rumen samples were taken. Fecal glucocorticoid metabolites and behavior were measured in periods 1, 3 and 6.

Final body weight, ADG and gain-to-feed were not affected by treatments. Variability in final bodyweight among heifers sharing the same pen tended to increase, and concentrate intake decreased linearly as competition increased. The proportion of abscessed livers responded quadratically, with 8%, 4% and 20% for T2, T4 and T8 treatments, respectively.

Concentrate eating time decreased and eating rate increased linearly, whereas the variability between penmates in concentrate eating time was greatest in T4 and T8.

Increasing competition resulted in a quadratic response in daily lying time (greatest in T2), whereas standing time increased linearly. The number of displacements among penmates from the concentrate feeders, as well as the total sum of displacements, increased linearly with increasing competition.

Pen-average fecal glucocorticoid metabolite concentration was not affected by treatments, but the pen maximum concentration responded quadratically, being greatest in T2, with dominant heifers the most affected. Serum haptoglobin concentration increased linearly with competition, particularly in the most subordinate heifers. Increased competition reduced ruminal pH only in periods 1 and 2 and increased ruminal lactate.

Researchers conclude that increasing competition at concentrate feeders increased the variability in final body weight, but performance was not affected. Detrimental effects on animal welfare might be deduced from the altered feeding behavior, reduced resting time and increased aggression. Ruminal lactate and blood haptoglobin indicate that the risk of rumen acidosis might increase with competition, and liver abscesses were most frequent at 8 heifers per feeding place.
Gonzalez et al, 2008, Journal of Animal Science, 86:1446.

Calves from the Southeastern U.S. returned greater profit per head than Midwest calves, a collaborative study between Iowa State University and Certified Angus Beef (CAB) shows.

Data from a total of 27,538 feeder calves from 15 states fed in 10 Iowa feedlots over six years (2002-07) were used to evaluate the effect of calf origin on feedlot performance and carcass traits. Ten Southeast (SE) and five Midwest (M) states were represented. Numbers of calves from each region were 18,228 and 9,310 from SE and M, respectively.

Morbidity rates, treatment costs and mortality rates for SE and M calves were 15.22%, $5.01/head and 1.43%; and 30.76%, $7.38/head and 1.76%, respectively. The percentage of Prime, Choice, Select and Standard carcasses did not differ significantly between regions.

However, of the black-hided Angus calves eligible for the CAB program, a significantly higher percentage of SE than M calves was accepted (21.57% vs. 19.02%, respectively).

When considering all costs and returns, the SE calves had significantly greater profit/head than M calves ($48.63 vs. $37.31).
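A simple illustration (not from the study itself) shows how the per-head figures reported above translate into regional differences:

```python
# Illustrative arithmetic only: per-head figures taken directly
# from the summary above; nothing is re-derived from raw data.
regions = {
    "SE": {"profit": 48.63, "treatment_cost": 5.01, "morbidity_pct": 15.22},
    "M":  {"profit": 37.31, "treatment_cost": 7.38, "morbidity_pct": 30.76},
}

profit_advantage = regions["SE"]["profit"] - regions["M"]["profit"]
treatment_savings = regions["M"]["treatment_cost"] - regions["SE"]["treatment_cost"]

print(f"SE profit advantage:       ${profit_advantage:.2f}/head")
print(f"SE treatment-cost savings: ${treatment_savings:.2f}/head")
```

On these figures, the SE advantage works out to $11.32/head in profit, of which $2.37/head is lower treatment cost.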

Researchers concluded that SE calves had fewer health problems, higher CAB acceptance rates and greater profit/head, while M calves tended to have slightly better feedlot performance.
Busby et al, 2008, Southern Section ASAS, Abstract 18.