Minerals for beef cattle are a difficult topic to discuss: while there are libraries of data on minerals and their role in beef cattle production, we still know surprisingly little. One topic of great interest is mineral intake, and whenever minerals come up at Extension meetings, intake comes up with them. Intake is a measurement that is "felt" by beef producers, and thus it is hard to quantify. Producers know when they are putting out mineral, but are they consistent about putting it out? One theme that comes up time and time again is the role forage quality plays in mineral intake, with the consensus being that as forage quality declines, mineral intake increases. I would disagree with that thought. The thinking behind the idea is sound: since forage quality has decreased, the animal "knows" it needs more minerals and therefore eats more mineral.

While summarizing data from an implant study we had conducted over a two-year period, I realized something. We had forage quality data on a roughly 28-day schedule, as well as mineral issue and weighback information, collected mainly to see whether implant status affected mineral intake (it did not). Although unrelated to implant status, I had both mineral intake data and forage quality data covering a 110- to 150-day period, so I could analyze whether mineral intake was related to forage quality. If conventional wisdom were correct, intake should increase as the growing season progressed and the forage lost quality. In general terms, across six sampling periods from January to late May, we looked at how forage quality parameters (crude protein, acid detergent fiber, and dry matter) related to mineral intake.
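The kind of check described above can be sketched simply: pair each sampling period's forage quality value with its mineral intake and compute a correlation. The numbers below are purely hypothetical placeholders, not the study's data, and the variable names are my own illustration.

```python
# Hypothetical sketch: does mineral intake track forage quality?
# The six values per list are made-up placeholders, NOT the study's data.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Six sampling periods, January through late May (hypothetical numbers)
crude_protein  = [14.2, 13.5, 12.8, 11.9, 10.6, 9.8]   # CP, %
mineral_intake = [2.1, 2.4, 2.2, 2.6, 2.3, 2.5]        # oz/head/day

r = pearson(crude_protein, mineral_intake)
print(f"r = {r:.2f}")
# A strongly negative r would support the conventional wisdom
# (intake rising as crude protein falls); an r near zero would not.
```

The same calculation would be repeated for acid detergent fiber and dry matter against intake.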