Biologists study animals, plants, and other living things, as well as their interactions with the environments in which they live. Most biologists major in biology in college, and many go on to complete a doctorate in a specialized field such as microbiology or marine biology. Some of the largest employers of biologists include government agencies, universities, and pharmaceutical companies. The research biologists do expands our knowledge in many areas, from advancing modern medicine to solving environmental problems. There is a wide range of career options available in this field!
Read about women who are biologists: