Do Diets EVER Work?

Do diets ever work? A new study just out in the ‘Journal of Nutrition’ suggests that focusing on weight loss can actually lead to weight gain and can even have negative effects on overall health. Co-authors Linda Bacon and Lucy Aphramor cite evidence from almost 200 studies on what to do instead…