Nutrition Research Division, Bells Corners, Canada

Cooper M., Nutrition Research Division
Health Reports / Statistics Canada, Canadian Centre for Health Information | Year: 2012

Iron deficiency is the most common nutritional deficiency in the world, but little is known about the iron status of people in Canada, where the last estimates date from 1970-1972. The data are from cycle 2 (2009 to 2011) of the Canadian Health Measures Survey, which collected blood samples from a nationally representative sample of Canadians aged 3 to 79. Descriptive statistics (percentages, arithmetic means, geometric means) were used to estimate hemoglobin and serum ferritin concentrations and other markers of iron status. Analyses were performed by age/sex group, household income, self-perceived health, diet, and use of iron supplements. World Health Organization reference values (2001) were used to estimate the prevalence of iron sufficiency and anemia. The overall prevalence of anemia was low in the 2009-to-2011 period: 97% of Canadians had sufficient hemoglobin levels. Hemoglobin concentrations generally increased compared with 1970-1972; however, at ages 65 to 79, rates of anemia were higher than in 1970-1972. Depleted iron stores were found in 13% of females aged 12 to 19 and 9% of females aged 20 to 49. Lower household income was associated with a lower prevalence of hemoglobin sufficiency, but was not related to lower serum ferritin sufficiency. Self-perceived health and diet were not significantly associated with hemoglobin and serum ferritin levels. The lack of a relationship between iron status and diet may be attributable to the use of questions about food consumption frequency that were not specifically designed to estimate dietary iron intake. Factors other than iron intake might have contributed to the increase in the prevalence of anemia among seniors.
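The abstract applies WHO (2001) reference values without reproducing them. As a rough illustration of how such cutoffs translate into prevalence estimates, the Python sketch below classifies anemia and depleted iron stores using commonly cited WHO thresholds (hemoglobin below 130 g/L for adult men and 120 g/L for adult women; serum ferritin below 15 µg/L at ages 5 and over) and includes the geometric mean reported for skewed markers. The survey's actual strata are more detailed than this, so the cutoffs and data here are illustrative assumptions, not the study's variables.

```python
import math

# Assumed WHO (2001) cutoffs, simplified for illustration; the survey
# uses finer age/sex strata than shown here.
def hemoglobin_cutoff_g_per_l(age: int, sex: str) -> float:
    """Hemoglobin threshold below which a person is classified anemic."""
    if age < 12:
        return 115.0                        # children 5-11 y
    if age < 15:
        return 120.0                        # adolescents 12-14 y
    return 130.0 if sex == "M" else 120.0   # adult men / non-pregnant women

def is_anemic(hgb_g_per_l: float, age: int, sex: str) -> bool:
    return hgb_g_per_l < hemoglobin_cutoff_g_per_l(age, sex)

def has_depleted_iron_stores(ferritin_ug_per_l: float, age: int) -> bool:
    # WHO: ferritin < 12 ug/L under age 5, < 15 ug/L at 5+ years
    return ferritin_ug_per_l < (12.0 if age < 5 else 15.0)

def geometric_mean(values):
    """Geometric mean, the summary used for skewed markers like ferritin."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Toy data: (hemoglobin g/L, age, sex) for three hypothetical respondents
sample = [(138, 34, "M"), (118, 28, "F"), (125, 70, "F")]
anemic = sum(is_anemic(h, a, s) for h, a, s in sample)
print(f"anemia prevalence: {anemic / len(sample):.0%}")
```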


Gilani G.S., Nutrition Research Division
British Journal of Nutrition | Year: 2012

The subject of protein quality assessment of foods and diets was addressed by the Codex Committee on Vegetable Proteins (1982-1989), FAO/WHO (1989, 2001) and WHO/FAO (2002) expert reviews. These international developments are summarized in this paper. In 1989, a Joint FAO/WHO Expert Consultation on Protein Quality Evaluation reviewed knowledge of protein quality assessment of foods, and specifically evaluated the amino acid score corrected for protein digestibility, the method recommended by the Codex Committee on Vegetable Proteins. The report of the Consultation, published in 1991, concluded that the Protein Digestibility-Corrected Amino Acid Score (PDCAAS) method was the most suitable approach for routine evaluation of protein quality for humans. The Consultation recognized that the amino acid scoring pattern proposed by FAO/WHO/UNU (1985) for preschool children was at that time the most suitable pattern for calculating PDCAAS for all ages except infants, in which case the amino acid composition of human milk was recommended as the basis of the scoring pattern. The rat balance method was considered the most suitable practical method for predicting protein digestibility by humans. Since its adoption by FAO/WHO (1991), the PDCAAS method has been criticised for a number of reasons. The FAO/WHO (2001) Working Group on analytical issues related to protein quality assessed the validity of criticisms of the PDCAAS method. While recognizing a distinct regulatory use of protein quality data, the Working Group concluded that the PDCAAS method may be inappropriate for the routine prediction of protein quality of novel and sole-source foods that contain high levels of antinutritional factors, and recommended that, for regulatory purposes, the method be revised to permit values of >100 for high-quality proteins. In evaluating the recommendations of the Working Group, the WHO/FAO (2002) Expert Consultation on Protein and Amino Acid Requirements endorsed the PDCAAS method with minor modifications to the calculation method, but also raised several issues. These included the calculation of scoring patterns; prediction of amino acid digestibility by faecal and ileal methods; reduced bioavailability of lysine in processed proteins; truncation of the amino acid score and the consequent PDCAAS value; protein digestibility as a first limiting factor in determining the overall available dietary nitrogen; and the calculation of the amino acid score for a dietary protein mixture. These concerns were considered particularly important in relation to the regulatory aspects of protein quality of foods, and their resolution through a new separate expert review was urgently recommended. © 2012 The Author.
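The calculation behind the PDCAAS debate is compact: the score of the first-limiting indispensable amino acid against the reference scoring pattern, multiplied by true faecal protein digestibility, conventionally truncated at 1.0 (100%), which is exactly the truncation the 2002 Consultation questioned for regulatory use. Below is a minimal Python sketch; the scoring-pattern values are the FAO/WHO (1991) preschool-child pattern as commonly tabulated, and both the pattern and the toy food composition should be treated as illustrative assumptions.

```python
# FAO/WHO (1991) preschool-child (2-5 y) scoring pattern,
# mg amino acid per g protein (assumed transcription; verify
# against the report before any real use).
REFERENCE_PATTERN = {
    "His": 19, "Ile": 28, "Leu": 66, "Lys": 58,
    "Met+Cys": 25, "Phe+Tyr": 63, "Thr": 34, "Trp": 11, "Val": 35,
}

def pdcaas(aa_mg_per_g_protein: dict, true_digestibility: float,
           truncate: bool = True) -> float:
    """Protein Digestibility-Corrected Amino Acid Score.

    Amino acid score = content of the first-limiting amino acid
    relative to the reference pattern; PDCAAS = score x digestibility,
    conventionally truncated at 1.0 for labelling purposes.
    """
    score = min(aa_mg_per_g_protein[aa] / ref
                for aa, ref in REFERENCE_PATTERN.items())
    value = score * true_digestibility
    return min(value, 1.0) if truncate else value

# Toy example: a lysine-limited cereal protein with 85% digestibility
cereal = {"His": 21, "Ile": 35, "Leu": 70, "Lys": 29,
          "Met+Cys": 38, "Phe+Tyr": 74, "Thr": 34, "Trp": 12, "Val": 42}
print(f"PDCAAS = {pdcaas(cereal, true_digestibility=0.85):.2f}")
```

Without truncation, values above 1.0 would distinguish high-quality proteins from one another, which is the regulatory revision the 2001 Working Group recommended.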


Aziz A., Nutrition Research Division | Dumais L., Regulations | Barber J., Regulations
American Journal of Clinical Nutrition | Year: 2013

The glycemic index (GI) is a system that ranks foods according to the blood glucose-increasing potential of servings of foods that provide the same amount of available carbohydrate. The GI was originally developed as a tool for carbohydrate exchange in the dietary management of glycemia in persons with diabetes, and studies have generally supported modest benefits of low-GI diets in this population. Despite inconsistent results for the utility of the GI in the nondiabetic population, there is some interest in its universal application on food labels to assist consumers in making food choices that would help them meet their dietary goals. The objective of this review was to evaluate the usefulness of including the GI values of foods as part of the information on food labels in Canada. Health Canada's assessment identified 3 areas of concern with respect to GI labeling: 1) the GI measure has poor accuracy and precision for labeling purposes; 2) as a ratio, the GI does not vary in response to the amount of food consumed and the partial replacement of available carbohydrates with unavailable carbohydrates, whereas the glycemic response does; and 3) an unintended focus on the GI for food selection could lead to food choices that are inconsistent with national dietary guidelines. Hence, Health Canada's current opinion is that the inclusion of the GI value on the label of eligible food products would be misleading and would not add value to nutrition labeling and dietary guidelines in assisting consumers to make healthier food choices. © 2013 American Society for Nutrition.
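For context, the GI referenced here is measured in vivo: subjects consume portions of the test food and a reference (usually glucose) providing the same available carbohydrate, blood glucose is sampled over about two hours, and the GI is the incremental area under the glucose curve (iAUC, counting only area above the fasting baseline) for the test food expressed as a percentage of the reference. The Python sketch below uses a simplified trapezoidal iAUC that clips readings at baseline, a common approximation of the standard geometric method; the data are invented.

```python
# Simplified GI calculation: trapezoidal incremental AUC above the
# fasting baseline (area below baseline ignored), then test/reference
# ratio x 100. Illustrative only; standard protocols compute the iAUC
# geometrically and average the ratios across a small subject panel.
def incremental_auc(times_min, glucose_mmol_l):
    baseline = glucose_mmol_l[0]  # fasting reading at t = 0
    inc = [max(g - baseline, 0.0) for g in glucose_mmol_l]
    return sum((inc[i] + inc[i + 1]) / 2.0 * (times_min[i + 1] - times_min[i])
               for i in range(len(times_min) - 1))

def glycemic_index(times, test, reference):
    return 100.0 * incremental_auc(times, test) / incremental_auc(times, reference)

# Toy data: one subject, readings every 30 min for 2 h (mmol/L)
t = [0, 30, 60, 90, 120]
ref = [5.0, 8.5, 7.5, 6.0, 5.2]    # reference: 50 g glucose drink
food = [5.0, 6.8, 6.9, 6.1, 5.4]   # test food, same available carbohydrate
print(f"GI ~= {glycemic_index(t, food, ref):.0f}")
```

Because the GI is a ratio of two noisy measurements averaged over a small panel, it carries substantial between- and within-subject variability, which relates to the accuracy-and-precision concern (point 1) in Health Canada's assessment.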


Sinclair S.E., Regulations | Cooper M., Nutrition Research Division | Mansfield E.D., Regulations
Journal of the Academy of Nutrition and Dietetics | Year: 2014

Recent menu labeling initiatives in North America involve posting the calorie content of standard menu items, sometimes with other nutrients of public health concern, with or without contextual information (such as the recommended daily caloric intake for an average adult) or interpretive information (such as traffic light symbols). It is not clear whether this is an effective method to convey nutrition information to consumers wanting to make more informed food choices. Of particular concern are those consumers who may be limited in the food and health literacy skills needed to make informed food choices that meet their dietary needs or goals. The purpose of this systematic review was to determine whether the provision of menu-based nutrition information affects the selection and consumption of calories in restaurants and other foodservice establishments. A secondary objective was to determine whether the format of the nutrition information (informative vs contextual or interpretive) influences calorie selection or consumption. Several bibliographic databases were searched for experimental or quasi-experimental studies that tested the effect of providing nutrition information in a restaurant or other foodservice setting on calories selected or consumed. Studies that recruited generally healthy, noninstitutionalized adolescents or adults were included. When two or more studies reported similar outcomes and sufficient data were available, meta-analysis was performed. Menu labeling with calories alone did not have the intended effect of decreasing calories selected or consumed (-31 kcal [P = 0.35] and -13 kcal [P = 0.61], respectively). The addition of contextual or interpretive nutrition information on menus appeared to assist consumers in selecting and consuming fewer calories (-67 kcal [P = 0.008] and -81 kcal [P = 0.007], respectively). Sex influenced the effect of menu labeling on the selection and consumption of calories, with women using the information to select and consume fewer calories. The findings of this review support the inclusion of contextual or interpretive nutrition information with calories on restaurant menus to help consumers select and consume fewer calories when eating outside the home. Further exploration is needed to determine the optimal approach for providing this menu-based nutrition information, particularly for consumers who may be limited in their food and health literacy skills. © 2014.
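The abstract does not specify the pooling model, but pooled mean differences with P values (for example, -67 kcal, P = 0.008) are the typical output of an inverse-variance meta-analysis of mean differences. The Python sketch below shows that generic calculation with invented study data; it is not the authors' analysis or their included studies.

```python
import math

# Generic inverse-variance (fixed-effect) pooling of mean differences.
# Hypothetical inputs -- NOT the studies included in the review.
studies = [
    # (mean difference in kcal selected, standard error)
    (-90.0, 40.0),
    (-45.0, 25.0),
    (-70.0, 35.0),
]

weights = [1.0 / se ** 2 for _, se in studies]
pooled = sum(w * md for (md, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

# Two-sided P value from the normal approximation
z = pooled / pooled_se
p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

print(f"pooled difference: {pooled:.1f} kcal (SE {pooled_se:.1f}), P = {p:.3f}")
```

A random-effects model, which widens the pooled standard error when studies disagree, is the usual alternative when between-study heterogeneity is expected, as it plausibly is across different foodservice settings.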


Chan Y.-M., University of Toronto | Chan Y.-M., The Research Institute | MacFarlane A.J., Nutrition Research Division | O'Connor D.L., University of Toronto | O'Connor D.L., The Research Institute
Journal of Nutrition | Year: 2015

Background: Mandatory folic acid fortification of white-wheat flour and selected other grain products has reduced the prevalence of neural tube defects in Canada; however, the fortification of whole-wheat flour is not permitted. Objective: The objective of this study was to model the impact of adding folic acid to whole-wheat flour on the folate intake distribution of Canadians. Methods: Twenty-four-hour dietary recall and supplement intake data (n = 35,107) collected in the 2004 Canadian Community Health Survey (Cycle 2.2) were used to calculate the prevalence of folate inadequacy (POFI) and the proportion of folic acid intakes above the Tolerable Upper Intake Level (UL). In model 1, folic acid was added to whole-wheat flour-containing foods in amounts comparable to those that are mandatory for white-wheat flour-containing foods. In model 2, a 50% overage of folic acid fortification was considered. Models 3 and 4 assessed folate intake distributions in adult whole-wheat consumers with or without a fortification overage. SIDE (Software for Intake Distribution Estimation; Department of Statistics and Center for Agricultural and Rural Development, Iowa State University) was used to estimate usual folate intakes. Results: Mean folate intakes increased by ~5% in all sex and age groups when whole-wheat foods were fortified (models 1 and 2; P < 0.0001). Folic acid fortification of whole-wheat flour-containing foods did not change the POFI or the percentage of intakes above the UL in the general population, whether in supplement users or nonusers. Among whole-wheat consumers, the POFI was reduced by 10 percentage points after fortification of whole-wheat flour-containing foods (95% CIs did not overlap). The percentage of whole-wheat consumers with intakes above the UL did not change. Conclusion: Although folic acid fortification of whole-wheat flour-containing foods is unlikely to change the POFI or the proportion of folic acid intakes above the UL in the general Canadian population, this fortification strategy may reduce the POFI in adult whole-wheat consumers. © 2015 American Society for Nutrition.
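The two headline metrics are simple once usual-intake distributions are in hand (the estimation step SIDE performs): POFI is the share of usual intakes, in dietary folate equivalents (DFE), that fall below the Estimated Average Requirement, and UL exceedance is the share of usual folic acid intakes above the Tolerable Upper Intake Level. The Python sketch below applies the adult reference values (EAR = 320 µg DFE/day; UL = 1,000 µg folic acid/day) and the standard DFE convention that folic acid consumed with food counts at 1.7 times food folate; the fortification increment and intake data are invented for illustration.

```python
# EAR cut-point / UL sketch for adults. Reference values are the
# commonly cited DRIs (assumed here): EAR = 320 ug DFE/day,
# UL = 1000 ug folic acid/day.
EAR_DFE = 320.0
UL_FOLIC_ACID = 1000.0

def dfe(food_folate_ug: float, folic_acid_ug: float) -> float:
    """Dietary folate equivalents: folic acid with food counts 1.7x."""
    return food_folate_ug + 1.7 * folic_acid_ug

def prevalence_of_inadequacy(usual_dfe_intakes):
    """EAR cut-point method: share of usual intakes below the EAR."""
    return sum(i < EAR_DFE for i in usual_dfe_intakes) / len(usual_dfe_intakes)

def prevalence_above_ul(usual_folic_acid_intakes):
    return (sum(i > UL_FOLIC_ACID for i in usual_folic_acid_intakes)
            / len(usual_folic_acid_intakes))

# Toy scenario: each whole-wheat consumer gains a hypothetical
# 30 ug/day of folic acid from newly fortified whole-wheat foods.
baseline = [(210, 60), (250, 90), (180, 40), (300, 120)]  # (food folate, folic acid) ug/day
before = [dfe(f, fa) for f, fa in baseline]
after = [dfe(f, fa + 30) for f, fa in baseline]
print(f"POFI before: {prevalence_of_inadequacy(before):.0%}, "
      f"after: {prevalence_of_inadequacy(after):.0%}")
```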
