MacFarlane A.J.,Nutrition Research Division | MacFarlane A.J.,University of Ottawa | MacFarlane A.J.,Carleton University | Greene-Finestone L.S.,Public Health Agency of Canada | And 2 more authors.
American Journal of Clinical Nutrition | Year: 2011

Background: Vitamin B-12 is an important cofactor required for nucleotide and amino acid metabolism. Vitamin B-12 deficiency causes anemia and neurologic abnormalities - a cause for concern for the elderly, who are at increased risk of vitamin B-12 malabsorption. Vitamin B-12 deficiency is also associated with an increased risk of neural tube defects and hyperhomocysteinemia. The metabolism of vitamin B-12 and folate is interdependent, which makes it of public health interest to monitor biomarkers of vitamin B-12, folate, and homocysteine in a folic acid-fortified population. Objective: The objective was to determine the vitamin B-12, folate, and homocysteine status of the Canadian population in the period after folic acid fortification was initiated. Design: Blood was collected from a nationally representative sample of ∼5600 participants aged 6-79 y in the Canadian Health Measures Survey during 2007-2009 and was analyzed for serum vitamin B-12, red blood cell folate, and plasma total homocysteine (tHcy). Results: A total of 4.6% of Canadians were vitamin B-12 deficient (<148 pmol/L). Folate deficiency (<320 nmol/L) was essentially nonexistent. Obese individuals were less likely to be vitamin B-12 adequate than were individuals with a normal BMI. A total of 94.9% of Canadians had a normal tHcy status (≤13 μmol/L), and individuals with normal tHcy were more likely to be vitamin B-12 adequate and to have high folate status (>1090 nmol/L). Conclusions: Approximately 5% of Canadians are vitamin B-12 deficient. One percent of adult Canadians have metabolic vitamin B-12 deficiency, as evidenced by combined vitamin B-12 deficiency and high tHcy status. In a folate-replete population, vitamin B-12 is a major determinant of tHcy. © 2011 American Society for Nutrition.
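
The status cutoffs quoted in this abstract translate directly into a classification rule. Below is a minimal sketch, assuming per-participant biomarker values in the stated units; the function and field names are illustrative, not from the paper.

```python
# Minimal sketch: applying the abstract's cutoffs to one participant's
# biomarker values. Names and structure are illustrative, not from the paper.

def classify_status(b12_pmol_l: float, rbc_folate_nmol_l: float,
                    thcy_umol_l: float) -> dict:
    """Classify B-vitamin status using the cutoffs quoted in the abstract."""
    return {
        "b12_deficient": b12_pmol_l < 148,            # serum vitamin B-12
        "folate_deficient": rbc_folate_nmol_l < 320,  # red blood cell folate
        "high_folate": rbc_folate_nmol_l > 1090,
        "normal_thcy": thcy_umol_l <= 13,             # plasma total homocysteine
        # "Metabolic" B-12 deficiency = deficient B-12 plus elevated tHcy
        "metabolic_b12_deficiency": b12_pmol_l < 148 and thcy_umol_l > 13,
    }

print(classify_status(b12_pmol_l=120, rbc_folate_nmol_l=900, thcy_umol_l=15))
```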


MacFarlane A.J.,Cornell University | MacFarlane A.J.,Nutrition Research Division | Perry C.A.,Cornell University | McEntee M.F.,University of Tennessee at Knoxville | And 2 more authors.
Cancer Research | Year: 2011

Folate-mediated one-carbon metabolism is required for the de novo synthesis of purines, thymidylate, and S-adenosylmethionine, the primary cellular methyl donor. Impairments in folate metabolism diminish cellular methylation potential and genome stability, which are risk factors for colorectal cancer (CRC). Cytoplasmic serine hydroxymethyltransferase (SHMT1) regulates the partitioning of folate-activated one-carbons between thymidylate and S-adenosylmethionine biosynthesis. Therefore, changes in SHMT1 expression enable the determination of the specific contributions made by thymidylate and S-adenosylmethionine biosynthesis to CRC risk. Shmt1 hemizygosity was associated with a decreased capacity for thymidylate synthesis due to downregulation of enzymes in its biosynthetic pathway, namely thymidylate synthase and cytoplasmic thymidine kinase. Significant Shmt1-dependent changes to methylation capacity, gene expression, and purine synthesis were not observed. Shmt1 hemizygosity was associated with increased risk for intestinal cancer in Apc(min/+) mice through a gene-by-diet interaction, indicating that the capacity for thymidylate synthesis modifies susceptibility to intestinal cancer in Apc(min/+) mice. ©2011 AACR.


Zinck J.W.R.,Public Health Agency of Canada | Zinck J.W.R.,Nutrition Research Division | De Groh M.,Public Health Agency of Canada | MacFarlane A.J.,Nutrition Research Division
American Journal of Clinical Nutrition | Year: 2015

Background: Genetic variation can cause variable responses to environmental stimuli. A number of single-nucleotide polymorphisms (SNPs) have been associated with B vitamin status or chronic diseases related to vitamin B-12 and folate metabolism. Objective: Our objective was to identify associations between common SNPs in genes related to folate and vitamin B-12 metabolism or associated with B vitamin-related chronic diseases and biomarkers of nutrient status in a population exposed to folic acid fortification. Design: A panel of 116 SNPs was genotyped by using the Sequenom iPLEX Gold platform in a sample of 3114 adults aged 20-79 y from the Canadian Health Measures Survey, cycle 1. Associations between these SNPs and red blood cell (RBC) folate, serum vitamin B-12, and plasma total homocysteine were determined. Results: Twenty-one SNPs and 6 haplotype blocks were associated with RBC folate, serum vitamin B-12, and/or plasma homocysteine concentrations. Vitamin status was associated mainly with SNPs in genes directly involved in vitamin absorption/uptake (CUBN, CD320), transport (TCN1, TCN2), or metabolism (BHMT2, CBS, MTHFR, MUT, SHMT1). Other SNPs included those in the DNMT2, DPEP1, FUT2, NOX4, and PON1 genes. Conclusions: We identified novel associations between SNPs in CD320 and DNMT2, which had been previously associated with neural tube defects, and vitamin B-12 status, as well as between SNPs in SHMT1, which had been previously associated with colorectal cancer and cardiovascular disease risk, and RBC folate status. These novel associations provide a plausible metabolic rationale for the association of these SNPs with B vitamin-related diseases. We also observed a novel association between an SNP in CUBN and RBC folate and confirmed the association of a number of SNPs with B vitamin status in this large cross-sectional study. © 2015 American Society for Nutrition.
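
The abstract reports SNP-biomarker associations without specifying the models. One common analysis, assumed here purely for illustration, is a linear regression of the log-transformed biomarker on minor-allele dose with covariate adjustment; all column names and simulated data below are hypothetical.

```python
# Illustrative sketch of an additive SNP-biomarker association test: regress
# the log-transformed biomarker on allele dose, adjusting for age and sex.
# The paper's exact models and covariates are not given in the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "dose": rng.integers(0, 3, n),   # minor-allele count: 0, 1, or 2
    "age": rng.uniform(20, 79, n),
    "sex": rng.integers(0, 2, n),
})
# Simulated RBC folate with a small additive genotype effect
df["log_rbc_folate"] = 6.8 + 0.05 * df["dose"] + rng.normal(0, 0.3, n)

fit = smf.ols("log_rbc_folate ~ dose + age + C(sex)", data=df).fit()
print(fit.params["dose"], fit.pvalues["dose"])
```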


Chan Y.-M.,University of Toronto | Chan Y.-M.,The Hospital for Sick Children | MacFarlane A.J.,Nutrition Research Division | O'Connor D.L.,University of Toronto | O'Connor D.L.,The Hospital for Sick Children
Journal of Nutrition | Year: 2015

Background: Mandatory folic acid fortification of white-wheat flour and selected other grain products has reduced the prevalence of neural tube defects in Canada; however, the fortification of whole-wheat flour is not permitted. Objective: The objective of this study was to model the impact of adding folic acid to whole-wheat flour on the folate intake distribution of Canadians. Methods: Twenty-four-hour dietary recall and supplement intake data (n = 35,107) collected in the 2004 Canadian Community Health Survey (cycle 2.2) were used to calculate the prevalence of folate inadequacy (POFI) and the proportion of folic acid intakes above the Tolerable Upper Intake Level (UL). In model 1, folic acid was added to whole-wheat flour-containing foods in amounts comparable to those that are mandatory for white-wheat flour-containing foods. In model 2, a 50% overage of folic acid fortification was considered. Models 3 and 4 included assessment of folate intake distributions in adult whole-wheat consumers with or without a fortification overage. SIDE (Software for Intake Distribution Estimation; Department of Statistics and Center for Agricultural and Rural Development, Iowa State University) was used to estimate usual folate intakes. Results: Mean folate intakes increased by ~5% in all sex and age groups when whole-wheat foods were fortified (models 1 and 2; P < 0.0001). Folic acid fortification of whole-wheat flour-containing foods did not change the POFI or percentage of intakes above the UL in the general population, whether in supplement users or nonusers. Among whole-wheat consumers, the POFI was reduced by 10 percentage points after fortification of whole-wheat flour-containing foods (95% CIs did not overlap). The percentage of whole-wheat consumers with intakes above the UL did not change. Conclusion: Although folic acid fortification of whole-wheat flour-containing foods is unlikely to change the POFI or proportion of folic acid intakes above the UL in the general Canadian population, this fortification strategy may reduce the POFI in adult whole-wheat consumers. © 2015 American Society for Nutrition.
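
As a rough illustration of models 1 and 2 (not the SIDE-based method the study used), the sketch below fortifies simulated usual intakes and recomputes the POFI and the proportion above the UL. The adult EAR of 320 μg DFE/d, the UL of 1000 μg/d of folic acid, and the convention that 1 μg of folic acid taken with food counts as 1.7 μg DFE are assumed reference values, not figures stated in the abstract; all intake distributions are invented.

```python
# Simplified sketch of the fortification models. Assumed reference values
# (not stated in the abstract): adult EAR for folate = 320 ug DFE/day,
# UL for folic acid = 1000 ug/day, 1 ug folic acid with food = 1.7 ug DFE.
import numpy as np

EAR_DFE = 320.0    # ug DFE/day, adults
UL_FOLIC = 1000.0  # ug folic acid/day, adults

def simulate(usual_dfe, usual_folic, added_folic, overage=0.0):
    """Add folic acid from newly fortified whole-wheat foods and recompute
    the prevalence of folate inadequacy (POFI) and % of intakes above the UL."""
    add = added_folic * (1.0 + overage)
    dfe = usual_dfe + 1.7 * add   # total intake in dietary folate equivalents
    folic = usual_folic + add     # folic acid alone, compared with the UL
    pofi = np.mean(dfe < EAR_DFE) * 100
    over_ul = np.mean(folic > UL_FOLIC) * 100
    return pofi, over_ul

rng = np.random.default_rng(1)
usual_dfe = rng.lognormal(np.log(450), 0.4, 10_000)   # hypothetical usual intakes
usual_folic = rng.lognormal(np.log(150), 0.5, 10_000)
added = rng.uniform(0, 80, 10_000)  # folic acid from fortified whole-wheat foods

print(simulate(usual_dfe, usual_folic, added))               # model 1
print(simulate(usual_dfe, usual_folic, added, overage=0.5))  # model 2: 50% overage
```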


Aziz A.,Nutrition Research Division | Dumais L.,Regulations | Barber J.,Regulations
American Journal of Clinical Nutrition | Year: 2013

The glycemic index (GI) is a system that ranks foods according to the blood glucose-increasing potential of servings of foods that provide the same amount of available carbohydrate. The GI was originally developed as a tool for carbohydrate exchange in the dietary management of glycemia in persons with diabetes, and studies have generally supported modest benefits of low-GI diets in this population. Despite inconsistent results for the utility of the GI in the nondiabetic population, there is some interest in its universal application on food labels to assist consumers in making food choices that would help them meet their dietary goals. The objective of this review was to evaluate the usefulness of including the GI values of foods as part of the information on food labels in Canada. Health Canada's assessment identified 3 areas of concern with respect to GI labeling: 1) the GI measure has poor accuracy and precision for labeling purposes; 2) as a ratio, the GI does not vary in response to the amount of food consumed and the partial replacement of available carbohydrates with unavailable carbohydrates, whereas the glycemic response does; and 3) an unintended focus on the GI for food selection could lead to food choices that are inconsistent with national dietary guidelines. Hence, Health Canada's current opinion is that the inclusion of the GI value on the label of eligible food products would be misleading and would not add value to nutrition labeling and dietary guidelines in assisting consumers to make healthier food choices. © 2013 American Society for Nutrition.
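
For context on the ratio property the review criticizes: the GI of a food is conventionally computed as 100 times the ratio of the incremental area under the blood-glucose curve (iAUC) for the test food to that for a glucose reference providing the same available carbohydrate. Below is a simplified sketch with invented glucose curves; clipping below-baseline values to zero before the trapezoid rule is a simplification of the exact geometric cut-off method.

```python
# Sketch of the standard GI calculation (not from this review):
# GI = 100 x iAUC(test food) / iAUC(glucose reference), for portions with
# the same amount of available carbohydrate.
import numpy as np

def iauc(times_min, glucose_mmol_l):
    """Incremental area above the fasting baseline, by trapezoids,
    with excursions below baseline clipped to zero (simplified)."""
    t = np.asarray(times_min, dtype=float)
    g = np.asarray(glucose_mmol_l, dtype=float)
    above = np.clip(g - g[0], 0.0, None)
    return float(np.sum((above[1:] + above[:-1]) / 2.0 * np.diff(t)))

t = [0, 15, 30, 45, 60, 90, 120]                  # minutes after eating
reference = [4.5, 6.8, 8.0, 7.4, 6.3, 5.2, 4.6]   # 50 g glucose drink (invented)
test_food = [4.5, 5.5, 6.4, 6.2, 5.8, 5.0, 4.5]   # 50 g available CHO (invented)

gi = 100 * iauc(t, test_food) / iauc(t, reference)
print(round(gi))
```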


Sinclair S.E.,Regulations | Cooper M.,Nutrition Research Division | Mansfield E.D.,Regulations
Journal of the Academy of Nutrition and Dietetics | Year: 2014

Recent menu labeling initiatives in North America involve posting the calorie content of standard menu items, sometimes with other nutrients of public health concern, with or without contextual information (such as the recommended daily caloric intake for an average adult) or interpretive information (such as traffic light symbols). It is not clear whether this is an effective method to convey nutrition information to consumers wanting to make more-informed food choices. Of particular concern are those consumers who may be limited in their food and health literacy skills to make informed food choices to meet their dietary needs or goals. The purpose of this systematic review was to determine whether the provision of menu-based nutrition information affects the selection and consumption of calories in restaurants and other foodservice establishments. A secondary objective was to determine whether the format of the nutrition information (informative vs contextual or interpretive) influences calorie selection or consumption. Several bibliographic databases were searched for experimental or quasi-experimental studies that tested the effect of providing nutrition information in a restaurant or other foodservice setting on calories selected or consumed. Studies that recruited generally healthy, noninstitutionalized adolescents or adults were included. When two or more studies reported similar outcomes and sufficient data were available, meta-analysis was performed. Menu labeling with calories alone did not have the intended effect of decreasing calories selected or consumed (-31 kcal [P = 0.35] and -13 kcal [P = 0.61], respectively). The addition of contextual or interpretive nutrition information on menus appeared to assist consumers in the selection and consumption of fewer calories (-67 kcal [P = 0.008] and -81 kcal [P = 0.007], respectively). Sex influenced the effect of menu labeling on selection and consumption of calories, with women using the information to select and consume fewer calories. The findings of this review support the inclusion of contextual or interpretive nutrition information with calories on restaurant menus to help consumers select and consume fewer calories when eating outside the home. Further exploration is needed to determine the optimal approach for providing this menu-based nutrition information, particularly for those consumers who may be limited in their food and health literacy skills. © 2014.
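
The pooled kcal differences above come from meta-analysis. A minimal fixed-effect, inverse-variance pooling sketch follows; the per-study values are made up for illustration and are not the review's data.

```python
# Illustrative fixed-effect, inverse-variance meta-analysis of mean
# differences in calories selected (kcal). Study values below are invented.
import numpy as np
from scipy import stats

diffs = np.array([-45.0, -10.0, -80.0, 5.0])  # per-study mean differences, kcal
ses = np.array([25.0, 18.0, 40.0, 22.0])      # their standard errors

w = 1.0 / ses**2                    # inverse-variance weights
pooled = np.sum(w * diffs) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))
z = pooled / se_pooled
p = 2 * stats.norm.sf(abs(z))       # two-sided p value

print(f"pooled difference = {pooled:.1f} kcal, p = {p:.3f}")
```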


Gilani G.S.,Nutrition Research Division
British Journal of Nutrition | Year: 2012

The subject of protein quality assessment of foods and diets was addressed by the Codex Committee on Vegetable Proteins (1982-1989) and in FAO/WHO (1989, 2001) and WHO/FAO (2002) expert reviews. These international developments are summarized in this manuscript. In 1989, a Joint FAO/WHO Expert Consultation on Protein Quality Evaluation reviewed knowledge of protein quality assessment of foods, and specifically evaluated amino acid score corrected for protein digestibility, the method recommended by the Codex Committee on Vegetable Proteins. The report of the Consultation, published in 1991, concluded that the Protein Digestibility-Corrected Amino Acid Score (PDCAAS) method was the most suitable approach for routine evaluation of protein quality for humans. The Consultation recognized that the amino acid scoring pattern proposed by FAO/WHO/UNU (1985) for preschool children was at that time the most suitable pattern for calculating PDCAAS for all ages except infants, for whom the amino acid composition of human milk was recommended as the basis of the scoring pattern. The rat balance method was considered the most suitable practical method for predicting protein digestibility by humans. Since its adoption by FAO/WHO (1991), the PDCAAS method has been criticised for a number of reasons. The FAO/WHO (2001) Working Group on analytical issues related to protein quality assessed the validity of criticisms of the PDCAAS method. While recognizing a distinct regulatory use of protein quality data, the Working Group concluded that the PDCAAS method may be inappropriate for the routine prediction of the protein quality of novel and sole-source foods that contain high levels of antinutritional factors, and recommended that, for regulatory purposes, the method be revised to permit values of >100 for high-quality proteins. In evaluating the recommendations of the Working Group, the WHO/FAO (2002) Expert Consultation on Protein and Amino Acid Requirements endorsed the PDCAAS method with minor modifications to the calculation method but also raised several issues. These included the calculation of scoring patterns; prediction of amino acid digestibility by faecal and ileal methods; reduced bioavailability of lysine in processed proteins; truncation of the amino acid score and consequent PDCAAS value; protein digestibility as a first limiting factor in determining the overall available dietary nitrogen; and the calculation of amino acid score for a dietary protein mixture. These concerns were considered particularly important in relation to the regulatory aspects of protein quality of foods, and their resolution was urgently recommended through a new separate expert review. © 2012 The Author.
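
The PDCAAS arithmetic discussed here multiplies the amino acid score of the first-limiting amino acid by true (faecal) protein digestibility and, in the 1991 method, truncates the result at 100%, the truncation the Working Group asked to be revisited. A hedged sketch follows; the composition and scoring-pattern numbers are placeholders, not values from the consultation reports.

```python
# Sketch of the PDCAAS calculation described above. All numeric values
# below are illustrative placeholders, not figures from the reports.

def pdcaas(test_aa_mg_per_g, pattern_mg_per_g, true_digestibility, truncate=True):
    """PDCAAS = lowest amino acid ratio x true (faecal) protein digestibility.
    The 1991 method truncates the final value at 1.0 (i.e., 100%)."""
    ratios = {aa: test_aa_mg_per_g[aa] / pattern_mg_per_g[aa]
              for aa in pattern_mg_per_g}
    limiting_aa = min(ratios, key=ratios.get)
    score = ratios[limiting_aa] * true_digestibility
    return (min(score, 1.0) if truncate else score), limiting_aa

# Hypothetical cereal protein, limiting in lysine (mg amino acid per g protein):
test = {"lysine": 26, "sulfur_aa": 35, "threonine": 30}
pattern = {"lysine": 58, "sulfur_aa": 25, "threonine": 34}   # reference pattern
print(pdcaas(test, pattern, true_digestibility=0.85))
```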


Cooper M.,Nutrition Research Division
Health Reports (Statistics Canada) | Year: 2012

Iron deficiency is the most common nutritional deficiency in the world, but little is known about the iron status of people in Canada, where the last estimates are from 1970-1972. The data are from cycle 2 (2009 to 2011) of the Canadian Health Measures Survey, which collected blood samples from a nationally representative sample of Canadians aged 3 to 79. Descriptive statistics (percentages, arithmetic means, geometric means) were used to estimate hemoglobin and serum ferritin concentrations, and other markers of iron status. Analyses were performed by age/sex group, household income, self-perceived health, diet, and use of iron supplements. World Health Organization reference values (2001) were used to estimate the prevalence of iron sufficiency and anemia. The overall prevalence of anemia was low in the 2009-to-2011 period: 97% of Canadians had sufficient hemoglobin levels. Generally, hemoglobin concentrations increased compared with 1970-1972; however, at ages 65 to 79, rates of anemia were higher than in 1970-1972. Depleted iron stores were found in 13% of females aged 12 to 19 and 9% of females aged 20 to 49. Lower household income was associated with a lower prevalence of hemoglobin sufficiency, but was not related to serum ferritin sufficiency. Self-perceived health and diet were not significantly associated with hemoglobin and serum ferritin levels. The lack of a relationship between iron status and diet may be attributable to the use of questions about food consumption frequency that were not specifically designed to estimate dietary iron intake. Factors other than iron intake might have contributed to the increase in the prevalence of anemia among seniors.
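
A small note on the descriptive statistics mentioned above: for right-skewed biomarkers such as serum ferritin, the geometric mean (the exponential of the mean of the logged values) is typically reported alongside the arithmetic mean. A toy sketch with invented values:

```python
# Geometric vs arithmetic mean for a skewed biomarker; data invented.
import numpy as np

ferritin = np.array([12.0, 30.0, 55.0, 80.0, 150.0])  # ug/L, illustrative
geometric_mean = float(np.exp(np.mean(np.log(ferritin))))
arithmetic_mean = float(np.mean(ferritin))
print(geometric_mean, arithmetic_mean)  # geometric mean is pulled less by the tail
```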


Gilani G.S.,Nutrition Research Division | Xiao C.W.,Nutrition Research Division | Cockell K.A.,Nutrition Research Division
British Journal of Nutrition | Year: 2012

Dietary antinutritional factors have been reported to adversely affect the digestibility of protein, bioavailability of amino acids and protein quality of foods. Published data on these negative effects of major dietary antinutritional factors are summarized in this manuscript. Digestibility and the quality of mixed diets in developing countries are considerably lower than those in developed regions. For example, the digestibility of protein in traditional diets from developing countries such as India, Guatemala and Brazil is considerably lower compared to that of protein in typical North American diets (54-78 versus 88-94%). Poor digestibility of protein in the diets of developing countries, which are based on less refined cereals and grain legumes as major sources of protein, is due to the presence of less digestible protein fractions, high levels of insoluble fibre, and/or high concentrations of antinutritional factors present endogenously or formed during processing. Examples of naturally occurring antinutritional factors include glucosinolates in mustard and canola protein products, trypsin inhibitors and haemagglutinins in legumes, tannins in legumes and cereals, gossypol in cottonseed protein products, and uricogenic nucleobases in yeast protein products. Heat/alkaline treatments of protein products may yield Maillard reaction compounds, oxidized forms of sulphur amino acids, D-amino acids and lysinoalanine (LAL, an unnatural nephrotoxic amino acid derivative). Among common food and feed protein products, soyabeans are the most concentrated source of trypsin inhibitors. The presence of high levels of dietary trypsin inhibitors from soyabeans, kidney beans or other grain legumes has been reported to cause substantial reductions in protein and amino acid digestibility (up to 50%) and protein quality (up to 100%) in rats and/or pigs. Similarly, the presence of high levels of tannins in sorghum and other cereals, fababean and other grain legumes can cause significant reductions (up to 23%) in protein and amino acid digestibility in rats, poultry, and pigs. Normally encountered levels of phytates in cereals and legumes can reduce protein and amino acid digestibility by up to 10%. D-amino acids and LAL formed during alkaline/heat treatment of lactalbumin, casein, soya protein or wheat protein are poorly digestible (less than 40%), and their presence can reduce protein digestibility by up to 28% in rats and pigs and can cause a drastic reduction (100%) in protein quality, as measured by rat growth methods. The adverse effects of antinutritional factors on protein digestibility and protein quality have been reported to be more pronounced in elderly (20-month-old) rats than in young (5-week-old) rats, suggesting the use of old rats as a model for assessing the protein digestibility of products intended for the elderly. © 2012 The Authors.


Uush T.,Nutrition Research Division
Journal of Steroid Biochemistry and Molecular Biology | Year: 2014

Dietary calcium intake in relation to calcium status in Mongolian children was investigated in a cross-sectional survey. A total of 835 children were randomly selected from 4 economic regions and Ulaanbaatar city. Information on dietary intake was collected from all 835 children in the 1-3, 4-7, and 8-14 year old groups by a 24-h recall method, and the average daily calcium intake from the diet was calculated for each individual. Blood samples were collected from 104 children. The mean daily calcium intake was 273.0 ± 30.0 mg in 1-3 year old children, 309.0 ± 30.0 mg in 4-7 year old children, and 317.0 ± 31.0 mg in 8-14 year old children. There were statistically significant differences in calcium intake between the 1-3 and 4-7 year age groups and between the 1-3 and 8-14 year age groups (p < 0.001). Calcium intakes in all age groups fell short of the recommended level, reaching only 39%, 30.9%, and 24.4% of it, respectively. In 22.1% of the studied children, serum total calcium concentrations were below the normal range. Based on total serum calcium, the prevalence of hypocalcemia was higher among children in the 8-14 year age group (27.6%) than among children in the age group <1 year (p < 0.05). Based on corrected serum calcium values, the prevalence of hypocalcemia was higher still: 52.4%, 63.6%, and 51.1% in the 1-3, 4-7, and 8-14 year age groups, respectively. Mean corrected serum calcium levels were low (2.02 ± 0.04, 2.05 ± 0.73, and 1.99 ± 0.64 mmol/L) in the 1-3, 4-7, and 8-14 year age groups. These findings suggest that low dietary calcium intake may be reflected in hypocalcemia in Mongolian children. In conclusion, there is a need to improve the consumption of milk and dairy products by Mongolian children, and to provide vitamin D supplementation together with calcium supplementation in children with severe vitamin D deficiency rickets. This article is part of a Special issue entitled "16th Vitamin D Workshop". © 2014 Elsevier Ltd.
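
The abstract distinguishes total from corrected serum calcium but does not state which correction was applied; a widely used albumin adjustment (Payne's formula) is sketched below purely for illustration, with invented input values.

```python
# Hedged sketch: the paper's correction method is not specified; Payne's
# albumin adjustment is shown here only as a common example.

def corrected_calcium(total_ca_mmol_l: float, albumin_g_l: float) -> float:
    """Albumin-adjusted total serum calcium, mmol/L:
    adjusted Ca = measured Ca + 0.020 x (40 - albumin in g/L)."""
    return total_ca_mmol_l + 0.020 * (40.0 - albumin_g_l)

# A child with total calcium 2.10 mmol/L and albumin 32 g/L (invented values):
print(corrected_calcium(2.10, 32.0))  # -> 2.26 mmol/L
```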
