Open Access Grey Literature

Origin of the Recommended Intake of L-Histidine by Infants

Leise Berven, Linda Atkins, Derek Castles, Dorothy Mackerras

European Journal of Nutrition & Food Safety, Page 404-407
DOI: 10.9734/EJNFS/2014/11579

Background: In mid-2012, Food Standards Australia New Zealand (FSANZ) received an application to change the Australia New Zealand Food Standards Code so that the minimum L-histidine content of infant formula sold in Australia and New Zealand could be reduced from 12 to 10 mg/100kJ. Infant formula refers to breast milk substitutes that satisfy nutritional requirements of infants up to 12 months of age. 
L-histidine is an essential amino acid. Although there is a population nutrient recommendation for total protein intake by infants in Australia and New Zealand, there are no recommendations for intakes of the individual essential amino acids. The 12 mg/100kJ level in force in 2012 had originally been set in 2002 based on the 1985 Joint FAO/WHO/UNU Expert Consultation report (updated in 1991), which estimated amino acid requirements from breast milk composition data reported in four papers. In 2007, Codex Alimentarius adopted a minimum of 10 mg/100kJ for L-histidine based on the 2005 ESPGHAN Co-ordinated International Expert Group report.
The Policy Guideline on the Regulation of Infant Formula Products issued in 2011 by the Australia and New Zealand Food Regulation Ministerial Council recommends that the primary reference for the compositional requirements of infant formula and follow-on formula should be breast milk. Therefore, the FSANZ risk assessment examined which studies of breast milk composition had been included in earlier reports and whether there was more recent data that should be included. 
Aim: To determine if infant formula containing a minimum content of L-histidine of 10 mg/100kJ is consistent with reported levels in breast milk and will support normal growth in formula-fed infants. It is assumed that the amount of L-histidine in breast milk is adequate to meet the requirements of infants for normal growth. The review focused on infants <6 months of age because older infants would be receiving complementary foods.
Methods: The studies which contributed to the average L-histidine amount in breast milk calculated in the FAO/WHO and ESPGHAN reports and similar reports from prominent bodies in Europe and the US were examined. A literature review of the cited studies and related research reporting L-histidine content of breast milk going back to 1954 was done. Because there was great variability in the units used to describe the L-histidine content of breast milk (e.g. nmol L-histidine/ml of milk, mg L-histidine/g milk, mg L-histidine/g nitrogen, g L-histidine/kg/day etc.) all values for L-histidine were converted to mg/g crude protein to allow comparison. After examining previous reports, the estimated L-histidine content in breast milk was updated by selecting studies based on criteria around sample collection, analytical methods for amino acid analysis and measurement of protein content, and reporting items (such as full and available publication in the peer-reviewed literature). To confirm whether a lower L-histidine level would support an infant’s physiological requirements, additional literature was reviewed to examine effects of formulas with different histidine concentrations on growth and plasma histidine concentration.
Results: The six reports that were examined all calculated the average L-histidine intake from studies of breast milk as the basis of their recommendations. The 1985 and 1991 FAO/WHO reports recommended an intake of 26 mg/g crude protein based on four studies of breast milk. None of the studies cited in these two reports were cited by the later reports, probably because they were conducted prior to the development of modern analytical techniques. The later reports used primary data from a total of 11 studies between them, but their recommendations, which ranged from 21 to 23 mg L-histidine/g crude protein, were obtained by averaging data from three to seven studies. Other studies were often cited in these reports but were not included when calculating the average, and no reasons were given for the inclusion or exclusion of studies. Ten studies met the inclusion criteria specified by FSANZ. Of these, only five had been included in calculations by one or more of the earlier reports. FSANZ calculated an unweighted mean of 24 mg L-histidine/g crude protein. Using appropriate composition data for protein, fat and carbohydrate in breast milk, this is equivalent to 10 mg L-histidine/100 kJ of milk.
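As a worked illustration of the final conversion step, the sketch below turns a concentration expressed per gram of crude protein into an amount per 100 kJ of milk. The crude protein content and energy density are assumed round values for mature breast milk, used only for this example; they are not the composition data applied by FSANZ.

```python
# Illustrative conversion of an L-histidine concentration expressed per gram of
# crude protein into an amount per 100 kJ of milk. The breast-milk composition
# values below are round assumed figures for this sketch, not FSANZ's data.

HIS_MG_PER_G_PROTEIN = 24.0      # mg L-histidine per g crude protein (FSANZ mean)
CRUDE_PROTEIN_G_PER_100ML = 1.2  # assumed crude protein content of mature breast milk
ENERGY_KJ_PER_100ML = 290.0      # assumed energy density of mature breast milk

def histidine_per_100kj(his_per_g_protein: float,
                        protein_g_per_100ml: float,
                        energy_kj_per_100ml: float) -> float:
    """Convert mg L-histidine/g crude protein to mg L-histidine/100 kJ of milk."""
    his_mg_per_100ml = his_per_g_protein * protein_g_per_100ml
    return his_mg_per_100ml / (energy_kj_per_100ml / 100.0)

result = histidine_per_100kj(HIS_MG_PER_G_PROTEIN,
                             CRUDE_PROTEIN_G_PER_100ML,
                             ENERGY_KJ_PER_100ML)
print(f"~{result:.1f} mg L-histidine/100 kJ")  # ~9.9 with these assumed inputs
```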
Human milk has a variable composition that depends on a range of factors, including the duration of the feed and the length of lactation. All of the studies in the FSANZ calculation used convenience samples of women and reported L-histidine concentrations ranging from 18 to 40 mg/g crude protein. Only one study (Raiha et al. 2002) examined physiological and biochemical outcomes in infants. In combination with six additional studies examining growth patterns and plasma amino acid concentrations in breastfed and formula-fed infants, FSANZ determined that formula-fed infants consuming a formula containing 10 mg L-histidine/100 kJ are comparable to breastfed infants on the physiological and biochemical outcomes examined.
Conclusion: The conversion of results from different studies to a common base, combined with rounding, inevitably introduces error when deriving a mean value. There was a 2-fold range of L-histidine concentrations across the individual studies examined. FSANZ concluded that 24 mg L-histidine/g crude protein (10 mg L-histidine/100kJ) is consistent with the average composition of breast milk and would support growth. This analysis also highlights the variability of the evidence base used by bodies making recommendations for amino acid intakes by infants. Systematic literature searching with justification for inclusion or exclusion of studies does not seem to have been widely used when deriving intake recommendations for L-histidine in infants. References are given in the full report.
The complete report “Supporting document 1 – Comparative Nutrition Safety Assessment” can be downloaded for free from the FSANZ website:
http://www.foodstandards.gov.au/code/applications/documents/A1074_SD11.pdf

Open Access Grey Literature

A Through-chain Analysis of Food Safety Hazards and Control Measures Associated with the Production and Supply of Seed Sprouts for Human Consumption

Hong Jin, Duncan Craig, Patricia Blenman, Adele Yates, Scott Crerar, Marion Healy

European Journal of Nutrition & Food Safety, Page 424-428
DOI: 10.9734/EJNFS/2014/11245

Background: Seed sprouts contaminated with pathogenic microorganisms, such as Salmonella spp. and Shiga toxin-producing Escherichia coli (STEC), present an unacceptable health risk to consumers. An outbreak that occurred in Australia during 2005 and 2006 due to the consumption of alfalfa sprouts contaminated with Salmonella Oranienburg resulted in 141 cases of infection and cost the Australian community an estimated $1.19 million. In Japan in 1996, consumption of radish sprouts contaminated with STEC O157:H7 affected more than 10,000 individuals. The outbreak of E. coli O104:H4 linked to the consumption of fenugreek sprouts that occurred in Europe in 2011 was an unprecedented foodborne outbreak. More than 4,000 individuals were infected by STEC O104:H4; among them, 908 developed haemolytic uraemic syndrome (HUS) and 50 died of STEC infection. This demonstrates the potential food safety risk arising from seed sprouts and that the consequences can be devastating.

Food Standards Australia New Zealand (FSANZ) initiated the development of a primary production and processing standard for seed sprouts in 2009 to enhance the safety of seed sprouts produced and sold in Australia. After extensive consultations with the State and Territory food safety regulators, and a thorough investigation of the Australian industry practices in producing seed sprouts for human consumption, a technical paper was prepared to inform the design of potential risk mitigation measures for a national food safety standard on seed sprout production. This technical paper described the Australian seed sprout industry, depicted the steps involved in the production of seed sprouts for human consumption, and provided an analysis of potential food safety hazards that could occur during seed sprout production and processing. A food safety standard for the production and sale of seed sprouts in Australia was finalised in November 2011.
This extended abstract describes the key aspects of the technical paper.

Aims: To provide technical and scientific information to support risk management decisions aimed at maximizing the safety of seed sprouts produced for human consumption in Australia.

Study Design: A through-chain qualitative food safety risk analysis.

Place and Duration of Study: FSANZ, Canberra, Australia, between July 2009 and January 2010.

Methodology: This through-chain risk analysis was based on a comprehensive review of the literature available at the time on: investigations of foodborne outbreaks associated with consumption of seed sprouts; surveys of microbial contamination of seed sprouts; specific publications on crop production, seed harvest, post-harvest processing and storage of seeds; production of seed sprouts; risk assessments on seed sprouts; and regulatory guidelines published by Australian and international food safety regulatory authorities on seed sprouts.

Members of the FSANZ project team conducted field studies of sprout production, lucerne crop production, lucerne seed processing, and wholesale and retail sale of seed sprouts. A survey was conducted on the variety, volume and value of sprouts produced, the source and quantity of seeds used to produce sprouts for human consumption, trends in consumption of seed sprouts in Australia, and the size and location of sprout producers in Australia.

Stakeholders were consulted through a FSANZ standard development committee with participants from State and Territory food safety regulators, peak sprout producer industry bodies, seed producers and seed processors, major food retailers, and consumer representatives.

The through-chain analysis of food safety hazards associated with the production and processing of seed sprouts was prepared in line with the principles of hazard analysis critical control points (HACCP).

Results:

Key pathogens of concern: Among the range of biological, chemical and physical food safety hazards likely to be associated with seed sprouts produced for human consumption, pathogenic microorganisms represented the highest risk to consumers. Outbreaks associated with the consumption of seed sprouts contaminated with pathogenic microorganisms have been frequent events in developed economies despite food regulatory interventions. The key pathogenic microorganisms of concern were Salmonella spp. and STEC. Salmonella spp. were found to be the causative pathogen almost five times more frequently than STEC.

Main varieties of seed sprouts causing foodborne illness: Among the 41 reported outbreaks that occurred worldwide between 1988 and 2007 involving consumption of seed sprouts contaminated with pathogenic microorganisms, alfalfa sprouts represented 68% of the outbreaks, followed by mungbean clover (5%), radish (2%), cress (2%) and sprouts (22%).

Source of pathogenic microorganisms: FSANZ divided the production and supply of seed sprouts for human consumption into eleven consecutive steps, starting with seed production in the field and ending with transportation and distribution of seed sprouts to retail establishments. This was to enable a systematic identification of the food safety hazards, sources of the hazards, specific controls that could be applied to control or eliminate food safety hazards, and the associated requirements of food safety management practices including food safety knowledge and food safety skills.

Contamination of seeds by pathogenic microorganisms such as Salmonella spp. and STEC can occur during seed production, seed harvest, seed processing, seed storage and transportation. The origin of these pathogenic microorganisms is animal faeces and manure present in the field where the crop is grown. Soil for growing the seed crop, water used for irrigation, and machinery used for crop management including the harvest of seeds, can be contaminated with pathogenic microorganisms and can transfer the contamination to seeds during crop production and seed harvest. Seed processing as a post-harvest step may also contribute to seed contamination. For example, blending different harvest lots of seeds for seed cleaning can spread what was originally a localised contamination into a larger volume of seeds. Rodent, insect and bird activities in seed processing and seed storage establishments can introduce and spread pathogenic microorganisms to seeds.

Even when seeds delivered to sprout production sites are free of pathogenic microorganisms, the activities of rodents, insects and infected workers during seed receipt, storage, sprout production, sprout storage and transportation at the sprouting establishment can lead to contamination of seed sprouts by pathogenic microorganisms, as can the use of contaminated water for sprouting. Much of this also applies to retail handling and storage of seed sprouts.

Investigations into the source of sprout contamination for outbreaks that occurred between 1988 and 2007 found that in almost every case the pathogenic microorganisms causing the outbreaks were present in the seeds used for sprout production. In approximately 20% of the outbreaks, contamination in sprouting establishments was also identified as a likely source of contamination.

Identified risk mitigation measures: Based on an analysis of a wide range of possible recommendations aimed at improving the safety of seed sprouts, the through-chain analysis recommended that the following good agricultural practices be implemented in the primary production phase of seeds:

• Environment - soil and environment where seeds are grown for the production of seed sprouts as a human food should be suitable.
• Inputs - manure, biosolids and other natural fertilisers should only be used for the growth of seed crops when a high level of pathogen reduction has been achieved; equipment (bins, containers, silos, vehicles) and machinery are maintained and used in a manner that minimises and/or avoids contamination of seeds with pathogenic microorganisms.
• Protection - grazing animals and wild animals are prevented from entering the field where seeds are grown; and seed crops are protected from contamination by human, animal, domestic, industry and agricultural wastes.
• Segregation - seeds produced for the production of sprouts for human consumption are segregated from seeds produced for the production of animal feed and are clearly labelled.

The through-chain analysis also recommended the following components for inclusion in a Food Safety Program to be effectively implemented in sprout production establishments:

• Environment – the sprouting facility (including the seed storage area) should not allow access of rodents, insects, pests or animals; sprouting facility and equipment are effectively cleaned and sanitised to ensure the environment is suitable for producing ready-to-eat foods.
• Input – each seed lot is tested for the presence of microbial pathogens of concern and seeds should not be used unless the testing results are negative; solid medium supporting sprout growth and water for sprouting are treated to eliminate pathogenic microorganisms; seeds are disinfected prior to sprouting to eliminate microbial pathogens.
• Separation – seed rinsing and microbiological decontamination, seed germination/sprouting, and storage of seed sprouts are physically separated from each other to prevent cross contamination.
• Monitoring – implement appropriate sampling/testing programs to regularly monitor microbial pathogens during and at the end of production of seed sprouts.

Implementation of food safety controls on farm presents many challenges. One of the main obstacles is the inability to control environmental factors under conventional farming practices. The environment in which seeds are produced for the production of seed sprouts for human consumption should exclude grazing animals and minimise or avoid pest and wildlife interference. The cost of growing seeds under these conditions can be prohibitive unless sprout producers are willing to pay a premium price for such seeds. As a result, the primary production and processing standard developed by FSANZ limited the control measures to sprout production. The standard is available at http://www.comlaw.gov.au/Details/F2012L00023.

A number of chemical and physical means have been investigated by different research organisations around the world for effective seed decontamination. As a result of the FSANZ standard development activity, a subsequent research and development study funded by the Australian Government and the Australian sprout industry investigated the efficacy of various disinfectants applied to seed decontamination. The study, Improving Seed Sprout Food Safety – a farm to retail assessment, recommended the use of multiple hurdles, such as a combination of heat treatments and chemical decontamination of seeds, to control the potential presence and growth of foodborne pathogens during sprout production. The full report of this study is available at https://rirdc.infoservices.com.au/items/13-010.

Conclusion: Seed sprouts, particularly those that are consumed raw, are one of the food vehicles frequently implicated in foodborne outbreaks. Pathogenic microorganisms, such as Salmonella spp. and STEC, are key food safety hazards associated with past outbreaks involving consumption of seed sprouts.

Seeds used for sprouting are the primary source of contamination with pathogenic microorganisms that originate from animal faeces, untreated fertilisers, contaminated soils and water.

Seed decontamination prior to sprouting is considered the critical control point in producing safe seed sprouts.

Through-chain food safety management presents the ultimate solution to the reduction of foodborne illnesses associated with the consumption of seed sprouts. In addition to sprout producers, seed producers, seed processors and seed merchants have a role to play in ensuring seed sprouts produced for human consumption are safe.

Open Access Method Article

Comparison of a Web-Based Frequency Questionnaire for Assessment of Beverage Intake with a Validated 7-Day Web-Diary from Danish Teenagers

Anja Biltoft-Jensen, Jeppe Decker Iversen, Lene Møller Christensen, Jeppe Matthiessen

European Journal of Nutrition & Food Safety, Page 577-591
DOI: 10.9734/EJNFS/2014/9486

Aims: To compare beverage intake measured using a web-based quantitative beverage frequency questionnaire (BFQ) with a 7-day estimated beverage diary (WebDAS), and to evaluate the BFQ’s feasibility.
Study Design: Cross-sectional comparison of the BFQ, which contained 37 beverage types including energy drinks and caffeinated beverages, with the WebDAS. 
Place and Duration of Study: Three 9th grade local authority school classes, totalling 73 students aged 14-16 years, were recruited from a suburban area of Copenhagen. The study was carried out between September 2013 and November 2013.
Methodology: Respondents first completed the WebDAS at home and, 2 weeks later, completed the BFQ at school. McNemar’s test, the Wilcoxon signed rank test, Spearman’s rank correlation coefficients, a Bland-Altman plot, weighted Kappa statistics and percentages of exact agreement were used to compare the results of the two methods.
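For readers who want to reproduce this kind of method comparison, the following sketch applies a Wilcoxon signed-rank test, a Spearman rank correlation and Bland-Altman statistics to a small set of hypothetical paired intakes in Python with SciPy. The numbers are invented for illustration and this is not the authors' analysis code.

```python
# Paired method comparison on hypothetical intake data (g/day); invented numbers,
# not the authors' analysis code.
import numpy as np
from scipy import stats

# Hypothetical daily water intakes for the same ten respondents from each method.
webdas = np.array([300, 450, 500, 250, 600, 400, 350, 550, 480, 320], dtype=float)
bfq = np.array([500, 600, 700, 300, 900, 450, 500, 800, 650, 400], dtype=float)

# Wilcoxon signed-rank test for a systematic difference between the two methods.
w_stat, w_p = stats.wilcoxon(bfq, webdas)

# Spearman rank correlation: do the methods rank respondents similarly?
rho, rho_p = stats.spearmanr(bfq, webdas)

# Bland-Altman statistics: mean difference (bias) and 95% limits of agreement.
diff = bfq - webdas
bias = diff.mean()
loa_low, loa_high = bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1)

print(f"Wilcoxon p={w_p:.3f}, Spearman rho={rho:.2f}, "
      f"bias={bias:.0f} g/day, 95% LoA {loa_low:.0f} to {loa_high:.0f} g/day")
```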
Results: 49 respondents (29 boys, 20 girls) had acceptable data from both the WebDAS and the BFQ. The mean total beverage intake measured by the two methods differed significantly (BFQ: 1566 vs. WebDAS: 1231 g/day, P<.01). The Spearman rank correlations were positive (r=.41-.75) for all beverages including energy drinks, and significant for most beverages. Significant agreement for the numbers of consumers was observed between methods, except for soft drinks and chocolate. The mean (SD) difference was 335 (769) g/day, primarily because the intake of water measured with the BFQ was almost twice as high as that measured with the WebDAS. This was reflected in the Bland-Altman plot and in the percentages of exact agreement, which were lower for water than for other beverages (29% vs. 39-46%).
Conclusion: The BFQ gave results comparable to a 7-day beverage diary (WebDAS) in 14-16-year-olds. With a few adjustments, especially with regard to portion sizes and entries for water, we believe the BFQ will be useful in large population-based studies for assessment of beverage intake.

Open Access Short Research Article

Food Risk Perceptions of Women in Rural and Urban Households – A Study in India

Swetha Boddula, Vemula R Sudershan, Balakrishna Nagalla, Snehasree Saha, Subba Rao M Gavaravarapu

European Journal of Nutrition & Food Safety, Page 380-391
DOI: 10.9734/EJNFS/2014/10366

Aims: The study examined risk perceptions related to the safety of various commonly consumed foods, and the health hazards perceived to be associated with those risks, among home food preparers.
Study Design: A cross-sectional study in urban and rural areas. Study locations were selected purposively, but participants were recruited using a stratified random sampling technique.
Place and Duration of the Study: The study was conducted in Hyderabad, the capital city of the state of Andhra Pradesh in South India (urban population), and in Kothapally Village in Karimnagar District (rural population). The study period was 4 months.
Methodology: Considering its size and geographical spread, Hyderabad was divided into 3 natural zones, and the village was treated as a single zone. From each zone, 30 households were selected (10 each from the lower, middle and upper economic strata, in order to capture any variations in perceptions), making a total sample of 120, with 90 households from urban and 30 from rural locations. Women who were directly involved in food preparation were interviewed using a pre-tested, pre-coded questionnaire.
Results: The study revealed that infestation and adulteration were perceived as major risks in cereals and pulses. The majority of respondents perceived pesticide residues as risks in vegetables and fruits. About 83% perceived swarming of flies and mosquitoes as the only risk for uncooked non-vegetarian foods such as meat and fish. Many respondents also linked the perceived risks with food-borne diseases. No significant correlation was found between income, education or habitat and food risk perceptions.
Conclusions: This study gives an overview of perceived risks related to commonly consumed foods. The results provide cues and set a direction for further research to explore whether perceived risks match actual risks.

Open Access Short communication

Legal Structures of Food Safety in Europe

Klaus J. Henning, Susanne Kaus, Lea Herges, Susann Stehfest, Gaby-Fleur Böl

European Journal of Nutrition & Food Safety, Page 375-379
DOI: 10.9734/EJNFS/2014/11197

In view of the rapid increase in the globalisation of the economy, assuring food safety within the European Union is a challenge. The range and variety of foods on offer in Europe continue to rise steadily. The demands on food companies, and also on the European Union and its Member States, to assess food risks scientifically, to minimise them and to communicate them in an easily understandable way continue to grow. Aspects of private law, as well as of criminal and public law, have to be considered, especially in possible crises. The structures of, and responsibilities for, the public law tasks of EU institutions, of the Member States and even of bodies within the Member States themselves are often not sufficiently well known. This results in confusion and accusations in times of crisis, and in duplication of effort and negative conflicts of competence in quieter times. The “EU Food Safety Almanac” published by the German Federal Institute for Risk Assessment (Bundesinstitut für Risikobewertung, BfR) is intended to help readers understand responsibilities for food safety correctly. It provides an overview of the structures of food and feed safety within the Member States and the European Union. In doing so, it makes clear how differently food safety is organised and implemented within the constitutional and administrative law of the 35 countries covered.

Open Access Short communication

Identifying Food Consumption Patterns among Young Consumers by Unsupervised and Supervised Multivariate Data Analysis

Ulf Hammerling, Eva Freyhult, Anna Edberg, Salomon Sand, Sisse Fagt, Vibeke Kildegaard Knudsen, Lene Frost Andersen, Anna Karin Lindroos, Daniel Soeria-Atmadja, Mats G. Gustafsson

European Journal of Nutrition & Food Safety, Page 392-403
DOI: 10.9734/EJNFS/2014/9082

Although computational multivariate data analysis (MDA) has already been employed in the dietary survey area, the results reported are based mainly on classical exploratory (descriptive) techniques. Therefore, data from a Swedish and a Danish dietary survey of young consumers (4 to 5 years of age) were subjected not only to modern exploratory MDA, but also to modern predictive MDA that, via supervised learning, yielded predictive classification models. The exploratory part, which also encompassed 8- and 11-year-old Swedish consumers, included innovative forms of hierarchical clustering and bi-clustering. This resulted in several interesting multi-dimensional dietary patterns (dietary prototypes), including striking differences between those of the age-matched Danish and Swedish children. The predictive MDA disclosed additional multi-dimensional food consumption relationships. For instance, the consumption patterns associated with several key foods, such as bread, milk, potato and sweetened beverages, were found to differ markedly between the Danish and Swedish consumers. In conclusion, the joint application of modern descriptive and predictive MDA to dietary surveys may enable new levels of diet quality evaluation and perhaps also prototype-based toxicological risk assessment.
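To make the two analysis modes concrete, the sketch below applies unsupervised hierarchical clustering and a supervised classifier to an invented food-consumption matrix. The foods, the simulated data and the choice of classifier are illustrative assumptions only and do not reproduce the methods or data of the study.

```python
# Exploratory (unsupervised) plus predictive (supervised) analysis of an invented
# food-consumption matrix; foods, data and model choices are illustrative only.
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
foods = ["bread", "milk", "potato", "sweetened_beverages", "fruit", "fish"]

# Hypothetical consumption (g/day) for 100 "Danish" and 100 "Swedish" children.
danish = rng.gamma(shape=2.0, scale=60.0, size=(100, len(foods)))
swedish = rng.gamma(shape=2.0, scale=45.0, size=(100, len(foods)))
X = np.vstack([danish, swedish])
country = np.array([0] * 100 + [1] * 100)  # 0 = Danish, 1 = Swedish

# Exploratory step: hierarchical clustering to look for dietary prototypes.
clusters = AgglomerativeClustering(n_clusters=4).fit_predict(X)

# Predictive step: supervised learning to test whether consumption patterns
# separate the two national samples.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
accuracy = cross_val_score(clf, X, country, cv=5).mean()

print("cluster sizes:", np.bincount(clusters))
print(f"cross-validated country classification accuracy: {accuracy:.2f}")
```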

Open Access Policy Article

Grey Area Novel Foods: An Investigation into Criteria with Clear Boundaries

Corinne Sprong, Rick van den Bosch, Sven Iburg, Kathelijne de Moes, Elleander Paans, Sofia Sutherland Borja, Hannah van der Velde, Henk van Kranen, Henk van Loveren, Bernd van der Meulen, Hans Verhagen

European Journal of Nutrition & Food Safety, Page 342-363
DOI: 10.9734/EJNFS/2014/8662

In the European Union, novel foods are defined by the Novel Foods Regulation as food products and food ingredients that have not been consumed to a significant degree in the European Union before May 1997. However, there are new foods that, for various reasons, are not considered novel foods, although it cannot be excluded that they differ from conventional foods to such an extent that an assessment of their safety prior to market entry would be called for. Previously, we reported that this ‘grey area’ of novel foods exists and comprises: (1) food products or ingredients for which the current Novel Foods Regulation leaves too much room for different interpretations and (2) food products or ingredients that are not novel according to the current Novel Foods Regulation because it contains gaps. This paper focuses on how to handle these interpretation differences and gaps, and provides recommendations to address these shortcomings of the current Novel Foods Regulation. To this end, we propose criteria with clear boundaries, as part of an assessment tool, to reduce the uncertainties in interpreting ‘consumption to a significant degree in the European Union’; these criteria take into account the commercial availability and the length, extent and frequency of use of the particular food or ingredient. In addition, biologically relevant boundaries are proposed for the criteria on changes in nutritional value, metabolism (more precisely, all aspects of absorption, distribution, metabolism and excretion) and levels of undesirable substances, to identify significant changes in the composition of foods resulting from changes in the production process. Furthermore, criteria are proposed to cover ambiguities and gaps in the Novel Foods Regulation concerning food products and food ingredients obtained from (1) animals on a new feeding regime, (2) new varieties of organisms and (3) other growth stages of crops. Finally, a criterion that takes into account total ingredient intake rather than single-product intake is added to deal with the risk of overexposure to substances. Taken together, the proposed boundaries and criteria may contribute to diminishing the interpretation issues regarding the Novel Foods Regulation and thus to reducing the extent of the grey area of novel foods.

Open Access Policy Article

A Simple Visual Model to Compare Existing Front-of-pack Nutrient Profiling Schemes

Daphne Van Der Bend, Joost Van Dieren, Marta De Vasconcelos Marques, Nick L. W. Wezenbeek, Niki Kostareli, Patricia Guerreiro Rodrigues, Elisabeth H. M. Temme, Susanne Westenbrink, Hans Verhagen

European Journal of Nutrition & Food Safety, Page 429-534
DOI: 10.9734/EJNFS/2014/10305

Nutrient profiling is an important tool for governments, non-governmental organizations (NGOs) and the food industry to help consumers make healthier food choices. Multiple nutrient profiling systems (NPS) have been introduced worldwide. There is, however, no agreement on the use of a single NPS in leading regions such as the USA and Europe. In 2008, the Arrow Model of Verhagen and van den Berg was created to illustrate and compare characteristics of existing NPS. Recent developments in nutrient profiling give rise to the need for an updated Model. The present study aims to develop a comprehensive model which can be used to explain and compare various front-of-pack nutrient profiling systems (FOP-NPS). An extensive literature search was conducted to obtain an overview of existing FOP-NPS worldwide. Only FOP-NPS that are currently in use, focus on health-related product aspects and target the general population (adults and children) were included. The Funnel Model was developed based on the analysis of 40 existing FOP-NPS and expert interviews. This Model illustrates different FOP-NPS and allows comparison among them. The Funnel Model includes several new characteristics compared to the Arrow Model. Numerous ingredients were added, along with four new characteristics: directivity, type of institution initiating the system, purpose and utilization. Several other characteristics were expanded with new elements. The Funnel Model also has a new visual presentation which is useful for clearly explaining and comparing FOP-NPS.

Open Access Minireview Article

Short Review of Calcium Disodium Ethylene Diamine Tetra Acetic Acid as a Food Additive

Marijke M. H. Van De Sande, Sabrina Wirtz, Ellen Vos, Hans Verhagen

European Journal of Nutrition & Food Safety, Page 408-423
DOI: 10.9734/EJNFS/2014/10405

Calcium disodium ethylenediaminetetraacetate (Calcium Disodium EDTA, C10H12CaN2Na2O8·2H2O) is a derivative of ethylenediaminetetraacetic acid and is an approved food additive (E385). It is used as a preservative, sequestrant, flavouring agent and colour retention agent in foods. As a drug, it is used for the reduction of blood and mobile depot lead in the treatment of acute and chronic lead poisoning. Calcium Disodium EDTA is very poorly absorbed from the gastrointestinal tract following ingestion. The compound is metabolically inert and no accumulation in the body has been found. Acute, short-term, subchronic and chronic toxicity studies carried out with Calcium Disodium EDTA in laboratory animals found that the compound is nephrotoxic at high doses. At similarly high doses, application of Calcium Disodium EDTA can result in complexation of zinc ions, thus interfering with zinc homeostasis and causing developmental toxicity. No evidence exists suggesting that the compound exerts genotoxic or carcinogenic effects. Overall, Calcium Disodium EDTA appears to be safe for use as a food additive, as the toxic doses noted are higher than can be achieved via the addition of Calcium Disodium EDTA to food. However, human data are limited, and the bulk of the available (human and animal) data, as well as the ADI, stems from several decades ago. Caution should also be taken when Calcium Disodium EDTA is administered as treatment for lead poisoning, as the exposure increases greatly. Until 2020, EFSA will carry out new risk assessments, and subsequently the Commission will revise the list of food additives and the conditions of use specified therein. The deadline for food additives other than colours and sweeteners is 31 December 2018, which seems appropriate given the non-acute need for re-evaluation of Calcium Disodium EDTA as a food additive.

Open Access Original Research Article

Vitamin A Deficiency among School Going Adolescents in Rural Areas of Bareilly

Ajay Kumar Agarwal, H S Joshi, Arun Singh

European Journal of Nutrition & Food Safety, Page 318-324
DOI: 10.9734/EJNFS/2014/7766

Objective: 1. To determine the prevalence of vitamin A deficiency (VAD), on the basis of the presence of Bitot’s spots and conjunctival xerosis, among rural school-going adolescents of District Bareilly, Uttar Pradesh, India. 2. To identify the associated factors and to suggest suitable measures to prevent VAD among them.
Study Design: Cross sectional study. 
Place and Duration of Study: Field practice areas of the Department of Community Medicine, RMC&H, Bareilly, Uttar Pradesh, India, between January 2012 and December 2012.
Participants: 900 school going adolescents. 
Sampling: Multistage sampling method. A structured schedule was used to collect the information. 
Statistical Analysis: Data were analyzed with SPSS 17. Significance was determined using the Chi-square test.
Results: The overall prevalence of VAD was found to be 42.22%. It was higher in the 15-19 years age group (48.77%) than in the 10-14 years age group (41.6%). The prevalence of VAD was slightly higher among boys (p value=0.666). Of the total of 398 (42.22%) adolescents with VAD, 300 were from socioeconomic class V.
Conclusion: Nutrition education regarding regular intake of foods rich in vitamin A is needed to prevent the deficiency.
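As an illustration of the kind of association test reported above, the sketch below runs a chi-square test on a hypothetical 2x2 table of sex by VAD status. The counts are invented for the example, so the resulting p-value is illustrative only and is not the study result.

```python
# Chi-square test of association between sex and vitamin A deficiency (VAD)
# on a hypothetical 2x2 table; the counts are invented for illustration only.
from scipy.stats import chi2_contingency

#            VAD   no VAD
table = [[200, 250],   # boys  (hypothetical counts)
         [180, 270]]   # girls (hypothetical counts)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```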

Open Access Original Research Article

A Community-based Randomized Double Blind Controlled Trial of Lactobacillus paracasei and Bifidobacterium lactis on Reducing Risk for Diarrhea and Fever in Preschool Children in an Urban Slum in India

R. Hemalatha, Arthur C. Ouwehand, Sofia D. Forssten, J. J. Babu Geddan, Raja Sriswan Mamidi, V. Bhaskar, K. V. Radhakrishna

European Journal of Nutrition & Food Safety, Page 325-341
DOI: 10.9734/EJNFS/2014/8280

Aims: The aim of the study was to determine the effect of probiotics on diarrhea and fever in preschool children in a community setting in a developing country.
Study Design: Double blind randomized controlled trial.
Place and Duration of Study: The study was performed in Addagutta, a slum of Hyderabad (India), from July 2010 to April 2011.
Methodology: Healthy preschool children (2-5 years, n=379) living in an urban slum in India were enrolled. Three randomly allocated groups of children received one of the two probiotics (Lactobacillus paracasei Lpc-37 or Bifidobacterium lactis HN019) or the placebo for a period of 9 months and were assessed for weight gain, linear growth and incidence of diarrhea and fever.
Results: Neither of the tested probiotics (L. paracasei Lpc-37 or B. lactis HN019) had any influence on weight gain or linear growth. There was no significant difference between the groups in incidence of diarrhea and fever over the whole study period. However, during the wet season, in the months of August and September, the incidence of diarrhea was significantly higher in the placebo group (16.9%) than in the L. paracasei Lpc-37 (11.7%) and B. lactis HN019 (8.4%) groups. Similarly, the incidence of fever was significantly higher in the month of August in the placebo group (11.5%) than in the L. paracasei Lpc-37 group (7%) and the B. lactis HN019 group (7.3%). Probiotic supplementation had no effect on fecal calprotectin, but fecal IgA and serum interleukin 8 were decreased significantly in the B. lactis HN019 group compared to placebo. Consumption of L. paracasei Lpc-37 led to increased levels of fecal L. paracasei.
Conclusion: During the rainy season, when the incidence of fever and diarrhea was highest, the administered probiotics reduced the incidence of these symptoms. Over the whole study period, however, the probiotics did not influence the incidence of diarrhea or fever.

Open Access Original Research Article

Mothers’ Nutritional Knowledge, Infant Feeding Practices and Nutritional Status of Children (0-24 Months) in Lagos State, Nigeria

I. A. Akeredolu, J. O. Osisanya, J. S. Seriki-Mosadolorun, U. Okorafor

European Journal of Nutrition & Food Safety, Page 364-374
DOI: 10.9734/EJNFS/2014/7604

Aim: This study examined the nutritional knowledge, infant feeding practices of mothers and the nutritional status of children in Lagos State, Nigeria.
Study Design: A cross sectional survey design was used.
Place and Duration of Study: The study was conducted in three selected Local Government Areas (LGAs) of Lagos state. The LGAs were Ikeja, Shomolu and Ikorodu representing urban, sub-urban and rural areas respectively.
Methodology: A validated questionnaire and a group interview were used as the instruments for data collection. Data were collected from 300 randomly selected mothers of children aged 0 to 24 months who visited three Government-owned childcare centres in Lagos State, Nigeria. Anthropometric indices were used to determine the children’s nutritional status. The data obtained from the mothers were analyzed using simple percentages and frequency counts. Epi Info 6 was used to analyze the anthropometric data.
Results: The findings indicated that the mothers’ nutritional knowledge, as revealed by the test score, was fairly good. The majority (75%) of the respondents breastfed their children, but only 14.7% of the mothers practiced exclusive breastfeeding for six months, while 43.3% of the mothers, in addition to breastfeeding, introduced complementary foods for their children at 4-6 months of age. About 16% of the mothers introduced complementary feeding and solid foods to their infants before 6 months of age. Occupation type was the most influential factor (45%) affecting breastfeeding and appropriate complementary feeding practices. Also, 16.3% of the children were stunted, while the prevalence of underweight and wasting was 13% and 10% respectively. The chi-square test showed a statistically significant association (p<0.05) between the mothers’ nutritional knowledge and the children’s nutritional status.
Conclusion: These findings are of public health concern. It is therefore recommended that the duration of maternity leave should be increased. Women of child bearing age should be educated by trained nutritionists on the types of locally available foods that promote growth in children.

Open Access Original Research Article

Dietary Intake and Health Risk Assessment of Polybrominated Diphenyl Ethers in the Netherlands Based on Data Collected in 2004 and 2008

M. J. Zeilmaker, B. G. H. Bokkers, J. D. TE Biesebeek, M. J. B. Mengelers, C. W. Noorlander

European Journal of Nutrition & Food Safety, Page 535-557
DOI: 10.9734/EJNFS/2014/6756

Brominated flame retardants, such as polybrominated diphenyl ethers (PBDEs), are environmental contaminants which have entered the human food chain. In this context, the concentrations of several PBDEs were measured in food items commonly available in the Netherlands in 2004 and 2008. In food, BDE-47, -99 and -100 were analysed and detected in 2004 and 2008, whereas BDE-209 was only analysed and detected in 2008. The highest BDE concentrations were found in seafood (fatty fish and crustaceans). The life-long dietary intake of these compounds in humans was calculated using the concentration data. For BDE-47, -99 and -100, the intake in 2008 was higher than in 2004.
A risk assessment based on the most sensitive toxic effects of PBDEs in experimental animals was possible for BDE-47, -99 and -209 (but not for BDE-100 [3]). These effects consist of neurodevelopmental toxicity resulting from disturbed growth of the central nervous system (BDE-47, -99 and -209) and reproductive toxicity resulting from disturbed spermatogenesis after intrauterine exposure (BDE-99).
A risk assessment based on the dietary exposure of individual PBDE congeners revealed that in The Netherlands the dietary exposure to PBDE-47 and -209 does not pose a health concern with respect to neurodevelopmental toxicity. However, with regard to reproductive toxicity and neurodevelopmental toxicity the dietary exposure in The Netherlands to BDE-99 is of potential health concern.
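The sketch below illustrates a generic dietary-exposure calculation of the kind described above: concentration multiplied by consumption is summed over food groups, normalised by body weight and compared with a toxicological reference point as a margin of exposure. All concentrations, consumption figures and the reference point are invented placeholders, not the Dutch survey data or the reference doses used in this assessment.

```python
# Generic chronic dietary-exposure sketch: intake = sum(concentration x consumption)
# per kg body weight, compared with a reference point as a margin of exposure (MOE).
# All values are invented placeholders, not the Dutch survey data.

foods = {
    # food group: (concentration in ng/g, mean consumption in g/day)
    "fatty fish":  (0.50,  15.0),
    "crustaceans": (0.30,   5.0),
    "dairy":       (0.05, 300.0),
    "meat":        (0.08, 100.0),
}
BODY_WEIGHT_KG = 70.0
REFERENCE_POINT_NG_PER_KG_BW_DAY = 1000.0  # placeholder toxicological reference point

intake_ng_per_day = sum(conc * cons for conc, cons in foods.values())
intake_ng_per_kg_bw_day = intake_ng_per_day / BODY_WEIGHT_KG
margin_of_exposure = REFERENCE_POINT_NG_PER_KG_BW_DAY / intake_ng_per_kg_bw_day

print(f"estimated intake: {intake_ng_per_kg_bw_day:.2f} ng/kg bw/day, "
      f"margin of exposure: {margin_of_exposure:.0f}")
```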

Open Access Original Research Article

Finding the Optimum Scenario in Risk-benefit Assessment: An Example on Vitamin D

F. L. Berjia, J. Hoekstra, H. Verhagen, M. Poulsen, R. Andersen, M. Nauta

European Journal of Nutrition & Food Safety, Page 558-576
DOI: 10.9734/EJNFS/2014/9285

Background: In risk-benefit assessment of foods and nutrients, several studies so far have focused on the comparison of two scenarios to weigh health effects against each other. One obvious next step is finding the optimum scenario that provides the maximum net health gain.
Aim: This paper aims to show a method for finding the optimum scenario that provides maximum net health gains. 
Methods: A multiple-scenario simulation. The method is presented using vitamin D intake in Denmark as an example. In addition to the reference scenario, several alternative scenarios are simulated to identify the scenario that provides the maximum net health gain. Disability Adjusted Life Years (DALYs) were used as the common health metric to project the net health effect, using the QALIBRA (Quality of Life for Benefit Risk Assessment) software.
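The following sketch shows, in schematic form, the scenario comparison described above: each scenario is scored by its net health impact in DALYs, and the scenario with the largest net gain relative to the reference is selected. The numbers and the two-component DALY bookkeeping are assumptions for illustration; the actual assessment used the QALIBRA software and Danish data.

```python
# Schematic scenario comparison by net health effect in DALYs (lower is better).
# All DALY figures are invented placeholders; this is not the QALIBRA model.

scenarios = {
    # name:           (DALYs caused by adverse effects, DALYs averted by benefits)
    "reference":       (1200.0,    0.0),
    "moderate intake": (1250.0,  900.0),
    "high intake":     (1400.0, 1100.0),
}

def net_dalys(caused: float, averted: float) -> float:
    """Net health impact of a scenario in DALYs (lower is better)."""
    return caused - averted

reference_net = net_dalys(*scenarios["reference"])
# Net health gain of each scenario relative to the reference scenario.
gains = {name: reference_net - net_dalys(*vals) for name, vals in scenarios.items()}
optimum = max(gains, key=gains.get)
print(f"optimum scenario: {optimum} "
      f"(net gain vs reference: {gains[optimum]:.0f} DALYs averted)")
```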
Results: The method used in the vitamin D example shows that it is feasible to find an optimum scenario that provides the maximum net health gain in a risk-benefit assessment of dietary exposure, as expressed by the serum vitamin D level. In the vitamin D assessment, a considerable health gain is observed, owing to the reduced risk of other-cause mortality, falls and hip fractures, when changing from the reference to the optimum scenario.
Conclusion: The method allowed us to find the optimum serum level in the vitamin D example. Additional case studies are needed to further validate the applicability of the approach to other nutrients or foods, especially with regard to the uncertainty that usually attends the data.

Open Access Original Research Article

Antioxidant and Neuroprotective Effect of Organic and Conventional White Grape Juices on Oxidative Stress Induced by Sodium Azide in Cerebral Cortex of Rats

Bárbara Roberta Ongaratti, Fernanda De Souza Machado, Niara Da Silva Medeiros, Camila Destri, Edson Ribeiro Da Silva, André Quincozes-Santos, Caroline Dani, Cláudia Funchal

European Journal of Nutrition & Food Safety, Page 592-603
DOI: 10.9734/EJNFS/2014/8470

Aims: Diet plays an important role in the prevention of some diseases related to oxidative stress. Although the beneficial effects of a moderate intake of wine are well known, information about the antioxidant properties of grape juice is still limited. Therefore, the objective of this study was to investigate the in vitro neuroprotective effect of conventional and organic white grape juices (Vitis labrusca) on oxidative stress induced by sodium azide in the cerebral cortex of 10-day-old rats.
Study Design: Experimental study using an animal model.
Place and Duration of Study: Biochemistry Laboratory, Methodist University Center – IPA, Porto Alegre, RS, Brazil, between January and December 2012.
Methodology: Cerebral cortex was homogenized and preincubated for 30 minutes with 40% (w/v) conventional or organic white grape juice, after which the homogenates were incubated for 1 hour with 5 mM sodium azide in the presence or absence of the juices. Assays of thiobarbituric acid reactive substances (TBARS), carbonyl and sulfhydryl content, the activity of the antioxidant enzymes catalase (CAT) and superoxide dismutase (SOD), and the production of nitric oxide (NO) were performed on the homogenates.
Results: Sodium azide enhanced lipid peroxidation (TBARS) and carbonyl content and reduced the non-enzymatic antioxidant defenses (sulfhydryl) in cerebral cortex homogenates. Moreover, sodium azide inhibited the activity of CAT and SOD and enhanced NO levels. Conventional and organic white grape juices prevented the effects caused by sodium azide in the TBARS, carbonyl, sulfhydryl and CAT assays.
Conclusion: These results indicate that sodium azide induces oxidative stress in the cerebral cortex of young rats and that conventional and organic white grape juices exhibit antioxidant properties capable of ameliorating the oxidative damage caused by this compound.

Open Access Original Research Article

A Retrospective Study on the Relationship of Changes in Likes/Dislikes with Food Habits in 4- and 6-Year-Old Children

Tomoko Osera, Setsuko Tsutie, Misako Kobayashi, Nobutaka Kurihara

European Journal of Nutrition & Food Safety, Page 604-613
DOI: 10.9734/EJNFS/2014/10604

Objective: This is a retrospective cohort study investigating the food habits associated with changes in children’s likes/dislikes, on the basis of questionnaires answered by the mothers of 222 children.
Methods: The questionnaire data were analyzed for children aged 4 years at the beginning of the first year of kindergarten and for the same children at the end of the second year, when they were 6 years old. The questionnaire included 18 questions regarding the children’s lifestyle, likes/dislikes and attitude toward foods, and the guardians’ food habits. The Kruskal-Wallis test was performed to compare the ordered categorical outcomes among four groups defined by changes in likes/dislikes: “(+) to (−)” means that the children disliked some foods as 4-year-olds and disliked no foods as 6-year-olds; the other groups were “(+) to (+)”, “(−) to (+)” and “(−) to (−)”.
Results: In total, 71.0% of responses were “(+) to (+)”, 10.0% were “(+) to (−)”, 6.5% were “(−) to (+)” and 12.5% were “(−) to (−)”. Among the four groups, “Respect for food” (p<0.001), “Enjoying school lunches” (p<0.01) and “Family’s deviated food habits” (p<0.01) varied significantly. Children in the “(+) to (−)” group showed significantly more favorable behaviors than those in the “(+) to (+)” group.
Conclusion: Changes in children’s likes/dislikes over the 2-year kindergarten course may be related to “Respect for food”, “Enjoying school lunches” and “Family’s deviated food habits”. Therefore, addressing these habits may be important when trying to change children’s likes/dislikes.

Open Access Review Article

Use of Nanomaterials in the Detection of Food Contaminants

Sachin K Sonawane, Shalini S. Arya, Jean Guy LeBlanc, Neetu Jha

European Journal of Nutrition & Food Safety, Page 301-317
DOI: 10.9734/EJNFS/2014/6218

Food safety plays an important role in public health and thus in society as a whole. Foodborne illness associated with toxins, pathogens or other food contaminants poses a serious health threat all over the world. Food may become unsafe due to the presence of adulterants such as melamine, foodborne pathogenic bacteria, and toxins such as cholera toxin, Shiga toxin and aflatoxins, amongst others. The present review focuses on the potential role of nanomaterials, which are currently being used in various biosensors, for the detection of various chemical contaminants, toxins and pathogens.