Silver Spring, MD, United States

Minnikanti S.,Dakota Consulting Inc. | Minnikanti S.,U.S. National Institute of Standards and Technology | Gangopadhyay A.,George Mason University | Gangopadhyay A.,U.S. National Institute of Standards and Technology | Reyes D.R.,U.S. National Institute of Standards and Technology
Polymers | Year: 2014

The formation of polyelectrolyte multilayers (PEMs), first demonstrated two decades ago as a simple and efficient way to assemble films on charged substrates, has proven to be a reliable method for obtaining structures tunable at the nanometer scale. Much effort has been devoted to assembling these structures for use in biological applications. A number of these efforts have combined PEMs with microfluidic systems, which add a new dimension to the nanoassembly already possible with polyelectrolytes and enable the construction of valuable structures, some of them not achievable with conventional systems. This review focuses on the advances demonstrated by the combination of PEMs and microfluidic systems and on their use in biological applications. © 2014 by the authors.


Goodwillie C.,East Carolina University | Ness J.M.,Dakota Consulting Inc
American Journal of Botany | Year: 2013

Premise of the study: The roles of hybridization and mating systems in the evolution of angiosperms have been well studied, but less work has focused on their interactions. Self-incompatible and self-compatible species often show asymmetry in heterospecific pollen rejection. Self-fertilization can preempt ovules before opportunities for hybridization arise. In turn, hybridization might affect mating system evolution through selection for selfing to avoid production of low-fitness hybrids. Methods: AFLP and morphological analyses were used to test for hybrids in a contact zone between species with contrasting breeding systems. Crossing experiments examined the relative contributions to reproductive isolation of pollen-pistil interactions, timing of self-fertilization, and F1 viability and fertility. A diallel cross of siblings tested for an association between heterospecific incompatibility and S-genotype in the self-incompatible species. Key results: A low frequency of hybrids was detected in the contact zone. Pollen-pistil interactions were partially consistent with the SI × SC rule; some individuals of the self-incompatible species rejected heterospecific pollen, whereas the self-compatible species was fully receptive to it. In the selfing species, individuals with early selfing produced fewer hybrid progeny than did those with delayed self-compatibility when heterospecific pollen was applied after self-pollen. Viability of F1s was high, but fertility was low. Variability in heterospecific pollen rejection was not related to S-genotype. Conclusions: Both self-fertilization and self-incompatibility are associated with limits to hybridization at this site. The strong effect of timing of selfing on production of low-fitness F1s suggests that hybridization might select for early selfing in this population. © 2013 Botanical Society of America.


Barletta I.,Chalmers University of Technology | Larborn J.,Chalmers University of Technology | Mani M.,Dakota Consulting Inc. | Johansson B.,Chalmers University of Technology
Sustainability (Switzerland) | Year: 2016

There is a lack of structured methodologies to support stakeholders in assessing the sustainability aspects of e-waste management. Moreover, the increasing volume of electronic waste (e-waste) and the availability of automated e-waste treatment solutions demand frequent reconfigurations of facilities for efficient e-waste management. To fill this gap and guide such ongoing developments, this paper proposes a novel methodological framework for assessing, visualizing and comparing the sustainability impacts (economic, environmental and social) resulting from changes applied to an e-waste treatment facility. The methodology encompasses several methods, such as discrete event simulation, life cycle assessment and stakeholder mapping. A newly developed demonstrator for sorting e-waste is presented to illustrate the application of the framework. Not only did the methodology generate useful information for decision making, but it also helped identify requirements for further assessing the broader impacts on the social landscape in which e-waste management systems operate. These results differ from those of previous studies, which have lacked a holistic approach to addressing sustainability. Such an approach is important for truly measuring the efficacy of sustainable e-waste management. Potential future applications of the framework are envisioned in production systems handling waste streams other than electronics. © 2016 by the authors.
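
Since the framework leans on discrete event simulation to quantify such impacts, a minimal sketch may help make the idea concrete. The two-station line, processing times, and energy rates below are invented placeholders, not values from the paper.

```python
import heapq
import random

# Minimal discrete event simulation of a hypothetical e-waste sorting line.
# Two stations (manual disassembly, automated sorting) process items in
# series; we track throughput and energy use as simple sustainability
# indicators. All parameters are illustrative assumptions.

random.seed(1)
ARRIVAL_MEAN = 5.0                                       # minutes between arrivals
PROCESS_TIME = {"disassembly": 4.0, "sorting": 2.0}      # minutes per item
ENERGY_PER_MIN = {"disassembly": 0.1, "sorting": 0.5}    # kWh per busy minute

def simulate(horizon=480.0):
    events = [(random.expovariate(1 / ARRIVAL_MEAN), "arrival")]
    free_at = {"disassembly": 0.0, "sorting": 0.0}
    energy, done = 0.0, 0
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            heapq.heappush(events, (t + random.expovariate(1 / ARRIVAL_MEAN), "arrival"))
            start = max(t, free_at["disassembly"])       # wait for the station
            free_at["disassembly"] = start + PROCESS_TIME["disassembly"]
            energy += PROCESS_TIME["disassembly"] * ENERGY_PER_MIN["disassembly"]
            heapq.heappush(events, (free_at["disassembly"], "to_sorting"))
        else:                                            # item moves to sorting
            start = max(t, free_at["sorting"])
            free_at["sorting"] = start + PROCESS_TIME["sorting"]
            energy += PROCESS_TIME["sorting"] * ENERGY_PER_MIN["sorting"]
            if free_at["sorting"] <= horizon:
                done += 1
    return done, energy

items, kwh = simulate()
print(f"items sorted in one shift: {items}, energy used: {kwh:.1f} kWh")
```

Rerunning such a model with a reconfigured line (say, a second sorting station) and comparing the indicators is the kind of before/after assessment the framework formalizes.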


Sonmez M.T.,U.S. National Institute of Standards and Technology | Sonmez M.T.,Dakota Consulting Inc | Peralta R.,U.S. National Institute of Standards and Technology
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2015

A generic way to design lightweight cryptographic primitives is to construct simple rounds using small nonlinear components such as 4×4 S-boxes and to apply these iteratively (e.g., PRESENT [1] and SPONGENT [2]). Implementing such a primitive efficiently requires efficient implementations of its internal components. The multiplicative complexity of a function is the minimum number of AND gates required to implement it by a circuit over the basis (AND, XOR, NOT). It is known that multiplicative complexity grows exponentially in the number of input bits n. Thus it came as a surprise that circuits using at most three AND gates were found for all 65,536 functions on four bits [3]. In this paper, we verify this result and extend it to five-variable Boolean functions. We show that the multiplicative complexity of a Boolean function with five variables is at most four. © Springer International Publishing Switzerland 2015 (outside the US)
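
As a small illustration of the notion (this example is ours, not from the paper): the 3-input majority function, naively written with three AND gates, can be computed with a single AND over the basis (AND, XOR, NOT). The sketch below verifies the equivalence by brute force and counts the AND gates used.

```python
from itertools import product

# Multiplicative complexity counts only AND gates; XOR and NOT are "free".
# We wrap AND so every invocation is counted.

and_gates = 0

def AND(a, b):
    global and_gates
    and_gates += 1
    return a & b

def majority_naive(x, y, z):
    return AND(x, y) ^ AND(x, z) ^ AND(y, z)   # three AND gates

def majority_mc1(x, y, z):
    return AND(x ^ y, x ^ z) ^ x               # one AND gate

# Verify the two circuits agree on all 8 inputs.
for x, y, z in product((0, 1), repeat=3):
    assert majority_naive(x, y, z) == majority_mc1(x, y, z)

and_gates = 0
majority_mc1(1, 0, 1)
print("AND gates used by the optimized circuit:", and_gates)  # -> 1
```

Finding such minimal-AND circuits for all functions on four (and now five) bits is exactly the kind of result the paper verifies and extends.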


Mani M.,Dakota Consulting Inc. | Mani M.,U.S. National Institute of Standards and Technology | Larborn J.,Chalmers University of Technology | Johansson B.,Chalmers University of Technology | And 2 more authors.
Journal of Manufacturing Science and Engineering, Transactions of the ASME | Year: 2016

Sustainability assessments depend on accurate measures of the energy, material, and other resources used by the processes involved in the life cycle of a product. Manufacturing accounts for about one-fifth of the energy consumption in the U.S.; minimizing energy and material consumption in this sector promises to dramatically reduce our energy dependence. To this end, ASTM International [1] has formed both a Committee on Sustainability (E60) and a Subcommittee on Sustainable Manufacturing (E60.13). This paper describes ASTM's new guide for characterizing the environmental aspects of manufacturing processes [2]. The guide defines a generic representation to support structured descriptions of processes. Representations of multiple unit manufacturing processes (UMPs) can be linked together to support system-level analyses, such as simulation and evaluation of a series of manufacturing processes used in the manufacture and assembly of parts. The result is the ability to more accurately assess and improve the sustainability of production processes. Simulation is commonly used in manufacturing industries to assess process performance at the system level and to understand behaviors of and interactions between processes. This paper explores the use of the concepts outlined in the standard through three use cases based on an industrial example from the pulp and paper industry. The intent of the use cases is to show the utility of the standard as a guideline for composing data to characterize manufacturing processes. The data, besides being useful for descriptive purposes, are used in a simulation model to assess the sustainability of a manufacturing system. Copyright © 2016 by ASME.
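
To make the idea of linking UMP representations concrete, here is a minimal sketch under assumed data; the field names and the pulp-and-paper numbers are illustrative, not drawn from the ASTM guide or the paper.

```python
from dataclasses import dataclass

# Sketch of chaining unit manufacturing process (UMP) records for a
# system-level sustainability roll-up. All values are invented placeholders.

@dataclass
class UMP:
    name: str
    energy_kwh_per_unit: float   # energy consumed per unit of output
    material_kg_per_unit: float  # raw material consumed per unit of output
    yield_fraction: float        # good output per unit of input

def assess(line, units_out):
    """Walk the line backwards so upstream stages account for yield losses."""
    energy = material = 0.0
    required = units_out
    for ump in reversed(line):
        required /= ump.yield_fraction
        energy += required * ump.energy_kwh_per_unit
        material += required * ump.material_kg_per_unit
    return energy, material

line = [
    UMP("pulping",  energy_kwh_per_unit=2.0, material_kg_per_unit=1.2, yield_fraction=0.95),
    UMP("pressing", energy_kwh_per_unit=0.8, material_kg_per_unit=0.0, yield_fraction=0.99),
    UMP("drying",   energy_kwh_per_unit=3.5, material_kg_per_unit=0.0, yield_fraction=0.98),
]

kwh, kg = assess(line, units_out=1000)
print(f"energy: {kwh:.0f} kWh, raw material: {kg:.0f} kg per 1000 units")
```

Structuring each process record the same way is what lets separate UMP characterizations be composed into one system-level simulation, which is the point of the guide.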


Sharp N.,U.S. National Institute of Standards and Technology | Fassett J.D.,Dakota Consulting Inc. | Simons D.S.,U.S. National Institute of Standards and Technology
Journal of Vacuum Science and Technology B: Nanotechnology and Microelectronics | Year: 2016

Secondary ion mass spectrometry (SIMS) plays an important role in nuclear forensics through its ability to identify isotopic ratios of particles accurately and precisely from samples obtained by inspectors [Boulyga et al., J. Anal. At. Spectrom. 30, 1469 (2015)]. As the particle mass can be on the order of subpicograms, it is important to maximize the U+ sample utilization efficiency to make high-quality isotopic measurements. The influence of primary ion beam species and polarity on U+ sample utilization efficiency has been previously investigated by Ranebo et al. [J. Anal. At. Spectrom. 24, 277 (2009)]. However, the effect of the sample substrate on uranium ion production efficiency and sputtering profile has not been investigated. This work explores those influences on sample utilization efficiency by analyzing monodisperse uranium oxide microspheres deposited onto graphite and silicon planchets. The particles were mapped using an automated scanning electron microscope, and their coordinates were converted to the SIMS coordinate system using fiducial marks. Results indicate higher U+ sample utilization efficiencies when sputtering with O- and O2- on graphite planchets compared with O2+, whereas O2- gave higher U+ sample utilization efficiencies on silicon wafers compared to O- and O2+. Additionally, during sputtering of uranium particles on silicon wafers with O- and O2-, a sudden drop in U+ signal intensity was observed that was not present during sputtering with O2+ or with any primary ion species for particles on graphite. This drop in U+ signal intensity occurred simultaneously with an increase in UO+ and UO2+ signals, indicating a change in the local matrix around the uranium particle that is unique to silicon compared to graphite. © 2016 U.S. Government.
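
As a rough illustration of what sample utilization efficiency means at these particle sizes, here is a back-of-the-envelope sketch of ours; the particle diameter and the detected count total are hypothetical, not measurements from the paper.

```python
import math

# Sample utilization efficiency (useful yield), roughly: detected U+ ions
# divided by U atoms available in the particle, assuming the particle is
# fully consumed. All inputs below are illustrative assumptions.

diameter_um = 1.0               # monodisperse UO2 microsphere (assumed size)
density_g_cm3 = 10.97           # UO2 bulk density
molar_mass_uo2 = 270.03         # g/mol
avogadro = 6.022e23

radius_cm = diameter_um * 1e-4 / 2
volume_cm3 = (4 / 3) * math.pi * radius_cm ** 3
u_atoms = density_g_cm3 * volume_cm3 / molar_mass_uo2 * avogadro

detected_u_ions = 2.0e7         # integrated U+ counts (assumed)
efficiency = detected_u_ions / u_atoms
print(f"U atoms in particle: {u_atoms:.2e}")                 # ~1e10
print(f"sample utilization efficiency: {efficiency:.2%}")    # fraction of a percent
```

A 1 µm particle holds only about 10^10 uranium atoms (a few picograms), which is why squeezing out every detectable U+ ion matters for isotopic precision.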


Awad G.,Dakota Consulting Inc. | Over P.,U.S. National Institute of Standards and Technology | Kraaij W.,TNO | Kraaij W.,Radboud University Nijmegen
ACM Transactions on Information Systems | Year: 2014

This article presents an overview of the video copy detection benchmark which was run over a period of four years (2008-2011) as part of the TREC Video Retrieval Evaluation (TRECVID) workshop series. The main contributions of the article include: i) an examination of the evolving design of the evaluation framework and its components (system tasks, data, measures); ii) a high-level overview of results and best-performing approaches; and iii) a discussion of lessons learned over the four years. The content-based copy detection (CCD) benchmark worked with a large collection of synthetic queries, which is atypical for TRECVID, as was the use of a normalized detection cost framework. These particular evaluation design choices are motivated and appraised.
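
For readers unfamiliar with normalized detection cost, the sketch below shows the general form of such a cost, which folds misses and false alarms into a single number at a chosen operating point. The cost weights and target rate are illustrative assumptions, not the official TRECVID parameters.

```python
# Sketch of a normalized detection cost rate (NDCR) in the style used for
# TRECVID-like copy detection evaluations. Parameter values are assumed.

C_MISS = 1.0      # cost of missing a true copy
C_FA = 1.0        # cost of a false alarm
R_TARGET = 0.005  # assumed rate of true copies in the query stream

def ndcr(p_miss, r_fa):
    """p_miss: miss probability in [0, 1].
    r_fa: false alarms per unit of query time."""
    beta = C_FA / (C_MISS * R_TARGET)
    return p_miss + beta * r_fa

# A system that misses 20% of copies and raises 0.1 false alarms per hour:
print(f"NDCR = {ndcr(0.20, 0.1 / 3600):.3f}")
```

Because beta scales false alarms by their assumed rarity, the same system can rank very differently under different application profiles, which is one reason the article appraises this design choice.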


Parikh H.,U.S. National Institute of Standards and Technology | Parikh H.,Dakota Consulting Inc. | Mohiyuddin M.,Bina Technologies | Lam H.Y.K.,Bina Technologies | And 10 more authors.
BMC Genomics | Year: 2016

Background: The human genome contains variants ranging in size from small single nucleotide polymorphisms (SNPs) to large structural variants (SVs). High-quality benchmark small-variant calls for the pilot National Institute of Standards and Technology (NIST) Reference Material (NA12878) have been developed by the Genome in a Bottle Consortium, but no similarly high-quality benchmark SV calls exist for this genome. Since SV callers output highly discordant results, we developed methods to combine multiple forms of evidence from multiple sequencing technologies to classify candidate SVs into likely true or false positives. Our method (svclassify) calculates annotations from one or more aligned BAM files from many high-throughput sequencing technologies, and then builds a one-class model using these annotations to classify candidate SVs as likely true or false positives. Results: We first used pedigree analysis to develop a set of high-confidence breakpoint-resolved large deletions. We then used svclassify to cluster and classify these deletions, as well as a set of high-confidence deletions from the 1000 Genomes Project and a set of breakpoint-resolved complex insertions from Spiral Genetics. We find that likely SVs cluster separately from likely non-SVs based on our annotations, and that the SVs cluster into different types of deletions. We then developed a supervised one-class classification method that uses a training set of random non-SV regions to determine whether candidate SVs have abnormal annotations different from most of the genome. To test this classification method, we use our pedigree-based breakpoint-resolved SVs, SVs validated by the 1000 Genomes Project, and assembly-based breakpoint-resolved insertions, along with semi-automated visualization using svviz. Conclusions: We find that candidate SVs with high scores from multiple technologies have high concordance with PCR validation and with an orthogonal consensus method, MetaSV (99.7% concordant), whereas candidate SVs with low scores are questionable. We distribute a set of 2,676 high-confidence deletions and 68 high-confidence insertions with high svclassify scores from these call sets for benchmarking SV callers. We expect these methods to be particularly useful for establishing high-confidence SV calls for benchmark samples that have been characterized by multiple technologies. © 2016 Parikh et al.
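
A minimal sketch of the one-class idea follows; the two annotations and all numbers are invented for illustration, and svclassify's actual feature set and model differ.

```python
import numpy as np

# One-class intuition: learn what "normal" genomic regions look like from
# random non-SV regions, then flag candidate SVs whose annotations deviate.
# Features here (normalized read depth, discordant read-pair fraction) are
# illustrative assumptions, not the paper's annotations.

rng = np.random.default_rng(0)

# 5000 random non-SV training regions: depth near 1.0, few discordant pairs.
train = np.column_stack([rng.normal(1.0, 0.1, 5000),
                         rng.beta(1, 50, 5000)])
mu, sigma = train.mean(axis=0), train.std(axis=0)

def outlier_score(annotations):
    """Max absolute z-score across annotations; high => unlike normal regions."""
    return np.max(np.abs((annotations - mu) / sigma), axis=1)

candidates = np.array([
    [0.48, 0.35],   # halved depth, many discordant pairs: deletion-like
    [1.02, 0.01],   # ordinary-looking region: likely false positive
])
print(outlier_score(candidates))  # first score large, second small
```

The key design choice is that only "normal" regions are needed for training, sidestepping the lack of a trusted SV truth set, which is exactly the problem the paper is addressing.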


Awad G.,Dakota Consulting Inc | Snoek C.G.M.,University of Amsterdam | Smeaton A.F.,Dublin City University | Quenot G.,University Grenoble Alpes | Quenot G.,French National Center for Scientific Research
ITE Transactions on Media Technology and Applications | Year: 2016

Semantic indexing, or assigning semantic tags to video samples, is a key component for content-based access to video documents and collections. The Semantic Indexing task was run at TRECVID from 2010 to 2015 with the support of NIST and the Quaero project. As with the previous High-Level Feature detection task, which ran from 2002 to 2009, the semantic indexing task aims at evaluating methods and systems for detecting visual, auditory or multi-modal concepts in video shots. In addition to the main semantic indexing task, four secondary tasks were proposed, namely the "localization" task, the "concept pair" task, the "no annotation" task, and the "progress" task. The task attracted over 40 research teams during its running period. It was conducted using a total of 1,400 hours of video data drawn from Internet Archive videos with Creative Commons licenses gathered by NIST. 200 hours of new test data were made available each year, plus 200 more as development data in 2010. The number of target concepts to be detected started at 130 in 2010 and was extended to 346 in 2011. Both the increase in the volume of video data and the increase in the number of target concepts favored the development of generic and scalable methods. Over 8 million direct shot×concept annotations, plus over 20 million indirect ones, were produced by the participants and the Quaero project on a total of 800 hours of development data. Significant progress was accomplished during the period, as was accurately measured in the context of the progress task and also in some of the participants' contrast experiments. This paper describes the data, protocol and metrics used for the main and the secondary tasks, the results obtained, and the main approaches used by participants. © 2016 by ITE Transactions on Media Technology and Applications (MTA).
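
Concept-detection runs in such benchmarks are scored on ranked shot lists. The sketch below computes plain average precision as a simplified stand-in for the sampling-based extended inferred AP actually used at TRECVID; the toy run and ground truth are invented.

```python
# Scoring one concept's ranked shot list with average precision (AP).
# TRECVID semantic indexing used a sampling-based variant (extended
# inferred AP); plain AP shows the principle.

def average_precision(ranked_shots, relevant):
    """ranked_shots: shot ids ordered by detector confidence (best first).
    relevant: set of shot ids truly containing the concept."""
    hits, precision_sum = 0, 0.0
    for rank, shot in enumerate(ranked_shots, start=1):
        if shot in relevant:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / len(relevant) if relevant else 0.0

# Toy run for one concept over six shots:
run = ["s3", "s1", "s7", "s2", "s9", "s4"]
truth = {"s3", "s2", "s4"}
print(f"AP = {average_precision(run, truth):.3f}")  # -> 0.667
```

Averaging AP over all target concepts gives a single per-system figure, which is what makes year-over-year progress measurable in the "progress" task.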


Kelsey J.,U.S. National Institute of Standards and Technology | McKay K.A.,U.S. National Institute of Standards and Technology | Turan M.S.,U.S. National Institute of Standards and Technology | Turan M.S.,Dakota Consulting Inc
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2015

Random numbers are essential for cryptography. In most real-world systems, these values come from a cryptographic pseudorandom number generator (PRNG), which in turn is seeded by an entropy source. The security of the entire cryptographic system then relies on the accuracy of the claimed amount of entropy provided by the source. If the entropy source provides less unpredictability than is expected, the security of the cryptographic mechanisms is undermined, as in [5, 7, 10]. For this reason, correctly estimating the amount of entropy available from a source is critical. In this paper, we develop a set of tools for estimating entropy, based on mechanisms that attempt to predict the next sample in a sequence based on all previous samples. These mechanisms are called predictors. We develop a framework for using predictors to estimate entropy, and test them experimentally against both simulated and real noise sources. For comparison, we subject the entropy estimates defined in the August 2012 draft of NIST Special Publication 800-90B [4] to the same tests, and compare their performance. © International Association for Cryptologic Research 2015.
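
To illustrate the predictor idea, here is a bare sketch of ours, not the paper's estimator suite: a most-common-value predictor guesses that the next sample repeats the value seen most often so far, and its hit rate converts to a min-entropy estimate. Real estimators, such as those in SP 800-90B, combine several predictors and apply a confidence bound to the hit rate.

```python
import math
from collections import Counter

# Predictor-based min-entropy estimation, bare-bones: the better a simple
# predictor does at guessing the next sample, the less entropy the source has.

def mcv_predictor_estimate(samples):
    counts, hits = Counter(), 0
    for i, s in enumerate(samples):
        if i > 0 and s == counts.most_common(1)[0][0]:
            hits += 1   # predictor guessed the next sample correctly
        counts[s] += 1
    # Hit rate approximates the probability of the most likely outcome;
    # floor it so a tiny sample can't claim absurdly high entropy.
    p_hat = max(hits / (len(samples) - 1), 1 / len(counts))
    return -math.log2(p_hat)   # min-entropy estimate in bits per sample

biased = [0] * 90 + [1] * 10   # heavily biased source, 90% zeros
print(f"estimated min-entropy: {mcv_predictor_estimate(biased):.3f} bits/sample")
# ~0.15 bits/sample, close to -log2(0.9) for a 90/10 source
```

An entropy source that fools every predictor in the battery earns its claimed entropy; one that any predictor beats gets its estimate cut accordingly, which is the conservative stance the paper argues for.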
