DigiPen Institute of Technology

Redmond, WA, United States

Bacha S.B.,DigiPen Institute of Technology | Bede B.,DigiPen Institute of Technology
Annual Conference of the North American Fuzzy Information Processing Society - NAFIPS | Year: 2017

In the present paper we propose and investigate different constructive approaches to approximating Mamdani fuzzy systems using Takagi-Sugeno systems. The Takagi-Sugeno fuzzy systems that we construct are based on a piecewise linear approximation, an interpolation polynomial, and an approach based on cubic splines. Since a Mamdani system with center-of-gravity defuzzification is computationally expensive, a Takagi-Sugeno system is a natural way to reduce computation while preserving performance. We extend these approaches to fuzzy rule bases with multiple antecedents. As applications, we construct a computing-with-words system using the proposed approach, and we also apply it in a video game. © 2016 IEEE.
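To make the computational contrast concrete, here is a minimal sketch of a zero-order Takagi-Sugeno system, where each rule's consequent is a single number (playing the role of the center of gravity of a Mamdani output set) and the output is a simple weighted average. The membership functions and rule values are illustrative, not taken from the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def takagi_sugeno(x, rules):
    """Weighted average of singleton consequents: cheap, no integration step."""
    num = den = 0.0
    for mf, z in rules:
        w = mf(x)
        num += w * z
        den += w
    return num / den if den else 0.0

# Three rules whose antecedents partition [0, 10]; the consequents z_i
# stand in for the centers of gravity of Mamdani output fuzzy sets.
rules = [
    (lambda x: tri(x, -5, 0, 5), 1.0),
    (lambda x: tri(x, 0, 5, 10), 4.0),
    (lambda x: tri(x, 5, 10, 15), 9.0),
]

print(takagi_sugeno(2.5, rules))  # interpolates linearly between 1.0 and 4.0
```

Between neighboring rule peaks the output is exactly the piecewise linear interpolant of the consequents, which is the simplest of the three constructions the abstract mentions.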

Ming Z.-Y.,Digipen Institute of Technology | Chua T.S.,National University of Singapore
Information Sciences | Year: 2015

Names are important atomic information carriers in unstructured text. Matching names that refer to the same entities is an important issue in text analysis and a key component in many real-world applications. Generally referred to as entity linking, it is defined as a task that aligns a name mentioned in free text to its corresponding entry in a Knowledge Base (KB). The difficulty of the task lies in the many-to-many correspondence between names and entities, which gives rise to the pseudonymity and polysemy issues. Existing work usually focuses on resolving polysemy by aggregating large numbers of loosely arranged features in supervised learning frameworks, with very few addressing pseudonymity, or both issues, in comparable depth. In this work, we tackle both issues by comprehensive modeling of an entity's name and context: we tackle pseudonymity by modeling name variants of the query name and the KB title, and polysemy by modeling heterogeneous aspects of the query and KB context. Specifically, we harness entity coreferences within query and KB documents together with external alias resources for modeling name variants, and further use the name variants to identify focused context. Moreover, we propose a recall-boosted retrieval method for efficient candidate entity generation. Experimental results show that our proposed approach outperforms the state-of-the-art systems on the benchmark data. © 2015 Elsevier Inc. All rights reserved.

Ye J.,MOZAT PTE. LTD of Singapore | Ming Z.Y.,Digipen Institute of Technology | Chua T.S.,National University of Singapore
ACM Transactions on Intelligent Systems and Technology | Year: 2016

Document summarization plays an important role in coping with information overload on the Web. Many summarization models have been proposed recently, but few try to adjust the summary length and sentence order according to application scenarios. With the popularity of handheld devices, presenting key information first in summaries of flexible length is of great convenience in terms of faster reading, quicker decision-making, and reduced network consumption. Targeting this problem, we introduce a novel task of generating summaries of incremental length. In particular, we require that the summaries have the ability to automatically adjust the balance of general versus detailed information when the summary length varies. We propose a novel summarization model that incrementally maximizes topic coverage based on the document's hierarchical topic model. In addition to the standard Rouge-1 measure, we define a new evaluation metric based on the similarity of the summaries' topic coverage distribution in order to account for sentence order and summary length. Extensive experiments on Wikipedia pages, DUC 2007, and general noninverted writing style documents from multiple sources show the effectiveness of our proposed approach. Moreover, we carry out a user study on a mobile application scenario to show the usability of the produced summary in terms of improving judgment accuracy and speed, as well as reducing the reading burden and network traffic. © 2016 ACM 2157-6904/2016/02-ART29 $15.00.
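The incremental-length idea can be illustrated with a toy greedy scheme (my own simplification, not the authors' hierarchical topic model): order sentences so that each one adds maximal new topic coverage, so any prefix of the ordering is itself a shorter, general-first summary.

```python
def incremental_summary(sentences, topics):
    """sentences: list of (text, set_of_topic_ids); topics: set of all ids.
    Returns sentence texts ordered so each adds maximal new coverage."""
    covered, order = set(), []
    remaining = list(sentences)
    while remaining and covered != topics:
        # pick the sentence covering the most not-yet-covered topics
        best = max(remaining, key=lambda s: len(s[1] - covered))
        if not (best[1] - covered):
            break  # nothing left adds coverage
        remaining.remove(best)
        covered |= best[1]
        order.append(best[0])
    return order

# Toy document: each sentence is tagged with the topics it covers.
docs = [("intro", {1, 2}), ("detail-A", {2, 3}), ("detail-B", {3}), ("wrap", {4})]
print(incremental_summary(docs, {1, 2, 3, 4}))
```

Truncating the returned list at any length yields a summary whose topic coverage degrades gracefully, which is the property the evaluation metric above is designed to reward.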

Chalco-Cano Y.,University of Tarapacá | Lodwick W.A.,University of Colorado at Denver | Bede B.,DigiPen Institute of Technology
Fuzzy Sets and Systems | Year: 2014

In the present paper we propose a variant of constraint interval arithmetic that operates with a single parameter (level) in each interval operand of an expression. This results in a computationally simple interval arithmetic that has several desirable properties not shared in general by interval operations previously defined in the literature. The extension of this approach to fuzzy arithmetic, and its usefulness there, is realized via alpha-levels. The evaluation of arithmetic expressions using the proposed single-level constrained arithmetic is discussed in detail, and we extend this theory to the fuzzy context. © 2014 Elsevier B.V. All rights reserved.
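A hedged sketch of the parametric idea behind constraint interval arithmetic: an interval [a, b] is viewed as the function x(λ) = a + λ(b − a), λ ∈ [0, 1], and every occurrence of the same interval variable shares one λ. This repairs the dependency problem of standard interval arithmetic (x − x collapses to 0 rather than [a − b, b − a]). The paper's single-level arithmetic and its alpha-level fuzzy extension are richer than this toy grid evaluation.

```python
from itertools import product

def eval_expr(expr, intervals, samples=101):
    """Evaluate expr over a grid of λ-values, one λ per interval variable,
    and return the resulting range (min, max)."""
    grid = [i / (samples - 1) for i in range(samples)]
    values = []
    for lambdas in product(grid, repeat=len(intervals)):
        xs = [a + lam * (b - a) for (a, b), lam in zip(intervals, lambdas)]
        values.append(expr(*xs))
    return min(values), max(values)

# x ∈ [1, 3]: x − x collapses to 0 because both occurrences share one λ
print(eval_expr(lambda x: x - x, [(1, 3)]))      # (0.0, 0.0)
# independent variables still combine as expected
print(eval_expr(lambda x, y: x + y, [(0, 1), (0, 1)]))
```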

Love J.J.,U.S. Geological Survey | Thomas J.N.,NorthWest Research Associates, Inc. | Thomas J.N.,Digipen Institute of Technology | Thomas J.N.,University of Washington
Geophysical Research Letters | Year: 2013

We examine the claim that solar-terrestrial interaction, as measured by sunspots, solar wind velocity, and geomagnetic activity, might play a role in triggering earthquakes. We count the number of earthquakes having magnitudes that exceed chosen thresholds in calendar years, months, and days, and we order these counts by the corresponding rank of annual, monthly, and daily averages of the solar-terrestrial variables. We measure the statistical significance of the difference between the earthquake-number distributions below and above the median of the solar-terrestrial averages by χ2 and Student's t tests. Across a range of earthquake magnitude thresholds, we find no consistent and statistically significant distributional differences. We also introduce time lags between the solar-terrestrial variables and the number of earthquakes, but again no statistically significant distributional difference is found. We cannot reject the null hypothesis of no solar-terrestrial triggering of earthquakes. Key Points: (1) solar-terrestrial triggering of earthquakes is not statistically significant; (2) hypotheses of solar-terrestrial triggering of earthquakes need to be tested; (3) solar-terrestrial variables are not useful for earthquake prediction. ©2013 American Geophysical Union. All Rights Reserved.
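A schematic version of the below/above-median comparison described here, with made-up toy data rather than actual USGS catalogs: split yearly earthquake counts by whether each year's solar-terrestrial average falls below or above its median, then compute a two-sample Student's t statistic for the difference of the two count means.

```python
import statistics

def t_statistic(group_a, group_b):
    """Two-sample t statistic with pooled variance (equal-variance form)."""
    na, nb = len(group_a), len(group_b)
    ma, mb = statistics.fmean(group_a), statistics.fmean(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (ma - mb) / (pooled * (1 / na + 1 / nb)) ** 0.5

# Toy data: yearly sunspot averages paired with counts of large earthquakes.
sunspot = [30, 80, 45, 120, 60, 95, 20, 150]
quakes = [14, 12, 16, 11, 13, 15, 12, 14]

median = statistics.median(sunspot)
below = [q for s, q in zip(sunspot, quakes) if s < median]
above = [q for s, q in zip(sunspot, quakes) if s > median]
print(t_statistic(below, above))  # small |t| => no significant difference
```

A |t| value well inside the critical region of the t distribution, as here, is exactly the "cannot reject the null hypothesis" outcome the abstract reports for the real catalogs.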

Bede B.,DigiPen Institute of Technology | Stefanini L.,Urbino University
Fuzzy Sets and Systems | Year: 2013

In the present paper, using novel generalizations of the Hukuhara difference for fuzzy sets, we introduce and study new generalized differentiability concepts for fuzzy-valued functions. Several properties of the new concepts are investigated, and they are compared to similar fuzzy differentiability notions, with connections found between them. Characterizations and relatively simple expressions are provided for the new derivatives. © 2012 Elsevier B.V. All rights reserved.
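For orientation, the generalized Hukuhara (gH) difference underlying such differentiability concepts can be sketched as follows; this is the standard definition from the fuzzy-sets literature, not a quotation from the paper itself.

```latex
% gH-difference of two (interval or fuzzy) quantities A and B
A \ominus_{gH} B = C \;\Longleftrightarrow\;
  \text{(i) } A = B + C \quad\text{or}\quad \text{(ii) } B = A + (-1)C

% For real intervals it always exists and reduces to
[a^-, a^+] \ominus_{gH} [b^-, b^+]
  = \bigl[\min(a^- - b^-,\, a^+ - b^+),\; \max(a^- - b^-,\, a^+ - b^+)\bigr]
```

Unlike the classical Hukuhara difference, case (ii) allows the difference to exist even when the first operand is "narrower" than the second, which is what makes derivative definitions based on it applicable to a much larger class of fuzzy-valued functions.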

Bede B.,DigiPen Institute of Technology | O'Regan D.,National University of Ireland
Knowledge-Based Systems | Year: 2013

In the present paper we develop a theory of pseudo-linear operators. A generalization of the classical Lp spaces to the setting of pseudo-analysis is constructed. We obtain a Riesz representation theorem together with a Lax-Milgram type lemma. © 2012 Elsevier B.V. All rights reserved.

Hanson J.,DigiPen Institute of Technology
General Relativity and Gravitation | Year: 2013

The canonical decomposition of a Lorentz algebra element into a sum of orthogonal simple (decomposable) Lorentz bivectors is defined and discussed. This decomposition on the Lie algebra level leads to a natural decomposition of a proper orthochronous Lorentz transformation into a product of commuting Lorentz transformations, each of which is the exponential of a simple bivector. While this latter result is known, we present novel formulas that are independent of the chosen form of the Lorentz metric. As an application of our methods, we obtain an alternative method of deriving the formulas for the exponential and logarithm of Lorentz transformations. © 2012 Springer Science+Business Media New York.

Portegys T.E.,DigiPen Institute of Technology
Neurocomputing | Year: 2015

The Caenorhabditis elegans nematode worm is a small well-known creature, intensely studied for decades. Its entire morphology has been mapped cell-by-cell, including its 302 neuron connectome. The connectome is a synaptic wiring diagram that also specifies neurotransmitters and junction types. It does not however specify the synaptic connection strengths. It is believed that measuring these must be done in live specimens, requiring emerging or yet to be developed techniques. Without the connection strengths, it is not known how the nematode's nervous system produces behaviors. Discovering these strengths as a set of weights is a challenging and important problem: an artificial worm embodying the connectome and trained to perform a set of behaviors taken from measurements of the actual C. elegans would behave realistically in its environment. This is a crucial step toward creating a functional artificial creature. Indeed, knowing the artificial weights might cast light on the actual ones. In this project a genetic algorithm was used to train the entire connectome, a large space of 3680 synapse weights, to learn behaviors defined as sensory-motor sequences. It was found that utilizing the topology of the connectome for local optimization and crossover significantly boosts the performance of the genetic algorithm. Using a network of artificial neurons, random sequences involving the entire connectome were successfully trained. Additionally, for locomotion training, sinusoidal body postures were observed when sensory touch neurons were stimulated. Locomotion training was done using a Fourier Transform fitness function. Finally, using the NEURON tool to simulate a biologically higher fidelity network, the pharyngeal assembly of neurons was successfully trained. © 2015 Elsevier B.V.
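A minimal genetic-algorithm sketch in the spirit of the project described above, with a toy fitness function and made-up dimensions (the actual work trains 3680 connectome synapse weights against sensory-motor sequences and uses the connectome topology to guide crossover):

```python
import random

random.seed(0)
DIM = 40  # stand-in for the 3680-weight search space
TARGET = [random.uniform(-1, 1) for _ in range(DIM)]  # stand-in for desired behavior

def fitness(weights):
    # negative squared error to the target behavior (higher is better)
    return -sum((w - t) ** 2 for w, t in zip(weights, TARGET))

def crossover(a, b):
    # swap a contiguous block of weights: a crude analogue of exchanging
    # the weights of a topologically local neighborhood of neurons
    i, j = sorted(random.sample(range(DIM), 2))
    return a[:i] + b[i:j] + a[j:]

def mutate(w, rate=0.1, scale=0.2):
    return [x + random.gauss(0, scale) if random.random() < rate else x for x in w]

pop = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(30)]
init_best = max(map(fitness, pop))
for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]  # elitism: best individuals survive unchanged
    pop = elite + [mutate(crossover(*random.sample(elite, 2))) for _ in range(20)]

best = max(pop, key=fitness)
print(init_best, fitness(best))  # fitness should improve toward 0
```

The point of the abstract's topology-aware variant is that restricting crossover and local search to connected neighborhoods of the wiring diagram prunes the search far better than the blind block swap used here.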

Van Ginneken L.P.P.P.,DigiPen Institute of Technology
Proceedings of the International Symposium on Physical Design | Year: 2016

This paper reviews theoretical advances to the field of simulated annealing optimization and applications to physical design contributed by Ralph Otten. At this year's ISPD, Ralph Otten receives the ISPD Lifetime Achievement award for outstanding contributions made to the field of physical design automation over multiple decades. For this occasion this paper highlights some of the research contributions by Ralph Otten in simulated annealing. It recounts how he moved from applications of simulated annealing to analyzing and understanding the method from a theoretical standpoint and how it was developed into a general purpose algorithm. © 2016 ACM.
