Roy S., University of Texas Health Science Center at San Antonio | Chakravarty D., University of Texas Health Science Center at San Antonio | Chakravarty D., New York Medical College | Cortez V., University of Texas Health Science Center at San Antonio | and 7 more authors
Molecular Cancer Research | Year: 2012

Breast cancer metastasis is a major clinical problem. The molecular basis of breast cancer progression to metastasis remains poorly understood. PELP1 is an estrogen receptor (ER) coregulator that has been implicated as a proto-oncogene whose expression is deregulated in metastatic breast tumors and retained in ER-negative tumors. We examined the mechanism and significance of PELP1-mediated signaling in ER-negative breast cancer progression using two ER-negative model cell lines (MDA-MB-231 and 4T1) that stably express PELP1-shRNA. These model cells had reduced PELP1 expression (75% of endogenous levels) and exhibited less propensity to proliferate in growth assays in vitro. PELP1 downregulation substantially affected migration of ER-negative cells in Boyden chamber and invasion assays. In mechanistic studies, we found that PELP1 modulated the expression of several genes involved in the epithelial-mesenchymal transition (EMT), including MMPs, SNAIL, TWIST, and ZEB. In addition, PELP1 knockdown reduced the in vivo metastatic potential of ER-negative breast cancer cells and significantly reduced lung metastatic nodules in a xenograft assay. These results implicate PELP1 in ER-negative breast cancer metastasis, reveal a novel mechanism by which a coregulator promotes metastasis via cell motility/EMT through modulation of gene expression, and suggest that PELP1 may be a potential therapeutic target for metastatic ER-negative breast cancer. ©2011 AACR.


Navaz H.K., Kettering University | Kehtarnavaz N., UT Dallas | Jovic Z., Kettering University
Proceedings of SPIE - The International Society for Optical Engineering | Year: 2014

This work presents the development of a multi-input, multi-output neural network structure to predict the time-dependent concentration of chemical agents as they participate in chemical reactions with environmental substrates or the moisture content within these substrates. The neural network prediction is based on a computationally or experimentally produced database that includes the concentrations of all chemicals present (reactants and products) as a function of the chemical agent droplet size, wind speed, temperature, and turbulence. The prediction structure is made user-friendly via an easy-to-use graphical user interface. Furthermore, given knowledge of the time-varying environmental parameters (wind speed and temperature, which are usually recorded and available), the time-varying concentrations of all chemicals can be predicted almost instantaneously by recalling the previously trained network. The network prediction was compared with actual open-air test data and the results were found to match. © 2014 SPIE.
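As a rough illustration of the kind of multi-input, multi-output network the abstract describes, the sketch below maps a vector of environmental inputs to a vector of predicted concentrations. All layer sizes, feature names, and the number of output chemicals are assumptions for illustration, not the paper's actual architecture, and the weights here are untrained.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed inputs per sample: droplet size, wind speed, temperature,
# turbulence, and time; assumed outputs: 3 chemical concentrations.
n_in, n_hidden, n_out = 5, 16, 3

# Randomly initialized weights stand in for a trained network.
W1 = rng.standard_normal((n_in, n_hidden)) * 0.1
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_hidden, n_out)) * 0.1
b2 = np.zeros(n_out)

def predict(x):
    """Forward pass of a small multi-input, multi-output MLP:
    one tanh hidden layer, linear output layer."""
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

# A batch of 8 environmental-condition samples yields 8 concentration vectors.
batch = rng.standard_normal((8, n_in))
print(predict(batch).shape)  # (8, 3)
```

In the paper's workflow, such a network would be trained on the simulated or experimental database and then recalled at run time, which is what makes the near-instantaneous prediction possible.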


Gouissem A., Qatar University | Gouissem A., University of Burgundy | Hamila R., Qatar University | Al-Dhahir N., UT Dallas | Foufou S., Qatar University
Eurasip Journal on Advances in Signal Processing | Year: 2016

In this paper, we propose and investigate two novel techniques to perform multiple relay selection in large multi-hop decode-and-forward relay networks. The two proposed techniques exploit sparse signal recovery theory to select multiple relays using the orthogonal matching pursuit algorithm and outperform state-of-the-art techniques in terms of outage probability and computational complexity. To reduce the amount of collected channel state information (CSI), we propose a limited-feedback scheme in which only a limited number of relays feed back their CSI. Furthermore, a detailed performance-complexity tradeoff investigation is conducted for the different studied techniques and verified by Monte Carlo simulations. © 2016, The Author(s).
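The orthogonal matching pursuit algorithm at the heart of the proposed selection techniques can be sketched as follows. This is generic OMP for sparse recovery, not the paper's relay-selection formulation; the matrix, measurement, and sparsity names are illustrative.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the k columns of A
    most correlated with the residual, refitting by least squares
    on the selected support at each step."""
    residual = y.copy()
    support = []
    x_s = np.zeros(0)
    for _ in range(k):
        # Column of A most correlated with the current residual.
        idx = int(np.argmax(np.abs(A.T @ residual)))
        support.append(idx)
        # Least-squares fit restricted to the selected columns.
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x, sorted(support)

# Toy example: with an identity dictionary, OMP simply picks the
# largest-magnitude entries of y.
A = np.eye(6)
y = np.array([0.0, 2.0, 0.0, 0.0, -3.0, 0.0])
x_hat, support = omp(A, y, 2)
print(support)  # [1, 4]
```

In the relay-selection setting, the columns would correspond to candidate relays and the greedy support-selection step would identify which relays to activate, which is where the complexity advantage over exhaustive search comes from.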


Wang Y., Qatar University | Desmedt Y., UT Dallas
2014 IEEE Information Theory Workshop, ITW 2014 | Year: 2014

One important problem in secret sharing is to establish bounds on the size of the shares given to participants. Another is to reduce the computational complexity of both the secret distribution phase and the secret reconstruction phase. In this paper, we design efficient threshold (n, k) secret sharing schemes that achieve both goals. In particular, we show that if the secret size |s| is larger than max{1 + log2 n, n(n - k)/(n - 1)}, then ideal secret sharing schemes exist. The efficient ideal secret sharing schemes we construct require only XOR operations on binary strings (which is the best we could achieve). These schemes have many applications in both practice and theory. For example, they could be used to design very efficient verifiable secret sharing schemes, which have broad applications in secure multi-party computation, and to design efficient privacy-preserving data storage in cloud systems. © 2014 IEEE.
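The paper's (n, k) construction uses only XOR operations on binary strings. The minimal sketch below shows the much simpler (n, n) special case, where all n shares are required to reconstruct; it is not the paper's general threshold scheme, and the function names are illustrative.

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Bytewise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def split(secret: bytes, n: int) -> list:
    """(n, n) XOR sharing: n - 1 uniformly random pads, plus one
    final share that XORs to the secret. Any n - 1 shares are
    statistically independent of the secret."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:
        last = xor_bytes(last, s)
    return shares + [last]

def reconstruct(shares: list) -> bytes:
    """XOR all n shares together to recover the secret."""
    out = shares[0]
    for s in shares[1:]:
        out = xor_bytes(out, s)
    return out

secret = b"attack at dawn"
shares = split(secret, 5)
print(reconstruct(shares) == secret)  # True
```

The appeal of XOR-only schemes, as the abstract notes, is speed: both phases are linear scans with no field arithmetic, unlike polynomial-based constructions such as Shamir's scheme.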


News Article
Site: http://phys.org/space-news/

New research suggests that oscillating heavy particles generated "clocks" in the primordial universe that could be used to determine what produced the initial conditions that gave rise to the universe. Credit: Yi Wang and Xingang Chen

How did the universe begin? And what came before the Big Bang? Cosmologists have asked these questions ever since discovering that our universe is expanding. The answers aren't easy to determine. The beginning of the cosmos is cloaked and hidden from the view of our most powerful telescopes. Yet observations we make today can give clues to the universe's origin. New research suggests a novel way of probing the beginning of space and time to determine which of the competing theories is correct.

The most widely accepted theoretical scenario for the beginning of the universe is inflation, which predicts that the universe expanded at an exponential rate in the first fleeting fraction of a second. However, a number of alternative scenarios have been suggested, some predicting a Big Crunch preceding the Big Bang. The trick is to find measurements that can distinguish between these scenarios.

One promising source of information about the universe's beginning is the cosmic microwave background (CMB) - the remnant glow of the Big Bang that pervades all of space. This glow appears smooth and uniform at first, but upon closer inspection varies by small amounts. Those variations come from quantum fluctuations present at the birth of the universe that have been stretched as the universe expanded. The conventional approach to distinguishing different scenarios searches the CMB for possible traces of gravitational waves generated in the primordial universe.

"Here we are proposing a new approach that could allow us to directly reveal the evolutionary history of the primordial universe from astrophysical signals. This history is unique to each scenario," says coauthor Xingang Chen of the Harvard-Smithsonian Center for Astrophysics (CfA) and the University of Texas at Dallas.

While previous experimental and theoretical studies give clues to spatial variations in the primordial universe, they lack the key element of time. Without a ticking clock to measure the passage of time, the evolutionary history of the primordial universe can't be determined unambiguously.

"Imagine you took the frames of a movie and stacked them all randomly on top of each other. If those frames aren't labeled with a time, you can't put them in order. Did the primordial universe crunch or bang? If you don't know whether the movie is running forward or in reverse, you can't tell the difference," explains Chen.

This new research suggests that such "clocks" exist and can be used to measure the passage of time at the universe's birth. These clocks take the form of heavy particles, an expected product of the "theory of everything" that will unite quantum mechanics and general relativity. They are named the "primordial standard clocks." Subatomic heavy particles behave like a pendulum, oscillating back and forth in a universal and standard way. They can even do so quantum-mechanically, without being pushed initially. Those oscillations, or quantum wiggles, would act as clock ticks, adding time labels to the stack of movie frames in our analogy.

"Ticks of these primordial standard clocks would create corresponding wiggles in measurements of the cosmic microwave background, whose pattern is unique for each scenario," says coauthor Yi Wang of The Hong Kong University of Science and Technology. However, current data isn't accurate enough to spot such small variations. Ongoing experiments should greatly improve the situation.

Projects like CfA's BICEP3 and Keck Array, and many other related experiments worldwide, will gather exquisitely precise CMB data at the same time as they search for gravitational waves. If the wiggles from the primordial standard clocks are strong enough, experiments should find them in the next decade. Supporting evidence could come from other lines of investigation, like maps of the large-scale structure of the universe, including galaxies and cosmic hydrogen. And since the primordial standard clocks would be a component of the "theory of everything," finding them would also provide evidence for physics beyond the Standard Model at an energy scale inaccessible to colliders on the ground.

This research is detailed in a paper by Xingang Chen and Mohammad Hossein Namjoo (CfA/UT Dallas) and Yi Wang (The Hong Kong University of Science and Technology). It has been accepted for publication in the Journal of Cosmology and Astroparticle Physics and is available online.
