News Article | October 23, 2015
Scientists have developed a hand-held optical scanner with the potential to offer breast cancer imaging in real time. The results are reported today, Oct. 23, 2015, in the journal Biomedical Physics & Engineering Express.
Details from the inner life of a tooth: New X-ray method uses scattering to visualize nanostructures

Abstract: In both materials science and biomedical research it is important to be able to view minute nanostructures, for example in carbon-fiber materials and bones. A team from the Technical University of Munich (TUM), the University of Lund, the Charité hospital in Berlin and the Paul Scherrer Institute (PSI) has now developed a new computed tomography method based on the scattering, rather than the absorption, of X-rays. The technique makes it possible for the first time to visualize nanostructures in objects measuring just a few millimeters, allowing the researchers to view the precise three-dimensional structure of collagen fibers in a piece of human tooth.

In principle, X-ray computed tomography (CT) has been around since the 1960s: X-ray images are taken of an object from various directions, and a computer then uses the individual images to generate a three-dimensional image of the object. Contrast is produced by the differential absorption of X-rays in dissimilar materials. The new method, developed by Franz Pfeiffer, professor of Biomedical Physics at TUM, and his team, instead utilizes the scattering of X-rays rather than their absorption. The results have now been published in the journal Nature.

Scattering provides detailed images of nanostructures

Physically, X-rays act like light with a very short wavelength. This principle lies at the heart of the new method: when light is shone on a structured object, for example a CD, the reflected light produces a characteristic rainbow pattern. Although the fine grooves of the CD cannot be seen directly, the diffraction of the light rays, known as scattering, indirectly reveals the structure of the object. The same effect can be observed with X-rays, and it is this phenomenon that the researchers exploit in their new technique.
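The CD analogy can be put in numbers with the grating equation, sin θ = mλ/d: each wavelength diffracts at a different angle, which is why white light spreads into a rainbow. A minimal sketch, assuming an illustrative groove spacing of about 1.6 µm (a typical CD track pitch, not a value from the article):

```python
import math

def diffraction_angle_deg(wavelength_nm, spacing_nm, order=1):
    """First-order diffraction angle from the grating equation
    sin(theta) = m * wavelength / d."""
    s = order * wavelength_nm / spacing_nm
    if abs(s) > 1.0:
        raise ValueError("no propagating diffraction order at this wavelength")
    return math.degrees(math.asin(s))

# Red and blue light leave a ~1.6 µm grating at visibly different angles.
red = diffraction_angle_deg(650, 1600)   # red, ~24 degrees
blue = diffraction_angle_deg(450, 1600)  # blue, ~16 degrees
```

The shorter the structure spacing d relative to the wavelength, the larger the scattering angle, which is why very short-wavelength X-rays can reveal nanometer-scale structure.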
The advantage of X-rays over visible light is that they can penetrate into materials, providing detailed information about the internal structure of objects. The researchers have now combined this three-dimensional information from scattered X-rays with computed tomography (CT). Conventional CT methods calculate exactly one value for each three-dimensional image point, or voxel, within an object. The new technique assigns multiple values to each voxel, since the scattered light arrives from various directions. "Thanks to this additional information, we're able to learn a great deal more about the nanostructure of an object than with conventional CT methods. By indirectly measuring scattered X-rays, we can now visualize minute structures that are too small for direct spatial resolution," Franz Pfeiffer explains.

Internal view of a tooth

For demonstration purposes the scientists examined a piece of human tooth measuring around three millimeters. A large part of a human tooth consists of dentin, a substance composed largely of mineralized collagen fibers whose structure is chiefly responsible for the mechanical properties of the tooth. The scientists have now visualized these tiny fiber networks. A total of 1.4 million scatter images were taken, with the scattered light arriving from various directions. The individual images were then processed using a specially devised algorithm that builds up a complete reconstruction of the three-dimensional distribution of the scattered rays step by step. "Our algorithm calculates the precise direction of the scatter information for each image and then forms groups having the same scatter direction. This allows internal structures to be precisely reconstructed," says Martin Bech, former postdoc at TUM and now assistant professor at the University of Lund.
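The grouping step Bech describes can be illustrated with a toy sketch: assign each scatter measurement to a direction bin, folding angles by 180° because a scatter orientation has no sign. The bin count and angle values here are invented for illustration; the paper's actual tensor-tomography reconstruction is far more elaborate:

```python
import numpy as np

def group_by_scatter_direction(angles_deg, n_bins=8):
    """Assign each scatter measurement to a direction bin.

    angles_deg: 1-D array of per-image scatter directions in degrees
    (hypothetical values, not data from the study).
    Returns a 0-based bin index per measurement.
    """
    # Scatter direction is orientation-like: 0 deg and 180 deg are
    # equivalent, so fold all angles into [0, 180) before binning.
    folded = np.mod(angles_deg, 180.0)
    edges = np.linspace(0.0, 180.0, n_bins + 1)
    # np.digitize returns 1-based bin indices; shift to 0-based.
    idx = np.digitize(folded, edges) - 1
    return np.clip(idx, 0, n_bins - 1)

# Four scatter images with different apparent directions; 5 deg and
# 185 deg fold to the same orientation and land in the same group.
bins = group_by_scatter_direction(np.array([5.0, 95.0, 185.0, 170.0]), n_bins=4)
```

Once measurements are grouped by direction, each group can be reconstructed like an ordinary tomogram, which is what makes it possible to assign several direction-dependent values to a single voxel.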
Using this method, it was possible for the first time to clearly view the three-dimensional orientation of the collagen fibers within a sample of this size. The results agree with knowledge previously obtained about these structures from thin sections. "Conventional CT is still more suitable for examining large objects. However, our new method makes it possible to visualize structures in the nanometer range in millimeter-sized objects at this level of precision for the first time," says Florian Schaff, lead author of the paper.

Contacts:
Prof. Dr. Franz Pfeiffer
Chair of Biomedical Physics
Department of Physics / IMETUM
Technical University of Munich
Tel.: +49 89 289 12551 (office), +49 89 289 12552 (secretariat)

Vera Siegler
49-892-892-2731
News Article | April 8, 2015
There’s a rising tide of healthcare data. It lifts many hopes for better healthcare, but it also surfaces one troubling issue: the reliability of data. Just how confident are you in the reliability of your data?

As a healthcare provider, you already know that data permeate your office workload. This impacts a critical feature of your operations: your workflow, a process you have probably evolved over many years. Suddenly, you’re doing “refreshes” to accommodate the new data volumes you’re seeing. “We’ve always done it this way” just doesn’t cut it any longer. Time was when you had dictation, writing and paper records. You now have many data input options (EMRs, voice-enabled documentation and more). So volume keeps growing, and the tools get more complex.

Bigger yet are the issues around understanding your data, some of them not at all obvious. For physicians, the EMR demands careful checks of patient records, new ways to capture care offered elsewhere, new diagnostic tools and new ways to update your patient’s condition, plus a bigger focus on “quality assurance.” Your “inputs” now need accuracy checks. It also means you’re the new data entry analyst on the block, burdened with an extra-tall order for vigilance.

Now, how reliable are your data? Example: at the point of care, as ICD codes get assigned to cases, there are some common errors, and their rates may top the 20% level (higher still in some studies that have carefully assessed the data error issue). [Please see: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1361216/] The inaccuracies may come from patient behavior, the record trail itself, the provider or the physician. But errors do seep into the record: a physician needs a label for “stroke” and can choose “cerebrovascular accident,” “cerebral occlusion,” “cerebral infarction” or “apoplexy.” Which is right? Even after correcting for differences among error-rate studies of clinical data, we know the error rates are unacceptable.
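One first line of defense against the synonym problem above is normalizing free-text diagnosis labels to a single canonical concept before a code is ever assigned. The table and function below are a toy illustration of that idea, not SyTrue’s platform or any real terminology service:

```python
# Illustrative synonym table: several clinical labels that all denote
# the same underlying concept map to one canonical term.
SYNONYMS = {
    "cerebrovascular accident": "stroke",
    "cerebral occlusion": "stroke",
    "cerebral infarction": "stroke",
    "apoplexy": "stroke",
    "stroke": "stroke",
}

def normalize_diagnosis(term):
    """Map a free-text diagnosis to its canonical concept, if known.

    Unknown terms fall through unchanged (cleaned of case/whitespace),
    so nothing is silently discarded.
    """
    key = term.strip().lower()
    return SYNONYMS.get(key, key)
```

In a real system the lookup would target a maintained terminology such as SNOMED CT or the ICD code set itself, but the principle is the same: collapse synonyms before they fork the record.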
The complexity of a case, provider inexperience, patients lacking the skill to discuss their symptoms, or a reluctance to listen to a patient’s view of her condition – all of these matter. It’s not uncommon to hear that even professional nurses may not be taken seriously as they describe their own condition to an emergency room provider. If it seems special pleading to belabor the issues around data, picture:

– Your child, driven from the athletic field to an ER, being diagnosed for a traumatic hit to the head. Will all diagnoses work, and will you trust them?

– Your aging parent, living with multiple chronic conditions, uncertain about the new pains in her body. Where in her treatment steps will you overlook an “understandable” error?

At moments of truth like these, we lose patience and tolerance for any (let alone “understandable”) errors. But medicine is still catching up with us. Clinical errors still can, and do, create chains of miscues that prove fatal. The quiet fact: while errors continue to crop up at stubborn levels and rates, we know how to minimize them at the point of care. We know the “hacks” needed to produce much better data quality, and how to use those tools. At SyTrue, we use a comprehensive data platform so that diagnoses are done right, coded right, can be queried in “natural language” terms and can yield C-CDA care records that patients can take anywhere.

Nonetheless, data errors continue to get “spiraled” into the medical record and the analytics trail. So when it’s time for analytics, there’s only so much that can be extracted as the “true record” of a patient encounter. By then, it’s too late: an inaccurate diagnosis and recording errors may well have been compounded by the medications and treatment used. Simple human error shadows many issues even with data missing from the picture. In the US, remember, we still see more than 2,300 wrong-site or wrong-patient operations annually (about 46 per week). These may be “understandable” – but are they acceptable?
Add data to this kind of picture, and it’s a volatile mix. The healthcare system is beginning to tackle the healthcare data issue with some pace, but put very simply, the data remain unreliable. We know US medicine has many core issues, so while data quality gets a mention, it doesn’t attract follow-through. Healthcare, meanwhile, correctly still targets the triple aim (better care, better quality, lower cost). It responds to practical concerns: expensive drugs (Sovaldi) or new policies (ONC on interoperability). But data quality issues live on and may well escalate.

For Interoperability to Work, Fix the Data Issues

Healthcare’s “Holy Grail” is interoperability. It has been missing in action while getting plenty of notice in planning. But with the ONC’s new urgency to achieve interoperability by 2017 – sooner than envisioned in 2013 – we’re seeing a tough road ahead. It may mean “mountain climbing” over many hills of unreliable data just to reach a base camp near the top. Former ONC Chief Science Officer Douglas Fridsma quipped in 2013 that the US standard of interoperability is “a modem and a fax machine.” What’s next: our many proprietary US clinical documentation systems, each with data error levels that may not be thoroughly understood, may be asked to lead the vanguard to the “interoperability” summit. Let’s get the data issue right before the path begins to look like a “bridge too far.” Why not fix the reliability of data and help all patients get better care?
Velroyen A., Bech M., Zanette I., and 12 more authors
PLoS ONE | Year: 2014
Purpose: The aim of the study was to investigate microstructural changes occurring in unilateral renal ischemia-reperfusion injury in a murine animal model using synchrotron radiation.

Material and Methods: The effects of renal ischemia-reperfusion were investigated in a murine model of unilateral ischemia. Kidney samples were harvested on day 18. Grating-based phase-contrast imaging (GB-PCI) of the paraffin-embedded kidney samples was performed at a synchrotron radiation facility (beam energy of 19 keV). To obtain phase information, a two-grating Talbot interferometer was used, applying the phase-stepping technique. The imaging system provided an effective pixel size of 7.5 μm. The resulting attenuation and differential phase projections were tomographically reconstructed using filtered back-projection. Semi-automated segmentation and volumetry and correlation to histopathology were performed.

Results: GB-PCI provided good discrimination of the cortex and the outer and inner medulla in non-ischemic control kidneys. Post-ischemic kidneys showed reduced compartmental differentiation, particularly of the outer stripe of the outer medulla, which could not be differentiated from the inner stripe. Compared to the contralateral kidney, a volume loss was detected after ischemia, while the inner medulla largely retained its volume (ratio 0.94). Post-ischemic kidneys exhibited severe tissue damage as evidenced by tubular atrophy and dilatation, moderate inflammatory infiltration, loss of brush borders and tubular protein cylinders.

Conclusion: GB-PCI with synchrotron radiation allows non-destructive microstructural assessment of parenchymal kidney disease and vessel architecture. If translation to lab-based approaches generates sufficient density resolution, and with a time-optimized image analysis protocol, GB-PCI may ultimately serve as a non-invasive, non-enhanced alternative for imaging pathological changes of the kidney. © 2014 Velroyen et al.
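The filtered back-projection step used in the methods can be sketched in a few lines: filter each projection with a ramp filter in frequency space, then smear the filtered projections back across the image grid. This is a generic textbook sketch (parallel-beam geometry, nearest-neighbour interpolation, toy disk phantom), not the authors' actual processing pipeline:

```python
import numpy as np

def ramp_filter(sinogram):
    """Apply a ramp (|frequency|) filter to each projection row."""
    n = sinogram.shape[1]
    ramp = np.abs(np.fft.fftfreq(n))
    return np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

def fbp(sinogram, angles_deg):
    """Filtered back-projection of a parallel-beam sinogram
    (rows = view angles, columns = detector samples)."""
    filtered = ramp_filter(sinogram)
    n = sinogram.shape[1]
    recon = np.zeros((n, n))
    xs = np.arange(n) - n / 2.0           # pixel coords, image-centred
    X, Y = np.meshgrid(xs, xs)
    for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
        # Detector coordinate hit by each pixel for this view.
        t = X * np.cos(theta) + Y * np.sin(theta) + n / 2.0
        idx = np.clip(np.round(t).astype(int), 0, n - 1)
        recon += proj[idx]                # smear the projection back
    return recon * np.pi / len(angles_deg)

# Toy phantom: a centred disk of radius r. Its parallel projection is
# the chord length, identical at every angle by symmetry.
n, r = 64, 10
s = np.arange(n) - n / 2.0
chord = 2.0 * np.sqrt(np.clip(r**2 - s**2, 0.0, None))
angles = np.arange(0, 180, 2)
sino = np.tile(chord, (len(angles), 1))
img = fbp(sino, angles)                   # bright disk at the centre
```

Real phase-contrast reconstructions differ in detail (the differential phase signal uses a modified filter kernel), but the back-projection geometry is the same.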