SAN DIEGO—Original drawings and sketches from Walt Disney Animation Studios' more than 90-year history—from Steamboat Willie through Frozen—traveled internationally for the first time this summer, giving conservators a rare opportunity to monitor the artwork with a new state-of-the-art sensor. A team of researchers developed and used a super-sensitive artificial "nose," customized specifically to detect pollutants before they could irreversibly damage the artwork. The researchers reported on their preservation efforts at the 251st National Meeting & Exposition of the American Chemical Society (ACS).

"Many pollutants that are problematic for human beings are also problematic for works of art," says Kenneth Suslick, Ph.D. For example, pollutants can spur oxidative damage and acid degradation that, in prints or canvases, lead to color changes or decomposition. "The ability to monitor how much pollution a drawing or painting is exposed to is an important element of art preservation," he says.

However, works of art are susceptible to damage at far lower pollutant levels than what's considered acceptable for humans. "The high sensitivity of artists' materials makes a lot of sense for two reasons," explains Suslick, who is at the University of Illinois at Urbana-Champaign. "Human beings are capable of healing, which, of course, works of art cannot do. Moreover, human beings have finite lifetimes, whereas ideally works of art should last for future generations."

To protect valuable works of art from these effects, conservators enclose vulnerable pieces in sealed display cases. But even then, some artists' materials may "exhale" reactive compounds that accumulate in the cases and damage the art. To counter this accumulation, conservators often hide sorbent materials inside display cases that scrub potentially damaging compounds from the enclosed environment. But it is difficult to know precisely when to replace the sorbents.
Suslick, a self-proclaimed "museum hound," figured he might have an answer. He had already invented an optoelectronic nose—an array of dyes that change color when exposed to various compounds. But it is used largely for biomedical purposes, and it can't sniff out the low concentrations of pollutants that damage works of art.

To redesign the nose with the aim of protecting artwork, he approached scientists at the Getty Conservation Institute (GCI), a private non-profit institution in Los Angeles that works internationally to advance art conservation practice. He proposed that his team devise a sensor several hundred times more sensitive than existing devices used for cultural heritage research. The collaboration took off, and the scientists built a keener nose.

At the time, GCI was involved in a research project with the Walt Disney Animation Research Library to investigate the impact of storage environment on its animation cels, which are transparent sheets that artists drew or painted on before computer animation was developed. Such research ultimately could help extend the life of this important collection. The new sensors would monitor levels of acetic acid and other compounds that emanate from these sheets.

Before the exhibit "Drawn from Life: The Art of Disney Animation Studios" hit the road on tour, Suslick recommended placing the sensors in discreet spots to monitor pollution levels both inside and outside the sealed and framed artworks. If the sensors indicated that pollution levels inside the sealed frames were rising, conservators traveling with the Disney exhibit would know to replace the sorbents. An initial analysis of sensor data showed that the sorbents were effective. Suslick says he expects to continue expanding the sensors' applications in the field of cultural heritage.
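The readout principle behind such a colorimetric sensor array is simple to sketch: each dye spot is imaged before and after exposure, and any spot whose color shifts beyond a threshold signals the presence of a reactive compound. The following is a minimal, hypothetical illustration of that before/after comparison; the spot names, RGB values and threshold are invented for the example and are not taken from Suslick's actual device.

```python
# Hypothetical sketch of a colorimetric sensor-array readout: compare each
# dye spot's RGB color before and after exposure, and flag spots whose
# color shift exceeds a threshold. Values below are illustrative only.

def color_shift(before, after):
    """Euclidean distance between two RGB colors."""
    return sum((b - a) ** 2 for b, a in zip(before, after)) ** 0.5

def read_array(baseline, exposed, threshold=10.0):
    """Return the dye spots whose color changed more than `threshold`."""
    return [spot for spot in baseline
            if color_shift(baseline[spot], exposed[spot]) > threshold]

# Two imaginary dye spots, imaged before and after exposure:
baseline = {"pH dye": (200, 40, 40), "metal-ion dye": (30, 30, 180)}
exposed  = {"pH dye": (150, 90, 40), "metal-ion dye": (31, 30, 179)}

print(read_array(baseline, exposed))  # only the pH dye responded
```

In the real array, the difference map across dozens of dyes forms a fingerprint for each analyte; detecting the trace levels that threaten artwork is a matter of chemistry and imaging sensitivity, not of this comparison logic.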
Collaborators in the project include Maria LaGasse, a graduate student in Suslick’s lab; Kristen McCormick, art exhibitions and conservation manager at the Walt Disney Animation Research Library; Herant Khanjian, assistant scientist; and Michael Schilling, senior scientist at the Getty Conservation Institute. Suslick acknowledges funding from the National Science Foundation. The Walt Disney Company provided funding to support the GCI research on animation cels.
Modsy is today launching a private beta for its novel approach to home design. Using sophisticated 3D rendering, the service turns photos of your plain room into the swanky home of your dreams. The startup has built a service that can bring professional interior decorating to the masses.

The service works like this: The homeowner takes several pictures of the space they want to decorate, ideally a photo from each of the room's corners. These photos are uploaded to Modsy, after which the homeowner takes a style quiz about their likes and dislikes. This takes about five minutes and basically involves clicking on photos. Then Modsy takes over.

About 48 hours later, the results are delivered. The homeowner receives a package of media including detailed renderings of their room, complete with new furnishings, flooring and fripperies. It's a complete room makeover, and it's your room. This isn't an Ikea catalog of some random room; Modsy puts everything in your room.

The service also delivers a virtual reality view of the room. This VR view is a crowd-pleaser: the image loads on a smartphone, and the viewer can stand in the middle of the room, twirling around to get a sense of the new design. The VR view is fun, but after using the service, I found that the still images do a better job of showing off the room Modsy made for me. The still images are higher quality, and I found myself staring at them, wondering how much it would cost to buy everything in this fantasy world. But Modsy has a solution for that, too.

The service isn't free, yet the cost is pennies compared to hiring a professional decorator. Perhaps Modsy's biggest selling point, though, is the time savings: it requires just a few minutes to get a fresh take on a room.

Modsy founder Shanna Tellerman has a long history with 3D platforms. She was the founder and CEO of Sim Ops Studio, a 3D game development company spun out of Carnegie Mellon and later acquired by Autodesk.
Later, she worked as a product manager at Autodesk and went on to be an investment partner at Google Ventures. Tellerman enlisted the help of Margarita Bratkova, who serves as Modsy's CTO. Previously, Bratkova worked in R&D at PDI/DreamWorks Animation and ImageMovers Digital/The Walt Disney Company. The company raised $2.75 million in the spring of 2015 from Norwest Venture Partners, Metamorphic Ventures and others, including angel investors from the home design space like Susan Feldman (co-founder of One Kings Lane) and Eoin Harrington (SVP, Restoration Hardware).

Modsy uses a technique called point cloud reconstruction. This is the easy part, Tellerman says, explaining that the whole process is similar to a film pipeline, with pre-production, production and post-production. The homeowner provides the service with the necessary media. The company then runs it through its rendering systems, which reconstruct the room in 3D space, and a system plugs in furnishings per the homeowner's style direction. The company creates some of the 3D models displayed in the images; Modsy also has partnerships with retailers who provide 3D objects.

After the rendering is complete, the homeowner gets several images and the VR view. The images can be modified by selecting new furnishings, at which point Modsy will generate a new set of renderings based on the updated selections. This currently takes about 48 hours.

Tellerman tells me that eventually Modsy will be 90% automated. She notes that every customer has a different room, though, so each render is manually reviewed. For instance, in the rendering of my room, the company had to work out the lighting caused by the room's large windows and tall ceilings.

Modsy is launching a wait list for a private beta. At this point, the service doesn't have a price, but eventually Tellerman sees it costing between $30 and $40. Nearly everything pictured in a Modsy render is available for purchase.
There are links to retailers and product pages under the images, and Modsy is working on integrating a purchase button directly into the site so consumers can buy the furnishings more quickly. Tellerman says Modsy is not going to handle inventory, though; the purchase will still be done through a retail partner.

Will I use Modsy's design? My wife and I already bought a wild sectional couch that we would never have considered before. It's not a whole-home makeover yet, but it's a start, and I wouldn't have considered it had I not seen something similar in a Modsy redesign.
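The point cloud reconstruction Tellerman mentions boils down to triangulating many 3D points from overlapping photos taken from known (or estimated) camera positions. As a rough, self-contained illustration of that core step, here is a toy triangulation of a single point from two camera rays. The camera positions and view directions are invented for the example; Modsy's production pipeline is, of course, far more sophisticated than this.

```python
# Toy illustration of the core step in point cloud reconstruction:
# triangulating one 3D point from two camera rays o + t*d. A real pipeline
# repeats this (robustly) for thousands of matched image features.

def sub(a, b):   return [a[i] - b[i] for i in range(3)]
def add(a, b):   return [a[i] + b[i] for i in range(3)]
def scale(a, s): return [a[i] * s for i in range(3)]
def dot(a, b):   return sum(a[i] * b[i] for i in range(3))

def triangulate(o1, d1, o2, d2):
    """Midpoint of the shortest segment between rays o1 + t*d1 and o2 + s*d2
    (o = camera center, d = viewing direction of a matched feature)."""
    w = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b          # near zero when the rays are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    p1 = add(o1, scale(d1, t))     # closest point on ray 1
    p2 = add(o2, scale(d2, s))     # closest point on ray 2
    return [(p1[i] + p2[i]) / 2 for i in range(3)]

# Two cameras in different corners of a room, both sighting the same corner
# of a (hypothetical) couch at (1, 1, 1):
print(triangulate([0, 0, 0], [1, 1, 1], [4, 0, 0], [-3, 1, 1]))
```

With noisy real photos the two rays rarely intersect exactly, which is why the midpoint of the closest segment is used; estimating the camera poses themselves is the hard part that systems like Modsy's automate.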
News Article | September 14, 2015
There are a lot of amazing vehicles in the Star Wars saga. Unfortunately, they're all the stuff of science fiction and the future. For now, the best that most humans have for earthly travel is the airplane, but it is possible to bring a bit of Star Wars to it.

Japanese airline All Nippon Airways, or ANA for short, unveiled the new R2-D2 ANA Jet, a Boeing 787-9 Dreamliner painted to look like the beloved Star Wars droid, at the Boeing factory in Everett, Wash., on Saturday. This is the first time a Star Wars character has ever appeared on the exterior of a commercial aircraft, according to ANA.

In case you ever wondered what the trash can-like R2-D2 would look like as a huge airplane, here's your answer. The nose of the R2-D2 ANA Jet has the iconic blue, white and gray design of the droid in the movies, complete with its red sensor. The Star Wars logo also appears on the side of the aircraft in huge black letters, if, for some reason, you need a reminder of what you're looking at.

The interior of the plane will also be a Star Wars fan's dream, with themed in-flight decorations such as headrest covers, paper napkins and cups. Passengers will also be able to watch the six Star Wars movies that have already been released, the first time they have ever been part of an in-flight entertainment system.

The inaugural flight of the R2-D2 ANA Jet will be a Fan Appreciation Flight scheduled for Oct. 17, which will take off from and return to Haneda Airport in Tokyo. The flight, which will last approximately two to three hours, will carry 89 passengers, all of whom are required to wear Star Wars costumes.

The R2-D2 ANA Jet is scheduled to go into service on Oct. 18 with a flight from Tokyo to Vancouver. The plane will then fly between Japan and other cities in the United States, Europe, Asia and Australia. ANA originally revealed its plans for Star Wars-themed aircraft during the Star Wars Celebration in Anaheim, Calif., in April.
The airline's partnership with The Walt Disney Company will also feature two other aircraft with Star Wars themes: a Boeing 767-300 featuring BB-8, the new droid appearing in the upcoming Star Wars: Episode VII — The Force Awakens, and a Boeing 777-300ER featuring both R2-D2 and BB-8. The BB-8 plane will begin flying domestic routes in Japan in November, and the BB-8 and R2-D2 plane will fly mainly between Japan and North America starting in March 2016.

While you wait to board your Star Wars plane from Tokyo's Narita Airport, you can always play the Star Wars Battle Pod arcade game, which is installed in ANA's lounge. The airport lounge might not look exactly like the Mos Eisley Cantina, but that's a pretty out-of-this-world amenity anyway.