Kasetsart University
Bangkok, Thailand

Kasetsart University is a top-ranked public university in Thailand, placed 401–500 in the world by the QS rankings. It was Thailand's first agricultural university and its third-oldest university. Kasetsart University was established on February 2, 1943, with the primary aim of promoting subjects related to agricultural science. Since then, the university has revised its curricula and expanded its subject areas to cover science, arts, social science, humanities, education, engineering, and architecture, and has recently moved to include medicine and health science. Kasetsart University has seven campuses throughout Thailand; its main and flagship campus is at Bang Khen, Bangkok. With 58,000 students enrolled, it is the largest university in Thailand.


News Article | May 17, 2017

The 2nd annual $100,000 international RobotArt Competition just announced its 2017 winners. The competition's goal is to get teams to paint beautiful art using robots, physical brushes, and paint. Thirty-nine painting robots responded to the challenge, submitting more than 200 artworks, twice the participation of the inaugural 2016 contest. In addition to more teams, there were more approaches to creating art: alongside traditional-looking robotic arms, the top competitors included thought-controlled robots, writhing snakebots, and even flying quadcopter drones. The brains of the robots, the software, were equally varied and included deep learning algorithms, 3D scans, matricism, as well as systems designed to collaborate with human artists. The style and subject matter were as varied as the hardware and software. Many art genres were represented, including impressionism, abstract expressionism, cubism, and realism, and these styles were used to paint portraits, still lifes, landscapes, and abstract compositions. Andrew Conru, the event's sponsor and organizer, notes: "I’m excited to see growth of the contest — many of the paintings have reached levels comparable with human painters and prove that robot artists are capable of expressing real human emotions.” First Place and $40,000 was awarded to the Creative Machines Lab at Columbia University. Second Place ($25,000) went to Thailand's Kasetsart University. Third Place ($10,000) and Top Technical Contributor ($5,000) went to the independent team CloudPainter, who used deep learning to create artwork based on the distinctive brushstrokes and styles of human artists. Perhaps the most interesting subject was a series of Jackson Pollock-inspired robotic self-portraits by the Fourth Place finisher, e-David of Germany. More pictures of the paintings and a full list of the top teams can be seen on the competition's website. Robotic painting and computational creativity is an emerging genre of art.
It will continue to be showcased in the RobotArt competition over the next several years. As interest in this art form and competition continues to grow, teams are encouraged to begin preparing for the 2018 contest. Given the increase in sophistication between the 2016 and 2017 contests, the quality of the 2018 entries will be limited only by the imagination of the teams that choose to participate. Furthermore, there will be a special category for first-time entries to encourage new teams and ideas at this annual event. In addition to next year's online competition, there will also be a physical art exhibition featuring past and present RobotArt paintings in the summer of 2018, with the location and venue to be announced shortly.

News Article | May 16, 2017

Art has always been fundamentally intertwined with technology. New techniques and materials have constantly allowed artists to innovate and create new types of works. In this series we look at the impact of digital technologies on art and how artists are creating entirely novel forms of art using these modern tools. We've previously examined the fields of "datamoshing", ASCII art, BioArt, Minecraft Art and Internet Art. In this instalment we examine a fascinating world where scientists are teaching robots how to paint works of art. Artificial intelligence systems are currently excelling at producing elaborate digitally generated works of art. Every other week we seem to see a new neural network developed to mimic a famous artist's aesthetic or convert a photograph into a painterly image. But what about machines actually mimicking the process a human artist uses to paint on a canvas? That particularly human skill seems to be a lot harder for machines to replicate. In 2016, the RobotArt competition was founded by Stanford-educated mechanical engineer Andrew Conru. The competition was designed to stimulate robotic engineers to create new mechanical painting devices. In setting up the competition, Conru noted that many of the initial entries were expected to be variations of a simple mechanism where a robotic arm mimics the movements of a human artist, but many teams took the challenge a step further. The competition saw a variety of different entries, from a team using an eye-tracking system to control a robot's movement, to a system that had users remotely control a robot via internet-directed brush stroke commands. All the weird and wonderful results reinforced the question of how truly creative a robotically generated work of art could really be. Below are the recently announced winners of the 2017 RobotArt competition. Be sure to click through to our gallery to get a broader look at each winner's work.
From a mechanical engineering team at Columbia University we get the winner of RobotArt 2017, a bot by the name of PIX18. This is the third generation of a system developed with the goal of creating a robot capable of producing original artwork in the classic medium of oil on canvas. Judging comments applauded this robot's ability to produce "some lovely paintings from sources or scratch" and noted that the work had "brush strokes evocative of Van Gogh". The ReART system uses a haptic recording system to record artists painting a work. The system tracks the position of the brush, the force being exerted and a variety of other data points. A robot then "plays back" the recording, creating a perfectly mimicked ink brush drawing. The project is from the Department of Electrical Engineering at Kasetsart University in Thailand and looks to develop motion control robotics for a variety of industrial and creative uses. CloudPainter is one of the most technically sophisticated projects in the RobotArt competition. Utilizing AI and deep learning systems, the project aims to get the machine to make as many individual creative decisions as possible. According to the creators, currently "the only decision made by a human is the decision to start a painting." More info on their process can be found on their website. One of the judges said of the machine's work, "Spontaneous paint, "mosaicing" of adjacent tones, layering effects and the graphical interplay between paint strokes of varying textures, are all hand/eye, deeply neurally sophisticated aspects of oil painting..." e-David is an evolving robotic painting system that uses a visual feedback loop to constantly record and re-process how the machine is interpreting its recreation of an input image.
Using an ordinary industrial welding robot combined with cameras, sensors and a control computer, the system can correct errors as it paints, while also understanding what the makers call "human optimization processes". This is one of our favorite works from the competition. From a student at New York University Shanghai, this project is inspired by the aesthetic of American artist Chuck Close. The system starts with an input image that is converted to a low resolution and painted pixel by pixel using a mobile robot with omni wheels. Each oversized, low-res pixel that is scribbled by the robot is roughly the size of a human hand, and each entire artwork is 176 x 176 cm (5.7 x 5.7 ft), or just about as tall as a human being. HEARTalion is a project from Halmstad University in Sweden that attempts to develop a system that can recognize and subsequently depict a person's emotional state. The system captures emotional signals using a brain-computer interface (BCI), and a robot then attempts to convey the emotions visually based on a model that was developed with advice from two local painters in Halmstad, Peter Wahlbeck and Dan Koon. One of the impressed RobotArt judges remarked in reference to HEARTalion, "If this body of work was exhibited at a gallery and I was told that the artist aimed to capture emotion through color, composition, and textures — I would buy." This independent entry from an electronic engineer who put in most of the work after his wife and kids had gone to bed uses a simple XYZ-axis painter bot guided by two basic behavioral rules. All of this project's work is reinterpretations of input images, but because the robot receives no feedback from sensors or cameras, the mixing of colors isn't faithful to the source. However, the novel strength of this project comes from its gorgeous use of watercolor paint.
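The pixel-by-pixel approach described above (downsample an input image to an oversized low-resolution grid, then paint one cell at a time) can be sketched roughly as follows. This is an illustrative sketch only: the function names, the averaging scheme, and the mapping onto a 176 cm canvas are our assumptions, not the team's actual code.

```python
# Hypothetical sketch of a pixel-by-pixel painting pipeline:
# downsample a grayscale image into large "pixels", then emit one
# (x_cm, y_cm, tone) paint command per grid cell for a mobile robot.

def downsample(image, block):
    """Average block x block regions of a grayscale image (list of rows)."""
    rows, cols = len(image), len(image[0])
    grid = []
    for r in range(0, rows, block):
        row = []
        for c in range(0, cols, block):
            patch = [image[i][j]
                     for i in range(r, min(r + block, rows))
                     for j in range(c, min(c + block, cols))]
            row.append(sum(patch) // len(patch))
        grid.append(row)
    return grid

def paint_commands(grid, canvas_cm=176.0):
    """Map each low-res pixel to a cell-center (x_cm, y_cm, tone) command."""
    cell = canvas_cm / len(grid)
    cmds = []
    for r, row in enumerate(grid):
        for c, tone in enumerate(row):
            cmds.append((round(c * cell + cell / 2, 1),
                         round(r * cell + cell / 2, 1),
                         tone))
    return cmds

if __name__ == "__main__":
    img = [[0, 0, 255, 255],
           [0, 0, 255, 255],
           [255, 255, 0, 0],
           [255, 255, 0, 0]]
    grid = downsample(img, 2)
    print(grid)                       # [[0, 255], [255, 0]]
    print(len(paint_commands(grid)))  # 4
```

In the real system each command would drive the omni-wheeled robot to the cell center and scribble the cell; here the commands are just tuples.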
Using the precision of a robotic artist to its advantage, this project created a system that minutely controls the pressure and movement of single brush strokes to create stunning images that a human would struggle to accurately produce. The members of the team describe their process in greater detail here and have also publicly offered up their source code in the hope others will build upon their work. CARP, or Custom Autonomous Robotic Painter, comes from a team at the Worcester Polytechnic Institute in Massachusetts. The system uses image decomposition techniques to disassemble input images, which are then reconstructed by a robot. Visual feedback systems are also incorporated into the process, allowing dynamic corrections to be applied to the work as it is being created. An experimental project from a team at MIT, this is an evolving robot arm that was saved from an existence as a decorative coat rack and has slowly been given more peripherals, such as an auto-brush cleaner and wireless control via a video game controller. Equipped with machine learning abilities, the robot can grow its skill set from project to project. Take a closer look through some more of the amazing and varied robot-painted artworks in our gallery.

Ghamari-Langroudi M.,Vanderbilt University | Srisai D.,Kasetsart University | Cone R.D.,Vanderbilt University
Proceedings of the National Academy of Sciences of the United States of America | Year: 2011

Melanocortin-4 receptor (MC4R) is critical for energy homeostasis, and the paraventricular nucleus of the hypothalamus (PVN) is a key site of MC4R action. Most studies suggest that leptin regulates PVN neurons indirectly, by binding to receptors in the arcuate nucleus or ventromedial hypothalamus and regulating release of products like α-melanocyte-stimulating hormone (α-MSH), neuropeptide Y (NPY), glutamate, and GABA from first-order neurons onto the MC4R PVN cells. Here, we investigate mechanisms underlying regulation of activity of these neurons under various metabolic states by using hypothalamic slices from a transgenic MC4R-GFP mouse to record directly from MC4R neurons. First, we show that in vivo leptin levels regulate the tonic firing rate of second-order MC4R PVN neurons, with fasting increasing firing frequency in a leptin-dependent manner. We also show that, although leptin inhibits these neurons directly at the postsynaptic membrane, α-MSH and NPY potently stimulate and inhibit the cells, respectively. Thus, in contrast with the conventional model of leptin action, the primary control of MC4R PVN neurons is unlikely to be mediated by leptin action on arcuate NPY/agouti-related protein and proopiomelanocortin neurons. We also show that the activity of MC4R PVN neurons is controlled by the constitutive activity of the MC4R and that expression of the receptor mRNA and α-MSH sensitivity are both stimulated by leptin. Thus, leptin acts multinodally on arcuate nucleus/PVN circuits to regulate energy homeostasis, with prominent mechanisms involving direct control of both membrane conductances and gene expression in the MC4R PVN neuron.

The findings of this study support the argument made by many learner autonomy scholars that the road to autonomy is a process conditioned by each individual's zone of proximal development (ZPD) and that there are different degrees of autonomy. The description of behavioural patterns observed in the experiment supports this notion. The findings show that once the direction was initiated by the teacher with the help of an external structure such as a course management system (CMS), the learners could organise the resources in the system autonomously, took on new learning roles that were different from those in a traditional face-to-face classroom, and eventually developed autonomous perceptions and behaviours as an outcome of their engagement in this blended learning environment. The data from four research tools (questionnaire, student learning journals, interviews, and classroom observation) were triangulated and amalgamated to increase the validity and reliability of the findings. © 2012 Elsevier Ltd. All rights reserved.

Hasin P.,Kasetsart University
Journal of Physical Chemistry C | Year: 2014

A simple and efficient synthesis of Co2C using graphene oxide (GO) as a carbon source has been established. The procedure consists of two steps: (1) formation of a GO/Co3O4 nanocomposite via the ammonia-evaporation-induced method and (2) conversion of Co3O4 to Co2C under a H2/N2 mixture at a low temperature (200 °C). Transmission electron microscopy (TEM) analysis showed that Co2C has a crystallite size of around 5 nm and a mesoporous structure with a pore size of ca. 3-5 nm. The amphiphilic behavior of GO contributes to the high porosity, large specific surface area, and narrow pore size distribution of the Co2C. Tungsten carbide has also been successfully obtained using GO as a carbon source at a much lower temperature than that of the traditional carbothermal synthesis. Therefore, this method could be extended to the production of other important carbides with desired mesoporous features at low temperatures. © 2014 American Chemical Society.

Witoon T.,Kasetsart University
Ceramics International | Year: 2011

The carbonation-calcination looping cycle of calcium-based sorbents is considered an attractive method for CO2 capture from combustion gases because it can reduce the cost of the capture steps compared to conventional technologies, e.g., solvent scrubbing. In this study, waste eggshell was used as the raw material for calcium oxide-based sorbent production. Commercially available calcium carbonate was employed for comparison purposes. The calcination behavior, crystal type and crystallinity, surface chemistry, qualitative and quantitative elemental composition, specific surface area and pore size, and morphology of the waste eggshell and the calcined waste eggshell were characterized by thermal gravimetric analysis (TGA), X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FT-IR), X-ray fluorescence (XRF), N2 sorption analysis and scanning electron microscopy (SEM). The carbonation-calcination cycles were carried out using a TGA unit with high-purity CO2 (99.999%). It was found that the carbonation conversion of the calcined eggshell was higher than that of the calcined commercially available calcium carbonate after several cycles under the same reaction conditions. This could be because the calcined eggshell exhibited a smaller particle size and a larger macropore volume than the calcined commercially available calcium carbonate. As a result, the calcined eggshell provided a larger exposed surface for the surface reaction of CO2. © 2011 Elsevier Ltd and Techna Group S.r.l.
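For context, the looping cycle discussed in this abstract is based on the well-known reversible reaction between calcium oxide and carbon dioxide; the enthalpy figure below is a standard textbook value, not a result from this study:

```latex
\mathrm{CaO(s)} + \mathrm{CO_2(g)} \;\rightleftharpoons\; \mathrm{CaCO_3(s)},
\qquad \Delta H^{\circ}_{298} \approx -178~\mathrm{kJ\,mol^{-1}}\ \text{(carbonation)}
```

The forward (carbonation) step is exothermic and captures CO2 from flue gas; the reverse (calcination) step is endothermic and regenerates the sorbent while releasing a concentrated CO2 stream, which is why sorbent durability over many cycles, as tested here by TGA, is the key performance metric.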

Klinkesorn U.,Kasetsart University
Food Reviews International | Year: 2013

This article reviews the basic principles of emulsion formation and stabilization through the electrosteric function of chitosan. Chitosan, which is a polycationic biopolymer, may act as an emulsifier and emulsion stabilizer through adsorption of the protective layer at oil-water interfaces, viscosity enhancement, and interaction with surface-active agents (e.g., surfactants, proteins, and polysaccharides). The interaction of chitosan at droplet interfaces can be associated with flocculation or electrosteric stabilization, depending on the nature and concentration of the chitosan, emulsifier characteristics, and the pH and ionic strength of solution. © 2013 Copyright Taylor and Francis Group, LLC.

The entropy generation of a fully developed laminar flow in a hexagonal duct is investigated in this study. A constant heat flux condition was applied in this analysis. Two fluids, water and engine oil, were used to study the effect of fluid properties on the entropy generation. The fluid properties were evaluated at the average temperature between the inlet and outlet duct sections. The aspect ratio of the hexagonal duct was varied to show its effect on the entropy generation. Attention was also given to the effect of the supplied heat flux on the entropy generation. Finally, the entropy generation calculated for the hexagonal duct was compared with that for rectangular and circular ducts having the same hydraulic diameter and cross-sectional area. © 2010 Elsevier Ltd.
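A standard expression of the kind evaluated in such studies is Bejan's entropy generation rate per unit duct length for fully developed internal flow with constant heat flux per unit length q'. Written here with the hydraulic diameter D_h, this is the textbook circular-duct form, not necessarily the exact formulation used in the paper:

```latex
S'_{\mathrm{gen}}
\;=\;
\underbrace{\frac{q'^{\,2}}{\pi k T^{2}\,\mathrm{Nu}}}_{\text{heat transfer}}
\;+\;
\underbrace{\frac{32\,\dot{m}^{3} f}{\pi^{2}\rho^{2} T\, D_h^{5}}}_{\text{fluid friction}}
```

where k is the fluid thermal conductivity, T the bulk temperature, Nu the Nusselt number, ṁ the mass flow rate, f the friction factor, and ρ the density. Comparing ducts of equal hydraulic diameter and cross-sectional area, as done above, isolates how the duct shape changes Nu and f and hence the two competing contributions.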

Soodchomshom B.,Kasetsart University
Journal of Applied Physics | Year: 2014

The spin-valley currents in a silicene-based normal/sublattice-dependent ferromagnetic/normal junction are investigated. Unlike in graphene, the pseudo Dirac mass in silicene is generated by spin-orbit interaction and is tunable by applying electric or exchange fields. This is because the silicon-based honeycomb lattice has a buckled structure. As a result, it is found that the junction leads to currents perfectly split into four groups, spin up (down) in the K and K′ valleys, when different values of the electric field are applied, which can be considered a perfect spin-valley polarization (PSVP) for electronic applications. The PSVP is due to the interplay of the spin-valley-dependent Dirac mass and the chemical potential in the barrier. The PSVP also occurs only for energies comparable to the spin-orbit energy gap. This work reveals the potential of silicene for spin-valleytronics applications. © 2014 AIP Publishing LLC.
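The tunable, spin- and valley-dependent Dirac mass mentioned in this abstract is commonly written, in the standard low-energy model of silicene, as below; the notation and parameter values are generic textbook ones, not taken from this paper:

```latex
\Delta_{\eta\sigma} \;=\; \eta\sigma\,\lambda_{\mathrm{SO}} \;-\; e\ell E_{z},
\qquad
E_{\eta\sigma}(k) \;=\; \pm\sqrt{(\hbar v_F k)^{2} + \Delta_{\eta\sigma}^{2}}
```

where η = ±1 labels the valley (K, K′), σ = ±1 the spin, λ_SO ≈ 3.9 meV is the intrinsic spin-orbit coupling, and ℓ (about 0.23 Å, half the vertical buckling) couples the perpendicular electric field E_z to the sublattice potential; an exchange field adds a further spin-dependent shift. Because Δ depends on both η and σ, tuning E_z can open or close the gap for each of the four spin-valley channels separately, which is the mechanism behind the perfect spin-valley polarization described above.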

Agency: European Commission | Branch: H2020 | Program: RIA | Phase: SFS-20-2015 | Award Amount: 6.91M | Year: 2016

Strength2Food is a 5-year, 6.9 million project to improve the effectiveness of EU food quality schemes (FQS) and public sector food procurement (PSFP) and to stimulate Short Food Supply Chains (SFSC) through research, innovation and demonstration activities. Our 30-partner consortium, representing 11 EU and 4 non-EU countries, combines leading academic, communication, SME and stakeholder organisations to ensure a multi-actor approach. It will undertake case study-based quantitative research to measure the economic, environmental and social impacts of FQS, PSFP and SFSC. The impact of PSFP policies on balanced nutrition in schools will also be assessed. Primary research will be complemented by advanced econometric analysis of existing datasets to determine the impacts of FQS and SFSC participation on farm performance and survival, as well as to understand price transmission and trade patterns. Consumer knowledge of, confidence in, valuation and use of FQS labels and products will be assessed via cross-national survey, ethnographic and virtual supermarket-based research. Lessons from the research will be applied and verified in 6 pilot initiatives, focusing on less-developed and transition regions. These initiatives bring together academic and non-academic stakeholder partners in action research. The six pilot actions are: a school meals initiative to improve the nutritional outcomes and economic benefits for local agri-food producers; in-store trials (undertaken with a grocery retailer) to upscale sales of local produce; a scheme to stimulate a sustainable SFSC that adds value to the fishing community; and pilot actions to expand regional food labelling, increase sales of FQS products in non-traditional markets, and improve returns to local producers at food fairs and farmers' markets (via a smartphone app). Project impact will be maximised through a knowledge exchange platform, hybrid forums, school educational resources, a Massive Open Online Course and practitioner recommendations.
