Summary: The Texas Advanced Computing Center (TACC) is part of the University of Texas at Austin. TACC designs and operates some of the world's most powerful computing resources. The center's mission is to enable discoveries that advance science and society through the application of advanced computing technologies.
Paleoanthropologist Denné Reed of UT Austin is interviewed by host Jorge Salazar about making connections in big data from fossils of human origins. New discoveries might lie buried deep in the data of human fossils. That's according to Denné Reed, an associate professor in the Department of Anthropology at The University of Texas at Austin (UT Austin). Reed is the principal investigator of PaleoCore, an informatics initiative funded by the National Science Foundation (NSF). The PaleoCore project aims to get researchers of human origins worldwide on the same page with their fossil data. Reed said PaleoCore is doing this by implementing data standards; creating a place to store all human fossil data; and developing new tools to collect the data. He hopes the result will be deeper insights into our origins through better integration and sharing among research projects in paleoanthropology and paleontology. "We've tried to take advantage of some of the geo-processing and database capabilities that are available through Wrangler to create large archives," Reed said. The big data Reed wants to archive on Wrangler comprise the entirety of the fossil record on human origins. PaleoCore will also include geospatial data such as satellite imagery. "For many of the countries that we're working in, this is their cultural heritage. We need to be able to ensure that not only are the data rapidly available, accessible, searchable, and everything else, but that they're safely archived," Reed said. Music Credits: Raro Bueno, Chuzausen freemusicarchive.org/music/Chuzausen/
Computer scientist Joshua New of the Oak Ridge National Laboratory speaks with host Jorge Salazar on how to optimize buildings to save energy using computer models. Saving energy saves money. Scientists at Oak Ridge National Laboratory (ORNL) are using supercomputers to do just that by making virtual versions of millions of buildings in the U.S. The Wrangler data-intensive supercomputer is working jointly with ORNL's Titan in a project called Autotune that trims the energy bills of buildings. Computer scientist Joshua New of the ORNL Building Technology Research and Integration Center is the principal investigator of the Autotune project, funded by the U.S. Department of Energy. Autotune takes a simple software model of a building's energy use and optimizes it to match reality. "What we're trying to do is create a crude model from publicly available data," New said. "Then the Autotune project takes utility bill data, whether it's monthly electrical utility bills, or hourly bills from advanced metering infrastructure, and calibrates that software model to match measured data." New said that once Autotune calibrates the model well enough, it can legally be used in multiple ways, including for optimal building retrofit packages. Music Credits: Raro Bueno, Chuzausen freemusicarchive.org/music/Chuzausen/
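Calibrating a crude model to utility-bill data is, at its simplest, a curve-fitting problem. The sketch below illustrates the general idea only; it is not the actual Autotune implementation, which calibrates full whole-building simulations. The two-parameter degree-day model and all names here are assumptions for illustration.

```python
# Illustrative sketch: calibrate a toy monthly energy model to utility
# bills with closed-form least squares. NOT the Autotune code; Autotune
# tunes full building-simulation models, but the fitting idea is similar.

def predicted_kwh(baseload, per_dd, degree_days):
    """Toy model: fixed monthly baseload plus load per degree-day."""
    return [baseload + per_dd * dd for dd in degree_days]

def calibrate(bills_kwh, degree_days):
    """Closed-form least squares for kwh = baseload + per_dd * dd."""
    n = len(bills_kwh)
    mean_dd = sum(degree_days) / n
    mean_kwh = sum(bills_kwh) / n
    cov = sum((dd - mean_dd) * (kwh - mean_kwh)
              for dd, kwh in zip(degree_days, bills_kwh))
    var = sum((dd - mean_dd) ** 2 for dd in degree_days)
    per_dd = cov / var
    baseload = mean_kwh - per_dd * mean_dd
    return baseload, per_dd

# Synthetic bills generated by a 500 kWh baseload plus 2 kWh per
# degree-day; calibration recovers those parameters from the data.
baseload, per_dd = calibrate([700.0, 900.0, 1100.0], [100.0, 200.0, 300.0])
```

A real calibration adjusts hundreds of building parameters against measured data, which is why the project needs supercomputers like Titan and Wrangler; the least-squares principle is the same.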
UT Austin biologist Rebecca Young discusses her work with host Jorge Salazar about how she traces the genes behind monogamous behavior using the Wrangler supercomputer at the Texas Advanced Computing Center. Scientists at the Hofmann Lab of UT Austin are using the Wrangler data-intensive supercomputer to find orthologs — genes common to different species. They'll search for them in each of the major lineages of vertebrates — mammals, birds, reptiles, amphibians and fishes. "What we want to know is, even though they've evolved independently, whether it's possible that some of the same genes are important in regulating this behavior, in particular expression of these genes in the brain while monogamous males are reproductively active," said Rebecca Young. Young is a research associate in the Department of Integrative Biology and at the Center for Computational Biology and Bioinformatics, UT Austin. Music Credits: Raro Bueno, Chuzausen freemusicarchive.org/music/Chuzausen/
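Once orthologs have been inferred within each lineage, asking which genes are shared across all five lineages reduces to a set intersection. A minimal sketch with hypothetical ortholog-group IDs (the IDs and data are invented for illustration; real ortholog detection rests on sequence comparison, not set operations):

```python
# Illustrative only: hypothetical ortholog-group IDs per vertebrate
# lineage. Finding genes common to every lineage is a set intersection.
lineage_orthologs = {
    "mammals":    {"OG1", "OG2", "OG3", "OG5"},
    "birds":      {"OG1", "OG2", "OG4", "OG5"},
    "reptiles":   {"OG1", "OG2", "OG5"},
    "amphibians": {"OG1", "OG5", "OG6"},
    "fishes":     {"OG1", "OG5", "OG7"},
}

def shared_across_all(groups_by_lineage):
    """Intersect the ortholog-group sets across every lineage."""
    sets = iter(groups_by_lineage.values())
    shared = set(next(sets))
    for s in sets:
        shared &= s
    return shared
```

Genes surviving the intersection are the candidates whose brain expression can then be compared between monogamous and non-monogamous species.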
Scientists and engineers at TACC have created a new kind of supercomputer to handle big data. Featured on the podcast is Niall Gaffney, Director of Data Intensive Computing at the Texas Advanced Computing Center. Gaffney leads efforts at TACC to bring online a new data-intensive supercomputing system called Wrangler. The National Science Foundation's Division of Advanced Cyberinfrastructure awarded TACC and its collaborators 11.2 million dollars in November of 2013 to build and operate the Wrangler supercomputer. Indiana University, TACC, and the University of Chicago worked together on the project. In April of 2015, Wrangler began early operations for the open science community, where results are made freely available to the public. Wrangler will augment the Stampede supercomputer, one of the most powerful in the world. And Wrangler will join the cyberinfrastructure of NSF-funded XSEDE, the eXtreme Science and Engineering Discovery Environment. Niall Gaffney: We went to propose to build Wrangler with (the data world) in mind. We kept a lot of what was good with systems like Stampede, but then added new things to it like a very large flash storage system, a very large distributed spinning disc storage system, and high speed network access to allow people who have data problems that weren't being fulfilled by systems like Stampede and Lonestar to be able to do those in ways that they never could before.
The 2015 ACM Gordon Bell Prize, given in recognition of outstanding achievement in high-performance computing, was awarded to researchers Johann Rudi and Omar Ghattas of the Institute for Computational Engineering and Sciences at the University of Texas at Austin. They share the award with their study co-authors, who utilized the Stampede supercomputer of the Texas Advanced Computing Center and the IBM Sequoia supercomputer at Lawrence Livermore National Laboratory. The award-winning study modeled the flow thousands of kilometers deep in the mantle, which moves Earth's plates and triggers unpredictable events like volcanic eruptions and massive earthquakes. The SC15 supercomputing conference took place in Austin, November 15-20, 2015. SC showcases the latest in high performance computing, networking, storage and analysis to advance scientific discovery, research, education and commerce. The study, "An Extreme-Scale Implicit Solver for Complex PDEs: Highly Heterogeneous Flow in Earth's Mantle," was funded in part by the National Science Foundation and the Department of Energy. Co-authors include Johann Rudi and Omar Ghattas of ICES; A. Cristiano Malossi, Peter Staar, Yves Ineichen, Costas Bekas, and Alessandro Curioni at the Foundations of Cognitive Solutions, IBM Research – Zurich, Switzerland; Tobin Isaac of ICES; Georg Stadler of the Courant Institute of Mathematical Sciences, New York; and Michael Gurnis of the Seismological Laboratory at Caltech. Omar Ghattas: The absolute large number of cores and the big scaling result was done on the IBM system at Livermore. We couldn't have done it without the IBM guys. But the actual science runs, a lot of the day-to-day stuff, testing of the solvers - Johann was using TACC's Stampede system. Johann Rudi: Mainly, my research was done on the Stampede supercomputer at TACC. One part of the work is developing algorithms. That was wholly supported by TACC machines.
And also, the help that I got from TACC a couple of times was very valuable to me. There were certain small issues that I couldn't even see from where I was working with the machine. But people from the internal staff running the systems could see when something was going wrong. They actually helped a lot. I was happy to work with TACC. Especially Bill Barth. I remember him helping me a lot. I was glad. The development of the solvers was done on TACC machines. Also, everything in the paper that shows the scientific results, the visualizations - these were also done on TACC machines. The science part was also supported by TACC.
This November 2015 marks 100 years of Einstein's field equations that describe space and time as one interwoven continuum - and predict the existence of black holes and more. Manuela Campanelli is a professor at the Rochester Institute of Technology and the Director of the Center for Computational Relativity and Gravitation. Dr. Campanelli was invited to give a presentation at SC15 titled "Revealing the Hidden Universe with Supercomputer Simulations of Black Hole Mergers." Dr. Campanelli uses the computational resources of XSEDE, the Extreme Science and Engineering Discovery Environment, to probe the mysteries of black holes. She spoke by Skype about the 100th anniversary of Einstein's field equations and about her work that takes on the complexity of accurately describing black hole mergers. The SC15 supercomputing conference takes place in Austin, November 15-20, 2015. SC15 showcases the latest in high performance computing, networking, storage and analysis to advance scientific discovery, research, education and commerce. Manuela Campanelli: General Relativity is celebrating this year a hundred years since its first publication in 1915, when Einstein introduced his theory of General Relativity, which has revolutionized in many ways the way we view our universe. For instance, the idea of a static Euclidean space, which had been assumed for centuries, and the concept of gravity as a force changed. They were replaced with a very dynamical concept of now having a curved space-time in which space and time are related together in an intertwined way described by these very complex, but very beautiful equations.
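For reference, the equations Campanelli describes can be written in a single line relating the curvature of spacetime to its matter and energy content:

```latex
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^{4}} T_{\mu\nu}
```

Here the Einstein tensor on the left encodes spacetime curvature, the metric term carries the cosmological constant, and the stress-energy tensor on the right describes matter and energy. Black holes arise among the vacuum solutions, where the stress-energy tensor vanishes; merging black holes, as in Campanelli's work, require solving these equations numerically on supercomputers.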
Thomas Jordan is a professor of Earth Sciences at University of Southern California and the Director of the Southern California Earthquake Center. It's a big national collaboration of over a thousand earthquake experts and 70 institutions. Dr. Jordan uses the computational resources of XSEDE, the Extreme Science and Engineering Discovery Environment, to model earthquakes and help reduce their risk to life and property. Dr. Jordan was invited to speak at SC15 on the Societal Impact of Earthquake Simulations at Extreme Scale. The SC15 supercomputing conference takes place in Austin, November 15-20, 2015. SC15 showcases the latest in high performance computing, networking, storage and analysis to advance scientific discovery, research, education and commerce. Thomas Jordan: One thing people need to understand is we need a lot of supercomputer time in order to be able to do these calculations. Some of our simulation models that are based on the simulation of earthquake physics can take hundreds of millions of hours of computer time to generate. These are very complex system-level calculations. They're of the similar complexity of trying to calculate what Earth's climate is going to be like in 50 years because of human activities and CO2 charging of the atmosphere. It's a similar scale of problem. These problems that deal with natural hazards and the complexity of the Earth system really require very large computers to be able to simulate that activity. We're frankly looking forward to the day when computers are ten times or a hundred times or more faster than they are today.
Alan Alda, actor, director and writer, has had a lifelong interest in science. He hosted the PBS program Scientific American Frontiers from 1993 to 2005, an experience he called "the best thing I ever did in front of a camera." Perhaps best known as surgeon 'Hawkeye' Pierce on the TV series MASH, Alda has won seven Emmys, six Golden Globes, and three Directors Guild of America awards for directing. His two memoirs are both New York Times bestsellers. A recipient of the National Science Board's Public Service Award, Alda is a visiting professor at and founding member of Stony Brook University's Alan Alda Center for Communicating Science, where he helps develop innovative programs on how scientists communicate with the public. He is also on the Board of Directors of the World Science Festival. SC15 is the 27th annual International Conference for High Performance Computing, Networking, Storage and Analysis. The event showcases the latest in supercomputing to advance scientific discovery, research, education and commerce. Alan Alda: I think the kind of transformation that's already been brought about by high performance computing is extraordinary. And for it to go further and fully realize its potential requires another kind of transformation… Powerful computing affects all our lives and can hopefully save our lives. It can eventually help us survive some of our unfortunate efforts that have affected climate, for instance. To model climate change is one of the great benefits we're going to get from supercomputing. The trouble is, to really help the public understand all the benefits that they can get from supercomputing, it has to be communicated with clarity so that they get it and they get excited by it… I think we have to transform the scientists who are explaining this to the public before the public will allow them and participate with them in transforming their own lives with this amazing ability to model things on supercomputers.
Robert McLay manages the software tools group in high performance computing at the Texas Advanced Computing Center. Dr. McLay is one of the developers of XALT, a software tool developed with funding by the National Science Foundation. XALT tracks user codes and environments on a computer cluster. Robert McLay and Mark Fahey of the Argonne National Laboratory will be co-leading a session called "Understanding User-Level Activity on Today's Supercomputers with XALT" at SC15. The SC15 supercomputing conference takes place in Austin, November 15-20, 2015. SC15 showcases the latest in high performance computing, networking, storage and analysis to advance scientific discovery, research, education and commerce. Robert McLay: XALT is a tool that me and my colleague, Dr. Mark Fahey, put together to help people try and use our systems. We run the system. We manage the system. We develop software for the system and install software for our users. We want to know what's used and what's not. XALT gives us a way to find out in a very inexpensive way.
This podcast features an interview with biologist Mikhail Matz, Department of Integrative Biology, College of Natural Sciences, The University of Texas at Austin. Matz was part of a study funded by the National Science Foundation and the Australian Institute of Marine Science. In June of 2015 they published results in the journal Science that found the first evidence that corals can genetically adapt to warmer waters from climate change. Podcast host Jorge Salazar interviewed Matz about his computationally-based findings and about open source tools other scientists can freely use to analyze genomes of plants and animals.
Podcast host Jorge Salazar interviews scientists Michael Sacks, Institute for Computational Engineering and Sciences at the University of Texas at Austin; and Ming-Chen Hsu, Department of Mechanical Engineering at Iowa State University. New supercomputer simulations have come closer than ever to capturing real behavior of human heart valves. The studies focused on how heart valve tissue realistically responds to blood flow. And to be clear, this is ongoing research, meaning they don't have all the answers yet, but they do say they've made progress on a really tough problem that potentially affects hundreds of thousands of people each year with heart disease. The scientists say their new supercomputer models can potentially help doctors make more durable repairs and replacements of heart valves.
Host Jorge Salazar interviews scientists Karen Vasquez and Albino Bacolla of the University of Texas at Austin. Supercomputers have helped scientists find a surprising link between cross-shaped pieces of DNA and human cancer, according to a study at The University of Texas at Austin. DNA naturally folds itself into cross-shaped structures called cruciforms that jut out along the sprawling length of its double helix. The DNA cruciforms typically aren't anything to worry about. In fact, previous evidence shows that DNA cruciforms are essential to life. They enable DNA replication, part of how cells make copies of themselves. And they help initiate gene expression, which makes proteins. What's more, small DNA cruciforms are commonly found inside our bodies. Scientists estimate as many as 500,000 cruciform-forming sequences of DNA can exist on average in a normal human genome. What the UT scientists are doing is investigating the origins of human cancer. And what they've found is that these tiny cruciforms - just a small shape of normal DNA - are linked to mutations that can elevate cancer risk.
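Cruciforms form at inverted repeats: a stretch of DNA followed, sometimes after a short spacer, by its own reverse complement, so the strand can fold back on itself. A minimal sketch of scanning a sequence for such cruciform-forming candidates (illustrative only; the arm and spacer lengths are arbitrary choices, and this is not the study's method):

```python
# Illustrative sketch: find perfect inverted repeats (cruciform-forming
# candidates) in a DNA string. Not the study's actual analysis; arm and
# spacer lengths here are arbitrary assumptions.

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq):
    """Reverse a sequence and complement each base."""
    return "".join(COMPLEMENT[base] for base in reversed(seq))

def find_inverted_repeats(seq, arm_len=6, max_spacer=4):
    """Return (start, arm, spacer) triples where an arm of arm_len bases
    is followed, after up to max_spacer bases, by its reverse complement."""
    hits = []
    for i in range(len(seq) - 2 * arm_len + 1):
        arm = seq[i:i + arm_len]
        target = reverse_complement(arm)
        for spacer in range(max_spacer + 1):
            j = i + arm_len + spacer
            if seq[j:j + arm_len] == target:
                hits.append((i, arm, spacer))
                break
    return hits
```

Scaled to a three-billion-base genome, with imperfect repeats and statistical filtering, a search like this becomes the kind of data-intensive job that calls for supercomputing.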
Host Jorge Salazar interviews scientists Min Chen of Rice University and Jeroen Tromp of Princeton University. An international science team reported a discovery of gigantic rock structures hidden deep under East Asia, centered on the Tibetan Plateau. Scientists used supercomputers to process earthquake data and make images in 3-D down to depths of about 900 kilometers, or about 560 miles below ground. Scientists from China, Canada, and the U.S. worked together to publish their results in March 2015 in the American Geophysical Union's Journal of Geophysical Research: Solid Earth. The study area is a hotspot for earthquakes. And it's surrounded by networks of seismographic stations, 1,869 stations in all. That's where scientists got their data to take CAT scans of the Earth using the supercomputer model they developed. The science team says their research could potentially help discover hidden pockets of hydrocarbon resources like oil and gas. More broadly, they say their work will help explore the Earth hidden miles under East Asia and elsewhere.
Host Jorge Salazar of the Texas Advanced Computing Center interviews Michael Grabe, an associate professor in the Department of Pharmaceutical Chemistry and the Cardiovascular Research Institute at the University of California, San Francisco. For the first time ever, scientists designed completely from scratch a protein molecule that behaves like a slice of life. It mimics a natural protein found in living cells that transports ions across a cell membrane. The cell membrane surrounds living cells like an envelope. And ion transport through the membrane helps keep us alive. It lets nutrients in and waste out of cells, and it also transmits signals between nerve cells of the brain and spinal cord. Scientists used the Stampede supercomputer at TACC to model the stability and dynamics of the designed protein. They did this with an allocation through XSEDE, the Extreme Science and Engineering Discovery Environment, funded by the National Science Foundation. The researchers published their results in the journal Science in December 2014. This research has wide potential application, such as targeting medicines more specifically into cancer cells and driving charge separation potentially for harvesting energy for batteries.
Matthew Hanlon manages the Web and Mobile Applications Group at the Texas Advanced Computing Center. And Matt Vaughn directs the Life Sciences Computing Group at TACC. Vaughn and Hanlon present a one-hour core conversation for South by Southwest Interactive on Monday, March 16. It's called A Next Generation Platform for Open Data. In the podcast they discuss their work on the Arabidopsis Information Portal, a new online resource for plant biology research. Matthew Hanlon: We're looking to attract both data scientists and portal developers, anyone who has experience developing, hosting, running, or trying to market an open data portal to the community. Matthew Vaughn: The Arabidopsis Information Portal, Araport for short, serves two purposes. It's a clearinghouse for genetic, genomic, protein and gene expression information for the model plant Arabidopsis thaliana… But it's also a resource for people who build portals.