Editor’s note: this
article is by IBM Research – Austin Director Kevin Nowka
The Texas A&M University
System and IBM recently agreed to create one of the largest computational sciences infrastructures in the
world, dedicated to advances in agriculture, geosciences, and engineering. This
new High Performance Computing (HPC) system will link 11 universities and seven
state agencies, supporting projects as disparate as wind turbulence
simulation and text analytics of medical documents.
In my role as director of IBM’s Austin
lab, my team and I will develop these opportunities with Texas A&M and work to solve these
grand challenges.
The university system’s research compute power has always
been distributed across its 11 campuses and government agencies. With
three domain-optimized systems, we want to provide access to all of their users –
starting with engineering, eventually expanding to geoscience and agriculture, and
growing to all of their HPC and analytics users.
The systems behind
the research
One arm of the infrastructure is a 2,000-node Blue
Gene/Q, whose modeling and simulation capabilities will be put to use in the
life sciences, computational biology, and geosciences. We’re already working
with Texas A&M Research to optimize their code in climate modeling,
computational materials science, and even wind turbulence analysis.
The Power7 part of the infrastructure, on the other hand, is
focused on big data analytics research. It will provide Texas A&M with
cognitive computing capability equal to what IBM Watson used to play Jeopardy!,
in this case using IBM BigInsights for text analytics and data mining. That
text analytics capability is already at work teasing out promising technical
literature on cancer treatment identification.
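To give a flavor of this kind of literature triage, here is a minimal keyword-scoring sketch in Python. It is purely illustrative: the term list, threshold, and data are placeholder assumptions, not BigInsights code or IBM's actual method.

```python
# Illustrative sketch of keyword-based literature triage.
# TREATMENT_TERMS and the sample data are hypothetical placeholders.

TREATMENT_TERMS = {"inhibitor", "immunotherapy", "chemotherapy", "monoclonal"}

def score_abstract(text: str) -> int:
    """Count how many distinct treatment-related terms appear in an abstract."""
    words = {w.strip(".,;()").lower() for w in text.split()}
    return len(TREATMENT_TERMS & words)

def rank_literature(abstracts: dict, threshold: int = 2) -> list:
    """Return the IDs of papers whose abstracts mention at least `threshold`
    treatment terms, most term-rich first."""
    scored = {pid: score_abstract(text) for pid, text in abstracts.items()}
    return sorted((p for p, s in scored.items() if s >= threshold),
                  key=lambda p: -scored[p])

papers = {
    "p1": "A novel kinase inhibitor combined with immunotherapy shows promise.",
    "p2": "Survey of hospital administration records.",
}
print(rank_literature(papers))  # only p1 clears the threshold
```

A production system would of course use trained models and much richer linguistic features than bag-of-words matching, but the triage idea, scoring documents and surfacing the most promising ones, is the same.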
The third system – our NeXtScale System x – is the basis for
much of Texas A&M’s overall HPC update, adding capabilities such as
shared compute services across their campuses and state agencies.
Other projects underway also include geoscience data
management of the Gulf of Mexico, called SmartGulf, and atomic-level modeling
to come up with new materials for energy, aerospace, structural and defense
applications.
I’m looking forward to writing more in the coming years about the results from this unique – and massive – infrastructure, its projects, and the collaboration.
For more: read a Q&A with Jon Mogford, Vice Chancellor for Research, Texas A&M University System.