A Serious Challenge for Today's Software Ecosystem
Prof. Guang R. Gao
9:30am, Nov 11, room 3-404
Bio:
Professor Gao’s research has focused on high-performance computer
systems, specifically compiler optimization techniques, instruction scheduling,
system software, and the dataflow programming model.
He is an Endowed Distinguished Professor in the Department of
Electrical & Computer Engineering at the University of Delaware, and a Fellow
of both the ACM and the IEEE. He received the CCF Overseas Outstanding
Contribution Award in 2013.
Abstract:
Today's High-Performance Computing (HPC) systems and Big Data systems
have diverged, from system architecture down to software design, resulting in
separate ecosystems. A unified model is needed to combine Big Data analytics
with High-Performance Computing, providing a way to process large volumes of
data while guaranteeing high computing performance.
I propose the dataflow model to achieve this. The dataflow paradigm
“is the most suitable computing paradigm for Big Data,” offering “superior speedup,
power savings and size reduction.” The dataflow model has several advantages over
other models: the argument-fetching dataflow model encourages a smooth
integration of hybrid control flow and dataflow; location consistency breaks
the memory barrier; and the spontaneous dataflow model permits smooth handling
of the nondeterminism introduced by external events. Using our new compute
engine, HAMR, the requirements of both Big Data and High-Performance Computing
can be satisfied.
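To make the core idea concrete, here is a minimal illustrative sketch (not code from the talk or from HAMR; all names are hypothetical) of the dataflow firing rule: a node executes as soon as all of its input operands ("tokens") have arrived, so execution order is driven by data availability rather than by a program counter.

```python
# Illustrative sketch of the dataflow firing rule; all names here are
# hypothetical and not taken from the talk or the HAMR engine.
from collections import defaultdict

class DataflowGraph:
    def __init__(self):
        self.nodes = {}                     # name -> (function, input names)
        self.consumers = defaultdict(list)  # token name -> dependent nodes
        self.values = {}                    # tokens produced so far

    def add_node(self, name, fn, inputs):
        self.nodes[name] = (fn, inputs)
        for i in inputs:
            self.consumers[i].append(name)

    def feed(self, name, value):
        """Inject a token; fire any node whose operands are now complete."""
        self.values[name] = value
        for node in self.consumers[name]:
            fn, inputs = self.nodes[node]
            if all(i in self.values for i in inputs):
                # Node fires: its result is itself a token fed downstream.
                self.feed(node, fn(*(self.values[i] for i in inputs)))

g = DataflowGraph()
g.add_node("sum", lambda a, b: a + b, ["x", "y"])
g.add_node("out", lambda s, z: s * z, ["sum", "z"])
g.feed("x", 2)
g.feed("z", 10)          # arrival order does not matter
g.feed("y", 3)           # "sum" fires now (2 + 3), then "out" (5 * 10)
print(g.values["out"])   # → 50
```

The point of the sketch is that no token's arrival order is prescribed; the schedule emerges from data dependencies, which is what lets a dataflow runtime overlap Big Data-style streaming with HPC-style parallel execution.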