Big data is real, but is it the transformative technology some claim it to be? In the opening keynote of the ISC Big Data’13 conference in Heidelberg, Germany (September 25–26), Dr. Felix Wortmann offers his perspective on the big data phenomenon sweeping the IT landscape. No stranger to the hype cycles of the computer industry, Wortmann, a former SAP executive, is currently an assistant professor at the Institute of Technology Management, University of St. Gallen, and the scientific director of the Bosch Internet of Things. We asked him to preview some of the themes of his upcoming keynote.
1. Data analytics and databases have been around for decades. How is big data different?
FW: Indeed, over the last 20 years companies have invested heavily in data analytics infrastructures. The corresponding business intelligence and data warehousing initiatives have addressed the information needs of the entire enterprise, including production, sales, marketing, service, and finance. However, these data platforms have focused mainly on well-structured internal data, and they carry inherent performance and latency constraints. Big data technology now offers new possibilities in terms of data volume and speed of analysis. Moreover, the focus is shifting toward external and unstructured data. This shift goes beyond technology: companies now have the means to better understand and interact with their environment, for example customers, competitors, or partners.
2. What do you see as the enabling technologies for big data?
FW: Big data is often associated with technologies like Hadoop and NoSQL, and main-memory databases are also heavily discussed. Overall, the market of available big data solutions is very fragmented. If you take a step back, however, these solutions share one common denominator: they all leverage the potential of today’s hardware. Computing has changed fundamentally over the last 20 years. The basic ingredients of computing, such as processing power, memory, and networking bandwidth, are still the same, but multicore processors, main memory instead of disk, and high-speed networks were not the basis for software design 20 years ago.
3. Do you think this set of technologies will deliver the disruptive innovation that many proponents are promising?
FW: We have definitely seen disruptive innovation on the basis of big data; just think of the large internet companies. However, when it comes to bringing big data technologies into other domains, we can be very optimistic, but we should also be careful. More data and lower latency do not directly translate into additional business value. To enable disruptive innovation and new business models, you have to bring together business opportunities and IT capabilities. Therefore, business and IT have to collaborate and rethink how business will be done tomorrow.
4. We often hear about these technologies being used to surreptitiously gather private information — the most recent example being the NSA PRISM program in the United States. How do you think this affects the public’s perception of big data?
FW: Privacy is a major concern that has to be taken seriously, and cases like PRISM certainly shape the public’s perception. However, people are willing to share data if they see a benefit. Moreover, trust is becoming a major business asset, and companies are starting to realize this. Sharing and exchanging information is a fundamental pillar of our information society. We definitely need to reconsider existing practices from different perspectives. Furthermore, it will not only be about regulation but also about education and responsibility. Not only do enterprises and public agencies have to follow “good practices,” the same holds true for individuals. Just think about the privacy issues around Google Glass.
5. What comes after big data?
FW: Before jumping on the next big thing, we should really harvest the business potential of big data. We not only have to understand and deploy the technology but also understand and deploy value-generating use cases. That said, there are fundamental developments that will change how we live and how we conduct business. One of them is the Internet of Things: the gap between the Internet and the physical world will diminish, creating tremendous opportunity. We definitely have to be aware of these fundamental changes.
About ISC Big Data’13
The inaugural ISC Big Data conference aims to bring together IT strategists, architects, CTOs, and CIOs in Heidelberg, Germany, on September 25 and 26. The event is aimed at representatives, managers, and decision makers from industry and research, along with their staff, who are responsible for big data R&D and deployment within their organizations. The program is designed in particular for people covering data analytics, data storage and data center management, system architectures, systems and software engineering, and big data software tools, as well as specialists implementing and running enterprise, scientific, and engineering applications.
The conference is also meant for companies facing technological challenges in big data hardware, software and algorithms.
Experienced big data practitioners from large enterprises such as PayPal, British Telecom, Virgin Insights, and Google, as well as end users, will share their case studies over the two days. Visit the website for the full program.
Source: ISC Big Data’13