Ever-increasing volumes of big data have many tech insiders wondering how the data will affect network demands, datacentres and computing infrastructure going forward.
A handful of experts discussed just that during the opening day of GigaOm’s annual Structure cloud computing pow-wow on Wednesday afternoon.
One suggestion for approaching the challenges surrounding big data is to change viewpoints: it is the cloud-based network, rather than computers, that is becoming the source of workloads and collaboration.
Cisco Systems chief technology officer and vice president Lew Tucker commented that when we talk about data, it’s not just bits and atoms that need to be analysed. In the enterprise world, he added, we’ve never seen the amounts of data that are being accumulated now.
However, as panel moderator and GigaOm writer Stacey Higginbotham pointed out, it’s hard to establish markets and business models for big data, which is holding the sector back.
Ken Barnes, senior vice president and global head of platform services at NYSE Technologies, argued that big data requires a different mindset from businesses, and that it can’t really succeed through conventional vendor-buyer relationships; it needs partnerships instead.
Nevertheless, Tucker said that the ability to process big data and get value out of it has also only now become possible, thanks to growing trends like social networking.
Serban Simu, vice president of engineering and co-founder of high-speed transfer solutions provider Aspera, said that the first problem his company is seeing with its customer base is figuring out how to move data from where it is produced to where it can be stored and/or processed.
“The bottleneck there is network capacity,” said Simu. “Data can be produced a lot faster than it can be moved.”
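A rough back-of-the-envelope calculation makes the point concrete. The dataset size and link speed below are illustrative assumptions, not figures from the panel:

```python
# Illustrative only: time to move a large dataset over a single network link.
# The 10 TB volume and 1 Gbps link speed are assumptions for this sketch.
dataset_bytes = 10 * 10**12          # 10 TB of data
link_bits_per_second = 1 * 10**9     # a fully saturated 1 Gbps link

transfer_seconds = (dataset_bytes * 8) / link_bits_per_second
print(f"Transfer time: {transfer_seconds / 3600:.1f} hours")  # roughly 22 hours
```

Even with the link running flat out, a dataset that took minutes to generate can take the better part of a day to move.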
Thus, just as with the network and computer situation, it might be better to bring applications closer to the data, rather than the other way around, which has been the status quo so far.
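A minimal sketch of that idea, assuming a hypothetical cluster where each node already holds a shard of the data, is to dispatch the computation to the node that owns the data instead of copying the data to the computation:

```python
# Hypothetical illustration of "move the application to the data":
# each node owns a shard, and work is sent to whichever node already
# holds that shard, so the bulk data never crosses the network.
DATA_LOCATION = {
    "clickstream-2012-06": "node-a",
    "trades-2012-06": "node-b",
}

def submit_to_node(node, job):
    # Placeholder for an RPC to the node's local executor.
    print(f"Running {job.__name__} on {node}")

def schedule(job, shard):
    node = DATA_LOCATION[shard]        # find where the data already lives
    return submit_to_node(node, job)   # ship the (small) code, not the (big) data

schedule(lambda rows: len(rows), "trades-2012-06")
```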
Simu posited that proximity matters less than the cloud and application infrastructures, so long as they are built for this kind of distribution. But Tucker warned that if app developers and managers aren’t cognisant of how far away the data might be, that could cause problems.
Tucker said that Cisco is trying to produce a networking model that operates under the direction of application software, instead of having the infrastructure simply guess what to do. That way, businesses working with big data can reconfigure their cloud networks and solutions on an application-by-application basis for better performance.
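As a rough illustration of what application-directed networking could look like (the controller API below is entirely hypothetical, not Cisco’s), the application declares its requirements and the infrastructure is configured to match, rather than left to guess:

```python
# Hypothetical sketch: applications declare their network needs to a controller,
# and the network is configured per application instead of guessing.
class NetworkController:
    def __init__(self):
        self.policies = {}

    def request(self, app, min_bandwidth_mbps, max_latency_ms):
        # In a real software-defined network this would push flow rules to
        # switches; here we simply record the per-application policy.
        self.policies[app] = {
            "min_bandwidth_mbps": min_bandwidth_mbps,
            "max_latency_ms": max_latency_ms,
        }
        print(f"Configured network for {app}: {self.policies[app]}")

controller = NetworkController()
controller.request("analytics-batch", min_bandwidth_mbps=5000, max_latency_ms=200)
controller.request("market-data-feed", min_bandwidth_mbps=100, max_latency_ms=5)
```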
Via ZDNet US