Hadoop is an open source software framework licensed by the Apache Software Foundation that uses a distributed compute infrastructure to handle large, batch analytics jobs on very large data sets. It does this by breaking these projects down into a…
To solve tier-1 database performance problems, it is important to understand the nature of tier-1 applications. Standard definitions of tier-1 include: (i) extremely high-cost, extremely high-performance applications, sometimes referred to as "tier-0" (e.g., Wall Street trading platforms)…
A lot has been written about the size and scope of the purpose built backup appliance (PBBA) market. When backup appliances first came to market over a decade ago, they were designed primarily as large disk repositories, some of which…
Implementing server-side caching with the right solid state disk (SSD) can be like conducting a 'surgical strike' on storage performance problems. Installing this combination of hardware and software can eliminate the storage roadblock to increased transactions per second, while not…
The backup process has a myriad of costs associated with it. There is the obvious cost of the hardware: backup servers, backup storage and network infrastructure. There is also the cost of operations to manage the backup process. One…
Almost two years ago, Storage Switzerland predicted that automated tiering technologies would be able to do more than just move data between hard disk and flash. They would begin to move data between different types of flash-based storage as…
In the modern data center, storage system upgrades are rarely caused by a storage system running out of capacity; rather, they more often occur due to an unanticipated lack of performance or exorbitant maintenance costs. In fact, performance-related upgrades…
The cost of storage is important in all data-dependent companies, but in hyper-scale environments like web-based enterprises, it can consume the business. With the simultaneous requirements of scalable capacity, reliability and availability, these organizations face serious challenges with storage…
Thanks to an easy-to-measure return on investment (ROI), most data centers' virtual server environments are growing rapidly. Many organizations have implemented a "virtualize first" policy, where all new servers are virtualized. In addition, legacy servers are being migrated to virtual…
While specialized all-flash storage systems grab the headlines, hybrid storage systems are increasingly becoming the workhorses of the data center, hosting the majority of applications, virtual machines and file shares. To expand the number of workloads they can support…