
Future Defined Storage

We have worked with many hardware and software companies over the years and built relationships that help us provide the right data storage solutions for industry. The gradual decline of tape and, more recently, hard disks makes the choice of a storage solution more difficult.

Industry today is moving more information into the cloud and choosing flash storage as the on-premise replacement for legacy systems. This creates other issues relating to the LAN and WAN:

  • How much data can we send and receive over the WAN? (a rough transfer-time calculation is sketched after this list)
  • What is the cost of upgrading links?
  • Can your current provider increase the WAN link capacity and, if not, how easy is it to move providers?
  • Will the staff notice any delays?
  • How secure is our cloud-based information?
  • What happens when we cannot access the cloud?
  • How easy is it to move legacy IT to the cloud?
  • What happens when the cloud vendors increase prices?
  • How easy is it to move data from one cloud provider to another?
  • Do we need to improve LAN performance so that users can now benefit from flash storage?
  • How scalable and reliable is flash storage?
  • Should all our data reside in the cloud, given national security, intellectual property and similar concerns?
  • We now have more links to our data; how secure is our infrastructure?
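
As a quick illustration of the first question above, the sketch below estimates how long it takes to move a given volume of data over a WAN link. The link speeds, data volume and 80% efficiency figure are illustrative assumptions, not measurements:

```python
# Rough WAN transfer-time estimate (illustrative figures only).
# Assumes the link sustains ~80% of its nominal rate; real-world
# protocol overhead, contention and latency will vary.

def transfer_time_hours(data_tb: float, link_mbps: float, efficiency: float = 0.8) -> float:
    """Hours needed to move data_tb terabytes over a link_mbps link."""
    data_bits = data_tb * 1e12 * 8               # terabytes -> bits
    effective_bps = link_mbps * 1e6 * efficiency
    return data_bits / effective_bps / 3600

# Example: moving 10 TB of legacy data into the cloud
for mbps in (100, 1_000, 10_000):
    print(f"{mbps:>6} Mb/s link: {transfer_time_hours(10, mbps):6.1f} hours")
```

At 100 Mb/s those 10 TB take well over a week of continuous transfer, which is why WAN capacity and provider flexibility belong on the list above.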

Shrinking data centres

Data centres are gradually shrinking and becoming more energy efficient as storage capacities rise. Five years ago a 16-bay 4U rackmount storage array could store 32TB of data; today those same 16 bays can store at least 160TB, a 5x increase! This causes problems for backups, as fewer spinning disks mean less throughput. In addition, a RAID rebuild that once completed overnight now takes many days, and while it is rebuilding the chance of another disk failure is higher because the drives are working harder to rebuild the RAID. The data centre of tomorrow will look vastly different to today's, with a hybrid approach that uses the cloud alongside on-premise/edge data centres for local access.
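
The rebuild problem is easy to quantify with rough arithmetic. The sketch below assumes an illustrative sustained rebuild rate of 100 MB/s per drive; real rates depend on the controller and on the production workload running at the same time:

```python
# Rough RAID rebuild-time estimate (illustrative figures only).
# Assumes a fixed sustained write rate to the replacement drive;
# under live workload the effective rate is usually lower still.

def rebuild_hours(drive_tb: float, rebuild_mb_per_s: float = 100) -> float:
    """Hours to rewrite a failed drive of drive_tb terabytes."""
    return (drive_tb * 1e6) / rebuild_mb_per_s / 3600

print(f"2 TB drive:  {rebuild_hours(2):5.1f} hours")   # the overnight rebuild of five years ago
print(f"10 TB drive: {rebuild_hours(10):5.1f} hours")  # more than a day per drive today
```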

Software defined everything

With software-defined storage and networking, the intelligence of the network and storage controllers is increasingly handled in software. RAID is being replaced with software that provides access to individual disks rather than RAID groups; the data blocks in software-defined storage can easily be replicated to multiple disks or pools to overcome the RAID rebuild issues described above. Depending on the software-defined solution, you could take this a step further and move data blocks across multiple storage arrays and servers over the LAN or WAN. Network switches once relied on ASICs designed to provide functionality within the switch; software-defined networks provide the same functionality by controlling individual ports and routing tables in software, allowing you to dynamically transfer data via a particular network path or route. Software-defined certainly allows for a more flexible data centre, as you can mix SSDs and hard disks of differing capacities and interface types to provide a tiered storage system for classifying data types. Performance is also normally greater than with a RAID controller, as you can control individual drive parameters and add memory to increase performance through caching.
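
As a simple illustration of replica placement being handled in software rather than by a RAID controller, the sketch below spreads copies of each data block across separate disks in a mixed pool. It is a simplified model with assumed names, not any particular vendor's implementation:

```python
# Minimal sketch of software-defined replica placement: every block
# is written to N distinct disks, so a single disk failure never
# removes the only copy, and recovery copies blocks, not whole drives.
import hashlib

DISKS = ["ssd-0", "ssd-1", "hdd-0", "hdd-1", "hdd-2"]  # mixed media and capacities
REPLICAS = 2

def place_block(block_id: str, disks=DISKS, replicas=REPLICAS):
    """Deterministically pick `replicas` distinct disks for a block."""
    start = int(hashlib.sha256(block_id.encode()).hexdigest(), 16) % len(disks)
    return [disks[(start + i) % len(disks)] for i in range(replicas)]

for block in ("blk-0001", "blk-0002", "blk-0003"):
    print(block, "->", place_block(block))
```

Because placement is just a function over the pool, adding disks, pools or even remote arrays only changes the list the software chooses from.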

The future

Flash storage is certainly taking centre stage, with several technologies vying to become the dominant leader: Intel and Micron 3D XPoint, HP and SanDisk memristor/ReRAM, and IBM PCM, alongside SLC, MLC, TLC, QLC and 3D NAND in SSD, PCIe and NVMe form. Seagate and Samsung have announced SSD capacities of 60TB and 15TB respectively.

Silicon photonics is now widely accepted as a key technology in next-generation communications systems and data interconnects. This is because it brings the advantages of integration and photonics: high data densities and transmission over longer distances, allowing current server and storage components to be housed in separate containers. Each container connects via silicon photonics and holds either processors, memory, storage or graphics GPUs. This offers an interesting view of how future data centres will look. Today's data centres are vast halls, some the size of football pitches, humming with the sound of AC cooling, hard disks, fans and power supplies, all cooled to the same temperature. If we separate the individual components into containers, those containers can be cooled to different temperatures and mounted on different floors vertically throughout the building, as they take up less space! All of this is connected via fibre running at 1,000 Tbit/s with sub-nanosecond access time. This type of data centre greatly improves power efficiency and performance and reduces costs, as you purchase only what you need, i.e. more GPU or CPU performance. As you connect containers, the operating software instantly knows you have added more storage and adds it to the pool with no human input!

Cloud, for now, is winning the mindset of industry leaders; the issue is that your OPEX is now out of your control. Microsoft could increase Office 365 pricing or reduce OneDrive storage, and this scenario could easily happen; your WAN provider could increase its prices, or you could lose your connection through a system failure. The cloud provides 24x7x365 access to information and applications, allowing users with phones, computers, tablets or IoT devices to instantly buy tickets online or stream a movie. Generation “X” has grown up in a connected world, sharing posts, video, music and pictures freely with very little regard for privacy or security, until something unwanted appears online that wasn’t meant to be shared. Ransomware is increasingly being used to target companies and cause as much disruption to the business as possible.

  • How is your data protected in the cloud?
  • Do you pay to have it replicated to another cloud data centre?
  • If your cloud data is corrupted where is your backup or contingency?

The cloud works and in many industries saves time and money. We always recommend moving forward with cloud deployments cautiously, fully consulting all departments to ensure they understand the motives behind the transition, and asking what the consequences could be should something go wrong.

Tape is a technology in decline; however, IBM recently demonstrated the ability to store 330TB of uncompressed data on a cartridge that fits in your hand. This could be used as a long-term archival storage platform, storing huge amounts of information in a space smaller than a filing cabinet while consuming very little power.

Disk drives, for now, will remain the key storage medium, with capacities predicted to reach 100TB by 2025. Yes, the worldwide market share of disk drives is declining, but is this figure being skewed by the increase in capacity? Flash storage is certainly getting more headline space, but it still costs more than the equivalent hard disk capacity, and not all data needs to be delivered at lightning speed.

Optical has remained at 100/125GB per Blu-ray disk for the past few years, although Sony has recently announced 3.3TB disks. This technology will never compete with flash for performance, tape for capacity or disk drives for price, but optical isn’t about any of that: it’s about the longevity of a stable medium that can hold thousands of images, documents, audio files and more on media that doesn’t age and requires very little environmental handling or power. Information stored on optical doesn’t need to be migrated to the next generation simply because it can no longer be read; its purpose is storing data 10, 20, 50 or 100+ years into the future, allowing future generations to learn from the past.

In development

All of these technologies exist in the lab and one day might become a commercial reality.

5D Glass Data Optical Disk – Developed by Southampton University, with the ability to store 360TB on a disk the size of a penny.

DNA Digital Storage – Harvard researchers have been looking at storing data in DNA strands. Just 1 gram of DNA could store 2.2PB of data.

Holographic Storage – Once hailed as a secure method of storing data in three dimensions on an optical disk holding 500GB; sadly, for now, this technology is on hold.

Quantum Physics – This involves attaching a bit of data to a spinning electron, but the data is only available for 24 hours.

How can we help?

We understand all of the following and work directly with partners to secure and win business.

  • SDDC – Software Defined Data centre
  • SDS – Software Defined Storage
  • SDN – Software Defined Network
  • SDI – Software Defined Infrastructure
  • HCI – Hyper-converged Infrastructure
  • Cloud – Public/Private
  • MSSD – Microsoft Storage Spaces Direct
  • MAS – Microsoft Azure Stack

The volumes of information created today are huge compared to the amount we stored just five years ago. For comparison, in 2015 around 10 Zettabytes of data were created worldwide, and by 2025 this figure is predicted to reach 180 Zettabytes, an 18-fold increase within 10 years!

  • Megabyte - 1,000,000 bytes
  • Gigabyte - 1,000,000,000 bytes
  • Terabyte – 1,000,000,000,000 bytes
  • Petabyte - 1,000,000,000,000,000 bytes
  • Exabyte - 1,000,000,000,000,000,000 bytes
  • Zettabyte - 1,000,000,000,000,000,000,000 bytes
  • Yottabyte - 1,000,000,000,000,000,000,000,000 bytes
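
Those prefixes make the 2015-to-2025 comparison above easy to check; the short sketch below, using only the figures quoted in this article, works out the multiple and the implied yearly growth rate:

```python
# Data created per year, using the figures quoted above.
created_2015_zb = 10      # zettabytes created worldwide in 2015
forecast_2025_zb = 180    # figure predicted for 2025

multiple = forecast_2025_zb / created_2015_zb       # 18x over the decade
yearly_growth = multiple ** (1 / 10) - 1            # compound annual growth rate
print(f"{multiple:.0f}x in 10 years, roughly {yearly_growth:.0%} growth per year")
```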

No matter what industry you operate in, technology is creating more information than ever before: higher-resolution cameras, social media and 4K/8K movies are all adding to the data mountain. Planning for this amount of information isn’t something to be taken lightly, and for many industries it means storing data differently from how they have operated for the past 20 years. With decades of experience, we like to think we can help you plan your next phase of data storage.
