Technology trends for 2017

This post will describe four technology trends for 2017.

February 1, 2017

Hu Yoshida

I have divided up what I see as IT trends for 2017 into three sections: data center trends, technology trends, and IT/OT/IoT trends. In my first post in this series I gave an overview of 10 trends for 2017. My second post covered four data center trends, and this post will describe four technology trends for 2017. – By Hu Yoshida, from the Hitachi Data Systems Community

1. Bi-Modal IT

Companies that are not born in the cloud have systems of record that they must maintain and modernize while they transform to new systems of innovation. Bimodal IT refers to having two modes of IT, each designed to develop and deliver information- and technology-intensive services in its own way:

  •  Mode 1: Traditional — emphasizes safety and accuracy
  •  Mode 2: Innovation — emphasizes agility and speed

IT must be able to manage both modes and implement systems that bridge them. While some may consider bimodal IT a data center trend, integrating the two modes requires technology, which is why I cover it here.

Bimodal IT has been popularized by Gartner as an approach to digital transformation. The approach itself is not new: older establishments used it to transition from mainframe systems to open systems, where applications had to be redesigned and reprogrammed for new operating and storage systems without disrupting core business services. That transition has taken a long time, and many establishments still run some of their core applications on mainframes.

A bimodal approach to digital transformation involves moving from structured to unstructured data and from private to public cloud services. With the right technology it can help you modernize and nurture your traditional core systems while enabling migration to new systems of innovation. Three key technologies will help bridge between the two modes: converged solutions to bridge the infrastructure, object stores to bridge the data, and integration tools to bridge the information. Converged solutions can optimize the deployment of mode 1 applications and bridge to mode 2 with orchestration and cloud-ready interfaces. Data can be bridged through object storage, which can support cloud interfaces, traditional structured data, and sync and share with mobile devices; object storage scales beyond hierarchical file systems, carries extensible metadata that can ensure governance, privacy, and immutability, and provides efficient, intelligent search. Information that resides in mode 1 data warehouses and mode 2 unstructured data stores can be integrated in a data lake through tools like Pentaho.
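
To make the object-storage bridge concrete, here is a minimal sketch of landing a mode 1 record in an S3-compatible object store with governance attributes attached as extensible metadata. The endpoint, bucket, credentials, and metadata keys are hypothetical placeholders, not any platform's actual configuration; HCP-class object stores expose S3-compatible interfaces, but consult your platform's documentation.

```python
# Minimal sketch: writing a mode 1 record into an S3-compatible object store
# with governance attributes attached as extensible metadata. The endpoint,
# bucket, credentials, and metadata keys below are hypothetical placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.com",  # hypothetical endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Custom metadata travels with the object, so governance attributes
# (source system, retention class, privacy flags) remain searchable.
s3.put_object(
    Bucket="systems-of-record",                      # hypothetical bucket
    Key="erp/orders/2017/02/order-1001.json",
    Body=b'{"order_id": 1001, "total": 250.00}',
    Metadata={
        "source-system": "mode1-erp",
        "retention-class": "7-years",
        "contains-pii": "false",
    },
)
```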

2. Flash First

Source: community.hds.com

The TCO per bit for multi-terabyte flash is already lower than for hard disks, based on five-year projections for power, cooling, floor space, maintenance, and ease of management. Multi-terabyte flash also eliminates those 3 a.m. calls complaining about slow response time. With the cost argument against all-flash eliminated, you no longer have to argue with a user over whether his data is tier 1 or tier 2. As a result, analysts project that flash storage revenue will cross over hard disk revenue in 2017 as the transition to flash accelerates.
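
To illustrate the arithmetic behind that claim, here is a small worked example of a five-year TCO-per-TB comparison. Every figure is a made-up placeholder, not vendor pricing; the point is only that flash's higher acquisition cost can be offset by lower power, cooling, floor space, and management costs over the projection period.

```python
# Illustrative five-year TCO-per-TB arithmetic. Every figure below is a
# made-up placeholder, not vendor pricing -- substitute your own quotes.
def tco_per_tb(capex, power_cooling, maintenance, floor_space, admin, years=5):
    """Acquisition cost plus annual operating costs over the projection."""
    return capex + years * (power_cooling + maintenance + floor_space + admin)

hdd = tco_per_tb(capex=80, power_cooling=25, maintenance=12,
                 floor_space=10, admin=15)
flash = tco_per_tb(capex=280, power_cooling=5, maintenance=6,
                   floor_space=2, admin=5)
print(f"HDD five-year TCO/TB:   ${hdd}")    # $390 with these placeholders
print(f"Flash five-year TCO/TB: ${flash}")  # $370: higher capex, lower opex
```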

Two factors will widen the price gap between flash and hard disks: diverging technology roadmaps and the increasing use of flash in commodity devices. The technology for driving higher bit densities in disk drives has leveled off, while flash densities continue to increase with 3D flash and triple-level cells (TLC). Prices are driven by volumes, and hard disk volumes are declining as PCs and other high-volume commodity devices move to flash.

3. A Centralized Data Hub

Source: community.hds.com

Data is exploding, and data is becoming more valuable as we find ways to correlate data from different sources to gain more insight, or repurpose old data for different uses. Data can also be a liability if it is accessed by the wrong people, exposed, or lost, especially if we are holding that data in trust for our customers or partners.

Data is our crown jewels, but how can we be good stewards of our data if we don't know where it is: on someone's mobile device, in an application silo, in an orphan copy, or somewhere in the cloud? How can we provide governance for that data without a way to prove immutability, show the auditors who accessed it and when, and ensure that the data was destroyed? For these reasons, IT will be creating a centralized data hub for better management, protection, and governance of its data.

This centralized data hub will need to be an object store that can scale beyond the limitations of file systems, ingest data from different sources, and provide secure multi-tenancy, with extensible metadata that supports search and governance across public and private clouds and mobile devices. Scalability, security, data protection, and long-term retention will be major considerations. Backups will be impractical and will be eliminated through replication and versioning of updates.
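
As a sketch of how versioning can stand in for backups, the following enables bucket versioning through an S3-compatible API so that every update is retained as a recoverable version. The endpoint, bucket, and credentials are hypothetical placeholders; verify your object store's actual versioning support in its documentation.

```python
# Sketch: letting versioning stand in for traditional backups on an
# S3-compatible object store. Endpoint, bucket, and credentials are
# hypothetical placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://datahub.example.com",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# With versioning enabled, every update is retained as a recoverable
# version instead of overwriting the previous state.
s3.put_bucket_versioning(
    Bucket="central-data-hub",
    VersioningConfiguration={"Status": "Enabled"},
)

s3.put_object(Bucket="central-data-hub",
              Key="records/customer-42.json",
              Body=b'{"status": "active"}')

# List the retained versions -- this history is what replication plus
# versioning rely on to replace scheduled backups.
versions = s3.list_object_versions(Bucket="central-data-hub",
                                   Prefix="records/customer-42.json")
for v in versions.get("Versions", []):
    print(v["VersionId"], v["LastModified"])
```
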
An additional layer of content intelligence can connect and aggregate data, transforming and enriching it as it is processed, and centralize the results for authorized users to access. Hitachi Content Platform (HCP) with Hitachi Content Intelligence (HCI) can provide an object-based centralized data hub with a seamlessly integrated cloud-file gateway, enterprise file synchronization and sharing, and big data exploration and analytics.

4. Real-time analysis, Hadoop, visualization, and predictive analytics will be a major focus

This trend will see expanded use of in-memory databases like SAP HANA to shorten data analysis cycles. Data streaming platforms will provide real-time analysis of developing trends, and analytics will be embedded in applications. Real-time analytics will be connected with Hadoop analytics for further analysis, and results will be stored in an object store for possible future analysis, since deep learning tools need historical data. Analytic tools like Pentaho will combine structured and unstructured data from different sources to provide a 360-degree view for analysis. Visualization tools will be designed for the business user to help make sense of the data. Predictive analytics is becoming more prevalent as businesses try to anticipate the events that affect their business. AI and robotics are beginning to enter the picture as some early adopters deploy robotic call centers, and visual analytics are being used with Hitachi Visualization Systems for public safety applications.
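
As a minimal sketch of the pattern described above, the following pure-Python loop analyzes a stream in real time over a short sliding window while retaining every event for later batch analysis. It is a stand-in for a real streaming platform and a Hadoop or object-store landing zone, not any product's API.

```python
# Stand-in for the pattern: real-time analysis over a short sliding window,
# with every event retained for later batch (Hadoop-style) analysis.
# Pure Python for illustration; not a product API.
from collections import deque
from statistics import mean

WINDOW = 5                      # size of the "real-time" view
window = deque(maxlen=WINDOW)   # recent readings only
historical = []                 # stand-in for the historical landing zone

def ingest(reading):
    window.append(reading)
    historical.append(reading)         # kept for future deep analysis
    moving_avg = mean(window)          # real-time signal
    if reading > 1.5 * moving_avg:     # crude anomaly flag on the stream
        print(f"alert: {reading} spikes above moving average {moving_avg:.1f}")

for r in [10, 11, 9, 10, 30, 10, 11]:
    ingest(r)                          # prints one alert, for the 30
```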

However, no matter how much the analysis is sped up, it does no good if the downstream processes and decisions do not capitalize on it. Legacy systems and databases may still hinder the ability to achieve faster results unless they are aligned with in-memory analytics.

The ability to modernize core systems with technologies like in-memory computing and new analytics can prove to be highly transformational. The key is to integrate these new technologies into an overall business architecture to achieve digital transformation and deliver real business improvements.
