
Hu's Place



Digital transformation is a journey that will continue to evolve over time. During this time there will be exponential changes in technologies, an even greater explosion in information, and new business models that will need to be integrated into this transformation in order for it to be sustainable. The platforms we choose for transformation must have the elasticity to not only scale to meet the volume demands but also incorporate new changes in technologies.


That is what makes the partnership between Hitachi and VMware so exciting. We have demonstrated over the years that we can be at the forefront of integrating new technologies. This year Hitachi was singled out by VMware, winning its Global Innovation OEM Partner Award.


In the early days of virtualization, Hitachi was the first vendor to integrate its storage virtualization with VMware’s server virtualization. When converged systems came into play, Hitachi’s unique integration with VMware through its Unified Compute Platform (UCP) orchestration software layer was a key innovation from the very beginning. Last year Hitachi Data Systems announced enhanced integration with VMware solutions, with new support for VMware NSX, Docker containers and enhanced VMware vSphere Virtual Volumes, which strengthens our converged and hyperconverged portfolios to automate IT, improve security and enable application continuity. Hitachi also debuted all-flash versions of our converged solutions and now offers all-flash configurations across its entire portfolio of converged and hyperconverged infrastructure platforms, delivering powerful performance for enterprise applications and helping customers move to an all-flash data center. Hitachi also extended its OEM agreement to include VMware NSX for use with Hitachi Unified Compute Platform. All of this was made possible by the elasticity of our Storage Virtualization Operating System (SVOS) and our UCP orchestration software layer.


Digital transformation requires an elastic and reliable infrastructure that can deliver cloud-like economics and agility, with improved efficiency and simplified manageability. IT transformation requires a solution that can modernize core legacy applications while enabling a bridge to innovative new cloud-based solutions. The innovations in the partnership between VMware and Hitachi Data Systems will help businesses drive change and stay ahead of the competition. Unlike DIY or joint efforts by multiple compute, storage and networking vendors, Hitachi’s OEM relationship with VMware provides a major benefit in streamlining the acquisition, maintenance, and ongoing support of VMware vSphere solutions.


Hitachi and VMware have the technical elasticity to scale for future growth from core to edge to cloud.


See what else is new with Hitachi at VMworld 2017 | August 27-31, 2017 Mandalay Bay Hotel, Las Vegas, NV.


I have been out sick for the last two weeks due to a stomach bug. Never missed as much work before in my life. It was so bad I went to acute care twice for an IV. During this time between bouts of fitful sleep and trips to the bathroom, I listened to the TV news which was even more depressing with escalating nuclear threats, opioid explosion, extremist demonstrations, and tragic hate crimes.


However, there was one bit of news which I felt was promising. The San Jose Mercury News reported that blockchain technology is being used to fight food fraud, which has become a serious global issue. Walmart is just completing a trial run to trace pork in China, where it has more than 400 stores. A Chinese company said it will use the technology to track chickens from coop to processing to the store. Alibaba is also investigating blockchain technology to provide greater product integrity, working with food suppliers in Australia and New Zealand as well as in China.


If there were a global way to ensure the safety of our food supply, it might have helped prevent the suffering I just went through. This is just another use of technology for social innovation.

[Image: Hyatt Delhi]


Next week I will be participating in IDC’s Digital Summit in New Delhi, 4-5 August at the Andaz Hotel. The theme of this Summit is “Dominance of Digital Enterprise 1.0”. IDC has been a thought leader in recognizing the digital transformation trend and many are familiar with their Digital Transformation MaturityScape which is a methodology for companies to measure their progress in the digital transformation journey. Others have written and blogged about the Digital Enterprise, but Digital Enterprise 1.0 seems to be IDC’s identification of a special stage in the journey.


IDC says that the conversation has changed from “Is it real?” to “How do we leverage it?” IDC also understands that digital transformation is as much about organization, change management, and new business models as it is about people, process, and technology. Digital Enterprise 1.0 is an enterprise where digital becomes all-pervasive in the corporate world.


IDC identifies the following subthemes of Digital Enterprise 1.0:


Continuous Altering of Business Models: dominance through designs that disrupt, “not just affording digital opportunities for customer experiences or value, but altering business fundamentals such as ‘car as a service’”.


Managing Exponential Technology Change: “Managing disruption with exponential change requires a different enterprise technology fabric, a fabric which enables agility and architectural elasticity and provides a foundation for successful digital transformation”.


Measuring Digital Transformation: “It is crucial that enterprises now start measuring business value and Digital Dividend at digital velocity.”


These themes fit well with Hitachi’s approach to digital transformation and IoT.


Continuous altering of the business model is enabled by our co-creation approach to innovation. Instead of taking the “build it and they will come” approach, Hitachi has completely revised its research and development organization to a “market-in” approach. I blogged about this in my post about the transformation of Hitachi research, where the focus is now on business outcomes rather than the number of patents filed.


Managing exponential technology change is enabled by our Lumada platform. Lumada is an open and adaptable architecture that can apply to a wide range of industries and use cases, where core solutions can be extended or repurposed across multiple verticals or easily tailored to specific situations. Hitachi’s extensive expertise in OT and IT provides a unique understanding of the fundamental requirements to build and deploy IoT architectures at scale. This allows Hitachi to help customers get actionable business insight that translates into real business value, faster.


Measuring digital transformation is crucial to success. You cannot improve what you do not measure. A Hitachi engagement begins with an economic assessment of where the enterprise is today and projects out what the expectations should be. Hitachi is willing to share the risk with the customer, providing on-demand, pay-as-you-go, or managed services based upon KPIs.


I will be presenting at this conference next week, giving some use cases that illustrate our support of these three subthemes.


If you are in New Delhi next week, I hope to see you there.

In my last post, The Future of Secure Authentication: Biometrics, a comment came in from Jens-Uwe Dzikowski, who referenced a recent article on the Hitachi Research & Development website entitled “Making Society Safe and Convenient with High-Precision Walkthrough Finger Vein Authentication”.


This describes the research being done to provide walkthrough finger vein authentication in 0.3 seconds, which makes it practical for high-traffic situations. According to the International Journal of Advances in Science and Technology, which I referenced in my last post, finger vein has the highest accuracy, long-term stability, and security level compared to other biometrics such as facial recognition, iris scan, fingerprint, and voice recognition. Finger vein patterns are created in the womb and do not change as we age. Hitachi finger vein technology is proven: it was introduced in Japan in 2002 for ATMs, and while ATMs are the biggest use case, it has also been used for other applications like access control, claims verification, and time and attendance management in Europe, Asia, and Asia Pacific. Barclays Bank provides UK corporate banking customers with a finger vein biometric authentication dongle that attaches to a USB slot on their PC and lets them “easily access their online bank accounts and authorize payments within seconds, without the need for PIN, passwords or authentication codes”. With finger vein there is no need for two-factor authentication. However, up to now, it has required people to stop and place their finger on a fixed location, with an infrared light above the finger that shines through so that the image of blood in the veins can be captured.

[Image: Finger vein authentication schematic]

Although fingerprints are not as accurate or secure as finger vein, fingerprints have been considered easier to use and consequently have higher market acceptance. Fingerprint authentication often requires a second factor such as an ID card or photo. Now Hitachi researchers have found a way to read finger veins as you pass your hand over an L-shaped scanner that uses an array light source of multiple LEDs, making it possible to expose the fingers to near-infrared rays no matter where the fingers are. And since all five fingers are scanned, you can be authenticated as long as three of the five fingers’ vein patterns match, providing very high accuracy despite the open environment. It is now more convenient and faster to use than fingerprint scanners, while still providing higher accuracy and security without the need for two-factor authentication.
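The three-of-five decision rule described above can be sketched in a few lines. This is purely an illustrative sketch: `match_score` and its 0.9 threshold are placeholder assumptions, not Hitachi's actual vein-pattern matching algorithm.

```python
def match_score(captured: bytes, enrolled: bytes) -> float:
    """Placeholder similarity score between two vein patterns (0.0 to 1.0).

    A real system would compare extracted vein-pattern features with a
    fuzzy similarity measure; exact equality here is only for illustration.
    """
    return 1.0 if captured == enrolled else 0.0


def authenticate(captured_fingers: list, enrolled_fingers: list,
                 threshold: float = 0.9, required_matches: int = 3) -> bool:
    """Accept when at least 3 of the 5 finger vein patterns match."""
    matches = sum(
        1 for cap, ref in zip(captured_fingers, enrolled_fingers)
        if match_score(cap, ref) >= threshold
    )
    return matches >= required_matches
```

Requiring only a subset of fingers to match is what tolerates the open scanning environment: a finger that is poorly illuminated or partially out of view does not cause a false rejection.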

[Image: Walkthrough finger vein authentication]

Last week, I had the opportunity to visit the research lab where this walkthrough finger vein authentication is being developed. I was able to register my finger veins in less than half a minute by scanning my hand 3 times. Once I was registered I was able to pass through with a single scan. My colleague from Asia Pacific was very excited to see this demonstration since there is a growing demand for high volume biometric authentication in his region. The uniqueness and accuracy of the finger vein may eliminate the need for two factor authentication and the non-contact scanning eliminates the health concerns associated with finger print scanning, where you press your fingers on a plate that has been touched by hundreds of people before you.


Currently this is being tested for entry control in a Hitachi building in Tokyo. Instead of scanning their employee badge, employees pass their hands over a finger vein reader. This prototype processes 60 to 80 persons per minute per unit. While walkthrough is currently in development, other finger vein devices are available today from Hitachi. Businesses should know about the advantages of this biometric technology and consider it in their security planning.


This new twist, high-precision walkthrough finger vein authentication, will make it easier to authenticate individuals in large queues at transportation centers, concerts, hospitals, kiosks, and so on. If you register your finger vein with an electronic wallet, you could also use your finger vein in place of a debit/credit card. This could provide better security for vending machine users. Avanti Markets, a provider of vending machine kiosks, was hacked earlier this month with malware that infected 1,900 of its vending machines and stole the names, payment card data, and fingerprint data of some 1.9 million users.


With finger vein, you eliminate the need for a name and payment card data, and even if someone were to steal your finger vein pattern, they could not use it, because a live finger with blood in the veins is required for it to work. With the increasing number of hacking attacks and terrorism threats, the time is right for high-precision, walkthrough finger vein authentication.


You can hear more about this and other social innovation solutions at our NEXT 2017 users conference, Sept 18-20 in Mandalay Bay, Las Vegas. Click on the website for NEXT 2017 and register today to attend this conference.

Seven years after the founding of Hitachi in 1910 by Namihei Odaira, Hitachi established its first research laboratory in 1917. This was the foundation of Hitachi’s Central Research Laboratory, which has kept Hitachi at the forefront of technical innovation for the past 100 years. In that same year Hitachi published the first edition of the Hitachi Hyoron technical journal, in which technicians published their research results. This publication continues today, providing valuable insights into Hitachi’s future technology direction. Today, Hitachi research publications can be found at this website.


In the past, research has been measured by the number of patents filed, and Hitachi has consistently ranked among the companies with the highest number of patents. Innovation strategies for the most part concentrated on processes and products, with the objectives of producing quality products at a low price and building products with novel performance and functions. In the Hitachi Review published in July of 2015, Hitachi revealed a new direction for research which is measured less on the number of patents and more on business outcomes that support our Social Innovation strategy. The goal is now to build the future by working with customers to deal with the challenges facing society on a global basis. Hitachi believes there is a need to consider research and development from the perspectives of customers and business in this new age of digital transformation.


In response to the need for digital transformation and the focus on business outcomes, Hitachi’s R&D group was reorganized along three strategic axes.


The Center for Exploratory Research (CER) conducts pure research, focused 10 to 20 years into the future. The Center for Technology Innovation (CTI) is focused on technologies that drive business outcomes, and the Global Center for Social Innovation (CSI) is focused on co-creation of services and solutions with customers and collaborating partners.


In addition to the CSI lab in Tokyo, four CSI labs were established around the world to facilitate regional collaboration with customers.  CSI labs are located in APAC, China, Europe, and the Americas. CSI Americas has locations in Santa Clara, Detroit, and Brazil, with the biggest facility in Santa Clara, in the heart of Silicon Valley.


While the number of patents has declined in the past year, the quality and focus of Hitachi’s patents have increased, resulting in market leadership.

[Image: IoT market leadership]

More importantly, this approach has accelerated social innovations in healthcare, transportation, energy, and public safety. Many of the new innovations today have their roots in the vision-driven exploratory basic research of the past. For instance, Hitachi filed a blockchain patent in 2003 under the title “Hysteresis signature research”, five years before the introduction of Bitcoin. While the customer-facing part of Hitachi research is now the Global Center for Social Innovation, Hitachi still maintains its Center for Exploratory Research (CER) and Center for Technology Innovation (CTI), which feed into CSI.


To learn more about Hitachi Social Innovation and Hitachi research, and see demonstrations of some IoT projects, sign up to join us at the NEXT 2017 event in Las Vegas, September 18-20. The theme of this event is “Lead What’s Next”. Better outcomes. Better business. Better society.


We will have a number of researchers speaking, like Dr. Yano, Hitachi Chief Scientist, speaking on artificial intelligence, and David Pinski, Chief Strategist for Financial Innovation and Head of the Financial Innovation Laboratory in CSI Americas, speaking on blockchain. We will also have developers and business leaders who have translated Hitachi research projects like finger vein biometric authentication into business outcomes.



The world is going through a massive digital transformation. As the world becomes more digital and distributed, being absolutely certain about who we are dealing with and protecting our access credentials is critical to our safety and well-being. It used to be that an access badge was sufficient to grant access to a restricted area, a passport and signature were enough to get you through immigration, a credit card and signature were enough to buy goods and services, and an ATM card and PIN were enough to draw money from your account. In this digital age that is not enough, and many are turning to biometrics for authentication.


Biometrics are human features that are unique to an individual. Common biometrics include facial recognition, voice, iris scanning, finger vein and fingerprint, and they vary in terms of cost, accuracy, ease of use and security level. With a biometric authentication system, the user enrolls in a system or service by providing a biometric sample, such as a fingerprint. Later, when the user wants to use the system, he presents his biometric to a scanner, which compares it with the previously stored biometric template. If it matches the template, he is granted access to the system. If the biometric data is linked to a funding source, the process acts as both authentication and transaction enabler, greatly simplifying the transaction process. Biometrics are easier to use than having to remember and manage passwords. Smartphones like the iPhone can now register a fingerprint and scan it for authentication in place of a password.


There are some criticisms about the use of biometrics. Facial recognition and voice have low accuracy. In the case of fingerprints, studies have shown that skin conditions like dryness or dermatitis may cause fingerprint verification failures. The need to press your flesh against a scanner that hundreds of other people have touched is also a hygienic concern. When fingerprint verification was introduced on the iPhone, articles were written on how biometrics would not work because they could not be kept secret. We leave our fingerprints on everything we touch, and if someone found a way to lift our prints and use them on a scanner, there would be no way to change our fingerprint the way we could change a password. Researchers have even shown that gummy bears can be used to generate counterfeit fingerprints.


Iris or retina scanning is a biometric approach that is widely considered to be among the most accurate. It has been popularized in many movies. In the 2002 futuristic movie Minority Report, Tom Cruise’s character replaces his eyes with ones purchased on the black market to bypass this surveillance technique. Iris scanning is harder to spoof, but more expensive and not as convenient as other biometric approaches.


[Image: Iris scan]


Hitachi’s approach to biometrics is the use of finger vein. Although not as well known as fingerprint, finger vein is as convenient to use as fingerprint but bypasses the concerns of using fingerprints. Finger vein has very high accuracy and is not transferable or obscured by conditions on the skin. Unlike fingerprints, which are left on everything you touch, finger veins are hidden inside the finger. When near-infrared light is transmitted through the finger and partially absorbed by hemoglobin in the veins, it is possible to capture a unique finger vein pattern profile, which is then matched with a pre-registered profile to verify individual identity. The finger does not need to touch the camera or the light source, which relieves the hygiene concern. If a finger is not live, or is otherwise detached from the blood supply, there is no vein ID.


A comparison of biometric methods, published by the International Journal of Advances in Science and Technology, shows that finger vein biometrics has the highest accuracy, long-term stability, and security level compared to facial recognition, iris scan, fingerprint, voice recognition and lip recognition.


[Image: Biometric method comparison]


Hitachi VeinID technology was introduced in Japan in 2002 and has gained wide acceptance, with over 40,000 ATMs and several million smart cards with finger vein “match on card”. In 2010, finger vein ATMs were introduced in Poland, and Barclays Bank provides UK corporate banking customers with finger vein biometric authentication devices that let them “easily access their online bank accounts and authorize payments within seconds, without the need for PIN, passwords or authentication codes”.


Hitachi VeinID has been gaining wider acceptance in Europe since its introduction in Poland. The man most responsible for introducing VeinID to Poland is Tadeusz Woszczyński, who is currently the country manager for Poland and CCE for Hitachi Europe. Ben Edgington, Head of Engineering, Information Systems Group, Hitachi Europe, leads the engineering team, covering pre-sales, development, delivery, post-sales support and product management across our digital security solutions, which include VeinID. For more information you can link to this data sheet, which was published by Hitachi Europe.


I am very pleased to announce that both Tadeusz and Ben will be at our NEXT 2017 users conference, Sept 18-20 in Mandalay Bay, Las Vegas to present a breakout session entitled: Putting identity at the heart of security: strong authentication via Hitachi's biometric technology. Click on the website for NEXT 2017 and register today to attend this conference.



Hu Yoshida

Blockchaining in Singapore

Posted by Hu Yoshida Jun 26, 2017

Last week I had the opportunity to spend a few days in Singapore with David Pinski, Chief Strategist Financial Services for Hitachi, and Toshiya Cho of Hitachi Fintech, a board member of the open source Hyperledger consortium. The initial purpose of our visit was to attend the 2017 Blockchain for Finance Conference, where one of Hitachi’s proofs of concept was being presented by Bank of Tokyo-Mitsubishi UFJ. I would describe Singapore as one of the leading nations for blockchain activity, primarily due to the efforts of the government of Singapore and the Monetary Authority of Singapore (MAS), which has established a sandbox for fintech developers. The Bank of Tokyo-Mitsubishi UFJ POC was done in this sandbox.


However, we were only able to attend a few sessions due to the demand for us to visit financial services companies in Singapore that know us for our IT solutions and were interested in understanding what Hitachi is doing in the blockchain space. One of the companies that seemed the most interested in blockchain was looking at an application outside of the financial services space.


While I have been posting blogs about blockchain and fintechs over the past two years, I was really amazed at the progress that has been made in this area after hearing David and Cho-san, who come from financial services backgrounds, discuss it with financial IT leaders in Singapore. While blockchain is most famously known as the technology behind cryptocurrencies like Bitcoin and Ether, there are a number of POCs that are beginning to show promise in other areas of the financial industry. Even when POCs fail, each one helps to shape the future of various blockchain applications.


In Asia, cryptocurrencies are reaching fever pitch, as the value of Bitcoin in USD has more than doubled since the beginning of 2017 to over $2,500. In June of 2016 the New York Times reported that over 70 percent of the transactions on the Bitcoin network were going through just four Chinese companies, known as Bitcoin mining pools, and most flowed through just two of those companies. Bitcoin received a boost in recognition with the recent “WannaCry” ransomware attack, which demanded payment in Bitcoin. Nearly everyone I talked to in Singapore knew some relative or friend who has already made a fortune in Bitcoin. However, the New York Times reported last week that Bitcoin is being outperformed by another cryptocurrency, Ether, which has risen 4,500% since the beginning of the year, with a total worth 82% as much as all the Bitcoin in the world. Ether is based on a blockchain distributed computing platform called Ethereum.


Although the rampant speculation in Bitcoin and Ether sounds like a pyramid scheme and a currency for criminal activities, there are many other applications of the basic technology which could drive more beneficial social innovation. Blockchain acts like a distributed ledger where all transactions are properly conducted and recorded without the bottleneck of a central ledger.

  • Blockchain can shorten the settlement of funds transfer from days to minutes, for earlier release of funds, reduction of risk, and reduction of processing fees.
  • It could be used for a distributed cloud where no one vendor controls all of your online assets.
  • Blockchain technologies could make tracking and managing digital identities both secure and efficient, resulting in seamless sign-on and reduced fraud.
  • Programmable digitized contracts can be entered on the blockchain as variables and statements that can automatically release funds using the blockchain network as the executor, rather than trusting a single central authority.
  • Blockchain may even solve the problem of “fake news”.


Hitachi has a long history in blockchain development, going back to 2003 when it was granted its first patent on what was then known as hysteresis signature research. The picture below illustrates how transactions are chained together. It shows a triangle representing a hash of a block, chained to and embedded into the hash of another block represented by a cross, then embedded into a hash represented by a hexagon, then into a hash represented by a circle, and so on. In this way it is not possible to delete or change one of the blocks in the chain.

[Image: Hitachi hysteresis signature patent diagram]
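The chaining idea can be sketched generically: each block's hash incorporates the previous block's hash, so altering any block invalidates every hash that follows it. This is an illustrative sketch of hash chaining in general, not the patented hysteresis signature scheme itself.

```python
import hashlib


def chain_hashes(blocks: list) -> list:
    """Return the chained SHA-256 hash for each block in sequence.

    Each hash covers the previous hash plus the current block's data,
    so the final hash depends on every block before it.
    """
    prev_hash = b""
    chained = []
    for block in blocks:
        digest = hashlib.sha256(prev_hash + block).hexdigest()
        chained.append(digest)
        prev_hash = digest.encode()
    return chained
```

Tampering with any block changes its hash and, by construction, every hash after it, which is what makes the chain tamper-evident.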

Hitachi is an active participant in the open source Hyperledger consortium, an umbrella project of open source blockchains and related tools, started in December 2015 by the Linux Foundation, to support the collaborative development of cross-industry, blockchain-based distributed ledgers.


I am pleased to announce that we have secured David Pinski to present at our NEXT 2017 users conference, September 18-20 in Mandalay Bay, Las Vegas. David will cover Hitachi’s participation and contributions to the blockchain effort. I will provide more information on this event and David's presentation as we get closer to the date.

The VMware Partner Leadership Summit 2017, which took place last month, concluded with awards ceremonies recognizing exemplary achievements in the VMware partner ecosystem. Hitachi was singled out as the Innovation OEM Partner, globally and in EMEA. The two awards recognize Hitachi's accomplishments in 2016, including unprecedented year-over-year revenue growth and continued integration between Hitachi Unified Compute Platform (UCP) offerings and VMware vSphere, vSAN, vRealize and NSX solutions.

[Image: VMware Partner of the Year award]

Hitachi’s unique integration with VMware through its Unified Compute Platform orchestration software layer has been a key innovation from the very beginning of our OEM relationship. In 2016 Hitachi Data Systems announced enhanced integration with VMware solutions, with new support for VMware NSX, Docker containers and enhanced VMware vSphere Virtual Volumes, which strengthens our converged and hyperconverged portfolios to automate IT, improve security and enable application continuity. Hitachi also debuted all-flash versions of UCP 4000, UCP 4000E, UCP 2000 and UCP HC V240F, and now offers all-flash configurations across its entire portfolio of converged and hyperconverged infrastructure platforms, to deliver powerful performance for enterprise applications and help customers move to an all-flash data center. Hitachi also extended its OEM agreement to include VMware NSX for use with Hitachi Unified Compute Platform.


The justification for the Innovation OEM Partner award from VMware can be summarized in the data sheet “Top 10 Reasons to Select Hitachi Unified Compute Platform for VMware vSphere”.


The following are the key points. For more detail and customer use cases, please take the time to review the referenced data sheet.

1. Hitachi’s UCP Director and UCP Advisor orchestration software plug into VMware vCenter, extending its cloud management capabilities with mouse-click simplicity.

2. UCP for VMware vSphere is a pre-engineered, validated, turnkey solution that is preconfigured at the factory and delivered to your site racked, cabled, and configured for fast deployment and rapid transition to the cloud.

3. Accelerate time to market by leveraging all the elements of a software-defined data center, including VMware’s software-defined networking solution, NSX. This integration helps automate SDN capabilities and improve security with VM-level firewalls.

4. Infrastructure flexibility satisfies the requirements of any VMware data center infrastructure from high-end block or file-based highly available, cost-effective storage products to converged and hyperconverged systems, to enterprise cloud solutions.

5. Industry-leading performance dramatically increases the pace of innovation for the most demanding online transaction or data analytics applications. In addition to all-flash support, UCP for VMware vSphere is SAP certified. Hitachi enterprise-level solutions for SAP HANA enable 24/7, mission-critical, in-memory HANA environments.

6. Significant economic benefits result from UCP’s product family architecture, which consists of an economical open system that leverages modular building blocks for converged and hyperconverged infrastructures. A third-party report by the Edison Group shows significant TCA and TCO savings over comparable DIY and VCE configurations.

7. Reduce TCO for virtualization and cloud management through industry-leading infrastructure orchestration technology that is fully integrated into the overall system. When used with cloud automation software such as vRealize Automation, UCP solutions enable cloud self-service and role-based access functionality.

8. Protect current IT investment and modernize for future growth. UCP for VMware vSphere integrates and protects your current infrastructure investment while enabling you to expand as needed.

9. Create a software defined data center (SDDC) with enterprise grade storage, compute, network and software from Hitachi Data Systems and leading virtualization capabilities from VMware including VMware Virtual SAN and VMware NSX.

10. Optimize Business Continuity with UCP Director and VMware vCenter Site Recovery Manager for automated disaster recovery and fast replication to move secondary copies to off-site recovery locations.

Digital transformation requires reliable infrastructure that can deliver cloud-like economics and agility, with improved efficiency and simplified manageability. IT transformation requires a single solution that can modernize core legacy applications while enabling a bridge to innovative new cloud-based solutions. The innovations in the partnership between VMware and Hitachi Data Systems will help businesses drive change and stay ahead of the competition. Unlike DIY or joint efforts by multiple compute, storage and networking vendors, Hitachi’s OEM relationship with VMware provides a major benefit in streamlining the acquisition, maintenance, and ongoing support of VMware vSphere solutions.

Twenty-five years ago, when I returned to California after living in Japan, I was able to realize my dream of building my own custom home. At that time home builders in California were buying up farmland and throwing up houses as fast as they could. I wanted something different: something with the quality and attention to detail that I was used to in Japan, but on a larger scale. I was fortunate to find an architect and contractor who shared my vision for quality. Building a house is a very manual process, with opportunities for human error at every step. While other contractors hired low-cost itinerant subcontractors who moved from site to site to do the framing or shingle the roof, our contractor kept the same subcontractors for each job to ensure quality execution and follow-through.


[Image: house plans]


My colleague in Infrastructure Solutions, Tony Huynh, believes that architecting and building an effective data center is akin to how a proven home builder creates a blueprint for a rock-solid home that customers buy with confidence in its long-term value. Both the IT architect and the home builder have similar qualities: solid engineering prowess, efficient execution, and a predictable, high-quality result. Conversely, a shoddy home builder can put something together quickly (and even more cheaply), but the results will reflect that effort. Tony contributed the following post to explain the benefits of using the Hitachi Automation Director in building an effective data center.


The approach to implementing a modern data center should be looked at in its totality. This is especially true when you're considering deploying flash technology for your critical tier 0 and tier 1 applications. In addition to the core hardware "bits", an assessment should be made of the robustness of the underlying OS, as well as the complementary software that makes the solution complete and whole: a solid brick house.


Intellectual property and engineering prowess have always mattered, now more than ever.

For our Hitachi VSP F all-flash and VSP G hybrid arrays, we offer Hitachi Automation Director, our data center automation software that reduces manual processes, in some cases by 90%, by utilizing templates for automatic provisioning and other resource-intensive tasks. Reducing manual processes significantly reduces the probability of human error. And we all know the financial, brand, and customer impact of a single keystroke error.


Today, we have a large Hitachi Automation Director customer that previously spent more than 23 hours manually provisioning storage for their AIX servers, and they did this more than 100 times a month (that's not a typo). With Automation Director, they have reduced the same provisioning process to less than 50 minutes.



23 hours x ~100 times a month = ~2,300 manual hours. HIGH PROBABILITY OF HUMAN ERROR



&lt;50 minutes x ~100 times a month = ~83 hours. SIGNIFICANT REDUCTION IN HUMAN ERROR

Imagine what can be accomplished with the additional ~2,200 hours of IT resources freed up per month.
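The back-of-the-envelope arithmetic above works out as follows (assuming a flat 50 minutes per automated run, at the top of the quoted "less than 50 minutes"):

```python
# Rough arithmetic behind the Automation Director example above.
manual_hours_per_run = 23
runs_per_month = 100

manual_total = manual_hours_per_run * runs_per_month   # 2,300 manual hours
automated_total = (50 / 60) * runs_per_month           # ~83 hours

print(manual_total)                            # 2300
print(round(automated_total))                  # 83
print(round(manual_total - automated_total))   # ~2217 hours freed per month
```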



We are constantly investing in features that help customers optimize their VSP F and G flash deployments. That is why we are introducing Hitachi Automation Director 8.5.1, with automatic flash pool optimization.


This new feature reduces the manual processes associated with increasing flash pool sizes, a task that previously required seven steps.


With HAD 8.5.1, this is collapsed into two steps (see figure below). That's roughly a 70% reduction in manual steps, and the benefits compound, since HAD will automatically increase the pool size without further storage admin or user intervention.




When it comes to choosing a solution for critical flash workloads, it's important to look at the entirety of the solution. Your VSP flash deployments can reach their maximum effectiveness with Hitachi Automation Director, which reduces manual provisioning processes by up to 90%. This directly lowers the risk of human error and redirects those IT resources to other strategic projects.



This week we announced new versions of products in our Hitachi Content Platform portfolio. These included HCP v8.0, our object storage platform; HDI v6.1, our cloud gateway; HCP Anywhere v3 for file synch and share; and general availability of Hitachi Content Intelligence for big data exploration and analytics, which we announced in 4Q2016. In my last post I talked about some of the many new features and capabilities that are integrated in this portfolio, which Gartner and IDC recognize as the only offering that allows organizations to bring together object storage, file synch and share, and cloud storage gateways to create a tightly integrated, truly secure, simple and smart cloud storage solution.


One benefit I failed to mention is what this portfolio provides for the DevOps process. The agility and quality of the products in this portfolio are a great example of the DevOps process used by the HCP development and QA teams. Hitachi Content Platform, which is recognized by the industry for cloud computing excellence, is also one of the tools in our DevOps tool chain in Waltham, where we develop the Hitachi Content Platform portfolio.


Recently there have been some articles about difficulties in orchestrating the DevOps tool chain. A DevOps tool chain is a set of tools that aid in the delivery, development, and management of applications throughout the software development lifecycle. While DevOps has streamlined the application development process compared to the old "waterfall approach", DevOps toolchains are often built from discrete and sometimes disconnected tools, making it difficult to understand where bottlenecks are in the application delivery pipeline. Many of these tools are great at performing their intended function but may not apply all the disciplines needed for enterprise data management.


HCP's main benefit for DevOps is its high availability, which insulates downstream test automation tools from software upgrades and hardware maintenance or failures, and insulates them from availability issues in upstream tools as well. We use Jenkins for continuous integration, and if it goes down or is being upgraded, the downstream test tools don't notice or care, since they fetch builds from the always-online HCP.


HCP's Metadata Query Engine (MQE) helps abstract away where the artifacts are located and named in a namespace. As long as the objects are indexed, MQE will find them and present them to the client, regardless of the object name and path. Even further downstream, after the automated tests are run, we again take advantage of HCP by storing the test results and logs on the HCP (preferably in a separate namespace from the build artifacts). HCP's security and encryption features ensure a secure enterprise environment, which is not always available with DevOps tools. DevOps is about automation, and HCP can automate managing space consumption by using retention and disposition to "age out" and delete old logs or old builds, or tier them off elsewhere for long-term storage (such as an HCP S storage node or public cloud). HCP also provides an automated backup solution, using its replication feature to get copies of the backups off-site for DR. HCP Anywhere and HDI are also valuable for ensuring a secure and available distributed environment.
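The retention-and-disposition "age out" idea can be illustrated with a toy sketch in plain Python. This is not HCP's policy engine or API; the object names and the 90-day window are invented, and HCP evaluates such policies server-side:

```python
from datetime import datetime, timedelta

# Illustrative "age out" disposition policy: build artifacts older than the
# retention window are candidates for deletion or tiering off to long-term
# storage (e.g., an HCP S node or public cloud).
RETENTION = timedelta(days=90)

def expired(objects, now):
    """Return names of artifacts whose retention window has passed."""
    return [name for name, ingested in objects
            if now - ingested > RETENTION]

builds = [
    ("builds/nightly-001.tar.gz", datetime(2017, 1, 10)),
    ("builds/nightly-150.tar.gz", datetime(2017, 5, 1)),
]
print(expired(builds, now=datetime(2017, 5, 15)))
# ['builds/nightly-001.tar.gz']
```

Only the old January build falls outside the window; the recent build is retained.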


There is no doubt that DevOps has contributed to the Speed and Agility of HCP development. In return the use of HCP in the DevOps tool chain has made the development more secure and available, and facilitated the quality integration of features and products in the HCP portfolio.


Enrico Signoretti, Head of Product Strategy at OpenIO, in a March 2017 Gigaom report called Sector Roadmap: Object storage for enterprise capacity-driven workloads, wrote the following: “The HCP (Hitachi Content Platform) is one of the most successful enterprise object storage platforms in the market. It has more than 1700 customers, with an average cluster capacity between 200 and 300TB. … Alongside the hardware ecosystem, HDI (remote NAS gateway) and HCP Anywhere (Sync & Share) stand out for the quality of their integration and feature set.”


If you already have or are planning to implement an HCP, consider including DevOps as another tenant in your multitenant HCP. For reference you can download this white paper written by our HCP developers on how they use HCP as a Continuous Integration Build Artifact storage system for DevOps.

Digital transformation is the transformation of business activities and processes to fully leverage the capabilities and opportunities of new digital technologies. There are many cases of companies that have adopted new technologies but have not transformed their business processes or business model to fully leverage the technology. An example is a bank that adopts mobile technology so that a loan request can be entered on a mobile app. While the app is easy to use, it can still take weeks to approve and process the loan if the back-office processes are not changed. This puts the bank at a disadvantage when competing with Fintech companies that can process a loan request in two days.


Fintech companies have an advantage in that they are technology companies that were born in the cloud and do not have the legacy that traditional financial companies have. To compete, traditional companies must take a bi-modal approach to digital transformation: they must continue to enhance and modernize their core applications while they transition to new systems of innovation. In the case of the bank processing loans, it needs access to more information to evaluate the creditworthiness of the applicant. If that information is locked up in different silos, it will be difficult to provide the agility required to shorten the loan process.


Traditional companies may have an advantage over Fintechs if they can unlock the wealth of data that is already in their legacy systems. The key to success will be the ability to free the data from legacy silos and use it to augment new sources of data that can be analyzed to create new business opportunities. Object storage, with its rich metadata capabilities, open interfaces, and scalability, can eliminate these silos. However, no single object storage product can do it all, from mobile to edge to cloud to core. There must be an integrated portfolio of object management products that can optimize and modernize traditional core systems while activating and innovating new business processes with technologies like cloud, mobile, analytics, and big data. The lack of an integrated approach will create increased complexity and cost.


Today Hitachi Data Systems announced enhancements to our Hitachi Content Platform portfolio which will further enhance its value as a digital transformation platform.


[Image: HCP portfolio]


Hitachi Content Platform (HCP) is an object storage solution that enables IT organizations and cloud service providers to store, share, sync, protect, preserve, analyze and retrieve file data from a single system. Version 8 is the latest version of the object storage repository. The HCP repository is usually mirrored across two sites to provide availability and eliminate the need for backup. Geo-protection services have been added, which essentially erasure-code objects across three to six sites so that data is protected if any site fails. Erasure coding also uses less storage capacity than full replication of data: across three sites you save 25% capacity compared to mirroring, and with six sites you can save up to 40%. Erasure coding will impact retrieval time, so provisions are offered to retain whole copies locally for a specified time. Support for KVM, the use of 10TB drives in the storage nodes, a 55% increase in objects per node, and simplified licensing have also been added to improve the economics of private versus public cloud.
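The capacity arithmetic behind those savings can be sketched in a few lines. The exact data/parity split HCP uses is not stated here; the splits below are assumptions chosen to reproduce the quoted 25% and 40% figures:

```python
# Capacity overhead per stored byte: full mirroring keeps two complete
# copies (2.0x), while site-level erasure coding spreads data plus parity
# fragments across sites. The data/parity splits below are illustrative
# assumptions, not HCP's documented coding scheme.
def overhead(data_sites: int, parity_sites: int) -> float:
    return (data_sites + parity_sites) / data_sites

mirror = 2.0
three_site = overhead(2, 1)   # 3 sites: 1.5x raw per byte stored
six_site = overhead(5, 1)     # 6 sites: 1.2x raw per byte stored

print(f"3-site saving vs mirroring: {1 - three_site / mirror:.0%}")  # 25%
print(f"6-site saving vs mirroring: {1 - six_site / mirror:.0%}")    # 40%
```

Fewer parity fragments per data fragment means less raw capacity per byte, at the cost of cross-site reads on retrieval, which is why whole local copies can be retained for a time.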


Hitachi Data Ingestor (HDI) is an elastic-scale, backup-free cloud file server with advanced storage and data management capabilities. It appears as a standard NFS or CIFS server to users, but every file is replicated over HTTP to an HCP, so it does not require backup. When local storage approaches a threshold, older files are stubbed out to HCP, so HDI appears to be a bottomless filer. HDI v6.1 adds read/write content sharing and universal file migration improvements.


HCP Anywhere provides file synchronization and sharing on mobile devices, with end-user data protection and mobile device management to increase productivity and reduce risk in a secure, simple and smart way. Version 3 provides a next-generation Windows client with VDI support and a thin desktop client (selective caching and synching). There are also enhancements for data protection, collaboration, and usability, such as Android app improvements. You can use "My Documents" as your HCP Anywhere directory and protect your files from ransomware: any alteration of a file is stored as a new version, so recovery is simply a matter of accessing the previous version.


Hitachi Content Intelligence, announced in 4Q 2016, is now generally available and rounds out the HCP portfolio with new data analytics and recommendation capabilities. With Content Intelligence, organizations can connect to and aggregate data from siloed repositories, transforming and enriching data as it's processed and centralizing the results for authorized users to access. Combined with the existing HCP portfolio, Hitachi is now the only object storage vendor in the market with a seamlessly integrated cloud-file gateway, enterprise file synchronization and sharing, and big data exploration and analytics. Hitachi Content Intelligence can connect to and index data that resides on HCP, HDI, HCP Anywhere, and cloud repositories. Once data repositories are connected, Hitachi Content Intelligence supports multiple processing workflows with analytics, extraction, transformation and enrichment stages that can be applied as your data is processed.


Last year Gartner and IDC gave the Hitachi Content Platform portfolio their highest marks, recognizing it as the only offering that allows organizations to bring together object storage, file synch and share, and cloud storage gateways to create a tightly integrated, truly secure, simple and smart cloud storage solution. Hitachi is further solidifying that lead with today's announcements.


For more details, see the announcement letter.

Hu Yoshida

HCP Stands the Test of Time

Posted by Hu Yoshida May 26, 2017

[Image: archive storage]


Recently FINRA fined an investment firm $17 million for inadequate investigation of anti-money laundering "red flags" during the period from 2006 to 2014, when the firm saw significant growth in its business. That successful growth was not accompanied by growth in the systems that assure compliance. The firm had patchwork solutions for large volumes of data, resulting in data silos that made it difficult to combine the data for investigations.


The period from 2006 to 2014 is a very long time in terms of the evolution of technology and changes in regulations and financial instruments, so I can see how this situation could easily happen. If you do not have a data management system that can scale, consolidate silos of data, leverage technology advances and respond to regulatory and business changes, compliance can easily get out of hand.


Ten years ago, data management for compliance was mainly dependent on CAS (content-addressable storage), which was successfully marketed by EMC as Centera. CAS is based on hashing the object (file) and using that hash as the address into the CAS repository. The hash could also be checked on retrieval to demonstrate immutability, which was a plus for compliance. Another plus was its flat structure, which could grow to large capacities of low-cost storage. Access to Centera required an API, which made it proprietary, but that did not deter users who saw it as a solution for retention of compliance data. Many ISVs were happy to jump in and provide application-specific solutions based on the Centera API, since it provided them with proprietary lock-in.
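The CAS idea, hash-as-address with a free immutability check on retrieval, can be sketched in a few lines. This is a toy in-memory model, not Centera's actual API:

```python
import hashlib

# Toy content-addressable store: the SHA-256 of the content *is* its address.
store = {}

def cas_put(data: bytes) -> str:
    """Store an object; its address is the hash of its content."""
    addr = hashlib.sha256(data).hexdigest()
    store[addr] = data
    return addr

def cas_get(addr: str) -> bytes:
    """Retrieve an object, re-hashing to verify it has not been altered."""
    data = store[addr]
    assert hashlib.sha256(data).hexdigest() == addr, "object corrupted"
    return data

addr = cas_put(b"2006 compliance record")
print(cas_get(addr) == b"2006 compliance record")   # True
```

Because the address is derived from the content, any modification would change the hash, which is what made CAS attractive for proving immutability of compliance data.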


Hitachi Data Systems' offering at that time was a product called HCAP (Hitachi Content Archive Platform), developed in partnership with Archivas. Hitachi and Archivas took the approach of indexing the metadata and content as files were written to the archive, so that HCAP had awareness of the content and could provide event-based updating of the full-text and metadata index as retention status changed or as files were deleted. Hashing of the object was also provided, but for immutability, not for addressing. While proprietary CAS solutions focused on storing content, HCAP focused on accessing it. The interfaces to HCAP were non-proprietary, supporting NFS, CIFS, SMTP, WebDAV, and HTTP, and HCAP supported policy-based integration from many distributed or centralized repositories such as e-mail, file systems, databases, applications and content or document management systems. The elimination of silos enabled users to leverage a set of common and unified archive services such as centralized search, policy-based retention, authentication and protection.


In 2007 Hitachi Data Systems acquired Archivas and shortly thereafter changed the name to Hitachi Content Platform, since the product offered more capabilities than just archiving. HCP is a secure, distributed, object-based storage system designed to deliver smart web-scale solutions. HCP obviated the need for a siloed approach to storing an assortment of unstructured content. With massive scale, multiple storage tiers, multitenancy and configurable attributes for each tenant, HCP could support a range of applications on a single HCP instance and combine the data for in-depth investigations. As technology evolved, HCP added support for additional interfaces like Amazon S3 and OpenStack Swift, the latest advances in server and storage technology like VMware and erasure coding, and numerous security, clustering, and data management enhancements.


HCP is designed for the long term, with open interfaces and non-disruptive hardware and software upgrades that take advantage of the latest technology solutions and business trends. A customer who purchased HCAP in 2006 could have non-disruptively upgraded through multiple generations of hardware and seven versions of HCP. More importantly, they could adopt new technologies and information management practices as initiatives like cloud, big data, mobile, and social evolved. While HCP remains up to date and positioned for future growth, analysts like Gartner are declaring Centera obsolete and are recommending compliance archiving alternatives to Centera after the Dell acquisition of EMC. With an estimated 600PB of data on Centera, migration will be a major problem.


HCP is at the core of Hitachi's object storage strategy, and Hitachi Data Systems is unique in the way that it has expanded its object storage portfolio around HCP.

  • Hitachi Data Ingestor (HDI) is a cloud storage gateway that enables remote and branch offices to be up and running in minutes with a low-cost, easy-to-implement file serving solution for both enterprises and service providers.

  • Hitachi Content Platform Anywhere (HCP Anywhere) is a secure enterprise file synch-and-share solution that enables a more productive workforce with secure access and sharing across mobile devices, tablets, and browsers.


  • Hitachi Content Intelligence (HCI) connects to and indexes data that resides on HCP, HCP Anywhere, HDI, and cloud repositories to automate the extraction, classification, and categorization of data.


Initially, in 2006, HCP used DAS for small and intermediate configurations and SAN-attached storage for large enterprise configurations. Today, HCP configurations include low-cost, erasure-coded HCP S10 and S30 network-attached storage nodes as well as public cloud, enabling hundreds of PB of object storage all under one control without the need for a SAN. HCP server nodes have been consolidated to one model, the HCP G10, and HCP can also run on a VM. By separating logical services from physical infrastructure, HCP allows both to scale independently, while continuing to utilize existing assets.


HCP’s track record has proven that it can support your long term and changing requirements for archive, compliance and analytics. You can be sure that there will be a version 8 of HCP as it evolves to leverage new technologies and information management practices. You can also be sure that version 8 and the integrated portfolio of HDI, HCP Anywhere and HCI will continue with non-disruptive upgrades.


Today, May 12, a massive ransomware attack began with the NHS system in the UK, affecting dozens of hospitals, and spread across six continents, hitting an estimated 75,000 machines!


According to gizmodo.com, "Unknown attackers deployed a virus targeting Microsoft servers running the file sharing protocol Server Message Block (SMB). Only servers that weren't updated after March 14 with the MS17-010 patch were affected; this patch resolved an exploit known as EternalBlue, once a closely guarded secret of the National Security Agency, which was leaked last month by ShadowBrokers, a hacker group that first revealed itself last summer.

The ransomware, aptly named WannaCry, did not spread because of people clicking on bad links. The only way to prevent this attack was to have already installed the update.”


Attached is a screenshot from Kaspersky Lab's blog on WannaCry. The ransom started at $300 worth of bitcoin, but has since been raised, according to the Kaspersky post.




The scope of this attack is unprecedented and underscores the need to keep current with security patches. While this attack may not have come from clicking on bad links, as a reminder, many such attacks start from a link or attachment in an email. Do not click on links or open attachments in emails you are not expecting. It is also recommended that you reboot your computer regularly to complete any security patches that may be waiting to install.


It also underscores the need to have a recovery plan. Recovering from ransomware attacks may be possible if backups have been taken and you have a point-in-time copy from before the attack. Scott Sinclair of Enterprise Strategy Group recommends the use of object storage in a recent report: Object storage helps with ransomware protection.


Scott notes that some object storage systems, like Hitachi's HCP, support a feature called object versioning. Object storage systems are designed for write once, read many (WORM). With object versioning, any change or update to an object is written as a new version, while the previous version is retained. When malware encrypts the data to prevent its use, the encrypted data is written as a new version and the original object is not changed. In other block or file systems, the original data is locked up by the encryption and is not available until a ransom is paid.


With HCP, the storage admin simply issues a command to roll affected objects back to their previous versions. This restoration is much faster, simpler and less costly than restoring data from a backup copy, if one is even available and current.
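The versioning behavior Scott describes can be illustrated with a toy in-memory model. This is not HCP's actual API, just a sketch of the write-once, version-on-update idea; the object name and payloads are invented:

```python
# Toy model of object versioning: every write appends a new version, so a
# ransomware "encrypt in place" just lands as a new version and the clean
# copy survives underneath it.
class VersionedStore:
    def __init__(self):
        self._versions = {}          # name -> list of payloads, oldest first

    def put(self, name, data):
        self._versions.setdefault(name, []).append(data)

    def get(self, name):
        return self._versions[name][-1]        # latest version

    def rollback(self, name):
        """Discard the latest version, exposing the previous one again."""
        self._versions[name].pop()
        return self.get(name)

store = VersionedStore()
store.put("finance/q2.xlsx", b"clean spreadsheet")
store.put("finance/q2.xlsx", b"encrypted-by-ransomware")   # attack writes a new version
print(store.get("finance/q2.xlsx"))        # the encrypted payload is "current"
print(store.rollback("finance/q2.xlsx"))   # b'clean spreadsheet'
```

Rolling back is just dropping the newest version, which is why it is so much faster than a restore from backup.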


For the affected organizations, if they do not have such object storage in place and recovery from backups is too costly, they might just have to pay the ransom.

[Image: laptop ban]

Airlines told to "be prepared" for an expanded ban on carry-on electronic devices allowed on airplanes.


On February 2, 2016, an explosive hidden in a laptop detonated 20 minutes into the flight of a Somali airliner. It blew a hole in the airplane, but since the plane had not reached cruising altitude, it was able to return to the airport and land safely. Fortunately, the plane's departure had been delayed for an hour, which prevented the explosion from occurring at cruising altitude, where it would have destroyed the plane and everyone onboard. The explosives in the laptop were not detected during the x-ray screening prior to boarding.


As a result of this incident and a similar airliner downing in 2015, the U.S. Transportation Security Administration and U.K. transport security authorities placed a ban on electronic devices larger than a cell phone/smartphone in carry-on luggage on certain direct flights to the United States and the United Kingdom. As of March 24, 2017, the ban applies to U.S.- and U.K.-bound flights from eight countries in the Middle East.


On May 9, 2017, Homeland Security spokesman David Lapan confirmed to reporters that the U.S. administration is considering expanding the ban on laptops, to potentially include "more than a couple" other regions, including flights from Western Europe.


What does this mean for those of us who travel frequently? It means we can't take our laptops onboard to finish that last-minute presentation before the next meeting. While we might welcome the excuse to put off our work, it creates a bigger problem: we will have to pack our laptops in checked luggage, where they are subject to damage, loss and possible theft.


Hitachi Data Systems' Mobile Computing Policy prohibits company-issued laptops, tablets and mini tablets from being placed in checked baggage for just that reason. Packing our laptops in checked luggage is not an option for us. However, that does not create a problem, since we use HCP Anywhere, our file synch and share solution. All the files we need for a trip can be loaded into an HCP Anywhere folder and retrieved from our smartphones or a loaner PC at our destination.


It just so happens that I will be in Istanbul next week to participate in several conferences. My presentations are loaded in my HCP Anywhere folder, and I am leaving my laptop at home. I plan to pack one of my personal iPads in my checked baggage to use when I get there, since I prefer a larger screen than my iPhone, and I can FaceTime with the family. I have several iPads, since I seem to get one free every time I upgrade my iPhone; they are cheap to replace and don't contain company data. I will spend my time on the flight sleeping, watching movies or doing email. This ban is likely to be extended to other countries, and possibly domestic airports, and that is fine with me, since it means another layer of security. Not having a laptop in my carry-on means a safer, more restful flight.


Since airlines are not the only places where I can lose a laptop while I am travelling, HCP Anywhere eliminates the liabilities of travelling with a company laptop altogether.

Hu Yoshida

Provenance and Blockchain

Posted by Hu Yoshida May 3, 2017

[Image: Declaration of Independence]

Last month two Harvard researchers stunned the experts with the discovery of a second parchment copy of the United States Declaration of Independence in the UK’s West Sussex Records Office. They tracked down the manuscript and confirmed its provenance.


Provenance is the history of the ownership and transmission of an object. In the world of art and antiquities, provenance includes the auction houses, dealers, or galleries that have sold an item, the private or institutional collections in which the item has been held, and exhibitions where the item has been displayed. Provenance is defined by Merriam-Webster as "the history of ownership of a valued object or work of art or literature". Today provenance can be extended to anything of value through the implementation of blockchain technology.


Blockchain technology is the technology behind Bitcoin. According to Wikipedia, a blockchain facilitates secure online transactions: it is a decentralized and distributed digital ledger that records transactions across many computers in such a way that the registered transactions cannot be altered retroactively. (I blogged about blockchain over a year ago and recently posted about the use of blockchain in Hitachi's 150-million-member PointInfinity awards system.) Blockchain can provide a secure, immutable record of when an object was created, its history of use, and where it is now. Blockchain is all about provenance.
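The "cannot be altered retroactively" property comes from chaining cryptographic hashes: each block commits to the hash of its predecessor. A minimal sketch (illustrative only; a real blockchain adds consensus, signatures and proof-of-work):

```python
import hashlib
import json

# Minimal hash chain: each block records the hash of the previous block,
# so altering any past entry breaks every later link.
def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append(chain, record):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "record": record})

chain = []
append(chain, "diamond 123 cut and certified")
append(chain, "diamond 123 sold at auction")

def valid(chain):
    """Every block must reference the hash of the block before it."""
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

print(valid(chain))                    # True
chain[0]["record"] = "tampered"        # retroactive alteration...
print(valid(chain))                    # False: the chain exposes it
```

Distributing copies of this ledger across many computers is what makes the tampering not just detectable but practically impossible to hide.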


While blockchain was originally about cryptocurrencies and financial transactions, I am now hearing about new applications of this technology in different fields almost daily, from startups as well as established companies.


Everledger is a London startup that is using blockchain technology as a platform for provenance and for combating insurance fraud in the selling and trading of diamonds. Instead of relying on paper receipts or certificates that can easily be lost or tampered with, blockchain provides an electronic distributed ledger that is immutable. The chain can also be traced back to the creation of a particular diamond to certify that it is not a blood diamond (a diamond mined in a war-torn area and traded illegally to finance conflict there). Everledger plans to extend this to other luxury items.



Bosch, a 130-year-old German company, has created a system that uses an in-car connector to regularly send a vehicle's mileage to its "digital logbook," a blockchain-based system that stores the mileage in an unalterable way. If the odometer is suspected to have been tampered with, its mileage can be checked against the mileage recorded in Bosch's system via a smartphone app. A car owner could log their mileage on the blockchain and, when they sell the vehicle, receive a certificate of accuracy from Bosch that confirms the veracity of the car's mileage. Carfax reports that in Germany, police have estimated that roughly every third car has been subject to odometer fraud, causing over €6 billion in damage per year. In addition to defrauding used-car buyers and insurance companies, underreported mileage can result in unexpected failures due to inadequate maintenance. Bosch can prove the provenance of your car's mileage.


The provenance that blockchain provides is an immutable record of when something was created, its history, and its current state. Perhaps we can even use blockchain to eliminate "fake news".