
To the list of the world’s greatest inventions, such as the wheel, the compass, the steam engine, concrete, the automobile, railways and the airplane, add the 21st century’s offering: the Internet of Things, or IoT. Gartner estimates the total economic value-add from IoT will reach US$1.9 trillion worldwide by 2020. Other reports claim that close to half the companies in sectors like oil, gas and manufacturing are already using instrumented devices capable of providing valuable data.


The major benefit of IoT is the ability to measure, monitor and manage any asset in any location, from anywhere, at any time. However, companies implementing IoT face some real challenges. At its heart, IoT involves dealing with data streams from a wide variety of sensors. For example, a regular automobile has roughly 30,000 parts, and the new generation of autonomous cars will likely have an even higher number of smarter parts. These parts, either by themselves or through connected instrumentation, generate about a gigabyte of data per second in operation; that is 3,600 GB of data per hour. All this data must be taken through an intelligent lifecycle, from capture to archive, and used all along the way to support data-driven decision making.
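To put those rates in perspective, a quick back-of-the-envelope calculation helps. The 1 GB/s figure is the estimate above; the fleet size and operating hours below are purely illustrative assumptions:

```python
# Back-of-the-envelope sizing for connected-vehicle data volumes.
# 1 GB/s per vehicle is the article's estimate; fleet size and
# daily operating hours are hypothetical.
GB_PER_SECOND = 1          # raw sensor output per vehicle
vehicles = 100             # illustrative fleet size
hours_per_day = 8          # hours each vehicle operates per day

gb_per_hour = GB_PER_SECOND * 3600
daily_fleet_tb = vehicles * hours_per_day * gb_per_hour / 1024

print(f"Per vehicle: {gb_per_hour} GB/hour")
print(f"Fleet of {vehicles}: {daily_fleet_tb:,.1f} TB/day")
```

Even a modest fleet lands in the petabyte range within a year, which is why the capture-to-archive lifecycle has to be designed up front rather than bolted on later.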

How do you make sense of this? Where do you begin? How do you create an environment that learns by itself whether to use or discard the insights generated from this data? Is it about accuracy, precision or both? What impact does it have on real-time operations and decision making?

The primary challenge is handling the massive amounts of data. This includes securing the data and the network, as well as the analytics required to derive usable business intelligence. IoT technologies have to support this entire process: sensing, transformation, networking, analysis and action. Even in a homogeneous environment dealing with assets of a single class or type, this would be a scalability challenge. With the number of asset types, and the number of assets of each class, multiplying, complexity is compounded across the layers from edge to action. There is little interoperability among these asset types, and the whole technology landscape has become crowded with siloed solutions and products.


Apart from this, an IoT implementation faces issues like:

  • Ubiquitous connectivity: Yes, it is 2017, but network outages do occur even in the most advanced nations, putting the very concept of in-stream data analytics at risk.
  • Interoperability: The number of different systems connected through IoT continues to create interoperability challenges.
  • Competing standards: Different IoT vendors are pushing their own standards, creating a veritable Tower of Babel. This will take time to sort out.


A successful IoT implementation needs to address three issues:

  • Complexity: IoT is not a one-size-fits-all kind of solution. The key is to find a scalable platform that integrates all aspects of the business to ensure seamless information flow across the enterprise.
  • Data Usage: Sensors will throw off a huge amount of data, and there needs to be data sanity across all layers of implementation. The overall platform architecture needs a robust data management and distributed analytics framework to create actionable insights where they matter.
  • Security: Hackers and the loss of sensitive data create serious risk and can interrupt operations. It is therefore a business imperative to have a solution that identifies each and every asset, locks it down at every tier using authentication and authorization, and enables logging, blacklisting and encryption of data at rest and in transit.
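As a concrete illustration of the "identify each and every asset" point, here is a minimal sketch of per-device message authentication using an HMAC. The device ID and key are invented for the example; a production platform would add key provisioning and rotation, TLS for data in transit, and encryption at rest:

```python
import hmac
import hashlib

# Hypothetical per-device secret keys, provisioned at manufacture.
DEVICE_KEYS = {"sensor-042": b"example-secret-key"}

def sign(device_id: str, payload: bytes) -> str:
    """Device side: sign a sensor reading with the device's secret key."""
    return hmac.new(DEVICE_KEYS[device_id], payload, hashlib.sha256).hexdigest()

def verify(device_id: str, payload: bytes, signature: str) -> bool:
    """Platform side: accept only readings from known, authenticated devices."""
    key = DEVICE_KEYS.get(device_id)
    if key is None:                      # unknown device: reject (blacklisting hook)
        return False
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)  # constant-time compare

reading = b'{"temp_c": 71.3}'
tag = sign("sensor-042", reading)
print(verify("sensor-042", reading, tag))             # accepted
print(verify("sensor-042", b'{"temp_c": 99}', tag))   # rejected: tampered payload
```

The same pattern scales to every tier: each hop re-verifies that the message came from an asset it knows, and anything unverifiable is logged and dropped.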


The benefits that accrue from a smart enterprise far outweigh the risks and complexities involved in its implementation. Customers have a real need to drive new product innovation, eliminate waste in their processes, improve product quality, reduce operational costs and create new consumption-led business models.


So the bottom line really is not whether you need to make your business smarter, but how soon you can do so. Delaying the process simply hands your competition an unassailable advantage.

42! The answer to life, the universe and EVERYTHING!

If only everything were as easy...


IoT, Industrie 4.0, predictive maintenance and the use of data across different areas of industry are big topics nowadays.


In our webinar series, The Hitchhikers Guide to IoT, we give you insights into the world of IoT and how Hitachi can help you focus on the right solution and learn from best practices.


Webinars in the series:

  • IoT is everywhere, but DON’T PANIC. 6 Considerations to make them successful. (Christian Dornacher, Director, Storage & Analytics Solutions EMEA)
  • IoT, Big Data & Moving from Data to Knowledge (Wael Elrifai, Sr. Director of Sales Engineering, EMEA & APAC, Pentaho, a Hitachi Group Company)


Related Blogs:

  • IoT – Tools and Technologies to build the IoT Stack (Christian Dornacher, Director, Storage & Analytics Solutions EMEA)
  • IoT – Critical Success Factors (Christian Dornacher, Director, Storage & Analytics Solutions EMEA)

Upcoming webinars:

  • Industrial IoT - Digitization of Manufacturing: from Theory to Practice (Greg Kinsey, Vice President at Hitachi Insight Group)
  • Digitization of Automotive Manufacturing, November 9th, 2017, 11:00 am CEST (Greg Kinsey, Vice President at Hitachi Insight Group)

On July 13, 2016, Hitachi and Daicel announced their collaborative efforts in the development of an image analysis system to detect signs of facility failures and deviations in front-line worker activities.

Daicel, a chemical company based in Japan, creates high performance chemicals and engineering plastics such as automotive airbag inflators. In recent years, mega-recalls in various industries (specifically in the automotive industry regarding part failures) have brought a renewed interest in accumulating and managing manufacturing performance data. Manufacturing performance data is essentially a gold mine - it can be used to identify the causes of product defects and implement countermeasures. Prior to Hitachi's involvement, Daicel was unable to identify when or where production issues occurred within their manufacturing plants.

Together, Hitachi and Daicel developed an image analysis system that uses depth cameras to extract 3D forms and measure worker activities. Hitachi stressed the importance of gathering a wide range of work-related performance data, including manufacturing performance and inspection data and the results of visual checks by workers. The new system detects operational failures in production facilities and deviations in worker activities on the front lines of manufacturing. Additionally, new manufacturing execution systems (MES) that incorporate Hitachi's IoT technologies, including advanced analytics, were crucial for Daicel to implement, especially in regard to automotive airbag inflator production processes.


What Are The Expected Results?

  1. The image analysis technology is expected to improve quality and productivity.
  2. The solution is expected to dramatically improve the time to discovery of machine and material defects, which will reduce the overall number of product recalls.
  3. By using the obtained image data, on-site management supervisors will shift their focus to monitoring trends and taking preventive measures, so that outages and failures are averted before they occur.
  4. Hitachi and Daicel will begin operations of this solution at the Harima Plant in FY2016 and plan to promote the rollout to six of Daicel's main overseas plants in the coming fiscal year.


Learn more about how Hitachi Insight Group can help you lead your industry by optimizing your enterprise.

It’s been about 4 months since I transitioned from Business Applications solutions marketing to IoT marketing within the newly formed Hitachi Insight Group. In this role, I am now focused on energy, specifically microgrids, which is a new and exciting field for me. As I’ve learned in these last few months, microgrids are key to achieving energy resilience, as Jeremy Deaton writes in his article, Here's why the lights stayed on at NYU while the rest of Lower Manhattan went dark during Hurricane Sandy. Jeremy Deaton (@deaton_jeremy) explained how microgrids provide cleaner power. He also stated that microgrids can lower energy costs, and I’d like to dive deeper into that topic with this blog post.


Historically, microgrids have often cost more than traditional power. As such, they were more attractive for niche applications with special needs, such as hospitals and remote military bases. Today, microgrid costs, particularly for solar photovoltaic (PV) systems, are coming down, which is helping microgrids enter the mainstream of U.S. power supply.


Hitachi, for its part, is focused on three things that enable energy resilience as well as lower costs:

  • Energy-first design
  • Microgrid controls that use IoT technology
  • One-stop shop product and services, including finance

Our energy-first design approach has enabled us to deliver microgrids, sized between 1.5MW and 40MW, at a cost equal to and sometimes lower than current energy prices. This is important in parts of the United States where electricity costs are high, such as the East and West coasts and Hawaii. This design approach takes into account customers’ energy needs first and foremost. Hitachi microgrids are designed to use the strength of the entire portfolio. And since we aren’t pushing any particular product or system, we are able to use best-of-breed components best suited to our customers’ requirements. For example, we’ll take a combined heat and power unit or turbine, combine it with the variable load shape of a solar PV array that closely matches the load curve of a building or community, and tie that together with energy storage. All of these are actively managed by intelligent microgrid controls.


The microgrid controls monitor the equipment and incorporate sensor, weather and other types of data, enabling us to take advantage of the Internet of Things (IoT) and advanced analytics to fine-tune energy supply every moment of the day. We can also monitor operations 24/7, and if the data shows a particular trend that could affect one of the energy resources, we can get our maintenance team to go and explore the issue before it causes a major disruption. In the end, we optimize resources and increase cost savings over time.
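The kind of trend check described above can be sketched very simply: compare a recent window of readings against a longer baseline and flag the resource for inspection when the drift exceeds a threshold. The window sizes and 10% threshold below are illustrative assumptions, not Hitachi's actual control logic:

```python
from statistics import mean

def needs_inspection(readings, recent_window=5, drift_threshold=0.10):
    """Flag an energy resource when its recent output drifts from baseline."""
    if len(readings) < 2 * recent_window:
        return False                      # not enough history yet
    baseline = mean(readings[:-recent_window])   # older readings
    recent = mean(readings[-recent_window:])     # latest window
    return abs(recent - baseline) / baseline > drift_threshold

# Inverter output (kW): one steady series, one with a gradual decline.
steady = [100.0] * 10
declining = steady + [95.0, 90.0, 85.0, 80.0, 75.0]

print(needs_inspection(steady))     # no drift: leave it running
print(needs_inspection(declining))  # 15% drift: dispatch maintenance
```

A real controller would use many more signals (weather, load forecasts, vibration), but the principle is the same: act on the trend before it becomes a disruption.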


Hitachi, which recently announced its North America Energy Solutions Division, has been in the energy business for decades and thus has a thorough understanding of the market. The North America team takes a unique approach to the overall economics of a microgrid project. Since microgrids use a number of energy resources (check out this video to see how microgrids work), designing, building, operating and managing them could be a very costly proposition. Many vendors offer a piece of the value chain, whether it be the microgrid components or the construction, operations and maintenance. Very few can deliver on the full microgrid lifecycle. Hitachi, on the other hand, can deliver microgrids “soup to nuts,” from feasibility studies and solution design all the way to construction, operations and maintenance, oftentimes surprising customers who are accustomed to dealing with multiple vendors. And through Hitachi Capital, as well as partnerships with other banks, Hitachi can offer financing, eliminating a common roadblock for projects. Hitachi uses a power purchase agreement (PPA) model, which means that customers don’t have to pay any upfront costs. Hitachi and its financing partner create a special purpose entity for ownership and operation of the microgrid, and Hitachi gets paid back over a 10- to 20-year period.


As the number of power outages in the US continues to rise due to weather-related incidents, microgrids are a solution for “keeping the lights on.” Not only can they provide resiliency, but also cleaner power at a price that doesn’t cost "an arm and a leg."


Are you considering microgrids for your community?  I’d love to get your comments.


Remember analogue and disparate surveillance systems? These are rapidly being transformed into significant enterprise security platforms.

Modern deployments look more like a standard ICT roll-out than a traditional CCTV solution. Customers regularly ask for both expertise in the security domain and a bullet-proof platform to support their implementations.

We are repeatedly asked to meet the following requirements:

  • Support ALL video management solutions (VMS), video content analytics (VCA), access control, monitoring tools and any other applications needed to deliver a security platform
  • Provide video durability, i.e. no dropped frames, no video loss, 100% low-latency recall
  • Scale capacity from small rollouts to 10,000+ cameras
  • Support petabytes of storage
  • Support significantly longer retention periods and long-term video storage options
  • Deliver high reliability, i.e. a platform that does not break
  • Require negligible maintenance from the customer
  • Offer low-latency, high-performance systems
  • Support analytics layers, including data acquisition and visualisation


In addition to these essential requirements, platforms must remain commercially competitive.


Hitachi Design Goals

Our global and regional teams have hundreds of years of combined experience rolling out solutions at the site, suburb, city, region and even small-country level. Best-practice site inspection, camera selection, VMS and video analytics selection, network design and ICT architecture form part of the discussion in every security implementation.


Some of the key design considerations include:


  • Appliances that meet industry standards, but also can be highly and elegantly customized.
  • 'Out of the box' fault-tolerant platforms that require minimal (if any) user maintenance and are always on.
  • 'Beyond Paper' certification programs that ensure VMS, VCA, access control and other solutions are guaranteed to perform well on our platform.
  • Straight-forward industry best practice architectural design from small to very large deployments
  • No single point of failure in architecture across Power, Storage, SAN, Compute  and Switching.
  • Designed and governed by our highly-specialised global and regional security engineering teams, but able to be installed, configured and maintained by every local Hitachi Customer Service and Support team.
  • Commercially effective video tiers for economic long term retention.

Under the Hood

Here are the basic specifications of our "VMP 150", our entry-point appliance. Others include the VMP 500 and VMP 1000 (which I will introduce in future blogs).

We can design this specifically for your camera types, applications, site topology, retention and growth.

Our compute is virtualised with VMware, and most architectures are laid out as virtual machines.

The architecture can readily include other solutions (including standard ICT) as required.

Our 'Hi-Track' system automatically monitors the solution and raises tickets on hardware issues such as disk replacements. As a result, system outages are very rare (if they occur at all), because small problems are fixed before they become big issues.

Hardware Maintenance is done by Hitachi using (always available) Hitachi spares.

Disk failures are very rare. Our controller management system monitors the disk pools and 'invalidates' disks when they start logging errors. Spare disks are then automatically activated.

Buying Questions for Vendors

Can you provide a single solution for ALL security applications?

How will you scale for 20% or 200% more cameras?

Do you know how our Cameras, VMS and other solutions will perform on your platform?

How will you architect for 100% up-time during regular operation, support and upgrades?

Can you provide 100% fault tolerance, backup and continuity in a multi-site implementation?

Can you integrate your solution into our greater ICT platform and/or extend it to become our entire platform?

Are you cloud ready?

Do you have strong experience in Public Safety, Security and Digital Transformation?


With its 100+ years of heritage in building core infrastructure for society, Hitachi can provide comprehensive solutions for the security ecosystem.


The Internet of Things, or IoT, has become a rapidly growing topic of conversation, with the potential to impact how we live our day-to-day lives.


The question becomes "what exactly is IoT and how exactly can this transform the way we live our lives?" Well, for those of you who are relatively unfamiliar with one of the tech industry’s most popular buzzwords, I’m here to make that knowledge transfer a little easier.

Wikipedia defines IoT as “the network of physical objects—devices, vehicles, buildings and other items—embedded with electronics, software, sensors, and network connectivity that enables these objects to collect and exchange data.”


While that’s a whole lot to digest, let’s think of it in simpler terms. Essentially IoT is the concept of connecting any device (with an on and off switch) to the Internet.  Consequently, this provides an ever-expanding universe of connected devices and data from consumer and home products, to industrial machines, to city infrastructure... the list goes on.


Industry Buzzword or Business Value?

Imagine a world where you have a doctor’s appointment right in the middle of rush hour, at 8 am sharp. Unfortunately, you’re running late and know you won’t make the appointment on time with your usual route. With IoT, your automobile could access your calendar and direct you to the quickest route so you can make your 8 am appointment. Now let’s take an industrial scenario. You own a trucking company that transports perishable goods from point A to point B, which oftentimes leads to spoilage or contamination because temperature in these critical environments is monitored manually. With sensors deployed on the carriers, real-time remote monitoring and reporting systems enable food safety and operations directors to ensure product safety, consistency and a superior overall experience. These are the business-value applications that IoT can provide!
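The cold-chain example above boils down to a simple rule evaluated continuously against streaming readings. Here is a minimal sketch; the temperature limits and the sensor payload shape are assumptions for illustration:

```python
# Safe holding range for a hypothetical refrigerated trailer (degrees C).
SAFE_LOW, SAFE_HIGH = 0.0, 4.0

def check_reading(reading):
    """Return an alert dict for an out-of-range reading, else None."""
    temp = reading["temp_c"]
    if temp < SAFE_LOW or temp > SAFE_HIGH:
        return {"trailer": reading["trailer"], "temp_c": temp,
                "action": "dispatch inspection / reroute load"}
    return None

# A slice of the incoming sensor stream (illustrative values).
stream = [
    {"trailer": "T-17", "temp_c": 3.2},
    {"trailer": "T-17", "temp_c": 6.8},   # compressor failing?
]
alerts = [a for r in stream if (a := check_reading(r)) is not None]
print(alerts)  # one alert, for the 6.8 C reading
```

In production the rule would run against a message broker rather than a list, but the value is the same: the out-of-range reading triggers action before the load spoils, instead of being discovered at the destination.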


Hitachi Insight Group is Delivering IoT Business Value

The recently announced Hitachi Insight Group unites many Hitachi Group companies to accelerate Hitachi's global Internet of Things business. With that announcement, we also launched Lumada, our core IoT platform, along with four key market segments: Smart Cities, Smart Energy, Smart Healthcare and Smart Industry. Smart Industry, also known as Industrial IoT (IIoT), is focused on optimizing efficiency and reducing downtime by bringing OT (operational technology) and IT (information technology) together. Hitachi Insight has implemented Industrial IoT solutions that help customers dramatically improve operational efficiency. For example, factories increase operational capabilities and overall efficiency by 5-10% with our Optimized Factory solution. Manufacturers manage and monitor chillers with our Predictive Maintenance solution, reducing the energy costs that result from time-wasting maintenance. We even grow food more efficiently! Farmers reduce CO2 emissions by approximately 33% with our IoT-based solution using sensors, cameras and external data.


Watch our Industrial IoT Overview Video to learn about the many Industrial IoT applications Hitachi Insight Group is focused on today.

Early in my days as a software developer I had the extreme pleasure of working with some of the toughest maintenance guys in the world, on trucks that were almost three stories high, running nonstop seven days a week, in a mining community in the very dry and dusty outback of South Australia. It was during my time with these guys that I developed an appreciation for the surgical precision involved in running a mine. It looks like a lot of heavy, machine-driven industrial labor, but what I learned is that everything is orchestrated down to a finely tuned process, and any unforeseen downtime can result in millions in lost revenue. Let me come back to why that is important later.


I find the entire conversation around IoT fascinating. When I built software for enterprise customers in the past, it usually dealt with records of information: perhaps a record-tracking system for purchases, account information, insurance claims and so on. Fast-forward to today, and we aren't capturing single records in disparate systems; we are talking about capturing thousands, even millions, of data points from perhaps a single object, and analyzing the collected information in real time to provide insights that lead people or organizations to take action.


Engineering the sort of system that enables you to capture and analyze all that information is a feat in itself, but so are the skillsets needed to turn the analysis into actionable information.


Enter Lumada (and yes, it took me a few tries to say it after mispronouncing it as Lambada): an open, adaptable, verified and secure IoT platform designed to help customers accelerate digital transformation in their organizations.


Let me quickly highlight what those terms mean:

Open: Built using a combination of open source and proprietary technologies providing integration points for the partners in Hitachi's open IoT ecosystem including PTC, SAP and Microsoft.

Adaptable: The Lumada platform is not tied to any specific IoT domain, it can be leveraged across many domains including manufacturing, operations, marketing etc.

Verified & Secure: With Hitachi's experience in the manufacturing of "things" such as train systems, medical equipment and construction machinery, we are already able to verify how these work in real environments today, in areas like manufacturing and predictive maintenance. From a security point of view, Lumada already incorporates secure data collection techniques to help protect data for public and commercial applications.




The "Predictive Maintenance" use case is one that has me very excited based on the experience I shared with you at the beginning of the article.


Now, this mining operation had to ensure regular scheduled maintenance on all of its unit rigs so that nothing could go wrong. So regardless of whether a truck actually needed work, you would always have to bring it in at the predetermined time to run through a check of all the components and make sure it was running in tip-top shape for maximum efficiency.


Imagine that same mining operation with predictive maintenance applied to it. In this scenario you could alter your maintenance schedule to bring in only the trucks that really need maintenance, because the systems in place would tell you in real time how the trucks are performing. Not only would this reduce planned downtime, but the same system could detect emerging faults to help reduce unplanned downtime too. What does this mean for the mining operation? Simple: trucks are out in the mine longer, and when every trailer load of minerals counts, this is the sort of application of IoT that can save you millions of dollars.
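The scheduling change described above, from fixed intervals to condition-based triggers, can be sketched in a few lines. The health scores and the 0.6 threshold are invented for illustration; a real system would derive them from vibration, oil analysis and engine telemetry:

```python
def trucks_to_service(fleet_health, threshold=0.6):
    """Condition-based scheduling: pull in only trucks whose health has degraded."""
    return sorted(t for t, score in fleet_health.items() if score < threshold)

# Hypothetical health scores (1.0 = like new) from real-time telemetry.
fleet_health = {"haul-01": 0.92, "haul-02": 0.55,
                "haul-03": 0.81, "haul-04": 0.40}

print(trucks_to_service(fleet_health))
# A fixed schedule would pull in all four trucks regardless of condition.
```

With the illustrative scores above, only two of the four trucks come in for service; the other two stay on the haul road, which is exactly where the revenue is.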


IoT has some amazing applications, and I am excited that with Lumada, Hitachi is playing a leading role with practical implementations to help all industries.



Millennials are of course the driving force behind the IoT revolution. This is largely because many millennials are incapable of “shutting things off,” which drives connectivity, the core of IoT. The thing is, millennials generally don’t see a problem with being connected 24/7.



“Roughly 80 percent say they reach for their smartphone first thing in the morning, while 78 percent spend more than two hours each day texting, talking, tweeting, and surfing the web. They’re choosing – either by peer pressure or perceived need – to stay connected.” – Larry Alton, DZone

In my relatively short time in the technology industry I have identified several trends that I feel will take off in 2016. While my main focus is the machine-to-machine (M2M) and Internet of Things (IoT) space, I’m also interested in other technology advancements that will certainly improve the lives of millennials across the globe.


Trend #1: The adoption of cloud, particularly the public cloud, is a big trend across many industries. Even traditionally cautious sectors like banking and finance are jumping on the bandwagon. Many millennials see cloud as an opportunity for increased productivity and reduced costs, and they are generally at the front end of innovation for SaaS (software as a service), PaaS (platform as a service) and IaaS (infrastructure as a service) delivery methods. On the other side of the spectrum are baby boomers, many of whom have yet to fully embrace the cloud effect. Learn more about our cloud-based analytics solution, Hitachi Live Insight for IT Operations, here.




Trend #2: Non-traditional business models continue to emerge with many organizations looking for the next untapped value source to follow the leaders in this trend like Airbnb, Uber, Lyft and Alibaba. With the rise of non-traditional companies, we might see “friendship as a service” surface soon. This is guaranteed to be the most personal service yet – human connection at the touch of a button, which is likely to be a hit with the millennial generation. Read more here.




Trend #3: As the IoT craze heightens, the “security of things” takes center stage. With all this interconnectivity come inherent risks that organizations will need to address. The problem with millennials and hacking is that many of them are unaware of the impact. In a survey of nearly 4,000 young adults worldwide, commissioned by Raytheon and the National Cyber Security Alliance (NCSA), the results show a mix of misplaced confidence and startling ignorance among the 18- to 26-year-old respondents when it comes to their online security.




Trend #4: Workplaces are beginning to utilize social media internally. While in the past many companies blocked access to some of the most popular social sites, these social tools can connect us even more in the workplace today. An example of this is Facebook at Work, which allows a single sign-in so that users are connected to a work-facing Facebook as well as their personal Facebook at the same time. Millennials are sure to welcome this trend: "From the 839 survey responses, researchers gleaned that Millennials are 'always on.' The group spends an average of 17.8 hours a day with media.” I'm one of the many millennials who are constantly "connected," and I feel that workplaces utilizing social media will help shape company culture and allow employees to build and maintain relationships with colleagues.




Trend #5: Fashion lines are expanding their brands into the wearable tech space. This is where fashion meets function. Hey, everyone wants a piece of the pie! Many have already launched lines, like Tory Burch for Fitbit and Ralph Lauren’s Salvo, a sports shirt that monitors heartbeat, respiration and stress levels. Of course, there is also the Apple Watch. Just launched this February, the Hermes partnership with Apple expands Apple’s luxury line. Lavish wearable tech? While I personally haven't been interested in the Apple Watch, THIS is a game changer. We're going to start seeing more brand collaborations like Apple and Hermes, which is sure to appeal to millennials: the more options, the better.




Do you agree with my 2016 trends?

Let me know in the comments below!


Wow. “You won’t recognize the internet in five years,” says a recently sponsored Quartz article. The statement takes a reasonable premise to a logical conclusion. Simply put, when we connect the data-generating devices from our daily lives (like home thermostats, connected appliances, and fitness trackers) via the Internet of Things (IoT) and intelligently manage the information onslaught, everything will change. Will I really not recognize the Internet in 2020? Hm. It’s a bold claim, but the more I think about it, the more I believe they may be right.


The broad concept may not be new but they make two interesting points in the article worth thinking more about. First, that the integration of information technology (IT) and operational technology (OT) will fuel an Internet of Things that matter.  Many of the things that have a daily impact on our lives, like energy, healthcare, and transportation, are continually generating data in silos without connection and integration to IT systems.  Uniting IT and OT enables all the potential of IoT around things that matter to everyone, everyday.  If you want to hear more on this, you can watch a short video on “The Internet of Things That Matter.”


The second thought-provoking concept is that by connecting isolated devices, IoT is prompting the integration of the physical and digital worlds, with analytics as the key enabler. After all, consider the billions of sensors, mobile devices, smart machines, and actuators: how would we make sense of and combine all the data generated by connecting these things if we didn’t have analytics? What’s compelling here is that I don’t think we can comprehend just how integrated our daily physical lives will be with a digital existence.


They further make the claim that advantages are being driven by the shift from historical analysis to predictive analytics, although I might contend that the historical information is really subsumed for predictive use. Again, it’s all about merging and integrating different elements.  


I know Hitachi has been driving solutions in our Social Innovation business for governments and business in many industries ranging from smart city and public safety to energy and machine-to-machine analytics. I’m wondering if my colleagues Sara Gardner (@IoTThatMatters) or David Parsons (@D_A_Parsons) have any real-world examples they could share?


And what do you think: If we could jump ahead 5 years, do you think you’d recognize the Internet of 2020? What do you think the major changes will look like?


Many (over-)hyped phrases take on a life of their own. After a while they no longer reflect the original intent, or the really important part. I’ve long believed “Big Data” has run into this problem; in fact, it had the problem from the beginning.


Just this week, this popped up again and I decided I had to write a blog about it, as I've been thinking about this for some time. It’s a bit long, so please bear with me.


“Big” is a relative term

There’s no question that data sets are big and getting bigger, so from that point of view “Big” is certainly a central theme. But “Big” is a relative term. What’s “big” for someone in Healthcare is different from what it is for someone in Oil & Gas, Banking, Telecom, or Web services. When you think about it that way, the term “Big” can mean very different things in different contexts.


Different use cases have different definitions of what "Big" is, even within an industry. The total data set for a marine oil field is in the PB range, and that’s big by almost any definition. However, a big data set for optimizing production at a land-based oil field is maybe “only” a few hundred terabytes. Both use cases are in Oil & Gas, but they are also very different. You can find similar examples in almost any industry.


What’s “Big” for one person just isn’t for another...



Enough about that. So what’s so important about Big Data, then?


The Value of Big Data

Let’s look at what you actually do with the data. What value you get out of it and what’s different with “Big Data” compared to before?


The way you access and analyze these data sets varies a great deal. For some, the whole data set is accessed only very rarely, but it needs to be accessible within “reasonable” times to verify conditions, re-assess decisions etc.; while other data sets need to constantly be available for guiding operations or detecting anomalies within minutes, or seconds or less. This in turn drives decisions on how to store the data set and what architecture to use for analyzing the data.


So far I haven’t described anything that really is new with “Big Data.” The sheer size of it and how quickly you can move it is nothing new for customers in industries such as Life Sciences, Oil & Gas, Product Design or Financial Services. They have all been using techniques originally developed for HPC to handle that aspect of it.


Meeting the different requirements for accessing and analyzing the data the classical way is also not really something new. It’s a “solved problem” and you can scale this if you throw enough money at it. And price/performance improving at the rate of Moore’s law has a tendency to fix most cost issues over time – if the problem to be solved stays the same.


How about cost?


However, you can also develop more cost-efficient solutions better suited to cases where you understand up front what the data is and what type of analysis will be done. Doing so is much more efficient and costs less than a brute-force approach like classical HPC. That’s what Google did, which eventually evolved into what we now loosely refer to as a Hadoop-type of analysis. Facebook is taking a similar clean-sheet approach with the Open Compute Project. We can all benefit from this today.
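To make the Hadoop-type idea concrete, here is a minimal in-process sketch of the map/shuffle/reduce pattern those systems are built around (a toy illustration, not the Hadoop API itself - real frameworks distribute these same three steps across many machines):

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Run mapper over each record, group emitted values by key,
    then apply reducer to each key's group."""
    groups = defaultdict(list)
    for record in records:          # "map" phase
        for key, value in mapper(record):
            groups[key].append(value)
    # "shuffle" is the grouping above; "reduce" phase below
    return {key: reducer(key, values) for key, values in groups.items()}

# The classic example: word counts across a set of documents.
docs = ["big data", "big deal", "data lake"]
counts = map_reduce(
    docs,
    mapper=lambda doc: [(word, 1) for word in doc.split()],
    reducer=lambda word, ones: sum(ones),
)
# counts -> {"big": 2, "data": 2, "deal": 1, "lake": 1}
```

The cost advantage comes from this restricted shape: because all coordination happens at the key-grouping step, the work parallelizes over cheap commodity nodes instead of requiring an HPC-class shared infrastructure.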


That’s all good, but I don’t think that even this approaches the true value of “Big Data.”


Industry analysts have been using labels such as “Volume, Velocity and Variety” when explaining Big Data. I’ve touched on volume (the “Big”) above, and velocity in terms of how quickly you expect to access the data sets for different needs.


So what’s left?


That leaves variety. This is what is really important for big data, and what sets it apart (in my humble opinion).


By being able to access a wide variety of data types from a wide range of data sources - more or less in real time - you’re now able to analyze that data and find new connections between disparate data points that were simply impossible (or at least extremely impractical) to find in the past.


This is where I think the true value of “Big Data” lies today.



This is also the area where the pace of change is accelerating, by combining the technologies of “Big Data” with the Internet of Things (as a source of a rich variety of data). The variety of data is essentially exploding. You could also say that the volume and velocity are increasing due to the rapid increase in the number of deployed devices. However, I still contend that variety is the core of what “Big Data” brings.


[For further reading for those interested, see “Accelerating Change”]


Look at Healthcare, for example. This area has long produced lots of data, but it’s been in silos and often locked into different application-specific formats. Trying to correlate data has been an insurmountable task: expect months or years, when you instead need the answer right away – not really practical.


To get to the point where you can perform analyses of data coming from various modalities, plus historical data, plus environmental and demographic data, plus insurance data, etc. – you first have to get it into one place or at least be able to access it all. Call it a repository or a data lake if you will.


At that point, the really interesting work can start, by answering questions such as “Which part of the population is most at risk for X?” With big data analytics, you can answer this by combining data from the results of different medical tests, as well as information about where they live, how old they are, what medicine they take, what they’ve been exposed to in the past, their genomic profile, etc., etc.
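The core move here is a join across formerly siloed sources on a shared identifier, then a query over the combined records. A minimal sketch (all field names, values, and thresholds are invented for illustration - not clinical guidance):

```python
# Hypothetical records from two silos, joined on a shared patient id.
test_results = [
    {"patient_id": 1, "hba1c": 7.9},   # a blood-sugar test result
    {"patient_id": 2, "hba1c": 5.4},
]
demographics = [
    {"patient_id": 1, "age": 67, "region": "north"},
    {"patient_id": 2, "age": 34, "region": "south"},
]

# Index one source by the join key for fast lookup.
by_id = {d["patient_id"]: d for d in demographics}

# Flag patients whose *combined* profile suggests elevated risk -
# a question neither silo could answer on its own.
at_risk = [
    {**r, **by_id[r["patient_id"]]}
    for r in test_results
    if r["hba1c"] > 6.5 and by_id[r["patient_id"]]["age"] > 60
]
# at_risk -> one merged record, for patient 1
```

In practice the hard part isn’t this query - it’s getting the silos into one accessible place (the repository or data lake) and reconciling identifiers and formats so a join like this is even possible.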



With such knowledge, healthcare professionals could make proactive decisions at a completely new level to help their patients. For example, these insights can help people manage diabetes to avoid some of the consequences of that disease, or help people protect their eyesight by avoiding certain things that trigger macular degeneration, or proactively do more of other things to minimize the risks.


Now let’s look at telecommunications, where you can start to combine the information you have about your network with data coming from external sources, such as weather forecasts and social media, to determine how to configure your network to accommodate anticipated changes in traffic patterns. Or a smart city environment, where you combine staffing levels, equipment status and external factors to pre-position resources to handle that incoming snowstorm, or to manage traffic and cleanup around an unexpectedly well-attended event.
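At its simplest, this kind of pre-positioning is a function from external signals to a provisioning decision. A toy sketch of the idea (the signal names and multipliers are made up for illustration - a real system would learn these from historical data):

```python
def provisioned_capacity(base_capacity, weather, expected_crowd):
    """Scale capacity to pre-position for external conditions.
    Multipliers are illustrative placeholders, not tuned values."""
    factor = 1.0
    if weather == "snowstorm":
        factor += 0.5    # severe weather drives call/data spikes
    if expected_crowd > 10_000:
        factor += 0.25   # a large event concentrates local traffic
    return int(base_capacity * factor)

# A snowstorm coinciding with a 25,000-person event:
print(provisioned_capacity(1000, "snowstorm", 25_000))  # 1750
```

The value over the old approach is that the inputs come from outside the network itself - a weather feed and event chatter on social media - so the reconfiguration happens before the traffic arrives rather than in reaction to it.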


This is the interesting area where Big Data analytics on a variety of data sources really can make a difference. By providing much deeper insights, you can enable higher efficiencies, or avoid potential problems. This is the fourth “V” that has been assigned to Big Data: Value.


And that is “kind of a big deal” - to quote Ron Burgundy (Anchorman).






According to a recent IDG Research Services survey, the idea of “Social Innovation” — using technology to improve not only revenues but also society — is gaining significant traction across multiple industries, from health care and financial services to telecommunications, high tech, and the public sector. A strong majority of respondents (80%) agree that doing more for society is ultimately good for business.


Conversely, over half report they struggle with making business decisions based on this belief — which, I’ll reiterate, they think is ultimately good for business. What?


This is one of the many interesting findings of a survey of 200 IT and business executives, summarized in a whitepaper on Big Data, Big Business and the Greater Good. A one-page summary of these findings also appears in the September issue of CIO Magazine.


Over the next two weeks our Social Innovation team will take a closer look at these results, with examples from the finance, healthcare, telecom, and government sectors. Furthermore, we’ll highlight some of the reported challenges that I believe are related to this conflict around making socially conscious business decisions.


QUICK POLL: Do your organization’s business decisions align with the idea that doing more for society is ultimately good for business?


You can follow this thread or watch for my tweets on this topic @amyhodler.

Australia's CSIRO (Commonwealth Scientific and Industrial Research Organization) has started a research project with Intel and Hitachi to fit tiny sensors to the backs of European honey bees in an effort to discover the cause(s) of Colony Collapse Disorder. They hope that the data collected by the sensors will identify the environmental causes for the declining bee population, which is likely due to multiple stressors including parasitic mites and neonicotinoid insecticides.


10,000 bees have already been instrumented in Tasmania, with more to be added in Sydney and Canberra. This could be one of the most interesting applications of IoT and a harbinger of things to come in highly localized environmental monitoring.



Phil Townsend, one of my colleagues, just forwarded this article about Netflix and data-driven movie/TV show design.

Big Data Goes to the Movies - Experfy Insights

Phil's passion is the car market, and he is working with Hitachi Automotive and Clarion, our telematics businesses, to map out our Social Innovation strategy there. His point: if Netflix can do data-driven show design — that is, select plot, actors, characters, etc. to best match audience desires — then why can't we do that in other markets? In other words, move from data-driven adoption analysis to data-driven design.


Which segues nicely into a great call I had with Matt Aslett and Brian Partridge from the 451 Group this morning. We talked about how the initial biggest gains for companies from IoT are likely to be in the areas of cost savings and increased efficiencies/margins, but the really exciting and transformative opportunities come when IoT enables brand-new business models, likely involving linkages across markets and companies. In other words, from data-driven cost savings to data-driven revenue generation.


I plan to spend some time looking at these hypotheses from a Hitachi perspective over the coming months, but would love to hear from others about examples where they see data-driven adoption analysis leading to data-driven design, and likewise data-driven cost savings leading to data-driven revenue generation.




VP and Chief Technology Officer, 

Social Innovation and Global Industries Solutions

Hitachi Data Systems


Slate just published an interesting article by Mark Jules (our VP for public safety and visualization) on public safety for smart cities.


Article Introduction: "Cities around the world are becoming smarter and more connected. It’s projected that by 2020, 50 billion smart physical objects – everything from oil rigs to refrigerators, cars, and medical devices - will be connected to each other and to humans, generating unprecedented amounts of data. Analyst firm IDC estimates that in the same timeframe, over 42% of all data will come from machines.

The ability to integrate and analyze these massive amounts of data in a way that enables businesses, governments, smart cities and others to anticipate, mitigate, and even prevent many problems they will face is key to delivering positive outcomes ..."  READ MORE


I really like the perspective of thinking about how all the diverse systems will have to work together. And if you think about a total solution as including not only information technology but also operational technology and even human processes, then the challenges of such ecosystems become more apparent. For the people who know more about this than I do: do you think larger public safety systems will evolve in somewhat linear ways, or will they be more organic?