Manufacturing & Logistics IT spoke with a number of the leading analysts and vendors about Big Data, asking questions such as: how should the term be defined, what are the current key discussion points, what are the main benefits to the user and what can we look forward to in the near future with regard to further development?
Within the data and analytics sphere, no term has attracted as much attention and debate as Big Data over the past two or three years. However, before we look to investigate the benefits of what Big Data can offer companies within the manufacturing, logistics and retail sectors, it seems only logical to get as firm a grasp as possible on the meaning, or meanings, of the term, and then develop the discussion from this vantage point.
Dwight Klappich, research VP at Gartner, explains that Gartner defines Big Data as high-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making. "Supply chain organisations have long been awash in data but lacked the tools to exploit this information to rapidly make better informed business decisions," he said. "The emergence of new generations of analytical tools allows SCM organisations to move beyond simple after-the-fact reporting of what happened to using these tools to dig into why things happened and to predict what might happen and what the results will or should be in the future."
Rafael Hernandez, EMEAI Industry Marketing, Honeywell Scanning and Mobility, considers that Big Data can have a very different meaning depending on its purpose and how it is used. "What we call Big Data today might not be considered as such in a year," he reflected. However, he added that, in his view, probably the most common definition of Big Data is the gathering and storing of large chunks of information which can be analysed and repurposed. "This information is very valuable for businesses as it often gives crucial insights into market trends and customer behaviour," he said. "It is especially useful for the logistics and retail industry, often predicting potential customer questions, which in turn enables businesses to provide the correct answer immediately – usually before the question is even asked."
Defined generally, Richa Gupta, senior analyst at VDC Research, comments that Big Data is simply a set of data too large to be stored and managed using traditional processes. She points out that Oracle breaks down Big Data into three general categories: traditional enterprise data, social data, and machine-generated and/or sensor data. Gupta adds that data capture technology contributes to the collection of this third category of data – machine-generated data – especially in retail, logistics, and supply chain environments.
The rise of 'non-traditional' data
However, as Gupta explains, the above generic definition does not identify the specific advantages that Big Data brings to the supply chain. "Big Data is important in the modern connected world as the collection of 'non-traditional' data increases; information vital to accurate business decision-making can be found not only in transactional data collection methods, but even more so in imagers, machine vision, sensors, and scanners," she said.
Gupta added that the machines that produce these non-traditional data sets are now more than ever connected to one another in what is called the Internet of Things (IoT). "The IoT allows for continuous, real-time data transactions between devices such as barcode scanners and mobile computers," she explained. "In many ways, data capture solutions are facilitating the Big Data phenomenon that then helps streamline operational processes.
"For example, FedEx's SenseAware platform combines location solutions (GPS) with temperature readings and real-time notifications if a shipment has been opened or exposed to light. FedEx is able to analyse these data sets in order to rectify inefficiencies that would have been left unidentified had the data not been available.
"The IoT uses AIDC technology to provide continuous real-time data in all parts of the supply chain, facilitating optimisation. When combined with Big Data-driven platforms such as SenseAware, scanners and sensors can be used to assure visibility, traceability and quality control at all steps in the supply chain. These traits contribute to successful short-term and long-term business decisions."
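The kind of rule-based screening such a sensor platform performs can be sketched in a few lines. The field names and thresholds below are invented purely for illustration; this is not FedEx's actual implementation, just a minimal example of turning continuous sensor readings into exception alerts:

```python
# Illustrative sketch: flag shipment sensor events that breach
# (hypothetical) handling thresholds. Field names and limits are
# invented for illustration, not taken from any real platform.
MAX_TEMP_C = 8.0       # e.g. a cold-chain temperature ceiling
LIGHT_LIMIT_LUX = 50   # light exposure suggesting the package was opened

def flag_exceptions(readings):
    """Return a list of (shipment_id, issue) tuples for out-of-range readings."""
    issues = []
    for r in readings:
        if r["temp_c"] > MAX_TEMP_C:
            issues.append((r["shipment_id"], "temperature excursion"))
        if r["light_lux"] > LIGHT_LIMIT_LUX:
            issues.append((r["shipment_id"], "possible unauthorised opening"))
    return issues

readings = [
    {"shipment_id": "S-001", "temp_c": 5.2, "light_lux": 2},
    {"shipment_id": "S-002", "temp_c": 9.1, "light_lux": 3},
    {"shipment_id": "S-003", "temp_c": 6.0, "light_lux": 120},
]
print(flag_exceptions(readings))
```

In a real deployment the thresholds would come from the shipper's handling rules and the alerts would feed a notification service rather than a print statement.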
Carl Wiper, senior policy officer at ICO, reflects that it is difficult to produce a watertight definition of Big Data. "This is because it's really a phenomenon, rather than a specific technology," he said. However, he added that it is normally explained in terms of the 'three Vs' – volume, variety and velocity of data. In addition, Wiper comments that the use of multiple algorithms to find correlations, the tendency to use 'all the data' rather than a sample and the repurposing of datasets are notable features of Big Data analytics. "Big Data analytics have the potential to provide new insights, and promote innovation and the development of improved, more finely tuned services," remarked Wiper. "As the regulator for data protection, the Information Commissioner's Office (ICO) is interested in Big Data when it involves the use of personal data. We want to make sure that the benefits of Big Data don't come at the expense of privacy and trust."
In a similar vein, Richard Hughes-Rowlands, EMEA product marketing, Zebra Technologies Europe, comments that one of the standard ways of defining the term Big Data is to think of it in terms of the 'Four Vs'. He goes on to explain that the first 'V' refers to the volume of data, because of the sheer amount that is now available. The second 'V' stands for velocity, which refers to the speed at which data is generated and changes. The third 'V' stands for variety, which points to the large number of different data types. Then the fourth 'V' stands for veracity, because we are often dealing with data sets that might be incomplete, while still offering up considerable value to the user. However, Hughes-Rowlands believes that over time the phrase Big Data is going to continue to be something that means different things to various groups of people based on their different perspectives and business needs.
While not necessarily an 'official' definition, Tony Baer, principal analyst at Ovum, considers that Big Data could be described as the types of data that companies historically used just once, but now have the opportunity to extract greater advantage from through deeper levels of analytics. Baer makes the point that operational data has historically been used to do things such as register that a system is working properly or prepare a customer's bill. "Beyond that, this is data that businesses really didn't have the capability to collect, clean, store, analyse and aggregate in an effective and economical way before," he said.
According to Baer, another definition refers to data that is coming at an order of magnitude that previously wasn't easily accommodated by traditional SQL-type relational databases. "By being able to access and manage greater volumes of data in faster, more efficient and affordable ways, companies now have the opportunity to take different approaches in order to improve business processes," he remarked. As an example, Baer added that Big Data might enable companies to ask different questions; questions that could move them towards some of the goals they already have, but in a more effective way, through utilising a methodology that gives them a business and operational edge as opposed to taking traditional routes.
Dr. Ziad Nejmeldeen, head of Infor Dynamic Science Labs, comments that there are at least three or four major current definitions of Big Data. "The concept began with the creation of a hardware infrastructure as well as the arrival of computer power that was able to process much greater levels of data than before," he explained. "That stage really was the genesis of Big Data. Before then, using and managing data was more time-consuming from a labour perspective, as well as being very expensive; they were the main pain points."
From there, Nejmeldeen pointed out that it moved beyond basic data management whereby users could access and add data as required. "Things developed to the extent that data could be leveraged to 'tell us things'," he said. "That's really where a lot of the original business intelligence (BI) came from. The BI user experience was really about being able to visualise and then describe the data, allowing users then to be able to understand why there was, for example, a loss in sales at a particular point in time, or why things were erratic in a way that they didn't necessarily understand at face value. So, really it was about better diagnostics and the ability of a typical user to be able to access the type of valuable data they had never seen before that played a big part in the Big Data story."
Nejmeldeen added that the term Big Data then started being applied, especially within healthcare in the US, within the context of unstructured data, comprising a lot of information available in different forms, and often not very easy to access. However, he added that, today, more and more data in healthcare has to be searchable, including video data, so these many pieces of information are now feeding the definition of Big Data within healthcare.
The last definition outlined by Nejmeldeen – which is one Infor has been largely focusing on – is the realisation that users have data coming from multiple sources. "This data is usable, although not always very connected," he said. "However, if you piece it all together it can tell you a lot about what you should be doing in order to improve your business and operational processes.
"So Big Data may mean you have a large amount of data, it might mean it requires some very serious computational power, but it may also mean that you have a manageable amount of data that is all very disparate, meaning you need to find a way to make all the different types of data talk to one another in order to get the best answers and recommendations."
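Making disparate sources 'talk to one another' typically comes down to joining them on shared keys. A minimal sketch, with invented records standing in for, say, an ERP order extract and a warehouse stock feed (all names and figures here are hypothetical):

```python
# Illustrative sketch: join records from two hypothetical systems on a
# shared SKU so that they can be queried together as one view.
erp_orders = [
    {"sku": "A100", "on_order": 40},
    {"sku": "B200", "on_order": 15},
]
warehouse_stock = {"A100": 12, "B200": 90, "C300": 5}

def combine(orders, stock):
    """Merge the order and stock views into one record per SKU."""
    return [
        {"sku": o["sku"], "on_order": o["on_order"],
         "in_stock": stock.get(o["sku"], 0)}  # default 0 if SKU unknown
        for o in orders
    ]

for row in combine(erp_orders, warehouse_stock):
    print(row)
```

At production scale this join would of course happen in a database or analytics platform rather than in application code, but the principle, reconciling sources on common identifiers, is the same.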
Therefore, from a Big Data technology provider perspective, Nejmeldeen maintains that there needs to be a team on the science side who can do that at the back-end, plus a team that is able to visualise how to make all these different sources of data accessible in an easy-to-use and meaningful way. "Within Infor, this is where Hook & Loop comes in," he explained. "Hook & Loop has been thinking about this issue of disparate sources of data and how to create a user interface that allows users to move from one data source to another as easily, seamlessly and intuitively as possible."
Enriching the use of traditional applications
Moving on to some of the key talking points within the Big Data arena, Baer believes Big Data is going to help enrich how companies benefit from their traditional applications. He maintains that from the results of deeper levels of analysis through the medium of Big Data, companies could potentially craft improved strategies with regard to inventory forecasting, logistics planning and so on. "The analytics that companies perform using Big Data can help them to fine-tune or more radically change some of the strategies that they would otherwise have to plan by using their traditional systems," he said.
Klappich points out that Gartner conducts an annual supply chain management technology 'user wants and needs' study. "When companies were asked about the importance of different supply chain initiatives to their organisations in the next year, companies identified building end-to-end visibility and enhancing decision making with analytics as their top two initiatives," he said. "Given that end-to-end visibility is only achieved through descriptive, diagnostic or predictive analytics, this means that Big Data analytics ranks as one of the most important initiatives to supply chain organisations over the next year." Additionally, Klappich explains that more than 60 per cent of the respondents ranked enhancing decision making using analytics as 6 or 7 on a 1 to 7 scale (7 is the most important and 1 is the least important). "This underscores how important supply chain analytics is to support fact-based supply chain decisions," he remarked.
Hernandez considers that the ability to gather so much information about customer behaviour and industry trends can be invaluable, especially in such competitive industries as logistics and retail. However, he adds that gathering and storing this data does not provide any long-term impact. "It's the analysis of this information that gives the competitive advantage – it helps to predict what the customer wants and exceed his expectations," he said.
Hernandez made the point that gathering and analysing this information often requires multiple devices in different locations; therefore connectivity and access to real-time information is crucial. "Different resource platforms have to be communally linked, even connected with ERPs from other actors in the supply chain," he said. "The information gathered using these platforms has to be swiftly analysed and fed further into the business. This in turn influences demand planning, which affects the supply chain and logistics process."
Hughes-Rowlands believes that as Big Data methodologies become increasingly well-defined, practical and actionable, standardisation issues related to increased levels of security and the sharing of data will come to the fore. "This is because companies don't want to share all available data sources with all their partners," he remarked. "It needs to be selective, relevant and within the boundaries of necessary levels of security, confidentiality together with legal responsibility. Therefore, there needs to be well-defined ways of sharing what is relevant and appropriate depending on the needs and rights of different parties within the integrated supply chain."
One of the things Nejmeldeen finds interesting about the position of Dynamic Science Labs within Infor is this group does not sit within any one of the particular product areas or within any specific target industry sector. "So as we hear problems from a set of customers in one micro-industry, this allows us to think about how that same problem, in terms of how we approach it from a solution standpoint, could be used somewhere entirely differently," he said. "If you look at something like inventory optimisation the things that hospitals care about are going to be different to the things that retailers or manufacturers care about. Whether it's about maximising profit or service level, or quality of care in hospital, certainly you have different objectives. Nevertheless, in terms of the underlying mathematics and the way we go about solving these problems, and in terms of some of the fundamental data that's required to do that, many user requirements relate to the same type of solution metrics. So it allows us to think about creating a solution that we can then leverage multiple times in multiple industries."
Drivers for development
In terms of some of the main drivers behind the development of Big Data, a Gartner study found that 89 per cent of respondents felt that supply chain complexity was going to increase over the next several years, with 47 per cent of those saying complexity would increase significantly. Customer expectations, the velocity of change and competitive pressures, along with globalisation, were cited as key factors driving increased complexity, which, if not managed properly, can drive up the costs of doing business as well as increasing business risk.
While under-performing companies often look to reduce complexity, leaders said that exploiting complexity is a potential source of competitive advantage. "These companies look to Big Data and analytics to help them determine where to go, leveraging information to make faster and more informed business decisions, while underperforming organisations continue to focus on ex post facto reporting to tell them where they are," said Klappich.
Hughes-Rowlands reflects that we are all consumers of one sort or another, and we will have had both good and bad experiences when using IT solutions. "When experiencing positive results we then want to apply these back into our business world," he said. "For example, how can we make our Business-to-Business solutions as easy to use, as low cost or as fully connected as consumer systems or devices? These needs are some of the things that are driving innovation within the B2B space. So consumerisation, falling costs and connectivity are definitely some of the key drivers for technological developments such as Big Data."
Nejmeldeen points out that one of the major hurdles for the development of the Big Data concept around 10 years ago was having access to the required sources of raw data. Another hurdle was not having an infrastructure that was cost-effective and strong enough to handle all the computational power required; today, Amazon Web Services (AWS) provides this for companies such as Infor very effectively. "So, finally, solutions can now be built that can not only make predictions but can also evaluate all the possibilities under those predictions, and make valuable recommendations," he said.
The right type of presentation
However, when these optimisation solutions are in place and in the hands of end-users, Nejmeldeen's concern is that they may not understand where Infor's recommendations are coming from. Moreover, he adds that even though the company could claim to be using a particular piece of software, there are so many overrides that users could make. "After all, at the end of the day they have to justify to their bosses why they made a certain decision," he said, "so they may feel a lot more comfortable making a bad decision of their own rather than doing something they have been told to do but don't fully understand." Therefore, in Nejmeldeen's view, it is about presenting Big Data to users in a way that they can appreciate and understand. "This is where it's critical that we bring in the design team from Hook & Loop to add that translation layer from science to user," he remarked.
Many companies are now aiming to reach their existing goals in a more efficient way, but, according to Baer, some of the goalposts would have moved. In his view, customer engagement is probably the best example of this. "Traditionally, engagement with customers was mainly managed and optimised based on how customers approached you through your call centre, your website or your bricks-and-mortar store," he said. "But today it's largely about engaging with their lifestyle, which means monitoring attitudes and behaviour through social networks or location-based interactions through mobile data.
"To gain benefit from this involves performing different types of analytics in order to ask 'are my customers happy?' or 'am I going to lose them?' So, again, it's worth emphasising that the goals around customer satisfaction and the sale of goods haven't changed; it's just that the goalposts have shifted from relating to whether your call centre picked up the customer's call quickly to knowing what customers want by being involved in things like social media."
Intelligence at the point of interaction
According to Hernandez, developments in Automatic Data Collection have been key in the Big Data process, especially related to connectivity and intelligence at the point of interaction. "Several years ago, we felt highly satisfied identifying an asset by a barcode," he remembered. "The barcode was read by a scanner, and this serial number transmitted to a computer: the asset was identified.
"Now, we have intelligent tags that not only give us a serial number, but also a full traceability report with historic data including the interaction with temperature or shock sensors. Information is collected by a portable computer giving basic GPS information but also information about user behaviour, battery life, and unsuccessful scans – all enabling predictive maintenance. The amount of data collected, transmitted, stored and analysed has increased dramatically."
In parallel, Hernandez observes that retailers want to use this data to personalise the offer. "They want to know more about potential customers; what are they buying and why. This information gives retailers the knowledge to make the purchase easy and satisfactory. From paper-based systems we are now in full tracking mode. From collecting 'the minimum information needed' we move to 'choosing the level of information' depending on process capacity." Hernandez added that manufacturers plan production based on accurate and on-time deliveries and automatically track all items through the production flow.
Transportation organises the most effective routes through GPS coordinates – moving goods between intelligent distribution centres, which know exactly where each asset is located. Retailers agree terms of sale for products located in external facilities that will be delivered by non-owned fleets. "The customer constantly increases the demand and this is why value-added services are becoming increasingly important," said Hernandez. "The business that can meet and exceed customer demands and often offer something extra will be the one who gets the customer in the end."
Gartner recognises different styles of analytics within the domain of Big Data:
- Descriptive analytics (What happened?). Tools such as dashboards, business activity monitoring, complex-event processing, geospatial analysis, pattern analysis.
- Diagnostic analytics (Why did it happen?). Tools such as data mining, visualisation, root-cause analysis.
- Predictive analytics (What will happen?). Tools such as statistical forecasting, machine learning, simulation, crowdsourcing.
- Prescriptive analytics (What should happen?). Tools such as optimisation, heuristics, business rules, decision analysis.
While there have been point solutions in these various areas, Klappich believes the greatest value is when SCM organisations can exploit these capabilities more broadly. He maintains that, without question, companies have derived value such as improved customer service, process efficiencies and reduced costs using traditional descriptive analytics (i.e., after-the-fact reporting). "The greatest value and impact is when organisations can begin to first understand better why things happened and then use this insight to look out in time to make better forward-looking decisions," he said. As an example, Klappich points out that in transportation descriptive analytics might inform a company that it has a higher than expected number of late shipments. Then, in diagnosing this, the company could find its carrier capacity constraints have led to a higher than expected turn-down rate.
Klappich adds that, while this is valuable information in itself, the greater value comes when companies apply predictive analytics to better plan freight in the future to secure capacity, and prescriptive analytics to determine what is the optimal way to address this challenge going forward.
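The progression Klappich describes, from descriptive reporting to predictive planning, can be illustrated on invented weekly shipment figures. This is a deliberately naive sketch (the data and the moving-average 'forecast' are hypothetical), meant only to show the two questions side by side:

```python
# Illustrative sketch of the descriptive-to-predictive progression on
# invented weekly shipment figures: first report what happened (the
# late-shipment rate), then project it forward with a naive average.
weekly = [  # (total shipments, late shipments) per week -- hypothetical
    (200, 14), (210, 22), (190, 19), (205, 25),
]

# Descriptive analytics: what happened?
late_rates = [late / total for total, late in weekly]

# Predictive analytics (very naive): assume next week resembles the
# average of the last three weeks. Real systems would use statistical
# forecasting or machine learning, as the Gartner list above notes.
forecast = sum(late_rates[-3:]) / 3

print([round(r, 3) for r in late_rates])
print(round(forecast, 3))
```

Diagnostic and prescriptive steps (why were shipments late, and what should be done about capacity?) would sit on either side of this, drawing on richer data such as carrier turn-down records.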
Better choice and value
Hernandez considers that access to more information gives more options to customers and offers them a better choice. "It provides access to vendors from all over the world who offer different products and services," he explained. "Customers can then compare the vendors and choose the best fit for them – whatever their requirements might be." He added that, when it comes to businesses, Big Data allows them to gather more information about the customers and market trends which can then be stored, analysed and effectively used for competitive advantage.
As Big Data develops and becomes more widely adopted, Hughes-Rowlands thinks the key end results to the user would be better value and better experience. "Ultimately, all businesses are providing something for a consumer," he said. "And the consumer naturally wants to gain benefits as a result of suppliers adopting Big Data; even if it's simply knowing that a parcel is going to arrive at a certain time on a particular day.
"So, by aggregating all the available relevant data together for particular uses, I think consumers will increasingly enjoy a better customer experience." Hughes-Rowlands added that the more informed you are as a consumer the better you will feel about the product. "That's always been the case," he remarked. "One of the simplest examples of meeting the customer's requirements is that if you're standing waiting for a lift it's good to know it's on the way, and you can see the lighted panel telling you the floor it's currently on. By connecting together all the systems and the intelligence you have at your disposal within the supply chain, consumers will be happier, and, as a consequence, this should translate into repeat business."
Living up to expectations
And what of remaining challenges in general with regard to the Big Data space? According to Gupta, there are undoubtedly challenges in managing Big Data, particularly with respect to AIDC technology. "Data capture is not the same concept now as it was ten years ago," she said. "What started out as just barcode scanning has now evolved into capturing a variety of non-traditional data types. Solution providers have to work twice as hard to live up to expectations as clients place frequent demands for multipurpose scanners and software that support a range of data capture functions, not limited to simple track-and-trace. This leads into the next challenge that participants in the supply chain face. Big Data is only as useful as the way in which it is analysed and presented."
So, how does one integrate all of these vastly different types of data sets together in a meaningful way? For end users, the issue (and the solution) is in the analytics, states Gupta. "Companies such as FedEx are now hiring 'data scientists'. These professionals specialise in analysing and presenting Big Data in a way that contributes to effective decision-making.
"Any firm seeking to use Big Data to its advantage will need to invest in a dedicated team of data scientists for this purpose." In Gupta's view, the future of the AIDC market will depend on participants' ability to address these prevalent issues head-on. "As data volume and variety are continuously increasing, it is no longer enough for a scanner to read a barcode," she said. "Data capture solutions providers will need to innovate and make investments in solution development that goes beyond hardware in order to meet the increasing demands of today's connected, Big-Data-driven Internet of Things world."
Continuing the integration theme, Nejmeldeen makes the point that if a start-up company decided it wanted to build any sort of Big Data solution – for example, an inventory optimisation solution – the first thing it will need to do is upload a lot of data. Then, once this has occurred, he explains that there needs to be a way of going through this 'unclean' unstructured data and making it more structured and usable. "This can, in some instances, take longer than the entire Big Data solution implementation," he said, adding that Infor has the benefit of being able to go directly to its SkyVault data platform, where the relevant data services have already provided it with clean data ready for use. "So we have the data and, secondly, we have clean ready-to-use data."
Legislation and legal issues
Have changes in legislation or other wider legal aspects influenced the development of Big Data over the past year or so? Klappich believes it is not so much that changes in legislation have influenced the development of tools as much as these issues have accelerated the adoption of Big Data tools to help companies more effectively deal with these changes. "For example, in the US new hours of service rules have forced both shippers and carriers to analyse the impact of these rule changes on their operations and to evaluate various approaches to dealing with these impacts going forward," he pointed out.
Hernandez observes that specific laws such as those surrounding food traceability have clearly been forcing some of these developments. However, in general, he thinks the Internet is the main driver behind these trends. "Legislation is usually behind technical capabilities or global trading trends," said Hernandez. "Countries and governing bodies follow global commerce, suppressing customs barriers and allowing the free circulation of goods and services through different countries. Today we see applications for collaborative networks or sharing services where people are travelling using resources outside traditional channels such as taxi or bus transportation companies; what if tomorrow we see people selling, or renting, or sharing products or services by themselves, far away from traditional retailers?"
Hughes-Rowlands reflects that we hear about data breaches involving everything from credit card companies to banks. "No one wants to be the next organisation leaking their Big Data, which could have not only legal consequences but also prove to be a PR disaster for the brand in question," he said. "I think the main point here is about consumer protection through the protection of data.
"On the legislation side, one thing people can be concerned about is when new EU regulations are introduced involving, for example, the tracking and tracing of goods in the supply chain or in the warehouse or on the factory floor. This could be driven by the need to ensure that the content list of ingredients for a particular product is safe and within a certain date threshold for consumer use. This type of legislative issue can force companies to change their behaviour. In this regard, we may see Big Data – because of its ability to track, trace and analyse the storage, sale and dispatch – being increasingly deployed in order to remain compliant with these types of rulings."
Also, in terms of data protection, Hughes-Rowlands stresses that companies have to be sure they are sharing the right amount of relevant data with the right people, and not disseminating the data they have at their disposal irresponsibly, or even in some cases illegally. "This is one of the areas where Big Data can have a lot of value, and it's one of the current big challenges for the Big Data concept," he said.
What are some of the key differences regarding brands and types of Big Data solutions currently available? Hernandez comments that Big Data management often requires specialised companies to solve issues of search, collection, sharing, storage, analysis and security in all phases. "The company that understands these factors will ultimately offer the best solution for their customers," he said. "The end customer, manufacturing company and retailer have to be focused on their core activity and rely on external suppliers to manage and solve the inherent issues that come with handling large volumes of information."
Hughes-Rowlands considers that one differentiator would be the open or closed nature of systems: whether you are locked in, whether you can put together a best-of-breed type solution, or whether you are looking for someone to do it all for you.
Nejmeldeen points out that Infor is building solutions that have the flexibility to look across all the industries. "Then, as we serve and learn from issues related to customers in one micro-industry, from a solution standpoint we can also think about how the way we solve those problems could also be of use in another micro-industry," he said. Secondly, Nejmeldeen explained that Infor has access to clean data with its ERP solutions having been integrated via the company's middleware ION and Cloud service SkyVault.
Thirdly, he commented that Infor has many thousands of customers who can benefit from this software doing something either on the prediction or optimisation side that it doesn't currently do. "We are not simply looking to build entirely new solutions that sit remotely from the user's existing Infor experience," he said, "we are looking to understand what else users would like Infor to accomplish through the ERP system they are using, and then supplement this with greater levels of optimisation and flexibility."
Security and confidentiality
Are there any enduring security or confidentiality concerns within the world of Big Data? Klappich has not seen these types of concerns as much in SCM as in other areas such as consumer retailing. "We have seen some push-back on the part of carriers to companies exploiting things like Big Data to benchmark carrier rates and to create aggregated carrier score cards," he said. "However, these initiatives are nascent and so far have not caused any notable issues."
Wiper's main concern is around transparency and fairness, when the analytics involve personal data. "Increasingly data is collected without people necessarily being aware of it; for example from connected devices (the so-called Internet of Things), from location data or metadata from people's Internet and social media activity," he said. "Big Data can repurpose this data in unexpected ways, in order to profile people and make decisions about them.
Given the complexity of the analytics, Big Data can seem opaque to the people whose data is being collected." Wiper added that Big Data creates a big challenge for the organisations collecting and analysing it. "They have to provide information to people about how they're using their data. How do they explain what they're doing in a way that people can understand; for example when they're downloading an app? If the organisation is relying on people giving consent, how can they ensure that it's meaningful and valid?"
According to Wiper, Big Data challenges organisations to be as innovative in giving this important privacy information, as they are in carrying out the analytics. ICO discusses this, and other data protection issues, in its paper on Big Data and data protection.
Hernandez stresses that storing and sharing customer data is definitely something that has to be regulated and closely monitored. "Knowing who is accessing the information is key in this process," he said. "Therefore, establishing suitable access points, privacy rules, channels of transmission, storage and data governance rules, security protocols (and data encryption) is absolutely crucial for the security of the data."
Making the best decisions
In terms of other concerns, the biggest from Nejmeldeen's perspective would be the potential for user overrides in the system. "Therefore, one of the things we are building into our software is the ability for it to monitor its own performance," he said. "As part of that, because someone using a system may want to override or change the recommendations that the system is making, the system needs to be able to track not just its own recommendations but also all the overrides that may have taken place, together with what actions were actually executed.
The system needs to be able to report on all of these findings in order for the end-user to make the best decisions in the future and to reduce the number of overrides. Or, if the user is making the overrides for good reason then the system needs to learn from this in order to improve its own operation and recommendations."
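The self-monitoring Nejmeldeen describes amounts to keeping an audit trail that pairs each system recommendation with the action actually executed, so override rates can be reported and learned from. The sketch below is purely illustrative and not Infor's implementation; all class and method names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Decision:
    """One planning decision: what the system recommended vs. what was done."""
    recommendation: str
    executed: str

    @property
    def overridden(self) -> bool:
        # An override is any executed action that differs from the recommendation.
        return self.recommendation != self.executed

@dataclass
class AuditLog:
    """Records recommendations and overrides so the system can report on itself."""
    decisions: list = field(default_factory=list)

    def record(self, recommendation: str, executed: str) -> None:
        self.decisions.append(Decision(recommendation, executed))

    def override_rate(self) -> float:
        """Share of decisions where the user overrode the recommendation."""
        if not self.decisions:
            return 0.0
        return sum(d.overridden for d in self.decisions) / len(self.decisions)

log = AuditLog()
log.record(recommendation="ship via carrier A", executed="ship via carrier A")
log.record(recommendation="ship via carrier A", executed="ship via carrier B")
print(log.override_rate())  # 0.5
```

A real system would also store why each override was made, which is the signal the software would need in order to improve its own recommendations over time.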
Among the many real-world examples of where companies are benefiting from a Big Data methodology Hughes-Rowlands cites the example of a manufacturer of bespoke kitchens. He explains that the company designs customers' kitchen units, builds them in its factory then delivers and installs them. "Because of the sheer number of separate parts that need to be delivered to the customer's address, the company was aware there was a risk of certain parts going missing, or not being put on the delivery van at the point of dispatch," he said. "In view of this potential constraint, the company decided to RFID-track every single part in the process, including readers on the delivery vehicles. RFID now covers everything from the company's own internal logistics processes to its various suppliers downstream. Everything is identified and all the data that's collected can be analysed in order to further improve processes where possible."
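At the point of dispatch, the missing-part problem the kitchen manufacturer faced reduces to comparing the set of RFID tags scanned onto the van against the order's manifest. A minimal sketch of that check, with hypothetical part IDs and function names:

```python
def check_dispatch(manifest: set, scanned: set) -> dict:
    """Compare the parts on the order manifest with RFID tags scanned onto the van."""
    return {
        "missing": manifest - scanned,     # on the order but not loaded
        "unexpected": scanned - manifest,  # loaded but not on this order
    }

manifest = {"DOOR-0141", "HINGE-0142", "PANEL-0143"}
scanned = {"DOOR-0141", "PANEL-0143"}

result = check_dispatch(manifest, scanned)
print(result["missing"])  # {'HINGE-0142'}
```

The same set-difference check works at every stage of the chain (supplier, factory, van, installation), which is what turns per-item tracking into the analysable volume of data the article describes.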
Hughes-Rowlands adds that the ultimate benefit of this methodology is that, because there is now a substantially reduced risk of items going missing, customer satisfaction is extremely high. "This may not strictly tie in with the Big Data concept," he reflected, "but it's a good example of how processes can be massively improved by identifying and tracking every item and part through the supply chain to the end user and then using that large volume of data to delight the customer whilst minimising the cost of incomplete deliveries."
Future innovation and development
What might be some of the sweet-spots of further development and innovation within the world of Big Data? Hughes-Rowlands believes we will continue to see some interesting applications coming from new sources. "There is a lot of data of different types that is becoming available," he said, "and this data is in many cases becoming a lot cheaper to obtain, connect and deliver to customers. So, the awareness of the benefits of these various data sources and the increasing cost-effectiveness of obtaining them, coupled with further technological developments from the vendor community, means the uptake of Big Data will increase substantially over the next few years.
Big Data is still in an early adoptive phase. There may be some level of disillusionment as people look to implement Big Data and adapt it to their own specific challenges, but I think it will continue to offer up many real benefits."
Hughes-Rowlands also believes standardisation will be a key driver of adoption, allowing for rapid deployments and interaction between disparate systems. "With products like Zatar from Zebra Technologies, we're addressing the challenges that Big Data brings," he said. "Collecting, aggregating and securely sharing information from multiple sources to applications that will ultimately bring a better customer experience and drive repeat business."
Baer points out that there are some new-generation applications that are emerging. "But these are additive – they don't mean companies are going to have to replace their existing ERP system, CRM system and so on. The new applications will be deployed for specific processes and will be very niche-specific."
Wiper has noticed that some companies are putting a lot of work into developing what could be called an ethical approach to Big Data that respects privacy concerns. "This isn't just about complying with the law, it's about avoiding reputational risk," he said. "It's also about building trust with their customers, and positioning themselves as good custodians of personal data. We see this as a positive trend, and we'll be interested to see if it becomes more widespread in business over the next year or two." In Wiper's view, Big Data is also driving an increasing business emphasis on data quality and data governance. "A lot of the issues that chief data officers are concerned with are also data protection issues," he pointed out. "We think that getting data protection right helps companies to improve their data management."
Klappich observes that in transportation and logistics predictive analytics has been of interest. However, until recently Gartner has only seen very advanced companies looking at these types of tools. "For example, freight forecasting is largely done on spreadsheets today, if done at all," said Klappich. "But with increased volatility and capacity constraints, this is no longer good enough; companies need more robust tools."
Also, Klappich explains that things like traffic and weather pattern analysis to help build better operational strategies are being used by advanced companies, although they remain nascent. "We find that logistics largely trails other areas of the business such as manufacturing and sales in using predictive analytics, and there are numerous use cases where this will prove beneficial. For example, take traffic pattern analysis for a major city. Today many companies have advanced route optimisation tools that define the optimal path to follow given certain constraints. But many of these cannot factor in things like traffic congestion at points in time which, if considered, might suggest different routes that reduce idling time sitting in traffic jams."
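The limitation Klappich describes, route optimisers that ignore time-of-day congestion, can be illustrated with a shortest-path search whose edge costs depend on departure time. This is a toy sketch over a hypothetical two-route network, not any vendor's algorithm:

```python
import heapq

def fastest_route(graph, start, goal, depart):
    """Dijkstra-style search where each edge's travel time depends on when it is entered.

    graph: {node: [(neighbour, cost_fn), ...]} where cost_fn(t) returns travel
    minutes if the edge is entered at time t (minutes past midnight).
    Returns total travel minutes from start to goal, or None if unreachable.
    """
    best = {start: depart}          # earliest known arrival time at each node
    queue = [(depart, start)]
    while queue:
        t, node = heapq.heappop(queue)
        if node == goal:
            return t - depart
        if t > best.get(node, float("inf")):
            continue                # stale queue entry
        for nxt, cost_fn in graph.get(node, []):
            arrive = t + cost_fn(t)
            if arrive < best.get(nxt, float("inf")):
                best[nxt] = arrive
                heapq.heappush(queue, (arrive, nxt))
    return None

# Hypothetical network: the motorway is fastest off-peak but congested 8-10am.
rush = lambda t: 60 if 480 <= t < 600 else 20   # motorway leg
steady = lambda t: 35                           # ring-road leg, always the same
graph = {"depot": [("city", rush), ("city", steady)]}

print(fastest_route(graph, "depot", "city", depart=9 * 60))   # 35 (ring road wins)
print(fastest_route(graph, "depot", "city", depart=11 * 60))  # 20 (motorway wins)
```

A static optimiser of the kind Klappich criticises would always pick the motorway; making edge costs a function of time is what lets traffic-pattern data change the answer.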
Hernandez comments that off-the-shelf or specific ad hoc developments will be dependent on different factors. "All main IT software providers are investing huge amounts into Big Data – not only to solve technical issues, but most importantly to make access to the right data as easy as possible for their users," he said.
Nejmeldeen believes that within a couple of years the Internet of Things concept will really come into its own. He elaborated: "The Internet of Things is all about objects and people talking to each other in a meaningful way. The basis of the concept isn't entirely new in that sensors have been fitted to devices for some time. What's really new, and what will be the real game changer, is the greater uptake of mobility solutions together with the availability of much greater computer power.
If you look at Infor Ming.le Agile Project Management, the market is already firmly on the road to the Internet of Things. Infor Ming.le has been described as a workplace social collaboration tool, but I think it's really much more than that. It's about the manufacturer's assets being fitted with sensors that track all sorts of progress. When a task has been completed, this event should be reportable. And when you have machines and people all in communication with one another, from a Big Data optimisation perspective you now have the prospect of being able to know when a particular object has completed a certain task at a certain time."
Additionally, Nejmeldeen explained that if an object fails or is at risk of failing – maybe due to a worn part that the object's sensor has detected – the Internet of Things methodology means a technician with the right training and qualifications can be immediately alerted. "So, with the Internet of Things we have this ability to manage people and inventory together," he said.
Hughes-Rowlands reflects that customer expectations are certainly becoming greater. "As a consumer myself, I sometimes get frustrated when I receive parcels delivered from multiple delivery companies and only one or two of them can tell me exactly where my parcel currently is and when it will be delivered," he said. "Many of these companies can only give the customer a delivery day, while the actual anticipated time of delivery can be within a window of several hours.
The companies that can provide more accurate information are likely to be those who are tracking everything all the time." So, Hughes-Rowlands believes that, increasingly, consumers will come to expect a more accurate time of arrival so that, if they need to be at home in order to take receipt of the package, they can plan their day's activities accordingly. "The basic tracking of goods may not be Big Data, but things such as the gathering of data related to delivery patterns, quickest routes and the most efficient delivery services, and then being able to analyse this at a higher level, have a Big Data resonance," he remarked.
Privacy and Data Protection issues can be more challenging in some cultures or regions, observes Hernandez. "It is no longer just a question of what information to collect and how to store it – users are becoming increasingly concerned with who will have the access to that information and which data belongs to whom," he said. "It is often difficult to establish who the owner of the information is.
When a retailer is gathering data about a particular customer, does the customer own the rights to that data, or does the party that captured and stored the information? Which types of information are private and which can be captured and shared? These are all quite controversial issues, especially with rising security concerns among customers."
Focusing on tomorrow
To conclude, Klappich makes the point that Big Data and analytics are not entirely new concepts. "We have had some sophisticated point solutions in certain areas for many years now," he remarked. "What is newer is the recognition that analytics must become a core competency of SCM organisations that want to maintain or improve competitive advantage.
What this says is that organisations must view data as a strategic asset and they need to build the organisation and competencies to manage and exploit data strategically." In Klappich's view, developing things like a Big Data analytics centre of excellence is one step on this journey. He adds that companies also need to break down functional and systemic barriers within their own organisations to fully leverage end-to-end data.
Last but not least, he maintains that SCM organisations need to re-orient their attention from mostly focusing on 'what happened' to placing more emphasis on what will and should happen tomorrow and in the days, weeks and months ahead. Big Data is largely about the future.