To survive the next decade, businesses will need to be able to utilise Big Data.
Buzzword or not, it’s probably the most important feature of the fourth industrial revolution, a new economic asset that will define the near future of commercial evolution.
Structured data has been used by companies for a long time, but Big Data is distinct from this in three main ways: volume, velocity and variety. In simple terms, there is a lot of it, being produced at an extremely fast rate, and occurring in many different forms.
Businesses will need to evolve in their practices of data generation, acquisition, storage and analysis. Their success in these skills will dictate the new hierarchy within almost every industry.
Big Data is already transforming the way businesses operate. It is the key to the success of super-platforms such as Google, Amazon, and Facebook, and, via Internet of Things technology, will soon enable great improvements in transport, energy distribution and healthcare, as well as profound changes to the way we live.
Any future-facing business will need to become adept at collecting, processing and analysing data if it wishes to survive the next decade.
Several studies have aimed to measure the impact that data-driven decision-making (DDD) has on the performance of a company. MIT research this year, which included 330 US companies, found that businesses in the top third of their industry for DDD were “on average, five per cent more productive and six per cent more profitable than their competitors”.
In a study of Big Data technologies’ impact on businesses, the economist Prasanna Tambe identified “significant additional productivity growth”. The research found that “one standard deviation higher utilisation of Big Data technologies is associated with one to three per cent higher productivity than the average firm”.
Take Amazon’s hugely effective automated recommendations, which harness the masses of data the company collects about its customers. The US giant Sears has started doing the same, and, with the help of the data storage and processing software Hadoop, was able to cut the time needed to generate personalised promotions from eight weeks to one.
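The idea behind recommendations like Amazon’s can be sketched very simply: count which products are bought together, then suggest a product’s most frequent co-purchases. This is only a minimal illustration of the co-occurrence principle, with invented baskets; it is not Amazon’s actual system.

```python
from collections import Counter
from itertools import combinations

# Toy order history: each set is one customer's basket (illustrative data only).
orders = [
    {"book", "lamp"},
    {"book", "lamp", "mug"},
    {"book", "mug"},
    {"lamp", "cable"},
]

# Count how often each pair of products appears in the same basket.
co_counts = {}
for basket in orders:
    for a, b in combinations(sorted(basket), 2):
        co_counts.setdefault(a, Counter())[b] += 1
        co_counts.setdefault(b, Counter())[a] += 1

def recommend(product, k=2):
    """Return the k products most often bought alongside `product`."""
    return [item for item, _ in co_counts.get(product, Counter()).most_common(k)]

print(recommend("book"))  # products most often bought with "book"
```

Real systems refine this with weighting, personalisation and scale-out processing (hence Hadoop), but the core signal is the same co-purchase count.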
A major US airline used Big Data technology to improve ETA predictions. Its savings at each airport are reported to be several million dollars a year. Visa uses Hadoop to process 73bn transactions in 13 minutes, a job that previously took a month.
The advertising industry has naturally been one of the forerunners in embracing Big Data. Real-time response in digital advertising enables precise measurement of an ad’s performance, and this plethora of information can be complemented by data collected by websites and third-party providers.
Digital advertising technology has enhanced the targeting, optimisation and measurement of marketing to an extent that was inconceivable in the previous century.
Advertising is, in some ways, a vision of the future for every industry. The only relevant ad agencies are those that have developed their ability to process and analyse data. The days of the HiPPO (Highest Paid Person’s Opinion) leading decision-making in this sector are long gone, and a far more scientific era has taken its place.
Big Data’s benefits to other sectors will be just as profound. McKinsey and Company researched the potential value of Big Data to five core industries in the US, predicting, for example, savings of $300bn for US healthcare if Big Data were used effectively.
In 2009, Google famously used Big Data to help identify people who had been infected during the flu pandemic. Based on users’ search behaviour, Google was able to forecast where the disease might spread to and predict where it had spread from.
There really is no limit to the potential of data to improve how businesses operate. Whatever your industry, you should be doing whatever you can to get better at Big Data.
Source: City A.M. website, Thursday 31, 2017. By Daniel Gilbert.
One thing is certain: data will revolutionise the world, but who will really benefit? In 2011, the consulting firm Gartner declared: "Information is the oil of the 21st century, and analytics is the combustion engine." Taking this metaphor further, we may legitimately ask who tomorrow's data tycoons will be, and what methods today's and tomorrow's giants will use: will they, like the oil giants in their time, operate at the limits of legality? Will there be oil spills? Will there be collusion with our politicians?
21st Century Petroleum
Our society, our economy and our lifestyles will be profoundly altered by data, which now drives much of the growth of Western countries. In this sense, it is the oil of the 21st century. But where a landowner once grew rich from an oil well on his land, what about our data? What is its value? Will any of it come back to us? Every day, at every moment, each of us generates a great deal of personal and professional data – data belonging to ourselves or our companies, published on the Internet or harvested by third parties.
That is why many companies offer free email and data-sharing services. Data has value only at scale. Services that let them collect users' data worldwide are vital for GAFAM – Google, Apple, Facebook, Amazon and Microsoft – who are masters at valuing and monetising our personal data.
Future data "oil spills"
If data is the new oil, what happens when it escapes? Whether data is deliberately stolen by hackers or leaks out accidentally, no one is immune. The news is full of revelations of large-scale incidents: the theft of 412 million accounts from the dating site AdultFriendFinder in 2016, or the error that exposed confidential information on members of the G20. Other examples could illustrate these massive leaks, but there is a far greater danger to users.
Many sites trade in your login data, and beyond the sale itself, this information is aggregated from different sources to make it more useful to hackers. Basic security rules dictate that users should not reuse the same login and password across different sites. But honestly, have you never broken that rule?
A strong geopolitical issue
Let us not be naive: control of data is a crucial geopolitical issue. There is a common tendency – and large groups encourage it – to believe that the Internet, the cloud and all the tools built around data transcend any notion of nationality. Yet on closer inspection, the United States, and California in particular, hold a hegemonic grip on our data. To caricature to the extreme, one answer to the question "Who benefits from the explosion of our data?" might simply be: Silicon Valley.
To be convinced of the geopolitical dimension of data, consider the decision of the American courts, upheld on appeal on 19 April, obliging Google to hand over data stored outside the United States. China has understood this too, pursuing a determined policy of protectionism that has allowed the BATX – Baidu, Alibaba, Tencent and Xiaomi – to thrive in the face of GAFAM. The BATX, strongly backed by the Chinese state, now aim to conquer the international market, and Europe in particular.
An OPEC of data
Europe, for its part, seems stuck in an outdated view of computing and the Internet. The beneficiaries of the data explosion there will nonetheless be numerous: telecoms operators and digital services companies will profit from this revolution, not to mention the many start-ups emerging around the Internet of Things and data analytics. But make no mistake: the only real beneficiaries of the data explosion will be those who hold it in their data centres! Will GAFAM and the BATX use their infrastructures to create an OPEC of data, setting the price of your data on the world markets?
Source: www.theconversation.com - June 14th, 2017
Eric Cassar is an engineer and architect, and the founder of Arkhenspaces, whose "Habiter l'infini" project won the Grand Prix Le Monde Smart-cities 2017. In this opinion piece, he calls for new ownership models for the digital data produced by buildings.
"The development of the Internet is akin to the arrival of new dimensions. Age 1 accelerated our exchanges with email, then gave us access to a growing amount of information and services. Age 2 facilitated connections between individuals, through social networks. Age 3 is the continuous relationship of individuals with space, through the smart building and the smart city: a physical space in close relation with digital space, thanks to fixed or mobile connected objects and the spread of sensors throughout our cities.
Our buildings will therefore process an immense amount of new data, producing key information about the functioning of human settlements at different scales: the building, the block, the neighbourhood, the city, the territory. This is data related to environments (energy consumption, footfall, access) but also attached to new local social networks.
Used effectively, this large amount of information will improve the functioning and efficiency of these built environments, in particular by correlating supply and demand, distributing needs and resources, and then anticipating them. It will be able to suggest, initiate or promote local social ties and multiply local synergies.
A precious raw material
Singapore is tiny beside the giants of India and China, yet it manages to play an important role where smart cities are concerned. With its 4.7 million inhabitants, 'little' Singapore has established itself as one of the world's models for the smart city, and plays a key part in the extraordinary development of the two Asian giants, China (1.4 billion inhabitants) and India (1.3 billion). It experiments on all fronts, loves innovation and multinationals, and attracts clients from all over the world.
Drawing in part on its community of Chinese origin (75% of the population), Singapore has signed several partnership agreements with China to test and further develop smart cities. This is the case, among others, for the business park in Suzhou and the high-tech eco-island of Nanjing, the former capital, home to over 8 million inhabitants. The experiments are then reproduced in other cities, including the eco-city of Tianjin and the City of Knowledge in Guangzhou. This methodology "provides a platform enabling Singaporean and Chinese firms to demonstrate their capacities in matters of technologies in a holistic manner", explains the INFOCOMM Agency in Singapore.
Data collection on a massive scale
And China is not alone. To advance his high-priority project of creating 100 smart cities by 2020, the Indian Prime Minister, Narendra Modi, has also turned to the know-how and investment capacity of the Singaporeans (over 7% of whom are of Indian origin). Singapore scores points where Hong Kong marks time: Hong Kong is further away geographically and made its heavy bet on Information and Communication Technologies (ICT) long ago.
In more specialised sectors, the city-state positions itself as a life-size, world-class laboratory for the city of tomorrow, with cutting-edge experiments on autonomous vehicles and even on ethnic diversity by neighbourhood and by apartment building. Above all, massive data collection combined with the predictive intelligence of Big Data is used in every sector to model projects, plan changes and strive to offer the most innovative services, whether the question is traffic flow, security, the comfort of buses or the location of childcare centres. The municipality has, moreover, implemented a measure that is often a source of anxiety elsewhere: a sophisticated road-pricing system whose charge varies according to the amount of traffic, the neighbourhood, the time and the day of the week.
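A variable road-pricing rule of the kind described can be sketched as a simple function of zone, time, day and congestion. The zones, rates and multipliers below are invented for illustration; they are not Singapore's actual ERP tariffs.

```python
# Hypothetical variable road-pricing rule: the charge depends on
# congestion level, zone, hour and day of the week. All figures invented.

BASE_RATE = {"downtown": 3.0, "suburb": 1.0}  # base charge per gantry pass

def road_charge(zone, hour, weekday, congestion):
    """Compute a toll from zone, hour (0-23), weekday (0=Mon) and congestion (0-1)."""
    rate = BASE_RATE[zone]
    if weekday < 5 and (7 <= hour < 10 or 17 <= hour < 20):
        rate *= 2.0          # weekday rush-hour surcharge
    if weekday >= 5:
        rate *= 0.5          # weekend discount
    return round(rate * (1.0 + congestion), 2)

print(road_charge("downtown", 8, 2, 0.5))   # busy weekday morning downtown
print(road_charge("suburb", 14, 6, 0.1))    # quiet weekend afternoon
```

The design point is that the price is recomputed from live traffic data rather than fixed, which is what makes such a system a Big Data application rather than a simple toll.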
The brand knows how to sell
Building on this strong position, Singapore organises a World Cities Summit each year. In 2016 it brought together mayors and leaders from 103 cities in 63 countries, all of them clients, either already or potentially. The brand knows how to sell, and it sells itself well, relying on its powerful financial community and, for implementation, on various private firms, including Surbana Jurong, present all over the world.
In 2014, the Prime Minister launched the Smart Nation programme, placing it under the responsibility of the Minister for Foreign Affairs, Vivian Balakrishnan, thereby displaying an ambition to look further afield. Crucially, the government has understood the importance of a systemic vision for improving cities by means of technology.
Both at home and abroad, Singaporeans focus on sustainable development and concern for citizens, in the form of quality services. But overall, their model tends more to the 'datapolis' than to the 'participolis': it prioritises the collection and processing of data over the effective participation of citizens.
This model has its limits, as Anthony Townsend, a researcher at New York University, points out. In an article published in Technology Review, he states: "The perfectly controlled and efficient utopia of a very safe and smart city can work in a place like Singapore. But it would probably never work in New York or Sao Paulo, where expectations in terms of conception, and of what makes the vitality of a community, are completely different." Townsend thus confirms that there is no single model of the smart city. This in no way prevents the small nation-state from innovating, or from finding clients all over the world.
The customer has taken control of the industrial world. Production stoppages are no longer acceptable when the whole production chain runs just-in-time: customer deadlines, supply, and so on. Time to market has become a key factor of differentiation, and sometimes even of survival. Maintenance is now a strategic function in companies. The good news is that innovative technologies will make it possible to move from costly curative repairs to reliable preventive maintenance.
Who still believes that repairing a breakdown, even in the shortest possible time, is enough to satisfy customers' increasingly pressing demands on deadlines? Growing competition and the unrestrained race for competitiveness lead to a search for total quality and, above all, lower costs. Forecasting malfunctions, anticipating breakdowns: over the years, technology has come to the rescue of the maintenance business. Many innovations are invading production lines, sometimes requiring very rapid adaptation by the teams in place.
This is the case at SAV Réso, based in Ozoir-la-Ferrière. In partnership with OptimData (a start-up specialised in interpreting Big Data), it offers its customers the option of equipping their machines with a solution called Need & Use: a GPRS box fitted to each machine allows it to communicate, enabling the collection and analysis of real-time usage data.
All of this data is intended to provide the critical information needed for an effective predictive maintenance plan: extending the life of the equipment, reducing the probability of operational failures, and cutting downtime in the event of an overhaul or breakdown. Preventive maintenance also aims to avoid additional costs, such as lubricants, spare parts and costly corrective interventions, and even to make employees' work safer.
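A common first ingredient of such predictive maintenance plans is flagging sensor readings that break sharply from the recent trend, so a technician can intervene before failure. Below is a minimal sketch assuming a simple moving-window rule on a vibration signal; the article does not describe OptimData's actual methods, so this is illustrative only.

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=5, threshold=3.0):
    """Flag readings that deviate sharply from the recent trend.

    A reading is flagged when it lies more than `threshold` standard
    deviations from the mean of the previous `window` readings -- a
    classic early-warning signal in condition monitoring.
    """
    alerts = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            alerts.append(i)
    return alerts

# Vibration levels streamed from a machine: stable, then a sudden spike.
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 0.95, 4.8, 1.0]
print(flag_anomalies(vibration))  # index of the spike
```

In production, the same logic runs continuously on the data the GPRS box streams back, and an alert triggers a scheduled inspection rather than an emergency repair.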
By Jessica Grasser | 22/05/2017 - lejournaldeleco.fr
To speed up its digital transformation, SNCF is focusing on the industrial Internet. Today, 400 employees of the transport group work on some fifty IoT projects at different stages of development. With the Internet of Things, the group intends to improve the security of its network and the quality of its service, and to generate significant productivity gains. Here's how.
SNCF has set its sights on the Internet of Things. "In our company, the IoT is the main lever for performance and efficiency. The true technological revolution is the IoT," said Guillaume Pepy, the group's president, on Thursday 18 May 2017, at a press conference devoted to the company's digital transformation, on which the group will spend an additional 900 million euros over the next three years.
400 people dedicated to IoT
"Today, 400 people at SNCF are working on the IoT, across 50 teams in different departments of the group," said Benoît Tiers, CEO of e-SNCF, a new entity bringing together the group's 4,000 employees dedicated to digital and information systems. SNCF currently counts five industrial Internet projects in the industrialisation phase, 12 in pre-industrialisation and 30 at the experimental stage. "And 11 of our technocentres are engaged in the transformation towards the plant of the future," adds Benoît Tiers.
Among the five IoT projects in the industrialisation phase is a system of connected rail sensors that replaces mercury thermometers. "Where we used to retrieve the data manually by touring the tracks during summer heat, we can now measure the temperature remotely thanks to the deployment of 500 connected sensors. This improves the punctuality of our trains, since we reduce their speed only on the sections where a sensor sends an alert indicating that the temperature is above 45 degrees," explains Claude Solard, Deputy General Manager for Security, Innovation & Industrial Performance at SNCF Réseau.
Combine data to accelerate predictive maintenance
Another example is the deployment of trackside cameras to monitor the health of pantographs, the devices that press against the catenary and power the trains. "This equipment very rarely breaks down, but when it does, it is a disaster, so we used to carry out periodic checks," explains Claude Solard. The challenge here is to move to a logic of predictive maintenance. The cameras are connected and run self-learning image-processing software capable of detecting tiny defects. The system was developed by the start-up GST, based in Dijon. "Six stations have been equipped with these cameras and SNCF Réseau plans to deploy 40 of them by mid-2018," Claude Solard said.
The acceleration of predictive maintenance will depend largely on SNCF's ability to cross-reference data from a variety of sources. Today, the company works with data from connected objects, employees' smartphones and measuring trains, but this data sits in silos. Cross-referencing this information should enable the teams to move from a descriptive model to a prescriptive one. "We want to be able to predict where we need to intervene and what the best intervention is, whether maintenance or regeneration," explains Benoît Tiers.
Each day, more than 10 million people, travellers and visitors, pass through SNCF station platforms. Interactive kiosks, travel information systems, but also the management and maintenance of stations and trains... SNCF is betting more than ever on a digital strategy. The first aim is to optimise travellers' movements through stations, so as to offer them the best services while maximising the sales generated. The project, named Magnolia, aims to better understand the flows of the roughly two billion travellers who transit every year through its 3,029 stations, by studying connections with other means of transport (taxis, buses, self-service bikes, etc.).
Thanks to the data generated by visitors' journeys through its stations, SNCF (the French national railway company) can also tell whether a traveller passes by shops or heads straight for the exit.
SNCF is also betting on Big Data to set up effective preventive maintenance. This is a real challenge for the Paris suburban trains: the goal is not only to diagnose breakdowns remotely, but to predict them, with forecasts refreshed every 30 minutes.
Data that was analysed manually yesterday, to identify technical problems and trigger maintenance operations, will be processed automatically tomorrow, thanks to a real-time view of the overall state of the rolling stock.
The system, which should be operational by the end of this year, uses a predictive analytics engine (machine learning), trained to learn train-failure scenarios by cross-referencing operating data with operational records.
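The learning step described here can be illustrated with a tiny model trained on past snapshots of operating data labelled "failed soon" or not. Everything below is synthetic: the features, the data and the logistic-regression model are invented for illustration, since SNCF's real engine and features are not public.

```python
import math
import random

random.seed(0)

def make_example():
    """One synthetic train snapshot: (temperature excess, fault count) -> failed?"""
    temp = random.uniform(0.0, 1.0)     # normalised motor temperature excess
    faults = random.uniform(0.0, 1.0)   # normalised recent fault count
    failed = 1 if temp + faults > 1.2 else 0  # hidden rule the model must learn
    return (temp, faults), failed

data = [make_example() for _ in range(500)]

# Logistic regression trained with plain stochastic gradient descent.
w = [0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(300):
    for (x1, x2), y in data:
        p = 1.0 / (1.0 + math.exp(-(w[0] * x1 + w[1] * x2 + b)))
        err = p - y
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def predict(temp, faults):
    """Probability that this train fails soon, under the toy model."""
    return 1.0 / (1.0 + math.exp(-(w[0] * temp + w[1] * faults + b)))

print(predict(0.9, 0.9))  # hot motor, many recent faults: high risk
print(predict(0.1, 0.1))  # healthy train: low risk
```

The point of the sketch is the workflow, not the model: cross-referenced historical data becomes labelled training examples, and the trained model then scores live data every 30 minutes to rank which trains need intervention first.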
Bus, metro, train, RER, tram... Today, it is estimated that a person spends an average of 53 hours per year in transport. This figure is expected to rise steeply, reaching 106 hours per year by 2050.
New technologies follow us on all our journeys: geolocation by mobile phone, electronic ticketing systems, maintenance systems... The slightest data point is recorded: travellers' entries and exits, incidents, delays... The flow of information is tremendous and endless. The user is himself an inexhaustible source of data, at once a consumer and a resource of Big Data.
But all this data still has to be processed and analysed. That is the aim of Big Data: to make the mass of collected data meaningful.
Big Data is becoming a real asset for today's transport operators: it offers a dynamic, real-time view of transport networks. We can now evaluate the performance of an operating system, analyse traveller behaviour, understand the evolution of flows, assess the impact of service changes and maintenance operations, and pinpoint accident-prone zones and traffic congestion hotspots.
Once analysed, this data helps the operator optimise its network and so improve the customer experience. Big Data thus becomes a real asset for all Smart Transport projects, whose objective is to cut the lead times and costs of transport by making it intelligent and autonomous.
The figures speak for themselves: today, 54% of the population lives in urban areas. By 2050, more than 65% of the world's population will live in cities. Urban issues will be amplified accordingly: cities will have to answer questions of mobility, energy efficiency, urban sprawl and quality of life.
The increasing digitalisation of every activity turns cities into data sources. Transport, energy, interactions, trade: all these flows generate enormous masses of information. Technological progress, the everyday use of the smartphone, the rapid rise of social networks and, more recently, collaborative consumption and the explosive growth in the consumption of digital services are quickly opening the path to smart cities.
The smart city, powered by Big Data, aims to be a relevant answer to the new challenges of urbanisation. Collecting the tremendous amounts of data continuously generated by sensors, the city's internal services, urban operators, local companies and citizens will improve the services provided to the population. The inhabitants themselves become a source of feedback that helps cities optimise their services even further.
Take Copenhagen, a key model among smart cities, which has launched a vast ultra-connected environmental programme: intelligent bins, air-quality measurement tools, smart parking and connected street lighting...
Data is everywhere, especially in e-commerce, where Big Data is now a tremendous springboard for personalising the customer relationship.
Amazon was the first to invest in Big Data. The IT colossus and its associated services have considerably changed the distribution landscape in France and worldwide, in terms of price competition, Internet traffic, customer behaviour and supply chain optimisation. Amazon is the undisputed leader in dynamic pricing and supply chain management.
An essential foundation of the new predictive marketing, Big Data has become a formidable competitive weapon. Far more than mere ad targeting, new technologies allow real-time action: we can now anticipate customers' needs and push them relevant offers suited to their profile. Structured data (user behaviour, order history, registration forms) and unstructured data (web analytics, social networks) combine to give a global, 360° view of the customer. But to be effective, these new customer-retention strategies have to be quick, because clients' needs and wishes change constantly.
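Building that 360° view means joining records about the same customer from several systems. The sketch below shows the idea with invented sources and field names (a CRM record, an order history and a clickstream); real pipelines do the same join at scale with proper identity resolution.

```python
# Illustrative data sources keyed by a hypothetical customer id "c42".
crm = {  # structured: registration form
    "c42": {"name": "Alice", "city": "Lyon"},
}
orders = {  # structured: ordering history
    "c42": [{"item": "headphones", "price": 79.0}],
}
web_events = {  # unstructured-ish: web analytics clickstream
    "c42": ["viewed:speaker", "searched:bluetooth", "viewed:headphones"],
}

def customer_360(cid):
    """Assemble one profile from every source that mentions this customer."""
    profile = dict(crm.get(cid, {}))
    profile["orders"] = orders.get(cid, [])
    profile["total_spent"] = sum(o["price"] for o in profile["orders"])
    events = web_events.get(cid, [])
    profile["interests"] = sorted({e.split(":")[1] for e in events})
    return profile

print(customer_360("c42"))
```

A profile like this is what lets a retailer push an offer matching both what a customer has bought and what they have merely browsed.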
At the heart of this international online customer experience management, the British start-up Qubit has raised $40m in funding to optimise its customer-experience data analysis solutions for e-retailers.