Groupement INI : Intégration Numérique pour l'Industrie

31 août 2017

Big Data will make or break the businesses of the next decade

To survive the next decade, businesses will need to be able to utilise Big Data.

Buzzword or not, it’s probably the most important feature of the fourth industrial revolution, a new economic asset that will define the near future of commercial evolution.

Structured data has been used by companies for a long time, but Big Data is distinct from this in three main ways: volume, velocity and variety. In simple terms, there is a lot of it, being produced at an extremely fast rate, and occurring in many different forms.

Businesses will need to evolve in their practices of data generation, acquisition, storage and analysis. Their success in these skills will dictate the new hierarchy within almost every industry.

Big Data is already transforming the way businesses operate. It is the key to the success of super-platforms such as Google, Amazon, and Facebook, and, via Internet of Things technology, will soon enable great improvements in transport, energy distribution and healthcare, as well as profound changes to the way we live.

Any future-facing business will need to become adept at collecting, processing and analysing data if it wishes to survive the next decade.

Several studies have aimed to measure the impact that data-driven decision-making (DDD) has on the performance of a company. MIT research this year, which included 330 US companies, found that businesses in the top third of their industry for DDD were “on average, five per cent more productive and six per cent more profitable than their competitors”.

In a study of Big Data technologies’ impact on businesses, the economist Prasanna Tambe identified “significant additional productivity growth”. The research found that “one standard deviation higher utilisation of Big Data technologies is associated with one to three per cent higher productivity than the average firm”.

Take Amazon’s hugely effective automated recommendations, which harness the masses of data the company collects about its customers. The US giant Sears has started doing the same, and, with the help of the data storage and processing software Hadoop, was able to reduce the time needed to generate personalised promotions from eight weeks to one.

A major US airline used Big Data technology to improve ETA predictions. Their savings at each airport are reported to be several millions of dollars each year. Visa uses Hadoop to process 73bn transactions in 13 minutes, having previously required a month.

The advertising industry has naturally been one of the forerunners in embracing Big Data. Real time response in digital advertising enables precise measurement of the performance of an ad, and this plethora of information can be complemented by data collected by websites and third party providers.

Digital advertising technology has enhanced the targeting, optimisation and measurement of marketing to an extent that was inconceivable in the previous century.

Advertising is, in some ways, a vision of the future for every industry. The only relevant ad agencies are those that have developed their ability to process and analyse data. The days of the HiPPO (Highest Paid Person’s Opinion) leading decision-making in this sector are long gone, and a far more scientific era has taken its place.

Big Data’s benefits to other sectors will be just as profound. McKinsey and Company researched the potential value of Big Data to five core industries in the US. As an example, they predicted savings of $300bn for US healthcare, if Big Data was used effectively.

In 2009, Google famously used Big Data to help identify people who had been infected during the flu pandemic. Based on users’ search behaviour, Google was able to forecast where the disease might spread to and predict where it had spread from.
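Conceptually, this kind of forecast rests on a simple idea: the volume of flu-related search queries in a region correlates with the number of cases there. A minimal sketch of that idea, with invented numbers and a plain least-squares fit (Google's actual model was far more elaborate):

```python
# Illustrative only: fit a line between weekly flu-related search volume and
# reported case counts, then estimate cases where only search data exists.
# All figures are invented for the example.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    b = my - a * mx
    return a, b

# Historical weeks: (search query volume, confirmed flu cases) -- invented.
history = [(120, 30), (200, 55), (310, 90), (450, 130), (600, 170)]
a, b = fit_line([q for q, _ in history], [c for _, c in history])

def estimate_cases(query_volume):
    """Estimated weekly cases for a given search volume."""
    return a * query_volume + b

print(round(estimate_cases(400)))
```

In practice the hard part, as Google later found, is that search behaviour drifts over time, so such a model must be constantly re-fitted against ground-truth case data.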

There really is no limit to the potential of data to improve how businesses operate. Whatever your industry, you should be doing whatever you can to get better at Big Data.

Source: City A.M. website, Thursday August 31, 2017. By Daniel Gilbert.

City A.M.'s opinion pages are a place for thought-provoking views and debate. These views are not necessarily shared by City A.M.

Posté par Nina Legras à 11:05 - - Commentaires [0] - Permalien [#]

28 juin 2017

Big data, big money: who benefits from the explosion of data?

We can be sure of one thing: data will revolutionise the world. But who will really benefit from it? In 2011, the consulting firm Gartner said: "Information is the oil of the 21st century, and analytics is the combustion engine". Metaphor or not, it is legitimate to ask who tomorrow's data tycoons will be. And what will be the methods of today's and tomorrow's giants: will they, like the oil giants in their day, operate at the limits of legality? Will there be oil spills? Will there be collusion with our politicians?

21st Century Petroleum

Our society, our economy and our lifestyles will be profoundly altered by data, which now carries much of the growth of Western countries. In this sense, it is the oil of the 21st century. But where a landowner once got rich from an oil well on his land, what about our data? What is its value? Will any of it come back to us? Every day, at every moment, each of us generates a mass of personal or professional data: data belonging to ourselves or our companies, published on the Internet, or harvested by third parties.

The tools for protecting privacy, property, and the rights to know, modify and delete are numerous: intellectual property, the CNIL, French law, international law... But what about the value of this data? The information from your connected watch and your smartphone, but also your photographs, videos, digital invoices and calendar appointments, speak about you in your place. They define, far more than you think, what you are, what you look for, what you like...

Your relationships and comments on Facebook can reveal many traits of your personality, including your sexual orientation. The more present you are on the web, the more precise the profile companies hold on you will be, and the better it will target your expectations. Your profile is not just what you enter in your personal information: it is the output of complex algorithms developed by experts in machine learning and Big Data. Some articles even claim that your bank can predict your divorce before you have taken any steps towards one! It is obvious that our data is now the main asset of Internet businesses.
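To make the idea concrete, here is a deliberately toy sketch of trait inference from "likes", in the spirit of the studies alluded to above. The pages, weights and trait are entirely invented; real systems learn such weights from millions of users with machine-learning models rather than hand-writing them:

```python
# Toy profile inference: score a personality trait from liked pages.
# Hypothetical "learned" weights: how much liking each page shifts the score.
weights = {
    "chess_club": 0.8,
    "quantum_podcasts": 0.9,
    "monster_trucks": -0.4,
    "reality_tv": -0.6,
}

def introversion_score(liked_pages):
    """Sum the weights of the user's likes; unknown pages contribute nothing."""
    return sum(weights.get(page, 0.0) for page in liked_pages)

def predict(liked_pages, threshold=0.0):
    """Positive total score => predicted introvert, else extrovert."""
    return "introvert" if introversion_score(liked_pages) > threshold else "extrovert"

print(predict(["chess_club", "quantum_podcasts"]))
print(predict(["reality_tv", "monster_trucks", "unknown_page"]))
```

The unsettling point of the research was precisely that a handful of such weak signals, aggregated at scale, become highly predictive.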

Monetisation at any price and the temptation to cheat


That is why many companies offer free e-mail and data-sharing services. A datum only has value in the mass. Running services with a global user base, which allow them to collect their users' data, is vital for GAFAM (Google, Apple, Facebook, Amazon and Microsoft), who are masters in the valuation and monetisation of our personal data.

In this unrestrained race, the algorithms that customise and improve our favourite services can sometimes be ethically debatable, even contrary to the law. When Machine Learning is applied to personal data, companies are systematically confronted with serious ethical and legal questions, and numerous examples demonstrate how complex it is to monitor and enforce the law. Take the investigation launched in May 2017 against Uber and its Greyball algorithm, which allowed it to detect police officers and thus escape controls. One could also cite the experiments carried out in great secrecy by Facebook, and the uninvited Facebook-Tinder collisions. Justice and the police are confronted with algorithms and remain ill-equipped for these complex cases, which defy the traditional rules of our societies.


Future data "oil spills"

If data is the new oil, what happens when it escapes? Whether through hackers deliberately stealing data or through accidental leaks, no one is immune. The news is full of revelations of large-scale incidents: the theft of 412 million accounts from the dating site AdultFriendFinder in 2016, or the error that allowed confidential information on the members of the G20 to be dispersed. Other examples could illustrate these massive leaks, but there is a much greater danger for users.


Many sites trade in your login data, and beyond the mere sale, the information is aggregated from different sources to make it more useful to hackers. Basic security rules dictate that users should not use the same login and password on different sites. But honestly, have you never broken this rule?

If so, hackers may have retrieved different pieces of information from several leaks in order to rebuild your digital identity (or identities). This information may allow them to enter other information systems, or to use your credentials to mislead your personal or business contacts.
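The mechanics of that aggregation are simple enough to sketch. Below, records from two separate fictional breach dumps, joined on the e-mail address, reconstruct a far richer profile than either leak alone; all the data is invented:

```python
# Why credential reuse is dangerous: merging two (fictional) breach dumps.
leak_dating_site = [
    {"email": "alice@example.com", "password": "hunter2", "city": "Lyon"},
]
leak_forum = [
    {"email": "alice@example.com", "password": "hunter2", "employer": "Acme"},
    {"email": "bob@example.com", "password": "qwerty"},
]

def merge_leaks(*leaks):
    """Aggregate all leaked fields per e-mail address."""
    profiles = {}
    for leak in leaks:
        for record in leak:
            profile = profiles.setdefault(record["email"], {})
            profile.update(record)
    return profiles

profiles = merge_leaks(leak_dating_site, leak_forum)
alice = profiles["alice@example.com"]
# The merged profile now links a password, a city and an employer, and a
# password seen identical across two sites is a prime credential-stuffing target.
print(sorted(alice))
```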

Beyond the known leaks, how many companies have not communicated about data leaks or, even worse, do not know that their data has been compromised? Companies can also be the victims of ransom attempts. In March 2017, a group of hackers claimed it would compromise hundreds of thousands of iCloud accounts if a ransom was not paid. Bluff or reality? Apple communicated neither on the claim nor on whether any ransom was paid. And, of course, there was the WannaCry case.

Oil spills are visible on our beaches; data leaks are not, and are often hidden from the general public. As you read this, it is not impossible that hackers are browsing your holiday photos on Google Drive while listening to your Deezer playlist (rest assured, it is not the same password...). Like a bird trying to extricate itself from the oil in which it is stuck, will you be able to survive the leak of all your data?


A strong geopolitical issue

Let us not be naive: mastering data is a primordial geopolitical stake. There is a tendency, one that large groups encourage, to believe that the Internet, the cloud and all the tools around data are beyond any notion of nationality. Yet, on closer inspection, the United States, and California in particular, hold a hegemony over our data. To caricature to the extreme: to the question "Who benefits from the explosion of our data?", one answer might be: Silicon Valley.

To be convinced of the geopolitical dimension of data, consider the decision of the American justice system, confirmed on appeal last April 19, obliging Google to provide data stored outside the United States. China has also understood this, pursuing a determined policy of protectionism that has allowed the BATX (Baidu, Alibaba, Tencent and Xiaomi) to thrive in the face of GAFAM. The BATX, strongly supported by the Chinese state, now aim to conquer the international market, and Europe in particular.


OPEC of the data

Europe, for its part, seems stuck in an outdated view of computing and the Internet. The beneficiaries of the data explosion will nevertheless be numerous there: telephone operators and digital services companies will profit from this revolution, not to mention the many start-ups emerging around the Internet of Things and data analysis. But make no mistake: the only real beneficiaries of the data explosion will be those who hold it within their data centres! Will GAFAM and the BATX, thanks to their infrastructures, create the OPEC of data, setting the price of your data on the world markets?

Source: - June 14th, 2017


21 juin 2017

Data from cities, a common good inseparable from buildings

Eric Cassar is an engineer and architect, and the founder of Arkhenspaces, whose "Habiter l'infini" project won the Grand Prix Le Monde Smart-cities 2017. In this op-ed, he calls for the creation of new ownership models for the digital data produced by buildings.

"The development of the Internet is akin to the arrival of new dimensions. Age 1 accelerated our exchanges, with e-mail, and then gave us access to a growing amount of information and services. Age 2 facilitated the linking of individuals with other individuals, with social networks. Age 3 is the continuous relationship of individuals with space, through the smart building or the smart city: a physical space in close relation with the digital space, thanks to fixed or moving connected objects and the generalisation of sensors in our cities.

Our buildings will therefore process an innumerable amount of new data, producing key information about the functioning of human settlements at different scales: the building, the block, the neighbourhood, the city, the territory. This includes data related to environments (energy consumption, footfall, access) but also data attached to new local social networks.

The effective use of this large amount of information will improve the functioning and efficiency of these complexes, in particular by correlating supply and demand, distributing needs and resources, and then anticipating them. It will be able to suggest, initiate or promote local social ties, and increase the number of local synergies.

A precious raw material

The processing of this data must respect certain fundamentals, such as the preservation of privacy, in order to guarantee everyone's free will by protecting their personal data. The data must therefore be properly anonymised and aggregated according to strict rules. This could involve several vigilance measures:
  • The interoperability of instruments and systems, divided into three layers: sensors, infrastructure, cloud. This model is supported by a whole group of players in the building and IT industries, grouped together within the SBA (Smart Building Alliance).
  • The requirement for access to classified, organised or tagged data: smart data. Just as one must know a language in order to read a text, a message must carry within it its decoder and knowledge of the environment where it was captured. A datum is worth nothing in itself. To exploit it, one must know its unit, where it was collected, by which sensor, and so on, so as to be able to trace the source of errors or malfunctions.
  • Securing access and information flow.
  • A right to disconnect for any inhabitant or citizen.
  • The establishment of human mediators between citizens and the digital world. New trades with "added human value" will emerge: a concierge will also be a community manager, and so on.
  • The constitution and regulation of independent trusted third parties who will be in charge of the storage of these data and their scheduling.
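The "anonymised and aggregated according to strict rules" requirement above can be sketched very simply: a trusted third party only publishes a per-zone figure when the group behind it is large enough that no individual can be singled out. The threshold K = 5 and the readings below are illustrative choices, not rules from the article:

```python
# Minimal sketch of k-anonymous aggregation of building sensor data.
K = 5  # minimum group size before a figure may be released (illustrative)

def publishable_averages(readings_by_zone, k=K):
    """Return {zone: mean reading} only for zones with at least k readings."""
    out = {}
    for zone, values in readings_by_zone.items():
        if len(values) >= k:
            out[zone] = sum(values) / len(values)
    return out

# Invented hourly energy readings per zone of a smart building.
energy_kwh = {
    "lobby": [1.2, 0.9, 1.1, 1.3, 1.0, 1.4],  # 6 readings: published
    "apartment_12": [3.7, 3.9],               # 2 readings: withheld
}
print(publishable_averages(energy_kwh))
```

Real anonymisation schemes are more subtle (they must also resist cross-referencing between releases), but the principle of a minimum aggregation size is the same.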
Creating new legal forms

Once these guarantees are in place, data will become a resource of increasing value over time, a raw material of great wealth for all companies wishing to offer new services.

Finally, there is the question of the ownership and accessibility of the data. From my point of view, the data produced in a place and linked to that environment must remain attached to that place. It is a common good, but the common good of a localised whole. The data captured inside a smart building belongs, once anonymised and aggregated, to the building.

It will therefore be necessary to create new legal forms that are inseparable from physical architectures. The data created will enrich, by aggregation, the virtual avatar of the building: its digital model, or BIM (Building Information Modeling).

Access to this data can be sold by the physical architecture of buildings and towns (roads, squares, intersections, etc.) to finance the storage and management of the data, but also the maintenance and management of the buildings and cities themselves.
From passive to active buildings

Buildings will shift from being passive entities that cost money to active entities that create relations. Good data management can eventually reduce the operating cost, and therefore the overall cost, of a building. The building will become active for the benefit of its inhabitants, sustaining itself in a virtuous circle. But this shift will require rethinking the role of every actor in the building industry.

The same is true, at another scale, for the smart city. In addition to continuous improvements for the benefit of its users, the new facilities will provide new resources, whose proceeds will reduce the costs necessary for the functioning of the city.

Access to the data can be decided democratically at the scale of the whole concerned: building, block, city. Indeed, the implementation of such tools will facilitate the consultation of all concerned citizens, including the silent majorities that are absent from today's public debates. These new challenges require that all actors address them collectively, bearing in mind that cities must introduce and/or liberate social, sensitive and imaginative intelligences alongside rational ones."
Source: Le Monde, May 11, 2017. By Eric Cassar


06 juin 2017

How Singapore exports its model for a Smart City

This tiny country has established itself as a model for smart urban development and exports its know-how to giants like India and China.

Singapore is tiny beside the giants of India and China; it nevertheless manages to play an important role where ‘smart cities’ are concerned. With its 4.7 million inhabitants, ‘little’ Singapore has established itself as one of the world’s models for the ‘smart city’. The city-state plays a key role in the extraordinary development of the two Asian giants, China (1.4 billion inhabitants) and India (1.3 billion). It experiments on all fronts, loves innovations and multinationals, and attracts clients from all over the world.

Playing in part on its community of Chinese origin (75% of the population), Singapore has signed several partnership agreements with China for the testing and further development of smart cities. This is the case, among others, for the business park in Suzhou and the high-tech eco-island of Nanjing, the former capital, which has a population of over 8 million. The experiments are then reproduced in other cities, including the eco-city of Tianjin and the City of Knowledge in Guangzhou. This methodology “provides a platform enabling Singaporean and Chinese firms to demonstrate their capacities in matters of technologies in a holistic manner”, explains the INFOCOMM Agency in Singapore.

Data collection on a massive scale

And China is not alone. To develop its high-priority project of creating 100 smart cities by the year 2020, the Indian Prime Minister, Narendra Modi, has also turned towards the know-how and investment capacity of the Singaporeans (over 7% of whom are of Indian origin). Singapore scores points where Hong Kong, further away geographically and committed to a long-standing bet on Information and Communication Technologies (ICT), is marking time.

In more specific sectors, the nation-state positions itself as a life-size, world-scale laboratory of the city of tomorrow, with cutting-edge experiments on autonomous vehicles or on ethnic diversity by neighbourhood and even by apartment building. Above all, data collection on a massive scale, combined with the predictive intelligence of Big Data, is used in every sector for modelling projects, planning changes and endeavouring to offer the most innovative services, whether a question of traffic fluidity, security, the comfort of buses or the location of child-care centres. The municipality has, moreover, implemented a measure that is often a source of anxiety elsewhere: a sophisticated road-pricing system, the price of which varies as a function of the amount of traffic, the neighbourhood, the time and the day of the week.
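A toll that varies with traffic, zone, time and day can be sketched as a simple pricing function. The zones, base rates and multipliers below are invented for illustration; Singapore's actual Electronic Road Pricing scheme uses gantry-specific tariff tables:

```python
# Hypothetical dynamic road toll: price depends on zone, congestion, and peak hours.
ZONE_BASE = {"cbd": 3.0, "suburbs": 1.0}  # base toll in dollars (invented)

def toll(zone, traffic_ratio, hour, weekday):
    """traffic_ratio: current flow / nominal capacity (roughly 0.0 to 1.5)."""
    price = ZONE_BASE[zone]
    price *= 1.0 + max(0.0, traffic_ratio - 0.5)  # congestion surcharge
    if weekday and (7 <= hour < 10 or 17 <= hour < 20):
        price *= 1.5                               # peak-hour multiplier
    return round(price, 2)

print(toll("cbd", traffic_ratio=1.2, hour=8, weekday=True))      # congested rush hour
print(toll("suburbs", traffic_ratio=0.3, hour=14, weekday=False))  # quiet weekend afternoon
```

The design point is that the same trip has no fixed price: the function is re-evaluated against live traffic data, which is exactly what requires the massive sensor coverage described above.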

The brand knows how to sell

With the support of this strong position, Singapore organises a World Cities Summit each year. In 2016 this brought together mayors and leaders from 103 cities in 63 countries: all either clients already or potentially so. The brand knows how to sell and it sells itself well. It knows how to rely on its powerful financial community and, for implementation, on various private firms, including Surbana Jurong, present all over the world.

In 2014, the Prime Minister launched the Smart Nation programme, placed under the responsibility of the Minister for Foreign Affairs, Vivian Balakrishnan, thus displaying its ambition to look further afield. A cardinal virtue: the government has understood the importance of a systemic vision for improving cities by means of technology.

Both at home and abroad, Singaporeans focus on sustainable development and concern for citizens in the form of provision of quality services. But, overall, their model tends more to the ‘datapolis’ side than to the ‘participolis’ side: it prioritizes collection and processing of data more than the effective participation of citizens.

This model has its limits, as Anthony Townsend, a researcher at New York University, points out. In an article published by the Technology Review, he states that “The perfectly controlled and efficient utopia of a very safe and smart city can work in a place like Singapore. But it would probably never work in New York or Sao Paulo, where expectations in terms of conception, and of what makes the vitality of a community, are completely different”. Townsend thus confirms that there is no single model of a smart city. This in no way prevents this small nation-state from innovating or from finding clients all over the world.



31 mai 2017

Industry: big data for maintenance

The customer has taken control of the industrial world. Risking a production breakdown is no longer an option when the whole production chain must work under tension: customer deadlines, supply, and so on. Time to market has become a key issue of differentiation, and sometimes even of survival. Maintenance is now a strategic function in companies. The good news: innovative technologies will make it possible to move from penalising corrective maintenance to reliable preventive maintenance!

Who still believes that repairing a breakdown, even in the shortest possible time, is enough to satisfy the increasingly pressing demands of customers on deadlines? Growing competition and the unrestrained race for competitiveness lead to a search for total quality and, above all, for cost reduction. Forecasting malfunctions, anticipating breakdowns: over the years, technology has come to the rescue of the maintenance activity. Many innovations are invading production lines, sometimes requiring very rapid adaptation by the teams in place.

The implementation of innovative tools, such as equipment connectivity, makes it possible to deliver maintenance that is more predictive and much less corrective. Who has not dreamed of identifying an equipment malfunction before the problem occurs, so as to intervene before the breakdown?

While most manufacturers offer diagnostic solutions over the Internet to remedy incidents as quickly as possible, today's machine connectivity allows preventive analyses that trigger alerts and operator interventions before the failure. In this area, as in many others, artificial intelligence is progressing rapidly to avoid production disruptions.
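The "alert before the failure" idea can be sketched in a few lines: instead of waiting for a sensor reading to cross the failure limit, the monitor watches the trend of recent readings and raises an alert when the drift predicts the limit will soon be crossed. The limit, horizon and vibration data below are illustrative assumptions, not values from any real system:

```python
# Minimal predictive-maintenance sketch: alert on a rising sensor trend.
FAILURE_LIMIT = 10.0  # vibration level at which the machine is deemed at risk
HORIZON = 5           # alert if the limit would be reached within 5 readings

def trend_alert(readings, limit=FAILURE_LIMIT, horizon=HORIZON):
    """True if the average drift would carry the signal past the limit soon."""
    if len(readings) < 2:
        return False
    last = readings[-1]
    drift = (readings[-1] - readings[0]) / (len(readings) - 1)  # avg change per step
    return drift > 0 and last + horizon * drift >= limit

stable = [4.1, 4.0, 4.2, 4.1, 4.2]   # no alert: flat trend far below the limit
rising = [6.0, 6.8, 7.5, 8.1, 8.9]   # alert: trend projects past the limit
print(trend_alert(stable), trend_alert(rising))
```

Production systems replace this naive linear drift with learned models, but the structure (continuous readings in, early warning out) is the same.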

However, it is not enough to implement a dialogue limited to machines and platforms. These are necessarily collaborative and, good news, humans still have their place in equipment monitoring. This technical monitoring can now be carried out by the entire community of stakeholders concerned: the user, the customer, the salesperson, the technician, and even the equipment itself, which collaborates with humans for better production monitoring.

The suppliers of portable connected objects, the "wearables", have discovered that the professional market is as receptive as the consumer market. As connected objects begin to enter the everyday life of companies, wearables, at the heart of this trend, bring new comfort but also increased efficiency to operators and technicians in the field. Connected eyeglasses, for example, help improve the efficiency of interventions and provide additional technical support to the after-sales service of the distributor or manufacturer.

Most companies have seen a major shift in the proportion of maintenance activities entrusted to specialised external firms. Companies tend to refocus on their main function and to delegate what is not their core business. Entrusting the maintenance function to an external company guarantees strong expertise and, above all, outsourcing makes it possible to minimise costs: the company pays for a service and does not have to bear a permanent internal charge.

While many operators are attempting to catch the maintenance train of the future, some players realised long ago that connected, predictive and collaborative maintenance is at the heart of the success of 21st-century industry.

This is the case of SAV Réso, located in Ozoir-la-Ferrière. In partnership with OptimData (a start-up specialised in the interpretation of Big Data), it offers to equip its customers' machines with a solution called Need & Use. This solution fits the machines with a GPRS box that allows them to communicate, enabling the harvesting and analysis of real-time usage data.

All of this data is intended to provide critical information in order to build an effective predictive maintenance plan, increase the life of the equipment, reduce the probability of operational failures and reduce downtime in the event of an overhaul or failure. In addition, preventive maintenance aims to avoid additional costs such as lubricants, spare parts and costly corrective maintenance interventions, and even to make employees' work safer.

By Jessica Grasser | 22/05/2017


23 mai 2017

The IoT, the new SNCF locomotive

To speed up its digital transformation, SNCF is focusing on the industrial Internet. Today, 400 employees of the transport group work on some fifty IoT projects at different stages of development. With the Internet of Things, the group intends to improve the security of its network and the quality of its service, and to generate significant productivity gains. Here's how.

SNCF has set its sights on the Internet of Things. "In our company, IoT is the main lever for performance and efficiency. The true technological revolution is the IoT," said Guillaume Pepy, the group's president, on Thursday, May 18, 2017, at a press conference devoted to the digital transformation of the company, to which the group will devote an additional 900 million euros over the next three years.

400 people dedicated to IoT

"Today, 400 people at the SNCF are working in the IoT, representing 50 teams in different departments of the group," said Benoît Tiers, CEO of e-SNCF, a new entity of the group bringing together the 4,000 employees dedicated to digital and information systems. Today, SNCF counts five industrial Internet projects in the industrialization phase, 12 in pre-industrialization and 30 in the experimental phase. "And 11 of our technocentres are engaged in the transformation of the plant of the future," adds Benoît Tiers.

Among the five IoT projects in the industrialisation phase is a system of connected sensors that replaces mercury thermometers. "Before, we retrieved the data manually by touring the hot stretches of track during summer; we are now able to measure the temperature remotely with the deployment of 500 connected sensors. This improves the punctuality of our trains, as we reduce their speed only on the portion where the sensors send an alert indicating that the temperature is above 45 degrees," explains Claude Solard, Deputy General Manager Security, Innovation & Industrial Performance at SNCF Réseau.
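The logic Claude Solard describes, restricting speed only on the portions that actually report an over-threshold temperature, amounts to a simple per-segment filter. The sensor IDs and readings below are invented; only the 45-degree threshold comes from the article:

```python
# Sketch of per-portion rail temperature alerting, as described by SNCF Réseau.
ALERT_TEMP_C = 45  # threshold quoted in the article

def restricted_portions(readings):
    """Return the track portions whose sensors exceed the alert temperature."""
    return sorted(portion for portion, temp in readings.items() if temp > ALERT_TEMP_C)

# Invented readings from three connected sensors along a line.
readings = {
    "km12-km14": 38.5,
    "km14-km16": 47.2,  # hot spot: trains slow down only here
    "km16-km18": 44.9,
}
print(restricted_portions(readings))
```

The punctuality gain comes precisely from this granularity: without per-portion sensors, the speed restriction would have to cover the whole line.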

Combine data to accelerate predictive maintenance

Another example is the deployment of ground cameras to monitor the health of pantographs, the devices that connect trains to the catenary and allow them to run. "This equipment rarely breaks down, but when it does, it is a disaster, so we used to carry out periodic checks," explains Claude Solard. The challenge here is to move to a logic of predictive maintenance. The cameras are therefore connected and carry self-learning image-processing software capable of detecting tiny defects. The system was developed by the start-up GST, based in Dijon. "Six stations have been equipped with these cameras and SNCF Réseau plans to deploy 40 of them by mid-2018," Claude Solard said.

The acceleration of predictive maintenance operations will depend to a large extent on SNCF's ability to cross-reference its data from a variety of sources. Today, the company works on data from connected objects, employee smartphones and measuring trains, but this data sits in silos. Crossing this information should enable the teams to move from a descriptive model to a prescriptive one. "We want to be able to predict where we need to intervene and what is the best intervention to carry out, whether it is maintenance or regeneration," explains Benoît Tiers.

100 million euros in savings thanks to digitalisation

The French mobility giant sees the IoT as a means of improving network security and the quality of service rendered to its customers, but also of improving economic performance. Overall, SNCF hopes to realise 100 million euros in productivity gains over the next five years thanks to digital technology, including the IoT.

By Juliette Raynal, May 18th, 2017

15 mai 2017

SNCF : Big Data on the right tracks

Each day, more than 10 million people, travellers and visitors, cross SNCF station platforms. Interactive kiosks, travel information systems, but also the management and maintenance of stations and trains... SNCF is betting more than ever on a digital strategy: first, to optimise the movements of travellers in stations, so as to offer them the best services while maximising the sales generated. The project, named Magnolia, aims to better understand the flows of the roughly two billion travellers passing in transit every year through its 3,029 stations, by studying the connections with other means of transport (taxis, buses, self-service bikes, etc.).

Thanks to the data generated by visitors' journeys through stations, SNCF (the French national railway company) can also know whether a traveller passes by shops or goes directly towards the exit.

SNCF is also betting on Big Data to set up effective preventive maintenance. This is a real challenge for the Parisian suburban trains: the aim is not only to diagnose breakdowns remotely, but also to predict them every 30 minutes.

Data which yesterday was analysed manually to identify technical problems and trigger maintenance operations will tomorrow be processed automatically, thanks to a real-time view of the global state of the rolling stock.

The system, which should be operational at the end of this year, uses a predictive-analytics engine (machine learning), trained to learn the failure scenarios of trains by cross-referencing equipment operating data with service operations data.
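SNCF's actual engine is not public, so as a hedged illustration of "learning failure scenarios" from labelled operating data, here is a toy nearest-centroid classifier. The features (motor temperature, door faults per day) and all the numbers are invented; a production system would use far richer features and models:

```python
# Toy failure-scenario learner: classify a 30-minute operating snapshot as
# "headed for failure" or not, by distance to labelled historical centroids.

def centroid(rows):
    """Component-wise mean of a list of equal-length feature tuples."""
    cols = list(zip(*rows))
    return [sum(c) / len(c) for c in cols]

def sq_dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# (motor_temp_C, door_faults_per_day) snapshots labelled from past breakdowns.
healthy = [(60, 0), (62, 1), (58, 0), (61, 1)]
failing = [(85, 4), (90, 6), (88, 5)]

C_HEALTHY, C_FAILING = centroid(healthy), centroid(failing)

def predict_failure(snapshot):
    """True if the snapshot is closer to the failure centroid than the healthy one."""
    return sq_dist(snapshot, C_FAILING) < sq_dist(snapshot, C_HEALTHY)

print(predict_failure((87, 5)), predict_failure((59, 0)))
```

The "every 30 minutes" requirement in the article is then just a matter of re-running this scoring on each fresh snapshot from the train.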



Big Data, the new fuel of the public transportation

Bus, metro, train, RER, tram... Today, a person is estimated to spend an average of 53 hours per year in transport. This figure should rise steeply by 2050, to reach 106 hours per year.
New technologies follow us in all our travels: geolocation by mobile phone, electronic ticketing systems, maintenance systems... The slightest data point is recorded: the entries and exits of travellers, the incidents, the delays... The flow of information is tremendous and endless. The user himself is an inexhaustible source of data, at once a consumer and a resource of Big Data.

These data still need to be processed before they can be analysed. That is the whole point of Big Data: making the mass of collected data meaningful.
Big Data is becoming a real asset for transport operators: it offers a dynamic, real-time view of transport networks. Operators can now evaluate the performance of an operating system, analyse traveller behaviour, understand how flows evolve, assess the impact of service changes and maintenance operations, and pinpoint accident-prone zones and traffic-congestion nodes...
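One concrete slice of this: the entry/exit events mentioned above can be aggregated to surface congestion nodes. A minimal sketch, with made-up stations, timestamps and event shapes:

```python
# Illustrative event log as a ticketing system might emit it:
# (station, timestamp, kind) tuples. All values are invented.
from collections import Counter

events = [
    ("Châtelet", "2017-08-31T08:05", "entry"),
    ("Châtelet", "2017-08-31T08:07", "entry"),
    ("Châtelet", "2017-08-31T08:09", "entry"),
    ("Nation",   "2017-08-31T08:06", "entry"),
    ("Nation",   "2017-08-31T08:06", "exit"),
]

def peak_stations(events, top=2):
    """Count entries per station to surface likely congestion nodes."""
    entries = Counter(station for station, _, kind in events if kind == "entry")
    return entries.most_common(top)
```

A production system would of course stream millions of such events and window them by time of day, but the aggregation idea is the same.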

Once analysed, these data help the operator optimise its tools and thus improve the customer experience. Big Data then becomes a real asset for all Smart Transport projects, whose objective is to reduce the journey times and costs of transport modes by making them intelligent and autonomous.



Big Data and smart cities: the sacred union

The figures speak for themselves: today, 54 per cent of the population lives in urban areas; by 2050, more than 65 per cent of the world's population will live in cities. Urban issues will be amplified accordingly: cities will have to answer questions of mobility, energy efficiency, urban sprawl and quality of life.

The increasing digitalisation of every activity turns cities into sources of data. Transport, energy, social interactions, trade: all these flows generate enormous masses of information. Technological progress, daily smartphone use, the rapid rise of social networks and, more recently, collaborative consumption and the explosive growth in the consumption of digital services are quickly opening the path to smart cities.

Thanks to Big Data, the Smart City aims to be a relevant answer to the new challenges of urbanisation. Collecting the tremendous amounts of data continuously generated by sensors, the city's internal services, urban operators, local companies and citizens will improve the services provided to the population. The inhabitants themselves become a source of feedback that helps cities optimise their services even further.

Take Copenhagen, a key model among smart cities, which has launched a vast ultra-connected environmental programme: intelligent rubbish bins, air-quality measurement tools, smart parking, connected street lamps...
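The air-quality example can illustrate the basic smart-city loop: poll sensor readings per district and alert the relevant services when a threshold is exceeded. The district names, the PM10 metric and the 50 µg/m³ limit are assumptions for the sketch, not Copenhagen's actual configuration.

```python
# Illustrative smart-city loop: flag districts whose average PM10 reading
# exceeds a threshold so city services can react. Values are invented.
PM10_LIMIT = 50  # µg/m³, an assumed daily-average threshold

def districts_over_limit(readings, limit=PM10_LIMIT):
    """readings: {district: [hourly PM10 values]} -> sorted list to alert."""
    return sorted(district for district, values in readings.items()
                  if sum(values) / len(values) > limit)
```

The same pattern, continuous sensor feed, aggregate, compare, act, applies to the bins, parking and lighting mentioned above.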



Posté par Nina Legras à 15:20 - - Commentaires [0] - Permalien [#]
Tags : , , , ,

Big Data, a formidable weapon for digital retail

Data is everywhere, especially in e-commerce. Big Data is now a tremendous springboard for personalising the customer relationship.

Amazon was the first to invest in Big Data. The IT colossus and its associated services have considerably changed the landscape of French distribution, and worldwide too: price competition, Internet traffic, customer behaviour, supply-chain optimisation... Amazon is the undisputed leader in dynamic-pricing management and in the supply chain.

An essential basis for the new predictive marketing, Big Data has become a formidable competitive weapon. Far beyond targeted advertising, new technologies allow real-time action: we can today anticipate customers' needs and push them relevant offers suited to their profiles. Structured data (user behaviour, order history, registration forms) but also unstructured data (web analytics, social networks) all help to build a global, 360° view of the customer. But to be effective, such customer-retention strategies have to be quick, because client needs and wishes change constantly.
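The 360° view described above amounts to merging structured records with behavioural signals into one profile, then mapping that profile to an offer in near real time. A minimal sketch, with invented fields and segmentation rules:

```python
# Hypothetical sketch: combine structured data (order history) with a
# web-analytics signal (page views by category) into one customer profile,
# then pick an offer. Fields, thresholds and offers are illustrative.
def build_profile(orders, pages_viewed):
    """Flatten structured and behavioural data into one 360° view."""
    return {
        "total_spent": sum(order["amount"] for order in orders),
        "favourite_category": (max(pages_viewed, key=pages_viewed.get)
                               if pages_viewed else None),
    }

def pick_offer(profile):
    """Map the profile to an offer segment; rebuild profiles often,
    since needs and wishes change constantly."""
    if profile["total_spent"] > 500:
        return f"loyalty discount on {profile['favourite_category']}"
    return "welcome voucher"
```

Real predictive-marketing stacks score offers with trained models rather than two hand-written rules, but the profile-then-decide structure is the core of the approach.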

At the heart of this international online customer-experience management, the British start-up Qubit has raised $40 million in funding to refine its customer-experience data-analysis solutions for e-retailers.


