Groupement INI : Intégration Numérique pour l'Industrie

February 22, 2018

Retail: Never without my mobile

70% of consumers now use their mobile in stores.

Locating a store, checking reviews, comparing prices: the mobile phone has become indispensable to consumers, even in their offline purchase journey. To better understand the smartphone's growing role and impact in stores, the MMA has just published its Guide to Mobile Interactions at the Point of Sale.

Two years after its first edition, the Mobile Marketing Association France has announced a major update of its Point of Sale Interaction Guide. "70% of consumers now use their mobile in stores. In this guide, retailers can discover the latest technologies, optical or radio, for interacting with consumers. But above all, this publication offers them a real user manual for accelerating the digitalization of their point of sale, with an important focus on the management of personal data at the dawn of the GDPR in 2018," explains Anh-Vu Nguyen, head of the "Mobile interactions at the point of sale" working group at the MMA.

In addition to an overview of the different technologies (NFC, BLE, Wi-Fi, image recognition, QR code and now VLC) and the use cases available to retailers, the guide offers a new legal section: Maître Thomas Beaugrand, lawyer at Staub & Associés and partner of the association, shares his view of the legal stakes and the consequences of the GDPR for the collection and exploitation of data. "The power of the e-commerce giants today rests on data exploitation, and traditional retailers can also benefit from it to better understand their consumers, optimize the organization of their point of sale, or maximize their sales."

"However, this requires a clear understanding of the new legal framework of the General Data Protection Regulation (GDPR, 2018), clearly distinguishing good practices from bad ones in data collection and exploitation," adds Anh-Vu Nguyen.

This guide was produced by member companies of the Mobile Marketing Association France: Atsukè, EzeeWorld, Fidzup, SLMS, Snapp' & userADgents, with the support of the AFSCM and the law firm Staub & Associés, and thanks to the backing of BNP Paribas and Hello bank!


February 16, 2018

How EDF tests IoT technologies

EDF is working with Nokia to test IoT technologies in its R&D laboratories in Ile-de-France. The group's various business lines are also working on big data.
EDF is moving slowly but surely towards the Internet of Things. After carrying out a general-interest study on the subject for all of the group's businesses in 2015, the energy company's R&D division began a more concrete testing phase at the end of 2017. It has teamed up with Nokia to evaluate the use of IoT technologies in an industrial environment.
"We want to define the strengths and weaknesses of each protocol and technology, in an agnostic approach, beyond the marketing promises, to enlighten the group's business lines," summarizes Stéphane Tanguy, director of research programs on information tools and systems at EDF R&D. He insists on the importance of conducting real tests in the laboratory rather than relying solely on the simulation of physical phenomena. "It is important to see how sensors and protocols behave in an industrial environment, how they go from a 'sleep' mode to an 'active' mode for example, to know the real energy consumption, and to understand the exchanges between objects and protocols," he explains.
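The sleep/active behaviour described above translates directly into an average power budget, which is the kind of figure laboratory tests try to pin down. The sketch below is a generic duty-cycle model, not EDF's methodology; all current, timing and voltage values are hypothetical.

```python
# Generic duty-cycle model for an IoT sensor node's average power draw.
# All current, timing and voltage figures are hypothetical examples,
# not EDF or vendor measurements.

def average_power_mw(sleep_ma, active_ma, active_s, period_s, voltage_v=3.3):
    """Average power (mW) for a node that wakes every `period_s` seconds,
    stays active for `active_s` seconds, and sleeps the rest of the time."""
    sleep_s = period_s - active_s
    avg_current_ma = (active_ma * active_s + sleep_ma * sleep_s) / period_s
    return avg_current_ma * voltage_v

# A node drawing 0.01 mA asleep and 40 mA during a 2-second transmission
# every 10 minutes (600 s):
p = average_power_mw(sleep_ma=0.01, active_ma=40.0, active_s=2.0, period_s=600.0)
print(f"average draw: {p:.2f} mW")  # ≈ 0.47 mW
```

Comparing such figures across protocols is precisely why real measurements matter: datasheet numbers rarely match behaviour in an industrial environment.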
EDF tests a wide range of technologies
EDF, given the nature of its activity, wants to ensure the robustness of the different IoT solutions before any massive deployment. Among the parameters monitored closely by the R&D teams are the endurance of objects, their consumption, and the cost per use. On the network side, the signal's ability to penetrate materials (especially concrete), geolocation services and, of course, cybersecurity are reviewed. EDF tests both cellular technologies (NB-IoT, LTE-M) and networks using free frequencies (Sigfox, LoRa).
These tests, which will run until the summer of 2018, will allow EDF to build an IoT toolbox from which the group's various businesses will be able to draw for their projects. In parallel with this R&D work, divisions of the group have already developed POCs (proofs of concept) for various possible IoT use cases, such as the protection of workers in remote areas or geo-tracking. "We had identified a hundred different use cases for our businesses in our 2015 study; the potential is significant," says Stéphane Tanguy. "Deployment schedules differ from one business line to another."
Big data & data science
The purpose of the IoT is to gather new data that is useful for improving operational efficiency, saving money and creating new services. But the company already has a lot of data that it seeks to exploit better. "We already have process sensors (temperature, pressure) that are used for monitoring but do not necessarily communicate in real time. They already provide us with valuable information, which used to be confined to separate, heterogeneous applications. We are pouring it into data lakes to de-silo it and analyze it. The goal is to apply algorithms to find correlations, develop applications, make inter-plant comparisons, learn from the data," details Stéphane Tanguy.
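As an illustration of the correlation mining mentioned above, here is a minimal sketch that pools two sensor series, as they might arrive from formerly separate applications, and computes their Pearson correlation. The field names and readings are invented, not EDF data.

```python
# Minimal sketch of cross-signal correlation on pooled sensor readings.
# The series below are invented illustrative data, not EDF measurements.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Temperature and pressure series once "de-siloed" into a common store:
temperature_c = [20.1, 21.4, 22.0, 23.5, 24.1, 25.0]
pressure_bar = [1.01, 1.03, 1.05, 1.08, 1.09, 1.11]

print(f"Pearson r = {pearson(temperature_c, pressure_bar):.3f}")
```

In production this would run over a data lake with a distributed engine rather than Python lists, but the analytical step, looking for correlated signals across formerly siloed applications, is the same.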
While EDF does not appear particularly ahead on the IoT, it is rather well positioned on big data. "When I talk to other manufacturers, I see that we are well placed to put in place innovation mechanisms for applying new data-science techniques to our data."
By Sylvain Arnul, Usine Digitale, February 14th, 2018

Posted by Nina Legras at 10:39

January 10, 2018

Ford and Autonomic are building a smart city cloud platform

Ford and Silicon Valley-based Autonomic will work together to build a new open platform, called the “Transportation Mobility Cloud,” upon which cities can build out communications infrastructure, including connected traffic lights and parking spots. Ford CEO Jim Hackett announced the news on Monday at the CES 2018 keynote kicking off the annual conference.

The platform is designed to help connect smart transportation services, as well as adjacent connected offerings, uniting them with one common language to help coordinate all these efforts in real time. That means tying together personal cars with built-in vehicle-to-everything communications, and incorporating things like bike-sharing networks and public and private transportation services, including buses, trains, ride-hailing and beyond.

The Transportation Mobility Cloud will support location-based services, determining routes, sending out alerts about things like service disruptions, handling identity management and payment processing, as well as dealing with data gathering and analytics. It’s intended not only as a kind of connective tissue for the forward-thinking services and vehicles that will make up the smart city of tomorrow, but also as a platform upon which new apps and services can be built from the wealth of data available.

Ford says to think of it like “a box of Legos” with pieces that can be quickly taken apart and reassembled to build new types of assets and products to better serve city residents. It’s intended to be flexible enough to work with all partners, and to change from city-to-city depending on local requirements and implementation specifications.

In a blog post detailing the news, Ford suggests some possible uses to illustrate what the platform could do, including routing autonomous vehicles away from the arteries most densely clogged with human-driven cars at times of peak traffic and rerouting cars on the fly to help reduce congestion, or even letting cities fence off areas of the city as EV-only zones to help mitigate air quality and emissions issues.

Ford stresses that it has designed this platform “for everyone,” a broad group that includes transit service operators as well as competing automakers, whom it invites to join the effort in order to help make it as widely compatible as possible. Ford says it hopes to use its open approach to drive adoption to the point where it can claim to be the smart city platform with the most connected vehicles by the end of 2019, and eventually it hopes to achieve a 100 percent compatibility rate with vehicles and services on the road.

It’s a massive undertaking, but if successful, it could pave the way for cities better able to launch and incorporate Ford’s growing stable of mobility service offerings, including things like the last-mile shared commute service Chariot, as well as Ford GoBike and its forthcoming autonomous ride-hailing fleets. Teaming with Autonomic, a company that Ford invested in last year, will help it ramp up quickly, since the Palo Alto company’s staff has extensive experience building platforms intended for integration on a broad scale, including Amazon Web Services.

Part of the promise of ride-hailing has been that it would reduce congestion in cities, but studies show the opposite is true, which Ford says it hopes to help correct with a platform like this, one that can help optimize the rollout of such services and their integration into existing traffic flows.

Source: by Darrell Etherington, January 9th, 2018

Posted by Nina Legras at 15:36

January 3, 2018

Ericsson forecasts 20 billion IoT-connected devices by 2023, including 1.8 billion connected via cellular networks

According to the latest update of the mobility report from the Swedish equipment manufacturer Ericsson, the number of objects connected to the IoT (connected cars, industrial machines, meters, sensors, consumer products such as wearables, etc.) should grow by 19% per year on average worldwide through 2023, to reach a total of 20 billion units.

By then, the total number of devices connected to a network (including mobile phones, PCs and tablets) will reach 30 billion units.
Ericsson distinguishes objects connected by short-range wireless technologies (Wi-Fi, Bluetooth, Zigbee, etc.) from those connected to a long-distance network (3G, 4G, 5G, NB-IoT, and LPWA networks such as Sigfox or LoRa).
Objects connected by short-range technologies will be by far the majority (17.4 billion units in 2023), and their number will increase by 18% per year on average through 2023.

By contrast, objects connected to a long-distance network will be very much in the minority ("only" 2.4 billion objects in 2023, up from 600 million in 2017), but their number will grow faster: +26% per year on average.
At the end of 2017, Ericsson estimates, about 500 million objects were connected to the IoT through a cellular connection. Their number is expected to reach 1.8 billion units by 2023, or 75% of the long-distance category. Clearly, Ericsson, one of the leading manufacturers of cellular network infrastructure, believes that Sigfox and LoRa will not break through... which remains to be verified.
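The growth figures quoted above are internally consistent, which a quick compound-growth check confirms. The sketch below is illustrative arithmetic, not Ericsson's forecasting model.

```python
# Cross-checking the article's figures with simple compound annual growth.
# This is illustrative arithmetic, not Ericsson's forecasting methodology.

def project(start, annual_rate, years):
    """Projected count after `years` of compound growth at `annual_rate`."""
    return start * (1 + annual_rate) ** years

# 600 million long-distance IoT connections in 2017 growing 26 %/year
# for six years lands almost exactly on the forecast 2.4 billion:
units_2023 = project(600e6, 0.26, 6)
print(f"{units_2023 / 1e9:.2f} billion")  # ≈ 2.40 billion

# And 1.8 billion cellular connections out of those 2.4 billion is the
# 75 % share of the long-distance category quoted above:
print(f"{1.8e9 / units_2023:.0%}")
```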

The long-distance segment is currently dominated by GSM/GPRS technology, but unsurprisingly, by 2023 4G LTE and 5G cellular technologies will take over. 4G will then account for the majority of cellular IoT connections, while 5G will support the most critical applications.
The first cellular IoT networks based on Cat-M1 and Narrowband IoT (NB-IoT) technologies were launched in early 2017, and Ericsson currently counts more than 20 cellular IoT networks using these technologies in commercial operation around the world.

Source: November 29th, 2017

Posted by Nina Legras at 10:24

December 20, 2017

Artificial intelligence will upend the banking professions

Advisers assisted by robots, HR departments augmented by software... a study lists the changes underway in the banking sector.

Will the HR departments of the banks soon become "HRMDs", "Human Resources and Machinery Departments"? The idea, a bit frightening, is among the observations of a report from the consulting firm Athling, "Artificial intelligence in the bank: employment and skills", which will be released on December 7. This study, commissioned and steered by the Observatoire des métiers de la banque, the statistical and forward-looking body of the French banking industry, sought to evaluate the consequences that artificial intelligence (AI) could have on the sector.

While trade union organizations warn of employees' growing concern about the arrival of these cognitive technologies, the authors of the study, "aware of the tensions", deliberately chose not to quantify the number of jobs that could be eliminated. Such an assessment is considered "too dependent on the strategies of the institutions and on exogenous factors (regulation, economic activity...)", warns the report, of which Le Monde obtained a copy.

But, these precautions taken, the document does not hide the upheaval ahead for the sector, first of all because banks were among the first companies to computerize their operations and thus hold data on millions of customers "with considerable historical depth", an indispensable raw material for artificial intelligence.

Scattered initiatives

Athling has identified a "plethora of AI projects in the banking sector". It notes that, at this stage, only 15% of the experiments concern customer advisers. Tests of chatbots or robots in contact with customers exist, "but they remain very limited, because of performance deemed unsatisfactory".

Source: Le Monde économie, December 2nd, 2017 by Véronique Chocron

Posted by Nina Legras at 10:24

December 8, 2017

Big Data Market in Smarter Cities: Forecast (2017-2022)

LONDON, Nov. 20, 2017 /PRNewswire/ -- Big data can best be defined as the capture, storage, search and analysis of large and complex data sets that are generally difficult to process or handle with traditional data processing systems. A smart city corresponds to the integration of Information and Communication Technology (ICT) and the Internet of Things (IoT) in a secure fashion to manage a city's assets, such as schools, hospitals, power plants and waste management, among others.

Smart cities utilize several technologies to improve the performance of health, transportation, energy and education services, among others, in order to provide higher levels of comfort to their citizens. One such technology with huge potential to enhance smart city services is big data analytics, which plays a key role in making cities smarter.

The report on the big data market in smarter cities is segmented along four verticals: by data generator, by type, by data type and by application. Data generators refer to the sensors, recorders, consumer electronic goods and a few other devices that act as data sources.

By type, the report is classified into infrastructure, software and services. The three data types covered are structured, unstructured and semi-structured. By application, the report is segmented into city planning & operations, public safety, IoT, transport & CO2 emissions, and others. The report is also segmented by geographic region (Americas, Europe, APAC and RoW) to provide deeper insight into the market.

Government regulations to deploy big data analysis in the smart home in order to improve people's living standards, and enhanced smart-traffic technologies to manage traffic efficiently, among others, act as key growth drivers of the big data market in smarter cities.

North America holds the major share of the big data market in smarter cities, owing to the major smart cities in the region, and is expected to remain the leading market during the forecast period 2017-2023 as many more of its cities come forward to become smarter. Asia Pacific is the fastest-growing segment during the forecast period, due to the extensive growth of smart city momentum in developing countries in the region such as China, India and Korea.

Posted by Nina Legras at 11:59

December 1, 2017

Creating smart city visions of the future

The concept of smart city visions is advancing across the globe. But what will they look like and how will they work? Rohit Talwar, Steve Wells and Alexandra Whittington of Fast Future explore the possibilities

The city of the future is a symbol of progress. The sci-fi vision of the future city with sleek skyscrapers and flying cars has started to give way to a more plausible, human, practical and green image known as the smart city.

While smart city visions differ, at their heart is the notion that in the coming decades, the planet’s most heavily concentrated populations will occupy city environments where a digital blanket of sensors, devices and cloud-connected data are being brought together to enhance the city living experience for all.

Smart city visions encompass all of the key elements that enable city ecosystems to function effectively – from traffic control and environmental protection to the management of energy, sanitation, healthcare, security and buildings.

The world’s premier cities are now competing to build highly interconnected smart environments where people, government and business operate in symbiosis with technologies such as big data, the Internet of Things (IoT), cloud computing, hyperconnectivity, artificial intelligence (AI), robots, drones, autonomous green vehicles, 3D/4D printing and renewable energy.

Future smart cities promise to harmonise the benefits of these key disruptive technologies for society and provide a high quality of life by design. At the same time, they assume that pervasive surveillance and data capture are permissible to city residents.

Some have already begun implementing smart city mechanisms and, as the concepts, experiences and success stories spread, the pursuit of smart will become a key driver in the evolving future of cities as communities and economic centres.

Engagement and vision

The evidence to date from smart city initiatives around the world is that the best results come when we have a clear sense of what the end goal is. Sadly, there is an array of smart city applications that are barely used because they don’t really make life better and the population doesn’t want or need them.

City governments have to create inclusive processes that inform citizens about the forces shaping the future and the possibilities and challenges on the horizon, then engage the population in dialogue about the kind of future we want to create. We have to explore what a liveable city means to its people and be clear on how we will attract and support a constant flow of industries of the future.

Alongside this, we need to articulate a clear vision and direction around education, environment, public services, access to justice, city administration and civic engagement. These pillars then provide the guiding requirements which will, in turn, influence the design of the physical, digital and human elements of a smart city infrastructure.

Big data

Smart cities are designed to inform decisions by capturing massive amounts of data about the population and its patterns, such as water use and traffic flows. This information gathering results in what is called big data, and it is essentially gathered via surveillance. There can also be voluntary efforts to collect information, but the ease and affordability of sensors, AI and advanced analytics in the future will mean this function can be completely automated.

The data can be collated from a constantly evolving and expanding IoT – encompassing traffic lights and cameras, pollution sensors, building control systems and personal devices – all literally feeding giant data stores held in the cloud. The ability to crunch all this data is becoming easier due to rampant growth in the use of device algorithms, AI and predictive software –  all running on networks of high performance computing and storage devices.
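A toy example of the kind of "crunching" described above: flagging readings from one IoT feed that deviate sharply from a rolling baseline. The window, threshold and traffic data are invented for illustration.

```python
# Toy anomaly flagging on a single sensor feed: a reading is flagged when
# it exceeds the rolling mean of the previous `window` readings by more
# than a factor `threshold`. Data, window and threshold are invented.

def flag_anomalies(readings, window=3, threshold=1.5):
    flags = []
    for i, value in enumerate(readings):
        recent = readings[max(0, i - window):i]
        if recent:
            baseline = sum(recent) / len(recent)
            flags.append(value > baseline * threshold)
        else:
            flags.append(False)  # no history yet for the first reading
    return flags

traffic = [100, 104, 98, 102, 260, 101]  # vehicles/min from one camera
print(flag_anomalies(traffic))  # only the 260 spike is flagged
```

City-scale deployments apply the same idea with predictive models over streams from millions of devices, but the pipeline shape, ingest, baseline, deviation, action, is unchanged.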

Singapore is a leading example of a smart city and is constantly evolving its “city brain”, a backbone of technologies used to help control pollution, monitor traffic, allocate parking, communicate with citizens and even issue traffic fines.

The behavioural aspect is not to be overlooked. Singapore’s “brain” is attempting to modify human behaviour; for example, one system rewards drivers for using recommended mapped routes, and punishes those who do not. Ultimately, Singapore’s planners hope to discourage driving and guide most commuters to making greater use of public transportation. The city is planning for 100m “smart objects” including smart traffic lights, lamp posts, sensors and cameras on its roadways, which will be used to monitor and enforce laws.

Internet of Things

Smart city visions rest on advanced technology to make sense of massive collections of information. Indeed, the amount of information on the internet is expected to grow exponentially as a result of the IoT. Essentially, IoT means that everything (“things”) – and potentially everyone – will become beacons and data collection devices, gathering data on ambient and behavioural patterns from its surroundings and from the information it is fed – and networking all this data via the cloud. Hence, after data, the IoT is the second driving force behind the rise of smart infrastructure.

Companies and planners are already beginning to explore the possibilities. For example, a case study from India suggests that light poles along highways can offer both smart city and connectivity solutions. In addition to helping monitor road conditions, the light poles could be fitted with high-speed data connections. Data is a critical element of the smart city/smart road future.

However, because this option will further expand the relationship between internet service providers, surveillance and private business, including advertisers, there are several issues around privacy to be considered. Certainly, the information from smart cities and roads should be used to keep citizens safe, healthy and protected. But should companies be allowed to target users with adverts based on information collected for other purposes?

Smart roads

Within and between the smart cities of the future, smart roads in particular are where planners can put into effect the ultra-efficient mechanisms that will characterise the modern smart city. In general, the concepts around smart cities, smart roads and smart infrastructure are becoming less visionary and more strategic and sustainable by the day. Big data, the IoT and renewable energy are all beginning to work in tandem to transform the day-to-day.

For example, South Korea is planning an entire network of smart roads by 2020, including battery-charging stations for electric vehicles (EVs), as well as infrastructure to handle autonomous vehicles.

Autonomous vehicles will require roads to transform into information superhighways; the vehicles will need to communicate not only with each other, but also with the city infrastructure. Mapping, traffic signals and safety regulations, for instance, are all parts of the physical and digital infrastructure that will have to become highly coordinated for autonomous vehicles to function.

All this data and awareness will enable decisions that make the best possible use of space, fuel, energy, water, electricity and all other resources, with an emphasis on sustainability. For example, a clear priority is being able to anticipate big traffic jams and provide alternate routes to save time, fuel and reduce the impact on city infrastructure itself.

Electric vehicles are strengthening their market foothold; hence, the charging concerns related to EVs are gaining urgency in the eyes of many policymakers and planners. One of the biggest hurdles to electric vehicle adoption is keeping a charge across long distances. To help address this challenge, UK researchers are testing smart road technology that charges electric cars while they are being driven. The achievement of smart infrastructure could drive the policy and behavioural tweaks required to enable wide adoption of today's renewable resources.

Another smart technology involves pavement surfaces fitted with panels that capture solar energy, allowing streets to power themselves; this has been tested in the US and the Netherlands. Meanwhile, a special powder that gathers sunlight and glows at night has shown promise.

The smart city movement now afoot has the potential to transform the organisation of people and physical objects in a way that transcends urban development as we know it. The shift to smart infrastructure is not simply fashionable or aspirational; in many ways, it appears to be a critical enabler of the future sustainability of cities. It can be argued that the future of human life on the planet rests on a smooth transition to cities that are more efficient, less wasteful and more conscious of the impacts of the individual upon the greater good.

The smart city shouldn’t be an apocalyptic future where citizens are stripped of their free will, nor does the smart road lead to a utopia. However, it is now possible to create and deliver a city vision with citizens at its heart and that is enabled by forward-thinking infrastructure planning coupled with judicious use of enabling technologies. A well thought-through vision, enabled by a robust and well-executed smart city model, could provide a foundation stone for the next stage of our development, where science and technology are genuinely harnessed in service of creating a very human future.


Rohit Talwar, Steve Wells, Alexandra Whittington

Fast Future

November 30, 2017

Creating smart city visions of the future | Planning & Building Control Today


Posted by Nina Legras at 11:58

November 21, 2017

Difference between Machine Learning, Data Science, AI, Deep Learning, and Statistics

In this article, I clarify the various roles of the data scientist, and how data science compares and overlaps with related fields such as machine learning, deep learning, AI, statistics, IoT, operations research, and applied mathematics. As data science is a broad discipline, I start by describing the different types of data scientists that one may encounter in any business setting: you might even discover that you are a data scientist yourself, without knowing it. As in any scientific discipline, data scientists may borrow techniques from related disciplines, though we have developed our own arsenal, especially techniques and algorithms for handling very large unstructured data sets in automated ways, even without human interaction, to perform transactions in real time or to make predictions.

1. Different Types of Data Scientists

To get started and gain some historical perspective, you can read my article about 9 types of data scientists, published in 2014, or my article where I compare data science with 16 analytic disciplines, also published in 2014.

More recently (August 2016), Ajit Jaokar discussed the Type A (Analytics) versus Type B (Builder) data scientist:

  • The Type A Data Scientist can code well enough to work with data but is not necessarily an expert. The Type A data scientist may be an expert in experimental design, forecasting, modelling, statistical inference, or other things typically taught in statistics departments. Generally speaking though, the work product of a data scientist is not "p-values and confidence intervals" as academic statistics sometimes seems to suggest (and as it sometimes is for traditional statisticians working in the pharmaceutical industry, for example). At Google, Type A Data Scientists are known variously as Statistician, Quantitative Analyst, Decision Support Engineering Analyst, or Data Scientist, and probably a few more.
  • Type B Data Scientist: The B is for Building. Type B Data Scientists share some statistical background with Type A, but they are also very strong coders and may be trained software engineers. The Type B Data Scientist is mainly interested in using data "in production." They build models which interact with users, often serving recommendations (products, people you may know, ads, movies, search results). Source: click here.

I also wrote about the ABCD's of business processes optimization where D stands for data science, C for computer science, B for business science, and A for analytics science. Data science may or may not involve coding or mathematical practice, as you can read in my article on low-level versus high-level data science. In a startup, data scientists generally wear several hats, such as executive, data miner, data engineer or architect, researcher, statistician, modeler (as in predictive modeling) or developer.

While the data scientist is generally portrayed as a coder experienced in R, Python, SQL, Hadoop and statistics, this is just the tip of the iceberg, made popular by data camps focusing on teaching some elements of data science. But just like a lab technician can call herself a physicist, the real physicist is much more than that, and her domains of expertise are varied: astronomy, mathematical physics, nuclear physics (which is borderline chemistry), mechanics, electrical engineering, signal processing (also a sub-field of data science) and many more. The same can be said about data scientists: fields are as varied as bioinformatics, information technology, simulations and quality control, computational finance, epidemiology, industrial engineering, and even number theory.

In my case, over the last 10 years, I have specialized in machine-to-machine and device-to-device communications, developing systems to automatically process large data sets and perform automated transactions: for instance, purchasing Internet traffic or automatically generating content. This implies developing algorithms that work with unstructured data, and it sits at the intersection of AI (artificial intelligence), IoT (Internet of Things), and data science. It is referred to as deep data science. It is relatively math-free, and it involves relatively little coding (mostly APIs), but it is quite data-intensive (including building data systems) and based on brand-new statistical technology designed specifically for this context.

Prior to that, I worked on credit card fraud detection in real time. Earlier in my career (circa 1990) I worked on image remote sensing technology, among other things to identify patterns (or shapes or features, for instance lakes) in satellite images and to perform image segmentation: at that time my research was labeled computational statistics, but the people doing the exact same thing in the computer science department next door at my home university called their research artificial intelligence. Today, it would be called data science or artificial intelligence, the sub-domains being signal processing, computer vision or IoT.

Also, data scientists can be found anywhere in the lifecycle of data science projects, at the data gathering stage, or the data exploratory stage, all the way up to statistical modeling and maintaining existing systems. 

2. Machine Learning versus Deep Learning

Before digging deeper into the link between data science and machine learning, let's briefly discuss machine learning and deep learning. Machine learning is a set of algorithms that train on a data set to make predictions or take actions in order to optimize some system. For instance, supervised classification algorithms are used to classify potential clients into good or bad prospects, for loan purposes, based on historical data. The techniques involved, for a given task (e.g. supervised clustering), are varied: naive Bayes, SVM, neural nets, ensembles, association rules, decision trees, logistic regression, or a combination of many. For a detailed list of algorithms, click here. For a list of machine learning problems, click here.
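As a concrete illustration of the loan-prospect example above, here is a tiny logistic regression trained by gradient descent on historical data. The features, labels and hyperparameters are invented for illustration; a real system would use a mature library and far richer data.

```python
# Toy supervised classification: scoring loan prospects with logistic
# regression trained by per-sample gradient descent on the log-loss.
# Features, labels and hyperparameters are invented for illustration.
import math

# Each row: (income / 100 k€, existing-debt ratio); label 1 = good prospect
X = [(0.55, 0.10), (0.72, 0.05), (0.28, 0.60),
     (0.35, 0.55), (0.80, 0.08), (0.22, 0.70)]
y = [1, 1, 0, 0, 1, 0]

def predict(w, b, x):
    """Probability that prospect `x` is good under weights `w`, bias `b`."""
    return 1 / (1 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))

w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(2000):                     # plain stochastic gradient descent
    for xi, yi in zip(X, y):
        err = predict(w, b, xi) - yi      # gradient of the log-loss
        w[0] -= lr * err * xi[0]
        w[1] -= lr * err * xi[1]
        b -= lr * err

# A new applicant with 60 k€ income and low debt scores as a good prospect:
print(predict(w, b, (0.60, 0.10)) > 0.5)  # True
```

The same fit-on-history, score-new-cases loop underlies the far larger models banks actually deploy.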

All of this is a subset of data science. When these algorithms are automated, as in automated piloting or driver-less cars, it is called AI, and more specifically, deep learning. Click here for another article comparing machine learning with deep learning. If the data collected comes from sensors and if it is transmitted via the Internet, then it is machine learning or data science or deep learning applied to IoT.

Some people have a different definition for deep learning. They consider deep learning to be neural networks (a machine learning technique) with deeper (more numerous) layers. The question was asked on Quora recently, and below is a more detailed explanation (source: Quora):

  • AI (Artificial intelligence) is a subfield of computer science that was created in the 1960s, and it was (is) concerned with solving tasks that are easy for humans but hard for computers. In particular, a so-called Strong AI would be a system that can do anything a human can (except, perhaps, purely physical tasks). This is fairly generic, and includes all kinds of tasks, such as planning, moving around in the world, recognizing objects and sounds, speaking, translating, performing social or business transactions, creative work (making art or poetry), etc.
  • Machine learning is concerned with one aspect of this: given some AI problem that can be described in discrete terms (e.g. out of a particular set of actions, which one is the right one), and given a lot of information about the world, figure out what is the “correct” action, without having the programmer program it in. Typically some outside process is needed to judge whether the action was correct or not. In mathematical terms, it’s a function: you feed in some input, and you want it to produce the right output, so the whole problem is simply to build a model of this mathematical function in some automatic way. To draw a distinction with AI, if I can write a very clever program that has human-like behavior, it can be AI, but unless its parameters are automatically learned from data, it’s not machine learning.
  • Deep learning is one kind of machine learning that’s very popular now. It involves a particular kind of mathematical model that can be thought of as a composition of simple blocks (function composition) of a certain type, and where some of these blocks can be adjusted to better predict the final outcome.
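The "composition of simple blocks" idea above can be sketched in a few lines of plain Python: each layer is a simple function, and the network is just their composition. The weights below are fixed toy values chosen for illustration; actual deep learning adjusts them automatically to better predict the outcome.

```python
# Deep learning as function composition: a tiny two-layer network.
import math

def dense(weights, bias):
    # One "simple block": a weighted sum followed by a nonlinearity.
    def layer(inputs):
        z = sum(w * x for w, x in zip(weights, inputs)) + bias
        return math.tanh(z)
    return layer

layer1 = dense([0.5, -0.2], 0.1)   # toy weights, not learned
layer2 = dense([1.5], -0.3)

def network(inputs):
    # The whole network is layer2(layer1(x)): pure composition.
    return layer2([layer1(inputs)])

print(network([1.0, 2.0]))  # a value in (-1, 1), thanks to tanh
```

Training would consist of nudging the numbers inside `dense` (the adjustable blocks) so that `network` maps known inputs to known outputs.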

What is the difference between machine learning and statistics?

This article tries to answer that question. The author writes that statistics is machine learning with confidence intervals for the quantities being predicted or estimated. I tend to disagree, as I have built engineer-friendly confidence intervals that don't require any mathematical or statistical knowledge.
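The author's own construction is not shown here, but the percentile bootstrap below is one well-known "engineer-friendly" way to attach a confidence interval to almost any estimate without formulas: resample the data with replacement, recompute the statistic each time, and read off percentiles. The data set is invented for illustration.

```python
# Percentile bootstrap: a math-free confidence interval sketch.
import random

def bootstrap_ci(data, statistic, level=0.95, n_boot=2000, seed=42):
    rng = random.Random(seed)  # fixed seed for reproducibility
    stats = sorted(
        statistic([rng.choice(data) for _ in data])  # one resample
        for _ in range(n_boot)
    )
    alpha = (1 - level) / 2
    return stats[int(alpha * n_boot)], stats[int((1 - alpha) * n_boot) - 1]

data = [12, 15, 9, 20, 14, 11, 17, 13, 16, 10]  # invented measurements
mean = lambda xs: sum(xs) / len(xs)
print(bootstrap_ci(data, mean))  # (low, high) bracketing the mean 13.7
```

Swapping `mean` for a median, a ratio, or a model score works unchanged, which is what makes the recipe attractive to engineers.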

3. Data Science versus Machine Learning

Machine learning and statistics are part of data science. The word learning in machine learning means that the algorithms depend on some data, used as a training set, to fine-tune some model or algorithm parameters. This encompasses many techniques such as regression, naive Bayes or supervised clustering. But not all techniques fit in this category. For instance, unsupervised clustering - a statistical and data science technique - aims at detecting clusters and cluster structures without any a priori knowledge or training set to help the classification algorithm. A human being is needed to label the clusters found. Some techniques are hybrid, such as semi-supervised classification. Some pattern detection or density estimation techniques also fit in this category.
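A minimal k-means sketch illustrates unsupervised clustering as described above: the algorithm receives no labels, only points, and a human still has to interpret the clusters it returns. The one-dimensional data and the choice of k are arbitrary illustration choices.

```python
# Tiny 1-D k-means: cluster points with no training labels.
def kmeans_1d(points, k=2, iters=20):
    centers = points[:k]  # naive initialization: first k points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center.
            i = min(range(k), key=lambda j: abs(p - centers[j]))
            clusters[i].append(p)
        # Move each center to the mean of its assigned points.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

centers, clusters = kmeans_1d([1.0, 1.2, 0.8, 9.0, 9.5, 8.7])
print(sorted(centers))  # two centers, one near 1 and one near 9
```

Note that the algorithm only returns anonymous groups; deciding that one cluster means "small values" and the other "large values" (or "loyal customers" vs "churners") is the human labeling step mentioned above.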

Data science is much more than machine learning though. Data, in data science, may or may not come from a machine or mechanical process (survey data could be manually collected, clinical trials involve a specific type of small data) and it might have nothing to do with learning as I have just discussed. But the main difference is the fact that data science covers the whole spectrum of data processing, not just the algorithmic or statistical aspects. In particular, data science also covers:

  • data integration
  • distributed architecture
  • automating machine learning
  • data visualization
  • dashboards and BI
  • data engineering
  • deployment in production mode
  • automated, data-driven decisions

Of course, in many organisations, data scientists focus on only one part of this process.

Source: Posted by Vincent Granville on January 2, 2017 at 8:30pm

14 November 2017

Big Data: ally or enemy for the luxury goods industry?

All sectors of trade and the economy are confronted, to a greater or lesser degree, with the massive development of digital technology. The luxury industry is not spared. In a sector where the human relationship between seller and customer is prized, how can artificial intelligence be integrated into products that are naturally very far from this connected world? One thing is for sure: the 21st-century buyer is more and more demanding. Enriching the relationship with clients and anticipating their expectations, while maintaining a personal and human link, is the whole challenge of Big Data, which must make itself the interpreter of the customer's desires.

Behavioral data, purchase history, and satisfaction surveys are all keys to better understanding the buyer's needs. With personalization in mind, the L'Oréal group already collects specific named data through the visits of customers who buy online. This trend is becoming widespread among prestigious brands:

the idea is to address each customer individually, without ever being perceived as intrusive.

Above all, it is about accompanying the customer through a unique and exclusive experience, valuing them and making them feel they belong to the same family.

Consider the LVMH group, which offers its most loyal customers high-end experiences in fields such as oenology or jewelry, or those upmarket hotel chains that exploit customers' web browsing data to anticipate their desires.

Big Data, in the luxury industry, aims above all to be "smart" and "small".

By INI Consortium / November 2017

Posted by Nina Legras at 10:05

07 November 2017

Big Data takes off in air transport

Every second, a plane takes off somewhere in the world. Each of these planes is now equipped with an impressive number of chips and other digital sensors. An engine will soon be able to generate 5,000 signals to be checked every second; some aircraft can produce 10 TB of data per flight.

Aircraft manufacturers, engine makers, equipment suppliers, airline companies: all are on the lookout for any megadata that helps anticipate aircraft maintenance. Thanks to Big Data, we can now make sense of the data collected from the parts of the plane: their wear, their reaction to atmospheric pressure, to vibrations, or to magnetic fields…

Predictive maintenance is one of the most important areas of development in the aircraft industry. Safran, for example, has created an entity dedicated to Big Data, Safran Analytics, which helps airline companies reduce their fuel consumption through online analysis of engine data during flights.

There is similar enthusiasm at Airbus, which has set up a database to evaluate and experiment with Big Data projects, and at GE, which has rolled out a European platform allowing manufacturers to continuously monitor their installations and give their customers advice in real time.

The exploitation of Big Data is also an advantage for pilots, who can rely on better indications of meteorological events, and soon for travelers themselves: 44% of airline companies plan to introduce luggage geolocation by 2018.

By INI Consortium / November 2017

Posted by Nina Legras at 10:03