Groupement INI : Intégration Numérique pour l'Industrie

11 mai 2018

H&M Looks To Big Data For Store Insights

To reduce markdowns and break out of a lull in sales, H&M is turning to artificial intelligence (AI) and Big Data to tailor its merchandising mix in its brick-and-mortar stores.

The fashion retailer is using algorithms to gain insights from returns, receipts and loyalty-card data to improve its bottom line, according to Retail Dive.

H&M is utilizing the technology in a store located in an upscale section of Stockholm, Sweden. It has so far learned that women make up most of its customer base, and that fashionable items such as floral skirts have sold at better-than-predicted rates. Sales have improved with these insights, and H&M is moving away from the idea of stocking each location with a similar selection. That strategy previously led to unsold inventory and subsequent markdowns, as the retailer needed to clear out approximately $4 billion in excess merchandise.

The brand has also announced that 2018 will feature fewer brick-and-mortar store openings as H&M moves to adapt to increasingly digital shopping patterns. The change comes after years of rapid growth from the fast-fashion giant, which now finds itself somewhat struggling to integrate into the eCommerce landscape. According to news from CNBC, H&M will only open about 220 stores in 2018, as opposed to the 388 it built in 2017. That 220 is a net number, however, and the retailer will actually be opening 390 stores and shuttering 170.

“The scale of the reduction will surprise some today,” wrote Morgan Stanley analysts Geoff Ruddell and Amy Curry, who had categorized H&M as an “underweight” back in January. “It will leave the bears questioning why H&M still enjoys a ‘growth stock’ rating.”

Other concerns for Wall Street investors include that the retailer ended 2017 with a net debt on its balance sheet — instead of net cash — for the first time in two decades. Cash flow was reportedly hurt by an uptick in stagnant inventory.

Source: May 10th, 2018

Posté par Nina Legras à 16:00

20 avril 2018

Smart cities need thick data, not big data

In Barcelona, high-tech data platforms generate demand for old-fashioned community development.

Residents living around Plaça del Sol joke that theirs is the only square where, despite the name, rain is preferable. Rain means fewer people gather to socialise and drink, reducing noise for the flats overlooking the square. Residents know this with considerable precision because they’ve developed a digital platform for measuring noise levels and mobilising action. I was told the joke by Remei, one of the residents who, with her ‘citizen scientist’ neighbours, are challenging assumptions about Big Data and the Smart City.

The Smart City and data sovereignty

The Smart City is an alluring prospect for many city leaders. Even if you haven’t heard of it, you may have already joined in by looking up bus movements on your phone, accessing Council services online or learning about air contamination levels. By inserting sensors across city infrastructures and creating new data sources, including citizens via their mobile devices, Smart City managers can apply Big Data analysis to monitor and anticipate urban phenomena in new ways, and, so the argument goes, efficiently manage urban activity for the benefit of ‘smart citizens’.

Barcelona has been a pioneering Smart City. The Council’s business partners have been installing sensors and opening data platforms for years. Not everyone is comfortable with this technocratic turn. After Ada Colau was elected Mayor on a mandate of democratising the city and putting citizens centre-stage, digital policy has sought to go ‘beyond the Smart City’. Chief Technology Officer Francesca Bria is opening digital platforms to greater citizen participation and oversight. Worried that the city’s knowledge was being ceded to tech vendors, the Council now promotes technological sovereignty.

On the surface, the noise project in Plaça del Sol is an example of such sovereignty. It even features in Council presentations. Look more deeply, however, and it becomes apparent that neighbourhood activists are really appropriating new technologies into the old-fashioned politics of community development.

Community developments

Plaça del Sol has always been a meeting place. But as the neighbourhood of Gràcia has changed, so the intensity and character of socialising in the square has altered. More bars, restaurants, hotels, tourists and youngsters have arrived, and Plaça del Sol’s long-standing position as venue for large, noisy groups drinking late into the night has become more entrenched. For years, resident complaints to the Council fell on deaf ears. For the Council, Gràcia signified an open, welcoming city and leisure economy. Residents I spoke with were proud of their vibrant neighbourhood. But they recalled a more convivial square, with kids playing games and families and friends socialising. Visitors attracted by Gràcia’s atmosphere also contributed to it, but residents in Plaça del Sol felt this had become a nuisance. It is a story familiar to many cities. Much urban politics turns on the negotiation of convivial uses of space.

What made Plaça del Sol stand out can be traced to a group of technology activists who got in touch with residents early in 2017. The activists were seeking participants in their project called Making Sense, which sought to resurrect a struggling ‘Smart Citizen Kit’ for environmental monitoring. The idea was to provide residents with the tools to measure noise levels, compare them with officially permissible levels, and reduce noise in the square. More than 40 neighbours signed up and installed 25 sensors on balconies and inside apartments.

The neighbours had what project coordinator Mara Balestrini from Ideas for Change calls ‘a matter of concern’. The earlier Smart Citizen Kit had begun as a technological solution looking for a problem: a crowd-funded gadget for measuring pollution, whose data users could upload to a web-platform for comparison with information from other users. Early adopters found the technology trickier to install than developers had presumed. Even successful users stopped monitoring because there was little community purpose. A new approach was needed. Noise in Plaça del Sol provided a problem for this technology fix.

Through meetings and workshops residents learnt about noise monitoring, and, importantly, activists learnt how to make technology matter for residents. The noise data they generated, unsurprisingly, exceeded norms recommended by both the World Health Organisation and municipal guidelines. Residents were codifying something already known: their square is very noisy. However, in rendering their experience into data, these citizen scientists could also compare their experience with official noise levels, refer to scientific studies about health impacts, and correlate levels to different activities in the square during the day and night.
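As an illustration of that kind of comparison (not the Making Sense project's actual code), a minimal Python sketch follows. The 55 dB daytime and 40 dB night-time limits are indicative WHO-style guideline figures, used here only as assumed thresholds:

```python
# Illustrative sketch: flag hours where measured noise exceeds a
# guideline threshold. Threshold values are indicative assumptions,
# not the project's actual reference levels.

DAY_LIMIT_DB = 55.0    # assumed daytime guideline
NIGHT_LIMIT_DB = 40.0  # assumed night-time guideline

def hours_over_limit(hourly_db):
    """hourly_db: dict {hour (0-23): average dB for that hour}.
    Returns {hour: dB above the applicable guideline}."""
    exceeded = {}
    for hour, level in hourly_db.items():
        limit = NIGHT_LIMIT_DB if (hour >= 23 or hour < 7) else DAY_LIMIT_DB
        if level > limit:
            exceeded[hour] = round(level - limit, 1)
    return exceeded

# Example: a noisy evening and night in the square (invented readings)
readings = {18: 62.0, 22: 68.5, 23: 66.0, 2: 58.0, 9: 50.0}
print(hours_over_limit(readings))
```

Correlating the exceedances with times of day is what let residents tie the data to specific activities in the square.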

The project decided to compare their square with other places in the city. At this point, they discovered the Council’s Sentilo Smart City platform already included a noise monitor in their square. Officials had been monitoring noise but not publicising the open data. Presented with citizen data, officials initially challenged the competence of resident monitoring, even though official data confirmed a noise problem. But as Rosa, one of the residents, said to me, “This is my data. They cannot deny it”.

Thick data

Residents were learning that data is rarely neutral. The kinds of data gathered, the methods used, how it gets interpreted, what gets overlooked, the context in which it is generated, and by whom, and what to do as a result, are all choices that shape the facts of a matter. For experts building Big Data city platforms, one sensor in one square is simply a data point. On the other side of that point, however, are residents connecting that data to life in all its richness in their square. Anthropologist Clifford Geertz argued many years ago that situations can only be made meaningful through ‘thick description’. Applied to the Smart City, this means data cannot really be explained and used without understanding the contexts in which it arises and gets used. Data can only mobilise people and change things when it becomes thick with social meaning.

Noise data in Plaça del Sol was becoming thick with social meaning. Collective data gathering proved more potent than decibel levels alone: it was simultaneously mobilising people into changing the situation. Noise was no longer an individual problem, but a collective issue. And it was no longer just noise. The data project arose through face-to-face meetings in a physical workshop space. Importantly, this meant that neighbours got to know one another better, and had reasons for discussing life in the square when they bumped into one another.

Attention turned to solutions. A citizen assembly convened in the square one weekend to publicise the campaign and discuss ideas with passers-by. Some people wanted the local police to impose fines on noisy drinkers, whereas others were wary of heavy-handed approaches. Some suggested installing a children’s playground. Architects helped locals examine material changes that could dampen sound.

The Council response has been cautious. New flowerbeds along one side of the square remove steps where groups used to sit and drink. Banners and community police officers remind people to respect the neighbourhood. The Council recently announced plans for a movable playground (which can be removed from the centre of the square for events like the Festa Major de Gràcia). Residents will be able to monitor how these interventions change noise in the square. Their demands confront an established leisure economy. As local councillor Robert Soro explained to me, convivial uses have also to address the interests of bar owners, public space managers, tourism, commerce, and others. Beyond economic issues are questions of rights to public space, young people’s needs to socialise, neighbouring squares worried about displaced activity, the Council’s vision for Gràcia, and of course, the residents suffering the noise.

The politics beneath Smart City platforms

For the Council, technology activists, and residents of Plaça del Sol, data alone cannot solve their issues. Data cannot transcend the lively and contradictory social worlds that it measures. If data is to act then it needs ultimately to be brought back into those generative social contexts - which, as Jordi Giró at the Catalan Confederation of Neighbourhood Associations reminds us, means cultivating people skills and political capacity. Going beyond the Smart City demands something its technocratic efficiency is supposed to make redundant: investment in old-fashioned, street-level skills in community development. Technology vendors cannot sell such skills. They are cultivated through the kinds of community activism that first brought Ada Colau to prominence, and eventually into office.

Adrian Smith, The Guardian, April 18th 2018

Posté par Nina Legras à 12:23

16 mars 2018

Apps account for 54% of mobile sales for merchants

The latest edition of Criteo's Global Commerce Review reveals that European retailers achieved 54% of mobile sales through apps in the fourth quarter of 2017. In France, consumers are active across all environments and increasingly buy via their smartphones.

Criteo unveils the results of its quarterly Global Commerce Review for the fourth quarter of 2017. This report explores consumer activities, behaviors and preferences across all devices and browsers.

Highlighting consumers' increasing use of shopping applications, the report confirms the focus on mobile and its utility in driving omnichannel marketing strategies around the world. "While the use of smartphones continues its strong ascent, the increased adoption of mobile apps and mobile browsers is leading to exciting omnichannel buying trends," said François Costa de Beauregard, Criteo's Managing Director France. "Merchants and brands can leverage these trends to optimize their business efforts, enabling them to reach consumers more effectively to generate the best possible results."

The opportunity of applications

When merchants prioritize application optimization in addition to the mobile web, the performance gains are substantial. Globally, advertisers saw in-app transactions grow by almost 50% year over year, reaching 46% of mobile transactions in the fourth quarter of 2017.
In Europe, apps account for 54% of mobile sales for merchants who invest in both their mobile browser experience and their merchant app. The conversion rate for merchant apps reaches 13%, more than three times the 4% rate typically observed on the mobile web.

Mobile growth

The use of the mobile web has reached maturity, but consumers are volatile and continue to make purchases on the move, at different frequencies, on all connected devices.
Smartphone transactions in France increased by 45% compared to Q4 2016 (excluding applications). The use of tablets is declining, generating 7% fewer transactions than the previous year.

Computers remain the most used devices during office hours but show a small year-on-year decline in transactions (6%).

The sectors with the highest proportions of mobile sales are sporting goods (35%), clothing (35%) and health and cosmetics (34%). Seasonal variations led to a slight decline in the number of computer transactions preceded by a mobile click, as consumers were more active on mobile devices during the summer. In the 4th quarter, 28% of computer transactions in France were preceded by a click on mobile.

Omnishoppers, a high value

Omni-channel strategies help guide consumers through their winding journey, resulting in improved online results. Globally, omnishoppers have the highest long-term value for brands and merchants, generating 27% of all sales, although they represent only 7% of consumers.
Consumers are constantly switching between computer and mobile, depending on the time of day and the day of the week. Marketers targeting the workforce need to consider the prevalence of the computer during office hours, especially in the morning. However, it is essential to optimize the targeting of smartphones and tablets in the evening and on weekends.

Combining purchase-intent data also makes it possible to increase purchase opportunities for each consumer, as the average basket value is significantly higher (up to 8% on average in France) for identified consumers.

This trend is particularly apparent in the fashion and luxury goods, health and cosmetics sectors, and in high-tech.

The Global Commerce Review report is based on analysis of browsing and purchasing data collected from more than 5,000 merchants in more than 80 countries during the fourth quarter of 2017.

Source: by Dalila Bouaziz on February 28th, 2018

Posté par Nina Legras à 13:29

01 mars 2018


In recent years, geolocation technologies have created "Location Based Services": products and services adapted to people's past and present locations. The next evolution, made possible by Roofstreet's technology, is under way with the arrival of "Trip Based Services": products and services that adapt not only to positions but also to past and future trips.

Who needs to understand travel today?

Understanding travel is a pressing issue, essential to many sectors.


Retail

Customer knowledge has always been central to the retail business. Detailed knowledge of customers' needs and journeys allows a retailer to refine its communication, better target its offers and provide better services.

While retail now uses a lot of offline data from, for example, loyalty cards or point-of-sale information (postal codes), online data is harder to come by and is mainly used by third-party players in the mobile advertising market for mobile targeting and visit-measurement purposes.

Retailers must also make decisions on display strategies and point-of-sale locations, and understand their position relative to the competition in terms of footfall within a particular catchment area.

Retailers offer more and more differentiating services to their customers in order to simplify their lives. Digital channels, apps and mobile sites, are often where they make the difference.

Thanks to mobile applications and sites, retailers can get to know their customers better (products viewed, purchase intent), but the use of geolocation for customer knowledge remains under-exploited.

Media agencies and some retailers are building dedicated departments for the analysis of travel behaviour, given the importance of the challenge for the future.
Local Authorities
Local authorities are also very interested in travel information.
This data enables them to make the right decisions on the sizing of their transport infrastructure and public facilities, in order to rationalize costs, provide citizens with better service and make the area more attractive to tourists.
Local authorities seek in particular to model inter- and intra-communal flows, or flows between the different activity-generating poles (business zones, shopping centres, etc.). This allows them to understand travel and to adapt or create new transport and infrastructure offers.
Carriers and operators:
The movements and journeys of citizens are at the heart of the transport sector's concerns. Upstream of the operational phases, communities often run diagnostic phases during which actors identify travel demand and the existing transport offer.

This diagnostic phase must identify population movement flows in order to optimize the sizing of infrastructures and to plan the routes of the various lines as well as possible.

It also makes it possible to compensate for the lack of transport supply in certain areas with alternative modes that are either soft or shared (e.g. carpooling).

In a more operational logic, carriers seek to provide quality service to users, in particular by setting up proactive services: anticipating users' movements to know them better and inform them specifically about their journeys, and only at the right moment.

Moreover, challenges remain in tracing a user's journey in its entirety: while ticketing data usually makes it possible to detect entry into a mode of transport (fraud aside), in most cases it does not reveal where users exit.
What are the existing solutions and data sources, and what are their limitations?
INSEE data:

The available flow data are usually presented from one municipality to another, with a count of the number of people commuting to work over a year.
INSEE divides the territory into 16,000 "IRIS" zones; each zone gathers on average between 1,800 and 5,000 people, depending on the density of the agglomeration, with an average surface area of 41 km².

Most geomarketing tools and flow-data providers use IRIS as the most granular zoning for representing origin/destination matrices.
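For illustration, a hypothetical sketch of how individual trips aggregate into such an origin/destination matrix (the zone codes and trips below are invented):

```python
# Hypothetical sketch: aggregating individual trips into an
# origin/destination matrix keyed by zone (e.g. IRIS codes).
from collections import Counter

def od_matrix(trips):
    """trips: iterable of (origin_zone, destination_zone) pairs.
    Returns a Counter mapping each O/D pair to its trip count."""
    return Counter(trips)

trips = [
    ("IRIS_A", "IRIS_B"),
    ("IRIS_A", "IRIS_B"),
    ("IRIS_B", "IRIS_C"),
]
matrix = od_matrix(trips)
print(matrix[("IRIS_A", "IRIS_B")])  # 2 trips on that O/D pair
```

The limitation described in the text is visible here: once trips are reduced to zone pairs, the route taken between the two zones is gone.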

This use presents several difficulties: with this type of data it is not possible, for example, to obtain flows per road axis taken, to go below the commune or IRIS zone, or to consider a finer time period than the one originally studied.

IRIS zones are sometimes too large for fine-grained analyses, especially in dense urban areas.

Finally, customers often have their own areas of analysis that do not always coincide with the proposed areas.

Field surveys:

Many companies use field surveys conducted by specialized institutes: these surveys can be carried out by physical counting or by telephone.

Field counts are usually performed in one or more specific areas requested by the customer. A count covering a whole agglomeration would be too expensive.

Some surveys make it possible, in particular, to reconstruct panelists' journeys on a declarative basis.

While the statistical extrapolation models are very robust, the samples are often very small.

Tracers data:

Tracers can be either GPS devices or dedicated boxes that collect detailed information on a person's driving. The data provided is of good quality and can also be used to understand driving behaviour (useful, for example, to insurers).

Devices of this type require acquiring the hardware, installing and uninstalling it, and keeping the associated software and hardware up to date.

Some mainstream applications (such as Waze) can compute actual road routes, but these data are the property of those who collect them and are not available on the market.
In addition, these data only cover motorists and therefore miss non-motorized and public transport trips.

Public transport data:

Public transport open data, while very useful for bus schedules, stations, line routes and self-service bicycle stations, say very little about people's actual movements.

Ticketing data used to model routes captures only entries into the vehicle, not exits. Since users do not validate their ticket at the end of the trip, journey reconstruction is only partial.
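One widely used workaround for this gap, sketched below under the assumption of tap-in-only ticketing, is the trip-chaining heuristic: take a rider's next boarding stop as a proxy for the previous trip's exit (station names are invented):

```python
# Hedged sketch of the classic trip-chaining heuristic for tap-in-only
# ticketing: a rider's alighting stop is inferred from the boarding
# stop of their next trip, with the last trip of the day assumed to
# return to the first boarding stop.

def infer_exits(boardings):
    """boardings: ordered list of entry stops for one rider, one day.
    Returns (entry, inferred_exit) pairs."""
    if len(boardings) < 2:
        return []  # not enough taps to chain
    trips = []
    for i, entry in enumerate(boardings):
        nxt = boardings[(i + 1) % len(boardings)]  # wrap to first stop
        trips.append((entry, nxt))
    return trips

print(infer_exits(["Home", "Work", "Gym"]))
# each entry is paired with the next tap-in as the assumed exit
```

The heuristic obviously fails for riders who walk, cycle or drive between taps, which is why it is an estimate rather than a measurement.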

Telecoms data:

Telecom operators typically provide origin/destination matrices based on people's locations within antenna coverage areas, aggregated at the IRIS level.

While representativeness is attractive given penetration rates and each operator's market share, technical limitations generally prevent precision finer than several hundred metres.

Data from GPS tracks of smartphones

This data comes from embedded location systems in smartphones.

SDKs (Software Development Kits) are integrated into consumer mobile applications and collect (if the user has granted permission) the positions reported by smartphones.

These data are used today primarily for advertising and customer targeting purposes by geofencing methods or segment targeting of people who have been seen at a location that is of interest to an advertiser.
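A minimal sketch of the geofencing idea mentioned above: check whether a collected position falls within a given radius of a point of interest, using the haversine great-circle distance (the coordinates below are invented for illustration):

```python
# Minimal geofencing sketch: is a collected GPS position within a
# radius of a point of interest? Coordinates are illustrative.
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def in_geofence(point, center, radius_m):
    return haversine_m(*point, *center) <= radius_m

store = (48.8566, 2.3522)  # an illustrative point of interest in Paris
print(in_geofence((48.8570, 2.3525), store, 100))  # ~50 m away -> prints True
```

Production geofencing systems add accuracy weighting and dwell-time logic on top of this basic containment test, for the error-handling reasons described below.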

These data are of great value for their precision, which ranges from a few tens to several hundred metres, as well as for their collection frequency and real-time nature.

There are, however, common geolocation errors (for example when using underground transport), and accuracy is not always guaranteed. In addition, the frequency and mode of point collection are becoming more and more restricted, and the collected data increasingly fragmented.

Actors processing this type of data must ensure that they have implemented algorithmic methods that detect errors and delete or weight the data according to its accuracy.

While this data may be relevant for mobile advertising stakeholders seeking to target and measure visits to a point of sale, it is difficult to use for modelling a person's journey in its entirety, and it does not reveal where a person went between two sampled points.

Trip-Based Services: the paradigm of "went through" / "will go through"

Technical value

Smartphone data is highly accurate, provided one can filter, process and correct geolocation errors.

Roofstreet has developed algorithms that can reconstruct all of a person's journeys from a sparse base of points.

The algorithm links the points together intelligently by observing habits.

The more repetitive the motion, the more accurate the path returned by these algorithms.

Moreover, once a trip has been created it is possible to understand how a person moves from point A to point B, to detect recurrent movements, and to use them as a basis for predicting the next trip(s).
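As a rough illustration of the approach described (not Roofstreet's actual algorithm, whose details are proprietary), sparse pings can be segmented into trips using a dwell-time gap, and habits exploited for prediction by simple frequency counting; the gap threshold and data below are assumptions:

```python
# Illustrative sketch: segment sparse timestamped positions into trips,
# then predict the next destination from past habits.
from collections import Counter

GAP_MIN = 30  # a stop longer than this splits two trips (assumed value)

def split_trips(pings):
    """pings: list of (minutes_since_midnight, place) sorted by time."""
    trips, current = [], [pings[0]]
    for prev, cur in zip(pings, pings[1:]):
        if cur[0] - prev[0] > GAP_MIN:
            trips.append(current)  # long dwell -> close the trip
            current = [cur]
        else:
            current.append(cur)
    trips.append(current)
    return trips

def predict_next(history, origin):
    """Most frequent past destination when leaving `origin`."""
    dests = Counter(d for o, d in history if o == origin)
    return dests.most_common(1)[0][0] if dests else None

pings = [(480, "home"), (490, "road"), (510, "work"),   # morning trip
         (1080, "work"), (1100, "home")]                # evening trip
print(len(split_trips(pings)))  # 2 trips: the long dwell at work splits them
print(predict_next([("home", "work"), ("home", "work"), ("home", "gym")], "home"))
```

The "more repetitive the motion, the more accurate the path" claim maps directly onto the frequency counts: recurring origin/destination pairs dominate the prediction.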

The ability to generate precise paths from a small number of collected points makes it possible to overcome major constraints:

- saving the phone's battery;
- respecting the restrictions imposed by manufacturers on collection frequency.
Use cases made possible by knowledge of journeys
Thanks to the creation and aggregation of individual paths, it is now possible:

- For local authorities, to know movement flows, road or non-road, between neighbourhoods, communes and activity-generating poles, with very fine temporal granularity (hour, typical day, typical week).

- For the retail sector, the "catchment area 2.0" consists of identifying where people come from, their places of residence or work, and the journeys they make before going to a given area. Only paths can reconstruct this type of history.

Retailers can now rely on a database of travel data refreshed in real time to optimize marketing investments: urban signage thanks to audience measurement, the ideal locations for long-life advertising panels, and the best distribution areas for printed advertising, in combination with existing data sets.

In addition, e-commerce players are currently looking at how to take a person's journeys into account to offer suitable pick-up points and improve their experience with the brand.

- For transport, knowledge of paths combined with the line map makes it possible to know automatically, without asking the user, which lines they take, so as to warn them in advance of disruptions on those lines, and only at the right time.

From an operational point of view, these paths and travel intentions can be provided to control-tower operators as an additional decision input when scheduling train departures to ease congestion on the platforms.
Moving from "location based services" to "trip based services" is therefore an opportunity with great potential for any brand or public service wishing to better understand travel and communicate better with its customers.

Source: Blog - 11 Jan, 2018 

Posté par Nina Legras à 10:40

22 février 2018

Retail: Never without my mobile

70% of consumers now use their mobile in stores.

Locating a store, consulting reviews, comparing prices...: mobile has become indispensable to consumers, even in their offline purchase journey. To better understand the smartphone's growing role and impact in store, the MMA has just published its guide to mobile interactions at the point of sale.

Two years after its first edition, the Mobile Marketing Association France announces a major update to its Point of Sale Interaction Guide. "70% of consumers now use their mobile in stores. In this guide, retailers can discover the latest technologies, optical or radio, for interacting with consumers. But above all, this publication offers them a real user manual for accelerating the digitalization of their point of sale, with an important focus on the management of personal data at the dawn of GDPR 2018," explains Anh-Vu Nguyen, head of the "Mobile interactions at the point of sale" working group at the MMA.

In addition to an overview of the different technologies (NFC, BLE, Wi-Fi, image recognition, QR codes and now VLC) and the use cases available to retailers, the guide offers a new legal section: Maître Thomas Beaugrand, lawyer at Staub & Associés and partner of the association, shares his view of the legal consequences of the GDPR for the collection and exploitation of data. "The power of the e-commerce giants today rests on data exploitation, and traditional retailers can also benefit from it to better understand their consumers, optimize the organization of their point of sale or maximize their sales."

"However, this requires a clear understanding of the new legal framework of the General Data Protection Regulation (GDPR 2018), clearly distinguishing good practices from bad in data collection and exploitation," adds Anh-Vu Nguyen.

This guide was produced by member companies of the Mobile Marketing Association France: Atsukè, EzeeWorld, Fidzup, SLMS, Snapp '& userADgents, with the support of the AFSCM, the law firm Staub & Associés and thanks to the support of BNP Paribas and Hello bank!


16 février 2018

How EDF tests IoT technologies

EDF is working with Nokia to test IoT technologies in its R & D laboratories in Ile-de-France. Its various business lines are also working on big data.
EDF is slowly but surely moving towards the Internet of Things. After carrying out a general interest study on the subject for all of the group's businesses in 2015, the energy company's R & D division began a more concrete testing phase at the end of 2017. It has teamed up with Nokia to evaluate the use of IoT technologies in an industrial environment.
"We want to define the strengths and weaknesses of each protocol and technology, in an agnostic approach, beyond the marketing promises, to enlighten the group's business lines," summarizes Stéphane Tanguy, director of research programmes on information tools and head of IS for EDF's R & D. He insists on the importance of conducting real tests in the laboratory and not relying solely on the simulation of physical phenomena. "It's important to see how sensors and protocols behave in an industrial environment, how they go from a 'sleep' mode to an 'active' mode for example, to know the real energy consumption, to understand the exchanges between objects and protocols...", he explains.
EDF tests a wide range of technologies
EDF, given the nature of its activity, wants to ensure the robustness of the different IoT solutions before a massive deployment. Among the parameters monitored by the R & D teams are the endurance of objects, their consumption, but also the cost per use. On the network side, the signal's ability to penetrate materials (especially concrete), geolocation services and, of course, cybersecurity are reviewed. EDF tests both cellular technologies (NB-IoT, LTE-M) and networks using free frequencies (Sigfox, LoRa).
These tests, which will run until the summer of 2018, will allow EDF to build an IoT toolbox from which the group's various businesses will be able to draw for their projects. In parallel with this R & D work, divisions of the group have already developed POCs (proofs of concept) on various possible IoT use cases, such as the protection of workers in remote areas or "geo-tracking". "We had identified a hundred different use cases for our businesses in our 2015 study; the potential is significant," says Stéphane Tanguy. "Deployment schedules differ according to the business."
Big data & data science
The purpose of the IoT is to bring together new data that is useful for improving operational efficiency, saving money and creating new services. But the company already has a lot of data that it seeks to exploit better. "We already have process sensors (temperature, pressure) that are used for monitoring but do not necessarily communicate in real time. They already provide us with valuable information that was confined to separate, heterogeneous applications. We are pouring it into data lakes to de-silo it and analyse it. The goal is to apply algorithms to find correlations, develop applications, make inter-plant comparisons, and learn from the data...", details Stéphane Tanguy.
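As a toy illustration of the correlation analysis described in the quote, a Pearson correlation between two de-siloed sensor series (the values are invented, not EDF data):

```python
# Hedged sketch: Pearson correlation between two sensor channels,
# the kind of pairwise check run when looking for correlations
# across de-siloed process data. Values are invented.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

temperature = [20.0, 21.5, 23.0, 24.5, 26.0]
pressure    = [1.00, 1.04, 1.08, 1.12, 1.16]  # rises with temperature
print(round(pearson(temperature, pressure), 3))  # 1.0 (perfectly linear)
```

At data-lake scale the same pairwise statistic would typically run inside a dedicated analytics engine rather than pure Python, but the computation is identical.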
While EDF does not appear particularly ahead on the IoT, it is rather well positioned on big data. "When I talk to other manufacturers, I see that we are well placed to put in place innovation mechanisms to apply new data-science techniques to our data."
By Sylvain Arnul, Usine Digitale, February 14th, 2018

Posted by Nina Legras at 10:39

10 January 2018

Ford and Autonomic are building a smart city cloud platform

Ford and Silicon Valley-based Autonomic will work together to build a new open platform, called the "Transportation Mobility Cloud," upon which cities can build out communications infrastructure, including connected traffic lights and parking spots. Ford CEO Jim Hackett announced the news on Monday in the keynote kicking off the annual CES 2018 conference.

The platform is designed to connect smart transportation services, as well as adjacent connected offerings, uniting them with one common language to help coordinate all these efforts in real time. That means tying together personal cars with built-in vehicle-to-everything communications and incorporating things like bike-sharing networks and public and private transportation services, including buses, trains, ride-hailing and beyond.

The Transportation Mobility Cloud will support location-based services, determining routes, sending out alerts about things like service disruptions, handling identity management and payment processing, as well as dealing with data gathering and analytics. It's intended not only as a kind of connective tissue for the forward-thinking services and vehicles that will make up the smart city of tomorrow, but also as a platform upon which new apps and services can be built from the wealth of data available.

Ford says to think of it like "a box of Legos," with pieces that can be quickly taken apart and reassembled to build new kinds of assets and products to better serve city residents. It's intended to be flexible enough to work with all partners, and to change from city to city depending on local requirements and implementation specifications.
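The "box of Legos" idea can be sketched as a small publish/subscribe core that independent service modules plug into, each city picking its own mix. All names and the event format below are invented for illustration; the article does not describe the platform's actual API.

```python
# Minimal sketch of composable city-service modules sharing one event format.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Event:
    kind: str      # e.g. "traffic_light", "parking_spot", "transit_alert"
    payload: dict

@dataclass
class MobilityCloud:
    handlers: dict = field(default_factory=dict)

    def plug_in(self, kind: str, handler: Callable[[Event], None]) -> None:
        """Attach a module; a city can plug in a different mix of modules."""
        self.handlers.setdefault(kind, []).append(handler)

    def publish(self, event: Event) -> None:
        """Deliver an event to every module subscribed to its kind."""
        for handler in self.handlers.get(event.kind, []):
            handler(event)

cloud = MobilityCloud()
seen = []
cloud.plug_in("transit_alert", lambda e: seen.append(e.payload["msg"]))
cloud.publish(Event("transit_alert", {"msg": "Line 4 service disruption"}))
print(seen)  # ['Line 4 service disruption']
```

The point of the design is that modules only agree on the event format, so swapping a bike-share module for a transit module never touches the core.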

In a blog post detailing the news, Ford suggests some possible uses to illustrate what the platform could do, including routing autonomous vehicles away from the most densely clogged arteries occupied by human-driven cars at times of peak traffic, rerouting cars on the fly to help reduce congestion, or even letting cities fence off areas to restrict them to EV-only zones in order to help mitigate air-quality and emissions issues.
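One of the suggested uses, EV-only zones, boils down to a geofence check. A minimal sketch follows; the zone coordinates, vehicle records and function names are invented for illustration.

```python
# Hypothetical rule a platform like this could enforce: keep combustion
# vehicles out of a geofenced EV-only zone.

def point_in_polygon(x, y, poly):
    """Ray-casting test: is point (x, y) inside the polygon (list of (x, y))?"""
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        crosses = (yi > y) != (yj > y)
        if crosses and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

# Toy square zone in local map coordinates.
ev_only_zone = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0)]

def allowed(vehicle, x, y):
    """Combustion vehicles inside the zone should be rerouted."""
    return vehicle["powertrain"] == "ev" or not point_in_polygon(x, y, ev_only_zone)

print(allowed({"powertrain": "ev"}, 0.5, 0.5))    # True
print(allowed({"powertrain": "gas"}, 0.5, 0.5))   # False: reroute
print(allowed({"powertrain": "gas"}, 2.0, 2.0))   # True: outside the zone
```

A production system would use geographic polygons and a spatial index, but the check per vehicle is this simple at its core.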

Ford stresses that it has designed this platform "for everyone," a broad group that includes transit service operators as well as competing automakers, which it invites to join the effort in order to make it as widely compatible as possible. Ford says it hopes to use its open approach to drive adoption to the point where it can claim to be the smart city platform with the most connected vehicles by the end of 2019, and eventually it hopes to achieve 100 percent compatibility with vehicles and services on the road.

It’s a massive undertaking, but if successful, it could pave the way for cities better able to launch and incorporate Ford’s growing stable of mobility service offerings, including things like the last-mile shared commute service Chariot, as well as Ford GoBike and its forthcoming autonomous ride-hailing fleets. Teaming with Autonomic, a company that Ford invested in last year, will help it ramp up quickly, since the Palo Alto company’s staff has extensive experience building platforms intended for integration on a broad scale, including Amazon Web Services.

Part of the promise of ride-hailing has been that it would reduce congestion in cities, but studies show the opposite is true. Ford says it hopes to help correct this with a platform that can optimize the rollout of such services and their integration into existing traffic flows.

Source: Darrell Etherington, January 9th, 2018

Posted by Nina Legras at 15:36

3 January 2018

Ericsson forecasts 20 billion IoT-connected devices in 2023, including 1.8 billion on cellular networks

According to the latest update of Swedish equipment manufacturer Ericsson's mobility report, the number of objects connected to the IoT (connected cars, industrial machines, meters, sensors, wearable consumer products, etc.) should grow by an average of 19% per year worldwide through 2023, to reach a total of 20 billion units.

By that time, all devices connected to a network (including mobile phones, PCs and tablets) will reach 30 billion units.
Ericsson distinguishes objects that will be connected by short-range wireless technologies (Wi-Fi, Bluetooth, Zigbee, etc.) from those connected to a long-range network (3G, 4G, 5G, NB-IoT, and LPWA networks such as Sigfox or LoRa).
Objects connected by short-range technologies will be largely in the majority (17.4 billion units in 2023), and their number will increase by an average of 18% per year through 2023.

By contrast, objects connected to a long-range network will be very much in the minority ("only" 2.4 billion objects in 2023, up from 600 million in 2017), but their number will grow faster: +26% on average per year.
At the end of 2017, Ericsson estimates, about 500 million objects were connected to the IoT through a cellular connection. Their number is expected to reach 1.8 billion units by 2023, or 75% of the long-range category. Clearly, Ericsson, one of the leading manufacturers of cellular network infrastructure, believes that Sigfox and LoRa will not break through ... That remains to be seen.
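The forecast figures are arithmetically consistent, as a quick check shows (numbers taken from the paragraphs above, all in billions of devices):

```python
# Sanity check of the report's long-range IoT arithmetic.
long_range_2017 = 0.6   # long-range objects at end of 2017
cagr = 0.26             # +26% per year on average
years = 6               # 2017 -> 2023

long_range_2023 = long_range_2017 * (1 + cagr) ** years
print(round(long_range_2023, 1))   # 2.4, matching the 2.4 billion forecast

cellular_2023 = 1.8
print(round(cellular_2023 / long_range_2023 * 100))  # 75 (% of long-range)
```

So 600 million units compounding at 26% for six years does land on roughly 2.4 billion, of which the 1.8 billion cellular connections are indeed 75%.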

Currently, the long-range segment is dominated by GSM/GPRS technology but, unsurprisingly, by 2023 4G LTE and 5G cellular technologies will take over. 4G will then account for the majority of cellular IoT connections, while 5G will support the most critical applications.
The first cellular IoT networks based on Cat-M1 and Narrowband IoT (NB-IoT) technologies were launched in early 2017, and Ericsson currently counts more than 20 cellular IoT networks using these technologies operating commercially around the world.

Source: November 29th, 2017

Posted by Nina Legras at 10:24

20 December 2017

Artificial intelligence will disrupt the banking business

Advisers assisted by robots, HR departments augmented by software ... a study lists the changes under way in the banking sector.

Will banks' HR departments soon become "HRMDs", "Human Resources and Machinery Departments"? The idea, a bit frightening, is among the recommendations of a report from the consultancy Athling, "Artificial intelligence in the bank: employment and skills," which will be released on December 7. This study, commissioned and overseen by the Observatory of banking trades, the statistical and forward-looking body of the French banking industry, sought to evaluate the consequences that artificial intelligence (AI) could have on the sector.

While trade union organizations warn of employees' growing concern about the arrival of these cognitive technologies, the authors of the study, "aware of the tensions," deliberately chose not to quantify the number of jobs that could be eliminated. That assessment is considered "too dependent on the strategies of the institutions and on exogenous factors (regulation, economic activity ...)", warns the report, of which Le Monde obtained a copy.

But, these precautions taken, the document does not hide the upheaval ahead for the sector, first of all because banks were among the first companies to computerize their operations and thus hold data on millions of customers "with considerable historical depth", an indispensable raw material for artificial intelligence.

Scattered initiatives

Athling has identified a "plethora of AI projects in the banking sector". It notes that, at this stage, only 15% of the experiments concern customer advisers. Tests of chatbots or robots in contact with customers exist, "but they remain very limited, because of performance deemed unsatisfactory".

Source: Le Monde économie, December 2nd, 2017 by Véronique Chocron

Posted by Nina Legras at 10:24

8 December 2017

Big Data Market in Smarter Cities: Forecast (2017-2022)

LONDON, Nov. 20, 2017 /PRNewswire/ -- Big data can best be defined as the capture, storage, search and analysis of large and complex data sets that are generally difficult to process or handle with traditional data processing systems. A smart city corresponds to the integration of Information and Communication Technology (ICT) and the Internet of Things (IoT), in a secure fashion, to manage a city's assets such as schools, hospitals, power plants and waste management, among others.

Smart cities utilize several technologies to improve the performance of health, transportation, energy and education, among others, in order to provide higher levels of comfort for their citizens. One technology with huge potential to enhance smart city services is big data analytics, which plays a key role in making cities smarter.

The report on the big data market in smarter cities is segmented into four verticals: data generators, type, data type and application. Data generators refer to the sensors, recorders, consumer electronic goods and other devices that act as data sources.

By type, the report is classified into infrastructure, software and services. The three data types covered are structured, unstructured and semi-structured. By application, the report is segmented into city planning & operations, public safety, IoT, transport & CO2 emissions, and others. The report is also segregated by geographic region (Americas, Europe, APAC and RoW) to provide in-depth knowledge of the market.

Government regulations promoting big data analytics in smart homes in order to improve people's living standards, along with enhanced smart-traffic technologies for managing traffic efficiently, act as key growth drivers of the big data market in smarter cities.

North America holds the major share of the big data market in smarter cities, owing to the major smart cities in this region, and is expected to remain the leading market during the forecast period as many of the region's cities move to become smarter. Asia Pacific is the fastest-growing segment during the forecast period, due to the extensive smart-city momentum in developing countries in the region such as China, India and Korea.

Posted by Nina Legras at 11:59