Wednesday, October 28, 2015


PCA, the Hitachi software that predicts crimes


Hitachi has designed algorithmic prediction software that claims to know when and where a crime will take place. According to the company, “PCA” should save the police a great deal of time. Barring errors, that is.


Minority Report is not so far off. After PredPol, the algorithmic prediction software used by the NYPD to predict the places where crimes are most likely to occur, here is Hitachi's latest program.

The Japanese company's algorithm claims to go further. Its system, “Visualization Predictive Crime Analytics” (PCA), is said to predict the place and time of crimes, not just to provide a statistical estimate.

According to the website Quartz, the software draws on “data from transport networks, conversations on social networks, weather reports and more”. According to Hitachi, quoted by Fast Company, posts on the social network Twitter are particularly useful because they improve PCA's accuracy by 15%, thanks to the geolocation of tweets and their hashtags (keywords). As is well known, criminals are also on Twitter, and speak publicly about their misdeeds.
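Purely to illustrate the kind of feature extraction described above, here is a minimal, hypothetical Python sketch that aggregates geolocated tweets and their hashtags into per-area counts; the Tweet class, the watch-list of hashtags and the grid size are all invented for the example and are not Hitachi's.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class Tweet:
    lat: float
    lon: float
    hashtags: list = field(default_factory=list)  # keywords attached to the post

# Hypothetical watch-list; a real system would learn which keywords
# correlate with incidents rather than hard-code them.
WATCHED_TAGS = {"#fight", "#party", "#protest"}

def grid_cell(lat, lon, cell_deg=0.001):
    """Bucket a coordinate into a square grid cell (cell_deg degrees on a side).
    PCA's 200 m2 zones would be much finer; this value is only illustrative."""
    return (round(lat / cell_deg), round(lon / cell_deg))

def tweet_features(tweets):
    """Count total tweets and watched hashtags per grid cell."""
    features = {}
    for t in tweets:
        stats = features.setdefault(grid_cell(t.lat, t.lon), Counter())
        stats["tweets"] += 1
        stats["watched_tags"] += sum(tag in WATCHED_TAGS for tag in t.hashtags)
    return features

sample = [Tweet(38.9072, -77.0369, ["#party"]),
          Tweet(38.9073, -77.0368, ["#fight", "#party"])]
print(tweet_features(sample))
```

In a real pipeline, counts like these would be just one small group among the “tens or hundreds of variables” mentioned further down.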

A “risk zone” of 200 m²

The Japanese software is, of course, far from being a “miracle” program. Like PredPol, in reality it only gives a “risk percentage”, together with the nature of the probable crime. But the results are said to be more accurate, since PredPol relies only on police data, not on data collected across the Web. A risk zone of 200 m² is drawn, with a risk percentage.
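As a rough, hypothetical illustration of what a grid of 200 m² zones with a per-cell percentage could look like (this is not the PCA output format, and the toy model below simply normalises incident counts):

```python
import math

CELL_AREA_M2 = 200                      # each zone covers roughly 200 square metres
CELL_SIDE_M = math.sqrt(CELL_AREA_M2)   # about 14 m on a side

def risk_grid(incidents, width_m, height_m):
    """Turn incident coordinates (x, y in metres) into a per-cell risk
    percentage. This toy version just converts counts into shares of the
    total; the article describes a far richer, multi-variable model."""
    cols = int(width_m // CELL_SIDE_M) + 1
    rows = int(height_m // CELL_SIDE_M) + 1
    counts = [[0] * cols for _ in range(rows)]
    for x, y in incidents:
        counts[int(y // CELL_SIDE_M)][int(x // CELL_SIDE_M)] += 1
    total = sum(map(sum, counts)) or 1
    return [[100.0 * c / total for c in row] for row in counts]

# Three incidents in a 50 m x 50 m neighbourhood
grid = risk_grid([(5, 5), (6, 4), (40, 40)], 50, 50)
print(max(max(row) for row in grid))    # risk of the hottest cell, in percent
```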

According to one of the project managers, quoted by Fast Company, PCA will save the police time because “a human cannot cope when there are tens or hundreds of variables that could affect a crime”. The software is also said to require no human intervention. And, icing on the cake, it “learns by itself”, following the principle of machine learning.
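To make the “tens or hundreds of variables” point concrete, here is a minimal machine-learning sketch on synthetic data (using scikit-learn, which the article does not mention); it stands in for, rather than reproduces, Hitachi's model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic training data: 1000 zone/time slots described by 100 variables
# (weather, transit flows, tweet counts, ...) and a 0/1 "incident" label.
X = rng.normal(size=(1000, 100))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 1.5).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score a new zone/time slot: the model weighs all 100 variables at once,
# which is the part a human analyst cannot do by hand.
new_slot = rng.normal(size=(1, 100))
risk = model.predict_proba(new_slot)[0, 1]
print(f"predicted incident risk: {risk:.1%}")
```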

Unfortunately, the eternal problem of false positives remains. The risk is therefore great of patrolling an area where nothing will actually happen, and of mistakenly stopping an innocent person on the basis of police prejudice, the famous “racial profiling”.
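A back-of-the-envelope illustration, with invented numbers, of why false positives pile up as soon as real incidents are rare:

```python
# Hypothetical figures: even a fairly accurate model flags mostly quiet
# zones when incidents are rare.
zones = 10_000               # zone/time slots being scored
base_rate = 0.01             # 1% of slots actually see an incident
sensitivity = 0.90           # the model flags 90% of real incidents
false_positive_rate = 0.05   # and wrongly flags 5% of quiet slots

true_pos = zones * base_rate * sensitivity                 # 90 useful alerts
false_pos = zones * (1 - base_rate) * false_positive_rate  # 495 wasted patrols
precision = true_pos / (true_pos + false_pos)

print(f"alerts that correspond to a real incident: {precision:.0%}")  # ~15%
```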

According to Hitachi, “half a dozen” US cities, including Washington, should test this program shortly.

with Fabien Be


To go further


In the news

  • Big Data: welcome to the era of prediction
  • How Quick hires its staff through predictive recruitment
  • How to give a computer a conscience

