Artificial intelligence has arrived in the court system, and it is already being used to put citizens behind bars.

The long-feared fiasco of AI passing judgment on Americans has begun, and according to Chief Justice Roberts it is already causing problems in America's legal system.

Chief Justice Roberts recently attended an event, and was asked whether he could foresee a day “when smart machines, driven by artificial intelligence, will assist with courtroom fact-finding or, more controversially even, judicial decision making.” He responded: “It’s a day that’s here and it’s putting a significant strain on how the judiciary goes about doing things.”

According to Christopher Markou, a Ph.D. researcher at the University of Cambridge:

Roberts might have been referring to the recent case of Eric Loomis, who was sentenced to six years in prison at least in part by the recommendation of a private company’s secret proprietary software. Loomis, who has a criminal history and was sentenced for having fled the police in a stolen car, now asserts that his right to due process was violated as neither he nor his representatives were able to scrutinize or challenge the algorithm behind the recommendation.   — Read More

The report the judge consulted in deciding Mr. Loomis' sentence came from a software product called Compas, which is created and sold to the courts by Northpointe Inc. In their marketing plan we see the true agenda.

As one can see, this is no conspiracy; it is reality. Much like a Facebook profile, Compas builds a profile of an offender and distills data gathered from many different sources into probabilities and charts. The data collected in Mr. Loomis' case is sealed, but according to Markou it can include "a number of charts and diagrams quantifying Loomis' life, behavior and likelihood of re-offending. It may also include his age, race, gender identity, browsing habits and, I don't know … measurements of his skull. The point is we don't know."
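Because the Compas algorithm is proprietary and its inputs are sealed, nobody outside Northpointe can say exactly how it weighs these factors. As a rough illustration only, a risk-assessment tool of this general kind might boil down to something like the sketch below; the feature names, weights, and thresholds are entirely invented and do not reflect the real product.

```python
import math

# Purely hypothetical illustration of a recidivism "risk score".
# Compas itself is proprietary and its inputs are sealed; every feature,
# weight, and cut-off below is invented and does NOT reflect the real product.

def risk_score(features, weights, bias=0.0):
    """Combine offender attributes into a 0-1 probability using a simple logistic model."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))  # squash the weighted sum into a probability

# Invented example inputs (not real Compas factors or weights).
weights = {"prior_offenses": 0.40, "age_under_25": 0.80, "employed": -0.60}
offender = {"prior_offenses": 3, "age_under_25": 1, "employed": 0}

p = risk_score(offender, weights, bias=-1.5)
band = "high" if p > 0.7 else "medium" if p > 0.4 else "low"
print(f"estimated re-offense probability: {p:.2f} ({band} risk)")
```

The point of the sketch is only this: a handful of weighted inputs gets compressed into a single number and a risk band, and it is that number, not the reasoning behind it, that lands in front of the judge.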

As technology integrates ever more deeply with our day-to-day lives, the data available becomes monstrous, and Microsoft's privacy policy shows that the amount already being collected is stupendously large:

The privacy policy clearly states that every interaction you have with Microsoft's software or hardware, be it creating a new Microsoft account, entering a search term on Bing, seeking help from Cortana, or contacting customer support, is recorded.

The motive behind collecting enormous amounts of information from consumers was expressed by Microsoft’s CEO, Satya Nadella. On the day of his appointment, he sent out an email to all the employees, explaining that in the coming decade, every aspect of our life, business and the world would be digitized. To achieve this feat, he stated, “This will be made possible by an ever-growing network of connected devices, incredible computing capacity from the cloud, insights from big data, and intelligence from machine learning.”

What this means is that Microsoft will go to great lengths to access your personal information. Insights from big data and intelligence from machine learning are only possible once Microsoft has knowledge of your online behavior. The worrying part is that during the Prism fiasco, the NSA collaborated with internet giants (including Microsoft) to tap into users' search histories, emails, live chats, transferred documents, communications, and much more. You can only imagine how many similar illegal surveillance programs are lurking in the shadows, with no whistleblower yet to expose them.  — Read More

To top that off, several other tech giants are looking to capture not only our "personally identifiable information" online but our day-to-day data as well. Nvidia, the graphics card company, is teaming up with governments, security firms, retail stores, and security camera makers to provide video analytics for entire cities. Nvidia plans to use machine learning to analyze video feeds from hundreds of different sources and deliver information on citizens to governments, police forces, businesses, and so on.
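The mechanics of city-scale video analytics are simpler than they sound: a detector runs over each camera frame and emits structured events that can be logged, aggregated, and searched. The toy sketch below runs OpenCV's stock pedestrian detector on a single feed; the camera URL and the event format are invented, and a real deployment would run far heavier neural-network models across many streams at once.

```python
import cv2  # pip install opencv-python

# Toy, single-camera version of a city-scale video-analytics loop.
# The stream URL and the event schema below are invented for illustration.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture("rtsp://camera.example/stream1")  # hypothetical feed
frame_idx = 0
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame_idx += 1
    if frame_idx % 30:          # sample roughly one frame per second at 30 fps
        continue
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    # Each detection becomes a structured "event" that a backend could store and query.
    for (x, y, w, h) in boxes:
        print({"camera": "stream1", "frame": frame_idx,
               "object": "person", "bbox": [int(x), int(y), int(w), int(h)]})
cap.release()
```

Multiply that loop across hundreds of cameras feeding a central database and you have, in essence, the kind of product being pitched to cities.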

The enormous amount of data collected and analyzed by these systems is stored on each individual and reaches far beyond the 'online world.' As Markou and Chief Justice Roberts make clear above, artificial intelligence has made the leap from the online world into reality. The artificial intelligence takeover has commenced. What say you, reader?

6 Comments

Traveler
7 years ago

Rev 13:14 And deceiveth them that dwell on the earth by the means of those miracles which he had power to do in the sight of the beast; saying to them that dwell on the earth, that they should make an image to the beast, which had the wound by a sword, and did live.
Rev 13:15 And he had power to give life unto the image of the beast, that the image of the beast should both speak, and cause that as many as would not worship the image of the beast should be killed.

What we are seeing is the fulfillment of this 2,000-year-old prophecy.

Zeira Corp
7 years ago

For some time it has been a growing concern among academic-level professionals in the AI and robotics fields at various universities and agencies that sophisticated software running on powerful machines, driving robotic bodies, might be given free rein to make kill/no-kill decisions on a battlefield or in other lethal situations. I think it is a valid concern that software might be, or already is being, given such free rein to make decisions that affect people's lives in permanent ways, including possibly putting them to death.

The thing we need to remember is that, while we throw the term “Artificial Intelligence” around as if self-aware ‘living’ machines or programs exist, in fact they do not. There is no true artificial intelligence, only more and more complex “Expert Systems” running on increasingly more powerful and fast hardware. Just processors executing tons and tons of code. They do not feel. They do not ‘think’ in the biological sense. They do not know how to do anything that the programmers didn’t write into their programs. They are not self-aware and they do not work like a living brain. If the output of these programs appears more and more ‘alive’ or human, it’s only because of the increasing complexity of the software and capability of the hardware to run it.

All this doesn't make it any better that we are considering giving, or already give, these systems 'authority' to execute real-world actions that have consequences for living beings. In fact, it might be even worse, because these machines are just that: machines. No group of programmers can think of every possible set of circumstances and write it into their code. There will always be a case where the machine makes a wrong decision based on incomplete or incorrect parameters or algorithms. Self-driving cars will, at some point, wreck. Computers assigned to make life-and-death decisions will, at some point, kill someone who shouldn't be killed.

I guess the question at this point is, "Can we write programs with enough data and instructions to draw on that their decisions are actually, statistically, better than those made by biological humans?"

I think research into expert systems (or call them simulated intelligence, if you like) is intriguing and has a lot of value, but when we grant it more credibility than it deserves it might just be downright deadly. I think we are a long way from creating a program that can pass the Turing Test (developed by Alan Turing in 1950, a test of a machine's ability to exhibit intelligent behaviour equivalent to, or indistinguishable from, that of a human), and even further from any point where we should give such machines and software authority to make life-or-death decisions for humans.

Just my two cents. That and fifty more won’t even buy you a cup of coffee.

James Stamulis
7 years ago

Sometimes (all too often lately) I feel like I am living in the Twilight Zone I watched as a kid. My God, why don't we learn from the past? It also seems that the people running the world are following Hollywood scripts that were supposed to be science fiction, not fact.

Zeira Corp
7 years ago
Reply to Nate Brown

Spot-on correct, Nate. “…they are not teaching why we have to be on watch and why have to be vigilant – rather they are teaching acceptance and tolerance…”
No one wants to be vigilant about anything. They either don’t know the why, think it can’t/won’t happen to them, or they don’t think a strong defensive posture is morally [politically?] correct.
I had a conversation with an unarmed security guard who thought it was just silly to be concerned, as security, about such things as mass shooters and other violent possibilities. I told him, “You and your co-workers are in a position to die if you are not aware of your surroundings.” He went so far as to say that it was perfectly alright with him if a couple guards took bullets, and he was sure if that happened the rest would go to safety and hide. That has to be THE dumbest thing I’ve ever heard out of the mouth of someone in that sort of position. I wonder if he would feel the same way if he was one of the couple who took a bullet? I’ll bet he’d re-think his position.
My significant other worked with a woman who had just finished her teaching degree and had just landed a job teaching history at a local high school. My sig is a Civil War buff and thought this person would be a great source of conversation about history (since she's a history teacher) and the Civil War. This 'teacher' didn't even know when the Civil War was, or anything else about it. The 'teacher' was offended because my sig accidentally 'outed' her as the idiot she is. She is one who got her degree by drinking, cheating, and sexing her way through college. Sadly, there are all too many of those. …and they are teaching our kids!
People, if you can, home-school!