Gizmondo: Your worst Alexa nightmares are coming true.

What’s the most terrifying thing you can imagine an Amazon Echo doing? Think realistically. Would it be something simple but sinister, like an artificially intelligent speaker recording a conversation between you and a loved one and then sending that recording to an acquaintance? That seems pretty bad to me.

Tech giants hit with €8.8 billion in lawsuits over forced consent barriers on day one of GDPR.

Sorry for the late post; it’s been a crazy day. It was also day one of GDPR enforcement in Europe, and it would seem that the easy transition hoped for by tech giants like Google and Facebook just got a lot tougher.

Max Schrems is an Austrian data privacy activist who objected to the practice of forced consent in software. Here is how it works: after you install a shiny new piece of software, a window pops up with a long list of demands you must agree to, or you are simply not allowed to use it. Somehow the tech companies consider this a binding contract, and many people agree with that philosophy. After all, you did sign the document, or clicked agree, or whatever other form it took. I have even seen this shady business practice performed in such a way that even the act of paying for the service is taken as consent to a long list of small-print legalese. I don’t consider this a binding contract at all, and I will try to explain my reasoning shortly. More importantly, the GDPR doesn’t agree with it either, and neither did Max. Lucky for Max, he seems to have the kind of bankroll to do something more about it than just blog.

Max hit Facebook, Instagram, WhatsApp, and Google with complaints totaling 8.8 billion euros, Billion with a capital B. Go you, Max. I’m sure this will result in a lot of positive press, and at the very least it will settle for more cash money than any one man could ever spend. But more importantly for the rest of us, it will show how much resolve the European Union has for enforcing this legislation, and also how many headaches and how much cash the big tech companies are willing to endure to keep their stranglehold on your personal data.

Looked at objectively, the tech companies have a vested interest in spending as much as it takes to fight this, because collecting data and selling it is their entire business model. They all work on a flipped business model, where you are no longer the customer; you are the product. I have heard people complain about Google’s customer service before. The funny thing is, Google has great customer service. Sign up for Google AdWords and spend cash on search ads, and Google will be at your beck and call to address your every concern. This is not unprecedented; broadcast TV worked off this model for a long, long time. Then again, look what is happening to them now that digital streaming offers a better product, and most people reluctantly accept advertisements as an acceptable way to get a product for free.

What is much more disturbing is the idea of these companies selling dumps of highly personalized data to whatever company or organization currently has the cash to pay for it. (Note: the government would never have to pay; its payment is allowing the tech giant to keep existing.) Granted, Google and Facebook both claim that they do not sell data directly to third-party firms, but there are scenarios where the firms could truthfully say this and still hand the data over. For instance, they could give the data away as part of lucrative contracts, or in exchange for massive discounts on services they need. The sure thing is that if there are loopholes, they will be exploited. Doublespeak is a real and existing problem in American corporate culture.

A lifetime ago I used to work in the ski industry, and this problem reminds me of that time. The resorts would print this sort of implied consent form in small print on the back of every ticket. Their logic was that if a person bought a ticket, they were agreeing to every word written on the back of it. This totally ignores the fact that you don’t even see the ticket until after you pay for it, so you could never have agreed to anything it says, let alone read it in the first place.

People would inevitably get hurt at the ski area, would inevitably have to sue to cover hospital bills and lost wages, and, surprisingly, would win more often than not. I would always hear the words ‘contract under duress’ come up in these cases, which basically means that a contract cannot be enforced if a person was coerced or forced to sign it. It is not that hard an argument to sell: nowhere in the process, from marketing to booking to traveling to physically walking up to the ticket booth, is there any discussion of a contract being signed. Your first introduction to the contract is when you notice it on the lift, after you have already paid for it and thus allegedly agreed to it. You cannot, and should never, be forced to sign away your rights. The ski resorts have a legal obligation to keep you as safe as they can, including an obligation to close their doors if conditions make that impossible.

So to close: trying to coerce someone into a contract is a crap business practice and pretty much goes against the principle of being a decent human being with their shit together. It should be illegal to the extent that it isn’t already, which, by my interpretation of the law, it is.

Tesla releases partial Autopilot & infotainment suite code on GitHub.

Hey all, yesterday’s story was about Google, and how the takedown of its longstanding ‘don’t be evil’ policy is an especially bad omen in the face of the company becoming a literal military contractor by building Project Maven for the Pentagon. It is a sentiment I fully believe in, and the precedent it sets scares me a lot. That being said, it was more political than I like and was poorly received by the public, I think mainly because of the title, but also because I had my sarcasm turned up to eleven. Live and learn. So while not apologizing for what I believe in, I do want the tone of today’s article to show the other extreme.

So in that vein, let’s turn our attention to Tesla Motors, which open-sourced a chunk of its autonomous vehicle code and infotainment suite code on GitHub yesterday. The company had been getting flak for not doing it sooner, since it ships its cars with installed code that is protected under the GPL, or General Public License. The license, paraphrased, states that anyone distributing GPL-covered code must also make the corresponding source available under the same open-source terms, so that any user can study and modify it. Publishing that source was exactly what Tesla had failed to do.

Even with that former controversy, I think Tesla did exactly what it should have in this case. They were responsible and took the time to examine their liability in releasing the code of a machine that is more than capable of killing a person if it were to get out of control. It would seem that they also published only a portion of the code, in an effort to prevent overenthusiastic DIYers from running out and building their own autonomous cars. Blindly handing it all over to the public would be bad, but hiding it away entirely is also bad. Remember, for every bad actor you can imagine digging through this code, there are plenty of white-hat hacker/developer types who probably own their own Teslas and are very interested in auditing the security of the vehicles they use daily. I would argue that these people are in many cases better at building secure devices than the overworked Tesla engineers who are inevitably pressured to ship products before they are comfortable the work is complete. If people buy a product and have training in a specialty pertaining to it, then they need the freedom to disassemble it to understand how it works and, more importantly, why using it will be safe for themselves and their families. That being said, the company also has a reasonable responsibility to protect the public from actors who could potentially weaponize this technology, be it a hobbyist, a foreign agent, or anyone else.

They certainly have a responsibility not to take their autonomous vehicles, slap sentry turrets on top of them, and sell them to the Pentagon for big profits. Sorry, no preaching, I promise. The last thing I will say on the topic is that any time you take a firearm, attach it to a computer, and then decide whether people live or die based on a line of if-else statements, it is irresponsible at best.

The Tesla GitHub repository can be found here.

Greed & apathy now back on Google’s menu.

The popular search provider turned independent surveillance company ‘quietly’ republished its Code of Conduct yesterday, which now barely bears any mention of its famous phrase “don’t be evil”. I cannot overstate how profound and disappointing this is. I’m certain the casual observer looks at this and thinks: so what, it’s just a phrase, I’m sure it just gets replaced by some other jargon. In reality, this is the final nail in the coffin of Silicon Valley morality in general. There was a point two years or so ago when I was interviewing with Google. I didn’t have the sort of deep understanding of algorithms that Google values, so I was passed up. I was disappointed at the time; now I’m glad.

Don’t be evil was introduced at Google in 2000 or early 2001 by Paul Buchheit. He is quoted as saying he “wanted something that, once you put it in there, would be hard to take out”, adding that the slogan was “also a bit of a jab at a lot of the other companies, especially our competitors, who at the time, in our opinion, were kind of exploiting the users to some extent”. Unfortunately, now that Google is working on Project Maven, a military AI drone program for the Pentagon, it would seem the company has skipped past exploiting its users and gone straight to bombing them.

It is a sad tale, really, if you choose to look at it that way. I’m sure the sentiment of don’t be evil still exists. I’m sure there are hundreds of Google employees who understand that holding onto your humanity is a little more important than some short-term profits. But in the black hearts of the shareholders? Not a chance. Any doubt of that can be squashed by taking a fleeting glance at any article on this written by a financial news source; they have dug deep into their bag of buzzwords to justify the move as ‘inevitable’. Shareholders look at the universe through the ambivalent lens of their stock price. In fact, don’t be evil might well have faded into obscurity had it not been a shocking centerpiece of the company’s 2004 IPO. To quote the document directly: “Don’t be evil. We believe strongly that in the long term, we will be better served—as shareholders and in all other ways—by a company that does good things for the world even if we forgo some short-term gains.” It was the founders’ goal to make it very clear to potential shareholders that leaving the planet a better place than you found it is better for them, for us, and for everyone else who is a human.

Sure, don’t be evil is vague. It could possibly mean don’t eat the piece of cake in the company fridge if you don’t know whose it is. But it CERTAINLY means don’t build flying, artificially intelligent death bots! I was trying to be a little funny there, but it’s a valid point. Most of us know what evil is and isn’t. Sure, some things are hard to quantify, but the important things are not. If don’t be evil is so simple and vague, then there should be no problem including it in the mission statement. To me, its omission can only indicate a desire by Google and/or its shareholders to do evil things, and Project Maven may, unfortunately, be the new normal. The idealized worldview of 2000-era recent Stanford graduates made rich is now slowly degrading into the dystopian future that Wall Street power brokers have always desired.

So the simplicity of don’t be evil will now be paved over with carefully crafted buzzwords designed to let people feel halfway good about themselves while still allowing the death bot program to exist under (some) people’s radar. The general idea of don’t be evil stays, but now it is ‘Ethical Business Practices’: sterile enough to mean what you want, when you want. Progress continues, the shareholders are happy. Life is good.

Just keep watching the skies…