I’m most worried about the rise of fintech apps enabled by APIs like Plaid. The media seems more worried about 10-year-old Facebook likes being sold than about a perpetual real-time feed of bank transaction data ending up in the wrong hands or in the hands of a nefarious developer.
For the record, I’m highly critical of Plaid and hope the tech media catches on soon. They do not require developers to communicate which permissions they are asking for when onboarding new customers (I don’t think that is even an option if developers wanted it), and there’s no central UI for an end customer to review the permissions they’ve granted across developers and revoke them. I don’t think Plaid imposes any requirements to encrypt this data on the developer side, and I have no idea how they audit developers to make sure they aren’t using various endpoints in violation of their developer terms.
I worked on streaming card transaction data (from Mastercard) five years ago. It's as shitty and invasive as a group of soulless bank BI people can make it. Their detachment from the human damage they were creating, and the way they basked in their own smartness, was scary and disgusting.
> a perpetual real time feed of bank transaction data.
Jeez, that does sound terrifying. I guess that's already here in my credit card companies' databases, but at least (in the USA) I have some legal protections.
Shoshana Zuboff is one of those people who make me upset when I discover them. Why didn't I hear about her and her books much earlier? Is it only because she doesn't market her books well enough?
The Age of the Smart Machine (1988) is truly visionary and well written.
edit:
I'm currently reading The Age of Surveillance Capitalism.
The book has well-developed concepts like 'behavioral surplus' and 'instrumentarianism'. There are also clever terms like 'radical indifference', 'observation without witness', and 'equivalence without equality'. They are just plain insightful; I can instantly recognize them as things I could not conceptualize before.
These kinds of books go through different channels than the ones engineers follow. Engineers pick this stuff up when it reaches the mainstream, but they don't really participate much in the tech-critique world.
I've also just started reading 'The Age of Surveillance Capitalism'.
Some of my colleagues at work use the term 'digital native' to refer to (young) people who have grown up with ubiquitous computing. Next time someone says that, I should now perhaps say "oh, you mean, the wage slaves of the surveillance capitalists".
I manage a deep learning team, but I have some reservations about the technology. IMO deep learning is best for optimizing back-end systems and not good for systems that ‘touch’ people: lending decisions, automated criminal sentencing, targeted marketing based on personal information, etc.
For me, the problems are lack of explainability and possible bias.
There are many great applications for deep learning and AI in general but some guard rails must be in place for public good.
Being able to shrug and say “the algorithm did it” is almost certainly seen as a killer feature by any authoritarian, soulless megacorp and others. Instead of having to take responsibility for decisions, they just point at the box they programmed to act a certain way and blame it. Unless a lot more people understand GIGO, this is going to stick around.
These systems will be the perfect "faceless bureaucracy". Nobody knows exactly why they do what they do, nobody can be held responsible, but the people who deployed them will get the profits.
I think you have captured the essence of it. What I wonder, though: have systems before been without bias? Is the DL/ML bias worse than what we had before?
IMHO, “deep” is just the new “smart”. People are doing the same thing they were doing before, just bigger and better, but when you are building a new company you need to use the adjective of the times.
We have had “smart”-everything, and it already sounds tired; hence “deep”-everything. Let's see how long it lasts.
I've recently realized how trivial it is to detect suspicious activities in real-time video feeds just by tracking human poses, and how this is now an almost completely solved problem (basically it can rely on incremental accuracy improvements in the underlying models). I have doubts this will in any way "democratize AI"; instead it might end up as a powerful weapon of oppression. No wonder most of the papers on this topic originate from China.
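To make the point concrete: once a pose estimator hands you per-frame joint coordinates, flagging "unusual" movement is a few lines of array math. This is only a toy sketch with made-up data; the keypoint shape, the 17-joint count, and the speed threshold are assumptions, not any real pose-estimation API.

```python
import numpy as np

def flag_suspicious(keypoints, speed_thresh=0.5):
    """keypoints: array of shape (frames, joints, 2) holding
    normalized (x, y) positions from some upstream pose estimator.
    Flags frames whose mean joint speed exceeds a threshold --
    a crude stand-in for an 'activity anomaly' score."""
    velocity = np.diff(keypoints, axis=0)            # (frames-1, joints, 2)
    speed = np.linalg.norm(velocity, axis=-1)        # per-joint speed
    mean_speed = speed.mean(axis=1)                  # per-frame average
    return np.where(mean_speed > speed_thresh)[0] + 1  # flagged frame indices

# Synthetic feed: 10 calm frames, then a sudden jump at frame 10.
calm = np.tile(np.linspace(0, 0.1, 10)[:, None, None], (1, 17, 2))
jump = calm[-1:] + 1.0
feed = np.concatenate([calm, jump])
print(flag_suspicious(feed))  # [10]
```

A real system would feed richer pose features into a trained classifier, but the pipeline shape is the same: keypoints in, anomaly score out, which is exactly why it scales so easily.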
> "That’s why the adjective that so many people are affixing to all of these new capabilities to convey their awesome power is “deep.”"
One of the best pieces of academic marketing was calling this set of techniques "deep" learning. The word is so rich with connotations, it immediately brings to mind all the synonyms: profound, complex, arcane, etc. It makes people ascribe far more complexity to the system than it actually has.
When in reality, it's just a "massively multi-layered and multi-stage" network. But that doesn't sound nearly as profound, and doesn't allow journalists to spin wild tales.
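The "massively multi-layered" point is easy to demonstrate: a minimal sketch of a "deep" forward pass is just repeated affine transforms with a nonlinearity in between. The layer sizes and random weights here are arbitrary, chosen only for illustration.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def deep_forward(x, weights):
    """Forward pass of a 'deep' network: stacked layers,
    each an affine transform followed by a nonlinearity."""
    for W, b in weights:
        x = relu(x @ W + b)
    return x

# A 6-layer stack -- "deep" by the standards that coined the term.
rng = np.random.default_rng(0)
dims = [8] + [32] * 5 + [4]
weights = [(rng.normal(size=(i, o)) * 0.1, np.zeros(o))
           for i, o in zip(dims[:-1], dims[1:])]

out = deep_forward(rng.normal(size=(1, 8)), weights)
print(out.shape)  # (1, 4)
```

Everything hard about deep learning lives in training those weights, not in the architecture itself, which is why the word carries more mystique than the mechanism deserves.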
People will nearly always opt for the language that conveys the most meaning, even if doing so outstrips the underlying phenomena being named, since the point of language is to convey meaning.
On my mobile, I keep Safari's JS turned off and always browse in incognito mode. When I need JS (like right now, to comment), or when I want to log in to sites that require it (like LinkedIn), I use the DuckDuckGo app, which requires a fingerprint to open.
I wonder if it’s time for software engineers to form our own union or guild to combat the misuse of our profession in corrupt and immoral ways. We would have immense power as a group, but on our own we’re all beholden to our employers, which makes us complicit in doing work without thought to the long-term societal damage we do.
I posed the following to @Dang a few days ago with respect to what one would think is, at minimum, a responsibility of YC (and the greater VC/SV population) to acknowledge -- though I don't see this happening any time soon:
----
[How can we] Find a way to have a serious objective talk with the greater community on the extraordinarily global reaching issues of the impact of Silicon Valley on society, community, culture as a whole.
Look at what has emerged in the last 1.5 decades alone from "unicorns" in Silicon Valley:
* US policy seemingly being set/disrupted via Twitter
* Mental health studies coming out on the negative impact of Facebook
* Election manipulation through ad-powered platforms such as Google and FB
* Massive cultural dialogues and political revolutions being fueled through Twitter
* Assassinations being corroborated through an Apple Watch
* Global spying and surveillance conducted through all our connected technology
Just to name a few of the globally impactful issues of our day that directly stem from the efforts of Silicon Valley specifically and the tech industry in general.
As the preeminent VC company in the minds of any young entrepreneur who wants to build the Next Big Thing, I would pose that YC actually has a social responsibility to, at a minimum, foster a conversation on these issues in a meaningful, serious and deep manner.
What are the consequences of MASSIVE success of a company?
----
The article makes me wonder if the author is aware of the technical meaning of "deep" in the context of the term "deep learning." Not that I disagree with the article; these things tend to take on a life of their own, and that's just how it goes with language and culture. But at least in the case of machine learning, "deep" is not just arbitrary terminology to sound fancy: it refers to a series of breakthroughs allowing incredible training performance on multi-layered neural networks, where "deep" specifically contrasts these results with the prior state of the art in three-layer networks. And presumably this use of the term is the source of several of these other "hyped" uses, perhaps with the exception of "deep state," so it's frustrating to see it thrown into the same basket.
The term 'surveillance capitalism' has become rather misleading, especially since Snowden pretty much showed the whole thing isn't really about either terrorism or capitalism, but about control. It forgets the relationship between big tech and the state, which today are sometimes the same thing.
The Intercept has published an interview with the author (https://theintercept.com/2019/02/02/shoshana-zuboff-age-of-s...), and I found it compelling enough to immediately start reading the book.
> You’re not technically the product, she explains over the course of several hundred tense pages, because you’re something even more degrading: an input for the real product, predictions about your future sold to the highest bidder so that this future can be altered.
> it’s clear that surveillance capitalists have discovered that the most predictive sources of data are when they come in and intervene in our lives, in our real-time actions, to shape our action in a certain direction that aligns with the kind of outcomes they want to guarantee to their customers.
If surveillance capitalism were so successful, you would expect overall ad spending to have spiked recently, since people claim to have found the holy grail that turns ads directly into profits. But it hasn't.
And they can sell information to your insurance company, etc. It doesn't have to be only about targeting you with ads; information about you can be valuable in other ways.