bazqux2 | 9 years ago
A guy from BP told me in 2013 that they forked IPython, replaced all IPython references with Palantir, and tried to sell it to them for $500K p.a.
For me, back in the day (2010) they were less secretive about their technology, which was essentially an ontological reasoner. This was before the Big Data hype boom, and AFAIK Palantir has never been about Big Data. Ontological reasoners have problems that prevent them from scaling or generalizing, so they generally fail. Due to a long, long history of failures, ontological systems have a very bad name. But they look good in guided demos and have a ton of academic backing, so they're easy to sell, as long as you call them something else, which is what Palantir did. So if you want to use ontologies, a better open-source alternative is Protege. But for the problems Palantir targets, I'd recommend standard machine learning technology, where all the good stuff is open sourced.
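For readers unfamiliar with the term: an ontological reasoner derives new facts from explicitly asserted relationships by applying rules until a fixed point is reached. A minimal, hypothetical sketch in Python (the facts, predicates, and rule here are made up for illustration and reflect nothing about Palantir's actual system):

```python
# Minimal forward-chaining reasoner over (subject, predicate, object)
# triples. One hand-written rule is applied repeatedly until no new
# facts can be derived (the "closure" of the fact base).

facts = {
    ("alice", "works_for", "acme"),
    ("acme", "subsidiary_of", "megacorp"),
}

def infer(facts):
    """Rule: if X works_for Y and Y subsidiary_of Z, then X works_for Z.
    Repeat until a fixed point."""
    facts = set(facts)
    while True:
        new = set()
        for (s, p, o) in facts:
            if p == "works_for":
                for (s2, p2, o2) in facts:
                    if p2 == "subsidiary_of" and s2 == o:
                        new.add((s, "works_for", o2))
        if new <= facts:          # nothing new derived: fixed point
            return facts
        facts |= new

closed = infer(facts)
print(("alice", "works_for", "megacorp") in closed)  # True
```

The brittleness described above follows directly from this shape: every predicate and rule must be specified precisely and completely up front, which is why the approach degrades quickly on noisy, real-world data.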
As an aside, Peter Thiel also helped found Quid, a start-up that ripped off the Gephi layout engine and charges $20K per seat per year. They've since rebuilt it, but like Palantir it still isn't solving people's problems, and they've evolved into a consulting firm.
nickpsecurity | 9 years ago
https://www.palantir.com/palantir-gotham/technologies/
That's especially hilarious given that that approach's failures are what led to investment in machine learning in the first place. Such approaches tend to assume precise information, variables, and rules about the world. Most problems Palantir wants to address... the hard ones... are imprecise, with hidden variables and relationships. Machine learning techniques did very well on those kinds of messy problems, so research shifted.
If Palantir is using ontologies for that stuff, then that would certainly be a sign for buyers to run. I still encourage academics to look into such approaches with simple, probabilistic methods in case any advances come up. Fuzzy logic was the main one in my day. Just today I stumbled on a claim that a drone AI achieved human-level performance using it. Some corroboration for R&D in underdog solutions, but not production apps. Haha.
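For context, fuzzy logic replaces crisp true/false predicates with degrees of membership in [0, 1], which is one way to handle the imprecision mentioned above. A minimal sketch with made-up variable names and thresholds (the drone scenario is purely illustrative, not the system from the claim):

```python
# Triangular membership function: degree rises from a to a peak at b,
# then falls back to zero at c. Output is in [0, 1].

def triangular(x, a, b, c):
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical inputs: distance to obstacle (m) and speed (m/s).
close = triangular(4.0, 0, 2, 10)    # degree that "obstacle is close"
fast = triangular(6.0, 2, 10, 18)    # degree that "moving fast"

# Rule: IF close AND fast THEN brake.
# A common choice is min() for fuzzy AND; the rule's firing
# strength is then a graded value, not a hard yes/no.
brake_strength = min(close, fast)
print(brake_strength)  # 0.5
```

The appeal for messy problems is that nothing forces a hard threshold: partial evidence produces a partial response, and several rules can fire at once with different strengths.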
bazqux2 | 9 years ago
I worked on scaling and generalizing ontologies at university and had already switched to working with Big Data / ML at a big company when Palantir tried to recruit me. I talked to some of their senior engineers about their tech and made the point that their tech sounded just like ontologies. I tried to get them to admit what it was so I could be sure I was having an honest conversation with them. They flatly denied it and made it out like the whole thing was their great new idea. I was unimpressed.
I was still interested in working for them; access to hard, interesting problems can be hard to come by. In the end I couldn't take their legendary arrogance and insecurities, which to me are bright red flags of a toxic corporate culture. And they lowballed me. I would have temporarily put up with the toxic culture for large piles of money.
argonaut | 9 years ago
That being said, from my conversations with them, they also have a traditional machine learning team for whenever that approach is needed for a product. But their core product is meant only to support analysis that is mostly done by humans.