arek2's comments

arek2 | 11 years ago | on: Strongest chess player, ever

Rybka is stronger than top-20 level, so that is not what I claimed.

I was a semi-pro ~10 years ago. I won the championship of my country. I wrote an M.Sc. thesis on evaluation tuning. If I used the standard approach that I already know, six months is a conservative estimate. The question is: what for?

arek2 | 11 years ago | on: Strongest chess player, ever

I have not seen the Stockfish source, but my impression is that computer chess has been going round in circles for the last 10 years, and it's still good old alpha-beta search with more refined heuristics.

The biggest value in studying computer chess for a programmer is IMO in seeing all the different performance optimization tricks.
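To make the "good old alpha-beta search" concrete: below is a minimal sketch, not taken from Stockfish or any real engine, using a hypothetical nested-list game tree (leaves are static evaluations) purely for illustration.

```python
# Minimal alpha-beta pruning sketch over a toy game tree.
# Inner lists are positions; numeric leaves are static evaluations.
def alphabeta(node, alpha, beta, maximizing):
    if isinstance(node, (int, float)):
        return node  # leaf: static evaluation
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:
                break  # beta cutoff: opponent will avoid this line
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, alpha, beta, True))
            beta = min(beta, value)
            if beta <= alpha:
                break  # alpha cutoff: we already have a better option
        return value

# Example tree (made up for this sketch); its minimax value is 6.
tree = [[[5, 6], [7, 4, 5]], [[3]], [[6], [6, 9]]]
print(alphabeta(tree, float("-inf"), float("inf"), True))  # prints 6
```

Real engines layer move ordering, transposition tables, and other heuristics on top of this core loop, which is where most of the "refined heuristics" work happens.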

arek2 | 11 years ago | on: Strongest chess player, ever

I don't think that the present situation, in which the top chess-playing program is free and open-source, is good for innovation.

I estimate that it would take me six months of work to get to the top-20 in the world, and I don't see how I can justify that work to myself.

arek2 | 11 years ago | on: After Google bought Nest, it removed company’s biggest competitors from results

Why did Google ask all those SEO companies to send e-mails to me? Why did Google give them a way to blackmail webmasters (the disavow tool)? The Internet was not supposed to work this way. I should be able to link to whatever I want and care only about the users, without having to worry about how competent or incompetent the search engine creators are, or the SEO industry.

arek2 | 11 years ago | on: After Google bought Nest, it removed company’s biggest competitors from results

I have a website, 5000best.com/tools, with a ranking of web tools, and I have received about 40-50 link removal requests so far. I rejected them all. At first I answered those people; later I stopped answering, because it's a waste of my time.

The reason for all that nonsense seems to be this article: http://googlewebmastercentral.blogspot.com/2012/07/new-notif... and the notifications in Google Webmaster Tools that tell people that spammy links point to their website, but do not tell them specifically which links Google does not like. I understand that Google uses link data mainly to estimate the popularity of a website.

I don't understand why such a large company, a global monopoly with such large revenue and so much data gathered, can't figure out which sites are more popular than others without wasting webmasters' time. What's so difficult about that task? Why shift any work burden onto website owners?

If this task is too difficult, maybe it's time to support the competition, or to create serious competition. Maybe Google does not deserve to be the largest search engine and to collect all the profits.

arek2 | 12 years ago | on: Ask HN: Making a living selling software components, not SaaS?
