dhj's comments
dhj | 8 years ago | on: Show HN: TOTP Based Port Knocking -- two factor firewall
Thank you for your feedback!
dhj | 8 years ago | on: Show HN: TOTP Based Port Knocking -- two factor firewall
Short version:
Client and server share a key.
Each can generate a TOTP based hash.
Client sends hash to server listening on UDP.
If the hashes match, the server opens the normally closed SSH port for a few seconds (long enough to make a connection).
Like a Google Authenticator TOTP code, the correct hash changes every T seconds, so identifying the UDP port and intercepting a code is only useful for a limited time, if at all.
Is this worth turning into a robust daemon? Is there a better way to deal with constant ssh probing? A module in a firewall would be ideal. Environment based config would make it fairly easy to use in provisioning for ssh admin with a smaller scan footprint.
dhj | 9 years ago | on: Luxury Music Festival Turns Out to Be Half-Built Scene of Chaos
Sidewalk Film Festival
Sloss Fest
Brew Fest
Slice Fest
Secret Stages
I'm not sure if it is the local support or that they're just smaller. I expect we may have had a lot of excellent event organizers with solid experience looking for new jobs after City Stages wound down. City Stages was a 20+ year music festival that was generally successful logistically, but wasn't profitable.
When you think about it, though, if there are 5+ festivals with 10-20k attendance every year in every metro area over 500k, there are bound to be some regular screwups.
And port-o-potties always suck.
dhj | 10 years ago | on: Secondary shops flooded with unicorn sellers
In other words you can't sell except as part of a board approved sale of the company or a public offering. I think opportunities to buy are based on new issuance of stock (for accredited investors) not based on trades of existing stock.
dhj | 10 years ago | on: Douglas Rushkoff: I'm thinking it may be good to be off social media altogether
dhj | 10 years ago | on: U.S. Supreme Court Justice Antonin Scalia has died
You are incorrect. Citizens United was decided based on the notion of corporate personhood -- the notion that corporations themselves have rights as if they were a person. There are very strict and upheld limitations on individual monetary contributions to campaigns.
However CU broke that by giving people the ability to launder political money through a corporation.
Also, most non-profits (those 501c3s that want tax exemption) cannot do any sort of campaigning; those that do are subject to taxes.
CU said specifically that corporations are people that can "say" (aka spend) whatever they want to get their message across. People can make individual donations to support this effort essentially getting around existing campaign restrictions.
Money does not equal speech and there was a good reason monetary donations were restricted. By removing the restrictions they have reduced the ability of the average person to be heard because they now have to buy a bigger megaphone than the billionaires.
You really do need to read up on corporate personhood and election law. Let me guess... FOX News fan?
dhj | 10 years ago | on: Sci-Hub: Removing barriers in the way of science
dhj | 10 years ago | on: Chart Shows Who Marries CEOs, Doctors, Chefs and Janitors
dhj | 10 years ago | on: A Day at the Stupid Shit No One Needs and Terrible Ideas Hackathon
dhj | 10 years ago | on: What are the best online web development courses that don't suck?
Free Code Camp. Social code review. A certification that includes doing real projects for non-profits. Even interview prep at the end. I am looking at it as a quick way to brush up on current front end dev. They have HTML, CSS, JavaScript, React, D3, Node.js, etc, etc, etc.
dhj | 10 years ago | on: Letsencrypt support in propellor
That's propellor, the property-based host configuration manager (in Haskell), by coding superstar Joey Hess: https://github.com/joeyh/propellor
Not propellor, the parallel microprocessor by Parallax: https://www.parallax.com/catalog/microcontrollers/propeller
dhj | 10 years ago | on: How Long Before Superintelligence? (1997)
In other words, GAs/EAs barely scratch the surface compared to the complexity we see in nature. The problem is twofold: 1) we guide the evolution with specific artificial goals (get a high score, for instance); 2) the ideal "DNA" of a genetic algorithm is undefined.
In evolution we know post-hoc that DNA is at least good enough (if not ideal) for the building blocks. However, we have had very little success with identifying the DNA for genetic algorithms. If we make it commands or function sets we end up with divergence (results get worse or stay the same per iteration rather than better). The most successful GAs are where the DNA components have been customized to a specific problem domain.
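A toy GA makes both problems concrete (names and parameters here are mine, purely illustrative): the fitness function encodes an artificial goal chosen by the programmer, and the "DNA" is a plain bitstring -- a representation that only works because the problem was shaped around it:

```python
import random

def evolve(length=20, pop_size=50, generations=100, seed=0):
    """Toy GA: evolve bitstrings toward all-ones ("OneMax")."""
    rng = random.Random(seed)
    fitness = lambda genome: sum(genome)     # artificial goal picked by us
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]       # keep the fitter half (elitism)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, length)
            child = a[:cut] + b[cut:]        # one-point crossover
            if rng.random() < 0.1:           # mutation: occasionally flip a bit
                child[rng.randrange(length)] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

Swap in a fitness function the representation wasn't designed for and the same loop stalls or diverges, which is the point about problem-specific DNA.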
Regarding target goal selection, that is a major field of study in itself: reinforcement learning. What is the best way to identify reward? In nature it is simple -- survival. In the computer it is artificial in some way; survival is an attribute or dynamic interaction selected by the programmer.
I believe that multiple algorithmic techniques will come together in a final solution (GA, NN, SVM, MCMC, k-means, etc). So GA is still part of a large and difficult algorithmic challenge rather than a well defined solution. The algorithmic challenge, unlike hardware, doesn't follow an exponential curve -- the breakthroughs could happen next year or in 100 years.
The bandwidth issue is the main reason I would put AGI at 2045-2065 (closer to 2065), but with the algorithmic issue I would put it post 2065 (in other words, far enough out that 50 years from now it could still be 50 years out). Regardless of the timeframe, it is a fascinating subject and I do think we will get there eventually, but I wouldn't put the algorithmic level closer than 50 years out until we get a good dog, mouse, or even worm (C. elegans) level of intelligence programmed in software or robots.
dhj | 10 years ago | on: How Long Before Superintelligence? (1997)
Current Top500: http://top500.org/list/2015/11/
Amazon 2013: http://arstechnica.com/information-technology/2013/11/amazon...
EDIT: As far as a whole data center is concerned, I'm not sure it would be a direct comparison, as bandwidth would not be as high between cabinets. Amazon using their off-the-shelf tech to make a supercomputer is probably a better indication of how they compare. Of course, at 26,496 cores that may be a data center!
dhj | 10 years ago | on: How Long Before Superintelligence? (1997)
Like you said, it's a general algorithm issue. We do not remotely understand the brain well enough to simulate it. We have very little idea of what an intelligent algorithm (other than brain sim) would look like.
Also, all of these estimates are based on flops and none of them consider bandwidth. We are a few orders of magnitude lower in gigabits/s than we are in flops. I personally think that is where the bottleneck is. 100 billion neurons with a 100 gigabit/second pipe could interact once per second and then only at the level of a toggle switch. Granted not all neurons have to interact with one another, but we are significantly behind in bandwidth and structural organization.
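The arithmetic behind that once-per-second figure (the 100 gigabit/s interconnect is the assumption in the comment above):

```python
neurons = 100e9          # ~10^11 neurons in a human brain
bits_per_update = 1      # one "toggle switch" worth of state per neuron
link_bits_per_s = 100e9  # assumed 100 gigabit/s pipe

# How often could every neuron exchange one bit over that link?
updates_per_second = link_bits_per_s / (neurons * bits_per_update)
print(updates_per_second)  # 1.0
```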
Bandwidth is intimately tied to processing capacity. I don't think the bandwidth will be there until 2045-2065, and like you say, we have serious software/algorithm/understanding deficiencies to resolve before then. I would be very surprised if we get general AI before 2065, if ever. I do not expect it in my lifetime and would be pleasantly surprised if it happened.
dhj | 10 years ago | on: AI is transforming Google search – the rest of the web is next
dhj | 10 years ago | on: AI is transforming Google search – the rest of the web is next
dhj | 10 years ago | on: Awesome machine learning
https://github.com/sindresorhus/awesome/blob/master/awesome....
Edit: Don't know if any other infrastructure has sprung up around it. Maybe an awesome list of awesome tools?
dhj | 10 years ago | on: Graphene optical lens 200 nm thick breaks the diffraction limit
dhj | 10 years ago | on: Show HN: An algorithm to automatically turn photos of food into faces
That is the problem: it always seems to be random IPs. That's why fail2ban is a losing battle. fail2ban works per IP, but no matter how sensitive the ban rule, there always seems to be an endless supply of new IPs.
I do use keys for SSH access, so disabling passwords does cover most of the safety concern. I guess it is more of an annoyance than anything. It looks huge in the logs, but in terms of network usage it probably boils down to once every few minutes.