dgallagher's comments

dgallagher | 10 years ago | on: The Internet of Way Too Many Things

I walked through Target's Open House in SF a few weeks ago; I'd recommend visiting if you're in the area. It's a pretty slick product-display space. Each "room" has a projector which gives an overview of four or five products and how they tie together in your life. One of the rooms had a Kinect mounted next to the projector; I'm not sure what it was being used for.

The main lobby has a couple of long tables displaying all of the products demo'd in the rooms, along with an interactive Surface-like table which detects when you get near it and moves floating sprites around. They had displays on the wall listing the most popular products, and a few salespeople to answer questions. IIRC there were approximately 40-50 products displayed. Kudos to Target for setting the space up.

Everything being sold felt like it would fit perfectly inside a Brookstone, or a Sharper Image back when they still had retail stores. Most of them were "vitamin" products rather than "aspirin", which lends weight to Allison Arieff's criticism in the article: "What the products on display have in common is that they don’t solve problems people actually have."

That's very fair to say. There were a few items which did solve real problems, like Nest, which can help reduce heating costs, but most things sold didn't fit into that category. Many were "neat" things you could entice someone with disposable income to splurge on.

dgallagher | 10 years ago | on: On Swift's birthday, where do we go from here?

I'd like to see C++ support added to Swift, similar to what Objective-C has with Objective-C++. There are some C++ frameworks which are nice to use from Obj-C, like Box2D. Currently, to get them to work with Swift, you have to write an Obj-C or C wrapper around the C++ framework and then import that into Swift.
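A minimal sketch of that wrapper approach, assuming a hypothetical physics class standing in for something like Box2D (none of these names are real Box2D or Swift API): the C++ type is hidden behind an opaque pointer, and plain C functions are exposed for Swift to import through a bridging header.

```cpp
// --- C++ side (invisible to Swift) ---
// Hypothetical stand-in for a C++ framework class such as b2World.
class PhysicsWorld {
public:
    explicit PhysicsWorld(float gravity) : gravity_(gravity) {}
    float gravity() const { return gravity_; }
private:
    float gravity_;
};

// --- C shim (what the Swift bridging header would declare) ---
// Swift sees only an opaque handle plus free functions.
extern "C" {
    typedef void* PhysicsWorldRef;

    PhysicsWorldRef physics_world_create(float gravity) {
        return new PhysicsWorld(gravity);
    }

    float physics_world_gravity(PhysicsWorldRef w) {
        return static_cast<PhysicsWorld*>(w)->gravity();
    }

    void physics_world_destroy(PhysicsWorldRef w) {
        delete static_cast<PhysicsWorld*>(w);
    }
}
```

The downside, as noted above, is that every class and method you want to reach from Swift needs one of these hand-written C (or Obj-C) pass-through functions.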

dgallagher | 11 years ago | on: What is going to happen in 2015

http://www.twitch.tv/gyratory/b/601749359

That's an early VR demo of TxK being ported from the PS Vita; it's a Tempest remake (fast-paced twitch shooter). I haven't played it myself, but those who have say great things about it. An example of a non-simulator game. What's remarkable is that Jeff (one of the devs) doesn't have stereoscopic vision.

Most of TxK is played facing toward a web without a lot of head turning. Certain games like this will benefit simply from being in 3D, along with VR's total-immersion effect.

dgallagher | 11 years ago | on: Why movies look weird at 48fps, and games are better at 60fps

This AnandTech overview of nVidia's G-Sync is worth reading (meshes a bit with what Carmack mentioned about CRT/LCD refresh rates in that talk): http://www.anandtech.com/show/7582/nvidia-gsync-review

It's a proprietary nVidia technology that essentially does reverse V-Sync. Instead of having the video card render a frame and wait for the monitor to be ready to draw it like normal V-Sync, the monitor waits for the video card to hand it a finished frame before drawing, keeping the old frame on-screen as long as needed. The article goes into a little more detail; they take advantage of the VBLANK interval (legacy from the CRT days) to get the display to act like this.
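The timing difference can be sketched as a toy model (hypothetical numbers; this ignores scan-out time and G-Sync's minimum-refresh window): under classic V-Sync a finished frame waits for the next fixed refresh tick, while under G-Sync the monitor waits on the GPU, so a finished frame is drawn immediately.

```cpp
#include <cmath>

const double kRefreshMs = 16.67;  // 60 Hz fixed refresh interval

// Time (ms) a finished frame sits waiting under classic V-Sync:
// the display only flips on fixed ticks at multiples of kRefreshMs.
double vsyncWaitMs(double frameDoneMs) {
    double nextTick = std::ceil(frameDoneMs / kRefreshMs) * kRefreshMs;
    return nextTick - frameDoneMs;
}

// Under G-Sync the relationship is reversed: the monitor holds the old
// frame until the GPU hands over a new one, then draws it right away,
// so there is no added wait in this toy model.
double gsyncWaitMs(double /*frameDoneMs*/) { return 0.0; }
```

For example, a frame that finishes 20 ms into the run misses the second 60 Hz tick at 16.67 ms and waits ~13.3 ms for the one at 33.34 ms under V-Sync, but is drawn immediately under G-Sync.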

dgallagher | 11 years ago | on: Moving away from Puppet: SaltStack or Ansible?

Does anyone have experience using configuration management software in a heterogeneous environment? For example, I've seen large environments running Windows 2008/2008R2/2012/2012R2 alongside various flavors and versions of Linux, including Ubuntu Server, CentOS, SUSE, etc. What's the pretty? What's the ugly?

I understand that consolidating and standardizing operating systems is usually the best state to be in, but for a lot of larger companies running legacy software it's not economically feasible.
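For concreteness, one way a mixed fleet gets handled in Ansible (the hostnames here are hypothetical) is per-group connection variables in the inventory, so one playbook run can reach Windows hosts over WinRM and Linux hosts over SSH:

```
# Hypothetical inventory for a heterogeneous environment
[windows]
win2008r2-app01.example.com
win2012r2-db01.example.com

[windows:vars]
ansible_connection=winrm
ansible_port=5986

[linux]
ubuntu-web01.example.com
centos-cache01.example.com
suse-erp01.example.com

[linux:vars]
ansible_connection=ssh
```

The ugly tends to show up one level down: the tasks themselves still fork per platform (MSI vs. apt vs. yum vs. zypper), which the inventory can't hide.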

dgallagher | 11 years ago | on: Macintel: The End Is Nigh

What if Apple introduced workstation-class ARM another way: make a really powerful iPad/iPhone which could sit in a dock with a keyboard/mouse/monitor and run both iOS and ARM OS X?

x86 computers could continue to exist for high-end users, but typical users might be content with a hybrid tablet/computer. As ARM increased in power, x86 might disappear entirely.

Your comments on Windows compatibility/virtualization are spot on. Cloud streaming and Citrix/XenApp can help in some situations here. In a few years virtually all major apps will probably be cloud/browser-based (Microsoft Office, Adobe Suite, streaming games/apps, etc.), and Windows on Mac might not be as important then. It's possible Microsoft might release an ARM Windows too (besides RT). If ARM gets popular in the datacenter, you may see a Windows Server 2015 ARM Edition. They've done this in the past with Itanium.

dgallagher | 11 years ago | on: Google Cardboard

> I've heard of the phobia method, as your body eventually exhausts its adrenaline stores and you're able to address the issue more rationally instead of under physiological duress.

If you're talking about Adrenal Fatigue, then that's pseudoscience not backed by the medical community[1] (though you will find plenty of "solutions" for it on the internet, for your money of course ;) ).

[1] http://en.wikipedia.org/wiki/Adrenal_fatigue

dgallagher | 11 years ago | on: How to get business ideas – remove steps

It's quicker sometimes.

This morning I made some oatmeal. I took a 1/2-cup measuring cup, scooped out some oatmeal, shook it quickly to remove the overflow, threw it in a pot, and dropped the measuring cup in my sink. It took around 5-10 seconds.

If I had to do the same by weight, I would have had to get my scale out, put a bowl on it, zero the scale, slowly pour oatmeal into the bowl until it reached the desired weight, throw that into a pot, put the bowl in my sink, and put the scale away. That would probably run around 15-20 seconds. And if I screwed up and poured too much into the bowl by accident, it would take a lot longer to correct than briefly shaking a pre-sized 1/2-cup measuring cup.

Other times a scale is much easier to use too; it just depends on the context.

dgallagher | 12 years ago | on: Stanford study finds walking improves creativity

Anecdotal (10+ years of running experience): I've found that jogging at slow speeds (~6 MPH) has a creative effect on me similar to walking (2-4 MPH). Running at medium speeds (~8 MPH), I still notice an increase in creativity as long as I have enough energy, though it's a bit reduced compared to jogging. Running at high speeds (10+ MPH), things are so intense that my mind can only focus on breathing, running, and mind tricks to keep going, so creative thinking goes right out the door.

It would be interesting to see a scientific study done on it.

dgallagher | 12 years ago | on: Is Math a Young Man's Game? (2003)

I don't believe it's about "not caring". I think it's more about having a finite number of productive brain-cycles per day and choosing what to spend them on (opportunity cost).

If you spend all day thinking about others, you won't get any math done. If you spend all day thinking about math, you won't be in tune with others. If you can balance things out properly, you can do both.

Sometimes it's necessary to go to one extreme for a long period of time. If you're working on a really hard problem, it might be best to isolate yourself from society for days/weeks/months at a time to figure it out. If a close family member is stricken with a major illness like cancer, it might be best to focus and spend your thoughts, time, and emotions entirely with them.

In reality it's usually a balancing act between different things. But you can understand why it's sometimes necessary to go to extremes.

dgallagher | 12 years ago | on: Show HN: Fez, a file-based build tool for JavaScript

Also, there's a popular video game named "Fez" which uses the same fez hat. It probably won't lead to confusion, since the game isn't a file-based build tool, though it is something you might have to indirectly compete with in search results. :)

dgallagher | 12 years ago | on: Optimal Characters Per Line

I guess you could say HTML meets that standard. For example, <h1> through <h6> headers can be rendered by the browser accordingly; most default renderers still display things much the same way Netscape Navigator 2.0 did back in the Win 3.1 days (disable CSS on a site and you'll see).

Marked on the Mac, a Markdown renderer, loads .mmd files and renders them against a CSS template which you can swap out. That fits the MVC idiom nicely, although it's far from a perfect implementation, as it's merely meant for viewing Markdown files, not everything on the internet.

dgallagher | 12 years ago | on: Optimal Characters Per Line

Curious; what type of display were you reading on? How big is it? What was its resolution?

--------------------

Reading it on a 27" 2560x1440 display was really nice. Same in vertical mode on a 9.7" iPad, but not so good in horizontal mode; it felt constricted like watching a letter-boxed 16:9 movie on a 19" 4:3 TV screen.

What I've found is that optimal text formatting varies greatly across devices, and that's very hard to design for given the variety of screens: small devices like cell phones and grayscale Kindles, larger 24"-30" monitors, and even larger HDTVs. On top of that, there are DPI variances.

I've always been a fan of MVC, separating the model (actual document data/text/pictures) from the view (how it's displayed), and using a controller (device/screen-specific code) to display things optimally.

It would be nice if there were a standard allowing authors to simply write documents, saving them as models. Then each device could read the standard model file and, using custom controller code, generate a view displaying the text/images optimized for its display.

Right now you see web developers creating multiple CSS layouts for different screen sizes. This works for most cases, but not edge scenarios. It feels very hacky, and I can't foresee people building websites/content-delivery mechanisms like this 10 years into the future. There must be a better way.
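The "multiple CSS layouts" approach looks something like this sketch with media queries (hypothetical breakpoints): the same markup (the model) gets a different presentation (view) per screen class, with the browser acting as a crude controller.

```
/* Default layout for mid-size screens */
article { max-width: 34em; margin: 0 auto; }

/* Phones: let text fill the narrow screen */
@media (max-width: 480px) {
  article { max-width: none; margin: 0 0.5em; }
}

/* Large monitors: wider measure, slightly bigger type */
@media (min-width: 1600px) {
  article { max-width: 40em; font-size: 1.125em; }
}
```

Every new screen class means another hand-tuned block like these, which is exactly the hackiness described above.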
