rogerdb's comments

rogerdb | 8 months ago | on: What Was Cyberpunk? In Memoriam: 1980-2020 (2020)

I'm not an expert in the genre, but I am a big fan of Neuromancer. I skimmed the article, and while I agree that the "cyberpunk" aesthetic has practically evolved into self-parody, I'm not sure I agree with the analysis/critique of the genre itself.

IMO the core of cyberpunk is about envisioning a world where advanced technology is useful and ubiquitous, yet humanity is worse off than ever ("high tech, low life"). It's a subversion of the simple tech dystopias where the technology itself is evil or is misused by evil people, and more of a realistic counterpoint to the idea that technological progress leads to inevitable utopia.

I'm not sure about more contemporary works that build on those themes. Maybe it's lost its edge as "futuristic" technology has pushed its way more and more into our lives?

rogerdb | 2 years ago | on: Implementing a Personal Transportation Hierarchy

Taxis are complementary to the higher-priority parts of the hierarchy. It's easier to commit to walking/cycling/public transport knowing that you can always take a taxi in a pinch. SOVs (single-occupancy vehicles) have the inverse effect - it's hard to combine them with other modes of transportation, and if you're already paying to own/maintain/insure a vehicle, you're incentivized away from considering alternatives.

rogerdb | 2 years ago | on: Implementing a Personal Transportation Hierarchy

The hierarchy isn't so much about how green each individual option is, but rather about how trips should be distributed to reach an overall optimum.

For short trips or connections, walking should be more convenient because you don't need any gear or space to store your bike. This also gives a multiplicative effect with other transport options, because (e.g.) people are much more likely to take a bus or train if they can walk directly to the station instead of needing a bike or car to get there in the first place.

As an aside, mature bicycle infrastructure goes beyond bike lanes, especially as the number of cyclists grows. For instance, here's a video showing off a huge bicycle parking facility in Amsterdam: https://youtu.be/EqwasBTzZS8?t=530. Obviously this is great compared to car parking, but it's still a lot compared to the infrastructure needed to support short walking trips.

rogerdb | 4 years ago | on: Parking kills businesses, not bikes or buses

Not seeing it mentioned in the other replies, so I'll mention that (at least the way I read it) "our massive car addiction" should be taken as a societal addiction to cars rather than an addiction of any individual. If someone lives in a place where a car is the only feasible way to meet their day-to-day needs, it's not fair to say they're addicted to their car; however, we might question why they find themselves in that situation in the first place. Often this comes down to societal pressures (zoning, lack of funding for other modes of transportation, etc.) which are largely outside the control of individuals. The challenge is to shift the cultural mindset from "I need a car today, so cars are a necessity for life" to acknowledging that other options can be viable if we, as a society, are willing to recognize and seriously consider them.

> ... the only real option that exists is reorganizing housing across the whole society to massively increase density and to mix commerce zoning with homes in a way currently unheard of.

Places like this already exist (i.e. basically any major urban center), but I don't think the intent is that every place needs to be like that. Small steps toward better options (e.g. allowing limited commercial redevelopment in residential-only areas, or improving the safety, speed, and accessibility of alternative transit options) should be the short-term goal, and we can work toward them incrementally. But societal pressure (e.g. from NIMBYs and zero-sum car-first people) often makes even small improvements glacially slow or impossible.

rogerdb | 4 years ago | on: SNES – Super Mario World Widescreen

It's mostly impressive from a technical standpoint, since the programming of games from this era was strongly tied to the display resolution; e.g. a programmer could know exactly which background tiles or entities could be visible in the viewport at any time, and dynamically load/unload them for performance reasons. All of these optimizations need to be tweaked or removed in the widescreen version so that things outside the original 4:3 viewport don't disappear at the edge of your 16:9 display.
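As a purely illustrative sketch of that kind of viewport culling (the names and numbers here are made up for demonstration, not taken from any actual SNES game, which would do this in assembly against hardware registers):

```python
# Hypothetical sketch of fixed-viewport entity culling.
SNES_VIEW_W = 256  # SNES outputs a 256-pixel-wide image (4:3)
WIDE_VIEW_W = 342  # roughly the same height stretched to 16:9

def visible_entities(entities, camera_x, view_w=SNES_VIEW_W):
    """Return only the entities overlapping the horizontal viewport.

    A game hard-coded against view_w=256 would despawn anything
    outside that window; a widescreen hack has to widen view_w (and
    keep those entities simulated), or they pop in/out at the edges.
    """
    return [e for e in entities
            if camera_x - e["w"] < e["x"] < camera_x + view_w]

entities = [{"x": 40, "w": 16}, {"x": 300, "w": 16}, {"x": 600, "w": 16}]
print(len(visible_entities(entities, camera_x=0)))                      # 1
print(len(visible_entities(entities, camera_x=0, view_w=WIDE_VIEW_W)))  # 2
```

The second call shows the problem: the entity at x=300 was safely off-screen at 4:3, but a 16:9 viewport now needs it loaded and simulated.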

More recent games use flexible approaches to allow for different aspect ratios, behaving similarly to, e.g., fluid design on the web.

Jon Burton of TT Games has an interesting YouTube channel where he goes over some of these old-school development techniques, if you want to learn more; e.g. https://www.youtube.com/watch?v=96DO4V8qrR0 covers a lot of techniques that would be difficult to extend to a 16:9 display.

rogerdb | 4 years ago | on: All Hail King Pokémon

Depends on your definition of "highly valuable" - from that time period, there's a very short list of cards worth >$1000, quite a few in the $100-$999 range, and a ton in the $10-$99 bracket. What a card is actually worth depends a lot on the particular printing and what condition it's in.

If you wanted a starting point, Scryfall is a useful tool for looking up cards (though they're missing pricing data for some early cards, presumably due to scarcity of transaction data). Here's something to get you started (cards printed before 2000, sorted by price, displayed as a price list): https://scryfall.com/search?q=unique%3Aprints+sort%3Ausd+dat...
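As an illustration of how that kind of search link is assembled: the helper below and the `q`/`order`/`as` parameter names are my assumptions about Scryfall's web-search URL format, inferred from links like the one above rather than from their documentation.

```python
from urllib.parse import urlencode

def scryfall_search_url(query, order="usd", display="checklist"):
    """Build a Scryfall web-search URL (hypothetical helper; the
    q/order/as parameter names are assumptions, not verified)."""
    return "https://scryfall.com/search?" + urlencode(
        {"q": query, "order": order, "as": display}
    )

# All unique printings from before 2000, sorted by USD price:
print(scryfall_search_url("unique:prints date<2000"))
```

The `unique:prints` and `date<2000` pieces are Scryfall's own query syntax; everything else here is just standard URL encoding.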

rogerdb | 4 years ago | on: All Hail King Pokémon

I don't think it's too surprising - cards from this era are primarily collector items, so among cards of equal rarity, the most "iconic" ones command the highest prices. Charizard had a pre-existing (and continuing) popularity that made it the obvious most-desirable card, even if it's not particularly playable. MTG didn't have the existing IP, so the cards that became iconic did so based on their playability, rarity, and associated mythos. Not too many people are dropping $20,000+ for a Lotus to play it in Vintage, but the prices continue to rise because of increasing collectibility. For what it's worth, Ancestral Recall is arguably a stronger card of equal rarity, but it's worth substantially less on the basis of being less iconic (if only slightly).

rogerdb | 6 years ago | on: Software Disenchantment (2018)

FWIW, I did RTFA (top to bottom) before commenting. I chose to reply to some parts of the article and not others, especially the parts I felt were particularly hyperbolic.

Anecdotally, in my career I've never had to compile something myself that took longer than a few minutes (but maybe if you work on the Linux kernel or some other big project, you have; or maybe I've just been lucky to mainly use toolchains that avoid the pitfalls here). I would definitely consider it a problem if my compiler runs regularly took O(10mins), and would probably consider looking for optimizations or alternatives at that point. I've also benefited immensely from a lot of the analysis tools that are built into the toolchains that I use, and I have no doubt that most or all of them have saved me more pain than they've caused me.

rogerdb | 6 years ago | on: Software Disenchantment (2018)

I'd argue that of any software project on the planet, Windows is the closest to having unlimited resources; especially when you consider the number of Windows customers for whom backwards compatibility is the #1 feature on the box.

And speed isn't the only metric that matters; having both the 32-bit and 64-bit versions of DLLs uses a non-trivial (to some people) amount of disk space, bandwidth, complexity, etc.

rogerdb | 6 years ago | on: Software Disenchantment (2018)

This is definitely an interesting take on the car analogy, so thanks for posting it! I don't know that I agree 100% (I think I could 'settle' for a car that needed to be fueled once or twice a year if it came with some other noticeable benefits), but it's worth remembering that sometimes an apparently small nudge in performance can enable big improvements. Miniaturization of electronics (including batteries and storage media) and continuing improvements to wireless broadband come to mind as the most obvious of these in the past decades.

I'm struggling to think of recent (or not-so-recent) software improvements that have had a similar impact, though. It seems like many of the "big" algorithms and optimization techniques that underpin modern applications have been around for a long time, and there aren't a lot of solutions that are "just about" ready to make the jump from supercomputers to servers, servers to desktops, or desktops to mobile. I guess machine learning is probably a contender in this space, but I imagine that's still an active area of optimization and probably not what the author of the article had in mind. I'd love it if someone could provide an example of recent consumer software that is only possible due to careful software optimization.

rogerdb | 6 years ago | on: Software Disenchantment (2018)

> Would you buy a car if it eats 100 liters per 100 kilometers? How about 1000 liters?

I think the analogy here is backwards. The better question is "how much would you prioritize a car that used only 0.05 liters per 100km over one that used 0.5? What about one that used only 0.005L?". I'd say that at that point, other factors like comfort, performance, base price, etc. become (relatively) much more important.

If basic computer operations like loading a webpage took minutes rather than seconds, I think there would be more general interest in improving performance. For now though, most users are happy enough with the performance of most software, and other factors like aesthetics, ease-of-use, etc. are the main differentiators (admittedly, feature bloat, ads, tracking, etc. are also a problem, but I think they're mostly orthogonal to under-the-hood performance).

These days, I think most users will lose more time and be more frustrated by poor UI design, accidental inputs, etc. than by any performance characteristics of the software they use. Hence the complexity/performance overhead of using technologies that allow software to be easily iterated on and expanded is justified, to my mind (though we should be mindful of technology that claims to improve our agility but really only adds complexity).

rogerdb | 6 years ago | on: Amazon’s Consumer Business Turned Off Final Oracle Database

Haha, that's a great link. I actually laughed out loud at how ridiculous his comment sounds.

I used to work on a team at Amazon that was _very_ relieved and happy to move away from Oracle and onto the AWS databases. I wasn't directly involved but I understand the migration work was monstrous. I think it's clear from Ellison's comment that Oracle considers that to be a product feature.
