Honestly, because it's something that should be a prerequisite for starting the degree program, in the same way basic algebra is a prerequisite. Likewise, not knowing that you need to know this stuff is a sign that you probably shouldn't have been allowed to declare the major in the first place. The fact that colleges allow this at all is a disservice to students, many of whom will go on to permanently damage their academic records.
Falkon1313|3 years ago
I think we expect that in comp sci just because many of us did happen to grow up doing that. But it's a weird and unusual expectation, and probably not a good one.
It also certainly wouldn't have been expected a few decades ago. You just wouldn't assume that a kid had a mainframe in their house to learn on. Now that PCs have been around for a while, we make that assumption, but again I don't think it's a good one. Certainly not for the less affluent, nor for the younger ones who grew up with smartphones and tablets instead of a PC.
I think there's also a bit of a disconnect about what the purpose of the major is. Having a separate 'Software Engineering' major is relatively new; Comp Sci was generally what everybody took if they wanted to learn to work on software. But now some people think it's a purely academic discipline, while others think it's industry training, and that always muddies the discussion. Even setting that aside, though, it's just a bad assumption/expectation.
ghaff|3 years ago
Different majors have varying degrees of difficulty for different people. By and large, schools don't (and shouldn't) get into the business of heavily policing who gets to give a particular major a try.