Transitions from knowledge to literacy (as regards Comic Sans)

I just watched the new VSauce video about Comic Sans, which made a lot of points I already knew (like that Comic Sans is incredibly readable on aliased screens) and a lot of points I didn't (the British Dyslexia Association recommends Comic Sans for children who struggle with making out letters.[1. I checked out their Dyslexia Style Guide; I'm doing pretty well here, I think, but there's room for improvement.  I will keep it in mind next time I make changes to the website design.]). One of the things he talks about is that Comic Sans hate might be a symptom of increasing design literacy following the digital revolution, the way there was increasing regular literacy following the invention of the printing press.

This inspired a wild speculation that I wanted to expand on here:

Every time a new type of media comes out, there are people who complain that it's going to ruin knowledge, because people who write things down will stop bothering to remember them / people who watch TV are going to forget how to read / people who get all their entertainment on the internet have no attention spans anymore.

And, in fact, there's some science to back some of that stuff up.  Not all of it, but some.  The Google Effect, for example, describes a tendency for people to not bother remembering things they believe they can easily find out online.

My thought is that, as humanity's knowledge grows and our base of understanding progresses, we do start to forget the earlier layers of stuff.  Rather than trying to know everything an individual would have needed to know a hundred years ago -- stuff that's still important, but that not everyone needs to pay attention to -- we learn a system for acquiring that information, and we entrust it to our civilization to continue providing the infrastructure that backs those skills up.

And that's what I mean by literacy -- it's a systematic replacement of specific knowledge with a general method for acquiring that sort of knowledge.

Now, I think there's a lot to be gained by having a lot of stuff in your head.  But there are types of things that are easier to leave to Google, and types of things that are better to store all together in your mind.  You often learn the first kind in order to get to the second -- learning who various politicians are in order to understand a political system -- but if it's not part of your everyday job, it's okay to just understand that system, and be ready to Google a name you know you recognize but can't place.

I really like the idea that we're acquiring design literacy as a civilization, because it means that individuals are taking into their own hands the responsibility of making their lives and their world beautiful -- which is, like, super-important.

Stupid smart people

(via Boing Boing) Jonah Lehrer, who I blogged about yesterday re: grit, wrote an article on Tuesday in the New Yorker called Why Smart People Are Stupid.

While philosophers, economists, and social scientists had assumed for centuries that human beings are rational agents—reason was our Promethean gift—Kahneman, the late Amos Tversky, and others, including Shane Frederick (who developed the bat-and-ball question), demonstrated that we’re not nearly as rational as we like to believe.

When people face an uncertain situation, they don’t carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on a long list of mental shortcuts, which often lead them to make foolish decisions. These shortcuts aren’t a faster way of doing the math; they’re a way of skipping the math altogether.

This article is about the sort of thing I say all the time:  The human mind is bad at thinking.  We tend to assume that our brains do things mostly right.  In fact, our brains mostly do whatever it takes not to get killed, and to pass on our genes.  It turns out that this requires us to understand quite a lot of things very badly.

There were a few troubling points that I wasn't previously aware of:

The results were quite disturbing. For one thing, self-awareness was not particularly useful: as the scientists note, “people who were aware of their own biases were not better able to overcome them.”

In fact, it seems that people who rank higher on scales of intelligence have bigger bias blind spots than everyone else.  (Although they used SAT scores as a measure of intelligence, so that might not be incredibly informative.)

The bottom line, it seems, is that the difference between the way we perceive ourselves and the way we perceive other people is, so far, insurmountable.  What this says about philosophy, my major, I'm not sure.