**I Know, Right?**

A difficult aspect of communication is that if you really understand a thing, then it is trivial to you. So why can't all those other people get with the program? And on the other hand, if you don't really understand a thing, then maybe you're the one causing the difficulty by expressing ideas which in reality do not make any sense.

I think that I have a pretty good sense of when I don't fully understand something. Although really, that statement needs about a billion caveats. For starters, "I know what I don't know", or perhaps more accurately, "I know when I don't know", is really way too close a cousin to, "I'll tell *you* about peanut butter!". You really have to be careful when you start down the path of, "I know that I'm right!", because as soon as you encounter a scenario where you're not right, it can easily lead to a disaster. Reality doesn't care that you think you're right.

So, to be more exact: I really like to understand how things work. And I really like to trace my understanding as deeply as I can manage. And it bothers me on some sort of bizarre, semi-emotional level when I don't understand "sufficiently" why a given thing is operating in the way that it is.

This bizarre discomfort that I feel about my lack of understanding is a great tool for recognizing when I'm missing something potentially important. So as long as I'm listening to my own internal discomfort-and-lack-of-understanding alarm, I have a pretty good sense for when I don't understand what's going on. At least I suspect that's the case … when you don't know what you don't know, it's hard to know when you don't really know something.

**If you want out then you'll have to develop those communication skills**

So, I've got a skill that helps me to identify things that I don't fully understand. This comes in handy with communication because I'm always on the lookout for situations where things just aren't making any sense. So, I'm aware when there's a person who isn't following along in a conversation. When I'm more of an observer to the conversation, I find it really easy to decide who it is that's missing the big picture. I can then offer clarifying statements to the clueless or ask stupid questions to get those in the know to offer additional insight. Of course doing this can jeopardize my observer status, so it does sometimes take a bit of finesse.

If I'm highly active in a conversation and it seems like sense has decided to go on a lunch break, then I have to start running experiments. Either I'm the person who doesn't understand or one or more of the other active participants are the ones who don't understand. Bonus points if there's more than one type of misunderstanding going around.

But I didn't really want to talk about communication, even though it is a very interesting problem to solve. I was interested in talking about a couple of things that don't make any sense.

**Parsing!**

I've had a terrible time with parsing algorithms. I find the basic idea of converting a stream of symbols into a data structure absolutely fascinating. But often the algorithms for achieving this are hard for me to really get my head wrapped around.

I think my blog entries about Cognitive Complexity Classes (CCC)[1] and Problem Shapes[2] actually shed some light here.

First of all, let's talk about the stuff that I don't understand. Parsing is more complex than just converting a stream of symbols into a data structure. Different streams of symbols have different implicit structures. The main ones that people talk about are the different types of grammars that languages fall into, and here we're really talking about formal languages (i.e. programming languages) as opposed to natural languages (e.g. French): context sensitive, context free, and regular grammars. Context sensitive grammars are among the hardest to parse and require some very powerful algorithms. Context free grammars are easier, and regular grammars easier still; each can be parsed with correspondingly simpler algorithms. And this is only scratching the surface. There are LL and LR grammars, that is, the sets of language grammars which can be parsed by LL and/or LR algorithms. Then there are LL(1) grammars. And ever more sophisticated algorithms give rise to ever more sets of language grammars. To make things even more complicated, these algorithms aren't just concerned with the power of the grammars they can parse; they're also concerned with parsing efficiency in both space and time.
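To make the regular vs context free distinction a little more concrete, here's a small sketch (Python is my choice here, not something from the post). Balanced parentheses are the classic context-free-but-not-regular language: no finite-state recognizer can count arbitrarily deep nesting, but a recursive descent function that mirrors the grammar handles it directly.

```python
def balanced(s: str) -> bool:
    """Recognize the context-free language of balanced parentheses.

    Grammar:  S -> ( S ) S | <empty>
    The recursion in parse_s does the "counting" that a regular
    grammar (a finite automaton) cannot express.
    """
    def parse_s(i: int) -> int:
        # Try the production S -> ( S ) S
        if i < len(s) and s[i] == "(":
            j = parse_s(i + 1)          # inner S
            if j >= len(s) or s[j] != ")":
                raise ValueError("unbalanced")
            return parse_s(j + 1)       # trailing S
        # Otherwise: S -> <empty>
        return i

    try:
        return parse_s(0) == len(s)
    except ValueError:
        return False
```

Each grammar production becomes one branch of the function, which is the whole trick of recursive descent.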

So when you see a string of characters, there's actually a lot of extra hidden structure present that you don't really understand. Or at the very least *I'm* not happy with my level of understanding. I get the basic gist of context sensitive vs context free, but I don't get it sufficiently to look at some example strings and have a good intuition for which type of grammar the language falls into. And things get more complicated from there. If simple things like context baffle my intuition, then it's probably going to be a while before I figure out how to spot GLR grammars.

The whole CCC and Problem Shapes thing becomes useful in describing why this is all a complicated problem.

I'm not entirely sure what shape parsing happens to be. Something tree or graph shaped with backtracking. What shape is constraint propagation and/or unification? Anyway, so let's assume that parsing (all parsing) has some sort of known shape. Then add to the mix different power levels of grammars (regular, context free, etc). Then add to the mix efficient time and space constraints. So we begin with the already complex shape of parsing. Then we add a bunch of very complex shapes to get different characteristics that make parsing practical. And finally, we squash the whole thing into a table and implement it in a for loop.
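That final squash-into-a-table-and-loop step can at least be illustrated with a much simpler cousin of an LR parser: a finite automaton whose transition function has been flattened into a table and driven by a single for loop. (This sketch is mine, in Python; a real LR table also carries shift/reduce actions and a stack, but the flavor is the same.)

```python
# A DFA, "squashed into a table", that recognizes binary strings whose
# value is divisible by 3. States are the remainder mod 3; reading bit b
# from state s moves to (2*s + b) mod 3.
TABLE = {
    0: {"0": 0, "1": 1},  # remainder 0 (accepting)
    1: {"0": 2, "1": 0},  # remainder 1
    2: {"0": 1, "1": 2},  # remainder 2
}

def divisible_by_3(bits: str) -> bool:
    state = 0
    for bit in bits:               # the entire machine is this one loop
        state = TABLE[state][bit]
    return state == 0
```

All the structure lives in the table; the loop itself knows nothing about the language. An LR parser adds a stack and separate action/goto tables, but its driver is still essentially this loop.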

So of course parsing is hard to *really* understand.

Although, I didn't really want to talk about parsing. Well … not computer language parsing. I actually wanted to talk about math. And more specifically, about why it is so hard to understand mathematical statements.

**… math blah blah blah or how paradoxically paradox aversion produces paradoxes**

I was watching a bunch of group theory videos[3] yesterday. And I noticed that some of the picture examples and the intuition behind the presentation were pretty simple and straightforward. But the precise definitions were hard to wade through even with the intuition.

This got me thinking about parsing, and how I can't quite seem to feel comfortable with grammars and algorithms even though I've put a lot of work into them. The answer of course is that I still don't have a good feel for the "true shape" of parsing, let alone all of the extra complexities that go into making it practical.

But what I had missed until now about mathematics is that the English statements aren't really English. And I'm not talking about jargon. To be sure, there is a bunch of math jargon that takes a while to understand, but I don't think the hard part is the jargon. The hard part is that, just like with the constrained parsing grammars and algorithms, mathematical statements are a constrained form of English. With parsing, we constrain the grammars in order to be able to write algorithms that halt, fit into memory, and run in a reasonable amount of time. With mathematics, there is a different goal in mind.

Charles H. Bennett wrote a pretty interesting paper titled "On Random and Hard-to-Describe Numbers"[4]. It contains two essays, and the first discusses the paradox of "the first number not namable in under ten words". That phrase is of course nine words long, which means that the first number not namable in under ten words (1,101,121) is in fact namable in nine words (namely, the quoted phrase itself).

Similarly, consider Russell's paradox[5] (also mentioned in Bennett's paper): the set of all sets that are not members of themselves contains itself exactly when it doesn't.
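Spelling out Russell's paradox in the standard set-builder notation (this formalization is standard, not something from the post):

```latex
% The "Russell set": all sets that are not members of themselves.
R = \{\, x \mid x \notin x \,\}

% Asking whether R belongs to itself yields a contradiction either way:
R \in R \iff R \notin R
```

Naive set theory lets you write down $R$; axiomatic set theories are constrained precisely so that you can't.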

The point is that a full natural language contains facilities that allow you to make statements which do not actually make any sense. Even mathematical languages like naive set theory allow for statements which don't make sense.

Which is why I think mathematical statements are so hard to make sense of: a great deal of work was done up front to make sure that the statement actually makes sense.